WO2021140621A1 - Information generation device, warning device, information generation method, warning method, information generation program, and warning program - Google Patents

Information generation device, warning device, information generation method, warning method, information generation program, and warning program

Info

Publication number
WO2021140621A1
WO2021140621A1 (PCT/JP2020/000484)
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
moving body
warning
range
Prior art date
Application number
PCT/JP2020/000484
Other languages
French (fr)
Japanese (ja)
Inventor
要介 石渡 (Yosuke Ishiwata)
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2020/000484 priority Critical patent/WO2021140621A1/en
Publication of WO2021140621A1 publication Critical patent/WO2021140621A1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present disclosure relates to an information generation device, a warning device, an information generation method, a warning method, an information generation program, and a warning program for inferring the peripheral moving bodies around the own vehicle that the driver is likely to overlook, and for warning the driver about them.
  • The sensor information acquired from these systems is used, for example, by a warning device that warns the driver of a possible collision of the own vehicle.
  • The warning device uses sensor information acquired from the own vehicle, other vehicles, infrastructure devices, and the like to issue a warning to the driver when a peripheral moving body, such as another vehicle or a person, is likely to collide with the own vehicle.
  • The problem is the extent to which the driver is already aware of the peripheral moving bodies detected from the sensor information.
  • If the warning device keeps issuing warnings about peripheral moving bodies that the driver is already aware of, warnings about peripheral moving bodies that the driver is not aware of end up mixed in with warnings about bodies the driver is aware of.
  • When the warning device issues a warning about a peripheral moving body that the driver is already aware of, the driver pays no attention to the warning itself or ignores it. Furthermore, when warnings about already-noticed peripheral moving bodies become numerous, the driver finds the many warnings annoying and stops using the warning device altogether.
  • The warning device is therefore required to narrow down its warnings and warn only about peripheral moving bodies that the driver is not aware of.
  • Patent Document 1 discloses a technique for determining whether a warning is necessary for an individual driver in a given surrounding environment, using parameters set by the driver through self-report.
  • However, parameters set by self-report are subjective and may be set incorrectly. If the set parameters are incorrect, a warning may be judged necessary when it is unnecessary, or unnecessary when it is necessary, so the accuracy of determining the necessity of a warning may deteriorate.
  • This disclosure is made to solve the above problems, and provides an information generation device, a warning device, an information generation method, a warning method, an information generation program, and a warning program for accurately inferring the peripheral moving bodies that the driver is likely to overlook and warning the driver about them.
  • The information generation device includes: a surrounding situation information acquisition unit that acquires own vehicle position information indicating the position of the own vehicle within a predetermined period, and peripheral moving body position information indicating the positions of peripheral moving bodies existing around the own vehicle within the predetermined period; a collision possibility information calculation unit that calculates collision possibility information, indicating that the own vehicle and a peripheral moving body may collide, using the own vehicle position information and the peripheral moving body position information; an undetected range information calculation unit that calculates undetected range information, which is the range overlooked by the driver of the own vehicle in the surrounding situation indicated by the surrounding situation information, using the collision possibility information, line-of-sight information, and the surrounding situation information; and a warning range information calculation unit that calculates warning range information, indicating the range in which a peripheral moving body to be warned about to the driver of the own vehicle is located, using one or more pieces of undetected range information.
  • The warning device includes: a moving body information acquisition unit that acquires first own vehicle position information indicating the position of the own vehicle within a first predetermined period, and first peripheral moving body position information indicating the positions of peripheral moving bodies existing around the own vehicle within the first predetermined period; a warning range information acquisition unit that acquires warning range information matching a predetermined condition, the predetermined condition including at least that the driver of the own vehicle and the driver of the moving body are the same, the warning range information being calculated from one or more pieces of undetected range information, each being the range overlooked by the driver of the moving body in the surrounding situation, calculated using collision possibility information computed from second own vehicle position information indicating the position of the moving body within a second predetermined period that precedes the first predetermined period, second line-of-sight information which is the direction of the line of sight directed by the driver of the moving body within the second predetermined period, and surrounding situation information which is the surrounding situation of the moving body within the second predetermined period; and a warning target determination unit that determines whether a peripheral moving body existing around the own vehicle is a warning target, using the first own vehicle position information, the first peripheral moving body position information, and the warning range information.
  • FIG. 1 is a block diagram of a warning system including the information generation device and the warning device according to Embodiment 1 of the present disclosure. FIG. 2 is a block diagram of the information generation unit according to Embodiment 1. FIG. 3 is a hardware block diagram of the warning system according to Embodiment 1. FIG. 4 is a flowchart showing the operation of the warning system according to Embodiment 1. FIG. 5 is a flowchart showing the operation of the information generation device according to Embodiment 1. FIG. 6 is a conceptual diagram of the visual field information according to Embodiment 1. FIG. 7 is a conceptual diagram of the collision possibility information according to Embodiment 1.
  • FIG. 7a is a diagram in which the own vehicle is placed at the center, the positions of the peripheral moving bodies are marked with circles, and their speeds are indicated by arrows. FIG. 7b is a risk map.
  • A further figure explains the period immediately before approaching the intersection A according to Embodiment 1.
  • A conceptual diagram shows the range visible to the driver of the own vehicle according to Embodiment 1.
  • A conceptual diagram shows the line-of-sight application collision possibility information from time t11 to time t12 according to Embodiment 1, and another shows the undetected range information from time t11 to time t16.
  • In Embodiment 1, a case where the warning system is mounted on a vehicle and the own vehicle is just before approaching the intersection A will be described.
  • The own vehicle travels in advance on a route passing through the intersection A, and the warning system including the information generation device and the warning device stores, in the warning range information storage unit 37, the range overlooked by the driver immediately before approaching the intersection A during that advance travel.
  • Then, in the situation where the own vehicle is currently traveling through the intersection A, warning control is performed when a peripheral moving body is located in the overlooked range stored in advance in the warning range information storage unit 37.
  • A peripheral moving body is a moving body, such as another vehicle or a person, existing around the own vehicle.
  • FIG. 1 is a block diagram of a warning system 3 including an information generation device 1 and a warning device 2 according to the first embodiment of the present disclosure.
  • The warning system 3 includes: a moving body information detection sensor 31a that detects information on the own vehicle and on the peripheral moving bodies existing around the own vehicle; a line-of-sight information detection sensor 32 that detects the line-of-sight information of the driver of the own vehicle; a line-of-sight information storage unit 33 that stores the line-of-sight information of the driver of the own vehicle; an undetected range information storage unit 34 that stores the undetected range information, described later, which is the range overlooked by the driver; a map information storage unit 35 that stores map information; a correspondence information storage unit 36 that stores correspondence information, described later, which associates multiple pieces of information with one another; and a warning range information storage unit 37 that stores the warning range information, which is the range used to determine whether or not to warn.
  • The moving body information detection sensor 31a is provided in the own vehicle and detects the own vehicle information, such as the own vehicle position information (the position information of the own vehicle), speed information, and size information.
  • The own vehicle information includes at least the own vehicle position information.
  • The moving body information detection sensor 31a also detects the peripheral moving body information, such as the peripheral moving body position information (the position information of the peripheral moving bodies existing around the own vehicle), speed information, and size information.
  • The peripheral moving body information includes at least the peripheral moving body position information.
  • The own vehicle information and the peripheral moving body information are used to determine whether or not a peripheral moving body and the own vehicle will collide.
  • The moving body information detection sensor 31a transmits the detected own vehicle information and peripheral moving body information to at least one of the information generation device 1 and the warning device 2.
  • The line-of-sight information detection sensor 32 detects the line-of-sight information of the driver of the own vehicle.
  • The line-of-sight information detection sensor 32 transmits the detected line-of-sight information to at least one of the information generation device 1 and the warning device 2.
  • The line-of-sight information storage unit 33 receives the line-of-sight information detected by the line-of-sight information detection sensor 32 from at least one of the information generation device 1 and the warning device 2, and stores the received line-of-sight information.
  • The line-of-sight information storage unit 33 also stores the visual field information, described later, which is the range over which the driver has directed his or her line of sight.
  • The undetected range information storage unit 34 stores the undetected range information, which is the range overlooked by the driver when the own vehicle traveled in advance on a specific route and approached the intersection A. Details will be described later.
  • The map information storage unit 35 stores map information used when determining the surrounding situation of the own vehicle, including building information, road information, and the like.
  • The correspondence information storage unit 36 is composed of: a collision possibility information storage unit 361 that stores the collision possibility information, such as a risk map, which is information on the possibility of a collision between the own vehicle and a peripheral moving body; a corresponding line-of-sight information storage unit 362 that stores the line-of-sight information of the driver of the own vehicle corresponding to the collision possibility information; and a surrounding situation information storage unit 363 that stores the surrounding situation information corresponding to the collision possibility information.
  • The surrounding situation information includes, for example, information on the situation at the intersection A, the situation at nightfall, the weather on a rainy day, and the driving situation when turning right; that is, information on the driver's surroundings related to driving, such as location, time, weather, and driving conditions.
  • The surrounding situation information is not limited to the above, as long as it is information on a situation related to driving.
  • Correspondence information is information that groups together the corresponding collision possibility information, the corresponding line-of-sight information, and the corresponding surrounding situation information.
  • Here, "corresponding" means, for example, that the periods of the pieces of information include the same time; in Embodiment 1 it refers to information at the same time.
  • A period including the same time means that, as long as the pieces of information include the same time, their start times and end times may be exactly the same or may differ.
  • The warning range information storage unit 37 receives the warning range information from the information generation device 1 and stores it. The warning range information storage unit 37 transmits the stored warning range information to the warning device 2.
  • The information generation device 1 is composed of: a moving body information acquisition unit 4a that acquires the own vehicle information and the peripheral moving body information; a line-of-sight information acquisition unit 5 that acquires line-of-sight information indicating in which direction the driver of the own vehicle is looking; and an information generation unit 11a that generates the warning range information.
  • The information generation device 1 is a device that generates the information serving as the criterion when the warning device 2 determines whether or not to issue a warning.
  • The moving body information acquisition unit 4a acquires the own vehicle information at a certain time detected by the moving body information detection sensor 31a, and the peripheral moving body information of the peripheral moving bodies existing around the own vehicle, over a predetermined period t1. Since the detection timing of the moving body information detection sensor 31a depends on the sensor, acquiring the own vehicle information and the peripheral moving body information over the time width of the predetermined period t1 absorbs the differences in detection timing that arise when multiple sensors are used.
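The time-width idea above can be sketched as follows. This is an illustrative example only, not the patent's implementation; the names (`Detection`, `collect_window`) and the 0.1 s window are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    t: float  # timestamp in seconds
    x: float  # position in the own-vehicle frame, metres
    y: float

def collect_window(detections, t_end, t1):
    """Keep every detection whose timestamp falls inside [t_end - t1, t_end],
    absorbing the timing skew between sensors that report at different rates."""
    return [d for d in detections if t_end - t1 <= d.t <= t_end]

# Two sensors report around t = 10.0 s with slightly different timestamps;
# a third sample arrives too late and is excluded.
readings = [Detection(9.92, 5.0, 2.0), Detection(9.97, 5.1, 2.0),
            Detection(10.30, 6.0, 2.1)]
window = collect_window(readings, t_end=10.0, t1=0.1)
print(len(window))  # -> 2
```

Widening `t1` trades latency for robustness: a larger window tolerates slower sensors but mixes older samples into the same snapshot.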
  • The moving body information acquisition unit 4a transmits the own vehicle information and the peripheral moving body information to the information generation unit 11a in the information generation device 1.
  • The line-of-sight information acquisition unit 5 acquires the line-of-sight information of the driver of the own vehicle at a certain time detected by the line-of-sight information detection sensor 32, and stores the line-of-sight information in the line-of-sight information storage unit 33.
  • The certain time detected by the line-of-sight information detection sensor 32 is the same as the certain time detected by the moving body information detection sensor 31a.
  • Further, the line-of-sight information acquisition unit 5 acquires the line-of-sight information of the driver of the own vehicle over a certain period detected by the line-of-sight information detection sensor 32, and stores it in the line-of-sight information storage unit 33.
  • The certain period detected by the line-of-sight information detection sensor 32 is a period of the same time width that includes the certain time detected by the moving body information detection sensor 31a. Note that the periods from the start to the end of detection do not have to coincide exactly.
  • The line-of-sight information acquisition unit 5 acquires the visual field information, which is the range over which the driver has directed his or her line of sight, from the line-of-sight information acquired from the line-of-sight information detection sensor 32 over a predetermined period t2 up to a certain time, or from the line-of-sight information for the predetermined period t2 stored in the line-of-sight information storage unit 33.
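One plausible way to turn gaze samples over the period t2 into visual field information is to give each sample an angular field and take the union of the resulting intervals. This is a sketch under assumptions: the 30° half-angle and the interval representation are illustrative, not values from the patent.

```python
def field_of_view(gaze_angles_deg, half_fov_deg=30.0):
    """Union of per-sample fields of view: each gaze direction is assumed
    (hypothetically) to cover +/- half_fov_deg degrees around it."""
    intervals = sorted((a - half_fov_deg, a + half_fov_deg) for a in gaze_angles_deg)
    merged = [intervals[0]]
    for lo, hi in intervals[1:]:
        if lo <= merged[-1][1]:
            # overlapping intervals fuse into one continuous field
            merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
        else:
            merged.append((lo, hi))
    return merged

# Gaze samples over period t2: mostly ahead (around 0 deg), one glance right (90 deg)
print(field_of_view([-5.0, 0.0, 90.0]))  # -> [(-35.0, 30.0), (60.0, 120.0)]
```

Everything outside the merged intervals is a candidate for the overlooked range discussed below in the description.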
  • The driver drives the own vehicle in various situations in advance, and the warning range information is generated using the own vehicle information and the peripheral moving body information acquired by the moving body information acquisition unit 4a, together with the line-of-sight information of the driver of the own vehicle acquired by the line-of-sight information acquisition unit 5.
  • In Embodiment 1, the own vehicle travels in advance on the route passing through the intersection A, and the warning range information for the moment immediately before approaching the intersection A is generated.
  • FIG. 2 is a block diagram of the information generation unit 11a according to the first embodiment of the present disclosure.
  • The information generation unit 11a will be described mainly with reference to FIG. 2.
  • The information generation unit 11a is composed of: a surrounding situation information setting unit 101 that sets the surrounding situation information; a collision possibility information calculation unit 102 that calculates the collision possibility information; an information association unit 103 that associates the collision possibility information, the line-of-sight information, and the surrounding situation information with one another; an undetected range information calculation unit 104a that calculates the undetected range information; and a warning range information calculation unit 105a that calculates the warning range information.
  • The surrounding situation information setting unit 101 sets, as the surrounding situation information, situations such as: just before approaching an intersection, waiting for a right turn inside an intersection, turning left at an empty intersection, just before changing lanes on a busy road, going straight toward the sun at nightfall, waiting for a right turn at night, turning left at an intersection on a rainy day, and changing lanes on a typhoon day.
  • A situation that can serve as the surrounding situation information is one in which the movement of the own vehicle is restricted.
  • the surrounding situation information setting unit 101 transmits the surrounding situation information to the collision possibility information calculation unit 102.
  • The setting may be made while the own vehicle is running, or the situation may be recorded by a drive recorder or the like during driving and set later by checking the recorded information after stopping, parking, or getting off.
  • The setting may be made by the driver of the own vehicle, a passenger, or a person who is not on board.
  • Although the examples above combine two of location, time, weather, and driving situation, three or more may be combined, and setting at least one situation suffices. Since the surrounding situation is objective information independent of location, time, or person, it is unlikely to be set incorrectly. Further, when the situation is recorded by a drive recorder or the like, the surrounding situation information may be set automatically by image recognition.
  • The collision possibility information calculation unit 102 predicts the movement of the own vehicle and the peripheral moving bodies using the own vehicle information and the peripheral moving body information received from the moving body information acquisition unit 4a, and calculates the collision possibility information at a certain time, for example as a risk map in which the probability of a collision between the own vehicle and a peripheral moving body is associated with coordinates around the own vehicle. The collision possibility information will be described later.
  • The collision possibility information calculation unit 102 transmits the surrounding situation information and the collision possibility information to the information association unit 103.
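A risk map of the kind mentioned above could, for illustration, be built by extrapolating each peripheral moving body at constant velocity and writing a time-decaying collision probability into a grid centred on the own vehicle. The grid size, the decay model, and all names below are assumptions for the sketch, not the patent's algorithm.

```python
import math

def risk_map(peripheral, horizon=3.0, dt=0.5, cell=1.0, size=20):
    """Hypothetical risk map: a size x size grid of collision probabilities
    around the own vehicle, which sits at the grid centre. Each peripheral
    body (x, y, vx, vy) is extrapolated at constant velocity; the risk
    written into a cell decays with the prediction time (assumed model)."""
    grid = [[0.0] * size for _ in range(size)]
    half = size // 2
    for step in range(int(horizon / dt) + 1):
        t = step * dt
        for (x, y, vx, vy) in peripheral:
            i = int((x + vx * t) / cell) + half
            j = int((y + vy * t) / cell) + half
            if 0 <= i < size and 0 <= j < size:
                grid[i][j] = max(grid[i][j], math.exp(-t))
    return grid

# One body 5 m ahead of the own vehicle, approaching at 2 m/s
g = risk_map([(5.0, 0.0, -2.0, 0.0)])
print(g[15][10] > g[12][10] > 0.0)  # -> True (risk peaks at the current position)
```

Associating each such grid with the line-of-sight and surrounding situation information recorded at the same time yields the correspondence information described next.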
  • The information association unit 103 associates the surrounding situation information and the collision possibility information received from the collision possibility information calculation unit 102 with the line-of-sight information received from the line-of-sight information acquisition unit 5. Specifically, the information association unit 103 associates the surrounding situation information, the collision possibility information, and the line-of-sight information at the same time T1, and stores the associated information in the correspondence information storage unit 36.
  • The information association unit 103 may also associate the visual field information including the same time with the line-of-sight information.
  • The visual field information to be associated may be the visual field information including the same time as the line-of-sight information, the visual field information including the time closest to the time at which the line-of-sight information was detected, visual field information acquired in the same kind of situation in the past, or visual field information updated at regular intervals. Further, since the time the driver needs to recognize an object may differ depending on the surroundings, the period of the visual field information may be changed according to the surrounding situation information.
  • The information association unit 103 stores the collision possibility information in the collision possibility information storage unit 361 of the correspondence information storage unit 36, the line-of-sight information in the corresponding line-of-sight information storage unit 362, and the surrounding situation information in the surrounding situation information storage unit 363.
  • The associated information may instead be stored in a different storage unit, or all of it may be stored in a single storage unit.
  • The undetected range information calculation unit 104a receives the collision possibility information stored in the correspondence information storage unit 36 together with the corresponding line-of-sight information, visual field information, and surrounding situation information, and uses the received information to calculate the undetected range information, which is the range in which the driver of the own vehicle overlooks peripheral moving bodies in a given surrounding situation, such as immediately before approaching the intersection A. Specifically, the undetected range information calculation unit 104a calculates the line-of-sight application collision possibility information, which is the range in which a peripheral moving body exists at a time T2 in the given surrounding situation but is overlooked by the driver of the own vehicle. The method of calculating the line-of-sight application collision possibility information will be described later.
  • The undetected range information calculation unit 104a calculates the line-of-sight application collision possibility information at each time in a given surrounding situation, and calculates the undetected range information for that situation as the logical product of the multiple pieces of line-of-sight application collision possibility information.
  • The logical product is used because a given surrounding situation is likely to span a continuous period. For example, the surrounding situation immediately before approaching the intersection A is not momentary but lasts at least several seconds.
  • The undetected range information can also be created for a momentary situation: in that case, the undetected range information calculation unit 104a uses the single piece of line-of-sight application collision possibility information for that moment as the undetected range information.
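The logical product of multiple pieces of line-of-sight application collision possibility information can be sketched as a cell-wise AND over boolean grids. This is an illustrative data model; the patent does not fix a representation.

```python
def undetected_range(masks):
    """Cell-wise logical product (AND) of per-time boolean grids. A True
    cell means the driver overlooked that cell at every time considered."""
    result = [row[:] for row in masks[0]]
    for mask in masks[1:]:
        for i, row in enumerate(mask):
            for j, value in enumerate(row):
                result[i][j] = result[i][j] and value
    return result

# Three time steps; only the top-left cell is overlooked at every time
m1 = [[True, True], [False, True]]
m2 = [[True, False], [False, True]]
m3 = [[True, True], [True, False]]
print(undetected_range([m1, m2, m3]))  # -> [[True, False], [False, False]]
```

A single-element list reproduces the momentary-situation case: the one mask is returned unchanged.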
  • The undetected range information calculation unit 104a stores the surrounding situation information and the undetected range information corresponding to it in the undetected range information storage unit 34.
  • The warning range information calculation unit 105a receives the surrounding situation information and the corresponding undetected range information from the undetected range information storage unit 34.
  • The warning range information calculation unit 105a uses the received information to take, for each driver, the logical product of multiple pieces of undetected range information that correspond to the same surrounding situation information but to different periods, and calculates the warning range information.
  • In this way, the warning range information, which is the range in which a given driver overlooks peripheral moving bodies whenever that surrounding situation occurs, can be calculated for each driver.
  • The warning range information means that if a peripheral moving body exists within its range and the driver of the own vehicle does not direct his or her line of sight to that range, a warning should be issued.
  • The warning range information calculation unit 105a stores the warning range information in the warning range information storage unit 37.
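Under the same grid model used above, the warning rule just stated (a peripheral body inside the stored warning range, with no gaze directed there) reduces to a simple check. `should_warn` and its arguments are hypothetical names for this sketch.

```python
def should_warn(warning_range, body_cell, gaze_cells):
    """Warn when a peripheral moving body lies inside the stored warning
    range and the driver's current gaze does not cover its cell."""
    i, j = body_cell
    return warning_range[i][j] and body_cell not in gaze_cells

wr = [[True, False], [False, False]]  # True = historically overlooked cell
print(should_warn(wr, (0, 0), set()))     # -> True  (body in overlooked cell, no gaze)
print(should_warn(wr, (0, 0), {(0, 0)}))  # -> False (driver is looking there)
print(should_warn(wr, (1, 1), set()))     # -> False (body outside the warning range)
```

This is exactly how the narrowing-down works: bodies outside the driver's habitual blind range never trigger a warning, so only genuinely overlooked bodies do.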
  • The warning device 2 is composed of: a moving body information acquisition unit 4a that acquires the own vehicle information and the peripheral moving body information; a line-of-sight information acquisition unit 5 that acquires the line-of-sight information of the driver of the own vehicle; a surrounding situation determination unit 21a that determines the surrounding situation; a warning target determination unit 22a that determines the warning targets, that is, the peripheral moving bodies about which a warning should be issued to the driver; and a warning unit 23 that issues a warning based on the determination of the warning target determination unit 22a.
  • The moving body information acquisition unit 4a is almost the same as that of the information generation device 1, but the acquisition time differs. Since the information generation device 1 creates the warning range information in advance and the warning device 2 issues warnings using it, the own vehicle information and peripheral moving body information acquired by the moving body information acquisition unit 4a of the information generation device 1 are acquired earlier than those acquired by the moving body information acquisition unit 4a of the warning device 2.
  • Except for the above, the moving body information acquisition unit 4a is the same as that of the information generation device 1. Note that the driver may change vehicles; that is, the present disclosure is applicable even when the vehicle the driver rides, such as a rental car, changes each time.
  • The moving body information acquisition unit 4a transmits the own vehicle information and the peripheral moving body information to the surrounding situation determination unit 21a and the warning target determination unit 22a.
  • The own vehicle information acquired by the moving body information acquisition unit 4a of the warning device 2 is an example of the first own vehicle position information indicating the position of the own vehicle within the first predetermined period, and the peripheral moving body information acquired by that unit is an example of the first peripheral moving body position information indicating the positions of the peripheral moving bodies within the first predetermined period.
  • The own vehicle information acquired by the moving body information acquisition unit 4a of the information generation device 1 is an example of the second own vehicle position information indicating the position of the moving body within the second predetermined period, and the peripheral moving body information acquired by that unit is an example of the second peripheral moving body position information indicating the positions of the peripheral moving bodies within the second predetermined period.
  • The line-of-sight information acquisition unit 5 is almost the same as that of the information generation device 1; only the acquisition time differs. Since the information generation device 1 creates the warning range information in advance and the warning device 2 issues warnings using it, the line-of-sight information acquired by the line-of-sight information acquisition unit 5 of the information generation device 1 is acquired earlier than that acquired by the line-of-sight information acquisition unit 5 of the warning device 2. In addition, the warning device 2 does not acquire the visual field information.
  • the line-of-sight information acquisition unit 5 is the same as the case of the information generation device 1 except for the above.
  • The line-of-sight information acquisition unit 5 transmits the line-of-sight information to the surrounding situation determination unit 21a and the warning target determination unit 22a.
  • The line-of-sight information acquired by the line-of-sight information acquisition unit 5 of the warning device 2 is an example of first line-of-sight information, which is the direction of the line of sight directed by the driver of the own vehicle within the first predetermined period. Further, the line-of-sight information acquired by the line-of-sight information acquisition unit 5 of the information generation device 1 is an example of second line-of-sight information, which is the direction of the line of sight directed within the second predetermined period by the driver of the own vehicle (or of a moving body, where the driver has changed vehicles).
  • The surrounding situation determination unit 21a determines, as surrounding situation information, situations such as: waiting for a right turn at an intersection; turning left at an empty intersection; being just before a lane change on a busy road; going straight while facing the sun at nightfall; waiting for a right turn at night; turning left at an intersection on a rainy day; changing lanes on a typhoon day; and so on.
  • The surrounding situation determination unit 21a receives the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, the line-of-sight information from the line-of-sight information acquisition unit 5, and the map information from the map information storage unit 35.
  • The surrounding situation information is acquired from the surrounding situation information storage unit 363 of the corresponding information storage unit 36.
  • The surrounding situation determination unit 21a determines the current situation of the own vehicle based on the peripheral moving body information, the line-of-sight information, the own vehicle information, the map information, and environmental information such as time information and weather information.
  • Here, a situation means something that restricts the movement of the own vehicle.
  • For example, it refers to a situation in which the own vehicle is just about to approach intersection A and the driver tends to gaze at a traffic light or at a road intersecting the traveling lane, a situation in which the vehicle is traveling slowly on a congested road, and the like.
  • The surrounding situation may be determined from the peripheral moving body information and the environmental information without using the line-of-sight information.
  • The surrounding situation determination unit 21a acquires the transition of the surrounding situation information at the immediately preceding times, that is, the "transition of the immediately preceding situation", from the surrounding situation information storage unit 363 of the corresponding information storage unit 36.
  • The time width of the transition of the surrounding situation information to be acquired may be a fixed value t3 or a variable value according to the immediately preceding situation. This is because the time range to be considered differs between a situation that can change quickly (for example, a right or left turn lasting a few seconds) and a situation that may not change easily (for example, a traffic jam).
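The fixed-versus-variable time width described above can be pictured as a simple lookup. This is only an illustrative sketch: the situation labels and the window lengths below are hypothetical values, not values taken from the disclosure.

```python
def transition_window_seconds(situation: str, t3: float = 5.0) -> float:
    """Return the time width (in seconds) of the surrounding situation
    transition to retrieve. Quickly changing situations get a short
    window, slowly changing ones a long window; any other situation
    falls back to the fixed value t3. All numbers are hypothetical."""
    variable_windows = {
        "turning left or right": 3.0,   # can change in a few seconds
        "traffic jam": 60.0,            # may not change easily
    }
    return variable_windows.get(situation, t3)
```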
  • The surrounding situation determination unit 21a transmits the determined surrounding situation of the own vehicle to the warning target determination unit 22a.
  • The current surrounding situation of the own vehicle determined by the surrounding situation determination unit 21a may be stored in the surrounding situation information storage unit 363 so that it can be used when determining the next situation of the own vehicle.
  • The warning target determination unit 22a acquires the result of the surrounding situation from the surrounding situation determination unit 21a.
  • The warning target determination unit 22a acquires, from the warning range information storage unit 37, warning range information generated in advance for a surrounding situation similar to that result, which is the range in which the driver of the own vehicle tends to overlook peripheral moving bodies.
  • The warning target determination unit 22a acquires the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5.
  • The warning target determination unit 22a determines whether a peripheral moving body currently exists in the warning range and the driver of the own vehicle is not looking at that peripheral moving body.
  • Such a peripheral moving body is a warning target about which the driver of the own vehicle should be warned. Since a peripheral moving body that is a warning target is located in a range that the driver tends to overlook, issuing a warning enables safer driving; at the same time, the driver is unlikely to judge the warning as useless, so the warning is unlikely to be ignored.
  • The warning target determination unit 22a transmits the determination result to the warning unit 23 when there is a peripheral moving body to be warned about.
  • The warning unit 23 receives the determination result from the warning target determination unit 22a, and if there is a peripheral moving body to be warned about, the warning unit 23 warns the driver of the own vehicle by voice from a speaker, by a car navigation display, or the like.
  • The warning method may be any method, as long as the driver of the own vehicle notices it.
  • FIG. 3 is a hardware configuration diagram of the warning system 3 according to the first embodiment of the present disclosure. The configuration of the warning system 3 according to the first embodiment of the present disclosure will be described with reference to FIG.
  • The warning system 3 is composed of a bus 40, a processor 41, a memory 42, a storage 43, a sensor 44a, an input I/F 45, and an output I/F 46.
  • Each function of the warning system 3 is realized by software, firmware, or a combination of software and firmware.
  • Software, firmware, or a combination of software and firmware is described as a program.
  • The bus 40 is a signal path that electrically connects the devices and exchanges data.
  • The processor 41 is a CPU (Central Processing Unit) that executes a program stored in the memory 42.
  • The processor 41 connects to the other devices via the bus 40 and controls each of these parts.
  • The processor 41 reads and executes the program in the memory 42.
  • The processor 41 loads at least a part of the OS (Operating System) stored in the memory 42 and executes the program while executing the OS.
  • The processor 41 may be an IC (Integrated Circuit) that performs processing; it may be a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • The surrounding situation determination unit 21a and the warning target determination unit 22a are realized by the processor 41 reading and executing the program loaded in the memory 42.
  • The memory 42 stores the program in which the software, the firmware, or the combination of software and firmware is described. The memory 42 also stores various information and the like.
  • The memory 42 is, for example, a volatile semiconductor memory such as a RAM (Random Access Memory).
  • The memory 42 also stores the OS.
  • The surrounding situation determination unit 21a and the warning target determination unit 22a are realized by the program stored in the memory 42.
  • The functions of the moving body information acquisition unit 4a, the line-of-sight information acquisition unit 5, the collision possibility information calculation unit 102, the information association unit 103, the undetected range information calculation unit 104a, the warning range information calculation unit 105a, the surrounding situation determination unit 21a, and the warning target determination unit 22a may be realized by separate memories.
  • The storage 43 stores various information and the like.
  • The storage 43 is, for example, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD, or a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
  • The line-of-sight information storage unit 33, the corresponding information storage unit 36 including the collision possibility information storage unit 361, the corresponding line-of-sight information storage unit 362, and the surrounding situation information storage unit 363, the undetected range information storage unit 34, the warning range information storage unit 37, and the map information storage unit 35 are realized by the storage 43.
  • The sensor 44a is a device that detects various types of information.
  • The sensor 44a is, for example, an in-vehicle sensor such as a millimeter wave radar or a camera.
  • The moving body information detection sensor 31a and the line-of-sight information detection sensor 32 are realized by the sensor 44a.
  • The input I/F 45 is a device through which users such as the driver and passengers input information.
  • The input I/F 45 is, for example, a device that accepts character input or voice recognition for car navigation.
  • The surrounding situation information setting unit 101 is realized by the input I/F 45.
  • The output I/F 46 is a device that warns the driver.
  • The output I/F 46 is, for example, a device such as a speaker or a display.
  • The warning unit 23 is realized by the output I/F 46.
  • The information and the like of each part may be stored in either the memory 42 or the storage 43, or in a combination of the memory 42 and the storage 43.
  • A part of the functions may be realized by dedicated hardware and a part by software or firmware.
  • For example, part of the warning system 3 may realize its functions by a processing circuit as dedicated hardware, while the remaining part realizes its functions by a CPU, serving as a processing circuit, reading and executing a program stored in a memory.
  • In this way, the processing circuit can realize each function of the warning system 3 by hardware, software, firmware, or a combination thereof.
  • The communication used may be wired or wireless.
  • FIG. 4 is a flowchart showing the operation of the warning system 3 according to the first embodiment of the present disclosure. The operation of the warning system 3 will be described below with reference to FIG.
  • In step S1, the moving body information detection sensor 31a detects the own vehicle information and the peripheral moving body information of the peripheral moving bodies existing around the own vehicle.
  • The moving body information detection sensor 31a transmits the detected own vehicle information and peripheral moving body information to the moving body information acquisition unit 4a, and the process proceeds to step S2.
  • In step S2, the line-of-sight information detection sensor 32 detects the line-of-sight information of the driver of the own vehicle.
  • The line-of-sight information detection sensor 32 transmits the detected line-of-sight information to the line-of-sight information acquisition unit 5, and the process proceeds to step S3. Either of steps S1 and S2 may be processed first.
  • In step S3, the information generation device 1 generates the warning range information, which serves as the basis for deciding whether a warning should be given. Details will be described later.
  • The information generation device 1 stores the warning range information in the warning range information storage unit 37, and the process proceeds to step S4.
  • In step S4, the warning device 2 acquires the warning range information from the warning range information storage unit 37 and determines whether or not to warn the driver of the own vehicle from a plurality of pieces of information including the warning range information.
  • If the warning device 2 determines that a warning should be given, it warns the driver of the own vehicle. Details will be described later.
  • Even after executing step S4, the process returns to step S1, and the above processing is repeated until there is a trigger to end it, such as the power being turned off or an end operation being performed. Although the above processing is assumed to be repeated, it may be performed only once without repetition.
  • FIG. 5 is a flowchart showing the operation of the information generation device 1 according to the first embodiment of the present disclosure. The operation of the information generation device 1 in step S3 of FIG. 4 will be described below with reference to FIG.
  • In step S101, the moving body information acquisition unit 4a acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensor 31a.
  • The moving body information acquisition unit 4a may acquire the data each time at least one item of the own vehicle information or the peripheral moving body information is sent from the moving body information detection sensor 31a, or may collectively acquire the data sent from the moving body information detection sensor 31a over a certain period of time.
  • The moving body information acquisition unit 4a transmits the acquired own vehicle information and peripheral moving body information to the information generation unit 11a in the information generation device 1. The moving body information acquisition unit 4a may transmit the data to the information generation unit 11a each time at least one item of the own vehicle information or the peripheral moving body information is sent from the moving body information detection sensor 31a, or may collect the data sent from the moving body information detection sensor 31a for a certain period of time and then transmit it to the information generation unit 11a.
  • The moving body information acquisition unit 4a acquires the own vehicle information at a certain time detected by the moving body information detection sensor 31a and the peripheral moving body information of the peripheral moving bodies existing around the own vehicle over a predetermined period t1. Since the detection timing of the moving body information detection sensor 31a depends on the sensor, acquiring the own vehicle information and the peripheral moving body information with the time width of the predetermined period t1 makes it possible to absorb the detection timing deviation that occurs when a plurality of sensors are used.
  • In step S102, the line-of-sight information acquisition unit 5 acquires the line-of-sight information from the line-of-sight information detection sensor 32 at the same time that the moving body information acquisition unit 4a acquires the own vehicle information and the peripheral moving body information.
  • The timing, collection, and retention period with which the line-of-sight information acquisition unit 5 acquires the line-of-sight information are set in synchronization with the own vehicle information and the peripheral moving body information held by the moving body information acquisition unit 4a.
  • If the moving body information acquisition unit 4a performs processing each time at least one item of the own vehicle information or the peripheral moving body information is sent from the moving body information detection sensor 31a, and the data is held for a certain period t1 at the time of processing, the line-of-sight information acquired by the line-of-sight information acquisition unit 5 is also retained for the same period.
  • The line-of-sight information acquisition unit 5 stores the line-of-sight information in the line-of-sight information storage unit 33.
  • From the line-of-sight information acquired from the line-of-sight information detection sensor 32 over a predetermined period t2 at a certain time, or from the line-of-sight information stored in the line-of-sight information storage unit 33 over a predetermined period t2 at a certain time, the line-of-sight information acquisition unit 5 acquires visual field information, which is information on the range of the line of sight directed by the driver.
  • The predetermined period t2 used when acquiring the visual field information does not have to be the same as the period of the own vehicle information and the moving body information acquired by the moving body information acquisition unit 4a.
  • FIG. 6 is a conceptual diagram of visual field information according to the first embodiment of the present disclosure.
  • The visual field information is the range 51 of the line of sight directed by the driver of the own vehicle 50 during a predetermined period t2 at a certain time.
  • The line-of-sight information acquisition unit 5 stores the visual field information in the line-of-sight information storage unit 33.
  • In step S103, the surrounding situation information setting unit 101 receives the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5.
  • For example, when the own vehicle turns right at a specific intersection, the surrounding situation information setting unit 101 sets information to the effect of "turning right at a specific intersection", input by the driver or the like, as the surrounding situation information.
  • Here the surrounding situation information is set while driving; however, the situation may be recorded by a drive recorder or the like while driving and the setting made after stopping, parking, or getting off, in which case the surrounding situation information determined and input by the driver or the like from the recorded image may be set as the surrounding situation information.
  • The surrounding situation information setting unit 101 may also set the surrounding situation information from the drive-recorder image stored at the same time as the time when the own vehicle information and the peripheral moving body information are received from the moving body information acquisition unit 4a and the line-of-sight information is received from the line-of-sight information acquisition unit 5.
  • The surrounding situation information setting unit 101 transmits the surrounding situation information to the collision possibility information calculation unit 102.
  • In step S104, the collision possibility information calculation unit 102 receives the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, the line-of-sight information from the line-of-sight information acquisition unit 5, and the surrounding situation information from the surrounding situation information setting unit 101.
  • The collision possibility information calculation unit 102 generates the collision possibility information from the own vehicle information and the peripheral moving body information received from the moving body information acquisition unit 4a.
  • FIG. 7 is a conceptual diagram of collision possibility information according to the first embodiment of the present disclosure.
  • FIG. 7a is a diagram in which the own vehicle 50 is arranged at the center, the positions of the peripheral moving bodies 52 to 54 are marked with circles, and their speeds are indicated by arrows.
  • FIG. 7b is a risk map.
  • In FIG. 7b, the blacker the color, the higher the collision probability, and the whiter the color, the lower the collision probability. For simplicity, the collision probability in FIG. 7b does not take velocity-based prediction into consideration; the risk map is calculated from the position information alone, with blacker positions simply indicating where a collision would occur.
  • The gray gradation between black and white indicates measurement error.
  • The collision possibility information calculation unit 102 acquires the position and speed of the own vehicle 50 from the own vehicle information and those of the peripheral moving bodies 52 to 54 from the peripheral moving body information, and creates a map as shown in FIG. 7a. Further, the collision possibility information calculation unit 102 calculates the collision probabilities of the own vehicle 50 with the peripheral moving bodies 52 to 54, and creates a risk map as shown in FIG. 7b from the map of FIG. 7a. In the first embodiment, the collision possibility information calculation unit 102 creates a risk map, but the information on whether or not the own vehicle may collide with a moving body may be calculated by any known calculation method and is not limited to a risk map.
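A position-only risk map like the simplified one of FIG. 7b can be sketched as follows. This is only an illustrative sketch under assumptions of our own: the own vehicle sits at the center of a grid, each peripheral moving body contributes a Gaussian bump of collision probability around its relative position, and the grid size, cell size, and Gaussian spread are arbitrary; velocity-based prediction is omitted, as in the figure.

```python
import numpy as np

def make_risk_map(peripheral_positions, size=21, cell=1.0, sigma=2.0):
    """Position-only risk map sketch. The own vehicle is at the center
    cell; each peripheral moving body, given as (x, y) in meters
    relative to the own vehicle, adds a Gaussian bump of collision
    probability (closer to 1 = blacker = higher probability)."""
    grid = np.zeros((size, size))
    center = size // 2
    ys, xs = np.mgrid[0:size, 0:size]
    for (px, py) in peripheral_positions:
        # convert the relative position (meters) to grid coordinates
        gx, gy = center + px / cell, center + py / cell
        grid += np.exp(-((xs - gx) ** 2 + (ys - gy) ** 2) / (2 * sigma ** 2))
    return np.clip(grid, 0.0, 1.0)
```

The gray gradation of FIG. 7b (measurement error) corresponds here to the Gaussian falloff around each body.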
  • The collision possibility information calculation unit 102 transmits the collision possibility information, the line-of-sight information corresponding to it, and the surrounding situation information to the information association unit 103.
  • In step S105, the information association unit 103 associates the collision possibility information received from the collision possibility information calculation unit 102 with the line-of-sight information and the surrounding situation information, and stores them in the corresponding information storage unit 36.
  • Specifically, the information association unit 103 stores the collision possibility information in the collision possibility information storage unit 361, the line-of-sight information corresponding to the collision possibility information in the corresponding line-of-sight information storage unit 362, and the surrounding situation information corresponding to the collision possibility information in the surrounding situation information storage unit 363.
  • The method of association may be any method that makes the correspondence known, such as sharing a time stamp or assigning the same number.
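The "assigning the same number" style of association can be sketched as three stores sharing one key. This is a toy stand-in of our own for the corresponding information storage unit 36; the class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from itertools import count

_serial = count()  # shared serial number generator (a timestamp would also work)

@dataclass
class CorrespondingInfoStore:
    """Toy stand-in for the corresponding information storage unit 36:
    the three pieces of information are associated simply by being
    stored under the same serial number."""
    collision: dict = field(default_factory=dict)   # cf. storage unit 361
    gaze: dict = field(default_factory=dict)        # cf. storage unit 362
    situation: dict = field(default_factory=dict)   # cf. storage unit 363

    def associate(self, collision_info, gaze_info, situation_info):
        key = next(_serial)
        self.collision[key] = collision_info
        self.gaze[key] = gaze_info
        self.situation[key] = situation_info
        return key
```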
  • For the case immediately before the own vehicle approaches intersection A, which is the example of the first embodiment, the undetected range information calculation unit 104a acquires, from the corresponding information storage unit 36, the collision possibility information, the line-of-sight information, and the surrounding situation information of the period immediately before the own vehicle approaches intersection A.
  • The undetected range information calculation unit 104a calculates the undetected range information from the acquired collision possibility information, line-of-sight information, and surrounding situation information.
  • FIG. 8 is a diagram for explaining the period immediately before approaching intersection A according to the first embodiment of the present disclosure. It is assumed that the period from time t11 to time t16 is extracted as the time immediately before approaching intersection A when the own vehicle travels through intersection A once.
  • The undetected range information calculation unit 104a divides the time from t11 to t16 into, for example, five intervals, and considers time t11 to time t12, time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16 separately. Although the period is divided into five here, it need not be divided, or it may be divided into another number of intervals. Moreover, although it is divided evenly here, it may be divided unevenly.
  • For the case immediately before the own vehicle approaches intersection A, the undetected range information calculation unit 104a uses the collision possibility information, the line-of-sight information, and the surrounding situation information of each interval to calculate the line-of-sight application collision possibility information, which is the range overlooked by the driver of the own vehicle during the interval.
  • FIG. 9 is a conceptual diagram of the range that the driver of the own vehicle 50 can see according to the first embodiment of the present disclosure.
  • The undetected range information calculation unit 104a acquires the visual field information 51 from the line-of-sight information storage unit 33 via the line-of-sight information acquisition unit 5.
  • Specifically, the visual field information 51 at a time close to the period from time t11 to time t12 is acquired.
  • The undetected range information calculation unit 104a arranges the line-of-sight information on the own vehicle 50.
  • The undetected range information calculation unit 104a further arranges the visual field information 51 centered on the line-of-sight information 55.
  • The range of the visual field information 51 arranged around the line-of-sight information 55 is the range that the driver of the own vehicle 50 can see, and it is considered that there is no danger within that range.
  • FIG. 10 is a conceptual diagram of line-of-sight application collision possibility information from time t11 to time t12 according to the first embodiment of the present disclosure.
  • The undetected range information calculation unit 104a considers that there is no danger in the range that the driver can see, and, from the range visible to the driver of the own vehicle 50 and the collision possibility information calculated in FIG. 7, calculates the line-of-sight application collision possibility information, which is the information on the range that the driver could not see from time t11 to time t12.
  • Specifically, the undetected range information calculation unit 104a sets the collision probability in the range that the driver can see to zero, leaving only the collision probability in the range that the driver cannot see.
  • The undetected range information calculation unit 104a may merely lower the collision probability in the range visible to the driver instead of setting it to zero.
  • The undetected range information calculation unit 104a likewise calculates the line-of-sight application collision possibility information for time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16, in the same manner as for time t11 to time t12.
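The zero-or-lower step above can be sketched as an array mask. A grid representation of the collision possibility information and the `attenuation` parameter are assumptions for illustration only.

```python
import numpy as np

def apply_line_of_sight(collision_map, visible_mask, attenuation=0.0):
    """Produce line-of-sight application collision possibility info:
    collision probabilities inside the range the driver could see are
    set to zero (or merely lowered, if 0 < attenuation < 1), leaving
    only the probabilities in the range the driver could not see."""
    out = np.asarray(collision_map, dtype=float).copy()
    out[np.asarray(visible_mask, dtype=bool)] *= attenuation
    return out
```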
  • When the time immediately before approaching intersection A in FIG. 8 is from time t11 to time t16 and the own vehicle has traveled through intersection A once, the undetected range information calculation unit 104a takes the logical product of the line-of-sight application collision possibility information of that period and calculates the undetected range information, which is the information on peripheral moving bodies in the range that the driver could not see at all during the period.
  • FIG. 11 is a conceptual diagram of undetected range information from time t11 to time t16 according to the first embodiment of the present disclosure.
  • The undetected range information calculation unit 104a takes the logical product of the line-of-sight application collision possibility information of time t11 to time t12, time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16, and calculates the undetected range information, which is the range overlooked throughout the period from time t11 to time t16.
  • The undetected range information needs to be created from line-of-sight application collision possibility information of the same driver and the same surrounding situation.
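The logical product over the five intervals can be sketched as an elementwise AND over grid maps: a cell counts as undetected only if it carried collision probability in every interval, i.e. the driver never looked there. The grid representation, the `threshold`, and the mean-probability weighting of the surviving cells are assumptions for illustration.

```python
import numpy as np

def undetected_range(interval_maps, threshold=0.0):
    """Logical product over per-interval line-of-sight application
    collision possibility maps (e.g. t11-t12 ... t15-t16): a cell
    survives only if its probability exceeds `threshold` in EVERY
    interval, meaning the driver overlooked it throughout the period.
    Surviving cells keep their mean probability across intervals."""
    maps = [np.asarray(m, dtype=float) for m in interval_maps]
    overlooked_everywhere = np.logical_and.reduce([m > threshold for m in maps])
    return overlooked_everywhere * np.mean(maps, axis=0)
```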
  • The undetected range information calculation unit 104a transmits the undetected range information to the undetected range information storage unit 34, which stores it.
  • The undetected range information calculation unit 104a performs the operation of step S106 on all the collision possibility information stored in the collision possibility information storage unit 361, calculates the undetected range information for each driver under various surrounding situations, and stores it in the undetected range information storage unit 34.
  • In step S107, the warning range information calculation unit 105a receives one or more pieces of undetected range information of the same driver and the same surrounding situation from the undetected range information storage unit 34. Specifically, the warning range information calculation unit 105a receives the first undetected range information, the second undetected range information, ..., and the Nth undetected range information for the case immediately before approaching intersection A with the same driver under the same surrounding situation, as shown in FIG. 8, from the undetected range information storage unit 34.
  • The same driver and the same surrounding situation are examples of predetermined conditions.
  • The warning range information calculation unit 105a takes the logical product of the received first undetected range information, second undetected range information, ..., Nth undetected range information, and calculates the range that the driver always overlooks under the same driver and the same surrounding situation. The calculated range is the warning range information, that is, the range to be warned about.
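The logical product over the N undetected range maps can be sketched the same way: only cells overlooked on every recorded pass survive. The grid representation is again an assumption for illustration.

```python
import numpy as np

def warning_range(undetected_maps):
    """Warning range information: the intersection (logical product)
    of the N undetected range maps recorded for the same driver under
    the same surrounding situation. A True cell is a spot the driver
    overlooked on every recorded pass, i.e. the range to warn about."""
    return np.logical_and.reduce([np.asarray(m) > 0 for m in undetected_maps])
```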
  • Even after executing step S107, the process returns to step S101, and the above processing is repeated until there is a trigger to end it, such as the power being turned off or an end operation being performed. Although the above processing is assumed to be repeated, it may be performed only once without repetition. This is the machine learning process that generates, in advance, the information on which the decision of whether a warning should be given is based.
  • Although the information generation device 1 is mounted on the own vehicle in the first embodiment, it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like that is not mounted on the own vehicle.
  • FIG. 12 is a flowchart showing the operation of the warning device 2 according to the first embodiment of the present disclosure. The operation of the warning device 2 in step S4 of FIG. 4 will be described below with reference to FIG.
  • In step S201, the moving body information acquisition unit 4a performs the same operation as in step S101.
  • However, the moving body information acquisition unit 4a transmits the acquired own vehicle information and peripheral moving body information not to the information generation unit 11a but to the surrounding situation determination unit 21a and the warning target determination unit 22a.
  • In step S202, the line-of-sight information acquisition unit 5 performs the same operation as in step S102. However, the visual field information is not acquired.
  • the surrounding situation determination unit 21a receives the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, the line-of-sight information from the line-of-sight information storage unit 33 via the line-of-sight information acquisition unit 5, and the map information storage unit. Get map information from 35. From the acquired information, the surrounding situation determination unit 21a determines whether or not the current surrounding situation of the own vehicle matches the surrounding situation stored in the surrounding situation information storage unit 363.
  • the surrounding situation determination unit 21a determines the matched surrounding situation as the current surrounding situation of the own vehicle, and step S203: Yes.
  • The surrounding situation determination unit 21a transmits the result to the warning target determination unit 22a, and proceeds to step S204. For example, if the line-of-sight information indicates that the line of sight of the driver of the own vehicle is mainly moving to the right, and the own vehicle information and the map information indicate that the own vehicle is located near a specific intersection, the surrounding situation determination unit 21a determines that the vehicle is about to turn right at the specific intersection.
  • Similarly, the surrounding situation determination unit 21a may determine that the vehicle is about to go straight at a specific intersection.
  • the determination method is not limited to the above.
  • If there is no match, the surrounding situation determination unit 21a determines that there is no matching surrounding situation (step S203: No), and ends the operation.
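The determination of steps S203 described above can be sketched as follows. This is a minimal illustration only: the gaze-sample representation (signed angles, positive meaning rightward), the `determine_situation` function, the 0.6 majority threshold, and the situation labels are all assumptions for demonstration and are not part of the disclosure.

```python
# Illustrative sketch of the surrounding situation determination (step S203).
# Names, thresholds, and labels below are hypothetical, not from the patent.

def determine_situation(gaze_samples, near_intersection):
    """Return a stored surrounding situation matching the current state, or None."""
    if not near_intersection:
        # No matching surrounding situation: corresponds to step S203: No.
        return None
    # Classify the dominant gaze direction over the recent samples
    # (positive angle = rightward, negative = leftward; an assumption).
    rightward = sum(1 for g in gaze_samples if g > 0)
    leftward = sum(1 for g in gaze_samples if g < 0)
    if rightward > len(gaze_samples) * 0.6:
        return "right_turn_at_intersection"
    if leftward > len(gaze_samples) * 0.6:
        return "left_turn_at_intersection"
    return "straight_at_intersection"

# Gaze mostly to the right while near an intersection: a right turn is inferred.
situation = determine_situation([5.0, 12.0, 8.0, 15.0, 3.0], near_intersection=True)
```

As in the text, the result (or the absence of a match) is what would be forwarded to the warning target determination unit.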
  • In step S204, the warning target determination unit 22a receives the result of the surrounding situation determination from the surrounding situation determination unit 21a, the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5.
  • The warning target determination unit 22a acquires, from the warning range information storage unit 37, the warning range information corresponding to the same driver as the driver of the own vehicle and to the same surrounding situation as the received surrounding situation determination result.
  • the warning target determination unit 22a determines whether or not the peripheral moving object is a warning target from the received or acquired information.
  • If the own vehicle information, the peripheral moving body information, and the warning range information indicate that the peripheral moving body is located in the warning range, and the line-of-sight information indicates that the driver of the own vehicle does not direct the line of sight to the warning range, the warning target determination unit 22a determines that the peripheral moving body located in the warning range is a warning target (step S204: Yes).
  • the warning target determination unit 22a controls the warning unit 23 to give a warning to the driver of the own vehicle, and proceeds to step S205.
  • If the peripheral moving body is not located in the warning range, or if the driver of the own vehicle directs the line of sight to the warning range, the warning target determination unit 22a determines that the peripheral moving body is not a warning target (step S204: No), and ends the operation.
  • Note that the warning target determination unit 22a may control the warning unit 23 to warn the driver of the own vehicle whenever it determines, from the own vehicle information, the peripheral moving body information, and the warning range information alone, that the peripheral moving body is located in the warning range.
  • Further, when the warning target determination unit 22a can separately determine from the own vehicle information and the peripheral moving body information that there is no possibility of collision, for example because the peripheral moving body located in the warning range is moving in the direction opposite to the moving direction of the own vehicle, it may determine that the peripheral moving body is not a warning target (step S204: No) and end the operation.
  • In step S205, the warning unit 23 receives, from the warning target determination unit 22a, the determination result of whether or not the peripheral moving body is a warning target, and if it is a warning target, warns the driver of the own vehicle to call attention.
  • The warning method may be any method, such as sound, vibration, or display on a screen such as a car navigation display.
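The decision of step S204 can be sketched as follows. This is a minimal illustration assuming that the warning range and the driver's current gaze coverage are both expressed as sets of grid cells around the own vehicle; the function name and this data format are assumptions for demonstration, not part of the disclosure.

```python
# Illustrative sketch of the warning target determination (step S204).
# The set-of-grid-cells representation is a hypothetical simplification.

def is_warning_target(body_cell, warning_range, gazed_cells):
    """Warn only if the peripheral moving body is inside the warning range
    and the driver's line of sight is not directed at that part of the range."""
    return body_cell in warning_range and body_cell not in gazed_cells

warning_range = {(1, 2), (1, 3), (2, 3)}   # cells the driver tends to overlook
gazed_cells = {(1, 3)}                     # cells currently covered by the gaze

assert is_warning_target((1, 2), warning_range, gazed_cells)      # warn (S204: Yes)
assert not is_warning_target((1, 3), warning_range, gazed_cells)  # gaze covers it
assert not is_warning_target((5, 5), warning_range, gazed_cells)  # outside range
```

The first modification in the text (warning whenever the body is in the warning range) would simply drop the gaze check.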
  • If the result is No in step S203 or No in step S204, or after executing step S205, the process returns to step S201, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed that the above processing is repeated, it may be performed only once.
  • As described above, in the warning system 3 of the first embodiment, the information generation device 1 generates warning range information as a basis for determining whether a warning should be given in consideration of the surrounding situation, and the warning device 2 determines whether or not to issue a warning after taking into account the current surrounding situation of the own vehicle, so that it is possible to accurately infer peripheral moving bodies that are often overlooked and to perform warning control without using sensory parameters.
  • By narrowing down the warning target, the warning system 3 can give the minimum necessary warning, a warning about peripheral moving bodies overlooked by the driver. Because the warning target is narrowed down, the effect of the warning is high, safer driving can be performed, and warnings are less likely to be ignored, resulting in accident reduction.
  • Further, since the warning system 3 of the first embodiment obtains the minimum range as the range overlooked by the driver, there is a high possibility that the driver has actually overlooked the peripheral moving bodies located in that range, which increases the reliability of the warning that the driver should be careful because of an oversight.
  • the conventional technology has a problem that the parameters are influenced by the mood of the driver because the driver needs to manually set the sensory parameters.
  • the warning system 3 uses objective parameters instead of sensory parameters, the accuracy is better than that of the prior art.
  • the parameters can be set by a person other than the driver.
  • Further, the surrounding situation information setting unit 101 of the warning system 3 acquires the surrounding situation as an image with a drive recorder, a camera, or the like, performs image recognition or machine learning on the acquired image, and automatically sets the surrounding situation, thereby reducing the load of manually setting the parameters.
  • the warning system 3 is more accurate than the prior art by associating the line-of-sight information with the surrounding situation information and considering the behavior of each driver in the surrounding situation.
  • Although the warning system 3 is mounted on a vehicle as an example in the first embodiment, it can be applied to a moving body such as an airplane, a ship, a train, or a person.
  • The warning system 3 includes the line-of-sight information detection sensor 32, but any information from which the line-of-sight information of the driver can be calculated or estimated may be used instead.
  • the warning system 3 may include a sensor for detecting the posture of the driver, a sensor for detecting the direction of the driver's face, and the like, instead of the line-of-sight information detection sensor 32.
  • Embodiment 2 In the first embodiment, the undetected range information calculation unit 104a and the warning range information calculation unit 105a of the information generation unit 11a calculate the undetected range information and the warning range information using the logical product, narrow down the warning target, and give the minimum necessary warning, a warning about peripheral moving bodies that the driver has overlooked.
  • In contrast, in the second embodiment, the undetected range information calculation unit 104b and the warning range information calculation unit 105b of the information generation unit 11b calculate the undetected range information and the warning range information using the logical sum (OR).
  • FIG. 13 is a block diagram of the information generation unit 11b according to the second embodiment of the present disclosure.
  • the block diagram of the warning system 3 including the information generation device 1 and the warning device 2 according to the second embodiment is the same as that of FIG.
  • The undetected range information calculation unit 104b and the warning range information calculation unit 105b are added to the configuration of the functional block diagram in place of the undetected range information calculation unit 104a and the warning range information calculation unit 105a of FIG. 2 of the first embodiment.
  • The undetected range information calculation unit 104b calculates the line-of-sight application collision possibility information at each time in a certain surrounding situation, and calculates the undetected range information in that surrounding situation using the logical sum of the plurality of pieces of line-of-sight application collision possibility information.
  • The logical sum is used because a certain surrounding situation is highly likely to correspond to a continuous period.
  • the undetected range information calculation unit 104b is the same as the undetected range information calculation unit 104a.
  • The warning range information calculation unit 105b uses the received information to take, for each driver, the logical sum of a plurality of pieces of undetected range information that correspond to the same surrounding situation information but have different periods, and calculates the warning range information.
  • By taking, for each driver, the logical sum of a plurality of pieces of undetected range information corresponding to the same surrounding situation information at different times, it is possible to calculate the warning range information, which is the range in which the driver of the own vehicle may overlook peripheral moving bodies in that surrounding situation.
  • the warning range information calculation unit 105b is the same as the warning range information calculation unit 105a.
  • the hardware configuration diagram of the warning system 3 according to the second embodiment of the present disclosure is the same as that of FIG. 3 of the first embodiment.
  • The hardware configuration of the undetected range information calculation unit 104b is the same as that of the undetected range information calculation unit 104a, and the hardware configuration of the warning range information calculation unit 105b is the same as that of the warning range information calculation unit 105a.
  • the flowchart showing the operation of the warning system 3 according to the second embodiment of the present disclosure is the same as that of FIG. 4 of the first embodiment. Further, the flowchart showing the operation of the warning device 2 according to the second embodiment of the present disclosure is the same as that of FIG. 12 of the first embodiment. That is, the operation of the warning device 2 in step S4 of FIG. 4 is the same as that of the first embodiment.
  • FIG. 14 is a flowchart showing the operation of the information generation device 1 according to the second embodiment of the present disclosure. The operation of the information generation device 1 in step S3 of FIG. 4 will be described below with reference to FIG.
  • Steps S301 to S305 are the same as steps S101 to S105 of the first embodiment.
  • In step S306, in the case immediately before the own vehicle approaches the intersection A, which is the same example as in the first embodiment, the undetected range information calculation unit 104b acquires, from the corresponding information storage unit 36, the collision possibility information, the line-of-sight information, and the surrounding situation information of the period immediately before the own vehicle approaches the intersection A.
  • the undetected range information calculation unit 104b calculates the undetected range information from the acquired collision possibility information, the line-of-sight information, and the surrounding situation information.
  • the figure for explaining the period immediately before approaching the intersection A according to the second embodiment of the present disclosure is the same as that of FIG.
  • Like the undetected range information calculation unit 104a of the first embodiment, the undetected range information calculation unit 104b calculates the line-of-sight application collision possibility information for each of time t11 to time t12, time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16.
  • When the time immediately before approaching the intersection A in FIG. 8 is from time t11 to time t16 and the own vehicle travels through the intersection A once, the undetected range information calculation unit 104b takes the logical sum of the line-of-sight application collision possibility information during that period, and calculates the undetected range information, which is the information on peripheral moving bodies in the range that the driver could not see even for a moment during the period.
  • FIG. 15 is a conceptual diagram of undetected range information from time t11 to time t16 according to the second embodiment of the present disclosure.
  • The undetected range information calculation unit 104b takes the logical sum of the line-of-sight application collision possibility information for each of time t11 to time t12, time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16, and calculates the undetected range information, which is the information on peripheral moving bodies in the range that the driver could not see even for a moment.
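The logical sum over the time slices can be sketched as follows. This is a minimal illustration assuming that each slice's line-of-sight application collision possibility information is expressed as a boolean grid of cells around the own vehicle (True meaning "collision possible and not covered by the gaze"); the grid size, contents, and function name are assumptions for demonstration.

```python
# Illustrative sketch of step S306: OR the per-interval boolean grids to get
# the undetected range over the whole period. Data shapes are hypothetical.

def or_ranges(slices):
    """Take the logical sum (OR) of per-interval boolean grids."""
    result = [[False] * len(slices[0][0]) for _ in slices[0]]
    for grid in slices:
        for i, row in enumerate(grid):
            for j, cell in enumerate(row):
                result[i][j] = result[i][j] or cell
    return result

t11_t12 = [[True, False], [False, False]]   # unseen cells during t11-t12
t12_t13 = [[False, False], [False, True]]   # unseen cells during t12-t13
undetected = or_ranges([t11_t12, t12_t13])
# Any cell unseen in at least one interval is part of the undetected range.
```

The same helper would be applied to all five intervals t11 through t16 in the text's example.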
  • step S306 is the same as step S106 of the first embodiment.
  • In step S307, the warning range information calculation unit 105b takes the logical sum of the received first undetected range information, second undetected range information, ..., Nth undetected range information, and calculates the warning range information, which is the range in which the same driver may have overlooked peripheral moving bodies under the same surrounding situation. The calculated range is the warning range information, that is, the range to be warned about.
  • step S307 is the same as step S107 of the first embodiment.
  • Even after executing step S307, the process returns to step S301, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed that the above processing is repeated, it may be performed only once. This machine learning processing generates, in advance, the information that serves as the basis for determining whether a warning should be given.
  • In the second embodiment, the information generation device 1 is mounted on the own vehicle, but it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like that is not mounted on the own vehicle.
  • As described above, in the warning system 3 of the second embodiment, the information generation device 1 generates warning range information as a basis for determining whether a warning should be given in consideration of the surrounding situation, and the warning device 2 determines whether or not to issue a warning after taking into account the current surrounding situation of the own vehicle, so that it is possible to accurately infer peripheral moving bodies that are often overlooked and to perform warning control without using sensory parameters.
  • the undetected range information calculation unit 104b and the warning range information calculation unit 105b of the warning system 3 calculate the undetected range information and the warning range information by using the logical sum.
  • By this calculation using the logical sum, a wide range that the driver of the own vehicle may have overlooked can be obtained, and a warning can be given about peripheral moving bodies that the driver may have overlooked, thereby enhancing safety. As a result, safer driving can be performed, and it is considered that accidents can be reduced.
  • the effect of the second embodiment is the same as the effect of the first embodiment.
  • In the second embodiment, the collision possibility information calculation unit 102 uses the risk map as it is, but by setting collision probabilities below a predetermined threshold value to zero, only ranges with a high risk may be left. By doing so, warning range information can be created only for high-risk ranges, which increases the reliability of the warning that the driver should be careful because of an oversight.
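This thresholding of the risk map can be sketched as follows. The map layout (a grid of collision probabilities) and the 0.5 threshold are illustrative assumptions; the patent only specifies zeroing probabilities below a predetermined threshold.

```python
# Illustrative sketch of the modification: zero out collision probabilities
# below a threshold so only high-risk cells remain. Values are hypothetical.

def keep_high_risk(risk_map, threshold=0.5):
    """Return a copy of the risk map with sub-threshold probabilities zeroed."""
    return [[p if p >= threshold else 0.0 for p in row] for row in risk_map]

risk_map = [[0.9, 0.2], [0.4, 0.7]]
filtered = keep_high_risk(risk_map)
# filtered == [[0.9, 0.0], [0.0, 0.7]]
```

The downstream undetected range and warning range calculations would then operate only on the surviving high-risk cells.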
  • the modified example of the second embodiment is the same as the modified example of the first embodiment.
  • Embodiment 3 In the first embodiment, the warning range information calculation unit 105a of the information generation unit 11a calculates the warning range information by using one or more pieces of undetected range information of the same driver and the same surrounding situation.
  • The surrounding situation determination unit 21a determines, from the acquired information, whether or not the current surrounding situation of the own vehicle matches a surrounding situation stored in the surrounding situation information storage unit 363, and if no match is found, determines that there is no matching surrounding situation and ends the operation.
  • The warning target determination unit 22a determines whether or not the peripheral moving body is a warning target by using the warning range information of the same driver as the driver of the own vehicle and of the same surrounding situation as the received surrounding situation determination result.
  • In contrast, in the third embodiment, the warning range information calculation unit 105c of the information generation unit 11c calculates the warning range information using one or more pieces of undetected range information of the same driver, regardless of the surrounding situation.
  • The surrounding situation determination unit 21b transmits the determined result to the warning target determination unit 22b and continues the operation, regardless of whether or not the current surrounding situation of the own vehicle matches a surrounding situation stored in the surrounding situation information storage unit 363.
  • The warning target determination unit 22b determines whether or not the peripheral moving body is a warning target by using the warning range information of the same driver as the driver of the own vehicle, regardless of the surrounding situation.
  • By calculating the warning range information of the same driver across various surrounding situations, the warning range information calculation unit 105c can obtain the range information that the driver tends to overlook while driving, that is, it can calculate the "missing habit" of each driver.
  • As a result, the surrounding situation determination unit 21b and the warning target determination unit 22b can determine whether or not the peripheral moving body is a warning target regardless of the surrounding situation. Other than that, the third embodiment is the same as the first embodiment.
  • In the following description, configurations and operations already described are designated by the same reference numerals, and duplicate description will be omitted.
  • FIG. 16 is a block diagram of the warning system 3 including the information generation device 1 and the warning device 2 according to the third embodiment of the present disclosure.
  • The information generation unit 11c, the surrounding situation determination unit 21b, and the warning target determination unit 22b are added to the configuration of the functional block diagram in place of the information generation unit 11a, the surrounding situation determination unit 21a, and the warning target determination unit 22a of FIG. 1 of the first embodiment.
  • the information generation unit 11c generates warning range information. Details will be described later.
  • The surrounding situation determination unit 21b transmits the determined result to the warning target determination unit 22b regardless of whether or not the current surrounding situation of the own vehicle matches a surrounding situation stored in the surrounding situation information storage unit 363. Other than that, the surrounding situation determination unit 21b is the same as the surrounding situation determination unit 21a.
  • The warning target determination unit 22b acquires, from the warning range information storage unit 37, the warning range information generated in advance, which is the range in which the driver of the own vehicle tends to overlook peripheral moving bodies, regardless of the surrounding situation. Other than that, the warning target determination unit 22b is the same as the warning target determination unit 22a.
  • FIG. 17 is a block diagram of the information generation unit 11c according to the third embodiment of the present disclosure.
  • The warning range information calculation unit 105c is added to the configuration of the functional block diagram in place of the warning range information calculation unit 105a of FIG. 2 of the first embodiment.
  • the warning range information calculation unit 105c receives the undetected range information from the undetected range information storage unit 34 regardless of the surrounding situation information.
  • The warning range information calculation unit 105c uses the received information to take, for each driver, the logical product of a plurality of pieces of undetected range information with different periods, regardless of the surrounding situation information, and calculates the warning range information.
  • the warning range information calculation unit 105c is the same as the warning range information calculation unit 105a.
  • the hardware configuration diagram of the warning system 3 according to the third embodiment of the present disclosure is the same as that of FIG. 3 of the first embodiment.
  • The hardware configuration of the warning range information calculation unit 105c is the same as that of the warning range information calculation unit 105a, the hardware configuration of the surrounding situation determination unit 21b is the same as that of the surrounding situation determination unit 21a, and the hardware configuration of the warning target determination unit 22b is the same as that of the warning target determination unit 22a.
  • FIG. 18 is a flowchart showing the operation of the information generation device 1 according to the third embodiment of the present disclosure. The operation of the information generation device 1 in step S3 of FIG. 4 will be described below with reference to FIG.
  • Steps S401 to S406 are the same as steps S101 to S106 of the first embodiment.
  • In step S407, the warning range information calculation unit 105c receives one or more pieces of undetected range information of the same driver from the undetected range information storage unit 34, regardless of the surrounding situation. Specifically, the warning range information calculation unit 105c receives, from the undetected range information storage unit 34, the first undetected range information, the second undetected range information, ..., the Nth undetected range information of the same driver, such as the case immediately before approaching the intersection A as shown in FIG.
  • The warning range information calculation unit 105c takes the logical product of the received first undetected range information, second undetected range information, ..., Nth undetected range information, and calculates the range that the same driver always overlooks.
  • the calculated range is the warning range information that is the range to be warned.
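The logical product of step S407 can be sketched as follows. This is a minimal illustration assuming each piece of undetected range information is a set of grid cells around the own vehicle; the representation and function name are assumptions for demonstration. Intersecting the ranges across periods leaves only the cells the driver overlooked every time, which is what the text calls the driver's "missing habit".

```python
# Illustrative sketch of step S407: AND (intersect) the undetected ranges from
# N periods of the same driver. The set-of-cells format is hypothetical.
from functools import reduce

def and_ranges(undetected_ranges):
    """Take the logical product of undetected ranges (sets of grid cells)."""
    return reduce(set.intersection, undetected_ranges)

first = {(1, 2), (2, 2), (3, 1)}   # overlooked cells, first period
second = {(1, 2), (2, 2)}          # overlooked cells, second period
third = {(2, 2), (0, 0)}           # overlooked cells, third period
warning_range = and_ranges([first, second, third])
# warning_range == {(2, 2)}: the only cell overlooked in every period.
```

Replacing `set.intersection` with `set.union` would give the logical sum used by the warning range information calculation unit 105b of the second embodiment instead.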
  • step S407 is the same as step S107 of the first embodiment.
  • Even after executing step S407, the process returns to step S401, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed that the above processing is repeated, it may be performed only once. This machine learning processing generates, in advance, the information that serves as the basis for determining whether a warning should be given.
  • Although the information generation device 1 is mounted on the own vehicle in the third embodiment, it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like that is not mounted on the own vehicle.
  • FIG. 19 is a flowchart showing the operation of the warning device 2 according to the third embodiment of the present disclosure. The operation of the warning device 2 in step S4 of FIG. 4 will be described below with reference to FIG.
  • Steps S501 to S502 are the same as steps S201 to S202 of the first embodiment.
  • In step S503, the surrounding situation determination unit 21b determines from the acquired information whether or not the current surrounding situation of the own vehicle matches a surrounding situation stored in the surrounding situation information storage unit 363. If there is a match, the surrounding situation determination unit 21b determines the matched surrounding situation as the current surrounding situation of the own vehicle. If there is no match, it determines that there is no matching surrounding situation. The surrounding situation determination unit 21b then transmits the determined result to the warning target determination unit 22b, and proceeds to step S504. Other than that, step S503 is the same as step S203 of the first embodiment.
  • In step S504, the warning target determination unit 22b receives the result of the surrounding situation determination from the surrounding situation determination unit 21b, the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5.
  • The warning target determination unit 22b acquires, from the warning range information storage unit 37, the warning range information of the same driver as the driver of the own vehicle, regardless of the surrounding situation.
  • the warning target determination unit 22b determines whether or not the peripheral moving object is a warning target from the received or acquired information.
  • step S504 is the same as step S204 of the first embodiment.
  • Step S505 is the same as step S205 of the first embodiment.
  • If the result is No in step S504, or after executing step S505, the process returns to step S501, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed that the above processing is repeated, it may be performed only once.
  • As described above, in the warning system 3 of the third embodiment, the information generation device 1 generates warning range information as a basis for determining whether a warning should be given in consideration of the surrounding situation, and the warning device 2 determines whether or not to issue a warning after taking into account the current surrounding situation of the own vehicle, so that it is possible to accurately infer peripheral moving bodies that are often overlooked and to perform warning control without using sensory parameters.
  • Further, the warning range information calculation unit 105c of the information generation unit 11c, the surrounding situation determination unit 21b, and the warning target determination unit 22b of the warning system 3 calculate and determine information only on the condition of being the same driver as the driver of the own vehicle, regardless of the surrounding situation. As a result, the range information that the driver tends to overlook while driving is obtained, the "missing habit" of each driver is calculated, and whether or not a peripheral moving body is a warning target can be determined regardless of the surrounding situation. Other than that, the effect of the third embodiment is the same as the effect of the first embodiment.
  • In the third embodiment, the warning system 3 calculates and determines information only on the condition of being the same driver as the driver of the own vehicle, regardless of the surrounding situation. However, whether to calculate and determine the information for the same driver in the same surrounding situation, or for the same driver regardless of the surrounding situation, may be made selectable by the driver with a hardware switch, a car navigation system, or the like. By doing so, the level at which the driver wants to be alerted can be changed according to the driver's situation at that time, which further prevents the driver from ignoring warnings or stopping the use of the warning device.
  • Further, although the warning system 3 includes the surrounding situation information acquisition unit 101, the surrounding situation information storage unit 363, and the surrounding situation determination unit 21b, the information generation device 1 may create the warning range information by associating the collision possibility information with the corresponding line-of-sight information regardless of the surrounding situation, and the warning device 2 may have the warning target determination unit 22b acquire the map information and give the warning using the warning range information of the same driver as the driver of the own vehicle, regardless of the surrounding situation. By doing so, the amount of calculation can be reduced. Further, when the information generation device 1 and the warning device 2 are mounted on the own vehicle, the weight of the devices can be reduced and fuel consumption is improved.
  • the modified example of the third embodiment is the same as the modified example of the first embodiment.
  • Embodiment 4 In the first embodiment, the moving body information detection sensor 31a is provided in the own vehicle and detects the own vehicle information including the own vehicle position information and the peripheral moving body information including the peripheral moving body position information.
  • The moving body information acquisition unit 4a provided in the information generation device 1 and the warning device 2 acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensor 31a mounted on the own vehicle, so that the operation can be completed using only the sensor mounted on the own vehicle.
  • In contrast, in the fourth embodiment, the moving body information detection sensor 31b is a sensor provided not only in the own vehicle but also in peripheral moving bodies, infrastructure equipment, and the like, and the moving body information acquisition unit 4b provided in the information generation device 1 and the warning device 2 acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensors 31b mounted on the own vehicle, the peripheral moving bodies, the infrastructure equipment, and the like.
  • As a result, the warning system 3 acquires more accurate information and improves the accuracy of the calculated information. Other than that, the fourth embodiment is the same as the first embodiment.
  • In the following description, configurations and operations already described are designated by the same reference numerals, and duplicate description will be omitted.
  • FIG. 20 is a block diagram of a warning system 3 including an information generation device 1 and a warning device 2 according to the fourth embodiment of the present disclosure.
  • the block diagram of the information generation unit 11a according to the fourth embodiment is the same as that of FIG.
  • The moving body information detection sensor 31b and the moving body information acquisition unit 4b are added to the configuration of the functional block diagram in place of the moving body information detection sensor 31a and the moving body information acquisition unit 4a of FIG. 1 of the first embodiment.
  • the mobile information detection sensor 31b is provided in the own vehicle, peripheral mobiles, infrastructure devices, and the like.
  • the mobile information detection sensor 31b provided in the peripheral mobile body and the infrastructure device and the like utilizes the device originally provided in the peripheral mobile body and the infrastructure device and the like.
  • the infrastructure device is, for example, a camera of a roadside sensor or the like.
  • in FIG. 20, only one moving body information detection sensor 31b is shown; however, when information is acquired from a plurality of sensors mounted on the own vehicle, the peripheral moving bodies, the infrastructure devices, and the like, there are a plurality of moving body information detection sensors 31b. Other than that, the moving body information detection sensor 31b is the same as the moving body information detection sensor 31a.
  • the moving body information acquisition unit 4b acquires the own vehicle information and the peripheral moving body information detected by the moving body information detection sensor 31b.
  • the mobile information acquisition unit 4b acquires information from the plurality of mobile information detection sensors 31b.
  • the mobile information acquisition unit 4b is the same as the mobile information acquisition unit 4a.
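  • For reference, the collection role of the moving body information acquisition unit 4b described above can be sketched as follows. This is a purely illustrative, non-limiting example and not part of the disclosure; the Python language and all names and types (`SensorReading`, `acquire_moving_body_info`, and so on) are assumptions introduced here for illustration only.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SensorReading:
    """One detection result from a hypothetical moving body information detection sensor 31b."""
    source: str                                   # "own_vehicle", "peripheral", or "infrastructure"
    own_vehicle_pos: Optional[Tuple[float, float]] = None
    peripheral_positions: List[Tuple[float, float]] = field(default_factory=list)

def acquire_moving_body_info(readings: List[SensorReading]):
    """Unit-4b-style acquisition: merge readings from any number of sensors
    into one (own vehicle position, peripheral moving body positions) pair."""
    own_pos = None
    peripherals = []
    for r in readings:
        if own_pos is None and r.own_vehicle_pos is not None:
            own_pos = r.own_vehicle_pos           # keep the first own-vehicle fix
        peripherals.extend(r.peripheral_positions)
    return own_pos, peripherals

# One on-board sensor and one roadside (infrastructure) camera contribute detections.
own, bodies = acquire_moving_body_info([
    SensorReading("own_vehicle", own_vehicle_pos=(0.0, 0.0),
                  peripheral_positions=[(12.0, 3.0)]),
    SensorReading("infrastructure", peripheral_positions=[(30.0, -4.0)]),
])
# own == (0.0, 0.0); bodies == [(12.0, 3.0), (30.0, -4.0)]
```

  In this sketch the unit simply pools all peripheral detections; how conflicting detections are reconciled is left open, as in the disclosure.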
  • FIG. 21 is a hardware configuration diagram of the warning system 3 according to the fourth embodiment of the present disclosure. The configuration of the warning system 3 according to the fourth embodiment of the present disclosure will be described with reference to FIG. 21.
  • in FIG. 21, a communication I/F 47 is added to the hardware configuration of FIG. 3 of the first embodiment.
  • the warning system 3 is composed of a bus 40, a processor 41, a memory 42, a storage 43, a sensor 44b, an input I/F 45, an output I/F 46, and a communication I/F 47.
  • Each function of the warning system 3 is realized by software, firmware, or a combination of software and firmware.
  • Software, firmware, or a combination of software and firmware is described as a program.
  • the bus 40 is also connected to the communication I/F 47, and each device is electrically connected so as to exchange data.
  • the processor 41 is also connected to the communication I/F 47 via the bus 40 and controls the communication I/F 47.
  • a part of the moving body information acquisition unit 4b is realized by the processor 41 reading the program loaded into the memory 42 and executing it.
  • the memory 42 stores the program that realizes the part of the moving body information acquisition unit 4b.
  • the sensor 44b is a device that detects various types of information.
  • the sensor 44b is, for example, an in-vehicle sensor such as a millimeter wave radar or a camera.
  • the mobile body information detection sensor 31b mounted on the own vehicle and the line-of-sight information detection sensor 32 are realized by the sensor 44b. Other than that, the sensor 44b is the same as the sensor 44a.
  • the communication I/F 47 receives the moving body information from sensors (not shown) that realize the moving body information detection sensors 31b provided in the peripheral moving bodies, the infrastructure devices, and the like.
  • the communication I/F 47 is, for example, a device that performs communication such as wireless communication or communication over the Internet.
  • a part of the moving body information acquisition unit 4b is realized by the communication I/F 47.
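  • For reference, the receiving side of the communication I/F 47 described above could be sketched as follows. This is a non-limiting illustration only: the disclosure does not define any wire format, so the JSON message layout (`source` and `bodies` keys) and the function name are assumptions introduced here.

```python
import json

def receive_moving_body_info(raw: bytes):
    """Hypothetical decoder for a message arriving via the communication I/F 47
    from a sensor in a peripheral moving body or an infrastructure device."""
    msg = json.loads(raw.decode("utf-8"))
    # Return the sender plus (x, y, speed) triples for each detected body.
    return msg["source"], [(b["x"], b["y"], b.get("speed", 0.0)) for b in msg["bodies"]]

# Simulate one message from a roadside camera.
raw = json.dumps({
    "source": "roadside_camera",
    "bodies": [{"x": 3.0, "y": 4.0, "speed": 1.2}],
}).encode("utf-8")
source, bodies = receive_moving_body_info(raw)
# source == "roadside_camera"; bodies == [(3.0, 4.0, 1.2)]
```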
  • FIG. 22 is a flowchart showing the operation of the warning system 3 according to the fourth embodiment of the present disclosure. The operation of the warning system 3 will be described below with reference to FIG.
  • the moving body information detection sensor 31b is provided in the own vehicle, the peripheral moving body, the infrastructure device, and the like, and detects the own vehicle information and the peripheral moving body information of the peripheral moving body existing around the own vehicle.
  • in addition to the moving body information detection sensor 31b provided in the own vehicle as in the first embodiment, a moving body information detection sensor 31b provided in a peripheral moving body, for example a millimeter-wave radar, detects the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle.
  • similarly, a camera of a roadside sensor, which is an infrastructure device, detects the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle.
  • that is, any moving body information detection sensor 31b capable of detecting the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle detects that information, no matter where the sensor is provided.
  • the moving body information detection sensor 31b transmits the detected own vehicle information and peripheral moving body information to the moving body information acquisition unit 4b, and proceeds to step S602.
  • step S601 is the same as step S1 of the first embodiment.
  • Step S602 is the same as step S2 of the first embodiment.
  • in steps S603 to S604, the moving body information acquisition unit 4b acquires the own vehicle information and the peripheral moving body information of the peripheral moving bodies existing around the own vehicle not only from the own vehicle but also from the peripheral moving bodies and the infrastructure devices. Other than that, steps S603 to S604 are the same as steps S3 to S4 of the first embodiment. Details will be described later.
  • even after executing step S604, the process returns to step S601, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed here that the above processing is repeated, it may be performed only once without repeating.
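  • The repeat-until-end-trigger control flow described above can be sketched as follows. This is an illustrative skeleton only; the step functions and trigger check are hypothetical stand-ins for steps S601 to S604 and the power-off/end-operation trigger, not an implementation prescribed by the disclosure.

```python
def run_warning_system(step_fns, should_stop):
    """Run the per-cycle steps in order, then repeat until an end trigger
    (e.g. power off or an end operation) is observed."""
    while True:
        for step in step_fns:
            step()                 # one of steps S601..S604
        if should_stop():          # end trigger check
            break

calls = []
triggers = iter([False, False, True])   # the end trigger fires on the third cycle
run_warning_system(
    [lambda: calls.append("S601"), lambda: calls.append("S604")],
    lambda: next(triggers),
)
# calls now holds three full cycles: ["S601", "S604"] * 3
```

  The disclosure also allows the single-pass variant; that corresponds to a trigger that returns True after the first cycle.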
  • FIG. 23 is a flowchart showing the operation of the information generation device 1 according to the fourth embodiment of the present disclosure. The operation of the information generation device 1 in step S603 of FIG. 22 will be described below with reference to FIG. 23.
  • in step S701, the moving body information acquisition unit 4b acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensors 31b provided in the own vehicle, the peripheral moving bodies, the infrastructure devices, and the like.
  • the mobile information acquisition unit 4b acquires information from the plurality of mobile information detection sensors 31b.
  • a plurality of moving body information acquisition units 4b may be provided, for example, a moving body information acquisition unit 4c and a moving body information acquisition unit 4d.
  • the information acquired by the mobile information acquisition unit 4d may be transmitted to the mobile information acquisition unit 4c, and the information may be unified by the mobile information acquisition unit 4c.
  • information may be transmitted from each of the mobile information acquisition unit 4c and the mobile information acquisition unit 4d to the information generation unit 11a without being unified.
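  • The unification by the moving body information acquisition unit 4c described above can be sketched as follows. This is a purely illustrative example; the disclosure does not specify how unification is performed, so treating positions within a small matching radius as the same detection is an assumption made here, and all names and parameter values are hypothetical.

```python
def unify(info_c, info_d, tol=0.5):
    """Unit-4c-style unification: merge two peripheral-body position lists,
    dropping detections that both units reported at (nearly) the same place.
    Positions are (x, y) tuples; tol is a hypothetical matching radius in metres."""
    merged = list(info_c)
    for (xd, yd) in info_d:
        duplicate = any((xd - xc) ** 2 + (yd - yc) ** 2 <= tol ** 2
                        for (xc, yc) in merged)
        if not duplicate:
            merged.append((xd, yd))
    return merged

# A body seen by both units is kept once; a body seen only by unit 4d is appended.
print(unify([(0.0, 0.0), (10.0, 5.0)], [(0.1, 0.1), (20.0, 0.0)]))
# → [(0.0, 0.0), (10.0, 5.0), (20.0, 0.0)]
```

  The alternative in the text, transmitting from 4c and 4d separately without unification, simply corresponds to skipping this merge and passing both lists to the information generation unit 11a.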
  • step S701 is the same as step S101 of the first embodiment.
  • Steps S702 to S707 are the same as steps S102 to S107 of the first embodiment.
  • even after executing step S707, the process returns to step S701, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed here that the above processing is repeated, it may be performed only once without repeating. The above is the machine learning process that generates, in advance, the information serving as the basis for determining whether a warning should be given.
  • although the information generation device 1 is mounted on the own vehicle in the fourth embodiment, it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like that is not mounted on the own vehicle.
  • FIG. 24 is a flowchart showing the operation of the warning device 2 according to the fourth embodiment of the present disclosure. The operation of the warning device 2 in step S604 of FIG. 22 will be described below with reference to FIG. 24.
  • in step S801, the moving body information acquisition unit 4b performs the same operation as in step S701. However, the moving body information acquisition unit 4b transmits the acquired own vehicle information and peripheral moving body information not to the information generation unit 11a but to the surrounding situation determination unit 21a and the warning target determination unit 22a. Other than that, step S801 is the same as step S201 of the first embodiment.
  • Steps S802 to S805 are the same as steps S202 to S205 of the first embodiment.
  • if the result is No in step S803, if the result is No in step S804, or even after executing step S805, the process returns to step S801, and the above processing is repeated until there is a trigger to end the processing, such as turning off the power or performing an end operation. Although it is assumed here that the above processing is repeated, it may be performed only once without repeating.
  • as described above, in the warning system 3 of the fourth embodiment, the information generation device 1 generates warning range information as the basis for determining whether a warning should be given in consideration of the surrounding situation, and the warning device 2 determines whether or not to issue a warning after taking the current surrounding conditions of the own vehicle into account. It is therefore possible to accurately estimate peripheral moving bodies that are often overlooked and to perform warning control without using sensory parameters.
  • furthermore, the warning system 3 uses not only the information of the moving body information detection sensor 31b provided in the own vehicle but also the information of the moving body information detection sensors 31b provided in the peripheral moving bodies and the infrastructure devices. Information can therefore be acquired over a wider area and with higher accuracy, which improves the accuracy of the calculated information. As a result, safer driving becomes possible, and accidents are expected to be reduced. Other than that, the effects of the fourth embodiment are the same as those of the first embodiment.
  • in the fourth embodiment, the moving body information detection sensors 31b are provided in the own vehicle, the peripheral moving bodies, the infrastructure devices, and the like, and the moving body information acquisition unit 4b acquires information from all of the moving body information detection sensors 31b; however, it may acquire information from only some of them.
  • for example, the moving body information acquisition unit 4b may acquire information from the moving body information detection sensors 31b provided in the own vehicle and the peripheral moving bodies, from those provided in the own vehicle and the infrastructure devices, or from those provided in the peripheral moving bodies and the infrastructure devices without using the moving body information detection sensor 31b provided in the own vehicle.
  • the moving body information acquisition unit 4b may also acquire information from only one moving body information detection sensor 31b among the plurality of moving body information detection sensors 31b.
  • that is, the moving body information acquisition unit 4b may use any combination of one or more moving body information detection sensors 31b.
  • the modified example of the fourth embodiment is the same as the modified example of the first embodiment.
  • the information generation devices, warning devices, information generation methods, warning methods, information generation programs, and warning programs shown in the above embodiments are merely examples; they may be combined with other devices as appropriate, and are not limited to the configurations of the embodiments alone.
  • 1 information generation device, 2 warning device, 3 warning system, 4a, 4b moving body information acquisition unit, 5 line-of-sight information acquisition unit, 11a, 11b, 11c information generation unit, 21a, 21b surrounding situation determination unit, 22a, 22b warning target determination unit, 23 warning unit, 31a, 31b moving body information detection sensor, 32 line-of-sight information detection sensor, 33 line-of-sight information storage unit, 34 undetected range information storage unit, 35 map information storage unit, 36 correspondence information storage unit, 37 warning range information storage unit, 40 bus, 41 processor, 42 memory, 43 storage, 44a, 44b sensor, 45 input I/F, 46 output I/F, 47 communication I/F, 50 own vehicle, 51 field of view information, 52-54 peripheral moving body, 55 line-of-sight information, 101 surrounding situation information acquisition unit, 102 collision possibility information calculation unit, 103 information mapping unit, 104a, 104b undetected range information calculation unit, 105a, 105b, 105c

Abstract

In conventional warning devices, parameters that are set according to self-reports are sensory parameters, and such parameters may be erroneously set. If a set parameter is incorrect, a warning may be determined to be necessary even when the warning is unnecessary, or a warning may be determined to be unnecessary even when the warning is necessary. This has resulted in the problem that the accuracy of determining whether or not a warning is necessary deteriorates. The interior of an information generation device (1) is provided with: a collision possibility information calculation unit (102) which uses acquired host vehicle location information and acquired nearby moving body location information to calculate collision possibility information that is information about the possibility of a host vehicle colliding with a nearby moving body; an undetected region information calculation unit (104a) which uses the collision possibility information, acquired sight line information, and acquired ambient condition information to calculate undetected region information regarding the ambient conditions indicated by the ambient condition information; and a warning region information calculation unit (105a) which calculates warning region information using one or more sets of undetected region information.

Description

Information generation device, warning device, information generation method, warning method, information generation program, and warning program
 The present disclosure relates to an information generation device, a warning device, an information generation method, a warning method, an information generation program, and a warning program for estimating peripheral moving bodies that exist around the own vehicle and are often overlooked, and for giving a warning about them.
 In recent years, in the field of vehicle driving support, research and development of in-vehicle systems in which a plurality of sensors are mounted on the own vehicle and the sensor information is used has become active. Research and development is also progressing on systems in which the own vehicle receives and uses, via vehicle-to-vehicle communication, sensor information acquired by sensors mounted on other vehicles, and on systems in which the own vehicle receives and uses, via road-to-vehicle communication, sensor information acquired from infrastructure devices such as cameras installed on the road.
 The sensor information acquired from these systems is used, for example, in a warning device that warns the driver of a possible collision of the own vehicle. The warning device uses sensor information acquired from the own vehicle, other vehicles, infrastructure devices, and the like to warn the driver when there is a peripheral moving body, such as another vehicle or a person, that is likely to collide with the own vehicle.
 In a warning device, how aware the driver is of the peripheral moving bodies detected from the sensor information becomes an issue. In particular, if the warning device keeps issuing warnings about peripheral moving bodies that the driver is already aware of, warnings about peripheral moving bodies that the driver is not aware of become buried among the warnings about those the driver is aware of.
 In addition, if the warning device issues warnings about peripheral moving bodies that the driver is already aware of, the driver comes to pay no attention to the warnings, or to ignore them. Furthermore, as warnings about peripheral moving bodies that the driver is aware of increase, the driver finds the many warnings annoying, and the warning device ends up not being used at all.
 In such cases, the warning that should be given, namely the warning about peripheral moving bodies that the driver is not aware of, does not reach the driver or is ignored, and the effect is reduced. Therefore, the warning device is required to narrow down the number of warnings and to warn only about peripheral moving bodies that the driver is not aware of.
 Patent Document 1 discloses a technique for determining the necessity of a warning for an individual driver in a given surrounding environment by using parameters set by the driver through self-report.
JP-A-2009-163286
 However, in the conventional warning device described above, the parameters set by self-report are sensory parameters, and there is a possibility that the parameters are set incorrectly. If the set parameters are wrong, a warning may be judged necessary when it is unnecessary, or judged unnecessary when it is necessary, so that the accuracy of determining the necessity of a warning may deteriorate.
 The present disclosure has been made to solve the above problems, and an object thereof is to provide an information generation device, a warning device, an information generation method, a warning method, an information generation program, and a warning program for accurately estimating peripheral moving bodies that are often overlooked and giving a warning.
 The information generation device according to the present disclosure includes: a moving body information acquisition unit that acquires own vehicle position information indicating the position of the own vehicle within a predetermined period, and peripheral moving body position information indicating the positions, within the predetermined period, of peripheral moving bodies existing around the own vehicle; a line-of-sight information acquisition unit that acquires line-of-sight information, which is the direction of the line of sight directed by the driver of the own vehicle within the predetermined period; a surrounding situation information acquisition unit that acquires surrounding situation information, which is the surrounding situation of the own vehicle within the predetermined period; a collision possibility information calculation unit that calculates, using the own vehicle position information and the peripheral moving body position information, collision possibility information indicating the possibility that the own vehicle and a peripheral moving body collide; an undetected range information calculation unit that calculates, using the collision possibility information, the line-of-sight information, and the surrounding situation information, undetected range information indicating the range overlooked by the driver of the own vehicle in the surrounding situation indicated by the surrounding situation information; and a warning range information calculation unit that calculates, using one or more pieces of undetected range information, warning range information indicating a range in which a peripheral moving body to be warned about to the driver of the own vehicle is located.
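 For reference, the collision possibility calculation described above, which uses only position information over a period, can be sketched as follows. This is a non-limiting illustration: the disclosure does not specify the calculation, so the constant-velocity extrapolation, the horizon, the closeness radius, and the mapping from time of approach to a possibility value are all assumptions introduced here, and every name is hypothetical.

```python
def collision_possibility(own_track, other_track, horizon=5.0, dt=0.5, radius=2.0):
    """Sketch of a collision possibility calculation from position histories:
    extrapolate both bodies at constant velocity and report how early within
    the horizon they come within `radius` metres of each other (an earlier
    approach gives a value closer to 1.0; no approach gives 0.0).

    A track is a list [(t, x, y), ...] with at least two samples."""
    def last_state(track):
        (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
        return (x1, y1), ((x1 - x0) / (t1 - t0), (y1 - y0) / (t1 - t0))

    (ox, oy), (ovx, ovy) = last_state(own_track)
    (px, py), (pvx, pvy) = last_state(other_track)
    t = dt
    while t <= horizon:
        dx = (ox + ovx * t) - (px + pvx * t)
        dy = (oy + ovy * t) - (py + pvy * t)
        if dx * dx + dy * dy <= radius * radius:
            return 1.0 - t / horizon
        t += dt
    return 0.0

own_track   = [(0.0, 0.0, 0.0), (1.0, 10.0, 0.0)]    # heading +x at 10 m/s
other_track = [(0.0, 50.0, 0.0), (1.0, 40.0, 0.0)]   # oncoming at 10 m/s
risk = collision_possibility(own_track, other_track)
# the extrapolated tracks meet 1.5 s ahead within the 5 s horizon, so risk is 0.7
```

 A richer implementation could emit a risk map over positions, as suggested by FIG. 7b, rather than a single scalar.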
 The warning device according to the present disclosure includes: a moving body information acquisition unit that acquires first own vehicle position information indicating the position of the own vehicle within a first predetermined period, and first peripheral moving body position information indicating the positions, within the first predetermined period, of peripheral moving bodies existing around the own vehicle; and a warning target determination unit that acquires, from among warning range information indicating ranges in which peripheral moving bodies to be warned about to the driver of a moving body are located, the warning range information matching a predetermined condition at least including the condition that the own vehicle and the moving body have the same driver, and that determines, using the first own vehicle position information, the first moving body position information, and the warning range information, whether a peripheral moving body existing around the own vehicle is a warning target. The warning range information is calculated using one or more pieces of undetected range information, the undetected range information being calculated using collision possibility information indicating the possibility that the moving body and a peripheral moving body collide, second line-of-sight information, which is the direction of the line of sight directed by the driver of the moving body within a second predetermined period that precedes the first predetermined period, and surrounding situation information, which is the surrounding situation of the moving body within the second predetermined period; the collision possibility information is calculated using second own vehicle position information indicating the position of the moving body within the second predetermined period and second peripheral moving body position information indicating the positions of peripheral moving bodies that existed around the moving body within the second predetermined period.
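 For reference, the retrieval of warning range information matching a predetermined condition that at least includes the same-driver condition can be sketched as follows. This is purely illustrative: the storage layout (a mapping keyed by driver and location) and all names are assumptions introduced here, not part of the disclosure.

```python
def find_warning_ranges(stored, driver_id, location=None):
    """Select warning range information whose condition matches: the driver
    must be the same, and, optionally, the location must match too."""
    return [rng for (drv, loc), rng in stored.items()
            if drv == driver_id and (location is None or loc == location)]

# Hypothetical contents of a warning range information storage unit.
stored = {
    ("driver_a", "intersection_A"): {"xmin": -5, "xmax": 0, "ymin": 0, "ymax": 8},
    ("driver_b", "intersection_A"): {"xmin": 0, "xmax": 5, "ymin": 0, "ymax": 8},
}
print(find_warning_ranges(stored, "driver_a", "intersection_A"))
# → [{'xmin': -5, 'xmax': 0, 'ymin': 0, 'ymax': 8}]
```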
 According to the present disclosure, peripheral moving bodies that are often overlooked can be accurately estimated and a warning can be given.
FIG. 1 is a block diagram of a warning system including an information generation device and a warning device according to the first embodiment of the present disclosure.
FIG. 2 is a block diagram of an information generation unit according to the first embodiment of the present disclosure.
FIG. 3 is a hardware configuration diagram of the warning system according to the first embodiment of the present disclosure.
FIG. 4 is a flowchart showing the operation of the warning system according to the first embodiment of the present disclosure.
FIG. 5 is a flowchart showing the operation of the information generation device according to the first embodiment of the present disclosure.
FIG. 6 is a conceptual diagram of visual field information according to the first embodiment of the present disclosure.
FIG. 7 is a conceptual diagram of collision possibility information according to the first embodiment of the present disclosure; FIG. 7a shows the own vehicle at the center, with the positions of peripheral moving bodies marked by circles and their speeds by arrows, and FIG. 7b is a risk map.
FIG. 8 is a diagram for explaining the period immediately before approaching the intersection A according to the first embodiment of the present disclosure.
FIG. 9 is a conceptual diagram of the range visible to the driver of the own vehicle according to the first embodiment of the present disclosure.
FIG. 10 is a conceptual diagram of line-of-sight application collision possibility information from time t11 to time t12 according to the first embodiment of the present disclosure.
FIG. 11 is a conceptual diagram of undetected range information from time t11 to time t16 according to the first embodiment of the present disclosure.
FIG. 12 is a flowchart showing the operation of the warning device according to the first embodiment of the present disclosure.
FIG. 13 is a block diagram of an information generation unit according to the second embodiment of the present disclosure.
FIG. 14 is a flowchart showing the operation of the information generation device according to the second embodiment of the present disclosure.
FIG. 15 is a conceptual diagram of undetected range information from time t11 to time t16 according to the second embodiment of the present disclosure.
FIG. 16 is a block diagram of a warning system including an information generation device and a warning device according to the third embodiment of the present disclosure.
FIG. 17 is a block diagram of an information generation unit according to the third embodiment of the present disclosure.
FIG. 18 is a flowchart showing the operation of the information generation device according to the third embodiment of the present disclosure.
FIG. 19 is a flowchart showing the operation of the warning device according to the third embodiment of the present disclosure.
FIG. 20 is a block diagram of a warning system including an information generation device and a warning device according to the fourth embodiment of the present disclosure.
FIG. 21 is a hardware configuration diagram of the warning system according to the fourth embodiment of the present disclosure.
FIG. 22 is a flowchart showing the operation of the warning system according to the fourth embodiment of the present disclosure.
FIG. 23 is a flowchart showing the operation of the information generation device according to the fourth embodiment of the present disclosure.
FIG. 24 is a flowchart showing the operation of the warning device according to the fourth embodiment of the present disclosure.
Embodiment 1.
 In the first embodiment, as an example, a case where the warning system is mounted on a vehicle and the own vehicle is just about to approach an intersection A will be described. In the first stage, the own vehicle travels in advance along a route passing through the intersection A, and the warning system including the information generation device and the warning device stores, in the warning range information storage unit 37, the range that the driver overlooked immediately before approaching the intersection A during that earlier run. In the second stage, in the situation where the own vehicle is now about to travel through the intersection A, the warning system performs warning control when a peripheral moving body is located in the range, stored in advance in the warning range information storage unit 37, that the driver overlooked. A peripheral moving body is a moving object, such as another vehicle or a person, existing around the own vehicle.
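 The second-stage decision described above can be sketched as follows. This is an illustrative, non-limiting example only: representing the stored overlooked range as an axis-aligned rectangle is a simplification assumed here, and all names are hypothetical.

```python
def should_warn(warning_range, peripheral_positions):
    """Warn when any peripheral moving body lies inside the overlooked range
    stored in advance in the warning range information storage unit 37."""
    xmin, xmax, ymin, ymax = warning_range
    return any(xmin <= x <= xmax and ymin <= y <= ymax
               for (x, y) in peripheral_positions)

# Hypothetical range the driver overlooked just before intersection A.
overlooked = (-5.0, 0.0, 0.0, 8.0)
print(should_warn(overlooked, [(-2.0, 3.0)]))   # body inside the overlooked range
print(should_warn(overlooked, [(4.0, 3.0)]))    # body outside it
```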
 FIG. 1 is a block diagram of a warning system 3 including an information generation device 1 and a warning device 2 according to the first embodiment of the present disclosure. The warning system 3 includes: a moving body information detection sensor 31a that detects information on the own vehicle and on peripheral moving bodies existing around the own vehicle; a line-of-sight information detection sensor 32 that detects line-of-sight information of the driver of the own vehicle; a line-of-sight information storage unit 33 that stores the line-of-sight information of the driver of the own vehicle; an undetected range information storage unit 34 that stores undetected range information, described later, which is information on the range the driver overlooked; a map information storage unit 35 that stores map information; a correspondence information storage unit 36 that stores correspondence information, described later, in which a plurality of pieces of information are associated with one another; a warning range information storage unit 37 that stores warning range information, which is information on the range used to determine whether to warn; the information generation device 1, which generates the information serving as the basis for determining whether a warning should be given; and the warning device 2, which determines whether to give a warning and gives a warning when it determines to do so.
The moving body information detection sensor 31a is mounted on the own vehicle and detects own vehicle information, such as own vehicle position information (the position of the own vehicle), speed information, and size information. The own vehicle information includes at least the own vehicle position information. The moving body information detection sensor 31a also detects peripheral moving body information, such as peripheral moving body position information (the positions of moving bodies existing around the own vehicle), speed information, and size information. The peripheral moving body information includes at least the peripheral moving body position information. The own vehicle information and the peripheral moving body information are used to determine whether a peripheral moving body and the own vehicle may collide. The moving body information detection sensor 31a transmits the detected own vehicle information and peripheral moving body information to at least one of the information generation device 1 and the warning device 2.
The line-of-sight information detection sensor 32 detects the line-of-sight information of the driver of the own vehicle and transmits the detected line-of-sight information to at least one of the information generation device 1 and the warning device 2.
The line-of-sight information storage unit 33 receives the line-of-sight information detected by the line-of-sight information detection sensor 32 from at least one of the information generation device 1 and the warning device 2, and stores the received line-of-sight information. It also stores visual field information, i.e. information on the range covered by the driver's line of sight (described later).
The undetected range information storage unit 34 stores undetected range information, i.e. information on the range overlooked by the driver in the situation of approaching intersection A when the own vehicle previously traveled a specific route. Details will be described later.
The map information storage unit 35 stores map information, including building information, road information, and the like, used when determining the surrounding situation of the own vehicle.
The correspondence information storage unit 36 comprises: a collision possibility information storage unit 361 that stores collision possibility information, i.e. information on the possibility of collision between the own vehicle and peripheral moving bodies, such as a risk map; a corresponding line-of-sight information storage unit 362 that stores the driver's line-of-sight information corresponding to the collision possibility information; and a surrounding situation information storage unit 363 that stores surrounding situation information, i.e. information on the surrounding situation corresponding to the collision possibility information.
Surrounding situation information is information on the driver's surroundings that is relevant to driving, such as location, time, weather, and driving situation: for example, the situation of being at intersection A, the time of day being dusk, the weather being rainy, or the driving situation being a right turn. The surrounding situation information is not limited to these examples, as long as it is information on a situation related to driving.
Correspondence information is information that groups together corresponding collision possibility information, corresponding line-of-sight information, and corresponding surrounding situation information. Here, "corresponding" means, for example, covering a period that includes the same time; in the first embodiment it means items at the same time. A period including the same time may be exactly the same time, a period whose start and end times are identical across the items, or a period that includes the same time but whose start and end times differ across the items.
The warning range information storage unit 37 receives warning range information from the information generation device 1 and stores it. The warning range information storage unit 37 transmits the stored warning range information to the warning device 2.
The information generation device 1 comprises: a moving body information acquisition unit 4a that acquires peripheral moving body information; a line-of-sight information acquisition unit 5 that acquires line-of-sight information indicating in which direction the driver of the own vehicle is looking; and an information generation unit 11a that generates warning range information. The information generation device 1 generates the information that serves as the criterion when the warning device 2 determines whether to issue a warning.
The moving body information acquisition unit 4a acquires, over a predetermined period t1, the own vehicle information detected by the moving body information detection sensor 31a at a certain time and the peripheral moving body information of peripheral moving bodies existing around the own vehicle. Since the detection timing of the moving body information detection sensor 31a depends on the sensor, acquiring the own vehicle information and the peripheral moving body information over the time width of the predetermined period t1 absorbs the detection-timing misalignment that occurs when a plurality of sensors is used. Within the information generation device 1, the moving body information acquisition unit 4a transmits the own vehicle information and the peripheral moving body information to the information generation unit 11a.
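The time-window acquisition described above can be sketched as follows. This is a minimal illustration only; the window length, sample format, and function name are assumptions for illustration and are not part of the embodiment.

```python
# Sketch: grouping sensor samples whose timestamps fall within one
# common period t1, so that detection-timing differences between
# multiple sensors are absorbed. All names and values are assumed.

T1 = 0.1  # window width t1 in seconds (assumed value)

def acquire_window(samples, t_start, t1=T1):
    """Return all (timestamp, reading) pairs whose timestamp falls
    inside the acquisition period [t_start, t_start + t1)."""
    return [(t, r) for (t, r) in samples if t_start <= t < t_start + t1]

# Two sensors whose detection timings are slightly offset:
vehicle_samples = [(0.00, "ego@A"), (0.10, "ego@B")]
peripheral_samples = [(0.03, "ped@X"), (0.13, "ped@Y")]

# Both readings near time 0 end up in the same acquisition period,
# even though they were not detected at exactly the same instant.
window = acquire_window(vehicle_samples + peripheral_samples, t_start=0.0)
print(sorted(window))  # [(0.0, 'ego@A'), (0.03, 'ped@X')]
```

The later samples at 0.10 s and 0.13 s would fall into the next acquisition period rather than this one.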
The line-of-sight information acquisition unit 5 acquires the line-of-sight information of the driver of the own vehicle detected by the line-of-sight information detection sensor 32 at a certain time, and stores that line-of-sight information in the line-of-sight information storage unit 33. This time is the same as the time at which the moving body information detection sensor 31a performed detection. Alternatively, the line-of-sight information acquisition unit 5 may acquire the driver's line-of-sight information detected by the line-of-sight information detection sensor 32 over a certain period and store it in the line-of-sight information storage unit 33. This period has the same time width as, and includes, the time at which the moving body information detection sensor 31a performed detection; the start and end of detection need not match exactly.
The line-of-sight information acquisition unit 5 also uses the line-of-sight information acquired from the line-of-sight information detection sensor 32 over a predetermined period t2 at a certain time, or the line-of-sight information stored in the line-of-sight information storage unit 33, to acquire visual field information, i.e. information on the range covered by the driver's line of sight during the predetermined period t2 at that time.
The information generation unit 11a generates the warning range information using the own vehicle information and the peripheral moving body information acquired by the moving body information acquisition unit 4a while the driver drives the own vehicle in various situations in advance, together with the driver's line-of-sight information acquired by the line-of-sight information acquisition unit 5. In the first embodiment, the own vehicle travels in advance along a route passing through intersection A, and warning range information for the moment just before approaching intersection A is generated.
FIG. 2 is a block diagram of the information generation unit 11a according to the first embodiment of the present disclosure. The information generation unit 11a is mainly described with reference to FIG. 2.
In FIG. 2, the information generation unit 11a comprises: a surrounding situation information setting unit 101 that sets surrounding situation information; a collision possibility information calculation unit 102 that calculates collision possibility information; an information association unit 103 that associates the collision possibility information, the line-of-sight information, and the surrounding situation information with one another; an undetected range information calculation unit 104a that calculates undetected range information; and a warning range information calculation unit 105a that calculates warning range information.
The surrounding situation information setting unit 101 sets, as surrounding situation information, the situation in which the traveling own vehicle is placed at a certain time in terms of location, time, weather, driving situation, and so on: for example, just before approaching an intersection, waiting to turn right within an intersection, turning left at an empty intersection, just before changing lanes on a busy road, driving straight toward the sun at dusk, waiting to turn right at night, turning left at an intersection on a rainy day, or changing lanes on the day of a typhoon. A situation that can become surrounding situation information is one in which the movement of the own vehicle is restricted. The surrounding situation information setting unit 101 transmits the surrounding situation information to the collision possibility information calculation unit 102.
This setting may be made while the own vehicle is being driven, or the situation may be recorded by a drive recorder or the like during driving and set after stopping, parking, or exiting the vehicle by reviewing the recorded information. The setting may be made by anyone: the driver of the own vehicle, a passenger, or a person not on board. Although the examples above combine two of location, time, weather, and driving situation, three or more may be combined, and it suffices to set at least one situation. Since the surrounding situation is objective information that does not depend on location, time, or person, it is unlikely to be set incorrectly. When the situation is recorded by a drive recorder or the like, the surrounding situation information may also be set automatically by image recognition.
The collision possibility information calculation unit 102 calculates a movement prediction for the own vehicle and the peripheral moving bodies using the own vehicle information and the peripheral moving body information received from the moving body information acquisition unit 4a, and calculates collision possibility information, such as a risk map that links the possibility of collision between the own vehicle and a peripheral moving body at a certain time, as a collision probability, to coordinates around the own vehicle. The collision possibility information will be described later. The collision possibility information calculation unit 102 transmits the surrounding situation information and the collision possibility information to the information association unit 103.
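One hedged illustration of such a risk map follows. The grid size, the linear motion model, and the time-decay weighting are assumptions made for this sketch only; the embodiment leaves the concrete calculation method to the later description.

```python
# Sketch: a risk map linking collision probability to grid coordinates
# around the own vehicle. The linear motion model and the weighting
# are illustrative assumptions, not the embodiment's actual method.

GRID = 11   # 11x11 cells centred on the own vehicle (assumed)
CELL = 1.0  # metres per cell (assumed)

def risk_map(peripherals, horizon=2.0, steps=4):
    """peripherals: list of (x, y, vx, vy) relative to the own vehicle.
    Returns a GRID x GRID map of collision probabilities in [0, 1],
    built by propagating each body linearly over the horizon."""
    grid = [[0.0] * GRID for _ in range(GRID)]
    for k in range(steps + 1):
        t = horizon * k / steps
        weight = 1.0 - t / (2 * horizon)  # nearer in time -> higher risk
        for (x, y, vx, vy) in peripherals:
            cx = int(round((x + vx * t) / CELL)) + GRID // 2
            cy = int(round((y + vy * t) / CELL)) + GRID // 2
            if 0 <= cx < GRID and 0 <= cy < GRID:
                grid[cy][cx] = max(grid[cy][cx], weight)
    return grid

# A peripheral moving body 4 m ahead of the own vehicle, closing at 2 m/s:
rm = risk_map([(0.0, 4.0, 0.0, -2.0)])
print(rm[GRID // 2][GRID // 2])  # 0.5: the body reaches the ego cell at t=2 s
```

Each cell thus carries a probability-like value tied to its coordinates, which is the shape of information the line-of-sight mask is later applied to.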
The information association unit 103 associates the surrounding situation information and the collision possibility information received from the collision possibility information calculation unit 102 with the line-of-sight information received from the line-of-sight information acquisition unit 5. Specifically, the information association unit 103 associates the surrounding situation information, the collision possibility information, and the line-of-sight information at the same time T1. The information association unit 103 stores the associated surrounding situation information, collision possibility information, and line-of-sight information in the correspondence information storage unit 36.
Since the visual field information varies little, the visual field information including time T1 need not be associated together with the rest, but the information association unit 103 may associate visual field information covering the same time as the line-of-sight information. Instead of visual field information covering the same time, it may be visual field information including the time closest to the time at which the line-of-sight information was detected, visual field information already acquired in a similar past event, or visual field information that is updated at regular intervals. Furthermore, because the period over which the driver can recognize an object may differ depending on the surrounding situation, the period of the visual field information may be changed according to the surrounding situation information.
The collision possibility information is stored in the collision possibility information storage unit 361 of the correspondence information storage unit 36, the line-of-sight information in the corresponding line-of-sight information storage unit 362, and the surrounding situation information in the surrounding situation information storage unit 363. Although the information association unit 103 is described as storing the collision possibility information, the line-of-sight information, and the surrounding situation information in the collision possibility information storage unit 361, the corresponding line-of-sight information storage unit 362, and the surrounding situation information storage unit 363 respectively, as long as the correspondence between them is known, they may be stored in different storage units or in a single storage unit.
The undetected range information calculation unit 104a receives the collision possibility information stored in the correspondence information storage unit 36, together with the corresponding line-of-sight information, visual field information, and surrounding situation information, and uses the received information to calculate undetected range information, i.e. the range in which the driver of the own vehicle overlooked a peripheral moving body in a certain surrounding situation, such as just before approaching intersection A. Specifically, the undetected range information calculation unit 104a calculates line-of-sight-applied collision possibility information, i.e. information on the range where a peripheral moving body existed at time T2 in a certain surrounding situation and which the driver of the own vehicle overlooked. The method of calculating the line-of-sight-applied collision possibility information will be described later.
The undetected range information calculation unit 104a calculates the line-of-sight-applied collision possibility information at each time within a certain surrounding situation, and calculates the undetected range information for that surrounding situation as the logical product of the plural pieces of line-of-sight-applied collision possibility information. The logical product is used here because a given surrounding situation is likely to span a continuous period: for example, the surrounding situation of being just before approaching intersection A is not momentary but lasts at least several seconds. Undetected range information can also be created for a momentary situation; in that case, the undetected range information calculation unit 104a takes the single piece of line-of-sight-applied collision possibility information for that moment as the undetected range information. The undetected range information calculation unit 104a stores the surrounding situation information and the corresponding undetected range information in the undetected range information storage unit 34.
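The logical product described above can be sketched as a cell-wise AND over boolean grids. The grid contents below are invented example data: True marks a cell where a peripheral moving body was present and the driver's gaze did not cover it.

```python
# Sketch: the undetected range as the logical product (AND) of
# line-of-sight-applied collision possibility maps taken at each
# time within one surrounding situation. Data are illustrative.

def logical_and(maps):
    """Cell-wise AND over a list of equally sized boolean grids."""
    rows, cols = len(maps[0]), len(maps[0][0])
    return [[all(m[r][c] for m in maps) for c in range(cols)]
            for r in range(rows)]

# Line-of-sight-applied collision possibility at three times while
# approaching intersection A (2x3 grids):
t1 = [[True,  True,  False],
      [False, True,  False]]
t2 = [[True,  True,  False],
      [False, False, False]]
t3 = [[True,  False, False],
      [False, True,  False]]

undetected = logical_and([t1, t2, t3])
print(undetected)  # only the cell overlooked at every time stays True
```

Only the top-left cell, overlooked at all three times, survives the AND, which matches the idea that the undetected range is what the driver misses throughout the situation.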
The warning range information calculation unit 105a receives the surrounding situation information and the corresponding undetected range information from the undetected range information storage unit 34. Using the received information, the warning range information calculation unit 105a calculates the warning range information as the logical product of plural pieces of undetected range information that correspond, for the same driver, to the same kind of surrounding situation information but to different periods. By taking the logical product of these pieces of undetected range information, it is possible to calculate the warning range information, i.e. the range in which the driver of the own vehicle overlooks peripheral moving bodies in that surrounding situation at any time. In other words, the warning range information is the range for which a warning should be issued when a peripheral moving body exists within it and the driver of the own vehicle is not directing his or her line of sight toward it. The warning range information calculation unit 105a stores the warning range information in the warning range information storage unit 37.
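The second logical product, across runs rather than across times, can be sketched with the undetected ranges represented as sets of overlooked grid cells. The runs and cell coordinates below are invented example data.

```python
# Sketch: the warning range as the intersection (logical product) of
# undetected ranges obtained on different advance runs under the same
# surrounding situation ("just before intersection A"). Data assumed.

run1 = {(0, 0), (0, 1), (2, 2)}   # cells overlooked on run 1
run2 = {(0, 0), (2, 2), (3, 1)}   # cells overlooked on run 2
run3 = {(0, 0), (2, 2)}           # cells overlooked on run 3

# A cell enters the warning range only if the driver overlooked it
# on every run, i.e. habitually, in this surrounding situation.
warning_range = run1 & run2 & run3
print(sorted(warning_range))  # [(0, 0), (2, 2)]
```

Cells overlooked only occasionally, such as (0, 1) or (3, 1), drop out, so the stored warning range captures the driver's habitual blind spots rather than one-off lapses.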
Returning to FIG. 1, the warning device 2 comprises: a moving body information acquisition unit 4a that acquires peripheral moving body information; a line-of-sight information acquisition unit 5 that acquires the line-of-sight information of the driver of the own vehicle; a surrounding situation determination unit 21a that determines the surrounding situation; a warning target determination unit 22a that determines the warning target, i.e. the object about which the driver should be warned; and a warning unit 23 that issues a warning based on the determination of the warning target determination unit 22a.
The moving body information acquisition unit 4a is almost the same as in the information generation device 1, differing only in the acquisition time. Since the information generation device 1 creates the warning range information in advance and the warning device 2 issues warnings using that warning range information, the acquisition time of the own vehicle information and the peripheral moving body information acquired by the moving body information acquisition unit 4a of the information generation device 1 is earlier than that acquired by the moving body information acquisition unit 4a of the warning device 2. In all other respects the moving body information acquisition unit 4a is the same as in the information generation device 1. The driver may, however, change vehicles; that is, the present disclosure is applicable even when the vehicle the driver rides, such as a rental car, changes every time. The moving body information acquisition unit 4a transmits the own vehicle information and the peripheral moving body information to the surrounding situation determination unit 21a and the warning target determination unit 22a.
The own vehicle information acquired by the moving body information acquisition unit 4a of the warning device 2 is an example of first own vehicle position information indicating the position of the own vehicle within a first predetermined period, and the peripheral moving body information acquired there is an example of first peripheral moving body position information indicating the positions of peripheral moving bodies within the first predetermined period. Likewise, the own vehicle information acquired by the moving body information acquisition unit 4a of the information generation device 1 is an example of second own vehicle position information indicating the position of the moving body within a second predetermined period, and the peripheral moving body information acquired there is an example of second peripheral moving body position information indicating the positions of peripheral moving bodies within the second predetermined period.
The line-of-sight information acquisition unit 5 is almost the same as in the information generation device 1, differing only in the acquisition time. Since the information generation device 1 creates the warning range information in advance and the warning device 2 issues warnings using that warning range information, the acquisition time of the line-of-sight information acquired by the line-of-sight information acquisition unit 5 of the information generation device 1 is earlier than that acquired by the line-of-sight information acquisition unit 5 of the warning device 2. In addition, visual field information is not acquired. In all other respects the line-of-sight information acquisition unit 5 is the same as in the information generation device 1. The line-of-sight information acquisition unit 5 transmits the line-of-sight information to the surrounding situation determination unit 21a and the warning target determination unit 22a.
The line-of-sight information acquired by the line-of-sight information acquisition unit 5 of the warning device 2 is an example of first line-of-sight information, i.e. the direction of the line of sight of the driver of the own vehicle within the first predetermined period. Likewise, the line-of-sight information acquired by the line-of-sight information acquisition unit 5 of the information generation device 1 is an example of second line-of-sight information, i.e. the direction of the line of sight, within the second predetermined period, of the driver of the own vehicle or of a moving body after a change of vehicle.
The surrounding situation determination unit 21a determines, as surrounding situation information, the situation in which the traveling own vehicle is placed at a certain time in terms of location, time, weather, driving situation, and so on: for example, just before approaching an intersection, waiting to turn right within an intersection, turning left at an empty intersection, just before changing lanes on a busy road, driving straight toward the sun at dusk, waiting to turn right at night, turning left at an intersection on a rainy day, or changing lanes on the day of a typhoon.
Specifically, the surrounding situation determination unit 21a acquires the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, the line-of-sight information from the line-of-sight information acquisition unit 5, the map information from the map information storage unit 35, and the surrounding situation information from the surrounding situation information storage unit 363 of the correspondence information storage unit 36. The surrounding situation determination unit 21a determines the situation in which the own vehicle is currently placed based on the peripheral moving body information, the line-of-sight information, the own vehicle information, and environmental information such as map information, time information, and weather information. A situation here means something that constrains the movement of the own vehicle: for example, the situation of being just before approaching intersection A and tending to gaze at a traffic light or at the road crossing the traveling lane, or the situation of slowly proceeding along a congested road. The surrounding situation may also be determined from the peripheral moving body information and the environmental information without using the line-of-sight information.
Since the situation in which the own vehicle is placed often changes continuously, determination accuracy is improved by using the "transition of the immediately preceding situation" in the determination. The surrounding situation determination unit 21a therefore acquires the transition of the surrounding situation information over the immediately preceding time from the surrounding situation information storage unit 363 of the correspondence information storage unit 36. The time width of the acquired transition may be a fixed value t3 or a variable value depending on the immediately preceding situation, because the time range to be considered differs between situations that can change quickly (for example, a right or left turn lasting a few seconds) and situations that may persist for a while (for example, a traffic jam).
 The surrounding situation determination unit 21a transmits the determined surrounding situation of the own vehicle to the warning target determination unit 22a. The current surrounding situation determined by the surrounding situation determination unit 21a may also be stored in the surrounding situation information storage unit 363 so that it can be used when determining the next situation of the own vehicle.
 The warning target determination unit 22a receives the surrounding situation result from the surrounding situation determination unit 21a. The warning target determination unit 22a then acquires, from the warning range information storage unit 37, the warning range information generated in advance for a surrounding situation similar to that result; this information describes the range in which the driver of the own vehicle tends to overlook peripheral moving bodies.
 The warning target determination unit 22a acquires the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5. When a peripheral moving body is currently present in the range indicated by the warning range information, that is, the range in which the driver of the own vehicle has overlooked peripheral moving bodies in the past, and the driver is not directing his or her line of sight toward it, the warning target determination unit 22a determines that this peripheral moving body is a warning target about which the driver of the own vehicle should be warned. Because a warning-target peripheral moving body lies in a range the driver tends to overlook, issuing a warning enables safer driving; at the same time, the driver is unlikely to judge the warning as useless and therefore unlikely to ignore it. When a warning-target peripheral moving body exists, the warning target determination unit 22a transmits the determination result to the warning unit 23.
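The determination rule described above, namely that a peripheral moving body becomes a warning target when it lies in a previously overlooked range and the driver is not looking at it, can be sketched as follows; the rectangular representation of the warning ranges is an assumption made only for illustration:

```python
from dataclasses import dataclass

@dataclass
class MovingBody:
    # Position of a peripheral moving body relative to the own vehicle.
    x: float
    y: float

def in_warning_range(body, warning_ranges):
    """True if the body lies in any range the driver has overlooked in
    the past (warning range information). Each range is an illustrative
    axis-aligned box (xmin, ymin, xmax, ymax)."""
    return any(xmin <= body.x <= xmax and ymin <= body.y <= ymax
               for (xmin, ymin, xmax, ymax) in warning_ranges)

def is_warning_target(body, warning_ranges, driver_sees_body):
    """A peripheral moving body is a warning target when it is inside
    an overlooked range AND the driver is not looking at it."""
    return in_warning_range(body, warning_ranges) and not driver_sees_body
```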
 The warning unit 23 receives the determination result from the warning target determination unit 22a, and when a warning-target peripheral moving body exists, warns the driver of the own vehicle, for example by voice from a speaker or by a display on the car navigation system. Any warning method may be used as long as the driver of the own vehicle notices it.
 Next, the hardware configuration of the warning system 3 according to the first embodiment will be described.
 FIG. 3 is a hardware configuration diagram of the warning system 3 according to the first embodiment of the present disclosure. The configuration of the warning system 3 is described below with reference to FIG. 3.
 The warning system 3 includes a bus 40, a processor 41, a memory 42, a storage 43, a sensor 44a, an input I/F 45, and an output I/F 46. Each function of the warning system 3 is realized by software, firmware, or a combination of software and firmware, which is described as a program.
 The bus 40 is a signal path that electrically connects the devices and through which data is exchanged.
 The processor 41 is a CPU (Central Processing Unit) that executes programs stored in the memory 42. The processor 41 is connected to the other devices via the bus 40 and controls these units. The processor 41 reads and executes the programs in the memory 42, loading at least part of the OS (Operating System) stored in the memory 42 and executing the programs while running the OS. The processor 41 may be any IC (Integrated Circuit) that performs processing, such as a microprocessor, a microcomputer, or a DSP (Digital Signal Processor). The moving body information acquisition unit 4a, the line-of-sight information acquisition unit 5, the collision possibility information calculation unit 102, the information association unit 103, the undetected range information calculation unit 104a, the warning range information calculation unit 105a, the surrounding situation determination unit 21a, and the warning target determination unit 22a are realized by the processor 41 reading and executing the programs loaded into the memory 42.
 The memory 42 stores programs in which software, firmware, or a combination of software and firmware is described, as well as various kinds of information. The memory 42 is, for example, a volatile semiconductor memory such as a RAM (Random Access Memory), and also stores the OS. The moving body information acquisition unit 4a, the line-of-sight information acquisition unit 5, the collision possibility information calculation unit 102, the information association unit 103, the undetected range information calculation unit 104a, the warning range information calculation unit 105a, the surrounding situation determination unit 21a, and the warning target determination unit 22a are realized by programs stored in the memory 42. The functions of these units may also be realized by separate memories.
 The storage 43 stores various kinds of information. The storage 43 is, for example, a nonvolatile semiconductor memory such as a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or an HDD (Hard Disk Drive), or a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc). The line-of-sight information storage unit 33; the corresponding information storage unit 36 including the collision possibility information storage unit 361, the corresponding line-of-sight information storage unit 362, and the surrounding situation information storage unit 363; the undetected range information storage unit 34; the warning range information storage unit 37; and the map information storage unit 35 are realized by the storage 43.
 The sensor 44a is a device that detects various kinds of information, for example an in-vehicle sensor such as a millimeter wave radar or a camera. The moving body information detection sensor 31a and the line-of-sight information detection sensor 32 are realized by the sensor 44a.
 The input I/F 45 is a device through which users such as the driver and passengers can input information, for example a device that performs character input or voice recognition for the car navigation system. The surrounding situation information setting unit 101 is realized by the input I/F 45.
 The output I/F 46 is a device that warns the driver, for example a speaker or a display. The warning unit 23 is realized by the output I/F 46.
 The information of each unit may be stored in either the memory 42 or the storage 43, or in a combination of the two.
 Furthermore, the functions of the warning system 3 may be realized partly by dedicated hardware and partly by software or firmware. For example, part of the warning system 3 may realize its functions with a processing circuit implemented as dedicated hardware, while the remaining part realizes its functions by a CPU, which is also a processing circuit, reading and executing a program stored in the memory. The processing circuit can thus realize each function of the warning system 3 by hardware, software, firmware, or a combination thereof.
 The devices and instruments of the warning system 3 are connected to one another via communication, which may be either wired or wireless.
 Next, the operation of the warning system 3 will be described.
 FIG. 4 is a flowchart showing the operation of the warning system 3 according to the first embodiment of the present disclosure. The operation of the warning system 3 is described below with reference to FIG. 4.
 In step S1, the moving body information detection sensor 31a detects the own vehicle information and the peripheral moving body information of the peripheral moving bodies existing around the own vehicle. The moving body information detection sensor 31a transmits the detected own vehicle information and peripheral moving body information to the moving body information acquisition unit 4a, and the process proceeds to step S2.
 In step S2, the line-of-sight information detection sensor 32 detects the line-of-sight information of the driver of the own vehicle, transmits the detected line-of-sight information to the line-of-sight information acquisition unit 5, and the process proceeds to step S3. Either step S1 or step S2 may be performed first.
 In step S3, the information generation device 1 generates the warning range information that serves as the basis for determining whether a warning should be issued. The details will be described later. The information generation device 1 stores the warning range information in the warning range information storage unit 37, and the process proceeds to step S4.
 In step S4, the warning device 2 acquires the warning range information from the warning range information storage unit 37 and determines, from a plurality of pieces of information including the warning range information, whether to warn the driver of the own vehicle. When the warning device 2 determines that a warning should be given, it warns the driver of the own vehicle. The details will be described later.
 After step S4 has been executed, the process returns to step S1, and the above processing is repeated until a trigger for ending the processing occurs, such as the power being turned off or an end operation being performed. Although the processing is described here as being repeated, it may instead be performed only once.
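The repetition of steps S1 to S4 until a stop trigger occurs can be sketched as follows; the four callables standing in for the sensors, the information generation device 1, and the warning device 2 are illustrative assumptions:

```python
def run_warning_system(detect_bodies, detect_gaze, generate_ranges,
                       warn_if_needed, should_stop):
    """Repeat steps S1 to S4 until a stop trigger (power off or an end
    operation). Returns the number of completed cycles; setting
    should_stop to fire after one pass yields single-shot operation."""
    cycles = 0
    while not should_stop():
        vehicle, peripherals = detect_bodies()                 # S1: moving body info
        gaze = detect_gaze()                                   # S2: line-of-sight info
        ranges = generate_ranges(vehicle, peripherals, gaze)   # S3: warning range info
        warn_if_needed(vehicle, peripherals, gaze, ranges)     # S4: warn if required
        cycles += 1
    return cycles
```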
 FIG. 5 is a flowchart showing the operation of the information generation device 1 according to the first embodiment of the present disclosure. The operation of the information generation device 1 in step S3 of FIG. 4 is described below with reference to FIG. 5.
 In step S101, the moving body information acquisition unit 4a acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensor 31a. The moving body information acquisition unit 4a may acquire the data each time at least one piece of own vehicle information or peripheral moving body information is sent from the moving body information detection sensor 31a, or may collect the data sent from the moving body information detection sensor 31a over a certain period and acquire it in a batch.
 Within the information generation device 1, the moving body information acquisition unit 4a transmits the acquired own vehicle information and peripheral moving body information to the information generation unit 11a. The moving body information acquisition unit 4a may transmit the data to the information generation unit 11a each time at least one piece of own vehicle information or peripheral moving body information is sent from the moving body information detection sensor 31a, or may collect the data sent from the moving body information detection sensor 31a over a certain period and then transmit it to the information generation unit 11a.
 In the first embodiment, the moving body information acquisition unit 4a acquires, over a predetermined period t1, the own vehicle information and the peripheral moving body information of the peripheral moving bodies existing around the own vehicle detected by the moving body information detection sensor 31a at a given time. Since the detection timing of the moving body information detection sensor 31a depends on the sensor, acquiring the own vehicle information and the peripheral moving body information over the time width of the predetermined period t1 absorbs the detection timing differences that arise when a plurality of sensors are used.
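Absorbing detection timing differences by collecting samples over the time width t1 could, for example, look like the following sketch, in which each sample is assumed to be a (timestamp, data) pair:

```python
def collect_window(samples, t_now, t1):
    """Keep only detections whose timestamp falls within the last t1
    seconds, absorbing timing differences between sensors. Each sample
    is an assumed (timestamp, data) pair."""
    return [(ts, d) for ts, d in samples if t_now - t1 <= ts <= t_now]
```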
 In step S102, the line-of-sight information acquisition unit 5 acquires the line-of-sight information from the line-of-sight information detection sensor 32 at the same time as the moving body information acquisition unit 4a acquires the own vehicle information and the peripheral moving body information. The timing, collection, and retention period of the line-of-sight information acquired by the line-of-sight information acquisition unit 5 are synchronized with those of the own vehicle information and the peripheral moving body information held by the moving body information acquisition unit 4a.
 For example, when the moving body information acquisition unit 4a performs processing each time at least one piece of own vehicle information or peripheral moving body information is sent from the moving body information detection sensor 31a, and holds the data for the fixed period t1 at each processing time, the line-of-sight information acquired by the line-of-sight information acquisition unit 5 is likewise held for the same period. The line-of-sight information acquisition unit 5 stores the line-of-sight information in the line-of-sight information storage unit 33.
 In addition, the line-of-sight information acquisition unit 5 uses the line-of-sight information acquired from the line-of-sight information detection sensor 32 over a predetermined period t2 at a given time, or the line-of-sight information stored in the line-of-sight information storage unit 33, to acquire visual field information, that is, information on the range over which the driver directed his or her line of sight during the predetermined period t2 at that time. The predetermined period t2 used when acquiring the visual field information need not be the same as the period of the own vehicle information and the moving body information acquired by the moving body information acquisition unit 4a.
 FIG. 6 is a conceptual diagram of the visual field information according to the first embodiment of the present disclosure. The range 51 over which the driver of the own vehicle 50 directed his or her line of sight during the predetermined period t2 at a given time is the visual field information.
 Returning to FIG. 1, the line-of-sight information acquisition unit 5 stores the visual field information in the line-of-sight information storage unit 33.
 In step S103, at the timing at which it receives the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a and the line-of-sight information from the line-of-sight information acquisition unit 5, the surrounding situation information setting unit 101 sets the surrounding situation information describing the surroundings in which the driver is currently traveling. In the first embodiment, as an example, the own vehicle is assumed to turn right at a specific intersection, so the surrounding situation information setting unit 101 sets the information "turning right at a specific intersection" input by the driver or another occupant as the surrounding situation information.
 Although the first embodiment describes the case in which the surrounding situation information is set while driving, the situation may instead be recorded with a drive recorder or the like during driving and set after the vehicle stops, parks, or the driver gets out. In that case, the time at which the own vehicle information and the peripheral moving body information were received from the moving body information acquisition unit 4a and the line-of-sight information was received from the line-of-sight information acquisition unit 5 is stored, and the surrounding situation information setting unit 101 sets, as the surrounding situation information, the surrounding situation judged and input by the driver or another person from the image recorded by the drive recorder at that same time. When the surrounding situation information is set by image recognition, the surrounding situation information setting unit 101 may likewise set it from the drive recorder image recorded at the same time as the time at which the own vehicle information, the peripheral moving body information, and the line-of-sight information were received.
 The surrounding situation information setting unit 101 transmits the surrounding situation information to the collision possibility information calculation unit 102.
 In step S104, the collision possibility information calculation unit 102 receives the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, the line-of-sight information from the line-of-sight information acquisition unit 5, and the surrounding situation information from the surrounding situation information setting unit 101. The collision possibility information calculation unit 102 generates the collision possibility information from the own vehicle information and the peripheral moving body information received from the moving body information acquisition unit 4a.
 FIG. 7 is a conceptual diagram of the collision possibility information according to the first embodiment of the present disclosure. FIG. 7a is a diagram in which the own vehicle 50 is placed at the center, the positions of the peripheral moving bodies 52 to 54 are indicated by circles, and their velocities are indicated by arrows. FIG. 7b is a risk map; the darker a region, the higher the collision probability, and the lighter a region, the lower the collision probability. For simplicity, the collision probabilities in FIG. 7b do not take velocity-based prediction into account and are calculated from position information alone, so the map simply indicates that a collision is more likely the closer a position is to a black region. The gray gradation between black and white represents measurement error.
 In the first embodiment, the collision possibility information calculation unit 102 obtains the position and velocity of the own vehicle 50 from the own vehicle information and those of the peripheral moving bodies 52 to 54 from the peripheral moving body information, and creates a map as shown in FIG. 7a. The collision possibility information calculation unit 102 then calculates the collision probability between the own vehicle 50 and each of the peripheral moving bodies 52 to 54, and creates a risk map as shown in FIG. 7b from FIG. 7a. Although the collision possibility information calculation unit 102 creates a risk map in the first embodiment, the information on whether the own vehicle will collide with a moving body may be calculated by any known calculation method and is not limited to a risk map.
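As one possible, deliberately simplified realization of the position-only risk map of FIG. 7b, each peripheral moving body could contribute a Gaussian bump around its position, with the spread of the bump standing in for the gray measurement-error gradation; the grid size, cell size, and spread below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def risk_map(peripherals, grid=50, cell=1.0, sigma=2.0):
    """Position-only risk map sketch in the style of FIG. 7b: the own
    vehicle sits at the grid centre, and each peripheral moving body
    contributes a Gaussian bump. Velocity-based prediction is
    deliberately omitted, as in the simplified map of the text."""
    half = grid // 2
    ys, xs = np.mgrid[-half:half, -half:half]
    risk = np.zeros((grid, grid))
    for (px, py) in peripherals:  # positions relative to the own vehicle
        d2 = (xs * cell - px) ** 2 + (ys * cell - py) ** 2
        risk = np.maximum(risk, np.exp(-d2 / (2.0 * sigma ** 2)))
    return risk  # 1.0 = highest collision probability (black), 0.0 = white
```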
 Returning to FIG. 5, the collision possibility information calculation unit 102 transmits the collision possibility information, the line-of-sight information corresponding to the collision probabilities, and the surrounding situation information to the information association unit 103.
 In step S105, the information association unit 103 associates the collision possibility information received from the collision possibility information calculation unit 102 with the line-of-sight information and the surrounding situation information, and stores them in the corresponding information storage unit 36. In the first embodiment, the information association unit 103 stores the collision possibility information in the collision possibility information storage unit 361, the line-of-sight information corresponding to the collision possibility information in the corresponding line-of-sight information storage unit 362, and the surrounding situation information corresponding to the collision possibility information in the surrounding situation information storage unit 363. Any method of association may be used, such as time stamps or assigning a common identifier, as long as the correspondence is clear.
 In step S106, for the case immediately before the own vehicle enters intersection A, which is the example of the first embodiment, the undetected range information calculation unit 104a acquires from the corresponding information storage unit 36 the collision possibility information, the line-of-sight information, and the surrounding situation information for the period immediately before the own vehicle enters intersection A. The undetected range information calculation unit 104a calculates the undetected range information from the acquired collision possibility information, line-of-sight information, and surrounding situation information.
 FIG. 8 is a diagram for explaining the period immediately before the vehicle enters intersection A according to the first embodiment of the present disclosure. Suppose that, for the first time the own vehicle travels through intersection A, times t11 to t16 are extracted as the times immediately before entering intersection A. The undetected range information calculation unit 104a divides the span from time t11 to time t16 into, for example, five intervals, and considers each of the intervals t11 to t12, t12 to t13, t13 to t14, t14 to t15, and t15 to t16. Although five intervals are used here, the span need not be divided at all, and any other number of intervals may be used; likewise, although the span is divided evenly here, it may be divided unevenly. Similarly, for the second through N-th times the own vehicle travels through intersection A, suppose that times t21 to t26, ..., tN1 to tN6 are extracted as the times immediately before entering intersection A.
 First, for the interval from time t11 to time t12, the undetected range information calculation unit 104a calculates, from the collision possibility information, the line-of-sight information, and the surrounding situation information of that interval, the line-of-sight-applied collision possibility information, that is, the range the driver of the own vehicle overlooked during that interval in the case immediately before the own vehicle enters intersection A.
 FIG. 9 is a conceptual diagram of the range visible to the driver of the own vehicle 50 according to the first embodiment of the present disclosure. The undetected range information calculation unit 104a acquires the visual field information 51 from the line-of-sight information storage unit 33 via the line-of-sight information acquisition unit 5. When a plurality of pieces of visual field information are stored in the line-of-sight information storage unit 33, the visual field information 51 at a time close to the interval from time t11 to time t12 is acquired.
 The undetected range information calculation unit 104a places the line-of-sight information on the own vehicle 50, and further places the visual field information 51 centered on the line-of-sight information 55. The range of the visual field information 51 placed around the line-of-sight information 55 is the range visible to the driver of the own vehicle 50, and this range is regarded as free of danger.
 FIG. 10 is a conceptual diagram of the line-of-sight-applied collision possibility information for the interval from time t11 to time t12 according to the first embodiment of the present disclosure. Regarding the range visible to the driver as free of danger, the undetected range information calculation unit 104a superimposes the visible range of the driver of the own vehicle 50 calculated in FIG. 9 on the collision possibility information calculated in FIG. 7, thereby calculating, as shown in FIG. 10, the line-of-sight-applied collision possibility information, that is, the information on the range the driver could not see from time t11 to time t12. Specifically, the undetected range information calculation unit 104a sets the collision probability in the range visible to the driver to zero, leaving only the collision probabilities in the range the driver could not see. Alternatively, the undetected range information calculation unit 104a may merely lower, rather than zero, the collision probability in the visible range.
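Superimposing the visible range on the collision possibility information then amounts to masking the risk map. In the sketch below the driver's visible range is assumed to be given as a boolean grid; setting the attenuation to a value between 0 and 1 corresponds to lowering rather than zeroing the probability in the visible range:

```python
import numpy as np

def apply_sight(risk, visible_mask, attenuation=0.0):
    """Zero out (or attenuate) the collision probability wherever the
    driver's visual field covers the map, leaving only the unseen risk
    (the line-of-sight-applied collision possibility information). The
    boolean visible_mask is an assumed representation of FIG. 9."""
    out = risk.copy()
    out[visible_mask] *= attenuation  # 0.0 treats seen areas as safe
    return out
```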
 The undetected range information calculation unit 104a calculates the line-of-sight application collision possibility information for time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16 in the same manner as for time t11 to time t12.
 For the case where the period immediately before the own vehicle reaches intersection A in FIG. 8 is time t11 to time t16 and the own vehicle has traveled through intersection A once, the undetected range information calculation unit 104a takes the logical AND of the line-of-sight application collision possibility information over that period, and calculates the undetected range information, which is the information on peripheral moving bodies in the range the driver could not see at all during that period.
 FIG. 11 is a conceptual diagram of the undetected range information from time t11 to time t16 according to the first embodiment of the present disclosure. The undetected range information calculation unit 104a takes the logical AND of the line-of-sight application collision possibility information for each of the intervals t11 to t12, t12 to t13, t13 to t14, t14 to t15, and t15 to t16, and calculates the undetected range information, which is the range the driver overlooked throughout the period from time t11 to time t16.
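 The per-interval logical AND can be sketched as follows, under the same assumed grid-cell representation (function and variable names are hypothetical): each interval's line-of-sight application collision possibility information is reduced to the set of cells with nonzero collision probability, i.e. cells the driver did not see, and the intersection over all intervals yields the cells never seen in the whole period.

```python
# Illustrative sketch: logical AND over time intervals. Only cells that
# remain risky (unseen) in EVERY interval survive, giving the range the
# driver overlooked throughout t11..t16.

def undetected_range(interval_infos):
    surviving = None
    for info in interval_infos:
        unseen = {cell for cell, prob in info.items() if prob > 0.0}
        surviving = unseen if surviving is None else surviving & unseen
    return surviving if surviving is not None else set()

intervals = [
    {(0, 1): 0.5, (1, 1): 0.4},               # t11-t12
    {(0, 1): 0.6, (1, 0): 0.3},               # t12-t13
    {(0, 1): 0.2, (1, 1): 0.1, (2, 2): 0.9},  # t13-t14
]
always_unseen = undetected_range(intervals)   # only (0, 1) survives
```

A single-element list reproduces the momentary-information case, where one interval's line-of-sight application collision possibility information itself becomes the undetected range information.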
 If only momentary information is available, for example only at time t11, the line-of-sight application collision possibility information at time t11 itself becomes the undetected range information. To calculate one piece of undetected range information, the undetected range information calculation unit 104a must create it from line-of-sight application collision possibility information for the same driver and the same surrounding situation.
 Returning to FIG. 5, the undetected range information calculation unit 104a transmits the undetected range information to the undetected range information storage unit 34 and causes the undetected range information storage unit 34 to store it. The undetected range information calculation unit 104a performs the operation of step S106 for all the collision possibility information stored in the collision possibility information storage unit 361, calculating undetected range information for each driver under various surrounding situations and storing it in the undetected range information storage unit 34.
 In step S107, the warning range information calculation unit 105a receives, from the undetected range information storage unit 34, one or more pieces of undetected range information for the same driver and the same surrounding situation. Specifically, the warning range information calculation unit 105a receives from the undetected range information storage unit 34 the first undetected range information, the second undetected range information, ..., and the N-th undetected range information for the same driver in the same surrounding situation, namely the situation immediately before reaching intersection A as in FIG. 8. The same driver and the same surrounding situation are an example of a predetermined condition.
 The warning range information calculation unit 105a takes the logical AND of the received first undetected range information, second undetected range information, ..., and N-th undetected range information, and calculates the range that the driver always overlooks under the same surrounding situation. The calculated range is the warning range information, that is, the range for which a warning should be given.
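 This second intersection, taken over N traversals rather than over time intervals, can be sketched as follows (an illustrative sketch; each traversal's undetected range information is assumed to be representable as a set of grid cells, and all names are hypothetical):

```python
# Illustrative sketch: the warning range information is the logical AND
# (set intersection) of the undetected range information from N
# traversals of the same situation by the same driver -- the region the
# driver overlooks every single time.

def warning_range(undetected_ranges):
    result = None
    for cells in undetected_ranges:
        result = set(cells) if result is None else result & set(cells)
    return result if result is not None else set()

runs = [
    {(0, 1), (1, 1)},   # 1st traversal of intersection A
    {(0, 1), (2, 0)},   # 2nd traversal
    {(0, 1), (1, 1)},   # 3rd traversal
]
always_overlooked = warning_range(runs)   # the range to warn about
```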
 Even after executing step S107, the process returns to step S101, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once. The above is the machine learning process that generates the information serving as the basis for determining in advance whether a warning should be given.
 In the first embodiment, the information generation device 1 is described as being mounted on the own vehicle, but it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like not mounted on the own vehicle.
 FIG. 12 is a flowchart showing the operation of the warning device 2 according to the first embodiment of the present disclosure. The operation of the warning device 2 in step S4 of FIG. 4 will be described below with reference to FIG. 12.
 In step S201, the moving body information acquisition unit 4a performs the same operation as in step S101. However, the moving body information acquisition unit 4a transmits the acquired own vehicle information and peripheral moving body information not to the information generation unit 11a but to the surrounding situation determination unit 21a and the warning target determination unit 22a.
 In step S202, the line-of-sight information acquisition unit 5 performs the same operation as in step S102, except that it does not acquire the visual field information.
 In step S203, the surrounding situation determination unit 21a acquires the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, the line-of-sight information from the line-of-sight information storage unit 33 via the line-of-sight information acquisition unit 5, and the map information from the map information storage unit 35. From the acquired information, the surrounding situation determination unit 21a determines whether the current surrounding situation of the own vehicle matches any of the surrounding situations stored in the surrounding situation information storage unit 363.
 If there is a match, the surrounding situation determination unit 21a determines the matched surrounding situation to be the current surrounding situation of the own vehicle (step S203: Yes), transmits the result to the warning target determination unit 22a, and proceeds to step S204. For example, if the line-of-sight information indicates that the gaze of the driver of the own vehicle is moving mainly to the right, and the own vehicle information and the map information indicate that the own vehicle is near a specific intersection, the surrounding situation determination unit 21a determines that the own vehicle is about to turn right at that intersection. Likewise, if the line-of-sight information indicates that the driver is looking straight ahead, the own vehicle information and the map information indicate that the own vehicle is near a specific intersection, and the own vehicle information shows no change in the speed of the own vehicle, the surrounding situation determination unit 21a determines that the own vehicle is about to go straight through that intersection. The determination method is not limited to the above.
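 The two example determinations above can be sketched roughly as follows. This is only a hedged illustration of the heuristics as described; the input encoding (`gaze_direction`, `near_intersection`, `speed_change`) and the returned labels are assumptions introduced here, not the patent's actual interfaces.

```python
# Illustrative sketch of the situation-matching examples in step S203.
# Inputs stand in for the line-of-sight information, map/own-vehicle
# information, and speed-change detection described in the text.

def determine_situation(gaze_direction, near_intersection, speed_change):
    """gaze_direction: 'right', 'left', or 'ahead';
    near_intersection: name of a nearby stored intersection, or None;
    speed_change: whether the own vehicle's speed is changing."""
    if near_intersection is None:
        return None                 # no stored surrounding situation matches
    if gaze_direction == 'right':
        return 'turning right at ' + near_intersection
    if gaze_direction == 'ahead' and not speed_change:
        return 'going straight through ' + near_intersection
    return None                     # treated as step S203: No

situation = determine_situation('right', 'intersection A', speed_change=True)
```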
 If there is no match, the surrounding situation determination unit 21a determines that the surrounding situation is an impossible one (step S203: No) and ends the operation.
 In step S204, the warning target determination unit 22a receives the surrounding situation result from the surrounding situation determination unit 21a, the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5. The warning target determination unit 22a acquires from the warning range information storage unit 37 the warning range information for the same driver as the driver of the own vehicle and for the same surrounding situation as the received result. From the received and acquired information, the warning target determination unit 22a determines whether a peripheral moving body is a warning target.
 Specifically, if the own vehicle information, the peripheral moving body information, and the warning range information show that a peripheral moving body is located in the warning range, and the line-of-sight information shows that the driver of the own vehicle is not directing their gaze toward the warning range, the warning target determination unit 22a determines that the peripheral moving body located in the warning range is a warning target (step S204: Yes). The warning target determination unit 22a controls the warning unit 23 to warn the driver of the own vehicle, and proceeds to step S205.
 Except when a peripheral moving body is located in the warning range and the driver of the own vehicle is not directing their gaze toward the warning range, the warning target determination unit 22a determines that the peripheral moving body is not a warning target (step S204: No) and ends the operation.
 Although the line-of-sight information is used in step S204, the warning target determination unit 22a may instead control the warning unit 23 to warn the driver of the own vehicle whenever it determines, from the own vehicle information, the peripheral moving body information, and the warning range information alone, that a peripheral moving body is located in the warning range. Further, when the warning target determination unit 22a can separately determine from the own vehicle information and the peripheral moving body information that there is no possibility of collision and no warning is necessary, for example because the peripheral moving body located in the warning range is moving in the direction opposite to the moving direction of the own vehicle, it may determine that the peripheral moving body is not a warning target (step S204: No) and end the operation.
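 The decision logic of step S204, including the two variants just described (omitting the gaze check, and suppressing the warning when a separate no-collision determination is possible), can be sketched as follows. All names are hypothetical stand-ins for the own vehicle information, peripheral moving body information, warning range information, and line-of-sight information.

```python
# Illustrative sketch of the warning-target determination in step S204.

def is_warning_target(body_cell, warning_cells,
                      gaze_on_warning_range=False,
                      use_gaze=True,
                      no_collision=False):
    if body_cell not in warning_cells:   # outside the warning range
        return False
    if no_collision:                     # e.g. body moving away from the vehicle
        return False
    if use_gaze and gaze_on_warning_range:
        return False                     # driver is already looking there
    return True                          # step S204: Yes -> warn the driver

# Warn: body in warning range, driver not looking, collision not ruled out.
alert = is_warning_target((0, 1), {(0, 1), (1, 1)})
```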
 In step S205, the warning unit 23 receives from the warning target determination unit 22a the determination result of whether a peripheral moving body is a warning target and, if it is, warns the driver of the own vehicle to call attention. Any warning method may be used, such as sound, vibration, or display on a display device such as a car navigation system.
 If the result is No in step S203 or No in step S204, or after step S205 is executed, the process returns to step S201, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
 As described above, in the warning system 3 of the first embodiment, the information generation device 1 generates the warning range information that serves as the basis for determining whether a warning should be given in consideration of the surrounding situation, and the warning device 2 determines whether to give a warning in consideration of the current surrounding situation of the own vehicle. The system can therefore accurately infer peripheral moving bodies that tend to be overlooked and perform warning control without using sensory parameters.
 Further, in the first embodiment, by narrowing down the warning targets, the warning system 3 can give the minimum necessary warning, namely a warning about peripheral moving bodies the driver has overlooked, which makes the warning more effective. This enables safer driving and warnings that are not ignored, and is consequently expected to reduce accidents. In addition, since the warning system 3 of the first embodiment obtains the minimum range as the range the driver overlooks, a peripheral moving body located in that range is highly likely to have been overlooked by the driver, which increases the reliability of the warning that attention is needed because of the driver's oversight.
 The conventional technology requires the driver to set sensory parameters manually, and therefore has the problem that the parameters are affected by the driver's mood. In the first embodiment, the warning system 3 uses objective parameters rather than sensory ones, and is therefore more accurate than the conventional technology.
 The conventional technology requires the driver to set sensory parameters manually. Sensory parameters must be set at the very moment of driving, and setting parameters while driving is a burden on the driver. It is also possible to set the parameters after driving, but it is burdensome for the driver to remember the sensory parameters of various moments, and they are easily forgotten. Alternatively, another person could ride along and enter the parameters as the driver dictates them verbally, but arranging for a passenger is troublesome. In the first embodiment, since the warning system 3 uses objective parameters, the situation can be recorded while driving with a drive recorder or the like, and the recorded information can be reviewed and the parameters set after stopping, parking, or getting out of the vehicle. This reduces the driver's burden while driving. Also, since the warning system 3 uses objective parameters, someone other than the driver can set them. Furthermore, if the surrounding situation information setting unit 101 of the warning system 3 acquires the surrounding situation as images with a drive recorder, a camera, or the like, performs image recognition or machine learning on the acquired images, and sets the surrounding situation automatically, the very burden of setting parameters manually can be reduced.
 The conventional technology considers a road width parameter, but does not make a concrete determination of how each individual driver behaves with respect to that parameter. In the first embodiment, by associating the line-of-sight information and the like with the surrounding situation information, the warning system 3 takes into account each individual driver's behavior in that surrounding situation, and is therefore more accurate than the conventional technology.
 Although the first embodiment gives the example of mounting the warning system 3 on a vehicle, it can be mounted on any moving object such as an airplane, a ship, a train, or a person.
 In the first embodiment, the warning system 3 includes the line-of-sight information detection sensor 32, but any device that provides information from which the driver's line-of-sight information can be calculated or estimated will suffice. Instead of the line-of-sight information detection sensor 32, the warning system 3 may include a sensor that detects the driver's posture, a sensor that detects the direction of the driver's face, or the like.
Embodiment 2.
 In the first embodiment, the undetected range information calculation unit 104a and the warning range information calculation unit 105a of the information generation unit 11a calculate the undetected range information and the warning range information using the logical AND, and, by narrowing down the warning targets, give the minimum necessary warning, namely a warning about peripheral moving bodies the driver has overlooked.
 In the second embodiment, as shown in FIGS. 13 to 15, the undetected range information calculation unit 104b and the warning range information calculation unit 105b of the information generation unit 11b calculate the undetected range information and the warning range information using the logical OR.
 This calculation makes it possible to obtain a wide range that the driver of the own vehicle may have overlooked, and to warn about peripheral moving bodies the driver may have overlooked, that is, dangers that can exist under a given surrounding situation, thereby improving safety. In other respects, the second embodiment is the same as the first embodiment. In the following description, configurations and operations already described are given the same reference signs, and duplicate description is omitted.
 FIG. 13 is a block diagram of the information generation unit 11b according to the second embodiment of the present disclosure. The block diagram of the warning system 3 including the information generation device 1 and the warning device 2 according to the second embodiment is the same as FIG. 1.
 In the second embodiment, the undetected range information calculation unit 104b and the warning range information calculation unit 105b are added to the functional block diagram in place of the undetected range information calculation unit 104a and the warning range information calculation unit 105a of FIG. 2 of the first embodiment.
 The undetected range information calculation unit 104b calculates the line-of-sight application collision possibility information at each time in a given surrounding situation, and calculates the undetected range information for that surrounding situation using the logical OR of the plural pieces of line-of-sight application collision possibility information. The logical OR is used here because a given surrounding situation is likely to span a continuous period. In other respects, the undetected range information calculation unit 104b is the same as the undetected range information calculation unit 104a.
 Using the received information, the warning range information calculation unit 105b takes the logical OR of plural pieces of undetected range information that correspond to the same surrounding situation information over different periods for each driver, and calculates the warning range information. By taking this logical OR, it is possible to calculate, for each driver, the warning range information, which is the range in which the driver of the own vehicle may be overlooking peripheral moving bodies in a given surrounding situation. In other respects, the warning range information calculation unit 105b is the same as the warning range information calculation unit 105a.
 The hardware configuration diagram of the warning system 3 according to the second embodiment of the present disclosure is the same as FIG. 3 of the first embodiment. However, the hardware configuration of the undetected range information calculation unit 104b is the same as that of the undetected range information calculation unit 104a, and the hardware configuration of the warning range information calculation unit 105b is the same as that of the warning range information calculation unit 105a.
 Next, the operation of the warning system 3 will be described. The flowchart showing the operation of the warning system 3 according to the second embodiment of the present disclosure is the same as FIG. 4 of the first embodiment. The flowchart showing the operation of the warning device 2 according to the second embodiment of the present disclosure is the same as FIG. 12 of the first embodiment; that is, the operation of the warning device 2 in step S4 of FIG. 4 is the same as in the first embodiment.
 FIG. 14 is a flowchart showing the operation of the information generation device 1 according to the second embodiment of the present disclosure. The operation of the information generation device 1 in step S3 of FIG. 4 will be described below with reference to FIG. 14.
 Steps S301 to S305 are the same as steps S101 to S105 of the first embodiment.
 In step S306, for the case immediately before the own vehicle reaches intersection A, the same example as in the first embodiment, the undetected range information calculation unit 104b acquires from the correspondence information storage unit 36 the collision possibility information, the line-of-sight information, and the surrounding situation information for the period immediately before the own vehicle reaches intersection A. The undetected range information calculation unit 104b calculates the undetected range information from the acquired collision possibility information, line-of-sight information, and surrounding situation information.
 The diagram for explaining the period immediately before reaching intersection A according to the second embodiment of the present disclosure is the same as FIG. 8. Like the undetected range information calculation unit 104a of the first embodiment, the undetected range information calculation unit 104b calculates the line-of-sight application collision possibility information for time t11 to time t12, time t12 to time t13, time t13 to time t14, time t14 to time t15, and time t15 to time t16.
 For the case where the period immediately before the own vehicle reaches intersection A in FIG. 8 is time t11 to time t16 and the own vehicle has traveled through intersection A once, the undetected range information calculation unit 104b takes the logical OR of the line-of-sight application collision possibility information over that period, and calculates the undetected range information, which is the information on peripheral moving bodies in the range the driver could not see even for a moment during that period.
 FIG. 15 is a conceptual diagram of the undetected range information from time t11 to time t16 according to the second embodiment of the present disclosure. The undetected range information calculation unit 104b takes the logical OR of the line-of-sight application collision possibility information for each of the intervals t11 to t12, t12 to t13, t13 to t14, t14 to t15, and t15 to t16, and calculates the undetected range information, which is the information on peripheral moving bodies in the range the driver could not see even for a moment during the period from time t11 to time t16. In other respects, step S306 is the same as step S106 of the first embodiment.
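 The second embodiment's union differs from the first embodiment's intersection only in the set operator. A sketch under the same assumed grid-cell representation (all names hypothetical):

```python
# Illustrative sketch of the Embodiment 2 variant: the logical OR over
# intervals keeps every cell the driver failed to see in AT LEAST one
# interval, yielding a broader, safety-oriented undetected range.

def undetected_range_union(interval_infos):
    unseen_at_some_point = set()
    for info in interval_infos:
        unseen_at_some_point |= {c for c, p in info.items() if p > 0.0}
    return unseen_at_some_point

intervals = [
    {(0, 1): 0.5, (1, 1): 0.4},   # t11-t12
    {(0, 1): 0.6, (1, 0): 0.3},   # t12-t13
]
ever_unseen = undetected_range_union(intervals)
```

The union is always a superset of the corresponding intersection, which is why this variant trades the first embodiment's minimal warning range for wider coverage.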
 In step S307, the warning range information calculation unit 105b takes the logical OR of the received first undetected range information, second undetected range information, ..., and N-th undetected range information, and calculates the warning range information, which is the range in which the driver of the own vehicle may be overlooking peripheral moving bodies under the same surrounding situation for the same driver. The calculated range is the warning range information, that is, the range for which a warning should be given. In other respects, step S307 is the same as step S107 of the first embodiment.
 Even after executing step S307, the process returns to step S301, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once. The above is the machine learning process that generates the information serving as the basis for determining in advance whether a warning should be given.
 In the second embodiment, the information generation device 1 is described as being mounted on the own vehicle, but it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like not mounted on the own vehicle.
 以上述べたように、実施の形態2の警告システム3は、情報生成装置1が周囲状況を加味した警告を行うべきか判定のベースとする警告範囲情報を生成し、警告装置2が現在の自車両の周囲状況を加味した上で警告を行うか否か判定するため、感覚的なパラメータを用いることなく、見落としがちな周辺移動体を精度よく推測し、警告制御を行えるようにできる。 As described above, the warning system 3 of the second embodiment generates warning range information as a base for determining whether the information generation device 1 should give a warning in consideration of the surrounding situation, and the warning device 2 is the current self. In order to determine whether or not to issue a warning after taking into account the surrounding conditions of the vehicle, it is possible to accurately infer peripheral moving objects that are often overlooked and perform warning control without using sensory parameters.
 また、実施の形態2においては、警告システム3の未検知範囲情報算出部104b及び警告範囲情報算出部105bは、論理和を用いて未検知範囲情報及び警告範囲情報を算出する。当該算出により、自車両の運転者が見落としている可能性のある範囲を広く得ることができ、運転者が見落としている可能性がある周辺移動体についての警告を行うことができ、安全性を高めることができる。これにより、より安全な運転を行え、結果として事故軽減に繋がると考えられる。それ以外は、実施の形態2の効果は、実施の形態1の効果と同様である。 Further, in the second embodiment, the undetected range information calculation unit 104b and the warning range information calculation unit 105b of the warning system 3 calculate the undetected range information and the warning range information by using the logical sum. By this calculation, it is possible to obtain a wide range that the driver of the own vehicle may have overlooked, and it is possible to give a warning about peripheral moving objects that the driver may have overlooked, thereby improving safety. Can be enhanced. As a result, safer driving can be performed, and as a result, it is considered that accidents can be reduced. Other than that, the effect of the second embodiment is the same as the effect of the first embodiment.
In the second embodiment, the collision possibility information calculation unit 102 uses the risk map as it is; however, by setting collision probabilities at or below a predetermined threshold to zero, only the high-risk range may be retained. In this way, warning range information can be created only for the high-risk range, which increases the reliability of the warning that the driver should pay attention to something he or she has overlooked. Otherwise, the modifications of the second embodiment are the same as those of the first embodiment.
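The thresholding of the risk map can be sketched as follows. The threshold value and the grid contents are assumptions for illustration; the specification only says the threshold is predetermined.

```python
import numpy as np

# Illustrative risk map: each cell holds an estimated collision probability.
risk_map = np.array([[0.05, 0.40],
                     [0.75, 0.10]])

THRESHOLD = 0.30  # assumed cut-off value

# Zero out probabilities at or below the threshold so that only
# high-risk cells remain in the warning range information.
filtered = np.where(risk_map > THRESHOLD, risk_map, 0.0)
```

Only the cells above the threshold survive, so subsequent warning-range calculation operates solely on the high-risk portion of the map.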
Embodiment 3.
In the first embodiment, the warning range information calculation unit 105a of the information generation unit 11a calculated the warning range information using one or more pieces of undetected range information for the same driver under the same surrounding situation. The surrounding situation determination unit 21a determined from the acquired information whether the current surrounding situation of the own vehicle matched any of the surrounding situations stored in the surrounding situation information storage unit 363; if there was no match, it determined the situation to be an impossible surrounding situation and ended the operation. The warning target determination unit 22a determined whether a peripheral moving body was a warning target using the warning range information for the same driver as the driver of the own vehicle and for the same surrounding situation as the received determination result.
In the third embodiment, as shown in FIGS. 16 to 19, the warning range information calculation unit 105c of the information generation unit 11c calculates the warning range information using one or more pieces of undetected range information for the same driver. The surrounding situation determination unit 21b transmits its determination result to the warning target determination unit 22b and continues operation, regardless of whether the current surrounding situation of the own vehicle matches any of the surrounding situations stored in the surrounding situation information storage unit 363. The warning target determination unit 22b determines whether a peripheral moving body is a warning target using the warning range information for the same driver as the driver of the own vehicle.
By calculating the warning range information of the same driver under various surrounding situations, the warning range information calculation unit 105c can obtain range information that the driver tends to overlook throughout his or her driving in general, and can thus derive each driver's habitual "blind spots". The surrounding situation determination unit 21b and the warning target determination unit 22b can determine whether a peripheral moving body is a warning target regardless of the surrounding situation. Otherwise, the third embodiment is the same as the first embodiment. In the following description, configurations and operations already described are given the same reference numerals, and duplicate description is omitted.
FIG. 16 is a block diagram of the warning system 3 including the information generation device 1 and the warning device 2 according to the third embodiment of the present disclosure.
In the third embodiment, an information generation unit 11c, a surrounding situation determination unit 21b, and a warning target determination unit 22b are included in the functional block diagram in place of the information generation unit 11a, the surrounding situation determination unit 21a, and the warning target determination unit 22a of FIG. 1 of the first embodiment.
The information generation unit 11c generates warning range information. Details will be described later.
The surrounding situation determination unit 21b transmits its determination result to the warning target determination unit 22b regardless of whether the current surrounding situation of the own vehicle, obtained from the acquired information, matches any of the surrounding situations stored in the surrounding situation information storage unit 363. Otherwise, the surrounding situation determination unit 21b is the same as the surrounding situation determination unit 21a.
The warning target determination unit 22b acquires, from the warning range information storage unit 37, the warning range information generated in advance, i.e., the range in which the driver of the own vehicle overlooks peripheral moving bodies, regardless of the surrounding situation. Otherwise, the warning target determination unit 22b is the same as the warning target determination unit 22a.
FIG. 17 is a block diagram of the information generation unit 11c according to the third embodiment of the present disclosure.
In the third embodiment, a warning range information calculation unit 105c is included in the functional block diagram in place of the warning range information calculation unit 105a of FIG. 2 of the first embodiment.
The warning range information calculation unit 105c receives undetected range information from the undetected range information storage unit 34 regardless of the surrounding situation information. Using the received information, the warning range information calculation unit 105c takes the logical AND of a plurality of pieces of undetected range information for the same driver acquired over different periods, and calculates the warning range information. By taking the logical AND of a plurality of pieces of undetected range information acquired at different times for each driver, it is possible to calculate warning range information, i.e., the range in which that driver overlooks peripheral moving bodies at any time. Otherwise, the warning range information calculation unit 105c is the same as the warning range information calculation unit 105a.
The hardware configuration diagram of the warning system 3 according to the third embodiment of the present disclosure is the same as FIG. 3 of the first embodiment. Note that the hardware configuration of the warning range information calculation unit 105c is the same as that of the warning range information calculation unit 105a, the hardware configuration of the surrounding situation determination unit 21b is the same as that of the surrounding situation determination unit 21a, and the hardware configuration of the warning target determination unit 22b is the same as that of the warning target determination unit 22a.
Next, the operation of the warning system 3 will be described. The flowchart showing the operation of the warning system 3 according to the third embodiment of the present disclosure is the same as FIG. 4 of the first embodiment.
FIG. 18 is a flowchart showing the operation of the information generation device 1 according to the third embodiment of the present disclosure. The operation of the information generation device 1 in step S3 of FIG. 4 will be described below with reference to FIG. 18.
Steps S401 to S406 are the same as steps S101 to S106 of the first embodiment.
In step S407, the warning range information calculation unit 105c receives, from the undetected range information storage unit 34, one or more pieces of undetected range information for the same driver, regardless of the surrounding situation. Specifically, the warning range information calculation unit 105c receives, from the undetected range information storage unit 34, the first undetected range information for the same driver just before reaching intersection A as shown in FIG. 8, the second undetected range information, ..., and the N-th undetected range information.
The warning range information calculation unit 105c takes the logical AND of the received first undetected range information, second undetected range information, ..., and N-th undetected range information, and thereby calculates the range that the same driver always overlooks. The calculated range is the warning range information, i.e., the range for which a warning should be given. Otherwise, step S407 is the same as step S107 of the first embodiment.
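The logical AND over the N pieces of undetected range information can be sketched as follows. As before, the Boolean-grid representation and the sample values are assumptions for illustration, not part of the specification.

```python
import numpy as np

# N illustrative undetected-range grids for the same driver, from N trips
# (here N = 3; True marks a cell the driver did not look at on that trip).
trips = [
    np.array([[True, True], [False, True]]),
    np.array([[True, False], [False, True]]),
    np.array([[True, True], [True, True]]),
]

# Logical AND over all trips: a cell survives only if the driver overlooked
# it every time, i.e. the driver's habitual blind spot (the warning range).
warning_range = np.logical_and.reduce(trips)
```

In contrast to the logical OR of the second embodiment, which widens the range, the AND narrows it to cells missed on every trip, which is what makes it usable regardless of the surrounding situation.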
After executing step S407, the process returns to step S401, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or a termination operation being performed. Although the above processing is described as being repeated, it may instead be performed only once. The steps up to this point constitute the machine learning process that generates, in advance, the information on which the determination of whether to give a warning is based.
Note that, in the third embodiment, the information generation device 1 is described as being mounted on the own vehicle, but it need not be. Specifically, the operations of the information generation device 1 may be performed by a server or the like that is not mounted on the own vehicle.
FIG. 19 is a flowchart showing the operation of the warning device 2 according to the third embodiment of the present disclosure. The operation of the warning device 2 in step S4 of FIG. 4 will be described below with reference to FIG. 19.
Steps S501 to S502 are the same as steps S201 to S202 of the first embodiment.
In step S503, the surrounding situation determination unit 21b determines from the acquired information whether the current surrounding situation of the own vehicle matches any of the surrounding situations stored in the surrounding situation information storage unit 363. If there is a match, the surrounding situation determination unit 21b determines the matched surrounding situation to be the current surrounding situation of the own vehicle. If there is no match, the surrounding situation determination unit 21b determines that there is no applicable surrounding situation. The surrounding situation determination unit 21b transmits the determination result to the warning target determination unit 22b, and the process proceeds to step S504. Otherwise, step S503 is the same as step S203 of the first embodiment.
In step S504, the warning target determination unit 22b receives the surrounding situation result from the surrounding situation determination unit 21b, the own vehicle information and the peripheral moving body information from the moving body information acquisition unit 4a, and the line-of-sight information from the line-of-sight information acquisition unit 5. The warning target determination unit 22b acquires, from the warning range information storage unit 37, the warning range information for the same driver as the driver of the own vehicle, regardless of the surrounding situation. The warning target determination unit 22b determines from the received or acquired information whether a peripheral moving body is a warning target. Otherwise, step S504 is the same as step S204 of the first embodiment.
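The warning-target test of step S504 can be sketched as the following predicate. The data structures here (a set of grid cells for the warning range, one cell per peripheral moving body, and a Boolean gaze flag) are simplifying assumptions; the specification does not prescribe them.

```python
# Minimal sketch of the step-S504 decision, with assumed data structures:
# the warning range is a set of (row, col) grid cells relative to the own
# vehicle, and each peripheral moving body is reduced to the cell it occupies.

def is_warning_target(warning_range_cells, body_cell, driver_saw_it):
    """A body is a warning target if it lies inside the driver's habitual
    overlook range and the line-of-sight data shows it was not seen."""
    return body_cell in warning_range_cells and not driver_saw_it

warning_cells = {(0, 1), (1, 1)}          # assumed overlook range for this driver
result = is_warning_target(warning_cells, (0, 1), driver_saw_it=False)
```

A body outside the overlook range, or one the gaze data confirms was seen, is not a warning target, which is what keeps the number of warnings low enough that the driver does not start ignoring them.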
Step S505 is the same as step S205 of the first embodiment.
If the result of step S504 is No, or after step S505 has been executed, the process returns to step S501, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or a termination operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
As described above, in the warning system 3 of the third embodiment, the information generation device 1 generates warning range information that serves as the basis for determining whether a warning should be given in consideration of the surrounding situation, and the warning device 2 determines whether to give a warning after taking the current surrounding situation of the own vehicle into account. This makes it possible to accurately infer peripheral moving bodies that the driver tends to overlook and to perform warning control without using subjective parameters.
Further, in the third embodiment, the warning range information calculation unit 105c of the information generation unit 11c, the surrounding situation determination unit 21b, and the warning target determination unit 22b of the warning system 3 calculate and determine information on the basis of being the same driver as the driver of the own vehicle, regardless of the surrounding situation. This makes it possible to obtain range information that the driver tends to overlook throughout his or her driving in general, to derive each driver's habitual "blind spots", and to determine whether a peripheral moving body is a warning target regardless of the surrounding situation. Otherwise, the effects of the third embodiment are the same as those of the first embodiment.
In the third embodiment, the warning system 3 calculates and determines information on the basis of being the same driver as the driver of the own vehicle, regardless of the surrounding situation. However, the driver may be allowed to select and switch, via a hardware switch, a car navigation system, or the like, whether information is calculated and determined on the basis of the same driver and the same surrounding environment, or on the basis of the same driver only. In this way, the level at which the driver wishes to be warned can be changed according to the driver's situation at the time. Allowing the driver to change the desired warning level further prevents the driver from ignoring warnings or from ceasing to use the warning device.
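The selectable warning level described above can be sketched as a keyed lookup of the stored warning ranges. The key structure, mode names, and sample data below are all assumptions for illustration; the specification only requires that the driver can switch between the two behaviors.

```python
# Sketch of the mode switch: "per_driver" uses one warning range for the
# driver regardless of the surroundings (embodiment 3 behavior), while
# "per_situation" keys the range by surrounding situation (embodiment 1).
warning_ranges = {
    ("driver_a", None): {(0, 1)},                       # situation-independent
    ("driver_a", "intersection_A"): {(0, 1), (1, 1)},   # situation-specific
}

def lookup_warning_range(driver, situation, mode):
    key = (driver, situation if mode == "per_situation" else None)
    return warning_ranges.get(key, set())

range_any = lookup_warning_range("driver_a", "intersection_A", "per_driver")
range_sit = lookup_warning_range("driver_a", "intersection_A", "per_situation")
```

The same stored data serves both modes; only the lookup key changes, so switching at runtime via a hardware switch or navigation menu requires no recomputation.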
In the third embodiment, the warning system 3 includes the surrounding situation information acquisition unit 101, the surrounding situation information storage unit 363, and the surrounding situation determination unit 21b. However, when information is calculated and determined on the basis of being the same driver as the driver of the own vehicle regardless of the surrounding situation, the surrounding situation information acquisition unit 101, the surrounding situation information storage unit 363, and the surrounding situation determination unit 21b need not be included as components. The information generation device 1 may create the warning range information by associating the collision possibility information with the corresponding line-of-sight information, and the warning device 2 may acquire map information in the warning target determination unit 22b and give warnings using the warning range information of the same driver regardless of the surrounding situation. In this way, the amount of computation can be reduced. Further, when the information generation device 1 and the warning device 2 are mounted on the own vehicle, the weight of the devices can be reduced, improving fuel efficiency. Otherwise, the modifications of the third embodiment are the same as those of the first embodiment.
Embodiment 4.
In the first embodiment, the moving body information detection sensor 31a is provided in the own vehicle and detects own vehicle information including own vehicle position information and peripheral moving body information including peripheral moving body position information. The moving body information acquisition unit 4a provided in the information generation device 1 and the warning device 2 acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensor 31a mounted on the own vehicle, so the operation can be completed using only the sensors mounted on the own vehicle.
In the fourth embodiment, as shown in FIGS. 20 to 24, the moving body information detection sensors 31b are sensors provided not only in the own vehicle but also in peripheral moving bodies, infrastructure devices, and the like, and the moving body information acquisition unit 4b provided in the information generation device 1 and the warning device 2 acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensors 31b mounted on the own vehicle, peripheral moving bodies, infrastructure devices, and the like.
This detection and acquisition makes it possible to obtain information over a wider area and with higher accuracy. By acquiring more accurate information, the warning system 3 improves the accuracy of the information it calculates. Otherwise, the fourth embodiment is the same as the first embodiment. In the following description, configurations and operations already described are given the same reference numerals, and duplicate description is omitted.
FIG. 20 is a block diagram of the warning system 3 including the information generation device 1 and the warning device 2 according to the fourth embodiment of the present disclosure. The block diagram of the information generation unit 11a according to the fourth embodiment is the same as FIG. 2.
In the fourth embodiment, a moving body information detection sensor 31b and a moving body information acquisition unit 4b are included in the functional block diagram in place of the moving body information detection sensor 31a and the moving body information acquisition unit 4a of FIG. 1 of the first embodiment.
The moving body information detection sensors 31b are provided in the own vehicle, peripheral moving bodies, infrastructure devices, and the like. The moving body information detection sensors 31b provided in peripheral moving bodies, infrastructure devices, and the like make use of devices originally provided therein. An infrastructure device is, for example, a camera of a roadside sensor. Although only one moving body information detection sensor 31b is shown in FIG. 20, when information is acquired from a plurality of sensors mounted on the own vehicle, peripheral moving bodies, infrastructure devices, and the like, there are a plurality of moving body information detection sensors 31b. Otherwise, the moving body information detection sensor 31b is the same as the moving body information detection sensor 31a.
The moving body information acquisition unit 4b acquires the own vehicle information and the peripheral moving body information detected by the moving body information detection sensors 31b. When there are a plurality of moving body information detection sensors 31b, the moving body information acquisition unit 4b acquires information from the plurality of moving body information detection sensors 31b. Otherwise, the moving body information acquisition unit 4b is the same as the moving body information acquisition unit 4a.
FIG. 21 is a hardware configuration diagram of the warning system 3 according to the fourth embodiment of the present disclosure. The configuration of the warning system 3 according to the fourth embodiment of the present disclosure will be described with reference to FIG. 21.
In the fourth embodiment, a communication I/F 47 is added to FIG. 3 of the first embodiment.
The warning system 3 includes a bus 40, a processor 41, a memory 42, a storage 43, a sensor 44b, an input I/F 45, an output I/F 46, and a communication I/F 47. Each function of the warning system 3 is realized by software, firmware, or a combination of software and firmware. The software, firmware, or combination of software and firmware is written as a program.
The bus 40 is also connected to the communication I/F 47, electrically connecting the devices to one another for data exchange. The processor 41 is also connected to the communication I/F 47 via the bus 40 and controls the communication I/F 47. Part of the moving body information acquisition unit 4b is realized by the processor 41 loading a program into the memory 42 and reading and executing it. The memory 42 realizes part of the moving body information acquisition unit 4b by storing that program.
The sensor 44b is a device that detects various types of information, and is, for example, an in-vehicle sensor such as a millimeter wave radar or a camera. The moving body information detection sensor 31b mounted on the own vehicle and the line-of-sight information detection sensor 32 are realized by the sensor 44b. Otherwise, the sensor 44b is the same as the sensor 44a.
The communication I/F 47 receives moving body information from sensors (not shown) that realize the moving body information detection sensors 31b provided in peripheral moving bodies, infrastructure devices, and the like. The communication I/F 47 is, for example, a device that performs wireless communication such as over the Internet. Part of the moving body information acquisition unit 4b is realized by the communication I/F 47.
Next, the operation of the warning system 3 will be described.
FIG. 22 is a flowchart showing the operation of the warning system 3 according to the fourth embodiment of the present disclosure. The operation of the warning system 3 will be described below with reference to FIG. 22.
In step S601, the moving body information detection sensors 31b, which are provided in the own vehicle, peripheral moving bodies, infrastructure devices, and the like, detect the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle. Specifically, in addition to the case of the first embodiment where the own vehicle is provided with the moving body information detection sensor 31b, for example, a millimeter wave radar that is a moving body information detection sensor 31b provided in a peripheral moving body detects the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle. Also, for example, a roadside sensor camera that is an infrastructure device detects the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle. Any moving body information detection sensor 31b capable of detecting the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle performs this detection, wherever it is installed.
The moving body information detection sensor 31b transmits the detected own vehicle information and peripheral moving body information to the moving body information acquisition unit 4b, and the process proceeds to step S602. Otherwise, step S601 is the same as step S1 of the first embodiment.
Step S602 is the same as step S2 of the first embodiment.
In steps S603 to S604, the moving body information acquisition unit 4b acquires the own vehicle information and the peripheral moving body information of peripheral moving bodies existing around the own vehicle not only from the own vehicle but also from peripheral moving bodies, infrastructure devices, and the like. Otherwise, steps S603 to S604 are the same as steps S3 to S4 of the first embodiment. Details will be described later.
After executing step S604, the process returns to step S601, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or a termination operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
FIG. 23 is a flowchart showing the operation of the information generation device 1 according to the fourth embodiment of the present disclosure. The operation of the information generation device 1 in step S603 of FIG. 22 will be described below with reference to FIG. 23.
In step S701, the moving body information acquisition unit 4b acquires the own vehicle information and the peripheral moving body information from the moving body information detection sensors 31b provided in the own vehicle, peripheral moving bodies, infrastructure devices, and the like. When there are a plurality of moving body information detection sensors 31b, the moving body information acquisition unit 4b acquires information from the plurality of moving body information detection sensors 31b.
 A plurality of moving body information acquisition units 4b may be provided. For example, a moving body information acquisition unit 4c that acquires information from the moving body information detection sensors 31b provided in peripheral moving bodies, infrastructure devices, and the like may be provided separately from a moving body information acquisition unit 4d that acquires information from the moving body information detection sensor 31b provided in the own vehicle. In that case, for example, the information acquired by the moving body information acquisition unit 4d may be transmitted to the moving body information acquisition unit 4c, and the information may be unified in the moving body information acquisition unit 4c. Alternatively, without unifying the information, the moving body information acquisition unit 4c and the moving body information acquisition unit 4d may each transmit their information to the information generation unit 11a. Otherwise, step S701 is the same as step S101 of the first embodiment.
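The unification between the acquisition units 4c and 4d described above might look like the following sketch. The data layout and the rule of keeping the first report seen for each moving body are assumptions made purely for illustration, not details of the disclosure:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MovingBodyInfo:
    source: str   # "own_vehicle", "peripheral", or "infrastructure"
    body_id: str  # identifier of the detected moving body
    x: float      # position (m) in a common coordinate frame
    y: float

def unify(own_vehicle_info: List[MovingBodyInfo],
          external_info: List[MovingBodyInfo]) -> List[MovingBodyInfo]:
    """Merge detections gathered by unit 4d (own vehicle) into those gathered
    by unit 4c (peripheral moving bodies and infrastructure devices),
    keeping the first report seen for each moving body."""
    merged = {}
    for info in external_info + own_vehicle_info:
        merged.setdefault(info.body_id, info)
    return list(merged.values())
```

When unification is not wanted, the two lists would simply be forwarded to the information generation unit 11a as-is, as the paragraph above notes.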
 Steps S702 to S707 are the same as steps S102 to S107 of the first embodiment.
 Even after step S707 is executed, the process returns to step S701, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once. The processing up to this point is the machine learning process that generates the information on which the determination of whether a warning should be given is based.
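As a minimal sketch of the repeat-until-trigger control described above (the function names are placeholders, not part of the disclosure):

```python
def generation_loop(acquire, learn, should_stop, repeat=True):
    """Skeleton of the loop around steps S701 to S707: acquire information,
    update the learned warning range information, and either stop after one
    pass (repeat=False) or continue until a stop trigger occurs, such as the
    power being turned off or an end operation being performed."""
    while True:
        data = acquire()   # S701: information from the own vehicle, peripheral
                           # moving bodies, and infrastructure devices
        learn(data)        # S702-S707: update the warning range information
        if not repeat or should_stop():
            break
```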
 Although the information generation device 1 is described as being mounted on the own vehicle in the fourth embodiment, it does not have to be mounted on the own vehicle. Specifically, the operation of the information generation device 1 may be performed by a server or the like that is not mounted on the own vehicle.
 FIG. 24 is a flowchart showing the operation of the warning device 2 according to the fourth embodiment of the present disclosure. The operation of the warning device 2 in step S604 of FIG. 22 will be described below with reference to FIG. 24.
 In step S801, the moving body information acquisition unit 4b performs the same operation as in step S701. However, the moving body information acquisition unit 4b transmits the acquired own-vehicle information and peripheral moving body information not to the information generation unit 11a but to the surrounding situation determination unit 21a and the warning target determination unit 22a. Otherwise, step S801 is the same as step S201 of the first embodiment.
 Steps S802 to S805 are the same as steps S202 to S205 of the first embodiment.
 If the result is No in step S803, if the result is No in step S804, or after step S805 is executed, the process returns to step S801, and the above processing is repeated until there is a trigger to end the processing, such as the power being turned off or an end operation being performed. Although the above processing is described as being repeated, it may instead be performed only once.
 As described above, in the warning system 3 of the fourth embodiment, the information generation device 1 generates the warning range information on which the determination of whether a warning should be given is based, taking the surrounding situation into account, and the warning device 2 determines whether to give a warning after taking the current surrounding situation of the own vehicle into account. Therefore, peripheral moving bodies that tend to be overlooked can be estimated accurately, and warning control can be performed, without using sensory parameters.
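The decision just summarized, namely selecting the warning range information learned for the current driver and surrounding situation and then checking a peripheral moving body against it, can be sketched as follows. The dictionary keying and the rectangular range are illustrative assumptions, not details of the disclosure:

```python
def should_warn(warning_ranges, driver_id, situation, body_position):
    """Decide whether to warn: select the warning range information stored for
    this driver and the current surrounding situation, then test whether the
    peripheral moving body lies inside that range (modeled here as an
    axis-aligned box purely for illustration)."""
    box = warning_ranges.get((driver_id, situation))
    if box is None:
        return False  # no learned warning range for this driver/situation
    xmin, ymin, xmax, ymax = box
    x, y = body_position
    return xmin <= x <= xmax and ymin <= y <= ymax
```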
 Further, in the fourth embodiment, by using not only the information of the moving body information detection sensor 31b provided in the own vehicle but also the information of the moving body information detection sensors 31b provided in peripheral moving bodies, infrastructure devices, and the like, the warning system 3 can acquire information over a wider area and with higher accuracy. By acquiring more accurate information, the warning system 3 improves the accuracy of the calculated information. This enables safer driving and is expected to reduce accidents as a result. Otherwise, the effects of the fourth embodiment are the same as those of the first embodiment.
 In the fourth embodiment, the moving body information detection sensors 31b are provided in the own vehicle, peripheral moving bodies, infrastructure devices, and the like, and the moving body information acquisition unit 4b acquires information from all the moving body information detection sensors 31b; however, it may acquire information from only some of them. For example, the moving body information acquisition unit 4b may acquire information from the moving body information detection sensors 31b provided in the own vehicle and peripheral moving bodies, or from the moving body information detection sensors 31b provided in the own vehicle and the infrastructure devices, or, without using the moving body information detection sensor 31b provided in the own vehicle, from the moving body information detection sensors 31b provided in peripheral moving bodies and the infrastructure devices. Further, the moving body information acquisition unit 4b may acquire information from only one of the plurality of moving body information detection sensors 31b. Any combination of one or more moving body information detection sensors 31b from which the moving body information acquisition unit 4b acquires information is acceptable. Otherwise, the modifications of the fourth embodiment are the same as those of the first embodiment.
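The statement that any combination of one or more sensor sources is acceptable can be made concrete with a small enumeration; the source names here are hypothetical:

```python
from itertools import combinations

def valid_source_combinations(sources):
    """Enumerate every nonempty combination of sensor sources, matching the
    statement that the acquisition unit may use any one or more of them."""
    return [set(c) for r in range(1, len(sources) + 1)
            for c in combinations(sources, r)]
```

With the three source types named in this embodiment, this yields 2^3 - 1 = 7 valid combinations.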
 The information generation device, warning device, information generation method, warning method, information generation program, and warning program described in the above embodiments are merely examples, and can be combined with other devices as appropriate; they are not limited to the configurations of the embodiments alone.
1 information generation device, 2 warning device, 3 warning system,
4a, 4b moving body information acquisition unit, 5 line-of-sight information acquisition unit,
11a, 11b, 11c information generation unit,
21a, 21b surrounding situation determination unit,
22a, 22b warning target determination unit, 23 warning unit,
31a, 31b moving body information detection sensor,
32 line-of-sight information detection sensor, 33 line-of-sight information storage unit,
34 undetected range information storage unit, 35 map information storage unit,
36 correspondence information storage unit, 37 warning range information storage unit,
40 bus, 41 processor, 42 memory,
43 storage, 44a, 44b sensor,
45 input I/F, 46 output I/F, 47 communication I/F,
50 own vehicle, 51 visual field information, 52 to 54 peripheral moving body,
55 line-of-sight information, 101 surrounding situation information acquisition unit,
102 collision possibility information calculation unit, 103 information association unit,
104a, 104b undetected range information calculation unit,
105a, 105b, 105c warning range information calculation unit,
361 collision possibility information storage unit, 362 corresponding line-of-sight information storage unit,
363 surrounding situation information storage unit.

Claims (21)

  1.  An information generation device comprising:

    a moving body information acquisition unit to acquire own-vehicle position information indicating a position of an own vehicle within a predetermined period, and peripheral moving body position information indicating a position, within the predetermined period, of a peripheral moving body existing around the own vehicle;

    a line-of-sight information acquisition unit to acquire line-of-sight information, which is a direction of a line of sight directed by a driver of the own vehicle within the predetermined period;

    a surrounding situation information acquisition unit to acquire surrounding situation information, which is a surrounding situation of the own vehicle within the predetermined period;

    a collision possibility information calculation unit to calculate, using the own-vehicle position information and the peripheral moving body position information, collision possibility information, which is a possibility that the own vehicle and the peripheral moving body collide;

    an undetected range information calculation unit to calculate, using the collision possibility information, the line-of-sight information, and the surrounding situation information, undetected range information, which is a range overlooked by the driver of the own vehicle in the surrounding situation indicated by the surrounding situation information; and

    a warning range information calculation unit to calculate, using one or more pieces of the undetected range information, warning range information indicating a range in which the peripheral moving body about which the driver of the own vehicle is warned is located.
  2.  The information generation device according to claim 1, wherein the undetected range information calculation unit obtains a range visible to the driver of the own vehicle from the line-of-sight information, calculates line-of-sight-applied collision possibility information, which is information obtained by lowering, in the collision possibility information, the possibility that the own vehicle and the peripheral moving body collide within the range visible to the driver of the own vehicle, and calculates the undetected range information in the surrounding situation indicated by the surrounding situation information using the line-of-sight-applied collision possibility information.
  3.  The information generation device according to claim 2, wherein

    the line-of-sight information acquisition unit calculates visual field information, which is a range of a visual field of the driver of the own vehicle, from a plurality of pieces of the acquired line-of-sight information, or acquires the visual field information set in advance, and

    the undetected range information calculation unit obtains the range visible to the driver of the own vehicle using the visual field information in addition to the line-of-sight information.
  4.  The information generation device according to claim 2 or 3, wherein the undetected range information calculation unit calculates the undetected range information in the surrounding situation of the surrounding situation information by taking a logical product of one or more pieces of the line-of-sight-applied collision possibility information of the same driver and the same surrounding situation.
  5.  The information generation device according to any one of claims 1 to 3, wherein the warning range information calculation unit calculates the warning range information by taking a logical product of one or more pieces of the undetected range information.
  6.  The information generation device according to any one of claims 2, 3, and 5, wherein the undetected range information calculation unit calculates the undetected range information in the surrounding situation of the surrounding situation information by taking a logical sum of one or more pieces of the line-of-sight-applied collision possibility information of the same driver and the same surrounding situation.
  7.  The information generation device according to any one of claims 1 to 4 and 6, wherein the warning range information calculation unit calculates the warning range information by taking a logical sum of one or more pieces of the undetected range information.
  8.  The information generation device according to any one of claims 1 to 7, wherein the one or more pieces of the undetected range information used by the warning range information calculation unit are the undetected range information of the same driver and the same surrounding situation.
  9.  The information generation device according to any one of claims 1 to 7, wherein the one or more pieces of the undetected range information used by the warning range information calculation unit are the undetected range information of the same driver.
  10.  The information generation device according to any one of claims 1 to 9, wherein the surrounding situation information acquisition unit acquires the surrounding situation information by image recognition or machine learning using an acquired image of the surroundings of the own vehicle.
  11.  The information generation device according to any one of claims 1 to 10, wherein the warning range information calculation unit calculates the warning range information by machine learning using one or more pieces of the undetected range information.
  12.  The information generation device according to any one of claims 1 to 11, wherein

    the moving body information acquisition unit acquires the own-vehicle position information and the peripheral moving body position information each from at least one of a sensor mounted on the own vehicle, a sensor mounted on the peripheral moving body, and a sensor of an infrastructure device of a road on which the own vehicle is traveling, or acquires them from a plurality of these sensors and calculates them, and

    the surrounding situation information acquisition unit acquires the surrounding situation information from at least one of the sensor mounted on the own vehicle, the sensor mounted on the peripheral moving body, and the sensor of the infrastructure device.
  13.  A warning device comprising:

    a moving body information acquisition unit to acquire first own-vehicle position information indicating a position of an own vehicle within a first predetermined period, and first peripheral moving body position information indicating a position, within the first predetermined period, of a peripheral moving body existing around the own vehicle; and

    a warning target determination unit, wherein collision possibility information, which is a possibility that a moving body and a peripheral moving body collide, is calculated using second own-vehicle position information indicating a position of the moving body within a second predetermined period, which is a period before the first predetermined period, and second peripheral moving body position information indicating a position of a peripheral moving body that existed around the moving body within the second predetermined period; undetected range information, which is a range overlooked by a driver of the moving body in a surrounding situation indicated by surrounding situation information, is calculated using the calculated collision possibility information, second line-of-sight information, which is a direction of a line of sight directed by the driver of the moving body within the second predetermined period, and the surrounding situation information, which is a surrounding situation of the moving body within the second predetermined period; warning range information indicating a range in which the peripheral moving body about which the driver of the moving body is warned is located is calculated using one or more pieces of the undetected range information; and the warning target determination unit acquires, from the warning range information, the warning range information matching a predetermined condition including at least a condition that the driver is the same for the own vehicle and the moving body, and determines, using the first own-vehicle position information, the first peripheral moving body position information, and the warning range information, whether a peripheral moving body existing around the own vehicle is a warning target.
  14.  The warning device according to claim 13, further comprising a line-of-sight information acquisition unit to acquire first line-of-sight information, which is a direction of a line of sight directed by the driver of the own vehicle within the first predetermined period,

    wherein the warning target determination unit determines whether the peripheral moving body existing around the own vehicle is the warning target using the first line-of-sight information in addition to the first own-vehicle position information, the first peripheral moving body position information, and the warning range information.
  15.  The warning device according to claim 13 or 14, further comprising a surrounding situation determination unit to determine the surrounding situation of the own vehicle in the first predetermined period using surrounding situation information, created in advance, which is the surrounding situation of the own vehicle,

    wherein the predetermined condition includes a condition that the surrounding situation indicated by the surrounding situation information corresponding to the warning range information matches the surrounding situation of the own vehicle in the first predetermined period.
  16.  The warning device according to any one of claims 13 to 15, further comprising a warning unit to warn, when the warning target determination unit determines that the peripheral moving body existing around the own vehicle is the warning target, the driver of the own vehicle that the peripheral moving body existing around the own vehicle is the warning target.
  17.  The warning device according to any one of claims 13 to 16, wherein the moving body information acquisition unit acquires the first own-vehicle position information and the first peripheral moving body position information using a sensor mounted on the own vehicle, a sensor mounted on the peripheral moving body, or a sensor of an infrastructure device of a road on which the own vehicle is traveling.
  18.  An information generation method comprising:

    acquiring own-vehicle position information indicating a position of an own vehicle within a predetermined period, and peripheral moving body position information indicating a position, within the predetermined period, of a peripheral moving body existing around the own vehicle;

    acquiring line-of-sight information, which is a direction of a line of sight directed by a driver of the own vehicle within the predetermined period;

    acquiring surrounding situation information, which is a surrounding situation of the own vehicle within the predetermined period;

    calculating, using the own-vehicle position information and the peripheral moving body position information, collision possibility information, which is a possibility that the own vehicle and the peripheral moving body collide;

    calculating, using the collision possibility information, the line-of-sight information, and the surrounding situation information, undetected range information, which is a range overlooked by the driver of the own vehicle in the surrounding situation indicated by the surrounding situation information; and

    calculating, using one or more pieces of the undetected range information, warning range information indicating a range in which the peripheral moving body about which the driver of the own vehicle is warned is located.
  19.  A warning method comprising:

    acquiring first own-vehicle position information indicating a position of an own vehicle within a first predetermined period, and first peripheral moving body position information indicating a position, within the first predetermined period, of a peripheral moving body existing around the own vehicle; and

    acquiring, from warning range information indicating a range in which a peripheral moving body about which a driver of a moving body is warned is located, the warning range information matching a predetermined condition including at least a condition that the driver is the same for the own vehicle and the moving body, and determining, using the first own-vehicle position information, the first peripheral moving body position information, and the warning range information, whether a peripheral moving body existing around the own vehicle is a warning target,

    wherein collision possibility information, which is a possibility that the moving body and a peripheral moving body collide, is calculated using second own-vehicle position information indicating a position of the moving body within a second predetermined period, which is a period before the first predetermined period, and second peripheral moving body position information indicating a position of a peripheral moving body that existed around the moving body within the second predetermined period; undetected range information, which is a range overlooked by the driver of the moving body in a surrounding situation indicated by surrounding situation information, is calculated using the calculated collision possibility information, second line-of-sight information, which is a direction of a line of sight directed by the driver of the moving body within the second predetermined period, and the surrounding situation information, which is a surrounding situation of the moving body within the second predetermined period; and the warning range information is calculated using one or more pieces of the undetected range information.
  20.  An information generation program causing a computer to execute:

    a process of acquiring own-vehicle position information indicating a position of an own vehicle within a predetermined period, and peripheral moving body position information indicating a position, within the predetermined period, of a peripheral moving body existing around the own vehicle;

    a process of acquiring line-of-sight information, which is a direction of a line of sight directed by a driver of the own vehicle within the predetermined period;

    a process of acquiring surrounding situation information, which is a surrounding situation of the own vehicle within the predetermined period;

    a process of calculating, using the own-vehicle position information and the peripheral moving body position information, collision possibility information, which is a possibility that the own vehicle and the peripheral moving body collide;

    a process of calculating, using the collision possibility information, the line-of-sight information, and the surrounding situation information, undetected range information, which is a range overlooked by the driver of the own vehicle in the surrounding situation indicated by the surrounding situation information; and

    a process of calculating, using one or more pieces of the undetected range information, warning range information indicating a range in which the peripheral moving body about which the driver of the own vehicle is warned is located.
  21.  自車両の第一の所定期間内の位置を示す第一の自車両位置情報、及び前記自車両周辺に存在する周辺移動体の前記第一の所定期間内の位置を示す第一の周辺移動体位置情報を取得する処理と、

     前記第一の所定期間よりも前の期間である第二の所定期間内の移動体の位置を示す第二の自車両位置情報、及び前記第二の所定期間内の前記移動体の周辺に存在した周辺移動体の位置を示す第二の周辺移動体位置情報を用いて、前記移動体と前記周辺移動体とが衝突する可能性である衝突可能性情報が算出され、算出された前記衝突可能性情報と、前記第二の所定期間内の前記移動体の運転者が向けた視線の方向である第二の視線情報と、前記第二の所定期間内の前記移動体の周囲状況である周囲状況情報とを用いて、前記周囲状況情報が示す周囲状況における前記移動体の運転者が見落とした範囲である未検知範囲情報が算出され、1以上の前記未検知範囲情報を用いて算出された、前記移動体の運転者に警告される前記周辺移動体が位置する範囲を示す警告範囲情報の中から、前記自車両と前記移動体とにおいて同一運転者の条件を少なくとも含む所定の条件と合致した前記警告範囲情報を取得し、前記第一の自車両位置情報と前記第一の移動体位置情報と前記警告範囲情報とを用いて前記自車両周辺に存在する周辺移動体が警告対象であるか否か判定する処理と

    を実行させる警告プログラム。
    The first own vehicle position information indicating the position of the own vehicle within the first predetermined period, and the first peripheral moving body indicating the position of the peripheral moving body existing around the own vehicle within the first predetermined period. The process of acquiring location information and

    Existence in the second own vehicle position information indicating the position of the moving body within the second predetermined period, which is a period before the first predetermined period, and around the moving body within the second predetermined period. Using the second peripheral moving body position information indicating the position of the peripheral moving body, the collision possibility information at which the moving body and the peripheral moving body may collide is calculated, and the calculated collision possibility is possible. Sexual information, second line-of-sight information which is the direction of the line of sight directed by the driver of the moving body within the second predetermined period, and surroundings which is the surrounding situation of the moving body within the second predetermined period. Using the situation information, undetected range information, which is a range overlooked by the driver of the moving body in the surrounding situation indicated by the surrounding situation information, was calculated, and calculated using one or more of the undetected range information. , The warning range information indicating the range in which the peripheral moving body is located, which is warned to the driver of the moving body, meets a predetermined condition including at least the condition of the same driver in the own vehicle and the moving body. The warning range information is acquired, and the peripheral moving objects existing around the own vehicle are the warning targets by using the first own vehicle position information, the first moving body position information, and the warning range information. With the process of determining whether or not

    A warning program that causes a computer to execute the above processes.
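The determination step recited in the claim above — checking whether a peripheral moving body around the own vehicle falls inside the acquired warning range — can be sketched roughly as follows. This is an illustrative assumption, not the claimed implementation: the warning range is modeled here as an angular sector relative to the vehicle's heading plus a maximum distance, and all names and numeric values are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class WarningRange:
    """Hypothetical warning range: an angular sector relative to the
    vehicle's heading, plus a maximum distance. In the claim, such a
    range would be derived from the driver's previously overlooked
    (undetected) ranges."""
    start_angle_deg: float  # sector start, degrees relative to heading
    end_angle_deg: float    # sector end, degrees relative to heading
    max_distance_m: float

def is_warning_target(own_pos, own_heading_deg, peer_pos, warning_range):
    """Return True if the peripheral moving body at peer_pos lies inside
    the warning range of the own vehicle at own_pos."""
    dx = peer_pos[0] - own_pos[0]
    dy = peer_pos[1] - own_pos[1]
    distance = math.hypot(dx, dy)
    if distance > warning_range.max_distance_m:
        return False
    # Bearing of the peer relative to the vehicle's heading, in [0, 360)
    bearing = (math.degrees(math.atan2(dy, dx)) - own_heading_deg) % 360.0
    start = warning_range.start_angle_deg % 360.0
    end = warning_range.end_angle_deg % 360.0
    if start <= end:
        return start <= bearing <= end
    return bearing >= start or bearing <= end  # sector wraps past 360 degrees

# A blind-spot sector behind and to the right of the vehicle (hypothetical values).
blind_spot = WarningRange(start_angle_deg=200, end_angle_deg=340, max_distance_m=30.0)
print(is_warning_target((0.0, 0.0), 90.0, (5.0, -10.0), blind_spot))  # → True
```

A vehicle at the origin heading north (90°) with a peer 11 m behind and to its right yields a relative bearing of about 207°, inside the 200°–340° sector, so the peer is flagged as a warning target.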
PCT/JP2020/000484 2020-01-09 2020-01-09 Information generation device, warning device, information generation method, warning method, information generation program, and warning program WO2021140621A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/000484 WO2021140621A1 (en) 2020-01-09 2020-01-09 Information generation device, warning device, information generation method, warning method, information generation program, and warning program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/000484 WO2021140621A1 (en) 2020-01-09 2020-01-09 Information generation device, warning device, information generation method, warning method, information generation program, and warning program

Publications (1)

Publication Number Publication Date
WO2021140621A1 true WO2021140621A1 (en) 2021-07-15

Family

ID=76787792

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/000484 WO2021140621A1 (en) 2020-01-09 2020-01-09 Information generation device, warning device, information generation method, warning method, information generation program, and warning program

Country Status (1)

Country Link
WO (1) WO2021140621A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008029802A1 (en) * 2006-09-04 2008-03-13 Panasonic Corporation Travel information providing device
JP2009014444A (en) * 2007-07-03 2009-01-22 Konica Minolta Holdings Inc Range finder
JP2009237776A (en) * 2008-03-26 2009-10-15 Mazda Motor Corp Vehicle drive supporting apparatus
JP2013196031A (en) * 2012-03-15 2013-09-30 Toyota Motor Corp Driving support device
WO2015001677A1 (en) * 2013-07-05 2015-01-08 ルネサスエレクトロニクス株式会社 Safety assistance system and safety assistance device
JP2015079332A (en) * 2013-10-16 2015-04-23 トヨタ自動車株式会社 Vehicle system
JP2015230665A (en) * 2014-06-06 2015-12-21 日立オートモティブシステムズ株式会社 Obstacle information management device
JP2017142760A (en) * 2016-02-12 2017-08-17 日立オートモティブシステムズ株式会社 Moving body surrounding environment recognition device

Similar Documents

Publication Publication Date Title
JP6894471B2 (en) Patrol car patrol by self-driving car (ADV) subsystem
CN109426256B (en) Lane assist system for autonomous vehicles based on driver intention
CN108068825B (en) Visual communication system for unmanned vehicles (ADV)
CN110641472B (en) Safety monitoring system for autonomous vehicle based on neural network
JP6964271B2 (en) Driving support method and driving support device, automatic driving control device, vehicle, program using it
US10077007B2 (en) Sidepod stereo camera system for an autonomous vehicle
CN108859938B (en) Method and system for automatic vehicle emergency light control for autonomous vehicles
JP6831419B2 (en) How to operate self-driving cars and warning services, systems and machine-readable media
JP4134894B2 (en) Vehicle driving support device
JP2018081080A (en) Emergency handling system for autonomous driving vehicle (adv)
JP6402684B2 (en) Display device
JP2018116705A (en) Method for holding distance between automatic driving vehicle and following vehicle by using brake light
JP2018158721A (en) Collision prediction and forward airbag deployment system for autonomous driving vehicles
CN110371018B (en) Improving vehicle behavior using information from other vehicle lights
CN111547043A (en) Automatic response to emergency service vehicle by autonomous vehicle
US11745745B2 (en) Systems and methods for improving driver attention awareness
JP2011175368A (en) Vehicle control apparatus
US11571969B2 (en) External communication suppression device for driving automation
CN107599965B (en) Electronic control device and method for vehicle
CN113401056A (en) Display control device, display control method, and computer-readable storage medium
JP2011175367A (en) Vehicle control apparatus
WO2021140621A1 (en) Information generation device, warning device, information generation method, warning method, information generation program, and warning program
US11590845B2 (en) Systems and methods for controlling a head-up display in a vehicle
EP4331938A1 (en) Control method and apparatus
WO2023090166A1 (en) Vehicle control device and vehicle control method

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20912796

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 EP: PCT application non-entry in European phase

Ref document number: 20912796

Country of ref document: EP

Kind code of ref document: A1