WO2022071323A1 - Program, information processing method, information processing terminal, and map information provision device - Google Patents

Program, information processing method, information processing terminal, and map information provision device Download PDF

Info

Publication number
WO2022071323A1
WO2022071323A1 (PCT/JP2021/035660)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
dangerous
region
distance
warning
Prior art date
Application number
PCT/JP2021/035660
Other languages
French (fr)
Japanese (ja)
Inventor
和伸 太田
正晃 上坂
Original Assignee
Arithmer株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Arithmer株式会社 filed Critical Arithmer株式会社
Priority to JP2022554016A priority Critical patent/JPWO2022071323A1/ja
Publication of WO2022071323A1 publication Critical patent/WO2022071323A1/en


Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B 29/10 - Map spot or coordinate position indicators; Map reading aids

Definitions

  • the present invention relates to a program, an information processing method, an information processing terminal, and a map information providing device.
  • Patent Document 1 discloses a factor analysis device that warns the driver by estimating the causes of vehicle accidents with high accuracy.
  • However, the invention of Patent Document 1 may issue warnings to the driver excessively and cause the driver discomfort.
  • A program according to one aspect of the present invention causes a computer to execute a process of determining, based on map information indicating dangerous places where dangerous events are likely to occur and position information of a vehicle, whether a dangerous place overlaps a first region set within a first distance from the vehicle or a second region set within a second distance from the vehicle that is shorter than the first distance.
  • The program further causes the computer to execute a process of generating an output command for a first warning when the dangerous place is determined to overlap the first region, and deleting the output command for the first warning when the dangerous place is determined to overlap the second region.
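  • The claim-level logic lends itself to a compact sketch. The following Python fragment is an illustration only, not the patent's implementation: it approximates the regions as simple circles of radius equal to the first and second distances, and the helper name, coordinates, and distances are all assumed.

```python
import math

def planar_distance_m(a, b):
    """Approximate distance in metres between two (lat, lon) points over a small area."""
    dx = (b[1] - a[1]) * 111_320 * math.cos(math.radians(a[0]))
    dy = (b[0] - a[0]) * 111_320
    return math.hypot(dx, dy)

def update_first_warning(danger_places, vehicle_pos, first_dist_m, second_dist_m, active):
    """Return True while the first-warning output command should exist.

    The command is generated when a dangerous place falls inside the first
    region and deleted once a dangerous place falls inside the second region.
    """
    dists = [planar_distance_m(vehicle_pos, p) for p in danger_places]
    if any(d <= second_dist_m for d in dists):
        return False            # second region reached: delete the output command
    if any(d <= first_dist_m for d in dists):
        return True             # first region reached: generate the output command
    return active               # otherwise keep the previous state

# Example: a danger spot about 300 m ahead, first distance 500 m, second distance 100 m
print(update_first_warning([(35.6600, 139.7000)], (35.6573, 139.7000), 500, 100, False))
```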
  • The first embodiment relates to a mode in which a dangerous event that may affect the vehicle (a passerby jumping out, a signal being ignored, etc.) is detected from images capturing the surroundings of the vehicle in which the user is riding and the user is warned, while the user is also warned of dangerous places on a map obtained by analyzing dangerous events that occurred in a plurality of vehicles.
  • FIG. 1 is an explanatory diagram showing an outline of an advance danger warning system.
  • The system of the present embodiment includes information processing terminals 1, 1, 1, ... and an information processing device 2; each device transmits and receives information via a network N such as the Internet.
  • The information processing terminal 1 is a terminal device installed in a vehicle that acquires map information, detects dangerous events involving the vehicle, outputs danger warnings, and so on.
  • The information processing terminal 1 is, for example, an information processing device such as a smartphone, a tablet, a navigation device equipped with a car navigation system (Automotive Navigation System), or a personal computer.
  • When the information processing terminal 1 is a smartphone or a tablet, it may be placed on a holder or the like installed on the dashboard while the vehicle is being driven. In the following, the information processing terminal 1 is referred to as the vehicle terminal 1.
  • the information processing device 2 is an information processing device that processes, stores, and transmits / receives various information including map information.
  • the information processing device 2 is, for example, a server device, a personal computer, or the like.
  • the information processing device 2 is assumed to be a server device, and is referred to as a server 2 in the following for the sake of brevity.
  • The vehicle terminal 1 according to the present embodiment captures images of the surroundings of the vehicle (for example, the area ahead of it), detects dangerous events from those images, and outputs a warning (second warning) to the user (see FIG. 5).
  • When a dangerous event is detected, the vehicle terminal 1 transmits detection information, including the image at the time of detection, the detection time, and the position information at the time of detection, to the server 2, thereby reporting the dangerous event.
  • The server 2 acquires detection information from the vehicle terminals 1, 1, 1, ... of a plurality of vehicles. The server 2 then generates, from the detection information of each vehicle, map information indicating dangerous places on the map where dangerous events are likely to occur. The map information is data in which points or areas on the map where dangerous events are likely to occur, together with the times or time zones at which they are likely to occur, are added to the map data displayed by a car navigation system or the like. In this embodiment, the map information is also referred to as a hazard map. The server 2 statistically processes the dangerous-event information reported from each vehicle terminal 1, generates the map information, and distributes it to each vehicle terminal 1.
  • the vehicle terminal 1 receives and displays map information from the server 2 (see FIG. 6). Then, when the vehicle approaches the dangerous place indicated by the map information, the vehicle terminal 1 outputs a warning (first warning) notifying that the vehicle has approached the dangerous place. In this way, the vehicle terminal 1 outputs a first warning when the vehicle approaches a dangerous place on the map, and outputs a second warning when a dangerous event is detected from an image around the vehicle.
  • FIG. 2 is a block diagram showing a configuration example of the vehicle terminal 1.
  • the vehicle terminal 1 includes a control unit 11, a storage unit 12, a communication unit 13, an input unit 14, a display unit 15, a photographing unit 16, an auxiliary storage unit 17, a GPS (Global Positioning System) module 18, and a speaker 19.
  • Each configuration is connected by bus B.
  • The control unit 11 includes an arithmetic processing unit such as a CPU (Central Processing Unit), an MPU (Micro-Processing Unit), or a GPU (Graphics Processing Unit), and performs various information processing and control processing related to the vehicle terminal 1 by reading and executing the control program 1P stored in the storage unit 12.
  • The storage unit 12 includes memory elements such as RAM (Random Access Memory) and ROM (Read Only Memory), and stores the control program 1P or data required for the control unit 11 to execute processing. Further, the storage unit 12 temporarily stores data and the like necessary for the control unit 11 to execute arithmetic processing.
  • the communication unit 13 is a communication module for performing processing related to communication, and transmits / receives information to / from the server 2 or the like via the network N.
  • the input unit 14 may be a keyboard, a mouse, or a touch panel integrated with the display unit 15.
  • the display unit 15 is a liquid crystal display, an organic EL (electroluminescence) display, or the like, and displays various information according to the instructions of the control unit 11.
  • The photographing unit 16 is a photographing device such as a CCD (Charge Coupled Device) camera or a CMOS (Complementary Metal Oxide Semiconductor) camera.
  • the photographing unit 16 may be composed of a plurality of photographing devices.
  • The photographing unit 16 need not be built into the vehicle terminal 1; it may instead be an external device connected directly to the vehicle terminal 1 so that it can capture images.
  • the auxiliary storage unit 17 is a non-volatile storage area such as a large-capacity memory or a hard disk, and includes, for example, a recording medium such as an HDD (Hard disk drive) or SSD (Solid State Drive).
  • the GPS module 18 is a module for acquiring position information of a vehicle using GPS satellites.
  • the speaker 19 is a device that converts an electric signal into sound.
  • FIG. 3 is a block diagram showing a configuration example of the server 2.
  • the server 2 includes a control unit 21, a storage unit 22, a communication unit 23, a reading unit 24, and a large-capacity storage unit 25. Each configuration is connected by bus B.
  • the control unit 21 includes arithmetic processing units such as a CPU, MPU, and GPU, and performs various information processing, control processing, and the like related to the server 2 by reading and executing the control program 2P stored in the storage unit 22. Although the control unit 21 is described as a single processor in FIG. 3, it may be a multiprocessor.
  • the storage unit 22 includes memory elements such as RAM and ROM, and stores the control program 2P or data required for the control unit 21 to execute the process. Further, the storage unit 22 temporarily stores data and the like necessary for the control unit 21 to execute the arithmetic processing.
  • the communication unit 23 is a communication module for performing processing related to communication, and transmits / receives information to / from the vehicle terminal 1 or the like via the network N.
  • the reading unit 24 reads a portable storage medium 2a including a CD (Compact Disc) -ROM or a DVD (Digital Versatile Disc) -ROM.
  • the control unit 21 may read the control program 2P from the portable storage medium 2a via the reading unit 24 and store it in the large-capacity storage unit 25. Further, the control unit 21 may download the control program 2P from another computer via the network N or the like and store it in the large-capacity storage unit 25. Furthermore, the control unit 21 may read the control program 2P from the semiconductor memory 2b.
  • the large-capacity storage unit 25 includes a recording medium such as an HDD or SSD.
  • the large-capacity storage unit 25 includes a map DB (database) 251 and a detection information DB 252.
  • the map DB 251 stores map information indicating a dangerous place where a dangerous event is likely to occur.
  • the map information may be stored in an external dedicated map database.
  • the detection information DB 252 stores the detection information in which a dangerous event of the vehicle is detected.
  • the storage unit 22 and the large-capacity storage unit 25 may be configured as an integrated storage device. Further, the large-capacity storage unit 25 may be composed of a plurality of storage devices. Furthermore, the large-capacity storage unit 25 may be an external storage device connected to the server 2.
  • Although the server 2 is described as a single information processing device in the present embodiment, its processing may be distributed across a plurality of servers, or it may be configured as a virtual machine.
  • FIG. 4 is an explanatory diagram showing an example of the record layout of the detection information DB 252.
  • The detection information DB 252 includes a detection ID column, a vehicle ID column, an image column, a detection time column, a position information column, and a dangerous event column.
  • The detection ID column stores a detection ID for identifying the detection information.
  • The vehicle ID column stores a vehicle ID for identifying the vehicle.
  • The image column stores the image captured when the dangerous event was detected.
  • The detection time column stores the time at which the dangerous event was detected.
  • The position information column stores the position information (for example, point name, latitude/longitude, etc.) of the place where the dangerous event occurred.
  • The dangerous event column stores the content of the dangerous event (for example, a person jumping out, a person being caught in the vehicle, a red light being ignored, etc.).
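  • For illustration, the columns above can be mirrored by a small record type; the field names and types below are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DetectionRecord:
    detection_id: str          # identifies the detection information
    vehicle_id: str            # identifies the reporting vehicle
    image: bytes               # image captured when the dangerous event was detected
    detected_at: datetime      # time of detection
    location: tuple            # (latitude, longitude) or a point name
    event: str                 # e.g. "person jumping out", "red light ignored"

record = DetectionRecord(
    detection_id="d0001",
    vehicle_id="v042",
    image=b"",                                  # placeholder image payload
    detected_at=datetime(2021, 9, 30, 7, 15),
    location=(35.6812, 139.7671),
    event="person jumping out",
)
print(record.event)
```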
  • FIG. 5 is an explanatory diagram of detecting a dangerous event of the vehicle.
  • Dangerous events of a vehicle include external dangerous events, such as a passerby jumping out, a person being caught in the vehicle, or another vehicle approaching, and/or internal dangerous events (dangerous driving), such as the own vehicle ignoring a red light or a stop sign.
  • the vehicle terminal 1 mounted on the vehicle acquires an image of the surroundings of the vehicle. For example, as shown in FIG. 5, the vehicle terminal 1 acquires an image with the front of the vehicle as the imaging range.
  • the image may be captured by the photographing unit 16 of the vehicle terminal 1, or may be captured by an external imaging device. Further, the image may include an image of not only the front of the vehicle but also the rear and side of the vehicle. Alternatively, the vehicle terminal 1 may acquire an image captured in all directions by a 360-degree camera or the like.
  • the vehicle terminal 1 detects a dangerous event of the vehicle in the acquired image. For example, as shown in the figure, when the pedestrian 11a moves from the sidewalk side to the roadway side, the event is detected as a dangerous event.
  • The method of detecting a dangerous event is not particularly limited; for example, the vehicle terminal 1 detects a specific object (a passerby jumping out onto the road, a red light, etc.) by pattern matching based on the shape, size, color, and the like of objects in the image.
  • Alternatively, the vehicle terminal 1 may detect dangerous events by preparing a trained machine learning model (for example, a neural network) that detects a dangerous event when a captured image is input to it.
  • The vehicle terminal 1 may detect dangerous events from the entire image (the whole area); in the present embodiment, however, the vehicle terminal 1 detects dangerous events from a partial region of interest 11b (ROI: Region of Interest) within the captured image.
  • The region of interest 11b is an image region corresponding to the area on the travel path within a predetermined distance from the vehicle. In FIG. 5, the region of interest 11b is indicated by hatching.
  • the region of interest 11b shown in FIG. 5 is an example.
  • the region of interest 11b may include a sidewalk in addition to the travel path (road), or all the travel paths recognizable from the image may be included in the region of interest.
  • the position, shape, range, etc. of the region of interest 11b in the image are not particularly limited.
  • the vehicle terminal 1 detects a dangerous event by detecting a specific object from a region of interest within a predetermined distance from the vehicle.
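  • The patent does not spell out how the region of interest is applied during detection. One hedged reading is that candidate detections from whatever detector is used are simply discarded when they fall outside the ROI; the rectangular ROI and all names below are illustrative assumptions.

```python
def boxes_overlap(a, b):
    """Axis-aligned overlap test for boxes given as (x1, y1, x2, y2)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def filter_to_roi(detections, roi_box):
    """Keep only detections whose bounding box intersects the region of interest.

    detections: list of (label, (x1, y1, x2, y2)) from some external detector.
    roi_box:    (x1, y1, x2, y2) image region covering the travel path near the vehicle.
    """
    return [(label, box) for label, box in detections if boxes_overlap(box, roi_box)]

# Example: only the pedestrian inside the ROI is treated as a dangerous event candidate
detections = [("pedestrian", (400, 300, 450, 400)), ("pedestrian", (10, 10, 40, 60))]
roi = (320, 250, 640, 480)
print(filter_to_roi(detections, roi))
```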
  • the vehicle terminal 1 outputs a second warning when it detects a dangerous event of the vehicle.
  • the second warning is a warning to alert the driver when a dangerous event of the vehicle is detected.
  • For example, the vehicle terminal 1 may output an audio signal of the second warning, such as "A pedestrian jumping out ahead has been detected. Please be careful!", through the speaker 19 a predetermined number of times (for example, three times).
  • the second warning may be output repeatedly without being limited to a predetermined number of times. In this case, for example, the vehicle terminal 1 may erase the second warning after the vehicle reaches the periphery of the place (point) where the dangerous event is detected. Further, at the same time as the output (reproduction) of the second warning, the warning content in text format may be displayed on the screen.
  • the vehicle terminal 1 transmits the detection information of detecting a dangerous event of the vehicle to the server (map information providing device) 2.
  • The detection information includes the vehicle ID, the image at the time the dangerous event was detected, the detection time, the position information (point name, longitude/latitude, etc.), and the content of the dangerous event (for example, a red light being ignored, a person jumping out, etc.).
  • the server 2 receives the detection information transmitted from the vehicle terminal 1 and stores the received detection information in the detection information DB 252. Specifically, the server 2 allocates a detection ID and stores the vehicle ID, the image, the detection time, the position information, and the contents of the dangerous event as one record in the detection information DB 252.
  • the server 2 generates map information indicating a dangerous place on a map where a dangerous event is likely to occur, based on the detection information in each vehicle stored in the detection information DB 252.
  • the map information is data in which data indicating a point or area on a map where a dangerous event is likely to occur and a time or time zone where a dangerous event is likely to occur are added to the map data.
  • the map information includes the longitude / latitude of the dangerous place, the occurrence rate of the dangerous event for each dangerous place (for example, 25%), the danger level set according to the occurrence rate of the dangerous event, and the time zone in which the dangerous event is likely to occur.
  • Based on the detection information received from each vehicle terminal 1, the server 2 aggregates the number of occurrences, the frequency of occurrence, the occurrence rate, and so on of dangerous events at each point or area on the map by time or time zone, and generates map information indicating dangerous places where the danger level, determined according to the calculated occurrence rate and the like, is at or above a certain level.
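  • The statistical processing itself is left open. As one hypothetical sketch, detection reports could be binned by rounded coordinates and hour of day, with a bin marked as a dangerous place once its count reaches a threshold; the grid size and threshold below are assumed values.

```python
from collections import Counter

def build_hazard_map(detections, cell_deg=0.001, danger_threshold=5):
    """Aggregate detection reports into a hazard map.

    detections: iterable of (lat, lon, hour, event) tuples reported by vehicles.
    cell_deg:   grid size in degrees used to group nearby reports (illustrative value).
    danger_threshold: minimum count for a (cell, hour) bin to be marked dangerous.
    """
    counts = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg, hour)
        for lat, lon, hour, _event in detections
    )
    return [
        {"lat": round(lat, 6), "lon": round(lon, 6), "hour": hour, "count": n}
        for (lat, lon, hour), n in counts.items()
        if n >= danger_threshold
    ]

# Six reports of the same event near the same spot around 7 o'clock
reports = [(35.6580, 139.7016, 7, "person jumping out")] * 6
print(build_hazard_map(reports))
```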
  • the server 2 statistically processes the detection information in real time or at regular intervals (for example, one week, one month, etc.) every time the detection information is received from the vehicle terminal 1, and updates the map information.
  • the server 2 transmits the map information updated according to the detection information to the vehicle terminal 1 mounted on each vehicle.
  • Each vehicle terminal 1 receives the map information transmitted from the server 2 and displays it on the screen.
  • FIG. 6 is an explanatory diagram showing an example of a screen for displaying map information.
  • the screen includes a map display field 12a, a danger location display field 12b, and a danger event icon 12c.
  • the map display field 12a is a display field for displaying a map.
  • the dangerous place display column 12b is a display field for displaying a mark indicating a dangerous place (area).
  • the dangerous event icon 12c is an icon for indicating a dangerous event.
  • the server 2 acquires the map (hazard map) information from the map DB 251.
  • the server 2 transmits the acquired map information to the vehicle terminal 1.
  • the vehicle terminal 1 displays a hazard map on the screen based on the map information transmitted from the server 2.
  • On the hazard map, a mark indicating each dangerous place where dangerous events of vehicles have been likely to occur in the past, the occurrence rate of dangerous events at each dangerous place (for example, 25%), the time zone in which dangerous events are likely to occur (for example, 7:00 to 8:00), and icons set according to the content of the dangerous events (for example, a jump-out icon, a caught-in icon, etc.) are displayed superimposed on the map.
  • The vehicle terminal 1 displays the map data included in the received map information in the map display field 12a, displays the mark indicating the dangerous place in the dangerous place display field 12b, and displays the icon indicating the dangerous event as the dangerous event icon 12c.
  • FIG. 7 is an explanatory diagram showing the relationship between the vehicle and various areas.
  • FIG. 8 is an explanatory diagram regarding the first warning based on the dangerous place.
  • In the present embodiment, when the vehicle approaches a dangerous place, the vehicle terminal 1 outputs a first warning to the effect that the vehicle has approached the dangerous place, in addition to the second warning described above.
  • The vehicle terminal 1 performs the following processing so that the first warning is not output excessively.
  • the vehicle terminal 1 acquires the current position information of the vehicle via the GPS module 18. Next, the vehicle terminal 1 determines whether or not the vehicle is approaching the dangerous place based on the acquired current position information and the position information of the dangerous place indicated by the map information. Specifically, the vehicle terminal 1 determines whether or not the dangerous portion overlaps with the first region shown in FIG. 7A.
  • The first region is, for example, a fan-shaped (sector) region set forward of the vehicle within the first distance, with the vehicle position at its center.
  • However, the first region is not limited to a fan shape; it may be, for example, an annular sector centered on the vehicle and set between the first distance and the second distance described later.
  • the first region may be a rectangular region or the like.
  • the first region may be any region set within the first distance from the vehicle, and its shape, position (distance), range, and the like are not particularly limited.
  • the vehicle terminal 1 determines whether or not the danger point overlaps with the second region set within the second distance, which is shorter than the first distance. Then, when the vehicle terminal 1 determines that the dangerous portion overlaps with the second region, the vehicle terminal 1 deletes the output command of the first warning.
  • The second region is, for example, a fan-shaped region set ahead of the vehicle with the vehicle position at its center, as indicated by the white area surrounded by a broken line in FIG. 7B, and constitutes a part of the first region.
  • the second region may be set as an annular fan-shaped region, a rectangular region, or the like, and may be a region within the second distance shorter than the first distance.
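  • One possible geometric reading of the fan-shaped regions is a test on distance and bearing relative to the vehicle heading; the planar approximation, the 45-degree half angle, and the function name below are assumptions for illustration.

```python
import math

def in_fan_region(vehicle, heading_deg, place, radius_m, half_angle_deg=45.0):
    """True if `place` lies inside a fan-shaped region ahead of the vehicle.

    vehicle, place: (lat, lon) points; heading_deg: travel direction, 0 = north,
    measured clockwise. Uses a small-area planar approximation.
    """
    dx = (place[1] - vehicle[1]) * 111_320 * math.cos(math.radians(vehicle[0]))  # east
    dy = (place[0] - vehicle[0]) * 111_320                                        # north
    dist = math.hypot(dx, dy)
    if dist == 0:
        return True
    if dist > radius_m:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360          # bearing to the place
    diff = abs((bearing - heading_deg + 180) % 360 - 180)     # smallest angular difference
    return diff <= half_angle_deg

# A spot about 300 m almost straight ahead of a north-bound vehicle, first distance 500 m
print(in_fan_region((35.6573, 139.7000), 0.0, (35.6600, 139.7005), 500))
```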
  • FIG. 8 illustrates how the vehicle gradually approaches the dangerous place.
  • When the dangerous place is set as a certain area on the map (the elliptical area in FIG. 8) and the first region overlaps the area corresponding to the dangerous place, the vehicle terminal 1 generates an output command for the first warning and outputs the first warning (FIG. 8).
  • the first warning is a warning to alert the driver based on the distance from the vehicle to the dangerous place.
  • For example, the vehicle terminal 1 may output an audio signal of the first warning, such as "You will soon enter a dangerous area where accidents are likely to occur. Please be careful!", through the speaker 19 a predetermined number of times (for example, three times).
  • the number of times the first warning is output is not limited to a predetermined number of times.
  • the vehicle terminal 1 starts outputting the first warning when the first region overlaps the dangerous place, and continuously outputs the first warning thereafter.
  • the vehicle terminal 1 determines whether or not the dangerous portion overlaps with the second region. When it is determined that the second region overlaps (the third state from the top of FIG. 8), the vehicle terminal 1 deletes the output command of the first warning being output.
  • As described above, the vehicle terminal 1 generates the output command for the first warning when a dangerous place overlaps the first region set within the first distance from the vehicle, and deletes the output command for the first warning when the dangerous place overlaps the second region set within the second distance, which is shorter than the first distance.
  • the vehicle terminal 1 may continue to output the first warning as it is, or may end the output of the first warning after a certain period of time has elapsed after the dangerous portion deviates from the first region.
  • the exception handling when the dangerous portion does not overlap with the second region is not particularly limited.
  • Although the second region has been described above as a part of the first region, the first region and the second region may be set so as not to overlap each other (for example, the first region may be an annular sector extending from the second distance to the first distance and the second region a sector on the vehicle side of the first region), or the first region and the second region may partially overlap.
  • the vehicle terminal 1 changes the processing content of the dangerous event detection so that the dangerous event can be detected with high sensitivity when the dangerous part overlaps with the second region.
  • the vehicle terminal 1 expands the region of interest 11b exemplified in FIG. 5 when the second region overlaps the dangerous portion.
  • the vehicle terminal 1 extends the distance from the vehicle to the tip of the region of interest 11b (the upper side of the region of interest 11b on the upper side of FIG. 5) to expand the area on the travel path to be detected.
  • the detection sensitivity can be increased when the dangerous portion overlaps with the second region, that is, when the vehicle enters a place where a dangerous event is likely to occur, and the dangerous event can be suitably avoided.
  • Alternatively, the vehicle terminal 1 may change a parameter (threshold value) used as a reference when detecting a dangerous event from an image.
  • For example, when a dangerous event is detected using a machine learning model for object detection, the vehicle terminal 1 lowers the threshold applied to the output value of the model (for example, the probability that an object recognized in the image is a specific object such as a passerby or a red light). In this case as well, the detection sensitivity for dangerous events can be increased in the same manner as described above.
  • In the above description, the region of interest is expanded or the threshold value is changed when the dangerous place overlaps the second region, but the present embodiment is not limited to this.
  • The region of interest may be expanded or the threshold value changed with respect to a separate region; that is, the second region used for deleting the output command of the first warning and the third region used for expanding the region of interest or changing the threshold value may be set separately.
  • In the present embodiment, however, the second region and the third region are treated as the same region, as shown in FIG. 7B.
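  • Purely as an assumed sketch, the two sensitivity adjustments described above (extending the far edge of the region of interest and lowering the detection threshold) can be modeled as a configuration switch that is applied while a dangerous place overlaps the second (or third) region; the numeric values are illustrative.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DetectionConfig:
    roi_far_edge_m: float   # distance from the vehicle to the far edge of the ROI
    score_threshold: float  # minimum detector score treated as a dangerous event

NORMAL = DetectionConfig(roi_far_edge_m=30.0, score_threshold=0.6)

def sensitivity_config(in_second_region, base=NORMAL,
                       roi_scale=1.5, lowered_threshold=0.4):
    """Return a more sensitive configuration while the vehicle is in the second region.

    The ROI far edge is extended and the score threshold lowered; the scale and
    threshold values here are illustrative assumptions, not taken from the patent.
    """
    if not in_second_region:
        return base
    return replace(base,
                   roi_far_edge_m=base.roi_far_edge_m * roi_scale,
                   score_threshold=lowered_threshold)

print(sensitivity_config(True))   # DetectionConfig(roi_far_edge_m=45.0, score_threshold=0.4)
```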
  • FIG. 9 is a flowchart showing a dangerous event detection processing procedure.
  • In the following description, the vehicle terminal 1 that has detected a dangerous event is denoted by reference numeral 1a, and the other vehicle terminals 1 are denoted by reference numeral 1b.
  • the control unit 11 of the vehicle terminal 1a acquires map information indicating a dangerous place where a dangerous event is likely to occur from the map DB 251 of the server 2 via the communication unit 13 and displays it on the display unit 15 (step S01).
  • the control unit 11 acquires an image of the surroundings of the vehicle via the communication unit 13 or the photographing unit 16 (step S02). For example, the control unit 11 acquires an image with the front of the vehicle as the imaging range.
  • the control unit 11 detects a dangerous event of the vehicle from the region of interest in the acquired image by using pattern matching, a machine learning model, or the like (step S03).
  • the control unit 11 determines whether or not a dangerous event of the vehicle has occurred based on the detection result of the dangerous event (step S04).
  • When the control unit 11 determines that no dangerous event of the vehicle has occurred (NO in step S04), the control unit 11 returns to the process of step S02.
  • When the control unit 11 determines that a dangerous event of the vehicle has occurred (YES in step S04), the control unit 11 outputs the second warning via the speaker 19 (step S05).
  • the control unit 11 transmits the detection information including the vehicle ID, the image at the time of detecting the dangerous event, the detection time, the position information, and the content of the dangerous event to the server 2 by the communication unit 13 (step S06).
  • the control unit 21 of the server 2 receives the detection information transmitted from the vehicle terminal 1a by the communication unit 23 and stores it in the detection information DB 252 (step S07).
  • The control unit 21 analyzes the detection information from each vehicle terminal 1a stored in the detection information DB 252 (step S08). For example, the control unit 21 calculates, by time or time zone, the number of occurrences, the frequency of occurrence, the occurrence probability, and the like of dangerous events at each point or area on the map, based on the detection time, position information, and other data indicated by the detection information from each vehicle terminal 1a.
  • The control unit 21 generates map information indicating dangerous places on the map where dangerous events are likely to occur based on the analysis result of step S08, and updates the map DB 251 with the generated map information (step S09). For example, the control unit 21 compares the number of occurrences of dangerous events at each time or in each time zone with a predetermined threshold value, determines the danger level of each point or area on the map, and generates map information in which points or areas whose danger level is at or above a certain level are designated as dangerous places. The control unit 21 transmits the updated map information to each vehicle terminal 1 via the communication unit 23 (step S10).
  • steps S07 to S09 may be executed every time the detection information is received (step S06), or may be executed at regular intervals.
  • the control unit 11 of the vehicle terminal 1a receives the updated map information transmitted from the server 2 by the communication unit 13 (step S11).
  • the control unit 11 displays the received map information on the display unit 15 (step S12), and ends the process.
  • the control unit 11 of the vehicle terminal 1b receives the updated map information transmitted from the server 2 by the communication unit 13 (step S13).
  • the control unit 11 displays the received map information on the display unit 15 (step S14), and ends the process.
  • FIG. 10 is a flowchart showing a warning processing procedure for a dangerous place.
  • the vehicle terminal 1 executes the following processing in parallel with the dangerous event detection processing described with reference to FIG.
  • the control unit 11 of the vehicle terminal 1 acquires map information indicating a dangerous place where a dangerous event is likely to occur from the map DB 251 of the server 2 via the communication unit 13 (step S101).
  • the control unit 11 acquires the current position information via the GPS module 18 (step S102).
  • The control unit 11 determines whether or not a dangerous place overlaps the first region set within the first distance from the vehicle, based on the current position information and the position information of the dangerous place indicated by the map information (step S103). When the control unit 11 determines that no dangerous place overlaps the first region (NO in step S103), the control unit 11 returns to the process of step S102. When the control unit 11 determines that a dangerous place overlaps the first region (YES in step S103), the control unit 11 generates an output command for the first warning and outputs the first warning via the speaker 19 (step S104).
  • The control unit 11 determines whether or not the dangerous place overlaps the second region set within the second distance, which is shorter than the first distance, based on the current position information and the position information of the dangerous place indicated by the map information (step S105). When the control unit 11 determines that the dangerous place does not overlap the second region (NO in step S105), the control unit 11 returns to the process of step S105. When the control unit 11 determines that the dangerous place overlaps the second region (YES in step S105), the control unit 11 deletes the output command of the first warning being output (step S106). The control unit 11 then expands the region of interest of the captured image used for detecting dangerous events (step S107).
  • The control unit 11 determines whether or not the dangerous place has left the second region, based on the current position information and the position information of the dangerous place indicated by the map information (step S108). When the control unit 11 determines that the dangerous place has left the second region (YES in step S108), the control unit 11 returns the expanded region of interest to its size before expansion (step S109) and returns to the process of step S101. When the control unit 11 determines that the dangerous place has not left the second region (NO in step S108), the control unit 11 returns to the process of step S108.
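  • Tying the steps of FIG. 10 together, a simplified simulation over a trace of vehicle positions might look as follows; circular regions stand in for the fan-shaped ones, and the distances and coordinates are assumed.

```python
import math

def dist_m(a, b):
    """Planar small-area approximation of the distance between two (lat, lon) points."""
    dx = (b[1] - a[1]) * 111_320 * math.cos(math.radians(a[0]))
    dy = (b[0] - a[0]) * 111_320
    return math.hypot(dx, dy)

def warning_actions(positions, danger_place, first_m=500, second_m=100):
    """Yield the actions of FIG. 10 (simplified to circular regions) for a position trace."""
    warned = roi_expanded = False
    for pos in positions:
        d = dist_m(pos, danger_place)
        if not warned and d <= first_m:
            warned = True
            yield "output first warning"               # S104
        if warned and not roi_expanded and d <= second_m:
            roi_expanded = True
            yield "delete first warning, expand ROI"   # S106, S107
        if roi_expanded and d > second_m:
            roi_expanded = warned = False
            yield "restore ROI"                        # S109

trace = [(35.6520, 139.7000), (35.6555, 139.7000), (35.6572, 139.7000), (35.6590, 139.7000)]
print(list(warning_actions(trace, (35.6575, 139.7000))))
```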
  • According to the present embodiment, when a dangerous place overlaps the second region, the detection sensitivity for dangerous events of the vehicle can also be increased by changing the threshold value used when detecting dangerous events.
  • In the above, map information generated based on detection information from vehicles has been described as an example, but the map information according to the present embodiment is not limited to this.
  • the map information may reflect traffic information or the like based on disaster information delivered in real time.
  • the second embodiment relates to a mode in which the first region is expanded according to the traveling speed of the vehicle.
  • the description of the contents overlapping with the first embodiment will be omitted.
  • the traveling speed of the vehicle affects the time it takes for the vehicle to reach the dangerous spot.
  • When the traveling speed is high, the time it takes for the vehicle to reach the dangerous place becomes short, so the timing for outputting a danger warning to the user may be missed.
  • In such a case, the danger warning can be output to the user at an earlier stage by expanding the first region.
  • FIG. 11 is a flowchart showing a processing procedure when expanding the first region according to the traveling speed of the vehicle.
  • the contents overlapping with FIG. 10 are designated by the same reference numerals and the description thereof will be omitted.
  • the control unit 11 of the vehicle terminal 1 acquires the traveling speed of the vehicle from, for example, a speed measurement sensor mounted on the vehicle (step S111).
  • the control unit 11 determines whether or not the acquired traveling speed is equal to or higher than a predetermined reference speed (step S112).
  • When the control unit 11 determines that the acquired traveling speed is equal to or higher than the reference speed (YES in step S112), the control unit 11 expands the first region (step S113) and executes the process of step S103. For example, the control unit 11 expands the radius of the fan-shaped first region according to the magnitude of the traveling speed; when the acquired traveling speed is 1.2 times the reference speed, the radius of the first region is expanded 1.2 times. When the control unit 11 determines that the acquired traveling speed is less than the reference speed (NO in step S112), the control unit 11 proceeds to the process of step S103.
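  • As a hedged sketch of this scaling rule, the first distance could be multiplied by the ratio of the travelling speed to a reference speed once that speed is reached; the 40 km/h reference below is an assumed value.

```python
def first_distance(base_first_m, speed_kmh, reference_kmh=40.0):
    """Expand the first distance in proportion to the travelling speed.

    Below the reference speed the base distance is kept; at, e.g., 1.2 times the
    reference speed the distance is multiplied by 1.2. The reference speed of
    40 km/h is an illustrative assumption.
    """
    if speed_kmh < reference_kmh:
        return base_first_m
    return base_first_m * (speed_kmh / reference_kmh)

print(first_distance(500, 48))   # 1.2 x reference speed -> 600.0 m
```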
  • the third embodiment relates to a mode in which the distance from the vehicle to the third region is changed according to the attribute of the user (driver) who drives the vehicle.
  • In the present embodiment, the above-mentioned second region and the third region are distinguished. Descriptions of the contents overlapping with the first and second embodiments are omitted.
  • User attributes include user age, etc.
  • An elderly driver may have a high incidence of traffic accidents due to problems such as traveling at a speed lower than the legal speed, applying the brakes late, or responding slowly.
  • Therefore, in the present embodiment, the third distance from the vehicle is lengthened for such a driver according to a predetermined rule.
  • For example, the third distance from the vehicle may be multiplied by 1.5 for an elderly driver in his or her 70s.
  • the user's attributes may include driving skills, years of driving experience, and the like.
  • FIG. 12 is a block diagram showing a configuration example of the server 2 of the second embodiment.
  • the contents overlapping with FIG. 3 are designated by the same reference numerals and the description thereof will be omitted.
  • the user DB 253 is stored in the large-capacity storage unit 25.
  • the user DB 253 stores user information including user attributes.
  • FIG. 13 is an explanatory diagram showing an example of the record layout of the user DB 253.
  • the user DB 253 includes a user ID column, a gender column, and an age column.
  • The user ID column stores an ID that uniquely identifies each user.
  • The gender column stores the gender of the user.
  • The age column stores the age of the user.
  • FIG. 14 is a flowchart showing a processing procedure when changing the third distance according to the age of the user.
  • the control unit 11 of the vehicle terminal 1 acquires the age of the user from the user DB 253 of the server 2 based on the user ID (step S121).
  • the control unit 11 determines whether or not the user is an elderly driver based on the acquired age of the user (step S122). For example, when the user's age is 70 years or older, the control unit 11 may determine that the user is an elderly driver.
  • When the control unit 11 determines that the user is an elderly driver (YES in step S122), the control unit 11 sets the third distance from the vehicle longer than in the case of a non-elderly driver (step S123), and ends the process. For example, the third distance from the vehicle is increased 1.2 times.
  • When the control unit 11 determines that the user is not an elderly driver (NO in step S122), the control unit 11 ends the process.
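  • A minimal sketch of the age-based rule, using the 70-year boundary and the 1.2-times example given above (other attributes such as driving experience could be substituted):

```python
def third_distance(base_third_m, age, elderly_age=70, factor=1.2):
    """Lengthen the third distance for elderly drivers.

    elderly_age and factor follow the examples in the text (70 years, 1.2 times);
    they are illustrative values, and other rules could be used instead.
    """
    return base_third_m * factor if age >= elderly_age else base_third_m

print(third_distance(100, 72))   # 120.0
print(third_distance(100, 35))   # 100
```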
  • However, the present invention is not limited to this; the third distance may also be changed in consideration of, for example, the user's driving skill or years of driving experience.
  • According to the present embodiment, by changing the third distance from the vehicle in accordance with the attribute of the user, a danger warning can be output to the user in advance at an appropriate timing.
  • Although the second region and the third region are distinguished in the present embodiment, the second region and the third region may be the same region.
  • Furthermore, not only the third distance but also the first distance may be changed.
  • For an elderly driver, lengthening the first distance allows the driver to recognize the detection of a dangerous place earlier. This makes it possible to avoid a situation in which an elderly driver must respond in a hurry to a dangerous event when the hazard map is updated in real time.
  • 11 Control unit, 12 Storage unit, 13 Communication unit, 14 Input unit, 15 Display unit, 16 Photographing unit, 17 Auxiliary storage unit, 18 GPS module, 19 Speaker, 1P Control program, 2 Information processing device (server / map information providing device), 21 Control unit, 22 Storage unit, 23 Communication unit, 24 Reading unit, 25 Large-capacity storage unit, 251 Map DB, 252 Detection information DB, 253 User DB, 2a Portable storage medium, 2b Semiconductor memory, 2P Control program

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Traffic Control Systems (AREA)

Abstract

[Problem] To provide a program to output, to a driver, an appropriate danger warning in advance, and the like. [Solution] This program causes a computer to execute processing for determining, on the basis of map information indicating a dangerous spot at which a dangerous event tends to occur, and positional information of a vehicle, whether or not the dangerous spot overlaps with a first region set within a first distance from the vehicle and a second region set within a second distance shorter than the first distance from the vehicle. The program causes the computer to execute processing for generating a first warning output command if the dangerous spot is determined to overlap with the first region, and deleting the first warning output command if the dangerous spot is determined to overlap with the second region.

Description

Program, information processing method, information processing terminal, and map information providing device
The present invention relates to a program, an information processing method, an information processing terminal, and a map information providing device.
Patent Document 1 discloses a factor analysis device that warns the driver by estimating the causes of vehicle accidents with high accuracy.
Japanese Patent No. 6545940
However, the invention according to Patent Document 1 may issue warnings to the driver excessively and cause the driver discomfort.
A program according to one aspect of the present invention causes a computer to execute a process of determining, based on map information indicating dangerous places where dangerous events are likely to occur and position information of a vehicle, whether a dangerous place overlaps a first region set within a first distance from the vehicle or a second region set within a second distance from the vehicle that is shorter than the first distance. The program further causes the computer to execute a process of generating an output command for a first warning when the dangerous place is determined to overlap the first region, and deleting the output command for the first warning when the dangerous place is determined to overlap the second region.
According to one aspect of the present invention, it is possible to output an appropriate danger warning to the driver in advance.
FIG. 1 is an explanatory diagram showing an outline of an advance danger warning system. FIG. 2 is a block diagram showing a configuration example of the vehicle terminal. FIG. 3 is a block diagram showing a configuration example of the server. FIG. 4 is an explanatory diagram showing an example of the record layout of the detection information DB. FIG. 5 is an explanatory diagram of detecting a dangerous event of the vehicle. FIG. 6 is an explanatory diagram showing an example of a screen that displays map information. FIG. 7 is an explanatory diagram showing the relationship between the vehicle and the various regions. FIG. 8 is an explanatory diagram of the first warning based on a dangerous place. FIG. 9 is a flowchart showing the procedure of the dangerous event detection process. FIG. 10 is a flowchart showing the procedure of the warning process for dangerous places. FIG. 11 is a flowchart showing the procedure for expanding the first region according to the traveling speed of the vehicle. FIG. 12 is a block diagram showing a configuration example of the server according to Embodiment 2. FIG. 13 is an explanatory diagram showing an example of the record layout of the user DB. FIG. 14 is a flowchart showing the procedure for changing the third distance according to the age of the user.
Hereinafter, the present invention will be described in detail with reference to the drawings showing embodiments thereof.
(Embodiment 1)
The first embodiment relates to a mode in which a dangerous event that may affect the vehicle (a passerby jumping out, a signal being ignored, etc.) is detected from images capturing the surroundings of the vehicle in which the user is riding and the user is warned, while the user is also warned of dangerous places on a map obtained by analyzing dangerous events that occurred in a plurality of vehicles. FIG. 1 is an explanatory diagram showing an outline of an advance danger warning system. The system of the present embodiment includes information processing terminals 1, 1, 1, ... and an information processing device 2; each device transmits and receives information via a network N such as the Internet.
The information processing terminal 1 is a terminal device installed in a vehicle that acquires map information, detects dangerous events involving the vehicle, outputs danger warnings, and so on. The information processing terminal 1 is, for example, an information processing device such as a smartphone, a tablet, a navigation device equipped with a car navigation system (Automotive Navigation System), or a personal computer. When the information processing terminal 1 is a smartphone or a tablet, it may be placed on a holder or the like installed on the dashboard while the vehicle is being driven. In the following, the information processing terminal 1 is referred to as the vehicle terminal 1.
The information processing device 2 is an information processing device that processes, stores, and transmits/receives various information including map information. The information processing device 2 is, for example, a server device or a personal computer. In the present embodiment, the information processing device 2 is assumed to be a server device and is referred to as the server 2 below for brevity.
The vehicle terminal 1 according to the present embodiment captures images of the surroundings of the vehicle (for example, the area ahead of it), detects dangerous events from those images, and outputs a warning (second warning) to the user (see FIG. 5). When a dangerous event is detected, the vehicle terminal 1 transmits detection information, including the image at the time of detection, the detection time, and the position information at the time of detection, to the server 2, thereby reporting the dangerous event.
The server 2 acquires detection information from the vehicle terminals 1, 1, 1, ... of a plurality of vehicles. The server 2 then generates, from the detection information of each vehicle, map information indicating dangerous places on the map where dangerous events are likely to occur. The map information is data in which points or areas on the map where dangerous events are likely to occur, together with the times or time zones at which they are likely to occur, are added to the map data displayed by a car navigation system or the like. In this embodiment, the map information is also referred to as a hazard map. The server 2 statistically processes the dangerous-event information reported from each vehicle terminal 1, generates the map information, and distributes it to each vehicle terminal 1.
The vehicle terminal 1 receives the map information from the server 2 and displays it (see FIG. 6). When the vehicle approaches a dangerous place indicated by the map information, the vehicle terminal 1 outputs a warning (first warning) notifying the user that the vehicle has approached the dangerous place. In this way, the vehicle terminal 1 outputs the first warning when the vehicle approaches a dangerous place on the map, and outputs the second warning when a dangerous event is detected from images around the vehicle.
FIG. 2 is a block diagram showing a configuration example of the vehicle terminal 1. The vehicle terminal 1 includes a control unit 11, a storage unit 12, a communication unit 13, an input unit 14, a display unit 15, a photographing unit 16, an auxiliary storage unit 17, a GPS (Global Positioning System) module 18, and a speaker 19. These components are connected by a bus B.
The control unit 11 includes an arithmetic processing unit such as a CPU (Central Processing Unit), an MPU (Micro-Processing Unit), or a GPU (Graphics Processing Unit), and performs various information processing and control processing related to the vehicle terminal 1 by reading and executing the control program 1P stored in the storage unit 12. Although the control unit 11 is described as a single processor in FIG. 2, it may be a multiprocessor.
The storage unit 12 includes memory elements such as RAM (Random Access Memory) and ROM (Read Only Memory), and stores the control program 1P or data required for the control unit 11 to execute processing. Further, the storage unit 12 temporarily stores data and the like necessary for the control unit 11 to execute arithmetic processing. The communication unit 13 is a communication module for performing processing related to communication, and transmits/receives information to and from the server 2 and the like via the network N.
The input unit 14 may be a keyboard, a mouse, or a touch panel integrated with the display unit 15. The display unit 15 is a liquid crystal display, an organic EL (electroluminescence) display, or the like, and displays various information according to instructions from the control unit 11.
The photographing unit 16 is a photographing device such as a CCD (Charge Coupled Device) camera or a CMOS (Complementary Metal Oxide Semiconductor) camera. The photographing unit 16 may be composed of a plurality of photographing devices. The photographing unit 16 need not be built into the vehicle terminal 1; it may instead be an external device connected directly to the vehicle terminal 1 so that it can capture images. The auxiliary storage unit 17 is a non-volatile storage area such as a large-capacity memory or a hard disk, and includes a recording medium such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive). The GPS module 18 is a module for acquiring the position information of the vehicle using GPS satellites. The speaker 19 is a device that converts electric signals into sound.
 図3は、サーバ2の構成例を示すブロック図である。サーバ2は、制御部21、記憶部22、通信部23、読取部24及び大容量記憶部25を含む。各構成はバスBで接続されている。 FIG. 3 is a block diagram showing a configuration example of the server 2. The server 2 includes a control unit 21, a storage unit 22, a communication unit 23, a reading unit 24, and a large-capacity storage unit 25. Each configuration is connected by bus B.
 制御部21はCPU、MPU、GPU等の演算処理装置を含み、記憶部22に記憶された制御プログラム2Pを読み出して実行することにより、サーバ2に係る種々の情報処理、制御処理等を行う。なお、図3では制御部21を単一のプロセッサであるものとして説明するが、マルチプロセッサであっても良い。 The control unit 21 includes arithmetic processing units such as a CPU, MPU, and GPU, and performs various information processing, control processing, and the like related to the server 2 by reading and executing the control program 2P stored in the storage unit 22. Although the control unit 21 is described as a single processor in FIG. 3, it may be a multiprocessor.
 記憶部22はRAM、ROM等のメモリ素子を含み、制御部21が処理を実行するために必要な制御プログラム2P又はデータ等を記憶している。また、記憶部22は、制御部21が演算処理を実行するために必要なデータ等を一時的に記憶する。通信部23は通信に関する処理を行うための通信モジュールであり、ネットワークNを介して、車両端末1等との間で情報の送受信を行う。 The storage unit 22 includes memory elements such as RAM and ROM, and stores the control program 2P or data required for the control unit 21 to execute the process. Further, the storage unit 22 temporarily stores data and the like necessary for the control unit 21 to execute the arithmetic processing. The communication unit 23 is a communication module for performing processing related to communication, and transmits / receives information to / from the vehicle terminal 1 or the like via the network N.
 読取部24は、CD(Compact Disc)-ROM又はDVD(Digital Versatile Disc)-ROMを含む可搬型記憶媒体2aを読み取る。制御部21が読取部24を介して、制御プログラム2Pを可搬型記憶媒体2aより読み取り、大容量記憶部25に記憶しても良い。また、ネットワークN等を介して他のコンピュータから制御部21が制御プログラム2Pをダウンロードし、大容量記憶部25に記憶しても良い。さらにまた、半導体メモリ2bから、制御部21が制御プログラム2Pを読み込んでも良い。 The reading unit 24 reads a portable storage medium 2a including a CD (Compact Disc) -ROM or a DVD (Digital Versatile Disc) -ROM. The control unit 21 may read the control program 2P from the portable storage medium 2a via the reading unit 24 and store it in the large-capacity storage unit 25. Further, the control unit 21 may download the control program 2P from another computer via the network N or the like and store it in the large-capacity storage unit 25. Furthermore, the control unit 21 may read the control program 2P from the semiconductor memory 2b.
 The large-capacity storage unit 25 includes a recording medium such as an HDD or an SSD, and contains a map DB (database) 251 and a detection information DB 252. The map DB 251 stores map information indicating dangerous places where a dangerous event is likely to occur. The map information may instead be stored in an external dedicated map database. The detection information DB 252 stores detection information on dangerous events detected for vehicles.
 In the present embodiment, the storage unit 22 and the large-capacity storage unit 25 may be configured as an integrated storage device. The large-capacity storage unit 25 may also be composed of a plurality of storage devices, or may be an external storage device connected to the server 2.
 Although the server 2 is described in the present embodiment as a single information processing device, the processing may be distributed over a plurality of devices, or the server 2 may be configured as a virtual machine.
 FIG. 4 is an explanatory diagram showing an example of the record layout of the detection information DB 252.
 The detection information DB 252 includes a detection ID column, a vehicle ID column, an image column, a detection time column, a position information column, and a dangerous event column. The detection ID column stores a detection ID that identifies each piece of detection information. The vehicle ID column stores a vehicle ID that identifies the vehicle. The image column stores the image captured when the dangerous event was detected. The detection time column stores the time at which the dangerous event was detected. The position information column stores the position information of the place where the dangerous event occurred (for example, a point name or latitude and longitude). The dangerous event column stores the content of the dangerous event (for example, a person running out into the road, a person being caught in the vehicle, or a red light being ignored).
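 As a concrete illustration of this record layout, a minimal schema sketch is given below. The class and field names are hypothetical and chosen only to mirror the columns described above; the patent does not specify any implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DetectionRecord:
    """One record in the detection information DB 252 (hypothetical field names)."""
    detection_id: str      # identifies this piece of detection information
    vehicle_id: str        # identifies the reporting vehicle
    image: bytes           # image captured when the dangerous event was detected
    detected_at: datetime  # detection time
    latitude: float        # position where the event occurred
    longitude: float
    point_name: str        # optional human-readable point name
    event_type: str        # e.g. "pedestrian_run_out", "red_light_ignored"
```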
 FIG. 5 is an explanatory diagram of detecting a dangerous event of the vehicle. Dangerous events of a vehicle include external dangerous events, such as a passerby running out into the road, a person being caught in the vehicle, or another vehicle approaching, and/or internal dangerous events (dangerous driving), such as the own vehicle ignoring a red light or a stop sign.
 The vehicle terminal 1 mounted on the vehicle acquires an image of the surroundings of the vehicle. For example, as shown in FIG. 5, the vehicle terminal 1 acquires an image whose imaging range is the area in front of the vehicle. The image may be captured by the imaging unit 16 of the vehicle terminal 1 or by an external imaging device. The image may also include images of the rear and sides of the vehicle, not only the front. Alternatively, the vehicle terminal 1 may acquire an omnidirectional image captured by a 360-degree camera or the like.
 The vehicle terminal 1 detects a dangerous event of the vehicle in the acquired image. For example, as illustrated, when a pedestrian 11a moves from the sidewalk toward the roadway, this event is detected as a dangerous event. The detection method is not particularly limited; for example, the vehicle terminal 1 may use pattern matching to detect specific objects (a passerby who has run out onto the travel path, a red light, and so on) from the shape, size, color, and other features of objects in the image. Alternatively, the vehicle terminal 1 may prepare a machine learning model (for example, a neural network) trained to detect dangerous events from a captured image, and use it to detect dangerous events.
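 A minimal sketch of the model-based detection path is given below. The detector API (`model.predict`), the class labels, and the confidence threshold are assumptions for illustration only; the patent merely requires that some detector flag specific objects in the image.

```python
import numpy as np

DANGEROUS_CLASSES = {"pedestrian_on_road", "red_light"}  # assumed label set
CONFIDENCE_THRESHOLD = 0.6                               # assumed default threshold

def detect_dangerous_event(image: np.ndarray, model,
                           threshold: float = CONFIDENCE_THRESHOLD) -> list:
    """Return detected dangerous events whose confidence meets the threshold.

    `model` is any object detector yielding (label, confidence, box) tuples;
    its exact interface is an assumption, not specified by the source.
    """
    events = []
    for label, confidence, box in model.predict(image):
        if label in DANGEROUS_CLASSES and confidence >= threshold:
            events.append({"type": label, "confidence": confidence, "box": box})
    return events
```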
 The vehicle terminal 1 may detect dangerous events from the entire image (the whole area), but in the present embodiment it detects dangerous events of the vehicle from a partial region of interest 11b (ROI: Region of Interest) in the captured image. For example, the region of interest 11b is a region of interest (image region) corresponding to the area of the travel path within a predetermined distance from the vehicle. In FIG. 5, the region of interest 11b is shown by hatching. The region of interest 11b shown in FIG. 5 is only an example; for instance, the region of interest 11b may include the sidewalk in addition to the travel path (road), or the entire travel path recognizable in the image may be used as the region of interest, and the position, shape, range, and so on of the region of interest 11b in the image are not particularly limited. The vehicle terminal 1 detects a dangerous event by detecting a specific object in the region of interest within the predetermined distance from the vehicle.
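 One way to restrict detection to such a region of interest is sketched below, assuming a rectangular ROI expressed in pixel coordinates; the rectangle and its bounds are illustrative assumptions, since the patent allows any ROI shape. Detection would then be run only on the cropped array, for example with a detector like the sketch above.

```python
import numpy as np

def crop_roi(image: np.ndarray, roi_top: int, roi_bottom: int,
             roi_left: int, roi_right: int) -> np.ndarray:
    """Cut out the region of interest 11b from the captured frame.

    A rectangular ROI is used here only for simplicity.
    """
    return image[roi_top:roi_bottom, roi_left:roi_right]
```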
 The vehicle terminal 1 outputs a second warning when it detects a dangerous event of the vehicle. The second warning alerts the driver when a dangerous event of the vehicle has been detected. For example, the vehicle terminal 1 may output, via the speaker 19, an audio signal for the second warning, such as "A pedestrian has run out ahead. Please be careful!", a predetermined number of times (for example, three times). The second warning may instead be output repeatedly without being limited to a predetermined number of times; in this case, the vehicle terminal 1 may erase the second warning after the vehicle has, for example, passed the vicinity of the place (point) where the dangerous event was detected. In addition, the warning content may be displayed on the screen in text form at the same time as the second warning is output (reproduced).
 The vehicle terminal 1 transmits detection information indicating that a dangerous event of the vehicle has been detected to the server (map information providing device) 2. The detection information includes the vehicle ID, the image at the time the dangerous event was detected, the detection time, the position information (point name or longitude and latitude, etc.), and the content of the dangerous event (for example, a red light being ignored or a person running out). The server 2 receives the detection information transmitted from the vehicle terminal 1 and stores it in the detection information DB 252. Specifically, the server 2 assigns a detection ID and stores the vehicle ID, image, detection time, position information, and content of the dangerous event as one record in the detection information DB 252.
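 A possible serialization of one piece of detection information is sketched below. The JSON field names and the base64 image encoding are assumptions; the patent only lists which items the detection information contains, not how it is transmitted.

```python
import base64
import json
from datetime import datetime, timezone

def build_detection_payload(vehicle_id: str, image_bytes: bytes,
                            lat: float, lon: float, event_type: str) -> str:
    """Serialize detection information for transmission to the server 2."""
    payload = {
        "vehicle_id": vehicle_id,
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "detected_at": datetime.now(timezone.utc).isoformat(),
        "position": {"lat": lat, "lon": lon},
        "event_type": event_type,
    }
    return json.dumps(payload)
```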
 Based on the detection information from each vehicle stored in the detection information DB 252, the server 2 generates map information indicating dangerous places on the map where a dangerous event is likely to occur. The map information is map data to which data indicating the points or areas on the map where dangerous events are likely to occur, and the times or time zones in which they are likely to occur, has been added. For example, the map information includes the longitude and latitude of each dangerous place, the occurrence rate of dangerous events at each dangerous place (for example, 25%), a danger level set according to the occurrence rate, the time zone in which dangerous events are likely to occur (for example, 7:00 to 8:00), and information indicating the content of the dangerous events detected there (for example, a red light being ignored or a person running out). Based on the detection information received from each vehicle terminal 1, the server 2 aggregates the number of occurrences, occurrence frequency, occurrence rate, and so on of dangerous events at each point or area on the map by time or time zone, and generates map information indicating dangerous places whose danger level, determined according to the occurrence rate or the like, is at or above a certain level.
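 The aggregation step can be sketched as follows, assuming the records are shaped like the hypothetical DetectionRecord above. The grid rounding, hourly bucketing, and `min_count` cut-off stand in for the unspecified statistical processing and danger-level threshold.

```python
from collections import defaultdict

def build_danger_spots(records, min_count: int = 5):
    """Aggregate detection records into dangerous places per (area, hour)."""
    def grid(lat: float, lon: float):
        # Round coordinates to roughly 100 m cells (illustrative choice).
        return (round(lat, 3), round(lon, 3))

    counts = defaultdict(int)
    for rec in records:
        key = (grid(rec.latitude, rec.longitude), rec.detected_at.hour)
        counts[key] += 1

    # Keep only places whose count is at or above a certain level.
    return [
        {"area": area, "hour": hour, "count": n}
        for (area, hour), n in counts.items()
        if n >= min_count
    ]
```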
 The server 2 statistically processes the detection information and updates the map information either in real time, each time detection information is received from a vehicle terminal 1, or at fixed intervals (for example, weekly or monthly). The server 2 transmits the map information updated according to the detection information to the vehicle terminal 1 mounted on each vehicle. Each vehicle terminal 1 receives the map information transmitted from the server 2 and displays it on the screen.
 FIG. 6 is an explanatory diagram showing an example of a screen for displaying map information. The screen includes a map display field 12a, a dangerous place display field 12b, and a dangerous event icon 12c. The map display field 12a is a display field for displaying a map. The dangerous place display field 12b is a display field for displaying marks that represent dangerous places (areas). The dangerous event icon 12c is an icon for indicating a dangerous event.
 The server 2 acquires the map (hazard map) information from the map DB 251 and transmits it to the vehicle terminal 1. The vehicle terminal 1 displays a hazard map on the screen based on the map information transmitted from the server 2. As illustrated, marks representing dangerous places where dangerous events of vehicles have tended to occur in the past, the occurrence rate of dangerous events at each dangerous place (for example, 25%), the time zone in which dangerous events are likely to occur (for example, 7:00 to 8:00), and icons set according to the content of the dangerous event (for example, a run-out icon or an entanglement icon) are superimposed on the hazard map.
 Specifically, the vehicle terminal 1 displays the map data included in the received map information in the map display field 12a, displays the marks representing dangerous places in the dangerous place display field 12b, and displays the icons indicating dangerous events as the dangerous event icons 12c.
 FIG. 7 is an explanatory diagram showing the relationship between the vehicle and the various regions. FIG. 8 is an explanatory diagram of the first warning based on a dangerous place. In the present embodiment, when the vehicle approaches a dangerous place, the vehicle terminal 1 outputs, separately from the second warning described above, a first warning to the effect that the vehicle is approaching the dangerous place. The vehicle terminal 1 performs the following processing so that the first warning is not output excessively.
 First, the vehicle terminal 1 acquires the current position information of the vehicle via the GPS module 18. Next, the vehicle terminal 1 determines whether the vehicle is approaching a dangerous place based on the acquired current position information and the position information of the dangerous place indicated by the map information. Specifically, the vehicle terminal 1 determines whether the dangerous place overlaps the first region shown in FIG. 7A.
 For example, as shown in FIG. 7A, the first region is a sector-shaped region set in front of the vehicle within a first distance, with the vehicle position at the center of concentric circles. The first region is not limited to a sector; it may be, for example, an annular sector region set between the first distance and a second distance (described later) with the vehicle at the center of the concentric circles, or a rectangular region. In this way, the first region may be any region set within the first distance from the vehicle, and its shape, position (distance), range, and so on are not particularly limited.
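 For the sector-shaped case, the overlap test between a region and a dangerous place can be sketched as below. The flat-earth distance approximation, the half-angle of the sector, and treating the dangerous place as a single point are all simplifying assumptions; the patent allows the dangerous place to be an area and the region to have other shapes.

```python
import math

def in_sector(vehicle_lat: float, vehicle_lon: float, heading_deg: float,
              point_lat: float, point_lon: float,
              max_dist_m: float, half_angle_deg: float = 45.0) -> bool:
    """True if a point lies within a forward sector of radius max_dist_m."""
    # Equirectangular approximation, adequate for a few hundred metres.
    dy = (point_lat - vehicle_lat) * 111_320.0
    dx = (point_lon - vehicle_lon) * 111_320.0 * math.cos(math.radians(vehicle_lat))
    dist = math.hypot(dx, dy)
    if dist > max_dist_m:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0          # clockwise from north
    diff = abs((bearing - heading_deg + 180.0) % 360.0 - 180.0)  # smallest angle difference
    return diff <= half_angle_deg
```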
 Thereafter, the vehicle terminal 1 determines whether the dangerous place overlaps a second region set within a second distance, which is shorter than the first distance. When the vehicle terminal 1 determines that the dangerous place overlaps the second region, it erases the output command for the first warning. The second region is, for example, a sector-shaped region set in front of the vehicle with the vehicle position at the center of the circle, as shown by the outlined area surrounded by the broken line in FIG. 7B, and constitutes a part of the first region. Like the first region, the second region may also be set as an annular sector, a rectangle, or the like, as long as it is a region within the second distance, which is shorter than the first distance.
 FIG. 8 illustrates how the vehicle gradually approaches a dangerous place. For example, as shown in FIG. 8, when the dangerous place is set as a certain area on the map (an elliptical area in FIG. 8) and the first region comes to overlap the area corresponding to the dangerous place (the second state from the top of FIG. 8), the vehicle terminal 1 generates an output command for the first warning and outputs the first warning. The first warning alerts the driver based on the distance from the vehicle to the dangerous place. For example, the vehicle terminal 1 may output, via the speaker 19, an audio signal for the first warning, such as "You are about to enter an accident-prone area. Please be careful!", a predetermined number of times (for example, three times). The number of times the first warning is output is not limited to a predetermined number. For example, the vehicle terminal 1 may start outputting the first warning when the first region overlaps the dangerous place and continue outputting it thereafter.
 Thereafter, the vehicle terminal 1 determines whether the dangerous place overlaps the second region. When it determines that the dangerous place overlaps the second region (the third state from the top of FIG. 8), the vehicle terminal 1 erases the output command for the first warning that is being output.
 In this way, the vehicle terminal 1 generates the output command for the first warning when a dangerous place overlaps the first region set within the first distance from the vehicle, and erases the output command for the first warning when the dangerous place overlaps the second region set within the second distance, which is shorter than the first distance. By giving an advance alert through the first warning when the vehicle approaches a dangerous place, and switching to outputting only the second warning when the vehicle comes even closer, excessive output of the first warning is suppressed while dangerous events can still be suitably avoided.
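 The decision logic just described can be summarized as a small control step, sketched below under the assumption that the warning state is kept in a plain dictionary; audio playback and the exception handling discussed next are omitted. On each position update, `danger_in_first` and `danger_in_second` could be computed with a test such as the `in_sector` sketch above, using the first and second distances respectively.

```python
def update_first_warning(state: dict, danger_in_first: bool, danger_in_second: bool) -> dict:
    """One step of the first-warning control loop.

    state["first_warning_active"] mirrors whether an output command exists.
    """
    if danger_in_second:
        state["first_warning_active"] = False   # erase the output command
    elif danger_in_first and not state["first_warning_active"]:
        state["first_warning_active"] = True    # generate the output command
    return state
```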
 The above description assumes that the vehicle travels straight toward the dangerous place, so that the dangerous place overlaps the first region and then the second region. However, it is also conceivable that, for example because the vehicle turns right or left, the dangerous place overlaps the first region but then leaves the first region without ever overlapping the second region. In this case, the vehicle terminal 1 may simply continue outputting the first warning, or may end the output of the first warning after a certain time has elapsed since the dangerous place left the first region. The exception handling for the case where the dangerous place does not overlap the second region is thus not particularly limited.
 Although the second region has been described above as a part of the first region, the first region and the second region may not overlap each other at all (for example, the first region may be an annular sector region between the first distance and the second distance, and the second region a sector region closer to the vehicle than the first region), or the first region and the second region may overlap only partially.
 Furthermore, when a dangerous place overlaps the second region, the vehicle terminal 1 preferably changes the processing content of the dangerous event detection so that dangerous events can be detected with higher sensitivity. Specifically, when the second region overlaps a dangerous place, the vehicle terminal 1 enlarges the region of interest 11b exemplified in FIG. 5. For example, the vehicle terminal 1 extends the distance from the vehicle to the far edge of the region of interest 11b (the upper side of the region of interest 11b in FIG. 5), enlarging the area of the travel path subject to detection. As a result, the detection sensitivity can be raised when a dangerous place overlaps the second region, that is, when the vehicle enters a place where a dangerous event is likely to occur, and dangerous events can be suitably avoided.
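 A minimal sketch of the ROI enlargement is shown below. Representing the ROI as the pixel bounds of a rectangle and using a 1.3x scale factor are assumptions; the patent only states that the far edge of the region of interest is pushed further from the vehicle.

```python
def expand_roi(roi: dict, scale: float = 1.3) -> dict:
    """Enlarge the region of interest when the vehicle enters a dangerous area.

    roi = {"top": ..., "bottom": ..., "left": ..., "right": ...} in pixels.
    """
    height = roi["bottom"] - roi["top"]
    expanded = dict(roi)
    # Moving the top edge up (toward row 0) extends the ROI further along the travel path.
    expanded["top"] = max(0, roi["bottom"] - int(height * scale))
    return expanded
```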
 Although enlarging the region of interest was described above as an example of the processing performed when a dangerous place overlaps the second region, the present embodiment is not limited to this. For example, the vehicle terminal 1 may change the parameter (threshold) used as the reference when detecting dangerous events from the image. For example, when dangerous events are detected using an object-detection machine learning model, the threshold compared against the output value of the model (for example, the probability that an object recognized in the image is a specific object such as a passerby or a red light) may be lowered. In this case as well, the detection sensitivity for dangerous events can be raised in the same manner as above.
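 The threshold change can be sketched as a simple selector, whose result would be passed to a detector such as the earlier sketch as its `threshold` argument. The concrete values 0.6 and 0.4 are assumptions; the source only says the threshold is lowered inside a dangerous area.

```python
def detection_threshold(in_danger_area: bool,
                        normal: float = 0.6, sensitive: float = 0.4) -> float:
    """Pick the confidence threshold for the dangerous event detector."""
    return sensitive if in_danger_area else normal
```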
 Also, although the processing of enlarging the region of interest or changing the threshold was described above as being performed when a dangerous place overlaps the second region, the present embodiment is not limited to this. For example, as shown in FIG. 7C, a third region within a third distance, shorter than the first distance, from the vehicle may be set, and the region of interest may be enlarged or the threshold changed in the same manner as described above when a dangerous place overlaps the third region. That is, the second region used for erasing the output command of the first warning and the third region used for enlarging the region of interest or changing the threshold may be set separately. In the following, however, unless otherwise stated, the second region and the third region are treated as the same region, as shown in FIG. 7B, for convenience.
 FIG. 9 is a flowchart showing the procedure of the dangerous event detection processing. In the flowchart of FIG. 9, for convenience of explanation, the vehicle terminal 1 that has detected a dangerous event is denoted by reference numeral 1a, and the other vehicle terminals 1 by reference numeral 1b.
 The control unit 11 of the vehicle terminal 1a acquires, via the communication unit 13, map information indicating dangerous places where a dangerous event is likely to occur from the map DB 251 of the server 2, and displays it on the display unit 15 (step S01). The control unit 11 acquires an image of the surroundings of the vehicle via the communication unit 13 or the imaging unit 16 (step S02). For example, the control unit 11 acquires an image whose imaging range is the area in front of the vehicle. The control unit 11 detects a dangerous event of the vehicle from the region of interest in the acquired image using pattern matching, a machine learning model, or the like (step S03). The control unit 11 determines whether a dangerous event of the vehicle has occurred based on the detection result (step S04).
 When the control unit 11 determines that no dangerous event of the vehicle has occurred (NO in step S04), it returns to the processing of step S02. When the control unit 11 determines that a dangerous event of the vehicle has occurred (YES in step S04), it outputs the second warning via the speaker 19 (step S05). The control unit 11 transmits the detection information, including the vehicle ID, the image at the time the dangerous event was detected, the detection time, the position information, and the content of the dangerous event, to the server 2 via the communication unit 13 (step S06).
 The control unit 21 of the server 2 receives the detection information transmitted from the vehicle terminal 1a via the communication unit 23 and stores it in the detection information DB 252 (step S07). The control unit 21 analyzes the detection information from each vehicle terminal 1a stored in the detection information DB 252 (step S08). For example, the control unit 21 calculates the number of occurrences, occurrence frequency, occurrence probability, and so on of dangerous events at each point or area on the map by time or time zone, based on the detection times, position information, and other data indicated by the detection information from each vehicle terminal 1a.
 Based on the analysis result of step S08, the control unit 21 generates map information indicating dangerous places on the map where a dangerous event is likely to occur, and updates the map DB 251 based on the generated map information (step S09). For example, the control unit 21 compares the number of occurrences of dangerous events at each time or time zone with a predetermined threshold to determine the danger level of each point or area on the map, and generates map information in which points or areas whose danger level is at or above a certain level are treated as dangerous places. The control unit 21 transmits the updated map information to each vehicle terminal 1 via the communication unit 23 (step S10).
 The processing of steps S07 to S09 may be executed each time detection information is received (step S06), or may be executed at fixed intervals.
 The control unit 11 of the vehicle terminal 1a receives the updated map information transmitted from the server 2 via the communication unit 13 (step S11), displays the received map information on the display unit 15 (step S12), and ends the processing. The control unit 11 of the vehicle terminal 1b likewise receives the updated map information transmitted from the server 2 via the communication unit 13 (step S13), displays it on the display unit 15 (step S14), and ends the processing.
 FIG. 10 is a flowchart showing the procedure of the warning processing for dangerous places. For example, the vehicle terminal 1 executes the following processing in parallel with the dangerous event detection processing described with reference to FIG. 9.
 The control unit 11 of the vehicle terminal 1 acquires, via the communication unit 13, map information indicating dangerous places where a dangerous event is likely to occur from the map DB 251 of the server 2 (step S101). The control unit 11 acquires the current position information via the GPS module 18 (step S102).
 Based on the current position information and the position information of the dangerous places indicated by the map information, the control unit 11 determines whether a dangerous place overlaps the first region set within the first distance from the vehicle (step S103). When the control unit 11 determines that no dangerous place overlaps the first region (NO in step S103), it returns to the processing of step S102. When the control unit 11 determines that a dangerous place overlaps the first region (YES in step S103), it generates the output command for the first warning and outputs the first warning via the speaker 19 (step S104).
 Based on the current position information and the position information of the dangerous places indicated by the map information, the control unit 11 determines whether the dangerous place overlaps the second region set within the second distance, which is shorter than the first distance (step S105). When the control unit 11 determines that the dangerous place does not overlap the second region (NO in step S105), it returns to the processing of step S105. When the control unit 11 determines that the dangerous place overlaps the second region (YES in step S105), it erases the output command for the first warning that is being output (step S106). The control unit 11 then enlarges the region of interest of the captured image subject to dangerous event detection (step S107).
 Based on the current position information and the position information of the dangerous places indicated by the map information, the control unit 11 determines whether the dangerous place has left the second region (step S108). When the control unit 11 determines that the dangerous place has left the second region (YES in step S108), it returns the enlarged region of interest of the captured image to the region of interest before enlargement (step S109) and returns to the processing of step S101. When the control unit 11 determines that the dangerous place has not left the second region (NO in step S108), it returns to the processing of step S108.
 According to the present embodiment, accidents can be prevented in advance by outputting danger warnings based on dangerous events of the vehicle and on dangerous places where dangerous events are likely to occur.
 According to the present embodiment, when a dangerous place overlaps the second region, the region of interest of the captured image subject to dangerous event detection can be enlarged.
 According to the present embodiment, when a dangerous place overlaps the second region, the detection sensitivity for dangerous events of the vehicle can be raised by changing the threshold used when detecting dangerous events.
 According to the present embodiment, by displaying both dangerous places and dangerous events on the map, information related to the occurrence of accidents can be confirmed comprehensively.
 In the present embodiment, map information generated based on detection information from vehicles has been described as an example for convenience, but the map information according to the present embodiment is not limited to this. For example, the map information may also reflect traffic information or the like based on disaster information delivered in real time.
 (Embodiment 2)
 Embodiment 2 relates to a mode in which the first region is enlarged according to the traveling speed of the vehicle. Description of content overlapping with Embodiment 1 is omitted.
 The traveling speed of the vehicle affects the time it takes for the vehicle to reach a dangerous place. The faster the traveling speed, the shorter the time until the vehicle reaches the dangerous place, so the timing for outputting a danger warning to the user may be missed. In view of this, when the traveling speed is high, the danger warning can be output to the user earlier by enlarging the first region.
 FIG. 11 is a flowchart showing the processing procedure for enlarging the first region according to the traveling speed of the vehicle. Content overlapping with FIG. 10 is given the same reference signs and its description is omitted. After executing the processing of step S102, the control unit 11 of the vehicle terminal 1 acquires the traveling speed of the vehicle, for example from a speed measurement sensor mounted on the vehicle (step S111). The control unit 11 determines whether the acquired traveling speed is equal to or higher than a predetermined reference speed (step S112).
 When the control unit 11 determines that the acquired traveling speed is equal to or higher than the reference speed (YES in step S112), it enlarges the first region (step S113) and executes the processing of step S103. For example, the control unit 11 enlarges the radius of the larger circle of the sector-shaped first region according to the magnitude of the traveling speed; for instance, when the acquired traveling speed is 1.2 times the reference speed, the radius of the concentric circles defining the first region is enlarged by a factor of 1.2. When the control unit 11 determines that the acquired traveling speed is lower than the reference speed (NO in step S112), it proceeds to the processing of step S103.
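 A minimal sketch of this speed-dependent enlargement is given below. Linear scaling above the reference speed is an assumption consistent with the 1.2x example; the patent does not fix the exact scaling rule.

```python
def first_distance_for_speed(base_first_distance_m: float,
                             speed_kmh: float, reference_speed_kmh: float) -> float:
    """Scale the first distance (sector radius) with the vehicle speed."""
    if speed_kmh < reference_speed_kmh:
        return base_first_distance_m
    return base_first_distance_m * (speed_kmh / reference_speed_kmh)
```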
 According to the present embodiment, by enlarging the first region according to the traveling speed of the vehicle, a danger warning can be output to the user at an appropriate timing.
 (Embodiment 3)
 Embodiment 3 relates to a mode in which the distance from the vehicle to the third region is changed according to the attributes of the user (driver) who drives the vehicle. In Embodiment 3, the second region and the third region described above are distinguished from each other. Description of content overlapping with Embodiments 1 and 2 is omitted.
 The user's attributes include the user's age and the like. For example, elderly drivers may have a higher rate of traffic accidents due to problems such as driving below the legal speed, being slow to apply the brakes, or reacting slowly. In view of such circumstances, it is necessary to change the third distance from the vehicle according to the user's age. For an elderly driver, the third distance from the vehicle is lengthened according to a predetermined rule; for example, for an elderly driver in their 70s, the third distance from the vehicle may be multiplied by 1.5. The user's attributes may also include driving skill, years of driving experience, and the like.
 FIG. 12 is a block diagram showing a configuration example of the server 2 of Embodiment 3. Content overlapping with FIG. 3 is given the same reference signs and its description is omitted. The large-capacity storage unit 25 stores a user DB 253. The user DB 253 stores user information including the users' attributes.
 FIG. 13 is an explanatory diagram showing an example of the record layout of the user DB 253. The user DB 253 includes a user ID column, a gender column, and an age column. The user ID column stores a uniquely assigned user ID for identifying each user. The gender column stores the user's gender. The age column stores the user's age.
 FIG. 14 is a flowchart showing the processing procedure for changing the third distance according to the user's age. The control unit 11 of the vehicle terminal 1 acquires the user's age from the user DB 253 of the server 2 based on the user ID (step S121). The control unit 11 determines whether the user is an elderly driver based on the acquired age (step S122). For example, the control unit 11 may determine that the user is an elderly driver when the user's age is 70 or older.
 When the control unit 11 determines that the user is an elderly driver (YES in step S122), it sets the third distance from the vehicle longer than in the case of a non-elderly driver (step S123), for example by multiplying the third distance from the vehicle by 1.2, and ends the processing. When the control unit 11 determines that the user is not an elderly driver (NO in step S122), it ends the processing.
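 This attribute-dependent adjustment can be sketched as below. The age boundary of 70 and the 1.2x factor follow the examples in the text; treating them as fixed constants is an assumption, since the patent leaves the predetermined rule open (e.g. a 1.5x factor for drivers in their 70s).

```python
def third_distance_for_user(base_third_distance_m: float, age: int,
                            elderly_age: int = 70, factor: float = 1.2) -> float:
    """Lengthen the third distance for elderly drivers."""
    return base_third_distance_m * factor if age >= elderly_age else base_third_distance_m
```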
 Although an example in which the third distance is changed based on the user's age has been described above, the present embodiment is not limited to this. The third distance may also be changed in consideration of, for example, the user's driving skill or years of driving experience in addition to the user's age.
 According to the present embodiment, by changing the distance from the vehicle to the third region according to the user's attributes, a danger warning can be output to the user in advance at an appropriate timing.
 Although the second region and the third region are distinguished in the present embodiment, the second region and the third region may be the same region.
 In the present embodiment, not only the third distance but also the first distance may be changed. For example, by lengthening the first distance for an elderly driver, the elderly driver can become aware of a detected dangerous place earlier. This makes it possible, when the hazard map is updated in real time, to avoid a situation in which an elderly driver has to respond to a dangerous event in a hurry.
 The embodiments disclosed here should be considered exemplary in all respects and not restrictive. The scope of the present invention is indicated not by the above description but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 1    Information processing terminal (vehicle terminal)
 11   Control unit
 12   Storage unit
 13   Communication unit
 14   Input unit
 15   Display unit
 16   Imaging unit
 17   Auxiliary storage unit
 18   GPS module
 19   Speaker
 1P   Control program
 2    Information processing device (server / map information providing device)
 21   Control unit
 22   Storage unit
 23   Communication unit
 24   Reading unit
 25   Large-capacity storage unit
 251  Map DB
 252  Detection information DB
 253  User DB
 2a   Portable storage medium
 2b   Semiconductor memory
 2P   Control program

Claims (11)

  1.  A program causing a computer to execute processing to:
     determine, based on map information indicating a dangerous place where a dangerous event is likely to occur and on position information of a vehicle, whether the dangerous place overlaps a first region set within a first distance from the vehicle and whether it overlaps a second region set within a second distance from the vehicle that is shorter than the first distance;
     generate an output command for a first warning when it is determined that the dangerous place overlaps the first region; and
     erase the output command for the first warning when it is determined that the dangerous place overlaps the second region.
  2.  The program according to claim 1, causing the computer to further execute processing to:
     acquire an image of the surroundings of the vehicle;
     detect a dangerous event of the vehicle from the image; and
     output a second warning when the dangerous event is detected.
  3.  The program according to claim 2, causing the computer to further execute processing to:
     detect the dangerous event from a certain region of interest in the image; and
     enlarge the region of interest when it is determined that the dangerous place overlaps a third region set within a third distance from the vehicle that is shorter than the first distance.
  4.  The program according to claim 3, causing the computer to further execute processing to change a threshold used as a reference when detecting the dangerous event from the image, when it is determined that the dangerous place overlaps the third region.
  5.  The program according to claim 3 or 4, causing the computer to further execute processing to change the third distance from the vehicle according to an attribute of a user who drives the vehicle.
  6.  The program according to any one of claims 1 to 5, causing the computer to further execute processing to enlarge the first region according to a traveling speed of the vehicle.
  7.  The program according to any one of claims 1 to 6, causing the computer to further execute processing to:
     transmit, when a dangerous event of the vehicle is detected, detection information including a detection time and position information to a map information providing device that provides the map information; and
     acquire, from the map information providing device, the map information updated according to the detection information.
  8.  An information processing method causing a computer to execute processing to:
     determine, based on map information indicating a dangerous place where a dangerous event is likely to occur and on position information of a vehicle, whether the dangerous place overlaps a first region set within a first distance from the vehicle and whether it overlaps a second region set within a second distance from the vehicle that is shorter than the first distance;
     generate an output command for a first warning when it is determined that the dangerous place overlaps the first region; and
     erase the output command for the first warning when it is determined that the dangerous place overlaps the second region.
  9.  An information processing terminal comprising:
     a determination unit that determines, based on map information indicating a dangerous place where a dangerous event is likely to occur and on position information of a vehicle, whether the dangerous place overlaps a first region set within a first distance from the vehicle and whether it overlaps a second region set within a second distance from the vehicle that is shorter than the first distance;
     a generation unit that generates an output command for a first warning when it is determined that the dangerous place overlaps the first region; and
     a control unit that erases the output command for the first warning when it is determined that the dangerous place overlaps the second region.
  10.  An information processing terminal comprising:
     a first acquisition unit that acquires map information indicating a dangerous place where a dangerous event is likely to occur;
     a second acquisition unit that acquires position information of a vehicle;
     a determination unit that determines, based on the map information and the position information of the vehicle, whether the dangerous place overlaps a first region set within a first distance from the vehicle;
     a third acquisition unit that acquires an image of the surroundings of the vehicle;
     a detection unit that detects a dangerous event of the vehicle from the acquired image; and
     an output unit that outputs a first warning when it is determined that the dangerous place overlaps the first region, and a second warning when a dangerous event of the vehicle is detected.
  11.  A map information providing device capable of communicating with the information processing terminal according to claim 9 or 10, comprising:
     a providing unit that provides the information processing terminal with map information indicating a dangerous place where a dangerous event is likely to occur;
     a receiving unit that receives, from the information processing terminal, detection information including a detection time and position information at which the vehicle detected a dangerous event; and
     an updating unit that updates the map information according to the detection information.
PCT/JP2021/035660 2020-09-29 2021-09-28 Program, information processing method, information processing terminal, and map information provision device WO2022071323A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022554016A JPWO2022071323A1 (en) 2020-09-29 2021-09-28

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020163556 2020-09-29
JP2020-163556 2020-09-29

Publications (1)

Publication Number Publication Date
WO2022071323A1 true WO2022071323A1 (en) 2022-04-07

Family

ID=80950411

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/035660 WO2022071323A1 (en) 2020-09-29 2021-09-28 Program, information processing method, information processing terminal, and map information provision device

Country Status (2)

Country Link
JP (1) JPWO2022071323A1 (en)
WO (1) WO2022071323A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05310058A (en) * 1992-05-08 1993-11-22 Omron Corp Collision preventing device
JP2001183148A (en) * 1999-12-27 2001-07-06 Kankyo Service:Kk Antenna combined position data detector utilizing gps
JP2006092258A (en) * 2004-09-24 2006-04-06 Denso Corp Running-out alert control device and running-out alert control program
JP2008286558A (en) * 2007-05-15 2008-11-27 Aisin Aw Co Ltd Information preparation device and method, and program
JP2012118011A (en) * 2010-12-03 2012-06-21 Fujitsu Ten Ltd Information processor, on-vehicle device, and information processing method
JP2014044730A (en) * 2013-09-24 2014-03-13 Clarion Co Ltd Image processing apparatus
JP2015108926A (en) * 2013-12-04 2015-06-11 三菱電機株式会社 Vehicle driving support device
WO2018155159A1 (en) * 2017-02-24 2018-08-30 パナソニックIpマネジメント株式会社 Remote video output system and remote video output device

Also Published As

Publication number Publication date
JPWO2022071323A1 (en) 2022-04-07

Similar Documents

Publication Publication Date Title
US10810872B2 (en) Use sub-system of autonomous driving vehicles (ADV) for police car patrol
US11328219B2 (en) System and method for training a machine learning model deployed on a simulation platform
US10816984B2 (en) Automatic data labelling for autonomous driving vehicles
JP6109593B2 (en) Risk information processing method, apparatus and system, and program
JP4513740B2 (en) Route guidance system and route guidance method
JP4986135B2 (en) Database creation device and database creation program
JP6933457B2 (en) Recognition result presentation device, recognition result presentation method and autonomous mobile
JP2014154005A (en) Danger information provision method, device, and program
US20070124072A1 (en) Route guidance systems, methods, and programs
JP6335814B2 (en) Suspicious vehicle recognition device and suspicious vehicle recognition method
JP7155750B2 (en) Information systems and programs
JP2011145756A (en) Traveling support system and method
JP2010128637A (en) Device for facilitating braking preparation
JP6005475B2 (en) In-vehicle device, danger prediction method, and program
US20220017094A1 (en) Lane change planning method and vehicle-mounted device
WO2022071323A1 (en) Program, information processing method, information processing terminal, and map information provision device
JP2008090683A (en) Onboard navigation device
JP4521036B2 (en) Route search device, route search method, route search program, and computer-readable recording medium
JP4866061B2 (en) Information recording apparatus, information recording method, information recording program, and computer-readable recording medium
JP7107061B2 (en) Driving support method and driving support device
JP2017130104A (en) Composure degree determination device, composure degree determination method and drive support system
JP4930385B2 (en) Navigation device and program for navigation device
JP2006177753A (en) Route guidance system and route guidance method
JP2007271345A (en) Vehicle guide system and vehicle guide method
JP4062180B2 (en) Virtual movement guidance device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21875624

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022554016

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21875624

Country of ref document: EP

Kind code of ref document: A1