WO2021161742A1 - Traffic risk reduction system, information processing device, and sensor - Google Patents


Info

Publication number
WO2021161742A1
WO2021161742A1 (PCT/JP2021/001721)
Authority
WO
WIPO (PCT)
Prior art keywords
information
sensor
surveillance camera
information processing
notified
Prior art date
Application number
PCT/JP2021/001721
Other languages
French (fr)
Japanese (ja)
Inventor
Yuta Yamauchi
Kohei Segawa
Takayuki Hori
Kenichi Ebisawa
Hiroki Nakatani
Yoshio Kamakura
Original Assignee
SoftBank Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SoftBank Corp.
Publication of WO2021161742A1 publication Critical patent/WO2021161742A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00 Economic sectors
    • G16Y 10/40 Transportation

Definitions

  • the present invention relates to a traffic risk reduction system, an information processing device and a sensor.
  • the present application has been made in view of the above, and its purpose is to provide a traffic risk reduction system, an information processing device, and a sensor capable of notifying a moving body of notification content appropriate to the surrounding conditions of the moving body.
  • the information processing device is characterized by having a determination unit that determines, based on the installation position of a sensor capable of transmitting information to a moving body, the information that the sensor is to notify to a moving body in the vicinity of the sensor, and a providing unit that provides the information determined by the determination unit to the sensor.
  • FIG. 1 is a diagram showing an example of processing executed by the information processing apparatus 100 according to the embodiment.
  • FIG. 2 is a diagram showing a configuration example of the information processing device according to the embodiment.
  • FIG. 3 is a diagram showing an example of the position information storage unit according to the embodiment.
  • FIG. 4 is a diagram showing an example of the surveillance camera image storage unit according to the embodiment.
  • FIG. 5 is a diagram showing an example of the installation position storage unit according to the embodiment.
  • FIG. 6 is a diagram showing a configuration example of the surveillance camera according to the embodiment.
  • FIG. 7 is a diagram showing an example of a destination storage unit according to the embodiment.
  • FIG. 8 is a flowchart showing an example of a flow of specific processing executed by the information processing apparatus according to the embodiment.
  • FIG. 9 is a hardware configuration diagram showing an example of a computer that realizes the functions of the information processing device.
  • the driving technology of self-driving cars is not always sufficient.
  • an autonomous vehicle has an imaging device such as a camera, and by photographing the surroundings of the autonomous vehicle, its traveling is controlled based on the surrounding environment and conditions.
  • the self-driving car may not be able to recognize the environment and conditions to its sides when traveling at or above a predetermined speed. For example, the image from a camera that captures the side may blur with motion, so other vehicles or objects may not be recognized.
  • in the present embodiment, a technology for supporting the traveling of an autonomous vehicle is provided. Specifically, the information from the sides of the autonomous vehicle is complemented by utilizing information acquired from an external device such as a sensor. The present embodiment thereby makes it possible to control the autonomous vehicle with high accuracy.
  • FIG. 1 is a diagram showing an example of processing executed by the information processing apparatus 100 according to the embodiment.
  • the traffic risk reduction system 1 includes automobiles CA1, 11, and 12 (hereinafter collectively referred to as the automobile 10), a plurality of surveillance cameras 21 to 24 (hereinafter collectively referred to as the surveillance camera 20), and the information processing device 100.
  • the automobile 10, the surveillance camera 20, and the information processing device 100 are connected via a wireless network (not shown).
  • the traffic risk reduction system 1 shown in FIG. 1 may include an arbitrary number of automobiles 10 and an arbitrary number of surveillance cameras 20. Further, the processes described below may be realized by a plurality of information processing devices 100 executing each process in parallel or in cooperation.
  • the automobile 10 is a connected vehicle that communicates with various other devices via a wireless network. Further, the automobile 10 executes communication by an in-vehicle device such as a communication module incorporated in the automobile 10.
  • the in-vehicle device incorporated in each automobile 10 has a function of detecting, at predetermined intervals (for example, once or more per second), the current position of the in-vehicle device, that is, the current position of the automobile, using a GPS (Global Positioning System) sensor or the like; more specific examples include a car navigation system and a drive recorder.
  • the in-vehicle device generates position information indicating the detected current position and transmits that position information, together with an automobile ID (Identifier) for identifying the automobile 10 on which the in-vehicle device is mounted, via a wireless network or the like to the position information providing device PIS (see, for example, FIG. 2) that manages the position information.
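As an illustrative sketch, the report the in-vehicle device might send to the position information providing device PIS could combine the automobile ID, the GPS fix, and a timestamp. The field names and JSON format here are assumptions for illustration, not taken from the specification:

```python
import json
import time

def build_position_report(car_id, latitude, longitude):
    """Combine the automobile ID with the GPS fix and a timestamp,
    as the in-vehicle device might do before sending to the position
    information providing device PIS. Field names are illustrative."""
    report = {
        "moving_body_id": car_id,  # automobile ID, e.g. "MO1"
        "position": {"lat": latitude, "lon": longitude},
        "time": time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime()),
    }
    return json.dumps(report)

# Hypothetical report for the vehicle "MO1"
payload = build_position_report("MO1", 35.6812, 139.7671)
```

A real in-vehicle device would of course use whatever wire format the PIS expects; this only shows the three pieces of information the text says travel together.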
  • the surveillance camera 20 is a surveillance camera having a function of being able to communicate with various other devices via a wireless network.
  • the surveillance camera 20 is fixedly installed on a traffic signal or a pillar to which a traffic signal is attached.
  • the surveillance camera 20 acquires, constantly or at a predetermined time interval (for example, 30 times per second), an image including the landscape or the like taken from a predetermined angle. Then, the surveillance camera 20 transmits the captured image to the surveillance camera image providing device FIS via a wireless network or the like.
  • the surveillance camera 20 not only functions as a camera, but is a smart surveillance camera that also has functions for performing various information processing, various communications via a network, and various communications using proximity communication technologies such as wireless LAN and Bluetooth (registered trademark).
  • the surveillance camera 20 may be composed of, for example, a surveillance camera that captures still images or moving images and a surveillance camera server for performing information processing and various communications via a network.
  • Such a surveillance camera 20 has a function of directly or indirectly distributing various types of information using proximity communication technology.
  • the surveillance camera 20 has a function of notifying various information to the automobile 10 located in the vicinity.
  • the surveillance camera image providing device FIS is realized by, for example, a server device or a cloud system. Specifically, the surveillance camera image providing device FIS constantly acquires an image taken by the surveillance camera 20 (hereinafter, referred to as a surveillance camera image). Then, the surveillance camera image providing device FIS receives an image acquisition request from the information processing device 100 and provides the surveillance camera image to the information processing device 100.
  • the information processing device 100 is realized by, for example, a server device or a cloud system.
  • an example of the specific processing by the information processing apparatus 100 will be described along the flow with reference to FIG. 1.
  • a dangerously driven vehicle CA1 that is not a connected vehicle (hereinafter referred to as the non-connected vehicle CA1) and vehicles 11 and 12 are traveling around an intersection. Then, it is assumed from the surveillance camera images taken by the surveillance cameras 20 that the non-connected vehicle CA1 and the automobile 12 are likely to have a traffic accident.
  • for example, the automobile CA1 weaves at a speed exceeding a predetermined speed and, as a result of entering the intersection SP1 by ignoring a traffic signal or the like, the automobile 12 and the automobile CA1 may collide with each other.
  • the surveillance camera 20 alerts the automobiles 11 and 12 regarding a traffic accident.
  • if the same notification, such as "Please stop", is given to the automobiles 11 and 12 traveling near the intersection, there is a risk that not only the automobile 12 but also the automobile 11 traveling behind the automobile 12 will stop. In such a case, another automobile traveling further behind the automobile 11 may collide with the automobile 11. Therefore, the information processing device 100 determines, according to the installation position of each surveillance camera 20, the information that each surveillance camera 20 is to notify to the automobile 10, and provides the determined information to each surveillance camera 20.
  • each surveillance camera 20 transmits the captured surveillance camera image to the surveillance camera image providing device FIS (step S1). Then, the information processing device 100 acquires the surveillance camera images taken by each surveillance camera 20 (step S2). The information processing device 100 then analyzes the surveillance camera images and estimates an area where a traffic accident may occur as a dangerous area (step S3).
  • for example, when the surveillance camera image captured by the surveillance camera 24 shows the vehicle CA1 traveling at a speed significantly exceeding the legal limit and the surveillance camera image captured by the surveillance camera 23 captures the vehicle, the information processing device 100 sets the intersection SP1 as the danger zone SP1.
  • various estimation techniques for estimating the possibility of a future traffic accident from various images can be adopted.
  • next, the information processing device 100 determines the information to be notified by each surveillance camera 20 based on the distance from the danger zone SP1 to each surveillance camera 20 (step S4). More specifically, as the information to be notified for each surveillance camera 20, the information processing device 100 generates information that is one of, or a combination of, information indicating the danger zone SP1, the time until the accident occurs, and information indicating an action for avoiding the accident.
  • for example, as information to be notified to the surveillance cameras 22 and 23, the information processing device 100 generates information indicating the danger zone SP1 and the time until the accident occurs, such as "There is a risk of a traffic accident occurring at the intersection SP1 after X seconds." Further, as information to be notified to the surveillance camera 23 closest to the intersection SP1, the information processing device 100 generates information indicating an action for avoiding the accident, such as "Let's stop." Further, as information to be notified to the surveillance camera 22, which is some distance from the intersection SP1, the information processing device 100 generates information different from that for the surveillance camera 23, such as "Let's decelerate."
  • the information processing device 100 does not have to generate notification information for the surveillance camera 21, whose distance from the intersection SP1 is longer than that of the surveillance camera 22. Further, for example, the information processing apparatus 100 may generate, for the surveillance camera 22, information that does not include the time until the accident occurs, such as "There is a risk of an accident occurring at the intersection SP1." That is, according to the distance from the intersection SP1, which is the dangerous area, to each surveillance camera 20, the information processing device 100 generates, as the information to be notified, a message in which the information indicating the dangerous area SP1, the time until the accident occurs, and the information indicating an action for avoiding the accident is changed as appropriate.
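The distance-dependent message selection described above can be sketched as follows. The distance thresholds, and the choice to drop the message entirely beyond a cutoff, are illustrative assumptions rather than values from the embodiment:

```python
def notification_for_camera(distance_m, seconds_to_accident):
    """Choose the message a surveillance camera should notify, based on
    its distance from the danger zone SP1. Cameras beyond the range
    (like surveillance camera 21) get no message; the closest camera
    (like 23) asks vehicles to stop; a farther one (like 22) asks them
    to decelerate. Thresholds are illustrative assumptions."""
    if distance_m > 200:  # e.g. surveillance camera 21: out of range
        return None
    warning = ("There is a risk of an accident occurring at the "
               "intersection SP1 after %d seconds." % seconds_to_accident)
    if distance_m <= 50:  # e.g. surveillance camera 23: closest
        return warning + " Let's stop."
    return warning + " Let's decelerate."  # e.g. surveillance camera 22

msg_near = notification_for_camera(30, 5)   # hypothetical camera 23
msg_far = notification_for_camera(120, 5)   # hypothetical camera 22
```

The same structure also covers the variant in the text where the time component is omitted for more distant cameras; one would simply build the `warning` string conditionally.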
  • the information processing device 100 provides each surveillance camera 20 with information of the content determined for each surveillance camera 20 (step S5).
  • each surveillance camera 20 notifies the automobile 10 traveling in the vicinity of the information received from the information processing device 100.
  • the surveillance camera 23 notifies the nearby automobile 12 of information such as "There is a risk of an accident occurring at the intersection after X seconds. Let's stop.", and the surveillance camera 22 notifies the nearby automobile 11 of information such as "There is a risk of an accident occurring at the intersection after X seconds. Let's slow down." That is, each surveillance camera 20 notifies the nearby automobile 10 of different information according to the distance from the dangerous area.
  • the surveillance camera 20 uses various proximity communication technologies to notify surrounding automobiles of information, but the embodiment is not limited to this.
  • the surveillance camera 20 may identify the automobile 10 to which the information is to be notified by using various identification techniques described later, and notify the identified automobile 10 of the information.
  • the automobile 10 may accept an operation by the operator based on the notified information, or may perform driving control such as automatically decelerating or stopping. Further, for example, the automobile 10 may travel by controlling the driving operation so as to move to a position away from the danger zone SP1.
  • from the surveillance camera images taken by the surveillance cameras 20, the information processing device 100 identifies the automobile 10 that is highly likely to be involved in a traffic accident with the dangerously driven non-connected vehicle CA1.
  • however, the destination of the automobile 10 cannot be specified from the surveillance camera images taken by the surveillance cameras 20 alone.
  • the information processing device 100 estimates a dangerous area where a traffic accident is likely to occur from the images taken by the surveillance cameras 20, and determines the information to be notified to each surveillance camera 20 according to the distance from the estimated dangerous area. Then, the information processing device 100 provides the information determined for each surveillance camera 20 to each surveillance camera 20.
  • as a result, the information processing apparatus 100 can change the content of the warning regarding the dangerous vehicle notified to the automobile 10 according to the distance between the danger region SP1 and the surveillance camera 20.
  • the automobile 10 can perform various driving controls based on the information notified by the information processing device 100.
  • for example, based on the information indicating the alert regarding the non-connected vehicle CA1 notified via the information processing device 100, the automobile 10 can perform control such as decelerating below a predetermined speed or gradually decelerating to a stop.
  • since the automobile 10 performs various driving controls according to various traffic conditions, dangerous situations, and the like, safer driving can be realized. Therefore, the automobile 10 can realize appropriate traveling based on the information notified via the information processing device 100 even when the environment and conditions to its sides cannot be recognized.
  • FIG. 2 is a diagram showing a configuration example of the information processing device 100 according to the embodiment.
  • the information processing device 100 includes a communication unit 110, a storage unit 120, and a control unit 130.
  • the communication unit 110 is realized by, for example, a NIC (Network Interface Card) or the like. The communication unit 110 transmits and receives information to and from the surveillance camera 20, the surveillance camera image providing device FIS, and the position information providing device PIS via the wireless network.
  • the position information providing device PIS has a position information storage unit PIM showing a history of position information indicating a position measured by an in-vehicle device of each automobile 10.
  • the surveillance camera image providing device FIS has a surveillance camera image storage unit FIM showing a history of images taken by each surveillance camera 20.
  • the communication unit 110 may acquire the surveillance camera image registered in the surveillance camera image storage unit FIM, or may acquire the position information registered in the position information storage unit PIM.
  • the information processing device 100 may acquire the surveillance camera image storage unit FIM and the position information storage unit PIM from the surveillance camera image providing device FIS and the position information providing device PIS and store them in the storage unit 120.
  • FIG. 3 is a diagram showing an example of a position information storage unit according to an embodiment.
  • the position information storage unit PIM has items such as "moving body ID”, "position information", and "time information".
  • the “mobile ID” is an identifier that identifies the automobile 10.
  • the "position information” is the position information of the automobile 10 associated with the “moving body ID”.
  • the "time information” is information regarding the time of the position information acquired by the automobile ID associated with the "moving body ID”.
  • for example, the position information storage unit PIM indicates that position information has been acquired from the vehicle indicated by the moving body ID "MO1" to the effect that the vehicle was located at the position indicated by the position information "LO1" at the time indicated by the time information "DA1".
  • the position information is represented by an abstract code such as "LO1”, but in reality, information indicating the latitude and longitude measured by GPS or the like is registered.
  • the time information is represented by an abstract code such as "DA1", but in reality, a numerical value or the like indicating the date and time is registered.
  • FIG. 4 is a diagram showing an example of the surveillance camera image storage unit according to the embodiment.
  • the surveillance camera image storage unit FIM has items such as “camera ID”, “image”, and “time information”.
  • the "camera ID” is an identifier that identifies the surveillance camera 20.
  • the "image” is an image associated with the “camera ID”.
  • the "time information” is information regarding the time of the image taken by the surveillance camera 20 associated with the "camera ID”.
  • for example, it is registered in the surveillance camera image storage unit FIM that the image indicated by the image "FIM1" was taken by the surveillance camera 20 indicated by the camera ID "SC1" at the time indicated by the time information "DA1".
  • the surveillance camera image and the time information are represented by abstract codes such as "FIM1" and "DA1", but in reality, a file in a specific image format, a numerical value indicating the date and time, or the like is registered.
  • the storage unit 120 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory (Flash Memory), or a storage device such as a hard disk or an optical disk.
  • the storage unit 120 has an installation position storage unit 121.
  • the installation position storage unit 121 stores information indicating the position where each surveillance camera 20 is installed.
  • FIG. 5 is a diagram showing an example of information registered in the installation position storage unit 121 according to the embodiment.
  • the installation position storage unit 121 has items such as “camera ID” and “installation position information”.
  • the "installation position information” is information indicating the installation position of the surveillance camera indicated by the "camera ID”.
  • FIG. 5 shows that the installation position of the surveillance camera 20 indicated by the camera ID “SC1” is “SCP1”.
  • the installation position of the surveillance camera 20 is represented by an abstract code such as "SCP1", but in reality, the latitude and longitude indicating the installation position of the surveillance camera 20 are registered.
  • the control unit 130 is a controller, and is realized by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the information processing device 100, using the RAM as a work area. Alternatively, the control unit 130 is realized by, for example, an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 130 includes an acquisition unit 131, an estimation unit 132, a determination unit 133, and a providing unit 134, and realizes or executes the information processing functions and operations described below.
  • the internal configuration of the control unit 130 is not limited to the configuration shown in FIG. 2, and may be another configuration as long as it is a configuration for performing information processing described later.
  • the connection relationship of each processing unit included in the control unit 130 is not limited to the connection relationship shown in FIG. 2, and may be another connection relationship.
  • the acquisition unit 131 acquires various types of information. Specifically, the acquisition unit 131 acquires a surveillance camera image taken by the surveillance camera 20. To give a more specific example, the acquisition unit 131 acquires various surveillance camera images registered in the surveillance camera image storage unit FIM from the surveillance camera image providing device FIS.
  • the estimation unit 132 refers to the surveillance camera image taken by the surveillance camera and estimates the dangerous area where an accident is likely to occur. Hereinafter, an example of the process of estimating the dangerous area will be described.
  • for example, when the estimation unit 132 acquires a surveillance camera image from the surveillance camera image storage unit FIM, the estimation unit 132 detects a moving body, such as the automobile 10, included in the surveillance camera image.
  • the estimation unit 132 may detect a non-connected vehicle in addition to a connected vehicle. Then, the estimation unit 132 generates behavior information indicating the behavior of the detected moving body. For example, the estimation unit 132 estimates, according to the installation position of each surveillance camera 20, which route the moving body is traveling along and in what manner. In addition, the estimation unit 132 may estimate how a single moving body is traveling from the identity of the license plates of the automobiles photographed by a plurality of surveillance cameras.
  • the estimation unit 132 may estimate the moving speed of the moving body based on the installation positions of the surveillance cameras and the times at which each surveillance camera photographed the moving body. In addition to these, the estimation unit 132 may estimate the traveling mode of the moving body by using various arbitrary image analysis techniques.
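The speed estimate from two timestamped sightings might be computed as a straight-line distance over elapsed time. This is a rough sketch; the helper names are ours, not the specification's, and real trajectories between cameras are of course not straight lines:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_mps(pos_a, t_a, pos_b, t_b):
    """Estimate a moving body's speed from the installation positions of
    two cameras that photographed it (pos = (lat, lon)) and the capture
    times in seconds, as one way the estimation unit 132 might work."""
    dist = haversine_m(pos_a[0], pos_a[1], pos_b[0], pos_b[1])
    return dist / (t_b - t_a)

# Two hypothetical sightings roughly 500 m apart, 20 s apart: ~25 m/s
v = speed_mps((35.0000, 139.0000), 0.0, (35.0045, 139.0000), 20.0)
```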
  • the estimation unit 132 estimates the dangerous area where an accident is likely to occur based on the behavior information of each moving body. For example, the estimation unit 132 sets, as a dangerous area, a region in which the possibility of the moving bodies colliding exceeds a predetermined threshold value when each moving body moves according to the behavior indicated by the behavior information. In addition, the estimation unit 132 estimates the time until an accident occurs from the behavior information of each moving body. The estimation unit 132 may also estimate the color of the traffic lights and the congestion situation from the surveillance camera images and take the estimated traffic light color and congestion information into account when estimating the region where moving bodies are likely to collide.
  • for example, the estimation unit 132 may train a model for estimating a dangerous area by having a predetermined model learn the characteristics of positive and negative examples, using as positive examples the behavior information of moving bodies that collided in a certain area, the surrounding congestion situation until the collision occurred, the color of the traffic lights, and the like, and using as negative examples the behavior information of moving bodies that did not collide, the surrounding congestion situation, the color of the traffic lights, and the like. Then, the estimation unit 132 may estimate, using such a model, whether or not there is a high possibility that moving bodies will collide in each region. In addition to these, the estimation unit 132 may use various arbitrary methods to estimate the region where an accident is likely to occur.
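As a much simpler stand-in for such a learned model, a geometric heuristic can flag a likely collision by linearly extrapolating two moving bodies' behavior information and checking their closest approach. Everything here (planar x/y coordinates in meters, the time horizon, the collision radius) is an illustrative assumption, not the patent's method:

```python
def closest_approach(p1, v1, p2, v2, horizon=30.0, step=0.5):
    """Linearly extrapolate two moving bodies from current positions
    (x, y in meters) and velocities (m/s), and return (min_distance,
    time) of their closest approach within the horizon, sampled at
    fixed time steps."""
    best_d, best_t = float("inf"), 0.0
    t = 0.0
    while t <= horizon:
        dx = (p1[0] + v1[0] * t) - (p2[0] + v2[0] * t)
        dy = (p1[1] + v1[1] * t) - (p2[1] + v2[1] * t)
        d = (dx * dx + dy * dy) ** 0.5
        if d < best_d:
            best_d, best_t = d, t
        t += step
    return best_d, best_t

def is_danger(p1, v1, p2, v2, collision_radius=3.0):
    """Flag a likely collision when the predicted closest approach is
    within a vehicle-sized radius (threshold is illustrative)."""
    d, _ = closest_approach(p1, v1, p2, v2)
    return d <= collision_radius

# Two hypothetical vehicles both reaching the origin after 5 seconds
d_min, t_min = closest_approach((0, -100), (0, 20), (-100, 0), (20, 0))
```

The estimated `t_min` plays the role of the "time until an accident occurs" that the estimation unit attaches to the dangerous area.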
  • the determination unit 133 determines the information to be notified to each surveillance camera 20 based on the distance from the dangerous area to each surveillance camera 20. For example, the determination unit 133 refers to the installation position storage unit 121 and calculates the distance between the dangerous area estimated by the estimation unit 132 and each surveillance camera 20. Then, the determination unit 133 selects, as an information providing target, the surveillance camera 20 whose calculated distance is within a predetermined range. For example, the determination unit 133 may target surveillance cameras within a radius of 50 meters from the danger zone. Further, the determination unit 133 may select the surveillance cameras to be provided with information according to the average speed and moving direction of vehicles at the installation position of the surveillance camera 20.
  • the determination unit 133 determines the information to be notified by each surveillance camera 20 targeted for information provision according to the distance between that surveillance camera 20 and the dangerous area. For example, the determination unit 133 generates information in which the indication of the dangerous area, the time until an accident can occur, the information for causing the automobile 10 to avoid an accident, and the like are changed according to the distance between the surveillance camera 20 and the dangerous area. For example, the determination unit 133 may change the "information for avoiding an accident", such as "stop", "decelerate", and "be careful", according to the distance from the dangerous area. Further, the determination unit 133 may change the information indicating the dangerous area and the time until an accident may occur according to the distance from the dangerous area.
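The camera-selection step might be sketched as below, using the 50-meter radius given in the text as an example. The camera IDs, positions, and the flat-earth distance approximation are illustrative assumptions:

```python
import math

# Illustrative records like the installation position storage unit 121:
# camera ID -> (latitude, longitude). Values are hypothetical.
INSTALL_POS = {
    "SC1": (35.6581, 139.7018),
    "SC2": (35.6584, 139.7023),
    "SC3": (35.6620, 139.7100),
}

def dist_m(a, b):
    """Approximate ground distance in meters between two (lat, lon)
    points (equirectangular approximation, adequate at street scale)."""
    mean_lat = math.radians((a[0] + b[0]) / 2)
    dy = (b[0] - a[0]) * 111_320.0
    dx = (b[1] - a[1]) * 111_320.0 * math.cos(mean_lat)
    return math.hypot(dx, dy)

def cameras_to_notify(danger_pos, radius_m=50.0):
    """Select, as the determination unit 133 might, the cameras whose
    installation position lies within the radius of the estimated
    danger area, together with their distances."""
    return {cam_id: dist_m(danger_pos, pos)
            for cam_id, pos in INSTALL_POS.items()
            if dist_m(danger_pos, pos) <= radius_m}

# Hypothetical danger area near SC1 and SC2 but far from SC3
selected = cameras_to_notify((35.6583, 139.7020))
```

The per-camera distances returned here are exactly what the distance-dependent message generation described above would consume.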
  • the providing unit 134 provides each surveillance camera 20 with the information determined by the determination unit 133 for that surveillance camera 20.
  • FIG. 6 is a diagram showing a configuration example of the surveillance camera according to the embodiment.
  • the surveillance camera 20 includes a communication unit 210, a storage unit 220, a camera 230, and a control unit 240.
  • the communication unit 210 is realized by, for example, a NIC or the like. The communication unit 210 transmits and receives information to and from the information processing device 100, the surveillance camera image providing device FIS, and the position information providing device PIS via the wireless network. For example, the communication unit 210 transmits the surveillance camera image taken by the surveillance camera 20 to the surveillance camera image providing device FIS. In addition, the communication unit 210 acquires position information indicating the position of each automobile from the position information providing device PIS.
  • the storage unit 220 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk. Further, in the example shown in FIG. 6, the storage unit 220 has a destination storage unit 221.
  • FIG. 7 shows an example of the destination storage unit 221 according to the embodiment.
  • FIG. 7 is a diagram showing an example of a destination storage unit according to the embodiment.
  • the destination storage unit 221 has items such as "vehicle ID" and "destination".
  • the “destination” is a destination associated with the automobile 10, and is, for example, an e-mail address or the like.
  • the destination storage unit 221 indicates that the destination of the automobile indicated by the automobile ID “MO1” is “AD1”.
  • the destination is represented by an abstract code such as "AD1”, but in reality, various information for indicating the notification destination such as an e-mail address and an IP address is registered.
  • the camera 230 is an imaging device for capturing an image.
  • the camera 230 is composed of an image sensor such as a CCD (Charge-Coupled Device) sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
  • the camera 230 may be installed so as to capture a predetermined area at a predetermined angle, or may capture an image while appropriately moving the gazing point, for example.
  • the camera 230 is mounted on the surveillance camera 20; however, the surveillance camera 20 may be configured by, for example, a surveillance camera operating as the camera 230 and a surveillance camera server operating as the communication unit 210, the storage unit 220, and the control unit 240.
  • the control unit 240 is a controller, and is realized by, for example, a CPU, an MPU, or the like executing various programs stored in a storage device inside the surveillance camera 20 using the RAM as a work area. Further, the control unit 240 is a controller, and is realized by, for example, an integrated circuit such as an ASIC or FPGA.
  • the control unit 240 has a reception unit 241, a detection unit 242, an identification unit 243, and a notification unit 244, and realizes or executes the information processing functions and operations described below.
  • the internal configuration of the control unit 240 is not limited to the configuration shown in FIG. 6, and may be another configuration as long as it is a configuration for performing information processing described later.
  • connection relationship of each processing unit included in the control unit 240 is not limited to the connection relationship shown in FIG. 6, and may be another connection relationship.
  • the reception unit 241 receives the information to be notified by the surveillance camera 20, that is, the information determined by the information processing device 100 according to the installation position of the surveillance camera 20. For example, the reception unit 241 receives information such as "There is a risk of an accident occurring at an intersection after X seconds. Let's stop." In the following description, the information received by the reception unit 241 may be collectively referred to as "notification information".
  • the detection unit 242 detects a vehicle for which information is to be notified. For example, the detection unit 242 uses the camera 230 to capture a surveillance camera image. Then, the detection unit 242 transmits the captured surveillance camera image to the surveillance camera image providing device FIS.
  • the detection unit 242 analyzes the image of the surveillance camera image and detects the vehicle to be notified. For example, the detection unit 242 may detect a vehicle photographed by a surveillance camera as a notification target. Further, for example, the detection unit 242 may acquire information indicating a dangerous area from the information processing device 100 and detect a vehicle moving in the direction of the acquired dangerous area as a notification target.
  • the surveillance camera 20 may broadcast notification information to a moving object located nearby.
  • the moving body may not be able to receive the notification information even if the notification information is broadcast. Therefore, the surveillance camera 20 uses the identification unit 243 to specify the destination of the automobile to be notified of the notification information.
  • in the following, the vehicle 10 that is photographed by the surveillance camera 20 and is to be notified is referred to as the target vehicle.
  • the identification unit 243 estimates the position of the target vehicle using the image captured by the surveillance camera 20.
  • the identification unit 243 identifies the destination of the target vehicle by associating the vehicle 10 that has acquired the position information indicating the position corresponding to the estimated position with the target vehicle.
  • the longitude and latitude corresponding to the four corners, the center, etc. of the angle of view of the surveillance camera 20 are known.
  • each section corresponds to an individual latitude and longitude range.
  • an object captured in the surveillance camera image taken by the surveillance camera 20 is therefore located within the range of latitude and longitude corresponding to the section containing its image.
  • the identification unit 243 identifies the section containing the image of the target vehicle in the surveillance camera image, and estimates that the target vehicle is located within the range of latitude and longitude corresponding to the identified section. Then, the identification unit 243 identifies position information indicating a position included in that range of latitude and longitude, and associates the vehicle 10 equipped with the in-vehicle device that transmitted the identified position information with the target vehicle. The identification unit 243 then specifies the destination of the notification to the target vehicle by using the result of this association.
  • the identification unit 243 estimates the range of latitude and longitude where the target vehicle is located from the section of the surveillance camera image. Then, the identification unit 243 identifies, from the position information registered in the position information storage unit PIM, the position information included in the estimated range of latitude and longitude. The identification unit 243 may consider only the position information acquired within a predetermined period (for example, several seconds) before the processing time. Then, the identification unit 243 acquires from the position information storage unit PIM the vehicle ID associated with the identified position information (that is, the vehicle ID of the vehicle 10 that transmitted the position information).
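The matching described above can be sketched as follows. All names, the grid size, the coordinates, and the freshness window are illustrative assumptions; only the overall flow (image section → latitude/longitude range → recent position reports) follows the description:

```python
# Hedged sketch: the camera image is divided into sections, each associated
# with a latitude/longitude range (the corners of the angle of view are
# known). The identification unit finds the section containing the target
# vehicle's image and matches recent position reports inside that range.

SECTION_BOUNDS = {
    # (row, col) -> (lat_min, lat_max, lon_min, lon_max); values are examples
    (0, 0): (35.000, 35.001, 139.000, 139.001),
    (0, 1): (35.000, 35.001, 139.001, 139.002),
}

def section_of(pixel_x, pixel_y, section_w=320, section_h=240):
    """Map a pixel position in the camera image to a (row, col) section."""
    return (pixel_y // section_h, pixel_x // section_w)

def match_vehicles(pixel_x, pixel_y, reports, now, window_s=5.0):
    """Return vehicle IDs whose recent position reports fall in the
    latitude/longitude range of the section containing the target image."""
    lat_min, lat_max, lon_min, lon_max = SECTION_BOUNDS[section_of(pixel_x, pixel_y)]
    return [
        vid for vid, (lat, lon, ts) in reports.items()
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
        and now - ts <= window_s          # only reports from the last few seconds
    ]

reports = {
    "MO1": (35.0005, 139.0004, 100.0),   # inside section (0, 0), fresh
    "MO2": (35.0005, 139.0015, 100.0),   # in the neighbouring section
    "MO3": (35.0005, 139.0004, 10.0),    # inside the section, but stale
}
print(match_vehicles(50, 50, reports, now=101.0))  # ['MO1']
```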
  • the identification unit 243 refers to the destination storage unit 221 and identifies the destination associated with the acquired vehicle ID. Then, the identification unit 243 notifies the notification unit 244 of the identified destination.
  • the notification unit 244 notifies the mobile body of the notification information received by the reception unit 241.
  • the notification unit 244 transmits the notification information received by the reception unit 241 to the destination received from the identification unit 243 via various networks. As a result of such processing, the notification unit 244 can transmit the notification information to the automobile 10 photographed by the surveillance camera.
  • FIG. 8 is a flowchart showing an example of a flow of specific processing executed by the information processing apparatus 100 according to the embodiment.
  • the information processing apparatus 100 acquires each surveillance camera image (step S101) and estimates the behavior of the moving object from the acquired surveillance camera images (step S102). Then, the information processing device 100 determines whether or not there is an area where a traffic accident is likely to occur, that is, a dangerous area (step S103). When no dangerous area exists (step S103: No), the information processing apparatus 100 returns to step S101.
  • when a dangerous area exists (step S103: Yes), the information processing device 100 determines different information according to the distance from the dangerous area as the information to be notified by each surveillance camera 20 (step S104). Then, the information processing device 100 notifies each surveillance camera 20 of the determined information (step S105).
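A minimal sketch of the loop in FIG. 8 (steps S101 to S105), under assumed helper names, thresholds, and message texts. Image acquisition and behavior estimation (S101 and S102) are abstracted into the `danger_area` argument:

```python
# A minimal sketch of the flow in FIG. 8 (steps S101-S105). Helper names,
# thresholds, and messages are assumptions, not part of the embodiment.

def decide_notification(distance_m):
    """Step S104: different information depending on the distance (assumed thresholds)."""
    if distance_m < 50:
        return "There is a risk of an accident at the intersection ahead. Let's stop."
    if distance_m < 200:
        return "An accident may occur ahead. Please slow down."
    return "Caution: a dangerous area has been detected on your route."

def process_cycle(danger_area, cameras):
    """Steps S103-S105: if a danger area exists, decide and send per-camera info."""
    if danger_area is None:                       # S103: No -> return to S101
        return {}
    return {cam_id: decide_notification(abs(pos - danger_area))
            for cam_id, pos in cameras.items()}   # S104 and S105

cameras = {"cam21": 30, "cam22": 120, "cam23": 500}   # positions along a road (m)
print(process_cycle(danger_area=0, cameras=cameras))
```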
  • the information processing apparatus 100 determines the content of the information notified by each surveillance camera 20 according to the distance between the danger zone SP1 and each surveillance camera 20.
  • the embodiment is not limited to this.
  • the information processing device 100 may determine the content of the information notified by the surveillance camera 20 based on the information around the surveillance camera 20 in addition to the position information. For example, the information processing device 100 estimates the state of the road located in the vicinity of the surveillance camera 21 from the surveillance camera image taken by the surveillance camera 21. To give a specific example, the information processing device 100 estimates whether or not there is a traffic jam on a road located in the vicinity of the surveillance camera 21, the average moving speed of an automobile, and the like.
  • for example, when the average moving speed of automobiles is high, the information processing device 100 causes actions for avoiding an accident to be taken early. Therefore, even if the distance between the surveillance camera 21 and the danger zone SP1 is equal to or greater than a predetermined distance, it may determine notification information calling for the vehicle to make an emergency stop. Further, when a traffic jam has occurred, the possibility that an automobile is involved in an accident is low, so even if the distance between the surveillance camera 21 and the danger zone SP1 is equal to or less than the predetermined distance, the information processing device 100 may determine notification information such as "There is a possibility of an accident in a traffic jam. Please be careful."
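The distance-plus-road-state decision described above might look like the following sketch; the thresholds and message texts are assumptions made for illustration:

```python
# Hedged sketch of adjusting the notification content using both the distance
# to the danger zone SP1 and the estimated road state (traffic jam, average
# moving speed). Thresholds and wording are assumptions.

def decide_notification(distance_m, congested, avg_speed_kmh,
                        near_threshold_m=100, fast_kmh=60):
    if congested:
        # In a traffic jam the vehicle is unlikely to be involved, even close by.
        return "There is a possibility of an accident in a traffic jam. Please be careful."
    if avg_speed_kmh >= fast_kmh:
        # High average speed: ask for early avoidance even at a distance.
        return "Emergency: please stop."
    if distance_m < near_threshold_m:
        return "There is a risk of an accident ahead. Let's stop."
    return "A dangerous area has been detected ahead. Please slow down."

print(decide_notification(300, congested=False, avg_speed_kmh=80))
```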
  • the information processing device 100 may generate the notification information based on various information about the surroundings of the surveillance camera 20, such as the number of bicycles and pedestrians located around the surveillance camera 20, the traffic rules around the surveillance camera 20 (for example, the speed limit), whether or not there is a stopped vehicle, whether or not public transportation such as a bus is located nearby, and whether or not there is a store and the status of that store.
  • the situation around the surveillance camera 21 may be estimated from, for example, the surveillance camera image taken by the surveillance camera 21 itself, or from a surveillance camera image taken by another surveillance camera 20 located adjacent to the surveillance camera 21 or on the opposite side of the road. It may also be estimated from information acquired by various sensors other than the surveillance camera image, such as a Doppler sensor.
  • as described above, the surveillance camera 20 notifies the photographed automobile 10 of the notification information. Specifically, the surveillance camera 20 estimates the position of the target vehicle from the section of the surveillance camera image in which it is captured, identifies the vehicle 10 that transmitted the position information corresponding to the estimated position, and transmits the notification information with the destination of the identified vehicle 10 as the destination of the target vehicle.
  • the surveillance camera 20 may make a notification using a temporally varying destination instead of the permanent destination of the automobile 10 as described above. More specifically, the surveillance camera 20 may make a notification using a destination assigned according to the traveling mode of the automobile 10, such as the position and speed at which each automobile 10 travels and its traveling direction. For example, the surveillance camera 20 may send the notification information by using the Destination Layer-2 ID of LTE (Long Term Evolution).
  • the destination identification process using the Destination Layer-2 ID will be explained.
  • the road on which the automobile 10 travels is divided into a plurality of areas, and different Layer-2 IDs are assigned to each area in advance.
  • for example, on the road photographed by the surveillance camera 21, the Layer-2 ID "001" is assigned to the lane heading toward the danger zone SP1, and the Layer-2 ID "002" is assigned to the lane heading in the direction away from the danger zone SP1.
  • the Layer-2 IDs are assigned not only to the road photographed by the surveillance camera 21 and roads located near the surveillance camera 21, but also to roads not photographed by the surveillance camera 21 (roads in its blind spots) and roads located away from the surveillance camera 21. That is, in the identification process using the Destination Layer-2 ID, a different Layer-2 ID is assigned to each combination of the area in which a vehicle 10 travels and the direction in which it travels.
  • when such Layer-2 IDs are assigned, the in-vehicle device of each automobile 10 appropriately changes its own destination Layer-2 ID according to its traveling position. For example, among the information broadcast to each automobile 10, the in-vehicle device of an automobile 10 traveling in the lane assigned the Layer-2 ID "001" passes only the information addressed to the Layer-2 ID "001" to the higher processing layer, and discards information addressed to other Layer-2 IDs. Likewise, the in-vehicle device of an automobile 10 traveling in the lane assigned the Layer-2 ID "002" passes only the information addressed to the Layer-2 ID "002" to the higher processing layer, and discards information addressed to other Layer-2 IDs.
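The filtering performed by the in-vehicle device can be sketched as follows; the message shape and class layout are assumptions, and only the accept-or-discard rule follows the description:

```python
# Sketch of the in-vehicle filtering: each device sets its destination
# Layer-2 ID from the lane it is travelling in, passes broadcasts addressed
# to that ID to the higher processing layer, and discards the rest.

LANE_LAYER2_ID = {"toward_SP1": "001", "away_from_SP1": "002"}

class InVehicleDevice:
    def __init__(self, lane):
        self.dest_id = LANE_LAYER2_ID[lane]   # updated as the travel position changes
        self.delivered = []                   # messages passed to the higher layer

    def on_broadcast(self, layer2_id, payload):
        if layer2_id == self.dest_id:
            self.delivered.append(payload)    # process in the higher layer
        # information addressed to other Layer-2 IDs is discarded silently

dev = InVehicleDevice("toward_SP1")
dev.on_broadcast("001", "Risk of accident ahead. Let's stop.")
dev.on_broadcast("002", "No action needed.")
print(dev.delivered)  # ["Risk of accident ahead. Let's stop."]
```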
  • the surveillance camera 21 identifies the Layer-2 ID of vehicles heading for the dangerous area SP1. For example, the surveillance camera 21 identifies the Layer-2 ID "001" assigned to the lane heading toward the danger zone SP1. In addition, the surveillance camera 21 identifies the Layer-2 IDs assigned to the lane in which vehicles heading for the danger zone SP1 travel and to lanes located within a predetermined range from the danger zone SP1. The surveillance camera 21 may change the distance between the lanes to be notified and the danger zone SP1 according to the time until the accident occurs.
  • the surveillance camera 21 broadcasts the notification information received from the information processing device 100 with the identified Layer-2 ID as the destination. As a result of such processing, the surveillance camera 21 can transmit the notification information to those automobiles 10 that are located within the predetermined range from the danger zone SP1 and are heading toward the danger zone SP1.
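The camera-side selection of destination Layer-2 IDs, including a notification range that varies with the time until the accident, might be sketched as follows; the data shapes, the assumed speed, and the lane records are illustrative:

```python
# Sketch of the camera-side selection: pick the Layer-2 IDs of lanes heading
# toward the danger zone SP1 and lying within a range that shrinks as the
# predicted time until the accident gets shorter.

lanes = [
    {"id": "001", "toward_sp1": True,  "distance_m": 40},
    {"id": "003", "toward_sp1": True,  "distance_m": 250},
    {"id": "002", "toward_sp1": False, "distance_m": 40},
]

def select_destinations(lanes, seconds_to_accident, speed_m_s=15.0):
    """Notify lanes reachable before the accident (range = speed * time)."""
    max_range = speed_m_s * seconds_to_accident
    return [lane["id"] for lane in lanes
            if lane["toward_sp1"] and lane["distance_m"] <= max_range]

print(select_destinations(lanes, seconds_to_accident=20))  # ['001', '003']
print(select_destinations(lanes, seconds_to_accident=5))   # ['001']
```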
  • the surveillance camera 21 may specify a different destination according to the content of the event specified from the surveillance camera image.
  • for example, the surveillance camera 21 may select different Layer-2 IDs as the destination depending on whether it is estimated that the automobile CA1 and the automobile 12 will collide in the dangerous area SP1 or that the automobile 11 and a bicycle will collide in the dangerous area SP1.
  • the surveillance camera 21 may change the destination Layer-2 ID according to the time until the accident occurs, the traveling speed of the automobile 10 which is likely to be involved in the accident, and the like.
  • the surveillance camera 21 may detect or estimate an arbitrary event by various image analysis techniques, and perform the above-described processing with the region where the detected or estimated event occurs as the danger region SP1. For example, when a fire breaks out in a building facing the road, the surveillance camera 21 may set the road in front of the burning building as the danger zone SP1 and notify the notification information to automobiles 10 traveling in the direction toward the danger zone SP1.
  • the surveillance camera 21 may change the destination Layer-2 ID according to other information.
  • the in-vehicle device of each automobile 10 may appropriately change its Layer-2 ID according to various traveling modes, such as the size and weight of the automobile, attributes of the users actually riding in it such as their number, gender, and age, attributes of the automobile, the weather in the traveling area, and the presence or absence of surrounding pedestrians.
  • the surveillance camera 21 may determine the traveling mode of the automobiles 10 to be notified according to the content of the detected event and the area where the event occurred, and use the Layer-2 ID corresponding to the determined traveling mode as the destination.
  • when executing such a process, the surveillance camera 21 only has to hold, for each content of a detectable event, a list associating it with the Layer-2 IDs corresponding to the traveling modes of the automobiles 10 to be notified. Using such a list, the surveillance camera 21 can specify a temporally varying destination, that is, the destination of the automobiles 10 to be notified according to the content of the detected event (for example, whether an accident or a fire occurred, or the speed or type of the vehicle involved in the accident).
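The list the camera holds can be pictured as a mapping from event content to destination Layer-2 IDs; the event names and IDs below are invented placeholders:

```python
# Hedged sketch of the per-event notification list the camera might hold.
# Keys and Layer-2 ID values are illustrative placeholders only.

NOTIFY_LIST = {
    "car_car_collision":     ["001"],          # lanes toward the danger zone
    "car_bicycle_collision": ["001", "004"],   # also notify the bicycle lane
    "building_fire":         ["001", "002"],   # both directions on the facing road
}

def destinations_for(event):
    """Look up the destination Layer-2 IDs for a detected event content."""
    return NOTIFY_LIST.get(event, [])

print(destinations_for("building_fire"))  # ['001', '002']
```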
  • the automobile has been described as an example, but the present invention may be applied to any moving body instead of the automobile.
  • the moving body is a bicycle, a drone, a bus, a train, or the like.
  • the moving body may be a vehicle in which a non-connected vehicle becomes a connected vehicle by mounting a communication device, an information processing device, or the like attached later, or may be an autonomous driving vehicle or the like. That is, the connected vehicle only needs to have a detection function for detecting its current position and a communication function for transmitting position information indicating the detected current position to the information processing device 100; whether or not it is a vehicle capable of autonomous driving does not matter.
  • the surveillance camera has been described as an example, but the surveillance camera may be any camera.
  • the camera may be a depth camera, an infrared camera, or the like.
  • the sensor is not limited to a camera that acquires an image in which pixels or image regions correspond to positions in the real world; it may be, for example, an infrared sensor or various radio wave sensors.
  • the information processing device 100 can appropriately notify each moving body by determining the notification information to be notified from the various sensors to each moving body according to the distance between those sensors and the danger region SP1.
  • as the surveillance camera 20, a smart surveillance camera having various shooting functions together with functions for performing various information processing and information notification has been described as an example.
  • the surveillance camera 20 may be realized by, for example, a small server device including a camera fixed to the roadside. That is, the process of "transmitting information by the surveillance camera 20" is a concept including transmission of information by the surveillance camera 20, a small server device linked with the surveillance camera 20, or a small server having a camera.
  • the surveillance camera 20 has notified the notification information received from the information processing device 100, but the embodiment is not limited to this.
  • the information processing device 100 estimates the danger area SP1 based on the surveillance camera image, and notifies each surveillance camera 20 of the estimated danger area SP1.
  • each surveillance camera 20 may determine the notification information according to the positional relationship between the notified danger area SP1 and the position where the camera itself is installed, and notify the vehicle 10 of the determined notification information.
  • the control unit 240 of the surveillance camera 20 further includes, for example, the determination unit 133 shown in FIG.
  • the surveillance camera 20 may autonomously determine the notification information in cooperation with the information processing device 100 and notify each vehicle of the determined information. Further, for example, the surveillance camera 20 may estimate the danger region SP1 from the surveillance camera image and notify the estimated danger region SP1 to another surveillance camera 20 in the same manner as the information processing device 100. In such a case, the surveillance camera 20 can determine and notify the notification information only by the surveillance camera 20 without having the information processing device 100.
  • the information processing device 100 and the surveillance camera 20 determine the notification information to be notified from the surveillance camera 20 according to the distance between the danger region SP1 and the surveillance camera 20.
  • the embodiment is not limited to this.
  • the information processing device 100 may determine the notification information according to the number of intersections existing between the front of the surveillance camera 20 and the danger zone SP1, the number of left or right turns at intersections between the front of the surveillance camera 20 and the danger zone SP1, the average travel time from the front of the surveillance camera 20 to the danger zone SP1, the number of crosswalks existing between the front of the surveillance camera 20 and the danger zone SP1, and the like. Further, the information processing device 100 may determine the notification information according to whether or not there is a traffic jam between the front of the surveillance camera 20 and the danger zone SP1, the average traffic volume, the number of pedestrians, and the like.
  • the information processing device 100 may determine the notification information according to whether or not the surveillance camera 20 is installed near an intersection, the shooting direction of the surveillance camera 20, the presence or absence of bicycles and the number of pedestrians in the vicinity of the surveillance camera 20, an event that occurred in the vicinity of the surveillance camera 20, and the like. That is, the information processing device 100 may determine the notification information based on arbitrary information, as long as the notification information is determined at least according to the installation position of the sensor.
  • [6-6. In-vehicle device]
  • the in-vehicle device has been described as an example, but the present invention may be applied to any terminal device instead of the in-vehicle device.
  • the terminal device may be a terminal device or the like used by a user who accesses content such as a web page displayed on a browser or content for an application.
  • the terminal device may be a desktop PC (Personal Computer), a notebook PC, a tablet terminal, a mobile phone, a PDA (Personal Digital Assistant), a smart watch, a wearable device, or the like.
  • the in-vehicle device uses GPS to detect the current position.
  • the embodiment is not limited to this.
  • GPS information has been described as an example, but any position information may be applied instead of GPS information.
  • the in-vehicle device may estimate or acquire the current position of the automobile 10 by using the position information of the base station with which it communicates, or by using WiFi (registered trademark) (Wireless Fidelity) radio waves.
  • the surveillance camera 20 transmits the notification information to the various automobiles 10, but the acquisition process is not limited to the above.
  • the position information providing device PIS stores the attribute information related to the attributes of the automobile 10 and the GPS information in association with each other.
  • the surveillance camera 20 may acquire the attribute information related to the attributes of the automobile 10 from the position information providing device PIS together with the GPS information of the automobile 10.
  • the surveillance camera 20 may acquire, as attribute information of the automobile 10, information on the vehicle type, speed, number of braking operations, driving operation, traveling time, number of door openings and closings, and the like from the position information providing device PIS. Then, the surveillance camera 20 may associate the target vehicle with the vehicle 10 based on the position of the target vehicle estimated from the image, the position indicated by the position information acquired from the vehicle 10, and the attribute information of the vehicle 10.
  • for example, the surveillance camera 20 estimates attributes related to the movement of the target vehicle from the image. In another example, the surveillance camera 20 estimates the attributes of the target vehicle to which the notification information is to be transmitted from the content of the detected event. Then, when the estimated attributes match or are similar to the attributes indicated by the attribute information of a vehicle 10, and the position of the target vehicle matches or is similar to the position indicated by the position information acquired from that vehicle 10, the destination of that vehicle 10 may be used as the destination of the target vehicle.
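The attribute-and-position matching might be sketched as follows; the record layout and the similarity thresholds are assumptions made for illustration:

```python
# Sketch of linking the target vehicle seen in the image with a reporting
# vehicle 10 by requiring both the estimated position and the estimated
# attributes (here vehicle type and speed) to match or be similar.

def similar(a, b, tol):
    """True when two scalar values differ by at most tol."""
    return abs(a - b) <= tol

def link_target(target, candidates, pos_tol=10.0, speed_tol=5.0):
    """Return the destination of the first candidate matching the target."""
    for c in candidates:
        if (c["type"] == target["type"]
                and similar(c["speed_kmh"], target["speed_kmh"], speed_tol)
                and similar(c["pos_m"], target["pos_m"], pos_tol)):
            return c["destination"]
    return None

target = {"type": "truck", "speed_kmh": 42.0, "pos_m": 100.0}
candidates = [
    {"type": "sedan", "speed_kmh": 41.0, "pos_m": 101.0, "destination": "AD2"},
    {"type": "truck", "speed_kmh": 40.0, "pos_m": 95.0,  "destination": "AD1"},
]
print(link_target(target, candidates))  # AD1
```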
  • the information processing device 100 has been described with an example of a process of determining from an image whether or not the non-connected vehicle CA1 is a vehicle with a high possibility of causing a traffic accident, but the information processing device 100 is not limited to the above determination process. Specifically, the information processing device 100 may determine whether or not the non-connected vehicle CA1 included in the image is an emergency vehicle such as an ambulance.
  • the information processing device 100 analyzes an image by a conventional technique such as image analysis to determine whether or not the unconnected vehicle CA1 included in the image is an emergency vehicle based on the vehicle type, license plate, vehicle outer shape, and the like. You may judge. In this way, the information processing apparatus 100 can determine the unconnected vehicle CA1 that is characteristic in various situations according to various situations.
  • the information processing device 100 may cause each surveillance camera 20 to notify the automobiles 10 of information so that they do not obstruct the ambulance.
  • the information processing device 100 may set the intersection through which the ambulance passes as the danger zone SP1 and determine the notification information to be notified from each surveillance camera 20 according to the positional relationship between the danger zone SP1 and each surveillance camera 20.
  • the information processing device 100 estimates the collision between the automobile 10 and the automobile CA1, but the embodiment is not limited to this.
  • the information processing device 100 identifies a bicycle, a child, an animal, or the like from an image acquired from the surveillance camera 20 by using a conventional technique such as image analysis. Further, the information processing device 100 identifies a vehicle in which a bicycle, a child, an animal, or the like is presumed to exist in the blind spot as a target vehicle from the image acquired from the surveillance camera 20.
  • the information processing device 100 estimates whether or not there is a risk of collision between the target vehicle and a bicycle, a child, an animal, or the like, and if there is a risk of collision, a position where there is a risk of collision. And the area may be the danger area SP1.
  • in this way, even when a bicycle, a child, an animal, or the like jumps out, the information processing device 100 can cause the surveillance camera 20 to notify an automobile that may collide with it of various notification information.
  • conventionally, a bicycle, a child, an animal, or the like may enter a blind spot that cannot be captured by the in-vehicle camera, in which case the in-vehicle camera could not recognize the bicycle, child, animal, or the like.
  • the information processing device 100 can specify the destination of the target vehicle even when a bicycle, a child, an animal, or the like jumps out. As a result, the automobile 10 can prevent a traffic accident.
  • the information processing device 100 may identify a vehicle that violates the stopping prohibition based on the surveillance camera image taken by the surveillance camera 20, and may set the position or region of the identified vehicle as the danger region SP1. Further, the information processing device 100 may set the position or area of a taxi, an autonomous vehicle, a bus, or the like whose users are getting on and off as the danger area SP1. In such a case, the information processing device 100 can reduce the possibility that another automobile 10 collides with the automobile 10 stopped on the road or that a chain-reaction pile-up accident occurs.
  • the information processing device 100 may provide the police or the like with information regarding a claim for a fine or the like for a vehicle that continues to violate the stopping prohibition after a predetermined period of time.
  • the information processing device 100 may provide information such as a license plate of a photographed automobile.
  • the information processing device 100 may store, in a predetermined storage unit, a flag such as one marking a dangerous vehicle that repeats violations, in association with the vehicle ID of the vehicle that violates the stopping prohibition.
  • the above modification can be applied to a vehicle committing any kind of traffic violation, not only a vehicle violating the stopping prohibition.
  • the information processing device 100 identifies a vehicle that violates the overtaking prohibition based on the surveillance camera image taken by the surveillance camera 20. Further, the information processing device 100 may set the position or region of the specified automobile, which violates the overtaking prohibition, as the danger zone SP1.
  • the surveillance camera 20 may notify any information.
  • the information notified from the surveillance camera 20 may include the license plate of a dangerous driving vehicle that may cause an accident, features of the vehicle such as its type, outer shape, and color, the estimated speed of the vehicle, position information of the vehicle, and so on.
  • the notification information may include control information or the like for causing the notification destination automobile 10 to automatically drive to avoid an accident.
  • the surveillance camera 20 may transmit notification information in any mode.
  • the surveillance camera 20 may notify the notification information in a form acquired only by the automobile 10, or may notify the passengers of the automobile 10 of the notification information in an understandable manner.
  • the information processing device 100 may cause a surveillance camera 20 in the vicinity of a vehicle 10 violating the stopping prohibition to notify that vehicle 10 to stop the violation.
  • the information processing device 100 may set an arbitrary position as a dangerous area. For example, the information processing device 100 determines whether or not an automobile 10 captured in the surveillance camera image is a dangerous automobile based on whether or not it satisfies a predetermined condition. For example, the information processing device 100 determines that the photographed automobile 10 is a dangerous automobile when at least one of the following holds: it travels with a predetermined driving operation, it travels at a speed equal to or higher than a predetermined threshold value, or it has violated traffic rules more than a predetermined number of times.
  • the information processing device 100 may determine the automobile 10 to be a dangerous automobile when the automobile 10 repeats meandering driving. That is, when the non-connected vehicle CA1 repeatedly meanders in the surveillance camera image acquired from the surveillance camera 20, the information processing device 100 determines that the non-connected vehicle CA1 is a dangerous vehicle because it is traveling with a predetermined driving operation that is likely to cause a traffic accident.
  • when the non-connected vehicle CA1 is traveling in compliance with the traffic rules, the information processing device 100 does not determine the non-connected vehicle CA1 to be a dangerous vehicle, because it is traveling by a driving operation that is unlikely to cause a traffic accident.
  • when the non-connected vehicle CA1 is traveling at a speed equal to or higher than a predetermined threshold value, the information processing device 100 determines that the non-connected vehicle CA1 is a dangerous vehicle.
  • when the non-connected vehicle CA1 is traveling at a speed lower than the legal speed, the information processing device 100 does not have to determine the non-connected vehicle CA1 to be a dangerous vehicle.
  • the information processing device 100 identifies the license plate of the non-connected vehicle CA1 by applying a conventional technology, such as image analysis, to the surveillance camera image including the non-connected vehicle CA1.
  • the information processing device 100 acquires the history of traffic-rule violations associated with the license plate of the non-connected vehicle CA1, and may determine the non-connected vehicle CA1 to be a dangerous vehicle if it has violated traffic rules a predetermined number of times or more.
  • the information processing apparatus 100 does not have to determine the non-connected vehicle CA1 to be a dangerous vehicle when it has violated traffic rules fewer than a predetermined number of times.
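The dangerous-vehicle determination described in the bullets above can be sketched as follows. This is an illustrative sketch only: the function name, the meander count, and all threshold values are assumptions for exposition, not values taken from the embodiment.

```python
# Illustrative sketch of the dangerous-vehicle determination described above.
# Thresholds and parameter names are hypothetical, not part of the embodiment.

def is_dangerous_vehicle(meander_count: int,
                         speed_kmh: float,
                         violation_count: int,
                         speed_limit_kmh: float = 60.0,
                         meander_threshold: int = 3,
                         violation_threshold: int = 5) -> bool:
    """A vehicle is judged dangerous when at least one condition holds:
    repeated meandering, speed at or above a predetermined threshold, or
    a violation history at or above a predetermined count."""
    if meander_count >= meander_threshold:      # predetermined driving operation
        return True
    if speed_kmh >= speed_limit_kmh:            # speed >= predetermined threshold
        return True
    if violation_count >= violation_threshold:  # traffic-rule violation history
        return True
    return False

# A vehicle driving below the limit, with few violations and no meandering,
# is not determined to be dangerous.
print(is_dangerous_vehicle(meander_count=0, speed_kmh=40.0, violation_count=1))  # False
print(is_dangerous_vehicle(meander_count=4, speed_kmh=40.0, violation_count=1))  # True
```

Each condition is independent, matching the "at least one of them" wording of the embodiment.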
  • the information processing device 100 may determine whether or not another vehicle, different from the dangerous vehicle, is likely to be involved in a traffic accident. For example, suppose that the positional relationship between the surveillance camera 24 that photographed the dangerous vehicle and the surveillance camera 21 that photographed another vehicle 11 is known, and that the distance between the surveillance camera 21 and the surveillance camera 24 is less than a predetermined distance. In this case, because the dangerous vehicle and the vehicle 11 are less than the predetermined distance apart, the information processing device 100 may determine that the vehicle 11 is a target vehicle with a high possibility of being involved in a traffic accident.
  • the information processing device 100 may determine that a vehicle 10 whose image is included in the same surveillance camera image as the dangerous vehicle, or another vehicle 10 photographed by the same surveillance camera 20 as the dangerous vehicle within a predetermined time, is a target vehicle with a high possibility of being involved in a traffic accident. Further, the above-mentioned processing may be executed by, for example, the estimation unit 132 shown in FIG. 2, or the information processing apparatus 100 may further have a functional configuration (for example, a determination unit) for performing the above-mentioned various determinations.
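A minimal sketch of the camera-distance criterion described above, assuming hypothetical installation coordinates and a hypothetical 200 m threshold (the embodiment only states that the positional relationship between cameras is known and that a predetermined distance is used):

```python
# Illustrative sketch: a vehicle photographed by a camera that is less
# than a predetermined distance from the camera that photographed the
# dangerous vehicle is treated as a likely target vehicle.
# Coordinates and the threshold are assumptions, not from the embodiment.
import math

CAMERA_POS = {  # hypothetical installation positions (x, y) in metres
    "camera21": (0.0, 0.0),
    "camera24": (120.0, 50.0),
}

def camera_distance(a: str, b: str) -> float:
    (ax, ay), (bx, by) = CAMERA_POS[a], CAMERA_POS[b]
    return math.hypot(bx - ax, by - ay)

def is_target_vehicle(vehicle_camera: str, dangerous_camera: str,
                      threshold_m: float = 200.0) -> bool:
    """True when the camera that photographed the vehicle is less than
    the predetermined distance from the dangerous vehicle's camera."""
    return camera_distance(vehicle_camera, dangerous_camera) < threshold_m

print(is_target_vehicle("camera21", "camera24"))  # True (130 m < 200 m)
```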
  • the surveillance camera 20 identifies the destination of the notification destination vehicle 10 by using the position information and the Layer-2 ID.
  • the following describes an example of a process in which the surveillance camera 20 identifies the destination of the automobile 10 based on the license plate of the automobile 10 captured in the surveillance camera image.
  • when an automobile 10 with a high possibility of being involved in a traffic accident is photographed, the surveillance camera 20 identifies the license plate of the photographed automobile 10 by analyzing the surveillance camera image with a conventional technique such as image analysis. Then, the surveillance camera 20 may specify, from the storage unit, the destination associated with the identified license plate and transmit the notification information to the specified destination.
  • the information processing device 100 provides a solution for identifying, from the surveillance camera image taken by the surveillance camera 20, the automobile 10 to which the notification information is to be notified, and for notifying that automobile 10 of various information.
  • when the information processing device 100 determines that the license plate of the automobile 10 identified by analyzing the surveillance camera image matches a license plate of an automobile 10 stored in the storage unit, it specifies the destination stored in association with that license plate. Thereby, the information processing apparatus 100 can identify an appropriate moving body.
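The plate-to-destination lookup described above amounts to a keyed table search against the storage unit. The sketch below is illustrative only; the plate strings and the destination format are hypothetical placeholders, and the actual plate recognition (image analysis) is out of scope.

```python
# Minimal sketch of the destination lookup: the license plate read from
# the surveillance camera image is matched against the storage unit, and
# the destination stored in association with it is used for notification.
# Table contents and destination format are hypothetical.
from typing import Optional

DESTINATION_STORE = {  # license plate -> network destination (assumed format)
    "plate-1234": "vehicle-10a.example.invalid",
    "plate-5678": "vehicle-10b.example.invalid",
}

def resolve_destination(recognized_plate: str) -> Optional[str]:
    """Return the destination stored in association with the plate, or
    None when no stored plate matches (e.g. the plate could not be read
    due to the positional relationship between camera and vehicle)."""
    return DESTINATION_STORE.get(recognized_plate)
```

Returning `None` on a miss mirrors the case noted below, where the plate cannot be identified and a different notification path is needed.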
  • the surveillance camera 20 according to the embodiment exerts an advantageous effect when the license plate of the automobile 10 cannot be identified from the image due to the positional relationship between the surveillance camera 20 and the automobile 10.
  • the information processing device 100 according to the embodiment can change the content of the warning regarding the dangerous vehicle notified to the vehicle 10 according to the distance between the danger region SP1 and the surveillance camera 20.
  • the information processing apparatus 100 according to the embodiment can notify the automobile 10, via the surveillance camera 20, of suitable alert content even when the license plate of the automobile 10 cannot be identified. That is, the information processing apparatus 100 according to the embodiment can provide information that supplements the processing performed by the information processing apparatus 100 according to the other embodiment.
  • FIG. 9 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the information processing device 100.
  • the computer 1000 includes a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface (I/F) 1500, an input/output interface (I/F) 1600, and a media interface (I/F) 1700.
  • the CPU 1100 operates based on the program stored in the ROM 1300 or the HDD 1400, and controls each part.
  • the ROM 1300 stores a boot program executed by the CPU 1100 when the computer 1000 is started, a program depending on the hardware of the computer 1000, and the like.
  • the HDD 1400 stores a program executed by the CPU 1100, data used by such a program, and the like.
  • the communication interface 1500 receives data from another device via the network N and sends it to the CPU 1100, and transmits data generated by the CPU 1100 to other devices via the network N.
  • the CPU 1100 controls an output device such as a display or a printer and an input device such as a keyboard or a mouse via the input / output interface 1600.
  • the CPU 1100 acquires data from the input device via the input / output interface 1600. Further, the CPU 1100 outputs the data generated via the input / output interface 1600 to the output device.
  • the media interface 1700 reads the program or data stored in the recording medium 1800 and provides the program or data to the CPU 1100 via the RAM 1200.
  • the CPU 1100 loads the program from the recording medium 1800 onto the RAM 1200 via the media interface 1700, and executes the loaded program.
  • the recording medium 1800 is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or a PD (Phase change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
  • the CPU 1100 of the computer 1000 realizes the function of the control unit 130 by executing the program loaded on the RAM 1200. Further, the data in the storage unit 120 is stored in the HDD 1400. The CPU 1100 of the computer 1000 reads and executes these programs from the recording medium 1800, but as another example, these programs may be acquired from another device via the network N.
  • each component of each device shown in the figure is a functional concept, and does not necessarily have to be physically configured as shown in the figure. That is, the specific form of distribution and integration of each device is not limited to the one shown in the figure, and all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • the information processing device 100, the surveillance camera image providing device FIS, and the position information providing device PIS may be integrated into one information processing device. In this case, the information processing device acquires the position information from the automobile 10 and the image taken from the surveillance camera 20.
  • the terms "section", "module", and "unit" can be read as "means" or "circuit".
  • the providing unit can be read as a providing means or a providing circuit.
  • the information processing apparatus 100 includes a determination unit 133 and a providing unit 134.
  • the determination unit 133 determines the information that the sensor notifies the moving body in the vicinity of the sensor based on the installation position of the sensor that can notify the moving body of the information.
  • the providing unit 134 provides the information determined by the determining unit 133 to the sensor.
  • the information processing apparatus 100 provides the sensor with the information, determined based on the installation position of the sensor capable of notifying the moving body, that the sensor notifies to the moving body in the vicinity of the sensor, and can therefore notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 determines the information notified by the sensor for each sensor, and the providing unit 134 provides the information notified by the sensor for each sensor.
  • since the information processing device 100 determines the information to be notified by each sensor on a per-sensor basis and provides that information to each sensor, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 determines the information to be notified by the sensor based on the distance from the position determined based on the information acquired by the other sensor to the sensor.
  • since the information processing device 100 determines the information to be notified by the sensor based on the distance from the position determined based on the information acquired by the other sensor to the sensor, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 uses, as the predetermined position, a position where an accident may occur, and determines the information to be notified by the sensor based on the distance from that position to the sensor.
  • since the information processing device 100 determines the information to be notified by the sensor based on the distance from the position where an accident may occur, as the predetermined position, to the sensor, it can notify the moving body of appropriate notification content according to the situation around the moving body.
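The distance-dependent decision described above can be sketched as follows. The distance bands, message texts, and dictionary keys are illustrative assumptions; the embodiment does not specify concrete notification contents.

```python
# Illustrative sketch: the content notified by each sensor varies with the
# distance from the position where an accident may occur to that sensor.
# Distance bands and message texts are assumptions, not from the embodiment.

def notification_for(distance_m: float) -> dict:
    if distance_m < 100.0:
        return {"level": "warning", "action": "stop",
                "message": "Accident-prone position immediately ahead"}
    if distance_m < 500.0:
        return {"level": "caution", "action": "slow_down",
                "message": "Accident-prone position nearby"}
    return {"level": "info", "action": "none",
            "message": "Accident-prone position in the wider area"}

def assign_notifications(sensor_distances: dict) -> dict:
    """Decide, per sensor, the information that sensor should notify,
    from each sensor's distance to the accident-prone position."""
    return {sensor: notification_for(d) for sensor, d in sensor_distances.items()}
```

The per-sensor dictionary mirrors the embodiment's point that the determination unit decides, and the providing unit provides, different information for each sensor.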
  • the determination unit 133 uses, as the information to be notified by the sensor, information indicating the position where the accident may occur.
  • since the information processing device 100 uses, as the information to be notified by the sensor, information indicating the position where an accident can occur, it can notify the moving body of appropriate notification content according to the surrounding conditions of the moving body.
  • the determination unit 133 uses information indicating the time until an accident can occur as information notified by the sensor.
  • since the information processing device 100 uses, as the information to be notified by the sensor, information indicating the time until an accident can occur, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 uses, as the information to be notified by the sensor, information for causing the moving body to avoid an accident.
  • since the information processing device 100 uses, as the information to be notified by the sensor, information for causing the moving body to avoid an accident, it can notify the moving body of appropriate notification content according to the surrounding conditions of the moving body.
  • the determination unit 133 uses, as the sensor, a sensor capable of notifying information to surrounding moving bodies, and determines the information to be notified by the sensor based on the installation position of that sensor.
  • since the information processing device 100 uses, as the sensor, a sensor capable of notifying information to surrounding moving bodies, and determines the information to be notified based on its installation position, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 determines the information to be notified by the sensor based on the installation position of the sensor and the information in the vicinity of the sensor acquired by the sensor.
  • since the information processing device 100 determines the information to be notified by the sensor based on the installation position of the sensor and the information in the vicinity of the sensor acquired by the sensor, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 uses, as the sensor, a surveillance camera capable of photographing the surrounding situation and of notifying a moving body of predetermined information, and determines the information to be notified by the surveillance camera based on the installation position of the surveillance camera.
  • since the information processing device 100 uses, as the sensor, a surveillance camera capable of photographing the surrounding situation and of notifying the moving body of predetermined information, and determines the information to be notified by the surveillance camera based on its installation position, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 determines, for each surveillance camera, the information to be notified to surrounding moving bodies based on the distance from the position where an accident is predicted to occur, identified from an image taken by any of the surveillance cameras, to that surveillance camera.
  • thereby, each surveillance camera can notify surrounding moving bodies of information corresponding to the distance from the position where an accident is predicted to occur, identified from an image taken by any of the surveillance cameras, to that surveillance camera.
  • the determination unit 133 determines the information to be notified by the surveillance camera based on the installation position of the surveillance camera, which, as the sensor, notifies the photographed moving body of predetermined information.
  • since the information processing device 100 determines the information to be notified by the surveillance camera based on the installation position of the surveillance camera, which, as the sensor, notifies the photographed moving body of predetermined information, it can notify the moving body of appropriate notification content according to the surrounding situation of the moving body.
  • the determination unit 133 uses, as the sensor, a sensor that, when a predetermined event is detected, notifies information to a moving body located in an area corresponding to the event, and determines the information to be notified by the sensor based on the installation position of that sensor.
  • since the information processing device 100 uses, as the sensor, a sensor that, when a predetermined event is detected, notifies information to a moving body located in an area corresponding to the event, and determines the information to be notified based on its installation position, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 uses, as the information to be notified by the sensor, information that differs for each area where the moving body is located.
  • since the information processing device 100 uses, as the information to be notified by the sensor, information that differs for each area where the moving body is located, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the determination unit 133 uses, as the sensor, a sensor that notifies information to a moving body that is moving, within an area corresponding to the event, in a direction corresponding to the event, and determines the information to be notified by the sensor based on the installation position of that sensor.
  • since the information processing apparatus 100 uses, as the sensor, a sensor that notifies information to a moving body that is moving, within an area corresponding to the event, in a direction corresponding to the event, and determines the information to be notified based on its installation position, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the sensor according to the embodiment has a detection unit 242, a reception unit 241, and a notification unit 244.
  • the detection unit 242 detects predetermined information.
  • the reception unit 241 receives the information to be notified from the information processing device 100 that determines the information to be notified to the sensor according to the installation position of the sensor.
  • the notification unit 244 notifies the surrounding mobile body of the information received by the reception unit 241.
  • since the sensor according to the embodiment receives the information to be notified from the information processing device 100, which determines that information according to the installation position of the sensor, and notifies the surrounding moving body of the received information, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • when a predetermined event is detected, the notification unit 244 notifies the moving body located in the area corresponding to the event of the information received from the information processing device 100.
  • since the sensor according to the embodiment notifies the moving body located in the area corresponding to the event of the information received from the information processing device 100, it can notify the moving body of appropriate notification content according to the surrounding situation of the moving body.
  • the notification unit 244 notifies different information for each area where the moving body is located.
  • since the sensor according to the embodiment notifies different information for each area where the moving body is located, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the notification unit 244 notifies the information to the moving body moving in the direction corresponding to the event within the area corresponding to the event.
  • since the sensor according to the embodiment notifies the moving body moving in the direction corresponding to the event within the area corresponding to the event, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the notification unit 244 notifies different information for each direction in which the moving body is moving.
  • since the sensor according to the embodiment notifies different information for each direction in which the moving body is moving, it can notify the moving body of appropriate notification content according to the situation around the moving body.
  • the notification unit 244 notifies the information to a moving body that is located in the area corresponding to the event and whose movement mode satisfies a predetermined condition.
  • since the sensor according to the embodiment notifies the information to a moving body that is located in the area corresponding to the event and whose movement mode satisfies a predetermined condition, it can notify the moving body of appropriate notification content according to the situation around the moving body.
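The notification conditions in the bullets above (located in the event area, moving in the direction corresponding to the event, and movement mode satisfying a predetermined condition) can be combined into a single filter. The sketch below is illustrative; the field names and the speed-based movement-mode condition are assumptions, not taken from the embodiment.

```python
# Illustrative sketch of the notification unit 244's filtering: notify only
# moving bodies in the event's area, heading in the event's direction, and
# whose movement mode satisfies a predetermined condition.
# All field names and the speed condition are assumptions.

def should_notify(vehicle: dict, event: dict,
                  speed_condition_kmh: float = 10.0) -> bool:
    in_area = vehicle["area"] == event["area"]                  # area condition
    toward_event = vehicle["heading"] == event["approach_heading"]  # direction
    mode_ok = vehicle["speed_kmh"] >= speed_condition_kmh       # movement mode
    return in_area and toward_event and mode_ok

event = {"area": "A1", "approach_heading": "north"}
print(should_notify({"area": "A1", "heading": "north", "speed_kmh": 40.0}, event))  # True
print(should_notify({"area": "A1", "heading": "south", "speed_kmh": 40.0}, event))  # False
```

Different message contents per area or per direction (as in the bullets above) would then be selected for the vehicles that pass this filter.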
  • the traffic risk reduction system 1 includes an information processing device 100 and a sensor.
  • the determination unit 133 of the information processing device 100 determines the information that the sensor notifies the moving body in the vicinity of the sensor based on the installation position of the sensor.
  • the providing unit 134 of the information processing device 100 provides the information determined by the determining unit 133 to the sensor.
  • the detection unit 242 of the sensor detects predetermined information.
  • the reception unit 241 of the sensor receives information from the information processing device 100.
  • the notification unit 244 of the sensor notifies the surrounding mobile body of the information received by the reception unit 241.
  • the traffic risk reduction system 1 can notify the moving body of appropriate notification contents according to the situation around the moving body.
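One minimal, non-authoritative sketch of the overall flow of the traffic risk reduction system 1 summarized above (determination unit 133 → providing unit 134 → reception unit 241 → notification unit 244). Class names, the distance threshold, and the message strings are assumptions for exposition only.

```python
# End-to-end sketch: the information processing device decides per-sensor
# information from the sensor's installation position, provides it to the
# sensor, and the sensor notifies nearby moving bodies.
# Threshold and messages are hypothetical.

class InformationProcessingDevice:
    def decide(self, sensor_position, danger_position):
        # determination unit 133: decide from the installation position
        dx = sensor_position[0] - danger_position[0]
        dy = sensor_position[1] - danger_position[1]
        near = (dx * dx + dy * dy) ** 0.5 < 100.0
        return "slow down: danger nearby" if near else "danger reported in area"

    def provide(self, sensor, danger_position):
        # providing unit 134: provide the decided information to the sensor
        sensor.receive(self.decide(sensor.position, danger_position))

class Sensor:
    def __init__(self, position):
        self.position = position
        self.received = None

    def receive(self, info):   # reception unit 241
        self.received = info

    def notify_nearby(self):   # notification unit 244
        return self.received
```

Two sensors at different distances from the danger position thus end up notifying different content, which is the core effect claimed for the system.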

Abstract

An information processing device according to the present application has a determination unit and a provision unit. The determination unit determines information reported by a sensor to a moving body in the vicinity of the sensor, on the basis of an installation position of the sensor at which information can be reported to the moving body. The provision unit provides the information determined by the determination unit to the sensor.

Description

交通リスク低減システム、情報処理装置及びセンサ Traffic risk reduction system, information processing device, and sensor
 本発明は、交通リスク低減システム、情報処理装置及びセンサに関する。 The present invention relates to a traffic risk reduction system, an information processing device and a sensor.
 近年、自動運転技術の進展により、自動運転車が普及しつつある。例えば、危険度に応じて動体フレーム及び動体予測バーの色を変化させる技術が知られている。 In recent years, with the development of autonomous driving technology, autonomous vehicles are becoming widespread. For example, a technique for changing the color of a moving body frame and a moving body prediction bar according to the degree of danger is known.
国際公開第2018/088224号公報 International Publication No. 2018/088224
 しかしながら、上記の従来技術では、移動体の周囲の状況に応じた適切な通知内容を移動体に対して通知することができるとは限らない。例えば、上記の従来技術では、危険度に応じて動体フレーム及び動体予測バーの色を変化させているに過ぎず、事故を回避するために適切な情報等を移動体に対して通知することができるとは限らない。 However, with the above-mentioned conventional technology, it is not always possible to notify the moving body of appropriate notification contents according to the situation around the moving body. For example, the above-mentioned prior art merely changes the colors of the moving body frame and the moving body prediction bar according to the degree of danger, and cannot always notify the moving body of information appropriate for avoiding an accident.
 本願は、上記に鑑みてなされたものであって、移動体の周囲の状況に応じた適切な通知内容を移動体に対して通知することができる交通リスク低減システム、情報処理装置及びセンサを提供することを目的とする。 The present application has been made in view of the above, and an object thereof is to provide a traffic risk reduction system, an information processing device, and a sensor capable of notifying a moving body of appropriate notification contents according to the surrounding conditions of the moving body.
 本願に係る情報処理装置は、移動体に対して情報を通知可能なセンサの設置位置に基づいて、当該センサが当該センサの近傍の移動体に通知する情報を決定する決定部と、前記決定部により決定された情報を、前記センサに対して提供する提供部とを有することを特徴とする。 The information processing device according to the present application is characterized by having a determination unit that determines, based on the installation position of a sensor capable of notifying information to a moving body, the information that the sensor notifies to a moving body in the vicinity of the sensor, and a providing unit that provides the information determined by the determination unit to the sensor.
 実施形態の一態様によれば、移動体の周囲の状況に応じた適切な通知内容を移動体に対して通知することができるという効果を奏する。 According to one aspect of the embodiment, there is an effect that an appropriate notification content according to the situation around the moving body can be notified to the moving body.
図1は、実施形態に係る情報処理装置100が実行する処理の一例を示す図である。FIG. 1 is a diagram showing an example of processing executed by the information processing apparatus 100 according to the embodiment. 図2は、実施形態に係る情報処理装置の構成例を示す図である。FIG. 2 is a diagram showing a configuration example of the information processing device according to the embodiment. 図3は、実施形態に係る位置情報記憶部の一例を示す図である。FIG. 3 is a diagram showing an example of the position information storage unit according to the embodiment. 図4は、実施形態に係る監視カメラ画像記憶部の一例を示す図である。FIG. 4 is a diagram showing an example of the surveillance camera image storage unit according to the embodiment. 図5は、実施形態に係る設置位置記憶部の一例を示す図である。FIG. 5 is a diagram showing an example of the installation position storage unit according to the embodiment. 図6は、実施形態に係る監視カメラの構成例を示す図である。FIG. 6 is a diagram showing a configuration example of the surveillance camera according to the embodiment. 図7は、実施形態に係る宛先記憶部の一例を示す図である。FIG. 7 is a diagram showing an example of a destination storage unit according to the embodiment. 図8は、実施形態に係る情報処理装置が実行する特定処理の流れの一例を示すフローチャートである。FIG. 8 is a flowchart showing an example of a flow of specific processing executed by the information processing apparatus according to the embodiment. 図9は、情報処理装置の機能を実現するコンピュータの一例を示すハードウェア構成図である。FIG. 9 is a hardware configuration diagram showing an example of a computer that realizes the functions of the information processing device.
 以下に、本願に係る交通リスク低減システム、情報処理装置及びセンサを実施するための形態(以下、「実施形態」と呼ぶ)について図面を参照しつつ詳細に説明する。なお、この実施形態により本願に係る交通リスク低減システム、情報処理装置及びセンサが限定されるものではない。また、各実施形態は、処理内容を矛盾させない範囲で適宜組み合わせることが可能である。また、以下の各実施形態において同一の部位には同一の符号を付し、重複する説明は省略される。 Hereinafter, modes for implementing the traffic risk reduction system, the information processing device, and the sensor according to the present application (hereinafter referred to as "embodiments") will be described in detail with reference to the drawings. The traffic risk reduction system, information processing device, and sensor according to the present application are not limited by these embodiments. In addition, the embodiments can be appropriately combined as long as the processing contents do not contradict each other. Further, in each of the following embodiments, the same parts are designated by the same reference numerals, and duplicate description is omitted.
〔1.着想背景〕
 近年の自動運転技術の進展により、自動運転車が普及しつつあるが、交通システムの整備が十分とは言えない点がある。例えば、自動運転車が普及しつつある過程では、自動運転車と、運転手によって運転される自動車である手動自動車とが、混在する過程が存在する。この場合、自動運転車のみが交通ルールに従って走行したとしても、手動自動車に起因する交通事故等を防ぐことが難しい。このため、手動自動車の走行を考慮した上で、自動運転車を高精度に制御する必要があった。
[1. Idea background]
With the development of autonomous driving technology in recent years, autonomous vehicles are becoming widespread, but there is a point that the maintenance of the transportation system is not sufficient. For example, in the process in which autonomous vehicles are becoming widespread, there is a process in which autonomous vehicles and manual vehicles, which are vehicles driven by drivers, coexist. In this case, it is difficult to prevent a traffic accident caused by a manual vehicle even if only the autonomous vehicle travels according to the traffic rules. Therefore, it is necessary to control the self-driving car with high accuracy in consideration of the running of the manual car.
 しかしながら、自動運転車の走行技術が十分とは限らない。例えば、自動運転車は、カメラ等の撮影装置を有し、自動運転車の周辺を撮影することで、周辺の環境や、状況に基づいて、走行を制御する。この場合、自動運転車は、所定の速度以上で走行した場合に、側面の環境や、状況を認識できない場合がある。例えば、自動運転車は、高速で走行した場合に、側面を撮影するカメラの映像が流れてしまい、他の車両や、物体を認識できない場合がある。この場合、自動運転車は、側面からの情報を取得することが困難になるため、側面からの交通事故等のリスクが高くなる。このように、自動運転車の走行技術に関して、側面からの情報を如何に取得するかが課題であった。 However, the driving technology of self-driving cars is not always sufficient. For example, an autonomous vehicle has a photographing device such as a camera, and by photographing the surroundings of the autonomous vehicle, the traveling is controlled based on the surrounding environment and the situation. In this case, the self-driving car may not be able to recognize the side environment and the situation when traveling at a predetermined speed or higher. For example, when the self-driving car travels at a high speed, the image of the camera that captures the side surface may flow, and other vehicles or objects may not be recognized. In this case, it becomes difficult for the self-driving car to acquire information from the side surface, so that the risk of a traffic accident from the side surface increases. In this way, the issue was how to obtain information from the side regarding the driving technology of the autonomous driving vehicle.
 そこで、本実施形態では、自動運転車の走行を支援する技術を提供する。本実施形態では、センサ等の外部装置等から取得される情報を活用することで、自動運転車の側面からの情報を補完する。これにより、本実施形態は、自動運転車に対して高精度に制御を行うことを可能とする。 Therefore, in the present embodiment, a technology for supporting the running of an autonomous vehicle is provided. In the present embodiment, the information from the side of the autonomous driving vehicle is complemented by utilizing the information acquired from an external device such as a sensor. As a result, the present embodiment makes it possible to control the autonomous driving vehicle with high accuracy.
〔2.情報処理装置が示す処理の一例〕
 図1を用いて、実施形態に係る情報処理装置100が実行する処理の一例について説明する。図1は、実施形態に係る情報処理装置100が実行する処理の一例を示す図である。
[2. An example of processing indicated by an information processing device]
An example of the processing executed by the information processing apparatus 100 according to the embodiment will be described with reference to FIG. FIG. 1 is a diagram showing an example of processing executed by the information processing apparatus 100 according to the embodiment.
 図1に示すように、交通リスク低減システム1は、自動車CA1、11、12(以下では、自動車10と総称する場合がある)と、複数の監視カメラ21~24(以下では、監視カメラ20と総称する場合がある)と、情報処理装置100とを含む。自動車10、監視カメラ20及び情報処理装置100は、図示しない無線ネットワークを介して接続される。なお、図1に示す交通リスク低減システム1には、任意の数の自動車10や、任意の数の監視カメラ20が含まれていてもよい。また、以下に説明する処理は、複数台の情報処理装置100が同時並行的に、若しくは、協調して各処理を実行することで実現されてもよい。 As shown in FIG. 1, the traffic risk reduction system 1 includes automobiles CA1, 11, and 12 (hereinafter sometimes collectively referred to as the automobile 10), a plurality of surveillance cameras 21 to 24 (hereinafter sometimes collectively referred to as the surveillance camera 20), and the information processing apparatus 100. The automobile 10, the surveillance camera 20, and the information processing device 100 are connected via a wireless network (not shown). The traffic risk reduction system 1 shown in FIG. 1 may include an arbitrary number of automobiles 10 and an arbitrary number of surveillance cameras 20. Further, the processes described below may be realized by a plurality of information processing devices 100 executing each process in parallel or in cooperation.
The automobile 10 according to the embodiment is a connected vehicle that communicates with various other devices via the wireless network. The automobile 10 performs this communication through an in-vehicle device, such as a communication module, incorporated in the automobile 10. For example, the in-vehicle device incorporated in each automobile 10 has a function of detecting its current position, that is, the current position of the automobile, at predetermined intervals (for example, once or more per second) using a GPS (Global Positioning System) sensor or the like; more specific examples of such a device include a car navigation system and a drive recorder. The in-vehicle device then generates position information indicating the detected current position and transmits, via the wireless network or the like, that position information together with an automobile ID (identifier) for identifying the automobile 10 in which the in-vehicle device is mounted to a position information providing device PIS (see, for example, FIG. 2) that manages position information.
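As a purely illustrative sketch, the periodic position report described above might be serialized as follows. The field names, the JSON encoding, and the `make_position_report` helper are assumptions introduced here for illustration; the embodiment does not specify a message format.

```python
# Hypothetical shape of the periodic position report an in-vehicle
# device sends to the position information providing device PIS.
# Field names and the encoding are illustrative assumptions.
import json
import time

def make_position_report(vehicle_id: str, lat: float, lon: float) -> str:
    """Serialize one position report (automobile ID + current position
    + timestamp) as it might be sent over the wireless network."""
    return json.dumps({
        "vehicle_id": vehicle_id,              # identifies the automobile 10
        "position": {"lat": lat, "lon": lon},  # GPS fix (latitude/longitude)
        "timestamp": time.time(),              # time of the fix
    })

report = make_position_report("MO1", 35.6595, 139.7005)
```

A report like this would be produced at each detection interval (for example, once per second) and accumulated in the position information storage unit PIM described later.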
The surveillance camera 20 according to the embodiment is a surveillance camera capable of communicating with various other devices via the wireless network. For example, the surveillance camera 20 is fixedly installed on a traffic signal or on a pole to which a traffic signal is attached. The surveillance camera 20 captures images including scenery and the like taken from a predetermined angle, either continuously or at predetermined time intervals (for example, 30 times per second). The surveillance camera 20 then transmits the captured images, via the wireless network or the like, to a surveillance camera image providing device FIS that manages the images captured by the surveillance cameras.
Here, the surveillance camera 20 is a smart surveillance camera that not only functions as a camera but also performs various kinds of information processing and various kinds of communication, either over a network or using proximity communication technology such as wireless LAN or Bluetooth (registered trademark). Note that the surveillance camera 20 may be composed of, for example, a surveillance camera that captures still images or moving images and a surveillance camera server that performs the information processing and the various kinds of communication over the network. Such a surveillance camera 20 has a function of distributing various kinds of information either directly, using proximity communication technology, or indirectly, over the network. For example, the surveillance camera 20 is assumed to have a function of notifying various kinds of information to automobiles 10 located in its vicinity.
The surveillance camera image providing device FIS according to the embodiment is realized by, for example, a server device or a cloud system. Specifically, the surveillance camera image providing device FIS continuously acquires the images captured by the surveillance cameras 20 (hereinafter referred to as surveillance camera images). Upon receiving an image acquisition request from the information processing device 100, the surveillance camera image providing device FIS provides the surveillance camera images to the information processing device 100.
The information processing device 100 according to the embodiment is realized by, for example, a server device or a cloud system. An example of the specific processing performed by the information processing device 100 will now be described step by step with reference to FIG. 1.
In the example of FIG. 1, an automobile CA1 that is not a connected vehicle and is traveling dangerously (hereinafter sometimes referred to as the non-connected vehicle CA1) and the automobiles 11 and 12 are traveling around an intersection. It is assumed that a high likelihood of the non-connected vehicle CA1 and the automobile 12 being involved in a traffic accident has been detected from the surveillance camera images captured by the surveillance cameras 20. For example, in the example shown in FIG. 1, the automobile CA1 is weaving at a speed exceeding a predetermined speed and may enter the intersection SP1 by running a red light or the like, with the result that the automobile 12 and the automobile CA1 could collide.
In this case, one conceivable approach is for the surveillance cameras 20 to alert the automobiles 11 and 12 about the traffic accident. However, if an identical notification such as simply "Please stop" were given to both automobiles 11 and 12 traveling near the intersection, not only the automobile 12 but also the automobile 11 traveling behind the automobile 12 would stop. In such a case, yet another automobile traveling further behind the automobile 11 could collide with the automobile 11. The information processing device 100 therefore determines, according to the installation position of each surveillance camera 20, the information that each surveillance camera 20 is to notify to the automobiles 10, and provides the determined information to each surveillance camera 20.
First, as shown in FIG. 1, each surveillance camera 20 transmits its captured surveillance camera images to the surveillance camera image providing device FIS (step S1). The information processing device 100 then acquires the surveillance camera images captured by each surveillance camera 20 (step S2). The information processing device 100 then analyzes the surveillance camera images and estimates an area where a traffic accident may occur as a danger area (step S3).
For example, when the surveillance camera image captured by the surveillance camera 24 shows the automobile CA1 traveling at a speed far exceeding the legal limit, and the surveillance camera image captured by the surveillance camera 23 shows the automobile 12 located in the blind spot of the automobile CA1, the information processing device 100 estimates that there is a high possibility that the automobile CA1 and the automobile 12 will cause a traffic accident at the intersection SP1. The information processing device 100 then designates the intersection SP1 as the danger area SP1. Note that various estimation techniques for estimating the possibility of a future traffic accident from various images can be adopted for this estimation processing.
When it is estimated in this way that a traffic accident is highly likely to occur at the intersection SP1, the information processing device 100 determines the information that each surveillance camera 20 is to notify, based on the distance from the danger area SP1 to each surveillance camera 20 (step S4). More specifically, as the information to be notified by each surveillance camera 20, the information processing device 100 generates any one of, or a combination of, information indicating the danger area SP1, the time until the accident occurs, and information indicating an action for avoiding the accident.
For example, as the information to be notified by the surveillance cameras 22 and 23, the information processing device 100 generates information indicating the danger area SP1 and the time until the accident occurs, such as "A traffic accident may occur at intersection SP1 in X seconds." As the information to be notified by the surveillance camera 23, which is closest to the intersection SP1, the information processing device 100 also generates information indicating an action for avoiding the accident, such as "Please stop." For the surveillance camera 22, which is at some distance from the intersection SP1, the information processing device 100 generates information different from that for the surveillance camera 23, such as "Please slow down."
Here, the information processing device 100 need not generate any information to be notified by the surveillance camera 21, whose distance from the intersection SP1 is greater than that of the surveillance camera 22. Also, for example, for the surveillance camera 22, the information processing device 100 may generate information that does not include the time until the accident occurs, such as "An accident may occur at intersection SP1." That is, according to the distance from the intersection SP1, which is the danger area, to each surveillance camera 20, the information processing device 100 generates, as the information to be notified, a message in which the information indicating the danger area SP1, the time until the accident occurs, and the information indicating an action for avoiding the accident are each modified as appropriate.
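The distance-dependent selection of message content described above can be sketched as follows. The 30 m and 100 m thresholds and the English message wording are illustrative assumptions; the embodiment leaves the actual distances and phrasing as design choices.

```python
from typing import Optional

# Sketch of distance-tiered notification content. The thresholds and
# message strings below are assumptions for illustration only.
def build_notification(distance_m: float,
                       seconds_to_accident: Optional[int]) -> Optional[str]:
    """Return the message a camera at distance_m from the danger area
    should relay to nearby vehicles, or None if the camera is too far
    away (like surveillance camera 21) to warrant any notification."""
    if distance_m > 100.0:
        return None
    if seconds_to_accident is not None:
        prefix = f"Accident risk at intersection SP1 in {seconds_to_accident} s."
    else:
        prefix = "Accident risk at intersection SP1."
    if distance_m <= 30.0:
        return prefix + " Please stop."       # nearest camera (e.g. camera 23)
    return prefix + " Please slow down."      # mid-range camera (e.g. camera 22)
```

For instance, a camera 20 m from the danger area would relay "Accident risk at intersection SP1 in X s. Please stop.", while one 60 m away would relay a deceleration request instead, mirroring the different messages given to the automobiles 12 and 11 above.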
The information processing device 100 then provides each surveillance camera 20 with the information whose content has been determined for that surveillance camera 20 (step S5). Upon receiving such information, each surveillance camera 20 notifies the automobiles 10 traveling nearby of the information received from the information processing device 100.
For example, the surveillance camera 23 notifies the nearby automobile 12 of information such as "An accident may occur at the intersection in X seconds. Please stop," while the surveillance camera 22 notifies the nearby automobile 11 of information such as "An accident may occur at the intersection in X seconds. Please slow down." That is, each surveillance camera 20 notifies the nearby automobiles 10 of different information according to its distance from the danger area.
In the example described above, the surveillance camera 20 notifies surrounding automobiles of the information using various proximity communication technologies, but the embodiment is not limited to this. For example, the surveillance camera 20 may identify the automobile 10 to which the information is to be notified using the various identification techniques described later, and notify the identified automobile 10 of the information.
When the automobile 10 receives various notifications from the surveillance camera 20, it may accept an operation by its operator based on the notified information, or may perform driving control such as automatically decelerating or stopping. For example, the automobile 10 may also travel while controlling its driving operation so as to move to a position away from the danger area SP1.
As described above, the information processing device 100 according to the embodiment provides a solution for the case where, based on the surveillance camera images captured by the surveillance cameras 20, it is desired to alert an automobile 10 that is highly likely to be involved in a traffic accident with the dangerously traveling non-connected vehicle CA1. Conventionally, the destination automobile 10 could not be identified from the surveillance camera images alone. To address this problem, the information processing device 100 estimates, from the images captured by the surveillance cameras 20, a danger area where a traffic accident is likely to occur, and determines the information to be notified by each surveillance camera 20 according to its distance from the estimated danger area. The information processing device 100 then provides each surveillance camera 20 with the information determined for that camera. As a result of such processing, the information processing device 100 can vary the content of the warning about the dangerous vehicle notified to the automobiles 10 according to the distance between the danger area SP1 and each surveillance camera 20.
This allows the automobile 10 to perform various kinds of driving control based on the information notified by the information processing device 100. For example, based on the information indicating the warning about the non-connected vehicle CA1 notified by the information processing device 100, the automobile 10 can perform control such as decelerating to a predetermined speed or below, or coming to a stop by decelerating gradually. Because the automobile 10 thus performs driving control appropriate to the traffic conditions, dangerous situations, and the like, safer travel can be realized. Accordingly, even when the automobile 10 cannot recognize the environment or conditions to its sides, it can travel appropriately based on the information notified by the information processing device 100.
[3. Configuration of the information processing device]
Next, the configuration of the information processing device 100 according to the embodiment will be described with reference to FIG. 2. FIG. 2 is a diagram showing a configuration example of the information processing device 100 according to the embodiment. As shown in FIG. 2, the information processing device 100 includes a communication unit 110, a storage unit 120, and a control unit 130.
(About the communication unit 110)
The communication unit 110 is realized by, for example, a NIC (Network Interface Card). The communication unit 110 transmits and receives information to and from the surveillance cameras 20, the surveillance camera image providing device FIS, and the position information providing device PIS via the wireless network. In the example shown in FIG. 2, the position information providing device PIS has a position information storage unit PIM that holds a history of the position information indicating the positions measured by the in-vehicle device of each automobile 10, and the surveillance camera image providing device FIS has a surveillance camera image storage unit FIM that holds a history of the images captured by each surveillance camera 20. The communication unit 110 may acquire the surveillance camera images registered in the surveillance camera image storage unit FIM, and may acquire the position information registered in the position information storage unit PIM.
In the example shown in FIG. 2, the surveillance camera image storage unit FIM and the position information storage unit PIM are provided outside the information processing device 100, but the embodiment is not limited to this. For example, the information processing device 100 may acquire the contents of the surveillance camera image storage unit FIM and the position information storage unit PIM from the surveillance camera image providing device FIS and the position information providing device PIS, respectively, and store them in the storage unit 120.
(About the position information storage unit PIM)
Next, an example of the information registered in the position information storage unit PIM will be described. FIG. 3 is a diagram showing an example of the position information storage unit according to the embodiment. In the example shown in FIG. 3, the position information storage unit PIM has items such as "mobile ID," "position information," and "time information."
The "mobile ID" is an identifier that identifies an automobile 10. The "position information" is the position information of the automobile 10 associated with the "mobile ID." The "time information" is information about the time at which the position information associated with the "mobile ID" was acquired.
For example, the example shown in FIG. 3 indicates that position information was acquired from the automobile indicated by the mobile ID "MO1" stating that, at the time indicated by the time information "DA1," the automobile was located at the position indicated by the position information "LO1." In the example shown in FIG. 3, the position information is represented by abstract codes such as "LO1," but in practice, information indicating the latitude and longitude measured using GPS or the like is registered. Likewise, the time information is represented by abstract codes such as "DA1," but in practice, numerical values indicating the date and time are registered.
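A minimal in-memory sketch of such a position history, together with a helper that returns each mobile body's most recent fix, might look as follows. The tuple layout and the `latest_positions` helper are assumptions for illustration; the embodiment does not prescribe a storage implementation.

```python
# Illustrative in-memory model of the position information storage
# unit PIM: rows of (mobile ID, position, time), plus a helper that
# returns the newest recorded position for every mobile ID.
from typing import Dict, List, Tuple

Position = Tuple[float, float]     # (latitude, longitude)
Row = Tuple[str, Position, float]  # (mobile ID, position, unix time)

def latest_positions(rows: List[Row]) -> Dict[str, Position]:
    """Return the most recently recorded position per mobile ID."""
    newest: Dict[str, Tuple[float, Position]] = {}
    for mobile_id, pos, t in rows:
        if mobile_id not in newest or t > newest[mobile_id][0]:
            newest[mobile_id] = (t, pos)
    return {mid: pos for mid, (t, pos) in newest.items()}

rows = [("MO1", (35.0, 139.0), 1.0),
        ("MO1", (35.1, 139.1), 2.0),
        ("MO2", (34.9, 138.9), 1.5)]
```

A query such as `latest_positions(rows)` would yield one current position per automobile, which is the kind of lookup the information processing device 100 can perform against the PIM.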
(About the surveillance camera image storage unit FIM)
Next, an example of the information registered in the surveillance camera image storage unit FIM will be described. FIG. 4 is a diagram showing an example of the surveillance camera image storage unit according to the embodiment. In the example shown in FIG. 4, the surveillance camera image storage unit FIM has items such as "camera ID," "image," and "time information."
The "camera ID" is an identifier that identifies a surveillance camera 20. The "image" is an image associated with the "camera ID." The "time information" is information about the time at which the image was captured by the surveillance camera 20 associated with the "camera ID."
For example, in FIG. 4, it is registered in the surveillance camera image storage unit FIM that the image indicated by "FIM1" was captured by the surveillance camera 20 indicated by the camera ID "SC1" at the time indicated by the time information "DA1." In the example shown in FIG. 4, the surveillance camera image and the time information are represented by abstract codes such as "FIM1" and "DA1," but in practice they may be, for example, an image in a specific file format and numerical values indicating the date and time.
(About the storage unit 120)
Returning to FIG. 2, the description continues. The storage unit 120 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or flash memory, or a storage device such as a hard disk or an optical disk. The storage unit 120 has an installation position storage unit 121.
(About the installation position storage unit 121)
The installation position storage unit 121 according to the embodiment stores information indicating the position at which each surveillance camera 20 is installed. FIG. 5 is a diagram showing an example of the information registered in the installation position storage unit 121 according to the embodiment. As shown in FIG. 5, the installation position storage unit 121 has items such as "camera ID" and "installation position information." Here, the "installation position information" is information indicating the installation position of the surveillance camera indicated by the "camera ID."
For example, FIG. 5 indicates that the installation position of the surveillance camera 20 indicated by the camera ID "SC1" is "SCP1." In the example shown in FIG. 5, the installation position of the surveillance camera 20 is represented by an abstract code such as "SCP1," but in practice, the latitude, longitude, and the like indicating the installation position of the surveillance camera 20 are registered.
(About the control unit 130)
Returning to FIG. 2, the description continues. The control unit 130 is a controller realized by, for example, a CPU (Central Processing Unit) or an MPU (Micro Processing Unit) executing various programs stored in a storage device inside the information processing device 100, using a RAM as a work area. The control unit 130 may also be a controller realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
As shown in FIG. 2, the control unit 130 includes an acquisition unit 131, an estimation unit 132, a determination unit 133, and a provision unit 134, and realizes or executes the information processing functions and operations described below. The internal configuration of the control unit 130 is not limited to the configuration shown in FIG. 2, and may be any other configuration that performs the information processing described later. Likewise, the connection relationships among the processing units of the control unit 130 are not limited to those shown in FIG. 2, and may be other connection relationships.
(About the acquisition unit 131)
The acquisition unit 131 acquires various kinds of information. Specifically, the acquisition unit 131 acquires the surveillance camera images captured by the surveillance cameras 20. As a more specific example, the acquisition unit 131 acquires, from the surveillance camera image providing device FIS, the various surveillance camera images registered in the surveillance camera image storage unit FIM.
(About the estimation unit 132)
The estimation unit 132 refers to the surveillance camera images captured by the surveillance cameras and estimates a danger area where an accident is likely to occur. An example of the processing for estimating the danger area is described below.
For example, upon acquiring the surveillance camera images from the surveillance camera image storage unit FIM, the estimation unit 132 detects moving bodies such as the automobiles 10 included in the surveillance camera images. Here, the estimation unit 132 may detect non-connected vehicles in addition to connected vehicles. The estimation unit 132 then generates behavior information indicating the behavior of each detected moving body. For example, the estimation unit 132 estimates which route each moving body is traveling along, and in what manner, according to the installation positions of the surveillance cameras 20. The estimation unit 132 may estimate how a single moving body is traveling based on the identity of the license plates of automobiles captured by a plurality of surveillance cameras, and may estimate the moving speed of a moving body based on the installation positions of the surveillance cameras and the times at which each surveillance camera captured the moving body. Besides these, the estimation unit 132 may estimate the traveling manner of a moving body using any of various image analysis techniques.
The estimation unit 132 then estimates, based on the behavior information of each moving body, a danger area where an accident is likely to occur. For example, the estimation unit 132 designates, as the danger area, an area where the possibility of moving bodies colliding exceeds a predetermined threshold if each moving body continues to move as indicated by its behavior information. The estimation unit 132 also estimates the time until the accident occurs from the behavior information of each moving body. Note that the estimation unit 132 may estimate the colors of traffic signals and the congestion conditions from the surveillance camera images and take the estimated signal colors and congestion information into account when estimating the area where moving bodies are likely to collide.
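One way to realize the extrapolation described above can be sketched as follows, under the simplifying assumption that each moving body continues at constant velocity. The 10-second horizon, 0.1-second time step, and 5 m collision radius are illustrative parameters, not values fixed by the embodiment.

```python
# Illustrative danger-area estimation from behavior information:
# extrapolate each moving body's position assuming constant velocity
# and report the first predicted point where two bodies come closer
# than a threshold. Horizon, step, and radius are assumed values.
import math
from typing import Optional, Tuple

State = Tuple[float, float, float, float]  # (x, y, vx, vy), meters and m/s

def predict_collision(a: State, b: State,
                      horizon_s: float = 10.0,
                      step_s: float = 0.1,
                      radius_m: float = 5.0) -> Optional[Tuple[float, float, float]]:
    """Return (t, x, y) of the first predicted near-collision of a and b
    within the horizon, or None if their paths stay separated."""
    t = 0.0
    while t <= horizon_s:
        ax, ay = a[0] + a[2] * t, a[1] + a[3] * t
        bx, by = b[0] + b[2] * t, b[1] + b[3] * t
        if math.hypot(ax - bx, ay - by) < radius_m:
            return t, (ax + bx) / 2, (ay + by) / 2  # danger area + time to accident
        t += step_s
    return None
```

The returned point corresponds to the danger area and the returned time to the estimated time until the accident; a full implementation would also weigh signal colors and congestion as noted above.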
Furthermore, the estimation unit 132 may train a model for estimating the danger area by having a predetermined model learn the characteristics of positive and negative examples, where the positive examples are, for instance, the behavior information of moving bodies that collided in a certain area together with the surrounding congestion conditions and signal colors leading up to the collision, and the negative examples are the behavior information of moving bodies that did not collide together with the surrounding congestion conditions, signal colors, and the like. Using such a model, the estimation unit 132 may then estimate whether moving bodies are likely to collide in each area. Besides these, the estimation unit 132 may estimate the area where an accident is likely to occur using any of various techniques.
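As a toy sketch of this positive/negative-example learning, a minimal logistic-regression model could be fitted to hand-made behavior features. The features (speed relative to the legal limit, a 0/1 red-light flag), the training data, and the choice of logistic regression are all illustrative assumptions; the embodiment permits any suitable model.

```python
# Toy positive/negative-example learning: logistic regression by
# gradient descent on two assumed behavior features. Collisions are
# positive examples (label 1); safe passes are negative (label 0).
import math
from typing import List, Tuple

def train(samples: List[Tuple[List[float], int]],
          lr: float = 0.1, epochs: int = 2000) -> List[float]:
    """Fit weights [bias, w1, w2] to the labeled behavior samples."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, y in samples:
            p = 1.0 / (1.0 + math.exp(-(w[0] + w[1] * x[0] + w[2] * x[1])))
            g = p - y                     # gradient of the log loss
            w[0] -= lr * g
            w[1] -= lr * g * x[0]
            w[2] -= lr * g * x[1]
    return w

def risk(w: List[float], x: List[float]) -> float:
    """Predicted probability that behavior x leads to a collision."""
    return 1.0 / (1.0 + math.exp(-(w[0] + w[1] * x[0] + w[2] * x[1])))

# (speed / legal limit, ran-red-light flag) -> collided?
data = [([1.6, 1], 1), ([1.4, 1], 1), ([0.5, 0], 0), ([0.6, 0], 0)]
weights = train(data)
```

In practice the feature vector would also encode the congestion conditions and signal colors mentioned above, and the per-area collision likelihood would be compared against the predetermined threshold.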
(About the determination unit 133)
When the estimation unit 132 has estimated a danger area, the determination unit 133 determines the information to be notified by each surveillance camera 20 based on the distance from the danger area to each surveillance camera 20. For example, the determination unit 133 refers to the installation position storage unit 121 and calculates the distance between the danger area estimated by the estimation unit 132 and each surveillance camera 20. The determination unit 133 then selects, as information provision targets, the surveillance cameras 20 whose calculated distances fall within a predetermined range. For example, the determination unit 133 may target surveillance cameras within a radius of 50 meters of the danger area. The determination unit 133 may also select the surveillance cameras to be provided with information according to the average speed or moving direction of vehicles at each surveillance camera 20's installation position.
Then, the determination unit 133 determines the information each targeted surveillance camera 20 is to give notice of, according to the distance between that surveillance camera 20 and the danger area. For example, the determination unit 133 generates information indicating the danger area, the time remaining until an accident may occur, information for causing the automobile 10 to avoid the accident, and so on, each varied according to the distance between the surveillance camera 20 and the danger area. For example, the determination unit 133 may vary the "information for avoiding an accident", such as "Please stop", "Please decelerate", or "Please be careful", according to the distance from the danger area. The determination unit 133 may likewise vary the information indicating the danger area and the time remaining until an accident may occur according to the distance from the danger area.
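A hypothetical sketch of this distance-dependent determination follows, assuming camera installation positions are held as latitude and longitude (as in the installation position storage unit 121). The 50-meter radius matches the example above, while the band boundaries, function names, and message strings are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_notifications(danger_lat, danger_lon, cameras, radius_m=50.0):
    """cameras: camera_id -> (lat, lon). Returns camera_id -> message for every
    camera within radius_m of the danger area, escalating the message as the
    camera gets closer to the danger area."""
    notifications = {}
    for cam_id, (lat, lon) in cameras.items():
        d = haversine_m(danger_lat, danger_lon, lat, lon)
        if d > radius_m:
            continue  # outside the information-providing range
        if d < 15.0:
            notifications[cam_id] = "Please stop"
        elif d < 35.0:
            notifications[cam_id] = "Please decelerate"
        else:
            notifications[cam_id] = "Please be careful"
    return notifications
```

The providing unit would then deliver each entry of the returned mapping to the corresponding surveillance camera.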
(About the providing unit 134)
The providing unit 134 provides the information determined by the determination unit 133 to each surveillance camera 20. For example, the providing unit 134 provides, to each surveillance camera 20, the information the determination unit 133 determined for that surveillance camera 20.
[4. Configuration of the surveillance camera]
Next, the configuration of the surveillance camera 20 according to the embodiment will be described with reference to FIG. 6. FIG. 6 is a diagram showing a configuration example of the surveillance camera according to the embodiment. As shown in FIG. 6, the surveillance camera 20 includes a communication unit 210, a storage unit 220, a camera 230, and a control unit 240.
(About the communication unit 210)
The communication unit 210 is realized by, for example, a NIC or the like. The communication unit 210 transmits and receives information to and from the information processing device 100, the surveillance camera image providing device FIS, and the position information providing device PIS via a wireless network. For example, the communication unit 210 transmits surveillance camera images captured by the surveillance camera 20 to the surveillance camera image providing device FIS. The communication unit 210 also acquires, from the position information providing device PIS, position information indicating the position of each automobile.
(About the storage unit 220)
The storage unit 220 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk. In the example shown in FIG. 6, the storage unit 220 has a destination storage unit 221.
(About the destination storage unit 221)
Here, FIG. 7 shows an example of the destination storage unit 221 according to the embodiment. FIG. 7 is a diagram showing an example of the destination storage unit according to the embodiment. In the example shown in FIG. 7, the destination storage unit 221 has items such as "automobile ID" and "destination". The "destination" is a destination associated with an automobile 10, such as an e-mail address.
For example, FIG. 7 shows the destination storage unit 221 indicating that the destination of the automobile identified by the automobile ID "MO1" is "AD1". Although destinations are represented in FIG. 7 by abstract codes such as "AD1", in practice various kinds of information identifying a notification destination, such as an e-mail address or an IP address, are registered.
(About the camera 230)
Returning to FIG. 6, the description continues. The camera 230 is an imaging device for capturing images. For example, the camera 230 is composed of an image sensor such as a CCD (charge-coupled device) sensor or a CMOS (complementary metal-oxide-semiconductor) sensor. The camera 230 may be installed so as to photograph a predetermined area at a predetermined angle, or may, for example, capture images while moving its gazing point as appropriate. In the example shown in FIG. 6, the camera 230 is mounted in the surveillance camera 20; however, the surveillance camera 20 may instead be composed of, for example, a surveillance camera that operates as the camera 230 and a surveillance camera server that operates as the communication unit 210, the storage unit 220, and the control unit 240.
(About the control unit 240)
The control unit 240 is a controller realized by, for example, a CPU, an MPU, or the like executing various programs stored in a storage device inside the surveillance camera 20, using a RAM as a work area. Alternatively, the control unit 240 is a controller realized by an integrated circuit such as an ASIC or FPGA.
As shown in FIG. 6, the control unit 240 has a reception unit 241, a detection unit 242, an identification unit 243, and a notification unit 244, and realizes or executes the information processing functions and operations described below. The internal configuration of the control unit 240 is not limited to that shown in FIG. 6 and may be any other configuration that performs the information processing described later. Likewise, the connection relationships among the processing units of the control unit 240 are not limited to those shown in FIG. 6 and may be other connection relationships.
(About the reception unit 241)
The reception unit 241 receives the information determined by the information processing device 100 according to the installation position of the surveillance camera 20, that is, the information the surveillance camera 20 is to give notice of. For example, the reception unit 241 receives information such as "An accident may occur at the intersection in X seconds. Please stop." In the following description, the information received by the reception unit 241 may be collectively referred to as "notification information".
(About the detection unit 242 and the identification unit 243)
The detection unit 242 detects automobiles to which information is to be notified. For example, the detection unit 242 captures surveillance camera images using the camera 230. The detection unit 242 then transmits the captured surveillance camera images to the surveillance camera image providing device FIS.
Further, when the reception unit 241 has received notification information, the detection unit 242 analyzes the surveillance camera images and detects the automobiles to be notified. For example, the detection unit 242 may detect an automobile captured by the surveillance camera as a notification target. As another example, the detection unit 242 may acquire information indicating the danger area from the information processing device 100 and detect an automobile moving in the direction of the acquired danger area as a notification target.
Here, the surveillance camera 20 may broadcast the notification information to moving bodies located nearby. However, when a moving body is far away, it may fail to receive the notification information even if the notification information is broadcast. The surveillance camera 20 therefore uses the identification unit 243 to identify the destination of each automobile to be notified of the notification information.
A technique for identifying such destinations is described below. In the following description, an automobile 10 that has been captured by the surveillance camera 20 and is to be notified is referred to as a target automobile. For example, the identification unit 243 estimates the position of the target automobile using the surveillance camera 20. The identification unit 243 then identifies the destination of the target automobile by associating the target automobile with the automobile 10 from which position information indicating a position corresponding to the estimated position was acquired.
For example, suppose that the longitudes and latitudes corresponding to the four corners, the center, and so on of the angle of view of the surveillance camera 20 are known. In such a case, if the surveillance camera image captured by the surveillance camera 20 is divided into a grid of sections, each section corresponds to its own range of latitude and longitude. As a result, an object photographed in the surveillance camera image can be presumed to be located within the latitude and longitude range corresponding to the section of the image that contains its image.
The identification unit 243 therefore identifies the section of the surveillance camera image that contains the image of the target automobile and estimates that the target automobile is located within the latitude and longitude range corresponding to that section. The information processing device 100 then identifies position information indicating a position within the latitude and longitude range corresponding to the identified section, and associates the target automobile with the automobile 10 equipped with the on-board device that transmitted the identified position information. The identification unit 243 then uses the result of this association to identify the destination for notifying the target automobile.
For example, the identification unit 243 estimates, from the sections of the surveillance camera image, the latitude and longitude range in which the target automobile is located. The identification unit 243 then identifies, from among the position information registered in the position information storage unit PIM, the position information falling within the estimated latitude and longitude range. Note that the identification unit 243 may restrict itself to position information acquired within a predetermined period (for example, a few seconds) of the processing time. The identification unit 243 then acquires from the position information storage unit PIM the automobile ID associated with the identified position information (that is, the automobile ID of the automobile 10 that transmitted the position information).
Subsequently, the identification unit 243 refers to the destination storage unit 221 and identifies the destination associated with the acquired automobile ID. The identification unit 243 then notifies the notification unit 244 of the identified destination.
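The grid-based destination identification described above can be sketched as follows, assuming a frame whose north-west and south-east corners have known coordinates and a linear pixel-to-coordinate mapping. The grid size, function names, and data shapes are illustrative assumptions.

```python
def cell_bounds(px, py, width, height, nw, se, cols=10, rows=10):
    """Map a pixel (px, py) in a width x height frame to the latitude/longitude
    bounding box of its grid section, assuming the frame corners correspond
    linearly to the north-west (nw) and south-east (se) geographic corners."""
    col = min(int(px * cols / width), cols - 1)
    row = min(int(py * rows / height), rows - 1)
    (lat_top, lon_left), (lat_bot, lon_right) = nw, se
    dlat = (lat_bot - lat_top) / rows   # negative: latitude falls going down the frame
    dlon = (lon_right - lon_left) / cols
    lat_a = lat_top + row * dlat
    lat_b = lat_top + (row + 1) * dlat
    return (min(lat_a, lat_b), max(lat_a, lat_b),
            lon_left + col * dlon, lon_left + (col + 1) * dlon)

def match_vehicles(bounds, reports):
    """reports: vehicle_id -> (lat, lon), as registered in the position
    information storage. Returns the IDs whose reported position falls inside
    the estimated section bounds."""
    lat_lo, lat_hi, lon_lo, lon_hi = bounds
    return [vid for vid, (lat, lon) in reports.items()
            if lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi]
```

A destination lookup through a table corresponding to the destination storage unit 221 then resolves each matched automobile ID to its notification destination.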
(About the notification unit 244)
The notification unit 244 notifies moving bodies of the notification information received by the reception unit 241. For example, the notification unit 244 transmits the notification information received by the reception unit 241, via various networks, to the destination received from the identification unit 243. As a result of this processing, the notification unit 244 can transmit the notification information to the automobile 10 captured by the surveillance camera.
[5. Processing procedure]
Next, the procedure of the processing executed by the information processing device 100 according to the embodiment will be described with reference to FIG. 8. FIG. 8 is a flowchart showing an example of the flow of the processing executed by the information processing device 100 according to the embodiment.
As shown in FIG. 8, the information processing device 100 acquires the surveillance camera images (step S101) and estimates the behavior of the moving bodies from the acquired surveillance camera images (step S102). The information processing device 100 then determines whether an area in which a traffic accident is likely to occur, that is, a danger area, exists (step S103). If no danger area exists (step S103: No), the information processing device 100 returns to step S101.
On the other hand, if a danger area exists (step S103: Yes), the information processing device 100 determines, as the information each surveillance camera 20 is to give notice of, information that differs according to the distance from the danger area (step S104). The information processing device 100 then notifies each surveillance camera 20 of the determined information (step S105).
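One iteration of steps S101 to S105 can be sketched as a single function. The callables stand in for the acquisition of images, the estimation unit 132, the determination unit 133, and the providing unit 134; their names and signatures are assumptions made for illustration.

```python
def risk_reduction_cycle(acquire_images, estimate_behavior,
                         find_danger_area, decide_messages, notify_cameras):
    """One pass of the flow in FIG. 8. Returns True when a notification was
    issued, False when no danger area was found (step S103: No)."""
    images = acquire_images()                  # step S101
    behaviors = estimate_behavior(images)      # step S102
    danger_area = find_danger_area(behaviors)  # step S103
    if danger_area is None:
        return False                           # back to step S101
    messages = decide_messages(danger_area)    # step S104: per-camera content
    notify_cameras(messages)                   # step S105
    return True
```

In an actual device this function would run repeatedly, so that a "No" result at step S103 leads straight back to image acquisition.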
[6. Modifications]
The information processing device 100 described above may be implemented in various forms other than the above embodiment. Other embodiments of the information processing device 100 are therefore described below.
[6-1. Determination processing based on information in the vicinity of a surveillance camera]
In the example described above, the information processing device 100 determined the content of the information each surveillance camera 20 gives notice of according to the distance between the danger area SP1 and that surveillance camera 20. However, the embodiment is not limited to this.
For example, the information processing device 100 may determine the content of the information a surveillance camera 20 gives notice of based on information about the surroundings of the surveillance camera 20 in addition to its position information. For example, the information processing device 100 estimates the state of the road located in the vicinity of the surveillance camera 21 from the surveillance camera images captured by the surveillance camera 21. As a concrete example, the information processing device 100 estimates whether congestion has occurred on the road located in the vicinity of the surveillance camera 21, the average traveling speed of automobiles, and the like.
Then, for example, when the average traveling speed of automobiles exceeds a predetermined threshold, or when the number of lanes is at or below a predetermined threshold, the information processing device 100 may determine notification information calling on vehicles to make an emergency stop even if the distance between the surveillance camera 21 and the danger area SP1 is at or above a predetermined distance, so that action to avoid the accident is taken early. Conversely, when congestion has occurred, an automobile is less likely to be involved in the accident, so the information processing device 100 may determine notification information such as "Possible accident beyond the congestion. Please be careful." even if the distance between the surveillance camera 21 and the danger area SP1 is at or below a predetermined distance.
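A hypothetical decision rule reflecting the speed and congestion adjustments just described might look as follows; all thresholds, function names, and message strings are illustrative assumptions, not values from the embodiment.

```python
def adjust_message(distance_m, avg_speed_kmh, congested,
                   base_radius_m=50.0, speed_limit_kmh=60.0):
    """Refine the distance-based decision using road conditions near the
    camera: fast traffic is warned earlier and more strongly, while congested
    traffic gets a milder advisory even close to the danger area."""
    if avg_speed_kmh > speed_limit_kmh and distance_m <= base_radius_m * 2:
        return "Emergency stop: accident risk ahead"   # warn early for fast roads
    if congested:
        return "Possible accident beyond the congestion. Please be careful."
    if distance_m <= base_radius_m:
        return "Accident risk ahead. Please decelerate."
    return None  # this camera issues no notification
```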
Besides the information described above, the information processing device 100 may generate the notification information based on various kinds of information about the surroundings of the surveillance camera 20, such as the number of bicycles and pedestrians located around the surveillance camera 20, the traffic rules around the surveillance camera 20 (for example, the speed limit), whether there are stopped vehicles, whether public transportation such as buses operates there, and the presence and status of stores. The situation around the surveillance camera 21 may be estimated from, for example, the surveillance camera images captured by the surveillance camera 21, but it may also be estimated from surveillance camera images captured by another surveillance camera 20, such as one adjacent to the surveillance camera 21 or located on the opposite side of the road, and it may further be estimated from information acquired by various other sensors, such as a Doppler sensor.
[6-2. Identifying destinations]
In the example described above, the surveillance camera 20 notified the photographed automobile 10 of the notification information. Specifically, the surveillance camera 20 estimated the position of the target automobile from the section of the surveillance camera image containing its image, identified the automobile 10 that transmitted the position information corresponding to the estimated position, and transmitted the notification information using the identified automobile 10's destination as the destination of the target automobile.
However, the embodiment is not limited to this. The surveillance camera 20 may give notice using a temporary destination rather than a permanent destination of the automobile 10 as described above. More specifically, the surveillance camera 20 may give notice using a destination assigned according to the traveling mode of each automobile 10, such as the position and speed at which it travels and its direction of travel. For example, the surveillance camera 20 may give notice of the notification information using the Destination Layer-2 ID of LTE (Long Term Evolution).
The destination identification processing using Destination Layer-2 IDs will now be described. In such identification processing, for example, the roads on which automobiles 10 travel are divided into a plurality of regions, and a different Layer-2 ID is assigned to each region in advance. For example, on the road photographed by the surveillance camera 21, the Layer-2 ID "001" is assigned to the lane heading toward the danger area SP1, and the Layer-2 ID "002" is assigned to the lane heading away from the danger area SP1.
Here, Layer-2 IDs are assigned not only to the road photographed by the surveillance camera 21 and to roads located near the surveillance camera 21, but also to roads the surveillance camera 21 does not photograph (roads in its blind spots) and to roads located far from the surveillance camera 21. That is, in the identification processing using Destination Layer-2 IDs, a different Layer-2 ID is assigned to each combination of the region in which an automobile 10 travels and the direction in which it travels.
When such Layer-2 IDs are assigned, the on-board device of each automobile 10 changes its own destination Layer-2 ID as appropriate according to its traveling position. For example, the on-board device of an automobile 10 traveling in the lane assigned the Layer-2 ID "001" passes to its upper processing layers only the broadcast information addressed to the Layer-2 ID "001", and discards information addressed to other Layer-2 IDs. Likewise, the on-board device of an automobile 10 traveling in the lane assigned the Layer-2 ID "002" passes to its upper processing layers only the broadcast information addressed to the Layer-2 ID "002", and discards information addressed to other Layer-2 IDs.
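The lane-dependent filtering described above can be sketched as follows. This is a deliberately simplified model: real LTE sidelink Layer-2 addressing involves far more than shown here, and the class names, IDs, and message handling are illustrative assumptions.

```python
class OnBoardUnit:
    """Minimal model of an on-board device that filters broadcast frames by
    its current, lane-dependent destination Layer-2 ID."""

    def __init__(self, layer2_id):
        self.layer2_id = layer2_id   # current destination ID for this lane
        self.delivered = []          # messages passed up to higher layers

    def on_lane_change(self, new_layer2_id):
        """Re-derive the destination ID as the traveling position changes."""
        self.layer2_id = new_layer2_id

    def receive(self, dest_id, payload):
        """Keep frames addressed to our current ID; silently drop the rest."""
        if dest_id == self.layer2_id:
            self.delivered.append(payload)

def broadcast(units, dest_id, payload):
    """The camera broadcasts to everyone in range; only units whose current
    Layer-2 ID matches deliver the payload upward."""
    for unit in units:
        unit.receive(dest_id, payload)
```

Under this model, broadcasting to the ID of the lane heading toward the danger area reaches exactly the vehicles traveling toward it.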
Accordingly, when an accident is estimated to occur in the danger area SP1, the surveillance camera 21 identifies the Layer-2 IDs of the vehicles heading toward the danger area SP1. For example, the surveillance camera 21 identifies the Layer-2 ID "001" assigned to the lane heading toward the danger area SP1. In addition, the surveillance camera 21 identifies the Layer-2 IDs assigned to the lanes in which vehicles heading toward the danger area SP1 travel and to the lanes located within a predetermined range of the danger area SP1. The surveillance camera 21 may vary the distance between the lanes to be notified and the danger area SP1 according to the time remaining until the accident occurs.
The surveillance camera 21 then broadcasts the notification information received from the information processing device 100, addressed to the identified Layer-2 IDs. As a result of this processing, the surveillance camera 21 can transmit the notification information to those automobiles 10 that are located within a predetermined range of the danger area SP1 and are heading toward the danger area SP1.
Note that the surveillance camera 21 may identify different destinations according to the content of the event identified from the surveillance camera images. For example, the surveillance camera 21 may select different Layer-2 IDs as destinations depending on whether the automobile CA1 and the automobile 12 are estimated to collide in the danger area SP1 or the automobile 11 and a bicycle are estimated to collide in the danger area SP1. The surveillance camera 21 may also change the destination Layer-2 IDs according to the time remaining until the accident occurs, the traveling speed of the automobiles 10 likely to be involved in the accident, and the like.
The surveillance camera 21 may also detect or estimate an arbitrary event using various image analysis techniques, and may perform the processing described above with the region in which the detected or estimated event occurs treated as the danger area SP1. For example, when a fire has broken out in a building facing a road, the surveillance camera 21 may treat the road faced by the burning building as the danger area SP1 and notify the automobiles 10 traveling toward the danger area SP1 of the notification information.
The surveillance camera 21 may also change the destination Layer-2 IDs according to other information. For example, the on-board device of each automobile 10 changes its Layer-2 ID as appropriate according to various aspects of its traveling mode, such as the size and weight of the automobile, attributes such as the number, gender, and age of the users actually on board, the attributes of the automobile, the weather in the region being traveled, and the presence or absence of surrounding pedestrians. The surveillance camera 21, in turn, may determine the traveling mode of the automobiles 10 to be notified according to the content of the detected event and the region in which the event occurred, and may address the Layer-2 IDs corresponding to the determined traveling mode.
When executing such processing, the surveillance camera 21 need only hold, for example, a list that associates each detectable event content with the Layer-2 IDs corresponding to the traveling modes of the automobiles 10 to be notified. Using such a list, the surveillance camera 21 can identify the temporally varying destinations of the automobiles 10 to be notified according to the content of the detected event (for example, whether an accident or a fire will occur, and the speed and type of the vehicles likely to be involved in the accident).
[6-3. Moving bodies]
Although the above embodiment has been described taking an automobile as an example, it may be applied to any moving body instead of an automobile. For example, the moving body may be a bicycle, a drone, a bus, a train, or the like. The moving body may also be an automobile that has become a connected vehicle through the retrofitting of a communication device, an information processing device, or the like to a non-connected vehicle, or an autonomous vehicle. That is, a connected vehicle need only have a detection function for detecting its current position and a communication function for transmitting position information indicating the detected current position to the information processing device 100; whether the vehicle is capable of autonomous driving is immaterial to the concept.
[6-4. Cameras]
In the above embodiment, a surveillance camera has been described as an example, but the surveillance camera may be any camera. For example, the camera may be a depth camera, an infrared camera, or the like. In addition to a camera that acquires images in which pixels and image regions correspond to real-world positions, the information processing device 100 may use, for example, an infrared sensor or various radio wave sensors. By determining the notification information to be notified from such sensors to each moving body according to the distance between each sensor and the danger area SP1, the information processing device 100 can make the information notified to each moving body appropriate.
In the example described above, the surveillance camera 20 has been described as a smart surveillance camera having various imaging functions as well as functions for performing various information processing and notifying information. Here, the surveillance camera 20 may be realized by, for example, a small server device provided with a camera fixed at the roadside. That is, the process in which "the surveillance camera 20 transmits information" is a concept that also includes transmission of information by the surveillance camera 20, by a small server device linked with the surveillance camera 20, or by a small server having a camera.
In the example described above, the surveillance camera 20 notified the notification information received from the information processing device 100, but the embodiment is not limited to this. For example, the information processing device 100 may estimate the danger area SP1 based on surveillance camera images and notify each surveillance camera 20 of the estimated danger area SP1. In such a case, each surveillance camera 20 may determine the notification information according to the positional relationship between the notified danger area SP1 and the position where the camera itself is installed, and may notify the automobile 10 of the determined notification information. When the surveillance camera 20 has such a function, the control unit 240 of the surveillance camera 20 further includes, for example, the determination unit 133 shown in FIG. 2.
In this way, the surveillance camera 20 may autonomously determine the notification information in cooperation with the information processing device 100 and notify each automobile of the determined information. Further, for example, the surveillance camera 20 may, in the same manner as the information processing device 100, estimate the danger area SP1 from surveillance camera images and notify other surveillance cameras 20 of the estimated danger area SP1. In such a case, the surveillance cameras 20 alone can determine and notify the notification information without requiring the information processing device 100.
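The positional-relationship decision each camera makes can be sketched as follows. This is a minimal sketch under assumed values: the distance thresholds and the notification labels are hypothetical, and the straight-line distance stands in for whatever positional relationship the system actually evaluates.

```python
import math

def decide_notification(camera_pos, danger_pos, near_m=100.0, far_m=500.0):
    """Hypothetical policy: choose notification content from the distance
    between the camera's installed position and the notified danger area
    SP1 -- the closer the camera, the stronger the notification."""
    d = math.dist(camera_pos, danger_pos)  # straight-line distance in meters
    if d <= near_m:
        return "evade"   # e.g. content urging an avoidance maneuver
    if d <= far_m:
        return "warn"    # e.g. content alerting the driver
    return "inform"      # e.g. advisory content only
```

Each surveillance camera 20 would run such a decision locally against the danger area SP1 it was notified of, which is what allows the cameras to operate without the information processing device 100.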
[6-5. Notification Information According to Positional Relationships]
In the example described above, the information processing device 100 and the surveillance cameras 20 determined the notification information to be notified from a surveillance camera 20 according to the distance between the danger area SP1 and that surveillance camera 20. However, the embodiment is not limited to this.
For example, the information processing device 100 (and likewise the surveillance camera 20 hereinafter) may determine the notification information according to the number of intersections existing between the position in front of the surveillance camera 20 and the danger area SP1, the number of left or right turns at intersections along that route, the average travel time from the position in front of the surveillance camera 20 to the danger area SP1, the number of crosswalks along that route, and the like. The information processing device 100 may also determine the notification information according to whether a traffic jam has occurred between the position in front of the surveillance camera 20 and the danger area SP1, the average traffic volume, the number of pedestrians, and the like.
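One way to combine the route features listed above (intersections, turns, travel time, crosswalks, congestion) into a single decision is a weighted score. The following is purely an illustrative assumption; the disclosure does not specify any weights or scoring formula.

```python
def notification_urgency(intersections, turns, travel_time_s, crosswalks,
                         congested=False):
    """Hypothetical scoring sketch: the more intersections, turns, and
    crosswalks between the camera and the danger area SP1, and the longer
    the average travel time (or if a jam delays arrival), the less urgent
    the notification sent from that camera."""
    score = 100.0
    score -= 5 * intersections + 8 * turns + 3 * crosswalks
    score -= travel_time_s / 10.0
    if congested:
        score -= 20  # a traffic jam delays arrival at the danger area
    return max(score, 0.0)
```

The resulting score could then be mapped to notification content, for example by thresholding it into "evade", "warn", and "inform" levels.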
The information processing device 100 may also determine the notification information according to whether the surveillance camera 20 is installed near an intersection, the imaging direction of the surveillance camera 20, the presence or absence of bicycles and the number of pedestrians near the surveillance camera 20, events that have occurred near the surveillance camera 20, and the like. That is, the information processing device 100 may determine the notification information based on any information, as long as the notification information is determined at least according to the installation position of the sensor.
[6-6. In-Vehicle Devices]
In the above embodiment, an in-vehicle device has been described as an example, but the embodiment may be applied to any terminal device instead of an in-vehicle device. Specifically, the terminal device may be a terminal device used by a user who accesses content such as a web page displayed in a browser or content for an application. For example, the terminal device may be a desktop PC (Personal Computer), a notebook PC, a tablet terminal, a mobile phone, a PDA (Personal Digital Assistant), a smartwatch, a wearable device, or the like.
[6-7. Position Information]
In the above embodiment, the in-vehicle device detected its current position using GPS. However, the embodiment is not limited to this, and any position information may be used instead of GPS information. For example, the in-vehicle device may estimate or acquire the current position of the automobile 10 using position information of the base station with which it is communicating, or using WiFi (registered trademark) (Wireless Fidelity) radio waves.
[6-8. Attributes of Autonomous Vehicles]
In the above embodiment, the surveillance camera 20 transmitted the notification information to various automobiles 10, but the acquisition processing is not limited to the above. For example, it is assumed that attribute information concerning the attributes of an automobile 10 and GPS information are stored in the position information providing device PIS in association with each other. In this case, the surveillance camera 20 may acquire, from the position information providing device PIS, the attribute information concerning the attributes of the automobile 10 together with the GPS information of the automobile 10.
For example, the surveillance camera 20 may acquire, from the position information providing device PIS, as attribute information of the automobile 10, information on the type of vehicle, information on its speed, information on the number of braking operations, information on driving operations, information on travel time, information on the number of times the doors have been opened and closed, and the like. Then, the surveillance camera 20 may link the target automobile with an automobile 10 based on the position of the target automobile estimated from the image, the position indicated by the position information acquired from the automobile 10, and the attribute information of the automobile 10.
For example, the surveillance camera 20 estimates, from the image, attributes related to the movement of the target automobile. In another example, the surveillance camera 20 estimates, from the content of the detected event, the attributes of the target automobile to which the notification information is to be transmitted. Then, when the estimated attributes match or are similar to the attributes indicated by the attribute information of an automobile 10, and the position indicated by the position information of the target automobile is identical or similar to the position indicated by the position information acquired from that automobile 10, the information processing device 100 may treat the destination of the automobile 10 as the destination of the target automobile.
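The linking step above, matching the attributes and position estimated from the image against the attribute information and position information acquired from each automobile 10, might be sketched as follows. The field names (`type`, `position`, `address`) and the 15 m position tolerance are hypothetical assumptions for illustration.

```python
def link_target_vehicle(estimated, candidates, max_pos_err_m=15.0):
    """Sketch of binding the target automobile seen in the image to a
    connected automobile 10: pick the candidate whose attribute matches
    the image-derived estimate and whose reported position lies closest
    to the image-derived position, within an assumed tolerance."""
    ex, ey = estimated["position"]
    best = None
    for cand in candidates:
        if cand["type"] != estimated["type"]:
            continue  # attributes must match (or be similar)
        cx, cy = cand["position"]
        err = ((ex - cx) ** 2 + (ey - cy) ** 2) ** 0.5
        if err <= max_pos_err_m and (best is None or err < best[0]):
            best = (err, cand["address"])
    return None if best is None else best[1]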
[6-9. Emergency Vehicles]
In the above embodiment, an example of processing was described in which the information processing device 100 determines from an image whether the non-connected vehicle CA1 is an automobile highly likely to cause a traffic accident, but the determination processing is not limited to the above. Specifically, the information processing device 100 may determine whether the non-connected vehicle CA1 included in the image is an emergency vehicle such as an ambulance.
For example, the information processing device 100 may analyze the image using conventional techniques such as image analysis and determine, from the vehicle type, the license plate, the outer shape of the vehicle, and the like, whether the non-connected vehicle CA1 included in the image is an emergency vehicle. In this way, the information processing device 100 can identify, according to various situations, a non-connected vehicle CA1 that is characteristic of each situation.
Then, for example, when an ambulance carrying a person who has suddenly fallen ill is heading for a hospital, the information processing device 100 may cause each surveillance camera 20 to issue notifications so that the automobiles 10 do not obstruct the ambulance. For example, the information processing device 100 may set an intersection through which the ambulance will pass as the danger area SP1, and determine the notification information to be notified from each surveillance camera 20 according to the positional relationship between the danger area SP1 and each surveillance camera 20.
[6-10. Dangerous Acts Other Than Those of Automobiles]
In the above embodiment, the information processing device 100 estimated a collision between the automobile 10 and the automobile CA1, but the embodiment is not limited to this. For example, the information processing device 100 identifies a bicycle, a child, an animal, or the like from an image acquired from the surveillance camera 20 using conventional techniques such as image analysis. The information processing device 100 also identifies, from the image acquired from the surveillance camera 20, an automobile for which a bicycle, a child, an animal, or the like is estimated to be in its blind spot, as the target automobile. In such a case, the information processing device 100 estimates whether there is a risk of collision between the bicycle, child, animal, or the like and the target automobile, and if there is such a risk, may set the position or area where the collision may occur as the danger area SP1.
In this way, the information processing device 100 according to the embodiment can cause the surveillance camera 20 to notify an automobile at risk of collision of various notification information, even when a bicycle, a child, an animal, or the like darts out.
Conventionally, even when the automobile 10 has an in-vehicle camera, a bicycle, a child, an animal, or the like may enter a blind spot that the in-vehicle camera cannot cover, so that the in-vehicle camera cannot recognize it. In that case, a traffic accident occurs when the bicycle, child, animal, or the like darts out of the blind spot of the automobile 10. In response to this problem, the information processing device 100 can identify the destination of the target automobile even when a bicycle, a child, an animal, or the like darts out. As a result, the automobile 10 can prevent a traffic accident before it happens.
[6-11. Violating Vehicles]
The information processing device 100 may also identify, based on surveillance camera images captured by the surveillance cameras 20, an automobile that is violating a stopping prohibition, and set the position or area of the identified automobile as the danger area SP1. The information processing device 100 may also set as the danger area SP1 the position or area of a taxi, an autonomous vehicle, a bus, or the like that users are boarding or alighting from. In such cases, the information processing device 100 can reduce the risk of another automobile 10 colliding with an automobile 10 stopped on the road, or of a chain-reaction pileup occurring.
The information processing device 100 may also, after a predetermined period, provide the police or the like with information concerning the claiming of a fine or the like for the automobile violating the stopping prohibition. For example, the information processing device 100 may provide information such as the photographed automobile's license plate. The information processing device 100 may also store, in a predetermined storage unit, a flag indicating that the automobile is a dangerous automobile that repeats violations, in association with the automobile ID of the automobile violating the stopping prohibition.
Note that the above modification can be applied to an automobile committing any kind of violation, not only a violation of a stopping prohibition. For example, the information processing device 100 identifies, based on surveillance camera images captured by the surveillance cameras 20, an automobile violating an overtaking prohibition. The information processing device 100 may then set the position or area of the identified automobile violating the overtaking prohibition as the danger area SP1.
[6-12. Notification Content]
In the above embodiment, the surveillance camera 20 may notify any information. For example, the information notified from the surveillance camera 20 may include the license plate of a dangerously driven vehicle that may cause an accident, characteristics of the vehicle such as its type, outer shape, and color, the estimated traveling speed of the vehicle, the position information of the vehicle, and the like. The notification information may also include control information or the like for causing the notified automobile 10 to automatically perform driving maneuvers for avoiding an accident.
[6-13. Notification Modes]
In the above embodiment, the surveillance camera 20 may transmit notification information in any mode. For example, the surveillance camera 20 may notify notification information that only the automobile 10 acquires, or may notify notification information in a form understandable to the occupants of the automobile 10. The information processing device 100 may also cause a surveillance camera 20 near an automobile 10 violating a stopping prohibition to notify that automobile 10 that it should cease the violation.
[6-14. Danger Areas]
Here, the information processing device 100 may set any position as a danger area. For example, the information processing device 100 determines whether an automobile 10 is a dangerous automobile based on whether the automobile 10 captured in a surveillance camera image satisfies a predetermined condition. For example, the information processing device 100 determines that the automobile 10 is a dangerous automobile when at least one of the following applies: the photographed automobile 10 is traveling with a predetermined driving operation, is traveling at a speed equal to or higher than a predetermined threshold, or has violated traffic rules a predetermined number of times or more.
Note that the information processing device 100 may determine that an automobile 10 is a dangerous automobile when the automobile 10 repeatedly meanders. That is, when the non-connected vehicle CA1 repeatedly meanders in the surveillance camera image acquired from the surveillance camera 20, the information processing device 100 determines that the non-connected vehicle CA1 is a dangerous automobile, because it is traveling with a predetermined driving operation that is highly likely to cause a traffic accident.
On the other hand, when the non-connected vehicle CA1 is traveling in compliance with traffic rules in the surveillance camera image acquired from the surveillance camera 20, the information processing device 100 does not determine that the non-connected vehicle CA1 is a dangerous automobile, because it is traveling with a predetermined driving operation that is unlikely to cause a traffic accident.
Further, for example, when the non-connected vehicle CA1 is traveling at a speed equal to or higher than the legal speed in the surveillance camera image acquired from the surveillance camera 21, the information processing device 100 may determine that the non-connected vehicle CA1 is a dangerous automobile; when the non-connected vehicle CA1 is traveling at a speed lower than the legal speed, the information processing device 100 need not determine that the non-connected vehicle CA1 is a dangerous automobile.
Further, for example, it is assumed that the license plate of the non-connected vehicle CA1 has been identified by applying conventional techniques such as image analysis to a surveillance camera image including the non-connected vehicle CA1. In this case, the information processing device 100 acquires the history of traffic rule violations of the non-connected vehicle CA1 associated with the license plate, and may determine that the non-connected vehicle CA1 is a dangerous automobile when it has violated traffic rules a predetermined number of times or more. On the other hand, when the non-connected vehicle CA1 has violated traffic rules fewer than the predetermined number of times, the information processing device 100 need not determine that the non-connected vehicle CA1 is a dangerous automobile.
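The three dangerous-automobile conditions described in this section (repeated meandering, speed at or above the legal limit, and a violation history at or above a threshold) can be expressed as a single predicate. The default violation threshold below is an assumed example value; the disclosure only speaks of "a predetermined number of times."

```python
def is_dangerous_vehicle(meandering, speed_kmh, legal_limit_kmh,
                         violation_count, violation_threshold=3):
    """Sketch of the judgment described above: an automobile is flagged
    as dangerous when at least one condition holds -- repeated meandering,
    traveling at or above the legal speed, or a violation history at or
    above a (hypothetical) threshold."""
    return (meandering
            or speed_kmh >= legal_limit_kmh
            or violation_count >= violation_threshold)
```

The position or area of an automobile flagged this way could then be set as the danger area SP1.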
The information processing device 100 may also determine whether another automobile different from the dangerous automobile is highly likely to be involved in a traffic accident. For example, assume that the positional relationship between the surveillance camera 24 that photographed the dangerous automobile and the surveillance camera 21 that photographed another automobile 11 is known, and that the distance between the surveillance camera 21 and the surveillance camera 24 is less than a predetermined distance. In this case, since the dangerous automobile and the automobile 11 are less than the predetermined distance apart, the information processing device 100 may determine that the automobile 11 is a target automobile highly likely to be involved in a traffic accident.
Note that the information processing device 100 may determine that an automobile 10 whose image appears in the same surveillance camera image as the dangerous automobile, or another automobile 10 photographed within a predetermined time by the same surveillance camera 20 as the dangerous automobile, is a target automobile highly likely to be involved in a traffic accident. The processing described above may be executed by, for example, the estimation unit 132 shown in FIG. 2, and the information processing device 100 may further have a functional configuration (for example, a determination unit) for performing the various determinations described above.
[7. Other Embodiments]
Next, other embodiments will be described. In the embodiments described above, the surveillance camera 20 identified the destination of the automobile 10 to be notified using position information and Layer-2 IDs. In another embodiment, an example of processing is shown in which the surveillance camera 20 identifies the destination of the automobile 10 based on the license plate of the automobile 10 captured in a surveillance camera image.
In the other embodiment, it is assumed that the license plate of each automobile 10 and the destination of that automobile 10 are stored in association with each other in the storage unit of the surveillance camera 20. In this case, when an automobile 10 that is highly likely to be involved in a traffic accident is photographed, the surveillance camera 20 identifies the license plate of the photographed automobile 10 by analyzing it using conventional techniques such as image analysis. The surveillance camera 20 may then identify, from the storage unit, the destination associated with the identified license plate of the automobile 10 and transmit the notification information to the identified destination.
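The license-plate lookup described above might be sketched as a simple mapping held in the storage unit of the surveillance camera 20. The plate strings and destination identifiers below are fabricated placeholders for illustration only.

```python
# Hypothetical sketch of the association held in the storage unit of the
# surveillance camera 20: a recognized license plate resolves to the
# network destination of the corresponding automobile 10.
PLATE_TO_DESTINATION = {
    "品川 300 あ 12-34": "dest-vehicle-10a",  # placeholder entries
    "横浜 500 さ 56-78": "dest-vehicle-10b",
}

def destination_for_plate(recognized_plate):
    """Return the stored destination for a plate read from the image,
    or None when the plate is not registered."""
    return PLATE_TO_DESTINATION.get(recognized_plate)
```

On a successful lookup, the camera would transmit the notification information to the returned destination; a `None` result corresponds to the case where the plate-based method fails and the position-and-Layer-2-ID method of the main embodiment supplements it.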
In this way, the information processing device 100 according to the other embodiment provides a solution for notifying various information, from the surveillance camera images captured by the surveillance cameras 20, to the automobile 10 that is the notification destination. For example, when the information processing device 100 determines that the license plate of the automobile 10 identified by analyzing a surveillance camera image matches a license plate of an automobile 10 stored in the storage unit, it identifies the destination of the automobile 10 stored in association with that license plate. As a result, the information processing device 100 can identify an appropriate moving body.
The relationship between the embodiment and the other embodiment is described below. The surveillance camera 20 according to the embodiment is advantageous when the license plate of the automobile 10 cannot be identified from the image because of the angular relationship between the surveillance camera 20 and the automobile 10. For example, the information processing device 100 according to the embodiment can change the content of the warning about the dangerous automobile notified to the automobile 10 according to the distance between the danger area SP1 and the surveillance camera 20. In this way, even when the license plate of the automobile 10 cannot be identified, the information processing device 100 according to the embodiment can notify the automobile 10 of suitable warning content via the surveillance camera 20. That is, the information processing device 100 according to the embodiment can provide information that supplements the processing performed by the information processing device 100 according to the other embodiment.
[8. Hardware Configuration]
The in-vehicle device of the automobile 10, the surveillance camera 20, the surveillance camera image providing device FIS, the position information providing device PIS, and the information processing device 100 according to the embodiments described above are realized by, for example, a computer 1000 configured as shown in FIG. 9. The information processing device 100 is described below as an example. FIG. 9 is a hardware configuration diagram showing an example of the computer 1000 that realizes the functions of the information processing device 100. The computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface (I/F) 1500, an input/output interface (I/F) 1600, and a media interface (I/F) 1700.
The CPU 1100 operates based on programs stored in the ROM 1300 or the HDD 1400 and controls each unit. The ROM 1300 stores a boot program executed by the CPU 1100 when the computer 1000 starts up, programs dependent on the hardware of the computer 1000, and the like.
The HDD 1400 stores programs executed by the CPU 1100, data used by those programs, and the like. The communication interface 1500 receives data from other devices via the network N and sends it to the CPU 1100, and transmits data generated by the CPU 1100 to other devices via the network N.
 The CPU 1100 controls output devices such as a display and a printer, and input devices such as a keyboard and a mouse, via the input/output interface 1600. The CPU 1100 acquires data from the input devices via the input/output interface 1600, and also outputs generated data to the output devices via the input/output interface 1600.
 The media interface 1700 reads a program or data stored in a recording medium 1800 and provides it to the CPU 1100 via the RAM 1200. The CPU 1100 loads the program from the recording medium 1800 onto the RAM 1200 via the media interface 1700 and executes the loaded program. The recording medium 1800 is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or a PD (Phase change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
 For example, when the computer 1000 functions as the information processing device 100 according to the embodiment, the CPU 1100 of the computer 1000 realizes the functions of the control unit 130 by executing a program loaded onto the RAM 1200. The HDD 1400 stores the data held in the storage unit 120. The CPU 1100 of the computer 1000 reads these programs from the recording medium 1800 and executes them, but as another example, it may acquire these programs from other devices via the network N.
[9. Others]
 Of the processes described in the above embodiment and modifications, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can be performed automatically by known methods. In addition, the processing procedures, specific names, and information including the various data and parameters shown in the above document and drawings can be changed arbitrarily unless otherwise specified. For example, the various information shown in each figure is not limited to the illustrated information.
 Each component of each illustrated device is a functional concept and does not necessarily need to be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to the illustrated form, and all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like. For example, the information processing device 100, the surveillance camera image providing device FIS, and the position information providing device PIS may be integrated into a single information processing device. In this case, the integrated information processing device acquires position information from the automobile 10 and acquires images captured by the surveillance camera 20.
 The above-described embodiment and modifications can be combined as appropriate as long as the processing contents do not contradict each other.
 The term "unit" (section, module, unit) used above can be read as "means" or "circuit". For example, the providing unit can be read as providing means or a providing circuit.
[10. Effects]
 As described above, the information processing device 100 according to the embodiment includes the determination unit 133 and the providing unit 134. The determination unit 133 determines, based on the installation position of a sensor capable of notifying information to a moving body, the information that the sensor notifies to moving bodies in the vicinity of the sensor. The providing unit 134 provides the information determined by the determination unit 133 to the sensor.
 In this way, the information processing device 100 according to the embodiment provides the sensor with the information, determined based on the installation position of a sensor capable of notifying information to moving bodies, that the sensor notifies to moving bodies in its vicinity, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines, for each sensor, the information that the sensor notifies, and the providing unit 134 provides, for each sensor, the information that the sensor notifies.
 As a result, the information processing device 100 according to the embodiment determines and provides the notified information on a per-sensor basis, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines the information that the sensor notifies based on the distance from a position, determined based on information acquired by another sensor, to the sensor.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the distance from such a position to the sensor, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines the information that the sensor notifies based on the distance from a position where an accident may occur, as the predetermined position, to the sensor.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the distance from a position where an accident may occur to the sensor, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
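 To make this distance-based decision concrete, the following is a minimal sketch of how a determination unit might grade the notified contents by the distance between a position where an accident may occur and a sensor's installation position. The function names, message strings, and threshold values are illustrative assumptions, not part of the specification:

```python
import math

def distance_m(a, b):
    """Euclidean distance between two (x, y) positions in meters."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def decide_notification(sensor_pos, accident_pos, thresholds=(50.0, 150.0)):
    """Decide what a sensor should notify to nearby moving bodies, based on
    its distance from a position where an accident may occur.
    The threshold values (meters) are illustrative assumptions."""
    d = distance_m(sensor_pos, accident_pos)
    near, far = thresholds
    if d <= near:
        return {"level": "warning", "message": "accident risk ahead", "distance_m": round(d, 1)}
    elif d <= far:
        return {"level": "caution", "message": "approach slowly", "distance_m": round(d, 1)}
    return {"level": "info", "message": "no immediate risk", "distance_m": round(d, 1)}
```

 Applied to every registered sensor, a rule of this shape yields stronger notification contents for sensors installed near the predicted accident position and milder ones for sensors farther away.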
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 sets information indicating the position where an accident may occur as the information that the sensor notifies.
 As a result, the information processing device 100 according to the embodiment has the sensor notify the position where an accident may occur, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 sets information indicating the time until an accident may occur as the information that the sensor notifies.
 As a result, the information processing device 100 according to the embodiment has the sensor notify the time until an accident may occur, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 sets information for causing the moving body to avoid an accident as the information that the sensor notifies.
 As a result, the information processing device 100 according to the embodiment has the sensor notify information for avoiding an accident, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines, as the sensor, the information that the sensor notifies based on the installation position of a sensor capable of notifying information to surrounding moving bodies.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the installation position of a sensor capable of notifying surrounding moving bodies, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines the information that the sensor notifies based on the installation position of the sensor and information on the vicinity of the sensor acquired by the sensor.
 As a result, the information processing device 100 according to the embodiment decides the notified information from both the sensor's installation position and the nearby information that the sensor has acquired, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines, as the sensor, the information that a surveillance camera notifies based on the installation position of the surveillance camera, which is capable of capturing images of the surrounding situation and of notifying predetermined information to moving bodies.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the installation position of such a surveillance camera, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines the information that each surveillance camera notifies to surrounding moving bodies based on the distance from a position, where an accident is predicted to occur based on an image captured by any of the surveillance cameras, to each surveillance camera.
 As a result, the information processing device 100 according to the embodiment decides the per-camera notified information based on the distance from the predicted accident position to each camera, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines, as the surveillance camera, the information that the surveillance camera notifies based on the installation position of a surveillance camera that notifies predetermined information to a captured moving body.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the installation position of a surveillance camera that notifies the moving bodies it captures, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines, as the sensor, the information that the sensor notifies based on the installation position of a sensor that, when a predetermined event is detected, notifies information to moving bodies located in an area corresponding to the event.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the installation position of such an event-triggered sensor, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 sets information that differs for each area where the moving body is located as the information that the sensor notifies.
 As a result, the information processing device 100 according to the embodiment has the sensor notify different information depending on the area where the moving body is located, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the information processing device 100 according to the embodiment, the determination unit 133 determines, as the sensor, the information that the sensor notifies based on the installation position of a sensor that notifies information to moving bodies moving through the area corresponding to the event in a direction corresponding to the event.
 As a result, the information processing device 100 according to the embodiment decides the notified information based on the installation position of a sensor that targets moving bodies heading in the direction corresponding to the event, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
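 One way to picture this targeting rule is the sketch below, which notifies a moving body only when it is inside the area corresponding to the event and is heading in the direction corresponding to the event. The rectangular area model, angle tolerance, and function names are assumptions made for illustration, not the specification's method:

```python
def in_area(position, area):
    """area is an axis-aligned box: (x_min, y_min, x_max, y_max)."""
    x, y = position
    x_min, y_min, x_max, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max

def heading_matches(heading_deg, event_direction_deg, tolerance_deg=45.0):
    """True if the moving body's heading is within the tolerance of the
    direction corresponding to the event (angles in degrees, wrap-safe)."""
    diff = abs(heading_deg - event_direction_deg) % 360.0
    return min(diff, 360.0 - diff) <= tolerance_deg

def should_notify(position, heading_deg, event_area, event_direction_deg):
    """Notify only moving bodies located in the event's area that are
    moving in the event's direction."""
    return in_area(position, event_area) and heading_matches(heading_deg, event_direction_deg)
```

 A vehicle inside the area but driving away from the event, or a vehicle outside the area entirely, would be skipped by this check.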
 Further, the sensor according to the embodiment includes a detection unit 242, a reception unit 241, and a notification unit 244. The detection unit 242 detects predetermined information. The reception unit 241 receives the information to be notified from the information processing device 100, which determines the information that the sensor is to notify according to the installation position of the sensor. The notification unit 244 notifies surrounding moving bodies of the information received by the reception unit 241.
 In this way, the sensor according to the embodiment receives the information to be notified from the information processing device 100 and notifies surrounding moving bodies of the received information, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
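 The reception-and-notification flow described above might be sketched as follows. The class and method names and the message format are hypothetical, intended only to illustrate the division of roles between the reception unit and the notification unit:

```python
class Sensor:
    """Minimal sketch of a roadside sensor that receives notification
    contents from the information processing device and relays them to
    nearby moving bodies."""

    def __init__(self, sensor_id, installation_position):
        self.sensor_id = sensor_id
        self.installation_position = installation_position
        self.pending = None  # information received from the device, if any

    def receive(self, info):
        """Reception unit: accept the notification contents that the
        information processing device decided for this sensor."""
        self.pending = info

    def notify(self, nearby_moving_bodies):
        """Notification unit: deliver the received information to every
        moving body currently in the sensor's vicinity."""
        if self.pending is None:
            return []
        return [(body_id, self.pending) for body_id in nearby_moving_bodies]

# Hypothetical usage: one camera relays the decided contents to two vehicles.
sensor = Sensor("cam-20", (10.0, 5.0))
sensor.receive("accident risk ahead")
deliveries = sensor.notify(["vehicle-10", "vehicle-11"])
```

 Because the contents are decided server-side per installation position, two sensors at different positions can relay different messages about the same event.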
 Further, in the sensor according to the embodiment, when a predetermined event is detected, the notification unit 244 notifies moving bodies located in the area corresponding to the event of the information received from the information processing device 100.
 In this way, the sensor according to the embodiment notifies moving bodies in the area corresponding to the event of the information received from the information processing device 100, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the sensor according to the embodiment, the notification unit 244 notifies different information for each area where the moving body is located.
 In this way, the sensor according to the embodiment notifies different information depending on the area where the moving body is located, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the sensor according to the embodiment, the notification unit 244 notifies information to moving bodies moving through the area corresponding to the event in the direction corresponding to the event.
 In this way, the sensor according to the embodiment notifies moving bodies heading in the direction corresponding to the event, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the sensor according to the embodiment, the notification unit 244 notifies different information for each direction in which the moving body is moving.
 In this way, the sensor according to the embodiment notifies different information depending on the direction in which the moving body is moving, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, in the sensor according to the embodiment, the notification unit 244 notifies information to a moving body that is located in the area corresponding to the event and whose movement mode satisfies a predetermined condition.
 In this way, the sensor according to the embodiment notifies only moving bodies in the event's area whose movement mode satisfies the predetermined condition, and can therefore notify a moving body of appropriate notification contents according to the situation around the moving body.
 Further, the traffic risk reduction system 1 according to the embodiment includes the information processing device 100 and the sensor. The determination unit 133 of the information processing device 100 determines, based on the installation position of the sensor, the information that the sensor notifies to moving bodies in its vicinity. The providing unit 134 of the information processing device 100 provides the information determined by the determination unit 133 to the sensor. The detection unit 242 of the sensor detects predetermined information. The reception unit 241 of the sensor receives the information from the information processing device 100. The notification unit 244 of the sensor notifies surrounding moving bodies of the information received by the reception unit 241.
 In this way, the traffic risk reduction system 1 according to the embodiment can notify a moving body of appropriate notification contents according to the situation around the moving body.
 Although some of the embodiments of the present application have been described above in detail with reference to the drawings, these are merely examples, and the present invention can be carried out in other forms with various modifications and improvements based on the knowledge of those skilled in the art, including the aspects described in the disclosure of the invention.
   N Network
   1 Traffic risk reduction system
  10, 11, 12 Automobile
  20, 21, 22, 23, 24 Surveillance camera
 210 Communication unit
 220 Storage unit
 221 Destination storage unit
 230 Camera
 240 Control unit
 241 Reception unit
 242 Detection unit
 243 Identification unit
 244 Notification unit
 100 Information processing device
 110 Communication unit
 120 Storage unit
 121 Installation position storage unit
 130 Control unit
 131 Acquisition unit
 132 Estimation unit
 133 Determination unit
 134 Providing unit
 PIS Position information providing device
 PIM Position information storage unit
 FIS Surveillance camera image providing device
 FIM Surveillance camera image storage unit

Claims (24)

  1.  An information processing device comprising:
     a determination unit that determines, based on an installation position of a sensor capable of notifying information to a moving body, information that the sensor notifies to a moving body in the vicinity of the sensor; and
     a providing unit that provides the information determined by the determination unit to the sensor.
  2.  The information processing device according to claim 1, wherein
     the determination unit determines, for each sensor, the information that the sensor notifies, and
     the providing unit provides, for each sensor, the information that the sensor notifies.
  3.  The information processing device according to claim 1, wherein the determination unit determines the information that the sensor notifies based on a distance from a position, determined based on information acquired by another sensor, to the sensor.
  4.  The information processing device according to claim 3, wherein the determination unit determines the information that the sensor notifies based on a distance from a position where an accident may occur, as the position, to the sensor.
  5.  The information processing device according to claim 4, wherein the determination unit sets information indicating the position where the accident may occur as the information that the sensor notifies.
  6.  The information processing device according to claim 4, wherein the determination unit sets information indicating a time until the accident may occur as the information that the sensor notifies.
  7.  The information processing device according to claim 4, wherein the determination unit sets information for causing the moving body to avoid the accident as the information that the sensor notifies.
  8.  The information processing device according to claim 4, wherein the determination unit determines, as the sensor, the information that the sensor notifies based on an installation position of a sensor capable of notifying information to surrounding moving bodies.
  9.  The information processing device according to claim 1, wherein the determination unit determines the information that the sensor notifies based on the installation position of the sensor and information on the vicinity of the sensor acquired by the sensor.
  10.  The information processing device according to claim 1, wherein the determination unit determines, as the sensor, information that a surveillance camera notifies based on an installation position of the surveillance camera, the surveillance camera being capable of capturing an image of a surrounding situation and of notifying predetermined information to a moving body.
  11.  The information processing device according to claim 10, wherein the determination unit determines the information that each surveillance camera notifies to surrounding moving bodies based on a distance from a position, where an accident is predicted to occur based on an image captured by any of the surveillance cameras, to each surveillance camera.
  12.  The information processing device according to claim 10, wherein the determination unit determines, as the surveillance camera, the information that the surveillance camera notifies based on an installation position of a surveillance camera that notifies predetermined information to a captured moving body.
  13.  The information processing device according to claim 1, wherein the determination unit determines, as the sensor, the information that the sensor notifies based on an installation position of a sensor that, when a predetermined event is detected, notifies information to a moving body located in an area corresponding to the event.
  14.  The information processing device according to claim 13, wherein the determination unit sets, as the information to be notified by the sensor, information that differs for each region in which a moving body is located.
  15.  The information processing device according to claim 13, wherein the sensor notifies information to moving bodies moving through the area corresponding to the event in a direction corresponding to the event, and the determination unit determines the information to be notified by the sensor based on the installation position of the sensor.
  16.  The information processing device according to claim 15, wherein the determination unit sets, as the information to be notified by the sensor, information that differs for each direction in which a moving body is moving.
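An illustrative sketch of the region- and direction-dependent notifications in claims 14 and 16: the message broadcast to a moving body varies with the region it occupies relative to the detected event and with whether it is heading toward that event. The region names and message strings are assumptions, not language from the patent.

```python
def message_for(region, heading_toward_event):
    """Pick a notification message for one moving body based on its
    region and its direction of travel relative to the event."""
    if region == "event_area":
        # Inside the area where the event occurred.
        return "evacuate the area" if heading_toward_event else "leave promptly"
    if region == "adjacent_area":
        # Region bordering the event area.
        return "do not enter" if heading_toward_event else "proceed with caution"
    # Anywhere else: a general advisory.
    return "incident reported nearby"

print(message_for("event_area", True))       # -> evacuate the area
print(message_for("adjacent_area", False))   # -> proceed with caution
```

The point of differentiating by region and direction, on this reading, is that a vehicle driving toward the event needs a prohibitive message while one already leaving the area only needs encouragement to continue.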
  17.  The information processing device according to claim 13, wherein the sensor, when a predetermined event is detected, notifies information to a moving body that is located in the area corresponding to the event and whose manner of movement satisfies a predetermined condition, and the determination unit determines the information to be notified by the sensor based on the installation position of the sensor.
  18.  A sensor comprising:
     a detection unit that detects predetermined information;
     a reception unit that receives information to be notified from an information processing device that determines the information to be notified by the sensor according to the installation position of the sensor; and
     a notification unit that notifies surrounding moving bodies of the information received by the reception unit.
  19.  The sensor according to claim 18, wherein the notification unit, when a predetermined event is detected, notifies the information received from the information processing device to moving bodies located in an area corresponding to the event.
  20.  The sensor according to claim 19, wherein the notification unit notifies information that differs for each region in which a moving body is located.
  21.  The sensor according to claim 19, wherein the notification unit notifies information to moving bodies moving through the area corresponding to the event in a direction corresponding to the event.
  22.  The sensor according to claim 21, wherein the notification unit notifies information that differs for each direction in which a moving body is moving.
  23.  The sensor according to claim 19, wherein the notification unit notifies information to a moving body that is located in the area corresponding to the event and whose manner of movement satisfies a predetermined condition.
  24.  A traffic risk reduction system comprising an information processing device and a sensor, wherein
     the information processing device includes:
     a determination unit that determines, based on the installation position of the sensor, the information the sensor notifies to moving bodies in its vicinity; and
     a providing unit that provides the information determined by the determination unit to the sensor, and
     the sensor includes:
     a detection unit that detects predetermined information;
     a reception unit that receives information from the information processing device; and
     a notification unit that notifies surrounding moving bodies of the information received by the reception unit.
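A minimal end-to-end sketch of the system architecture in claim 24, under assumed interfaces: the information processing device decides each sensor's message from its installation position (determination unit) and provides it to the sensor (providing unit); the sensor then relays that message to nearby moving bodies (reception and notification units). All class names, method names, and messages are hypothetical.

```python
class InformationProcessingDevice:
    def __init__(self, messages_by_location):
        # e.g. {"intersection": "watch for crossing traffic"}
        self._messages = messages_by_location

    def decide(self, installation_position):
        """Determination unit: pick a message for a given installation position."""
        return self._messages.get(installation_position, "drive carefully")

    def provide(self, sensor):
        """Providing unit: push the decided message to the sensor."""
        sensor.receive(self.decide(sensor.position))


class Sensor:
    def __init__(self, position):
        self.position = position
        self._message = None

    def receive(self, message):
        """Reception unit: accept the message from the device."""
        self._message = message

    def notify(self):
        """Notification unit: return the message to broadcast to moving bodies."""
        return self._message


device = InformationProcessingDevice({"intersection": "watch for crossing traffic"})
cam = Sensor("intersection")
device.provide(cam)
print(cam.notify())  # -> watch for crossing traffic
```

The division of labor this sketch assumes matches the claim's structure: the server-side device owns the decision logic tied to installation position, while the sensor stays a thin detect-receive-notify relay.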
PCT/JP2021/001721 2020-02-10 2021-01-19 Traffic risk reduction system, information processing device, and sensor WO2021161742A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020020987A JP7038152B2 (en) 2020-02-10 2020-02-10 Information processing equipment
JP2020-020987 2020-02-10

Publications (1)

Publication Number Publication Date
WO2021161742A1 true WO2021161742A1 (en) 2021-08-19

Family

ID=77291884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/001721 WO2021161742A1 (en) 2020-02-10 2021-01-19 Traffic risk reduction system, information processing device, and sensor

Country Status (2)

Country Link
JP (1) JP7038152B2 (en)
WO (1) WO2021161742A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011022713A (en) * 2009-07-14 2011-02-03 Sumitomo Electric Ind Ltd Communication system and roadside communication apparatus
JP2011146053A (en) * 2011-02-18 2011-07-28 Sumitomo Electric Ind Ltd Drive support apparatus, vehicle and vehicle drive support method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006329706A (en) * 2005-05-24 2006-12-07 Toshiba Corp Distance-measuring device and distance measurement method
WO2016132769A1 (en) * 2015-02-19 2016-08-25 シャープ株式会社 Imaging device, control method for imaging device, and control program


Also Published As

Publication number Publication date
JP2021128387A (en) 2021-09-02
JP7038152B2 (en) 2022-03-17

Similar Documents

Publication Publication Date Title
JP7362721B2 (en) Early warning and collision avoidance
US10332401B2 (en) Running vehicle alerting system and method
US10059261B2 (en) Collision warning system
US11967230B2 (en) System and method for using V2X and sensor data
CN105292036B (en) Boundary detection system
US10343601B2 (en) Collision warning system
US9511730B1 (en) Collision warning system
US20200128372A1 (en) Vehicle-to-infrastructure (v2i) messaging system
EP3187372A2 (en) Collision warning system and method
CN114586082A (en) Enhanced on-board equipment
US20150307131A1 (en) Autonomous Driving in a Hazard Situation
US11842631B2 (en) Communication device, control method thereof and communication system including the same
CN111547043A (en) Automatic response to emergency service vehicle by autonomous vehicle
JP2018139033A (en) Danger level determination device, information providing device, danger level determination method, information providing method and computer program
CN112185170B (en) Traffic safety prompting method and road monitoring equipment
JP2005327177A (en) System and device for supporting inter-vehicle collision prevention
JPWO2020071072A1 (en) Information provision system, mobile terminal, information provision method, and computer program
JP2021128386A (en) Information processing device, method, program, and traffic risk reduction device
WO2021161742A1 (en) Traffic risk reduction system, information processing device, and sensor
WO2021166525A1 (en) Transportation risk-reducing recording medium, information processing device, and method
Adla et al. Automotive collision avoidance methodologies Sensor-based and ITS-based
JP2006099453A (en) Driving support device
JP2024513710A (en) Light projection device and method and storage medium
US20220250613A1 (en) Systems And Methods To Prevent Vehicular Mishaps
Sarowar et al. Overspeed vehicular monitoring and control by using zigbee

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21753550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21753550

Country of ref document: EP

Kind code of ref document: A1