WO2022091167A1 - Information providing server, information providing method, and program recording medium - Google Patents
Information providing server, information providing method, and program recording medium
- Publication number
- WO2022091167A1 (PCT/JP2020/040040)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- moving body
- providing server
- moving
- road
- Prior art date
Classifications
- G08G1/164: Traffic control systems for road vehicles; anti-collision systems; centralised systems, e.g. external to vehicles
- G08G1/0112: Measuring and analyzing of parameters relative to traffic conditions based on data from the vehicle, e.g. floating car data [FCD]
- G08G1/0116: Measuring and analyzing of parameters relative to traffic conditions based on data from roadside infrastructure, e.g. beacons
- G08G1/0133: Traffic data processing for classifying traffic situation
- G08G1/0141: Measuring and analyzing of parameters relative to traffic conditions for traffic information dissemination
- G08G1/16: Anti-collision systems
- G06V20/54: Surveillance or monitoring of activities of traffic, e.g. cars on the road, trains or boats
- G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- The present invention relates to an information providing server, an information providing method, and a program recording medium.
- Patent Document 1 discloses a notification system capable of detecting a moving body located at a position that is a blind spot as seen from a right-turning vehicle and providing that information to the driver.
- The notification system described in that document repeatedly captures, with cameras, images of a plurality of oncoming moving bodies traveling through a road intersection. Based on those images, the system determines whether any of the oncoming moving bodies is a blind-spot moving body that cannot be seen from the right-turn waiting position in the intersection, and notifies the driver of the right-turning vehicle in the intersection of the detected blind-spot moving body.
- Patent Document 2 discloses a right-turn driving support device that, while the own vehicle is waiting to turn right, sets support information regarding oncoming vehicles traveling on the oncoming road according to the driving conditions of those vehicles, thereby reducing the annoyance felt by the driver.
- Based on the information on the oncoming vehicles, this right-turn driving support device sets a blind spot rank according to the degree to which a following vehicle becomes difficult to see in the blind spot created by a preceding oncoming vehicle, which depends on the relative body sizes of the preceding and following vehicles. The device then sets the maximum of the blind spot ranks as an oncoming straight-ahead vehicle rank flag.
- Based on this oncoming straight-ahead vehicle rank flag and a right-turn oncoming vehicle rank flag set according to the body size of an oncoming vehicle waiting to turn right, the device sets an evaluation rank corresponding to the degree of danger when the own vehicle makes a right turn, and then notifies right-turn driving support information according to that evaluation rank.
- Patent Documents 3 and 4 disclose an in-vehicle device that provides driving support when turning right at an intersection or the like using only a sensor mounted on the own vehicle without using information from a roadside device or another vehicle.
- Japanese Unexamined Patent Publication No. 2008-041058
- Japanese Unexamined Patent Publication No. 2011-090582
- Japanese Unexamined Patent Publication No. 2002-205615
- Japanese Unexamined Patent Publication No. 2006-349456
- There is a need to detect, with high accuracy, a moving body that exists in the vicinity of another moving body and is difficult to detect from that moving body (hereinafter referred to as a peripheral moving body), and to notify the moving body of information about the detected peripheral moving body. Further, when notifying the moving body of such information, it is required to notify the information efficiently in order to reduce the load on the wireless communication network and to improve the efficiency of wireless resource usage.
- However, the methods described as background art have difficulty both maintaining the detection accuracy of peripheral moving bodies and notifying information efficiently.
- For example, Patent Document 1 uses a configuration in which cameras and processing computers are arranged one-to-one and the information from each camera is notified to the vehicle independently. The notification information sent to the vehicle may therefore be duplicated between cameras, and because information is notified to the same vehicle independently from each camera, the protocol overhead increases.
- An object of the present invention is to provide an information providing server, an information providing method, and a program recording medium that can contribute to both maintaining the detection accuracy of the peripheral moving object and improving the efficiency of information notification to the moving object.
- According to one aspect, there is provided an information providing server including: a determination means for determining, based on primary information acquired from each of a plurality of sensors that sense a predetermined range of a road, whether or not to provide a first moving body passing through the road with secondary information created using the primary information acquired from the plurality of sensors; an information creation means for creating the secondary information using the primary information acquired from the plurality of sensors when it is determined that the secondary information is to be provided to the first moving body; and a transmission means for transmitting the secondary information to the first moving body.
- According to another aspect, there is provided an information providing method in which a computer capable of acquiring primary information from a plurality of sensors that sense a predetermined range of a road determines, based on the primary information acquired from each of the plurality of sensors, whether or not to provide a first moving body passing through the road with secondary information created using the primary information acquired from the plurality of sensors, and, when it is determined that the secondary information is to be provided to the first moving body, creates the secondary information using the primary information acquired from the plurality of sensors and transmits the secondary information to the first moving body. This method is tied to a particular machine, namely a computer capable of acquiring information from the plurality of sensors described above.
- There is further provided a computer program for realizing the above-mentioned functions of the information providing server.
- This program is input to a computer device via an input device or an external communication interface, is stored in a storage device, and drives a processor according to predetermined steps or processing. The program can, if necessary, display the processing results, including intermediate states, at each stage via a display device, or communicate with the outside via the communication interface.
- A computer device for this purpose typically includes, for example, a processor, a storage device, an input device, a communication interface, and, if necessary, a display device, which can be connected to one another by a bus.
- The program can also be recorded on a computer-readable (non-transitory) storage medium.
- The drawing reference numerals appended to this summary are added to elements for convenience, as an aid to understanding, and are not intended to limit the present invention to the illustrated embodiments.
- The connecting lines between blocks in the drawings referred to in the following description include both bidirectional and unidirectional lines. A unidirectional arrow schematically shows the flow of the main signal (data) and does not exclude bidirectionality.
- The program is executed by a computer device that comprises, for example, a processor, a storage device, an input device, a communication interface, and, if necessary, a display device.
- This computer device is configured to be capable of communicating, by wire or wirelessly, with devices (including computers) inside or outside the device via the communication interface.
- In one embodiment, the present invention can be realized as an information providing server 20 including a determination unit 21, an information creation unit 22, and a transmission unit 23, as shown in FIG. 1. The information providing server 20 is connected, by wire or wirelessly, to a plurality of sensors 10 that sense a predetermined range of a road, and can acquire data (primary information) from these sensors 10.
- The determination unit 21 functions as a determination means that determines, based on the primary information acquired from each of the sensors 10, whether or not to provide a first moving body passing through the road with secondary information created using the primary information acquired from the plurality of sensors.
- When it is determined that the secondary information is to be provided to the first moving body passing through the road, the information creation unit 22 functions as an information creation means that creates the secondary information using the primary information acquired from the plurality of sensors.
- The transmission unit 23 functions as a transmission means that transmits the secondary information to the first moving body.
- The information providing server 20 configured as described above determines, based on the primary information acquired from each of the plurality of sensors 10, whether or not to provide a first moving body passing through the road with secondary information created using the primary information acquired from the plurality of sensors.
- When it determines that the secondary information is to be provided, the information providing server 20 creates the secondary information using the primary information acquired from the plurality of sensors and transmits it to the first moving body. As a result, the first moving body can obtain secondary information based on the primary information acquired from each of the plurality of sensors 10. Because this secondary information is created using the primary information obtained from the plurality of sensors 10, it can cover the surroundings of the first moving body from a wider viewpoint. Furthermore, since the primary information is aggregated into the secondary information by the information providing server 20, efficient information notification is also realized.
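- As an illustration only (not part of the claimed embodiments), the determine, create, and transmit flow described above might be sketched as follows in Python; all class and function names here are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    """A moving body detected in the primary information (e.g., a camera image)."""
    obj_id: str
    obj_type: str                  # e.g., "car", "motorcycle", "pedestrian"
    position: Tuple[float, float]  # (x, y) in a shared map coordinate frame
    heading: float                 # movement direction in degrees
    speed: float                   # meters per second

class InformationProvidingServer:
    """Minimal sketch of server 20: determination, information creation, transmission."""

    def __init__(self, sensors, transmitter):
        self.sensors = sensors          # the plurality of sensors 10
        self.transmitter = transmitter  # wireless network interface

    def step(self, first_moving_body: Detection) -> None:
        # Acquire primary information from each sensor.
        primary: List[List[Detection]] = [s.acquire() for s in self.sensors]

        # Determination unit 21: decide whether secondary information is needed.
        if not self.should_provide(first_moving_body, primary):
            return

        # Information creation unit 22: build secondary information from all sensors.
        secondary = self.create_secondary(first_moving_body, primary)

        # Transmission unit 23: send it to the first moving body.
        self.transmitter.send(first_moving_body.obj_id, secondary)

    def should_provide(self, target: Detection, primary) -> bool:
        raise NotImplementedError  # e.g., the blind-spot based rules sketched later

    def create_secondary(self, target: Detection, primary) -> bytes:
        raise NotImplementedError  # e.g., a deduplicated list of blind-spot objects
```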
- For example, information on a moving body that is difficult to detect from the first moving body may be provided as the secondary information.
- For example, for a first moving body attempting to turn right or left at an intersection, or a first moving body about to pass through a sharp curve, the existence and movement of a moving body located in a blind spot may be conveyed based on the information obtained from the plurality of sensors 10.
- The secondary information is not limited to the above examples. For instance, the secondary information may be information whose accuracy is improved over the primary information: even with the same type of sensor, using a plurality of sensors 10 arranged at different positions makes it possible to obtain a sensing result with higher accuracy than the sensing capability possessed by the first moving body.
- The first moving body may also be a person or a bicycle, not only a vehicle. For example, secondary information based on the primary information obtained from the plurality of sensors 10 may be provided to a person or a cyclist about to pass through an intersection with poor visibility.
- FIG. 2 is a diagram showing the configuration of the first embodiment of the present invention.
- In FIG. 2, an information providing server 200 connected to cameras 100A to 100D as a plurality of sensors is shown.
- The cameras 100A to 100D are attached to traffic signals 400A to 400D at an intersection and can transmit camera images (still images or moving images) to the information providing server 200.
- For example, the camera 100A is installed at a position where it can photograph, from the front, traffic coming from the oncoming lane (the lane running from the lower side toward the upper side of FIG. 2) of the vertical road meeting the intersection shown on the left side of FIG. 2.
- FIG. 3 is a diagram schematically showing the shooting range of the camera 100A.
- The camera 100A points in the same direction as the lamps of the traffic signal 400A and can capture the range indicated by the alternate long and short dash line in FIG. 3.
- The capturable range of the camera 100A is roughly triangular, but its extent in the distant direction (the base of the dash-dot triangle in FIG. 3) depends on the performance of the camera 100A and the shooting environment.
- The cameras 100B to 100D have shooting ranges equivalent to that of the camera 100A.
- By arranging the cameras 100A to 100D in this way, it is possible to photograph traffic flowing into and out of the intersection from various angles and to monitor the intersection across its whole area.
- The arrangement of the cameras shown in FIGS. 2 to 4 is merely an example, and the number and positions of the cameras can be changed according to the content to be provided as secondary information.
- A sensor other than a camera, such as LiDAR (Light Detection and Ranging), RADAR (Radio Detection and Ranging), an infrared sensor, or a millimeter-wave sensor, may also be used as the sensor, and a plurality of sensor types may be used in combination.
- The information providing server 200 includes a determination unit 201, an information creation unit 202, and a transmission unit 203.
- Based on the camera images acquired from the cameras 100A to 100D, the determination unit 201 determines whether to provide a moving body entering the intersection from a specific direction (for example, from the lower part of FIG. 2) with, as secondary information, information notifying it of the existence of moving bodies located in blind spots. The presence or absence of a moving body entering the intersection from the specific direction can be detected from the camera image of the camera 100A.
- Alternatively, an optical beacon, an ultrasonic sensor, or the like may be installed in the target lane to detect a moving body entering the intersection from the specific direction.
- Various objects such as vehicles, pedestrians, and bicycles can be considered as the moving body to which the service of the information providing server 200 is provided. In the following, the moving body to be served is described using a vehicle as an example.
- As a method of extracting moving bodies from the camera images acquired from the cameras 100A to 100D, for example, an object can be extracted as a moving body from a comparison of successive video frames or from its difference with a background image prepared in advance.
- The method of extracting moving bodies from the camera images is not limited to these methods. For example, a method can also be adopted in which static objects are removed from the objects extracted from a camera image by using high-precision 3D map information (static object information) of the area around the intersection.
- Various known object detection techniques, such as those using deep learning, can be used to extract objects from the camera images and to determine the type of each object (moving body).
- In the following, the information providing server 200 of the present embodiment is described as identifying the type of each moving body as part of object detection.
- The determination unit 201 can determine whether or not to provide the secondary information based on the presence or absence of a moving body that is difficult to detect from the moving body entering the intersection from the specific direction, the type of that moving body, and its movement attributes (direction and speed of movement). Methods for determining whether the secondary information needs to be provided are described in detail later with specific examples.
- When it is determined that the secondary information is to be provided to the moving body (first moving body), the information creation unit 202 uses the camera images acquired from the cameras 100A to 100D to create secondary information informing that moving body of the existence of moving bodies located in its blind spots. More specifically, the information creation unit 202 creates the secondary information in a form in which information duplicated between the camera images obtained from the cameras 100A to 100D has been removed.
- The transmission unit 203 transmits, to the moving body (first moving body), the secondary information notifying it of the existence of moving bodies located in its blind spots.
- For example, the transmission unit 203 can adopt a method of transmitting the information in response to an inquiry from a communication device or the like mounted on the moving body (an on-demand method).
- The transmission unit 203 can transmit the secondary information to the moving body via a wireless communication network.
- As the wireless communication network, various networks such as LTE (Long Term Evolution), 5G, local 5G, and Wi-Fi (registered trademark) can be used.
- FIG. 5 is a flowchart showing the operation of the information providing server 200 according to the first embodiment of the present invention.
- First, the information providing server 200 acquires camera images from the cameras 100A to 100D as primary information (step S001).
- Next, the information providing server 200 analyzes the camera images acquired from the cameras 100A to 100D and determines whether or not to provide secondary information to a moving body entering the intersection from a specific direction (step S002).
- FIG. 6 shows an example of the result of the information providing server 200 analyzing the camera images acquired from the cameras 100A to 100D and extracting moving bodies.
- In the following, the vehicle CAR1 of FIG. 6 is described as the moving body (first moving body) entering the intersection from the specific direction.
- It is also assumed that the information providing server 200 has detected a vehicle CAR2 that is about to turn right from the lane oncoming to the lane in which the vehicle CAR1 is traveling, a motorcycle BIKE1 located behind the vehicle CAR2, and a pedestrian P1 waiting at the signal in front of the building at the lower left of FIG. 6.
- When it is determined that secondary information is to be provided to the moving body (first moving body) entering the intersection (Yes in step S003), the information providing server 200 creates the secondary information for that moving body (step S004). More specifically, the information providing server 200 uses the camera images acquired from the cameras 100A to 100D to create information notifying the moving body of the existence of moving bodies located in its blind spots. If, as a result of the analysis in step S002, it is determined that secondary information is not to be provided to the moving body entering the intersection (No in step S003), the information providing server 200 omits the subsequent processing.
- Method 1: For example, when, as a result of analyzing the camera images (primary information), a moving body exists in a blind spot of the first moving body (vehicle CAR1) to which secondary information would be provided, the information providing server 200 can determine that provision is necessary, and when no such moving body exists, determine that provision is unnecessary. In the case of FIG. 6, the motorcycle BIKE1 and the pedestrian P1 are present in blind spots of the first moving body (vehicle CAR1), so the information providing server 200 determines that provision is necessary.
- Method 2: For example, when, as a result of analyzing the camera images (primary information), a moving body exists in a blind spot of the first moving body (vehicle CAR1) and that moving body is of a specific type (for example, a motorcycle, a bicycle, or a person), the information providing server 200 determines that secondary information needs to be provided. In other cases, the information providing server 200 determines that secondary information does not need to be provided.
- Method 3: For example, when, as a result of analyzing the camera images (primary information), a moving body exists in a blind spot of the first moving body (vehicle CAR1) and that moving body is moving in a direction approaching the first moving body (vehicle CAR1), the information providing server 200 determines that provision is necessary. Conversely, when the moving body in the blind spot is moving away from the first moving body (vehicle CAR1), the information providing server 200 may determine that provision is unnecessary. Further, in this Method 3, the necessity of providing the secondary information may also be determined in consideration of the speed of each moving body. For example, when a moving body is stopped, the information providing server 200 may determine that provision is unnecessary regardless of its moving direction. In this way, it is also possible to determine whether or not secondary information needs to be provided by using the movement attributes of the moving body.
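- Purely as a non-limiting illustration, the decision rules of Methods 1 to 3 could be combined as in the following sketch. The Detection fields and the blind_spot_fn helper are assumptions; a simplified blind-spot check itself is sketched after the virtual-line discussion below.

```python
import math

PRIORITY_TYPES = {"motorcycle", "bicycle", "pedestrian"}  # example "specific types"

def needs_secondary_info(target, others, blind_spot_fn,
                         use_type_filter=False, use_motion_filter=False) -> bool:
    """Return True when secondary information should be provided to `target`.

    target, others: Detection-like objects with position, heading, speed, obj_type.
    blind_spot_fn(target, other): True when `other` is hidden from `target`.
    """
    for other in others:
        if not blind_spot_fn(target, other):
            continue                      # Method 1: only blind-spot objects matter

        if use_type_filter and other.obj_type not in PRIORITY_TYPES:
            continue                      # Method 2: only specific types trigger provision

        if use_motion_filter:
            if other.speed < 0.5:
                continue                  # Method 3 variant: ignore (near-)stopped objects
            # Method 3: only objects moving roughly toward the target trigger provision.
            dx = target.position[0] - other.position[0]
            dy = target.position[1] - other.position[1]
            bearing_to_target = math.degrees(math.atan2(dy, dx)) % 360
            if abs((other.heading - bearing_to_target + 180) % 360 - 180) > 90:
                continue

        return True                       # at least one relevant blind-spot object found
    return False
```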
- For example, based on the position information of the first moving body (vehicle CAR1), the information on the surrounding moving bodies (objects), and the map information of the surroundings, the information providing server 200 determines whether each moving body is in a blind spot depending on whether another moving body or structure exists on the straight line connecting the first moving body (vehicle CAR1) and that moving body. For example, as shown in FIG. 7, the information providing server 200 draws, on a map representing the layout of the intersection, two virtual lines (broken lines) connecting the sensor position of the vehicle CAR1 with the edges of the moving body in question.
- When another moving body or structure exists on these virtual lines, the information providing server 200 can determine that the moving body in question is in a blind spot of the first moving body (vehicle CAR1). For example, in the case of the motorcycle BIKE1 in FIG. 7, another moving body (vehicle CAR2) exists on the two virtual lines (broken lines), so the information providing server 200 determines that BIKE1 is in a blind spot of the first moving body (vehicle CAR1). Similarly, in the case of the pedestrian P1 in FIG. 7, a structure (the "building" at the lower left of FIG. 7) exists on the virtual lines, so the information providing server 200 determines that the pedestrian P1 is in a blind spot of the first moving body (vehicle CAR1). On the other hand, in the case of the vehicle CAR2 in FIG. 7, no other moving body or structure exists on the virtual lines (broken lines), so the information providing server 200 determines that the vehicle CAR2 is not in a blind spot of the first moving body (vehicle CAR1).
- The method for determining whether a moving body is in a blind spot is not limited to the above example, and various methods can be adopted. For example, a simpler method may be used in which a single virtual line (broken line) is drawn from the center position of the first moving body to the center position of another moving body, and that other moving body is determined to be in a blind spot when the line is obstructed partway.
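- The simpler, single-line variant described above can be illustrated by the following 2D sketch (an assumed example, not the claimed implementation): an object is treated as being in a blind spot when the segment from the sensor position of the first moving body to the object's center intersects the bounding box of any other moving body or structure (the object's own bounding box is excluded by the caller).

```python
def _segments_intersect(p1, p2, q1, q2) -> bool:
    """True when 2D segments p1-p2 and q1-q2 properly cross each other
    (collinear touching is ignored for simplicity)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def _segment_hits_box(p1, p2, box) -> bool:
    """box = (xmin, ymin, xmax, ymax); True when segment p1-p2 crosses or enters the box."""
    xmin, ymin, xmax, ymax = box
    # An endpoint lying inside the box also counts as an obstruction.
    for (x, y) in (p1, p2):
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return True
    corners = [(xmin, ymin), (xmax, ymin), (xmax, ymax), (xmin, ymax)]
    edges = list(zip(corners, corners[1:] + corners[:1]))
    return any(_segments_intersect(p1, p2, a, b) for a, b in edges)

def is_in_blind_spot(sensor_pos, object_pos, obstacle_boxes) -> bool:
    """Simplified virtual-line check: the object is in a blind spot when the
    line of sight from the sensor position to the object is obstructed."""
    return any(_segment_hits_box(sensor_pos, object_pos, box)
               for box in obstacle_boxes)
```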
- the "blind spot” is assumed to be the blind spot of the camera mounted on the first moving body (vehicle CAR1), but the example of the blind spot is not limited to this. For example, a blind spot from the "driver's point of view" may be assumed.
- the "blind spot” is not limited to the blind spot due to "visible light”, and may be a blind spot such as LiDAR or RADAR depending on the type of sensor mounted on the first moving body (vehicle CAR1).
- the first mobile body (vehicle CAR1) is configured to send an inquiry message including the position information of the first mobile body (vehicle CAR1) to the information providing server 200. You can also.
- the information providing server 200 transmits the created secondary information to the first mobile body (step S005). For example, by specifying the communication address of the transmission source from the inquiry message from the first mobile body (vehicle CAR1) and transmitting the secondary information to the communication address, the secondary information is transmitted to the vehicle CAR1. can do.
- FIG. 8 shows an example of the secondary information provided by the information providing server 200 to the vehicle CAR1, which is the first moving body in FIG. 6.
- In this example, the positional relationship of the motorcycle BIKE1 and the pedestrian P1, both located in blind spots as seen from the vehicle CAR1, is displayed on a screen of the vehicle CAR1 to call the driver's attention. Such a positional relationship may also be provided superimposed on a map.
- The form of providing the secondary information is not limited to the form illustrated in FIG. 8.
- For example, the secondary information may also be provided in a form that can be interpreted by the in-vehicle terminal (including a driving support device) of the first moving body (vehicle CAR1).
- The information indicating the positional relationship shown in FIG. 8 can be created by the following method.
- First, the information providing server 200 identifies identical moving bodies appearing in the camera images of the cameras 100A to 100D and removes the duplication. For example, when moving bodies of the same type and/or the same size are detected by a plurality of cameras at the same time and at the same position, the information providing server 200 identifies them as the same moving body. The information providing server 200 then creates secondary information representing the positional relationship between the first moving body (vehicle CAR1) and the objects identified in this way. Further, since the vehicle CAR2 in FIG. 8 is an object that can be detected from the first moving body (vehicle CAR1), it can be excluded from the information included in the secondary information. As a result, duplication of the objects appearing in the camera images of the cameras 100A to 100D and wasted information are removed. Similarly, when the secondary information is provided in a form interpretable by the in-vehicle terminal of the first moving body (vehicle CAR1), it can be created by identifying identical moving bodies to remove duplication and by excluding moving bodies already captured by the first moving body (vehicle CAR1).
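- As a hypothetical illustration of the duplicate-removal step (the field names and tolerance are assumptions), detections from different cameras taken at the same capture time can be merged when their type and map position agree within a tolerance; the same size could also be required.

```python
def merge_duplicate_detections(per_camera_detections, pos_tolerance_m=1.0):
    """Collapse cross-camera duplicates.

    per_camera_detections: one list per camera of detection dicts with keys
    'obj_type' and 'position' (x, y in a shared map frame), all for the same
    capture time.
    """
    merged = []
    for detections in per_camera_detections:
        for det in detections:
            duplicate = None
            for kept in merged:
                same_type = kept["obj_type"] == det["obj_type"]
                dx = kept["position"][0] - det["position"][0]
                dy = kept["position"][1] - det["position"][1]
                close = (dx * dx + dy * dy) ** 0.5 <= pos_tolerance_m
                if same_type and close:
                    duplicate = kept
                    break
            if duplicate is None:
                merged.append(dict(det))
            else:
                # Keep one entry; its position can be refined by averaging.
                duplicate["position"] = (
                    (duplicate["position"][0] + det["position"][0]) / 2.0,
                    (duplicate["position"][1] + det["position"][1]) / 2.0,
                )
    return merged
```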
- Further, the information on a plurality of moving bodies located in blind spots as seen from the first moving body may be aggregated into the same message or the same IP packet and transmitted to the first moving body (vehicle CAR1).
- If each camera independently transmitted IP-packetized information to the first moving body (vehicle), an IP header would be added to each packet and the ratio of IP headers to the total transmission data would become high.
- By aggregating the information as described above, the ratio of IP headers to the total transmission data can be reduced, and an effect of reducing the signaling load of the mobile communication network can also be expected.
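- A minimal sketch of this aggregation idea, under the assumption that the secondary information is serialized as a single JSON message (the message schema shown is hypothetical):

```python
import json

def build_aggregated_message(recipient_id, blind_spot_objects) -> bytes:
    """Pack all blind-spot objects for one recipient into a single payload so
    that one message (and thus one IP packet, if it fits) carries the whole
    notification instead of one packet per camera or per object."""
    payload = {
        "recipient": recipient_id,
        "blind_spot_objects": [
            {
                "type": obj["obj_type"],
                "position": obj["position"],
                "heading": obj.get("heading"),
                "speed": obj.get("speed"),
            }
            for obj in blind_spot_objects
        ],
    }
    return json.dumps(payload).encode("utf-8")
```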
- The secondary information created as described above is used in various forms in the first moving body (vehicle CAR1). For example, it can be transmitted to an in-vehicle terminal or a smartphone of the first moving body (vehicle CAR1) and displayed on those devices to present it to the driver. It is also possible to display it in AR (Augmented Reality) on the front window of the first moving body (vehicle CAR1). FIGS. 9 and 10 show examples of presenting secondary information on such a terminal and on a front window. In the example of FIG. 9, a message is displayed indicating that the motorcycle BIKE1 and the pedestrian P1 exist behind the vehicle CAR2 and the building that are visible as real images.
- In the example of FIG. 10, objects representing the motorcycle BIKE1 and the pedestrian P1 are displayed in AR behind the vehicle CAR2 and the building that are visible as real images.
- The objects (BIKE1 and P1) in FIG. 10 may be icons or frontal images estimated from the side images (primary information) of the motorcycle and the pedestrian obtained from the camera 100B. The speed estimated from the camera images, the distance from CAR1, and the like may also be displayed on these objects.
- The secondary information can also be used in forms other than those that appeal to the driver's vision. For example, the secondary information can be input to the in-vehicle terminal of the first moving body (vehicle CAR1) and used for automated driving or as driving support information, or it can be provided as information that complements a dynamic map for autonomous driving.
- As described above, according to the present embodiment, it is possible to efficiently transmit accurate secondary information to the first moving body (vehicle) entering the intersection from a specific direction.
- The reason is that a configuration is adopted in which the necessity of creating the secondary information is determined using the primary information acquired from the cameras 100A to 100D, and the secondary information is created with duplication removed.
- In the above, an example of providing secondary information to a first moving body (vehicle) entering an intersection from a specific direction has been described, but the situations to which the present invention can be applied are not limited to this example.
- For example, as in FIG. 11, a moving body (second moving body; in the case of FIG. 11, a pedestrian) may be located in a range that is a blind spot of the sensors and the like of the vehicle obj0 due to the presence of a parked vehicle obj1. The present invention can also be applied to inform the first moving body (vehicle obj0) of the existence of such a moving body (second moving body) located in a blind spot of its sensors and the like.
- Similarly, a pedestrian obj2 that is difficult for the sensors of the vehicle obj0 to capture may be present due to a vehicle obj1 parked in a parking lot. In such a case as well, the present invention can be applied to inform the vehicle obj0 of the existence of the pedestrian obj2.
- As described above, the first moving body may be a pedestrian or a bicycle in addition to a vehicle, and the present invention can be applied to any use in which the presence of a second moving body that is difficult to detect from a first moving body and is located around that first moving body is notified to the first moving body.
- FIG. 13 is a diagram showing a configuration of an information providing server according to a second embodiment of the present invention.
- The structural differences from the first embodiment shown in FIG. 2 are that an address acquisition unit 204 is added to the information providing server 200a and that the functions of the determination unit 201a and the transmission unit 203a are changed. Since the other configurations are the same as in the first embodiment, the differences are mainly described below.
- The determination unit 201a provides the address acquisition unit 204 with the camera images (primary information) obtained from the cameras 100A to 100D.
- The address acquisition unit 204 identifies an individual vehicle by reading the license plate information from the image of the moving body (vehicle) captured in the camera images (primary information) obtained from the cameras 100A to 100D. The address acquisition unit 204 then transmits the license plate information to a mobile body management server 300 arranged on the cloud and requests the IP (Internet Protocol) address of the in-vehicle terminal of the moving body (vehicle) having the corresponding license plate information.
- The mobile body management server 300 is a server that manages mobile body information in which license plate information is associated with the IP address of the in-vehicle terminal or the like of each moving body (vehicle). When the mobile body management server 300 receives a request from the information providing server 200a for the IP address corresponding to given license plate information, it responds to the information providing server 200a with that IP address.
- The transmission unit 203a uses the obtained IP address to transmit, to the moving body (vehicle), the information notifying it of the existence of other moving bodies located in its blind spots.
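- The address-resolution exchange of this embodiment might look like the following sketch; the management-server endpoint, query parameter, and response format are assumptions made only for illustration.

```python
import json
import urllib.parse
import urllib.request

MGMT_SERVER_URL = "https://mobility-mgmt.example.com/resolve"  # hypothetical endpoint

def resolve_vehicle_address(license_plate: str) -> str:
    """Ask the mobile body management server for the IP address of the
    in-vehicle terminal registered for the given license plate."""
    query = urllib.parse.urlencode({"plate": license_plate})
    with urllib.request.urlopen(f"{MGMT_SERVER_URL}?{query}", timeout=3) as resp:
        record = json.load(resp)
    return record["ip_address"]          # assumed response field

def push_secondary_info(license_plate: str, message: bytes, transport) -> None:
    """Push the secondary information to the vehicle identified on camera,
    without waiting for an inquiry message from the vehicle."""
    ip_address = resolve_vehicle_address(license_plate)
    transport.send(ip_address, message)  # e.g., a send over the mobile network
```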
- FIG. 14 is a flowchart showing the operation of the information providing server 200a according to the second embodiment of the present invention. The difference from the operation of the first embodiment shown in FIG. 5 is that steps S105 and S106 are added after step S004.
- After creating the secondary information to be provided to the first moving body (vehicle) (step S004), the information providing server 200a identifies the first moving body (vehicle) by reading the license plate information from its image captured in the camera images (primary information) obtained from the cameras 100A to 100D (step S105).
- Next, the information providing server 200a acquires the IP address of the in-vehicle terminal or the like of the identified first moving body (vehicle) from the mobile body management server 300 arranged on the cloud (step S106).
- Finally, the information providing server 200a transmits the secondary information created in step S004 to the first moving body (vehicle) using the acquired IP address (step S005).
- As described above, according to the present embodiment, the communication address can be identified and the information can be notified to the first moving body without receiving an inquiry message. Further, according to the present embodiment, information can be transmitted to the first moving body even if its in-vehicle terminal or the like does not have a function of requesting secondary information. In other words, in addition to the effects of the first embodiment, this embodiment has the advantage that the functions on the in-vehicle terminal side can be simplified.
- In the above description, the license plate information is read from the camera images obtained from the cameras 100A to 100D to identify the address of the in-vehicle terminal or the like, but the method of identifying the address of the terminal or the like is not limited to this. For example, if a server that associates a person's face image with an address is available on the cloud, a method can also be adopted in which the driver is identified from the face image captured in the camera images obtained from the cameras 100A to 100D and the terminal address of the first moving body is then obtained by querying that server. When this method is adopted, there is the advantage that secondary information can also be provided to general pedestrians and to people riding bicycles, not only to vehicle occupants.
- In that case, a server that links a person's face image with the address of an information terminal owned by that person is placed on the cloud, and the information providing server 200a inquires of that server the address of the information terminal possessed by the person corresponding to the face image. In this way, the present invention can be applied not only when the destination of the information is a vehicle but also when the destination is a pedestrian or a person riding a bicycle.
- FIG. 15 is a diagram showing the configuration of an information providing server according to a third embodiment of the present invention.
- The structural difference from the first embodiment shown in FIG. 2 is that the determination unit 201b of the information providing server 200b can also acquire, via a network, camera images from a camera 100E mounted on a peripheral moving body traveling nearby. Since the other configurations are the same as in the first embodiment, the differences are mainly described below.
- The determination unit 201b of the information providing server 200b acquires, via the network, camera images from the camera 100E mounted on one or more moving bodies passing near the intersection. The determination unit 201b can identify the acquisition position of each image from metadata, such as EXIF information, attached to the camera image.
- Alternatively, the network side may be provided with a function of forwarding camera images to the information providing server 200b based on the position information of the moving bodies, so that camera images from moving bodies heading toward the intersection are automatically transmitted to the information providing server 200b.
- In this way, the information providing server 200b of the present embodiment can acquire, as primary information, images from the camera 100E mounted on one or more moving bodies passing near the intersection.
- As a result, for example, it becomes possible to obtain an image of a vehicle CAR3 traveling even further behind the moving body (second moving body) BIKE1 located in a blind spot of the first moving body (vehicle CAR1).
- As described above, according to the present embodiment, by also using the camera of a nearby moving body as a sensor, it is possible to improve the accuracy of determining whether the secondary information needs to be created and to enrich the information included in the secondary information. In addition, although image quality may deteriorate due to backlighting depending on the position of the sun, such a decrease in accuracy can be prevented.
- FIG. 16 is a diagram showing a configuration of an information providing server according to a fourth embodiment of the present invention.
- The structural difference from the first embodiment shown in FIG. 2 is that a moving state acquisition unit 205 is added to the information providing server 200c and that the information creation unit 202c is configured to create the secondary information also using the information acquired by the moving state acquisition unit 205. Since the other configurations are the same as in the first embodiment, the differences are mainly described below.
- The moving state acquisition unit 205 of the information providing server 200c acquires the moving state of the first moving body (vehicle CAR1) captured by the cameras 100A to 100D.
- Here, the "moving state" represents a state related to the movement of the moving body and includes, for example, the traveling direction and the speed.
- In the following, a case where the traveling direction is acquired as the moving state is described as an example.
- For example, a method can be adopted in which the traveling direction is estimated from the movement of the image of the first moving body (vehicle CAR1) captured in the camera images obtained from the cameras 100A to 100D and from the state of its direction indicators.
- The method of acquiring the traveling direction of the first moving body (vehicle CAR1) is not limited to this, and various methods can be adopted. For example, when a lane at the intersection has a specified direction of travel (for example, a right-turn lane), the traveling direction can be estimated from the lane in which the first moving body (vehicle CAR1) captured in the camera images is located. Further, when the information providing server 200c can obtain the travel route plan (route information of a car navigation system) of the first moving body (vehicle CAR1) from its in-vehicle terminal, the traveling direction may be estimated using that travel route plan. The traveling direction of the first moving body may also be estimated using its steering angle.
- Furthermore, when the lighting pattern of the traffic signal at the intersection specifies the direction of travel (for example, a combination of a red light and a right arrow), the lighting state of the traffic signal shown in the camera images and the control information of the traffic signal can be used to estimate the traveling direction of the first moving body (vehicle CAR1).
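- As one hedged illustration of the image-based variant (a sketch only; the input is assumed to be a short history of map positions of the first moving body), the traveling direction can be estimated from successive positions:

```python
import math

def estimate_heading(track, min_displacement_m=0.5):
    """Estimate the traveling direction (degrees, 0 = +x axis, counterclockwise)
    of a tracked moving body from its two most recent map positions. Returns
    None when the displacement is too small to be reliable (e.g., stopped)."""
    if len(track) < 2:
        return None
    (x0, y0), (x1, y1) = track[-2], track[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_displacement_m:
        return None
    return math.degrees(math.atan2(dy, dx)) % 360
```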
- The information creation unit 202c creates the secondary information using, in addition to the camera images of the cameras 100A to 100D, the traveling direction of the first moving body (vehicle CAR1) obtained as described above.
- Specifically, the information creation unit 202c gives a higher priority to objects lying in the traveling direction of the first moving body (vehicle CAR1) among the objects (second moving bodies) in the blind spots of the first moving body (vehicle CAR1), and creates the secondary information taking that priority into account.
- FIG. 17 is a flowchart showing the operation of the information providing server 200c according to the fourth embodiment of the present invention.
- The difference from the operation of the first embodiment shown in FIG. 5 is that steps S204 and S205 are added after step S003.
- When the information providing server 200c determines that secondary information is to be provided to the moving body (first moving body) (Yes in step S003), it acquires the moving state (traveling direction information) of the moving body (step S204).
- Next, the information providing server 200c determines the importance assigned to each object using the arrangement of the objects and the moving state (traveling direction) of the moving body (step S205), and then creates the secondary information taking the determined importance into account (step S004). For example, suppose that the moving state (traveling direction) indicating that the first moving body (vehicle CAR1) will turn right has been obtained. In this case, as shown in FIG. 18, among the objects (BIKE1, P1) in the blind spots of the first moving body (vehicle CAR1), the information providing server 200c sets a higher importance for the motorcycle BIKE1, which lies in the traveling direction of the first moving body (vehicle CAR1). The information providing server 200c then creates secondary information that strongly calls attention to the motorcycle BIKE1. The pedestrian P1, to which a low importance has been assigned, may be omitted from the secondary information at this time.
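- A hypothetical sketch of this importance assignment (the cone half-angle and the two-level weighting are assumptions): objects in the planned turning direction receive a higher importance, and low-importance objects may then be dropped from the secondary information.

```python
import math

def assign_importance(target_pos, planned_heading_deg, blind_spot_objects,
                      cone_half_angle_deg=45.0):
    """Give importance 2 to blind-spot objects lying within a cone around the
    target's planned traveling direction, and importance 1 to the rest."""
    ranked = []
    for obj in blind_spot_objects:
        dx = obj["position"][0] - target_pos[0]
        dy = obj["position"][1] - target_pos[1]
        bearing = math.degrees(math.atan2(dy, dx)) % 360
        diff = abs((bearing - planned_heading_deg + 180) % 360 - 180)
        importance = 2 if diff <= cone_half_angle_deg else 1
        ranked.append({**obj, "importance": importance})
    return ranked

def filter_by_importance(ranked_objects, min_importance=2):
    """Optionally omit low-importance objects (e.g., the pedestrian P1 in FIG. 18)."""
    return [o for o in ranked_objects if o["importance"] >= min_importance]
```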
- According to the information providing server 200c of the present embodiment operating as described above, it becomes possible to further narrow down the information transmitted to the first moving body (vehicle CAR1) compared with the first to third embodiments. This makes it possible to inform the driver of the first moving body (vehicle CAR1) and its in-vehicle devices of the existence of moving bodies in blind spots more efficiently.
- In the above, the secondary information is created using the importance assigned to each object, but the use of the importance is not limited to this. For example, the appearance of each object may be varied according to its importance when the secondary information is displayed as an image on an in-vehicle terminal.
- The transmission form of the secondary information may also differ depending on the importance assigned to each object. For example, secondary information notifying the existence of a highly important object may be transmitted in a push manner without waiting for a request from the first moving body (vehicle CAR1) (see the second embodiment), while other information is transmitted in response to a request from the first moving body (vehicle CAR1).
- Further, the speed of the first moving body may be used as the moving state. The speed of the first moving body may be acquired from a speed sensor or obtained by analyzing the images, or it may be estimated from the position of the shift lever. By using speed information as the moving state, a low importance can be set, for example, when the speed is equal to or less than a predetermined threshold (such as when the vehicle is stopped or moving slowly).
- FIG. 19 is a diagram showing a configuration of an information providing server according to a fifth embodiment of the present invention.
- The structural difference from the first embodiment shown in FIG. 2 is that a moving state acquisition unit 205 is added to the determination unit 201d of the information providing server 200d and that the determination unit 201d is configured to determine whether the secondary information needs to be created also using the information acquired by the moving state acquisition unit 205. Since the other configurations are the same as in the first embodiment, the differences are mainly described below.
- The moving state acquisition unit 205 of the information providing server 200d acquires the moving state (traveling direction) of the first moving body (vehicle CAR1) captured by the cameras 100A to 100D. Since the method of acquiring the moving state (traveling direction) of the first moving body (vehicle CAR1) is the same as in the fourth embodiment, its description is omitted. In the following description, a case where the traveling direction is acquired as the moving state is used as an example.
- The determination unit 201d determines whether secondary information needs to be provided by using, in addition to the camera images of the cameras 100A to 100D, the moving state (traveling direction) of the first moving body (vehicle CAR1) obtained as described above. Specifically, the determination unit 201d judges the necessity of providing secondary information depending on whether an object in a blind spot of the first moving body (vehicle CAR1) lies in the traveling direction of the first moving body (vehicle CAR1).
- FIG. 20 is a flowchart showing the operation of the information providing server 200d according to the fifth embodiment of the present invention.
- The differences from the operation of the first embodiment shown in FIG. 5 are that the moving state is acquired in addition to the primary information in step S301 and that the moving state is analyzed in addition to the primary information in step S302.
- For example, when the traveling direction indicating that the first moving body (vehicle CAR1) will turn right is obtained, the information providing server 200d determines that secondary information needs to be provided because, among the objects (BIKE1, P1) in the blind spots of the first moving body (vehicle CAR1), the motorcycle BIKE1 lies in the traveling direction of the vehicle CAR1.
- On the other hand, when no object in a blind spot lies in the traveling direction of the vehicle CAR1, the information providing server 200d determines that secondary information does not need to be provided.
- According to the information providing server 200d of the present embodiment operating as described above, it becomes possible to further narrow down the information transmitted to the first moving body (vehicle CAR1) compared with the first to fourth embodiments. This makes it possible to inform the driver of the first moving body (vehicle CAR1) and its in-vehicle devices of the existence of moving bodies in blind spots more efficiently.
- Also in this embodiment, the speed of the first moving body may be used as the moving state. The speed of the first moving body may be acquired from a speed sensor or obtained by analyzing the images, or it may be estimated from the position of the shift lever. For example, when the speed is equal to or less than a predetermined threshold (such as when the vehicle is stopped or moving very slowly), it can be determined that provision is unnecessary.
- Furthermore, the moving state (traveling direction, speed, and the like) of a moving body other than the first moving body may be acquired to determine whether secondary information needs to be provided. For example, when a moving body is present in a blind spot of the first moving body, its traveling direction and speed may be acquired and the necessity of provision determined accordingly.
- FIG. 21 is a diagram showing a configuration of an information providing server according to a sixth embodiment of the present invention.
- The structural difference from the first embodiment shown in FIG. 2 is that a motion prediction unit 206 is added to the determination unit 201e of the information providing server 200e and that the determination unit 201e is configured to determine the necessity of creating secondary information also using the information acquired by the motion prediction unit 206. Since the other configurations are the same as in the first embodiment, the differences are mainly described below.
- The motion prediction unit 206 of the information providing server 200e predicts the movement of the first moving body (vehicle CAR1) from the state of the first moving body (vehicle CAR1) captured by the cameras 100A to 100D.
- For example, a method can be adopted in which the movement after a predetermined time is predicted from the images of the first moving body (vehicle CAR1) captured in the camera images obtained from the cameras 100A to 100D.
- The method of predicting the movement of the first moving body (vehicle CAR1) is not limited to this, and various methods can be adopted.
- For example, when the information providing server 200e can obtain instrument information of the first moving body (vehicle CAR1), such as the steering angle, speedometer readings, and GPS information, from its in-vehicle terminal, the movement after a predetermined time may be estimated using that information. Further, when the lighting state of the traffic signal at the intersection and the control information of the traffic signal can be acquired, these can be used to predict the movement of the first moving body (vehicle CAR1). For example, if the light of the traffic signal in the traveling direction of the first moving body (vehicle CAR1) is red, it can be predicted that the first moving body (vehicle CAR1) will not move for a while.
- Conversely, when information is obtained indicating that the traffic signal in the traveling direction of the first moving body (vehicle CAR1) will soon turn green, it can be predicted that the first moving body (vehicle CAR1) will start moving after a predetermined time. Further, when the first moving body is a vehicle and its travel route plan (route information of a car navigation system) can be obtained, the movement may be predicted using that travel route plan. Also, when a lane at the intersection has a specified direction of travel (for example, a right-turn lane), the movement can be predicted using the lane in which the first moving body (vehicle CAR1) captured in the camera images is located.
- The determination unit 201e determines whether secondary information needs to be provided by using, in addition to the camera images of the cameras 100A to 100D, the predicted future movement of the first moving body (vehicle CAR1) obtained as described above. Specifically, the determination unit 201e determines the necessity of providing secondary information based not only on the presence or absence of objects in the blind spots of the first moving body (vehicle CAR1) but also on whether the first moving body (vehicle CAR1) will remain stopped for a while.
- FIG. 22 is a flowchart showing the operation of the information providing server 200e according to the sixth embodiment of the present invention.
- step S501 for predicting the movement of the moving body is added after step S002.
- when the prediction result indicates that the first moving body (vehicle CAR1) will remain stopped for a while, the information providing server 200e determines that it is not necessary to provide secondary information even if there is an object in the blind spot of the first moving body (vehicle CAR1). Conversely, when the prediction result indicates that the first moving body (vehicle CAR1) is moving or will soon start moving, the information providing server 200e determines that it is necessary to provide the secondary information.
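- As a further illustration only (the patent defines no code), the decision described above, with step S501 added to the flow of FIG. 22, might be sketched as follows; `predicted_state` is assumed to be the value produced by the hypothetical `predict_movement` helper in the previous sketch, and the detection structure is likewise invented.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class DetectedObject:
    object_id: str
    in_blind_spot: bool   # True if the object lies in the first moving body's blind spot


def need_secondary_info(detections: List[DetectedObject],
                        predicted_state: str) -> bool:
    """Decision after step S501 (movement prediction) in the flow of FIG. 22.

    Secondary information is needed only if (a) something is in the blind spot and
    (b) the first moving body is moving or about to move.
    """
    has_hidden_object = any(o.in_blind_spot for o in detections)
    if not has_hidden_object:
        return False
    # Even with a hidden object, a vehicle predicted to stay stopped for a while
    # does not need to be warned immediately.
    return predicted_state in ("moving", "about_to_move")
```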
- with the information providing server 200e of the present embodiment operating as described above, it is possible to narrow down the information transmitted to the first moving body (vehicle CAR1) further than in the first to fifth embodiments. This makes it possible to more efficiently inform the driver and the in-vehicle device of the first moving body (vehicle CAR1) of the existence of a moving body in the blind spot.
- secondary information may also be created by predicting the future movements of the moving bodies in the blind spot. For example, if there are two moving bodies in the blind spot of the first moving body and their future movements differ, different levels of importance may be set for them accordingly.
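- A rough sketch of such importance weighting is given below; it is an assumption-laden illustration rather than the patent's method, and the thresholds and field names are invented for the example.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class BlindSpotBody:
    body_id: str
    predicted_speed_kmh: float     # predicted speed over the next few seconds
    approaching_first_body: bool   # whether its predicted path nears the first moving body


def rank_importance(bodies: List[BlindSpotBody]) -> Dict[str, int]:
    """Assign a higher importance to blind-spot bodies whose predicted movement
    brings them toward the first moving body; slower or receding bodies rank lower."""
    importance: Dict[str, int] = {}
    for b in bodies:
        score = 1                          # baseline: present in the blind spot
        if b.approaching_first_body:
            score += 2                     # likely to cross the first body's path
        if b.predicted_speed_kmh > 15.0:
            score += 1                     # fast movers leave less reaction time
        importance[b.body_id] = score
    return importance
```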
- FIG. 23 is a diagram showing the configuration of the seventh embodiment of the present invention.
- the difference from the first embodiment is that the information providing server 200g is placed at the edge, on the side of the cameras 100A to 100D, of the network that transmits the primary information acquired from the cameras 100A to 100D to the predetermined control server 500.
- in FIG. 23, a configuration is shown in which the cameras 100A to 100D are connected to the control server 500 via the base station 600, the mobile backhaul 700, the gateway (GW) 800, and the Internet 900.
- the base station 600 transmits camera images taken by the cameras 100A to 100D to the control server 500 and the information providing server 200g.
- the control server 500 performs the information processing necessary for its control operations by using the camera images taken by the cameras 100A to 100D.
- the information providing server 200g performs the same operation as in the first embodiment by using the camera images of the cameras 100A to 100D received from the base station 600, creates secondary information, and, when necessary, transmits the secondary information to the moving body (the first moving body) via the base station 600. The information providing server 200g therefore functions as a kind of mobile edge computing server (MEC server). Since the mobile backhaul 700, the gateway (GW) 800, and the Internet 900 are configurations well known to those skilled in the art, their description is omitted.
- with the configuration of this embodiment, secondary information can be provided to a moving body (the first moving body) by adding the information providing server to an existing traffic control system. Further, as described above, since the information providing server 200g is arranged at the edge, on the side of the cameras 100A to 100D, of the network that transmits the primary information to the predetermined control server 500, there is the advantage that the processing delay can be reduced compared with the case where equivalent secondary information is provided from the control server 500 side.
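- Purely as an illustrative sketch under assumed interfaces (none of the functions below are defined by the patent), an edge deployment of the information providing server can be pictured as a loop that consumes camera frames relayed by the base station and returns secondary information over the same local path, avoiding the extra round trip to the control server:

```python
from typing import Iterable, Optional


def frames_from_base_station() -> Iterable[dict]:
    """Stand-in for camera images relayed by base station 600 (assumed interface)."""
    yield {"camera_id": "100A", "detections": []}
    yield {"camera_id": "100B", "detections": [{"label": "BIKE1", "in_blind_spot": True}]}


def create_secondary_info(frame: dict) -> Optional[dict]:
    """Stand-in for the determination and information-creation steps of the first embodiment."""
    hidden = [d for d in frame["detections"] if d.get("in_blind_spot")]
    if not hidden:
        return None
    return {"warning": "moving body in blind spot", "objects": hidden, "source": frame["camera_id"]}


def send_via_base_station(address: str, info: dict) -> None:
    """Stand-in for transmitting secondary information back through base station 600."""
    print(f"-> {address}: {info}")


def edge_server_loop(first_body_address: str) -> None:
    # Running this loop at the network edge avoids the extra hop to control server 500.
    for frame in frames_from_base_station():
        info = create_secondary_info(frame)
        if info is not None:
            send_via_base_station(first_body_address, info)


if __name__ == "__main__":
    edge_server_loop("198.51.100.23")   # assumed address of the first moving body
```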
- the present invention is not limited to the above-described embodiments, and further modifications, substitutions, and adjustments can be made without departing from the basic technical idea of the present invention.
- the network configuration, the configuration of each element, and the representation form of the data shown in each drawing are examples for assisting the understanding of the present invention, and the present invention is not limited to the configurations shown in these drawings.
- the two cameras are arranged so that their shooting directions intersect at right angles, but the number and arrangement of the cameras are not limited to this.
- another embodiment can be configured by combining the features of two or more embodiments arbitrarily selected.
- the procedures shown in the first to seventh embodiments described above can be realized by a program that causes a computer (9000 in FIG. 24) functioning as the information providing server 200 to 200g to implement the functions of the information providing server 200 to 200g.
- such a computer is exemplified in FIG. 24 by a configuration including a CPU (Central Processing Unit) 9010, a communication interface 9020, a memory 9030, and an auxiliary storage device 9040. That is, the CPU 9010 in FIG. 24 may execute a data processing program or a data transmission program and update each calculation parameter held in the auxiliary storage device 9040 or the like.
- each part (processing means, function) of the information providing servers 200 to 200g shown in each of the above-described embodiments can be realized by a computer program that causes the processor mounted on these devices to execute each of the above-described processes using its hardware.
- the determination means of the information providing server described above may identify objects existing in the sensing range of the plurality of sensors from the primary information acquired from the plurality of sensors, extract, from the positions of the identified objects and the first moving body, the second moving body existing in the blind spot of the first moving body, and determine whether or not to provide the secondary information based on the information about the extracted second moving body;
- the information creation means may adopt a configuration in which information including information about the second moving body is created as the secondary information.
- the information providing server described above may adopt a configuration in which, for an object included in the primary information acquired from the plurality of sensors, the identity of the object between the plurality of sensors is determined, and the secondary information is created based on the result of the identity determination.
- the information providing server described above may further include address acquisition means for performing individual identification of the first moving body using the primary information and acquiring, based on the result of the individual identification, a communication address assigned to the first moving body;
- the transmission means may be configured to transmit the secondary information to the communication address.
- the determination means of the information providing server described above may determine whether or not to provide the secondary information based on the position information included in a message received from the first moving body, and the transmission means may transmit the secondary information to the communication address of the transmission source of the message.
- as the sensors connected to the information providing server described above, a configuration may be adopted that includes at least one of a sensor installed on the road or a sensor provided on a moving body traveling on the road.
- the information providing server described above may further be capable of acquiring the movement state of a moving body traveling on the road, and a configuration may be adopted in which the determination means determines, using the movement state, whether or not to provide the secondary information to the first moving body.
- the information providing server described above may further be capable of acquiring the movement state of a moving body traveling on the road, and a configuration may be adopted in which the information creation means obtains the importance of each of the second moving bodies using the movement state and creates the secondary information in consideration of the importance.
- the information providing server described above may use, as the movement state, at least one of the lighting state of a nearby traffic signal, the travel route plan information of a moving body traveling on the road, the travel lane information of a moving body traveling on the road, or the speed information of a moving body traveling on the road.
- the determination means of the information providing server described above may adopt a configuration in which it predicts the movement of a moving body traveling on the road and determines, using the result of the movement prediction, whether or not to provide the secondary information to the moving body traveling on the road.
- the determination means of the information providing server described above may adopt a configuration in which it predicts the movement of a moving body traveling on the road, obtains the importance of each of the second moving bodies using the result of the movement prediction, and creates the secondary information in consideration of the importance.
- the determination means of the information providing server described above may adopt a configuration in which it predicts the movement of a moving body traveling on the road based on at least one of the lighting state of a nearby traffic signal, the travel route plan information of a moving body traveling on the road, the travel lane information of that moving body, or the speed information of a moving body traveling on the road.
- the information providing server described above may be a server arranged at the edge, on the sensor side, of the network that transmits the primary information acquired from each of the plurality of sensors to a predetermined control server.
- [15th form] (Refer to the information provision method from the second viewpoint above)
- [16th form] (Refer to the program from the third viewpoint above)
- the 15th to 16th forms can be expanded into the second to 14th forms in the same manner as the first form.
- any numerical value or sub-range included in a stated range should be construed as being specifically described even if not otherwise stated.
- each of the matters disclosed in the documents cited above may be used, in part or in whole, in combination with the matters described in this document as part of the disclosure of the present invention, as necessary and in accordance with the purpose of the present invention, and is deemed to be included in the disclosure of the present application.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Next, a first embodiment of the present invention will be described in detail with reference to the drawings. FIG. 2 is a diagram showing the configuration of the first embodiment of the present invention. Referring to FIG. 2, an information providing server 200 connected to cameras 100A to 100D serving as a plurality of sensors is shown.
(Method 1) For example, a method can be adopted in which, as a result of analyzing the camera images (primary information), the information providing server 200 determines that provision is necessary when a moving body is present in the blind spot of the first moving body (vehicle CAR1) to which secondary information is to be provided, and determines that provision is unnecessary when no such moving body is present. For example, in the case of FIG. 6, since the two-wheeled vehicle BIKE1 and the pedestrian P1 are present in the blind spot of the first moving body (vehicle CAR1), the information providing server 200 determines that provision is necessary.
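As an illustrative sketch only (the patent specifies no code), Method 1 above can be pictured as the following check; the road-plane coordinates, the rectangular blind-spot approximation, and the structure names are simplified assumptions introduced here.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]


@dataclass
class Detection:
    label: str        # e.g. "BIKE1", "P1"
    position: Point   # road-plane coordinates estimated from the camera images


def in_rectangle(p: Point, rect: Tuple[Point, Point]) -> bool:
    (x1, y1), (x2, y2) = rect
    return min(x1, x2) <= p[0] <= max(x1, x2) and min(y1, y2) <= p[1] <= max(y1, y2)


def provision_needed(detections: List[Detection],
                     blind_spot_rect: Tuple[Point, Point]) -> bool:
    """Method 1: provide secondary information iff some moving body lies in the
    blind spot of the first moving body (the blind spot is approximated here by a
    rectangle assumed to be computed elsewhere from CAR1's position)."""
    return any(in_rectangle(d.position, blind_spot_rect) for d in detections)


# Example corresponding to FIG. 6: BIKE1 and P1 fall inside CAR1's blind spot.
if __name__ == "__main__":
    blind_spot = ((10.0, 0.0), (25.0, 8.0))   # assumed road-plane rectangle
    objs = [Detection("BIKE1", (12.0, 3.0)), Detection("P1", (20.0, 6.0))]
    print(provision_needed(objs, blind_spot))  # True -> server 200 decides to provide
```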
Next, a second embodiment, in which the method of transmitting information to a specific moving body is modified, will be described in detail with reference to the drawings. In the second embodiment as well, various moving bodies such as pedestrians and bicycles, in addition to vehicles, are conceivable as destinations of the service of the information providing server 200a, but in the following description an example in which the service target moving body is a vehicle is described. FIG. 13 is a diagram showing the configuration of the information providing server according to the second embodiment of the present invention. The structural difference from the first embodiment shown in FIG. 2 is that an address acquisition unit 204 is added to the information providing server 200a and the functions of the determination unit 201a and the transmission unit 203a are modified. Since the other components are the same as in the first embodiment, the following description focuses on these differences.
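A minimal sketch of the role played by the address acquisition unit 204 follows, under the assumptions, introduced here for illustration only, that individual identification yields something like a license-plate string and that the server holds a registration table mapping it to a communication address.

```python
from typing import Dict, Optional

# Assumed registration table: identification result -> communication address.
REGISTERED_ADDRESSES: Dict[str, str] = {
    "plate:ABC-1234": "198.51.100.23",   # e.g. the address assigned to vehicle CAR1
}


def identify_first_moving_body(camera_detection: dict) -> Optional[str]:
    """Stand-in for individual identification from the primary information
    (e.g. number-plate recognition on the camera image)."""
    plate = camera_detection.get("plate_text")
    return f"plate:{plate}" if plate else None


def acquire_address(camera_detection: dict) -> Optional[str]:
    """Address acquisition unit 204 (sketch): map the identification result to the
    communication address to which the transmission unit 203a will send."""
    identity = identify_first_moving_body(camera_detection)
    if identity is None:
        return None
    return REGISTERED_ADDRESSES.get(identity)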
Next, a third embodiment, in which a camera mounted on a moving body is used as one of the sensors of each of the above embodiments, will be described in detail with reference to the drawings. In the third embodiment as well, various moving bodies such as pedestrians and bicycles, in addition to vehicles, are conceivable both as the destination of the service of the information providing server 200b and as the moving body from which camera images are acquired, but in the following description an example in which each of these moving bodies is a vehicle is described. FIG. 15 is a diagram showing the configuration of the information providing server according to the third embodiment of the present invention. The structural difference from the first embodiment shown in FIG. 2 is that the determination unit 201b of the information providing server 200b can acquire, via a network, camera images from a camera 100E mounted on a peripheral moving body passing by. Since the other components are the same as in the first embodiment, the following description focuses on this difference.
Next, a fourth embodiment, in which the information providing server provides secondary information in consideration of the movement state of moving bodies, will be described in detail with reference to the drawings. In the fourth embodiment as well, various moving bodies such as pedestrians and bicycles, in addition to vehicles, are conceivable, but in the following description an example in which the moving body is a vehicle is described. FIG. 16 is a diagram showing the configuration of the information providing server according to the fourth embodiment of the present invention. The structural difference from the first embodiment shown in FIG. 2 is that a movement state acquisition unit 205 is added to the information providing server 200c, and the information creation unit 202c is configured to create the secondary information also using the information acquired by the movement state acquisition unit 205. Since the other components are the same as in the first embodiment, the following description focuses on these differences.
[Fifth Embodiment]
Next, a fifth embodiment, in which the information providing server determines whether secondary information needs to be provided in consideration of the movement state of moving bodies, will be described in detail with reference to the drawings. In the fifth embodiment as well, various moving bodies such as pedestrians and bicycles, in addition to vehicles, are conceivable, but in the following description an example in which the moving body is a vehicle is described. FIG. 19 is a diagram showing the configuration of the information providing server according to the fifth embodiment of the present invention. The structural difference from the first embodiment shown in FIG. 2 is that a movement state acquisition unit 205 is added inside the determination unit 201d of the information providing server 200d, and the determination unit 201d is configured to determine whether secondary information needs to be created also using the information acquired by the movement state acquisition unit 205. Since the other components are the same as in the first embodiment, the following description focuses on these differences.
Next, a sixth embodiment, in which the information providing server determines whether secondary information needs to be provided in consideration of the future movement of the moving body, will be described in detail with reference to the drawings. In the sixth embodiment as well, various moving bodies such as pedestrians and bicycles, in addition to vehicles, are conceivable, but in the following description an example in which the moving body is a vehicle is described. FIG. 21 is a diagram showing the configuration of the information providing server according to the sixth embodiment of the present invention. The structural difference from the first embodiment shown in FIG. 2 is that a motion prediction unit 206 is added inside the determination unit 201e of the information providing server 200e, and the determination unit 201e is configured to determine whether secondary information needs to be created also using the information acquired by the motion prediction unit 206. Since the other components are the same as in the first embodiment, the following description focuses on these differences.
Next, a seventh embodiment, in which the information providing server functions as an MEC (Mobile Edge Computing / Multi-access Edge Computing) server, will be described in detail with reference to the drawings. FIG. 23 is a diagram showing the configuration of the seventh embodiment of the present invention. The difference from the first embodiment shown in FIG. 2 and subsequent figures is that the information providing server 200g is arranged at the edge, on the side of the cameras 100A to 100D, of the network that transmits the primary information acquired from each of the cameras 100A to 100D to a predetermined control server 500.
[1st form]
(Refer to the information providing server according to the first viewpoint above)
[2nd form]
The determination means of the information providing server described above may be configured to determine whether or not to provide the secondary information based on at least one of whether a second moving body that is difficult to detect from the first moving body is present in the vicinity of the first moving body, the type of the second moving body, or a movement attribute of the second moving body, and to determine that the secondary information is to be provided when it determines that a second moving body that is difficult to detect from the first moving body is present in the vicinity of the first moving body.
[3rd form]
The determination means of the information providing server described above may identify objects present in the sensing range of the plurality of sensors from the primary information acquired from the plurality of sensors, extract, from the identified objects and the position of the first moving body, the second moving body present in the blind spot of the first moving body, and determine whether or not to provide the secondary information based on information about the extracted second moving body, and
the information creation means may be configured to create, as the secondary information, information including information about the second moving body.
[4th form]
The information providing server described above may be configured to determine, for an object included in the primary information acquired from the plurality of sensors, the identity of the object between the plurality of sensors, and to create the secondary information based on the result of the identity determination.
[5th form]
The information providing server described above may
further include address acquisition means for performing individual identification of the first moving body using the primary information and acquiring, based on the result of the individual identification, a communication address assigned to the first moving body, and the transmission means may be configured to transmit the secondary information to the communication address.
[6th form]
The determination means of the information providing server described above may determine whether or not to provide the secondary information based on position information included in a message received from the first moving body, and the transmission means may be configured to transmit the secondary information to the communication address of the transmission source of the message.
[7th form]
As the sensors connected to the information providing server described above, a configuration may be adopted that includes at least one of a sensor installed on the road or a sensor provided on a moving body traveling on the road.
[8th form]
The information providing server described above may further be capable of acquiring the movement state of a moving body traveling on the road, and
a configuration may be adopted in which the determination means determines, using the movement state, whether or not to provide the secondary information to the first moving body.
[9th form]
The information providing server described above may further be capable of acquiring the movement state of a moving body traveling on the road, and
a configuration may be adopted in which the information creation means obtains the importance of each of the second moving bodies using the movement state and creates the secondary information in consideration of the importance.
[10th form]
In the information providing server described above, a configuration may be adopted in which at least one of the lighting state of a nearby traffic signal, the travel route plan information of a moving body traveling on the road, the travel lane information of a moving body traveling on the road, or the speed information of a moving body traveling on the road is used as the movement state.
[11th form]
The determination means of the information providing server described above may be configured to predict the movement of a moving body traveling on the road and to determine, using the result of the movement prediction, whether or not to provide the secondary information to the moving body traveling on the road.
[12th form]
The determination means of the information providing server described above may be configured to predict the movement of a moving body traveling on the road, obtain the importance of each of the second moving bodies using the result of the movement prediction, and create the secondary information in consideration of the importance.
[13th form]
The determination means of the information providing server described above may be configured to predict the movement of a moving body traveling on the road based on at least one of the lighting state of a nearby traffic signal, the travel route plan information of a moving body traveling on the road, the travel lane information of the moving body traveling on the road, or the speed information of a moving body traveling on the road.
[14th form]
The information providing server described above may be
a server arranged at the edge, on the sensor side, of a network that transmits the primary information respectively acquired from the plurality of sensors to a predetermined control server.
[15th form]
(Refer to the information providing method according to the second viewpoint above)
[16th form]
(Refer to the program according to the third viewpoint above)
The 15th and 16th forms described above can, like the 1st form, be expanded into the 2nd to 14th forms.
20, 200, 200a to 200g  Information providing server
21, 201, 201a to 201e  Determination unit
22, 202, 202c  Information creation unit
23, 203, 203a  Transmission unit
100-1, 100-2, 100A to 100E  Camera
204  Address acquisition unit
205  Movement state acquisition unit
206  Motion prediction unit
300  Moving body management server
400A to 400D  Traffic signal
500  Control server
600  Base station
700  Mobile backhaul
800  Gateway (GW)
900  Internet
CAR1, CAR2  Vehicle
BIKE1  Two-wheeled vehicle
P1  Pedestrian
obj0 to obj2  Object / image
9000  Computer
9010  CPU
9020  Communication interface
9030  Memory
9040  Auxiliary storage device
Claims (16)
- An information providing server comprising: determination means for determining, based on primary information respectively acquired from a plurality of sensors that sense a predetermined range of a road, whether or not to provide, to a first moving body traveling on the road, secondary information created using the primary information acquired from the plurality of sensors; information creation means for creating the secondary information using the primary information acquired from the plurality of sensors when it is determined that the secondary information is to be provided to the first moving body; and transmission means for transmitting the secondary information to the first moving body.
- The information providing server according to claim 1, wherein the determination means determines whether or not to provide the secondary information based on at least one of whether a second moving body that is difficult to detect from the first moving body is present in the vicinity of the first moving body, the type of the second moving body, or a movement attribute of the second moving body.
- The information providing server according to claim 2, wherein the determination means identifies objects present in the sensing range of the plurality of sensors from the primary information acquired from the plurality of sensors, extracts, from the identified objects and the position of the first moving body, the second moving body present in the blind spot of the first moving body, and determines whether or not to provide the secondary information based on information about the extracted second moving body, and the information creation means creates, as the secondary information, information including information about the second moving body.
- The information providing server according to any one of claims 1 to 3, wherein the information creation means determines, for an object included in the primary information acquired from the plurality of sensors, the identity of the object between the plurality of sensors, and creates the secondary information based on the result of the identity determination.
- The information providing server according to any one of claims 1 to 4, further comprising address acquisition means for performing individual identification of the first moving body using the primary information and acquiring, based on the result of the individual identification, a communication address assigned to the first moving body, wherein the transmission means transmits the secondary information to the communication address.
- The information providing server according to any one of claims 1 to 4, wherein the determination means determines whether or not to provide the secondary information based on position information included in a message received from the first moving body, and the transmission means transmits the secondary information to the communication address of the transmission source of the message.
- The information providing server according to any one of claims 1 to 6, wherein the sensors include at least one of a sensor installed on the road or a sensor provided on a moving body traveling on the road.
- The information providing server according to any one of claims 1 to 7, which is further capable of acquiring a movement state of a moving body traveling on the road, wherein the determination means determines, using the movement state, whether or not to provide the secondary information to the first moving body.
- The information providing server according to any one of claims 2 to 7, which is further capable of acquiring a movement state of a moving body traveling on the road, wherein the information creation means obtains, using the movement state, an importance of each of the second moving bodies and creates the secondary information in consideration of the importance.
- The information providing server according to claim 8 or 9, wherein the movement state is generated based on at least one of a lighting state of a nearby traffic signal, travel route plan information of a moving body traveling on the road, travel lane information of a moving body traveling on the road, or speed information of a moving body traveling on the road.
- The information providing server according to any one of claims 1 to 7, wherein the determination means predicts a movement of a moving body traveling on the road and determines, using the result of the movement prediction, whether or not to provide the secondary information to the moving body traveling on the road.
- The information providing server according to any one of claims 2 to 7, wherein the determination means predicts a movement of a moving body traveling on the road, obtains, using the result of the movement prediction, an importance of each of the second moving bodies, and creates the secondary information in consideration of the importance.
- The information providing server according to claim 11 or 12, wherein the determination means predicts the movement of a moving body traveling on the road based on at least one of a lighting state of a nearby traffic signal, travel route plan information of a moving body traveling on the road, travel lane information of a moving body traveling on the road, or speed information of a moving body traveling on the road.
- The information providing server according to any one of claims 1 to 13, which is arranged at an edge, on the sensor side, of a network that transmits the primary information to a predetermined control server.
- An information providing method in which a computer capable of acquiring primary information from a plurality of sensors that sense a predetermined range of a road determines, based on the primary information respectively acquired from the plurality of sensors, whether or not to provide, to a first moving body traveling on the road, secondary information created using the primary information acquired from the plurality of sensors, creates the secondary information using the primary information acquired from the plurality of sensors when it is determined that the secondary information is to be provided to the first moving body, and transmits the secondary information to the first moving body.
- A computer recording medium recording a program that causes a computer capable of acquiring information from each of a plurality of sensors that sense a predetermined range of a road to execute: a process of determining, based on the primary information respectively acquired from the plurality of sensors, whether or not to provide, to a first moving body traveling on the road, secondary information created using the primary information acquired from the plurality of sensors; a process of creating the secondary information using the primary information acquired from the plurality of sensors when it is determined that the secondary information is to be provided to the first moving body; and a process of transmitting the secondary information to the first moving body.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022558601A JPWO2022091167A1 (ja) | 2020-10-26 | 2020-10-26 | |
PCT/JP2020/040040 WO2022091167A1 (ja) | 2020-10-26 | 2020-10-26 | 情報提供サーバ、情報提供方法及びプログラム記録媒体 |
US18/032,620 US20230394971A1 (en) | 2020-10-26 | 2020-10-26 | Information provision server, information provision method, and recording medium storing program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/040040 WO2022091167A1 (ja) | 2020-10-26 | 2020-10-26 | 情報提供サーバ、情報提供方法及びプログラム記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022091167A1 true WO2022091167A1 (ja) | 2022-05-05 |
Family
ID=81383721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/040040 WO2022091167A1 (ja) | 2020-10-26 | 2020-10-26 | 情報提供サーバ、情報提供方法及びプログラム記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20230394971A1 (ja) |
JP (1) | JPWO2022091167A1 (ja) |
WO (1) | WO2022091167A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010277123A (ja) * | 2009-05-26 | 2010-12-09 | Mazda Motor Corp | Driving support device for vehicle |
JP2011138363A (ja) * | 2009-12-28 | 2011-07-14 | Toshiba Corp | Intersection vehicle warning device |
JP2019175201A (ja) * | 2018-03-29 | 2019-10-10 | Sumitomo Electric Industries, Ltd. | Sensor sharing system, sensor sharing server, operation method of sensor sharing server, sensor-equipped device, and computer program for sensor sharing |
Also Published As
Publication number | Publication date |
---|---|
US20230394971A1 (en) | 2023-12-07 |
JPWO2022091167A1 (ja) | 2022-05-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10203699B1 (en) | Selective remote control of ADAS functionality of vehicle | |
CN114370883B (zh) | 用于操纵车辆的方法和系统 | |
US10459440B2 (en) | System and method for remotely assisting autonomous vehicle operation | |
JP2024028327A (ja) | 判定装置及び判定方法並びに判定用プログラム | |
US8885039B2 (en) | Providing vehicle information | |
KR20240000658A (ko) | 자율 주행을 위한 3차원 특징 예측 | |
JP7205204B2 (ja) | 車両の制御装置及び自動運転システム | |
US11568741B2 (en) | Communication device, control method thereof, and communication system including the same | |
KR20210019499A (ko) | 로컬화 및 위치 기반 서비스를 위해 차량에서 캡처된 승객 주의 데이터의 활용 | |
CN102546696B (zh) | 行车感知导航系统 | |
US20230245071A1 (en) | Real-time visualization of autonomous vehicle behavior in mobile applications | |
KR20200047766A (ko) | 자율 차량들을 위한 교통 재지향에 대한 검출 및 응답 | |
WO2020249122A1 (zh) | 一种车辆变道方法及装置 | |
US20220095086A1 (en) | Method and apparatus for indicating, obtaining, and sending automated driving information | |
JP7348725B2 (ja) | 配信システム、配信方法および車載装置 | |
JP2020161039A (ja) | 遠隔操作用画像の表示方法及び遠隔操作装置 | |
US11626012B2 (en) | Hierarchical integrated traffic management system for managing vehicles | |
WO2015001677A1 (ja) | 安全支援システムおよび安全支援装置 | |
WO2020039798A1 (ja) | 情報提供装置、情報提供方法、情報提供システム、コンピュータプログラム、及びデータ構造 | |
JP4102181B2 (ja) | 信号待ち回数予測方法及びナビゲーション装置 | |
JP2020149323A (ja) | 情報処理装置及び情報処理装置を備える自動走行制御システム | |
WO2023248776A1 (ja) | 遠隔支援装置、遠隔支援方法、及び遠隔支援プログラム | |
JP2023118835A (ja) | 信号情報提供装置、信号情報提供方法及びプログラム | |
WO2022091167A1 (ja) | 情報提供サーバ、情報提供方法及びプログラム記録媒体 | |
WO2024161593A1 (ja) | 監視システム、監視装置及び監視方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20959685 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18032620 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2022558601 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20959685 Country of ref document: EP Kind code of ref document: A1 |