WO2021106297A1 - Provision device, vehicle management device, vehicle management system, vehicle management method, and vehicle management program


Info

Publication number
WO2021106297A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
detection
vehicle management
vehicle
providing
Application number
PCT/JP2020/032798
Other languages
French (fr)
Japanese (ja)
Inventor
長村吉富
Original Assignee
住友電気工業株式会社 (Sumitomo Electric Industries, Ltd.)
Application filed by 住友電気工業株式会社
Publication of WO2021106297A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions

Description

  • The present disclosure relates to a providing device, a vehicle management device, a vehicle management system, a vehicle management method, and a vehicle management program.
  • This application claims priority based on Japanese Patent Application No. 2019-216678 filed on November 29, 2019, the entire disclosure of which is incorporated herein.
  • Patent Document 1 (Japanese Unexamined Patent Publication No. 2019-175201) discloses the following technique. It is a sensor sharing system including a plurality of devices, each connected to one or more sensors, and a server capable of communicating with each of the plurality of devices. Each of the plurality of devices includes a transmitting unit that collects and analyzes sensor information from the sensors to which the device is connected and transmits the sensor information and the analysis result to the server, a command receiving unit that receives a command from the server, and a transmission control unit that controls, in accordance with the received command, the ratio between the amount of sensor information and the amount of analysis results transmitted by the transmitting unit. The server includes a receiving unit that receives the sensor information and the analysis results from the plurality of devices, an aggregated analysis unit that aggregates and analyzes the sensor information received by the receiving unit and outputs an aggregated analysis result, a comparison unit that compares the analysis results received by the receiving unit with the aggregated analysis result, and an adjusting unit that, based on the comparison result of the comparison unit, generates a command for adjusting, at a predetermined time, the ratio between the amount of sensor information and the amount of analysis results transmitted from each of the plurality of devices to the server, and transmits the command to each of the plurality of devices.
  • Patent Document 2 (Japanese Unexamined Patent Publication No. 2008-191988) discloses the following technique. A vehicle periphery monitoring device includes an acquisition means for acquiring, from outside-vehicle photographing means arranged around the own vehicle on the traveling road, road images including other vehicles traveling around the own vehicle, a creation means for creating, based on a plurality of road images with different shooting fields taken by the outside-vehicle photographing means, a bird's-eye view image of the surroundings of the own vehicle as seen from a viewpoint above the own vehicle, by synthesizing the road images as actual images while converting the viewpoint, and a display means, provided in the vehicle interior, for displaying the bird's-eye view image of the vehicle's surroundings in a form in which the position of the own vehicle can be identified.
  • Patent Document 3 (Japanese Unexamined Patent Publication No. 2009-186353) discloses the following technique. An object detection device that detects stationary objects and moving objects existing around the vehicle includes an image detecting means for acquiring relative three-dimensional coordinates representing the spatial arrangement of a stationary object by comparing two images captured in time series by a camera mounted on the vehicle, a radar detecting means for acquiring the distance and orientation of the stationary object and the moving object based on reflected waves of an irradiation wave emitted over a range corresponding to the images, and a coordinate calculation means for converting the relative three-dimensional coordinates into absolute three-dimensional coordinates using the distance acquired by the radar detecting means for the stationary object in the images that exists in the orientation acquired by the radar detecting means.
  • Patent Document 4 (Japanese Unexamined Patent Publication No. 2009-98025) discloses the following technique. An object detection device includes a radar detecting means for detecting the position of an object with a radar, an image capturing means for capturing an image of the object, and an end specifying means for specifying the position of an end of the object based on the detection point of the object detected by the radar detecting means and the image captured by the image capturing means. The end specifying means obtains a rectangular region including the object from the captured image and defines, as the position of the end of the object, the intersection of the object detection straight line corresponding to the detection point and the azimuth straight line at the left or right end of the rectangular region.
  • Patent Document 1: Japanese Unexamined Patent Publication No. 2019-175201; Patent Document 2: Japanese Unexamined Patent Publication No. 2008-191988; Patent Document 3: Japanese Unexamined Patent Publication No. 2009-186353; Patent Document 4: Japanese Unexamined Patent Publication No. 2009-98025; Patent Document 5: Japanese Unexamined Patent Publication No. 2018-5520; Patent Document 6: Japanese Unexamined Patent Publication No. 2019-174899
  • The providing device of the present disclosure includes an information creation unit that creates object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  • The vehicle management device of the present disclosure includes an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes to be made to the detection of objects at a predetermined target device among the plurality of providing devices, and a transmission unit that transmits the change information created by the information creation unit to the target device.
  • The vehicle management system of the present disclosure includes a plurality of providing devices and a vehicle management device. Each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected. The vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating changes to be made to the detection of objects at the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  • The vehicle management method of the present disclosure is a vehicle management method in a providing device, and includes a step of creating object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and a step of transmitting the created object detection information and detection condition information.
  • The vehicle management method of the present disclosure is a vehicle management method in a vehicle management device, and includes a step of acquiring, from each of a plurality of providing devices, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, a step of creating, based on the acquired object detection information and detection condition information of the plurality of providing devices, change information indicating changes to be made to the detection of objects at a predetermined target device among the plurality of providing devices, and a step of transmitting the created change information to the target device.
  • The vehicle management method of the present disclosure is a vehicle management method in a vehicle management system including a plurality of providing devices and a vehicle management device, and includes a step in which each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and a step in which the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating changes to be made to the detection of objects at the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  • The vehicle management program of the present disclosure is a vehicle management program used in a providing device, and causes a computer to function as an information creation unit that creates object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and as a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  • The vehicle management program of the present disclosure is a vehicle management program used in a vehicle management device, and causes a computer to function as an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, as an information creation unit that creates, based on the acquired object detection information and detection condition information of the plurality of providing devices, change information indicating changes to be made to the detection of objects at a predetermined target device among the plurality of providing devices, and as a transmission unit that transmits the change information created by the information creation unit to the target device.
  • One aspect of the present disclosure can be realized as a semiconductor integrated circuit that realizes a part or all of the providing device. Further, one aspect of the present disclosure can be realized as a semiconductor integrated circuit that realizes a part or all of the vehicle management device. Further, one aspect of the present disclosure can be realized as a semiconductor integrated circuit that realizes a part or all of a vehicle management system. Further, one aspect of the present disclosure can be realized as a program for causing a computer to execute a processing step in a vehicle management system.
  • FIG. 1 is a diagram showing a configuration of a vehicle management system according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram showing a configuration of a providing device according to an embodiment of the present disclosure.
  • FIG. 3 is a diagram showing the configuration of another example of the providing device according to the embodiment of the present disclosure.
  • FIG. 4 is a diagram showing a configuration of a vehicle management device according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram conceptually showing a process of creating change information in the vehicle management device according to the embodiment of the present disclosure.
  • FIG. 6 is a diagram showing an example of a sequence of processing of provided information and change information in the vehicle management system according to the embodiment of the present disclosure.
  • FIG. 7 is a flowchart defining an operation procedure of the providing device in the vehicle management system according to the embodiment of the present disclosure.
  • FIG. 8 is a flowchart defining an operation procedure of the vehicle management device in the vehicle management system according to the embodiment of the present disclosure.
  • The present disclosure has been made to solve the above-mentioned problems, and an object thereof is to provide a providing device, a vehicle management device, a vehicle management system, a vehicle management method, and a vehicle management program capable of improving the detection accuracy of objects that affect the traffic of vehicles.
  • The providing device includes an information creation unit that creates object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  • With this configuration, highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device.
  • For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time. Therefore, compared with a method that simply integrates the object detection information from each providing device, using the highly accurate information described above enables highly accurate object detection. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The providing device further includes a receiving unit that receives change information indicating changes to be made to the detection of objects at the providing device, and a detection change unit that changes the detection of objects at the providing device based on the change information received by the receiving unit.
  • The change information indicates a change to be made to the detection result.
  • The change information indicates a change to be made to the settings of the sensor used for detecting the object.
  • The change information indicates a change to be made to the content of the analysis process applied to the measurement results of the sensor used for detecting the object.
  • The detection condition information indicates the environment of the sensor that affects the measurement performance of the sensor used for detecting the object.
  • The providing device is mounted on a vehicle, and the detection condition information indicates the running state of the vehicle on which the providing device is mounted.
  • The vehicle management device includes an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes to be made to the detection of objects at a predetermined target device among the plurality of providing devices, and a transmission unit that transmits the change information created by the information creation unit to the target device.
  • With this configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by using the extracted highly accurate information together with the object detection information of the target device to make changes to the object detection at the target device, the capabilities of the devices and software used for object detection at the target device can be improved, or the object detection results can be corrected. For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time.
  • A configuration that feeds such highly accurate information back into the object detection at the target device can therefore realize highly accurate object detection at the target device. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The vehicle management system includes a plurality of providing devices and a vehicle management device. Each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected. The vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating changes to be made to the detection of objects at the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  • With this configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by using the extracted highly accurate information together with the object detection information of the target device to make changes to the object detection at the target device, the capabilities of the devices and software used for object detection at the target device can be improved, or the object detection results can be corrected. For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time.
  • A configuration that feeds such highly accurate information back into the object detection at the target device can therefore realize highly accurate object detection at the target device. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The vehicle management method is a vehicle management method in a providing device, and includes a step of creating object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and a step of transmitting the created object detection information and detection condition information.
  • With this configuration, highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device.
  • For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time. Therefore, compared with a method that simply integrates the object detection information from each providing device, using the highly accurate information described above enables highly accurate object detection. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The vehicle management method is a vehicle management method in a vehicle management device, and includes a step of acquiring, from each of a plurality of providing devices, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, a step of creating, based on the acquired object detection information and detection condition information of the plurality of providing devices, change information indicating changes to be made to the detection of objects at a predetermined target device among the plurality of providing devices, and a step of transmitting the created change information to the target device.
  • With this configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by using the extracted highly accurate information together with the object detection information of the target device to make changes to the object detection at the target device, the capabilities of the devices and software used for object detection at the target device can be improved, or the object detection results can be corrected. For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time.
  • A configuration that feeds such highly accurate information back into the object detection at the target device can therefore realize highly accurate object detection at the target device. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The vehicle management method is a vehicle management method in a vehicle management system including a plurality of providing devices and a vehicle management device, and includes a step in which each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and a step in which the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating changes to be made to the detection of objects at the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  • With this configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by using the extracted highly accurate information together with the object detection information of the target device to make changes to the object detection at the target device, the capabilities of the devices and software used for object detection at the target device can be improved, or the object detection results can be corrected. For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time.
  • A configuration that feeds such highly accurate information back into the object detection at the target device can therefore realize highly accurate object detection at the target device. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The vehicle management program is a vehicle management program used in a providing device, and causes a computer to function as an information creation unit that creates object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, and as a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  • With this configuration, highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device.
  • For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time. Therefore, compared with a method that simply integrates the object detection information from each providing device, using the highly accurate information described above enables highly accurate object detection. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • The vehicle management program is a vehicle management program used in a vehicle management device, and causes a computer to function as an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a detection result of an object that affects the traffic of a vehicle at the providing device and detection condition information indicating a condition under which the object is detected, as an information creation unit that creates, based on the acquired object detection information and detection condition information of the plurality of providing devices, change information indicating changes to be made to the detection of objects at a predetermined target device among the plurality of providing devices, and as a transmission unit that transmits the change information created by the information creation unit to the target device.
  • With this configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by using the extracted highly accurate information together with the object detection information of the target device to make changes to the object detection at the target device, the capabilities of the devices and software used for object detection at the target device can be improved, or the object detection results can be corrected. For example, the sensing environment of devices such as vehicles and roadside units varies from device to device and fluctuates over time.
  • A configuration that feeds such highly accurate information back into the object detection at the target device can therefore realize highly accurate object detection at the target device. It is thus possible to improve the detection accuracy of objects that affect the traffic of vehicles.
  • FIG. 1 is a diagram showing a configuration of a vehicle management system according to an embodiment of the present disclosure.
  • the vehicle management system 301 includes a vehicle management device 201 and a plurality of providing devices 101.
  • The providing device 101 is mounted on, for example, a vehicle 1 or a roadside unit 3.
  • The vehicle management system 301 may be configured not to include providing devices 101 mounted on roadside units 3, or not to include providing devices 101 mounted on vehicles 1.
  • the providing device 101 detects an object that affects the traffic of the vehicle, such as traveling or parking of the vehicle.
  • the objects are, for example, vehicles, pedestrians, bicycles and fixed structures.
  • the providing device 101 mounted on the vehicle 1 creates support information for autonomous driving, automatic driving, driving support, etc., based on, for example, an object detection result.
  • the providing device 101 creates support information including various control contents for performing automatic operation or the like, for example.
  • the vehicle 1 performs autonomous driving, automatic driving, driving support, or the like based on the support information created by the providing device 101.
  • The providing device 101 is not limited to a configuration that creates support information, and may instead be configured, for example, to notify an in-vehicle device different from the providing device 101 of the object detection result.
  • the vehicle 1 and the roadside unit 3 are equipped with a communication device (not shown) capable of communicating with the vehicle management device 201 via the radio base station device 4.
  • The vehicle 1 and the roadside unit 3 can perform wireless communication with the radio base station device 4 in accordance with a communication standard such as LTE (Long Term Evolution), 5G, or 3G.
  • LTE: Long Term Evolution
  • 5G: Fifth Generation
  • 3G: Third Generation
  • the vehicle 1 and the roadside unit 3 are not limited to the above, and may be configured to be able to communicate with the vehicle management device 201 via the radio base station device 4 using, for example, an ITS (Intelligent Transport System) radio. Further, the providing device 101 itself may be configured to include the above-mentioned communication device.
  • ITS: Intelligent Transport System
  • Patent Documents 1 and 2 relate to problems (1) and (2) above because they integrate sensor information acquired by a plurality of vehicles and the like.
  • Patent Documents 3 and 4 relate to problem (3) above because they use a plurality of sensors of different types.
  • The providing device 101 creates object detection information indicating a detection result of an object that affects the traffic of vehicles and detection condition information indicating a condition under which the object is detected, and transmits them to the vehicle management device 201 via the radio base station device 4 and the external network 5.
  • the providing device 101 may be able to send and receive various information to and from the vehicle management device 201 without going through other devices and a network.
  • the object detection information and the detection condition information are collectively referred to as provided information.
  • each of the plurality of providing devices 101 including the providing device 101 (hereinafter, also referred to as the target device) to which the change information is provided transmits the provided information to the vehicle management device 201.
  • the vehicle management device 201 receives and stores the provided information from the plurality of providing devices 101 via the radio base station device 4 and the external network 5.
  • the vehicle management device 201 creates change information indicating the change contents regarding the detection of the object in the target device based on the provided information of the plurality of providing devices 101 including the target device. Then, the vehicle management device 201 transmits the created change information to the target device via the external network 5 and the radio base station device 4.
  • all of the plurality of providing devices 101 in the vehicle management system 301 may correspond to the target device, or some of them may correspond to the target device.
  • the providing device 101 receives change information from the vehicle management device 201, and makes changes related to the detection of an object in its own providing device 101 based on the received change information.
  • FIG. 2 is a diagram showing a configuration of a providing device according to an embodiment of the present disclosure.
  • The providing device 101A, which is a providing device 101 mounted on the vehicle 1, includes a processing unit 11, a transmitting unit 12, a receiving unit 13, and a storage unit 14.
  • The processing unit 11 is realized by a processor such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor).
  • the transmitting unit 12 and the receiving unit 13 are realized by, for example, a communication circuit such as a communication IC (Integrated Circuit).
  • the storage unit 14 is, for example, a non-volatile memory.
  • the processing unit 11 includes an analysis unit 21, a detection change unit 22, a support information creation unit 23, and a vehicle control unit 24.
  • the analysis unit 21, the support information creation unit 23, and the vehicle control unit 24 are provided in an in-vehicle device different from the providing device 101 when a plurality of in-vehicle devices such as the providing device 101 are mounted on the vehicle 1. May be good.
  • One or a plurality of sensors 51 are provided in the vehicle 1 and output the measurement result to the processing unit 11.
  • The sensor 51 is, for example, a camera, a LiDAR (Light Detection and Ranging) device, or a millimeter-wave radar device.
  • the storage unit 14 stores various information such as the result of various processing by the processing unit 11, information in the middle of processing, various setting information, and conditions for detecting an object.
  • The analysis unit 21 in the processing unit 11 performs analysis processing to detect objects that affect the traffic of the own vehicle 1, such as other vehicles, pedestrians, bicycles, and fixed structures, based on the measurement results received from the sensor 51.
  • the analysis process is, for example, a determination process by template matching or a determination process using a learning model created by machine learning such as deep learning.
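  • The following is an illustrative Python sketch, not taken from the patent, of the template-matching style of determination mentioned above; it assumes OpenCV is available, and the function name and score threshold are illustrative assumptions.

```python
# Illustrative sketch only: a template-matching determination of the kind the
# analysis unit could use. Assumes OpenCV; names and threshold are examples.
import cv2
import numpy as np

def detect_by_template(frame_gray: np.ndarray,
                       template_gray: np.ndarray,
                       threshold: float = 0.8) -> list:
    """Return (x, y, w, h) boxes where the template matches the frame."""
    h, w = template_gray.shape[:2]
    scores = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(scores >= threshold)
    return [(int(x), int(y), w, h) for x, y in zip(xs, ys)]
```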
  • the support information creation unit 23 creates support information for autonomous driving, automatic driving, driving support, etc., based on the detection result of the object by the analysis unit 21.
  • the vehicle control unit 24 performs various controls on its own vehicle 1, such as speed control, based on the support information created by the support information creation unit 23.
  • the analysis unit 21 creates object detection information indicating the detection result of an object affecting the traffic of its own vehicle 1 and detection condition information indicating a condition for detecting the object. Then, the analysis unit 21 includes the identification information of its own vehicle 1 in at least one of the object detection information and the detection condition information and outputs it to the transmission unit 12.
  • the object detection information includes, for example, a plurality of elements related to the above object.
  • the object detection information includes at least one element of the object, such as type, dimension, position information, inclination, velocity and acceleration.
  • the object detection information may include the certainty of each of the above-mentioned elements of the object, which is output from the learning model, for example.
  • the dimensions of the object are, for example, the width, height and depth of the object.
  • the dimensions of the object are, for example, a 3D bounding box.
  • the 3D bounding box is a figure showing the three-dimensional structure of the detected object when the object is detected from the camera image by using deep learning.
  • The detection condition information indicates, for example, the environment of the sensor 51 that affects the measurement performance of the sensor 51 used for detecting objects, for example the environment of the own vehicle 1.
  • Further, the detection condition information indicates, for example, the running state of the own vehicle 1 on which the providing device 101 is mounted.
  • The detection condition information includes, for example, conditions that allow the object detection information to be evaluated on a per-element basis.
  • For example, the detection condition information includes at least one of the following elements: sensor parameters such as measurement time, performance, and setting values; sensor installation information such as mounting position, orientation, and angle; the state of the surrounding environment of the vehicle 1 such as weather, light intensity, and visibility; and state information of the own vehicle 1 such as position information, speed, acceleration, and inclination.
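  • As an illustration only, the provided information described above (object detection information plus detection condition information) could be represented roughly as in the following Python sketch; all class and field names are assumptions and are not taken from the patent.

```python
# Illustrative sketch only: a possible in-memory layout for the "provided
# information". Field names and units are assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ObjectDetectionInfo:
    object_type: str                         # e.g. "vehicle", "pedestrian"
    dimensions: tuple[float, float, float]   # width, height, depth [m]
    position: tuple[float, float, float]     # x, y, z in the device's own frame [m]
    inclination: float                       # [deg]
    velocity: float                          # [m/s]
    acceleration: float                      # [m/s^2]
    certainty: dict[str, float] = field(default_factory=dict)  # per-element confidence

@dataclass
class DetectionConditionInfo:
    measurement_time: float                  # e.g. UNIX time [s]
    sensor_parameters: dict[str, float] = field(default_factory=dict)    # performance, set values
    sensor_installation: dict[str, float] = field(default_factory=dict)  # mounting position, direction, angle
    environment: dict[str, float] = field(default_factory=dict)          # weather, light intensity, visibility
    vehicle_state: Optional[dict[str, float]] = None                     # position, speed, acceleration, inclination

@dataclass
class ProvidedInfo:
    device_id: str                           # identification of the vehicle or roadside unit
    detections: list[ObjectDetectionInfo]
    condition: DetectionConditionInfo
```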
  • the transmission unit 12 transmits the object detection information and the detection condition information received from the analysis unit 21 to the vehicle management device 201 via the above-mentioned communication device (not shown) such as TCU (Telematics Communication Unit).
  • TCU: Telematics Communication Unit
  • FIG. 3 is a diagram showing the configuration of another example of the providing device according to the embodiment of the present disclosure.
  • The providing device 101B, which is a providing device 101 mounted on the roadside unit 3, includes a processing unit 61, a transmitting unit 62, a receiving unit 63, and a storage unit 64.
  • the processing unit 61 is realized by a processor such as a CPU or DSP.
  • the transmitting unit 62 and the receiving unit 63 are realized by, for example, a communication circuit such as a communication IC.
  • the storage unit 64 is, for example, a non-volatile memory.
  • the processing unit 61 includes an analysis unit 71 and a detection change unit 72.
  • The analysis unit 71 may be provided in a device other than the providing device 101 in the roadside unit 3.
  • the sensor 81 is, for example, a camera, LiDAR or millimeter wave radar device.
  • the storage unit 64 stores various information such as the result of various processing by the processing unit 61, information during processing, various setting information, and conditions for detecting an object.
  • The analysis unit 71 in the processing unit 61 performs analysis processing to detect objects, such as vehicles, pedestrians, bicycles, and fixed structures, that affect the traffic of vehicles around the roadside unit 3 (hereinafter also referred to as peripheral vehicles).
  • the analysis process is, for example, a determination process by template matching or a determination process using a learning model created by machine learning such as deep learning.
  • the analysis unit 71 creates object detection information indicating the detection result of an object affecting the traffic of surrounding vehicles and detection condition information indicating a condition for detecting the object. Then, the analysis unit 71 includes the identification information of its own roadside machine 3 in at least one of the object detection information and the detection condition information and outputs the identification information to the transmission unit 62.
  • the object detection information includes, for example, a plurality of elements related to the above object.
  • the object detection information includes at least one element of the object, such as type, dimension, position information, inclination, velocity and acceleration.
  • the object detection information may include the certainty of each of the above-mentioned elements of the object, which is output from the learning model, for example.
  • The detection condition information indicates, for example, the environment of the sensor 81 that affects the measurement performance of the sensor 81 used for detecting objects, for example the environment of the roadside unit 3.
  • The detection condition information includes, for example, conditions that allow the object detection information to be evaluated on a per-element basis.
  • For example, the detection condition information includes at least one of the following elements: sensor parameters such as measurement time, performance, and setting values; sensor installation information such as mounting position, orientation, and angle; the state of the surrounding environment of the peripheral vehicles such as weather, light intensity, and visibility; and state information of the peripheral vehicles such as position information, speed, acceleration, and inclination.
  • the transmission unit 62 transmits the object detection information and the detection condition information received from the analysis unit 71 to the vehicle management device 201 via the communication device (not shown).
  • FIG. 4 is a diagram showing a configuration of a vehicle management device according to an embodiment of the present disclosure.
  • the vehicle management device 201 includes a processing unit 31, a transmitting unit 32, a receiving unit 33, and a storage unit 34.
  • the processing unit 31 is realized by a processor such as a CPU or DSP.
  • the storage unit 34 is, for example, a non-volatile memory.
  • the transmitting unit 32 and the receiving unit 33 are realized by, for example, a communication circuit such as a communication IC.
  • the processing unit 31 includes an acquisition unit 41 and an information creation unit 42.
  • the storage unit 34 stores the results of various processes by the processing unit 31, information in the middle of processing, and other information.
  • the acquisition unit 41 acquires the object detection information and the detection condition information from the plurality of providing devices 101, respectively.
  • the receiving unit 33 receives the provided information from each providing device 101, that is, the object detection information and the detection condition information via the external network 5, and outputs the information to the processing unit 31.
  • the acquisition unit 41 in the processing unit 31 stores the provided information received from the receiving unit 33 in the storage unit 34.
  • The information creation unit 42 creates change information indicating changes to be made to the object detection at the target device, based on the object detection information and the detection condition information, acquired by the acquisition unit 41, of the plurality of providing devices 101 including the providing device 101 to which the change information is to be provided, that is, the target device.
  • FIG. 5 is a diagram conceptually showing the process of creating change information in the vehicle management device according to the embodiment of the present disclosure.
  • the information creation unit 42 in the processing unit 31 performs an integrated process for creating integrated information that integrates the object detection information transmitted from each providing device 101.
  • the information creation unit 42 performs time synchronization for adjusting the time variation of the provided information transmitted from each providing device 101 including the target device.
  • the information creation unit 42 sets the target time for each target device.
  • the target time is, for example, the above-mentioned measurement time included in the provided information received from the target device.
  • Specifically, the information creation unit 42 refers to the storage unit 34 and, for each providing device 101, selects the provided information closest to the target time, using, for example, the above-mentioned measurement time included in each piece of provided information. Further, when the storage unit 34 holds no provided information close to the target time for a certain providing device 101, the information creation unit 42 estimates the object detection information of that providing device 101 at the target time based on the object detection information of that providing device 101 stored in the storage unit 34, and selects the estimated information.
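  • As an illustration only, the time synchronization described above could be sketched as follows; the data layout follows the earlier sketch, and the tolerance value and the constant-velocity extrapolation are assumptions.

```python
# Illustrative sketch only: per providing device, pick the stored provided
# information closest to the target time; if nothing is close enough, estimate
# the detections at the target time by constant-velocity extrapolation.
import copy

def synchronize(provided_by_device: dict, target_time: float, max_gap: float = 0.2) -> dict:
    """provided_by_device maps device_id -> list of ProvidedInfo (earlier sketch)."""
    selected = {}
    for device_id, history in provided_by_device.items():
        if not history:
            continue
        best = copy.deepcopy(
            min(history, key=lambda p: abs(p.condition.measurement_time - target_time)))
        dt = target_time - best.condition.measurement_time
        if abs(dt) > max_gap:
            # No sample near the target time: crude 1-D extrapolation along x.
            for det in best.detections:
                x, y, z = det.position
                det.position = (x + det.velocity * dt, y, z)
            best.condition.measurement_time = target_time
        selected[device_id] = best
    return selected
```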
  • Next, the information creation unit 42 maps the provided information of each providing device 101 selected in the time synchronization onto a single set of position coordinates.
  • Specifically, the information creation unit 42 unifies the coordinate systems of the provided information of the providing devices 101. For example, the information creation unit 42 converts the coordinates and vectors of detected objects, expressed in the relative coordinate system of each providing device 101, into a single coordinate system maintained by the vehicle management device 201.
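  • As an illustration only, the conversion from a providing device's relative frame into one common coordinate system could look as follows, assuming a flat 2-D pose (position and heading) taken from the device's detection condition information.

```python
# Illustrative sketch only: rotate a detection's (forward, left) offset by the
# device heading and translate by the device position to obtain common-frame
# coordinates. A planar 2-D model is assumed.
import math

def to_common_frame(local_xy: tuple, device_xy: tuple, device_heading_rad: float) -> tuple:
    lx, ly = local_xy
    c, s = math.cos(device_heading_rad), math.sin(device_heading_rad)
    gx = device_xy[0] + c * lx - s * ly
    gy = device_xy[1] + s * lx + c * ly
    return (gx, gy)
```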
  • the information creation unit 42 creates integrated information using each provided information after coordinate conversion.
  • Specifically, for example, the information creation unit 42 evaluates each piece of object detection information on a per-element basis based on the corresponding detection condition information, extracts a plurality of elements from the object detection information of the plurality of providing devices 101 based on the evaluation results, and creates integrated information including the extracted elements. For example, the information creation unit 42 determines the accuracy of each element included in the corresponding object detection information based on the detection condition information.
  • For example, the information creation unit 42 determines the accuracy based on whether the distance from the sensor 51 or 81 to the detected object is appropriate, whether the surrounding environment of the sensor 51 or 81 is suitable, whether there is an influence of another object different from the detected object, and whether the certainty of each element included in the object detection information is sufficient.
  • For example, the information creation unit 42 determines that the accuracy is low when the detected object is too far from the providing device 101, when the surroundings of the providing device 101 are dark or the environment is otherwise unfavorable for the sensor, or when the detected object lies in a direction blocked from the sensor by another object detected by the providing device 101.
  • The information creation unit 42 then creates, for each piece of object detection information, integrated information expressed in the single set of position coordinates, using a combination of the elements determined to be accurate in the object detection information of each providing device 101, or a prediction result obtained using those elements. For example, the information creation unit 42 performs the mapping and creates the integrated information for each set target time, that is, for each target device.
  • For example, the information creation unit 42 makes the prediction using the above elements with a learning model, created by machine learning such as deep learning, that discriminates vehicles, pedestrians, and the like.
  • A configuration that performs time synchronization, mapping, and per-element accuracy judgment, and that reflects the judgment results, can create integrated information that is more accurate and more refined on a per-element basis than a method that simply integrates the object detection information from each providing device 101.
  • For example, when the width and height of an object are known from the provided information of a certain providing device 101 but the reliability of its depth is low, more accurate information can be created comprehensively, for example by obtaining the depth of the object from the provided information of another providing device 101 that detects the object from a different angle.
  • Similarly, when the dimensions of the object are known from the provided information of a certain providing device 101 but the reliability of its speed is low, more accurate information can be created comprehensively, for example by obtaining the velocity of the object from the provided information of another providing device 101 that can detect the object from a different position, such as from a higher altitude.
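  • The following is an illustrative sketch, not part of the patent, of such per-element integration: every element is taken from the source judged most accurate for that element, so that, for example, width and height can come from one device while depth or velocity comes from another. The scoring rule and all names are assumptions.

```python
# Illustrative sketch only: per-element integration. Each candidate carries a
# per-element accuracy score derived from its detection condition information;
# for every element the value from the highest-scoring source is kept.
def score_element(element: str, condition, distance_to_object: float) -> float:
    """Toy accuracy score: penalize long range and poor visibility (assumed rule)."""
    visibility = condition.environment.get("visibility", 1.0)   # 0..1
    range_penalty = max(0.0, 1.0 - distance_to_object / 100.0)  # 0 beyond 100 m
    return visibility * range_penalty

def integrate(candidates: list) -> dict:
    """candidates: list of (element_dict, condition, distance) for one object,
    where element_dict maps an element name (e.g. "depth", "velocity") to a value."""
    integrated = {}
    for elements, condition, distance in candidates:
        for name, value in elements.items():
            s = score_element(name, condition, distance)
            if name not in integrated or s > integrated[name][1]:
                integrated[name] = (value, s)
    return {name: value for name, (value, _) in integrated.items()}
```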
  • the information creation unit 42 performs an improvement process for creating change information for improving object detection in the target device after performing the integrated process.
  • the information creation unit 42 converts the coordinate system of the position information included in the created integrated information into the coordinate system of the target device.
  • Next, the information creation unit 42 compares the created integrated information with the corresponding object detection information, that is, the object detection information received from the target device, and, based on the comparison result, selects and calculates the information to be transmitted to the target device.
  • the information creation unit 42 creates change information by, for example, selecting and calculating information such that the object detection result in the providing device 101 or future object detection becomes more accurate.
  • The change information indicates a change to be made to the object detection result in the providing device 101.
  • Specifically, for example, the change information includes the correct dimensions, position information, inclination, velocity, or acceleration of an object detected by the target device, together with the time information serving as the reference for these elements.
  • The change information is not limited to the correct values themselves, and may instead include, for each of the above elements of the object detection information, the difference from the correct value obtained from the integrated information.
  • the change information may indicate the change contents of the setting of the sensor 51 or 81 used for detecting the object.
  • the change information includes correction parameters applicable to the sensor 51 or 81 in the providing device 101.
  • For a camera, the change information includes, for example, setting values such as orientation, zoom, frame rate, image size, and various color tones.
  • For a LiDAR, the change information includes, for example, the laser direction, angle range, rotation speed, and the like.
  • the change information may indicate a change in the content of the analysis process for the measurement result of the sensor 51 or 81 used for detecting the object.
  • the change information includes a correction parameter applicable to the analysis process of the analysis unit 21 or 71 in the providing device 101.
  • the change information includes criteria for template matching, weights used in the network of learning models, and the like.
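  • As an illustration only, change information covering the three kinds of content discussed above (detection-result corrections, sensor-setting changes, and analysis-parameter changes) together with the reference time could be laid out as follows; the field names are assumptions.

```python
# Illustrative sketch only: a possible layout for the change information.
from dataclasses import dataclass, field

@dataclass
class ChangeInfo:
    reference_time: float                                                 # utilization time [s]
    result_corrections: dict[str, float] = field(default_factory=dict)   # e.g. {"depth": 4.2}
    sensor_settings: dict[str, float] = field(default_factory=dict)      # e.g. {"camera_pan_deg": 3.0}
    analysis_parameters: dict[str, float] = field(default_factory=dict)  # e.g. {"match_threshold": 0.75}
```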
  • Further, the information creation unit 42 sets a utilization time, which is the time at which the target device should utilize the change information, and creates change information in which part or all of the content based on the above comparison result has been adapted to the utilization time. Specifically, for example, the information creation unit 42 creates change information that reflects a prediction of the future position of a detected moving object, taking into account the transmission delay from the vehicle management device 201 to the target device, the processing delay in the target device, and the like.
  • the information creation unit 42 maintains the content of the change information based on the above comparison result when the change according to the utilization time is not necessary.
  • the information creation unit 42 uses this utilization time as, for example, the "reference time information" in the above-mentioned change information.
  • the utilization time may be the past time, for example, when it is useful to correct the past object detection result in the target device.
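  • As an illustration only, adapting a position correction to the utilization time could be done by simple dead reckoning, as sketched below; the delay values are assumptions.

```python
# Illustrative sketch only: predict where a moving object will be at the
# utilization time, accounting for assumed transmission and processing delays.
def adapt_to_utilization_time(position_xy: tuple, velocity_xy: tuple,
                              created_at: float,
                              transmission_delay: float = 0.05,
                              processing_delay: float = 0.05):
    """Return (predicted_position, utilization_time)."""
    utilization_time = created_at + transmission_delay + processing_delay
    dt = utilization_time - created_at
    predicted = (position_xy[0] + velocity_xy[0] * dt,
                 position_xy[1] + velocity_xy[1] * dt)
    return predicted, utilization_time
```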
  • By such improvement processing, it is possible, for example, to improve the object detection capability of the providing device 101 and to correct the object detection results of the providing device 101. More specifically, a high degree of per-element correction can be realized, for example of the dimensions in the object detection result of the providing device 101, of the camera orientation and the LiDAR laser orientation in the settings of the sensor 51 or 81, and of the determination criteria and weights in the analysis process of the analysis unit 21 or 71 in the providing device 101.
  • When the providing device 101 transmits new object detection information and detection condition information to the vehicle management device 201 based on object detection whose accuracy has been improved using the change information received from the vehicle management device 201, a virtuous cycle can be created in which the accuracy of the integrated information in the vehicle management device 201 is further improved.
  • the information creation unit 42 in the processing unit 31 may be configured to perform integrated processing at, for example, periodically arriving integration timings. Further, the information creation unit 42 may be configured to perform the integration process and the improvement process asynchronously.
  • The information creation unit 42 can realize the above-mentioned integration processing and improvement processing by, for example, statistical calculation methods, geometric calculation methods such as graphic transformation, and inference using a neural network.
  • the information creation unit 42 outputs the change information created by the improvement process to the transmission unit 32.
  • the transmission unit 32 transmits the change information received from the information creation unit 42 to the corresponding providing device 101, that is, the target device via the external network 5 and the radio base station device 4.
  • the receiving unit 13 receives the change information from the vehicle management device 201 via the radio base station device 4 and outputs the change information to the processing unit 11.
  • the detection change unit 22 in the processing unit 11 makes a change regarding the detection of the object in the providing device 101A based on the change information received by the receiving unit 13.
  • Specifically, based on the change information received from the receiving unit 13, the detection change unit 22 performs at least one of correcting the object detection result of the analysis unit 21, changing parameters related to the sensor 51, and changing parameters related to the analysis process of the analysis unit 21.
  • For example, in the object detection result of the analysis unit 21 at the utilization time, the detection change unit 22 corrects the dimensions, position information, inclination, velocity, acceleration, and the like of the detected object to the values indicated by the change information.
  • For example, the detection change unit 22 corrects setting values of the camera serving as the sensor 51, such as orientation, zoom, frame rate, image size, and various color tones, to the values indicated by the change information. Further, the detection change unit 22 corrects the laser direction, angle range, rotation speed, and the like of a LiDAR sensor 51 to the values indicated by the change information.
  • For example, the detection change unit 22 corrects the determination criteria for template matching and the weights used in the network of the learning model in the analysis process of the analysis unit 21 to the values indicated by the change information.
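  • As an illustration only, a detection change unit applying received change information could be sketched as follows; the ChangeInfo layout from the earlier sketch is reused, and the dictionaries standing in for the detection result, camera settings, and analysis-process parameters are assumptions.

```python
# Illustrative sketch only: apply the three kinds of changes carried by the
# change information to the device's own state.
def apply_change_info(change, latest_result: dict, camera_settings: dict,
                      analysis_params: dict) -> None:
    # (1) Correct the stored object detection result for the reference time.
    latest_result.update(change.result_corrections)
    # (2) Change sensor settings (orientation, zoom, frame rate, ...).
    camera_settings.update(change.sensor_settings)
    # (3) Change analysis-process parameters (matching criteria, model weights, ...).
    analysis_params.update(change.analysis_parameters)
```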
  • the operation associated with the arrival of the change information by the receiving unit 63, the detection changing unit 72, and the like in the providing device 101B shown in FIG. 3 is the same as the above operation in the providing device 101A.
  • Each device in the vehicle management system 301 includes a computer including a memory, and an arithmetic processing unit such as a CPU in the computer reads from the memory and executes a program including some or all of the steps of the following sequence and flowcharts.
  • The programs of these plural devices can be installed from the outside.
  • The programs of these plural devices are distributed in a state of being stored in a recording medium.
  • FIG. 6 is a diagram showing an example of a sequence of processing of provided information and change information in the vehicle management system according to the embodiment of the present disclosure.
  • FIG. 6 shows, as an example, an operation in which one target device transmits the provided information to the vehicle management device 201 and receives change information from the vehicle management device 201.
  • object detection information and detection condition information are transmitted from one or a plurality of other providing devices 101 (step S1) and are stored in the storage unit 34 of the vehicle management device 201.
  • the storage unit 34 may store the object detection information and the detection condition information transmitted in the past from the target device (step S2).
  • the target device performs analysis processing based on the measurement result received from the sensor 51 or 81 (step S3).
  • the target device creates object detection information indicating the detection result of the object and detection condition information indicating the conditions for detecting the object (step S4), and transmits the information to the vehicle management device 201 (step S5).
  • the vehicle management device 201 stores the object detection information and the detection condition information received from the target device in the storage unit 34 (step S6).
  • The vehicle management device 201 performs an integration process of creating integrated information that integrates the object detection information transmitted from each providing device 101. More specifically, in the integration process, the vehicle management device 201 performs time synchronization to align the time variation of the provided information transmitted from each providing device 101, using, for example, the above-mentioned measurement time included in the provided information received from the target device to which the change information is to be provided as the target time (step S7).
  • the vehicle management device 201 performs an improvement process for creating change information for improving object detection in the target device (step S8).
  • the vehicle management device 201 transmits the created change information to the target device (step S9).
  • the target device makes changes related to the detection of the object in the target device based on the change information received from the vehicle management device 201 (step S10).
  • FIG. 7 is a flowchart defining the operation procedure of the providing device in the vehicle management system according to the embodiment of the present disclosure.
  • FIG. 7 shows the operation of the providing device 101A mounted on the vehicle 1.
  • the providing device 101 acquires the measurement result from the sensor 51 (step S21).
  • the providing device 101 performs an analysis process for detecting an object that affects the traffic of its own vehicle 1 based on the acquired measurement result (step S22).
  • The providing device 101 creates support information for autonomous driving, automatic driving, driving support, etc. based on the detection result of the object, and controls its own vehicle 1, for example by speed control, based on the created support information (step S23).
  • the providing device 101 creates object detection information indicating the detection result of the object and detection condition information indicating the conditions for detecting the object (step S24), and transmits the information to the vehicle management device 201 (step S25).
  • The providing device 101 waits for the change information from the vehicle management device 201 (NO in step S26), and when it receives the change information (YES in step S26), performs a change process based on the received change information (step S27).
  • the providing device 101 performs the analysis process again (step S21).
  • The process of step S23 and the processes of steps S24 and S25 may be performed in a different order or in parallel. Further, the processes of steps S21 to S25 and the processes of steps S26 and S27 may be performed in parallel. Further, the process of step S23 may be performed after the change process (step S27).
  • The operation of the providing device 101B mounted on the roadside machine 3 is the same as that of the flowchart shown in FIG. 7, except that an analysis process for detecting an object affecting the traffic of surrounding vehicles is performed in step S22 and the process of step S23 is not performed.
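The flow of steps S21 to S27 can be pictured with the following sketch, written as a single sequential loop for readability even though, as noted above, several of these steps may run in a different order or in parallel. The sensor, analyzer, vehicle_control, and uplink objects and their methods are hypothetical interfaces, not APIs defined by the present disclosure.

```python
import queue

def providing_device_loop(sensor, analyzer, vehicle_control, uplink,
                          change_queue: "queue.Queue"):
    """One iteration roughly corresponds to steps S21-S27 of FIG. 7 (sequential sketch)."""
    while True:
        measurement = sensor.read()                               # S21: acquire measurement result
        detections = analyzer.detect(measurement)                 # S22: analysis process (object detection)
        support = analyzer.create_support_info(detections)        # S23: create support information
        vehicle_control.apply(support)                            #      e.g. speed control of own vehicle
        object_info, condition_info = analyzer.create_provided_info(detections)  # S24
        uplink.send(object_info, condition_info)                  # S25: transmit to vehicle management device
        try:
            change = change_queue.get_nowait()                    # S26: change information received?
        except queue.Empty:
            continue                                              # NO: start the next measurement cycle
        analyzer.apply_change(change)                             # S27 (YES): change process
```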
  • FIG. 8 is a flowchart defining the operation procedure of the vehicle management device in the vehicle management system according to the embodiment of the present disclosure.
  • The vehicle management device 201 waits for the provided information from the providing device 101 (NO in step S41), and when it receives the provided information (YES in step S41), accumulates the received provided information in the storage unit 34 (step S42).
  • the vehicle management device 201 performs time synchronization for adjusting the time variation of the provided information transmitted from each providing device 101 including the target device (step S43).
  • the vehicle management device 201 maps the provided information of each providing device 101 selected in the time synchronization to a single position coordinate (step S44).
  • the vehicle management device 201 creates integrated information using each provided information after coordinate conversion (step S45).
  • the vehicle management device 201 converts the coordinate system of the position information included in the created integrated information into the coordinate system of the target device (step S46).
  • The vehicle management device 201 compares the created integrated information with the object detection information received from the target device, selects and calculates the information to be transmitted to the target device based on the comparison result, and creates the change information (step S47).
  • The vehicle management device 201 sets the utilization time, which is the time at which the target device should utilize the change information, and creates change information in which part or all of the content based on the above comparison result has been adjusted to suit the utilization time (step S48).
  • The vehicle management device 201 transmits the created change information to the target device (step S49).
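The flow of steps S41 to S49 described above could be sketched as follows. This is a simplified single-pass illustration; the downlink, store, and integrator objects and their methods are hypothetical interfaces, not APIs defined by the present disclosure.

```python
def vehicle_management_cycle(downlink, store, integrator, target_id):
    """One iteration roughly corresponds to steps S41-S49 of FIG. 8 (simplified sketch)."""
    provided = downlink.receive_provided_information()                 # S41: wait for provided information
    store.accumulate(provided)                                         # S42: accumulate in the storage unit
    synced = store.select_time_synchronized(target_id)                 # S43: time synchronization
    mapped = integrator.map_to_common_coordinates(synced)              # S44: map onto single position coordinates
    integrated = integrator.integrate(mapped)                          # S45: create integrated information
    integrated = integrator.to_target_coordinates(integrated, target_id)  # S46: convert to target coordinates
    change = integrator.compare_and_select(
        integrated, store.latest_detection(target_id))                 # S47: compare and create change information
    change = integrator.adjust_for_utilization_time(change)            # S48: set utilization time, adjust content
    downlink.send_change_information(target_id, change)                # S49: transmit change information
```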
  • The providing device may not include the receiving unit and the detection change unit, and may be configured not to receive the change information and not to reflect it in object detection. Even with such a configuration, the vehicle management device 201 can extract highly accurate information from the object detection information of each providing device 101 to create more accurate integrated information. As a result, for example, it is possible to improve the object detection accuracy in a vehicle or roadside machine to which the integrated information is distributed.
  • the vehicle management device according to the embodiment of the present disclosure may be provided by cloud computing. That is, the vehicle management device according to the embodiment of the present disclosure may be configured by a plurality of cloud servers and the like.
  • As described above, the analysis unit 21 or 71 creates object detection information indicating the detection result, in the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions for detecting the object. Then, the transmission unit 12 or 62 transmits the object detection information and the detection condition information created by the analysis unit 21 or 71.
  • Further, in the vehicle management method of the providing device 101, object detection information indicating the detection result, in the providing device 101, of an object affecting the traffic of the vehicle 1 and detection condition information indicating the conditions for detecting the object are created, and the created object detection information and detection condition information are transmitted.
  • With these configurations, for example in a device that collects the object detection information and the detection condition information from each providing device 101, it is possible to extract highly accurate elements from the collected object detection information by utilizing the object detection conditions in each providing device 101. For example, the sensing environments of the surroundings of vehicles, roadside machines, and the like vary, and the sensing environment fluctuates over time. Therefore, compared with a method of simply integrating the object detection information from each providing device 101, highly accurate object detection can be realized by using the highly accurate information as described above.
  • the providing device and the vehicle management method according to the embodiment of the present disclosure can improve the detection accuracy of an object that affects the traffic of the vehicle.
  • In the vehicle management device 201, the acquisition unit 41 acquires, from each of the plurality of providing devices 101, object detection information indicating the detection result, in the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions for detecting the object.
  • The information creation unit 42 creates change information indicating the change contents regarding the detection of the object in the target device, based on the object detection information and the detection condition information of the plurality of providing devices 101, including the target device that is the target providing device 101, acquired by the acquisition unit 41. Then, the transmission unit 32 transmits the change information created by the information creation unit 42 to the target device.
  • In the vehicle management system 301, each of the plurality of providing devices 101, including the target device that is the target providing device 101, transmits to the vehicle management device 201 object detection information indicating the detection result, in the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions for detecting the object.
  • The vehicle management device 201 transmits, based on the received object detection information and detection condition information of the plurality of providing devices 101, change information indicating the change contents regarding the detection of the object in the target device to the target device.
  • In the vehicle management method of the vehicle management device 201, first, object detection information indicating the detection result, in the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions for detecting the object are acquired from each of the plurality of providing devices 101.
  • Next, based on the acquired object detection information and detection condition information of the plurality of providing devices 101, change information indicating the change contents regarding the detection of the object in the target device is created.
  • Then, the created change information is transmitted to the target device.
  • In the vehicle management method of the vehicle management system 301, each of the plurality of providing devices 101, including the target device that is the target providing device 101, transmits to the vehicle management device 201 object detection information indicating the detection result, in the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions for detecting the object.
  • Then, the vehicle management device 201 transmits, based on the received object detection information and detection condition information of the plurality of providing devices 101, change information indicating the change contents regarding the detection of the object in the target device to the target device.
  • With these configurations, object detection information and detection condition information are collected from each providing device 101, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device 101. Then, by using the extracted highly accurate information and the object detection information of the target device to make changes related to object detection in the target device, the capabilities of the equipment, software, and the like of the target device used for object detection can be improved, or the detection result of the object can be corrected. For example, the sensing environments of the surroundings of vehicles, roadside machines, and the like vary, and the sensing environment fluctuates over time.
  • Therefore, highly accurate object detection in the target device can be realized by the configuration in which the highly accurate information as described above is fed back to the object detection in the target device.
  • the vehicle management device, the vehicle management system, and the vehicle management method according to the embodiment of the present disclosure can improve the detection accuracy of an object that affects the traffic of the vehicle.
  • the amount of communication of information collected from the providing device 101 can be reduced.
  • The sensor information contains a large amount of data in order to enable accurate object detection, and also contains information on areas unnecessary for object detection.
  • In contrast, the object detection information and the detection condition information are the results of the object detection performed by the providing device 101; for example, they are information into which the characteristics of the detected object, such as its type, dimensions, position information, inclination, velocity, and acceleration, have been converted, and information on objects that do not exist is not included. Therefore, since the information collected from the providing device 101 is object detection information and detection condition information having a smaller amount of data than the sensor information, the communication amount of the information collected from the providing device 101 can be reduced.
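To make the size difference concrete, the following sketch shows what records for the two kinds of provided information might look like. The field names and types are illustrative assumptions based on the characteristics listed above, not a format specified by the present disclosure; a pair of such records is far smaller than raw sensor information such as a camera frame or a LiDAR point cloud.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ObjectDetectionInformation:
    """Characteristics of one detected object, as reported by a providing device."""
    object_type: str          # e.g. "vehicle", "pedestrian"
    dimensions: List[float]   # e.g. [width, height, depth]
    position: List[float]     # position information in the device's coordinate system
    inclination: float
    velocity: float
    acceleration: float
    measurement_time: float   # measurement time, usable for time synchronization

@dataclass
class DetectionConditionInformation:
    """Conditions under which the detection above was obtained."""
    sensor_type: str              # e.g. "camera", "LiDAR"
    sensor_environment: str       # environment affecting measurement performance (e.g. backlight, rain)
    running_state: Optional[str]  # running state of the vehicle carrying the device, if any
```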
  • Further, the information used by the vehicle management device 201 for the integration processing is the object detection information and the detection condition information, which are the results of the object detection by the providing devices 101, so it is not necessary to execute the same detection processing on the vehicle management device 201 side. Therefore, the processing load of the integration processing in the vehicle management device 201 can be reduced.
  • As described above, the detection accuracy of an object affecting the traffic of the vehicle can be improved, the communication amount of the information on the detection results can be reduced, and the processing load of the integration processing of the detection results can be reduced.
  • A providing device including: an information creation unit that creates object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device, and detection condition information indicating the conditions for detecting the object; and a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  • The object detection information includes a plurality of elements related to the object.
  • The detection condition information includes conditions with which the object detection information can be evaluated for each of the elements.
  • A vehicle management device including: an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object;
  • an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating the change contents regarding the detection of the object in a predetermined target device among the plurality of providing devices; and a transmission unit that transmits the change information created by the information creation unit to the target device.
  • The object detection information includes a plurality of elements related to the object.
  • The detection condition information includes conditions with which the object detection information can be evaluated for each of the elements.
  • The information creation unit creates integrated information that integrates the object detection information of the plurality of providing devices, and creates the change information based on a comparison result between the created integrated information and the object detection information of the target device.
  • The information creation unit evaluates, for each object detection information, the object detection information for each element based on the corresponding detection condition information, extracts a plurality of the elements from the object detection information of the plurality of providing devices based on the evaluation result, and creates the integrated information including the extracted elements.
  • A vehicle management system in which each of a plurality of providing devices transmits, to a vehicle management device, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating the change contents regarding the detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  • The object detection information includes a plurality of elements related to the object.
  • The detection condition information includes conditions with which the object detection information can be evaluated for each of the elements.
  • The vehicle management device creates integrated information that integrates the object detection information of the plurality of providing devices, and creates the change information based on a comparison result between the created integrated information and the object detection information of the target device.
  • The vehicle management device evaluates, for each object detection information, the object detection information for each element based on the corresponding detection condition information, extracts a plurality of the elements from the object detection information of the plurality of providing devices based on the evaluation result, and creates the integrated information including the extracted elements.
  • A providing device including a processor and a communication circuit.
  • The processor realizes an information creation unit that creates object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object.
  • The communication circuit realizes a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  • A vehicle management device including a processor and a communication circuit.
  • The processor realizes an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating the change contents regarding the detection of the object in a predetermined target device among the plurality of providing devices. The communication circuit realizes a transmission unit that transmits the change information created by the information creation unit to the target device.
  • Vehicle control unit 31 Processing unit 32 Transmission unit 33 Reception unit 34 Storage unit 41 Acquisition unit 42 Information creation unit 51, 81 Sensor 61 Processing unit 62 Transmission unit 63 Reception unit 64 Storage unit 71 Analysis unit 72 Detection change unit 101, 101A, 101B Providing device 201 Vehicle management device 301 Vehicle management system

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

This provision device comprises: an information preparation unit that prepares object detection information indicating the result of detection of an object that affects the traffic of a vehicle in the provision device, and detection condition information indicating the conditions of detection of the object; and a transmission unit that transmits the object detection information and the detection condition information prepared by the information preparation unit.

Description

Provision device, vehicle management device, vehicle management system, vehicle management method, and vehicle management program
The present disclosure relates to a providing device, a vehicle management device, a vehicle management system, a vehicle management method, and a vehicle management program.
This application claims priority on the basis of Japanese Patent Application No. 2019-216678 filed on November 29, 2019, and incorporates all of its disclosure herein.
Patent Document 1 (Japanese Unexamined Patent Publication No. 2019-175201) discloses the following technique. That is, a sensor sharing system includes a plurality of devices each connected to one or a plurality of sensors and a server capable of communicating with each of the plurality of devices. Each of the plurality of devices includes: a transmission unit that collects and analyzes sensor information from the sensor to which the device is connected and transmits the sensor information and the analysis result to the server; a command receiving unit that receives a command from the server; and a transmission control unit that controls, according to the received command, the ratio between the transmission information amount of the sensor information and the transmission information amount of the analysis result transmitted by the transmission unit. The server includes: a receiving unit that receives the sensor information and the analysis results from the plurality of devices; an aggregate analysis unit that aggregates and analyzes the sensor information received by the receiving unit and outputs an aggregate analysis result; a comparison unit that compares the analysis results received by the receiving unit with the aggregate analysis result; and an adjustment unit that, based on the comparison result by the comparison unit, generates a command for adjusting the ratio between the transmission information amount of the sensor information and the transmission information amount of the analysis result transmitted from each of the plurality of devices to the server in a predetermined time, and transmits the command to each of the plurality of devices.
Further, Patent Document 2 (Japanese Unexamined Patent Publication No. 2008-191988) discloses the following technique. That is, a vehicle periphery monitoring device includes: road image acquisition means for acquiring, from outside-vehicle photographing means arranged around the own vehicle on the road being traveled, road images including other vehicles traveling around the own vehicle; own-vehicle surroundings bird's-eye-view image creation means for creating a bird's-eye-view image of the surroundings of the own vehicle, viewed from a viewpoint located above the own vehicle, by synthesizing the road images as real images while converting their viewpoints, based on a plurality of road images with different photographing fields of view taken by the outside-vehicle photographing means; and own-vehicle surroundings bird's-eye-view image display means, provided in the vehicle interior, for displaying the bird's-eye-view image of the own vehicle's surroundings in a form in which the position of the own vehicle can be identified.
Further, Patent Document 3 (Japanese Unexamined Patent Publication No. 2009-186353) discloses the following technique. That is, an object detection device that detects stationary objects and moving objects existing in the vicinity of a vehicle includes: image detection means for acquiring relative three-dimensional coordinates representing the relative spatial arrangement of a stationary object by comparing two images captured in time series by a camera mounted on the vehicle; radar detection means for acquiring the distances and azimuths of the stationary object and the moving object based on reflected waves of irradiation waves irradiated over a range corresponding to the images; and coordinate calculation means for calculating absolute three-dimensional coordinates from the relative three-dimensional coordinates of the stationary object in the images existing in an azimuth acquired by the radar detection means, using the distance acquired by the radar detection means.
Further, Patent Document 4 (Japanese Unexamined Patent Publication No. 2009-98025) discloses the following technique. That is, an object detection device includes: radar detection means for detecting the position of an object by radar; image capturing means for capturing an image of the object; and end specifying means for specifying the position of an end of the object based on the detection point of the object detected by the radar detection means and the captured image captured by the image capturing means, wherein the end specifying means obtains a rectangular region including the object from the captured image and sets, as the position of the end of the object, the intersection of an object detection straight line corresponding to the detection point and an azimuth straight line of a left or right end of the rectangular region.
Japanese Unexamined Patent Publication No. 2019-175201
Japanese Unexamined Patent Publication No. 2008-191988
Japanese Unexamined Patent Publication No. 2009-186353
Japanese Unexamined Patent Publication No. 2009-98025
Japanese Unexamined Patent Publication No. 2018-5520
Japanese Unexamined Patent Publication No. 2019-174899
The providing device of the present disclosure includes an information creation unit that creates object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
The vehicle management device of the present disclosure includes an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating the change contents regarding the detection of the object in a predetermined target device among the plurality of providing devices, and a transmission unit that transmits the change information created by the information creation unit to the target device.
The vehicle management system of the present disclosure includes a plurality of providing devices and a vehicle management device. Each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating the change contents regarding the detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
The vehicle management method of the present disclosure is a vehicle management method in a providing device, and includes a step of creating object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and a step of transmitting the created object detection information and detection condition information.
The vehicle management method of the present disclosure is a vehicle management method in a vehicle management device, and includes a step of acquiring, from each of a plurality of providing devices, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, a step of creating, based on the acquired object detection information and detection condition information of the plurality of providing devices, change information indicating the change contents regarding the detection of the object in a predetermined target device among the plurality of providing devices, and a step of transmitting the created change information to the target device.
The vehicle management method of the present disclosure is a vehicle management method in a vehicle management system including a plurality of providing devices and a vehicle management device, and includes a step in which each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and a step in which the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating the change contents regarding the detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
The vehicle management program of the present disclosure is a vehicle management program used in a providing device, and is a program for causing a computer to function as an information creation unit that creates object detection information indicating the detection result, in the providing device, of an object affecting the traffic of a vehicle and detection condition information indicating the conditions for detecting the object, and as a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
The vehicle management program of the present disclosure is a vehicle management program used in a vehicle management device, and is a program for causing a computer to function as an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, as an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating the change contents regarding the detection of the object in a predetermined target device among the plurality of providing devices, and as a transmission unit that transmits the change information created by the information creation unit to the target device.
One aspect of the present disclosure can be realized as a semiconductor integrated circuit that realizes a part or all of the providing device. Further, one aspect of the present disclosure can be realized as a semiconductor integrated circuit that realizes a part or all of the vehicle management device. Further, one aspect of the present disclosure can be realized as a semiconductor integrated circuit that realizes a part or all of the vehicle management system. Further, one aspect of the present disclosure can be realized as a program for causing a computer to execute processing steps in the vehicle management system.
FIG. 1 is a diagram showing a configuration of a vehicle management system according to an embodiment of the present disclosure.
FIG. 2 is a diagram showing a configuration of a providing device according to an embodiment of the present disclosure.
FIG. 3 is a diagram showing a configuration of another example of the providing device according to the embodiment of the present disclosure.
FIG. 4 is a diagram showing a configuration of a vehicle management device according to an embodiment of the present disclosure.
FIG. 5 is a diagram conceptually showing a process of creating change information in the vehicle management device according to the embodiment of the present disclosure.
FIG. 6 is a diagram showing an example of a sequence of processing of provided information and change information in the vehicle management system according to the embodiment of the present disclosure.
FIG. 7 is a flowchart defining an operation procedure of the providing device in the vehicle management system according to the embodiment of the present disclosure.
FIG. 8 is a flowchart defining an operation procedure of the vehicle management device in the vehicle management system according to the embodiment of the present disclosure.
Conventionally, technologies related to object detection around a vehicle and the like have been developed for autonomous driving, automatic driving, driving support, etc. of the vehicle.
[Issues to be solved by this disclosure]
A technique for improving the detection accuracy of an object affecting the traffic of a vehicle, beyond the techniques described in Patent Documents 1 to 4, is desired.
The present disclosure has been made to solve the above-mentioned problems, and an object thereof is to provide a providing device, a vehicle management device, a vehicle management system, a vehicle management method, and a vehicle management program capable of improving the detection accuracy of an object affecting the traffic of a vehicle.
[Effect of the present disclosure]
According to the present disclosure, it is possible to improve the detection accuracy of an object that affects the traffic of a vehicle.
[Explanation of Embodiments of the present disclosure]
First, the contents of the embodiments of the present disclosure will be listed and described.
(1) The providing device according to the embodiment of the present disclosure includes an information creation unit that creates object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
With such a configuration, for example in a device that collects object detection information and detection condition information from each providing device, it is possible to extract highly accurate elements from the collected object detection information by utilizing the object detection conditions in each providing device. For example, the sensing environments of the surroundings of vehicles, roadside machines, and the like vary, and the sensing environment fluctuates over time. Therefore, compared with a method of simply integrating the object detection information from each providing device, highly accurate object detection can be realized by using the highly accurate information as described above. Therefore, it is possible to improve the detection accuracy of an object that affects the traffic of a vehicle.
(2) Preferably, the providing device further includes a receiving unit that receives change information indicating the change contents regarding the detection of the object in the providing device, and a detection change unit that makes changes related to the detection of the object in the providing device based on the change information received by the receiving unit.
With such a configuration, changes related to object detection in the providing device can be made based on, for example, change information created using the highly accurate information described above, so that the capabilities of the equipment, software, and the like of the providing device used for the object detection can be improved, or the detection result of the object can be corrected.
(3) More preferably, the change information indicates the change contents of the detection result.
With such a configuration, a highly accurate detection result can be obtained while making it unnecessary to provide, in the providing device, a mechanism for changing, for example, the setting of the sensor used for object detection or the contents of the analysis process used for the object detection.
(4) More preferably, the change information indicates the change contents of the setting of the sensor used for detecting the object.
With such a configuration, a highly accurate detection result can be obtained while making it unnecessary to add, in the providing device, a process of correcting the detection result of the object and unnecessary to provide a mechanism for changing the contents of the analysis process used for object detection.
(5) More preferably, the change information indicates the change contents of the analysis process applied to the measurement result of the sensor used for detecting the object.
With such a configuration, a highly accurate detection result can be obtained while making it unnecessary to add, in the providing device, a process of correcting the detection result of the object and unnecessary to provide a mechanism for changing the setting of the sensor used for object detection.
(6) Preferably, the detection condition information indicates the environment of the sensor that affects the measurement performance of the sensor used for detecting the object.
With such a configuration, it is possible to determine whether or not a detection result was obtained in an environment that the sensor is not good at, so that more accurate elements can be extracted from the object detection information.
(7) Preferably, the providing device is mounted on a vehicle, and the detection condition information indicates the running state of the vehicle on which the providing device is mounted.
With such a configuration, the influence on object detection according to the running state of the vehicle can be determined, so that more accurate elements can be extracted from the object detection information.
(8) The vehicle management device according to the embodiment of the present disclosure includes an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating the change contents regarding the detection of the object in a predetermined target device among the plurality of providing devices, and a transmission unit that transmits the change information created by the information creation unit to the target device.
With such a configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by the configuration in which the extracted highly accurate information and the object detection information of the target device are used to make changes related to object detection in the target device, the capabilities of the equipment, software, and the like of the target device used for object detection can be improved, or the detection result of the object can be corrected. For example, the sensing environments of the surroundings of vehicles, roadside machines, and the like vary, and the sensing environment fluctuates over time. Therefore, compared with a method of simply integrating the object detection information from each providing device, highly accurate object detection in the target device can be realized by the configuration in which the highly accurate information as described above is fed back to the object detection in the target device. Therefore, it is possible to improve the detection accuracy of an object that affects the traffic of a vehicle.
(9) The vehicle management system according to the embodiment of the present disclosure includes a plurality of providing devices and a vehicle management device. Each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating the detection result of an object affecting the traffic of a vehicle in the providing device and detection condition information indicating the conditions for detecting the object, and the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating the change contents regarding the detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
With such a configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions in each providing device. Then, by the configuration in which the extracted highly accurate information and the object detection information of the target device are used to make changes related to object detection in the target device, the capabilities of the equipment, software, and the like of the target device used for object detection can be improved, or the detection result of the object can be corrected. For example, the sensing environments of the surroundings of vehicles, roadside machines, and the like vary, and the sensing environment fluctuates over time. Therefore, compared with a method of simply integrating the object detection information from each providing device, highly accurate object detection in the target device can be realized by the configuration in which the highly accurate information as described above is fed back to the object detection in the target device. Therefore, it is possible to improve the detection accuracy of an object that affects the traffic of a vehicle.
 (10)本開示の実施の形態に係る車両管理方法は、提供装置における車両管理方法であって、前記提供装置における車両の交通に影響する物体の検出結果を示す物体検出情報、および前記物体の検出における条件を示す検出条件情報を作成するステップと、作成した前記物体検出情報および前記検出条件情報を送信するステップとを含む。 (10) The vehicle management method according to the embodiment of the present disclosure is the vehicle management method in the providing device, and the object detection information indicating the detection result of the object affecting the traffic of the vehicle in the providing device, and the object. It includes a step of creating detection condition information indicating a condition in detection, and a step of transmitting the created object detection information and the detection condition information.
 このような構成により、たとえば各提供装置から物体検出情報および検出条件情報を収集する装置において、収集した各物体検出情報の中から、当該各提供装置における物体検出の条件を活用して正確性の高い要素を抽出することができる。たとえば、車両および路側機等の周辺状況のセンシング環境は様々であり、かつ、センシング環境は時間と共に変動するものである。そのため、単に各提供装置からの物体検出情報を統合する方法と比べて、上記のような正確性の高い情報を用いて精度の高い物体検出を実現することができる。したがって、車両の交通に影響する物体の検出精度を向上させることができる。 With such a configuration, for example, in a device that collects object detection information and detection condition information from each providing device, the accuracy of the object detection condition in each providing device is utilized from the collected object detection information. High elements can be extracted. For example, the sensing environment of the surrounding conditions such as a vehicle and a roadside machine varies, and the sensing environment fluctuates with time. Therefore, it is possible to realize highly accurate object detection by using the above-mentioned highly accurate information, as compared with the method of simply integrating the object detection information from each providing device. Therefore, it is possible to improve the detection accuracy of an object that affects the traffic of the vehicle.
 (11)本開示の実施の形態に係る車両管理方法は、車両管理装置における車両管理方法であって、提供装置における車両の交通に影響する物体の検出結果を示す物体検出情報、および前記物体の検出における条件を示す検出条件情報を、複数の前記提供装置からそれぞれ取得するステップと、取得した前記複数の提供装置の前記物体検出情報および前記検出条件情報に基づいて、前記複数の提供装置のうちの予め定められた対象装置における前記物体の検出に関する変更内容を示す変更情報を作成するステップと、作成した前記変更情報を前記対象装置へ送信するステップとを含む。 (11) The vehicle management method according to the embodiment of the present disclosure is the vehicle management method in the vehicle management device, and the object detection information indicating the detection result of the object affecting the traffic of the vehicle in the providing device, and the object. Among the plurality of providing devices, based on the step of acquiring the detection condition information indicating the condition in the detection from each of the plurality of providing devices, the object detection information of the plurality of providing devices, and the detection condition information. The step includes a step of creating change information indicating the change contents regarding the detection of the object in the predetermined target device, and a step of transmitting the created change information to the target device.
 このような構成により、各提供装置から物体検出情報および検出条件情報を収集し、収集した各物体検出情報の中から、当該各提供装置における物体検出の条件を活用して正確性の高い要素を抽出することができる。そして、抽出した正確性の高い情報と、対象装置の物体検出情報とを用いて、対象装置における物体検出に関する変更を行う構成により、物体検出に用いる対象装置の機器およびソフトウェア等の能力を向上したり、物体の検出結果を補正したりすることができる。たとえば、車両および路側機等の周辺状況のセンシング環境は様々であり、かつ、センシング環境は時間と共に変動するものである。そのため、単に各提供装置からの物体検出情報を統合する方法と比べて、上記のような正確性の高い情報を対象装置における物体検出にフィードバックする構成により、対象装置において精度の高い物体検出を実現することができる。したがって、車両の交通に影響する物体の検出精度を向上させることができる。 With such a configuration, object detection information and detection condition information are collected from each providing device, and from each collected object detection information, highly accurate elements are selected by utilizing the object detection conditions in each providing device. Can be extracted. Then, by using the extracted highly accurate information and the object detection information of the target device to make changes related to object detection in the target device, the capabilities of the device and software of the target device used for object detection are improved. Or, the detection result of the object can be corrected. For example, the sensing environment of the surrounding conditions such as a vehicle and a roadside machine varies, and the sensing environment fluctuates with time. Therefore, compared to the method of simply integrating the object detection information from each providing device, the configuration that feeds back the highly accurate information as described above to the object detection in the target device realizes highly accurate object detection in the target device. can do. Therefore, it is possible to improve the detection accuracy of an object that affects the traffic of the vehicle.
 (12) A vehicle management method according to an embodiment of the present disclosure is a vehicle management method in a vehicle management system including a plurality of providing devices and a vehicle management device, and includes the steps of: each of the plurality of providing devices transmitting, to the vehicle management device, object detection information indicating a detection result of an object affecting vehicle traffic in the providing device and detection condition information indicating conditions for the detection of the object; and the vehicle management device transmitting, based on the received object detection information and detection condition information of the plurality of providing devices, change information indicating changes regarding the detection of the object in a predetermined target device among the plurality of providing devices to the target device.
 With such a configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions of each providing device. By then using the extracted highly accurate information together with the object detection information of the target device to make changes regarding object detection in the target device, the capability of the equipment, software, and the like that the target device uses for object detection can be improved, and the object detection results can be corrected. For example, the sensing environments around vehicles, roadside units, and the like vary widely, and each sensing environment changes over time. Therefore, compared with a method that simply integrates the object detection information from the providing devices, the configuration that feeds back such highly accurate information to the object detection of the target device achieves highly accurate object detection in the target device. Accordingly, the accuracy of detecting objects affecting vehicle traffic can be improved.
 (13) A vehicle management program according to an embodiment of the present disclosure is a vehicle management program used in a providing device, and causes a computer to function as: an information creation unit that creates object detection information indicating a detection result, in the providing device, of an object affecting vehicle traffic and detection condition information indicating conditions for the detection of the object; and a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
 With such a configuration, for example in a device that collects object detection information and detection condition information from each providing device, highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions of each providing device. For example, the sensing environments around vehicles, roadside units, and the like vary widely, and each sensing environment changes over time. Therefore, compared with a method that simply integrates the object detection information from the providing devices, highly accurate object detection can be realized by using such highly accurate information. Accordingly, the accuracy of detecting objects affecting vehicle traffic can be improved.
 (14) A vehicle management program according to an embodiment of the present disclosure is a vehicle management program used in a vehicle management device, and causes a computer to function as: an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a detection result of an object affecting vehicle traffic in the providing device and detection condition information indicating conditions for the detection of the object; an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes regarding the detection of the object in a predetermined target device among the plurality of providing devices; and a transmission unit that transmits the change information created by the information creation unit to the target device.
 With such a configuration, object detection information and detection condition information are collected from each providing device, and highly accurate elements can be extracted from the collected object detection information by utilizing the object detection conditions of each providing device. By then using the extracted highly accurate information together with the object detection information of the target device to make changes regarding object detection in the target device, the capability of the equipment, software, and the like that the target device uses for object detection can be improved, and the object detection results can be corrected. For example, the sensing environments around vehicles, roadside units, and the like vary widely, and each sensing environment changes over time. Therefore, compared with a method that simply integrates the object detection information from the providing devices, the configuration that feeds back such highly accurate information to the object detection of the target device achieves highly accurate object detection in the target device. Accordingly, the accuracy of detecting objects affecting vehicle traffic can be improved.
 Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. In the drawings, the same or corresponding parts are denoted by the same reference numerals, and their description will not be repeated. In addition, at least some of the embodiments described below may be combined in any manner.
 [Vehicle management system]
 FIG. 1 is a diagram showing the configuration of a vehicle management system according to an embodiment of the present disclosure.
 Referring to FIG. 1, the vehicle management system 301 includes a vehicle management device 201 and a plurality of providing devices 101. Each providing device 101 is mounted on, for example, a vehicle 1 or a roadside unit 3.
 The vehicle management system 301 may be configured without the providing devices 101 mounted on roadside units 3, or without the providing devices 101 mounted on vehicles 1.
 The providing device 101 detects objects that affect vehicle traffic, such as the traveling or parking of vehicles. Such objects are, for example, vehicles, pedestrians, bicycles, and fixed structures.
 The providing device 101 mounted on the vehicle 1 creates, for example, support information for autonomous traveling, automated driving, driving assistance, or the like based on the object detection results. Specifically, the providing device 101 creates, for example, support information including various control details for performing automated driving or the like. The vehicle 1 performs autonomous traveling, automated driving, driving assistance, or the like based on the support information created by the providing device 101. The providing device 101 is not limited to a configuration that creates support information; for example, it may notify an in-vehicle device other than the providing device 101 of the object detection results.
 The vehicle 1 and the roadside unit 3 are each equipped with a communication device (not shown) capable of communicating with the vehicle management device 201 via a radio base station device 4. Specifically, the vehicle 1 and the roadside unit 3 can perform wireless communication with the radio base station device 4 in accordance with a communication standard such as LTE (Long Term Evolution), 5G, or 3G. FIG. 1 representatively shows one radio base station device 4.
 The vehicle 1 and the roadside unit 3 are not limited to the above; they may be configured to communicate with the vehicle management device 201 via the radio base station device 4 using, for example, ITS (Intelligent Transport System) radio. Further, the providing device 101 itself may include the above communication device.
 [Problem]
 As described above, techniques have conventionally been developed for detecting objects around a vehicle and the like in order to perform autonomous traveling, automated driving, driving assistance, and the like of the vehicle.
 Systems that analyze sensor information indicating measurement results of sensors have problems such as, for example: (1) objects in blind spots cannot be detected; (2) partially hidden objects are difficult to detect; and (3) detection is difficult for distant objects and in environments that the sensors handle poorly, such as bad weather.
 The techniques described in Patent Documents 1 and 2 integrate sensor information acquired by a plurality of vehicles and the like, and thus relate to problems (1) and (2) above.
 The techniques described in Patent Documents 3 and 4 use a plurality of sensors of different types, and thus relate to problem (3) above.
 On the other hand, in a system that integrates sensor information or object detection results acquired by a plurality of vehicles and the like, the sensing environments around the vehicles, roadside units, and the like vary widely, and these sensing environments change over time. Such variation may affect the accuracy of object detection that uses integration processing of sensor information or object detection results, as in Patent Documents 1 to 4.
 In contrast, the vehicle management system according to the embodiment of the present disclosure solves such problems by the following configuration and operation.
 In FIG. 1, the providing device 101 creates object detection information indicating the detection result of an object affecting vehicle traffic and detection condition information indicating conditions for the detection of the object, and transmits them to the vehicle management device 201 via the radio base station device 4 and an external network 5. The providing device 101 may also be capable of transmitting and receiving various kinds of information to and from the vehicle management device 201 without going through other devices and networks. Hereinafter, the object detection information and the detection condition information are also collectively referred to as provided information.
 More specifically, a plurality of providing devices 101, including the providing device 101 to which change information is to be provided (hereinafter also referred to as the target device), each transmit provided information to the vehicle management device 201.
 The vehicle management device 201 receives and accumulates the provided information from the plurality of providing devices 101 via the radio base station device 4 and the external network 5.
 The vehicle management device 201 creates, based on the provided information of the plurality of providing devices 101 including the target device, change information indicating changes regarding the detection of the object in the target device. The vehicle management device 201 then transmits the created change information to the target device via the external network 5 and the radio base station device 4.
 All of the plurality of providing devices 101 in the vehicle management system 301 may correspond to target devices, or only some of them may.
 The providing device 101 receives the change information from the vehicle management device 201 and, based on the received change information, makes changes regarding object detection in the providing device 101 itself.
 [Providing device]
 FIG. 2 is a diagram showing the configuration of a providing device according to an embodiment of the present disclosure.
 Referring to FIG. 2, a providing device 101A, which is a providing device 101 mounted on the vehicle 1, includes a processing unit 11, a transmission unit 12, a receiving unit 13, and a storage unit 14. The processing unit 11 is realized by a processor such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor). The transmission unit 12 and the receiving unit 13 are realized by, for example, a communication circuit such as a communication IC (Integrated Circuit). The storage unit 14 is, for example, a nonvolatile memory.
 The processing unit 11 includes an analysis unit 21, a detection change unit 22, a support information creation unit 23, and a vehicle control unit 24. When a plurality of in-vehicle devices such as the providing device 101 are mounted on the vehicle 1, the analysis unit 21, the support information creation unit 23, and the vehicle control unit 24 may be provided in an in-vehicle device other than the providing device 101.
 One or more sensors 51 are provided in the vehicle 1 and output measurement results to the processing unit 11. The sensor 51 is, for example, a camera, a LiDAR (Light Detection and Ranging) device, or a millimeter-wave radar device.
 The storage unit 14 stores various kinds of information, such as the results of various processes performed by the processing unit 11 or information in the middle of processing, various setting information, and conditions for object detection.
 The analysis unit 21 in the processing unit 11 performs analysis processing for detecting, based on the measurement results received from the sensor 51, objects affecting the traffic of its own vehicle 1, such as other vehicles, pedestrians, bicycles, and fixed structures. The analysis processing is, for example, determination processing by template matching, or determination processing using a learning model created by machine learning such as deep learning.
 The support information creation unit 23 creates support information for autonomous traveling, automated driving, driving assistance, or the like based on the object detection results of the analysis unit 21.
 The vehicle control unit 24 performs various kinds of control of its own vehicle 1, such as speed control, based on the support information created by the support information creation unit 23.
 The analysis unit 21 also creates object detection information indicating the detection results of objects affecting the traffic of its own vehicle 1 and detection condition information indicating conditions for the detection of the objects. The analysis unit 21 then includes the identification information of its own vehicle 1 in at least one of the object detection information and the detection condition information and outputs them to the transmission unit 12.
 The object detection information includes, for example, a plurality of elements relating to the detected object. Specifically, for example, the object detection information includes at least one of the following elements of the object: type, dimensions, position information, inclination, speed, and acceleration. The object detection information may also include the certainty of each of the above elements of the object, output from the learning model, for example.
 The dimensions of an object are, for example, the width, height, and depth of the object. The dimensions of an object may also be expressed, for example, as a 3D bounding box. A 3D bounding box is a figure showing the three-dimensional structure of a detected object when the object is detected from a camera image using deep learning.
 The detection condition information indicates, as one example, the environment of the sensor 51, for example the environment of the vehicle 1 itself, which affects the measurement performance of the sensor 51 used for object detection. As another example, the detection condition information indicates the traveling state of the vehicle 1 on which the providing device 101 is mounted.
 The detection condition information includes, for example, conditions under which the object detection information can be evaluated on an element-by-element basis. Specifically, for example, the detection condition information includes at least one of the following elements: the measurement time; sensor parameters such as performance and setting values; sensor installation information such as mounting position, direction, and angle; the state of the surrounding environment of the vehicle 1 such as weather, amount of light, and visibility; and state information of the vehicle 1 itself such as position information, speed, acceleration, and inclination.
 The transmission unit 12 transmits the object detection information and the detection condition information received from the analysis unit 21 to the vehicle management device 201 via the above-described communication device (not shown), such as a TCU (Telematics Communication Unit).
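 To make the structure of the provided information more concrete, the following is a minimal sketch in Python of one way the object detection information and detection condition information described above could be represented; the field names and the ProvidedInfo container are illustrative assumptions and are not specified in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DetectedObject:
    """One detected object in the object detection information."""
    kind: str                             # type, e.g. "vehicle", "pedestrian"
    size: Tuple[float, float, float]      # width, height, depth in meters
    position: Tuple[float, float]         # position in the sender's coordinate system
    heading_deg: float                    # inclination / orientation
    speed: float                          # m/s
    acceleration: float                   # m/s^2
    confidence: Optional[dict] = None     # per-element certainty from the learning model

@dataclass
class DetectionConditions:
    """Conditions under which the detection was performed."""
    measured_at: float                    # measurement time (UNIX seconds)
    sensor_params: dict                   # performance and setting values
    sensor_mounting: dict                 # mounting position, direction, angle
    environment: dict                     # weather, amount of light, visibility
    vehicle_state: Optional[dict] = None  # own-vehicle position, speed, acceleration, tilt

@dataclass
class ProvidedInfo:
    """Provided information: detection results plus detection conditions."""
    sender_id: str                        # identification of the vehicle 1 or roadside unit 3
    objects: List[DetectedObject] = field(default_factory=list)
    conditions: Optional[DetectionConditions] = None
```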
 FIG. 3 is a diagram showing the configuration of another example of the providing device according to the embodiment of the present disclosure.
 Referring to FIG. 3, a providing device 101B, which is a providing device 101 mounted on the roadside unit 3, includes a processing unit 61, a transmission unit 62, a receiving unit 63, and a storage unit 64. The processing unit 61 is realized by a processor such as a CPU or a DSP. The transmission unit 62 and the receiving unit 63 are realized by, for example, a communication circuit such as a communication IC. The storage unit 64 is, for example, a nonvolatile memory. The processing unit 61 includes an analysis unit 71 and a detection change unit 72. The analysis unit 71 may be provided in a device of the roadside unit 3 other than the providing device 101.
 One or more sensors 81 are provided in the roadside unit 3 and output measurement results to the processing unit 61. The sensor 81 is, for example, a camera, a LiDAR device, or a millimeter-wave radar device.
 The storage unit 64 stores various kinds of information, such as the results of various processes performed by the processing unit 61 or information in the middle of processing, various setting information, and conditions for object detection.
 The analysis unit 71 in the processing unit 61 performs analysis processing for detecting, based on the measurement results received from the sensor 81, objects affecting the traffic of vehicles around the roadside unit 3 (hereinafter also referred to as surrounding vehicles), such as vehicles, pedestrians, bicycles, and fixed structures. The analysis processing is, for example, determination processing by template matching, or determination processing using a learning model created by machine learning such as deep learning.
 The analysis unit 71 creates object detection information indicating the detection results of objects affecting the traffic of the surrounding vehicles and detection condition information indicating conditions for the detection of the objects. The analysis unit 71 then includes the identification information of its own roadside unit 3 in at least one of the object detection information and the detection condition information and outputs them to the transmission unit 62.
 The object detection information includes, for example, a plurality of elements relating to the detected object. Specifically, for example, the object detection information includes at least one of the following elements of the object: type, dimensions, position information, inclination, speed, and acceleration. The object detection information may also include the certainty of each of the above elements of the object, output from the learning model, for example.
 The detection condition information indicates, for example, the environment of the sensor 81, for example the environment of the roadside unit 3, which affects the measurement performance of the sensor 81 used for object detection.
 The detection condition information includes, for example, conditions under which the object detection information can be evaluated on an element-by-element basis. Specifically, for example, the detection condition information includes at least one of the following elements: the measurement time; sensor parameters such as performance and setting values; sensor installation information such as mounting position, direction, and angle; the state of the surrounding environment of the surrounding vehicles such as weather, amount of light, and visibility; and state information of the surrounding vehicles such as position information, speed, acceleration, and inclination.
 The transmission unit 62 transmits the object detection information and the detection condition information received from the analysis unit 71 to the vehicle management device 201 via the communication device (not shown).
 FIG. 4 is a diagram showing the configuration of a vehicle management device according to an embodiment of the present disclosure.
 Referring to FIG. 4, the vehicle management device 201 includes a processing unit 31, a transmission unit 32, a receiving unit 33, and a storage unit 34. The processing unit 31 is realized by a processor such as a CPU or a DSP. The storage unit 34 is, for example, a nonvolatile memory. The transmission unit 32 and the receiving unit 33 are realized by, for example, a communication circuit such as a communication IC. The processing unit 31 includes an acquisition unit 41 and an information creation unit 42.
 The storage unit 34 stores the results of various processes performed by the processing unit 31, information in the middle of processing, and other information.
 The acquisition unit 41 acquires object detection information and detection condition information from each of the plurality of providing devices 101.
 More specifically, the receiving unit 33 receives the provided information, that is, the object detection information and the detection condition information, from each providing device 101 via the external network 5, and outputs it to the processing unit 31.
 The acquisition unit 41 in the processing unit 31 accumulates the provided information received from the receiving unit 33 in the storage unit 34.
 The information creation unit 42 creates, based on the object detection information and detection condition information of the plurality of providing devices 101 acquired by the acquisition unit 41, including the providing device 101 to which the change information is to be provided (that is, the target device), change information indicating changes regarding the detection of the object in the target device.
 FIG. 5 is a diagram conceptually showing the change information creation processing in the vehicle management device according to the embodiment of the present disclosure.
 [Integration processing]
 Referring to FIGS. 4 and 5, the information creation unit 42 in the processing unit 31 performs integration processing for creating integrated information that integrates the object detection information transmitted from the providing devices 101.
 In the integration processing, the information creation unit 42 first performs time synchronization to align the temporal variation of the provided information transmitted from the providing devices 101 including the target device.
 More specifically, the information creation unit 42 sets a target time for each target device. The target time is, for example, the above-described measurement time included in the provided information received from the target device.
 The information creation unit 42 then refers to the storage unit 34 and selects, for each providing device 101, the provided information closest to the target time, using, for example, the above-described measurement time included in each piece of provided information. Further, for example, when no provided information close to the target time exists in the storage unit 34 for a certain providing device 101, the information creation unit 42 estimates and selects the object detection information of that providing device 101 at the target time based on the object detection information of that providing device 101 stored in the storage unit 34.
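 As a rough illustration of this time synchronization, the sketch below (building on the data classes sketched earlier) selects, for each providing device, the stored provided information whose measurement time is closest to the target time, and otherwise estimates object positions at the target time under a constant-velocity assumption; the tolerance value and the estimation rule are assumptions made for illustration only.

```python
import math

def select_synchronized(history, target_time, tolerance=0.1):
    """history: {sender_id: [ProvidedInfo, ...]}; returns {sender_id: ProvidedInfo}
    whose contents are aligned to target_time."""
    selected = {}
    for sender_id, infos in history.items():
        best = min(infos, key=lambda p: abs(p.conditions.measured_at - target_time))
        dt = target_time - best.conditions.measured_at
        if abs(dt) <= tolerance:
            selected[sender_id] = best
            continue
        # No sample close enough: estimate each object's position at target_time
        # from the nearest sample, assuming constant velocity along its heading.
        estimated = []
        for o in best.objects:
            yaw = math.radians(o.heading_deg)
            moved = (o.position[0] + o.speed * dt * math.cos(yaw),
                     o.position[1] + o.speed * dt * math.sin(yaw))
            estimated.append(DetectedObject(o.kind, o.size, moved, o.heading_deg,
                                            o.speed, o.acceleration, o.confidence))
        selected[sender_id] = ProvidedInfo(sender_id, estimated, best.conditions)
    return selected
```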
 Next, the information creation unit 42 maps the provided information of each providing device 101 selected in the time synchronization onto a single set of position coordinates.
 More specifically, the information creation unit 42 unifies the coordinate systems of the provided information of the providing devices 101. Specifically, for example, the information creation unit 42 converts the coordinates and vectors of the detected objects, expressed in the relative coordinate systems created by the respective providing devices 101, into one coordinate system held by the information creation unit 42 itself.
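 A minimal sketch of this mapping step is given below: a point expressed in a providing device's sensor-relative coordinate system is converted into one common coordinate system by a two-dimensional rotation and translation. The concrete transform is an assumption for illustration; the disclosure does not fix a particular conversion.

```python
import math

def to_common_frame(local_xy, sensor_position, sensor_yaw_deg):
    """Convert a point from a sensor-relative frame into the common frame.
    local_xy: (x, y) seen from the sensor; sensor_position: sensor origin in
    the common frame; sensor_yaw_deg: sensor heading in the common frame."""
    yaw = math.radians(sensor_yaw_deg)
    x, y = local_xy
    gx = sensor_position[0] + x * math.cos(yaw) - y * math.sin(yaw)
    gy = sensor_position[1] + x * math.sin(yaw) + y * math.cos(yaw)
    return (gx, gy)

# Example: an object 10 m ahead of a sensor facing 90 degrees at (100, 200)
print(to_common_frame((10.0, 0.0), (100.0, 200.0), 90.0))  # -> approximately (100.0, 210.0)
```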
 Next, the information creation unit 42 creates integrated information using the provided information after coordinate conversion.
 More specifically, for each piece of object detection information, the information creation unit 42 evaluates the object detection information on an element-by-element basis based on the corresponding detection condition information, extracts a plurality of elements from the object detection information of the plurality of providing devices 101 based on the evaluation results, and creates integrated information including the extracted elements. For example, the information creation unit 42 determines the accuracy of each element included in the corresponding object detection information based on the detection condition information.
 Specifically, the information creation unit 42 determines the accuracy based on, for example, whether the distance from the sensor 51 or 81 to the detected object is an appropriate distance, whether the surrounding environment of the sensor 51 or 81 is appropriate, whether there is influence from other objects different from the detected object, and whether the certainty of each element included in the object detection information is sufficient.
 For example, the information creation unit 42 determines that the accuracy is low when the detected object is too far from the providing device 101, when the surrounding environment of the sensor 51 or 81 is one that the sensor handles poorly, such as dark surroundings of the providing device 101, or when the sensor is positioned in a direction blocked by an object detected by another providing device 101.
 The information creation unit 42 then creates integrated information of the pieces of object detection information expressed in the single position coordinates, using a combination of the elements determined to be accurate in the object detection information of the providing devices 101, or a prediction result using those elements. For example, the information creation unit 42 performs the above mapping and the creation of the integrated information for each set target time, that is, for each target device.
 Here, the information creation unit 42 performs prediction using the above elements by using, for example, a learning model for discriminating vehicles, pedestrians, and the like created by machine learning such as deep learning.
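 The element-by-element evaluation and extraction described above could be realized, for example, as in the following sketch: each reported element is given a score derived from the corresponding detection conditions (distance to the detected object, ambient brightness, and the element's certainty), and for every element the value from the best-scored report is adopted. The scoring weights and thresholds are illustrative assumptions.

```python
def element_score(conditions, certainty, distance_m):
    """Crude accuracy score for one element of one report (illustrative)."""
    score = certainty if certainty is not None else 0.5
    if distance_m > 80.0:                              # detected object is far from the sensor
        score *= 0.5
    if conditions.get("illuminance_lx", 1000) < 50:    # dark surroundings
        score *= 0.7
    return score

def integrate(reports):
    """reports: list of dicts such as
    {"elements": {"size": (1.8, 1.5, 4.2), "speed": 12.0},
     "certainty": {"size": 0.9, "speed": 0.4},
     "conditions": {"illuminance_lx": 800}, "distance_m": 35.0}
    for a single physical object. Returns one integrated element set."""
    integrated = {}
    names = {name for r in reports for name in r["elements"]}
    for name in names:
        candidates = [
            (element_score(r["conditions"], r["certainty"].get(name), r["distance_m"]),
             r["elements"][name])
            for r in reports if name in r["elements"]
        ]
        integrated[name] = max(candidates, key=lambda c: c[0])[1]  # adopt the best-scored value
    return integrated
```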
 In this way, by performing time synchronization, mapping, and accuracy determination with the determination results reflected, more accurate and sophisticated integrated information can be created on an element-by-element basis than with a method that simply integrates the object detection information from the providing devices 101.
 For example, when the width and height of an object can be obtained from the provided information of a certain providing device 101 but the depth has low reliability, more accurate information can be created comprehensively, for example by obtaining the depth of the object using the provided information of another providing device 101 that detected the object from a different angle.
 Also, for example, when the dimensions of an object can be obtained from the provided information of a certain providing device 101 but the speed has low reliability, more accurate information can be created comprehensively, for example by obtaining the speed of the object using the provided information of another providing device 101 capable of detecting the object from a different position, such as a different altitude.
 [Improvement processing]
 Next, after performing the integration processing, the information creation unit 42 performs improvement processing for creating change information for improving object detection in the target device.
 In the improvement processing, the information creation unit 42 first converts the coordinate system of the position information included in the created integrated information into the coordinate system of the target device.
 Next, the information creation unit 42 compares the created integrated information with the corresponding object detection information, that is, the object detection information received from the target device, and based on the comparison results, selects and calculates the information to be transmitted to the target device.
 More specifically, the information creation unit 42 creates the change information by, for example, selecting and calculating information such that the object detection results of the providing device 101, or its future object detection, become more accurate.
 For example, the change information indicates changes to the object detection results in the providing device 101.
 Specifically, for example, the change information includes the correct dimensions, correct position information, correct inclination, correct speed, or correct acceleration of the detected object in the target device, together with time information serving as the reference for these elements.
 The change information is not limited to the correct values themselves; it may include, for each of the above elements of the object detection information, the difference from the correct value obtained from the integrated information.
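 As a sketch of how such change information might be assembled, the code below compares the element values reported by the target device with the integrated values and records, for each differing element, either the correct value or the difference from it, together with the reference time. The output structure is an assumption made for illustration.

```python
def make_change_info(target_elements, integrated_elements, reference_time, as_delta=False):
    """target_elements / integrated_elements: {"position": (x, y), "speed": ..., ...}
    Returns change information covering only the elements that differ."""
    corrections = {}
    for name, correct in integrated_elements.items():
        reported = target_elements.get(name)
        if reported == correct:
            continue
        if as_delta and isinstance(correct, (int, float)) and isinstance(reported, (int, float)):
            corrections[name] = correct - reported   # difference from the correct value
        else:
            corrections[name] = correct              # the correct value itself
    return {"reference_time": reference_time, "corrections": corrections}
```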
 As another example, the change information may indicate changes to the settings of the sensor 51 or 81 used for object detection.
 Specifically, for example, the change information includes correction parameters applicable to the sensor 51 or 81 of the providing device 101. For a camera, for example, the change information includes setting values for orientation, zoom, frame rate, image size, various color tones, and the like. For a LiDAR device, for example, the change information includes the laser orientation, angle range, rotation speed, and the like.
 As another example, the change information may indicate changes to the content of the analysis processing applied to the measurement results of the sensor 51 or 81 used for object detection.
 Specifically, for example, the change information includes correction parameters applicable to the analysis processing of the analysis unit 21 or 71 of the providing device 101. For example, the change information includes determination criteria for template matching, weights used in the network of the learning model, and the like.
 Next, the information creation unit 42 sets a utilization time, which is the time at which the target device should utilize the change information, and creates change information in which part or all of the content of the change information based on the above comparison results has been changed to content suitable for the utilization time. Specifically, for example, the information creation unit 42 creates change information that reflects a prediction of the future position of a detected object that is a moving body, taking into account the transmission delay from the vehicle management device 201 to the target device, the processing delay in the target device, and the like.
 On the other hand, when no change according to the utilization time is necessary, the information creation unit 42 maintains the content of the change information based on the above comparison results.
 Here, the information creation unit 42 uses this utilization time as, for example, the "reference time information" of the change information described above.
 The utilization time may also be a past time, for example when it is useful to correct a past object detection result in the target device.
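 A minimal sketch of adjusting the change information to the utilization time is shown below: the expected transmission and processing delays are added to the creation time, and the position of a moving detected object is advanced to that time under a constant-velocity assumption. The delay figures and the motion model are illustrative assumptions.

```python
import math

def shift_to_utilization_time(position, speed, heading_deg, created_at,
                              transmission_delay=0.05, processing_delay=0.02):
    """Predict where a moving detected object will be when the target device
    applies the change information (constant-velocity model, illustrative)."""
    utilization_time = created_at + transmission_delay + processing_delay
    dt = utilization_time - created_at
    yaw = math.radians(heading_deg)
    predicted = (position[0] + speed * dt * math.cos(yaw),
                 position[1] + speed * dt * math.sin(yaw))
    return utilization_time, predicted
```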
 Such improvement processing makes it possible, for example, to improve the object detection capability of the providing device 101 and to correct the object detection results of the providing device 101. More specifically, sophisticated element-by-element correction can be realized, covering the dimensions and other elements of the object detection results of the providing device 101, the camera orientation and LiDAR laser orientation and other settings of the sensor 51 or 81, and the determination criteria, weights, and the like of the analysis processing of the analysis unit 21 or 71 of the providing device 101.
 Further, when the providing device 101 transmits to the vehicle management device 201 new object detection information and detection condition information obtained by object detection whose accuracy has been improved using the change information received from the vehicle management device 201, a virtuous cycle can be created in which the accuracy of the integrated information in the vehicle management device 201 is further improved.
 The information creation unit 42 in the processing unit 31 may be configured to perform the integration processing at, for example, periodically arriving integration timings. The information creation unit 42 may also be configured to perform the integration processing and the improvement processing asynchronously.
 The information creation unit 42 can realize the integration processing and the improvement processing described above by, for example, statistical computation methods, geometric computation methods such as graphic transformations, inference using neural networks, and the like.
 Next, the information creation unit 42 outputs the change information created by the improvement processing to the transmission unit 32.
 The transmission unit 32 transmits the change information received from the information creation unit 42 to the corresponding providing device 101, that is, the target device, via the external network 5 and the radio base station device 4.
 Referring again to FIG. 2, when the target device is the providing device 101A, the receiving unit 13 receives the change information from the vehicle management device 201 via the radio base station device 4 and outputs it to the processing unit 11.
 The detection change unit 22 in the processing unit 11 makes changes regarding object detection in the providing device 101A based on the change information received by the receiving unit 13.
 More specifically, based on the change information received from the receiving unit 13, the detection change unit 22 performs at least one of, for example, correcting the object detection results of the analysis unit 21, changing parameters relating to the sensor 51, and changing parameters relating to the analysis processing of the analysis unit 21.
 Specifically, the detection change unit 22 corrects the dimensions, position information, inclination, speed, acceleration, and the like of the detected object in the object detection results of the analysis unit 21 at the utilization time to the values indicated by the change information.
 The detection change unit 22 also corrects the setting values of the camera serving as the sensor 51, such as orientation, zoom, frame rate, image size, and various color tones, to the values indicated by the change information. The detection change unit 22 also corrects the laser orientation, angle range, rotation speed, and the like of the LiDAR serving as the sensor 51 to the values indicated by the change information.
 The detection change unit 22 also corrects the determination criteria for template matching in the analysis processing of the analysis unit 21, the weights used in the network of the learning model, and the like to the values indicated by the change information.
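 To illustrate how the detection change unit 22 might apply the received change information, the following sketch dispatches each correction to the detection result, the sensor settings, or the analysis parameters; the three categories mirror the description above, but the concrete keys are assumptions.

```python
RESULT_KEYS = {"size", "position", "heading", "speed", "acceleration"}
SENSOR_KEYS = {"camera_orientation", "zoom", "frame_rate", "image_size",
               "color_tone", "lidar_orientation", "lidar_angle_range", "lidar_rpm"}
ANALYSIS_KEYS = {"template_matching_threshold", "model_weights"}

def apply_change_info(change_info, detection_result, sensor_settings, analysis_params):
    """Apply corrections received from the vehicle management device (illustrative).
    All three trailing arguments are mutable dicts held by the providing device."""
    for name, value in change_info.get("corrections", {}).items():
        if name in RESULT_KEYS:
            detection_result[name] = value    # correct the object detection result
        elif name in SENSOR_KEYS:
            sensor_settings[name] = value     # change the sensor 51 settings
        elif name in ANALYSIS_KEYS:
            analysis_params[name] = value     # change the analysis processing parameters
```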
 The operations of the receiving unit 63, the detection change unit 72, and the like in the providing device 101B shown in FIG. 3 upon arrival of change information are similar to the above operations in the providing device 101A.
 [Operation flow]
 Each device in the vehicle management system 301 includes a computer including a memory, and an arithmetic processing unit such as a CPU in the computer reads from the memory and executes a program including some or all of the steps of the following flowcharts and sequence. The programs of these devices can each be installed from outside. The programs of these devices are each distributed while stored in a recording medium.
 FIG. 6 is a diagram showing an example of a sequence of processing of provided information and change information in the vehicle management system according to the embodiment of the present disclosure. FIG. 6 shows, as an example, an operation in which a single target device transmits provided information to the vehicle management device 201 and receives change information from the vehicle management device 201.
 Referring to FIG. 6, assume first a situation in which object detection information and detection condition information have been transmitted from one or more other providing devices 101 (step S1) and accumulated in the storage unit 34 of the vehicle management device 201. Here, object detection information and detection condition information previously transmitted from the target device may also have been accumulated in the storage unit 34 (step S2).
 Next, the target device performs analysis processing based on the measurement results received from the sensor 51 or 81 (step S3).
 Next, the target device creates object detection information indicating the object detection results and detection condition information indicating the conditions for the detection of the objects (step S4), and transmits them to the vehicle management device 201 (step S5).
 Next, the vehicle management device 201 accumulates the object detection information and the detection condition information received from the target device in the storage unit 34 (step S6).
 Next, the vehicle management device 201 performs integration processing for creating integrated information that integrates the object detection information transmitted from the providing devices 101. More specifically, in the integration processing, the vehicle management device 201 performs time synchronization to align the temporal variation of the provided information transmitted from the providing devices 101, using as the target time, for example, the above-described measurement time included in the provided information received from the target device to which the change information is to be provided (step S7).
 Next, the vehicle management device 201 performs improvement processing for creating change information for improving object detection in the target device (step S8).
 Next, the vehicle management device 201 transmits the created change information to the target device (step S9).
 Next, the target device makes changes regarding object detection in the target device based on the change information received from the vehicle management device 201 (step S10).
 FIG. 7 is a flowchart defining an operation procedure of the providing device in the vehicle management system according to the embodiment of the present disclosure. FIG. 7 shows the operation of the providing device 101A mounted on the vehicle 1.
 Referring to FIG. 7, the providing device 101 first acquires measurement results from the sensor 51 (step S21).
 Next, the providing device 101 performs analysis processing for detecting objects affecting the traffic of its own vehicle 1 based on the acquired measurement results (step S22).
 Next, the providing device 101 creates support information for autonomous traveling, automated driving, driving assistance, or the like based on the object detection results, and performs various kinds of control of its own vehicle 1, such as speed control, based on the created support information (step S23).
 Next, the providing device 101 creates object detection information indicating the object detection results and detection condition information indicating the conditions for the detection of the objects (step S24), and transmits them to the vehicle management device 201 (step S25).
 Next, the providing device 101 waits for change information from the vehicle management device 201 (NO in step S26), and upon receiving change information (YES in step S26), performs change processing based on the received change information (step S27).
 Then, when the providing device 101 newly acquires measurement results from the sensor 51, it performs the analysis processing again (step S21).
 The processing of step S23 and the processing of steps S24 and S25 may be performed in the reverse order or in parallel. The processing of steps S21 to S25 and the processing of steps S26 and S27 may also be performed in parallel. The processing of step S23 may also be performed after the change processing (step S27).
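 The flow of FIG. 7 can be pictured as the following loop sketch, in which change information is polled without blocking so that sensing and transmission continue in parallel with reception; the sensor, analyzer, controller, uplink, and downlink objects are placeholders for the units described above, not interfaces defined in the disclosure.

```python
def providing_device_pass(sensor, analyzer, controller, uplink, downlink):
    """One pass of the providing device operation of FIG. 7 (illustrative sketch)."""
    measurement = sensor.read()                            # step S21
    detection, conditions = analyzer.analyze(measurement)  # steps S22 and S24
    controller.apply(detection)                            # step S23 (vehicle-mounted case)
    uplink.send({"detection": detection, "conditions": conditions})  # step S25
    change_info = downlink.poll()                          # step S26, non-blocking
    if change_info is not None:
        analyzer.apply_change(change_info)                 # step S27
```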
 The operation of the providing device 101B mounted on the roadside unit 3 is the same as the flowchart shown in FIG. 7, except that the analysis processing in step S22 detects objects affecting the traffic of the surrounding vehicles and the processing of step S23 is not performed.
 FIG. 8 is a flowchart defining an operation procedure of the vehicle management device in the vehicle management system according to the embodiment of the present disclosure.
 図8を参照して、まず、車両管理装置201は、提供装置101からの提供情報を待ち受け(ステップS41でNO)、提供情報を受信すると(ステップS41でYES)、受信した提供情報を記憶部34に蓄積する(ステップS42)。 With reference to FIG. 8, first, when the vehicle management device 201 listens for the provided information from the providing device 101 (NO in step S41) and receives the provided information (YES in step S41), the vehicle management device 201 stores the received provided information. Accumulate in 34 (step S42).
 次に、車両管理装置201は、当該提供装置101が対象装置である場合、当該対象装置を含む各提供装置101から送信された提供情報の時間のばらつきを合わせる時刻同期を行う(ステップS43)。 Next, when the providing device 101 is the target device, the vehicle management device 201 performs time synchronization for adjusting the time variation of the provided information transmitted from each providing device 101 including the target device (step S43).
 次に、車両管理装置201は、時刻同期において選択された各提供装置101の提供情報の単一位置座標へのマッピングを行う(ステップS44)。 Next, the vehicle management device 201 maps the provided information of each providing device 101 selected in the time synchronization to a single position coordinate (step S44).
 次に、車両管理装置201は、座標変換後の各提供情報を用いて統合情報を作成する(ステップS45)。 Next, the vehicle management device 201 creates integrated information using each provided information after coordinate conversion (step S45).
 次に、車両管理装置201は、作成した統合情報に含まれる位置情報の座標系を、対象装置の座標系に変換する(ステップS46)。 Next, the vehicle management device 201 converts the coordinate system of the position information included in the created integrated information into the coordinate system of the target device (step S46).
 次に、車両管理装置201は、作成した統合情報と対象装置から受信した物体検出情報とを比較し、比較結果に基づいて、対象装置へ送信すべき情報の選別および算出等を行い、変更情報を作成する(ステップS47)。 Next, the vehicle management device 201 compares the created integrated information with the object detection information received from the target device, selects and calculates the information to be transmitted to the target device based on the comparison result, and changes the information. Is created (step S47).
 次に、車両管理装置201は、対象装置が変更情報を活用すべき時刻である活用時刻を設定し、上記比較結果に基づく変更情報の内容の一部または全部を、活用時刻に適した内容に変更した変更情報を作成する(ステップS48)。 Next, the vehicle management device 201 sets the utilization time, which is the time when the target device should utilize the change information, and makes a part or all of the content of the change information based on the above comparison result suitable for the utilization time. Create the changed change information (step S48).
 次に、車両管理装置201は、作成した変更情報を対象装置へ出力する(ステップS49)。 Next, the vehicle management device 201 outputs the created change information to the target device (step S49).
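 For illustration only, the sequence of steps S41 to S49 might be organized roughly as in the following sketch. The data layout (device_id, origin, objects, vx/vy fields), the simple union used as "integration", and the 100 ms look-ahead are all assumptions made for the example, not the disclosed processing.

```python
from typing import Callable, Dict, List


def process_provided_info(buffer: List[Dict], provided: Dict, target_id: str,
                          send: Callable[[str, Dict], None]) -> None:
    """Hypothetical sketch of the vehicle-management-device flow (steps S41 to S49)."""
    buffer.append(provided)                     # S42: accumulate the received provided information
    if provided["device_id"] != target_id:
        return                                  # here, only information from the target device triggers S43 onward

    # S43: time synchronization -- keep the newest record per providing device
    latest: Dict[str, Dict] = {}
    for info in buffer:
        prev = latest.get(info["device_id"])
        if prev is None or info["timestamp"] > prev["timestamp"]:
            latest[info["device_id"]] = info

    # S44: map every detected object onto a single common coordinate frame
    mapped = []
    for info in latest.values():
        ox, oy = info["origin"]
        for obj in info["objects"]:
            mapped.append({**obj, "x": obj["x"] + ox, "y": obj["y"] + oy})

    # S45: integrated information (here simply the union of the mapped objects)
    integrated = {"objects": mapped}

    # S46: convert the positions back into the target device's coordinate system
    tx, ty = latest[target_id]["origin"]
    for obj in integrated["objects"]:
        obj["x"] -= tx
        obj["y"] -= ty

    # S47: compare with the target device's own detections; keep objects it missed
    seen = {o["id"] for o in latest[target_id]["objects"]}
    missing = [o for o in integrated["objects"] if o["id"] not in seen]

    # S48: set a utilization time and extrapolate positions to that time
    utilization_time = latest[target_id]["timestamp"] + 0.1   # assumed 100 ms look-ahead
    for o in missing:
        dt = utilization_time - o.get("timestamp", utilization_time)
        o["x"] += o.get("vx", 0.0) * dt
        o["y"] += o.get("vy", 0.0) * dt

    # S49: output the change information to the target device
    send(target_id, {"utilization_time": utilization_time, "add_objects": missing})
```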
 Note that the providing device according to the embodiment of the present disclosure may be configured without the receiving unit and the detection changing unit, that is, without receiving change information and reflecting it in object detection. Even with such a configuration, the vehicle management device 201 can extract highly accurate information from the object detection information of the individual providing devices 101 and create more accurate and sophisticated integrated information. This makes it possible, for example, to improve the object detection accuracy of the vehicles or roadside machines to which the integrated information is distributed.
 Further, part or all of the functions of the vehicle management device according to the embodiment of the present disclosure may be provided by cloud computing. That is, the vehicle management device according to the embodiment of the present disclosure may be configured by a plurality of cloud servers or the like.
 As described above, in the providing device according to the embodiment of the present disclosure, the analysis unit 21 or 71 creates object detection information indicating the result of detection, by the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions under which the object was detected. The transmission unit 12 or 62 then transmits the object detection information and the detection condition information created by the analysis unit 21 or 71.
 Further, in the vehicle management method according to the embodiment of the present disclosure, the providing device 101 first creates object detection information indicating the result of detection, by the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions under which the object was detected. It then transmits the created object detection information and detection condition information.
 With such a configuration, a device that collects object detection information and detection condition information from each providing device 101 can, for example, exploit the object detection conditions of each providing device 101 to extract highly accurate elements from the collected object detection information. For example, the sensing environments of vehicles, roadside machines, and the like vary widely, and each sensing environment also fluctuates over time. Therefore, compared with a method that simply integrates the object detection information from the individual providing devices 101, highly accurate object detection can be realized by using the highly accurate information described above.
 Therefore, the providing device and the vehicle management method according to the embodiment of the present disclosure can improve the detection accuracy of objects that affect vehicle traffic.
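 To make the idea of exploiting per-device detection conditions concrete, the following sketch shows one possible, purely illustrative way of picking, element by element, the report produced under the most favourable conditions. The scoring rules and the dictionary layout are assumptions for the example, not the disclosed evaluation method.

```python
from typing import Dict, List


def select_best_elements(detections: List[Dict]) -> Dict:
    """Pick each element (e.g. position, speed, type) from the providing device
    whose detection conditions score highest for that element (illustrative only)."""

    def score(condition: Dict, element: str) -> float:
        s = 1.0
        if condition.get("sensor") == "radar" and element == "speed":
            s += 1.0          # assume radar measures speed more reliably
        if condition.get("sensor") == "camera" and element == "type":
            s += 1.0          # assume a camera classifies object type more reliably
        if condition.get("dark") and condition.get("sensor") == "camera":
            s -= 0.5          # assume darkness degrades camera-based elements
        return s

    best: Dict[str, tuple] = {}
    for d in detections:                            # d: {"elements": {...}, "condition": {...}}
        for element, value in d["elements"].items():
            s = score(d["condition"], element)
            if element not in best or s > best[element][0]:
                best[element] = (s, value)
    return {element: value for element, (_, value) in best.items()}


# Example use (values assumed):
merged = select_best_elements([
    {"elements": {"speed": 12.0, "type": "car"},
     "condition": {"sensor": "camera", "dark": True}},
    {"elements": {"speed": 13.4, "type": "unknown"},
     "condition": {"sensor": "radar", "dark": True}},
])
# merged -> {"speed": 13.4, "type": "car"}
```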
 Further, in the vehicle management device according to the embodiment of the present disclosure, the acquisition unit 41 acquires, from each of a plurality of providing devices 101, object detection information indicating the result of detection, by the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions under which the object was detected. The information creation unit 42 creates change information indicating changes relating to object detection in the target device, based on the object detection information and detection condition information, acquired by the acquisition unit 41, of the plurality of providing devices 101 including the target device, which is the providing device 101 of interest. The transmission unit 32 then transmits the change information created by the information creation unit 42 to the target device.
 Further, in the vehicle management system according to the embodiment of the present disclosure, the plurality of providing devices 101 including the target device, which is the providing device 101 of interest, each transmit, to the vehicle management device 201, object detection information indicating the result of detection, by the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions under which the object was detected. The vehicle management device 201 then transmits, to the target device, change information indicating changes relating to object detection in the target device, based on the received object detection information and detection condition information of the plurality of providing devices 101.
 Further, in the vehicle management method according to the embodiment of the present disclosure, the vehicle management device 201 first acquires, from each of a plurality of providing devices 101, object detection information indicating the result of detection, by the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions under which the object was detected. Next, based on the acquired object detection information and detection condition information of the plurality of providing devices 101 including the target device, which is the providing device 101 of interest, it creates change information indicating changes relating to object detection in the target device. Next, it transmits the created change information to the target device.
 Further, in the vehicle management method according to the embodiment of the present disclosure, in the vehicle management system 301, the plurality of providing devices 101 including the target device, which is the providing device 101 of interest, first each transmit, to the vehicle management device 201, object detection information indicating the result of detection, by the providing device 101, of an object affecting the traffic of the vehicle 1, and detection condition information indicating the conditions under which the object was detected. Next, the vehicle management device 201 transmits, to the target device, change information indicating changes relating to object detection in the target device, based on the received object detection information and detection condition information of the plurality of providing devices 101.
 With such a configuration, object detection information and detection condition information are collected from each providing device 101, and highly accurate elements can be extracted from the collected object detection information by exploiting the object detection conditions of each providing device 101. Then, by using the extracted highly accurate information together with the object detection information of the target device to make changes relating to object detection in the target device, the capabilities of the equipment, software, and the like that the target device uses for object detection can be improved, and the object detection results can be corrected. For example, the sensing environments of vehicles, roadside machines, and the like vary widely, and each sensing environment also fluctuates over time. Therefore, compared with a method that simply integrates the object detection information from the individual providing devices 101, feeding back the highly accurate information described above into the object detection of the target device enables highly accurate object detection in the target device.
 Therefore, the vehicle management device, the vehicle management system, and the vehicle management method according to the embodiment of the present disclosure can improve the detection accuracy of objects that affect vehicle traffic.
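 As one illustration of how a target device could reflect received change information, covering corrections to detection results, sensor setting changes, and changes to the analysis processing as discussed above, consider the following sketch; every key and interface in it is an assumption for the example rather than the disclosed format.

```python
def apply_change_info(detector, sensor, change_info: dict) -> None:
    """Hypothetical sketch of a detection changing unit applying change information."""
    # corrections or additions to the local object detection result
    for obj in change_info.get("add_objects", []):
        detector.add_object(obj)                              # supplement an object the device missed
    for corr in change_info.get("corrections", []):
        detector.correct(corr["object_id"], corr["fields"])   # e.g. corrected position or speed

    # changes to the settings of the sensor used for detection
    for name, value in change_info.get("sensor_settings", {}).items():
        sensor.set(name, value)                               # e.g. exposure, gain, scan range

    # changes to the content of the analysis processing applied to the measurements
    if "detection_threshold" in change_info:
        detector.threshold = change_info["detection_threshold"]
```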
 Furthermore, in the embodiment of the present disclosure, the amount of communication for the information collected from the providing devices 101 can be reduced. For example, sensor information requires a large amount of data per stored region in order to enable accurate object detection, and it also contains data on regions that are unnecessary for object detection. In contrast, because the object detection information and the detection condition information are the result of object detection already performed by the providing device 101, they have been converted into information indicating the features of detected objects, such as type, dimensions, position, inclination, speed, and acceleration, and they contain no information on objects that do not exist. Therefore, since the information collected from the providing devices 101 consists of object detection information and detection condition information, whose data volume is smaller than that of sensor information, the amount of communication for the information collected from the providing devices 101 can be reduced.
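 A rough, back-of-the-envelope illustration of this reduction (all figures assumed, not taken from the disclosure): a single uncompressed camera frame is several megabytes, whereas a handful of detected-object records is well under a kilobyte.

```python
# Illustrative only; the sizes below are assumptions, not measured values.
frame_bytes = 1280 * 720 * 3          # one uncompressed 1280x720 RGB camera frame (~2.7 MB)
record_bytes = 10 * 64                # ten detected objects, ~64 bytes of feature data each
print(frame_bytes // record_bytes)    # roughly 4320: the raw sensor data is thousands of times larger
```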
 Further, the information used by the vehicle management device 201 for the integration process is the object detection information and detection condition information resulting from object detection already performed by the providing devices 101, so the same processing does not need to be executed again on the vehicle management device 201 side. Therefore, the processing load of the integration process in the vehicle management device 201 can be reduced.
 Therefore, in the embodiment of the present disclosure, it is possible to improve the detection accuracy of objects that affect vehicle traffic, to reduce the amount of communication for the detection result information, and to reduce the processing load of the process of integrating the detection results.
 The above embodiment should be considered to be illustrative in all respects and not restrictive. The scope of the present invention is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 The above description includes the features appended below.
 [Appendix 1]
 A providing device comprising:
 an information creation unit that creates object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object; and
 a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit,
 wherein the object detection information includes a plurality of elements relating to the object, and
 the detection condition information includes the conditions enabling the object detection information to be evaluated on an element-by-element basis.
 [Appendix 2]
 A vehicle management device comprising:
 an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object;
 an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes relating to detection of the object in a predetermined target device among the plurality of providing devices; and
 a transmission unit that transmits the change information created by the information creation unit to the target device,
 wherein the object detection information includes a plurality of elements relating to the object,
 the detection condition information includes the conditions enabling the object detection information to be evaluated on an element-by-element basis,
 the information creation unit creates integrated information integrating the object detection information of the plurality of providing devices, and creates the change information based on a result of comparison between the created integrated information and the object detection information of the target device, and
 the information creation unit evaluates each piece of object detection information on an element-by-element basis based on the corresponding detection condition information, extracts a plurality of the elements from the object detection information of the plurality of providing devices based on the evaluation results, and creates the integrated information including the extracted elements.
 [Appendix 3]
 A vehicle management system comprising:
 a plurality of providing devices; and
 a vehicle management device,
 wherein each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object,
 the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating changes relating to detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices,
 the object detection information includes a plurality of elements relating to the object,
 the detection condition information includes the conditions enabling the object detection information to be evaluated on an element-by-element basis,
 the vehicle management device creates integrated information integrating the object detection information of the plurality of providing devices, and creates the change information based on a result of comparison between the created integrated information and the object detection information of the target device, and
 the vehicle management device evaluates each piece of object detection information on an element-by-element basis based on the corresponding detection condition information, extracts a plurality of the elements from the object detection information of the plurality of providing devices based on the evaluation results, and creates the integrated information including the extracted elements.
 [Appendix 4]
 A providing device comprising a processor and a communication circuit,
 wherein the processor implements an information creation unit that creates object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object, and
 the communication circuit implements a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
 [Appendix 5]
 A vehicle management device comprising a processor and a communication circuit,
 wherein the processor implements:
 an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object; and
 an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes relating to detection of the object in a predetermined target device among the plurality of providing devices, and
 the communication circuit implements a transmission unit that transmits the change information created by the information creation unit to the target device.
 1 Vehicle
 3 Roadside machine
 4 Radio base station device
 5 External network
 11 Processing unit
 12 Transmission unit
 13 Reception unit
 14 Storage unit
 21 Analysis unit
 22 Detection change unit
 23 Support information creation unit
 24 Vehicle control unit
 31 Processing unit
 32 Transmission unit
 33 Reception unit
 34 Storage unit
 41 Acquisition unit
 42 Information creation unit
 51, 81 Sensor
 61 Processing unit
 62 Transmission unit
 63 Reception unit
 64 Storage unit
 71 Analysis unit
 72 Detection change unit
 101, 101A, 101B Providing device
 201 Vehicle management device
 301 Vehicle management system

Claims (14)

  1.  A providing device comprising:
      an information creation unit that creates object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object; and
      a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  2.  The providing device according to claim 1, further comprising:
      a receiving unit that receives change information indicating changes relating to detection of the object in the providing device; and
      a detection changing unit that makes changes relating to detection of the object in the providing device based on the change information received by the receiving unit.
  3.  The providing device according to claim 2, wherein the change information indicates changes to the detection result.
  4.  The providing device according to claim 2 or 3, wherein the change information indicates changes to settings of a sensor used for detection of the object.
  5.  The providing device according to any one of claims 2 to 4, wherein the change information indicates changes to the content of analysis processing applied to measurement results of a sensor used for detection of the object.
  6.  The providing device according to any one of claims 1 to 5, wherein the detection condition information indicates an environment of a sensor used for detection of the object, the environment affecting measurement performance of the sensor.
  7.  The providing device according to any one of claims 1 to 6, wherein the providing device is mounted on a vehicle, and
      the detection condition information indicates a running state of the vehicle on which the providing device is mounted.
  8.  A vehicle management device comprising:
      an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object;
      an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes relating to detection of the object in a predetermined target device among the plurality of providing devices; and
      a transmission unit that transmits the change information created by the information creation unit to the target device.
  9.  A vehicle management system comprising:
      a plurality of providing devices; and
      a vehicle management device,
      wherein each of the plurality of providing devices transmits, to the vehicle management device, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object, and
      the vehicle management device transmits, to a predetermined target device among the plurality of providing devices, change information indicating changes relating to detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  10.  A vehicle management method in a providing device, comprising:
      creating object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object; and
      transmitting the created object detection information and detection condition information.
  11.  A vehicle management method in a vehicle management device, comprising:
      acquiring, from each of a plurality of providing devices, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object;
      creating, based on the acquired object detection information and detection condition information of the plurality of providing devices, change information indicating changes relating to detection of the object in a predetermined target device among the plurality of providing devices; and
      transmitting the created change information to the target device.
  12.  A vehicle management method in a vehicle management system including a plurality of providing devices and a vehicle management device, the method comprising:
      transmitting, by each of the plurality of providing devices to the vehicle management device, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object; and
      transmitting, by the vehicle management device to a predetermined target device among the plurality of providing devices, change information indicating changes relating to detection of the object in the target device, based on the received object detection information and detection condition information of the plurality of providing devices.
  13.  A vehicle management program used in a providing device, the program causing a computer to function as:
      an information creation unit that creates object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object; and
      a transmission unit that transmits the object detection information and the detection condition information created by the information creation unit.
  14.  A vehicle management program used in a vehicle management device, the program causing a computer to function as:
      an acquisition unit that acquires, from each of a plurality of providing devices, object detection information indicating a result of detection, by the providing device, of an object affecting traffic of a vehicle, and detection condition information indicating conditions for detection of the object;
      an information creation unit that creates, based on the object detection information and the detection condition information of the plurality of providing devices acquired by the acquisition unit, change information indicating changes relating to detection of the object in a predetermined target device among the plurality of providing devices; and
      a transmission unit that transmits the change information created by the information creation unit to the target device.