WO2023145436A1 - Information processing device, road side machine, traffic support system, and information complementing method - Google Patents

Information processing device, road side machine, traffic support system, and information complementing method

Info

Publication number
WO2023145436A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
detected object
detected
unit
roadside
Prior art date
Application number
PCT/JP2023/000492
Other languages
French (fr)
Japanese (ja)
Inventor
真大 新田
輝 小山
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation
Priority to JP2023576758A (JPWO2023145436A1)
Publication of WO2023145436A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to an information processing device, a roadside unit, a traffic support system, and an information supplementing method.
  • A roadside unit is known that is installed above a road or the like, detects objects such as vehicles and passersby, and notifies surrounding vehicles of the detected objects (see Patent Document 1).
  • An information processing device includes: an acquisition unit that acquires, from a surrounding roadside unit, first information from which the performance of the roadside unit can be predicted, and acquires second information about detected objects in the vicinity; and a control unit that, based on the first information, extracts from the detected objects included in the second information a latent detected object that may not have been detected by the roadside unit, and generates third information about the latent detected object.
  • The roadside unit includes: a sensor unit that detects information about detected objects in the vicinity; a communication unit that transmits first information from which the performance of the sensor unit can be predicted and receives third information about a latent detected object that may not be detected by the sensor unit; and a control unit configured to generate fourth information including the information about the detected objects most recently detected by the sensor unit and the third information, and to control the communication unit to transmit the fourth information.
  • The traffic support system includes a roadside unit and an information processing device. The roadside unit has: a sensor unit that detects information about detected objects in the vicinity; a first communication unit that transmits first information from which the performance of the sensor unit can be predicted and receives third information about a latent detected object that may not be detected by the sensor unit; and a first control unit that generates fourth information including the information about the detected objects most recently detected by the sensor unit and the third information, and controls the first communication unit to transmit the fourth information. The information processing device has: an acquisition unit that acquires the first information from the roadside units in the vicinity and acquires second information about detected objects in the vicinity; and a second control unit that, based on the first information, extracts from the detected objects included in the second information the latent detected object of the roadside unit, and generates the third information about the latent detected object.
  • The information supplementing method includes: acquiring, from a roadside unit, first information from which the performance of the surrounding roadside unit can be predicted; acquiring second information about detected objects in the vicinity; extracting, based on the first information, a latent detected object that may not be detected by the roadside unit from among the detected objects included in the second information; and generating third information about the latent detected object.
  • FIG. 1 is a diagram illustrating a configuration example of a traffic support system including a vehicle equipped with an information processing device according to one embodiment.
  • FIG. 2 is a block diagram showing a schematic configuration of the roadside unit in FIG. 1.
  • FIG. 3 is a block diagram showing a schematic configuration of the vehicle in FIG. 1.
  • FIG. 4 is a block diagram showing a schematic configuration of the information processing device in FIG. 3.
  • FIG. 5 is a state diagram for explaining a latent detected object based on the first information.
  • FIG. 6 is a state diagram for explaining a latent detected object outside the detectable range corresponding to the first information, based on the first information and the fourth information.
  • FIG. 7 is a state diagram for explaining a latent detected object within the detectable range corresponding to the first information, based on the first information and the fourth information.
  • FIG. 8 is a ladder chart for explaining the information supplementing process performed by the roadside unit and the vehicle.
  • FIG. 1 shows a configuration example of a traffic support system 11 including a vehicle 10 equipped with an information processing device according to one embodiment.
  • The traffic support system 11 is, for example, a safe driving support communication system of Intelligent Transport Systems (ITS).
  • The safe driving support communication system is also called a driving safety support system or a driving safety support radio system.
  • The traffic support system 11 includes roadside units 12 and vehicles 10.
  • The roadside unit 12 may observe observation targets such as vehicles, objects, and people on the road in a predetermined area.
  • The roadside unit 12 may be placed near an intersection where multiple roads 13 (roadways) intersect to observe the road surface.
  • The roadside unit 12 may also be arranged at a roadside location other than an intersection.
  • In FIG. 1, one roadside unit 12 is arranged at the intersection, but a plurality of roadside units 12 may be arranged.
  • For example, roadside units 12 may be arranged in a number corresponding to the number of roads 13 intersecting at the intersection, and each roadside unit 12 may be arranged so that its detection range includes at least one road 13 associated with it in advance by the installer.
  • The roadside unit 12 and a vehicle 10 traveling on the road 13 may perform wireless communication with each other.
  • A plurality of vehicles 10 may perform wireless communication with each other.
  • The roadside unit 12 notifies the vehicle 10 of various information, described later.
  • The vehicle 10 notifies the roadside unit 12 of various information acquired by the vehicle itself, described later.
  • The vehicle 10 may notify other vehicles 10 of various information acquired by the vehicle itself, described later.
  • At least the various information obtained from the roadside unit 12 may be used to assist the driver of the vehicle 10 in driving safely.
  • In this way, the traffic support system 11 may assist the driver of the vehicle 10 in driving safely.
  • The vehicle 10 is, for example, an automobile, but is not limited to an automobile, and may be a motorcycle, a bus, a streetcar, a bicycle, or the like.
  • The roadside unit 12 includes a sensor unit 14, a communication unit 15, and a control unit (first control unit) 16.
  • The roadside unit 12 may be fixed to a structure at a height from which it can capture an outdoor scene including the road 13, such as a traffic signal, a utility pole, or a streetlight near an intersection where roads 13 cross.
  • The sensor unit 14 detects information about detected objects in the vicinity.
  • A detected object may be a vehicle 10, an object, a person, or the like on the road that should be taken into account in the operation of each vehicle 10.
  • The information about a detected object may be the position of the detected object, the presence or absence of a detected object, and the type, speed, and traveling direction of an existing detected object.
  • The information about a detected object detected by the sensor unit 14 may be raw information from which the position of the detected object, the presence or absence of a detected object, and the type, speed, and traveling direction of an existing detected object can be generated.
  • The sensor unit 14 may detect information about detected objects at predetermined intervals.
  • The sensor unit 14 may include, for example, an imaging device such as a visible light camera, an FIR (Far Infrared Rays) camera, or a stereo camera, and a range-finding device such as an infrared radar, a millimeter wave radar, or LiDAR (light detection and ranging).
  • The position of a detected object, the presence or absence of a detected object, the type of the detected object, and its traveling direction can be generated based on image analysis of the image detected by the imaging device. The velocity and acceleration of the detected object can be generated based on images detected successively at different times.
  • The position of a detected object, the presence or absence of a detected object, and the type of an existing detected object can be generated based on the distances of object points detected by the range-finding device. The velocity and acceleration of the detected object can be generated based on distances detected successively at different times.
  • The sensor unit 14 has a detection range centered on its detection axis.
  • The sensor unit 14 may be installed so that a desired area on the road 13 is included in the detection range.
  • The attitude of the sensor unit 14 as installed, in other words the direction and inclination of the detection axis with respect to the ground, may be measured in advance.
  • The communication unit 15 may perform wireless communication with the vehicle 10 under the control of the control unit 16.
  • The communication unit 15 may be composed of a communication circuit and an antenna.
  • The antenna is, for example, an omnidirectional antenna.
  • The communication unit 15 performs wireless communication using, for example, the 700 MHz band assigned to ITS. The communication unit 15 may also perform wireless communication using, for example, a wireless LAN (Local Area Network).
  • The communication unit 15 may perform various processing, such as amplification, on the signal received by the antenna and output the processed received signal to the control unit 16.
  • The control unit 16 may perform various processing on the input received signal to acquire the information contained in it.
  • The communication unit 15 may perform various processing, such as amplification, on the information acquired from the control unit 16 and wirelessly transmit the processed transmission signal from the antenna.
  • The communication unit 15 transmits first information from which the performance of the sensor unit 14 can be predicted.
  • The performance of the sensor unit 14 is, for example, the detection range of the sensor unit 14.
  • The first information may be, for example, the detection range, in other words, information indicating the range in which the sensor unit 14 of the roadside unit 12 can detect a detected object.
  • Alternatively, the first information may be information from which the detection range can be estimated.
  • The information from which the detection range can be estimated may be information indicating the positions of a plurality of detected objects within the detection range of the sensor unit 14. Based on the positions of the plurality of detected objects, at least the area connecting those positions can be estimated to be the detectable range.
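As a concrete illustration, the "area connecting the plurality of positions" could be estimated as the convex hull of the reported object positions. The following is a minimal sketch; the function name, the 2-D tuple representation, and the sample coordinates are assumptions for illustration, not part of the disclosure.

```python
def convex_hull(points):
    """Andrew's monotone chain: smallest convex area containing all points."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product of vectors o->a and o->b; sign gives turn direction.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Positions of detected objects reported as first information (hypothetical values).
reported = [(0.0, 0.0), (10.0, 0.0), (10.0, 8.0), (0.0, 8.0), (5.0, 4.0)]
estimated_range = convex_hull(reported)  # interior point (5, 4) is dropped
```

Points strictly inside the hull add nothing to the estimated range, which matches the idea that the area connecting the outermost detections bounds the detectable range from below.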
  • Specifically, the communication unit 15 may transmit the fourth information.
  • The fourth information includes information about detected objects around the roadside unit 12, as described later.
  • Specifically, the communication unit 15 receives the third information.
  • The third information is generated by an information processing device mounted on a vehicle 10 and includes information about a latent detected object that may not be detected by the sensor unit 14.
  • Specifically, the communication unit 15 may also receive the fifth information.
  • The fifth information is generated by the information processing device mounted on a vehicle 10 and indicates the priority of the detection targets around each vehicle 10 to which attention should be paid.
  • The control unit 16 includes one or more processors and a memory.
  • The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing.
  • The dedicated processor may include an ASIC (Application Specific Integrated Circuit).
  • The processors may include a PLD (programmable logic device).
  • The PLD may include an FPGA (Field-Programmable Gate Array).
  • The control unit 16 may be either an SoC (System-on-a-Chip) in which one or more processors cooperate, or an SiP (System In a Package).
  • The control unit 16 may process the information about the surrounding detected objects detected by the sensor unit 14 as necessary.
  • For example, the control unit 16 may add time information to the information about a detected object.
  • The time information may be obtained from a timer built into the control unit 16 or from a timer of the roadside unit 12.
  • In a configuration in which the sensor unit 14 includes an imaging device, the control unit 16 may determine the presence or absence of a detected object by applying pattern matching or a discrimination model built by machine learning to the detected image. Furthermore, when a detected object exists, the control unit 16 may estimate the type of the detected object; may calculate the position of the detected object on the ground based on the position of the detected object in the image and the attitude of the sensor unit 14; and may estimate the traveling direction of the detected object by estimating its orientation.
  • The control unit 16 may calculate the velocity of a detected object based on the difference in ground position of the same detected object between temporally consecutive frame images and the frame period. Further, the control unit 16 may calculate the acceleration of the detected object based on the difference between velocities calculated at consecutive points in time and the frame period. The control unit 16 may treat the information processed as described above as the information about the detected object.
  • In a configuration in which the sensor unit 14 includes a range-finding device, the control unit 16 may determine the presence or absence of a detected object in each tilt direction by comparing the distances of object points detected in different tilt directions with respect to the detection axis against the distance to the road 13, as described above. Furthermore, when a detected object exists, the control unit 16 may calculate the position of the detected object on the ground based on the distance of the detected object and the attitude of the sensor unit 14, and may estimate the type of the detected object based on the size of the set of object points, over a plurality of tilt directions, that can be regarded as lying at the same distance.
  • The control unit 16 may calculate the velocity of a detected object based on the difference in ground position of the same detected object between temporally consecutive distance measurements and the measurement period, and may calculate its acceleration based on the difference between velocities calculated at consecutive points in time and the measurement period. The control unit 16 may treat the information processed as described above as the information about the detected object.
  • The control unit 16 generates new fourth information including the information about nearby detected objects most recently detected by the sensor unit 14, the third information, and the fifth information.
  • The control unit 16 controls the communication unit 15 to transmit the newly generated fourth information. Note that the control unit 16 may control the communication unit 15 so as to transmit the fourth information not only to the vehicles 10 existing in the surroundings but also to other surrounding roadside units 12 that can receive it.
  • The control unit 16 may determine whether a latent detected object in the third information is the same as a nearby detected object most recently detected by the sensor unit 14.
  • The control unit 16 may determine identity based on whether the interval between the position of the latent detected object and the position of the detected object is equal to or less than a first distance threshold. Based on the speed and acceleration of the latent detected object or of the detected object, the control unit 16 may correct either the position of the latent detected object or the position of the detected object to account for the difference between the acquisition time of the third information and the latest detection time of the sensor unit 14.
  • The control unit 16 may use the positions of the latent detected object and the detected object, adjusted to the same point in time by this position correction, for the identity determination.
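A sketch of this identity check, using a constant-acceleration position correction and a hypothetical first distance threshold (the field names, threshold value, and sample data are assumptions):

```python
import math

def corrected_position(pos, vel, acc, dt):
    """Advance a position by dt seconds assuming constant acceleration."""
    return tuple(p + v * dt + 0.5 * a * dt * dt for p, v, a in zip(pos, vel, acc))

def is_same_object(latent, detected, dt, threshold=2.0):
    """True when the latent object, corrected to the sensor's latest detection
    time, lies within the first distance threshold of the detected object."""
    lx, ly = corrected_position(latent["pos"], latent["vel"], latent["acc"], dt)
    dx, dy = detected["pos"]
    return math.hypot(lx - dx, ly - dy) <= threshold

latent = {"pos": (0.0, 0.0), "vel": (10.0, 0.0), "acc": (0.0, 0.0)}
near = {"pos": (1.0, 0.5)}
far = {"pos": (30.0, 0.0)}
# The third information is 0.1 s older than the latest detection.
same_near = is_same_object(latent, near, dt=0.1)
same_far = is_same_object(latent, far, dt=0.1)
```

Correcting both records to the same point in time before the distance comparison avoids counting a fast-moving object twice merely because the two measurements were taken at different instants.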
  • The control unit 16 may generate the fourth information periodically.
  • The control unit 16 may use the latest third information among a plurality of pieces of third information acquired from the same vehicle at different times to generate the fourth information.
  • The control unit 16 may use the third information acquired after the previous transmission of the fourth information and before the generation of the new fourth information to generate the new fourth information.
  • The control unit 16 may transmit the information read from the memory as the first information.
  • The control unit 16 may transmit the information read from the memory as the first information together with the identification information of the roadside unit 12.
  • The control unit 16 may calculate the detection range based on the information read from the memory and transmit information indicating the detection range as the first information.
  • The control unit 16 may control the communication unit 15 to transmit the first information periodically.
  • The control unit 16 may transmit the first information in the same cycle as the fourth information.
  • The control unit 16 may notify a server device or the like that manages the roadside unit 12 of the information as necessary. In addition, the control unit 16 may notify vehicles or the like around the roadside unit 12 of the information as necessary.
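Putting the above together, the periodic assembly of the fourth information might look like the following sketch. The record layout, field names, and sample data are assumptions for illustration; only the "keep the latest third information per vehicle" rule comes from the text.

```python
def generate_fourth_information(recent_detections, third_infos, fifth_infos):
    """Combine the sensor's most recent detections with the latest third and
    fifth information received from each vehicle since the last transmission."""
    # Keep only the newest third information per sending vehicle.
    latest_by_vehicle = {}
    for info in third_infos:
        prev = latest_by_vehicle.get(info["vehicle_id"])
        if prev is None or info["time"] > prev["time"]:
            latest_by_vehicle[info["vehicle_id"]] = info
    return {
        "detections": recent_detections,
        "third": list(latest_by_vehicle.values()),
        "fifth": fifth_infos,
    }

# Two third-information messages from the same vehicle; only the newer survives.
third_infos = [
    {"vehicle_id": "v1", "time": 1.0, "objects": ["pedestrian"]},
    {"vehicle_id": "v1", "time": 2.0, "objects": ["pedestrian", "bicycle"]},
]
fourth = generate_fourth_information([{"type": "car", "pos": (3.0, 4.0)}], third_infos, [])
```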
  • The vehicle 10 is equipped with an information processing device 17.
  • The vehicle 10 may further include a sensor unit 18 and an ECU (Electronic Control Unit) 19.
  • The vehicle 10 may be equipped with an in-vehicle wireless device 20 in a configuration in which the information processing device 17 does not include a wireless communication unit.
  • In a configuration in which the information processing device 17 includes a wireless communication unit, that unit has functions similar to those of the in-vehicle wireless device 20.
  • The sensor unit 18 may include various sensors.
  • The various sensors may include sensors that detect conditions around the vehicle 10, sensors that detect the state of the vehicle 10, and the like.
  • The sensors that detect conditions around the vehicle 10 may include sensors that detect information about detected objects around the vehicle 10.
  • The information about a detected object may be similar to the information about a detected object detected by the sensor unit 14 of the roadside unit 12. In other words, it may be the position of the detected object, the presence or absence of a detected object, and the type, speed, and traveling direction of an existing detected object.
  • The sensors that detect the state of the vehicle 10 may include a speed sensor that detects the speed of the vehicle itself and a positioning device, such as a GNSS (Global Navigation Satellite System) receiver, that detects the spatial position of the vehicle in real space.
  • The ECU 19 may assist the driving of the vehicle 10 based on information detected by the sensor unit 18 and information obtained from the information processing device 17, described later. In a configuration in which the vehicle 10 is controlled by a driver, the ECU 19 may notify the driver of the acquired information or of information obtained by processing it. In a configuration in which operations of the vehicle 10 such as acceleration, deceleration, and steering are performed automatically or semi-automatically, the ECU 19 may use the acquired information for determining those operations.
  • The in-vehicle wireless device 20 may perform wireless communication with the roadside units 12 and other vehicles 10 in the vicinity.
  • The in-vehicle wireless device 20 may be composed of a communication circuit and an antenna.
  • The antenna is, for example, an omnidirectional antenna.
  • The in-vehicle wireless device 20 performs wireless communication using, for example, the 700 MHz band assigned to ITS.
  • The in-vehicle wireless device 20 may also perform wireless communication using, for example, a wireless LAN (Local Area Network).
  • The in-vehicle wireless device 20 may output information acquired from nearby roadside units 12 and vehicles 10 to the information processing device 17.
  • The in-vehicle wireless device 20 may output information acquired from the information processing device 17 to external devices such as the roadside unit 12 and other vehicles 10.
  • The information processing device 17 includes an acquisition unit 21 and a control unit (second control unit) 22.
  • The acquisition unit 21 is an information interface with electronic devices other than the information processing device 17.
  • The acquisition unit 21 acquires the first information about the roadside units 12 from surrounding roadside units 12.
  • In a configuration in which the in-vehicle wireless device 20 is mounted on the vehicle 10 separately from the information processing device 17, the acquisition unit 21 acquires the first information from the surrounding roadside units 12 via the in-vehicle wireless device 20.
  • The acquisition unit 21 may be able to acquire the fourth information together with the first information from the surrounding roadside units 12. As with the first information, in a configuration in which the in-vehicle wireless device 20 is mounted on the vehicle 10 separately from the information processing device 17, the acquisition unit 21 may acquire the fourth information from the surrounding roadside units 12 via the in-vehicle wireless device 20.
  • The acquisition unit 21 acquires second information about detected objects around the vehicle 10 on which it is mounted.
  • The second information about a detected object may be similar to the information about a detected object detected by the sensor unit 14 of the roadside unit 12.
  • The second information about a detected object may be the position of the detected object, the presence or absence of a detected object, and the type, speed, and traveling direction of an existing detected object.
  • The acquisition unit 21 may acquire the second information from the sensor unit 18.
  • The acquisition unit 21 may also acquire the second information from a surrounding vehicle 10 other than the vehicle 10 on which the information processing device 17 is mounted. Conversely, the vehicle 10 equipped with the information processing device 17 may transmit the second information it has acquired to other vehicles 10 or roadside units 12 in its vicinity.
  • The acquisition unit 21 may also have a function of outputting information to external devices. In the following description, it is assumed that the acquisition unit 21 has such a function.
  • The acquisition unit 21 may output the third information and the fifth information generated by the control unit 22 to the in-vehicle wireless device 20, as described later.
  • The acquisition unit 21 may output information about detected objects around the vehicle 10 on which the information processing device 17 is mounted to the ECU 19.
  • The control unit 22 includes one or more processors and a memory.
  • The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing.
  • The dedicated processor may include an application-specific IC (ASIC).
  • The processors may include a programmable logic device (PLD).
  • The PLD may include an FPGA.
  • The control unit 22 may be either an SoC in which one or more processors cooperate, or an SiP.
  • The control unit 22 may process the information about the surrounding detected objects detected by the sensor unit 18 as necessary.
  • For example, the control unit 22 may add time information to the information about a detected object.
  • The time information may be obtained from a timer built into the control unit 22 or a timer of the information processing device 17.
  • Based on the first information, the control unit 22 extracts, from among the detected objects included in the second information, detected objects that may not have been detected by the roadside unit 12 that notified the first information, as latent detected objects.
  • The control unit 22 may predict the performance of the roadside unit 12 based on the first information in order to extract the latent detected objects. For example, the control unit 22 may recognize the detectable range as the performance of the roadside unit 12 based on the first information. When the first information is information indicating the detectable range itself, the control unit 22 may recognize that range as the detectable range of the roadside unit 12. When the first information is information from which the detection range can be estimated, the control unit 22 may recognize the range calculated based on the first information as the detectable range. When acquiring the first information from a plurality of roadside units 12, the control unit 22 may distinguish the first information of each roadside unit 12 based on the identification information acquired together with it, and calculate the detectable range for each roadside unit 12.
  • Based on the first information, more specifically based on the detectable range corresponding to the first information, the control unit 22 extracts, from among the detected objects included in the second information, detected objects that may not have been detected by the roadside unit 12 that notified the first information, as latent detected objects. As shown in FIG. 5, the control unit 22 extracts, for example, a detected object located outside the detectable range rd among the detected objects do2 included in the second information as a latent detected object hdo.
  • The control unit 22 may extract, as latent detected objects hdo, detected objects do2 included in the second information that are located outside the detectable range rd and are not included in the fourth information. In other words, even if a detected object do2 included in the second information is located outside the detectable range rd, the control unit 22 does not identify it as a latent detected object hdo when it corresponds to a detected object do4 included in the fourth information.
  • The control unit 22 may determine whether a detected object do2 included in the second information is not included in the fourth information by comparing, against the fourth information, which is in the same category as the second information, the position of the detected object in the second information with the position of the detected object in the fourth information.
  • The control unit 22 may likewise extract, as latent detected objects hdo, detected objects do2 included in the second information that are located within the detectable range rd and are not included in the fourth information. In other words, even if a detected object do2 included in the second information is located within the detectable range rd, the control unit 22 does not identify it as a latent detected object hdo when it corresponds to a detected object do4 included in the fourth information.
  • As above, the control unit 22 may determine whether a detected object do2 included in the second information is not included in the fourth information by comparing it against the fourth information, which is in the same category as the second information.
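The extraction described in the preceding bullets can be sketched as follows. The polygon representation of the detectable range rd, the matching threshold, and the record fields are illustrative assumptions; the rule implemented is the one above: a do2 with no positional counterpart in the fourth information is latent, and the in/out-of-range result is only annotated.

```python
import math

def in_detectable_range(point, polygon):
    """Ray-casting point-in-polygon test for a detectable range given as a polygon."""
    x, y = point
    inside = False
    for i in range(len(polygon)):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % len(polygon)]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def extract_latent(second_info, fourth_info, detectable_range, threshold=2.0):
    """Objects in the second information with no counterpart in the fourth
    information are latent, whether inside or outside the detectable range."""
    latent = []
    for do2 in second_info:
        matched = any(
            math.dist(do2["pos"], do4["pos"]) <= threshold for do4 in fourth_info
        )
        if not matched:
            latent.append(
                {**do2, "outside_range": not in_detectable_range(do2["pos"], detectable_range)}
            )
    return latent

rd = [(0.0, 0.0), (20.0, 0.0), (20.0, 20.0), (0.0, 20.0)]
second = [{"pos": (5.0, 5.0)}, {"pos": (25.0, 5.0)}]  # one inside rd, one outside
fourth = [{"pos": (5.2, 5.1)}]                        # roadside unit already saw the first
latent = extract_latent(second, fourth, rd)
```

Here the in-range object near (5, 5) is suppressed because the fourth information already contains it, while the object at (25, 5) becomes a latent detected object flagged as lying outside rd.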
  • The control unit 22 generates third information including at least the information about the latent detected objects hdo extracted as described above.
  • The information about a latent detected object hdo may be similar to the information about a detected object detected by the sensor unit 18 of the vehicle 10.
  • The information about a latent detected object hdo may be the position of the latent detected object hdo, the presence or absence of a latent detected object hdo, and the type, speed, and traveling direction of an existing latent detected object hdo.
  • The control unit 22 may specify a range in which the latent detected object hdo is located.
  • The range in which the latent detected object hdo is located need not be only the position of the latent detected object hdo itself; it may be a range of predetermined width defined with reference to, for example, the position of the latent detected object hdo.
  • The control unit 22 may include, in the third information, information indicating that the specified range is a blind spot of the roadside unit 12 that notified the fourth information.
  • the control unit 22 extracts the first time point included in the second information, in other words, the time information added to the second information. Based on the corresponding time, the state of the detected object (first detected object) do2 included in the second information at the third time point after the first time point may be predicted. Similarly, based on the second time point included in the fourth information, in other words, the time added to the fourth information, the control unit 22 controls the detection included in the fourth information at a third time point after the second time point. The state of the object (second detected object) do4 may be predicted.
  • For example, when the second information includes velocity and acceleration together with the position, the control unit 22 predicts the position of the detected object do2 in the second information at the third time point. Likewise, when the fourth information includes velocity and acceleration together with the position, the control unit 22 predicts the position of the detected object do4 in the fourth information at the third time point.
  • The control unit 22 may determine whether or not the predicted states at the third time point, in other words the states of the detected object do2 included in the second information and the detected object do4 included in the fourth information at the same time point, are the same. When the control unit 22 determines that the states are the same, it may exclude the information about either the detected object do2 included in the second information or the detected object do4 included in the fourth information from the generation of the third information.
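The prediction at a common third time point and the same-object determination described above can be sketched as follows. This is a minimal illustration: the `Detection` class, the constant-velocity motion model, and the 1 m matching tolerance are assumptions for illustration, not details taken from the embodiment.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    x: float   # position [m]
    y: float
    vx: float  # velocity [m/s]
    vy: float
    t: float   # detection time (first or second time point) [s]

def predict(d: Detection, t3: float) -> tuple:
    """Predict the position at the common third time point t3,
    assuming constant velocity over the short interval."""
    dt = t3 - d.t
    return (d.x + d.vx * dt, d.y + d.vy * dt)

def is_same_object(d2: Detection, d4: Detection, t3: float, tol: float = 1.0) -> bool:
    """Compare the states predicted for t3; if the predicted positions
    agree within tol metres, treat the two entries as the same object,
    so one of them can be excluded from the third information."""
    return math.dist(predict(d2, t3), predict(d4, t3)) <= tol

# The same moving object seen in the second information (t = 0.0 s)
# and in the fourth information (t = 0.1 s):
do2 = Detection(x=10.0, y=0.0, vx=5.0, vy=0.0, t=0.0)
do4 = Detection(x=10.5, y=0.0, vx=5.0, vy=0.0, t=0.1)
print(is_same_object(do2, do4, t3=0.2))  # → True
```

Correcting both detections to the same time point before comparing is what allows entries whose raw positions differ, only because of the detection-time shift, to be recognized as one object.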
  • The control unit 22 may determine whether or not the detected objects do2 included in the second information include the vehicle 10. When the detected objects do2 include the vehicle 10, the control unit 22 may calculate the priority of the other detected objects viewed from the vehicle 10. The other detected objects may include vehicles 10 other than the vehicle 10 on which the information processing device 17 is mounted. The control unit 22 may generate fifth information in which the calculated priority is associated with each of the other detected objects.
  • For example, the control unit 22 calculates the velocity vectors of the other detected objects viewed from the vehicle 10, which is a detected object do2 included in the second information. Further, the control unit 22 calculates a range of a predetermined size centered on the vehicle 10.
  • The range of the predetermined size is, for example, a circle whose radius is considered sufficient to ensure safety around the vehicle 10, for example a radius of 100 m.
  • The control unit 22 calculates the priority based on the velocity vectors and the range. Specifically, the control unit 22 determines whether or not a position moved from the position of another detected object, by the moving distance obtained by multiplying the velocity vector of that detected object by a predetermined time, is included in the range.
  • the control unit 22 may calculate a higher priority when the movement position is included in the range than when it is outside the range. Alternatively, the control unit 22 may calculate the priority so that the smaller the distance between the movement position and the position of the vehicle 10, the higher the priority.
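A minimal sketch of this velocity-vector-based priority calculation could look as follows. The function name, the 3 s prediction horizon, and the linear distance-based score are assumptions for illustration; only the 100 m example radius appears in the text above.

```python
import math

def priority_from_motion(obj_pos, obj_vel, own_pos, dt=3.0, radius=100.0):
    """Move the other object's position along its velocity vector for a
    predetermined time dt and check whether the moved position falls
    inside the safety circle of the given radius around the vehicle.
    Returns 0.0 when it does not; otherwise a score in (0, 1] that grows
    as the moved position gets closer to the vehicle."""
    moved = (obj_pos[0] + obj_vel[0] * dt, obj_pos[1] + obj_vel[1] * dt)
    dist = math.dist(moved, own_pos)
    if dist > radius:
        return 0.0
    return 1.0 - dist / radius

own = (0.0, 0.0)
print(priority_from_motion((80.0, 0.0), (-10.0, 0.0), own))  # → 0.5 (approaching)
print(priority_from_motion((80.0, 0.0), (10.0, 0.0), own))   # → 0.0 (moving away)
```

An object predicted to end up inside the range thus receives a higher priority than one outside it, and among those inside, closer predicted positions score higher, matching the two alternatives described above.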
  • For example, the control unit 22 calculates the positional interval between the vehicle 10, which is a detected object do2 included in the second information, and another detected object.
  • The control unit 22 calculates the priority based on the change over time between the previously calculated positional interval between the same vehicle 10 and the other detected object and the newly calculated positional interval. Specifically, the control unit 22 may calculate the priority such that the more rapidly the positional interval shrinks over time, the higher the priority.
  • the control unit 22 may weight the priority based on the type of detected object. For example, the larger the size of the detected object or the faster the speed of movement of the detected object, the greater the weighting may be.
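Combining the interval-based priority with the type weighting, a hedged sketch might be the following; the closing-speed formulation, the weight table, the 0.5 s cycle, and all names are hypothetical choices, not values from the embodiment.

```python
# Illustrative weights: larger or faster-moving object types weigh more.
TYPE_WEIGHT = {"pedestrian": 1.0, "bicycle": 1.2, "car": 1.5, "truck": 2.0}

def priority_from_interval(prev_gap, new_gap, obj_type, cycle=0.5):
    """Base priority from how quickly the positional interval to the
    other detected object shrinks between two processing cycles,
    weighted by the type of the detected object."""
    closing_speed = (prev_gap - new_gap) / cycle  # [m/s], > 0 when approaching
    return max(closing_speed, 0.0) * TYPE_WEIGHT.get(obj_type, 1.0)

# Interval shrank from 50 m to 48 m within one 0.5 s cycle:
print(priority_from_interval(50.0, 48.0, "truck"))  # → 8.0
```

An object whose interval is growing contributes no base priority, while a rapidly approaching large object, such as the truck in the example, is scored highest.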
  • the control unit 22 may control the acquisition unit 21 to output the generated third information and fifth information to the in-vehicle wireless device 20 .
  • the control unit 22 may control the acquisition unit 21 to output the second information and the fourth information to the ECU 19 .
  • Information processing in each vehicle 10 is performed periodically while the information processing device 17 of the vehicle 10 is in operation, and when the roadside unit 12 and other vehicles 10 are located within a communicable range, the results of communication with them are used for the information processing of that vehicle 10.
  • Information processing in the roadside unit 12 is performed periodically, and when a vehicle 10 is located within the communicable range, the results of communication with that vehicle 10 are used for the information processing of the roadside unit 12. The information processing performed by the vehicles 10 and the roadside unit 12 at an arbitrary time is described below as an example, but the order of information processing in each vehicle 10 and the roadside unit 12 is not limited to the following example.
  • In step S100, the roadside unit 12 detects information about detected objects with the sensor unit 14.
  • The roadside unit 12 generates the fourth information using the information about the detected objects. As will be described later, if the roadside unit 12 has previously acquired at least one of the third information and the fifth information, it may also use that information to generate the fourth information.
  • the roadside device 12 transmits the first information and the generated fourth information to the surrounding vehicles 10 .
  • In step S101, the first vehicle 10a, which is an arbitrary vehicle 10, detects information about detected objects with the sensor unit 18.
  • the first vehicle 10a generates second information using information about the detected object.
  • the first vehicle 10a transmits the generated second information to surrounding vehicles 10 including the second vehicle 10b other than the first vehicle 10a.
  • In step S102, the sensor unit 18 of the second vehicle 10b detects information about detected objects.
  • the second vehicle 10b uses the information about the detected object to generate the second information.
  • the second vehicle 10b transmits the generated second information to surrounding vehicles 10 including the first vehicle 10a other than the second vehicle 10b.
  • In step S103, the first vehicle 10a predicts, for each of the second information detected by the first vehicle 10a, the second information acquired from the second vehicle 10b, and the fourth information acquired from the roadside unit 12, the state of the detected objects at the third time point, which is a common time point.
  • In step S104, the first vehicle 10a extracts the latent detected object hdo based on at least one of the comparison of the states of the detected objects predicted in step S103 and the first information acquired from the roadside unit 12.
  • In step S105, for each detected object included in the second information detected by the first vehicle 10a, the second information acquired from the second vehicle 10b, and the fourth information acquired from the roadside unit 12, the first vehicle 10a calculates the priority of the other detected objects viewed from that detected object and generates the fifth information. The first vehicle 10a transmits the fifth information to the roadside unit 12 together with the third information generated in step S104.
  • the second vehicle 10b executes processing similar to steps S103 to S105 in the first vehicle 10a, and transmits third information and fifth information to the roadside unit 12.
  • As described above, the information processing device 17 of the present embodiment includes the acquisition unit 21, which acquires from a nearby roadside unit 12 first information from which the performance of the roadside unit 12 can be predicted and acquires second information about detected objects in the vicinity, and the control unit 22, which, based on the first information, extracts from among the detected objects do2 included in the second information a latent detected object hdo that the roadside unit 12 may not have detected and generates third information about the latent detected object hdo.
  • the information processing device 17 can generate information about a detected object (latent detected object hdo) other than the detected object actually detected by the roadside device 12 . Therefore, the information processing device 17 can generate information that supplements the information detected by the roadside unit 12 .
  • Further, in the information processing device 17, the acquisition unit 21 can acquire the fourth information about the detected objects detected by the roadside unit 12, and the control unit 22 can extract, as latent detected objects hdo, those detected objects among the detected objects do2 included in the second information that are located outside the detectable range rd corresponding to the first information and that are not included in the fourth information.
  • A detected object located outside the detectable range rd is highly likely not to have been detected by the roadside unit 12. Therefore, with the above configuration, the information processing device 17 can speed up the extraction process by using location outside the detectable range rd as the first extraction condition.
  • However, the roadside unit 12 may in fact detect information about a detected object even though the object is located outside the detectable range rd. To handle such an event, the information processing device 17 further narrows the extraction to detected objects that are not included in the fourth information, which improves the extraction accuracy of the latent detected object hdo.
  • Further, in the information processing device 17, the acquisition unit 21 can acquire the fourth information about the detected objects detected by the roadside unit 12, and the control unit 22 can extract, as a latent detected object hdo, a detected object among the detected objects do2 included in the second information that is located within the detectable range rd corresponding to the first information and that is not included in the fourth information.
  • a detected object may exist in a blind spot from the roadside device 12 due to the presence of an obstacle, even though it is located within the detectable range rd of the roadside device 12 .
  • the information processing device 17 having the configuration described above can improve the extraction accuracy of the latent detection object hdo.
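The two extraction conditions discussed above — an object outside the detectable range rd that is absent from the fourth information, and an object inside rd that is absent from it (for example, hidden in a blind spot) — can be sketched together as follows. Matching objects by an `id` key and the square toy range are simplifications for illustration; the embodiment would match by predicted state instead.

```python
def extract_latent(second_info, fourth_info, in_detectable_range):
    """Return the detected objects in the second information that are
    absent from the fourth information. An absent object outside the
    range rd satisfies the first (cheap) condition; an absent object
    inside rd is additionally flagged as lying in a blind spot."""
    reported = {obj["id"] for obj in fourth_info}
    latent = []
    for obj in second_info:
        if obj["id"] not in reported:
            latent.append(dict(obj, blind_spot=in_detectable_range(obj["pos"])))
    return latent

rd = lambda pos: abs(pos[0]) <= 50 and abs(pos[1]) <= 50  # toy square range
second = [{"id": 1, "pos": (10, 10)},   # inside rd, unreported -> blind spot
          {"id": 2, "pos": (80, 0)},    # outside rd, unreported
          {"id": 3, "pos": (20, -5)}]   # reported by the roadside unit
fourth = [{"id": 3, "pos": (20, -5)}]
for hdo in extract_latent(second, fourth, rd):
    print(hdo)
```

Object 1 illustrates the blind-spot case whose range can be included in the third information, while object 2 illustrates the outside-range case.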
  • Further, the information processing device 17 includes, in the third information, the range in which the latent detected object hdo is located as information indicating that the range is a blind spot of the roadside unit 12. With such a configuration, the information processing device 17 can supplement information that is more useful to the roadside unit 12.
  • Further, based on the first time point included in the second information and the second time point included in the fourth information, the information processing device 17 predicts the respective states, at the third time point, of the first detected object included in the second information and the second detected object included in the fourth information. With such a configuration, even if information about the same detected object differs because of a shift in detection time, the information processing device 17 corrects for the difference before comparing the information, and can therefore improve the accuracy of determining whether pieces of information relate to the same detected object.
  • Further, when the information processing device 17 determines, based on the prediction result, that the first detected object and the second detected object are the same, it excludes either the second information about the first detected object or the fourth information about the second detected object from the generation of the third information.
  • the information processing device 17 can prevent the third information from including a plurality of pieces of information regarding the same detected object with different detection times. Therefore, the information processing device 17 can reduce the processing load on the roadside device 12 that acquires the third information.
  • Further, the information processing device 17 calculates the priority of the detected objects viewed from the vehicle 10 and generates the fifth information, in which the calculated priority is associated with each detected object. With such a configuration, the information processing device 17 can generate information useful to other vehicles 10 that acquire the fifth information via the roadside unit 12. By generating such useful information in place of the roadside unit 12, the information processing device 17 can reduce the processing load on the roadside unit 12.
  • The roadside unit 12 configured as described above includes the sensor unit 14, which detects information about detected objects in the vicinity; the communication unit 15, which transmits the first information from which the performance of the sensor unit 14 can be predicted and receives the third information about a latent detected object hdo that the sensor unit 14 may not have detected; and the control unit 16, which generates the fourth information including the information about the detected objects most recently detected by the sensor unit 14 and the third information, and which controls the communication unit 15 to transmit the fourth information.
  • With such a configuration, the roadside unit 12 generates the fourth information including the third information about objects that the roadside unit 12 itself is highly likely to have failed to detect, and can therefore provide the vehicles 10 with more useful information.
  • Embodiments of the information processing device 17 and the roadside unit 12 have been described above.
  • As a storage medium on which such a program is recorded, an optical disc, a magneto-optical disc, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, a memory card, or the like can also be used.
  • The implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter.
  • The program may be configured so that all processing is performed only by the CPU on the control board, or may be configured so that part or all of the processing is executed by another processing unit mounted on an expansion board or expansion unit added to the board as required.
  • In the embodiments described above, the information processing device 17 and the ECU 19 in the vehicle 10 are described as separate components.
  • the information processing device 17 may include the ECU 19 and perform not only the processing as the information processing device 17 described above but also the processing performed by the ECU 19 described above.
  • Embodiments according to the present disclosure are not limited to the specific configurations of the embodiments described above. Embodiments of the present disclosure can extend to all novel features, or combinations thereof, described in the present disclosure, and to all novel methods or process steps, or combinations thereof, described herein.
  • Descriptions such as "first" and "second" in this disclosure are identifiers for distinguishing configurations. Configurations distinguished by descriptions such as "first" and "second" in this disclosure may have their numbers interchanged. For example, the first information can exchange its identifiers "first" and "second" with the second information. The exchange of identifiers is performed simultaneously, and the configurations remain distinct after the exchange. Identifiers may be deleted, and configurations whose identifiers have been deleted are distinguished by reference signs. The mere description of identifiers such as "first" and "second" in this disclosure shall not be used as a basis for interpreting the order of the configurations or for the existence of identifiers with smaller numbers.


Abstract

This information processing device comprises an acquisition unit and a control unit. The acquisition unit acquires first information from a road side machine in the vicinity. The first information makes it possible to predict the properties of the road side machine. The acquisition unit acquires second information pertaining to detection objects in the vicinity. The control unit extracts, on the basis of the first information, a latent detection object which may not have been detected by the road side machine, from among the detection objects included in the second information. The control unit generates third information pertaining to the latent detection object.

Description

Information processing device, roadside unit, traffic support system, and information supplementation method

Cross-reference to related applications

This application claims priority from Japanese Patent Application No. 2022-11224 filed in Japan on January 27, 2022, the entire disclosure of which is incorporated herein by reference.

The present invention relates to an information processing device, a roadside unit, a traffic support system, and an information supplementation method.

A roadside unit is known that is installed above a road or the like, detects detected objects such as vehicles and passersby, and notifies surrounding vehicles of the detected objects (see Patent Document 1).

JP 2011-242846 A
An information processing device according to a first aspect includes:
an acquisition unit that acquires, from a nearby roadside unit, first information from which the performance of the roadside unit can be predicted, and acquires second information about detected objects in the vicinity; and
a control unit that, based on the first information, extracts, from among the detected objects included in the second information, a latent detected object that the roadside unit may not have detected, and generates third information about the latent detected object.
A roadside unit according to a second aspect includes:
a sensor unit that detects information about detected objects in the vicinity;
a communication unit that transmits first information from which the performance of the sensor unit can be predicted, and receives third information about a latent detected object that the sensor unit may not have detected; and
a control unit that generates fourth information including the information about the detected objects most recently detected by the sensor unit and the third information, and controls the communication unit to transmit the fourth information.
A traffic support system according to a third aspect includes:
a roadside unit including a sensor unit that detects information about detected objects in the vicinity, a communication unit that transmits first information from which the performance of the sensor unit can be predicted and receives third information about a latent detected object that the sensor unit may not have detected, and a first control unit that generates fourth information including the information about the detected objects most recently detected by the sensor unit and the third information and controls the communication unit to transmit the fourth information; and
a vehicle equipped with an information processing device including an acquisition unit that acquires the first information from the nearby roadside unit and acquires second information about detected objects in the vicinity, and a second control unit that, based on the first information, extracts the latent detected object of the roadside unit from among the detected objects included in the second information and generates the third information about the latent detected object.
An information supplementation method according to a fourth aspect includes:
acquiring, from a nearby roadside unit, first information from which the performance of the roadside unit can be predicted;
acquiring second information about detected objects in the vicinity;
extracting, based on the first information, a latent detected object that the roadside unit may not have detected from among the detected objects included in the second information; and
generating third information about the latent detected object.
FIG. 1 is a diagram illustrating a configuration example of a traffic support system including a vehicle equipped with an information processing device according to one embodiment.
FIG. 2 is a block diagram showing a schematic configuration of the roadside unit of FIG. 1.
FIG. 3 is a block diagram showing a schematic configuration of the vehicle of FIG. 1.
FIG. 4 is a block diagram showing a schematic configuration of the information processing device of FIG. 3.
FIG. 5 is a state diagram for explaining a latent detected object based on the first information.
FIG. 6 is a state diagram for explaining a latent detected object outside the detectable range corresponding to the first information, based on the first information and the fourth information.
FIG. 7 is a state diagram for explaining a latent detected object within the detectable range corresponding to the first information, based on the first information and the fourth information.
FIG. 8 is a ladder chart for explaining the information supplementing process performed by the roadside unit and the vehicle.
An embodiment of an information processing device to which the present disclosure is applied will be described below with reference to the drawings.

FIG. 1 shows a configuration example of a traffic support system 11 including a vehicle 10 equipped with an information processing device according to one embodiment. The traffic support system 11 is, for example, a safe driving support communication system of Intelligent Transport Systems (ITS). A safe driving support communication system is also called a safe driving support system or a safe driving support radio system.

The traffic support system 11 includes roadside units 12 and vehicles 10.

The roadside unit 12 may observe observation targets such as vehicles, objects, and people on the road in a predetermined area. The roadside unit 12 may be placed near an intersection where multiple roads 13 (roadways) intersect to observe the road surface. The roadside unit 12 may also be arranged at a roadside location other than an intersection. In FIG. 1, one roadside unit 12 is arranged at the intersection. However, the arrangement is not limited to this, and a plurality of roadside units 12 may be arranged. For example, roadside units 12 may be arranged in a number corresponding to the number of roads 13 intersecting at the intersection, and each roadside unit 12 may be arranged so that at least one road 13 associated with it in advance by the installer is included in its detection range.

In the traffic support system 11, the roadside unit 12 and the vehicles 10 running on the roads 13 may perform wireless communication with each other. A plurality of vehicles 10 may perform wireless communication with one another.

The roadside unit 12 notifies the vehicles 10 of various information described later. Each vehicle 10 notifies the roadside unit 12 of various information, described later, that the vehicle itself acquires. A vehicle 10 may also notify other vehicles 10 of various information that the vehicle itself acquires. In each vehicle 10, at least the various information acquired from the roadside unit 12 may be used to assist the driver of that vehicle 10 in driving safely.

As described above, the traffic support system 11 may assist the driver of the vehicle 10 in driving safely. The vehicle 10 is, for example, an automobile, but is not limited to an automobile and may be a motorcycle, a bus, a streetcar, a bicycle, or the like.

Details of the roadside unit 12 and the vehicle 10 that make up the traffic support system 11 will be described below.
As shown in FIG. 2, the roadside unit 12 includes a sensor unit 14, a communication unit 15, and a control unit (first control unit) 16. The roadside unit 12 may be fixed to a structure that has a height from which an outdoor scene including the road 13 can be captured, such as a signal device, a utility pole, or a streetlight near an intersection where roads 13 intersect.

The sensor unit 14 detects information about detected objects in the vicinity. A detected object may be a target to be considered in driving an individual vehicle 10, such as a vehicle 10 on the road, an object, or a person. The information about a detected object may include the position of the detected object, its presence or absence, its type, its speed, and its traveling direction. The information detected by the sensor unit 14 may be original information from which the position, presence or absence, type, speed, and traveling direction of the detected object can be generated. The sensor unit 14 may detect information about detected objects at predetermined intervals.

The sensor unit 14 may include, for example, imaging devices such as a visible light camera, an FIR (Far Infrared Rays) camera, and a stereo camera, and distance measuring devices such as an infrared radar, a millimeter wave radar, and a LiDAR (Light Detection And Ranging). The position of a detected object, its presence or absence, its type, and its traveling direction can be generated based on image analysis of the images detected by the imaging devices. The velocity and acceleration of a detected object can be generated by image analysis of images detected successively at different times. The position of a detected object, its presence or absence, and its type can be generated based on the distances of object points detected by the distance measuring devices. The velocity and acceleration of a detected object can be generated based on distances detected successively at different times.

The sensor unit 14 has a detection range centered on its detection axis. The sensor unit 14 may be installed so that a desired area of the road 13 is included in the detection range. The attitude of the sensor unit 14 as installed, in other words the direction and inclination of the detection axis with respect to the ground, may be measured in advance.

The communication unit 15 may perform wireless communication with the vehicles 10 under the control of the control unit 16. The communication unit 15 may be composed of a communication circuit and an antenna. The antenna is, for example, an omnidirectional antenna. The communication unit 15 performs wireless communication using, for example, the 700 MHz band assigned to ITS. The communication unit 15 may also perform wireless communication using, for example, a wireless LAN (Local Area Network).

The communication unit 15 may perform various kinds of processing, such as amplification, on a signal received by the antenna, and output the processed received signal to the control unit 16. The control unit 16 may perform various kinds of processing on the input received signal to acquire the information contained in it. The communication unit 15 may perform various kinds of processing, such as amplification, on information acquired from the control unit 16, and wirelessly transmit the processed transmission signal from the antenna.

Specifically, the communication unit 15 transmits first information from which the performance of the sensor unit 14 can be predicted. The performance of the sensor unit 14 is, for example, the detection range of the sensor unit 14. The first information may be, for example, information indicating the detection range, in other words the range in which the sensor unit 14 of the roadside unit 12 can detect a detected object. Alternatively, the first information may be information from which the detection range can be estimated. Such information may be information indicating the positions of a plurality of detected objects within the detection range of the sensor unit 14. Based on the positions of the plurality of detected objects, at least the area connecting those positions can be estimated to be a detectable range.
Specifically, the communication unit 15 may also transmit fourth information. As described later, the fourth information includes information about detected objects around the roadside unit 12.

Specifically, the communication unit 15 also receives third information. As described later, the third information is generated by an information processing device mounted on a vehicle 10 and includes information about latent detected objects that the sensor unit 14 may not have detected.

Specifically, the communication unit 15 may also receive fifth information. As described later, the fifth information is generated by the information processing device mounted on each vehicle 10 and indicates, for the detected objects around that vehicle 10, the priority with which attention should be paid to them.
The control unit 16 includes one or more processors and a memory. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an application-specific integrated circuit (ASIC). The processors may include a programmable logic device (PLD). The PLD may include a field-programmable gate array (FPGA). The control unit 16 may be a system-on-a-chip (SoC) or a system-in-a-package (SiP) in which one or more processors cooperate.
The control unit 16 may process the information about surrounding detected objects detected by the sensor unit 14 as necessary. The control unit 16 may add time information to the information about the detected objects. The time information may be acquired from a timer built into the control unit 16 or from a timer of the roadside unit 12.
For example, in a configuration in which the sensor unit 14 includes an imaging device, the control unit 16 may determine the presence or absence of a detected object in a captured image by pattern matching or by a discrimination model built through machine learning, as described above. When a detected object is present, the control unit 16 may further estimate the type of the detected object. When a detected object is present, the control unit 16 may also calculate the position of the detected object on the ground based on its position in the image and the orientation of the sensor unit 14. When a detected object is present, the control unit 16 may further estimate the traveling direction of the detected object by estimating its orientation. The control unit 16 may also calculate the velocity of the detected object from the difference in the ground positions of the same detected object in temporally consecutive frame images and the frame period, and may calculate the acceleration of the detected object from the difference between velocities calculated at consecutive points in time and the frame period. The control unit 16 may treat the information processed as described above as the information about the detected object.
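The velocity and acceleration computations described above are simple finite differences over consecutive frames. A minimal sketch, assuming 2D ground positions and a fixed frame period; the function names are illustrative, not from the disclosure:

```python
def velocity(prev_pos, curr_pos, frame_period):
    # velocity from the ground-position difference of the same detected
    # object in two consecutive frames, divided by the frame period
    return tuple((c - p) / frame_period for p, c in zip(prev_pos, curr_pos))

def acceleration(prev_vel, curr_vel, frame_period):
    # acceleration from the difference between velocities calculated at
    # consecutive points in time
    return tuple((c - p) / frame_period for p, c in zip(prev_vel, curr_vel))
```

The same two functions apply to the ranging configuration below, with the ranging period in place of the frame period.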
Also, for example, in a configuration in which the sensor unit 14 includes a ranging device, the control unit 16 may determine the presence or absence of a detected object in each tilt direction relative to the detection axis by comparing the distance to the object point in that direction with the distance to the road 13, as described above. When a detected object is present, the control unit 16 may calculate the position of the detected object on the ground based on its distance and the orientation of the sensor unit 14. When a detected object is present, the control unit 16 may also estimate the type of the detected object from the size of a cluster of object points, in a plurality of tilt directions, whose distances can be regarded as the same. The control unit 16 may further calculate the velocity of the detected object from the difference in the ground positions of the same detected object in temporally consecutive ranging results and the ranging period, and may calculate the acceleration of the detected object from the difference between velocities calculated at consecutive points in time and the ranging period. The control unit 16 may treat the information processed as described above as the information about the detected object.
The control unit 16 generates new fourth information that includes the information about surrounding detected objects most recently detected by the sensor unit 14, the third information, and the fifth information. The control unit 16 controls the communication unit 15 so as to transmit the newly generated fourth information. The control unit 16 may control the communication unit 15 so as to transmit the fourth information not only to the vehicles 10 in the vicinity but also to other nearby roadside units 12 capable of receiving the fourth information.
The control unit 16 may determine whether a latent detected object in the third information is identical to a surrounding detected object most recently detected by the sensor unit 14. The control unit 16 may determine identity based on whether the distance between the position of the latent detected object and the position of the detected object is equal to or less than a first distance threshold. Based on the velocity, and further the acceleration, of the latent detected object or of the detected object, the control unit 16 may correct the position of either object to compensate for the gap between the acquisition time of the third information and the latest detection time of the sensor unit 14. The control unit 16 may then use the positions of the latent detected object and the detected object, aligned to the same point in time by this correction, for the identity determination.
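The identity determination above can be sketched as follows: extrapolate the latent detected object's position to the sensor's latest detection time, then compare the positional interval against the threshold. A minimal sketch assuming a constant-acceleration motion model and Euclidean distance; the names and the threshold handling are assumptions.

```python
import math

def align_position(pos, vel, acc, dt):
    # shift a position forward by dt under a constant-acceleration model:
    # p + v*dt + a*dt^2/2
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(pos, vel, acc))

def is_same_object(latent, latent_time, detected_pos, detected_time, threshold):
    # align the latent detected object's position to the latest detection
    # time, then test the positional interval against the first threshold
    dt = detected_time - latent_time
    predicted = align_position(latent["pos"], latent["vel"], latent["acc"], dt)
    return math.dist(predicted, detected_pos) <= threshold
```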
The control unit 16 may generate the fourth information periodically. Among a plurality of pieces of third information acquired from the same vehicle at different times, the control unit 16 may use the latest one to generate the fourth information. The control unit 16 may use third information acquired between the previous transmission of fourth information and the generation of new fourth information to generate the new fourth information.
In a configuration in which information indicating the detection range is stored in its built-in memory, the control unit 16 may transmit the information read from the memory as the first information. In a configuration in which information from which the detection range can be estimated is stored in the built-in memory, the control unit 16 may transmit the information read from the memory, together with the identification information of the roadside unit 12, as the first information. Alternatively, in such a configuration, the control unit 16 may calculate the detection range based on the information read from the memory and transmit information indicating the calculated detection range as the first information.
The control unit 16 may control the communication unit 15 so as to transmit the first information periodically. The control unit 16 may transmit the first information in the same period as the fourth information.
When the third information includes information indicating that the range in which a latent detected object is located is a blind spot, the control unit 16 may, as necessary, notify a server device or the like that manages the roadside unit 12 of that information. The control unit 16 may also, as necessary, notify vehicles and the like around the roadside unit 12 of that information.
As shown in FIG. 3, the vehicle 10 is equipped with an information processing device 17. The vehicle 10 may further be equipped with a sensor unit 18 and an ECU (Electronic Control Unit) 19. In a configuration in which the information processing device 17 does not include a wireless communication unit, the vehicle 10 may be equipped with an in-vehicle wireless device 20. In the following description, the information processing device 17 does not include a wireless communication unit; in a configuration in which it does, the wireless communication unit has functions similar to those of the in-vehicle wireless device 20.
The sensor unit 18 may include various sensors, such as sensors that detect the circumstances around the vehicle 10 and sensors that detect the state of the vehicle 10. The sensors that detect the circumstances around the vehicle 10 may include a sensor that detects information about detected objects around the vehicle. The information about a detected object may be similar to the information about a detected object detected by the sensor unit 14 of the roadside unit 12; in other words, it may be the position of the detected object, the presence or absence of a detected object, and the type, speed, and traveling direction of a detected object that is present. The sensors that detect the state of the vehicle 10 may include a speed sensor that detects the speed of the vehicle and a positioning device, such as a GNSS (Global Navigation Satellite System) receiver, that detects the spatial position of the vehicle in real space.
The ECU 19 may assist the driving of the vehicle 10 based on information detected by the sensor unit 18 and information acquired from the information processing device 17 described later. In a configuration in which the vehicle 10 is operated by a driver, the ECU 19 may notify the driver of the acquired information or of information obtained by processing it. In a configuration in which maneuvers of the vehicle 10, such as acceleration, deceleration, and steering, are performed automatically or semi-automatically, the ECU 19 may use the acquired information to decide those maneuvers.
The in-vehicle wireless device 20 may perform wireless communication with nearby roadside units 12 and other vehicles 10. The in-vehicle wireless device 20 may be composed of a communication circuit and an antenna. The antenna is, for example, an omnidirectional antenna. The in-vehicle wireless device 20 performs wireless communication using, for example, the 700 MHz band allocated to ITS, or using, for example, a wireless LAN (Local Area Network). The in-vehicle wireless device 20 may output information acquired from nearby roadside units 12 and vehicles 10 to the information processing device 17, and may output information acquired from the information processing device 17 to external devices such as the roadside units 12 and the vehicles 10.
As shown in FIG. 4, the information processing device 17 includes an acquisition unit 21 and a control unit (second control unit) 22.

The acquisition unit 21 is an interface for exchanging information with electronic devices other than the information processing device 17.
The acquisition unit 21 acquires, from a nearby roadside unit 12, the first information about that roadside unit 12. In a configuration in which the in-vehicle wireless device 20 is mounted on the vehicle 10 separately from the information processing device 17, the acquisition unit 21 acquires the first information from the nearby roadside unit 12 via the in-vehicle wireless device 20.

The acquisition unit 21 may be able to acquire the fourth information together with the first information from the nearby roadside unit 12. As with the first information, in a configuration in which the in-vehicle wireless device 20 is mounted on the vehicle 10 separately from the information processing device 17, the acquisition unit 21 may acquire the fourth information from the nearby roadside unit 12 via the in-vehicle wireless device 20.
The acquisition unit 21 acquires second information about detected objects around the vehicle 10 on which it is mounted. The second information about a detected object may be similar to the information about a detected object detected by the sensor unit 14 of the roadside unit 12; in other words, it may be the position of the detected object, the presence or absence of a detected object, and the type, speed, and traveling direction of a detected object that is present. The acquisition unit 21 may acquire the second information from the sensor unit 18, or from surrounding vehicles 10 other than the vehicle 10 on which the information processing device 17 is mounted. The vehicle 10 on which the information processing device 17 is mounted may also transmit the second information acquired by that vehicle 10 to other surrounding vehicles 10 and nearby roadside units 12.
The acquisition unit 21 may also have a function of outputting information to external devices; in the following description, it is assumed to have this function. As described later, the acquisition unit 21 may output the third information and the fifth information generated by the control unit 22 to the in-vehicle wireless device 20. The acquisition unit 21 may output information about detected objects around the vehicle 10 on which the information processing device 17 is mounted to the ECU 19.
The control unit 22 includes one or more processors and a memory. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an application-specific IC. The processors may include a programmable logic device. The PLD may include an FPGA. The control unit 22 may be an SoC or an SiP in which one or more processors cooperate.
The control unit 22 may process the information about surrounding detected objects detected by the sensor unit 18 as necessary. The control unit 22 may add time information to the information about the detected objects. The time information may be acquired from a timer built into the control unit 22 or from a timer of the information processing device 17.
Based on the first information, the control unit 22 extracts, from among the detected objects included in the second information, detected objects that the roadside unit 12 that reported the first information may not have detected, as latent detected objects.
To extract latent detected objects, the control unit 22 may predict the performance of the roadside unit 12 based on the first information. For example, the control unit 22 may recognize the detectable range as the performance of the roadside unit 12 based on the first information. When the first information is information directly indicating the detectable range, the control unit 22 may recognize that range as the detectable range of the roadside unit 12. When the first information is information from which the detection range can be estimated, the control unit 22 may recognize the range calculated based on the first information as the detectable range. When acquiring first information from a plurality of roadside units 12, the control unit 22 may identify the first information of each roadside unit 12 based on the identification information acquired together with it, and calculate the detectable range of each roadside unit 12.
Based on the first information, more specifically based on the detectable range corresponding to the first information, the control unit 22 extracts, from among the detected objects included in the second information, detected objects that the roadside unit 12 that reported the first information may not have detected, as latent detected objects. As shown in FIG. 5, the control unit 22 extracts, for example, a detected object located outside the detectable range rd among the detected objects do2 included in the second information as a latent detected object hdo.
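The extraction in FIG. 5 reduces to a point-in-range test. A minimal sketch, assuming for simplicity that the detectable range rd is modeled as a circle around the roadside unit; the circular model and the function name are assumptions, since the disclosure leaves the shape of rd open.

```python
import math

def out_of_range_latents(detected_positions, center, radius):
    # detected_positions: object positions from the second information;
    # the detectable range rd is modeled as a circle of `radius` around
    # the roadside unit at `center` -- objects outside it are latent
    return [p for p in detected_positions if math.dist(p, center) > radius]
```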
As shown in FIG. 6, when acquiring the fourth information together with the first information, the control unit 22 may extract, among the detected objects do2 included in the second information, a detected object that is located outside the detectable range rd and is not included in the fourth information as a latent detected object hdo. In other words, even if a detected object do2 included in the second information is located outside the detectable range rd, the control unit 22 need not regard it as a latent detected object hdo when it is included in the fourth information as a detected object do4. The control unit 22 may determine whether a detected object do2 included in the second information is absent from the fourth information by comparing items of the fourth information in the same category as the second information, for example by comparing the position of the detected object in the second information with the position of the detected object in the fourth information.
As shown in FIG. 7, when acquiring the fourth information together with the first information, the control unit 22 may extract, among the detected objects do2 included in the second information, a detected object that is located within the detectable range rd and is not included in the fourth information as a latent detected object hdo. In other words, even if a detected object do2 included in the second information is located within the detectable range rd, the control unit 22 need not regard it as a latent detected object hdo when it is included in the fourth information as a detected object do4. The control unit 22 may determine whether a detected object do2 included in the second information is absent from the fourth information by comparing items of the fourth information in the same category as the second information.
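Taking the cases of FIGS. 6 and 7 together, an object in the second information becomes a latent detected object when no positional match for it appears in the fourth information, whether it lies inside or outside rd. A minimal sketch of that positional comparison; the match threshold and function names are assumptions.

```python
import math

def reported_in_fourth(pos, fourth_positions, match_threshold):
    # a second-information object counts as already reported when some
    # fourth-information object lies within the match threshold of it
    return any(math.dist(pos, q) <= match_threshold for q in fourth_positions)

def unreported_latents(second_positions, fourth_positions, match_threshold=1.0):
    # objects in the second information with no positional match in the
    # fourth information are extracted as latent detected objects
    return [p for p in second_positions
            if not reported_in_fourth(p, fourth_positions, match_threshold)]
```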
The control unit 22 generates third information including at least information about the latent detected objects hdo extracted as described above. The information about a latent detected object hdo may be similar to the information about a detected object detected by the sensor unit 18 of the vehicle 10; in other words, it may be the position of the latent detected object hdo, the presence or absence of a latent detected object hdo, and the type, speed, and traveling direction of a latent detected object hdo that is present.
When extracting, among the detected objects do2 included in the second information, a detected object not included in the fourth information as a latent detected object hdo, the control unit 22 may specify the range in which the latent detected object hdo is located. The range in which the latent detected object hdo is located may be not only the position of the latent detected object hdo itself but a range of a predetermined size referenced to that position, for example centered on it. The control unit 22 may include in the third information information indicating that the specified range is a blind spot of the roadside unit 12 that reported the fourth information.
To extract latent detected objects hdo when acquiring the fourth information together with the first information, the control unit 22 may predict, based on a first time point included in the second information, in other words the time corresponding to the time information added to the second information, the state of a detected object (first detected object) do2 included in the second information at a third time point at or after the first time point. Similarly, based on a second time point included in the fourth information, in other words the time added to the fourth information, the control unit 22 may predict the state of a detected object (second detected object) do4 included in the fourth information at the third time point, which is at or after the second time point. For example, when the second information includes velocity and acceleration together with position, the control unit 22 predicts the position of the detected object do2 of the second information at the third time point. Likewise, when the fourth information includes velocity and acceleration together with position, the control unit 22 predicts the position of the detected object do4 of the fourth information at the third time point.
Further, the control unit 22 may determine from the prediction results whether the state of the detected object do2 included in the second information and the state of the detected object do4 included in the fourth information are identical at the same point in time, namely the third time point. When determining that the states are identical, the control unit 22 may exclude the information about either the detected object do2 included in the second information or the detected object do4 included in the fourth information from the generation of the third information.
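The two steps above, projecting both sets of objects to the common third time point and excluding duplicates, can be sketched as follows. This assumes a constant-acceleration motion model and a positional threshold for "identical state"; both assumptions, along with the names, are illustrative rather than taken from the disclosure.

```python
import math

def predict(pos, vel, acc, dt):
    # constant-acceleration prediction: p + v*dt + a*dt^2/2
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(pos, vel, acc))

def exclude_duplicates(second_objs, t1, fourth_objs, t2, t3, threshold=1.0):
    # project the second information (timestamped t1) and the fourth
    # information (timestamped t2) to the common time t3, then drop
    # second-information entries whose predicted position coincides
    # with a fourth-information entry
    fourth_pred = [predict(o["pos"], o["vel"], o["acc"], t3 - t2)
                   for o in fourth_objs]
    kept = []
    for o in second_objs:
        p = predict(o["pos"], o["vel"], o["acc"], t3 - t1)
        if not any(math.dist(p, q) <= threshold for q in fourth_pred):
            kept.append(o)
    return kept
```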
The control unit 22 may determine whether the detected objects do2 included in the second information include a vehicle 10. When a detected object do2 is a vehicle 10, the control unit 22 may calculate a priority for the other detected objects as viewed from that vehicle 10. The other detected objects may include the vehicle 10 on which the information processing device 17 is mounted. The control unit 22 may generate fifth information in which the calculated priority is associated with each of the other detected objects.
For example, the control unit 22 calculates the velocity vector of another detected object as viewed from the vehicle 10 that is a detected object do2 included in the second information. The control unit 22 also calculates a range of a predetermined size centered on the vehicle 10. The predetermined size is a circle whose radius is considered sufficient to ensure safety around the vehicle 10, for example a radius of 100 m. The control unit 22 calculates the priority based on the velocity vector and the range. Specifically, the control unit 22 determines whether the position reached by moving the other detected object from its position by the travel distance obtained by multiplying its velocity vector by a predetermined time is included in the range. The control unit 22 may calculate the priority to be higher when the moved position is included in the range than when it is outside the range, or may calculate the priority to be higher as the distance between the moved position and the position of the vehicle 10 becomes smaller.
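The velocity-vector scheme above can be sketched as a single scoring function. The lookahead time, the 100 m radius default, and the linear distance-to-score mapping are illustrative assumptions; the disclosure only requires that an object predicted to enter the range score higher, and that a smaller predicted distance score higher.

```python
import math

def circle_priority(vehicle_pos, other_pos, other_vel,
                    lookahead=5.0, radius=100.0):
    # move the other detected object along its velocity vector for
    # `lookahead` seconds, then score it against the safety circle
    # around the vehicle
    moved = tuple(p + v * lookahead for p, v in zip(other_pos, other_vel))
    gap = math.dist(moved, vehicle_pos)
    if gap > radius:
        return 0.0               # predicted to stay outside the circle
    return 1.0 - gap / radius    # smaller predicted gap -> higher priority
```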
Alternatively, for example, the control unit 22 calculates the distance between the vehicle 10 that is a detected object do2 included in the second information and another detected object. The control unit 22 calculates the priority based on the change over time from the previously calculated distance between the same vehicle 10 and the same other detected object to the newly calculated distance. Specifically, the control unit 22 may calculate the priority to be higher the faster the distance shrinks over time. Further, the control unit 22 may weight the priority based on the type of the detected object; for example, the weighting may be increased as the size of the detected object increases or as its moving speed increases.
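The alternative scheme above scores objects by how fast the gap to the vehicle is closing, with a type-dependent weight. A minimal sketch; the linear closing-rate score and the weight parameter are assumptions standing in for whatever mapping an implementation chooses.

```python
def closing_priority(prev_gap, curr_gap, interval, weight=1.0):
    # a positive closing rate means the gap between the vehicle and the
    # other detected object is shrinking over `interval` seconds; the
    # weight reflects the object type (e.g. larger or faster object
    # classes would be given larger weights)
    closing_rate = (prev_gap - curr_gap) / interval
    return weight * max(closing_rate, 0.0)
```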
The control unit 22 may control the acquisition unit 21 so as to output the generated third information and fifth information to the in-vehicle wireless device 20. The control unit 22 may control the acquisition unit 21 so as to output the second information and the fourth information to the ECU 19.
Next, information supplementation by the vehicles 10 and the roadside unit 12 in this embodiment will be described with reference to the ladder chart of FIG. 8. Information processing in each vehicle 10 is performed periodically while the information processing device 17 of the vehicle 10 is in operation, and when a roadside unit 12 and other vehicles 10 are located within communication range, the results of communication with them are used in the information processing of the vehicle 10. Information processing in the roadside unit 12 is likewise performed periodically, and when vehicles 10 are located within communication range, the results of communication with the vehicles 10 are used in the information processing of the roadside unit 12. Information processing by the vehicles 10 and the roadside unit 12 at an arbitrary time is therefore described below as an example, but the order of information processing in each vehicle 10 and the roadside unit 12 is not limited to this example.
In step S100, the roadside unit 12 detects information about detected objects with its sensor unit 14. The roadside unit 12 generates the fourth information using the information about the detected objects. As described later, if the roadside unit 12 has previously acquired at least one of the third information and the fifth information, it may also use that information to generate the fourth information. The roadside unit 12 transmits the first information and the generated fourth information to the surrounding vehicles 10.
In step S101, the first vehicle 10a, which is an arbitrary vehicle 10, detects information about detected objects with its sensor unit 18. The first vehicle 10a generates the second information using the information about the detected objects, and transmits the generated second information to the surrounding vehicles 10, including the second vehicle 10b, which is a vehicle other than the first vehicle 10a.
In step S102, the second vehicle 10b detects information about detected objects with its sensor unit 18. The second vehicle 10b generates the second information using the information about the detected objects, and transmits the generated second information to the surrounding vehicles 10, including the first vehicle 10a.
In step S103, the first vehicle 10a predicts the state, at a common third time point, of each detected object included in the second information detected by the first vehicle 10a, the second information acquired from the second vehicle 10b, and the fourth information acquired from the roadside unit 12.
In step S104, the first vehicle 10a extracts the latent detected object hdo based on at least one of the comparison of the object states predicted in step S103 and the first information acquired from the roadside unit 12.
In step S105, for each detected object included in the second information detected by the first vehicle 10a, the second information acquired from the second vehicle 10b, and the fourth information acquired from the roadside unit 12, the first vehicle 10a calculates the priority of the other detected objects as seen from that detected object, and generates the fifth information. The first vehicle 10a transmits the fifth information to the roadside unit 12 together with the third information generated in step S104.
In steps S106 to S108, the second vehicle 10b executes processing similar to steps S103 to S105 of the first vehicle 10a, and transmits the third information and the fifth information to the roadside unit 12.
Thereafter, the processing of steps S100 to S109 is repeated between the roadside unit 12 and the vehicles 10.
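One round of the exchange in steps S100 to S108 might be sketched as below; the data layout, names, and the simplified latent-object test are assumptions made for illustration, not the disclosed implementation.

```python
# Minimal sketch of one round of the ladder chart: the roadside unit broadcasts
# first/fourth information, each vehicle broadcasts second information, then each
# vehicle replies to the roadside unit with third information (latent objects).

def run_round(roadside, vehicles):
    log = []
    # S100: roadside unit broadcasts first + fourth information.
    log.append(("rsu->vehicles", {"first": roadside["range_m"],
                                  "fourth": roadside["detections"]}))
    # S101/S102: each vehicle broadcasts its second information.
    for v in vehicles:
        log.append((v["name"] + "->others", {"second": v["detections"]}))
    # S103-S105 (and S106-S108): each vehicle extracts latent objects -- here
    # simplified to objects it sees that the roadside unit does not -- and
    # sends the third information back to the roadside unit.
    for v in vehicles:
        latent = [d for d in v["detections"] if d not in roadside["detections"]]
        log.append((v["name"] + "->rsu", {"third": latent}))
    return log
```

In a real deployment each message would carry time stamps and object states, and the round would repeat at the periodic cycle described above.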
The information processing device 17 of the present embodiment configured as described above includes the acquisition unit 21, which acquires from a surrounding roadside unit 12 the first information from which the performance of that roadside unit 12 can be predicted and acquires the second information about detected objects in the vicinity, and the control unit 22, which, based on the first information, extracts from the detected objects do2 included in the second information a latent detected object hdo that the roadside unit 12 may not have detected, and generates the third information about the latent detected object hdo. With this configuration, the information processing device 17 can generate information about detected objects other than those actually detected by the roadside unit 12 (latent detected objects hdo). The information processing device 17 can therefore generate information that supplements the information detected by the roadside unit 12.
Further, in the information processing device 17, the acquisition unit 21 can acquire the fourth information about the detected objects detected by the roadside unit 12, and the control unit 22 extracts, as the latent detected object hdo, a detected object that is located outside the detectable range rd corresponding to the first information and is not included in the fourth information, from among the detected objects do2 included in the second information. A detected object located outside the detectable range rd is highly likely not to have been detected by the roadside unit 12. In this configuration, the information processing device 17 can therefore speed up the extraction processing by using location outside the detectable range rd as the first condition. On the other hand, the roadside unit 12 may sometimes detect information about a detected object even though the object is located outside the detectable range rd. For such cases, the information processing device 17 can improve the extraction accuracy of the latent detected object hdo by further restricting the extraction to detected objects that were not actually detected by the roadside unit 12.
Further, in the information processing device 17, the acquisition unit 21 can acquire the fourth information about the detected objects detected by the roadside unit 12, and the control unit 22 extracts, as the latent detected object hdo, a detected object that is located within the detectable range rd corresponding to the first information and is not included in the fourth information, from among the detected objects do2 included in the second information. Even within the detectable range rd of the roadside unit 12, a detected object may exist at a position that is a blind spot for the roadside unit 12 due to the presence of an obstacle. For such cases, the information processing device 17 configured as described above can improve the extraction accuracy of the latent detected object hdo.
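Taken together, the two extraction conditions above could be sketched as a single pass over the second information; the dict layout, field names, and the circular range model are assumptions made for illustration.

```python
from math import hypot

# Sketch of latent-object extraction: objects the roadside unit did not report
# are split into those outside its detectable range rd (modeled here as a circle
# of radius rd_m around rsu_pos) and those inside it, i.e. likely blind spots.

def extract_latent(second_info, fourth_info, rsu_pos, rd_m):
    reported = {obj["id"] for obj in fourth_info}
    outside, blind_spot = [], []
    for obj in second_info:
        if obj["id"] in reported:
            continue  # already detected by the roadside unit
        if hypot(obj["x"] - rsu_pos[0], obj["y"] - rsu_pos[1]) > rd_m:
            outside.append(obj)      # outside rd and not in the fourth information
        else:
            blind_spot.append(obj)   # inside rd but missing: likely a blind spot
    return outside, blind_spot
```

Both lists are latent detected objects hdo; the second list could additionally be marked as a blind-spot range when generating the third information.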
Further, the information processing device 17 includes, in the third information, the range in which the latent detected object hdo is located as information indicating a blind spot of the roadside unit 12. With this configuration, the information processing device 17 can further supplement information useful to the roadside unit 12.
Further, based on the first time point included in the second information and the second time point included in the fourth information, the information processing device 17 predicts the states of the first detected object included in the second information and the second detected object included in the fourth information at a common third time point. With this configuration, even when the information about the same detected object differs because of a difference in detection time, the information processing device 17 corrects for that difference before comparing, and can therefore improve the accuracy of determining whether two pieces of information concern the same detected object. Furthermore, when the information processing device 17 determines, based on the prediction result, that the first detected object and the second detected object are the same, it excludes either the second information about the first detected object or the fourth information about the second detected object from the generation of the third information. With this configuration, the information processing device 17 can prevent the third information from including multiple pieces of information about the same detected object with different detection times, and can therefore reduce the processing load on the roadside unit 12 that acquires the third information.
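This alignment to a common third time point can be sketched with a constant-velocity prediction; the 2 m matching threshold and the field names are assumptions, not taken from the disclosure.

```python
# Sketch: predict where two detections, time-stamped at different moments, would
# be at the same third time point t3, then decide whether they are the same
# object by comparing the predicted positions (constant-velocity model).

def predict(obj, t3):
    dt = t3 - obj["t"]
    return (obj["x"] + obj["vx"] * dt, obj["y"] + obj["vy"] * dt)

def same_object(obj_a, obj_b, t3, threshold_m=2.0):
    xa, ya = predict(obj_a, t3)
    xb, yb = predict(obj_b, t3)
    return ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= threshold_m
```

When such a check reports a match, one of the two records would be excluded from the generation of the third information so that the roadside unit does not receive duplicate entries for a single object.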
Further, when the detected objects do2 included in the second information include a vehicle 10, the information processing device 17 calculates the priority of each detected object as seen from that vehicle 10, and generates the fifth information in which the priority is associated with the detected object. With this configuration, the information processing device 17 can generate information useful to other vehicles 10 that acquire the fifth information via the roadside unit 12. By generating such useful information in place of the roadside unit 12, the information processing device 17 can reduce the processing load on the roadside unit 12.
The roadside unit 12 configured as described above includes the sensor unit 14, which detects information about detected objects in the vicinity; the communication unit 15, which transmits the first information from which the performance of the sensor unit 14 can be predicted and receives the third information about a latent detected object hdo that the sensor unit 14 may not have detected; and the control unit 16, which generates the fourth information including the information about the detected objects most recently detected by the sensor unit 14 and the third information, and controls the communication unit 15 to transmit the fourth information. With this configuration, the roadside unit 12 generates the fourth information including the third information, which the roadside unit 12 itself is highly likely to have failed to detect, and can therefore provide more useful information to the vehicles 10.
Embodiments of the information processing device 17 and the roadside unit 12 have been described above. Embodiments of the present disclosure may also take the form of a method or program for implementing the device, or of a storage medium on which the program is recorded (for example, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, or a memory card).
The implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter; it may take the form of, for example, a program module incorporated in an operating system. Furthermore, the program may or may not be configured so that all processing is performed only by the CPU on the control board. The program may be configured so that part or all of it is executed by another processing unit mounted on an expansion board or expansion unit added to the board as needed.
The drawings describing the embodiments according to the present disclosure are schematic. The dimensional ratios and the like in the drawings do not necessarily match the actual ones.
Although the embodiments according to the present disclosure have been described based on the drawings and examples, it should be noted that a person skilled in the art can make various modifications or alterations based on the present disclosure. It should therefore be noted that these modifications or alterations are included within the scope of the present disclosure. For example, the functions included in each component can be rearranged so long as no logical inconsistency results, and multiple components can be combined into one or divided.
For example, in the embodiments according to the present disclosure, the information processing device 17 and the ECU 19 in the vehicle 10 are described as separate components. However, the present disclosure is not limited to this; the information processing device 17 may include the ECU 19 and perform not only the processing of the information processing device 17 described above but also the processing performed by the ECU 19 described above.
All of the elements described in the present disclosure, and/or all of the disclosed methods or all of the steps of the disclosed processes, can be combined in any combination, except combinations in which these features are mutually exclusive. Each of the features described in the present disclosure can, unless explicitly denied, be replaced by an alternative feature serving the same, an equivalent, or a similar purpose. Therefore, unless explicitly denied, each of the disclosed features is merely one example of a comprehensive series of identical or equivalent features.
Furthermore, the embodiments according to the present disclosure are not limited to any of the specific configurations of the embodiments described above. The embodiments according to the present disclosure can be extended to all novel features described in the present disclosure, or combinations thereof, or to all novel methods or processing steps described, or combinations thereof.
In the present disclosure, descriptions such as "first" and "second" are identifiers for distinguishing the corresponding configurations. Configurations distinguished by descriptions such as "first" and "second" in the present disclosure may have their numbers interchanged. For example, the first information can exchange its identifiers "first" and "second" with the second information. The exchange of identifiers is performed simultaneously, and the configurations remain distinct after the exchange. Identifiers may be deleted; configurations from which identifiers have been deleted are distinguished by reference signs. The mere description of identifiers such as "first" and "second" in the present disclosure must not be used to interpret the order of the configurations or as grounds for the existence of an identifier with a smaller number.
Reference Signs List
10 vehicle
10a first vehicle
10b second vehicle
11 traffic support system
12 roadside unit
13 road
14 sensor unit
15 communication unit
16 control unit
17 information processing device
18 sensor unit
19 ECU (Electronic Control Unit)
20 in-vehicle wireless device
21 acquisition unit
22 control unit
do2 detected object included in the second information
do4 detected object included in the fourth information
hdo latent detected object
rd detectable range

Claims (12)

  1.  An information processing device comprising:
     an acquisition unit that acquires, from a surrounding roadside unit, first information from which the performance of the roadside unit can be predicted, and acquires second information about detected objects in the vicinity; and
     a control unit that, based on the first information, extracts from the detected objects included in the second information a latent detected object that the roadside unit may not have detected, and generates third information about the latent detected object.
  2.  The information processing device according to claim 1, wherein
     the first information is information indicating a range in which the roadside unit can detect a detected object, or information indicating a position of a detected object detected by the roadside unit.
  3.  The information processing device according to claim 1 or 2, wherein
     the acquisition unit is capable of acquiring fourth information about detected objects detected by the roadside unit, and
     the control unit extracts, as the latent detected object, a detected object that is located outside a detectable range corresponding to the first information and is not included in the fourth information, from among the detected objects included in the second information.
  4.  The information processing device according to claim 1 or 2, wherein
     the acquisition unit is capable of acquiring fourth information about detected objects detected by the roadside unit, and
     the control unit extracts, as the latent detected object, a detected object that is located within a detectable range corresponding to the first information and is not included in the fourth information, from among the detected objects included in the second information.
  5.  The information processing device according to claim 3 or 4, wherein the control unit
     predicts, based on a first time point included in the second information and a second time point included in the fourth information, the states of a first detected object included in the second information and a second detected object included in the fourth information at a common third time point, and
     when determining, based on the prediction result, that the first detected object and the second detected object are the same, excludes either the second information about the first detected object or the fourth information about the second detected object from the generation of the third information.
  6.  The information processing device according to any one of claims 1 to 5, wherein
     the control unit includes, in the third information, the range in which the latent detected object is located as information indicating a blind spot of the roadside unit.
  7.  The information processing device according to any one of claims 1 to 6, wherein,
     when the detected objects included in the second information include a vehicle, the control unit calculates a priority of each detected object as seen from the vehicle, and generates fifth information in which the priority is associated with the detected object.
  8.  The information processing device according to claim 7, wherein,
     when the detected objects included in the second information include a vehicle, the control unit calculates the priority based on a velocity vector of the detected object as seen from the vehicle and a range of a predetermined size centered on the vehicle.
  9.  The information processing device according to claim 7, wherein,
     when the detected objects included in the second information include a vehicle, the control unit calculates the priority based on a change over time in the distance between the vehicle and the detected object.
  10.  A roadside unit comprising:
     a sensor unit that detects information about detected objects in the vicinity;
     a communication unit that transmits first information from which the performance of the sensor unit can be predicted, and receives third information about a latent detected object that the sensor unit may not have detected; and
     a control unit that generates fourth information including the information about the detected object most recently detected by the sensor unit and the third information, and controls the communication unit to transmit the fourth information.
  11.  A traffic support system comprising:
     a roadside unit including a sensor unit that detects information about detected objects in the vicinity, a communication unit that transmits first information from which the performance of the sensor unit can be predicted and receives third information about a latent detected object that the sensor unit may not have detected, and a first control unit that generates fourth information including the information about the detected object most recently detected by the sensor unit and the third information and controls the communication unit to transmit the fourth information; and
     a vehicle equipped with an information processing device including an acquisition unit that acquires the first information from the surrounding roadside unit and acquires second information about detected objects in the vicinity, and a second control unit that, based on the first information, extracts the latent detected object of the roadside unit from among the detected objects included in the second information and generates the third information about the latent detected object.
  12.  An information supplementing method comprising:
     acquiring, from a surrounding roadside unit, first information from which the performance of the roadside unit can be predicted;
     acquiring second information about detected objects in the vicinity;
     extracting, based on the first information, a latent detected object that the roadside unit may not have detected from among the detected objects included in the second information; and
     generating third information about the latent detected object.
PCT/JP2023/000492 2022-01-27 2023-01-11 Information processing device, road side machine, traffic support system, and information complementing method WO2023145436A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023576758A JPWO2023145436A1 (en) 2022-01-27 2023-01-11

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-011224 2022-01-27
JP2022011224 2022-01-27

Publications (1)

Publication Number Publication Date
WO2023145436A1 true WO2023145436A1 (en) 2023-08-03

Family

ID=87471191

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/000492 WO2023145436A1 (en) 2022-01-27 2023-01-11 Information processing device, road side machine, traffic support system, and information complementing method

Country Status (2)

Country Link
JP (1) JPWO2023145436A1 (en)
WO (1) WO2023145436A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010003242A (en) * 2008-06-23 2010-01-07 Toyota Motor Corp Communication system


Also Published As

Publication number Publication date
JPWO2023145436A1 (en) 2023-08-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23746651

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023576758

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE