WO2023145436A1 - Information processing device, roadside unit, traffic support system, and information provision method - Google Patents


Info

Publication number
WO2023145436A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
detected object
detected
unit
roadside
Prior art date
Application number
PCT/JP2023/000492
Other languages
English (en)
Japanese (ja)
Inventor
真大 新田
輝 小山
Original Assignee
京セラ株式会社
Priority date
Filing date
Publication date
Application filed by 京セラ株式会社 (Kyocera Corporation)
Publication of WO2023145436A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to an information processing device, a roadside unit, a traffic support system, and an information provision method.
  • a roadside unit is known that is installed above a road or the like, detects objects such as vehicles and passersby, and notifies surrounding vehicles of the detected objects (see Patent Document 1).
  • an information processing device includes: an acquisition unit that acquires, from a surrounding roadside unit, first information from which the performance of the roadside unit can be predicted, and acquires second information about detected objects in the vicinity; and a control unit that, based on the first information, extracts, from among the detected objects included in the second information, a latent detected object that may not have been detected by the roadside unit, and generates third information about the latent detected object.
  • the roadside unit includes: a sensor unit that detects information about detected objects in the vicinity; a communication unit that transmits first information from which the performance of the sensor unit can be predicted, and receives third information about a latent detected object that may not be detected by the sensor unit; and a control unit configured to generate fourth information including the information about the detected objects most recently detected by the sensor unit and the third information, and to control the communication unit to transmit the fourth information.
  • the traffic support system includes: a roadside unit having a sensor unit that detects information about detected objects in the vicinity, a first communication unit that transmits first information from which the performance of the sensor unit can be predicted and receives third information about a latent detected object that may not be detected by the sensor unit, and a first control unit that generates fourth information including the information about the detected objects most recently detected by the sensor unit and the third information and controls the first communication unit to transmit the fourth information; and an information processing device having an acquisition unit that acquires the first information from the roadside unit in the vicinity and acquires second information about the detected objects in the vicinity, and a second control unit that, based on the first information, extracts, from among the detected objects included in the second information, the latent detected object of the roadside unit and generates the third information about the latent detected object.
  • the information provision method includes: acquiring, from a surrounding roadside unit, first information from which the performance of the roadside unit can be predicted; acquiring second information about detected objects in the vicinity; extracting, based on the first information, from among the detected objects included in the second information, a latent detected object that may not be detected by the roadside unit; and generating third information about the latent detected object.
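The exchange defined by this method can be sketched end to end in Python. Positions are (x, y) tuples in metres, a detectable range is approximated here by a circle, and every name below is a hypothetical illustration, not taken from the patent:

```python
def roadside_first_info(unit):
    # First information: something from which the unit's performance
    # (here, its detectable range) can be predicted.
    return {'range': unit['range']}  # ((cx, cy), radius) in metres

def vehicle_third_info(first_info, second_info):
    # Vehicle side: extract, as latent, own detections that fall outside
    # the roadside unit's detectable range.
    (cx, cy), r = first_info['range']
    return [p for p in second_info
            if (p[0] - cx) ** 2 + (p[1] - cy) ** 2 > r * r]

def roadside_fourth_info(unit, third_info):
    # Roadside side: fourth information merges the unit's own latest
    # detections with the third information received from the vehicle.
    return {'detections': list(unit['detections']) + list(third_info)}
```

A vehicle receiving the first information would call `vehicle_third_info` with its own second information and send the result back; the roadside unit then broadcasts the merged fourth information.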
  • FIG. 1 is a diagram illustrating a configuration example of a traffic support system including a vehicle equipped with an information processing device according to one embodiment.
  • FIG. 2 is a block diagram showing a schematic configuration of the roadside unit in FIG. 1.
  • FIG. 3 is a block diagram showing a schematic configuration of the vehicle in FIG. 1.
  • FIG. 4 is a block diagram showing a schematic configuration of the information processing device in FIG. 3.
  • FIG. 5 is a state diagram for explaining a latent detected object based on the first information.
  • FIG. 6 is a state diagram for explaining a latent detected object outside the detectable range corresponding to the first information, based on the first information and the fourth information.
  • FIG. 7 is a state diagram for explaining a latent detected object within the detectable range corresponding to the first information, based on the first information and the fourth information.
  • FIG. 8 is a ladder chart for explaining the information provision process performed by the roadside unit and the vehicle.
  • FIG. 1 shows a configuration example of a traffic support system 11 including a vehicle 10 equipped with an information processing device according to one embodiment.
  • the traffic support system 11 is, for example, a safe driving support communication system of Intelligent Transport Systems (ITS).
  • a driving safety support communication system is called a driving safety support system or a driving safety support radio system.
  • the traffic support system 11 includes roadside units 12 and vehicles 10 .
  • the roadside unit 12 may observe observation targets such as vehicles, objects, and people on the road in a predetermined area.
  • the roadside unit 12 may be placed near an intersection where multiple roads 13 (roadways) intersect to observe the road surface.
  • the roadside unit 12 may be arranged on the roadside other than the intersection.
  • one roadside unit 12 is arranged at the intersection.
  • a plurality of roadside units 12 may be arranged.
  • roadside units 12 may be arranged in a number corresponding to the number of roads 13 intersecting at the intersection, and each roadside unit 12 may be arranged so that at least one road 13 associated with it in advance by the installer is included in its detection range.
  • the roadside unit 12 and the vehicle 10 running on the road 13 may perform wireless communication with each other.
  • a plurality of vehicles 10 may perform wireless communication with each other.
  • the roadside unit 12 notifies the vehicle 10 of various information, which will be described later.
  • the vehicle 10 notifies the roadside device 12 of various information acquired by the own vehicle, which will be described later.
  • the vehicle 10 may notify other vehicles 10 of various information acquired by the own vehicle, which will be described later.
  • at least various information obtained from the roadside unit 12 may be used to assist the driver of the vehicle 10 in driving safely.
  • the traffic assistance system 11 may assist the driver of the vehicle 10 in driving safely.
  • the vehicle 10 is, for example, an automobile, but is not limited to an automobile, and may be a motorcycle, a bus, a streetcar, a bicycle, or the like.
  • the roadside unit 12 includes a sensor unit 14, a communication unit 15, and a control unit (first control unit) 16.
  • the roadside unit 12 may be fixed to a structure having a height capable of capturing an outdoor scene including the road 13, such as a signal device near an intersection where the road 13 intersects, a utility pole, a streetlight, or the like.
  • the sensor unit 14 detects information about detected objects in the vicinity.
  • a detected object may be a vehicle 10 on the road, an object, a person, etc., which should be considered in the operation of the individual vehicle 10 .
  • the information about the detected object may be the position of the detected object, the presence or absence of the detected object, the type of existing detected object, the speed, and the traveling direction.
  • the information about the detected object detected by the sensor unit 14 may be raw information from which the position of the detected object, the presence or absence of the detected object, and the type, speed, and traveling direction of an existing detected object can be generated.
  • the sensor unit 14 may detect information about the detected object at predetermined intervals.
  • the sensor unit 14 may include, for example, imaging devices such as a visible light camera, an FIR (Far Infrared Rays) camera, and a stereo camera, and distance measuring devices such as an infrared radar, a millimeter wave radar, and LiDAR (light detection and ranging).
  • the position of the detected object, the presence or absence of the detected object, the type of the detected object, and the traveling direction can be generated based on the image analysis of the image detected by the imaging device.
  • the velocity and acceleration of the detected object can be generated.
  • the position of the detected object, the presence or absence of the detected object, and the type of the existing detected object can be generated based on the distance of the object point detected by the rangefinder. Velocity and acceleration of the detected object can be generated based on successively detected distances at different times.
  • the sensor unit 14 has a detection range centered on the detection axis.
  • the sensor unit 14 may be installed so that a desired area on the road 13 is included in the detection range.
  • the attitude of the sensor unit 14 as installed, in other words the direction and inclination of the detection axis with respect to the ground, may be measured in advance.
  • the communication unit 15 may perform wireless communication with the vehicle 10 under the control of the control unit 16 .
  • the communication unit 15 may be composed of a communication circuit and an antenna.
  • the antenna is, for example, an omnidirectional antenna.
  • the communication unit 15 performs wireless communication using, for example, the 700 MHz band assigned to ITS. The communication unit 15 may also perform wireless communication using, for example, a wireless LAN (Local Area Network).
  • the communication unit 15 may perform various processing such as amplification processing on the signal received by the antenna, and output the processed received signal to the control unit 16 .
  • the control unit 16 may perform various types of processing on the input received signal to acquire information contained in the received signal.
  • the communication unit 15 may perform various processing such as amplification processing on the information acquired from the control unit 16, and wirelessly transmit the processed transmission signal from the antenna.
  • the communication unit 15 transmits first information that can predict the performance of the sensor unit 14 .
  • the performance of the sensor section 14 is, for example, the detection range of the sensor section 14 .
  • the first information may be, for example, a detection range, in other words, information indicating a range in which the sensor unit 14 of the roadside unit 12 can detect the detected object.
  • the first information may be information capable of estimating the detection range.
  • the information that allows estimation of the detection range may be information indicating the positions of the plurality of detected objects within the detection range of the sensor unit 14 . Based on the positions of the plurality of detected objects, it can be estimated that at least the area connecting the plurality of positions is a detectable range.
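One way to turn the reported positions into an estimated detectable range, as described above, is a convex hull over those positions. A sketch using the standard monotone-chain algorithm (an illustrative choice; the patent does not prescribe how the area connecting the positions is computed):

```python
def estimate_detectable_range(points):
    """Estimate a conservative detectable range as the convex hull of the
    positions of objects a roadside unit reported (monotone chain).
    Hypothetical helper, not from the patent."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Concatenate, dropping each chain's last point (duplicated endpoints).
    return lower[:-1] + upper[:-1]
```

The hull is the smallest convex area connecting all reported positions, so any point inside it was plausibly detectable.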
  • the communication unit 15 may specifically transmit the fourth information.
  • the fourth information includes information about detected objects around the roadside unit 12, as will be described later.
  • the communication unit 15 specifically receives the third information.
  • the third information includes information about a latent detected object that may not be detected by the sensor unit 14 and is generated by an information processing device mounted on the vehicle 10 .
  • the communication unit 15 may specifically receive the fifth information.
  • the fifth information is generated in the information processing device mounted on the vehicle 10 and indicates the priority of detection targets around each vehicle 10 to which attention should be paid.
  • the control unit 16 includes one or more processors and memory.
  • the processor may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor that specializes in specific processing.
  • a dedicated processor may include an Application Specific Integrated Circuit (ASIC).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the control unit 16 may be either SoC (System-on-a-Chip) in which one or more processors cooperate, or SiP (System In a Package).
  • the control unit 16 may process the information regarding the surrounding detected objects detected by the sensor unit 14 as necessary.
  • the control unit 16 may add time information to the information regarding the detected object.
  • the time information may be obtained from a timer built into the control unit 16 or from a timer of the roadside device 12 .
  • in a configuration in which the sensor unit 14 includes an imaging device, the control unit 16 may determine the presence or absence of a detected object by applying pattern matching or a discrimination model constructed by machine learning to the detected image. Furthermore, when a detected object exists, the control unit 16 may estimate the type of the detected object. Furthermore, when a detected object exists, the control unit 16 may calculate the position of the detected object on the ground based on the position of the detected object in the image and the attitude of the sensor unit 14. Furthermore, when a detected object exists, the control unit 16 may estimate the traveling direction of the detected object by estimating the orientation of the detected object.
  • the control unit 16 may calculate the velocity of a detected object based on the difference between the ground positions of the same detected object in temporally consecutive frame images and the frame period. Further, the control unit 16 may calculate the acceleration of the detected object based on the difference between velocities calculated at consecutive points in time and the frame period. The control unit 16 may treat the information processed as described above as the information about the detected object.
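The frame-difference calculation above amounts to simple finite differences. A minimal sketch, assuming 2-D ground positions in metres and a known frame period in seconds (function names are illustrative):

```python
def velocity_from_frames(p_prev, p_curr, frame_period):
    """Velocity vector (m/s) from the same object's ground positions in
    two temporally consecutive frames."""
    return ((p_curr[0] - p_prev[0]) / frame_period,
            (p_curr[1] - p_prev[1]) / frame_period)

def acceleration_from_velocities(v_prev, v_curr, frame_period):
    """Acceleration vector (m/s^2) from velocities at consecutive points
    in time, one frame period apart."""
    return ((v_curr[0] - v_prev[0]) / frame_period,
            (v_curr[1] - v_prev[1]) / frame_period)
```

The same differencing applies to the distance measuring configuration, with the frame period replaced by the ranging period.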
  • in a configuration in which the sensor unit 14 includes a distance measuring device, the control unit 16 may determine the presence or absence of a detected object in each tilt direction with respect to the detection axis by comparing the detected distance of the object point in that direction with the known distance to the road 13, as described above. Furthermore, when a detected object exists, the control unit 16 may calculate the position of the detected object on the ground based on the distance of the detected object and the attitude of the sensor unit 14. Furthermore, when a detected object exists, the control unit 16 may estimate the type of the detected object based on the size of the set of object points in a plurality of tilt directions that can be regarded as being at the same distance.
  • the control unit 16 may calculate the velocity of the detected object based on the difference between the ground positions of the same detected object in temporally consecutive distance measurement results and the distance measurement period. Further, the control unit 16 may calculate the acceleration of the detected object based on the difference between velocities calculated at consecutive points in time and the distance measurement period. The control unit 16 may treat the information processed as described above as the information about the detected object.
  • the control unit 16 generates new fourth information including the information about the detected object in the vicinity detected most recently by the sensor unit 14, the third information, and the fifth information.
  • the control unit 16 controls the communication unit 15 to transmit the newly generated fourth information. Note that the control unit 16 may control the communication unit 15 so that the fourth information is transmitted not only to surrounding vehicles 10 but also to other surrounding roadside units 12 that can receive the fourth information.
  • the control unit 16 may determine whether or not the latent detected object in the third information is the same as the peripheral detected object most recently detected by the sensor unit 14 .
  • the control unit 16 may determine identity based on whether the interval between the position of the latent detected object and the position of the detected object is equal to or less than a first distance threshold. Based on the speed and acceleration of the latent detected object or of the detected object, the control unit 16 may correct either the position of the latent detected object or the position of the detected object to account for the difference between the acquisition time of the third information and the latest detection time of the sensor unit 14.
  • the control unit 16 may use the positions of the latent detected object and the detected object, which are adjusted to the same point in time by correcting the positions, for identity determination.
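The identity check described above (advance one position to the other's time, then apply the first distance threshold) might look as follows; the 2.0 m default threshold and the dict layout are assumptions for illustration:

```python
def is_same_object(latent, detected, t_third_info, t_latest_detect,
                   threshold_m=2.0):
    """latent: dict with 'pos' (x, y) and 'vel' (vx, vy); detected: dict
    with 'pos'. The latent object's position is advanced by its velocity
    over the time difference so both positions refer to the same instant,
    then compared against a first distance threshold."""
    dt = t_latest_detect - t_third_info
    x = latent['pos'][0] + latent['vel'][0] * dt
    y = latent['pos'][1] + latent['vel'][1] * dt
    dx = x - detected['pos'][0]
    dy = y - detected['pos'][1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold_m
```

When this returns True, the roadside unit can treat the latent object and its own most recent detection as one object rather than duplicating it in the fourth information.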
  • the control unit 16 may periodically generate the fourth information.
  • the control unit 16 may use the latest third information among a plurality of pieces of third information acquired from the same vehicle at different times to generate the fourth information.
  • the control unit 16 may use the third information acquired after the previous transmission of the fourth information until the generation of the new fourth information to generate the new fourth information.
  • the control unit 16 may transmit the information read from the memory as the first information.
  • the control unit 16 may transmit the information read from the memory as the first information together with the identification information of the roadside device 12 .
  • the control unit 16 may calculate the detection range based on the information read from the memory and transmit information indicating the detection range as the first information.
  • the control unit 16 may control the communication unit 15 to periodically transmit the first information.
  • the control unit 16 may transmit the first information in the same period as the fourth information.
  • the control unit 16 may notify a server device or the like that manages the roadside unit 12 of the information as necessary. In addition, the control unit 16 may notify vehicles or the like around the roadside unit 12 of the information as necessary.
  • the vehicle 10 is equipped with an information processing device 17 .
  • the vehicle 10 may further include a sensor section 18 and an ECU (Electronic Control Unit) 19 .
  • the vehicle 10 may be equipped with the in-vehicle wireless device 20 in a configuration in which the information processing device 17 does not include a wireless communication unit.
  • in a configuration in which the information processing device 17 includes a wireless communication unit, the wireless communication unit has functions similar to those of the in-vehicle wireless device 20.
  • the sensor unit 18 may include various sensors.
  • the various sensors may include sensors that detect conditions around the vehicle 10, sensors that detect the state of the vehicle 10, and the like.
  • the sensors that detect the circumstances around the vehicle 10 may include sensors that detect information about detected objects around the vehicle 10 .
  • the information about the detected object may be similar to the information about the detected object detected by the sensor unit 14 of the roadside unit 12 . In other words, the information about the detected object may be the position of the detected object, the presence or absence of the detected object, the type of existing detected object, the speed, and the traveling direction.
  • a sensor that detects the state of the vehicle 10 may include a speed sensor that detects the speed of the vehicle itself, and a positioning device such as a GNSS (Global Navigation Satellite System) that detects the spatial position of the vehicle in real space.
  • the ECU 19 may assist the driving of the vehicle 10 based on information detected by the sensor unit 18 and information obtained from the information processing device 17, which will be described later. In a configuration in which the vehicle 10 can be controlled by the driver, the ECU 19 may notify the driver of the acquired information or of information obtained by processing it. In a configuration in which maneuvers of the vehicle 10 such as acceleration/deceleration and steering are performed automatically or semi-automatically, the ECU 19 may use the acquired information for determining the maneuver.
  • the in-vehicle wireless device 20 may perform wireless communication with the roadside units 12 and other vehicles 10 in the vicinity.
  • the in-vehicle radio device 20 may be composed of a communication circuit and an antenna.
  • the antenna is, for example, an omnidirectional antenna.
  • the in-vehicle wireless device 20 performs wireless communication using, for example, the 700 MHz band assigned to ITS.
  • the in-vehicle wireless device 20 performs wireless communication using, for example, a wireless LAN (Local Area Network).
  • the in-vehicle wireless device 20 may output information acquired from the roadside device 12 and the vehicle 10 in the vicinity to the information processing device 17 .
  • the in-vehicle wireless device 20 may output information acquired from the information processing device 17 to external devices such as the roadside device 12 and the vehicle 10 .
  • the information processing device 17 includes an acquisition unit 21 and a control unit (second control unit) 22 .
  • the acquisition unit 21 is an information interface with electronic devices other than the information processing device 17 .
  • the acquisition unit 21 acquires first information about the roadside device 12 from surrounding roadside devices 12 .
  • the acquisition unit 21 acquires the first information from the surrounding roadside device 12 via the vehicle-mounted wireless device 20 in a configuration in which the vehicle-mounted wireless device 20 is mounted on the vehicle 10 separately from the information processing device 17 .
  • the acquisition unit 21 may be able to acquire the fourth information together with the first information from the surrounding roadside units 12. As with the first information, in a configuration in which the in-vehicle wireless device 20 is mounted on the vehicle 10 separately from the information processing device 17, the acquisition unit 21 may be able to acquire the fourth information from the surrounding roadside units 12 via the in-vehicle wireless device 20.
  • the acquisition unit 21 acquires second information about detected objects around the vehicle 10 in which it is mounted.
  • the second information about the detected object may be similar to the information about the detected object detected by the sensor unit 14 of the roadside unit 12 .
  • the second information about the detected object may be the position of the detected object, the presence or absence of the detected object, the type of existing detected object, the speed, and the traveling direction.
  • the acquisition unit 21 may acquire the second information from the sensor unit 18 .
  • the acquisition unit 21 may acquire the second information from a surrounding vehicle 10 different from the vehicle 10 in which the information processing device 17 is mounted. Further, the vehicle 10 equipped with the information processing device 17 may transmit the second information acquired by the vehicle 10 to other vehicles 10 or roadside units 12 in the vicinity of the vehicle 10 .
  • the acquisition unit 21 may also have a function of outputting information to an external device. In the following description, it is assumed that the acquisition unit 21 has a function of outputting information to an external device.
  • the acquisition unit 21 may output the third information and the fifth information generated by the control unit 22 to the in-vehicle wireless device 20, as will be described later.
  • the acquisition unit 21 may output to the ECU 19 information about detected objects around the vehicle 10 in which the information processing device 17 is mounted.
  • the control unit 22 includes one or more processors and memory.
  • the processor may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor that specializes in specific processing.
  • a dedicated processor may include an application specific IC.
  • a processor may include a programmable logic device.
  • a PLD may include an FPGA.
  • the control unit 22 may be either an SoC in which one or more processors cooperate, or a SiP.
  • the control unit 22 may process the information regarding the surrounding detected objects detected by the sensor unit 18 as necessary.
  • the control unit 22 may add time information to the information regarding the detected object.
  • the time information may be obtained from a timer built into the control unit 22 or a timer of the information processing device 17 .
  • based on the first information, the control unit 22 extracts, from among the detected objects included in the second information, detected objects that may not have been detected by the roadside unit 12 that notified the first information, as latent detected objects.
  • the control unit 22 may predict the performance of the roadside unit 12 based on the first information in order to extract the latent detected object. For example, the control unit 22 may recognize the detectable range as the performance of the roadside unit 12 based on the first information. When the first information is information indicating the detectable range itself, the control unit 22 may recognize that range as the detectable range of the roadside unit 12. When the first information is information that allows the detectable range to be estimated, the control unit 22 may recognize the range calculated based on the first information as the detectable range. When acquiring the first information from a plurality of roadside units 12, the control unit 22 may recognize the first information for each roadside unit 12 based on the identification information acquired together with the first information, and may calculate the detectable range for each roadside unit 12.
  • based on the first information, more specifically based on the detectable range corresponding to the first information, the control unit 22 extracts, from among the detected objects included in the second information, detected objects that may not have been detected by the roadside unit 12 that notified the first information, as latent detected objects. As shown in FIG. 5, the control unit 22 extracts, for example, a detected object located outside the detectable range rd among the detected objects do2 included in the second information as a latent detected object hdo.
  • the control unit 22 may extract, as a latent detected object hdo, a detected object do2 included in the second information that is located outside the detectable range rd and is not included in the fourth information. In other words, even if a detected object do2 included in the second information is located outside the detectable range rd, the control unit 22 does not identify it as a latent detected object hdo when it matches a detected object do4 included in the fourth information.
  • the control unit 22 may determine whether or not a detected object do2 included in the second information is a detected object not included in the fourth information by comparing it against the fourth information, which is in the same category as the second information, specifically by comparing the position of the detected object in the second information with the position of the detected object in the fourth information.
  • the control unit 22 may extract, as a latent detected object hdo, a detected object do2 included in the second information that is located within the detectable range rd and is not included in the fourth information. In other words, even if a detected object do2 included in the second information is located within the detectable range rd, the control unit 22 does not identify it as a latent detected object hdo when it matches a detected object do4 included in the fourth information.
  • the control unit 22 may likewise determine whether or not a detected object do2 included in the second information is a detected object not included in the fourth information by comparing it against the fourth information, which is in the same category as the second information.
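Combining the two cases above, a detection in the second information becomes latent exactly when no detection in the fourth information matches its position, whether it lies inside or outside the detectable range. A sketch, assuming positions are (x, y) tuples and using a hypothetical 2.0 m matching threshold in place of the comparison the patent leaves unspecified:

```python
def extract_latent(second_info, fourth_info, threshold_m=2.0):
    """second_info / fourth_info: lists of (x, y) positions. A detection
    is latent when no position in the fourth information lies within the
    matching threshold of it (illustrative names and threshold)."""
    def matched(p):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                   <= threshold_m ** 2
                   for q in fourth_info)
    return [p for p in second_info if not matched(p)]
```

The range test then only distinguishes the two blind-spot cases (outside the detectable range versus a blind spot within it), for example when annotating the third information.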
  • the control unit 22 generates third information including at least information about the latent detection object hdo extracted as described above.
  • the information about the latent detected object hdo may be similar to the information about the detected object detected by the sensor unit 18 of the vehicle 10 .
  • the information about the latent detected object hdo may be the position of the latent detected object hdo, the presence or absence of the latent detected object hdo, the type of the existing latent detected object hdo, the speed, and the traveling direction.
  • the control unit 22 may specify a range in which the latent detected object hdo is located.
  • the range in which the latent detectable object hdo is positioned may not only be the position of the latent detectable object hdo itself, but may also be a range having a predetermined width with the position of the latent detectable object hdo as a reference or the like.
  • the control unit 22 may include in the third information information indicating that the specified range is a blind spot in the roadside unit 12 that has notified the fourth information.
  • the control unit 22 extracts the first time point included in the second information, in other words, the time information added to the second information. Based on the corresponding time, the state of the detected object (first detected object) do2 included in the second information at the third time point after the first time point may be predicted. Similarly, based on the second time point included in the fourth information, in other words, the time added to the fourth information, the control unit 22 controls the detection included in the fourth information at a third time point after the second time point. The state of the object (second detected object) do4 may be predicted.
  • For example, when the second information includes velocity and acceleration together with the position, the control unit 22 predicts the position of the detected object do2 in the second information at the third time point. Likewise, when the fourth information includes velocity and acceleration together with the position, the control unit 22 predicts the position of the detected object do4 in the fourth information at the third time point.
  • The control unit 22 may determine whether the prediction results, in other words, the predicted states of the detected object do2 included in the second information and the detected object do4 included in the fourth information at the same third time point, are the same. When the control unit 22 determines that the states are the same, it may exclude the information regarding either the detected object do2 included in the second information or the detected object do4 included in the fourth information from the generation of the third information.
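The prediction and de-duplication step above can be sketched as follows, assuming a constant-acceleration motion model and a distance tolerance `tol` for deciding that two predictions describe the same object. Both the motion model and the tolerance are illustrative assumptions, not details taken from the disclosure.

```python
def predict_position(pos, vel, acc, dt):
    # Constant-acceleration prediction per axis: p + v*dt + 0.5*a*dt^2.
    return tuple(p + v * dt + 0.5 * a * dt * dt for p, v, a in zip(pos, vel, acc))

def is_duplicate(obs2, t2, obs4, t4, t3, tol=1.0):
    # Compare an object from the second information (detected at t2) with one
    # from the fourth information (detected at t4) by predicting both states
    # forward to the common third time point t3.
    p2 = predict_position(obs2["pos"], obs2["vel"], obs2.get("acc", (0.0, 0.0)), t3 - t2)
    p4 = predict_position(obs4["pos"], obs4["vel"], obs4.get("acc", (0.0, 0.0)), t3 - t4)
    dist = ((p2[0] - p4[0]) ** 2 + (p2[1] - p4[1]) ** 2) ** 0.5
    return dist <= tol
```

When `is_duplicate` returns true, one of the two records would be dropped before the third information is generated.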
  • The control unit 22 may determine whether or not the detected objects do2 included in the second information include a vehicle 10. When the detected objects do2 include a vehicle 10, the control unit 22 may calculate the priority of the other detected objects viewed from that vehicle 10. The other detected objects may include vehicles 10 other than the one on which the information processing device 17 is mounted. The control unit 22 may generate fifth information in which the calculated priority is associated with each of the other detected objects.
  • The control unit 22 calculates the velocity vectors of the other detected objects viewed from the vehicle 10, which is a detected object do2 included in the second information. Further, the control unit 22 calculates a range of predetermined size centered on the vehicle 10.
  • The range of predetermined size is, for example, a circle whose radius is considered sufficient to ensure safety around the vehicle 10, such as a circle with a radius of 100 m.
  • The control unit 22 calculates the priority based on the velocity vectors and the range. Specifically, the control unit 22 determines whether or not the position reached by moving another detected object from its current position by the distance obtained by multiplying its velocity vector by a predetermined time is included in the range.
  • The control unit 22 may calculate a higher priority when the moved position is included in the range than when it is outside the range. Alternatively, the control unit 22 may calculate the priority so that the smaller the distance between the moved position and the position of the vehicle 10, the higher the priority.
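One way this velocity-vector-based priority could be computed is sketched below. The 100 m radius follows the example in the text; the 3-second horizon and the linear priority scale in [0, 1] are assumptions introduced here for illustration only.

```python
import math

def priority_by_range(ego_pos, obj_pos, obj_vel, horizon_s=3.0, radius_m=100.0):
    # Move the other detected object along its velocity vector for horizon_s
    # seconds, then check whether the moved position falls inside the safety
    # circle of radius_m centered on the vehicle; nearer means higher priority.
    moved = (obj_pos[0] + obj_vel[0] * horizon_s,
             obj_pos[1] + obj_vel[1] * horizon_s)
    dist = math.hypot(moved[0] - ego_pos[0], moved[1] - ego_pos[1])
    if dist > radius_m:
        return 0.0                    # outside the range: lowest priority
    return 1.0 - dist / radius_m      # inside the range: nearer means higher
```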
  • the control unit 22 calculates the positional interval between the vehicle 10, which is the detected object do2 included in the second information, and another detected object.
  • The control unit 22 calculates the priority based on the change over time from the previously calculated positional interval between the same vehicle 10 and the same detected object to the newly calculated positional interval. Specifically, the control unit 22 may calculate the priority so that the faster the positional interval decreases over time, the higher the priority.
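A simple reading of this rule is that the closing speed of the gap determines the priority. The sketch below assumes that interpretation; the normalization constant `max_closing_mps` and the clipping to [0, 1] are hypothetical choices, not values from the disclosure.

```python
def priority_by_closing_gap(prev_gap_m, new_gap_m, dt_s, max_closing_mps=30.0):
    # Closing speed is positive when the positional interval shrinks between
    # two consecutive calculations; faster closing yields higher priority.
    closing_speed = (prev_gap_m - new_gap_m) / dt_s
    return min(max(closing_speed / max_closing_mps, 0.0), 1.0)
```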
  • the control unit 22 may weight the priority based on the type of detected object. For example, the larger the size of the detected object or the faster the speed of movement of the detected object, the greater the weighting may be.
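The weighting by type and speed could look like the following. The weight table and the speed normalization are entirely hypothetical; the disclosure only states that larger and faster detected objects may receive greater weight.

```python
# Hypothetical weights: larger object classes weigh more.
TYPE_WEIGHT = {"truck": 1.5, "car": 1.2, "motorcycle": 1.1, "pedestrian": 1.0}

def weighted_priority(base_priority, obj_type, speed_mps, speed_ref_mps=30.0):
    # Scale a base priority up for larger types and for faster movement,
    # capping the speed contribution at a factor of 2.
    type_w = TYPE_WEIGHT.get(obj_type, 1.0)
    speed_w = 1.0 + min(speed_mps / speed_ref_mps, 1.0)
    return base_priority * type_w * speed_w
```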
  • the control unit 22 may control the acquisition unit 21 to output the generated third information and fifth information to the in-vehicle wireless device 20 .
  • the control unit 22 may control the acquisition unit 21 to output the second information and the fourth information to the ECU 19 .
  • Information processing in each vehicle 10 is performed periodically while the information processing device 17 of the vehicle 10 is in operation, and the results of communication with other vehicles 10 are used for the information processing of that vehicle 10.
  • Information processing in the roadside device 12 is performed periodically, and when the vehicle 10 is located within a communicable range, the result of communication with the vehicle 10 is used for information processing in the roadside device 12 . Therefore, the information processing by the vehicle 10 and the roadside device 12 at an arbitrary time will be described below as an example, but the order of information processing in each vehicle 10 and the roadside device 12 is not limited to the following example.
  • In step S100, the roadside unit 12 detects information about detected objects with the sensor unit 14.
  • The roadside device 12 generates the fourth information using the information about the detected objects. As will be described later, if the roadside device 12 has previously acquired at least one of the third information and the fifth information, it may also use that information to generate the fourth information.
  • the roadside device 12 transmits the first information and the generated fourth information to the surrounding vehicles 10 .
  • In step S101, the first vehicle 10a, which is an arbitrary vehicle 10, detects information about detected objects with the sensor unit 18.
  • the first vehicle 10a generates second information using information about the detected object.
  • the first vehicle 10a transmits the generated second information to surrounding vehicles 10 including the second vehicle 10b other than the first vehicle 10a.
  • In step S102, the second vehicle 10b detects information about detected objects with the sensor unit 18.
  • the second vehicle 10b uses the information about the detected object to generate the second information.
  • the second vehicle 10b transmits the generated second information to surrounding vehicles 10 including the first vehicle 10a other than the second vehicle 10b.
  • In step S103, the first vehicle 10a predicts, for each of the second information detected by the first vehicle 10a itself, the second information obtained from the second vehicle 10b, and the fourth information obtained from the roadside unit 12, the states of the detected objects at a third time point common to all of them.
  • In step S104, the first vehicle 10a extracts latent detected objects hdo based on at least one of the comparison of the states of the detected objects predicted in step S103 and the first information acquired from the roadside unit 12.
  • In step S105, for each detected object included in the second information detected by the first vehicle 10a, the second information obtained from the second vehicle 10b, and the fourth information obtained from the roadside unit 12, the first vehicle 10a calculates the priority of the other detected objects viewed from that detected object and generates fifth information. The first vehicle 10a transmits the fifth information to the roadside unit 12 together with the third information generated in step S104.
  • the second vehicle 10b executes processing similar to steps S103 to S105 in the first vehicle 10a, and transmits third information and fifth information to the roadside unit 12.
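The per-cycle flow of steps S103 to S105 can be compressed into one toy cycle. Everything here is schematic: detected objects are dictionaries keyed by an identifier, the fourth information is reduced to a set of identifiers, and sorting by distance stands in for the actual priority calculation of the fifth information.

```python
def vehicle_cycle(own_second, peer_second, fourth_ids):
    # Merge the vehicle's own second information with the second information
    # received from peer vehicles (results of steps S101/S102).
    second = {**own_second, **peer_second}
    # S104 (simplified): objects the roadside unit did not report become
    # candidates for the third information.
    third = {oid: obj for oid, obj in second.items() if oid not in fourth_ids}
    # S105 (simplified): order all objects by distance as a crude priority.
    fifth = sorted(second, key=lambda oid: second[oid]["dist"])
    return third, fifth
```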
  • The information processing device 17 of the present embodiment configured as described above includes the acquisition unit 21, which acquires from a nearby roadside unit 12 first information from which the performance of the roadside unit 12 can be predicted and acquires second information regarding detected objects in the vicinity, and the control unit 22, which extracts, based on the first information, latent detected objects hdo that may not have been detected by the roadside unit 12 from among the detected objects do2 included in the second information and generates third information regarding the latent detected objects hdo.
  • the information processing device 17 can generate information about a detected object (latent detected object hdo) other than the detected object actually detected by the roadside device 12 . Therefore, the information processing device 17 can generate information that supplements the information detected by the roadside unit 12 .
  • The acquisition unit 21 can acquire the fourth information about the detected objects detected by the roadside unit 12, and the control unit 22 extracts, as latent detected objects hdo, those detected objects do2 included in the second information that are located outside the detectable range rd corresponding to the first information and are not included in the fourth information.
  • A detected object positioned outside the detectable range rd is highly likely not to have been detected by the roadside unit 12. Therefore, with the above configuration, the information processing device 17 can speed up the extraction process by using location outside the detectable range rd as the first condition for extraction.
  • However, the roadside unit 12 may detect information about a detected object even though the object is located outside the detectable range rd. To handle such cases, the information processing device 17 further narrows the extraction to detected objects that were actually not detected by the roadside unit 12, which improves the extraction accuracy of latent detected objects hdo.
  • The acquisition unit 21 can acquire the fourth information regarding the detected objects detected by the roadside unit 12, and the control unit 22 extracts, as latent detected objects hdo, those detected objects do2 included in the second information that are located within the detectable range rd corresponding to the first information and yet are not included in the fourth information.
  • a detected object may exist in a blind spot from the roadside device 12 due to the presence of an obstacle, even though it is located within the detectable range rd of the roadside device 12 .
  • the information processing device 17 having the configuration described above can improve the extraction accuracy of the latent detection object hdo.
  • the information processing device 17 includes the range in which the latent detection object hdo is located in the third information as information indicating that the range is a blind spot in the roadside unit 12 . With such a configuration, the information processing device 17 can further supplement useful information for the roadside unit 12 .
  • Based on the first time point included in the second information and the second time point included in the fourth information, the information processing device 17 predicts the respective states, at a third time point, of the first detected object included in the second information and the second detected object included in the fourth information. With such a configuration, even if information regarding the same detected object differs because of a shift in detection time, the information processing device 17 corrects for the difference before comparing the information, and can therefore more accurately determine whether two pieces of information concern the same detected object.
  • When the information processing device 17 determines, based on the prediction results, that the first detected object and the second detected object are the same, it excludes either the second information regarding the first detected object or the fourth information regarding the second detected object from the generation of the third information.
  • the information processing device 17 can prevent the third information from including a plurality of pieces of information regarding the same detected object with different detection times. Therefore, the information processing device 17 can reduce the processing load on the roadside device 12 that acquires the third information.
  • The information processing device 17 calculates the priority of the detected objects viewed from the vehicle 10 and generates fifth information in which the priority is associated with each detected object. With such a configuration, the information processing device 17 can generate useful information for other vehicles 10 that acquire the fifth information via the roadside device 12. By generating such useful information instead of the roadside device 12, the information processing device 17 can reduce the processing load on the roadside device 12.
  • The roadside unit 12 configured as described above includes the sensor unit 14, which detects information about detected objects in the vicinity, the communication unit 15, which transmits the first information, from which the performance of the sensor unit 14 can be predicted, and receives the third information about latent detected objects hdo that the sensor unit 14 may not have detected, and the control unit 16, which generates fourth information including the information about the detected objects detected by the sensor unit 14 and the third information, and controls the communication unit 15 to transmit the fourth information.
  • Since the roadside device 12 generates the fourth information including the third information, which concerns detected objects that the roadside device 12 is highly likely to have failed to detect, it can provide the vehicles 10 with more useful information.
  • Embodiments of the information processing device 17 and roadside device 12 have been described above.
  • The program may be recorded on a computer-readable medium; as such a medium, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, a memory card, or the like can be used.
  • The implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter; for example, it may take the form of a program module incorporated into an operating system.
  • The program may be configured so that all of its processing is performed only by the CPU on a control board, or it may be configured so that part or all of it is executed by another processing unit mounted on an expansion board or expansion unit added to the board as required.
  • In the above embodiment, the information processing device 17 and the ECU 19 in the vehicle 10 are described as separate components.
  • However, the information processing device 17 may include the ECU 19 and perform not only the processing of the information processing device 17 described above but also the processing performed by the ECU 19 described above.
  • Embodiments according to the present disclosure are not limited to the specific configurations of the embodiments described above. Embodiments of the present disclosure can extend to all novel features described in the present disclosure, or combinations thereof, or to all novel methods or process steps described, or combinations thereof.
  • Descriptions such as "first" and "second" in this disclosure are identifiers for distinguishing configurations. Configurations distinguished by "first" and "second" in this disclosure may have their identifiers exchanged; for example, the first information and the second information can exchange the identifiers "first" and "second". The exchange of identifiers is performed simultaneously, and the configurations remain distinct after the exchange. Identifiers may also be deleted, in which case the configurations are distinguished by reference signs. The identifiers "first" and "second" in this disclosure must not be used as a basis for interpreting the order of the configurations or as grounds for the existence of an identifier with a smaller number.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

This information processing device comprises an acquisition unit and a control unit. The acquisition unit acquires first information from a nearby roadside unit. The first information makes it possible to predict the performance of the roadside unit. The acquisition unit acquires second information regarding detected objects in the vicinity. Based on the first information, the control unit extracts, from among the detected objects included in the second information, a latent detected object that may not have been detected by the roadside unit. The control unit generates third information regarding the latent detected object.
PCT/JP2023/000492 2022-01-27 2023-01-11 Information processing device, roadside unit, traffic support system, and information provision method WO2023145436A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-011224 2022-01-27
JP2022011224 2022-01-27

Publications (1)

Publication Number Publication Date
WO2023145436A1 true WO2023145436A1 (fr) 2023-08-03

Family

ID=87471191

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/000492 WO2023145436A1 (fr) Information processing device, roadside unit, traffic support system, and information provision method

Country Status (1)

Country Link
WO (1) WO2023145436A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010003242A (ja) * 2008-06-23 2010-01-07 Toyota Motor Corp Communication system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010003242A (ja) * 2008-06-23 2010-01-07 Toyota Motor Corp Communication system

Similar Documents

Publication Publication Date Title
CN108932869B (zh) Vehicle system, vehicle information processing method, recording medium, traffic system, infrastructure system, and information processing method
JP6714513B2 (ja) In-vehicle device for notifying a vehicle navigation module of the presence of an object
US11315420B2 (en) Moving object and driving support system for moving object
US8615109B2 (en) Moving object trajectory estimating device
EP2600328B1 (fr) Dispositif d'aide à la conduite
US9002631B2 (en) Vicinity environment estimation device with blind region prediction, road detection and intervehicle communication
JP6468062B2 (ja) 物体認識システム
CN112106065A (zh) Predicting the state and position of an observed vehicle using optical tracking of wheel rotation
JP2018195289A (ja) Vehicle system, vehicle information processing method, program, traffic system, infrastructure system, and infrastructure information processing method
JPWO2017057043A1 (ja) Image processing device, image processing method, and program
US10351144B2 (en) Sightline estimation system and sightline estimation method
US20210387616A1 (en) In-vehicle sensor system
US20240142607A1 (en) Information processing device, information processing method, computer program, and mobile device
JP2022024741A (ja) Vehicle control device and vehicle control method
JP2009181315A (ja) Object detection device
WO2018109865A1 (fr) Roadside unit and vehicle-to-road communication system
WO2023145436A1 (fr) Information processing device, roadside unit, traffic support system, and information provision method
CN116892949A (zh) Ground object detection device, ground object detection method, and computer program for ground object detection
JP4644590B2 (ja) Surrounding vehicle position detection device and surrounding vehicle position detection method
CN115128566A (zh) Radar data determination circuit and radar data determination method
WO2022070250A1 (fr) Information processing device, information processing method, and program
CN111766601B (zh) Recognition device, vehicle control device, recognition method, and storage medium
US11769337B2 (en) Traffic signal recognition method and traffic signal recognition device
JP6946660B2 (ja) Positioning device
WO2023210450A1 (fr) Observation device, information processing device, and information processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23746651

Country of ref document: EP

Kind code of ref document: A1