WO2022255074A1 - Device for vehicle and error estimation method - Google Patents

Device for vehicle and error estimation method

Info

Publication number: WO2022255074A1
Application number: PCT/JP2022/020383
Authority: WIPO (PCT)
Prior art keywords: vehicle, target, mounted object, communication device, detected
Other languages: French (fr), Japanese (ja)
Inventor: 周▲イク▼ 金
Original assignee: 株式会社デンソー (DENSO Corporation)
Application filed by: 株式会社デンソー (DENSO Corporation)
Priority: JP2023525701A (JPWO2022255074A1, ja)
Publication: WO2022255074A1 (en)

Classifications

    • G: PHYSICS
      • G08: SIGNALLING
        • G08G: TRAFFIC CONTROL SYSTEMS
          • G08G1/00: Traffic control systems for road vehicles
            • G08G1/09: Arrangements for giving variable traffic instructions
            • G08G1/16: Anti-collision systems

Definitions

  • the present disclosure relates to a vehicle device and an error estimation method.
  • Patent Document 1 discloses a technique for determining the existence of a target beyond the line of sight of the own vehicle by using reception information, which is position information transmitted from another vehicle as the target, and sensor information, which includes the moving speed and moving direction of the target detected by the sensors of the own vehicle.
  • With the technology disclosed in Patent Document 1, it is possible to determine the presence of other vehicles that cannot be detected by the own vehicle's sensors by receiving the position information of those other vehicles.
  • However, the position information transmitted from other vehicles includes an error in positioning (hereinafter referred to as a positioning error), whether or not positioning is performed using positioning satellites. Therefore, when another vehicle is located outside the detection range of the own vehicle's sensors, it is difficult to specify the position of that other vehicle with high accuracy.
  • One object of this disclosure is to provide a vehicle device and an error estimation method that make it possible to more accurately identify the position of a communication-device-mounted object outside the detection range of the own vehicle's surroundings monitoring sensor.
  • the vehicle apparatus of the present disclosure is a vehicle apparatus that can be used in a vehicle, and includes: a detection position specifying unit that specifies a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle; a communication information acquisition unit that acquires, via wireless communication, a mounted object reference target position, which is the position of a target that is transmitted from a communication-device-mounted object and detected by that communication-device-mounted object; an identity determination unit that determines whether or not the target detected by the surroundings monitoring sensor of the vehicle is the same as the target for which the mounted object reference target position was acquired by the communication information acquisition unit; and an error estimation unit that, when the targets are determined to be the same, estimates a mounted object positioning error, which is the positioning error in the communication-device-mounted object, from the deviation between the mounted object reference target position and the target detection position.
  • the error estimation method of the present disclosure can be used in a vehicle and is executed by at least one processor, and includes: a detection position specifying step of specifying a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle; a communication information acquisition step of acquiring, via wireless communication, a mounted object reference target position, which is the position of a target that is transmitted from a communication-device-mounted object and detected by that communication-device-mounted object; an identity determination step of determining whether or not the target detected by the surroundings monitoring sensor of the vehicle is the same as the target for which the mounted object reference target position was acquired in the communication information acquisition step; and an error estimation step of, when the targets are determined to be the same, estimating a mounted object positioning error, which is the positioning error in the communication-device-mounted object, from the deviation between the mounted object reference target position and the target detection position.
  • the mounted object reference target position is the position of a target detected by the communication-device-mounted object, and therefore includes the positioning error of the communication-device-mounted object. Accordingly, for targets determined to be the same, the mounted object positioning error can be estimated from the deviation between the target detection position, which is the relative position of the target with respect to the vehicle detected by its surroundings monitoring sensor, and the mounted object reference target position transmitted from the communication-device-mounted object.
  • this estimation is possible as long as the surroundings monitoring sensor of the vehicle detects the target detected by the communication-device-mounted object, even if the communication-device-mounted object itself cannot be detected by that sensor.
  • by using the mounted object positioning error, it is possible to correct the positioning error in the communication-device-mounted object and specify the position of the communication-device-mounted object with higher accuracy. As a result, it is possible to more accurately identify the position of the communication-device-mounted object outside the detection range of the surroundings monitoring sensor of the own vehicle.
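  • To make the relationship concrete, the following is a minimal Python sketch of this idea. All names, coordinates, and the flat XY frame are illustrative assumptions; the disclosure does not prescribe an implementation.

```python
# Minimal sketch (hypothetical names): positions are (x, y) tuples in meters,
# expressed in the own-vehicle coordinate system.

def estimate_mount_error(reference_pos, detected_pos):
    """Sender's positioning error = position it reported - position our sensor saw."""
    return (reference_pos[0] - detected_pos[0],
            reference_pos[1] - detected_pos[1])

def correct_position(reference_pos, mount_error):
    """Shift a reported position so that the estimated error is cancelled."""
    return (reference_pos[0] - mount_error[0],
            reference_pos[1] - mount_error[1])

# A target both sides detect: own sensor sees it at (12.0, 3.0), the sender
# reports it at (13.1, 3.4), so the sender's error is about (1.1, 0.4).
err = estimate_mount_error((13.1, 3.4), (12.0, 3.0))
# A non-line-of-sight target reported by the same sender at (40.2, -7.0):
print(correct_position((40.2, -7.0), err))  # -> approximately (39.1, -7.4)
```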
  • A diagram showing an example of a schematic configuration of a vehicle unit 2.
  • A diagram showing an example of a schematic configuration of a communication device 20.
  • A flowchart showing an example of the flow of positioning-error-estimation-related processing in the communication device 20.
  • A diagram for explaining an outline of a second embodiment.
  • A diagram illustrating an exemplary architecture of a V2X communication device.
  • A diagram showing the logical interfaces between the CA service and other layers.
  • A flowchart showing an example of the process of transmitting a CAM.
  • A diagram showing the logical interfaces between the CP service and other layers.
  • A functional block diagram of the CP service.
  • A diagram illustrating the FOC (or SIC) in a CPM.
  • A diagram illustrating the POC in a CPM.
  • A diagram for explaining CP services.
  • A flowchart showing an example of the process of transmitting a CPM.
  • the vehicle system 1 includes a vehicle unit 2 used in each of a plurality of vehicles.
  • a vehicle VEa, a vehicle VEb, and a vehicle VEc will be described as examples of vehicles.
  • Vehicle VEa, vehicle VEb, and vehicle VEc are assumed to be automobiles.
  • the vehicle unit 2 includes a communication device 20, a locator 30, a surroundings monitoring sensor 40, a surroundings monitoring ECU 50, and a driving support ECU 60, as shown in FIG.
  • the communication device 20, the locator 30, the perimeter monitoring ECU 50, and the driving support ECU 60 may be configured to be connected to an in-vehicle LAN (see LAN in FIG. 2).
  • the locator 30 is equipped with a GNSS (Global Navigation Satellite System) receiver.
  • a GNSS receiver receives positioning signals from multiple satellites.
  • the locator 30 uses the positioning signals received by the GNSS receiver to sequentially locate the position of the vehicle equipped with the locator 30 (hereinafter referred to as vehicle position).
  • the locator 30 determines a specified number of positioning satellites, such as four, to be used for the positioning calculation from among the positioning satellites from which positioning signals have been received. Then, a positioning operation is performed to calculate the coordinates of the vehicle position using the code pseudoranges, carrier phases, and the like for the determined specified number of positioning satellites.
  • the positioning operation itself may be a positioning operation using code pseudoranges or a positioning operation using carrier phases.
  • the coordinates should be at least latitude and longitude coordinates.
  • the locator 30 may be equipped with an inertial sensor.
  • the inertial sensor for example, a gyro sensor and an acceleration sensor may be used.
  • the locator 30 may sequentially locate the vehicle position by combining the positioning signal received by the GNSS receiver and the measurement result of the inertial sensor. Note that the vehicle position may be determined by using a traveling distance obtained from signals sequentially output from a vehicle speed sensor mounted on the own vehicle.
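  • As an illustration of such a combination, the following is a hedged sketch of blending a GNSS fix with dead reckoning from a yaw-rate sensor and a vehicle-speed sensor. The class name, the constant blend gain, and the complementary-filter approach are assumptions; the disclosure does not specify the fusion algorithm.

```python
import math

# Hypothetical locator: dead reckoning between GNSS fixes, then a simple
# complementary blend when a fix arrives (not the patent's algorithm).

class Locator:
    def __init__(self, x, y, heading_rad):
        self.x, self.y, self.heading = x, y, heading_rad

    def predict(self, speed_mps, yaw_rate_rps, dt):
        """Advance the pose using vehicle speed and yaw rate (dead reckoning)."""
        self.heading += yaw_rate_rps * dt
        self.x += speed_mps * dt * math.cos(self.heading)
        self.y += speed_mps * dt * math.sin(self.heading)

    def update_gnss(self, gnss_x, gnss_y, gain=0.5):
        """Pull the predicted position toward the GNSS fix."""
        self.x += gain * (gnss_x - self.x)
        self.y += gain * (gnss_y - self.y)

loc = Locator(0.0, 0.0, 0.0)
loc.predict(speed_mps=10.0, yaw_rate_rps=0.0, dt=0.1)  # 1 m straight ahead
loc.update_gnss(1.2, 0.1)                              # noisy GNSS fix arrives
print(round(loc.x, 2), round(loc.y, 2))                # -> 1.1 0.05
```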
  • the surroundings monitoring sensor 40 detects obstacles around the vehicle, such as moving objects (for example, pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles) and stationary objects (for example, objects fallen on the road, guardrails, curbs, and trees). The surroundings monitoring sensor 40 may be configured as a surroundings monitoring camera having an imaging range covering a predetermined area around the own vehicle, or may use millimeter wave radar, sonar, LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), or the like. These may also be used in combination.
  • the surroundings monitoring camera sequentially outputs captured images to the surroundings monitoring ECU 50 as sensing information. Sensors that transmit search waves such as sonar, millimeter wave radar, and LIDAR sequentially output scanning results based on received signals obtained when reflected waves reflected by obstacles are received to the perimeter monitoring ECU 50 as sensing information.
  • the perimeter monitoring ECU 50 includes a processor, memory, I/O, and a bus connecting these, and executes a control program stored in the memory to perform various processes related to recognition of targets in the vicinity of the vehicle.
  • the surroundings monitoring ECU 50 recognizes information about targets existing around the vehicle (hereinafter referred to as target information) from the sensing information output from the surroundings monitoring sensor 40 .
  • the target information may be the relative position, type, speed, moving azimuth, etc. of the target with respect to the own vehicle.
  • the relative position may be the distance and direction from the own vehicle. It is assumed that the position specifying accuracy in recognizing the position of the target using the perimeter monitoring sensor 40 is higher than the position specifying accuracy by positioning with the locator 30 .
  • the type of target can be recognized by image recognition processing, such as template matching, based on the image captured by the surroundings monitoring camera.
  • when a monocular camera is used as the surroundings monitoring camera, the distance of the target from the own vehicle and the direction of the target with respect to the own vehicle may be recognized from the installation position and optical-axis direction of the camera relative to the own vehicle and from the position of the target in the captured image.
  • when a compound-eye (stereo) camera is used, the distance of the target from the own vehicle may be recognized based on the amount of parallax between the pair of cameras.
  • the speed of the target may be recognized from the amount of change per unit time in the relative position of the target with respect to the own vehicle and from the vehicle speed of the own vehicle.
  • the moving direction may be recognized from the direction of change in the relative position of the target with respect to the own vehicle and from the vehicle position of the own vehicle.
  • the perimeter monitoring ECU 50 may use the sensing information of a sensor that transmits search waves to recognize the distance of the target from the own vehicle, the azimuth of the target with respect to the own vehicle, the speed of the target, and the moving direction. The distance from the own vehicle to the target may be recognized based on the time from transmission of the search wave until reception of the reflected wave. The azimuth of the target with respect to the own vehicle may be recognized based on the direction in which the search wave that produced the reflected wave was transmitted. The speed of the target may be recognized from the relative speed calculated based on the Doppler shift between the transmitted search wave and the reflected wave, and from the speed of the own vehicle.
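  • The quantities above follow directly from the wave's round trip and its Doppler shift. Below is a hedged sketch with illustrative constants; the function names and the 1-D sign convention are assumptions.

```python
# Search-wave sensor arithmetic (illustrative; radio-wave speed shown, use
# ~343 m/s for sonar in air).

C = 299_792_458.0  # propagation speed of the search wave (m/s)

def distance_from_round_trip(t_round_trip_s, wave_speed=C):
    """Range = half the out-and-back travel distance of the search wave."""
    return wave_speed * t_round_trip_s / 2.0

def closing_speed_from_doppler(f_tx_hz, f_rx_hz, wave_speed=C):
    """Relative (closing) speed from the two-way Doppler shift of the echo."""
    return wave_speed * (f_rx_hz - f_tx_hz) / (2.0 * f_tx_hz)

def target_speed_along_los(closing_speed, own_speed):
    """Target ground speed along the line of sight (1-D simplification)."""
    return own_speed - closing_speed

print(round(distance_from_round_trip(1e-6), 1))  # 1 microsecond echo -> 149.9 m
```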
  • the driving assistance ECU 60 includes a processor, memory, I/O, and a bus connecting these, and executes various processes related to driving assistance of the own vehicle by executing a control program stored in the memory.
  • the driving assistance ECU 60 performs driving assistance based on the position of the target specified by the perimeter monitoring ECU 50 and the communication device 20 .
  • the position of the target specified by the perimeter monitoring ECU 50 is the relative position of the target detected by the perimeter monitoring sensor 40 (hereinafter referred to as the line-of-sight target) with respect to the own vehicle.
  • the position of the target specified by the communication device 20 is the relative position of the target that cannot be detected by the perimeter monitoring sensor 40 (hereinafter referred to as non-line-of-sight target) with respect to the own vehicle.
  • Examples of driving assistance include vehicle control for avoiding approaching a target, alerting for avoiding approaching a target, and the like.
  • the communication device 20 includes a processor, memory, I/O, and a bus connecting these, and executes a control program stored in the memory to perform processing for wireless communication with the outside of the vehicle and for target position identification. Execution of the control program by the processor corresponds to execution of the method corresponding to the control program.
  • memory, as used herein, is a non-transitory tangible storage medium for the non-transitory storage of computer-readable programs and data. A non-transitory tangible storage medium is implemented by a semiconductor memory, a magnetic disk, or the like.
  • This communication device 20 corresponds to a vehicle device. Details of the processing in the communication device 20 will be described later.
  • an example of the schematic configuration of the communication device 20 will be described below.
  • as its functional blocks, the communication device 20 includes a detection information acquisition unit 201, a communication unit 202, a detection position specifying unit 203, an identity determination unit 204, a conversion unit 205, an error estimation unit 206, an error storage unit 207, a target position specifying unit 208, and a target storage unit 209. Part or all of the functions executed by the communication device 20 may be configured as hardware using one or more ICs or the like. Some or all of the functional blocks included in the communication device 20 may also be implemented by a combination of software executed by a processor and hardware members.
  • the detection information acquisition unit 201 acquires the target information recognized by the perimeter monitoring ECU 50. That is, it obtains the information of the targets detected by the perimeter monitoring sensor 40.
  • the target information includes at least the relative position of the target with respect to the own vehicle.
  • the target information is the relative position of the target consisting of the distance of the target from the own vehicle and the orientation of the target with respect to the own vehicle, the type of target, the speed of the target, and the moving direction of the target.
  • the relative position of the target may be coordinates based on the vehicle position of the own vehicle.
  • the communication unit 202 exchanges information with the outside of the vehicle by performing wireless communication with the outside of the vehicle.
  • the communication unit 202 performs wireless communication with the communication device 20 used in other vehicles. That is, the communication unit 202 performs inter-vehicle communication.
  • the communication unit 202 may be capable of wireless communication with a roadside device installed on the roadside. In other words, road-to-vehicle communication may be possible.
  • the communication unit 202 may indirectly exchange information with the communication devices 20 used in other vehicles through road-to-vehicle communication. In the following description, the communication unit 202 is assumed to perform inter-vehicle communication. In the present embodiment, vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEb, and between the vehicle VEa and the vehicle VEc.
  • the communication unit 202 includes a transmission unit 221 and a reception unit 222.
  • the transmission unit 221 transmits information to the outside of the own vehicle via wireless communication.
  • the transmission unit 221 transmits the target information acquired by the detection information acquisition unit 201 to the communication device 20 of the other vehicle via wireless communication.
  • the transmission unit 221 may transmit the target information with the vehicle position measured by the locator 30 of the own vehicle and identification information for identifying the own vehicle (hereinafter referred to as transmission source identification information) included in it.
  • the source identification information may be the vehicle ID of the own vehicle or the communication device ID of the communication device 20 of the own vehicle.
  • the target information transmitted from the transmission unit 221 includes the relative position of the target, the type of the target, the speed of the target, the moving direction of the target, the vehicle position of the transmission source vehicle, and the transmission source identification information.
  • the relative position of the target in the target information is a position based on the position of the own vehicle.
  • this relative position to be included in the target information will be referred to as a mounted object reference target position.
  • the receiving unit 222 receives information transmitted via wireless communication from the outside of the own vehicle.
  • the receiving unit 222 receives target information transmitted from the communication device 20 of another vehicle via wireless communication.
  • This other vehicle is assumed to be a vehicle using the vehicle unit 2 . Therefore, this other vehicle has a surrounding monitoring sensor, is capable of positioning using signals from positioning satellites, and corresponds to a communication device mounted object having a communication device 20 capable of wireless communication with the own vehicle.
  • the target information received by the receiving unit 222 includes the mounted object reference target position, target type, target speed, target moving azimuth, vehicle position of the transmission source vehicle, and source identification information.
  • This receiving section 222 corresponds to a communication information acquiring section.
  • the processing in this receiving unit 222 corresponds to the communication information acquisition step.
  • the detection position specifying unit 203 specifies, from the relative position in the target information acquired by the detection information acquisition unit 201, the target detection position, which is the relative position with respect to the own vehicle of the target detected by the surroundings monitoring sensor 40 of the own vehicle. As an example, this position may be expressed in XY coordinates whose origin is the vehicle position of the own vehicle measured by the locator 30. The processing in this detection position specifying unit 203 corresponds to the detection position specifying step.
  • the identity determination unit 204 determines whether or not the target detected by the surroundings monitoring sensor 40 of the own vehicle (hereinafter, the own-vehicle detected target) and the target detected by the surroundings monitoring sensor 40 of the other vehicle that transmitted the mounted object reference target position acquired by the receiving unit 222 (hereinafter, the other-vehicle detected target) are the same.
  • the processing in the identity determination unit 204 corresponds to the identity determination step.
  • the identity determination unit 204 determines whether the target detected by the own vehicle and the target detected by another vehicle are the same, depending on whether the target information acquired by the detection information acquisition unit 201 and the target information acquired by the reception unit 222 are similar.
  • the target information to be compared for determining whether or not they are the same may be, for example, the type of target, the speed of the target, and the direction of movement of the target.
  • the identity determination unit 204 may determine that the two targets are the same when all of the following conditions are satisfied: the types match, the speed difference is less than a threshold, and the moving-direction difference is less than a threshold. If even one of these conditions is not satisfied, it may be determined that they are not the same.
  • alternatively, the conditions to be satisfied for the identity determination unit 204 to determine identity may be only a subset of the types matching, the speed difference being less than a threshold, and the moving-direction difference being less than a threshold.
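  • As an illustration, the following sketch implements the comparison just described. The threshold values and dictionary layout are assumptions; the disclosure leaves them unspecified, and azimuth wraparound is ignored for brevity.

```python
# Identity determination sketch: type match plus speed/azimuth thresholds.

SPEED_THRESH_MPS = 1.0     # assumed threshold
AZIMUTH_THRESH_DEG = 10.0  # assumed threshold

def same_target(own, other,
                speed_thresh=SPEED_THRESH_MPS,
                azimuth_thresh=AZIMUTH_THRESH_DEG):
    """own/other: dicts with 'type', 'speed' (m/s), and 'azimuth' (deg)."""
    return (own["type"] == other["type"]
            and abs(own["speed"] - other["speed"]) < speed_thresh
            and abs(own["azimuth"] - other["azimuth"]) < azimuth_thresh)

own_obs = {"type": "pedestrian", "speed": 1.3, "azimuth": 82.0}
rx_obs  = {"type": "pedestrian", "speed": 1.1, "azimuth": 78.5}
print(same_target(own_obs, rx_obs))  # -> True
```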
  • the conversion unit 205 converts the mounted object reference target position in the target information acquired by the reception unit 222 into a relative position with respect to the own vehicle.
  • as an example, the coordinates may be converted into coordinates in an XY coordinate system (hereinafter referred to as the own vehicle coordinate system) whose origin is the vehicle position of the own vehicle positioned by the locator 30.
  • the processing in this conversion unit 205 corresponds to the conversion step.
  • the identity determination unit 204 may also use the target detection position specified by the detection position specifying unit 203 and the mounted object reference target position in the target information received by the receiving unit 222 to determine whether or not the own-vehicle detected target and the other-vehicle detected target are the same.
  • in this case, the conversion unit 205 converts the mounted object reference target position into a relative position with respect to the own vehicle, and identity is determined based on whether or not the positions approximate each other.
  • the identity determination unit 204 may determine whether or not the positions are the same under the condition that the difference in position after conversion is less than a threshold.
  • the identity determination unit 204 may set the condition that the difference in position after conversion is less than a threshold in addition to the conditions that the type matches, the speed difference is less than the threshold, and the movement direction difference is less than the threshold.
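  • The conversion and the position-approximation condition can be sketched as follows. The local flat-earth projection, the function names, and the 2 m threshold are assumptions made for illustration.

```python
import math

# The sender reports its own (GNSS-derived) position plus a target offset
# relative to itself; both are mapped into an XY frame whose origin is the
# own-vehicle position measured by locator 30.

def latlon_to_xy(latlon, origin_latlon):
    """Hypothetical local projection: degrees -> meters east/north of origin."""
    dlat = latlon[0] - origin_latlon[0]
    dlon = latlon[1] - origin_latlon[1]
    return (dlon * 111_320.0 * math.cos(math.radians(origin_latlon[0])),
            dlat * 111_320.0)

def reference_target_in_own_frame(sender_latlon, target_offset_xy, own_latlon):
    """Mounted object reference target position in the own vehicle coordinate system."""
    sx, sy = latlon_to_xy(sender_latlon, own_latlon)
    return (sx + target_offset_xy[0], sy + target_offset_xy[1])

def positions_approximate(p, q, thresh_m=2.0):  # assumed threshold
    return math.hypot(p[0] - q[0], p[1] - q[1]) < thresh_m

print(positions_approximate((12.0, 3.0), (13.1, 3.4)))  # -> True (about 1.17 m apart)
```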
  • for targets determined by the identity determination unit 204 to be the same, the error estimation unit 206 estimates the positioning error in the other vehicle (hereinafter, the mounted object positioning error) with reference to the position of the own vehicle, based on the deviation between the mounted object reference target position converted by the conversion unit 205 and the target detection position specified by the detection position specifying unit 203.
  • the processing in this error estimating section 206 corresponds to the error estimating step.
  • the mounted object positioning error may be the deviation of each of the X and Y coordinates of the own vehicle coordinate system.
  • the error estimation unit 206 stores the estimated mounted object positioning error in the error storage unit 207 .
  • the error storage unit 207 may be a non-volatile memory or a volatile memory.
  • the error estimating unit 206 may associate the estimated mounted object positioning error with the transmission source identification information of the other vehicle whose mounted object positioning error is estimated, and store them in the error storage unit 207 .
  • as the transmission source identification information of the other vehicle, the transmission source identification information included in the target information received by the receiving unit 222 may be used.
  • even for a target that cannot be detected by the perimeter monitoring sensor 40 of the own vehicle, the target position specifying unit 208 specifies the position of the target relative to the own vehicle by correcting the mounted object reference target position acquired by the receiving unit 222 by the amount of the mounted object positioning error. The correction may be performed by shifting the mounted object reference target position so as to cancel out the deviation corresponding to the mounted object positioning error.
  • specifically, when the identity determination unit 204 determines that the targets are not the same and a mounted object positioning error is already stored in the error storage unit 207 for the other vehicle, the target position specifying unit 208 specifies the position of the target with respect to the own vehicle by correcting the mounted object reference target position acquired by the receiving unit 222 from that other vehicle by the amount of the mounted object positioning error. Therefore, even from a mounted object reference target position acquired by the receiving unit 222 from another vehicle via wireless communication, it is possible to specify the relative position with respect to the own vehicle with higher accuracy.
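  • A sketch of this lookup-and-correct step is shown below; the dictionary keyed by transmission source identification information and all names are illustrative assumptions (the values continue the earlier sketch).

```python
# Target position specifying unit sketch: correct a non-line-of-sight target
# by the sender's stored positioning error.

error_storage = {}  # transmission source ID -> (ex, ey) mounted object error

def specify_nlos_target(source_id, reference_pos_own_frame):
    """Return the corrected position, or None if no error is stored (NO in S9)."""
    if source_id not in error_storage:
        return None
    ex, ey = error_storage[source_id]
    return (reference_pos_own_frame[0] - ex,
            reference_pos_own_frame[1] - ey)

error_storage["VEb"] = (1.1, 0.4)                # estimated earlier for vehicle VEb
print(specify_nlos_target("VEb", (40.2, -7.0)))  # -> approximately (39.1, -7.4)
```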
  • whether or not a mounted object positioning error is already stored in the error storage unit 207 may be determined by whether or not transmission source identification information linked to a mounted object positioning error exists in the error storage unit 207.
  • the target position specifying unit 208 stores the specified position of the target relative to the own vehicle in the target storage unit 209 . That is, the target storage unit 209 stores the positions of non-line-of-sight targets that cannot be detected by the perimeter monitoring sensor 40 of the own vehicle.
  • the target storage unit 209 may be a volatile memory.
  • the position of the line-of-sight target that can be detected by the vehicle surroundings monitoring sensor 40 may also be stored in the target storage unit 209 .
  • the position of the line-of-sight target may be stored in the memory of the perimeter monitoring ECU 50 .
  • the position of the target within the detection range of the surroundings monitoring sensor 40 is detected by the surroundings monitoring sensor 40 of the own vehicle, so no correction due to mounted object positioning error is performed.
  • the driving support ECU 60 performs driving assistance, such as vehicle control for avoiding approach to a target and alerts for avoiding approach to a target, using the positions of targets stored in the target storage unit 209. Therefore, even for a target beyond the line of sight of the own vehicle, more accurate driving assistance can be performed by using the relative position with respect to the own vehicle that is specified with higher accuracy.
  • <Positioning Error Estimation Related Processing in Communication Device 20 (FIG. 4)>
  • Execution of this process means execution of the error estimation method.
  • the flowchart of FIG. 4 may be configured to be started, for example, when a switch for starting the internal combustion engine or motor generator of the own vehicle (hereinafter referred to as the power switch) is turned on.
  • first, in step S1, when the receiving unit 222 receives target information transmitted from the communication device 20 of another vehicle via wireless communication (YES in S1), the process proceeds to step S2. On the other hand, if the target information has not been received (NO in S1), the process proceeds to step S12.
  • step S2 the detection information acquisition unit 201 acquires target information recognized by the perimeter monitoring ECU 50 of the own vehicle.
  • in step S3, the detection position specifying unit 203 specifies the target detection position, which is the relative position of the target with respect to the own vehicle, from the relative position included in the target information acquired in S2.
  • in step S4, the identity determination unit 204 determines whether or not the own-vehicle detected target and the other-vehicle detected target are the same. If it is determined in step S5 that they are the same (YES in S5), the process proceeds to step S6. On the other hand, if it is determined that they are not the same (NO in S5), the process moves to step S9.
  • step S6 the conversion unit 205 converts the mounted object reference target position in the target information received and acquired in S1 into a relative position with respect to the own vehicle.
  • the processing of S6 may be configured to be performed before the processing of S4. In that case, in S4, whether or not the own-vehicle detected target and the other-vehicle detected target are the same may be determined using the target detection position specified in S3 and the mounted object reference target position converted in S6.
  • in step S7, for the targets determined to be the same in S5, the error estimation unit 206 estimates the mounted object positioning error, which is the positioning error of the other vehicle with reference to the position of the own vehicle, from the deviation between the mounted object reference target position converted in S6 and the target detection position specified in S3.
  • in step S8, the mounted object positioning error estimated in S7 is stored in the error storage unit 207 in association with the transmission source identification information of the other vehicle whose mounted object positioning error was estimated.
  • step S9 if the mounted object positioning error has already been stored in the error storage unit 207 for the other vehicle that is the transmission source of the mounted object reference target position (YES in S9), the process proceeds to step S10. On the other hand, if the mounted object positioning error has not been stored in the error storage unit 207 (NO in S9), the process proceeds to step S12.
  • in step S10, the target position specifying unit 208 corrects the mounted object reference target position acquired in S1 by the mounted object positioning error of the other vehicle that transmitted it, and thereby specifies, with respect to the own vehicle, the position of the target determined not to be the same.
  • correction may be performed after the conversion unit 205 converts the mounted object reference target position.
  • correction may be performed before the conversion unit 205 converts the mounted object reference target position.
  • in step S11, the position of the non-line-of-sight target specified in S10 is stored in the target storage unit 209.
  • in step S12, if it is time to end the positioning-error-estimation-related processing (YES in S12), the processing ends. On the other hand, if it is not the end timing of the processing (NO in S12), the process returns to S1 and repeats.
  • An example of the termination timing of the positioning error estimation-related processing is when the power switch is turned off.
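  • Tying the steps together, the following compact sketch mirrors the S1 to S12 flow. It reuses the hypothetical helpers from the earlier sketches (same_target, reference_target_in_own_frame, estimate_mount_error, specify_nlos_target, error_storage); the message and target dictionaries are assumed layouts.

```python
# One pass of the S1-S12 loop body; the real process repeats until the power
# switch is turned off (S12).

target_storage = {}  # non-line-of-sight target positions (target storage unit 209)

def error_estimation_cycle(msg, own_targets, own_latlon):
    if msg is None:                                    # NO in S1
        return
    ref_pos = reference_target_in_own_frame(           # S6: convert to own frame
        msg["sender_latlon"], msg["target_offset"], own_latlon)
    match = next((t for t in own_targets               # S4/S5: identity check
                  if same_target(t, msg["target_info"])), None)
    if match is not None:                              # S7/S8: estimate and store
        error_storage[msg["source_id"]] = estimate_mount_error(
            ref_pos, match["position"])
    else:                                              # S9/S10: correct if possible
        pos = specify_nlos_target(msg["source_id"], ref_pos)
        if pos is not None:
            target_storage[msg["source_id"]] = pos     # S11: store NLOS position
```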
  • as described above, the mounted object reference target position is based on the position of the other vehicle obtained by positioning using positioning-satellite signals in the other vehicle, which is the communication-device-mounted object, and therefore includes the positioning error of the other vehicle. Accordingly, for a target determined by the identity determination unit 204 to be the same, the mounted object positioning error can be estimated from the deviation between the target detection position, which is the relative position with respect to the own vehicle of the target detected by the surroundings monitoring sensor 40 of the own vehicle, and the position obtained by converting the mounted object reference target position of the target detected by the surroundings monitoring sensor 40 of the other vehicle, which is based on the position of the other vehicle, into a relative position with respect to the own vehicle.
  • this estimation is possible as long as the surroundings monitoring sensor 40 of the own vehicle detects the target detected by the surroundings monitoring sensor 40 of the other vehicle, even if the other vehicle itself cannot be detected by the surroundings monitoring sensor 40 of the own vehicle. Further, by using the mounted object positioning error, it is possible to correct the positioning error of the other vehicle and specify the position of the other vehicle with higher accuracy. As a result, it is possible to more accurately identify the position of the communication-device-mounted object outside the detection range of the perimeter monitoring sensor 40 of the own vehicle.
  • also, by correcting the mounted object reference target position of a target acquired from the other vehicle via wireless communication by the amount of the mounted object positioning error, it is possible to specify its position with respect to the own vehicle with higher accuracy.
  • the first embodiment will be further described using FIG. 1. LMa, LMb, and LMc in FIG. 1 are targets such as pedestrians. These targets are assumed not to be communication-device-mounted objects.
  • the target LMa can be detected by all the perimeter monitoring sensors 40 of the vehicles VEa, VEb, and VEc.
  • the target LMb can be detected by the surroundings monitoring sensor 40 of the vehicle VEb, but cannot be detected by the surroundings monitoring sensor 40 of the vehicle VEa.
  • the target LMc can be detected by the surroundings monitoring sensor 40 of the vehicle VEc, but cannot be detected by the surroundings monitoring sensor 40 of the vehicle VEa.
  • the vehicle VEb and the vehicle VEc cannot be detected by the surrounding monitoring sensor 40 of the vehicle VEa.
  • assume that the vehicle VEa is the own vehicle and the vehicles VEb and VEc are the other vehicles.
  • the mounted object positioning error in the vehicle VEb can be estimated from the deviation between the target detection position of the target LMa detected by the vehicle VEa and the position obtained by converting the mounted object reference target position of the target LMa, received from the vehicle VEb, into a position relative to the own vehicle.
  • the mounted object positioning error of the vehicle VEc can be similarly estimated.
  • the position of the target LMb with respect to the own vehicle VEa can be specified with high accuracy from the mounted object reference target position of the target LMb, which is detected by the surroundings monitoring sensor 40 of the vehicle VEb and acquired from the vehicle VEb via wireless communication, and the mounted object positioning error in the vehicle VEb.
  • similarly, the position of the target LMc with respect to the own vehicle VEa can be accurately identified from the mounted object reference target position of the target LMc, which is detected by the surroundings monitoring sensor 40 of the vehicle VEc and acquired from the vehicle VEc via wireless communication, and the mounted object positioning error in the vehicle VEc.
  • next, an overview of the second embodiment will be described.
  • the vehicle unit 2 sequentially transmits CAM (Cooperative Awareness Messages) and CPM (Cooperative Perception Messages).
  • the roadside device 3 also sequentially transmits the CPM.
  • the communication device 20 provided in the vehicle unit 2 uses the information contained in those messages. Then, similar to the first embodiment, identity determination, error estimation, and the like are performed. CAM and CPM will be described before describing processes such as identity determination and error estimation in the second embodiment.
  • FIG. 6 is a diagram showing an exemplary architecture of a V2X communication device.
  • a V2X communication device is used instead of the communication device 20 of the first embodiment.
  • the V2X communication device also has the configuration shown in FIG. 3, like the communication device 20 of the first embodiment.
  • a V2X communication device is a communication device that transmits target information.
  • the V2X communication device may perform communication between vehicles, vehicles and infrastructure, vehicles and bicycles, vehicles and mobile terminals, and the like.
  • the V2X communication device may correspond to an onboard device of a vehicle or may be included in the onboard device.
  • the onboard equipment is sometimes called an OBU (On-Board Unit).
  • the communication device may correspond to the infrastructure roadside unit or may be included in the roadside unit.
  • a roadside unit is sometimes called an RSU (Road Side Unit).
  • the communication device can also be one element that constitutes an ITS (Intelligent Transport System). If it is an element of the ITS, the communication device may correspond to an ITS station (ITS-S) or be included in the ITS-S.
  • ITS-S is a device for information exchange, and may be any of OBU, RSU, and mobile terminal, or may be included therein.
  • the mobile terminal is, for example, a PDA (Personal Digital Assistant) or a smart phone.
  • the communication device may correspond to a WAVE (Wireless Access in Vehicular Environments) device disclosed in IEEE 1609, or may be included in a WAVE device.
  • the following description assumes a V2X communication device mounted on a vehicle.
  • This V2X communication device has a function of providing CA (Cooperative Awareness) service and CP (Collective Perception) service.
  • in the CA service, the V2X communication device transmits CAMs.
  • in the CP service, the V2X communication device transmits CPMs. It should be noted that the same or similar methods disclosed below can be applied even if the communication device is an RSU or a mobile terminal.
  • the architecture shown in FIG. 6 is based on the ITS-S reference architecture according to EU standards.
  • the architecture shown in FIG. 6 has an application layer 110 , a facility layer 120 , a network & transport layer 140 , an access layer 130 , a management layer 150 and a security layer 160 .
  • the application layer 110 implements or supports various applications 111 .
  • FIG. 6 shows, as examples of the applications 111, a traffic safety application 111a, an efficient traffic information application 111b, and other applications 111c.
  • the facility layer 120 supports the execution of various use cases defined in the application layer 110.
  • the facility layer 120 can support the same or similar functionality as the top three layers (application layer, presentation layer and session layer) in the OSI reference model.
  • Facility means providing functions, information and data.
  • Facility layer 120 may provide the functionality of a V2X communication device.
  • facility layer 120 may provide the functions of application support 121, information support 122, and communication support 123 shown in FIG.
  • the application support 121 has functions that support basic application sets or message sets.
  • An example of a message is a V2X message.
  • V2X messages can include periodic messages such as CAMs and event messages such as DENMs (Decentralized Environmental Notification Messages).
  • Facility layer 120 may also support CPM.
  • the information support 122 has the function of providing common data or databases used for the basic application set or message set.
  • a database is the Local Dynamic Map (LDM).
  • the communication support 123 has functions for providing services for communication and session management.
  • Communication support 123 provides, for example, address mode and session support.
  • facility layer 120 supports a set of applications or a set of messages. That is, the facility layer 120 generates message sets or messages based on the information that the application layer 110 should send and the services it should provide. Messages generated in this way are sometimes referred to as V2X messages.
  • the access layer 130 includes an external IF (InterFace) 131 and an internal IF 132, and can transmit messages/data received by the upper layers via physical channels.
  • access stratum 130 may conduct or support data communication according to the following communication techniques.
  • the communication technology is, for example, a communication technology based on the IEEE 802.11 and/or 802.11p standard, an ITS-G5 wireless communication technology based on the physical transmission technology of the IEEE 802.11 and/or 802.11p standard, a satellite/broadband wireless mobile communication including 2G/3G/4G (LTE)/5G wireless mobile communication technology, broadband terrestrial digital broadcasting technology such as DVB-T/T2/ATC, GNSS communication technology, and WAVE communication technology.
  • the network & transport layer 140 can configure a vehicle communication network between homogeneous/heterogeneous networks using various transport protocols and network protocols.
  • the transport layer is the connecting layer between the upper and lower layers. Upper layers include a session layer, a presentation layer, and an application layer 110 .
  • the lower layers include the network layer, data link layer, and physical layer.
  • the transport layer can manage transmitted data to arrive at its destination correctly. At the source, the transport layer processes the data into appropriately sized packets for efficient data transmission. On the receiving side, the transport layer takes care of restoring the received packets to the original file.
  • Transport protocols are, for example, TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and BTP (Basic Transport Protocol).
  • the network layer can manage logical addresses.
  • the network layer may also determine the routing of packets.
  • the network layer may receive packets generated by the transport layer and add the logical address of the destination to the network layer header. Packet transmission routes may be unicast/multicast/broadcast between vehicles, between vehicles and fixed stations, and between fixed stations.
  • for the network layer, geo-networking, IPv6 networking with mobility support, or IPv6 over geo-networking may be considered.
  • the V2X communication device architecture may further include a management layer 150 and a security layer 160 .
  • Management layer 150 manages the communication and interaction of data between layers.
  • the management layer 150 comprises a management information base 151 , regulation management 152 , interlayer management 153 , station management 154 and application management 155 .
  • Security layer 160 manages security for all layers.
  • Security layer 160 comprises firewall and intrusion detection management 161 , authentication, authorization and profile management 162 and security management information base 163 .
  • V2X messages may also be referred to as ITS messages.
  • V2X messages can be generated at application layer 110 or facility layer 120 . Examples of V2X messages are CAM, DENM, CPM.
  • the transport layer in the network & transport layer 140 generates BTP packets.
  • a network layer at network & transport layer 140 may encapsulate the BTP packets to produce geo-networking packets.
  • Geo-networking packets are encapsulated in LLC (Logical Link Control) packets.
  • the data may include message sets.
  • a message set is, for example, basic safety messages.
  • BTP is a protocol for transmitting V2X messages generated in the facility layer 120 to lower layers.
  • the BTP header has an A type and a B type.
  • the A-type BTP header may contain the destination port and source port required for transmission and reception in bidirectional packet transmission.
  • the B-type BTP header can include the destination port and destination port information required for transmission in non-bidirectional packet transmission.
  • the destination port identifies the facility entity corresponding to the destination of the data contained in the BTP packet (BTP-PDU).
  • a BTP-PDU is unit transmission data in BTP.
  • the source port is a field generated for the BTP-A type.
  • the source port indicates the port of the facility layer 120 protocol entity at the source of the corresponding packet. This field can have a size of 16 bits.
  • Destination port information is a field generated for the BTP-B type. Provides additional information if the destination port is a well-known port. This field can have a size of 16 bits.
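  • The two header layouts can be illustrated as follows; the byte layout (two 16-bit fields) follows the description above, while the helper names and the example port are assumptions.

```python
import struct

# Illustrative BTP header encoders: each header is two 16-bit fields.

def btp_a_header(destination_port: int, source_port: int) -> bytes:
    """BTP-A: bidirectional (interactive) packet transport."""
    return struct.pack("!HH", destination_port, source_port)

def btp_b_header(destination_port: int, destination_port_info: int) -> bytes:
    """BTP-B: non-bidirectional packet transport."""
    return struct.pack("!HH", destination_port, destination_port_info)

# Port 2001 is commonly used as the well-known destination port for CAM
# (an assumption here); 0 means no additional destination port info.
print(btp_b_header(2001, 0).hex())  # -> '07d10000'
```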
  • a geo-networking packet includes a basic header and a common header according to the network layer protocol, and optionally includes an extension header according to the geo-networking mode.
  • An LLC packet is a geo-networking packet with an LLC header added.
  • the LLC header provides the ability to differentiate and transmit IP data and geo-networking data.
  • IP data and geo-networking data can be distinguished by SNAP (Subnetwork Access Protocol) Ethertype.
  • when IP data is transmitted, the Ethertype is set to 0x86DD and included in the LLC header. When geo-networking data is transmitted, the Ethertype is set to 0x86DC and included in the LLC header. The receiver checks the Ethertype field in the LLC packet header and forwards the packet to the IP data path or the geo-networking path, and processes it, depending on the value of that field.
  • the LLC header contains DSAP (Destination Service Access Point) and SSAP (Source Service Access Point).
  • SSAP is followed by a control field (Control in FIG. 7), protocol ID, and ethertype.
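  • The receiver-side demultiplexing just described amounts to a branch on the Ethertype; the function and path names below are illustrative.

```python
# Dispatch a received LLC payload by Ethertype (0x86DD: IPv6, 0x86DC: geo-networking).

ETHERTYPE_IPV6 = 0x86DD
ETHERTYPE_GEONETWORKING = 0x86DC

def dispatch(ethertype: int, payload: bytes) -> str:
    if ethertype == ETHERTYPE_IPV6:
        return "ip_path"             # hand the payload to the IPv6 stack
    if ethertype == ETHERTYPE_GEONETWORKING:
        return "geonetworking_path"  # hand the payload to the GN/BTP stack
    return "drop"                    # unknown payload type

print(dispatch(0x86DC, b"geo-networking packet"))  # -> geonetworking_path
```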
  • FIG. 8 shows the logical interfaces between the CA (Cooperative Awareness) service 128 and other layers in the V2X communication device architecture.
  • V2X communication devices may provide various services for traffic safety and efficiency.
  • One of the services may be CA service 128 .
  • cooperative awareness in road traffic means that road users and roadside infrastructure can know each other's location, dynamics, and attributes.
  • road users are all users on and around roads for which traffic safety and control are relevant, such as automobiles, trucks, motorcycles, cyclists, and pedestrians.
  • roadside infrastructure refers to equipment such as road signs, traffic lights, barriers, and entrances.
  • this information is exchanged over wireless networks through V2V (vehicle-to-vehicle), V2I (vehicle-to-infrastructure), I2V (infrastructure-to-vehicle), and X2X communication.
  • V2X communication devices can provide situational awareness through their sensors and communication with other V2X communication devices.
  • the CA service can specify how the V2X communication device communicates its location, behavior and attributes by sending CAMs.
  • the CA service 128 may be an entity of the facility layer 120.
  • CA service 128 may be part of the application support domain of facility layer 120 .
  • the CA service 128 can apply the CAM protocol and provide two services of CAM transmission and reception.
  • CA service 128 may also be referred to as a CAM basic service.
  • the source ITS-S configures the CAM.
  • the CAM is sent to network & transport layer 140 for transmission.
  • the CAM may be sent directly from the source ITS-S to all ITS-S within range.
  • the communication range is varied by changing the transmission power of the source ITS-S.
  • CAMs are generated periodically at a frequency controlled by the CA service 128 of the originating ITS-S. The frequency of generation is determined by considering the state change of the source ITS-S. Conditions to consider are, for example, changes in the position or velocity of the source ITS-S, loading of the radio channel.
  • upon receiving a CAM, the CA service 128 provides the contents of the CAM to entities such as the application 111 and/or the LDM 127.
  • CA service 128 interfaces with entities of facility layer 120 and application layer 110 to collect relevant information for CAM generation and to further process received CAM data.
  • An example of an entity for ITS-S data collection in a vehicle is a VDP (Vehicle Data Provider) 125, a POTI (position and time) unit 126, and an LDM 127.
  • the VDP 125 is connected to the vehicle network and provides vehicle status information.
  • the POTI unit 126 provides ITS-S position and time information.
  • the LDM 127 is a database within ITS-S, as described in ETSI TR 102 863, which can be updated with received CAM data.
  • Application 111 can obtain information from LDM 127 and perform further processing.
  • the CA service 128 connects with the network & transport layer 140 via NF-SAP (Network & Transport/Facilities Service Access Point) in order to exchange CAMs with other ITS-S.
  • the CA service 128 interfaces with the security layer 160 via the SF-SAP (Security Facilities Service Access Point) for CAM transmission and CAM reception security services.
  • the CA service 128 connects with the management layer 150 via the MF-SAP (Management/Facilities Service Access Point) and, when it directly provides received CAM data to the application 111, connects with the application layer 110 via the FA-SAP (Facilities/Applications Service Access Point).
  • FIG. 9 shows the functional blocks of the CA service 128 and interfaces to other functions and layers.
  • CA service 128 provides four sub-functions of CAM encoding section 1281 , CAM decoding section 1282 , CAM transmission management section 1283 and CAM reception management section 1284 .
  • the CAM encoding unit 1281 constructs and encodes a CAM according to a predefined format.
  • CAM decoding section 1282 decodes the received CAM.
  • the CAM transmission manager 1283 executes the protocol operation of the source ITS-S to transmit the CAM.
  • the CAM transmission management unit 1283 executes, for example, activation and termination of CAM transmission operation, determination of CAM generation frequency, and triggering of CAM generation.
  • the CAM reception manager 1284 performs protocol operations specified in the receiving ITS-S to receive the CAM.
  • the CAM reception management unit 1284 for example, activates the CAM decoding function when receiving CAM.
  • the CAM reception management unit 1284 provides the received CAM data to the application 111 of the ITS-S on the reception side.
  • the CAM reception management unit 1284 may perform information check of the received CAM.
  • as shown in FIG. 9, the CA service 128 provides received CAM data through the interface IF.CAM.
  • the interface IF.CAM is an interface to the LDM 127 or the application 111.
  • the interface to the application layer 110 may be implemented as an API (Application Programming Interface), and data may be exchanged between the CA service 128 and the application 111 via this API.
  • the interface to application layer 110 can also be implemented as FA-SAP.
  • the CA Service 128 interacts with other entities of the Facility Layer 120 to obtain the data necessary for CAM generation.
  • the set of entities that provide data for the CAM is referred to as the data providing entities or data providing facilities.
  • data is exchanged between the data providing entities and the CA service 128 via the interface IF.FAC.
  • the CA service 128 exchanges information with the network & transport layer 140 via the interface IF.N&T. At the source ITS-S, the CA service 128 provides the CAM to the network & transport layer 140 along with protocol control information (PCI). The PCI is control information according to ETSI EN 302 636-5-1. The CAM is embedded in a service data unit (FL-SDU) of the facility layer 120.
  • the network & transport layer 140 can provide the received CAM to the CA service 128.
  • the interface between the CA service 128 and the network & transport layer 140 relies on the services of the geo-networking/BTP stack, the IPv6 stack, or the combined IPv6/geo-networking stack.
  • CAM may rely on services provided by the GeoNetworking (GN)/BTP stack.
  • when the GN (GeoNetworking)/BTP stack is used, the GN packet transport type for CAM is SHB (Single Hop Broadcast).
  • the PCI passed from CA service 128 to the geo-networking/BTP stack may include BTP type, destination port, destination port information, GN packet transmission type, GN communication profile, GN security profile. This PCI may also include GN traffic class, GN maximum packet lifetime.
  • the CAM can also be transmitted using an IPv6 stack or a combined IPv6/geo-networking stack, as specified in ETSI TS 102 636-3. If the combined IPv6/geo-networking stack is used for CAM transmission, the interface between the CA service 128 and the combined stack may be the same as the interface between the CA service 128 and the IPv6 stack.
  • the CA service 128 can exchange primitives with management entities of the management layer 150 via the MF-SAP. Primitives are elements of information that are exchanged, such as instructions.
  • the sender ITS-S obtains the setting information of T_GenCam_DCC from the management entity via the interface IF.Mng.
  • the CA service 128 can exchange primitives with the ITS-S security entity via the interface IF.Sec, that is, via the SF-SAP.
  • Point-to-multipoint communication may be used for CAM transmission.
  • a CAM is sent in a single hop from a source ITS-S only to the receiving ITS-S located within the direct communication range of the source ITS-S. Since it is a single hop, the receiving ITS-S does not forward the received CAM.
  • the initiation of the CA service 128 may be different for different types of ITS-S, such as vehicle ITS-S, roadside ITS-S, and personal ITS-S. As long as CA service 128 is active, CAM generation is managed by CA service 128 .
  • the CA service 128 is activated at the same time as the ITS-S is activated and terminated when the ITS-S is deactivated.
  • the frequency of CAM generation is managed by the CA service 128.
  • the CAM generation interval is the time between two consecutive CAM generations.
  • the CAM generation interval is set so as not to fall below the minimum value T_GenCamMin.
  • the minimum value T_GenCamMin of the CAM generation interval is 100ms. If the CAM generation interval is 100 ms, the CAM generation frequency will be 10 Hz.
  • the CAM generation interval is set so as not to exceed the maximum value T_GenCamMax.
  • the maximum value T_GenCamMax of the CAM generation interval is 1000ms. If the CAM generation interval is 1000 ms, the CAM generation frequency will be 1 Hz.
  • the CA service 128 triggers CAM generation according to the dynamics of the source ITS-S and the congestion state of the channel. If the dynamics of the source ITS-S shortens the CAM generation interval, this interval is maintained for consecutive CAMs.
  • the CA service 128 repeatedly checks the conditions that trigger CAM generation every T_CheckCamGen. T_CheckCamGen is less than or equal to T_GenCamMin.
  • the parameter T_GenCam_DCC provides the minimum generation interval between two consecutive CAMs, reducing CAM generation according to the channel usage requirements for distributed congestion control (DCC) specified in ETSI TS 102 724. This facilitates adjustment of the CAM generation frequency according to the remaining capacity of the radio channel during channel congestion.
  • DCC: Distributed Congestion Control
  • T_GenCam_DCC is provided by the management entity in milliseconds.
  • the range of values for T_GenCam_DCC is T_GenCamMin ⁇ T_GenCam_DCC ⁇ T_GenCamMax.
  • If the management entity provides T_GenCam_DCC with a value greater than T_GenCamMax, T_GenCam_DCC is set to T_GenCamMax. If the management entity provides a value less than or equal to T_GenCamMin, or if no value is provided, T_GenCam_DCC is set to T_GenCamMin. A sketch of this clamping is given below.
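The clamping rule above can be summarized in a short Python sketch; the function name and the millisecond representation are assumptions for illustration, not part of the standard.

```python
def clamp_t_gencam_dcc(provided_ms, t_gencam_min_ms=100, t_gencam_max_ms=1000):
    """Clamp the DCC-provided CAM generation interval to the valid range.

    Values above T_GenCamMax fall back to T_GenCamMax; a missing value or
    a value at or below T_GenCamMin falls back to T_GenCamMin, as described
    in the text above.
    """
    if provided_ms is None or provided_ms <= t_gencam_min_ms:
        return t_gencam_min_ms
    return min(provided_ms, t_gencam_max_ms)
```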
  • In LTE-V2X, DCC and T_GenCam_DCC do not apply.
  • In LTE-V2X, the access layer 130 manages channel congestion control.
  • T_GenCam is the upper limit of the currently valid CAM generation interval. The default value of T_GenCam is T_GenCamMax. When a CAM is generated according to condition 1 below, T_GenCam is set to the time elapsed since the last CAM generation. After N_GenCam consecutive CAMs are generated according to condition 2, T_GenCam is set back to T_GenCamMax.
  • N_GenCam can be dynamically adjusted according to some environmental conditions. For example, when approaching an intersection, N_GenCam can be increased to increase the frequency of CAM reception. Note that the default and maximum values of N_GenCam are 3.
  • Condition 1 is that the elapsed time since the previous CAM generation is T_GenCam_DCC or more, and at least one of the following ITS-S dynamics-related conditions is satisfied.
  • The first dynamics-related condition is that the absolute value of the difference between the current heading of the source ITS-S and the heading contained in the CAM previously transmitted by the source ITS-S is 4 degrees or more.
  • the second is that the distance between the current location of the source ITS-S and the location included in the CAM previously transmitted by the source ITS-S is 4 m or more.
  • Condition 2 is that the elapsed time since the previous CAM generation is T_GenCam or more and, in the case of ITS-G5, also T_GenCam_DCC or more.
  • When condition 1 or condition 2 is satisfied, the CA service 128 immediately generates a CAM.
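As an illustration of conditions 1 and 2, a minimal Python sketch of the trigger check is given below. The record fields (time_ms, heading_deg, pos) are illustrative assumptions; the thresholds follow the text above, which lists heading change and position change as the dynamics-related conditions.

```python
import math

def should_generate_cam(now_ms, last_cam, current,
                        t_gencam_dcc_ms, t_gencam_ms, its_g5=True):
    """Return True when condition 1 or condition 2 above is satisfied."""
    elapsed_ms = now_ms - last_cam.time_ms

    # Condition 1: T_GenCam_DCC has elapsed and the station dynamics changed.
    if elapsed_ms >= t_gencam_dcc_ms:
        heading_changed = abs(current.heading_deg - last_cam.heading_deg) >= 4.0
        position_changed = math.dist(current.pos, last_cam.pos) >= 4.0
        if heading_changed or position_changed:
            return True

    # Condition 2: T_GenCam has elapsed (for ITS-G5, T_GenCam_DCC as well).
    if elapsed_ms >= t_gencam_ms and (not its_g5 or elapsed_ms >= t_gencam_dcc_ms):
        return True
    return False
```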
  • When the CA service 128 generates a CAM, it generates the mandatory container.
  • the mandatory container mainly contains highly dynamic information of the source ITS-S. Highly dynamic information is included in the basic container and the high frequency container.
  • the CAM can contain optional data.
  • Optional data is mainly non-dynamic source ITS-S status and information for specific types of source ITS-S. The status of non-dynamic source ITS-S is contained in the low frequency container, and information for specific types of source ITS-S is contained in the special vehicle container.
  • the low frequency container is included in the first CAM after the CA service 128 is activated. After that, the low frequency container is included in a CAM if 500 ms or more have elapsed since the generation of the last CAM that contained a low frequency container.
  • the special vehicle container is also included in the first CAM generated after the CA service 128 is activated. After that, the special vehicle container is included in a CAM if 500 ms or more have elapsed since the last CAM that contained a special vehicle container. A sketch of this inclusion rule follows below.
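The 500 ms inclusion rule for the LF container and the special vehicle container can be sketched as follows; the function and its arguments are hypothetical.

```python
def include_optional_container(now_ms, last_included_ms, period_ms=500):
    """Include the container in the first CAM after activation
    (last_included_ms is None) and afterwards whenever at least
    period_ms have elapsed since the last CAM that carried it."""
    return last_included_ms is None or (now_ms - last_included_ms) >= period_ms
```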
  • the CAM generation frequency of the roadside ITS-S is also defined by the time interval between two consecutive CAM generations.
  • the roadside ITS-S must be configured such that at least one CAM is transmitted while the vehicle is within the coverage of the roadside ITS-S.
  • the CAM generation interval of the roadside ITS-S must be 1000 ms or longer. With a generation interval of 1000 ms, the maximum CAM generation frequency is 1 Hz.
  • the probability that a passing vehicle receives a CAM from an RSU depends on the frequency of CAM occurrence and the time the vehicle is within the communication area. This time depends on the vehicle speed and the transmission power of the RSU.
  • each CAM is time-stamped.
  • time synchronization is established between different ITS-S.
  • the time required for CAM generation is 50 ms or less.
  • the time required for CAM generation is the difference between the time at which CAM generation is started and the time at which the CAM is delivered to the network & transport layer 140.
  • the time stamp shown in the CAM of the vehicle ITS-S corresponds to the time when the reference position of the source ITS-S shown in this CAM was determined.
  • the timestamp shown on the roadside ITS-S CAM is the time the CAM was generated.
  • Certificates may be used to authenticate messages transmitted between ITS-S.
  • a certificate indicates permission for the holder of the certificate to send a particular set of messages. Certificates can also indicate authority over specific data elements within a message.
  • authorization authority is indicated by a pair of identifiers, ITS-AID and SSP.
  • ITS-AID indicates the overall type of permission granted. For example, there is an ITS-AID that indicates that the sender has the right to send the CAM.
  • SSP: Service Specific Permissions
  • ITS-AID: ITS Application Identifier
  • a received signed CAM is accepted by the recipient if the certificate is valid and the CAM matches the certificate's ITS-AID and SSP.
  • the CAM is signed using the private key associated with the Authorization Ticket containing an SSP of type BitmapSsp.
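A hedged sketch of this acceptance rule is given below. The field and method names (is_valid, its_aid, ssp, required_permissions) are illustrative assumptions, not the API of any particular ITS security stack.

```python
def accept_signed_cam(cam, certificate):
    """Accept a received signed CAM only if the certificate is valid and
    the CAM matches the certificate's ITS-AID and SSP."""
    if not certificate.is_valid():
        return False
    if cam.its_aid != certificate.its_aid:
        return False
    # Every permission the CAM exercises must be granted by the
    # certificate's SSP (modeled here as a set of permissions).
    return cam.required_permissions.issubset(certificate.ssp)
```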
  • FIG. 10 is a diagram showing the CAM format.
  • a CAM may include an ITS protocol data unit (PDU) header and multiple containers.
  • the ITS PDU header contains information on the protocol version, message type, and source ITS-S ID.
  • the CAM of the vehicle ITS-S includes one basic container and one high frequency container (hereinafter, HF container), and may additionally contain one low frequency container (hereinafter, LF container) and one or more special vehicle containers. Special vehicle containers are sometimes called special containers.
  • the basic container contains basic information about the source ITS-S.
  • the basic container can include the station type and the station location.
  • a station type is a vehicle, a roadside unit (RSU), or the like. If the station type is vehicle, the vehicle type may be included.
  • Location may include latitude, longitude, altitude and confidence.
  • the vehicle HF container is included in the HF container. If the station type is not vehicle, the HF container can contain other containers that are not vehicle HF containers.
  • the vehicle HF container contains dynamic state information that changes in a short time in the source vehicle ITS-S.
  • the vehicle HF container may include one or more of the heading of the vehicle, vehicle speed, traveling direction, vehicle length, vehicle width, vehicle longitudinal acceleration, road curvature, curvature calculation mode, and yaw rate. The traveling direction indicates either forward or backward.
  • the curvature calculation mode is a flag indicating whether or not the yaw rate of the vehicle is used to calculate the curvature.
  • the vehicle HF container may also include one or more of the following: vehicle longitudinal acceleration control status, lane position, steering wheel angle, lateral acceleration, vertical acceleration, characteristic class, and DSRC toll collection station location.
  • a characteristic class is a value that determines the maximum age of the data elements in the CAM.
  • the vehicle LF container is included in the LF container. If the station type is not vehicle, the LF container can contain other containers that are not vehicle LF containers.
  • a vehicle LF container can include one or more of the vehicle's role, the lighting state of external lights, and the travel trajectory.
  • the role of vehicle is a classification when the vehicle is a special vehicle.
  • the lighting state of external lights covers the most important exterior lights of the vehicle.
  • a travel trajectory indicates the movement of a vehicle at a certain time or a certain distance in the past.
  • the travel trajectory can also be called a path history.
  • a travel trajectory is represented by a list of waypoints (for example, 23 points).
  • a special vehicle container is a container for vehicles ITS-S that have a special role in road traffic such as public transportation.
  • a special vehicle container may include any one of a public transportation container, a special transportation container, a dangerous goods container, a road construction container, an ambulance container, an emergency container, or a safety confirmation vehicle container.
  • a public transport container is a container for public transport vehicles such as buses.
  • Public transportation containers are used by public transportation vehicles to control boarding conditions, traffic lights, barriers, bollards, and the like.
  • the special transportation container is included when the vehicle is a heavy vehicle, an oversized vehicle, or both.
  • a dangerous goods container is a container that is included when a vehicle is transporting dangerous goods.
  • the dangerous goods container stores information indicating the type of dangerous goods.
  • a road construction container is a container that is included when the vehicle is a vehicle that participates in road construction.
  • the road construction container stores a code indicating the type of road construction and the cause of the road construction.
  • the roadwork container may also include information indicating whether the lane ahead is open or closed.
  • An ambulance container is a container that is included when the vehicle is an ambulance vehicle during an ambulance operation.
  • the ambulance container shows the status of light bar and siren usage, and emergency priority.
  • An emergency container is a container that is included when the vehicle is an emergency vehicle during emergency operations.
  • the emergency container shows light bar and siren usage, cause code and emergency priority.
  • the safety confirmation vehicle container is a container that is included when the vehicle is a safety confirmation vehicle.
  • a safety confirmation vehicle is a vehicle that accompanies a special transportation vehicle or the like.
  • the safety confirmation vehicle container shows the use of light bars and sirens, overtaking regulations, and speed limits.
  • Fig. 11 shows a flowchart of the process of transmitting the CAM.
  • the processing shown in FIG. 11 is executed for each CAM generation cycle.
  • In step S201, the information constituting the CAM is acquired.
  • Step S201 is executed by the detection information acquisition unit 201, for example.
  • In step S202, a CAM is generated based on the information acquired in step S201.
  • Step S202 can also be executed by the detection information acquisition unit 201 .
  • The transmission unit 221 then transmits the CAM generated in step S202 to the surroundings of the vehicle. A minimal sketch of this cycle follows below.
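A minimal sketch of this per-cycle processing is shown below; the two collaborator objects stand in for the detection information acquisition unit 201 and the transmission unit 221 and are purely illustrative.

```python
def cam_transmission_cycle(detection_info_unit, transmission_unit):
    """One CAM generation cycle as in FIG. 11."""
    info = detection_info_unit.acquire()        # S201: gather the CAM information
    cam = detection_info_unit.build_cam(info)   # S202: generate the CAM
    transmission_unit.broadcast(cam)            # transmit to the surroundings
```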
  • V2X communication devices can support traffic safety by periodically providing their own location and status to surrounding V2X communication devices.
  • the CA service 128 has a limitation that only the information of the corresponding V2X communication device itself can be shared. To overcome this limitation, service development such as CP service 124 is required.
  • the CP service 124 may also be an entity of the facility layer 120, as shown in FIG.
  • CP service 124 may be part of the application support domain of facility layer 120 .
  • the CP service 124 may fundamentally differ from the CA service 128 in that, for example, it cannot receive input data regarding the host V2X communication device from the VDP 125 or the POTI unit 126 .
  • CPM transmission includes CPM generation and transmission.
  • In the process of generating a CPM, an originating V2X communication device generates a CPM, which is then passed to the network & transport layer 140 for transmission.
  • An originating V2X communication device may also be referred to as a transmitting V2X communication device, a host V2X communication device, and so on.
  • CP service 124 connects with other entities in facility layer 120 and V2X applications in facility layer 120 to collect relevant information for CPM generation and to deliver received CPM content for further processing.
  • The entity for data collection may be the function that provides object detection in the host V2X communication device.
  • CP service 124 may use services provided by protocol entities of network & transport layer 140 to deliver (or transmit) CPMs.
  • CP service 124 may interface with network & transport layer 140 through NF-SAP to exchange CPMs with other V2X communication devices.
  • NF-SAP is the service access point between network & transport layer 140 and facility layer 120 .
  • CP service 124 may connect with security entities through SF-SAP, the SAP between security layer 160 and facility layer 120, to access security services for sending and receiving CPMs.
  • CP service 124 may also interface with management entities through MF-SAP, which is the SAP between management layer 150 and facility layer 120 .
  • the CP service 124 may connect to the application layer 110 through FA-SAP, which is the SAP between the facility layer 120 and the application layer 110 .
  • the CP service 124 can specify how a V2X communication device informs other V2X communication devices about the location, behavior, and attributes of detected surrounding road users and other objects. For example, by sending a CPM, the CP service 124 can share the information contained in the CPM with other V2X communication devices. Note that the CP service 124 may be a function that can be added to all types of V2X communication devices that participate in road traffic.
  • a CPM is a message exchanged between V2X communication devices via the V2X network.
  • CPM can be used to generate collective perceptions of road users and other objects detected and/or recognized by V2X communication devices.
  • the detected road users or objects may be, but are not limited to, road users or objects that are not equipped with V2X communication equipment.
  • a V2X communication device that shares information via a CAM shares only information about the recognition state of the V2X communication device itself with other V2X communication devices in order to perform cooperative recognition.
  • road users and objects that are not equipped with V2X communication devices are not part of the system, so the system has only a limited view of situations related to safety and traffic management.
  • a station that is equipped with a V2X communication device and can detect road users and objects that are not equipped with V2X communication devices can provide their presence and status to other V2X communication devices.
  • because the CP service 124 enables cooperative awareness of road users and objects that are not equipped with V2X communication devices, it can improve the safety and traffic management performance of systems equipped with V2X communication devices.
  • CPM delivery may vary depending on the applied communication system. For example, in ITS-G5 networks as defined in ETSI EN 302 663, a CPM may be sent from an originating V2X communication device directly to all V2X communication devices within range. The communication range can be particularly affected by the originating V2X communication device by changing the transmission power according to the region concerned.
  • CPMs may be generated periodically with a frequency controlled by the CP service 124 at the originating V2X communication device.
  • the frequency of generation may be determined taking into account the radio channel load determined by Distributed Congestion Control.
  • the generation frequency is also determined taking into account the state of the detected non-V2X objects, e.g. dynamic behavior of position, velocity or orientation, and the transmission of CPMs for the same perceived object by other V2X communication devices.
  • the CP service 124 makes the contents of the CPM available to functions within the receiving V2X communication device, such as the V2X application and/or the LDM 127 .
  • LDM 127 may be updated with received CPM data.
  • V2X applications may retrieve this information from LDM 127 for additional processing.
  • FIG. 13 is a functional block diagram of the CP service 124 in this embodiment. More specifically, FIG. 13 illustrates the functional blocks of the CP service 124 and functional blocks with interfaces for other functions and layers in this embodiment.
  • the CP service 124 can provide the following sub-functions for CPM transmission/reception.
  • CPM encoder 1241 constructs or generates CPM according to a predefined format. The latest in-vehicle data may be included in the CPM.
  • the CPM decoding unit 1242 decodes the received CPM.
  • the CPM transmission management unit 1243 executes the protocol operation of the source V2X communication device. The operations performed by the CPM transmission manager 1243 may include activation and termination of CPM transmission operations, determination of CPM generation frequency, and triggering of CPM generation.
  • the CPM reception manager 1244 can perform protocol operations for the receiving V2X communication device. Specifically, it can include triggering the CPM decoding function in CPM reception, providing the received CPM data to the LDM 127 or the V2X application of the receiving side V2X communication device, checking the information of the received CPM, and the like.
  • Point-to-multipoint communication may be used for CPM delivery.
  • In ITS-G5, a control channel may be used for CPM delivery.
  • CPM generation may be triggered and managed by CP service 124 while CP service 124 is running.
  • the CP service 124 may be launched upon activation of the V2X communication device and may be terminated when the V2X communication device is terminated.
  • a host V2X communication device may send a CPM whenever at least one object with sufficient confidence to exchange with a nearby V2X communication device is detected.
  • CP services should consider the trade-off between the freshness of object information and channel utilization. From the point of view of an application that uses the information received via CPMs, updated information should be provided as often as possible. From the point of view of the ITS-G5 stack, however, a lower transmission frequency is required in order to minimize channel utilization. The V2X communication device should therefore take this trade-off into account and include detected objects and object information in the CPM appropriately. Also, in order to reduce the message size, objects should be evaluated before being sent.
  • FIG. 14 is a diagram showing the structure of the CPM.
  • the CPM structure shown in FIG. 14 may be the basic CPM structure.
  • CPMs may be messages exchanged between V2X communication devices in a V2X network.
  • CPM may also be used to generate collective perceptions of road users and/or other objects detected and/or recognized by V2X communication devices. That is, a CPM may be an ITS message for generating collective awareness of objects detected by V2X communication devices.
  • the CPM may include state information and attribute information of road users and objects detected by the source V2X communication device. Its content may vary depending on the type of road user or object detected and the detection capabilities of the originating V2X communication device. For example, if the object is a vehicle, the state information may include at least information about the actual time, location and motion state. Attribute information may include attributes such as dimensions, vehicle type, and role in road traffic.
  • the CPM may complement the CAM and work in the same way as the CAM. That is, it may be for enhancing cooperative recognition.
  • the CPM may contain externally observable information about detected road users or objects.
  • the CP service 124 may include methods to reduce duplication of CPMs sent by different V2X communication devices by checking the CPMs sent by other stations.
  • the receiving V2X communication device may recognize the presence, type and status of road users or objects detected by the originating V2X communication device.
  • the received information may be used by the receiving V2X communication device to support V2X applications to enhance safety, improve traffic efficiency and travel time. For example, by comparing the received information with the detected states of road users or objects, the receiving V2X communication device can estimate the risk of collision with road users or objects. Additionally, the receiving V2X communication device may notify the user via the receiving V2X communication device's Human Machine Interface (HMI) or automatically take corrective action.
  • HMI: Human Machine Interface
  • The basic format of the CPM will be explained with reference to FIG. 14.
  • the format of this CPM may be expressed in ASN.1 (Abstract Syntax Notation One).
  • Data Elements (DE) and Data Frames (DF) not defined in this disclosure may be derived from the Common Data Dictionary specified in ETSI TS 102 894-2.
  • a CPM may include an ITS protocol data unit (PDU) header and multiple containers.
  • PDU: Protocol Data Unit
  • the ITS PDU header is a header that contains information about the protocol version, message type, and ITS ID of the source V2X communication device.
  • the ITS PDU header is a common header used in ITS messages and is present at the beginning of the ITS message.
  • the ITS PDU header is sometimes called a common header.
  • a station data container may include an originating vehicle container or an originating roadside unit container (RSU container).
  • a sensor information container is sometimes called a field-of-view container.
  • An originating vehicle container may also be referred to as an OVC.
  • a field of view container may also be described as an FOC.
  • a recognition object container may also be described as a POC.
  • the CPM includes the management container as a mandatory container; the station data container, sensor information container, POC, and free space addendum container may be optional containers.
  • the sensor information container, the perceived object container, and the free space addendum container may each occur multiple times. Each container is described below; a minimal structural sketch in code follows the description of the management container.
  • the administrative container provides basic information about the originating ITS-S, regardless of whether it is a vehicle or roadside unit type station.
  • the management container may also include station type, reference location, segmentation information, number of recognized objects.
  • the station type indicates the type of ITS-S.
  • the reference location is the location of the originating ITS-S.
  • the segmentation information describes splitting information when splitting a CPM into multiple messages due to message size constraints.
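As a minimal structural sketch of the containers described above, the following Python dataclasses mirror the CPM layout of FIG. 14. The types and field names are illustrative assumptions, not the ASN.1 definitions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ItsPduHeader:
    protocol_version: int
    message_type: int
    station_id: int          # ITS ID of the source V2X communication device

@dataclass
class ManagementContainer:
    station_type: int
    reference_position: Tuple[float, float, float]  # latitude, longitude, altitude
    number_of_perceived_objects: int = 0
    segmentation_info: Optional[object] = None      # present when the CPM is split

@dataclass
class Cpm:
    """The management container is mandatory; the station data container is
    optional; the sensor information, perceived object, and free space
    addendum containers are optional and may occur multiple times."""
    header: ItsPduHeader
    management: ManagementContainer
    station_data: Optional[object] = None           # OVC or originating RSU container
    sensor_information: List[object] = field(default_factory=list)  # SIC / FOC
    perceived_objects: List[object] = field(default_factory=list)   # POC
    free_space_addenda: List[object] = field(default_factory=list)
```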
  • Table 1 shown in FIG. 15 is an example of OVC in the station data container of CPM.
  • Table 1 shows the data elements (DE) and/or data frames (DF) included in an example OVC.
  • the station data container becomes an OVC when the ITS-S that is the source is a vehicle. If the originating ITS-S is an RSU, it becomes an Originating RSU Container.
  • the originating RSU container contains the ID for the road or intersection on which the RSU is located.
  • A DE is a data type that contains a single datum.
  • A DF is a data type that includes one or more DEs and/or one or more DFs in a predefined order.
  • the DE/DF may be used to construct facility layer messages or application layer messages.
  • Facility layer messages include the CAM, the CPM, and the DENM.
  • the OVC contains basic information related to the V2X communication device that emits the CPM.
  • OVC can be interpreted as a scaled down version of CAM.
  • the OVC may include only the DE required for coordinate conversion processing. That is, OVC is similar to CAM, but provides basic information about the originating V2X communication device. The information contained in the OVC is focused on supporting the coordinate transformation process.
  • OVC can provide the following. That is, the OVC can provide the latest geographic location of the originating V2X communication device obtained by the CP service 124 at the time of CPM generation. OVC can also provide the absolute lateral and longitudinal velocity components of the originating V2X communication device. The OVC can provide the geometric dimensions of the originating V2X communication device.
  • the generation delta time shown in Table 1 indicates, as a DE, the time corresponding to the time of the reference position in the CPM.
  • the generation delta time can be regarded as the generation time of the CPM.
  • the generation delta time may also be referred to as the generation time.
  • the reference position indicates the geographical position of the V2X communication device as DF.
  • a reference position indicates the location of a geographical point.
  • the reference position includes information regarding latitude, longitude, position confidence and/or altitude.
  • Latitude represents the latitude of the geographical point
  • Longitude represents the longitude of the geographical point.
  • the position confidence represents the accuracy of the geographic location
  • the altitude represents the altitude and altitude accuracy of the geographic point.
  • the heading indicates, as a DF, the orientation in the coordinate system.
  • Heading includes a heading value and/or heading confidence information.
  • the heading value indicates the heading relative to north, and the heading confidence indicates a preset level of confidence in the reported heading value.
  • Longitudinal velocity, as a DF, can describe the longitudinal velocity of a moving object (e.g., a vehicle) and the accuracy of the velocity information.
  • Longitudinal velocity includes a velocity value and/or velocity accuracy information.
  • the velocity value represents the velocity value in the longitudinal direction
  • the velocity accuracy represents the accuracy of the velocity value.
  • Lateral velocity as a DF, can describe the lateral velocity and the accuracy of velocity information for a moving body (eg, vehicle). Lateral velocity includes information about velocity values and/or velocity accuracy.
  • the velocity value represents the velocity value in the lateral direction
  • the velocity accuracy represents the accuracy of the velocity value.
  • the vehicle length can describe the vehicle length and accuracy index as DF.
  • the vehicle length includes information about vehicle length values and/or vehicle length accuracy indicators.
  • the vehicle length represents the length of the vehicle, and the vehicle length accuracy index represents the reliability of the vehicle length.
  • the vehicle width indicates the width of the vehicle as DE.
  • vehicle width can represent the width of the vehicle including the side mirrors. If the width of the vehicle is 6.1 m or more, it is set to 61, and if the information cannot be obtained, it is set to 62.
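As a small worked example of this encoding, the sketch below converts a width in metres into the transmitted value. The 0.1 m step size is an assumption consistent with the 6.1 m → 61 mapping described above; the function name is hypothetical.

```python
def encode_vehicle_width(width_m):
    """Encode the vehicle width (including side mirrors) as described above:
    61 means the width is 6.1 m or more, 62 means the information is
    unavailable; other values are assumed to be carried in 0.1 m steps."""
    if width_m is None:
        return 62                      # information cannot be obtained
    if width_m >= 6.1:
        return 61                      # out of range
    return max(1, round(width_m * 10)) # assumed 0.1 m resolution
```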
  • For each DE/DF shown in Table 1, except the generation delta time, refer to ETSI TS 102 894-2, indicated in the right column of Table 1.
  • ETSI TS 102 894-2 defines the CDD (common data dictionary). For the generation delta time, see ETSI EN 302 637-2.
  • the OVC may contain information on the vehicle direction angle, vehicle traveling direction, longitudinal acceleration, lateral acceleration, vertical acceleration, yaw rate, pitch angle, roll angle, vehicle height and trailer data.
  • Table 2 is shown in FIG. 16. Table 2 is an example of the SIC (or FOC) in the CPM.
  • the SIC provides a description of at least one sensor mounted on the originating V2X communication device. If the V2X communication device is equipped with multiple sensors, multiple descriptions may be added. For example, the SIC provides information about the sensor capabilities of the originating V2X communication device. To do so, general sensor characteristics, such as the originating V2X communication device's sensor mounting location, sensor type, sensor range, and opening angle (i.e., the sensor frustum), may be included as part of the message. These pieces of information may be used by the receiving V2X communication device to select an appropriate prediction model according to sensor performance.
  • the sensor ID indicates a sensor-specific ID for specifying the sensor that detected the object.
  • the sensor ID is a random number generated when the V2X communication device starts up and does not change until the V2X communication device is terminated.
  • Sensor type indicates the type of sensor.
  • the sensor types are listed below.
  • the sensor types are: undefined (0), radar (1), lidar (2), mono-video (3), stereovision (4), night vision (5), ultrasonic (6), pmd (7), fusion (8), induction loop (9), spherical camera (10), and a combination of these (11).
  • pmd is a photo mixing device.
  • a spherical camera is also called a 360-degree camera.
  • the X position indicates the mounting position of the sensor in the negative X direction
  • the Y position indicates the mounting position of the sensor in the Y direction.
  • These mounting positions are measured from the reference position; refer to ETSI EN 302 637-2.
  • the radius indicates the average recognition range of the sensor as defined by the manufacturer.
  • the start angle indicates the start angle of the sensor's frustum
  • the end angle indicates the end angle of the sensor's frustum.
  • a quality class represents a classification of the sensor that defines the quality of the measurement object.
  • the SIC may contain information regarding the reliability of the detection area and free space.
  • Table 3 is shown in FIG. 17. Table 3 is an example of the POC in the CPM.
  • the POC is used to describe the object perceived by the sensor as seen by the transmitting V2X communication device.
  • the receiving V2X communication device that receives the POC can, with the help of the OVC, perform coordinate transformation processing to transform the position of the object into the reference coordinate system of the receiving vehicle.
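As a rough illustration of this coordinate transformation, the sketch below maps a POC's sender-relative offset into the receiving vehicle's frame using the OVC reference position and heading. The record fields (lat, lon, heading_deg), the flat-earth approximation, and the axis convention (x forward, y to the right) are all assumptions made for illustration; a real implementation would follow the coordinate definitions of the relevant ETSI specifications.

```python
import math

def object_position_in_receiver_frame(ovc_ref, obj_dx, obj_dy, receiver_ref):
    """Transform an object's sender-relative (dx, dy) position into the
    receiving vehicle's reference frame, using the sender's and receiver's
    reference positions and headings (heading measured from north)."""
    # Rotate the sender-relative offset into East/North axes.
    h = math.radians(ovc_ref.heading_deg)
    east = obj_dx * math.sin(h) + obj_dy * math.cos(h)
    north = obj_dx * math.cos(h) - obj_dy * math.sin(h)

    # Add the offset between sender and receiver reference positions
    # (small-angle, flat-earth approximation).
    meters_per_deg_lat = 111_320.0
    meters_per_deg_lon = meters_per_deg_lat * math.cos(math.radians(receiver_ref.lat))
    east += (ovc_ref.lon - receiver_ref.lon) * meters_per_deg_lon
    north += (ovc_ref.lat - receiver_ref.lat) * meters_per_deg_lat

    # Rotate East/North into the receiver's vehicle frame.
    hr = math.radians(receiver_ref.heading_deg)
    x = east * math.sin(hr) + north * math.cos(hr)
    y = east * math.cos(hr) - north * math.sin(hr)
    return x, y
```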
  • multiple optional DEs may be provided if the originating V2X communication device can provide them.
  • a POC may consist of a selection of DEs to provide an abstract description of the recognized (or detected) object. For example, the relative distance, velocity information and timing information about the perceived object associated with the originating V2X communication device may be included in the POC as mandatory DEs. Additional DEs may also be provided if the sensors of the originating V2X communication device can provide the requested data.
  • the measurement time indicates the time in microseconds from the reference time of the message. This defines the relative age of the measured object.
  • An object ID is a unique random ID assigned to an object. This ID is retained (ie, not changed) while the object is being tracked, ie, considered in the originating V2X communication device's data fusion process.
  • the sensor ID is an ID corresponding to DE in the sensor ID in Table 2. This DE may be used to associate object information with the sensors that make the measurements.
  • the longitudinal distance includes a distance value and a distance confidence.
  • the distance value indicates the relative X distance to the object in the source reference frame.
  • the distance confidence is a value indicating the confidence of the X distance.
  • the lateral distance likewise includes a distance value and a distance confidence.
  • the distance value indicates the relative Y distance to the object in the source reference frame, and the distance confidence indicates the confidence of that Y distance.
  • Longitudinal velocity indicates the longitudinal velocity of the detected object together with its confidence. Lateral velocity indicates the lateral velocity of the detected object together with its confidence. For longitudinal and lateral velocities, refer to the CDD of ETSI TS 102 894-2.
  • the object orientation indicates the absolute orientation of the object in the reference coordinate system when provided by data fusion processing.
  • the object length indicates the measured object length.
  • the length confidence indicates the confidence of the length of the measured object.
  • Object Width indicates a measurement of the width of the object. Width confidence indicates the reliability of the object width measurement.
  • Object type represents the classification of the object as provided in the data fusion process. Classifications of objects may include vehicles, people, animals, and others.
  • Optionally, the object confidence, vertical distance, vertical speed, longitudinal acceleration, lateral acceleration, vertical acceleration, object height, dynamic state of the object, and matched position (lane ID, longitudinal lane position) may be included in the POC.
  • the free space addendum container is a container that indicates information about the free space recognized by the source V2X communication device (that is, free space information).
  • a free space is an area that is not considered to be occupied by road users or obstacles, and can also be called an empty space.
  • the free space can also be said to be a space in which a mobile object that moves together with the source V2X communication device can move.
  • the free space addendum container is not a mandatory container but one that can be added arbitrarily. A free space addendum container can be added when there is a difference between the free space recognized by the source V2X communication device and the free space that can be calculated from CPMs received from other V2X communication devices. Free space addendum containers may also be added to the CPM periodically.
  • the free space addendum container contains information specifying the area of free space.
  • Free space can be specified in various shapes.
  • the shape of the free space can be represented, for example, by polygons, circles, ellipses, and rectangles.
  • When representing free space with a polygon, specify the positions of the points that make up the polygon and the order in which the points are connected.
  • When representing free space with a circle or an ellipse, specify its center and radius or radii.
  • the free space addendum container may contain the confidence of the free space. The confidence of the free space is expressed numerically and may also indicate that the confidence is unknown.
  • the free space addendum container may also contain information about shadow areas. A shadow area is the area behind an object as seen from the vehicle or from a sensor mounted on the vehicle. A sketch of using a polygonal free space description follows below.
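As one illustration of how a receiver might use a polygonal free space description, the following ray-casting sketch tests whether a point lies inside the polygon formed by connecting the listed points in order; it is illustrative only.

```python
def point_in_free_space(px, py, polygon):
    """Return True if (px, py) lies inside the polygon, given as a list of
    (x, y) points in the order in which they are connected."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right.
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside
```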
  • FIG. 18 is a diagram explaining a sensor data extraction method by a V2X communication device that provides CP service 124.
  • FIG. 18(a) shows how a V2X communication device extracts sensor data at a low level.
  • FIG. 18(b) illustrates how a V2X communication device extracts sensor data at a high level.
  • the source of sensor data transmitted as part of CPM should be selected according to the requirements of the future data fusion process in the receiving V2X communication device.
  • the transmitted data should be as close as possible to the original sensor data.
  • simply transmitting original sensor data, such as raw data is not realistic. This is because it imposes very high demands on data rate and transmission cycle.
  • Figures 18(a) and 18(b) show possible embodiments for selecting data to be transmitted as part of the CPM.
  • In FIG. 18(a), sensor data is obtained from different sensors and processed as part of a low-level data management entity. This entity can select the object data to be inserted into the next CPM and also compute the validity of the detected objects.
  • In this case, the data of each sensor is transmitted, so the amount of data sent via the V2X network increases. However, the sensor information can be used efficiently by the receiving V2X communication device.
  • In FIG. 18(b), sensor data or object data provided by a data fusion unit specific to the V2X communication device manufacturer is transmitted as part of the CPM.
  • Transmission of an object may be considered when the absolute value of the difference between the current yaw angle of the detected object and the yaw angle included in the CPM previously transmitted by the source V2X communication device exceeds 4 degrees.
  • Transmission may also be considered when the difference between the current relative distance between the originating V2X communication device and the detected object and the relative distance contained in the CPM previously transmitted by the originating V2X communication device exceeds 4 m.
  • Transmission may also be considered when the absolute value of the difference between the current velocity of the detected object and the velocity of the detected object contained in the CPM previously transmitted by the source V2X communication device exceeds 0.5 m/s. A sketch of this per-object check follows below.
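The three per-object inclusion conditions above can be combined into a single check, sketched below with assumed record fields (yaw_deg, distance_m, speed_mps); `last_sent` is None for a newly detected object.

```python
def object_needs_inclusion(current, last_sent):
    """Return True if the object should be included in the next CPM,
    following the yaw (>4 deg), distance (>4 m), and speed (>0.5 m/s)
    change conditions described above."""
    if last_sent is None:
        return True                                          # newly detected
    if abs(current.yaw_deg - last_sent.yaw_deg) > 4.0:
        return True
    if abs(current.distance_m - last_sent.distance_m) > 4.0:
        return True
    if abs(current.speed_mps - last_sent.speed_mps) > 0.5:
        return True
    return False
```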
  • CAM is a technology in which a vehicle equipped with a V2X module periodically transmits its position and status to other vehicles equipped with a V2X module in the surrounding area to support more stable driving.
  • the V2X module is a V2X communication device or a configuration that includes a V2X communication device.
  • CP service 124 is a technology that complements CAM.
  • CPS: CP service
  • CPS is a technology within ADAS that notifies the surroundings, through V2X communication, of sensor data recognizing the surrounding environment.
  • FIG. 19 is a diagram explaining the CP service 124.
  • In FIG. 19, the vehicles TxV1 and RxV2 are each equipped with at least one sensor and have sensing ranges SrV1 and SrV2, indicated by dashed lines.
  • TxV1 has a CPS function.
  • TxV1 can recognize vehicles, RV1 to RV11, which are peripheral objects belonging to sensing range SrV1, using a plurality of ADAS sensors mounted on the vehicle. Object information obtained by recognition may be distributed to nearby vehicles equipped with V2X communication devices through V2X communication.
  • RxV1, which is not equipped with a sensor, can acquire information on surrounding vehicles by receiving the CPM from TxV1.
  • When RxV2, which is equipped with a sensor, receives the CPM from TxV1, it can also obtain information on objects located outside its sensing range SrV2 and on objects located in blind spots.
  • facility layer 120 can provide CP services 124 .
  • CP services 124 may run in facility layer 120 and may utilize services that reside in facility layer 120 .
  • the LDM 127 is a service that provides map information, and may provide map information for the CP service 124.
  • the provided map information may include dynamic information in addition to static information.
  • the POTI unit 126 performs a service that provides the location and time of the ego vehicle.
  • the POTI unit 126 can use the corresponding information to provide the location of the ego vehicle and the exact time.
  • the VDP 125 is a service that provides information about the vehicle, and may be used to capture information such as the size of the own vehicle into the CPM and transmit the CPM.
  • ADAS vehicles are equipped with various sensors such as cameras, infrared sensors, radar, and lidar for driving support. Each sensor individually recognizes an object. The recognized object information may be collected and fused by the data fusion unit and provided to the ADAS application.
  • The collection and fusion of sensor information in ADAS technology, as used by the CP service 124, will now be described.
  • Existing sensors for ADAS and existing sensors for CPS can always track surrounding objects and collect relevant data.
  • Two methods can be used to collect sensor information for the CP service.
  • In the first method, each sensor value can be individually provided to surrounding vehicles through the CP basic service.
  • In the second method, the integrated sensor information aggregated by the data fusion unit may be provided to the CP basic service.
  • CP basic services form part of CP services 124 .
  • Fig. 20 shows a flowchart of the process of sending the CPM.
  • the processing shown in FIG. 20 is executed for each CPM generation cycle.
  • In step S301, the information constituting the CPM is acquired.
  • step S301 is executed by the detection information acquisition unit 201, for example.
  • In step S302, a CPM is generated based on the information acquired in step S301.
  • Step S302 can also be executed by the detection information acquisition unit 201 .
  • The transmission unit 221 then transmits the CPM generated in step S302 to the surroundings of the vehicle.
  • In step S1A, when the receiving unit 222 receives a CPM transmitted from the communication device 20 of another vehicle (YES in S1A), the process proceeds to step S2. If no CPM has been received (NO in S1A), the process proceeds to step S1B.
  • Steps S2 and S3 are the same as in the flowchart described earlier. After step S3 is executed, the process moves to step S4A.
  • In step S4A, the identity determination unit 204 determines whether or not the own-vehicle detection target whose position was specified in step S3 is the same as the target specified by the POC of the CPM determined to have been received in step S1A.
  • the CPM includes the latitude and longitude (hereinafter, absolute coordinates) of the other vehicle and the distance and bearing from the other vehicle to the target. Therefore, the absolute coordinates of the target detected by the other vehicle can be determined from the CPM acquired from the other vehicle. These coordinates are compared, with the coordinate systems aligned, against the position of the own-vehicle detection target identified in step S3.
  • the coordinate system to be aligned may be an absolute coordinate system or a coordinate system centered on the own vehicle.
  • the POC of the CPM contains various information such as information indicating the behavior of targets detected by other vehicles. The same determination may be made by comparing the behavior of the target specified by the POC of the CPM and the behavior of the vehicle-detected target acquired in S2. Behavior is one or more pieces of information indicating target movement, such as velocity, acceleration, and angular velocity.
  • The target classification (i.e., type) and the position of the target may be used for narrowing down the candidates.
  • The narrowing-down range is determined by a threshold radius set to a distance larger than the distance difference within which targets are judged to be the same.
  • The center of the narrowing-down range is the position of the other-vehicle detection target or the own-vehicle detection target. The identity determination is performed between the targets detected by the own vehicle and the targets detected by the other vehicle within the narrowed-down range. A sketch of this determination follows below.
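A simplified sketch of this narrowing-down and identity determination is given below. The field names, the use of speed as the compared behaviour, and the velocity threshold are illustrative assumptions.

```python
import math

def is_same_target(own_det, other_det,
                   pos_threshold_m, narrowing_radius_m, vel_threshold_mps=1.0):
    """Candidates are first narrowed down to targets within a threshold
    radius (set larger than the distance difference that still counts as
    the same target); the positions, expressed in a common coordinate
    system, and optionally the behaviours are then compared."""
    dist = math.dist(own_det.pos, other_det.pos)
    if dist > narrowing_radius_m:
        return False  # outside the narrowed-down range, not compared further
    if dist > pos_threshold_m:
        return False
    # Optional behaviour comparison (speed, acceleration, angular velocity).
    return abs(own_det.speed_mps - other_det.speed_mps) <= vel_threshold_mps
```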
  • In step S1B, when the receiving unit 222 receives a CAM transmitted from the communication device 20 of another vehicle (YES in S1B), the process proceeds to step S9A. If no CAM has been received (NO in S1B), the process moves to step S12. The process also proceeds to step S9A if it is determined in step S5 that the target detected by the own vehicle and the target detected by the other vehicle are not the same (NO in S5).
  • In step S9A, if a mounted object positioning error has already been stored in the error storage unit 207 for the other vehicle that is the transmission source of the message determined to have been received in step S1A or step S1B (YES in S9A), the process proceeds to step S10A.
  • Step S10A differs depending on whether the received message is a CPM or a CAM. If it is a CPM, the position of the target detected by the other vehicle, specified by the POC of the CPM, is corrected by the mounted object positioning error of the other vehicle that is the source of the CPM. If it is a CAM, the position of the transmission-source vehicle contained in the CAM is corrected by the mounted object positioning error. After step S10A is executed, the process moves to step S11A.
  • In step S11A, the position of the other-vehicle detection target corrected in step S10A, or the corrected position of the other vehicle that is the transmission source of the CAM, is stored in the target storage unit 209.
  • the CPM transmitted by the vehicle unit 2 is used when estimating the mounted object positioning error.
  • An existing system for transmitting and receiving CPMs can be used, making it easy to construct the vehicle system 1.
  • the position of the source vehicle contained in the CAM is corrected by the mounting object positioning error (S10A).
  • the object mounted with the communication device may be a mobile object other than a vehicle.
  • Mobile objects other than vehicles include, for example, drones.
  • a unit provided with a function excluding the function specific to the vehicle among the vehicle units 2 may be mounted on a moving object such as a drone.
  • In Embodiments 1 and 2, the case where the communication device mounted object is a vehicle has been described as an example, but this is not necessarily the case.
  • The communication device mounted object may be a stationary object.
  • Stationary objects include, for example, roadside units.
  • the roadside unit may be equipped with a unit of the vehicle unit 2 that has functions other than those specific to the vehicle.
  • the communication unit 202 may be configured to perform road-to-vehicle communication.
  • The above embodiments show a configuration in which the communication device 20 estimates the mounted object positioning error, but the configuration is not necessarily limited to this.
  • functions other than the communication unit 202 may be configured to be performed by the perimeter monitoring ECU 50.
  • the perimeter monitoring ECU 50 may include a functional block for acquiring the target object information received by the receiver 222 . This functional block corresponds to the communication information acquisition unit.
  • the perimeter monitoring ECU 50 corresponds to the vehicle device.
  • the function of the communication device 20 may be performed by the communication device 20 and the perimeter monitoring ECU 50 .
  • a unit including the communication device 20 and the perimeter monitoring ECU 50 corresponds to the vehicle device.
  • The mounted object positioning error is estimated by comparing the mounted object reference target position, converted into a position relative to the own vehicle, with the detected target position specified by the detection position specifying unit 203.
  • Alternatively, the detected target position may be converted into absolute coordinates and compared with the mounted object reference target position to estimate the mounted object positioning error.
  • In that case, since the mounted object reference target position transmitted by the other vehicle is indicated in absolute coordinates, the mounted object reference target position does not need to be converted. A minimal numerical sketch follows below.
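A minimal numerical sketch of the estimation and the subsequent correction (cf. step S10A) is given below, assuming both positions are expressed in the same vehicle-relative coordinate system; the function names are hypothetical.

```python
def estimate_mounted_object_positioning_error(detected_pos, reported_pos):
    """Once the same-target determination succeeds, take the mounted object
    positioning error as the offset between the target position detected by
    the own vehicle's surroundings monitoring sensor and the mounted object
    reference target position received over V2X."""
    return (detected_pos[0] - reported_pos[0],
            detected_pos[1] - reported_pos[1])

def correct_reported_position(reported_pos, error):
    """Apply the stored positioning error to a later reported position,
    shifting it by the estimated offset."""
    return (reported_pos[0] + error[0], reported_pos[1] + error[1])
```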
  • A vehicle device that can be used in a vehicle, comprising: a detection position specifying unit (203) for specifying a detected target position, which is the relative position, with respect to the vehicle, of a target detected by the vehicle's surroundings monitoring sensor; a communication information acquisition unit (222) for acquiring, via wireless communication, a mounted object reference target position transmitted from a communication device mounted object that is equipped with a surroundings monitoring sensor, is capable of positioning using positioning satellite signals, and is capable of wireless communication with the vehicle, the mounted object reference target position being the position of a target detected by the surroundings monitoring sensor of the communication device mounted object and based on the position of the communication device mounted object obtained by the positioning; a same determination unit (204) for determining whether or not the target detected by the vehicle's surroundings monitoring sensor is the same as the target detected by the surroundings monitoring sensor of the communication device mounted object that is the transmission source of the mounted object reference target position acquired by the communication information acquisition unit; a conversion unit (205) for converting the mounted object reference target position acquired by the communication information acquisition unit into a position relative to the vehicle; and an error estimating unit (206) for estimating, when the same determination unit determines that the targets are the same, a mounted object positioning error, which is the positioning error of the communication device mounted object, from the difference between the mounted object reference target position converted by the conversion unit and the detected target position specified by the detection position specifying unit for that target.
  • The vehicle device further comprising a target position specifying unit (208) that specifies the position of the target with respect to the vehicle by correcting the mounted object reference target position acquired by the communication information acquisition unit by the amount of the mounted object positioning error.
  • The vehicle device in which the communication information acquisition unit acquires the mounted object reference target position from a communication device mounted object that is a mobile object.
  • An error estimation method that can be used in a vehicle and is executed by at least one processor, comprising: a detection position specifying step of specifying a detected target position, which is the relative position, with respect to the vehicle, of a target detected by the vehicle's surroundings monitoring sensor; a communication information acquisition step of acquiring, via wireless communication, a mounted object reference target position transmitted from a communication device mounted object that is equipped with a surroundings monitoring sensor, is capable of positioning using positioning satellite signals, and is capable of wireless communication with the vehicle, the mounted object reference target position being the position of a target detected by the surroundings monitoring sensor of the communication device mounted object and based on the position of the communication device mounted object obtained by the positioning; a same determination step of determining whether or not the target detected by the vehicle's surroundings monitoring sensor is the same as the target detected by the surroundings monitoring sensor of the communication device mounted object that is the transmission source of the mounted object reference target position acquired in the communication information acquisition step; and an error estimating step of estimating, when the targets are determined to be the same, a mounted object positioning error, which is the positioning error of the communication device mounted object, from the difference between the mounted object reference target position and the detected target position.
  • controller and techniques described in this disclosure may also be implemented by a special purpose computer comprising a processor programmed to perform one or more functions embodied by a computer program.
  • the apparatus and techniques described in this disclosure may be implemented by dedicated hardware logic circuitry.
  • the apparatus and techniques described in this disclosure may be implemented by one or more special purpose computers configured in combination with a processor executing a computer program and one or more hardware logic circuits.
  • the computer program may also be stored as computer-executable instructions on a computer-readable non-transitional tangible recording medium.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention comprises: a detection position identification unit (203) that identifies a target detection position, which is the position of a target detected by a periphery monitoring sensor of a vehicle relative to the vehicle; a reception unit (222) that acquires, via wireless communication, a mounted object reference target position which is transmitted from another vehicle; an identity determination unit (204) that determines whether or not the target detected by the periphery monitoring sensor of the vehicle is identical to a target for which the mounted object reference target position has been acquired by the reception unit (222); and an error estimation unit (206) that, when the identity determination unit (204) determines that the targets are identical, estimates a mounted object positioning error from a difference between the mounted object reference target position and the target detection position identified by the detection position identification unit (203).

Description

車両用装置及び誤差推定方法Vehicle device and error estimation method 関連出願の相互参照Cross-reference to related applications
 この出願は、2021年6月3日に日本に出願された特許出願第2021-93878号を基礎としており、基礎の出願の内容を、全体的に、参照により援用している。 This application is based on Patent Application No. 2021-93878 filed in Japan on June 3, 2021, and the content of the underlying application is incorporated by reference in its entirety.
 本開示は、車両用装置及び誤差推定方法に関するものである。 The present disclosure relates to a vehicle device and an error estimation method.
 特許文献1には、自車の見通し外の物標が存在することを判定する技術が開示されている。特許文献1に開示の技術では、物標としての他車両から送信された他車両の位置情報である受信情報と自車のセンサで検出した物標の移動速度及び移動方位を含むセンサ情報との比較結果により、両者が一致しない場合に、見通し外に物標が存在すると判定する。 Patent Document 1 discloses a technique for determining the existence of a target beyond the line of sight of the own vehicle. In the technique disclosed in Patent Document 1, reception information, which is position information of another vehicle transmitted from another vehicle as a target, and sensor information including the moving speed and moving direction of the target detected by the sensor of the own vehicle. As a result of the comparison, if the two do not match, it is determined that there is a target in the non-line-of-sight.
特開2019-79316号公報JP 2019-79316 A
 特許文献1に開示の技術では、他車両の位置情報を受信することで自車のセンサで検出できない他車両の存在を判定できる。しかしながら、他車両から送信される他車両の位置情報には、測位衛星を用いた測位であっても、また、測位衛星を用いない測位であっても、測位の誤差(以下、測位誤差)が含まれる。よって、自車のセンサの検出範囲外に他車両が位置する場合にその他車両の位置を精度良く特定することが難しい。 With the technology disclosed in Patent Document 1, it is possible to determine the presence of other vehicles that cannot be detected by the own vehicle's sensors by receiving the position information of the other vehicles. However, the position information of other vehicles transmitted from other vehicles includes positioning errors (hereinafter referred to as positioning errors), whether positioning is performed using positioning satellites or not using positioning satellites. included. Therefore, when other vehicles are located outside the detection range of the own vehicle's sensors, it is difficult to specify the positions of other vehicles with high accuracy.
One object of this disclosure is to provide a vehicle device and an error estimation method that make it possible to identify, with higher accuracy, the position of a communication-device-mounted object outside the detection range of the own vehicle's surroundings monitoring sensor.
The above object is achieved by the combination of features set out in the independent claims, and the dependent claims define further advantageous embodiments of the disclosure. Reference signs in parentheses in the claims indicate correspondence with specific means described in the embodiments below as one aspect, and do not limit the technical scope of the present disclosure.
To achieve the above object, a vehicle device of the present disclosure is a vehicle device usable in a vehicle, and includes: a detection position specifying unit that specifies a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle; a communication information acquisition unit that acquires, via wireless communication, a mounted object reference target position, which is the position of a target transmitted from a communication-device-mounted object and detected by the communication-device-mounted object; an identity determination unit that determines whether the target detected by the surroundings monitoring sensor of the vehicle and the target whose mounted object reference target position was acquired by the communication information acquisition unit are the same; and an error estimation unit that, when the identity determination unit determines that the targets are the same, estimates a mounted object positioning error, which is a positioning error at the communication-device-mounted object, from the deviation between the mounted object reference target position and the target detection position.
To achieve the above object, an error estimation method of the present disclosure is an error estimation method usable in a vehicle and executed by at least one processor, and includes: a detection position specifying step of specifying a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle; a communication information acquisition step of acquiring, via wireless communication, a mounted object reference target position, which is the position of a target transmitted from a communication-device-mounted object and detected by the communication-device-mounted object; an identity determination step of determining whether the target detected by the surroundings monitoring sensor of the vehicle and the target whose mounted object reference target position was acquired in the communication information acquisition step are the same; and an error estimation step of, when the targets are determined to be the same in the identity determination step, estimating a mounted object positioning error, which is a positioning error at the communication-device-mounted object, from the deviation between the mounted object reference target position and the target detection position.
According to these configurations, the mounted object reference target position is the position of a target detected by the communication-device-mounted object, and therefore contains the positioning error of the communication-device-mounted object. Accordingly, for a target determined to be the same, the mounted object positioning error can be estimated from the deviation between the target detection position, which is the position of that target relative to the vehicle detected by the surroundings monitoring sensor, and the mounted object reference target position transmitted from the communication-device-mounted object. Since what the surroundings monitoring sensor of the vehicle needs to detect is only the target detected by the communication-device-mounted object, the mounted object positioning error can be estimated even when the communication-device-mounted object itself cannot be detected by the surroundings monitoring sensor of the vehicle. Furthermore, using the mounted object positioning error, the positioning error at the communication-device-mounted object can be corrected, and the position of the communication-device-mounted object can be identified with higher accuracy. As a result, the position of a communication-device-mounted object outside the detection range of the own vehicle's surroundings monitoring sensor can be identified with higher accuracy.
FIG. 1 is a diagram showing an example of a schematic configuration of a vehicle system 1.
FIG. 2 is a diagram showing an example of a schematic configuration of a vehicle unit 2.
FIG. 3 is a diagram showing an example of a schematic configuration of a communication device 20.
FIG. 4 is a flowchart showing an example of the flow of positioning error estimation related processing in the communication device 20.
FIG. 5 is a diagram explaining an outline of Embodiment 2.
FIG. 6 is a diagram showing an exemplary architecture of a V2X communication device.
FIG. 7 is a diagram illustrating a V2X message.
FIG. 8 is a diagram showing logical interfaces between the CA service and other layers.
FIG. 9 is a functional block diagram of the CA service.
FIG. 10 is a diagram showing the basic format of a CAM.
FIG. 11 is an example of a flowchart showing processing for transmitting a CAM.
FIG. 12 is a diagram showing logical interfaces between the CP service and other layers.
FIG. 13 is a functional block diagram of the CP service.
FIG. 14 is a diagram showing the basic format of a CPM.
FIG. 15 is a diagram showing an example of an OVC in a CPM.
FIG. 16 is a diagram illustrating an FOC (or SIC) in a CPM.
FIG. 17 is a diagram illustrating a POC in a CPM.
FIG. 18 is a diagram explaining a sensor data extraction method.
FIG. 19 is a diagram explaining the CP service.
FIG. 20 is an example of a flowchart showing processing for transmitting a CPM.
FIG. 21 is an example of a flowchart showing positioning error estimation related processing executed in Embodiment 2.
A plurality of embodiments of the disclosure will be described with reference to the drawings. For convenience of description, parts having the same functions as parts shown in the drawings used in earlier descriptions are given the same reference numerals across the embodiments, and their description may be omitted. For parts given the same reference numerals, the description in the other embodiments can be referred to.
(Embodiment 1)
<Schematic Configuration of Vehicle System 1>
Embodiment 1 of the present disclosure will be described below with reference to the drawings. As shown in FIG. 1, a vehicle system 1 includes a vehicle unit 2 used in each of a plurality of vehicles. In the example shown in FIG. 1, vehicles VEa, VEb, and VEc are taken as the example vehicles, and all three are assumed to be automobiles.
<Schematic Configuration of Vehicle Unit 2>
Next, an example of the schematic configuration of the vehicle unit 2 will be described with reference to FIG. 2. As shown in FIG. 2, the vehicle unit 2 includes a communication device 20, a locator 30, a surroundings monitoring sensor 40, a surroundings monitoring ECU 50, and a driving support ECU 60. The communication device 20, the locator 30, the surroundings monitoring ECU 50, and the driving support ECU 60 may be connected to an in-vehicle LAN (see LAN in FIG. 2).
The locator 30 includes a GNSS (Global Navigation Satellite System) receiver. The GNSS receiver receives positioning signals from a plurality of positioning satellites. Using the positioning signals received by the GNSS receiver, the locator 30 sequentially measures the position of the own vehicle in which it is mounted (hereinafter, vehicle position). From among the positioning satellites whose positioning signals have been received, the locator 30 selects a specified number of satellites, for example four, to be used in the positioning computation, and then performs a positioning computation that calculates the coordinates of the vehicle position using the code pseudoranges, carrier phases, and the like of the selected satellites. The positioning computation itself may use code pseudoranges or carrier phases. The coordinates should be at least latitude and longitude coordinates.
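As a rough illustration of the positioning computation described above, the following Python sketch solves for a receiver position and clock bias from code pseudoranges by iterative least squares. This is a minimal, generic GNSS fix under simplified assumptions (ECEF coordinates, no atmospheric corrections, at least four satellites); the disclosure does not prescribe any particular algorithm.

```python
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=10):
    """Minimal iterative least-squares fix from code pseudoranges.

    sat_positions: (N, 3) ECEF satellite coordinates in metres (N >= 4).
    pseudoranges:  (N,) measured code pseudoranges in metres.
    Returns the estimated receiver ECEF position and clock bias (metres).
    """
    x = np.zeros(3)  # receiver position estimate
    b = 0.0          # receiver clock bias expressed in metres
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_positions - x, axis=1)
        residuals = pseudoranges - (ranges + b)
        # Jacobian: negated unit line-of-sight vectors plus a clock column.
        H = np.hstack([-(sat_positions - x) / ranges[:, None],
                       np.ones((len(pseudoranges), 1))])
        delta, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        x += delta[:3]
        b += delta[3]
    return x, b
```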
The locator 30 may also include an inertial sensor, for example a gyro sensor and an acceleration sensor. The locator 30 may sequentially determine the vehicle position by combining the positioning signals received by the GNSS receiver with the measurement results of the inertial sensor. The travel distance obtained from signals sequentially output from a vehicle speed sensor mounted on the own vehicle may also be used in determining the vehicle position.
The surroundings monitoring sensor 40 detects obstacles around the own vehicle, such as moving objects including pedestrians, non-human animals, bicycles, motorcycles, and other vehicles, as well as stationary objects including fallen objects on the road, guardrails, curbs, and trees. As the surroundings monitoring sensor 40, a surroundings monitoring camera whose imaging range is a predetermined area around the own vehicle may be used. A millimeter-wave radar, sonar, LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), or the like may also be used, or these may be used in combination. The surroundings monitoring camera sequentially outputs captured images to the surroundings monitoring ECU 50 as sensing information. Sensors that transmit probe waves, such as sonar, millimeter-wave radar, and LIDAR, sequentially output to the surroundings monitoring ECU 50, as sensing information, scanning results based on the reception signals obtained when reflected waves from obstacles are received.
The surroundings monitoring ECU 50 includes a processor, memory, I/O, and a bus connecting them, and executes various processes related to recognition of targets around the own vehicle by executing a control program stored in the memory. The surroundings monitoring ECU 50 recognizes information about targets existing around the own vehicle (hereinafter, target information) from the sensing information output by the surroundings monitoring sensor 40. The target information may include, for each target, its relative position with respect to the own vehicle, its type, its speed, its moving direction, and so on. The relative position may be the distance and azimuth from the own vehicle. The position specification accuracy obtained when recognizing a target position using the surroundings monitoring sensor 40 is assumed to be higher than that of positioning by the locator 30.
The type of a target may be recognized by image recognition processing such as template matching based on images captured by the surroundings monitoring camera. When a monocular camera is used as the surroundings monitoring camera, the distance of a target from the own vehicle and its azimuth relative to the own vehicle may be recognized from the installation position and optical-axis direction of the camera relative to the own vehicle and the position of the target in the captured image. When a compound-eye camera is used, the distance of a target from the own vehicle may be recognized based on the amount of parallax between the pair of cameras. Further, the speed of a target may be recognized from the amount of change per unit time in its relative position with respect to the own vehicle and the vehicle speed of the own vehicle, and its moving direction may be recognized from the direction of change of the target's relative position and the vehicle position of the own vehicle.
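The recovery of a target's speed and moving direction from the change in its relative position, as described above, can be sketched as follows. This is an illustrative example only; the frame conventions (ego-frame XY coordinates, heading measured clockwise from the vehicle's forward +Y axis) are assumptions introduced here.

```python
import math

def target_speed_and_heading(prev_rel, cur_rel, ego_velocity, dt):
    """Recover a target's ground speed and heading from two relative fixes.

    prev_rel, cur_rel: (x, y) target position in the ego frame [m].
    ego_velocity: (vx, vy) ego velocity in the same frame [m/s].
    dt: time between the two fixes [s].
    """
    vx_rel = (cur_rel[0] - prev_rel[0]) / dt
    vy_rel = (cur_rel[1] - prev_rel[1]) / dt
    # Absolute velocity = velocity observed relative to the ego vehicle
    # plus the ego vehicle's own velocity.
    vx = vx_rel + ego_velocity[0]
    vy = vy_rel + ego_velocity[1]
    speed = math.hypot(vx, vy)
    heading = math.degrees(math.atan2(vx, vy)) % 360.0  # 0 deg = +Y axis
    return speed, heading
```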
Alternatively, the surroundings monitoring ECU 50 may use the sensing information of a probe-wave sensor to recognize the distance of a target from the own vehicle, its azimuth relative to the own vehicle, its speed, and its moving direction. The surroundings monitoring ECU 50 may recognize the distance from the own vehicle to a target based on the time from transmitting the probe wave to receiving the reflected wave, and may recognize the azimuth of the target based on the direction in which the probe wave that produced the reflected wave was transmitted. The surroundings monitoring ECU 50 may also recognize the speed of the target from the relative speed of the target with respect to the own vehicle, calculated from the Doppler shift between the transmitted probe wave and the reflected wave, and the vehicle speed of the own vehicle.
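For the probe-wave case, the time-of-flight and Doppler relations mentioned above amount to the following. This is a hedged sketch for a radar-like sensor; for sonar, the propagation speed would be the speed of sound rather than the speed of light.

```python
C = 299_792_458.0  # propagation speed of the probe wave [m/s] (radar case)

def range_from_tof(round_trip_time_s):
    """Target distance from the echo round-trip time (out and back)."""
    return C * round_trip_time_s / 2.0

def closing_speed_from_doppler(f_tx_hz, f_rx_hz):
    """Radial relative speed from the Doppler shift of the echo.

    For a two-way reflection the shift is approximately
    f_rx - f_tx = 2 * v * f_tx / c, hence v = c * (f_rx - f_tx) / (2 * f_tx).
    Positive v means the target is approaching.
    """
    return C * (f_rx_hz - f_tx_hz) / (2.0 * f_tx_hz)
```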
The driving support ECU 60 includes a processor, memory, I/O, and a bus connecting them, and executes various processes related to driving support for the own vehicle by executing a control program stored in the memory. The driving support ECU 60 performs driving support based on the target positions specified by the surroundings monitoring ECU 50 and the communication device 20. The target positions specified by the surroundings monitoring ECU 50 are the positions, relative to the own vehicle, of targets that the surroundings monitoring sensor 40 could detect (hereinafter, line-of-sight targets). The target positions specified by the communication device 20 are the positions, relative to the own vehicle, of targets that the surroundings monitoring sensor 40 cannot detect (hereinafter, non-line-of-sight targets). Examples of driving support include vehicle control for avoiding proximity to a target and alerting the driver so as to avoid proximity to a target.
The communication device 20 includes a processor, memory, I/O, and a bus connecting them, and executes processing related to wireless communication with the outside of the own vehicle and to the specification of target positions by executing a control program stored in the memory. Execution of the control program by the processor corresponds to execution of the method corresponding to the control program. The memory referred to here is a non-transitory tangible storage medium that non-temporarily stores computer-readable programs and data, and is implemented by a semiconductor memory, a magnetic disk, or the like. The communication device 20 corresponds to the vehicle device. Details of the processing in the communication device 20 will be described later.
<Schematic Configuration of Communication Device 20>
Next, an example of the schematic configuration of the communication device 20 will be described with reference to FIG. 3. As shown in FIG. 3, the communication device 20 includes, as functional blocks, a detection information acquisition unit 201, a communication unit 202, a detection position specifying unit 203, an identity determination unit 204, a conversion unit 205, an error estimation unit 206, an error storage unit 207, a target position specifying unit 208, and a target storage unit 209. Some or all of the functions executed by the communication device 20 may be configured as hardware using one or more ICs or the like, and some or all of the functional blocks may be implemented by a combination of software executed by a processor and hardware members.
The detection information acquisition unit 201 acquires the target information recognized by the surroundings monitoring ECU 50, that is, the information on targets detected by the surroundings monitoring sensor 40. The target information includes at least the relative position of the target with respect to the own vehicle. In the following description, the target information consists of the relative position of the target (the distance of the target from the own vehicle and its azimuth relative to the own vehicle), the type of the target, the speed of the target, and the moving direction of the target. The relative position of the target may instead be coordinates based on the vehicle position of the own vehicle.
The communication unit 202 exchanges information with the outside of the own vehicle by wireless communication. The communication unit 202 performs wireless communication with the communication devices 20 used in other vehicles, that is, vehicle-to-vehicle communication. The communication unit 202 may also be capable of wireless communication with roadside units installed at the roadside, that is, road-to-vehicle communication, and may exchange information with the communication devices 20 of other vehicles indirectly through road-to-vehicle communication. In the following, the communication unit 202 is described as performing vehicle-to-vehicle communication. In the present embodiment, vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEb, and between the vehicle VEa and the vehicle VEc.
The communication unit 202 includes a transmission unit 221 and a reception unit 222. The transmission unit 221 transmits information to the outside of the own vehicle via wireless communication. The transmission unit 221 transmits the target information acquired by the detection information acquisition unit 201 to the communication devices 20 of other vehicles via wireless communication. When transmitting target information, the transmission unit 221 also includes in it the vehicle position measured by the locator 30 of the own vehicle and identification information for identifying the own vehicle (hereinafter, transmission source identification information). The transmission source identification information may be the vehicle ID of the own vehicle or the communication device ID of the communication device 20 of the own vehicle. As described above, the target information transmitted from the transmission unit 221 thus includes the relative position of the target, the type of the target, the speed of the target, the moving direction of the target, the vehicle position of the transmitting vehicle, and the transmission source identification information. The relative position of the target in the target information is a position based on the position of the own vehicle; hereinafter, this relative position included in the target information is referred to as the mounted object reference target position.
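One possible shape of the transmitted target information is sketched below. The field names and types are hypothetical illustrations; the disclosure specifies only what the message carries, not its encoding.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TargetInfoMessage:
    """Hypothetical layout of the target information broadcast by the
    transmission unit 221. Field names are illustrative assumptions."""
    sender_id: str                          # transmission source identification
    sender_position: Tuple[float, float]    # (lat, lon) measured by the sender's locator
    target_rel_position: Tuple[float, float]  # mounted object reference target position (x, y) [m]
    target_type: str                        # e.g. "pedestrian", "vehicle"
    target_speed: float                     # [m/s]
    target_heading: float                   # moving direction [deg]
```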
The reception unit 222 receives information transmitted via wireless communication from outside the own vehicle. The reception unit 222 receives target information transmitted via wireless communication from the communication devices 20 of other vehicles. These other vehicles are assumed to be vehicles using the vehicle unit 2. Each such other vehicle therefore has a surroundings monitoring sensor, is capable of positioning using positioning satellite signals, and carries a communication device 20 capable of wireless communication with the own vehicle, and thus corresponds to a communication-device-mounted object. As described above, the target information received by the reception unit 222 includes the mounted object reference target position, the type of the target, the speed of the target, the moving direction of the target, the vehicle position of the transmitting vehicle, and the transmission source identification information. The reception unit 222 corresponds to the communication information acquisition unit, and the processing in the reception unit 222 corresponds to the communication information acquisition step.
The detection position specifying unit 203 specifies, from the relative position with respect to the own vehicle in the target information acquired by the detection information acquisition unit 201, the relative position of the target with respect to the own vehicle (hereinafter, target detection position). That is, it specifies the target detection position, which is the position, relative to the own vehicle, of a target detected by the own vehicle's surroundings monitoring sensor 40. As an example, the position may be converted into XY coordinates whose origin is the vehicle position of the own vehicle measured by the locator 30. The processing in the detection position specifying unit 203 corresponds to the detection position specifying step.
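Converting a detected distance/azimuth pair into such XY coordinates could look like the following sketch, assuming the azimuth is measured clockwise from the vehicle's forward (+Y) axis:

```python
import math

def rel_position_to_xy(distance_m, bearing_deg):
    """Distance/azimuth pair -> XY coordinates with the ego vehicle at the
    origin. Bearing is assumed clockwise from the +Y (vehicle heading) axis."""
    rad = math.radians(bearing_deg)
    return distance_m * math.sin(rad), distance_m * math.cos(rad)
```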
The identity determination unit 204 determines whether a target detected by the own vehicle's surroundings monitoring sensor 40 (hereinafter, own-vehicle detected target) and a target detected by the surroundings monitoring sensor 40 of the other vehicle that transmitted the mounted object reference target position acquired by the reception unit 222 (hereinafter, other-vehicle detected target) are the same. The processing in the identity determination unit 204 corresponds to the identity determination step. The identity determination unit 204 determines whether the own-vehicle detected target and the other-vehicle detected target are the same based on whether the target information acquired by the detection information acquisition unit 201 and the target information acquired by the reception unit 222 are similar. The target information compared for this determination may be, for example, the type of the target, the speed of the target, and the moving direction of the target. For example, the identity determination unit 204 may determine that the targets are the same when all of the following conditions are satisfied: the types match, the speed difference is less than a threshold, and the moving direction difference is less than a threshold. Conversely, if even one of these conditions is not satisfied, it may determine that the targets are not the same. The conditions to be satisfied for the identity determination may also be only a subset of the above.
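A minimal sketch of this identity test, assuming the hypothetical TargetInfoMessage fields above and illustrative threshold values (the disclosure does not fix concrete thresholds):

```python
def is_same_target(own, other,
                   speed_tol=1.0,      # [m/s], assumed threshold
                   heading_tol=15.0):  # [deg], assumed threshold
    """Identity test over type, speed, and moving direction.

    A position-proximity condition, described next in the text, can be
    added in the same way as a fourth conjunct.
    """
    if own.target_type != other.target_type:
        return False
    if abs(own.target_speed - other.target_speed) >= speed_tol:
        return False
    heading_diff = abs(own.target_heading - other.target_heading) % 360.0
    heading_diff = min(heading_diff, 360.0 - heading_diff)  # wrap-around
    return heading_diff < heading_tol
```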
The conversion unit 205 converts the mounted object reference target position in the target information acquired by the reception unit 222 into a position relative to the own vehicle. As an example, it may be converted into coordinates in an XY coordinate system whose origin is the vehicle position of the own vehicle measured by the locator 30 (hereinafter, own-vehicle coordinate system). The processing in the conversion unit 205 corresponds to the conversion step.
The identity determination unit 204 may also determine whether the own-vehicle detected target and the other-vehicle detected target are the same using the target detection position specified by the detection position specifying unit 203 and the mounted object reference target position in the target information received by the reception unit 222. In this case, the mounted object reference target position is converted by the conversion unit 205 into a position relative to the own vehicle, and identity is determined based on whether the positions are close. The identity determination unit 204 may determine identity on the condition that the difference between this converted position and the target detection position is less than a threshold, either on its own or in addition to the conditions that the types match, the speed difference is less than a threshold, and the moving direction difference is less than a threshold.
When the identity determination unit 204 determines that the targets are the same, the error estimation unit 206 estimates, from the deviation between the mounted object reference target position converted by the conversion unit 205 and the target detection position specified by the detection position specifying unit 203 for that target, the positioning error of the other vehicle with respect to the position of the own vehicle (hereinafter, mounted object positioning error). The processing in the error estimation unit 206 corresponds to the error estimation step. The mounted object positioning error may be the deviation in each of the X and Y coordinates of the own-vehicle coordinate system. The error estimation unit 206 stores the estimated mounted object positioning error in the error storage unit 207, which may be a non-volatile or volatile memory. The error estimation unit 206 may store the estimated mounted object positioning error in the error storage unit 207 in association with the transmission source identification information of the other vehicle for which the error was estimated; for this, the transmission source identification information in the target information received by the reception unit 222 may be used.
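The error estimation itself reduces to an X/Y offset between the two positions obtained for the same target, for example:

```python
def estimate_positioning_error(converted_ref_xy, detected_xy):
    """Mounted object positioning error as the X/Y offset between the
    converted reference target position and the ego sensor's detection."""
    dx = converted_ref_xy[0] - detected_xy[0]
    dy = converted_ref_xy[1] - detected_xy[1]
    return dx, dy

# The error is then stored keyed by the sender's identification, e.g.:
# error_store[msg.sender_id] = (dx, dy)
```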
Once the error estimation unit 206 has estimated the mounted object positioning error, the target position specifying unit 208 corrects the mounted object reference target position acquired by the reception unit 222 by the amount of that mounted object positioning error, even for a target that cannot be detected by the own vehicle's surroundings monitoring sensor 40, and thereby specifies the position of that target with respect to the own vehicle. The correction may be performed by shifting the mounted object reference target position so as to cancel the deviation corresponding to the mounted object positioning error. When the identity determination unit 204 determines that the targets are not the same and the mounted object reference target position was acquired from another vehicle whose mounted object positioning error has already been stored in the error storage unit 207, the target position specifying unit 208 corrects the mounted object reference target position acquired by the reception unit 222 by the amount of that mounted object positioning error and specifies the position of the target with respect to the own vehicle. Accordingly, the position relative to the own vehicle can be specified with higher accuracy even from a mounted object reference target position acquired by the reception unit 222 from another vehicle via wireless communication. The target position specifying unit 208 may judge whether the mounted object positioning error of the other vehicle has already been stored by whether transmission source identification information associated with a mounted object positioning error exists in the error storage unit 207.
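The correction then shifts a received reference target position so that the stored offset for that sender is cancelled, for example:

```python
def correct_reference_position(ref_xy, error):
    """Shift a received mounted object reference target position (already in
    the own-vehicle frame) so the stored positioning error is cancelled."""
    dx, dy = error
    return ref_xy[0] - dx, ref_xy[1] - dy
```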
The target position specifying unit 208 stores the specified position of the target with respect to the own vehicle in the target storage unit 209. In other words, the target storage unit 209 stores the positions of non-line-of-sight targets that cannot be detected by the own vehicle's surroundings monitoring sensor 40. The target storage unit 209 may be a volatile memory. The positions of line-of-sight targets detectable by the own vehicle's surroundings monitoring sensor 40 may also be stored in the target storage unit 209, or in the memory of the surroundings monitoring ECU 50. Since the positions of targets within the detection range of the surroundings monitoring sensor 40 are detected by the own vehicle's own surroundings monitoring sensor 40, they are not corrected by the mounted object positioning error.
The driving support ECU 60 uses the target positions stored in the target storage unit 209 to perform driving support, such as vehicle control for avoiding proximity to a target and alerting the driver so as to avoid proximity to a target. Accordingly, even for a target out of the own vehicle's line of sight, more accurate driving support can be performed using the more accurately specified position relative to the own vehicle.
<Positioning Error Estimation Related Processing in Communication Device 20>
Here, an example of the flow of the processing related to the estimation of the mounted object positioning error in the communication device 20 (hereinafter, positioning error estimation related processing) will be described using the flowchart of FIG. 4. Execution of this processing means that the error estimation method is carried out. The flowchart of FIG. 4 may be configured to start, for example, when a switch for starting the internal combustion engine or motor generator of the own vehicle (hereinafter, power switch) is turned on.
First, in step S1, when the reception unit 222 receives target information transmitted from the communication device 20 of another vehicle via wireless communication (YES in S1), the process proceeds to step S2. When target information has not been received (NO in S1), the process proceeds to step S12.
In step S2, the detection information acquisition unit 201 acquires the target information recognized by the surroundings monitoring ECU 50 of the own vehicle. In step S3, the detection position specifying unit 203 specifies, from the relative position with respect to the own vehicle in the target information acquired in S2, the target detection position, which is the position of the target relative to the own vehicle.
In step S4, the identity determination unit 204 determines whether the own-vehicle detected target and the other-vehicle detected target are the same. In step S5, when the own-vehicle detected target and the other-vehicle detected target are determined to be the same (YES in S5), the process proceeds to step S6; when they are determined not to be the same (NO in S5), the process proceeds to step S9.
In step S6, the conversion unit 205 converts the mounted object reference target position in the target information received and acquired in S1 into a position relative to the own vehicle. The processing of S6 may also be performed before the processing of S4. In that case, S4 may determine whether the own-vehicle detected target and the other-vehicle detected target are the same using the target detection position specified in S3 and the mounted object reference target position converted in S6.
In step S7, the error estimation unit 206 estimates the mounted object positioning error, which is the positioning error of the other vehicle with respect to the position of the own vehicle, from the deviation between the mounted object reference target position converted in S6 and the target detection position specified in S3 for the target determined to be the same in S5. In step S8, the mounted object positioning error estimated in S7 is stored in the error storage unit 207 in association with the transmission source identification information of the other vehicle for which the error was estimated.
In step S9, when the mounted object positioning error of the other vehicle that transmitted the mounted object reference target position has already been stored in the error storage unit 207 (YES in S9), the process proceeds to step S10. When it has not been stored (NO in S9), the process proceeds to step S12.
In step S10, the target position specifying unit 208 corrects the mounted object reference target position acquired in S1 by the amount of the mounted object positioning error of the other vehicle that transmitted it, and specifies the position, with respect to the own vehicle, of the target determined in S5 not to be the same. In S10, the correction may be performed either after or before the mounted object reference target position is converted by the conversion unit 205. In step S11, the position of the non-line-of-sight target specified in S10 is stored in the target storage unit 209.
In step S12, when it is time to end the positioning error estimation related processing (YES in S12), the processing ends. Otherwise (NO in S12), the process returns to S1 and repeats. An example of the end timing of the positioning error estimation related processing is the power switch being turned off.
Even when the targets are determined to be the same in S5, if the mounted object positioning error of the other vehicle that transmitted the mounted object reference target position acquired in S1 has already been stored in the error storage unit 207, the processing of S6 to S8 may be omitted and the process may proceed to S12.
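Putting steps S1 to S12 together, the processing of FIG. 4 could be organized as in the following sketch. The helper names radio, perimeter, find_same_target, and to_ego_frame are assumptions introduced for illustration; estimate_positioning_error and correct_reference_position are the sketches shown earlier.

```python
def positioning_error_loop(radio, perimeter, error_store, target_store, power_switch):
    """Illustrative organization of steps S1-S12 of FIG. 4 (not normative)."""
    while power_switch.is_on():                        # S12: repeat until end timing
        msg = radio.poll_target_info()                 # S1: target info received?
        if msg is None:
            continue
        own_targets = perimeter.read_targets()         # S2/S3: acquire and locate own detections
        match = find_same_target(own_targets, msg)     # S4/S5: identity determination (helper assumed)
        if match is not None:
            ref_xy = to_ego_frame(msg)                 # S6: convert to own-vehicle coordinates (helper assumed)
            error_store[msg.sender_id] = estimate_positioning_error(
                ref_xy, match.detected_xy)             # S7/S8: estimate and store the error
        elif msg.sender_id in error_store:             # S9: error already stored for this sender?
            xy = correct_reference_position(
                to_ego_frame(msg), error_store[msg.sender_id])  # S10: correct by stored error
            target_store.append(xy)                    # S11: store non-line-of-sight target position
```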
<Summary of Embodiment 1>
According to the configuration of Embodiment 1, the mounted object reference target position is based on the position of the other vehicle, which is a communication-device-mounted object, obtained by positioning using positioning satellite signals, and therefore contains that positioning error. Accordingly, for a target determined to be the same by the identity determination unit 204, the mounted object positioning error can be estimated from the deviation between the target detection position, which is the position of that target relative to the own vehicle detected by the own vehicle's surroundings monitoring sensor 40, and the position obtained by converting the mounted object reference target position of that target, detected by the other vehicle's surroundings monitoring sensor 40 and based on the other vehicle's position, into a position relative to the own vehicle. Since what the own vehicle's surroundings monitoring sensor 40 needs to detect is only a target detected by the other vehicle's surroundings monitoring sensor 40, the mounted object positioning error can be estimated even when the other vehicle itself cannot be detected by the own vehicle's surroundings monitoring sensor 40. Furthermore, using the mounted object positioning error, the positioning error of the other vehicle can be corrected and the position of the other vehicle can be specified with higher accuracy. As a result, the position of a communication-device-mounted object outside the detection range of the own vehicle's surroundings monitoring sensor 40 can be specified with higher accuracy.
Further, according to the configuration of Embodiment 1, even for a target outside the detection range of the own vehicle's surroundings monitoring sensor 40, if the target is within the detection range of another vehicle's surroundings monitoring sensor 40, the position of that target with respect to the own vehicle can be specified with higher accuracy by correcting the mounted object reference target position of the target, acquired from the other vehicle via wireless communication, by the amount of the mounted object positioning error.
Here, the effect of Embodiment 1 will be described using FIG. 1. LMa, LMb, and LMc in FIG. 1 are targets such as pedestrians, and are assumed not to be communication-device-mounted objects. The target LMa is detectable by the surroundings monitoring sensors 40 of all of the vehicles VEa, VEb, and VEc. The target LMb is detectable by the surroundings monitoring sensor 40 of the vehicle VEb but not by that of the vehicle VEa. The target LMc is detectable by the surroundings monitoring sensor 40 of the vehicle VEc but not by that of the vehicle VEa. The vehicles VEb and VEc are not detectable by the surroundings monitoring sensor 40 of the vehicle VEa. In the following, the vehicle VEa is the own vehicle, and the vehicles VEb and VEc are other vehicles.
In the communication device 20 of the own vehicle VEa, the mounted object positioning error of the vehicle VEb can be estimated from the position of the target LMa with respect to the own vehicle detected by the own vehicle's surroundings monitoring sensor 40, and the position obtained by converting the position of the target LMa detected by the surroundings monitoring sensor 40 of the vehicle VEb, acquired from the vehicle VEb via wireless communication, into a position with respect to the own vehicle. The mounted object positioning error of the vehicle VEc can be estimated in the same manner.
Further, in the communication device 20 of the own vehicle VEa, even for the target LMb outside the detection range of the own vehicle VEa's surroundings monitoring sensor 40, the position of the target LMb with respect to the own vehicle VEa can be specified with high accuracy from the mounted object reference target position of the target LMb detected by the surroundings monitoring sensor 40 of the vehicle VEb, acquired from the vehicle VEb via wireless communication, and the mounted object positioning error of the vehicle VEb. Similarly, even for the target LMc outside the detection range of the own vehicle VEa's surroundings monitoring sensor 40, the position of the target LMc with respect to the own vehicle VEa can be specified with high accuracy from the mounted object reference target position of the target LMc detected by the surroundings monitoring sensor 40 of the vehicle VEc, acquired from the vehicle VEc via wireless communication, and the mounted object positioning error of the vehicle VEc.
(Embodiment 2)
An overview of Embodiment 2 will be described with reference to FIG. 5. In Embodiment 2, the vehicle unit 2 sequentially transmits two kinds of messages: CAMs (Cooperative Awareness Messages) and CPMs (Cooperative Perception Messages). A roadside unit 3 also sequentially transmits CPMs.
When CPMs or CAMs are received from two or more transmission source devices, as with the vehicle unit 2 mounted on the vehicle VEa, the communication device 20 of that vehicle unit 2 performs the identity determination, error estimation, and other processing as in Embodiment 1, using the information contained in those messages. Before describing the identity determination, error estimation, and other processing in Embodiment 2, the CAM and the CPM will be described.
FIG. 6 is a diagram showing an exemplary architecture of a V2X communication device. The V2X communication device is used in place of the communication device 20 of Embodiment 1, and, like that communication device 20, also has the configuration shown in FIG. 3. The V2X communication device is a communication device that transmits target information.
The V2X communication device may perform communication between vehicles, between a vehicle and infrastructure, between a vehicle and a bicycle, between a vehicle and a mobile terminal, and so on. The V2X communication device may correspond to an on-board unit of a vehicle or may be included in an on-board unit. The on-board unit is sometimes called an OBU (On-Board Unit).
The communication device may correspond to a roadside unit of infrastructure or may be included in a roadside unit. A roadside unit is sometimes called an RSU (Road Side Unit). The communication device can also be an element constituting an ITS (Intelligent Transport System). In that case, the communication device may correspond to an ITS station (ITS-S) or may be included in an ITS-S. The ITS-S is a device that exchanges information; it may be any of an OBU, an RSU, and a mobile terminal, or may be included in them. The mobile terminal is, for example, a PDA (Personal Digital Assistant) or a smartphone.
The communication device may correspond to a WAVE (Wireless Access in Vehicular Environments) device disclosed in IEEE 1609, or may be included in a WAVE device.
In the present embodiment, the V2X communication device is assumed to be mounted on a vehicle. This V2X communication device has functions for providing a CA (Cooperative Awareness) service and a CP (Collective Perception) service. In the CA service, the V2X communication device transmits CAMs; in the CP service, it transmits CPMs. Even when the communication device is an RSU or a mobile terminal, methods identical or similar to those disclosed below can be applied.
The architecture shown in FIG. 6 is based on the ITS-S reference architecture according to the EU standards. The architecture shown in FIG. 6 includes an application layer 110, a facility layer 120, a network & transport layer 140, an access layer 130, a management layer 150, and a security layer 160.
The application layer 110 implements or supports various applications 111. FIG. 6 shows, as examples of the applications 111, a traffic safety application 111a, an efficient traffic information application 111b, and other applications 111c.
The facility layer 120 supports the execution of the various use cases defined in the application layer 110. The facility layer 120 can support functions identical or similar to those of the top three layers of the OSI reference model (the application layer, the presentation layer, and the session layer). Here, "facility" means providing functions, information, and data. The facility layer 120 may provide the functions of the V2X communication device, for example the application support 121, information support 122, and communication support 123 functions shown in FIG. 6.
The application support 121 has a function of supporting a basic application set or message set. An example of a message is a V2X message. V2X messages can include periodic messages such as CAMs and event messages such as DENMs (Decentralized Environmental Notification Messages). The facility layer 120 can also support the CPM.
The information support 122 has a function of providing common data or a database used for the basic application set or message set. An example of the database is a Local Dynamic Map (LDM).
The communication support 123 has a function of providing services for communication and session management, for example address mode and session support.
In this way, the facility layer 120 supports an application set or message set. That is, the facility layer 120 generates a message set or a message based on the information to be transmitted or the service to be provided by the application layer 110. A message generated in this way is sometimes called a V2X message.
The access layer 130 includes an external IF (InterFace) 131 and an internal IF 132, and can transmit messages/data received from the upper layers via physical channels. For example, the access layer 130 may perform or support data communication using the following communication technologies: communication technologies based on the IEEE 802.11 and/or 802.11p standards, the ITS-G5 wireless communication technology based on the physical transmission technology of the IEEE 802.11 and/or 802.11p standards, 2G/3G/4G (LTE)/5G wireless cellular communication technologies including satellite/broadband wireless mobile communication, broadband terrestrial digital broadcasting technologies such as DVB-T/T2/ATSC, GNSS communication technology, and WAVE communication technology.
The network & transport layer 140 can configure vehicle communication networks between homogeneous/heterogeneous networks using various transport and network protocols. The transport layer is the connection layer between the upper layers and the lower layers. The upper layers include the session layer, the presentation layer, and the application layer 110; the lower layers include the network layer, the data link layer, and the physical layer. The transport layer can manage transmitted data so that it arrives at its destination correctly. At the source, the transport layer divides the data into appropriately sized packets for efficient transmission; at the receiving side, the transport layer restores the received packets to the original file. Examples of transport protocols are TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and BTP (Basic Transport Protocol).
The network layer can manage logical addresses and may determine the delivery route of packets. The network layer may receive packets generated by the transport layer and add the logical address of the destination to the network layer header. Unicast/multicast/broadcast transmission routes between vehicles, between vehicles and fixed stations, and between fixed stations may be considered. As network protocols, GeoNetworking, IPv6 networking with mobility support, or IPv6 over GeoNetworking may be considered.
As shown in FIG. 6, the architecture of the V2X communication device may further include a management layer 150 and a security layer 160. The management layer 150 manages the transfer of data and the interactions between layers, and comprises a management information base 151, regulatory management 152, cross-layer management 153, station management 154, and application management 155. The security layer 160 manages security for all layers, and comprises firewall and intrusion detection management 161, authentication, authorization and profile management 162, and a security management information base 163.
FIG. 7 illustrates a V2X message. A V2X message can also be called an ITS message. V2X messages can be generated at the application layer 110 or the facility layer 120. Specific examples of V2X messages are CAM, DENM, and CPM.
The transport layer within the network & transport layer 140 generates BTP packets. The network layer within the network & transport layer 140 can encapsulate a BTP packet to generate a GeoNetworking packet. The GeoNetworking packet is in turn encapsulated in an LLC (Logical Link Control) packet. In FIG. 7, the data may include a message set; a message set is, for example, a basic safety message.
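As a rough illustration of this encapsulation chain, the following minimal sketch wraps a V2X message into BTP, GeoNetworking, and LLC layers in turn. The byte layouts are simplified placeholders rather than the exact ETSI encodings, and the helper names are hypothetical; the destination port value 2001 is shown only as an example of a well-known BTP port.

```python
# Minimal sketch of the facility-to-access-layer encapsulation chain
# described above (V2X message -> BTP -> GeoNetworking -> LLC).
# Field layouts are simplified placeholders, not the exact ETSI encodings.

def btp_encapsulate(v2x_message: bytes, dest_port: int) -> bytes:
    # BTP-B style header: destination port + destination port info (16 bits each)
    header = dest_port.to_bytes(2, "big") + (0).to_bytes(2, "big")
    return header + v2x_message

def geonet_encapsulate(btp_packet: bytes) -> bytes:
    # Basic header + common header (contents elided in this sketch)
    basic_header = b"\x00" * 4
    common_header = b"\x00" * 8
    return basic_header + common_header + btp_packet

def llc_encapsulate(gn_packet: bytes) -> bytes:
    # LLC/SNAP header carrying the GeoNetworking ethertype 0x86DC
    llc_snap = b"\xaa\xaa\x03" + b"\x00\x00\x00" + (0x86DC).to_bytes(2, "big")
    return llc_snap + gn_packet

# Example: a CAM payload wrapped for transmission (port 2001 is illustrative)
frame = llc_encapsulate(geonet_encapsulate(btp_encapsulate(b"CAM...", dest_port=2001)))
```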
BTP is a protocol for passing V2X messages generated in the facility layer 120 down to the lower layers. There are two BTP header types, type A and type B. A type A BTP header may contain the destination port and source port required for transmission and reception in bidirectional packet transmission. A type B BTP header can include the destination port and destination port info required for transmission in non-bidirectional packet transmission.
The fields included in the BTP header are as follows. The destination port identifies the facility entity corresponding to the destination of the data contained in the BTP packet (BTP-PDU). A BTP-PDU is the unit of transmission data in BTP.
The source port is a field generated in the case of the BTP-A type. The source port indicates the port of the facility layer 120 protocol entity at the source of the corresponding packet. This field can have a size of 16 bits.
The destination port info is a field generated in the case of the BTP-B type. It provides additional information when the destination port is a well-known port. This field can have a size of 16 bits.
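The two header variants described above can be pictured as follows. This is a sketch only: the field names follow the description above, and the encoding helpers are illustrative assumptions rather than the normative ASN.1/encoding rules.

```python
from dataclasses import dataclass

@dataclass
class BtpAHeader:
    destination_port: int  # facility entity addressed by the BTP-PDU (16 bits)
    source_port: int       # sending facility entity, present only in BTP-A (16 bits)

    def encode(self) -> bytes:
        return (self.destination_port.to_bytes(2, "big")
                + self.source_port.to_bytes(2, "big"))

@dataclass
class BtpBHeader:
    destination_port: int       # 16 bits
    destination_port_info: int  # extra info for well-known ports, BTP-B only (16 bits)

    def encode(self) -> bytes:
        return (self.destination_port.to_bytes(2, "big")
                + self.destination_port_info.to_bytes(2, "big"))
```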
A GeoNetworking packet includes a basic header and a common header according to the network layer protocol, and selectively includes an extension header according to the GeoNetworking mode.
An LLC packet is a GeoNetworking packet with an LLC header added. The LLC header provides the function of distinguishing between IP data and GeoNetworking data for transmission. IP data and GeoNetworking data can be distinguished by the SNAP (Subnetwork Access Protocol) ethertype.
When IP data is transmitted, the ethertype is set to 0x86DD and may be included in the LLC header. When GeoNetworking data is transmitted, the ethertype is set to 0x86DC and may be included in the LLC header. The receiver checks the ethertype field of the LLC packet header and, depending on its value, can forward the packet to the IP data path or the GeoNetworking path for processing.
The LLC header contains a DSAP (Destination Service Access Point) and an SSAP (Source Service Access Point). In the LLC header, the SSAP is followed by a control field (Control in FIG. 7), a protocol ID, and the ethertype.
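A receiver's dispatch on the ethertype might look like the following sketch. The parsing offsets assume the simplified 8-byte LLC/SNAP layout (DSAP, SSAP, Control, 3-byte protocol ID, 2-byte ethertype) and are illustrative only.

```python
ETHERTYPE_IPV6 = 0x86DD           # IP data
ETHERTYPE_GEONETWORKING = 0x86DC  # GeoNetworking data

def dispatch_llc_packet(llc_packet: bytes) -> str:
    # In LLC/SNAP, the 2-byte ethertype sits at the end of the 8-byte header:
    # DSAP, SSAP, Control, 3-byte protocol ID, then ethertype.
    ethertype = int.from_bytes(llc_packet[6:8], "big")
    payload = llc_packet[8:]  # would be handed to the selected stack
    if ethertype == ETHERTYPE_IPV6:
        return "IP data path"        # hand payload to the IPv6 stack
    if ethertype == ETHERTYPE_GEONETWORKING:
        return "GeoNetworking path"  # hand payload to the GN/BTP stack
    return "unknown"
```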
FIG. 8 shows the logical interfaces between the CA (Cooperative Awareness) service 128 and the other layers in the architecture of the V2X communication device.
The V2X communication device may provide various services for traffic safety and efficiency. One of these services may be the CA service 128. Cooperative awareness in road traffic means that road users and roadside infrastructure can know each other's positions, dynamics, and attributes. Road users are all users on or around roads subject to traffic safety and control, such as automobiles, trucks, motorcycles, bicycles, and pedestrians; roadside infrastructure refers to equipment such as road signs, traffic lights, barriers, and entrances.
Recognizing each other is the basis for applications such as traffic safety and traffic efficiency. Mutual awareness can be achieved by the regular exchange of information between road users, for example vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), infrastructure-to-vehicle (I2V), and everything-to-everything (X2X), based on wireless networks called V2X networks.
Applications for cooperative safe driving and traffic efficiency require advanced situational awareness, including the presence and behavior of road users around the V2X communication device. For example, a V2X communication device can establish situational awareness through its own sensors and through communication with other V2X communication devices. In this context, the CA service can specify how a V2X communication device announces its own position, behavior, and attributes by transmitting CAMs.
As shown in FIG. 8, the CA service 128 may be an entity of the facility layer 120; for example, the CA service 128 may be part of the application support domain of the facility layer 120.
The CA service 128 applies the CAM protocol and can provide two services: CAM transmission and CAM reception. The CA service 128 can also be called the CAM basic service.
For CAM transmission, the source ITS-S constructs a CAM. The CAM is passed to the network & transport layer 140 for transmission. The CAM may be transmitted directly from the source ITS-S to all ITS-Ss within communication range. The communication range varies with the transmission power of the source ITS-S. CAMs are generated periodically at a rate controlled by the CA service 128 of the source ITS-S. The generation rate is determined in consideration of state changes of the source ITS-S, for example changes in its position or speed, and of the load on the radio channel.
Upon receiving a CAM, the CA service 128 makes the contents of the CAM available to entities such as the applications 111 and/or the LDM 127.
CAMs are transmitted by all ITS-Ss participating in road traffic. The CA service 128 interfaces with entities of the facility layer 120 and with the application layer 110 in order to collect relevant information for CAM generation and to further process received CAM data.
In a vehicle, examples of ITS-S entities for data collection are the VDP (Vehicle Data Provider) 125, the POTI (Position and Time) unit 126, and the LDM 127. The VDP 125 is connected to the vehicle network and provides vehicle status information. The POTI unit 126 provides the position and time information of the ITS-S. The LDM 127 is a database within the ITS-S, as described in ETSI TR 102 863, which can be updated with received CAM data. The applications 111 can retrieve information from the LDM 127 for further processing.
The CA service 128 connects with the network & transport layer 140 via the NF-SAP (Network & Transport/Facilities Service Access Point) in order to exchange CAMs with other ITS-Ss. The CA service 128 connects with the security layer 160 via the SF-SAP (Security Facilities Service Access Point) for security services for CAM transmission and CAM reception. When the received CAM data is provided directly to the applications 111, the CA service 128 connects with the management layer 150 via the MF-SAP (Management/Facilities Service Access Point) and with the application layer 110 via the FA-SAP (Facilities/Applications Service Access Point).
FIG. 9 shows the functional blocks of the CA service 128 and its interfaces to other functions and layers. The CA service 128 provides four sub-functions: a CAM encoding unit 1281, a CAM decoding unit 1282, a CAM transmission management unit 1283, and a CAM reception management unit 1284.
The CAM encoding unit 1281 constructs and encodes CAMs according to a predefined format. The CAM decoding unit 1282 decodes received CAMs. The CAM transmission management unit 1283 executes the protocol operations of the source ITS-S for transmitting CAMs, for example starting and stopping CAM transmission, determining the CAM generation rate, and triggering CAM generation. The CAM reception management unit 1284 executes the protocol operations specified for the receiving ITS-S in order to receive CAMs; for example, it activates the CAM decoding function when a CAM is received. The CAM reception management unit 1284 also provides received CAM data to the applications 111 of the receiving ITS-S, and may perform information checks on received CAMs.
In order to provide received data, the CA service 128 provides the interface IF.CAM, as shown in FIG. 9. The interface IF.CAM is the interface to the LDM 127 or the applications 111. The interface to the application layer 110 may be implemented as an API (Application Programming Interface), through which data may be exchanged between the CA service 128 and the applications 111. The interface to the application layer 110 can also be implemented as the FA-SAP.
The CA service 128 interacts with other entities of the facility layer 120 to obtain the data necessary for CAM generation. The set of entities that provide data for CAMs are data-providing entities or data-providing facilities. Data is exchanged between the data-providing entities and the CA service 128 via the interface IF.FAC.
The CA service 128 exchanges information with the network & transport layer 140 via the interface IF.N&T. At the source ITS-S, the CA service 128 provides the CAM, together with protocol control information (PCI), to the network & transport layer 140. The PCI is control information according to ETSI EN 302 636-5-1. The CAM is embedded in a facility layer service data unit (FL-SDU).
At the receiving ITS-S, the network & transport layer 140 can provide received CAMs to the CA service 128.
The interface between the CA service 128 and the network & transport layer 140 depends on the services of the GeoNetworking/BTP stack, or of the IPv6 stack and the combined IPv6/GeoNetworking stack.
CAMs may rely on the services provided by the GeoNetworking (GN)/BTP stack. When this stack is used, Single-Hop Broadcast (SHB) is used as the GN packet transport type. In this case, only nodes within direct communication range can receive the CAM.
The PCI passed from the CA service 128 to the GeoNetworking/BTP stack may include the BTP type, destination port, destination port info, GN packet transport type, GN communication profile, and GN security profile. This PCI may also include the GN traffic class and the GN maximum packet lifetime.
CAMs can be transmitted using the IPv6 stack or the combined IPv6/GeoNetworking stack, as specified in ETSI TS 102 636-3. When the combined IPv6/GeoNetworking stack is used for CAM transmission, the interface between the CA service 128 and the combined IPv6/GeoNetworking stack may be the same as the interface between the CA service 128 and the IPv6 stack.
The CA service 128 can exchange primitives with the management entity of the management layer 150 via the MF-SAP. A primitive is an element of exchanged information, such as a command. In the case of the ITS-G5 access layer 130, the transmitting ITS-S obtains the setting information of T_GenCam_DCC from the management entity via the interface IF.Mng.
The CA service 128 can exchange primitives with the security entity of the ITS-S via the SF-SAP, using the interface IF.Sec provided by the security entity of the ITS-S.
Point-to-multipoint communication may be used for CAM transmission. A CAM is transmitted in a single hop from the source ITS-S only to receiving ITS-Ss located within the direct communication range of the source ITS-S. Because it is single-hop, a receiving ITS-S does not forward a received CAM.
The start of the CA service 128 may differ for different types of ITS-S, such as vehicle ITS-S, roadside ITS-S, and personal ITS-S. As long as the CA service 128 is active, CAM generation is managed by the CA service 128.
In the case of a vehicle ITS-S, the CA service 128 is started together with the activation of the ITS-S and is terminated when the ITS-S is deactivated.
The CAM generation interval is managed by the CA service 128. The CAM generation interval is the time interval between two consecutive CAM generations. The CAM generation interval is set so as not to fall below the minimum value T_GenCamMin, which is 100 ms; a generation interval of 100 ms corresponds to a CAM generation rate of 10 Hz. The CAM generation interval is also set so as not to exceed the maximum value T_GenCamMax, which is 1000 ms; a generation interval of 1000 ms corresponds to a CAM generation rate of 1 Hz.
Within this range of generation intervals, the CA service 128 triggers CAM generation according to the dynamics of the source ITS-S and the congestion state of the channel. When the dynamics of the source ITS-S shorten the CAM generation interval, that interval is maintained for several consecutive CAMs. The CA service 128 repeatedly checks the conditions that trigger CAM generation every T_CheckCamGen, where T_CheckCamGen is less than or equal to T_GenCamMin.
In the case of ITS-G5, the parameter T_GenCam_DCC provides the minimum generation time interval between two consecutive CAMs, reducing CAM generation in accordance with the channel usage requirements of decentralized congestion control (DCC) specified in ETSI TS 102 724. This facilitates adjusting the CAM generation rate to the remaining capacity of the radio channel during channel congestion.
T_GenCam_DCC is provided by the management entity in units of milliseconds. The range of values of T_GenCam_DCC is T_GenCamMin ≤ T_GenCam_DCC ≤ T_GenCamMax.
If the management entity provides T_GenCam_DCC with a value greater than or equal to T_GenCamMax, T_GenCam_DCC is set to T_GenCamMax. If the management entity provides T_GenCam_DCC with a value less than or equal to T_GenCamMin, or if T_GenCam_DCC is not provided, T_GenCam_DCC is set to T_GenCamMin.
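The clamping rule for T_GenCam_DCC described above can be captured in a few lines. This is a sketch; the function name is illustrative.

```python
from typing import Optional

T_GEN_CAM_MIN_MS = 100   # minimum CAM generation interval (T_GenCamMin)
T_GEN_CAM_MAX_MS = 1000  # maximum CAM generation interval (T_GenCamMax)

def effective_t_gen_cam_dcc(provided_ms: Optional[int]) -> int:
    # Not provided, or at/below the minimum: fall back to T_GenCamMin.
    if provided_ms is None or provided_ms <= T_GEN_CAM_MIN_MS:
        return T_GEN_CAM_MIN_MS
    # At/above the maximum: clamp to T_GenCamMax.
    if provided_ms >= T_GEN_CAM_MAX_MS:
        return T_GEN_CAM_MAX_MS
    return provided_ms
```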
Note that DCC and T_GenCam_DCC do not apply in LTE-V2X; in LTE-V2X, channel congestion control is managed by the access layer 130.
T_GenCam is the currently valid upper limit of the CAM generation interval. The default value of T_GenCam is T_GenCamMax. When a CAM is generated due to condition 1 below, T_GenCam is set to the time elapsed since the last CAM generation. After N_GenCam consecutive CAMs have been generated due to condition 2, T_GenCam is set back to T_GenCamMax.
The value of N_GenCam can be adjusted dynamically according to environmental conditions. For example, N_GenCam can be increased when approaching an intersection so that CAMs are received more frequently. The default and maximum value of N_GenCam is 3.
Condition 1 is that the time elapsed since the previous CAM generation is at least T_GenCam_DCC and at least one of the following ITS-S dynamics-related conditions holds. The first is that the absolute difference between the current heading of the source ITS-S and the heading included in the CAM previously transmitted by the source ITS-S is 4 degrees or more. The second is that the distance between the current position of the source ITS-S and the position included in the CAM previously transmitted by the source ITS-S is 4 m or more. The third is that the absolute difference between the current speed of the source ITS-S and the speed included in the CAM previously transmitted by the source ITS-S exceeds 0.5 m/s.
Condition 2 is that the time elapsed since the previous CAM generation is at least T_GenCam and, in the case of ITS-G5, at least T_GenCam_DCC.
When either of these two conditions is satisfied, the CA service 128 immediately generates a CAM.
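As an illustration of how these trigger rules combine, the following sketch evaluates conditions 1 and 2 with the thresholds stated above. The state objects and helper names are assumptions; for brevity, the heading comparison ignores wrap-around at 360 degrees.

```python
import math

def should_generate_cam(now_ms, last_cam, current, t_gen_cam_ms, t_gen_cam_dcc_ms):
    """last_cam / current carry heading_deg, position (x, y in metres) and
    speed_mps; last_cam.time_ms is the time of the previous CAM generation."""
    elapsed = now_ms - last_cam.time_ms

    # Condition 1: elapsed >= T_GenCam_DCC and a dynamics threshold is crossed.
    if elapsed >= t_gen_cam_dcc_ms:
        heading_changed = abs(current.heading_deg - last_cam.heading_deg) >= 4.0
        moved = math.dist(current.position, last_cam.position) >= 4.0
        speed_changed = abs(current.speed_mps - last_cam.speed_mps) > 0.5
        if heading_changed or moved or speed_changed:
            return True

    # Condition 2: elapsed >= T_GenCam (and also >= T_GenCam_DCC for ITS-G5).
    return elapsed >= max(t_gen_cam_ms, t_gen_cam_dcc_ms)
```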
When the CA service 128 generates a CAM, it generates the mandatory containers. The mandatory containers mainly carry the highly dynamic information of the source ITS-S; this highly dynamic information is carried in the basic container and the high-frequency container. A CAM can also include optional data, mainly the non-dynamic status of the source ITS-S and information for specific types of source ITS-S. The non-dynamic status of the source ITS-S is carried in the low-frequency container, and the information for specific types of source ITS-S is carried in the special vehicle container.
The low-frequency container is included in the first CAM after the CA service 128 is started. Thereafter, the low-frequency container is included in a CAM when 500 ms or more have elapsed since the generation of the last CAM that included a low-frequency container.
The special vehicle container is likewise included in the first CAM generated after the CA service 128 is started. Thereafter, the special vehicle container is included in a CAM when 500 ms or more have elapsed since the generation of the last CAM that included a special vehicle container.
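The 500 ms inclusion rule, which applies in the same form to both optional containers, can be sketched as follows; the class and attribute names are illustrative.

```python
INCLUSION_INTERVAL_MS = 500  # applies to both LF and special vehicle containers

class OptionalContainerScheduler:
    def __init__(self):
        self.last_included_ms = None  # None until the first CAM after service start

    def include_in_this_cam(self, now_ms: int) -> bool:
        # Always included in the first CAM after the CA service starts, then
        # again whenever >= 500 ms have passed since the last inclusion.
        if (self.last_included_ms is None
                or now_ms - self.last_included_ms >= INCLUSION_INTERVAL_MS):
            self.last_included_ms = now_ms
            return True
        return False
```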
The CAM generation rate of a roadside ITS-S is likewise defined by the time interval between two consecutive CAM generations. A roadside ITS-S needs to be configured such that at least one CAM is transmitted while a vehicle is within the communication area of the roadside ITS-S. In addition, the CAM generation time interval must be 1000 ms or more; taking a generation time interval of 1000 ms as the maximum, the maximum CAM generation rate is 1 Hz.
Note that the probability that a passing vehicle receives a CAM from an RSU depends on the CAM generation rate and on the time the vehicle spends within the communication area. This time depends on the vehicle speed and on the transmission power of the RSU.
In addition to the CAM generation rate, the time required for CAM generation and the freshness of the data used to construct the message determine the applicability of the data at the receiving ITS-S. In order for received CAMs to be interpreted correctly, each CAM is given a timestamp. It is preferable that time synchronization is established between different ITS-Ss.
The time required for CAM generation is 50 ms or less. The time required for CAM generation is the difference between the time at which CAM generation is started and the time at which the CAM is delivered to the network & transport layer 140.
The timestamp indicated in the CAM of a vehicle ITS-S corresponds to the time at which the reference position of the source ITS-S indicated in that CAM was determined. The timestamp indicated in the CAM of a roadside ITS-S is the time at which the CAM was generated.
Certificates may be used to authenticate messages transmitted between ITS-Ss. A certificate indicates the permission of its holder to transmit a specific set of messages, and can also indicate authority over specific data elements within a message. In a certificate, the granted permissions are indicated by a pair of identifiers, the ITS-AID and the SSP.
The ITS-Application Identifier (ITS-AID) indicates the overall type of permission granted. For example, there is an ITS-AID indicating that the sender has the right to transmit CAMs.
The Service Specific Permissions (SSP) is a field that indicates a specific set of permissions within the overall permissions indicated by the ITS-AID. For example, there may be an SSP value associated with the ITS-AID for CAM indicating that the sender is entitled to transmit CAMs for a particular vehicle role.
A received signed CAM is accepted by the receiver if the certificate is valid and the CAM is consistent with the ITS-AID and SSP of that certificate. A CAM is signed using the private key associated with an Authorization Ticket containing an SSP of type BitmapSsp.
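Conceptually, the acceptance check on the receiver side combines certificate validation, signature verification, and a permission match. The sketch below is an assumption-level illustration: the certificate fields, helper functions, and the ITS-AID value are hypothetical and do not reflect an actual security library API.

```python
ITS_AID_CAM = 36  # example value only; the registered ITS-AID is defined externally

def accept_signed_cam(cam, certificate, verify_signature) -> bool:
    # 1. The certificate itself must be valid (chain, lifetime, region, ...).
    if not certificate.is_valid():
        return False
    # 2. The signature over the CAM must verify against the certificate's key.
    if not verify_signature(cam.payload, cam.signature, certificate.public_key):
        return False
    # 3. The certificate must grant CAM-sending permission (ITS-AID), and the
    #    CAM content must be consistent with the SSP bits of the certificate.
    if certificate.its_aid != ITS_AID_CAM:
        return False
    return ssp_permits(certificate.ssp_bitmap, cam)

def ssp_permits(ssp_bitmap: int, cam) -> bool:
    # Placeholder: a real check tests the SSP bits corresponding to the
    # special container / vehicle role the CAM actually carries.
    return True
```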
FIG. 10 is a diagram showing the format of a CAM. As shown in FIG. 10, a CAM may include an ITS protocol data unit (PDU) header and multiple containers. The ITS PDU header contains information on the protocol version, the message type, and the ID of the source ITS-S.
The CAM of a vehicle ITS-S includes one basic container and one high-frequency container (hereinafter, HF container), and can further include one low-frequency container (hereinafter, LF container) and one or more other special vehicle containers. A special vehicle container is sometimes called a special container.
The basic container contains basic information about the source ITS-S. Specifically, the basic container can include the station type and the position of the station. The station type is, for example, vehicle or roadside unit (RSU). When the station type is vehicle, the vehicle category may be included. The position may include latitude, longitude, altitude, and confidence.
In the CAM of a vehicle ITS-S, the HF container contains a vehicle HF container. When the station type is not vehicle, the HF container can contain another container that is not a vehicle HF container.
The vehicle HF container contains dynamic state information of the source vehicle ITS-S that changes over short time scales. Specifically, the vehicle HF container may include one or more of the heading of the vehicle, the vehicle speed, the drive direction, the vehicle length, the vehicle width, the vehicle longitudinal acceleration, the road curvature, the curvature calculation mode, and the yaw rate. The drive direction indicates either forward or backward. The curvature calculation mode is a flag indicating whether the yaw rate of the vehicle is used in calculating the curvature.
The vehicle HF container can further include one or more of the following information: the longitudinal acceleration control status of the vehicle, the lane position, the steering wheel angle, the lateral acceleration, the vertical acceleration, the performance class, and the position of a DSRC toll collection station. The performance class is a value that determines the maximum age of the data elements of the CAM.
In the CAM of a vehicle ITS-S, the LF container contains a vehicle LF container. When the station type is not vehicle, the LF container can contain another container that is not a vehicle LF container.
The vehicle LF container can include one or more of the vehicle role, the lighting state of the exterior lights, and the path history. The vehicle role is a classification used when the vehicle is a special vehicle. The exterior lights field refers to the most important of the exterior lights the vehicle is equipped with. The path history indicates the movement of the vehicle over a certain past time or distance, and is represented by a list of waypoints (for example, 23 points).
The special vehicle container is a container for vehicle ITS-Ss that have a special role in road traffic, such as public transport. The special vehicle container may contain any one of a public transport container, a special transport container, a dangerous goods container, a road works container, an ambulance container, an emergency container, and a safety confirmation vehicle container.
The public transport container is a container for public transport vehicles such as buses; it is used for public transport vehicles to control boarding status, traffic lights, barriers, bollards, and the like. The special transport container is included when the vehicle is a heavy vehicle and/or an oversized vehicle. The dangerous goods container is included when the vehicle is transporting dangerous goods, and stores information indicating the type of dangerous goods. The road works container is included when the vehicle is a vehicle participating in road works; it stores codes indicating the type and cause of the road works, and can also include information indicating the open/closed status of the lanes ahead. The ambulance container is included when the vehicle is an ambulance vehicle engaged in rescue operations; it indicates the usage of the light bar and siren and the emergency priority. The emergency container is included when the vehicle is an emergency vehicle engaged in emergency operations; it indicates the usage of the light bar and siren, a cause code, and the emergency priority. The safety confirmation vehicle container is included when the vehicle is a safety confirmation vehicle, that is, a vehicle accompanying a special transport vehicle or the like; it indicates the usage of the light bar and siren, overtaking regulations, and speed limits.
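Putting the container structure described above together, the CAM of a vehicle ITS-S can be pictured roughly as follows. This is a sketch of the nesting only; the field sets are abbreviated and the names are illustrative, not the exact ASN.1 definitions.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class BasicContainer:
    station_type: str          # e.g. "vehicle" or "roadSideUnit"
    latitude: float
    longitude: float
    altitude: float

@dataclass
class VehicleHFContainer:      # highly dynamic state, present in every CAM
    heading_deg: float
    speed_mps: float
    drive_direction: str       # "forward" or "backward"
    yaw_rate_dps: float

@dataclass
class VehicleLFContainer:      # slowly changing state, included at most every 500 ms
    vehicle_role: str
    exterior_lights: int
    path_history: List[Tuple[float, float]]  # list of waypoints, e.g. up to 23 points

@dataclass
class Cam:
    its_pdu_header: dict       # protocol version, message type, source ITS-S ID
    basic: BasicContainer
    high_frequency: VehicleHFContainer
    low_frequency: Optional[VehicleLFContainer] = None
    special_vehicle: Optional[dict] = None
```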
FIG. 11 is a flowchart showing the process of transmitting a CAM. The process shown in FIG. 11 is executed in every CAM generation cycle. In step S201, the information constituting the CAM is acquired; step S201 is executed, for example, by the detection information acquisition unit 201. In step S202, a CAM is generated based on the information acquired in step S201; step S202 can also be executed by the detection information acquisition unit 201. In step S203, the transmission unit 221 transmits the CAM generated in step S202 to the surroundings of the vehicle.
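The flow of FIG. 11 corresponds to a simple periodic routine, sketched below. The unit objects and method names are assumptions standing in for the detection information acquisition unit 201 and the transmission unit 221.

```python
def cam_transmission_cycle(detection_info_unit, transmission_unit):
    # S201: acquire the information that makes up the CAM
    info = detection_info_unit.acquire_cam_information()
    # S202: generate the CAM from the acquired information
    cam = detection_info_unit.generate_cam(info)
    # S203: broadcast the generated CAM to the surroundings of the vehicle
    transmission_unit.send(cam)
```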
With the CA service 128, V2X communication devices can support traffic safety by periodically providing their own positions and statuses to surrounding V2X communication devices. However, the CA service 128 has the limitation that only information about the corresponding V2X communication device itself can be shared. To overcome this limitation, the development of services such as the CP service 124 is necessary.
As shown in FIG. 12, the CP service 124 may also be an entity of the facility layer 120; for example, the CP service 124 may be part of the application support domain of the facility layer 120. The CP service 124 may differ fundamentally from the CA service 128 in that, for example, it does not receive input data about the host V2X communication device from the VDP 125 or the POTI unit 126.
CPM transmission includes CPM generation and transmission. In the CPM generation process, the originating V2X communication device generates a CPM, which is then passed to the network & transport layer 140 for transmission. The originating V2X communication device may also be referred to as the source V2X communication device, the host V2X communication device, and so on.
The CP service 124 may connect with other entities within the facility layer 120 and with V2X applications in order to collect relevant information for CPM generation and to deliver received CPM content for further processing. In a V2X communication device, an entity for data collection may be a function that provides object detection in a host object detector.
Further, in order to deliver (or transmit) CPMs, the CP service 124 may use the services provided by the protocol entities of the network & transport layer 140. For example, the CP service 124 may connect with the network & transport layer 140 through the NF-SAP to exchange CPMs with other V2X communication devices. The NF-SAP is the service access point between the network & transport layer 140 and the facility layer 120.
Further, the CP service 124 may connect with the security entity through the SF-SAP, the SAP between the security layer 160 and the facility layer 120, in order to access security services for CPM transmission and CPM reception. The CP service 124 may also connect with the management entity through the MF-SAP, the SAP between the management layer 150 and the facility layer 120. When the CP service 124 provides received CPM data directly to applications, it may connect with the application layer 110 through the FA-SAP, the SAP between the facility layer 120 and the application layer 110.
The CP service 124 can specify how a V2X communication device notifies other V2X communication devices of the positions, behaviors, and attributes of detected surrounding road users and other objects. For example, by transmitting CPMs, the CP service 124 can share the information contained in the CPMs with other V2X communication devices. The CP service 124 may be a function that can be added to all types of target information communication devices participating in road traffic.
A CPM is a message exchanged between V2X communication devices via the V2X network. CPMs can be used to generate collective perception of road users and other objects detected and/or recognized by V2X communication devices. The detected road users or objects may be, but are not limited to, road users or objects not equipped with a V2X communication device.
As described above, a V2X communication device that shares information via CAMs shares only information about its own state with other V2X communication devices in order to achieve cooperative awareness. In this case, road users and others not equipped with a V2X communication device are not part of the system, which limits the view of situations relevant to safety and traffic management.
One way to improve this is for a system that is equipped with a V2X communication device and can recognize road users and objects not equipped with V2X communication devices to notify other V2X communication devices of the presence and states of those road users and objects. In this way, because the CP service 124 cooperatively recognizes the presence of road users and objects not equipped with V2X communication devices, the safety and traffic management performance of systems equipped with V2X communication devices can be readily improved.
CPM delivery may differ depending on the applied communication system. For example, in an ITS-G5 network as defined in ETSI EN 302 663, a CPM may be transmitted from the originating V2X communication device directly to all V2X communication devices within communication range. The communication range can be specifically influenced by the originating V2X communication device by changing the transmission power according to the region concerned.
Further, CPMs may be generated periodically at a rate controlled by the CP service 124 in the originating V2X communication device. The generation rate may be determined in consideration of the radio channel load determined by decentralized congestion control (DCC). The generation rate may also be determined in consideration of the state of detected non-V2X objects, for example the dynamic behavior of their position, speed, or direction, and of the transmission of CPMs for the same perceived object by other V2X communication devices.
Further, when a receiving V2X communication device receives a CPM, the CP service 124 makes the contents of the CPM available to functions within the receiving V2X communication device, for example V2X applications and/or the LDM 127. For example, the LDM 127 may be updated with received CPM data, and V2X applications may retrieve this information from the LDM 127 for further processing.
FIG. 13 is a functional block diagram of the CP service 124 in this embodiment. More specifically, FIG. 13 illustrates the functional blocks of the CP service 124 and the functional blocks having interfaces to other functions and layers.
As shown in FIG. 13, the CP service 124 can provide the following sub-functions for CPM transmission and reception. The CPM encoding unit 1241 constructs or generates CPMs according to a predefined format; the latest in-vehicle data may be included in the CPM. The CPM decoding unit 1242 decodes received CPMs. The CPM transmission management unit 1243 executes the protocol operations of the originating V2X communication device; these operations may include starting and stopping CPM transmission, determining the CPM generation rate, and triggering CPM generation. The CPM reception management unit 1244 can execute the protocol operations of the receiving V2X communication device; specifically, these can include triggering the CPM decoding function upon CPM reception, providing received CPM data to the LDM 127 or to V2X applications of the receiving V2X communication device, and checking the information of received CPMs.
Next, CPM delivery will be described in detail: the requirements for CPM delivery, the start and termination of the CP service, CPM trigger conditions, the CPM generation cycle, constraints, and so on. Point-to-multipoint communication may be used for CPM delivery. For example, when ITS-G5 is used for CPM delivery, the control channel (G5-CCH) may be used. CPM generation may be triggered and managed by the CP service 124 while the CP service 124 is running. The CP service 124 may be started together with the activation of the V2X communication device and terminated when the V2X communication device is shut down.
The host V2X communication device may transmit a CPM whenever at least one object is detected with sufficient confidence that it needs to be exchanged with nearby V2X communication devices. Regarding the inclusion of detected objects, the CP service should consider the trade-off between object freshness and channel utilization. For example, from the viewpoint of applications that use the information received in CPMs, updated information should be provided as frequently as possible; from the viewpoint of the ITS-G5 stack, however, a low transmission rate is required in order to minimize channel utilization. The V2X communication device therefore desirably takes this into account and includes detected objects and object information in CPMs appropriately. In addition, to keep the message size small, objects need to be evaluated before being transmitted.
FIG. 14 is a diagram showing the structure of a CPM. The CPM structure shown in FIG. 14 may be the basic CPM structure. As described above, a CPM may be a message exchanged between V2X communication devices in a V2X network. CPMs may also be used to generate collective perception of road users and/or other objects detected and/or recognized by V2X communication devices. That is, a CPM may be an ITS message for generating collective perception of objects detected by V2X communication devices.
A CPM may include state information and attribute information of the road users and objects detected by the originating V2X communication device. Its content may vary depending on the type of detected road user or object and on the detection capabilities of the originating V2X communication device. For example, when the object is a vehicle, the state information may include at least information about the actual time, position, and motion state. The attribute information may include attributes such as dimensions, vehicle type, and role in road traffic.
A CPM complements a CAM and may work in a similar way, that is, to enhance cooperative awareness. A CPM may contain externally observable information about detected road users or objects. The CP service 124 may include a method of reducing the duplication or redundancy of CPMs transmitted by different V2X communication devices by checking the CPMs transmitted by other stations.
By receiving a CPM, the receiving V2X communication device may recognize the presence, type, and state of the road users or objects detected by the originating V2X communication device. The received information may be used by the receiving V2X communication device to support V2X applications for enhancing safety and improving traffic efficiency and travel time. For example, by comparing the received information with the states of the road users or objects it has detected itself, the receiving V2X communication device can estimate the risk of a collision with a road user or object. Furthermore, the receiving V2X communication device may notify the user via its human-machine interface (HMI) or may automatically take corrective action.
The basic format of a CPM will be described with reference to FIG. 14. The CPM format may be presented in ASN.1 (Abstract Syntax Notation One). Data elements (DE) and data frames (DF) not defined in this disclosure may be derived from the common data dictionary specified in ETSI TS 102 894-2. As shown in FIG. 14, a CPM may include an ITS protocol data unit (PDU) header and multiple containers.
 ITS PDUヘッダは、プロトコルバージョン、メッセージタイプ、及び発信元のV2X通信装置のITS IDに関する情報を含むヘッダである。ITS PDUヘッダは、ITSメッセージで使用される共通のヘッダであり、ITSメッセージの開始部分に存在する。ITS PDUヘッダは、共通ヘッダと呼ばれることもある。 The ITS PDU header is a header that contains information about the protocol version, message type, and ITS ID of the source V2X communication device. The ITS PDU header is a common header used in ITS messages and is present at the beginning of the ITS message. The ITS PDU header is sometimes called a common header.
 複数のコンテナは、管理コンテナ(Management Container)、ステーションデータコンテナ(Station Data Container)、センサ情報コンテナ(Sensor Information Container)、認識物体コンテナ(Perceived Object Container)、フリースペース追加コンテナ(Free Space Addendum Container)を含むことができる。ステーションデータコンテナは発信車両コンテナ(Originating Vehicle Container)あるいは発信路側機コンテナ(Originating RSU Container)を含むことができる。センサ情報コンテナは視野情報コンテナ(Field-of-View Container)と呼ばれることもある。発信車両コンテナはOVCと記載することもある。視野コンテナはFOCと記載することもある。認識物体コンテナはPOCと記載することもある。CPMは、必須のコンテナとして管理コンテナを含み、ステーションデータコンテナ、センサ情報コンテナ、POCおよびフリースペース付属コンテナを任意のコンテナとしてもよい。センサ情報コンテナ、認識物体コンテナおよびフリースペース付属コンテナは複数のコンテナであってもよい。以下、各コンテナについて説明する。 Multiple containers are Management Container, Station Data Container, Sensor Information Container, Perceived Object Container, Free Space Addendum Container. can contain. A station data container may include an originating vehicle container or an originating roadside unit container (RSU container). A sensor information container is sometimes called a field-of-view container. An originating vehicle container may also be referred to as an OVC. A field of view container may also be described as an FOC. A recognition object container may also be described as a POC. The CPM includes the management container as a mandatory container, and the station data container, sensor information container, POC and free space ancillary containers may be optional containers. The sensor information container, the perceived object container and the free space attachment container may be multiple containers. Each container will be described below.
The Management Container provides basic information about the originating ITS-S, regardless of whether it is a vehicle-type or roadside-unit-type station. It may include the station type, the reference position, segmentation information, and the number of perceived objects. The station type indicates the type of the ITS-S. The reference position is the position of the originating ITS-S. The segmentation information describes how a CPM is split into multiple messages when message-size constraints require it.
Table 1, shown in FIG. 15, is an example of the OVC in the Station Data Container of a CPM. It lists the data elements (DEs) and/or data frames (DFs) included in an example OVC. The Station Data Container becomes an OVC when the originating ITS-S is a vehicle; when the originating ITS-S is an RSU, it becomes an Originating RSU Container, which contains an ID for the road or intersection where the RSU is located.
A DE is a data type that contains a single datum. A DF is a data type that contains one or more elements in a predetermined order; for example, a DF may contain one or more DEs and/or one or more DFs in a predefined order.
DEs and DFs may be used to construct facility-layer messages or application-layer messages. Examples of facility-layer messages are the CAM, the CPM, and the DENM.
As shown in Table 1, the OVC contains basic information about the V2X communication device that originates the CPM. The OVC can be interpreted as a scaled-down version of the CAM; however, it may include only the DEs required for the coordinate transformation process. That is, the OVC is similar to the CAM in that it provides basic information about the originating V2X communication device, but the information it contains focuses on supporting the coordinate transformation process.
The OVC can provide the following: the latest geographic position of the originating V2X communication device obtained by the CP service 124 at the time of CPM generation, the absolute lateral and longitudinal velocity components of the originating V2X communication device, and the geometric dimensions of the originating V2X communication device.
The generation delta time shown in Table 1 is a DE indicating the time corresponding to the time of the reference position in the CPM. It can be regarded as the generation time of the CPM; in this disclosure, the generation delta time may be referred to simply as the generation time.
The reference position is a DF indicating the geographic position of the V2X communication device, that is, the location of a geographic point. It includes information on latitude, longitude, position confidence, and/or altitude. Latitude and longitude represent the latitude and longitude of the geographic point, the position confidence represents the accuracy of the geographic position, and the altitude represents the altitude of the geographic point and its accuracy.
The heading is a DF indicating the orientation in the coordinate system. It includes a heading value and/or a heading confidence. The heading value indicates the direction of travel relative to north, and the heading confidence indicates that the reported heading value has a preset level of confidence.
The longitudinal speed is a DF that can describe the longitudinal speed of a moving body (for example, a vehicle) and the accuracy of the speed information. It includes a speed value and/or a speed accuracy; the speed value represents the longitudinal speed, and the speed accuracy represents the accuracy of that value.
The lateral speed is a DF that can describe the lateral speed of a moving body (for example, a vehicle) and the accuracy of the speed information. It includes a speed value and/or a speed accuracy; the speed value represents the lateral speed, and the speed accuracy represents the accuracy of that value.
The vehicle length is a DF that can describe the vehicle length and an accuracy indicator. It includes a vehicle-length value and/or a vehicle-length accuracy indicator; the value represents the length of the vehicle, and the accuracy indicator represents the reliability of that length.
The vehicle width is a DE indicating the width of the vehicle, for example the width including the side mirrors. If the vehicle width is 6.1 m or more, the value is set to 61; if the information cannot be obtained, it is set to 62.
Except for the generation delta time, each DE/DF shown in Table 1 can refer to ETSI 102 894-2, as indicated in the right-hand column of Table 1. ETSI 102 894-2 defines the common data dictionary (CDD). For the generation delta time, ETSI EN 302 637-2 can be referred to.
In addition to the above information, the OVC may contain information on the vehicle orientation angle, vehicle driving direction, longitudinal acceleration, lateral acceleration, vertical acceleration, yaw rate, pitch angle, roll angle, vehicle height, and trailer data.
Table 2, shown in FIG. 16, is an example of the SIC (or FOC) in a CPM. The SIC provides a description of at least one sensor mounted on the originating V2X communication device; if the device carries multiple sensors, multiple descriptions may be added. For example, the SIC provides information about the sensor capabilities of the originating V2X communication device. To this end, general sensor characteristics, such as the sensor mounting position on the originating V2X communication device, the sensor type, and the sensor range and opening angles (that is, the sensor frustum), may be included as part of the message. The receiving V2X communication device may use this information to select a prediction model appropriate to the sensor's performance.
The various items of SIC information are described with reference to Table 2. The sensor ID is a sensor-specific ID for identifying the sensor that detected an object. In one embodiment, the sensor ID is a random number generated when the V2X communication device starts up and is not changed until the V2X communication device shuts down.
The sensor type indicates the type of the sensor, for example: undefined (0), radar (1), lidar (2), mono video (3), stereo vision (4), night vision (5), ultrasonic (6), PMD (7), fusion (8), induction loop (9), spherical camera (10), or a combination of these (11). PMD stands for photo mixing device. A spherical camera is also called a 360-degree camera.
In the sensor position, the X position indicates the mounting position of the sensor in the negative X direction, and the Y position indicates the mounting position of the sensor in the Y direction. These mounting positions are measured from the reference position, for which ETSI EN 302 637-2 can be referred to. The radius indicates the manufacturer-defined average detection range of the sensor.
In the opening angle, the start angle indicates the start angle of the sensor frustum and the end angle indicates the end angle of the sensor frustum. The quality class represents a classification of the sensor that defines the quality of the measured object.
In addition to the above information, the SIC may contain information on the detection area and on the reliability of free space.
Table 3, shown in FIG. 17, is an example of the POC in a CPM. The POC is used to describe the objects perceived by the sensors, as seen from the transmitting V2X communication device. A receiving V2X communication device that receives the POC can, with the help of the OVC, perform a coordinate transformation process that converts the object positions into the reference coordinate system of the receiving vehicle; a sketch of one such transformation follows.
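The following is a minimal flat-earth approximation in Python, assuming the OVC supplies the originator's latitude, longitude, and heading, and the POC supplies the object's longitudinal (X) and lateral (Y) distances in the originator's frame. The function name, the frame convention (X forward, Y to the right, heading clockwise from north), and the small-offset approximation are assumptions for illustration, not the procedure mandated by the standard.

import math

EARTH_RADIUS_M = 6_371_000.0  # mean earth radius; adequate for short sensor ranges

def poc_to_absolute(orig_lat_deg, orig_lon_deg, heading_deg, x_dist_m, y_dist_m):
    """Convert a POC-relative object position into latitude/longitude."""
    h = math.radians(heading_deg)
    # Rotate the body-frame offset into north/east components.
    north = x_dist_m * math.cos(h) - y_dist_m * math.sin(h)
    east = x_dist_m * math.sin(h) + y_dist_m * math.cos(h)
    lat = orig_lat_deg + math.degrees(north / EARTH_RADIUS_M)
    lon = orig_lon_deg + math.degrees(
        east / (EARTH_RADIUS_M * math.cos(math.radians(orig_lat_deg))))
    return lat, lon

The receiving vehicle can then apply the inverse rotation about its own position and heading to express the result in its own vehicle-centered frame.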
To keep the message size small, multiple DEs are optional and may be provided when the originating V2X communication device can supply them.
The POC may consist of a selection of DEs that provide an abstract description of a perceived (or detected) object. For example, the relative distance, speed information, and timing information of a perceived object relative to the originating V2X communication device may be included in the POC as mandatory DEs. Additional DEs may be provided if the sensors of the originating V2X communication device can supply the requested data.
Each item of information (DE or DF) is described with reference to Table 3. The measurement time indicates the time, in microseconds, from the reference time of the message; it defines the relative age of the measured object.
The object ID is a unique random ID assigned to an object. This ID is retained (that is, not changed) while the object is being tracked, in other words while it is considered in the data fusion process of the originating V2X communication device.
The sensor ID corresponds to the sensor ID DE in Table 2. This DE may be used to associate the object information with the sensor that performed the measurement.
The longitudinal distance includes a distance value and a distance confidence. The distance value indicates the relative X distance to the object in the originator's reference coordinate system, and the distance confidence indicates the reliability of that X distance.
The lateral distance likewise includes a distance value and a distance confidence. The distance value indicates the relative Y distance to the object in the originator's reference coordinate system, and the distance confidence indicates the reliability of that Y distance.
The longitudinal speed indicates the longitudinal speed of the detected object together with its confidence, and the lateral speed indicates the lateral speed of the detected object together with its confidence. For the longitudinal and lateral speeds, the CDD of TS 102 894-2 can be referred to.
The object heading, when provided by the data fusion process, indicates the absolute heading of the object in the reference coordinate system. The object length indicates the measured length of the object, and the length confidence indicates the reliability of that measurement. The object width indicates the measured width of the object, and the width confidence indicates the reliability of that measurement. The object type, when provided by the data fusion process, represents the classification of the object; classifications may include vehicle, person, animal, and other. In addition to the above information, the POC may contain information on the object confidence, vertical distance, vertical speed, longitudinal acceleration, lateral acceleration, vertical acceleration, object height, dynamic state of the object, and matched position (including lane ID and longitudinal lane position).
The Free Space Addendum Container indicates information about the free space recognized by the originating V2X communication device (that is, free space information). Free space is an area considered not to be occupied by road users or obstacles, and can also be called empty space. It can also be described as space into which a moving body traveling with the originating V2X communication device could move.
The Free Space Addendum Container is not a mandatory container and can be added optionally. It can be added when there is a difference between the free space recognized by another V2X communication device, as computed from a CPM received from that device, and the free space recognized by the originating V2X communication device. A Free Space Addendum Container may also be added to the CPM periodically.
The Free Space Addendum Container contains information that identifies the free-space region. Free space can be specified in various shapes; the shape can be expressed, for example, as a polygon, a circle, an ellipse, or a rectangle. To express free space as a polygon, the positions of the points forming the polygon and the order in which they are connected are specified (see the sketch after this paragraph). To express it as a circle, the position of the circle's center and its radius are specified. To express it as an ellipse, the position of the ellipse's center and its major and minor axes are specified.
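As an illustration of how a receiver might consume such shape descriptions, the following Python sketch tests whether a point lies inside a circular or polygonal free-space region. The representation and function names are assumptions for illustration and are not part of the CPM definition.

import math

def point_in_circle(px, py, cx, cy, radius):
    """True if point (px, py) lies within a circular free-space region."""
    return math.hypot(px - cx, py - cy) <= radius

def point_in_polygon(px, py, vertices):
    """Ray-casting test; vertices are (x, y) pairs in their connection order."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        crosses = (y1 > py) != (y2 > py)
        if crosses and px < (x2 - x1) * (py - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside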
The Free Space Addendum Container may include the confidence of the free space, expressed numerically; the confidence may also indicate that it is unknown. The container may also include information about shadow regions. A shadow region is the area behind an object as seen from the vehicle or from a sensor mounted on the vehicle.
FIG. 18 illustrates sensor data extraction methods used by a V2X communication device that provides the CP service 124. More specifically, FIG. 18(a) shows how a V2X communication device extracts sensor data at a low level, and FIG. 18(b) shows how it extracts sensor data at a high level.
The source of the sensor data transmitted as part of a CPM should be selected according to the requirements of the subsequent data fusion process in the receiving V2X communication device. In general, the transmitted data should be as close as possible to the original sensor data. However, simply transmitting the original sensor data, for example raw data, is not practical, because it would impose very high demands on data rate and transmission period.
FIGS. 18(a) and 18(b) show possible embodiments for selecting the data to be transmitted as part of a CPM. In the embodiment of FIG. 18(a), sensor data is obtained from different sensors and processed as part of a low-level data management entity. This entity selects the object data to be inserted into the next CPM and can also compute the plausibility of the detected objects. In FIG. 18(a), the data of each individual sensor is transmitted, so the amount of data sent over the V2X network increases; however, the receiving V2X communication device can make efficient use of the sensor information.
In the embodiment of FIG. 18(b), sensor data or object data provided by a data fusion unit specific to the V2X communication device manufacturer is transmitted as part of the CPM.
In FIG. 18(b), integrated sensor data merged by the data fusion unit is transmitted, which has the advantage that less data is sent over the V2X network. However, it has the disadvantage of depending on the collection scheme of the V2X communication device that gathers the sensor information, and different manufacturers may implement different data fusion processes. A sketch of the two payload-construction styles follows.
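The trade-off between the two extraction levels can be sketched as follows, assuming each sensor object exposes a tracked-object list; the function names and the fuse() callback are illustrative placeholders, since the actual fusion logic is manufacturer-specific.

def build_payload_low_level(sensors):
    # FIG. 18(a): one object list per sensor; larger payload, richer detail.
    return [{"sensor_id": s.sensor_id, "objects": s.tracked_objects()}
            for s in sensors]

def build_payload_high_level(sensors, fuse):
    # FIG. 18(b): a single fused object list; smaller payload.
    merged = []
    for s in sensors:
        merged.extend(s.tracked_objects())
    return {"objects": fuse(merged)}  # fuse() is the manufacturer-specific step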
Each time an object is detected by a sensor of the V2X communication device, its plausibility needs to be computed. If the plausibility of an object exceeds a predetermined threshold PLAUS_OBJ, transmission of the object should be considered.
For example, transmission is considered if the absolute difference between the current yaw angle of a detected object and the yaw angle included in a CPM previously transmitted by the originating V2X communication device exceeds 4 degrees. Transmission may also be considered if the difference between the current relative distance between the originating V2X communication device and the detected object and the corresponding relative distance included in a previously transmitted CPM exceeds 4 m, or if the absolute difference between the current speed of the detected object and the speed of that object included in a previously transmitted CPM exceeds 0.5 m/s. A sketch of this decision follows.
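The sketch below uses the thresholds quoted above (4 degrees, 4 m, 0.5 m/s). The value of PLAUS_OBJ and the object field names are assumptions for illustration.

PLAUS_OBJ = 0.5          # assumed plausibility threshold
YAW_THRESH_DEG = 4.0
DIST_THRESH_M = 4.0
SPEED_THRESH_MPS = 0.5

def should_transmit(current, last_sent, plausibility):
    """Decide whether a detected object should be included in the next CPM.

    current and last_sent carry 'yaw_deg', 'distance_m', and 'speed_mps';
    last_sent is None if the object has never been transmitted before.
    """
    if plausibility <= PLAUS_OBJ:
        return False
    if last_sent is None:
        return True
    return (abs(current["yaw_deg"] - last_sent["yaw_deg"]) > YAW_THRESH_DEG
            or abs(current["distance_m"] - last_sent["distance_m"]) > DIST_THRESH_M
            or abs(current["speed_mps"] - last_sent["speed_mps"]) > SPEED_THRESH_MPS)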
The CAM is a technology in which a vehicle equipped with a V2X module periodically transmits its position and state to surrounding vehicles equipped with V2X modules, supporting more stable driving. A V2X module here is a V2X communication device or a configuration that includes one.
The CAM has the limitation that it can share information only about the transmitting vehicle itself; the CP service 124 is a technology that complements the CAM. As the number of vehicles equipped with ADAS technology continues to grow, many vehicles carry sensors such as cameras, radar, and lidar, recognize many surrounding vehicles, and provide driving support functions. CPS (that is, CP service) technology is the part of ADAS technology that notifies the surroundings, via V2X communication, of the sensor data representing the perceived environment.
FIG. 19 illustrates the CP service 124. Assume that the vehicles TxV1 and RxV2 each carry at least one sensor and have the sensing ranges SrV1 and SrV2 shown by dotted lines. TxV1 has the CPS function. Using the multiple ADAS sensors mounted on the vehicle, TxV1 can recognize the vehicles RV1 to RV11, which are surrounding objects within the sensing range SrV1. The object information obtained by this recognition may be distributed via V2X communication to surrounding vehicles equipped with V2X communication devices.
As a result, among the surrounding vehicles that receive the CPM from TxV1, RxV1, which carries no sensor, can obtain information about following vehicles. RxV2, which does carry sensors, can, upon receiving the CPM from TxV1, also obtain information about objects outside its own sensing range SrV2 or located in blind spots (for example, RV1 to RV3, RV5, RV6, and RV8 to RV10).
As shown in FIG. 12 described above, the facility layer 120 can provide the CP service 124. The CP service 124 may run in the facility layer 120 and may use services that reside in the facility layer 120.
The LDM 127 is a service that provides map information and may provide map information for the CP service 124; the map information provided may include dynamic information in addition to static information. The POTI unit 126 executes a service that provides the position and time of the own vehicle; using the corresponding information, it can provide the position of the own vehicle and the exact time. The VDP 125 is a service that provides information about the vehicle; it may be used to incorporate information such as the size of the own vehicle into the CPM before the CPM is transmitted.
ADAS vehicles carry various sensors, such as cameras, infrared sensors, radar, and lidar, for driving support. Each sensor recognizes objects individually. The recognized object information may be collected and fused by a data fusion unit and provided to ADAS applications.
Referring again to FIG. 18, the collection and fusion of sensor information in ADAS technology is described in relation to the CP service 124. Existing sensors for ADAS and for CPS can continuously track surrounding objects and collect the associated data. When sensor values are used for the CP service, sensor information can be collected in two ways.
As shown in FIG. 18(a), the individual sensor values can be provided separately to surrounding vehicles through the CP basic service. Alternatively, as shown in FIG. 18(b), integrated sensor information merged after the data fusion unit may be provided to the CP basic service. The CP basic service forms part of the CP service 124.
FIG. 20 is a flowchart of the process of transmitting a CPM. The process shown in FIG. 20 is executed every CPM generation cycle. In step S301, the information that makes up the CPM is acquired; this step is executed by, for example, the detection information acquisition unit 201. In step S302, a CPM is generated from the information acquired in step S301; this step can also be executed by the detection information acquisition unit 201. In step S303, the transmission unit 221 transmits the CPM generated in step S302 to the surroundings of the vehicle. A sketch of this cycle follows.
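The cycle of FIG. 20 can be sketched as a periodic loop. The cycle length and the interfaces of the detection information acquisition unit 201 and the transmission unit 221 are illustrative stand-ins, not the actual implementation.

import time

CPM_GENERATION_PERIOD_S = 0.1  # assumed generation cycle

def cpm_cycle(acquisition_unit, transmission_unit):
    info = acquisition_unit.collect()          # S301: acquire the CPM contents
    cpm = acquisition_unit.generate_cpm(info)  # S302: generate the CPM
    transmission_unit.send(cpm)                # S303: transmit to the surroundings

def run(acquisition_unit, transmission_unit):
    while True:
        cpm_cycle(acquisition_unit, transmission_unit)
        time.sleep(CPM_GENERATION_PERIOD_S)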
An example of the flow of the positioning-error-estimation-related processing executed in Embodiment 2 is described using the flowchart of FIG. 21. In Embodiment 2, CPMs and CAMs are used as the target information transmitted from the communication device 20 of another vehicle. The flowchart of FIG. 21 executes steps S1A, S4A, S9A, S10A, and S11A in place of steps S1, S4, S9, S10, and S11 of the flowchart of FIG. 4, and additionally executes step S1B. Steps S2, S3, S5, S6, S7, S8, and S12 are common to the flowcharts of FIG. 4 and FIG. 21, so their description is omitted.
In step S1A, if the receiving unit 222 has received a CPM transmitted from the communication device 20 of another vehicle (YES in S1A), the process proceeds to step S2; if no CPM has been received (NO in S1A), it proceeds to step S1B.
Steps S2 and S3 are the same as in FIG. 4. After step S3, the process proceeds to step S4A. In step S4A, the identity determination unit 204 determines whether the own-vehicle-detected target whose position was identified in step S3 is the same as the target identified by the POC of the CPM determined to have been received in step S1A. The CPM contains the latitude and longitude of the other vehicle (hereinafter, absolute coordinates) and the distance and bearing from the other vehicle to the target. Therefore, the absolute coordinates of the target detected by the other vehicle can be determined from the CPM acquired from the other vehicle. These coordinates are compared with the position of the own-vehicle-detected target identified in step S3 after aligning the coordinate systems; the aligned coordinate system may be absolute coordinates or a coordinate system centered on the own vehicle.
The POC of the CPM also contains various information about the target detected by the other vehicle, such as information indicating its behavior. The identity determination may therefore be made by comparing the behavior of the target identified by the POC of the CPM with the behavior of the own-vehicle-detected target acquired in S2. Behavior is one or more pieces of information indicating the movement of the target, such as speed, acceleration, and angular velocity.
Furthermore, the candidates may be narrowed down before the identity determination (see the sketch after this paragraph). The classification (that is, the type) of the target identified by the POC of the CPM can be used for this narrowing, and the position of the target may also be used. When narrowing down by position, the narrowing range is defined by a threshold radius set to a distance larger than the distance difference at which targets are judged identical; the center of the narrowing range is the position of the other-vehicle-detected target or the own-vehicle-detected target. The identity determination is then performed on the own-vehicle-detected targets and other-vehicle-detected targets within this range.
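A minimal sketch of the narrowing and identity check described above, with positions already expressed in a common coordinate frame; the threshold values and field names are assumptions for illustration.

import math

SAME_DIST_THRESH_M = 1.0  # assumed distance difference for judging targets identical
GATE_RADIUS_M = 5.0       # narrowing radius, set larger than SAME_DIST_THRESH_M

def within_gate(own, other):
    """Narrowing: keep only pairs inside the threshold radius."""
    return math.hypot(own["x"] - other["x"], own["y"] - other["y"]) <= GATE_RADIUS_M

def is_same_target(own, other):
    """Identity check on classification, position, and behavior."""
    if other.get("class") is not None and own["class"] != other["class"]:
        return False
    dist = math.hypot(own["x"] - other["x"], own["y"] - other["y"])
    if dist > SAME_DIST_THRESH_M:
        return False
    # Behavior comparison; acceleration or angular velocity could be added.
    return abs(own["speed"] - other["speed"]) < 0.5

def match_targets(own_targets, other_targets):
    return [(a, b) for a in own_targets for b in other_targets
            if within_gate(a, b) and is_same_target(a, b)]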
In step S1B, if the receiving unit 222 has received a CAM transmitted from the communication device 20 of another vehicle (YES in S1B), the process proceeds to step S9A; if no CAM has been received (NO in S1B), it proceeds to step S12. The process also proceeds to step S9A if it is determined in step S5 that the own-vehicle-detected target and the other-vehicle-detected target are not the same (NO in S5).
In step S9A, if a mounted object positioning error has already been stored in the error storage unit 207 for the other vehicle that is the source of the message determined to have been received in step S1A or step S1B (YES in S9A), the process proceeds to step S10A.
The processing in step S10A differs depending on whether the received message is a CPM or a CAM. If it is a CPM, the position of the other-vehicle-detected target identified by the POC of the CPM is corrected by the mounted object positioning error of the other vehicle that transmitted the CPM. If it is a CAM, the position of the transmitting vehicle contained in the CAM is corrected by the mounted object positioning error. After step S10A, the process proceeds to step S11A.
In step S11A, the position of the other-vehicle-detected target corrected in step S10A, or the position of the other vehicle that transmitted the CAM, is stored in the target storage unit 209. A sketch of steps S9A through S11A follows.
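The sketch below assumes the stored mounted object positioning error is a simple (dx, dy) offset per source vehicle; the storage interfaces are illustrative stand-ins for the error storage unit 207 and the target storage unit 209.

def handle_message(msg, error_store, target_store):
    """Apply a stored mounted object positioning error to a received CPM or CAM."""
    err = error_store.get(msg["source_id"])  # S9A: error already estimated?
    if err is None:
        return
    dx, dy = err
    if msg["type"] == "CPM":
        # S10A: correct each other-vehicle-detected target taken from the POC.
        corrected = [(x + dx, y + dy) for (x, y) in msg["poc_positions"]]
    else:  # "CAM"
        # S10A: correct the transmitting vehicle's own reported position.
        x, y = msg["source_position"]
        corrected = [(x + dx, y + dy)]
    target_store.save(msg["source_id"], corrected)  # S11A: store the result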
According to Embodiment 2, the CPM transmitted by the vehicle unit 2 is used when estimating the mounted object positioning error. This allows an existing system for transmitting and receiving CPMs to be reused, making the vehicle system 1 easy to build.
Also, in Embodiment 2, when a CAM is received from another vehicle (YES in S1B) and the mounted object positioning error of the vehicle that transmitted the CAM has already been stored (YES in S9A), the position of the transmitting vehicle contained in the CAM is corrected by the mounted object positioning error (S10A). In this way, even if the vehicle transmitting the CAM is outside the detection range of the own vehicle's surroundings monitoring sensor 40, the position of that transmitting vehicle can be identified with high accuracy.
(Embodiment 3)
In Embodiments 1 and 2, the case where the communication device mounted object is a vehicle was described as an example, but this is not necessarily limiting. For example, the communication device mounted object may be a moving body other than a vehicle, such as a drone. In that case, a unit having the functions of the vehicle unit 2 other than those specific to a vehicle may be mounted on the moving body such as a drone.
(Embodiment 4)
In Embodiments 1 and 2, the case where the communication device mounted object is a vehicle was described as an example, but this is not necessarily limiting. For example, the communication device mounted object may be a stationary object, such as a roadside unit. In that case, a unit having the functions of the vehicle unit 2 other than those specific to a vehicle may be mounted on the roadside unit, and the communication unit 202 may be configured to perform road-to-vehicle communication.
(Embodiment 5)
In the embodiments described above, the communication device 20 estimates the mounted object positioning error, but this is not necessarily limiting. For example, the functions of the communication device 20 other than the communication unit 202 may be performed by the surroundings monitoring ECU 50. In that case, the surroundings monitoring ECU 50 may include a functional block that acquires the target information received by the receiving unit 222; this functional block corresponds to the communication information acquisition unit, and the surroundings monitoring ECU 50 corresponds to the vehicle device. Alternatively, the functions of the communication device 20 may be shared between the communication device 20 and the surroundings monitoring ECU 50, in which case a unit including the communication device 20 and the surroundings monitoring ECU 50 corresponds to the vehicle device.
(Embodiment 6)
In the embodiments described above, the mounted object positioning error is estimated by comparing the mounted object reference target position converted into a position relative to the own vehicle with the target detection position identified by the detection position identification unit 203. However, the mounted object reference target position and the target detection position identified by the detection position identification unit 203 may both be converted into absolute coordinates, and the mounted object positioning error may then be estimated by comparing them. When comparing in the absolute coordinate system, if the mounted object reference target position transmitted by the other vehicle is already expressed in absolute coordinates, it need not be converted. A sketch of this comparison follows.
(Technical feature 1)
A vehicle device usable in a vehicle, comprising:
a detection position identification unit (203) that identifies a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle;
a communication information acquisition unit (222) that acquires, via wireless communication, a mounted object reference target position transmitted from a communication device mounted object that has a surroundings monitoring sensor, is capable of positioning using signals from positioning satellites, and carries a communication device capable of wireless communication with the vehicle, the mounted object reference target position being the position of a target detected by that surroundings monitoring sensor, referenced to the position of the communication device mounted object obtained by that positioning;
an identity determination unit (204) that determines whether the target detected by the surroundings monitoring sensor of the vehicle is the same as the target detected by the surroundings monitoring sensor of the communication device mounted object that transmitted the mounted object reference target position acquired by the communication information acquisition unit;
a conversion unit (205) that converts the mounted object reference target position acquired by the communication information acquisition unit into a position relative to the vehicle; and
an error estimation unit (206) that, when the identity determination unit determines that the targets are the same, estimates a mounted object positioning error, which is the error of the positioning performed at the communication device mounted object referenced to the position of the vehicle, from the deviation between the mounted object reference target position converted by the conversion unit and the target detection position identified by the detection position identification unit.
(Technical feature 2)
The vehicle device according to technical feature 1, comprising a target position identification unit (208) that, once the error estimation unit has estimated the mounted object positioning error, identifies the position of a target relative to the vehicle, even for a target that the surroundings monitoring sensor of the vehicle cannot detect, by correcting the mounted object reference target position acquired by the communication information acquisition unit by that mounted object positioning error.
(Technical feature 3)
The vehicle device according to technical feature 1 or 2, wherein the communication information acquisition unit acquires the mounted object reference target position from the communication device mounted object as a moving body.
(Technical feature 4)
An error estimation method usable in a vehicle and executed by at least one processor, comprising:
a detection position identification step of identifying a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle;
a communication information acquisition step of acquiring, via wireless communication, a mounted object reference target position transmitted from a communication device mounted object that has a surroundings monitoring sensor, is capable of positioning using signals from positioning satellites, and carries a communication device capable of wireless communication with the vehicle, the mounted object reference target position being the position of a target detected by that surroundings monitoring sensor, referenced to the position of the communication device mounted object obtained by that positioning;
an identity determination step of determining whether the target detected by the surroundings monitoring sensor of the vehicle is the same as the target detected by the surroundings monitoring sensor of the communication device mounted object that transmitted the mounted object reference target position acquired in the communication information acquisition step;
a conversion step of converting the mounted object reference target position acquired in the communication information acquisition step into a position relative to the vehicle; and
an error estimation step of, when the identity determination step determines that the targets are the same, estimating a mounted object positioning error, which is the error of the positioning performed at the communication device mounted object referenced to the position of the vehicle, from the deviation between the mounted object reference target position converted in the conversion step and the target detection position identified in the detection position identification step.
The present disclosure is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present disclosure. The control units and methods described in this disclosure may be realized by a dedicated computer comprising a processor programmed to execute one or more functions embodied by a computer program. Alternatively, the devices and methods described in this disclosure may be realized by dedicated hardware logic circuits, or by one or more dedicated computers configured as a combination of a processor that executes a computer program and one or more hardware logic circuits. The computer program may be stored, as instructions executed by a computer, on a computer-readable non-transitory tangible recording medium.

Claims (6)

1. A vehicle device usable in a vehicle, comprising:
a detection position identification unit (203) that identifies a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle;
a communication information acquisition unit (222) that acquires, via wireless communication, a mounted object reference target position, which is the position of a target transmitted from a communication device mounted object and detected by the communication device mounted object;
an identity determination unit (204) that determines whether the target detected by the surroundings monitoring sensor of the vehicle is the same as the target whose mounted object reference target position was acquired by the communication information acquisition unit; and
an error estimation unit (206) that, when the identity determination unit determines that the targets are the same, estimates a mounted object positioning error, which is the error of the positioning at the communication device mounted object, from the deviation between the mounted object reference target position and the target detection position.
2. The vehicle device according to claim 1, comprising a conversion unit (205) that converts the mounted object reference target position acquired by the communication information acquisition unit into a position relative to the vehicle, wherein the error estimation unit estimates the mounted object positioning error from the deviation between the mounted object reference target position converted by the conversion unit and the target detection position identified by the detection position identification unit.
3. The vehicle device according to claim 1 or 2, comprising a target position identification unit (208) that, once the error estimation unit has estimated the mounted object positioning error, identifies the position of a target relative to the vehicle, even for a target that the surroundings monitoring sensor of the vehicle cannot detect, by correcting the mounted object reference target position acquired by the communication information acquisition unit by that mounted object positioning error.
4. The vehicle device according to claim 1 or 2, wherein the communication information acquisition unit also acquires, via the wireless communication, the position of the communication device mounted object transmitted from the communication device mounted object, and the vehicle device comprises a target position identification unit that, once the error estimation unit has estimated the mounted object positioning error, corrects the position of the communication device mounted object transmitted from the communication device mounted object by the mounted object positioning error.
5. The vehicle device according to any one of claims 1 to 4, wherein the communication information acquisition unit acquires the mounted object reference target position from the communication device mounted object as a moving body.
6. An error estimation method usable in a vehicle and executed by at least one processor, comprising:
a detection position identification step of identifying a target detection position, which is the position, relative to the vehicle, of a target detected by a surroundings monitoring sensor of the vehicle;
a communication information acquisition step of acquiring, via wireless communication, a mounted object reference target position, which is the position of a target transmitted from a communication device mounted object and detected by the communication device mounted object;
an identity determination step of determining whether the target detected by the surroundings monitoring sensor of the vehicle is the same as the target whose mounted object reference target position was acquired in the communication information acquisition step; and
an error estimation step of, when the identity determination step determines that the targets are the same, estimating a mounted object positioning error, which is the error of the positioning at the communication device mounted object, from the deviation between the mounted object reference target position and the target detection position.