WO2022264731A1 - Vehicle device and error estimation method - Google Patents

Vehicle device and error estimation method Download PDF

Info

Publication number
WO2022264731A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
target
communication device
mounted object
unit
Prior art date
Application number
PCT/JP2022/020386
Other languages
French (fr)
Japanese (ja)
Inventor
周▲イク▼ 金
Original Assignee
DENSO Corporation (株式会社デンソー)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DENSO Corporation
Priority to JP2023529693A priority Critical patent/JPWO2022264731A1/ja
Priority to DE112022003058.5T priority patent/DE112022003058T5/en
Publication of WO2022264731A1 publication Critical patent/WO2022264731A1/en

Links

Images

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/14Receivers specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/396Determining accuracy or reliability of position or pseudorange measurements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/51Relative positioning
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions

Definitions

  • the present disclosure relates to a vehicle device and an error estimation method.
  • Patent Document 1 discloses a technique for determining the existence of a target beyond the line of sight of the own vehicle based on reception information, which is position information of another vehicle transmitted from the other vehicle as a target, and on sensor information including the moving speed and moving direction of the target detected by the sensor of the own vehicle.
  • With the technology disclosed in Patent Document 1, it is possible to determine the presence of other vehicles that cannot be detected by the own vehicle's sensors by receiving the position information of those other vehicles.
  • However, the position information transmitted from another vehicle includes an error in positioning (hereinafter referred to as a positioning error), regardless of whether the positioning is performed using positioning satellites. Therefore, with the technology disclosed in Patent Document 1, it is difficult to accurately identify the position of another vehicle when the other vehicle is located outside the detection range of the sensor of the own vehicle.
  • One object of this disclosure is to provide a vehicle device and an error estimation method that make it possible to more accurately identify the position of an object mounted with a communication device outside the detection range of the vehicle's surroundings monitoring sensor.
  • The vehicle device of the present disclosure is a vehicle device that can be used in a vehicle, comprising: a reference position acquisition unit that acquires a reference position of a target wirelessly transmitted from a reference device that detects the reference position; a mounted object information acquisition unit that acquires a first positioning position, which is the position of a first communication device-mounted object transmitted from the first communication device-mounted object and obtained by positioning by the first communication device-mounted object; an identity determination unit that determines whether or not the target whose reference position is acquired by the reference position acquisition unit and the first communication device-mounted object whose first positioning position is acquired by the mounted object information acquisition unit are the same; and an error estimation unit that, when the identity determination unit determines that they are the same, estimates a first positioning error, which is the positioning error in the first communication device-mounted object, from the deviation between the reference position and the first positioning position.
  • The error estimation method of the present disclosure is a method that can be used in a vehicle and is executed by at least one processor, and includes: acquiring a reference position of a target wirelessly transmitted from a reference device that detects the reference position; acquiring a first positioning position transmitted from a first communication device-mounted object; determining whether or not the target and the first communication device-mounted object are the same; and, when they are determined to be the same, estimating the first positioning error from the deviation between the reference position and the first positioning position.
  • The first positioning position is obtained by positioning by the first communication device-mounted object, so it includes a positioning error.
  • In contrast, the disclosed technique acquires the reference position of the target from the reference device. Therefore, when it is determined that the target and the first communication device-mounted object are the same, it becomes possible to more accurately estimate the first positioning error, which is the error of positioning by the first communication device-mounted object, from the deviation between the reference position and the first positioning position acquired from the first communication device-mounted object.
  • In this way, the first positioning error can be estimated from the reference position and the first positioning position. Further, by using the first positioning error, it is possible to correct the positioning error in the first communication device-mounted object and to specify the position of the first communication device-mounted object with higher accuracy. As a result, it is possible to more accurately identify the position of a communication device-mounted object outside the detection range of the surroundings monitoring sensor of the own vehicle.
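As a concrete illustration of the deviation-based estimation and correction described above, a minimal Python sketch (all names are illustrative, not from the disclosure; positions are (x, y) coordinates in metres):

```python
def estimate_first_positioning_error(reference_pos, first_positioning_pos):
    """First positioning error: per-axis deviation of the position reported
    by the first communication device-mounted object from the reference
    position detected by the reference device (e.g. the roadside unit)."""
    return (first_positioning_pos[0] - reference_pos[0],
            first_positioning_pos[1] - reference_pos[1])


def correct_reported_position(reported_pos, positioning_error):
    """Subtract the estimated error from a later reported position to
    identify the mounted object's position more accurately."""
    return (reported_pos[0] - positioning_error[0],
            reported_pos[1] - positioning_error[1])
```

For example, if the reference device locates the vehicle at (10.0, 5.0) m while the vehicle reports (11.5, 4.0) m, the estimated error is (1.5, -1.0) m, and it can then be subtracted from positions the vehicle reports after leaving the reference device's detection range.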
  • FIG. 2 is a diagram showing an example of a schematic configuration of a vehicle unit 2;
  • FIG. 2 is a diagram showing an example of a schematic configuration of a communication device 20;
  • FIG. 6 is a flow chart showing an example of the flow of positioning error estimation-related processing in the communication device 20;
  • 6 is a flowchart showing an example of the flow of second estimation-related processing in the communication device 20;
  • FIG. 10 is a diagram for explaining an outline of a second embodiment;
  • FIG. 1 illustrates an exemplary architecture of a V2X communication device;
  • FIG. 3 shows the logical interfaces between CA services and other layers;
  • A functional block diagram of a CA service;
  • FIG. 3 shows the logical interfaces between CP services and other layers;
  • A functional block diagram of a CP service;
  • FIG. 4 is a diagram illustrating FOC (or SIC) in a CPM;
  • FIG. 4 is a diagram illustrating POC in a CPM;
  • FIG. 4 is a diagram for explaining CP services;
  • FIG. 11 is a diagram for explaining targets for identity determination in the second embodiment;
  • the vehicle system 1 includes a vehicle unit 2 used in each of a plurality of vehicles and a roadside unit 3.
  • a vehicle VEa, a vehicle VEb, and a vehicle VEc will be described as examples of vehicles.
  • Vehicle VEa, vehicle VEb, and vehicle VEc are assumed to be automobiles.
  • the roadside device 3 includes a control device 300, a communication device 301, a perimeter monitoring sensor 302, and a position storage section 303, as shown in FIG.
  • the communication device 301 exchanges information with the outside of its own device by performing wireless communication with the outside of its own device.
  • the communication device 301 wirelessly communicates with a communication device 20 which is used in the vehicle and will be described later. That is, the communication device 301 performs road-to-vehicle communication. In this embodiment, it is assumed that road-to-vehicle communication is performed between the roadside device 3 and the communication device 20 of the vehicle VEa.
  • The surroundings monitoring sensor 302 detects obstacles around the own device, such as moving objects (pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles) and stationary objects (fallen objects on the road, guardrails, curbs, and trees). As the surroundings monitoring sensor 302, a surroundings monitoring camera whose imaging range is a predetermined range around the own device may be used. The surroundings monitoring sensor 302 may also use millimeter wave radar, sonar, LIDAR (Light Detection and Ranging/Laser Imaging Detection and Ranging), or the like, or a combination of these.
  • The surroundings monitoring camera sequentially outputs captured images to the control device 300 as sensing information. Sensors that transmit search waves, such as sonar, millimeter wave radar, and LIDAR, sequentially output to the control device 300, as sensing information, scanning results based on the received signals obtained when reflected waves reflected by obstacles are received.
  • the position storage unit 303 stores information on the absolute position of the own device.
  • a non-volatile memory may be used as the position storage unit 303 .
  • the absolute position information should be at least latitude and longitude coordinates. It is assumed that the absolute position information of the own device has a position accuracy higher than that obtained by positioning using the signals of the positioning satellites by the own device alone.
  • the information on the absolute position of the own device may be obtained in advance by surveying with reference to triangulation points or electronic reference points and stored in the position storage unit 303 .
  • the control device 300 includes a processor, memory, I/O, and a bus connecting these, and executes various processes by executing control programs stored in the memory.
  • the control device 300 recognizes information (hereinafter referred to as target information) about targets existing around the device from the sensing information output from the periphery monitoring sensor 302 .
  • the target information may be the relative position, type, speed, moving azimuth, etc. of the target with respect to the own device.
  • the relative position may be the distance and direction from the own device. It is assumed that the position specifying accuracy in recognizing the position of the target using the perimeter monitoring sensor 302 is higher than the position specifying accuracy by positioning with the locator 30, which will be described later.
  • the type of target can be recognized by image recognition processing such as template matching based on the image captured by the surrounding surveillance camera.
  • The distance of the target from the own device and the azimuth of the target with respect to the own device may be recognized from the installation position and optical-axis direction of the surroundings monitoring camera relative to the own device and from the position of the target in the captured image.
  • the distance of the target from the own device may be recognized based on the amount of parallax between the pair of cameras.
  • the speed of the target may be recognized from the amount of change over time in the relative position of the target with respect to the own device.
  • the direction of movement may be recognized from the direction of change in the relative position of the target relative to the device itself and the information on the absolute position of the device itself stored in the position storage unit 303 .
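The camera-based recognition above can be illustrated with standard relations; a hedged sketch in Python (the stereo formula is the usual pinhole-camera model, and all function names and values are hypothetical, not from the disclosure):

```python
import math


def stereo_distance_m(focal_px, baseline_m, disparity_px):
    # Standard pinhole stereo relation: Z = f * B / d
    return focal_px * baseline_m / disparity_px


def target_speed_mps(prev_rel_pos, cur_rel_pos, dt_s):
    # Speed from the amount of change over time in the relative position
    dx = cur_rel_pos[0] - prev_rel_pos[0]
    dy = cur_rel_pos[1] - prev_rel_pos[1]
    return math.hypot(dx, dy) / dt_s


def moving_azimuth_rad(prev_rel_pos, cur_rel_pos):
    # Moving direction from the direction of change of the relative position
    return math.atan2(cur_rel_pos[1] - prev_rel_pos[1],
                      cur_rel_pos[0] - prev_rel_pos[0])
```

Because the roadside device itself is stationary, the change in the relative position directly gives the target's speed; on a moving vehicle the own vehicle speed must additionally be taken into account, as the text notes for the surroundings monitoring ECU 50.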
  • control device 300 uses the sensing information of the sensor that transmits the search wave to recognize the distance of the target from its own device, the azimuth of the target with respect to its own device, the speed of the target, and the moving azimuth.
  • the control device 300 may recognize the distance from its own device to the target on the basis of the time from when the search wave is sent until when the reflected wave is received.
  • the control device 300 may recognize the azimuth of the target with respect to its own device based on the direction in which the search wave from which the reflected wave was obtained was transmitted.
  • the control device 300 may recognize the speed of the target from the relative speed of the target with respect to its own device calculated based on the Doppler shift between the transmitted search wave and the reflected wave.
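The search-wave recognition steps above follow standard radar relations; a hedged sketch (function names and the example carrier frequency are illustrative assumptions, not from the disclosure):

```python
PROPAGATION_SPEED = 299_792_458.0  # radio search wave, m/s (use ~343 m/s for sonar)


def echo_distance_m(round_trip_s):
    # Distance from the time between transmitting the search wave and
    # receiving the reflected wave: d = c * t / 2 (the wave travels out and back)
    return PROPAGATION_SPEED * round_trip_s / 2.0


def doppler_relative_speed_mps(doppler_shift_hz, carrier_hz):
    # Standard radar Doppler relation: v = f_d * c / (2 * f_c)
    return doppler_shift_hz * PROPAGATION_SPEED / (2.0 * carrier_hz)
```

The azimuth follows from the direction in which the search wave that produced the reflected wave was transmitted, as stated above, so it needs no computation beyond bookkeeping.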
  • the control device 300 controls the communication device 301 to perform road-to-vehicle communication.
  • the control device 300 transmits the target object information recognized from the sensing information output from the surroundings monitoring sensor 302 to the communication device 20 of the vehicle in the vicinity of the own device through road-to-vehicle communication.
  • The control device 300 also includes, in the target information to be transmitted, the information on the absolute position of the own device stored in the position storage unit 303.
  • the target information transmitted from the communication device 301 includes, as described above, the relative position of the target, the type of target, the speed of the target, the direction of movement of the target, and the absolute position of the roadside unit 3 (hereinafter referred to as , roadside unit absolute position).
  • the relative position of the target in the target information is a position based on the position of the roadside unit 3 .
  • this relative position included in the target object information is hereinafter called the roadside unit reference target position.
  • the control device 300 may be configured to periodically transmit the target information, for example.
  • the vehicle unit 2 includes a communication device 20, a locator 30, a surroundings monitoring sensor 40, a surroundings monitoring ECU 50, and a driving support ECU 60, as shown in FIG.
  • the communication device 20, the locator 30, the perimeter monitoring ECU 50, and the driving support ECU 60 may be configured to be connected to an in-vehicle LAN (see LAN in FIG. 3).
  • the locator 30 is equipped with a GNSS (Global Navigation Satellite System) receiver.
  • a GNSS receiver receives positioning signals from multiple satellites.
  • the locator 30 uses the positioning signals received by the GNSS receiver to sequentially locate the position of the own vehicle equipped with the locator 30 (hereinafter referred to as the positioning position).
  • the locator 30 determines a specified number, such as four, of positioning satellites to be used for positioning calculation from among the positioning satellites from which positioning signals have been output. Then, using the code pseudoranges, carrier wave phases, etc. for the determined specified number of positioning satellites, positioning calculation is performed to calculate the coordinates of the positioning position.
  • the positioning operation itself may be a positioning operation using code pseudoranges or a positioning operation using carrier phases.
  • the coordinates should be at least latitude and longitude coordinates.
  • the locator 30 may be equipped with an inertial sensor.
  • the inertial sensor for example, a gyro sensor and an acceleration sensor may be used.
  • the locator 30 may sequentially measure the positioning position by combining the positioning signals received by the GNSS receiver with the measurement results of the inertial sensor. Note that the travel distance obtained from the signals sequentially output from the vehicle speed sensor mounted on the own vehicle may also be used for measuring the positioning position.
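The combination of GNSS positioning with inertial and vehicle-speed measurements described above is commonly realized with a filter; as a deliberately simplified stand-in (the complementary weight and all names are illustrative assumptions, not from the disclosure):

```python
import math


def dead_reckon(pos, speed_mps, heading_rad, dt_s):
    # Propagate the last position using vehicle-speed-sensor travel distance
    # and inertial-sensor heading (simple planar model)
    return (pos[0] + speed_mps * math.cos(heading_rad) * dt_s,
            pos[1] + speed_mps * math.sin(heading_rad) * dt_s)


def fuse(gnss_pos, predicted_pos, gnss_weight=0.8):
    # Complementary blend standing in for a Kalman-style filter;
    # the weight is an illustrative constant, not a value from the disclosure
    w = gnss_weight
    return (w * gnss_pos[0] + (1.0 - w) * predicted_pos[0],
            w * gnss_pos[1] + (1.0 - w) * predicted_pos[1])
```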
  • The surroundings monitoring sensor 40 detects obstacles around the own vehicle, such as moving objects (pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles) and stationary objects (fallen objects on the road, guardrails, curbs, and trees). As the surroundings monitoring sensor 40, a surroundings monitoring camera whose imaging range is a predetermined range around the own vehicle may be used. The surroundings monitoring sensor 40 may also use millimeter wave radar, sonar, LIDAR, or the like, or a combination of these.
  • the surroundings monitoring camera sequentially outputs captured images to the surroundings monitoring ECU 50 as sensing information. Sensors that transmit search waves such as sonar, millimeter wave radar, and LIDAR sequentially output scanning results based on received signals obtained when reflected waves reflected by obstacles are received to the perimeter monitoring ECU 50 as sensing information.
  • the perimeter monitoring ECU 50 includes a processor, memory, I/O, and a bus connecting these, and executes a control program stored in the memory to perform various processes related to recognition of targets in the vicinity of the own vehicle.
  • the surroundings monitoring ECU 50 recognizes target object information about targets existing around the vehicle from the sensing information output from the surroundings monitoring sensor 40 in the same manner as the control device 300 described above.
  • Recognition of target object information by the perimeter monitoring ECU 50 is the same as recognition of target object information by the control device 300 except that the own vehicle is used as a reference instead of the roadside unit 3 .
  • the points of difference are as follows.
  • the speed of the target can be recognized from the amount of change per unit time in the relative position of the target with respect to the own vehicle and from the vehicle speed of the own vehicle.
  • the speed of the target may be recognized from the speed of the target relative to the vehicle, which is calculated based on the Doppler shift between the transmitted search wave and the reflected wave, and the vehicle speed of the vehicle. It is assumed that the accuracy of position identification in recognizing the position of the target using the perimeter monitoring sensor 40 is higher than the accuracy of position identification by positioning with the locator 30 .
  • the driving assistance ECU 60 includes a processor, memory, I/O, and a bus connecting these, and executes various processes related to driving assistance of the own vehicle by executing a control program stored in the memory.
  • the driving assistance ECU 60 performs driving assistance based on the position of the target specified by the perimeter monitoring ECU 50 and the communication device 20 .
  • the position of the target specified by the perimeter monitoring ECU 50 is the relative position of the target detected by the perimeter monitoring sensor 40 (hereinafter referred to as the line-of-sight target) with respect to the own vehicle.
  • the position of the target specified by the communication device 20 is the relative position of the target that cannot be detected by the perimeter monitoring sensor 40 (hereinafter referred to as non-line-of-sight target) with respect to the own vehicle.
  • Examples of driving assistance include vehicle control for avoiding approaching a target, alerting for avoiding approaching a target, and the like.
  • The communication device 20 includes a processor, memory, I/O, and a bus connecting these, and executes a control program stored in the memory to perform processes for wireless communication with the outside of the own vehicle and for target position identification. Execution of the control program by the processor corresponds to execution of the method corresponding to the control program.
  • Memory, as used herein, is a non-transitory tangible storage medium for non-transitory storage of computer-readable programs and data. A non-transitory tangible storage medium is implemented by a semiconductor memory, a magnetic disk, or the like.
  • This communication device 20 corresponds to a vehicle device. Details of the processing in the communication device 20 will be described later.
  • the communication device 20 includes a detection information acquisition unit 201, a vehicle-to-vehicle communication unit 202, a road-to-vehicle communication unit 203, an identity determination unit 204, a conversion unit 205, an error estimation unit 206, an error storage unit 207, a position specifying unit 208, and a position storage unit 209 as functional blocks.
  • Part or all of the functions executed by the communication device 20 may be configured as hardware using one or a plurality of ICs or the like. Also, some or all of the functional blocks included in the communication device 20 may be implemented by a combination of software executed by a processor and hardware members.
  • the detection information acquisition unit 201 acquires target information recognized by the perimeter monitoring ECU 50 . That is, the information of the target detected by the perimeter monitoring sensor 40 is acquired.
  • the target information is the relative position of the target consisting of the distance of the target from the own vehicle and the orientation of the target with respect to the own vehicle, the type of target, the speed of the target, and the moving direction of the target.
  • the relative position of the target may be coordinates based on the positioning position of the own vehicle.
  • the detection information acquisition unit 201 also acquires the measured position of the own vehicle measured by the locator 30 .
  • the vehicle-to-vehicle communication unit 202 exchanges information with the communication device 20 of the other vehicle by performing vehicle-to-vehicle communication with the other vehicle. Further, in the present embodiment, vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEb, and vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEc.
  • the vehicle-to-vehicle communication unit 202 includes a transmission unit 221 and a vehicle-to-vehicle reception unit 222.
  • the transmission unit 221 transmits information to other vehicles through inter-vehicle communication.
  • the transmission unit 221 transmits the target information acquired by the detection information acquisition unit 201 and the positioning position measured by the locator 30 of the own vehicle to the communication device 20 of the other vehicle via wireless communication.
  • the transmitting unit 221 may include identification information for identifying the own vehicle (hereinafter referred to as transmission source identification information) in the target information.
  • the source identification information may be the vehicle ID of the own vehicle or the communication device ID of the communication device 20 of the own vehicle.
  • the vehicle-side receiving unit 222 receives information transmitted from other vehicles through vehicle-to-vehicle communication.
  • the vehicle-side receiving unit 222 receives target information transmitted from the communication device 20 of another vehicle via wireless communication. This other vehicle is assumed to be a vehicle using the vehicle unit 2 .
  • the vehicle-side receiving section 222 corresponds to the mounted object information acquiring section.
  • the processing in this vehicle-side receiving section 222 corresponds to the mounted object information acquisition step.
  • the vehicle-side receiving unit 222 will be described as receiving target information from two types of other vehicles through vehicle-to-vehicle communication.
  • the first type is other vehicles within the detection range of the perimeter monitoring sensor 302 of the roadside unit 3 .
  • This other vehicle will hereinafter be referred to as the first communication device-mounted object.
  • the second type is other vehicles outside the detection range of the perimeter monitoring sensor 302 of the roadside unit 3 .
  • This other vehicle is hereinafter referred to as a second communication device mounted object.
  • the target information received by the vehicle-side receiving unit 222 includes, as described above, the mounted object reference target position, the type of target, the speed of the target, the moving azimuth of the target, the positioning position of the vehicle that is the transmission source, and the transmission source identification information.
  • the positioning position measured by the locator 30 of the first communication device mounted object will be referred to as the first positioning position.
  • a target detected by the surroundings monitoring sensor 40 of the first communication device-mounted object is called a first target.
  • the positioning position measured by the locator 30 of the second communication device-mounted object is called the second positioning position.
  • a target detected by the surroundings monitoring sensor 40 of the second communication device-mounted object is called a second target.
  • the road-to-vehicle communication unit 203 exchanges information by performing road-to-vehicle communication with the roadside device 3 . Further, in this embodiment, it is assumed that road-to-vehicle communication is performed between the vehicle VEa and the roadside device 3 .
  • the road-to-vehicle communication unit 203 includes a road-to-vehicle receiving unit 231 . Note that the road-to-vehicle communication unit 203 may transmit information to the roadside device 3 through road-to-vehicle communication, but the description is omitted for convenience.
  • the road-to-vehicle receiving unit 231 receives information transmitted from the roadside device 3 through road-to-vehicle communication.
  • the road-to-vehicle receiving unit 231 receives target object information transmitted from the roadside device 3 via wireless communication.
  • the target information received by the road-to-vehicle receiving unit 231 includes the roadside unit reference target position, the type of target, the speed of the target, the moving direction of the target, and the roadside unit absolute position.
  • the roadside unit reference target position is the reference position referred to when estimating the first positioning error in the error estimation unit 206, which will be described later, and the roadside unit 3 corresponds to the reference device.
  • the road-to-vehicle receiving unit 231 corresponds to the reference position acquisition unit and the roadside device information acquisition unit.
  • the processing in the road-to-vehicle receiving unit 231 corresponds to the reference position acquisition step and the roadside device information acquisition step.
  • The identity determination unit 204 determines whether or not the target detected by the surroundings monitoring sensor 302 of the roadside unit 3, whose roadside unit reference target position is acquired by the road-to-vehicle receiving unit 231 (hereinafter referred to as the roadside unit detection target), and the other vehicle that is the transmission source of the target object information acquired by the vehicle-side receiving unit 222 (hereinafter referred to as the communicating other vehicle) are the same.
  • When they are determined to be the same, the communicating other vehicle corresponds to the first communication device-mounted object, and the positioning position included in this target object information corresponds to the first positioning position.
  • the processing in the identity determination unit 204 corresponds to the identity determination step.
  • The identity determination unit 204 determines whether the roadside unit detection target and the communicating other vehicle are the same based on whether the roadside unit reference target position acquired by the road-to-vehicle receiving unit 231 and the positioning position acquired by the vehicle-side receiving unit 222 are close to each other. For example, if the difference between these positions is less than a threshold, the identity determination unit 204 may determine that they are the same; if the difference is not less than the threshold, it may determine that they are not the same. Other conditions, such as matching target types, a speed difference less than a threshold value, and a moving-azimuth difference less than a threshold value, may also be used as conditions for the identity determination. In this case, the target object information transmitted from the transmission unit 221 may include information such as the type, speed, and moving azimuth of the own vehicle.
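The threshold-based identity determination described above might be sketched as follows (threshold values and field names are illustrative assumptions, not values from the disclosure):

```python
import math


def is_same_object(roadside, reported,
                   pos_th_m=2.0, spd_th_mps=1.0, azm_th_rad=0.3):
    """Identity determination: the positions must be close, and the optional
    extra conditions named in the text (matching type, small speed and
    moving-azimuth differences) must also hold. Thresholds are illustrative."""
    dpos = math.hypot(roadside["x"] - reported["x"],
                      roadside["y"] - reported["y"])
    return (dpos < pos_th_m
            and roadside["type"] == reported["type"]
            and abs(roadside["speed"] - reported["speed"]) < spd_th_mps
            and abs(roadside["azimuth"] - reported["azimuth"]) < azm_th_rad)
```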
  • the conversion unit 205 converts the roadside unit reference target position acquired by the road-to-vehicle receiving unit 231 and the positioning position acquired by the vehicle-side receiving unit 222 into positions relative to the own vehicle.
  • the coordinates may be converted into coordinates in an XY coordinate system (hereinafter referred to as the vehicle coordinate system) with the measured position of the vehicle measured by the locator 30 as the origin.
  • the processing in this conversion unit 205 corresponds to the conversion step.
• when the identity determination unit 204 determines that the roadside unit detection target and the communicating other vehicle are the same, the communicating other vehicle is the first communication device mounted object. In this case, the conversion unit 205 converts the first positioning position acquired by the vehicle-side receiving unit 222 from the first communication device mounted object into a position relative to the own vehicle.
  • the transforming unit 205 may also transform the mounted object reference target position acquired by the vehicle-side receiving unit 222 into a position relative to the own vehicle.
• the error estimation unit 206 estimates the positioning error of the communicating other vehicle, based on the position of the own vehicle, from the deviation between the roadside unit reference target position and the positioning position converted by the conversion unit 205. Since the communicating other vehicle determined by the identity determination unit 204 to be the same as the roadside unit detection target is the first communication device mounted object, the error estimation unit 206 estimates the positioning error of the first communication device mounted object (hereinafter referred to as the first positioning error).
  • the processing in this error estimating section 206 corresponds to the error estimating step.
• the positioning error estimated by the error estimation unit 206 may be the deviation in each of the X and Y coordinates of the own vehicle coordinate system.
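As a minimal sketch (function name hypothetical), the per-axis error could be taken as the reported positioning position minus the roadside-unit-based reference, both already in the own vehicle coordinate system:

```python
def estimate_positioning_error(reference_pos, positioning_pos):
    """Per-axis positioning error of another vehicle: the deviation of its
    reported positioning position from the roadside unit reference target
    position, both (x, y) tuples in the own-vehicle XY coordinate system."""
    return (positioning_pos[0] - reference_pos[0],
            positioning_pos[1] - reference_pos[1])
```

The sign convention (reported minus reference) is an assumption; either convention works as long as the later correction step uses the matching one.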
• the error estimation unit 206 stores the estimated first positioning error in the error storage unit 207.
  • the error storage unit 207 may be a non-volatile memory or a volatile memory.
• the error estimation unit 206 may associate the estimated first positioning error with the transmission source identification information of the first communication device mounted object for which the first positioning error was estimated, and store them in the error storage unit 207.
• as the transmission source identification information of the first communication device mounted object, the transmission source identification information of the target information received by the vehicle-side receiving unit 222 may be used.
• the position specifying unit 208 has a mounted object position specifying unit 281 and a target position specifying unit 282. Once the first positioning error has been estimated by the error estimation unit 206, even if the first communication device mounted object can no longer be detected by the perimeter monitoring sensor 302 of the roadside unit 3, the mounted object position specifying unit 281 preferably specifies the position of the first communication device mounted object relative to the own vehicle by correcting the first positioning position acquired by the vehicle-side receiving unit 222 from the first communication device mounted object by the first positioning error.
• when the identity determination unit 204 determines that the roadside unit detection target and the communicating other vehicle are not the same, if the first positioning error of the first communication device mounted object is stored in the error storage unit 207, the mounted object position specifying unit 281 should specify the position of the first communication device mounted object relative to the own vehicle by correcting the first positioning position acquired by the vehicle-side receiving unit 222 by the first positioning error.
• the correction may be performed by shifting the first positioning position so as to cancel the deviation of the first positioning error. According to this, even if neither the perimeter monitoring sensor 302 of the roadside unit 3 nor the perimeter monitoring sensor 40 of the own vehicle can detect the first communication device mounted object, the relative position of the first communication device mounted object with respect to the own vehicle can be specified with higher accuracy.
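A hedged sketch of this shift (function name hypothetical; the stored error is assumed to be the per-axis reported-minus-reference deviation):

```python
def correct_position(reported_pos, positioning_error):
    """Shift a reported (x, y) position so as to cancel a previously
    estimated positioning error, yielding a corrected position relative
    to the own vehicle."""
    return (reported_pos[0] - positioning_error[0],
            reported_pos[1] - positioning_error[1])
```

With this convention, correcting the very measurement the error was estimated from recovers the roadside-unit-based reference position exactly.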
• the mounted object position specifying unit 281 may determine whether the first positioning error of the first communication device mounted object is stored in the error storage unit 207 according to whether transmission source identification information linked to the first positioning error exists in the error storage unit 207.
• the mounted object position specifying unit 281 stores the position of the first communication device mounted object relative to the own vehicle, specified as described above, in the position storage unit 209.
  • the position storage unit 209 may be a volatile memory.
  • the position of the first communication device mounted object that can be detected by the perimeter monitoring sensor 40 of the own vehicle may be stored in the memory of the perimeter monitoring ECU 50 instead of being stored in the position storage unit 209 .
• as for the position of the first communication device mounted object within the detection range of the surroundings monitoring sensor 40, the position detected by the surroundings monitoring sensor 40 of the own vehicle may be used.
• the target position specifying unit 282 preferably specifies the position of the first target relative to the own vehicle by correcting the mounted object reference target position acquired by the vehicle-side receiving unit 222 from the first communication device mounted object (hereinafter referred to as the first mounted object reference target position) by the first positioning error.
  • the first target is the target detected by the perimeter monitoring sensor 40 of the first communication device mounted object, as described above.
  • the correction may be performed by shifting the position of the first mounted object reference target so as to eliminate the shift corresponding to the first positioning error.
• according to this, the relative position of the first target with respect to the own vehicle can be specified with higher accuracy even from the first mounted object reference target position acquired by the vehicle-side receiving unit 222 from the first communication device mounted object via wireless communication.
  • the target position specifying unit 282 stores the specified position of the first target relative to the own vehicle in the position storage unit 209 .
• when the position of the first target relative to the own vehicle has been specified by the target position specifying unit 282 by correcting with the first positioning error, and the vehicle-side receiving unit 222 has acquired a mounted object reference target position (hereinafter referred to as the second mounted object reference target position) from the second communication device mounted object, the identity determination unit 204 preferably determines whether the first target whose position relative to the own vehicle was specified by the target position specifying unit 282 and the target detected by the perimeter monitoring sensor 40 of the second communication device mounted object that is the transmission source of the second mounted object reference target position (hereinafter referred to as the second target) are the same.
• the identity determination unit 204 determines whether the first target and the second target are the same based on whether the target information acquired by the vehicle-side receiving unit 222 from the first communication device mounted object and the target information acquired from the second communication device mounted object approximate each other.
  • the target information to be compared for determining whether or not they are the same may be, for example, the type of target, the speed of the target, and the direction of movement of the target.
• for example, the identity determination unit 204 may determine that the two are the same when all of the following conditions are satisfied: the types match, the speed difference is less than a threshold, and the moving-direction difference is less than a threshold.
• the conditions for the identity determination unit 204 to determine identity may be only some of a match in type, a speed difference less than a threshold, and a moving-direction difference less than a threshold. Further, the identity determination unit 204 may determine whether the first target and the second target are the same based on whether the first mounted object reference target position and the second mounted object reference target position, each converted into a position relative to the own vehicle by the conversion unit 205, approximate each other. In this case, the identity determination unit 204 may use the condition that the difference between the converted positions is less than a threshold, either alone or in addition to the conditions that the types match, the speed difference is less than a threshold, and the moving-direction difference is less than a threshold.
• when the identity determination unit 204 determines that the first target and the second target are the same, the error estimation unit 206 may estimate the positioning error of the second communication device mounted object based on the position of the own vehicle (hereinafter referred to as the second positioning error) from the deviation between the second mounted object reference target position converted by the conversion unit 205 and the position of the first target relative to the own vehicle specified by the target position specifying unit 282. According to this, the second positioning error can be estimated even if neither the perimeter monitoring sensor 302 of the roadside unit 3 nor the perimeter monitoring sensor 40 of the own vehicle has ever detected the second communication device mounted object, which makes it possible to specify the relative position of the second communication device mounted object with respect to the own vehicle with higher accuracy.
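Under the same sign convention as for the first positioning error (reported minus reference), a sketch of this second-error estimation might be (names hypothetical):

```python
def estimate_second_positioning_error(second_mount_ref_pos, first_target_pos):
    """Second positioning error: deviation of the second mounted object
    reference target position (converted into own-vehicle coordinates)
    from the already-corrected position of the same target, as specified
    via the first communication device mounted object. Both are (x, y)."""
    return (second_mount_ref_pos[0] - first_target_pos[0],
            second_mount_ref_pos[1] - first_target_pos[1])
```

Because the first target's position has already had the first positioning error removed, the residual deviation is attributed to the second sender's positioning.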
• the error estimation unit 206 stores the estimated second positioning error in the error storage unit 207.
  • the error estimator 206 may associate the estimated second positioning error with the transmission source identification information of the second communication device-mounted object from which the second positioning error was estimated, and store them in the error storage unit 207 .
• the mounted object position specifying unit 281 specifies the position of the second communication device mounted object relative to the own vehicle by correcting the second positioning position acquired by the vehicle-side receiving unit 222 from the second communication device mounted object by the second positioning error. According to this, even if neither the perimeter monitoring sensor 302 of the roadside unit 3 nor the perimeter monitoring sensor 40 of the own vehicle can detect the second communication device mounted object, the relative position of the second communication device mounted object with respect to the own vehicle can be specified with higher accuracy.
• the mounted object position specifying unit 281 stores the position of the second communication device mounted object relative to the own vehicle, specified as described above, in the position storage unit 209.
  • the position storage unit 209 may be a volatile memory. Note that the position of the second communication device mounted object that can be detected by the surroundings monitoring sensor 40 of the own vehicle may be stored in the memory of the surroundings monitoring ECU 50 without being stored in the position storage unit 209 . As for the position of the second communication device mounted object within the detection range of the surroundings monitoring sensor 40, the position detected by the surroundings monitoring sensor 40 of the own vehicle may be used.
• the target position specifying unit 282 preferably specifies the position of the second target relative to the own vehicle by correcting the mounted object reference target position acquired by the vehicle-side receiving unit 222 from the second communication device mounted object (hereinafter referred to as the second mounted object reference target position) by the second positioning error.
  • the second target is, as described above, a target detected by the perimeter monitoring sensor 40 of the second communication device mounted object.
  • the correction may be performed by shifting the position of the second mounted object reference target so as to eliminate the deviation of the second positioning error.
• according to this, the relative position of the second target with respect to the own vehicle can be specified with higher accuracy even from the second mounted object reference target position acquired by the vehicle-side receiving unit 222 from the second communication device mounted object via wireless communication.
  • the target position specifying unit 282 stores the specified position of the second target relative to the own vehicle in the position storage unit 209 .
• the driving support ECU 60 uses the positions of targets, including communication device mounted objects, stored in the position storage unit 209 to provide driving support such as vehicle control for avoiding approach to a target and alerting the driver to a target. Therefore, even for a target beyond the line of sight of the own vehicle, more accurate driving support can be performed by using the relative position with respect to the own vehicle specified with higher accuracy.
• <Positioning Error Estimation Related Processing in Communication Device 20> An example of the flow of positioning error estimation related processing in the communication device 20 will be described using the flowchart of FIG. 5.
  • Execution of this process means execution of the error estimation method.
• the flowchart of FIG. 5 may be configured to start, for example, when a switch for starting the internal combustion engine or motor generator of the own vehicle (hereinafter referred to as a power switch) is turned on.
• in step S1, when the road-vehicle-side receiving unit 231 receives target information transmitted from the roadside unit 3 through road-to-vehicle communication (YES in S1), the process proceeds to step S2. On the other hand, if target information transmitted through road-to-vehicle communication has not been received (NO in S1), the process proceeds to step S8.
• in step S2, when the vehicle-side receiving unit 222 receives target information transmitted from another vehicle via inter-vehicle communication (YES in S2), the process proceeds to step S3. On the other hand, if target information has not been received through inter-vehicle communication (NO in S2), the process proceeds to step S13.
• in S2, it is determined that target information transmitted from another vehicle through inter-vehicle communication has been received when such target information arrives within a certain period of time after the target information was received from the roadside unit 3 in S1. The term "within a certain period of time" here may be, for example, a period equal to or less than the transmission cycle of target information from the roadside unit 3, and may be set arbitrarily.
• in step S3, the identity determination unit 204 determines whether the roadside unit detection target, which is the target detected by the perimeter monitoring sensor 302 of the roadside unit 3 from which the target information was acquired in S1, and the communicating other vehicle, which is the other vehicle that is the transmission source of the target information acquired in S2, are the same.
• in step S4, when it is determined that the roadside unit detection target and the communicating other vehicle are the same (YES in S4), the process proceeds to step S5. On the other hand, if it is determined that they are not the same (NO in S4), the process proceeds to step S8.
• the communicating other vehicle determined to be the same as the roadside unit detection target corresponds to the first communication device mounted object.
• in step S5, the conversion unit 205 converts the roadside unit reference target position in the target information received in S1 and the positioning position in the target information received in S2 into positions relative to the own vehicle.
• the processing of S5 may be performed before the processing of S3. In this case, in S3, whether the roadside unit detection target and the communicating other vehicle are the same may be determined using the roadside unit reference target position and the positioning position converted in S5.
• in step S6, the error estimation unit 206 estimates the first positioning error, which is the positioning error of the first communication device mounted object based on the position of the own vehicle, from the deviation between the roadside unit reference target position and the positioning position converted in S5 for the targets determined to be the same in S4.
• in step S7, the first positioning error estimated in S6 is stored in the error storage unit 207 in association with the transmission source identification information of the first communication device mounted object for which the first positioning error was estimated.
• in step S8, if the vehicle-side receiving unit 222 receives target information transmitted from another vehicle via inter-vehicle communication (YES in S8), the process proceeds to step S9. On the other hand, if target information has not been received through inter-vehicle communication (NO in S8), the process proceeds to step S13.
• in step S9, if a positioning error is already stored in the error storage unit 207 for the other vehicle from which the target information was received in S2 or S8 (YES in S9), the process proceeds to step S11. On the other hand, if no positioning error is stored in the error storage unit 207 (NO in S9), the process proceeds to step S10.
  • the positioning error here includes the above-described first positioning error and second positioning error.
• in step S10, a second estimation-related process is performed, and the process proceeds to step S13.
• an example of the flow of the second estimation-related process will be described using the flowchart of FIG. 6.
• in step S101, when the vehicle-side receiving unit 222 receives target information transmitted via inter-vehicle communication from an other vehicle different from the one received from in S2 or S8 (YES in S101), the process proceeds to step S102. On the other hand, if target information transmitted via inter-vehicle communication from such a different other vehicle has not been received (NO in S101), the process proceeds to step S13.
• it is assumed here that the target information transmitted from the different other vehicle through inter-vehicle communication is received within a certain period of time after the reception in S2 or S8.
  • This different other vehicle corresponds to the second communication device mounted object, and the mounted object reference target position included in this target information corresponds to the second mounted object reference target position.
  • the term “within a certain period of time” as used herein may be a period of time that can be arbitrarily set.
• the different other vehicle may be identified by the identity determination unit 204 based on the transmission source identification information included in the target information.
• in step S102, if the position of the first target relative to the own vehicle has been specified by the target position specifying unit 282 by correcting with the first positioning error (YES in S102), the process proceeds to step S103. On the other hand, if it has not been specified (NO in S102), the process proceeds to step S13.
• in step S103, the identity determination unit 204 determines whether the first target whose position relative to the own vehicle was specified by the target position specifying unit 282 and the second target, which is the target detected by the perimeter monitoring sensor 40 of the second communication device mounted object that is the transmission source of the target information received in S101, are the same.
• in step S104, when it is determined that the first target and the second target are the same (YES in S104), the process proceeds to step S105. On the other hand, if it is determined that they are not the same (NO in S104), the process proceeds to step S13.
• in step S105, the second mounted object reference target position in the target information received in S101 is converted into a position relative to the own vehicle.
• the processing of S105 may be performed before the processing of S103. In this case, in S103, whether the first target and the second target are the same may be determined using the second mounted object reference target position converted in S105.
• in step S106, the error estimation unit 206 estimates the second positioning error, which is the positioning error of the second communication device mounted object based on the position of the own vehicle, from the deviation between the second mounted object reference target position converted in S105 and the position of the first target relative to the own vehicle specified by the target position specifying unit 282, for the targets determined to be the same in S104.
• in step S107, the second positioning error estimated in S106 is stored in the error storage unit 207 in association with the transmission source identification information of the second communication device mounted object for which the second positioning error was estimated, and the process proceeds to step S13.
• in step S11, the position specifying unit 208 specifies the position relative to the own vehicle by correcting at least one of the positioning position and the mounted object reference target position received in S2 or S8 by the positioning error stored in the error storage unit 207.
• for example, the mounted object position specifying unit 281 corrects the first positioning position by the first positioning error to specify the position of the first communication device mounted object relative to the own vehicle, and the target position specifying unit 282 corrects the first mounted object reference target position by the first positioning error to specify the position of the first target relative to the own vehicle. Similarly, the mounted object position specifying unit 281 corrects the second positioning position by the second positioning error to specify the position of the second communication device mounted object relative to the own vehicle, and the target position specifying unit 282 corrects the second mounted object reference target position by the second positioning error to specify the position of the second target relative to the own vehicle.
  • correction may be performed after the conversion unit 205 converts the mounted object reference target position.
  • correction may be performed before the conversion unit 205 converts the mounted object reference target position.
• in step S12, the position of the target specified in S11 is stored in the position storage unit 209.
• in step S13, if it is time to end the positioning error estimation related processing (YES in S13), the positioning error estimation related processing ends. On the other hand, if it is not yet the end timing (NO in S13), the process returns to S1 and the processing is repeated.
  • An example of the termination timing of the positioning error estimation-related processing is when the power switch is turned off.
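Condensing the S1 to S12 flow above, one cycle of the processing might be sketched as follows (the API, message shapes, and identity predicate are hypothetical simplifications; all positions are assumed to have already been converted into the own vehicle coordinate system):

```python
def positioning_error_cycle(roadside_target_pos, v2v_msgs, error_store,
                            position_store, is_same):
    """One simplified cycle of the FIG. 5 processing.

    roadside_target_pos: roadside unit reference target position (x, y)
        in own-vehicle coordinates, or None if nothing was received (S1).
    v2v_msgs: list of (source_id, positioning_pos, target_positions)
        received via inter-vehicle communication (S2/S8).
    is_same: identity-determination predicate on two positions (S3/S4)."""
    # S3-S7: estimate and store the first positioning error when the
    # roadside-detected target and a sender are judged identical.
    for src, pos, _ in v2v_msgs:
        if roadside_target_pos is not None and is_same(roadside_target_pos, pos):
            error_store[src] = (pos[0] - roadside_target_pos[0],
                                pos[1] - roadside_target_pos[1])
    # S9/S11/S12: correct positions of senders whose error is stored.
    for src, pos, targets in v2v_msgs:
        if src in error_store:
            ex, ey = error_store[src]
            position_store[src] = (pos[0] - ex, pos[1] - ey)
            position_store[src + '/targets'] = [
                (tx - ex, ty - ey) for tx, ty in targets]
    return error_store, position_store
```

The second estimation-related process (S10, FIG. 6) would slot in for senders without a stored error; it is omitted here to keep the sketch short.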
• since the first positioning position is determined in the first communication device mounted object by positioning using signals from positioning satellites, it contains a positioning error.
• on the other hand, the roadside unit reference target position is the position of a target detected by the perimeter monitoring sensor 302 of the roadside unit 3, which holds information on the absolute position of its own device with higher positional accuracy than positioning using signals from positioning satellites. Since it is a position based on the position of the roadside unit 3, its positional accuracy is higher than that obtained by positioning using signals from positioning satellites.
• therefore, the first positioning error, which is the positioning error of the first communication device mounted object based on the position of the own vehicle, can be estimated more accurately from the deviation between the roadside unit reference target position obtained from the roadside unit 3 and the first positioning position obtained from the first communication device mounted object.
• moreover, this estimation of the first positioning error is possible even when the first communication device mounted object cannot be detected by the perimeter monitoring sensor 40 of the own vehicle.
• by using the first positioning error, the positioning error of the first communication device mounted object can be corrected and the position of the first communication device mounted object can be specified with higher accuracy. As a result, the position of a communication device mounted object outside the detection range of the perimeter monitoring sensor 40 of the own vehicle can also be specified more accurately.
• furthermore, even a target outside the detection range of the surroundings monitoring sensor 40 of the own vehicle may be within the detection range of the surroundings monitoring sensor 40 of the first communication device mounted object. For such a target, by correcting the first mounted object reference target position acquired from the first communication device mounted object through inter-vehicle communication by the first positioning error, its position relative to the own vehicle can be specified more accurately.
• since the second mounted object reference target position is a position based on the second positioning position of the second communication device mounted object, it contains the positioning error of the second communication device mounted object.
• on the other hand, the position of the first target relative to the own vehicle, which is specified by the target position specifying unit 282 by correcting the first mounted object reference target position by the first positioning error, has been corrected so that the positioning error is reduced. Therefore, the second positioning error, which is the positioning error of the second communication device mounted object based on the position of the own vehicle, can be estimated more accurately from the deviation between the converted second mounted object reference target position and the position of the first target relative to the own vehicle.
• accordingly, even for a target outside the detection ranges of the perimeter monitoring sensors 40 of both the own vehicle and the first communication device mounted object, if it is within the detection range of the perimeter monitoring sensor 40 of the second communication device mounted object, its position relative to the own vehicle can be specified with higher accuracy by correcting the second mounted object reference target position acquired from the second communication device mounted object through inter-vehicle communication by the second positioning error.
• a specific example of Embodiment 1 will be described using FIG. 1. LMa, LMb, and LMc in FIG. 1 are targets such as pedestrians. It is assumed that these targets are not communication device mounted objects.
  • the target LMa can be detected by the peripheral monitoring sensors 40 of the vehicle VEb and the vehicle VEc.
  • the target LMb can be detected by the surroundings monitoring sensor 40 of the vehicle VEb, but cannot be detected by the surroundings monitoring sensor 40 of the vehicle VEa and the vehicle VEc.
  • the target LMc can be detected by the surroundings monitoring sensor 40 of the vehicle VEc, but cannot be detected by the surroundings monitoring sensor 40 of the vehicle VEa and the vehicle VEb.
  • the vehicle VEb cannot be detected by the surrounding monitoring sensors 40 of the vehicle VEa and the vehicle VEc and the surrounding monitoring sensor 302 of the roadside unit 3 .
• the vehicle VEc cannot be detected by the perimeter monitoring sensors 40 of the vehicle VEa and the vehicle VEb, but can be detected by the perimeter monitoring sensor 302 of the roadside unit 3.
• in the following, the vehicle VEa is the own vehicle, and the vehicles VEb and VEc are other vehicles.
• the vehicle VEc corresponds to the first communication device mounted object, and the vehicle VEb corresponds to the second communication device mounted object.
  • the target LMa corresponds to the first target and also to the second target.
• the target LMb corresponds to the second target but not to the first target, while the target LMc corresponds to the first target but not to the second target.
• the positioning error of the vehicle VEc can be estimated from the positioning position of the vehicle VEc obtained from the vehicle VEc through inter-vehicle communication and the roadside unit reference target position of the vehicle VEc, detected by the perimeter monitoring sensor 302 of the roadside unit 3 and obtained from the roadside unit 3 through road-to-vehicle communication. This positioning error becomes the first positioning error.
• the positioning error contained in the mounted object reference target position transmitted from the vehicle VEc can be corrected in the same manner.
• therefore, even though the target LMc cannot be detected by the perimeter monitoring sensor 40 of the own vehicle VEa, the position of the target LMc relative to the own vehicle VEa can be specified accurately from the mounted object reference target position of the target LMc acquired from the vehicle VEc through inter-vehicle communication and the first positioning error of the vehicle VEc.
• furthermore, the positioning error in the vehicle VEb can be estimated using the target LMa, which can be detected in common by the perimeter monitoring sensors 40 of the vehicle VEb and the vehicle VEc. This error becomes the second positioning error.
• specifically, the second positioning error can be estimated from the deviation between the position of the target LMa corrected using the estimated first positioning error and the mounted object reference target position of the target LMa obtained from the vehicle VEb through inter-vehicle communication, and the position of the vehicle VEb relative to the own vehicle VEa can then be specified with high accuracy.
• in addition, even if the own vehicle VEa cannot detect the target LMb, the position of the target LMb relative to the own vehicle VEa can be specified with high accuracy from the second mounted object reference target position of the target LMb acquired from the vehicle VEb through inter-vehicle communication and the second positioning error in the vehicle VEb.
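The chain of estimates in this example can be traced with hypothetical numbers (all coordinates are illustrative, share one common frame, and are not from the disclosure):

```python
# Roadside unit 3 reports vehicle VEc at (30.0, 10.0); VEc reports its own
# positioning position as (31.5, 9.0) -> first positioning error of VEc.
first_err = (31.5 - 30.0, 9.0 - 10.0)  # (1.5, -1.0)

# VEc reports target LMa at (40.0, 12.0); correcting by VEc's error gives
# the refined position of LMa.
lma_corrected = (40.0 - first_err[0], 12.0 - first_err[1])  # (38.5, 13.0)

# VEb also reports LMa, at (39.2, 13.6); the deviation from the corrected
# LMa position is VEb's (second) positioning error.
second_err = (39.2 - lma_corrected[0], 13.6 - lma_corrected[1])  # ~(0.7, 0.6)

# Finally, target LMb reported by VEb at (45.0, 20.0) can be corrected,
# even though neither the own vehicle VEa nor VEc ever detected LMb.
lmb_corrected = (45.0 - second_err[0], 20.0 - second_err[1])  # ~(44.3, 19.4)
```

Each correction simply cancels the per-axis error of whichever sender reported the position, which is why the estimate can be chained from VEc to VEb to LMb.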
• (Embodiment 2) An overview of the second embodiment will be described with reference to FIG.
• in the second embodiment, the vehicle unit 2 sequentially transmits CAMs (Cooperative Awareness Messages) and CPMs (Collective Perception Messages).
  • the roadside device 3 also sequentially transmits the CPM.
• the communication device 20 provided in the vehicle unit 2 performs identity determination, error estimation, and the like in the same manner as in the first embodiment, using the information contained in those messages. Before describing processes such as identity determination and error estimation in the second embodiment, CAM and CPM will be described.
  • FIG. 8 is a diagram showing an exemplary architecture of a V2X communication device.
  • a V2X communication device is used instead of the communication device 20 of the first embodiment.
• the V2X communication device also has the configuration shown in FIG. 4, like the communication device 20 of the first embodiment.
  • a V2X communication device is a communication device that transmits target information.
  • the V2X communication device may perform communication between vehicles, vehicles and infrastructure, vehicles and bicycles, vehicles and mobile terminals, and the like.
  • the V2X communication device may correspond to an onboard device of a vehicle or may be included in the onboard device.
  • the onboard equipment is sometimes called an OBU (On-Board Unit).
  • the communication device may correspond to the infrastructure roadside unit or may be included in the roadside unit.
  • a roadside unit is sometimes called an RSU (Road Side Unit).
  • the communication device can also be one element that constitutes an ITS (Intelligent Transport System). If it is an element of the ITS, the communication device may correspond to an ITS station (ITS-S) or be included in the ITS-S.
  • ITS-S is a device for information exchange, and may be any of OBU, RSU, and mobile terminal, or may be included therein.
  • the mobile terminal is, for example, a PDA (Personal Digital Assistant) or a smart phone.
  • The communication device may correspond to a WAVE (Wireless Access in Vehicular Environments) device disclosed in IEEE 1609, or may be included in a WAVE device.
  • The following description assumes a V2X communication device mounted on a vehicle.
  • This V2X communication device has a function of providing CA (Cooperative Awareness) service and CP (Collective Perception) service.
  • In the CA service, the V2X communication device transmits CAMs.
  • In the CP service, the V2X communication device transmits CPMs. It should be noted that the same or similar methods disclosed below can be applied even if the communication device is an RSU or a mobile terminal.
  • the architecture shown in FIG. 8 is based on the ITS-S reference architecture according to EU standards.
  • the architecture shown in FIG. 8 comprises an application layer 110, a facility layer 120, a network & transport layer 140, an access layer 130, a management layer 150, and a security layer 160.
  • the application layer 110 implements or supports various applications 111 .
  • FIG. 8 shows, as examples of the applications 111, a traffic safety application 111a, an efficient traffic information application 111b, and other applications 111c.
  • the facility layer 120 supports the execution of various use cases defined in the application layer 110.
  • the facility layer 120 can support the same or similar functionality as the top three layers (application layer, presentation layer and session layer) in the OSI reference model.
  • Facility means providing functions, information and data.
  • Facility layer 120 may provide the functionality of a V2X communication device.
  • facility layer 120 may provide the functions of application support 121, information support 122, and communication support 123 shown in FIG.
  • the application support 121 has functions that support basic application sets or message sets.
  • An example of a message is a V2X message.
  • V2X messages can include periodic messages such as CAMs and event messages such as DENMs (Decentralized Environmental Notification Messages).
  • Facility layer 120 may also support CPM.
  • the information support 122 has the function of providing common data or databases used for the basic application set or message set.
  • a database is the Local Dynamic Map (LDM).
  • the communication support 123 has functions for providing services for communication and session management.
  • Communication support 123 provides, for example, address mode and session support.
  • facility layer 120 supports a set of applications or a set of messages. That is, the facility layer 120 generates message sets or messages based on the information that the application layer 110 should send and the services it should provide. Messages generated in this way are sometimes referred to as V2X messages.
  • The access layer 130 includes an external IF (interface) 131 and an internal IF 132, and can transmit messages/data received from the upper layers via physical channels.
  • The access layer 130 may conduct or support data communication according to the following communication technologies.
  • the communication technology is, for example, a communication technology based on the IEEE 802.11 and/or 802.11p standard, an ITS-G5 wireless communication technology based on the physical transmission technology of the IEEE 802.11 and/or 802.11p standard, a satellite/broadband wireless mobile communication including 2G/3G/4G (LTE)/5G wireless mobile communication technology, broadband terrestrial digital broadcasting technology such as DVB-T/T2/ATC, GNSS communication technology, and WAVE communication technology.
  • the network & transport layer 140 can configure a vehicle communication network between homogeneous/heterogeneous networks using various transport protocols and network protocols.
  • the transport layer is the connecting layer between the upper and lower layers. Upper layers include a session layer, a presentation layer, and an application layer 110 .
  • the lower layers include the network layer, data link layer, and physical layer.
  • the transport layer can manage transmitted data to arrive at its destination correctly. At the source, the transport layer processes the data into appropriately sized packets for efficient data transmission. On the receiving side, the transport layer takes care of restoring the received packets to the original file.
  • Transport protocols are, for example, TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and BTP (Basic Transport Protocol).
  • the network layer can manage logical addresses.
  • the network layer may also determine the routing of packets.
  • the network layer may receive packets generated by the transport layer and add the logical address of the destination to the network layer header. Packet transmission routes may be unicast/multicast/broadcast between vehicles, between vehicles and fixed stations, and between fixed stations.
  • For networking, geo-networking, IPv6 networking with mobility support, or IPv6 over geo-networking may be considered.
  • the architecture of the V2X communication device may further include a management layer 150 and a security layer 160.
  • Management layer 150 manages the communication and interaction of data between layers.
  • the management layer 150 comprises a management information base 151 , regulation management 152 , interlayer management 153 , station management 154 and application management 155 .
  • Security layer 160 manages security for all layers.
  • Security layer 160 comprises firewall and intrusion detection management 161 , authentication, authorization and profile management 162 and security management information base 163 .
  • V2X messages may also be referred to as ITS messages.
  • V2X messages can be generated at application layer 110 or facility layer 120 . Examples of V2X messages are CAM, DENM, CPM.
  • the transport layer in the network & transport layer 140 generates BTP packets.
  • a network layer at network & transport layer 140 may encapsulate the BTP packets to produce geo-networking packets.
  • Geo-networking packets are encapsulated in LLC (Logical Link Control) packets.
  • LLC Logical Link Control
  • the data may include message sets.
  • a message set is, for example, basic safety messages.
  • BTP is a protocol for transmitting V2X messages generated in the facility layer 120 to lower layers.
  • the BTP header has A type and B type.
  • The A-type BTP header may contain the destination port and source port required for transmission and reception in two-way packet transmission.
  • The B-type BTP header can include the destination port and destination port information required for transmission in non-bidirectional packet transmission.
  • the destination port identifies the facility entity corresponding to the destination of the data contained in the BTP packet (BTP-PDU).
  • a BTP-PDU is unit transmission data in BTP.
  • the source port is a field generated for the BTP-A type.
  • the source port indicates the port of the facility layer 120 protocol entity at the source of the corresponding packet. This field can have a size of 16 bits.
  • Destination port information is a field generated for the BTP-B type. It provides additional information when the destination port is a well-known port. This field can have a size of 16 bits.
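The two header layouts above can be sketched directly, since each consists of two 16-bit fields. The encoding below is a minimal illustration assuming the big-endian field order of ETSI EN 302 636-5-1; the example destination port 2001 (the well-known CAM port in the ETSI port registry) is an assumption, not something stated in this document.

```python
# Hedged sketch of the BTP-A and BTP-B headers described above:
# two 16-bit fields each, packed big-endian.
import struct

def btp_a_header(dest_port: int, source_port: int) -> bytes:
    """BTP-A: destination port + source port (two-way packet transmission)."""
    return struct.pack("!HH", dest_port, source_port)

def btp_b_header(dest_port: int, dest_port_info: int) -> bytes:
    """BTP-B: destination port + destination port info (one-way transmission)."""
    return struct.pack("!HH", dest_port, dest_port_info)

# Port 2001 for CAM is an assumption taken from the ETSI port registry.
hdr = btp_b_header(2001, 0)
assert hdr == b"\x07\xd1\x00\x00" and len(hdr) == 4
```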
  • a geo-networking packet includes a basic header and a common header according to the network layer protocol, and optionally includes an extension header according to the geo-networking mode.
  • An LLC packet is a geo-networking packet with an LLC header added.
  • the LLC header provides the ability to differentiate and transmit IP data and geo-networking data.
  • IP data and geo-networking data can be distinguished by SNAP (Subnetwork Access Protocol) Ethertype.
  • When IP data is transmitted, the Ethertype is set to 0x86DD and may be included in the LLC header. When geo-networking data is transmitted, the Ethertype may be set to 0x86DC and included in the LLC header. The receiver checks the Ethertype field in the LLC packet header and forwards the packet to the IP data path or the geo-networking path depending on the value of that field.
  • the LLC header contains DSAP (Destination Service Access Point) and SSAP (Source Service Access Point).
  • SSAP is followed by a control field (Control in FIG. 9), protocol ID, and ethertype.
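The receive-side branching described above amounts to a dispatch on the Ethertype value. The following is a minimal sketch of that decision; the function and path names are illustrative.

```python
# Hedged sketch of the LLC-header dispatch described above: 0x86DD routes to
# the IPv6 stack, 0x86DC to the geo-networking stack.
ETHERTYPE_IPV6 = 0x86DD
ETHERTYPE_GEONET = 0x86DC

def dispatch(ethertype: int, payload: bytes) -> str:
    if ethertype == ETHERTYPE_IPV6:
        return "ip-path"        # hand the payload to the IPv6 stack
    if ethertype == ETHERTYPE_GEONET:
        return "geonet-path"    # hand the payload to the geo-networking stack
    return "drop"               # unknown protocol

assert dispatch(0x86DD, b"") == "ip-path"
assert dispatch(0x86DC, b"") == "geonet-path"
```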
  • FIG. 10 shows the logical interfaces between the CA (Cooperative Awareness) service 128 and other layers in the V2X communication device architecture.
  • V2X communication devices may provide various services for traffic safety and efficiency.
  • One of the services may be CA service 128 .
  • Cooperative awareness in road traffic means that road users and roadside infrastructure can know each other's location, dynamics, and attributes.
  • Road users are all users on and around roads for which traffic safety and control are required, such as automobiles, trucks, motorcycles, cyclists, and pedestrians.
  • Roadside infrastructure refers to equipment such as road signs, traffic lights, barriers, and entrances.
  • Such information is exchanged via V2V (vehicle-to-vehicle), V2I (vehicle-to-infrastructure), I2V (infrastructure-to-vehicle), and other X2X wireless networks.
  • V2X communication devices can provide situational awareness through their sensors and communication with other V2X communication devices.
  • the CA service can specify how the V2X communication device communicates its location, behavior and attributes by sending CAMs.
  • the CA service 128 may be an entity of the facility layer 120.
  • CA service 128 may be part of the application support domain of facility layer 120 .
  • the CA service 128 can apply the CAM protocol and provide two services of CAM transmission and reception.
  • CA service 128 may also be referred to as a CAM basic service.
  • the source ITS-S configures the CAM.
  • the CAM is sent to network & transport layer 140 for transmission.
  • the CAM may be sent directly from the source ITS-S to all ITS-S within range.
  • the communication range is varied by changing the transmission power of the source ITS-S.
  • CAMs are generated periodically at a frequency controlled by the CA service 128 of the originating ITS-S. The frequency of generation is determined by considering the state change of the source ITS-S. Conditions to consider are, for example, changes in the position or velocity of the source ITS-S, loading of the radio channel.
  • Upon receiving a CAM, the CA service 128 provides the contents of the CAM to entities such as the application 111 and/or the LDM 127.
  • CA service 128 interfaces with entities of facility layer 120 and application layer 110 to collect relevant information for CAM generation and to further process received CAM data.
  • An example of an entity for ITS-S data collection in a vehicle is a VDP (Vehicle Data Provider) 125, a POTI (position and time) unit 126, and an LDM 127.
  • the VDP 125 is connected to the vehicle network and provides vehicle status information.
  • the POTI unit 126 provides ITS-S position and time information.
  • the LDM 127 is a database within ITS-S, as described in ETSI TR 102 863, which can be updated with received CAM data.
  • Application 111 can obtain information from LDM 127 and perform further processing.
  • the CA service 128 connects with the network & transport layer 140 via NF-SAP (Network & Transport/Facilities Service Access Point) in order to exchange CAMs with other ITS-S.
  • the CA service 128 interfaces with the security layer 160 via SF-SAP (Security Facilities Service Access Point) for CAM transmission and CAM reception security services.
  • When the CA service 128 directly provides the received CAM data to the application 111, it connects with the management layer 150 via the MF-SAP (Management/Facilities Service Access Point) and with the application layer 110 via the FA-SAP (Facilities/Applications Service Access Point).
  • FIG. 11 shows the functional blocks of CA service 128 and interfaces to other functions and layers.
  • CA service 128 provides four sub-functions of CAM encoding section 1281 , CAM decoding section 1282 , CAM transmission management section 1283 and CAM reception management section 1284 .
  • the CAM encoding unit 1281 constructs and encodes a CAM according to a predefined format.
  • CAM decoding section 1282 decodes the received CAM.
  • the CAM transmission manager 1283 executes the protocol operation of the source ITS-S to transmit the CAM.
  • the CAM transmission management unit 1283 executes, for example, activation and termination of CAM transmission operation, determination of CAM generation frequency, and triggering of CAM generation.
  • the CAM reception manager 1284 performs protocol operations specified in the receiving ITS-S to receive the CAM.
  • the CAM reception management unit 1284 for example, activates the CAM decoding function when receiving CAM.
  • the CAM reception management unit 1284 provides the received CAM data to the application 111 of the ITS-S on the reception side.
  • the CAM reception management unit 1284 may perform information check of the received CAM.
  • The interface IF.CAM is the interface to the LDM 127 or the application 111.
  • the interface to the application layer 110 may be implemented as an API (Application Programming Interface), and data may be exchanged between the CA service 128 and the application 111 via this API.
  • the interface to application layer 110 can also be implemented as FA-SAP.
  • the CA Service 128 interacts with other entities of the Facility Layer 120 to obtain the data necessary for CAM generation.
  • The set of entities that provide data for the CAM are called data providing entities or data providing facilities.
  • Data is exchanged between the data providing entities and the CA service 128 via the interface IF.FAC.
  • The CA service 128 exchanges information with the network & transport layer 140 via the interface IF.N&T. At the source ITS-S, the CA service 128 provides the CAM to the network & transport layer 140 along with protocol control information (PCI). The PCI is control information according to ETSI EN 302 636-5-1. The CAM is embedded in the service data unit (FL-SDU) of the facility layer 120.
  • the network & transport layer 140 can provide the received CAM to the CA service 128.
  • the interface between the CA service 128 and the network & transport layer 140 relies on geo-networking/BTP stack services, or IPv6 stacks, and IPv6/geo-networking combined stacks.
  • CAM may rely on services provided by the GeoNetworking (GN)/BTP stack.
  • The GN packet transmission type used for CAM transmission is SHB (Single Hop Broadcast).
  • the PCI passed from CA service 128 to the geo-networking/BTP stack may include BTP type, destination port, destination port information, GN packet transmission type, GN communication profile, GN security profile. This PCI may also include GN traffic class, GN maximum packet lifetime.
  • a CAM can use an IPv6 stack or a combined IPv6/geo-networking stack to transmit the CAM, as specified in ETSI TS 102 636-3. If the combined IPv6/geo-networking stack is used for CAM transmission, the interface between the CA service 128 and the combined IPv6/geo-networking stack may be the same as the interface between the CA service 128 and the IPv6 stack.
  • the CA service 128 can exchange primitives with management entities of the management layer 150 via MF-SAP. Primitives are elements of information that are exchanged, such as instructions.
  • The sender ITS-S obtains the setting information of T_GenCam_DCC from the management entity via the interface IF.Mng.
  • The CA service 128 can exchange primitives with the ITS-S security entity via the interface IF.Sec, which is provided via the SF-SAP.
  • Point-to-multipoint communication may be used for CAM transmission.
  • A CAM is sent in a single hop from the source ITS-S only to receiver ITS-Ss located within direct communication range of the source ITS-S. Since it is a single hop, the receiving ITS-S does not forward the received CAM.
  • the initiation of the CA service 128 may be different for different types of ITS-S, such as vehicle ITS-S, roadside ITS-S, and personal ITS-S. As long as CA service 128 is active, CAM generation is managed by CA service 128 .
  • the CA service 128 is activated at the same time as the ITS-S is activated and terminated when the ITS-S is deactivated.
  • the frequency of CAM generation is managed by the CA service 128.
  • the CAM generation frequency is the time interval between two consecutive CAM generations.
  • the CAM generation interval is set so as not to fall below the minimum value T_GenCamMin.
  • the minimum value T_GenCamMin of the CAM generation interval is 100 ms. If the CAM generation interval is 100 ms, the CAM generation frequency will be 10 Hz.
  • the CAM generation interval is set so as not to exceed the maximum value T_GenCamMax.
  • the maximum value T_GenCamMax of the CAM generation interval is 1000 ms. If the CAM generation interval is 1000 ms, the CAM generation frequency will be 1 Hz.
  • the CA service 128 triggers CAM generation according to the dynamics of the source ITS-S and the congestion state of the channel. If the dynamics of the source ITS-S shortens the CAM generation interval, this interval is maintained for consecutive CAMs.
  • the CA service 128 repeatedly checks the conditions that trigger CAM generation every T_CheckCamGen. T_CheckCamGen is below T_GenCamMin.
  • The parameter T_GenCam_DCC provides the minimum generation time interval between two consecutive CAMs, reducing CAM generation according to the channel usage requirements for distributed congestion control (DCC) specified in ETSI TS 102 724. This facilitates adjusting the CAM generation frequency to the remaining capacity of the radio channel during channel congestion.
  • DCC distributed congestion control
  • T_GenCam_DCC is provided by the management entity in milliseconds.
  • the range of values for T_GenCam_DCC is T_GenCamMin ⁇ T_GenCam_DCC ⁇ T_GenCamMax.
  • If the management entity gives T_GenCam_DCC a value greater than T_GenCamMax, T_GenCam_DCC is set to T_GenCamMax. If the management entity gives T_GenCam_DCC a value less than or equal to T_GenCamMin, or if T_GenCam_DCC is not provided, T_GenCam_DCC is set to T_GenCamMin.
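The clamping rule above can be sketched in a few lines. This is only an illustration of the described rule, using the T_GenCamMin = 100 ms and T_GenCamMax = 1000 ms values from the text.

```python
# Hedged sketch of the T_GenCam_DCC clamping rule described above
# (values in milliseconds).
T_GEN_CAM_MIN = 100    # minimum CAM generation interval (10 Hz)
T_GEN_CAM_MAX = 1000   # maximum CAM generation interval (1 Hz)

def clamp_t_gencam_dcc(value_ms):
    """Return the effective T_GenCam_DCC for a value from the management entity."""
    if value_ms is None or value_ms <= T_GEN_CAM_MIN:
        return T_GEN_CAM_MIN          # not provided, or at/below the minimum
    if value_ms > T_GEN_CAM_MAX:
        return T_GEN_CAM_MAX          # above the allowed range
    return value_ms                   # already within range

assert clamp_t_gencam_dcc(None) == 100
assert clamp_t_gencam_dcc(250) == 250
assert clamp_t_gencam_dcc(5000) == 1000
```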
  • In LTE-V2X, DCC and T_GenCam_DCC do not apply.
  • the access layer 130 manages channel congestion control.
  • T_GenCam is the upper limit of the currently valid CAM generation interval. The default value of T_GenCam is T_GenCamMax. When a CAM is generated according to condition 1 below, T_GenCam is set to the time elapsed since the last CAM generation. After N_GenCam consecutive CAMs are generated according to condition 2, T_GenCam is reset to T_GenCamMax.
  • N_GenCam can be dynamically adjusted according to some environmental conditions. For example, when approaching an intersection, N_GenCam can be increased to increase the frequency of CAM reception. Note that the default and maximum values of N_GenCam are 3.
  • Condition 1 is that the elapsed time since the previous CAM generation is T_GenCam_DCC or more, and one of the following ITS-S dynamics-related conditions is given.
  • The first condition related to ITS-S dynamics is that the absolute value of the difference between the current bearing of the source ITS-S and the bearing contained in the CAM previously transmitted by the source ITS-S is 4 degrees or more.
  • The second is that the distance between the current location of the source ITS-S and the location included in the CAM previously transmitted by the source ITS-S is 4 m or more.
  • Condition 2 is that the elapsed time since the previous CAM generation is T_GenCam or more (and, in the case of ITS-G5, T_GenCam_DCC or more).
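The two trigger conditions above can be sketched as a single decision function. This is a simplified illustration: the ITS-G5 requirement that condition 2 also respect T_GenCam_DCC is omitted, and all names are illustrative.

```python
# Hedged sketch of the CAM trigger rules above: condition 1 fires on ITS-S
# dynamics (bearing change >= 4 degrees or movement >= 4 m) once
# T_GenCam_DCC has elapsed; condition 2 fires when T_GenCam has elapsed.

def should_generate_cam(elapsed_ms, t_gencam_dcc_ms, t_gencam_ms,
                        bearing_delta_deg, distance_moved_m):
    """Decide whether a new CAM must be generated now."""
    # Condition 1: enough time since the last CAM, plus notable dynamics.
    if elapsed_ms >= t_gencam_dcc_ms and (
            abs(bearing_delta_deg) >= 4.0 or distance_moved_m >= 4.0):
        return True
    # Condition 2: the currently valid upper-bound interval has elapsed.
    return elapsed_ms >= t_gencam_ms

assert should_generate_cam(120, 100, 1000, 5.0, 0.0) is True    # condition 1
assert should_generate_cam(120, 100, 1000, 1.0, 1.0) is False   # no trigger
assert should_generate_cam(1000, 100, 1000, 0.0, 0.0) is True   # condition 2
```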
  • When condition 1 or condition 2 is satisfied, the CA service 128 immediately creates a CAM.
  • When the CA service 128 creates a CAM, it creates the mandatory containers.
  • the mandatory container mainly contains highly dynamic information of the source ITS-S. High dynamic information is included in basic and high frequency containers.
  • the CAM can contain optional data.
  • Optional data is mainly non-dynamic source ITS-S status and information for specific types of source ITS-S. The status of non-dynamic source ITS-S is contained in the low frequency container, and information for specific types of source ITS-S is contained in the special vehicle container.
  • The low-frequency container is included in the first CAM after the CA service 128 is activated. After that, the low-frequency container is included in a CAM if 500 ms or more have elapsed since the last CAM that contained it.
  • The special vehicle container is also included in the first CAM generated after the CA service 128 is activated. After that, the special vehicle container is included in a CAM if 500 ms or more have elapsed since the last CAM that contained it.
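The inclusion rule for the low-frequency and special vehicle containers can be sketched as follows. The function name and the `None`-before-first-CAM convention are illustrative.

```python
# Hedged sketch of the rule above: include the optional container in the
# first CAM after activation, and thereafter whenever 500 ms or more have
# elapsed since the last CAM that carried it.

def include_optional_container(last_included_ms, now_ms):
    """last_included_ms is None before the first CAM after activation."""
    if last_included_ms is None:
        return True                         # first CAM after activation
    return now_ms - last_included_ms >= 500

assert include_optional_container(None, 0) is True
assert include_optional_container(0, 400) is False
assert include_optional_container(0, 500) is True
```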
  • the CAM generation frequency of the roadside ITS-S is also defined by the time interval between two consecutive CAM generations.
  • the roadside ITS-S must be configured such that at least one CAM is transmitted while the vehicle is within the coverage of the roadside ITS-S.
  • The time interval for CAM generation must be 1000 ms or longer. Since the minimum interval is 1000 ms, the maximum CAM generation frequency is 1 Hz.
  • the probability that a passing vehicle receives a CAM from an RSU depends on the frequency of CAM occurrence and the time the vehicle is within the communication area. This time depends on the vehicle speed and the transmission power of the RSU.
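As a rough worked example of the dependency just described: a vehicle inside the RSU's coverage for a dwell time of d/v seconds has about (d/v) / interval CAM reception opportunities. The 200 m coverage and 20 m/s speed below are illustrative assumptions, not values from this disclosure.

```python
# Rough worked example: number of roadside CAMs a passing vehicle can
# receive, given assumed coverage, speed, and the 1000 ms interval above.
coverage_m = 200.0    # assumed RSU communication range along the road
speed_mps = 20.0      # assumed vehicle speed (72 km/h)
interval_s = 1.0      # roadside CAM generation interval (1000 ms)

dwell_s = coverage_m / speed_mps        # time inside coverage
expected_cams = dwell_s / interval_s    # CAM reception opportunities
print(dwell_s, expected_cams)           # → 10.0 10.0
```

Halving the speed or doubling the RSU's transmission power (and hence its coverage) doubles the dwell time and, with it, the number of reception opportunities, which is the dependency stated above.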
  • each CAM is time-stamped.
  • time synchronization is established between different ITS-S.
  • the time required for CAM generation is 50 ms or less.
  • the time required for CAM generation is the difference between the time CAM generation is started and the time the CAM is delivered to network & transport layer 140 .
  • the time stamp shown in the CAM of the vehicle ITS-S corresponds to the time when the reference position of the source ITS-S shown in this CAM was determined.
  • the timestamp shown on the roadside ITS-S CAM is the time the CAM was generated.
  • Certificates may be used to authenticate messages transmitted between ITS-S.
  • a certificate indicates permission for the holder of the certificate to send a particular set of messages. Certificates can also indicate authority over specific data elements within a message.
  • Authorization is indicated by a pair of identifiers: the ITS-AID (ITS Application Identifier) and the SSP (Service Specific Permissions).
  • The ITS-AID indicates the overall type of permission granted. For example, there is an ITS-AID that indicates that the sender has the right to send the CAM.
  • The SSP indicates specific permissions within the overall type granted by the ITS-AID.
  • a received signed CAM is accepted by the recipient if the certificate is valid and the CAM matches the certificate's ITS-AID and SSP.
  • the CAM is signed using the private key associated with the Authorization Ticket containing an SSP of type BitmapSsp.
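The acceptance rule above can be sketched as a permission check. Signature verification itself is abstracted away, the SSP is modeled as a permission bitmap (in the spirit of the BitmapSsp type mentioned above), and the ITS-AID value 36 for the CA service is an assumption taken from the ETSI registry, not from this document.

```python
# Hedged sketch of the rule above: a signed CAM is accepted when the
# certificate is valid and the CAM matches the certificate's ITS-AID and SSP.

def accept_cam(cert_valid, cert_its_aid, cert_ssp_bits,
               cam_its_aid, cam_needed_bits):
    """cert_ssp_bits / cam_needed_bits model the SSP as permission bitmaps."""
    if not cert_valid:
        return False                  # invalid certificate: reject
    if cam_its_aid != cert_its_aid:
        return False                  # wrong permission type: reject
    # Every permission bit the CAM needs must be granted by the certificate.
    return cam_needed_bits & ~cert_ssp_bits == 0

CAM_ITS_AID = 36  # assumed ITS-AID for the CA service (see lead-in)
assert accept_cam(True, CAM_ITS_AID, 0b111, CAM_ITS_AID, 0b101) is True
assert accept_cam(True, CAM_ITS_AID, 0b001, CAM_ITS_AID, 0b101) is False
assert accept_cam(False, CAM_ITS_AID, 0b111, CAM_ITS_AID, 0b001) is False
```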
  • FIG. 12 is a diagram showing the CAM format.
  • a CAM may include an ITS protocol data unit (PDU) header and multiple containers.
  • the ITS PDU header contains information on the protocol version, message type, and source ITS-S ID.
  • The CAM of the vehicle ITS-S includes one basic container and one high-frequency container (hereinafter, HF container), and can additionally contain one low-frequency container (hereinafter, LF container) and one or more other special vehicle containers. Special vehicle containers are sometimes called special containers.
  • the basic container contains basic information about the source ITS-S.
  • the base container can include station type, station location.
  • a station type is a vehicle, a roadside unit (RSU), or the like. If the station type is vehicle, the vehicle type may be included.
  • Location may include latitude, longitude, altitude and confidence.
  • If the station type is vehicle, the vehicle HF container is included as the HF container. If the station type is not vehicle, the HF container can contain other containers instead of the vehicle HF container.
  • the vehicle HF container contains dynamic state information that changes in a short time in the source vehicle ITS-S.
  • The vehicle HF container may include one or more of the orientation of the vehicle, vehicle speed, traveling direction, vehicle length, vehicle width, vehicle longitudinal acceleration, road curvature, curvature calculation mode, and yaw rate. The traveling direction indicates either forward or backward.
  • the curvature calculation mode is a flag indicating whether or not the yaw rate of the vehicle is used to calculate the curvature.
  • The vehicle HF container may also include one or more of the following: vehicle longitudinal acceleration control status, lane position, steering wheel angle, lateral acceleration, vertical acceleration, characteristic class, and DSRC toll collection station location.
  • The characteristic class is a value that determines the maximum age of data elements in the CAM.
  • If the station type is vehicle, the vehicle LF container is included as the LF container. If the station type is not vehicle, the LF container can contain other containers instead of the vehicle LF container.
  • a vehicle LF container can include one or more of the vehicle's role, the lighting state of external lights, and the travel trajectory.
  • the role of vehicle is a classification when the vehicle is a special vehicle.
  • The lighting state of external lights indicates the state of the vehicle's most important exterior lights.
  • a travel trajectory indicates the movement of a vehicle at a certain time or a certain distance in the past.
  • the running locus can also be called a path history.
  • a travel locus is represented by a list of waypoints of a plurality of points (for example, 23 points).
  • a special vehicle container is a container for vehicles ITS-S that have a special role in road traffic such as public transportation.
  • A special vehicle container may include any one of a public transportation container, a special transportation container, a dangerous goods container, a road construction container, an ambulance container, an emergency container, or a safety confirmation vehicle container.
  • a public transport container is a container for public transport vehicles such as buses.
  • Public transportation containers are used by public transportation vehicles to control boarding conditions, traffic lights, barriers, bollards, and the like.
  • A special transportation container is included when the vehicle is one or both of a heavy vehicle and an oversized vehicle.
  • a dangerous goods container is a container that is included when a vehicle is transporting dangerous goods.
  • the dangerous goods container stores information indicating the type of dangerous goods.
  • a road construction container is a container that is included when the vehicle is a vehicle that participates in road construction.
  • the road construction container stores a code indicating the type of road construction and the cause of the road construction.
  • the roadwork container may also include information indicating whether the lane ahead is open or closed.
  • An ambulance container is a container that is included when the vehicle is an ambulance vehicle during an ambulance operation.
  • the ambulance container shows the status of light bar and siren usage, and emergency priority.
  • An emergency container is a container that is included when the vehicle is an emergency vehicle during emergency operations.
  • the emergency container shows light bar and siren usage, cause code and emergency priority.
  • the safety confirmation vehicle container is a container that is included when the vehicle is a safety confirmation vehicle.
  • a safety confirmation vehicle is a vehicle that accompanies a special transportation vehicle or the like.
  • The safety confirmation vehicle container shows the use of light bars and sirens, overtaking regulations, and speed limits.
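The CAM layout described over the preceding paragraphs can be summarized as nested data structures. The sketch below models a small subset of the fields as Python dataclasses; the field names, types, and the subset chosen are illustrative, not the ASN.1 definitions of the standard.

```python
# Hedged sketch of the CAM format described above: ITS PDU header, basic
# container, HF container, optional LF container, optional special container.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ItsPduHeader:
    protocol_version: int
    message_type: int        # identifies the message as a CAM
    station_id: int          # source ITS-S ID

@dataclass
class BasicContainer:
    station_type: str        # e.g. "vehicle" or "rsu"
    latitude: float
    longitude: float
    altitude: float

@dataclass
class VehicleHfContainer:    # highly dynamic state, every CAM
    heading_deg: float
    speed_mps: float
    drive_direction: str     # "forward" or "backward"
    yaw_rate: float

@dataclass
class VehicleLfContainer:    # slowly changing state, every >= 500 ms
    vehicle_role: str
    exterior_lights: int
    path_history: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class Cam:
    header: ItsPduHeader
    basic: BasicContainer
    hf: VehicleHfContainer
    lf: Optional[VehicleLfContainer] = None   # optional LF container
    special_vehicle: Optional[dict] = None    # role-specific container

cam = Cam(ItsPduHeader(2, 2, 1234),
          BasicContainer("vehicle", 35.0, 139.0, 20.0),
          VehicleHfContainer(90.0, 13.9, "forward", 0.1))
assert cam.lf is None and cam.basic.station_type == "vehicle"
```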
  • FIG. 13 shows a flowchart of the process of transmitting the CAM.
  • the processing shown in FIG. 13 is executed for each CAM generation cycle.
  • step S201 information configuring the CAM is acquired.
  • step S201 is executed by the detection information acquisition unit 201, for example.
  • step S202 a CAM is generated based on the information acquired at step S201.
  • Step S202 can also be executed by the detection information acquisition unit 201 .
  • Next, the transmission unit 221 transmits the CAM generated in step S202 to the surroundings of the vehicle.
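The per-cycle flow of FIG. 13 can be sketched as a simple pipeline. The callables below stand in for the units named in the text (detection information acquisition unit 201, transmission unit 221); their signatures are illustrative.

```python
# Hedged sketch of one CAM generation cycle: acquire the information that
# makes up the CAM (S201), generate the CAM from it (S202), then transmit.

def cam_cycle(acquire, generate, transmit):
    info = acquire()          # S201: acquire information configuring the CAM
    cam = generate(info)      # S202: generate the CAM from the acquired data
    transmit(cam)             # broadcast the CAM to the surroundings
    return cam

sent = []
cam = cam_cycle(lambda: {"speed": 13.9},
                lambda info: {"cam": info},
                sent.append)
assert sent == [{"cam": {"speed": 13.9}}]
```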
  • V2X communication devices can support traffic safety by periodically providing their own location and status to surrounding V2X communication devices.
  • the CA service 128 has a limitation that only the information of the corresponding V2X communication device itself can be shared. To overcome this limitation, service development such as CP service 124 is required.
  • the CP service 124 may also be an entity of the facility layer 120, as shown in FIG.
  • CP service 124 may be part of the application support domain of facility layer 120 .
  • the CP service 124 may fundamentally differ from the CA service 128 in that, for example, it cannot receive input data regarding the host V2X communication device from the VDP 125 or the POTI unit 126 .
  • CPM transmission includes CPM generation and transmission.
  • In the process of generating a CPM, an originating V2X communication device generates a CPM, which is then sent to the network & transport layer 140 for transmission.
  • A V2X communication device that generates a CPM may be referred to as an originating V2X communication device, a host V2X communication device, and so on.
  • CP service 124 connects with other entities in facility layer 120 and V2X applications in facility layer 120 to collect relevant information for CPM generation and to deliver received CPM content for further processing.
  • An entity for data collection may be the function that provides object detection, such as the host's object detector.
  • CP service 124 may use services provided by protocol entities of network & transport layer 140 to deliver (or transmit) CPMs.
  • CP service 124 may interface with network & transport layer 140 through NF-SAP to exchange CPMs with other V2X communication devices.
  • NF-SAP is the service access point between network & transport layer 140 and facility layer 120 .
  • CP service 124 may connect with security entities through SF-SAP, the SAP between security layer 160 and facility layer 120, to access security services for sending and receiving CPMs.
  • CP service 124 may also interface with management entities through MF-SAP, which is the SAP between management layer 150 and facility layer 120 .
  • the CP service 124 may connect to the application layer 110 through FA-SAP, which is the SAP between the facility layer 120 and the application layer 110 .
  • the CP service 124 can specify how a V2X communication device informs other V2X communication devices about the location, behavior, and attributes of detected surrounding road users and other objects. For example, by sending a CPM, the CP service 124 can share the information contained in the CPM with other V2X communication devices. Note that the CP service 124 may be a function that can be added to all types of V2X communication devices that participate in road traffic.
  • a CPM is a message exchanged between V2X communication devices via the V2X network.
  • CPM can be used to generate collective perceptions of road users and other objects detected and/or recognized by V2X communication devices.
  • the detected road users or objects may be, but are not limited to, road users or objects that are not equipped with V2X communication equipment.
  • a V2X communication device that shares information via a CAM shares only information about the recognition state of the V2X communication device itself with other V2X communication devices in order to perform cooperative recognition.
  • road users and others who are not equipped with V2X communication devices are not part of the system and thus have limited views on situations related to safety and traffic management.
  • a system that is equipped with a V2X communication device and can recognize road users and objects not equipped with V2X communication devices can provide the presence and status of those road users and objects to other V2X communication devices.
  • because the CP service 124 cooperatively recognizes the presence of road users and objects that are not equipped with V2X communication devices, the safety and traffic management performance of systems equipped with V2X communication devices can be readily improved.
  • CPM delivery may vary depending on the applied communication system. For example, in ITS-G5 networks as defined in ETSI EN 302 663, a CPM may be sent from an originating V2X communication device directly to all V2X communication devices within range. The originating V2X communication device can influence the communication range, in particular by changing the transmission power according to the region concerned.
  • CPMs may be generated periodically with a frequency controlled by the CP service 124 at the originating V2X communication device.
  • the frequency of generation may be determined taking into account the radio channel load determined by Distributed Congestion Control.
  • the generation frequency is also determined taking into account the state of the detected non-V2X objects, e.g. dynamic behavior of position, velocity or orientation, and the transmission of CPMs for the same perceived object by other V2X communication devices.
  • the CP service 124 makes the contents of the CPM available to functions within the receiving V2X communication device, such as the V2X application and/or the LDM 127 .
  • LDM 127 may be updated with received CPM data.
  • V2X applications may retrieve this information from LDM 127 for additional processing.
  • FIG. 15 is a functional block diagram of the CP service 124 in this embodiment. More specifically, FIG. 15 illustrates the functional blocks of the CP service 124 and functional blocks with interfaces for other functions and layers in this embodiment.
  • the CP service 124 can provide the following sub-functions for CPM transmission/reception.
  • CPM encoder 1241 constructs or generates CPM according to a predefined format. The latest in-vehicle data may be included in the CPM.
  • the CPM decoding unit 1242 decodes the received CPM.
  • the CPM transmission management unit 1243 executes the protocol operation of the source V2X communication device. The operations performed by the CPM transmission manager 1243 may include activation and termination of CPM transmission operations, determination of CPM generation frequency, and triggering of CPM generation.
  • the CPM reception manager 1244 can perform protocol operations for the receiving V2X communication device. Specifically, it can include triggering the CPM decoding function in CPM reception, providing the received CPM data to the LDM 127 or the V2X application of the receiving side V2X communication device, checking the information of the received CPM, and the like.
  • Point-to-multipoint communication may be used for CPM delivery.
  • In ITS-G5, a control channel may be used for CPM delivery.
  • CPM generation may be triggered and managed by CP service 124 while CP service 124 is running.
  • the CP service 124 may be launched upon activation of the V2X communication device and may be terminated when the V2X communication device is terminated.
  • a host V2X communication device may send a CPM whenever at least one object with sufficient confidence to exchange with a nearby V2X communication device is detected.
  • CP services should consider the trade-off between object information lifetime and channel utilization. For example, from the point of view of an application that uses information received via CPMs, updated information should be provided as often as possible. However, from the point of view of the ITS-G5 stack, a low transmission frequency is required in order to minimize channel utilization. Therefore, it is desirable for the V2X communication device to take this into account and include detected objects and object information in the CPM appropriately. Also, in order to reduce the message size, objects should be evaluated before being sent.
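As a minimal sketch of this trade-off, a sender might clamp its CPM generation period between an upper and lower bound, letting Distributed Congestion Control stretch the period when the channel is busy. The 0.1 s and 1.0 s bounds below are assumptions for illustration, not values taken from this document:

```python
T_GEN_CPM_MIN = 0.1  # assumed lower bound on the generation period, in seconds
T_GEN_CPM_MAX = 1.0  # assumed upper bound, in seconds

def select_generation_period(dcc_min_interval):
    """Clamp the CPM generation period to the interval allowed by
    congestion control: never faster than T_GEN_CPM_MIN, never slower
    than T_GEN_CPM_MAX."""
    return min(max(dcc_min_interval, T_GEN_CPM_MIN), T_GEN_CPM_MAX)
```

A busy channel (large DCC interval) thus lowers the transmission frequency, while an idle channel lets the service send at its fastest permitted rate.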
  • FIG. 16 is a diagram showing the structure of the CPM.
  • the CPM structure shown in FIG. 16 may be the basic CPM structure.
  • CPMs may be messages exchanged between V2X communication devices in a V2X network.
  • CPM may also be used to generate collective perceptions of road users and/or other objects detected and/or recognized by V2X communication devices. That is, a CPM may be an ITS message for generating collective awareness of objects detected by V2X communication devices.
  • the CPM may include state information and attribute information of road users and objects detected by the source V2X communication device. Its content may vary depending on the type of road user or object detected and the detection capabilities of the originating V2X communication device. For example, if the object is a vehicle, the state information may include at least information about the actual time, location and motion state. Attribute information may include attributes such as dimensions, vehicle type, and role in road traffic.
  • the CPM may complement the CAM and work in the same way as the CAM. That is, it may be for enhancing cooperative recognition.
  • the CPM may contain externally observable information about detected road users or objects.
  • the CP service 124 may include methods to reduce duplication or duplication of CPMs sent by different V2X communication devices by verifying CPMs sent by other stations.
  • the receiving V2X communication device may recognize the presence, type and status of road users or objects detected by the originating V2X communication device.
  • the received information may be used by the receiving V2X communication device to support V2X applications to enhance safety, improve traffic efficiency and travel time. For example, by comparing the received information with the detected states of road users or objects, the receiving V2X communication device can estimate the risk of collision with road users or objects. Additionally, the receiving V2X communication device may notify the user via the receiving V2X communication device's Human Machine Interface (HMI) or automatically take corrective action.
  • The basic format of the CPM will be explained with reference to FIG. 16.
  • the format of this CPM may be presented in ASN.1 (Abstract Syntax Notation One).
  • Data Elements (DE) and Data Frames (DF) not defined in this disclosure may be derived from the Common Data Dictionary specified in ETSI TS 102 894-2.
  • a CPM may include an ITS protocol data unit (PDU) header and multiple containers.
  • the ITS PDU header is a header that contains information about the protocol version, message type, and ITS ID of the source V2X communication device.
  • the ITS PDU header is a common header used in ITS messages and is present at the beginning of the ITS message.
  • the ITS PDU header is sometimes called a common header.
  • a station data container may include an originating vehicle container or an originating roadside unit container (RSU container).
  • a sensor information container is sometimes called a field-of-view container.
  • An originating vehicle container may also be referred to as an OVC.
  • a field of view container may also be described as an FOC.
  • a recognition object container may also be described as a POC.
  • the CPM includes the management container as a mandatory container, while the station data container, sensor information container, POC, and free space addition container may be optional containers.
  • the sensor information container, the perceived object container, and the free space addition container may each be included multiple times. Each container will be described below.
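The container layout described above can be sketched as plain data classes, with only the management container mandatory and the repeatable containers modeled as lists. The field names are illustrative, not the normative ASN.1 names:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ItsPduHeader:
    protocol_version: int
    message_id: int
    station_id: int

@dataclass
class ManagementContainer:        # mandatory container
    station_type: int
    reference_position: tuple     # (latitude, longitude)
    number_of_perceived_objects: int

@dataclass
class CPM:
    header: ItsPduHeader
    management: ManagementContainer
    station_data: Optional[dict] = None                            # OVC or originating RSU container
    sensor_information: List[dict] = field(default_factory=list)   # may occur multiple times
    perceived_objects: List[dict] = field(default_factory=list)    # POCs, may occur multiple times
    free_space_addenda: List[dict] = field(default_factory=list)   # optional, repeatable
```

Constructing a CPM therefore always requires a header and a management container, while every other container can be absent or repeated.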
  • the administrative container provides basic information about the originating ITS-S, regardless of whether it is a vehicle or roadside unit type station.
  • the management container may also include station type, reference location, segmentation information, number of recognized objects.
  • the station type indicates the type of ITS-S.
  • the reference location is the location of the originating ITS-S.
  • the segmentation information describes splitting information when splitting a CPM into multiple messages due to message size constraints.
  • Table 1 shown in FIG. 17 is an example of OVC in the station data container of CPM.
  • Table 1 shows the data elements (DE) and/or data frames (DF) included in an example OVC.
  • the station data container becomes an OVC when the ITS-S that is the source is a vehicle. If the originating ITS-S is an RSU, it becomes an Originating RSU Container.
  • the originating RSU container contains the ID for the road or intersection on which the RSU is located.
  • DE is a data type that contains a single data item.
  • DF is a data type that includes one or more DEs and/or one or more DFs in a predefined order.
  • the DE/DF may be used to construct facility layer messages or application layer messages.
  • facility layer messages are CAM, CPM, DENM.
  • the OVC contains basic information related to the V2X communication device that emits the CPM.
  • OVC can be interpreted as a scaled down version of CAM.
  • the OVC may include only the DE required for coordinate conversion processing. That is, OVC is similar to CAM, but provides basic information about the originating V2X communication device. The information contained in the OVC is focused on supporting the coordinate transformation process.
  • OVC can provide the following. That is, the OVC can provide the latest geographic location of the originating V2X communication device obtained by the CP service 124 at the time of CPM generation. OVC can also provide the absolute lateral and longitudinal velocity components of the originating V2X communication device. The OVC can provide the geometric dimensions of the originating V2X communication device.
  • the generation delta time shown in Table 1 indicates, as a DE, the time corresponding to the time of the reference position in the CPM.
  • the generation delta time can be regarded as the generation time of the CPM.
  • the generation delta time may simply be referred to as the generation time.
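Assuming the generation delta time follows the CAM definition in ETSI EN 302 637-2 (milliseconds elapsed since the ITS epoch, 2004-01-01 00:00:00 UTC, modulo 65536), it could be computed as follows:

```python
from datetime import datetime, timezone

ITS_EPOCH = datetime(2004, 1, 1, tzinfo=timezone.utc)

def generation_delta_time(now):
    """Milliseconds since the ITS epoch, truncated to a 16-bit wrap-around
    counter (0..65535), as defined for CAMs in ETSI EN 302 637-2."""
    timestamp_its = int((now - ITS_EPOCH).total_seconds() * 1000)
    return timestamp_its % 65536
```

Because the value wraps every 65.536 s, a receiver uses it only to judge the relative age of a message, not its absolute time.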
  • the reference position indicates the geographical position of the V2X communication device as DF.
  • a reference position indicates the location of a geographical point.
  • the reference position includes information regarding latitude, longitude, position confidence and/or altitude.
  • Latitude represents the latitude of the geographical point, and longitude represents the longitude of the geographical point.
  • the position confidence represents the accuracy of the geographic position, and the altitude represents the altitude and altitude accuracy of the geographical point.
  • the orientation indicates the orientation in the coordinate system as DF.
  • Heading includes heading value and/or heading confidence information.
  • the bearing value indicates the heading relative to north, and the bearing confidence indicates a preset level of confidence in the reported bearing value.
  • Longitudinal velocity, as a DF, can describe the longitudinal velocity of a moving object (e.g., a vehicle) and the accuracy of the velocity information.
  • Longitudinal velocity includes velocity value and/or velocity accuracy information.
  • the velocity value represents the velocity in the longitudinal direction, and the velocity accuracy represents the accuracy of the velocity value.
  • Lateral velocity, as a DF, can describe the lateral velocity of a moving body (e.g., a vehicle) and the accuracy of the velocity information. Lateral velocity includes information about velocity values and/or velocity accuracy.
  • the velocity value represents the velocity in the lateral direction, and the velocity accuracy represents the accuracy of the velocity value.
  • the vehicle length can describe the vehicle length and accuracy index as DF.
  • the vehicle length includes information about vehicle length values and/or vehicle length accuracy indicators.
  • the vehicle length represents the length of the vehicle, and the vehicle length accuracy index represents the reliability of the vehicle length.
  • the vehicle width indicates the width of the vehicle as DE.
  • vehicle width can represent the width of the vehicle including the side mirrors. If the width of the vehicle is 6.1 m or more, it is set to 61, and if the information cannot be obtained, it is set to 62.
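The vehicle width encoding described above (0.1 m units, with 61 for widths of 6.1 m or more and 62 when the information is unavailable) can be sketched as follows; the rounding behavior is an assumption of this sketch:

```python
def encode_vehicle_width(width_m=None):
    """Encode a vehicle width in 0.1 m units, following the text above:
    widths of 6.1 m or more encode as 61, and a missing value encodes as 62."""
    if width_m is None:
        return 62          # information cannot be obtained
    if width_m >= 6.1:
        return 61          # out of range
    return max(1, round(width_m * 10))
```

For example, a 1.8 m wide car (including side mirrors) would be carried as the integer 18.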
  • Each DE/DF shown in Table 1, except for the generation delta time, can refer to ETSI TS 102 894-2, indicated in the right column of Table 1.
  • ETSI TS 102 894-2 defines the CDD (common data dictionary). See ETSI EN 302 637-2 for the generation delta time.
  • the OVC may contain information on the vehicle direction angle, vehicle traveling direction, longitudinal acceleration, lateral acceleration, vertical acceleration, yaw rate, pitch angle, roll angle, vehicle height and trailer data.
  • Table 2, shown in FIG. 18, is an example of the SIC (or FOC) in the CPM.
  • the SIC provides a description of at least one sensor mounted on the originating V2X communication device. If the V2X communication device is equipped with multiple sensors, multiple descriptions may be added. For example, the SIC provides information about the sensor capabilities of the originating V2X communication device. To do so, general sensor characteristics giving the originating V2X communication device's sensor mounting location, sensor type, sensor range, and opening angle (i.e., sensor frustum) may be included as part of the message. These pieces of information may be used by the receiving V2X communication device to select an appropriate prediction model according to sensor performance.
  • the sensor ID indicates a sensor-specific ID for specifying the sensor that detected the object.
  • the sensor ID is a random number generated when the V2X communication device starts up and does not change until the V2X communication device is terminated.
  • Sensor type indicates the type of sensor.
  • the sensor types are listed below.
  • the sensor type is undefined (0), radar (1), lidar (2), mono-video (3), stereovision (4), night vision (5), ultrasonic (6), pmd (7), fusion (8), induction loop (9), spherical camera (10), or a set of these (11).
  • pmd is a photo mixing device.
  • a spherical camera is also called a 360-degree camera.
  • the X position indicates the mounting position of the sensor in the negative X direction, and the Y position indicates the mounting position of the sensor in the Y direction.
  • These mounting positions are measured from a reference position, for which ETSI EN 302 637-2 can be referred to.
  • the radius indicates the average recognition range of the sensor as defined by the manufacturer.
  • the start angle indicates the start angle of the sensor's frustum, and the end angle indicates the end angle of the sensor's frustum.
  • a quality class represents a classification of the sensor that defines the quality of the measurement object.
  • the SIC may contain information regarding the reliability of the detection area and free space.
  • Table 3, shown in FIG. 19, is an example of the POC in the CPM.
  • the POC is used to describe the object perceived by the sensor as seen by the transmitting V2X communication device.
  • the receiving V2X communication device that receives the POC can, with the help of the OVC, perform coordinate transformation processing to transform the position of the object into the reference coordinate system of the receiving vehicle.
  • multiple optional DEs may be provided if the originating V2X communication device can provide them.
  • a POC may consist of a selection of DEs to provide an abstract description of the recognized (or detected) object. For example, the relative distance, velocity information and timing information about the perceived object associated with the originating V2X communication device may be included in the POC as mandatory DEs. Additional DEs may also be provided if the sensors of the originating V2X communication device can provide the requested data.
  • the measurement time indicates the time in microseconds from the reference time of the message. This defines the relative age of the measured object.
  • An object ID is a unique random ID assigned to an object. This ID is retained (ie, not changed) while the object is being tracked, ie, considered in the originating V2X communication device's data fusion process.
  • the sensor ID is an ID corresponding to DE in the sensor ID in Table 2. This DE may be used to associate object information with the sensors that make the measurements.
  • the longitudinal distance includes a distance value and a distance confidence.
  • the distance value indicates the relative X distance to the object in the source reference frame.
  • the distance confidence is a value indicating the confidence of the X distance.
  • the lateral distance likewise includes a distance value and a distance confidence.
  • the distance value indicates the relative Y distance to the object in the source reference frame, and the distance confidence indicates the confidence of that Y distance.
  • Longitudinal velocity indicates the longitudinal velocity of the detected object together with its confidence. Lateral velocity indicates the lateral velocity of the detected object together with its confidence. For longitudinal and lateral velocities, the CDD of ETSI TS 102 894-2 can be referred to.
  • the object orientation indicates the absolute orientation of the object in the reference coordinate system when provided by data fusion processing.
  • the object length indicates the measured object length.
  • the length confidence indicates the confidence of the length of the measured object.
  • Object Width indicates a measurement of the width of the object. Width confidence indicates the reliability of the object width measurement.
  • Object type represents the classification of the object as provided in the data fusion process. Classifications of objects may include vehicles, people, animals, and others.
  • object confidence, vertical distance, vertical speed, longitudinal acceleration, lateral acceleration, vertical acceleration, object height, dynamic state of the object, and matched position (lane ID, longitudinal lane position) may also be included in the POC.
  • the free space addition container is a container that indicates information about the free space recognized by the source V2X communication device (that is, free space information).
  • a free space is an area that is not considered to be occupied by road users or obstacles, and can also be called an empty space.
  • the free space can also be said to be a space into which the mobile object carrying the source V2X communication device can move.
  • the free space addition container is not a required container, but a container that can be added arbitrarily. For example, if there is a difference between the free space recognized by another V2X communication device, which can be calculated from the CPM received from that other V2X communication device, and the free space recognized by the source V2X communication device, a free space addition container can be added. Also, a free space addition container may be added to the CPM periodically.
  • the free space addition container contains information specifying the area of free space.
  • Free space can be specified in various shapes.
  • the shape of the free space can be represented, for example, by polygons (ie, polygons), circles, ellipses, rectangles, and the like.
  • When representing free space with a polygon, specify the positions of the points that make up the polygon and the order in which the points are connected.
  • When representing free space with a circle or an ellipse, specify its center position and its radius or axis lengths.
  • the free space additional container may contain the reliability of the free space. Reliability of free space is expressed numerically. The reliability of free space may also indicate that the reliability is unknown.
  • the free space addition container may also contain information about shadow areas. The shadow area indicates the area behind the object as seen from the vehicle or the sensor mounted on the vehicle.
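When free space is given as a polygon, a receiver might test whether a point of interest lies inside it with a standard ray-casting check. This is a sketch; the actual message carries only the points and their connection order as described above:

```python
def point_in_free_space(point, polygon):
    """Ray-casting test: return True when `point` (x, y) lies inside the
    free-space polygon given as vertices in connection order."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray cast to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Such a check could, for example, let a receiving vehicle decide whether a planned position falls inside the free space reported by another station.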
  • FIG. 20 is a diagram explaining a sensor data extraction method by a V2X communication device that provides the CP service 124. More specifically, FIG. 20(a) illustrates how a V2X communication device extracts sensor data at a low level. FIG. 20(b) illustrates how a V2X communication device extracts sensor data at a high level.
  • the source of sensor data transmitted as part of CPM should be selected according to the requirements of the future data fusion process in the receiving V2X communication device.
  • the transmitted data should be as close as possible to the original sensor data.
  • simply transmitting original sensor data, such as raw data, is not realistic, because it imposes very high demands on data rate and transmission cycle.
  • Figures 20(a) and 20(b) show possible embodiments for selecting data to be sent as part of the CPM.
  • sensor data is obtained from different sensors and processed as part of the low-level data management entity. This entity can select the object data to be inserted as part of the next CPM and also compute the validity of the detected objects.
  • the sensor information can be efficiently utilized by the V2X communication device on the receiving side.
  • sensor data or object data provided by a data fusion unit specific to the V2X communication device manufacturer is transmitted as part of the CPM.
  • If the absolute value of the difference between the current yaw angle of the detected object and the yaw angle included in the CPM previously transmitted by the source V2X communication device exceeds 4 degrees, transmission may be considered.
  • If the difference between the current relative distance between the originating V2X communication device and the detected object and the relative distance contained in the CPM previously transmitted by the originating V2X communication device exceeds 4 m, transmission may be considered.
  • If the absolute value of the difference between the current velocity of the detected object and the velocity of the detected object contained in the CPM previously transmitted by the source V2X communication device exceeds 0.5 m/s, transmission may also be considered.
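The three inclusion conditions above can be sketched as a single predicate. The dictionary field names are hypothetical, and the distance rule is implemented as a change in scalar relative distance, as the text describes:

```python
def should_include_object(current, last_sent,
                          yaw_threshold_deg=4.0,
                          distance_threshold_m=4.0,
                          speed_threshold_mps=0.5):
    """Return True when the object's state has changed enough since the CPM
    that last contained it (4 degrees of yaw, 4 m of relative distance, or
    0.5 m/s of speed), or when it has never been transmitted."""
    if last_sent is None:
        return True  # never transmitted before
    return (abs(current["yaw_deg"] - last_sent["yaw_deg"]) > yaw_threshold_deg
            or abs(current["rel_dist_m"] - last_sent["rel_dist_m"]) > distance_threshold_m
            or abs(current["speed_mps"] - last_sent["speed_mps"]) > speed_threshold_mps)
```

Filtering objects this way keeps the CPM small while still propagating every significant change in a tracked object's state.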
  • CAM is a technology in which a vehicle equipped with a V2X module periodically transmits its position and status to other vehicles equipped with a V2X module in the surrounding area to support more stable driving.
  • the V2X module is a configuration including a V2X communication device or a V2X communication device.
  • CP service 124 is a technology that complements CAM.
  • CPS is an abbreviation for CP service.
  • the CP service is a technology, within ADAS technology, that notifies the surroundings via V2X communication of sensor data recognizing the surrounding environment.
  • FIG. 21 is a diagram explaining the CP service 124.
  • vehicles TxV1 and RxV2 are each equipped with at least one sensor and have sensing ranges SrV1 and SrV2, indicated by dashed lines.
  • TxV1 has a CPS function.
  • TxV1 can recognize vehicles, RV1 to RV11, which are peripheral objects belonging to sensing range SrV1, using a plurality of ADAS sensors mounted on the vehicle. Object information obtained by recognition may be distributed to nearby vehicles equipped with V2X communication devices through V2X communication.
  • RxV1, which is not equipped with a sensor, can acquire information on surrounding vehicles by receiving the CPM.
  • When RxV2, which is equipped with a sensor, receives the CPM from TxV1, it can also obtain information on objects located outside its sensing range SrV2 and objects located in blind spots.
  • facility layer 120 can provide CP services 124 .
  • CP services 124 may run in facility layer 120 and may utilize services that reside in facility layer 120 .
  • the LDM 127 is a service that provides map information, and may provide map information for the CP service 124.
  • the provided map information may include dynamic information in addition to static information.
  • the POTI unit 126 performs a service that provides the location and time of the ego vehicle.
  • the POTI unit 126 can use the corresponding information to provide the location of the ego vehicle and the exact time.
  • the VDP 125 is a service that provides information about the vehicle, and may be used to capture information such as the size of the own vehicle into the CPM and transmit the CPM.
  • ADAS vehicles are equipped with various sensors such as cameras, infrared sensors, radar, and lidar for driving support. Each sensor individually recognizes an object. The recognized object information may be collected and fused by the data fusion unit and provided to the ADAS application.
  • Next, the collection and fusion of sensor information in ADAS technology for the CP service 124 will be described.
  • Existing sensors for ADAS and existing sensors for CPS can always track surrounding objects and collect relevant data.
  • To provide sensor values to CP services, two methods can be used to collect sensor information.
  • First, each sensor value can be individually provided to surrounding vehicles through the CP basic service.
  • Second, the aggregated, integrated sensor information may be provided to the CP basic service after the data fusion part.
  • CP basic services form part of CP services 124 .
  • FIG. 22 is a flow chart showing the process of sending the CPM.
  • the processing shown in FIG. 22 is executed for each CPM generation cycle.
  • In step S301, information forming the CPM is acquired.
  • step S301 is executed by the detection information acquisition unit 201, for example.
  • In step S302, a CPM is generated based on the information obtained in step S301.
  • Step S302 can also be executed by the detection information acquisition unit 201.
  • the transmission unit 221 transmits the CPM generated in step S302 to the surroundings of the vehicle.
  • the CPM transmitted from the roadside device 3 is used as the target information transmitted from the roadside device 3.
  • CAM or CPM transmitted from another vehicle is used as the target information transmitted from another vehicle. Since CPM and CAM are used as target information, the flowchart shown in FIG. 23 executes steps S1A, S2A, S3A, S8A, and S10A instead of steps S1, S2, S3, S8, and S10 in the flowchart of FIG. 5. Note that steps S4, S5, S6, S7, S9, S11, S12, and S13 are common between the flowchart of FIG. 5 and the flowchart of FIG. 23, and thus description thereof will be omitted.
  • In step S1A, when the roadside and vehicle side receiving unit 231 receives the CPM transmitted from the roadside device 3 through road-to-vehicle communication (YES in S1A), the process proceeds to step S2A. On the other hand, if the roadside and vehicle side receiving unit 231 has not received the CPM from the roadside device 3 (NO in S1A), the process proceeds to step S8A.
  • step S2A if the vehicle-side receiving unit 222 receives the CAM or CPM transmitted from another vehicle via inter-vehicle communication (YES at S2A), the process proceeds to step S3A. On the other hand, if the CAM or CPM has not been received in the vehicle-to-vehicle communication (NO in S2A), the process proceeds to step S13. Note that CAM and CPM may be collectively referred to as messages below.
  • In step S3A, the identity determining unit 204 determines whether the target specified by the POC of the CPM determined to have been received in S1A is the same as the target specified by the CAM or CPM determined to have been received in S2A.
  • the CPM transmitted by the roadside device 3 includes the latitude and longitude (hereinafter referred to as absolute coordinates) of the roadside device 3 and the distance and direction from the roadside device 3 to the target. Therefore, from the CPM acquired from the roadside device 3, the absolute coordinates of the target detected by the roadside device 3 can be determined. Also, the CPM includes POC containing various information about the target detected by the roadside unit 3 .
  • the CAM or CPM obtained from other vehicles contains the absolute coordinates of the other vehicles.
  • the CAM acquired from other vehicles includes various other vehicle information.
  • the CPM acquired from other vehicles also contains information on other vehicles, although it contains less information on other vehicles as a transmission source than the CAM. Therefore, based on the CPM acquired from the roadside device 3 and the CAM or CPM acquired from another vehicle, it can be determined whether or not the target detected by the roadside device 3 is the same as the other vehicle that received the CAM or CPM.
  • the absolute coordinates of the target specified by the CPM acquired from the roadside device 3 and the absolute coordinates of the other vehicle indicated by the CAM or CPM acquired from the other vehicle are similar, It can be determined that the target specified by the CPM and the other vehicle are the same. Approximation of coordinates can be determined by whether the distance between coordinates is less than a preset threshold. Note that relative coordinates based on the vehicle may be used instead of absolute coordinates.
  • It can also be determined that the target and the other vehicle are the same when their behaviors approximately match. The identity determination may also be made using both coordinates, whether absolute or relative, and behavior.
  • Behavior is one or more pieces of information that indicate the movement of an object, such as velocity, acceleration, and angular velocity.
  • the CPM does not include the behavior of the other vehicle that is the transmission source.
  • When the CPM is received a plurality of times from the same other vehicle, the behavior of that vehicle can be determined from the time change of the absolute coordinates of the vehicle included in each CPM.
  • Since the CAM contains information indicating the behavior of the transmitting vehicle, it is preferable to use the CAM as the message when making the identity determination using behavior. If the identity determination is made using behavior, it can be made with high accuracy even if there is a positioning error.
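Determining behavior from the time change of the absolute coordinates in successive CPMs can be sketched as below; the local metric frame, the sample layout, and the function name are assumptions made for illustration.

```python
import math

def behavior_from_positions(samples):
    # Estimate speed (m/s) and heading (rad) of another vehicle from
    # the time change of its positions reported in successive CPMs.
    # samples: list of (t_seconds, x_m, y_m) in a local metric frame,
    # oldest first; the last two samples are used.
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return math.hypot(vx, vy), math.atan2(vy, vx)
```

With a CAM no such reconstruction is needed, since speed and heading are carried in the message itself.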
  • the type of the object that sent the message can be used.
  • the type can be determined, for example, by the station type included in the message.
  • Absolute coordinates of objects may be used for narrowing down.
  • The narrowing-down range is determined by a radius set to a distance larger than the distance difference within which objects are determined to be the same.
  • the center of the narrowing range is the coordinates of the target or other vehicle. The same determination is made for the target and the other vehicle within this narrowed-down range.
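The narrowing-down step can be sketched as a simple pre-filter; the function name and the dictionary layout are assumptions for the example, and a vehicle-relative metric frame is assumed.

```python
import math

def narrow_candidates(center, objects, radius_m):
    # Keep only objects whose coordinates fall inside the
    # narrowing-down radius around the target (or other vehicle),
    # so the identity determination is only run on nearby pairs.
    cx, cy = center
    return [o for o in objects
            if math.hypot(o["x"] - cx, o["y"] - cy) <= radius_m]
```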
  • The determination content of step S8A is the same as that of step S2A. If YES in step S8A, the process proceeds to step S9; if NO in step S8A, the process proceeds to step S13. If NO in step S9, the process proceeds to S10A.
  • In step S10A, a second estimation-related process is performed, and the process proceeds to step S13.
  • An example of the flow of the second estimation-related processing according to the second embodiment will be described using the flowchart of FIG. 24 .
  • The flowchart shown in FIG. 24 executes steps S101A, S103A, and S106A instead of steps S101, S103, and S106 in the flowchart of FIG. 6. Note that steps S102, S104, S105, and S107 are common between the flowchart of FIG. 6 and the flowchart of FIG. 24, and thus description thereof will be omitted.
  • In step S101A, when the vehicle-side reception unit 222, which is the mounted object information acquisition unit, receives a CAM or CPM from another vehicle different from the other vehicle whose message was received in S2A or S8A (YES in S101A), the process proceeds to step S102. If no CAM or CPM has been received (NO in S101A), the process proceeds to S13.
  • the identity determination unit 204 executes step S103A.
  • S103A it is determined whether or not the target specified by the CAM or CPM determined to have been received in S101A is the same as the first target whose position is specified by the target position specifying unit 282 .
  • One object of the identity determination is the first target whose position was determined to have been specified in the preceding S102.
  • the other target for the same determination is either the second communication device mounted object or the second target.
  • the other target of the same determination is the second communication device mounted object.
  • The second communication device mounted object is the vehicle VEb.
  • the other object of the same determination is one or both of the second communication device mounted object and the second target.
  • the second target is the target detected by the second communication device mounted object.
  • In the specific example shown in the figure, the second target is the target LMa.
  • If the message is a CPM, it is determined whether or not the first target and the second target are the same. As the target information of the first target, the POC included in the CPM transmitted by the first communication device mounted object is used; as the target information of the second target, the POC included in the CPM transmitted by the second communication device mounted object is used. Whether or not the first target and the second target are the same is determined by whether or not one or both of the position and behavior of the object indicated by the POCs included in the two CPMs approximately match. Of course, the type of target may also be taken into consideration, and narrowing down by position may be performed.
  • If the message is a CAM, it is determined whether or not the first target and the second communication device mounted object are the same. The POC included in the CPM transmitted by the first communication device mounted object is used as the target information of the first target, and the CAM transmitted by the second communication device mounted object is used as the information of the second communication device mounted object.
  • If the message is a CPM, it is also possible to determine whether or not the first target and the second communication device mounted object are the same. As the target information of the first target, the POC included in the CPM transmitted by the first communication device mounted object is used. For the information of the second communication device mounted object, one or both of the management container and the OVC included in the CPM transmitted by the second communication device mounted object are used. Note that, as in the case of determining whether or not the first target and the second target are the same, the type of the target may be considered and narrowing down by position may be performed.
  • the error estimation unit 206 estimates the second positioning error.
  • The second positioning error is the positional deviation between the two objects subjected to the identity determination in S103A. If the two objects determined to be the same in S103A are the first target and the second target, the deviation between the positions of the object indicated by the POCs included in the two CPMs is set as the second positioning error.
  • By correcting the second positioning position obtained from the second communication device mounted object and the position of the second target by the second positioning error, the position of the second communication device mounted object and the position of the second target relative to the own vehicle can be specified.
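The estimation and correction steps can be illustrated with a minimal 2-D sketch (positions in a local metric frame; the helper names are hypothetical): the error is the offset of the reported position from the reference position, and the same offset then corrects other positions reported by the same sender.

```python
def estimate_error(reference_pos, measured_pos):
    # Positioning error = deviation of the transmitted (measured)
    # position from the reference position, as a 2-D offset.
    return (reference_pos[0] - measured_pos[0],
            reference_pos[1] - measured_pos[1])

def correct(measured_pos, error):
    # Once estimated, the same offset corrects any position reported
    # by the same sender: its own positioning position or the
    # positions of the targets it detected.
    return (measured_pos[0] + error[0], measured_pos[1] + error[1])
```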
  • As described above, when estimating the positioning error of the first communication device mounted object, one or both of the CPM and CAM transmitted by the vehicle unit 2 and the CPM transmitted by the roadside device 3 are used. Since a system that transmits and receives CAM and CPM can be utilized, it is easy to construct a system that accurately identifies the position of a target such as the first communication device mounted object.
  • the object mounted with the communication device may be a mobile object other than a vehicle.
  • Mobile objects other than vehicles include, for example, drones.
  • A unit having the functions of the vehicle unit 2 other than those specific to a vehicle may be mounted on a moving object such as a drone.
  • The communication device 20 may estimate the positioning error of the own vehicle from the detected position of the roadside device 3 based on the positioning position of the own vehicle and the absolute position information of the roadside device 3 acquired via road-to-vehicle communication.
  • the detected position of the roadside device 3 based on the positioning position of the own vehicle is obtained by calculating the position of the roadside device 3 with respect to the position of the own vehicle recognized by the perimeter monitoring ECU 50 using the coordinates of the positioning position of the own vehicle.
  • Since the detected position of the roadside unit 3 based on the measured position of the own vehicle is derived from that measured position, it contains the positioning error of the own vehicle. Therefore, the positioning error of the own vehicle can be estimated from the deviation between the detected position of the roadside unit 3 based on the positioning position of the own vehicle and the absolute position of the roadside unit 3.
  • the estimated positioning error of the own vehicle may be used for correction when specifying the position of the own vehicle.
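This own-vehicle variant of the estimation can be sketched as below, again in a local metric frame with hypothetical names; the sensor-measured relative offset of the roadside unit is assumed error-free for the illustration.

```python
def own_positioning_error(own_fix, rsu_relative, rsu_absolute):
    # Detected position of the roadside unit based on the own-vehicle
    # positioning fix: the fix plus the sensor-measured relative offset.
    detected = (own_fix[0] + rsu_relative[0], own_fix[1] + rsu_relative[1])
    # The deviation from the roadside unit's surveyed absolute position
    # is attributable to the own-vehicle positioning error.
    return (rsu_absolute[0] - detected[0], rsu_absolute[1] - detected[1])
```

The returned offset can then be added to the own-vehicle fix when specifying the position of the own vehicle.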
  • Although the configuration in which the communication device 20 estimates the positioning error has been described in the above embodiments, the configuration is not necessarily limited to this.
  • the functions other than the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203 may be performed by the perimeter monitoring ECU 50.
  • the perimeter monitoring ECU 50 may include a functional block for acquiring target information received by the vehicle-to-vehicle communication section 202 and the road-to-vehicle communication section 203 .
  • This functional block corresponds to the mounted object information acquisition unit and the roadside unit information acquisition unit.
  • the perimeter monitoring ECU 50 corresponds to the vehicle device.
  • The functions of the communication device 20 may be divided between the communication device 20 and the perimeter monitoring ECU 50.
  • a unit including the communication device 20 and the perimeter monitoring ECU 50 corresponds to the vehicle device.
  • the first positioning error is estimated after the roadside unit reference target position and the first positioning position are converted into relative positions with respect to the own vehicle by the conversion unit 205 (S5, S6).
  • the first positioning error may be estimated using the coordinates of the roadside machine reference target position and the first positioning position as absolute positions.
  • If the first positioning position is expressed in absolute coordinates, the conversion unit 205 is unnecessary.
  • the roadside unit reference target position which is the position of the target detected by the roadside unit 3, is used as the reference position.
  • the reference position is not limited to the target position detected by the roadside unit 3 .
  • the vehicle unit 2 mounted on the vehicle VEc can receive the CPM transmitted by the roadside unit 3.
  • This CPM also includes the position of the vehicle VEc.
  • the vehicle unit 2 mounted on the vehicle VEc can correct its own positioning error based on the position of the vehicle VEc included in the CPM. After correcting the positioning error, it can be considered that the vehicle unit 2 has almost no positioning error for a certain period of time. Therefore, after the positioning error is corrected, the position of the target detected by the error-corrected vehicle unit 2 may be used as the reference position for a certain period of time.
  • the vehicle unit 2 may indicate whether or not the position of the target transmitted by itself can be used as the reference position, for example, by means of a flag included in the message.
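The "usable as a reference for a certain period after correction" rule can be sketched as a small state holder; the class name, the explicit time arguments, and the 10 s validity period are assumptions for the example, not values from the disclosure.

```python
class ReferenceEligibility:
    # After a vehicle unit corrects its positioning error, its detected
    # target positions are treated as usable reference positions for a
    # fixed validity period; the flag in the transmitted message would
    # reflect the result of usable_as_reference().
    def __init__(self, validity_s=10.0):
        self.validity_s = validity_s
        self.corrected_at = None

    def mark_corrected(self, now):
        self.corrected_at = now

    def usable_as_reference(self, now):
        return (self.corrected_at is not None
                and now - self.corrected_at <= self.validity_s)
```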
  • A vehicle device that can be used in a vehicle, used together with a roadside unit that has a surroundings monitoring sensor, holds information on the absolute position of its own device with higher positional accuracy than obtained by positioning using signals from positioning satellites, and is equipped with a communication device capable of wireless communication with the vehicle.
  • A vehicle device comprising a mounted object position specifying unit (281) that specifies the position of the first communication device mounted object with respect to the vehicle by correcting, by the first positioning error, the first positioning position acquired by the mounted object information acquisition unit from the first communication device mounted object.
  • The mounted object information acquisition unit also acquires, via wireless communication, a first mounted object reference target position, which is the position, based on the first positioning position, of the first target detected by the surroundings monitoring sensor of the first communication device mounted object and transmitted from that mounted object. After the first positioning error has once been estimated by the error estimation unit, a target position specifying unit (282) specifies the position of the first target with respect to the vehicle by correcting the first mounted object reference target position acquired from the first communication device mounted object by the amount of the first positioning error.
  • The mounted object information acquisition unit also acquires, via wireless communication, from a second communication device mounted object, which has a surroundings monitoring sensor, is capable of positioning using positioning satellite signals, is equipped with a communication device capable of wireless communication with the vehicle, and is distinct from the roadside device and the first communication device mounted object: a second positioning position, which is the position of the second communication device mounted object obtained by its positioning, and a second mounted object reference target position, which is the position, based on the position of the second communication device mounted object determined by that positioning, of the second target, which is the target detected by its surroundings monitoring sensor and transmitted from the second communication device mounted object.
  • The identity determination unit determines whether or not the first target whose position relative to the vehicle is specified by the target position specifying unit and the second communication device mounted object that is the transmission source of the second mounted object reference target position acquired by the mounted object information acquisition unit are the same.
  • the converting unit converts the second mounted object reference target position acquired by the mounted object information acquiring unit into a relative position with respect to the vehicle,
  • A vehicle device in which the error estimation unit estimates a second positioning error, which is the positioning error of the second communication device mounted object, from the deviation between the second mounted object reference target position converted by the conversion unit and the position of the first target with respect to the vehicle specified by the target position specifying unit.
  • (Technical feature 5) A vehicle device according to technical feature 4, wherein, after the second positioning error has once been estimated by the error estimation unit, the mounted object information acquisition unit corrects the second positioning position acquired from the second communication device mounted object by the second positioning error, and a mounted object position specifying unit (281) specifies the position of the second communication device mounted object.
  • A vehicle device in which the target position specifying unit specifies the position of the second target with respect to the vehicle by correcting, by the second positioning error, the second mounted object reference target position acquired from the second communication device mounted object by the mounted object information acquisition unit.
  • An error estimation method that can be used in a vehicle and is executed by at least one processor, used together with a roadside unit that has a surroundings monitoring sensor, holds information on the absolute position of its own device with higher positional accuracy than obtained by positioning using signals from positioning satellites, and is equipped with a communication device capable of wireless communication with the vehicle.
  • The controller and techniques described in this disclosure may also be implemented by a special purpose computer comprising a processor programmed to perform one or more functions embodied by a computer program.
  • the apparatus and techniques described in this disclosure may be implemented by dedicated hardware logic circuitry.
  • the apparatus and techniques described in this disclosure may be implemented by one or more special purpose computers configured by a combination of a processor executing a computer program and one or more hardware logic circuits.
  • the computer program may also be stored as computer-executable instructions on a computer-readable non-transitional tangible recording medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

This invention comprises: a reference position acquisition unit (231) that acquires a reference position transmitted from a reference device; a vehicle-side reception unit (222) that acquires a first positioning location transmitted from a first communication-device-mounted object; a co-incidence determination unit (204) that determines whether or not a target for which a reference position was acquired by the reference position acquisition unit is the same as the first communication-device-mounted object for which the first positioning location was acquired by the vehicle-side reception unit (222); and an error estimation unit (206) that estimates a positioning error at the first communication-device-mounted object from a deviation between the reference position and the first positioning location, if the co-incidence determination unit (204) determined that the target and the first communication-device-mounted object are the same.

Description

Vehicle device and error estimation method

Cross-reference to related applications
This application is based on Patent Application No. 2021-98905 filed in Japan on June 14, 2021, and the content of the underlying application is incorporated by reference in its entirety.
The present disclosure relates to a vehicle device and an error estimation method.
Patent Document 1 discloses a technique for determining the existence of a target beyond the line of sight of the own vehicle. In the technique disclosed in Patent Document 1, reception information, which is position information of another vehicle transmitted from that vehicle as a target, is compared with sensor information including the moving speed and moving direction of a target detected by a sensor of the own vehicle; if the two do not match, it is determined that a target exists beyond the line of sight.
JP 2019-79316 A
With the technology disclosed in Patent Document 1, the presence of another vehicle that cannot be detected by the own vehicle's sensors can be determined by receiving the position information of that vehicle. However, the position information transmitted from another vehicle includes a positioning error (hereinafter, positioning error), whether the positioning uses positioning satellites or not. Therefore, with the technology disclosed in Patent Document 1, it is difficult to accurately identify the position of another vehicle located outside the detection range of the own vehicle's sensors.
One object of this disclosure is to provide a vehicle device and an error estimation method that make it possible to more accurately identify the position of a communication device mounted object outside the detection range of the own vehicle's surroundings monitoring sensor.
The above object is achieved by the combination of features described in the independent claims, and the dependent claims define further advantageous embodiments of the disclosure. Reference numerals in parentheses in the claims indicate correspondence with specific means described in the embodiments described later as one aspect, and do not limit the technical scope of the present disclosure.
In order to achieve the above object, the vehicle device of the present disclosure is a vehicle device that can be used in a vehicle, comprising: a reference position acquisition unit that acquires a reference position wirelessly transmitted from a reference device that detects the reference position of a target; a mounted object information acquisition unit that acquires a first positioning position, which is the position of a first communication device mounted object transmitted from the first communication device mounted object and obtained by positioning performed by the first communication device mounted object; an identity determination unit that determines whether or not the target whose reference position was acquired by the reference position acquisition unit and the first communication device mounted object whose first positioning position was acquired by the mounted object information acquisition unit are the same; and an error estimation unit that, when the identity determination unit determines that the target and the first communication device mounted object are the same, estimates a first positioning error, which is the positioning error of the first communication device mounted object, from the deviation between the reference position and the first positioning position.
In order to achieve the above object, the error estimation method of the present disclosure is an error estimation method that can be used in a vehicle and is executed by at least one processor, comprising: a reference position acquisition step of acquiring a reference position wirelessly transmitted from a reference device that detects the reference position of a target; a mounted object information acquisition step of acquiring a first positioning position, which is the position of a first communication device mounted object transmitted from the first communication device mounted object and obtained by positioning performed by the first communication device mounted object; an identity determination step of determining whether or not the target whose reference position was acquired in the reference position acquisition step and the first communication device mounted object whose first positioning position was acquired in the mounted object information acquisition step are the same; and an error estimation step of, when it is determined in the identity determination step that the target and the first communication device mounted object are the same, estimating a first positioning error, which is the positioning error of the first communication device mounted object, from the deviation between the reference position and the first positioning position.
According to these, since the first positioning position is obtained by positioning performed by the first communication device mounted object, it includes a positioning error. On the other hand, the disclosed technique acquires the reference position of the target from the reference device. Therefore, when the target and the first communication device mounted object are determined to be the same, the first positioning error, which is the positioning error of the first communication device mounted object, can be estimated more accurately from the deviation between the reference position and the first positioning position acquired from the first communication device mounted object. Here, since the first positioning error is estimated from the reference position and the first positioning position, it can be estimated even when the first communication device mounted object cannot be detected by the surroundings monitoring sensor mounted on the vehicle. Further, by using the first positioning error, the positioning error of the first communication device mounted object can be corrected, and the position of the first communication device mounted object can be specified with higher accuracy. As a result, the position of a communication device mounted object outside the detection range of the own vehicle's surroundings monitoring sensor can be specified more accurately.
A diagram showing an example of the schematic configuration of the vehicle system 1.
A diagram showing an example of the schematic configuration of the roadside unit 3.
A diagram showing an example of the schematic configuration of the vehicle unit 2.
A diagram showing an example of the schematic configuration of the communication device 20.
A flowchart showing an example of the flow of positioning error estimation-related processing in the communication device 20.
A flowchart showing an example of the flow of second estimation-related processing in the communication device 20.
A diagram explaining the outline of the second embodiment.
A diagram showing an exemplary architecture of a V2X communication device.
A diagram illustrating a V2X message.
A diagram showing the logical interfaces between the CA service and other layers.
A functional block diagram of the CA service.
A diagram showing the basic format of the CAM.
An example of a flowchart showing the process of transmitting a CAM.
A diagram showing the logical interfaces between the CP service and other layers.
A functional block diagram of the CP service.
A diagram showing the basic format of the CPM.
A diagram showing an example of the OVC in the CPM.
A diagram illustrating the FOC (or SIC) in the CPM.
A diagram illustrating the POC in the CPM.
A diagram explaining the sensor data extraction method.
A diagram explaining the CP service.
An example of a flowchart showing the process of transmitting a CPM.
An example of a flowchart showing positioning error estimation-related processing executed in the second embodiment.
An example of a flowchart showing second estimation-related processing in the second embodiment.
A diagram explaining the objects of identity determination in the second embodiment.
A plurality of embodiments of the disclosure will be described with reference to the drawings. For convenience of explanation, parts having the same functions as parts shown in drawings used in earlier descriptions are denoted by the same reference numerals across the embodiments, and their description may be omitted. For parts denoted by the same reference numerals, the description in the other embodiments can be referred to.
(Embodiment 1)
<Schematic Configuration of Vehicle System 1>
Embodiment 1 of the present disclosure will be described below with reference to the drawings. As shown in FIG. 1, the vehicle system 1 includes a vehicle unit 2 used in each of a plurality of vehicles, and a roadside unit 3. In the example shown in FIG. 1, the vehicles VEa, VEb, and VEc are taken as examples, and are assumed to be automobiles.
<Schematic Configuration of the Roadside Unit 3>
Next, an example of the schematic configuration of the roadside unit 3 will be described with reference to FIG. 2. As shown in FIG. 2, the roadside unit 3 includes a control device 300, a communication device 301, a surroundings monitoring sensor 302, and a position storage unit 303.
The communication device 301 exchanges information with the outside of its own device by performing wireless communication. The communication device 301 wirelessly communicates with a communication device 20, described later, used in a vehicle; that is, the communication device 301 performs road-to-vehicle communication. In this embodiment, it is assumed that road-to-vehicle communication is performed between the roadside unit 3 and the communication device 20 of the vehicle VEa.
The surroundings monitoring sensor 302 detects obstacles around the device, such as moving objects including pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles, as well as stationary objects such as fallen objects on the road, guardrails, curbs, and trees. The surroundings monitoring sensor 302 may be configured as a surroundings monitoring camera whose imaging range is a predetermined range around the device, or may use millimeter-wave radar, sonar, LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), or the like, or a combination of these. The surroundings monitoring camera sequentially outputs captured images to the control device 300 as sensing information. Sensors that transmit search waves, such as sonar, millimeter-wave radar, and LIDAR, sequentially output to the control device 300, as sensing information, scanning results based on the received signals obtained when reflected waves from obstacles are received.
The position storage unit 303 stores information on the absolute position of the device; a non-volatile memory may be used for it. The absolute position information may be at least latitude and longitude coordinates, and is assumed to have higher positional accuracy than would be obtained by the device alone through positioning using positioning satellite signals. The absolute position information may be obtained in advance by surveying with reference to triangulation points or electronic reference points and stored in the position storage unit 303.
The control device 300 includes a processor, memory, I/O, and a bus connecting them, and executes various processes by running a control program stored in the memory. From the sensing information output by the surroundings monitoring sensor 302, the control device 300 recognizes information about targets present around the device (hereinafter, target information). The target information may include, for each target, its position relative to the device, its type, its speed, its direction of movement, and so on. The relative position may consist of the distance and azimuth from the device. The position identification accuracy of target position recognition using the surroundings monitoring sensor 302 is assumed to be higher than the position identification accuracy of positioning by the locator 30 described later.
The type of a target may be recognized from images captured by the surroundings monitoring camera through image recognition processing such as template matching. When a monocular camera is used as the surroundings monitoring camera, the distance of a target from the device and the azimuth of the target relative to the device may be recognized from the mounting position and optical-axis orientation of the camera relative to the device and from the position of the target in the captured image. When a compound-eye camera is used, the distance of a target from the device may be recognized from the amount of parallax between the pair of cameras. The speed of a target may be recognized from the amount of change over time in the target's position relative to the device. The direction of movement may be recognized from the direction of change in the target's relative position and from the absolute position information of the device stored in the position storage unit 303.
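The speed and direction-of-movement recognition described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes planar east/north coordinates in metres and a stationary roadside device, so that the direction of change of the relative position directly gives the absolute direction of movement; the function name and units are assumptions.

```python
import math

def target_motion(prev_rel, curr_rel, dt):
    """Estimate a target's speed and heading from two successive relative
    positions (east, north) in metres and the sampling interval dt.
    For a stationary device, the change in relative position equals the
    target's own displacement."""
    dx = curr_rel[0] - prev_rel[0]  # eastward change
    dy = curr_rel[1] - prev_rel[1]  # northward change
    speed = math.hypot(dx, dy) / dt                      # m/s
    heading = math.degrees(math.atan2(dx, dy)) % 360.0   # 0 deg = north, clockwise
    return speed, heading
```

For a sensor on a moving vehicle (as with the surroundings monitoring sensor 40 later in this description), the vehicle's own velocity would additionally have to be added to the relative motion.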
Alternatively, the control device 300 may use the sensing information of a sensor that transmits search waves to recognize the distance of a target from the device, the azimuth of the target relative to the device, the speed of the target, and its direction of movement. The control device 300 may recognize the distance from the device to a target based on the time from transmission of a search wave to reception of the reflected wave. It may recognize the azimuth of a target relative to the device based on the direction in which the search wave that produced the reflected wave was transmitted. It may also recognize the speed of a target from the target's speed relative to the device, calculated from the Doppler shift between the transmitted search wave and the reflected wave.
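The time-of-flight and Doppler relationships mentioned above can be written out directly. The sketch below assumes an electromagnetic search wave (radar/LIDAR) propagating at the speed of light; for sonar, the speed of sound would be used instead, and the function names are illustrative.

```python
C = 299_792_458.0  # m/s, propagation speed of an electromagnetic search wave

def range_from_round_trip(round_trip_time_s):
    """Distance to the target from the transmit-to-receive delay;
    halved because the wave travels out and back."""
    return C * round_trip_time_s / 2.0

def relative_speed_from_doppler(f_tx_hz, f_rx_hz):
    """Relative speed from the round-trip Doppler shift between the
    transmitted and reflected waves (positive = target approaching)."""
    return C * (f_rx_hz - f_tx_hz) / (2.0 * f_tx_hz)
```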
The control device 300 controls the communication device 301 to perform road-to-vehicle communication. The control device 300 transmits the target information recognized from the sensing information output by the surroundings monitoring sensor 302, through road-to-vehicle communication, to the communication devices 20 of vehicles around the device. When transmitting target information, the control device 300 also includes in it the absolute position information of the device stored in the position storage unit 303. As described above, the target information transmitted from the communication device 301 includes the relative position of the target, the type of the target, the speed of the target, the direction of movement of the target, and the absolute position of the roadside unit 3 (hereinafter, the roadside unit absolute position). The relative position of the target in the target information is a position referenced to the position of the roadside unit 3; hereinafter, this relative position included in the target information is called the roadside unit reference target position. The control device 300 may be configured to transmit the target information periodically, for example.
<Schematic Configuration of Vehicle Unit 2>
Next, an example of the schematic configuration of the vehicle unit 2 will be described with reference to FIG. 3. As shown in FIG. 3, the vehicle unit 2 includes a communication device 20, a locator 30, a surroundings monitoring sensor 40, a surroundings monitoring ECU 50, and a driving support ECU 60. The communication device 20, the locator 30, the surroundings monitoring ECU 50, and the driving support ECU 60 may be connected to an in-vehicle LAN (see LAN in FIG. 3).
The locator 30 includes a GNSS (Global Navigation Satellite System) receiver, which receives positioning signals from multiple positioning satellites. Using the positioning signals received by the GNSS receiver, the locator 30 sequentially measures the position of the vehicle in which it is mounted (hereinafter, the positioning position). The locator 30 selects, from among the positioning satellites whose signals it receives, a specified number of satellites, for example four, to use in the positioning calculation. It then performs a positioning calculation that computes the coordinates of the positioning position using the code pseudoranges, carrier phases, and so on of the selected satellites. The positioning calculation itself may use either code pseudoranges or carrier phases. The coordinates may be at least latitude and longitude coordinates.
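The positioning calculation itself is standard GNSS processing and is not detailed in this description. As a rough illustration of the principle only, the following toy solver recovers a 2D position from distances to known reference points by Gauss-Newton least squares; a real receiver works in 3D with code pseudoranges or carrier phases and additionally solves for the receiver clock bias, so this is a simplified stand-in, not the locator 30's algorithm.

```python
import math

def solve_position_2d(stations, ranges, iters=10):
    """Gauss-Newton least-squares solve for a 2D position given measured
    distances to known station positions (a toy stand-in for GNSS
    positioning; no clock-bias unknown, exact ranges assumed)."""
    x, y = 0.0, 0.0  # initial guess
    for _ in range(iters):
        # accumulate the 2x2 normal equations J^T J d = J^T r by hand
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (sx, sy), r in zip(stations, ranges):
            d = math.hypot(x - sx, y - sy) or 1e-9   # predicted range
            jx, jy = (x - sx) / d, (y - sy) / d      # range gradient
            res = r - d                              # range residual
            a11 += jx * jx; a12 += jx * jy; a22 += jy * jy
            b1 += jx * res; b2 += jy * res
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-12:
            break
        x += (a22 * b1 - a12 * b2) / det
        y += (a11 * b2 - a12 * b1) / det
    return x, y
```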
The locator 30 may also include an inertial sensor, for example a gyro sensor and an acceleration sensor. The locator 30 may sequentially determine the positioning position by combining the positioning signals received by the GNSS receiver with the measurement results of the inertial sensor. The travel distance obtained from signals sequentially output by a vehicle speed sensor mounted on the vehicle may also be used in determining the positioning position.
The surroundings monitoring sensor 40 detects obstacles around the vehicle, that is, moving objects such as pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles, as well as stationary objects such as fallen objects on the road, guardrails, curbs, and trees. The surroundings monitoring sensor 40 may be configured as a surroundings monitoring camera whose imaging range is a predetermined area around the vehicle, or may be configured using millimeter-wave radar, sonar, LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), or the like, or a combination of these. The surroundings monitoring camera sequentially outputs the images it captures to the surroundings monitoring ECU 50 as sensing information. Sensors that transmit search waves, such as sonar, millimeter-wave radar, and LIDAR, sequentially output to the surroundings monitoring ECU 50, as sensing information, scanning results based on the received signals obtained when waves reflected by obstacles are received.
The surroundings monitoring ECU 50 includes a processor, memory, I/O, and a bus connecting them, and executes various processes related to recognizing targets around the vehicle by running a control program stored in the memory. From the sensing information output by the surroundings monitoring sensor 40, the surroundings monitoring ECU 50 recognizes target information about targets present around the vehicle in the same manner as the control device 300 described above. Except that the vehicle, rather than the roadside unit 3, serves as the reference, the recognition of target information by the surroundings monitoring ECU 50 is the same as that by the control device 300, with the following differences. The speed of a target may be recognized from the amount of change over time in the target's position relative to the vehicle together with the vehicle's own speed. The speed of a target may also be recognized from the target's speed relative to the vehicle, calculated from the Doppler shift between the transmitted search wave and the reflected wave, together with the vehicle's own speed. The position identification accuracy of target position recognition using the surroundings monitoring sensor 40 is assumed to be higher than the position identification accuracy of positioning by the locator 30.
The driving support ECU 60 includes a processor, memory, I/O, and a bus connecting them, and executes various processes related to driving support for the vehicle by running a control program stored in the memory. The driving support ECU 60 performs driving support based on the target positions identified by the surroundings monitoring ECU 50 and by the communication device 20. The target positions identified by the surroundings monitoring ECU 50 are the positions, relative to the vehicle, of targets that the surroundings monitoring sensor 40 can detect (hereinafter, line-of-sight targets). The target positions identified by the communication device 20 are the positions, relative to the vehicle, of targets that the surroundings monitoring sensor 40 cannot detect (hereinafter, non-line-of-sight targets). Examples of driving support include vehicle control for avoiding coming close to a target and alerting the driver so as to avoid coming close to a target.
The communication device 20 includes a processor, memory, I/O, and a bus connecting them, and by running a control program stored in the memory, executes processes related to wireless communication with the outside of the vehicle and to identifying target positions. Execution of this control program by the processor corresponds to execution of the method corresponding to the control program. The memory referred to here is a non-transitory tangible storage medium that non-temporarily stores computer-readable programs and data; such a non-transitory tangible storage medium is implemented by a semiconductor memory, a magnetic disk, or the like. This communication device 20 corresponds to the vehicle device. Details of the processing in the communication device 20 will be described later.
<Schematic Configuration of Communication Device 20>
Next, an example of the schematic configuration of the communication device 20 will be described with reference to FIG. 4. As shown in FIG. 4, the communication device 20 includes, as functional blocks, a detection information acquisition unit 201, a vehicle-to-vehicle communication unit 202, a road-to-vehicle communication unit 203, an identity determination unit 204, a conversion unit 205, an error estimation unit 206, an error storage unit 207, a position identification unit 208, and a position storage unit 209. Part or all of the functions executed by the communication device 20 may be implemented in hardware by one or more ICs or the like, and part or all of the functional blocks of the communication device 20 may be realized by a combination of software executed by a processor and hardware components.
The detection information acquisition unit 201 acquires the target information recognized by the surroundings monitoring ECU 50, that is, information on the targets detected by the surroundings monitoring sensor 40. In the following description, the target information consists of the relative position of the target (the distance of the target from the vehicle and the azimuth of the target relative to the vehicle), the type of the target, the speed of the target, and the direction of movement of the target. The relative position of the target may instead be coordinates referenced to the vehicle's positioning position. The detection information acquisition unit 201 also acquires the vehicle's positioning position measured by the locator 30.
The vehicle-to-vehicle communication unit 202 exchanges information with the communication devices 20 of other vehicles by performing vehicle-to-vehicle communication with them. In this embodiment, vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEb, and between the vehicle VEa and the vehicle VEc.
The vehicle-to-vehicle communication unit 202 includes a transmission unit 221 and a vehicle-side reception unit 222. The transmission unit 221 transmits information to other vehicles through vehicle-to-vehicle communication. The transmission unit 221 transmits the target information acquired by the detection information acquisition unit 201 and the positioning position measured by the vehicle's locator 30 to the communication devices 20 of other vehicles via wireless communication. When transmitting target information, the transmission unit 221 may also include in it identification information for identifying the vehicle (hereinafter, transmission source identification information). The transmission source identification information may be the vehicle ID of the vehicle or the communication device ID of the vehicle's communication device 20. As described above, the target information transmitted from the transmission unit 221 includes the relative position of the target, the type of the target, the speed of the target, the direction of movement of the target, the positioning position of the transmitting vehicle, and the transmission source identification information. The relative position of the target in the target information is a position referenced to the position of the transmitting vehicle; hereinafter, this relative position included in the target information is called the mounted object reference target position.
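The content of one transmitted target information message, as enumerated above, might be represented as the following record. All field names, units, and the use of a dataclass are illustrative assumptions; the patent does not specify an encoding.

```python
from dataclasses import dataclass

@dataclass
class TargetMessage:
    """One vehicle-to-vehicle target report (field names are illustrative)."""
    rel_distance_m: float   # mounted object reference target position:
    rel_azimuth_deg: float  #   distance and azimuth from the sender
    target_type: str        # e.g. "pedestrian", "vehicle"
    speed_mps: float        # speed of the target
    heading_deg: float      # direction of movement of the target
    sender_fix_lat: float   # sender's positioning position (GNSS fix)
    sender_fix_lon: float
    sender_id: str          # transmission source identification information
```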
The vehicle-side reception unit 222 receives information transmitted from other vehicles through vehicle-to-vehicle communication. The vehicle-side reception unit 222 receives target information transmitted via wireless communication from the communication devices 20 of other vehicles, which are assumed to be vehicles using the vehicle unit 2. This vehicle-side reception unit 222 corresponds to the mounted object information acquisition unit, and its processing corresponds to the mounted object information acquisition step.
In this embodiment, the vehicle-side reception unit 222 is described as receiving target information through vehicle-to-vehicle communication from two kinds of other vehicles. The first kind is other vehicles within the detection range of the surroundings monitoring sensor 302 of the roadside unit 3; such a vehicle is hereinafter called a first communication device mounted object. The second kind is other vehicles outside the detection range of the surroundings monitoring sensor 302 of the roadside unit 3; such a vehicle is hereinafter called a second communication device mounted object. As described above, the target information received by the vehicle-side reception unit 222 includes the mounted object reference target position, the type of the target, the speed of the target, the direction of movement of the target, the positioning position of the transmitting vehicle, and the transmission source identification information. Hereinafter, the positioning position measured by the locator 30 of a first communication device mounted object is called the first positioning position, and a target detected by the surroundings monitoring sensor 40 of a first communication device mounted object is called a first target. Likewise, the positioning position measured by the locator 30 of a second communication device mounted object is called the second positioning position, and a target detected by the surroundings monitoring sensor 40 of a second communication device mounted object is called a second target.
The road-to-vehicle communication unit 203 exchanges information with the roadside unit 3 by performing road-to-vehicle communication. In this embodiment, road-to-vehicle communication is performed between the vehicle VEa and the roadside unit 3. The road-to-vehicle communication unit 203 includes a road-vehicle-side reception unit 231. The road-to-vehicle communication unit 203 may also transmit information to the roadside unit 3 through road-to-vehicle communication, but for convenience this is not described.
The road-vehicle-side reception unit 231 receives information transmitted from the roadside unit 3 through road-to-vehicle communication, namely the target information transmitted from the roadside unit 3 via wireless communication. As described above, the target information received by the road-vehicle-side reception unit 231 includes the roadside unit reference target position, the type of the target, the speed of the target, the direction of movement of the target, and the roadside unit absolute position. The roadside unit reference target position is the reference position referred to when the error estimation unit 206 described later estimates the first positioning error, and the roadside unit 3 corresponds to the reference device. This road-vehicle-side reception unit 231 corresponds to the reference position acquisition unit and the roadside unit information acquisition unit, and its processing corresponds to the reference position acquisition step and the roadside unit information acquisition step.
The identity determination unit 204 determines whether a target detected by the surroundings monitoring sensor 302 of the roadside unit 3, whose roadside unit reference target position was acquired by the road-vehicle-side reception unit 231 (hereinafter, a roadside unit detected target), is the same as the other vehicle that transmitted the target information acquired by the vehicle-side reception unit 222 (hereinafter, the communicating other vehicle). This other vehicle corresponds to the first communication device mounted object, and the positioning position included in this target information corresponds to the first positioning position. The processing in the identity determination unit 204 corresponds to the identity determination step. The identity determination unit 204 may determine whether the roadside unit detected target and the communicating other vehicle are the same based on whether the roadside unit reference target position acquired by the road-vehicle-side reception unit 231 and the mounted object reference target position acquired by the vehicle-side reception unit 222 approximately coincide. For example, the identity determination unit 204 may determine that they are the same when the difference between these positions is less than a threshold, and that they are not the same when it is not less than the threshold.
Other conditions may also be used for the identity determination by the identity determination unit 204, such as matching target types, a speed difference less than a threshold, or a difference in direction of movement less than a threshold. In that case, the target information transmitted from the transmission unit 221 may also include information such as the type, speed, and direction of movement of the vehicle.
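The threshold-based identity determination just described can be sketched as follows. This is a minimal illustration under assumptions not in the original: the two positions are expressed in a common planar frame, and the threshold values are arbitrary placeholders.

```python
import math

def is_same_object(pos_a, pos_b, pos_thresh_m=2.0,
                   type_a=None, type_b=None,
                   speed_a=None, speed_b=None, speed_thresh_mps=1.0):
    """Judge whether the roadside unit detected target (a) and the
    communicating other vehicle's report (b) refer to the same object.
    Positions are (x, y) in a common frame; type and speed checks are
    the optional additional conditions."""
    dx, dy = pos_a[0] - pos_b[0], pos_a[1] - pos_b[1]
    if math.hypot(dx, dy) >= pos_thresh_m:
        return False  # positions do not approximately coincide
    if type_a is not None and type_a != type_b:
        return False  # target types disagree
    if speed_a is not None and abs(speed_a - speed_b) >= speed_thresh_mps:
        return False  # speeds disagree by more than the threshold
    return True
```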
The conversion unit 205 converts the roadside unit reference target position acquired by the road-vehicle-side reception unit 231 and the positioning position acquired by the vehicle-side reception unit 222 into positions relative to the vehicle, for example into coordinates of an XY coordinate system whose origin is the vehicle's positioning position measured by the locator 30 (hereinafter, the vehicle coordinate system). The processing in the conversion unit 205 corresponds to the conversion step. When the identity determination unit 204 determines that the roadside unit detected target and the communicating other vehicle are the same, the communicating other vehicle is a first communication device mounted object; the conversion unit 205 therefore converts the first positioning position acquired from the first communication device mounted object by the vehicle-side reception unit 222 into a position relative to the vehicle. The conversion unit 205 may also convert the mounted object reference target position acquired by the vehicle-side reception unit 222 into a position relative to the vehicle.
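One common way to realize such a conversion of latitude/longitude coordinates into a local XY frame centred on the vehicle is an equirectangular approximation, which is adequate over the short ranges of road-to-vehicle communication. The sketch below is an illustrative assumption, not the patent's specified method; axis conventions (x = east, y = north) and the Earth-radius constant are assumptions.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def to_vehicle_frame(lat_deg, lon_deg, own_lat_deg, own_lon_deg):
    """Convert an absolute latitude/longitude into (x east, y north)
    metres in an XY frame whose origin is the vehicle's positioning
    position. Equirectangular (flat-Earth) approximation."""
    dlat = math.radians(lat_deg - own_lat_deg)
    dlon = math.radians(lon_deg - own_lon_deg)
    x = EARTH_RADIUS_M * dlon * math.cos(math.radians(own_lat_deg))
    y = EARTH_RADIUS_M * dlat
    return x, y
```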
When the identity determination unit 204 determines that the roadside unit detected target and the communicating other vehicle are the same, the error estimation unit 206 estimates, from the deviation between the roadside unit reference target position and the positioning position as converted by the conversion unit 205, the positioning error of the communicating other vehicle referenced to the position of the vehicle. Since the communicating other vehicle in this case is a first communication device mounted object, the error estimation unit 206 estimates, from the deviation between the converted roadside unit reference target position and the first positioning position, the positioning error of the first communication device mounted object referenced to the position of the vehicle (hereinafter, the first positioning error). The processing in the error estimation unit 206 corresponds to the error estimation step. The positioning error estimated by the error estimation unit 206 may be the deviations in the X and Y coordinates of the vehicle coordinate system. The error estimation unit 206 stores the estimated first positioning error in the error storage unit 207, which may be either a non-volatile or a volatile memory. The error estimation unit 206 may store the estimated first positioning error in the error storage unit 207 in association with the transmission source identification information of the first communication device mounted object for which the first positioning error was estimated; the transmission source identification information in the target information received by the vehicle-side reception unit 222 may be used for this purpose.
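The estimation step reduces to a per-axis subtraction once both positions are in the vehicle coordinate system. The sketch below assumes a sign convention (error = GNSS fix minus roadside-referenced true position) that the patent does not fix, and uses a plain dictionary keyed by the transmission source identification information as a stand-in for the error storage unit 207.

```python
def estimate_first_positioning_error(ref_pos_xy, fix_pos_xy):
    """First positioning error as per-axis deviations in the vehicle
    coordinate system: how far the first communication device mounted
    object's reported fix lies from the roadside unit reference target
    position (both already converted to the vehicle frame)."""
    ex = fix_pos_xy[0] - ref_pos_xy[0]
    ey = fix_pos_xy[1] - ref_pos_xy[1]
    return ex, ey

# stand-in for the error storage unit 207, keyed by sender ID
error_store = {}
error_store["vehicle_b"] = estimate_first_positioning_error((10.0, 20.0), (11.5, 19.0))
```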
 The position specifying unit 208 has a mounted object position specifying unit 281 and a target position specifying unit 282. After the error estimation unit 206 has once estimated the first positioning error, the mounted object position specifying unit 281 preferably corrects the first positioning position acquired from the first communication device-mounted object by the vehicle-side receiving unit 222 by the amount of that first positioning error to specify the position of the first communication device-mounted object relative to the own vehicle, even when the first communication device-mounted object can no longer be detected by the surroundings monitoring sensor 302 of the roadside unit 3. When the identity determination unit 204 determines that the roadside-unit-detected target and the communicating other vehicle are not the same, and the first positioning error of the first communication device-mounted object is already stored in the error storage unit 207, the mounted object position specifying unit 281 may correct the first positioning position acquired by the vehicle-side receiving unit 222 by the amount of that first positioning error to specify the position of the first communication device-mounted object relative to the own vehicle. The correction may be performed by shifting the first positioning position so as to eliminate the deviation corresponding to the first positioning error. According to this, even when the first communication device-mounted object cannot be detected by either the surroundings monitoring sensor 302 of the roadside unit 3 or the surroundings monitoring sensor 40 of the own vehicle, the relative position of the first communication device-mounted object with respect to the own vehicle can be specified with higher accuracy. The mounted object position specifying unit 281 may determine whether the first positioning error of the first communication device-mounted object is already stored in the error storage unit 207 based on whether transmission source identification information linked to a first positioning error exists in the error storage unit 207.
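As an informal aid to the description above, the error-based correction can be sketched as follows. This is a minimal illustrative sketch, not part of the embodiment; all function names, the 2-D own-vehicle-relative coordinate convention, and the numeric values are assumptions.

```python
# Illustrative sketch of the positioning-error correction described above.
# Positions are hypothetical (x, y) tuples in an own-vehicle-relative frame.

def estimate_positioning_error(reference_pos, reported_pos):
    """Error = deviation of the self-reported position from the
    higher-accuracy reference position (e.g. roadside-unit-based)."""
    return (reported_pos[0] - reference_pos[0],
            reported_pos[1] - reference_pos[1])

def correct_position(reported_pos, error):
    """Shift a reported position so that the stored deviation is
    eliminated, as described for the mounted object position
    specifying unit 281."""
    return (reported_pos[0] - error[0], reported_pos[1] - error[1])

# Error storage keyed by transmission source identification information.
error_store = {}
error_store["vehicle-A"] = estimate_positioning_error((10.0, 5.0), (11.5, 4.0))

# Later, a new self-reported position from the same sender is corrected
# even though the sender is outside every sensor's detection range:
corrected = correct_position((20.5, 8.0), error_store["vehicle-A"])
```

Keying the store by the transmission source identification information is what lets the correction be reused for later messages from the same sender.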
 The mounted object position specifying unit 281 stores the specified position of the first communication device-mounted object relative to the own vehicle in the position storage unit 209. The position storage unit 209 may be a volatile memory. Note that the position of a first communication device-mounted object that can be detected by the surroundings monitoring sensor 40 of the own vehicle may be stored in the memory of the surroundings monitoring ECU 50 instead of in the position storage unit 209. For a first communication device-mounted object within the detection range of the surroundings monitoring sensor 40, the position detected by the surroundings monitoring sensor 40 of the own vehicle may be used.
 After the error estimation unit 206 has once estimated the first positioning error, the target position specifying unit 282 preferably corrects the mounted object reference target position acquired from the first communication device-mounted object by the vehicle-side receiving unit 222 (hereinafter, the first mounted object reference target position) by the amount of that first positioning error to specify the position of the first target relative to the own vehicle. As described above, the first target is a target detected by the surroundings monitoring sensor 40 of the first communication device-mounted object. The correction may be performed by shifting the first mounted object reference target position so as to eliminate the deviation corresponding to the first positioning error. According to this, the relative position of the first target with respect to the own vehicle can be specified with higher accuracy even from the first mounted object reference target position that the vehicle-side receiving unit 222 acquires from the first communication device-mounted object via wireless communication. The target position specifying unit 282 stores the specified position of the first target relative to the own vehicle in the position storage unit 209.
 When the target position specifying unit 282 has specified the position of the first target relative to the own vehicle by correcting it using the first positioning error, and the vehicle-side receiving unit 222 has acquired a mounted object reference target position from a second communication device-mounted object (hereinafter, the second mounted object reference target position), the identity determination unit 204 preferably operates as follows. The identity determination unit 204 preferably determines whether the first target, whose position relative to the own vehicle was specified by the target position specifying unit 282, and the target detected by the surroundings monitoring sensor 40 of the second communication device-mounted object that transmitted the second mounted object reference target position acquired by the vehicle-side receiving unit 222 (hereinafter, the second target) are the same.
 In this case, the identity determination unit 204 determines whether the first target and the second target are the same based on whether the target information acquired from the first communication device-mounted object by the vehicle-side receiving unit 222 and the target information acquired from the second communication device-mounted object by the vehicle-side receiving unit 222 approximate each other. The target information compared for this identity determination may be, for example, the type of the target, the speed of the target, and the moving direction of the target. For example, the identity determination unit 204 may determine that the targets are the same when all of the following conditions are satisfied: the types match, the speed difference is less than a threshold, and the moving direction difference is less than a threshold. Conversely, it may determine that the targets are not the same when even one of these conditions is not satisfied. Note that the conditions to be satisfied for the identity determination unit 204 to determine identity may be a subset of the type match, the speed difference being less than a threshold, and the moving direction difference being less than a threshold. The identity determination unit 204 may also determine identity based on whether the first mounted object reference target position and the second mounted object reference target position, each converted by the conversion unit 205 into a relative position with respect to the own vehicle, approximate each other. The identity determination unit 204 may make this determination on the condition that the difference between the converted positions is less than a threshold. The identity determination unit 204 may use the condition that the difference between the converted positions is less than a threshold in addition to the conditions that the types match, the speed difference is less than a threshold, and the moving direction difference is less than a threshold.
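The threshold-based identity determination described above can be sketched informally as follows. The field names, threshold values, and dictionary representation of the target information are hypothetical assumptions made only for illustration.

```python
# Illustrative sketch of the identity determination by the unit 204.
# Thresholds and target-info fields are hypothetical placeholders.

SPEED_THRESHOLD = 2.0      # m/s, assumed value
HEADING_THRESHOLD = 15.0   # degrees, assumed value
POSITION_THRESHOLD = 3.0   # m, assumed value

def is_same_target(info_a, info_b):
    """Targets are judged the same only if the type matches and the
    speed, moving-direction, and converted-position differences are
    all below their thresholds; failing any one condition means the
    targets are judged not the same."""
    if info_a["type"] != info_b["type"]:
        return False
    if abs(info_a["speed"] - info_b["speed"]) >= SPEED_THRESHOLD:
        return False
    if abs(info_a["heading"] - info_b["heading"]) >= HEADING_THRESHOLD:
        return False
    # rel_pos: positions already converted to own-vehicle-relative frame.
    dx = info_a["rel_pos"][0] - info_b["rel_pos"][0]
    dy = info_a["rel_pos"][1] - info_b["rel_pos"][1]
    return (dx * dx + dy * dy) ** 0.5 < POSITION_THRESHOLD

a = {"type": "vehicle", "speed": 10.0, "heading": 90.0, "rel_pos": (5.0, 2.0)}
b = {"type": "vehicle", "speed": 11.0, "heading": 85.0, "rel_pos": (6.0, 2.5)}
```

As the description notes, a subset of these conditions may also be used; the sketch shows the conjunctive case.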
 When the identity determination unit 204 determines that the first target and the second target are the same, the error estimation unit 206 may estimate the positioning error of the second communication device-mounted object with the position of the own vehicle as a reference (hereinafter, the second positioning error) from the deviation between the second mounted object reference target position converted by the conversion unit 205 and the position of the first target relative to the own vehicle specified by the target position specifying unit 282. According to this, the relative position of the second communication device-mounted object with respect to the own vehicle can be specified with higher accuracy even when the second communication device-mounted object has never been detected by either the surroundings monitoring sensor 302 of the roadside unit 3 or the surroundings monitoring sensor 40 of the own vehicle. The error estimation unit 206 stores the estimated second positioning error in the error storage unit 207. The error estimation unit 206 may store the estimated second positioning error in the error storage unit 207 in association with the transmission source identification information of the second communication device-mounted object for which that second positioning error was estimated.
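The chained estimation just described, in which a first-target position already corrected by the first positioning error serves as the reference for estimating the second positioning error, can be sketched as follows. This is an illustrative sketch only; function names, coordinates, and values are assumptions, not the embodiment's implementation.

```python
# Illustrative sketch: chaining positioning-error estimates.

def estimate_error(reference_pos, reported_pos):
    """Deviation of a reported position from a higher-accuracy reference."""
    return (reported_pos[0] - reference_pos[0],
            reported_pos[1] - reference_pos[1])

def correct(pos, error):
    """Shift a position to cancel the stored deviation."""
    return (pos[0] - error[0], pos[1] - error[1])

# Step 1: first positioning error, against the roadside-unit reference.
first_error = estimate_error(reference_pos=(10.0, 5.0),
                             reported_pos=(11.0, 4.5))

# Step 2: correct the first sender's reported target position; this
# corrected first-target position now serves as the new reference.
first_target = correct((30.0, 12.0), first_error)

# Step 3: second positioning error, from the deviation between the
# second sender's reported position for the same target and that reference.
second_error = estimate_error(reference_pos=first_target,
                              reported_pos=(30.5, 10.5))
```

The design point illustrated is that a sender never seen by any sensor can still have its error estimated, because the corrected first-target position stands in for a ground-truth observation.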
 After the error estimation unit 206 has once estimated the second positioning error, the mounted object position specifying unit 281 corrects the second positioning position acquired from the second communication device-mounted object by the vehicle-side receiving unit 222 by the amount of that second positioning error to specify the position of the second communication device-mounted object relative to the own vehicle. According to this, even when the second communication device-mounted object cannot be detected by either the surroundings monitoring sensor 302 of the roadside unit 3 or the surroundings monitoring sensor 40 of the own vehicle, the relative position of the second communication device-mounted object with respect to the own vehicle can be specified with higher accuracy.
 The mounted object position specifying unit 281 stores the specified position of the second communication device-mounted object relative to the own vehicle in the position storage unit 209. The position storage unit 209 may be a volatile memory. Note that the position of a second communication device-mounted object that can be detected by the surroundings monitoring sensor 40 of the own vehicle may be stored in the memory of the surroundings monitoring ECU 50 instead of in the position storage unit 209. For a second communication device-mounted object within the detection range of the surroundings monitoring sensor 40, the position detected by the surroundings monitoring sensor 40 of the own vehicle may be used.
 After the error estimation unit 206 has once estimated the second positioning error, the target position specifying unit 282 preferably corrects the mounted object reference target position acquired from the second communication device-mounted object by the vehicle-side receiving unit 222 (hereinafter, the second mounted object reference target position) by the amount of that second positioning error to specify the position of the second target relative to the own vehicle. As described above, the second target is a target detected by the surroundings monitoring sensor 40 of the second communication device-mounted object. The correction may be performed by shifting the second mounted object reference target position so as to eliminate the deviation corresponding to the second positioning error. According to this, the relative position of the second target with respect to the own vehicle can be specified with higher accuracy even from the second mounted object reference target position that the vehicle-side receiving unit 222 acquires from the second communication device-mounted object via wireless communication. The target position specifying unit 282 stores the specified position of the second target relative to the own vehicle in the position storage unit 209.
 The driving support ECU 60 uses the positions of targets, including communication device-mounted objects, stored in the position storage unit 209 to perform driving support such as vehicle control for avoiding approaching a target and alerting the driver so as to avoid approaching a target. Therefore, even for a target out of the line of sight of the own vehicle, more accurate driving support can be performed using the more accurately specified relative position with respect to the own vehicle.
 <Positioning Error Estimation Related Processing in Communication Device 20>
 Here, an example of the flow of the processing related to the estimation of mounted object positioning errors in the communication device 20 (hereinafter, positioning error estimation related processing) will be described using the flowchart of FIG. 5. Execution of this processing corresponds to implementation of the error estimation method. The flowchart of FIG. 5 may be configured to start, for example, when a switch for starting the internal combustion engine or motor generator of the own vehicle (hereinafter, the power switch) is turned on.
 First, in step S1, when the road-vehicle-side receiving unit 231 has received target information transmitted from the roadside unit 3 through road-to-vehicle communication (YES in S1), the process proceeds to step S2. On the other hand, when no target information transmitted through road-to-vehicle communication has been received (NO in S1), the process proceeds to step S8.
 In step S2, when the vehicle-side receiving unit 222 has received target information transmitted from another vehicle through vehicle-to-vehicle communication (YES in S2), the process proceeds to step S3. On the other hand, when no target information has been received through vehicle-to-vehicle communication (NO in S2), the process proceeds to step S13. In S2, the target information transmitted from another vehicle through vehicle-to-vehicle communication may be regarded as received when target information is received from another vehicle within a certain time after the target information is received from the roadside unit 3 in S1. The certain time here may be, for example, a time equal to or shorter than the transmission period of the target information from the roadside unit 3, and can be set arbitrarily.
 In step S3, the identity determination unit 204 determines whether the roadside-unit-detected target, which is the target detected by the surroundings monitoring sensor 302 of the roadside unit 3 from which the target information was acquired in S1, and the communicating other vehicle, which is the other vehicle that transmitted the target information acquired in S2, are the same. In step S4, when it is determined that the roadside-unit-detected target and the communicating other vehicle are the same (YES in S4), the process proceeds to step S5. On the other hand, when it is determined that they are not the same (NO in S4), the process proceeds to step S8. A communicating other vehicle determined to be the same as the roadside-unit-detected target corresponds to the first communication device-mounted object.
 In step S5, the conversion unit 205 converts the roadside unit reference target position in the target information received in S1 and the positioning position in the target information received in S2 into relative positions with respect to the own vehicle. Note that the processing of S5 may be performed before the processing of S3. In this case, in S3, whether the roadside-unit-detected target and the communicating other vehicle are the same may be determined using the roadside unit reference target position and the positioning position converted in S5.
 In step S6, the error estimation unit 206 estimates the first positioning error, which is the positioning error of the first communication device-mounted object with the position of the own vehicle as a reference, from the deviation between the roadside unit reference target position and the positioning position converted in S5 for the target determined to be the same in S4. In step S7, the first positioning error estimated in S6 is stored in the error storage unit 207 in association with the transmission source identification information of the first communication device-mounted object for which that first positioning error was estimated.
 In step S8, when the vehicle-side receiving unit 222 has received target information transmitted from another vehicle through vehicle-to-vehicle communication (YES in S8), the process proceeds to step S9. On the other hand, when no target information has been received through vehicle-to-vehicle communication (NO in S8), the process proceeds to step S13.
 In step S9, when a positioning error is already stored in the error storage unit 207 for the other vehicle that transmitted the target information received in S2 or S8 (YES in S9), the process proceeds to step S11. On the other hand, when no positioning error is stored in the error storage unit 207 (NO in S9), the process proceeds to step S10. The positioning error here includes the first positioning error and the second positioning error described above.
 In step S10, second estimation related processing is performed, and then the process proceeds to step S13. Here, an example of the flow of the second estimation related processing will be described using the flowchart of FIG. 6.
 First, in step S101, when the vehicle-side receiving unit 222 has received target information transmitted through vehicle-to-vehicle communication from an other vehicle different from the one received from in S2 or S8 (YES in S101), the process proceeds to step S102. On the other hand, when no target information transmitted through vehicle-to-vehicle communication from a different other vehicle has been received (NO in S101), the process proceeds to step S13.
 In S101, the target information transmitted from a different other vehicle through vehicle-to-vehicle communication may be regarded as received when target information is received, within a certain time after the target information is received from an other vehicle in S2 or S8, from an other vehicle different from that one. This different other vehicle corresponds to the second communication device-mounted object, and the mounted object reference target position included in this target information corresponds to the second mounted object reference target position. The certain time here may be a time that can be set arbitrarily. The vehicles may be distinguished by the identity determination unit 204 based on the transmission source identification information included in the target information.
 In step S102, when the target position specifying unit 282 has already specified the position of the first target relative to the own vehicle by correcting it using the first positioning error (YES in S102), the process proceeds to step S103. On the other hand, when it has not been specified (NO in S102), the process proceeds to step S13.
 In step S103, the identity determination unit 204 determines whether the first target, whose position relative to the own vehicle was specified by the target position specifying unit 282, and the second target, which is the target detected by the surroundings monitoring sensor 40 of the second communication device-mounted object that transmitted the target information received in S101, are the same. In step S104, when it is determined that the first target and the second target are the same (YES in S104), the process proceeds to step S105. On the other hand, when it is determined that they are not the same (NO in S104), the process proceeds to step S13.
 In step S105, the second mounted object reference target position in the target information received in S101 is converted into a relative position with respect to the own vehicle. Note that the processing of S105 may be performed before the processing of S103. In this case, in S103, whether the first target and the second target are the same may be determined using the position of the first target relative to the own vehicle already specified by the target position specifying unit 282 and the second mounted object reference target position converted in S105.
 In step S106, the error estimation unit 206 estimates the second positioning error, which is the positioning error of the second communication device-mounted object with the position of the own vehicle as a reference, from the deviation between the second mounted object reference target position converted in S105 and the position of the first target relative to the own vehicle already specified by the target position specifying unit 282, for the target determined to be the same in S104. In step S107, the second positioning error estimated in S106 is stored in the error storage unit 207 in association with the transmission source identification information of the second communication device-mounted object for which that second positioning error was estimated, and the process proceeds to step S13.
 Returning to FIG. 5, in step S11, the position specifying unit 208 corrects at least one of the positioning position and the mounted object reference target position received in S2 or S8 by the amount of the positioning error stored in the error storage unit 207 to specify the position relative to the own vehicle. For a first positioning position acquired from the first communication device-mounted object, the mounted object position specifying unit 281 corrects the first positioning position by the amount of the first positioning error to specify the position of the first communication device-mounted object relative to the own vehicle. For a first mounted object reference target position acquired from the first communication device-mounted object, the target position specifying unit 282 corrects the first mounted object reference target position by the amount of the first positioning error to specify the position of the first target relative to the own vehicle. For a second positioning position acquired from the second communication device-mounted object, the mounted object position specifying unit 281 corrects the second positioning position by the amount of the second positioning error to specify the position of the second communication device-mounted object relative to the own vehicle. For a second mounted object reference target position acquired from the second communication device-mounted object, the target position specifying unit 282 corrects the second mounted object reference target position by the amount of the second positioning error to specify the position of the second target relative to the own vehicle.
 In S11, the correction may be performed either after or before the mounted object reference target position is converted by the conversion unit 205. In step S12, the position of the target specified in S11 is stored in the position storage unit 209.
 In step S13, when it is time to end the positioning error estimation related processing (YES in S13), the positioning error estimation related processing ends. On the other hand, when it is not time to end the positioning error estimation related processing (NO in S13), the process returns to S1 and is repeated. An example of the end timing of the positioning error estimation related processing is the power switch being turned off.
 Note that even when the targets are determined to be the same in S4, if the first positioning error is already stored in the error storage unit 207 for the first communication device-mounted object that transmitted the target information acquired in S2, the processing of S5 to S7 may be omitted and the process may proceed to S13.
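The branching in FIG. 5 can be condensed into the following informal sketch. It is a heavily simplified illustration, not the embodiment's processing: the message dictionaries, the simplified identity test, and the returned action labels are all hypothetical stand-ins.

```python
# Illustrative condensed sketch of one pass of the flow of FIG. 5.

def positioning_error_step(roadside_msg, v2v_msg, error_store):
    """Mirrors the branches at S1/S2, S3/S4, S8, and S9; returns a
    label naming the action taken on this pass."""
    if roadside_msg is not None and v2v_msg is not None:          # S1, S2: both received
        # S3/S4 (greatly simplified identity determination):
        if roadside_msg["target_type"] == v2v_msg["sender_type"]:
            # S5-S7: estimate the first positioning error and store it,
            # keyed by the sender's transmission source identification info.
            ref = roadside_msg["target_pos"]
            rep = v2v_msg["positioning_pos"]
            error_store[v2v_msg["sender_id"]] = (rep[0] - ref[0], rep[1] - ref[1])
            return "stored_first_error"
    if v2v_msg is not None:                                       # S8: V2V only
        if v2v_msg["sender_id"] in error_store:                   # S9 YES
            return "correct_with_stored_error"                    # S11-S12
        return "second_estimation_processing"                     # S10 (FIG. 6)
    return "wait"                                                 # NO branches -> S13/S1

store = {}
m_r = {"target_type": "vehicle", "target_pos": (10.0, 5.0)}
m_v = {"sender_type": "vehicle", "sender_id": "A", "positioning_pos": (11.0, 4.0)}
```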
 <Summary of Embodiment 1>
 According to the configuration of the first embodiment, the first positioning position is determined by positioning using signals from positioning satellites at the first communication device-mounted object, and therefore includes the error of that positioning. On the other hand, the roadside unit reference target position is the position, with the position of the roadside unit 3 as a reference, of a target detected by the surroundings monitoring sensor 302 of the roadside unit 3, which holds information on its own absolute position with higher positional accuracy than that obtained by positioning using signals from positioning satellites; its positional accuracy is therefore higher than that obtained by satellite positioning. Accordingly, for the roadside-unit-detected target and the first communication device-mounted object determined to be the same, the first positioning error, which is the positioning error of the first communication device-mounted object with the position of the own vehicle as a reference, can be estimated with higher accuracy from the deviation between the roadside unit reference target position acquired from the roadside unit 3 and the first positioning position acquired from the first communication device-mounted object. Here, since both the roadside unit reference target position and the first positioning position are acquired via wireless communication, the first positioning error can be estimated even when the first communication device-mounted object cannot be detected by the surroundings monitoring sensor 40 of the own vehicle. Furthermore, by using the first positioning error, the positioning error of the first communication device-mounted object can be corrected, and the position of the first communication device-mounted object can be specified with higher accuracy. As a result, the position of a communication device-mounted object outside the detection range of the surroundings monitoring sensor 40 of the own vehicle can be specified with higher accuracy.
 Further, according to the configuration of the first embodiment, even for a target outside the detection range of the surroundings monitoring sensor 40 of the own vehicle, if the target is within the detection range of the surroundings monitoring sensor 40 of the first communication device-mounted object, its position relative to the own vehicle can be specified with higher accuracy by correcting the first mounted object reference target position for that target, acquired from the first communication device-mounted object through vehicle-to-vehicle communication, by the amount of the first positioning error.
In addition, according to the configuration of the first embodiment, the positioning error of the second communication device-mounted object, which is another vehicle outside the detection range of the perimeter monitoring sensor 302 of the roadside unit 3, can also be estimated more accurately. Since the second mounted object reference target position is a position relative to the second positioning position of the second communication device-mounted object, it contains the positioning error of the second communication device-mounted object. On the other hand, the position of the first target relative to the own vehicle, specified by the target position specifying unit 282 by correcting the first mounted object reference target position by the amount of the first positioning error, has its positioning error reduced by that correction. Therefore, for a first target and a second target determined to be the same, the second positioning error, which is the error of the positioning performed by the second communication device-mounted object relative to the position of the own vehicle, can be estimated more accurately from the deviation between the corrected position of the first target relative to the own vehicle and the position obtained by converting the second mounted object reference target position, acquired from the second communication device-mounted object, into a position relative to the own vehicle.
Also, according to the above configuration, even for a target outside the detection ranges of the perimeter monitoring sensors 40 of the own vehicle and the first communication device-mounted object, if the target is within the detection range of the perimeter monitoring sensor 40 of the second communication device-mounted object, the position of that target relative to the own vehicle can be specified more accurately by correcting the second mounted object reference target position, acquired for that target from the second communication device-mounted object via vehicle-to-vehicle communication, by the amount of the second positioning error.
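The chained estimation described above can be illustrated as follows, again assuming a simple two-dimensional vector model. The names and the subtraction-based correction are assumptions for illustration; they are not taken from the claims.

```python
# Illustrative sketch of the chained estimation described above: the second
# positioning error is derived from a target observed both by the first
# communication device-mounted object (already corrected by the first
# positioning error) and by the second communication device-mounted object.
# Names and the simple vector model are assumptions for illustration only.

def subtract(a, b):
    return (a[0] - b[0], a[1] - b[1])

def estimate_second_positioning_error(corrected_first_target_pos,
                                      second_mount_ref_target_pos):
    """Deviation between the corrected position of the first target and the
    same target's position reported by the second communication
    device-mounted object, both expressed relative to the own vehicle."""
    return subtract(second_mount_ref_target_pos, corrected_first_target_pos)

def correct_second_mount_target(reported_pos, second_error):
    """Correct a target position acquired from the second communication
    device-mounted object by the second positioning error."""
    return subtract(reported_pos, second_error)

# The first mount reports the common target at (20.0, 5.0) after correction;
# the second mount reports the same target at (21.0, 4.5).
e2 = estimate_second_positioning_error((20.0, 5.0), (21.0, 4.5))
# Another target reported only by the second mount is then corrected by e2.
lmb = correct_second_mount_target((30.0, -2.0), e2)
```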
Here, the effects of the first embodiment will be described with reference to FIG. 1. LMa, LMb, and LMc in FIG. 1 are targets such as pedestrians, and are assumed not to be communication device-mounted objects. The target LMa can be detected by the perimeter monitoring sensors 40 of the vehicles VEb and VEc. The target LMb can be detected by the perimeter monitoring sensor 40 of the vehicle VEb, but not by those of the vehicles VEa and VEc. The target LMc can be detected by the perimeter monitoring sensor 40 of the vehicle VEc, but not by those of the vehicles VEa and VEb. The vehicle VEb can be detected neither by the perimeter monitoring sensors 40 of the vehicles VEa and VEc nor by the perimeter monitoring sensor 302 of the roadside unit 3. The vehicle VEc cannot be detected by the perimeter monitoring sensors 40 of the vehicles VEa and VEb, but can be detected by the perimeter monitoring sensor 302 of the roadside unit 3. In the following description, the vehicle VEa is the own vehicle, and the vehicles VEb and VEc are other vehicles. The vehicle VEc corresponds to the first communication device-mounted object, and the vehicle VEb corresponds to the second communication device-mounted object. The target LMa corresponds to both the first target and the second target. The target LMb corresponds to the second target but not to the first target. The target LMc corresponds to the first target but not to the second target.
In the communication device 20 of the own vehicle VEa, the positioning error of the vehicle VEc can be estimated from the positioning position of the vehicle VEc acquired from the vehicle VEc via vehicle-to-vehicle communication and the roadside unit reference target position of the vehicle VEc, detected by the perimeter monitoring sensor 302 of the roadside unit 3 and acquired from the roadside unit 3 via road-to-vehicle communication. This positioning error is the first positioning error. The positioning error in the mounted object reference target positions of the vehicle VEc can be estimated in the same manner.
In the communication device 20 of the own vehicle VEa, even for the target LMc, which is outside the detection range of the perimeter monitoring sensor 40 of the own vehicle VEa, the position of the target LMc relative to the own vehicle VEa can be specified accurately from the first mounted object reference target position of the target LMc acquired from the vehicle VEc via vehicle-to-vehicle communication and the first positioning error of the vehicle VEc.
In the communication device 20 of the own vehicle VEa, even for the vehicle VEb, which is outside the detection range of the perimeter monitoring sensor 302 of the roadside unit 3, the positioning error of the vehicle VEb can be estimated by using the target LMa, which can be detected in common by the perimeter monitoring sensors 40 of the vehicles VEb and VEc. This error is the second positioning error. Specifically, in the communication device 20 of the own vehicle VEa, the position of the vehicle VEb relative to the own vehicle VEa can be specified accurately from the position of the target LMa corrected using the already estimated first positioning error and the second mounted object reference target position of the target LMa acquired from the vehicle VEb via vehicle-to-vehicle communication.
Further, in the communication device 20 of the own vehicle VEa, even for the target LMb, which is outside the detection ranges of the perimeter monitoring sensors 40 of the own vehicle VEa and the vehicle VEc, the position of the target LMb relative to the own vehicle VEa can be specified accurately from the second mounted object reference target position of the target LMb acquired from the vehicle VEb via vehicle-to-vehicle communication and the second positioning error of the vehicle VEb.
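The scenario of FIG. 1 can be walked through numerically as follows. All coordinates are invented for illustration, expressed in the frame of the own vehicle VEa, and the simple subtraction-based model is an assumption, not the claimed implementation.

```python
# End-to-end numeric walkthrough of the FIG. 1 scenario, under a simple
# 2D vector model. All coordinates are invented for illustration and are
# expressed relative to the own vehicle VEa.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1])

# Step 1: first positioning error of VEc, from the roadside unit's
# observation of VEc versus VEc's own satellite fix.
vec_roadside = (40.0, 10.0)           # roadside unit reference position of VEc
vec_selfreport = (41.0, 10.5)         # VEc's first positioning position
e1 = sub(vec_selfreport, vec_roadside)

# Step 2: correct target LMc reported by VEc (outside VEa's sensor range).
lmc = sub((55.0, 12.5), e1)

# Step 3: second positioning error of VEb, via the common target LMa.
lma_via_vec = sub((48.0, 3.0), e1)    # LMa from VEc, corrected by e1
lma_via_veb = (47.5, 2.0)             # LMa as reported by VEb
e2 = sub(lma_via_veb, lma_via_vec)

# Step 4: correct target LMb reported by VEb (outside VEa's and VEc's range).
lmb = sub((60.0, -5.0), e2)
```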
(Embodiment 2)
An overview of the second embodiment will be described with reference to FIG. 7. In the second embodiment, the vehicle unit 2 sequentially transmits two types of messages: the CAM (Cooperative Awareness Message) and the CPM (Cooperative Perception Message). The roadside unit 3 also sequentially transmits CPMs.
When CPMs or CAMs are received from two or more transmission source devices, as in the case of the vehicle unit 2 mounted on the vehicle VEa, the communication device 20 of the vehicle unit 2 performs identity determination, error estimation, and the like, as in the first embodiment, using the information contained in those messages. Before describing the identity determination, error estimation, and other processing in the second embodiment, the CAM and the CPM will be described.
FIG. 8 is a diagram showing an exemplary architecture of a V2X communication device. The V2X communication device is used in place of the communication device 20 of the first embodiment. Like the communication device 20 of the first embodiment, the V2X communication device also has the configuration shown in FIG. 4. The V2X communication device is a communication device that transmits target information.
The V2X communication device may perform communication between vehicles, between a vehicle and infrastructure, between a vehicle and a bicycle, between a vehicle and a mobile terminal, and the like. The V2X communication device may correspond to an on-board device of a vehicle or may be included in an on-board device. The on-board device is sometimes called an OBU (On-Board Unit).
The communication device may correspond to a roadside unit of infrastructure or may be included in a roadside unit. A roadside unit is sometimes called an RSU (Road Side Unit). The communication device can also be an element of an ITS (Intelligent Transport System). When it is an element of an ITS, the communication device may correspond to an ITS station (ITS-S) or may be included in an ITS-S. The ITS-S is a device that exchanges information, and may be any of an OBU, an RSU, and a mobile terminal, or may be included in any of them. The mobile terminal is, for example, a PDA (Personal Digital Assistant) or a smartphone.
The communication device may correspond to a WAVE (Wireless Access in Vehicular Environments) device disclosed in IEEE 1609, or may be included in a WAVE device.
In the present embodiment, the V2X communication device is assumed to be mounted on a vehicle. This V2X communication device has a function of providing a CA (Cooperative Awareness) service and a CP (Collective Perception) service. In the CA service, the V2X communication device transmits CAMs. In the CP service, the V2X communication device transmits CPMs. Note that the same or similar methods disclosed below can be applied even when the communication device is an RSU or a mobile terminal.
The architecture shown in FIG. 8 is based on the ITS-S reference architecture according to the EU standard. The architecture shown in FIG. 8 includes an application layer 110, a facility layer 120, a network & transport layer 140, an access layer 130, a management layer 150, and a security layer 160.
The application layer 110 implements or supports various applications 111. FIG. 8 shows, as examples of the applications 111, a traffic safety application 111a, an efficient traffic information application 111b, and other applications 111c.
The facility layer 120 supports the execution of the various use cases defined in the application layer 110. The facility layer 120 can support the same or similar functions as the upper three layers of the OSI reference model (the application layer, the presentation layer, and the session layer). Here, "facility" means providing functions, information, and data. The facility layer 120 may provide the functions of the V2X communication device. For example, the facility layer 120 may provide the functions of application support 121, information support 122, and communication support 123 shown in FIG. 8.
The application support 121 has a function of supporting a basic application set or message set. An example of a message is a V2X message. V2X messages can include periodic messages such as the CAM and event messages such as the DENM (Decentralized Environmental Notification Message). The facility layer 120 can also support the CPM.
The information support 122 has a function of providing common data or a database used by the basic application set or message set. An example of the database is the local dynamic map (LDM).
The communication support 123 has a function of providing services for communication and session management. The communication support 123 provides, for example, addressing mode and session support.
As described above, the facility layer 120 supports an application set or a message set. That is, the facility layer 120 generates a message set or messages based on the information to be transmitted and the services to be provided by the application layer 110. Messages generated in this way are sometimes called V2X messages.
The access layer 130 includes an external IF (interface) 131 and an internal IF 132, and can transmit messages/data received from the upper layers via physical channels. For example, the access layer 130 can perform or support data communication using the following communication technologies: communication technologies based on the IEEE 802.11 and/or 802.11p standards, the ITS-G5 wireless communication technology based on the physical transmission technology of the IEEE 802.11 and/or 802.11p standards, 2G/3G/4G (LTE)/5G wireless mobile communication technologies including satellite/broadband wireless mobile communication, broadband terrestrial digital broadcasting technologies such as DVB-T/T2/ATC, GNSS communication technology, and WAVE communication technology.
The network & transport layer 140 can configure a network for vehicle communication between homogeneous/heterogeneous networks using various transport protocols and network protocols. The transport layer is a connection layer between the upper layers and the lower layers. The upper layers include the session layer, the presentation layer, and the application layer 110. The lower layers include the network layer, the data link layer, and the physical layer. The transport layer can manage the transmitted data so that it arrives at the destination correctly. At the transmission source, the transport layer processes the data into packets of an appropriate size for efficient data transmission. On the receiving side, the transport layer performs processing for restoring the received packets to the original file. The transport protocol is, for example, TCP (Transmission Control Protocol), UDP (User Datagram Protocol), or BTP (Basic Transport Protocol).
The network layer can manage logical addresses. The network layer may also determine the delivery route of packets. The network layer may receive packets generated by the transport layer and add the logical address of the destination to the network layer header. Unicast/multicast/broadcast between vehicles, between a vehicle and a fixed station, and between fixed stations may be considered for the packet transmission route. As the network protocol, geo-networking, or IPv6 networking with mobility support or over geo-networking, may be considered.
As shown in FIG. 8, the architecture of the V2X communication device may further include a management layer 150 and a security layer 160. The management layer 150 manages the transfer of data and the interactions between layers. The management layer 150 includes a management information base 151, regulatory management 152, cross-layer management 153, station management 154, and application management 155. The security layer 160 manages the security of all layers. The security layer 160 includes firewall and intrusion detection management 161; authentication, authorization, and profile management 162; and a security management information base 163.
FIG. 9 illustrates a V2X message. A V2X message can also be called an ITS message. A V2X message can be generated in the application layer 110 or the facility layer 120. Specific examples of V2X messages are the CAM, the DENM, and the CPM.
The transport layer in the network & transport layer 140 generates a BTP packet. The network layer in the network & transport layer 140 can encapsulate the BTP packet to generate a geo-networking packet. The geo-networking packet is encapsulated in an LLC (Logical Link Control) packet. In FIG. 9, the data may include a message set. The message set is, for example, a basic safety message.
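The encapsulation chain above can be sketched as follows. The header byte layouts here are simplified placeholders for illustration, not the normative encodings of the standards; the use of port 2001 for the CAM is also an assumption.

```python
# Hedged sketch of the encapsulation chain of FIG. 9: a facility layer
# message (e.g. a CAM) is wrapped in a BTP packet, then in a geo-networking
# packet, then in an LLC packet. Header contents are simplified
# placeholders, not the exact field layouts of the standards.

def btp_packet(payload: bytes, dest_port: int) -> bytes:
    # BTP-B style header: destination port + destination port info (16 bits each)
    return dest_port.to_bytes(2, "big") + (0).to_bytes(2, "big") + payload

def geonetworking_packet(btp: bytes) -> bytes:
    basic_header = b"\x11\x00\x00\x00"    # placeholder basic header (4 bytes)
    common_header = b"\x00" * 8           # placeholder common header (8 bytes)
    return basic_header + common_header + btp

def llc_packet(gn: bytes) -> bytes:
    # SNAP Ethertype 0x86DC marks geo-networking data (0x86DD would mark IPv6)
    ethertype = (0x86DC).to_bytes(2, "big")
    snap_header = b"\xaa\xaa\x03\x00\x00\x00" + ethertype  # DSAP, SSAP, Control, OUI, Ethertype
    return snap_header + gn

frame = llc_packet(geonetworking_packet(btp_packet(b"CAM-payload", dest_port=2001)))
```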
The BTP is a protocol for transmitting V2X messages generated in the facility layer 120 to the lower layers. The BTP header has an A type and a B type. The A-type BTP header may include the destination port and the source port required for transmission and reception in bidirectional packet transmission. The B-type BTP header can include the destination port and destination port information required for transmission in non-bidirectional packet transmission.
The fields included in the BTP header will now be described. The destination port identifies the facility entity corresponding to the destination of the data (BTP-PDU) included in the BTP packet. The BTP-PDU is a unit of transmission data in the BTP.
The source port is a field generated in the case of the BTP-A type. The source port indicates the port of the protocol entity of the facility layer 120 at the transmission source of the corresponding packet. This field can have a size of 16 bits.
The destination port information is a field generated in the case of the BTP-B type. It provides additional information when the destination port is a well-known port. This field can have a size of 16 bits.
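The two header types described above can be represented as follows. The field sizes (16-bit ports) follow the description; the big-endian byte encoding and the class names are assumptions for illustration.

```python
# Illustrative encoding of the two BTP header types described above.
# Field sizes follow the description (16-bit ports); the byte layout is a
# sketch, not a normative encoding.

from dataclasses import dataclass

@dataclass
class BtpAHeader:
    destination_port: int  # facility entity addressed by the BTP-PDU
    source_port: int       # facility-layer entity at the transmission source

    def encode(self) -> bytes:
        return (self.destination_port.to_bytes(2, "big")
                + self.source_port.to_bytes(2, "big"))

@dataclass
class BtpBHeader:
    destination_port: int
    destination_port_info: int  # extra information for well-known ports

    def encode(self) -> bytes:
        return (self.destination_port.to_bytes(2, "big")
                + self.destination_port_info.to_bytes(2, "big"))

hdr = BtpBHeader(destination_port=2001, destination_port_info=0)
```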
The geo-networking packet includes a basic header and a common header conforming to the network layer protocol, and selectively includes an extension header according to the geo-networking mode.
The LLC packet is a geo-networking packet to which an LLC header is added. The LLC header provides a function of distinguishing between IP data and geo-networking data for transmission. IP data and geo-networking data can be distinguished by the Ethertype of the SNAP (Subnetwork Access Protocol).
When IP data is transmitted, the Ethertype is set to 0x86DD and may be included in the LLC header. When geo-networking data is transmitted, the Ethertype is set to 0x86DC and may be included in the LLC header. The receiver can check the Ethertype field of the LLC packet header and, depending on its value, forward the packet to the IP data path or the geo-networking path for processing.
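The receiver-side dispatch described above amounts to a simple branch on the Ethertype value; the handler names in this sketch are placeholders.

```python
# Receiver-side dispatch on the LLC/SNAP Ethertype, as described above:
# 0x86DD routes the packet to the IP data path, 0x86DC to the
# geo-networking path. The path names are placeholders.

ETHERTYPE_IPV6 = 0x86DD
ETHERTYPE_GEONETWORKING = 0x86DC

def dispatch(ethertype: int) -> str:
    if ethertype == ETHERTYPE_IPV6:
        return "ip-data-path"
    if ethertype == ETHERTYPE_GEONETWORKING:
        return "geonetworking-path"
    return "drop"
```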
The LLC header includes a DSAP (Destination Service Access Point) and an SSAP (Source Service Access Point). In the LLC header, the SSAP is followed by a control field (Control in FIG. 9), a protocol ID, and the Ethertype.
FIG. 10 shows the logical interfaces between the CA (Cooperative Awareness) service 128 and the other layers in the architecture of the V2X communication device.
The V2X communication device may provide various services for traffic safety and efficiency. One of the services may be the CA service 128. Cooperative awareness in road traffic means that road users and roadside infrastructure can know each other's positions, dynamics, and attributes. Road users are all users on and around the road that are subject to traffic safety and control, such as automobiles, trucks, motorcycles, bicycles, and pedestrians, and roadside infrastructure refers to equipment such as road signs, traffic lights, barriers, and entrances.
Recognizing each other is the basis of applications such as traffic safety and traffic efficiency. Mutual awareness can be achieved by the periodic exchange of information between road users, such as vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), infrastructure-to-vehicle (I2V), and everything-to-everything (X2X), based on a wireless network called a V2X network.
Applications for cooperative safe driving and traffic efficiency require advanced situational awareness, including the presence and behavior of road users around the V2X communication device. For example, the V2X communication device can achieve situational awareness through its own sensors and through communication with other V2X communication devices. In this case, the CA service can specify the method by which the V2X communication device notifies its own position, behavior, and attributes by transmitting CAMs.
As shown in FIG. 10, the CA service 128 may be an entity of the facility layer 120. For example, the CA service 128 may be part of the application support domain of the facility layer 120.
The CA service 128 applies the CAM protocol and can provide two services: CAM transmission and CAM reception. The CA service 128 can also be called the CAM basic service.
In CAM transmission, the transmission source ITS-S constructs a CAM. The CAM is passed to the network & transport layer 140 for transmission. The CAM may be transmitted directly from the transmission source ITS-S to all ITS-Ss within the communication range. The communication range varies depending on the transmission power of the transmission source ITS-S. CAMs are generated periodically at a frequency controlled by the CA service 128 of the transmission source ITS-S. The generation frequency is determined in consideration of state changes of the transmission source ITS-S. The states to be considered are, for example, changes in the position or speed of the transmission source ITS-S and the load on the radio channel.
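The frequency control described above can be illustrated as follows. The concrete thresholds (4 m position change, 0.5 m/s speed change, 4 degrees heading change, 0.1 s to 1 s interval bounds) are assumptions borrowed from the publicly documented ETSI CA basic service generation rules, not values stated in this description.

```python
# Hedged sketch of CAM generation triggering: a new CAM is generated when
# the station's state has changed enough since the last CAM, or when the
# maximum interval has elapsed. Thresholds are assumptions borrowed from
# the public ETSI CA basic service rules, not from this embodiment.

T_GEN_CAM_MIN = 0.1   # s, shortest allowed interval (limits channel load)
T_GEN_CAM_MAX = 1.0   # s, longest allowed interval (periodic keep-alive)

def should_generate_cam(elapsed_s, d_position_m, d_speed_mps, d_heading_deg):
    """Return True if a new CAM should be generated, given the time since
    the last CAM and the state changes since the last CAM."""
    if elapsed_s < T_GEN_CAM_MIN:
        return False                      # too soon: respect the rate limit
    if elapsed_s >= T_GEN_CAM_MAX:
        return True                       # maximum interval elapsed
    # dynamics-based triggers
    return (d_position_m > 4.0
            or d_speed_mps > 0.5
            or d_heading_deg > 4.0)
```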
Upon receiving a CAM, the CA service 128 makes the contents of the CAM available to entities such as the applications 111 and/or the LDM 127.
The CAM is transmitted by all ITS-Ss involved in road traffic. The CA service 128 interfaces with entities of the facility layer 120 and the application layer 110 in order to collect relevant information for CAM generation and to further process received CAM data.
In a vehicle, examples of entities for ITS-S data collection are a VDP (Vehicle Data Provider) 125, a POTI (position and time) unit 126, and an LDM 127. The VDP 125 is connected to the vehicle network and provides vehicle state information. The POTI unit 126 provides the position and time information of the ITS-S. The LDM 127 is a database in the ITS-S, as described in ETSI TR 102 863, and can be updated with received CAM data. The applications 111 can acquire information from the LDM 127 and perform further processing.
The CA service 128 connects to the network & transport layer 140 via the NF-SAP (Network & Transport/Facilities Service Access Point) in order to exchange CAMs with other ITS-Ss. The CA service 128 connects to the security layer 160 via the SF-SAP (Security Facilities Service Access Point) for security services for CAM transmission and CAM reception. When providing received CAM data directly to the applications 111, the CA service 128 connects to the management layer 150 via the MF-SAP (Management/Facilities Service Access Point) and to the application layer 110 via the FA-SAP (Facilities/Applications Service Access Point).
FIG. 11 shows the functional blocks of the CA service 128 and its interfaces to other functions and layers. The CA service 128 provides four sub-functions: a CAM encoding unit 1281, a CAM decoding unit 1282, a CAM transmission management unit 1283, and a CAM reception management unit 1284.
The CAM encoding unit 1281 constructs and encodes a CAM according to a predefined format. The CAM decoding unit 1282 decodes received CAMs. The CAM transmission management unit 1283 executes the protocol operations of the transmission source ITS-S in order to transmit CAMs. The CAM transmission management unit 1283 executes, for example, activation and termination of the CAM transmission operation, determination of the CAM generation frequency, and triggering of CAM generation. The CAM reception management unit 1284 executes the protocol operations specified for the receiving ITS-S in order to receive CAMs. For example, the CAM reception management unit 1284 activates the CAM decoding function when a CAM is received. The CAM reception management unit 1284 also provides the received CAM data to the applications 111 of the receiving ITS-S. The CAM reception management unit 1284 may check the information of received CAMs.
In order to provide received data, the CA service 128 provides an interface IF.CAM, as shown in FIG. 11. The interface IF.CAM is an interface to the LDM 127 or the applications 111. The interface to the application layer 110 may be implemented as an API (Application Programming Interface), and data may be exchanged between the CA service 128 and the applications 111 via this API. The interface to the application layer 110 can also be implemented as the FA-SAP.
The CA service 128 interacts with the other entities of the facility layer 120 in order to acquire the data necessary for CAM generation. The set of entities that provide data for the CAM are data providing entities, or data providing facilities. Data is exchanged between the data providing entities and the CA service 128 via an interface IF.FAC.
 The CA service 128 exchanges information with the network & transport layer 140 via the interface IF.N&T. At the source ITS-S, the CA service 128 provides the CAM, together with protocol control information (PCI), to the network & transport layer 140. The PCI is control information according to ETSI EN 302 636-5-1. The CAM is embedded in a facility-layer service data unit (FL-SDU).
 At the receiving ITS-S, the network & transport layer 140 can provide the received CAM to the CA service 128.
 The interface between the CA service 128 and the network & transport layer 140 depends on the services of the GeoNetworking/BTP stack, or on an IPv6 stack or a combined IPv6/GeoNetworking stack.
 The CAM may rely on services provided by the GeoNetworking (GN)/BTP stack. When this stack is used, the GN packet transport type is single-hop broadcast (SHB). In this case, only nodes within direct communication range can receive the CAM.
 The PCI passed from the CA service 128 to the GeoNetworking/BTP stack may include the BTP type, destination port, destination port information, GN packet transport type, GN communication profile, and GN security profile. The PCI may also include the GN traffic class and the GN maximum packet lifetime.
 As specified in ETSI TS 102 636-3, an IPv6 stack or a combined IPv6/GeoNetworking stack can be used to transmit CAMs. When the combined IPv6/GeoNetworking stack is used for CAM transmission, the interface between the CA service 128 and the combined IPv6/GeoNetworking stack may be the same as the interface between the CA service 128 and the IPv6 stack.
 The CA service 128 can exchange primitives with the management entity of the management layer 150 via the MF-SAP. A primitive is an element of exchanged information, such as a command. In the case of the ITS-G5 access layer 130, the source ITS-S obtains the configuration information of T_GenCam_DCC from the management entity via the interface IF.Mng.
 The CA service 128 can exchange primitives with the security entity of the ITS-S via the SF-SAP, using the interface IF.Sec provided by the security entity of the ITS-S.
 Point-to-multipoint communication may be used for CAM transmission. A CAM is transmitted in a single hop from the source ITS-S only to receiving ITS-Ss located within the direct communication range of the source ITS-S. Because transmission is single-hop, a receiving ITS-S does not forward a received CAM.
 The initiation of the CA service 128 may differ between different types of ITS-S, such as vehicle ITS-S, roadside ITS-S, and personal ITS-S. As long as the CA service 128 is active, CAM generation is managed by the CA service 128.
 In the case of a vehicle ITS-S, the CA service 128 is activated together with the ITS-S and terminates when the ITS-S is deactivated.
 The CAM generation frequency is managed by the CA service 128 and is determined by the time interval between two consecutive CAM generations. The CAM generation interval is set so as not to fall below the minimum value T_GenCamMin, which is 100 ms; a generation interval of 100 ms corresponds to a generation frequency of 10 Hz. The CAM generation interval is also set so as not to exceed the maximum value T_GenCamMax, which is 1000 ms; a generation interval of 1000 ms corresponds to a generation frequency of 1 Hz.
 Within this generation frequency range, the CA service 128 triggers CAM generation according to the dynamics of the source ITS-S and the congestion state of the channel. When the dynamics of the source ITS-S shorten the CAM generation interval, the shortened interval is maintained for several consecutive CAMs. The CA service 128 repeatedly checks the conditions that trigger CAM generation every T_CheckCamGen, where T_CheckCamGen is less than or equal to T_GenCamMin.
 In the case of ITS-G5, the parameter T_GenCam_DCC provides the minimum generation interval between two consecutive CAMs, reducing CAM generation in accordance with the channel usage requirements of decentralized congestion control (DCC) specified in ETSI TS 102 724. This facilitates adjusting the CAM generation frequency to the remaining capacity of the radio channel during channel congestion.
 T_GenCam_DCC is provided by the management entity in milliseconds. The value of T_GenCam_DCC satisfies T_GenCamMin ≤ T_GenCam_DCC ≤ T_GenCamMax.
 If the management entity provides T_GenCam_DCC with a value greater than or equal to T_GenCamMax, T_GenCam_DCC is set to T_GenCamMax. If the management entity provides T_GenCam_DCC with a value less than or equal to T_GenCamMin, or if T_GenCam_DCC is not provided, T_GenCam_DCC is set to T_GenCamMin.
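 This clamping rule can be sketched as follows. This is an illustrative sketch only; the function name is hypothetical, while the constants follow the values given in the description (T_GenCamMin = 100 ms, T_GenCamMax = 1000 ms).

```python
# Illustrative sketch of the T_GenCam_DCC clamping rule described above.
T_GEN_CAM_MIN_MS = 100    # T_GenCamMin
T_GEN_CAM_MAX_MS = 1000   # T_GenCamMax

def effective_t_gen_cam_dcc(provided_ms=None):
    """Return the T_GenCam_DCC value actually used by the CA service.

    provided_ms: value supplied by the management entity, or None if no
    value is provided.
    """
    if provided_ms is None or provided_ms <= T_GEN_CAM_MIN_MS:
        return T_GEN_CAM_MIN_MS
    if provided_ms >= T_GEN_CAM_MAX_MS:
        return T_GEN_CAM_MAX_MS
    return provided_ms
```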
 Note that in LTE-V2X, DCC and T_GenCam_DCC are not applied; channel congestion control is managed by the access layer 130.
 T_GenCam is the upper limit of the currently valid CAM generation interval. The default value of T_GenCam is T_GenCamMax. When a CAM is generated due to condition 1 below, T_GenCam is set to the time elapsed since the last CAM generation. After N_GenCam consecutive CAMs have been generated due to condition 2, T_GenCam is reset to T_GenCamMax.
 The value of N_GenCam can be adjusted dynamically according to environmental conditions. For example, N_GenCam can be increased when approaching an intersection so that CAMs are received more frequently. The default value and the maximum value of N_GenCam are both 3.
 Condition 1 is that the time elapsed since the previous CAM generation is at least T_GenCam_DCC and that at least one of the following ITS-S dynamics-related conditions holds. The first is that the absolute value of the difference between the current heading of the source ITS-S and the heading included in the CAM it previously transmitted is 4 degrees or more. The second is that the distance between the current position of the source ITS-S and the position included in the CAM it previously transmitted is 4 m or more. The third is that the absolute value of the difference between the current speed of the source ITS-S and the speed included in the CAM it previously transmitted exceeds 0.5 m/s.
 Condition 2 is that the time elapsed since the previous CAM generation is at least T_GenCam and, in the case of ITS-G5, also at least T_GenCam_DCC.
 If either of these two conditions is satisfied, the CA service 128 immediately generates a CAM.
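 Conditions 1 and 2 can be summarized in a short sketch. The function and field names below are illustrative and not part of the specification; the thresholds follow the values in the description (4 degrees, 4 m, 0.5 m/s).

```python
import math

# Illustrative check of the two CAM trigger conditions described above.
def should_generate_cam(now_ms, last_cam_ms, t_gen_cam_dcc_ms, t_gen_cam_ms,
                        cur, last):
    """cur/last: dicts with 'heading_deg', 'x_m', 'y_m', 'speed_mps',
    describing the current state and the state in the last transmitted CAM."""
    elapsed = now_ms - last_cam_ms

    # Condition 1: elapsed >= T_GenCam_DCC and an ITS-S dynamics change.
    if elapsed >= t_gen_cam_dcc_ms:
        heading_changed = abs(cur['heading_deg'] - last['heading_deg']) >= 4.0
        moved = math.hypot(cur['x_m'] - last['x_m'],
                           cur['y_m'] - last['y_m']) >= 4.0
        speed_changed = abs(cur['speed_mps'] - last['speed_mps']) > 0.5
        if heading_changed or moved or speed_changed:
            return True

    # Condition 2: elapsed >= T_GenCam (and, for ITS-G5, >= T_GenCam_DCC).
    return elapsed >= t_gen_cam_ms and elapsed >= t_gen_cam_dcc_ms
```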
 When generating a CAM, the CA service 128 generates the mandatory containers, which mainly contain the highly dynamic information of the source ITS-S; this information is carried in the basic container and the high-frequency container. A CAM can also contain optional data, mainly the non-dynamic status of the source ITS-S and information for specific types of source ITS-S. The non-dynamic status of the source ITS-S is carried in the low-frequency container, and the information for specific types of source ITS-S is carried in the special vehicle container.
 The low-frequency container is included in the first CAM generated after the CA service 128 is activated. Thereafter, the low-frequency container is included in a CAM when 500 ms or more have elapsed since the generation of the last CAM that included it.
 The special vehicle container is likewise included in the first CAM generated after the CA service 128 is activated. Thereafter, the special vehicle container is included in a CAM when 500 ms or more have elapsed since the generation of the last CAM that included it.
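 The inclusion cadence for these optional containers (first CAM after service start-up, then at most once per 500 ms) can be sketched as follows; the function name is illustrative only.

```python
# Illustrative cadence check for the optional CAM containers described above:
# the container is included in the first CAM after the CA service starts and
# thereafter whenever at least 500 ms have elapsed since the last CAM that
# included it.
def include_optional_container(now_ms, last_included_ms):
    """last_included_ms: time of the last CAM containing the container,
    or None if no CAM has contained it since service start-up."""
    if last_included_ms is None:   # first CAM after CA service activation
        return True
    return now_ms - last_included_ms >= 500
```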
 The CAM generation frequency of a roadside ITS-S is also defined by the time interval between two consecutive CAM generations. A roadside ITS-S must be configured so that at least one CAM is transmitted while a vehicle is within the communication area of the roadside ITS-S, and the CAM generation interval must be 1000 ms or more. Since the shortest permitted generation interval is 1000 ms, the maximum CAM generation frequency is 1 Hz.
 Note that the probability that a passing vehicle receives a CAM from an RSU depends on the CAM generation frequency and on the time the vehicle spends within the communication area, which in turn depends on the vehicle speed and the transmission power of the RSU.
 In addition to the CAM generation frequency, the time required for CAM generation and the freshness of the data used to construct the message determine the applicability of the data at the receiving ITS-S. To allow received CAMs to be interpreted correctly, each CAM carries a time stamp. Preferably, time synchronization is established between different ITS-Ss.
 The time required for CAM generation is 50 ms or less. The time required for CAM generation is the difference between the time at which CAM generation starts and the time at which the CAM is delivered to the network & transport layer 140.
 The time stamp in a CAM from a vehicle ITS-S corresponds to the time at which the reference position of the source ITS-S indicated in that CAM was determined. The time stamp in a CAM from a roadside ITS-S is the time at which the CAM was generated.
 Certificates may be used to authenticate messages transmitted between ITS-Ss. A certificate indicates that its holder is permitted to send a particular set of messages, and it can also indicate permissions for specific data elements within a message. In a certificate, the granted permissions are indicated by a pair of identifiers, the ITS-AID and the SSP.
 The ITS Application Identifier (ITS-AID) indicates the overall type of permission granted. For example, there is an ITS-AID indicating that the sender has the right to send CAMs.
 The Service Specific Permissions (SSP) is a field indicating a specific set of permissions within the overall permissions indicated by the ITS-AID. For example, there may be an SSP value associated with the ITS-AID for CAMs indicating that the sender is entitled to send CAMs for a particular vehicle role.
 A received signed CAM is accepted by the receiver if the certificate is valid and the CAM is consistent with the ITS-AID and SSP of that certificate. A CAM is signed using the private key associated with an authorization ticket containing an SSP of type BitmapSsp.
 FIG. 12 shows the format of a CAM. As shown in FIG. 12, a CAM may include an ITS protocol data unit (PDU) header and multiple containers. The ITS PDU header contains the protocol version, the message type, and the ID of the source ITS-S.
 The CAM of a vehicle ITS-S includes one basic container and one high frequency container (hereinafter, HF container), and can further include one low frequency container (hereinafter, LF container) and one or more special vehicle containers. A special vehicle container may also be called a special container.
 The basic container contains basic information about the source ITS-S; specifically, it can include the station type and the station position. The station type is, for example, a vehicle or a roadside unit (RSU). When the station type is a vehicle, the vehicle category may also be included. The position may include latitude, longitude, altitude, and a confidence level.
 In the CAM of a vehicle ITS-S, the HF container contains a vehicle HF container. When the station type is not a vehicle, the HF container can contain a container other than the vehicle HF container.
 The vehicle HF container contains dynamic state information of the source vehicle ITS-S that changes over short periods. Specifically, the vehicle HF container may include one or more of the vehicle heading, vehicle speed, drive direction, vehicle length, vehicle width, longitudinal acceleration, road curvature, curvature calculation mode, and yaw rate. The drive direction indicates either forward or reverse. The curvature calculation mode is a flag indicating whether the vehicle's yaw rate is used to calculate the curvature.
 The vehicle HF container can further include one or more of the longitudinal acceleration control status, lane position, steering wheel angle, lateral acceleration, vertical acceleration, characteristics class, and the position of a DSRC toll collection station. The characteristics class is a value that determines the maximum age of the data elements in the CAM.
 In the CAM of a vehicle ITS-S, the LF container contains a vehicle LF container. When the station type is not a vehicle, the LF container can contain a container other than the vehicle LF container.
 The vehicle LF container can include one or more of the vehicle role, the lighting state of the exterior lights, and the travel trajectory. The vehicle role is a classification used when the vehicle is a special vehicle. The exterior lights field indicates the most important exterior lights of the vehicle. The travel trajectory indicates the movement of the vehicle over a past period of time or distance and can also be called a path history; it is represented as a list of multiple waypoints (for example, 23 points).
 The special vehicle container is a container for vehicle ITS-Ss that have a special role in road traffic, such as public transport. The special vehicle container may contain one of a public transport container, a special transport container, a dangerous goods container, a road works container, a rescue container, an emergency container, and a safety car container.
 The public transport container is for public transport vehicles such as buses; it is used, for example, for public transport vehicles to control boarding status, traffic lights, barriers, and bollards. The special transport container is included when the vehicle is a heavy vehicle and/or an oversized vehicle. The dangerous goods container is included when the vehicle is transporting dangerous goods and stores information indicating the type of dangerous goods. The road works container is included when the vehicle participates in road works; it stores codes indicating the type and cause of the road works and can also include information indicating whether the lanes ahead are open or closed. The rescue container is included when the vehicle is a rescue vehicle engaged in rescue operations and indicates the use of the light bar and siren and the emergency priority. The emergency container is included when the vehicle is an emergency vehicle engaged in emergency operations and indicates the use of the light bar and siren, a cause code, and the emergency priority. The safety car container is included when the vehicle is a safety car, that is, a vehicle that accompanies, for example, a special transport vehicle; it indicates the use of the light bar and siren, overtaking regulations, and the speed limit.
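 The container layout of FIG. 12 can be outlined as a data structure. This is a simplified sketch only: the field names are illustrative placeholders, not the ASN.1 definitions of the standard, and the optional containers are reduced to plain dictionaries.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Simplified sketch of the CAM layout described above: an ITS PDU header
# followed by a basic container, an HF container, and optional LF and
# special vehicle containers. Field names are illustrative.
@dataclass
class ItsPduHeader:
    protocol_version: int
    message_type: int
    station_id: int               # ID of the source ITS-S

@dataclass
class BasicContainer:
    station_type: int             # e.g. vehicle or roadside unit (RSU)
    latitude: float
    longitude: float
    altitude: float

@dataclass
class Cam:
    header: ItsPduHeader
    basic: BasicContainer
    high_frequency: dict                     # heading, speed, yaw rate, ...
    low_frequency: Optional[dict] = None     # vehicle role, lights, path history
    special_vehicle: List[dict] = field(default_factory=list)
```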
 FIG. 13 is a flowchart of the process of transmitting a CAM. The process shown in FIG. 13 is executed every CAM generation cycle. In step S201, the information constituting the CAM is acquired; this step is executed by, for example, the detection information acquisition unit 201. In step S202, a CAM is generated from the information acquired in step S201; this step can also be executed by the detection information acquisition unit 201. In step S203, the transmission unit 221 transmits the CAM generated in step S202 to the surroundings of the vehicle.
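 The three steps of FIG. 13 can be sketched as one transmit cycle. The callables below are placeholders standing in for the units named in the description (the detection information acquisition unit 201 and the transmission unit 221); their names are illustrative only.

```python
# Illustrative sketch of one CAM transmission cycle of FIG. 13 (S201-S203).
# `acquire_info`, `build_cam`, and `send` are placeholders for the detection
# information acquisition unit 201 and the transmission unit 221.
def cam_transmit_cycle(acquire_info, build_cam, send):
    info = acquire_info()     # S201: collect the data constituting the CAM
    cam = build_cam(info)     # S202: generate the CAM from that data
    send(cam)                 # S203: transmit the CAM to the surroundings
    return cam
```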
 With the CA service 128, a V2X communication device can support traffic safety by periodically providing its own position and status to surrounding V2X communication devices. However, the CA service 128 has the limitation that only information about the corresponding V2X communication device itself can be shared. Overcoming this limitation requires the development of services such as the CP service 124.
 As shown in FIG. 14, the CP service 124 may also be an entity of the facility layer 120; for example, the CP service 124 may be part of the application support domain of the facility layer 120. The CP service 124 may fundamentally differ from the CA service 128 in that, for example, it does not receive input data about the host V2X communication device from the VDP 125 or the POTI unit 126.
 CPM transmission includes CPM generation and transmission. In the CPM generation process, the originating V2X communication device generates a CPM, which is then passed to the network & transport layer 140 for transmission. The originating V2X communication device may also be referred to as the source V2X communication device, the host V2X communication device, and so on.
 The CP service 124 may be connected to other entities within the facility layer 120 and to V2X applications in order to collect the relevant information for CPM generation and to deliver received CPM content for further processing. In a V2X communication device, the entity for data collection may be a function that provides object detection at a host object detector.
 Furthermore, to deliver (or transmit) CPMs, the CP service 124 may use services provided by the protocol entities of the network & transport layer 140. For example, the CP service 124 may connect to the network & transport layer 140 through the NF-SAP in order to exchange CPMs with other V2X communication devices. The NF-SAP is the service access point between the network & transport layer 140 and the facility layer 120.
 Furthermore, the CP service 124 may connect to the security entity through the SF-SAP, the SAP between the security layer 160 and the facility layer 120, in order to access security services for CPM transmission and reception. The CP service 124 may also connect to the management entity through the MF-SAP, the SAP between the management layer 150 and the facility layer 120. When providing received CPM data directly to applications, the CP service 124 may connect to the application layer 110 through the FA-SAP, the SAP between the facility layer 120 and the application layer 110.
 The CP service 124 can specify how a V2X communication device informs other V2X communication devices about the positions, behaviors, and attributes of detected surrounding road users and other objects. For example, by transmitting a CPM, the CP service 124 can share the information contained in the CPM with other V2X communication devices. Note that the CP service 124 may be a function that can be added to all types of target information communication devices participating in road traffic.
 A CPM is a message exchanged between V2X communication devices via a V2X network. CPMs can be used to generate collective perception of road users and other objects that have been detected and/or recognized by a V2X communication device. The detected road users or objects may be, but are not limited to, road users or objects that are not equipped with a V2X communication device.
 As described above, a V2X communication device that shares information via CAMs shares only information about its own state with other V2X communication devices for cooperative awareness. In this case, road users and others not equipped with a V2X communication device are not part of the system, so the view of situations relevant to safety and traffic management is limited.
 One way to improve this is for a system that is equipped with a V2X communication device and can recognize road users and objects not so equipped to notify other V2X communication devices of the presence and state of those road users and objects. Because the CP service 124 cooperatively recognizes the presence of road users and objects not equipped with a V2X communication device in this way, it can readily improve the safety and traffic management performance of systems equipped with V2X communication devices.
 CPM delivery may differ depending on the communication system applied. For example, in the ITS-G5 network defined in ETSI EN 302 663, a CPM may be transmitted from the originating V2X communication device to all V2X communication devices within direct communication range. The communication range can be deliberately influenced by the originating V2X communication device by changing the transmission power according to the relevant area.
 Furthermore, CPMs may be generated periodically at a frequency controlled by the CP service 124 in the originating V2X communication device. The generation frequency may be determined in consideration of the radio channel load determined by decentralized congestion control (DCC). The generation frequency may also be determined in consideration of the state of the detected non-V2X objects, for example the dynamic behavior of their position, speed, or direction, and of CPMs transmitted by other V2X communication devices for the same perceived object.
 Furthermore, when the receiving V2X communication device receives a CPM, the CP service 124 makes the contents of the CPM available to functions within the receiving V2X communication device, for example V2X applications and/or the LDM 127. For example, the LDM 127 may be updated with the received CPM data, and V2X applications may retrieve this information from the LDM 127 for further processing.
 FIG. 15 is a functional block diagram of the CP service 124 in the present embodiment. More specifically, FIG. 15 illustrates the functional blocks of the CP service 124 together with its interfaces to other functions and layers.
 As shown in FIG. 15, the CP service 124 can provide the following sub-functions for CPM transmission and reception. The CPM encoding unit 1241 constructs or generates a CPM according to a predefined format; the latest in-vehicle data may be included in the CPM. The CPM decoding unit 1242 decodes received CPMs. The CPM transmission management unit 1243 executes the protocol operations of the originating V2X communication device, which may include activating and terminating the CPM transmission operation, determining the CPM generation frequency, and triggering CPM generation. The CPM reception management unit 1244 can execute the protocol operations of the receiving V2X communication device, which may specifically include triggering the CPM decoding function upon CPM reception, providing the received CPM data to the LDM 127 or to V2X applications of the receiving V2X communication device, and checking the information of the received CPM.
 次に、CPMの配信について詳細に説明する。具体的には、CPM配信の要件、CPサービスの起動と終了、CPMトリガ条件、CPM生成周期、制約条件等について説明する。CPM配信にポイントツーマルチポイント通信が使用されてもよい。例えば、CPMの配信にITS-G5が用いられる場合、制御チャネル(G5-CCH)が用いられてもよい。CPM生成は、CPサービス124が動作している間、CPサービス124によってトリガされ管理されてもよい。CPサービス124は、V2X通信装置の起動とともに起動されてもよく、V2X通信装置が終了したときに終了されてもよい。 Next, the CPM distribution will be explained in detail. Specifically, the requirements for CPM distribution, activation and termination of CP service, CPM trigger conditions, CPM generation cycle, constraint conditions, etc. will be described. Point-to-multipoint communication may be used for CPM delivery. For example, if ITS-G5 is used for CPM delivery, a control channel (G5-CCH) may be used. CPM generation may be triggered and managed by CP service 124 while CP service 124 is running. The CP service 124 may be launched upon activation of the V2X communication device and may be terminated when the V2X communication device is terminated.
 ホストV2X通信装置は、近くのV2X通信装置と交換する必要がある十分な信頼度を有する少なくとも1つの物体が検出されるたびにCPMを送信してよい。検出された物体を含めることに関して、CPサービスは、物体の寿命とチャネル利用率との間のトレードオフを考慮すべきである。たとえば、CPMが受信した情報を利用するアプリケーションの観点からは、できるだけ頻繁に更新された情報を提供する必要がある。しかし、ITS-G5スタックの観点からは、チャネル使用率を最小にする必要があるため、低い送信頻度が要求される。したがって、V2X通信装置は、この点を考慮し、検出した物体や物体情報をCPMに適切に含めることが望ましい。また、メッセージサイズを小さくするために、物体を評価した上で送信する必要がある。 A host V2X communication device may transmit a CPM whenever at least one object is detected with sufficient confidence to be worth exchanging with nearby V2X communication devices. Regarding the inclusion of detected objects, the CP service should consider the trade-off between object lifetime and channel utilization. For example, from the point of view of applications that use the information received in CPMs, updated information should be provided as often as possible. However, from the point of view of the ITS-G5 stack, a low transmission frequency is required because channel utilization must be minimized. The V2X communication device should therefore take this point into account and include detected objects and object information in the CPM appropriately. Also, to keep the message size small, objects should be evaluated before being transmitted.
 図16は、CPMの構造を示す図である。図16に示すCPM構造が基本CPM構造であってもよい。上述したように、CPMは、V2Xネットワーク内のV2X通信装置間で交換されるメッセージであってもよい。また、CPMは、V2X通信装置によって検出および/または認識された道路使用者および/または他の物体に対する集団認識を生成するために使用されてもよい。すなわち、CPMは、V2X通信装置によって検出された物体に対する集団認識を生成するためのITSメッセージであってもよい。 FIG. 16 is a diagram showing the structure of the CPM. The CPM structure shown in FIG. 16 may be the basic CPM structure. As noted above, CPMs may be messages exchanged between V2X communication devices in a V2X network. A CPM may also be used to generate collective perception of road users and/or other objects detected and/or recognized by V2X communication devices. That is, a CPM may be an ITS message for generating collective perception of objects detected by V2X communication devices.
 CPMは、発信元V2X通信装置が検出した道路使用者と物体の状態情報および属性情報を含んでもよい。その内容は、検出された道路使用者または物体の種類および発信元V2X通信装置の検出性能に応じて異なってもよい。例えば、物体が車両である場合、状態情報は、少なくとも、実際の時間、位置、および運動状態に関する情報を含んでもよい。属性情報には、寸法、車種、道路交通における役割などの属性が含まれてもよい。 The CPM may include state information and attribute information of road users and objects detected by the source V2X communication device. Its content may vary depending on the type of road user or object detected and the detection capabilities of the originating V2X communication device. For example, if the object is a vehicle, the state information may include at least information about the actual time, location and motion state. Attribute information may include attributes such as dimensions, vehicle type, and role in road traffic.
 CPMは、CAMを補完し、CAMと同様の働きをするものであってもよい。すなわち、協調的な認識を高めるためであってもよい。CPMは、検出された道路使用者または物体に関し、外部から観測可能な情報を含んでもよい。CPサービス124は、他のステーションが送信したCPMを確認することで、異なるV2X通信装置が送信したCPMの複製または重複を低減する方法を含んでもよい。 The CPM may complement the CAM and may work in a similar way to the CAM, that is, to enhance cooperative awareness. The CPM may contain externally observable information about detected road users or objects. The CP service 124 may include a method of reducing replication or duplication of CPMs transmitted by different V2X communication devices by checking the CPMs transmitted by other stations.
 CPMの受信により、受信側のV2X通信装置は、発信元のV2X通信装置が検出した道路使用者または物体の存在、種類および状態を認識してもよい。受信した情報は、安全性を高め、交通効率および移動時間を改善するためのV2Xアプリケーションをサポートするために、受信側のV2X通信装置によって使用されてもよい。例えば、受信した情報と検出された道路使用者または物体の状態とを比較することにより、受信側のV2X通信装置は、道路使用者または物体との衝突の危険性を推定することができる。さらに、受信側V2X通信装置は、受信側V2X通信装置のヒューマンマシンインターフェース(HMI)を介してユーザーに通知してもよいし、自動的に修正措置を講じてもよい。 By receiving the CPM, the receiving V2X communication device may recognize the presence, type and status of road users or objects detected by the originating V2X communication device. The received information may be used by the receiving V2X communication device to support V2X applications to enhance safety, improve traffic efficiency and travel time. For example, by comparing the received information with the detected states of road users or objects, the receiving V2X communication device can estimate the risk of collision with road users or objects. Additionally, the receiving V2X communication device may notify the user via the receiving V2X communication device's Human Machine Interface (HMI) or automatically take corrective action.
 CPMの基本的なフォーマットを、図16を参照して説明する。このCPMのフォーマットは、ASN.1(Abstract Syntax Notation One)として提示されてもよい。本開示で定義されていないデータエレメント(DE)およびデータフレーム(DF)は、ETSI TS 102 894-2に規定されている共通データ辞書から導出されてもよい。図16に示すように、CPMは、ITSプロトコルデータユニット(PDU)ヘッダと、複数のコンテナとを含んでもよい。 The basic format of the CPM will be described with reference to FIG. 16. The format of this CPM may be presented in ASN.1 (Abstract Syntax Notation One). Data elements (DEs) and data frames (DFs) not defined in this disclosure may be derived from the common data dictionary specified in ETSI TS 102 894-2. As shown in FIG. 16, a CPM may include an ITS protocol data unit (PDU) header and multiple containers.
 ITS PDUヘッダは、プロトコルバージョン、メッセージタイプ、及び発信元のV2X通信装置のITS IDに関する情報を含むヘッダである。ITS PDUヘッダは、ITSメッセージで使用される共通のヘッダであり、ITSメッセージの開始部分に存在する。ITS PDUヘッダは、共通ヘッダと呼ばれることもある。 The ITS PDU header is a header that contains information about the protocol version, message type, and ITS ID of the source V2X communication device. The ITS PDU header is a common header used in ITS messages and is present at the beginning of the ITS message. The ITS PDU header is sometimes called a common header.
 複数のコンテナは、管理コンテナ(Management Container)、ステーションデータコンテナ(Station Data Container)、センサ情報コンテナ(Sensor Information Container)、認識物体コンテナ(Perceived Object Container)、フリースペース追加コンテナ(Free Space Addendum Container)を含むことができる。ステーションデータコンテナは発信車両コンテナ(Originating Vehicle Container)あるいは発信路側機コンテナ(Originating RSU Container)を含むことができる。センサ情報コンテナは視野情報コンテナ(Field-of-View Container)と呼ばれることもある。発信車両コンテナはOVCと記載することもある。視野コンテナはFOCと記載することもある。認識物体コンテナはPOCと記載することもある。CPMは、必須のコンテナとして管理コンテナを含み、ステーションデータコンテナ、センサ情報コンテナ、POCおよびフリースペース付属コンテナを任意のコンテナとしてもよい。センサ情報コンテナ、認識物体コンテナおよびフリースペース付属コンテナは複数のコンテナであってもよい。以下、各コンテナについて説明する。 The multiple containers can include a Management Container, a Station Data Container, a Sensor Information Container, a Perceived Object Container, and a Free Space Addendum Container. The station data container can include an Originating Vehicle Container or an Originating RSU Container. The sensor information container is sometimes called a Field-of-View Container. The originating vehicle container may be written as OVC, the field-of-view container as FOC, and the perceived object container as POC. The CPM includes the management container as a mandatory container; the station data container, sensor information container, POC, and Free Space Addendum Container may be optional containers. There may be multiple instances of the sensor information container, the perceived object container, and the Free Space Addendum Container. Each container is described below.
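The container layout described above can be sketched as a simple data model. This is an illustrative sketch only, assuming plausible field contents: the class and field names are hypothetical and do not reproduce the ASN.1 definitions of the actual message.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ItsPduHeader:
    protocol_version: int
    message_type: int
    station_id: int  # ITS ID of the originating V2X communication device

@dataclass
class ManagementContainer:  # mandatory in every CPM
    station_type: int
    reference_position: Tuple[float, float]  # (latitude, longitude)
    number_of_perceived_objects: int

@dataclass
class CPM:
    header: ItsPduHeader
    management: ManagementContainer
    # Optional containers; the last three may occur multiple times.
    station_data: Optional[dict] = None  # OVC or Originating RSU Container
    sensor_information: List[dict] = field(default_factory=list)
    perceived_objects: List[dict] = field(default_factory=list)
    free_space_addenda: List[dict] = field(default_factory=list)
```

Constructing a minimal message then only requires the header and the management container; all other containers default to absent.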
 管理コンテナは、車両または路側機タイプのステーションであるかどうかに関係なく、発信元のITS-Sに関する基本情報を提供する。また、管理コンテナは、ステーションタイプ、基準位置、セグメント化情報、認識物体数を含んでいてもよい。ステーションタイプは、ITS-Sのタイプを示す。基準位置は、発信元ITS-Sの位置である。セグメント化情報は、メッセージサイズの制約によりCPMを複数のメッセージに分割する場合の分割情報を記述する。 The administrative container provides basic information about the originating ITS-S, regardless of whether it is a vehicle or roadside unit type station. The management container may also include station type, reference location, segmentation information, number of recognized objects. The station type indicates the type of ITS-S. The reference location is the location of the originating ITS-S. The segmentation information describes splitting information when splitting a CPM into multiple messages due to message size constraints.
 図17に示す表1は、CPMのステーションデータコンテナにおけるOVCの一例である。表1は、一例としてのOVCに含まれるデータエレメント(DE)および/またはデータフレーム(DF)を示している。なお、ステーションデータコンテナは、発信元のITS-Sが車両である場合には、OVCになる。発信元のITS-SがRSUである場合には、発信元RSUコンテナ(Originating RSU Container)になる。発信元RSUコンテナは、RSUが存在する道路あるいは交差点に関するIDを含んでいる。 Table 1 shown in FIG. 17 is an example of OVC in the station data container of CPM. Table 1 shows the data elements (DE) and/or data frames (DF) included in an example OVC. Note that the station data container becomes an OVC when the ITS-S that is the source is a vehicle. If the originating ITS-S is an RSU, it becomes an Originating RSU Container. The originating RSU container contains the ID for the road or intersection on which the RSU is located.
 DEは、単一データを含むデータタイプである。DFは、予め定められた順序で1つ以上の要素を含むデータタイプである。たとえば、DFは、1つ以上のDEおよび/または1つ以上のDFを予め定義された順序で含むデータタイプである。  DE is a data type that contains single data. DF is a data type containing one or more elements in a predetermined order. For example, DF is a data type that includes one or more DEs and/or one or more DFs in a predefined order.
 DE/DFは、ファシリティ層メッセージまたはアプリケーション層メッセージを構成するために使用されてもよい。ファシリティ層メッセージの例は、CAM、CPM、DENMである。 The DE/DF may be used to construct facility layer messages or application layer messages. Examples of facility layer messages are CAM, CPM, DENM.
 表1に示すように、OVCは、CPMを発信するV2X通信装置に関連する基本情報を含む。OVCは、CAMのスケールダウン版と解釈できる。ただし、OVCは座標変換処理に必要なDEのみを含んでもよい。すなわち、OVCは、CAMと類似しているが、発信元のV2X通信装置に関する基本情報を提供する。OVCに含まれる情報は、座標変換処理をサポートすることに重点を置いている。 As shown in Table 1, the OVC contains basic information related to the V2X communication device that emits the CPM. OVC can be interpreted as a scaled down version of CAM. However, the OVC may include only the DE required for coordinate conversion processing. That is, OVC is similar to CAM, but provides basic information about the originating V2X communication device. The information contained in the OVC is focused on supporting the coordinate transformation process.
 OVCは、以下のものを提供することができる。すなわち、OVCは、CPM生成時にCPサービス124が取得した発信元V2X通信装置の最新の地理的位置を提供することができる。また、OVCは、発信元V2X通信装置の横方向および縦方向の絶対速度成分を提供することができる。OVCは、発信元V2X通信装置の幾何学的寸法を提供することができる。 OVC can provide the following. That is, the OVC can provide the latest geographic location of the originating V2X communication device obtained by the CP service 124 at the time of CPM generation. OVC can also provide the absolute lateral and longitudinal velocity components of the originating V2X communication device. The OVC can provide the geometric dimensions of the originating V2X communication device.
 表1に示す生成差分時間は、DEとして、CPMにおける基準位置の時刻に対応する時間を示す。生成差分時間は、CPMの生成時刻とみなすことができる。本開示では、生成差分時間を生成時間と称することがある。 The generated differential time shown in Table 1 indicates, as DE, the time corresponding to the time of the reference position in CPM. The generation difference time can be regarded as the generation time of the CPM. In the present disclosure, the generated differential time may be referred to as generated time.
 基準位置は、DFとして、V2X通信装置の地理的な位置を示す。基準位置は、地理的な点の位置を示す。基準位置は、緯度、経度、位置信頼度および/または高度に関する情報を含む。緯度は、地理的地点の緯度を表し、経度は、地理的地点の経度を表す。位置信頼度は、地理的位置の精度を表し、高度は、地理的地点の高度および高度精度を表す。 The reference position indicates the geographical position of the V2X communication device as DF. A reference position indicates the location of a geographical point. The reference position includes information regarding latitude, longitude, position confidence and/or altitude. Latitude represents the latitude of the geographical point, and Longitude represents the longitude of the geographical point. The position confidence represents the accuracy of the geographic location, and the altitude represents the altitude and altitude accuracy of the geographic point.
 方位は、DFとして、座標系における方位を示す。方位は、方位値および/または方位信頼度の情報を含む。方位値は、北を基準とした進行方向を示し、方位の信頼度は、報告された方位値の信頼度が予め設定されたレベルであることを示す。 The orientation indicates the orientation in the coordinate system as DF. Heading includes heading value and/or heading confidence information. The bearing value indicates the heading relative to north, and the bearing confidence indicates a preset level of confidence in the reported bearing value.
 縦方向速度は、DFとして、移動体(たとえば車両)に関する縦方向速度と速度情報の精度を記述することができる。縦方向速度は、速度値および/または速度精度の情報を含む。速度値は、縦方向の速度値を表し、速度精度は、その速度値の精度を表す。 Longitudinal velocity, as a DF, can describe the longitudinal velocity of a moving body (e.g., a vehicle) and the accuracy of the velocity information. Longitudinal velocity includes velocity value and/or velocity accuracy information. The velocity value represents the longitudinal velocity value, and the velocity accuracy represents the accuracy of that velocity value.
 横方向速度は、DFとして、移動体(たとえば車両)に関する横方向速度および速度情報の精度を記述することができる。横方向速度は、速度値および/または速度精度に関する情報を含む。上記速度値は、横方向の速度値を表し、速度精度は、その速度値の精度を表す。 Lateral velocity, as a DF, can describe the lateral velocity and the accuracy of velocity information for a moving body (eg, vehicle). Lateral velocity includes information about velocity values and/or velocity accuracy. The velocity value represents the velocity value in the lateral direction, and the velocity accuracy represents the accuracy of the velocity value.
 車両長は、DFとして、車両長および精度指標を記述することができる。車両長は、車両長の値および/または車両長の精度指標に関する情報を含む。車両長は、車両の長さを表し、車両長の精度指標は、その車両長の信頼性を表す。 The vehicle length can describe the vehicle length and accuracy index as DF. The vehicle length includes information about vehicle length values and/or vehicle length accuracy indicators. The vehicle length represents the length of the vehicle, and the vehicle length accuracy index represents the reliability of the vehicle length.
 車幅は、DEとして、車両の幅を示す。たとえば、車幅は、サイドミラーを含めた車両の幅を表すことができる。なお、車幅が6.1m以上の場合は61とし、情報が得られない場合は62とする。 The vehicle width indicates the width of the vehicle as DE. For example, vehicle width can represent the width of the vehicle including the side mirrors. If the width of the vehicle is 6.1 m or more, it is set to 61, and if the information cannot be obtained, it is set to 62.
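The special values described above (61 for widths of 6.1 m or more, 62 when no information is available) can be captured, for illustration, in a small encoding helper. The 0.1 m step size for in-range widths is an assumption based on the ETSI common data dictionary; the function name is hypothetical.

```python
def encode_vehicle_width(width_m):
    """Encode a vehicle width (meters, including side mirrors) as the
    vehicle-width DE described above: 62 if the information cannot be
    obtained, 61 if the width is 6.1 m or more, otherwise the width in
    0.1 m steps (step size assumed from the ETSI common data dictionary)."""
    if width_m is None:
        return 62  # information not available
    if width_m >= 6.1:
        return 61  # out of range
    return round(width_m * 10)
```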
 表1に示す各DE/DFは、生成時間差分を除き、それぞれ、表1の右列に示すETSI 102 894-2を参照できる。ETSI 102 894-2は、CDD(common data dictionary)を定めている。生成時間差分については、ETSI EN 302 637-2を参照できる。 Each DE/DF shown in Table 1 can refer to ETSI 102 894-2 shown in the right column of Table 1, except for the generation time difference. ETSI 102 894-2 defines a CDD (common data dictionary). See ETSI EN 302 637-2 for generation time differences.
 また、前述の情報以外に車両方向角度、車両進行方向、縦加速度、横加速度、垂直加速度、ヨーレート、ピッチ角度、ロール角度、車両高さ及びトレーラーデータに関する情報をOVCに含んでいてもよい。 In addition to the above information, the OVC may contain information on the vehicle direction angle, vehicle traveling direction, longitudinal acceleration, lateral acceleration, vertical acceleration, yaw rate, pitch angle, roll angle, vehicle height and trailer data.
 図18には表2を示している。表2はCPMにおけるSIC(またはFOC)の例である。 SICは、発信元のV2X通信装置に搭載された少なくとも1つのセンサの説明を提供する。V2X通信装置がマルチセンサを搭載している場合、説明は複数追加されることがある。たとえば、SICは発信元のV2X通信装置のセンサ能力に関する情報を提供する。このようにするために、発信元V2X通信装置のセンサの取り付け位置、センサの種類、センサの範囲と開き角(すなわちセンサのフラスタム)を提供する一般的なセンサ特性が、メッセージの一部として含まれてもよい。これらの情報は、受信側のV2X通信装置がセンサの性能に応じた適切な予測モデルを選択するために利用されることがある。 Table 2 is shown in FIG. 18. Table 2 is an example of the SIC (or FOC) in a CPM. The SIC provides a description of at least one sensor mounted on the originating V2X communication device. If the V2X communication device is equipped with multiple sensors, multiple descriptions may be added. For example, the SIC provides information about the sensor capabilities of the originating V2X communication device. To this end, general sensor characteristics such as the sensor mounting position on the originating V2X communication device, the sensor type, and the sensor range and opening angle (i.e., the sensor frustum) may be included as part of the message. This information may be used by the receiving V2X communication device to select an appropriate prediction model according to the sensor's performance.
 SICの各種の情報について表2を参照して説明する。センサIDは、物体を検出したセンサを特定するためのセンサ固有のIDを示す。実施形態では、センサIDは、V2X通信装置の起動時に生成される乱数であり、V2X通信装置が終了するまで変更されない。 Various information of SIC will be explained with reference to Table 2. The sensor ID indicates a sensor-specific ID for specifying the sensor that detected the object. In an embodiment, the sensor ID is a random number generated when the V2X communication device starts up and does not change until the V2X communication device is terminated.
 センサタイプは、センサのタイプを示す。以下にセンサのタイプを列挙する。たとえば、センサタイプは、未定義(0)、レーダー(1)、ライダー(2)、モノビデオ(3)、ステレオビジョン(4)、ナイトビジョン(5)、超音波(6)、pmd(7)、フュージョン(8)、インダクションループ(9)、球面カメラ(10)、それらの集合(11)である。pmdは、photo mixing deviceである。球面カメラは360度カメラとも呼ばれる。  Sensor type indicates the type of sensor. The sensor types are listed below. For example, the sensor type is undefined (0), radar (1), lidar (2), mono-video (3), stereovision (4), night vision (5), ultrasonic (6), pmd (7). , fusion (8), induction loop (9), spherical camera (10), and their set (11). pmd is a photo mixing device. A spherical camera is also called a 360-degree camera.
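The sensor-type codes enumerated above can be written down, for illustration, as follows. The Python identifiers are hypothetical; only the numeric codes come from the description.

```python
from enum import IntEnum

class SensorType(IntEnum):
    """Sensor type codes as enumerated in the description above."""
    UNDEFINED = 0
    RADAR = 1
    LIDAR = 2
    MONO_VIDEO = 3
    STEREO_VISION = 4
    NIGHT_VISION = 5
    ULTRASONIC = 6
    PMD = 7               # photo mixing device
    FUSION = 8
    INDUCTION_LOOP = 9
    SPHERICAL_CAMERA = 10  # also called a 360-degree camera
    AGGREGATION = 11       # a set of the above sensor types
```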
 センサ位置において、X位置は、センサのマイナスX方向の取付位置、Y位置はセンサのY方向の取付位置を示す。これらの取付位置は、基準位置からの測定値であり、基準位置はETSIのEN 302 637-2を参照できる。半径は、メーカが定義するセンサの平均的な認識範囲を示す。 In the sensor position, the X position indicates the mounting position of the sensor in the negative X direction, and the Y position indicates the mounting position of the sensor in the Y direction. These mounting positions are measurements from a reference position, which can be referred to EN 302 637-2 of ETSI. The radius indicates the average recognition range of the sensor as defined by the manufacturer.
 開き角において、開始角度はセンサのフラスタムの開始角度を示し、終了角度はセンサのフラスタムの終了角度を示す。品質クラスは、測定対象物の品質を定義するセンサの分類を表す。 In the opening angle, the start angle indicates the start angle of the sensor's frustum, and the end angle indicates the end angle of the sensor's frustum. A quality class represents a classification of the sensor that defines the quality of the measurement object.
 また、前述の情報以外に検出領域およびフリースペースの信頼性に関する情報をSICに含んでいてもよい。 In addition to the above information, the SIC may contain information regarding the reliability of the detection area and free space.
 図19には表3を示している。表3はCPMにおけるPOCの例である。POCは、送信するV2X通信装置から見て、センサが認識した物体を記述するために使用される。POCを受信した受信側V2X通信装置は、OVCの助けを借りて、物体の位置を受信側車両の基準座標系に変換する座標変換処理を行うことができる。 Table 3 is shown in FIG. Table 3 is an example of POC in CPM. The POC is used to describe the object perceived by the sensor as seen by the transmitting V2X communication device. The receiving V2X communication device that receives the POC can, with the help of the OVC, perform coordinate transformation processing to transform the position of the object into the reference coordinate system of the receiving vehicle.
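The coordinate transformation mentioned above can be sketched as a 2D rotation and translation: the POC supplies the object's relative (X, Y) distance in the sender's frame, the OVC supplies the sender's pose, and the receiver re-expresses the object in its own frame. This is an illustrative simplification assuming both poses have already been projected into a shared local Cartesian plane with headings measured counterclockwise from the X axis, whereas the actual message carries WGS-84 positions and north-referenced headings.

```python
import math

def object_position_in_receiver_frame(obj_dx, obj_dy,
                                      sender_xy, sender_heading_deg,
                                      receiver_xy, receiver_heading_deg):
    """Transform an object's position from the sender's reference frame
    into the receiver's reference frame (hypothetical sketch)."""
    # Sender frame -> shared plane: rotate by the sender heading, then translate.
    h_s = math.radians(sender_heading_deg)
    gx = sender_xy[0] + obj_dx * math.cos(h_s) - obj_dy * math.sin(h_s)
    gy = sender_xy[1] + obj_dx * math.sin(h_s) + obj_dy * math.cos(h_s)
    # Shared plane -> receiver frame: translate, then rotate back.
    h_r = math.radians(receiver_heading_deg)
    rx = (gx - receiver_xy[0]) * math.cos(h_r) + (gy - receiver_xy[1]) * math.sin(h_r)
    ry = -(gx - receiver_xy[0]) * math.sin(h_r) + (gy - receiver_xy[1]) * math.cos(h_r)
    return rx, ry
```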
 メッセージサイズを小さくするために、発信側V2X通信装置が提供できる場合に、複数のオプションDEを提供してもよい。 In order to reduce the message size, multiple optional DEs may be provided if the originating V2X communication device can provide them.
 POCは、認識された(または検出された)物体の抽象的な説明を提供するためにDEの選択で構成されてもよい。たとえば、発信元V2X通信装置に関連する認識された物体についての相対距離、速度情報およびタイミング情報は、必須のDEとしてPOCに含まれてもよい。また、発信元V2X通信装置のセンサが、要求されたデータを提供できる場合、追加のDEを提供してもよい。 A POC may consist of a selection of DEs to provide an abstract description of the recognized (or detected) object. For example, the relative distance, velocity information and timing information about the perceived object associated with the originating V2X communication device may be included in the POC as mandatory DEs. Additional DEs may also be provided if the sensors of the originating V2X communication device can provide the requested data.
 各情報(DEまたはDF)について、表3を参照して説明する。測定時間は、メッセージの基準時刻からの時間をマイクロ秒単位で示す。これは、測定された物体の相対的な年齢を定義する。 Each piece of information (DE or DF) will be explained with reference to Table 3. The measurement time indicates the time in microseconds from the reference time of the message. This defines the relative age of the measured object.
 物体IDは、物体に割り当てられた一意のランダムなIDである。このIDは、物体が追跡されている間、すなわち、発信元のV2X通信装置のデータ融合処理で考慮される間、保持される(すなわち、変更されない)。 An object ID is a unique random ID assigned to an object. This ID is retained (ie, not changed) while the object is being tracked, ie, considered in the originating V2X communication device's data fusion process.
 センサIDは、表2のセンサIDのDEに対応するIDである。このDEは、物体情報を、計測を行うセンサに関連付けるために使用されることがある。 The sensor ID is an ID corresponding to DE in the sensor ID in Table 2. This DE may be used to associate object information with the sensors that make the measurements.
 縦方向距離には、距離値と距離信頼度が含まれる。距離値は、発信元基準座標系における物体までの相対的なX距離を示す。距離信頼度は、そのX距離の信頼度を示す値である。 The longitudinal distance includes a distance value and a distance confidence. The distance value indicates the relative X distance to the object in the originating reference coordinate system. The distance confidence is a value indicating the confidence of that X distance.
 横方向距離も、距離値と距離信頼度が含まれる。距離値は、発信元基準座標系における物体までの相対的なY距離を示し、距離信頼度は、そのY距離の信頼度を示す。 The lateral distance likewise includes a distance value and a distance confidence. The distance value indicates the relative Y distance to the object in the originating reference coordinate system, and the distance confidence indicates the confidence of that Y distance.
 縦方向速度は、検出された物体の縦方向速度を信頼度に応じて示す。横方向速度は、検出された物体の横方向速度を信頼度に応じて示す。縦方向速度および横方向速度は、TS 102 894-2のCDDを参照できる。 Longitudinal velocity indicates the longitudinal velocity of the detected object together with its confidence. Lateral velocity indicates the lateral velocity of the detected object together with its confidence. For longitudinal velocity and lateral velocity, the CDD of TS 102 894-2 can be referred to.
 物体方位は、データフュージョン処理により提供される場合、基準座標系における物体の絶対方位を示す。物体の長さは、測定された物体の長さを示す。長さの信頼度は、測定された物体の長さの信頼度を示す。物体の幅は、物体の幅の測定値を示す。幅の信頼度は、物体の幅の測定値の信頼度を示す。物体タイプは、データフュージョンプロセスで提供される場合、物体の分類を表す。物体の分類としては車両、人、動物、その他が含まれてよい。また、前述の情報以外に物体の信頼度、垂直方向距離、垂直方向速度、縦方向加速度、横方向加速度、垂直方向加速度、物体の高さ、物体の動的状態、マッチドポジション(車線IDや縦方向車線位置を含む)に関する情報をPOCに含んでいてもよい。 The object orientation, when provided by the data fusion process, indicates the absolute orientation of the object in the reference coordinate system. The object length indicates the measured length of the object, and the length confidence indicates the confidence of that measurement. The object width indicates the measured width of the object, and the width confidence indicates the confidence of that measurement. The object type, when provided by the data fusion process, represents the classification of the object. Object classifications may include vehicles, people, animals, and others. In addition to the above information, the POC may also include information on object confidence, vertical distance, vertical velocity, longitudinal acceleration, lateral acceleration, vertical acceleration, object height, the dynamic state of the object, and matched position (including lane ID and longitudinal lane position).
 フリースペース追加コンテナは、発信元のV2X通信装置が認識しているフリースペースについての情報(すなわちフリースペース情報)を示すコンテナである。フリースペースは、道路使用者や障害物が占有していないと考えられる領域であり、空き空間ということもできる。フリースペースは、発信元のV2X通信装置とともに移動する移動体が移動できるスペースということもできる。 The free space addition container is a container that indicates information about the free space recognized by the source V2X communication device (that is, free space information). A free space is an area that is not considered to be occupied by road users or obstacles, and can also be called an empty space. The free space can also be said to be a space in which a mobile object that moves together with the source V2X communication device can move.
 フリースペース追加コンテナは、必須のコンテナではなく任意に追加できるコンテナである。他のV2X通信装置から受信したCPMから計算できる、当該他のV2X通信装置が認識しているフリースペースと、発信元のV2X通信装置が認識しているフリースペースとに相違がある場合に、フリースペース追加コンテナを追加できる。また、定期的に、CPMにフリースペース追加コンテナを追加してもよい。 The free space additional container is not a required container, but a container that can be added arbitrarily. If there is a difference between the free space recognized by the other V2X communication device and the free space recognized by the source V2X communication device, which can be calculated from the CPM received from the other V2X communication device, the free space Additional space containers can be added. Also, free space additional containers may be added to the CPM periodically.
 フリースペース追加コンテナは、フリースペースの領域を特定する情報を含む。フリースペースは、種々の形状で特定することができる。フリースペースの形状は、たとえば、多角形(すなわちポリゴン)、円形、楕円形、長方形などで表現できる。フリースペースを多角形で表現する場合、多角形を構成する複数の点の位置と、それら複数の点を接続する順序を指定する。フリースペースを円形で表現する場合、円の中心の位置と円の半径を指定する。フリースペースを楕円で表現する場合、楕円の中心の位置と楕円の長径および短径を指定する。 The free space addition container contains information specifying the area of free space. Free space can be specified in various shapes. The shape of the free space can be represented, for example, by polygons (ie, polygons), circles, ellipses, rectangles, and the like. When representing free space with a polygon, specify the positions of the points that make up the polygon and the order in which the points are connected. Specify the position of the center of the circle and the radius of the circle when representing the free space in a circle. When expressing free space with an ellipse, specify the position of the center of the ellipse and the major and minor axes of the ellipse.
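For illustration, the shape alternatives described above could be modeled as simple records. The names are hypothetical and do not reproduce the ASN.1 definitions; they only mirror the parameters the description lists for each shape.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class PolygonArea:
    points: List[Point]  # vertices, connected in the listed order

@dataclass
class CircleArea:
    center: Point
    radius: float

@dataclass
class EllipseArea:
    center: Point
    major_axis: float
    minor_axis: float
```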
 フリースペース追加コンテナは、フリースペースの信頼性を含んでもよい。フリースペースの信頼性は数値で表す。フリースペースの信頼性は、信頼性が不明であることを示すこともある。また、フリースペース追加コンテナは、影領域に関する情報を含んでもよい。影領域とは車両または車両に搭載されるセンサから見て物体の後方の領域を示している。 The free space additional container may contain the reliability of the free space. Reliability of free space is expressed numerically. The reliability of free space may also indicate that the reliability is unknown. The free space addition container may also contain information about shadow areas. The shadow area indicates the area behind the object as seen from the vehicle or the sensor mounted on the vehicle.
 図20は、CPサービス124を提供するV2X通信装置によるセンサデータ抽出方法を説明する図である。より具体的には、図20(a)は、V2X通信装置が低レベルでセンサデータを抽出する方法を示す。図20(b)は、V2X通信装置が高レベルでセンサデータを抽出する方法を示す図である。 FIG. 20 is a diagram explaining a sensor data extraction method by a V2X communication device that provides the CP service 124. More specifically, FIG. 20(a) illustrates how a V2X communication device extracts sensor data at a low level. FIG. 20(b) illustrates how a V2X communication device extracts sensor data at a high level.
 CPMの一部として送信されるセンサデータのソースは、受信側のV2X通信装置における将来のデータフュージョンプロセスの要件に従って選択される必要がある。一般的に、送信されるデータは、元のセンサデータにできるだけ近いものであるべきである。しかし、単純にオリジナルのセンサデータ、たとえば生データを送信することは現実的ではない。データレートと伝送周期に関して非常に高い要求を課すからである。 The source of sensor data transmitted as part of CPM should be selected according to the requirements of the future data fusion process in the receiving V2X communication device. In general, the transmitted data should be as close as possible to the original sensor data. However, simply transmitting original sensor data, such as raw data, is not realistic. This is because it imposes very high demands on data rate and transmission cycle.
 図20(a)および図20(b)はCPMの一部として送信されるデータを選択するための可能な実施形態を示している。図20(a)の実施形態では、センサデータは異なるセンサから取得され、低レベルデータ管理エンティティの一部として処理される。このエンティティは、次のCPMの一部として挿入されるオブジェクトデータを選択し、また、検出されたオブジェクトの妥当性を計算することができる。図20(a)では、各センサのデータを送信するため、V2Xネットワークを介して送信されるデータ量が増加する。しかし、受信側のV2X通信装置でセンサ情報を効率的に活用できる。 Figures 20(a) and 20(b) show possible embodiments for selecting data to be sent as part of the CPM. In the embodiment of Figure 20(a), sensor data is obtained from different sensors and processed as part of the low-level data management entity. This entity can select the object data to be inserted as part of the next CPM and also compute the validity of the detected objects. In FIG. 20(a), since data of each sensor is transmitted, the amount of data transmitted via the V2X network increases. However, the sensor information can be efficiently utilized by the V2X communication device on the receiving side.
 図20(b)の実施形態では、V2X通信装置メーカに固有のデータフュージョン部により提供されるセンサデータまたはオブジェクトデータがCPMの一部として送信される。 In the embodiment of FIG. 20(b), sensor data or object data provided by a data fusion unit specific to the V2X communication device manufacturer is transmitted as part of the CPM.
 図20(b)では、データフュージョン部を介して1つに集められた統合センサデータが伝送されるため、V2Xネットワークを介して伝送されるデータ量が少なくて済むという利点がある。しかし、センサ情報を収集するV2X通信装置の収集方式に依存するというデメリットがある。また、メーカにより異なるデータフュージョン処理が実施される可能性がある。 In FIG. 20(b), integrated sensor data gathered through the data fusion unit is transmitted, which has the advantage that only a small amount of data needs to be transmitted over the V2X network. However, it has the disadvantage of depending on how the V2X communication device collects the sensor information. In addition, different manufacturers may implement different data fusion processes.
 V2X通信装置のセンサで物体を検出するたびに、その尤度を算出する必要がある。物体の尤度が所定の閾値PLAUS_OBJを超えた場合、送信を検討する必要がある。 Every time an object is detected by the sensor of the V2X communication device, it is necessary to calculate its likelihood. If the object likelihood exceeds a predetermined threshold PLAUS_OBJ, then transmission should be considered.
 たとえば、検出された物体の現在のヨー角と、発信元のV2X通信装置が過去に送信したCPMに含まれるヨー角との差の絶対値が4度を超える場合、送信を検討する。発信元のV2X通信装置と検出物体の現在位置の相対距離と、発信元のV2X通信装置が過去に送信したCPMに含まれる発信元V2X通信装置と検出物体の相対距離の差が4mを超える場合、または、検出物体の現在の速度と発信元のV2X通信装置が過去に送信したCPMに含まれる検出物体の速度との差の絶対値が0.5m/sを超える場合は送信を検討してもよい。 For example, if the absolute value of the difference between the current yaw angle of a detected object and the yaw angle contained in a CPM previously transmitted by the originating V2X communication device exceeds 4 degrees, transmission is considered. Transmission may also be considered if the difference between the current relative distance between the originating V2X communication device and the detected object and the corresponding relative distance contained in a previously transmitted CPM exceeds 4 m, or if the absolute value of the difference between the current speed of the detected object and the speed of that object contained in a previously transmitted CPM exceeds 0.5 m/s.
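The trigger conditions above can be sketched as a single check. The dictionary keys and the symmetric use of absolute differences for all three quantities are assumptions for illustration; the thresholds are the values given in the description (4 degrees, 4 m, 0.5 m/s).

```python
def should_include_object(current, last_sent,
                          yaw_thresh_deg=4.0,
                          dist_thresh_m=4.0,
                          speed_thresh_mps=0.5):
    """Decide whether a detected object should be considered for inclusion
    in the next CPM, based on how much its state has changed since the
    last CPM that contained it (hypothetical sketch of the triggers above)."""
    if last_sent is None:
        return True  # object never sent before
    if abs(current["yaw_deg"] - last_sent["yaw_deg"]) > yaw_thresh_deg:
        return True  # yaw angle changed by more than 4 degrees
    if abs(current["distance_m"] - last_sent["distance_m"]) > dist_thresh_m:
        return True  # relative distance changed by more than 4 m
    if abs(current["speed_mps"] - last_sent["speed_mps"]) > speed_thresh_mps:
        return True  # speed changed by more than 0.5 m/s
    return False
```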
 CAMは、V2Xモジュールを搭載した車両が、周囲のV2Xモジュールを搭載した車両に定期的に位置や状態を送信し、より安定した走行を支援する技術である。なお、V2Xモジュールは、V2X通信装置あるいはV2X通信装置を含む構成である。  CAM is a technology in which a vehicle equipped with a V2X module periodically transmits its position and status to other vehicles equipped with a V2X module in the surrounding area to support more stable driving. Note that the V2X module is a configuration including a V2X communication device or a V2X communication device.
 CAMは自車両の情報しか共有できないという制約があった。CPサービス124はCAMを補完する技術である。ADAS技術を搭載した車両は増え続けているため、多くの車両にはカメラ、レーダー、ライダーなどのセンサが搭載され、多くの周辺車両を認識し運転支援機能を発揮している。CPS(すなわちCPサービス)技術は、ADAS技術において、周辺環境を認識したセンサデータをV2X通信により周囲に通知する技術である。  There was a restriction that the CAM could only share information about its own vehicle. CP service 124 is a technology that complements CAM. As the number of vehicles equipped with ADAS technology continues to increase, many vehicles are equipped with sensors such as cameras, radars, and lidars, which recognize many surrounding vehicles and demonstrate driving support functions. CPS (that is, CP service) technology is a technology in ADAS technology that notifies the surroundings of sensor data that recognizes the surrounding environment through V2X communication.
 FIG. 21 is a diagram explaining the CP service 124. Assume that vehicles TxV1 and RxV2 are each equipped with at least one sensor and have the sensing ranges SrV1 and SrV2 indicated by dashed lines. TxV1 has a CPS function. Using a plurality of ADAS sensors mounted on the vehicle, TxV1 can recognize the vehicles RV1 to RV11, which are surrounding objects within sensing range SrV1. The object information obtained by this recognition may be distributed via V2X communication to nearby vehicles equipped with V2X communication devices.
 As a result, among the surrounding vehicles that receive the CPM from TxV1, RxV1, which is not equipped with a sensor, can acquire information about following vehicles. Also, when RxV2, which is equipped with a sensor, receives the CPM from TxV1, it can acquire information about objects outside its own sensing range SrV2 or located in blind spots (for example, RV1 to RV3, RV5, RV6, and RV8 to RV10).
 As shown in FIG. 14 described above, the facility layer 120 can provide the CP service 124. The CP service 124 may run in the facility layer 120 and may utilize services that reside in the facility layer 120.
 The LDM 127 is a service that provides map information, and may provide map information for the CP service 124. The provided map information may include dynamic information in addition to static information. The POTI unit 126 performs a service that provides the position and time of the own vehicle; using the corresponding information, it can provide the position of the own vehicle and the exact time. The VDP 125 is a service that provides information about the vehicle; it may be used to incorporate information such as the size of the own vehicle into a CPM before the CPM is transmitted.
 ADAS vehicles are equipped with various sensors, such as cameras, infrared sensors, radars, and lidars, for driving support. Each sensor recognizes objects individually. The recognized object information may be collected and fused by a data fusion unit and provided to ADAS applications.
 Referring again to FIG. 20, the collection and fusion of sensor information in ADAS technology will be described in relation to the CP service 124. Existing sensors for ADAS and for CPS can continuously track surrounding objects and collect the relevant data. When sensor values are used for the CP service, sensor information can be collected in two ways.
 As shown in FIG. 20(a), each sensor value can be provided individually to surrounding vehicles through the CP basic service. Alternatively, as shown in FIG. 20(b), integrated sensor information aggregated after the data fusion unit may be provided to the CP basic service. The CP basic service forms part of the CP service 124.
 FIG. 22 is a flowchart showing the process of transmitting a CPM. The process shown in FIG. 22 is executed every CPM generation cycle. In step S301, the information that constitutes the CPM is acquired; step S301 is executed by, for example, the detection information acquisition unit 201. In step S302, a CPM is generated based on the information acquired in step S301; step S302 can also be executed by the detection information acquisition unit 201. In step S303, the transmission unit 221 transmits the CPM generated in step S302 to the surroundings of the vehicle.
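The three steps of this cycle (acquire, generate, transmit) can be sketched as follows; the callables are hypothetical stand-ins for the detection information acquisition unit 201 and the transmission unit 221, not actual APIs:

```python
# Hedged sketch of one CPM generation cycle (S301 to S303).
# acquire_detection_info, build_cpm and broadcast are hypothetical helpers
# standing in for units 201 and 221.

def cpm_generation_cycle(acquire_detection_info, build_cpm, broadcast):
    detections = acquire_detection_info()   # S301: gather the CPM contents
    cpm = build_cpm(detections)             # S302: assemble the message
    broadcast(cpm)                          # S303: send to surrounding vehicles
    return cpm
```

In a real stack this function would be invoked by a timer at the CPM generation period.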
 An example of the flow of the positioning error estimation-related processing executed in the second embodiment will be described using the flowchart of FIG. 23. In the second embodiment, the CPM transmitted from the roadside unit 3 is used as the target information transmitted from the roadside unit 3, and a CAM or CPM transmitted from another vehicle is used as the target information transmitted from another vehicle. Because CPM and CAM are used as target information, the flowchart shown in FIG. 23 executes steps S1A, S2A, S3A, S8A, and S10A instead of steps S1, S2, S3, S8, and S10 of the flowchart of FIG. 5. Since steps S4, S5, S6, S7, S9, S11, S12, and S13 are common to the flowcharts of FIG. 5 and FIG. 23, their description is omitted.
 In step S1A, when the roadside-vehicle receiving unit 231 has received a CPM transmitted from the roadside unit 3 via road-to-vehicle communication (YES in S1A), the process proceeds to step S2A. On the other hand, when the roadside-vehicle receiving unit 231 has not received a CPM from the roadside unit 3 (NO in S1A), the process proceeds to step S8A.
 In step S2A, when the vehicle-side receiving unit 222 has received a CAM or CPM transmitted from another vehicle via vehicle-to-vehicle communication (YES in S2A), the process proceeds to step S3A. On the other hand, when no CAM or CPM has been received via vehicle-to-vehicle communication (NO in S2A), the process proceeds to step S13. In the following, CAM and CPM may be collectively referred to as messages.
 In step S3A, the identity determination unit 204 determines whether the target specified by the POC of the CPM determined to have been received in S1A is the same as the target specified by the CAM or CPM determined to have been received in S2A. The CPM transmitted by the roadside unit 3 contains the latitude and longitude of the roadside unit 3 (hereinafter, absolute coordinates) and the distance and bearing from the roadside unit 3 to the target. Therefore, the absolute coordinates of the target detected by the roadside unit 3 can be determined from the CPM acquired from the roadside unit 3. In addition, the POC of that CPM contains various information about the target detected by the roadside unit 3.
 On the other hand, a CAM or CPM acquired from another vehicle contains the absolute coordinates of that other vehicle. A CAM acquired from another vehicle also contains various information about that vehicle. A CPM acquired from another vehicle likewise contains information about the transmitting vehicle, although less of it than a CAM does. Therefore, from the CPM acquired from the roadside unit 3 and the CAM or CPM acquired from another vehicle, it can be determined whether the target detected by the roadside unit 3 is the same as the other vehicle that transmitted the CAM or CPM.
 For example, when the absolute coordinates of the target specified by the CPM acquired from the roadside unit 3 are close to the absolute coordinates of the other vehicle indicated by the CAM or CPM acquired from that vehicle, it can be determined that the target specified by the CPM and the other vehicle are the same. Whether coordinates are close can be judged by whether the distance between them is less than a preset threshold. Note that relative coordinates referenced to the own vehicle may be used instead of absolute coordinates.
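The coordinate-proximity test can be sketched as below. The positions are assumed to have already been converted into a common planar coordinate system in meters (an assumption made for simplicity, since the messages actually carry latitude and longitude), and the 2.0 m default threshold is an illustrative value:

```python
import math

def is_same_by_position(target_xy, vehicle_xy, threshold_m=2.0):
    """Identity judgment by coordinate proximity: the two positions are
    considered to belong to the same object if the distance between them
    is below a preset threshold."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    return math.hypot(dx, dy) < threshold_m
```

The same comparison works unchanged for relative coordinates referenced to the own vehicle, as long as both inputs use the same frame.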
 Also, when the behavior of the target specified by the CPM acquired from the roadside unit 3 is similar to the behavior of the other vehicle indicated by the CAM or CPM acquired from that vehicle, the target and the other vehicle may be determined to be the same. The identity determination may also use behavior together with absolute or relative coordinates.
 Behavior is one or more pieces of information indicating the movement of an object, such as speed, acceleration, and angular velocity. A CPM does not contain the behavior of the transmitting vehicle. When the behavior of another vehicle is determined from CPMs, CPMs are received from the same vehicle a plurality of times, and the behavior is determined from the temporal change of the absolute coordinates of that vehicle contained in the successive CPMs. On the other hand, since a CAM contains information indicating the behavior of the transmitting vehicle, it is preferable to use CAM as the message when identity is determined using behavior. If identity is determined using behavior, the determination can be made accurately even when there is a positioning error.
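Deriving behavior from successive CPMs can be sketched as follows: given two timestamped positions of the same vehicle taken from consecutive CPMs, an approximate velocity is the positional change divided by the elapsed time (a planar coordinate system in meters is assumed here for simplicity):

```python
def velocity_from_cpm_positions(p1, t1, p2, t2):
    """Estimate the velocity (vx, vy) in m/s of a vehicle from two of its
    positions contained in CPMs received at times t1 < t2 (seconds)."""
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("timestamps must be strictly increasing")
    return ((p2[0] - p1[0]) / dt, (p2[1] - p1[1]) / dt)
```

Acceleration could be obtained the same way from successive velocity estimates; a CAM simply carries these quantities directly.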
 Furthermore, candidates may be narrowed down before the identity determination is performed. For narrowing down, for example, the type of the object that transmitted the message can be used; the type can be judged, for example, from the station type contained in the message. The absolute coordinates of the objects may also be used for narrowing down. When narrowing down by absolute coordinates, the narrowing range is defined by a threshold radius set to a distance larger than the distance difference at which objects are judged to be the same, with the center of the range at the coordinates of the target or the other vehicle. The identity determination is then performed on the targets and other vehicles within this narrowed range.
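The narrowing step can be sketched as a pre-filter applied before the identity determination. The candidate record layout and the station-type strings below are illustrative assumptions:

```python
import math

def narrow_candidates(center_xy, candidates, radius_m, station_type=None):
    """Keep only the candidates within radius_m of center_xy (the target
    or other-vehicle coordinates), optionally also filtering by station
    type. radius_m is set larger than the distance threshold used by the
    identity determination itself."""
    kept = []
    for c in candidates:
        if station_type is not None and c.get("station_type") != station_type:
            continue
        d = math.hypot(c["x"] - center_xy[0], c["y"] - center_xy[1])
        if d <= radius_m:
            kept.append(c)
    return kept
```

Only the candidates surviving this filter would then go through the full position/behavior comparison.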
 The determination in step S8A is the same as in step S2A. If step S8A is YES, the process proceeds to step S9; if step S8A is NO, the process proceeds to step S13. If step S9 is NO, the process proceeds to S10A.
 In step S10A, a second estimation-related process is performed, and the process then proceeds to step S13. An example of the flow of the second estimation-related process in the second embodiment will be described using the flowchart of FIG. 24. The flowchart shown in FIG. 24 executes steps S101A, S103A, and S106A instead of steps S101, S103, and S106 of the flowchart of FIG. 6. Since steps S102, S104, S105, and S107 are common to the flowcharts of FIG. 6 and FIG. 24, their description is omitted.
 In step S101A, when the vehicle-side receiving unit 222, which serves as the mounted object information acquisition unit, has received a CAM or CPM from another vehicle different from the other vehicle whose message was received in S2A or S8A (YES in S101A), the process proceeds to step S102. When no CAM or CPM has been received (NO in S101A), the process proceeds to S13 of FIG. 23.
 Step S103A is executed by the identity determination unit 204. In S103A, it is determined whether the target specified by the CAM or CPM determined to have been received in S101A is the same as the first target whose position has been specified by the target position specifying unit 282.
 One object of the identity determination is the first target judged in the immediately preceding S102 to have had its position specified. The other object of the identity determination is either the second communication device mounted object or the second target.
 If the message determined to have been received in S101A is a CAM, the other object of the identity determination is the second communication device mounted object; shown concretely using FIG. 25, the second communication device mounted object is the vehicle VEb. If the message determined to have been received in S101A is a CPM, the other object of the identity determination is one or both of the second communication device mounted object and the second target. The second target is a target detected by the second communication device mounted object; shown concretely using FIG. 1, the second target is the target LMa.
 When the message is a CPM, it can be determined whether the first target and the second target are the same. The POC contained in the CPM transmitted by the first communication device mounted object is used as the target information of the first target, and the POC contained in the CPM transmitted by the second communication device mounted object is used as the target information of the second target. Whether the first target and the second target are the same is determined by whether one or both of the position and the behavior of the objects indicated by the POCs contained in the two CPMs are close. Of course, the type of target may also be taken into consideration, and narrowing down by position may be performed.
 When the message is a CAM, it is determined whether the first target and the second communication device mounted object are the same. The POC contained in the CPM transmitted by the first communication device mounted object is used as the target information of the first target, and the CAM transmitted by the second communication device mounted object is used as the information of the second communication device mounted object. Whether the first target and the second communication device mounted object are the same is determined by whether one or both of the position and the behavior of the objects indicated by the POC of that CPM and by the CAM are close.
 Even when the message is a CPM, it can be determined whether the first target and the second communication device mounted object are the same. The POC contained in the CPM transmitted by the first communication device mounted object is used as the target information of the first target, and one or both of the management container and the OVC of the CPM transmitted by the second communication device mounted object are used as the information of the second communication device mounted object. As in the determination of whether the first target and the second target are the same, the type of target may be taken into consideration, and narrowing down by position may be performed.
 In S106A, the error estimation unit 206 estimates the second positioning error. The second positioning error is the positional deviation between the two objects subjected to the identity determination in S103A. If the two objects subjected to the identity determination in S103A were the first target and the second target, the deviation between the object positions indicated by the POCs contained in the two CPMs is taken as the second positioning error. If the two objects subjected to the identity determination in S103A were the first target and the second communication device mounted object, the deviation between the object position indicated by the POC of the CPM transmitted by the first communication device mounted object and the position of the second communication device mounted object indicated in the CAM received from it is taken as the second positioning error. Of course, when the second positioning error is estimated, the two positions are compared after their coordinate systems are aligned.
 Even when the second positioning error is estimated as in the second embodiment, once the second positioning error has been estimated, the second positioning position and the position of the second target acquired from the second communication device mounted object can each be corrected by the second positioning error, as in the first embodiment, to specify the position of the second communication device mounted object and the position of the second target relative to the own vehicle.
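The estimation of the second positioning error and the subsequent correction can be sketched as follows, with both positions assumed to already be expressed in the same aligned planar coordinate system (meters):

```python
def estimate_positioning_error(reference_xy, reported_xy):
    """Second positioning error: the deviation (dx, dy) of the reported
    position from the reference position, after coordinate alignment."""
    return (reported_xy[0] - reference_xy[0], reported_xy[1] - reference_xy[1])

def correct_position(reported_xy, error):
    """Correct a subsequently reported position by the estimated error."""
    return (reported_xy[0] - error[0], reported_xy[1] - error[1])
```

Once the error is estimated for one matched pair, the same offset can be applied both to the second positioning position and to the second target positions reported by the same sender.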
 In the second embodiment described above, one or both of the CPM and CAM transmitted by the vehicle unit 2 and the CPM transmitted by the roadside unit 3 are used when estimating the positioning error and the like of the first communication device mounted object. Since an existing system that transmits and receives CAM and CPM can be utilized, it is easy to construct a system that accurately specifies the positions of targets such as the first communication device mounted object.
 (Embodiment 3)
 In the first and second embodiments, the case where the communication device mounted object is a vehicle has been described as an example, but this is not necessarily limiting. For example, the communication device mounted object may be a mobile object other than a vehicle, such as a drone. In that case, a unit having the functions of the vehicle unit 2 other than those specific to vehicles may be mounted on the mobile object such as a drone.
 (Embodiment 4)
 Alternatively, when the roadside unit 3 can be detected by the perimeter monitoring sensor 40 of the own vehicle, the communication device 20 may estimate the positioning error of the own vehicle from the detected position of the roadside unit 3 referenced to the measured position of the own vehicle and the absolute position information of the roadside unit 3 acquired from the roadside unit 3 via road-to-vehicle communication. The detected position of the roadside unit 3 referenced to the measured position of the own vehicle may be obtained by converting the position of the roadside unit 3 relative to the own-vehicle position recognized by the perimeter monitoring ECU 50 into latitude and longitude coordinates using the coordinates of the measured position of the own vehicle. Since this detected position is referenced to the measured position of the own vehicle, it contains the positioning error of the own vehicle. Therefore, the positioning error of the own vehicle can be estimated from the deviation between the detected position of the roadside unit 3 referenced to the measured position of the own vehicle and the absolute position of the roadside unit 3. The estimated positioning error of the own vehicle may be used for correction when specifying the position of the own vehicle.
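The estimation in Embodiment 4 can be sketched as follows: the roadside unit's position detected relative to the own vehicle is shifted by the own vehicle's measured position, and the deviation from the roadside unit's known absolute position is the own vehicle's positioning error. Planar coordinates in meters are assumed here for simplicity, instead of the latitude and longitude actually used:

```python
def own_positioning_error(own_measured_xy, rsu_relative_xy, rsu_absolute_xy):
    """Estimate the own vehicle's positioning error from a detected
    roadside unit. rsu_relative_xy is the roadside unit's position
    relative to the own vehicle as recognized by the perimeter sensors;
    rsu_absolute_xy is the accurate position reported by the unit."""
    # Detected roadside-unit position referenced to the measured own position.
    detected_abs = (own_measured_xy[0] + rsu_relative_xy[0],
                    own_measured_xy[1] + rsu_relative_xy[1])
    # Its deviation from the known absolute position is the own error.
    return (detected_abs[0] - rsu_absolute_xy[0],
            detected_abs[1] - rsu_absolute_xy[1])
```

Subtracting this error from subsequent measured positions would give the corrected own-vehicle position.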
 (Embodiment 5)
 In the above-described embodiments, the communication device 20 estimates the positioning error, but this is not necessarily limiting. For example, the perimeter monitoring ECU 50 may take over the functions of the communication device 20 other than the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203. In this case, the perimeter monitoring ECU 50 may include a functional block that acquires the target information received by the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203; this functional block corresponds to the mounted object information acquisition unit and the roadside unit information acquisition unit, and the perimeter monitoring ECU 50 corresponds to the vehicle device. Alternatively, the functions of the communication device 20 may be shared between the communication device 20 and the perimeter monitoring ECU 50; in this case, a unit including the communication device 20 and the perimeter monitoring ECU 50 corresponds to the vehicle device.
 (Embodiment 6)
 In the above-described embodiments, the first positioning error was estimated after the roadside unit reference target position and the first positioning position were converted by the conversion unit 205 into relative positions with respect to the own vehicle (S5, S6). However, the first positioning error may instead be estimated with the coordinates of the roadside unit reference target position and the first positioning position treated as absolute positions. In particular, since the first positioning position is in absolute coordinates, the conversion unit 205 is unnecessary when the roadside unit 3 converts the position of the detected target into absolute coordinates and transmits it instead of the roadside unit reference target position.
 (Embodiment 7)
 In the embodiments, the roadside unit reference target position, which is the position of a target detected by the roadside unit 3, was used as the reference position. However, the reference position is not limited to the position of a target detected by the roadside unit 3.
 For example, the vehicle unit 2 mounted on the vehicle VEc can receive the CPM transmitted by the roadside unit 3. This CPM also contains the position of the vehicle VEc, so the vehicle unit 2 mounted on the vehicle VEc can correct its own positioning error using the position of the vehicle VEc contained in the CPM. For a certain period after the positioning error has been corrected, the vehicle unit 2 can be regarded as having almost no positioning error. Therefore, during a certain period after the positioning error is corrected, the position of a target detected by the error-corrected vehicle unit 2 may be used as the reference position. The vehicle unit 2 may indicate whether the target positions it transmits can be used as reference positions, for example, by a flag included in the message.
 Based on the above embodiments, the present disclosure can also be understood to have the following technical features.
 (Technical feature 1)
 A vehicle device usable in a vehicle, comprising:
 a roadside unit information acquisition unit (231) that acquires, via wireless communication, a roadside unit reference target position transmitted from a roadside unit, the roadside unit having a perimeter monitoring sensor, having information on the absolute position of its own device with higher positional accuracy than obtainable by positioning using positioning satellite signals, and being equipped with a communication device capable of wireless communication with the vehicle, the roadside unit reference target position being the position, referenced to the position of the roadside unit, of a target detected by the perimeter monitoring sensor;
 a mounted object information acquisition unit (222) that acquires, via wireless communication, a first positioning position transmitted from a first communication device mounted object other than a roadside unit, the first communication device mounted object being capable of positioning using positioning satellite signals and being equipped with a communication device capable of wireless communication with the vehicle, the first positioning position being the position of the first communication device mounted object obtained by that positioning;
 an identity determination unit (204) that determines whether the target detected by the perimeter monitoring sensor of the roadside unit from which the roadside unit information acquisition unit acquired the roadside unit reference target position is the same as the first communication device mounted object that is the transmission source of the first positioning position acquired by the mounted object information acquisition unit;
 a conversion unit (205) that converts the roadside unit reference target position acquired by the roadside unit information acquisition unit and the first positioning position acquired by the mounted object information acquisition unit into relative positions with respect to the vehicle; and
 an error estimation unit (206) that, when the identity determination unit determines that the target and the first communication device mounted object are the same, estimates a first positioning error, which is the error of positioning by the first communication device mounted object referenced to the position of the vehicle, from the deviation between the roadside unit reference target position and the first positioning position as converted by the conversion unit.
 (Technical feature 2)
 The vehicle device according to technical feature 1, comprising:
 a mounted object position specifying unit (281) that, once the error estimation unit has estimated the first positioning error, specifies the position of the first communication device mounted object with respect to the vehicle by correcting the first positioning position acquired from the first communication device mounted object by the mounted object information acquisition unit by that first positioning error, even when the first communication device mounted object can no longer be detected by the perimeter monitoring sensor of the roadside unit.
 (Technical feature 3)
 The vehicle device according to technical feature 1 or 2, wherein:
 the mounted object information acquisition unit also acquires, via wireless communication, a first mounted object reference target position transmitted from the first communication device mounted object, which further has a perimeter monitoring sensor, the first mounted object reference target position being the position, referenced to the first positioning position, of a first target that is a target detected by that perimeter monitoring sensor; and
 the vehicle device comprises a target position specifying unit (282) that, once the error estimation unit has estimated the first positioning error, specifies the position of the first target with respect to the vehicle by correcting the first mounted object reference target position acquired from the first communication device mounted object by the mounted object information acquisition unit by that first positioning error.
 (技術的特徴4)
 技術的特徴3に記載の車両用装置であって、
 搭載物情報取得部は、周辺監視センサを有し、測位衛星の信号を用いた測位が可能であり、且つ、車両と無線通信が可能な通信機を搭載する路側機及び第1通信機搭載物以外の第2通信機搭載物から送信される、その測位によって求められたその第2通信機搭載物の位置である第2測位位置、及びその周辺監視センサで検出した物標である第2物標の、その測位によって求められたその第2通信機搭載物の位置を基準とする位置である第2搭載物基準物標位置も、無線通信を介して取得し、
 同一判定部は、物標位置特定部で車両に対する位置を特定した第1物標と、搭載物情報取得部で取得した第2搭載物基準物標位置の送信元の第2通信機搭載物の周辺監視センサで検出した第2物標とが同一か否かを判定し、
 変換部は、搭載物情報取得部で取得した第2搭載物基準物標位置を車両に対する相対位置に変換し、
 誤差推定部は、同一判定部で第1物標と第2物標とが同一と判定した場合には、変換部で変換した第2搭載物基準物標位置と、物標位置特定部で特定した車両に対する第1物標の位置とのずれから、車両の位置を基準とした第2通信機搭載物での測位の誤差である第2測位誤差を推定する車両用装置。
(Technical feature 4)
A vehicle device according to technical feature 3, wherein
the mounted object information acquisition unit further acquires, via wireless communication, a second positioning position and a second mounted object reference target position transmitted from a second communication device mounted object other than the roadside unit and the first communication device mounted object, the second communication device mounted object having a surrounding monitoring sensor, being capable of positioning using positioning satellite signals, and being equipped with a communication device capable of wireless communication with the vehicle, the second positioning position being the position of the second communication device mounted object obtained by that positioning, and the second mounted object reference target position being the position, relative to the position of the second communication device mounted object obtained by that positioning, of a second target that is a target detected by that surrounding monitoring sensor,
the identity determination unit determines whether the first target, whose position with respect to the vehicle has been specified by the target position specifying unit, and the second target, detected by the surrounding monitoring sensor of the second communication device mounted object that transmitted the second mounted object reference target position acquired by the mounted object information acquisition unit, are the same,
the conversion unit converts the second mounted object reference target position acquired by the mounted object information acquisition unit into a position relative to the vehicle, and
when the identity determination unit determines that the first target and the second target are the same, the error estimating unit estimates a second positioning error, which is an error of the positioning by the second communication device mounted object with the position of the vehicle as a reference, from a deviation between the second mounted object reference target position converted by the conversion unit and the position of the first target with respect to the vehicle specified by the target position specifying unit.
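Technical features 3 and 4 describe a chained calibration: a stored first positioning error corrects the first communicator's target report, and the corrected report then serves as the reference for estimating the second communicator's error. The following is a minimal, non-authoritative sketch of that chain in Python, assuming 2-D positions already converted into the vehicle frame; all variable names and numeric values are hypothetical and only illustrate the arithmetic:

```python
def add(a, b):
    """Component-wise addition of 2-D positions given as (x, y) tuples."""
    return (a[0] + b[0], a[1] + b[1])

def sub(a, b):
    """Component-wise subtraction of 2-D positions."""
    return (a[0] - b[0], a[1] - b[1])

# First positioning error, previously estimated against the roadside unit
# (technical features 1 and 2); hypothetical value in metres.
first_positioning_error = (0.8, -0.6)

# Technical feature 3: the first target reported by the first communication
# device mounted object is corrected by the stored first positioning error.
first_target_reported = (12.0, 4.0)      # vehicle frame, as received
first_target_position = add(first_target_reported, first_positioning_error)

# Technical feature 4: if the identity determination unit judges that this
# first target and the second target reported by the second communication
# device mounted object are the same, the residual offset between the two
# is the second positioning error.
second_target_reported = (12.5, 3.1)     # vehicle frame, as received
second_positioning_error = sub(first_target_position, second_target_reported)
```

The same subtraction pattern recurs at each stage: a trusted position minus a reported position yields that reporter's error, which is then added to its subsequent reports.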
 (技術的特徴5)
 技術的特徴4に記載の車両用装置であって、
 誤差推定部で第2測位誤差を一旦推定した後は、搭載物情報取得部で第2通信機搭載物から取得する第2測位位置を、その第2測位誤差の分だけ補正して、車両に対する第2通信機搭載物の位置を特定する搭載物位置特定部(281)を備える車両用装置。
(Technical feature 5)
A vehicle device according to technical feature 4, comprising
a mounted object position specifying unit (281) that, after the error estimating unit has once estimated the second positioning error, corrects the second positioning position acquired from the second communication device mounted object by the mounted object information acquisition unit by the amount of that second positioning error, thereby specifying the position of the second communication device mounted object with respect to the vehicle.
 (技術的特徴6)
 技術的特徴4又は5に記載の車両用装置であって、
 物標位置特定部は、誤差推定部で第2測位誤差を一旦推定した後は、搭載物情報取得部で第2通信機搭載物から取得する第2搭載物基準物標位置を、その第2測位誤差の分だけ補正して、車両に対する第2物標の位置を特定する車両用装置。
(Technical feature 6)
A vehicle device according to technical feature 4 or 5, wherein
after the error estimating unit has once estimated the second positioning error, the target position specifying unit corrects the second mounted object reference target position acquired from the second communication device mounted object by the mounted object information acquisition unit by the amount of that second positioning error, thereby specifying the position of the second target with respect to the vehicle.
 (技術的特徴7)
 車両で用いることが可能な誤差推定方法であって、
 少なくとも1つのプロセッサにより実行される、
 周辺監視センサを有し、測位衛星の信号を用いた測位で求められるよりも位置精度の高い自機器の絶対位置の情報を有し、且つ、車両と無線通信が可能な通信機を搭載する路側機から送信される、周辺監視センサで検出した物標の、路側機の位置を基準とする位置である路側機基準物標位置を、無線通信を介して取得する路側機情報取得工程と、
 測位衛星の信号を用いた測位が可能であり、且つ、車両と無線通信が可能な通信機を搭載する路側機以外の第1通信機搭載物から送信される、その測位によって求められたその第1通信機搭載物の位置である第1測位位置を、無線通信を介して取得する搭載物情報取得工程と、
 路側機情報取得工程で路側機基準物標位置を取得した路側機の周辺監視センサで検出した物標と、搭載物情報取得工程で取得した第1測位位置の送信元の第1通信機搭載物とが同一か否かを判定する同一判定工程と、
 搭載物情報取得工程で取得した路側機基準物標位置及び搭載物情報取得工程で取得した第1測位位置を車両に対する相対位置に変換する変換工程と、
 同一判定工程で物標と第1通信機搭載物とが同一と判定した場合には、それらについての、変換工程で変換した路側機基準物標位置と第1測位位置とのずれから、車両の位置を基準とした第1通信機搭載物での測位の誤差である第1測位誤差を推定する誤差推定工程とを含む誤差推定方法。
(Technical feature 7)
An error estimation method that can be used in a vehicle, the method being executed by at least one processor and comprising:
a roadside unit information acquisition step of acquiring, via wireless communication, a roadside unit reference target position transmitted from a roadside unit that has a surrounding monitoring sensor, that has information on the absolute position of its own device with higher positional accuracy than obtained by positioning using positioning satellite signals, and that is equipped with a communication device capable of wireless communication with the vehicle, the roadside unit reference target position being the position, relative to the position of the roadside unit, of a target detected by the surrounding monitoring sensor;
a mounted object information acquisition step of acquiring, via wireless communication, a first positioning position transmitted from a first communication device mounted object other than the roadside unit, the first communication device mounted object being capable of positioning using positioning satellite signals and being equipped with a communication device capable of wireless communication with the vehicle, the first positioning position being the position of the first communication device mounted object obtained by that positioning;
an identity determination step of determining whether the target detected by the surrounding monitoring sensor of the roadside unit from which the roadside unit reference target position was acquired in the roadside unit information acquisition step and the first communication device mounted object that transmitted the first positioning position acquired in the mounted object information acquisition step are the same;
a conversion step of converting the roadside unit reference target position acquired in the mounted object information acquisition step and the first positioning position acquired in the mounted object information acquisition step into positions relative to the vehicle; and
an error estimation step of, when the target and the first communication device mounted object are determined to be the same in the identity determination step, estimating a first positioning error, which is an error of the positioning by the first communication device mounted object with the position of the vehicle as a reference, from a deviation between the roadside unit reference target position and the first positioning position converted for them in the conversion step.
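The sequence of steps in technical feature 7 can be sketched in a few lines of Python. This is only an illustration under simplifying assumptions, not the patented implementation: positions are 2-D, the vehicle-frame conversion omits heading rotation, the identity determination is reduced to a simple distance gate, and all function names, inputs, and the gate threshold are hypothetical:

```python
import math

def to_vehicle_frame(abs_pos, vehicle_pos):
    """Conversion step: express an absolute (x, y) position relative to the
    vehicle position (heading rotation omitted for brevity)."""
    return (abs_pos[0] - vehicle_pos[0], abs_pos[1] - vehicle_pos[1])

def is_same_object(pos_a, pos_b, gate_m=2.0):
    """Identity determination step, reduced here to a distance gate."""
    return math.dist(pos_a, pos_b) <= gate_m

def estimate_first_positioning_error(rsu_target_rel, first_positioning_rel):
    """Error estimation step: if the roadside unit's detected target and the
    first communicator-equipped object are judged the same, the deviation
    between their vehicle-frame positions is the first positioning error."""
    if not is_same_object(rsu_target_rel, first_positioning_rel):
        return None
    return (rsu_target_rel[0] - first_positioning_rel[0],
            rsu_target_rel[1] - first_positioning_rel[1])

# Hypothetical inputs: vehicle and roadside unit positions are absolute;
# the roadside unit reference target position is relative to the roadside unit.
vehicle_pos = (100.0, 200.0)
rsu_pos = (130.0, 210.0)
rsu_ref_target = (-5.0, 2.0)            # target, relative to roadside unit
first_positioning_pos = (124.6, 212.5)  # satellite fix reported by the object

# Roadside unit information acquisition + conversion steps.
rsu_target_rel = to_vehicle_frame(
    (rsu_pos[0] + rsu_ref_target[0], rsu_pos[1] + rsu_ref_target[1]),
    vehicle_pos)
# Mounted object information acquisition + conversion steps.
first_rel = to_vehicle_frame(first_positioning_pos, vehicle_pos)

error = estimate_first_positioning_error(rsu_target_rel, first_rel)
```

Once `error` is known, later satellite fixes from the same object can be shifted by it even after the object leaves the roadside unit's sensor coverage, which is the correction described in technical feature 2.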
 なお、本開示は、上述した実施形態に限定されるものではなく、請求項に示した範囲で種々の変更が可能であり、異なる実施形態にそれぞれ開示された技術的手段を適宜組み合わせて得られる実施形態についても本開示の技術的範囲に含まれる。また、本開示に記載の制御部及びその手法は、コンピュータプログラムにより具体化された1つ乃至は複数の機能を実行するようにプログラムされたプロセッサを構成する専用コンピュータにより、実現されてもよい。あるいは、本開示に記載の装置及びその手法は、専用ハードウェア論理回路により、実現されてもよい。もしくは、本開示に記載の装置及びその手法は、コンピュータプログラムを実行するプロセッサと1つ以上のハードウェア論理回路との組み合わせにより構成された1つ以上の専用コンピュータにより、実現されてもよい。また、コンピュータプログラムは、コンピュータにより実行されるインストラクションとして、コンピュータ読み取り可能な非遷移有形記録媒体に記憶されていてもよい。 It should be noted that the present disclosure is not limited to the above-described embodiments; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present disclosure. The control unit and the method thereof described in the present disclosure may be realized by a dedicated computer constituting a processor programmed to execute one or more functions embodied by a computer program. Alternatively, the device and the method thereof described in the present disclosure may be realized by dedicated hardware logic circuits. Alternatively, the device and the method thereof described in the present disclosure may be realized by one or more dedicated computers configured by a combination of a processor that executes a computer program and one or more hardware logic circuits. The computer program may be stored, as instructions to be executed by a computer, in a computer-readable non-transitory tangible recording medium.

Claims (10)

  1.  車両で用いることが可能な車両用装置であって、
     物標の参照位置を検出する参照装置から無線送信される前記参照位置を取得する参照位置取得部(231)と、
     第1通信機搭載物から送信され、前記第1通信機搭載物での測位によって求められた前記第1通信機搭載物の位置である第1測位位置を取得する搭載物情報取得部(222)と、
     前記参照位置取得部で前記参照位置を取得した前記物標と、前記搭載物情報取得部で前記第1測位位置を取得した前記第1通信機搭載物とが同一か否かを判定する同一判定部(204)と、
     前記同一判定部で前記物標と前記第1通信機搭載物とが同一と判定した場合には、前記参照位置と前記第1測位位置とのずれから、前記第1通信機搭載物での測位の誤差である第1測位誤差を推定する誤差推定部(206)とを備える車両用装置。
    A vehicle device that can be used in a vehicle, the vehicle device comprising:
    a reference position acquisition unit (231) that acquires a reference position of a target, the reference position being wirelessly transmitted from a reference device that detects the reference position;
    a mounted object information acquisition unit (222) that acquires a first positioning position transmitted from a first communication device mounted object, the first positioning position being the position of the first communication device mounted object obtained by positioning by the first communication device mounted object;
    an identity determination unit (204) that determines whether the target whose reference position was acquired by the reference position acquisition unit and the first communication device mounted object whose first positioning position was acquired by the mounted object information acquisition unit are the same; and
    an error estimating unit (206) that, when the identity determination unit determines that the target and the first communication device mounted object are the same, estimates a first positioning error, which is an error of the positioning by the first communication device mounted object, from a deviation between the reference position and the first positioning position.
  2.  請求項1に記載の車両用装置であって、
     前記参照装置が路側機であり、
     前記参照位置取得部として、前記路側機が検出した前記物標の位置である路側機基準物標位置を前記参照位置として前記路側機から取得する路側機情報取得部を備える車両用装置。
    The vehicle device according to claim 1, wherein
    the reference device is a roadside unit, and
    the vehicle device comprises, as the reference position acquisition unit, a roadside unit information acquisition unit that acquires, from the roadside unit as the reference position, a roadside unit reference target position that is the position of the target detected by the roadside unit.
  3.  請求項2に記載の車両用装置であって、
     前記搭載物情報取得部で取得した前記路側機基準物標位置及び前記搭載物情報取得部で取得した前記第1測位位置を前記車両に対する相対位置に変換する変換部(205)を備え、
     前記誤差推定部は、前記変換部で変換した前記路側機基準物標位置と、前記変換部で変換した前記第1測位位置とのずれから、前記車両の位置を基準とした前記第1測位誤差を推定する、車両用装置。
    The vehicle device according to claim 2, comprising
    a conversion unit (205) that converts the roadside unit reference target position acquired by the mounted object information acquisition unit and the first positioning position acquired by the mounted object information acquisition unit into positions relative to the vehicle, wherein
    the error estimating unit estimates the first positioning error with the position of the vehicle as a reference, from a deviation between the roadside unit reference target position converted by the conversion unit and the first positioning position converted by the conversion unit.
  4.  請求項1~3のいずれか1項に記載の車両用装置であって、
     前記誤差推定部で前記第1測位誤差を一旦推定した後は、前記第1通信機搭載物が前記参照装置により検出できなくなった場合であっても、前記搭載物情報取得部で前記第1通信機搭載物から取得する前記第1測位位置を、その第1測位誤差の分だけ補正して、前記車両に対するその第1通信機搭載物の位置を特定する搭載物位置特定部(281)を備える車両用装置。
    The vehicle device according to any one of claims 1 to 3, comprising
    a mounted object position specifying unit (281) that, after the error estimating unit has once estimated the first positioning error, corrects the first positioning position acquired from the first communication device mounted object by the mounted object information acquisition unit by the amount of that first positioning error, thereby specifying the position of the first communication device mounted object with respect to the vehicle, even when the first communication device mounted object can no longer be detected by the reference device.
  5.  請求項1~4のいずれか1項に記載の車両用装置であって、
     前記搭載物情報取得部は、前記第1通信機搭載物から送信され、前記第1通信機搭載物が備える周辺監視センサで検出した第1物標の位置である第1搭載物基準物標位置も取得し、
     前記誤差推定部で前記第1測位誤差を一旦推定した後は、前記搭載物情報取得部で取得する前記第1搭載物基準物標位置をその第1測位誤差の分だけ補正して、前記車両に対する前記第1物標の位置を特定する物標位置特定部(282)を備える車両用装置。
    The vehicle device according to any one of claims 1 to 4, wherein
    the mounted object information acquisition unit also acquires a first mounted object reference target position transmitted from the first communication device mounted object, the first mounted object reference target position being the position of a first target detected by a surrounding monitoring sensor included in the first communication device mounted object, and
    the vehicle device comprises a target position specifying unit (282) that, after the error estimating unit has once estimated the first positioning error, corrects the first mounted object reference target position acquired by the mounted object information acquisition unit by the amount of that first positioning error, thereby specifying the position of the first target with respect to the vehicle.
  6.  請求項5に記載の車両用装置であって、
     前記搭載物情報取得部は、前記第1通信機搭載物以外の第2通信機搭載物から送信される前記第2通信機搭載物の位置である第2測位位置、及び前記第2通信機搭載物が備える周辺監視センサで検出した第2物標の前記第2通信機搭載物の位置を基準とする位置である第2搭載物基準物標位置も取得し、
     前記同一判定部は、前記第1物標と前記第2物標とが同一か否かを判定し、
     前記誤差推定部は、前記同一判定部で前記第1物標と前記第2物標とが同一と判定した場合には、前記第2搭載物基準物標位置と、前記物標位置特定部で特定した前記第1物標の位置とのずれから、前記第2通信機搭載物での前記測位の誤差である第2測位誤差を推定する車両用装置。
    The vehicle device according to claim 5, wherein
    the mounted object information acquisition unit also acquires a second positioning position that is the position of a second communication device mounted object other than the first communication device mounted object, the second positioning position being transmitted from the second communication device mounted object, and a second mounted object reference target position that is the position, relative to the position of the second communication device mounted object, of a second target detected by a surrounding monitoring sensor included in the second communication device mounted object,
    the identity determination unit determines whether the first target and the second target are the same, and
    when the identity determination unit determines that the first target and the second target are the same, the error estimating unit estimates a second positioning error, which is an error of the positioning by the second communication device mounted object, from a deviation between the second mounted object reference target position and the position of the first target specified by the target position specifying unit.
  7.  請求項6に記載の車両用装置であって、
     前記誤差推定部で前記第2測位誤差を一旦推定した後は、前記搭載物情報取得部で前記第2通信機搭載物から取得する前記第2測位位置を、その第2測位誤差の分だけ補正して、前記車両に対する前記第2通信機搭載物の位置を特定する搭載物位置特定部(281)を備える車両用装置。
    The vehicle device according to claim 6, comprising
    a mounted object position specifying unit (281) that, after the error estimating unit has once estimated the second positioning error, corrects the second positioning position acquired from the second communication device mounted object by the mounted object information acquisition unit by the amount of that second positioning error, thereby specifying the position of the second communication device mounted object with respect to the vehicle.
  8.  請求項6又は7に記載の車両用装置であって、
     前記物標位置特定部は、前記誤差推定部で前記第2測位誤差を一旦推定した後は、前記搭載物情報取得部で前記第2通信機搭載物から取得する前記第2搭載物基準物標位置を、その第2測位誤差の分だけ補正して、前記車両に対する前記第2物標の位置を特定する車両用装置。
    The vehicle device according to claim 6 or 7, wherein
    after the error estimating unit has once estimated the second positioning error, the target position specifying unit corrects the second mounted object reference target position acquired from the second communication device mounted object by the mounted object information acquisition unit by the amount of that second positioning error, thereby specifying the position of the second target with respect to the vehicle.
  9.  請求項5に記載の車両用装置であって、
     前記搭載物情報取得部は、前記第1通信機搭載物以外の第2通信機搭載物から送信される前記第2通信機搭載物の位置である第2測位位置も取得し、
     前記同一判定部は、前記第1物標と前記第2通信機搭載物とが同一か否かを判定し、
     前記誤差推定部は、前記同一判定部で前記第1物標と前記第2通信機搭載物とが同一であると判定した場合には、前記第2測位位置と前記第1物標の位置とのずれから、前記第2通信機搭載物での前記測位の誤差である第2測位誤差を推定する車両用装置。
    The vehicle device according to claim 5, wherein
    the mounted object information acquisition unit also acquires a second positioning position that is the position of a second communication device mounted object other than the first communication device mounted object, the second positioning position being transmitted from the second communication device mounted object,
    the identity determination unit determines whether the first target and the second communication device mounted object are the same, and
    when the identity determination unit determines that the first target and the second communication device mounted object are the same, the error estimating unit estimates a second positioning error, which is an error of the positioning by the second communication device mounted object, from a deviation between the second positioning position and the position of the first target.
  10.  車両で用いることが可能であり、少なくとも1つのプロセッサにより実行される誤差推定方法であって、
     物標の参照位置を検出する参照装置から無線送信される前記参照位置を取得する参照位置取得工程と、
     第1通信機搭載物から送信され、前記第1通信機搭載物での測位によって求められた前記第1通信機搭載物の位置である第1測位位置を取得する搭載物情報取得工程と、
     前記参照位置取得工程で前記参照位置を取得した前記物標と、前記搭載物情報取得工程で前記第1測位位置を取得した前記第1通信機搭載物とが同一か否かを判定する同一判定工程と、
     前記同一判定工程で前記物標と前記第1通信機搭載物とが同一と判定した場合には、前記参照位置と前記第1測位位置とのずれから、前記第1通信機搭載物での測位の誤差である第1測位誤差を推定する誤差推定工程とを含む誤差推定方法。
    An error estimation method that can be used in a vehicle and is executed by at least one processor, the method comprising:
    a reference position acquisition step of acquiring a reference position of a target, the reference position being wirelessly transmitted from a reference device that detects the reference position;
    a mounted object information acquisition step of acquiring a first positioning position transmitted from a first communication device mounted object, the first positioning position being the position of the first communication device mounted object obtained by positioning by the first communication device mounted object;
    an identity determination step of determining whether the target whose reference position was acquired in the reference position acquisition step and the first communication device mounted object whose first positioning position was acquired in the mounted object information acquisition step are the same; and
    an error estimation step of, when the target and the first communication device mounted object are determined to be the same in the identity determination step, estimating a first positioning error, which is an error of the positioning by the first communication device mounted object, from a deviation between the reference position and the first positioning position.
PCT/JP2022/020386 2021-06-14 2022-05-16 Vehicle device and error estimation method WO2022264731A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023529693A JPWO2022264731A1 (en) 2021-06-14 2022-05-16
DE112022003058.5T DE112022003058T5 (en) 2021-06-14 2022-05-16 VEHICLE DEVICE AND ERROR ESTIMATION METHODS

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021098905 2021-06-14
JP2021-098905 2021-06-14

Publications (1)

Publication Number Publication Date
WO2022264731A1 true WO2022264731A1 (en) 2022-12-22

Family

ID=84526190

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/020386 WO2022264731A1 (en) 2021-06-14 2022-05-16 Vehicle device and error estimation method

Country Status (3)

Country Link
JP (1) JPWO2022264731A1 (en)
DE (1) DE112022003058T5 (en)
WO (1) WO2022264731A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024202611A1 (en) * 2023-03-30 2024-10-03 日本電気株式会社 Object detection system, processing system, method for generating teacher data, object detection method, and recording medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012211843A (en) * 2011-03-31 2012-11-01 Daihatsu Motor Co Ltd Position correction device and inter-vehicle communication system
JP2016143088A (en) * 2015-01-29 2016-08-08 住友電気工業株式会社 Position detection system and on-vehicle information processing apparatus
JP2020193954A (en) * 2019-05-30 2020-12-03 住友電気工業株式会社 Position correction server, position management device, moving object position management system and method, position information correction method, computer program, onboard device, and vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7107660B2 (en) 2017-10-25 2022-07-27 株式会社Soken In-vehicle system, target object recognition method and computer program
JP2021098905A (en) 2019-12-20 2021-07-01 トヨタ紡織株式会社 Resin fiber, method for manufacturing fiber board and method for manufacturing molding


Also Published As

Publication number Publication date
JPWO2022264731A1 (en) 2022-12-22
DE112022003058T5 (en) 2024-05-08

Similar Documents

Publication Publication Date Title
EP3731547B1 (en) Device and method for v2x communication
US11388565B2 (en) Apparatus and method for V2X communication
US11218851B2 (en) Device and method for V2X communication
EP3462754B1 (en) Apparatus and method for v2x communication
US11968603B2 (en) Device and method for V2X communication
US11776405B2 (en) Apparatus and method for V2X communication
US20200322830A1 (en) Extra-vehicular communication device, onboard device, onboard communication system, communication control method, and communication control program
TW202209906A (en) Techniques for managing data distribution in a v2x environment
US20200326203A1 (en) Real-world traffic model
US20220107382A1 (en) Device and method for v2x communication
Kitazato et al. Proxy cooperative awareness message: an infrastructure-assisted v2v messaging
US20220086609A1 (en) Cpm message division method using object state sorting
US11145195B2 (en) Service station for an intelligent transportation system
WO2019131075A1 (en) Transmission device, point group data collection system, and computer program
CN113099529A (en) Indoor vehicle navigation method, vehicle-mounted terminal, field terminal server and system
WO2022264731A1 (en) Vehicle device and error estimation method
US11900808B2 (en) Apparatus, method, and computer program for a first vehicle and for estimating a position of a second vehicle at the first vehicle
EP4177862A1 (en) Road space collective perception message within an intelligent transport system
WO2022255074A1 (en) Device for vehicle and error estimation method
CN118805390A (en) Based on relative position of User Equipment (UE)
EP3783930B1 (en) Service station for an intelligent transportation system
WO2023171371A1 (en) Communication device and communication method
US20240038060A1 (en) Communication within an intelligent transport system for signalling hidden objects
JP2023176218A (en) Communication device and communication method
Paparone An open-source, multiprotocol Internet of vehicles platform for the analysis of traffic flows in urban scenarios

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22824725

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023529693

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 112022003058

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22824725

Country of ref document: EP

Kind code of ref document: A1