WO2022264731A1 - Vehicle device and error estimation method - Google Patents
- Publication number: WO2022264731A1 (application PCT/JP2022/020386)
- Authority: WIPO (PCT)
- Prior art keywords
- vehicle
- target
- communication device
- mounted object
- unit
- Prior art date
Classifications
- G08G1/16—Traffic control systems for road vehicles; Anti-collision systems
- G01S19/14—Satellite radio beacon positioning systems; Receivers specially adapted for specific applications
- G01S19/396—Determining accuracy or reliability of position or pseudorange measurements
- G01S19/51—Determining position; Relative positioning
- G08G1/09—Traffic control systems for road vehicles; Arrangements for giving variable traffic instructions
Definitions
- the present disclosure relates to a vehicle device and an error estimation method.
- Patent Document 1 discloses a technique for determining the existence of a target beyond the line of sight of the own vehicle using reception information, which is position information transmitted from another vehicle serving as the target, and sensor information including the moving speed and moving direction of the target detected by the own vehicle's sensor.
- With the technology disclosed in Patent Document 1, it is possible to determine the presence of other vehicles that cannot be detected by the own vehicle's sensors by receiving the position information transmitted from those other vehicles.
- However, the position information transmitted from other vehicles includes an error in positioning (hereinafter referred to as a positioning error), regardless of whether the positioning is performed using positioning satellites or without using them. Therefore, with the technology disclosed in Patent Document 1, it is difficult to accurately identify the position of another vehicle when that other vehicle is located outside the detection range of the own vehicle's sensors.
- One object of the present disclosure is to provide a vehicle device and an error estimation method that make it possible to more accurately identify the position of a communication device-mounted object outside the detection range of the own vehicle's surroundings monitoring sensor.
- The vehicle device of the present disclosure is a vehicle device that can be used in a vehicle, and includes: a reference position acquisition unit that acquires a reference position of a target, which is wirelessly transmitted from a reference device that detects the reference position; a mounted object information acquisition unit that acquires a first positioning position, which is the position of a first communication device-mounted object obtained by positioning at the first communication device-mounted object and transmitted from the first communication device-mounted object; an identity determination unit that determines whether or not the target whose reference position is acquired by the reference position acquisition unit and the first communication device-mounted object whose first positioning position is acquired by the mounted object information acquisition unit are the same; and an error estimation unit that, when they are determined to be the same, estimates a first positioning error, which is the positioning error at the first communication device-mounted object, from the deviation between the reference position and the first positioning position.
- The error estimation method of the present disclosure is an error estimation method that can be used in a vehicle and is executed by at least one processor, and includes: a reference position acquisition step of acquiring a reference position of a target, which is wirelessly transmitted from a reference device that detects the reference position; a mounted object information acquisition step of acquiring a first positioning position, which is the position of a first communication device-mounted object obtained by positioning at the first communication device-mounted object and transmitted from it; an identity determination step of determining whether or not the target and the first communication device-mounted object are the same; and an error estimation step of estimating, when they are determined to be the same, a first positioning error, which is the positioning error at the first communication device-mounted object, from the deviation between the reference position and the first positioning position.
- Since the first positioning position is obtained by positioning at the first communication device-mounted object, it includes a positioning error.
- On the other hand, the disclosed technique acquires the reference position of the target from the reference device. Therefore, when it is determined that the target and the first communication device-mounted object are the same, the first positioning error, which is the error of positioning by the first communication device-mounted object, can be estimated more accurately from the deviation between the reference position and the first positioning position acquired from the first communication device-mounted object.
- In this way, it becomes possible to estimate the first positioning error from the reference position and the first positioning position. Further, by using the first positioning error, the positioning error of the first communication device-mounted object can be corrected, and the position of the first communication device-mounted object can be specified with higher accuracy. As a result, it becomes possible to more accurately identify the position of a communication device-mounted object outside the detection range of the surroundings monitoring sensor of the own vehicle.
- FIG. 2 is a diagram showing an example of a schematic configuration of a vehicle unit 2;
- FIG. 2 is a diagram showing an example of a schematic configuration of a communication device 20;
- FIG. 6 is a flow chart showing an example of the flow of positioning error estimation-related processing in the communication device 20;
- 6 is a flowchart showing an example of the flow of second estimation-related processing in the communication device 20;
- FIG. 10 is a diagram for explaining an outline of a second embodiment;
- A diagram illustrating an exemplary architecture of a V2X communication device;
- FIG. 3 shows the logical interfaces between the CA service and other layers;
- A functional block diagram of the CA service;
- FIG. 3 shows the logical interfaces between the CP service and other layers;
- A functional block diagram of the CP service;
- FIG. 4 illustrates FOC (or SIC) in CPM;
- FIG. 4 is a diagram illustrating POC in CPM;
- FIG. 4 is a diagram for explaining CP services;
- FIG. 11 is a diagram for explaining targets for identity determination in the second embodiment.
- The vehicle system 1 includes a vehicle unit 2 used in each of a plurality of vehicles, and a roadside unit 3.
- a vehicle VEa, a vehicle VEb, and a vehicle VEc will be described as examples of vehicles.
- Vehicle VEa, vehicle VEb, and vehicle VEc are assumed to be automobiles.
- the roadside device 3 includes a control device 300, a communication device 301, a perimeter monitoring sensor 302, and a position storage section 303, as shown in FIG.
- the communication device 301 exchanges information with the outside of its own device by performing wireless communication with the outside of its own device.
- The communication device 301 wirelessly communicates with a communication device 20, described later, that is used in a vehicle. That is, the communication device 301 performs road-to-vehicle communication. In this embodiment, it is assumed that road-to-vehicle communication is performed between the roadside device 3 and the communication device 20 of the vehicle VEa.
- The perimeter monitoring sensor 302 detects obstacles around the own device, such as moving objects (pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles) and stationary objects (fallen objects on the road, guardrails, curbs, trees, and the like). As the perimeter monitoring sensor 302, a surroundings monitoring camera whose imaging range is a predetermined range around the own device may be used, or millimeter-wave radar, sonar, LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), or the like may be used. These may also be used in combination.
- The surroundings monitoring camera sequentially outputs captured images to the control device 300 as sensing information. Sensors that transmit search waves, such as sonar, millimeter-wave radar, and LIDAR, sequentially output to the control device 300, as sensing information, scanning results based on the received signals obtained when reflected waves from obstacles are received.
- the position storage unit 303 stores information on the absolute position of the own device.
- a non-volatile memory may be used as the position storage unit 303 .
- the absolute position information should be at least latitude and longitude coordinates. It is assumed that the absolute position information of the own device has a position accuracy higher than that obtained by positioning using the signals of the positioning satellites by the own device alone.
- the information on the absolute position of the own device may be obtained in advance by surveying with reference to triangulation points or electronic reference points and stored in the position storage unit 303 .
- the control device 300 includes a processor, memory, I/O, and a bus connecting these, and executes various processes by executing control programs stored in the memory.
- the control device 300 recognizes information (hereinafter referred to as target information) about targets existing around the device from the sensing information output from the periphery monitoring sensor 302 .
- the target information may be the relative position, type, speed, moving azimuth, etc. of the target with respect to the own device.
- the relative position may be the distance and direction from the own device. It is assumed that the position specifying accuracy in recognizing the position of the target using the perimeter monitoring sensor 302 is higher than the position specifying accuracy by positioning with the locator 30, which will be described later.
- the type of target can be recognized by image recognition processing such as template matching based on the image captured by the surrounding surveillance camera.
- The distance of the target from the own device and the azimuth of the target with respect to the own device may be recognized from the installation position and optical-axis direction of the surroundings monitoring camera relative to the own device and from the position of the target in the captured image.
- the distance of the target from the own device may be recognized based on the amount of parallax between the pair of cameras.
- The speed of the target may be recognized from the amount of change per unit time in the relative position of the target with respect to the own device.
- the direction of movement may be recognized from the direction of change in the relative position of the target relative to the device itself and the information on the absolute position of the device itself stored in the position storage unit 303 .
- control device 300 uses the sensing information of the sensor that transmits the search wave to recognize the distance of the target from its own device, the azimuth of the target with respect to its own device, the speed of the target, and the moving azimuth.
- the control device 300 may recognize the distance from its own device to the target on the basis of the time from when the search wave is sent until when the reflected wave is received.
- the control device 300 may recognize the azimuth of the target with respect to its own device based on the direction in which the search wave from which the reflected wave was obtained was transmitted.
- the control device 300 may recognize the speed of the target from the relative speed of the target with respect to its own device calculated based on the Doppler shift between the transmitted search wave and the reflected wave.
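- As a rough numerical illustration of the distance and relative-speed recognition described above, the sketch below computes the range from the round-trip time of a search wave and the radial speed from the Doppler shift. The constants and function names are illustrative assumptions for a radio-wave sensor such as millimeter-wave radar, not part of the disclosure.

```python
# Minimal sketch of the range/speed recognition described above.
# The constants and function names are illustrative assumptions,
# not part of the patent disclosure.

C = 299_792_458.0  # propagation speed of a radio search wave [m/s]

def range_from_round_trip(round_trip_time_s: float) -> float:
    """Distance to the target from the time between sending the search
    wave and receiving its reflection (round trip, hence the factor 2)."""
    return C * round_trip_time_s / 2.0

def relative_speed_from_doppler(f_tx_hz: float, f_rx_hz: float) -> float:
    """Relative (radial) speed of the target from the Doppler shift
    between the transmitted and reflected wave; positive = approaching."""
    return C * (f_rx_hz - f_tx_hz) / (2.0 * f_tx_hz)

# Example: a reflection received 1 microsecond later at 77 GHz,
# shifted by +5.1 kHz, corresponds to roughly 150 m and ~10 m/s.
print(range_from_round_trip(1e-6))                      # ~149.9 m
print(relative_speed_from_doppler(77e9, 77e9 + 5.1e3))  # ~9.9 m/s
```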
- the control device 300 controls the communication device 301 to perform road-to-vehicle communication.
- the control device 300 transmits the target object information recognized from the sensing information output from the surroundings monitoring sensor 302 to the communication device 20 of the vehicle in the vicinity of the own device through road-to-vehicle communication.
- The control device 300 also includes, in the target information, the information on the absolute position of its own device stored in the position storage unit 303, and transmits the target information.
- The target information transmitted from the communication device 301 includes, as described above, the relative position of the target, the type of the target, the speed of the target, the moving direction of the target, and the absolute position of the roadside unit 3 (hereinafter referred to as the roadside unit absolute position).
- The relative position of the target in the target information is a position based on the position of the roadside unit 3.
- Hereinafter, this relative position included in the target information is referred to as the roadside unit reference target position.
- the control device 300 may be configured to periodically transmit the target information, for example.
- the vehicle unit 2 includes a communication device 20, a locator 30, a surroundings monitoring sensor 40, a surroundings monitoring ECU 50, and a driving support ECU 60, as shown in FIG.
- the communication device 20, the locator 30, the perimeter monitoring ECU 50, and the driving support ECU 60 may be configured to be connected to an in-vehicle LAN (see LAN in FIG. 3).
- the locator 30 is equipped with a GNSS (Global Navigation Satellite System) receiver.
- a GNSS receiver receives positioning signals from multiple satellites.
- the locator 30 uses the positioning signals received by the GNSS receiver to sequentially locate the position of the own vehicle equipped with the locator 30 (hereinafter referred to as the positioning position).
- the locator 30 determines a specified number, such as four, of positioning satellites to be used for positioning calculation from among the positioning satellites from which positioning signals have been output. Then, using the code pseudoranges, carrier wave phases, etc. for the determined specified number of positioning satellites, positioning calculation is performed to calculate the coordinates of the positioning position.
- the positioning operation itself may be a positioning operation using code pseudoranges or a positioning operation using carrier phases.
- the coordinates should be at least latitude and longitude coordinates.
- the locator 30 may be equipped with an inertial sensor.
- the inertial sensor for example, a gyro sensor and an acceleration sensor may be used.
- the locator 30 may sequentially measure the positioning position by combining the positioning signal received by the GNSS receiver and the measurement result of the inertial sensor. Note that the travel distance obtained from the signals sequentially output from the vehicle speed sensor mounted on the own vehicle may be used for the positioning of the positioning position.
- The surroundings monitoring sensor 40 detects obstacles around the own vehicle, such as moving objects (pedestrians, animals other than humans, bicycles, motorcycles, and other vehicles) and stationary objects (fallen objects on the road, guardrails, curbs, trees, and the like). As the surroundings monitoring sensor 40, a surroundings monitoring camera whose imaging range is a predetermined range around the own vehicle may be used, or millimeter-wave radar, sonar, LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), or the like may be used. These may also be used in combination.
- The surroundings monitoring camera sequentially outputs captured images to the surroundings monitoring ECU 50 as sensing information. Sensors that transmit search waves, such as sonar, millimeter-wave radar, and LIDAR, sequentially output to the surroundings monitoring ECU 50, as sensing information, scanning results based on the received signals obtained when reflected waves from obstacles are received.
- The perimeter monitoring ECU 50 includes a processor, memory, I/O, and a bus connecting these, and executes a control program stored in the memory to perform various processes related to recognition of targets in the vicinity of the own vehicle.
- the surroundings monitoring ECU 50 recognizes target object information about targets existing around the vehicle from the sensing information output from the surroundings monitoring sensor 40 in the same manner as the control device 300 described above.
- Recognition of target object information by the perimeter monitoring ECU 50 is the same as recognition of target object information by the control device 300 except that the own vehicle is used as a reference instead of the roadside unit 3 .
- the points of difference are as follows.
- The speed of the target can be recognized from the amount of change per unit time in the relative position of the target with respect to the own vehicle and from the vehicle speed of the own vehicle.
- the speed of the target may be recognized from the speed of the target relative to the vehicle, which is calculated based on the Doppler shift between the transmitted search wave and the reflected wave, and the vehicle speed of the vehicle. It is assumed that the accuracy of position identification in recognizing the position of the target using the perimeter monitoring sensor 40 is higher than the accuracy of position identification by positioning with the locator 30 .
- the driving assistance ECU 60 includes a processor, memory, I/O, and a bus connecting these, and executes various processes related to driving assistance of the own vehicle by executing a control program stored in the memory.
- the driving assistance ECU 60 performs driving assistance based on the position of the target specified by the perimeter monitoring ECU 50 and the communication device 20 .
- the position of the target specified by the perimeter monitoring ECU 50 is the relative position of the target detected by the perimeter monitoring sensor 40 (hereinafter referred to as the line-of-sight target) with respect to the own vehicle.
- the position of the target specified by the communication device 20 is the relative position of the target that cannot be detected by the perimeter monitoring sensor 40 (hereinafter referred to as non-line-of-sight target) with respect to the own vehicle.
- Examples of driving assistance include vehicle control for avoiding approaching a target, alerting for avoiding approaching a target, and the like.
- the communication device 20 includes a processor, memory, I/O, and a bus connecting these, and executes a control program stored in the memory to perform wireless communication with the outside of the vehicle and target position identification. Execute the process. Execution of the control program by the processor corresponds to execution of the method corresponding to the control program.
- The memory referred to here is a non-transitory tangible storage medium that non-transitorily stores computer-readable programs and data. The non-transitory tangible storage medium is implemented by a semiconductor memory, a magnetic disk, or the like.
- This communication device 20 corresponds to a vehicle device. Details of the processing in the communication device 20 will be described later.
- The communication device 20 includes, as functional blocks, a detection information acquisition unit 201, a vehicle-to-vehicle communication unit 202, a road-to-vehicle communication unit 203, an identity determination unit 204, a conversion unit 205, an error estimation unit 206, an error storage unit 207, a position specifying unit 208, and a position storage unit 209.
- Part or all of the functions executed by the communication device 20 may be configured as hardware using one or a plurality of ICs or the like. Also, some or all of the functional blocks included in the communication device 20 may be implemented by a combination of software executed by a processor and hardware members.
- the detection information acquisition unit 201 acquires target information recognized by the perimeter monitoring ECU 50 . That is, the information of the target detected by the perimeter monitoring sensor 40 is acquired.
- the target information is the relative position of the target consisting of the distance of the target from the own vehicle and the orientation of the target with respect to the own vehicle, the type of target, the speed of the target, and the moving direction of the target.
- the relative position of the target may be coordinates based on the positioning position of the own vehicle.
- the detection information acquisition unit 201 also acquires the measured position of the own vehicle measured by the locator 30 .
- the vehicle-to-vehicle communication unit 202 exchanges information with the communication device 20 of the other vehicle by performing vehicle-to-vehicle communication with the other vehicle. Further, in the present embodiment, vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEb, and vehicle-to-vehicle communication is performed between the vehicle VEa and the vehicle VEc.
- the vehicle-to-vehicle communication unit 202 includes a transmission unit 221 and a vehicle-to-vehicle reception unit 222.
- the transmission unit 221 transmits information to other vehicles through inter-vehicle communication.
- the transmission unit 221 transmits the target information acquired by the detection information acquisition unit 201 and the positioning position measured by the locator 30 of the own vehicle to the communication device 20 of the other vehicle via wireless communication.
- the transmitting unit 221 may include identification information for identifying the own vehicle (hereinafter referred to as transmission source identification information) in the target information.
- the source identification information may be the vehicle ID of the own vehicle or the communication device ID of the communication device 20 of the own vehicle.
- the vehicle-side receiving unit 222 receives information transmitted from other vehicles through vehicle-to-vehicle communication.
- the vehicle-side receiving unit 222 receives target information transmitted from the communication device 20 of another vehicle via wireless communication. This other vehicle is assumed to be a vehicle using the vehicle unit 2 .
- the vehicle-side receiving section 222 corresponds to the mounted object information acquiring section.
- the processing in this vehicle-side receiving section 222 corresponds to the mounted object information acquisition step.
- the vehicle-side receiving unit 222 will be described as receiving target information from two types of other vehicles through vehicle-to-vehicle communication.
- the first type is other vehicles within the detection range of the perimeter monitoring sensor 302 of the roadside unit 3 .
- This other vehicle will hereinafter be referred to as the first communication device-mounted object.
- the second type is other vehicles outside the detection range of the perimeter monitoring sensor 302 of the roadside unit 3 .
- This other vehicle is hereinafter referred to as a second communication device mounted object.
- The target information received by the vehicle-side receiving unit 222 includes, as described above, the relative position of the target with respect to the transmission-source vehicle (hereinafter referred to as the mounted object reference target position), the type of the target, the speed of the target, the moving azimuth of the target, the positioning position of the transmission-source vehicle, and the transmission source identification information.
- The positioning position measured by the locator 30 of the first communication device-mounted object will be referred to as the first positioning position.
- A target detected by the perimeter monitoring sensor 40 of the first communication device-mounted object will be referred to as the first target.
- The positioning position measured by the locator 30 of the second communication device-mounted object will be referred to as the second positioning position.
- A target detected by the perimeter monitoring sensor 40 of the second communication device-mounted object will be referred to as the second target.
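- The target information exchanged over vehicle-to-vehicle communication, as listed above, can be pictured as a simple record. The field names and types below are illustrative assumptions chosen to mirror the terms used in this description; they are not a normative message format.

```python
from dataclasses import dataclass

@dataclass
class TargetInformation:
    """Illustrative V2V payload mirroring the fields described above."""
    source_id: str                 # transmission source identification information
    source_positioning_position: tuple[float, float]  # (lat, lon) measured by the sender's locator 30
    target_relative_position: tuple[float, float]     # mounted object reference target position
                                                       # (distance [m], azimuth [deg] from the sender)
    target_type: str               # e.g. "pedestrian", "vehicle"
    target_speed: float            # [m/s]
    target_heading: float          # moving azimuth [deg]
```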
- the road-to-vehicle communication unit 203 exchanges information by performing road-to-vehicle communication with the roadside device 3 . Further, in this embodiment, it is assumed that road-to-vehicle communication is performed between the vehicle VEa and the roadside device 3 .
- the road-to-vehicle communication unit 203 includes a road-to-vehicle receiving unit 231 . Note that the road-to-vehicle communication unit 203 may transmit information to the roadside device 3 through road-to-vehicle communication, but the description is omitted for convenience.
- The road-to-vehicle receiving unit 231 receives information transmitted from the roadside device 3 through road-to-vehicle communication.
- Specifically, the road-to-vehicle receiving unit 231 receives the target information transmitted from the roadside device 3 via wireless communication.
- The target information received by the road-to-vehicle receiving unit 231 includes the roadside unit reference target position, the type of the target, the speed of the target, the moving direction of the target, and the roadside unit absolute position.
- The roadside unit reference target position is the reference position referred to when the error estimation unit 206, described later, estimates the first positioning error, and the roadside unit 3 corresponds to the reference device.
- The road-to-vehicle receiving unit 231 corresponds to the reference position acquisition unit and the roadside device information acquisition unit.
- The processing in the road-to-vehicle receiving unit 231 corresponds to the reference position acquisition step and the roadside device information acquisition step.
- The identity determination unit 204 determines whether or not the target detected by the perimeter monitoring sensor 302 of the roadside unit 3 whose roadside unit reference target position was acquired by the road-to-vehicle receiving unit 231 (hereinafter referred to as the roadside unit detection target) and the other vehicle that is the transmission source of the target information acquired by the vehicle-side receiving unit 222 (hereinafter referred to as the communicating other vehicle) are the same.
- When they are determined to be the same, the communicating other vehicle corresponds to the first communication device-mounted object, and the positioning position included in its target information corresponds to the first positioning position.
- the processing in the identity determination unit 204 corresponds to the identity determination step.
- For example, the identity determination unit 204 determines whether or not the roadside unit detection target and the communicating other vehicle are the same based on whether or not the roadside unit reference target position acquired by the road-to-vehicle receiving unit 231 and the positioning position acquired by the vehicle-side receiving unit 222 approximate each other. For example, if the difference between these positions is less than a threshold, the identity determination unit 204 may determine that they are the same; if the difference is not less than the threshold, it may determine that they are not the same. Other conditions, such as matching of target types, a speed difference less than a threshold, and a moving-direction difference less than a threshold, may also be used as conditions for the identity determination. In this case, the target information transmitted from the transmission unit 221 may include information such as the type, speed, and moving direction of the own vehicle.
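- A minimal sketch of such an identity determination, assuming both positions have already been converted into a common coordinate frame (for example, the own vehicle coordinate system described next) and using illustrative threshold values:

```python
import math

# Illustrative thresholds; the disclosure leaves the concrete values open.
POSITION_THRESHOLD_M = 2.0
SPEED_THRESHOLD_MPS = 1.0
HEADING_THRESHOLD_DEG = 15.0

def is_same(roadside_ref_pos, other_vehicle_pos,
            type_a=None, type_b=None,
            speed_a=None, speed_b=None,
            heading_a=None, heading_b=None) -> bool:
    """Return True when the roadside unit detection target and the
    communicating other vehicle are judged to be the same object.
    Positions are (x, y) in a common frame, e.g. the own vehicle frame."""
    dx = roadside_ref_pos[0] - other_vehicle_pos[0]
    dy = roadside_ref_pos[1] - other_vehicle_pos[1]
    if math.hypot(dx, dy) >= POSITION_THRESHOLD_M:
        return False
    # Optional additional conditions, as mentioned in the description.
    if type_a is not None and type_b is not None and type_a != type_b:
        return False
    if speed_a is not None and speed_b is not None \
            and abs(speed_a - speed_b) >= SPEED_THRESHOLD_MPS:
        return False
    if heading_a is not None and heading_b is not None:
        diff = abs(heading_a - heading_b) % 360.0
        if min(diff, 360.0 - diff) >= HEADING_THRESHOLD_DEG:
            return False
    return True
```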
- The conversion unit 205 converts the roadside unit reference target position acquired by the road-to-vehicle receiving unit 231 and the positioning position acquired by the vehicle-side receiving unit 222 into positions relative to the own vehicle.
- For example, these may be converted into coordinates in an XY coordinate system whose origin is the positioning position of the own vehicle measured by the locator 30 (hereinafter referred to as the own vehicle coordinate system).
- the processing in this conversion unit 205 corresponds to the conversion step.
- When the identity determination unit 204 determines that the roadside unit detection target and the communicating other vehicle are the same, the communicating other vehicle is the first communication device-mounted object.
- In this case, the conversion unit 205 converts the first positioning position acquired by the vehicle-side receiving unit 222 from the first communication device-mounted object into a position relative to the own vehicle.
- The conversion unit 205 may also convert the mounted object reference target position acquired by the vehicle-side receiving unit 222 into a position relative to the own vehicle.
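- A minimal sketch of the conversion step, assuming the received positions are latitude/longitude pairs and using a flat-earth approximation around the own vehicle's positioning position; the actual conversion method is not specified in this description.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius (approximation)

def to_own_vehicle_frame(own_lat_deg, own_lon_deg, lat_deg, lon_deg):
    """Convert a latitude/longitude position into (x, y) metres in an
    XY coordinate system whose origin is the own vehicle's positioning
    position (x: east, y: north). Flat-earth approximation, adequate
    only for short ranges around the own vehicle."""
    dlat = math.radians(lat_deg - own_lat_deg)
    dlon = math.radians(lon_deg - own_lon_deg)
    x = EARTH_RADIUS_M * dlon * math.cos(math.radians(own_lat_deg))
    y = EARTH_RADIUS_M * dlat
    return x, y
```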
- The error estimation unit 206 estimates the positioning error of the communicating other vehicle based on the deviation between the roadside unit reference target position and the positioning position converted by the conversion unit 205 into positions relative to the own vehicle.
- When the identity determination unit 204 determines that the roadside unit detection target and the communicating other vehicle are the same, the communicating other vehicle is the first communication device-mounted object. Therefore, the error estimation unit 206 estimates the positioning error of the first communication device-mounted object with respect to the position of the own vehicle (hereinafter, the first positioning error) from the deviation between the converted roadside unit reference target position and the converted first positioning position.
- the processing in this error estimating section 206 corresponds to the error estimating step.
- the positioning error estimated by the error estimator 206 may be the deviation of each of the X and Y coordinates of the own vehicle coordinate system.
- Error estimation section 206 stores the estimated first positioning error in error storage section 207 .
- the error storage unit 207 may be a non-volatile memory or a volatile memory.
- the error estimator 206 may associate the estimated first positioning error with the transmission source identification information of the first communication device-mounted object from which the first positioning error was estimated, and store them in the error storage 207 .
- As the transmission source identification information of the first communication device-mounted object, the transmission source identification information included in the target information received by the vehicle-side receiving unit 222 may be used.
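- Under the same assumptions, estimating the first positioning error and storing it per transmission source can be sketched as follows (an illustration, not the claimed implementation):

```python
# error_storage maps transmission source identification information to the
# estimated positioning error, expressed as (dx, dy) in the own vehicle frame.
error_storage: dict[str, tuple[float, float]] = {}

def estimate_first_positioning_error(roadside_ref_pos_xy, first_positioning_pos_xy,
                                     source_id: str) -> tuple[float, float]:
    """Deviation between the roadside unit reference target position and the
    first positioning position, both already converted to the own vehicle frame."""
    dx = first_positioning_pos_xy[0] - roadside_ref_pos_xy[0]
    dy = first_positioning_pos_xy[1] - roadside_ref_pos_xy[1]
    error_storage[source_id] = (dx, dy)
    return dx, dy
```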
- The position specifying unit 208 has a mounted object position specifying unit 281 and a target position specifying unit 282. After the first positioning error has once been estimated by the error estimation unit 206, it is preferable that the mounted object position specifying unit 281 specifies the position of the first communication device-mounted object with respect to the own vehicle by correcting the first positioning position acquired from the first communication device-mounted object by the vehicle-side receiving unit 222 by the amount of the first positioning error, even when the first communication device-mounted object can no longer be detected by the perimeter monitoring sensor 302 of the roadside unit 3.
- For example, when the identity determination unit 204 determines that the roadside unit detection target and the communicating other vehicle are not the same, but the first positioning error of that first communication device-mounted object is already stored in the error storage unit 207, the mounted object position specifying unit 281 may specify the position of the first communication device-mounted object with respect to the own vehicle by correcting the first positioning position acquired by the vehicle-side receiving unit 222 by the first positioning error.
- The correction may be performed by shifting the first positioning position so as to cancel out the deviation of the first positioning error. According to this, even if neither the perimeter monitoring sensor 302 of the roadside device 3 nor the surroundings monitoring sensor 40 of the own vehicle can detect the first communication device-mounted object, the relative position of the first communication device-mounted object with respect to the own vehicle can be specified with higher accuracy.
- Whether or not the first positioning error of the first communication device-mounted object has been stored in the error storage unit 207 may be judged by whether or not transmission source identification information linked to a first positioning error exists in the error storage unit 207.
- the mounted object position specifying unit 281 stores the position of the first communication device mounted object with respect to the specified own vehicle in the position storage unit 209 .
- the position storage unit 209 may be a volatile memory.
- the position of the first communication device mounted object that can be detected by the perimeter monitoring sensor 40 of the own vehicle may be stored in the memory of the perimeter monitoring ECU 50 instead of being stored in the position storage unit 209 .
- As for the position of the first communication device-mounted object within the detection range of the surroundings monitoring sensor 40, the position detected by the surroundings monitoring sensor 40 of the own vehicle may be used.
- It is preferable that the target position specifying unit 282 specifies the position of the first target with respect to the own vehicle by correcting, by the amount of the first positioning error, the mounted object reference target position acquired by the vehicle-side receiving unit 222 from the first communication device-mounted object (hereinafter referred to as the first mounted object reference target position).
- The first target is, as described above, the target detected by the perimeter monitoring sensor 40 of the first communication device-mounted object.
- The correction may be performed by shifting the first mounted object reference target position so as to cancel out the deviation corresponding to the first positioning error.
- According to this, the relative position of the first target with respect to the own vehicle can be specified with higher accuracy even from the first mounted object reference target position acquired by the vehicle-side receiving unit 222 from the first communication device-mounted object via wireless communication.
- the target position specifying unit 282 stores the specified position of the first target relative to the own vehicle in the position storage unit 209 .
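- The corrections described above amount to shifting a received position so as to cancel the stored positioning error of its transmission source. A sketch that reuses the hypothetical error_storage above:

```python
def correct_position(received_pos_xy: tuple[float, float],
                     source_id: str) -> tuple[float, float]:
    """Shift a position received from a communication device-mounted object
    (its positioning position or a mounted object reference target position,
    already converted to the own vehicle frame) so as to cancel the stored
    positioning error of that transmission source."""
    dx, dy = error_storage.get(source_id, (0.0, 0.0))
    return received_pos_xy[0] - dx, received_pos_xy[1] - dy
```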
- A case is assumed in which the position of the first target with respect to the own vehicle has been specified by the target position specifying unit 282 by correction using the first positioning error, and the vehicle-side receiving unit 222 has acquired a mounted object reference target position from the second communication device-mounted object (hereinafter referred to as the second mounted object reference target position).
- In this case, it is preferable that the identity determination unit 204 determines whether or not the first target whose position relative to the own vehicle has been specified by the target position specifying unit 282 and the target detected by the perimeter monitoring sensor 40 of the second communication device-mounted object that is the transmission source of the second mounted object reference target position acquired by the vehicle-side receiving unit 222 (hereinafter referred to as the second target) are the same.
- For example, the identity determination unit 204 determines whether or not the first target and the second target are the same based on whether or not the target information acquired by the vehicle-side receiving unit 222 from the first communication device-mounted object and the target information acquired by the vehicle-side receiving unit 222 from the second communication device-mounted object approximate each other.
- The target information to be compared for determining identity may be, for example, the type of the target, the speed of the target, and the moving direction of the target.
- The identity determination unit 204 may determine that the two are the same when all of the following conditions are satisfied: the types match, the speed difference is less than a threshold, and the moving-direction difference is less than a threshold.
- Alternatively, the conditions to be satisfied for the identity determination may be only some of these: matching of types, a speed difference less than a threshold, and a moving-direction difference less than a threshold. Further, the identity determination unit 204 may determine whether or not the first mounted object reference target position and the second mounted object reference target position approximate each other after they have been converted into positions relative to the own vehicle by the conversion unit 205. In that case, the identity determination unit 204 may use the condition that the difference between the converted positions is less than a threshold, either on its own or in addition to the conditions that the types match, the speed difference is less than the threshold, and the moving-direction difference is less than the threshold.
- When the first target and the second target are determined to be the same, the error estimation unit 206 may estimate the positioning error of the second communication device-mounted object with respect to the position of the own vehicle (hereinafter referred to as the second positioning error) from the deviation between the second mounted object reference target position converted by the conversion unit 205 and the position of the first target with respect to the own vehicle specified by the target position specifying unit 282. According to this, even if neither the perimeter monitoring sensor 302 of the roadside device 3 nor the surroundings monitoring sensor 40 of the own vehicle has ever detected the second communication device-mounted object, the relative position of the second communication device-mounted object with respect to the own vehicle can be specified with higher accuracy.
- Error estimation section 206 stores the estimated second positioning error in error storage section 207 .
- the error estimator 206 may associate the estimated second positioning error with the transmission source identification information of the second communication device-mounted object from which the second positioning error was estimated, and store them in the error storage unit 207 .
- After the second positioning error has been estimated, the mounted object position specifying unit 281 specifies the position of the second communication device-mounted object relative to the own vehicle by correcting the second positioning position acquired from the second communication device-mounted object by the vehicle-side receiving unit 222 by the amount of the second positioning error. According to this, even if the second communication device-mounted object cannot be detected by either the perimeter monitoring sensor 302 of the roadside device 3 or the surroundings monitoring sensor 40 of the own vehicle, the relative position of the second communication device-mounted object with respect to the own vehicle can be specified with higher accuracy.
- the mounted object position specifying unit 281 stores the position of the second communication device mounted object with respect to the specified own vehicle in the position storage unit 209 .
- the position storage unit 209 may be a volatile memory. Note that the position of the second communication device mounted object that can be detected by the surroundings monitoring sensor 40 of the own vehicle may be stored in the memory of the surroundings monitoring ECU 50 without being stored in the position storage unit 209 . As for the position of the second communication device mounted object within the detection range of the surroundings monitoring sensor 40, the position detected by the surroundings monitoring sensor 40 of the own vehicle may be used.
- It is also preferable that the target position specifying unit 282 specifies the position of the second target with respect to the own vehicle by correcting, by the amount of the second positioning error, the second mounted object reference target position acquired by the vehicle-side receiving unit 222 from the second communication device-mounted object.
- the second target is, as described above, a target detected by the perimeter monitoring sensor 40 of the second communication device mounted object.
- the correction may be performed by shifting the position of the second mounted object reference target so as to eliminate the deviation of the second positioning error.
- According to this, the relative position of the second target with respect to the own vehicle can be specified with higher accuracy even from the second mounted object reference target position acquired by the vehicle-side receiving unit 222 from the second communication device-mounted object via wireless communication.
- the target position specifying unit 282 stores the specified position of the second target relative to the own vehicle in the position storage unit 209 .
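- The second positioning error follows the same pattern as the first, except that the reference is the already-corrected position of the first target rather than a roadside unit detection. A sketch continuing the hypothetical helpers above:

```python
def estimate_second_positioning_error(second_mount_ref_target_pos_xy,
                                      corrected_first_target_pos_xy,
                                      second_source_id: str) -> tuple[float, float]:
    """When the first target and the second target are judged to be the same,
    the deviation between the second mounted object reference target position
    (converted to the own vehicle frame) and the corrected position of the
    first target is stored as the second positioning error."""
    dx = second_mount_ref_target_pos_xy[0] - corrected_first_target_pos_xy[0]
    dy = second_mount_ref_target_pos_xy[1] - corrected_first_target_pos_xy[1]
    error_storage[second_source_id] = (dx, dy)
    return dx, dy
```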
- The driving support ECU 60 uses the positions of targets, including communication device-mounted objects, stored in the position storage unit 209 to provide driving assistance such as vehicle control for avoiding approaching a target and alerting the driver to avoid approaching a target.
- Therefore, even for a target beyond the line of sight of the own vehicle, more accurate driving support can be performed by using its relative position with respect to the own vehicle, which is specified with higher accuracy.
- <Positioning Error Estimation-Related Processing in Communication Device 20 (FIG. 5)>
- Execution of this process means execution of the error estimation method.
- the flowchart of FIG. 5 may be configured to be started, for example, when a switch (hereinafter referred to as a power switch) for starting the internal combustion engine or motor generator of the own vehicle is turned on.
- In step S1, when the road-to-vehicle receiving unit 231 receives target information transmitted from the roadside device 3 through road-to-vehicle communication (YES in S1), the process proceeds to step S2. On the other hand, if target information transmitted by road-to-vehicle communication has not been received (NO in S1), the process proceeds to step S8.
- In step S2, when the vehicle-side receiving unit 222 receives target information transmitted from another vehicle via inter-vehicle communication (YES in S2), the process proceeds to step S3. On the other hand, if target information has not been received through inter-vehicle communication (NO in S2), the process proceeds to step S13.
- In S2, when target information is received from another vehicle within a certain period of time after the target information was received from the roadside device 3 in S1, it may be treated as reception of the target information transmitted from the other vehicle by vehicle-to-vehicle communication.
- the term "within a certain period of time" as used herein may be, for example, a period of time equal to or less than the period of transmission of target object information from the roadside unit 3, and may be set arbitrarily.
- In step S3, the identity determination unit 204 determines whether or not the roadside unit detection target, which is the target detected by the perimeter monitoring sensor 302 of the roadside unit 3 from which the target information was acquired in S1, and the communicating other vehicle, which is the other vehicle that is the transmission source of the target information acquired in S2, are the same.
- In step S4, when it is determined that the roadside unit detection target and the communicating other vehicle are the same (YES in S4), the process proceeds to step S5. On the other hand, if it is determined that they are not the same (NO in S4), the process proceeds to step S8.
- The communicating other vehicle determined to be the same as the roadside unit detection target corresponds to the first communication device-mounted object.
- In step S5, the conversion unit 205 converts the roadside unit reference target position in the target information received in S1 and the positioning position in the target information received in S2 into positions relative to the own vehicle.
- The processing of S5 may be configured to be performed before the processing of S3. In this case, in S3, whether or not the roadside unit detection target and the communicating other vehicle are the same may be determined using the roadside unit reference target position and the positioning position converted in S5.
- In step S6, for the target determined to be the same in S4, the error estimation unit 206 estimates the first positioning error, which is the positioning error of the first communication device-mounted object with respect to the position of the own vehicle, from the deviation between the roadside unit reference target position and the positioning position converted in S5.
- In step S7, the first positioning error estimated in S6 is stored in the error storage unit 207 in association with the transmission source identification information of the first communication device-mounted object for which the first positioning error was estimated.
- In step S8, if the vehicle-side receiving unit 222 receives target information transmitted from another vehicle via vehicle-to-vehicle communication (YES in S8), the process proceeds to step S9. On the other hand, if target information has not been received through inter-vehicle communication (NO in S8), the process proceeds to step S13.
- In step S9, if a positioning error is already stored in the error storage unit 207 for the other vehicle from which the target information was received in S2 or S8 (YES in S9), the process proceeds to step S11. On the other hand, if no positioning error has been stored in the error storage unit 207 (NO in S9), the process proceeds to step S10.
- The positioning error here includes the above-described first positioning error and second positioning error.
- In step S10, a second estimation-related process is performed, and the process proceeds to step S13.
- an example of the flow of the second estimation-related processing will be described using the flowchart of FIG. 6 .
- In step S101, when the vehicle-side receiving unit 222 receives target information transmitted through inter-vehicle communication from another vehicle different from the one whose information was received in S2 or S8 (YES in S101), the process proceeds to step S102. On the other hand, if target information transmitted through inter-vehicle communication from such a different other vehicle has not been received (NO in S101), the process proceeds to step S13.
- In S101, when target information is received from a different other vehicle within a certain period of time after the reception in S2 or S8, it may be treated as reception of target information transmitted from that different other vehicle through inter-vehicle communication.
- This different other vehicle corresponds to the second communication device mounted object, and the mounted object reference target position included in this target information corresponds to the second mounted object reference target position.
- the term “within a certain period of time” as used herein may be a period of time that can be arbitrarily set.
- the identification of the vehicle may be performed by the identity determination unit 204 based on the transmission source identification information included in the target object information.
- step S102 if the position of the first target relative to the vehicle has been corrected by the target position specifying unit 282 using the first positioning error (YES in S102), the process proceeds to step S103. On the other hand, if it has not been specified (NO in S102), the process proceeds to step S13.
- step S103 the identity determination unit 204 determines the first target whose position relative to the own vehicle is specified by the target position specifying unit 282 and the second communication device mounted object that is the transmission source of the target information received and acquired in S101. It is determined whether or not the second target, which is the target detected by the surroundings monitoring sensor 40, is the same.
- step S104 when it determines with a 1st target and a 2nd target being the same (it is YES at S104), it moves to step S105. On the other hand, if it is determined that they are not the same (NO in S104), the process proceeds to step S13.
- step S105 the second mounted object reference target position in the target information received and acquired in S101 is converted into a relative position with respect to the own vehicle.
- the processing of S105 may be configured to be performed before the processing of S103.
- the first target and the second It may be determined whether or not the target is the same.
- In step S106, for the target determined in S104 to be the same, the error estimation unit 206 estimates the second positioning error, which is the error in the positioning by the second communication device mounted object relative to the position of the own vehicle, from the deviation between the second mounted object reference target position converted in S105 and the position of the first target relative to the own vehicle specified by the target position specifying unit 282.
- In step S107, the second positioning error estimated in S106 is linked to the transmission source identification information of the second communication device mounted object for which it was estimated and stored in the error storage unit 207, and the process proceeds to step S13.
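- The flow of S101 to S107 can be summarized by the following Python sketch. All function and variable names are illustrative assumptions (they are not identifiers from this disclosure), positions are simplified 2D tuples, and heading alignment during the conversion is omitted.

```python
# Illustrative sketch of the second estimation-related processing (S101-S107).
# Sign convention assumed here: error = reported position - reference position.

def to_own_vehicle_frame(mounted_ref_pos, sender_pos, own_pos):
    """S105: convert a target position reported relative to the sender into a
    position relative to the own vehicle (translation only, for brevity)."""
    abs_x = sender_pos[0] + mounted_ref_pos[0]
    abs_y = sender_pos[1] + mounted_ref_pos[1]
    return (abs_x - own_pos[0], abs_y - own_pos[1])

def estimate_second_positioning_error(second_ref_pos_converted, first_target_pos):
    """S106: deviation between the converted second mounted object reference
    target position and the already corrected position of the first target."""
    return (second_ref_pos_converted[0] - first_target_pos[0],
            second_ref_pos_converted[1] - first_target_pos[1])

# S107: store the error keyed by the sender's transmission source identification.
error_storage = {}

def store_second_error(sender_id, error):
    error_storage[sender_id] = error

# Example: the commonly detected target as reported by the second vehicle
# deviates slightly from its corrected position relative to the own vehicle.
converted = to_own_vehicle_frame((10.0, 2.0), (30.0, 5.0), (0.0, 0.0))   # (40.0, 7.0)
corrected_first_target = (41.0, 6.5)
store_second_error("VEb", estimate_second_positioning_error(converted, corrected_first_target))
print(error_storage)   # {'VEb': (-1.0, 0.5)}
```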
- In step S11, the position specifying unit 208 specifies the position relative to the own vehicle by correcting at least one of the positioning position and the mounted object reference target position received in S2 or S8 by the amount of the positioning error stored in the error storage unit 207.
- For example, the mounted object position specifying unit 281 specifies the position of the first communication device mounted object relative to the own vehicle by correcting the first positioning position by the first positioning error.
- The target position specifying unit 282 specifies the position of the first target relative to the own vehicle by correcting the first mounted object reference target position by the first positioning error.
- The mounted object position specifying unit 281 specifies the position of the second communication device mounted object relative to the own vehicle by correcting the second positioning position by the second positioning error.
- The target position specifying unit 282 specifies the position of the second target relative to the own vehicle by correcting the second mounted object reference target position by the second positioning error.
- This correction may be performed either after or before the conversion unit 205 converts the mounted object reference target position.
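- A minimal sketch of the correction in S11, assuming the positioning error is stored as a 2D offset per transmission source and that corrected position = received position minus stored error (names and the sign convention are illustrative assumptions):

```python
# Hypothetical illustration of S11: correct received positions by the
# positioning error stored for the transmission source.

error_storage = {"VEc": (1.2, -0.4)}   # first positioning error stored for sender "VEc"

def correct_by_stored_error(sender_id, received_pos, error_storage):
    """Shift a received positioning position or mounted object reference
    target position by the stored positioning error of that sender."""
    err = error_storage.get(sender_id)
    if err is None:
        return received_pos            # no error stored yet (NO in S9)
    return (received_pos[0] - err[0], received_pos[1] - err[1])

# Correcting the first positioning position of the sender itself ...
print(correct_by_stored_error("VEc", (52.0, 8.0), error_storage))   # (50.8, 8.4)
# ... and a mounted object reference target position reported by that sender.
print(correct_by_stored_error("VEc", (60.5, -3.0), error_storage))  # (59.3, -2.6)
```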
- In step S12, the position of the target specified in S11 is stored in the position storage unit 209.
- In step S13, if it is time to end the positioning error estimation related process (YES in S13), the positioning error estimation related process ends. On the other hand, if it is not yet the end timing (NO in S13), the process returns to S1 and is repeated.
- An example of the termination timing of the positioning error estimation-related processing is when the power switch is turned off.
- Since the first positioning position is determined in the first communication device mounted object by positioning using the signals of positioning satellites, it contains a positioning error.
- The roadside unit reference target position is the position of a target detected by the perimeter monitoring sensor 302 of the roadside unit 3, which holds information on the absolute position of its own device with higher accuracy than positioning using the signals of positioning satellites. Since this position is based on the position of the roadside unit 3, its accuracy is higher than that obtained by positioning using signals from positioning satellites.
- Therefore, from the deviation between the roadside unit reference target position obtained from the roadside unit 3 and the first positioning position obtained from the first communication device mounted object, it is possible to more accurately estimate the first positioning error, which is the positioning error of the first communication device mounted object relative to the position of the own vehicle.
- Moreover, this estimation of the first positioning error is possible even when the first communication device mounted object cannot be detected by the perimeter monitoring sensor 40 of the own vehicle.
- By using the first positioning error, the positioning error of the first communication device mounted object can be corrected and its position can be specified with higher accuracy. As a result, the position of a communication device mounted object outside the detection range of the perimeter monitoring sensor 40 of the own vehicle can be identified more accurately.
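- As an illustration of this estimation (assumed sign convention: error = reported position minus roadside-referenced position; both positions expressed in the same coordinate system; all names are hypothetical):

```python
# Sketch of estimating the first positioning error from the deviation between the
# first positioning position reported by the first communication device mounted
# object and the roadside unit reference target position for the same object.

def estimate_first_positioning_error(first_positioning_pos, roadside_ref_pos):
    """Both positions are assumed to be expressed in the same coordinate
    system (e.g. relative to the own vehicle after conversion)."""
    return (first_positioning_pos[0] - roadside_ref_pos[0],
            first_positioning_pos[1] - roadside_ref_pos[1])

reported = (52.0, 8.0)   # first positioning position (satellite based, contains error)
roadside = (50.8, 8.4)   # roadside unit reference target position (higher accuracy)
print(estimate_first_positioning_error(reported, roadside))   # (1.2, -0.4)
```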
- Even if a target is outside the detection range of the surroundings monitoring sensor 40 of the own vehicle, it may be within the detection range of the surroundings monitoring sensor 40 of the first communication device mounted object.
- For such a target, by correcting the first mounted object reference target position acquired from the first communication device mounted object by vehicle-to-vehicle communication by the amount of the first positioning error, the position of the target relative to the own vehicle can be specified more accurately.
- Since the second mounted object reference target position is based on the second positioning position of the second communication device mounted object, it includes the positioning error of the second communication device mounted object.
- On the other hand, the position of the first target relative to the own vehicle, which is specified by the target position specifying unit 282 by correcting the first mounted object reference target position by the first positioning error, has already been corrected so that the positioning error is reduced.
- Therefore, the second positioning error, which is the error in the positioning by the second communication device mounted object relative to the position of the own vehicle, can be estimated from the deviation between the second mounted object reference target position converted into a position relative to the own vehicle and this corrected position of the first target relative to the own vehicle.
- Even if a target is outside the detection ranges of the surroundings monitoring sensors 40 of the own vehicle and of the first communication device mounted object, it may be within the detection range of the surroundings monitoring sensor 40 of the second communication device mounted object. For such a target, by correcting the second mounted object reference target position acquired from the second communication device mounted object by inter-vehicle communication by the second positioning error, its position relative to the own vehicle can be specified with higher accuracy.
- An overview of Embodiment 1 will be described using FIG. 1. LMa, LMb, and LMc in FIG. 1 are targets such as pedestrians. It is assumed that these targets are not communication device mounted objects.
- the target LMa can be detected by the peripheral monitoring sensors 40 of the vehicle VEb and the vehicle VEc.
- the target LMb can be detected by the surroundings monitoring sensor 40 of the vehicle VEb, but cannot be detected by the surroundings monitoring sensor 40 of the vehicle VEa and the vehicle VEc.
- the target LMc can be detected by the surroundings monitoring sensor 40 of the vehicle VEc, but cannot be detected by the surroundings monitoring sensor 40 of the vehicle VEa and the vehicle VEb.
- the vehicle VEb cannot be detected by the surrounding monitoring sensors 40 of the vehicle VEa and the vehicle VEc and the surrounding monitoring sensor 302 of the roadside unit 3 .
- The vehicle VEc cannot be detected by the surroundings monitoring sensors 40 of the vehicle VEa and the vehicle VEb, but can be detected by the surroundings monitoring sensor 302 of the roadside unit 3.
- the vehicle VEa is the own vehicle
- the vehicles VEb and VEc are the other vehicles.
- the vehicle VEc corresponds to the first communication equipment mounted object
- the vehicle VEb corresponds to the second communication equipment mounted equipment.
- the target LMa corresponds to the first target and also to the second target.
- The target LMb corresponds to the second target but does not correspond to the first target.
- The target LMc corresponds to the first target but does not correspond to the second target.
- The positioning error of the vehicle VEc can be estimated from the positioning position of the vehicle VEc obtained from the vehicle VEc through inter-vehicle communication and the roadside unit reference target position of the vehicle VEc, which is detected by the surroundings monitoring sensor 302 of the roadside unit 3 and obtained from the roadside unit 3. This positioning error becomes the first positioning error.
- the positioning error of the mounted object reference target position of the vehicle VEc can be similarly estimated.
- The mounted object reference target position of the target LMc is acquired from the vehicle VEc through inter-vehicle communication. From this target position and the first positioning error of the vehicle VEc, it is possible to accurately identify the position of the target LMc relative to the own vehicle VEa.
- The target LMa, which can be detected in common by the surroundings monitoring sensors 40 of the vehicle VEb and the vehicle VEc, is used to estimate the positioning error in the vehicle VEb. This error becomes the second positioning error.
- The position of the vehicle VEb relative to the own vehicle VEa can then be specified with high accuracy from the position of the target LMa corrected using the estimated first positioning error and the mounted object reference target position of the target LMa obtained from the vehicle VEb by inter-vehicle communication.
- The communication device 20 of the vehicle VEa acquires, through vehicle-to-vehicle communication from the vehicle VEb, the second mounted object reference target position of the target LMb, which cannot be detected by the surroundings monitoring sensor 40 of the own vehicle VEa.
- The position of the target LMb relative to the own vehicle VEa can be specified with high accuracy from this second mounted object reference target position and the second positioning error in the vehicle VEb.
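- The chain of estimations in FIG. 1 can be illustrated with the following end-to-end sketch. All coordinates, offsets and helper names are fabricated for the example and are not taken from this disclosure; a single 2D coordinate system centered on the own vehicle VEa is assumed.

```python
# End-to-end illustration of Embodiment 1. Error = reported - reference;
# corrected = reported - error.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1])

# 1) First positioning error of VEc: its reported satellite position vs. the
#    roadside unit reference target position of VEc obtained from roadside unit 3.
vec_reported  = (80.0, 10.0)
vec_roadside  = (78.5, 10.6)
first_error   = sub(vec_reported, vec_roadside)    # (1.5, -0.6)

# 2) Position of target LMc relative to VEa: correct the mounted object
#    reference target position received from VEc by the first error.
lmc_from_vec  = (95.0, 12.0)
lmc_corrected = sub(lmc_from_vec, first_error)     # (93.5, 12.6)

# 3) Second positioning error of VEb: deviation between the position of the
#    commonly detected target LMa (corrected via VEc) and LMa as reported by VEb.
lma_via_vec   = (60.0, -4.0)
lma_from_veb  = (59.2, -3.1)
second_error  = sub(lma_from_veb, lma_via_vec)     # (-0.8, 0.9)

# 4) Position of target LMb relative to VEa: correct LMb as reported by VEb
#    by the second error.
lmb_from_veb  = (40.0, -15.0)
lmb_corrected = sub(lmb_from_veb, second_error)    # (40.8, -15.9)

print(first_error, lmc_corrected, second_error, lmb_corrected)
```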
- Embodiment 2 An overview of the second embodiment will be described with reference to FIG.
- In the second embodiment, the vehicle unit 2 sequentially transmits CAMs (Cooperative Awareness Messages) and CPMs (Collective Perception Messages).
- the roadside device 3 also sequentially transmits the CPM.
- the communication device 20 provided in the vehicle unit 2 uses the information contained in those messages. Then, similar to the first embodiment, identity determination, error estimation, and the like are performed. CAM and CPM will be described before describing processes such as identity determination and error estimation in the second embodiment.
- FIG. 8 is a diagram showing an exemplary architecture of a V2X communication device.
- a V2X communication device is used instead of the communication device 20 of the first embodiment.
- The V2X communication device also has the configuration shown in FIG. 4, like the communication device 20 of the first embodiment.
- a V2X communication device is a communication device that transmits target information.
- the V2X communication device may perform communication between vehicles, vehicles and infrastructure, vehicles and bicycles, vehicles and mobile terminals, and the like.
- the V2X communication device may correspond to an onboard device of a vehicle or may be included in the onboard device.
- the onboard equipment is sometimes called an OBU (On-Board Unit).
- the communication device may correspond to the infrastructure roadside unit or may be included in the roadside unit.
- a roadside unit is sometimes called an RSU (Road Side Unit).
- the communication device can also be one element that constitutes an ITS (Intelligent Transport System). If it is an element of the ITS, the communication device may correspond to an ITS station (ITS-S) or be included in the ITS-S.
- ITS-S is a device for information exchange, and may be any of OBU, RSU, and mobile terminal, or may be included therein.
- the mobile terminal is, for example, a PDA (Personal Digital Assistant) or a smart phone.
- The communication device may correspond to a WAVE (Wireless Access in Vehicular Environments) device disclosed in IEEE 1609, or may be included in a WAVE device.
- In the following, a V2X communication device mounted on a vehicle is described.
- This V2X communication device has a function of providing CA (Cooperative Awareness) service and CP (Collective Perception) service.
- In the CA service, the V2X communication device transmits CAMs.
- In the CP service, the V2X communication device transmits CPMs. It should be noted that the same or similar methods disclosed below can be applied even if the communication device is an RSU or a mobile terminal.
- the architecture shown in FIG. 8 is based on the ITS-S reference architecture according to EU standards.
- the architecture shown in FIG. 8 comprises an application layer 110, a facility layer 120, a network & transport layer 140, an access layer 130, a management layer 150, and a security layer 160.
- the application layer 110 implements or supports various applications 111 .
- FIG. 8 shows, as examples of the applications 111, a traffic safety application 111a, an efficient traffic information application 111b, and other applications 111c.
- the facility layer 120 supports the execution of various use cases defined in the application layer 110.
- the facility layer 120 can support the same or similar functionality as the top three layers (application layer, presentation layer and session layer) in the OSI reference model.
- Facility means providing functions, information and data.
- Facility layer 120 may provide the functionality of a V2X communication device.
- facility layer 120 may provide the functions of application support 121, information support 122, and communication support 123 shown in FIG.
- the application support 121 has functions that support basic application sets or message sets.
- An example of a message is a V2X message.
- V2X messages can include periodic messages such as CAMs and event messages such as DENMs (Decentralized Environmental Notification Messages).
- Facility layer 120 may also support CPM.
- the information support 122 has the function of providing common data or databases used for the basic application set or message set.
- a database is the Local Dynamic Map (LDM).
- the communication support 123 has functions for providing services for communication and session management.
- Communication support 123 provides, for example, address mode and session support.
- facility layer 120 supports a set of applications or a set of messages. That is, the facility layer 120 generates message sets or messages based on the information that the application layer 110 should send and the services it should provide. Messages generated in this way are sometimes referred to as V2X messages.
- The access layer 130 includes an external IF (InterFace) 131 and an internal IF 132, and can transmit messages/data received from the upper layers via physical channels.
- The access layer 130 may conduct or support data communication according to the following communication technologies.
- the communication technology is, for example, a communication technology based on the IEEE 802.11 and/or 802.11p standard, an ITS-G5 wireless communication technology based on the physical transmission technology of the IEEE 802.11 and/or 802.11p standard, a satellite/broadband wireless mobile communication including 2G/3G/4G (LTE)/5G wireless mobile communication technology, broadband terrestrial digital broadcasting technology such as DVB-T/T2/ATC, GNSS communication technology, and WAVE communication technology.
- the network & transport layer 140 can configure a vehicle communication network between homogeneous/heterogeneous networks using various transport protocols and network protocols.
- the transport layer is the connecting layer between the upper and lower layers. Upper layers include a session layer, a presentation layer, and an application layer 110 .
- the lower layers include the network layer, data link layer, and physical layer.
- the transport layer can manage transmitted data to arrive at its destination correctly. At the source, the transport layer processes the data into appropriately sized packets for efficient data transmission. On the receiving side, the transport layer takes care of restoring the received packets to the original file.
- Transport protocols are, for example, TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and BTP (Basic Transport Protocol).
- the network layer can manage logical addresses.
- the network layer may also determine the routing of packets.
- the network layer may receive packets generated by the transport layer and add the logical address of the destination to the network layer header. Packet transmission routes may be unicast/multicast/broadcast between vehicles, between vehicles and fixed stations, and between fixed stations.
- For the networking protocol, geo-networking, IPv6 networking with mobility support, or IPv6 over geo-networking may be considered.
- the architecture of the V2X communication device may further include a management layer 150 and a security layer 160.
- Management layer 150 manages the communication and interaction of data between layers.
- the management layer 150 comprises a management information base 151 , regulation management 152 , interlayer management 153 , station management 154 and application management 155 .
- Security layer 160 manages security for all layers.
- Security layer 160 comprises firewall and intrusion detection management 161 , authentication, authorization and profile management 162 and security management information base 163 .
- V2X messages may also be referred to as ITS messages.
- V2X messages can be generated at application layer 110 or facility layer 120 . Examples of V2X messages are CAM, DENM, CPM.
- the transport layer in the network & transport layer 140 generates BTP packets.
- a network layer at network & transport layer 140 may encapsulate the BTP packets to produce geo-networking packets.
- Geo-networking packets are encapsulated in LLC (Logical Link Control) packets.
- the data may include message sets.
- a message set is, for example, basic safety messages.
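- A rough sketch of this encapsulation chain is given below. The byte layouts are simplified placeholders rather than the exact ETSI encodings, and the destination port value 2001 is an illustrative assumption; only the nesting order CAM → BTP → geo-networking → LLC follows the description above.

```python
# Simplified illustration of wrapping a facility-layer V2X message (e.g. a CAM)
# on its way down the stack: CAM -> BTP -> GeoNetworking -> LLC.
import struct

def btp_b_encapsulate(payload: bytes, dest_port: int, dest_port_info: int = 0) -> bytes:
    # BTP-B header: destination port + destination port info (16 bits each).
    return struct.pack("!HH", dest_port, dest_port_info) + payload

def geonet_encapsulate(btp_packet: bytes) -> bytes:
    # Placeholder basic + common headers; real GeoNetworking headers carry
    # version, traffic class, position vectors, etc.
    return b"GN_BASIC" + b"GN_COMMON" + btp_packet

def llc_encapsulate(gn_packet: bytes) -> bytes:
    # LLC/SNAP header with Ethertype 0x86DC marking geo-networking data.
    return b"\xaa\xaa\x03" + b"\x00\x00\x00" + struct.pack("!H", 0x86DC) + gn_packet

cam = b"<encoded CAM>"                      # produced by the facility layer
frame = llc_encapsulate(geonet_encapsulate(btp_b_encapsulate(cam, dest_port=2001)))
print(len(frame), frame[:16])
```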
- BTP is a protocol for transmitting V2X messages generated in the facility layer 120 to lower layers.
- the BTP header has A type and B type.
- The A-type BTP header may contain the destination port and source port required for transmission and reception in two-way packet transmission.
- the B-type BTP header can include the destination port and destination port information required for transmission in non-bidirectional packet transmission.
- the destination port identifies the facility entity corresponding to the destination of the data contained in the BTP packet (BTP-PDU).
- a BTP-PDU is unit transmission data in BTP.
- the source port is a field generated for the BTP-A type.
- the source port indicates the port of the facility layer 120 protocol entity at the source of the corresponding packet. This field can have a size of 16 bits.
- Destination port information is a field generated for the BTP-B type. It provides additional information when the destination port is a well-known port. This field can have a size of 16 bits.
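- The two BTP header variants described above can be sketched as follows; the field packing is a plausible reading of the description (two 16-bit fields each), not a normative encoding, and the port value is again an assumed example.

```python
# Sketch of the BTP-A and BTP-B headers described above; each field is 16 bits.
import struct

def build_btp_a_header(destination_port: int, source_port: int) -> bytes:
    # BTP-A: destination port + source port, for two-way packet transmission.
    return struct.pack("!HH", destination_port, source_port)

def build_btp_b_header(destination_port: int, destination_port_info: int) -> bytes:
    # BTP-B: destination port + destination port info, for one-way transmission.
    return struct.pack("!HH", destination_port, destination_port_info)

def parse_btp_b_header(header: bytes):
    destination_port, destination_port_info = struct.unpack("!HH", header[:4])
    return {"destination_port": destination_port,
            "destination_port_info": destination_port_info}

hdr = build_btp_b_header(destination_port=2001, destination_port_info=0)
print(parse_btp_b_header(hdr))   # {'destination_port': 2001, 'destination_port_info': 0}
```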
- a geo-networking packet includes a basic header and a common header according to the network layer protocol, and optionally includes an extension header according to the geo-networking mode.
- An LLC packet is a geo-networking packet with an LLC header added.
- the LLC header provides the ability to differentiate and transmit IP data and geo-networking data.
- IP data and geo-networking data can be distinguished by SNAP (Subnetwork Access Protocol) Ethertype.
- When IP data is transmitted, the Ethertype is set to 0x86DD and included in the LLC header. When geo-networking data is transmitted, the Ethertype is set to 0x86DC and included in the LLC header. The receiver checks the Ethertype field in the LLC packet header and forwards the packet to the IP data path or the geo-networking path for processing, depending on the value of that field.
- the LLC header contains DSAP (Destination Service Access Point) and SSAP (Source Service Access Point).
- SSAP is followed by a control field (Control in FIG. 9), protocol ID, and ethertype.
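- On the receiving side, the Ethertype-based branching can be illustrated like this; the byte offsets assume the simple DSAP/SSAP/Control/Protocol-ID/Ethertype layout of FIG. 9 and should be treated as an assumption.

```python
# Sketch of demultiplexing received LLC packets by the SNAP Ethertype field.
import struct

ETHERTYPE_IPV6 = 0x86DD           # IP data
ETHERTYPE_GEONETWORKING = 0x86DC  # geo-networking data

def dispatch_llc(llc_packet: bytes) -> str:
    # DSAP(1) + SSAP(1) + Control(1) + Protocol ID/OUI(3) precede the Ethertype.
    (ethertype,) = struct.unpack("!H", llc_packet[6:8])
    if ethertype == ETHERTYPE_IPV6:
        return "forward to IP data path"
    if ethertype == ETHERTYPE_GEONETWORKING:
        return "forward to geo-networking path"
    return "drop: unknown Ethertype 0x%04X" % ethertype

packet = b"\xaa\xaa\x03\x00\x00\x00" + struct.pack("!H", 0x86DC) + b"<GN packet>"
print(dispatch_llc(packet))       # forward to geo-networking path
```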
- FIG. 10 shows the logical interfaces between the CA (Cooperative Awareness) service 128 and the other layers in the V2X communication device architecture.
- V2X communication devices may provide various services for traffic safety and efficiency.
- One of the services may be CA service 128 .
- Cooperative awareness in road traffic means that road users and roadside infrastructure can know each other's location, dynamics and attributes.
- Road users refer to all users on and around roads for which traffic safety and control are required, such as automobiles, trucks, motorcycles, cyclists, pedestrians, etc.
- Roadside infrastructure refers to equipment such as road signs, traffic lights, barriers, and entrances.
- Such communication may be V2V (vehicle to vehicle), V2I (vehicle to infrastructure), I2V (infrastructure to vehicle), or X2X, and is performed over wireless networks.
- V2X communication devices can provide situational awareness through their sensors and communication with other V2X communication devices.
- the CA service can specify how the V2X communication device communicates its location, behavior and attributes by sending CAMs.
- the CA service 128 may be an entity of the facility layer 120.
- CA service 128 may be part of the application support domain of facility layer 120 .
- the CA service 128 can apply the CAM protocol and provide two services of CAM transmission and reception.
- CA service 128 may also be referred to as a CAM basic service.
- the source ITS-S configures the CAM.
- the CAM is sent to network & transport layer 140 for transmission.
- the CAM may be sent directly from the source ITS-S to all ITS-S within range.
- the communication range is varied by changing the transmission power of the source ITS-S.
- CAMs are generated periodically at a frequency controlled by the CA service 128 of the originating ITS-S. The frequency of generation is determined by considering the state change of the source ITS-S. Conditions to consider are, for example, changes in the position or velocity of the source ITS-S, loading of the radio channel.
- Upon receiving a CAM, the CA service 128 makes the contents of the CAM available to entities such as the application 111 and/or the LDM 127.
- CA service 128 interfaces with entities of facility layer 120 and application layer 110 to collect relevant information for CAM generation and to further process received CAM data.
- Examples of entities for ITS-S data collection in a vehicle are the VDP (Vehicle Data Provider) 125, the POTI (position and time) unit 126, and the LDM 127.
- the VDP 125 is connected to the vehicle network and provides vehicle status information.
- the POTI unit 126 provides ITS-S position and time information.
- the LDM 127 is a database within ITS-S, as described in ETSI TR 102 863, which can be updated with received CAM data.
- Application 111 can obtain information from LDM 127 and perform further processing.
- the CA service 128 connects with the network & transport layer 140 via NF-SAP (Network & Transport/Facilities Service Access Point) in order to exchange CAMs with other ITS-S.
- the CA service 128 interfaces with the security layer 160 via SF-SAP (Security Facilities Service Access Point) for CAM transmission and CAM reception security services.
- When the CA service 128 directly provides the received CAM data to the application 111, it connects with the application layer 110 via the FA-SAP (Facilities/Applications Service Access Point). It also connects with the management layer 150 via the MF-SAP (Management/Facilities Service Access Point).
- FIG. 11 shows the functional blocks of CA service 128 and interfaces to other functions and layers.
- CA service 128 provides four sub-functions of CAM encoding section 1281 , CAM decoding section 1282 , CAM transmission management section 1283 and CAM reception management section 1284 .
- the CAM encoding unit 1281 constructs and encodes a CAM according to a predefined format.
- CAM decoding section 1282 decodes the received CAM.
- the CAM transmission manager 1283 executes the protocol operation of the source ITS-S to transmit the CAM.
- the CAM transmission management unit 1283 executes, for example, activation and termination of CAM transmission operation, determination of CAM generation frequency, and triggering of CAM generation.
- the CAM reception manager 1284 performs protocol operations specified in the receiving ITS-S to receive the CAM.
- the CAM reception management unit 1284 for example, activates the CAM decoding function when receiving CAM.
- the CAM reception management unit 1284 provides the received CAM data to the application 111 of the ITS-S on the reception side.
- the CAM reception management unit 1284 may perform information check of the received CAM.
- The interface IF.CAM is the interface to the LDM 127 or the application 111.
- the interface to the application layer 110 may be implemented as an API (Application Programming Interface), and data may be exchanged between the CA service 128 and the application 111 via this API.
- the interface to application layer 110 can also be implemented as FA-SAP.
- the CA Service 128 interacts with other entities of the Facility Layer 120 to obtain the data necessary for CAM generation.
- the set of entities that provide data for the CAM are data providing entities or data providing facilities.
- Data is exchanged between the data providing entities and the CA service 128 via the interface IF.FAC.
- The CA service 128 exchanges information with the network & transport layer 140 via the interface IF.N&T. At the source ITS-S, the CA service 128 provides the CAM to the network & transport layer 140 together with protocol control information (PCI). The PCI is control information according to ETSI EN 302 636-5-1. The CAM is embedded in a facility layer service data unit (FL-SDU).
- the network & transport layer 140 can provide the received CAM to the CA service 128.
- The interface between the CA service 128 and the network & transport layer 140 relies on the services of the geo-networking/BTP stack, the IPv6 stack, or the combined IPv6/geo-networking stack.
- CAM may rely on services provided by the GeoNetworking (GN)/BTP stack.
- GN stands for GeoNetworking, and SHB stands for Single Hop Broadcast.
- the PCI passed from CA service 128 to the geo-networking/BTP stack may include BTP type, destination port, destination port information, GN packet transmission type, GN communication profile, GN security profile. This PCI may also include GN traffic class, GN maximum packet lifetime.
- a CAM can use an IPv6 stack or a combined IPv6/geo-networking stack to transmit the CAM, as specified in ETSI TS 102 636-3. If the combined IPv6/geo-networking stack is used for CAM transmission, the interface between the CA service 128 and the combined IPv6/geo-networking stack may be the same as the interface between the CA service 128 and the IPv6 stack.
- the CA service 128 can exchange primitives with management entities of the management layer 150 via MF-SAP. Primitives are elements of information that are exchanged, such as instructions.
- The sender ITS-S obtains the setting information of T_GenCam_DCC from the management entity via the interface IF.Mng.
- The CA service 128 can exchange primitives with the ITS-S security entity via the interface IF.Sec, which corresponds to the SF-SAP.
- Point-to-multipoint communication may be used for CAM transmission.
- A CAM is sent in a single hop from a source ITS-S only to receiving ITS-S located within direct communication range of the source ITS-S. Since it is a single hop, the receiving ITS-S does not forward the received CAM.
- the initiation of the CA service 128 may be different for different types of ITS-S, such as vehicle ITS-S, roadside ITS-S, and personal ITS-S. As long as CA service 128 is active, CAM generation is managed by CA service 128 .
- the CA service 128 is activated at the same time as the ITS-S is activated and terminated when the ITS-S is deactivated.
- the frequency of CAM generation is managed by the CA service 128.
- the CAM generation frequency is the time interval between two consecutive CAM generations.
- the CAM generation interval is set so as not to fall below the minimum value T_GenCamMin.
- the minimum value T_GenCamMin of the CAM generation interval is 100 ms. If the CAM generation interval is 100 ms, the CAM generation frequency will be 10 Hz.
- the CAM generation interval is set so as not to exceed the maximum value T_GenCamMax.
- the maximum value T_GenCamMax of the CAM generation interval is 1000 ms. If the CAM generation interval is 1000 ms, the CAM generation frequency will be 1 Hz.
- the CA service 128 triggers CAM generation according to the dynamics of the source ITS-S and the congestion state of the channel. If the dynamics of the source ITS-S shortens the CAM generation interval, this interval is maintained for consecutive CAMs.
- the CA service 128 repeatedly checks the conditions that trigger CAM generation every T_CheckCamGen. T_CheckCamGen is below T_GenCamMin.
- The parameter T_GenCam_DCC provides the minimum time interval between two consecutive CAM generations, so that CAM generation is reduced according to the channel usage requirements for distributed congestion control (DCC) specified in ETSI TS 102 724. This facilitates adjustment of the CAM generation frequency according to the remaining capacity of the radio channel during channel congestion.
- T_GenCam_DCC is provided by the management entity in milliseconds.
- The range of values for T_GenCam_DCC is T_GenCamMin ≤ T_GenCam_DCC ≤ T_GenCamMax.
- If the management entity provides a value greater than T_GenCamMax, T_GenCam_DCC is set to T_GenCamMax. If the management entity provides a value less than or equal to T_GenCamMin, or if T_GenCam_DCC is not provided, T_GenCam_DCC is set to T_GenCamMin.
- In the case of LTE-V2X, DCC and T_GenCam_DCC do not apply.
- the access layer 130 manages channel congestion control.
- T_GenCam is the currently valid upper limit of the CAM generation interval. The default value of T_GenCam is T_GenCamMax. When a CAM is generated according to condition 1 below, T_GenCam is set to the time elapsed since the last CAM generation. After N_GenCam consecutive CAMs have been generated according to condition 2, T_GenCam is reset to T_GenCamMax.
- N_GenCam can be dynamically adjusted according to some environmental conditions. For example, when approaching an intersection, N_GenCam can be increased to increase the frequency of CAM reception. Note that the default and maximum values of N_GenCam are 3.
- Condition 1 is that the elapsed time since the previous CAM generation is T_GenCam_DCC or more and at least one of the following ITS-S dynamics-related conditions holds.
- The first dynamics-related condition is that the absolute difference between the current bearing of the source ITS-S and the bearing contained in the CAM previously transmitted by the source ITS-S is 4 degrees or more.
- The second is that the distance between the current position of the source ITS-S and the position included in the CAM previously transmitted by the source ITS-S is 4 m or more.
- Condition 2 is that the elapsed time since the previous CAM generation is T_GenCam or more and, in the case of ITS-G5, also T_GenCam_DCC or more.
- When condition 1 or condition 2 is satisfied, the CA service 128 immediately creates a CAM.
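- The trigger rules above (T_GenCamMin/T_GenCamMax, the clamping of T_GenCam_DCC, and conditions 1 and 2) can be condensed into the following sketch. Only the two dynamics checks listed in this disclosure are modeled, the distance check uses a plain Euclidean distance, and all thresholds are taken from the text; everything else is an assumption.

```python
# Sketch of the CAM generation trigger described above. Times are in milliseconds.
T_GEN_CAM_MIN = 100
T_GEN_CAM_MAX = 1000

def clamp_t_gencam_dcc(value_ms):
    """T_GenCamMin <= T_GenCam_DCC <= T_GenCamMax; out-of-range or missing
    values are pulled back to the nearest bound, as described above."""
    if value_ms is None or value_ms <= T_GEN_CAM_MIN:
        return T_GEN_CAM_MIN
    return min(value_ms, T_GEN_CAM_MAX)

def dynamics_changed(heading_deg, last_heading_deg, pos, last_pos):
    """Condition 1 dynamics checks: bearing change of 4 degrees or more,
    or displacement of 4 m or more since the last transmitted CAM."""
    dx, dy = pos[0] - last_pos[0], pos[1] - last_pos[1]
    return abs(heading_deg - last_heading_deg) >= 4.0 or (dx * dx + dy * dy) ** 0.5 >= 4.0

def should_generate_cam(elapsed_ms, t_gencam, t_gencam_dcc, dyn_changed):
    condition1 = elapsed_ms >= t_gencam_dcc and dyn_changed
    condition2 = elapsed_ms >= max(t_gencam, t_gencam_dcc)   # ITS-G5 case
    return condition1 or condition2

t_dcc = clamp_t_gencam_dcc(250)
print(should_generate_cam(300, T_GEN_CAM_MAX, t_dcc,
                          dynamics_changed(92.0, 87.0, (10.0, 0.0), (7.0, 0.0))))  # True
```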
- When the CA service 128 creates a CAM, it creates the mandatory containers.
- The mandatory containers mainly contain highly dynamic information of the source ITS-S. This highly dynamic information is included in the basic container and the high frequency container.
- the CAM can contain optional data.
- Optional data is mainly non-dynamic source ITS-S status and information for specific types of source ITS-S. The status of non-dynamic source ITS-S is contained in the low frequency container, and information for specific types of source ITS-S is contained in the special vehicle container.
- The low frequency container is included in the first CAM generated after the CA service 128 is activated. After that, the low frequency container is included in a CAM if 500 ms or more have elapsed since the generation of the last CAM that included a low frequency container.
- the special vehicle container is also included in the first CAM generated after the CA service 128 is activated. After that, if 500 ms or more have passed since the CAM that last generated the special vehicle container, the special vehicle container is included in the CAM.
- the CAM generation frequency of the roadside ITS-S is also defined by the time interval between two consecutive CAM generations.
- the roadside ITS-S must be configured such that at least one CAM is transmitted while the vehicle is within the coverage of the roadside ITS-S.
- The time interval for CAM generation by the roadside ITS-S must be 1000 ms or longer. With a generation interval of 1000 ms, the maximum CAM generation frequency is 1 Hz.
- the probability that a passing vehicle receives a CAM from an RSU depends on the frequency of CAM occurrence and the time the vehicle is within the communication area. This time depends on the vehicle speed and the transmission power of the RSU.
- each CAM is time-stamped.
- time synchronization is established between different ITS-S.
- the time required for CAM generation is 50 ms or less.
- the time required for CAM generation is the difference between the time CAM generation is started and the time the CAM is delivered to network & transport layer 140 .
- the time stamp shown in the CAM of the vehicle ITS-S corresponds to the time when the reference position of the source ITS-S shown in this CAM was determined.
- the timestamp shown on the roadside ITS-S CAM is the time the CAM was generated.
- Certificates may be used to authenticate messages transmitted between ITS-S.
- a certificate indicates permission for the holder of the certificate to send a particular set of messages. Certificates can also indicate authority over specific data elements within a message.
- authorization authority is indicated by a pair of identifiers, ITS-AID and SSP.
- ITS-AID indicates the overall type of permission granted. For example, there is an ITS-AID that indicates that the sender has the right to send the CAM.
- SSP stands for Service Specific Permissions, and ITS-AID stands for ITS Application Identifier.
- a received signed CAM is accepted by the recipient if the certificate is valid and the CAM matches the certificate's ITS-AID and SSP.
- the CAM is signed using the private key associated with the Authorization Ticket containing an SSP of type BitmapSsp.
- FIG. 12 is a diagram showing the CAM format.
- a CAM may include an ITS protocol data unit (PDU) header and multiple containers.
- the ITS PDU header contains information on the protocol version, message type, and source ITS-S ID.
- The CAM of a vehicle ITS-S includes one basic container and one high frequency container (hereinafter referred to as the HF container), and can further contain one low frequency container (hereinafter referred to as the LF container) and one or more special vehicle containers. Special vehicle containers are sometimes called special containers.
- the basic container contains basic information about the source ITS-S.
- the base container can include station type, station location.
- a station type is a vehicle, a roadside unit (RSU), or the like. If the station type is vehicle, the vehicle type may be included.
- Location may include latitude, longitude, altitude and confidence.
- the vehicle HF container is included in the HF container. If the station type is not vehicle, the HF container can contain other containers that are not vehicle HF containers.
- the vehicle HF container contains dynamic state information that changes in a short time in the source vehicle ITS-S.
- The vehicle HF container may include one or more of the orientation of the vehicle, vehicle speed, traveling direction, vehicle length, vehicle width, vehicle longitudinal acceleration, road curvature, curvature calculation mode, and yaw rate. The traveling direction indicates either forward or backward.
- the curvature calculation mode is a flag indicating whether or not the yaw rate of the vehicle is used to calculate the curvature.
- The vehicle HF container may further include one or more of the following: vehicle longitudinal acceleration control status, lane position, steering wheel angle, lateral acceleration, vertical acceleration, characteristic class, and DSRC toll collection station location.
- The characteristic class is a value that determines the maximum age of the data elements in the CAM.
- the vehicle LF container is included in the LF container. If the station type is not vehicle, the LF container can contain other containers that are not vehicle LF containers.
- a vehicle LF container can include one or more of the vehicle's role, the lighting state of external lights, and the travel trajectory.
- the role of vehicle is a classification when the vehicle is a special vehicle.
- the external lights are the most important external lights of the vehicle.
- a travel trajectory indicates the movement of a vehicle at a certain time or a certain distance in the past.
- the running locus can also be called a path history.
- a travel locus is represented by a list of waypoints of a plurality of points (for example, 23 points).
- a special vehicle container is a container for vehicles ITS-S that have a special role in road traffic such as public transportation.
- a special vehicle container may include any one of a public transportation container, a special transportation container, a hazardous materials container, a road construction container, an ambulance container, an emergency container, or a security vehicle container.
- a public transport container is a container for public transport vehicles such as buses.
- Public transportation containers are used by public transportation vehicles to control boarding conditions, traffic lights, barriers, bollards, and the like.
- The special transportation container is included when the vehicle is a heavy vehicle, an oversized vehicle, or both.
- a dangerous goods container is a container that is included when a vehicle is transporting dangerous goods.
- the dangerous goods container stores information indicating the type of dangerous goods.
- a road construction container is a container that is included when the vehicle is a vehicle that participates in road construction.
- the road construction container stores a code indicating the type of road construction and the cause of the road construction.
- the roadwork container may also include information indicating whether the lane ahead is open or closed.
- An ambulance container is a container that is included when the vehicle is an ambulance vehicle during an ambulance operation.
- the ambulance container shows the status of light bar and siren usage, and emergency priority.
- An emergency container is a container that is included when the vehicle is an emergency vehicle during emergency operations.
- the emergency container shows light bar and siren usage, cause code and emergency priority.
- the safety confirmation vehicle container is a container that is included when the vehicle is a safety confirmation vehicle.
- a safety confirmation vehicle is a vehicle that accompanies a special transportation vehicle or the like.
- the safety confirmation car container shows the use of light bars and sirens, overtaking regulations, and speed limits.
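- The container structure of FIG. 12 can be mirrored by a simple data model such as the following. The field selection is abbreviated and the names are illustrative; this is not the ASN.1 definition of the CAM.

```python
# Abbreviated data model of the CAM structure of FIG. 12:
# ITS PDU header + basic container + HF container (+ optional LF / special containers).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ItsPduHeader:
    protocol_version: int
    message_type: int
    station_id: int            # source ITS-S ID

@dataclass
class BasicContainer:
    station_type: int          # e.g. vehicle, RSU
    latitude: float
    longitude: float
    altitude: float
    confidence: float

@dataclass
class VehicleHFContainer:      # highly dynamic state of the source vehicle ITS-S
    heading_deg: float
    speed_mps: float
    drive_direction: str       # "forward" or "backward"
    vehicle_length_m: float
    vehicle_width_m: float
    longitudinal_accel_mps2: float
    curvature: float
    yaw_rate_dps: float

@dataclass
class VehicleLFContainer:      # low-dynamics, optional
    vehicle_role: int
    exterior_lights: int
    path_history: List[tuple] = field(default_factory=list)

@dataclass
class Cam:
    header: ItsPduHeader
    basic: BasicContainer
    hf: VehicleHFContainer
    lf: Optional[VehicleLFContainer] = None
    special_vehicle: Optional[dict] = None   # e.g. public transport, ambulance, ...

cam = Cam(ItsPduHeader(2, 2, 1234),
          BasicContainer(5, 35.0, 139.0, 20.0, 0.95),
          VehicleHFContainer(90.0, 13.9, "forward", 4.5, 1.8, 0.2, 0.0, 0.1))
print(cam.header.station_id, cam.hf.speed_mps)
```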
- Fig. 13 shows a flowchart of the process of transmitting the CAM.
- the processing shown in FIG. 13 is executed for each CAM generation cycle.
- In step S201, the information that makes up the CAM is acquired.
- step S201 is executed by the detection information acquisition unit 201, for example.
- In step S202, a CAM is generated based on the information acquired in step S201.
- Step S202 can also be executed by the detection information acquisition unit 201 .
- the transmission unit 221 transmits the CAM generated in step S202 to the surroundings of the vehicle.
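- A minimal rendering of the flow of FIG. 13 (acquire information in S201, generate the CAM in S202, then transmit) is sketched below, with hypothetical stand-ins for the detection information acquisition unit 201 and the transmission unit 221.

```python
# Sketch of the per-cycle CAM transmission flow of FIG. 13.
import time

def acquire_cam_information():
    # S201: gather the information that makes up the CAM (stubbed values here).
    return {"station_id": 1234, "lat": 35.0, "lon": 139.0, "heading": 90.0, "speed": 13.9}

def generate_cam(info):
    # S202: build the CAM from the acquired information (encoding omitted).
    return {"header": {"station_id": info["station_id"]},
            "basic": {"lat": info["lat"], "lon": info["lon"]},
            "hf": {"heading": info["heading"], "speed": info["speed"]}}

def transmit(cam):
    # Stand-in for the transmission unit 221 broadcasting to the surroundings.
    print("broadcast CAM from station", cam["header"]["station_id"])

def cam_cycle(generation_interval_s=0.1, cycles=3):
    # One iteration per CAM generation cycle.
    for _ in range(cycles):
        transmit(generate_cam(acquire_cam_information()))
        time.sleep(generation_interval_s)

cam_cycle()
```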
- V2X communication devices can support traffic safety by periodically providing their own location and status to surrounding V2X communication devices.
- the CA service 128 has a limitation that only the information of the corresponding V2X communication device itself can be shared. To overcome this limitation, service development such as CP service 124 is required.
- the CP service 124 may also be an entity of the facility layer 120, as shown in FIG.
- CP service 124 may be part of the application support domain of facility layer 120 .
- the CP service 124 may fundamentally differ from the CA service 128 in that, for example, it cannot receive input data regarding the host V2X communication device from the VDP 125 or the POTI unit 126 .
- CPM transmission includes CPM generation and transmission.
- In the process of CPM generation, the originating V2X communication device generates a CPM, which is then passed to the network & transport layer 140 for transmission.
- The originating V2X communication device may also be referred to as the transmitting V2X communication device, the host V2X communication device, and so on.
- CP service 124 connects with other entities in facility layer 120 and V2X applications in facility layer 120 to collect relevant information for CPM generation and to deliver received CPM content for further processing.
- An entity for data collection may be, for example, a function that provides object detection, such as the object detector of the host V2X communication device.
- CP service 124 may use services provided by protocol entities of network & transport layer 140 to deliver (or transmit) CPMs.
- CP service 124 may interface with network & transport layer 140 through NF-SAP to exchange CPMs with other V2X communication devices.
- NF-SAP is the service access point between network & transport layer 140 and facility layer 120 .
- The CP service 124 may connect with security entities through the SF-SAP, which is the SAP between the security layer 160 and the facility layer 120, to access security services for sending and receiving CPMs.
- CP service 124 may also interface with management entities through MF-SAP, which is the SAP between management layer 150 and facility layer 120 .
- the CP service 124 may connect to the application layer 110 through FA-SAP, which is the SAP between the facility layer 120 and the application layer 110 .
- the CP service 124 can specify how a V2X communication device informs other V2X communication devices about the location, behavior, and attributes of detected surrounding road users and other objects. For example, sending a CPM allows the CP service 124 to share the information contained in the CPM with other V2X communication devices. Note that the CP service 124 may be a function that can be added to all types of target information communication devices that participate in road traffic.
- a CPM is a message exchanged between V2X communication devices via the V2X network.
- CPM can be used to generate collective perceptions of road users and other objects detected and/or recognized by V2X communication devices.
- the detected road users or objects may be, but are not limited to, road users or objects that are not equipped with V2X communication equipment.
- a V2X communication device that shares information via a CAM shares only information about the recognition state of the V2X communication device itself with other V2X communication devices in order to perform cooperative recognition.
- road users and others who are not equipped with V2X communication devices are not part of the system and thus have limited views on situations related to safety and traffic management.
- A system that is equipped with a V2X communication device and can recognize road users and objects that are not equipped with V2X communication devices can notify other V2X communication devices of the presence and status of those road users and objects.
- Because the CP service 124 cooperatively recognizes the presence of road users and objects that are not equipped with V2X communication devices, it can improve the safety and traffic management performance of systems equipped with V2X communication devices.
- CPM delivery may vary depending on the applied communication system. For example, in ITS-G5 networks as defined in ETSI EN 302 663, a CPM may be sent from an originating V2X communication device directly to all V2X communication devices within range. In particular, the originating V2X communication device can control the communication range by changing the transmission power according to the area concerned.
- CPMs may be generated periodically with a frequency controlled by the CP service 124 at the originating V2X communication device.
- the frequency of generation may be determined taking into account the radio channel load determined by Distributed Congestion Control.
- the generation frequency is also determined taking into account the state of the detected non-V2X objects, e.g. dynamic behavior of position, velocity or orientation, and the transmission of CPMs for the same perceived object by other V2X communication devices.
- the CP service 124 makes the contents of the CPM available to functions within the receiving V2X communication device, such as the V2X application and/or the LDM 127 .
- LDM 127 may be updated with received CPM data.
- V2X applications may retrieve this information from LDM 127 for additional processing.
- FIG. 15 is a functional block diagram of the CP service 124 in this embodiment. More specifically, FIG. 15 illustrates the functional blocks of the CP service 124 and functional blocks with interfaces for other functions and layers in this embodiment.
- the CP service 124 can provide the following sub-functions for CPM transmission/reception.
- CPM encoder 1241 constructs or generates CPM according to a predefined format. The latest in-vehicle data may be included in the CPM.
- the CPM decoding unit 1242 decodes the received CPM.
- the CPM transmission management unit 1243 executes the protocol operation of the source V2X communication device. The operations performed by the CPM transmission manager 1243 may include activation and termination of CPM transmission operations, determination of CPM generation frequency, and triggering of CPM generation.
- the CPM reception manager 1244 can perform protocol operations for the receiving V2X communication device. Specifically, it can include triggering the CPM decoding function in CPM reception, providing the received CPM data to the LDM 127 or the V2X application of the receiving side V2X communication device, checking the information of the received CPM, and the like.
- Point-to-multipoint communication may be used for CPM delivery.
- In the case of ITS-G5, a control channel may be used for CPM delivery.
- CPM generation may be triggered and managed by CP service 124 while CP service 124 is running.
- the CP service 124 may be launched upon activation of the V2X communication device and may be terminated when the V2X communication device is terminated.
- a host V2X communication device may send a CPM whenever at least one object with sufficient confidence to exchange with a nearby V2X communication device is detected.
- The CP service should consider the trade-off between object lifetime and channel utilization. For example, from the point of view of an application that uses information received in CPMs, updated information should be provided as often as possible. However, from the point of view of the ITS-G5 stack, a low transmission frequency is required in order to minimize channel utilization. The V2X communication device should therefore consider this trade-off and appropriately decide which detected objects and object information to include in the CPM. Also, in order to reduce the message size, objects should be evaluated before being sent.
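- One plausible way to realize the trade-off described above is sketched below; the confidence threshold and the ageing rule are assumptions made for illustration, not values from this disclosure.

```python
# Sketch of selecting which detected objects to include in the next CPM,
# balancing freshness of object data against channel utilization.

def select_objects_for_cpm(objects, now_ms, min_confidence=0.7, max_silence_ms=1000):
    """Include an object if its detection confidence is sufficient and either
    it was never reported or its last report is older than max_silence_ms."""
    selected = []
    for obj in objects:
        if obj["confidence"] < min_confidence:
            continue                       # not reliable enough to share
        last = obj.get("last_reported_ms")
        if last is None or now_ms - last >= max_silence_ms:
            selected.append(obj)
    return selected

tracked = [
    {"id": 1, "confidence": 0.9, "last_reported_ms": 400},
    {"id": 2, "confidence": 0.95, "last_reported_ms": None},
    {"id": 3, "confidence": 0.5, "last_reported_ms": None},
]
print([o["id"] for o in select_objects_for_cpm(tracked, now_ms=1600)])   # [1, 2]
```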
- FIG. 16 is a diagram showing the structure of the CPM.
- the CPM structure shown in FIG. 16 may be the basic CPM structure.
- CPMs may be messages exchanged between V2X communication devices in a V2X network.
- CPM may also be used to generate collective perceptions of road users and/or other objects detected and/or recognized by V2X communication devices. That is, a CPM may be an ITS message for generating collective awareness of objects detected by V2X communication devices.
- the CPM may include state information and attribute information of road users and objects detected by the source V2X communication device. Its content may vary depending on the type of road user or object detected and the detection capabilities of the originating V2X communication device. For example, if the object is a vehicle, the state information may include at least information about the actual time, location and motion state. Attribute information may include attributes such as dimensions, vehicle type, and role in road traffic.
- the CPM may complement the CAM and work in the same way as the CAM. That is, it may be for enhancing cooperative recognition.
- the CPM may contain externally observable information about detected road users or objects.
- The CP service 124 may include methods to reduce duplication of CPMs sent by different V2X communication devices, by verifying the CPMs sent by other stations.
- the receiving V2X communication device may recognize the presence, type and status of road users or objects detected by the originating V2X communication device.
- the received information may be used by the receiving V2X communication device to support V2X applications to enhance safety, improve traffic efficiency and travel time. For example, by comparing the received information with the detected states of road users or objects, the receiving V2X communication device can estimate the risk of collision with road users or objects. Additionally, the receiving V2X communication device may notify the user via the receiving V2X communication device's Human Machine Interface (HMI) or automatically take corrective action.
- HMI Human Machine Interface
- CPM The basic format of CPM will be explained with reference to FIG.
- The format of this CPM may be expressed in ASN.1 (Abstract Syntax Notation One).
- Data Elements (DE) and Data Frames (DF) not defined in this disclosure may be derived from the Common Data Dictionary specified in ETSI TS 102 894-2.
- a CPM may include an ITS protocol data unit (PDU) header and multiple containers.
- the ITS PDU header is a header that contains information about the protocol version, message type, and ITS ID of the source V2X communication device.
- the ITS PDU header is a common header used in ITS messages and is present at the beginning of the ITS message.
- the ITS PDU header is sometimes called a common header.
- a station data container may include an originating vehicle container or an originating roadside unit container (RSU container).
- a sensor information container is sometimes called a field-of-view container.
- An originating vehicle container may also be referred to as an OVC.
- a field of view container may also be described as an FOC.
- a recognition object container may also be described as a POC.
- the CPM includes the management container as a mandatory container, and the station data container, sensor information container, POC and free space ancillary containers may be optional containers.
- the sensor information container, the perceived object container and the free space attachment container may be multiple containers. Each container will be described below.
- the administrative container provides basic information about the originating ITS-S, regardless of whether it is a vehicle or roadside unit type station.
- the management container may also include station type, reference location, segmentation information, number of recognized objects.
- the station type indicates the type of ITS-S.
- the reference location is the location of the originating ITS-S.
- the segmentation information describes splitting information when splitting a CPM into multiple messages due to message size constraints.
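- The container layout of FIG. 16 can be summarized as a data model like the following. Only the fields named above are shown; everything else is omitted, and the field and class names are illustrative.

```python
# Abbreviated data model of the CPM structure of FIG. 16.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ManagementContainer:          # mandatory
    station_type: int
    reference_latitude: float
    reference_longitude: float
    segmentation_info: Optional[dict] = None
    number_of_perceived_objects: int = 0

@dataclass
class SensorInformation:            # one entry per described sensor (SIC / FOC)
    sensor_id: int
    sensor_type: int
    radius_m: float
    start_angle_deg: float
    end_angle_deg: float

@dataclass
class PerceivedObject:              # POC entry: abstract description of a detected object
    object_id: int
    x_distance_m: float             # relative to the originating station
    y_distance_m: float
    x_speed_mps: float
    y_speed_mps: float
    time_of_measurement_ms: int

@dataclass
class Cpm:
    its_pdu_header: dict            # protocol version, message type, source ITS ID
    management: ManagementContainer
    station_data: Optional[dict] = None        # originating vehicle or RSU container
    sensor_information: List[SensorInformation] = field(default_factory=list)
    perceived_objects: List[PerceivedObject] = field(default_factory=list)
    free_space_attachments: List[dict] = field(default_factory=list)

cpm = Cpm({"version": 2, "message_type": 14, "station_id": 1234},
          ManagementContainer(5, 35.0, 139.0, None, 1),
          perceived_objects=[PerceivedObject(7, 25.0, -3.0, 1.2, 0.0, 120)])
print(len(cpm.perceived_objects))
```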
- Table 1 shown in FIG. 17 is an example of OVC in the station data container of CPM.
- Table 1 shows the data elements (DE) and/or data frames (DF) included in an example OVC.
- the station data container becomes an OVC when the ITS-S that is the source is a vehicle. If the originating ITS-S is an RSU, it becomes an Originating RSU Container.
- the originating RSU container contains the ID for the road or intersection on which the RSU is located.
- DE is a data type that contains single data.
- A DF is a data type that includes one or more DEs and/or one or more DFs in a predefined order.
- the DE/DF may be used to construct facility layer messages or application layer messages.
- facility layer messages are CAM, CPM, DENM.
- the OVC contains basic information related to the V2X communication device that emits the CPM.
- OVC can be interpreted as a scaled down version of CAM.
- the OVC may include only the DE required for coordinate conversion processing. That is, OVC is similar to CAM, but provides basic information about the originating V2X communication device. The information contained in the OVC is focused on supporting the coordinate transformation process.
- OVC can provide the following. That is, the OVC can provide the latest geographic location of the originating V2X communication device obtained by the CP service 124 at the time of CPM generation. OVC can also provide the absolute lateral and longitudinal velocity components of the originating V2X communication device. The OVC can provide the geometric dimensions of the originating V2X communication device.
- the generated differential time shown in Table 1 indicates, as DE, the time corresponding to the time of the reference position in CPM.
- the generation difference time can be regarded as the generation time of the CPM.
- the generated differential time may be referred to as generated time.
- the reference position indicates the geographical position of the V2X communication device as DF.
- a reference position indicates the location of a geographical point.
- the reference position includes information regarding latitude, longitude, position confidence and/or altitude.
- Latitude represents the latitude of the geographical point
- Longitude represents the longitude of the geographical point.
- the position confidence represents the accuracy of the geographic location
- the altitude represents the altitude and altitude accuracy of the geographic point.
- the orientation indicates the orientation in the coordinate system as DF.
- Heading includes heading value and/or heading confidence information.
- the bearing value indicates the heading relative to north, and the bearing confidence indicates a preset level of confidence in the reported bearing value.
- Longitudinal velocity, as a DF, can describe the longitudinal velocity of a moving body (e.g., a vehicle) and the accuracy of the velocity information.
- Longitudinal velocity includes velocity value and/or velocity accuracy information.
- The velocity value represents the velocity value in the longitudinal direction.
- The velocity accuracy represents the accuracy of the velocity value.
- Lateral velocity as a DF, can describe the lateral velocity and the accuracy of velocity information for a moving body (eg, vehicle). Lateral velocity includes information about velocity values and/or velocity accuracy.
- the velocity value represents the velocity value in the lateral direction
- the velocity accuracy represents the accuracy of the velocity value.
- the vehicle length can describe the vehicle length and accuracy index as DF.
- the vehicle length includes information about vehicle length values and/or vehicle length accuracy indicators.
- the vehicle length represents the length of the vehicle, and the vehicle length accuracy index represents the reliability of the vehicle length.
- the vehicle width indicates the width of the vehicle as DE.
- vehicle width can represent the width of the vehicle including the side mirrors. If the width of the vehicle is 6.1 m or more, it is set to 61, and if the information cannot be obtained, it is set to 62.
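- A minimal sketch of that width encoding rule follows; the 0.1 m step size is an assumption for illustration, and the exact value ranges are defined in the referenced ETSI common data dictionary.

```python
from typing import Optional

def encode_vehicle_width(width_m: Optional[float]) -> int:
    """Encode the vehicle width DE: 61 for widths of 6.1 m or more,
    62 when the information cannot be obtained, otherwise the width
    in assumed 0.1 m steps."""
    if width_m is None:
        return 62              # information cannot be obtained
    if width_m >= 6.1:
        return 61              # 6.1 m or more
    return max(1, round(width_m * 10))
```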
- Each DE/DF shown in Table 1, except for the generation difference time, can be referenced in ETSI TS 102 894-2, shown in the right column of Table 1.
- ETSI TS 102 894-2 defines the CDD (common data dictionary). See ETSI EN 302 637-2 for the generation difference time.
- the OVC may contain information on the vehicle direction angle, vehicle traveling direction, longitudinal acceleration, lateral acceleration, vertical acceleration, yaw rate, pitch angle, roll angle, vehicle height and trailer data.
- Table 2 is shown in FIG. Table 2 is an example of SIC (or FOC) in CPM.
- the SIC provides a description of at least one sensor mounted on the originating V2X communication device. If the V2X communication device is equipped with multiple sensors, multiple descriptions may be added. For example, the SIC provides information about the sensor capabilities of the originating V2X communication device. To do so, general sensor characteristics providing the originating V2X communication device's sensor mounting location, sensor type, sensor range and opening angle (i.e., sensor frustum) may be included as part of the message. These pieces of information may be used by the receiving V2X communication device to select an appropriate prediction model according to sensor performance.
- the sensor ID indicates a sensor-specific ID for specifying the sensor that detected the object.
- the sensor ID is a random number generated when the V2X communication device starts up and does not change until the V2X communication device is terminated.
- Sensor type indicates the type of sensor.
- the sensor types are listed below.
- the sensor types are undefined (0), radar (1), lidar (2), mono-video (3), stereo vision (4), night vision (5), ultrasonic (6), pmd (7), fusion (8), induction loop (9), spherical camera (10), and a set of these (11).
- pmd is a photo mixing device.
- a spherical camera is also called a 360-degree camera.
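- Expressed as a simple enumeration (a sketch mirroring the codes listed above; the identifier names are illustrative):

```python
from enum import IntEnum

class SensorType(IntEnum):
    """Sensor type codes as listed in the SIC description above."""
    UNDEFINED = 0
    RADAR = 1
    LIDAR = 2
    MONO_VIDEO = 3
    STEREO_VISION = 4
    NIGHT_VISION = 5
    ULTRASONIC = 6
    PMD = 7                # photo mixing device
    FUSION = 8
    INDUCTION_LOOP = 9
    SPHERICAL_CAMERA = 10  # 360-degree camera
    SENSOR_SET = 11        # a set of the above sensor types
```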
- the X position indicates the mounting position of the sensor in the negative X direction, and the Y position indicates the mounting position of the sensor in the Y direction.
- These mounting positions are measurements from a reference position, which can be referred to EN 302 637-2 of ETSI.
- the radius indicates the average recognition range of the sensor as defined by the manufacturer.
- the start angle indicates the start angle of the sensor's frustum, and the end angle indicates the end angle of the sensor's frustum.
- a quality class represents a classification of the sensor that defines the quality of the measurement object.
- the SIC may contain information regarding the reliability of the detection area and free space.
- Table 3 is shown in FIG. Table 3 is an example of POC in CPM.
- the POC is used to describe the object perceived by the sensor as seen by the transmitting V2X communication device.
- the receiving V2X communication device that receives the POC can, with the help of the OVC, perform coordinate transformation processing to transform the position of the object into the reference coordinate system of the receiving vehicle.
- multiple optional DEs may be provided if the originating V2X communication device can provide them.
- a POC may consist of a selection of DEs to provide an abstract description of the recognized (or detected) object. For example, the relative distance, velocity information and timing information about the perceived object associated with the originating V2X communication device may be included in the POC as mandatory DEs. Additional DEs may also be provided if the sensors of the originating V2X communication device can provide the requested data.
- the measurement time indicates the time in microseconds from the reference time of the message. This defines the relative age of the measured object.
- An object ID is a unique random ID assigned to an object. This ID is retained (ie, not changed) while the object is being tracked, ie, considered in the originating V2X communication device's data fusion process.
- the sensor ID is an ID corresponding to the sensor ID DE in Table 2. This DE may be used to associate the object information with the sensor that made the measurement.
- the longitudinal distance includes a distance value and a distance confidence.
- the distance value indicates the relative X distance to the object in the originating reference frame.
- the distance confidence is a value indicating the confidence of the X distance.
- the lateral distance also includes a distance value and a distance confidence.
- the distance value indicates the relative Y distance to the object in the originating reference frame, and the distance confidence indicates the confidence of that Y distance.
- Longitudinal velocity indicates the longitudinal velocity of the detected object together with its confidence. Lateral velocity indicates the lateral velocity of the detected object together with its confidence. Longitudinal and lateral velocities can refer to the CDD of TS 102 894-2.
- the object orientation indicates the absolute orientation of the object in the reference coordinate system when provided by data fusion processing.
- the object length indicates the measured object length.
- the length confidence indicates the confidence of the length of the measured object.
- Object Width indicates a measurement of the width of the object. Width confidence indicates the reliability of the object width measurement.
- Object type represents the classification of the object as provided in the data fusion process. Classifications of objects may include vehicles, people, animals, and others.
- object confidence, vertical distance, vertical speed, longitudinal acceleration, lateral acceleration, vertical acceleration, object height, the dynamic state of the object, and the matched position (lane ID, longitudinal lane position) may also be included in the POC.
- the free space addition container is a container that indicates information about the free space recognized by the source V2X communication device (that is, free space information).
- a free space is an area that is not considered to be occupied by road users or obstacles, and can also be called an empty space.
- the free space can also be said to be a space in which a mobile object that moves together with the source V2X communication device can move.
- the free space addition container is not a required container, but a container that can be added optionally. If there is a difference between the free space recognized by another V2X communication device, which can be calculated from the CPM received from that other V2X communication device, and the free space recognized by the source V2X communication device, the free space addition container can be added. Also, the free space addition container may be added to the CPM periodically.
- the free space addition container contains information specifying the area of free space.
- Free space can be specified in various shapes.
- the shape of the free space can be represented, for example, by polygons (ie, polygons), circles, ellipses, rectangles, and the like.
- When representing the free space with a polygon, specify the positions of the points that make up the polygon and the order in which the points are connected.
- When representing the free space with an ellipse, specify the position of its center and the lengths of its axes.
- the free space additional container may contain the reliability of the free space. Reliability of free space is expressed numerically. The reliability of free space may also indicate that the reliability is unknown.
- the free space addition container may also contain information about shadow areas. The shadow area indicates the area behind the object as seen from the vehicle or the sensor mounted on the vehicle.
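- As a hedged illustration of how a polygonal free space description with a confidence value and a shadow-area marker could be carried, the sketch below uses hypothetical field names; the actual DF layout follows the CPM specification.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class FreeSpaceArea:
    """Free space described as a polygon: ordered vertices in the
    originating station's reference frame, plus an optional confidence."""
    vertices: List[Tuple[float, float]]   # (x, y) points, connected in the given order
    confidence_pct: Optional[int] = None  # None means the confidence is unknown
    is_shadow_area: bool = False          # True for the region hidden behind an object

# Example: a roughly rectangular free area ahead of the sensor, 90 % confidence.
free_space = FreeSpaceArea(
    vertices=[(0.0, -3.0), (40.0, -3.0), (40.0, 3.0), (0.0, 3.0)],
    confidence_pct=90,
)
```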
- FIG. 20 is a diagram explaining a sensor data extraction method by a V2X communication device that provides the CP service 124. More specifically, FIG. 20(a) illustrates how a V2X communication device extracts sensor data at a low level. FIG. 20(b) illustrates how a V2X communication device extracts sensor data at a high level.
- the source of sensor data transmitted as part of CPM should be selected according to the requirements of the future data fusion process in the receiving V2X communication device.
- the transmitted data should be as close as possible to the original sensor data.
- simply transmitting original sensor data, such as raw data, is not realistic, because it imposes very high demands on the data rate and the transmission cycle.
- Figures 20(a) and 20(b) show possible embodiments for selecting data to be sent as part of the CPM.
- sensor data is obtained from different sensors and processed as part of the low-level data management entity. This entity can select the object data to be inserted as part of the next CPM and also compute the validity of the detected objects.
- the sensor information can be efficiently utilized by the V2X communication device on the receiving side.
- sensor data or object data provided by a data fusion unit specific to the V2X communication device manufacturer is transmitted as part of the CPM.
- If the absolute value of the difference between the current yaw angle of the detected object and the yaw angle included in the CPM previously transmitted by the source V2X communication device exceeds 4 degrees, transmission may be considered.
- If the difference between the current relative distance between the originating V2X communication device and the detected object and the relative distance between them contained in the CPM previously transmitted by the originating V2X communication device exceeds 4 m, transmission may be considered.
- If the absolute value of the difference between the current velocity of the detected object and the velocity of the detected object contained in the CPM previously transmitted by the source V2X communication device exceeds 0.5 m/s, transmission may also be considered, as sketched below.
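- A minimal sketch of these inclusion rules, using the thresholds given above (4 degrees, 4 m, 0.5 m/s); the object record and its fields are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackedObject:
    yaw_deg: float               # current yaw angle of the detected object
    relative_distance_m: float   # relative distance from the originating station
    speed_mps: float             # current speed of the detected object

def should_include_object(current: TrackedObject,
                          last_sent: Optional[TrackedObject]) -> bool:
    """Consider (re)transmitting an object in the next CPM only when it has
    changed enough since it was last reported."""
    if last_sent is None:
        return True   # the object has never been reported
    if abs(current.yaw_deg - last_sent.yaw_deg) > 4.0:
        return True   # yaw angle changed by more than 4 degrees
    if abs(current.relative_distance_m - last_sent.relative_distance_m) > 4.0:
        return True   # relative distance changed by more than 4 m
    if abs(current.speed_mps - last_sent.speed_mps) > 0.5:
        return True   # speed changed by more than 0.5 m/s
    return False
```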
- CAM is a technology in which a vehicle equipped with a V2X module periodically transmits its position and status to other vehicles equipped with a V2X module in the surrounding area to support more stable driving.
- the V2X module is a configuration including a V2X communication device or a V2X communication device.
- CP service 124 is a technology that complements CAM.
- CPS, that is, the CP service, is a technology within ADAS technology that notifies the surroundings, through V2X communication, of sensor data that recognizes the surrounding environment.
- FIG. 21 is a diagram explaining the CP service 124.
- each vehicle TxV1 and RxV2 is equipped with at least one sensor and has sensing ranges SrV1, SrV2 indicated by dashed lines.
- TxV1 has a CPS function.
- TxV1 can recognize vehicles, RV1 to RV11, which are peripheral objects belonging to sensing range SrV1, using a plurality of ADAS sensors mounted on the vehicle. Object information obtained by recognition may be distributed to nearby vehicles equipped with V2X communication devices through V2X communication.
- RxV1, which is not equipped with a sensor, can acquire information on the vehicles around it by receiving the CPM.
- When RxV2, which is equipped with a sensor, receives the CPM from TxV1, information on objects located outside the sensing range SrV2 of RxV2 and objects located in blind spots can also be obtained.
- facility layer 120 can provide CP services 124.
- CP services 124 may run in facility layer 120 and may utilize services that reside in facility layer 120.
- the LDM 127 is a service that provides map information, and may provide map information for the CP service 124.
- the provided map information may include dynamic information in addition to static information.
- the POTI unit 126 performs a service that provides the location and time of the ego vehicle.
- the POTI unit 126 can use the corresponding information to provide the location of the ego vehicle and the exact time.
- the VDP 125 is a service that provides information about the vehicle, and may be used to capture information such as the size of the own vehicle into the CPM and transmit the CPM.
- ADAS vehicles are equipped with various sensors such as cameras, infrared sensors, radar, and lidar for driving support. Each sensor individually recognizes an object. The recognized object information may be collected and fused by the data fusion unit and provided to the ADAS application.
- Regarding the CP service 124, the method of collecting and fusing sensor information in ADAS technology will be described.
- Existing sensors for ADAS and existing sensors for CPS can always track surrounding objects and collect relevant data.
- To collect sensor values for the CP service, two methods can be used.
- each sensor value can be individually provided to surrounding vehicles through the CP basic service.
- the aggregated integrated sensor information may be provided to the CP basic service after the data fusion part.
- CP basic services form part of CP services 124.
- FIG. 22 is a flow chart showing the process of sending the CPM.
- the processing shown in FIG. 22 is executed for each CPM generation cycle.
- In step S301, information forming the CPM is acquired.
- Step S301 is executed by the detection information acquisition unit 201, for example.
- In step S302, the CPM is generated based on the information obtained in step S301.
- Step S302 can also be executed by the detection information acquisition unit 201.
- In the following step, the transmission unit 221 transmits the CPM generated in step S302 to the surroundings of the vehicle.
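- A sketch of this per-cycle flow is shown below; the unit names mirror the reference numerals in the text, while the helper methods and the cycle length are assumptions.

```python
import time

CPM_GENERATION_PERIOD_S = 0.1   # assumed generation cycle

def cpm_transmission_loop(detection_info_acquisition_unit, transmission_unit):
    """One iteration per CPM generation cycle: acquire, generate, transmit."""
    while True:
        # S301: acquire the information that forms the CPM
        info = detection_info_acquisition_unit.acquire()
        # S302: generate the CPM based on the acquired information
        cpm = detection_info_acquisition_unit.generate_cpm(info)
        # transmit the generated CPM to the surroundings of the vehicle
        transmission_unit.transmit(cpm)
        time.sleep(CPM_GENERATION_PERIOD_S)
```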
- the CPM transmitted from the roadside device 3 is used as the target information transmitted from the roadside device 3.
- The CAM or CPM transmitted from another vehicle is used as the target information transmitted from that other vehicle. Since the CPM and CAM are used as target information, the flowchart shown in FIG. 23 executes steps S1A, S2A, S3A, S8A, and S10A instead of steps S1, S2, S3, S8, and S10 in the flowchart of FIG. 5. Note that steps S4, S5, S6, S7, S9, S11, S12, and S13 are common between the flowchart of FIG. 5 and the flowchart of FIG. 23, and thus description thereof will be omitted.
- In step S1A, when the road-vehicle-side receiving unit 231 receives the CPM transmitted from the roadside device 3 through road-to-vehicle communication (YES in S1A), the process proceeds to step S2A. On the other hand, if the road-vehicle-side receiving unit 231 has not received the CPM from the roadside device 3 (NO in S1A), the process proceeds to step S8A.
- In step S2A, if the vehicle-side receiving unit 222 receives the CAM or CPM transmitted from another vehicle via vehicle-to-vehicle communication (YES in S2A), the process proceeds to step S3A. On the other hand, if no CAM or CPM has been received via vehicle-to-vehicle communication (NO in S2A), the process proceeds to step S13. Note that the CAM and CPM may be collectively referred to as messages below.
- In step S3A, the identity determination unit 204 determines whether the target specified by the POC of the CPM determined to have been received in S1A is the same as the target specified by the CAM or CPM determined to have been received in S2A.
- the CPM transmitted by the roadside device 3 includes the latitude and longitude (hereinafter referred to as absolute coordinates) of the roadside device 3 and the distance and direction from the roadside device 3 to the target. Therefore, from the CPM acquired from the roadside device 3, the absolute coordinates of the target detected by the roadside device 3 can be determined. Also, the CPM includes a POC containing various information about the target detected by the roadside device 3.
- the CAM or CPM obtained from other vehicles contains the absolute coordinates of the other vehicles.
- the CAM acquired from other vehicles includes various other vehicle information.
- the CPM acquired from other vehicles also contains information on those other vehicles, although it contains less information on the other vehicle as a transmission source than the CAM does. Therefore, based on the CPM acquired from the roadside device 3 and the CAM or CPM acquired from another vehicle, it can be determined whether or not the target detected by the roadside device 3 is the same as the other vehicle that transmitted the CAM or CPM.
- If the absolute coordinates of the target specified by the CPM acquired from the roadside device 3 and the absolute coordinates of the other vehicle indicated by the CAM or CPM acquired from the other vehicle are approximate, it can be determined that the target specified by the CPM and the other vehicle are the same. Approximation of the coordinates can be determined by whether the distance between the coordinates is less than a preset threshold. Note that relative coordinates based on the own vehicle may be used instead of absolute coordinates.
- Alternatively, if the behaviors are approximate, it can be determined that the target and the other vehicle are the same. The same determination may also be made using both absolute or relative coordinates and behavior.
- Behavior is one or more pieces of information that indicate the movement of an object, such as velocity, acceleration, and angular velocity.
- the CPM does not include the behavior of the other vehicle that is the transmission source.
- the CPM is received a plurality of times from the same other vehicle, and the behavior of the other vehicle is determined from the time change of the absolute coordinates of the other vehicle included in each CPM.
- Since the CAM contains information indicating the behavior of other vehicles, it is preferable to use the CAM as the message when making the same determination using behavior. If the same determination is made using behavior, the determination can be made with high accuracy even if there is a positioning error.
- For the same determination, the type of the object that sent the message can also be used.
- The type can be determined, for example, by the station type included in the message.
- Absolute coordinates of objects may be used for narrowing down the candidates.
- The narrowing-down range is determined by a threshold radius set to a distance larger than the distance difference used for judging sameness.
- The center of the narrowing-down range is the coordinates of the target or the other vehicle. The same determination is made for the target and the other vehicle within this narrowed-down range, as sketched below.
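- A hedged sketch of this same-object determination follows; the distance threshold and the narrowing radius are illustrative values, not values taken from the specification.

```python
import math
from typing import Optional, Tuple

SAME_OBJECT_DIST_M = 2.0    # assumed threshold below which coordinates are "approximate"
NARROWING_RADIUS_M = 10.0   # assumed narrowing-down radius, larger than the threshold

def _distance(p: Tuple[float, float], q: Tuple[float, float]) -> float:
    return math.hypot(p[0] - q[0], p[1] - q[1])

def is_same_object(target_pos: Tuple[float, float],
                   vehicle_pos: Tuple[float, float],
                   target_type: Optional[str] = None,
                   vehicle_type: Optional[str] = None) -> bool:
    """Candidates outside the narrowing-down range are not compared at all;
    within it, approximate coordinates (and, when available, a matching
    station type) lead to a same-object decision."""
    if _distance(target_pos, vehicle_pos) > NARROWING_RADIUS_M:
        return False
    if target_type is not None and vehicle_type is not None and target_type != vehicle_type:
        return False
    return _distance(target_pos, vehicle_pos) < SAME_OBJECT_DIST_M
```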
- In step S8A, the determination content is the same as that of step S2A. If step S8A is YES, the process moves to step S9, and if step S8A is NO, the process moves to step S13. If NO in step S9, the process proceeds to S10A.
- In step S10A, a second estimation-related process is performed, and the process proceeds to step S13.
- An example of the flow of the second estimation-related processing according to the second embodiment will be described using the flowchart of FIG. 24.
- The flowchart shown in FIG. 24 executes steps S101A, S103A, and S106 instead of steps S101, S103, and S106 in the flowchart of FIG. 6. Note that steps S102, S104, S105, and S107 are common between the flowchart of FIG. 6 and the flowchart of FIG. 24, and thus description thereof will be omitted.
- In step S101A, when the vehicle-side receiving unit 222, which is the mounted object information acquisition unit, receives a CAM or CPM from another vehicle different from the other vehicle from which the message was received in S2A or S8A (YES in S101A), the process moves to step S102. If neither a CAM nor a CPM has been received (NO in S101A), the process proceeds to S13 in FIG. 23.
- The identity determination unit 204 executes step S103A.
- In S103A, it is determined whether or not the target specified by the CAM or CPM determined to have been received in S101A is the same as the first target whose position was specified by the target position specifying unit 282.
- One target for the same determination is the first target whose position was determined to have been identified in the previous S102.
- the other target for the same determination is either the second communication device mounted object or the second target.
- When the message is a CAM, the other target of the same determination is the second communication device mounted object.
- Specifically, the second communication device mounted object is the vehicle VEb.
- When the message is a CPM, the other object of the same determination is one or both of the second communication device mounted object and the second target.
- the second target is the target detected by the second communication device mounted object.
- the second target is the target LMa when specifically shown using FIG.
- When the message is a CPM, whether or not the first target and the second target are the same can be determined as follows.
- As the target information of the first target, the POC included in the CPM transmitted by the first communication device mounted object is used, and as the target information of the second target, the POC included in the CPM transmitted by the second communication device mounted object is used. It is determined whether or not the first target and the second target are the same depending on whether or not one or both of the position and behavior of the object indicated by the POCs included in the two CPMs are approximate. Of course, the type of the target may also be taken into consideration. Further, narrowing down by position may be performed.
- When the message is a CAM, it is determined whether the first target and the second communication device mounted object are the same.
- In that case, the POC included in the CPM transmitted by the first communication device mounted object is used as the target information of the first target, and the CAM transmitted by the second communication device mounted object is used as the information of the second communication device mounted object.
- When the message is a CPM, it is also possible to determine whether the first target and the second communication device mounted object are the same.
- As the target information of the first target, the POC included in the CPM transmitted by the first communication device mounted object is used.
- One or both of the management container and the OVC of the CPM transmitted by the second communication device mounted object are used as the information of the second communication device mounted object. Note that the type of the target may be considered and the narrowing down by position may be performed, as in the case of determining whether or not the first target and the second target are the same.
- the error estimation unit 206 estimates the second positioning error.
- the second positioning error is the positional deviation of the two targets that are subject to the same determination in S103A. If the two targets to be determined to be the same in S103A are the first target and the second target, the deviation in the position of the object indicated by the POCs included in the two CPMs is set as the second positioning error.
- By correcting the second positioning position obtained from the second communication device mounted object and the second mounted object reference target position by the second positioning error, the position of the second communication device mounted object and the position of the second target relative to the own vehicle can be specified.
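- A minimal sketch of this estimate-then-correct step, with positions as 2-D vectors in the own vehicle's reference frame (the function names are illustrative):

```python
from typing import Tuple

Vec2 = Tuple[float, float]

def estimate_second_positioning_error(first_target_pos: Vec2,
                                      second_reported_pos: Vec2) -> Vec2:
    """Second positioning error = deviation between the first target's position
    (already corrected via the roadside unit) and the same object's position
    as reported by the second communication device mounted object."""
    return (first_target_pos[0] - second_reported_pos[0],
            first_target_pos[1] - second_reported_pos[1])

def correct_with_error(reported_pos: Vec2, error: Vec2) -> Vec2:
    """Apply the estimated error to a position reported by the second
    communication device mounted object (its own position or a detected target)."""
    return (reported_pos[0] + error[0], reported_pos[1] + error[1])
```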
- One or both of the CPM and CAM transmitted by the vehicle unit 2 and the CPM transmitted by the roadside device 3 are used when estimating the positioning error of the object mounted with the first communication device. Since a system that transmits and receives the CAM and CPM can be used, it is easy to construct a system that accurately identifies the position of a target such as the object mounted with the first communication device.
- the object mounted with the communication device may be a mobile object other than a vehicle.
- Mobile objects other than vehicles include, for example, drones.
- a unit provided with a function excluding the function specific to the vehicle among the vehicle units 2 may be mounted on a moving object such as a drone.
- In addition, when the communication device 20 can detect the roadside device 3 with the perimeter monitoring sensor 40 of the own vehicle, the communication device 20 may estimate the positioning error of the own vehicle from the detected position of the roadside device 3 based on the positioning position of the own vehicle and the information on the absolute position of the roadside device 3 acquired from the roadside device 3 through road-to-vehicle communication.
- The detected position of the roadside device 3 based on the positioning position of the own vehicle may be obtained by converting the position of the roadside device 3 relative to the position of the own vehicle recognized by the perimeter monitoring ECU 50 into latitude and longitude coordinates, using the coordinates of the positioning position of the own vehicle.
- Since the detected position of the roadside device 3 based on the positioning position of the own vehicle is based on the positioning position of the own vehicle, it contains the positioning error of the own vehicle. Therefore, the positioning error of the own vehicle can be estimated from the deviation between the detected position of the roadside device 3 based on the positioning position of the own vehicle and the absolute position of the roadside device 3.
- the estimated positioning error of the own vehicle may be used for correction when specifying the position of the own vehicle.
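- A sketch of this own-vehicle error estimate and correction, with positions as 2-D coordinates (conversion to and from latitude/longitude is omitted; the names are illustrative):

```python
from typing import Tuple

Vec2 = Tuple[float, float]

def estimate_own_positioning_error(rsu_detected_pos: Vec2,
                                   rsu_absolute_pos: Vec2) -> Vec2:
    """Own-vehicle positioning error = deviation between where the roadside unit
    appears when computed from the own vehicle's positioning position and the
    roadside unit's known absolute position."""
    return (rsu_absolute_pos[0] - rsu_detected_pos[0],
            rsu_absolute_pos[1] - rsu_detected_pos[1])

def corrected_own_position(measured_pos: Vec2, error: Vec2) -> Vec2:
    """Use the estimated error as a correction when specifying the own vehicle's position."""
    return (measured_pos[0] + error[0], measured_pos[1] + error[1])
```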
- Although the configuration in which the communication device 20 estimates the positioning error has been described in the above-described embodiments, the configuration is not necessarily limited to this.
- the functions other than the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203 may be performed by the perimeter monitoring ECU 50.
- In this case, the perimeter monitoring ECU 50 may include a functional block for acquiring the target information received by the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203.
- This functional block corresponds to the mounted object information acquisition unit and the roadside unit information acquisition unit.
- the perimeter monitoring ECU 50 corresponds to the vehicle device.
- Alternatively, the functions of the communication device 20 may be shared between the communication device 20 and the perimeter monitoring ECU 50.
- In this case, a unit including the communication device 20 and the perimeter monitoring ECU 50 corresponds to the vehicle device.
- the first positioning error is estimated after the roadside unit reference target position and the first positioning position are converted into relative positions with respect to the own vehicle by the conversion unit 205 (S5, S6).
- the first positioning error may be estimated using the coordinates of the roadside machine reference target position and the first positioning position as absolute positions.
- In particular, since the first positioning position is in absolute coordinates, when the roadside unit 3 converts the position of the detected target into absolute coordinates and transmits it instead of the roadside unit reference target position, the conversion unit 205 is unnecessary.
- In the above embodiments, the roadside unit reference target position, which is the position of the target detected by the roadside unit 3, is used as the reference position.
- However, the reference position is not limited to the position of the target detected by the roadside unit 3.
- the vehicle unit 2 mounted on the vehicle VEc can receive the CPM transmitted by the roadside unit 3.
- This CPM also includes the position of the vehicle VEc.
- the vehicle unit 2 mounted on the vehicle VEc can correct its own positioning error based on the position of the vehicle VEc included in the CPM. After correcting the positioning error, it can be considered that the vehicle unit 2 has almost no positioning error for a certain period of time. Therefore, after the positioning error is corrected, the position of the target detected by the error-corrected vehicle unit 2 may be used as the reference position for a certain period of time.
- the vehicle unit 2 may indicate whether or not the position of the target transmitted by itself can be used as the reference position, for example, by means of a flag included in the message.
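- A sketch of such a validity window and flag follows; the duration is an assumption, and any mechanism that limits how long a corrected unit is trusted as a reference source would serve the same purpose.

```python
import time
from typing import Optional

REFERENCE_VALID_S = 10.0   # assumed period during which the corrected unit is trusted

class ReferenceEligibility:
    """Tracks whether this vehicle unit may flag its transmitted target
    positions as usable reference positions."""

    def __init__(self) -> None:
        self._corrected_at: Optional[float] = None

    def mark_error_corrected(self) -> None:
        """Call when the unit has just corrected its positioning error."""
        self._corrected_at = time.monotonic()

    def can_serve_as_reference(self) -> bool:
        """True only for a limited time after the last error correction."""
        if self._corrected_at is None:
            return False
        return (time.monotonic() - self._corrected_at) < REFERENCE_VALID_S
```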
- The controllers and techniques described in this disclosure may also be implemented by a special purpose computer comprising a processor programmed to perform one or more functions embodied by a computer program.
- the apparatus and techniques described in this disclosure may be implemented by dedicated hardware logic circuitry.
- the apparatus and techniques described in this disclosure may be implemented by one or more special purpose computers configured by a combination of a processor executing a computer program and one or more hardware logic circuits.
- the computer program may also be stored as computer-executable instructions on a computer-readable non-transitional tangible recording medium.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
(Embodiment 1)
<Schematic Configuration of Vehicle System 1>
Hereinafter, Embodiment 1 of the present disclosure will be described with reference to the drawings. As shown in FIG. 1, the vehicle system 1 includes a vehicle unit 2 used in each of a plurality of vehicles and a roadside unit 3. In the example shown in FIG. 1, a vehicle VEa, a vehicle VEb, and a vehicle VEc are described as examples of the vehicles. The vehicle VEa, the vehicle VEb, and the vehicle VEc are assumed to be automobiles.
<Schematic Configuration of Roadside Unit 3>
Next, an example of the schematic configuration of the roadside unit 3 will be described with reference to FIG. 2. As shown in FIG. 2, the roadside unit 3 includes a control device 300, a communication device 301, a perimeter monitoring sensor 302, and a position storage unit 303.
<Schematic Configuration of Vehicle Unit 2>
Next, an example of the schematic configuration of the vehicle unit 2 will be described with reference to FIG. 3. As shown in FIG. 3, the vehicle unit 2 includes a communication device 20, a locator 30, a perimeter monitoring sensor 40, a perimeter monitoring ECU 50, and a driving support ECU 60. The communication device 20, the locator 30, the perimeter monitoring ECU 50, and the driving support ECU 60 may be configured to be connected to an in-vehicle LAN (see LAN in FIG. 3).
<Schematic Configuration of Communication Device 20>
Next, an example of the schematic configuration of the communication device 20 will be described with reference to FIG. 4. As shown in FIG. 4, the communication device 20 includes, as functional blocks, a detection information acquisition unit 201, a vehicle-to-vehicle communication unit 202, a road-to-vehicle communication unit 203, an identity determination unit 204, a conversion unit 205, an error estimation unit 206, an error storage unit 207, a position specifying unit 208, and a position storage unit 209. Some or all of the functions executed by the communication device 20 may be configured as hardware by one or more ICs or the like. Some or all of the functional blocks included in the communication device 20 may be realized by a combination of software executed by a processor and hardware members.
<Positioning Error Estimation Related Processing in Communication Device 20>
Here, an example of the flow of the processing related to the estimation of the mounted object positioning error in the communication device 20 (hereinafter referred to as positioning error estimation related processing) will be described using the flowchart of FIG. 5. Execution of this processing means that the error estimation method is carried out. The flowchart of FIG. 5 may be configured to be started, for example, when a switch (hereinafter referred to as a power switch) for starting the internal combustion engine or motor generator of the own vehicle is turned on.
<Summary of Embodiment 1>
According to the configuration of Embodiment 1, since the first positioning position is determined by positioning using the signals of positioning satellites at the first communication device mounted object, it contains the error of that positioning. On the other hand, the roadside unit reference target position is the position, with the position of the roadside unit 3 as a reference, of a target detected by the perimeter monitoring sensor 302 of the roadside unit 3, which has information on the absolute position of its own device with higher positional accuracy than is obtained by positioning using signals from positioning satellites; its positional accuracy is therefore higher than that obtained by such satellite positioning. Accordingly, for a roadside-unit-detected target and a first communication device mounted object that are determined to be the same, the first positioning error, which is the error of the positioning at the first communication device mounted object with the position of the own vehicle as a reference, can be estimated more accurately from the deviation between the roadside unit reference target position acquired from the roadside unit 3 and the first positioning position acquired from the first communication device mounted object. Since both the roadside unit reference target position and the first positioning position are acquired via wireless communication, the first positioning error can be estimated even when the first communication device mounted object cannot be detected by the perimeter monitoring sensor 40 of the own vehicle. Furthermore, by using the first positioning error, the positioning error at the first communication device mounted object can be corrected, and the position of the first communication device mounted object can be specified more accurately. As a result, the position of a communication device mounted object outside the detection range of the perimeter monitoring sensor 40 of the own vehicle can be specified more accurately.
(Embodiment 2)
An overview of Embodiment 2 will be described with reference to FIG. 7. In Embodiment 2, the vehicle unit 2 sequentially transmits two messages, a CAM (Cooperative Awareness Message) and a CPM (Cooperative Perception Message). The roadside unit 3 also sequentially transmits CPMs.
(Embodiment 3)
In Embodiments 1 and 2, the case where the communication device mounted object is a vehicle has been described as an example, but the configuration is not necessarily limited to this. For example, the communication device mounted object may be a mobile object other than a vehicle. Mobile objects other than vehicles include, for example, drones. In this case, a unit having the functions of the vehicle unit 2 excluding the functions specific to a vehicle may be mounted on a mobile object such as a drone.
(Embodiment 4)
In addition, when the communication device 20 can detect the roadside unit 3 with the perimeter monitoring sensor 40 of the own vehicle, the communication device 20 may estimate the positioning error of the own vehicle from the detected position of the roadside unit 3 based on the positioning position of the own vehicle and the information on the absolute position of the roadside unit 3 acquired from the roadside unit 3 through road-to-vehicle communication. The detected position of the roadside unit 3 based on the positioning position of the own vehicle may be obtained by converting the position of the roadside unit 3 relative to the position of the own vehicle recognized by the perimeter monitoring ECU 50 into latitude and longitude coordinates using the coordinates of the positioning position of the own vehicle. Since the detected position of the roadside unit 3 based on the positioning position of the own vehicle is based on the positioning position of the own vehicle, it contains the positioning error of the own vehicle. Therefore, the positioning error of the own vehicle can be estimated from the deviation between the detected position of the roadside unit 3 based on the positioning position of the own vehicle and the absolute position of the roadside unit 3. The estimated positioning error of the own vehicle may be used for correction when specifying the position of the own vehicle.
(Embodiment 5)
Although the configuration in which the communication device 20 estimates the positioning error has been described in the above-described embodiments, the configuration is not necessarily limited to this. For example, among the functions of the communication device 20, the functions other than the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203 may be performed by the perimeter monitoring ECU 50. In this case, the perimeter monitoring ECU 50 may include a functional block for acquiring the target information received by the vehicle-to-vehicle communication unit 202 and the road-to-vehicle communication unit 203. This functional block corresponds to the mounted object information acquisition unit and the roadside unit information acquisition unit. In this case, the perimeter monitoring ECU 50 corresponds to the vehicle device. Alternatively, the functions of the communication device 20 may be shared between the communication device 20 and the perimeter monitoring ECU 50. In this case, a unit including the communication device 20 and the perimeter monitoring ECU 50 corresponds to the vehicle device.
(Embodiment 6)
In the above-described embodiments, the first positioning error is estimated after the roadside unit reference target position and the first positioning position are converted into relative positions with respect to the own vehicle by the conversion unit 205 (S5, S6). However, the first positioning error may be estimated using the coordinates of the roadside unit reference target position and the first positioning position as absolute positions. In particular, since the first positioning position is in absolute coordinates, when the roadside unit 3 converts the position of the detected target into absolute coordinates and transmits it instead of the roadside unit reference target position, the conversion unit 205 is unnecessary.
(Embodiment 7)
In the embodiments described above, the roadside unit reference target position, which is the position of the target detected by the roadside unit 3, is used as the reference position. However, the reference position is not limited to the position of the target detected by the roadside unit 3.
(Technical feature 1)
A vehicle device that can be used in a vehicle, comprising:
a roadside unit information acquisition unit (231) that acquires, via wireless communication, a roadside unit reference target position, which is the position, with the position of a roadside unit as a reference, of a target detected by a perimeter monitoring sensor, the roadside unit reference target position being transmitted from the roadside unit that has the perimeter monitoring sensor, has information on the absolute position of its own device with higher positional accuracy than is obtained by positioning using signals from positioning satellites, and is equipped with a communication device capable of wireless communication with the vehicle;
a mounted object information acquisition unit (222) that acquires, via wireless communication, a first positioning position, which is the position of a first communication device mounted object obtained by positioning using signals from positioning satellites, the first positioning position being transmitted from the first communication device mounted object, which is other than the roadside unit, is capable of the positioning, and is equipped with a communication device capable of wireless communication with the vehicle;
an identity determination unit (204) that determines whether or not the target detected by the perimeter monitoring sensor of the roadside unit from which the roadside unit information acquisition unit acquired the roadside unit reference target position and the first communication device mounted object that is the transmission source of the first positioning position acquired by the mounted object information acquisition unit are the same;
a conversion unit (205) that converts the roadside unit reference target position acquired by the mounted object information acquisition unit and the first positioning position acquired by the mounted object information acquisition unit into relative positions with respect to the vehicle; and
an error estimation unit (206) that, when the identity determination unit determines that the target and the first communication device mounted object are the same, estimates a first positioning error, which is the error of the positioning at the first communication device mounted object with the position of the vehicle as a reference, from the deviation between the roadside unit reference target position and the first positioning position converted by the conversion unit.
(Technical feature 2)
The vehicle device according to technical feature 1, comprising
a mounted object position specifying unit (281) that, after the first positioning error is once estimated by the error estimation unit, specifies the position of the first communication device mounted object with respect to the vehicle by correcting the first positioning position acquired from the first communication device mounted object by the mounted object information acquisition unit by the first positioning error, even when the first communication device mounted object can no longer be detected by the perimeter monitoring sensor of the roadside unit.
(Technical feature 3)
The vehicle device according to technical feature 1 or 2, wherein
the mounted object information acquisition unit also acquires, via wireless communication, a first mounted object reference target position, which is the position, with the first positioning position as a reference, of a first target that is a target detected by a perimeter monitoring sensor of the first communication device mounted object, which also has the perimeter monitoring sensor, the first mounted object reference target position being transmitted from the first communication device mounted object, and
the vehicle device comprises a target position specifying unit (282) that, after the first positioning error is once estimated by the error estimation unit, specifies the position of the first target with respect to the vehicle by correcting the first mounted object reference target position acquired from the first communication device mounted object by the mounted object information acquisition unit by the first positioning error.
(Technical feature 4)
The vehicle device according to technical feature 3, wherein
the mounted object information acquisition unit also acquires, via wireless communication, a second positioning position, which is the position of a second communication device mounted object obtained by positioning using signals from positioning satellites, and a second mounted object reference target position, which is the position, with the position of the second communication device mounted object obtained by the positioning as a reference, of a second target that is a target detected by a perimeter monitoring sensor of the second communication device mounted object, the second positioning position and the second mounted object reference target position being transmitted from the second communication device mounted object, which is other than the roadside unit and the first communication device mounted object, has the perimeter monitoring sensor, is capable of the positioning, and is equipped with a communication device capable of wireless communication with the vehicle,
the identity determination unit determines whether or not the first target whose position with respect to the vehicle was specified by the target position specifying unit and the second target detected by the perimeter monitoring sensor of the second communication device mounted object that is the transmission source of the second mounted object reference target position acquired by the mounted object information acquisition unit are the same,
the conversion unit converts the second mounted object reference target position acquired by the mounted object information acquisition unit into a relative position with respect to the vehicle, and
the error estimation unit, when the identity determination unit determines that the first target and the second target are the same, estimates a second positioning error, which is the error of the positioning at the second communication device mounted object with the position of the vehicle as a reference, from the deviation between the second mounted object reference target position converted by the conversion unit and the position of the first target with respect to the vehicle specified by the target position specifying unit.
(Technical feature 5)
The vehicle device according to technical feature 4, comprising
a mounted object position specifying unit (281) that, after the second positioning error is once estimated by the error estimation unit, specifies the position of the second communication device mounted object with respect to the vehicle by correcting the second positioning position acquired from the second communication device mounted object by the mounted object information acquisition unit by the second positioning error.
(Technical feature 6)
The vehicle device according to technical feature 4 or 5, wherein
the target position specifying unit, after the second positioning error is once estimated by the error estimation unit, specifies the position of the second target with respect to the vehicle by correcting the second mounted object reference target position acquired from the second communication device mounted object by the mounted object information acquisition unit by the second positioning error.
(Technical feature 7)
An error estimation method that can be used in a vehicle and is executed by at least one processor, the method comprising:
a roadside unit information acquisition step of acquiring, via wireless communication, a roadside unit reference target position, which is the position, with the position of a roadside unit as a reference, of a target detected by a perimeter monitoring sensor, the roadside unit reference target position being transmitted from the roadside unit that has the perimeter monitoring sensor, has information on the absolute position of its own device with higher positional accuracy than is obtained by positioning using signals from positioning satellites, and is equipped with a communication device capable of wireless communication with the vehicle;
a mounted object information acquisition step of acquiring, via wireless communication, a first positioning position, which is the position of a first communication device mounted object obtained by positioning using signals from positioning satellites, the first positioning position being transmitted from the first communication device mounted object, which is other than the roadside unit, is capable of the positioning, and is equipped with a communication device capable of wireless communication with the vehicle;
an identity determination step of determining whether or not the target detected by the perimeter monitoring sensor of the roadside unit from which the roadside unit reference target position was acquired in the roadside unit information acquisition step and the first communication device mounted object that is the transmission source of the first positioning position acquired in the mounted object information acquisition step are the same;
a conversion step of converting the roadside unit reference target position acquired in the mounted object information acquisition step and the first positioning position acquired in the mounted object information acquisition step into relative positions with respect to the vehicle; and
an error estimation step of, when the target and the first communication device mounted object are determined to be the same in the identity determination step, estimating a first positioning error, which is the error of the positioning at the first communication device mounted object with the position of the vehicle as a reference, from the deviation between the roadside unit reference target position and the first positioning position converted in the conversion step.
Claims (10)
- 車両で用いることが可能な車両用装置であって、
物標の参照位置を検出する参照装置から無線送信される前記参照位置を取得する参照位置取得部(231)と、
第1通信機搭載物から送信され、前記第1通信機搭載物での測位によって求められた前記第1通信機搭載物の位置である第1測位位置を取得する搭載物情報取得部(222)と、
前記参照位置取得部で前記参照位置を取得した前記物標と、前記搭載物情報取得部で前記第1測位位置を取得した前記第1通信機搭載物とが同一か否かを判定する同一判定部(204)と、
前記同一判定部で前記物標と前記第1通信機搭載物とが同一と判定した場合には、前記参照位置と前記第1測位位置とのずれから、前記第1通信機搭載物での測位の誤差である第1測位誤差を推定する誤差推定部(206)とを備える車両用装置。 A vehicle device that can be used in a vehicle, comprising:
a reference position acquisition unit (231) for acquiring the reference position wirelessly transmitted from a reference device for detecting the reference position of the target;
Mounted object information acquisition unit (222) for acquiring a first positioning position, which is the position of the first communication device-mounted object transmitted from the first communication device-mounted object and obtained by positioning by the first communication device-mounted object. When,
Identity determination for determining whether or not the target for which the reference position is obtained by the reference position obtaining unit and the first communication device mounted object for which the first positioning position is obtained by the mounted object information obtaining unit are the same. a unit (204);
When the same determination unit determines that the target and the first communication device mounted object are the same, the positioning by the first communication device mounted object is performed based on the deviation between the reference position and the first positioning position. and an error estimator (206) for estimating a first positioning error that is an error in the vehicle. - 請求項1に記載の車両用装置であって、
前記参照装置が路側機であり、
前記参照位置取得部として、前記路側機が検出した前記物標の位置である路側機基準物標位置を前記参照位置として前記路側機から取得する路側機情報取得部を備える車両用装置。 A vehicle device according to claim 1, comprising:
wherein the reference device is a roadside unit;
A vehicle apparatus comprising, as the reference position acquisition unit, a roadside unit information acquisition unit that acquires a roadside unit reference target position, which is the position of the target detected by the roadside unit, from the roadside unit as the reference position. - 請求項2に記載の車両用装置であって、
前記搭載物情報取得部で取得した前記路側機基準物標位置及び前記搭載物情報取得部で取得した前記第1測位位置を前記車両に対する相対位置に変換する変換部(205)を備え、
前記誤差推定部は、前記変換部で変換した前記路側機基準物標位置と、前記変換部で変換した前記第1測位位置とのずれから、前記車両の位置を基準とした前記第1測位誤差を推定する、車両用装置。 A vehicle device according to claim 2, wherein
A conversion unit (205) for converting the roadside unit reference target position acquired by the mounted object information acquisition unit and the first positioning position acquired by the mounted object information acquisition unit into a relative position with respect to the vehicle,
The error estimating unit calculates the first positioning error with respect to the position of the vehicle from the deviation between the roadside unit reference target position converted by the converting unit and the first positioning position converted by the converting unit. A vehicle device for estimating - 請求項1~3のいずれか1項に記載の車両用装置であって、
前記誤差推定部で前記第1測位誤差を一旦推定した後は、前記第1通信機搭載物が前記参照装置により検出できなくなった場合であっても、前記搭載物情報取得部で前記第1通信機搭載物から取得する前記第1測位位置を、その第1測位誤差の分だけ補正して、前記車両に対するその第1通信機搭載物の位置を特定する搭載物位置特定部(281)を備える車両用装置。 The vehicle device according to any one of claims 1 to 3,
After the error estimating unit once estimates the first positioning error, even if the first communication device mounted object cannot be detected by the reference device, the mounted object information acquiring unit a mounted object position specifying unit (281) for specifying the position of the first communication device mounted object with respect to the vehicle by correcting the first positioning position obtained from the mounted object by the first positioning error. vehicle equipment. - 請求項1~4のいずれか1項に記載の車両用装置であって、
前記搭載物情報取得部は、前記第1通信機搭載物から送信され、前記第1通信機搭載物が備える周辺監視センサで検出した第1物標の位置である第1搭載物基準物標位置も取得し、
前記誤差推定部で前記第1測位誤差を一旦推定した後は、前記搭載物情報取得部で取得する前記第1搭載物基準物標位置をその第1測位誤差の分だけ補正して、前記車両に対する前記第1物標の位置を特定する物標位置特定部(282)を備える車両用装置。 The vehicle device according to any one of claims 1 to 4,
The mounted object information acquiring unit is a first mounted object reference target position which is the position of the first target transmitted from the mounted object of the first communication device and detected by a perimeter monitoring sensor included in the mounted object of the first communication device. also get
After the first positioning error is once estimated by the error estimating unit, the first mounted object reference target position acquired by the mounted object information acquiring unit is corrected by the first positioning error, and the vehicle A vehicle device comprising a target position identifying section (282) that identifies the position of the first target with respect to. - 請求項5に記載の車両用装置であって、
前記搭載物情報取得部は、前記第1通信機搭載物以外の第2通信機搭載物から送信される前記第2通信機搭載物の位置である第2測位位置、及び前記第2通信機搭載物が備える周辺監視センサで検出した第2物標の前記第2通信機搭載物の位置を基準とする位置である第2搭載物基準物標位置も取得し、
前記同一判定部は、前記第1物標と前記第2物標とが同一か否かを判定し、
前記誤差推定部は、前記同一判定部で前記第1物標と前記第2物標とが同一と判定した場合には、前記第2搭載物基準物標位置と、前記物標位置特定部で特定した前記第1物標の位置とのずれから、前記第2通信機搭載物での前記測位の誤差である第2測位誤差を推定する車両用装置。 A vehicle device according to claim 5, wherein
The mounted object information acquisition unit obtains a second positioning position, which is the position of the second communication device mounted object transmitted from a second communication device mounted object other than the first communication device mounted object, and a position of the second communication device mounted object. Acquiring a second mounted object reference target position, which is a position of the second target detected by a perimeter monitoring sensor included in the object, with reference to the position of the second communication device mounted object,
The identity determination unit determines whether the first target and the second target are the same,
When the identity determining unit determines that the first target and the second target are the same, the error estimating unit determines that the second mounted object reference target position and the target position identifying unit A vehicular device for estimating a second positioning error, which is an error in the positioning by the second communication device-mounted object, from the deviation from the position of the identified first target. - 請求項6に記載の車両用装置であって、
- The vehicle device according to claim 6, comprising a mounted object position specifying unit (281) that, after the error estimation unit has once estimated the second positioning error, specifies the position of the second communication device mounted object with respect to the vehicle by correcting the second positioning position acquired from the second communication device mounted object by the mounted object information acquisition unit by the amount of that second positioning error.
- The vehicle device according to claim 6 or 7, wherein, after the error estimation unit has once estimated the second positioning error, the target position specifying unit specifies the position of the second target with respect to the vehicle by correcting the second mounted object reference target position acquired from the second communication device mounted object by the mounted object information acquisition unit by the amount of that second positioning error.
- The vehicle device according to claim 5, wherein the mounted object information acquisition unit also acquires a second positioning position, which is the position of a second communication device mounted object other than the first communication device mounted object and is transmitted from that second communication device mounted object, the identity determination unit determines whether the first target and the second communication device mounted object are the same, and, when the identity determination unit determines that the first target and the second communication device mounted object are the same, the error estimation unit estimates a second positioning error, which is an error in the positioning by the second communication device mounted object, from the deviation between the second positioning position and the position of the first target.
- An error estimation method usable in a vehicle and executed by at least one processor, the method comprising: a reference position acquisition step of acquiring a reference position of a target, the reference position being wirelessly transmitted from a reference device that detects the reference position of the target; a mounted object information acquisition step of acquiring a first positioning position, which is the position of a first communication device mounted object obtained by positioning performed by the first communication device mounted object and transmitted from the first communication device mounted object; an identity determination step of determining whether the target whose reference position is acquired in the reference position acquisition step and the first communication device mounted object whose first positioning position is acquired in the mounted object information acquisition step are the same; and an error estimation step of estimating, when the target and the first communication device mounted object are determined to be the same in the identity determination step, a first positioning error, which is an error in the positioning performed by the first communication device mounted object, from the deviation between the reference position and the first positioning position.
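The method recited in the final claim above reduces to a short flow: acquire a reference position measured by a reference device, acquire the self-reported (first positioning) position of a communication device mounted object, decide whether the two refer to the same object, and, if so, take their deviation as that object's positioning error so that later reports can be corrected by it. The following Python sketch merely illustrates that flow under simplifying assumptions; the class and method names, the 2 m identity gate, and the flat (x, y) coordinates are illustrative and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Position = Tuple[float, float]  # (x, y) in a common vehicle-referenced frame, in metres


@dataclass
class PositioningErrorEstimator:
    """Illustrative only: estimates one object's positioning error and reuses it."""
    match_threshold_m: float = 2.0                        # assumed gate for "same object"
    estimated_error: Optional[Tuple[float, float]] = None

    @staticmethod
    def _offset(a: Position, b: Position) -> Tuple[float, float]:
        return (a[0] - b[0], a[1] - b[1])

    def is_same_object(self, reference_pos: Position, reported_pos: Position) -> bool:
        # Identity determination: the detected target and the reporting object are
        # treated as the same if their positions agree within a simple distance gate.
        dx, dy = self._offset(reference_pos, reported_pos)
        return (dx * dx + dy * dy) ** 0.5 <= self.match_threshold_m

    def estimate_error(self, reference_pos: Position, reported_pos: Position) -> None:
        # Error estimation: the deviation of the self-reported position from the
        # reference position is stored as this object's positioning error.
        if self.is_same_object(reference_pos, reported_pos):
            self.estimated_error = self._offset(reference_pos, reported_pos)

    def correct(self, reported_pos: Position) -> Position:
        # Once an error has been estimated, later positions reported by the object
        # (its own position or positions of targets it detects) are shifted by that
        # error, even if the reference device can no longer see the object.
        if self.estimated_error is None:
            return reported_pos
        ex, ey = self.estimated_error
        return (reported_pos[0] + ex, reported_pos[1] + ey)


if __name__ == "__main__":
    estimator = PositioningErrorEstimator()
    estimator.estimate_error(reference_pos=(10.0, 5.0), reported_pos=(11.2, 4.4))
    print(estimator.correct((20.0, 8.0)))  # approximately (18.8, 8.6)
```

A chained variant of the same idea, corresponding to the dependent claims above, would apply `correct` to the target positions reported by the first object and compare those corrected positions against a second object's reports in order to estimate the second object's positioning error in turn.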
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023529693A JPWO2022264731A1 (en) | 2021-06-14 | 2022-05-16 | |
DE112022003058.5T DE112022003058T5 (en) | 2021-06-14 | 2022-05-16 | VEHICLE DEVICE AND ERROR ESTIMATION METHODS |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021098905 | 2021-06-14 | | |
JP2021-098905 | 2021-06-14 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022264731A1 (en) | 2022-12-22 |
Family
ID=84526190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/020386 WO2022264731A1 (en) | Vehicle device and error estimation method | 2021-06-14 | 2022-05-16 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2022264731A1 (en) |
DE (1) | DE112022003058T5 (en) |
WO (1) | WO2022264731A1 (en) |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7107660B2 (en) | 2017-10-25 | 2022-07-27 | 株式会社Soken | In-vehicle system, target object recognition method and computer program |
JP2021098905A (en) | 2019-12-20 | 2021-07-01 | トヨタ紡織株式会社 | Resin fiber, method for manufacturing fiber board and method for manufacturing molding |
- 2022
- 2022-05-16 DE DE112022003058.5T patent/DE112022003058T5/en active Pending
- 2022-05-16 JP JP2023529693A patent/JPWO2022264731A1/ja active Pending
- 2022-05-16 WO PCT/JP2022/020386 patent/WO2022264731A1/en active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012211843A (en) * | 2011-03-31 | 2012-11-01 | Daihatsu Motor Co Ltd | Position correction device and inter-vehicle communication system |
JP2016143088A (en) * | 2015-01-29 | 2016-08-08 | 住友電気工業株式会社 | Position detection system and on-vehicle information processing apparatus |
JP2020193954A (en) * | 2019-05-30 | 2020-12-03 | 住友電気工業株式会社 | Position correction server, position management device, moving object position management system and method, position information correction method, computer program, onboard device, and vehicle |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024202611A1 (en) * | 2023-03-30 | 2024-10-03 | 日本電気株式会社 | Object detection system, processing system, method for generating teacher data, object detection method, and recording medium |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022264731A1 (en) | 2022-12-22 |
DE112022003058T5 (en) | 2024-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3731547B1 (en) | Device and method for v2x communication | |
US11388565B2 (en) | Apparatus and method for V2X communication | |
US11218851B2 (en) | Device and method for V2X communication | |
EP3462754B1 (en) | Apparatus and method for v2x communication | |
US11968603B2 (en) | Device and method for V2X communication | |
US11776405B2 (en) | Apparatus and method for V2X communication | |
US20200322830A1 (en) | Extra-vehicular communication device, onboard device, onboard communication system, communication control method, and communication control program | |
TW202209906A (en) | Techniques for managing data distribution in a v2x environment | |
US20200326203A1 (en) | Real-world traffic model | |
US20220107382A1 (en) | Device and method for v2x communication | |
Kitazato et al. | Proxy cooperative awareness message: an infrastructure-assisted v2v messaging | |
US20220086609A1 (en) | Cpm message division method using object state sorting | |
US11145195B2 (en) | Service station for an intelligent transportation system | |
WO2019131075A1 (en) | Transmission device, point group data collection system, and computer program | |
CN113099529A (en) | Indoor vehicle navigation method, vehicle-mounted terminal, field terminal server and system | |
WO2022264731A1 (en) | Vehicle device and error estimation method | |
US11900808B2 (en) | Apparatus, method, and computer program for a first vehicle and for estimating a position of a second vehicle at the first vehicle | |
EP4177862A1 (en) | Road space collective perception message within an intelligent transport system | |
WO2022255074A1 (en) | Device for vehicle and error estimation method | |
CN118805390A (en) | Based on relative position of User Equipment (UE) | |
EP3783930B1 (en) | Service station for an intelligent transportation system | |
WO2023171371A1 (en) | Communication device and communication method | |
US20240038060A1 (en) | Communication within an intelligent transport system for signalling hidden objects | |
JP2023176218A (en) | Communication device and communication method | |
Paparone | An open-source, multiprotocol Internet of vehicles platform for the analysis of traffic flows in urban scenarios |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22824725; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2023529693; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 112022003058; Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22824725; Country of ref document: EP; Kind code of ref document: A1 |