WO2018109865A1 - Roadside machine and vehicle-to-road communication system - Google Patents

Roadside machine and vehicle-to-road communication system

Info

Publication number
WO2018109865A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
detection result
unit
detection
road
Prior art date
Application number
PCT/JP2016/087222
Other languages
French (fr)
Japanese (ja)
Inventor
栗田 明
元吉 克幸
平 明徳
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2016/087222
Publication of WO2018109865A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions

Definitions

  • The present invention relates to a roadside device capable of communicating with an in-vehicle device mounted on a vehicle existing in a communication area, and to a road-to-vehicle communication system including the in-vehicle device and the roadside device.
  • Detection means such as cameras are used to detect surrounding objects such as surrounding vehicles, pedestrians, and fallen objects.
  • Patent Document 1 describes a vehicle equipped with a camera that recognizes the environment ahead of the vehicle. This camera is attached to the ceiling in front of the vehicle interior so that an object outside the vehicle can be imaged.
  • The present invention has been made in view of the above, and its object is to obtain a roadside machine capable of improving the detection accuracy of surrounding objects.
  • To that end, the present invention includes a peripheral object detection unit that detects objects existing in the vicinity, a first reception unit that receives a road-to-vehicle radio signal transmitted from an in-vehicle device, and a detection result integration unit that integrates the detection result of the peripheral object detection unit with the detection result, received from the in-vehicle device using the road-to-vehicle radio signal, of objects present in the vicinity of the in-vehicle device.
  • The roadside machine according to the present invention has the effect of improving the accuracy of detection results for surrounding objects.
  • FIG. 1 is a schematic configuration diagram of a road-to-vehicle communication system according to the first embodiment of the present invention.
  • FIG. 2 is a configuration diagram of the roadside machine according to the first embodiment.
  • FIG. 3 is a diagram showing an arrangement example of the roadside machines and vehicles of the first embodiment.
  • FIG. 5 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the first embodiment.
  • FIG. 9 is a flowchart showing a first example of the state parameter integration processing of the first embodiment.
  • FIG. 10 is a flowchart showing a second example of the state parameter integration processing of the first embodiment.
  • FIG. 12 is a configuration diagram of the roadside machine according to the second embodiment.
  • FIG. 13 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the second embodiment.
  • FIG. 15 is a configuration diagram of the roadside machine according to the third embodiment.
  • FIG. 16 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the third embodiment.
  • FIG. 1 is a schematic configuration diagram of a road-vehicle communication system according to a first exemplary embodiment of the present invention.
  • The road-to-vehicle communication system 100 includes a roadside machine 1A, a roadside machine 1B, and an in-vehicle device 2.
  • In the following embodiments and drawings, a plurality of constituent elements having similar functions are given reference numerals in which a different letter is appended to a common numeral.
  • When describing matters common to these constituent elements, only the common numeral is used; when distinguishing the individual constituent elements, the reference numeral with the appended letter is used.
  • For example, the roadside machine 1A and the roadside machine 1B can be referred to collectively as roadside machines 1.
  • In FIG. 1, one in-vehicle device 2 is shown for simplicity, but the road-to-vehicle communication system 100 may include a plurality of in-vehicle devices 2.
  • Similarly, the number of roadside machines 1 is not limited to two; the road-to-vehicle communication system 100 can include any plurality of roadside machines 1.
  • The roadside machine 1 is a communication device fixedly installed at a predetermined location.
  • The roadside machine 1 is installed, for example, near a place where vehicles can pass, such as a roadway or a parking lot.
  • The roadside machine 1 can communicate wirelessly with an in-vehicle device 2 existing in its communication area.
  • The communication area is an area predetermined for each roadside machine 1: the range over which that roadside machine 1 covers communication, that is, the area in which a radio signal transmitted by the roadside machine 1 can be received. The roadside machine 1 can also communicate with other roadside machines 1.
  • Hereinafter, communication between the roadside machine 1 and the in-vehicle device 2 is referred to as road-to-vehicle communication, and a signal transmitted and received using road-to-vehicle communication is referred to as a road-to-vehicle signal.
  • Likewise, communication between roadside machines 1 is referred to as road-to-road communication, and a signal transmitted and received using road-to-road communication is referred to as a road-to-road signal.
  • In the present embodiment, the roadside machine 1 has a function of detecting peripheral objects 4 existing around the roadside machine 1.
  • A peripheral object 4 is any object, including vehicles, fallen objects, and pedestrians.
  • The in-vehicle device 2 is a communication device mounted on a vehicle 3.
  • The in-vehicle device 2 can communicate with the roadside machine 1 when it is within the communication area of the roadside machine 1.
  • In the present embodiment, the in-vehicle device 2 has a function of detecting peripheral objects 4 existing around the vehicle 3 on which it is mounted.
  • The roadside machine 1 and the in-vehicle device 2 each detect peripheral objects 4, and the in-vehicle device 2 notifies the roadside machine 1 of its detection results of peripheral objects 4 using road-to-vehicle communication.
  • The roadside machine 1 integrates the detection results of peripheral objects 4 that it detected itself with the detection results notified from the in-vehicle device 2.
  • The roadside machine 1 may also acquire detection results from other roadside machines 1 using road-to-road communication and further integrate the acquired detection results.
  • The following describes a roadside machine 1 that integrates the detection results collected from the in-vehicle devices 2 and the other roadside machines 1 with the detection results of peripheral objects 4 that it detected itself.
  • FIG. 2 is a configuration diagram of the roadside machine according to the first embodiment.
  • The roadside machine 1 includes a peripheral object detection unit 11 and a first acquisition unit 12.
  • The first acquisition unit 12 includes a first detection accuracy output unit 13 and a first state parameter output unit 14.
  • The roadside machine 1 further includes a circulator 21, a first reception unit 22, and a second acquisition unit 23.
  • The second acquisition unit 23 includes a second detection accuracy output unit 24 and a second state parameter output unit 25.
  • The roadside machine 1 further includes a second reception unit 31 and a third acquisition unit 32.
  • The third acquisition unit 32 includes a third detection accuracy output unit 33 and a third state parameter output unit 34.
  • The roadside machine 1 further includes a detection result integration unit 41, which includes a detection accuracy integration unit 42 and a state parameter integration unit 43.
  • The roadside machine 1 further includes a peripheral object list generation unit 51, a transmission frame generation unit 52, and a first transmission unit 61.
  • The peripheral object detection unit 11 is connected to the peripheral object detection sensor 10.
  • The peripheral object detection unit 11 detects objects existing around the roadside machine 1 using the peripheral object detection sensor 10.
  • The peripheral object detection unit 11 outputs detection information about the detected peripheral objects to the first acquisition unit 12.
  • The peripheral object detection sensor 10 is, for example, an imaging device or a radar.
  • When the peripheral object detection sensor 10 is an imaging device, the peripheral object detection unit 11 detects peripheral objects 4 using techniques such as pattern recognition on the images acquired by the imaging device, and outputs an image including each detected peripheral object 4 to the first acquisition unit 12 as the detection result.
  • When the peripheral object detection sensor 10 is a radar, the peripheral object detection unit 11 detects peripheral objects 4 based on the radar signal reflected back from them after emission, and outputs the radar signal at the time of detection to the first acquisition unit 12 as the detection result.
  • The first acquisition unit 12 processes the detection result output by the peripheral object detection unit 11 and outputs the processed detection result to the detection result integration unit 41.
  • Specifically, the first detection accuracy output unit 13 outputs the detection accuracy of each peripheral object 4 to the detection result integration unit 41 as part of the processed detection result, based on the detection result output by the peripheral object detection unit 11.
  • The detection accuracy varies with the performance of the sensor that detects the peripheral object 4, the state of the sensor, the positional relationship between the target peripheral object 4 and the sensor, and so on. When the sensor is an imaging device, the detection accuracy is considered to drop significantly when direct sunlight strikes its lens, so the first acquisition unit 12 may calculate the detection accuracy value according to the time of day of the detection and the weather at that time.
  • Alternatively, the first acquisition unit 12 can detect the distance between the target peripheral object 4 and the sensor and calculate the detection accuracy according to this distance.
  • The first state parameter output unit 14 obtains state parameters, which are parameters indicating the state of a peripheral object 4, based on the detection result output by the peripheral object detection unit 11, and outputs the obtained state parameters to the detection result integration unit 41 as part of the processed detection result.
  • The state parameters indicate the state of the peripheral object 4 and are, for example, its position, size, moving speed, and attribute.
  • The first state parameter output unit 14 can also hold a history of the detection results output by the peripheral object detection unit 11 and obtain state parameters from that history.
  • For example, the first state parameter output unit 14 can obtain the moving speed of a peripheral object 4 from a plurality of images including that object, based on the distance moved and the time taken for the movement, as sketched below.
  • The attribute is information indicating the type of the peripheral object 4, such as a normal vehicle, a large vehicle, a two-wheeled vehicle, or another object.
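  • As a rough illustration only (the field names are assumptions, not taken from the patent), a processed detection result carrying a detection accuracy and the state parameters above could be modeled as follows, with the moving speed derived from the distance moved and the elapsed time:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class StateParameters:
    """State parameters of a peripheral object 4: position, size, speed, attribute."""
    latitude_deg: float
    longitude_deg: float
    size_m: Tuple[float, float, float]  # (length, width, height) in meters
    speed_kmh: float
    attribute: str  # e.g. "normal vehicle", "large vehicle", "two-wheeled vehicle", "other object"

@dataclass
class DetectionResult:
    """A processed detection result as handed to the detection result integration unit 41."""
    accuracy_percent: float  # detection accuracy
    params: StateParameters

def moving_speed_kmh(distance_m: float, elapsed_s: float) -> float:
    """Moving speed estimated from the distance moved between two images
    and the time taken for the movement."""
    return distance_m / elapsed_s * 3.6
```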
  • The circulator 21 is connected to the road-to-vehicle communication antenna 20 and separates the transmission signal from the reception signal.
  • The first receiving unit 22 receives, via the road-to-vehicle communication antenna 20 and the circulator 21, the road-to-vehicle signal transmitted by the in-vehicle device 2.
  • Using the road-to-vehicle signal, the first receiving unit 22 receives from the in-vehicle device 2 the detection results of peripheral objects 4 existing around the in-vehicle device 2, and outputs the received detection results to the second acquisition unit 23.
  • The second acquisition unit 23 processes the detection results received from the in-vehicle device 2 and outputs the processed detection results to the detection result integration unit 41.
  • Specifically, the second detection accuracy output unit 24 extracts the detection accuracy of each peripheral object 4 included in the detection results received from the in-vehicle device 2 and outputs it to the detection result integration unit 41 as part of the processed detection result.
  • The second state parameter output unit 25 extracts the state parameters of each peripheral object 4 included in the detection results received from the in-vehicle device 2 and outputs them to the detection result integration unit 41 as part of the processed detection result.
  • The second receiving unit 31 is connected to the road-to-road communication antenna 30 and receives road-to-road signals transmitted by other roadside machines 1.
  • Using the road-to-road signals, the second receiving unit 31 receives from the other roadside machines 1 the detection results of peripheral objects 4 existing around them, and outputs the received detection results to the third acquisition unit 32.
  • The third acquisition unit 32 processes the detection results received from the other roadside machines 1 and outputs the processed detection results to the detection result integration unit 41.
  • Specifically, the third detection accuracy output unit 33 extracts the detection accuracy of each peripheral object 4 included in the detection results received from the other roadside machines 1 and outputs it to the detection result integration unit 41.
  • The third state parameter output unit 34 extracts the state parameters of each peripheral object 4 included in the detection results received from the other roadside machines 1 and outputs them to the detection result integration unit 41 as part of the processed detection result.
  • The detection result integration unit 41 integrates the detection results output from the first acquisition unit 12, the second acquisition unit 23, and the third acquisition unit 32, and generates integrated detection results. Specifically, the detection result integration unit 41 extracts, from the plurality of detection results output by the three acquisition units, those that detect the same peripheral object 4, and groups them.
  • The detection accuracy integration unit 42 integrates, for each group, the detection accuracies output from the first detection accuracy output unit 13, the second detection accuracy output unit 24, and the third detection accuracy output unit 33, and generates the integrated detection accuracy of each peripheral object 4.
  • The state parameter integration unit 43 integrates, for each group, the state parameters output from the first state parameter output unit 14, the second state parameter output unit 25, and the third state parameter output unit 34, and generates the integrated state parameters of each peripheral object 4. Specific processing of the detection result integration unit 41 will be described later.
  • The peripheral object list generation unit 51 generates a peripheral object list based on the detection results integrated by the detection result integration unit 41.
  • The peripheral object list contains the integrated detection accuracy and state parameters of each peripheral object.
  • The transmission frame generation unit 52 generates, based on the peripheral object list generated by the peripheral object list generation unit 51, a transmission frame to be transmitted to the in-vehicle devices 2.
  • The transmission frame generation unit 52 outputs the generated transmission frame to the first transmission unit 61.
  • The first transmission unit 61 is connected to the road-to-vehicle communication antenna 20 via the circulator 21 and transmits the transmission frame output from the transmission frame generation unit 52 to the in-vehicle devices 2.
  • FIG. 3 is a diagram showing an arrangement example of roadside units and vehicles.
  • The detection processing of peripheral objects 4 performed by the roadside machine 1 is described below using a specific example based on the situation of FIG. 3.
  • A roadside machine 1A and a roadside machine 1B are installed beside the roadway, and a fallen object 5A lies on the roadway.
  • The vehicle 3A travels in one lane, while the vehicle 3B, the vehicle 3C, and the vehicle 3D travel in the other, oncoming lane.
  • The roadway carries left-hand traffic.
  • The peripheral objects 4 are the vehicle 3A, the vehicle 3B, the vehicle 3C, the vehicle 3D, and the fallen object 5A.
  • FIG. 4 is a diagram showing the states of the peripheral objects shown in FIG. 3.
  • The vehicle 3A is located at 35.50000 degrees north latitude and 139.40000 degrees east longitude, and is a large vehicle with a body size of about 10 m × 6 m × 4 m.
  • The vehicle 3B is located at 35.50000 degrees north latitude and 139.50000 degrees east longitude, and is a normal vehicle with a body size of about 5 m × 3 m × 2 m.
  • The vehicle 3C is located at 35.50000 degrees north latitude and 139.60000 degrees east longitude, and is a normal vehicle with a body size of about 5 m × 3 m × 2 m.
  • The vehicle 3D is located at 35.50000 degrees north latitude and 139.70000 degrees east longitude, and is a normal vehicle with a body size of about 5 m × 3 m × 2 m.
  • The fallen object 5A is located at 35.50000 degrees north latitude and 139.30000 degrees east longitude, and is an object about 2 m × 2 m × 1 m in size.
  • FIG. 5 is a flowchart of the operation by which the roadside machine according to the first embodiment detects peripheral objects.
  • FIG. 5 shows the detection operation performed by the roadside machine 1A shown in FIG. 3.
  • First, the roadside machine 1A acquires the detection results of peripheral objects 4 obtained by its peripheral object detection unit 11 (step S101). The first acquisition unit 12 obtains the detection accuracy and state parameters of each peripheral object 4 from these detection results.
  • Next, the first receiving unit 22 receives from the in-vehicle devices 2 the detection results of peripheral objects 4 existing around them (step S102). When a plurality of in-vehicle devices 2 are present, the second acquisition unit 23 receives the detection results acquired by each of them.
  • The second receiving unit 31 then receives the detection results of peripheral objects 4 from the other roadside machine 1B (step S103).
  • In FIG. 3, the roadside machine 1B is the only other roadside machine 1 shown, but the roadside machine 1A may acquire detection results of peripheral objects 4 from a plurality of other roadside machines 1.
  • FIG. 6 is a diagram showing an example of the detection results detected in the example shown in FIG. 3. FIG. 6 shows the detection results acquired by the roadside machine 1A through the processing of steps S101 to S103.
  • These detection results include the detection results of peripheral objects 4 detected by each of the roadside machine 1A, the roadside machine 1B, the vehicle 3A, the vehicle 3B, and the vehicle 3C.
  • Next, the detection result integration unit 41 of the roadside machine 1A integrates the acquired detection results (step S104). Specifically, the detection result integration unit 41 integrates, for each peripheral object 4, the detection accuracies and state parameters included in the acquired detection results.
  • Examples of the detection accuracy integration method include using the largest detection accuracy value as the integrated detection accuracy, as sketched below, and performing arithmetic processing on a plurality of detection accuracy and state parameter values.
  • The detection accuracy integration method may be selected to match the state parameter integration method described later.
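  • As a minimal sketch of the first of these options (the largest value in a group becomes the integrated detection accuracy):

```python
def integrate_accuracy_max(accuracies_percent: list) -> float:
    """Integrated detection accuracy: the largest accuracy within the group."""
    return max(accuracies_percent)

# Example: integrate_accuracy_max([95.0, 80.0, 15.0]) returns 95.0
```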
  • Next, the peripheral object list generation unit 51 generates a peripheral object list containing the integrated detection results (step S105).
  • FIG. 7 is a diagram illustrating the detection results after integrating the detection results illustrated in FIG. 6.
  • The “object” column indicates which object in FIG. 3 each integrated detection result corresponds to.
  • Each integrated detection result gives a detection accuracy value and the state parameters of the target peripheral object 4.
  • FIG. 8 is a diagram showing the transmission frame transmitted in the first embodiment.
  • The transmission frame 200 illustrated in FIG. 8 includes the detection results of N peripheral objects.
  • Each detection result consists of a post-integration detection accuracy 201 and post-integration state parameters 202.
  • The transmission frame 200 also includes other data 203.
  • The transmission frame generation unit 52 outputs the generated transmission frame 200 to the first transmission unit 61.
  • The first transmission unit 61 transmits the transmission frame 200 generated by the transmission frame generation unit 52 to the in-vehicle devices 2.
  • A vehicle 3 equipped with an in-vehicle device 2 that has received the transmission frame 200 can use the detection results included in the transmission frame 200 to control automated driving or driving assistance.
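  • To make the frame layout of FIG. 8 concrete, the following is a rough sketch; the field names and types are assumptions, since the patent only names the fields 201 to 203:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class IntegratedDetection:
    accuracy_percent: float  # post-integration detection accuracy (201)
    state_params: dict       # post-integration state parameters (202)

@dataclass
class TransmissionFrame200:
    detections: List[IntegratedDetection] = field(default_factory=list)  # N results
    other_data: bytes = b""  # other data (203)
```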
  • The state parameter integration method can be selected from the plurality of integration methods shown below, depending on the type of state parameter and the content of the detection results.
  • FIG. 9 is a flowchart showing a first example of state parameter integration processing.
  • First, the detection result integration unit 41 of the roadside machine 1A extracts, from the detection results acquired by the plurality of detecting entities as shown in FIG. 6, the detection results of the same peripheral object 4, and groups them (step S108).
  • One method by which the detection result integration unit 41 can extract the detection results of the same peripheral object 4 is image analysis.
  • Using the images including the peripheral objects acquired by the peripheral object detection sensor 10, the detection result integration unit 41 can recognize the same peripheral object across a plurality of images.
  • Alternatively, the detection result integration unit 41 can extract the detection results of the same peripheral object 4 from the plurality of detection results based on the position information included in each detection result.
  • In this case, allowing for errors, the detection result integration unit 41 can judge peripheral objects 4 to be the same peripheral object 4 when the difference in their position information is within a predetermined range, as sketched after this list.
  • In the example of FIG. 6, the detection result integration unit 41 can determine from the positions included in the detection results that the 01st, 05th, and 07th detection results belong to the same peripheral object 4, and that the 02nd and 08th detection results belong to the same peripheral object 4. Furthermore, the detection result integration unit 41 can determine that the 03rd, 06th, and 10th detection results belong to the same peripheral object 4, that the 04th, 09th, and 12th detection results belong to the same peripheral object 4, and that the 11th and 13th detection results belong to the same peripheral object 4.
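  • A minimal sketch of this position-based grouping, assuming positions are compared against a simple fixed threshold (the threshold value and helper names are hypothetical):

```python
import math

def same_object(p, q, threshold_deg=1e-4):
    """True when two (latitude, longitude) positions differ by less than the threshold."""
    return math.hypot(p[0] - q[0], p[1] - q[1]) < threshold_deg

def group_by_position(detections):
    """Group detection results whose reported positions fall within the threshold.
    Each detection is a dict with a 'pos' key holding (latitude, longitude)."""
    groups = []
    for det in detections:
        for group in groups:
            if same_object(det["pos"], group[0]["pos"]):
                group.append(det)
                break
        else:
            groups.append([det])
    return groups
```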
  • Next, the detection result integration unit 41 weights and synthesizes the state parameters within each group formed in step S108, using weighting factors corresponding to the detection accuracies of the detection results (step S109). For example, let the detection accuracy of detecting entity i be α_i % and the numerical value of the state parameter from detecting entity i be X_i. The numerical value X_total of the integrated state parameter is then given by the following formula, in which the weighting factor is the detection accuracy α_i:

    X_total = ( Σ_i α_i · X_i ) / ( Σ_i α_i )

  • The state parameter integration process shown in FIG. 9 can be used when the state parameters are represented by numerical values.
  • For example, FIG. 7 shows the values obtained by rounding the weighted values calculated for the sizes of the peripheral objects 4 in the group with detection result IDs (IDentification) 03, 06, and 10 in FIG. 6.
  • The size state parameter is given by numerical values for the length, width, and height of the object.
  • The integrated length value is calculated as 4.12, the integrated width value as 2, and the integrated height value as 1.76.
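  • A minimal sketch of this weighted synthesis; the example accuracies and length values below are hypothetical, since the individual values of FIG. 6 are not reproduced here:

```python
def integrate_weighted(values, accuracies_percent):
    """X_total = sum(alpha_i * X_i) / sum(alpha_i), with the detection
    accuracy alpha_i of each detection result used as the weighting factor."""
    return (sum(a * x for a, x in zip(accuracies_percent, values))
            / sum(accuracies_percent))

# Hypothetical group of three length values and their detection accuracies:
print(round(integrate_weighted([4.3, 4.0, 4.1], [90.0, 70.0, 50.0]), 2))  # 4.15
```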
  • FIG. 10 is a flowchart showing a second example of state parameter integration processing.
  • In the second example, the detection result integration unit 41 likewise extracts the detection results of the same peripheral object 4 and groups them (step S108).
  • Next, the detection result integration unit 41 determines whether the state parameters of two or more detection results in the group match (step S110). When the state parameters of two or more detection results match (step S110: Yes), the detection result integration unit 41 adopts the most frequent state parameter value as the integrated state parameter (step S111). When no two detection results have matching state parameters (step S110: No), the detection result integration unit 41 performs the integration using another criterion (step S112).
  • For example, if the attribute values in a group are two “other object” and one “two-wheeled vehicle”, the integrated attribute value is “other object”, as sketched below.
  • If no two or more state parameters match, this integration method cannot be used, so an integration process based on another criterion is used instead. Also, because the accuracy of the integrated state parameter may decrease when the number of detection results in a group is small, this integration method may be applied only when the number of detection results is equal to or greater than a predetermined number.
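  • A minimal sketch of this majority-vote integration; the minimum group size is a hypothetical parameter:

```python
from collections import Counter

def integrate_majority(values, min_results=3):
    """Adopt the most frequent state parameter value when at least two detection
    results agree; return None when another criterion must be used instead."""
    if len(values) < min_results:
        return None  # too few results in the group; use another method
    value, count = Counter(values).most_common(1)[0]
    return value if count >= 2 else None

# Example: integrate_majority(["other object", "other object", "two-wheeled vehicle"])
# returns "other object"
```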
  • FIG. 11 is a flowchart showing a third example of state parameter integration processing.
  • In the third example, the detection result integration unit 41 likewise extracts the detection results of the same peripheral object 4 and groups them (step S108).
  • Next, the detection result integration unit 41 selects, as the integrated state parameter, the state parameter of the detection result with the highest detection accuracy within the group (step S113).
  • For example, suppose the moving speed values in a group are 0 km/h, 1 km/h, and 20 km/h, and the detection accuracies corresponding to these state parameters are 95%, 80%, and 15%.
  • In this case, 0 km/h, which has the highest detection accuracy, is selected as the integrated moving speed value, as sketched below. This integration method can be used even when the state parameter is not a numerical value.
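  • A minimal sketch of this selection by highest detection accuracy:

```python
def integrate_by_best_accuracy(values, accuracies_percent):
    """Select the state parameter reported with the highest detection accuracy.
    Works for non-numerical state parameters as well."""
    best_index = max(range(len(values)), key=lambda i: accuracies_percent[i])
    return values[best_index]

# Example: integrate_by_best_accuracy([0, 1, 20], [95.0, 80.0, 15.0]) returns 0
```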
  • For each state parameter, a single selected integration process may be used, or a plurality of integration processes may be used in combination. An appropriate integration process may be chosen according to the number and content of the acquired state parameters.
  • As described above, according to the first embodiment, the detection results of peripheral objects 4 detected by the roadside machine 1A and the detection results of peripheral objects 4 detected by the in-vehicle devices 2 can be integrated.
  • Since the amount of information used to obtain a detection result increases, the accuracy of the detection result can be improved.
  • Moreover, since the roadside machine 1 may be able to detect peripheral objects 4 located in the blind spots of a vehicle 3, a further improvement in the accuracy of the detection results can be expected. In particular, at places with poor visibility such as intersections and curves, there is a high possibility that the roadside machine 1 will detect peripheral objects 4 that the sensors mounted on the vehicle 3 cannot.
  • The roadside machine 1A can further integrate the detection results acquired by the other roadside machine 1B. Since roadside machines 1 are usually installed at intervals, the other roadside machine 1B can detect peripheral objects 4 existing in a range different from that of the roadside machine 1A. The integrated detection results therefore cover peripheral objects 4 over a wide area, and each vehicle 3 that acquires them can grasp vehicles 3, obstacles, traffic congestion, accidents, and the like over that wide area.
  • FIG. 12 is a configuration diagram of the roadside machine according to the second embodiment.
  • The roadside machine 1 shown in FIG. 12 includes, in addition to the configuration of the roadside machine 1 according to the first embodiment shown in FIG. 2, a threshold setting unit 71, a congestion degree estimation unit 72, and a threshold generation unit 73.
  • Hereinafter, the differences from the roadside machine 1 according to the first embodiment are mainly described.
  • The threshold setting unit 71 sets, in the transmission frame generation unit 52, a transmission permission determination threshold, which is a threshold the in-vehicle device 2 uses to determine whether a detection result may be transmitted.
  • The in-vehicle device 2 determines whether each detection result may be transmitted based on its detection accuracy.
  • By extracting and transmitting, from its detected results, only the detection results whose detection accuracy exceeds the transmission permission determination threshold, the in-vehicle device 2 can suppress the amount of data transmitted and received by road-to-vehicle communication, as sketched below.
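  • A minimal sketch of the in-vehicle side of this filtering, assuming each detection result carries its accuracy in an 'accuracy' field (a hypothetical representation):

```python
def select_transmittable(detections, threshold_percent):
    """Keep only the detection results whose detection accuracy exceeds
    the transmission permission determination threshold."""
    return [d for d in detections if d["accuracy"] > threshold_percent]
```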
  • The congestion degree estimation unit 72 estimates the degree of congestion of road-to-vehicle communication and outputs the estimate to the threshold generation unit 73. For example, on the assumption that road-to-vehicle traffic increases as the number of surrounding vehicles 3 increases, the congestion degree estimation unit 72 can estimate the degree of congestion based on the number of surrounding vehicles 3. In this case, the congestion degree estimation unit 72 can estimate the degree of congestion from the number of peripheral objects 4, among those detected by the peripheral object detection unit 11, whose attribute state parameter output by the first state parameter output unit 14 indicates a vehicle 3 such as a normal vehicle or a large vehicle.
  • Alternatively, the congestion degree estimation unit 72 may estimate the traffic from the in-vehicle devices 2 to the roadside machine 1 by analyzing the road-to-vehicle signals received by the first receiving unit 22.
  • The threshold generation unit 73 receives the congestion degree estimate output from the congestion degree estimation unit 72, generates a transmission permission determination threshold based on it, and outputs the generated threshold to the threshold setting unit 71.
  • For example, the threshold generation unit 73 raises the transmission permission determination threshold as the degree of congestion increases and lowers it as the degree of congestion decreases.
  • The threshold setting unit 71 thus sets a transmission permission determination threshold that follows the congestion degree estimate, so the communication volume can be adjusted according to the degree of congestion, for example as sketched below.
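  • A minimal sketch of such a threshold generation rule; the linear mapping and the bounds are assumptions, since the patent only requires the threshold to rise with the degree of congestion:

```python
def generate_threshold(congestion_degree, low=50.0, high=95.0):
    """Map an estimated congestion degree in [0, 1] to a transmission permission
    determination threshold: the more congested, the higher the threshold."""
    congestion_degree = min(max(congestion_degree, 0.0), 1.0)
    return low + (high - low) * congestion_degree
```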
  • FIG. 13 is a flowchart showing an operation of detecting a peripheral object of the roadside machine according to the second embodiment.
  • First, the roadside machine 1 performs the detection result integration processing (step S100).
  • The detection result integration processing of step S100 corresponds to steps S101 to S105 shown in FIG. 5.
  • Next, the congestion degree estimation unit 72 estimates the degree of congestion of road-to-vehicle communication (step S201), and the threshold generation unit 73 generates a transmission permission determination threshold based on the estimated degree of congestion (step S202).
  • The transmission frame generation unit 52 generates a transmission frame including the detection results integrated in step S100 and the transmission permission determination threshold (step S203).
  • The transmission frame generation unit 52 outputs the generated transmission frame to the first transmission unit 61.
  • FIG. 14 is a diagram illustrating a transmission frame transmitted in the second embodiment.
  • The transmission frame 210 includes, in addition to the information included in the transmission frame 200 shown in FIG. 8, a transmission permission determination threshold 204.
  • The first transmission unit 61 transmits the generated transmission frame 210 to the in-vehicle devices 2 (step S204).
  • In this way, the in-vehicle devices 2 are notified of the transmission permission determination threshold generated according to the degree of congestion of road-to-vehicle communication.
  • The roadside machine 1 can therefore control, from its own side, the transmission of detection results from the in-vehicle devices 2 to the roadside machine 1.
  • Moreover, since the transmission permission determination threshold is a threshold on detection accuracy, the roadside machine 1 can selectively acquire detection results of high accuracy and high importance even while the traffic is being suppressed.
  • FIG. 15 is a configuration diagram of a roadside machine according to the third embodiment.
  • The roadside machine 1 shown in FIG. 15 includes, in addition to the configuration of the roadside machine 1 according to the first embodiment shown in FIG. 2, a transmission unnecessary list generation unit 53 and a threshold setting unit 71.
  • The transmission unnecessary list generation unit 53 uses the peripheral object list that the peripheral object list generation unit 51 generated from the integrated detection results to generate a road-to-vehicle transmission unnecessary list, which is a list of the peripheral objects 4 whose detection results no longer need to be transmitted between the in-vehicle devices 2 and the roadside machine 1.
  • The transmission unnecessary list generation unit 53 can identify such peripheral objects 4 based on the detection accuracy. For example, the transmission unnecessary list generation unit 53 can generate the road-to-vehicle transmission unnecessary list by extracting from the peripheral object list the peripheral objects whose detection accuracy is equal to or higher than a predetermined value and whose moving speed is 0, as sketched below.
  • The transmission unnecessary list generation unit 53 outputs the generated road-to-vehicle transmission unnecessary list to the transmission frame generation unit 52.
  • The threshold setting unit 71 sets a predetermined transmission permission determination threshold in the transmission frame generation unit 52.
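  • A minimal sketch of this extraction; the accuracy floor and field names are hypothetical:

```python
def build_transmission_unnecessary_list(peripheral_object_list, accuracy_floor=90.0):
    """List the peripheral objects whose detection accuracy is at or above a
    predetermined value and whose moving speed is 0; their detection results
    no longer need to be carried over road-to-vehicle communication."""
    return [obj for obj in peripheral_object_list
            if obj["accuracy"] >= accuracy_floor and obj["speed_kmh"] == 0]
```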
  • FIG. 16 is a flowchart of the operation by which the roadside machine according to the third embodiment detects peripheral objects.
  • First, the detection result integration unit 41 of the roadside machine 1 performs the detection result integration processing (step S100).
  • Next, the threshold setting unit 71 sets the predetermined transmission permission determination threshold in the transmission frame generation unit 52 (step S301).
  • The transmission frame generation unit 52 generates a transmission frame including the integrated detection results of the peripheral objects 4 and the transmission permission determination threshold (step S302).
  • The transmission unnecessary list generation unit 53 generates the road-to-vehicle transmission unnecessary list from the peripheral object list (step S303) and outputs it to the transmission frame generation unit 52.
  • The transmission frame generation unit 52 turns ON the road-to-vehicle transmission unnecessary flag corresponding to each detection result of a peripheral object 4 included in the road-to-vehicle transmission unnecessary list (step S304).
  • The transmission frame generation unit 52 outputs the generated transmission frame to the first transmission unit 61, and the first transmission unit 61 transmits the transmission frame to the in-vehicle devices 2.
  • FIG. 17 is a diagram showing the transmission frame transmitted in the third embodiment.
  • The transmission frame 220 illustrated in FIG. 17 includes, in addition to the information included in the transmission frame 210 illustrated in FIG. 14, a road-to-vehicle transmission unnecessary flag 205 for each peripheral object 4.
  • The road-to-vehicle transmission unnecessary flag corresponding to each peripheral object 4 judged not to require road-to-vehicle transmission is turned ON.
  • An in-vehicle device 2 that has received the transmission frame 220 can exclude from its transmission targets the detection results of the peripheral objects 4 whose road-to-vehicle transmission unnecessary flag is ON, as sketched below.
  • As a result, detection information for stationary objects that have already been detected with sufficient accuracy is not transmitted from the in-vehicle devices 2 to the roadside machine 1, so the detection performance for peripheral objects 4 is maintained while the communication volume from the in-vehicle devices 2 to the roadside machine 1 is suppressed.
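  • A minimal sketch of the in-vehicle side of this suppression, assuming flagged objects can be matched to the device's own detections by an identifier (a hypothetical matching scheme):

```python
def detections_to_send(own_detections, unnecessary_flags, threshold_percent):
    """Drop detections whose road-to-vehicle transmission unnecessary flag is ON,
    then apply the transmission permission determination threshold."""
    kept = [d for d in own_detections
            if not unnecessary_flags.get(d["object_id"], False)]
    return [d for d in kept if d["accuracy"] > threshold_percent]
```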
  • FIG. 18 is a diagram illustrating a hardware configuration of the roadside machine according to the first to third embodiments.
  • Each function of the roadside machine 1 can be realized by using the processor 81 and the memory 82.
  • The processor 81 and the memory 82 are connected by a system bus 83.
  • The processor 81 realizes each function of the roadside machine 1 by reading and executing the computer programs stored in the memory 82.
  • The memory 82 stores the computer programs executed by the processor 81 and the information used during their execution.
  • Specifically, the third acquisition unit 32, the detection result integration unit 41, the peripheral object list generation unit 51, the transmission frame generation unit 52, the transmission unnecessary list generation unit 53, the congestion degree estimation unit 72, and the threshold generation unit 73 are realized by the processor 81 reading out and executing the corresponding operation programs stored in the memory 82.
  • The threshold setting unit 71 is realized by the memory 82.
  • In FIG. 18, one processor 81 and one memory 82 are shown, but the present invention is not limited to this example; the functions of the roadside machine 1 may be realized by a plurality of processors 81 and a plurality of memories 82 operating in cooperation.
  • The configurations described in the above embodiments show examples of the contents of the present invention; they can be combined with other known techniques, and parts of them can be omitted or changed without departing from the gist of the present invention.
  • FIG. 2 shows one peripheral object detection sensor 10 for simplicity, but the present invention is not limited to this example.
  • The roadside machine 1 may include a plurality of peripheral object detection sensors 10.
  • The plurality of peripheral object detection sensors 10 may be sensors of the same type or sensors of several different types.
  • In FIG. 2, the road-to-vehicle communication antenna 20 and the road-to-road communication antenna 30 are separate antennas, but when the two kinds of communication use close frequency bands, a common antenna can be used for road-to-vehicle and road-to-road communication.
  • Although the above description assumes that road-to-road communication is wireless, the present invention is not limited to this example.
  • Road-to-road communication may be performed via a wired network such as an optical fiber.
  • In the above description, a peripheral object 4 is an object likely to move over time, such as a vehicle 3, a fallen object 5, or a pedestrian, but a peripheral object 4 may also be a fixed object. For example, realizing an automated driving system requires detailed static map information of roads, and because this map information must be detailed and accurate, it is desirable to use information on peripheral objects 4 collected from vehicles 3 actually traveling on the roads.
  • The present invention may be used to detect such information on peripheral objects for use as map information.

Abstract

The present invention is characterized by including: a peripheral object detection unit (11) which detects an object present in the surroundings; a first receiving unit (22) which receives a vehicle-to-road signal transmitted from an on-board machine; and a detection result integration unit (41) which integrates a detection result of an object detected by the peripheral object detection unit (11), and a detection result, of an object present in the surroundings of the on-board machine, received from the on-board machine by using the vehicle-to-road signal.

Description

Roadside machine and road-to-vehicle communication system
The present invention relates to a roadside device capable of communicating with an in-vehicle device mounted on a vehicle existing in a communication area, and to a road-to-vehicle communication system including the in-vehicle device and the roadside device.
Systems that automate or assist the driving of vehicles are being developed. In such systems, it is important to accurately grasp the current state around the vehicle. For this reason, detection means such as cameras are used to detect surrounding objects such as surrounding vehicles, pedestrians, and fallen objects.
Patent Document 1 describes a vehicle equipped with a camera that recognizes the environment ahead of the vehicle. The camera is attached to the front ceiling of the vehicle interior so that it can image objects outside the vehicle.
JP 2016-162299 A
However, according to the above conventional technique, only objects within the imaging range of the camera attached to the vehicle can be detected, and the detection accuracy for surrounding objects is therefore insufficient.
The present invention has been made in view of the above, and its object is to obtain a roadside machine capable of improving the detection accuracy of surrounding objects.
In order to solve the above problems and achieve the object, the present invention includes a peripheral object detection unit that detects objects existing in the vicinity, a first reception unit that receives a road-to-vehicle radio signal transmitted from an in-vehicle device, and a detection result integration unit that integrates the detection result of the peripheral object detection unit with the detection result, received from the in-vehicle device using the road-to-vehicle radio signal, of objects present in the vicinity of the in-vehicle device.
The roadside machine according to the present invention has the effect of improving the accuracy of detection results for surrounding objects.
FIG. 1 is a schematic configuration diagram of the road-to-vehicle communication system according to the first embodiment of the present invention.
FIG. 2 is a configuration diagram of the roadside machine according to the first embodiment.
FIG. 3 is a diagram showing an arrangement example of the roadside machines and vehicles of the first embodiment.
FIG. 4 is a diagram showing the states of the peripheral objects shown in FIG. 3.
FIG. 5 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the first embodiment.
FIG. 6 is a diagram showing an example of the detection results detected in the example shown in FIG. 3.
FIG. 7 is a diagram showing the detection results after integrating the detection results shown in FIG. 6.
FIG. 8 is a diagram showing the transmission frame transmitted in the first embodiment.
FIG. 9 is a flowchart showing a first example of the state parameter integration processing of the first embodiment.
FIG. 10 is a flowchart showing a second example of the state parameter integration processing of the first embodiment.
FIG. 11 is a flowchart showing a third example of the state parameter integration processing of the first embodiment.
FIG. 12 is a configuration diagram of the roadside machine according to the second embodiment.
FIG. 13 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the second embodiment.
FIG. 14 is a diagram showing the transmission frame transmitted in the second embodiment.
FIG. 15 is a configuration diagram of the roadside machine according to the third embodiment.
FIG. 16 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the third embodiment.
FIG. 17 is a diagram showing the transmission frame transmitted in the third embodiment.
FIG. 18 is a diagram showing the hardware configuration of the roadside machines according to the first to third embodiments.
Hereinafter, the roadside machine and road-to-vehicle communication system according to embodiments of the present invention will be described in detail with reference to the drawings. Note that the present invention is not limited to these embodiments.
Embodiment 1.
FIG. 1 is a schematic configuration diagram of the road-to-vehicle communication system according to the first embodiment of the present invention. The road-to-vehicle communication system 100 includes a roadside machine 1A, a roadside machine 1B, and an in-vehicle device 2. In the following embodiments and drawings, a plurality of constituent elements having similar functions are given reference numerals in which a different letter is appended to a common numeral. When describing matters common to these constituent elements, only the common numeral is used; when distinguishing the individual constituent elements, the reference numeral with the appended letter is used. For example, the roadside machine 1A and the roadside machine 1B can be referred to collectively as roadside machines 1. In FIG. 1, one in-vehicle device 2 is shown for simplicity, but the road-to-vehicle communication system 100 may include a plurality of in-vehicle devices 2. Similarly, the number of roadside machines 1 is not limited to two; the road-to-vehicle communication system 100 can include any plurality of roadside machines 1.
The roadside machine 1 is a communication device fixedly installed at a predetermined location. The roadside machine 1 is installed, for example, near a place where vehicles can pass, such as a roadway or a parking lot. The roadside machine 1 can communicate wirelessly with an in-vehicle device 2 existing in its communication area. The communication area is an area predetermined for each roadside machine 1: the range over which that roadside machine 1 covers communication, that is, the area in which a radio signal transmitted by the roadside machine 1 can be received. The roadside machine 1 can also communicate with other roadside machines 1. Hereinafter, communication between the roadside machine 1 and the in-vehicle device 2 is referred to as road-to-vehicle communication, and a signal transmitted and received using road-to-vehicle communication is referred to as a road-to-vehicle signal. Likewise, communication between roadside machines 1 is referred to as road-to-road communication, and a signal transmitted and received using road-to-road communication is referred to as a road-to-road signal. In the present embodiment, the roadside machine 1 has a function of detecting peripheral objects 4 existing around the roadside machine 1. A peripheral object 4 is any object, including vehicles, fallen objects, and pedestrians.
The in-vehicle device 2 is a communication device mounted on a vehicle 3. The in-vehicle device 2 can communicate with the roadside machine 1 when it is within the communication area of the roadside machine 1. In the present embodiment, the in-vehicle device 2 has a function of detecting peripheral objects 4 existing around the vehicle 3 on which it is mounted.
The roadside machine 1 and the in-vehicle device 2 each detect peripheral objects 4, and the in-vehicle device 2 notifies the roadside machine 1 of its detection results of peripheral objects 4 using road-to-vehicle communication. The roadside machine 1 integrates the detection results of peripheral objects 4 that it detected itself with the detection results notified from the in-vehicle device 2. The roadside machine 1 may also acquire detection results from other roadside machines 1 using road-to-road communication and further integrate the acquired detection results. The following describes a roadside machine 1 that integrates the detection results collected from the in-vehicle devices 2 and the other roadside machines 1 with the detection results of peripheral objects 4 that it detected itself.
FIG. 2 is a configuration diagram of the roadside machine according to the first embodiment. The roadside machine 1 includes a peripheral object detection unit 11 and a first acquisition unit 12. The first acquisition unit 12 includes a first detection accuracy output unit 13 and a first state parameter output unit 14. The roadside machine 1 further includes a circulator 21, a first reception unit 22, and a second acquisition unit 23. The second acquisition unit 23 includes a second detection accuracy output unit 24 and a second state parameter output unit 25. The roadside machine 1 further includes a second reception unit 31 and a third acquisition unit 32. The third acquisition unit 32 includes a third detection accuracy output unit 33 and a third state parameter output unit 34. The roadside machine 1 further includes a detection result integration unit 41, which includes a detection accuracy integration unit 42 and a state parameter integration unit 43. The roadside machine 1 further includes a peripheral object list generation unit 51, a transmission frame generation unit 52, and a first transmission unit 61.
The peripheral object detection unit 11 is connected to the peripheral object detection sensor 10 and uses it to detect objects existing around the roadside machine 1. The peripheral object detection unit 11 outputs detection information about the detected peripheral objects to the first acquisition unit 12. The peripheral object detection sensor 10 is, for example, an imaging device or a radar. When the peripheral object detection sensor 10 is an imaging device, the peripheral object detection unit 11 detects peripheral objects 4 using techniques such as pattern recognition on the images acquired by the imaging device, and outputs an image including each detected peripheral object 4 to the first acquisition unit 12 as the detection result. When the peripheral object detection sensor 10 is a radar, the peripheral object detection unit 11 detects peripheral objects 4 based on the radar signal reflected back from them after emission, and outputs the radar signal at the time of detection to the first acquisition unit 12 as the detection result.
The first acquisition unit 12 processes the detection result output by the peripheral object detection unit 11 and outputs the processed detection result to the detection result integration unit 41. Specifically, the first detection accuracy output unit 13 outputs, based on the detection result from the peripheral object detection unit 11, the detection accuracy of each peripheral object 4 to the detection result integration unit 41 as a processed detection result. The detection accuracy varies with the performance of the sensor detecting the peripheral object 4, the state of the sensor, the positional relationship between the target peripheral object 4 and the sensor, and so on. When the sensor is a camera, the detection accuracy is expected to drop sharply when direct sunlight strikes the lens, so the first acquisition unit 12 may calculate the detection accuracy according to the time of day at which the detection was made and the weather at that time. Alternatively, the first acquisition unit 12 may detect the distance between the target peripheral object 4 and the sensor and calculate a detection accuracy according to this distance. The first state parameter output unit 14 obtains, based on the detection result output by the peripheral object detection unit 11, state parameters indicating the state of each peripheral object 4 and outputs them to the detection result integration unit 41 as a processed detection result. The state parameters indicate the state of a peripheral object 4, for example its position, size, moving speed, and attribute. The first state parameter output unit 14 may also hold a history of the detection results output by the peripheral object detection unit 11 and obtain state parameters from that history. For example, to obtain the moving speed, the first state parameter output unit 14 can compute the moving speed of a peripheral object 4 from a plurality of images containing it, based on the distance moved and the time taken to move. The attribute is information indicating the type of the peripheral object 4, for example ordinary vehicle, large vehicle, two-wheeled vehicle, or other object.
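As a concrete illustration only, not part of the claimed configuration, the following sketch shows how a moving speed could be derived from two timestamped position detections of the same object; the helper names and the use of a haversine ground distance are assumptions for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters between two latitude/longitude points."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def moving_speed_kmh(prev, curr):
    """Speed in km/h from two (timestamp_s, lat, lon) detections of one object."""
    (t0, lat0, lon0), (t1, lat1, lon1) = prev, curr
    dt_s = t1 - t0
    if dt_s <= 0:
        return 0.0  # no elapsed time, no speed estimate
    return haversine_m(lat0, lon0, lat1, lon1) / dt_s * 3.6

# Two detections one second apart, about 8 m apart -> roughly 29 km/h.
print(moving_speed_kmh((0.0, 35.50000, 139.40000), (1.0, 35.50000, 139.40009)))
```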
The circulator 21 is connected to the road-to-vehicle communication antenna 20 and separates transmission signals from reception signals. The first reception unit 22 receives, via the road-to-vehicle communication antenna 20 and the circulator 21, the road-to-vehicle signal transmitted by an in-vehicle device 2. Using the road-to-vehicle signal, the first reception unit 22 receives from the in-vehicle device 2 the detection results for the peripheral objects 4 existing around that in-vehicle device 2, and outputs the received detection results to the second acquisition unit 23.
The second acquisition unit 23 processes the detection results received from the in-vehicle device 2 and outputs the processed detection results to the detection result integration unit 41. Specifically, the second detection accuracy output unit 24 extracts the detection accuracy of each peripheral object 4 contained in the detection results received from the in-vehicle device 2 and outputs the extracted detection accuracies to the detection result integration unit 41 as processed detection results. Likewise, the second state parameter output unit 25 extracts the state parameters of each peripheral object 4 contained in those detection results and outputs the extracted state parameters to the detection result integration unit 41 as processed detection results.
The second reception unit 31 is connected to the road-to-road communication antenna 30 and receives the road-to-road signal transmitted by another roadside device 1. Using the road-to-road signal, the second reception unit 31 receives from the other roadside device 1 the detection results for the peripheral objects 4 existing around that device and outputs them to the third acquisition unit 32.
The third acquisition unit 32 processes the detection results received from the other roadside device 1 and outputs the processed detection results to the detection result integration unit 41. Specifically, the third detection accuracy output unit 33 extracts the detection accuracy of each peripheral object 4 contained in the detection results received from the other roadside device 1 and outputs the extracted detection accuracies to the detection result integration unit 41. The third state parameter output unit 34 extracts the state parameters of each peripheral object 4 contained in those detection results and outputs the extracted state parameters to the detection result integration unit 41 as processed detection results.
The detection result integration unit 41 integrates the detection results output from the first acquisition unit 12, the second acquisition unit 23, and the third acquisition unit 32 to generate integrated detection results. Specifically, the detection result integration unit 41 extracts, from among the plurality of detection results output by the first, second, and third acquisition units, the detection results that refer to the same peripheral object 4, and groups them. The detection accuracy integration unit 42 then integrates, for each group, the detection accuracies output from the first detection accuracy output unit 13, the second detection accuracy output unit 24, and the third detection accuracy output unit 33 to generate an integrated detection accuracy for each peripheral object 4. Similarly, the state parameter integration unit 43 integrates, for each group, the state parameters output from the first state parameter output unit 14, the second state parameter output unit 25, and the third state parameter output unit 34 to generate integrated state parameters for each peripheral object 4. The specific processing of the detection result integration unit 41 is described later.
The peripheral object list generation unit 51 generates a peripheral object list based on the detection results integrated by the detection result integration unit 41. The peripheral object list contains the integrated detection accuracy and state parameters for each peripheral object.
The transmission frame generation unit 52 generates, based on the peripheral object list generated by the peripheral object list generation unit 51, a transmission frame to be transmitted to the in-vehicle devices 2, and outputs the generated transmission frame to the first transmission unit 61.
The first transmission unit 61 is connected to the road-to-vehicle communication antenna 20 via the circulator 21 and transmits the transmission frame output by the transmission frame generation unit 52 to the in-vehicle devices 2.
FIG. 3 is a diagram showing an example arrangement of roadside devices and vehicles. The detection processing of peripheral objects 4 performed by the roadside device 1 is described below with a specific example based on the situation shown in FIG. 3. Roadside devices 1A and 1B are installed beside a roadway, and a fallen object 5A lies on the roadway. At a given moment, vehicle 3A is traveling in one lane, while vehicles 3B, 3C, and 3D are traveling in the opposite lane. Traffic drives on the left. In FIG. 3, the peripheral objects 4 are the vehicles 3A, 3B, 3C, and 3D and the fallen object 5A.
FIG. 4 is a diagram showing the states of the peripheral objects in FIG. 3. Vehicle 3A is located at 35.50000 degrees north latitude, 139.40000 degrees east longitude, and is a large vehicle with a body of approximately 10 m × 6 m × 4 m. Vehicle 3B is located at 35.50000 degrees north, 139.50000 degrees east, and is an ordinary vehicle of approximately 5 m × 3 m × 2 m. Vehicle 3C is located at 35.50000 degrees north, 139.60000 degrees east, and is an ordinary vehicle of approximately 5 m × 3 m × 2 m. Vehicle 3D is located at 35.50000 degrees north, 139.70000 degrees east, and is an ordinary vehicle of approximately 5 m × 3 m × 2 m. The fallen object 5A is located at 35.50000 degrees north, 139.30000 degrees east, and is an object of approximately 2 m × 2 m × 1 m.
FIG. 5 is a flowchart showing the peripheral object detection operation of the roadside device according to the first embodiment, as performed by the roadside device 1A shown in FIG. 3. The roadside device 1A acquires the detection results for the peripheral objects 4 obtained by the peripheral object detection unit 11 (step S101). Specifically, the first acquisition unit 12 acquires, from the detection results obtained by the peripheral object detection unit 11, the detection accuracy and state parameters of each peripheral object 4.
The first reception unit 22 receives from the in-vehicle devices 2 the detection results for the peripheral objects 4 existing around each in-vehicle device 2 (step S102). The second acquisition unit 23 receives the detection results acquired by each of the plurality of in-vehicle devices 2.
The second reception unit 31 receives the detection results for peripheral objects 4 from the other roadside device 1B (step S103). In the example of FIG. 3, only the roadside device 1B is shown as another roadside device, but the roadside device 1A may acquire detection results for peripheral objects 4 from a plurality of other roadside devices 1.
FIG. 6 is a diagram showing an example of the detection results obtained in the example of FIG. 3, namely the detection results acquired by the roadside device 1A through the processing of steps S101 to S103. These detection results include the detection results for the peripheral objects 4 detected by each of the roadside device 1A, the roadside device 1B, and the vehicles 3A, 3B, and 3C.
Returning to FIG. 5, the detection result integration unit 41 of the roadside device 1A integrates the plurality of acquired detection results (step S104). Specifically, the detection result integration unit 41 integrates the detection accuracies and state parameters contained in the acquired detection results for each peripheral object 4. Methods for integrating detection accuracies include, for example, adopting the largest detection accuracy as the integrated detection accuracy, or performing arithmetic processing using the plurality of detection accuracies and state parameter values. The detection accuracy integration method may be chosen to match the state parameter integration method described later.
The peripheral object list generation unit 51 generates a peripheral object list containing the detection results integrated by the detection result integration unit 41 (step S105). FIG. 7 is a diagram showing the detection results after the detection results in FIG. 6 have been integrated. To aid understanding, the "target" column indicates which object in FIG. 3 each integrated detection result corresponds to. The integrated detection results show the detection accuracy value and the state parameters for each target peripheral object 4.
Returning to FIG. 5, after the peripheral object list has been generated, the transmission frame generation unit 52 generates a transmission frame containing the peripheral object list (step S106). FIG. 8 is a diagram showing the transmission frame transmitted in the first embodiment. The transmission frame 200 shown in FIG. 8 contains the detection results of N peripheral objects; each detection result consists of an integrated detection accuracy 201 and integrated state parameters 202. The transmission frame 200 also contains other data 203. The transmission frame generation unit 52 outputs the generated transmission frame 200 to the first transmission unit 61.
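As an illustrative sketch only, the transmission frame 200 could be modeled as below; the field names are assumptions, with each of the N per-object results carrying an integrated detection accuracy 201 and integrated state parameters 202, plus the other data 203.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class IntegratedResult:
    """One peripheral object's integrated detection result (field names assumed)."""
    accuracy_pct: float                  # integrated detection accuracy 201
    position: Tuple[float, float]        # integrated state parameters 202: (lat, lon)
    size_m: Tuple[float, float, float]   # (length, width, height)
    speed_kmh: float
    attribute: str                       # e.g. "ordinary vehicle", "other object"

@dataclass
class TransmissionFrame:
    """Sketch of transmission frame 200: N object results plus other data 203."""
    results: List[IntegratedResult] = field(default_factory=list)
    other_data: bytes = b""
```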
Returning to FIG. 5, the first transmission unit 61 transmits, via the road-to-vehicle communication antenna 20, the transmission frame 200 output by the transmission frame generation unit 52 to the in-vehicle devices 2 existing within the communication area (step S107). A vehicle 3 carrying an in-vehicle device 2 that receives the transmission frame 200 can use the detection results it contains to control automated driving or driver assistance.
Next, the state parameter integration methods are described in detail. The integration method can be selected from among the several methods described below according to the type of state parameter or the content of the detection results.
FIG. 9 is a flowchart showing a first example of the state parameter integration process. The detection result integration unit 41 of the roadside device 1A extracts, from among the detection results acquired by a plurality of detecting entities as shown in FIG. 6, the detection results that refer to the same peripheral object 4, and groups them (step S108).
One method by which the detection result integration unit 41 can extract detection results for the same peripheral object 4 is image analysis: using the images containing peripheral objects acquired by the peripheral object detection sensor 10, the unit can recognize the same peripheral object across multiple images. Alternatively, the detection result integration unit 41 can extract detection results for the same peripheral object 4 based on the position information contained in each detection result. Here, to allow for error, the detection result integration unit 41 can judge peripheral objects 4 to be the same object when the difference in their position information is within a predetermined range. In the example of FIG. 6, based on the positions contained in the detection results, the detection result integration unit 41 can judge that the 01st, 05th, and 07th detection results refer to the same peripheral object 4, and that the 02nd and 08th detection results refer to the same peripheral object 4. It can further judge that the 03rd, 06th, and 10th detection results refer to the same peripheral object 4, that the 04th, 09th, and 12th detection results refer to the same peripheral object 4, and that the 11th and 13th detection results refer to the same peripheral object 4.
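A minimal sketch of the position-based grouping, assuming each detection carries a latitude/longitude fix and that a fixed coordinate tolerance stands in for the "predetermined range"; the tolerance value and the greedy assignment are illustrative assumptions.

```python
def same_object(det_a, det_b, tol_deg=1e-4):
    """Judge two detections to be the same object when their latitudes and
    longitudes each differ by no more than tol_deg (roughly 10 m here)."""
    return (abs(det_a["lat"] - det_b["lat"]) <= tol_deg
            and abs(det_a["lon"] - det_b["lon"]) <= tol_deg)

def group_detections(detections, tol_deg=1e-4):
    """Greedily group detections that appear to refer to the same object."""
    groups = []
    for det in detections:
        for group in groups:
            if same_object(det, group[0], tol_deg):  # compare with the group's first member
                group.append(det)
                break
        else:  # no existing group matched: start a new one
            groups.append([det])
    return groups
```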
Returning to FIG. 9, within each group formed in step S108, the detection result integration unit 41 combines the state parameters by weighting them with coefficients according to the detection accuracy of each detection result (step S109); this integration process can be used when the state parameter is expressed as a numerical value. For example, when the detection accuracy of detection means i is α_i % and the numerical value of the state parameter from detection means i is X_i, the integrated state parameter value X_total is given by the following formula, in which the weighting coefficient is the detection accuracy α_i.
$$X_{\mathrm{total}} = \frac{\sum_{i} \alpha_i X_i}{\sum_{i} \alpha_i}$$
For example, FIG. 7 shows, for the group of detection result IDs (IDentification) 03, 06, and 10 in FIG. 6, the size of the peripheral object 4 computed with these weighting coefficients and then rounded. The size state parameter is expressed as numerical values for the object's length, width, and height, so the above formula can be applied to each of the length, width, and height values contained in the detection results. For example, applying the formula to the length values gives an integrated length of 4.12, as shown below. Computed in the same way, the integrated width is 2 and the integrated height is 1.76.
[Equation 2: the weighted-average formula above evaluated on the length values of detection results 03, 06, and 10 from FIG. 6, yielding the integrated length 4.12. The individual length and accuracy values are given only in FIG. 6 and are not reproduced here.]
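A sketch of this weighted-average integration; the input values below are hypothetical stand-ins, since the actual per-result lengths and accuracies appear only in FIG. 6.

```python
def weighted_average(values, accuracies):
    """Integrate numerical state parameters, weighting each value by its
    detection accuracy (in percent), per the formula above."""
    total_weight = sum(accuracies)
    assert len(values) == len(accuracies) and total_weight > 0
    return sum(a * x for a, x in zip(accuracies, values)) / total_weight

# Hypothetical length values and accuracies for three detections of one object.
lengths = [4.0, 4.5, 4.0]
accuracies = [90.0, 40.0, 60.0]
print(round(weighted_average(lengths, accuracies), 2))  # 4.11 for these inputs
```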
FIG. 10 is a flowchart showing a second example of the state parameter integration process. As in FIG. 9, the detection result integration unit 41 extracts detection results for the same peripheral object 4 and groups them (step S108).
The detection result integration unit 41 judges whether the state parameters of two or more detection results match (step S110). If they do (step S110: Yes), the detection result integration unit 41 adopts the most frequent state parameter value as the integrated state parameter (step S111). If no two or more detection results have matching state parameters (step S110: No), the detection result integration unit 41 performs integration using another criterion (step S112).
For example, in the group of detection result IDs 01, 05, and 07 in FIG. 6, the attribute values are "other object" twice and "two-wheeled vehicle" once, so the integration process of FIG. 10 yields "other object" as the integrated attribute. When no two or more state parameters match, this integration method cannot be used, and integration based on another criterion is applied instead. Also, because the accuracy of the integrated state parameter may degrade when a group contains only a few detection results, this method may be restricted to cases where the number of detection results is at least a predetermined number.
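A sketch of this majority-vote integration, returning None when no value occurs at least twice so that the caller can fall back to another criterion; the fallback signaling is an assumption.

```python
from collections import Counter

def majority_value(values):
    """Adopt the most frequent state parameter value; None means no two
    values matched and another integration criterion must be used."""
    value, count = Counter(values).most_common(1)[0]
    return value if count >= 2 else None

print(majority_value(["other object", "two-wheeled vehicle", "other object"]))
# -> other object
```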
FIG. 11 is a flowchart showing a third example of the state parameter integration process. As in FIG. 9, the detection result integration unit 41 extracts detection results for the same peripheral object 4 and groups them (step S108).
The detection result integration unit 41 selects, within each group, the state parameter of the detection result with the highest detection accuracy as the integrated state parameter (step S113).
For example, in the group of detection result IDs 01, 05, and 07 in FIG. 6, the moving speed values are 0 km/h, 1 km/h, and 20 km/h, and the detection accuracies corresponding to these state parameters are 95%, 80%, and 15%. In this case, the integration process of FIG. 11 selects 0 km/h, the value with the highest detection accuracy, as the integrated moving speed. This integration method can be used even when the state parameter is not a numerical value.
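A sketch of this selection, reusing the example values quoted above:

```python
def best_by_accuracy(values, accuracies):
    """Select the state parameter reported with the highest detection accuracy;
    works for non-numerical values too, since the values themselves are never compared."""
    best_index = max(range(len(values)), key=lambda i: accuracies[i])
    return values[best_index]

speeds = ["0 km/h", "1 km/h", "20 km/h"]
accuracies = [95, 80, 15]
print(best_by_accuracy(speeds, accuracies))  # -> 0 km/h
```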
The state parameter integration processes described above may be applied as a single process selected for each state parameter, or several processes may be combined. The process to use may be selected as appropriate according to the number and content of the acquired state parameters.
As described above, according to the first embodiment of the present invention, the detection results for the peripheral objects 4 detected by the roadside device 1A can be integrated with the detection results for the peripheral objects 4 detected by the in-vehicle devices 2. Because this increases the amount of information used to obtain a detection result, the accuracy of the detection results can be improved. Moreover, since the roadside device 1 may be able to detect a peripheral object 4 located in a blind spot of a vehicle 3, a further improvement in detection accuracy can be expected; in particular, at locations with poor visibility such as intersections and curves, a peripheral object 4 that cannot be detected by the sensors mounted on a vehicle 3 is more likely to be detected by the roadside device 1. The roadside device 1A can further integrate the detection results acquired by the other roadside device 1B. Because multiple roadside devices 1 are often installed at intervals, the other roadside device 1B can detect peripheral objects 4 in a range different from that of the roadside device 1A. The integrated detection results therefore cover peripheral objects 4 over a wide area, and each vehicle 3 that acquires them can grasp the surrounding vehicles 3, obstacles, congestion conditions, accidents, and the like along its planned route.
Embodiment 2.
FIG. 12 is a configuration diagram of the roadside device according to the second embodiment. In addition to the configuration of the roadside device 1 according to the first embodiment shown in FIG. 2, the roadside device 1 shown in FIG. 12 further includes a threshold setting unit 71, a congestion degree estimation unit 72, and a threshold generation unit 73. The following description focuses mainly on the differences from the roadside device 1 according to the first embodiment.
The threshold setting unit 71 sets, in the transmission frame generation unit 52, a transmission permission determination threshold: a threshold the in-vehicle device 2 uses to decide whether to transmit a detection result. As the volume of detection results transmitted from the in-vehicle devices 2 grows, the road-to-vehicle communication band can become occupied. Since it is desirable to transmit the more useful detection results with priority, in this embodiment the in-vehicle device 2 decides whether to transmit a detection result based on its detection accuracy. By having the in-vehicle device 2 extract and transmit, from among its detection results, only those whose detection accuracy exceeds the transmission permission determination threshold, the amount of data exchanged over road-to-vehicle communication can be suppressed.
The congestion degree estimation unit 72 estimates the degree of congestion of road-to-vehicle communication and outputs the estimate to the threshold generation unit 73. For example, assuming that road-to-vehicle traffic increases with the number of surrounding vehicles 3, the congestion degree estimation unit 72 can estimate the congestion degree from the number of surrounding vehicles 3. In this case, it can estimate the congestion degree from the number of peripheral objects 4, among those detected by the peripheral object detection unit 11, whose attribute in the state parameters output by the first state parameter output unit 14 is a vehicle 3 such as an ordinary vehicle or a large vehicle. Alternatively, the congestion degree estimation unit 72 may estimate the traffic volume from the in-vehicle devices 2 to the roadside device 1 by analyzing the road-to-vehicle signals received by the first reception unit 22. The threshold generation unit 73 receives the congestion estimate output by the congestion degree estimation unit 72, generates a transmission permission determination threshold based on it, and outputs the generated threshold to the threshold setting unit 71. The higher the congestion degree, the larger the threshold generation unit 73 makes the transmission permission determination threshold; the lower the congestion degree, the smaller it makes the threshold. The threshold setting unit 71 thus sets a transmission permission determination threshold that reflects the estimated congestion degree, making it possible to adjust the communication volume according to congestion.
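A sketch of one possible mapping from the congestion estimate to the transmission permission determination threshold; the detected-vehicle proxy and all of the constants are illustrative assumptions.

```python
def transmission_threshold(vehicle_count, min_thr=50.0, max_thr=95.0, full_at=30):
    """Raise the accuracy threshold (%) as the estimated congestion grows,
    using the number of detected vehicles as the congestion proxy."""
    congestion = min(vehicle_count / full_at, 1.0)  # normalize to [0, 1]
    return min_thr + (max_thr - min_thr) * congestion

print(transmission_threshold(6))   # light load -> 59.0, most results may be sent
print(transmission_threshold(40))  # congested -> 95.0, only high-accuracy results
```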
FIG. 13 is a flowchart showing the peripheral object detection operation of the roadside device according to the second embodiment. The roadside device 1 performs the detection result integration process (step S100), which corresponds to steps S101 to S105 in FIG. 5.
In parallel with the detection result integration process of step S100, the congestion degree estimation unit 72 estimates the congestion degree of road-to-vehicle communication (step S201). The threshold generation unit 73 then generates a transmission permission determination threshold based on the estimated congestion degree (step S202).
The transmission frame generation unit 52 generates a transmission frame containing the detection results integrated in step S100 and the transmission permission determination threshold (step S203), and outputs the generated frame to the first transmission unit 61. FIG. 14 is a diagram showing the transmission frame transmitted in the second embodiment. In addition to the information contained in the transmission frame 200 of FIG. 8, the transmission frame 210 contains the transmission permission determination threshold 204.
 図13の説明に戻る。送信フレーム210が生成されると、第1送信部61は、生成された送信フレーム210を車載機2に送信する(ステップS204)。 Returning to the explanation of FIG. When the transmission frame 210 is generated, the first transmission unit 61 transmits the generated transmission frame 210 to the in-vehicle device 2 (step S204).
As described above, in the second embodiment, a transmission permission determination threshold generated according to the congestion degree of road-to-vehicle communication is notified to the in-vehicle devices 2. With this configuration, even when the number of vehicles within the communication area of the roadside device 1 increases and the traffic from the in-vehicle devices 2 to the roadside device 1 surges, the transmission of detection results from the in-vehicle devices 2 can be controlled on the roadside device 1 side. Moreover, because the transmission permission determination threshold applies to detection accuracy, the roadside device 1 can selectively acquire detection results of high accuracy and high importance even while the communication volume is being suppressed.
Embodiment 3.
FIG. 15 is a configuration diagram of the roadside device according to the third embodiment. In addition to the configuration of the roadside device 1 according to the first embodiment shown in FIG. 2, the roadside device 1 shown in FIG. 15 includes a transmission unnecessary list generation unit 53 and a threshold setting unit 71.
Using the peripheral object list that the peripheral object list generation unit 51 generated from the integrated detection results, the transmission unnecessary list generation unit 53 generates a road-to-vehicle transmission unnecessary list: a list of the peripheral objects 4 for which no further transmission of detection results between the in-vehicle devices 2 and the roadside device 1 is needed. The transmission unnecessary list generation unit 53 can identify such peripheral objects 4 based on detection accuracy; for example, it can build the list by extracting from the peripheral object list the peripheral objects whose detection accuracy is at or above a predetermined value and whose moving speed is 0. The transmission unnecessary list generation unit 53 outputs the generated road-to-vehicle transmission unnecessary list to the transmission frame generation unit 52.
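A sketch of the list generation under the example criterion just described (detection accuracy at or above a fixed value and zero moving speed); the 90% figure and the dictionary field names are assumptions.

```python
def build_no_send_list(peripheral_objects, accuracy_thr=90.0):
    """List the IDs of objects that need no further road-to-vehicle reporting:
    already detected with high accuracy and currently stationary."""
    return [
        obj["id"]
        for obj in peripheral_objects
        if obj["accuracy_pct"] >= accuracy_thr and obj["speed_kmh"] == 0
    ]
```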
In the third embodiment, the roadside device 1 does not have a function for estimating the congestion degree of road-to-vehicle communication, so the threshold setting unit 71 sets a predetermined transmission permission determination threshold in the transmission frame generation unit 52.
FIG. 16 is a flowchart showing the peripheral object detection operation of the roadside device according to the third embodiment. The detection result integration unit 41 of the roadside device 1 performs the detection result integration process (step S100). The threshold setting unit 71 sets the predetermined transmission permission determination threshold in the transmission frame generation unit 52 (step S301). The transmission frame generation unit 52 generates a transmission frame containing the integrated detection results for each peripheral object 4 and the transmission permission determination threshold (step S302).
The transmission unnecessary list generation unit 53 generates the road-to-vehicle transmission unnecessary list from the peripheral object list (step S303) and outputs it to the transmission frame generation unit 52. The transmission frame generation unit 52 turns ON the road-to-vehicle transmission unnecessary flag corresponding to the detection result of each peripheral object 4 included in the list (step S304), and outputs the generated transmission frame to the first transmission unit 61. The first transmission unit 61 transmits the transmission frame output by the transmission frame generation unit 52 to the in-vehicle devices 2 (step S305).
FIG. 17 is a diagram showing the transmission frame transmitted in the third embodiment. In addition to the information contained in the transmission frame 210 of FIG. 14, the transmission frame 220 shown in FIG. 17 contains a road-to-vehicle transmission unnecessary flag 205 for each peripheral object 4. Through the processing of step S304, the flag is set to ON for each peripheral object 4 judged to require no further transmission between road and vehicle. An in-vehicle device 2 that receives the transmission frame 220 can therefore, when reporting its next detection results, exclude from transmission the detection results of the peripheral objects 4 whose road-to-vehicle transmission unnecessary flag is ON.
According to the third embodiment described above, detection information for stationary objects that have already been detected with sufficient accuracy is no longer transmitted from the in-vehicle devices 2 to the roadside device 1. Thus, even when the road is congested and the number of in-vehicle devices 2 within the communication area of the roadside device 1 increases, the traffic from the in-vehicle devices 2 to the roadside device 1 can be suppressed while maintaining the detection performance for peripheral objects 4.
FIG. 18 is a diagram showing the hardware configuration of the roadside device according to the first to third embodiments. Each function of the roadside device 1 can be realized using a processor 81 and a memory 82, which are connected by a system bus 83.
The processor 81 realizes each function of the roadside device 1 by reading and executing the computer programs stored in the memory 82. The memory 82 stores the computer programs executed by the processor 81 and the information used in their execution.
The peripheral object detection unit 11, first reception unit 22, second reception unit 31, first transmission unit 61, first acquisition unit 12, second acquisition unit 23, third acquisition unit 32, detection result integration unit 41, peripheral object list generation unit 51, transmission frame generation unit 52, transmission unnecessary list generation unit 53, congestion degree estimation unit 72, and threshold generation unit 73 described in the first to third embodiments of the present invention are realized by the processor 81 reading and executing the corresponding operation programs stored in the memory 82. The threshold setting unit 71 is realized by the memory 82.
Although FIG. 18 shows one processor 81 and one memory 82, the present invention is not limited to this example; a plurality of processors 81 and a plurality of memories 82 may cooperate to realize the functions of the roadside device 1.
The configurations described in the above embodiments illustrate examples of the content of the present invention; they may be combined with other known techniques, and parts of them may be omitted or modified without departing from the scope of the present invention.
For example, although FIG. 2 shows a single peripheral object detection sensor 10 for simplicity, the present invention is not limited to this example. The roadside device 1 may include a plurality of peripheral object detection sensors 10, which may be sensors of the same type or of several different types.
For example, in the above embodiments the road-to-vehicle communication antenna 20 and the road-to-road communication antenna 30 are separate antennas, but when the two kinds of communication use nearby frequency bands, a common antenna can serve both road-to-vehicle and road-to-road communication. Also, although road-to-road communication has been described as wireless communication, the present invention is not limited to this example; road-to-road communication may be performed over a wired network such as optical fiber.
For example, in the above embodiments the peripheral objects 4 are objects likely to move over time, such as vehicles 3, fallen objects 5, and pedestrians, but a peripheral object 4 may also be a fixed object. For example, realizing an automated driving system requires detailed static map information of roads; because this map information must be detailed and accurate, it is desirable to use information on peripheral objects 4 collected from vehicles 3 actually traveling the roads. The present invention may also be used to detect information on peripheral objects for use as such map information.
The operations described with flowcharts in the above embodiments may be executed in a different order or in parallel when the obtained result does not differ. For example, in FIG. 16 the transmission permission determination threshold is set after the detection result integration process, but these processes may be executed in the reverse order or concurrently.
1, 1A, 1B roadside device; 2 in-vehicle device; 3, 3A, 3B, 3C, 3D vehicle; 4 peripheral object; 5, 5A fallen object; 10 peripheral object detection sensor; 11 peripheral object detection unit; 12 first acquisition unit; 13 first detection accuracy output unit; 14 first state parameter output unit; 21 circulator; 22 first reception unit; 23 second acquisition unit; 24 second detection accuracy output unit; 25 second state parameter output unit; 31 second reception unit; 32 third acquisition unit; 33 third detection accuracy output unit; 34 third state parameter output unit; 41 detection result integration unit; 42 detection accuracy integration unit; 43 state parameter integration unit; 51 peripheral object list generation unit; 52 transmission frame generation unit; 53 transmission unnecessary list generation unit; 71 threshold setting unit; 72 congestion degree estimation unit; 73 threshold generation unit; 81 processor; 82 memory; 83 system bus; 100 road-to-vehicle communication system; 200, 210, 220 transmission frame; 201 integrated detection accuracy; 202 integrated state parameters; 203 other data; 204 transmission permission determination threshold; 205 road-to-vehicle transmission unnecessary flag.

Claims (6)

1. A roadside device comprising: a peripheral object detection unit that detects objects existing in the vicinity; a first reception unit that receives a road-to-vehicle signal transmitted from an in-vehicle device; and a detection result integration unit that integrates a detection result of an object detected by the peripheral object detection unit with a detection result, received from the in-vehicle device using the road-to-vehicle signal, of an object existing in the vicinity of the in-vehicle device.
2. The roadside device according to claim 1, further comprising a second reception unit that receives a road-to-road signal transmitted from another roadside device, wherein the detection result integration unit further integrates a detection result, received from the other roadside device using the road-to-road signal, of an object existing in the vicinity of the other roadside device.
3. The roadside device according to claim 1 or 2, wherein the detection result integration unit adopts, as the integrated detection result value, one of: a value obtained by weighted combination of the detection results using weighting coefficients according to the detection accuracy of each detection result; the value of the detection result with the highest detection accuracy among the acquired detection results; or the value shared by the largest number of matching detection results among the acquired detection results.
4. The roadside device according to any one of claims 1 to 3, further comprising: a congestion degree estimation unit that estimates a degree of congestion of the road-to-vehicle signal; a threshold generation unit that generates, based on the estimation result of the congestion degree estimation unit, a transmission permission determination threshold serving as a criterion for the in-vehicle device to determine whether to transmit a detection result of a peripheral object; and a first transmission unit that transmits the transmission permission determination threshold to the in-vehicle device.
5. The roadside device according to any one of claims 1 to 3, further comprising: a transmission unnecessary list generation unit that identifies, based on the detection results integrated by the detection result integration unit, peripheral objects for which a detection accuracy at or above a predetermined criterion has been obtained, and generates a road-to-vehicle transmission unnecessary list listing the identified peripheral objects as peripheral objects whose detection results need not be transmitted from the in-vehicle device; and a first transmission unit that transmits information identifying the peripheral objects included in the road-to-vehicle transmission unnecessary list to the in-vehicle device.
6. A road-to-vehicle communication system comprising a plurality of roadside devices according to any one of claims 1 to 5 and a plurality of the in-vehicle devices.
PCT/JP2016/087222 2016-12-14 2016-12-14 Roadside machine and vehicle-to-road communication system WO2018109865A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/087222 WO2018109865A1 (en) 2016-12-14 2016-12-14 Roadside machine and vehicle-to-road communication system


Publications (1)

Publication Number Publication Date
WO2018109865A1 (en)

Family

ID=62559577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/087222 WO2018109865A1 (en) 2016-12-14 2016-12-14 Roadside machine and vehicle-to-road communication system

Country Status (1)

Country Link
WO (1) WO2018109865A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10105880A (en) * 1996-09-30 1998-04-24 Hitachi Ltd Mobile object control system
JP2012256162A (en) * 2011-06-08 2012-12-27 Sumitomo Electric Ind Ltd Roadside communicator, radio communication system, method for receiving radio signal, and computer program
JP2016110608A (en) * 2014-12-01 2016-06-20 住友電気工業株式会社 Roadside communication device, communication system, and data relay method
JP2016167199A (en) * 2015-03-10 2016-09-15 住友電気工業株式会社 Roadside communication device and data relay method
JP2016167202A (en) * 2015-03-10 2016-09-15 住友電気工業株式会社 Roadside communication device, data relay method, central device, computer program, and data processing method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019176299A (en) * 2018-03-28 2019-10-10 住友電気工業株式会社 Environment detection device, environment detection system, environment detection method, and computer program
JP7069944B2 (en) 2018-03-28 2022-05-18 住友電気工業株式会社 Environment detectors, environment detection systems, environment detection methods, and computer programs
JP2021149162A (en) * 2020-03-16 2021-09-27 株式会社Soken Transportation system
JP2021149163A (en) * 2020-03-16 2021-09-27 株式会社Soken Transportation system
JP2021174064A (en) * 2020-04-20 2021-11-01 株式会社Soken Traffic system
WO2022208570A1 (en) * 2021-03-29 2022-10-06 日本電気株式会社 Vehicle-mounted device, control server, measured data collection method, and program recording medium
CN113682307A (en) * 2021-08-06 2021-11-23 南京市德赛西威汽车电子有限公司 Visual lane change assisting method and system
CN113682307B (en) * 2021-08-06 2023-09-12 南京市德赛西威汽车电子有限公司 Visual lane change assisting method and system

Similar Documents

Publication Publication Date Title
WO2018109865A1 (en) Roadside machine and vehicle-to-road communication system
US9465105B2 (en) V2V communication-based vehicle identification apparatus and identification method thereof
CN111284487B (en) Lane line display method and electronic device for executing same
WO2015087502A1 (en) Vehicle self-location device
JP6626410B2 (en) Vehicle position specifying device and vehicle position specifying method
CN110036429B (en) Driving support system and driving support device
KR20180056675A (en) METHOD AND SYSTEM FOR GENERATING A DIGIT MAP
WO2015098510A1 (en) Vehicle control device, vehicle mounted with vehicle control device, and moving body detection method
JP5200568B2 (en) In-vehicle device, vehicle running support system
JP6828655B2 (en) Own vehicle position estimation device
JP4609467B2 (en) Peripheral vehicle information generation device, peripheral vehicle information generation system, computer program, and peripheral vehicle information generation method
JP2009181315A (en) Object detection device
US10909848B2 (en) Driving assistance device
KR20170019794A (en) Vehicle and collision avoidance method for the same
US11501539B2 (en) Vehicle control system, sensing device and sensing data processing method
US11199854B2 (en) Vehicle control system, apparatus for classifying markings, and method thereof
KR20150055278A (en) System and method for collecting traffic information using vehicle radar
CN109923598B (en) Object detection device for vehicle and object detection system for vehicle
WO2014167793A1 (en) Vehicle outside image saving device, and portable terminal with imaging function
JP2007108837A (en) On-board communication device and inter-vehicle communication system
US20160349070A1 (en) Unit setting apparatus and unit setting method
US20220406190A1 (en) Communication device, vehicle, computer-readable storage medium, and communication method
WO2015092974A1 (en) Oncoming car information generation device
KR102002583B1 (en) System and vehicle for providing precise position information of road landmarks
US11783426B2 (en) Information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16923900; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16923900; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)