WO2023145494A1 - Information processing device and information processing method - Google Patents

Information processing device and information processing method

Info

Publication number
WO2023145494A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
processing unit
time
detected object
Prior art date
Application number
PCT/JP2023/000844
Other languages
French (fr)
Japanese (ja)
Inventor
真大 新田
隆彦 稲冨
輝 小山
史行 井澤
敏軒 杜
Original Assignee
京セラ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京セラ株式会社
Publication of WO2023145494A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to an information processing device and an information processing method.
  • a technology is known in which a vehicle performs vehicle-to-vehicle communication with other vehicles and road-to-vehicle communication with a roadside unit to acquire information about detected objects detected by the other vehicles and the roadside unit (see Patent Document 1).
  • An information processing device includes: a receiving unit that receives, from a first transmitting device, first information indicating the state of an object detected at a first acquisition time together with the first acquisition time, and receives, from a second transmitting device, second information indicating the state of an object detected at a second acquisition time together with the second acquisition time; and a processing unit that processes the first information, the first acquisition time, the second information, and the second acquisition time received by the receiving unit. The processing unit predicts, from the first information, the state of the object detected in the first information at a third time after the first acquisition time and the second acquisition time, and predicts, from the second information, the state of the object detected in the second information at the third time.
  • based on the predicted states of the detected objects in the first information and the second information at the third time, the processing unit determines whether the object detected in the first information and the object detected in the second information are the same, and selects either the first information or the second information based on the result of the determination.
  • the information processing method includes: a receiving step of receiving first information indicating the state of an object detected at a first acquisition time and the first acquisition time from a first transmitting device that acquired the first information, and receiving second information indicating the state of an object detected at a second acquisition time and the second acquisition time from a second transmitting device that acquired the second information; a prediction step of predicting, from the first information, the state of the object detected in the first information at a third time after the first acquisition time and the second acquisition time, and predicting, from the second information, the state of the object detected in the second information at the third time; and a judgment step of determining, based on the states of the detected objects in the first information and the second information at the third time predicted in the prediction step, whether the object detected in the first information and the object detected in the second information are the same, and selecting either the first information or the second information based on the judgment result.
  • FIG. 1 is a layout diagram showing the layout of an information processing device according to an embodiment in a vehicle.
  • FIG. 2 is a functional block diagram showing a schematic configuration of the information processing apparatus of FIG. 1.
  • FIG. 3 is a diagram showing the positions of detected objects whose information the processing unit deletes.
  • FIG. 4 is a diagram showing the traveling-direction distance of a detected object.
  • FIG. 5 is a diagram in which the area around the vehicle is divided into a grid.
  • FIG. 6 is a diagram showing a specific example of the determination of identity of detected objects performed by the processing unit.
  • FIG. 7 is a flowchart for explaining the processing executed by the information processing apparatus of FIG. 1.
  • FIG. 1 shows a configuration example of a traffic support system including a vehicle 10 equipped with an information processing device according to one embodiment.
  • the traffic support system is, for example, a safe driving support communication system of Intelligent Transport Systems (ITS).
  • a driving safety support communication system is called a driving safety support system or a driving safety support radio system.
  • the traffic support system includes vehicles 10 and roadside units 20.
  • the traffic support system may include a plurality of vehicles 10.
  • for simplicity of explanation, the following description treats an arbitrarily selected single vehicle 10 as the own vehicle 30.
  • vehicles 10 other than the own vehicle 30 are also referred to as other vehicles 40.
  • features that can be explained without distinguishing between the own vehicle 30 and the other vehicles 40 are explained as features of the vehicle 10, including both the own vehicle 30 and the other vehicles 40.
  • the roadside unit 20 may observe observation targets such as vehicles, objects, and pedestrians 50 on the road in a predetermined area.
  • the roadside unit 20 may be placed near an intersection where multiple roads 60 (roadways) intersect to observe the road surface.
  • the roadside unit 20 may be arranged on the roadside other than the intersection.
  • one roadside unit 20 is arranged at the intersection.
  • a plurality of roadside units 20 may be arranged.
  • for example, roadside units 20 may be arranged in a number corresponding to the number of roads 60 intersecting at an intersection, and each roadside unit 20 may be arranged so that at least one road 60 associated with it in advance by the installer is included in its detection range.
  • the roadside unit 20 and the vehicle 10 running on the road 60 may perform wireless communication with each other.
  • a plurality of vehicles 10 may perform wireless communication with each other.
  • the roadside unit 20 may notify the vehicle 10 of various information, which will be described later.
  • the vehicle 10 may notify the other vehicle 10 and the roadside device 20 of various information, which will be described later.
  • the vehicle 10 may use various information acquired from the roadside unit 20 and other vehicles 10 to assist the driver of the vehicle 10 in driving safely.
  • the traffic assistance system may assist the driver of the vehicle 10 in driving safely.
  • the vehicle 10 is, for example, an automobile, but is not limited to an automobile, and may be a motorcycle, a bus, a streetcar, a bicycle, or the like.
  • the roadside unit 20 may be fixed to a structure tall enough to capture a scene including the road 60 outdoors, such as a traffic signal near an intersection where roads 60 cross, a utility pole, or a streetlight. As shown in FIG. 2, the roadside unit 20 may include a sensor 21. The sensor 21 detects information indicating the state of a detected object. The roadside unit 20 may add the detection time to the information detected by the sensor 21.
  • the roadside device 20 may periodically transmit information indicating the state of the detected object to its surroundings.
  • the information processing device 11 of the present disclosure may be mounted on the vehicle 10 .
  • Vehicle 10 may further include sensors 12 .
  • the information processing device 11 includes a receiver 13 and a processor 14 .
  • the information processing device 11 may include a control section 15 .
  • the sensor 12 may detect surrounding objects as detection objects.
  • a detected object may be an object that should be taken into account in driving the individual vehicle 10, such as a vehicle 10 on the road, a fixed object, or a person.
  • the sensor 12 may generate information indicating the state of the detected object at predetermined intervals.
  • the sensor 12 may add the acquisition time to the information indicating the state of the detected object.
  • the acquisition time may be acquired from a timer built into the processing unit 14 .
  • the information indicating the state of the detected object includes the position (latitude, longitude and altitude), speed, traveling direction, acceleration, type, reliability, and the like of the detected object.
  • when the detected object is a vehicle, the information indicating its state may also include the shift position, steering angle, and the like.
  • the sensor 12 may include, for example, an imaging device such as a visible light camera, an FIR (Far Infrared Rays) camera, or a stereo camera, and a ranging device such as an infrared radar, a millimeter wave radar, or a LiDAR (light detection and ranging).
  • the position of the detected object, the presence or absence of the detected object, the type of the detected object, and its traveling direction can be generated based on image analysis of images captured by the imaging device. The velocity and acceleration of the detected object can also be generated, for example from changes between successive detections.
  • the position of the detected object, the presence or absence of the detected object, and the type of an existing detected object can be generated based on the distances of object points detected by the ranging device, as can the velocity and acceleration.
  • the sensor 12 may include a speed sensor that detects the speed of the vehicle 10 and a positioning device such as a GNSS (Global Navigation Satellite System) that detects the spatial position of the vehicle 10 in real space.
  • the receiving unit 13 receives the first information indicating the state of the detected object detected at the first acquisition time and the first acquisition time from the first transmission device.
  • the receiving unit 13 receives second information indicating the state of the detected object detected at the second acquisition time and the second acquisition time from the second transmission device.
  • the first transmitter and the second transmitter may include other vehicles 10 and roadside units 20 .
  • the receiving unit 13 may include a communication interface capable of communicating with the first transmitting device and the second transmitting device.
  • a "communication interface" in this disclosure may include, for example, a radio.
  • the radio may include radios conforming to standards including RC-005, IEEE 802.11p.
  • a radio includes at least one antenna.
  • the processing unit 14 includes one or more processors and memory.
  • the processor may include at least one of a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that specializes in specific processing.
  • a dedicated processor may include an Application Specific Integrated Circuit (ASIC).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the processing unit 14 may include at least one of SoC (System-on-a-Chip) and SiP (System In a Package) in which one or more processors cooperate.
  • the processing unit 14 may transmit information indicating the state of the own vehicle 30 acquired by the sensor 12 of the vehicle 10 to the other vehicle 10 .
  • the transmitted information becomes first information or second information for other vehicles 10 .
  • the processing unit 14 processes the first information and the first acquisition time and the second information and the second acquisition time received by the receiving unit 13 .
  • a specific example of processing will be described below.
  • from the first information, the processing unit 14 predicts the state of the detected object in the first information at a third time t3 after the first acquisition time t1 and the second acquisition time t2. From the second information, the processing unit 14 predicts the state of the detected object in the second information at the third time t3, which is, for example, the current time.
  • An example of a prediction procedure for the first information is described below. A similar prediction procedure is performed for the second information.
  • processing unit 14 may perform the processing described below only when the difference between the first acquisition time t1 and the second acquisition time t2 is equal to or less than a predetermined threshold.
  • the processing unit 14 may extract the position p1, velocity v1, acceleration a, and traveling direction θ1 of the detected object at the first acquisition time t1 from the first information.
  • the processing unit 14 predicts the state of the detected object in the first information at the third time t3.
  • the processing unit 14 may predict the velocity v3 of the detected object at the third time t3.
  • the processing unit 14 may predict the movement distance d of the detected object from the first acquisition time t1 to the third time t3.
  • v3 = v1 + a × (t3 − t1) … (1)
  • d = v1 × (t3 − t1) + a × (t3 − t1)² / 2 … (2)
  • the processing unit 14 may calculate the displacement in latitude and longitude of the detected object from the first acquisition time t1 to the third time t3 based on the predicted moving distance d and the traveling direction θ1 of the detected object at the first acquisition time t1.
  • the processing unit 14 may predict the position p3 of the detected object at the third time t3 from the position p1 at the first acquisition time t1 and the calculated displacement.
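  • As a concrete illustration of equations (1) and (2) and the displacement step above, the following is a minimal Python sketch of the prediction, assuming a constant-acceleration model, a traveling direction θ1 given as an azimuth in degrees, and a small-displacement (flat-earth) conversion from meters to latitude/longitude; all names here are illustrative, not taken from the patent.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, for a flat-earth approximation

def predict_state(p1_lat, p1_lon, v1, a, theta1_deg, t1, t3):
    """Predict speed v3 and position p3 at time t3 from a detection at t1.

    Applies equation (1) for the speed and equation (2) for the moving
    distance d, then resolves d along the heading into a latitude and
    longitude displacement (small-distance approximation).
    """
    dt = t3 - t1
    v3 = v1 + a * dt                      # equation (1)
    d = v1 * dt + a * dt * dt / 2.0       # equation (2)

    # Resolve d along the heading (azimuth convention: 0 deg = north, 90 deg = east).
    theta = math.radians(theta1_deg)
    d_north = d * math.cos(theta)
    d_east = d * math.sin(theta)

    # Convert the metric displacement into degrees of latitude/longitude.
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(p1_lat))))
    return v3, (p1_lat + dlat, p1_lon + dlon)
```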
  • the processing unit 14 may predict the distance dr between the position p3 of the detected object at the third time t3 and the position of the vehicle 10 at the third time t3.
  • the processing unit 14 may predict the direction θ3 of the detected object viewed from the own vehicle 30 at the third time t3 from the position p3 of the detected object and the position of the vehicle 10.
  • the processing unit 14 may calculate the angle Δθ between the direction θ3 of the detected object and the traveling direction of the vehicle 10.
  • the direction θ3 of the detected object and the traveling direction of the vehicle 10 may be expressed as azimuth angles.
  • the processing unit 14 may determine that the detected object exists in a blind spot of the vehicle 10 when the difference Δθ is equal to or greater than the first angle threshold.
  • the processing unit 14 may notify the control unit 15 of the determination result.
  • the processing unit 14 may notify the control unit 15 of the determination result only when the difference Δθ is greater than or equal to the second angle threshold and the relative distance dr is less than or equal to the first distance threshold.
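  • A hedged sketch of this notification decision is shown below, working in a local metric frame (x east, y north) for simplicity; the threshold values stand in for the second angle threshold and the first distance threshold and are assumptions, as are the function and parameter names.

```python
import math

def should_notify(own_pos, own_heading_deg, obj_pos,
                  angle_threshold_deg=90.0, distance_threshold_m=100.0):
    """Return True when the predicted object should be reported to the control unit.

    own_pos and obj_pos are (x, y) positions at the third time t3.
    The object is reported when the bearing difference is at least the
    angle threshold and the relative distance dr is at most the distance
    threshold, mirroring the rule described above.
    """
    dx = obj_pos[0] - own_pos[0]
    dy = obj_pos[1] - own_pos[1]
    dr = math.hypot(dx, dy)                     # relative distance dr
    theta3 = math.degrees(math.atan2(dx, dy))   # azimuth of the object seen from the vehicle
    diff = abs((theta3 - own_heading_deg + 180.0) % 360.0 - 180.0)  # wrapped to [0, 180]
    return diff >= angle_threshold_deg and dr <= distance_threshold_m
```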
  • the processing unit 14 may predict the information of the vehicle 10 at the third time t3 from the information of the vehicle 10 acquired at the fourth acquisition time t4 before the third time t3. The processing unit 14 may determine whether to delete the first information and the second information further based on the information of the vehicle 10 at the third time t3. An example of a specific determination procedure will be described below.
  • the processing unit 14 may delete the information on the detected object before transmitting the information on the detected object to the control unit 15 .
  • the processing unit 14 may delete the information on the detected object before the prediction of the information on the detected object described above or after determining the identity of the detected object described later.
  • the information on detected objects required by the own vehicle 30 may vary depending on the manufacturer of the own vehicle 30 and the like. Therefore, the user or the like may be able to specify the determination procedure from an external interface or the like provided in the processing unit 14.
  • the decision procedure may be specified during driving according to driving conditions, traffic conditions, and the like.
  • the processing unit 14 may start the determination procedure.
  • the information maintenance range A may include a range Af on the traveling direction side of the vehicle 10 or a range Ab on the side opposite to the traveling direction of the vehicle 10 (see FIG. 3).
  • the range Af may be a semicircular shape centered on the widthwise center of the front edge of the vehicle 10 and having a radius rf.
  • the range Ab may be a semicircular shape centered on the widthwise center of the front edge of the vehicle 10 and having a radius rb.
  • the radius rf or the radius rb may be, for example, 300 meters, 250 meters, 200 meters, 150 meters, 100 meters, or 50 meters. Setting the radius rf to about 240 meters or more is considered to allow the vehicle 10 to turn right without decelerating at a T-junction when there is no other vehicle 10 in its traveling direction.
  • the processing unit 14 does not have to delete the information on the detected object if the information maintenance range A is not specified.
  • the processing unit 14 may calculate the distance, along the traveling direction of the own vehicle 30, between the widthwise center of the front edge of the own vehicle 30 and the position of the other vehicle 40 projected onto that direction (hereinafter referred to as the traveling-direction distance dt; see FIG. 4).
  • the processing unit 14 may determine whether the other vehicle 40 is on the traveling direction side of the own vehicle 30 or on the opposite side to the traveling direction.
  • Thresholds for the traveling direction distance may be set on the traveling direction side and the opposite side of the own vehicle 30, respectively.
  • the processing unit 14 may delete the information on the detected object when the traveling direction distance exceeds the threshold on the side where the detected object exists.
  • the threshold on the traveling direction side may be set to be equal to or greater than the traveling-direction distance to any intersection ahead of the own vehicle 30.
  • the threshold on the opposite side may be set to a distance that does not interfere with another vehicle 40, such as an emergency vehicle, moving in the same traveling direction as the own vehicle 30 at a speed higher than that of the own vehicle 30, and within a range in which the own vehicle 30 can sufficiently detect the other vehicle 40.
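  • The projection onto the traveling direction and the two-sided threshold test might be sketched as follows; the local metric frame, the azimuth heading, and the threshold values are assumptions for illustration only.

```python
import math

def traveling_direction_distance(own_pos, own_heading_deg, obj_pos):
    """Signed projection of the other vehicle's offset onto the own vehicle's heading (dt)."""
    theta = math.radians(own_heading_deg)
    ux, uy = math.sin(theta), math.cos(theta)   # unit vector of travel (azimuth convention)
    dx, dy = obj_pos[0] - own_pos[0], obj_pos[1] - own_pos[1]
    return dx * ux + dy * uy                    # > 0: ahead, < 0: behind

def should_delete(dt, forward_threshold_m=300.0, rear_threshold_m=100.0):
    """Delete the detected-object information when dt exceeds the threshold on its side."""
    if dt >= 0.0:
        return dt > forward_threshold_m
    return -dt > rear_threshold_m
```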
  • the processing unit 14 may determine whether the other vehicle 40 is visible from the own vehicle 30 (i.e., whether it is close to the own vehicle 30) at the third time t3. Other vehicles 40 that are visible from the own vehicle 30 may pose a danger to the own vehicle 30. When the other vehicle 40 is not visible, the processing unit 14 may determine whether the other vehicle 40 is dangerous to the own vehicle 30 at the third time t3. The processing unit 14 may delete information about detected objects, including other vehicles 40, that are not visible from the own vehicle 30 and are determined not to be dangerous.
  • the determination of visibility may be based on the position of the other vehicle 40 with respect to the own vehicle 30.
  • the determination of danger may be based on the position relative to own vehicle 30 and the behavior of other vehicle 40 .
  • an example of a procedure for determining visibility and whether the other vehicle 40 is dangerous to the own vehicle 30 is described below.
  • as shown in FIG. 5, the area around the own vehicle 30 may be divided into a 5 × 5 grid, with the front-left region defined as area 1. Areas from area 1 to the right end of the same row are defined as areas 2 to 5 in order.
  • the region in the row immediately behind area 1 is defined as area 6.
  • areas from area 6 to the right end of the same row are defined as areas 7 to 10 in order.
  • areas 11 to 25 are defined in the same way thereafter. Note that the own vehicle 30 is located in area 13.
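  • One way to encode this numbering is sketched below, assuming a 5 × 5 row-major grid counted from the front-left corner, which places the own vehicle 30 in area 13 as stated above; the offset convention and names are assumptions.

```python
def region_number(lane_offset, ahead_offset):
    """Map a relative grid cell to the area numbers 1 to 25.

    lane_offset: -2..2 lanes to the left (-) or right (+) of the own vehicle.
    ahead_offset: -2..2 cells behind (-) or ahead (+) of the own vehicle.
    """
    if not (-2 <= lane_offset <= 2 and -2 <= ahead_offset <= 2):
        return None               # outside the 5 x 5 grid
    row = 2 - ahead_offset        # row 0 is the farthest row ahead
    col = lane_offset + 2         # column 0 is the leftmost lane
    return row * 5 + col + 1

assert region_number(0, 0) == 13    # own vehicle
assert region_number(-1, 1) == 7    # one area ahead, one to the left
assert region_number(0, -1) == 18   # adjacent area behind
```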
  • the processing unit 14 may determine that the other vehicle 40 existing in the regions 12 and 14 adjacent to the left and right of the own vehicle 30 is visible from the own vehicle 30 .
  • the processing unit 14 may determine the danger of another vehicle 40 located in the adjacent area 8 in front of the own vehicle 30. If the angle formed by the traveling direction of the own vehicle 30 and the traveling direction of the other vehicle 40 is greater than or equal to the third angle threshold, the processing unit 14 may determine that the other vehicle 40 is changing lanes away from the own vehicle 30. The processing unit 14 may determine that the other vehicle 40 changing lanes in that direction is dangerous for the own vehicle 30. When the other vehicle 40 changes lanes, its driver may become distracted and may not pay enough attention to the own vehicle 30.
  • when the acceleration of the other vehicle 40 is negative (the other vehicle 40 is decelerating) and the acceleration is equal to or less than the first acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30.
  • the processing unit 14 may determine that another vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30 .
  • the processing unit 14 may determine the danger of another vehicle 40 located in area 3, two areas ahead of the own vehicle 30.
  • as in the case described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous for the own vehicle 30.
  • the processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30 when the acceleration of the other vehicle 40 is negative and the acceleration is equal to or less than the second acceleration threshold.
  • the processing unit 14 may determine that another vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30. Since the other vehicle 40 is some distance away from the own vehicle 30, the absolute value of the second acceleration threshold may be larger than the absolute value of the first acceleration threshold.
  • the processing unit 14 may determine the danger of another vehicle 40 located in the adjacent area 18 behind the own vehicle 30. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Further, when the acceleration of the other vehicle 40 is positive (the other vehicle 40 is accelerating) and the acceleration is greater than or equal to the third acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30. The processing unit 14 may determine that another vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30.
  • the processing unit 14 may determine the danger of another vehicle 40 located in area 23, two areas behind the own vehicle 30. When the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. The processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30 when the acceleration of the other vehicle 40 is positive and equal to or greater than the fourth acceleration threshold. The processing unit 14 may determine that another vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30. Since the other vehicle 40 is some distance away from the own vehicle 30, the absolute value of the fourth acceleration threshold may be larger than the absolute value of the third acceleration threshold.
  • the processing unit 14 may determine the danger of another vehicle 40 located in area 7 or 9, one area ahead of the own vehicle 30 and one area away to the left or right.
  • when the angle between the traveling direction of the other vehicle 40 and the traveling direction of the own vehicle 30 is equal to or greater than a fourth angle threshold, and the traveling direction of the other vehicle 40 and the vector from the other vehicle 40 to the own vehicle 30 point in the same direction, the processing unit 14 may determine that the other vehicle 40 is changing lanes to approach the own vehicle 30. In this case, the processing unit 14 may determine that the other vehicle 40 is dangerous for the own vehicle 30. If the other vehicle 40 changes lanes into the same lane as the own vehicle 30, danger may arise.
  • the processing unit 14 may also determine that the other vehicle 40 is dangerous to the own vehicle 30 when the acceleration of the other vehicle 40 is negative and equal to or less than the fifth acceleration threshold.
  • the processing unit 14 may similarly determine the danger of each other vehicle 40 located in area 2 or 4, two areas ahead of the own vehicle 30 and one area away to the left or right.
  • the processing unit 14 may determine the danger of another vehicle 40 located in area 17 or 19, one area behind the own vehicle 30 and one area away to the left or right. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes to approach the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. The processing unit 14 may also determine that the other vehicle 40 is dangerous to the own vehicle 30 when the acceleration of the other vehicle 40 is positive and equal to or greater than the sixth acceleration threshold. The processing unit 14 may similarly determine the danger of each other vehicle 40 located in area 22 or 24, two areas behind the own vehicle 30 and one area away to the left or right.
  • the processing unit 14 may determine the danger of other vehicles 40 located in area 1, 5, 6, 10, 11, 15, 16, 20, 21, or 25, two areas away from the own vehicle 30 to the left or right. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes to approach the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Since the other vehicle 40 is separated from the own vehicle 30 by two lanes, the fourth angle threshold for judging that the other vehicle 40 is rapidly approaching the own vehicle 30 may be larger than in the cases described above.
  • the processing unit 14 may determine that other vehicles 40 located further away from the own vehicle 30 are not dangerous for the own vehicle 30 .
  • the processing unit 14 may further determine whether another vehicle 40 is in a blind spot of the own vehicle 30. For example, consider a case where a first other vehicle 40 is present in one of areas 7 to 9 or 17 to 19, one area ahead of or behind the own vehicle 30, and a second other vehicle 40 exists between the own vehicle 30 and the first other vehicle 40. In this case, since the second other vehicle 40 hides the first other vehicle 40 (occlusion), the processing unit 14 may determine that the first other vehicle 40 exists in a blind spot of the own vehicle 30. The determination may be based on the size or type of the first other vehicle 40 or the second other vehicle 40.
  • the processing unit 14 may notify the control unit 15 of the determination result.
  • based on the predicted states at the third time t3 of the object detected in the first information (hereinafter, the "first detected object") and the object detected in the second information (hereinafter, the "second detected object"), the processing unit 14 determines whether the first detected object and the second detected object are the same. An example of a specific determination procedure is described below.
  • the first detected object and the second detected object may be other vehicles 40 .
  • the first detected object and the second detected object may be the own vehicle 30 .
  • the processing unit 14 may determine the identity of the recognition class (object type) between the first detected object and the second detected object.
  • Recognition classes are, for example, large vehicles, ordinary vehicles, emergency vehicles, pedestrians, and the like. When the recognition classes of the first detected object and the second detected object are similar, the processing unit 14 may determine that the first detected object and the second detected object have the same recognition class.
  • the processing unit 14 may determine the identity of the speeds of the first detected object and the second detected object at the third time t3. Specifically, when the difference between the speed of the first detected object and the speed of the second detected object is equal to or less than the speed threshold, the processing unit 14 may determine that the first detected object and the second detected object have speed identity.
  • the processing unit 14 may determine the identity of the traveling directions of the first detected object and the second detected object at the third time t3. Specifically, when the angle between the traveling direction of the first detected object and the traveling direction of the second detected object is equal to or smaller than the second angle threshold, the processing unit 14 may determine that the first detected object and the second detected object have traveling-direction identity.
  • the processing unit 14 may determine the identity of the positions of the first detected object and the second detected object at the third time t3. Specifically, the processing unit 14 may calculate the distance between the position of the first detected object and the position of the second detected object at the third time t3, and may determine that the first detected object and the second detected object have position identity when the calculated distance is equal to or less than the second distance threshold.
  • the processing unit 14 may determine whether the first detected object and the second detected object are the same based on at least one of the recognition-class identity, the speed identity, the traveling-direction identity, and the position identity determined between the first detected object and the second detected object at the third time t3.
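  • A minimal Python sketch of this identity check follows. The dictionary keys and threshold values (standing in for the speed threshold, the second angle threshold, and the second distance threshold) are illustrative assumptions; the text allows using any subset of the four criteria, whereas this sketch requires all of them.

```python
import math

def same_object(obj_a, obj_b, speed_threshold=2.0,
                angle_threshold_deg=15.0, distance_threshold=3.0):
    """Judge the identity of two detections predicted to the same time t3.

    obj_a and obj_b are dicts with keys 'cls', 'speed', 'heading_deg'
    and 'pos' (an (x, y) pair in a common metric frame).
    """
    same_class = obj_a["cls"] == obj_b["cls"]
    same_speed = abs(obj_a["speed"] - obj_b["speed"]) <= speed_threshold
    heading_diff = abs((obj_a["heading_deg"] - obj_b["heading_deg"] + 180.0) % 360.0 - 180.0)
    same_heading = heading_diff <= angle_threshold_deg
    same_position = math.dist(obj_a["pos"], obj_b["pos"]) <= distance_threshold
    return same_class and same_speed and same_heading and same_position
```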
  • when the processing unit 14 determines that the first detected object and the second detected object are the same, it selects either the first information or the second information.
  • the processing unit 14 may calculate the priority of each of the first information and the second information for selection.
  • the processing unit 14 may select the first information or the second information based on the priorities of the first information and the second information.
  • the processing unit 14 may transmit the selected information to the control unit 15 . Details of the priority calculation procedure will be described below.
  • Information received from other vehicles 40 may have a higher priority than information received from roadside units 20 from the viewpoint of accuracy.
  • alternatively, in some cases the priority of the information received from the roadside unit 20 may be increased instead.
  • the processing unit 14 may give higher priority to whichever of the first information and the second information was acquired later.
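  • The selection could then look like the sketch below, which ranks vehicle-sourced reports above roadside ones and breaks ties by recency; this is just one policy consistent with the paragraphs above, and the field names are assumptions.

```python
def select_information(info_a, info_b):
    """Pick one of two reports judged to describe the same object.

    info_a and info_b are dicts with 'source' ('vehicle' or 'roadside')
    and 'acquired_at' (a timestamp). Higher tuples win: vehicle-sourced
    reports first, then the later acquisition time.
    """
    def priority(info):
        source_rank = 1 if info["source"] == "vehicle" else 0
        return (source_rank, info["acquired_at"])

    return max((info_a, info_b), key=priority)
```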
  • the receiving unit 13 of the own vehicle 30 receives the information 101 to 106 from the roadside device 20.
  • the receiving unit 13 receives information 201 from the other vehicle 40a.
  • the receiving unit 13 receives information 301 from the other vehicle 40b.
  • the receiving unit 13 receives information 401 from the other vehicle 40c.
  • the information 101 is information indicating the state of the other vehicle 40a.
  • Information 102 is information indicating the state of the other vehicle 40b.
  • the information 103 is information indicating the state of the other vehicle 40c.
  • Information 104 is information indicating the state of the vehicle 30 .
  • the information 105 is information indicating the state of the pedestrian 50a.
  • Information 106 is information indicating the state of the pedestrian 50b.
  • the sensor 12 of the own vehicle 30 detects the position of the own vehicle 30 and the like.
  • the sensor 12 creates information 501 indicating the state of the own vehicle 30 .
  • the processing unit 14 predicts the state of the detected object in the information 101 to 106, 201, 301, 401 and 501 at the current time (third time) t3.
  • the processing unit 14 performs identity determination processing for the detected objects based on the predicted states of the detected objects at the third time t3. Specifically, the processing unit 14 determines whether the detected objects in the information 101 to 106, 201, 301, 401, and 501 are identical based on at least one of recognition-class identity, speed identity, traveling-direction identity, and position identity. The processing unit 14 determines that the information 101 and 201, the information 102 and 301, the information 103 and 401, and the information 104 and 501 are information detected from the same objects.
  • the processing unit 14 selects the information 201, 301, 401, and 501.
  • the processing unit 14 determines to transmit the selected information to the control unit 15 .
  • the processing unit 14 further determines that the information 105 and 106 is not information detected from the same object as any other information. Based on this determination, the processing unit 14 determines to transmit the information 105 and 106 to the control unit 15. As described above, the processing unit 14 may determine the identity of detected objects.
  • the first information and the second information may contain the vehicle ID.
  • the control unit 15 may recognize multiple pieces of information having the same vehicle ID as information indicating the same vehicle.
  • for example, suppose that in a previous cycle the processing unit 14 transmitted the first information to the control unit 15, and that in the current cycle it acquires second information relating to the same object as the detected object in the first information and determines their identity. The processing unit 14 may then select the second information and transmit it to the control unit 15 in this cycle.
  • the vehicle ID given to the first information by the first transmission device may differ from the vehicle ID given to the second information by the second transmission device. In that case, the control unit 15 may not recognize the two pieces of information as referring to the same vehicle. As a result, the continuity of the information about the vehicle provided to the control unit 15 may be interrupted and its trajectory may be lost.
  • the processing unit 14 may rewrite the vehicle ID in the second information to be transmitted to the control unit 15 so that it matches the first information.
  • the processing unit 14 may store the vehicle ID of the information (first information) transmitted to the control unit 15 for at least a certain period of time.
  • the processing unit 14 may rewrite the vehicle ID of the second information to the stored vehicle ID of the first information.
  • the first information and the second information may further include an increment counter indicating the time series of the information.
  • the processing unit 14 may store the increment counter of the information transmitted to the control unit 15 for at least a certain period of time. The processing unit 14 may also rewrite the increment counter when rewriting the vehicle ID.
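  • A possible sketch of this ID and counter bookkeeping is shown below; the track key, field names, and counter handling are illustrative assumptions rather than the patent's exact scheme.

```python
class IdRewriter:
    """Keep the vehicle ID and increment counter stable across sources.

    Remembers, per tracked object, the ID first forwarded to the control
    unit and rewrites later reports about the same object to match it,
    so the control unit sees one continuous trajectory.
    """

    def __init__(self):
        self._forwarded = {}  # track_key -> (vehicle_id, last_counter)

    def rewrite(self, track_key, report):
        if track_key in self._forwarded:
            vehicle_id, counter = self._forwarded[track_key]
            report["vehicle_id"] = vehicle_id
            report["counter"] = counter + 1          # keep the time series monotonic
            self._forwarded[track_key] = (vehicle_id, counter + 1)
        else:
            self._forwarded[track_key] = (report["vehicle_id"], report["counter"])
        return report
```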
  • the sensor 12 of the own vehicle 30 may acquire information indicating the state of the own vehicle 30 .
  • the identity determination between the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30 and the first information or the second information received from the roadside unit 20 may be performed by a process the same as or similar to the identity determination between the detected objects in the first information and the second information described above.
  • the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30 may be given a higher priority than the first information or the second information received from the roadside unit 20.
  • alternatively, the priority of the first information or the second information received from the roadside unit 20 may be increased.
  • the process of determining the identity of objects described above may take a long time.
  • the processing unit 14 may further predict the first information or the second information at the time (fifth time) t5 after the third time t3 when the above processing is finished.
  • An example of a prediction procedure for the first information is described below.
  • a prediction procedure the same as or similar to this may be performed on the second information.
  • a prediction procedure the same as or similar to this may be performed on the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30.
  • the processing unit 14 may predict the state of the object detected in the first information at the fifth time t5, which is, for example, the current time.
  • the prediction may be made in a manner the same as or similar to the state prediction method for the third time.
  • the processing unit 14 may further acquire information indicating the state of the detected object detected by the other vehicle 40 from the other vehicle 40 .
  • for example, the other vehicle 40a may transmit by vehicle-to-vehicle communication, together with the information 201, information indicating the state of the other vehicle 40b detected as a detected object, and the processing unit 14 may acquire this information.
  • for the information indicating the state of the other vehicle 40b acquired from the other vehicle 40a, the processing unit 14 may similarly perform the prediction of the detected object's state, the identity determination processing, the deletion processing of detected-object information, or the further prediction processing of object information.
  • the control unit 15 may control the vehicle 10 based on the processing result of the processing unit 14.
  • the controller 15 may be an ECU (Electronic Control Unit).
  • the control unit 15 may assist the driving of the vehicle 10 based on the detected object information received from the processing unit 14 .
  • the control unit 15 may notify the driver of the acquired information or information obtained by processing the information.
  • the control unit 15 may process the acquired information indicating the state of the detected object to narrow it down to information related to an urgent situation for the vehicle 10 (self-vehicle 30), and then notify the driver. In that case, the control unit 15 may narrow down the information to be notified to the driver by comparing the relative distance and relative speed between the own vehicle 30 and the detected object with predetermined criteria.
  • in a configuration in which operations of the vehicle 10 such as acceleration, deceleration, and steering are performed automatically or semi-automatically, the control unit 15 may use the acquired information to determine those operations.
  • the control unit 15 may be used to assist the driver of the vehicle 10 in driving safely.
  • the processing unit 14 of the present disclosure may be provided outside the vehicle 10 .
  • the processing unit 14 provided outside the vehicle 10 and the control unit 15 of the vehicle 10 may be network-connected, and the processing unit 14 may provide the cloud service to the vehicle 10 .
  • the information processing device 11 may perform this process at regular intervals.
  • the receiving unit 13 of the information processing device 11 receives the information on the detected object and the acquisition time.
  • the processing unit 14 of the information processing device 11 predicts information about a detected object.
  • the processing unit 14 deletes the information on the detected object.
  • the processing unit 14 determines the identity of the detected object.
  • the processing unit 14 further predicts information on the detected object.
  • in step S105, the processing unit 14 transmits the information on the detected object to the control unit 15. The process then ends.
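  • Putting the steps of the flowchart together, one processing cycle might be sketched as follows; every object and method name here is hypothetical, with t3 and t5 following the notation used above.

```python
def process_cycle(receiver, processor, controller, t3, t5):
    """One pass of the flowchart in FIG. 7."""
    reports = receiver.receive()                                      # receive info and acquisition times
    predicted = [processor.predict(r, t3) for r in reports]           # predict states at t3
    kept = [r for r in predicted if not processor.should_delete(r)]   # delete unneeded info
    selected = processor.deduplicate(kept)                            # identity determination and selection
    refreshed = [processor.predict(r, t5) for r in selected]          # further prediction at t5
    controller.transmit(refreshed)                                    # step S105: send to the control unit
```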
  • as described above, the processing unit 14 of the information processing apparatus 11 determines, based on the states of the detected object at the third time t3 predicted from the first information and the second information respectively, whether the detected object in the first information and the detected object in the second information are the same, and selects either the first information or the second information based on the determination result. With such a configuration, the amount of information processed by the information processing device 11 can be reduced, and the processing load of the information processing device 11 can be reduced.
  • when the processing unit 14 of the information processing apparatus 11 of the present embodiment determines that the detected objects in the first information and the second information are the same, only one of the first information and the second information is transmitted to the control unit 15.
  • as a result, the amount of information received by the control unit 15 is reduced, and the processing load of the control unit 15 can be reduced.
  • the processing unit 14 of the information processing apparatus 11 of the present embodiment may further determine which of the first information and the second information to transmit to the control unit 15 based on the reliabilities of the first information and the second information.
  • the control unit 15 can receive only highly accurate information.
  • the information processing apparatus 11 can improve the reliability of processing.
  • the processing unit 14 of the information processing device 11 of the present embodiment predicts the information of the vehicle 10 at the third time from the information of the vehicle 10 acquired at the fourth acquisition time before the third time, and determines, further based on the information of the vehicle 10 at the third time, whether to delete the first information and the second information. With such a configuration, information unnecessary for the vehicle 10 is deleted, and the processing load of the information processing device 11 can be further reduced.
  • the processing unit 14 of the information processing apparatus 11 predicts, from the first information, the first information at the third time, and predicts, from the second information, the second information at the third time.
  • if the first acquisition time and the second acquisition time differ, it is difficult for the information processing apparatus 11 to directly compare the detected object in the first information with the detected object in the second information. Aligning both to the same third time therefore improves the accuracy of the identity determination performed by the information processing apparatus 11.
  • the processing unit 14 of the information processing device 11 of the present embodiment further predicts the first information or the second information at the fifth time after the third time.
  • the information processing apparatus 11 can reduce deterioration in the accuracy of the first information or the second information due to the processing unit 14 needing time to determine the identity of the object, for example.
  • the program may be recorded on a computer-readable storage medium, for example an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, or a memory card.
  • the implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter; other forms may also be used.
  • the program may or may not be configured so that all processing is performed only in the CPU on the control board.
  • the program may be configured to be partially or wholly executed by another processing unit mounted on an expansion board or expansion unit added to the board as required.
  • the processing unit 14 and the control unit 15 in the vehicle 10 are described as having different configurations.
  • the configuration is not limited to this, and the control unit 15 may include the processing unit 14 .
  • in that case, the control unit 15 may perform the identity determination processing of the processing unit 14 described above, delete information on detected objects according to the result of the determination processing, and then assist the driving of the vehicle 10 using the information that has not been deleted.
  • embodiments according to the present disclosure are not limited to the specific configurations of the embodiments described above. Embodiments of the present disclosure can extend to all novel features described in the present disclosure, or combinations thereof, and to all novel methods or process steps described, or combinations thereof.
  • descriptions such as "first" and "second" in this disclosure are identifiers for distinguishing configurations. Configurations distinguished by descriptions such as "first" and "second" in this disclosure may have their numbers exchanged. For example, the first information and the second information can exchange the identifiers "first" and "second". The exchange of identifiers is done simultaneously, and the configurations remain distinct after the exchange. Identifiers may be deleted, and configurations from which identifiers have been deleted are distinguished by reference signs. The identifiers such as "first" and "second" in this disclosure should not be used as a basis for interpreting the order of the configurations or the existence of a lower-numbered identifier.
  • Reference Signs List: 10 vehicle; 11 information processing device; 12 sensor; 13 receiving unit; 14 processing unit; 15 control unit; 20 roadside unit; 21 sensor; 30 own vehicle; 40, 40a, 40b, 40c other vehicle; 50, 50a, 50b, 50c pedestrian; 60 road; 101 to 106, 201, 301, 401, 501 information; A information maintenance range; Af range on the traveling direction side; Ab range on the opposite side to the traveling direction; rf, rb radius; dt traveling-direction distance

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

This information processing device comprises a reception unit and a processing unit. The reception unit receives, from a first transmission device, a first acquisition time and first information that indicates a state of an object detected at the first acquisition time, and receives, from a second transmission device, a second acquisition time and second information that indicates a state of an object detected at the second acquisition time. The processing unit processes the first information, the first acquisition time, the second information, and the second acquisition time received by the reception unit. The processing unit predicts, from the first information, the state of the object detected in the first information at a third time that is after the first acquisition time and the second acquisition time, and predicts, from the second information, the state of the object detected in the second information at the third time. On the basis of the predicted states, at the third time, of the objects detected in the first information and the second information, the processing unit determines whether or not the object detected in the first information and the object detected in the second information are the same, and selects one of the first information and the second information on the basis of the result of the determination.

Description

Information processing device and information processing method

Cross-reference to related applications
This application claims priority from Japanese Patent Application No. 2022-011332 filed in Japan on January 27, 2022, and the entire disclosure of this earlier application is incorporated herein by reference.
The present invention relates to an information processing device and an information processing method.
A technology is known in which a vehicle performs vehicle-to-vehicle communication with other vehicles and road-to-vehicle communication with a roadside unit to acquire information about detected objects detected by the other vehicles and the roadside unit (see Patent Document 1).
Patent Document 1: JP 2019-185640 A
An information processing device according to a first aspect includes:
a receiving unit that receives, from a first transmitting device, first information indicating the state of an object detected at a first acquisition time together with the first acquisition time, and receives, from a second transmitting device, second information indicating the state of an object detected at a second acquisition time together with the second acquisition time; and
a processing unit that processes the first information, the first acquisition time, the second information, and the second acquisition time received by the receiving unit,
wherein the processing unit predicts, from the first information, the state of the object detected in the first information at a third time after the first acquisition time and the second acquisition time, and predicts, from the second information, the state of the object detected in the second information at the third time, and
the processing unit determines, based on the predicted states of the detected objects in the first information and the second information at the third time, whether the object detected in the first information and the object detected in the second information are the same, and selects either the first information or the second information based on the result of the determination.
An information processing method according to a second aspect includes:
a receiving step of receiving first information indicating the state of an object detected at a first acquisition time and the first acquisition time from a first transmitting device that acquired the first information, and receiving second information indicating the state of an object detected at a second acquisition time and the second acquisition time from a second transmitting device that acquired the second information;
a prediction step of predicting, from the first information, the state of the object detected in the first information at a third time after the first acquisition time and the second acquisition time, and predicting, from the second information, the state of the object detected in the second information at the third time; and
a judgment step of determining, based on the states of the detected objects in the first information and the second information at the third time predicted in the prediction step, whether the object detected in the first information and the object detected in the second information are the same, and selecting either the first information or the second information based on the judgment result.
FIG. 1 is a layout diagram showing the layout of an information processing device according to an embodiment in a vehicle.
FIG. 2 is a functional block diagram showing a schematic configuration of the information processing apparatus of FIG. 1.
FIG. 3 is a diagram showing the positions of detected objects whose information the processing unit deletes.
FIG. 4 is a diagram showing the traveling-direction distance of a detected object.
FIG. 5 is a diagram in which the area around the vehicle is divided into a grid.
FIG. 6 is a diagram showing a specific example of the determination of identity of detected objects performed by the processing unit.
FIG. 7 is a flowchart for explaining the processing executed by the information processing apparatus of FIG. 1.
An embodiment of an information processing apparatus to which the present disclosure is applied is described below with reference to the drawings.
FIG. 1 shows a configuration example of a traffic support system including a vehicle 10 equipped with an information processing device according to one embodiment. The traffic support system is, for example, a safe driving support communication system of Intelligent Transport Systems (ITS). A safe driving support communication system is also called a driving safety support system or a driving safety support radio system.
The traffic support system includes vehicles 10 and roadside units 20, and may include a plurality of vehicles 10. For simplicity of explanation, the following description treats an arbitrarily selected single vehicle 10 as the own vehicle 30. Vehicles 10 other than the own vehicle 30 are also referred to as other vehicles 40. In the following, features that can be explained without distinguishing between the own vehicle 30 and the other vehicles 40 are explained as features of the vehicle 10, including both the own vehicle 30 and the other vehicles 40.
The roadside unit 20 may observe observation targets such as vehicles, objects, and pedestrians 50 on the road in a predetermined area. The roadside unit 20 may be placed near an intersection where multiple roads 60 (roadways) intersect to observe the road surface, or may be arranged on the roadside away from an intersection. In FIG. 1, one roadside unit 20 is arranged at the intersection; however, a plurality of roadside units 20 may be arranged. For example, roadside units 20 may be arranged in a number corresponding to the number of roads 60 intersecting at an intersection, and each roadside unit 20 may be arranged so that at least one road 60 associated with it in advance by the installer is included in its detection range.
In the traffic support system, the roadside unit 20 and the vehicles 10 traveling on the roads 60 may perform wireless communication with each other. A plurality of vehicles 10 may also perform wireless communication with each other.
The roadside unit 20 may notify the vehicles 10 of various information described later. A vehicle 10 may notify other vehicles 10 and the roadside unit 20 of various information described later. In a vehicle 10, the various information acquired from the roadside unit 20 and from other vehicles 10 may be used to assist the driver of that vehicle 10 in driving safely.
As described above, the traffic support system may assist the driver of the vehicle 10 in driving safely. The vehicle 10 is, for example, an automobile, but is not limited to an automobile and may be a motorcycle, a bus, a streetcar, a bicycle, or the like.
Details of the roadside unit 20 and the vehicle 10 that make up the traffic support system are described below.
The roadside unit 20 may be fixed to a structure tall enough to capture an outdoor scene including the road 60, such as a traffic signal, utility pole, or streetlight near an intersection where roads 60 cross. As shown in FIG. 2, the roadside unit 20 may include a sensor 21. The sensor 21 detects information indicating the state of a detected object. The roadside unit 20 may add the detection time to the information detected by the sensor 21.
The roadside unit 20 may periodically transmit the information indicating the state of the detected object to its surroundings.
The information processing device 11 of the present disclosure may be mounted on the vehicle 10. The vehicle 10 may further be equipped with a sensor 12. The information processing device 11 includes a receiving unit 13 and a processing unit 14, and may include a control unit 15.
The sensor 12 may detect surrounding objects as detected objects. A detected object may be anything that should be considered when driving an individual vehicle 10, such as a vehicle 10 on the road, a fixed object, or a person. The sensor 12 may generate information indicating the state of a detected object at predetermined intervals, and may add the acquisition time to that information. The acquisition time may be obtained from a timer built into the processing unit 14. The information indicating the state of a detected object includes the position (latitude, longitude, and altitude), speed, traveling direction, acceleration, type, reliability, and the like of the detected object. When the detected object is a vehicle, the information may also include the shift position, steering angle, and the like.
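As a concrete illustration, the detected object state described above could be held in a record like the following minimal Python sketch. All field names and types here are hypothetical choices made for this example; the disclosure does not prescribe a data layout.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DetectedObjectState:
        # Fields mirror the state information listed above; names are illustrative.
        latitude: float                # degrees
        longitude: float               # degrees
        altitude: float                # meters
        speed: float                   # meters per second
        heading: float                 # traveling direction, degrees clockwise from north
        acceleration: Optional[float]  # m/s^2; None when unknown
        object_class: str              # recognition class, e.g. "ordinary_vehicle"
        reliability: float             # confidence of the position estimate
        acquired_at: float             # acquisition time, seconds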
The sensor 12 may include, for example, an imaging device such as a visible light camera, a FIR (Far Infrared Rays) camera, or a stereo camera, and a ranging device such as an infrared radar, a millimeter wave radar, or LiDAR (light detection and ranging). Based on image analysis of the images captured by the imaging device, the position of a detected object, its presence or absence, its type, and its traveling direction can be generated. By image analysis of images captured successively at different times, the speed and acceleration of the detected object can be generated. Based on the distances of object points detected by the ranging device, the position of a detected object, its presence or absence, and the type of detected object present can be generated. Based on distances detected successively at different times, the speed and acceleration of the detected object can be generated.
The sensor 12 may include a speed sensor that detects the speed of the vehicle 10 and a positioning device, such as a GNSS (Global Navigation Satellite System) receiver, that detects the spatial position of the vehicle 10 in real space.
The receiving unit 13 receives, from a first transmitting device, first information indicating the state of a detected object detected at a first acquisition time, together with the first acquisition time. The receiving unit 13 receives, from a second transmitting device, second information indicating the state of a detected object detected at a second acquisition time, together with the second acquisition time. The first transmitting device and the second transmitting device may include other vehicles 10 and roadside units 20.
The receiving unit 13 may include a communication interface capable of communicating with the first transmitting device and the second transmitting device. The "communication interface" in the present disclosure may include, for example, a wireless communication device. The wireless communication device may include devices conforming to standards including RC-005 and IEEE 802.11p, and includes at least one antenna.
The processing unit 14 includes one or more processors and a memory. A processor may include at least one of a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. A dedicated processor may include an application-specific integrated circuit (ASIC). A processor may include a programmable logic device (PLD), and the PLD may include an FPGA (Field-Programmable Gate Array). The processing unit 14 may include at least one of an SoC (System-on-a-Chip) and a SiP (System In a Package) in which one or more processors cooperate.
The processing unit 14 may transmit information indicating the state of the own vehicle 30, acquired by the sensor 12 of the vehicle 10, to other vehicles 10. The transmitted information becomes first information or second information for those other vehicles 10.
The processing unit 14 processes the first information and the first acquisition time, as well as the second information and the second acquisition time, received by the receiving unit 13. Specific examples of this processing are described below.
<Prediction of information on detected objects>
From the first information, the processing unit 14 predicts the state of the detected object in the first information at a third time t3 after the first acquisition time t1 and the second acquisition time t2. From the second information, the processing unit 14 predicts the state of the detected object in the second information at the third time t3, which is, for example, the current time. An example of the prediction procedure for the first information is described below. A similar prediction procedure is performed for the second information.
Note that the processing unit 14 may perform the processing described below only when the difference between the first acquisition time t1 and the second acquisition time t2 is equal to or less than a predetermined threshold.
The processing unit 14 may extract, from the first information, the position p1, speed v1, acceleration a, and traveling direction θ1 of the detected object at the first acquisition time t1.
The processing unit 14 predicts the state of the detected object in the first information at the third time t3. The processing unit 14 may predict the speed v3 of the detected object at the third time t3, and may predict the travel distance d of the detected object from the first acquisition time t1 to the third time t3:
  v3 = v1 + a×(t3−t1)   (1)
  d = v1×(t3−t1) + a×(t3−t1)²/2   (2)
When the acceleration a in equation (1) is unknown, the following may be used instead:
  v3 = v1   (1)'
Based on the predicted travel distance d and the traveling direction θ1 of the detected object at the first acquisition time t1, the processing unit 14 may calculate the displacement in latitude and longitude of the detected object from the first acquisition time t1 to the third time t3. The processing unit 14 may predict the position p3 of the detected object at the third time t3 from the position p1 at the first acquisition time t1 and the calculated displacement.
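As a rough sketch of this prediction step, the following Python code applies equations (1) and (2), falls back to equation (1)' when the acceleration is unknown, and converts the travel distance into a latitude/longitude displacement using a local flat-earth approximation. The approximation, the function name, and the argument conventions are assumptions made for illustration only.

    import math

    EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, used by the flat-earth approximation

    def predict_state(p1_lat, p1_lon, v1, a, theta1_deg, t1, t3):
        """Predict the speed v3 and position p3 at time t3 from the state at time t1.
        theta1_deg is the traveling direction in degrees clockwise from north.
        Pass a=None when the acceleration is unknown (equation (1)' is then used)."""
        dt = t3 - t1
        if a is None:
            v3 = v1                          # equation (1)'
            d = v1 * dt                      # equation (2) with a = 0
        else:
            v3 = v1 + a * dt                 # equation (1)
            d = v1 * dt + a * dt * dt / 2.0  # equation (2)
        theta = math.radians(theta1_deg)
        d_north = d * math.cos(theta)        # displacement toward north, meters
        d_east = d * math.sin(theta)         # displacement toward east, meters
        p3_lat = p1_lat + math.degrees(d_north / EARTH_RADIUS_M)
        p3_lon = p1_lon + math.degrees(
            d_east / (EARTH_RADIUS_M * math.cos(math.radians(p1_lat))))
        return v3, (p3_lat, p3_lon)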
The processing unit 14 may predict the distance dr between the position p3 of the detected object at the third time t3 and the position of the vehicle 10 at the third time t3. From the position p3 of the detected object and the position of the vehicle 10, the processing unit 14 may predict the direction φ3 of the detected object as seen from the own vehicle 30 at the third time t3. The processing unit 14 may calculate the angle Δ between the direction φ3 of the detected object and the traveling direction of the vehicle 10; the direction φ3 and the traveling direction of the vehicle 10 may be expressed as azimuth angles. When the difference Δ is equal to or greater than a first angle threshold, the processing unit 14 may determine that the detected object is in a blind spot of the vehicle 10, and may notify the control unit 15 of the determination result. The processing unit 14 may notify the control unit 15 of the determination result only when the difference Δ is equal to or greater than a second angle threshold and the relative distance dr is equal to or less than a first distance threshold.
<Deletion of detected object information>
The processing unit 14 may predict the information of the vehicle 10 at the third time t3 from information of the vehicle 10 acquired at a fourth acquisition time t4 before the third time t3. Further based on the information of the vehicle 10 at the third time t3, the processing unit 14 may determine whether to delete the first information and the second information. Examples of specific determination procedures are described below. The processing unit 14 may delete detected object information before transmitting it to the control unit 15. The deletion may be performed before the prediction of detected object information described above, or after the determination of the identity of detected objects described later.
The detected object information needed by the own vehicle 30 may differ depending on, for example, the manufacturer of the own vehicle 30. A user or the like may therefore be able to specify the determination procedure through an external interface or the like provided on the processing unit 14. The determination procedure may be specified while driving, according to driving conditions, traffic congestion, and the like.
When the processing unit 14 acquires detected object information from the receiving unit 13, the processing unit 14 may start the determination procedure.
(Judgment procedure 1)
When the processing unit 14 determines that a detected object is outside an information maintenance range A at the third time t3, it may delete the information on that detected object. A user or the like may be able to specify the information maintenance range A. The information maintenance range A may include a range Af on the traveling-direction side of the vehicle 10 or a range Ab on the side opposite to the traveling direction of the vehicle 10 (see FIG. 3). The range Af may be a semicircle of radius rf centered on the widthwise center of the front edge of the vehicle 10, and the range Ab may be a semicircle of radius rb centered on the same point. The radius rf or rb may be, for example, 300, 250, 200, 150, 100, or 50 meters. With a radius rf of about 240 meters or more, the vehicle 10 is considered able to turn right at a T-junction without decelerating when no other vehicle 10 is present in the traveling direction.
When the information maintenance range A is not specified, the processing unit 14 need not delete the detected object information.
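The containment test for the information maintenance range A could look like the following sketch, which works in a local east-north frame with the front-edge center of the vehicle at the origin. The frame, the helper function, and its names are assumptions for illustration.

    import math

    def in_maintenance_range(rel_east, rel_north, heading_deg, rf, rb):
        """Return True when a detected object at (rel_east, rel_north) meters,
        relative to the widthwise center of the vehicle's front edge, lies inside
        range Af (forward semicircle, radius rf) or range Ab (rearward semicircle,
        radius rb)."""
        theta = math.radians(heading_deg)
        # Component of the relative position along the vehicle's heading.
        along = rel_east * math.sin(theta) + rel_north * math.cos(theta)
        dist = math.hypot(rel_east, rel_north)
        if along >= 0.0:
            return dist <= rf  # forward semicircle Af
        return dist <= rb      # rearward semicircle Ab

Information on an object for which this test returns False would then be a candidate for deletion under judgment procedure 1.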
(Judgment procedure 2)
At the third time t3, the processing unit 14 may calculate the distance dt (see FIG. 4) between the position of the widthwise center of the front edge of the own vehicle 30 and the position of the other vehicle 40 projected onto the traveling direction of the own vehicle 30 (hereinafter, the "traveling-direction distance"). The processing unit 14 may determine whether the other vehicle 40 is on the traveling-direction side of the own vehicle 30 or on the opposite side.
A threshold for the traveling-direction distance may be set for each of the traveling-direction side and the opposite side of the own vehicle 30 (hereinafter, the "traveling-direction-side threshold" and the "opposite-side threshold"). The processing unit 14 may delete the information on a detected object when its traveling-direction distance exceeds the threshold for the side on which it is present. The traveling-direction-side threshold may be set to at least the traveling-direction distance to an intersection ahead of the own vehicle 30. The opposite-side threshold may be set to a distance that is at least large enough not to hinder another vehicle 40, such as an emergency vehicle, moving in the same direction as the own vehicle 30 at a higher speed, and that is within the range in which the own vehicle 30 can sufficiently detect that other vehicle 40.
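The traveling-direction distance dt is the projection of the other vehicle's relative position onto the own vehicle's heading, so it can be computed with a dot product, as in the following sketch; the sign convention and threshold handling are assumptions made for illustration.

    import math

    def traveling_direction_distance(rel_east, rel_north, heading_deg):
        """Signed traveling-direction distance dt of another vehicle, given its
        position relative to the front-edge center of the own vehicle (meters)
        and the own vehicle's heading (degrees clockwise from north). A positive
        value means the traveling-direction side; negative, the opposite side."""
        theta = math.radians(heading_deg)
        return rel_east * math.sin(theta) + rel_north * math.cos(theta)

    def should_delete_by_dt(dt, forward_threshold, backward_threshold):
        # Delete when the distance exceeds the threshold for the side
        # on which the detected object is present.
        if dt >= 0.0:
            return dt > forward_threshold
        return -dt > backward_threshold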
(Judgment procedure 3)
At the third time t3, the processing unit 14 may determine whether the other vehicle 40 is visible from the own vehicle 30 (that is, close to the own vehicle 30). Another vehicle 40 that is visible from the own vehicle 30 may pose a danger to the own vehicle 30. When the other vehicle 40 is not visible, the processing unit 14 may determine, at the third time t3, whether the other vehicle 40 is dangerous to the own vehicle 30. The processing unit 14 may delete information on detected objects, including other vehicles 40, that are neither visible from the own vehicle 30 nor determined to be dangerous.
The determination of visibility may be based on the position of the other vehicle 40 relative to the own vehicle 30. The determination of danger may be based on the position relative to the own vehicle 30 and on the behavior of the other vehicle 40. Examples of procedures for determining visibility and for determining whether another vehicle 40 is dangerous to the own vehicle 30 are described below.
The processing unit 14 may partition the surroundings of the own vehicle 30 into a grid using a plurality of equally spaced straight lines parallel to the traveling direction of the own vehicle 30 and a plurality of equally spaced straight lines orthogonal to it (see FIG. 5). For example, the processing unit 14 may partition the surroundings of the own vehicle 30 into 5×5 = 25 regions including the region of the own vehicle 30: two rows each at 10-meter intervals ahead of and behind the own vehicle 30, and two columns each at intervals of 3 to 3.5 meters (one lane width) to the left and right of the own vehicle 30. In the following, the frontmost, leftmost of the 25 regions is designated region 1. The regions from region 1 to the right end of the same row are designated regions 2 through 5 in order. The region immediately behind region 1 is designated region 6, and the regions from region 6 to the right end of the same row are designated regions 7 through 10 in order. Regions 11 through 25 are designated in the same manner. The own vehicle 30 is located in region 13.
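Mapping a relative position to one of the 25 region numbers can be done with simple rounding, as in the following sketch. The cell sizes follow the example above; the function and its rounding behavior at cell boundaries are illustrative assumptions.

    def region_index(rel_forward, rel_right, cell_len=10.0, cell_width=3.25):
        """Map a position relative to the own vehicle (meters; rel_forward positive
        ahead, rel_right positive to the right) to the region number 1 to 25 of the
        5x5 grid described above, or None when it falls outside the grid."""
        # Column 0 is two lanes to the left, column 4 two lanes to the right.
        col = 2 + round(rel_right / cell_width)
        # Row 0 is two cells ahead, row 4 two cells behind.
        row = 2 - round(rel_forward / cell_len)
        if 0 <= row <= 4 and 0 <= col <= 4:
            return row * 5 + col + 1  # region 13 is the own vehicle's own cell
        return None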
The processing unit 14 may determine that another vehicle 40 present in region 12 or 14, adjacent to the left or right of the own vehicle 30, is visible from the own vehicle 30.
The processing unit 14 may determine the danger of another vehicle 40 located in region 8, adjacent to and ahead of the own vehicle 30. When the angle between the traveling direction of the own vehicle 30 and the traveling direction of the other vehicle 40 is equal to or greater than a third angle threshold, the processing unit 14 may determine that the other vehicle 40 is changing lanes away from the own vehicle 30, and may determine that such a lane-changing vehicle is dangerous to the own vehicle 30. When the other vehicle 40 changes lanes, its driver may become distracted and pay insufficient attention to the own vehicle 30. Also, when the acceleration of the other vehicle 40 is negative (the other vehicle 40 is decelerating) and is equal to or less than a first acceleration threshold, the processing unit 14 determines that the other vehicle 40 is rapidly approaching the own vehicle 30, and may determine that such a vehicle is dangerous to the own vehicle 30.
The processing unit 14 may determine the danger of another vehicle 40 located in region 3, two regions ahead of the own vehicle 30. When the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Also, when the acceleration of the other vehicle 40 is negative and equal to or less than a second acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching, and therefore dangerous to, the own vehicle 30. Since the other vehicle 40 is some distance from the own vehicle 30, the absolute value of the second acceleration threshold may be larger than that of the first acceleration threshold.
The processing unit 14 may determine the danger of another vehicle 40 located in region 18, adjacent to and behind the own vehicle 30. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Also, when the acceleration of the other vehicle 40 is positive (the other vehicle 40 is accelerating) and is equal to or greater than a third acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching, and therefore dangerous to, the own vehicle 30.
The processing unit 14 may determine the danger of another vehicle 40 located in region 23, two regions behind the own vehicle 30. When the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. When the acceleration of the other vehicle 40 is positive and equal to or greater than a fourth acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching, and therefore dangerous to, the own vehicle 30. Since the other vehicle 40 is some distance from the own vehicle 30, the absolute value of the fourth acceleration threshold may be larger than that of the third acceleration threshold.
The processing unit 14 may determine the danger of another vehicle 40 located in region 7 or 9, one region ahead of and one region to the left or right of the own vehicle 30. When the angle between the traveling direction of the other vehicle 40 and that of the own vehicle 30 is equal to or greater than a fourth angle threshold, and the traveling direction of the other vehicle 40 points in the same direction as the vector from the other vehicle 40 to the own vehicle 30, the processing unit 14 may determine that the other vehicle 40 is changing lanes toward the own vehicle 30, and may determine in this case that the other vehicle 40 is dangerous to the own vehicle 30. Danger can arise when another vehicle 40 changes into the same lane as the own vehicle 30. The processing unit 14 may also determine that the other vehicle 40 is dangerous to the own vehicle 30 when its acceleration is negative and equal to or less than a fifth acceleration threshold. The processing unit 14 may make similar danger determinations for other vehicles 40 located in region 2 or 4, two regions ahead of and one region to the left or right of the own vehicle 30.
The processing unit 14 may determine the danger of another vehicle 40 located in region 17 or 19, one region behind and one region to the left or right of the own vehicle 30. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes toward the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. The processing unit 14 may also determine that the other vehicle 40 is dangerous when its acceleration is positive and equal to or greater than a sixth acceleration threshold. The processing unit 14 may make similar danger determinations for other vehicles 40 located in region 22 or 24, two regions behind and one region to the left or right of the own vehicle 30.
The processing unit 14 may determine the danger of another vehicle 40 located in region 1, 5, 6, 10, 11, 15, 16, 20, 21, or 25, two regions to the left or right of the own vehicle 30. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes toward the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Since the other vehicle 40 is two lanes away from the own vehicle 30, the fourth angle threshold used to determine that the other vehicle 40 is closing rapidly on the own vehicle 30 may be larger than in the cases described above.
The processing unit 14 may determine that other vehicles 40 located in regions farther from the own vehicle 30 are not dangerous to the own vehicle 30.
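The region-by-region rules above could be summarized in a lookup table such as the following sketch; the disclosure leaves the concrete threshold values open, so the entries are placeholders that only record which rule applies where.

    # Hypothetical summary of the danger rules above: for each region, the
    # lane-change direction that counts as dangerous and the acceleration
    # condition that counts as rapid approach.
    DANGER_RULES = {
        8:  ("away",   "negative, <= 1st acceleration threshold"),
        3:  ("away",   "negative, <= 2nd acceleration threshold"),
        18: ("away",   "positive, >= 3rd acceleration threshold"),
        23: ("away",   "positive, >= 4th acceleration threshold"),
        7:  ("toward", "negative, <= 5th acceleration threshold"),
        9:  ("toward", "negative, <= 5th acceleration threshold"),
        2:  ("toward", "negative, <= 5th acceleration threshold"),
        4:  ("toward", "negative, <= 5th acceleration threshold"),
        17: ("toward", "positive, >= 6th acceleration threshold"),
        19: ("toward", "positive, >= 6th acceleration threshold"),
        22: ("toward", "positive, >= 6th acceleration threshold"),
        24: ("toward", "positive, >= 6th acceleration threshold"),
        # Regions two lanes away (1, 5, 6, 10, 11, 15, 16, 20, 21, 25): a lane
        # change toward the own vehicle only, judged with a larger angle threshold.
    }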
The processing unit 14 may further determine which other vehicles 40 are in blind spots of the own vehicle 30. For example, consider the case where a first other vehicle 40 is present in one of regions 7-9 and 17-19 (one region to the left or right of, and one region ahead of or behind, the own vehicle 30) and a second other vehicle 40 is present between the own vehicle and the first other vehicle 40. In this case, because the second other vehicle 40 hides the first other vehicle 40 (occlusion), the processing unit 14 may determine that the first other vehicle 40 is in a blind spot of the own vehicle 30. This determination may be based on the size or type of the first or second other vehicle 40. When another vehicle 40 is present in one of regions 1-6, 10, 11, 15, 16, or 20-25, two regions away from the own vehicle 30 in at least one of the left, right, forward, and rearward directions, the processing unit 14 may determine that the other vehicle 40 is in a blind spot of the own vehicle 30. The processing unit 14 may notify the control unit 15 of the determination result.
<Determination of identity of detected objects>
Based on the predicted states at the third time t3 of the object detected in the first information (hereinafter, the "first detected object") and the object detected in the second information (hereinafter, the "second detected object"), the processing unit 14 determines whether the first detected object and the second detected object are the same. An example of a specific determination procedure is described below. The first detected object and the second detected object may each be another vehicle 40, or may each be the own vehicle 30.
The processing unit 14 may determine the identity of the recognition classes (object types) of the first detected object and the second detected object. Recognition classes are, for example, large vehicle, ordinary vehicle, emergency vehicle, and pedestrian. When the recognition classes of the first detected object and the second detected object are similar, the processing unit 14 may determine that they have recognition-class identity.
The processing unit 14 may determine the identity of the speeds of the first detected object and the second detected object at the third time t3. Specifically, when the difference between the speed of the first detected object and that of the second detected object is equal to or less than a speed threshold, the processing unit 14 may determine that they have speed identity.
The processing unit 14 may determine the identity of the traveling directions of the first detected object and the second detected object at the third time t3. Specifically, when the angle between the traveling direction of the first detected object and that of the second detected object is equal to or less than a second angle threshold, the processing unit 14 may determine that they have traveling-direction identity.
The processing unit 14 may determine the identity of the positions of the first detected object and the second detected object at the third time t3. Specifically, the processing unit 14 may calculate the distance between the position of the first detected object and that of the second detected object at the third time t3, and may determine that they have position identity when the calculated distance is equal to or less than a second distance threshold.
Based on at least one of the recognition-class identity, speed identity, traveling-direction identity, and position identity determined between the first detected object and the second detected object at the third time t3, the processing unit 14 may determine whether the first detected object and the second detected object are the same.
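Combining the four identity tests could look like the following sketch, which reuses the DetectedObjectState record from the earlier sketch. The threshold values are placeholders, and requiring all four tests is only one possible policy, since the disclosure allows using at least one of them.

    import math

    def distance_m(o1, o2):
        """Approximate ground distance in meters between two states (flat-earth)."""
        lat = math.radians((o1.latitude + o2.latitude) / 2.0)
        dlat = math.radians(o2.latitude - o1.latitude) * 6_371_000.0
        dlon = math.radians(o2.longitude - o1.longitude) * 6_371_000.0 * math.cos(lat)
        return math.hypot(dlat, dlon)

    def are_same_object(obj1, obj2,
                        speed_threshold=1.0,       # m/s, placeholder
                        angle_threshold_deg=10.0,  # "second angle threshold", placeholder
                        distance_threshold=2.0):   # m, "second distance threshold", placeholder
        """Identity determination between two states predicted to the same third time t3."""
        same_class = obj1.object_class == obj2.object_class
        same_speed = abs(obj1.speed - obj2.speed) <= speed_threshold
        heading_diff = abs(obj1.heading - obj2.heading) % 360.0
        heading_diff = min(heading_diff, 360.0 - heading_diff)  # wrap around north
        same_heading = heading_diff <= angle_threshold_deg
        same_position = distance_m(obj1, obj2) <= distance_threshold
        return same_class and same_speed and same_heading and same_position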
When the processing unit 14 determines that the first detected object and the second detected object are the same, it selects either the first information or the second information. For the selection, the processing unit 14 may calculate a priority for each of the first information and the second information, and may select the first information or the second information based on those priorities. The processing unit 14 may transmit the selected information to the control unit 15. Details of the priority calculation procedure are described below.
From the viewpoint of accuracy and the like, information received from another vehicle 40 may be given a higher priority than information received from the roadside unit 20. However, when, for example, the "latitude/longitude reliability" value included in the information received from the other vehicle 40 is equal to or less than a reliability threshold, the information received from the roadside unit 20 may be given the higher priority. When the first transmitting device and the second transmitting device are both roadside units 20 but different ones, the processing unit 14 may give the higher priority to whichever of the first information and the second information has the later acquisition time.
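One possible realization of these priority rules is sketched below. The source labels, field names, and the reliability threshold are placeholders; each information record is assumed to carry its origin, its latitude/longitude reliability, and its acquisition time.

    def select_information(info1, info2, reliability_threshold=0.5):
        """Select the higher-priority of two pieces of information judged to
        describe the same detected object."""
        def priority(info):
            # Vehicle-to-vehicle information is preferred, unless its
            # latitude/longitude reliability is at or below the threshold.
            if info.source == "vehicle" and info.reliability > reliability_threshold:
                return 2
            return 1
        p1, p2 = priority(info1), priority(info2)
        if p1 != p2:
            return info1 if p1 > p2 else info2
        # Same priority (e.g. two different roadside units): prefer the later one.
        return info1 if info1.acquired_at >= info2.acquired_at else info2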
A specific example of the determination of the identity of detected objects performed by the processing unit 14 of the own vehicle 30 is described below with reference to FIG. 6.
The receiving unit 13 of the own vehicle 30 receives information 101-106 from the roadside unit 20, information 201 from another vehicle 40a, information 301 from another vehicle 40b, and information 401 from another vehicle 40c. Information 101 indicates the state of the other vehicle 40a, information 102 the state of the other vehicle 40b, and information 103 the state of the other vehicle 40c. Information 104 indicates the state of the own vehicle 30. Information 105 indicates the state of a pedestrian 50a, and information 106 the state of a pedestrian 50b.
The sensor 12 of the own vehicle 30 detects the position and other states of the own vehicle 30, and creates information 501 indicating the state of the own vehicle 30.
As described above, the processing unit 14 predicts the states of the detected objects in information 101-106, 201, 301, 401, and 501 at the current time (third time) t3.
Based on the predicted states of the detected objects at the third time t3, the processing unit 14 performs the identity determination. Specifically, based on at least one of recognition-class identity, speed identity, traveling-direction identity, and position identity among the objects detected in information 101-106, 201, 301, 401, and 501, the processing unit 14 determines whether the detected objects are the same. The processing unit 14 determines that information 101 and 201, information 102 and 301, information 103 and 401, and information 104 and 501 are each pairs of information detected from the same object.
Suppose here that the other vehicles 40a, 40b, and 40c are located relatively close to the own vehicle 30. In this case, information 201, 301, and 401, received from the other vehicles 40a, 40b, and 40c by vehicle-to-vehicle communication, and information 501, acquired by the own vehicle 30 itself, have high reliability. The processing unit 14 therefore selects information 201, 301, 401, and 501, and determines to transmit the selected information to the control unit 15.
The processing unit 14 further determines that information 105 and 106 are not information detected from the same objects as any other information. Based on this determination, the processing unit 14 determines to transmit information 105 and 106 to the control unit 15. The processing unit 14 may determine the identity of detected objects as described above.
The first information and the second information may include a vehicle ID. The control unit 15 may recognize multiple pieces of information having the same vehicle ID as information referring to the same vehicle. It can happen that, in a previous processing cycle, the processing unit 14 transmitted the first information to the control unit 15, and in the current cycle it acquires a first detected object and a second detected object relating to the same object as the detected object of that first information and performs the identity determination. Furthermore, the processing unit 14 may, in the current cycle, select the second information and transmit it to the control unit 15. Since the vehicle ID assigned to the first information by the first transmitting device may differ from the vehicle ID assigned to the second information by the second transmitting device, the control unit 15 may fail to recognize the first information and the second information as referring to the same vehicle. As a result, the continuity of the vehicle information provided to the control unit 15 may be interrupted and the trajectory may be lost.
To handle such a case, the processing unit 14 may rewrite the vehicle ID of the second information to be transmitted to the control unit 15 so that it is consistent with the first information. Specifically, the processing unit 14 may store the vehicle ID of the information (first information) transmitted to the control unit 15 for at least a certain period. When the processing unit 14 determines that the first detected object and the second detected object are the same, it may rewrite the vehicle ID of the second information to the stored vehicle ID of the first information.
The first information and the second information may further include an increment counter indicating the time series of the information. The processing unit 14 may store the increment counter of the information transmitted to the control unit 15 for at least a certain period, and may rewrite the increment counter as well when rewriting the vehicle ID.
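The bookkeeping for keeping the vehicle ID and increment counter consistent could be sketched as follows. The cache structure and all names are illustrative, as the disclosure only requires that the previously sent values be remembered for at least a certain period.

    class IdContinuityCache:
        """Remembers the vehicle ID and increment counter last sent to the
        control unit for each tracked object."""

        def __init__(self):
            # Maps an internal track key to the (vehicle_id, counter) last sent.
            self._sent = {}

        def rewrite_if_known(self, track_key, info):
            """If this track was previously reported under another vehicle ID,
            rewrite the new info so the control unit sees a continuous track."""
            if track_key in self._sent:
                old_id, old_counter = self._sent[track_key]
                info.vehicle_id = old_id
                info.counter = old_counter + 1  # keep the time series monotonic
            self._sent[track_key] = (info.vehicle_id, info.counter)
            return info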
The determination of the identity of objects detected in the first information and the second information has been described so far. As noted above, the sensor 12 of the own vehicle 30 may acquire information indicating the state of the own vehicle 30. The processing unit 14 may perform an identity determination between the information indicating the state of the own vehicle 30 acquired by the sensor 12 and the first information or second information received from the roadside unit 20, using processing that is the same as or similar to the identity determination described above.
When determining priorities, the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30 may be given a higher priority than the first information or second information received from the roadside unit 20. However, when, for example, the "latitude/longitude reliability" value included in the own vehicle's information is equal to or less than the reliability threshold, the first information or second information received from the roadside unit 20 may be given the higher priority.
<Further prediction of object information>
The object identity determination described above can take time. The processing unit 14 may therefore further predict the first information or the second information at a time (fifth time) t5, after the third time t3, at which the above processing has finished. An example of the prediction procedure for the first information is described below. The same or a similar prediction procedure may be performed for the second information, and also for the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30.
The processing unit 14 may predict the state of the object detected in the first information at the fifth time t5, taken as the current time. The prediction may be performed by a method that is the same as or similar to the method used for predicting the state at the third time.
The processing unit 14 may also further acquire, from another vehicle 40, information indicating the state of a detected object detected by that other vehicle 40. For example, suppose the other vehicle 40a transmits, by vehicle-to-vehicle communication, information indicating the state of the other vehicle 40b detected as a detected object, together with information 201, and the processing unit 14 acquires this information. In this case, in addition to information 102 acquired from the roadside unit 20 and information 301 acquired from the other vehicle 40b, the processing unit 14 may apply the identity determination, the deletion of detected object information, or the further prediction of object information to the information indicating the state of the other vehicle 40b acquired from the other vehicle 40a.
The control unit 15 may control the vehicle 10 based on the processing results of the processing unit 14. The control unit 15 may be an ECU (Electronic Control Unit). The control unit 15 may assist the driving of the vehicle 10 based on the detected object information received from the processing unit 14. In a configuration in which the vehicle 10 is steered by a driver, the control unit 15 may notify the driver of the acquired information or of information derived from it. For example, the control unit 15 may first narrow the acquired detected object state information down to information concerning situations that are urgent for the vehicle 10 (the own vehicle 30) and then notify the driver. In that case, the control unit 15 may narrow down the information to report by comparing the relative distance and relative speed between the own vehicle 30 and the detected object against predetermined criteria. In a configuration in which the vehicle 10 performs acceleration, deceleration, steering, and other maneuvers automatically or semi-automatically, the control unit 15 may use the acquired information to decide those maneuvers. The control unit 15 may thus be used to assist the driver of the vehicle 10 in driving safely.
The processing unit 14 of the present disclosure may be provided outside the vehicle 10. The processing unit 14 provided outside the vehicle 10 and the control unit 15 of the vehicle 10 may be connected over a network, with the processing unit 14 providing a cloud service to the vehicle 10.
Next, the processing executed by the information processing device 11 in the present embodiment is described with reference to the flowchart of FIG. 7. The information processing device 11 may execute this processing at regular intervals.
In step S100, referring to FIG. 2, the receiving unit 13 of the information processing device 11 receives detected object information and acquisition times. In step S101, the processing unit 14 of the information processing device 11 predicts the detected object information. In step S102, the processing unit 14 deletes detected object information. In step S103, the processing unit 14 determines the identity of detected objects. In step S104, the processing unit 14 further predicts the detected object information. In step S105, the processing unit 14 transmits the detected object information to the control unit 15. The processing then ends.
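The overall cycle of FIG. 7 could be wired together as in the following sketch. Each step is passed in as a callable (for example, one of the sketches shown earlier), since the disclosure describes the steps only in prose; the fixed processing delay is likewise a placeholder.

    def processing_cycle(received, own_info, now, predict, should_delete, merge, send):
        """One processing cycle following steps S100 to S105 of FIG. 7.
        `received` holds (info, acquisition_time) pairs from the receiving unit (S100)."""
        # S101: predict every detected object's state to the common third time t3.
        predicted = [predict(info, t, now) for info, t in received] + [own_info]
        # S102: delete information the own vehicle does not need (judgment procedures 1 to 3).
        kept = [info for info in predicted if not should_delete(info)]
        # S103: determine identity and keep only the higher-priority duplicate.
        merged = merge(kept)
        # S104: re-predict to the fifth time t5, after the above processing finished.
        refreshed = [predict(info, now, now + 0.05) for info in merged]  # 0.05 s: placeholder
        # S105: transmit the result to the control unit (ECU).
        send(refreshed)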
With the configuration described above, the processing unit 14 of the information processing device 11 of the present embodiment determines, based on the predicted states of the detected objects in the first information and the second information at the third time t3, whether the object detected in the first information and the object detected in the second information are the same, and selects either the first information or the second information based on the result of the determination. With this configuration, the amount of information processed by the information processing device 11 decreases, so the information processing device 11 can reduce its processing load.
Further, when the processing unit 14 of the information processing device 11 of the present embodiment determines that the objects detected in the first information and in the second information are the same, it transmits only one of the first information and the second information to the control unit 15. With this configuration, the amount of information received by the control unit 15 decreases, so the information processing device 11 can reduce the processing load of the control unit 15.
Further, the processing unit 14 of the information processing device 11 of the present embodiment determines which of the first information and the second information to transmit to the control unit 15 further based on the reliabilities of the first information and the second information. With this configuration, the control unit 15 receives only the more accurate information. As a result, the information processing device 11 can improve the reliability of its processing.
Further, the processing unit 14 of the information processing device 11 of the present embodiment predicts the information of the vehicle 10 at the third time from information of the vehicle 10 acquired at a fourth acquisition time before the third time, and determines whether to delete the first information and the second information further based on the information of the vehicle 10 at the third time. With this configuration, information unnecessary for the vehicle 10 is deleted, and the information processing device 11 can further reduce its processing load.
Further, the processing unit 14 of the information processing device 11 of the present embodiment predicts the first information at the third time from the first information, and the second information at the third time from the second information, when the difference between the first acquisition time and the second acquisition time is equal to or less than a predetermined threshold. With this configuration, when the first acquisition time and the second acquisition time differ greatly, the information processing device 11 does not perform the identity determination between the object detected in the first information and the object detected in the second information, because that determination would be difficult. The accuracy of the identity determination performed by the information processing device 11 can therefore be improved.
Further, the processing unit 14 of the information processing device 11 of the present embodiment further predicts the first information or the second information at a fifth time after the third time. With this configuration, the information processing device 11 can reduce the degradation in the accuracy of the first information or the second information caused, for example, by the time the processing unit 14 needs to determine the identity of objects.
Embodiments of the information processing device 11 have been described above. Embodiments of the present disclosure may also take the form of a method or program for implementing the device, or of a storage medium on which the program is recorded (for example, an optical disc, a magneto-optical disc, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, or a memory card).
The program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter; it may also take the form of, for example, a program module incorporated into an operating system. Furthermore, the program may, but need not, be configured so that all of its processing is performed only by a CPU on a control board. The program may be configured so that part or all of it is executed by another processing unit mounted on an expansion board or expansion unit added to the board as needed.
The drawings describing the embodiments of the present disclosure are schematic. The dimensional ratios and the like in the drawings do not necessarily match the actual ones.
Although the embodiments of the present disclosure have been described with reference to the drawings and examples, it should be noted that a person skilled in the art can make various modifications and alterations based on the present disclosure. Such modifications and alterations are therefore included within the scope of the present disclosure. For example, the functions included in each component can be rearranged so as not to be logically inconsistent, and a plurality of components can be combined into one or divided.
All of the features described in the present disclosure, and/or all of the steps of any method or process disclosed, may be combined in any combination, except combinations in which these features are mutually exclusive. Unless expressly contradicted, each feature described in the present disclosure may be replaced by an alternative feature serving the same, an equivalent, or a similar purpose. Accordingly, unless expressly stated otherwise, each feature disclosed is merely one example of a generic series of identical or equivalent features.
For example, in the embodiments of the present disclosure, the processing unit 14 and the control unit 15 of the vehicle 10 have been described as separate components. However, the configuration is not limited to this; the control unit 15 may include the processing unit 14. In that case, the control unit 15 may perform the identity determination described above for the processing unit 14, delete detected-object information according to the result of the determination, and then assist the driving of the vehicle 10 using the information that was not deleted.
Furthermore, the embodiments of the present disclosure are not limited to any of the specific configurations of the embodiments described above. The embodiments of the present disclosure extend to all novel features described in the present disclosure, or combinations thereof, and to all novel methods or process steps described, or combinations thereof.
In the present disclosure, terms such as "first" and "second" are identifiers for distinguishing components. Components distinguished by "first", "second", and the like in the present disclosure may have their numbers exchanged; for example, the first information may exchange the identifiers "first" and "second" with the second information. The exchange of identifiers is performed simultaneously, and the components remain distinguished after the exchange. The identifiers may be deleted, in which case the components are distinguished by their reference signs. The mere use of identifiers such as "first" and "second" in the present disclosure shall not be used to interpret the order of the components or as grounds for the existence of an identifier with a smaller number.
Reference Signs List
10 vehicle
11 information processing device
12 sensor
13 receiving unit
14 processing unit
15 control unit
20 roadside device
21 sensor
30 host vehicle
40, 40a, 40b, 40c other vehicle
50, 50a, 50b, 50c pedestrian
60 road
101, 102, 103, 104, 105, 106, 201, 301, 401, 501 information
A information maintenance range
Af range on the traveling-direction side
Ab range on the side opposite to the traveling direction
rf, rb radius
dt distance in the traveling direction
 

Claims (7)

1.  An information processing device comprising:
     a receiving unit that receives, from a first transmitting device, first information indicating a state of an object detected at a first acquisition time together with the first acquisition time, and receives, from a second transmitting device, second information indicating a state of an object detected at a second acquisition time together with the second acquisition time; and
     a processing unit that processes the first information and the first acquisition time as well as the second information and the second acquisition time received by the receiving unit, wherein
     the processing unit predicts, from the first information, a state of the detected object in the first information at a third time later than the first acquisition time and the second acquisition time, and predicts, from the second information, a state of the detected object in the second information at the third time, and
     the processing unit determines, based on the predicted states of the detected objects in the first information and the second information at the third time, whether the detected object in the first information and the detected object in the second information are the same, and selects either the first information or the second information based on a result of the determination.
2.  The information processing device according to claim 1, further comprising:
     a control unit that controls a vehicle, wherein
     the processing unit transmits only one of the first information and the second information to the control unit when the processing unit determines that the detected object in the first information and the detected object in the second information are the same.
3.  The information processing device according to claim 2, wherein
     the first information and the second information each include a reliability, and
     the processing unit determines which of the first information and the second information to transmit to the control unit based on the reliability of the first information and the reliability of the second information.
4.  The information processing device according to any one of claims 1 to 3, wherein
     the processing unit predicts information on a vehicle at the third time from information on the vehicle acquired at a fourth acquisition time earlier than the third time, and determines, based on the information on the vehicle at the third time, whether to delete the first information and the second information.
5.  The information processing device according to any one of claims 1 to 4, wherein
     the processing unit predicts, from the first information, the first information at the third time and predicts, from the second information, the second information at the third time when a difference between the first acquisition time and the second acquisition time is equal to or less than a predetermined threshold.
6.  The information processing device according to any one of claims 1 to 5, wherein
     the processing unit further predicts the first information or the second information at a fifth time later than the third time.
7.  An information processing method comprising:
     a receiving step of receiving first information indicating a state of an object detected at a first acquisition time and the first acquisition time from a first transmitting device that acquired the first information, and receiving second information indicating a state of an object detected at a second acquisition time and the second acquisition time from a second transmitting device that acquired the second information;
     a prediction step of predicting, from the first information, a state of the detected object in the first information at a third time later than the first acquisition time and the second acquisition time, and predicting, from the second information, a state of the detected object in the second information at the third time; and
     a determination step of determining, based on the states of the detected objects in the first information and the second information at the third time predicted in the prediction step, whether the detected object in the first information and the detected object in the second information are the same, and selecting either the first information or the second information based on a result of the determination.