WO2023145494A1 - Information processing device and information processing method - Google Patents

Information processing device and information processing method

Info

Publication number
WO2023145494A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
processing unit
time
detected object
Prior art date
Application number
PCT/JP2023/000844
Other languages
English (en)
Japanese (ja)
Inventor
真大 新田
隆彦 稲冨
輝 小山
史行 井澤
敏軒 杜
Original Assignee
京セラ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京セラ株式会社 filed Critical 京セラ株式会社
Priority to JP2023576787A priority Critical patent/JPWO2023145494A1/ja
Publication of WO2023145494A1 publication Critical patent/WO2023145494A1/fr

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Definitions

  • the present invention relates to an information processing device and an information processing method.
  • a technology is known in which a vehicle performs vehicle-to-vehicle communication with another vehicle and road-to-vehicle communication with a roadside unit to acquire information about detected objects detected by the other vehicle and the roadside unit (see Patent Document 1).
  • An information processing device includes a receiving unit that receives, from a first transmitting device, first information indicating the state of an object detected at a first acquisition time together with the first acquisition time, and receives, from a second transmitting device, second information indicating the state of an object detected at a second acquisition time together with the second acquisition time; and a processing unit that processes the first information, the first acquisition time, the second information, and the second acquisition time received by the receiving unit. The processing unit predicts, from the first information, the state of the detected object in the first information at a third time after the first acquisition time and the second acquisition time, and predicts, from the second information, the state of the detected object in the second information at the third time.
  • based on the predicted states at the third time, the processing unit determines whether the detected object in the first information and the detected object in the second information are the same, and selects either the first information or the second information based on the result of the determination.
  • the information processing method includes: a receiving step of receiving the first information and the first acquisition time from a first transmitting device that acquired first information indicating the state of an object detected at a first acquisition time, and receiving the second information and the second acquisition time from a second transmitting device that acquired second information indicating the state of an object detected at a second acquisition time; a prediction step of predicting, from the first information, the state of the detected object in the first information at a third time after the first acquisition time and the second acquisition time, and predicting, from the second information, the state of the detected object in the second information at the third time; and a determination step of determining, based on the states of the detected objects in the first information and the second information predicted for the third time in the prediction step, whether the detected object in the first information and the detected object in the second information are the same, and selecting either the first information or the second information based on the determination result.
  • FIG. 1 is a layout diagram showing the layout of an information processing device according to an embodiment in a vehicle
  • FIG. 2 is a functional block diagram showing a schematic configuration of the information processing apparatus of FIG. 1
  • Further figures show the position of a detected object whose information the processing unit deletes, the traveling direction distance of a detected object, and the area around the own vehicle divided into regions
  • FIG. 4 is a diagram showing a specific example of determination of identity of detected objects performed by a processing unit
  • FIG. 2 is a flowchart for explaining processing executed by the information processing apparatus of FIG. 1;
  • FIG. 1 shows a configuration example of a traffic support system including a vehicle 10 equipped with an information processing device according to one embodiment.
  • the traffic support system is, for example, a safe driving support communication system of Intelligent Transport Systems (ITS).
  • a safe driving support communication system is also called a driving safety support system or a driving safety support radio system.
  • the traffic support system includes vehicles 10 and roadside units 20.
  • the vehicles 10 may include a plurality of vehicles 10.
  • the following explanation will be given with an arbitrarily selected single vehicle 10 as the own vehicle 30.
  • vehicles 10 other than the own vehicle 30 are also referred to as other vehicles 40.
  • features that can be explained without distinguishing between the own vehicle 30 and the other vehicle 40 will be explained as features of the vehicle 10, including the own vehicle 30 and the other vehicle 40.
  • the roadside unit 20 may observe observation targets such as vehicles, objects, and pedestrians 50 on the road in a predetermined area.
  • the roadside unit 20 may be placed near an intersection where multiple roads 60 (roadways) intersect to observe the road surface.
  • the roadside unit 20 may be arranged on the roadside other than the intersection.
  • one roadside unit 20 is arranged at the intersection.
  • a plurality of roadside units 20 may be arranged.
  • roadside units 20 may be arranged in a number corresponding to the number of roads 60 intersecting at an intersection, and each roadside unit 20 may be arranged so that at least one road 60 associated with it in advance by the installer is included in its detection range.
  • the roadside unit 20 and the vehicle 10 running on the road 60 may perform wireless communication with each other.
  • a plurality of vehicles 10 may perform wireless communication with each other.
  • the roadside unit 20 may notify the vehicle 10 of various information, which will be described later.
  • the vehicle 10 may notify the other vehicle 10 and the roadside device 20 of various information, which will be described later.
  • the vehicle 10 may use various information acquired from the roadside unit 20 and other vehicles 10 to assist the driver of the vehicle 10 in driving safely.
  • the traffic assistance system may assist the driver of the vehicle 10 in driving safely.
  • the vehicle 10 is, for example, an automobile, but is not limited to an automobile, and may be a motorcycle, a bus, a streetcar, a bicycle, or the like.
  • the roadside unit 20 may be fixed outdoors to a structure having a height from which the scene including the road 60 can be imaged, such as a traffic signal near an intersection where roads 60 intersect, a utility pole, or a streetlight. As shown in FIG. 2, the roadside unit 20 may include a sensor 21. The sensor 21 detects information indicating the state of a detected object. The roadside unit 20 may add the detection time to the information detected by the sensor 21.
  • the roadside unit 20 may periodically transmit the information indicating the state of the detected object to its surroundings.
  • the information processing device 11 of the present disclosure may be mounted on the vehicle 10.
  • the vehicle 10 may further include a sensor 12.
  • the information processing device 11 includes a receiving unit 13 and a processing unit 14.
  • the information processing device 11 may include a control unit 15.
  • the sensor 12 may detect surrounding objects as detected objects.
  • a detected object may be a vehicle 10 on the road, a fixed object, a person, or the like that should be considered when driving the vehicle 10.
  • the sensor 12 may generate information indicating the state of the detected object at predetermined intervals.
  • the sensor 12 may add the acquisition time to the information indicating the state of the detected object.
  • the acquisition time may be acquired from a timer built into the processing unit 14.
  • the information indicating the state of the detected object includes the position (latitude, longitude and altitude), speed, traveling direction, acceleration, type, reliability, and the like of the detected object.
  • the information indicating the state of the detected object may include the shift position, steering angle, and the like.
  • the sensor 12 may include, for example, an imaging device such as a visible light camera, an FIR (Far Infrared Rays) camera, or a stereo camera, and a ranging device such as an infrared radar, a millimeter wave radar, or a LiDAR (light detection and ranging).
  • the position of the detected object, the presence or absence of the detected object, the type of the detected object, and its traveling direction can be generated based on image analysis of the image detected by the imaging device.
  • the velocity and acceleration of the detected object can also be generated, for example from the change in the detected position over time.
  • the position of the detected object, the presence or absence of the detected object, and the type of an existing detected object can be generated based on the distance of the object point detected by the ranging device; velocity and acceleration can likewise be derived from changes in that distance.
  • the sensor 12 may include a speed sensor that detects the speed of the vehicle 10 and a positioning device such as a GNSS (Global Navigation Satellite System) that detects the spatial position of the vehicle 10 in real space.
  • the receiving unit 13 receives, from the first transmitting device, the first information indicating the state of the detected object detected at the first acquisition time, together with the first acquisition time.
  • the receiving unit 13 receives, from the second transmitting device, the second information indicating the state of the detected object detected at the second acquisition time, together with the second acquisition time.
  • the first transmitting device and the second transmitting device may include other vehicles 10 and roadside units 20.
  • the receiving unit 13 may include a communication interface capable of communicating with the first transmitting device and the second transmitting device.
  • a "communication interface" in this disclosure may include, for example, a radio.
  • the radio may include radios conforming to standards including RC-005, IEEE 802.11p.
  • a radio includes at least one antenna.
  • the processing unit 14 includes one or more processors and memory.
  • the processor may include at least one of a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that specializes in specific processing.
  • a dedicated processor may include an Application Specific Integrated Circuit (ASIC).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the processing unit 14 may include at least one of SoC (System-on-a-Chip) and SiP (System In a Package) in which one or more processors cooperate.
  • the processing unit 14 may transmit information indicating the state of the own vehicle 30 acquired by the sensor 12 of the vehicle 10 to other vehicles 10.
  • the transmitted information becomes first information or second information for the other vehicles 10.
  • the processing unit 14 processes the first information and the first acquisition time, and the second information and the second acquisition time, received by the receiving unit 13.
  • a specific example of processing will be described below.
  • from the first information, the processing unit 14 predicts the state of the detected object in the first information at a third time t3 after the first acquisition time t1 and the second acquisition time t2. From the second information, the processing unit 14 predicts the state of the detected object in the second information at the third time t3, which is, for example, the current time.
  • An example of a prediction procedure for the first information is described below. A similar prediction procedure is performed for the second information.
  • processing unit 14 may perform the processing described below only when the difference between the first acquisition time t1 and the second acquisition time t2 is equal to or less than a predetermined threshold.
  • the processing unit 14 may extract the position p1, velocity v1, acceleration a, and traveling direction θ1 of the detected object at the first acquisition time t1 from the first information.
  • the processing unit 14 predicts the state of the detected object in the first information at the third time t3.
  • the processing unit 14 may predict the velocity v3 of the detected object at the third time t3.
  • the processing unit 14 may predict the movement distance d of the detected object from the first acquisition time t1 to the third time t3.
  • v3 = v1 + a × (t3 − t1)  (1)
  • d = v1 × (t3 − t1) + a × (t3 − t1)² / 2  (2)
  • the processing unit 14 can calculate the displacement in latitude and longitude of the detected object from the first acquisition time t1 to the third time t3 based on the predicted moving distance d and the traveling direction θ1 of the detected object at the first acquisition time t1.
  • the processing unit 14 may predict the position p3 of the detected object at the third time t3 from the position p1 at the first acquisition time t1 and the calculated displacement.
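  • as a concrete illustration of equations (1) and (2) and the position update, the following minimal Python sketch propagates a detected object's state to the third time t3. The type and field names are hypothetical, and a small-displacement flat-earth approximation is assumed for the latitude/longitude displacement, which the description does not specify.

```python
import math
from dataclasses import dataclass

@dataclass
class ObjectState:
    lat: float      # degrees
    lon: float      # degrees
    speed: float    # m/s (v1)
    accel: float    # m/s^2 (a)
    heading: float  # azimuth in degrees, 0 = north, clockwise (theta1)
    t: float        # acquisition time in seconds (t1)

EARTH_RADIUS = 6_378_137.0  # m, WGS-84 equatorial radius

def predict_state(s: ObjectState, t3: float) -> ObjectState:
    """Propagate a detected object's state from its acquisition time to t3
    using equations (1) and (2)."""
    dt = t3 - s.t
    v3 = s.speed + s.accel * dt                 # (1)
    d = s.speed * dt + s.accel * dt ** 2 / 2.0  # (2)
    # Decompose the moving distance d along the heading theta1 into
    # north/east components, then convert to a lat/lon displacement.
    north = d * math.cos(math.radians(s.heading))
    east = d * math.sin(math.radians(s.heading))
    dlat = math.degrees(north / EARTH_RADIUS)
    dlon = math.degrees(east / (EARTH_RADIUS * math.cos(math.radians(s.lat))))
    return ObjectState(s.lat + dlat, s.lon + dlon, v3, s.accel, s.heading, t3)
```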
  • the processing unit 14 may predict the distance dr between the position p3 of the detected object at the third time t3 and the position of the vehicle 10 at the third time t3.
  • the processing unit 14 may predict the direction θ3 of the detected object as viewed from the own vehicle 30 at the third time t3 from the position p3 of the detected object and the position of the vehicle 10.
  • the processing unit 14 may calculate the angle between the direction θ3 of the detected object and the traveling direction of the vehicle 10.
  • the direction θ3 of the detected object and the traveling direction of the vehicle 10 may be expressed as azimuth angles.
  • the processing unit 14 may determine that the detected object exists in a blind spot of the vehicle 10 when this angle is equal to or greater than the first angle threshold.
  • the processing unit 14 may notify the control unit 15 of the determination result.
  • the processing unit 14 may notify the control unit 15 of the determination result only when the angle is greater than or equal to the second angle threshold and the relative distance dr is less than or equal to the first distance threshold.
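  • a minimal sketch of this notification gate follows; the function name and the default threshold values are placeholders, since the description does not fix them.

```python
def should_notify(theta3_deg: float, own_heading_deg: float, dr_m: float,
                  second_angle_threshold_deg: float = 90.0,   # placeholder
                  first_distance_threshold_m: float = 100.0,  # placeholder
                  ) -> bool:
    """Notify the control unit only when the azimuth difference between the
    detected object's direction theta3 and the own vehicle's traveling
    direction is at least the second angle threshold and the relative
    distance dr is at most the first distance threshold."""
    diff = abs(theta3_deg - own_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # fold the difference into [0, 180]
    return (diff >= second_angle_threshold_deg
            and dr_m <= first_distance_threshold_m)
```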
  • the processing unit 14 may predict the information of the vehicle 10 at the third time t3 from the information of the vehicle 10 acquired at the fourth acquisition time t4 before the third time t3. The processing unit 14 may determine whether to delete the first information and the second information further based on the information of the vehicle 10 at the third time t3. An example of a specific determination procedure will be described below.
  • the processing unit 14 may delete information on a detected object before transmitting the information on the detected object to the control unit 15.
  • the processing unit 14 may delete the information on the detected object before the prediction of the information on the detected object described above or after determining the identity of the detected object described later.
  • the information on the detected object required by the own vehicle 30 may vary depending on the manufacturer of the own vehicle 30 and the like. Therefore, the user or the like may be able to specify the determination procedure from an external interface or the like provided in the processing unit 14.
  • the decision procedure may be specified during driving according to driving conditions, traffic conditions, and the like.
  • when an information maintenance range A is specified, the processing unit 14 may start the determination procedure.
  • the information maintenance range A may include a range Af on the traveling direction side of the vehicle 10 or a range Ab on the side opposite to the traveling direction of the vehicle 10 (see FIG. 3).
  • the range Af may be a semicircular shape centered on the widthwise center of the front edge of the vehicle 10 and having a radius rf.
  • the range Ab may be a semicircular shape centered on the widthwise center of the front edge of the vehicle 10 and having a radius rb.
  • the radius rf or the radius rb may be, for example, 300 meters, 250 meters, 200 meters, 150 meters, 100 meters, or 50 meters. By setting the radius rf to about 240 meters or more, the vehicle 10 is considered able to turn right at a T-junction without decelerating when no other vehicle 10 is present in its traveling direction.
  • the processing unit 14 does not have to delete the information on the detected object if the information maintenance range A is not specified.
  • the processing unit 14 may calculate the distance between the position of the widthwise center of the front edge of the own vehicle 30 and the position of the other vehicle 40 projected along the traveling direction of the own vehicle 30 (hereinafter referred to as the traveling direction distance dt; see FIG. 4).
  • the processing unit 14 may determine whether the other vehicle 40 is on the traveling direction side of the own vehicle 30 or on the opposite side to the traveling direction.
  • Thresholds for the traveling direction distance may be set on the traveling direction side and the opposite side of the own vehicle 30, respectively.
  • the processing unit 14 may delete the information on the detected object when the traveling direction distance exceeds the threshold on the side where the detected object exists.
  • the threshold on the traveling direction side may be set to be equal to or greater than the traveling direction distance to any intersection in front of the own vehicle 30.
  • the threshold on the opposite side may be set to a distance that does not interfere with another vehicle 40, such as an emergency vehicle, moving in the same traveling direction as the own vehicle 30 at a speed higher than that of the own vehicle 30, and within a range in which the own vehicle 30 can sufficiently detect the other vehicle 40.
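  • the threshold comparison could look like the following sketch; the function name and the default thresholds are placeholders.

```python
def should_delete(dt_m: float, ahead: bool,
                  forward_threshold_m: float = 300.0,   # placeholder
                  backward_threshold_m: float = 100.0,  # placeholder
                  ) -> bool:
    """Delete a detected object's information when its traveling direction
    distance dt exceeds the threshold for the side (ahead of or behind the
    own vehicle) on which the object lies."""
    limit = forward_threshold_m if ahead else backward_threshold_m
    return dt_m > limit
```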
  • the processing unit 14 may determine whether the other vehicle 40 is visible from the own vehicle 30 (that is, whether it is close to the own vehicle 30) at the third time t3. Other vehicles 40 that are visible from the own vehicle 30 may pose a danger to the own vehicle 30. When the other vehicle 40 is not visible, the processing unit 14 may determine whether the other vehicle 40 is dangerous to the own vehicle 30 at the third time t3. The processing unit 14 may delete the information about detected objects, including other vehicles 40, that are not visible from the own vehicle 30 and are determined not to be dangerous.
  • the determination of visibility may be based on the position of the other vehicle 40 with respect to the own vehicle 30.
  • the determination of danger may be based on the position relative to the own vehicle 30 and the behavior of the other vehicle 40.
  • an example of a procedure for determining visibility and whether the other vehicle 40 is dangerous to the own vehicle 30 will be described below.
  • with the area around the own vehicle 30 divided into a 5 × 5 grid of regions, the areas from area 1 at the upper left to the right end of the same row are defined as areas 2 to 5 in order.
  • the region one row below area 1 is defined as area 6.
  • areas from area 6 to the right end in the same row are defined as areas 7 to 10 in order.
  • areas 11 to 25 are determined in the same way thereafter. It should be noted that the own vehicle 30 is located in area 13.
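  • assuming a 5 × 5 row-major grid with area 1 at the upper left, the numbering can be expressed as in the following sketch (the function name is hypothetical).

```python
def area_number(col: int, row: int) -> int:
    """Area number (1-25) of a cell in the 5 x 5 grid around the own
    vehicle; col and row are 0-based from the upper-left region (area 1),
    so the own vehicle sits at col=2, row=2."""
    assert 0 <= col < 5 and 0 <= row < 5
    return row * 5 + col + 1

assert area_number(2, 2) == 13  # own vehicle
assert area_number(4, 1) == 10  # right end of the second row
```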
  • the processing unit 14 may determine that an other vehicle 40 existing in areas 12 and 14, adjacent to the left and right of the own vehicle 30, is visible from the own vehicle 30.
  • the processing unit 14 may determine the danger of an other vehicle 40 located in area 8, adjacent in front of the own vehicle 30. If the angle formed by the traveling direction of the own vehicle 30 and the traveling direction of the other vehicle 40 is greater than or equal to the third angle threshold, the processing unit 14 may determine that the other vehicle 40 is changing lanes away from the own vehicle 30. The processing unit 14 may determine that the other vehicle 40 changing lanes in that direction is dangerous for the own vehicle 30. When the other vehicle 40 changes lanes, the driver of the other vehicle 40 may become distracted and may not pay enough attention to the own vehicle 30.
  • when the acceleration of the other vehicle 40 is negative and equal to or less than the first acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30.
  • the processing unit 14 may determine that an other vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30.
  • the processing unit 14 may determine the danger of an other vehicle 40 located in area 3, two areas ahead of the own vehicle 30.
  • as described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous for the own vehicle 30.
  • the processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30 when the acceleration of the other vehicle 40 is negative and equal to or less than the second acceleration threshold.
  • the processing unit 14 may determine that an other vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30. Since the other vehicle 40 is some distance away from the own vehicle 30, the absolute value of the second acceleration threshold may be larger than the absolute value of the first acceleration threshold.
  • the processing unit 14 may determine the danger of an other vehicle 40 located in area 18, adjacent behind the own vehicle 30. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Further, when the acceleration of the other vehicle 40 is positive (the other vehicle 40 is accelerating) and equal to or greater than the third acceleration threshold, the processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30. The processing unit 14 may determine that an other vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30.
  • the processing unit 14 may determine the danger of an other vehicle 40 located in area 23, two areas behind the own vehicle 30. When the processing unit 14 determines that the other vehicle 40 is changing lanes away from the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. The processing unit 14 may determine that the other vehicle 40 is rapidly approaching the own vehicle 30 when the acceleration of the other vehicle 40 is positive and equal to or greater than the fourth acceleration threshold. The processing unit 14 may determine that an other vehicle 40 that is rapidly approaching the own vehicle 30 is dangerous for the own vehicle 30. Since the other vehicle 40 is some distance away from the own vehicle 30, the absolute value of the fourth acceleration threshold may be larger than the absolute value of the third acceleration threshold.
  • the processing unit 14 may determine the danger of an other vehicle 40 located in area 7 or 9, one area ahead of the own vehicle 30 and one area away in the left or right direction.
  • when the angle between the traveling direction of the other vehicle 40 and the traveling direction of the own vehicle 30 is equal to or greater than the fourth angle threshold, and the traveling direction of the other vehicle 40 and the vector from the other vehicle 40 to the own vehicle 30 point in the same direction, the processing unit 14 may determine that the other vehicle 40 is changing lanes to approach the own vehicle 30. In this case, the processing unit 14 may determine that the other vehicle 40 is dangerous for the own vehicle 30. If the other vehicle 40 changes lanes into the same lane as the own vehicle 30, danger may arise.
  • the processing unit 14 may also determine that the other vehicle 40 is dangerous to the own vehicle 30 when the acceleration of the other vehicle 40 is negative and the acceleration is equal to or less than the fifth acceleration threshold.
  • the processing unit 14 may similarly determine the risk of each of the other vehicles 40 positioned in areas 2 or 4 that are two areas ahead of the own vehicle 30 and one area away in the left or right direction.
  • the processing unit 14 may determine the danger of an other vehicle 40 located in area 17 or 19, one area behind the own vehicle 30 and one area away in the left or right direction. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes to approach the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. The processing unit 14 may also determine that the other vehicle 40 is dangerous to the own vehicle 30 when the acceleration of the other vehicle 40 is positive and equal to or greater than the sixth acceleration threshold. The processing unit 14 may similarly determine the risk of each other vehicle 40 located in area 22 or 24, two areas behind the own vehicle 30 and one area away in the left or right direction.
  • the processing unit 14 may determine the risk of other vehicles 40 located in areas 1, 5, 6, 10, 11, 15, 16, 20, 21, or 25, two areas away from the own vehicle 30 in the left or right direction. As described above, when the processing unit 14 determines that the other vehicle 40 is changing lanes to approach the own vehicle 30, it may determine that the other vehicle 40 is dangerous to the own vehicle 30. Since the other vehicle 40 is separated from the own vehicle 30 by two lanes, the fourth angle threshold for judging that the other vehicle 40 is rapidly approaching the own vehicle 30 may be larger than in the cases described above.
  • the processing unit 14 may determine that other vehicles 40 located further away from the own vehicle 30 are not dangerous for the own vehicle 30.
  • the processing unit 14 may further determine other vehicles 40 that are in a blind spot of the own vehicle 30. For example, a case is conceivable in which a first other vehicle 40 is present in one of areas 7 to 9 or 17 to 19, one area away from the own vehicle 30 in the front or rear direction, and a second other vehicle 40 exists between the own vehicle 30 and the first other vehicle 40. In this case, since the second other vehicle 40 hides the first other vehicle 40 (occlusion), the processing unit 14 may determine that the first other vehicle 40 exists in a blind spot of the own vehicle 30. The determination may be based on the size or type of the first other vehicle 40 or the second other vehicle 40.
  • the processing unit 14 may notify the control unit 15 of the determination result.
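  • the following sketch condenses two of the rules above for area 8, directly ahead of the own vehicle; the threshold defaults are placeholders and the function names are hypothetical.

```python
def lane_change_away(own_heading_deg: float, other_heading_deg: float,
                     third_angle_threshold_deg: float = 10.0,  # placeholder
                     ) -> bool:
    """A heading difference at or above the third angle threshold is read
    as the other vehicle changing lanes away from the own vehicle."""
    diff = abs(own_heading_deg - other_heading_deg) % 360.0
    return min(diff, 360.0 - diff) >= third_angle_threshold_deg

def dangerous_in_area_8(own_heading_deg: float, other_heading_deg: float,
                        other_accel: float,
                        first_accel_threshold: float = -3.0,  # placeholder
                        ) -> bool:
    """Danger rule for area 8: a lane change away from the own vehicle, or
    a rapid approach indicated by a deceleration at or below the first
    acceleration threshold."""
    rapid_approach = other_accel < 0 and other_accel <= first_accel_threshold
    return (lane_change_away(own_heading_deg, other_heading_deg)
            or rapid_approach)
```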
  • the processing unit 14 determines whether the object detected in the first information (hereinafter, the "first detected object") and the object detected in the second information (hereinafter, the "second detected object") are the same, based on the state of each predicted for the third time t3. An example of a specific determination procedure will be described below.
  • the first detected object and the second detected object may be other vehicles 40.
  • the first detected object and the second detected object may be the own vehicle 30.
  • the processing unit 14 may determine the identity of the recognition class (object type) between the first detected object and the second detected object.
  • recognition classes are, for example, large vehicle, ordinary vehicle, emergency vehicle, pedestrian, and the like. When the recognition classes of the first detected object and the second detected object match, the processing unit 14 may determine that the first detected object and the second detected object have recognition-class identity.
  • the processing unit 14 may determine the identity of the velocities of the first detected object and the second detected object at the third time t3. Specifically, when the difference between the speed of the first detected object and the speed of the second detected object is equal to or less than a speed threshold, the processing unit 14 may determine that the first detected object and the second detected object have speed identity.
  • the processing unit 14 may determine the identity of the traveling directions of the first detected object and the second detected object at the third time t3. Specifically, when the angle between the traveling direction of the first detected object and the traveling direction of the second detected object is equal to or smaller than the second angle threshold, the processing unit 14 may determine that the first detected object and the second detected object have traveling-direction identity.
  • the processing unit 14 may determine the identity of the positions of the first detected object and the second detected object at the third time t3. Specifically, the processing unit 14 may calculate the distance between the position of the first detected object and the position of the second detected object at the third time t3, and may determine that the first detected object and the second detected object have position identity when the calculated distance is equal to or less than the second distance threshold.
  • the processing unit 14 may determine whether the first detected object and the second detected object are the same based on at least one of the recognition-class identity, the speed identity, the traveling-direction identity, and the position identity determined between the first detected object and the second detected object at the third time t3.
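  • combining the four identity criteria, a minimal sketch might look as follows; it reuses the hypothetical ObjectState fields from the earlier prediction sketch, and all threshold defaults are placeholders.

```python
import math

EARTH_RADIUS = 6_378_137.0  # m

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; adequate over the short distances
    compared against the second distance threshold."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return EARTH_RADIUS * math.hypot(x, y)

def same_object(a, b, class_a: str, class_b: str,
                speed_threshold: float = 1.0,            # m/s, placeholder
                second_angle_threshold: float = 10.0,    # deg, placeholder
                second_distance_threshold: float = 2.0,  # m, placeholder
                ) -> bool:
    """Identity judgment between the first and second detected objects from
    their states predicted for the same third time t3."""
    if class_a != class_b:                        # recognition-class identity
        return False
    if abs(a.speed - b.speed) > speed_threshold:  # speed identity
        return False
    hdiff = abs(a.heading - b.heading) % 360.0    # traveling-direction identity
    if min(hdiff, 360.0 - hdiff) > second_angle_threshold:
        return False
    # position identity
    return distance_m(a.lat, a.lon, b.lat, b.lon) <= second_distance_threshold
```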
  • when the processing unit 14 determines that the first detected object and the second detected object are the same, it selects either the first information or the second information.
  • the processing unit 14 may calculate a priority for each of the first information and the second information for the selection.
  • the processing unit 14 may select the first information or the second information based on their priorities.
  • the processing unit 14 may transmit the selected information to the control unit 15. Details of the priority calculation procedure will be described below.
  • information received from other vehicles 40 may be given a higher priority than information received from roadside units 20 from the viewpoint of accuracy.
  • conversely, depending on the situation, the priority of the information received from the roadside unit 20 may be increased.
  • the processing unit 14 may give higher priority to whichever of the first information and the second information was acquired later.
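  • a minimal selection sketch under these rules follows; the field names and the numeric source ranking are assumptions, not part of the description.

```python
def select_information(first: dict, second: dict) -> dict:
    """Choose which of two reports judged to describe the same object to
    forward to the control unit: vehicle-sourced information outranks
    roadside-sourced information, and between equally ranked sources the
    later acquisition time wins."""
    rank = {"vehicle": 2, "roadside": 1}
    key_first = (rank[first["source"]], first["acquired_at"])
    key_second = (rank[second["source"]], second["acquired_at"])
    return first if key_first >= key_second else second
```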
  • the receiving unit 13 of the own vehicle 30 receives the information 101 to 106 from the roadside device 20.
  • the receiving unit 13 receives information 201 from the other vehicle 40a.
  • the receiving unit 13 receives information 301 from the other vehicle 40b.
  • the receiving unit 13 receives information 401 from the other vehicle 40c.
  • the information 101 is information indicating the state of the other vehicle 40a.
  • Information 102 is information indicating the state of the other vehicle 40b.
  • the information 103 is information indicating the state of the other vehicle 40c.
  • Information 104 is information indicating the state of the own vehicle 30.
  • the information 105 is information indicating the state of the pedestrian 50a.
  • Information 106 is information indicating the state of the pedestrian 50b.
  • the sensor 12 of the own vehicle 30 detects the position of the own vehicle 30 and the like.
  • the sensor 12 creates information 501 indicating the state of the own vehicle 30.
  • the processing unit 14 predicts the state of the detected object in the information 101 to 106, 201, 301, 401 and 501 at the current time (third time) t3.
  • the processing unit 14 performs identity determination processing for the detected objects based on the prediction of their states at the third time t3. Specifically, the processing unit 14 determines whether the detected objects in the information 101 to 106, 201, 301, 401, and 501 are identical based on at least one of recognition-class identity, speed identity, traveling-direction identity, and position identity. The processing unit 14 determines that the information 101 and 201, the information 102 and 301, the information 103 and 401, and the information 104 and 501 are each information detected from the same object.
  • the processing unit 14 selects the information 201, 301, 401, and 501.
  • the processing unit 14 determines to transmit the selected information to the control unit 15.
  • the processing unit 14 further determines that the information 105 and 106 is not information detected from the same object as any other information. Based on this determination, the processing unit 14 determines to transmit the information 105 and 106 to the control unit 15. As described above, the processing unit 14 may determine the identity of detected objects.
  • the first information and the second information may contain the vehicle ID.
  • the control unit 15 may recognize multiple pieces of information having the same vehicle ID as information indicating the same vehicle.
  • consider a case in which the processing unit 14 transmitted the first information to the control unit 15 in previous processing, and in the current processing acquires second information whose second detected object is determined to be identical to the detected object in the first information. In this processing, the processing unit 14 may select the second information and transmit it to the control unit 15.
  • the vehicle ID of the first information given by the first transmitting device may differ from the vehicle ID of the second information given by the second transmitting device, in which case the control unit 15 may not recognize the two pieces of information as referring to the same vehicle. As a result, the continuity of the information about that vehicle provided to the control unit 15 may be interrupted and its trajectory may be lost.
  • the processing unit 14 may rewrite the vehicle ID in the second information to be transmitted to the control unit 15 so that it matches the first information.
  • the processing unit 14 may store the vehicle ID of the information (first information) transmitted to the control unit 15 for at least a certain period of time.
  • the processing unit 14 may rewrite the vehicle ID of the second information to the stored vehicle ID of the first information.
  • the first information and the second information may further include an increment counter indicating the time series of the information.
  • the processing unit 14 may store the increment counter of the information transmitted to the control unit 15 for at least a certain period of time. The processing unit 14 may also rewrite the increment counter when rewriting the vehicle ID.
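  • the ID rewriting could be sketched as follows, assuming a simple per-object cache; the class and field names are hypothetical.

```python
class IdContinuityCache:
    """Remembers, for each tracked object, the vehicle ID and increment
    counter last sent to the control unit, so that a later report from a
    different transmitting device can be rewritten to continue the same
    track instead of starting a new one."""

    def __init__(self):
        self._sent = {}  # track key -> (vehicle_id, counter)

    def rewrite(self, track_key, report: dict) -> dict:
        """Return the report, with vehicle_id and counter rewritten when
        the same physical object was already reported under another ID."""
        if track_key in self._sent:
            vehicle_id, counter = self._sent[track_key]
            report = dict(report, vehicle_id=vehicle_id, counter=counter + 1)
        self._sent[track_key] = (report["vehicle_id"], report["counter"])
        return report
```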
  • the sensor 12 of the own vehicle 30 may acquire information indicating the state of the own vehicle 30.
  • the processing unit 14 may perform identity determination between the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30 and the first information or the second information received from the roadside unit 20, by processing the same as or similar to the identity determination between the detected objects in the first information and the second information described above.
  • the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30 may be given a higher priority than the first information or the second information received from the roadside unit 20.
  • alternatively, the priority of the first information or the second information received from the roadside unit 20 may be increased.
  • the process of determining the identity of objects described above may take a long time.
  • the processing unit 14 may further predict the first information or the second information at the time (fifth time) t5 after the third time t3 when the above processing is finished.
  • An example of a prediction procedure for the first information is described below.
  • a prediction procedure the same as or similar to it may be performed on the second information.
  • a prediction procedure the same as or similar to it may be performed on the information indicating the state of the own vehicle 30 acquired by the sensor 12 of the own vehicle 30.
  • the processing unit 14 may predict the state of the object detected in the first information at the fifth time t5, taking the fifth time t5 as the current time.
  • the prediction may be made in a manner the same as or similar to the state prediction method for the third time.
  • the processing unit 14 may further acquire, from the other vehicle 40, information indicating the state of a detected object detected by the other vehicle 40.
  • the other vehicle 40a carries out inter-vehicle communication of the information indicating the state of the other vehicle 40b detected as a detected object together with the information 201, and the processing unit 14 acquires this information.
  • the processing unit 14 may perform, on the information indicating the state of the other vehicle 40b acquired from the other vehicle 40a, the identity determination processing for detected objects, the deletion processing of detected object information, or the further prediction processing of detected object information described above.
  • the control unit 15 may control the vehicle 10 based on the processing result of the processing unit 14.
  • the controller 15 may be an ECU (Electronic Control Unit).
  • the control unit 15 may assist the driving of the vehicle 10 based on the detected object information received from the processing unit 14.
  • the control unit 15 may notify the driver of the acquired information or of information obtained by processing it.
  • the control unit 15 may process the acquired information indicating the state of the detected object to narrow it down to information related to an urgent situation for the vehicle 10 (the own vehicle 30), and then notify the driver. In that case, the control unit 15 may narrow down the information to be notified to the driver by comparing the relative distance and relative speed between the own vehicle 30 and the detected object with predetermined criteria.
  • in a configuration in which driving operations of the vehicle 10 such as acceleration, deceleration, and steering are performed automatically or semi-automatically, the control unit 15 may use the acquired information for determining those operations.
  • the control unit 15 may be used to assist the driver of the vehicle 10 in driving safely.
  • the processing unit 14 of the present disclosure may be provided outside the vehicle 10 .
  • the processing unit 14 provided outside the vehicle 10 and the control unit 15 of the vehicle 10 may be network-connected, and the processing unit 14 may provide the cloud service to the vehicle 10 .
  • the information processing device 11 may perform this process at regular intervals.
  • the receiving unit 13 of the information processing device 11 receives the information on the detected object and the acquisition time.
  • the processing unit 14 of the information processing device 11 predicts information about a detected object.
  • the processing unit 14 deletes the information on the detected object.
  • the processing unit 14 determines the identity of the detected object.
  • the processing unit 14 further predicts information on the detected object.
  • in step S105, the processing unit 14 transmits the information on the detected object to the control unit 15. Thereafter, the process ends.
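  • putting the steps together, one processing cycle might be sketched as follows; the helper callables stand in for the procedures described above, whose details the text leaves open, and only step S105 (transmission) is numbered in the text.

```python
import time

def process_cycle(reports, own_state, predict, should_delete,
                  merge_identical, submit):
    """One cycle mirroring the flowchart: predict the received reports to
    the current (third) time, delete out-of-range information, merge
    reports of identical objects, re-predict the survivors to a fifth time
    after the identity step, and transmit the result (step S105)."""
    t3 = time.time()
    predicted = [predict(r, t3) for r in reports]
    kept = [r for r in predicted if not should_delete(r, own_state)]
    merged = merge_identical(kept)
    t5 = time.time()  # identity determination may have taken a long time
    submit([predict(r, t5) for r in merged])
```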
  • as described above, based on the states of the detected object at the third time t3 predicted from the first information and the second information respectively, the processing unit 14 of the information processing apparatus 11 determines whether the detected object in the first information and the detected object in the second information are the same, and selects either the first information or the second information based on the determination result. With such a configuration, the amount of information processed by the information processing device 11 can be reduced, and the processing load of the information processing device 11 can be reduced.
  • when the processing unit 14 of the information processing apparatus 11 of the present embodiment determines that the detected objects in the first information and the second information are the same, only one of the first information and the second information is transmitted to the control unit 15.
  • the amount of information received by the control unit 15 is reduced, and the information processing device 11 can reduce the processing load of the control unit 15 .
  • the processing unit 14 of the information processing apparatus 11 of the present embodiment further determines which of the first information and the second information to transmit to the control unit 15 based on the reliability of the first information and the second information.
  • the control unit 15 can receive only highly accurate information.
  • the information processing apparatus 11 can improve the reliability of processing.
  • the processing unit 14 of the information processing device 11 of the present embodiment predicts the information of the vehicle 10 at the third time from the information of the vehicle 10 acquired at the fourth acquisition time before the third time, and determines whether to delete the first information and the second information further based on the information of the vehicle 10 at the third time. With such a configuration, information unnecessary to the vehicle 10 is deleted, and the processing load of the information processing device 11 can be further reduced.
  • the processing unit 14 of the information processing apparatus 11 predicts the first information at the third time from the first information, and predicts the second information at the third time from the second information.
  • if the first acquisition time and the second acquisition time differ, it is difficult for the information processing apparatus 11 to directly compare the detected object in the first information with the detected object in the second information. By aligning both to the third time, the accuracy of the identity determination performed by the information processing apparatus 11 can be improved.
  • the processing unit 14 of the information processing device 11 of the present embodiment further predicts the first information or the second information at the fifth time after the third time.
  • the information processing apparatus 11 can reduce deterioration in the accuracy of the first information or the second information due to the processing unit 14 needing time to determine the identity of the object, for example.
  • a storage medium on which the program is recorded, for example an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, or a memory card, can also be used.
  • the implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter.
  • the program may or may not be configured so that all processing is performed only in the CPU on the control board.
  • the program may be configured to be partially or wholly executed by another processing unit mounted on an expansion board or expansion unit added to the board as required.
  • in the above embodiment, the processing unit 14 and the control unit 15 in the vehicle 10 are described as separate configurations.
  • the configuration is not limited to this, and the control unit 15 may include the processing unit 14.
  • in that case, the control unit 15 may perform the identity determination processing of the processing unit 14 described above, delete the information of detected objects according to the result of the determination processing, and then assist the driving of the vehicle 10 using the information that has not been deleted.
  • Embodiments according to the present disclosure are not limited to any specific configuration of the embodiments described above. Embodiments of the present disclosure can extend to all novel features described in the present disclosure, or combinations thereof, or to all novel method or process steps described, or combinations thereof.
  • Descriptions such as "first" and "second" in this disclosure are identifiers for distinguishing the configurations. Configurations distinguished by descriptions such as "first" and "second" in this disclosure may have their numbers exchanged; for example, the first information and the second information can exchange the identifiers "first" and "second". The exchange of identifiers is done simultaneously, and the configurations remain distinct after the exchange. Identifiers may be deleted; configurations from which identifiers have been deleted are distinguished by reference signs. The descriptions of identifiers such as "first" and "second" in this disclosure shall not be used as a basis for interpreting the order of the configurations or the existence of identifiers with lower numbers.
  • Reference Signs List: 10 vehicle; 11 information processing device; 12 sensor; 13 receiving unit; 14 processing unit; 15 control unit; 20 roadside unit; 21 sensor; 30 own vehicle; 40, 40a, 40b, 40c other vehicle; 50, 50a, 50b, 50c pedestrian; 60 road; 101 to 106, 201, 301, 401, 501 information; A information maintenance range; Af range on the traveling direction side; Ab range on the side opposite to the traveling direction; rf, rb radius; dt traveling direction distance

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to an information processing device comprising a receiving unit and a processing unit. The receiving unit receives, from a first transmitting device, a first acquisition time and first information indicating the state of an object detected at the first acquisition time, and receives, from a second transmitting device, a second acquisition time and second information indicating the state of an object detected at the second acquisition time. The processing unit processes the first information, the first acquisition time, the second information, and the second acquisition time received by the receiving unit. The processing unit predicts, from the first information, the state of the detected object in the first information at a third time that is after the first acquisition time and the second acquisition time, and predicts, from the second information, the state of the detected object in the second information at the third time. Based on the predicted state at the third time of the detected object in each of the first information and the second information, the processing unit determines whether the detected object in the first information and the detected object in the second information are the same, and selects one of the first information and the second information based on the result of the determination.
PCT/JP2023/000844 2022-01-27 2023-01-13 Information processing device and information processing method WO2023145494A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023576787A JPWO2023145494A1 (fr) 2022-01-27 2023-01-13

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022011332 2022-01-27
JP2022-011332 2022-01-27

Publications (1)

Publication Number Publication Date
WO2023145494A1 true WO2023145494A1 (fr) 2023-08-03

Family

ID=87471325

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/000844 WO2023145494A1 (fr) 2022-01-27 2023-01-13 Information processing device and information processing method

Country Status (2)

Country Link
JP (1) JPWO2023145494A1 (fr)
WO (1) WO2023145494A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018211638A1 * 2017-05-17 2018-11-22 三菱電機株式会社 Object identification device, roadside device, and object identification method
WO2019003263A1 * 2017-06-26 2019-01-03 三菱電機株式会社 Driving assistance device and driving assistance system using said device
WO2021111858A1 * 2019-12-06 2021-06-10 パナソニック株式会社 Moving body detection method, roadside device, and on-board device

Also Published As

Publication number Publication date
JPWO2023145494A1 (fr) 2023-08-03

Similar Documents

Publication Publication Date Title
US10783789B2 (en) Lane change estimation device, lane change estimation method, and storage medium
CN108628300B (zh) Route determination device, vehicle control device, route determination method, and storage medium
US11433894B2 (en) Driving assistance method and driving assistance device
CN109426263B (zh) Vehicle control device, vehicle control method, and storage medium
US20190333373A1 (en) Vehicle Behavior Prediction Method and Vehicle Behavior Prediction Apparatus
CN110087964B (zh) Vehicle control system, vehicle control method, and storage medium
JP6269552B2 (ja) Vehicle travel control device
US20180056998A1 (en) System and Method for Multi-Vehicle Path Planning Technical Field
JP6219312B2 (ja) Method for determining the position of a vehicle in a traffic lane of a road, and method for detecting alignment and collision risk between two vehicles
CN110678912A (zh) Vehicle control system and vehicle control method
CN109398358B (zh) Vehicle control device, vehicle control method, and medium storing program
KR20200102004A (ko) Collision avoidance device, system, and method
JP2021099793A (ja) Intelligent traffic control system and control method therefor
US20220176952A1 (en) Behavior Prediction Method and Behavior Prediction Device for Mobile Unit, and Vehicle
JP4877364B2 (ja) Object detection device
CN110962744A (zh) Vehicle blind spot detection method and vehicle blind spot detection system
US20210224557A1 (en) Driving Support Method and Driving Support Device
WO2017104209A1 (fr) Driving assistance device
US11423780B2 (en) Traffic control system
JP7289821B2 (ja) Vehicle control based on reliability values calculated from infrastructure information
JPWO2011024237A1 (ja) Mobile radio communication device and inter-vehicle communication method
JP7043765B2 (ja) Vehicle travel control method and device
EP3912877B1 (fr) Driving assistance method and driving assistance device
JP6881001B2 (ja) Automatic travel control device
EP3855123B1 (fr) Map generation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23746708

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023576787

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE