WO2018070335A1 - Movement detection device and movement detection method - Google Patents

Movement detection device and movement detection method

Info

Publication number
WO2018070335A1
Authority
WO
WIPO (PCT)
Prior art keywords
future
fusion
vehicle
future position
detected
Prior art date
Application number
PCT/JP2017/036263
Other languages
English (en)
Japanese (ja)
Inventor
崇弘 馬場
Original Assignee
株式会社デンソー (Denso Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (Denso Corporation)
Publication of WO2018070335A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present disclosure relates to a movement detection device and a movement detection method for detecting the movement of an object.
  • A movement detection device is known that detects the position of an object ahead of the host vehicle using a captured image acquired by an image sensor or using an electromagnetic wave sensor, and that calculates the future position of the object from the time-series change of the detected position.
  • The future position is the position at which the object is predicted to arrive after a predetermined period.
  • The movement detection device predicts the movement direction of the object from a plurality of detection positions acquired at different times and calculates the future position of the object from the predicted movement direction.
  • Patent Document 1 discloses a technique that takes these detection-accuracy characteristics into account. Specifically, when the object detected by the electromagnetic wave sensor and the object detected from the captured image are the same object, a new fusion position is calculated by combining the azimuth of the object detected from the captured image with the relative distance of the object detected by the electromagnetic wave sensor. The newly calculated fusion position is then used as the detection position of the object. The fusion position locates the object more accurately than the detection result of the captured image or of the electromagnetic wave sensor alone, so the movement detection device can increase the accuracy of the future position by calculating it from fusion positions.
  • Because the condition for detecting the position of an object from the captured image differs from the condition for detecting it with the electromagnetic wave sensor, a time difference may occur between the timing at which the object is detected by the electromagnetic wave sensor and the timing at which it is detected from the captured image.
  • In that case, the movement detection device can calculate the fusion position only from the point at which the later of the two detections becomes available. This delay in calculating the fusion position in turn delays the start of the future-position calculation. For example, when control for avoiding a collision between the vehicle and an object is performed according to the future position, the time taken to calculate the future position can delay the execution of that collision-avoidance control.
  • An object of the present disclosure is to provide a movement detection device and a movement detection method that can advance the timing at which the future position of an object is calculated.
  • A movement detection device according to the present disclosure includes: an object determination unit that determines, from a first position at which an object ahead of the vehicle is detected by an electromagnetic wave sensor and a second position of the object detected from a captured image obtained by an image sensor imaging the area ahead of the vehicle, whether the two detections are the same object; a fusion position calculation unit that, for an object determined to be the same object, calculates a fusion position as the position of the object by fusing the first position and the second position; and a future position calculation unit that calculates the future position of the object from a plurality of fusion positions acquired at different times.
  • When the future position of the object is calculated from fusion positions obtained by fusing the first and second positions, a delay in acquiring either the first position or the second position can make it take a long time to accumulate as many fusion positions as the future-position calculation requires.
  • Therefore, if either the first position or the second position of the object has already been acquired before it is determined that the detections are the same object, the future position calculation unit calculates the future position of the object from the already acquired first or second positions together with the fusion position.
  • In this case, the calculation of the future position starts as soon as the fusion position is set, so the timing at which the future position is calculated can be advanced.
  • FIG. 1 is a configuration diagram illustrating a vehicle control device.
  • FIG. 2 is a diagram for explaining object detection.
  • FIG. 3 is a diagram for explaining the calculation of the future position of the object performed by the ECU.
  • FIG. 4 is a flowchart illustrating a process in which the ECU stores the first position and the second position.
  • FIG. 5 is a flowchart for explaining prediction of the future position of the object using the first position and the second position.
  • FIG. 6 is a flowchart for explaining the integration permission determination in step S26 of FIG.
  • FIG. 7 is a diagram for explaining the object pop-out determination.
  • FIG. 8 is a flowchart for explaining the integration permission determination in step S29 of FIG. 5.
  • FIG. 9 is a diagram for explaining the integration permission determination in step S26 in the second embodiment.
  • FIG. 10 is a flowchart for explaining the integration permission determination in step S26 in the third embodiment.
  • FIG. 11 is a diagram for explaining the integration permission determination in step S26 in the third embodiment.
  • FIG. 12 is a flowchart illustrating the process executed in step S31 in the fourth embodiment.
  • FIG. 13 is a diagram illustrating how the reference number of detection points is changed based on the reliability.
  • In the embodiments below, the movement detection device is applied as part of a vehicle control device that avoids or mitigates a collision between the host vehicle and an object ahead of it.
  • Parts that are the same or equivalent are denoted by the same reference numerals in the drawings, and the description of a reference numeral applies wherever it appears.
  • The vehicle control device 100 is mounted on a vehicle and detects the movement of an object located ahead of the vehicle. When there is a possibility that the object and the vehicle will collide, the device performs an operation for avoiding or mitigating the collision.
  • the vehicle control device 100 includes various sensors 30, an ECU (Electronic Control Unit) 20 that functions as a movement detection device, and a driving support device 40.
  • The various sensors 30 are connected to the ECU 20 and output their detection results for an object to the ECU 20.
  • The various sensors 30 include an electromagnetic wave sensor 31, an image sensor 32 that acquires a captured image, and an illuminance sensor 33 that detects the brightness around the vehicle.
  • Hereinafter, an object detected by the electromagnetic wave sensor 31 is referred to as an electromagnetic wave target, and an object detected from the captured image is referred to as an image target.
  • The electromagnetic wave sensor 31 transmits a directional transmission wave such as a millimeter wave and, from the reflected wave returned by an electromagnetic wave target, detects the position of the object and its speed relative to the host vehicle.
  • The first position Pr, the position of the object detected by the electromagnetic wave sensor 31, is detected as a position on an X-Y plane in which the lateral direction of the vehicle is the X direction and the traveling direction of the vehicle is the Y direction.
  • The first position Pr consists of a relative distance r1 from the host vehicle to the object and an azimuth θr centered on the host vehicle.
  • The tip of the host vehicle CS (the position where the electromagnetic wave sensor 31 is mounted) is set as the reference point Po.
  • The image sensor 32 is arranged at the front of the host vehicle CS, acquires a captured image of the area ahead of the host vehicle, and outputs the captured image to the ECU 20 at a predetermined cycle.
  • The image sensor 32 is configured by arranging imaging elements such as charge-coupled devices (CCDs) vertically and horizontally according to its resolution.
  • The captured image acquired by the image sensor 32 is formed of pixels corresponding to the resolution of the image sensor 32.
  • The image sensor 32 is described here as a monocular camera, but a stereo camera may be used instead.
  • The illuminance sensor 33 detects the brightness around the host vehicle CS.
  • The illuminance sensor 33 includes a detection unit such as a photodiode that detects brightness, and outputs a signal corresponding to the detection result of the detection unit to the ECU 20.
  • The ECU 20 is configured as a known computer including a CPU, ROM, and RAM.
  • The CPU executes programs stored in the ROM, thereby realizing the functions that calculate the future position of an object ahead of the host vehicle and determine, based on that future position, the possibility of a collision with the object.
  • The object determination unit 21 determines whether the detections are of the same object, based on the first position at which the electromagnetic wave sensor 31 detects an object ahead of the host vehicle and the second position of the object detected from the captured image obtained by the image sensor 32 imaging the area ahead of the host vehicle.
  • The object determination unit 21 includes an electromagnetic wave region setting unit 22 that sets an electromagnetic wave search region based on the first position, and an image region setting unit 23 that sets an image search region based on the second position.
  • The electromagnetic wave search region Rr is a region that, with the first position Pr as a reference, has a width in each of the distance direction and the azimuth direction corresponding to an assumed error set in advance from the characteristics of the electromagnetic wave sensor 31. For example, the electromagnetic wave search region Rr is set as a region expanded from the first position Pr (r1, θr) by the assumed error in the distance direction and the assumed angular error in the azimuth direction.
  • The image region setting unit 23 detects the second position based on the recognition result of the image target included in the captured image.
  • The image target is recognized from the captured image by matching processing using dictionaries registered in advance.
  • A dictionary is prepared for each type of image target, so the type of the image target is identified as well. Examples of image target types include pedestrians, bicycles, automobiles, and guardrails.
  • Among the pixels of the captured image recognized as the image target, the center point is detected as the second position.
  • The second position Pi consists of a relative distance r2 from the host vehicle and an azimuth θi relative to the host vehicle, and is detected as a position on the X-Y plane in the same manner as the first position Pr.
  • The image search region Ri is a region that, with the second position Pi as a reference, has a width in each of the distance direction and the azimuth direction corresponding to an assumed error set in advance from the characteristics of the image sensor 32. For example, it is set as a region expanded from the second position Pi (r2, θi) by the assumed error in the distance direction and the assumed angular error in the azimuth direction.
  • The object determination unit 21 determines that the electromagnetic wave target and the image target are the same object on the condition that the electromagnetic wave search region and the image search region have an overlapping area.
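  • The same-object condition above reduces to an interval-overlap test in polar coordinates. The following Python sketch illustrates one way to implement it; the class and function names, the assumed-error widths, and the example numbers are illustrative assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SearchRegion:
    """Search region around a detection in polar coordinates (range, azimuth)."""
    r_min: float
    r_max: float
    az_min: float  # radians
    az_max: float

def make_region(r: float, azimuth: float, dr: float, daz: float) -> SearchRegion:
    """Widen a detection by the sensor's assumed errors in range and azimuth."""
    return SearchRegion(r - dr, r + dr, azimuth - daz, azimuth + daz)

def same_object(a: SearchRegion, b: SearchRegion) -> bool:
    """Same-object condition: the two regions share an overlapping area."""
    return (a.r_min <= b.r_max and b.r_min <= a.r_max
            and a.az_min <= b.az_max and b.az_min <= a.az_max)

# Radar detection Pr(r1, θr): accurate in range, wider azimuth error.
radar = make_region(r=20.0, azimuth=0.05, dr=1.0, daz=0.04)
# Image detection Pi(r2, θi): accurate in azimuth, wider range error.
image = make_region(r=21.5, azimuth=0.06, dr=3.0, daz=0.01)
print(same_object(radar, image))  # True -> treat as the same object
```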
  • The fusion position calculation unit 24 calculates a fusion position as the position of the object, based on the first position and the second position of an object determined to be the same object by the object determination unit 21.
  • The fusion position calculation unit 24 calculates the fusion position by fusing the highly accurate components of the first position Pr and the second position Pi of the object determined to be the same.
  • Specifically, the fusion position Pf (r1, θi) is calculated using the relative distance r1 of the first position Pr (r1, θr) and the azimuth θi of the second position Pi (r2, θi).
  • Hereinafter, an object that the object determination unit 21 has determined to be the same and for which a fusion position is calculated is referred to as a fusion target.
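  • As a minimal sketch of this fusion step, the following Python function keeps the radar range and the camera azimuth, and also converts the result to the X-Y plane used above; the function name and the Cartesian conversion are assumptions added for illustration.

```python
import math

def fuse_position(r1: float, theta_r: float, r2: float, theta_i: float):
    """Fuse the radar-accurate range r1 with the camera-accurate azimuth
    theta_i: Pf = (r1, theta_i). theta_r and r2 are the less accurate parts."""
    r_f, az_f = r1, theta_i
    # X: vehicle lateral direction, Y: traveling direction (azimuth from the Y axis).
    x = r_f * math.sin(az_f)
    y = r_f * math.cos(az_f)
    return (r_f, az_f), (x, y)

print(fuse_position(r1=20.0, theta_r=0.05, r2=21.5, theta_i=0.06))
```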
  • The future position calculation unit 25 calculates the future position of an object determined to be a fusion target, based on a plurality of fusion positions acquired at different times. The calculation of the future position by the future position calculation unit 25 will be described with reference to FIG. 3. FIG. 3 shows the change in the relative position of a pedestrian at times t1 to t6 when the pedestrian, located ahead of the host vehicle, moves laterally with respect to the host vehicle CS. It is assumed that from times t1 to t3 only the second position is detected for the pedestrian and the first position is not, that the first and second positions are both detected at time t4, and that the pedestrian is determined to be a fusion target from time t4 onward.
  • The future position calculation unit 25 calculates the future position of the object based on the plurality of time-series fusion positions. Specifically, it calculates a movement locus of the object from a predetermined number of fusion positions and calculates the future position of the object by extending the movement locus toward the host vehicle.
  • FIG. 3A illustrates calculation of the movement locus from the three fusion positions calculated at times t4 to t6. The future position Fp is then calculated by extending the calculated movement locus toward the host vehicle.
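  • One common way to realize such a locus-and-extrapolation step is a constant-velocity least-squares fit over the buffered positions. The sketch below assumes that motion model and a fixed sampling interval; both assumptions, like the example numbers, are illustrative and not specified by the patent.

```python
import numpy as np

def predict_future_position(points_xy: np.ndarray, dt: float, horizon: float) -> np.ndarray:
    """Fit a straight movement locus (constant velocity) through time-series
    positions (oldest first, one row per cycle) and extend it `horizon`
    seconds beyond the newest sample."""
    t = np.arange(len(points_xy)) * dt
    vx, x0 = np.polyfit(t, points_xy[:, 0], 1)  # slope = velocity, intercept = start
    vy, y0 = np.polyfit(t, points_xy[:, 1], 1)
    t_future = t[-1] + horizon
    return np.array([x0 + vx * t_future, y0 + vy * t_future])

# Three fusion positions at t4..t6 (100 ms apart); pedestrian crossing in -X.
locus = np.array([[3.0, 20.0], [2.7, 19.8], [2.4, 19.6]])
print(predict_future_position(locus, dt=0.1, horizon=1.0))  # approx. [-0.6, 17.6]
```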
  • The collision determination unit 26 determines whether the object will collide with the host vehicle, based on the future position calculated by the future position calculation unit 25.
  • Specifically, the collision determination unit 26 calculates a collision allowance time TTC (Time to Collision) between the object and the host vehicle and uses it to determine whether the object will collide with the host vehicle.
  • The collision lateral position CSP is a range extending in the lateral direction (X direction) from the center of the front portion of the host vehicle.
  • The collision allowance time TTC is an evaluation value indicating after how many seconds the vehicle would collide with the object if it kept traveling at its current speed; the smaller the TTC, the higher the collision risk, and the larger the TTC, the lower the collision risk.
  • The collision allowance time TTC can be calculated by, for example, dividing the distance in the traveling direction between the object and the host vehicle by the relative speed of the object with respect to the host vehicle. The relative speed is acquired by the electromagnetic wave sensor 31.
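  • The TTC arithmetic itself is a single division; the sketch below adds a guard for a non-closing object and a staged response whose threshold values are placeholders (the patent states only that the driving support device acts according to the TTC).

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """TTC = traveling-direction distance / relative (closing) speed.
    A smaller TTC means a higher collision risk."""
    if closing_speed_mps <= 0.0:
        return float("inf")  # object is not closing in; no collision predicted
    return gap_m / closing_speed_mps

def support_action(ttc_s: float) -> str:
    """Hypothetical mapping from TTC to the driving support operation."""
    if ttc_s < 0.8:
        return "automatic braking"
    if ttc_s < 1.6:
        return "alarm"
    return "none"

ttc = time_to_collision(gap_m=20.0, closing_speed_mps=10.0)
print(ttc, support_action(ttc))  # 2.0 none
```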
  • The driving support device 40 is an alarm device that issues an alarm sound to the driver or a brake device that decelerates the host vehicle, and it performs a collision-avoiding or collision-mitigating operation with respect to the object based on the collision allowance time TTC. If the driving support device 40 is a brake device, automatic braking is performed according to the TTC; if it is an alarm device, an alarm sound is issued according to the TTC.
  • As described above, the accuracy of the future position can be increased by calculating the future position of the object using fusion positions.
  • However, because the condition for detecting the position of the object from the captured image differs from the condition for detecting it with the electromagnetic wave sensor 31, there are cases in which, even though the object can be detected from the captured image, the electromagnetic wave sensor 31 cannot detect the same object until its detection condition is satisfied.
  • In the example of FIG. 3, the second position is detected at time t1, but the first position is first detected at time t4, so calculation of the fusion position starts at time t4.
  • Because the calculation of the future position requires a plurality of fusion positions, the first future position can only be calculated at time t6, after fusion positions have accumulated from time t4 onward. Since calculating the future position thus takes time, the execution of the control by the driving support device 40 for avoiding a collision with the object determined to be a fusion target may be delayed.
  • Therefore, when either the first position or the second position of the object was acquired before the object is determined to be a fusion target, the future position calculation unit 25 calculates the future position of the object based on the acquired first or second positions together with the fusion position.
  • In the example of FIG. 3, the calculation of the movement locus of the object starts using the fusion position calculated at time t4 together with the second positions acquired at times t2 and t3; the prediction of the future position therefore starts already at time t4.
  • In step S11, it is determined whether the first position, the detection result for the electromagnetic wave target by the electromagnetic wave sensor 31, has been detected.
  • In step S12, a reliability indicating the certainty of the first position is calculated.
  • For example, the ECU 20 calculates the reliability of the first position according to the intensity of the reflected wave returned by the electromagnetic wave target or the number of times the same electromagnetic wave target has been detected consecutively. Specifically, the higher the intensity of the reflected wave, the higher the reliability of the first position is set; likewise, the more times the same electromagnetic wave target has been detected consecutively, the higher the reliability is set.
  • In step S13, it is determined whether the second position, the detection result for the image target from the captured image, has been detected.
  • If the second position has been detected (step S13: YES), a reliability indicating the certainty of the second position is calculated in step S14.
  • For example, the ECU 20 calculates this reliability according to the pattern-matching score used to identify the image target from the captured image. Specifically, the higher the pattern-matching score, the higher the reliability of the second position is set.
  • In step S15, the first position and the second position are stored in association with each other in a buffer on the RAM.
  • The ECU 20 also stores the reliabilities calculated in steps S12 and S14 in association with the respective positions.
  • Hereinafter, the area of the RAM where the first position is stored is referred to as the first storage area, and the area where the second position is stored is referred to as the second storage area.
  • If the second position has not been detected (step S13: NO), only the first position is recorded in the buffer on the RAM in step S16; that is, the first position is stored in the first storage area.
  • If the first position has not been detected (step S11: NO), it is determined in step S17 whether the second position has been detected.
  • If the second position has been detected (step S17: YES), the reliability of the second position is calculated in step S18.
  • As in step S14, the reliability calculated in step S18 is based on, for example, the pattern-matching score. Steps S12, S14, and S18 therefore function as a reliability calculation unit.
  • In step S19, only the second position is recorded in the buffer on the RAM; that is, the second position is stored in the second storage area.
  • If the second position has not been detected either (step S17: NO), the process shown in FIG. 4 ends; in this case, no object position (neither a first nor a second position) was detected ahead of the vehicle.
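  • The storage step of FIG. 4 amounts to two per-sensor ring buffers holding positions together with their reliabilities. The sketch below assumes that structure; the buffer depth and the reliability formulas (the weighting and normalization constants in particular) are illustrative assumptions, with only the monotonic trends taken from the text.

```python
from collections import deque

class DetectionBuffer:
    """First (radar) and second (image) positions, each stored with a reliability."""
    def __init__(self, depth: int = 10):
        self.first = deque(maxlen=depth)   # first storage area
        self.second = deque(maxlen=depth)  # second storage area

    def store(self, first=None, second=None):
        """Each entry is a ((x, y), reliability) pair; either side may be absent."""
        if first is not None:
            self.first.append(first)
        if second is not None:
            self.second.append(second)

def radar_reliability(reflect_intensity: float, consecutive_hits: int) -> float:
    """Higher reflected intensity and more consecutive detections of the same
    target -> higher reliability of the first position (normalization assumed)."""
    return 0.5 * min(reflect_intensity / 100.0, 1.0) + 0.5 * min(consecutive_hits / 5.0, 1.0)

def image_reliability(matching_score: float) -> float:
    """Higher pattern-matching score -> higher reliability of the second position."""
    return max(0.0, min(matching_score, 1.0))

buf = DetectionBuffer()
buf.store(first=((2.5, 19.9), radar_reliability(80.0, 3)),
          second=((2.6, 20.3), image_reliability(0.9)))
```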
  • The process shown in FIG. 5 is performed by the ECU 20 at a predetermined cycle. The calculation of the future position shown in FIG. 5 is illustrated for the case of calculating the future position of a pedestrian.
  • In step S21, it is determined whether the electromagnetic wave target for which the first position was detected and the image target for which the second position was detected are the same object.
  • The ECU 20 determines that the image target and the electromagnetic wave target are the same object on the condition that the electromagnetic wave search region set from the first position and the image search region set from the second position have an overlapping area.
  • If it is not determined that they are the same object (step S21: NO), the ECU 20 ends the process shown in FIG. 5. Step S21 functions as an object determination step.
  • If it is determined that the electromagnetic wave target and the image target are the same object (step S21: YES), the fusion position is calculated in step S22 based on the first position and the second position stored in the RAM buffer. Step S22 functions as a fusion position calculation step.
  • In step S23, the fusion position calculated in step S22 is stored in a buffer on the RAM.
  • The buffer used in step S23 is an area of the RAM different from the first storage area holding the first positions and the second storage area holding the second positions; it is hereinafter referred to as the third storage area.
  • In step S24, it is determined whether the number of fusion positions calculated for the same object is equal to or less than a threshold Th1.
  • The threshold Th1 is determined, for example, from the reference number of fusion positions that the ECU 20 needs in order to calculate the future position, and is an integer of 2 or more.
  • If the number of calculated fusion positions is equal to or less than the threshold Th1 (step S24: YES), it is determined whether the first position or the second position of the object was acquired before the detections were determined to be the same object. First, if a predetermined number of second positions of the object determined to be a fusion target are stored in the second storage area on the RAM (step S25: YES), it is determined in step S26 whether integration of the fusion position of that object with the plurality of second positions is permitted.
  • Here, integration means storing the positions in a predetermined area of the buffer so that either the first positions or the second positions, together with the fusion position, can be used as detection points for calculating the future position of the object.
  • The integration permission determination in step S26 will be described with reference to FIG. 6.
  • Steps S41 to S43 function as a pop-out determination unit.
  • In step S41, it is determined whether the captured image contains an obstruction that can hinder the electromagnetic wave sensor 31 when it detects a pedestrian. For example, when the ECU 20 recognizes in the captured image an obstruction that creates a blind spot for detecting a pedestrian from the host vehicle, it determines that obstruction to be one that can hinder detection of the pedestrian. Hereinafter, a parked vehicle located on the road is used as the example of such an obstruction. If the presence of an obstruction is not determined (step S41: NO), the process proceeds to step S47.
  • In step S42, it is determined whether the pedestrian determined to be the fusion target is a pedestrian moving out laterally from behind the parked vehicle determined in step S41. For example, the ECU 20 first determines whether the pedestrian whose second position was detected is located around the parked vehicle. If so, a known optical flow is calculated based on the time-series change of feature points belonging to the pedestrian. When, based on the calculated optical flow, the pedestrian is moving laterally away from the parked vehicle ahead of the host vehicle, it is determined that the pedestrian is moving in a direction that pops out into the host vehicle's lane.
  • A feature point is a pixel used for recognizing the pedestrian in the captured image; for example, edge points of the pedestrian are used.
  • In step S43, it is determined, using the detection points of the electromagnetic wave sensor 31, whether the parked vehicle determined in step S41 is plausible. Specifically, since a parked vehicle is a three-dimensional object, detecting it with the electromagnetic wave sensor 31 should yield a length along the traveling direction of the host vehicle. Therefore, as shown in FIG. 7A, the ECU 20 acquires the detection points DP obtained when the electromagnetic wave sensor 31 detects the parked vehicle ahead of the host vehicle and compares the length they span in the traveling direction of the parked vehicle with a threshold Th2; if the parked vehicle is not judged plausible, the process proceeds to step S47.
  • The threshold Th2 is a value set based on the length of a vehicle, for example, a value of 3 meters or more and 6 meters or less.
  • In step S44, the reliability of the second position used for calculating the future position is determined.
  • For example, the ECU 20 determines the reliability of the second position by comparing the reliability acquired in step S14 or S18 of FIG. 4 with a threshold Th3. If the reliability is less than the threshold Th3 (step S44: NO), the process proceeds to step S47.
  • The threshold Th3 is a value determined experimentally based on the imaging accuracy of the image sensor 32, the detection accuracy of the object detection algorithm of the ECU 20, and the like.
  • In step S45, it is determined whether the pedestrian and the parked vehicle are close to each other. Since the future position of an object calculated using the fusion position and second positions is slightly less accurate than one calculated using only fusion positions, its use should be limited.
  • As shown in FIG. 7A, when the parked vehicle and the pedestrian are close to each other, their positions overlap in the lateral direction, and it is difficult for the electromagnetic wave sensor 31 to detect the first position; it is then highly likely that the first position was not detected before the fusion target was determined. Therefore, the ECU 20 makes the proximity of the pedestrian and the parked vehicle one of the integration permission conditions.
  • Specifically, the ECU 20 calculates the angle θd formed by the line segment connecting the host vehicle CS and the pedestrian Gb1 and the line segment connecting the host vehicle CS and the parked vehicle Gb2, and determines the lateral distance between the pedestrian Gb1 and the parked vehicle Gb2 based on this angle θd.
  • For the parked vehicle, an edge point indicating the end far from the host vehicle CS in the traveling direction of the host vehicle is acquired, and the line segment connecting the reference point Po and this edge point is used.
  • The threshold Th4 is a value set based on the distance at which the electromagnetic wave sensor 31 can distinguish the pedestrian from the parked vehicle; for example, a value of 0 degrees or more and 20 degrees or less can be used. Step S45 functions as a proximity determination unit.
  • If the pedestrian and the parked vehicle are close to each other (step S45: YES), a permission flag permitting the integration of the fusion position and the second positions is turned on in step S46 and the integration is permitted. If they are not close (step S45: NO), the permission flag is turned off in step S47 and the integration is not permitted.
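  • The proximity test is an angle comparison at the reference point Po. The sketch below assumes the X-Y convention introduced earlier (X lateral, Y forward, origin at Po); the function names and example coordinates are illustrative.

```python
import math

TH4_DEG = 20.0  # the patent allows a value between 0 and 20 degrees

def bearing_deg(xy) -> float:
    """Azimuth of a point seen from the reference point Po, measured
    from the traveling direction (Y axis)."""
    return math.degrees(math.atan2(xy[0], xy[1]))

def is_close(pedestrian_xy, obstruction_edge_xy) -> bool:
    """Step S45: the pedestrian and the obstruction count as close when the
    angle between the two lines of sight is at most Th4."""
    theta_d = abs(bearing_deg(pedestrian_xy) - bearing_deg(obstruction_edge_xy))
    return theta_d <= TH4_DEG

# Pedestrian just beside the far edge point of a parked vehicle.
print(is_close((2.4, 19.6), (3.5, 18.0)))  # True -> integration condition met
```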
  • If the integration is not permitted (step S26: NO), the process shown in FIG. 5 ends for this cycle.
  • If the integration is permitted, in step S27 the fusion position of the same object and the plurality of second positions are integrated as reference points for calculating the future position of the fusion target.
  • Specifically, the ECU 20 integrates, on the RAM, the third storage area holding the currently calculated fusion position with the second storage area holding the second positions, and sets a new area holding the positions used to calculate the movement locus.
  • In step S28, it is determined whether a predetermined number or more of first positions are stored in the buffer.
  • If so, it is determined in step S29 whether integration of the fusion position of the same object with the plurality of first positions is permitted.
  • FIG. 8 is a flowchart for explaining the integration permission determination in step S29.
  • In step S51, it is determined whether the brightness around the vehicle detected by the illuminance sensor 33 belongs to a brightness range predetermined as the brightness at which an image target can be detected from the captured image.
  • The brightness range is the range between a threshold Th11, determined from the lower limit of brightness at which an image target can be detected from the captured image, and a threshold Th12, determined from the corresponding upper limit.
  • If the detected brightness is equal to or less than the threshold Th11, the surroundings of the vehicle are too dark, and the process proceeds to step S53. Even when the detected brightness exceeds Th11 (step S51: NO), if it is equal to or greater than the threshold Th12 (step S52: YES), the surroundings of the vehicle are too bright, and the process again proceeds to step S53. If the detected brightness is below the threshold Th12 (step S52: NO), the integration of the first positions and the fusion position is not permitted in step S56.
  • In step S53, it is determined whether the first position is within the detection range of the electromagnetic wave sensor 31, because a first position outside that detection range is highly likely to be noise or the like erroneously detected as an electromagnetic wave target. If the first position is within the detection range (step S53: YES), the process proceeds to step S54; if it is outside (step S53: NO), the process proceeds to step S56 and the integration is not permitted.
  • In step S54, the reliability of the first position used for calculating the fusion position is determined.
  • For example, the ECU 20 compares the reliability acquired in step S12 of FIG. 4 with a threshold Th13. If the reliability is equal to or greater than Th13, the reliability of the electromagnetic wave target is high and the process proceeds to step S55; if it is less than Th13, the reliability of the electromagnetic wave target is judged low and the process proceeds to step S56.
  • The threshold Th13 is a value determined experimentally based on the object detection accuracy of the electromagnetic wave sensor 31. When the reliability is calculated from the reflection intensity, the threshold's lower bound is set above the reflection intensity at which the electromagnetic wave sensor 31 can just detect the first position of an object.
  • In step S55, the integration of the first positions and the fusion position is permitted.
  • In step S56, the integration of the first positions and the fusion position is not permitted.
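  • The three checks of FIG. 8 chain into a single predicate: integration of the first positions is allowed only when the camera is unlikely to be usable, the radar detection is in range, and the radar reliability is sufficient. A minimal sketch follows; all threshold values are placeholders, since the patent defines them only qualitatively.

```python
def permit_first_position_integration(brightness: float,
                                      in_radar_range: bool,
                                      radar_reliability: float,
                                      th11: float = 50.0,      # too-dark bound (placeholder)
                                      th12: float = 10_000.0,  # too-bright bound (placeholder)
                                      th13: float = 0.6) -> bool:
    # S51/S52: the camera is expected to work in (th11, th12); then keep
    # waiting for fusion positions instead of integrating first positions.
    if th11 < brightness < th12:
        return False
    # S53: a first position outside the radar's detection range is likely noise.
    if not in_radar_range:
        return False
    # S54: require a sufficiently reliable electromagnetic wave target.
    return radar_reliability >= th13

print(permit_first_position_integration(20.0, True, 0.8))  # True (dark scene)
```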
  • If the integration is permitted (step S29: YES), in step S30 the fusion position and the plurality of first positions are integrated as reference points for calculating the future position of the fusion target. Specifically, the ECU 20 integrates, on the RAM, the third storage area holding the currently calculated fusion position with the first storage area holding the first positions, and sets the integrated area. If the integration is not permitted (step S29: NO), the process shown in FIG. 5 ends.
  • In step S31, the future position of the fusion target is calculated based on the positions integrated in step S27 or step S30. When the integration of the second positions and the fusion position is permitted, the ECU 20 calculates the pedestrian's movement locus using the fusion position and the second positions acquired before it, and calculates the pedestrian's future position by extending this movement locus toward the host vehicle. Likewise, when the integration of the first positions and the fusion position is permitted, the ECU 20 calculates the movement locus using the fusion position and the first positions acquired before it, and extends it toward the host vehicle. Step S31 therefore functions as a future position calculation step.
  • If, in step S24, the number of calculated fusion positions exceeds the threshold Th1 (step S24: NO), object movement prediction using only fusion positions is performed in step S32.
  • In this case, the movement locus of the fusion target is calculated from fusion positions alone.
  • The ECU 20 calculates the pedestrian's movement locus using the time-series fusion positions and calculates the pedestrian's future position by extending the locus toward the host vehicle.
  • As described above, when either the first position or the second position of an object has already been acquired at the time the object is determined to be a fusion target, the ECU 20 of the first embodiment calculates the future position of the object based on the acquired first or second positions together with the fusion position.
  • In this case, the calculation of the future position starts as soon as the fusion position is set, so the timing at which the future position is calculated can be advanced.
  • The ECU 20 calculates the future position of the object in this way on the condition that the object is determined to be a pedestrian highly likely to pop out into the host vehicle's lane ahead of the host vehicle. The start of the future-position prediction for a pedestrian likely to collide with the host vehicle can thus be advanced, improving the safety of the host vehicle.
  • Since the future position of an object calculated using the fusion position and second positions is slightly less accurate than one calculated using only fusion positions, the scenes in which it is applied should be limited.
  • Therefore, the ECU 20 calculates the future position of the object based on the already acquired second positions and the fusion position on the condition that the pedestrian and a stationary object on the road are close to each other.
  • By identifying a situation in which the first position was highly likely not detected before the fusion-target determination, this prevents the future position from being calculated unnecessarily from the second positions and the fusion position.
  • Likewise, the ECU 20 calculates the future position of the object based on the already acquired first positions and the fusion position on the condition that the brightness around the host vehicle is outside the predetermined range. By identifying a situation in which the second position is unlikely to be detected, this prevents the future position from being calculated unnecessarily from the first positions and the fusion position.
  • The ECU 20 also determines whether the reliability of the first or second position is equal to or higher than a predetermined value, and calculates the future position of the fusion target from the first or second positions and the fusion position only on that condition. When the reliability is low, the future position is not predicted from the first or second positions together with the fusion position, which prevents the prediction accuracy from dropping sharply for the sake of advancing the start of the prediction.
  • In the second embodiment, in the integration permission determination in step S26 of FIG. 5, integration of the second positions and the fusion position is permitted when the pedestrian detected as the image target is an object popping out onto the host vehicle's lane from a guardrail extending along that lane.
  • The guardrail is an example of an obstruction that can hinder detection of the pedestrian by the electromagnetic wave sensor 31, because the reflected wave it returns to the electromagnetic wave sensor 31 is stronger than that of a pedestrian. When the pedestrian is positioned near the guardrail, the intensity of the reflection from the guardrail therefore exceeds that from the pedestrian, and it becomes difficult for the electromagnetic wave sensor 31 to detect the first position of the pedestrian. In the second embodiment, the future position of the object is accordingly calculated based on the second positions and the fusion position on the condition that a stationary object such as a guardrail exists near the pedestrian.
  • In step S41 of FIG. 6, it is determined whether a guardrail is present in the captured image.
  • In step S43, the length of the guardrail in the vehicle traveling direction (Y direction) is calculated using the detection points of the electromagnetic wave sensor 31.
  • In step S45, it is determined whether the pedestrian Gb11 and the guardrail Gb12 in the captured image are close to each other. Based on the captured image, the ECU 20 calculates an angle θd indicating the lateral distance between the pedestrian Gb11 and the guardrail Gb12.
  • First, the point Pa at which a line segment extending in the lateral direction (X direction) from the second position of the pedestrian Gb11 intersects the guardrail Gb12 is calculated.
  • Then the angle θd between the line segment connecting the reference point Po to the intersection Pa and the line segment connecting the host vehicle CS to the pedestrian Gb11 is calculated. When this angle θd is equal to or less than the threshold Th4, the pedestrian and the guardrail are determined to be close to each other.
  • If it is determined that the pedestrian and the guardrail are close (step S45: YES), the permission flag permitting the integration of the fusion position and the second positions is turned on in step S46. If they are not close (step S45: NO), the permission flag is turned off in step S47.
  • With the second embodiment as well, the calculation start timing of the pedestrian's future position can be advanced, so the host vehicle can travel more safely.
  • In the third embodiment, the ECU 20 calculates, as the future position of the object, the future position of a preceding vehicle traveling ahead in the host vehicle's lane. When the ECU 20 determines the preceding vehicle to be a fusion target, it calculates the future position of the object based on either the first or second positions of the preceding vehicle together with the fusion position.
  • FIG. 10 is a flowchart for explaining the integration permission determination in step S26 of FIG. 5 in the third embodiment.
  • In steps S61 to S63, it is determined whether there is a parallel-running vehicle traveling alongside the preceding vehicle. A parallel-running vehicle is a vehicle traveling in a lane adjacent to the host vehicle's lane. Steps S61 to S63 therefore function as a parallel running determination unit.
  • In step S61, it is detected, based on the captured image, whether a vehicle other than the preceding vehicle is traveling ahead of the host vehicle.
  • For example, the ECU 20 detects another vehicle Gb32 around the preceding vehicle Gb31 determined to be the fusion target in the captured image. If no other vehicle can be detected around the preceding vehicle (step S61: NO), the process proceeds to step S66, and the integration of the fusion position and the second positions is not permitted.
  • In step S62, the difference between the inter-vehicle distances to the preceding vehicle and to the other vehicle in the host vehicle's traveling direction is calculated.
  • For example, the ECU 20 calculates, based on the captured image, the difference D2 between the inter-vehicle distance D1 from the host vehicle to the preceding vehicle and the inter-vehicle distance from the host vehicle to the other vehicle.
  • If the difference in inter-vehicle distance exceeds the threshold Th21 (step S62: NO), the process proceeds to step S66.
  • In step S63, the lateral distance between the preceding vehicle and the other vehicle is determined.
  • For example, the ECU 20 calculates the angle θd formed by the line segment connecting the host vehicle and the preceding vehicle and the line segment connecting the host vehicle and the other vehicle, and determines the lateral distance based on this angle θd.
  • If the angle θd exceeds the threshold Th22 (step S63: NO), the process proceeds to step S66; otherwise, the ECU 20 determines that the other vehicle is running in parallel with the preceding vehicle and proceeds to step S64.
  • The thresholds Th21 and Th22 are values calculated from, for example, the traveling-direction distance and the lateral distance at which the electromagnetic wave sensor 31 can still recognize the two vehicles separately.
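  • In other words, another vehicle counts as parallel-running exactly when it sits so close to the preceding vehicle, in both range and bearing, that the radar would struggle to resolve the two. A minimal sketch of steps S61 to S63 follows; the Th21 and Th22 values are placeholders, since the patent derives them from the radar's resolving power without stating numbers.

```python
def is_parallel_running(dist_preceding_m: float, dist_other_m: float,
                        bearing_gap_deg: float,
                        th21_m: float = 2.0, th22_deg: float = 10.0) -> bool:
    """Steps S62/S63: parallel running when both the inter-vehicle-distance
    difference and the bearing separation are within the radar's
    resolving limits (threshold values assumed)."""
    d2 = abs(dist_preceding_m - dist_other_m)
    return d2 <= th21_m and bearing_gap_deg <= th22_deg

print(is_parallel_running(30.0, 29.2, 6.0))  # True -> integration may be permitted
print(is_parallel_running(30.0, 18.0, 6.0))  # False -> radar can separate them
```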
  • In step S64, the reliability of the second position used for calculating the future position is determined.
  • For example, the ECU 20 determines the reliability of the second position by comparing the reliability of the preceding vehicle acquired in step S14 or S18 of FIG. 4 with a threshold Th23.
  • If the reliability of the second position is equal to or greater than the threshold Th23 (step S64: YES), integration of the second positions and the fusion position is permitted in step S65. If it is less than Th23 (step S64: NO), the integration is not permitted in step S66.
  • The third embodiment described above has the following effects.
  • Since the future position of an object calculated using the fusion position and second positions is slightly less accurate than one calculated using only fusion positions, the scenes in which it is applied should be limited.
  • When the future position of a preceding vehicle is calculated, the presence of a parallel-running vehicle alongside the preceding vehicle makes it difficult for the electromagnetic wave sensor 31 to detect the first position of the preceding vehicle.
  • Therefore, the ECU 20 calculates the future position of the fusion target based on the acquired second positions and the fusion position on the condition that a parallel-running vehicle is traveling alongside the preceding vehicle.
  • By identifying a situation in which the first position was highly likely not detected before the fusion-target determination, this prevents the future position from being calculated unnecessarily from the second positions and the fusion position.
  • In the fourth embodiment, when calculating the future position of the fusion target based on either the first or second positions and the fusion position, the ECU 20 increases the number of first or second positions used for the calculation as the reliability of those positions decreases.
  • FIG. 12 is a flowchart illustrating the process executed in step S31 in the fourth embodiment.
  • Here, the case where the second positions and the fusion position were integrated in step S27 is described as an example.
  • In step S71, the reliability of the second positions used for calculating the future position is determined.
  • For example, the ECU 20 calculates the sum of the reliabilities of the second positions stored in the second storage area on the RAM. For example, when the future position is to be calculated from two second positions and one fusion position, the reliabilities of the two second positions are summed.
  • In step S72, the number of second positions used for calculating the future position is changed according to the reliability sum calculated in step S71.
  • For example, the ECU 20 holds a map defining the relationship between the reliability sum and the number N of second positions shown in FIG. 13A, and based on this map changes the number of second positions used for calculating the future position to the number N corresponding to the reliability sum.
  • Each value in the map is defined such that the lower the reliability sum calculated in step S71, the greater the number N of second positions used for calculating the future position.
  • N0 is the initial number of second positions used for calculating the future position; in this embodiment it is two points.
  • If the number of second positions corresponding to the reliability sum is not stored in the buffer, the process may proceed to step S73 without changing the current number of second positions used for calculating the future position.
  • In step S73, the future position of the fusion target is calculated based on the plurality of second positions and the fusion position.
  • FIG. 13B illustrates how the number of second positions changes with the reliability, taking as an example the case where the initial number of second positions used for calculating the future position is two points.
  • When the reliability sum is high, the number of second positions used for calculating the future position stays at two, and the future position of the fusion target is calculated from the two second positions and the fusion position; in that case, the fusion position calculated at time t15 and the two second positions acquired at times t13 and t14 are used.
  • When the reliability sum is low, the number of second positions used for calculating the future position is increased to three, and the future position of the fusion target is calculated from three second positions and the fusion position; in that case, the fusion position calculated at time t15 and the three second positions acquired at times t12 to t14 are used.
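  • The FIG. 13A map reduces to a monotonically decreasing step function from the reliability sum to the point count N. The sketch below assumes concrete breakpoints for illustration; the patent fixes only N0 = 2 and the direction of the trend.

```python
def num_second_positions(reliability_sum: float, n0: int = 2, n_max: int = 5) -> int:
    """FIG. 13A-style map: the lower the summed reliability of the second
    positions, the more of them go into the movement locus (breakpoints assumed)."""
    if reliability_sum >= 1.5:
        return n0                 # high confidence: keep the initial two points
    if reliability_sum >= 0.5:
        return min(n0 + 1, n_max)
    return min(n0 + 2, n_max)

print(num_second_positions(1.8))  # 2 -> uses the second positions from t13 and t14
print(num_second_positions(0.7))  # 3 -> also pulls in the second position from t12
```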
  • The fourth embodiment described above has the following effect. If the reliability of the detected first or second positions is low, there is a concern that the prediction accuracy of the future position also drops.
  • Therefore, the ECU 20 calculates the reliability of the first or second positions used for calculating the future position and, the lower that reliability, the more first or second positions it uses to calculate the future position of the fusion target, which suppresses the drop in prediction accuracy.
  • In step S24 of FIG. 5, instead of comparing the number of calculated fusion positions with the threshold, the future position of the object may be calculated based on either the first or second positions and the fusion position when the fusion-target determination for the object is made for the first time.
  • The type of object whose future position is calculated may be a bicycle instead of a pedestrian.
  • In this case, the ECU 20 recognizes the bicycle from the captured image using a bicycle dictionary instead of the pedestrian dictionary and detects the second position based on the recognized bicycle.
  • In step S42 of FIG. 6, instead of using the captured image to determine whether the pedestrian or bicycle is moving in the lateral direction, the determination may be made based on the detection result of the electromagnetic wave sensor 31.
  • The method of the integration permission determination may be changed depending on whether the fusion target is a pedestrian or a preceding vehicle: when the fusion target is a pedestrian, the integration permission determination of the first embodiment is performed, and when it is a preceding vehicle, that of the third embodiment is performed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

Based on a first position at which an object was detected by an electromagnetic wave sensor (31) and a second position of the object detected from an image captured by an image sensor (32), an ECU (20) determines whether the detections are of the same object. If, before the time at which the ECU (20) determined that they are the same object, it had acquired a first position or a second position of the object, the ECU calculates a future position of the object based on the acquired first or second position and a fusion position.
PCT/JP2017/036263 2016-10-13 2017-10-05 Movement detection device and movement detection method WO2018070335A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-202080 2016-10-13
JP2016202080A JP6601362B2 (ja) 2016-10-13 2016-10-13 Movement detection device

Publications (1)

Publication Number Publication Date
WO2018070335A1 true WO2018070335A1 (fr) 2018-04-19

Family

ID=61905634

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/036263 WO2018070335A1 (fr) 2016-10-13 2017-10-05 Movement detection device and movement detection method

Country Status (2)

Country Link
JP (1) JP6601362B2 (fr)
WO (1) WO2018070335A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6820958B2 (ja) * 2019-01-22 2021-01-27 三菱電機株式会社 Vehicle control device and control method
JP7195219B2 (ja) * 2019-05-31 2022-12-23 本田技研工業株式会社 Collision prediction determination device and vulnerable road user protection system
JP2021092996A (ja) * 2019-12-11 2021-06-17 国立大学法人 東京大学 Measurement system, vehicle, measurement method, measurement device, and measurement program
JP2022050966A (ja) * 2020-09-18 2022-03-31 株式会社デンソー Object detection device
JP7533180B2 (ja) 2020-12-03 2024-08-14 株式会社豊田自動織機 Motion prediction device, motion prediction method, and motion prediction program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004301567A (ja) * 2003-03-28 2004-10-28 Fujitsu Ltd Collision prediction device
JP2016068754A (ja) * 2014-09-30 2016-05-09 株式会社デンソー Driving support device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2018087817A1 (ja) * 2016-11-08 2019-03-07 三菱電機株式会社 Object detection device and object detection method
US20210279487A1 (en) * 2020-03-06 2021-09-09 Subaru Corporation Vehicle exterior environment recognition apparatus
US11842552B2 (en) * 2020-03-06 2023-12-12 Subaru Corporation Vehicle exterior environment recognition apparatus

Also Published As

Publication number Publication date
JP2018063606A (ja) 2018-04-19
JP6601362B2 (ja) 2019-11-06

Similar Documents

Publication Publication Date Title
JP6601362B2 (ja) Movement detection device
CN107408345B (zh) 物标存在判定方法以及装置
JP6673178B2 (ja) 車両制御装置、車両制御方法
US9797734B2 (en) Object recognition apparatus
JP6592266B2 (ja) 物体検知装置、及び物体検知方法
US11014566B2 (en) Object detection apparatus
JP6539228B2 (ja) 車両制御装置、及び車両制御方法
CN109204311B (zh) 一种汽车速度控制方法和装置
JP4558758B2 (ja) 車両用障害物認識装置
JP6787157B2 (ja) 車両制御装置
US10535264B2 (en) Object detection apparatus and object detection method
WO2018074287A1 (fr) Dispositif de commande de véhicule
JP6855776B2 (ja) 物体検出装置、及び物体検出方法
US20170066445A1 (en) Vehicle control apparatus
US10752223B2 (en) Autonomous emergency braking system and method for vehicle at crossroad
JP6614108B2 (ja) 車両制御装置、車両制御方法
JP6669090B2 (ja) 車両制御装置
JP2016192166A (ja) 車両制御装置、及び車両制御方法
WO2016158634A1 (fr) Dispositif de commande de véhicule et procédé de commande de véhicule
WO2017138329A1 (fr) Dispositif de prédiction de collision
US11407390B2 (en) Vehicle control apparatus and vehicle control method
KR20150096924A (ko) 전방 충돌 차량 선정 방법 및 시스템
JP2019046143A (ja) 走行支援装置
US11420624B2 (en) Vehicle control apparatus and vehicle control method
JP6733616B2 (ja) 車両制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17861162

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17861162

Country of ref document: EP

Kind code of ref document: A1