WO2021229722A1 - Obstacle detection device, parking support device, collision avoidance device, and obstacle detection method - Google Patents

Obstacle detection device, parking support device, collision avoidance device, and obstacle detection method

Info

Publication number
WO2021229722A1
WO2021229722A1 PCT/JP2020/019116 JP2020019116W WO2021229722A1 WO 2021229722 A1 WO2021229722 A1 WO 2021229722A1 JP 2020019116 W JP2020019116 W JP 2020019116W WO 2021229722 A1 WO2021229722 A1 WO 2021229722A1
Authority
WO
WIPO (PCT)
Prior art keywords
obstacle
unit
obstacle detection
peak shape
vehicle
Prior art date
Application number
PCT/JP2020/019116
Other languages
English (en)
Japanese (ja)
Inventor
浩章 村上
裕 小野寺
亘 辻田
元気 山下
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2020/019116 (WO2021229722A1)
Priority to JP2022522170A (JP7186923B2)
Publication of WO2021229722A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/523Details of pulse systems
    • G01S7/526Receivers

Definitions

  • This disclosure relates to an obstacle detection device, a parking support device, a collision avoidance device, and an obstacle detection method.
  • The area in front of the vehicle is referred to as the "front area". Further, the area behind the vehicle is referred to as the "rear area". The area to the left of the vehicle is referred to as the "left area". The area to the right of the vehicle is referred to as the "right area". In addition, the left area or the right area may be collectively referred to as a "side area". In addition, the front area, the rear area, the left area, or the right area may be collectively referred to as a "surrounding area".
  • A ToF (Time of Flight) type range-finding sensor for automobiles has been developed.
  • ultrasonic sensors, millimeter-wave radars, and laser radars have been developed.
  • Such waves (for example, ultrasonic waves, millimeter waves, or laser beams) are used as the exploration waves transmitted and received by the ToF type distance measuring sensor.
  • Patent Document 1 determines the height of an obstacle based on the non-detection rate or the residual average value (see, for example, the summary of Patent Document 1).
  • the non-detection rate or the residual average value fluctuates due to the following factors.
  • the exploration wave received by the ranging sensor may include the exploration wave reflected by the road surface in addition to the exploration wave reflected by the obstacle to be detected.
  • the non-detection rate decreases and the residual average value increases.
  • The exploration wave reflected by the road surface is more likely to be received by the distance measuring sensor.
  • the non-detection rate is further reduced, and the residual average value is further increased.
  • the component corresponding to the exploration wave reflected by the road surface may be referred to as “road surface noise”.
  • the present disclosure has been made to solve the above-mentioned problems, and an object of the present disclosure is to suppress a decrease in the accuracy of determining the height of an obstacle due to road surface noise.
  • The obstacle detection device according to the present disclosure includes: an obstacle detection unit that acquires information corresponding to an obstacle by detecting the obstacle in a surrounding area with respect to a moving vehicle using a distance measuring sensor provided in the vehicle; a map generation unit that generates a map corresponding to a frequency distribution based on a plurality of distance measurement values included in the information; a peak shape extraction unit that extracts a peak shape in the map by extracting, from a plurality of frequency groups in the map, a frequency group including a frequency exceeding a first threshold value; and a discrimination unit that determines the height of the obstacle based on the peak shape.
  • FIG. 1 is a block diagram showing the main part of the parking support system including the obstacle detection device according to Embodiment 1.
  • FIG. 2 is a block diagram showing the main part of the obstacle detection unit in the obstacle detection device according to Embodiment 1.
  • FIG. 3 is a block diagram showing the main part of the discrimination unit in the obstacle detection device according to Embodiment 1.
  • A block diagram showing the hardware configuration of the main part of the parking support device including the obstacle detection device according to Embodiment 1.
  • A block diagram showing another hardware configuration of the main part of the parking support device including the obstacle detection device according to Embodiment 1.
  • A flowchart showing the operation of the obstacle detection device according to Embodiment 1.
  • A flowchart showing the operation of the parking support device including the obstacle detection device according to Embodiment 1.
  • An explanatory diagram showing an example of a plurality of reflection points corresponding to an obstacle and a plurality of reflection points corresponding to road surface noise.
  • An explanatory diagram showing an example of the map and an example of the first threshold value.
  • An explanatory diagram showing an example of the peak shape corresponding to a road obstacle and an example of the second threshold value.
  • An explanatory diagram showing an example of the peak shape corresponding to a traveling obstacle and an example of the second threshold value.
  • An explanatory diagram showing an example of the peak shape corresponding to a road obstacle and an example of the third threshold value.
  • An explanatory diagram showing an example of the peak shape corresponding to a traveling obstacle and an example of the third threshold value.
  • A flowchart showing the operation of the obstacle detection device according to Embodiment 2.
  • A flowchart showing the operation of the parking support device including the obstacle detection device according to Embodiment 2.
  • An explanatory diagram showing an example of a state in which the arrangement direction of a plurality of reflection points corresponding to an obstacle is non-parallel to the moving direction of the vehicle.
  • A block diagram showing the main part of the collision avoidance system including the obstacle detection device according to Embodiment 3.
  • A block diagram showing the main part of the obstacle detection unit in the obstacle detection device according to Embodiment 3.
  • A block diagram showing the hardware configuration of the main part of the collision avoidance device including the obstacle detection device according to Embodiment 3.
  • A block diagram showing another hardware configuration of the main part of the collision avoidance device including the obstacle detection device according to Embodiment 3.
  • A block diagram showing still another hardware configuration of the main part of the collision avoidance device including the obstacle detection device according to Embodiment 3.
  • A flowchart showing the operation of the obstacle detection device according to Embodiment 3.
  • A flowchart showing the operation of the collision avoidance device including the obstacle detection device according to Embodiment 3.
  • An explanatory diagram showing an example of the movement amount of the vehicle.
  • A block diagram showing the main part of the discrimination unit in the obstacle detection device according to Embodiment 4.
  • FIG. 1 is a block diagram showing a main part of a parking support system including an obstacle detection device according to the first embodiment.
  • FIG. 2 is a block diagram showing a main part of the obstacle detection unit in the obstacle detection device according to the first embodiment.
  • FIG. 3 is a block diagram showing a main part of a discrimination unit in the obstacle detection device according to the first embodiment.
  • a parking support system including an obstacle detection device according to the first embodiment will be described with reference to FIGS. 1 to 3.
  • the vehicle 1 has a parking support system 2.
  • the parking support system 2 includes a distance measuring sensor 3, first sensors 4, second sensors 5, and a parking support device 6.
  • the parking support device 6 includes an obstacle detection unit 11, a map generation unit 12, a peak shape extraction unit 13, a discrimination unit 14, and a parking support control unit 21.
  • the obstacle detection unit 11, the map generation unit 12, the peak shape extraction unit 13, and the discrimination unit 14 constitute a main part of the obstacle detection device 7.
  • the range-finding sensor 3 is composed of one range-finding sensor or a plurality of range-finding sensors.
  • Each distance measuring sensor is composed of a ToF type distance measuring sensor.
  • each distance measuring sensor is composed of an ultrasonic sensor, a millimeter wave radar, or a laser radar.
  • Hereinafter, an example in which the distance measuring sensor 3 is composed of one distance measuring sensor, and in which that one distance measuring sensor is an ultrasonic sensor, will be mainly described.
  • the distance measuring sensor 3 is provided on the left side of the vehicle 1. Alternatively, the distance measuring sensor 3 is provided on the right side of the vehicle 1. Alternatively, the distance measuring sensor 3 is provided on each of the left side portion of the vehicle 1 and the right side portion of the vehicle 1. Alternatively, the distance measuring sensor 3 is provided at the front end portion of the vehicle 1. Alternatively, the distance measuring sensor 3 is provided at the rear end of the vehicle 1. Alternatively, the distance measuring sensor 3 is provided at each of the front end portion of the vehicle 1 and the rear end portion of the vehicle 1.
  • In the first and second embodiments, an example in which the distance measuring sensor 3 is provided on the left side portion of the vehicle 1 or the right side portion of the vehicle 1 will be mainly described. More specifically, an example in which the distance measuring sensor 3 is provided on the left side of the vehicle 1 will be mainly described. Further, in the third embodiment, an example in which the distance measuring sensor 3 is provided at the front end portion of the vehicle 1 or the rear end portion of the vehicle 1 will be mainly described. More specifically, an example in which the distance measuring sensor 3 is provided at the front end portion of the vehicle 1 will be mainly described.
  • the first sensors 4 are used for detecting the speed Vm of the vehicle 1.
  • the first sensors 4 include one type of sensor or a plurality of types of sensors. Specifically, for example, the first sensors 4 include a wheel speed sensor and a shift position sensor.
  • the second sensors 5 are used to detect the position of the vehicle 1.
  • the position of the vehicle 1 includes the direction of the vehicle 1.
  • The second sensors 5 include one type of sensor or a plurality of types of sensors.
  • the second sensors 5 include a GNSS (Global Navigation Satellite System) receiver, a yaw rate sensor, and a gyro sensor.
  • the obstacle detection unit 11 detects an obstacle O in the surrounding area (for example, the left area LA) with respect to the moving vehicle 1 by using the distance measuring sensor 3. As a result, the obstacle detection unit 11 acquires information about the obstacle O (hereinafter, may be referred to as “obstacle detection information”).
  • the obstacle detection unit 11 includes a speed determination unit 31, a transmission signal output unit 32, a reception signal acquisition unit 33, a distance calculation unit 34, and a position calculation unit 35.
  • the speed determination unit 31 uses the first sensors 4 to determine whether or not the vehicle 1 is moving at a speed Vm equal to or lower than a predetermined speed Vth.
  • The predetermined speed Vth is set to, for example, 30 kilometers per hour.
  • The transmission signal output unit 32 outputs a predetermined electric signal (hereinafter referred to as "transmission signal") to the distance measuring sensor 3 at predetermined time intervals.
  • the ranging sensor 3 transmits the exploration wave to the surrounding region (for example, the left region LA) at a predetermined time interval.
  • the exploration wave transmitted by the distance measuring sensor 3 irradiates the obstacle O in the surrounding region (for example, the left region LA).
  • the irradiated exploration wave is reflected by the obstacle O.
  • the reflected exploration wave (hereinafter, may be referred to as “reflected wave”) is received by the ranging sensor 3.
  • the distance measuring sensor 3 outputs an electric signal (hereinafter referred to as “received signal”) corresponding to the received reflected wave.
  • the received signal acquisition unit 33 acquires the received signal output by the distance measuring sensor 3.
  • the distance calculation unit 34 calculates the distance value d by ToF using the received signal acquired by the received signal acquisition unit 33.
  • the distance calculation unit 34 calculates the round-trip propagation time Tp of the search wave based on the time when the distance measurement sensor 3 transmits the search wave and the time when the distance measurement sensor 3 receives the corresponding reflected wave.
  • information indicating the propagation velocity Vp of the exploration wave in the air is prepared in advance.
  • The distance calculation unit 34 calculates the distance value d by the following equation (1) based on the round-trip propagation time Tp and the propagation velocity Vp.
  • d = Vp × Tp / 2 ... (1)
  • In this way, the distance value d is calculated by ToF. That is, when the vehicle 1 is moving at a speed Vm equal to or lower than the predetermined speed Vth, the exploration wave is transmitted a plurality of times. As a result, a plurality of distance values d are sequentially calculated.
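For illustration, a minimal Python sketch of the ToF calculation of equation (1); the function name, the timestamp arguments, and the 343 m/s propagation velocity for ultrasound in air are assumptions chosen for the example, not values taken from this publication.

```python
def tof_distance(t_transmit: float, t_receive: float, v_propagation: float = 343.0) -> float:
    """Distance value d from a round-trip (ToF) measurement.

    t_transmit / t_receive: times (in seconds) at which the distance measuring
    sensor transmitted the exploration wave and received the corresponding
    reflected wave.  v_propagation: propagation velocity Vp of the wave in air
    (m/s); 343 m/s is an assumed value for ultrasound at room temperature.
    """
    tp = t_receive - t_transmit       # round-trip propagation time Tp
    return v_propagation * tp / 2.0   # equation (1): d = Vp * Tp / 2


# Example: an echo received 6 ms after transmission corresponds to roughly 1.03 m.
print(tof_distance(0.0, 0.006))
```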
  • the position calculation unit 35 acquires the distance value d calculated by the distance calculation unit 34.
  • The position calculation unit 35 calculates the position (hereinafter referred to as "reflection position") Pr of the point (hereinafter referred to as "reflection point") RP at which the exploration wave corresponding to the acquired distance value d was reflected by the obstacle O. Specifically, for example, the position calculation unit 35 calculates the reflection position Pr as follows.
  • The position calculation unit 35 uses the second sensors 5 to detect the position of the vehicle 1 at the time when the distance measuring sensor 3 transmits the exploration wave or at the time when the distance measuring sensor 3 receives the corresponding reflected wave (hereinafter, these times are collectively referred to as the "target time").
  • information indicating the installation position and installation direction of the distance measuring sensor 3 in the vehicle 1 is prepared in advance.
  • The position calculation unit 35 calculates, based on the detected position, the position (hereinafter referred to as "sensor position") Ps of the distance measuring sensor 3 at each target time, and also calculates the direction (hereinafter referred to as "sensor direction") Ds of the distance measuring sensor 3 at each target time.
  • the position calculation unit 35 calculates the following vector V using the individual sensor positions Ps, the corresponding sensor orientation Ds, and the corresponding distance value d. That is, each vector V has a starting point corresponding to the corresponding sensor position Ps. Further, each vector V has a direction corresponding to the corresponding sensor direction Ds. Further, each vector V has a magnitude corresponding to the corresponding distance value d.
  • The position calculation unit 35 regards the end point of each vector V as corresponding to the reflection point RP. As a result, the reflection position Pr corresponding to each vector V is calculated.
  • the reflection position Pr is calculated.
  • In this way, the reflection point RP is detected. That is, when the vehicle 1 is moving at a speed Vm equal to or lower than the predetermined speed Vth, a plurality of reflection positions Pr corresponding to a plurality of reflection points RP are sequentially calculated.
  • the obstacle detection unit 11 generates information indicating the reflection position Pr corresponding to each reflection point RP (hereinafter referred to as “reflection position information"). Further, the obstacle detection unit 11 generates information indicating the distance measurement value D corresponding to each reflection point RP (hereinafter referred to as “distance measurement value information”). For the distance measurement value D, for example, the distance value d calculated by the distance calculation unit 34 is used. The obstacle detection unit 11 outputs information including the generated reflection position information and the generated distance measurement value information (that is, obstacle detection information) to the map generation unit 12.
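As a hedged sketch of the reflection-position calculation described above: the vector V starts at the sensor position Ps, points in the sensor direction Ds, and has magnitude d, and its end point is taken as the reflection position Pr. A 2D coordinate frame and an angle-valued sensor direction are assumptions made for the example.

```python
import math

def reflection_position(sensor_pos_xy, sensor_dir_rad, distance_d):
    """Reflection position Pr = end point of the vector V whose start point is
    the sensor position Ps, whose direction is the sensor direction Ds, and
    whose magnitude is the distance value d."""
    px, py = sensor_pos_xy
    return (px + distance_d * math.cos(sensor_dir_rad),
            py + distance_d * math.sin(sensor_dir_rad))


# Example: a sensor on the left side of the vehicle facing +Y (90 degrees)
# with a distance value of 1.2 m gives a reflection point near (0.5, 2.1).
print(reflection_position((0.5, 0.9), math.pi / 2, 1.2))
```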
  • the map generation unit 12 acquires obstacle detection information output by the obstacle detection unit 11.
  • the distance measurement value information included in the acquired obstacle detection information indicates a plurality of distance measurement values D.
  • the map generation unit 12 generates a map M corresponding to the frequency distribution by the plurality of distance measurement values D.
  • Specifically, the map generation unit 12 determines which of a plurality of distance sections (hereinafter referred to as "bins") each of the plurality of distance measurement values D falls into. The map generation unit 12 then counts the number of distance measurement values D included in each bin based on the result of this determination.
  • the map M corresponding to the frequency distribution is generated.
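A minimal sketch of this map generation process (binning the distance measurement values D into a frequency distribution); the 0.1 m bin width is an assumption chosen only for the example.

```python
from collections import Counter

def generate_map(distance_values, bin_width=0.1):
    """Map M: for each distance bin, the number (frequency F) of distance
    measurement values D that fall into it."""
    frequencies = Counter(int(d // bin_width) for d in distance_values)
    # {bin index k: frequency F}, where bin k covers [k*bin_width, (k+1)*bin_width)
    return dict(frequencies)


# Example: values clustered around 1.2 m (an obstacle) plus one stray value
# (road surface noise) produce one tall frequency group and one isolated bin.
print(generate_map([1.18, 1.21, 1.22, 1.19, 0.65]))
```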
  • the transmitted exploration wave may be applied to the road surface.
  • the irradiated exploration wave is reflected by the road surface.
  • the reflected exploration wave (that is, the reflected wave) is received by the ranging sensor 3.
  • the received signal acquired by the received signal acquisition unit 33 may include a component corresponding to the reflected wave by the road surface (that is, road surface noise) in addition to the component corresponding to the reflected wave by the obstacle O.
  • the distance value d calculated by the distance calculation unit 34 may include the distance value d corresponding to the road surface noise in addition to the distance value d corresponding to the obstacle O.
  • the reflection position Pr calculated by the position calculation unit 35 may include the reflection position Pr corresponding to the road surface noise in addition to the reflection position Pr corresponding to the obstacle O.
  • the map M generated by the map generation unit 12 can include the frequency group FG corresponding to the road surface noise in addition to the frequency group FG corresponding to the obstacle O.
  • the frequency group FG corresponding to the obstacle O among the plurality of frequency group FGs in the map M may be referred to as a “peak shape”.
  • the code of "PS" may be used for the peak shape.
  • the peak shape extraction unit 13 extracts the peak shape PS in the map M. Specifically, for example, the peak shape extraction unit 13 extracts the peak shape PS as follows.
  • the peak shape extraction unit 13 executes the peak separation process for the map M. As a result, the individual frequency group FG in the map M is extracted.
  • various known techniques can be used for the peak extraction process.
  • the peak shape extraction unit 13 may extract a range in which the frequency F in the map M continuously exceeds a predetermined value as one frequency group FG.
  • a clustering method such as the k-nearest neighbor method may be used for the peak separation process.
  • the peak shape extraction unit 13 extracts a frequency group FG including a frequency F exceeding a predetermined threshold value (hereinafter referred to as “first threshold value”) Fth1 among the plurality of extracted frequency group FGs. As a result, the peak shape PS is extracted.
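A minimal sketch of the peak shape extraction process under a simplifying assumption: the peak separation step is approximated here by grouping runs of adjacent non-empty bins, whereas the text above notes that clustering methods may also be used. Variable names are illustrative.

```python
def extract_peak_shapes(map_m, fth1, min_frequency=1):
    """Peak separation followed by screening against the first threshold value.

    map_m: {bin index: frequency F}.  Frequency groups FG are runs of adjacent
    bins whose frequency is at least min_frequency (a simple stand-in for the
    peak separation process).  A group is kept as a peak shape PS only if it
    contains a frequency F exceeding the first threshold value Fth1.
    """
    groups, current = [], {}
    for b in range(min(map_m), max(map_m) + 2):
        f = map_m.get(b, 0)
        if f >= min_frequency:
            current[b] = f
        elif current:
            groups.append(current)
            current = {}
    return [g for g in groups if max(g.values()) > fth1]


# Example: the group over bins 11-13 exceeds Fth1 = 2 and is extracted as the
# peak shape PS; the isolated road-noise bins (6 and 20) are discarded.
print(extract_peak_shapes({6: 1, 11: 3, 12: 4, 13: 3, 20: 1}, fth1=2))
```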
  • the discriminating unit 14 includes the height discriminating unit 41.
  • the height determination unit 41 determines the height of the obstacle O based on the peak shape PS extracted by the peak shape extraction unit 13. Specifically, for example, the height determination unit 41 determines which of the two heights the height of the obstacle O is, as follows.
  • an obstacle having a height high enough to come into contact with the bumper of the vehicle 1 due to the movement of the vehicle 1 is referred to as a "running obstacle”.
  • the traveling obstacle has a height large enough to come into contact with the door of the vehicle 1 by opening and closing the door of the vehicle 1.
  • the traveling obstacle includes, for example, a wall or another parked vehicle (hereinafter referred to as "parked vehicle”).
  • On the other hand, an obstacle having a height so small that it cannot come into contact with the bumper of the vehicle 1 due to the movement of the vehicle 1 is referred to as a "road obstacle".
  • a road obstacle has a height so small that it cannot come into contact with the door of the vehicle 1 by opening and closing the door of the vehicle 1.
  • Road obstacles include, for example, curbs or steps.
  • the height determination unit 41 compares the maximum value of the plurality of frequencies F in the peak shape PS with a predetermined threshold value (hereinafter referred to as “second threshold value”) Fth2. In other words, the height determination unit 41 compares the height of the peak shape PS with the second threshold value Fth2.
  • the second threshold value Fth2 is set to a value larger than the first threshold value Fth1.
  • the height determination unit 41 determines that the corresponding obstacle O is a traveling obstacle. On the other hand, when the height of the peak shape PS is equal to or less than the second threshold value Fth2, the height determination unit 41 determines that the corresponding obstacle O is a road obstacle.
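A minimal sketch of this height determination, assuming the peak shape PS is represented as a {bin: frequency} mapping as in the sketches above; the threshold value used in the example is arbitrary.

```python
def classify_obstacle_height(peak_shape, fth2):
    """Compare the height of the peak shape PS (its maximum frequency F)
    with the second threshold value Fth2."""
    peak_height = max(peak_shape.values())
    return "traveling obstacle" if peak_height > fth2 else "road surface obstacle"


print(classify_obstacle_height({11: 3, 12: 4, 13: 3}, fth2=3))  # traveling obstacle
print(classify_obstacle_height({11: 1, 12: 2, 13: 1}, fth2=3))  # road surface obstacle
```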
  • the main part of the obstacle detection device 7 is configured.
  • the obstacle detection device 7 outputs a signal indicating a detection result by the obstacle detection unit 11 (that is, a signal indicating obstacle detection information) to the outside of the obstacle detection device 7. Further, the obstacle detection device 7 outputs a signal indicating the discrimination result by the discrimination unit 14 (that is, a signal indicating the discrimination result by the height discrimination unit 41) to the outside of the obstacle detection device 7.
  • the signals output by the obstacle detection device 7 may be collectively referred to as “result signal”.
  • the parking support control unit 21 acquires the result signal output by the obstacle detection device 7.
  • The parking support control unit 21 detects a space in which the vehicle 1 can be parked (hereinafter referred to as "parking space") in the surrounding area (for example, the left area LA) by using the acquired result signal.
  • Various known techniques can be used to detect the parking space. Detailed description of these techniques will be omitted.
  • the parking support control unit 21 executes control to park the vehicle 1 in the detected parking space by operating the vehicle 1. That is, the parking support control unit 21 executes control for realizing so-called "automatic parking”.
  • Various known techniques can be used for control to realize automatic parking. Detailed description of these techniques will be omitted.
  • the parking support control unit 21 uses an output device (not shown) to execute control to notify the driver of the vehicle 1 of the detected parking space.
  • the output device includes, for example, a display and a speaker.
  • the driver of the vehicle 1 parks the vehicle 1 in the parking space by so-called "manual parking".
  • Various known techniques can be used for such notification. Detailed description of these techniques will be omitted.
  • the main part of the parking support device 6 is configured.
  • the processing executed by the obstacle detection unit 11 may be collectively referred to as “obstacle detection processing”.
  • the processes executed by the map generation unit 12 may be collectively referred to as “map generation process”.
  • the processes executed by the peak shape extraction unit 13 may be collectively referred to as “peak shape extraction process”.
  • the processes executed by the discriminating unit 14 may be collectively referred to as “discrimination processing”.
  • the processing and control executed by the parking support control unit 21 may be collectively referred to as "parking support control”.
  • the processing executed by the obstacle detection device 7 may be collectively referred to as "obstacle detection processing, etc.” That is, the obstacle detection process and the like include an obstacle detection process, a map generation process, a peak shape extraction process, and a discrimination process.
  • the functions of the obstacle detection unit 11 may be collectively referred to as "obstacle detection function”.
  • the functions of the map generation unit 12 may be collectively referred to as “map generation function”.
  • the functions of the peak shape extraction unit 13 may be collectively referred to as “peak shape extraction function”.
  • the functions of the discrimination unit 14 may be collectively referred to as “discrimination function”.
  • the functions of the parking support control unit 21 may be collectively referred to as "parking support function".
  • the code of "F1” may be used for the obstacle detection function.
  • the code of "F2” may be used for the map generation function.
  • the code of "F3” may be used for the peak shape extraction function.
  • the reference numeral of "F4" may be used for the discrimination function.
  • the code “F11” may be used for the parking support function.
  • the parking support device 6 has a processor 51 and a memory 52.
  • the memory 52 stores programs corresponding to a plurality of functions (including an obstacle detection function, a map generation function, a peak shape extraction function, a discrimination function, and a parking support function) F1 to F4 and F11.
  • the processor 51 reads and executes the program stored in the memory 52. As a result, a plurality of functions F1 to F4 and F11 are realized.
  • the parking support device 6 has a processing circuit 53.
  • the processing circuit 53 executes processing corresponding to a plurality of functions F1 to F4 and F11. As a result, a plurality of functions F1 to F4 and F11 are realized.
  • the parking support device 6 has a processor 51, a memory 52, and a processing circuit 53.
  • the memory 52 stores programs corresponding to some of the plurality of functions F1 to F4 and F11.
  • the processor 51 reads and executes the program stored in the memory 52. As a result, some of these functions are realized.
  • the processing circuit 53 executes processing corresponding to the remaining functions of the plurality of functions F1 to F4 and F11. As a result, such residual functions are realized.
  • the processor 51 is composed of one or more processors.
  • Each processor uses, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, or a DSP (Digital Signal Processor).
  • the memory 52 is composed of one or more non-volatile memories.
  • the memory 52 is composed of one or more non-volatile memories and one or more volatile memories. That is, the memory 52 is composed of one or more memories.
  • the individual memory uses, for example, a semiconductor memory or a magnetic disk. More specifically, each volatile memory uses, for example, a RAM (Random Access Memory).
  • Each non-volatile memory uses, for example, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), a solid state drive, or a hard disk drive.
  • the processing circuit 53 is composed of one or more digital circuits.
  • the processing circuit 53 is composed of one or more digital circuits and one or more analog circuits. That is, the processing circuit 53 is composed of one or more processing circuits.
  • Each processing circuit uses, for example, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field Programmable Gate Array), an SoC (System on a Chip), or a system LSI.
  • the processor 51 when the processor 51 is composed of a plurality of processors, the correspondence between the plurality of functions F1 to F4 and F11 and the plurality of processors is arbitrary. That is, each of the plurality of processors may read and execute a program corresponding to one or more corresponding functions among the plurality of functions F1 to F4 and F11.
  • the processor 51 may include a dedicated processor corresponding to each of the plurality of functions F1 to F4 and F11.
  • each of the plurality of memories may store a program corresponding to one or more corresponding functions among the plurality of functions F1 to F4 and F11.
  • the memory 52 may include a dedicated memory corresponding to each of the plurality of functions F1 to F4 and F11.
  • the processing circuit 53 when the processing circuit 53 is composed of a plurality of processing circuits, the correspondence between the plurality of functions F1 to F4 and F11 and the plurality of processing circuits is arbitrary. That is, each of the plurality of processing circuits may execute processing corresponding to one or more corresponding functions among the plurality of functions F1 to F4 and F11.
  • the processing circuit 53 may include a dedicated processing circuit corresponding to each of the plurality of functions F1 to F4 and F11.
  • The operation of the obstacle detection device 7 will be described with reference to the flowchart shown in FIG. 7. The process shown in FIG. 7 is executed when the vehicle 1 is moving at a speed Vm equal to or lower than the predetermined speed Vth.
  • the obstacle detection unit 11 executes the obstacle detection process (step ST1).
  • the map generation unit 12 executes the map generation process (step ST2).
  • the peak shape extraction unit 13 executes the peak shape extraction process (step ST3).
  • the discrimination unit 14 executes the discrimination process (step ST4).
  • the obstacle detection device 7 executes obstacle detection processing and the like (step ST11). As a result, the processes of steps ST1 to ST4 shown in FIG. 7 are executed.
  • the parking support control unit 21 executes parking support control (step ST21).
  • an obstacle O exists in the left region LA.
  • the obstacle O is, for example, a wall or a curb. That is, the obstacle O has a long shape.
  • the vehicle 1 moves at a speed Vm equal to or lower than a predetermined speed Vth.
  • Dm indicates the moving direction of the vehicle 1.
  • the X-axis shows a virtual axis along the front-rear direction of the vehicle 1 (that is, the moving direction Dm of the vehicle 1). Further, the Y-axis indicates a virtual axis along the left-right direction of the vehicle 1 (that is, the direction perpendicular to the moving direction Dm of the vehicle 1).
  • the position calculation unit 35 detects a plurality of reflection points RP_1 corresponding to the obstacle O. In addition to this, it is assumed that a plurality of reflection points RP_2 and RP_3 corresponding to the road surface noise are detected. In FIG. 9, only one reflection point RP_1 out of the plurality of reflection points RP_1 is designated by the reference numeral “RP_1”.
  • In this case, the map generation unit 12 generates the map M shown in FIG. 10.
  • the map M includes the frequency group FG_1 corresponding to the obstacle O. That is, the map M includes the frequency group FG_1 corresponding to the plurality of reflection points RP_1.
  • the frequency group FG_1 contains three frequencies F_1 belonging to each of three adjacent bins.
  • the map M includes the frequency group FG_2 corresponding to the road surface noise. That is, the map M includes the frequency group FG_2 corresponding to one reflection point RP_2 among the plurality of reflection points RP_2 and RP_3.
  • the frequency group FG_2 includes one frequency F_2 belonging to one bin.
  • the map M includes the frequency group FG_3 corresponding to the road surface noise. That is, the map M includes the frequency group FG_3 corresponding to one reflection point RP_3 out of the plurality of reflection points RP_2 and RP_3.
  • the frequency group FG_3 includes one frequency F_3 belonging to one bin.
  • the peak shape extraction unit 13 executes the peak separation process for the map M. As a result, the individual frequency groups FG_1, FG_2, and FG_3 in the map M are extracted.
  • the peak shape extraction unit 13 determines whether or not the individual frequency groups FG_1, FG_2, and FG_3 include the frequency F exceeding the first threshold value Fth1. In other words, the peak shape extraction unit 13 compares the heights of the individual frequency groups FG_1, FG_2, and FG_3 in the map M with the first threshold value Fth1.
  • the frequency group FG_1 includes a frequency F_1 that exceeds the first threshold value Fth1.
  • the frequency group FG_2 does not include the frequency F_2 exceeding the first threshold value Fth1.
  • the frequency group FG_3 does not include the frequency F_3 exceeding the first threshold value Fth1.
  • The peak shape extraction unit 13 extracts the frequency group FG_1, which includes the frequency F_1 exceeding the first threshold value Fth1, from among the three frequency groups FG_1, FG_2, and FG_3. As a result, the peak shape PS is extracted.
  • The frequency with which the reflected wave from the road surface is received by the distance measuring sensor 3 is lower than the frequency with which the reflected wave from the obstacle O is received by the distance measuring sensor 3. Therefore, when the difference between the distance measurement value D corresponding to the obstacle O and the distance measurement value D corresponding to the road surface noise is sufficiently large (that is, when the frequency group FG corresponding to the obstacle O is separable from the frequency group FG corresponding to the road surface noise), the height of the frequency group FG corresponding to the obstacle O is larger than the height of the frequency group FG corresponding to the road surface noise. In other words, the height of the frequency group FG corresponding to the road surface noise is smaller than the height of the frequency group FG corresponding to the obstacle O.
  • the frequency group FG having a height of the first threshold value Fth1 or more can be extracted. That is, the peak shape PS can be extracted.
  • the first threshold value Fth1 is set to a value that can distinguish the frequency group FG corresponding to the obstacle O and the frequency group FG corresponding to the road surface noise.
  • The installation height of the distance measuring sensor 3 in the vehicle 1 is set to a height of about several tens of centimeters with respect to the road surface. Therefore, for an obstacle O having a small height (for example, a curb), when the distance measuring sensor 3 transmits an exploration wave, the transmitted exploration wave may not be applied to the obstacle O. As a result, for an obstacle O having a small height, the reflected wave may not be received by the distance measuring sensor 3. Therefore, the height of the peak shape PS corresponding to a road obstacle (that is, the maximum value of the frequency F in the peak shape PS corresponding to the road obstacle) is smaller than the height of the peak shape PS corresponding to a traveling obstacle (that is, the maximum value of the frequency F in the peak shape PS corresponding to the traveling obstacle).
  • Therefore, the height determination unit 41 determines whether the corresponding obstacle O is a traveling obstacle or a road obstacle by comparing the height of the peak shape PS with the second threshold value Fth2.
  • In the example shown in FIG. 11, the height of the peak shape PS is equal to or less than the second threshold value Fth2. In this case, the height determination unit 41 determines that the corresponding obstacle O is a road obstacle. On the other hand, in the example shown in FIG. 12, the height of the peak shape PS exceeds the second threshold value Fth2. In this case, the height determination unit 41 determines that the corresponding obstacle O is a traveling obstacle.
  • When the obstacle O is a traveling obstacle, the area of the portion of the obstacle O that reflects the exploration wave is larger than when the obstacle O is a road obstacle. Therefore, when the obstacle O is a traveling obstacle, the total number of corresponding round-trip propagation paths (hereinafter referred to as "paths") of the exploration wave is larger than when the obstacle O is a road obstacle, and the difference in path length between the paths is also larger.
  • the width W of the corresponding peak shape PS also becomes large.
  • Alternatively, the height determination unit 41 may determine whether the corresponding obstacle O is a traveling obstacle or a road surface obstacle by comparing the width W of the peak shape PS with a predetermined threshold value (hereinafter referred to as "width threshold value") Wth. That is, when the width W of the peak shape PS exceeds the width threshold value Wth, the height determination unit 41 determines that the corresponding obstacle O is a traveling obstacle. On the other hand, when the width W of the peak shape PS is equal to or less than the width threshold value Wth, the height determination unit 41 determines that the corresponding obstacle O is a road obstacle.
  • the height determination unit 41 may calculate the width W of the peak shape PS as follows.
  • the height determination unit 41 extracts a bin having a frequency F exceeding a predetermined threshold value (hereinafter referred to as “third threshold value”) Fth3 in the peak shape PS.
  • the height determination unit 41 calculates the width W based on the number n of the extracted bins. In other words, the height determination unit 41 uses a value corresponding to the number n of the extracted bins for the width W.
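A minimal sketch of this width-based determination, under the same representation assumptions as the sketches above; the bin width and the example thresholds are illustrative only.

```python
def peak_width(peak_shape, fth3, bin_width=0.1):
    """Width W of the peak shape PS: a value corresponding to the number n of
    bins whose frequency F exceeds the third threshold value Fth3."""
    n = sum(1 for f in peak_shape.values() if f > fth3)
    return n * bin_width

def is_traveling_obstacle_by_width(peak_shape, fth3, wth, bin_width=0.1):
    """Compare the width W of the peak shape PS with the width threshold value Wth."""
    return peak_width(peak_shape, fth3, bin_width) > wth


# Example: three of the five bins exceed Fth3 = 2, giving W of about 0.3 m.
ps = {10: 1, 11: 5, 12: 7, 13: 6, 14: 2}
print(peak_width(ps, fth3=2))                                # ~0.3 (3 bins exceed Fth3)
print(is_traveling_obstacle_by_width(ps, fth3=2, wth=0.25))  # True
```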
  • the width W calculation method is not limited to such a calculation method.
  • Various known techniques can be used to calculate the width W. Detailed description of these techniques will be omitted.
  • the peak shape extraction unit 13 may set the first threshold value Fth1 as follows.
  • the peak shape extraction unit 13 may use the set first threshold value Fth1 instead of using the predetermined first threshold value Fth1.
  • the distance measurement value information included in the obstacle detection information indicates a plurality of distance measurement values D.
  • The peak shape extraction unit 13 acquires, from the obstacle detection unit 11, information (hereinafter referred to as "wave number information") indicating the number of transmissions N of the exploration waves corresponding to the plurality of distance measurement values D.
  • The peak shape extraction unit 13 sets the first threshold value Fth1 by multiplying the number of transmissions N indicated by the acquired wave number information by a predetermined ratio (hereinafter referred to as "first ratio") R1. In other words, the peak shape extraction unit 13 uses, for the first threshold value Fth1, a value obtained by multiplying the number of transmissions N by the first ratio R1.
  • the number of transmissions N can vary depending on the speed Vm of the vehicle 1 and the length of the obstacle O.
  • the total number of distance measurement values D used to generate the map M may also differ depending on the speed Vm of the vehicle 1 and the length of the obstacle O. Therefore, the height of each frequency group FG in the map M may also differ depending on the speed Vm of the vehicle 1 and the length of the obstacle O.
  • By setting the first threshold value Fth1 according to the number of transmissions N, the first threshold value Fth1 can be set to an appropriate value regardless of the speed Vm of the vehicle 1 and the length of the obstacle O. As a result, the peak shape PS can be accurately extracted.
  • the height determination unit 41 may set the second threshold value Fth2 as follows. The height determination unit 41 may use the set second threshold value Fth2 instead of using the predetermined second threshold value Fth2.
  • the height determination unit 41 acquires wave number information from the obstacle detection unit 11.
  • The height determination unit 41 sets the second threshold value Fth2 by multiplying the number of transmissions N indicated by the acquired wave number information by a predetermined ratio (hereinafter referred to as "second ratio") R2. In other words, the height determination unit 41 uses, for the second threshold value Fth2, a value obtained by multiplying the number of transmissions N by the second ratio R2.
  • By setting the second threshold value Fth2 according to the number of transmissions N, the second threshold value Fth2 can be set to an appropriate value regardless of the speed Vm of the vehicle 1 and the length of the obstacle O. As a result, it is possible to accurately determine whether the obstacle O is a traveling obstacle or a road surface obstacle.
  • the height determination unit 41 may set the third threshold value Fth3 as follows.
  • the height determination unit 41 may use the set third threshold value Fth3 instead of using the predetermined third threshold value Fth3.
  • the height determination unit 41 extracts the bin having the maximum frequency F in the peak shape PS.
  • The height determination unit 41 sets the third threshold value Fth3 by multiplying the frequency F in the extracted bin by a predetermined ratio (hereinafter referred to as "third ratio") R3.
  • the third ratio R3 is set to 0.1.
  • the total value of the frequency F in the peak shape PS may differ depending on the condition of the road surface (for example, the presence or absence of a puddle) and the shape of the obstacle O (for example, the inclination of the wall).
  • By setting the third threshold value Fth3 according to the maximum value of the frequency F in the peak shape PS, the third threshold value Fth3 can be set to an appropriate value regardless of the condition of the road surface and the shape of the obstacle O. As a result, it is possible to accurately determine whether the obstacle O is a traveling obstacle or a road surface obstacle.
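The three data-dependent threshold settings described above can be summarized in a short sketch; the ratios R1 and R2 used here are assumed values (only R3 = 0.1 is given as an example in the text), and the function signature is illustrative.

```python
def set_thresholds(n_transmissions, peak_shape, r1=0.2, r2=0.5, r3=0.1):
    """Set the thresholds from the measurement data instead of fixed constants.

    Fth1 = R1 * N   (N: number of exploration-wave transmissions)
    Fth2 = R2 * N
    Fth3 = R3 * (maximum frequency F in the peak shape PS)
    R1 = 0.2 and R2 = 0.5 are assumptions; R3 = 0.1 follows the example above.
    """
    fth1 = r1 * n_transmissions
    fth2 = r2 * n_transmissions
    fth3 = r3 * max(peak_shape.values())
    return fth1, fth2, fth3


print(set_thresholds(20, {11: 3, 12: 4, 13: 3}))  # (4.0, 10.0, 0.4)
```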
  • As described above, the obstacle detection device 7 according to the first embodiment includes: the obstacle detection unit 11 that acquires information corresponding to an obstacle O (obstacle detection information) by detecting the obstacle O in the surrounding area with respect to the moving vehicle 1 by using the distance measuring sensor 3 provided in the vehicle 1; the map generation unit 12 that generates the map M corresponding to the frequency distribution based on the plurality of distance measurement values D included in the information (obstacle detection information); the peak shape extraction unit 13 that extracts the peak shape PS in the map M by extracting, from the plurality of frequency groups FG in the map M, the frequency group FG including the frequency F exceeding the first threshold value Fth1; and the discrimination unit 14 that determines the height of the obstacle O based on the peak shape PS. As a result, a decrease in the accuracy of determining the height of the obstacle O due to road surface noise can be suppressed.
  • the discriminating unit 14 determines whether the obstacle O is a traveling obstacle or a road obstacle by comparing the height of the peak shape PS with the second threshold value Fth2 which is larger than the first threshold value Fth1. .. Thereby, it is possible to accurately determine whether the obstacle O is a traveling obstacle or a road obstacle.
  • the discrimination unit 14 determines whether the obstacle O is a traveling obstacle or a road obstacle by comparing the width W of the peak shape PS with the width threshold value Wth. Thereby, it is possible to accurately determine whether the obstacle O is a traveling obstacle or a road obstacle.
  • the peak shape extraction unit 13 sets the first threshold value Fth1 by multiplying the number of transmissions N of the exploration waves corresponding to the plurality of distance measurement values D by the first ratio R1. Thereby, the first threshold value Fth1 can be set to an appropriate value. As a result, the peak shape PS can be accurately extracted.
  • the discrimination unit 14 sets the second threshold value Fth2 by multiplying the transmission frequency N of the exploration waves corresponding to the plurality of distance measurement values D by the second ratio R2. Thereby, the second threshold value Fth2 can be set to an appropriate value. As a result, the height of the obstacle O can be accurately determined.
  • the discrimination unit 14 extracts bins having a frequency F exceeding the third threshold value Fth3 in the peak shape PS, and calculates the width W of the peak shape PS based on the number n of the extracted bins. Thereby, the width W of the peak shape PS can be calculated.
  • the discrimination unit 14 extracts the bin having the maximum frequency F in the peak shape PS, and sets the third threshold value Fth3 by multiplying the frequency F in the extracted bin by the third ratio R3. Thereby, the third threshold value Fth3 can be set to an appropriate value. As a result, the height of the obstacle O can be accurately determined.
  • the parking support device 6 includes an obstacle detection device 7 and a parking support control unit 21 that executes parking support control based on the detection result by the obstacle detection unit 11 and the discrimination result by the discrimination unit 14. , Equipped with.
  • the obstacle detection device 7 can accurately determine the height of the obstacle O. By using the result of such determination, the parking space can be detected with high accuracy.
  • Further, the obstacle detection method according to the first embodiment includes: step ST1 in which the obstacle detection unit 11 acquires information corresponding to an obstacle O (obstacle detection information) by detecting the obstacle O in the surrounding area with respect to the moving vehicle 1 by using the distance measuring sensor 3 provided in the vehicle 1; step ST2 in which the map generation unit 12 generates the map M corresponding to the frequency distribution based on the plurality of distance measurement values D included in the information (obstacle detection information); step ST3 in which the peak shape extraction unit 13 extracts the peak shape PS in the map M by extracting, from the plurality of frequency groups FG in the map M, the frequency group FG including the frequency F exceeding the first threshold value Fth1; and step ST4 in which the discrimination unit 14 determines the height of the obstacle O based on the peak shape PS.
  • FIG. 15 is a block diagram showing a main part of the parking support system including the obstacle detection device according to the second embodiment.
  • a parking support system including an obstacle detection device according to the second embodiment will be described with reference to FIG.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals and the description thereof will be omitted.
  • the parking support system 2a includes a distance measuring sensor 3, first sensors 4, second sensors 5, and a parking support device 6a.
  • the parking support device 6a includes an obstacle detection unit 11, a map generation unit 12, a peak shape extraction unit 13, a discrimination unit 14, a correction unit 15, and a parking support control unit 21.
  • the obstacle detection unit 11, the map generation unit 12, the peak shape extraction unit 13, the discrimination unit 14, and the correction unit 15 constitute a main part of the obstacle detection device 7a.
  • the distance measurement value information included in the obstacle detection information includes a plurality of distance measurement values D.
  • the plurality of distance measurement values D correspond to a plurality of reflection point RPs.
  • The correction unit 15 corrects the plurality of distance measurement values D so that the arrangement direction Da of the plurality of reflection points RP becomes parallel or substantially parallel to the movement direction Dm of the vehicle 1.
  • parallel or substantially parallel is collectively referred to as "parallel”.
  • The map generation unit 12 uses the distance measurement values D' corrected by the correction unit 15 to generate the map M.
  • the vehicle 1 moves in the direction along the longitudinal direction of the obstacle O.
  • the plurality of reflection point RPs corresponding to the obstacle O are arranged in a direction parallel to the moving direction Dm of the vehicle 1 (see, for example, FIG. 9).
  • the vehicle 1 may move in an oblique direction with respect to the longitudinal direction of the obstacle O (see, for example, FIG. 18 described later).
  • the distance measurement value D corresponding to the obstacle O fluctuates with time.
  • the plurality of distance measurement values D corresponding to the obstacle O are different from each other.
  • the frequency group FG corresponding to the obstacle O is distorted. That is, distortion occurs in the peak shape PS.
  • the main part of the obstacle detection device 7a is configured.
  • correction processing the processes executed by the correction unit 15 may be collectively referred to as "correction processing”. Further, the functions of the correction unit 15 may be collectively referred to as “correction function”. Further, the reference numeral of "F5" may be used for such a correction function.
  • the processing executed by the obstacle detection device 7a may be collectively referred to as "obstacle detection processing, etc.” That is, the obstacle detection process and the like include an obstacle detection process, a correction process, a map generation process, a peak shape extraction process, and a discrimination process.
  • the hardware configuration of the main part of the parking support device 6a is the same as that described with reference to FIGS. 4 to 6 in the first embodiment. Therefore, detailed description thereof will be omitted.
  • The parking support device 6a has a plurality of functions (including an obstacle detection function, a correction function, a map generation function, a peak shape extraction function, a discrimination function, and a parking support function) F1 to F5 and F11.
  • Each of the plurality of functions F1 to F5 and F11 may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
  • the processor 51 may include a dedicated processor corresponding to each of the plurality of functions F1 to F5 and F11.
  • the memory 52 may include a dedicated memory corresponding to each of the plurality of functions F1 to F5 and F11.
  • the processing circuit 53 may include a dedicated processing circuit corresponding to each of the plurality of functions F1 to F5 and F11.
  • The process of step ST1 is executed.
  • the correction unit 15 executes the correction process (step ST5).
  • the processes of steps ST2 to ST4 are executed.
  • The map generation unit 12 uses the distance measurement values D' corrected by the correction unit 15.
  • The obstacle detection device 7a executes obstacle detection processing and the like (step ST11a).
  • steps ST1, ST5, ST2 to ST4 shown in FIG. 16 are executed.
  • The process of step ST21 is executed.
  • the correction unit 15 sets a straight line SL_ref parallel to the X axis. That is, the correction unit 15 sets a straight line SL_ref parallel to the moving direction Dm of the vehicle 1. Further, the correction unit 15 detects a straight line SL along the arrangement direction Da. Next, the correction unit 15 calculates the angle ⁇ of the straight line SL with respect to the straight line SL_ref.
  • Next, the correction unit 15 increases or decreases the distance measurement value D corresponding to each reflection point RP so that the straight line SL becomes parallel to the straight line SL_ref, by rotating the straight line SL by a rotation angle (-θ) corresponding to the calculated angle θ.
  • the center point of such rotation is arbitrary.
  • the distance measurement value D corresponding to each reflection point RP is corrected so that the arrangement direction Da is parallel to the movement direction Dm.
  • the correction unit 15 may detect the straight line SL by executing the straight line detection using the Hough transform.
  • Alternatively, the correction unit 15 may detect the straight line SL by executing straight line detection using the RANSAC (Random Sample Consensus) algorithm.
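A minimal sketch of this correction, with two stated substitutions: an ordinary least-squares fit stands in for the Hough-transform or RANSAC line detection named above, and the centroid of the reflection points is used as the rotation centre (the text leaves the centre arbitrary).

```python
import numpy as np

def align_reflection_points(reflection_points):
    """Rotate the reflection points RP by -theta so that the straight line SL
    fitted along their arrangement direction Da becomes parallel to the
    straight line SL_ref (the X axis, i.e. the moving direction Dm)."""
    pts = np.asarray(reflection_points, dtype=float)     # shape (n, 2): (x, y)
    slope, _ = np.polyfit(pts[:, 0], pts[:, 1], deg=1)   # SL: y = slope * x + intercept
    theta = np.arctan(slope)                             # angle of SL w.r.t. SL_ref
    c, s = np.cos(-theta), np.sin(-theta)
    rotation = np.array([[c, -s], [s, c]])
    centre = pts.mean(axis=0)
    rotated = (pts - centre) @ rotation.T + centre
    # The corrected distance measurement values D' can then be taken from the
    # rotated points (e.g. their lateral offsets for a side-mounted sensor).
    return rotated


# Example: points along an oblique wall become aligned with the X axis.
print(align_reflection_points([(0.0, 1.0), (0.5, 1.2), (1.0, 1.4), (1.5, 1.6)]))
```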
  • As described above, the obstacle detection device 7a according to the second embodiment includes the correction unit 15 that corrects the plurality of distance measurement values D so that the arrangement direction Da of the plurality of reflection points RP corresponding to the plurality of distance measurement values D becomes parallel to the movement direction Dm of the vehicle 1. Thereby, distortion of the peak shape PS can be suppressed.
  • FIG. 19 is a block diagram showing a main part of a collision avoidance system including an obstacle detection device according to a third embodiment.
  • FIG. 20 is a block diagram showing a main part of the obstacle detection unit in the obstacle detection device according to the third embodiment.
  • a collision avoidance system including the obstacle detection device according to the third embodiment will be described with reference to FIGS. 19 and 20.
  • FIG. 19 the same blocks as those shown in FIG. 1 are designated by the same reference numerals and the description thereof will be omitted. Further, in FIG. 20, the same blocks as those shown in FIG. 2 are designated by the same reference numerals and the description thereof will be omitted.
  • the vehicle 1 has a collision avoidance system 8.
  • the collision avoidance system 8 includes a distance measuring sensor 3, second sensors 5, and a collision avoidance device 9.
  • the collision avoidance device 9 includes an obstacle detection unit 11a, a map generation unit 12, a peak shape extraction unit 13, a discrimination unit 14, a correction unit 15a, and a collision avoidance control unit 61.
  • the obstacle detection unit 11a, the map generation unit 12, the peak shape extraction unit 13, the discrimination unit 14, and the correction unit 15a constitute a main part of the obstacle detection device 7b.
  • the obstacle detection unit 11a includes a transmission signal output unit 32, a reception signal acquisition unit 33, a distance calculation unit 34, and a position calculation unit 35.
  • the distance measuring sensor 3 is provided at the front end portion of the vehicle 1 or the rear end portion of the vehicle 1.
  • the obstacle detection unit 11a detects an obstacle O in a surrounding area (for example, a front area FA) when the vehicle 1 is moving. That is, in the collision avoidance system 8, the obstacle detection process can be executed regardless of the speed Vm when the vehicle 1 is moving.
  • the obstacle detection device 7b is provided with a correction unit 15a.
  • the correction unit 15a corrects each distance measurement value D with a correction amount corresponding to the movement amount ΔD of the vehicle 1. Specifically, for example, the correction unit 15a corrects each distance measurement value D as follows.
  • the correction unit 15a acquires, from the obstacle detection unit 11a, information indicating the position of the vehicle 1 at the time when the exploration wave corresponding to the first distance measurement value D among the plurality of distance measurement values D indicated by the distance measurement value information was transmitted (hereinafter, the "first vehicle position"). Further, the correction unit 15a acquires, from the obstacle detection unit 11a, information indicating the position of the vehicle 1 at the time when the exploration wave corresponding to each of the second and subsequent distance measurement values D was transmitted (hereinafter, the "second vehicle position").
  • the correction unit 15a calculates the movement amount ΔD by calculating the distance from the first vehicle position to the corresponding second vehicle position for each of the second and subsequent distance measurement values D.
  • the correction unit 15a calculates the corrected distance measurement value D″ by adding the corresponding movement amount ΔD to each of the second and subsequent distance measurement values D.
  • the map generation unit 12 uses the distance measurement values D″ corrected by the correction unit 15a for generating the map M. Thereby, the map M can be generated accurately.
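  • a minimal sketch of this movement-amount correction is given below; it assumes the vehicle positions are available as planar coordinates and that the movement amount ΔD is the straight-line distance from the first vehicle position, which is an assumption made for illustration.

```python
import numpy as np

def correct_for_vehicle_motion(distances, positions):
    """Correct the distance measurement values D by the movement amount ΔD.

    distances: D_1, ..., D_n measured while the vehicle 1 is moving.
    positions: (n, 2) vehicle positions P_1, ..., P_n at the times the
               corresponding exploration waves were transmitted.
    Returns D'' with D''_1 = D_1 and D''_k = D_k + ΔD_k, where
    ΔD_k = |P_k - P_1| (Euclidean distance, assumed for this sketch).
    """
    d = np.asarray(distances, dtype=float)
    p = np.asarray(positions, dtype=float)
    delta = np.linalg.norm(p - p[0], axis=1)   # ΔD_1 = 0, ΔD_k = |P_k - P_1|
    return d + delta

# Illustrative call, e.g. for three measurements taken while moving forward:
# correct_for_vehicle_motion([2.0, 1.9, 1.7], [(0, 0), (0, 0.1), (0, 0.3)])
```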
  • the main part of the obstacle detection device 7b is configured.
  • the obstacle detection device 7b outputs a signal indicating the detection result by the obstacle detection unit 11a (that is, a signal indicating the obstacle detection information) to the outside of the obstacle detection device 7b. Further, the obstacle detection device 7b outputs a signal indicating the discrimination result by the discrimination unit 14 (that is, a signal indicating the discrimination result by the height discrimination unit 41) to the outside of the obstacle detection device 7b. Hereinafter, these signals may be collectively referred to as the "result signal"; that is, the obstacle detection device 7b outputs the result signal to the outside of the obstacle detection device 7b.
  • the collision avoidance control unit 61 acquires the result signal output by the obstacle detection device 7b.
  • the collision avoidance control unit 61 calculates the probability that the vehicle 1 collides with the obstacle O by using the acquired result signal.
  • Various known techniques can be used to calculate such a probability. Detailed description of these processes will be omitted.
  • the collision avoidance control unit 61 executes control to stop the vehicle 1 when the calculated probability is high. That is, the collision avoidance control unit 61 executes control for realizing the so-called “automatic braking” or “collision damage mitigation braking". Various known techniques can be used for such control. Detailed description of these techniques will be omitted.
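  • purely as an illustrative sketch of such control logic (the probability calculation and braking strategy are left to known techniques, as noted above), the decision could reduce to a threshold comparison; the threshold value and the interface names below are hypothetical.

```python
COLLISION_PROB_THRESHOLD = 0.8   # hypothetical value, not taken from the disclosure

def collision_avoidance_control(collision_probability: float, request_stop) -> bool:
    """Request a stop of the vehicle 1 when the collision probability is high.

    collision_probability: value computed from the result signal by a known
    technique (not reproduced here).  request_stop: callable that commands
    the braking actuator.  Returns True when braking was requested.
    """
    if collision_probability > COLLISION_PROB_THRESHOLD:
        request_stop()   # so-called "automatic braking" / collision damage mitigation braking
        return True
    return False
```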
  • the main part of the collision avoidance device 9 is configured.
  • the processing executed by the obstacle detection unit 11a may be collectively referred to as "obstacle detection processing”. Further, the functions of the obstacle detection unit 11a may be collectively referred to as an “obstacle detection function”. Further, the reference numeral of "F1a" may be used for the obstacle detection function.
  • hereinafter, the processes executed by the correction unit 15a may be collectively referred to as "correction processing".
  • further, the functions of the correction unit 15a may be collectively referred to as the "correction function". The reference numeral "F5a" may be used for such a correction function.
  • the processing executed by the obstacle detection device 7b may be collectively referred to as "obstacle detection processing, etc.” That is, the obstacle detection process and the like include an obstacle detection process, a correction process, a map generation process, a peak shape extraction process, and a discrimination process.
  • hereinafter, the processing and control executed by the collision avoidance control unit 61 may be collectively referred to as "collision avoidance control". Further, the functions of the collision avoidance control unit 61 may be collectively referred to as the "collision avoidance function". Further, the reference numeral "F21" may be used for such a collision avoidance function.
  • the hardware configuration of the main part of the collision avoidance device 9 is the same as that described with reference to FIGS. 4 to 6 in the first embodiment. Therefore, detailed description thereof will be omitted.
  • the collision avoidance device 9 includes a plurality of functions F1a, F2 to F4, F5a, and F21 (including an obstacle detection function, a correction function, a map generation function, a peak shape extraction function, a discrimination function, and a collision avoidance function). Each of the plurality of functions F1a, F2 to F4, F5a, and F21 may be realized by the processor 71 and the memory 72, or may be realized by the processing circuit 73 (see FIG. 21, FIG. 22, or FIG. 23).
  • the processor 71 may include a dedicated processor corresponding to each of the plurality of functions F1a, F2 to F4, F5a, F21.
  • the memory 72 may include a dedicated memory corresponding to each of the plurality of functions F1a, F2 to F4, F5a, F21.
  • the processing circuit 73 may include a dedicated processing circuit corresponding to each of the plurality of functions F1a, F2 to F4, F5a, F21.
  • the obstacle detection unit 11a executes the obstacle detection process (step ST1a).
  • the correction unit 15a executes the correction process (step ST5a).
  • the processes of steps ST2 to ST4 are executed.
  • the map generation unit 12 uses the distance measurement values D″ corrected by the correction unit 15a.
  • the obstacle detection device 7b executes obstacle detection processing and the like (step ST11b). As a result, the processes of steps ST1a, ST5a, and ST2 to ST4 shown in FIG. 24 are executed.
  • the collision avoidance control unit 61 executes the collision avoidance control (step ST31).
  • the distance measurement value information includes three distance measurement values D_1, D_2, and D_3 corresponding to the three reflection points RP_1, RP_2, and RP_3, respectively.
  • P_1 indicates the position of the vehicle 1 at the time when the exploration wave corresponding to the first reflection point RP_1 is transmitted.
  • P_2 indicates the position of the vehicle 1 at the time when the exploration wave corresponding to the second reflection point RP_2 is transmitted.
  • P_3 indicates the position of the vehicle 1 at the time when the exploration wave corresponding to the third reflection point RP_3 is transmitted.
  • the following equation (2) holds for the distance from the position P_1 to the position P_2 (that is, the movement amount ΔD_2 corresponding to the second distance measurement value D_2).
  • the following equation (3) holds for the distance from the position P_1 to the position P_3 (that is, the movement amount ΔD_3 corresponding to the third distance measurement value D_3).
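  • equations (2) and (3) themselves are not reproduced in this excerpt; assuming the movement amount is simply the straight-line distance between the respective vehicle positions, written as P_k = (x_k, y_k) (an assumption for illustration), they would take a form such as:

```latex
\Delta D_2 = \lVert P_2 - P_1 \rVert = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} \tag{2}
```
```latex
\Delta D_3 = \lVert P_3 - P_1 \rVert = \sqrt{(x_3 - x_1)^2 + (y_3 - y_1)^2} \tag{3}
```

  • under that assumption, the corrected values would be D″_2 = D_2 + ΔD_2 and D″_3 = D_3 + ΔD_3, consistent with the correction described above.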
  • the map generation unit 12 generates the map M using these distance measurement values D_1, D″_2, and D″_3.
  • the obstacle detection device 7b includes a correction unit 15a that corrects a plurality of distance measurement values D according to the movement amount ΔD of the vehicle 1.
  • the map generation unit 12 can accurately generate the map M.
  • the collision avoidance device 9 includes the obstacle detection device 7b and the collision avoidance control unit 61 that executes collision avoidance control based on the detection result by the obstacle detection unit 11a and the discrimination result by the discrimination unit 14.
  • the obstacle detection device 7b can accurately determine the height of the obstacle O. By using the result of such determination, the probability that the vehicle 1 collides with the obstacle O can be accurately calculated.
  • FIG. 27 is a block diagram showing a main part of the parking support system including the obstacle detection device according to the fourth embodiment.
  • FIG. 28 is a block diagram showing a main part of the discrimination unit in the obstacle detection device according to the fourth embodiment.
  • a parking support system including the obstacle detection device according to the fourth embodiment will be described with reference to FIGS. 27 and 28.
  • in FIG. 27, the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted. Further, in FIG. 28, the same blocks as those shown in FIG. 3 are designated by the same reference numerals, and the description thereof will be omitted.
  • the parking support system 2b includes a distance measuring sensor 3, first sensors 4, second sensors 5, and a parking support device 6b.
  • the parking support device 6b includes an obstacle detection unit 11, a map generation unit 12, a peak shape extraction unit 13, a discrimination unit 14a, and a parking support control unit 21.
  • the obstacle detection unit 11, the map generation unit 12, the peak shape extraction unit 13, and the discrimination unit 14a constitute a main part of the obstacle detection device 7c.
  • the discriminating unit 14a includes a height discriminating unit 41 and an implant discriminating unit 42.
  • hereinafter, among obstacles, those such as plantings, hedges, and flower beds may be collectively referred to as "implantation obstacles".
  • the implant determination unit 42 determines whether or not the obstacle O is an implant obstacle. Specifically, for example, the implant determination unit 42 determines whether or not the obstacle O is an implant obstacle as follows.
  • first, the implant determination unit 42 extracts the bin having the maximum frequency F in the peak shape PS.
  • next, the implant determination unit 42 calculates the deviation amount Δ of the position of the extracted bin with respect to the central portion of the peak shape PS.
  • the implant determination unit 42 determines whether or not the obstacle O is an implant obstacle based on the calculated deviation amount Δ.
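  • as a minimal sketch (not the claimed implementation), the deviation amount Δ could be computed from a histogram representation of the peak shape PS as follows; treating the "central portion" as the midpoint of the peak's extent is an assumption made for illustration, and the function name is hypothetical.

```python
import numpy as np

def deviation_of_peak(bin_centers, frequencies):
    """Deviation Δ of the maximum-frequency bin from the centre of the peak shape PS.

    bin_centers, frequencies: 1-D arrays describing the extracted peak shape.
    Defining the "central portion" as the midpoint of the peak's extent is an
    assumption made for this sketch.
    """
    c = np.asarray(bin_centers, dtype=float)
    f = np.asarray(frequencies, dtype=float)
    peak_pos = c[np.argmax(f)]                # bin having the maximum frequency F
    center = 0.5 * (c.min() + c.max())        # centre of the peak shape PS
    return peak_pos - center                  # deviation amount Δ (signed)
```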
  • the main part of the obstacle detection device 7c is configured.
  • the obstacle detection device 7c outputs a signal indicating the detection result by the obstacle detection unit 11 (that is, a signal indicating obstacle detection information) to the outside of the obstacle detection device 7c. Further, the obstacle detection device 7c outputs a signal indicating the discrimination result by the discrimination unit 14a (that is, a signal indicating the discrimination result by the height discrimination unit 41 and the discrimination result by the implant discrimination unit 42) to the outside of the obstacle detection device 7c.
  • the obstacle detection device 7c outputs a result signal.
  • the output result signal is used for parking support control by the parking support control unit 21.
  • hereinafter, the processes executed by the discrimination unit 14a may be collectively referred to as "discrimination processing". Further, the functions of the discrimination unit 14a may be collectively referred to as the "discrimination function". Further, the reference numeral "F4a" may be used for such a discrimination function.
  • the processing executed by the obstacle detection device 7c may be collectively referred to as "obstacle detection processing, etc.” That is, the obstacle detection process and the like include an obstacle detection process, a map generation process, a peak shape extraction process, and a discrimination process.
  • the hardware configuration of the main part of the parking support device 6b is the same as that described with reference to FIGS. 4 to 6 in the first embodiment. Therefore, detailed description thereof will be omitted.
  • the parking support device 6b has a plurality of functions F1 to F3, F4a, and F11 (including an obstacle detection function, a map generation function, a peak shape extraction function, a discrimination function, and a parking support function). Each of the plurality of functions F1 to F3, F4a, and F11 may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
  • the processor 51 may include a dedicated processor corresponding to each of the plurality of functions F1 to F3, F4a, and F11.
  • the memory 52 may include a dedicated memory corresponding to each of the plurality of functions F1 to F3, F4a, and F11.
  • the processing circuit 53 may include a dedicated processing circuit corresponding to each of the plurality of functions F1 to F3, F4a, and F11.
  • next, the discrimination unit 14a executes the discrimination process (step ST4a).
  • the obstacle detection device 7c executes obstacle detection processing and the like (step ST11c).
  • as a result, the processes of steps ST1 to ST3 and ST4a shown in FIG. 29 are executed.
  • next, the process of step ST21 is executed.
  • the surface shape of an implant or the like is more complicated than the surface shape of a wall, and is also more complicated than the surface shape of a curb. Therefore, when the obstacle O is an implant or the like, the variation in the corresponding plurality of distance measurement values D becomes larger than when the obstacle O is a wall or a curb. In particular, when the density of plants in the planting or the like is uniform, the probability that the exploration wave is reflected after traveling a certain distance inside the planting or the like is substantially constant regardless of the position of reflection.
  • the peak shape PS when the obstacle O is an implant or the like follows a binomial distribution.
  • the peak shape PS has a left-right asymmetrical shape.
  • FIG. 31 shows an example of such a peak shape PS.
  • only one of the plurality of frequencies F in the peak shape PS is designated by “F”.
  • the implant determination unit 42 calculates the overall width W_A of the peak shape PS.
  • next, the implant determination unit 42 extracts the bin having the maximum frequency F in the peak shape PS.
  • next, the implant determination unit 42 calculates the width W_L of the portion on the left side of the extracted bin in the peak shape PS. Further, the implant determination unit 42 calculates the width W_R of the portion on the right side of the extracted bin in the peak shape PS.
  • the implant determination unit 42 determines that the obstacle O is an implant obstacle.
  • the implant determination unit 42 determines that the obstacle O is not an implant obstacle.
  • the threshold value Rth is set to a value larger than 0 and smaller than 1.
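  • a minimal Python sketch of a width-based check of this kind is given below; since this excerpt does not spell out the exact comparison against the threshold Rth, the decision rule used here (the smaller of W_L and W_R relative to W_A falling below Rth, i.e. a strongly asymmetric peak) and the sample value of Rth are assumptions made for illustration.

```python
import numpy as np

def is_implant_obstacle(bin_centers, frequencies, r_th=0.3):
    """Decide whether the obstacle O looks like an implantation obstacle.

    Computes the overall width W_A of the peak shape PS, the position of the
    maximum-frequency bin, and the widths W_L / W_R on either side of it.
    The comparison rule and the value of Rth (0 < Rth < 1) are assumptions.
    """
    c = np.asarray(bin_centers, dtype=float)
    f = np.asarray(frequencies, dtype=float)
    w_a = c.max() - c.min()                   # overall width W_A
    if w_a == 0.0:
        return False
    peak = c[np.argmax(f)]                    # bin having the maximum frequency F
    w_l = peak - c.min()                      # width W_L left of the bin
    w_r = c.max() - peak                      # width W_R right of the bin
    return min(w_l, w_r) / w_a < r_th         # asymmetric peak -> implantation (assumed rule)
```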
  • for the obstacle detection device 7c, various modifications similar to those described in the first embodiment can be adopted. Further, the obstacle detection device 7c may have a correction unit similar to the correction unit 15 in the obstacle detection device 7a according to the second embodiment.
  • as described above, in the obstacle detection device 7c according to the fourth embodiment, the discrimination unit 14a determines whether or not the obstacle O is an implantation obstacle based on the deviation amount Δ of the position of the bin having the maximum frequency F in the peak shape PS with respect to the central portion of the peak shape PS. Thereby, it is possible to determine whether or not the obstacle O is an implantation obstacle. As a result, for example, the result of such determination can be used for parking support control.
  • the obstacle detection device and the obstacle detection method according to the present disclosure can be used, for example, in a parking support device or a collision avoidance device.
  • the parking support device and the collision avoidance device according to the present disclosure can be used for a vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to an obstacle detection device (7) comprising: an obstacle detection unit (11) for detecting an obstacle (O) in a region around a moving vehicle (1) using a distance measuring sensor (3) mounted on the vehicle (1), in order to acquire information on the obstacle (O); a map generation unit (12) for generating a map (M) corresponding to a frequency distribution of a plurality of distance measurement values (D) included in the information; a peak shape extraction unit (13) for extracting, from among a plurality of frequency groups (FG) in the map (M), a frequency group (FG) comprising frequencies (F) exceeding a first threshold (Fth1), so as to extract a peak shape (PS) in the map (M); and a determination unit (14) for determining a height of the obstacle (O) on the basis of the peak shape (PS).
PCT/JP2020/019116 2020-05-13 2020-05-13 Dispositif de détection d'obstacle, dispositif d'aide au stationnement, dispositif d'évitement de collision et procédé de détection d'obstacle WO2021229722A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/019116 WO2021229722A1 (fr) 2020-05-13 2020-05-13 Dispositif de détection d'obstacle, dispositif d'aide au stationnement, dispositif d'évitement de collision et procédé de détection d'obstacle
JP2022522170A JP7186923B2 (ja) 2020-05-13 2020-05-13 障害物検知装置、駐車支援装置、衝突回避装置及び障害物検知方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/019116 WO2021229722A1 (fr) 2020-05-13 2020-05-13 Dispositif de détection d'obstacle, dispositif d'aide au stationnement, dispositif d'évitement de collision et procédé de détection d'obstacle

Publications (1)

Publication Number Publication Date
WO2021229722A1 true WO2021229722A1 (fr) 2021-11-18

Family

ID=78525454

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/019116 WO2021229722A1 (fr) 2020-05-13 2020-05-13 Dispositif de détection d'obstacle, dispositif d'aide au stationnement, dispositif d'évitement de collision et procédé de détection d'obstacle

Country Status (2)

Country Link
JP (1) JP7186923B2 (fr)
WO (1) WO2021229722A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023176646A1 (fr) * 2022-03-18 2023-09-21 ソニーグループ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002154396A (ja) * 2000-11-22 2002-05-28 Nissan Motor Co Ltd 車両用駐車支援装置
US20110121994A1 (en) * 2009-10-29 2011-05-26 Christian Pampus Method For Detecting Objects Having a Low Height
JP2014058247A (ja) * 2012-09-18 2014-04-03 Aisin Seiki Co Ltd 駐車支援装置
JP2019143977A (ja) * 2018-02-15 2019-08-29 パナソニックIpマネジメント株式会社 判定システム、センサシステム、及び判定方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3385304B2 (ja) * 1997-08-29 2003-03-10 三菱電機株式会社 車載用レーダ装置
JP2011196699A (ja) * 2010-03-17 2011-10-06 Denso Corp 道路端検出装置
JP6333412B2 (ja) * 2014-12-26 2018-05-30 三菱電機株式会社 障害物検知装置
KR101694347B1 (ko) * 2015-08-31 2017-01-09 현대자동차주식회사 차량 및 차선인지방법
WO2017195753A1 (fr) * 2016-05-13 2017-11-16 コニカミノルタ株式会社 Système de surveillance
EP3483629B1 (fr) * 2017-11-09 2021-12-29 Veoneer Sweden AB Détection d'une rangée de stationnement à l'aide d'un système de radar de véhicule
JP6933986B2 (ja) * 2018-02-06 2021-09-08 ヴィオニア スウェーデン エービー 物標検出装置、物標検出方法及びプログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002154396A (ja) * 2000-11-22 2002-05-28 Nissan Motor Co Ltd 車両用駐車支援装置
US20110121994A1 (en) * 2009-10-29 2011-05-26 Christian Pampus Method For Detecting Objects Having a Low Height
JP2014058247A (ja) * 2012-09-18 2014-04-03 Aisin Seiki Co Ltd 駐車支援装置
JP2019143977A (ja) * 2018-02-15 2019-08-29 パナソニックIpマネジメント株式会社 判定システム、センサシステム、及び判定方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023176646A1 (fr) * 2022-03-18 2023-09-21 ソニーグループ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Also Published As

Publication number Publication date
JPWO2021229722A1 (fr) 2021-11-18
JP7186923B2 (ja) 2022-12-09

Similar Documents

Publication Publication Date Title
US10605896B2 (en) Radar-installation-angle calculating device, radar apparatus, and radar-installation-angle calculating method
US7710247B2 (en) Object recognizing apparatus
US6580385B1 (en) Object detection system
US6896082B2 (en) Road surface detection apparatus and apparatus for detecting upward/downward axis displacement of vehicle-mounted radar
US6862527B2 (en) Vehicle surroundings monitoring apparatus
Ogawa et al. Lane recognition using on-vehicle lidar
US20230008630A1 (en) Radar device
CN110888115B (zh) 对雷达跟踪的潜在静止对象进行分类
US7557907B2 (en) Object-detection device for vehicle
WO2021229722A1 (fr) Dispositif de détection d'obstacle, dispositif d'aide au stationnement, dispositif d'évitement de collision et procédé de détection d'obstacle
JP7217817B2 (ja) 物体認識装置及び物体認識方法
JP6811913B2 (ja) 障害物検知装置
US11798417B2 (en) Driving assistance device
JP7199436B2 (ja) 障害物検知装置及び運転支援装置
JP7167871B2 (ja) 物標検出装置
WO2020008536A1 (fr) Dispositif de détection d'obstacle
WO2021106030A1 (fr) Dispositif de détection d'obstacles
WO2020049650A1 (fr) Dispositif d'aide à la conduite
JP6890744B2 (ja) 駐車形態判定装置
US11816990B2 (en) Moving-object detection apparatus for vehicle
US20230099678A1 (en) Traveling lane recognition apparatus and traveling lane recognition method
US20230109206A1 (en) Own position estimation apparatus and own position estimation method
US20230373524A1 (en) Automatic driving device and radar device
WO2020008535A1 (fr) Dispositif de détection d'obstacle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20934912

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022522170

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20934912

Country of ref document: EP

Kind code of ref document: A1