WO2020012852A1 - Dispositif d'aide à la localisation et méthode d'aide à la localisation - Google Patents



Publication number
WO2020012852A1
Authority
WO
WIPO (PCT)
Prior art keywords: distance, sensor, distance measuring, measuring sensor, vehicle
Application number: PCT/JP2019/023104
Other languages: English (en), Japanese (ja)
Inventor: 佳幸 水野
Original Assignee: クラリオン株式会社
Application filed by クラリオン株式会社
Publication of WO2020012852A1



Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/28Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves by co-ordinating position lines of different shape, e.g. hyperbolic, circular, elliptical or radial

Definitions

  • The present invention relates to a positioning support device and a positioning support method having a plurality of distance measurement sensors that transmit ultrasonic waves around a vehicle and receive ultrasonic waves reflected by an object around the vehicle.
  • A positioning support device in which a plurality of ultrasonic transmission/reception units are mounted on a vehicle and the position of an object is specified based on distance measurement values detected by the ultrasonic transmission/reception units is known (for example, see Patent Document 1). Furthermore, the positioning support device disclosed in Patent Document 1 also specifies the position of an object that forms a linear wall surface, based on distance measurement values detected by three ultrasonic transmission/reception units having mutually overlapping detection regions.
  • The technique of Patent Document 1 requires three ultrasonic transmission/reception units to determine the shape of an obstacle. For this reason, the configuration of the positioning support device is complicated.
  • an object of the present invention is to provide a positioning support apparatus and a positioning support method that can calculate a position according to the shape of an object with a simple configuration and thereby improve the position accuracy of the object.
  • A positioning support device of the present invention includes: a plurality of distance measurement sensors that transmit ultrasonic waves around a vehicle and receive ultrasonic waves reflected by objects around the vehicle; a sensor drive control unit that drives some of the plurality of distance measurement sensors as a first distance measurement sensor that transmits ultrasonic waves around the vehicle and receives ultrasonic waves reflected by an object, and drives a distance measurement sensor that is adjacent to and different from the first distance measurement sensor as a second distance measurement sensor that receives the ultrasonic wave transmitted from the first distance measurement sensor and reflected by the object; a first distance calculation unit that calculates a first distance from the vehicle to the object based on the ultrasonic wave received by the first distance measurement sensor; a second distance calculation unit that calculates a second distance from the vehicle to the object based on the ultrasonic wave received by the second distance measurement sensor; an intersection position calculation unit that calculates an intersection of a circle having the first distance as a radius centered on the installation position of the first distance measurement sensor and an ellipse set based on the second distance with the installation positions of the first and second distance measurement sensors as its focal points; and a peripheral object position identification unit that identifies the position of the object based on the intersection calculated by the intersection position calculation unit.
  • The sensor drive control unit drives a specific distance measurement sensor as the first distance measurement sensor at a specific time and drives a distance measurement sensor adjacent to the specific distance measurement sensor as the second distance measurement sensor; at a time different from the specific time, it drives the specific distance measurement sensor as the second distance measurement sensor and drives the distance measurement sensor adjacent to the specific distance measurement sensor as the first distance measurement sensor.
  • A first line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor is driven as the first distance measurement sensor with the intersection of the tangents to that circle and ellipse, and a second line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor is driven as the second distance measurement sensor with the intersection of the tangents to that circle and ellipse, are obtained, and the position of the object is specified based on the first line segment and the second line segment.
  • A positioning support method of the present invention is directed to a positioning support device having a plurality of distance measurement sensors that transmit ultrasonic waves around a vehicle and receive ultrasonic waves reflected by an object around the vehicle.
  • Some of the plurality of distance measurement sensors are driven as a first distance measurement sensor that transmits ultrasonic waves around the vehicle and receives ultrasonic waves reflected by an object, and a distance measurement sensor that is adjacent to and different from the first distance measurement sensor is driven as a second distance measurement sensor that receives the ultrasonic wave transmitted from the first distance measurement sensor and reflected by the object.
  • A first distance from the vehicle to the object is calculated based on the ultrasonic wave received by the first distance measurement sensor, and a second distance from the vehicle to the object is calculated based on the ultrasonic wave received by the second distance measurement sensor.
  • An intersection of a circle having the first distance as a radius centered on the installation position of the first distance measurement sensor and an ellipse set based on the second distance with the installation positions of the first and second distance measurement sensors as its focal points is calculated, and the position of the object is specified based on the intersection.
  • At a specific time, a specific distance measurement sensor is driven as the first distance measurement sensor and the distance measurement sensor adjacent to it is driven as the second distance measurement sensor; at a time different from the specific time, the specific distance measurement sensor is driven as the second distance measurement sensor and the adjacent distance measurement sensor is driven as the first distance measurement sensor.
  • A first line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor is driven as the first distance measurement sensor with the intersection of the tangents to that circle and ellipse, and a second line segment connecting the circle and the ellipse obtained when it is driven as the second distance measurement sensor with the intersection of the tangents to that circle and ellipse, are obtained, and the position of the object is specified based on the first line segment and the second line segment.
  • The peripheral object position specifying unit obtains a first line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor is driven as the first distance measurement sensor with the intersection of the tangents to that circle and ellipse, and a second line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor is driven as the second distance measurement sensor with the intersection of the tangents to that circle and ellipse, and specifies the position of the object based on the first line segment and the second line segment.
  • the position can be calculated in accordance with the shape of the object with a simple configuration, thereby improving the position accuracy of the object.
  • A first line segment, and a second line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor is driven as the second distance measurement sensor with the intersection of the tangents to that circle and ellipse, are obtained, and the position of the object is specified based on the first line segment and the second line segment.
  • the position can be calculated in accordance with the shape of the object with a simple configuration, thereby improving the position accuracy of the object.
  • FIG. 1 is a block diagram illustrating a schematic configuration of an automatic parking system to which a positioning support device according to an embodiment of the present invention is applied.
  • FIG. 2 is a diagram illustrating an example of an arrangement position of a vehicle-mounted camera and a distance measurement sensor mounted on the automatic parking system according to the embodiment.
  • FIG. 3 is a functional block diagram illustrating a schematic configuration of the positioning support device according to the embodiment.
  • A flowchart illustrates an example of the operation of the positioning support device according to the embodiment.
  • The remaining figures are diagrams, each for explaining an example of the operation of the positioning support device according to the embodiment.
  • FIG. 1 is a block diagram showing a schematic configuration of an automatic parking system to which a positioning support device and a positioning support method according to an embodiment of the present invention are applied.
  • FIG. 2 is a diagram showing an example of the arrangement positions of the in-vehicle cameras and the distance measurement sensors mounted on the automatic parking system according to the embodiment.
  • the positioning support device 1 is applied to an automatic parking system S that automatically parks a vehicle V (see FIG. 2) in a parking space.
  • a plurality of small cameras are provided on the front, rear, left and right of the vehicle V as shown in FIG.
  • a front camera 20a is mounted on a front bumper or a front grill of the vehicle V toward the front of the vehicle V.
  • a rear camera 20b is mounted on the rear bumper or rear garnish of the vehicle V toward the rear of the vehicle V.
  • a left camera 20c is mounted on a left door mirror of the vehicle V toward the left side of the vehicle V.
  • a right side camera 20d is mounted on the right side mirror of the vehicle V toward the right side of the vehicle V.
  • Each of the front camera 20a, the rear camera 20b, the left camera 20c, and the right camera 20d is equipped with a wide-angle lens or a fisheye lens capable of observing a wide range, so that the four cameras 20a to 20d can observe the area around the vehicle V, including the road surface, without omission.
  • sonars 30a to 30f as distance measuring sensors are mounted on the front, rear, left and right of the vehicle V. These sonars 30a to 30f sequentially transmit ultrasonic waves of a predetermined frequency (for example, 20 kHz or more) around the vehicle V based on an instruction from a sonar ECU (Electronic Control Unit) 32 (FIG. 1). In addition, the sonars 30a to 30f sequentially receive reflected waves reflected on an object within the ultrasonic irradiation range.
  • the sonars 30a to 30f are sensors that measure the distance to the object based on the transmission timing of the ultrasonic wave and the reception timing of the reflected wave.
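The time-of-flight relation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the speed-of-sound constant are assumptions.

```python
# Round-trip time-of-flight ranging, as each sonar performs it: the
# sensor transmits a pulse, receives its echo, and the distance to the
# object is half the total path travelled by the sound.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def echo_distance(t_transmit: float, t_receive: float,
                  c: float = SPEED_OF_SOUND) -> float:
    """Distance to the reflecting object from a round-trip echo."""
    return c * (t_receive - t_transmit) / 2.0

# Example: an echo arriving 10 ms after transmission corresponds to
# about 1.7 m.
d = echo_distance(0.0, 0.010)
```

The division by two reflects that the transmitted wave travels to the object and back before it is received by the same sensor.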
  • the distance (the distance between the sonar and the outer end of its irradiation range) at which the sonars 30a to 30f can measure the distance to the object is, for example, about 30 cm to 8 m.
  • Four sonars 30a capable of irradiating measurement waves toward the front of the vehicle V are mounted on the front bumper and the front grill of the vehicle V.
  • the sonars 30a adjacent to each other are installed such that their irradiation ranges (not shown) overlap.
  • Four sonars 30d capable of irradiating measurement waves toward the rear of the vehicle V are mounted on the rear bumper and the rear garnish of the vehicle V.
  • the sonars 30d adjacent to each other are installed such that their irradiation ranges (not shown) overlap.
  • A sonar 30b capable of irradiating a measurement wave toward the left side of the vehicle V is mounted on the front left side with respect to the traveling direction of the vehicle V.
  • a sonar 30c capable of irradiating a measurement wave toward the left side of the vehicle V is mounted on the left rear side of the vehicle V.
  • the irradiation ranges (not shown) of the two sonars 30b and 30c do not overlap.
  • A sonar 30f capable of irradiating a measurement wave toward the right side of the vehicle V is mounted on the front right side with respect to the traveling direction of the vehicle V.
  • a sonar 30e is mounted on the right rear side of the vehicle V.
  • the irradiation ranges (not shown) of the two sonars 30f and 30e do not overlap each other, similarly to the sonars 30b and 30c described above.
  • The automatic parking system S includes the front camera 20a, the rear camera 20b, the left camera 20c, the right camera 20d, a camera ECU 22, the sonars 30a to 30f, a sonar ECU 32, a wheel speed sensor 47, and a steering angle sensor 48, all mounted on the vehicle V.
  • the camera ECU 22 controls the cameras 20a to 20d and detects a parking space using information detected by the cameras 20a to 20d.
  • the sonar ECU 32 controls the sonars 30a to 30f and detects a parking space using information detected by the sonars 30a to 30f.
  • the wheel speed sensor 47 is a sensor that detects the wheel speed of the vehicle V.
  • the detection information (wheel speed) detected by the wheel speed sensor 47 is input to the vehicle control ECU 60.
  • the steering angle sensor 48 detects the steering angle of the steering of the vehicle V.
  • the steering angle when the vehicle V travels in a straight-ahead state is defined as a neutral position (steering angle 0 °), and the rotation angle from the neutral position is output as the steering angle.
  • Detection information (steering angle) detected by the steering angle sensor 48 is input to the vehicle control ECU 60.
  • the automatic parking system S further includes an automatic parking start switch 24, a vehicle control ECU 60, a steering control unit 70, a throttle control unit 80, and a brake control unit 90.
  • the automatic parking start switch 24 receives an instruction to start automatic parking.
  • the vehicle control ECU 60 mainly includes a microcomputer including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • the vehicle control ECU 60 executes various processes for assisting parking of the vehicle V based on each detection information input from the camera ECU 22, the sonar ECU 32, the wheel speed sensor 47, and the steering angle sensor 48.
  • the vehicle control ECU 60 performs detection processing for detecting a parking space based on each detection information, A determination process for determining whether the vehicle V can be parked and an automatic parking process for automatically parking the vehicle V in a parking space determined to be parkable are executed.
  • the steering control unit 70 drives the power steering actuator 72 based on the vehicle control information determined by the vehicle control ECU 60 to control the steering angle of the vehicle V.
  • the throttle control unit 80 controls the throttle of the vehicle V by driving the throttle actuator 82 based on the vehicle control information determined by the vehicle control ECU 60.
  • the brake control unit 90 controls the brake of the vehicle V by driving the brake actuator 92 based on the vehicle control information determined by the vehicle control ECU 60.
  • The camera ECU 22, the sonar ECU 32, the wheel speed sensor 47, the steering angle sensor 48, and the vehicle control ECU 60 are connected by a sensor information CAN (registered trademark) (Controller Area Network) 50, which is an in-vehicle LAN (Local Area Network).
  • the steering control unit 70, the throttle control unit 80, the brake control unit 90, and the vehicle control ECU 60 are connected by vehicle information CAN (registered trademark) 52, which is an in-vehicle LAN.
  • the positioning support device 1 of the present embodiment is mainly configured by the sonars 30a to 30f and the sonar ECU 32.
  • a radar not shown in FIG. 2 may be installed instead of the sonars 30a to 30f.
  • a radar ECU (not shown in FIG. 2) that controls the radar and detects an obstacle around the vehicle V is installed.
  • the sonars 30a to 30f and the radar have different ranging ranges, they may of course be mixed.
  • an obstacle is detected by comparing images captured at different times by the front camera 20a, the rear camera 20b, the left camera 20c, and the right camera 20d.
  • a so-called motion stereo function may be implemented.
  • FIG. 3 is a functional block diagram illustrating a schematic configuration of the positioning support device 1 according to the present embodiment.
  • the positioning support device 1 includes a control unit 100, a storage unit 110, a distance measurement sensor 120, and an input / output unit 130.
  • the distance measurement sensor 120 mainly composed of the sonars 30a to 30f transmits an ultrasonic wave around the vehicle V and receives an ultrasonic wave reflected by an object around the vehicle V.
  • each of the sonars 30a to 30f has a function of transmitting and receiving ultrasonic waves.
  • In operation, some of the sonars 30a to 30f transmit and receive ultrasonic waves, while the others only receive ultrasonic waves.
  • The sonars 30a to 30f that transmit and receive ultrasonic waves correspond to the first distance measurement sensors 121 of the present invention, and the sonars 30a to 30f that only receive ultrasonic waves correspond to the second distance measurement sensors 122 of the present invention.
  • The sonars 30a to 30f that transmit and receive ultrasonic waves and those that only receive ultrasonic waves are changed periodically. Therefore, in a certain cycle, specific sonars 30a to 30f constitute the first distance measurement sensors 121 and other sonars 30a to 30f constitute the second distance measurement sensors 122.
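The periodic role change can be sketched as follows. The sonar names are taken from the text; the exact alternation pattern shown here is an assumption for illustration only.

```python
# Illustrative sketch of the periodic role swap between adjacent sonars:
# in one measurement cycle a sonar transmits and receives (acting as a
# first distance measurement sensor); in another it only listens
# (acting as a second distance measurement sensor).

SONARS = ["FOL", "FIL", "FIR", "FOR"]  # front sonars, left to right

def roles_for_cycle(cycle_index: int) -> dict:
    """Assign 'first' (transmit + receive) or 'second' (receive only)
    to each sonar, swapping the assignment every cycle."""
    return {
        name: "first" if (i + cycle_index) % 2 == 0 else "second"
        for i, name in enumerate(SONARS)
    }
```

In even cycles the outer-left sonar transmits while its neighbor listens; in odd cycles the roles are exchanged, so every adjacent pair eventually measures in both directions.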
  • the control unit 100 mainly composed of the sonar ECU 32 controls the entire positioning support device 1.
  • the control unit 100 includes a CPU, a programmable logic device such as an FPGA, and an arithmetic element represented by an integrated circuit such as an ASIC.
  • A control program (not illustrated) is stored in the storage unit 110 of the positioning support device 1; when the positioning support device 1 is activated, the control unit 100 executes this control program and thereby takes on the functional configuration shown in FIG. 3.
  • the positioning support device 1 of the present embodiment performs high-speed signal processing as described later, it is preferable that the positioning support device 1 include an arithmetic element capable of high-speed operation, such as an FPGA.
  • the control unit 100 includes a sensor drive control unit 101, a first distance calculation unit 102, a second distance calculation unit 103, an intersection position calculation unit 104, a peripheral object position identification unit 105, and a coordinate conversion unit 106.
  • The sensor drive control unit 101 drives some of the plurality of distance measurement sensors 120 as first distance measurement sensors 121 that transmit ultrasonic waves around the vehicle V and receive ultrasonic waves reflected by objects around the vehicle V, and drives the distance measurement sensor 120 adjacent to a first distance measurement sensor 121 as a second distance measurement sensor 122 that receives the ultrasonic wave transmitted from the first distance measurement sensor 121 and reflected by an object around the vehicle V.
  • The sensor drive control unit 101 drives a specific distance measurement sensor 120 as the first distance measurement sensor 121 at a specific time, and drives the distance measurement sensor 120 adjacent to the specific distance measurement sensor 120 as the second distance measurement sensor 122.
  • each distance measuring sensor 120 is provided with an ID for specifying the installation location on the vehicle V, and the sensor drive control unit 101 specifies and drives each distance measuring sensor 120 based on this ID. Control.
  • the first distance calculation unit 102 calculates the first distance from the vehicle V to the object based on the ultrasonic waves received by the first distance measurement sensor 121.
  • the second distance calculator 103 calculates a second distance from the vehicle to the object based on the ultrasonic waves received by the second distance measuring sensor 122.
  • The intersection position calculation unit 104 calculates the intersection of a circle having the first distance as a radius centered on the installation position of the first distance measurement sensor 121 and an ellipse set based on the second distance with the installation positions of the first distance measurement sensor 121 and the second distance measurement sensor 122 as its focal points.
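The circle-ellipse intersection has a convenient closed form: any point on the circle is already at distance r1 from the first sensor, so the ellipse condition (sum of distances to the two foci equals the total path length) reduces to being at distance r2 = total_path − r1 from the second sensor, i.e. a circle-circle intersection. A minimal sketch under these assumptions (function and variable names are ours, not the patent's):

```python
import math

def circle_ellipse_intersections(d: float, r1: float, total_path: float):
    """Intersections of the circle of radius r1 around sensor 1 at
    (0, 0) with the ellipse whose foci are sensor 1 and sensor 2 at
    (d, 0) and whose path-length sum is total_path.

    On the circle, the distance to focus 1 is exactly r1, so the
    ellipse condition reduces to distance r2 = total_path - r1 from
    sensor 2: a circle-circle intersection with a closed-form solution.
    """
    r2 = total_path - r1
    x = (d * d + r1 * r1 - r2 * r2) / (2.0 * d)
    y_sq = r1 * r1 - x * x
    if y_sq < 0:
        return []  # no real intersection: inconsistent measurements
    y = math.sqrt(y_sq)
    return [(x, y)] if y == 0 else [(x, y), (x, -y)]

# Example: sensors 1 m apart, object actually at (0.5, 2.0).
r1 = math.hypot(0.5, 2.0)           # first (round-trip) distance
r2 = math.hypot(0.5 - 1.0, 2.0)     # distance from sensor 2
pts = circle_ellipse_intersections(1.0, r1, r1 + r2)
```

Of the two mirror-image solutions, the one on the side the sensors face (here, positive y) is the physically meaningful object position.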
  • the peripheral object position specifying unit 105 specifies the position of the object based on the intersection calculated by the intersection position calculating unit 104.
  • The peripheral object position specifying unit 105 obtains a first line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor 120 is driven as the first distance measurement sensor 121 with the intersection of the tangents to that circle and ellipse, and a second line segment connecting the circle and the ellipse obtained when the specific distance measurement sensor 120 is driven as the second distance measurement sensor 122 with the intersection of the tangents to that circle and ellipse, and specifies the position of the object based on the first line segment and the second line segment.
  • In this case, the peripheral object position specifying unit 105 specifies that a wall-shaped object is located at the position of the tangent line.
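One way to see why the tangent construction can identify a wall: for a flat wall, the cross-received path from the first sensor to the second equals, by specular reflection, the straight-line distance from the mirror image of the first sensor (reflected across the wall) to the second sensor. A minimal sketch under the simplifying assumption of a wall parallel to the sensor baseline; the function names, the tolerance, and this geometry are our illustration, not the patent's method:

```python
import math

def wall_consistent_path(d: float, r1: float) -> float:
    """Cross-received path sensor1 -> wall -> sensor2 for a flat wall
    parallel to the sensor baseline at distance r1.

    Mirror sensor 1 (at the origin) across the wall y = r1: its image
    lies at (0, 2*r1). By specular reflection, the folded path has the
    same length as the straight line from that image to sensor 2 at
    (d, 0)."""
    return math.hypot(d, 2.0 * r1)

def looks_like_wall(d: float, r1: float, total_path: float,
                    tol: float = 0.05) -> bool:
    """Heuristic check: the measured cross path matches the mirror-image
    prediction for a flat wall within a tolerance (metres)."""
    return abs(total_path - wall_consistent_path(d, r1)) < tol
```

A point-like object instead satisfies total_path = r1 + r2 for some single reflection point, so comparing the measured path against both hypotheses distinguishes the two shapes.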
  • the coordinate conversion unit 106 converts the position coordinates of the object identified by the peripheral object position identification unit 105. That is, the distance to the object calculated by the first distance calculation unit 102 and the second distance calculation unit 103 is a distance on the relative coordinate system whose origin is the position of the vehicle V (more precisely, the distance measurement sensor 120). Therefore, this is converted into the distance (coordinate value) in the absolute coordinate system.
  • the coordinate conversion unit 106 performs coordinate conversion to an absolute coordinate system in consideration of the behavior of the vehicle V, based on information from the wheel speed sensor 47 and the like input via the vehicle control ECU 60.
  • The coordinate conversion unit 106 corrects the position information of the object included in the sensor information 112 stored in the storage unit 110, described later, based on the information from the wheel speed sensor 47 and the like input via the vehicle control ECU 60.
  • the peripheral object position specifying unit 105 specifies the position of a peripheral object based on the position information corrected by the coordinate conversion unit 106.
  • the distance measurement sensor 120 periodically performs the distance measurement operation.
  • The position information of the object, which is the result of the identification by the peripheral object position identification unit 105, is preferably corrected appropriately in accordance with the vehicle speed and the steering angle resulting from the movement of the vehicle V. Therefore, the coordinate conversion unit 106 corrects the position information based on the vehicle speed of the vehicle V, taking the moving distance of the vehicle V into consideration.
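The relative-to-absolute conversion itself is a planar rigid transform from the vehicle frame into the world frame. A minimal sketch: the pose tuple and axis convention (x forward, y left, heading in radians) are assumptions for illustration; the pose would come from dead reckoning on the wheel speed and steering angle inputs mentioned above.

```python
import math

def to_absolute(pose, point_rel):
    """Convert an object position measured in the vehicle-relative
    coordinate system to absolute coordinates.

    pose      -- (x, y, heading) of the vehicle in the absolute frame,
                 heading in radians (e.g. from dead reckoning)
    point_rel -- (x, y) of the object in the vehicle frame
    """
    vx, vy, heading = pose
    px, py = point_rel
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    # Rotate by the vehicle heading, then translate by its position.
    return (vx + cos_h * px - sin_h * py,
            vy + sin_h * px + cos_h * py)
```

Re-running this conversion with the updated pose each measurement cycle keeps earlier detections consistent with the vehicle's motion.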
  • The storage unit 110, mainly constituted by the sonar ECU 32, has storage media such as a large-capacity storage medium, for example a hard disk drive, and semiconductor storage media such as a ROM and a RAM.
  • the storage unit 110 temporarily or non-temporarily stores various data used for various operations in the control unit 100.
  • the storage unit 110 stores a sensor drive pattern table 111, sensor information 112, and a sensor combination table 113.
  • The sensor drive pattern table 111 is a table storing pattern information indicating which distance measurement sensors 120 are driven as the first distance measurement sensor 121 or the second distance measurement sensor 122 in a specific cycle. Although details will be described later, the positioning support device 1 of the present embodiment is provided with patterns that determine whether each of the twelve distance measurement sensors 120 is driven as the first distance measurement sensor 121 or the second distance measurement sensor 122. The sensor drive control unit 101 refers to the sensor drive pattern table 111 and controls the drive of the distance measurement sensors 120 while sequentially changing the pattern at a predetermined time period.
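The pattern table can be sketched as a simple lookup structure. The group names A to D and the sonar IDs come from the text, but the concrete role assignments below are illustrative placeholders, not the patent's actual table.

```python
# Sketch of the sensor drive pattern table: for each group (one per
# drive cycle) it records which sonar IDs act as the first sensor
# (transmit + receive) and which act as the second sensor (receive
# only); sonars listed in neither set stay idle for that cycle.

SENSOR_DRIVE_PATTERN_TABLE = {
    "A": {"first": ["FOL", "FIR"], "second": ["FIL", "FOR"]},
    "B": {"first": ["FIL", "FOR"], "second": ["FOL", "FIR"]},
    "C": {"first": ["ROL", "RIR"], "second": ["RIL", "ROR"]},
    "D": {"first": ["RIL", "ROR"], "second": ["ROL", "RIR"]},
}

def role_of(group: str, sensor_id: str) -> str:
    """Look up how a given sensor is driven within a given group."""
    entry = SENSOR_DRIVE_PATTERN_TABLE[group]
    if sensor_id in entry["first"]:
        return "first"
    if sensor_id in entry["second"]:
        return "second"
    return "idle"
```

Cycling the groups A → B → C → D and consulting the table each cycle reproduces the drive control described in the text.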
  • the ⁇ sensor information 112 is information on the position of the object specified by the peripheral object position specifying unit 105. More specifically, for each specific pattern described above, the ID of the distance measurement sensor 120 driven as the first distance measurement sensor 121 and the ID of the distance measurement sensor 120 driven as the second distance measurement sensor 122 in that pattern.
  • the information on the position of the object specified by the peripheral object position specifying unit 105 is stored in the storage unit 110 as the sensor information 112 together with the ID and the position information of the vehicle V at the object specifying operation time.
  • the sensor combination table 113 is a table in which information on a combination of the distance measurement sensors 120 provided adjacent to the vehicle V is stored. The details of the sensor combination table 113 will be described later.
  • the input / output unit 130 receives information on the behavior of the vehicle V from the vehicle control ECU 60 and sends the information to the control unit 100. Further, the input / output unit 130 sends the object position around the vehicle V calculated by the peripheral object position specifying unit 105 of the control unit 100 to the vehicle control ECU 60.
  • the positioning support device 1 of the present embodiment has twelve distance measuring sensors 120.
  • each distance measuring sensor 120 is given a name.
  • The sonars 30a provided on the front bumper and the front grille of the vehicle V are named sonar FOL, FIL, FIR, and FOR, in order from the left when the vehicle V is viewed from above.
  • the sonars 30d provided on the rear bumper and the rear garnish of the vehicle V are named sonar ROL, RIL, RIR, and ROR in order from the left when the vehicle V is viewed from above.
  • the four sonars 30a may be collectively referred to as a front sonar and the four sonars 30d may be collectively referred to as a rear sonar.
  • sonars 30b and 30c provided on the left side surface of the vehicle V are named sonar SFL and SRL in order from the front when the vehicle V is viewed from above.
  • sonars 30f and 30e provided on the right side surface of the vehicle V are named sonar SFR and SRR in order from the front when the vehicle V is viewed from above.
  • the sensor drive control unit 101 drives these twelve sonars 30a to 30f in four groups (group A, group B, group C, and group D).
  • In each of group A, group B, group C, and group D, some of the front and rear sonars are driven as the first distance measurement sensor 121 (for example, sonars 30a and 30f), while the distance measurement sensors adjacent to them are driven as the second distance measurement sensor 122.
  • the ranging sensor 120 driven as the first ranging sensor 121 is different for each group.
  • each front sonar or rear sonar is either driven by the sensor drive control unit 101 as the first distance measuring sensor 121, driven as the second distance measuring sensor 122, or not driven at all (that is, it performs no transmission or reception of sound waves). Further, the sonars 30b, 30c, 30e, and 30f on the left and right sides of the vehicle V are either driven by the sensor drive control unit 101 as the first distance measuring sensor 121 or not driven (that is, they perform no transmission or reception of sound waves).
  • referring to the sensor drive pattern table 111 stored in the storage unit 110, the sensor drive control unit 101 drives each distance measuring sensor 120 in each group as the first distance measuring sensor 121, drives it as the second distance measuring sensor 122, or leaves it undriven.
  • the sensor drive control unit 101 has the distance measuring sensors 120 of each group perform the ultrasonic transmission/reception operation for a certain period of time, then switches groups in sequence (that is, group A -> group B -> group C -> group D) to carry out the drive control.
  • after group D, the sensor drive control unit 101 returns to group A and continues the drive control, changing the group sequentially.
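The group-cycling drive described above amounts to a simple round-robin schedule. A minimal sketch; the group labels are from the text, but the function name and the idea of representing one dwell interval per step are illustrative assumptions:

```python
from itertools import cycle

def drive_groups(groups, n_steps):
    """Round-robin over sensor groups: A -> B -> C -> D -> A -> ...

    Each step stands in for driving one group's sensors for a fixed
    interval before switching to the next group.
    """
    order = cycle(groups)
    return [next(order) for _ in range(n_steps)]

schedule = drive_groups(["A", "B", "C", "D"], 6)
print(schedule)  # ['A', 'B', 'C', 'D', 'A', 'B']
```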
  • the pillar P is located at the same position as in FIG.
  • the ultrasonic wave transmitted from the first distance measuring sensor 121 toward the pillar P is reflected by the pillar P and received by the second distance measuring sensor 122 adjacent to the first distance measuring sensor 121.
  • the pillar P lies on an ellipse E whose foci are the installation positions of the first distance measuring sensor 121 and the second distance measuring sensor 122.
  • the propagation distance L [m] of the ultrasonic wave is obtained as L = T × c, where T is the propagation time and c is the speed of sound.
  • the position of the pillar P is an intersection IP1 of the circle C and the ellipse E as shown in FIG.
  • the object position identification method shown in FIG. 9 is hereinafter referred to as “trigonometry”.
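In the plane of the two sensors, trigonometry reduces to intersecting the circle C (the first sensor's one-way range) with the ellipse E (the bistatic echo path). Because the distance to the first sensor is fixed by the circle, the ellipse constraint collapses to a second circle around the second sensor, so the intersection has a closed form. A minimal sketch, with the first sensor placed at the origin and the second at distance b along the x-axis; function and variable names are illustrative:

```python
import math

def triangulate(b, r1, path_sum):
    """Intersect the circle |P| = r1 (first sensor's one-way range)
    with the ellipse |P| + |P - (b, 0)| = path_sum (bistatic path).

    Since |P| = r1 is fixed, |P - (b, 0)| = path_sum - r1, so this is
    just a two-circle intersection. Returns the solution with y >= 0,
    or None if the ranges are inconsistent.
    """
    r2 = path_sum - r1
    x = (b * b + r1 * r1 - r2 * r2) / (2.0 * b)
    y2 = r1 * r1 - x * x
    if y2 < 0:
        return None  # no real intersection
    return (x, math.sqrt(y2))
```

For an object at (1.0, 2.0) with a 0.6 m sensor baseline, r1 = sqrt(5) and path_sum = sqrt(5) + sqrt(4.16), and the function recovers the object position.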
  • the position of the wall W is determined by the common tangent TA to the circle C and the ellipse E, or more precisely, by the intersections IP2 and IP3 of the tangent TA with the circle C and the ellipse E.
  • the object position specifying method shown in FIG. 10 is hereinafter referred to as a “tangent method”.
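For a flat wall, the common tangent can also be found in closed form: the tangent line must lie at distance r1 from the first sensor (tangency to the circle), and reflecting the first sensor across that line must land at distance path_sum from the second sensor (tangency to the ellipse, by the reflection property of its foci). A sketch under the same coordinate assumptions as above, with illustrative names:

```python
import math

def wall_tangent(b, r1, path_sum):
    """Common tangent of the circle |P| = r1 and the ellipse with foci
    (0, 0) and (b, 0) and path length path_sum.

    Reflecting the first sensor across the tangent gives the point
    2*r1*n (n = unit normal), and tangency to the ellipse requires
    |2*r1*n - (b, 0)| = path_sum. Returns (nx, ny, px, py): the wall
    normal and the tangency point on the circle, or None if no
    common tangent exists.
    """
    nx = (4.0 * r1 * r1 + b * b - path_sum ** 2) / (4.0 * r1 * b)
    ny2 = 1.0 - nx * nx
    if ny2 < 0:
        return None
    ny = math.sqrt(ny2)
    return (nx, ny, r1 * nx, r1 * ny)
```

For a wall along y = 2.0 and a 0.6 m baseline, r1 = 2.0 and path_sum = sqrt(0.36 + 16), and the tangency point comes out directly in front of the first sensor, as expected.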
  • when trigonometry is applied to the wall W, the intersection point IP1 of the circle C and the ellipse E is obtained as shown in FIG., so an error arises between the specified position and the actual position of the wall.
  • a plurality of combinations of mutually adjacent first distance measuring sensors 121 and second distance measuring sensors 122 that transmit and receive ultrasonic waves to and from the same object are selected, and the shape of the object is determined based on the positional relationships of the object obtained by each of the plurality of combinations.
  • the peripheral object position specifying unit 105 stores the result of specifying the position of the object by the first distance measuring sensor 121 and the second distance measuring sensor 122 of one group in the sensor information 112 of the storage unit 110. Then, using the result of specifying the position of the object by the first distance measuring sensor 121 and the second distance measuring sensor 122 of another group together with the sensor information 112 stored in the storage unit 110, it determines whether the object has a columnar shape, and specifies the correct position of the object based on the determination result. The position specifying results stored as the sensor information 112 are based on the tangent method.
  • FIG. 11 is a diagram showing an example of the sensor combination table 113.
  • the left of the arrow "->" indicates the first distance measuring sensor 121
  • the right of the arrow indicates the second distance measuring sensor 122.
  • Each distance measuring sensor 120 is specified by a name shown in FIG.
  • the peripheral object position specifying unit 105 performs the shape determination of the object based on the position specifying results of the combinations shown in white, that is, combinations of two or more front sonars or of two or more rear sonars. The shaded combinations are therefore not used as the basis for determining the shape of the object.
  • the combination of distance measuring sensors 120 currently performing the distance measurement operation is "FIL->FOL".
  • the peripheral object position specifying unit 105 reads out from the sensor information 112 stored in the storage unit 110 the results of the object distance measurement operations performed with the combinations "FOL->FIL" and/or "FIL->FIR", and determines the shape of the object according to the following procedure.
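The sensor combination table can be modelled as a plain mapping from the combination currently measuring to the stored combinations whose results are read back for the shape check. The "FIL->FOL" entry reproduces the example in the text; the mapping structure itself is an illustrative assumption:

```python
# Hypothetical excerpt of the sensor combination table 113:
# current combination -> stored combinations to read from sensor info 112.
SENSOR_COMBINATIONS = {
    "FIL->FOL": ["FOL->FIL", "FIL->FIR"],
}

def related_combinations(current):
    """Look up which stored measurement results to compare against."""
    return SENSOR_COMBINATIONS.get(current, [])

print(related_combinations("FIL->FOL"))  # ['FOL->FIL', 'FIL->FIR']
```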
  • the peripheral object position specifying unit 105 determines whether or not the difference between the inclinations of the lines connecting the object positions obtained by the plurality of combinations is equal to or less than a predetermined angle. If the difference is determined to be equal to or smaller than the predetermined angle, the object is determined to have a wall shape.
  • the peripheral object position specifying unit 105 determines that the object is wall-shaped, it specifies the position of the object by the tangent method. On the other hand, if it is determined that the object has a columnar shape, the peripheral object position specifying unit 105 specifies the position of the object by trigonometry.
  • the inclination of the line segments L1 and L2 can be defined as the inclination with respect to the coordinate axes of this coordinate system.
  • the shape of the object can thus be determined from the difference between the inclinations of the line segments L1 and L2. If the object is columnar, specifying its position by the tangent method produces a large error, and as a result the inclinations of the line segments L1 and L2 are expected to differ. Conversely, if the object is wall-shaped, both line segments L1 and L2 lie along the wall surface, so the difference between their inclinations is expected to be sufficiently small.
  • the predetermined angle may be appropriately determined by referring to the actual determination result by the peripheral object position specifying unit 105 and the like.
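The inclination test of method (1) can be sketched as follows; the segments are given as endpoint pairs in the vehicle coordinate system, and the 10-degree threshold is an illustrative placeholder, not a value from the text:

```python
import math

def slope_difference_deg(seg1, seg2):
    """Absolute difference between the inclinations of two segments,
    folded into [0, 90] degrees so that the direction of traversal
    along a segment does not matter."""
    (x1, y1), (x2, y2) = seg1
    (x3, y3), (x4, y4) = seg2
    a1 = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0
    a2 = math.degrees(math.atan2(y4 - y3, x4 - x3)) % 180.0
    d = abs(a1 - a2)
    return min(d, 180.0 - d)

def looks_like_wall_by_slope(seg1, seg2, max_deg=10.0):
    # max_deg is an assumed threshold for the "predetermined angle"
    return slope_difference_deg(seg1, seg2) <= max_deg
```

Two nearly collinear segments pass the test; two segments at a large angle fail it.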
  • the peripheral object position specifying unit 105 determines whether or not the distance between the first line segment L1 and the second line segment L2, each connecting object positions A1 and A2 obtained by the plurality of combinations, is equal to or less than a predetermined distance. If the distance is determined to be equal to or less than the predetermined distance, the object is determined to have a wall shape.
  • the distance between the mutually approaching positions, that is, between the position A2 of the first line segment L1 and the position A1 of the second line segment L2, is defined as the distance d1 between the line segments L1 and L2. If the distance d1 between the line segments L1 and L2 is equal to or less than a predetermined distance as shown in FIG. 13A, the object is determined to have a wall shape. On the other hand, if the distance d1 between the line segments L1 and L2 exceeds the predetermined distance as shown in FIG. 13B, the object is determined to be columnar.
  • the peripheral object position specifying unit 105 determines that the object is wall-shaped, it specifies the position of the object by the tangent method. On the other hand, if it is determined that the object has a columnar shape, the peripheral object position specifying unit 105 specifies the position of the object by trigonometry.
  • the shape of the object can thus be determined from the distance d1 between the line segments L1 and L2. If the object is columnar, specifying its position by the tangent method produces a large error, and as a result the distance d1 between the line segments L1 and L2 is expected to increase. Further, even when method (1) above determines that the difference between the inclinations of the line segments L1 and L2 is equal to or smaller than the predetermined angle, the shape of the object could still be misjudged if the distance d1 between the line segments L1 and L2 is large.
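The gap test of method (2) can be sketched as below. The text measures d1 between the approaching endpoints A2 of L1 and A1 of L2; the sketch takes the nearest endpoint pair, which covers that case, and the 0.3 m threshold is an illustrative placeholder:

```python
import math

def segment_gap(seg1, seg2):
    """Distance d1 between the mutually nearest endpoints of the two
    segments (a generalisation of the A2-to-A1 distance in the text)."""
    return min(math.dist(p, q) for p in seg1 for q in seg2)

def looks_like_wall_by_gap(seg1, seg2, max_gap=0.3):
    # max_gap is an assumed threshold for the "predetermined distance"
    return segment_gap(seg1, seg2) <= max_gap
```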
  • the peripheral object position specifying unit 105 determines whether or not the length of the overlapping portion of the line segments L1 and L2, each connecting object positions A1 and A2 obtained by the plurality of combinations, is less than a predetermined length. If the length is determined to be less than the predetermined length, the object is determined to have a wall shape.
  • straight lines LL1 to LL4 are drawn from a specific distance measuring sensor 120 (sonar FIL in the illustrated example) through the positions A1 and A2 that form the two line segments L1 and L2.
  • the distance d2 between the straight lines passing through the mutually approaching positions A1 and A2 (LL2 and LL3 in the illustrated example) is defined as the length of the overlapping portion of the line segments L1 and L2.
  • the overlap ratio of the line segments L1 and L2 is calculated by dividing the distance d2 by the length of the line segment L1.
  • the peripheral object position specifying unit 105 determines the shape of the object based on the overlapping rate.
  • the peripheral object position specifying unit 105 determines that the object is wall-shaped, it specifies the position of the object by the tangent method. On the other hand, if it is determined that the object has a columnar shape, the peripheral object position specifying unit 105 specifies the position of the object by trigonometry.
  • the shape of the object can thus be determined from the overlap ratio of the line segments L1 and L2. If the object is columnar, specifying its position by the tangent method produces a large error, and as a result the overlap ratio of the line segments L1 and L2 is expected to increase.
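The overlap test of method (3) can be sketched as below. The text constructs the overlap with rays from a specific sensor; as a simplification, the sketch measures the overlap of the two segments projected onto the direction of L1, which agrees with the ray construction when the segments are roughly broadside to the sensor. The 0.2 threshold is an illustrative placeholder:

```python
def overlap_ratio(seg1, seg2):
    """Overlap of seg2 with seg1, measured along seg1's direction and
    normalised by seg1's length (a simplification of the ray-based
    construction in the text)."""
    (x1, y1), (x2, y2) = seg1
    dx, dy = x2 - x1, y2 - y1
    length2 = dx * dx + dy * dy

    def t(p):
        # scalar projection of p onto seg1, in units of seg1's length
        return ((p[0] - x1) * dx + (p[1] - y1) * dy) / length2

    t3, t4 = t(seg2[0]), t(seg2[1])
    lo2, hi2 = min(t3, t4), max(t3, t4)
    return max(0.0, min(1.0, hi2) - max(0.0, lo2))

def looks_like_wall_by_overlap(seg1, seg2, max_ratio=0.2):
    # max_ratio is an assumed threshold for the "predetermined value"
    return overlap_ratio(seg1, seg2) < max_ratio
```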
  • the peripheral object position specifying unit 105 determines the shape of the object based on all of the conditions of methods (1) to (3) above. Alternatively, the shape of the object may be determined using any one of the methods alone or any combination of them.
  • the process is started after the sensor drive control unit 101 has performed the ultrasonic transmission/reception operation by the distance measuring sensors 120 in all of the drive patterns of groups A to D, and the results of the object position specifying operations by the peripheral object position specifying unit 105 have been stored in the storage unit 110 as the sensor information 112.
  • the operation shown in the flowchart of FIG. 4 is executed for each group.
  • in step S1, the sensor drive control unit 101 reads the pattern of a specific group by referring to the sensor drive pattern table 111 stored in the storage unit 110, and controls the driving of each distance measuring sensor 120 as the first distance measuring sensor 121 or the second distance measuring sensor 122 based on the read pattern.
  • the distance measurement sensor 120 which is driven and controlled as the first distance measurement sensor 121 and the second distance measurement sensor 122, receives ultrasonic waves reflected from an object existing around the vehicle V.
  • in step S2, the first distance calculator 102 and the second distance calculator 103 calculate the distance to the object based on the ultrasonic waves received in step S1.
  • the intersection position calculation unit 104 calculates the intersection between the circle and the ellipse
  • the peripheral object position identification unit 105 identifies the position of the object based on the intersection calculated by the intersection position calculation unit 104.
  • the peripheral object position specifying unit 105 specifies the position of the object using both trigonometry and the tangent method described above.
  • in step S3, the coordinate conversion unit 106 performs coordinate conversion on the position coordinates of the object specified in step S2 and calculates position coordinates in the absolute coordinate system.
  • in step S4, the control unit 100 stores the information obtained in steps S1 to S3 in the storage unit 110 as the sensor information 112.
  • the information stored in step S4 contains at least the ID of the first distance measuring sensor 121, the ID of the second distance measuring sensor 122, the position information of the object calculated by the peripheral object position specifying unit 105 using the tangent method, and the position information of the vehicle V.
  • in step S4, if sensor information 112 in which both the ID of the first distance measuring sensor 121 and the ID of the second distance measuring sensor 122 match has already been stored in the storage unit 110, the control unit 100 overwrites that data.
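The overwrite behaviour amounts to keying the store by the ordered (first sensor ID, second sensor ID) pair, so each combination holds only its latest result. A minimal sketch with illustrative names:

```python
# Minimal sketch of the sensor-info store: one record per ordered
# (first sensor ID, second sensor ID) pair, overwritten on re-measurement.
sensor_info = {}

def store_result(first_id, second_id, object_pos, vehicle_pos):
    sensor_info[(first_id, second_id)] = {
        "object_pos": object_pos,    # tangent-method position of the object
        "vehicle_pos": vehicle_pos,  # vehicle position at storage time
    }

store_result("FIL", "FOL", (1.0, 2.0), (0.0, 0.0))
store_result("FIL", "FOL", (1.1, 2.0), (0.5, 0.0))  # overwrites the first record
```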
  • in step S5, the peripheral object position specifying unit 105 refers to the sensor combination table 113 stored in the storage unit 110, and reads from the storage unit 110 the sensor information 112 based on the combination of first distance measuring sensor 121 and second distance measuring sensor 122 corresponding to the combination used to acquire the ultrasonic reception information in step S1.
  • in step S6, the coordinate conversion unit 106 corrects the position information of the object in the sensor information 112 read in step S5 with reference to the distance the vehicle V has moved between the time the read sensor information 112 was stored in the storage unit 110 and the present. The correction by the coordinate conversion unit 106 may be performed based on the position information of the vehicle V included in the sensor information 112.
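The correction is a planar rigid-motion update: the stored object position is re-expressed in the vehicle frame after the vehicle has translated and rotated since the measurement. A sketch; the sign conventions and parameter names are assumptions, not taken from the text:

```python
import math

def compensate_motion(stored_pos, dxy, dtheta):
    """Re-express a stored object position in the current vehicle frame
    after the vehicle has translated by dxy and rotated by dtheta
    (radians) since the measurement was stored."""
    x = stored_pos[0] - dxy[0]
    y = stored_pos[1] - dxy[1]
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return (c * x - s * y, s * x + c * y)
```

With no rotation, an object 2 m ahead appears 1 m ahead after the vehicle advances 1 m.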
  • in step S7, the peripheral object position specifying unit 105 calculates the difference between the inclinations of the line segments L1 and L2 (see FIG. 12).
  • in step S8, the peripheral object position specifying unit 105 determines whether or not the difference between the inclinations of the line segments L1 and L2 calculated in step S7 is within a predetermined angle. If it is determined to be within the predetermined angle (YES in step S8), the program proceeds to step S9. If it is determined to exceed the predetermined angle (NO in step S8), the program proceeds to step S14.
  • in step S9, the peripheral object position specifying unit 105 calculates the distance d1 between the line segments L1 and L2 (see FIG. 13).
  • in step S10, the peripheral object position specifying unit 105 determines whether or not the distance d1 between the line segments L1 and L2 calculated in step S9 is equal to or less than a predetermined distance. If it is determined to be equal to or less than the predetermined distance (YES in step S10), the program proceeds to step S11. If it is determined to exceed the predetermined distance (NO in step S10), the program proceeds to step S14.
  • in step S11, the peripheral object position specifying unit 105 calculates the overlap ratio of the line segments L1 and L2 (see FIGS. 14 and 15).
  • in step S12, the peripheral object position specifying unit 105 determines whether or not the overlap ratio of the line segments L1 and L2 calculated in step S11 is less than a predetermined value. If it is determined to be less than the predetermined value (YES in step S12), the program proceeds to step S13. If it is determined to be equal to or more than the predetermined value (NO in step S12), the program proceeds to step S14.
  • in step S13, the peripheral object position specifying unit 105 determines that the object has a wall shape and specifies its position accordingly.
  • in step S14, the peripheral object position specifying unit 105 determines that the object is columnar and specifies its position accordingly.
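Steps S7 to S14 amount to a short decision cascade: the object is classed as a wall only if all three checks pass, and as a column otherwise. A sketch that mirrors the branch directions of the flowchart; the three thresholds are illustrative placeholders:

```python
def classify_shape(slope_diff_deg, gap, overlap,
                   max_deg=10.0, max_gap=0.3, max_overlap=0.2):
    """Mirror of steps S7-S14: every test must pass for 'wall'.
    All three thresholds are assumed values, not from the text."""
    if slope_diff_deg > max_deg:  # S8: NO -> column (S14)
        return "column"
    if gap > max_gap:             # S10: NO -> column (S14)
        return "column"
    if overlap >= max_overlap:    # S12: NO -> column (S14)
        return "column"
    return "wall"                 # S13

print(classify_shape(2.0, 0.1, 0.05))  # wall
print(classify_shape(2.0, 0.1, 0.5))   # column
```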
  • the sensor drive control unit 101 drives a specific distance measuring sensor 120 as the first distance measuring sensor 121 at a specific time, and drives the distance measuring sensor 120 adjacent to that specific distance measuring sensor 120 as the second distance measuring sensor 122.
  • at a time different from the specific time, the sensor drive control unit 101 drives the specific distance measuring sensor 120 as the second distance measuring sensor 122, and drives the distance measuring sensor 120 adjacent to it as the first distance measuring sensor 121.
  • the peripheral object position specifying unit 105 obtains a first line segment L1 connecting the intersections of the circle and the ellipse with their common tangent when the specific distance measuring sensor 120 is driven as the first distance measuring sensor 121, and a second line segment L2 connecting the intersections of the circle and the ellipse with their common tangent when the specific distance measuring sensor 120 is driven as the second distance measuring sensor 122.
  • the shape of the object is determined based on the first line segment L1 and the second line segment L2, and the position of the object is specified based on the determination result.
  • accordingly, a position matched to the shape of the object can be calculated with a simple configuration, providing a positioning support device capable of improving the positional accuracy of the object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a positioning support device (1) that improves the positional accuracy of an object by calculating its position in accordance with the shape of the object. The device comprises a peripheral object position specifying unit (105) for specifying the position of the object based on an intersection calculated by an intersection position calculation unit (104). The peripheral object position specifying unit (105) obtains a first line segment L1 joining intersection points of a circle and an ellipse, obtained when a specific distance measuring sensor (120) is driven as a first distance measuring sensor (121), with a tangent to the circle and the ellipse, and a second line segment L2 joining intersection points of a circle and an ellipse, obtained when the specific distance measuring sensor (120) is driven as a second distance measuring sensor (122), with a tangent to the circle and the ellipse, and specifies the position of the object based on the first line segment L1 and the second line segment L2.
PCT/JP2019/023104 2018-07-11 2019-06-11 Positioning support device and positioning support method WO2020012852A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-131306 2018-07-11
JP2018131306 2018-07-11

Publications (1)

Publication Number Publication Date
WO2020012852A1 true WO2020012852A1 (fr) 2020-01-16

Family

ID=69141509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/023104 WO2020012852A1 (fr) Positioning support device and positioning support method

Country Status (1)

Country Link
WO (1) WO2020012852A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003194938A * 2001-12-25 2003-07-09 Denso Corp Obstacle detection device
WO2003060552A1 * 2002-01-09 2003-07-24 M/A-Com, Inc. Method and apparatus for identifying complex objects based on range readings from multiple sensors
WO2016103464A1 * 2014-12-26 2016-06-30 三菱電機株式会社 Obstacle detection device and obstacle detection method
JP2017142171A * 2016-02-10 2017-08-17 株式会社Soken Object detection device
WO2018043028A1 * 2016-08-29 2018-03-08 株式会社デンソー Surroundings monitoring device and surroundings monitoring method


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210231799A1 (en) * 2018-10-19 2021-07-29 Denso Corporation Object detection device, object detection method and program
CN113359136A * 2020-03-06 2021-09-07 华为技术有限公司 Target detection method and apparatus, and distributed radar system
WO2022233371A1 * 2021-05-07 2022-11-10 Continental Autonomous Mobility Germany GmbH Method for determining the position of an object
DE102021204635A1 2021-05-07 2022-11-10 Continental Autonomous Mobility Germany GmbH Method for determining the position of an object

Similar Documents

Publication Publication Date Title
EP3048022B1 (fr) Collision avoidance control system and control method
US9707959B2 (en) Driving assistance apparatus
JP6361567B2 (ja) Autonomous driving vehicle system
JP6555067B2 (ja) Lane change support device
CN108928343A (zh) Panoramic-fusion automatic parking system and method
WO2020012852A1 (fr) Positioning support device and positioning support method
JP6413621B2 (ja) In-vehicle object discrimination device
JP6421716B2 (ja) Vehicle driving support control device
US11941834B2 (en) Trailer angle determination system for a vehicle
JP2015105070A (ja) Parking assistance device
JP5906999B2 (ja) Parking assistance device
JP6658235B2 (ja) Lane keeping device
JP2016112911A (ja) Vehicle travel control device
CN110730735A (zh) Parking assistance method and parking assistance device
WO2019058578A1 (fr) Positioning support device
JP5786775B2 (ja) Parking assistance device
JP5041983B2 (ja) Obstacle warning device, obstacle warning method, and computer program
JP4957589B2 (ja) Obstacle warning device, obstacle warning method, and computer program
JP2014034321A (ja) Parking assistance device
JP5880858B2 (ja) Parking assistance device
JP5166975B2 (ja) Vehicle periphery monitoring device and vehicle periphery monitoring method
JP2019028486A (ja) Obstacle detection device and obstacle detection method
US9132858B2 (en) Method and device for assisting a driver of a motor vehicle during a driving maneuver
WO2016063533A1 (fr) In-vehicle object determination apparatus
JP4893678B2 (ja) Other-vehicle detection device, other-vehicle detection method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19833011

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19833011

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP