WO2004083889A1 - Obstacle detection device - Google Patents
Obstacle detection device
- Publication number
- WO2004083889A1 (PCT/JP2004/003026; JP2004003026W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- obstacle
- distance
- detection device
- image
- obstacle detection
- Prior art date
Links
- 238000001514 detection method Methods 0.000 title claims abstract description 145
- 238000004364 calculation method Methods 0.000 claims abstract description 39
- 238000000034 method Methods 0.000 claims description 24
- 230000005855 radiation Effects 0.000 claims description 17
- 230000008859 change Effects 0.000 claims description 8
- 239000000284 extract Substances 0.000 claims description 3
- 239000013598 vector Substances 0.000 description 39
- 238000010586 diagram Methods 0.000 description 29
- 230000005540 biological transmission Effects 0.000 description 16
- 230000000875 corresponding effect Effects 0.000 description 16
- 230000008569 process Effects 0.000 description 14
- 238000013500 data storage Methods 0.000 description 11
- 238000012545 processing Methods 0.000 description 11
- 239000011248 coating agent Substances 0.000 description 4
- 238000000576 coating method Methods 0.000 description 4
- 230000002596 correlated effect Effects 0.000 description 2
- 238000009434 installation Methods 0.000 description 2
- 238000013459 approach Methods 0.000 description 1
- 230000001276 controlling effect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000014509 gene expression Effects 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000003973 paint Substances 0.000 description 1
- 238000010422 painting Methods 0.000 description 1
- 238000002310 reflectometry Methods 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/027—Parking aids, e.g. instruction means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/586—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T2201/00—Particular use of vehicle brake systems; Special systems using also the brakes; Special software modules within the brake system controller
- B60T2201/10—Automatic or semi-automatic parking aid systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9314—Parking operations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93271—Sensor installation details in the front of the vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93272—Sensor installation details in the back of the vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93274—Sensor installation details on the side of the vehicles
Definitions
- the present invention relates to an obstacle detection device, and more particularly, to an obstacle detection device that is mounted on a vehicle and detects and displays an obstacle around the vehicle.
- as one obstacle detection device (hereinafter referred to as a first obstacle detection device), some use a radar that detects obstacles within a predetermined angle range.
- when the first obstacle detection device detects an obstacle, it turns on a display lamp indicating the direction in which the obstacle was detected and displays the distance to the obstacle as a number.
- as another obstacle detection device (hereinafter referred to as a second obstacle detection device), some detect obstacles around the vehicle using a radar and display an image representing the shape of the detected obstacle.
- the second obstacle detection device accumulates, as point data, the positions of obstacles accurately detected using a laser radar, and creates a map of the vehicle's surroundings by drawing a diagram representing the outline of the obstacle based on the accumulated point data.
- because the first obstacle detection device indicates the position of an obstacle only by turning on a lamp indicating its direction and by a numerical value indicating its distance, it is difficult for the driver to intuitively grasp the positional relationship between the host vehicle and the obstacle.
- the second obstacle detection device displays a map showing the outline of the detected obstacle in a diagram, so that the driver can easily grasp the positional relationship between the host vehicle and the obstacle.
- the second obstacle detection device detects the position of an obstacle as a point.
- to do so, a laser radar having a very narrow beam spread angle is required.
- however, a laser radar is expensive, so the cost of the entire apparatus becomes very high.
- in the case of an ultrasonic or radio wave radar, the horn that emits the ultrasonic waves or the antenna that radiates the radio waves must be very large in order to reduce the spread angle of the beam, so if there is not enough space on the vehicle, installation becomes a problem. In other words, whatever the type of beam, whether laser, sound wave, or radio wave, there is a problem of cost or installation, so it is not practical to use a radar that emits a beam with a very narrow spread angle in an obstacle detection device.
- an object of the present invention is to provide an obstacle detection device that displays the positional relationship between the vehicle and surrounding obstacles in an easily understandable manner. Disclosure of the invention
- a first aspect of the present invention is an obstacle detection device that is mounted on a vehicle and detects and displays an obstacle around the vehicle. The device sequentially emits a beam having a predetermined divergence angle in a plurality of different azimuths, receives reflected waves from obstacles in each azimuth, and includes:
- an obstacle detection unit that detects an obstacle existing within the radiation angle range of the beam in each azimuth;
- a distance calculation unit that calculates a distance representative of the distance between the obstacle and the vehicle in each azimuth, based on the received signal of the reflected wave in each azimuth output from the obstacle detection unit;
- an obstacle image creation unit that treats the distance in each azimuth calculated by the distance calculation unit as an image creation reference, creates a two-dimensionally expanded figure as an obstacle image within the radiation angle range of the beam emitted in each azimuth, and generates and outputs image data for displaying the obstacle image; and
- a display unit that receives the image data created by the obstacle image creation unit and displays an image indicating the positional relationship between the obstacle and the vehicle.
- preferably, the distance calculation unit calculates the average distance, as viewed from the radiation point of the beam, over the range in which the obstacle exists as indicated by the received signal of the reflected wave output from the obstacle detection unit.
- in this case, the distance calculation unit includes a threshold discrimination unit that detects the portion where the amplitude of the received signal of the reflected wave output from the obstacle detection unit exceeds a predetermined threshold, and a representative distance calculation unit that detects the start time and end time of the detected received-signal portion, obtains the time elapsed from the emission of the beam to the simple average of the detected start time and end time, and calculates a representative distance between the obstacle and the host vehicle based on the obtained elapsed time. Alternatively, the distance calculation unit calculates the shortest distance, as viewed from the radiation point of the beam, within the range in which the obstacle exists as indicated by the received signal of the reflected wave output from the obstacle detection unit.
- in this case, the distance calculation unit includes a threshold discrimination unit that detects the portion where the amplitude of the received signal of the reflected wave output from the obstacle detection unit exceeds a predetermined threshold, and a representative distance calculation unit that detects the start time and end time of the received-signal portion detected by the threshold discrimination unit, obtains the time elapsed from the emission of the beam to the detected start time, and calculates a representative distance between the obstacle and the host vehicle based on the obtained elapsed time.
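- the two representative-distance rules above (simple average of the echo's start and end times, and start time alone for the shortest distance) can be sketched in code. This is a minimal illustration assuming a uniformly sampled received signal; the sampling representation, the threshold value, and the function name are assumptions for the sketch, not taken from the patent.

```python
C = 3.0e8  # propagation speed of a radio-wave beam (m/s)

def representative_distance(times, amplitudes, threshold):
    """Return (average_distance, shortest_distance) in metres.

    times: sample times measured from the emission of the beam (s)
    amplitudes: received signal amplitude at each sample
    threshold: level used by the threshold discrimination unit
    Returns None when no sample exceeds the threshold (no obstacle echo).
    """
    above = [t for t, a in zip(times, amplitudes) if abs(a) > threshold]
    if not above:
        return None
    t_start, t_end = above[0], above[-1]
    # Rule 1: simple average of start and end times -> distance representative
    # of (the average over) the range in which the obstacle exists.
    average = C * ((t_start + t_end) / 2.0) / 2.0
    # Rule 2: start time alone -> shortest distance within that range.
    shortest = C * t_start / 2.0
    return average, shortest
```

For an echo whose reflections extend from 20 m to 22 m, this yields an average distance of about 21 m and a shortest distance of about 20 m.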
- preferably, the obstacle image creation unit creates, as the obstacle image, an arc figure centered on the radiation point of the beam with a radius equal to the distance calculated by the distance calculation unit for the corresponding azimuth, within the respective radiation angle range of the beam radiated in each azimuth.
- preferably, the obstacle image creation unit changes the line thickness of the arc figure created as the obstacle image for each azimuth according to the distance calculated by the distance calculation unit.
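- the arc figure and the distance-dependent line thickness can be illustrated as follows; the coordinate convention (y along the vehicle center line), the point count, and the thickness mapping are assumptions for the sketch, not values from the patent.

```python
import math

def arc_obstacle_image(origin, azimuth_deg, spread_deg, distance, n=16):
    """Polyline approximating the arc figure: centred on the beam radiation
    point `origin`, spanning the beam's radiation angle range
    (azimuth_deg +/- spread_deg / 2), radius equal to the representative
    distance calculated for that azimuth."""
    x0, y0 = origin
    half = spread_deg / 2.0
    points = []
    for i in range(n + 1):
        a = math.radians(azimuth_deg - half + spread_deg * i / n)
        points.append((x0 + distance * math.sin(a), y0 + distance * math.cos(a)))
    return points

def arc_line_width(distance, near=0.5, far=5.0, w_near=5, w_far=1):
    """Line thickness varying with distance: nearer obstacles are drawn
    with thicker, more salient strokes (illustrative mapping)."""
    if distance <= near:
        return w_near
    if distance >= far:
        return w_far
    frac = (distance - near) / (far - near)
    return round(w_near + frac * (w_far - w_near))
```

Every point of the returned polyline lies exactly at the calculated distance from the radiation point, so the figure traces the claimed arc.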
- alternatively, the obstacle image creation unit creates, as the obstacle image, a figure that has an area and includes at least the arc locus drawn with the radiation point of the beam as the center and the distance calculated by the distance calculation unit for the corresponding azimuth as the radius, within the respective radiation angle range of the beam radiated in each azimuth.
- the obstacle image created by the obstacle image creation unit is, for example, an elliptical figure whose end points on the major-axis side coincide with both end points of the arc locus.
- specifically, the obstacle image creation unit changes the brightness of the entire figure created as the obstacle image according to the distance calculated by the distance calculation unit.
- the obstacle image creation unit may further treat the figure having an area created for each azimuth as a reference figure, connect all the reference figures in azimuth order with line segments joining one set of end points of the arc loci included in azimuth-adjacent reference figures and line segments joining the other set of end points, and treat the resulting overall figure as the obstacle image; the inside of the overall figure is divided based on the distance from the radiation point of the beam, and image data in which the brightness of each divided portion changes stepwise is created.
- the obstacle image creation unit may further treat, as the representative position of the obstacle in each azimuth, the point in the center direction of the radiation angle range of the beam emitted in that azimuth that is separated from the emission point of the beam by the distance calculated by the distance calculation unit, and create image data of a polygonal line connecting the representative positions in azimuth order with line segments.
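- a sketch of the representative-point polyline just described; the axis convention (y along the vehicle center line) and the data layout are illustrative assumptions.

```python
import math

def representative_polyline(origin, measurements):
    """One representative point per azimuth: placed in the centre direction
    of the beam's radiation angle range, at the calculated distance from
    the emission point; joining the points in azimuth order gives the
    polygonal-line obstacle image.

    measurements: (centre_azimuth_deg, distance_m) pairs sorted by azimuth.
    """
    x0, y0 = origin
    points = []
    for az_deg, dist in measurements:
        a = math.radians(az_deg)
        points.append((x0 + dist * math.sin(a), y0 + dist * math.cos(a)))
    return points
```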
- as described above, based on the azimuth in which the beam was emitted and the representative distance from the vehicle to the obstacle calculated for each azimuth, the obstacle detection device can display the positional relationship between the own vehicle and the obstacle so that the driver can grasp it easily and intuitively.
- in addition, since the obstacle image indicating the position of the obstacle is displayed over the radiation angle range of the beam in each azimuth, the angle range in which the obstacle exists can be displayed in a form the driver can easily understand.
- a second aspect of the present invention is an obstacle detection device that is mounted on a vehicle, detects and displays an obstacle around the vehicle, and sequentially emits a beam having a predetermined spread angle in a plurality of different azimuths and receives reflected waves from obstacles in each azimuth. The device includes:
- an obstacle detection unit that detects obstacles within the beam radiation angle range for each azimuth;
- a distance calculation unit that calculates a distance representative of the distance between the obstacle and the vehicle in each azimuth, based on the received signal of the reflected wave in each azimuth output from the obstacle detection unit;
- an obstacle data calculation unit that calculates the position of the detected obstacle based on the azimuth in which each beam was emitted and the distance calculated by the distance calculation unit;
- a shape data matching unit that stores in advance shape data representing the shape of the obstacle to be detected, and calculates the position and angle of the obstacle to be detected with respect to the vehicle by comparing the shape data with the obstacle data calculated by the obstacle data calculation unit;
- an obstacle image creation unit that creates a target obstacle image in which the position and angle of the shape of the obstacle to be detected are changed based on the shape data of the obstacle to be detected and the position and angle calculated by the shape data matching unit, and generates image data for displaying the target obstacle image; and
- a display unit that receives the image data created by the obstacle image creation unit and displays an image indicating the positional relationship between the obstacle and the host vehicle.
- preferably, the obstacle image creation unit treats the distance in each azimuth calculated by the distance calculation unit as an image creation reference, creates as a detected obstacle image a figure two-dimensionally expanded over the entire radiation angle range of the beam in each azimuth, generates image data for displaying the detected obstacle image, and outputs it together with the image data of the target obstacle image.
- in this case, the display unit receives the image data of the target obstacle image and the image data of the detected obstacle image created by the obstacle image creation unit, and displays the target obstacle image and the detected obstacle image superimposed on each other.
- as described above, the obstacle detection device receives in advance shape data representing the shape of the obstacle to be detected and matches it against the positions of the obstacle detected by the obstacle detection means; since it then displays an image showing the relative positional relationship between the obstacle and the own vehicle, the driver can easily grasp the positional relationship between the entire obstacle and the own vehicle.
- FIG. 1 is a block diagram showing a configuration of an obstacle detection device according to a first embodiment of the present invention.
- FIG. 2 is a schematic diagram of a beam emitted by the obstacle detection device according to the first embodiment of the present invention.
- FIG. 3 is a flowchart showing the operation of the obstacle detection device according to the first to fourth embodiments of the present invention.
- FIG. 4 is a schematic diagram for explaining distance data calculated by the distance calculation unit in the first embodiment of the present invention.
- FIGS. 5A to 5C are schematic diagrams illustrating transmission signals and reception signals transmitted and received by the obstacle detection unit according to the first embodiment of the present invention.
- FIG. 6 is a flowchart showing the operation of the subroutine step S107 of FIG. 3 in the first embodiment of the present invention.
- FIG. 7 is a diagram for explaining the operation when the obstacle image creating unit creates an obstacle image in the first embodiment of the present invention.
- FIG. 8 is a flowchart showing the operation of the subroutine step S108 of FIG. 3 in the first embodiment of the present invention.
- FIGS. 9A and 9B are schematic views illustrating an obstacle image displayed on the display unit in the first embodiment of the present invention.
- FIGS. 10A and 10B are schematic diagrams showing another example of the obstacle image displayed on the display unit in the first embodiment of the present invention.
- FIG. 11 is a flowchart showing the operation of the subroutine step S107 of FIG. 3 in the second embodiment of the present invention.
- FIGS. 12A and 12B are schematic diagrams illustrating an obstacle image displayed on the display unit according to the second embodiment of the present invention.
- FIGS. 13A and 13B are schematic diagrams showing another example of the obstacle image displayed on the display unit in the second embodiment of the present invention.
- FIG. 14 is a flowchart showing the operation of the subroutine step S107 of FIG. 3 in the third embodiment of the present invention.
- FIG. 15 is a diagram for explaining an operation when an obstacle image creating unit creates an obstacle image in the third embodiment of the present invention.
- FIGS. 16A and 16B are schematic diagrams illustrating an obstacle image displayed on the display unit in the third embodiment of the present invention.
- FIGS. 17A and 17B are schematic diagrams showing another example of the obstacle image displayed on the display unit in the third embodiment of the present invention.
- FIG. 18 is a flowchart showing the operation of the subroutine step S107 of FIG. 3 in the fourth embodiment of the present invention.
- FIGS. 19A and 19B are schematic diagrams illustrating an obstacle image displayed on the display unit in the fourth embodiment of the present invention.
- FIGS. 20A and 20B are schematic diagrams showing another example of the obstacle image displayed on the display unit in the fourth embodiment of the present invention.
- FIG. 21 is a block diagram illustrating a configuration of an obstacle detection device according to a fifth embodiment of the present invention.
- FIG. 22 is a flowchart showing the operation when the obstacle detection device according to the fifth embodiment of the present invention sets shape data.
- FIG. 23 is an example of a shape data number setting screen displayed on the display unit in the fifth embodiment of the present invention.
- FIGS. 24A and 24B are schematic diagrams illustrating a parking type selection screen and a parking area size setting screen displayed on the display unit in the fifth embodiment of the present invention.
- FIGS. 25A to 25H are schematic diagrams illustrating parking lot types selected at the time of inputting shape data in the fifth embodiment of the present invention.
- FIG. 26 is a diagram showing shape points and shape vectors stored as shape data in the fifth embodiment of the present invention.
- FIG. 27 is a flowchart showing the operation of the obstacle detection device according to the fifth embodiment of the present invention.
- FIG. 28 is a flowchart showing the operation of the subroutine step S707 of FIG. 27 in the fifth embodiment of the present invention.
- FIG. 29 is a diagram showing obstacle detection points according to the fifth embodiment of the present invention.
- FIG. 30 is a schematic diagram showing obstacle detection points and interpolation points according to the fifth embodiment of the present invention.
- FIGS. 31A and 31B are schematic diagrams illustrating a shape vector and an obstacle vector according to the fifth embodiment of the present invention.
- FIGS. 32A to 32D are schematic views illustrating rotated shape vectors according to the fifth embodiment of the present invention.
- FIG. 33 is a diagram exemplifying the highest matching azimuth shape point detected in the fifth embodiment of the present invention.
- FIG. 34 is a diagram illustrating an obstacle image displayed on the display unit in the fifth embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of an obstacle detection device according to a first embodiment of the present invention.
- the obstacle detection device is typically mounted on a vehicle and includes an obstacle detection unit 11, a distance calculation unit 12, a control unit 13, an obstacle image creation unit 14, a display unit 15, and an input unit 16.
- the obstacle detection unit 11 is configured as a radio wave radar device that detects obstacles around the vehicle, and is installed at one or more locations selected from, for example, the front, sides, and rear of the vehicle according to the direction in which obstacles are to be detected.
- the obstacle detection unit 11 emits a beam having a predetermined divergence angle a plurality of times while changing the azimuth, and receives the reflected waves of the beam reflected by obstacles existing within the irradiation range of each beam.
- in the present embodiment, the obstacle detection unit 11 is a radio wave radar, but it is not limited to this; it may be an ultrasonic radar or a laser radar.
- the obstacle detecting unit 11 includes a transmitting unit 111, a receiving unit 112, and an antenna 113.
- the transmission unit 111 generates a transmission signal, outputs the transmission signal to the antenna 113, and also outputs a part of the transmission signal to the distance calculation unit 12.
- the antenna 113 emits a beam having a predetermined divergence angle, and the reception unit 112 receives, via the antenna 113, the reflected wave of the beam reflected by an obstacle and outputs the received signal.
- the antenna 113 is, for example, an array antenna.
- the array antenna is composed of a plurality of antenna elements arranged in the same plane and the same number of phase shifters as antenna elements, each supplying the transmission signal to its corresponding antenna element; by controlling the amplitude and phase of the signal fed to each antenna element, a beam is emitted in a desired azimuth.
- the antenna 113 may have separate antenna elements for transmission and reception, or the same antenna elements may be used for both transmission and reception by means of a transmission/reception switch or a circulator.
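- how phase control steers the beam of a uniform linear array can be sketched as follows; the linear geometry, element spacing, and wavelength are illustrative assumptions (the patent does not fix these values).

```python
import math

def element_phases(n_elements, spacing_m, wavelength_m, steer_deg):
    """Phase (radians) to apply at each element so the emitted waves add
    in phase in the steer_deg direction: a progressive phase that cancels
    the inter-element path difference n * spacing * sin(theta)."""
    k = 2.0 * math.pi / wavelength_m  # wavenumber
    theta = math.radians(steer_deg)
    return [-k * n * spacing_m * math.sin(theta) for n in range(n_elements)]
```

With half-wavelength spacing, for example, the progressive phase is about −π·sin θ per element; a zero steering angle needs no phase taper at all.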
- FIG. 2 illustrates one beam emitted by the obstacle detection unit 11.
- an antenna 113 of the obstacle detection unit 11 is installed at the center of the rear end of the vehicle V.
- the beam BM is radiated in the azimuth represented by the line segment LS, which forms a predetermined angle θ with the vehicle center line CL (dashed line) as seen from the antenna 113, and spreads within a range of a predetermined angle α centered on the line segment LS.
- the azimuth in which the beam is emitted is expressed as the angle made with respect to the vehicle center line CL as viewed from the antenna 113, where the center line CL is defined as 0°, the left side of the vehicle as the minus side, and the right side of the vehicle as the plus side.
- therefore, the azimuth represented by the line segment LS in FIG. 2 is −θ.
- the obstacle detection unit 11 radiates the beam BM over a predetermined angle range by emitting it multiple times while changing the azimuth. The distance calculation unit 12 calculates the distance from the antenna 113 to the obstacle based on the transmission signal output from the transmission unit 111 and the reception signal output from the reception unit 112, and outputs it as distance data.
- the control unit 13 outputs azimuth data indicating the azimuth in which the beam is to be radiated to the transmission unit 111, and after the beam has been radiated based on the azimuth data, stores the distance data output from the distance calculation unit 12 together with the azimuth data.
- the obstacle image creation unit 14 creates an obstacle image representing at least the position of the detected obstacle based on the azimuth data and distance data accumulated by the control unit 13.
- the display unit 15 is, for example, a display such as an LCD installed on the console of the vehicle, and displays an image including the obstacle image created by the obstacle image creation unit 14.
- the input unit 16 is, for example, a selector switch or a keyboard operated by the driver, and is used, for example, to switch the obstacle detection device on and off.
- FIG. 3 is a flowchart showing the operation of the entire obstacle detection device.
- first, the control unit 13 outputs azimuth data indicating the azimuth in which the beam is to be emitted to the transmission unit 111, and increments a counter indicating the number of beam emissions (step S101).
- the transmitting unit 11 1 transmits an azimuth control signal for emitting a beam in the azimuth indicated by the azimuth data and a transmission signal to the antenna 11.
- the beam is radiated by outputting to 3 and the transmission signal is output to the distance calculation unit 12 (step S102).
- Receiving section 112 receives the reflected wave of the beam radiated in step S102 and outputs the received signal to distance calculating section 12 (step S103).
- FIG. 4 is a diagram, viewed from above the vehicle, illustrating the vehicle V about to back up in order to park in the parking space PL.
- the obstacle S C is a structure that surrounds the parking space P L on three sides.
- the antenna 113 of the obstacle detection unit 11 is installed at the rear of vehicle V.
- the beam BM is one beam radiated from the antenna 113, and has a beam divergence angle of about 16 ° when viewed from above the vehicle.
- the direction in which the beam B M is emitted is changed by approximately 6 ° each time it is emitted, and the beam is emitted 10 times in one scan.
- One beam BM radiated from the antenna 113 spreads within the area between the two dotted lines DL and irradiates the portion of the wall WL indicated by the thick line.
- the beam BM is reflected by the wall WL, and the receiving unit 112 receives a reception signal in which the reflected waves from the various irradiated portions of the wall WL are integrated.
- distance calculating section 12 calculates a distance from the antenna to the obstacle based on the transmission signal and the reception signal (step S104).
- With reference to FIGS. 5A and 5B, the operation in step S104, in which distance calculating section 12 calculates the distance from antenna 113 to the obstacle, will be specifically described.
- FIG. 5A shows the transmission signal PB1, which is a single pulse beam, and the reception signal PB2 of the reflected wave produced when the transmission signal PB1 is reflected by the obstacle SC.
- in FIG. 5A, the ideal reception signal PB2 is shown as a rectangular pulse of the same shape as the transmission signal PB1.
- the time T1, which is the difference between the time at which the beam is transmitted and the time at which the reception signal PB2 is received by the antenna 113, is the time required for the radio wave that is the reflected wave to make a round trip between the antenna 113 and the obstacle SC.
- the distance D from the antenna 113 to the obstacle SC is therefore given by the following equation (1):

  D = c × T1 / 2 … (1)

  where c is the propagation speed of the radio wave.
- in practice, as shown in FIG. 5B, the received signal is not an ideal rectangle; a predetermined threshold value is therefore set for the amplitude of the reflected wave, and the distance calculation unit 12 calculates the average value of the time band in which the amplitude of the received signal exceeds the threshold value.
- more specifically, the distance calculation unit 12 detects the time T2 at which the received signal PB3 first exceeds the threshold value and the subsequent time T3 at which the amplitude of the received signal PB3 falls below the threshold value, and obtains the time T as the difference between the average of the times T2 and T3 and the time at which the beam was transmitted.
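The distance calculation of step S104 can be sketched as follows. This is an illustrative Python reconstruction, not the patent's implementation; the function name `distance_from_trace` and the sampled-trace representation of the received signal are assumptions.

```python
# Illustrative sketch of step S104: find the time band [T2, T3] where the
# received amplitude exceeds the threshold, take the midpoint of that band,
# subtract the transmission time, and apply D = c * T / 2 (equation (1)).
C = 299_792_458.0  # propagation speed of the radio wave [m/s]

def distance_from_trace(times, amplitudes, threshold, t_emit):
    above = [t for t, a in zip(times, amplitudes) if a > threshold]
    if not above:
        return None  # no reflection above the threshold: no obstacle detected
    t2, t3 = above[0], above[-1]           # first and last threshold crossings
    t_round_trip = (t2 + t3) / 2 - t_emit  # averaged round-trip time T
    return C * t_round_trip / 2
```

Because the midpoint of the threshold band is used, the result averages over the near and far parts of the irradiated wall, as the text describes.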
- the distance from the antenna 113 to the obstacle SC calculated by the distance calculation unit 12 as described above will be specifically described.
- since the beam BM radiated from the antenna 113 is reflected at every part of the wall WL within the irradiation range, the antenna 113 receives, as a single signal, the reflected waves from the various locations on the wall WL, from the nearest part to the farthest part.
- the distance D calculated as described above based on the reception signal of such a reflected wave is therefore an average of the distance from the antenna 113 to the nearest part of the wall WL and the distance to the farthest part.
- a circle centered on the antenna 113 whose radius is the distance D calculated as described above is represented, within the angular range irradiated by the beam BM, as an arc RC.
- in this way, the position of the obstacle SC present around the vehicle is detected.
- the narrower the spread angle of the beam, the more the arc RC on which the detected obstacle SC can exist is limited, so that the position of the obstacle SC can be specified more accurately.
- conversely, the wider the spread angle of the beam BM, the more broadly the possible position of the detectable obstacle SC spreads.
- in addition, the shorter the distance from the antenna 113 to the obstacle SC, the narrower the irradiation range of the beam BM; therefore, the shorter the distance from the antenna 113 to the obstacle SC, the more accurately the position of the obstacle SC can be detected, and the longer the distance, the more broadly the possible position of the obstacle SC spreads.
- the threshold for detecting the received signal needs to be set in advance. While the vehicle is actually running, any object can become an obstacle SC, and the reflectance of radio waves differs depending on the material and shape of the object. It is therefore preferable to specify to some extent the objects to be detected as obstacles SC, and to set the threshold to an amplitude value at which even the object with the lowest reflectance among them can be detected.
- control unit 13 accumulates the azimuth data indicating the azimuth at which the beam was emitted together with the distance data calculated by the distance calculation unit 12 (step S105). The control unit 13 then determines whether one scan has been completed by checking whether the number of beam emissions has reached the predetermined number of beams emitted in one scan (10 in the present embodiment) (step S106). If the control unit 13 determines that one scan has not yet been completed, the process returns to step S101 and the scan continues. On the other hand, when it is determined that one scan has been completed, the control unit 13 proceeds to step S107.
- FIG. 6 is a flowchart showing a subroutine process of the obstacle image creation unit 14 in step S107.
- FIG. 7 shows an obstacle image created in step S107.
- Obstacle image creation unit 14 first draws a vehicle projection image Iv in which the vehicle is projected from above (step S201). At this time, the origin of the coordinate system is defined such that the position PAT of the antenna in the vehicle projection image Iv coincides with the actual installation location of the antenna 113 on the vehicle.
- the obstacle image creation unit 14 next draws at least one obstacle presence line Lsc (step S202). The obstacle presence line Lsc is an example of an obstacle image according to the present embodiment.
- specifically, for each pair of azimuth data and corresponding distance data, the obstacle image creating unit 14 draws an arc whose angular center is the azimuth indicated by the azimuth data and whose radius is the distance indicated by the distance data.
- the obstacle presence line Lsc shown in FIG. 7 is the obstacle presence line drawn when the azimuth data is θi and the distance data is Di.
- in step S108, the obstacle image creating unit 14 draws a line segment representing the shape of the detected obstacle.
- FIG. 8 is a flowchart showing the subroutine processing of step S108.
- the obstacle image creation unit 14 calculates the coordinates of points (hereinafter referred to as obstacle detection points) Psc representing the positions of the detected obstacles based on the azimuth data and the distance data (step S301).
- the point Psc illustrated in FIG. 7 is the obstacle detection point of an obstacle whose azimuth data and distance data are θi and Di, respectively.
- the obstacle image creation unit 14 then draws a polygonal line CL connecting in sequence the coordinates of the obstacle detection points Psc calculated in step S301 (step S302). After the processing in step S302, the operation of the obstacle detection device returns to the main routine shown in FIG. 3.
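The coordinate computation of steps S301 and S302 can be sketched as follows, assuming the convention defined earlier (0° on the vehicle center line, plus to the right, antenna at the origin of the obstacle-image frame); the function names are illustrative, not from the patent.

```python
import math

def detection_point(theta_deg, dist):
    """Obstacle detection point Psc for one beam: azimuth data theta_deg
    (0 deg on the vehicle center line, plus to the right) and distance
    data dist, with the antenna at the origin."""
    th = math.radians(theta_deg)
    return (dist * math.sin(th), dist * math.cos(th))

def polyline_cl(beams):
    """Polygonal line CL: the detection points of all beams in scan order."""
    return [detection_point(t, d) for t, d in beams]
```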
- the display unit 15 displays the obstacle image created by the obstacle image creation unit 14 in step S107 to the driver (step S109).
- the control unit 13 determines whether or not the system of the obstacle detection device is on (step S110). The on/off state of the obstacle detection device system is input, for example, by a selector switch provided in the input unit 16 operated by the driver. If the determination is Yes, the control unit 13 resets the count of beam emissions (step S111), returns to step S101, and starts scanning for obstacles again. On the other hand, if the determination is No, the operation of the obstacle detection device ends.
- FIGS. 9A, 9B, 10A and 10B show, step by step, the obstacle images produced as the vehicle V in FIG. 4 moves into the parking space PL.
- in these figures, the lines corresponding to the actual obstacles are shown by dotted lines for comparison, but these dotted lines are not drawn in the actual obstacle image.
- as the vehicle moves and the distance to the obstacle becomes shorter, the irradiation range of each beam is more limited, so the obstacle presence lines (arcs) become shorter and the position of the obstacle is drawn more accurately.
- the driver can grasp the shape of the obstacle by referring to the polygonal line CL connecting the center points of the obstacle presence lines Lsc.
- in each figure, for convenience, only one obstacle presence line is labeled with the reference code Lsc.
- as described above, the present obstacle detection device displays the shape of the obstacle by drawing a line connecting the points representing the representative position of the detected obstacle in the irradiation angle range of each beam.
- in the present embodiment, the distance calculation unit 12 calculated the distance to the obstacle by taking as the time T the difference between the average value of the time band in which the amplitude of the received signal exceeds the predetermined threshold and the time at which the beam was emitted.
- however, as shown in FIG. 5C, the difference between the time T2 at which the amplitude of the received signal first exceeds the predetermined threshold and the time at which the beam was emitted, based on the transmission signal, may instead be set as the time T, and the distance to the obstacle calculated from it.
- the distance calculated in this way is the distance to the part of the obstacle nearest the vehicle among the obstacles within the irradiation range of each beam, and is drawn on the obstacle image.
- in this case, the obstacle presence line indicates the position closest to the vehicle at which the detected obstacle exists within the irradiation range of each beam. The driver can therefore prevent the vehicle from contacting the obstacle by driving so that the vehicle projection image Iv in the obstacle image does not come into contact with the obstacle presence line Lsc.
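The leading-edge variant of FIG. 5C can be sketched as follows (illustrative Python; the names are assumptions). Only the first threshold crossing T2 is used, so the result is the distance to the nearest reflecting part within the beam.

```python
C = 299_792_458.0  # propagation speed of the radio wave [m/s]

def nearest_distance_from_trace(times, amplitudes, threshold, t_emit):
    """FIG. 5C variant: the round-trip time is taken from the first
    threshold crossing T2 only, giving the nearest-part distance."""
    for t, a in zip(times, amplitudes):
        if a > threshold:
            return C * (t - t_emit) / 2
    return None  # no reflection above the threshold
```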
- the obstacle detection unit 11 of the obstacle detection device described above changes the direction in which the beam is radiated electronically, using an array antenna.
- alternatively, the azimuth of the radiated beam may be changed by mechanically changing the azimuth of an antenna that radiates the beam in one fixed direction.
- the obstacle detection device according to the present embodiment is characterized in that the shorter the distance from the radar to the obstacle, the thicker the line with which the obstacle presence line is drawn in the obstacle image. Because of this feature, the obstacle image creation unit 14 provided in the obstacle detection device performs processing different from that of the above-described first embodiment.
- FIG. 11 is a flowchart showing the operation of the obstacle image creation unit 14.
- the flowchart of FIG. 11 corresponds to the flowchart according to the first embodiment (see FIG. 6) with the processing of step S202 replaced by the processing of steps S402 and S403. Therefore, among the steps shown in FIG. 11, the steps identical to those in FIG. 6 will not be described.
- the obstacle image creation unit 14 determines, for each of the ten distance data stored in the control unit 13, a distance level that represents the distance from the vehicle in a predetermined number of steps (step S402).
- in the present embodiment, the number of distance levels is four: level 1, level 2, level 3 and level 4, in order from the shortest distance.
- ΔD = (Dmax − Dmin) / 4
- the distance level for the i-th distance data Di is determined by which of the following conditional expressions (2) to (5) is satisfied: if the distance data Di satisfies expression (2), the distance level is set to level 1; if it satisfies expression (3), to level 2; if it satisfies expression (4), to level 3; and if it satisfies expression (5), to level 4.
- the obstacle image creation unit 14 changes the line thickness according to the distance level determined in step S402 for each distance data, and draws each obstacle presence line in the same manner as in the first embodiment based on the respective azimuth data and distance data (step S403). For example, line thicknesses of 1.0 mm, 0.7 mm, 0.4 mm and 0.2 mm are specified in advance for distances of level 1, level 2, level 3 and level 4, respectively, and the obstacle image creation unit 14 draws each obstacle presence line with the line thickness corresponding to its distance level.
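The level determination of step S402 and the thickness lookup of step S403 can be sketched as follows, under the assumption (consistent with the formula above) that ΔD = (Dmax − Dmin) / 4 partitions the distance range into four equal bands; the names are illustrative.

```python
WIDTHS_MM = {1: 1.0, 2: 0.7, 3: 0.4, 4: 0.2}  # line widths given in the text

def distance_level(d, d_min, d_max, n_levels=4):
    """Quantize distance d into n_levels equal bands of width
    (d_max - d_min) / n_levels; level 1 is the nearest band."""
    step = (d_max - d_min) / n_levels
    level = int((d - d_min) // step) + 1
    return min(max(level, 1), n_levels)  # clamp d == d_max into level 4

def line_width_mm(d, d_min, d_max):
    """Drawing thickness of the obstacle presence line for distance d."""
    return WIDTHS_MM[distance_level(d, d_min, d_max)]
```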
- FIGS. 12A, 12B, 13A and 13B illustrate obstacle images created by the obstacle image creating unit 14.
- in these figures, the lines corresponding to the outlines of the actual obstacles are shown by dotted lines for comparison, but these dotted lines are not drawn in the actual obstacle image.
- in these obstacle images, the shorter the distance from the vehicle, the thicker the line with which the obstacle presence line is drawn.
- whereas in the first embodiment the obstacle presence lines in the obstacle image are all drawn with the same thickness, here the obstacle presence lines Lsc at shorter distances from the vehicle are drawn as thicker lines.
- since the obstacle presence line Lsc indicating the position of an obstacle closer to the vehicle is drawn with emphasis, it is easier for the driver to grasp the positions of the obstacles requiring more attention.
- in the embodiments above, the obstacle detection device either detects the start time and end time of the section in which the amplitude of the received signal exceeds a certain threshold and takes the distance calculated from the time between the two as the representative distance between the obstacle and the own vehicle, or detects only the start time of that section and takes the distance calculated from the start time as the representative distance.
- however, the calculation of the representative distance is not limited to these examples; for example, a so-called undersampling method may be used.
- the undersampling method is a method of detecting the received signal on the assumption that the distance between the own vehicle and the obstacle does not change during the measurement period.
- in this method, the correlation between the received signal and a copy of the transmitted signal that is gradually shifted by a minute amount of time is determined, and the shift at which the correlation is obtained is found. Since the amount of the minute time shift is known, the obstacle detection device sets the distance calculated based on it as the representative distance between the obstacle and the own vehicle.
- in this case, the high-speed A/D converter required for calculating the representative distance described in the present embodiment is not needed, and the representative distance can be calculated with a simpler hardware configuration.
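The correlation idea behind the undersampling method can be sketched as follows. This is an illustrative sample-based sketch, not the hardware undersampling scheme itself; the function names and the discrete-shift representation are assumptions.

```python
def best_shift(tx_replica, rx):
    """Correlate the received samples with a copy of the transmitted
    signal shifted by successive small steps; return the shift (in
    steps) at which the correlation peaks."""
    best_k, best_c = 0, float("-inf")
    for k in range(len(rx) - len(tx_replica) + 1):
        c = sum(t * r for t, r in zip(tx_replica, rx[k:k + len(tx_replica)]))
        if c > best_c:
            best_k, best_c = k, c
    return best_k

def distance_from_shift(k, dt, c=299_792_458.0):
    """Since the per-step time shift dt is known, the representative
    distance follows from the peak shift k (round trip: divide by 2)."""
    return c * k * dt / 2
```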
- in the second embodiment, the position of the detected obstacle is represented by an arc whose line thickness changes according to the distance from the vehicle.
- in contrast, the obstacle detection device according to the present embodiment represents the position of the detected obstacle by a figure with an area, Asc (hereinafter referred to as an obstacle presence area), and is characterized by changing the brightness of the figure according to the distance from the vehicle. Because of this feature, the obstacle image creation unit 14 provided in the obstacle detection device performs processing different from that of the second embodiment.
- FIG. 14 is a flowchart showing the operation of the obstacle image creating unit 14.
- the flowchart shown in FIG. 14 corresponds to the flowchart according to the second embodiment (see FIG. 11) with the processing of step S403 replaced by the processing of steps S503 and S504. Therefore, among the steps shown in FIG. 14, the description of the steps identical to those in FIG. 11 will be omitted.
- the obstacle image creation unit 14 draws an obstacle presence area Asc based on each of the azimuth data and the distance data (step S503 ).
- the operation of the obstacle image creating unit 14 in step S503 will be specifically described with reference to FIG. 15.
- FIG. 15 shows one obstacle presence area Asc rendered in step S503. The obstacle presence area Asc illustrated in FIG. 15 is drawn based on the azimuth data θi and the distance data Di.
- it is an ellipse whose major-axis length coincides with the irradiation range of the beam emitted in the corresponding direction.
- the point H shown in FIG. 15 is the position of the antenna; the points I and J are the two end points of the major axis of the ellipse forming the obstacle presence area Asc, and the point K is the center point of the obstacle presence area Asc.
- the obstacle presence area Asc corresponds to the obstacle presence line Lsc in FIG. 7, and the points I and J coincide with the two ends of the obstacle presence line Lsc.
- the obstacle image creating unit 14 calculates the length of the major axis of the ellipse and the coordinates of the center point of the ellipse based on the azimuth data and the distance data.
- the major axis IJ of the ellipse is a line segment connecting two points whose distances from the antenna are both Di and whose azimuths differ by about 16°; its length is therefore equal to the length of the remaining side of an isosceles triangle whose two equal sides have length Di and form an interior angle of about 16°.
- the triangles ΔHIK and ΔHJK are right triangles whose hypotenuses have length Di and one of whose interior angles is approximately 8°.
- the lengths of the sides IK and JK are therefore Di sin 8°, and the length of the major axis IJ to be obtained is 2 Di sin 8°.
- the coordinates of the center point K of the ellipse can be obtained by the following method.
- since the triangle ΔHIK in FIG. 15 is a right triangle with hypotenuse HI of length Di and interior angle ∠IHK of approximately 8°, the length of the side HK is Di cos 8°.
- that is, the point K is located at azimuth θi as viewed from the antenna, at a distance of Di cos 8° from the antenna.
- the inclination of the obstacle presence area is the angle θi given by the azimuth data.
- the minor axis of the ellipse forming the obstacle presence area Asc has a predetermined constant length.
- the obstacle image creation unit 14 draws the ellipse based on the length of the major axis, the length of the minor axis, the coordinates of the center point and the inclination of the ellipse obtained as described above from the azimuth data and the distance data.
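The ellipse parameters derived above (major axis 2·Di·sin 8°, center K at distance Di·cos 8° along azimuth θi, inclination θi) can be computed as follows. The minor-axis value is an assumed placeholder, since the text only states that it is a predetermined constant; the other names are also illustrative.

```python
import math

HALF_SPREAD = math.radians(8.0)  # half of the ~16 deg beam spread
MINOR_AXIS = 0.2                 # assumed fixed minor-axis length [m]

def presence_area(theta_deg, dist):
    """Parameters of the obstacle presence area Asc for azimuth data
    theta_deg and distance data dist, with the antenna H at the origin."""
    th = math.radians(theta_deg)
    major = 2 * dist * math.sin(HALF_SPREAD)         # |IJ| = 2 D sin 8 deg
    hk = dist * math.cos(HALF_SPREAD)                # |HK| = D cos 8 deg
    center = (hk * math.sin(th), hk * math.cos(th))  # point K
    return {"major": major, "minor": MINOR_AXIS,
            "center": center, "tilt_deg": theta_deg}
```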
- the obstacle image creation unit 14 then paints the interior of each obstacle presence area Asc drawn in step S503 with a brightness corresponding to its distance level (step S504).
- FIGS. 16A, 16B, 17A and 17B illustrate obstacle images created by the obstacle image creation unit 14.
- in these figures, the lines corresponding to the actual outlines of the obstacles are indicated by dotted lines for comparison, but these dotted lines are not drawn in the actual obstacle image.
- in these obstacle images, the interior brightness of each obstacle presence area Asc is changed so that the obstacle presence areas closer to the vehicle are more emphasized. From these obstacle images it can be seen that the obstacle presence areas Asc closer to the vehicle are drawn with emphasis.
- as the vehicle moves and the distance to the obstacle becomes shorter, the irradiation range of the beam is more limited, so the major axis of the ellipse forming the obstacle presence area Asc becomes shorter and the position of the obstacle is detected more accurately.
- as described above, in the obstacle detection device, the obstacle presence area Asc indicating the position of an obstacle closer to the vehicle is drawn with emphasis, so it is easy for the driver to grasp the positions of the obstacles requiring attention.
- although the position of the detected obstacle is represented here by the elliptical obstacle presence area Asc, other shapes, such as a rhombus, may be used.
- in the third embodiment, the obstacle detection device indicated the positions of the detected obstacles by individual obstacle presence areas Asc. In contrast, the obstacle detection device according to the present embodiment adds to the obstacle image of the third embodiment a single figure that encloses all of the obstacle presence areas Asc.
- the obstacle image creation unit 14 is characterized in that the interior of this figure is painted so that the parts at shorter distances from the radar are more emphasized. Because of this feature, the obstacle image creation unit 14 performs processing different from that of the third embodiment.
- FIG. 18 is a flowchart showing the operation of the obstacle image creation unit 14.
- the flowchart shown in FIG. 18 corresponds to the flowchart according to the third embodiment (see FIG. 14) with the processing of step S504 replaced by the processing of steps S604 and S605. Therefore, among the steps shown in FIG. 18, the description of the steps identical to those in FIG. 14 will be omitted.
- the obstacle image creation unit 14 forms the region enclosed by all the obstacle presence areas and the line segments drawn in step S604 (hereinafter referred to as an obstacle presence zone) Zsc, and paints its interior with varying brightness so that the parts at shorter distances from the antenna are more emphasized (step S605).
- specifically, the interior of the obstacle presence zone Zsc is divided at equal intervals by (n − 1) concentric circles centered on the position of the antenna 113, and each part of the divided obstacle presence zone Zsc is painted with one of n levels of brightness according to its distance from the antenna 113.
- that is, the obstacle image creation unit 14 divides the interior of the obstacle presence zone Zsc by the (n − 1) concentric circles of radius Ri centered on the position of the antenna 113, and paints the parts while changing the brightness in order from the part closest to the antenna 113. The arcs of the concentric circles dividing the obstacle presence zone Zsc themselves need not be drawn.
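The brightness assignment by concentric circles can be sketched as follows (illustrative; the dividing radii and the number of levels n are parameters, and the function name is an assumption).

```python
def brightness_index(r, dividing_radii):
    """Index (0 = nearest/most emphasized ... n-1 = farthest) of the band
    of the obstacle presence zone Zsc containing a point at distance r
    from the antenna; dividing_radii holds the (n - 1) sorted radii of
    the concentric circles."""
    for i, radius in enumerate(dividing_radii):
        if r < radius:
            return i
    return len(dividing_radii)  # beyond the outermost circle
```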
- FIGS. 19A, 19B, 20A and 20B illustrate obstacle images created by the obstacle image creating unit 14.
- in these figures, the lines corresponding to the actual obstacles are shown by dotted lines for comparison, but these dotted lines are not drawn in the actual obstacle image.
- in these obstacle images, the interior of the obstacle presence zone is painted in a gradation of four brightness levels, with the parts closer to the vehicle painted darker.
- from these figures it can be seen that, within the obstacle presence zone Zsc, the area near the vehicle is painted darker and thus emphasized. It can also be seen that, as the distance from the radar to the obstacle becomes shorter, the obstacle presence zone Zsc becomes smaller and the shape of the obstacle is detected more accurately.
- as described above, the obstacle detection device paints the interior of the obstacle presence zone Zsc indicating the positions of obstacles around the vehicle darker as the distance from the vehicle becomes shorter, so the driver can easily find the positions of the obstacles to watch out for.
- the display may also show only one of the obstacle presence line Lsc indicating the position where an obstacle exists, the obstacle presence area Asc, and the obstacle presence zone Zsc, or only the polygonal line CL indicating the shape of the obstacle.
- the obstacle detection device according to the present embodiment is characterized first in that the exact shape of an obstacle to be detected is registered in advance, and the positional relationship between the obstacle to be detected and the own vehicle is estimated by comparing the shape of the previously registered obstacle with the positions of the detected obstacles.
- the obstacle detection device is further characterized in that the exact shape of the obstacle to be detected is displayed together with its positional relationship to the own vehicle. Because of these features, the configuration and operation of the obstacle detection device differ from those of the first embodiment in the following points.
- FIG. 21 is a block diagram showing the configuration of the obstacle detection device.
- the obstacle detection device differs from that of the first embodiment in that a shape data storage unit 17 and a shape data matching unit 18 are added. Since there is no other difference between the two obstacle detection devices, the same reference numerals are used for the components identical to those of the first embodiment, and their description is omitted.
- the shape data storage unit 17 stores previously input data (hereinafter, referred to as shape data) representing the shape of an obstacle such as a driver's home garage.
- the shape data matching section 18 reads out the shape data stored in the shape data storage section 17, matches the read shape data against the positions of the obstacles detected by the obstacle detection section 11, and calculates the position and angle of the shape data at which the shape of the registered obstacle best matches the shape of the detected obstacle in the obstacle image.
- the shape data is information that defines the shape and various attributes of the parking lot.
- the shape data includes a shape data number, a shape data name, a parking lot type, a shape point, and a shape vector.
- the shape data number and the shape data name are used to identify the registered shape data.
- the parking lot type classifies parking lots according to the shape of the parking lot, the traveling direction when the vehicle enters it, and combinations such as perpendicular/parallel parking.
- the shape points and the shape vectors represent the shape of the registered obstacle as point data and vectors, respectively.
- the shape data number, shape data name and parking lot type are input from the input unit 16 by, for example, the driver's operation, and the shape points and shape vectors are calculated based on the information input from the input unit 16 and stored by the shape data storage unit 17.
- FIG. 22 is a flowchart showing an example of the operation of the obstacle detection device at the time of registration of shape data.
- the control unit 13 receives the shape data number input from the input unit 16 by the driver's operation or the like (step S701).
- FIG. 23 exemplifies a shape data number setting screen.
- using the input unit 16, the driver inputs the number of the shape data to be registered into the shape data number input box BX1 and its name into the shape data name input box BX2.
- the control unit 13 then receives the parking lot type input from the input unit 16 operated by the driver (step S702). More specifically, when the driver selects the parking lot type to be registered from among the prepared parking lot types using the input unit 16, the control unit 13 receives the input parking lot type.
- the parking lot type is specified by the shape of the obstacles around the parking space, the traveling direction of the vehicle when it enters the parking space, and whether the parking space is on the right or left side of the vehicle at the start of the approach.
- FIG. 24A shows an example of a screen for selecting a parking type.
- for one shape of obstacle around the parking space, four parking lot types differing in the traveling direction of the vehicle as it enters the parking space are shown.
- the parking lot types are not limited to those shown in FIG. 24A; they are prepared so as to correspond to the various shapes of the obstacles SC around parking spaces, as shown in FIGS. 25A to 25H.
- the control unit 13 next receives the dimensions of the obstacle around the parking space input from the input unit 16 by the driver's operation (step S703). Specifically, the control unit 13 displays a screen for setting the dimensions of the parking lot corresponding to the parking lot type selected in step S702, and receives the dimensions of the obstacle input from the input unit 16 operated by the driver.
- FIG. 24B shows an example of the parking lot dimension setting screen.
- the control unit 13 then calculates the coordinates of points (hereinafter referred to as shape points) A placed at intervals of 10 cm along the shape of the obstacle (step S704).
- next, for each pair of consecutive shape points, starting from the right end as viewed from the vehicle, the control unit 13 calculates a vector (hereinafter referred to as a shape vector) a whose start point is one shape point and whose end point is the next shape point (step S705).
- FIG. 26 is a diagram showing the shape points A and the shape vectors a calculated in steps S704 and S705. On the left side of FIG. 26, a row of shape points A covering almost the entire obstacle is shown; on the right side of FIG. 26, the inside of the dashed-dotted frame is shown enlarged, and the shape points A and the shape vectors a are further labeled.
- each point shown in FIG. 26 represents a shape point A calculated based on the parking lot dimensions input as shown in FIG. 24B, and each arrow represents a shape vector a.
- the shape data storage section 17 stores the shape data number, shape data name and parking lot type entered as described above, together with the calculated shape points A and shape vectors a, as one set of shape data (step S706).
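Steps S704 and S705 can be sketched as follows (illustrative; the obstacle outline is assumed to be given as a polyline of corner coordinates derived from the entered dimensions, right end first, and the names are not from the patent).

```python
import math

def shape_points(corners, step=0.10):
    """Shape points A spaced `step` m (10 cm) apart along the registered
    obstacle outline, given as a polyline of corner coordinates."""
    pts = [corners[0]]
    for (x0, y0), (x1, y1) in zip(corners, corners[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        n = int(seg / step + 1e-9)  # whole 10 cm steps on this segment
        for k in range(1, n + 1):
            t = k * step / seg
            pts.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return pts

def shape_vectors(pts):
    """Shape vectors a: each starts at one shape point, ends at the next."""
    return [(x1 - x0, y1 - y0) for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
```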
- the above-described operation of the obstacle detection device when registering shape data is an example of a method of registering shape data, and the method of registering shape data is not limited to this.
- FIG. 27 is a flowchart showing the operation of the obstacle detection device when the position and shape of the obstacle with respect to the vehicle are displayed using the detected azimuth and distance data of the obstacle and the shape data stored in the shape data storage unit 17.
- the flowchart shown in FIG. 27 corresponds to the flowchart according to the first embodiment (see FIG. 3) with step S108 replaced by step S808. Therefore, the description of the processing identical to that of the first embodiment is omitted.
- in step S808, the control unit 13 changes the position and azimuth of the registered shape data so that the shape of the obstacle registered in advance best matches the shape of the detected obstacle, and creates an image superimposed on the obstacle image.
- FIG. 28 is a flowchart showing the subroutine process of step S808 shown in FIG. 27.
- First, the control unit 13 receives a signal from the input unit 16 generated by the driver's operation, and determines whether or not the registered shape data is to be superimposed on the obstacle image created in step S807 (step S901). If this determination is No, the processing returns to the main routine shown in FIG. 27. If the determination is Yes, the processing proceeds to step S902.
- In step S902, the shape data matching section 18 calculates, based on the azimuth data and distance data stored in the control section 13, the coordinates of each obstacle detection point P and the coordinates of interpolation points P' that interpolate between pairs of obstacle detection points P at predetermined intervals, and obtains obstacle vectors b representing the shape of the detected obstacle based on the calculated coordinates of the obstacle detection points P and the interpolation points P' (step S902).
- FIG. 29 shows the obstacle detection points P in the obstacle image.
- FIG. 30 shows, as in the obstacle image, several obstacle detection points P in black in the coordinate system with the vehicle antenna as the origin, and further shows the interpolation points P' in white.
- The interpolation points P' are one or more points that interpolate between two obstacle detection points P separated by more than a predetermined distance. Their coordinates are determined so that the distances between adjacent obstacle detection points P and interpolation points P' are equal, and so that this distance is the value closest to the interval between the shape points (10 cm in the present embodiment).
- Specifically, the control unit 13 specifies one natural number n, and determines the number of interpolation points P' to be created between the two target obstacle detection points P as the number obtained by subtracting 1 from the specified natural number n, that is, (n - 1).
- Next, the shape data matching section 18 calculates, based on the coordinates of the obstacle detection points P and the interpolation points P', vectors (hereinafter referred to as obstacle vectors b), each of which takes one of the consecutive obstacle detection points P and interpolation points P', in order from the right end as viewed from the vehicle, as its start point and the next point as its end point.
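The interpolation and vector construction described in the preceding steps can be sketched as follows. This is a minimal illustration in Python; the coordinates, the choice of n by rounding, and the function names are assumptions for illustration, not taken from the patent.

```python
import math

def interpolate_points(p1, p2, target_interval=0.10):
    """Create (n - 1) equally spaced interpolation points between two
    obstacle detection points, choosing n near dist/target so the
    spacing is close to the target interval (10 cm in the embodiment)."""
    n = max(1, round(math.dist(p1, p2) / target_interval))
    return [(p1[0] + (p2[0] - p1[0]) * i / n,
             p1[1] + (p2[1] - p1[1]) * i / n) for i in range(1, n)]

def chain_vectors(points):
    """Vectors from each point to the next, taken in order
    (the obstacle vectors b)."""
    return [(q[0] - p[0], q[1] - p[1]) for p, q in zip(points, points[1:])]

# Two detection points 32 cm apart yield two interpolation points,
# giving a spacing of about 10.7 cm, the value closest to 10 cm.
mids = interpolate_points((0.0, 0.0), (0.32, 0.0))
```

With the interpolation points merged into the ordered point list, `chain_vectors` produces the consecutive start-to-end vectors used for matching.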
- Next, the shape data matching unit 18 determines whether the shape data number to be matched with the detected obstacle has been input (step S903). If the determination in step S903 is Yes, the process proceeds to step S905. If the determination in step S903 is No, the process proceeds to step S904, and the shape data matching section 18 receives the shape data number to be matched from the input section 16 operated by the driver (step S904).
- In step S905, the control unit 13 reads the shape data identified by the shape data number received in step S904 from the shape data storage unit 17 and, while changing the direction of the shape vectors, detects the run of shape vectors and the direction that best match the obstacle vectors calculated in step S902 (step S905).
- FIG. 31A, FIG. 31B, and FIGS. 32A to 32D are schematic diagrams for explaining the operation of the shape data matching section 18 in step S905. In these figures, the intervals between the shape points A, the obstacle detection points P, and the interpolation points P' are shown wider than they actually are.
- FIG. 31A shows the shape points A and the shape vectors a read from the shape data storage unit 17. The points A1, A2, A3, A4, ... An shown in FIG. 31A are examples of the shape points A, and a1,0, a2,0, a3,0, ... an-1,0 are examples of the shape vectors a (the second subscript denotes the rotation angle, here 0°).
- FIG. 31B shows the obstacle detection points P, the interpolation points P', and the obstacle vectors b. The points B1, B2, B3, B4, ... Bm shown in FIG. 31B are the obstacle detection points P and interpolation points P', and b1, b2, b3, ... bm-1 are examples of the obstacle vectors b.
- The shape data matching section 18 extracts (m - 1) consecutive shape vectors a, equal in number to the obstacle vectors b, from all the shape vectors a. Then, the (m - 1) obstacle vectors b and the (m - 1) shape vectors a are paired in that order, and the sum of the inner products of the paired vectors b and a is calculated as the azimuth matching degree. For example, when the shape vectors a1,0, a2,0, a3,0, ... am-1,0 are paired with the obstacle vectors b1, b2, b3, ... bm-1, the azimuth matching degree is a1,0·b1 + a2,0·b2 + a3,0·b3 + ... + am-1,0·bm-1.
- The shape data matching section 18 calculates this azimuth matching degree for every series of (m - 1) consecutive shape vectors that can be extracted from all the shape vectors.
- Further, the shape data matching section 18 rotates the shape vectors a1,0, a2,0, a3,0, ... an-1,0 counterclockwise in steps of approximately 1°, and similarly calculates the azimuth matching degree for the rotated shape vectors a1,k, a2,k, a3,k, ... an-1,k (where k is the rotation angle).
- FIGS. 32A to 32D show the shape vectors a when rotated by approximately 1°, approximately 2°, approximately 3°, and approximately k°, respectively.
- The azimuth matching degree M calculated as described above is expressed by equation (7).
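The search described above, over every run of consecutive shape vectors and over rotation angles in roughly 1° steps, can be sketched as follows. This is an illustrative Python fragment; the angle range, the exhaustive loops, and the function names are assumptions for illustration, not the patent's implementation.

```python
import math

def rotate(v, deg):
    """Rotate a 2-D vector counterclockwise by deg degrees."""
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def best_azimuth_match(shape_vecs, obst_vecs, angles=range(0, 91)):
    """Return (start index i, rotation angle k) maximizing the azimuth
    matching degree M: the sum of inner products of the (m - 1) obstacle
    vectors paired, in order, with (m - 1) consecutive shape vectors."""
    m = len(obst_vecs)
    best_M, best_i, best_k = -math.inf, 0, 0
    for k in angles:
        rot = [rotate(a, k) for a in shape_vecs]   # shape vectors at angle k
        for i in range(len(rot) - m + 1):          # every consecutive run
            M = sum(ra[0] * b[0] + ra[1] * b[1]
                    for ra, b in zip(rot[i:i + m], obst_vecs))
            if M > best_M:
                best_M, best_i, best_k = M, i, k
    return best_i, best_k
```

Passing a narrower `angles` range corresponds to limiting the rotation angle k according to the parking lot type.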
- The value of the rotation angle k of the shape vectors a may be limited to a predetermined range corresponding to the parking lot type of the read shape data. For example, in a parking lot of the type shown in FIG. 24B, the direction of the parking space relative to the vehicle is approximately 0° when the vehicle starts entering the space, and by the time parking is completed it has rotated 90° counterclockwise as viewed from above the vehicle.
- In this case, the value of the rotation angle k of the shape vectors ideally changes within a range of 0° to 90°.
- In practice, it is preferable to limit the possible range of the rotation angle k to a range extending slightly beyond 0° to 90°, in consideration of driving operations such as vehicle swinging and turning back.
- Then, the shape data matching section 18 identifies the (m - 1) consecutive shape vectors ai,k, ai+1,k, ... ai+m-2,k whose azimuth matching degree M is the maximum of all the azimuth matching degrees M calculated as described above, and detects the shape point Ai corresponding to the start point of these shape vectors and the angle k.
- Next, the shape data matching section 18 extracts m shape points in order, starting from the shape point Ai thus detected, and rotates them about the point Ai by the angle k. The resulting points (hereinafter referred to as the highest matching shape points Q) are the shape points A that most closely match the obstacle detection points P. The points C1, C2, C3, C4, ... Cm shown in FIG. 33 are the highest matching shape points Q detected in this manner.
- Next, the shape data matching section 18 detects the coordinates on the obstacle image at which the highest matching shape points Q detected in step S905 best match the obstacle detection points P (step S906).
- The operation of the shape data matching section 18 in step S906 will now be described more specifically.
- First, the shape data matching section 18 shifts the coordinates of all the highest matching shape points Q so that the coordinates of the obstacle detection point B1 and the coordinates of the point C1 match. The obstacle detection points P and interpolation points P' are then paired in order with the highest matching shape points Q, and the reciprocal of the sum of the distances between the two points of each pair (hereinafter referred to as the position matching degree R) is calculated.
- Further, the shape data matching section 18 calculates the position matching degree R while shifting the coordinates of all the highest matching shape points Q in steps of 10 cm in the x-axis direction and in the y-axis direction.
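The translation search can be sketched in the same spirit. This is an illustrative Python fragment; the 10 cm step, the search radius, and the function names are assumptions for illustration.

```python
import math

def position_matching_degree(match_pts, obst_pts):
    """Position matching degree R: reciprocal of the sum of distances
    between paired points (larger means a better fit)."""
    total = sum(math.dist(q, p) for q, p in zip(match_pts, obst_pts))
    return math.inf if total == 0.0 else 1.0 / total

def best_shift(match_pts, obst_pts, step=0.10, radius=5):
    """Grid-search x/y shifts of the highest matching shape points in
    10 cm increments and return the shift maximizing R."""
    best_r, best = -math.inf, (0.0, 0.0)
    for ix in range(-radius, radius + 1):
        for iy in range(-radius, radius + 1):
            dx, dy = ix * step, iy * step
            shifted = [(x + dx, y + dy) for x, y in match_pts]
            r = position_matching_degree(shifted, obst_pts)
            if r > best_r:
                best_r, best = r, (dx, dy)
    return best
```

The reciprocal-of-distance-sum criterion makes a perfect overlap score infinitely high, so the grid search naturally snaps onto an exact alignment when one exists within the searched range.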
- Then, the shape data matching section 18 detects, among all the position matching degrees R calculated in this way, the coordinates (xQ, yQ) that give the maximum position matching degree R (step S906).
- Next, the obstacle image creating unit 14 rotates all the read shape points about the point Ai by the angle k°, calculates the coordinates of each shape point so that the coordinates of the point Ai move to the coordinates (xQ, yQ) calculated in step S906, and draws line segments connecting the calculated shape point coordinates in order so as to be superimposed on the obstacle image (step S907).
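The drawing transform of step S907 (rotate the read shape points about Ai by k°, then move Ai onto the coordinates (xQ, yQ)) amounts to a rotate-then-translate of each point. A minimal Python sketch, with the function name and arguments assumed for illustration:

```python
import math

def place_shape(points, pivot, angle_deg, target):
    """Rotate points counterclockwise about pivot by angle_deg,
    then translate so the pivot lands on target."""
    r = math.radians(angle_deg)
    c, s = math.cos(r), math.sin(r)
    px, py = pivot
    tx, ty = target
    return [(tx + c * (x - px) - s * (y - py),
             ty + s * (x - px) + c * (y - py)) for x, y in points]
```

Line segments drawn between consecutive placed points then reproduce the registered outline on the obstacle image.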
- FIG. 34 shows the obstacle image drawn as described above. As can be seen from FIG. 34, in the obstacle detection device according to the present embodiment, the shape of the obstacle registered in advance is displayed so as to match the position of the detected obstacle, and the shape of the entire obstacle, including parts outside the detection range, is displayed in an easily understandable manner.
- As described above, the obstacle detection device according to the present embodiment memorizes the shape of an obstacle in advance and, when an obstacle is detected, superimposes and displays the registered shape of the obstacle at the position that best matches the detected obstacle. As a result, the exact position and shape of the obstacle with respect to the own vehicle can be displayed, and even the shape of portions of the obstacle that are outside the detection range can be displayed, so that the driver can more easily grasp the shape of obstacles around the vehicle.
- In the present embodiment, the image representing the shape of the obstacle to be detected is displayed so as to be superimposed on the obstacle image of the first embodiment, but it may instead be displayed so as to be superimposed on the obstacle image of any of the second to fourth embodiments.
- In the present embodiment, both the shape points and the shape vectors are stored in the shape data storage unit 17. Instead, only the shape points may be stored in the shape data storage unit 17, and the shape vectors may be calculated from the shape points read from the shape data storage unit 17 in the process of step S904 in FIG. 28.
- In the present embodiment, the interval between the shape points representing the shape of the registered obstacle is 10 cm, but the interval between the shape points may be longer or shorter than 10 cm. As the interval between the shape points becomes shorter, the calculation load in the shape data matching section 18 increases, but the matching process can be performed more accurately. The interval between the interpolation points that interpolate between the obstacle detection points should be determined, in correspondence with the interval between the shape points, so as to be the value closest to that interval.
- Further, the obstacle detection device according to the present embodiment can display the position of obstacles in an easy-to-understand manner even when a radio wave radar that emits a relatively wide beam is used, but it is preferable to use a radar that emits a narrow beam, such as a laser radar.
- The obstacle images in the first to fifth embodiments of the present invention may all be realized by one obstacle detection device, and any of the obstacle images in the first to fifth embodiments may be displayed based on signals from the input unit 16 operated by the driver.
- The obstacle detection device according to the present invention is useful for applications such as a driving support device, which require the technical effect that a driver can easily and intuitively understand the positional relationship between the vehicle and an obstacle.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Mechanical Engineering (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Chemical & Material Sciences (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Radar Systems Or Details Thereof (AREA)
- Traffic Control Systems (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04718728A EP1605277A1 (en) | 2003-03-20 | 2004-03-09 | Obstacle detection device |
JP2005503654A JPWO2004083889A1 (ja) | 2003-03-20 | 2004-03-09 | 障害物検知装置 |
US10/524,208 US7230524B2 (en) | 2003-03-20 | 2004-03-09 | Obstacle detection device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-078925 | 2003-03-20 | ||
JP2003078925 | 2003-03-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004083889A1 true WO2004083889A1 (ja) | 2004-09-30 |
Family
ID=33027990
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/003026 WO2004083889A1 (ja) | 2003-03-20 | 2004-03-09 | 障害物検知装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US7230524B2 (ja) |
EP (1) | EP1605277A1 (ja) |
JP (1) | JPWO2004083889A1 (ja) |
KR (1) | KR20050107388A (ja) |
CN (1) | CN1701242A (ja) |
WO (1) | WO2004083889A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005009992A (ja) * | 2003-06-18 | 2005-01-13 | Denso Corp | 車両用周辺監視装置 |
EP1679526A1 (en) * | 2005-01-07 | 2006-07-12 | Toyota Jidosha Kabushiki Kaisha | Parking support device for motor vehicles using neighboring object information acquisition device |
JP2007128320A (ja) * | 2005-11-04 | 2007-05-24 | Denso Corp | 駐車支援システム |
JP2013212808A (ja) * | 2012-04-04 | 2013-10-17 | Nippon Soken Inc | 駐車空間検知装置 |
JP2013220745A (ja) * | 2012-04-17 | 2013-10-28 | Nippon Soken Inc | 駐車空間検知装置 |
JP2016075501A (ja) * | 2014-10-03 | 2016-05-12 | 三菱電機株式会社 | 物体検出装置、駐車支援装置および物体検出方法 |
CN110867132A (zh) * | 2019-10-15 | 2020-03-06 | 百度在线网络技术(北京)有限公司 | 环境感知的方法、装置、电子设备和计算机可读存储介质 |
US11064151B2 (en) * | 2016-04-26 | 2021-07-13 | Denso Corporation | Display control apparatus |
CN113475976A (zh) * | 2020-03-16 | 2021-10-08 | 珠海格力电器股份有限公司 | 机器人可通行区域确定方法、装置、存储介质及机器人 |
Families Citing this family (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4375064B2 (ja) * | 2004-03-10 | 2009-12-02 | 株式会社デンソー | レーダ装置 |
WO2005107261A1 (ja) * | 2004-04-27 | 2005-11-10 | Matsushita Electric Industrial Co., Ltd. | 車両周囲表示装置 |
JP4179285B2 (ja) * | 2005-01-12 | 2008-11-12 | トヨタ自動車株式会社 | 駐車支援装置 |
JP2006234494A (ja) * | 2005-02-23 | 2006-09-07 | Aisin Seiki Co Ltd | 物体認識装置 |
KR100954621B1 (ko) | 2005-02-23 | 2010-04-27 | 파나소닉 전공 주식회사 | 자동운전차량 및 평면 장애물인식방법 |
JP4600760B2 (ja) * | 2005-06-27 | 2010-12-15 | アイシン精機株式会社 | 障害物検出装置 |
JP2007148835A (ja) * | 2005-11-28 | 2007-06-14 | Fujitsu Ten Ltd | 物体判別装置、報知制御装置、物体判別方法および物体判別プログラム |
GB2432986B (en) * | 2005-12-03 | 2008-07-09 | Shih-Hsiung Li | Image display device for car monitoring system |
US8942483B2 (en) | 2009-09-14 | 2015-01-27 | Trimble Navigation Limited | Image-based georeferencing |
DE102006007644B4 (de) * | 2006-02-18 | 2008-01-31 | Heinz Wipf | Verfahren und System zur Eindringverhinderung eines beweglichen Objekts in einen Abschnitt eines Verkehrsweges |
DE102006008981A1 (de) * | 2006-02-23 | 2007-08-30 | Siemens Ag | Assistenzsystem zur Unterstützung eines Fahrers |
ITMO20060071A1 (it) * | 2006-02-27 | 2007-08-28 | Meta System Spa | Sistema e metodo di rilevamento ad ultrasuoni, particolarmente per applicazioni su autoveicoli o simili |
US7653487B2 (en) * | 2006-10-06 | 2010-01-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Object detection apparatus and method |
DE102006052779A1 (de) * | 2006-11-09 | 2008-05-15 | Bayerische Motoren Werke Ag | Verfahren zur Erzeugung eines Gesamtbilds der Umgebung eines Kraftfahrzeugs |
KR101104609B1 (ko) | 2007-10-26 | 2012-01-12 | 주식회사 만도 | 차량의 목표주차위치 인식 방법 및 그 시스템 |
CN201145741Y (zh) * | 2007-11-28 | 2008-11-05 | 严伟文 | 2.4g无线屏幕显示语言提示倒车指示器 |
JP2010078323A (ja) * | 2008-09-23 | 2010-04-08 | Denso Corp | 障害物検知装置 |
US8305444B2 (en) * | 2008-11-14 | 2012-11-06 | Toyota Motor Engineering & Manufacturing North America, Inc. | Integrated visual display system |
JP4788798B2 (ja) * | 2009-04-23 | 2011-10-05 | トヨタ自動車株式会社 | 物体検出装置 |
EP2272730A1 (en) * | 2009-07-07 | 2011-01-12 | HI-KEY Limited | Method and apparatus for displaying distance data in a parking distance control system |
US8279107B2 (en) * | 2009-08-06 | 2012-10-02 | Innovapark Llc | Radar vehicle detection system |
DE102009029439A1 (de) * | 2009-09-14 | 2011-03-24 | Robert Bosch Gmbh | Verfahren und Vorrichtung zur Darstellung von Hindernissen in einem Einparkhilfesystem von Kraftfahrzeugen |
US8897541B2 (en) | 2009-09-14 | 2014-11-25 | Trimble Navigation Limited | Accurate digitization of a georeferenced image |
DE102009045286A1 (de) * | 2009-10-02 | 2011-04-21 | Robert Bosch Gmbh | Verfahren zur Abbildung des Umfelds eines Fahrzeugs |
US9047779B2 (en) * | 2010-05-19 | 2015-06-02 | Mitsubishi Electric Corporation | Vehicle rear view monitoring device |
US8855937B2 (en) | 2010-10-25 | 2014-10-07 | Trimble Navigation Limited | Crop characteristic estimation |
US9846848B2 (en) | 2010-10-25 | 2017-12-19 | Trimble Inc. | Exchanging water allocation credits |
US9213905B2 (en) * | 2010-10-25 | 2015-12-15 | Trimble Navigation Limited | Automatic obstacle location mapping |
US9408342B2 (en) | 2010-10-25 | 2016-08-09 | Trimble Navigation Limited | Crop treatment compatibility |
US10115158B2 (en) | 2010-10-25 | 2018-10-30 | Trimble Inc. | Generating a crop recommendation |
US9058633B2 (en) | 2010-10-25 | 2015-06-16 | Trimble Navigation Limited | Wide-area agricultural monitoring and prediction |
JP5353999B2 (ja) * | 2011-04-01 | 2013-11-27 | 株式会社デンソー | 運転者支援装置 |
KR20120123899A (ko) * | 2011-05-02 | 2012-11-12 | 현대모비스 주식회사 | 차량의 주변 상황 인지 장치 및 방법 |
KR20130019639A (ko) * | 2011-08-17 | 2013-02-27 | 엘지이노텍 주식회사 | 차량용 카메라 장치 |
DE102011113916A1 (de) * | 2011-09-21 | 2013-03-21 | Volkswagen Aktiengesellschaft | Verfahren zur Klassifikation von Parkszenarien für ein Einparksystem eines Kraftfahrzeugs |
KR101947826B1 (ko) * | 2012-04-10 | 2019-02-13 | 현대자동차주식회사 | 차량의 주차구획 인식방법 |
EP2927075B1 (en) * | 2012-11-27 | 2017-04-05 | Nissan Motor Co., Ltd. | Vehicle acceleration suppression device and vehicle acceleration suppression method |
US9246528B2 (en) * | 2013-01-11 | 2016-01-26 | Empire Technology Development Llc | Distributed antenna for wireless communication at high speed |
US8825260B1 (en) * | 2013-07-23 | 2014-09-02 | Google Inc. | Object and ground segmentation from a sparse one-dimensional range data |
JP6471528B2 (ja) * | 2014-02-24 | 2019-02-20 | 株式会社リコー | 物体認識装置、物体認識方法 |
US10663558B2 (en) * | 2015-05-22 | 2020-05-26 | Schneider Electric It Corporation | Systems and methods for detecting physical asset locations |
CN104933895B (zh) * | 2015-06-11 | 2018-02-02 | 百度在线网络技术(北京)有限公司 | 交通工具的提醒方法及装置 |
JP6252559B2 (ja) * | 2015-07-27 | 2017-12-27 | トヨタ自動車株式会社 | 移動体検出装置及び運転支援装置 |
JP6310899B2 (ja) | 2015-11-25 | 2018-04-11 | 株式会社Subaru | 車外環境認識装置 |
JP6200481B2 (ja) * | 2015-11-25 | 2017-09-20 | 株式会社Subaru | 車外環境認識装置 |
US10271220B2 (en) * | 2016-02-12 | 2019-04-23 | Microsoft Technology Licensing, Llc | Wireless communication using a central controller |
US10139490B2 (en) * | 2016-03-17 | 2018-11-27 | GM Global Technology Operations LLC | Fault tolerant power liftgate obstruction detection system |
US10304335B2 (en) | 2016-04-12 | 2019-05-28 | Ford Global Technologies, Llc | Detecting available parking spaces |
TWI579578B (zh) * | 2016-05-30 | 2017-04-21 | 均利科技股份有限公司 | 停車位狀態感測系統與方法 |
CN107316492B (zh) * | 2017-07-25 | 2020-10-23 | 纵目科技(上海)股份有限公司 | 在图像中定位停车位的方法及系统 |
EP3467789A1 (en) * | 2017-10-06 | 2019-04-10 | Thomson Licensing | A method and apparatus for reconstructing a point cloud representing a 3d object |
US11535257B2 (en) * | 2020-05-22 | 2022-12-27 | Robert Bosch Gmbh | Lane localization system and method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4618948A (en) * | 1982-11-17 | 1986-10-21 | Nippon Soken, Inc. | Apparatus for detecting obstructions behind vehicle |
EP0782007A2 (en) * | 1995-12-27 | 1997-07-02 | Denso Corporation | Method and apparatus for measuring distance |
JPH1138137A (ja) * | 1997-07-23 | 1999-02-12 | Denso Corp | 距離測定装置 |
JP2000187075A (ja) * | 1998-12-22 | 2000-07-04 | Matsushita Electric Works Ltd | 車両用障害物表示装置 |
JP2001334897A (ja) * | 2000-05-25 | 2001-12-04 | Daihatsu Motor Co Ltd | 駐車支援装置及びその制御方法 |
US20020044048A1 (en) * | 2000-10-12 | 2002-04-18 | Nissan Motor Co., Tld. | Method and apparatus for detecting position of object present in a surrounding detection zone of automotive vehicle |
JP2002120677A (ja) * | 2000-10-12 | 2002-04-23 | Daihatsu Motor Co Ltd | 駐車支援装置及びその制御方法 |
JP2002170103A (ja) * | 2000-12-01 | 2002-06-14 | Nissan Motor Co Ltd | 駐車スペース地図作成装置および駐車スペース地図表示装置 |
JP2003312414A (ja) * | 2002-04-25 | 2003-11-06 | Equos Research Co Ltd | 駐車支援装置 |
JP2004025942A (ja) * | 2002-06-21 | 2004-01-29 | Equos Research Co Ltd | 駐車操作支援装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3431678B2 (ja) | 1994-02-14 | 2003-07-28 | 三菱自動車工業株式会社 | 車両用周囲状況表示装置 |
JP3448946B2 (ja) | 1994-03-11 | 2003-09-22 | 日産自動車株式会社 | 車両周囲モニタ装置 |
JPH0848199A (ja) | 1994-08-09 | 1996-02-20 | Hitachi Ltd | 障害物警報システム |
DE19650808A1 (de) * | 1996-12-06 | 1998-06-10 | Bosch Gmbh Robert | Einparkvorrichtung für ein Kraftfahrzeug |
JP3183501B2 (ja) * | 1997-07-07 | 2001-07-09 | 本田技研工業株式会社 | 車両用走行制御装置 |
JP2000207697A (ja) | 1999-01-19 | 2000-07-28 | Toyota Motor Corp | 車両周辺監視装置 |
JP2000275339A (ja) | 1999-03-26 | 2000-10-06 | Honda Motor Co Ltd | レーダ装置におけるデータ処理方法 |
JP2002029349A (ja) | 2000-07-13 | 2002-01-29 | Nissan Motor Co Ltd | 車両周囲認識装置 |
JP2002055718A (ja) | 2000-08-11 | 2002-02-20 | Meidensha Corp | 無人車位置検出方式 |
JP3608527B2 (ja) * | 2001-05-15 | 2005-01-12 | 株式会社豊田中央研究所 | 周辺状況表示装置 |
US20030102965A1 (en) * | 2001-12-03 | 2003-06-05 | Apollo Ltd. | Vehicle mountable device for detecting the reflecting characteristics of a surface |
-
2004
- 2004-03-09 WO PCT/JP2004/003026 patent/WO2004083889A1/ja not_active Application Discontinuation
- 2004-03-09 US US10/524,208 patent/US7230524B2/en not_active Expired - Fee Related
- 2004-03-09 KR KR1020057003361A patent/KR20050107388A/ko not_active Application Discontinuation
- 2004-03-09 JP JP2005503654A patent/JPWO2004083889A1/ja not_active Withdrawn
- 2004-03-09 EP EP04718728A patent/EP1605277A1/en not_active Withdrawn
- 2004-03-09 CN CNA2004800007363A patent/CN1701242A/zh active Pending
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4618948A (en) * | 1982-11-17 | 1986-10-21 | Nippon Soken, Inc. | Apparatus for detecting obstructions behind vehicle |
EP0782007A2 (en) * | 1995-12-27 | 1997-07-02 | Denso Corporation | Method and apparatus for measuring distance |
JPH1138137A (ja) * | 1997-07-23 | 1999-02-12 | Denso Corp | 距離測定装置 |
JP2000187075A (ja) * | 1998-12-22 | 2000-07-04 | Matsushita Electric Works Ltd | 車両用障害物表示装置 |
JP2001334897A (ja) * | 2000-05-25 | 2001-12-04 | Daihatsu Motor Co Ltd | 駐車支援装置及びその制御方法 |
US20020044048A1 (en) * | 2000-10-12 | 2002-04-18 | Nissan Motor Co., Tld. | Method and apparatus for detecting position of object present in a surrounding detection zone of automotive vehicle |
JP2002120677A (ja) * | 2000-10-12 | 2002-04-23 | Daihatsu Motor Co Ltd | 駐車支援装置及びその制御方法 |
JP2002170103A (ja) * | 2000-12-01 | 2002-06-14 | Nissan Motor Co Ltd | 駐車スペース地図作成装置および駐車スペース地図表示装置 |
JP2003312414A (ja) * | 2002-04-25 | 2003-11-06 | Equos Research Co Ltd | 駐車支援装置 |
JP2004025942A (ja) * | 2002-06-21 | 2004-01-29 | Equos Research Co Ltd | 駐車操作支援装置 |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005009992A (ja) * | 2003-06-18 | 2005-01-13 | Denso Corp | 車両用周辺監視装置 |
EP1679526A1 (en) * | 2005-01-07 | 2006-07-12 | Toyota Jidosha Kabushiki Kaisha | Parking support device for motor vehicles using neighboring object information acquisition device |
JP2007128320A (ja) * | 2005-11-04 | 2007-05-24 | Denso Corp | 駐車支援システム |
JP4682809B2 (ja) * | 2005-11-04 | 2011-05-11 | 株式会社デンソー | 駐車支援システム |
JP2013212808A (ja) * | 2012-04-04 | 2013-10-17 | Nippon Soken Inc | 駐車空間検知装置 |
JP2013220745A (ja) * | 2012-04-17 | 2013-10-28 | Nippon Soken Inc | 駐車空間検知装置 |
JP2016075501A (ja) * | 2014-10-03 | 2016-05-12 | 三菱電機株式会社 | 物体検出装置、駐車支援装置および物体検出方法 |
US11064151B2 (en) * | 2016-04-26 | 2021-07-13 | Denso Corporation | Display control apparatus |
US11750768B2 (en) | 2016-04-26 | 2023-09-05 | Denso Corporation | Display control apparatus |
CN110867132A (zh) * | 2019-10-15 | 2020-03-06 | 百度在线网络技术(北京)有限公司 | 环境感知的方法、装置、电子设备和计算机可读存储介质 |
CN113475976A (zh) * | 2020-03-16 | 2021-10-08 | 珠海格力电器股份有限公司 | 机器人可通行区域确定方法、装置、存储介质及机器人 |
Also Published As
Publication number | Publication date |
---|---|
KR20050107388A (ko) | 2005-11-11 |
US7230524B2 (en) | 2007-06-12 |
EP1605277A1 (en) | 2005-12-14 |
CN1701242A (zh) | 2005-11-23 |
US20050225439A1 (en) | 2005-10-13 |
JPWO2004083889A1 (ja) | 2006-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004083889A1 (ja) | 障害物検知装置 | |
US11027653B2 (en) | Apparatus, system and method for preventing collision | |
JP4850898B2 (ja) | レーダ装置 | |
JP6575776B2 (ja) | 道路情報検知装置及び道路情報検知方法 | |
JPH08301029A (ja) | 小型乗物用の多数のアンテナを含む後方及び側方障害物検出システム | |
EP3686631A1 (en) | Range finding method, range finding apparatus, and range finding system | |
JP3942722B2 (ja) | 車載レーダ装置 | |
US20210055734A1 (en) | Methods Circuits Devices Assemblies Systems and Related Machine Executable Code for Providing and Operating an Active Sensor on a Host Vehicle | |
CN110907934B (zh) | 用于提高的信噪比的圆偏振汽车雷达 | |
US7084806B2 (en) | Intruding object detecting apparatus, and setting apparatus, setting process and setting confirmation process therefor | |
JP4816009B2 (ja) | 接近報知装置 | |
CN110333727A (zh) | 机器人路径规划方法、装置、设备及介质 | |
JP2009133761A (ja) | レーダ装置 | |
CN107076844A (zh) | 模块化平面多扇区90度视场雷达天线结构 | |
JP2010127835A (ja) | レーダ装置 | |
JP3991793B2 (ja) | レーダ | |
JP2007163317A (ja) | レーダー装置 | |
JP2021521432A (ja) | メタマテリアルアンテナのサイドローブ特徴を内蔵する物体検出用の方法及び装置 | |
JP2010181257A (ja) | 障害物検出装置 | |
US8451689B2 (en) | Ultrasonic apparatus with an adjustable horn | |
JPH1027299A (ja) | 車載用レーダ装置 | |
JPH08327731A (ja) | レーダによる方位検出方法及び方位検出レーダ装置及び自動車用衝突防止装置 | |
JP4644590B2 (ja) | 周辺車両位置検出装置および周辺車両位置検出方法 | |
JPH1164500A (ja) | レーダ装置 | |
JP2004286537A (ja) | 車載用レーダ装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004718728 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10524208 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005503654 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020057003361 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20048007363 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057003361 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2004718728 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2004718728 Country of ref document: EP |