WO2005111937A1 - Vehicle traveling lane marking recognition device - Google Patents
Vehicle traveling lane marking recognition device
- Publication number
- WO2005111937A1 (PCT/JP2005/008937; JP2005008937W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- lane marking
- photographed
- time
- Prior art date
Links
- 230000002194 synthesizing effect Effects 0.000 claims abstract description 16
- 238000000034 method Methods 0.000 claims description 46
- 230000008569 process Effects 0.000 claims description 42
- 230000008859 change Effects 0.000 claims description 30
- 238000003708 edge detection Methods 0.000 claims description 23
- 238000010586 diagram Methods 0.000 description 31
- 238000003384 imaging method Methods 0.000 description 18
- 230000015572 biosynthetic process Effects 0.000 description 16
- 238000003786 synthesis reaction Methods 0.000 description 16
- 238000001514 detection method Methods 0.000 description 7
- 239000010426 asphalt Substances 0.000 description 6
- 239000000203 mixture Substances 0.000 description 6
- 230000003321 amplification Effects 0.000 description 5
- 239000002131 composite material Substances 0.000 description 5
- 238000010276 construction Methods 0.000 description 5
- 230000002265 prevention Effects 0.000 description 4
- 230000004069 differentiation Effects 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 230000005484 gravity Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 1
- 239000004567 concrete Substances 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 239000003973 paint Substances 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2510/00—Input parameters relating to a particular sub-units
- B60W2510/20—Steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/14—Yaw
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/28—Wheel speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/18—Steering angle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/20—Steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a traveling lane marking recognition device for a vehicle.
- Various road markings such as a white line, a yellow line, a cat's eye and the like are displayed or installed on a road surface (traveling road) on which a vehicle travels.
- Conventionally, edges are detected by performing differentiation and binarization processing on an image captured by a photographing device such as a CCD camera and an image processing ECU, and a Hough transform is applied to the detected edge point sequence (white line candidate point sequence) to recognize an approximating straight-line component.
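The differentiation-and-binarization step described above can be sketched on a single image row. This is a generic illustration, not the patent's implementation; the brightness values and the threshold of 100 are assumptions.

```python
def detect_edges(row, threshold=100):
    """Return indices where the absolute brightness gradient of one
    image row exceeds the threshold: a simplified stand-in for the
    differentiation and binarization steps described in the text.
    The threshold of 100 is an assumed value, not from the patent."""
    edges = []
    for i in range(1, len(row)):
        if abs(row[i] - row[i - 1]) >= threshold:
            edges.append(i)
    return edges

# Dark asphalt (20), a bright white-line stripe (200), dark asphalt again.
row = [20, 20, 20, 200, 200, 200, 20, 20]
print(detect_edges(row))  # -> [3, 6], the rising and falling edges of the stripe
```

The two returned indices mark the left and right boundaries of the white-line stripe, which is exactly the edge point pair a real edge image would contain for one scan line.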
- Patent Document 1 Japanese Patent Publication No. 6-42261
- Patent Document 2 Japanese Patent Application Laid-Open No. 2001-236506
- In the prior art, a white line candidate point sequence extracted from an image captured a predetermined time earlier is superposed on the white line candidate point sequence extracted from the current image to extend the point sequence; a straight line approximating the extended sequence is then found and recognized as the white line, compensating for missing or unclear portions. However, because this alignment is performed on the edge image after edge detection, it is not always satisfactory in terms of recognition accuracy of the lane marking (white line).
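The straight-line approximation of an extended candidate point sequence can be illustrated with an ordinary least-squares fit. This is a generic sketch, not the patent's algorithm; it assumes the lane marking is not vertical in image coordinates.

```python
def fit_line(points):
    """Least-squares straight line through a white-line candidate
    point sequence of (x, y) tuples. Returns (slope, intercept) of
    y = slope * x + intercept. A simple substitute for the
    straight-line approximation described in the text; it fails for
    a perfectly vertical point sequence (zero denominator)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Candidate points lying on y = 2x + 1.
slope, intercept = fit_line([(0, 1), (1, 3), (2, 5)])
```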
- Therefore, an object of the present invention is to overcome the above disadvantage and to provide a vehicular lane marking recognition device that, when combining captured images, can accurately recognize even the point sequence corresponding to a distant lane marking in the combined image.
- To achieve this, the present invention provides, as described in claim 1, a photographing means for photographing an area including a road surface in the traveling direction of a vehicle; a lane marking recognizing means capable of recognizing at least a dashed lane marking on the road surface in an image photographed by the photographing means; and an image synthesizing means for combining a plurality of photographed images photographed at different times by the photographing means so as to extend the apparent length of the lane marking, wherein the image synthesizing means combines the photographed images at a processing step, within the lane marking recognition processing performed by the lane marking recognizing means, that does not change the shape of the lane marking in at least one of the captured images.
- The present invention also provides, as described in claim 2, a photographing means for photographing an area including a road surface in the traveling direction of a vehicle; a lane marking recognizing means capable of recognizing at least a dashed lane marking on the road surface by detecting edges in an image photographed by the photographing means and performing a Hough transform on the detected edges; and an image synthesizing means for combining a plurality of images photographed at different times by the photographing means, wherein the captured images are combined before the edge detection in the lane marking recognition processing.
- Since the captured images are combined at a processing step in which at least the shape of the lane markings in the captured image is not changed, distant lane markings can also be accurately detected; the lane markings can thus be apparently extended, and recognition accuracy can be improved.
- The lane markings may be short because they are blurred or chipped, or because of construction. Even in such cases, the lane markings can be accurately recognized. Moreover, by extending the length of the lane marking, its direction is recognized more accurately than when a dashed lane marking normally drawn on a road surface is recognized as-is. As a result, the lane marking can be recognized with higher accuracy.
- the "vehicle traveling direction" described in the above-mentioned claim is provided only with a front part of the forward traveling, and is provided with a photographing means for photographing an area including a rear road surface at the time of forward traveling. It is also possible to recognize the traveling lane markings behind the captured image. As described above, it is needless to say that the “vehicle traveling direction” indicates the longitudinal direction of the vehicle.
- FIG. 1 is a schematic diagram showing an overall configuration of a vehicle lane marking recognition apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing the operation of the control ECU shown in FIG. 1, together with its inputs and outputs.
- FIG. 3 is a flowchart showing the operation of the apparatus shown in FIGS. 1 and 2.
- FIG. 4 is an explanatory diagram of an edge image obtained by the edge detection processing in FIG. 3.
- FIG. 5 is an explanatory diagram of a guideline, which is a straight-line component corresponding to a lane marking such as a white line obtained by the Hough transform process in FIG. 3;
- FIG. 6 is an explanatory diagram of lane candidates (white line candidate point sequence) obtained in the lane candidate detection processing of FIG. 3.
- FIG. 7 is an explanatory diagram showing an estimated position of a lane in which the own vehicle travels, which is obtained by the lane position estimation process of FIG. 3.
- FIG. 8 is a subroutine flowchart showing the frame synthesizing process of FIG. 3.
- FIG. 9 is an explanatory diagram showing a captured image input at a current time (current processing cycle) t which is a reference in the past frame selection processing of FIG. 8.
- FIG. 11 is an explanatory diagram showing an image corrected by the vertical line correction process in FIG. 8.
- FIG. 12 is an explanatory diagram showing a combined image obtained by the combining process of FIG. 8.
- FIG. 13 is an explanatory diagram showing a predetermined area (road surface) of a shooting area to be detected in the brightness detection processing of the past frame in FIG. 8.
- FIG. 14 is an explanatory diagram showing brightness detection processing of a past frame in FIG. 8.
- FIG. 15 is a subroutine flowchart of the brightness correction process of FIG. 8;
- FIG. 16 is an explanatory graph showing characteristics of a shutter speed and an aperture with respect to road surface brightness used in the processing of FIG. 15.
- FIG. 17 is likewise an explanatory graph showing characteristics of amplifier gain (amplification factor) with respect to the shutter speed and the aperture used in the process of FIG. 15.
- FIG. 18 is an explanatory diagram showing a change in brightness (degree).
- FIG. 19 is a subroutine flowchart of the horizontal pixel correction processing of FIG. 8.
- FIG. 20 is a subroutine flowchart of the vertical line correction process of FIG. 8.
- FIG. 21 is a subroutine flowchart of the process in FIG. 19.
- FIG. 22 is an explanatory diagram showing the processing of FIG. 21.
- FIG. 23 is an explanatory diagram showing the processing in FIG. 21.
- FIG. 24 is a subroutine flowchart of the process of FIG. 19.
- FIG. 25 is an explanatory diagram showing the processing of FIG. 24.
- FIG. 26 is an explanatory diagram showing the processing of FIG. 24.
- FIG. 27 is an explanatory diagram showing an image of the image composition obtained with the position correction processing of FIG. 8.
- FIG. 28 is an explanatory diagram showing an image of the image synthesis when the position correction process of FIG. 8 is not performed.
- FIG. 29 is an explanatory diagram showing an image in the case of combining before performing the edge detection process in the combining process of FIG. 8.
- FIG. 30 is an explanatory diagram showing an image in a case where the image is synthesized after performing the edge detection processing in the related art.
- FIG. 31 is an explanatory diagram for explaining a pattern matching method that replaces the edge detection process and the Hough transform.
- FIG. 32 is an explanatory diagram showing an example of a pattern in the pattern matching shown in FIG. 31.
- FIG. 1 is a schematic diagram showing an entire configuration of a vehicle lane marking recognition apparatus according to an embodiment of the present invention.
- In FIG. 1, reference numeral 10 denotes a camera equipped with an imaging element such as a CCD or C-MOS sensor and mounted inside the cabin 12a facing the traveling direction of the vehicle 12; the camera 10 photographs (images) an area including the road surface in the traveling direction.
- An image processing ECU (Electronic Control Unit) is connected to the camera 10. The image processing ECU receives an imaging (image) signal, output from the camera 10, indicating information on the traveling lane ahead of the vehicle, and performs the image processing described later using a hardware image processing IC (Integrated Circuit). The camera 10 and the image processing ECU correspond to the above-described photographing means.
- In this specification, the "traveling lane marking" refers to a road marking that divides vehicle lanes: a solid or dashed white or yellow line painted on the road surface, or cat's eyes installed at intervals on the road surface. The traffic lane divided by such lane markings is called a "lane".
- A steering wheel 16 provided at the driver's seat in the cabin 12a of the vehicle 12 is connected to a rack shaft 20 via a rack-and-pinion type steering gear, and the rack shaft 20 is connected to the drive wheels 24 via tie rods 22.
- An electric power steering mechanism (EPS) 30 including an electric motor 26 is disposed on the rack shaft 20, and the rack shaft 20 is reciprocated by the rotation of the electric motor 26.
- EPS electric power steering mechanism
- Wheel speed sensors 32 are provided near the drive wheels 24 and the driven wheels (not shown), and each outputs a signal per predetermined angle of rotation, that is, a signal according to the traveling speed (vehicle speed) of the vehicle 12.
- a yaw rate sensor 34 is disposed at the center of the vehicle 12 (near the rear axle) and outputs a signal corresponding to the yaw rate (rotational angular velocity) around the vertical axis (gravity axis) at the position of the center of gravity of the vehicle 12.
- a steering angle sensor 36 is provided near the steering wheel 16, and outputs a signal corresponding to the rotation amount of the steering wheel 16 operated by the driver, that is, the steering angle.
- A control ECU 40, which also comprises a microcomputer, is provided. The control ECU 40 receives the outputs of the image processing ECU and the above-described sensors, calculates the steering force necessary for lane keeping assist control for traveling along the lane, or for lane departure prevention control for preventing inadvertent departure from the lane, converts it into a command value, and outputs the command value.
- the control ECU 40 is connected to an EPSECU 42 that controls the operation of the EPS 30.
- the EPSECU 42 also includes a microcomputer, transmits and receives data to and from the control ECU 40, and drives the electric motor 26 based on a command value output from the control ECU 40.
- FIG. 2 is a block diagram showing the inputs and outputs of the control ECU 40 shown in FIG. 1.
- As illustrated, the control ECU 40 receives the outputs of the camera 10 and the above-mentioned image processing ECU (indicated by reference numeral 44), together with the outputs of the yaw rate sensor 34 and the other sensors, and outputs command values for lane keeping assist control and the like.
- Although omitted in FIG. 1, a meter and a switch (SW) are arranged near the driver's seat in the cabin 12a; the control content of the control ECU 40 is displayed on the meter, and control instructions from the driver, entered via the switch, are shown on the meter and input to the control ECU 40.
- FIG. 3 is a flowchart showing the operation of the apparatus shown in FIGS. 1 and 2.
- the illustrated program is executed by the image processing ECU 44 and the control ECU 40 at predetermined time intervals, for example, every 33 msec.
- In S10, an imaging signal is output from the imaging element of the camera 10.
- the imaging device of the camera 10 has an imaging area having a length of n lines and a width of m pixels.
- the optical axis of the camera 10 is adjusted in a predetermined direction including a traveling path in a traveling direction on a production line of a factory or the like.
- In S12, frame synthesis is performed; that is, the image signal output from the imaging element is input and synthesized. Specifically, the imaging signal input in the current processing cycle is combined with an imaging signal input in a past processing cycle and stored in the memory of the image processing ECU 44; in other words, the current image (frame) and a past image are combined.
- the feature of this embodiment lies in the frame synthesis, and will be described later in detail.
- Steps S12 to S16 are processing executed by a hardware-based image processing IC in the image processing ECU 44.
- In S14, an edge detection process comprising well-known differentiation and subsequent binarization is performed to generate an edge image as shown in FIG. 4 from the input image.
- In S16, the straight-line component shown by a solid line in FIG. 5, corresponding to a lane marking such as a white line, is determined.
- In S18, a plurality of edge points along the straight-line component (an edge point sequence, indicated by black circles in FIG. 6) are detected as lane candidates (a white line candidate point sequence).
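The Hough transform used to find the straight-line component can be sketched for a handful of edge points as below. The discretization (180 angle bins, integer rho) and the function name are illustrative assumptions, not the patent's implementation.

```python
import math

def hough_peak(points, n_theta=180):
    """Accumulate (theta, rho) votes for each edge point and return
    the strongest line in normal form x*cos(theta) + y*sin(theta) = rho.
    A minimal sketch of a Hough transform: 180 angle bins and integer
    rho quantization are arbitrary illustrative choices."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            key = (t, rho)
            acc[key] = acc.get(key, 0) + 1
    (t_best, rho_best), _votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, rho_best

# Collinear edge points lying on the line y = x.
theta, rho = hough_peak([(0, 0), (5, 5), (10, 10)])
```

For points on y = x the peak lies at rho = 0 with theta near 3π/4, i.e. the normal form of that diagonal line; rounding makes a few adjacent angle bins tie, so the recovered angle is only approximate.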
- Vehicle control is executed in S22: based on the estimated lane position, command values for vehicle control, such as the lane keeping assist control and lane departure prevention control described above, are output to the EPSECU 42 to drive the electric motor 26.
- As described above, the processes from S12 to S16 are executed by the hardware image processing IC in the image processing ECU 44, the processes in S18 and S20 are executed by the image processing ECU 44, and the processes from S22 onward are executed by the control ECU 40.
- FIG. 8 is a subroutine flowchart of the frame synthesis processing in S12 of FIG. 3 (synthesis of the image (imaging signal) acquired at the present time with an image (imaging signal) acquired and stored at a past time).
- A feature of this embodiment lies in the various frame synthesis processes shown in FIG. 8; an outline is given first with reference to FIG. 8. In S100, a past frame to be synthesized is selected based on the detected vehicle speed.
- The image to be synthesized with the captured image (shown in FIG. 9) input at the current time (current processing cycle) t is selected from the images input in each preceding processing cycle and stored in the memory of the image processing ECU 44.
- Here, n denotes the sample time of the discrete system, and t−n denotes a photographed image taken and input n cycles before, that is, an image photographed and stored at a different, past time.
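Holding the frames of recent processing cycles so that the frame from n cycles ago can be retrieved can be sketched with a ring buffer; the class name and buffer depth are illustrative assumptions.

```python
from collections import deque

class FrameBuffer:
    """Stores the captured image of each processing cycle so that the
    frame photographed n cycles ago (time t - n) can be retrieved for
    synthesis. A generic sketch; the depth of 5 is an assumption."""
    def __init__(self, depth=5):
        self.frames = deque(maxlen=depth)  # oldest frame is dropped automatically

    def push(self, frame):
        self.frames.append(frame)

    def past(self, n):
        """Frame from n cycles ago; n = 0 is the current frame."""
        return self.frames[-1 - n]

buf = FrameBuffer()
for cycle in range(5):   # pretend each int is the frame captured in that cycle
    buf.push(cycle)
```

With the five frames 0..4 pushed, `buf.past(0)` is the current frame 4 and `buf.past(3)` is frame 1, i.e. the image from three cycles earlier.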
- In S102, the brightness of the selected past frame is corrected.
- Since the photographed image input at the present time and the past image selected according to the vehicle speed in S100 were taken at different times, the brightness of the photographed object (road surface) may differ between them due to the influence of shadows and the like.
- If the composition is performed while the road surface portion of one image is brighter than the lane marking (such as a white line) of the other image, the white line will be buried in the road surface in the composite image.
- Therefore, the brightness of the captured image at the past time is adjusted to match the brightness of the captured image at the current time, so that the two images have matching brightness. The details will be described later.
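Matching the past frame's brightness to the current frame can be illustrated with a simple gain model based on the mean road-surface brightness of each frame. The patent's actual correction works from shutter speed, aperture, and amplifier gain, so this pixel-gain version is only an analogy; 8-bit pixel values are assumed.

```python
def match_brightness(past_pixels, past_road_mean, current_road_mean):
    """Scale the past frame's pixel values so that its road surface
    has the same mean brightness as the current frame's road surface.
    A simplified gain model standing in for the shutter-speed /
    aperture / amplifier-gain correction described in the text;
    8-bit pixels are assumed and clipped at 255."""
    gain = current_road_mean / past_road_mean
    return [min(255, round(p * gain)) for p in past_pixels]

# Past frame was shot in shade: road mean 50 vs. 100 in the current frame.
corrected = match_brightness([50, 60, 200], 50, 100)  # -> [100, 120, 255]
```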
- In S104, a combination start position is set. When the attitude of the vehicle 12 differs from the initial state (at the time of adjusting the optical axis of the camera 10) due to factors such as loading, in other words, when the static pitching attitude has changed, the shooting area of the camera 10 shifts up and down. The composition start (reference) position is therefore learned and corrected (set) to account for this shift when compositing images. This is done by matching the position of the horizon between the current and past images, since the area above the road surface is normally unnecessary for recognizing lane markings.
- In S106, horizontal pixel correction is performed. The posture or angle of the vehicle 12 relative to the lane may differ between the present and past times, so the photographed images may be shifted in position or angle. If the images were combined as they are, the horizontal position or angular direction of the lane marking would of course be shifted. To prevent this, the horizontal position and angular direction of the lane marking in the past image are corrected by the amount of the vehicle attitude change between the current and past times. The details will be described later.
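The horizontal-position part of this correction amounts to shifting the past image sideways by the pixel offset implied by the attitude change. A one-row sketch (the angular correction described above is omitted, and the function name is an assumption):

```python
def shift_row(row, shift_px, fill=0):
    """Shift one image row horizontally by shift_px pixels (positive
    = rightward), filling vacated pixels with `fill`. Applied to
    every row of the past frame, this corrects a pure lateral offset
    between two frames; the angular correction is not modeled here."""
    out = [fill] * len(row)
    for i, value in enumerate(row):
        j = i + shift_px
        if 0 <= j < len(row):
            out[j] = value
    return out

# The vehicle drifted one pixel to the left between the two shots,
# so the past frame is shifted one pixel to the right before combining.
aligned = shift_row([10, 20, 30, 40], 1)  # -> [0, 10, 20, 30]
```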
- The first characteristic point of this embodiment is that the past frame to be synthesized is selected based on the vehicle speed (traveling speed), as described for S100 of FIG. 8.
- In the prior art, the past image to be combined with the current image is an image from a fixed time ago.
- However, the moving distance of the vehicle 12 varies with the vehicle speed (traveling speed), and the shooting area of the camera 10 (of the image sensor) naturally moves in the traveling direction as the vehicle 12 advances; in other words, the higher the vehicle speed, the farther the position of the photographed lane marking moves. Therefore, the amount and rate of extension of the lane marking in the combined image differ depending on how far back the past image that is combined with the current image was taken.
- As a travel lane marking, a dashed line is known in which a white line portion (colored portion) and a blank (an achromatic portion such as asphalt) alternate periodically.
- The white line of this lane marking may be faded or partially missing due to wear, and may be drawn short temporarily because of construction or other reasons.
- In this embodiment, the images are synthesized so that the white line portion is apparently extended even when it is blurred or chipped; for that purpose, the image to be combined is selected from images (past frames) taken at a time interval determined by the vehicle speed (traveling speed). This is not limited to white lines; the same applies to yellow lines.
- The inventors learned experimentally that if the length (percentage) of the white line portion relative to the total length of the white line portion and the blank is about 30% or more, the lane marking, and hence the travel path of the own vehicle, can be accurately recognized. Since the extended length of the white line portion changes with vehicle speed, the number of frames back in the imaging cycle to be used is determined according to the vehicle speed range of the control, for vehicle control such as the lane keeping assist control and lane departure prevention control described at the beginning, which uses the recognized lane marking data.
- With an imaging cycle of 33.3 msec, for example, when the vehicle speed (traveling speed) is 60 km/h or more, the image three frames before may be used to secure the necessary amount of extension of the white line portion.
- With the image two frames before, the amount of extension of the white line portion is insufficient unless the vehicle speed is 90 km/h or more.
- In the vehicle speed range of 60 to 90 km/h, the amount of extension of the white line portion decreases, but the extension still exceeds 1 m and the ratio of the white line portion to the total length increases to about 30%, so the lane markings, and hence the travel path of the vehicle, can be accurately recognized.
- In this way, at least one photographed image taken in a cycle corresponding to a time interval determined according to the detected vehicle speed (traveling speed) is selected as the image to be combined with an arbitrary (current) photographed image.
- Alternatively, the image used for synthesis may be switched between two frames before and three frames before at a certain vehicle speed, for example 105 km/h or 110 km/h; or, since the length of the white line portion is 30% or more of the total length of the white line portion and the blank with the image three frames before whenever the vehicle speed is 60 km/h or more, only that image may be used.
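The speed-dependent choice of how many frames to reach back can be sketched as below. The 33.3 ms cycle is from the text; the 1.5 m target extension is an assumed tuning value chosen so the sketch reproduces the roughly two-frames-at-90 km/h and three-frames-at-60 km/h behaviour discussed above, not a patent figure.

```python
IMAGING_CYCLE_S = 0.0333  # 33.3 ms imaging cycle, per the text

def frames_back(speed_kmh, target_extension_m=1.5, max_frames=5):
    """Smallest number of frames back such that the vehicle has moved
    at least target_extension_m between the two shots. The 1.5 m
    target and the cap of 5 frames are assumed illustrative values."""
    speed_ms = speed_kmh / 3.6
    for n in range(1, max_frames + 1):
        if speed_ms * n * IMAGING_CYCLE_S >= target_extension_m:
            return n
    return max_frames
```

Lower speed means less distance per frame, so the function reaches further back, matching the rule that the past image is selected at a longer time interval as the traveling speed decreases.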
- As described above, for an arbitrary captured image (specifically, the current image, more precisely the image captured in the current lane marking recognition cycle), at least one captured image previously taken at a time interval determined according to the detected vehicle speed (traveling speed) relative to the shooting time of that image (more specifically, at least one photographed image taken in a cycle corresponding to that time interval) is selected as the image to be combined with it. The apparent extension amount or extension ratio of the white line portion of the lane marking can therefore be set optimally regardless of the vehicle speed, the recognition accuracy of the lane marking can be improved, and the travel path of the vehicle can be recognized accurately.
- Further, the configuration is such that the lower the detected traveling speed, the longer the time interval at which the at least one past captured image is selected.
- Further, at least one past photographed image is selected at a time interval such that the length of the colored portion relative to the total length of the white line portion (colored portion) and the blank (achromatic portion) of the lane marking is 30% or more. Even if the lane marking is short because it is blurred or missing, or because of construction, the apparent extension amount or extension ratio of the white line portion can therefore be set optimally, the recognition accuracy of the lane marking can be improved, and the travel path of the own vehicle can be accurately recognized.
- Further, since the image serving as the reference for synthesis is the image captured in the current lane marking recognition processing cycle, the images can be combined based on the latest information.
- The second characteristic point of this embodiment is that brightness correction of the past frame is performed, as described for S102 of FIG. 8.
- Here, "brightness" is used broadly to include lightness, luminance, density, and the shooting conditions adjusted according to the brightness detected on the image, such as shutter speed, aperture, and the output amplifier gain of the video signal.
- the brightness of the road surface changes every moment due to a shadow of a building or the like, a wet road surface, strong sunlight, or various environmental changes.
- the photographing conditions are adjusted as described above.
- Since the white line portion is brighter than the rest of the road surface (asphalt, concrete), the brighter value is selected at each corresponding point (same place) on the two images when compositing.
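Selecting the brighter value at each corresponding point can be sketched as a per-pixel maximum; grayscale 2-D lists stand in for the captured frames, and the function name is an assumption.

```python
def composite_brighter(img_a, img_b):
    """Combine two equally sized grayscale images by keeping the
    brighter value at each pixel, so that white-line pixels from
    either frame survive in the composite."""
    return [[max(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

past    = [[20, 200, 20]]   # white-line pixel in the past frame
current = [[30,  30, 30]]   # blank (asphalt) in the current frame
combined = composite_brighter(past, current)  # -> [[30, 200, 30]]
```

This also shows why the preceding brightness correction matters: if the current frame's asphalt were brighter than the past frame's white line, the maximum would keep the asphalt and the white line would vanish from the composite.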
- the shutter speed and the like are adjusted based on the detected brightness so that the brightness is constant between the captured images.
- the shutter speed, aperture, and amplifier gain are not changed all at once but gradually, for reasons such as hunting prevention.
- as a result, the white line portion of one captured image can become darker than the road surface (asphalt portion) of another captured image.
- FIG. 14 is an explanatory diagram showing a process of correcting image brightness in this embodiment.
- the three graphs labeled a, b, and c at the right end of the figure show, in simplified form, the brightness at the white line (the dotted lane marking) and at the road surface (asphalt) in the images on the left side of the figure. The note at the top of graph a indicates that the protruding part of the pulse-shaped waveform is the white line and that both sides are the road surface. Accordingly, graph a shows that, in the left images A and B, the road surface of B is brighter than the white line portion of A.
- A in FIG. 14 is an image captured and stored at a past time
- B is an image captured and acquired at the current time.
- the road surface of image B is brighter than the white line (dotted line in the figure) of image A.
- the position of the image is corrected according to the change in the position of the vehicle between the present time and the past time.
- the brightness of the image does not change between A and A′.
- if both images are combined without brightness correction ("no correction" in FIG. 14, dashed line), then, since the road surface (asphalt) of image B is brighter than the white line portion of image A as shown in FIG. 14, the road-surface pixels of image B are selected, and, as shown in image C′ of FIG. 14, the problem occurs that the white line portion of image A (A′) disappears from the composite image.
- the brightness of a captured image is detected from the brightness of a predetermined area of the road surface (the shaded portion in FIG. 13) within the imaging area.
- correction is performed so that the brightness of a plurality of captured images matches with reference to one of the plurality of captured images.
- the brightness may be detected directly from the image, or obtained indirectly from the shooting conditions.
- the above-mentioned problem is eliminated by calculating the ratio of the brightness of the two images from the shooting conditions at the past and present times, and using it to match the brightness of the two images. It is also possible to match the brightness of the captured images to be synthesized using the difference in brightness instead of the ratio.
- FIG. 15 is a subroutine flowchart of the brightness correction processing in S102 of FIG. 8.
- the shooting conditions at the time of photographing are stored: the shutter speeds S0, S1 of the camera 10, the aperture values I0, I1, and the amplifier gains G0, G1 indicating the amplification factor of the image signal.
- the shutter speed is adjusted (determined) according to the characteristics shown in FIG. 16, as a function of the brightness of the detected captured image, specifically the brightness of a predetermined area of the road surface (the shaded area in FIG. 13) within the imaging area of the imaging means, for example the density of the pixels in that area.
- by changing the aperture value as shown in FIG. 16, the amount of light taken in changes, so the brightness of the image can be adjusted.
- if the surrounding environment is dark and adjustment by the shutter speed and aperture is still insufficient, the image density can be further adjusted by a factor of n with the amplifier gain to adjust the brightness.
- the aperture value and the amplifier gain are stored in addition to the shutter speed.
- the brightness ratio of the image between the present time and the past time is calculated from the stored shooting conditions by the formula shown in the drawing (S206).
- the brightness of the image at the past time is corrected to match the brightness of the current image.
- the brightness ratio of a plurality of captured images is detected based on at least one of the shutter speed and the aperture value at the time each image was captured, and the brightness is corrected based on the ratio and/or the difference of at least one of them, specifically based on both ratios.
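One plausible form of that correction can be sketched as follows, assuming (this is our assumption, not a formula given in the text) that image brightness is proportional to shutter time and amplifier gain and inversely proportional to the square of the f-number:

```python
def brightness_ratio(s_past, f_past, g_past, s_now, f_now, g_now):
    """Ratio (current / past) of expected image brightness from
    shutter times s, f-numbers f and amplifier gains g."""
    return (s_now / s_past) * (f_past / f_now) ** 2 * (g_now / g_past)

def correct_past_row(row, ratio):
    """Scale past-frame pixel values by the ratio so their brightness
    matches the current frame (clipped to the 8-bit range)."""
    return [min(255, max(0, round(p * ratio))) for p in row]
```

Doubling the shutter time at unchanged aperture and gain doubles the expected brightness, so the past frame is scaled by 2 before the brighter-pixel compositing.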
- the ratio of the brightness of a plurality of captured images may also be detected based on the amplification factor (amplifier gain).
- the brightness is obtained by detecting the brightness of the predetermined road-surface area indicated by the hatched portion in FIG. 13 from the density of the pixels at that site, and the detected brightness is corrected based on at least the ratio of the shutter speeds and aperture values at the time of shooting.
- the image at the past time point whose brightness has been corrected and the image at the current time point are synthesized.
- since a lane marking such as a white line is usually brighter than the surrounding road surface such as asphalt, when compositing, the brighter value, that is, the brighter of the two pixels at each position, is selected.
- the brightness of the captured images taken by the photographing means is detected, and the brightness of the plurality of captured images is matched with reference to one of them. The white line portion can therefore be extended satisfactorily without the white line portion of one image being buried in the road surface of another, so the recognition accuracy of the lane marking can be improved and the traveling path of the own vehicle can be accurately recognized.
- the image serving as the reference for synthesis is the image of the current lane marking recognition cycle
- the images can be synthesized while matching the brightness based on the latest information.
- the recognition accuracy of the lane marking can be improved, and thus the traveling path of the vehicle can be accurately recognized.
- the brightness can be detected using a common scale, and the recognition accuracy of the lane marking can be improved.
- since the brightness is detected or corrected based on the gain, brightness detection and correction become easy; and since the correction is performed so that the brightness at the same position in a plurality of images matches, the brightness can be corrected with higher accuracy.
- the traveling lane marking can thus be recognized from the combined image with higher accuracy, and the traveling path of the own vehicle can therefore be recognized more accurately.
- the third characteristic feature of this embodiment is that horizontal pixel correction and vertical line correction are performed, as described in S106 and S108 of FIG. 8.
- the posture (angle) of the vehicle 12 with respect to the travel lane marking and the horizontal distance from the travel lane marking may be different between the present time and the past time.
- since the image is shifted in the horizontal direction or in the angular direction of the vehicle 12 with respect to the lane marking, if the images are synthesized as they are, the horizontal position and angle of the lane marking will naturally be shifted.
- FIG. 18 is an explanatory diagram showing the images taken by the camera 10 and the changes in the position and attitude (angle) of the vehicle 12 with respect to the lane marking (lane) from exposure time t-2, through t-1, to the current time t0.
- exposure time t0 is the current time, t-1 is the past time one cycle earlier, and t-2 is the time one cycle before t-1.
- the lane markings (lanes) located on the left and right of the vehicle 12 in FIG. 18 are recognized, and a point sequence at the center of the lane is used as reference points (the short dashed line in FIG. 18); in lane keeping control, for example, this is detected and used as the line along which the vehicle should travel. Note that the lane marking in FIG. 18 was obtained by photographing at the past time t-1, so the point sequence is also the point sequence at time t-1.
- the vehicle positions at exposure times t-1 and t0 differ.
- the angles between the line segment connecting the point sequence and the direction of the vehicle 12 (the front-rear direction angle of the vehicle 12 with respect to the lane marking) are θ-1 and θ0.
- FIG. 19 is a subroutine flowchart of the horizontal pixel correction process in S106 of FIG. 8.
- the figure shows the processing flow for obtaining, from the deviation in horizontal position between exposure times t0 and t-1 (the present and the past), the correction amount of the pixel position in the horizontal direction on the image. From the obtained values, the correction amount of the pixel position at the time of synthesis on the image plane is determined, and the camera coordinate system (U, V) is associated with the coordinate system (X, Y) of the real plane, as hardware processing performed before synthesis. Details of the processing in S302 will be described later.
- FIG. 20 is a subroutine flowchart of the vertical line correction process in S108 of FIG. 8.
- in this step, the correction amount of the vertical composite line position on the image plane is calculated based on the obtained value, and the camera coordinate system (U, V) is associated with the real plane coordinate system (X, Y).
- FIG. 21 is a flowchart showing the subroutine of the process of S300 in FIG. 19, and FIG. 22 and FIG. 23 are explanatory diagrams thereof.
- the coordinate axes of the trajectory estimation are coordinate axes set with the center axis of the traveling direction of the vehicle (vehicle center axis) at a certain time in the past as the X axis.
- based on the vehicle position at exposure time t-1, the amount of lateral change of the vehicle (center) position at time t0 is determined.
- the coordinates of the position of the vehicle 12 at exposure time t0 on the detection-result coordinate system x, y are (x0, y0), and the coordinates (x-1, y-1) of the vehicle position at exposure time t-1 are obtained; the distance L between a point on the straight line connecting adjacent points Pk and Pk+1 of the point sequence and the vehicle coordinate is then calculated.
- the vehicle position at exposure time t-2, the point sequence P, and the like are omitted from the figure.
- the change amount Δθ of the angle of the vehicle with respect to the lane marking is determined.
- the calculation uses the point sequence detected from the image photographed at exposure time t-1.
- FIG. 24 is a flowchart of the subroutine of S302 in FIG. 19, and FIGS. 25 and 26 are its explanatory diagrams.
- step S604 the horizontal pixel positions U5m and U30m on the image plane corresponding to the correction distances Y5m and Y30m obtained in step S600 are obtained.
- a straight line indicating the position and angle of the vehicle at time t-1 on the real plane is corrected, to compensate for the changes, into the straight line indicating the position and angle at time t0 shown in FIG. 25.
- since the composition position setting is processed in the above-described hardware image processing IC, high-speed processing is possible.
- FIG. 27 and FIG. 28 are diagrams showing the resulting image of the image synthesis when the position correction processing of this embodiment is applied, and when the position correction is not performed, respectively.
- with the correction, the traveling lane marking takes an extended form, as shown in image C of FIG. 27. Without it, the lane markings photographed at the past and present times are combined in a mutually shifted form, as shown in FIG. 28, and the lane marking is not extended.
- the front-rear angle θ of the vehicle 12 with respect to the travel lane marking and the distance L between the travel lane marking and the vehicle 12 are determined in each of the plurality of captured images, and the photographed images are corrected based on the changes in the angle and the distance between them.
- the difference in pitch angle between images (Δθpit) is detected, and, based on the detected difference, each image is corrected to an image taken at the same pitch angle before the multiple images are combined.
- the apparent lane marking can be extended more reliably.
- alternatively, the composite position between the plurality of images may be adjusted so that they correspond to images captured at the same vehicle angle and distance with respect to the travel lane marking; that is, instead of moving the image itself as described above, the images may be synthesized by adjusting the composition (overlapping) position.
- one of the previously acquired image A and the currently acquired image B is shifted, relative to the other, by the changes in horizontal position and angle between the two times described above, and the same result as the above correction is obtained. That is, considering the horizontal position correction, the point p of image B shifted in the horizontal direction by the correction amount should overlap the point p of image A in FIG. 27. In FIG. 27 there is almost no change in angle between the two times, but if there is a change, the image may be rotated by the amount of the angle change and then synthesized. Likewise, when there is a difference in pitch angle, one image may be shifted in the vertical direction in FIG. 27 by the corresponding amount and synthesized. The same effect as correcting the image itself is thereby obtained.
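A minimal one-dimensional sketch of this shift-then-composite alternative (the names are illustrative; the rotation for an angle change and the vertical shift for a pitch difference are omitted):

```python
def shift_and_composite(row_past, row_now, du):
    """Shift the past row du pixels to the right (the lateral-position
    correction), then keep the brighter pixel at each point."""
    n = len(row_past)
    shifted = [0] * n
    for i, v in enumerate(row_past):
        j = i + du
        if 0 <= j < n:  # pixels shifted out of frame are dropped
            shifted[j] = v
    return [max(a, b) for a, b in zip(shifted, row_now)]
```

Shifting the past row by the lateral correction amount makes its white-line pixel land on the position it would occupy in the current frame before the brighter-pixel selection.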
- the fourth characteristic feature of this embodiment is that, in the synthesizing process of S110 in FIG. 8, the images at the present and past times are synthesized at the stage of the original image, before the edge detection process.
- FIG. 29 shows the image obtained when the images at the present and past times are combined at the stage of the original image, before the edge detection processing is performed; FIG. 30 shows, for comparison, the image obtained when the present and past times are combined in the state of so-called edge images, after the edge detection processing, as in the prior art of patent document 2.
- the traveling lane markings distant from the own vehicle are short on the images, as in images A and B; that is, their size on the image is small. If the current and past images are separately subjected to edge detection processing, such markings are not recognized as edges; as a result, as shown in images a and b of FIG. 30, no corresponding edges appear.
- even if position correction based on the change in vehicle attitude is performed to obtain the corrected image shown in image a′ of FIG. 30, and the edge images of the current and past times are then synthesized, there is no edge corresponding to the distant lane markings seen in the original image, as in the area surrounded by the broken line in image D of FIG. 30.
- in contrast, in this embodiment, the image at the past time (image A in FIG. 29) is subjected to position correction based on the change in vehicle attitude and to image brightness correction (image A′ in FIG. 29).
- the short distant travel dividing lines of the individual original images are thus extended, as shown in image C of FIG. 29; when edge detection is then performed, they are detected as edges, as in the area surrounded by the solid line in image D of FIG. 29.
- since the captured images are combined at the stage of the original image before the edge detection processing, that is, the images at the present and past times are synthesized first, distant lane markings can be accurately detected; the lane markings can therefore be apparently extended and the recognition accuracy improved.
- the images are thus combined before the edge detection processing. Since the edge detection processing is usually composed of differentiation processing, binarization processing, and edge detection, the effect of the present invention is obtained if the images are synthesized before the edge detection proper, that is, before the binarization processing, even if not before the differentiation processing.
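The difference in ordering can be illustrated with a toy gradient detector (one-dimensional brightness rows; a crude stand-in for the differentiation and binarization steps named above, with illustrative names and threshold):

```python
def edges(row, thresh=50):
    """Crude edge detector: differentiate the row, then binarize."""
    return [1 if abs(b - a) > thresh else 0 for a, b in zip(row, row[1:])]

def edges_of_composite(row_past, row_now):
    """This embodiment's order: composite the original rows first
    (brighter pixel wins), then run edge detection on the result."""
    return edges([max(a, b) for a, b in zip(row_past, row_now)])
```

Compositing first lets two short bright segments, one per frame, form a single longer marking whose edges are then found in one pass.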
- FIG. 31 shows another method.
- pattern matching can be performed with higher accuracy by extending the traveling lane marking, which is effective.
- Figure 32 shows an example of the pattern.
- patterns of the shape in which the travel lane markings appear in the image captured by the on-board camera are prepared in advance as a plurality of templates, as shown in FIG. 32; these are applied to the captured image as shown in the center diagram of FIG. 31, the degree of coincidence between the image and each template is determined, and the distribution of positions with a high degree of coincidence is taken as the travel lane marking, as shown in the lower diagram of FIG. 31. According to the present invention, the traveling lane marking is extended even for such pattern matching, so the recognition accuracy of the traveling lane marking can be further improved.
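The degree-of-coincidence search can be sketched in one dimension as follows (a correlation-score toy stand-in for the two-dimensional template matching of FIGS. 31 and 32; the names are our own):

```python
def match_template(profile, template):
    """Slide a template along a brightness profile and return the
    offset with the highest correlation score."""
    m = len(template)
    scores = [
        sum(p * t for p, t in zip(profile[i:i + m], template))
        for i in range(len(profile) - m + 1)
    ]
    return scores.index(max(scores))
```

The longer the marking in the composite image, the more positions yield a high score, which is why extending the lane marking helps this kind of matching.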
- in this embodiment, the number of images to be combined in the process of S100 is two, but more than two images may be used as necessary. Further, in this embodiment a selection is made according to the vehicle speed from images taken periodically; however, two or more photographing means taking images at the same common period may be provided, and the selection made among those images according to the vehicle speed. In the correction in S108, instead of the change in the horizon obtained from the image, a pitch angle sensor may be arranged at an appropriate position on the vehicle 12 and the correction performed by obtaining the dynamic shift from its output.
- the camera 10 is configured to photograph an area including the road surface in front of the vehicle; however, a photographing means (camera) for photographing an area including the road surface behind the vehicle when the vehicle 12 travels forward may be provided, and the traveling lane markings behind the vehicle may, needless to say, be recognized from its photographed image. Therefore, in this specification, the term "vehicle traveling direction" is used to indicate the front-rear direction of the vehicle.
- as described above, this embodiment comprises photographing means (camera 10, image processing ECU 44) for photographing an area including the road surface in the traveling direction of the vehicle 12; lane marking recognizing means (image processing ECU 44, S12 to S16) capable of recognizing at least a broken lane marking on the road surface in the photographed image; and image synthesizing means which combines a plurality of photographed images taken at different times by the photographing means, the photographed images being combined, in the recognition process, at a processing step (S12) that does not change at least the shape of the traveling lane marking in the photographed images.
- it likewise comprises photographing means (camera 10, image processing ECU 44) for photographing an area including the road surface in the traveling direction of the vehicle 12; lane marking recognition means (image processing ECU 44, S12 to S16) capable of recognizing at least a broken lane marking on the road surface by detecting edges in the photographed image and performing the Hough transform on the detected edges; and image synthesizing means (image processing ECU 44, S12) for synthesizing a plurality of images taken at different times by the imaging means and extending the length of the traveling lane marking in the photographed image, the photographed images being combined before the edge detection (S14) in the lane marking recognition process by the lane marking recognition means.
- since the configuration is such that the captured images are synthesized at a processing stage of the lane marking recognition process that does not change the shape of the lane markings in the captured image, distant lane markings can be accurately detected, the lane markings can be apparently extended, and the recognition accuracy can thereby be improved.
- a vehicle lane marking recognition device can be provided.
- even when the traveling lane marking is blurred or chipped, or its length is short depending on the case or on construction, it can be recognized with high accuracy. Since the length of the traveling lane marking is extended, its direction can be recognized with higher accuracy than when the broken lane marking drawn on the road is recognized as it is. A traveling lane marking recognition device for a vehicle that can recognize the traveling lane marking with high accuracy can thus be provided.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05741423A EP1760662A4 (en) | 2004-05-19 | 2005-05-17 | TRAVEL SECTION LINE KNOWLEDGE FOR A VEHICLE |
US11/596,900 US7421095B2 (en) | 2004-05-19 | 2005-05-17 | Traffic lane marking line recognition system for vehicle |
CN2005800157914A CN1954343B (zh) | 2004-05-19 | 2005-05-17 | 车辆用行驶划分线识别装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-148577 | 2004-05-19 | ||
JP2004148577A JP3898709B2 (ja) | 2004-05-19 | 2004-05-19 | 車両用走行区分線認識装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005111937A1 true WO2005111937A1 (ja) | 2005-11-24 |
Family
ID=35394357
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/008937 WO2005111937A1 (ja) | 2004-05-19 | 2005-05-17 | 車両用走行区分線認識装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US7421095B2 (ja) |
EP (1) | EP1760662A4 (ja) |
JP (1) | JP3898709B2 (ja) |
CN (1) | CN1954343B (ja) |
WO (1) | WO2005111937A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10320559A (ja) * | 1997-05-20 | 1998-12-04 | Nissan Motor Co Ltd | 車両用走行路検出装置 |
JP2001236506A (ja) * | 2000-02-22 | 2001-08-31 | Nec Corp | 白線検出方法および白線検出装置 |
JP2004110521A (ja) * | 2002-09-19 | 2004-04-08 | Denso Corp | 変位データ抽出方法及び物体検出装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1760662A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1760662A4 (en) | 2009-06-17 |
CN1954343A (zh) | 2007-04-25 |
US7421095B2 (en) | 2008-09-02 |
JP2005332104A (ja) | 2005-12-02 |
JP3898709B2 (ja) | 2007-03-28 |
EP1760662A1 (en) | 2007-03-07 |
US20070198146A1 (en) | 2007-08-23 |
CN1954343B (zh) | 2010-05-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11596900 Country of ref document: US Ref document number: 2007198146 Country of ref document: US Ref document number: 200580015791.4 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005741423 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2005741423 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 11596900 Country of ref document: US |