WO2013022153A1 - Apparatus and method for detecting a lane - Google Patents

Apparatus and method for detecting a lane

Info

Publication number
WO2013022153A1
Authority
WO
WIPO (PCT)
Prior art keywords
lane
feature points
fitting
basis
image
Prior art date
Application number
PCT/KR2011/009800
Other languages
English (en)
Inventor
Youngkyung Park
Jeihun Lee
Joongjae LEE
Jonghun Kim
Junoh PARK
Andreas PARK
Chandra Shekhar DHIR
Hyunsoo Kim
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Publication of WO2013022153A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • B60W40/072 Curvature of the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20061 Hough transform
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Definitions

  • the present invention relates to an apparatus and method for detecting a lane.
  • a driving system first needs to detect lane lines by using an image recognition apparatus in order to determine that it is leaving its lane and issue a warning.
  • a recognition apparatus using images captures images of a target to be recognized by a driving system by using a camera, extracts features of the target by using digital image processing techniques, and checks the target by using the extracted features.
  • a line, which is the target, therefore needs to be extracted with high accuracy from images captured by a moving object.
  • a lane refers to a reference line for driving and becomes a basis on which all driving actions such as going forward, going backward, changing lanes, changing directions, forward parking, reverse parking, and perpendicular parking are performed.
  • a method of detecting a lane is mainly applied to advanced safety vehicles (ASVs), which are becoming increasingly high-functioning and intelligent.
  • This method of detecting a lane is applied to a lane departure warning system to prevent dozing off behind the wheel, a rear parking guide and perpendicular parking guide system to help novice drivers park their cars, and a lane keeping system to apply torque to a steering wheel in a dangerous situation to keep a lane.
  • the application of the lane detection method has gradually been expanded.
  • Methods of detecting a lane typically map lane points corresponding to coordinates of the captured images to a two-dimensional coordinate system, and then detect and display the positions of the lanes.
  • an object of the present invention is to provide an apparatus and method for accurately detecting a lane even when it is difficult to express a curved road with high curvature, or a road connecting a straight road and a curved road, with a single curve equation.
  • an apparatus for detecting a lane including: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting may include carrying out short-range fitting on the basis of feature points present in a short-range region among the plurality of feature points, determining an offset representing lateral inclination of the lane on the basis of a result of the short-range fitting, and carrying out curve fitting on the basis of the offset.
  • the curve fitting may include determining the coefficients of a curve equation of arbitrary degree on the basis of the curve equation.
  • the short-range region may be the lower of two regions obtained by equally dividing the image, or a region ranging from the lower edge of the image to an arbitrary height.
  • the camera module may include at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
  • the plurality of feature points may be extracted only from a region of interest defined only on a lower part of a road on the basis of the horizon within the image.
  • the plurality of feature points may be extracted on the basis of gradient information or color information of the image.
  • the control unit may transform the plurality of feature points to a world coordinate system and fit the lane on the basis of the feature points transformed to the world coordinate system.
  • the fitting of the lane may be performed by using at least one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform, and spline interpolation.
  • the tracking of the lane may be performed on every fitted lane.
  • a method of detecting a lane including: capturing an image; extracting a plurality of feature points from the image; carrying out lane fitting to connect the plurality of feature points with a single line; tracking the lane fitted; and displaying the lane tracked, wherein the carrying out of the lane fitting may include: carrying out short-range fitting on the basis of feature points present in a short-range region among the plurality of feature points; determining an offset representing lateral inclination of the lane on the basis of a result of the short-range fitting; and carrying out curve fitting on the basis of the offset.
  • the carrying out of the curve fitting may include determining the coefficients of a curve equation of arbitrary degree on the basis of the curve equation.
  • the short-range region may be the lower of two regions obtained by equally dividing the image, or a region ranging from the lower edge of the image to an arbitrary height.
  • the camera module may include at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
  • the extracting of the plurality of feature points may include: defining a region of interest only on a lower part of a road on the basis of the horizon within the image; and extracting the plurality of feature points only from the region of interest.
  • the plurality of feature points may be extracted on the basis of gradient information or color information about the image.
  • the fitting of the lane may include: transforming the plurality of feature points to a world coordinate system; and fitting the lane on the basis of the feature points transformed to the world coordinate system.
  • the fitting of the lane may be performed by using at least one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform, and spline interpolation.
  • according to the apparatus for detecting a lane of the present disclosure, all lanes appearing in a driving route are tracked and lanes are detected from the tracking result, so that lanes can be accurately detected even in changing road conditions such as interchanges, and a new driving lane can be detected quickly even when lanes are changed.
  • furthermore, when detecting a curved road with high curvature, short-range fitting is performed first and long-range curve fitting is then performed on the basis of the short-range fitting result, so that a result close to the actual lane can be obtained and an accurate offset can be determined at the same time.
  • FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present disclosure.
  • FIG. 2 is a flowchart illustrating a lane detection process according to an exemplary embodiment of the present disclosure.
  • FIG. 3 is a view illustrating an image according to an exemplary embodiment of the present disclosure.
  • FIG. 4 is a view illustrating a result of extracting feature points according to an exemplary embodiment of the present disclosure.
  • FIG. 5 is a view illustrating a result of transforming extracted feature points to a world coordinate system according to an exemplary embodiment of the present disclosure.
  • FIG. 6 is a view illustrating a lane fitting result according to an exemplary embodiment of the present disclosure.
  • FIG. 7 is a view displaying a lane tracking result according to an exemplary embodiment of the present disclosure.
  • FIG. 8 is a flowchart illustrating a curve fitting process according to an exemplary embodiment of the present disclosure.
  • FIG. 9 is a view illustrating a comparison between the actual lane and a lane resulting from curve fitting.
  • FIG. 10 is a view illustrating a short-range fitting result according to an exemplary embodiment of the present disclosure.
  • FIG. 11 is a view illustrating a curve fitting result according to an exemplary embodiment of the present disclosure.
  • Exemplary embodiments according to the present disclosure may stand alone and may also be applied to various types of terminals including mobile terminals, telematics terminals, smartphones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), notebook computers, tablet PCs, Wibro terminals, Internet protocol television (IPTV) terminals, televisions, 3D televisions, video equipment, navigation terminals, and audio video navigation (AVN) terminals.
  • Embodiments described herein may be implemented in the form of program commands executable by various computer means and recorded on a computer-readable recording medium.
  • Examples of the computer-readable recording medium may include a program command, a data file, and a data structure separately or in a combination thereof.
  • the program command recorded in the recording medium may be a command designed or configured specially for an exemplary embodiment of the present invention, or one known and available to a person having ordinary skill in the art.
  • Examples of the computer-readable recording medium include hardware devices specially configured to store and execute program commands, such as magnetic media including a hard disk, a floppy disk, and magnetic tape; optical media such as Compact Disc-Read Only Memory (CD-ROM) and Digital Versatile Disc (DVD); magneto-optical media such as a floptical disk; ROM; RAM; and flash memory.
  • Examples of the program command include machine language code such as that created by a compiler, as well as high-level language code executable by a computer using an interpreter.
  • the hardware device can be configured as at least one software module to execute an operation of an exemplary embodiment of the present invention, and vice versa.
  • FIG. 1 is a block diagram illustrating an apparatus for detecting a lane according to an exemplary embodiment of the present disclosure.
  • the apparatus for detecting a lane may include a camera module 110, a control unit 120, a storage unit 130, and an output unit 140.
  • the camera module 110 is a camera system that captures front, rear, and/or lateral images at the same time by using a rotary reflector, a condenser lens, and an imaging device.
  • the camera module 110 may also be applied to security facilities, surveillance cameras, and robot vision.
  • the rotary reflector may have various shapes such as a conicoid or spherical, conical, or combined shape.
  • the camera module 110 may include at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval in the same central axis in the same arbitrary plane of the lane detection apparatus 100, or a single camera. At this time, the horizontal interval may be determined in consideration of the distance between the eyes of an average person, and may be set when the lane detection apparatus 100 is configured.
  • the camera module 110 may be any camera module that can capture an image.
  • for example, the camera module 110 may employ an image sensor such as a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) sensor.
  • the camera module 110 may include at least one of a stereo camera and a moving stereo camera in order to capture an image of the front.
  • the stereo camera is an image apparatus that is composed of a plurality of cameras. With an image that is captured by the camera module 110, two-dimensional information about surrounding areas of the camera module 110 can be provided. By using a plurality of images that are captured by a plurality of cameras in different directions, three-dimensional information about the surrounding areas of the camera module 110 can be obtained.
  • the moving stereo camera refers to a camera that keeps vergence fixed on observed obstacles by actively changing the position of the stereo camera according to the distance to the obstacles.
  • the stereo camera generally includes two cameras that are arranged next to each other to capture images, and the distance to the obstacles can be calculated from the stereo disparity of the captured images.
  • the stereo camera is a passive camera in which the optical axes are always arranged parallel and fixed.
  • the moving stereo camera can fix vergence by actively changing the geometric position of the optical axis.
  • the control of vergence of the stereo camera according to the distance of the obstacles is called vergence control.
  • the vergence control stereo camera may keep stereo disparity related to a moving object constant, thereby providing an observer with more natural 3D images and also providing useful information for measuring the distance to obstacles and processing stereo images. An illustrative distance-from-disparity sketch is given below.
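The distance calculation mentioned above follows the standard rectified-stereo relation Z = f·B/d, where f is the focal length in pixels, B is the camera baseline, and d is the stereo disparity in pixels. The following is a minimal sketch; all numeric values are illustrative assumptions, not values from this disclosure.

```python
# Minimal sketch: distance from stereo disparity for an ideal rectified
# stereo pair (Z = f * B / d). All values below are assumed examples.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the distance in meters to a point observed with the given disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 6.5 cm baseline (roughly the inter-eye
# distance mentioned in the text), 10 px disparity -> 4.55 m.
print(depth_from_disparity(700.0, 0.065, 10.0))
```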
  • the control unit 120 may control the general operation of the lane detection apparatus 100.
  • the control unit 120 may perform the control of various types of power driving units in order for the lane detection apparatus 100 to drive.
  • the control unit 120 processes an image received from the camera module 110, carries out lane detection, and processes other operations.
  • a lane detection process of the control unit 120 will be described in detail with reference to FIGS. 2 to 11.
  • control unit 120 may perform functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 100 (or a vehicle having the lane detection apparatus 100) that is detected by an arbitrary GPS module and the detected lane.
  • the storage unit 130 may store data and programs for the operation of the control unit 120 and temporarily store data being input/output.
  • the storage unit 130 may temporarily store an image that is received by the camera module 110, processing information related to the image, and lane detection information.
  • the storage unit 130 may store operation expressions (for example, curve equations) used to process the image.
  • the storage unit 130 may store software components including an operating system (for example, LINUX, UNIX, OS X, WINDOWS, Chrome, Symbian, iOS, Android, VxWorks, or another embedded operating system), a module performing a wireless communication function, a module operating together with a user input unit, a module operating together with an A/V input unit, and a module operating together with the output unit 140.
  • the storage unit 130 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the lane detection apparatus 100 may operate in relation with a web storage that performs the storage function of the storage unit 130 on the Internet.
  • the output unit 140 generates outputs related to sight, hearing, or touch and may include a display unit 141 and a sound output module 142.
  • the display unit 141 may output information that is processed by the lane detection apparatus 100. For example, when the lane detection apparatus 100 is driving, the display unit 141 may display a UI (User Interface) or a GUI (Graphic User Interface) related to driving.
  • the display unit 141 may display the images captured by the camera module 110 of the lane detection apparatus 100 and/or information about lanes detected by the control unit 120.
  • when the display unit 141 displays the images and the information about the detected lanes at the same time, the images and the information may be displayed separately at top and bottom or left and right, or the information about the detected lanes may be overlaid on the images.
  • the display unit 141 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.
  • the displays may be transparent type or transmissive type displays, which may be called transparent displays.
  • Typical examples of the transparent displays may include a Transparent OLED (TOLED).
  • a rear structure of the display unit 141 may also be formed of a transmissive structure.
  • two or more display units 141 may exist.
  • a plurality of display units may be arranged separately or integrally on one surface of the lane detection apparatus 100, or may be arranged separately on different surfaces thereof.
  • the display unit 141 may have a layered structure with a touch sensor that senses a touch.
  • the display unit 141 may serve as an input device as well as an output device.
  • the touch sensor may be formed as a touch film, a touch sheet, a touch pad, or a touch panel.
  • the touch sensor may be configured to convert variations in pressure applied to a particular portion of the display unit 141 or capacitance generated at the particular portion of the display unit 141 into electrical input signals.
  • the touch sensor may be configured to sense pressure applied when being touched as well as touch position and touch area.
  • when there is a touch input to the touch sensor, a signal corresponding thereto is sent to a touch controller. The touch controller processes the signal and transmits corresponding data to the control unit 120, which is thereby informed of which area of the display unit 141 has been touched.
  • the sound output module 142 may output audio data stored in the storage unit 130 in recording mode and voice recognition mode.
  • the sound output module 142 may output sound signals related to a lane detection result (for example, an alarm regarding a kind of a detected lane) and functions regarding lane detection (for example, lane departure warning and automatic lane keeping alarm).
  • the sound output module 142 may include a receiver, a speaker, and a buzzer.
  • the lane detection apparatus 100 may include a communication unit that performs communications with an arbitrary terminal or server under the control of the control unit 120.
  • the communication unit may include a wired/wireless communication module.
  • the wireless internet technique may include a wireless local area network (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), and wireless mobile broadband service (WMBS).
  • Examples of the short range communication technology may include Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), and ZigBee.
  • the wired communication technology may include Universal Serial Bus (USB) communication.
  • the communication module may include CAN communication, Ethernet for a vehicle, flexray, and a Local Interconnect Network (LIN) in order to perform communications with an arbitrary vehicle having the lane detection apparatus 100.
  • the communication unit may receive an image captured by an arbitrary camera module from the arbitrary terminal or server. Moreover, the communication unit may transmit lane detection information about the arbitrary image to the arbitrary terminal or server under the control of the control unit 120.
  • the lane detection apparatus 100 may be implemented with a larger or smaller number of components than shown in FIG. 1.
  • FIG. 2 is a flowchart illustrating a lane detection process according to the present disclosure.
  • the lane detection apparatus 100 obtains an image in operation S21.
  • the camera module 110 may obtain a first image and a second image that are captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval in the same central axis in the same plane of the lane detection apparatus 100, or an image that is captured by a single camera.
  • the first image may be a left image captured by a left camera included in the one pair of cameras
  • the second image may be a right image captured by a right camera included in the one pair of cameras.
  • the camera module 110 may receive any one of the first image and the second image that are captured by the one pair of cameras.
  • the camera module 110 may obtain an image 310 as shown in FIG. 3.
  • the lane detection apparatus 100 extracts feature points of a lane in operation S22.
  • the control unit 120 extracts a plurality of feature points (edge points) 410 that are present in the image.
  • the control unit 120 may set only the lower part of the road, delimited by the horizon, as a Region of Interest (ROI), and extract the feature points 410 only within the region of interest.
  • the feature points 410 may be extracted by using various algorithms.
  • the control unit 120 may extract the feature points 410 on the basis of gradient information of the obtained image 310. That is, when brightness or gradations of color between adjacent pixels of the obtained image 310 gradually change, the control unit 120 may not regard this as the feature points 410. On the other hand, when brightness or gradations of color between adjacent pixels of the obtained image 310 drastically change, the control unit 120 may regard this as the feature points 410 and extract pixel information of the feature points 410.
  • the feature points 410 may be formed of discontinuous points on the boundary of two regions that are distinct in terms of brightness or gradations of color between the pixels.
  • the control unit 120 may extract feature points 410 on the basis of color information about the obtained image 310.
  • in general, on roads, ordinary lanes appear white, centerlines appear yellow, and the parts other than lanes appear black. Therefore, the control unit 120 may extract the feature points 410 on the basis of color information about lanes. That is, the control unit 120 may create, as one object, a region of the image 310 having colors that can be classified as lanes, classify only the region corresponding to the road as a region of interest in order to exclude other objects traveling on the road, and extract the feature points 410 from the object created on the basis of the color information within the region of interest.
  • the present invention is not limited thereto, and the feature points 410 may be extracted by various types of algorithms or filters for extracting feature points, such as an Edge Tracing algorithm or a Boundary Flowing algorithm; an illustrative gradient-based sketch is given below.
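As an illustration of the gradient-based extraction described above, the sketch below keeps pixels whose gradient magnitude exceeds a threshold inside a region of interest below the horizon. The Sobel operator, the threshold value, and the horizon row are assumptions for illustration; the disclosure only requires that points with abrupt changes in brightness or color be extracted.

```python
# Minimal sketch of gradient-based feature point extraction inside a
# region of interest below the horizon. Operator and threshold are assumed.
import cv2
import numpy as np

def extract_feature_points(image_bgr: np.ndarray, horizon_row: int,
                           grad_thresh: float = 80.0) -> np.ndarray:
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    roi = gray[horizon_row:, :]                      # keep only the road region
    gx = cv2.Sobel(roi, cv2.CV_32F, 1, 0, ksize=3)   # horizontal gradient
    gy = cv2.Sobel(roi, cv2.CV_32F, 0, 1, ksize=3)   # vertical gradient
    magnitude = cv2.magnitude(gx, gy)                # gradient strength per pixel
    ys, xs = np.where(magnitude > grad_thresh)       # abrupt changes become feature points
    return np.stack([xs, ys + horizon_row], axis=1)  # (x, y) in image coordinates
```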
  • the control unit 120 may transform the extracted feature points 410 to a world coordinate system.
  • the control unit 120 may use transformation matrices or coordinate transformation equations.
  • the transformation matrices may be homographic matrices that are stored in advance in the storage unit 130.
  • the control unit 120 may check that the vertical and horizontal intervals of the plurality of feature points 510 transformed to the world coordinate system remain uniform, thereby detecting errors that occur during coordinate transformation; a transformation sketch is given below.
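A minimal sketch of the coordinate transformation step follows, assuming the pre-stored transformation matrix is a 3x3 planar homography between the image plane and the road plane; the matrix itself would come from calibration and be read from the storage unit 130.

```python
# Minimal sketch: mapping Nx2 image feature points to the world (road-plane)
# coordinate system with a pre-stored 3x3 homography H (assumed known).
import numpy as np

def to_world(points_xy: np.ndarray, H: np.ndarray) -> np.ndarray:
    ones = np.ones((points_xy.shape[0], 1))
    homogeneous = np.hstack([points_xy.astype(float), ones])  # (x, y, 1) rows
    mapped = homogeneous @ H.T                                # apply homography
    return mapped[:, :2] / mapped[:, 2:3]                     # divide by scale w
```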
  • the lane detection apparatus 100 fits the lane in operation S23.
  • the control unit 120 carries out lane fitting in order to express the extracted feature points 510 as a single line 610.
  • the control unit 120 may use any one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform, and spline interpolation in order to extract a straight line or a curved line from the image; a RANSAC sketch is given below.
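Of the methods named above, the following is a minimal RANSAC sketch for fitting a line y = ax + b to the feature points; the iteration count and inlier tolerance are assumed values, not parameters given in this disclosure.

```python
# Minimal RANSAC line-fitting sketch: repeatedly fit a line to two random
# points, count inliers, and refine the best model with least squares.
import numpy as np

def ransac_line(points: np.ndarray, n_iters: int = 200,
                tol: float = 0.1, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        p, q = points[rng.choice(len(points), 2, replace=False)]
        if p[0] == q[0]:
            continue                         # skip vertical sample pairs
        a = (q[1] - p[1]) / (q[0] - p[0])    # slope of the candidate line
        b = p[1] - a * p[0]                  # intercept of the candidate line
        residuals = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = points[residuals < tol]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_inliers = inliers
    if best_inliers is None or len(best_inliers) < 2:
        best_inliers = points                # fall back to all points
    return np.polyfit(best_inliers[:, 0], best_inliers[:, 1], 1)  # refined [a, b]
```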
  • the control unit 120 may carry out lane fitting on the basis of a curve equation. Specifically, the control unit 120 substitutes the extracted feature points 510 into the curve equation to obtain its coefficients and carries out curve fitting on the basis of the curve equation with the obtained coefficients.
  • the curve equation is stored in advance in the storage unit 130 and may be an arbitrary multidimensional equation.
  • for example, when the curve equation is a cubic equation y = ax^3 + bx^2 + cx + d, a represents the rate of change of curvature, b the curvature, c the heading angle, and d the offset. The control unit 120 determines that the feature points 510 form a curve if a or b is not 0; if both a and b are 0, a straight line is detected, in which case c is the heading angle and d is the offset. A sketch of this classification follows below.
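A small sketch of this classification, assuming a least-squares cubic fit of the transformed feature points; the zero tolerance eps is an assumed numerical threshold.

```python
# Minimal sketch: fit y = a*x^3 + b*x^2 + c*x + d to the feature points in
# world coordinates and classify the lane shape from the coefficients.
import numpy as np

def fit_lane_curve(points_world: np.ndarray, eps: float = 1e-3):
    x, y = points_world[:, 0], points_world[:, 1]
    a, b, c, d = np.polyfit(x, y, 3)      # least-squares cubic fit
    if abs(a) < eps and abs(b) < eps:
        shape = "straight"                # then c is the heading angle, d the offset
    else:
        shape = "curve"                   # b: curvature, a: change of curvature
    return shape, (a, b, c, d)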
  • the control unit 120 may carry out curve fitting on the basis of a short-range fitting result.
  • the control unit 120 may first perform short-range fitting with a lower-order polynomial on the short-range portion of the lane and then carry out curve fitting with a higher-order polynomial on the basis of the short-range fitting result.
  • the lane detection apparatus 100 tracks the lane in operation S24.
  • the control unit 120 carries out tracking on the basis of the plurality of feature points 510 corresponding to the fitted lane.
  • calibration refers to calculating the transformation relationship between the camera module 110 and the world coordinate system.
  • the control unit 120 may carry out tracking on all of the plurality of fitted lanes. That is, the control unit 120 may carry out tracking on adjacent lanes present in the driving road in addition to the lane in which the lane detection apparatus 100 is driving.
  • the control unit 120 can thus quickly detect a lane from the existing tracking information, without newly detecting a lane, even when the lane detection apparatus 100 changes lanes.
  • the control unit 120 may estimate and detect lanes on the basis of tracking information about the plurality of lanes (for example, positions of lanes, a lane width, and curve equations of curved lanes).
  • the control unit 120 may store the tracking result in the storage unit 130.
  • the control unit 120 can correct errors on the basis of the stored tracking result even when some errors occur during lane fitting; an illustrative tracking sketch is given below.
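The disclosure does not fix a particular tracking algorithm, so the following is only an assumed sketch: exponential smoothing of the fitted curve coefficients across frames, with one tracker per lane so that adjacent lanes remain available after a lane change. The smoothing factor alpha is an assumption.

```python
# Assumed tracking sketch: per-lane exponential smoothing of fitted
# coefficients across frames; not a method prescribed by this disclosure.

class LaneTracker:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.coeffs = {}                  # lane id -> smoothed coefficient list

    def update(self, lane_id: int, new_coeffs):
        if lane_id not in self.coeffs:
            self.coeffs[lane_id] = list(new_coeffs)
        else:
            old = self.coeffs[lane_id]
            self.coeffs[lane_id] = [
                (1 - self.alpha) * o + self.alpha * n   # blend old and new fits
                for o, n in zip(old, new_coeffs)
            ]
        return self.coeffs[lane_id]
```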
  • the lane detection apparatus 100 may display a lane tracking result in operation S25.
  • the output unit 140 may display the tracking result through the lane detection process as shown in FIG. 7.
  • the output unit 140 may display the image 310 captured by the camera module 110 of the lane detection apparatus 100 and/or a lane tracking result 710 detected by the control unit 120.
  • when the output unit 140 displays the image 310 and the detected lane tracking result 710 at the same time, the image 310 and the detected lane tracking result 710 may be displayed separately at top and bottom or left and right, or the information about the detected lane may be overlaid on the image 310.
  • the output unit 140 may output sound signals related to a lane detection result (for example, an alarm regarding a kind of a detected lane) and functions regarding lane detection (for example, lane departure warning and automatic lane keeping alarm).
  • FIG. 8 is a flowchart illustrating a curve fitting process according to an exemplary embodiment of the present disclosure.
  • the lane detection apparatus 100 carries out short-range fitting in operation S231.
  • the control unit 120 may carry out lane fitting by using a curve equation.
  • when lane fitting is carried out with a single curve equation alone, however, the accuracy of the offset of the detected lane may significantly decrease.
  • the offset represents whether the lane is located at the left or right of the lane detection apparatus 100.
  • the offset refers to the value of the constant term (the zeroth-order coefficient) of the fitted equation. That is, when the constant term has a positive value, the lane is located at the right side of the lane detection apparatus 100; when the constant term has a negative value, the lane is located at the left side of the lane detection apparatus 100.
  • the control unit 120 that detects the extracted feature points 510 as a curve carries out lane fitting on the basis of a curve equation, and thus, a lane detection result becomes a curve fitting result 920 that has a curvature.
  • lane detection with respect to the long-range region 902 may be correct, while lane detection with respect to the short-range region 901, which partially contains a straight lane, becomes inaccurate.
  • while the actual lane 910 is located at the right side of the lane detection apparatus 100 (the starting point of FIG. 9), the starting point of the lane given by the curve fitting result 920 is located at the left side of the lane detection apparatus 100. Therefore, an inaccurate offset occurs, and safe driving becomes difficult.
  • the control unit 120 first performs short-range fitting on the plurality of feature points 510 in order to find an accurate offset. As shown in FIG. 10, the control unit 120 classifies the plurality of feature points 510 present in the short-range region 901 among the extracted feature points 510 and performs lane fitting on the basis of these feature points 510. For example, with reference to FIG. 10, since the short-range region 901 of the actual lane 910 corresponds to a straight road, the short-range fitting result 930 has a straight form. Therefore, the lane detected from the short-range fitting result 930 may be fit to a one-dimensional linear equation.
  • the control unit 120 may equally divide the obtained image 310 into upper and lower parts or classify a region up to an arbitrary portion of the straight road as the short-range region 901.
  • the lane detection apparatus 100 may have a separate operation unit in order to determine the boundary for accurate lane detection.
  • the criteria by which the plurality of lanes present in the short-range region 901 are classified are not particularly limited; for example, a predetermined number of feature points present within an arbitrary distance may be classified.
  • the control unit 120 may divide the road region not only into the short-range region 901 and the long-range region 902 but also into a larger number of regions, performing short-range fitting on each region.
  • the lane detection apparatus 100 determines an offset of an equation from the short-range fitting result in operation S232.
  • the control unit 120 determines the constant that represents the offset of the lane among the coefficients of the equation obtained from the short-range fitting result 930. For example, in the linear equation y = ax + b of the short-range fit, a is the heading angle and b is the offset.
  • the control unit 120 then performs curve fitting on the entire lane on the basis of the determined offset, thereby obtaining the other coefficients of the curve equation.
  • the control unit 120 may carry out curve fitting on the long-range region 902 on the basis of the remaining feature points, excluding the feature points 510 used for the short-range fitting; a sketch combining both stages is given below.
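Putting the steps of FIG. 8 together, the following is a minimal sketch of the two-stage fitting under stated assumptions: a linear short-range fit fixes the offset, and a cubic is then fit to the remaining long-range points with its constant term held at that offset. The split distance and the polynomial orders are illustrative choices, not values from this disclosure.

```python
# Minimal two-stage fitting sketch: short-range linear fit -> offset,
# then an offset-constrained cubic fit on the long-range points.
import numpy as np

def two_stage_fit(points_world: np.ndarray, near_limit: float = 10.0):
    near = points_world[points_world[:, 0] <= near_limit]   # short-range region
    far = points_world[points_world[:, 0] > near_limit]     # long-range region

    # Stage 1: y = heading * x + offset on the short-range points.
    heading, offset = np.polyfit(near[:, 0], near[:, 1], 1)

    # Stage 2: fit y - offset = a*x^3 + b*x^2 + c*x on the long-range points,
    # i.e. least squares with the constant term fixed to the offset.
    x, y = far[:, 0], far[:, 1] - offset
    A = np.stack([x**3, x**2, x], axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, b, c, offset   # y = a*x^3 + b*x^2 + c*x + offset
```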

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)

Abstract

An apparatus for detecting a lane is disclosed, the apparatus comprising: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the fitted lane; and a display unit displaying the tracked lane, wherein the lane fitting may comprise carrying out short-range fitting on the basis of feature points present in a short-range region among the plurality of feature points, determining an offset representing the lateral inclination of the lane on the basis of a result of the short-range fitting, and carrying out curve fitting on the basis of the offset. Also disclosed is a method of detecting a lane, the method comprising: capturing an image; extracting a plurality of feature points from the image; carrying out lane fitting to connect the plurality of feature points with a single line; tracking the fitted lane; and displaying the tracked lane, wherein the carrying out of the lane fitting may comprise: carrying out short-range fitting on the basis of feature points present in a short-range region among the plurality of feature points; determining an offset representing the lateral inclination of the lane on the basis of a result of the short-range fitting; and carrying out curve fitting on the basis of the offset.
PCT/KR2011/009800 2011-08-05 2011-12-19 Apparatus and method for detecting a lane WO2013022153A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020110078361A KR101612822B1 (ko) 2011-08-05 2011-08-05 Lane recognition apparatus and method thereof
KR10-2011-0078361 2011-08-05

Publications (1)

Publication Number Publication Date
WO2013022153A1 (fr) 2013-02-14

Family

ID=47668642

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/009800 WO2013022153A1 (fr) 2011-08-05 2011-12-19 Apparatus and method for detecting a lane

Country Status (2)

Country Link
KR (1) KR101612822B1 (fr)
WO (1) WO2013022153A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180088149A (ko) 2017-01-26 2018-08-03 Samsung Electronics Co., Ltd. Method and apparatus for guiding a vehicle route
KR102564856B1 (ko) 2018-09-07 2023-08-08 Samsung Electronics Co., Ltd. Method and apparatus for detecting a lane
CN116101292B (zh) * 2022-09-08 2023-10-27 Guangzhou Automobile Group Co., Ltd. Method, system and vehicle for obtaining the longitudinal distance between vehicles on a road


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5631581B2 (ja) * 2009-12-01 2014-11-26 Fuji Jukogyo Kabushiki Kaisha Road recognition device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10124687A (ja) * 1996-08-28 1998-05-15 Matsushita Electric Ind Co Ltd Road white line detection method and road white line detection device
KR20040034243A (ko) * 2002-10-21 2004-04-28 Hanyang University Foundation Lane recognition method and system
KR20080004834A (ko) * 2006-07-06 2008-01-10 Samsung Electronics Co., Ltd. Image processing method and system for lane recognition
KR20100044305A (ko) * 2008-10-22 2010-04-30 Mando Corporation Method and apparatus for recognizing a lane
JP2010165142A (ja) * 2009-01-15 2010-07-29 Nissan Motor Co Ltd Road marking recognition device and road marking recognition method for vehicle

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108571975A (zh) * 2017-03-07 2018-09-25 Hyundai Motor Company Vehicle, control method thereof, and autonomous driving system using the vehicle and method
CN108571975B (zh) * 2017-03-07 2023-03-10 Hyundai Motor Company Vehicle, control method thereof, and autonomous driving system using the vehicle and method
DE102018112177A1 (de) * 2018-05-22 2019-11-28 Connaught Electronics Ltd. Lane detection based on lane models
WO2019224103A1 (fr) 2018-05-22 2019-11-28 Connaught Electronics Ltd. Lane detection based on lane models
CN114435402A (zh) * 2022-02-16 2022-05-06 Zhidao Network Technology (Beijing) Co., Ltd. Lane line smoothing method and apparatus, and electronic device
CN114435402B (zh) * 2022-02-16 2024-05-31 Zhidao Network Technology (Beijing) Co., Ltd. Lane line smoothing method and apparatus, and electronic device

Also Published As

Publication number Publication date
KR101612822B1 (ko) 2016-04-15
KR20130015984A (ko) 2013-02-14

Similar Documents

Publication Publication Date Title
KR101647370B1 Traffic information management system using camera and radar
US10183621B2 Vehicular image processing apparatus and vehicular image processing system
WO2021112462A1 Method for estimating three-dimensional coordinate values for each pixel of a two-dimensional image, and method for estimating autonomous driving information using the same
JP5663352B2 Image processing device, image processing method, and image processing program
WO2019225817A1 Vehicle position estimation device, vehicle position estimation method, and computer-readable recording medium storing a computer program programmed to perform the method
TWI534764 Vehicle positioning device and method
WO2020004817A1 Apparatus and method for detecting lane information, and computer-readable recording medium storing a computer program programmed to execute the method
WO2013022153A1 Apparatus and method for detecting a lane
WO2020235734A1 Method for estimating the distance to and position of an autonomous vehicle by using a monoscopic camera
WO2017116134A1 Radar and image fusion system for enforcing road traffic regulations
EP2740103A1 Traffic lane recognition apparatus and associated method
WO2020036295A1 Apparatus and method for acquiring coordinate conversion information
JP6723079B2 Object distance detection device
EP3432265A1 Image processing device, apparatus control system, imaging device, image processing method, and program
JP2015232442A Image processing device and vehicle forward monitoring device
US20200320314A1 Road object recognition method and device using stereo camera
EP3667413A1 Stereo image processing device
JP5951785B2 Image processing device and vehicle forward monitoring device
JP2007316856A Moving object detection device, computer program, and moving object detection method
JP2002321579A Warning information generation method and vehicle side image generation device
KR20190134303A Image recognition apparatus and method thereof
WO2013018961A1 Apparatus and method for detecting a traffic lane
WO2020218716A1 Automatic parking device and automatic parking method
JP2007087203A Collision determination system, collision determination method, and computer program
JP2008286648A Distance measuring device, distance measuring system, and distance measuring method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11870648

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11870648

Country of ref document: EP

Kind code of ref document: A1