WO2013022154A1 - Apparatus and method for detecting lane - Google Patents

Apparatus and method for detecting lane

Info

Publication number
WO2013022154A1
Authority
WO
WIPO (PCT)
Prior art keywords
lane
damaged
section
lanes
image
Prior art date
Application number
PCT/KR2011/009843
Other languages
French (fr)
Inventor
Youngkyung Park
Jeihun Lee
Joongjae LEE
Jonghun Kim
Junoh PARK
Andreas PARK
Chandra Shekhar DHIR
Hyunsoo Kim
Original Assignee
Lg Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2013022154A1 publication Critical patent/WO2013022154A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking

Definitions

  • the present disclosure relates to an apparatus and method for detecting a lane.
  • an apparatus for detecting a lane detects a lane that is included in an arbitrary image input through a camera or received from an external terminal.
  • One of the apparatuses for detecting a lane according to the related art is disclosed in Korean Publication No. 1995-0017509.
  • an object of the present invention is to provide an apparatus and method for detecting a lane that can accurately detect a lane.
  • An aspect of the present invention also provides an apparatus and method for detecting a lane that can continuously display driving lanes on an image by displaying virtual lanes in a damaged lane section on the basis of a lane width between the driving lanes, calculated before the damaged lane section, when the damaged lane section is detected.
  • an apparatus for detecting a lane including: a camera module; a display unit displaying an image captured by the camera module; and a control unit detecting candidate lanes from the image captured by the camera module on the basis of lane feature points, displaying driving lanes of a vehicle among the candidate lanes on the image, and correcting a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.
  • the control unit may display virtual lanes in the damaged lane section on the basis of a lane width calculated before the damaged lane section when the damaged lane section is detected.
  • the apparatus may further include a storage unit storing information corresponding to the lane width, wherein the control unit calculates the lane width in real time or periodically.
  • the control unit may transform the lane feature points to world coordinates and calculate the lane width on the basis of the lane feature points transformed to the world coordinates.
  • the control unit may detect a section in which the lane feature points are not detected in any one of the driving lanes as the damaged lane section.
  • the control unit may detect a section in which the lane feature points corresponding to the driving lanes are temporarily not detected as the damaged lane section.
  • the control unit may generate virtual lanes corresponding to the lane width calculated before the damaged lane section and display the virtual lanes in the damaged lane section when the damaged lane section is detected.
  • the control unit may detect a heading direction of the vehicle and display virtual lanes in the damaged lane section on the basis of the lane width calculated before the damaged lane section and the heading direction when the damaged lane section is detected.
  • the control unit may generate the virtual lanes on the basis of the lane width calculated before the damaged lane section and display the virtual lanes on the image on the basis of the heading direction of the vehicle.
  • a method of detecting a lane including: detecting candidate lanes from an image captured by a camera module on the basis of lane feature points; displaying driving lanes of a vehicle among the candidate lanes on the image; and correcting a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.
  • FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present invention
  • FIG. 2 is an exemplary view illustrating an image captured by a camera according to an exemplary embodiment of the present invention
  • FIG. 3 is an exemplary view illustrating guide lines according to an exemplary embodiment of the present invention.
  • FIG. 4 is an exemplary view illustrating lane feature points according to an exemplary embodiment of the present invention.
  • FIG. 5 is an exemplary view illustrating lane feature points transformed to world coordinates according to an exemplary embodiment of the present invention
  • FIG. 6 is a view illustrating driving lanes according to an exemplary embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a method of detecting a lane according to an exemplary embodiment of the present invention.
  • FIG. 8 is an exemplary view illustrating an image including a damaged lane section to describe an exemplary embodiment of the present invention
  • FIG. 9 is an exemplary view illustrating an image including a different damaged lane section in order to describe an exemplary embodiment of the present invention.
  • FIG. 10 is a view illustrating an image and lanes displayed on a display unit according to an exemplary embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a method of detecting a lane according to another exemplary embodiment of the present invention.
  • the apparatus for detecting a lane as shown in FIG. 1 may be a stand-alone apparatus.
  • the lane detection apparatus of FIG. 1 may be applied to various types of terminals including mobile terminals, telematics terminals, smartphones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), tablet PCs, Wibro terminals, navigation terminals, and audio video navigation (AVN) terminals.
  • FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present invention.
  • an apparatus for detecting a lane (hereinafter, referred to as a lane detection apparatus) 10 according to an exemplary embodiment of the present invention includes a camera module 110; a display unit 130 that displays an image captured by the camera module 110; and a control unit 120 that detects all candidate lanes from the image captured by the camera module 110, displays lanes (for example, a lane adjacent to the left side of the vehicle and a lane adjacent to the right side of the vehicle) adjacent to a vehicle among the detected candidate lanes as the current driving lanes of the vehicle, calculates and stores widths of the driving lanes in real time or periodically, and displays virtual lanes in a damaged lane section on the basis of the calculated lane widths when the damaged lane section is detected from the image. That is, when the damaged lane section is detected, the control unit 120 corrects the damaged lane section on the basis of lane widths between the driving lanes calculated before the detected damaged lane section.
  • the lane detection apparatus 10 may be implemented with a larger number of components than shown in FIG. 1, or the lane detection apparatus 10 may be implemented with a smaller number of components.
  • FIG. 2 is an exemplary view illustrating an image that is captured by a camera according to an exemplary embodiment of the present invention.
  • the camera module 110 receives an image 210 that is captured by a single camera.
  • the camera module 110 may receive the image 210 that includes lanes corresponding to a first lane, a second lane, and a third lane, and a double line of a white or yellow solid line (or a white or yellow solid line and a broken double line).
  • the control unit 120 receives the image 210 through the camera module 110 and extracts a plurality of support points (for example, lane feature points) within the captured image 210 on the basis of predetermined guide lines.
  • line spacing between guide lines 310 is set to be large at the bottom of the image 210, while line spacing between the guide lines 310 is set to get smaller toward the top of the image 210.
  • the change in line spacing between the guide lines 310 may vary according to the designer's choice.
  • the same line spacing may be kept when the data of the image 210 is transformed to world coordinates.
  • the guide lines are not actually displayed on the image, but refer to virtual lines that are used to obtain as uniform point intervals as possible when the support points (feature points) are transformed to world coordinates.
  • the control unit 120 extracts the plurality of support points (lane feature points) on the basis of the predetermined guide lines within the image 210 and displays support points 410, extracted as described above, in the image domain through the display unit 130. That is, the control unit 120 displays support points corresponding to lanes 401 and support points corresponding to a double line 402.
  • the vertical intervals between the plurality of support points, measured with respect to the horizontal axis (x axis), get smaller from the bottom to the top of the display unit 130.
  • the control unit 120 transforms the extracted support points to world coordinates. That is, the control unit 120 may transform the extracted support points to world coordinates by using transformation matrices (for example, homographic matrices) stored in advance in a storage unit 140.
  • the control unit 120 transforms the extracted support points to world coordinates on the basis of the homographic matrix stored in advance in the storage unit 140 and displays a plurality of support points 510, which are transformed to the world coordinates, on the display unit 130.
  • the same interval in the vertical direction between the plurality of support points, which are transformed to the world coordinates, is kept on the horizontal axis.
  • the control unit 120 may detect (or check) a plurality of points corresponding to a curve among the plurality of support points, which are transformed to the world coordinates, on the basis of the plurality of support points transformed to the world coordinates and curve equations stored in advance in the storage unit 140. That is, the control unit 120 substitutes the plurality of support points, transformed to the world coordinates, into the curve equation stored in advance in the storage unit 140 and determines whether the plurality of support points, transformed to the world coordinates, plot a curve according to a result of the substitution.
  • the curve equation may be a quadratic or higher equation. For example, in a cubic equation y = ax³ + bx² + cx + d, a is a curve derivative, b is a curvature of a lane, c is heading of a vehicle, and d is an offset. When both a and b are 0, a straight line is detected, c is heading of a vehicle, and d is an offset.
  • the control unit 120 detects lanes by tracking the plurality of support points, transformed to the world coordinates, or by tracking the plurality of points corresponding to the detected curve.
  • the control unit 120 may calculate curve information following virtual center points of the lane on the basis of the plurality of points corresponding to the detected curve.
  • the curve information being calculated may be used to reduce the effects of camera calibration and enhance lane keeping performance in the world coordinates. That is, the control unit 120 may calculate curve information following virtual center points of the lane by using any one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform method, and a spline interpolation method with respect to the plurality of points corresponding to the detected curve.
  • the control unit 120 may overlap the calculated curve information following the virtual center points of the lane and information about the detected curve with the captured image, which is then displayed on the display unit 130. For example, the control unit 120 transforms (or maps) the calculated lane (curve/straight line) information following the center points of the lane, which is in world coordinates, and the detected lane information to coordinates in the image domain, and overlaps the transformed coordinates with the captured image on the display unit 130.
  • FIG. 6 is a view illustrating driving lanes according to an exemplary embodiment of the present invention.
  • the control unit 120 may select a first lane that is nearest to the left side of the vehicle and a second lane that is nearest to the right side of the vehicle on the basis of a direction in which the vehicle heads among the detected lanes, and display the first lane and the second lane as driving lanes 610 of the vehicle on the image.
  • the control unit 120 transforms (or maps) the detected first and second lanes (610) to coordinates in the image domain and overlaps the coordinates with the image 210.
  • the control unit 120 may directly extract lane feature points (support points) from the image and track the lane feature points to thereby detect the lanes.
  • the camera module 110 includes at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval in the same arbitrary plane of the lane detection apparatus 10, or a single camera, in order to capture lanes of a road in one image.
  • the fixed horizontal interval may be determined in consideration of the distance between eyes of the average person.
  • the camera module 110 may be any camera module that can capture an image.
  • the camera module 110 receives a first image (for example, a left image captured by a left camera included in the one pair of cameras) and a second image (for example, a right image captured by a right camera included in the one pair of cameras) that are captured by the one pair of cameras at the same time.
  • the camera module 110 may be an image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • the camera module 110 may be fixed at a predetermined position of the vehicle (for example, a rear-view mirror of the vehicle) in order to capture an image of the area in front of the vehicle in the direction in which the vehicle heads.
  • the camera module 110 may be fixed at a predetermined position (for example, a side mirror of the vehicle or a rear bumper of the vehicle) in order to capture images of the side and rear of the vehicle.
  • the control unit 120 performs functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 10 (or the vehicle having the lane detection apparatus 10) that is detected by an arbitrary GPS module (not shown) and the detected curve (or lane).
  • the display unit 130 displays various contents including various menu screens by using a user interface and/or a graphic user interface included in the storage unit 140 under the control of the control unit 120.
  • the contents displayed on the display unit 130 include menu screens including various texts or image data (including various types of information data) and data such as icons, list menus, and combo boxes.
  • the display unit 130 includes a 3D Display or a 2D Display.
  • the display unit 130 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT LCD), an Organic Light-Emitting Diode (OLED), a flexible display, and an LED (Light Emitting Diode).
  • the display unit 130 displays a 3D image (or a 2D image) under the control of the control unit 120.
  • Two or more of the display units 130 may exist according to how the lane detection apparatus 10 is embodied.
  • a plurality of display units may be separated from each other or arranged in a single body in one plane (same plane) in the lane detection apparatus 10.
  • the plurality of display units may be arranged in different planes.
  • when the display unit 130 and a sensor for detecting a touch event (hereinafter, a 'touch sensor') have a layered structure (hereinafter referred to as a 'touch screen'), the display unit 130 may serve as an input device as well as an output device.
  • the touch sensor may be formed as a touch film, a touch sheet, a touch pad, or a touch panel.
  • the touch sensor may be configured to convert variations in pressure applied to a particular portion of the display unit 130 or capacitance generated at the particular portion of the display unit 130 into electrical input signals.
  • the touch sensor may be configured to sense pressure applied when being touched as well as touch position and touch area.
  • when touch input is sensed by the touch sensor, a signal (or signals) corresponding thereto is sent to a touch controller (not shown).
  • the touch controller processes the signal (signals) and transmits corresponding data to the control unit 120. In this manner, the control unit 120 is informed which area of the display unit 130 is touched.
  • the display unit 130 may include a proximity sensor.
  • the proximity sensor may be arranged at the inside of the lane detection apparatus 10, which is covered by the touch screen, or around the touch screen.
  • the proximity sensor is a sensor able to detect the presence of objects approaching a predetermined sensing face or nearby objects by an electromagnetic force or infrared radiation without any physical contact.
  • the proximity sensor has a longer life time and higher efficiency than a contact sensor.
  • Examples of the proximity sensor may include a transmission type photo-electric sensor, a direct reflection type photo-electric sensor, a mirror reflection type photo-electric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • when the touch screen is a capacitive touch screen, it is constructed such that the proximity of a pointer is detected through a variation in an electric field according to the proximity of the pointer. In this case, the touch screen may be classified as a proximity sensor.
  • 'proximity touch': the action of the pointer approaching the touch screen without actually touching it, such that the pointer is positioned over the touch screen.
  • 'contact touch': the action of the pointer coming into actual contact with the touch screen.
  • a position at which proximity touch is made on the touch screen by the pointer refers to a position at which the pointer vertically corresponds to the touch screen.
  • the proximity sensor senses proximity touch and a proximity touch pattern (for example, proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, and a proximity touch moving state).
  • Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output to the touch screen.
  • the display unit 130 may receive button input from the user, or receive a command or a control signal according to the user's operation such as touching/scrolling the screen being displayed.
  • the lane detection apparatus 10 may include the storage unit 140 that stores a program for detecting the lanes from the image, as well as lane width information calculated in real time or periodically.
  • the storage unit 140 may further store various menu screens, various user interfaces (UI) and/or graphic user interfaces (GUI).
  • the storage unit 140 may further store transformation matrices (for example, homographic matrices), curve equations, and methods including a least square method.
  • the storage unit 140 may further include data and programs necessary for the lane detection apparatus 10 to operate.
  • the storage unit 140 may include at least one storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or XD memory), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), Random Access Memory (RAM), Static Random Access Memory (SRAM), magnetic memory, a magnetic disk, and an optical disk.
  • the lane detection apparatus 10 may further include a communication module (not shown) that performs communications with an arbitrary terminal or server under the control of the control unit 120.
  • the communication module may include a wired/wireless communication module.
  • the wireless internet technique may include a wireless local area network (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), and wireless mobile broadband service (WMBS).
  • Examples of the short range communication technology may include Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), and ZigBee.
  • the wired communication technology may include Universal Serial Bus (USB) communication.
  • the communication module may include CAN communication, vehicle Ethernet, FlexRay, and a Local Interconnect Network (LIN) in order to perform communications with an arbitrary vehicle having the lane detection apparatus 10.
  • the communication module may transmit, to the arbitrary terminal or server under the control of the control unit 120, a plurality of support points extracted within an arbitrary image, points transformed from the plurality of support points to world coordinates, a plurality of points corresponding to a curve among the points transformed to the world coordinates, and curve information following center points of the lane calculated on the basis of the plurality of points corresponding to the curve.
  • the communication module may receive a first image and a second image that are captured at the same time by a pair of arbitrary stereo cameras transmitted from the arbitrary terminal or server.
  • the lane detection apparatus 10 may further include an input unit (not shown) that includes at least one microphone (not shown) to receive an audio signal.
  • the microphone receives external sound signals (including the user's voice (voice signal or voice information)) in calling mode, recording mode, voice recognition mode, video conference mode, and video calling mode, and processes the external sound signals into electrical voice data.
  • the processed voice data may be output through a voice output unit (not shown) or be transformed such that it can be transmitted to an external terminal through the communication module.
  • various noise reduction algorithms may be implemented in order to remove noise generated when receiving external audio signals.
  • the input unit receives signals according to the user's button operation, or commands or control signals created by operations such as touching/scrolling the screen being displayed.
  • the input unit receives signals corresponding to information input by the user.
  • the input unit may include various devices such as a keyboard, a keypad, a dome switch, a touch pad (resistive/capacitive), a touch screen, a jog shuttle, a jog switch, a mouse, a stylus pen, a touch pen, and a laser pointer.
  • the input unit receives signals corresponding to inputs by the various devices.
  • the lane detection apparatus 10 may further include a voice output unit (not shown) that outputs voice information included in a signal subjected to predetermined signal processing by the control unit 120.
  • the voice output unit may be a speaker.
  • support points (feature points), which become a candidate group of lanes within an image, are extracted and then transformed to world coordinates, and a lane is detected from the transformed world coordinates. Compared to detecting a lane directly from the image, this reduces the likelihood of error accumulation caused by the transfer of calibration errors between the camera information and the world coordinates.
  • in the apparatus and method for detecting a lane, information about a lane detected in world coordinates is displayed, and a warning message is created and output on the basis of this information, thereby increasing accuracy, sensitivity, and user convenience.
  • in the apparatus and method for detecting a lane, when a damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of a lane width (the width between the driving lanes) calculated before the damaged lane section, thereby offering drivers convenience.
  • FIG. 7 is a flowchart illustrating a method of detecting a lane according to an exemplary embodiment of the present invention.
  • the camera module 110 receives a first image and a second image that are captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval on the same central axis in the same plane of the lane detection apparatus 10 or an image that is captured by a single camera.
  • the first image may be a left image captured by a left camera included in the one pair of cameras
  • the second image may be a right image captured by a right camera included in the one pair of cameras.
  • the camera module 110 may receive any one of the first image and the second image that are captured by the one pair of cameras.
  • the camera module 110 receives the image 210 that is captured by the single camera.
  • the camera module 110 may receive the image 210 that includes lanes corresponding to a first lane, a second lane, and a third lane and/or a double line of a white or yellow solid line (or a white or yellow solid line and a broken double line).
  • the control unit 120 receives the image 210 through the camera module 110 in operation S11 and extracts a plurality of support points (for example, feature points of a lane) within the captured image 210 on the basis of guide lines set beforehand.
  • line spacing between the guide lines 310 is set to be large at the bottom of the image 210, while line spacing between the guide lines 310 is set to get smaller toward the top of the image 210.
  • the change in line spacing between the guide lines 310 may vary according to the designer's choice.
  • the same line spacing may be kept when data of the image 210 is transformed to world coordinates.
  • the guide lines are not actually displayed on the image, but refer to virtual lines that are used to obtain as uniform point intervals as possible when the support points (feature points) are transformed to world coordinates.
  • the control unit 120 extracts a plurality of support points (lane feature points) within the image 210 on the basis of the guide lines set beforehand and displays the extracted support points 510 on the display unit 130. That is, the control unit 120 displays support points corresponding to lanes 501 and support points corresponding to a double line 502 in the image domain.
  • the vertical intervals between the plurality of support points, measured with respect to the horizontal axis (x axis), get smaller from the bottom to the top of the display unit 130.
  • the control unit 120 transforms the extracted support points into world coordinates. That is, the control unit 120 may transform the extracted support points to world coordinates by using transformation matrices (including, for example, homographic matrices) stored in advance in the storage unit 140.
  • the control unit 120 transforms the extracted support points to world coordinates on the basis of the homographic matrix stored in advance in the storage unit 140 and displays the plurality of support points (610), transformed to the world coordinates, on the display unit 130.
  • intervals in the vertical direction between the plurality of support points, transformed to the world coordinates, on the basis of the horizontal axis are kept the same.
  • the control unit 120 may detect (or check) a plurality of points corresponding to a curve among the plurality of support points transformed to the world coordinates on the basis of the plurality of support points, transformed to the world coordinates, and the curve equation stored in advance in the storage unit 140. That is, the control unit 120 may substitute the plurality of support points, transformed to the world coordinates, into the curve equation stored in advance in the storage unit 140 and determine (or check) whether the plurality of support points, transformed to the world coordinates, plot a curve or not according to a result of the substitution.
  • the curve equation may be a quadratic or higher equation.
  • when a is 0 in the cubic equation, b is a curvature of a lane, c is heading of a vehicle, and d is an offset.
  • the control unit 120 detects lanes by tracking the plurality of support points, transformed to the world coordinates, or by tracking the plurality of points corresponding to the detected curve.
  • the control unit 120 detects, as driving lanes of the vehicle, a first lane that is nearest to the left side of the vehicle and a second lane that is nearest to the right side of the vehicle, on the basis of a direction in which the vehicle heads, among the detected lanes in operation S12.
  • the control unit 120 displays the detected driving lanes on the image. For example, the control unit 120 transforms (or maps) the detected driving lanes to individual coordinates in the image domain and overlaps the transformed coordinates with the image 210.
  • the control unit 120 calculates a lane width between the driving lanes in real time or periodically in operation S13 and stores the calculated lane width in the storage unit 140. For example, the control unit 120 calculates the lane width on the basis of pixels (for example, pixels corresponding to one straight line connecting the driving lanes) located between the driving lanes.
  • the control unit 120 may calculate the lane width between the driving lanes on the basis of the lane feature points transformed to the world coordinates.
  • the control unit 120 determines whether a damaged lane section is detected from the image in operation S14. For example, the control unit 120 may detect a section in which the lane feature points are not detected in any one of the driving lanes as the damaged lane section.
  • FIG. 8 is an exemplary view illustrating an image 810 that includes a damaged lane section in order to describe an exemplary embodiment of the present invention.
  • the control unit 120 may detect the section 801, in which the lane feature points are not detected, as a damaged lane section 801.
  • the control unit 120 may also detect a section in which the lane feature points corresponding to the driving lanes 610 are temporarily not detected as the damaged lane section.
  • FIG. 9 is an exemplary view illustrating an image 910 that includes a different damaged lane section in order to describe an exemplary embodiment of the present invention.
  • the control unit 120 may detect sections 801 and 901, in which lane feature points corresponding to the driving lanes 610 are temporarily not detected, as the damaged lane section.
  • the control unit 120 corrects the damaged lane section on the basis of the calculated lane width when the damaged lane section is detected in operation S15. For example, when the damaged lane sections 801 and 901 are detected, the control unit 120 reads a lane width between the driving lanes, which is calculated before the detected damaged lane section, from the storage unit 140 and displays virtual lanes in the damaged lane section on the basis of the read lane width. That is, when the damaged lane sections 801 and 901 are detected, the control unit 120 generates virtual lanes corresponding to the lane width between the driving lanes, which is calculated before the detected damaged lane section, and displays the virtual lanes in the damaged lane section.
  • FIG. 10 is a view illustrating an image and lanes displayed on a display unit according to an exemplary embodiment of the present invention.
  • the control unit 120 reads a lane width between the driving lanes, which is calculated before the detected damaged lane section, from the storage unit 140, and displays virtual lanes in the damaged lane section on the basis of the read lane width, so that driving lanes 1010 can be continuously displayed on the image to thereby offer drivers convenience.
  • the control unit 120 performs functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 10 (or a vehicle having the lane detection apparatus 10) that is detected by an arbitrary GPS module (not shown), and the detected curve (or lane).
  • in the apparatus and method for detecting a lane, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of the lane width between the driving lanes, which is calculated before the detected damaged lane section, so that driving lanes can be continuously displayed on the image, thereby offering drivers convenience.
  • FIG. 11 is a flowchart illustrating a method of detecting a lane according to another exemplary embodiment of the present invention.
  • the camera module 110 receives a first image and a second image that are captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval on the same central axis in the same plane of the lane detection apparatus 10 or an image that is captured by a single camera.
  • the first image may be a left image captured by a left camera included in the one pair of cameras
  • the second image may be a right image captured by a right camera included in the one pair of cameras.
  • the camera module 110 may receive any one of the first image and the second image that are captured by the one pair of cameras.
  • the control unit 120 receives the image 210 through the camera module 110 in operation S21 and extracts a plurality of support points (for example, feature points of a lane) within the captured image 210 on the basis of guide lines set beforehand.
  • the control unit 120 transforms the extracted support points to world coordinates. That is, the control unit 120 may transform the extracted support points to world coordinates by using transformation matrices (for example, homographic matrices) stored in advance in the storage unit 140.
  • the control unit 120 may detect (or check) a plurality of points corresponding to a curve among the plurality of support points, which are transformed to the world coordinates, on the basis of the plurality of support points transformed to the world coordinates and curve equations stored in advance in the storage unit 140.
  • the control unit 120 detects lanes by tracking the plurality of support points, transformed to the world coordinates, or by tracking the plurality of points corresponding to the detected curve.
  • the control unit 120 detects, as driving lanes of the vehicle, a first lane that is nearest to the left side of the vehicle and a second lane that is nearest to the right side of the vehicle, on the basis of a direction in which the vehicle heads, among the detected lanes in operation S22.
  • the control unit 120 displays the detected driving lanes on the image. For example, the control unit 120 transforms (or maps) the detected driving lanes to coordinates in the image domain and overlaps the coordinates with the image 210.
  • the control unit 120 calculates a lane width between the driving lanes in real time or periodically in operation S23 and stores the calculated lane width in the storage unit 140. For example, the control unit 120 calculates the lane width on the basis of pixels (for example, pixels corresponding to one straight line connecting the driving lanes) located between the driving lanes.
  • the control unit 120 determines whether a damaged lane section is detected from the image in operation S24. When there is a section 801 in which the lane feature points are not detected in any one of the driving lanes 610, the control unit 120 may detect the section 801 as a damaged lane section. The control unit 120 may also detect a section (or sections) 801 and 901 in which lane feature points corresponding to the driving lanes 610 are temporarily not detected as the damaged lane section.
  • the control unit 120 detects a heading direction of the vehicle in operation S25.
  • the control unit 120 may receive a heading angle of the vehicle from an Electronic Control Unit (ECU) of the vehicle through a vehicle interface, or detect a heading direction of the vehicle through a heading angle sensor additionally installed in the apparatus for detecting a lane.
  • the control unit 120 displays the virtual lanes in the damaged lane section on the basis of the calculated lane width and the heading direction of the vehicle (corrects the damaged lane section) in operation S26. For example, when the damaged lane sections 801 and 901 are detected, the control unit 120 reads the lane width between the driving lanes, which is calculated before the detected damaged lane section, from the storage unit 140, generates virtual lanes corresponding to the read lane width, and displays the virtual lanes on the basis of the heading direction of the vehicle.
  • in the apparatus and method for detecting a lane, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of the lane width between the driving lanes, which is calculated before the detected damaged lane section, and the heading direction of the vehicle, so that the driving lanes can be continuously and accurately displayed on the image.
  • in the apparatus and method for detecting a lane, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of a lane width between driving lanes, calculated before the detected damaged lane section, so that the driving lanes can be continuously displayed on the image to thereby offer drivers convenience.
  • in the apparatus and method for detecting a lane, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of a lane width between driving lanes, calculated before the detected damaged lane section, and a heading direction of the vehicle, so that the driving lanes can be continuously and accurately displayed on the image (a minimal sketch of this correction flow is given after this list).
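
The damaged-section handling described in the bullets above (operations S13-S15 and S23-S26) reduces to: keep the most recently calculated lane width, and when feature points of one driving lane disappear, synthesize a virtual lane by offsetting the surviving lane by that width, optionally taking the vehicle heading into account. The sketch below illustrates this flow in Python; the function name, the NaN convention for missing feature points, and the heading-based offset are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def correct_damaged_section(left_pts, right_pts, last_width, heading=0.0):
    """Fill a damaged lane section with virtual lane points.

    left_pts / right_pts: N x 2 arrays of world-coordinate lane points,
    with NaN rows where lane feature points were not detected (the
    damaged section).  last_width: lane width stored before the damaged
    section (operation S13/S23).  heading: vehicle heading angle in
    radians, 0 meaning straight ahead along the world y-axis.
    """
    # unit vector perpendicular to the travel direction; with heading=0
    # this is a purely lateral (x-axis) offset
    normal = np.array([np.cos(heading), -np.sin(heading)])
    left, right = left_pts.copy(), right_pts.copy()
    for pts, other, sign in ((left, right, -1.0), (right, left, +1.0)):
        missing = np.isnan(pts[:, 0])
        # virtual lane: offset the surviving lane by the stored width;
        # rows where both lanes are missing stay NaN
        pts[missing] = other[missing] + sign * last_width * normal
    return left, right
```

Before the damage, the stored width can be maintained in real time, for example as `last_width = np.nanmedian(np.linalg.norm(right_pts - left_pts, axis=1))` over rows where both lanes are visible, matching the periodic width calculation described in operations S13/S23.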

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is an apparatus and method for detecting a lane that can accurately detect a driving lane of a vehicle. An apparatus for detecting a lane according to an aspect of the invention may include: a camera module; a display unit displaying an image captured by the camera module; and a control unit detecting candidate lanes from the image captured by the camera module on the basis of lane feature points, displaying driving lanes of a vehicle among the candidate lanes on the image, and correcting a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.

Description

APPARATUS AND METHOD FOR DETECTING LANE
The present disclosure relates to an apparatus and method for detecting a lane.
In general, an apparatus for detecting a lane detects a lane that is included in an arbitrary image input through a camera or received from an external terminal. One of the apparatuses for detecting a lane according to the related art is disclosed in Korean Publication No. 1995-0017509.
Therefore, an object of the present invention is to provide an apparatus and method for detecting a lane that can accurately detect a lane.
An aspect of the present invention also provides an apparatus and method for detecting a lane that can continuously display driving lanes on an image by displaying virtual lanes in a damaged lane section on the basis of a lane width between the driving lanes, calculated before the damaged lane section, when the damaged lane section is detected.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided an apparatus for detecting a lane including: a camera module; a display unit displaying an image captured by the camera module; and a control unit detecting candidate lanes from the image captured by the camera module on the basis of lane feature points, displaying driving lanes of a vehicle among the candidate lanes on the image, and correcting a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.
The control unit may display virtual lanes in the damaged lane section on the basis of a lane width calculated before the damaged lane section when the damaged lane section is detected.
The apparatus may further include a storage unit storing information corresponding to the lane width, wherein the control unit calculates the lane width in real time or periodically.
The control unit may transform the lane feature points to world coordinates and calculate the lane width on the basis of the lane feature points transformed to the world coordinates.
The control unit may detect a section in which the lane feature points are not detected in any one of the driving lanes as the damaged lane section.
The control unit may detect a section in which the lane feature points corresponding to the driving lanes are temporarily not detected as the damaged lane section.
The control unit may generate virtual lanes corresponding to the lane width calculated before the damaged lane section and display the virtual lanes in the damaged lane section when the damaged lane section is detected.
The control unit may detect a heading direction of the vehicle and display virtual lanes in the damaged lane section on the basis of the lane width calculated before the damaged lane section and the heading direction when the damaged lane section is detected.
The control unit may generate the virtual lanes on the basis of the lane width calculated before the damaged lane section and display the virtual lanes on the image on the basis of the heading direction of the vehicle.
According to another aspect of the present invention, there is provided a method of detecting a lane including: detecting candidate lanes from an image captured by a camera module on the basis of lane feature points; displaying driving lanes of a vehicle among the candidate lanes on the image; and correcting a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.
FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present invention;
FIG. 2 is an exemplary view illustrating an image captured by a camera according to an exemplary embodiment of the present invention;
FIG. 3 is an exemplary view illustrating guide lines according to an exemplary embodiment of the present invention;
FIG. 4 is an exemplary view illustrating lane feature points according to an exemplary embodiment of the present invention;
FIG. 5 is an exemplary view illustrating lane feature points transformed to world coordinates according to an exemplary embodiment of the present invention;
FIG. 6 is a view illustrating driving lanes according to an exemplary embodiment of the present invention;
FIG. 7 is a flowchart illustrating a method of detecting a lane according to an exemplary embodiment of the present invention;
FIG. 8 is an exemplary view illustrating an image including a damaged lane section to describe an exemplary embodiment of the present invention;
FIG. 9 is an exemplary view illustrating an image including a different damaged lane section in order to describe an exemplary embodiment of the present invention;
FIG. 10 is a view illustrating an image and lanes displayed on a display unit according to an exemplary embodiment of the present invention; and
FIG. 11 is a flowchart illustrating a method of detecting a lane according to another exemplary embodiment of the present invention.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims. Technical terms used in this specification are used merely to illustrate specific embodiments, and it should be understood that they are not intended to limit the present disclosure. As far as they are not defined differently, all terms used herein, including technical or scientific terms, have the same meaning as those generally understood by an ordinary person skilled in the art to which the present disclosure belongs, and should not be construed in an excessively comprehensive or excessively restricted meaning. In addition, if a technical term used in the description of the present disclosure is an erroneous term that fails to clearly express the idea of the present disclosure, it should be replaced by a technical term that can be properly understood by the person skilled in the art. In addition, general terms used in the description of the present disclosure should be construed according to definitions in dictionaries or according to their context, and should not be construed as having an excessively restrained meaning.
A singular representation may include a plural representation as far as it represents a definitely different meaning from the context. Terms such as 'comprise' or 'include' used herein should be understood as indicating the existence of several components or several steps disclosed in the specification, and it may also be understood that some of the components or steps may not be included or that additional components or steps may further be included.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings, and like reference numerals are used for referring to the same or similar elements in the description and drawings.
Moreover, detailed descriptions related to well-known functions or configurations will be ruled out in order not to unnecessarily obscure subject matters of the present invention. Also, the accompanying drawings are given for easy understanding of the preferred embodiments of the present invention, and should not be construed as limiting the spirit of the present invention disclosed in the present disclosure.
Hereinafter, an apparatus for detecting a lane according to an exemplary embodiment of the present invention will be described with reference to FIG. 1. Here, the apparatus for detecting a lane as shown in FIG. 1 may be a stand-alone apparatus. Also, the lane detection apparatus of FIG. 1 may be applied to various types of terminals including mobile terminals, telematics terminals, smartphones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), tablet PCs, Wibro terminals, navigation terminals, and audio video navigation (AVN) terminals.
FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present invention.
As shown in FIG. 1, an apparatus for detecting a lane (hereinafter, referred to as a lane detection apparatus) 10 according to an exemplary embodiment of the present invention includes a camera module 110; a display unit 130 that displays an image captured by the camera module 110; and a control unit 120 that detects all candidate lanes from the image captured by the camera module 110, displays lanes (for example, a lane adjacent to the left side of the vehicle and a lane adjacent to the right side of the vehicle) adjacent to a vehicle among the detected candidate lanes as the current driving lanes of the vehicle, calculates and stores widths of the driving lanes in real time or periodically, and displays virtual lanes in a damaged lane section on the basis of the calculated lane widths when the damaged lane section is detected from the image. That is, when the damaged lane section is detected, the control unit 120 corrects the damaged lane section on the basis of lane widths between the driving lanes calculated before the detected damaged lane section.
Not all of the components of the lane detection apparatus 10 as shown in FIG. 1 are essential components. The lane detection apparatus 10 may be implemented with a larger number of components than shown in FIG. 1, or with a smaller number of components.
FIG. 2 is an exemplary view illustrating an image that is captured by a camera according to an exemplary embodiment of the present invention.
As shown in FIG. 2, the camera module 110 receives an image 210 that is captured by a single camera. For example, the camera module 110 may receive the image 210 that includes lanes corresponding to a first lane, a second lane, and a third lane, and a double line of a white or yellow solid line (or a white or yellow solid line and a broken double line).
The control unit 120 receives the image 210 through the camera module 110 and extracts a plurality of support points (for example, feature points of lanes) within the captured image 210 on the basis of predetermined guide lines. At this time, as shown in FIG. 3, in terms of the guide lines, when the bottom of the image 210 is transformed to world coordinates, it shows a nearby area, while when the middle and top of the image 210 are transformed to world coordinates, they show a distant area. Therefore, in order to obtain as uniform point intervals as possible when data of the image 210 is transformed to world coordinates, line spacing between guide lines 310 is set to be large at the bottom of the image 210, while the line spacing is set to get smaller toward the top of the image 210. Here, the change in line spacing between the guide lines 310 may vary according to the designer's choice. In addition, the same line spacing may be kept when the data of the image 210 is transformed to world coordinates. The guide lines are not actually displayed on the image, but refer to virtual lines that are used to obtain as uniform point intervals as possible when the support points (feature points) are transformed to world coordinates.
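One way to realize such guide lines is to sample distances uniformly in world coordinates and map them back into the image with the inverse of the image-to-world homography, which automatically yields wide spacing at the bottom of the image and narrow spacing toward the top. The sketch below illustrates this under assumed values: the homography `H` and the distance range are placeholders for illustration, not taken from the patent.

```python
import numpy as np
import cv2

# Hypothetical image-to-world homography (placeholder values; in the
# patent this would come from camera calibration and the storage unit).
H = np.array([[1.2e-2, 0.0,    -3.8],
              [0.0,    2.4e-2, -5.1],
              [0.0,    1.6e-3,  1.0]])

def guide_line_rows(n_lines=12, near_m=2.0, far_m=40.0):
    """Image rows whose world-space distances are evenly spaced.

    Uniform world spacing projects to wide row spacing near the bottom
    of the image and narrow spacing toward the top, as described above.
    """
    H_inv = np.linalg.inv(H)  # world-to-image mapping
    rows = []
    for d in np.linspace(near_m, far_m, n_lines):
        pt = np.array([[[0.0, d]]], dtype=np.float64)  # straight ahead at distance d
        x, y = cv2.perspectiveTransform(pt, H_inv)[0, 0]
        rows.append(int(round(y)))
    return rows
```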
As shown in FIG. 4, the control unit 120 extracts the plurality of support points (lane feature points) on the basis of the predetermined guide lines within the image 210 and displays the support points 410, extracted as described above, in the image domain through the display unit 130. That is, the control unit 120 displays support points corresponding to lanes 401 and support points corresponding to a double line 402. Here, the vertical intervals between the rows of support points (taken along the horizontal (x) axis) become smaller from the bottom to the top of the display unit 130.
The control unit 120 transforms the extracted support points to world coordinates. That is, the control unit 120 may transform the extracted support points to world coordinates by using transformation matrices (for example, homographic matrices) stored in advance in a storage unit 140.
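As a rough sketch of this step (the matrix values below are placeholders; the actual homography comes from camera calibration and is stored in advance), image-domain support points can be mapped to world coordinates by multiplying their homogeneous form with a 3x3 homography matrix:

    import numpy as np

    # Assumed/placeholder 3x3 homography; in the embodiment it is stored
    # in advance in the storage unit 140.
    H = np.array([[0.01, 0.0,   -3.2],
                  [0.0,  0.02,  -4.8],
                  [0.0,  0.001,  1.0]])

    def to_world(points_img, H):
        """Map Nx2 image-domain points to Nx2 world coordinates."""
        pts = np.hstack([points_img, np.ones((len(points_img), 1))])  # homogeneous
        w = (H @ pts.T).T
        return w[:, :2] / w[:, 2:3]  # perspective divide

    support_points = np.array([[320.0, 470.0], [331.0, 430.0], [339.0, 400.0]])
    print(to_world(support_points, H))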
In one example, as shown in FIG. 5, the control unit 120 transforms the extracted support points to world coordinates on the basis of the homographic matrix stored in advance in the storage unit 140 and displays a plurality of support points 510, transformed to the world coordinates, on the display unit 130. Here, the vertical intervals between the support points transformed to the world coordinates are kept the same along the horizontal axis.
The control unit 120 may detect (or check) a plurality of points corresponding to a curve among the plurality of support points transformed to the world coordinates, on the basis of those support points and a curve equation stored in advance in the storage unit 140. That is, the control unit 120 substitutes the plurality of support points, transformed to the world coordinates, into the curve equation stored in advance in the storage unit 140 and determines whether the points plot a curve according to a result of the substitution. Here, the curve equation may be a quadratic or higher-order equation.
The control unit 120 substitutes the plurality of support points, transformed to the world coordinates, into a quadratic curve equation (for example, y = ax^2 + bx + c, where a is a curvature, b is a gradient (or heading), and c is an offset) stored in advance in the storage unit 140, and determines that the support points make a straight line if a = 0 and that the support points plot a curve if a ≠ 0.
The control unit 120 may substitute the plurality of support points, transformed to the world coordinates, into a cubic curve equation (for example, y = ax^3 + bx^2 + cx + d, where a is a curve derivative, b is a curvature, c is a heading, and d is an offset) stored in advance in the storage unit 140 and determine whether the support points plot a curve. In the cubic equation, when a is 0, b is the curvature of the lane, c is the heading of the vehicle, and d is an offset. When both a and b are 0, a straight line is detected, c is the heading of the vehicle, and d is an offset.
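In practice a fitted leading coefficient is rarely exactly zero, so a tolerance is needed; the sketch below (illustrative only, with an assumed tolerance eps) fits the quadratic model to world-coordinate points by least squares and classifies the result:

    import numpy as np

    def classify_lane(world_pts, eps=1e-3):
        """Fit y = a*x^2 + b*x + c to Nx2 world points (a ~ curvature,
        b ~ heading, c ~ offset); eps is an assumed zero-tolerance."""
        x, y = world_pts[:, 0], world_pts[:, 1]
        a, b, c = np.polyfit(x, y, 2)
        return ("straight" if abs(a) < eps else "curve"), (a, b, c)

    pts = np.array([[0, 0.0], [5, 0.1], [10, 0.45], [15, 1.0], [20, 1.8]])
    print(classify_lane(pts))  # nonzero curvature -> "curve"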
The control unit 120 detects lanes by tracking the plurality of support points, transformed to the world coordinates, or by tracking the plurality of points corresponding to the detected curve.
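The embodiment does not detail the tracking itself; purely as an assumed illustration, each support point in the current frame could be associated with the nearest point from the previous frame and kept only if its displacement stays within a gate:

    import math

    def track_points(prev_pts, curr_pts, gate=1.0):
        """Nearest-neighbor association between frames; gate (meters)
        is an assumed threshold, not a value from the embodiment."""
        tracked = []
        for cx, cy in curr_pts:
            best = min(prev_pts, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
            if math.hypot(best[0] - cx, best[1] - cy) <= gate:
                tracked.append((cx, cy))
        return tracked

    prev = [(0.0, -1.7), (5.0, -1.7), (10.0, -1.6)]
    curr = [(0.5, -1.7), (5.5, -1.65), (10.5, -1.55), (30.0, 4.0)]
    print(track_points(prev, curr))  # the outlier (30.0, 4.0) is dropped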
The control unit 120 may calculate curve information following virtual center points of the lane on the basis of the plurality of points corresponding to the detected curve. The calculated curve information may be used to reduce the effects of camera calibration error and to enhance lane keeping performance in the world coordinates. That is, the control unit 120 may calculate the curve information following the virtual center points of the lane by using any one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform, and spline interpolation with respect to the plurality of points corresponding to the detected curve.
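The embodiment names the candidate fitting methods without fixing one; as one hedged realization (the pairing and averaging of left/right points is an assumption), the virtual center points can be formed as midpoints of the two boundaries and the quadratic model fitted through them by least squares:

    import numpy as np

    def center_curve(left_pts, right_pts, deg=2):
        """Average matched left/right world points into virtual center
        points, then least-squares fit a degree-deg polynomial."""
        n = min(len(left_pts), len(right_pts))
        mid = (left_pts[:n] + right_pts[:n]) / 2.0  # virtual center points
        return mid, np.polyfit(mid[:, 0], mid[:, 1], deg)

    left = np.array([[0.0, -1.6], [5.0, -1.5], [10.0, -1.2], [15.0, -0.8]])
    right = np.array([[0.0, 1.9], [5.0, 2.0], [10.0, 2.3], [15.0, 2.7]])
    mid, coeffs = center_curve(left, right)
    print(mid, coeffs)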
The control unit 120 may overlap the calculated curve information following the virtual center points of the lane, together with information about the detected curve, with the captured image, which is then displayed on the display unit 130. For example, the control unit 120 transforms (or maps) the calculated lane (curve/straight line) information following the center points of the lane, expressed in world coordinates, and the detected lane information to coordinates in the image domain, and overlaps the transformed coordinates with the captured image on the display unit 130.
FIG. 6 is a view illustrating driving lanes according to an exemplary embodiment of the present invention.
As shown in FIG. 6, the control unit 120 may select a first lane that is the most adjacent to the left side of the vehicle and a second lane that is the most adjacent to the right side of the vehicle on the basis of a direction in which the vehicle heads among the detected lanes, and display the first lane and the second lane as driving lanes 610 of the vehicle on the image. For example, the control unit 120 transforms (or maps) the detected first and second lanes (610) to coordinates in the image domain and overlaps the coordinates with the image 210.
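A sketch of this selection (the sign convention for lateral offsets is assumed): among all detected lane boundaries, take the one with the largest negative offset and the one with the smallest positive offset relative to the vehicle:

    def select_driving_lanes(lane_offsets):
        """lane_offsets: lateral offsets (m) of detected boundaries at the
        vehicle position; negative = left of the vehicle, positive = right.
        Returns the nearest left and nearest right boundary offsets."""
        left = max((o for o in lane_offsets if o < 0), default=None)
        right = min((o for o in lane_offsets if o > 0), default=None)
        return left, right

    # Boundaries of a three-lane road, vehicle in the middle lane:
    print(select_driving_lanes([-5.3, -1.7, 1.8, 5.4]))  # (-1.7, 1.8)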
The control unit 120 may directly extract lane feature points (support points) from the image and track the lane feature points to thereby detect the lanes.
The camera module 110 includes at least one pair of cameras (for example, stereo or stereoscopic cameras) separated by a fixed horizontal interval in the same plane of the lane detection apparatus 10, or a single camera, in order to capture the lanes of a road in one image. At this time, the fixed horizontal interval may be determined in consideration of the distance between the eyes of an average person. In addition, the camera module 110 may be any camera module that can capture an image.
The camera module 110 receives a first image (for example, a left image captured by a left camera included in the one pair of cameras) and a second image (for example, a right image captured by a right camera included in the one pair of cameras) that are captured by the one pair of cameras at the same time.
The camera module 110 may include an image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
When the lane detection apparatus 10 is installed in a vehicle, the camera module 110 may be fixed at a predetermined position (for example, near the rear-view mirror of the vehicle) in order to capture an image of the area in front of the vehicle in the direction in which the vehicle heads. The camera module 110 may also be fixed at a predetermined position (for example, a side mirror or the rear bumper of the vehicle) in order to capture images of the side and rear of the vehicle.
The control unit 120 performs functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 10 (or the vehicle having the lane detection apparatus 10) that is detected by an arbitrary GPS module (not shown) and the detected curve (or lane).
The display unit 130 displays various contents including various menu screens by using a user interface and/or a graphic user interface included in the storage unit 140 under the control of the control unit 120. Here, the contents displayed on the display unit 130 include menu screens including various texts or image data (including various types of information data) and data such as icons, list menus, and combo boxes.
The display unit 130 includes a 3D display or a 2D display. In addition, the display unit 130 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light-Emitting Diode (OLED) display, a flexible display, and a Light-Emitting Diode (LED) display.
The display unit 130 displays a 3D image (or a 2D image) under the control of the control unit 120.
Two or more of the display units 130 may exist according to how the lane detection apparatus 10 is embodied. For example, a plurality of display units may be separated from each other or arranged in a single body in one plane (same plane) in the lane detection apparatus 10. Alternatively, the plurality of display units may be arranged in different planes.
When the display unit 130 and a sensor for detecting a touch event (hereinafter, referred to as a 'touch sensor') have a layered structure (hereinafter, referred to as a 'touch screen'), the display unit 130 may serve as an input device as well as an output device. For example, the touch sensor may be formed as a touch film, a touch sheet, a touch pad, or a touch panel.
The touch sensor may be configured to convert variations in pressure applied to a particular portion of the display unit 130 or capacitance generated at the particular portion of the display unit 130 into electrical input signals. In addition, the touch sensor may be configured to sense pressure applied when being touched as well as touch position and touch area. When touch input is received in association with the touch sensor, a signal (signals) corresponding thereto is sent to a touch controller (not shown). The touch controller processes the signal (signals) and transmits corresponding data to the control unit 120. In this manner, the control unit 120 is informed which area of the display unit 130 is touched.
The display unit 130 may include a proximity sensor. In addition, the proximity sensor may be arranged at the inside of the lane detection apparatus 10, which is covered by the touch screen, or around the touch screen.
The proximity sensor is a sensor able to detect the presence of objects approaching a predetermined sensing face or nearby objects by an electromagnetic force or infrared radiation without any physical contact. The proximity sensor has a longer life time and higher efficiency than a contact sensor. Examples of the proximity sensor may include a transmission type photo-electric sensor, a direct reflection type photo-electric sensor, a mirror reflection type photo-electric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is a capacitive touch screen, the capacitive touch screen is constructed such that proximity of a pointer is detected through a variation in an electric field according to the proximity of the pointer. In this instance, the touch screen (touch sensor) may be classified as a proximity sensor.
The action of the pointer approaching the touch screen and hovering over it without actually touching it may be referred to as a 'proximity touch', while the action of the pointer coming into actual contact with the touch screen may be referred to as a 'contact touch'. The position at which a proximity touch is made by the pointer is the position at which the pointer vertically corresponds to the touch screen.
In addition, the proximity sensor senses proximity touch and a proximity touch pattern (for example, proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, and a proximity touch moving state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output to the touch screen.
As such, when the display unit 130 is used as an input device, the display unit 130 may receive button input from the user, or receive a command or a control signal according to the user's operation, such as touching or scrolling the screen being displayed.
The lane detection apparatus 10 according to the exemplary embodiment of the present invention may include the storage unit 140, which stores a program for detecting the image and the lanes, as well as the lane width information calculated in real time or periodically.
The storage unit 140 may further store various menu screens, various user interfaces (UI) and/or graphic user interfaces (GUI).
The storage unit 140 may further store transformation matrices (for example, homographic matrices), curve equations, and methods including a least square method.
The storage unit 140 may further include data and programs necessary for the lane detection apparatus 10 to operate.
The storage unit 140 may include at least one storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Read-Only Memory (ROM), Electrically Erasable Programmable Read-only Memory (EEPROM), Programmable Read-only Memory (PROM), Random Access Memory (RAM), Static Random Access Memory (SRAM), magnetic memory, magnetic disk, and optical disk.
The lane detection apparatus 10 may further include a communication module (not shown) that performs communications with an arbitrary terminal or server under the control of the control unit 120. At this time, the communication module may include a wired/wireless communication module. Here, examples of the wireless Internet technology may include wireless local area network (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), and wireless mobile broadband service (WMBS). Examples of the short-range communication technology may include Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), and ZigBee. In addition, the wired communication technology may include Universal Serial Bus (USB) communication.
The communication module may support CAN communication, vehicle Ethernet, FlexRay, and a Local Interconnect Network (LIN) in order to perform communications with an arbitrary vehicle having the lane detection apparatus 10.
The communication module may transmit, under the control of the control unit 120, a plurality of support points extracted within an arbitrary image, points obtained by transforming the support points to world coordinates, a plurality of points corresponding to a curve among the points transformed to the world coordinates, and curve information following the center points of the lane calculated on the basis of the plurality of points corresponding to the curve, to the arbitrary terminal or server.
The communication module may receive a first image and a second image that are captured at the same time by a pair of arbitrary stereo cameras transmitted from the arbitrary terminal or server.
The lane detection apparatus 10 may further include an input unit (not shown) that includes at least one microphone (not shown) to receive an audio signal.
The microphone receives external sound signals (including a user's voice (voice signal or voice information)) in calling mode, recording mode, voice recognition mode, video conference mode, and video calling mode, and processes the external sound signals into electrical voice data. The processed voice data may be output through a voice output unit (not shown) or transformed so that it can be transmitted to an external terminal through the communication module. In addition, various noise reduction algorithms may be implemented in the microphone in order to remove noise generated when receiving external sound signals.
The input unit receives signals according to the user's button operation, or commands or control signals created by operations such as touching or scrolling the screen being displayed.
The input unit receives signals corresponding to information input by the user. Various devices may be used as the input unit, such as a keyboard, a keypad, a dome switch, a touch pad (resistive/capacitive), a touch screen, a jog shuttle, a jog switch, a mouse, a stylus pen, a touch pen, and a laser pointer. At this time, the input unit receives signals corresponding to inputs from these devices.
The lane detection apparatus 10 may further include a voice output unit (not shown) that outputs voice information included in a signal subjected to predetermined signal processing by the control unit 120. Here, the voice output unit may be a speaker.
According to the apparatus and method for detecting a lane according to the exemplary embodiment of the present invention, support points (feature points) that form a candidate group of lanes within an image are extracted and transformed to world coordinates, and a lane is detected from the transformed world coordinates. Compared to detecting a lane directly from the image, this reduces the likelihood of error accumulation caused by the transfer of calibration error between the camera information and the world coordinates.
According to the apparatus and method for detecting a lane according to the exemplary embodiment of the present invention, information about a lane detected according to world coordinates is displayed, and a warning message is created and output on the basis of this information, thereby increasing accuracy/sensitivity and user convenience.
According to the apparatus and method for detecting a lane according to the exemplary embodiment of the present invention, when a damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of a lane width (width between driving lanes) that is calculated before the damaged lane section, thereby offering drivers convenience.
Hereinafter, a method of detecting a lane according to an exemplary embodiment of the present invention will be described in detail with reference to FIGS. 1 to 10.
FIG. 7 is a flowchart illustrating a method of detecting a lane according to an exemplary embodiment of the present invention.
First, the camera module 110 receives a first image and a second image that are captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval on the same central axis in the same plane of the lane detection apparatus 10 or an image that is captured by a single camera. Here, the first image may be a left image captured by a left camera included in the one pair of cameras, while the second image may be a right image captured by a right camera included in the one pair of cameras. In addition, the camera module 110 may receive any one of the first image and the second image that are captured by the one pair of cameras.
The camera module 110 receives the image 210 that is captured by the single camera. For example, the camera module 110 may receive the image 210 that includes lanes corresponding to a first lane, a second lane, and a third lane and/or a double line of a white or yellow solid line (or a white or yellow solid line and a broken double line).
The control unit 120 receives the image 210 through the camera module 110 in operation S11 and extracts a plurality of support points (for example, lane feature points) within the captured image 210 on the basis of guide lines set beforehand. At this time, as shown in FIG. 3, when the bottom of the image 210 is transformed to world coordinates, it corresponds to a nearby area, while the middle and top of the image 210, when transformed to world coordinates, correspond to a distant area. Therefore, in order to obtain point intervals that are as uniform as possible when data of the image 210 is transformed to world coordinates, the line spacing between the guide lines 310 is set to be large at the bottom of the image 210 and to become smaller toward the top of the image 210. Here, the amount by which the line spacing between the guide lines 310 changes may vary according to the designer's design. Alternatively, the same line spacing may be kept when data of the image 210 is transformed to world coordinates. As noted above, the guide lines are not actually displayed on the image; they are virtual lines used to obtain point intervals that are as uniform as possible when the support points (feature points) are transformed to world coordinates.
As shown in FIG. 5, the control unit 120 extracts a plurality of support points (lane feature points) within the image 210 on the basis of the guide lines set beforehand and displays the extracted support points 510 on the display unit 130. That is, the control unit 120 displays support points corresponding to lanes 501 and support points corresponding to a double line 502 in the image domain. Here, the vertical intervals between the rows of support points (taken along the horizontal (x) axis) become smaller from the bottom to the top of the display unit 130.
The control unit 120 transforms the extracted support points into world coordinates. That is, the control unit 120 may transform the extracted support points to world coordinates by using transformation matrices (including, for example, homographic matrices) stored in advance in the storage unit 140.
In one example, as shown in FIG. 6, the control unit 120 transforms the extracted support points to world coordinates on the basis of the homographic matrix stored in advance in the storage unit 140 and displays the plurality of support points 610, transformed to the world coordinates, on the display unit 130. Here, the vertical intervals between the support points transformed to the world coordinates are kept the same along the horizontal axis.
The control unit 120 may detect (or check) a plurality of points corresponding to a curve among the plurality of support points transformed to the world coordinates, on the basis of those support points and the curve equation stored in advance in the storage unit 140. That is, the control unit 120 may substitute the plurality of support points, transformed to the world coordinates, into the curve equation stored in advance in the storage unit 140 and determine (or check) whether the points plot a curve according to a result of the substitution. Here, the curve equation may be a quadratic or higher-order equation.
The control unit 120 substitutes the plurality of support points, transformed to the world coordinates, into a quadratic curve equation (for example, y = ax^2 + bx + c, where a is a curvature, b is a gradient (or heading), and c is an offset) stored in advance in the storage unit 140, and determines that the support points make a straight line if a = 0 and that the support points plot a curve if a ≠ 0.
The control unit 120 may substitute the plurality of support points, transformed to the world coordinates, into a cubic curve equation (for example, y = ax^3 + bx^2 + cx + d, where a is a curve derivative, b is a curvature, c is a heading, and d is an offset) stored in advance in the storage unit 140 and determine whether the support points plot a curve. In the cubic equation, when a is 0, b is the curvature of the lane, c is the heading of the vehicle, and d is an offset. When both a and b are 0, a straight line is detected, c is the heading of the vehicle, and d is an offset.
The control unit 120 detects lanes by tracking the plurality of support points, transformed to the world coordinates, or by tracking the plurality of points corresponding to the detected curve.
Among the detected lanes, the control unit 120 detects, as the driving lanes of the vehicle, a first lane that is most adjacent to the left side of the vehicle and a second lane that is most adjacent to the right side of the vehicle, on the basis of the direction in which the vehicle heads, in operation S12.
The control unit 120 displays the detected driving lanes on the image. For example, the control unit 120 transforms (or maps) the detected driving lanes to individual coordinates in the image domain and overlaps the transformed coordinates with the image 210.
The control unit 120 calculates a lane width between the driving lanes in real time or periodically in operation S13 and stores the calculated lane width in the storage unit 140. For example, the control unit 120 calculates the lane width on the basis of pixels (for example, pixels corresponding to one straight line connecting the driving lanes) located between the driving lanes. Here, the individual pixels may have the same distance value or different distance values. That is, when it is assumed that thirty pixels are located between two lanes (driving lanes), and each pixel has a predetermined distance value of 10 cm, a lane width between the two lanes is 300 cm (10 cm × 30 = 300 cm).
The control unit 120 may calculate the lane width between the driving lanes on the basis of the lane feature points transformed to the world coordinates.
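The pixel-based computation above reduces to summing per-pixel distance values along the line joining the two driving lanes; a short sketch mirroring the 30-pixel example:

    def lane_width_cm(pixel_distances_cm):
        """Sum per-pixel distance values (equal or differing) along the
        straight line connecting the two driving lanes."""
        return sum(pixel_distances_cm)

    print(lane_width_cm([10] * 30))  # 300 cm, as in the example above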
The control unit 120 determines whether a damaged lane section is detected from the image in operation S14. For example, the control unit 120 may detect a section in which the lane feature points are not detected in any one of the driving lanes as the damaged lane section.
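As a sketch of this check (the run-length threshold min_run is an assumption; the embodiment does not state one), the guide-line rows can be scanned for runs where either driving lane yields no feature point:

    def damaged_sections(left_found, right_found, min_run=3):
        """left_found/right_found: per-guide-line booleans saying whether a
        feature point was detected for each driving lane. Returns index
        ranges where either lane is missing for at least min_run rows."""
        missing = [not (l and r) for l, r in zip(left_found, right_found)]
        sections, start = [], None
        for i, m in enumerate(missing + [False]):  # sentinel flushes last run
            if m and start is None:
                start = i
            elif not m and start is not None:
                if i - start >= min_run:
                    sections.append((start, i - 1))
                start = None
        return sections

    left = [True] * 10
    right = [True, True, False, False, False, False, True, True, True, True]
    print(damaged_sections(left, right))  # [(2, 5)]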
FIG. 8 is an exemplary view illustrating an image 810 that includes a damaged lane section in order to describe an exemplary embodiment of the present invention.
As shown in FIG. 8, when there is a section 801 in which the lane feature points are not detected in any one of the driving lanes 610, the control unit 120 may detect that section as a damaged lane section 801.
The control unit 120 may also detect a section in which the lane feature points corresponding to the driving lanes 610 are temporarily not detected as the damaged lane section.
FIG. 9 is an exemplary view illustrating an image 910 that includes a different damaged lane section in order to describe an exemplary embodiment of the present invention.
As shown in FIG. 9, the control unit 120 may detect sections 801 and 901 in which lane feature points corresponding to the driving lanes 610 are temporarily not detected as the damaged lane section.
The control unit 120 corrects the damaged lane section on the basis of the calculated lane width when the damaged lane section is detected in operation S15. For example, when the damaged lane sections 801 and 901 are detected, the control unit 120 reads a lane width between the driving lanes, which is calculated before the detected damaged lane section, from the storage unit 140 and displays virtual lanes in the damaged lane section on the basis of the read lane width. That is, when the damaged lane sections 801 and 901 are detected, the control unit 120 generates virtual lanes corresponding to the lane width between the driving lanes, which is calculated before the detected damaged lane section, and displays the virtual lanes in the damaged lane section.
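One hedged way to generate such virtual lanes (the lateral-offset geometry is an assumption; the embodiment only states that virtual lanes corresponding to the stored width are generated) is to shift the still-visible boundary sideways by the stored lane width:

    import numpy as np

    def virtual_lane(intact_lane_pts, stored_width_m, missing_side="right"):
        """Synthesize the damaged boundary by offsetting the intact
        boundary laterally (world y axis) by the stored lane width."""
        shift = stored_width_m if missing_side == "right" else -stored_width_m
        virtual = intact_lane_pts.copy()
        virtual[:, 1] += shift
        return virtual

    left_lane = np.array([[0.0, -1.7], [5.0, -1.7], [10.0, -1.6], [15.0, -1.5]])
    print(virtual_lane(left_lane, stored_width_m=3.5))  # synthesized right lane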
FIG. 10 is a view illustrating an image and lanes displayed on a display unit according to an exemplary embodiment of the present invention.
As shown in FIG. 10, when the damaged lane sections 801 and 901 are detected, the control unit 120 reads the lane width between the driving lanes, which is calculated before the detected damaged lane section, from the storage unit 140 and displays virtual lanes in the damaged lane section on the basis of the read lane width, so that the driving lanes 1010 can be continuously displayed on the image, thereby offering drivers convenience.
Meanwhile, the control unit 120 performs functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 10 (or a vehicle having the lane detection apparatus 10) that is detected by an arbitrary GPS module (not shown), and the detected curve (or lane).
Therefore, according to the apparatus and method for detecting a lane according to the exemplary embodiment of the present invention, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of the lane width between the driving lanes, which is calculated before the detected damaged lane section, so that driving lanes can be continuously displayed on the image, thereby offering drivers convenience.
Hereinafter, a method of detecting a lane according to another exemplary embodiment of the present invention will be described in detail with reference to FIGS. 1 and 11.
FIG. 11 is a flowchart illustrating a method of detecting a lane according to another exemplary embodiment of the present invention.
First, the camera module 110 receives a first image and a second image that are captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval on the same central axis in the same plane of the lane detection apparatus 10 or an image that is captured by a single camera. Here, the first image may be a left image captured by a left camera included in the one pair of cameras, while the second image may be a right image captured by a right camera included in the one pair of cameras. In addition, the camera module 110 may receive any one of the first image and the second image that are captured by the one pair of cameras.
The control unit 120 receives the image 210 through the camera module 110 in operation S21 and extracts a plurality of support points (for example, lane feature points) within the captured image 210 on the basis of guide lines set beforehand.
The control unit 120 transforms the extracted support points to world coordinates. That is, the control unit 120 may transform the extracted support points to world coordinates by using transformation matrices (for example, homographic matrices) stored in advance in the storage unit 140.
The control unit 120 may detect (or check) a plurality of points corresponding to a curve among the plurality of support points transformed to the world coordinates, on the basis of those support points and the curve equations stored in advance in the storage unit 140.
The control unit 120 detects lanes by tracking the plurality of support points, transformed to the world coordinates, or by tracking the plurality of points corresponding to the detected curve.
Among the detected lanes, the control unit 120 detects, as the driving lanes of the vehicle, a first lane that is most adjacent to the left side of the vehicle and a second lane that is most adjacent to the right side of the vehicle, on the basis of the direction in which the vehicle heads, in operation S22.
The control unit 120 displays the detected driving lanes on the image. For example, the control unit 120 transforms (or maps) the detected driving lanes to coordinates in the image domain and overlaps the coordinates with the image 210.
The control unit 120 calculates a lane width between the driving lanes in real time or periodically in operation S23 and stores the calculated lane width in the storage unit 140. For example, the control unit 120 calculates the lane width on the basis of pixels (for example, pixels corresponding to one straight line connecting the driving lanes) located between the driving lanes. Here, individual pixels may have the same distance value or different distance values. That is, when it is assumed that thirty pixels are located between two lanes (driving lanes), and each pixel has a predetermined distance value of 10 cm, a lane width between the two lanes is 300 cm (10 cm × 30 = 300 cm).
The control unit 120 determines whether a damaged lane section is detected from the image in operation S24. When there is a section 801 in which the lane feature points are not detected in any one of the driving lanes 610, the control unit 120 may detect that section as a damaged lane section 801. The control unit 120 may also detect a section (or sections) 801 and 901 in which lane feature points corresponding to the driving lanes 610 are temporarily not detected as the damaged lane section.
When the damaged lane section is detected, the control unit 120 detects a heading direction of the vehicle in operation S25. For example, the control unit 120 may receive a heading angle of the vehicle from an Electronic Control Unit (ECU) of the vehicle through a vehicle interface, or may detect the heading direction of the vehicle by means of a heading angle sensor further installed in the lane detection apparatus.
The control unit 120 displays the virtual lanes in the damaged lane section on the basis of the calculated lane width and the heading direction of the vehicle (corrects the damaged lane section) in operation S26. For example, when the damaged lane sections 801 and 901 are detected, the control unit 120 reads the lane width between the driving lanes, which is calculated before the detected damaged lane section, from the storage unit 140, generates virtual lanes corresponding to the read lane width, and displays the virtual lanes on the basis of the heading direction of the vehicle.
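A sketch of how the heading direction could enter the correction (the straight-line extrapolation and the angle convention are assumptions): project virtual left/right lane points ahead along the heading, separated by the stored lane width:

    import math

    def extrapolate_with_heading(start_xy, heading_deg, lane_width_m,
                                 length_m=20.0, step_m=5.0):
        """Virtual left/right lane points along the vehicle heading,
        laterally separated by the stored lane width (units assumed)."""
        hx = math.cos(math.radians(heading_deg))
        hy = math.sin(math.radians(heading_deg))
        nx, ny = -hy, hx            # unit normal (lateral direction)
        half = lane_width_m / 2.0
        left, right, s = [], [], 0.0
        while s <= length_m:
            cx, cy = start_xy[0] + hx * s, start_xy[1] + hy * s
            left.append((cx - nx * half, cy - ny * half))
            right.append((cx + nx * half, cy + ny * half))
            s += step_m
        return left, right

    print(extrapolate_with_heading((0.0, 0.0), heading_deg=2.0, lane_width_m=3.5))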
Therefore, according to the apparatus and method for detecting a lane according to another exemplary embodiment of the present invention, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of the lane width between the driving lanes, which is calculated before the detected damaged lane section, and the heading direction of the vehicle, so that the driving lanes can be continuously and accurately displayed on the image.
As set forth above, according to the apparatus and method for detecting a lane according to exemplary embodiments of the present invention, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of a lane width between driving lanes, calculated before the detected damaged lane section, so that the driving lanes can be continuously displayed on the image to thereby offer drivers convenience.
According to the apparatus and method for detecting a lane according to exemplary embodiments of the present invention, when the damaged lane section is detected, virtual lanes are displayed in the damaged lane section on the basis of a lane width between driving lanes, calculated before the detected damaged lane section, and a heading direction of a vehicle, so that the driving lanes can be continuously and accurately displayed on the image.

Claims (18)

  1. An apparatus for detecting a lane comprising:
    a camera module;
    a display unit configured to display an image captured by the camera module; and
    a control unit configured to detect candidate lanes from the image captured by the camera module on the basis of lane feature points, display driving lanes of a vehicle among the candidate lanes on the image, and correct a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.
  2. The apparatus of claim 1, wherein the control unit displays virtual lanes in the damaged lane section on the basis of a lane width calculated before the damaged lane section when the damaged lane section is detected.
  3. The apparatus of claim 1, further comprising a storage unit storing information corresponding to the lane width,
    wherein the control unit calculates the lane width in real time or periodically.
  4. The apparatus of claim 1, wherein the control unit transforms the lane feature points to world coordinates and calculates the lane width on the basis of the lane feature points transformed to the world coordinates.
  5. The apparatus of claim 1, wherein the control unit detects a section in which the lane feature points are not detected in any one of the driving lanes as the damaged lane section.
  6. The apparatus of claim 1, wherein the control unit detects a section in which the lane feature points corresponding to the driving lanes are temporarily not detected as the damaged lane section.
  7. The apparatus of claim 1, wherein the control unit generates virtual lanes corresponding to the lane width calculated before the damaged lane section and displays the virtual lanes in the damaged lane section when the damaged lane section is detected.
  8. The apparatus of claim 1, wherein the control unit detects a heading direction of the vehicle and displays virtual lanes in the damaged lane section on the basis of the lane width calculated before the damaged lane section and the heading direction when the damaged lane section is detected.
  9. The apparatus of claim 8, wherein the control unit generates the virtual lanes on the basis of the lane width calculated before the damaged lane section and displays the virtual lanes on the image on the basis of the heading direction of the vehicle.
  10. A method of detecting a lane comprising:
    detecting candidate lanes from an image captured by a camera module on the basis of lane feature points;
    displaying driving lanes of a vehicle among the candidate lanes on the image; and
    correcting a damaged lane section on the basis of a lane width between the driving lanes when the damaged lane section is detected from the image.
  11. The method of claim 10, wherein the correcting of the damaged lane section comprises displaying virtual lanes in the damaged lane section on the basis of the lane width calculated before the damaged lane section.
  12. The method of claim 10, further comprising storing information corresponding to the lane width,
    wherein the lane width is calculated in real time or periodically.
  13. The method of claim 10, wherein the calculating of the lane width comprises:
    transforming the lane feature points to world coordinates; and
    calculating the lane width on the basis of the lane feature points transformed to the world coordinates.
  14. The method of claim 10, wherein the damaged lane section is a section in which the lane feature points are not detected from any one of the driving lanes.
  15. The method of claim 10, wherein the damaged lane section is a section in which the lane feature points corresponding to the driving lanes are temporarily not detected.
  16. The method of claim 10, wherein the correcting of the damaged lane section comprises:
    generating virtual lanes corresponding to a lane width calculated before the damaged lane section when the damaged lane section is detected; and
    displaying the virtual lanes in the damaged lane section.
  17. The method of claim 10, wherein the correcting of the damaged lane section comprises:
    detecting a heading direction of the vehicle when the damaged lane section is detected; and
    displaying virtual lanes in the damaged lane section on the basis of the lane width calculated before the damaged lane section and the heading direction.
  18. The method of claim 17, wherein the displaying of the virtual lanes comprises:
    generating the virtual lanes on the basis of the lane width calculated before the damaged lane section; and
    displaying the virtual lanes on the image on the basis of the heading direction of the vehicle.