KR101677640B1 - Apparatus for detecting lane and method thereof - Google Patents

Apparatus for detecting lane and method thereof Download PDF

Info

Publication number
KR101677640B1
KR101677640B1 KR1020110078358A
Authority
KR
South Korea
Prior art keywords
lane
curve
detected
lanes
image
Prior art date
Application number
KR1020110078358A
Other languages
Korean (ko)
Other versions
KR20130015981A (en)
Inventor
김종헌
박영경
이중재
김현수
박준오
안드레아스 박
디르 산드라 셰이커
이제훈
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020110078358A
Priority to PCT/KR2011/009843
Publication of KR20130015981A
Application granted
Publication of KR101677640B1

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

TECHNICAL FIELD [0001] The present invention relates to a lane recognition apparatus and a method thereof capable of accurately recognizing the driving lane of a vehicle. The lane recognition apparatus according to the present invention comprises: a camera module; a display unit for displaying an image captured by the camera module; and a controller for detecting candidate lanes based on lane feature points in the image captured by the camera module, displaying the driving lanes of the vehicle among the detected candidate lanes on the image, and, when a lane-damaged section is detected in the image, compensating for the lane-damaged section based on the lane width between the driving lanes.

Description

APPARATUS FOR DETECTING LANE AND METHOD THEREOF

The present specification relates to a lane recognition apparatus and a method thereof.

Generally, a lane recognition device recognizes a lane included in an arbitrary image input through a camera or the like, or received from an external terminal. A lane recognition apparatus according to the prior art is disclosed in Korean Patent Laid-Open Publication No. 1995-0017509.

It is an object of the present invention to provide a lane recognition apparatus and a method thereof that can accurately recognize a lane.

The present specification provides a lane recognition apparatus and a method thereof in which, when a lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed continuously.

According to an aspect of the present invention, there is provided a lane recognition apparatus comprising: a camera module; a display unit for displaying an image captured by the camera module; and a controller for detecting candidate lanes based on lane feature points in the image captured by the camera module, displaying the driving lanes of the vehicle among the detected candidate lanes on the image, and, when a lane-damaged section is detected in the image, compensating for the lane-damaged section based on the lane width between the driving lanes.

In one embodiment of the present invention, when the lane-damaged section is detected, the controller may display a virtual lane in the lane-damaged section based on the lane width calculated before the detected lane-damaged section.

As an example related to the present specification, the apparatus may further include a storage unit for storing information corresponding to the lane width, and the controller may calculate the lane width in real time or periodically.

As an example related to the present specification, the control unit may calculate the lane width based on a distance value set for each pixel corresponding to a straight line distance between the driving lanes.

As an example related to the present specification, the controller may detect, as the lane-damaged section, a section in which lane feature points are not detected in any one of the driving lanes.

As an example related to the present specification, the controller may detect, as the lane-damaged section, a section in which lane feature points corresponding to the driving lanes are temporarily not detected.

When the lane-damaged section is detected, the controller may generate a virtual lane corresponding to the lane width calculated before the detected lane-damaged section, and display the virtual lane in the lane-damaged section.

When the lane-damaged section is detected, the controller may detect the heading direction of the vehicle, and display the virtual lane in the lane-damaged section based on the lane width calculated before the lane-damaged section and the heading direction.

In one embodiment of the present invention, the controller may generate the virtual lane based on the lane width calculated before the lane-damaged section, and display the virtual lane on the image based on the heading direction of the vehicle.

According to another aspect of the present invention, there is provided a lane recognition method comprising: detecting candidate lanes based on lane feature points in an image captured by a camera module; displaying the driving lanes of the vehicle among the detected candidate lanes on the image; and, when a lane-damaged section is detected in the image, compensating for the lane-damaged section based on the lane width between the driving lanes.

In the lane recognition apparatus and method according to the embodiments of the present invention, when a lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed on the image continuously without interruption, thereby providing the driver with driving convenience.

In the lane recognition apparatus and method according to embodiments of the present invention, when a lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section and the heading direction of the vehicle, so that the driving lane can be displayed on the image continuously and accurately.

FIG. 1 is a block diagram illustrating a configuration of a lane recognition apparatus according to an embodiment of the present invention.
FIG. 2 is an exemplary view showing an image captured by a camera according to an embodiment of the present invention.
FIG. 3 is an exemplary view illustrating guide lines according to an embodiment of the present invention.
FIG. 4 is an exemplary view showing lane feature points according to an embodiment of the present invention.
FIG. 5 is an exemplary view illustrating lane feature points converted into world coordinates according to an embodiment of the present invention.
FIG. 6 is a view showing driving lanes according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a lane recognition method according to an embodiment of the present invention.
FIG. 8 is a view illustrating an image including a lane-damaged section for explaining an embodiment of the present invention.
FIG. 9 is a view illustrating an image including another lane-damaged section for explaining an embodiment of the present invention.
FIG. 10 is a view illustrating images and lanes displayed on a display unit according to an embodiment of the present invention.
FIG. 11 is a flowchart illustrating a lane recognition method according to another embodiment of the present invention.

It is noted that the technical terms used herein are used only to describe specific embodiments and are not intended to limit the invention. Unless otherwise defined, the technical terms used herein are to be interpreted in the sense generally understood by a person of ordinary skill in the art to which the present invention belongs, and should not be interpreted in an excessively broad or excessively narrow sense. Further, when a technical term used herein is an erroneous term that does not accurately express the spirit of the present invention, it should be replaced with a technical term that can be correctly understood by a person skilled in the art. In addition, general terms used herein should be interpreted according to their dictionary definitions or the surrounding context, and should not be interpreted in an excessively narrow sense.

Also, the singular forms used herein include plural referents unless the context clearly dictates otherwise. In the present application, terms such as "comprising" or "including" should not be construed as necessarily including all of the various elements or steps described in the specification; some elements or steps may be omitted, or additional elements or steps may be included.

Furthermore, terms including ordinals such as first, second, etc. used in this specification can be used to describe various elements, but the elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or similar elements throughout the several views, and redundant description thereof will be omitted.

In the following description, well-known functions or constructions are not described in detail since they would obscure the invention in unnecessary detail. It is to be noted that the accompanying drawings are intended only to facilitate understanding of the present invention, and should not be construed as limiting its scope.

Hereinafter, the configuration of a lane recognition apparatus according to an embodiment of the present invention will be described with reference to FIG. 1. The lane recognition device of FIG. 1 may be configured as a stand-alone device, or may be applied to a mobile terminal, a telematics terminal, a smart phone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a tablet PC, a WiBro terminal, a navigation terminal, an AVN (Audio Video Navigation) terminal, and the like.

FIG. 1 is a block diagram illustrating a configuration of a lane recognition apparatus according to an embodiment of the present invention.

As shown in FIG. 1, the lane recognition apparatus 10 according to an embodiment of the present invention includes: a camera module 110; a display unit 130 for displaying an image captured by the camera module 110; and a control unit 120 for detecting all candidate lanes in the image captured by the camera module 110, displaying, among the detected candidate lanes, the lanes adjacent to the vehicle (e.g., the lane adjacent to the left side of the vehicle and the lane adjacent to the right side of the vehicle) on the image as the current driving lanes of the vehicle, calculating and storing the width between the driving lanes in real time or periodically, and displaying a virtual lane when a lane-damaged section is detected in the image. That is, when the lane-damaged section is detected, the control unit 120 compensates for the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section.

The components shown in FIG. 1 are not all essential, and the lane recognition device 10 may be implemented with more components or with fewer components than those shown in FIG. 1.

FIG. 2 is an exemplary view showing an image captured by a camera according to an embodiment of the present invention.

As shown in FIG. 2, the camera module 110 receives an image 210 captured through a single camera. For example, the camera module 110 may receive an image 210 including lanes corresponding to the first, second, and third lanes, and double lines of white or yellow solid lines (or double lines of a white or yellow solid line and a dotted line).

The control unit 120 receives the image 210 through the camera module 110 and extracts a plurality of support points (for example, lane feature points) from the captured image 210 based on a preset guide line. As shown in FIG. 3, the lower part of the image 210 corresponds to a near region when converted into world coordinates, and the middle and upper parts of the image 210 correspond to progressively farther regions. Accordingly, in order to obtain a maximally uniform point interval when the data of the image 210 is converted into world coordinates, the line interval of the guide line 310 is set wider at the bottom of the image 210, and the interval between the lines of the guide line 310 gradually decreases toward the top of the image 210. Here, the variation of the line spacing of the guide line 310 can be set in various ways according to the designer's intent, and can be set so as to maintain the same line interval when converted into world coordinates. The guide line is not actually displayed on the image; it refers to a virtual line used to obtain a maximally uniform point interval when converting the support points (feature points) into world coordinates.
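As an illustration of this spacing rule, the following sketch (in Python, using a hypothetical flat-ground homography; the camera height, focal length, and principal point are assumptions for illustration, not values from the patent) places guide lines at uniform world-coordinate distances and maps them back into the image, which yields exactly the wide-at-the-bottom, dense-toward-the-top spacing described above:

```python
import numpy as np

# Hypothetical flat-ground homography (image -> world) built from an assumed
# camera: height 1.5 m, focal length 700 px, principal point (640, 360).
h, f, u0, v0 = 1.5, 700.0, 640.0, 360.0
H = np.array([[h,   0.0, -u0 * h],
              [0.0, 0.0,  f * h],
              [0.0, 1.0, -v0]])
H_inv = np.linalg.inv(H)

def apply_homography(M, pts):
    """Apply a 3x3 homography M to an (N, 2) array of points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    out = pts_h @ M.T
    return out[:, :2] / out[:, 2:3]                   # perspective divide

# Guide lines every 5 m from 5 m to 40 m ahead: uniform in world coordinates.
world_pts = np.array([[0.0, d] for d in np.arange(5.0, 41.0, 5.0)])
rows = apply_homography(H_inv, world_pts)[:, 1]       # image row of each guide line

# Rows are widely spaced at the bottom of the image (near region) and become
# progressively denser toward the top (far region):
print(np.round(rows, 1))  # [570. 465. 430. 412.5 402. 395. 390. 386.2]
```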

As shown in FIG. 4, the control unit 120 extracts a plurality of support points (lane feature points) from the image 210 based on the preset guide line, and displays the plurality of support points 410 on the image domain. That is, the control unit 120 displays support points corresponding to the lanes 401 and support points corresponding to the double line 402 on the image domain. Here, the vertical distance between the plurality of support points with respect to the horizontal axis (x-axis) becomes narrower from the lower side toward the upper side of the display unit 130.

The control unit 120 converts the extracted plurality of support points into world coordinates. That is, the control unit 120 may convert the extracted plurality of support points into world coordinates using a transformation matrix (for example, a homography matrix) previously stored in the storage unit 140.

As shown in FIG. 5, the control unit 120 may convert the plurality of extracted support points into world coordinates based on the homography matrix stored in advance in the storage unit 140, and display the plurality of converted support points 510 on the display unit 130. Here, the vertical intervals between the plurality of support points converted into world coordinates, with respect to the horizontal axis, are maintained at the same interval.
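A minimal sketch of this conversion step follows, reusing the same hypothetical homography as above; OpenCV's cv2.perspectiveTransform is one common way to apply a stored 3×3 matrix, and the support-point coordinates below are made up for illustration:

```python
import cv2
import numpy as np

# Same hypothetical flat-ground homography (image -> world) as in the
# guide-line sketch above; real values would come from camera calibration.
H = np.array([[1.5, 0.0, -960.0],
              [0.0, 0.0, 1050.0],
              [0.0, 1.0, -360.0]], dtype=np.float32)

# Image-domain support points (u, v) extracted along the guide lines.
support_pts = np.array([[[520.0, 560.0]],
                        [[500.0, 470.0]],
                        [[490.0, 430.0]]], dtype=np.float32)

# Convert to world coordinates: lateral offset X and forward distance Z, in m.
world_pts = cv2.perspectiveTransform(support_pts, H).reshape(-1, 2)
print(world_pts)  # approx. (-0.9, 5.25), (-1.91, 9.55), (-3.21, 15.0)
```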

The control unit 120 determines (or confirms) whether the plurality of support points converted into world coordinates correspond to a curve, based on a curve equation previously stored in the storage unit 140, and detects the plurality of points corresponding to the curve. That is, the control unit 120 substitutes the plurality of support points converted into world coordinates into the curve equation stored in advance in the storage unit 140, and determines (or confirms) whether the support points form a curve. Here, the curve equation may be an equation of second order or higher.

The control unit 120 may substitute the plurality of support points converted into world coordinates into a quadratic curve equation previously stored in the storage unit 140 (for example, y = ax² + bx + c, where a is the curvature, b is the heading, and c is the offset). If a = 0, the lane is recognized as a straight line; if a ≠ 0, it is recognized as a curve.

The control unit 120 may also substitute the plurality of support points converted into world coordinates into a cubic curve equation previously stored in the storage unit 140 (for example, y = ax³ + bx² + cx + d, where a is the curve rate of change (curve derivative), b is the curvature, c is the heading, and d is the offset). In this case, if a is 0, b represents the curvature of the lane, c the heading of the vehicle, and d the offset; if both a and b are 0, the lane is recognized as a straight line, with c representing the heading and d the offset.
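A minimal sketch of this curve check is shown below, using numpy's polyfit as the fitting routine; the synthetic world-coordinate points and the zero tolerance EPS are assumptions for illustration:

```python
import numpy as np

# Support points in world coordinates: x = forward distance (m),
# y = lateral position (m). Synthetic points on a gentle curve.
x = np.arange(5.0, 41.0, 5.0)
y = 0.002 * x**2 + 0.01 * x - 1.8   # true a=0.002, b=0.01, c=-1.8

# Quadratic model y = ax^2 + bx + c: a ~ curvature, b ~ heading, c ~ offset.
a, b, c = np.polyfit(x, y, 2)       # highest-degree coefficient first

EPS = 1e-4  # tolerance below which a coefficient is treated as zero
if abs(a) < EPS:
    print(f"straight line: heading={b:.4f}, offset={c:.2f}")
else:
    print(f"curve: curvature={a:.4f}, heading={b:.4f}, offset={c:.2f}")

# Cubic model y = ax^3 + bx^2 + cx + d adds a curve rate-of-change term a;
# if a and b are both ~0, the points again describe a straight line.
a3, b3, c3, d3 = np.polyfit(x, y, 3)
print("straight" if abs(a3) < EPS and abs(b3) < EPS else "curved")
```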

The control unit 120 detects lanes by tracking a plurality of support points converted into the world coordinates, or detects lanes by tracking a plurality of points corresponding to the detected curves.

The controller 120 may calculate curve information that follows a virtual center point of the lane based on the plurality of points corresponding to the detected curve. The calculated curve information can be used to improve lane-keeping performance on world coordinates by minimizing the influence of the camera calibration state. That is, the controller 120 may apply a least squares method, a random sample consensus (RANSAC) method, a generalized Hough transform method, a spline interpolation method, or the like to the plurality of points corresponding to the detected curve in order to calculate the curve information that follows the center point of the lane.
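Of the estimators listed, RANSAC is the easiest to sketch. The version below fits the quadratic lane model from the previous example while ignoring gross outliers among the center-line points; the sample size, iteration count, and inlier tolerance are arbitrary choices for illustration:

```python
import numpy as np

def ransac_quadratic(x, y, n_iter=200, inlier_tol=0.15, rng=None):
    """Robustly fit y = ax^2 + bx + c by RANSAC and return the coefficients."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_coeffs, best_inliers = None, 0
    for _ in range(n_iter):
        idx = rng.choice(len(x), size=3, replace=False)  # minimal quadratic sample
        coeffs = np.polyfit(x[idx], y[idx], 2)
        residuals = np.abs(np.polyval(coeffs, x) - y)
        n_in = int(np.sum(residuals < inlier_tol))
        if n_in > best_inliers:
            best_inliers, best_coeffs = n_in, coeffs
    # Least-squares refit on the inliers of the best model.
    mask = np.abs(np.polyval(best_coeffs, x) - y) < inlier_tol
    return np.polyfit(x[mask], y[mask], 2)

# Synthetic center-line points with a few gross outliers mixed in.
x = np.linspace(5.0, 40.0, 30)
y = 0.002 * x**2 - 1.8 + np.where(np.arange(30) % 9 == 0, 1.5, 0.0)
print(ransac_quadratic(x, y))  # close to [0.002, 0.0, -1.8]
```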

The control unit 120 may overlap the calculated information, such as the curve information that tracks the center point of the lane and the detected curve, on the image displayed on the display unit 130. For example, the control unit 120 may convert the lane (curve/straight line) information that follows the calculated center point of the lane, given in world coordinates, into coordinates on the image domain, and display each of the converted coordinates on the display unit 130 overlapped with the captured image.

FIG. 6 is a view showing driving lanes according to an embodiment of the present invention.

As shown in FIG. 6, the control unit 120 may select, from among the detected lanes, a first lane closest to the left side of the vehicle and a second lane closest to the right side of the vehicle with respect to the traveling direction of the vehicle, and display the first lane and the second lane on the image as the driving lanes 610 of the vehicle. For example, the control unit 120 may convert (or map) the detected first and second lanes 610 into coordinates on the image domain, and overlap the converted coordinates with the image 210.
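One plausible way to implement this selection, sketched below, is to compare the lateral offsets of the fitted candidate lanes at the vehicle position (the offset term of each lane model) and keep the nearest lane on each side; the offset values are hypothetical detections:

```python
# Lateral offset (m) of each detected candidate lane at the vehicle position:
# negative = left of the vehicle, positive = right (hypothetical values).
candidate_offsets = [-5.3, -1.8, 1.7, 5.2]

left_lane = max(o for o in candidate_offsets if o < 0)    # nearest on the left
right_lane = min(o for o in candidate_offsets if o > 0)   # nearest on the right
print(left_lane, right_lane)  # -1.8 1.7 -> the driving lane pair
```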

The control unit 120 may detect the lanes by directly extracting lane feature points (support points) from the image and tracking the lane feature points.

The camera module 110 may include at least one pair of cameras installed on the same plane of the lane recognition device 10 at a fixed horizontal interval (for example, a stereo camera or a stereoscopic camera), or a single camera. At this time, the fixed horizontal interval may be set in consideration of the distance between the two eyes of an average person. In addition, the camera module 110 may be any camera module capable of capturing an image.

The camera module 110 may receive a first image (e.g., a left image captured by the left camera included in the pair of cameras) and a second image (e.g., a right image captured by the right camera included in the pair of cameras) captured simultaneously by the pair of cameras.

The camera module 110 may include an image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor.

When the lane recognition device 10 is installed in a vehicle, the camera module 110 may be fixed at a predetermined position of the vehicle (for example, at the rear-view mirror of the vehicle) and capture the front of the vehicle in the traveling direction. The camera module 110 may also be fixed at a predetermined position of the vehicle (for example, a side mirror or the rear bumper of the vehicle) so as to capture the side, the rear, and the like of the vehicle.

The controller 120 performs functions related to lane keeping (including a lane departure warning message function, an automatic lane keeping function, and the like) based on the detected curve (or lane) and the position of the lane recognition device 10 (or the vehicle equipped with the lane recognition device 10) identified through an arbitrary GPS module (not shown).

The display unit 130 displays various contents, such as various menu screens, using a user interface and/or graphical user interface stored in the storage unit 140 under the control of the controller 120. Here, the content displayed on the display unit 130 includes various text or image data (including various information data) and menu screens including data such as icons, list menus, and combo boxes.

The display unit 130 includes a three-dimensional display (3D display) or a two-dimensional display (2D display). The display unit 130 may be a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, or a light emitting diode (LED) display.

The display unit 130 displays the three-dimensional image (or two-dimensional image) under the control of the control unit 120.

There may be two or more display units 130 depending on the embodiment of the lane recognition device 10. For example, a plurality of display units may be disposed on one surface (the same surface) of the lane recognition device 10, spaced apart from each other or integrally, or may be disposed on different surfaces.

Meanwhile, when the display unit 130 and a sensor for detecting a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 130 can also be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, a touch pad, or a touch panel.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 130, or in capacitance generated at a specific portion of the display unit 130, into an electrical input signal. In addition, the touch sensor can be configured to detect not only the touched position and area but also the pressure at the time of the touch. If there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller (not shown). The touch controller processes the signal(s) and then transmits the corresponding data to the controller 120. Thus, the controller 120 can know which area of the display unit 130 has been touched.

The display unit 130 may include a proximity sensor. In addition, the proximity sensor may be disposed in an inner area of the lane recognition device 10, which is surrounded by the touch screen, or in the vicinity of the touch screen.

The proximity sensor refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing in the vicinity thereof, using an electromagnetic field or infrared rays. The proximity sensor has a longer lifetime and higher utility than a contact-type sensor. Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of a pointer by the change of the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

The act of bringing the pointer close to the touch screen without contact, so that the pointer is recognized as positioned on the touch screen, may be referred to as a 'proximity touch', and the act of actually bringing the pointer into contact with the touch screen may be referred to as a 'contact touch'. The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer is proximity-touched.

In addition, the proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, etc.). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

In this way, when the display unit 130 is used as an input device, it can receive a command or a control signal by an operation such as a button manipulation by a user or touching/scrolling a displayed screen.

The lane recognition apparatus 10 according to the embodiment of the present invention may include a storage unit 140 for storing the image, a program for detecting the lane, lane width information calculated in real time or periodically, and the like.

The storage unit 140 may further store various menu screens, various user interfaces (UI), and / or a graphical user interface (GUI).

The storage unit 140 may further store mathematical expressions such as a transformation matrix (e.g., a homography matrix), a curve equation, a least squares method, and the like.

The storage unit 140 may further store data and programs necessary for the lane recognition apparatus 10 to operate.

The storage unit 140 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), ROM (read-only memory), EEPROM (electrically erasable programmable read-only memory), PROM (programmable read-only memory), RAM (random access memory), SRAM (static random access memory), magnetic memory, a magnetic disk, and an optical disk.

The lane recognition device 10 may further include a communication unit (not shown) that performs a communication function with an arbitrary terminal or server under the control of the control unit 120. At this time, the communication unit may include a wired/wireless communication module. Here, the wireless Internet technology may include WLAN (Wireless LAN), Wi-Fi, WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), IEEE 802.16, LTE (Long Term Evolution), WMBS (Wireless Mobile Broadband Service), and the like. The short-range communication technology may include Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, and the like. The wired communication technology may include USB (Universal Serial Bus) communication, and the like.

The communication unit may also include CAN communication, automotive Ethernet, FlexRay, LIN (Local Interconnect Network), or the like, for communication with any vehicle in which the lane recognition device 10 is installed.

Under the control of the control unit 120, the communication unit may transmit, to the arbitrary terminal or server, the plurality of support points extracted from an arbitrary image, the points obtained by converting the plurality of support points into world coordinates, the points corresponding to curves among the points converted into world coordinates, and the curve information that tracks the center point of a lane calculated on the basis of the plurality of points corresponding to the curve.

The communication unit may receive the first image and the second image, captured simultaneously through any pair of stereo cameras, transmitted from the arbitrary terminal or server.

The lane recognizing apparatus 10 may further include an input unit (not shown) including at least one microphone (not shown) for receiving an audio signal.

The microphone receives an external sound signal (including a user's voice (voice signal or voice information)) in a communication mode, a recording mode, a voice recognition mode, a video conference mode, or the like, and processes it into electrical voice data. The processed voice data may be output through a voice output unit (not shown), or converted into a form transmittable to an external terminal through the communication unit and then output. The microphone may also implement various noise reduction algorithms for removing noise generated while receiving the external sound signal.

The input unit receives a signal corresponding to a button operation by a user or receives a command or a control signal generated by an operation such as touching / scrolling a displayed screen.

The input unit receives a signal corresponding to information input by a user, and may use various devices such as a keyboard, a keypad, a dome switch, a touch pad (static pressure/capacitance), a touch screen, a jog shuttle, a jog wheel, a jog switch, a mouse, a stylus pen, a touch pen, and a laser pointer. At this time, the input unit receives signals corresponding to inputs made through these various devices.

The lane recognition apparatus 10 may further include a voice output unit (not shown) for outputting voice information included in a signal processed by the control unit 120. Here, the voice output unit may be a speaker.

The lane recognition apparatus and method according to an embodiment of the present invention extract support points (feature points) serving as lane candidates from an image, convert the support points into world coordinates, and recognize lanes on the converted world coordinates. Compared with a method of recognizing the lane directly in the image, this reduces the possibility of cumulative error arising from calibration errors between the image and world coordinates.

The lane recognition apparatus and method according to the embodiment of the present invention display information on the lanes recognized on the world coordinates and generate and output a warning message based thereon, thereby improving accuracy.

In the lane recognition apparatus and method according to the embodiment of the present invention, when a lane-damaged section is detected, a virtual lane is created in the lane-damaged section based on the lane width (the width between the driving lanes) calculated before the lane-damaged section, so that the driver can be provided with driving convenience.

Hereinafter, a lane recognition method according to an embodiment of the present invention will be described in detail with reference to FIGS. 1 to 10.

FIG. 7 is a flowchart illustrating a lane recognition method according to an embodiment of the present invention.

First, the camera module 110 receives a first image and a second image captured through at least one pair of cameras (for example, a stereo camera or a stereoscopic camera) installed at a fixed horizontal interval on the same central axis of the same plane of the lane recognition device 10, or an image captured through a single camera. Here, the first image may be a left image captured by the left camera included in the pair of cameras, and the second image may be a right image captured by the right camera included in the pair of cameras. In addition, the camera module 110 may receive only one of the first image and the second image captured through the pair of cameras.

The camera module 110 receives the image 210 captured through a single camera. For example, the camera module 110 may receive an image 210 including lanes corresponding to the first, second, and third lanes, and/or double lines of white or yellow solid lines (or double lines of a white or yellow solid line and a dotted line).

In step S11, the control unit 120 receives the image 210 through the camera module 110 and extracts a plurality of support points (for example, lane feature points) from the captured image 210 based on a preset guide line for extracting support points. As described above with reference to FIG. 3, the lower part of the image 210 corresponds to a near region when converted into world coordinates, and the middle and upper parts of the image 210 correspond to progressively farther regions; therefore, in order to obtain a maximally uniform point interval when the data of the image 210 is converted into world coordinates, the line interval of the guide line 310 is set wider at the bottom of the image 210, and the intervals between the lines of the guide line 310 gradually become narrower toward the top of the image 210. Here, the variation of the line spacing of the guide line 310 can be set in various ways according to the designer's intent, and can be set so as to maintain the same line interval when converted into world coordinates. The guide line is not actually displayed on the image; it refers to a virtual line used to obtain a maximally uniform point interval when converting the support points (feature points) into world coordinates.

As shown in FIG. 5, the control unit 120 extracts a plurality of support points (lane feature points) from the image 210 based on the preset guide line, and displays the plurality of support points 510 on the image domain. That is, the control unit 120 displays support points corresponding to the lanes 501 and support points corresponding to the double line 502 on the image domain. Here, the vertical distance between the plurality of support points with respect to the horizontal axis (x-axis) becomes narrower from the lower side toward the upper side of the display unit 130.

The control unit 120 converts the extracted plurality of support points into world coordinates. That is, the control unit 120 may convert the extracted plurality of support points into world coordinates using a transformation matrix (for example, a homography matrix) previously stored in the storage unit 140.

As shown in FIG. 6, the control unit 120 converts the extracted plurality of support points into world coordinates based on the homography matrix stored in advance in the storage unit 140, and displays the plurality of converted support points 610 on the display unit 130. Here, the vertical intervals between the plurality of support points converted into world coordinates, with respect to the horizontal axis, are maintained at the same interval.

The control unit 120 determines (or confirms) whether the plurality of support points converted into world coordinates correspond to a curve, based on a curve equation previously stored in the storage unit 140, and detects the plurality of points corresponding to the curve. That is, the control unit 120 substitutes the plurality of support points converted into world coordinates into the curve equation stored in advance in the storage unit 140, and determines (or confirms) whether the support points form a curve. Here, the curve equation may be an equation of second order or higher.

The control unit 120 may substitute the plurality of support points converted into world coordinates into a quadratic curve equation previously stored in the storage unit 140 (for example, y = ax² + bx + c, where a is the curvature, b is the heading, and c is the offset). If a = 0, the lane is recognized as a straight line; if a ≠ 0, it is recognized as a curve.

The control unit 120 may also substitute the plurality of support points converted into world coordinates into a cubic curve equation previously stored in the storage unit 140 (for example, y = ax³ + bx² + cx + d, where a is the curve rate of change (curve derivative), b is the curvature, c is the heading, and d is the offset). In this case, if a is 0, b represents the curvature of the lane, c the heading of the vehicle, and d the offset; if both a and b are 0, the lane is recognized as a straight line, with c representing the heading and d the offset.

The control unit 120 detects lanes by tracking a plurality of support points converted into the world coordinates, or detects lanes by tracking a plurality of points corresponding to the detected curves.

The control unit 120 detects, among the detected lanes, a first lane closest to the left side of the vehicle and a second lane closest to the right side of the vehicle with respect to the traveling direction of the vehicle, as the driving lanes of the vehicle (S12).

The control unit 120 displays the detected driving lanes on the image. For example, the control unit 120 converts (or maps) the detected driving lanes into coordinates on the image domain, respectively, and overlaps the transformed coordinates with the image 210.

The control unit 120 calculates the lane width between the driving lanes in real time or periodically (S13), and stores the calculated lane width in the storage unit 140. For example, the controller 120 calculates the lane width based on the pixels located between the driving lanes (e.g., the pixels corresponding to one straight line connecting the driving lanes) and a preset distance value for each pixel. Here, each pixel may have the same distance value or a different distance value. That is, assuming that there are 30 pixels located between the two driving lanes and the preset distance value for each pixel is 10 cm, the lane width between the two lanes is 300 cm (10 cm × 30 = 300 cm).
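The computation itself is a short sum. The sketch below mirrors the 30-pixel example above and also shows the variant in which each pixel carries its own preset distance value (the non-uniform values are hypothetical):

```python
# Uniform case from the text: 30 pixels between the driving lanes, 10 cm each.
n_pixels, cm_per_pixel = 30, 10
lane_width_cm = n_pixels * cm_per_pixel
print(lane_width_cm)  # 300

# Non-uniform case: each pixel carries its own preset distance value.
per_pixel_cm = [10] * 25 + [12] * 5   # hypothetical per-pixel values
print(sum(per_pixel_cm))              # 310
```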

The control unit 120 may calculate the lane width between the driving lanes based on the lane characteristic points converted into the world coordinates.

The control unit 120 determines whether a lane-damaged section is detected in the image (S14). For example, the control unit 120 may detect, as the lane-damaged section, a section in which lane feature points are not detected in any one of the driving lanes.

FIG. 8 is an exemplary view showing an image 810 including a lane-damaged section for explaining an embodiment of the present invention.

As shown in FIG. 8, when a section 801 in which the lane feature points are not detected occurs in any one of the driving lanes 610, the control unit 120 can detect the section 801 in which the lane feature points are not detected as the lane-damaged section.

The control unit 120 may also detect, as the lane-damaged section, a section in which the lane feature points corresponding to the driving lanes 610 are temporarily not detected.

FIG. 9 is an exemplary view showing an image 910 including another lane-damaged section for explaining an embodiment of the present invention.

As shown in FIG. 9, the controller 120 may detect the sections 801 and 901, in which the lane feature points corresponding to the driving lanes 610 are temporarily not detected, as lane-damaged sections.

If the lane-damaged section is detected, the controller 120 compensates for the lane-damaged section based on the calculated lane width (S15). For example, when the lane-damaged sections 801 and 901 are detected, the controller 120 reads, from the storage unit 140, the lane width between the driving lanes calculated before the detected lane-damaged section, and displays a virtual lane in the lane-damaged section based on the read lane width. That is, when the lane-damaged section 801 or 901 is detected, the controller 120 generates a virtual lane corresponding to the lane width between the driving lanes calculated before the detected lane-damaged section, and displays the generated virtual lane in the lane-damaged section.
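A minimal sketch of this compensation follows. It assumes each driving lane is stored as a per-row lateral position with NaN marking rows where no feature point was detected; this data layout is an assumption of the sketch, not taken from the patent:

```python
import numpy as np

# Lateral position (m) of each driving lane sampled at successive forward
# distances; NaN marks rows where lane feature points were not detected.
left = np.array([-1.8, -1.8, np.nan, np.nan, -1.8])
right = np.array([1.7, 1.7, 1.7, 1.7, 1.7])

# Lane width calculated (and stored) before the damaged section begins.
stored_width = right[0] - left[0]  # 3.5 m

# Compensate: where one lane is missing, offset the other by the stored width.
virtual_left = np.where(np.isnan(left), right - stored_width, left)
virtual_right = np.where(np.isnan(right), left + stored_width, right)
print(virtual_left)  # the NaN rows become -1.8, so the lane is drawn unbroken
```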

FIG. 10 is a diagram illustrating images and lanes displayed on a display unit according to an embodiment of the present invention.

As shown in FIG. 10, when the lane-damaged sections 801 and 901 are detected, the control unit 120 reads, from the storage unit 140, the lane width between the driving lanes calculated before the detected lane-damaged section, and displays a virtual lane in the lane-damaged section based on the read lane width, thereby displaying the driving lane 1010 continuously on the image without interruption and providing the driver with driving convenience.

Meanwhile, the control unit 120 performs functions related to lane keeping (including a lane departure warning message function, an automatic lane keeping function, and the like) based on the detected curve (or lane) and the position of the lane recognition device 10 (or the vehicle equipped with the lane recognition device 10) identified through an arbitrary GPS module (not shown).

Therefore, in the lane recognition apparatus and method according to an embodiment of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed continuously without interruption on the image, thereby providing the driver with driving convenience.

Hereinafter, a lane recognition method according to another embodiment of the present invention will be described in detail with reference to FIGS. 1 and 11.

FIG. 11 is a flowchart illustrating a lane recognition method according to another embodiment of the present invention.

First, the camera module 110 receives a first image and a second image captured through at least one pair of cameras (for example, a stereo camera or a stereoscopic camera) installed at a fixed horizontal interval on the same central axis of the same plane of the lane recognition device 10, or an image captured through a single camera. Here, the first image may be a left image captured by the left camera included in the pair of cameras, and the second image may be a right image captured by the right camera included in the pair of cameras. In addition, the camera module 110 may receive only one of the first image and the second image captured through the pair of cameras.

In step S21, the control unit 120 receives the image 210 through the camera module 110 and extracts a plurality of support points (for example, lane feature points) from the captured image 210 based on a preset guide line for extracting support points.

The control unit 120 converts the extracted plurality of support points into world coordinates. That is, the control unit 120 may convert the extracted plurality of support points into world coordinates using a transformation matrix (for example, a homography matrix) previously stored in the storage unit 140.

The control unit 120 determines (or confirms) whether the plurality of support points converted into world coordinates correspond to a curve, based on a curve equation previously stored in the storage unit 140, and detects the plurality of points corresponding to the curve.

The control unit 120 detects lanes by tracking a plurality of support points converted into the world coordinates, or detects lanes by tracking a plurality of points corresponding to the detected curves.

The control unit 120 detects, among the detected lanes, a first lane closest to the left side of the vehicle and a second lane closest to the right side of the vehicle with respect to the traveling direction of the vehicle, as the driving lanes of the vehicle (S22).

The control unit 120 displays the detected driving lanes on the image. For example, the control unit 120 converts (or maps) the detected driving lanes into coordinates on the image domain, respectively, and overlaps the transformed coordinates with the image 210.

The control unit 120 calculates the lane width between the driving lanes in real time or periodically (S23), and stores the calculated lane width in the storage unit 140. For example, the controller 120 calculates the lane width based on the pixels located between the driving lanes (e.g., the pixels corresponding to one straight line connecting the driving lanes) and a preset distance value for each pixel. Here, each pixel may have the same distance value or a different distance value. That is, assuming that there are 30 pixels located between the two driving lanes and the preset distance value for each pixel is 10 cm, the lane width between the two lanes is 300 cm (10 cm × 30 = 300 cm).

The controller 120 determines whether a lane-damaged section is detected in the image (S24). For example, when a section 801 in which the lane feature points are not detected occurs in any one of the driving lanes 610, the control unit 120 can detect the section 801 in which the lane feature points are not detected as the lane-damaged section. The controller 120 may also detect, as the lane-damaged section, the sections 801 and 901 in which the lane feature points corresponding to the driving lanes 610 are temporarily not detected.

If the lane-damaged section is detected, the controller 120 detects the heading direction of the vehicle (S25). For example, the control unit 120 may receive the heading angle of the vehicle from an ECU (Electronic Control Unit) of the vehicle through a vehicle interface, or a heading angle detection sensor may be further installed in the lane recognition device to detect the heading direction of the vehicle.

The control unit 120 displays a virtual lane in the lane-damaged section based on the calculated lane width and the heading direction of the vehicle, thereby compensating for the lane-damaged section (S26). For example, when the lane-damaged sections 801 and 901 are detected, the controller 120 reads, from the storage unit 140, the lane width between the driving lanes calculated before the detected lane-damaged section, generates a virtual lane corresponding to the read lane width, and displays the virtual lane based on the heading direction of the vehicle.
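A sketch of this variant is given below. It assumes the heading angle arrives in degrees from the ECU and that the virtual lane borders are extrapolated as straight lines along the heading direction from the last detected point and the stored lane width; this geometry is one plausible reading of the step, not the patent's exact procedure:

```python
import numpy as np

heading = np.deg2rad(2.0)            # heading angle from the ECU (assumed)
stored_width = 3.5                   # lane width stored before the damaged section
last_left = np.array([-1.8, 20.0])   # last detected left-border point (x, z) in m

# Unit vector of the heading direction in (lateral x, forward z) coordinates.
direction = np.array([np.sin(heading), np.cos(heading)])

# Extrapolate the left border 1..10 m ahead along the heading, then place the
# right border one stored lane width to the right of it.
steps = np.arange(1.0, 11.0)
virtual_left = last_left + steps[:, None] * direction
virtual_right = virtual_left + np.array([stored_width, 0.0])
print(virtual_left[:3])
print(virtual_right[:3])
```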

Therefore, in the lane recognition apparatus and method according to another embodiment of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section and the heading direction of the vehicle, so that the driving lane can be displayed continuously and accurately on the image without interruption.

As described above, in the lane recognition apparatus and method according to the embodiments of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed continuously on the image without interruption, thereby providing the driver with driving convenience.

In the lane recognition apparatus and method according to embodiments of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section and the heading direction of the vehicle, so that the driving lane can be displayed continuously and accurately on the image without interruption.

It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the embodiments disclosed herein are intended to illustrate rather than limit the technical idea of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents thereof should be construed as falling within the scope of the present invention.

10: lane recognition device 110: camera module
120: control unit 130: display unit
140: storage unit

Claims (15)

A camera module;
A display unit for displaying an image photographed by the camera module;
A controller configured to detect candidate lanes based on lane feature points in an image captured by the camera module, convert the lane feature points into world coordinates, substitute the lane feature points converted into the world coordinates into a previously stored curve equation to determine the presence of a curve and curve information, display the driving lanes of the vehicle among the detected candidate lanes and the curve information on the image, and, when a lane-damaged section is detected in the image, compensate for the lane-damaged section based on a lane width between the driving lanes,
wherein the curve information is determined by one of the previously stored curve equations based on a curve change rate, a curvature of a lane, a heading of the vehicle, and an offset,
and wherein, when the curve change rate and the curvature of the lane are both 0, the lane feature points are recognized as a straight line.
The apparatus of claim 1,
wherein the controller displays a virtual lane in the lane-damaged section based on the lane width calculated before the detected lane-damaged section and the curve information, determines the curve information based on a quadratic curve equation when the curve change rate is 0, and determines the curve information based on a cubic curve equation when the curve change rate is not 0.
The apparatus of claim 1, further comprising
a storage unit for storing information corresponding to the lane width, wherein the controller calculates the lane width in real time or periodically.
The apparatus of claim 1,
wherein the controller converts the lane feature points into world coordinates, and calculates the lane width based on the lane feature points converted into the world coordinates.
The apparatus of claim 1,
wherein the controller detects, as the lane-damaged section, a section in which lane feature points are not detected in any one of the driving lanes.
The apparatus of claim 1,
wherein the controller detects, as the lane-damaged section, a section in which lane feature points corresponding to the driving lanes are temporarily not detected.
The apparatus of claim 1,
wherein, when the lane-damaged section is detected, the controller generates a virtual lane corresponding to the lane width calculated before the detected lane-damaged section, and displays the virtual lane in the lane-damaged section.
The apparatus of claim 2,
wherein, when the lane-damaged section is detected, the controller detects the heading direction of the vehicle, and displays the virtual lane in the lane-damaged section based on the lane width calculated before the lane-damaged section and the heading direction.
The apparatus of claim 8,
wherein the controller generates the virtual lane based on the lane width calculated before the lane-damaged section, and displays the virtual lane on the image based on the heading direction of the vehicle.
A lane recognition method comprising:
detecting candidate lanes based on lane feature points in an image captured by a camera module;
converting the lane feature points into world coordinates;
determining the presence of a curve and curve information of the lane feature points by substituting the lane feature points converted into the world coordinates into a previously stored curve equation;
displaying the driving lanes of the vehicle among the detected candidate lanes and the curve information on the image; and
compensating for the lane-damaged section based on a lane width between the driving lanes when a lane-damaged section is detected in the image,
wherein the curve information is determined by one of the previously stored curve equations based on a curve change rate, a curvature of a lane, a heading of the vehicle, and an offset,
and wherein, when the curve change rate and the curvature of the lane are both 0, the lane feature points are recognized as a straight line.
The method of claim 10, wherein compensating for the lane-damaged section comprises:
displaying a virtual lane in the lane-damaged section based on the lane width calculated before the detected lane-damaged section and the curve information, wherein the curve information is determined based on a quadratic curve equation when the curve change rate is 0, and based on a cubic curve equation when the curve change rate is not 0.
The method of claim 10, further comprising
storing information corresponding to the lane width, wherein the lane width is calculated in real time or periodically.
The method of claim 10, further comprising:
converting the lane feature points into world coordinates; and
calculating the lane width based on the lane feature points converted into the world coordinates.
The method of claim 10,
wherein the lane-damaged section is a section in which lane feature points are not detected in any one of the driving lanes, or a section in which lane feature points corresponding to the driving lanes are temporarily not detected.
The method of claim 11, wherein compensating for the lane-damaged section comprises:
detecting the heading direction of the vehicle when the lane-damaged section is detected; and
displaying the virtual lane in the lane-damaged section based on the lane width calculated before the lane-damaged section and the heading direction.
KR1020110078358A 2011-08-05 2011-08-05 Apparatus for detecting lane and method thereof KR101677640B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020110078358A KR101677640B1 (en) 2011-08-05 2011-08-05 Apparatus for detecting lane and method thereof
PCT/KR2011/009843 WO2013022154A1 (en) 2011-08-05 2011-12-20 Apparatus and method for detecting lane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110078358A KR101677640B1 (en) 2011-08-05 2011-08-05 Apparatus for detecting lane and method thereof

Publications (2)

Publication Number Publication Date
KR20130015981A KR20130015981A (en) 2013-02-14
KR101677640B1 (en) 2016-11-18

Family

ID=47668643

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110078358A KR101677640B1 (en) 2011-08-05 2011-08-05 Apparatus for detecting lane and method thereof

Country Status (2)

Country Link
KR (1) KR101677640B1 (en)
WO (1) WO2013022154A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102268641B1 (en) * 2014-07-10 2021-06-23 현대모비스 주식회사 Around view system and the operating method
CN106364403A (en) * 2016-10-14 2017-02-01 深圳市元征科技股份有限公司 Lane recognizing method and mobile terminal
US11454970B2 (en) 2018-05-21 2022-09-27 Cummins Inc. Adjustment of autonomous vehicle control authority

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010030399A (en) * 2008-07-28 2010-02-12 Nissan Motor Co Ltd Vehicle operation support device and vehicle operation support method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100921427B1 (en) * 2007-12-17 2009-10-14 한국전자통신연구원 Method and Apparatus for generating virtual lane for video based car navigation system
KR101176693B1 (en) * 2008-03-13 2012-08-23 주식회사 만도 Method and System for Detecting Lane by Using Distance Sensor
KR101163446B1 (en) * 2009-03-18 2012-07-18 기아자동차주식회사 A lane departure warning system using a virtual lane and a system according to the same
KR101262921B1 (en) * 2009-11-10 2013-05-09 한국전자통신연구원 Apparatus for keeping a traffic lane for a vehicle and method thereof

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010030399A (en) * 2008-07-28 2010-02-12 Nissan Motor Co Ltd Vehicle operation support device and vehicle operation support method

Also Published As

Publication number Publication date
WO2013022154A1 (en) 2013-02-14
KR20130015981A (en) 2013-02-14

Similar Documents

Publication Publication Date Title
US20140226011A1 (en) Traffic lane recognizing apparatus and method thereof
US10564392B2 (en) Imaging apparatus and focus control method
KR102583682B1 (en) Electronic device and method for dispalying sharing information based on augmented reality
KR101575159B1 (en) Method of operating application for providing parking information to mobile terminal
KR101641251B1 (en) Apparatus for detecting lane and method thereof
US20150262343A1 (en) Image processing device and image processing method
KR101563542B1 (en) Parking information system using mobile terminal
US20160292888A1 (en) Image measurement device, and recording medium
US20220063096A1 (en) Mobile robot and driving method thereof
KR101257871B1 (en) Apparatus and method for detecting object based on vanishing point and optical flow
KR101677640B1 (en) Apparatus for detecting lane and method thereof
KR101612822B1 (en) Apparatus for detecting lane and method thereof
JP2024052899A (en) Communication device and communication method
KR102518535B1 (en) Apparatus and method for processing image of vehicle
KR101658089B1 (en) Method for estimating a center lane for lkas control and apparatus threof
KR101612821B1 (en) Apparatus for tracing lane and method thereof
KR101224090B1 (en) Apparatus and method for detecting nearing car
KR101612817B1 (en) Apparatus and method for tracking car
KR102543742B1 (en) Mobile terminal and computer readable recording medium recording operating method of mobile terminal
US20160379416A1 (en) Apparatus and method for controlling object movement
KR20130015975A (en) Apparatus and method for detecting a vehicle
JP5176523B2 (en) Moving body detection apparatus, moving body detection method, and moving body detection program
KR20140103021A (en) Object recognition device
KR20230005034A (en) Autonomous Vehicle, Control system for remotely controlling the same, and method thereof
KR20130015974A (en) Apparatus and method for detecting object based on optical flow

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20191014

Year of fee payment: 4