WO2013022159A1 - Traffic lane recognizing apparatus and method thereof - Google Patents

Traffic lane recognizing apparatus and method thereof Download PDF

Info

Publication number
WO2013022159A1
WO2013022159A1 · PCT/KR2012/000190
Authority
WO
WIPO (PCT)
Prior art keywords
traffic lane
traffic
feature points
lanes
vehicle
Prior art date
Application number
PCT/KR2012/000190
Other languages
French (fr)
Inventor
Youngkyung Park
Jonghun Kim
Joongjae LEE
Hyunsoo Kim
Junoh PARK
Andreas PARK
Chandra Shekhar DHIR
Original Assignee
Lg Electronics Inc.
Lee, Jeihun
Priority date
Filing date
Publication date
Application filed by Lg Electronics Inc., Lee, Jeihun filed Critical Lg Electronics Inc.
Publication of WO2013022159A1 publication Critical patent/WO2013022159A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W 40/02 Estimation or calculation of driving parameters related to ambient conditions
    • B60W 40/06 Road conditions

Definitions

  • the present invention relates to a traffic lane recognizing apparatus and a method thereof.
  • a traffic lane recognizing apparatus is an apparatus for recognizing a traffic lane included in a certain image input from a camera, or the like, or a certain image received from an external terminal.
  • a related art traffic lane recognizing apparatus is disclosed in Korean Patent Publication Laid Open No. 1995-0017509.
  • a traffic lane recognizing apparatus including: a camera module; a display unit configured to display an image captured by the camera module; and a controller configured to detect candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations, and to display traffic lanes adjacent to a vehicle, as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image.
  • the traffic lane conformance may be previously determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points.
  • the traffic lane conformance may be determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points, the number of the traffic lane feature points, and a range in which the traffic lane feature points are distributed.
  • the controller may detect traffic lanes, whose difference in distance between themselves and the traffic lane feature points is a threshold value or smaller, as the candidate traffic lanes, from among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations.
  • the controller may detect the traffic lane feature points from the captured image, and sequentially substitute the different traffic lane equations to the detected traffic lane feature points to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
  • the controller may detect traffic lane feature points from the captured image, convert the traffic lane feature points into world coordinates, and sequentially substitute the traffic lane feature points which have been converted into the world coordinates to the different traffic lane equations, to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
  • a traffic lane recognizing method including: displaying an image captured by a camera module on a display unit; detecting candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations; and displaying traffic lanes adjacent to a vehicle, as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image.
  • FIG. 1 is a schematic block diagram showing the configuration of a traffic lane recognizing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a view showing an image captured by a camera according to an embodiment of the present invention.
  • FIG. 3 is a view showing guide lines according to an embodiment of the present invention.
  • FIG. 4 is a view showing feature points of traffic lanes according to an embodiment of the present invention.
  • FIG. 5 is a view showing feature points of traffic lanes converted into world coordinates according to an embodiment of the present invention.
  • FIG. 6 is a view showing travel traffic lanes according to an embodiment of the present invention.
  • FIG. 7 is a view showing an image and traffic lanes displayed on a display unit according to an embodiment of the present invention.
  • FIG. 8 is a flow chart illustrating a process of a traffic lane recognizing method according to an embodiment of the present invention.
  • while terms such as "first" and "second" may be used to describe various components, such components must not be understood as being limited to the above terms. The above terms are used only to distinguish one component from another. For example, a first component may be referred to as a second component without departing from the scope of rights of the present invention, and likewise a second component may be referred to as a first component.
  • the traffic lane recognizing apparatus illustrated in FIG. 1 may be configured as a stand alone device or may be applicable to various terminals such as a mobile terminal, a telematics terminal, a smart phone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a tablet PC, a Wibro terminal, a navigation terminal, an AVN (Audio Video Navigation) terminal, and the like.
  • FIG. 1 is a schematic block diagram showing the configuration of a traffic lane recognizing apparatus according to an embodiment of the present invention.
  • a traffic lane recognizing apparatus 10 includes a camera module 110, a display unit 130 configured to display an image captured by the camera module 110, and a controller 120 configured to detect candidate traffic lanes based on traffic lane conformance from among traffic lanes detected by substituting different traffic lane equations (or curve equations) with respect to the image captured by the camera module 110, and to display traffic lanes adjacent to a vehicle (e.g., a traffic lane adjacent to the left side of the vehicle and a traffic lane adjacent to the right side of the vehicle), as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image.
  • the traffic lane conformance may be previously determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points.
  • the controller 120 may detect traffic lanes, whose difference in distance between themselves and the traffic lane feature points is a threshold value or smaller, as the candidate traffic lanes, from among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations.
  • the controller 120 may detect traffic lane feature points (support points) from the captured image, and sequentially (e.g., in order from a low-order traffic lane equation (a linear traffic lane equation) to a high-order traffic lane equation (a cubic traffic lane equation) or in order from the high-order traffic lane equation (the cubic traffic lane equation) to the low-order traffic lane equation (the linear traffic lane equation)) substitute different traffic lane equations for detecting a traffic lane (a straight line and/or a curved line) to the detected traffic lane feature points, to thereby detect traffic lanes whose difference in distance is smaller than the threshold value (or traffic lanes having a high traffic lane conformance (e.g., 90% or more)), as the candidate traffic lanes, from among the detected traffic lanes.
  • the components of the traffic lane recognizing apparatus 10 illustrated in FIG. 1 are not all essential; the traffic lane recognizing apparatus 10 may be implemented with more or fewer components.
  • FIG. 2 is a view showing an image captured by a camera according to an embodiment of the present invention.
  • the camera module 110 receives an image 210 captured by a single camera.
  • for example, the camera module 110 may receive an image 210 including traffic lanes corresponding to a first lane, a second lane, a third lane, and the like.
  • the controller 120 receives the image 210 through the camera module 110, and extracts feature points of traffic lanes within the captured image 210 based on pre-set guide lines for extracting the support points (e.g., traffic lane feature points) from the image 210.
  • as for the guide lines, as shown in FIG. 3, when a lower portion of the image 210 is converted into world coordinates, it represents a nearby region, and when the middle and upper portions of the image 210 are converted into world coordinates, they represent a distant region.
  • the interval between the guide lines 310 is set to be large at the lower portion of the image 210, and is set to be gradually narrowed toward the upper portion of the image 210.
  • the change width of the interval between the guide lines 310 may be variably set according to a design of a designer, and may be set so as to maintain an equal interval between lines when the data of the image 210 is converted into world coordinates.
  • a guide line refers to a virtual line used to obtain point intervals as uniform as possible when the support points (feature points) are converted into the world coordinates, rather than being a line actually displayed on the image.
  • the controller 120 extracts a plurality of traffic lane feature points from the image 210 based on the pre-set guide lines, and displays the plurality of extracted traffic lane feature points 410 on an image domain of the display unit 130. Namely, the controller 120 displays the traffic lane feature points corresponding to the traffic lanes 401 on the image domain.
  • the interval, in the vertical direction based on the horizontal axis (x axis), between the plurality of traffic lane feature points is gradually narrowed from the lower side toward the upper side of the display unit 130.
  • the controller 120 converts the plurality of extracted traffic lane feature points into world coordinates. Namely, the controller 120 may convert the plurality of extracted traffic lane feature points into world coordinates by using a conversion matrix (e.g., a homographic matrix or the like) previously stored in the storage unit 140.
  • the controller 120 converts the plurality of extracted traffic lane feature points into world coordinates based on the homographic matrix previously stored in the storage unit 140, and displays a plurality of traffic lane feature points 510 which have been converted into the world coordinates on the display unit 130.
  • the interval between the plurality of traffic lane feature points which have been converted into the world coordinates in the vertical direction is maintained to be equal.
  • the controller 120 may detect (or check) the plurality of traffic lane feature points corresponding to a curve among the traffic lane feature points which have been converted into the world coordinates, based on the traffic lane feature points which have been converted into the world coordinates and the linear, quadratic, cubic, or n-th order equations previously stored in the storage unit 140. Namely, the controller 120 may sequentially substitute the plurality of traffic lane feature points which have been converted into the world coordinates to the curve equations previously stored in the storage unit 140 and determine (or check) whether the plurality of traffic lane feature points which have been converted into the world coordinates make a curve based on the substitution results.
  • the curve equation may be a linear, quadratic, cubic, or n-th order equation.
  • when c1 is 0, the controller 120 recognizes the plurality of traffic lane feature points as a straight line; when c1 is not 0, the controller 120 recognizes the plurality of traffic lane feature points as a curve (traffic lane detection).
  • the linear traffic lane equation may be used to detect a traffic lane on a general straight road.
  • when c2 is 0, the controller 120 recognizes the plurality of traffic lane feature points as a straight line; when c2 is not 0, the controller 120 recognizes the plurality of traffic lane feature points as a curve (traffic lane detection).
  • the quadratic traffic lane equation may be used to detect a traffic lane on a curved road in which a curvature radius of a traffic lane is large and a curvature change is uniform.
  • the cubic curve equation may be used to detect a traffic lane on a curved road in which a curvature radius of a traffic lane is small and a curvature change is not uniform.
  • c1 may represent the angle between the direction of the traffic lane and the direction of the vehicle, i.e., the direction of the traffic lane with respect to the vehicle.
  • an n-th order equation may be used to detect a traffic lane on a curved road which can hardly be modeled by an (n-1)th order equation.
  • the controller 120 detects a traffic lane by substituting the traffic lane feature points which have been converted into the world coordinates to the curve equations (e.g., the linear equation) previously stored in the storage unit 140, and determines whether or not a difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value (reference distance value or a traffic lane conformance).
  • the controller 120 displays the traffic lane detected according to the linear equation, on the image.
  • the controller 120 substitutes the traffic lane feature points which have been converted into the world coordinates to the curve equations (e.g., the quadratic equation) previously stored in the storage unit 140, and determines whether or not the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller, or equal to or greater than the pre-set threshold value (reference distance value or a traffic lane conformance).
  • the controller 120 displays the traffic lane detected according to the quadratic equation, on the image.
  • the controller 120 substitutes the traffic lane feature points which have been converted into the world coordinates to the curve equations (e.g., the cubic equation) previously stored in the storage unit 140, and determines whether or not the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller, or equal to or greater than the pre-set threshold value (reference distance value or a traffic lane conformance).
  • the controller 120 displays the traffic lane detected according to the cubic equation, on the image.
  • the controller 120 may detect a traffic lane having high traffic lane conformance from among the curve equations (e.g., the linear, the quadratic, and the cubic equations) previously stored in the storage unit 140.
  • the traffic lane conformance may be calculated in various manners; for example, it may be calculated through a weighted sum of squares according to Equation 1 shown below (a runnable sketch of this computation appears after this list):

$$E = \sum_{i=1}^{n} w_i \left( y_i - \hat{y}_i \right)^2 \qquad \text{(Equation 1)}$$

  • where n is the number of traffic lane feature points used for traffic lane detection, y_i is the y value of the i-th traffic lane feature point, \hat{y}_i is the y value of the fitted traffic lane at the same point, and w_i is a weight value with respect to each traffic lane feature point.
  • the controller may additionally use the number of the traffic lane feature points, the range in which the traffic lane feature points are distributed, and the like, in order to determine the conformance of a model on the current road.
  • the controller 120 may detect the candidate traffic lanes by substituting traffic lane feature points whose distribution range is equal to or greater than a pre-set range to the different traffic lane equations.
  • the pre-set reference number and the pre-set range may be variably defined by a designer.
  • the controller 120 may detect traffic lanes in real time by tracking traffic lane feature points substituted to the curve equations (e.g., the linear equation) previously stored in the storage unit 140, and display the detected traffic lanes.
  • the controller 120 may also calculate the curve information that follows a virtual central point of a traffic lane based on the plurality of points corresponding to the detected traffic lane (a straight line or a curve).
  • the calculated curve information may be used to enhance traffic lane maintaining performance on the world coordinates by minimizing the influence of a calibration state of the camera.
  • the controller 120 may calculate curve information following the central point of the traffic lane by applying any one of a least square method, random sample consensus (RANSAC), a generalized Hough transform method, a spline interpolation method, and the like, with respect to the plurality of points corresponding to the detected curve.
  • the controller 120 may overlap the calculated curve information following the central point of the traffic lane, the information such as the detected curve, or the like, with the captured image, and display the same on the display unit 130.
  • the controller 120 may convert (or map) the calculated curve information following the central point of the traffic lane and the detected curve information, or the like, into coordinates on an image domain, respectively, overlap the respective converted coordinates with the captured image, and display the same on the display unit 130.
  • FIG. 6 is a view showing travel traffic lanes according to an embodiment of the present invention.
  • the controller 120 may select a first traffic lane most adjacent to the left side of the vehicle and a second traffic lane most adjacent to the right side of the vehicle based on a moving direction of the vehicle from the detected candidate traffic lanes, and display the first traffic lane and the second traffic lane as travel traffic lanes 610 of the vehicle on the image 210.
  • the controller 120 may convert (or map) the detected first traffic lane and the second traffic lane 610 into coordinates on the image domain, and overlap the respective converted coordinates with the image 210.
  • FIG. 7 is a view showing travel traffic lanes displayed on an image according to an embodiment of the present invention.
  • the controller 120 may select a first traffic lane most adjacent to the left side of the vehicle and a second traffic lane most adjacent to the right side of the vehicle based on a moving direction of the vehicle from the detected candidate traffic lanes, detect the first traffic lane and the second traffic lane as travel traffic lanes 610 of the vehicle, and overlap the travel traffic lanes 610 with the image.
  • the camera module 110 may include at least a pair of cameras (e.g., a stereo camera, a stereoscopic camera), installed to be spaced apart horizontally on the same plane of the traffic lane recognizing apparatus 10, or a single camera.
  • the fixed horizontal interval may be set in consideration of the distance between an ordinary person's two eyes.
  • the camera module 110 may be any camera module that can capture an image.
  • the camera module 110 may receive a first image (e.g., a left image captured by a left camera included in the pair of cameras) and a second image (e.g., a right image captured by a right camera included in the pair of cameras) which are simultaneously captured by the pair of cameras.
  • the camera module 110 may be an image sensor such as a charge-coupled device (CCD), a complementary metal oxide semiconductor (CMOS), or the like.
  • the camera module 110 may be fixed to a certain position (e.g., a room mirror of the vehicle) of the vehicle to capture an image of a front side in the traveling direction of the vehicle.
  • the camera module 110 may be fixedly installed at certain positions (e.g., a side mirror of the vehicle, a rear bumper of the vehicle) in order to capture images of the side and the rear side of the vehicle.
  • the controller 120 performs functions (including a traffic lane deviation warning message function, an automatic traffic lane maintaining function, and the like) in relation to maintaining a traffic lane based on the position of the traffic lane recognizing apparatus 10 (or a vehicle including the traffic lane recognizing apparatus) and the detected curve (or the traffic lane) checked through a certain GPS module (not shown).
  • the display unit 130 displays various contents such as various menu screen images, or the like, by using a user interface and/or a graphic user interface included in the storage unit 140 under the control of the controller 120.
  • the contents displayed on the display unit 130 include menu screen images such as various text or image data (including various information data) and data such as an icon, a list menu, a combo box, and the like.
  • the display unit 130 includes a 3D display or a 2D display. Also, the display unit 130 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, and an LED (Light Emitting Diode).
  • the display unit 130 displays the 3D image (or a 2D image) under the control of the controller 120.
  • the traffic lane recognizing apparatus 10 may include two or more display units 130 according to its particular desired embodiment.
  • a plurality of display units may be separately or integrally disposed on a single face (the same surface) of the traffic lane recognizing apparatus 10, or may be disposed on mutually different faces of the traffic lane recognizing apparatus 10.
  • when the display unit 130 and a sensor sensing a touch operation (referred to as a 'touch sensor', hereinafter) are overlaid in a layered manner (referred to as a 'touch screen', hereinafter), the display unit 130 may function as both an input device and an output device.
  • the touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, a touch panel, and the like.
  • the touch sensor may be configured to convert the pressure applied to a particular portion of the display unit 130, or a change in capacitance generated at a particular portion of the display unit 130, into an electrical input signal. Also, the touch sensor may be configured to detect a touch input pressure as well as a touch input position and a touch input area. When there is a touch input with respect to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and transmits corresponding data to the controller 120. Accordingly, the controller 120 can recognize a touched region of the display unit 130.
  • the display unit 130 may include a proximity sensor.
  • the proximity sensor may be disposed in an internal region of the traffic lane recognizing apparatus 10 covered by the touch screen or in the vicinity of the touch screen.
  • the proximity sensor refers to a sensor for detecting the presence or absence of an object approaching a certain detection surface, or an object existing nearby, by using electromagnetic force or infrared rays without any mechanical contact.
  • the proximity sensor has a longer life span compared with a contact type sensor, and it can be utilized for various purposes.
  • Examples of the proximity sensor may include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror-reflection type photoelectric sensor, an RF oscillation type proximity sensor, a capacitance type proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and the like.
  • when the touch screen is an electrostatic type touch screen, an approach of the pointer is detected based on a change in an electric field according to the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
  • recognition of a pointer positioned to be close to the touch screen without being in contact with it may be called a 'proximity touch'.
  • recognition of actual contacting of the pointer on the touch screen may be called a 'contact touch'.
  • when the pointer is in the state of the proximity touch, it means that the pointer is positioned to correspond vertically to the touch screen.
  • the proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, or the like), and information corresponding to the detected proximity touch operation and the proximity touch pattern can be outputted to the touch screen.
  • when the display unit 130 is used as an input device, it may receive a user's button manipulation or receive a command or a control signal generated according to a manipulation such as touching/scrolling a displayed screen image.
  • the traffic lane recognizing apparatus 10 may include the storage unit 140 storing a program for detecting the traffic lane from the image, traffic lane information calculated periodically or in real time, and the like.
  • the storage unit 140 may further store various menu screen images, user interfaces (UIs), and/or graphic user interfaces (GUIs).
  • the storage unit 140 may further store mathematical equations such as a conversion matrix (e.g., homographic matrix, and the like), a curve equation, the least square method, and the like.
  • the storage unit 140 may further store data, programs, and the like, required for operating the traffic lane recognizing apparatus 10.
  • the storage unit 140 may include at least one type of storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a magnetic memory, a magnetic disk, and an optical disk.
  • the traffic lane recognizing apparatus 10 may further include a communication unit (not shown) performing a communication function with a certain terminal or a server under the control of the controller 120.
  • the communication unit may include a wired/wireless communication module.
  • a wireless Internet technique may include a wireless local area network (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE802.16, long-term evolution (LTE), a wireless mobile broadband service (WMBS), and the like
  • short-range communication technologies include Bluetooth™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee™, and the like.
  • the wired communication technique may include USB (Universal Serial Bus) communication, and the like.
  • the communication unit may include CAN communication, vehicle Ethernet, FlexRay, LIN (Local Interconnect Network), and the like, for communication with a certain vehicle in which the traffic lane recognizing apparatus 10 is provided.
  • under the control of the controller 120, the communication unit may transmit, to the certain terminal or server, curve information or the like that follows a central point of a traffic lane calculated based on a plurality of support points extracted from a certain image, points obtained by converting the plurality of support points into world coordinates, a plurality of points corresponding to a curve among the points which have been converted into the world coordinates, and a plurality of curves corresponding to the curve.
  • the communication unit may receive a first image and a second image, which were simultaneously captured by a pair of stereo cameras, transmitted from the certain terminal or server.
  • the traffic lane recognizing apparatus 10 may further include an input unit (not shown) including one or more microphones (not shown) for receiving an audio signal.
  • the microphone receives an external audio signal (including a user's voice (voice signal or voice information)) in a phone call mode, a recording mode, a voice recognition mode, and the like, and processes the audio signal into electrical voice data.
  • the voice data processed by the microphone may be output through a voice output unit (not shown), or may be converted into a transmittable format and output to an external terminal through the communication unit.
  • the microphone may implement various types of noise canceling algorithms to cancel noise generated in a procedure of receiving the external audio signal.
  • the input unit receives a signal according to a user's button manipulation or receives a command or a control signal generated according to a manipulation such as touching/scrolling a displayed screen image.
  • the input unit receives a signal corresponding to information input by the user; as the input unit, various devices such as a keyboard, a keypad, a dome switch, a touch pad (pressure/capacitance), a touch screen, a jog shuttle, a jog wheel, a jog switch, a mouse, a stylus pen, a touch pen, a laser pointer, and the like, may be used.
  • the input unit receives signals corresponding to inputs by various devices.
  • the traffic lane recognizing apparatus 10 may further include a voice output unit (not shown) outputting voice information included in the signal processed by the controller 120.
  • the voice output unit may be a speaker.
  • support points (feature points) as a candidate group of a traffic lane are extracted from an image and converted into world coordinates, and a traffic lane is recognized on the converted world coordinates.
  • information regarding a traffic lane recognized from the world coordinates is displayed, based on which a warning message is generated and output, thereby enhancing accuracy/sensitivity and user convenience.
  • FIG. 8 is a flow chart illustrating a process of a traffic lane recognizing method according to an embodiment of the present invention.
  • the camera module 110 receives a first image and a second image captured by at least a pair of cameras (e.g., a stereo camera or a stereoscopic camera) installed separately by a horizontal interval on the same central axis of the same surface of the traffic lane recognizing apparatus 10, or receives an image captured by a single camera.
  • the first image may be a left image captured by a left camera included in the pair of cameras and the second image may be a right image captured by a right camera included in the pair of cameras.
  • the camera module 110 may receive any one of the first image and the second image captured by the pair of cameras.
  • the camera module 110 receives the image 210 captured by a single camera.
  • the camera module 110 receives an image 310 captured by a single camera.
  • the camera module 110 may receive the image 210 including traffic lanes corresponding to a first lane, a second lane, a third lane, and the like, and/or double lines of white or yellow solid lines (or double lines of white or yellow solid lines and dotted lines).
  • the controller 120 receives the image 210 through the camera module 110 (S11), and extracts a plurality of feature points of a traffic lane from the captured image 210 based on pre-set guide lines for detecting (extracting) traffic lane feature points from the image 210.
  • as for the guide lines, as shown in FIG. 3, when a lower portion of the image 210 is converted into world coordinates, it represents a nearby region, and when the middle and upper portions of the image 210 are converted into world coordinates, they represent a distant region.
  • the lower portion of the image 210 is set to have a wider interval between lines and the interval between lines of the guide lines 310 is gradually narrowed toward the upper portion of the image 210.
  • a change width of the interval between lines of the guide lines 310 may be variably set according to a design of a designer and may be set to maintain the equal interval between lines when the data of the image 210 is converted into world coordinates.
  • the guide line refers to a virtual line used to obtain a point interval as uniform as possible when the traffic lane feature points are converted into the world coordinates, rather than being actually displayed on the image.
  • the controller 120 extracts a plurality of traffic lane feature points from the image 210 based on the pre-set guide lines, and displays the plurality of extracted traffic lane feature points 410 on an image domain of the display unit 130. Namely, the controller 120 displays the traffic lane feature points corresponding to the traffic lanes 401 on the image domain.
  • the interval, in the vertical direction based on the horizontal axis (x axis), between the plurality of traffic lane feature points is gradually narrowed from the lower side toward the upper side of the display unit 130.
  • the controller 120 converts the plurality of extracted traffic lane feature points into world coordinates. Namely, the controller 120 may convert the plurality of extracted traffic lane feature points into world coordinates by using a conversion matrix (e.g., a homographic matrix or the like) previously stored in the storage unit 140.
  • the controller 120 converts the plurality of extracted traffic lane feature points into world coordinates based on the homographic matrix previously stored in the storage unit 140, and displays a plurality of traffic lane feature points 510 which have been converted into the world coordinates on the display unit 130.
  • the interval between the plurality of traffic lane feature points which have been converted into the world coordinates in the vertical direction is maintained to be equal.
  • the controller 120 detects traffic lanes based on the plurality of traffic lane feature points which have been converted into the world coordinates and the curve equation previously stored in the storage unit 140 (S12). Namely, the controller 120 may detect the traffic lanes by substituting the plurality of traffic lane feature points which have been converted into the world coordinates to the curve equations previously stored in the storage unit 140 and determining (or checking) whether the plurality of traffic lane feature points which have been converted into the world coordinates make a straight line or a curve based on the substitution result.
  • the curve equation may be a linear equation, a quadratic equation, a cubic equation, or a higher-order equation.
  • the controller 120 detects the candidate traffic lanes from the detected traffic lanes based on the traffic lane conformance (S13). For example, the controller 120 detects traffic lanes having a high traffic lane conformance (e.g., traffic lanes having a small difference in distance) as the candidate traffic lanes, from among the detected traffic lanes. Also, among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations, the controller may detect traffic lanes, whose difference in distance between themselves and the traffic lane feature points is smaller than a threshold value, as the candidate traffic lanes.
  • the controller 120 first detects a traffic lane by substituting the traffic lane feature points which have been converted into the world coordinates to the linear equation previously stored in the storage unit 140. If the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is equal to or greater than the pre-set threshold value (reference distance value), the controller 120 detects the traffic lane by substituting the traffic lane feature points which have been converted into the world coordinates to the quadratic equation previously stored in the storage unit 140. If the difference in distances between the traffic lane feature points and the traffic lane obtained from the quadratic equation substitution result is equal to or greater than the pre-set threshold value, the controller 120 detects a traffic lane by substituting the traffic lane feature points which have been converted into the world coordinates to the cubic equation previously stored in the storage unit 140.
  • the controller 120 may determine, as the candidate traffic lanes, the traffic lanes for which the difference in distances between the traffic lanes according to the linear to cubic traffic lane equations and the traffic lane feature points which have been converted into the world coordinates is the smallest.
  • the controller 120 determines a first traffic lane most adjacent to the left side of the vehicle and a second traffic lane most adjacent to the right side of the vehicle based on the direction in which the vehicle is traveling, as travel traffic lanes of the vehicle, among the detected candidate traffic lanes (S14).
  • the controller 120 displays the detected travel traffic lanes on the image (S15). For example, the controller 120 converts (or maps) the detected travel traffic lanes into coordinates on the image domain, respectively, and overlaps the respective converted coordinates on the image 210.
  • as described above, candidate traffic lanes are detected based on traffic lane conformance from among traffic lanes detected based on the traffic lane feature points detected from an image captured by a camera module and different traffic lane equations; thus, traffic lanes can be accurately detected.
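To make the conformance measure of Equation 1 concrete, here is a minimal Python (NumPy) sketch of the weighted sum-of-squares error. The function name and the uniform default weights are illustrative assumptions, not taken from the patent; the fitted lane equation is assumed to supply the comparison y values.

```python
import numpy as np

def lane_conformance_error(y_points, y_fit, weights=None):
    """Weighted sum of squares of Equation 1: compare the y values of the n
    traffic lane feature points with the fitted lane's y values; a smaller
    error means a higher traffic lane conformance.
    Uniform default weights are an assumption."""
    y_points, y_fit = np.asarray(y_points), np.asarray(y_fit)
    w = np.ones_like(y_points) if weights is None else np.asarray(weights)
    return float(np.sum(w * (y_points - y_fit) ** 2))

# Hypothetical example: three feature points against a fitted lane
print(lane_conformance_error([1.0, 1.1, 0.9], [1.0, 1.0, 1.0]))   # ~0.02
```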

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is a traffic lane recognizing apparatus and method capable of accurately recognizing a travel traffic lane of a vehicle. The traffic lane recognizing apparatus includes: a camera module; a display unit configured to display an image captured by the camera module; and a controller configured to detect candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations, and to display traffic lanes adjacent to a vehicle, as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image.

Description

TRAFFIC LANE RECOGNIZING APPARATUS AND METHOD THEREOF
The present invention relates to a traffic lane recognizing apparatus and a method thereof.
In general, a traffic lane recognizing apparatus is an apparatus for recognizing a traffic lane included in a certain image input from a camera, or the like, or a certain image received from an external terminal. A related art traffic lane recognizing apparatus is disclosed in Korean Patent Publication Laid Open No. 1995-0017509.
According to an aspect of the present invention, there is provided a traffic lane recognizing apparatus including: a camera module; a display unit configured to display an image captured by the camera module; and a controller configured to detect candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations, and to display traffic lanes adjacent to a vehicle, as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image.
In an example related to the present disclosure, the traffic lane conformance may be previously determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points.
In an example related to the present disclosure, the traffic lane conformance may be determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points, the number of the traffic lane feature points, and a range in which the traffic lane feature points are distributed.
In an example related to the present disclosure, the controller may detect traffic lanes, whose difference in distance between themselves and the traffic lane feature points is a threshold value or smaller, as the candidate traffic lanes, from among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations.
In an example related to the present disclosure, the controller may detect the traffic lane feature points from the captured image, and sequentially substitute the different traffic lane equations to the detected traffic lane feature points to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
In an example related to the present disclosure, the controller may detect traffic lane feature points from the captured image, convert the traffic lane feature points into world coordinates, and sequentially substitute the traffic lane feature points which have been converted into the world coordinates to the different traffic lane equations, to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
According to another aspect of the present invention, there is provided a traffic lane recognizing method including: displaying an image captured by a camera module on a display unit; detecting candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations; and displaying traffic lanes adjacent to a vehicle, as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image.
FIG. 1 is a schematic block diagram showing the configuration of a traffic lane recognizing apparatus according to an embodiment of the present invention.
FIG. 2 is a view showing an image captured by a camera according to an embodiment of the present invention.
FIG. 3 is a view showing guide lines according to an embodiment of the present invention.
FIG. 4 is a view showing feature points of traffic lanes according to an embodiment of the present invention.
FIG. 5 is a view showing feature points of traffic lanes converted into world coordinates according to an embodiment of the present invention.
FIG. 6 is a view showing travel traffic lanes according to an embodiment of the present invention.
FIG. 7 is a view showing an image and traffic lanes displayed on a display unit according to an embodiment of the present invention.
FIG. 8 is a flow chart illustrating a process of a traffic lane recognizing method according to an embodiment of the present invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Unless otherwise defined, all terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention pertains, and should not be interpreted as having an excessively comprehensive meaning nor as having an excessively contracted meaning. If a technical term used herein fails to accurately express the technical idea of the present invention, it should be replaced with a technical term that allows a person skilled in the art to properly understand it. The general terms used herein should be interpreted according to definitions in dictionaries or according to context, and should not be interpreted as having an excessively contracted meaning.
In the present application, it is to be understood that the terms such as "including" or "having," etc., are intended to indicate the existence of the features, numbers, operations, actions, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, operations, actions, components, parts, or combinations thereof may exist or may be added.
While terms such as "first" and "second," etc., may be used to describe various components, such components must not be understood as being limited to the above terms. The above terms are used only to distinguish one component from another. For example, a first component may be referred to as a second component without departing from the scope of rights of the present invention, and likewise a second component may be referred to as a first component.
The exemplary embodiments of the present invention will now be described with reference to the accompanying drawings, in which like numbers refer to like elements throughout.
In describing the present invention, if a detailed explanation of a related known function or construction is considered to unnecessarily divert from the gist of the present invention, such explanation has been omitted but would be understood by those skilled in the art. The accompanying drawings are intended to facilitate understanding of the present invention, and the present invention should not be construed as being limited to them.
Hereinafter, the configuration of a traffic lane recognizing apparatus according to an embodiment of the present invention will be described with reference to FIG. 1. Here, the traffic lane recognizing apparatus illustrated in FIG. 1 may be configured as a stand alone device or may be applicable to various terminals such as a mobile terminal, a telematics terminal, a smart phone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a tablet PC, a Wibro terminal, a navigation terminal, an AVN (Audio Video Navigation) terminal, and the like.
FIG. 1 is a schematic block diagram showing the configuration of a traffic lane recognizing apparatus according to an embodiment of the present invention.
As shown in FIG. 1, a traffic lane recognizing apparatus 10 according to an embodiment of the present invention includes a camera module 110, a display unit 130 configured to display an image captured by the camera module 110, and a controller 120 configured to detect candidate traffic lanes based on traffic lane conformance from among traffic lanes detected by substituting different traffic lane equations (or curve equations) with respect to the image captured by the camera module 110, and to display traffic lanes adjacent to a vehicle (e.g., a traffic lane adjacent to the left side of the vehicle and a traffic lane adjacent to the right side of the vehicle), as current traffic lanes along which the vehicle is traveling (or moving), among the detected candidate traffic lanes, on the image. The traffic lane conformance may be previously determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points.
The controller 120 may detect traffic lanes, whose difference in distance between themselves and the traffic lane feature points is a threshold value or smaller, as the candidate traffic lanes, from among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations. For example, the controller 120 may detect traffic lane feature points (support points) from the captured image, and sequentially (e.g., in order from a low-order traffic lane equation (a linear traffic lane equation) to a high-order traffic lane equation (a cubic traffic lane equation) or in order from the high-order traffic lane equation (the cubic traffic lane equation) to the low-order traffic lane equation (the linear traffic lane equation)) substitute different traffic lane equations for detecting a traffic lane (a straight line and/or a curved line) to the detected traffic lane feature points, to thereby detect traffic lanes whose difference in distance is smaller than the threshold value (or traffic lanes having a high traffic lane conformance (e.g., 90% or more)), as the candidate traffic lanes, from among the detected traffic lanes.
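As a rough illustration of this sequential substitution, the sketch below (Python with NumPy) fits lane equations of increasing order to world-coordinate feature points and accepts the first fit whose mean point-to-lane distance falls below a threshold. The function name, threshold value, and sample points are hypothetical, and the patent's exact conformance measure may differ; this is a minimal sketch, not the claimed implementation.

```python
import numpy as np

def detect_candidate_lane(xs, ys, threshold=0.15, max_order=3):
    """Substitute feature points into lane equations from low order (linear)
    to high order (cubic); keep the first fit whose mean point-to-lane
    distance is below the threshold, i.e., whose conformance is high."""
    for order in range(1, max_order + 1):
        coeffs = np.polyfit(xs, ys, order)               # fit an order-n lane equation
        residuals = np.abs(np.polyval(coeffs, xs) - ys)  # point-to-lane distances
        if residuals.mean() < threshold:                 # distance check vs. threshold
            return order, coeffs
    return None                                          # no candidate lane found

# Hypothetical world-coordinate feature points along a gently curving lane
xs = np.linspace(2.0, 30.0, 15)
ys = 0.005 * xs**2 + 0.01 * xs + 1.8
print(detect_candidate_lane(xs, ys))   # the quadratic equation is accepted here
```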
The components of the traffic lane recognizing apparatus 10 illustrated in FIG. 1 are not all essential; the traffic lane recognizing apparatus 10 may be implemented with more or fewer components.
FIG. 2 is a view showing an image captured by a camera according to an embodiment of the present invention.
As shown in FIG. 2, the camera module 110 receives an image 210 captured by a single camera. For example, the camera module 110 may receive an image 210 including traffic lanes corresponding to a first lane, a second lane, a third lane, and the like.
The controller 120 receives the image 210 through the camera module 110, and extracts feature points of traffic lanes within the captured image 210 based on pre-set guide lines for extracting the support points (e.g., traffic lane feature points) from the image 210. As for the guide lines, as shown in FIG. 3, a lower portion of the image 210, when converted into world coordinates, represents a near region, while middle and upper portions of the image 210, when converted into world coordinates, represent a distant region. Thus, in order to obtain point intervals as uniform as possible when the data of the image 210 is converted into the world coordinates, the interval between the guide lines 310 is set to be large at the lower portion of the image 210 and to narrow gradually toward the upper portion of the image 210. The change width of the interval between the guide lines 310 may be variably set according to a designer's design, and may be set so that the interval between lines becomes equal when the data of the image 210 is converted into world coordinates. A guide line is a virtual line used to obtain point intervals as uniform as possible when the support points (feature points) are converted into the world coordinates, rather than a line actually displayed on the image.
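Why uniformly spaced world distances produce guide lines that crowd toward the top of the image can be seen from a simple flat-ground pinhole model. The sketch below is an illustrative assumption (the focal length, camera height, and horizon row values are invented and not taken from the embodiment):

    import numpy as np

    def guide_line_rows(f=700.0, h=1.4, v0=200.0, d_near=5.0, d_far=50.0, n=10):
        # Under the flat-ground pinhole assumption, a ground point at distance d
        # projects to image row v = v0 + f * h / d (v0: horizon row, f: focal
        # length in pixels, h: camera height in meters). Distances spaced
        # uniformly in world coordinates therefore give guide lines whose image
        # spacing is wide at the bottom and narrows toward the horizon.
        distances = np.linspace(d_near, d_far, n)
        return v0 + f * h / distances

    print(np.round(guide_line_rows()))   # e.g. [396. 298. 265. ...], crowding toward v0

The computed rows are widely spaced at the bottom of the image and become progressively closer toward the horizon row v0, matching the guide line spacing described above.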
As shown in FIG. 4, the controller 120 extracts a plurality of traffic lane feature points from the image 210 based on the pre-set guide lines, and displays the plurality of extracted traffic lane feature points 410 on an image domain of the display unit 130. Namely, the controller 120 displays the traffic lane feature points corresponding to the traffic lanes 401 on the image domain. Here, the vertical interval between the plurality of traffic lane feature points, measured with respect to the horizontal axis (x axis), narrows gradually from the lower side to the upper side of the display unit 130.
The controller 120 converts the plurality of extracted traffic lane feature points into world coordinates. Namely, the controller 120 may convert the plurality of extracted traffic lane feature points into world coordinates by using a conversion matrix (e.g., a homographic matrix or the like) previously stored in the storage unit 140.
For example, as shown in FIG. 5, the controller 120 converts the plurality of extracted traffic lane feature points into world coordinates based on the homographic matrix previously stored in the storage unit 140, and displays the plurality of traffic lane feature points 510 which have been converted into the world coordinates on the display unit 130. Here, with respect to the horizontal axis, the vertical interval between the plurality of traffic lane feature points which have been converted into the world coordinates is maintained equal.
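The conversion itself is a standard planar homography. In the minimal sketch below, H is a hypothetical 3x3 conversion (homographic) matrix standing in for the matrix described as being stored in the storage unit 140; its values are illustrative only:

    import numpy as np

    # Hypothetical conversion (homographic) matrix; values for illustration.
    H = np.array([[0.02, 0.0,  -6.4],
                  [0.0,  0.05, -12.0],
                  [0.0,  0.001,  1.0]])

    def image_to_world(points_uv, H):
        # Append the homogeneous coordinate, apply H, then dehomogenize.
        pts = np.hstack([points_uv, np.ones((len(points_uv), 1))])
        mapped = pts @ H.T
        return mapped[:, :2] / mapped[:, 2:3]

    feature_points = np.array([[320.0, 400.0], [330.0, 350.0], [338.0, 300.0]])
    print(image_to_world(feature_points, H))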
The controller 120 may detect (or check) the plurality of traffic lane feature points corresponding to a curve, among the traffic lane feature points which have been converted into the world coordinates, based on those feature points and the linear, quadratic, cubic, or n-th order equations previously stored in the storage unit 140. Namely, the controller 120 may sequentially substitute the plurality of traffic lane feature points which have been converted into the world coordinates into the curve equations previously stored in the storage unit 140, and determine (or check) whether those feature points form a curve based on the substitution results. Here, a curve equation may be a linear, quadratic, cubic, or n-th order equation.
The controller 120 substitutes the plurality of traffic lane feature points which have been converted into the world coordinates into a linear traffic lane equation (e.g., y = c1x + c0, wherein c1 is a tilt (or a heading angle of the vehicle) and c0 is an offset between the traffic lane and the vehicle) previously stored in the storage unit 140. When c1 = 0, the controller 120 recognizes the plurality of traffic lane feature points as a straight line aligned with the vehicle, and when c1 ≠ 0, the controller 120 recognizes them as a straight line tilted with respect to the heading of the vehicle (traffic lane detection). Namely, the linear traffic lane equation may be used to detect a traffic lane on a general straight road.
The controller 120 substitutes the plurality of traffic lane feature points which have been converted into the world coordinates into a quadratic traffic lane equation (e.g., y = c2x² + c1x + c0, wherein c2 is a curvature of the traffic lane, c1 is a tilt (or a heading angle) of the vehicle, and c0 is an offset between the traffic lane and the vehicle) previously stored in the storage unit 140. When c2 = 0, the controller 120 recognizes the plurality of traffic lane feature points as a straight line, and when c2 ≠ 0, the controller 120 recognizes them as a curve (traffic lane detection). Namely, the quadratic traffic lane equation may be used to detect a traffic lane on a curved road whose curvature radius is large and whose curvature change is uniform.
The controller 120 substitutes the plurality of traffic lane feature points which have been converted into the world coordinates into a cubic traffic lane equation (e.g., y = c3x³ + c2x² + c1x + c0, wherein c3 is a curve derivative, c2 is a curvature of the traffic lane, c1 is a tilt (or a heading angle) of the vehicle, and c0 is an offset between the traffic lane and the vehicle) previously stored in the storage unit 140, to check whether or not there is a traffic lane (curve) (traffic lane detection). When c3 = 0, the cubic equation reduces to the quadratic case, in which c2 is a curvature of the traffic lane, c1 is a tilt (or a heading angle) of the vehicle, and c0 is an offset between the traffic lane and the vehicle; when both c3 and c2 are 0, it indicates a straight-line detection, in which c1 is a heading angle of the vehicle and c0 is an offset between the traffic lane and the vehicle. Namely, the cubic traffic lane equation may be used to detect a traffic lane on a curved road whose curvature radius is small and whose curvature change is not uniform. Here, c1 may represent the direction of the traffic lane and the direction of the vehicle, i.e., the direction of the traffic lane with respect to the vehicle.
The controller 120 may check whether or not there is a traffic lane (curve) by substituting the plurality of traffic lane feature points which have been converted into the world coordinates into an n-th order equation (e.g., y = cnxⁿ + cn-1xⁿ⁻¹ + … + c3x³ + c2x² + c1x + c0). Namely, the n-th order equation may be used to detect a traffic lane on a curved road that can hardly be modeled by an (n-1)th-order equation.
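To make the three lane models concrete, the short sketch below evaluates each of them. The coefficient values are invented for illustration; only the coefficient meanings (c0: offset, c1: heading angle, c2: curvature, c3: curve derivative) follow the text above:

    import numpy as np

    def lane_y(x, c0, c1, c2=0.0, c3=0.0):
        # y = c3*x^3 + c2*x^2 + c1*x + c0, the cubic model; setting c3 = 0
        # (or c3 = c2 = 0) reduces it to the quadratic (or linear) model.
        return c3 * x**3 + c2 * x**2 + c1 * x + c0

    x = np.linspace(0.0, 40.0, 9)                      # look-ahead distance in meters
    straight = lane_y(x, c0=1.75, c1=0.0)              # linear, c1 = 0: straight lane
    gentle   = lane_y(x, c0=1.75, c1=0.01, c2=1e-4)    # quadratic: large curvature radius
    sharp    = lane_y(x, c0=1.75, c1=0.01, c2=5e-4, c3=1e-5)  # cubic: varying curvature
    print(straight[-1], gentle[-1], sharp[-1])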
The controller 120 detects a traffic lane by substituting the traffic lane feature points which have been converted into the world coordinates into the curve equations (e.g., the linear equation) previously stored in the storage unit 140, and determines whether or not the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value (a reference distance value or a traffic lane conformance). When the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value (e.g., 1 to 5 cm), the controller 120 displays the traffic lane detected according to the linear equation on the image.
When the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is equal to or greater than the pre-set threshold value (e.g., 5 to 10 cm), the controller 120 substitutes the traffic lane feature points which have been converted into the world coordinates into the curve equations (e.g., the quadratic equation) previously stored in the storage unit 140, and determines whether the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than, or equal to or greater than, the pre-set threshold value (a reference distance value or a traffic lane conformance). When the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value, the controller 120 displays the traffic lane detected according to the quadratic equation on the image.
When the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is equal to or greater than the pre-set threshold value, the controller 120 substitutes the traffic lane feature points which have been converted into the world coordinates into the curve equations (e.g., the cubic equation) previously stored in the storage unit 140, and determines whether the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than, or equal to or greater than, the pre-set threshold value (a reference distance value or a traffic lane conformance). When the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value, the controller 120 displays the traffic lane detected according to the cubic equation on the image.
Based on the traffic lane conformance obtained based on the sum of differences in distances between the traffic lane feature points and the traffic lane according to the substitution result, the controller 120 may detect a traffic lane having high traffic lane conformance from among the curve equations (e.g., the linear, the quadratic, and the cubic equations) previously stored in the storage unit 140.
The traffic lane conformance may be calculated in various manners; for example, it may be calculated as a weighted sum of squared errors according to Equation 1 shown below:

[Equation 1]

SSE = Σ wi(yi - ŷi)², summed over i = 1 to n

Here, n is the number of traffic lane feature points used for the traffic lane detection, yi is the y value of the i-th traffic lane feature point, ŷi is the y value estimated by the traffic lane equation at the x value of the i-th traffic lane feature point, and wi is a weight value with respect to each traffic lane feature point.
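Equation 1 can be computed directly, as in the following sketch (np.polyval stands in, as an assumption, for evaluating the fitted traffic lane equation; the weighting scheme in the example is likewise illustrative):

    import numpy as np

    def conformance_sse(xs, ys, coeffs, weights=None):
        # Weighted sum of squared errors of Equation 1; a smaller SSE means a
        # higher traffic lane conformance for the fitted traffic lane equation.
        xs = np.asarray(xs, dtype=float)
        ys = np.asarray(ys, dtype=float)
        w = np.ones_like(ys) if weights is None else np.asarray(weights, dtype=float)
        y_hat = np.polyval(coeffs, xs)   # y value estimated by the equation at each x
        return float(np.sum(w * (ys - y_hat) ** 2))

    # Example: weight near points more heavily than distant ones (an assumption)
    xs = np.arange(0.0, 45.0, 5.0)
    ys = 0.0008 * xs**2 + 1.75
    coeffs = np.polyfit(xs, ys, 2)
    print(conformance_sse(xs, ys, coeffs, weights=1.0 / (1.0 + xs)))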
The controller 120 may additionally use the number of the traffic lane feature points, the range over which the traffic lane feature points are distributed, and the like, in order to determine the conformance of a model to the current road. For example, the controller 120 may detect the candidate traffic lanes by substituting, into the different traffic lane equations, only traffic lane feature points whose number is equal to or greater than a pre-set reference number or whose distribution range is equal to or greater than a pre-set range. Here, the pre-set reference number and the pre-set range may be variably defined by a designer.
The controller 120 may detect traffic lanes in real time by tracking the traffic lane feature points substituted into the curve equations (e.g., the linear equation) previously stored in the storage unit 140, and may display the detected traffic lanes.
The controller 120 may also calculate curve information that follows a virtual central point of a traffic lane based on the plurality of points corresponding to the detected traffic lane (a straight line or a curve). Here, the calculated curve information may be used to enhance traffic lane keeping performance on the world coordinates by minimizing the influence of the calibration state of the camera. Namely, the controller 120 may calculate the curve information following the central point of the traffic lane by applying any one of a least square method, random sample consensus (RANSAC), a generalized Hough transform method, a spline interpolation method, and the like, to the plurality of points corresponding to the detected curve.
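As one of the listed options, a minimal least squares sketch is shown below: each lane boundary is fitted separately and the coefficient vectors are averaged, which, for polynomials of the same degree, yields the curve through the virtual central point of the lane. The point sets and the degree are assumptions for illustration:

    import numpy as np

    def center_curve(left_pts, right_pts, degree=2):
        # Fit each boundary with least squares, then average the coefficient
        # vectors; for equal-degree polynomials this gives the curve through
        # the virtual central point of the lane at every x.
        lx, ly = np.asarray(left_pts, dtype=float).T
        rx, ry = np.asarray(right_pts, dtype=float).T
        cl = np.polyfit(lx, ly, degree)
        cr = np.polyfit(rx, ry, degree)
        return (cl + cr) / 2.0

    left  = [(x, 0.0008 * x**2 + 0.0) for x in range(0, 45, 5)]
    right = [(x, 0.0008 * x**2 + 3.5) for x in range(0, 45, 5)]
    print(center_curve(left, right))   # constant term ≈ 1.75, the lane center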
The controller 120 may overlap the calculated curve information following the central point of the traffic lane, information such as the detected curve, and the like, with the captured image, and display the result on the display unit 130. For example, the controller 120 may convert (or map) the calculated curve information following the central point of the traffic lane, the detected curve information, and the like, into coordinates on the image domain, overlap the respective converted coordinates with the captured image, and display the result on the display unit 130.
FIG. 6 is a view showing travel traffic lanes according to an embodiment of the present invention.
As shown in FIG. 6, the controller 120 may select a first traffic lane most adjacent to the left side of the vehicle and a second traffic lane most adjacent to the right side of the vehicle based on a moving direction of the vehicle from the detected candidate traffic lanes, and display the first traffic lane and the second traffic lane as travel traffic lanes 610 of the vehicle on the image 210. For example, the controller 120 may convert (or map) the detected first traffic lane and the second traffic lane 610 into coordinates on the image domain, and overlap the respective converted coordinates with the image 210.
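Step-wise, the selection of the travel traffic lanes may be sketched as follows. The sketch assumes vehicle-centered world coordinates in which the constant term c0 of each candidate lane equation is its lateral offset from the vehicle; the data values are illustrative:

    import numpy as np

    def select_travel_lanes(candidates):
        # candidates: list of coefficient vectors in vehicle-centered world
        # coordinates, so that np.polyval(c, 0.0) (the c0 term) is each lane's
        # lateral offset from the vehicle. Pick the nearest lane on each side.
        offsets = [float(np.polyval(c, 0.0)) for c in candidates]
        left  = [i for i, o in enumerate(offsets) if o < 0.0]
        right = [i for i, o in enumerate(offsets) if o >= 0.0]
        first  = max(left,  key=lambda i: offsets[i]) if left  else None
        second = min(right, key=lambda i: offsets[i]) if right else None
        return first, second

    lanes = [np.array([1e-4, 0.01, -5.25]),   # far left
             np.array([1e-4, 0.01, -1.75]),   # adjacent left
             np.array([1e-4, 0.01,  1.75]),   # adjacent right
             np.array([1e-4, 0.01,  5.25])]   # far right
    print(select_travel_lanes(lanes))          # -> (1, 2)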
FIG. 7 is a view showing travel traffic lanes displayed on an image according to an embodiment of the present invention.
As shown in FIG. 7, the controller 120 may select a first traffic lane most adjacent to the left side of the vehicle and a second traffic lane most adjacent to the right side of the vehicle, based on a moving direction of the vehicle, from the detected candidate traffic lanes, detect the first traffic lane and the second traffic lane as travel traffic lanes 610 of the vehicle, and overlap the travel traffic lanes 610 with the image.
The camera module 110 may include at least a pair of cameras (e.g., a stereo camera or a stereoscopic camera) installed to be spaced apart horizontally on the same plane of the traffic lane recognizing apparatus 10, or a single camera. Here, the fixed horizontal interval may be set in consideration of the distance between an ordinary person's two eyes. Also, the camera module 110 may be any camera module capable of capturing an image.
The camera module 110 may receive a first image (e.g., a left image captured by a left camera included in the pair of cameras) and a second image (e.g., a right image captured by a right camera included in the pair of cameras) which are simultaneously captured by the pair of cameras.
The camera module 110 may include an image sensor such as a charge-coupled device (CCD), a complementary metal-oxide semiconductor (CMOS), or the like.
When the traffic lane recognizing apparatus 10 is installed in a vehicle, the camera module 110 may be fixed to a certain position of the vehicle (e.g., a rearview mirror of the vehicle) to capture an image of the road ahead in the traveling direction of the vehicle. The camera module 110 may also be fixedly installed at certain positions (e.g., a side mirror of the vehicle or a rear bumper of the vehicle) in order to capture images of the sides and the rear of the vehicle.
The controller 120 performs functions related to maintaining a traffic lane (including a traffic lane departure warning message function, an automatic traffic lane keeping function, and the like) based on the position of the traffic lane recognizing apparatus 10 (or of a vehicle including the traffic lane recognizing apparatus) obtained through a certain GPS module (not shown) and the detected curve (or traffic lane).
The display unit 130 displays various contents, such as various menu screen images, by using a user interface and/or a graphic user interface stored in the storage unit 140, under the control of the controller 120. Here, the contents displayed on the display unit 130 include menu screen images containing various text or image data (including various information data) and items such as icons, list menus, combo boxes, and the like.
The display unit 130 may include a 3D display or a 2D display. Also, the display unit 130 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, and a Light Emitting Diode (LED) display.
The display unit 130 displays the 3D image (or a 2D image) under the control of the controller 120.
The traffic lane recognizing apparatus 10 may include two or more display units 130 according to its particular desired embodiment. For example, a plurality of display units may be separately or integrally disposed on a single face (the same surface) of the traffic lane recognizing apparatus 10, or may be disposed on mutually different faces of the traffic lane recognizing apparatus 10.
Meanwhile, when the display unit 130 and a sensor sensing a touch operation (referred to as a 'touch sensor', hereinafter) are overlaid in a layered manner (referred to as a 'touch screen', hereinafter), the display unit 130 may function as both an input device and an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, a touch panel, and the like.
The touch sensor may be configured to convert pressure applied to a particular portion of the display unit 130, or a change in capacitance generated at a particular portion of the display unit 130, into an electrical input signal. The touch sensor may also be configured to detect a touch input pressure as well as a touch input position and a touch input area. When there is a touch input with respect to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and transmits corresponding data to the controller 120. Accordingly, the controller 120 can recognize the touched region of the display unit 130.
The display unit 130 may include a proximity sensor. The proximity sensor may be disposed in an internal region of the traffic lane recognizing apparatus 10 covered by the touch screen or in the vicinity of the touch screen.
The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a certain detection surface, or an object existing nearby, by using an electromagnetic field or infrared rays without mechanical contact. Thus, the proximity sensor has a longer life span than a contact type sensor and can be utilized for various purposes. Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, an RF oscillation type proximity sensor, a capacitance type proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and the like. When the touch screen is an electrostatic type touch screen, an approach of a pointer is detected based on a change in an electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Recognition of a pointer positioned close to the touch screen without contact may be called a 'proximity touch', while recognition of actual contact of the pointer with the touch screen may be called a 'contact touch'. A pointer in the proximity touch state is positioned to correspond vertically to the touch screen.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, or the like), and information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
When the display unit 130 is used as an input device, it may receive a user's button manipulation, or receive a command or a control signal generated by a manipulation such as touching or scrolling a displayed screen image.
The traffic lane recognizing apparatus 10 according to an embodiment of the present invention may include the storage unit 140, which stores a program for detecting the traffic lane from the image, traffic lane information calculated periodically or in real time, and the like.
The storage unit 140 may further store various menu screen images, user interfaces (UIs), and/or graphic user interfaces (GUIs).
The storage unit 140 may further store mathematical tools such as a conversion matrix (e.g., a homographic matrix or the like), the curve equations, the least square method, and the like.
The storage unit 140 may further store data, programs, and the like, required for operating the traffic lane recognizing apparatus 10.
The storage unit 140 may include at least one type of storage medium, including a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a magnetic memory, a magnetic disk, and an optical disk.
The traffic lane recognizing apparatus 10 may further include a communication unit (not shown) performing a communication function with a certain terminal or server under the control of the controller 120. Here, the communication unit may include a wired/wireless communication module. Wireless Internet techniques may include wireless LAN (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE 802.16, long-term evolution (LTE), wireless mobile broadband service (WMBS), and the like; short-range communication technologies may include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee™, and the like. Also, wired communication techniques may include Universal Serial Bus (USB) communication and the like.
The communication unit may also include CAN communication, vehicle Ethernet, FlexRay, LIN (Local Interconnect Network), and the like, for communication with a certain vehicle in which the traffic lane recognizing apparatus 10 is provided.
Under the control of the controller 120, the communication unit may transmit, to the certain terminal or server, the plurality of support points extracted from a certain image, the points obtained by converting the plurality of support points into world coordinates, the plurality of points corresponding to a curve among the points which have been converted into the world coordinates, and the curve information or the like that follows the central point of a traffic lane calculated based on those points.
The communication unit may receive, from the certain terminal or server, a first image and a second image simultaneously captured by a pair of stereo cameras.
The traffic lane recognizing apparatus 10 may further include an input unit (not shown) including one or more microphones (not shown) for receiving an audio signal.
The microphone receives an external audio signal (including a user's voice (a voice signal or voice information)) in a phone call mode, a recording mode, a voice recognition mode, and the like, and processes the audio signal into electrical voice data. The voice data processed by the microphone may be output through a voice output unit (not shown), or converted into a transmittable format and output to an external terminal through the communication unit. The microphone may implement various noise canceling algorithms to cancel noise generated in the course of receiving the external audio signal.
The input unit receives a signal according to a user's button manipulation or receives a command or a control signal generated according to a manipulation such as touching/scrolling a displayed screen image.
The input unit receives a signal corresponding to information input by the user; various devices such as a keyboard, a keypad, a dome switch, a touch pad (pressure/capacitance), a touch screen, a jog shuttle, a jog wheel, a jog switch, a mouse, a stylus pen, a touch pen, a laser pointer, and the like may be used as the input unit. Here, the input unit receives signals corresponding to inputs from these various devices.
The traffic lane recognizing apparatus 10 may further include a voice output unit (not shown) outputting voice information included in the signal processed by the controller 120. Here, the voice output unit may be a speaker.
In the traffic lane recognizing apparatus and method according to an embodiment of the present invention, support points (feature points) serving as a candidate group of a traffic lane are extracted from an image and converted into world coordinates, and the traffic lane is recognized on the converted world coordinates. Thus, compared with a method of recognizing a traffic lane directly from the image, the possibility of accumulating error when transferring calibration error between the camera information and the world coordinates can be reduced.
In the traffic lane recognizing apparatus and method according to an embodiment of the present invention, information regarding a traffic lane recognized in the world coordinates is displayed, and a warning message is generated and output based on that information, thereby enhancing accuracy, sensitivity, and user convenience.
FIG. 8 is a flow chart illustrating a process of a traffic lane recognizing method according to an embodiment of the present invention.
First, the camera module 110 receives a first image and a second image captured by at least a pair of cameras (e.g., a stereo camera or a stereoscopic camera) installed to be spaced apart by a horizontal interval on the same central axis of the same surface of the traffic lane recognizing apparatus 10, or receives an image captured by a single camera. Here, the first image may be a left image captured by a left camera of the pair, and the second image may be a right image captured by a right camera of the pair. Also, the camera module 110 may receive only one of the first image and the second image captured by the pair of cameras.
The camera module 110 receives the image 210 captured by a single camera. For example, the camera module 110 may receive the image 210 including traffic lanes corresponding to a first lane, a second lane, a third lane, and the like, and/or double lines of white or yellow solid lines (or double lines of white or yellow solid and dotted lines).
The controller 120 receives the image 210 through the camera module 110 (S11), and extracts a plurality of traffic lane feature points from the captured image 210 based on pre-set guide lines for detecting (extracting) traffic lane feature points from the image 210. As described above with reference to FIG. 3, a lower portion of the image 210, when converted into world coordinates, represents a near region, while middle and upper portions of the image 210, when converted into world coordinates, represent a distant region. Thus, in order to obtain point intervals as uniform as possible when the data of the image 210 is converted into the world coordinates, the interval between the guide lines 310 is set to be wide at the lower portion of the image 210 and to narrow gradually toward the upper portion of the image 210. The change width of the interval between the guide lines 310 may be variably set according to a designer's design, and may be set so that the interval between lines becomes equal when the data of the image 210 is converted into world coordinates. A guide line is a virtual line used to obtain point intervals as uniform as possible when the traffic lane feature points are converted into the world coordinates, rather than a line actually displayed on the image.
As shown in FIG. 4, the controller 120 extracts a plurality of traffic lane feature points from the image 210 based on the pre-set guide lines, and displays the plurality of extracted traffic lane feature points 410 on an image domain of the display unit 130. Namely, the controller 120 displays the traffic lane feature points corresponding to the traffic lanes 401 on the image domain. Here, the vertical interval between the plurality of traffic lane feature points, measured with respect to the horizontal axis (x axis), narrows gradually from the lower side to the upper side of the display unit 130.
The controller 120 converts the plurality of extracted traffic lane feature points into world coordinates. Namely, the controller 120 may convert the plurality of extracted traffic lane feature points into world coordinates by using a conversion matrix (e.g., a homographic matrix or the like) previously stored in the storage unit 140.
For example, as shown in FIG. 5, the controller 120 converts the plurality of extracted traffic lane feature points into world coordinates based on the homographic matrix previously stored in the storage unit 140, and displays the plurality of traffic lane feature points 510 which have been converted into the world coordinates on the display unit 130. Here, with respect to the horizontal axis, the vertical interval between the plurality of traffic lane feature points which have been converted into the world coordinates is maintained equal.
The controller 120 detects traffic lanes based on the plurality of traffic lane feature points which have been converted into the world coordinates and the curve equations previously stored in the storage unit 140 (S12). Namely, the controller 120 may detect the traffic lanes by substituting the plurality of traffic lane feature points which have been converted into the world coordinates into the curve equations previously stored in the storage unit 140 and determining (or checking) whether those feature points form a straight line or a curve based on the substitution result. Here, a curve equation may be a linear, quadratic, cubic, or higher-order equation.
The controller 120 detects the candidate traffic lanes from the detected traffic lanes based on the traffic lane conformance (S13). For example, the controller 120 detects traffic lanes having a high traffic lane conformance (e.g., traffic lanes having a small difference in distance) as the candidate traffic lanes from among the detected traffic lanes. Also, among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations, the controller 120 may detect, as the candidate traffic lanes, traffic lanes whose difference in distance from the traffic lane feature points is smaller than a threshold value.
When the controller 120 detects a traffic lane by substituting the traffic lane feature points which have been converted into the world coordinates into the linear equation previously stored in the storage unit 140, if the difference in distance between the traffic lane feature points and the traffic lane obtained from the substitution result is equal to or greater than the pre-set threshold value (reference distance value), the controller 120 detects the traffic lane by substituting those feature points into the quadratic equation previously stored in the storage unit 140. If the difference in distance between the traffic lane feature points and the traffic lane obtained from the quadratic equation substitution result is also equal to or greater than the pre-set threshold value, the controller 120 detects a traffic lane by substituting those feature points into the cubic equation previously stored in the storage unit 140. Meanwhile, if the differences in distance between the traffic lane feature points which have been converted into the world coordinates and the traffic lanes according to all of the linear to cubic traffic lane equations are equal to or greater than the pre-set threshold value, the controller 120 may determine, as the candidate traffic lanes, the traffic lanes whose difference in distance from the traffic lane feature points is the smallest.
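A sketch of this escalation with the smallest-difference fallback, under the same illustrative assumptions as the earlier sketches:

    import numpy as np

    def fit_with_fallback(xs, ys, threshold=0.05):
        # Escalate from the linear to the cubic equation; if every model's
        # mean distance difference is at or above the threshold, fall back
        # to the model with the smallest difference, as described above.
        xs = np.asarray(xs, dtype=float)
        ys = np.asarray(ys, dtype=float)
        best = None
        for degree in (1, 2, 3):
            coeffs = np.polyfit(xs, ys, degree)
            err = float(np.mean(np.abs(np.polyval(coeffs, xs) - ys)))
            if err < threshold:
                return degree, coeffs
            if best is None or err < best[2]:
                best = (degree, coeffs, err)
        return best[0], best[1]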
The controller 120 determines, among the detected candidate traffic lanes, a first traffic lane most adjacent to the left side of the vehicle and a second traffic lane most adjacent to the right side of the vehicle, based on the direction in which the vehicle is traveling, as the travel traffic lanes of the vehicle (S14).
The controller 120 displays the detected travel traffic lanes on the image (S15). For example, the controller 120 converts (or maps) the detected travel traffic lanes into coordinates on the image domain, respectively, and overlaps the respective converted coordinates on the image 210.
As described above, in the traffic lane recognizing apparatus and method according to embodiments of the present invention, candidate traffic lanes are detected, based on traffic lane conformance, from among traffic lanes detected based on the traffic lane feature points detected from an image captured by a camera module and different traffic lane equations. Thus, traffic lanes can be detected accurately.

Claims (18)

  1. A traffic lane recognizing apparatus comprising:
    a camera module;
    a display unit configured to display an image captured by the camera module; and
    a controller configured to detect candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations, and to display traffic lanes adjacent to a vehicle, as current traffic lanes along which the vehicle is traveling, among the detected candidate traffic lanes, on the image.
  2. The apparatus of claim 1, wherein the traffic lane conformance is previously determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points.
  3. The apparatus of claim 2, wherein the traffic lane conformance is determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points, the number of the traffic lane feature points, and a range in which the traffic lane feature points are distributed.
  4. The apparatus of claim 1, wherein the controller detects traffic lanes, whose difference in distance between themselves and the traffic lane feature points is a threshold value or smaller, as the candidate traffic lanes, from among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations.
  5. The apparatus of claim 1, wherein the controller detects the traffic lane feature points from the captured image, and sequentially substitutes the different traffic lane equations to the detected traffic lane feature points to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
  6. The apparatus of claim 1, wherein the controller detects traffic lane feature points from the captured image, converts the traffic lane feature points into world coordinates, and sequentially substitutes the traffic lane feature points which have been converted into the world coordinates to the different traffic lane equations, to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
  7. The apparatus of claim 1, wherein the different traffic lane equations are linear, quadratic, and cubic traffic lane equations, and the linear traffic lane equation is y = c1x + c0, wherein c1 is a heading angle of the vehicle and c0 is an offset between the traffic lane and the vehicle; the quadratic traffic lane equation is y = c2x² + c1x + c0, wherein c2 is a curvature of the traffic lane, c1 is a heading angle of the vehicle, and c0 is an offset between the traffic lane and the vehicle; and the cubic traffic lane equation is y = c3x³ + c2x² + c1x + c0, wherein c3 is a curve derivative, c2 is a curvature of the traffic lane, c1 is a heading angle of the vehicle, and c0 is an offset between the traffic lane and the vehicle.
  8. The apparatus of claim 1, wherein when the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is equal to or greater than the pre-set threshold value, the controller converts the traffic lane feature points into world coordinates, and substitutes the traffic lane feature points to any one of the different traffic lane curve equations, and when the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value, the controller displays the traffic lane detected according to the quadratic equation, on the image.
  9. The apparatus of claim 8, wherein the any one traffic lane curve equation is a quadratic traffic lane equation which is y = c2x² + c1x + c0, wherein c2 is a curvature of the traffic lane, c1 is a heading angle of the vehicle, and c0 is an offset between the traffic lane and the vehicle.
  10. A traffic lane recognizing method comprising:
    displaying an image captured by a camera module on a display unit;
    detecting candidate traffic lanes based on traffic lane conformance from among traffic lanes detected based on traffic lane feature points detected from the image captured by the camera module and different traffic lane equations; and
    displaying traffic lanes adjacent to a vehicle as current traffic lanes along which the vehicle is traveling, among the detected candidate traffic lanes, on the image.
  11. The method of claim 10, wherein the traffic lane conformance is previously determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points.
  12. The method of claim 11, wherein the traffic lane conformance is determined based on the sum of differences in distances between a traffic lane obtained by substituting the traffic lane feature points to the different traffic lane equations, and the traffic lane feature points, the number of the traffic lane feature points, and a range in which the traffic lane feature points are distributed.
  13. The method of claim 10, wherein the detecting of the candidate traffic lanes comprises detecting traffic lanes, whose difference in distance between themselves and the traffic lane feature points is a threshold value or smaller, as the candidate traffic lanes, from among the traffic lanes detected based on the traffic lane feature points detected from the image captured by the camera module and the different traffic lane equations.
  14. The method of claim 10, wherein the detecting of the candidate traffic lanes comprises:
    detecting the traffic lane feature points from the captured image; and
    detecting traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes detected by sequentially substituting the different traffic lane equations to the detected traffic lane feature points.
  15. The method of claim 10, wherein the detecting of the candidate traffic lanes comprises:
    detecting traffic lane feature points from the captured image;
    converting the traffic lane feature points into world coordinates; and
    sequentially substituting the traffic lane feature points which have been converted into the world coordinates to the different traffic lane equations, to thereby detect traffic lanes, whose difference in distance is smaller than the threshold value, as the candidate traffic lanes, from among the detected traffic lanes.
  16. The method of claim 10, wherein the different traffic lane equations are linear, quadratic, and cubic traffic lane equations, and the linear traffic lane equation is y = c1x + c0, wherein c1 is a heading angle of the vehicle and c0 is an offset between the traffic lane and the vehicle; the quadratic traffic lane equation is y = c2x² + c1x + c0, wherein c2 is a curvature of the traffic lane, c1 is a heading angle of the vehicle, and c0 is an offset between the traffic lane and the vehicle; and the cubic traffic lane equation is y = c3x³ + c2x² + c1x + c0, wherein c3 is a curve derivative, c2 is a curvature of the traffic lane, c1 is a heading angle of the vehicle, and c0 is an offset between the traffic lane and the vehicle.
  17. The method of claim 10, wherein the detecting of the candidate traffic lanes comprises:
    when the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is equal to or greater than the pre-set threshold value, converting the traffic lane feature points into world coordinates;
    substituting the traffic lane feature points to any one of the different traffic lane curve equations; and
    when the difference in distances between the traffic lane feature points and the traffic lane obtained from the substitution result is smaller than the pre-set threshold value, displaying the traffic lane detected according to the quadratic equation, on the image.
  18. The method of claim 17, wherein the any one traffic lane curve equation is a quadratic traffic lane equation which is y = c2x² + c1x + c0, wherein c2 is a curvature of the traffic lane, c1 is a heading angle of the vehicle, and c0 is an offset between the traffic lane and the vehicle.
PCT/KR2012/000190 2011-08-05 2012-01-09 Traffic lane recognizing apparatus and method thereof WO2013022159A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020110078353A KR101641251B1 (en) 2011-08-05 2011-08-05 Apparatus for detecting lane and method thereof
KR10-2011-0078353 2011-08-05

Publications (1)

Publication Number Publication Date
WO2013022159A1 true WO2013022159A1 (en) 2013-02-14

Family

ID=47668648

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/000190 WO2013022159A1 (en) 2011-08-05 2012-01-09 Traffic lane recognizing apparatus and method thereof

Country Status (2)

Country Link
KR (1) KR101641251B1 (en)
WO (1) WO2013022159A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101498975B1 (en) * 2013-11-29 2015-03-05 현대모비스(주) Lane Departure Warning System
KR102384580B1 (en) * 2015-08-17 2022-04-08 엘지이노텍 주식회사 Apparatus for recognizing lane conditions and moethod thereof
KR102595897B1 (en) 2018-08-27 2023-10-30 삼성전자 주식회사 Method and apparatus of determining road line


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101035761B1 (en) * 2006-07-06 2011-05-20 포항공과대학교 산학협력단 Method of processing image for recognizing a lane and the system thereof
KR101176693B1 (en) * 2008-03-13 2012-08-23 주식회사 만도 Method and System for Detecting Lane by Using Distance Sensor

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004038760A (en) * 2002-07-05 2004-02-05 Nissan Motor Co Ltd Traveling lane recognition device for vehicle
KR20040034243A (en) * 2002-10-21 2004-04-28 학교법인 한양학원 Method for detecting lane and system therefor
JP2005165726A (en) * 2003-12-03 2005-06-23 Denso Corp Device and method for detecting traveling lane boundary

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAN, SUNGJI ET AL.: "Lane and curvature detection algorithm based on the curve template matching method using top view image", JOURNAL OF THE INSTITUTE OF ELECTRONICS ENGINEERS OF KOREA, vol. 47-SP, no. 6, November 2010 (2010-11-01), pages 97-106 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111967662A (en) * 2020-08-11 2020-11-20 中国石油化工股份有限公司 Method for improving unloading efficiency of tank container train
CN111967662B (en) * 2020-08-11 2024-01-23 中国石油化工股份有限公司 Method for improving unloading efficiency of tank container train
FR3127320A1 (en) * 2021-09-21 2023-03-24 Continental Automotive Method for determining the position of an object relative to a road marking line
WO2023046776A1 (en) * 2021-09-21 2023-03-30 Continental Automotive Gmbh Method for determining the position of an object with respect to a road marking line of a road

Also Published As

Publication number Publication date
KR101641251B1 (en) 2016-07-20
KR20130015978A (en) 2013-02-14

Similar Documents

Publication Publication Date Title
WO2013018962A1 (en) Traffic lane recognizing apparatus and method thereof
WO2013022159A1 (en) Traffic lane recognizing apparatus and method thereof
WO2017010601A1 (en) Vehicle control device and method therefor
WO2020050636A1 (en) User intention-based gesture recognition method and apparatus
EP3695591A1 (en) Electronic device for controlling a plurality of applications
WO2020017890A1 (en) System and method for 3d association of detected objects
WO2016006728A1 (en) Electronic apparatus and method for processing three-dimensional information using image
CN112684371A (en) Fault positioning method and diagnostic equipment of automobile bus and automobile detection system and method
WO2013022153A1 (en) Apparatus and method for detecting lane
WO2016072610A1 (en) Recognition method and recognition device
WO2015108401A1 (en) Portable device and control method using plurality of cameras
WO2013022154A1 (en) Apparatus and method for detecting lane
KR20130015976A (en) Apparatus and method for detecting a vehicle
WO2020235740A1 (en) Image-based indoor positioning service system and method
WO2014129825A1 (en) Coordinate selection circuit and method in differential touch sensing system
JP2012177998A (en) On-vehicle terminal device
WO2014171720A1 (en) Electronic device and method for preventing touch input error
WO2021086005A1 (en) Electronic device, and method for providing evacuation information according to occurrence of disaster by using same
KR101612821B1 (en) Apparatus for tracing lane and method thereof
WO2017003152A1 (en) Apparatus and method for controlling object movement
WO2015122588A1 (en) Dot pattern recognizing device and content executing device
WO2017003040A1 (en) System and method for displaying graphic-based web vector map
CN109814750B (en) Three-finger coaxial splitting point judgment method, touch screen and touch display device
KR20130015975A (en) Apparatus and method for detecting a vehicle
WO2014069749A1 (en) Processing system and processing method according to swipe motion detection in mobile webpage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12821794

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12821794

Country of ref document: EP

Kind code of ref document: A1