KR101677640B1 - Apparatus for detecting lane and method thereof - Google Patents
Apparatus for detecting lane and method thereof
- Publication number
- KR101677640B1, KR1020110078358A, KR20110078358A
- Authority
- KR
- South Korea
- Prior art keywords
- lane
- curve
- detected
- lanes
- image
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Transportation (AREA)
- Multimedia (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to a lane recognition apparatus, and a method thereof, that can accurately recognize the driving lane of a vehicle. The lane recognition apparatus according to the present invention comprises: a camera module; a display unit for displaying an image photographed by the camera module; and a controller that detects candidate lanes based on lane feature points in the image captured by the camera module, displays the driving lanes of the vehicle among the detected candidate lanes on the image, and, when a lane-damaged section is detected in the image, compensates for the lane-damaged section based on the lane width between the driving lanes.
Description
The present specification relates to a lane recognition apparatus and a method thereof.
Generally, a lane recognition device is a device that recognizes a lane included in an arbitrary image input through a camera or the like, or received from an external terminal. A lane recognition apparatus according to the prior art is disclosed in Korean Patent Laid-Open Publication No. 1995-0017509.
It is an object of the present invention to provide a lane recognition apparatus and a method thereof that can accurately recognize a lane.
This specification provides a lane recognition apparatus, and a method thereof, in which, when a lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section.
According to an aspect of the present invention, there is provided a lane recognition apparatus comprising: a camera module; a display unit for displaying an image photographed by the camera module; and a controller that detects candidate lanes based on lane feature points in the image captured by the camera module, displays the driving lanes of the vehicle among the detected candidate lanes on the image, and, when a lane-damaged section is detected in the image, compensates for the lane-damaged section based on the lane width between the driving lanes.
In one embodiment of the present invention, when the lane-damaged section is detected, the controller may display a virtual lane in the lane-damaged section based on the lane width calculated before the detected lane-damaged section.
As an example related to the present specification, the apparatus may further include a storage unit for storing information corresponding to the lane width, and the control unit may calculate the lane width in real time or periodically.
As an example related to the present specification, the control unit may calculate the lane width based on a distance value set for each pixel corresponding to a straight line distance between the driving lanes.
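The per-pixel distance calculation described above can be sketched as follows; the function name and the calibration constant are illustrative assumptions, not taken from the patent:

```python
# Sketch of computing the lane width from a distance value set for each
# pixel, as described above. The meters-per-pixel value would come from
# camera calibration; the constant used below is a hypothetical example.
def lane_width_m(left_x_px, right_x_px, meters_per_pixel):
    """Lane width in meters from the straight-line pixel distance
    between the left and right driving lanes on one image row."""
    return abs(right_x_px - left_x_px) * meters_per_pixel

# Example: lanes detected 400 px apart at a row where each pixel
# spans about 0.9 cm of road surface.
width = lane_width_m(420, 820, 0.009)  # 3.6 m, a typical lane width
```

The control unit could run this per frame (real time) or at intervals (periodically) and store the result, matching the storage-unit example above.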
As an example related to the present specification, the control unit may detect, as the lane-damaged section, an interval in which lane feature points are not detected in any one of the driving lanes.
As an example related to the present specification, the control unit may detect, as the lane-damaged section, an interval in which lane feature points corresponding to the driving lanes are temporarily not detected.
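Detecting such an interval can be sketched as a scan over image rows for runs where no feature point was found; the `min_gap` threshold below is an assumed parameter, not specified in the patent:

```python
def damaged_sections(feature_rows, min_gap=5):
    """Find runs of consecutive rows with no lane feature point.

    feature_rows: list of bool, True if a feature point was detected
    on that row. Returns (start, end) index pairs for runs at least
    min_gap rows long, treated as lane-damaged sections.
    """
    gaps, start = [], None
    for i, found in enumerate(feature_rows):
        if not found and start is None:
            start = i                      # a gap begins
        elif found and start is not None:
            if i - start >= min_gap:       # long enough to count
                gaps.append((start, i - 1))
            start = None
    if start is not None and len(feature_rows) - start >= min_gap:
        gaps.append((start, len(feature_rows) - 1))  # gap runs to the end
    return gaps
```

Short dropouts below `min_gap` are ignored, which distinguishes a genuinely damaged marking from one or two noisy rows.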
As an example related to the present specification, when the lane-damaged section is detected, the controller may generate a virtual lane corresponding to the lane width calculated before the detected lane-damaged section, and display the virtual lane in the lane-damaged section.
As an example related to the present specification, when the lane-damaged section is detected, the control unit may detect the heading direction of the vehicle and display the virtual lane in the lane-damaged section based on the lane width calculated before the lane-damaged section and the heading direction.
In one embodiment of the present invention, the controller may generate the virtual lane based on the lane width calculated before the lane-damaged section, and display the virtual lane on the image based on the heading direction of the vehicle.
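One way to realize a virtual lane from the stored lane width and the heading direction is to offset the still-detected lane perpendicular to the heading. The sketch below works in world coordinates; the function name and the sign convention (left normal of the heading) are assumptions for illustration:

```python
import math

def virtual_lane_points(known_lane, lane_width, heading_rad, side="right"):
    """Synthesize the missing lane by offsetting each point of the
    detected lane by the stored lane width, perpendicular to the
    vehicle heading (world coordinates; sign convention assumed)."""
    nx, ny = -math.sin(heading_rad), math.cos(heading_rad)  # left normal of heading
    sign = -1.0 if side == "right" else 1.0
    return [(x + sign * lane_width * nx, y + sign * lane_width * ny)
            for (x, y) in known_lane]
```

With a heading of 0 rad and a stored width of 3.5 m, each point of the left lane is shifted 3.5 m to the right, yielding the virtual right lane to draw over the damaged section.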
According to another aspect of the present invention, there is provided a lane recognition method comprising: detecting candidate lanes based on lane feature points in an image photographed by a camera module; displaying driving lanes of the vehicle among the detected candidate lanes on the image; and compensating for the lane-damaged section based on the lane width between the driving lanes when a lane-damaged section is detected in the image.
In the lane recognition apparatus and method according to the embodiments of the present invention, when a lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed continuously on the image without interruption, providing the driver with driving convenience.
In the lane recognition apparatus and method according to the embodiments of the present invention, when a lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section and the heading direction of the vehicle, so that the driving lane can be displayed continuously and accurately on the image.
FIG. 1 is a block diagram illustrating a configuration of a lane recognition apparatus according to an embodiment of the present invention.
FIG. 2 is an exemplary view showing an image taken by a camera according to an embodiment of the present invention.
FIG. 3 is an exemplary view illustrating a guideline according to an embodiment of the present invention.
FIG. 4 is an exemplary view showing lane feature points according to an embodiment of the present invention.
FIG. 5 is an exemplary view illustrating lane feature points converted to world coordinates according to an embodiment of the present invention.
FIG. 6 is a view showing driving lanes according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a lane recognition method according to an embodiment of the present invention.
FIG. 8 is a view illustrating an image including a lane-damaged section for explaining an embodiment of the present invention.
FIG. 9 is a view illustrating an image including another lane-damaged section for explaining an embodiment of the present invention.
FIG. 10 is a view illustrating images and lanes displayed on a display unit according to an embodiment of the present invention.
FIG. 11 is a flowchart illustrating a lane recognition method according to another embodiment of the present invention.
It is noted that the technical terms used herein are used only to describe specific embodiments and are not intended to limit the invention. Unless otherwise defined, the technical terms used herein are to be interpreted in the sense generally understood by a person of ordinary skill in the art to which the present invention belongs, and should not be interpreted in an excessively broad or excessively narrow sense. Further, when a technical term used herein is an erroneous term that does not accurately express the spirit of the present invention, it should be replaced with a technical term that a person skilled in the art can correctly understand. In addition, general terms used in the present invention should be interpreted according to their dictionary definitions or the surrounding context, and should not be interpreted in an excessively narrow sense.
Also, the singular forms used herein include plural referents unless the context clearly dictates otherwise. In the present application, terms such as "comprising" or "including" should not be construed as necessarily including all of the elements or steps described in the specification; some elements or steps may not be included, or additional elements or steps may be included.
Furthermore, terms including ordinals such as first, second, etc. used in this specification can be used to describe various elements, but the elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or similar elements throughout the several views, and redundant description thereof will be omitted.
In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail. It is to be noted that the accompanying drawings are provided only to facilitate understanding of the present invention, and the spirit of the present invention should not be construed as being limited by the accompanying drawings.
Hereinafter, the configuration of a lane recognition apparatus according to an embodiment of the present invention will be described with reference to FIG. 1. The lane recognition device of FIG. 1 may be configured as a stand-alone device, or may be applied to a mobile terminal, a telematics terminal, a smart phone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a tablet PC, a WiBro terminal, a navigation terminal, an AVN (Audio Video Navigation) terminal, and the like.
FIG. 1 is a block diagram illustrating a configuration of a lane recognition apparatus according to an embodiment of the present invention.
As shown in FIG. 1, the
The
FIG. 2 is an exemplary view showing an image taken by a camera according to an embodiment of the present invention.
As shown in FIG. 2, the
The
4, the
The
5, the
The
The
The
The
The
The
FIG. 6 is a view showing driving lanes according to an embodiment of the present invention.
6, the
The
The
The
The
When the
The
The
The
The
There may be two or
Meanwhile, when the
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
The
The proximity sensor refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing in the vicinity thereof, using an electromagnetic field or infrared rays. The proximity sensor has a longer lifetime and higher utility than a contact-type sensor. Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency-oscillation-type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it detects the proximity of the pointer from the change in the electric field caused by the approach of the pointer; in this case, the touch screen (touch sensor) may be classified as a proximity sensor.
The act of recognizing that the pointer is positioned over the touch screen without the pointer touching the touch screen may be referred to as a "proximity touch," and the act of the pointer actually contacting the touch screen may be referred to as a "contact touch." The position at which the pointer is proximity-touched on the touch screen is the position at which the pointer vertically faces the touch screen when the pointer is proximity-touched.
In addition, the proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, etc.). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
In this way, when the
The
The
The
The
The
The
The communication unit may use CAN communication, automotive Ethernet, FlexRay, LIN (Local Interconnect Network), or the like for communication with any vehicle provided with the
The communication unit may include a plurality of support points extracted for an arbitrary image under control of the
The communication unit may receive the first image and the second image simultaneously photographed through any pair of stereo cameras transmitted from the arbitrary terminal or server.
The
The microphone receives an external sound signal (including a user's voice (voice signal or voice information)) in a communication mode, a recording mode, a voice recognition mode, a video conference mode, or the like, and processes it into voice data. The processed voice data may be output through a voice output unit (not shown), or may be converted into a form that can be transmitted to an external terminal through the communication unit and then output. In addition, the microphone may implement various noise reduction algorithms for eliminating noise generated while receiving the external sound signal.
The input unit receives a signal corresponding to a button operation by a user or receives a command or a control signal generated by an operation such as touching / scrolling a displayed screen.
The input unit receives a signal corresponding to information input by a user and includes a keyboard, a key pad, a dome switch, a touch pad (static / static), a touch screen A jog shuttle, a jog wheel, a jog switch, a mouse, a stylus pen, a touch pen, a laser pointer, and the like can be used. At this time, the input unit receives a signal corresponding to the input by the various devices.
The
A lane recognition apparatus and method according to an embodiment of the present invention extract feature points serving as lane candidates in an image, convert the feature points into world coordinates, and recognize lanes on the converted world coordinates; compared with methods that recognize lanes directly in the image, this reduces the possibility of cumulative error arising from the calibration between image coordinates and world coordinates.
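The image-to-world conversion above is commonly implemented with a planar homography obtained from camera calibration; the patent does not give the specific transform, so the sketch below assumes a known 3x3 matrix `H` mapping image pixels to the road plane:

```python
import numpy as np

def to_world(points_px, H):
    """Map image pixel coordinates to world (road-plane) coordinates
    using a 3x3 homography H obtained from camera calibration."""
    pts = np.asarray(points_px, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    w = (H @ homog.T).T
    return w[:, :2] / w[:, 2:3]  # divide out the homogeneous scale
```

Working in world coordinates makes the lane width a metric quantity, so the width stored before a damaged section can be reused directly when synthesizing a virtual lane.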
The lane recognition apparatus and method according to the embodiment of the present invention display information on the lanes recognized on the world coordinates and generate and output a warning message based thereon, improving accuracy.
In the lane recognition apparatus and method according to the embodiment of the present invention, when a lane-damaged section is detected, a virtual lane is created in the lane-damaged section based on the lane width (the width between the driving lanes) calculated before the lane-damaged section, providing the driver with driving convenience.
Hereinafter, a lane recognition method according to an embodiment of the present invention will be described in detail with reference to FIGS. 1 to 10.
FIG. 7 is a flowchart illustrating a lane recognition method according to an embodiment of the present invention.
First, the
The
The
5, the
The
6, the
The
The
The
The
The
The
The
The
The
FIG. 8 is an exemplary view showing an
8, when the
The
FIG. 9 is an exemplary view showing an
9, the
If the lane-damaged section is detected, the
FIG. 10 is a view illustrating images and lanes displayed on a display unit according to an embodiment of the present invention.
10, when the
Meanwhile, the
Therefore, in the lane recognition apparatus and method according to an embodiment of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed continuously on the image without interruption, providing the driver with driving convenience.
Hereinafter, a lane recognition method according to another embodiment of the present invention will be described in detail with reference to FIGS. 1 and 11.
FIG. 11 is a flowchart illustrating a lane recognition method according to another embodiment of the present invention.
First, the
The
The
The
The
The
The
The
The
If the lane-damaged section is detected, the
The
Therefore, in the lane recognition apparatus and method according to another embodiment of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section and the heading direction of the vehicle, so that the driving lane can be displayed continuously and accurately on the image without interruption.
As described above, in the lane recognition apparatus and method according to the embodiments of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section, so that the driving lane can be displayed continuously on the image without interruption, providing the driver with driving convenience.
In the lane recognition apparatus and method according to the embodiments of the present invention, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width between the driving lanes calculated before the detected lane-damaged section and the heading direction of the vehicle, so that the driving lane can be displayed continuously and accurately on the image without interruption.
It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents should be construed as falling within the scope of the present invention.
10: lane recognition device 110: camera module
120: control unit 130: display unit
140:
Claims (15)
A display unit for displaying an image photographed by the camera module;
a controller that detects candidate lanes based on lane feature points in the image photographed by the camera module, converts the lane feature points into world coordinates, substitutes the lane feature points converted into the world coordinates into a previously stored curve equation to determine curve existence information and curve information of the lane feature points, displays the driving lanes of the vehicle among the detected candidate lanes and the curve information on the image, and, when a lane-damaged section is detected in the image, compensates for the lane-damaged section based on the lane width between the driving lanes,
Wherein the curve information is determined by one of the pre-stored curve equations based on a curve change rate, a curvature of the lane, and a heading and an offset of the vehicle,
And when the curve change rate and the curvature of the lane are both 0, the lane feature points are recognized as a straight line.
Wherein, when the lane-damaged section is detected, a virtual lane is displayed in the lane-damaged section based on the lane width calculated before the detected lane-damaged section and the curve information, the curve information being determined based on a quadratic curve equation when the curve change rate is 0, and based on a cubic curve equation when the curve change rate is not 0.
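The model selection in this claim — a quadratic curve equation when the curve change rate is zero, a cubic otherwise — can be sketched as a least-squares fit; `numpy.polyfit` stands in here for the patent's pre-stored curve equations, as an editorial illustration rather than the claimed implementation:

```python
import numpy as np

def fit_lane_curve(xs, ys, curve_change_rate):
    """Fit world-coordinate lane feature points with a quadratic when
    the curve change rate is zero, otherwise with a cubic, mirroring
    the claim's model selection."""
    degree = 2 if curve_change_rate == 0 else 3
    return np.polyfit(xs, ys, degree)  # coefficients, highest order first
```

On a straight segment the quadratic fit degenerates gracefully: the leading coefficient comes out near zero, consistent with the claim's straight-line case where both the curve change rate and the curvature are 0.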
And a storage unit for storing information corresponding to the lane width, wherein the control unit calculates the lane width in real time or periodically.
Converts the lane characteristic points into world coordinates, and calculates the lane width based on the lane characteristic points converted into the world coordinates.
Wherein an interval in which lane feature points are not detected in any one of the driving lanes is detected as the lane-damaged section.
And detects, as the lane-damaged section, a section in which lane feature points corresponding to the driving lanes are temporarily not detected.
Wherein, when the lane-damaged section is detected, a virtual lane corresponding to the lane width calculated before the detected lane-damaged section is generated, and the virtual lane is displayed in the lane-damaged section.
Wherein, when the lane-damaged section is detected, the heading direction of the vehicle is detected, and the virtual lane is displayed in the lane-damaged section based on the lane width calculated before the lane-damaged section and the heading direction.
The virtual lane is generated based on the lane width calculated before the lane-damaged section, and the virtual lane is displayed on the image based on the heading direction of the vehicle.
Converting the lane feature points into world coordinates;
Determining curve existence information and curve information of the lane feature points by substituting the lane feature points converted into the world coordinates into a previously stored curve equation;
Displaying driving lanes of the vehicle and the curve information among the detected candidate lanes on the image;
And compensating for the lane-damaged section based on a lane width between the driving lanes when the lane-damaged section is detected in the image,
Wherein the curve information is determined by one of the pre-stored curve equations based on a curve change rate, a curvature of the lane, and a heading and an offset of the vehicle,
And when the curve change rate and the curvature of the lane are both 0, the lane feature points are recognized as a straight line.
Wherein a virtual lane is displayed in the lane-damaged section based on the lane width and the curve information calculated before the detected lane-damaged section, the curve information being determined based on a quadratic curve equation when the curve change rate is 0, and based on a cubic curve equation when the curve change rate is not 0.
Further comprising the step of storing information corresponding to the lane width, wherein the lane width is calculated in real time or periodically.
Converting the lane feature points into world coordinates;
And calculating the lane width based on the lane feature points converted into the world coordinates.
Wherein an interval in which no lane feature points are detected in any one of the driving lanes, or in which lane feature points corresponding to the driving lanes are temporarily not detected, is detected as the lane-damaged section.
Detecting a heading direction of the vehicle when the lane-damaged section is detected;
And displaying the virtual lane in the lane-damaged section based on the lane width calculated before the lane-damaged section and the heading direction.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110078358A KR101677640B1 (en) | 2011-08-05 | 2011-08-05 | Apparatus for detecting lane and method thereof |
PCT/KR2011/009843 WO2013022154A1 (en) | 2011-08-05 | 2011-12-20 | Apparatus and method for detecting lane |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110078358A KR101677640B1 (en) | 2011-08-05 | 2011-08-05 | Apparatus for detecting lane and method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20130015981A KR20130015981A (en) | 2013-02-14 |
KR101677640B1 true KR101677640B1 (en) | 2016-11-18 |
Family
ID=47668643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020110078358A KR101677640B1 (en) | 2011-08-05 | 2011-08-05 | Apparatus for detecting lane and method thereof |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR101677640B1 (en) |
WO (1) | WO2013022154A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102268641B1 (en) * | 2014-07-10 | 2021-06-23 | 현대모비스 주식회사 | Around view system and the operating method |
CN106364403A (en) * | 2016-10-14 | 2017-02-01 | 深圳市元征科技股份有限公司 | Lane recognizing method and mobile terminal |
US11454970B2 (en) | 2018-05-21 | 2022-09-27 | Cummins Inc. | Adjustment of autonomous vehicle control authority |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010030399A (en) * | 2008-07-28 | 2010-02-12 | Nissan Motor Co Ltd | Vehicle operation support device and vehicle operation support method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100921427B1 (en) * | 2007-12-17 | 2009-10-14 | 한국전자통신연구원 | Method and Apparatus for generating virtual lane for video based car navigation system |
KR101176693B1 (en) * | 2008-03-13 | 2012-08-23 | 주식회사 만도 | Method and System for Detecting Lane by Using Distance Sensor |
KR101163446B1 (en) * | 2009-03-18 | 2012-07-18 | 기아자동차주식회사 | A lane departure warning system using a virtual lane and a system according to the same |
KR101262921B1 (en) * | 2009-11-10 | 2013-05-09 | 한국전자통신연구원 | Apparatus for keeping a traffic lane for a vehicle and method thereof |
-
2011
- 2011-08-05 KR KR1020110078358A patent/KR101677640B1/en active IP Right Grant
- 2011-12-20 WO PCT/KR2011/009843 patent/WO2013022154A1/en active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010030399A (en) * | 2008-07-28 | 2010-02-12 | Nissan Motor Co Ltd | Vehicle operation support device and vehicle operation support method |
Also Published As
Publication number | Publication date |
---|---|
KR20130015981A (en) | 2013-02-14 |
WO2013022154A1 (en) | 2013-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140226011A1 (en) | Traffic lane recognizing apparatus and method thereof | |
US10564392B2 (en) | Imaging apparatus and focus control method | |
KR102583682B1 (en) | Electronic device and method for dispalying sharing information based on augmented reality | |
KR101575159B1 (en) | Method of operating application for providing parking information to mobile terminal | |
KR101641251B1 (en) | Apparatus for detecting lane and method thereof | |
KR101563542B1 (en) | Parking information system using mobile terminal | |
US11801602B2 (en) | Mobile robot and driving method thereof | |
US20150262343A1 (en) | Image processing device and image processing method | |
US20160292888A1 (en) | Image measurement device, and recording medium | |
KR101257871B1 (en) | Apparatus and method for detecting object based on vanishing point and optical flow | |
KR101677640B1 (en) | Apparatus for detecting lane and method thereof | |
KR101612822B1 (en) | Apparatus for detecting lane and method thereof | |
KR102518535B1 (en) | Apparatus and method for processing image of vehicle | |
JP2013200778A (en) | Image processing device and image processing method | |
KR101658089B1 (en) | Method for estimating a center lane for lkas control and apparatus threof | |
JP2024052899A (en) | Communication device and communication method | |
KR101612821B1 (en) | Apparatus for tracing lane and method thereof | |
KR101224090B1 (en) | Apparatus and method for detecting nearing car | |
KR101612817B1 (en) | Apparatus and method for tracking car | |
KR102543742B1 (en) | Mobile terminal and computer readable recording medium recording operating method of mobile terminal | |
US20160379416A1 (en) | Apparatus and method for controlling object movement | |
KR102299500B1 (en) | Electronic apparatus and control method thereof | |
KR20130015975A (en) | Apparatus and method for detecting a vehicle | |
KR20140103021A (en) | Object recognition device | |
KR20230005034A (en) | Autonomous Vehicle, Control system for remotely controlling the same, and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment |
Payment date: 20191014 Year of fee payment: 4 |