CN110414487B - Method and device for identifying lane line

Method and device for identifying lane line


Publication number
CN110414487B
CN110414487B
Authority
CN
China
Prior art keywords
vehicle
image
lane line
acquiring
identification
Prior art date
Legal status
Active
Application number
CN201910759230.9A
Other languages
Chinese (zh)
Other versions
CN110414487A (en)
Inventor
刘威
张春民
宋希强
Current Assignee
Neusoft Reach Automotive Technology Shenyang Co Ltd
Original Assignee
Neusoft Reach Automotive Technology Shenyang Co Ltd
Priority date
Filing date
Publication date
Application filed by Neusoft Reach Automotive Technology Shenyang Co Ltd
Priority to CN201910759230.9A
Publication of CN110414487A
Application granted
Publication of CN110414487B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 — Scenes; Scene-specific elements
    • G06V 20/50 — Context or environment of the image
    • G06V 20/56 — Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 — Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Abstract

The application discloses a method and a device for identifying lane lines. In addition to a first image captured by a front-view camera of a vehicle, a second image captured by an image acquisition device located at a preset position at the front of the vehicle can be acquired, and the first image and the second image together form a complete image of the area in front of the vehicle. After the first image and the second image are acquired, they can be recognized to obtain the lane line in front of the vehicle. Therefore, in the embodiments of the application, the lane line in front of the vehicle can be determined from the complete image of the area ahead (namely, the first image and the second image).

Description

Method and device for identifying lane line
Technical Field
The present application relates to the field of vehicles, and in particular, to a method and an apparatus for identifying lane lines.
Background
With the development of science and technology, many vehicles are now equipped with an automatic driving system or a driving assistance system. Such a system can recognize the lane lines on the road from the images captured by the forward-looking camera of the vehicle, and then implement automatic or assisted driving based on the recognized lane lines.
However, with this way of implementing automatic or assisted driving, the vehicle may drive onto a lane line while traveling, degrading the effect of automatic or assisted driving.
Disclosure of Invention
The technical problem to be solved by the application is that, in the traditional way of implementing automatic or assisted driving, the vehicle may drive onto a lane line while traveling, so that the effect of automatic or assisted driving is poor. To this end, the application provides a method and a device for identifying lane lines.
In a first aspect, an embodiment of the present application provides a method for identifying a lane line, where the method includes:
acquiring a first image shot by a front-view camera of a vehicle and acquiring a second image shot by an image acquisition device positioned at a preset position in front of the vehicle; the first image and the second image constitute a complete image of the front of the vehicle;
and carrying out image recognition on the first image and the second image to obtain a lane line in front of the vehicle.
Optionally, the image recognizing the first image and the second image to obtain a lane line in front of the vehicle includes:
respectively carrying out image recognition on the first image and the second image to obtain a lane line corresponding to the first image and a lane line corresponding to the second image;
and fitting the lane line corresponding to the first image and the lane line corresponding to the second image to obtain the lane line in front of the vehicle.
Optionally, the method further includes:
acquiring a third image shot by a rearview camera of the vehicle;
carrying out image recognition on the third image to obtain a lane line behind the vehicle;
and fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain a global lane line.
Optionally, the method further includes:
acquiring a first position of the vehicle through a positioning device on the vehicle, and acquiring map information near the first position;
determining relative position information between a preset reference object in front of the vehicle and the vehicle according to the first image;
and according to the map information, determining a second position of which the relative position with the preset reference object meets the relative position information, and determining the second position as the position where the vehicle is actually located.
Optionally, the preset reference object includes any one or a combination of the following items:
lane lines, traffic signs, signal lights, and buildings.
Optionally, the method further includes:
acquiring a historical lane line corresponding to the second position; the historical lane line is a lane line calculated when the vehicle passed through the second position at a historical moment;
the image recognition of the first image and the second image to obtain the lane line in front of the vehicle includes:
and identifying the first image and the second image, and combining the identification result with the historical lane line to obtain the lane line in front of the vehicle.
Optionally, the preset position in front of the vehicle includes:
an air intake grille in front of the vehicle, and/or a bumper in front of the vehicle.
Optionally, the image acquisition device includes:
a fisheye camera, and/or a visual sensor.
In a second aspect, an embodiment of the present application provides an apparatus for identifying a lane line, where the apparatus includes:
a first acquisition unit, configured to acquire a first image captured by a front-view camera of a vehicle and a second image captured by an image acquisition device located at a preset position at the front of the vehicle, the first image and the second image constituting a complete image of the area in front of the vehicle;
and the first identification unit is used for carrying out image identification on the first image and the second image to obtain a lane line in front of the vehicle.
Optionally, the first identifying unit includes:
the identification subunit is used for respectively carrying out image identification on the first image and the second image to obtain a lane line corresponding to the first image and a lane line corresponding to the second image;
and the fitting subunit is used for fitting the lane line corresponding to the first image and the lane line corresponding to the second image to obtain the lane line in front of the vehicle.
Optionally, the apparatus further comprises:
the second acquisition unit is used for acquiring a third image shot by a rearview camera of the vehicle;
the second identification unit is used for carrying out image identification on the third image to obtain a lane line behind the vehicle;
and the fitting unit is used for fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain a global lane line.
Optionally, the apparatus further comprises:
the third acquisition unit is used for acquiring a first position where the vehicle is located through a positioning device on the vehicle and acquiring map information near the first position;
a first determination unit configured to determine, from the first image, relative position information between a preset reference object in front of the vehicle and the vehicle;
and the second determining unit is used for determining a second position of which the relative position with the preset reference object meets the relative position information according to the map information, and determining the second position as the position where the vehicle is actually located.
Optionally, the preset reference object includes any one or a combination of the following items:
lane lines, traffic signs, signal lights, and buildings.
Optionally, the apparatus further comprises:
a fourth acquisition unit, configured to acquire a historical lane line corresponding to the second position; the historical lane line is a lane line calculated when the vehicle passed through the second position at a historical moment;
the first identification unit is specifically configured to:
and identifying the first image and the second image, and combining the identification result with the historical lane line to obtain the lane line in front of the vehicle.
Optionally, the preset position in front of the vehicle includes:
an air intake grille in front of the vehicle, and/or a bumper in front of the vehicle.
Optionally, the image acquisition device includes:
a fisheye camera, and/or a visual sensor.

Compared with the prior art, the embodiments of the application have the following advantages:
In the embodiments of the application, it is recognized that in the prior art the forward-looking camera has a certain blind area at the near end of the vehicle; that is, the forward-looking camera may fail to capture the image of the area immediately in front of the vehicle, which affects the accuracy of lane line identification. Therefore, in the embodiments of the application, in addition to the first image captured by the front-view camera of the vehicle, a second image captured by an image acquisition device located at a preset position at the front of the vehicle may be obtained, and the first image and the second image constitute a complete image of the area in front of the vehicle. In other words, the second image includes the image corresponding to the blind area of the front-view camera near the front of the vehicle. After the first image and the second image are acquired, they can be recognized to obtain the lane line in front of the vehicle. Thus, the lane line in front of the vehicle can be determined from the complete image of the area ahead (namely, the first image and the second image): compared with the prior art, not only the lane line at the far end in front of the vehicle but also the lane line corresponding to the blind area of the front-view camera can be determined. The determined lane line is therefore more accurate, which avoids the vehicle driving onto a lane line during automatic or assisted driving and improves the effect of automatic or assisted driving.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a method for identifying a lane line according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a position determining method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a device for identifying a lane line according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only a part, rather than all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The inventors of the present application have found that, in the prior art, an automatic driving system or a driving assistance system may recognize the lane lines on the road from the images captured by the forward-looking camera of the vehicle, and then implement automatic or assisted driving based on the recognized lane lines. However, with this approach the vehicle may drive onto a lane line while traveling, so the effect of automatic or assisted driving is poor. When the forward-looking camera shoots, there is a certain blind area at the near end of the vehicle; that is, the camera may fail to capture the image of the area immediately in front of the vehicle. The lane line corresponding to the blind area therefore cannot be identified, which reduces the accuracy of lane line identification; as a result, the vehicle may drive onto a lane line during automatic or assisted driving, and the driving effect is poor.
In order to solve the above problem, in the embodiments of the present application, in addition to the first image captured by the front-view camera of the vehicle, a second image may be captured by an image acquisition device located at a preset position at the front of the vehicle, and the first image and the second image constitute a complete image of the area in front of the vehicle. In other words, the second image includes the image corresponding to the blind area of the front-view camera near the front of the vehicle. After the first image and the second image are acquired, they can be recognized to obtain the lane line in front of the vehicle. Thus, the lane line in front of the vehicle can be determined from the complete image of the area ahead (namely, the first image and the second image): compared with the prior art, not only the lane line at the far end in front of the vehicle but also the lane line corresponding to the blind area of the front-view camera can be determined. The determined lane line is therefore more accurate, which avoids the vehicle driving onto a lane line during automatic or assisted driving and improves the effect of automatic or assisted driving.
Various non-limiting embodiments of the present application are described in detail below with reference to the accompanying drawings.
Exemplary method
Referring to fig. 1, the figure is a schematic flowchart of a method for identifying a lane line according to an embodiment of the present application.
The method for identifying a lane line provided by the embodiments of the application may be implemented, for example, by a controller on the vehicle. The controller may be one already present on the vehicle; for example, for a vehicle equipped with an automatic driving or driving assistance system, it may be the controller of that system. The controller may also be a separately configured controller dedicated to identifying lane lines; the embodiments of the present application are not particularly limited in this respect.
The method for identifying a lane line provided in the embodiment of the present application can be implemented, for example, by the following steps S101 to S102.
S101: acquiring a first image shot by a front-view camera of a vehicle and acquiring a second image shot by an image acquisition device positioned at a preset position in front of the vehicle; the first image and the second image constitute a complete image in front of the vehicle.
In general, the front-view camera of a vehicle is mounted on the inner side of the front windshield. As a result, there is a blind area at the near end of the vehicle when the forward-looking camera shoots; that is, the camera may not capture the area immediately in front of the vehicle.
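The extent of this blind area follows from simple camera geometry. The sketch below estimates the closest road point a windshield-mounted camera can see on flat ground; the mounting height, tilt, and field of view are illustrative assumptions, not values from the patent:

```python
import math

def nearest_visible_ground_point(cam_height_m, tilt_deg, vfov_deg):
    """Closest road point a camera can see, assuming a flat road.

    tilt_deg: downward tilt of the optical axis; vfov_deg: vertical field
    of view. Illustrative geometry only -- the patent gives no numbers.
    """
    down = math.radians(tilt_deg + vfov_deg / 2.0)  # steepest visible ray
    if down <= 0:
        return math.inf                             # ray never hits the ground
    return cam_height_m / math.tan(down)

# A windshield camera 1.4 m up, tilted 5 degrees down, with a 40-degree
# vertical FOV cannot see the road closer than about 3 m; everything
# nearer than that lies in the blind area.
d_blind = nearest_visible_ground_point(1.4, 5.0, 40.0)
```

A camera at the intake grille or bumper sits much lower and looks almost straight down at the near road surface, which is why it can cover this blind area.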
It is understood that the first image captured by the front-view camera of the vehicle may be an image of a road surface of an area in front of the vehicle and corresponding objects in front of the vehicle, such as a traffic signboard, a signal lamp, a building, and the like.
In the embodiments of the application, in addition to the front-view camera already on the vehicle, an image acquisition device may be arranged at a preset position at the front of the vehicle, and the image of the area in front of the vehicle is acquired jointly by the image acquisition device and the front-view camera. The image acquisition device at the preset position can capture the image corresponding to the blind area of the front-view camera, so that the first image and the second image together constitute a complete image of the area in front of the vehicle.
The preset position is not particularly limited in the embodiments of the present application, and as an example, the preset position may be an intake grille in front of the vehicle. As yet another example, the preset position may be a bumper in front of a vehicle, and specifically, the image capturing apparatus may be located at any position on the bumper in front of the vehicle.
The embodiment of the present application does not particularly limit the image acquisition device either. As an example, the image acquisition device may be a fisheye camera, considering that the shooting angle of a fisheye camera reaches 180 degrees. As another example, it may be a vision sensor, considering that a vision sensor acquires close-range images with high precision and the blind area of the front-view camera is generally close to the vehicle.
It is understood that, in practical applications, an image acquisition device, such as a fisheye camera or a vision sensor, may be provided only at the intake grille, and not on the bumper; it is also possible to provide an image acquisition device, such as a fisheye camera or a vision sensor, only on the bumper, and not at the air intake grille; it is also possible to provide both an image acquisition device, such as a fisheye camera or a vision sensor, at the air intake grille and an image acquisition device, such as a fisheye camera or a vision sensor, on the bumper; the embodiments of the present application are not particularly limited.
S102: and carrying out image recognition on the first image and the second image to obtain a lane line in front of the vehicle.
In this embodiment of the application, after the first image and the second image are acquired, the first image and the second image may be identified, so as to obtain a lane line in front of the vehicle. The embodiment of the present application does not specifically limit a specific implementation manner of image recognition on the first image and the second image, and as an example, considering that a lane line generally includes a white line and a yellow line, the white line and the yellow line in the first image and the second image may be recognized, so as to obtain the lane line in front of the vehicle.
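As a rough illustration of the white/yellow recognition idea, the following sketch classifies individual pixels by color using only the standard library. The HSV thresholds are illustrative assumptions, not values from the patent; a real system would tune them for camera exposure and lighting:

```python
import colorsys

def classify_lane_pixel(r, g, b):
    """Classify an RGB pixel (0-255 per channel) as 'white', 'yellow', or None."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v > 0.8 and s < 0.2:
        return "white"          # bright, nearly colorless pixel
    # yellow hue sits around 60 degrees, i.e. h close to 1/6
    if 0.10 < h < 0.20 and s > 0.4 and v > 0.5:
        return "yellow"
    return None

# Scan a tiny synthetic image row for lane-colored pixels:
# bright white paint, yellow paint, dark asphalt.
row = [(250, 250, 250), (230, 200, 40), (60, 60, 60)]
labels = [classify_lane_pixel(*px) for px in row]
```

Pixels labeled white or yellow would then be grouped into line candidates; the grouping and curve extraction steps are beyond this sketch.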
According to the above description, utilize the scheme of this application embodiment, can be according to complete image in vehicle the place ahead (first image plus second image promptly), confirm the lane line in vehicle the place ahead, compare with the conventional art, not only can determine the lane line of vehicle the place ahead distal end, can also determine the lane line that the shooting blind area of forward-looking camera corresponds, so the lane line in vehicle the place ahead of confirming is more accurate, thereby avoid the problem that the in-process vehicle of automatic driving or assisted driving steps on the lane line, the effect of automatic driving or assisted driving has been promoted.
In an implementation of the embodiments of the present application, considering that the first image and the second image are not the same image, the first image and the second image may each be subjected to image recognition to obtain the lane line corresponding to the first image and the lane line corresponding to the second image, and the two are then fitted to obtain the lane line in front of the vehicle. For example, the white lines and yellow lines in the first image and in the second image are recognized separately; the white line recognized from the first image and the white line recognized from the second image are fitted to obtain a complete white lane line, and the yellow lines are fitted likewise to obtain a complete yellow lane line. The complete white lane line and the complete yellow lane line together form the lane line in front of the vehicle. Specifically, a vehicle coordinate system may be constructed, and the lane line equation corresponding to each image in the vehicle coordinate system obtained through image recognition; fitting the lane line corresponding to the first image and the lane line corresponding to the second image then amounts to fitting the two lane line equations into one equation for the lane line in front of the vehicle.
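A minimal sketch of the fitting step might merge the near-field points recovered from the second image with the far-field points from the first image and fit a single equation through them. The degree-1 model and the sample points below are illustrative assumptions; the patent does not fix a polynomial degree:

```python
def fit_line(points):
    """Least-squares fit y = a*x + b through (x, y) points (pure Python).

    Here x is the forward distance from the vehicle and y the lateral
    offset, both in the vehicle coordinate system.
    """
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Near-field points recovered from the second image (grille/bumper camera)
near = [(0.5, 1.74), (1.0, 1.78), (2.0, 1.86)]
# Far-field points recovered from the first image (windshield camera)
far = [(10.0, 2.50), (20.0, 3.30), (30.0, 4.10)]

# Fitting the merged point set yields one continuous lane-line equation.
a, b = fit_line(near + far)
```

The same fit applied to the white points and the yellow points separately would yield one equation per painted line.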
The vehicle coordinate system may be constructed with a center point of a rear axle of the vehicle as an origin of the vehicle coordinate system, and directions of respective coordinate axes of the vehicle coordinate system may be determined according to actual conditions, which is not limited herein.
It can be understood that, while the vehicle is traveling, the coordinates of the center point of its rear axle in the world coordinate system change, and the above-mentioned equation of the lane line in front of the vehicle refers to the equation in the vehicle coordinate system at the time the equation is calculated. For example, suppose the vehicle captures the first image and the second image at time t1, and the controller needs a certain time to compute the lane line from them (this time is generally short and can be ignored when the controller is efficient enough), obtaining the lane line at time t2. Times t1 and t2 each correspond to a vehicle coordinate system, and the lane line equation is the one in the vehicle coordinate system corresponding to time t2. If the controller is efficient enough, this is effectively the lane line equation in the vehicle coordinate system corresponding to time t1. When the difference between t1 and t2 is not negligible, the lane line equation in the vehicle coordinate system at t1 may first be calculated, and then converted into the equation in the vehicle coordinate system at t2 according to the coordinate transformation between the two coordinate systems.
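The conversion between the vehicle coordinate systems at t1 and t2 can be sketched as a rigid 2-D transform, assuming the vehicle's motion between the two moments is known (for example from odometry; the patent does not specify the source of this motion estimate):

```python
import math

def to_later_frame(points, dx, dy, dtheta):
    """Re-express points from the t1 vehicle frame in the t2 vehicle frame.

    (dx, dy, dtheta) is the vehicle pose at t2 expressed in the t1 frame;
    treating the conversion as a rigid 2-D transform is an assumption of
    this sketch.
    """
    c, s = math.cos(dtheta), math.sin(dtheta)
    out = []
    for x, y in points:
        u, v = x - dx, y - dy          # translate into the t2 origin
        out.append((c * u + s * v,     # rotate by -dtheta
                    -s * u + c * v))
    return out

# Vehicle drove 1 m straight ahead between t1 and t2, no heading change:
# a lane-line point 5 m ahead at t1 is only 4 m ahead at t2.
pts = to_later_frame([(5.0, 1.7)], dx=1.0, dy=0.0, dtheta=0.0)
```

Applying this transform to sampled points of the t1 lane line (and refitting) yields the lane line equation in the t2 frame.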
In another implementation of the embodiments of the present application, in order to improve the accuracy of the calculated lane line in front of the vehicle, map information near the current position of the vehicle may be obtained, and the lane line determined by combining the map information with the first and second images. Specifically, the position of the vehicle may be acquired with a positioning device on the vehicle, such as a GPS positioning system; map information near that position is then obtained, lane line information is extracted from it, and the extracted lane line information is matched against the calculated lane line information, the final lane line being determined from the matching result. For example, when the error between the lane line information extracted from the map and the calculated lane line information is small, either the lane line from the map or the calculated lane line may be directly taken as the lane line in front of the vehicle. When the error is large, it may be attributed to the positioning error of the GPS, and the lane line in front of the vehicle may be recalculated in combination with other information, for example the historical lane lines described below.
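The matching step can be sketched as comparing the lateral offsets of the two lane-line models at a few forward distances. The 0.2 m threshold is an illustrative assumption, since the patent only distinguishes "small" from "large" error:

```python
def mean_lateral_error(map_line, seen_line, xs):
    """Mean absolute lateral gap between two lane-line models y = a*x + b.

    map_line / seen_line are (a, b) coefficient pairs; xs are forward
    distances (in meters) at which the two models are compared.
    """
    return sum(
        abs((map_line[0] * x + map_line[1]) - (seen_line[0] * x + seen_line[1]))
        for x in xs
    ) / len(xs)

map_lane = (0.08, 1.70)    # lane line extracted from the map
seen_lane = (0.08, 1.75)   # lane line computed from the two images
err = mean_lateral_error(map_lane, seen_lane, xs=[0, 10, 20, 30])
trusted = err < 0.2        # small error: either source may be used directly
```

If `trusted` is false, the error would instead be attributed to GPS positioning error and the lane line recomputed with additional information.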
In an implementation manner of the embodiment of the application, in order to make the identified lane line more complete, a third image shot by a rear-view camera of the vehicle can be obtained; carrying out image recognition on the third image to obtain a lane line behind the vehicle; and fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain a global lane line.
In the embodiments of the present application, the rear-view camera may be disposed, for example, at a certain position at the rear of the vehicle, so that it can capture an image of the road surface behind the vehicle.
It should be noted that the specific implementation of recognizing the third image to obtain the lane line behind the vehicle is similar to that of recognizing the first image to obtain the lane line corresponding to the first image; reference may be made to the foregoing description of the latter, and details are not repeated here.
In the embodiments of the present application, the implementation of fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain the global lane line is similar to the implementation of fitting the lane line corresponding to the first image and the lane line corresponding to the second image to obtain the lane line in front of the vehicle; reference may therefore be made to the foregoing description of the latter, and details are not repeated here.
In the embodiments of the present application, it is considered that, in practical applications, while a vehicle with the automatic driving or driving assistance function enabled is traveling, the traveling route is adjusted or planned in real time according to the position of the vehicle. Accurately determining the position of the vehicle while it is driving is therefore important. Currently, the position of the vehicle can be determined with a positioning device on the vehicle, such as a GPS positioning system; however, the positioning device has a certain positioning error. In order to determine the position of the vehicle accurately while driving, the method provided in the embodiments of the present application may further include steps S201 to S203 shown in Fig. 2. Fig. 2 is a schematic flowchart of a position determining method according to an embodiment of the present application.
S201: the method comprises the steps of obtaining a first position where the vehicle is located through a positioning device on the vehicle, and obtaining map information near the first position.
It should be noted that the positioning device on the vehicle mentioned in the embodiments of the present application may be, for example, a GPS positioning system on the vehicle. It will be appreciated that the first position is the current position of the vehicle as indicated by the positioning device.
The embodiment of the present application is not particularly limited to the specific implementation of obtaining the map information near the first location, and as an example, the map information near the first location may be obtained through the internet; of course, if the map information is stored locally in the vehicle, the map information in the vicinity of the first position may be acquired locally from the vehicle.
In the embodiment of the present application, the map information near the first position may be regarded as map information including the first position and including objects near the first position.
S202: and determining relative position information between a preset reference object in front of the vehicle and the vehicle according to the first image.
As can be seen from the foregoing, the first image may be an image including a road surface of an area in front of the vehicle, and corresponding objects in front of the vehicle, such as a traffic signboard, a signal light, and a building. In view of this, in the embodiment of the present application, the first image may be analyzed to determine the relative position information between the preset reference object in front of the vehicle and the vehicle.
The embodiment of the present application is not particularly limited to the preset reference object, and the preset reference object may be at least one of a lane line, a traffic signboard, a signal lamp, and a building, for example. The lane line in front of the vehicle may be the lane line in front of the vehicle determined by the method provided in the above embodiment. The traffic signboard, the signal lamp and the building in front of the vehicle can be obtained by carrying out image recognition on the first image.
As described above, the lane line in front of the vehicle may be embodied as a lane line equation in a vehicle coordinate system. Accordingly, when the preset reference object includes the lane line in front of the vehicle, the relative position information between the preset reference object and the vehicle may be the lane line equation itself. When the preset reference object includes at least one of a traffic signboard, a signal lamp, and a building in front of the vehicle, the relative position information between the preset reference object and the vehicle may be information representing the relative positional relationship between the vehicle and that reference object, for example, the distance and the direction between the preset reference object and the vehicle.
In this embodiment, if the preset reference object includes at least one of a traffic signboard, a signal lamp, and a building in front of the vehicle, that reference object may first be obtained through image recognition on the first image. Then, in combination with the view-angle information of the front-view camera that captured the first image, the relative positional relationship between the reference object and the vehicle, such as the distance and the direction between them, may be determined.
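As an illustration of the view-angle computation described above, the following sketch estimates a reference object's position relative to the camera from its pixel coordinates. It assumes a pinhole camera with known intrinsics, flat ground, and a level optical axis; none of these assumptions, and none of the function or parameter names, are fixed by the present embodiment.

```python
import math

def relative_position(u, v, fx, fy, cx, cy, cam_height):
    """Estimate the ground-plane position of a reference object relative to
    the camera from its pixel coordinates (u, v), under a pinhole model with
    focal lengths (fx, fy), principal point (cx, cy), and a camera mounted
    cam_height above a flat road, looking straight ahead (a simplifying
    assumption; the patent does not fix a projection model).

    Returns (forward_distance, lateral_offset) in the units of cam_height.
    """
    if v <= cy:
        raise ValueError("object must project below the horizon (v > cy)")
    # The ray through the pixel meets the ground plane cam_height below the camera.
    forward = fy * cam_height / (v - cy)   # distance along the optical axis
    lateral = (u - cx) * forward / fx      # offset to the right of the axis
    return forward, lateral

# A sign base imaged at pixel (760, 420) by a camera with 1000-px focal
# lengths and principal point (640, 360), mounted 1.5 m above the road:
fwd, lat = relative_position(760, 420, 1000.0, 1000.0, 640.0, 360.0, 1.5)
distance = math.hypot(fwd, lat)  # straight-line distance to the sign
```

With these illustrative numbers the sign lies 25 m ahead and 3 m to the right, which is the kind of distance-and-direction information S202 requires.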
S203: and according to the map information, determining a second position of which the relative position with the preset reference object meets the relative position information, and determining the second position as the position where the vehicle is actually located.
It will be appreciated that although the positioning device on the vehicle has a certain positioning error, the error is generally on the order of decimeters and is not particularly large; therefore, the map information obtained based on the first position includes both the position where the vehicle is actually located and the preset reference object.
Therefore, in the embodiment of the present application, a second position whose relative position with respect to the preset reference object satisfies the aforementioned relative position information may be determined in combination with the map information, and the second position may be determined as the position where the vehicle is actually located.
Since the relative position information between the preset reference object and the vehicle is calculated based on the first image and can be regarded as accurate, and the map information near the first position can also be regarded as accurate and reliable, the second position determined by combining the two can be regarded as the position where the vehicle is actually located. It has been verified that, with the method of S201-S203, the positioning error is on the order of centimeters, so the position determined by the method of S201-S203 is more accurate than that provided by the positioning device on the vehicle.
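A minimal sketch of S203 follows. The local planar (east, north) frame, the function names, and the 5 m plausibility gate are all assumptions for illustration; the embodiment does not prescribe them. The second position is recovered by subtracting the image-derived offset from the reference object's map coordinates and checking the result against the first position:

```python
import math

def second_position(first_pos, ref_map_pos, rel_offset):
    """Compute the vehicle's corrected ('second') position from the map
    coordinates of a preset reference object and the image-derived offset
    of that object relative to the vehicle. All coordinates are (east,
    north) pairs in metres in a local planar frame; rel_offset points from
    the vehicle to the reference object. Illustrative only.
    """
    east = ref_map_pos[0] - rel_offset[0]
    north = ref_map_pos[1] - rel_offset[1]
    # Sanity check: the correction should stay within the decimetre-level
    # error band of the on-board positioning device (a loose 5 m gate here).
    if math.hypot(east - first_pos[0], north - first_pos[1]) > 5.0:
        raise ValueError("corrected position implausibly far from the first position")
    return east, north

# The positioning device reports (100.0, 200.0); the map places a signboard
# at (103.0, 225.0); the first image says the signboard is 2.6 m east and
# 24.8 m north of the vehicle:
pos = second_position((100.0, 200.0), (103.0, 225.0), (2.6, 24.8))
```

Here the corrected position (100.4, 200.2) differs from the first position by well under a metre, consistent with a decimetre-level positioning error being refined to centimetre level.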
In another implementation manner of the embodiment of the present application, it is considered that, in practical applications, the vehicle may pass the same position multiple times. For convenience of description, the lane line in front of the vehicle calculated when the vehicle previously passed the current position is referred to as a "historical lane line". In order to improve the accuracy of the calculated lane line in front of the vehicle, if a historical lane line corresponding to the current position of the vehicle exists, the historical lane line, the first image, and the second image may be combined to determine the lane line in front of the vehicle. Specifically, the historical lane line calculated when the vehicle passed through the second position at a historical time may be acquired, image recognition may be performed on the first image and the second image to obtain an initial lane line, and the historical lane line and the initial lane line may then be processed correspondingly, for example by a Kalman filtering method, so as to obtain the lane line in front of the vehicle.
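The fusion of the historical lane line with the freshly recognized initial lane line can be sketched as a per-coefficient Kalman measurement update. This is a simplification: the embodiment only says a Kalman filtering method may be used, and the variance values below are assumed inputs, not values from the present application.

```python
def fuse_lane_lines(hist_coeffs, hist_var, init_coeffs, init_var):
    """Fuse a historical lane line with the initial lane line recognized
    from the first and second images, coefficient by coefficient, using the
    standard scalar Kalman measurement update. hist_var / init_var are the
    (assumed) variances of the historical estimate and the new measurement.
    """
    fused, fused_var = [], []
    for h, m in zip(hist_coeffs, init_coeffs):
        k = hist_var / (hist_var + init_var)    # Kalman gain
        fused.append(h + k * (m - h))           # corrected coefficient
        fused_var.append((1.0 - k) * hist_var)  # reduced uncertainty
    return fused, fused_var

# Historical line x = 0.002*y^2 + 0.01*y + 1.5 (vehicle coordinates) and a
# fresh detection slightly off, with equal assumed variances:
coeffs, var = fuse_lane_lines([0.002, 0.01, 1.5], 0.04,
                              [0.004, 0.02, 1.7], 0.04)
```

With equal variances the gain is 0.5, so each fused coefficient is the midpoint of the two estimates and the posterior variance is halved; unequal variances would weight the more reliable source more heavily.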
The historical time is not particularly limited in the embodiments of the present application, and the historical time may be any time before the time corresponding to the acquisition of the first image and the second image.
Exemplary device
Based on the method for identifying a lane line provided in the above embodiments, an apparatus for identifying a lane line is also provided in the embodiments of the present application, which is described below with reference to the accompanying drawings.
Referring to fig. 3, the figure is a schematic structural diagram of an apparatus for identifying a lane line according to an embodiment of the present application. The apparatus 300 may specifically include, for example: a first acquisition unit 301 and a first recognition unit 302.
A first acquiring unit 301, configured to acquire a first image captured by a front-view camera of a vehicle, and acquire a second image captured by an image capturing device located at a preset position in front of the vehicle; the first image and the second image constitute a complete image in front of the vehicle.
A first recognition unit 302, configured to perform image recognition on the first image and the second image to obtain a lane line in front of the vehicle.
Optionally, the first recognition unit 302 includes:
the identification subunit is used for respectively carrying out image identification on the first image and the second image to obtain a lane line corresponding to the first image and a lane line corresponding to the second image;
and the fitting subunit is used for fitting the lane line corresponding to the first image and the lane line corresponding to the second image to obtain the lane line in front of the vehicle.
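As a sketch of how the identification and fitting subunits might combine their outputs, the following fits a single lane line through points sampled from the lane line of each image, both already expressed in the vehicle coordinate system. The patent does not fix a curve model or fitting method; a straight line and ordinary least squares are assumed here for brevity, and the names are illustrative.

```python
def fit_lane_line(far_pts, near_pts):
    """Fit one straight lane line x = a*y + b (vehicle coordinates: y
    forward, x lateral) through points sampled from the lane line of the
    first image (far field) and of the second image (near field).
    Points are (y, x) pairs. Closed-form least squares.
    """
    pts = list(far_pts) + list(near_pts)
    n = len(pts)
    sy = sum(y for y, _ in pts)
    sx = sum(x for _, x in pts)
    syy = sum(y * y for y, _ in pts)
    syx = sum(y * x for y, x in pts)
    a = (n * syx - sy * sx) / (n * syy - sy * sy)  # slope
    b = (sx - a * sy) / n                          # intercept
    return a, b

# Far-field samples (front-view camera) and near-field samples (grille or
# bumper camera) lying on the same line x = 0.1*y + 1.0:
a, b = fit_lane_line([(20.0, 3.0), (40.0, 5.0)], [(2.0, 1.2), (5.0, 1.5)])
```

Because the near-field samples cover the front-view camera's shooting blind area, the fitted line is constrained both close to the vehicle and at the far end, which is the benefit the embodiment attributes to using both images.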
Optionally, the apparatus 300 further includes:
the second acquisition unit is used for acquiring a third image shot by a rearview camera of the vehicle;
the second identification unit is used for carrying out image identification on the third image to obtain a lane line behind the vehicle;
and the fitting unit is used for fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain a global lane line.
Optionally, the apparatus 300 further includes:
the third acquisition unit is used for acquiring a first position where the vehicle is located through a positioning device on the vehicle and acquiring map information near the first position;
a first determination unit configured to determine, from the first image, relative position information between a preset reference object in front of the vehicle and the vehicle;
and the second determining unit is used for determining a second position of which the relative position with the preset reference object meets the relative position information according to the map information, and determining the second position as the position where the vehicle is actually located.
Optionally, the preset reference object includes any one or a combination of the following items:
lane lines, traffic signs, signal lights, and buildings.
Optionally, the apparatus 300 further includes:
a fourth obtaining unit, configured to obtain a historical lane line corresponding to the second position; the historical lane line is calculated when the vehicle passes through the second position at the historical moment;
the first identifying unit 302 is specifically configured to:
and identifying the first image and the second image, and combining the identification result with the historical lane line to obtain the lane line in front of the vehicle.
Optionally, the preset position in front of the vehicle includes:
an air intake grille in front of the vehicle, and/or a bumper in front of the vehicle.
Optionally, the image acquiring apparatus includes:
a fisheye camera, and/or a visual sensor.
Since the apparatus 300 corresponds to the method provided in the above method embodiment, and the specific implementation of each unit of the apparatus 300 is the same as that of the above method embodiment, reference may be made to the description of the above method embodiment for the specific implementation of each unit of the apparatus 300, and details are not repeated here.
As can be seen from the above description, with the solution of the embodiment of the present application, the lane line in front of the vehicle can be determined based on the complete image in front of the vehicle (that is, the first image plus the second image). Compared with the conventional technology, not only can the lane line at the far end in front of the vehicle be determined, but also the lane line corresponding to the shooting blind area of the front-view camera, so the determined lane line in front of the vehicle is more accurate. This avoids the problem of the vehicle pressing the lane line during automatic driving or assisted driving, and improves the effect of automatic driving or assisted driving.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A method of identifying a lane line, the method comprising:
acquiring a first image shot by a front-view camera of a vehicle and acquiring a second image shot by an image acquisition device positioned at a preset position in front of the vehicle; the first image and the second image constitute a complete image of the front of the vehicle; the front-view camera of the vehicle is placed on the inner side of a front windshield of the vehicle and is used for shooting a far-end lane line of the vehicle and a corresponding object in front of the vehicle; the preset position is an air inlet grille in front of the vehicle or a bumper in front of the vehicle, and the image acquisition equipment in the preset position is used for acquiring a shooting blind area at the near end of the vehicle when the front-view camera shoots;
performing image recognition on the first image and the second image to obtain a lane line in front of the vehicle, comprising:
respectively carrying out image recognition on the first image and the second image to obtain a lane line corresponding to the first image and a lane line corresponding to the second image, and specifically comprising the following steps: constructing a vehicle coordinate system by taking the central point of the rear axle of the vehicle as the origin of the vehicle coordinate system, and obtaining a lane line equation corresponding to the first image in the vehicle coordinate system and a lane line equation corresponding to the second image in the vehicle coordinate system by utilizing image recognition;
fitting the lane line corresponding to the first image with the lane line corresponding to the second image to obtain the lane line in front of the vehicle, which specifically comprises: fitting a lane line equation corresponding to the first image and a lane line equation corresponding to the second image, and obtaining an equation corresponding to the lane line in front of the vehicle.
2. The method of claim 1, further comprising:
acquiring a third image shot by a rearview camera of the vehicle;
carrying out image recognition on the third image to obtain a lane line behind the vehicle;
and fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain a global lane line.
3. The method of claim 1, further comprising:
acquiring a first position of the vehicle through a positioning device on the vehicle, and acquiring map information near the first position;
determining relative position information between a preset reference object in front of the vehicle and the vehicle according to the first image;
and according to the map information, determining a second position of which the relative position with the preset reference object meets the relative position information, and determining the second position as the position where the vehicle is actually located.
4. The method of claim 3, wherein the pre-set reference comprises any one or a combination of:
lane lines, traffic signs, signal lights, and buildings.
5. The method of claim 3, further comprising:
acquiring a historical lane line corresponding to the second position; the historical lane line is calculated when the vehicle passes through the second position at the historical moment;
the image recognition of the first image and the second image to obtain the lane line in front of the vehicle includes:
and identifying the first image and the second image, and combining an identification result with the historical lane line to obtain the lane line in front of the vehicle.
6. The method according to any one of claims 1 to 5, wherein the image acquisition device comprises:
a fisheye camera, and/or a visual sensor.
7. An apparatus for identifying a lane line, the apparatus comprising:
the device comprises a first acquisition unit, a second acquisition unit and a control unit, wherein the first acquisition unit is used for acquiring a first image shot by a front-view camera of a vehicle and acquiring a second image shot by an image acquisition device positioned in a preset position in front of the vehicle; the first image and the second image constitute a complete image of the front of the vehicle; the front-view camera of the vehicle is placed on the inner side of a front windshield of the vehicle and is used for shooting a far-end lane line of the vehicle and a corresponding object in front of the vehicle; the preset position is an air inlet grille in front of the vehicle or a bumper in front of the vehicle, and the image acquisition equipment in the preset position is used for acquiring a shooting blind area at the near end of the vehicle when the front-view camera shoots;
the first identification unit is used for carrying out image identification on the first image and the second image to obtain a lane line in front of the vehicle;
the first recognition unit includes:
the identification subunit is used for respectively carrying out image identification on the first image and the second image to obtain a lane line corresponding to the first image and a lane line corresponding to the second image;
the identification subunit is specifically configured to construct a vehicle coordinate system with a rear axle center point of the vehicle as an origin of the vehicle coordinate system, and obtain a lane line equation corresponding to the first image in the vehicle coordinate system and a lane line equation corresponding to the second image in the vehicle coordinate system by using image identification;
the fitting subunit is configured to fit a lane line corresponding to the first image and a lane line corresponding to the second image to obtain a lane line in front of the vehicle;
the fitting subunit is specifically configured to fit a lane line equation corresponding to the first image and a lane line equation corresponding to the second image; and obtaining an equation corresponding to the lane line in front of the vehicle.
8. The apparatus of claim 7, further comprising:
the second acquisition unit is used for acquiring a third image shot by a rearview camera of the vehicle;
the second identification unit is used for carrying out image identification on the third image to obtain a lane line behind the vehicle;
and the fitting unit is used for fitting the lane line in front of the vehicle and the lane line behind the vehicle to obtain a global lane line.
9. The apparatus of claim 7, further comprising:
the third acquisition unit is used for acquiring a first position of the vehicle through a positioning device on the vehicle and acquiring map information near the first position;
a first determination unit configured to determine, from the first image, relative position information between a preset reference object in front of the vehicle and the vehicle;
and the second determining unit is used for determining a second position of which the relative position with the preset reference object meets the relative position information according to the map information, and determining the second position as the position where the vehicle is actually located.
10. The apparatus of claim 9, wherein the preset reference comprises any one or a combination of the following:
lane lines, traffic signs, signal lights, and buildings.
11. The apparatus of claim 9, further comprising:
a fourth obtaining unit, configured to obtain a historical lane line corresponding to the second position; the historical lane line is calculated when the vehicle passes through the second position at the historical moment;
the first identification unit is specifically configured to:
and identifying the first image and the second image, and combining an identification result with the historical lane line to obtain the lane line in front of the vehicle.
12. The apparatus according to any one of claims 7 to 11, wherein the image acquisition device comprises:
a fisheye camera, and/or a visual sensor.
CN201910759230.9A 2019-08-16 2019-08-16 Method and device for identifying lane line Active CN110414487B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910759230.9A CN110414487B (en) 2019-08-16 2019-08-16 Method and device for identifying lane line

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910759230.9A CN110414487B (en) 2019-08-16 2019-08-16 Method and device for identifying lane line

Publications (2)

Publication Number Publication Date
CN110414487A CN110414487A (en) 2019-11-05
CN110414487B true CN110414487B (en) 2022-05-13

Family

ID=68367499

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910759230.9A Active CN110414487B (en) 2019-08-16 2019-08-16 Method and device for identifying lane line

Country Status (1)

Country Link
CN (1) CN110414487B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117218619A (en) * 2023-11-07 2023-12-12 安徽中科星驰自动驾驶技术有限公司 Lane recognition method and system for automatic driving vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103991449A (en) * 2014-06-12 2014-08-20 北京联合大学 Vehicle travelling control method and system
CN105426864A (en) * 2015-12-04 2016-03-23 华中科技大学 Multiple lane line detecting method based on isometric peripheral point matching
CN108528456A (en) * 2017-03-03 2018-09-14 通用汽车环球科技运作有限责任公司 Lane detection system and method
CN108765496A (en) * 2018-05-24 2018-11-06 河海大学常州校区 A kind of multiple views automobile looks around DAS (Driver Assistant System) and method
CN109816980A (en) * 2019-02-20 2019-05-28 东软睿驰汽车技术(沈阳)有限公司 The method and relevant apparatus in lane locating for a kind of determining vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102421855B1 (en) * 2017-09-28 2022-07-18 삼성전자주식회사 Method and apparatus of identifying driving lane
CN109271857A (en) * 2018-08-10 2019-01-25 广州小鹏汽车科技有限公司 A kind of puppet lane line elimination method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103991449A (en) * 2014-06-12 2014-08-20 北京联合大学 Vehicle travelling control method and system
CN105426864A (en) * 2015-12-04 2016-03-23 华中科技大学 Multiple lane line detecting method based on isometric peripheral point matching
CN108528456A (en) * 2017-03-03 2018-09-14 通用汽车环球科技运作有限责任公司 Lane detection system and method
CN108765496A (en) * 2018-05-24 2018-11-06 河海大学常州校区 A kind of multiple views automobile looks around DAS (Driver Assistant System) and method
CN109816980A (en) * 2019-02-20 2019-05-28 东软睿驰汽车技术(沈阳)有限公司 The method and relevant apparatus in lane locating for a kind of determining vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Research on Lane Line Detection Algorithms for Expressways"; Hou Lilong; China Master's Theses Full-text Database (Electronic Journal); 2013-02-15; Chapter 4 *

Also Published As

Publication number Publication date
CN110414487A (en) 2019-11-05

Similar Documents

Publication Publication Date Title
US11014561B2 (en) Vehicle trailer hitch assist system
EP2071491B1 (en) Stereo camera device
US9151626B1 (en) Vehicle position estimation system
US9389093B2 (en) Traffic signal recognition apparatus
JP4530060B2 (en) Parking support apparatus and method
US8094192B2 (en) Driving support method and driving support apparatus
US9126533B2 (en) Driving support method and driving support device
WO2012091476A2 (en) Apparatus and method for displaying a blind spot
TWI534764B (en) Apparatus and method for vehicle positioning
US9953227B2 (en) In-vehicle image processing device
CN106463051B (en) Traffic signal recognition device and traffic signal recognition method
CN104802710B (en) A kind of intelligent automobile reversing aid system and householder method
US20170313253A1 (en) Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle
CN108349533B (en) Method for determining a parking area for parking a motor vehicle, driver assistance system and motor vehicle
US20190135169A1 (en) Vehicle communication system using projected light
CN108177524B (en) ARHUD system and lane line drawing method thereof
CN110491156A (en) A kind of cognitive method, apparatus and system
US11130418B2 (en) Method and apparatus for aligning a vehicle with a wireless charging system
CN110780287A (en) Distance measurement method and distance measurement system based on monocular camera
JP2015125708A (en) Traffic light recognition device
CN110414487B (en) Method and device for identifying lane line
JP2020047210A (en) Object detection device
KR101424636B1 (en) Automatic parking system for vehicle
JP2006344133A (en) Road division line detector
KR102552712B1 (en) System for estimating a vehicle location and method for estimating the vehicle location using the system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant