CN112528807A - Method and device for predicting driving track, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112528807A
CN112528807A
Authority
CN
China
Prior art keywords
track
line
points
image
lane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011403570.7A
Other languages
Chinese (zh)
Other versions
CN112528807B (en)
Inventor
李奕润
蔡永辉
刘业鹏
程骏
庞建新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202011403570.7A priority Critical patent/CN112528807B/en
Publication of CN112528807A publication Critical patent/CN112528807A/en
Application granted granted Critical
Publication of CN112528807B publication Critical patent/CN112528807B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Abstract

The present application is applicable to the technical field of automatic driving, and provides a method and an apparatus for predicting a driving track, an electronic device, and a storage medium. The method for predicting the driving track includes: acquiring an image of a current road; extracting a lane line from the image; determining, according to the lane line, track points for fitting a driving track and a bend-in point of the driving track; dividing the track points into track points of a straight-line part and track points of a curved part according to the bend-in point; fitting the track points of the straight-line part and the track points of the curved part separately to obtain a straight-line track and a curved track; and connecting the straight-line track and the curved track to obtain the driving track. This improves the accuracy of track fitting at the curve, thereby improving the accuracy of the predicted driving track and the stability of the vehicle while driving.

Description

Method and device for predicting driving track, electronic equipment and storage medium
Technical Field
The present application relates to the field of automatic driving technologies, and in particular, to a method and an apparatus for predicting a driving trajectory, an electronic device, and a storage medium.
Background
With the continuous development of artificial intelligence in recent years, unmanned driving has received increasing attention. Compared with traditional human-driven vehicles, unmanned vehicles can greatly reduce the number of traffic accidents, relieve traffic congestion, help improve fuel efficiency, reduce greenhouse-gas emissions, and save drivers considerable time.
In unmanned driving, predicting the vehicle's driving track is key to ensuring that an unmanned vehicle can drive stably along the lane lines. Existing unmanned-driving techniques cannot predict the driving track accurately, so the vehicle cannot drive stably.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for predicting a driving track, an electronic device, and a storage medium, which can improve the accuracy of the predicted driving track and the smoothness of the vehicle while driving.
A first aspect of the embodiments of the present application provides a method for predicting a driving track, including:
acquiring an image of a current road;
extracting a lane line from the image;
determining, according to the lane line, track points for fitting a driving track and a bend-in point of the driving track;
dividing the track points into track points of a straight-line part and track points of a curved part according to the bend-in point;
and fitting the track points of the straight-line part and the track points of the curved part separately to obtain a straight-line track and a curved track, and connecting the straight-line track and the curved track to obtain the driving track.
In a possible implementation, the extracting a lane line from the image includes:
extracting the lane line from the image by using a neural-network-based image segmentation algorithm.
In a possible implementation, the image is captured from a first angle of view, and the extracting the lane line from the image includes:
performing perspective transformation on the image to obtain a top-view image;
and extracting the lane line from the top-view image.
In a possible implementation, the lane lines in the top view include a left lane line, a right lane line, and a dashed line located between the left lane line and the right lane line, where the dashed line includes at least two line segments and the driving track includes a left driving track and a right driving track. The determining, according to the lane lines, track points for fitting a driving track includes:
taking the midpoint of each first connecting line as a track point for fitting the left driving track, where a first connecting line connects the center of a line segment to the left lane line and is perpendicular to that line segment;
and taking the midpoint of each second connecting line as a track point for fitting the right driving track, where a second connecting line connects the center of a line segment to the right lane line and is perpendicular to that line segment.
In a possible implementation, the determining a bend-in point of the driving track according to the lane line includes:
determining a bend-in point of the lane line;
and determining the bend-in point of the driving track according to the bend-in point of the lane line.
In one possible implementation, the determining the bend-in point of the lane line includes:
determining a grayscale image corresponding to the lane line in the top view;
and determining the bend-in point of the lane line according to the sum of the pixel values of each column of pixels in the grayscale image, where the direction of each column of pixels is consistent with the direction of the straight-line track.
In a possible implementation, after the connecting the straight-line track and the curved track to obtain the driving track, the method for predicting a driving track further includes:
determining, according to the driving track, the driving speed of the vehicle and the steering-gear angle with which the vehicle enters the curve.
A second aspect of the embodiments of the present application provides a device for predicting a driving track, including:
the acquisition module is used for acquiring an image of a current road;
the extraction module is used for extracting the lane line from the image;
the determining module is configured to determine, according to the lane line, track points for fitting a driving track and a bend-in point of the driving track;
the dividing module is configured to divide the track points into track points of a straight-line part and track points of a curved part according to the bend-in point;
and the fitting module is configured to fit the track points of the straight-line part and the track points of the curved part separately to obtain a straight-line track and a curved track, and to connect the straight-line track and the curved track to obtain the driving track.
In a possible implementation manner, the extraction module is specifically configured to:
and extracting the lane lines from the image by adopting an image segmentation algorithm based on a neural network.
In a possible implementation, the image is captured from a first angle of view, and the extraction module is specifically configured to:
perform perspective transformation on the image to obtain a top-view image;
and extract the lane line from the top-view image.
In a possible implementation, the lane lines in the top view include a left lane line, a right lane line, and a dashed line located between the left lane line and the right lane line, where the dashed line includes at least two line segments and the driving track includes a left driving track and a right driving track. The determining module is specifically configured to:
take the midpoint of each first connecting line as a track point for fitting the left driving track, where a first connecting line connects the center of a line segment to the left lane line and is perpendicular to that line segment;
and take the midpoint of each second connecting line as a track point for fitting the right driving track, where a second connecting line connects the center of a line segment to the right lane line and is perpendicular to that line segment.
In a possible implementation manner, the determining module is specifically further configured to:
determining a bend-in point of the lane line;
and determining the bend-in point of the driving track according to the bend-in point of the lane line.
In a possible implementation manner, the determining module is specifically further configured to:
determine a grayscale image corresponding to the lane line in the top view;
and determine the bend-in point of the lane line according to the sum of the pixel values of each column of pixels in the grayscale image, where the direction of each column of pixels is consistent with the direction of the straight-line track.
In a possible implementation, the device for predicting a driving track further includes a control module, configured to:
determine, according to the driving track, the driving speed of the vehicle and the steering-gear angle with which the vehicle enters the curve.
A third aspect of the embodiments of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method for predicting a driving track according to the first aspect.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for predicting a driving track according to the first aspect.
A fifth aspect of the embodiments of the present application provides a computer program product which, when run on a terminal device, causes the terminal device to execute the method for predicting a driving track according to the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages: an image of the current road is acquired; a lane line is extracted from the image; track points for fitting a driving track and a bend-in point of the driving track are determined according to the lane line; the track points are divided into track points of a straight-line part and track points of a curved part according to the bend-in point; the two sets of track points are fitted separately to obtain a straight-line track and a curved track; and the two tracks are connected to obtain the driving track. Dividing the track points at the bend-in point and fitting the straight-line track and the curved track separately improves the accuracy of track fitting at the curve, and therefore the accuracy of the predicted driving track and the stability of the vehicle while driving, especially at curves.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below.
Fig. 1 is a schematic flow chart illustrating an implementation of a method for predicting a driving trajectory according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an image of a current road provided by an embodiment of the present application;
FIG. 3 is a gray scale image of a lane line provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of an image of a current road from a top view provided by an embodiment of the present application;
FIG. 5 is a schematic view of a lane line from a top view provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of trace points provided by embodiments of the present application;
FIG. 7 is a diagram illustrating a distribution of sums of pixel values provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a driving track obtained by fitting according to an embodiment of the present application;
fig. 9 is a schematic diagram of a device for predicting a travel track according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In existing unmanned-driving techniques, the driving track of the vehicle is generally fitted from the detected lane lines. A lane line includes a straight part and a curved part, and correspondingly the driving track also includes a straight-line track and a curved track. Because the polynomials describing the straight-line track and the curved track differ greatly, fitting both with a single polynomial makes the fitted driving track inaccurate at the curve, and the vehicle then cannot drive stably there.
Therefore, the present application provides a method and an apparatus for predicting a driving track, an electronic device, and a storage medium, which can improve the accuracy of the predicted driving track and thus the stability of the vehicle while driving.
The following is an exemplary description of a method for predicting a travel track provided in the present application.
The method for predicting the driving track is applied to an electronic device, which may be a mobile phone, a tablet computer, a vehicle-mounted terminal, or the like.
Referring to fig. 1, a method for predicting a driving trajectory according to an embodiment of the present application includes:
s101: and acquiring an image of the current road.
Specifically, a video of the current road captured by a camera mounted at the front of the vehicle is acquired, and the video is split into frames to obtain images of the current road. The current road may be a straight road or a curve, and the image of the current road is an RGB image.
S102: lane lines are extracted from the image.
In a possible implementation, lane lines may be extracted from the image using an edge-detection operator, which is computationally fast.
However, extracting lane lines with an edge-detection operator is not robust in some extreme environments, such as dark or backlit scenes. In another possible implementation, a neural-network-based image segmentation algorithm is used to extract the lane line from the image. The segmentation network is trained on samples that include pictures from a variety of shooting scenes, so it can adapt to different scenes and extract the lane line accurately in each of them.
For example, the lane lines are segmented from the image using the DeepLab V3+ segmentation algorithm. DeepLab V3+ offers high segmentation accuracy, is a mature algorithm, and can use backbones of different sizes depending on platform performance; it can also be trained for the target scene, improving robustness in each scene.
For example, fig. 2 is an image of a current road; segmenting the lane lines from it with DeepLab V3+ yields the lane lines shown in fig. 3. The image of the current road shown in fig. 2 may be a grayscale image or an RGB image.
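As a minimal illustration of the segmentation step, the sketch below stands in for the neural network with a simple intensity threshold, assuming lane paint is brighter than the road surface. The function name, the threshold, and the synthetic image are all illustrative; a real system would run a trained DeepLab V3+ model as described above.

```python
import numpy as np

def extract_lane_mask(gray, thresh=200):
    """Binary lane mask from a grayscale road image.

    A stand-in for the neural-network segmentation: lane paint is
    assumed brighter than the asphalt, so a fixed threshold separates it.
    """
    return (gray >= thresh).astype(np.uint8)

# Synthetic 6x8 "road": two bright vertical lane lines on a dark surface.
road = np.full((6, 8), 50, dtype=np.uint8)
road[:, 1] = 255  # left lane line
road[:, 6] = 255  # right lane line
mask = extract_lane_mask(road)
print(mask.sum())  # 12 lane pixels: two 6-pixel columns
```

The output mask plays the role of the segmented lane-line image in fig. 3.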
In a possible implementation, the image of the current road is captured from a first angle of view, so the extracted lane lines are lane lines at that first angle of view. If the first angle of view is not a top view, the distances between the lane lines differ at different positions in the image. To improve the accuracy of the subsequent track prediction, the image at the first angle of view shown in fig. 2 may be perspective-transformed into the top-view image shown in fig. 4, and the lane lines then extracted from the top-view image to obtain the top-view lane lines shown in fig. 5. Alternatively, the lane lines at the first angle of view may first be extracted from the image and then perspective-transformed into the top view. Either way, the distance between the lane lines stays constant along the road, which reduces the computation required when fitting the subsequent driving track and speeds up the calculation.
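The perspective transformation can be sketched with a direct linear transform that maps the trapezoid formed by the converging lane lines in the camera image to a rectangle in the top view. The corner coordinates below are hypothetical; in practice OpenCV's `cv2.getPerspectiveTransform` and `cv2.warpPerspective` would compute and apply the same matrix.

```python
import numpy as np

def homography(src, dst):
    """3x3 perspective transform mapping four src points to four dst
    points (direct linear transform with the bottom-right entry fixed
    to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the homography to one pixel coordinate."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical corners: the trapezoid the converging lane lines form in
# the first-angle-of-view image, mapped to a rectangle in the top view.
src = [(200, 720), (1080, 720), (760, 480), (520, 480)]
dst = [(300, 720), (980, 720), (980, 0), (300, 0)]
H = homography(src, dst)
```

After the transform, the lane lines run parallel in the warped image, so the distance between them is the same at every road position.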
S103: and determining track points for fitting a driving track and bending points of the driving track according to the lane lines.
The driving track is located between two lane lines. If the current road has two lane lines, there is one corresponding driving track; if it has three lane lines, there are two corresponding driving tracks; and so on.
In a possible implementation, the current road includes three lane lines: a left lane line, a right lane line, and a dashed line located between them, where the dashed line includes at least two line segments. If the lane line between the left and right lane lines is a solid line, it may be divided into several line segments to form a dashed line. As shown in fig. 6, for each line segment the center of the segment is determined and a perpendicular is drawn through it, intersecting the left and right lane lines. Of this perpendicular, the part connecting the segment center to the left lane line is the first connecting line, and the part connecting the segment center to the right lane line is the second connecting line. The midpoint of each first connecting line is taken as a track point for fitting the left driving track, and the midpoint of each second connecting line as a track point for fitting the right driving track. The track points therefore lie midway between the two lane lines, which improves the accuracy of the fitted driving track.
In another possible implementation, the current road includes two lane lines, and one of them is divided into several line segments to form a dashed line. A perpendicular is drawn through the center of each line segment and intersects the other lane line, forming a third connecting line; the midpoint of each third connecting line is a track point for fitting the driving track.
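The midpoint construction above can be sketched directly once the top view is oriented so that the straight lane lines run vertically: the perpendicular through a dash's center is then horizontal, and the two track points are simple midpoints. The coordinates below are illustrative only.

```python
def track_points(segment_centers, left_x, right_x):
    """Track points for the left and right driving tracks.

    Assumes the top view is oriented so the straight lane lines are
    vertical (x = left_x and x = right_x): the perpendicular through a
    dash's center is then horizontal, and the midpoints of its two
    halves (center-to-left-line and center-to-right-line) are the
    track points described in the text.
    """
    left_pts = [((cx + left_x) / 2.0, cy) for cx, cy in segment_centers]
    right_pts = [((cx + right_x) / 2.0, cy) for cx, cy in segment_centers]
    return left_pts, right_pts

# Illustrative dash centers on a straight road; lane lines at x=200 and x=600.
centers = [(400.0, 100.0), (400.0, 200.0), (400.0, 300.0)]
left_pts, right_pts = track_points(centers, left_x=200.0, right_x=600.0)
```

Each returned point lies midway between the dashed center line and one outer lane line, matching the construction in fig. 6.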
If the current road is straight, the corresponding driving track is a straight-line track; if the current road contains a curve, the corresponding driving track includes a straight-line track and a curved track. For a straight road the bend-in point is at infinity; for a curved road the bend-in point is the intersection of the straight-line track and the curved track on the driving track, and it may or may not coincide with a track point. In a possible implementation, every two adjacent track points are connected to form a line segment, and the bend-in point is determined from the angle between adjacent line segments; for example, the track point between the two line segments with the largest angle is taken as the bend-in point.
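The largest-angle rule can be sketched as follows; the sample points are illustrative, and "largest angle" is interpreted here as the largest turn between consecutive chord directions.

```python
import numpy as np

def bend_point_by_angle(points):
    """Index of the bend-in point: the track point where adjacent chord
    segments turn the most (largest angle between consecutive segments)."""
    pts = np.asarray(points, float)
    best_i, best_turn = None, -1.0
    for i in range(1, len(pts) - 1):
        v1 = pts[i] - pts[i - 1]
        v2 = pts[i + 1] - pts[i]
        cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        turn = np.arccos(np.clip(cosang, -1.0, 1.0))
        if turn > best_turn:
            best_turn, best_i = turn, i
    return best_i

# Straight run along y, then a turn toward +x starting at the fourth point.
pts = [(0, 0), (0, 1), (0, 2), (0, 3), (1, 4), (2, 5)]
idx = bend_point_by_angle(pts)
```

Here the direction changes at the fourth point (index 3), which is where the straight-line part ends and the curved part begins.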
In another possible implementation, the bend-in point of the lane line is determined first, and the bend-in point of the driving track is then determined from it: after the bend-in point of the lane line is found, the bend-in point of the lane line and the bend-in point of the driving track lie on one straight line, and that line is perpendicular to the straight part of the lane line.
In a possible implementation, the bend-in point of the lane line is determined by counting the distribution of pixels in the image containing the lane line. Specifically, a grayscale image corresponding to the lane line in the top view is determined; in this grayscale image the lane-line pixels differ from the background pixels. For example, in fig. 5 the background pixels are 0 to highlight the lane line. The bend-in point of the lane line is determined from the sum of the pixel values of each column of pixels in the grayscale image, where the direction of a column is consistent with the direction of the straight-line track, i.e., with the straight part of the lane line. To reduce the computation, the grayscale image is oriented so that the straight part of the lane line runs vertically; each column of pixels then also runs vertically. Taking the dashed line in fig. 5 as an example, a coordinate system is established with the vertical direction as the y axis and the horizontal direction as the x axis. The distribution of the column sums is shown in fig. 7: the abscissa is the x pixel coordinate, the ordinate is the sum of the pixel values in each column, and the bend-in point is where the sum changes abruptly. In fig. 7, the column sums between point A and point B are higher than those after point B, and the sum drops abruptly at point B; the abscissa of point B is therefore the abscissa of the bend-in point of the lane line, and hence of the bend-in point of the driving track, whose position on the image of the current road can then be determined from the track points. Determining the bend-in point from the sums of pixel values improves the accuracy of the determined bend-in point.
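The column-sum statistic can be sketched in a few lines. The synthetic mask and the sharpest-drop heuristic are illustrative; the text only specifies that the bend-in point is where the column sum changes abruptly.

```python
import numpy as np

def bend_column(gray_lane):
    """x coordinate of the bend-in point from column-wise pixel sums.

    The straight (vertical) part of the lane line piles many lane pixels
    into a few columns, while the curved part spreads them out, so the
    column sum drops abruptly where the curve begins; this returns the
    column with the sharpest drop (point B in fig. 7).
    """
    sums = gray_lane.astype(np.int64).sum(axis=0)
    drops = sums[:-1] - sums[1:]
    return int(np.argmax(drops))

# Synthetic mask: a vertical line in column 2 for rows 0-5, then the
# "curve" fans out one pixel per column across columns 3-6.
img = np.zeros((10, 8), dtype=np.uint8)
img[0:6, 2] = 255
for r, c in zip(range(6, 10), range(3, 7)):
    img[r, c] = 255
x_bend = bend_column(img)
```

Column 2 accumulates six lane pixels while each curve column holds only one, so the sum collapses right after column 2, which is where the bend begins.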
S104: and dividing the track points into track points of a straight line part and track points of a curved line part according to the bending point.
S105: and respectively fitting the track points of the linear part and the track points of the curved part to obtain a linear track and a curved track, and connecting the linear track and the curved track to obtain the running track.
Specifically, different polynomials are used to fit the straight-line track and the curved track, which improves the accuracy of both fitted tracks.
In a possible implementation, the straight-line track is obtained by fitting its track points with the least-squares method, and the curved track is obtained by fitting its track points with the Catmull-Rom curve-fitting algorithm, improving the accuracy of both tracks. For example, fitting the track points shown in fig. 6 yields the driving track shown in fig. 8.
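The two fits can be sketched as below, assuming the straight part runs roughly vertically so x is fitted as a function of y; the sample points are illustrative. A full implementation would evaluate the Catmull-Rom segment densely over t in [0, 1] and join the two pieces at the bend-in point.

```python
import numpy as np

def fit_straight(points):
    """Least-squares line x = a*y + b through the straight-part points
    (x as a function of y, since the track runs roughly vertically)."""
    pts = np.asarray(points, float)
    a, b = np.polyfit(pts[:, 1], pts[:, 0], 1)
    return a, b

def catmull_rom(p0, p1, p2, p3, t):
    """One point on the Catmull-Rom segment between p1 and p2, t in [0, 1].

    The spline interpolates its control points: t=0 gives p1, t=1 gives p2.
    """
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    return 0.5 * ((2 * p1) + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

# Straight-part points lie on the vertical line x = 0.
straight = [(0.0, 0.0), (0.0, 1.0), (0.0, 2.0)]
a, b = fit_straight(straight)
# Curve-part sample: at t=0 the segment starts exactly at p1 = (0, 3).
mid = catmull_rom((0, 2), (0, 3), (1, 4), (2, 5), 0.0)
```

Because Catmull-Rom passes through its interior control points, the curved track starts exactly at the last straight-part point, so the two fitted pieces connect without a gap.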
In a possible implementation, the fitted driving track is used in automatic driving or in an automatic-driving simulation: the driving speed of the vehicle and the steering-gear angle with which the vehicle enters the curve are determined from the driving track, so that the vehicle drives along the fitted track, keeps a stable track on straight sections, and stays closer to the lane line when turning.
In the above embodiment, an image of the current road is acquired; a lane line is extracted from the image; track points for fitting the driving track and the bend-in point of the driving track are determined according to the lane line; the track points are divided into track points of a straight-line part and track points of a curved part according to the bend-in point; the two sets of track points are fitted separately to obtain a straight-line track and a curved track; and the two tracks are connected to obtain the driving track. Dividing the track points at the bend-in point and fitting the straight-line track and the curved track separately improves the accuracy of track fitting at the curve, and therefore the accuracy of the predicted driving track and the stability of the vehicle while driving.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
For the method for predicting a driving track described in the above embodiments, fig. 9 shows a block diagram of the device for predicting a driving track provided in an embodiment of the present application; for ease of explanation, only the parts relevant to the embodiment are shown.
As shown in fig. 9, the device for predicting a driving track includes:
the acquisition module 10 is used for acquiring an image of a current road;
an extraction module 20, configured to extract a lane line from the image;
the determining module 30 is configured to determine track points for fitting a driving track and bend-in points of the driving track according to the lane lines;
the dividing module 40 is used for dividing the track points into track points of a straight-line part and track points of a curved part according to the bend-in points;
and the fitting module 50 is used for fitting the track points of the linear part and the track points of the curved part respectively to obtain a linear track and a curved track, and connecting the linear track and the curved track to obtain the running track.
In a possible implementation manner, the extraction module 20 is specifically configured to:
and extracting the lane lines from the image by adopting an image segmentation algorithm based on a neural network.
In a possible implementation manner, the image is captured at a first angle of view, and the extraction module 20 is specifically configured to:
carrying out a perspective transformation on the image to obtain a top-view image;
and extracting the lane line from the top-view image.
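The transformation from the first (camera) angle of view to the top view is a perspective (homography) warp. The sketch below shows the underlying direct linear transform for point correspondences; in practice OpenCV's `cv2.getPerspectiveTransform` and `cv2.warpPerspective` compute and apply the same matrix to whole images. The corner correspondences and function names here are illustrative assumptions, not part of the patent:

```python
import numpy as np

def solve_homography(src, dst):
    """Solve for the 3x3 homography H mapping src points to dst points.

    src, dst -- (4, 2) arrays of corresponding corners, e.g. the trapezoid a
    lane region occupies in the camera view and the rectangle it should
    occupy in the top view. Each correspondence gives two linear equations
    in the nine entries of H (direct linear transform).
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, point):
    """Apply homography H to a single (x, y) point (homogeneous divide)."""
    x, y, w = H @ np.array([point[0], point[1], 1.0])
    return x / w, y / w
```

For example, mapping the trapezoid that the road occupies in the camera image onto a rectangle yields the top-view image from which the lane lines are then extracted.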
In a possible implementation manner, the top-view lane lines include a left lane line, a right lane line, and a dashed line located between the left lane line and the right lane line; the dashed line includes at least two line segments, and the driving track includes a left driving track and a right driving track. The determining module 30 is specifically configured to:
taking the midpoint of each first connecting line as a track point for fitting the left driving track, wherein each first connecting line connects the center of a line segment to the left lane line and is perpendicular to the corresponding line segment;
and taking the midpoint of each second connecting line as a track point for fitting the right driving track, wherein each second connecting line connects the center of a line segment to the right lane line and is perpendicular to the corresponding line segment.
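On a straight top-view stretch the construction above simplifies considerably: if the dashed center line runs vertically and the lane lines sit at x = x_left and x = x_right, the perpendicular connecting lines are horizontal and each track point is a plain midpoint. A sketch under that simplifying assumption (the geometry and the function name `track_points` are ours; curved sections need the true perpendicular to each segment):

```python
import numpy as np

def track_points(seg_centers, x_left, x_right):
    """Derive left/right track points from dashed-line segment centers.

    seg_centers -- iterable of (x, y) centers of the dashed-line segments
    x_left, x_right -- x positions of the left and right lane lines
    Each track point is the midpoint of the horizontal connection from a
    segment center to the corresponding lane line.
    """
    seg_centers = np.asarray(seg_centers, dtype=float)
    left = np.column_stack([(seg_centers[:, 0] + x_left) / 2, seg_centers[:, 1]])
    right = np.column_stack([(seg_centers[:, 0] + x_right) / 2, seg_centers[:, 1]])
    return left, right
```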
In a possible implementation manner, the determining module 30 is specifically further configured to:
determining a bend-in point of the lane line;
and determining the bend-in point of the driving track according to the bend-in point of the lane line.
In a possible implementation manner, the determining module 30 is specifically further configured to:
determining the grayscale image corresponding to the top-view lane lines;
and determining the bend-in point of the lane line according to the sum of the pixel values of each row of pixels in the grayscale image, wherein the direction of each row of pixels is consistent with the direction of the straight-line track.
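The row-sum criterion can be sketched as follows. The patent only states that the bend-in point follows from the per-row pixel sums of the grayscale image; the baseline (median row sum), the threshold (twice the baseline), and the function name below are our illustrative choices:

```python
import numpy as np

def find_bend_row(gray):
    """Find the bend-in row of a top-view lane-line image from row pixel sums.

    gray -- 2-D grayscale (or binary) lane-line image.
    On a straight stretch each row crosses the lane lines the same number of
    times, so the row sums stay near a constant baseline; where the road
    bends, a row runs along a lane line and its sum jumps. Returns the first
    row whose sum deviates from the median by more than twice the baseline,
    or -1 if no such row exists.
    """
    row_sums = gray.sum(axis=1).astype(float)
    baseline = np.median(row_sums)
    jumps = np.nonzero(np.abs(row_sums - baseline) > 2 * baseline)[0]
    return int(jumps[0]) if jumps.size else -1
```

The returned row index can then serve as the bend-in point at which the track points are split into the straight and curved parts.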
In one possible implementation, the apparatus for predicting the driving track further includes a control module, configured to:
determining the driving speed of the vehicle and the steering servo angle for entering the curve according to the driving track.
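The patent does not specify how the control module maps the driving track to a speed and a steering servo angle. Pure pursuit is one common way to obtain a steering angle toward a look-ahead point on a fitted trajectory; the sketch below, with its vehicle-frame conventions and function name, is an assumed illustration rather than the patented method:

```python
import math

def pure_pursuit_steering(wheelbase, lookahead, target):
    """Steering angle toward a point on the fitted track (pure pursuit).

    wheelbase -- distance between the front and rear axles
    lookahead -- distance to the target point on the track
    target    -- (x, y) of the track point in the vehicle frame,
                 x forward, y to the left
    """
    alpha = math.atan2(target[1], target[0])  # heading error to the target
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)
```

With the target point taken from the fitted curved track near the bend-in point, the resulting angle can drive the steering servo into the curve.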
It should be noted that the information interaction and execution processes between the above devices/units, and their specific functions and technical effects, are based on the same concept as the method embodiments of the present application; for details, reference may be made to the method embodiment section, which is not repeated here.
Fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 10, the electronic device of this embodiment includes a processor 11, a memory 12, and a computer program 13 stored in the memory 12 and executable on the processor 11. When executing the computer program 13, the processor 11 implements the steps of the method embodiments described above, such as steps S101 to S105 shown in fig. 1; alternatively, it implements the functions of the modules/units in the above apparatus embodiments, such as the obtaining module 10 through the fitting module 50 shown in fig. 9.
Illustratively, the computer program 13 may be divided into one or more modules/units, which are stored in the memory 12 and executed by the processor 11 to carry out the present application. The one or more modules/units may be a series of computer-program instruction segments capable of performing specific functions, used to describe the execution process of the computer program 13 in the electronic device.
Those skilled in the art will appreciate that fig. 10 is merely an example of an electronic device and is not limiting; the electronic device may include more or fewer components than shown, may combine certain components, or may use different components. For example, it may also include input/output devices, network access devices, buses, and the like.
The processor 11 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 12 may be an internal storage unit of the electronic device, such as a hard disk or memory of the electronic device. It may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the electronic device. Further, the memory 12 may include both an internal storage unit and an external storage device of the electronic device. The memory 12 is used to store the computer program as well as other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the method embodiments described above may be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of those embodiments. The computer program comprises computer program code, which may be in source-code, object-code, executable-file, or some intermediate form. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method for predicting a travel track, comprising:
acquiring an image of a current road;
extracting a lane line from the image;
determining track points for fitting a driving track and bend-in points of the driving track according to the lane lines;
dividing the track points into track points of a straight-line part and track points of a curved part according to the bend-in points;
and respectively fitting the track points of the linear part and the track points of the curved part to obtain a linear track and a curved track, and connecting the linear track and the curved track to obtain the running track.
2. The method according to claim 1, wherein the extracting a lane line from the image includes:
and extracting the lane lines from the image by adopting an image segmentation algorithm based on a neural network.
3. The method according to claim 1, wherein the image is captured at a first angle of view, and the extracting a lane line from the image includes:
carrying out a perspective transformation on the image to obtain a top-view image;
and extracting the lane line from the top-view image.
4. The method according to claim 3, wherein the top-view lane lines include a left lane line, a right lane line, and a dashed line located between the left lane line and the right lane line, the dashed line includes at least two line segments, the travel track includes a left travel track and a right travel track, and determining the track points for fitting the travel track according to the lane lines includes:
taking the midpoint of each first connecting line as a track point for fitting the left travel track, wherein each first connecting line connects the center of a line segment to the left lane line and is perpendicular to the corresponding line segment;
and taking the midpoint of each second connecting line as a track point for fitting the right travel track, wherein each second connecting line connects the center of a line segment to the right lane line and is perpendicular to the corresponding line segment.
5. The method for predicting a travel track according to claim 3, wherein the determining a bend-in point of the travel track according to the lane line includes:
determining a bend-in point of the lane line;
and determining the bend-in point of the driving track according to the bend-in point of the lane line.
6. The method for predicting a travel track according to claim 5, wherein the determining a bend-in point of the lane line includes:
determining the grayscale image corresponding to the top-view lane lines;
and determining the bend-in point of the lane line according to the sum of the pixel values of each row of pixels in the grayscale image, wherein the direction of each row of pixels is consistent with the direction of the straight-line track.
7. The method for predicting a travel track according to claim 1, wherein after the connecting the straight-line track and the curved track to obtain the travel track, the method further comprises:
determining the driving speed of the vehicle and the steering servo angle for entering the curve according to the travel track.
8. A travel track prediction apparatus comprising:
the acquisition module is used for acquiring an image of a current road;
the extraction module is used for extracting the lane line from the image;
the determining module is used for determining track points used for fitting a driving track and bend-in points of the driving track according to the lane lines;
the dividing module is used for dividing the track points into track points of a straight-line part and track points of a curved part according to the bend-in points;
and the fitting module is used for fitting the track points of the linear part and the track points of the curved part respectively to obtain a linear track and a curved track, and connecting the linear track and the curved track to obtain the running track.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202011403570.7A 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium Active CN112528807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011403570.7A CN112528807B (en) 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112528807A true CN112528807A (en) 2021-03-19
CN112528807B CN112528807B (en) 2023-12-19

Family

ID=74997471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011403570.7A Active CN112528807B (en) 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112528807B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113587942A (en) * 2021-06-29 2021-11-02 深圳一清创新科技有限公司 Route processing method and device based on autonomous map building and electronic equipment
CN113781603A (en) * 2021-09-15 2021-12-10 北京有竹居网络技术有限公司 Method and device for generating track points, computer equipment and computer storage medium
CN114184206A (en) * 2021-12-03 2022-03-15 北京车慧达科技有限公司 Method and device for generating driving route based on vehicle track points
CN114419877A (en) * 2021-12-15 2022-04-29 中国科学院深圳先进技术研究院 Vehicle track prediction data processing method and device based on road characteristics

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104819724A (en) * 2015-03-02 2015-08-05 北京理工大学 Unmanned ground vehicle self-driving assisting system based on GIS
CN108242145A (en) * 2016-12-26 2018-07-03 高德软件有限公司 Abnormal track point detecting method and device
WO2020029667A1 (en) * 2018-08-09 2020-02-13 Zhejiang Dahua Technology Co., Ltd. Methods and systems for lane line identification
CN110838233A (en) * 2019-10-12 2020-02-25 中国平安人寿保险股份有限公司 Vehicle behavior analysis method and device and computer readable storage medium
CN111380543A (en) * 2018-12-29 2020-07-07 沈阳美行科技有限公司 Map data generation method and device
CN111797780A (en) * 2020-07-08 2020-10-20 中国第一汽车股份有限公司 Vehicle following track planning method, device, server and storage medium


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113587942A (en) * 2021-06-29 2021-11-02 深圳一清创新科技有限公司 Route processing method and device based on autonomous map building and electronic equipment
CN113781603A (en) * 2021-09-15 2021-12-10 北京有竹居网络技术有限公司 Method and device for generating track points, computer equipment and computer storage medium
CN113781603B (en) * 2021-09-15 2023-08-22 北京有竹居网络技术有限公司 Track point generating method, device, computer equipment and computer storage medium
CN114184206A (en) * 2021-12-03 2022-03-15 北京车慧达科技有限公司 Method and device for generating driving route based on vehicle track points
CN114184206B (en) * 2021-12-03 2024-04-19 北京车慧达科技有限公司 Method and device for generating driving route based on vehicle track points
CN114419877A (en) * 2021-12-15 2022-04-29 中国科学院深圳先进技术研究院 Vehicle track prediction data processing method and device based on road characteristics
CN114419877B (en) * 2021-12-15 2022-11-15 中国科学院深圳先进技术研究院 Vehicle track prediction data processing method and device based on road characteristics

Also Published As

Publication number Publication date
CN112528807B (en) 2023-12-19

Similar Documents

Publication Publication Date Title
CN112528807B (en) Method and device for predicting running track, electronic equipment and storage medium
CN112528878B (en) Method and device for detecting lane line, terminal equipment and readable storage medium
JP3522317B2 (en) Travel guide device for vehicles
CN103093181B (en) A kind of method and apparatus of license plate image location
CN110929655B (en) Lane line identification method in driving process, terminal device and storage medium
CN110008891B (en) Pedestrian detection positioning method and device, vehicle-mounted computing equipment and storage medium
CN112967283A (en) Target identification method, system, equipment and storage medium based on binocular camera
CN112819864B (en) Driving state detection method and device and storage medium
CN111881832A (en) Lane target detection method, device, equipment and computer readable storage medium
CN111160132B (en) Method and device for determining lane where obstacle is located, electronic equipment and storage medium
CN113297939B (en) Obstacle detection method, obstacle detection system, terminal device and storage medium
CN113902047B (en) Image element matching method, device, equipment and storage medium
CN114037977B (en) Road vanishing point detection method, device, equipment and storage medium
Wennan et al. Lane detection in some complex conditions
CN114066930A (en) Planar target tracking method and device, terminal equipment and storage medium
CN112184605A (en) Method, equipment and system for enhancing vehicle driving visual field
CN115601435B (en) Vehicle attitude detection method, device, vehicle and storage medium
CN113688653A (en) Road center line recognition device and method and electronic equipment
CN114581890B (en) Method and device for determining lane line, electronic equipment and storage medium
CN114898325B (en) Vehicle dangerous lane change detection method and device and electronic equipment
CN112434591B (en) Lane line determination method and device
EP4224361A1 (en) Lane line detection method and apparatus
CN116168325A (en) Vehicle lane change detection method, device, electronic equipment and readable storage medium
CN116246459A (en) Method, device, terminal equipment and storage medium for detecting lane where vehicle is located
CN115909265A (en) Lane-level positioning detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant