CN112528807B - Method and device for predicting a driving trajectory, electronic device, and storage medium - Google Patents

Method and device for predicting a driving trajectory, electronic device, and storage medium

Info

Publication number
CN112528807B
CN112528807B
Authority
CN
China
Prior art keywords
track
line
point
image
lane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011403570.7A
Other languages
Chinese (zh)
Other versions
CN112528807A (en)
Inventor
李奕润
蔡永辉
刘业鹏
程骏
庞建新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202011403570.7A priority Critical patent/CN112528807B/en
Publication of CN112528807A publication Critical patent/CN112528807A/en
Application granted granted Critical
Publication of CN112528807B publication Critical patent/CN112528807B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to the technical field of autonomous driving and provides a method and device for predicting a driving trajectory, an electronic device, and a storage medium. The prediction method comprises: acquiring an image of the current road; extracting lane lines from the image; determining, according to the lane lines, track points for fitting the driving trajectory and the curve entry point of the trajectory; dividing the track points into track points of a straight portion and track points of a curved portion according to the curve entry point; and fitting the two sets of track points separately to obtain a straight trajectory and a curved trajectory, which are then connected to form the driving trajectory. Fitting the straight and curved portions separately improves the accuracy of the fit at the curve, and hence the accuracy of the predicted driving trajectory and the stability of the vehicle while driving.

Description

Method and device for predicting a driving trajectory, electronic device, and storage medium
Technical Field
The application belongs to the technical field of autonomous driving, and in particular relates to a method and device for predicting a driving trajectory, an electronic device, and a storage medium.
Background
With the continuous development of artificial intelligence in recent years, autonomous driving technology has been receiving more and more attention. Compared with traditional human-driven vehicles, autonomous vehicles can greatly reduce the number of traffic accidents and the amount of traffic congestion, improve fuel efficiency, reduce greenhouse gas emissions, and save drivers a great deal of time.
In autonomous driving, predicting the vehicle's driving trajectory is a key technique for ensuring that the vehicle drives stably along the lane lines. Existing autonomous driving technology cannot accurately predict the driving trajectory, so the vehicle cannot drive stably.
Disclosure of Invention
In view of this, the embodiments of the present application provide a method, an apparatus, an electronic device, and a storage medium for predicting a driving trajectory, which can improve the accuracy of the predicted trajectory and the stability of the vehicle while driving.
A first aspect of the embodiments of the present application provides a method for predicting a driving trajectory, including:
acquiring an image of the current road;
extracting lane lines from the image;
determining, according to the lane lines, track points for fitting a driving trajectory and a curve entry point of the driving trajectory;
dividing the track points into track points of a straight portion and track points of a curved portion according to the curve entry point;
and fitting the track points of the straight portion and the track points of the curved portion separately to obtain a straight trajectory and a curved trajectory, and connecting the straight trajectory and the curved trajectory to obtain the driving trajectory.
In one possible implementation, the extracting lane lines from the image includes:
extracting lane lines from the image using a neural-network-based image segmentation algorithm.
In one possible implementation, the image is captured from a first viewing angle, and the extracting lane lines from the image includes:
performing a perspective transformation on the image to obtain a top-down-view image;
and extracting lane lines from the top-down-view image.
In one possible implementation, the lane lines in the top-down view include a left lane line, a right lane line, and a dashed line between them, the dashed line including at least two segments; the driving trajectory includes a left trajectory and a right trajectory; and the determining, according to the lane lines, track points for fitting the driving trajectory includes:
taking the midpoint of each first connecting line as a track point for fitting the left trajectory, where a first connecting line runs from the center of a segment to the left lane line and is perpendicular to that segment;
and taking the midpoint of each second connecting line as a track point for fitting the right trajectory, where a second connecting line runs from the center of a segment to the right lane line and is perpendicular to that segment.
In one possible implementation, the determining the curve entry point of the driving trajectory according to the lane lines includes:
determining a curve entry point of a lane line;
and determining the curve entry point of the driving trajectory according to the curve entry point of the lane line.
In one possible implementation, the determining the curve entry point of the lane line includes:
determining a grayscale image corresponding to a lane line in the top-down view;
and determining the curve entry point of the lane line according to the sum of pixel values of each column of pixels in the grayscale image, where the direction of the pixel columns is aligned with the direction of the straight trajectory.
In one possible implementation, after the connecting the straight trajectory and the curved trajectory to obtain the driving trajectory, the method further includes:
determining, according to the driving trajectory, the driving speed of the vehicle and the steering servo angle with which the vehicle enters the curve.
A second aspect of the embodiments of the present application provides an apparatus for predicting a driving trajectory, including:
an acquisition module, configured to acquire an image of the current road;
an extraction module, configured to extract lane lines from the image;
a determining module, configured to determine, according to the lane lines, track points for fitting a driving trajectory and a curve entry point of the driving trajectory;
a dividing module, configured to divide the track points into track points of a straight portion and track points of a curved portion according to the curve entry point;
and a fitting module, configured to fit the track points of the straight portion and the track points of the curved portion separately to obtain a straight trajectory and a curved trajectory, and to connect the straight trajectory and the curved trajectory to obtain the driving trajectory.
In one possible implementation, the extraction module is specifically configured to:
extract lane lines from the image using a neural-network-based image segmentation algorithm.
In one possible implementation, the image is captured from a first viewing angle, and the extraction module is specifically configured to:
perform a perspective transformation on the image to obtain a top-down-view image;
and extract lane lines from the top-down-view image.
In one possible implementation, the lane lines in the top-down view include a left lane line, a right lane line, and a dashed line between them, the dashed line including at least two segments; the driving trajectory includes a left trajectory and a right trajectory; and the determining module is specifically configured to:
take the midpoint of each first connecting line as a track point for fitting the left trajectory, where a first connecting line runs from the center of a segment to the left lane line and is perpendicular to that segment;
and take the midpoint of each second connecting line as a track point for fitting the right trajectory, where a second connecting line runs from the center of a segment to the right lane line and is perpendicular to that segment.
In one possible implementation, the determining module is further configured to:
determine a curve entry point of a lane line;
and determine the curve entry point of the driving trajectory according to the curve entry point of the lane line.
In one possible implementation, the determining module is further configured to:
determine a grayscale image corresponding to a lane line in the top-down view;
and determine the curve entry point of the lane line according to the sum of pixel values of each column of pixels in the grayscale image, where the direction of the pixel columns is aligned with the direction of the straight trajectory.
In one possible implementation, the apparatus further includes a control module configured to:
determine, according to the driving trajectory, the driving speed of the vehicle and the steering servo angle with which the vehicle enters the curve.
A third aspect of the embodiments of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method for predicting a driving trajectory according to the first aspect.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for predicting a driving trajectory according to the first aspect.
A fifth aspect of the embodiments of the present application provides a computer program product which, when run on a terminal device, causes the terminal device to perform the method for predicting a driving trajectory according to the first aspect.
Compared with the prior art, the embodiments of the present application have the following beneficial effects. An image of the current road is acquired; lane lines are extracted from the image; track points for fitting a driving trajectory and the curve entry point of the trajectory are determined according to the lane lines; the track points are divided into straight-portion track points and curved-portion track points at the curve entry point; and the two sets of track points are fitted separately to obtain a straight trajectory and a curved trajectory, which are connected to obtain the driving trajectory. By splitting the track points at the curve entry point and fitting the straight and curved trajectories separately, the accuracy of the fit at the curve is improved, and hence the accuracy of the predicted trajectory and the stability of the vehicle while driving, particularly at curves.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the accompanying drawings required in the description of the embodiments or the prior art are briefly introduced below.
Fig. 1 is a schematic flowchart of a method for predicting a driving trajectory according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an image of a current road according to an embodiment of the present application;
Fig. 3 is a grayscale image of lane lines according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a top-down-view image of a current road according to an embodiment of the present application;
Fig. 5 is a schematic diagram of lane lines in a top-down view according to an embodiment of the present application;
Fig. 6 is a schematic diagram of track points according to an embodiment of the present application;
Fig. 7 is a schematic diagram of the distribution of column pixel-value sums according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a fitted driving trajectory according to an embodiment of the present application;
Fig. 9 is a schematic diagram of an apparatus for predicting a driving trajectory according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The technical solutions of the present application are described below by way of specific examples.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
In existing autonomous driving technology, the vehicle's driving trajectory is generally fitted from the detected lane lines. A lane line comprises a straight portion and a curved portion, so the corresponding trajectory likewise comprises a straight trajectory and a curved trajectory. Because the polynomials that describe the straight and curved trajectories differ greatly, fitting both with a single polynomial yields an inaccurate trajectory at the curve, and the vehicle cannot drive stably there.
The method, apparatus, electronic device, and storage medium for predicting a driving trajectory provided by the present application can therefore improve the accuracy of the predicted trajectory and, in turn, the stability of the vehicle while driving.
The following describes, by way of example, the method for predicting a driving trajectory provided by the present application.
The method is applied to an electronic device, which may be a mobile phone, a tablet computer, a vehicle-mounted terminal, or the like.
Referring to fig. 1, the method for predicting a driving trajectory according to an embodiment of the present application includes:
s101: and acquiring an image of the current road.
Specifically, a video of a current road sent by a camera is obtained, the video is subjected to framing processing, an image of the current road is obtained, and the camera is arranged at the front end of a vehicle. The current road may be a straight road or a curved road, and the image of the current road is an RGB image.
S102: and extracting lane lines from the image.
In one possible implementation, lane lines may be extracted from the image based on edge detection operators, with a higher computational speed.
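As a minimal sketch of this edge-detection approach (not the patent's actual operator), a gradient-magnitude threshold picks out the high-contrast boundaries of a lane marking; the function name, threshold, and test image are invented for illustration, and a real pipeline would typically use an operator such as Canny:

```python
import numpy as np

def edge_mask(gray, thresh=50.0):
    """Binary edge mask from a simple central-difference gradient magnitude.

    Illustrative stand-in for an edge-detection operator such as Sobel or
    Canny; real pipelines would typically use cv2.Canny or similar.
    """
    g = gray.astype(np.float64)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]   # horizontal central difference
    gy[1:-1, :] = g[2:, :] - g[:-2, :]   # vertical central difference
    return np.hypot(gx, gy) > thresh

# A bright vertical stripe (a crude lane line) produces edges at its borders.
img = np.zeros((10, 10))
img[:, 3:7] = 255.0
mask = edge_mask(img)
```

The mask is True along the two sides of the stripe and False in its uniform interior, which is exactly the behavior that makes edge operators fast but fragile in dim or backlit scenes.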
Edge-detection-based extraction, however, is less robust in some extreme environments, such as dim or backlit scenes. In another possible implementation, the lane lines are extracted with a neural-network-based image segmentation algorithm. The segmentation network is trained on samples that include images from a variety of shooting scenes, so that it adapts to different scenes and can extract the lane lines accurately in each of them.
For example, the DeepLabV3+ segmentation algorithm may be used to segment the lane lines from the image. DeepLabV3+ offers high segmentation accuracy, is a mature algorithm, and can use backbones of different sizes depending on the performance of the target platform; it can be trained on the corresponding scenes to improve its robustness in each of them.
For example, fig. 2 shows an image of the current road; segmenting it with DeepLabV3+ yields the lane lines shown in fig. 3. The image of the current road shown in fig. 2 may be a grayscale image or an RGB image.
In one possible implementation, the image of the current road is captured from a first viewing angle, so the extracted lane lines are lane lines at that angle; if the first angle is not a top-down view, the spacing between the lane lines varies with the viewing angle. To improve the accuracy of the subsequent trajectory prediction, the first-view image shown in fig. 2 may be perspective-transformed into the top-down-view image shown in fig. 4, and the lane lines shown in fig. 5 are then extracted from it. Alternatively, the lane lines may be extracted from the first-view image and then perspective-transformed into the top-down view. Either way, the spacing between the lane lines remains constant at any position, which reduces the computational difficulty and increases the speed of the subsequent trajectory fitting.
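The view-angle transformation above amounts to applying a 3x3 homography to image points. The sketch below assumes the homography H is already known; in practice it would come from a calibration step such as cv2.getPerspectiveTransform on four known road-plane correspondences, and the matrix used here is invented for illustration:

```python
import numpy as np

def to_top_down(points, H):
    """Map (x, y) image points through homography H into the top-down view."""
    pts = np.asarray(points, dtype=np.float64)
    homo = np.hstack([pts, np.ones((pts.shape[0], 1))])  # homogeneous coords
    mapped = homo @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                # perspective divide

# A projective H whose bottom row depends on y rescales points with depth,
# mimicking how a forward-facing view is flattened into a top-down view.
H = np.array([[1.0, 0.0,  0.0],
              [0.0, 1.0,  0.0],
              [0.0, 0.01, 1.0]])
pts = to_top_down([(10.0, 50.0), (10.0, 100.0)], H)
```

Two points that share an x-coordinate in the forward view map to different x-coordinates here, which is exactly the perspective distortion the top-down transform removes when the true calibration H is used.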
S103: and determining track points for fitting the running track and the bending-in points of the running track according to the lane lines.
The driving track is positioned between the two lane lines, the driving track is also positioned between the two lane lines, if the current road has two lane lines, the corresponding driving track is one, if the current road has three lane lines, the corresponding driving track has two, and so on.
In one possible implementation, the current road has three lane lines: a left lane line, a right lane line, and a dashed line between them consisting of at least two segments. If the middle lane line is a solid line, it may be divided into several segments to form an equivalent dashed line. As shown in fig. 6, for each segment, its center is determined and a perpendicular is drawn through it; the perpendicular intersects the left and right lane lines. The portion between the segment center and the left lane line is a first connecting line, and the portion between the segment center and the right lane line is a second connecting line. The midpoint of each first connecting line is taken as a track point for fitting the left trajectory, and the midpoint of each second connecting line as a track point for fitting the right trajectory. The track points therefore lie midway between two lane lines, which improves the accuracy of the fitted trajectory.
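In the simplest top-down configuration, where the lane lines and dashed-line segments run vertically, each perpendicular is horizontal and the midpoints reduce to averages of x-coordinates. The helper below and its coordinates are invented to illustrate that special case, not the patent's general construction:

```python
import numpy as np

def track_points(dash_centers, x_left, x_right):
    """Left/right track points for a top-down view with vertical lane lines.

    For a vertical dashed segment centered at (x, y), the perpendicular is
    horizontal, so the first connecting line ends at (x_left, y) and the
    second at (x_right, y); each track point is that line's midpoint.
    """
    centers = np.asarray(dash_centers, dtype=np.float64)
    left = centers.copy()
    left[:, 0] = (centers[:, 0] + x_left) / 2.0   # midpoint toward left lane
    right = centers.copy()
    right[:, 0] = (centers[:, 0] + x_right) / 2.0  # midpoint toward right lane
    return left, right

# Dashed center line at x = 2 between lane lines at x = 0 and x = 4.
left_pts, right_pts = track_points([(2.0, 0.0), (2.0, 10.0)], 0.0, 4.0)
```

Each track point lands halfway between the dashed line and its lane line, i.e. in the middle of the corresponding lane.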
In another possible implementation, the current road has two lane lines, one of which is divided into several segments to form a dashed line. A perpendicular is drawn through the center of each segment and intersects the other lane line, forming a third connecting line; the midpoint of each third connecting line is a track point for fitting the driving trajectory.
If the current road is straight, the corresponding trajectory is a straight trajectory; if it is curved, the trajectory comprises a straight part and a curved part. On a straight road the curve entry point is at infinity; on a curved road it is the intersection of the straight and curved parts of the trajectory, and it may or may not coincide with a track point. In one possible implementation, every two adjacent track points are connected to form a chord, and the curve entry point is determined from the angle between adjacent chords; for example, the track point at which the direction change between its two adjacent chords is largest is taken as the curve entry point.
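The chord-angle criterion can be sketched as follows, interpreting it as the largest change of direction between adjacent chords; the helper is an assumption for illustration, not the patent's exact computation:

```python
import numpy as np

def entry_point_index(points):
    """Index of the track point with the largest direction change between
    its two adjacent chords, taken as the curve entry point candidate."""
    pts = np.asarray(points, dtype=np.float64)
    chords = np.diff(pts, axis=0)                 # vectors between neighbours
    dots = (chords[:-1] * chords[1:]).sum(axis=1)
    norms = (np.linalg.norm(chords[:-1], axis=1)
             * np.linalg.norm(chords[1:], axis=1))
    turn = np.arccos(np.clip(dots / norms, -1.0, 1.0))  # turn angle per point
    return int(np.argmax(turn)) + 1               # +1: angles sit at interior points

# A straight run that bends to the right after the third point.
idx = entry_point_index([(0, 0), (0, 1), (0, 2), (1, 3), (2, 4)])
```

On a perfectly straight road every turn angle is zero, which matches the description of the entry point receding to infinity.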
In another possible implementation, the curve entry point of a lane line is determined first, and the curve entry point of the driving trajectory is then derived from it: the two points lie on a straight line that is perpendicular to the straight portion of the lane line.
In one possible implementation, the curve entry point of the lane line is determined by analyzing the distribution of pixels in the image containing the lane line. Specifically, a grayscale image corresponding to a lane line in the top-down view is determined, in which the lane-line pixels differ from the background pixels; for example, in fig. 5 the background pixels are 0 so that the lane line stands out. The curve entry point of the lane line is then determined from the sum of pixel values in each column of pixels, where the pixel columns run in the direction of the straight trajectory, i.e., along the straight portion of the lane line. To reduce the computational difficulty, the grayscale image is oriented so that the straight portion of the lane line is vertical, and each pixel column is accordingly vertical. Taking the dashed line in fig. 5 as an example, a coordinate system is established with the vertical direction as the y-axis and the horizontal direction as the x-axis. As shown in fig. 7, the abscissa is the x pixel coordinate and the ordinate is the sum of pixel values in that column; the curve entry point is where the sum changes abruptly. In fig. 7, the sums between point A and point B are higher than those after point B, and the sum drops sharply at point B; the abscissa of point B is therefore the abscissa of the lane line's curve entry point, and likewise of the trajectory's curve entry point, whose position on the current road image can then be determined from the positions of the track points. Determining the curve entry point from the column sums of pixel values improves its accuracy.
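The column-sum criterion of figs. 5 and 7 can be sketched on a synthetic grayscale array; the drop threshold and the test image are invented parameters, not values from the patent:

```python
import numpy as np

def entry_column(gray, drop_ratio=0.5):
    """x-coordinate where the per-column pixel-value sum drops abruptly.

    The straight, vertical part of the lane line concentrates its pixels in
    a few columns (large sums); past the curve entry point the pixels spread
    across many columns (small sums). `drop_ratio` is illustrative.
    """
    sums = gray.astype(np.float64).sum(axis=0)    # sum of each pixel column
    peak_x = int(np.argmax(sums))
    for x in range(peak_x + 1, sums.size):
        if sums[x] < drop_ratio * sums[peak_x]:   # abrupt drop: entry point
            return x
    return sums.size - 1

# Vertical line in column 5 for 15 rows, then a 45-degree bend to the right.
img = np.zeros((20, 20))
img[:15, 5] = 255.0
for i in range(5):
    img[15 + i, 6 + i] = 255.0
x_entry = entry_column(img)
```

The straight portion gives column 5 a sum of 15 x 255, while each column of the bend holds a single pixel, so the drop is detected immediately after the straight run.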
S104: and dividing the track point into a track point of a straight line part and a track point of a curve part according to the bending-in point.
S105: and fitting the track points of the linear part and the track points of the curve part respectively to obtain a linear track and a curve track, and connecting the linear track and the curve track to obtain the running track.
Specifically, the fitting of the linear track and the curve track adopts different polynomials, so that the accuracy of the fitted linear track and curve track can be improved.
In one possible implementation, the straight trajectory is obtained from its track points by a least-squares fit, and the curved trajectory is fitted to its track points with a Catmull-Rom spline, improving the accuracy of both trajectories. For example, fitting the track points shown in fig. 6 yields the driving trajectory shown in fig. 8.
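The two fits can be sketched as follows: an ordinary least-squares line for the straight portion and one segment of the standard uniform Catmull-Rom spline for the curved portion (which interpolates from p1 to p2, with p0 and p3 shaping the tangents); the sample coordinates are invented:

```python
import numpy as np

def fit_straight(points):
    """Least-squares line y = a*x + b through the straight-portion points."""
    pts = np.asarray(points, dtype=np.float64)
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    return a, b

def catmull_rom(p0, p1, p2, p3, t):
    """Point at parameter t in [0, 1] on the uniform Catmull-Rom segment
    between p1 and p2."""
    p0, p1, p2, p3 = (np.asarray(p, dtype=np.float64) for p in (p0, p1, p2, p3))
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

# Straight-portion points lying on y = 0.9*x + 0.1, then one curve segment.
a, b = fit_straight([(0.0, 0.1), (1.0, 1.0), (2.0, 1.9)])
mid = catmull_rom((0, 0), (1, 0), (2, 1), (3, 3), 0.5)
```

Because the spline passes exactly through p1 and p2, the fitted curved trajectory meets its neighboring points, which makes joining it to the straight trajectory at the curve entry point straightforward.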
In one possible implementation, the fitted trajectory is used in autonomous driving or in an autonomous-driving simulation: the driving speed of the vehicle and the steering servo angle for entering the curve are determined from the trajectory, so that the vehicle follows the fitted trajectory, remaining stable on the straight portion and staying closer to the lane line when turning.
In the above embodiment, an image of the current road is acquired; lane lines are extracted from it; track points for fitting the driving trajectory and the curve entry point are determined from the lane lines; the track points are divided at the curve entry point into straight-portion and curved-portion track points; and the two sets are fitted separately into a straight trajectory and a curved trajectory, which are connected to form the driving trajectory. Splitting the track points at the curve entry point and fitting the straight and curved trajectories separately improves the accuracy of the fit at the curve, and hence the accuracy of the predicted trajectory and the stability of the vehicle while driving.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present application.
With respect to the method for predicting a travel track described in the above embodiments, fig. 9 shows a block diagram of a configuration of a device for predicting a travel track provided in an embodiment of the present application, and for convenience of explanation, only a portion related to the embodiment of the present application is shown.
As shown in fig. 9, the device for predicting a travel track includes:
an acquisition module 10 for acquiring an image of a current road;
an extraction module 20 for extracting lane lines from the image;
a determining module 30, configured to determine, according to the lane lines, track points for fitting a travel track and a bend entry point of the travel track;
a dividing module 40, configured to divide the track points into track points of a straight-line portion and track points of a curve portion according to the bend entry point;
and a fitting module 50, configured to fit the track points of the straight-line portion and the track points of the curve portion separately to obtain a straight-line track and a curve track, and to connect the straight-line track and the curve track to obtain the travel track.
In one possible implementation, the extracting module 20 is specifically configured to:
and extracting the lane lines from the image by using a neural-network-based image segmentation algorithm.
In one possible implementation, the shooting angle of view of the image is a first angle of view, and the extraction module 20 is specifically configured to:
performing a perspective transformation on the image to obtain an image from an overhead view;
and extracting the lane lines from the overhead-view image.
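The transformation from the first angle of view to the overhead view is typically a planar homography estimated from four point correspondences; in practice one would use OpenCV's `cv2.getPerspectiveTransform` and `cv2.warpPerspective`. The plain-NumPy sketch below, with its hypothetical helper names, only illustrates the underlying direct linear solve (with the bottom-right homography entry fixed to 1).

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping 4 src points onto 4 dst points.

    Direct linear method with h33 fixed to 1; src/dst are sequences of
    four (x, y) pairs in general position.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply homography H to a single (x, y) point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])
```

For the overhead warp, the four source points would be the image corners of a road trapezoid and the four destination points the corners of the rectangle it should occupy in the top-down view; these correspondences are calibration inputs, not something the patent specifies.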
In one possible implementation, the lane lines in the overhead view include a left lane line, a right lane line, and a broken (dashed) line located between the left lane line and the right lane line; the broken line includes at least two line segments, and the travel track includes a left-side travel track and a right-side travel track. The determining module 30 is specifically configured to:
taking the midpoint of each first connecting line as a track point for fitting the left-side travel track, where each first connecting line connects the center of one line segment to the left lane line and is perpendicular to the corresponding line segment;
and taking the midpoint of each second connecting line as a track point for fitting the right-side travel track, where each second connecting line connects the center of one line segment to the right lane line and is perpendicular to the corresponding line segment.
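The midpoint construction for a single track point can be sketched as follows. The helper name, the representation of a dash segment by its center and direction vector, and the representation of the lane line by two points are all assumptions of this illustration, not the patent's.

```python
import numpy as np

def track_point(seg_center, seg_dir, lane_p0, lane_p1):
    """Midpoint of the connecting line from a dash-segment center to a lane line.

    The connecting line starts at the segment center, is perpendicular to the
    segment (direction seg_dir), and ends where it meets the lane line through
    lane_p0 and lane_p1. The returned track point is its midpoint.
    """
    c = np.asarray(seg_center, dtype=float)
    u = np.asarray(seg_dir, dtype=float)
    n = np.array([-u[1], u[0]])                 # normal of the dash segment
    p0 = np.asarray(lane_p0, dtype=float)
    d = np.asarray(lane_p1, dtype=float) - p0
    # intersection of c + s*n with the lane line p0 + t*d: solve for (s, t)
    s, _t = np.linalg.solve(np.column_stack([n, -d]), p0 - c)
    foot = c + s * n                            # endpoint of the connecting line on the lane line
    return (c + foot) / 2.0                     # track point = midpoint of the connecting line
```

For example, a vertical dash centered at (2, 0) next to a vertical lane line at x = 0 yields the track point (1, 0), halfway between them.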
In one possible implementation, the determining module 30 is further specifically configured to:
determining the bend entry point of the lane line;
and determining the bend entry point of the travel track according to the bend entry point of the lane line.
In one possible implementation, the determining module 30 is further specifically configured to:
determining a grayscale image corresponding to the lane line in the overhead view;
and determining the bend entry point of the lane line according to the sum of the pixel values of each column of pixels in the grayscale image, where the direction of the columns is consistent with the direction of the straight-line track.
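A minimal sketch of the column-sum idea, under my own simplifying assumptions: the lane-line mask is a binary top-down image whose columns run along the straight-track direction, the column with the largest pixel sum marks the straight stretch, and the bend entry is taken as the first row whose lane pixels leave that column band. The band-width parameter and the exact criterion are illustrative choices, not the patent's.

```python
import numpy as np

def find_bend_row(gray, band=0):
    """Locate the bend entry of a lane line in a top-down grayscale mask.

    Sums the pixel values of each column; the column with the largest sum
    identifies the straight stretch, and the first row containing lane
    pixels outside that band (plus/minus `band` columns) is returned as
    the bend entry row, or None if the line never bends.
    """
    col_sums = gray.sum(axis=0)                 # pixel-value sum per column
    main_col = int(np.argmax(col_sums))         # column band of the straight stretch
    for row in range(gray.shape[0]):
        cols = np.nonzero(gray[row])[0]
        if cols.size and (cols.min() < main_col - band or cols.max() > main_col + band):
            return row                          # first row leaving the straight band
    return None
```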
In one possible implementation, the device for predicting a travel track further includes a control module, where the control module is configured to:
and determining, according to the travel track, the travel speed of the vehicle and the steering engine angle with which the vehicle enters the curve.
It should be noted that, because the information exchange between the above devices/units and their execution processes are based on the same concept as the method embodiments of the present application, their specific functions and technical effects can be found in the method embodiment section and are not repeated here.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 10, the electronic device of this embodiment includes: a processor 11, a memory 12, and a computer program 13 stored in the memory 12 and executable on the processor 11. When executing the computer program 13, the processor 11 implements the steps of the above method embodiments, such as steps S101 to S105 shown in fig. 1. Alternatively, when executing the computer program 13, the processor 11 implements the functions of the modules/units in the above device embodiments, for example, the functions of the acquisition module 10 through the fitting module 50 shown in fig. 9.
By way of example, the computer program 13 may be divided into one or more modules/units, which are stored in the memory 12 and executed by the processor 11 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution of the computer program 13 in the electronic device.
Those skilled in the art will appreciate that fig. 10 is merely an example of the electronic device and does not constitute a limitation; the electronic device may include more or fewer components than shown, combine certain components, or use different components. For example, it may further include input/output devices, network access devices, a bus, and the like.
The processor 11 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 12 may be an internal storage unit of the electronic device, such as a hard disk or an internal memory of the electronic device. The memory 12 may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the electronic device. Further, the memory 12 may include both an internal storage unit and an external storage device of the electronic device. The memory 12 is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, the division into the functional units and modules described above is merely illustrative. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or detailed in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such an understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (8)

1. A method for predicting a travel track, comprising:
acquiring an image of a current road, wherein a shooting view angle of the image is a first view angle;
performing a perspective transformation on the image to obtain an image from an overhead view;
extracting lane lines from the overhead-view image, wherein the lane lines in the overhead view comprise a left lane line, a right lane line, and a broken line located between the left lane line and the right lane line, the broken line comprises at least two line segments, and the travel track comprises a left-side travel track and a right-side travel track;
determining, according to the lane lines, track points for fitting a travel track and a bend entry point of the travel track, wherein the determining of the track points for fitting the travel track according to the lane lines comprises: taking the midpoint of each first connecting line as a track point for fitting the left-side travel track, wherein each first connecting line connects the center of one line segment to the left lane line and is perpendicular to the corresponding line segment; and taking the midpoint of each second connecting line as a track point for fitting the right-side travel track, wherein each second connecting line connects the center of one line segment to the right lane line and is perpendicular to the corresponding line segment;
dividing the track points into track points of a straight-line portion and track points of a curve portion according to the bend entry point;
and fitting the track points of the straight-line portion and the track points of the curve portion separately to obtain a straight-line track and a curve track, and connecting the straight-line track and the curve track to obtain the travel track.
2. The method according to claim 1, wherein the extracting of the lane lines from the overhead-view image comprises:
extracting the lane lines from the overhead-view image by using a neural-network-based image segmentation algorithm.
3. The method for predicting a travel track according to claim 1, wherein the determining of the bend entry point of the travel track according to the lane lines comprises:
determining a bend entry point of the lane line;
and determining the bend entry point of the travel track according to the bend entry point of the lane line.
4. The method for predicting a travel track according to claim 3, wherein the determining of the bend entry point of the lane line comprises:
determining a grayscale image corresponding to the lane line in the overhead view;
and determining the bend entry point of the lane line according to the sum of the pixel values of each column of pixels in the grayscale image, wherein the direction of the columns is consistent with the direction of the straight-line track.
5. The method according to claim 1, wherein after the connecting of the straight-line track and the curve track to obtain the travel track, the method further comprises:
determining, according to the travel track, the travel speed of the vehicle and the steering engine angle with which the vehicle enters the curve.
6. A device for predicting a travel track, comprising:
an acquisition module, configured to acquire an image of a current road, wherein a shooting angle of view of the image is a first angle of view;
an extraction module, configured to perform a perspective transformation on the image to obtain an image from an overhead view, and to extract lane lines from the overhead-view image, wherein the lane lines in the overhead view comprise a left lane line, a right lane line, and a broken line located between the left lane line and the right lane line, the broken line comprises at least two line segments, and the travel track comprises a left-side travel track and a right-side travel track;
a determining module, configured to determine, according to the lane lines, track points for fitting a travel track and a bend entry point of the travel track, wherein the determining of the track points for fitting the travel track according to the lane lines comprises: taking the midpoint of each first connecting line as a track point for fitting the left-side travel track, wherein each first connecting line connects the center of one line segment to the left lane line and is perpendicular to the corresponding line segment; and taking the midpoint of each second connecting line as a track point for fitting the right-side travel track, wherein each second connecting line connects the center of one line segment to the right lane line and is perpendicular to the corresponding line segment;
a dividing module, configured to divide the track points into track points of a straight-line portion and track points of a curve portion according to the bend entry point;
and a fitting module, configured to fit the track points of the straight-line portion and the track points of the curve portion separately to obtain a straight-line track and a curve track, and to connect the straight-line track and the curve track to obtain the travel track.
7. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 5 when executing the computer program.
8. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 5.
CN202011403570.7A 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium Active CN112528807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011403570.7A CN112528807B (en) 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011403570.7A CN112528807B (en) 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112528807A CN112528807A (en) 2021-03-19
CN112528807B true CN112528807B (en) 2023-12-19

Family

ID=74997471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011403570.7A Active CN112528807B (en) 2020-12-04 2020-12-04 Method and device for predicting running track, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112528807B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113587942A (en) * 2021-06-29 2021-11-02 深圳一清创新科技有限公司 Route processing method and device based on autonomous map building and electronic equipment
CN113781603B (en) * 2021-09-15 2023-08-22 北京有竹居网络技术有限公司 Track point generating method, device, computer equipment and computer storage medium
CN114184206B (en) * 2021-12-03 2024-04-19 北京车慧达科技有限公司 Method and device for generating driving route based on vehicle track points
CN114419877B (en) * 2021-12-15 2022-11-15 中国科学院深圳先进技术研究院 Vehicle track prediction data processing method and device based on road characteristics

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104819724A (en) * 2015-03-02 2015-08-05 北京理工大学 Unmanned ground vehicle self-driving assisting system based on GIS
CN108242145A (en) * 2016-12-26 2018-07-03 高德软件有限公司 Abnormal track point detecting method and device
WO2020029667A1 (en) * 2018-08-09 2020-02-13 Zhejiang Dahua Technology Co., Ltd. Methods and systems for lane line identification
CN110838233A (en) * 2019-10-12 2020-02-25 中国平安人寿保险股份有限公司 Vehicle behavior analysis method and device and computer readable storage medium
CN111380543A (en) * 2018-12-29 2020-07-07 沈阳美行科技有限公司 Map data generation method and device
CN111797780A (en) * 2020-07-08 2020-10-20 中国第一汽车股份有限公司 Vehicle following track planning method, device, server and storage medium


Also Published As

Publication number Publication date
CN112528807A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN112528807B (en) Method and device for predicting running track, electronic equipment and storage medium
CN112528878B (en) Method and device for detecting lane line, terminal equipment and readable storage medium
CN108986465B (en) Method, system and terminal equipment for detecting traffic flow
JP3522317B2 (en) Travel guide device for vehicles
CN110929655B (en) Lane line identification method in driving process, terminal device and storage medium
CN110008891B (en) Pedestrian detection positioning method and device, vehicle-mounted computing equipment and storage medium
CN112819864B (en) Driving state detection method and device and storage medium
CN112654998B (en) Lane line detection method and device
CN114926540A (en) Lane line calibration method and device, terminal equipment and readable storage medium
CN113297939B (en) Obstacle detection method, obstacle detection system, terminal device and storage medium
CN113569812A (en) Unknown obstacle identification method and device and electronic equipment
CN114693722B (en) Vehicle driving behavior detection method, detection device and detection equipment
CN107452230B (en) Obstacle detection method and device, terminal equipment and storage medium
CN115601435A (en) Vehicle attitude detection method, device, vehicle and storage medium
CN115019511A (en) Method and device for identifying illegal lane change of motor vehicle based on automatic driving vehicle
CN115965636A (en) Vehicle side view generating method and device and terminal equipment
CN117011481A (en) Method and device for constructing three-dimensional map, electronic equipment and storage medium
CN115393827A (en) Traffic signal lamp state identification method and system, electronic equipment and storage medium
CN116206483B (en) Parking position determining method, electronic device and computer readable storage medium
CN113688653B (en) Recognition device and method for road center line and electronic equipment
EP4224361A1 (en) Lane line detection method and apparatus
CN113269004B (en) Traffic counting device and method and electronic equipment
CN111435425B (en) Method and system for detecting drivable region, electronic device, and readable storage medium
CN116246459A (en) Method, device, terminal equipment and storage medium for detecting lane where vehicle is located
CN118262325A (en) Method and device for identifying state of traffic light, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant