CN110781816A - Method, device, equipment and storage medium for transverse positioning of vehicle in lane - Google Patents


Info

Publication number: CN110781816A
Application number: CN201911024194.8A
Authority: CN (China)
Prior art keywords: lane, image, vehicle, SAR, determining
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 唐侃
Current and original assignee: Beijing Autoroad Tech Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Beijing Autoroad Tech Co Ltd

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method, device, equipment, and storage medium for laterally positioning a vehicle in a lane. The method comprises the following steps: acquiring a visual image through a camera mounted on the vehicle, converting the visual image into an overhead image, and detecting the lane lines from the overhead image; acquiring SAR images through two synthetic aperture radars (SARs) mounted at the oblique-front corners of the vehicle, and detecting the lane edges from the SAR images; fusing the overhead image and the SAR images and determining the positions of the lane lines and lane edges in the fused image; and determining the lateral lane-level position of the vehicle based on the positions of the lane lines and lane edges in the fused image and the vehicle motion information. The method requires only a small volume of SAR data, is suitable for all-weather environments, improves both detection accuracy and positioning accuracy, and is computationally light, stable, and reliable.

Description

Method, device, equipment and storage medium for transverse positioning of vehicle in lane
Technical Field
The embodiment of the invention relates to the technical field of information fusion, in particular to a method, a device, equipment and a storage medium for transversely positioning a vehicle in a lane.
Background
Lane-level lateral positioning is essential for improving the safety of autonomous vehicles. Together with the road feature information and the obstacle detection and tracking results that an autonomous vehicle can obtain, lateral positioning lets the vehicle know its exact position within the current lane, which is critical for applications such as lane-departure warning, assisted steering, and autonomous navigation.
One prior-art lateral positioning approach detects lane lines purely with a vision system. However, vision systems are very sensitive to background light and require lane markings that are complete and uniformly formatted. Other approaches go beyond a pure vision system, but they detect the lane edges poorly, so the resulting lateral positioning is inaccurate.
Disclosure of Invention
Embodiments of the invention provide a method, device, equipment, and storage medium for laterally positioning a vehicle in a lane, which improve detection accuracy and positioning accuracy with a small amount of computation while remaining stable and reliable.
In a first aspect, an embodiment of the invention provides a method for laterally positioning a vehicle in a lane, the method comprising: acquiring a visual image through a camera mounted on the vehicle, converting the visual image into an overhead image, and detecting the lane lines from the overhead image; acquiring SAR images through two synthetic aperture radars (SARs) mounted at the oblique-front corners of the vehicle, and detecting the lane edges from the SAR images; fusing the overhead image and the SAR images, and determining the positions of the lane lines and lane edges in the fused image; and determining the lateral lane-level position of the vehicle based on the positions of the lane lines and lane edges in the fused image and the vehicle motion information.
In a second aspect, an embodiment of the invention further provides a device for laterally positioning a vehicle in a lane, the device comprising: a lane line detection module for acquiring a visual image through a camera mounted on the vehicle, converting the visual image into an overhead image, and detecting the lane lines from the overhead image; a lane edge detection module for acquiring SAR images through two synthetic aperture radars (SARs) mounted at the oblique-front corners of the vehicle and detecting the lane edges from the SAR images; a fusion module for fusing the overhead image and the SAR images and determining the positions of the lane lines and lane edges in the fused image; and a lane-level positioning module for determining the lateral lane-level position of the vehicle based on the positions of the lane lines and lane edges in the fused image and the vehicle motion information.
In a third aspect, an embodiment of the present invention further provides a vehicle lateral positioning apparatus, including:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method for laterally positioning a vehicle in a lane described in any embodiment of the invention.
In a fourth aspect, embodiments of the invention further provide a computer-readable storage medium on which a computer program is stored which, when executed by a processor, implements the method for laterally positioning a vehicle in a lane according to any embodiment of the invention.
In the embodiments, a visual image is acquired through a camera mounted on the vehicle, converted into an overhead image, and used to detect the lane lines; SAR images are acquired through two synthetic aperture radars (SARs) mounted at the oblique-front corners of the vehicle and used to detect the lane edges; the overhead image and the SAR images are fused, and the positions of the lane lines and lane edges in the fused image are determined; and the lateral lane-level position of the vehicle is determined based on those positions and the vehicle motion information. This solves the problem of laterally positioning the vehicle in its lane: the required SAR data volume is small, the method is applicable in all-weather environments, and it achieves high detection accuracy, high positioning accuracy, a small amount of computation, stability, and reliability.
Drawings
FIG. 1 is a flow chart of a method for lateral positioning of a vehicle in a roadway provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of the installation and geometric relationship of the SAR and the camera in the method for laterally positioning the vehicle in the lane according to the embodiment of the present invention;
FIG. 3 is a schematic diagram of a fused image of a method for lateral positioning of a vehicle in a lane according to an embodiment of the present invention;
FIG. 4 is a flow chart of a method for lateral positioning and road perception of a vehicle in a lane provided by an embodiment of the present invention;
FIG. 5 is a block diagram of a method for lateral positioning and road perception of a vehicle in a lane according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a lateral positioning device of a vehicle in a lane according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a vehicle lateral positioning apparatus according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Fig. 1 is a flowchart of a method for laterally positioning a vehicle in a lane according to an embodiment of the invention, and fig. 2 is a schematic view of the installation and geometric relationship of the SARs and the camera in that method. This embodiment is applicable to situations in which an autonomously driven vehicle must determine its position within a lane. The method may be performed by a device for laterally positioning a vehicle in a lane; the device may be implemented in software and/or hardware and may be integrated into an on-board processor. As shown in fig. 1, the method specifically includes:
S110, acquiring a visual image through a camera mounted on the vehicle, converting the visual image into an overhead image, and detecting the lane lines from the overhead image;
as shown in fig. 2, the camera 220 may be installed behind a front windshield of the vehicle 200, where the installation position is highest, so that the lane line condition can be observed with a greater visual sense, and the camera is not easily blocked by a reverse image of an object such as a green belt or a road block. The camera 220 may be a monocular camera, may perform inverse perspective transformation through an original video, convert an acquired visual image into an overhead image 270, may perform gray level transformation, filtering, and natural noise removal on the overhead image 270, and then screen out a region of interest in shape through edge detection, perform line detection using hough transformation, perform lane presumption according to a line position to determine whether a detected line is a lane line 240, and finally realize detection of the lane line 240 through the overhead image 270.
S120, acquiring SAR images through two synthetic aperture radars (SARs) mounted at the oblique-front corners of the vehicle, and detecting the lane edges from the SAR images;
as shown in fig. 2, the SAR210 may be installed in a front oblique direction of the vehicle 200, one SAR210 may be installed in each of two front oblique directions of the vehicle 200, the millimeter waves generated by the SAR210 may scan a road with a certain angle, the image 260 may be obtained, since the angle may be changed and the vehicle is moving, the image 260 obtained by the SAR210 may finally generate the SAR image 280, and the lane edge 230 is detected through the SAR image 280.
In one implementation of this embodiment, optionally, detecting the lane edge from the SAR image may include: binarizing the SAR image and denoising it; recording, in the processed SAR image, the obstacle points closest to the vehicle; and fitting a straight line through those closest obstacle points, the fitted line being taken as the lane edge.
The SAR images acquired on both sides of the vehicle can be binarized, and morphological operations can be used for denoising. As shown in fig. 2, the obstacle points closest to the vehicle 200 in each processed SAR image 280 are recorded, and a straight line fitted through these points yields the lane edge 230.
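The three steps above (binarize, keep the nearest obstacle return per range row, fit a straight line) can be sketched as follows. The toy image and its layout (rows as range bins, lower column index meaning closer to the vehicle) are assumptions made for illustration, not the patent's data format.

```python
import numpy as np

def lane_edge_from_sar(binary):
    """For each row of a binarized SAR image, find the column of the
    obstacle return closest to the vehicle (assumed: lowest column),
    then fit a least-squares line col = a*row + b through those points.
    The fitted line is the estimated lane edge."""
    rows, cols = [], []
    for r, line in enumerate(binary):
        hits = np.flatnonzero(line)
        if hits.size:                 # nearest obstacle = first hit in the row
            rows.append(r)
            cols.append(hits[0])
    a, b = np.polyfit(rows, cols, 1)  # degree-1 least-squares fit
    return a, b

# Toy SAR image whose nearest returns lie exactly on col = 2 + row;
# the return at col 8 is farther clutter and is correctly ignored.
img = np.zeros((5, 10), dtype=int)
for r in range(5):
    img[r, 2 + r] = 1
    img[r, 8] = 1
a, b = lane_edge_from_sar(img)
```

Because the fit uses only the nearest return per row, isolated clutter farther from the vehicle does not pull the edge estimate.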
S130, fusing the overhead image and the SAR images, and determining the positions of the lane lines and lane edges in the fused image;
In one implementation of this embodiment, optionally, fusing the overhead image and the SAR image includes: converting the SAR image into the coordinate system of the overhead image, and fusing the converted image with the overhead image.
As shown in fig. 2, the irradiation ranges of the two SARs 210 overlap the field of view of the camera 220, but not its center, i.e., the area directly in front of the vehicle 200. Therefore, after the camera 220 and the SARs 210 are installed, they are first calibrated: their mounting positions are determined, along with the field of view of the camera 220 and the irradiation range of each SAR 210. Concretely, the field of view of the camera 220 at a given moment is the region covered by the overhead image 270, and the irradiation range of a SAR 210 is its beam footprint 260. Next, time alignment is performed so that the overhead image 270 acquired by the camera 220 is matched to the SAR image 280 acquired at the same time, letting objects seen by the camera be associated with the distance information the SAR 210 measures for them. Taking the coordinate system of the overhead image 270 as the reference, the SAR image 280 is aligned to it, the content of the SAR image 280 and its detection results are filled in, and the two images are fused into a single consistent image. The lane line 240 detected from the overhead image 270 and the lane edge 230 detected from the SAR image 280 can then both be displayed on the fused image, which facilitates determining the lateral lane-level position of the vehicle. Fig. 3 is a schematic diagram of such a fused image: as shown in fig. 3, it contains the overhead image 270, the SAR image 280, the lane line 240, and the lane edge 230.
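A minimal sketch of converting SAR-frame coordinates into the overhead-image frame, assuming the calibrated extrinsics reduce to a planar rotation plus translation (the angle and offset below are hypothetical calibration values, not from the patent):

```python
import numpy as np

def sar_to_topview(points, theta, t):
    """Rigid 2-D transform of points from a SAR's local frame into the
    overhead-image frame: rotate by theta, then translate by t. The
    angle and offset play the role of the SAR-to-camera extrinsic
    calibration determined after installation."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return points @ R.T + t

# A point 1 m along the SAR boresight, with the boresight rotated 90
# degrees relative to the top view and the SAR mounted at the origin:
out = sar_to_topview(np.array([[1.0, 0.0]]), np.pi / 2, np.array([0.0, 0.0]))
```

After this transform, SAR detections and camera detections live in one coordinate system, which is what makes the overlay in fig. 3 possible.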
S140, determining the lateral lane-level position of the vehicle based on the positions of the lane lines and lane edges in the fused image and the vehicle motion information.
In an embodiment, optionally, determining the lateral lane-level position of the vehicle based on the positions of the lane lines and lane edges in the fused image and the vehicle motion information includes: determining the total lane width based on the positions of the lane edges in the fused image and the vehicle's turning-angle information; and determining the lateral lane-level position of the vehicle based on the total lane width, the positions of the lane lines, and the vehicle positioning information.
As shown in fig. 2, the number of lanes N of the current road can be obtained from the overhead image captured by the camera, N being determined from the number of lane lines found when processing the overhead image; the total lane width L is obtained from the SAR images 280 acquired by the SARs 210.
Optionally, determining the total lane width based on the positions of the lane edges in the fused image and the vehicle's turning-angle information includes: recording, for each side, the target distance between the nearest end of the fused image and the corresponding single-side road edge; determining a provisional total lane width from these target distances plus the sum of the lateral distance between the two SARs and the set blind-zone distance; and determining the final total lane width from the provisional width and the vehicle's turning-angle information. The nearest end of the fused image is the end of the fused image closest to the SAR. Specifically, as shown in fig. 2, the nearest ends include end a, closest to the SAR 210 on the left side of the vehicle 200, and end b, closest to the SAR 210 on the right side.
Assume the target distances from a and b to the corresponding single-side road edges are L1 and L2 respectively: the target distance from a to the lane edge 230 on the left of the vehicle 200 is L1, and the target distance from b to the lane edge 230 on the right is L2.
Assume the sum of the lateral distance between the left and right SARs 210 and the set blind-zone distance is L0. The blind-zone distance may be preset according to the actual situation and may be taken as the sum of the distances from the two nearest ends of the fused image to their corresponding SARs 210. When the vehicle is traveling straight, the total lane width is obtained from the formula L = L0 + L1 + L2; that is, the total lane width is the sum of the two target distances between the nearest ends of the fused image and the corresponding single-side road edges, plus the lateral distance between the two SARs 210 and the set blind-zone distance.
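The straight-driving formula above can be written down directly; the distance values used in the example call are hypothetical, in metres.

```python
def total_lane_width(l0, l1, l2):
    """Straight-driving case of the formula L = L0 + L1 + L2:
    l0 is the lateral distance between the two SARs plus the set
    blind-zone distance; l1 and l2 are the measured target distances
    to the left and right road edges."""
    return l0 + l1 + l2

L = total_lane_width(2.5, 3.0, 4.5)  # hypothetical distances in metres
```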
It should be noted that in a real driving scene the vehicle is not always traveling straight; it may be turning left or right. In that case the formula L = L0 + L1 + L2 only gives a provisional total width, and the true total lane width must be obtained from the vehicle's turning-angle information via a trigonometric relation. The turning-angle information can be acquired from on-board sensors, which can relate the lane to the vehicle's position, speed, yaw angle, and steering angle; this matches the vehicle's actual trajectory and accurately describes maneuvers such as lane changes, stops, U-turns, and turns. The steering angle itself may be measured by a sensor at the steering wheel or at the tires. This ensures that the resulting total lane width L is accurate.
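One plausible reading of the trigonometric correction is sketched below. This is an assumption: the text only says a trigonometric relation involving the turning angle is applied, so the cosine form here is an illustrative guess, not the patent's formula.

```python
import math

def corrected_lane_width(provisional_width, yaw_rad):
    """Assumed correction: if the vehicle's heading deviates from the
    lane direction by yaw_rad, the span measured across the vehicle
    crosses the lane obliquely and overstates the perpendicular lane
    width by a factor of 1/cos(yaw), so the true width is the
    provisional width times cos(yaw)."""
    return provisional_width * math.cos(yaw_rad)
```

Under this assumption the correction vanishes when driving straight (yaw = 0) and shrinks the provisional width as the turning angle grows.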
The vehicle positioning information can come from on-board speed and angle measurement equipment or from an on-board GPS receiver. The lateral lane-level position of the vehicle is then determined from the total lane width, the lane-line positions, and the vehicle positioning information. For example, suppose the total lane width is 90 meters, each lane is 30 meters wide, and the lane lines lie 30 meters and 60 meters from the left lane edge. If the vehicle is positioned 50 meters from the left lane edge, it lies between the lane lines at 30 meters and 60 meters, i.e., in the second lane from the left.
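The worked example above amounts to counting how many lane lines the vehicle's lateral offset has passed; a small sketch, using the text's own numbers and 1-based counting from the left road edge:

```python
def lane_index(lane_line_positions, vehicle_pos):
    """1-based index, counted from the left road edge, of the lane the
    vehicle occupies; all positions are lateral distances from the
    left lane edge."""
    return 1 + sum(1 for line in sorted(lane_line_positions) if vehicle_pos > line)

# The text's example: lines 30 m and 60 m from the left edge, vehicle
# 50 m from the left edge, so it sits in the second lane from the left.
idx = lane_index([30.0, 60.0], 50.0)
```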
In this embodiment, after the total lane width is determined, the method may further include: determining a detected number of lanes from the detected lane lines; determining a fixed number of lanes from the total lane width and the standard lane width; and, if the fixed number disagrees with the detected number, either performing joint obstacle detection on the fused image to re-determine the lane-edge positions, or correcting the lane-line positions based on the fixed number.
Re-determining the lane-edge positions re-determines the total lane width and thereby improves accuracy. The detected number of lanes N is the number of detected lane lines plus 1. The fixed number of lanes is calculated by rounding: if the general lane width in the national highway design standard is X, the fixed number is M = L / X. If the fixed number M equals the detected number N, the lane lines detected by the camera are accurate. If M differs from N, either the road is in some other state or has unmarked lanes, or the edges detected in the SAR images were affected by obstacles. In that case, joint obstacle detection based on the fused image can re-determine the lane-edge positions, or the lane-line positions can be corrected based on the fixed number. For example, objects such as sidewalks, green belts, and divider barriers must be detected and distinguished as lane edges, and roadside parked vehicles and other scattered obstacles must be recognized so that the lane edges can be relocated. Whether a lane-line position is accurate can be checked by testing whether the distance to the nearest lane line or lane edge on either side matches the standard lane width. In this way the lane-edge positions are determined accurately, the lane-line positions are corrected, and the lateral lane-level positioning of the vehicle is accurate.
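The consistency check between the fixed count M and the detected count N can be sketched as follows. Rounding L / X to the nearest integer is an assumption; the text only writes M = L / X.

```python
def lanes_consistent(total_width, standard_width, num_detected_lines):
    """Compare the 'fixed' lane count M = round(L / X) with the detected
    count N = number of detected lane lines + 1. Returns (consistent?,
    M, N); a mismatch signals unmarked lanes or obstacle-affected edges."""
    m = round(total_width / standard_width)
    n = num_detected_lines + 1
    return m == n, m, n
```

For example, a 15 m road against a 3.75 m standard lane gives M = 4; three detected lane lines give N = 4, so the detections agree, while only two detected lines would flag a mismatch to trigger joint obstacle detection or lane-line correction.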
In the prior art, vision-based lane-line detection is sensitive to background light and requires lane markings that are complete and uniformly formatted. Other lateral positioning methods go beyond a pure vision system, for example fusing lidar with vision; but lidar is expensive and its data volume is computationally heavy, and its recognition of moving and static obstacles and of lane edges depends on roadside metal targets (such as fences and lamp posts), so it performs poorly at intersections, in tunnels, and along other lane edges without metal targets, and the lane-edge line it fits from detected target points has limited precision, making the lateral positioning inaccurate. By contrast, the method for laterally positioning a vehicle in a lane provided by this embodiment needs only a small volume of SAR data, the SARs are inexpensive and suitable for all-weather environments, and moving/static obstacle recognition and lane-edge recognition are performed at the sensor end, so the method is computationally light, accurate, stable, and reliable.
In the technical scheme of this embodiment, a visual image is acquired by a camera mounted on the vehicle, converted into an overhead image, and used to detect the lane lines; SAR images are acquired by two SARs mounted at the oblique-front corners of the vehicle and used to detect the lane edges; the overhead image and the SAR images are fused, and the positions of the lane lines and lane edges in the fused image are determined; and the lateral lane-level position of the vehicle is determined from those positions and the vehicle motion information. This solves the problem of laterally positioning the vehicle in its lane: the required SAR data volume is small, the method works in all-weather environments, and through the cooperation of the SARs and the camera it achieves high detection accuracy, high positioning accuracy, a small amount of computation, stability, and reliability.
Fig. 4 is a flow chart of a method for lateral positioning and road perception of a vehicle in a lane according to an embodiment of the present invention. As shown in fig. 4, the method specifically includes:
S210, acquiring a visual image through a camera mounted on the vehicle, converting the visual image into an overhead image, and detecting the lane lines from the overhead image.
S220, acquiring SAR images through two synthetic aperture radars (SARs) mounted at the oblique-front corners of the vehicle, and detecting the lane edges from the SAR images;
S230, fusing the overhead image and the SAR images, and determining the positions of the lane lines and lane edges in the fused image;
S240, determining the lateral lane-level position of the vehicle based on the positions of the lane lines and lane edges in the fused image and the vehicle motion information.
S250, performing intersection detection and moving/static target recognition based on the fused image.
Road intersection detection and moving/static target recognition give the vehicle road perception: the vehicle can avoid moving and static targets and pass through road intersections smoothly and safely.
Fig. 5 is a block diagram of the method for lateral positioning and road perception of a vehicle in a lane provided by an embodiment of the invention. As shown in fig. 5, the vehicle performs lane-edge detection with the SARs and lane-line detection with the camera, fuses the two (both the acquired images and the detection results), and combines them with the vehicle's on-board sensors to correct the detection results, discriminate intersections, and detect moving and static targets, finally achieving accurate road perception and lane-level positioning.
In the technical scheme of this embodiment, a visual image is acquired by a camera mounted on the vehicle, converted into an overhead image, and used to detect the lane lines; SAR images are acquired by two SARs mounted at the oblique-front corners of the vehicle and used to detect the lane edges; the overhead image and the SAR images are fused, and the positions of the lane lines and lane edges in the fused image are determined; the lateral lane-level position of the vehicle is determined from those positions and the vehicle motion information; and intersection detection and moving/static target recognition are performed on the fused image. This solves both the lateral positioning of the vehicle in its lane and the vehicle's road perception: the required SAR data volume is small, the method works in all-weather environments, and the cooperation of the SARs and the camera yields high detection accuracy, high positioning accuracy, a small amount of computation, stability, and reliability, while road perception lets the autonomously driven vehicle avoid obstacles and pass through intersections smoothly, guaranteeing driving safety.
Fig. 6 is a schematic structural diagram of a lateral positioning device of a vehicle in a lane according to an embodiment of the present invention. With reference to fig. 6, the apparatus comprises: lane line detection module 310, lane edge detection module 320, fusion module 330, and lane level location module 340.
The lane line detection module 310 is configured to acquire a visual image through a camera provided on a vehicle, convert the visual image into an overhead image, and detect a lane line through the overhead image;
the lane edge detection module 320 is used for respectively acquiring SAR images through two synthetic aperture radars SAR arranged in the oblique front of the vehicle and detecting lane edges through the SAR images;
optionally, the lane edge detecting module 320 includes:
the denoising processing unit, used to binarize the SAR image and perform denoising;
the obstacle point recording unit is used for recording the point of the obstacle closest to the vehicle in the processed SAR image;
and the straight line fitting unit is used for performing straight line fitting on the points of the obstacles closest to the vehicle and taking the fitted straight lines as the edges of the lane.
The fusion module 330 is configured to fuse the overhead view image and the SAR image, and determine positions of a lane line and a lane edge in the fusion image;
optionally, the fusion module 330 includes:
and the fusion unit is used for converting the SAR image into a coordinate system where the overhead view image is located and fusing the converted image and the overhead view image.
And a lane-level positioning module 340, configured to determine a lateral lane-level positioning of the vehicle based on the position of the lane line, the position of the lane edge, and the vehicle motion information in the fused image.
Optionally, the lane-level positioning module 340 includes:
the lane total width determining unit is used for determining the lane total width based on the position of the lane edge in the fusion image and the corner information of the vehicle;
a lateral lane-level location determination unit of the vehicle for determining a lateral lane-level location of the vehicle based on the total lane width, the position of the lane line, and the vehicle location information.
Optionally, the total lane width determining unit is specifically configured to record, for each side, the target distance between the nearest end of the fused image and the corresponding single-side road edge, and to determine a provisional total lane width from these target distances plus the sum of the lateral distance between the two SARs and the set blind-zone distance;
determining the total width of the lane based on the total width of the lane to be determined and the corner information of the vehicle;
and the nearest end of the fusion image is the end of the fusion image closest to the SAR.
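The width computation above reduces to a short formula. The sum of the measured terms is stated in the patent; using a cosine projection for the steering/yaw correction is an assumed interpretation, and all names are illustrative:

```python
import math

def total_lane_width(d_left, d_right, sar_spacing, blind_zone, yaw_rad):
    """Pending width = left/right target distances plus the lateral
    spacing between the two SARs plus the fixed blind-zone distance;
    the vehicle's yaw relative to the lane then shortens the apparent
    width, so the cosine projects it back (assumed correction)."""
    pending = d_left + d_right + sar_spacing + blind_zone
    return pending * math.cos(yaw_rad)
```

For a vehicle driving parallel to the lane (`yaw_rad = 0`) the correction is a no-op and the total width is just the sum of the four distances.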
The in-lane vehicle lateral positioning device provided by this embodiment of the invention can execute the in-lane vehicle lateral positioning method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the executed method.
On the basis of the foregoing embodiment, optionally, the apparatus provided in the embodiment of the present invention further includes:
a detected number determining module for determining the detected number of lanes based on the detected lane lines;
the fixed number determining module is used for determining the fixed number of lanes based on the total lane width and the standard lane width;
and the joint obstacle detection module is used for performing joint obstacle detection based on the fused image if the fixed number is inconsistent with the detected number, so as to re-determine the position of the lane edge, or correcting the position of the lane line based on the fixed number.
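The consistency check behind this module is simple enough to state directly. A sketch, assuming n lane lines bound n-1 lanes and using 3.75 m as the standard lane width (a common expressway value; the patent does not fix it):

```python
def check_lane_count(detected_lane_lines, total_width, standard_width=3.75):
    """Compare the number of lanes implied by the total width against
    the number implied by the detected lane lines. A mismatch flags
    the need for joint obstacle re-detection on the fused image."""
    detected = max(len(detected_lane_lines) - 1, 0)  # n lines -> n-1 lanes
    fixed = round(total_width / standard_width)
    return fixed, detected, fixed == detected
```

When the two counts disagree, either an edge was mis-detected (total width wrong) or a lane line was missed or hallucinated, which is exactly the ambiguity the joint obstacle detection is meant to resolve.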
On the basis of the foregoing embodiment, optionally, the apparatus provided in the embodiment of the present invention further includes:
and the intersection detection and moving/stationary target recognition module is used for performing intersection detection and moving/stationary target recognition based on the fused image.
Fig. 7 is a schematic structural diagram of a vehicle lateral positioning apparatus provided in an embodiment of the present invention, and as shown in fig. 7, the apparatus includes:
one or more processors 410, one processor 410 being exemplified in FIG. 7;
a memory 420;
the apparatus may further include: input device 430, output device 440, SAR210, camera 220, and sensor 450.
The processor 410, memory 420, input device 430, output device 440, SAR210, camera 220, and sensor 450 of the apparatus may be connected via a bus or other means, as exemplified by a bus connection in fig. 7.
The memory 420, as a non-transitory computer-readable storage medium, may be used for storing software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the method for lateral positioning of a vehicle in a lane in the embodiments of the invention (e.g., the lane line detection module 310, the lane edge detection module 320, the fusion module 330, and the lane-level positioning module 340 shown in FIG. 3). The processor 410 executes the various functional applications and data processing of the computer device by running the software programs, instructions, and modules stored in the memory 420, thereby implementing the method for lateral positioning of a vehicle in a lane of the above-described method embodiments, namely:
acquiring a visual image through a camera arranged on a vehicle, converting the visual image into an overhead image, and detecting a lane line through the overhead image;
acquiring SAR images through two synthetic aperture radars (SARs) respectively arranged obliquely ahead of the vehicle, and detecting the lane edge through the SAR images;
fusing the overhead image and the SAR image, and determining the positions of the lane line and the lane edge in the fused image;
and determining the lateral lane-level location of the vehicle based on the position of the lane line, the position of the lane edge, and the vehicle motion information in the fused image.
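The final step — reading a lateral lane-level position out of the fused image — can be illustrated in a few lines. A simple interpretation, with all names illustrative and the assumption that the vehicle lies between the outermost detected lane lines:

```python
import bisect

def lateral_lane_position(lane_line_xs, vehicle_x):
    """Given the x-coordinates of the detected lane lines in the fused
    overhead frame (metres, sorted left to right) and the vehicle's x
    position, report the lane index (0-based from the left) and the
    lateral offset from that lane's centre line."""
    lane = bisect.bisect(lane_line_xs, vehicle_x) - 1  # between lines lane, lane+1
    center = (lane_line_xs[lane] + lane_line_xs[lane + 1]) / 2
    return lane, vehicle_x - center
```

A positive offset means the vehicle sits right of its lane centre; in the full method the lane-line coordinates themselves come from the fused image and are tracked over time with the vehicle motion information.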
The memory 420 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the computer device, and the like. Further, the memory 420 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 420 may optionally include memory located remotely from processor 410, which may be connected to the terminal device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 430 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the computer apparatus. The output device 440 may include a display device such as a display screen.
One SAR 210 may be disposed obliquely ahead of the vehicle on each side, for acquiring SAR images so that lane edges can be detected from them.
The camera 220 may be disposed behind the front windshield of the vehicle to acquire a visual image, so that the visual image can be converted into an overhead image and a lane line detected from the overhead image.
The sensor 450 may be used to obtain the vehicle's position, speed, yaw angle, and heading relative to the lane, so that the total lane width can be corrected according to the vehicle's steering-angle information.
Embodiments of the present invention provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements a method for lateral positioning of a vehicle in a lane, as provided by embodiments of the present invention:
acquiring a visual image through a camera arranged on a vehicle, converting the visual image into an overhead image, and detecting a lane line through the overhead image;
acquiring SAR images through two synthetic aperture radars (SARs) respectively arranged obliquely ahead of the vehicle, and detecting the lane edge through the SAR images;
fusing the overhead image and the SAR image, and determining the positions of the lane line and the lane edge in the fused image;
and determining the lateral lane-level location of the vehicle based on the position of the lane line, the position of the lane edge, and the vehicle motion information in the fused image.
Any combination of one or more computer-readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A method of lateral positioning of a vehicle in a roadway, comprising:
acquiring a visual image through a camera arranged on a vehicle, converting the visual image into an overhead image, and detecting a lane line through the overhead image;
acquiring SAR images through two synthetic aperture radars (SARs) respectively arranged obliquely ahead of the vehicle, and detecting the lane edge through the SAR images;
fusing the overhead image and the SAR image, and determining the positions of the lane line and the lane edge in the fused image;
and determining the lateral lane-level location of the vehicle based on the position of the lane line, the position of the lane edge, and the vehicle motion information in the fused image.
2. The method of claim 1, further comprising:
and carrying out intersection detection and moving and static target identification based on the fusion image.
3. The method of claim 1, wherein determining a lateral lane-level location of a vehicle based on the location of the lane line, the location of the lane edge, and vehicle motion information in the fused image comprises:
determining the total lane width based on the position of the lane edge in the fused image and the steering-angle information of the vehicle;
determining the lateral lane-level location of the vehicle based on the total lane width, the position of the lane line, and the vehicle position information.
4. The method of claim 3, wherein determining the total lane width based on the position of the lane edge in the fused image and the steering-angle information of the vehicle comprises:
respectively recording the target distances between the nearest end of the fused image and the positions of the single-side road edges of the corresponding lane, and determining the total lane width to be determined based on the target distances plus the lateral distance between the two SARs and the set blind-zone distance;
determining the total lane width based on the total lane width to be determined and the steering-angle information of the vehicle;
wherein the nearest end of the fused image is the end of the fused image closest to the SAR.
5. The method of claim 1, wherein the detecting lane edges by the SAR image comprises:
binarizing the SAR image and denoising the result;
recording, in the processed SAR image, the points of the obstacles closest to the vehicle;
and fitting a straight line to those nearest obstacle points, and taking the fitted straight line as the lane edge.
6. The method of claim 1, wherein fusing the overhead image and the SAR image comprises:
converting the SAR image into the coordinate system of the overhead image, and fusing the converted image with the overhead image.
7. The method of claim 3, further comprising:
determining a detected number of lanes based on the detected lane lines;
determining a fixed number of lanes based on the total lane width and a standard lane width;
and if the fixed number is inconsistent with the detected number, performing joint obstacle detection based on the fused image to re-determine the position of the lane edge, or correcting the position of the lane line based on the fixed number.
8. A lateral positioning device of a vehicle in a roadway, comprising:
the lane line detection module is used for acquiring a visual image through a camera arranged on the vehicle, converting the visual image into an overhead image, and detecting a lane line through the overhead image;
the lane edge detection module is used for respectively acquiring SAR images through two synthetic aperture radars (SARs) arranged obliquely ahead of the vehicle, and detecting lane edges through the SAR images;
the fusion module is used for fusing the overhead image and the SAR image and determining the positions of the lane line and the lane edge in the fused image;
and the lane-level positioning module is used for determining the lateral lane-level location of the vehicle based on the position of the lane line, the position of the lane edge, and the vehicle motion information in the fused image.
9. A vehicle lateral positioning apparatus, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement a method of lateral positioning of a vehicle in a roadway as recited in any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a method for lateral positioning of a vehicle in a roadway as claimed in any one of claims 1 to 7.
CN201911024194.8A 2019-10-25 2019-10-25 Method, device, equipment and storage medium for transverse positioning of vehicle in lane Pending CN110781816A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911024194.8A CN110781816A (en) 2019-10-25 2019-10-25 Method, device, equipment and storage medium for transverse positioning of vehicle in lane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911024194.8A CN110781816A (en) 2019-10-25 2019-10-25 Method, device, equipment and storage medium for transverse positioning of vehicle in lane

Publications (1)

Publication Number Publication Date
CN110781816A true CN110781816A (en) 2020-02-11

Family

ID=69386693

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911024194.8A Pending CN110781816A (en) 2019-10-25 2019-10-25 Method, device, equipment and storage medium for transverse positioning of vehicle in lane

Country Status (1)

Country Link
CN (1) CN110781816A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111323802A (en) * 2020-03-20 2020-06-23 北京百度网讯科技有限公司 Vehicle positioning method, device and equipment
CN111524351A (en) * 2020-04-22 2020-08-11 东风汽车集团有限公司 Ramp speed limit identification method
CN112373474A (en) * 2020-11-23 2021-02-19 重庆长安汽车股份有限公司 Lane line fusion and transverse control method, system, vehicle and storage medium
CN113392762A (en) * 2021-06-15 2021-09-14 北京纵目安驰智能科技有限公司 Intersection detection method, system, terminal and computer readable storage medium
CN113494915A (en) * 2020-04-02 2021-10-12 广州汽车集团股份有限公司 Vehicle transverse positioning method, device and system

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010105605A1 (en) * 2009-03-18 2010-09-23 Eads Deutschland Gmbh Method and device for determining aspect angle progression
CN102303605A (en) * 2011-06-30 2012-01-04 中国汽车技术研究中心 Multi-sensor information fusion-based collision and departure pre-warning device and method
CN102447911A (en) * 2010-10-01 2012-05-09 魏载荣 Image acquisition unit, acquisition method, and associated control unit
CN103886566A (en) * 2014-03-18 2014-06-25 河海大学常州校区 Urban traffic dispatching system and method based on image fusion in severe weather
US20150022675A1 (en) * 2008-08-19 2015-01-22 Digimarc Corporation Image processing architectures and methods
CN105282498A (en) * 2014-06-18 2016-01-27 富士重工业株式会社 Image processing apparatus
CN105480227A (en) * 2015-12-29 2016-04-13 大连楼兰科技股份有限公司 Information fusion method based on infrared radar and video image in active driving technique
CN105667518A (en) * 2016-02-25 2016-06-15 福州华鹰重工机械有限公司 Lane detection method and device
CN105698812A (en) * 2016-01-15 2016-06-22 武汉光庭科技有限公司 Lane line detecting system and method based on safe driving map and cameras on two sides during automatic driving
CA2990317A1 (en) * 2015-06-16 2016-12-22 King Abdulaziz City Of Science And Technology Systems and methods for enhancing synthetic aperture radar imagery
CN106379319A (en) * 2016-10-13 2017-02-08 上汽大众汽车有限公司 Automobile driving assistance system and control method
CN106461774A (en) * 2014-02-20 2017-02-22 御眼视觉技术有限公司 Advanced driver assistance system based on radar-cued visual imaging
US20180196135A1 (en) * 2011-12-20 2018-07-12 Sadar 3D, Inc. Scanners, targets, and methods for surveying
CN108960183A (en) * 2018-07-19 2018-12-07 北京航空航天大学 A kind of bend target identification system and method based on Multi-sensor Fusion
CN109532826A (en) * 2017-09-21 2019-03-29 天津所托瑞安汽车科技有限公司 A kind of radar anticollision method for early warning based on the optimization of lane line Visual identification technology
CN109583280A (en) * 2017-09-29 2019-04-05 比亚迪股份有限公司 Lane detection method, apparatus, equipment and storage medium

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150022675A1 (en) * 2008-08-19 2015-01-22 Digimarc Corporation Image processing architectures and methods
WO2010105605A1 (en) * 2009-03-18 2010-09-23 Eads Deutschland Gmbh Method and device for determining aspect angle progression
CN102447911A (en) * 2010-10-01 2012-05-09 魏载荣 Image acquisition unit, acquisition method, and associated control unit
CN102303605A (en) * 2011-06-30 2012-01-04 中国汽车技术研究中心 Multi-sensor information fusion-based collision and departure pre-warning device and method
US20180196135A1 (en) * 2011-12-20 2018-07-12 Sadar 3D, Inc. Scanners, targets, and methods for surveying
CN106461774A (en) * 2014-02-20 2017-02-22 御眼视觉技术有限公司 Advanced driver assistance system based on radar-cued visual imaging
CN103886566A (en) * 2014-03-18 2014-06-25 河海大学常州校区 Urban traffic dispatching system and method based on image fusion in severe weather
CN105282498A (en) * 2014-06-18 2016-01-27 富士重工业株式会社 Image processing apparatus
CA2990317A1 (en) * 2015-06-16 2016-12-22 King Abdulaziz City Of Science And Technology Systems and methods for enhancing synthetic aperture radar imagery
CN105480227A (en) * 2015-12-29 2016-04-13 大连楼兰科技股份有限公司 Information fusion method based on infrared radar and video image in active driving technique
CN105698812A (en) * 2016-01-15 2016-06-22 武汉光庭科技有限公司 Lane line detecting system and method based on safe driving map and cameras on two sides during automatic driving
CN105667518A (en) * 2016-02-25 2016-06-15 福州华鹰重工机械有限公司 Lane detection method and device
CN106379319A (en) * 2016-10-13 2017-02-08 上汽大众汽车有限公司 Automobile driving assistance system and control method
CN109532826A (en) * 2017-09-21 2019-03-29 天津所托瑞安汽车科技有限公司 A kind of radar anticollision method for early warning based on the optimization of lane line Visual identification technology
CN109583280A (en) * 2017-09-29 2019-04-05 比亚迪股份有限公司 Lane detection method, apparatus, equipment and storage medium
CN108960183A (en) * 2018-07-19 2018-12-07 北京航空航天大学 A kind of bend target identification system and method based on Multi-sensor Fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨文 (Yang Wen) et al.: "Research on Target Recognition in Synthetic Aperture Radar Images", Spacecraft Recovery & Remote Sensing *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111323802A (en) * 2020-03-20 2020-06-23 北京百度网讯科技有限公司 Vehicle positioning method, device and equipment
CN111323802B (en) * 2020-03-20 2023-02-28 阿波罗智能技术(北京)有限公司 Intelligent driving vehicle positioning method, device and equipment
CN113494915A (en) * 2020-04-02 2021-10-12 广州汽车集团股份有限公司 Vehicle transverse positioning method, device and system
CN111524351A (en) * 2020-04-22 2020-08-11 东风汽车集团有限公司 Ramp speed limit identification method
CN112373474A (en) * 2020-11-23 2021-02-19 重庆长安汽车股份有限公司 Lane line fusion and transverse control method, system, vehicle and storage medium
CN112373474B (en) * 2020-11-23 2022-05-17 重庆长安汽车股份有限公司 Lane line fusion and transverse control method, system, vehicle and storage medium
CN113392762A (en) * 2021-06-15 2021-09-14 北京纵目安驰智能科技有限公司 Intersection detection method, system, terminal and computer readable storage medium
CN113392762B (en) * 2021-06-15 2024-04-26 北京纵目安驰智能科技有限公司 Intersection detection method, system, terminal and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN110781816A (en) Method, device, equipment and storage medium for transverse positioning of vehicle in lane
WO2018177026A1 (en) Device and method for determining road edge
CN109188438B (en) Yaw angle determination method, device, equipment and medium
US9713983B2 (en) Lane boundary line recognition apparatus and program for recognizing lane boundary line on roadway
US8180561B2 (en) Vehicle-installation obstacle detection apparatus
RU2667675C1 (en) Device for determining position of vehicle and method for determining position of vehicle
US20180288320A1 (en) Camera Fields of View for Object Detection
US8175331B2 (en) Vehicle surroundings monitoring apparatus, method, and program
US10762782B2 (en) On-street parking map generation
US20210073557A1 (en) Systems and methods for augmenting upright object detection
CN107389084B (en) Driving path planning method and storage medium
CN111797741A (en) Vehicle detection method, device, vehicle and storage medium
US20120212612A1 (en) Lane Departure Warning Apparatus and Lane Departure Warning System
JP2011118482A (en) In-vehicle device and recognition support system
US10325163B2 (en) Vehicle vision
CN110858405A (en) Attitude estimation method, device and system of vehicle-mounted camera and electronic equipment
JP2017207973A (en) Detector and detection method
CN110516621B (en) Method and device for detecting barrier-free driving area, vehicle and storage medium
CN111091037A (en) Method and device for determining driving information
CN111736153A (en) Environment detection system, method, apparatus, and medium for unmanned vehicle
CN112740225A (en) Method and device for determining road surface elements
CN114442101A (en) Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
CN115292435B (en) High-precision map updating method and device, electronic equipment and storage medium
JP2020197506A (en) Object detector for vehicles
US20230085455A1 (en) Vehicle condition estimation method, vehicle condition estimation device, and vehicle condition estimation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200211
