CN113569800A - Lane recognition and verification method and device, readable storage medium and electronic equipment

Lane recognition and verification method and device, readable storage medium and electronic equipment

Info

Publication number
CN113569800A
CN113569800A
Authority
CN
China
Prior art keywords
lane
target object
coordinate system
vehicle
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110909393.8A
Other languages
Chinese (zh)
Inventor
Ding Lei (丁垒)
Qi Lianjun (齐连军)
Ding Meikun (丁美昆)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Horizon Robotics Technology Research and Development Co Ltd
Original Assignee
Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd filed Critical Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN202110909393.8A priority Critical patent/CN113569800A/en
Publication of CN113569800A publication Critical patent/CN113569800A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures

Abstract

The embodiments of the disclosure disclose a lane identification and verification method and apparatus, a computer-readable storage medium, and an electronic device. The method includes the following steps: determining, in a pre-calibrated vehicle coordinate system, at least one first lane line on the road where the host vehicle is located and a first coordinate of a target object; determining, based on the first coordinate and the at least one first lane line, a first lane in which the target object is located in the vehicle coordinate system; determining at least one second lane line on the road and a second coordinate representing the position of the target object in the image coordinate system of a two-dimensional image captured of the target object; determining, based on the second coordinate and the second lane line, a second lane in which the target object is located in the image coordinate system; and verifying the first lane using the second lane to obtain the lane in which the target object is located. The embodiments of the disclosure use the two-dimensional image to verify the lane of the target object, improving the accuracy of lane detection for the target object.

Description

Lane recognition and verification method and device, readable storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technology, and in particular to a lane identification and verification method and apparatus, a computer-readable storage medium, and an electronic device.
Background
In visual-perception-based advanced driver assistance systems, whether the target output of the CIPV (Closest In-Path Vehicle) is stable has a great influence on system performance. The CIPV is the nearest target in the ego lane and can be a vehicle, a two-wheeler, a pedestrian, and so on. In applications such as FCW (Forward Collision Warning) and ACC (Adaptive Cruise Control), if the CIPV target is abnormally dropped due to jumps in the lane-line or target perception results, the FCW cannot issue correct warning information in time and the ACC unit cannot apply correct deceleration control, which easily leads to accidents. In addition, mis-selecting a target outside the ego lane causes false braking, which degrades the user experience. The stability and accuracy of the CIPV target output therefore directly determine the performance of the driver assistance system.
Disclosure of Invention
The embodiment of the disclosure provides a lane identification and verification method and device, a computer readable storage medium and electronic equipment.
The embodiments of the disclosure provide a lane identification and verification method, including the following steps: determining, in a pre-calibrated vehicle coordinate system, at least one first lane line on the road where the host vehicle is located and a first coordinate representing the position of a target object on the road; determining, based on the first coordinate and the at least one first lane line, a first lane in which the target object is located in the vehicle coordinate system; determining at least one second lane line on the road and a second coordinate representing the position of the target object in the image coordinate system of a two-dimensional image captured of the target object; determining, based on the second coordinate and the at least one second lane line, a second lane in which the target object is located in the image coordinate system; and verifying the first lane using the second lane to obtain the lane in which the target object is located.
According to another aspect of the embodiments of the present disclosure, there is provided a lane recognition and verification apparatus, including: a first determining module for determining, in a pre-calibrated vehicle coordinate system, at least one first lane line on the road where the host vehicle is located and a first coordinate representing the position of a target object on the road; a second determining module for determining, based on the first coordinate and the at least one first lane line, a first lane in which the target object is located in the vehicle coordinate system; a third determining module for determining at least one second lane line on the road and a second coordinate representing the position of the target object in the image coordinate system of a two-dimensional image captured of the target object; a fourth determining module for determining, based on the second coordinate and the at least one second lane line, a second lane in which the target object is located in the image coordinate system; and a verification module for verifying the first lane using the second lane to obtain the lane in which the target object is located.
According to another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for executing the above lane recognition checking method.
According to another aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing processor-executable instructions; and the processor is used for reading the executable instructions from the memory and executing the instructions to realize the lane identification and verification method.
Based on the lane identification and verification method and apparatus, the computer-readable storage medium, and the electronic device provided by the embodiments of the disclosure, at least one first lane line on the road is determined in the vehicle coordinate system and the first lane in which the target object is located is determined from it; at least one second lane line on the road is determined in the image coordinate system of a two-dimensional image captured of the target object and the second lane in which the target object is located is determined from it; finally, the first lane is verified using the second lane to obtain the lane in which the target object is located. The lane of the target object determined by the visual perception system is thus verified with the two-dimensional image, improving the accuracy of lane detection for the target object. In addition, according to the embodiments of the disclosure, no additional hardware such as extra sensors is needed to check the lane, so detection cost is reduced while the accuracy of lane detection improves.
The technical solution of the present disclosure is further described in detail by the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in more detail embodiments of the present disclosure with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure and not to limit the disclosure. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a system diagram to which the present disclosure is applicable.
Fig. 2 is a schematic flowchart of a lane identification checking method according to an exemplary embodiment of the present disclosure.
Fig. 3 is an exemplary schematic diagram of a vehicle coordinate system of the lane recognition checking method of the embodiment of the present disclosure.
Fig. 4 is an exemplary schematic view of a first lane line in a vehicle coordinate system of the lane identification checking method of the embodiment of the present disclosure.
Fig. 5 is an exemplary schematic diagram of a second lane where a target vehicle is located in an image coordinate system of the lane recognition checking method according to the embodiment of the present disclosure.
Fig. 6 is a flowchart illustrating a lane identification checking method according to another exemplary embodiment of the present disclosure.
Fig. 7 is a flowchart illustrating a lane identification checking method according to still another exemplary embodiment of the present disclosure.
Fig. 8 is a flowchart illustrating a lane identification checking method according to still another exemplary embodiment of the present disclosure.
Fig. 9 is a schematic structural diagram of a lane identification and verification device according to an exemplary embodiment of the present disclosure.
Fig. 10 is a schematic structural diagram of a lane identification and verification device according to another exemplary embodiment of the present disclosure.
Fig. 11 discloses a block diagram of an electronic device provided in an exemplary embodiment.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of the embodiments of the present disclosure and not all embodiments of the present disclosure, with the understanding that the present disclosure is not limited to the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
It will be understood by those skilled in the art that the terms "first," "second," and the like in the embodiments of the present disclosure are used merely to distinguish one element from another and imply neither a particular technical meaning nor any necessary logical order between them.
It is also understood that in embodiments of the present disclosure, "a plurality" may refer to two or more and "at least one" may refer to one, two or more.
It is also to be understood that any reference to any component, data, or structure in the embodiments of the disclosure, may be generally understood as one or more, unless explicitly defined otherwise or stated otherwise.
In addition, the term "and/or" in the present disclosure merely describes an association between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" in the present disclosure generally indicates an "or" relationship between the preceding and following associated objects.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and the same or similar parts may be referred to each other, so that the descriptions thereof are omitted for brevity.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
The disclosed embodiments may be applied to electronic devices such as terminal devices, computer systems, servers, etc., which are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with electronic devices such as terminal devices, computer systems, servers, and the like include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above systems, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
Summary of the application
Existing visual-perception-based CIPV recognition is carried out in the VCS (Vehicle Coordinate System) using collected obstacle information and lane line information. This approach places high demands on the lateral ranging accuracy for the target, and the lane of a target is easily misidentified on a curve ahead or when the detected lane lines are short, so the CIPV is mis-selected or missed.
Exemplary System
Fig. 1 illustrates an exemplary system architecture 100 to which a lane recognition verification method or lane recognition verification apparatus of an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include a terminal device 101, a network 102, a server 103, and a vehicle 104. Among them, the terminal apparatus 101 may be provided on the vehicle 104. Network 102 is the medium used to provide communication links between terminal devices 101 and server 103. Network 102 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use terminal device 101 to interact with server 103 over network 102 to receive or send messages and the like. Various communication client applications, such as an automatic driving application, a map-type application, a navigation-type application, and the like, may be installed on the terminal device 101.
The terminal device 101 may be various electronic devices including, but not limited to, devices such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle-mounted terminal (e.g., a car navigation terminal), and the like. The terminal apparatus 101 is generally provided on the vehicle 104.
The vehicle 104 may be provided with a camera 1041, and the camera 1041 may capture an image of the periphery of the vehicle.
The server 103 may be a server that provides various services, for example a background lane verification server that processes information such as images uploaded by the terminal device 101. The background lane verification server can use the received images to verify the lane of the target vehicle determined in the vehicle coordinate system and obtain the lane in which the target object is located.
It should be noted that the lane identification verification method provided by the embodiment of the present disclosure may be executed by the server 103 or the terminal device 101, and accordingly, the lane identification verification apparatus may be disposed in the server 103 or the terminal device 101. It should be further noted that the lane identification verification method provided by the embodiment of the present disclosure may also be executed by the terminal device and the server in combination (for example, the terminal device 101 determines the lane in which the target object is located in the vehicle coordinate system, and the server 103 verifies the lane in which the target object is located by using the two-dimensional image).
It should be understood that the number of terminal devices, networks, servers, and vehicles in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, servers, and vehicles, as desired for implementation. For example, when the lane recognition checking method is executed by the terminal device, or when the lane recognition checking means is provided on the terminal device, the above system architecture may include only the vehicle and the terminal device.
Exemplary method
Fig. 2 is a schematic flowchart of a lane identification checking method according to an exemplary embodiment of the present disclosure. The embodiment can be applied to an electronic device (such as the terminal device 101 or the server 103 shown in fig. 1), and as shown in fig. 2, the method includes the following steps:
step 201, under a pre-calibrated vehicle coordinate system, at least one first lane line on a road where the vehicle is located and first coordinates representing the position of a target object on the road are determined.
In this embodiment, the electronic device may determine, in a pre-calibrated vehicle coordinate system, at least one first lane line on the road where the host vehicle (e.g., the vehicle 104 shown in fig. 1) is located and a first coordinate representing the position of a target object on the road. The vehicle coordinate system may be a coordinate system established based on the host vehicle; for example, the center of the vehicle's rear axle may serve as its origin. As an example, the vehicle coordinate system may be an existing VCS (Vehicle Coordinate System) or a world coordinate system. In general, coordinates can be calibrated using images captured by the camera 1041 shown in fig. 1 to establish the correspondence between the image coordinate system of the two-dimensional image and the vehicle coordinate system, and the objects in the two-dimensional image can then be mapped into the vehicle coordinate system by visual perception. Optionally, the first coordinate of the target object in the vehicle coordinate system may instead be detected with equipment such as a binocular stereo camera or a lidar.
As shown in fig. 3, which gives an exemplary schematic view of a vehicle coordinate system, the x-axis of the vehicle coordinate system is parallel to the heading of the host vehicle, the y-axis is perpendicular to the heading, and the origin of the coordinate system is located at the center of the vehicle's rear axle.
The target object may be any of various types of objects, such as a vehicle, a pedestrian, an obstacle, or a road sign. In this embodiment, the target object may be an object within the detection range of the camera 1041 or other target detection devices shown in fig. 1, or an object within a preset range inside that detection range. The electronic device may determine the coordinates of the target object in the vehicle coordinate system as the first coordinate based on an existing target detection method. The first coordinate may correspond to any pre-specified location on the target object; for example, the coordinates of the geometric center of a rectangular frame containing the target object may be used as the first coordinate of the target vehicle.
The first lane line may be determined based on an existing lane line detection method, for example, a neural network is used to determine a position of a point included in the lane line in the two-dimensional image, and then a position of at least one lane line in the vehicle coordinate system is determined according to a correspondence between the two-dimensional image coordinate system and the vehicle coordinate system.
And step 202, determining a first lane where the target object is located in the vehicle coordinate system based on the first coordinate and the at least one first lane line.
In this embodiment, the electronic device may determine, based on the first coordinate and the at least one first lane line, the first lane in which the target object is located in the vehicle coordinate system. Specifically, the road may be divided into at least one lane based on the at least one first lane line, and the lane region in which the first coordinate falls is then determined, thereby determining the first lane in which the target object is located. The first lane may be recorded with a lane identification.
As an example, as shown in fig. 4, four first lane lines, labeled L1, L2, L3 and L4 respectively, are detected in the vehicle coordinate system, dividing the road into five lanes A1, A2, A3, A4 and A5. The lane in which the host vehicle 301 is located is A3. The first coordinate of the target vehicle 302, shown as dot 3021 in the figure, is located in lane A4.
In step 203, at least one second lane line on the road is determined and second coordinates representing the position of the target object are determined in an image coordinate system of the two-dimensional image taken of the target object.
In the present embodiment, the electronic apparatus may determine at least one second lane line on the road and determine second coordinates representing the position of the target object in an image coordinate system of a two-dimensional image taken of the target object. The electronic device may determine the at least one second lane line using various methods. For example, at least one second lane line may be determined from the two-dimensional image taken of the target object using an existing lane recognition method for the two-dimensional image. Or, the at least one first lane line may be mapped to the image coordinate system based on a corresponding relationship between the image coordinate system and the vehicle coordinate system, so as to obtain the at least one second lane line.
The electronic device may determine the second coordinates representing the position of the target object using an existing target detection method for a two-dimensional image. As an example, the coordinates of the geometric center of a rectangular frame containing the target object may be determined as the second coordinates in the two-dimensional image.
And 204, determining a second lane where the target object is located in the image coordinate system based on the second coordinate and the at least one second lane line.
In this embodiment, the electronic device may determine the second lane in which the target object is located in the image coordinate system based on the second coordinate and the at least one second lane line. Specifically, the road in the two-dimensional image may be divided into at least one lane based on the at least one second lane line, and the lane region in which the second coordinate falls is then determined, thereby determining the second lane in which the target object is located. The second lane may be recorded with a lane identification.
As shown in fig. 5, the upper left corner of the two-dimensional image is the origin of the image coordinate system, the horizontal direction is the x-axis, and the vertical direction is the y-axis. Two second lane lines, L2' and L3', are shown in the two-dimensional image, corresponding to L2 and L3 in fig. 4, respectively. The position of the target vehicle 501 may be represented by the coordinates of the geometric center of the rectangular frame 502, and from the lanes divided by the second lane lines the target object is determined to be in lane A4, that is, the right lane of the lane in which the host vehicle is located.
And step 205, checking the first lane by using the second lane to obtain the lane where the target object is located.
In this embodiment, the electronic device may check the first lane by using the second lane to obtain the lane where the target object is located. As an example, the second lane may be determined as the lane in which the target object is located when the identification of the first lane and the identification of the second lane are different.
The method provided by the above embodiment of the disclosure determines at least one first lane line on the road in the vehicle coordinate system and, from it, the first lane in which the target object is located; it then determines at least one second lane line on the road in the image coordinate system of a two-dimensional image captured of the target object and, from it, the second lane in which the target object is located; finally, it verifies the first lane using the second lane to obtain the lane in which the target object is located. The lane of the target object determined by the visual perception system is thus verified with the two-dimensional image, improving the accuracy of lane detection for the target object. In addition, according to the embodiments of the disclosure, no additional hardware such as extra sensors is needed to check the lane, so detection cost is reduced while the accuracy of lane detection improves.
In some alternative implementations, as shown in fig. 6, step 203 may include the following sub-steps:
step 2031, based on a preset sampling interval, a set of sampling points is extracted from at least one first lane line.
In general, sampling points may be extracted from each first lane line according to the coordinate system shown in fig. 3, with coordinates (x1, y1), (x2, y2), …, (xi, yi), ….
Step 2032, based on the preset coordinate transformation parameters characterizing the transformation relationship between the vehicle coordinate system and the image coordinate system, mapping the sampling points included in the sampling point set to the image coordinate system to obtain a mapping point set.
The coordinate conversion parameters may be pre-calibrated. For example, the vehicle coordinate system may be the same as the camera coordinate system of the camera mounted on the vehicle, and the coordinate conversion parameters may be camera parameters calibrated in advance.
Step 2033, determining at least one second lane line based on the set of mapping points.
Specifically, at least one curve may be fitted as the second lane line according to coordinates of points included in the mapped point set.
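As a rough sketch of sub-steps 2031, 2032 and 2033, the following assumes a pinhole camera model with an intrinsic matrix K and a rigid transform (R, t) from the vehicle coordinate system to the camera frame; these names, the sampling range, and the flat-road assumption are illustrative and not taken from the patent text.

```python
import numpy as np

def project_lane_line_to_image(coeffs, K, R, t, x_range=(0.0, 80.0), step=1.0):
    """Sample a first lane line y = C0 + C1*x + C2*x^2 + C3*x^3 in the
    vehicle coordinate system and map the samples into the image plane."""
    c0, c1, c2, c3 = coeffs
    xs = np.arange(x_range[0], x_range[1], step)      # preset sampling interval (step 2031)
    ys = c0 + c1 * xs + c2 * xs ** 2 + c3 * xs ** 3
    pts_vcs = np.stack([xs, ys, np.zeros_like(xs)])   # flat-road assumption: z = 0
    pts_cam = R @ pts_vcs + t.reshape(3, 1)           # vehicle frame -> camera frame (step 2032)
    proj = K @ pts_cam                                # pinhole projection
    uv = proj[:2] / proj[2]
    return uv.T                                       # N x 2 set of mapped points

# Step 2033: fit a curve through the mapped points as the second lane line, e.g.
# u_coeffs = np.polyfit(mapped[:, 1], mapped[:, 0], 3)
```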
In the embodiment corresponding to fig. 6, extracting a set of sampling points from the first lane line, mapping it into the image coordinate system, and generating the second lane line from the set of mapped points reduces the amount of data needed to determine and generate the second lane line. Moreover, since the second lane line is obtained by mapping points on the first lane line into the image coordinate system, the discrepancy between the lane lines and lanes in the vehicle coordinate system and the image coordinate system is reduced, improving the accuracy of lane verification.
In some alternative implementations, in step 203, the second coordinates representing the position of the target object may be determined as follows:
first, in a two-dimensional image, a block diagram containing a preset shape of a target object is determined. The preset shape may be any shape, such as a rectangle, a circle, and the like.
Then, on the block diagram, a preset number of second coordinates representing a preset number of points is determined.
The preset number may be any number, such as three. The second coordinate may be selected in various ways, for example, coordinates of a predetermined number of points are extracted uniformly from a central line parallel to the top and bottom sides of the rectangular block as the second coordinate.
According to this implementation, a bounding box containing the target object is determined in the two-dimensional image and a preset number of second coordinates are determined from it. When the lane of the target vehicle is subsequently determined, the lane in which most of the target object lies can be judged more accurately from these second coordinates, improving the accuracy of the lane check.
In some alternative implementations, the electronic device may determine a preset number of second coordinates representing a preset number of points on the bounding box as follows:
determine the second coordinates corresponding to a preset number of points evenly distributed along the bottom edge of the bounding box.
As an example, as shown in fig. 5, the preset number is three, and the 1/4, 2/4 and 3/4 points of the bottom edge are selected in sequence from left to right and labeled P1, P2 and P3, respectively.
Since the bottom edge of the box in the two-dimensional image corresponds most closely to the part of the target object nearest the host vehicle, it represents the position of the target object most accurately, and points extracted from the bottom edge can be used for a more accurate lane check of the target object.
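A minimal sketch of this point selection, assuming the bottom edge is given by its end points as in the example above (the function name and argument layout are illustrative):

```python
def bottom_edge_points(start_x, end_x, end_y, n=3):
    """Return n evenly spaced points on the bottom edge of the box;
    with n = 3 these are the 1/4, 2/4 and 3/4 points P1, P2, P3."""
    fractions = [(i + 1) / (n + 1) for i in range(n)]
    return [(start_x + f * (end_x - start_x), end_y) for f in fractions]

# e.g. bottom_edge_points(100, 300, 400) -> [(150.0, 400), (200.0, 400), (250.0, 400)]
```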
In some optional implementations, determining a second lane in which the target object in the image coordinate system is located based on the second coordinates and the at least one second lane line includes:
and determining a second lane where the target object is located in the image coordinate system based on the relative position relation between the second coordinates respectively corresponding to the preset number of points and at least one second lane line.
Specifically, the lateral components of a preset number of second coordinates (e.g., the components on the x-axis shown in fig. 5) may be compared with the lateral component of the corresponding point on the at least one second lane line (e.g., the intersection of the extension line of the bottom side and the at least one second lane line), so as to determine the second lane in which the target object is located.
As an example, as shown in fig. 5, the preset number is three, the left end point of the bottom edge of the rectangular frame has coordinates (start_x, start_y), and the right end point has coordinates (end_x, end_y), where start_y = end_y. The mapped point on the left lane line L2' whose y value is closest to start_y is denoted s1, and the mapped point on the right lane line L3' whose y value is closest to start_y is denoted s2. If p1.x > s1.x and p2.x < s2.x, the lane in which the target object is located corresponds to lane A3 in fig. 4, the lane of the host vehicle. If p2.x < s1.x, most of the target object is in the left lane of the host vehicle, i.e., the lane corresponding to A2 in fig. 4. If p3.x > s2.x, most of the target object is in the right lane of the host vehicle, i.e., the lane corresponding to A4 in fig. 4.
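The comparison in this example can be sketched as follows; the inputs are the bottom-edge points p1, p2, p3 and the mapped points of the ego lane's left and right lane lines L2' and L3', each as (x, y) pixel pairs. The fall-through case, which the example leaves open, simply keeps the first lane here.

```python
def nearest_by_y(line_points, row):
    """Mapped lane-line point whose image y value is closest to `row`."""
    return min(line_points, key=lambda p: abs(p[1] - row))

def second_lane(p1, p2, p3, left_line_points, right_line_points):
    """Apply the comparison rule above to the bottom-edge points P1-P3
    and the mapped ego-lane lines L2' and L3'."""
    row = p1[1]                                   # start_y == end_y on the bottom edge
    s1 = nearest_by_y(left_line_points, row)      # point s1 on L2'
    s2 = nearest_by_y(right_line_points, row)     # point s2 on L3'
    if p1[0] > s1[0] and p2[0] < s2[0]:
        return "A3"                               # host vehicle's lane
    if p2[0] < s1[0]:
        return "A2"                               # left lane
    if p3[0] > s2[0]:
        return "A4"                               # right lane
    return None                                   # ambiguous: keep the first lane
```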
This implementation determines which lane most of the target object occupies by establishing the relative positions of each second coordinate and the second lane lines, so the lane in which the target object is located can be determined more accurately in the two-dimensional image.
In some alternative implementations, step 205 may be performed as follows:
first, it is determined whether the first lane and the second lane belong to the key lanes based on the positional relationship between the lane in which the host vehicle is located and the first lane and the second lane.
The key lanes comprise the lane in which the host vehicle is located, the left lane of the host vehicle, and the right lane of the host vehicle. As shown in fig. 4, if the host vehicle is in lane A3, lanes A2, A3 and A4 are the key lanes.
And then, if the first lane and the second lane belong to the key lane and the first lane and the second lane are not coincident, determining the second lane as the lane where the target object is located.
Specifically, when the target object is determined to be in the left or right lane of the host vehicle in the vehicle coordinate system but in the host vehicle's lane in the two-dimensional image coordinate system, the lane of the target object is finally determined to be the same as the host vehicle's lane. When the target object is determined to be in the host vehicle's lane in the vehicle coordinate system but in the left (or right) lane of the host vehicle in the two-dimensional image coordinate system, the lane of the target object is finally determined to be the left (or right) lane of the host vehicle.
It should be noted that when the first lane and/or the second lane does not belong to the key lanes, the first lane is kept as the final lane in which the target object is located.
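A minimal sketch of this check, writing the key lanes as the lane labels of fig. 4:

```python
KEY_LANES = {"A2", "A3", "A4"}   # host lane plus its left and right neighbours

def verify_lane(first_lane, second_lane):
    """The second lane overrides the first only when both are key lanes
    and they disagree; otherwise the first lane is kept."""
    if first_lane in KEY_LANES and second_lane in KEY_LANES and first_lane != second_lane:
        return second_lane
    return first_lane
```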
According to this implementation, when the first lane determined in the vehicle coordinate system and the second lane determined in the image coordinate system both belong to the key lanes but do not coincide, the second lane is finally taken as the lane of the target object. Thus, when the target object is close to the host vehicle's lane, its lane is corrected with the two-dimensional image, so the vehicle can be controlled accurately and in time to avoid colliding with the target object, improving driving safety.
In some alternative implementations, as shown in fig. 7, step 201 may include the following sub-steps:
in step 2011, at least one set of lane line coefficients is determined in a pre-calibrated vehicle coordinate system.
Specifically, in the vehicle coordinate system shown in fig. 3, the electronic device may fit at least one set of lane line coefficients to a preset curve form (e.g., a cubic curve equation) using a number of points collected from each identified lane line, thereby obtaining a lane line equation for each lane line.
Step 2012, at least one lane line equation representing at least one first lane line is determined based on the at least one set of lane line coefficients.
As an example, if the lane line equation employs a cubic curve equation, the lane line equation is as follows:
y = C0 + C1·x + C2·x² + C3·x³
where C0, C1, C2 and C3 are the lane line coefficients. If there are four lane lines as shown in fig. 4, four sets of lane line coefficients can be determined, resulting in four lane line equations.
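A minimal sketch of steps 2011 and 2012 using a numpy least-squares fit; the sample points are hypothetical:

```python
import numpy as np

# hypothetical points collected from one identified lane line (vehicle coordinate system, meters)
xs = np.array([5.0, 10.0, 20.0, 35.0, 50.0])
ys = np.array([1.74, 1.76, 1.83, 1.98, 2.21])

# least-squares fit of y = C0 + C1*x + C2*x^2 + C3*x^3 (step 2011);
# np.polyfit returns the highest-order coefficient first
C3, C2, C1, C0 = np.polyfit(xs, ys, 3)

def lane_line(x):
    """Lane line equation of step 2012."""
    return C0 + C1 * x + C2 * x ** 2 + C3 * x ** 3
```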
Based on the steps 2011 and 2012, as shown in fig. 7, the step 202 may include the following sub-steps:
step 2021, substituting the first coordinate components included in the first coordinate into at least one lane line equation to obtain at least one point located on at least one lane line.
The first coordinate component may be the x-axis component in the vehicle coordinate system shown in fig. 3, and a point on each lane line can be obtained by substituting the x value of the first coordinate (x, y) into each lane line equation. For example, with four lane line equations, substituting the x value yields y1, y2, y3 and y4, i.e., the coordinates of four points on the four lane lines: (x, y1), (x, y2), (x, y3) and (x, y4). Here y1, y2, y3 and y4 correspond to L2, L3, L1 and L4 of fig. 4, respectively.
Step 2022, determining the first lane in which the target object is located based on the second coordinate component included in the first coordinate and the second coordinate component included in each of the at least one point.
The second coordinate component is the y-axis component in the vehicle coordinate system shown in fig. 3. Comparing the y value of the first coordinate (x, y) with the second coordinate components of the points obtained in step 2021 determines the first lane in which the target object is located.
Continuing the example of step 2021 in conjunction with the lanes shown in fig. 4 (where y3 > y1 > y2 > y4, since L1, L2, L3 and L4 run from left to right):
if y1 ≥ y ≥ y2, the first lane is labeled A3, i.e., the first lane is the lane of the host vehicle;
if y3 ≥ y > y1, the first lane is labeled A2, i.e., the first lane is the left lane of the host vehicle;
if y2 > y ≥ y4, the first lane is labeled A4, i.e., the first lane is the right lane of the host vehicle;
if y > y3, the first lane is labeled A1, i.e., the first lane is the second lane to the left of the host vehicle;
if y4 > y, the first lane is labeled A5, i.e., the first lane is the second lane to the right of the host vehicle.
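Written out as code, the comparisons above become (assuming, as the example implies, y3 > y1 > y2 > y4):

```python
def first_lane(y, y1, y2, y3, y4):
    """Assign the first lane from the target's lateral coordinate y and
    the lane-line values at the target's x (y1..y4 from L2, L3, L1, L4)."""
    if y > y3:
        return "A1"      # second lane to the left of the host vehicle
    if y3 >= y > y1:
        return "A2"      # left lane
    if y1 >= y >= y2:
        return "A3"      # host vehicle's lane
    if y2 > y >= y4:
        return "A4"      # right lane
    return "A5"          # second lane to the right (y4 > y)
```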
According to this implementation, determining at least one lane line equation representing the at least one first lane line yields complete lane lines in the vehicle coordinate system, reducing the risk of lane detection errors caused by incompletely identified lane lines and improving the accuracy of lane detection and verification.
In some optional implementations, as shown in fig. 8, after step 205, the method may further include the steps of:
step 206, determine the number of target objects.
Step 207, if there are at least two target objects, determining a target object which is located in the same lane as the host vehicle and has the closest distance to the host vehicle from the at least two target objects.
Since coordinates representing each target object (which may be coordinates in the image coordinate system or in the vehicle coordinate system) are determined when the target objects are detected, the distance between each target object and the host vehicle can be calculated. The target object determined here is the CIPV described in the background above.
Step 208, information characterizing the determined target object is output.
The output information may take various forms, including but not limited to at least one of: a box displayed in the two-dimensional image, characters, symbols, and the like.
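A sketch of steps 206 to 208, assuming each verified target carries its lane label and its coordinate in the vehicle coordinate system (the dictionary keys are illustrative):

```python
import math

def select_cipv(targets, host_lane="A3"):
    """Among the targets verified to share the host vehicle's lane,
    return the nearest one as the CIPV (None if the lane is empty)."""
    in_lane = [t for t in targets if t["lane"] == host_lane]
    if not in_lane:
        return None
    return min(in_lane, key=lambda t: math.hypot(t["x"], t["y"]))
```

The information characterizing the selected target (step 208) can then be rendered, for example, as a box in the two-dimensional image.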
The lane identification and verification method provided by the embodiments of the disclosure can finally output information that accurately characterizes the CIPV target object, which helps improve the accuracy of vehicle control based on the CIPV. For example, it reduces false braking in an ADAS, and when the vehicle faces a collision risk it is accurately controlled to decelerate in advance, reducing the collision risk and improving the safety of the ADAS.
Exemplary devices
Fig. 9 is a schematic structural diagram of a lane identification and verification device according to an exemplary embodiment of the present disclosure. The present embodiment can be applied to an electronic device, and as shown in fig. 9, the lane identification and verification apparatus includes: a first determining module 901, configured to determine, in a pre-calibrated vehicle coordinate system, at least one first lane line on a road where the host vehicle is located and a first coordinate indicating a position of a target object on the road; a second determining module 902, configured to determine, based on the first coordinate and the at least one first lane line, a first lane in which the target object in the vehicle coordinate system is located; a third determining module 903 for determining at least one second lane line on the road and second coordinates representing a position of the target object in an image coordinate system of the two-dimensional image photographed for the target object; a fourth determining module 904, configured to determine, based on the second coordinate and the at least one second lane line, a second lane in which the target object in the image coordinate system is located; the checking module 905 is configured to check the first lane by using the second lane to obtain a lane where the target object is located.
In this embodiment, the first determining module 901 may determine, in a pre-calibrated vehicle coordinate system, at least one first lane line on the road where the host vehicle (e.g., the vehicle 104 shown in fig. 1) is located and a first coordinate representing the position of a target object on the road. The vehicle coordinate system may be a coordinate system established based on the host vehicle; for example, the center of the vehicle's rear axle may serve as its origin. As an example, the vehicle coordinate system may be an existing VCS (Vehicle Coordinate System) or a world coordinate system. In general, coordinates can be calibrated using images captured by the camera 1041 shown in fig. 1 to establish the correspondence between the image coordinate system of the two-dimensional image and the vehicle coordinate system, and the objects in the two-dimensional image can then be mapped into the vehicle coordinate system by visual perception. Optionally, the first coordinate of the target object in the vehicle coordinate system may instead be detected with equipment such as a binocular stereo camera or a lidar.
The target object may be any of various types of objects, such as a vehicle, a pedestrian, an obstacle, or a road sign. In this embodiment, the target object may be an object within the detection range of the camera 1041 or other target detection devices shown in fig. 1, or an object within a preset range inside that detection range. The first determining module 901 may determine the coordinates of the target object in the vehicle coordinate system as the first coordinate based on an existing target detection method. The first coordinate may correspond to any pre-specified location on the target object; for example, the coordinates of the geometric center of a rectangular frame containing the target object may be used as the first coordinate of the target vehicle.
The first lane line may be determined based on an existing lane line detection method, for example, a neural network is used to determine a position of a point included in the lane line in the two-dimensional image, and then a position of at least one lane line in the vehicle coordinate system is determined according to a correspondence between the two-dimensional image coordinate system and the vehicle coordinate system.
In this embodiment, the second determining module 902 may determine the first lane in which the target object is located in the vehicle coordinate system based on the first coordinate and the at least one first lane line. Specifically, the road may be divided into at least one lane based on at least one first lane line, and an area of the lane of the first coordinate is further determined, so as to determine the first lane in which the target object is located. The first lane may be recorded with a lane identification.
In this embodiment, the third determining module 903 may determine at least one second lane line on the road and determine second coordinates representing the position of the target object in an image coordinate system of a two-dimensional image taken of the target object. The third determination module 903 may determine the at least one second lane line using various methods. For example, at least one second lane line may be determined from the two-dimensional image taken of the target object using an existing lane recognition method for the two-dimensional image. Or, the at least one first lane line may be mapped to the image coordinate system based on a corresponding relationship between the image coordinate system and the vehicle coordinate system, so as to obtain the at least one second lane line.
The third determining module 903 may determine the second coordinates representing the position of the target object using an existing target detection method for a two-dimensional image. As an example, the coordinates of the geometric center of a rectangular frame containing the target object may be determined as the second coordinates in the two-dimensional image.
In this embodiment, the fourth determining module 904 may determine the second lane in which the target object is located in the image coordinate system based on the second coordinate and the at least one second lane line. Specifically, the road in the two-dimensional image may be divided into at least one lane based on at least one second lane line, and the area of the lane of the second coordinate may be further determined, so as to determine the second lane in which the target object is located. The second lane may be recorded with a lane marker.
In this embodiment, the checking module 905 may check the first lane by using the second lane to obtain the lane where the target object is located. As an example, the second lane may be determined as the lane in which the target object is located when the identification of the first lane and the identification of the second lane are different.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a lane identification and verification apparatus according to another exemplary embodiment of the present disclosure.
In some optional implementations, the third determining module 903 may include: an extracting unit 9031, configured to extract a set of sampling points from at least one first lane line based on a preset sampling interval; the mapping unit 9032 is configured to map sampling points included in the sampling point set to an image coordinate system based on a preset coordinate conversion parameter representing a conversion relationship between a vehicle coordinate system and the image coordinate system, so as to obtain a mapping point set; a first determining unit 9033, configured to determine at least one second lane line based on the set of mapping points.
In some optional implementations, the third determining module 903 may include: a second determining unit 9034 configured to determine, in the two-dimensional image, a bounding box of a preset shape containing the target object; a third determining unit 9035 configured to determine, on the bounding box, a preset number of second coordinates representing a preset number of points.
In some optional implementations, the third determining unit 9035 may be further configured to: determine the second coordinates corresponding to a preset number of points evenly distributed along the bottom edge of the bounding box.
In some optional implementations, the fourth determining module 904 may be further configured to: and determining a second lane where the target object is located in the image coordinate system based on the relative position relation between the second coordinates respectively corresponding to the preset number of points and at least one second lane line.
In some optional implementations, the checking module 905 may include: a fourth determining unit 9051, configured to determine, based on a positional relationship between a lane where the host vehicle is located and the first lane and the second lane, whether the first lane and the second lane belong to key lanes, where the key lanes include the lane where the host vehicle is located, a left lane of the host vehicle, and a right lane of the host vehicle; a fifth determining unit 9052, configured to determine the second lane as the lane where the target object is located if the first lane and the second lane belong to the key lane and the first lane and the second lane are not coincident.
In some optional implementations, the first determining module 901 may include: a sixth determining unit 9011, configured to determine at least one set of lane line coefficients in a pre-calibrated vehicle coordinate system; a seventh determining unit 9012, configured to determine, based on the at least one set of lane line coefficients, at least one lane line equation that represents at least one first lane line; the second determining module 902 may include: a calculating unit 9021, configured to substitute first coordinate components included in the first coordinates into at least one lane line equation, respectively, to obtain at least one point located on at least one lane line; an eighth determining unit 9022, configured to determine the first lane in which the target object is located based on the second coordinate component included in the first coordinate and the second coordinate components included in the at least one point, respectively.
In some optional implementations, the apparatus may further include: a fifth determining module 906 for determining the number of target objects; a sixth determining module 907, configured to determine, if there are at least two target objects, a target object that is located in the same lane as the host vehicle and is closest to the host vehicle from among the at least two target objects; an output module 908 for outputting information characterizing the determined target object.
The lane identification and verification apparatus provided by the above embodiment of the disclosure determines at least one first lane line on the road in the vehicle coordinate system and, from it, the first lane in which the target object is located; it then determines at least one second lane line on the road in the image coordinate system of a two-dimensional image captured of the target object and, from it, the second lane in which the target object is located; finally, it verifies the first lane using the second lane to obtain the lane in which the target object is located. The lane of the target object determined by the visual perception system is thus verified with the two-dimensional image, improving the accuracy of lane detection for the target object. In addition, according to the embodiments of the disclosure, no additional hardware such as extra sensors is needed to check the lane, so detection cost is reduced while the accuracy of lane detection improves.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present disclosure is described with reference to fig. 11. The electronic device may be either or both of the terminal device 101 and the server 103 as shown in fig. 1, or a stand-alone device separate from them, which may communicate with the terminal device 101 and the server 103 to receive the collected input signals therefrom.
FIG. 11 illustrates a block diagram of an electronic device in accordance with an embodiment of the disclosure.
As shown in fig. 11, the electronic device 1100 includes one or more processors 1101 and memory 1102.
The processor 1101 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 1100 to perform desired functions.
Memory 1102 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory may include, for example, random access memory (RAM) and/or cache memory. Non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 1101 to implement the lane identification and verification methods of the various embodiments of the present disclosure described above and/or other desired functions. Various contents such as two-dimensional images and coordinates may also be stored in the computer-readable storage medium.
In one example, the electronic device 1100 may further include: an input device 1103 and an output device 1104, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
For example, when the electronic device is the terminal device 101 or the server 103, the input device 1103 may be a device such as a camera, a mouse, and a keyboard, and is used to input contents such as a two-dimensional image and various commands. When the electronic device is a stand-alone device, the input device 1103 may be a communication network connector for receiving contents of two-dimensional images, various commands, and the like input from the terminal device 101 and the server 103.
The output device 1104 may output various information including the determined lane in which the target object is located to the outside. The output devices 1104 may include, for example, a display, speakers, printer, and remote output device connected to a communication network or the like.
Of course, for simplicity, only some of the components of the electronic device 1100 relevant to the present disclosure are shown in fig. 11, omitting components such as buses, input/output interfaces, and the like. In addition, electronic device 1100 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage Medium
In addition to the above-described methods and apparatus, embodiments of the present disclosure may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the lane identification verification method according to various embodiments of the present disclosure described in the "exemplary methods" section of this specification above.
The computer program product may write program code for carrying out operations of embodiments of the present disclosure in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present disclosure may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the lane identification verification method according to various embodiments of the present disclosure described in the "exemplary methods" section above in this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present disclosure in conjunction with specific embodiments. It should be noted, however, that the advantages, effects, and the like mentioned in the present disclosure are merely examples rather than limitations, and should not be considered essential to the various embodiments of the present disclosure. The foregoing disclosure of specific details is for purposes of illustration and ease of understanding only, and is not intended to limit the present disclosure to those specific details.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and reference may be made between the embodiments for the same or similar parts. Since the system embodiments substantially correspond to the method embodiments, their description is relatively brief, and reference may be made to the corresponding parts of the method embodiments for relevant details.
The block diagrams of devices, apparatuses, and systems referred to in this disclosure are given only as illustrative examples, and are not intended to require or imply that connections, arrangements, or configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, or configured in any manner. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and may be used interchangeably therewith. The word "or" as used herein refers to, and may be used interchangeably with, the term "and/or," unless the context clearly dictates otherwise. The word "such as" as used herein refers to, and may be used interchangeably with, the phrase "such as, but not limited to."
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustration only, and the steps of the method of the present disclosure are not limited to the order specifically described above unless specifically stated otherwise. Further, in some embodiments, the present disclosure may also be embodied as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the devices, apparatuses, and methods of the present disclosure, each component or step can be decomposed and/or recombined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the disclosure to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. A lane identification verification method, comprising:
determining at least one first lane line on a road where the vehicle is located and a first coordinate representing the position of a target object on the road under a pre-calibrated vehicle coordinate system;
determining a first lane in which the target object is located in the vehicle coordinate system based on the first coordinate and the at least one first lane line;
determining at least one second lane line on the road and second coordinates representing a position of the target object in an image coordinate system of a two-dimensional image taken of the target object;
determining a second lane in which the target object is located in the image coordinate system based on the second coordinate and the at least one second lane line;
and verifying the first lane by using the second lane to obtain the lane where the target object is located.
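By way of illustration only, and not as part of the claimed subject matter, the flow of claim 1 may be sketched in Python as follows. The zero-based lane-index convention, the toy coordinate values, and the simplification of step 5 to "prefer the image-frame result upon disagreement" are assumptions of the sketch rather than limitations of the claim:

    import numpy as np

    def lane_of(lateral, line_laterals):
        # Lane index = number of lane lines lying to the left of the object.
        return int(np.searchsorted(np.sort(line_laterals), lateral))

    # Steps 1-2, vehicle coordinate system: lateral positions (metres) of three
    # first lane lines at the object's longitudinal distance, then the object's
    # own lateral coordinate.
    first_lane = lane_of(-0.8, np.array([-5.2, -1.7, 1.8]))
    # Steps 3-4, image coordinate system: pixel columns of the second lane
    # lines at the object's image row, then the object's pixel column.
    second_lane = lane_of(640.0, np.array([310.0, 590.0, 880.0]))
    # Step 5, verification: here both estimates agree on lane index 2.
    lane = second_lane if second_lane != first_lane else first_lane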
2. The method of claim 1, wherein said determining at least one second lane line on the road in an image coordinate system of a two-dimensional image taken of the target object comprises:
extracting a sampling point set from the at least one first lane line based on a preset sampling interval;
mapping sampling points included in the sampling point set to the image coordinate system based on a preset coordinate conversion parameter representing a conversion relation between the vehicle coordinate system and the image coordinate system to obtain a mapping point set;
determining the at least one second lane line based on the set of mapped points.
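By way of illustration only, and not as part of the claimed subject matter, claim 2 may be sketched as sampling a vehicle-frame lane-line polynomial at a preset interval and mapping the samples into the image through a pinhole camera model; the intrinsic matrix K and the extrinsic rotation R and translation t stand in for the "preset coordinate conversion parameter" and are assumptions of the sketch. A second lane line can then be fitted through the returned mapping points:

    import numpy as np

    def map_lane_line_to_image(coeffs, K, R, t, x_max=80.0, step=2.0):
        xs = np.arange(0.0, x_max, step)                     # preset sampling interval
        ys = np.polyval(coeffs, xs)                          # lateral offset of the line
        pts_vehicle = np.stack([xs, ys, np.zeros_like(xs)])  # sampling points on the road plane
        pts_camera = R @ pts_vehicle + t.reshape(3, 1)       # vehicle frame -> camera frame
        uv = K @ pts_camera                                  # camera frame -> homogeneous pixels
        return (uv[:2] / uv[2]).T                            # set of mapping points (u, v)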
3. The method of claim 1, wherein the determining second coordinates representing the position of the target object comprises:
determining, in the two-dimensional image, a bounding box of a preset shape containing the target object;
determining, on the bounding box, a preset number of second coordinates representing a preset number of points.
4. The method of claim 3, wherein the determining, on the bounding box, a preset number of second coordinates representing a preset number of points comprises:
determining second coordinates respectively corresponding to a preset number of uniformly distributed points on a bottom edge of the bounding box.
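By way of illustration only, and not as part of the claimed subject matter, claims 3 and 4 may be sketched with the bounding box taken as an axis-aligned pixel rectangle (x1, y1, x2, y2), an assumption of the sketch; the bottom edge is the natural choice because it is where the target object meets the road surface:

    import numpy as np

    def bottom_edge_points(box, n=5):
        # n uniformly distributed points on the bottom edge of the box.
        x1, y1, x2, y2 = box
        return [(float(u), y2) for u in np.linspace(x1, x2, n)]

    print(bottom_edge_points((100.0, 200.0, 180.0, 260.0), n=3))
    # [(100.0, 260.0), (140.0, 260.0), (180.0, 260.0)]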
5. The method of claim 1, wherein the verifying the first lane with the second lane to obtain the lane in which the target object is located comprises:
determining whether the first lane and the second lane belong to key lanes based on positional relationships between the lane where the vehicle is located and each of the first lane and the second lane, wherein the key lanes comprise the lane where the vehicle is located, a left lane of the vehicle, and a right lane of the vehicle;
and if the first lane and the second lane both belong to the key lanes and the first lane and the second lane do not coincide, determining the second lane as the lane where the target object is located.
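By way of illustration only, and not as part of the claimed subject matter, the claim-5 rule may be sketched as follows; labelling lanes relative to the vehicle and keeping the first lane whenever the key-lane condition is not met are assumptions of the sketch:

    KEY_LANES = {"ego", "left", "right"}  # the vehicle's lane and its two neighbours

    def checked_lane(first_lane, second_lane):
        # When both estimates fall in key lanes but do not coincide, the
        # image-frame result (the second lane) is taken for the target object.
        if (first_lane in KEY_LANES and second_lane in KEY_LANES
                and first_lane != second_lane):
            return second_lane
        return first_lane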
6. The method of claim 1, wherein the determining at least one first lane line on the road on which the host vehicle is located under a pre-calibrated vehicle coordinate system comprises:
determining at least one group of lane line coefficients under a pre-calibrated vehicle coordinate system;
determining at least one lane line equation representing the at least one first lane line based on the at least one set of lane line coefficients;
the determining, based on the first coordinate and the at least one first lane line, a first lane in which the target object is located in the vehicle coordinate system includes:
substituting a first coordinate component included in the first coordinate into each of the at least one lane line equation, respectively, to obtain at least one point located on the at least one first lane line;
and determining the first lane where the target object is located based on a second coordinate component included in the first coordinate and a second coordinate component included in each of the at least one point.
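By way of illustration only, and not as part of the claimed subject matter, claim 6 may be sketched with each first lane line modelled as a polynomial y = f(x) in the vehicle frame, where x is the longitudinal (first) coordinate component and y the lateral (second) component; this axis convention and the toy coefficients are assumptions of the sketch:

    import numpy as np

    def first_lane_from_equations(first_coord, lane_line_coeffs):
        x, y = first_coord
        # Substitute x into every lane line equation to get the lateral
        # position of each line at the object's distance, then count the
        # lines lying to the object's left.
        laterals = np.sort([np.polyval(c, x) for c in lane_line_coeffs])
        return int(np.searchsorted(laterals, y))

    # Three cubic lane lines; target object at x = 30 m, y = -0.5 m.
    coeffs = [[0.0, 0.001, 0.01, -5.3],
              [0.0, 0.001, 0.01, -1.8],
              [0.0, 0.001, 0.01,  1.7]]
    print(first_lane_from_equations((30.0, -0.5), coeffs))  # -> 2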
7. The method of any of claims 1-6, wherein after the verifying the first lane with the second lane resulting in the lane in which the target object is located, the method further comprises:
determining the number of the target objects;
if the number of the target objects is at least two, determining, from the at least two target objects, a target object which is located in the same lane as the vehicle and is closest to the vehicle;
and outputting information representing the determined target object.
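By way of illustration only, and not as part of the claimed subject matter, claim 7 may be sketched as follows; the field names and the use of longitudinal distance as the proximity measure are assumptions of the sketch:

    from dataclasses import dataclass

    @dataclass
    class Target:
        lane: int        # verified lane index; the vehicle's own lane is 0 here
        distance: float  # longitudinal distance to the vehicle, in metres

    def closest_in_ego_lane(targets, ego_lane=0):
        in_lane = [t for t in targets if t.lane == ego_lane]
        return min(in_lane, key=lambda t: t.distance, default=None)

    print(closest_in_ego_lane([Target(0, 42.0), Target(1, 12.0), Target(0, 18.5)]))
    # Target(lane=0, distance=18.5)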
8. A lane identification verification apparatus comprising:
the first determining module is used for determining at least one first lane line on a road where the vehicle is located and first coordinates representing the position of a target object on the road under a pre-calibrated vehicle coordinate system;
a second determining module, configured to determine, based on the first coordinate and the at least one first lane line, a first lane in which the target object is located in the vehicle coordinate system;
a third determining module, configured to determine at least one second lane line on the road and a second coordinate representing a position of the target object in an image coordinate system of a two-dimensional image taken of the target object;
a fourth determining module, configured to determine, based on the second coordinate and the at least one second lane line, a second lane in which the target object is located in the image coordinate system;
and the checking module is used for checking the first lane by using the second lane to obtain the lane where the target object is located.
9. A computer-readable storage medium, wherein the storage medium stores a computer program for performing the method of any one of claims 1 to 7.
10. An electronic device, the electronic device comprising:
a processor;
a memory for storing instructions executable by the processor;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the method of any one of claims 1 to 7.
CN202110909393.8A 2021-08-09 2021-08-09 Lane recognition and verification method and device, readable storage medium and electronic equipment Pending CN113569800A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110909393.8A CN113569800A (en) 2021-08-09 2021-08-09 Lane recognition and verification method and device, readable storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN113569800A true CN113569800A (en) 2021-10-29

Family

ID=78171079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110909393.8A Pending CN113569800A (en) 2021-08-09 2021-08-09 Lane recognition and verification method and device, readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113569800A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200070702A (en) * 2018-12-10 2020-06-18 르노삼성자동차 주식회사 Method of verifying lane detection in more improved lane detection system
CN109724615A (en) * 2019-02-28 2019-05-07 北京经纬恒润科技有限公司 A kind of method of calibration and system of Lane detection result
CN112184799A (en) * 2019-07-05 2021-01-05 北京地平线机器人技术研发有限公司 Lane line space coordinate determination method and device, storage medium and electronic equipment
CN112800812A (en) * 2019-11-13 2021-05-14 北京地平线机器人技术研发有限公司 Target object lane change identification method and device, readable storage medium and electronic equipment
CN111524185A (en) * 2020-04-21 2020-08-11 上海商汤临港智能科技有限公司 Positioning method and device, electronic equipment and storage medium
CN112068567A (en) * 2020-09-16 2020-12-11 上海振华重工(集团)股份有限公司 Positioning method and positioning system based on ultra-wideband and visual image
CN112560680A (en) * 2020-12-16 2021-03-26 北京百度网讯科技有限公司 Lane line processing method and device, electronic device and storage medium
CN112712040A (en) * 2020-12-31 2021-04-27 潍柴动力股份有限公司 Method, device and equipment for calibrating lane line information based on radar and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115019276A (en) * 2022-06-30 2022-09-06 南京慧尔视智能科技有限公司 Target detection method, system and related equipment
CN115019276B (en) * 2022-06-30 2023-10-27 南京慧尔视智能科技有限公司 Target detection method, system and related equipment

Similar Documents

Publication Publication Date Title
EP3627180B1 (en) Sensor calibration method and device, computer device, medium, and vehicle
US10217005B2 (en) Method, apparatus and device for generating target detection information
CN111174782B (en) Pose estimation method and device, electronic equipment and computer readable storage medium
CN110148312B (en) Collision early warning method and device based on V2X system and storage medium
CN110567475A (en) Navigation method, navigation device, computer readable storage medium and electronic equipment
EP3842752A1 (en) Vehicle positioning method, apparatus, electronic device, vehicle and storage medium
CN111398989A (en) Performance analysis method and test equipment of driving assistance system
US11738747B2 (en) Server device and vehicle
KR101995223B1 (en) System, module and method for detecting pedestrian, computer program
CN111627066A (en) Method and device for adjusting external parameters of camera
CN114475593B (en) Travel track prediction method, vehicle, and computer-readable storage medium
CN113569800A (en) Lane recognition and verification method and device, readable storage medium and electronic equipment
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN113112643A (en) Evaluation method and device for predicted trajectory, electronic device and storage medium
CN112116655A (en) Method and device for determining position information of image of target object
CN111678488B (en) Distance measuring method and device, computer readable storage medium and electronic equipment
CN115930978A (en) Map creating method and device
US20230091574A1 (en) Driving assistance processing method and apparatus, computer-readable medium, and electronic device
CN109827610A (en) Method and apparatus for check sensor fusion results
CN109859254B (en) Method and device for sending information in automatic driving
CN114743174A (en) Determination method and device for observed lane line, electronic equipment and storage medium
CN114638887A (en) Lane detection method and device, computer readable storage medium and electronic device
CN113111692B (en) Target detection method, target detection device, computer readable storage medium and electronic equipment
CN115331482A (en) Vehicle early warning prompting method and device, base station and storage medium
US11353579B2 (en) Method for indicating obstacle by smart roadside unit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination