CN111699404B - Driving assistance target acquisition method and device, radar, driving system and vehicle - Google Patents

Driving assistance target acquisition method and device, radar, driving system and vehicle

Info

Publication number
CN111699404B
CN111699404B (application CN201980012097.9A)
Authority
CN
China
Prior art keywords
target
fence
vehicle
detection
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201980012097.9A
Other languages
Chinese (zh)
Other versions
CN111699404A (en)
Inventor
王宇
陈雷
薛彤晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111699404A
Application granted
Publication of CN111699404B
Legal status: Active
Anticipated expiration


Classifications

    All classifications fall under G (Physics); G01 (Measuring; Testing); G01S (Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves):
    • G01S 13/91: Radar or analogous systems specially adapted for traffic control
    • G01S 13/726: Radar-tracking systems; multiple target tracking by using numerical data
    • G01S 13/92: Radar or analogous systems for traffic control, for velocity measurement
    • G01S 13/931: Radar or analogous systems for anti-collision purposes of land vehicles
    • G01S 7/023: Interference mitigation, e.g. reducing or avoiding non-intentional interference with other HF transmitters, base station transmitters for mobile communication or other radar systems, e.g. using electromagnetic interference (EMI) reduction techniques
    • G01S 7/36: Means for anti-jamming, e.g. ECCM, i.e. electronic counter-counter measures
    • G01S 7/414: Discriminating targets with respect to background clutter
    • G01S 7/415: Identification of targets based on measurements of movement associated with the target
    • G01S 2013/93272: Sensor installation details in the back of the vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

A driving assistance target acquisition method and device, a radar, a driving system, and a vehicle. The method includes the following steps: obtaining a target detection result of a radar behind a vehicle to obtain position information and speed information of each detection target behind the vehicle; determining filtering targets according to the position information and speed information of each detection target and the corresponding single-side fence information, where the filtering targets include mirror targets generated outside the fence; and removing the filtering targets from the detection targets to obtain driving assistance targets. By filtering out the mirror targets and other filtering targets, the accuracy of lane information is improved and the driving risk of the vehicle is reduced.

Description

Driving assistance target acquisition method and device, radar, driving system and vehicle
Technical Field
The invention relates to the technical field of intelligent transportation, and in particular to a driving assistance target acquisition method and device, a radar, a driving system, and a vehicle.
Background
With the continuous development of the intelligent transportation field, radar technology is increasingly applied to the Advanced Driving Assistance Systems (ADAS) of vehicles and occupies an indispensable position in unmanned driving systems. In the prior art, a radar can detect targets in the environment of a vehicle by emitting radar signals and receiving the reflected signals.
However, when a radar, especially a millimeter-wave radar, performs target detection while the vehicle is traveling, fences on either side of the lane in which the vehicle is located add an extra reflection of the radar signals, and the resulting multipath effect often produces mirror targets. The radar output to the vehicle's travel controller then includes these mirror targets, reducing the accuracy of the lane information and thus the safety of vehicle travel.
Disclosure of Invention
The invention provides a driving assistance target acquisition method and device, a radar, a driving system, and a vehicle, which filter out mirror targets so as to improve the accuracy of lane information and reduce the driving risk of the vehicle.
In a first aspect, an embodiment of the present invention provides a driving assistance target acquisition method, including:
obtaining a target detection result of a radar behind a vehicle to obtain position information and speed information of each detection target behind the vehicle;
determining filtering targets according to the position information and speed information of each detection target and the corresponding single-side fence information, where the filtering targets include: mirror targets generated outside the fence;
and removing the filtering targets from the detection targets to obtain driving assistance targets.
In a second aspect, an embodiment of the present invention provides a driving assistance target acquisition apparatus, including:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of the first aspect.
In a third aspect, an embodiment of the present invention provides a radar, including:
a radar signal transceiver for transmitting detection signals and receiving echo signals;
a processor for controlling transmission of the detection signals and obtaining a target detection result from the echo signals;
and a driving assistance target acquisition apparatus for executing the method of the first aspect based on the target detection result.
In a fourth aspect, an embodiment of the present invention provides a driving system, including:
a radar including a driving assistance target acquisition apparatus for performing the method of the first aspect;
and a travel controller for controlling the vehicle to travel according to the driving assistance targets sent by the radar.
In a fifth aspect, an embodiment of the present invention provides a vehicle, including:
a vehicle body;
the radar of the third aspect, mounted at the rear of the vehicle body;
and a controller for controlling the vehicle to travel according to the driving assistance targets sent by the radar.
In a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium having a computer program stored thereon,
the computer program being executed by a processor to implement the method of the first aspect.
According to the driving assistance target acquisition method and device, radar, driving system, and vehicle provided by the invention, the mirror targets generated near the fence by fence reflection are identified and filtered out by combining the position information and speed information of each target behind the vehicle carried in the radar detection result. This ensures that the driving assistance targets sent by the radar to the vehicle's travel controller contain no mirror targets, saves radar output bandwidth, and avoids the adverse effect of mirror targets on the accuracy of lane information and on driving safety, helping the travel controller of the vehicle to judge the environment accurately and control the vehicle precisely, and improving safety during vehicle travel.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram of a target detection result according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a driving assistance target acquisition method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a vehicle body coordinate system according to an embodiment of the present invention;
FIG. 4 is a flowchart of another driving assistance target acquisition method according to an embodiment of the present invention;
FIG. 5 is a flowchart of another driving assistance target acquisition method according to an embodiment of the present invention;
FIG. 6 is a flowchart of another driving assistance target acquisition method according to an embodiment of the present invention;
FIG. 7 is a schematic view of rearward radars mounted on a vehicle according to an embodiment of the invention;
FIG. 8 is a functional block diagram of a driving assistance target acquisition apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic physical structure diagram of a driving assistance target acquisition apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a radar architecture according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a driving system according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of a vehicle architecture according to an embodiment of the present invention.
Specific embodiments of the present disclosure have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the disclosed concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The specific application scenario of the invention is target detection of the environment behind a vehicle while the vehicle is traveling. Further, the driving scenario may be an automatic driving or unmanned driving scenario, in which target detection of the environment in which the vehicle is located is required in order to realize automatic control of the driving process.
In such a driving scenario, if fences exist on both sides of the lane in which the vehicle is located, mirror targets are easily generated near the fences.
Fig. 1 shows a schematic diagram of a target detection result. Specifically, the two vertical lines in the image of Fig. 1 represent fences: the right fence in the image corresponds to the fence on the right side of the vehicle in the actual driving scenario, and the left fence in the image corresponds to the fence on the left side. In some scenes the correspondence may instead be reversed, with the right fence in the image corresponding to the left fence of the vehicle and the left fence in the image to the right fence; the embodiment of the present invention places no specific limitation on this.
For convenience of explanation, the left fence referred to hereafter in the embodiments of the present invention is the fence located on the left side of the vehicle in the actual scene, and the right fence is the fence located on the right side.
As shown in Fig. 1, the presence of the right fence adds an extra reflection of the radar signals, so the obtained target detection result contains two corresponding detection targets, one on each side of the fence. Only one of these actually exists; the other, symmetric to it, is its specular reflection point, that is, a mirror target.
In an actual target detection scenario, the existence of mirror targets reduces the accuracy of lane information, easily causes errors in judging the vehicle state, and poses a considerable potential safety hazard. In addition, each mirror target occupies a target point resource of its own, wasting target point resources.
The technical solution provided by the invention aims to solve the above technical problems in the prior art.
The technical solution of the present application, and how it solves the above technical problems, is described in detail below with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application are described below with reference to the accompanying drawings.
Example 1
An embodiment of the invention provides a driving assistance target acquisition method. Referring to Fig. 2, the method includes the following steps:
S202: acquiring a target detection result of a radar behind the vehicle, and obtaining position information and speed information of each detection target behind the vehicle.
The target detection result is obtained by a radar mounted on the vehicle for detecting targets in the environment behind it. To realize this step, the radar only needs to be activated and used to perform target detection of the environment behind the vehicle.
Furthermore, the types of radar to which the present invention may be applied include, but are not limited to, millimeter-wave radar. A millimeter-wave radar is a detection radar working in the millimeter-wave band; compared with other sensors such as infrared and laser radars, it has higher penetrating power and higher spatial resolution, can distinguish several smaller targets, and works in all weather conditions, day and night.
In one possible implementation scenario, the target detection result may be obtained using a 77 GHz Frequency Modulated Continuous Wave (FMCW) radar system through at least one of the following processes: one-dimensional/two-dimensional Fast Fourier Transform (1D/2D FFT) processing, Constant False-Alarm Rate (CFAR) detection, angle measurement, target tracking, filtering, and the like.
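As an illustrative, non-limiting sketch of the 1D/2D FFT stage named above, the following Python fragment computes a range-Doppler magnitude map for one hypothetical FMCW frame; the window function, array shapes, and function name are assumptions, and the CFAR detection, angle measurement, and tracking stages are omitted.

    import numpy as np

    def range_doppler_map(frame: np.ndarray) -> np.ndarray:
        """Magnitude range-Doppler map of one FMCW frame of shape
        (num_chirps, samples_per_chirp)."""
        num_chirps, num_samples = frame.shape
        # 1D FFT along fast time (within each chirp) resolves range bins.
        range_fft = np.fft.fft(frame * np.hanning(num_samples), axis=1)
        # A second FFT along slow time (across chirps) resolves Doppler
        # (velocity) bins; fftshift centres zero velocity.
        doppler_fft = np.fft.fftshift(
            np.fft.fft(range_fft * np.hanning(num_chirps)[:, None], axis=0),
            axes=0,
        )
        return np.abs(doppler_fft)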
In addition, the embodiment of the present invention does not specifically limit the types of detection targets included in the target detection result; they depend on the objects present in the environment in which the vehicle is traveling. For example, a detection target may be another vehicle behind the vehicle, or a stationary object such as a fence in the driving environment.
S204: determining filtering targets according to the position information and speed information of each detection target and the corresponding single-side fence information, where the filtering targets include: mirror targets generated outside the fence.
Single-side fence information is information about the fence on one side of the scene in which the vehicle is located, and may include, but is not limited to, the position information and width information of that fence. In one possible design, the single-side fence information may include, but is not limited to, the line formed by the point-set sequence of the single-side fence; for example, the two straight lines in the scene shown in Fig. 1 characterize fences.
The single-side fence information can be obtained through fence detection. The execution body of the fence detection step may be the radar itself, in which case the target detection result output by the radar already contains the single-side fence information; fence detection need not be executed again in this step, and the target detection result can be processed directly. Alternatively, in another implementation, the target detection result output by the radar contains only the objects detected behind the vehicle; in this case, before executing step S204, fence detection must additionally be performed on the target detection result to obtain the single-side fence information.
In addition, for any detection target, if the target is located on the right side of the lane center line, the right fence is its corresponding single-side fence; if it is located on the left side of the lane center line, the left fence is its corresponding single-side fence. This takes into account that, when determining filtering targets, the right fence is unlikely to generate a mirror target for a detection target located near the left fence, and vice versa, so processing such combinations would be time-consuming and ineffective; the filtering targets therefore only need to be determined using the single-side fence information corresponding to each detection target.
The embodiment of the present invention places no special limitation on the number of fences in the actual scene: there may be a fence on only one side, or on both sides. In an implementation scenario with fences on both sides, the filtering targets of each side can be determined separately from that side's single-side fence information, and the filtering targets of both sides are removed later.
It should be noted that "outside the fence" in the embodiments of the present invention refers to the side of the fence away from the road on which the vehicle travels, and "inside the fence" refers to the side facing that road. Referring to the schematic diagram of Fig. 1, the outer side of the right fence is the region to the right of the actual lane in which the vehicle travels, and the inner side of the right fence is its left side, that is, the side of the actual lane.
S206: removing the filtering targets from the detection targets to obtain the driving assistance targets.
When executing this step, the driving assistance targets are obtained simply by removing the filtering targets from all the detection targets. The driving assistance targets may be used to assist the automatic driving control of the vehicle, or be output to help the driver view the situation behind the vehicle.
The manner of determining the filtering targets in the above flow is described in detail below.
In the determination of filtering targets performed in S204, the determined filtering targets may include, but are not limited to, mirror targets generated outside the fence. In addition, as shown in Fig. 1, they may further include at least one of the following: stationary targets near the fence, and out-of-fence targets within a fourth preset range of the vehicle. The method for determining each kind of filtering target is described below.
First, the determination of mirror targets generated outside the fence.
As described above, a mirror target is the specular reflection point of a detection target located inside the fence. That inside detection target can be regarded as a reference target, through which the mirror target can be determined. Specifically, the reference target is a detection target that is located inside the fence and whose distance to the fence satisfies a preset condition.
Specifically, the reference target needs to satisfy preset conditions in at least two dimensions: position and velocity. In an actual target detection scenario, when a reference target is relatively close to the fence, the fence adds an extra reflection at the reference target, forming a specular reflection point; if the reference target is far from the fence, the probability of forming a specular reflection point is low. The position information of a detection target can therefore serve as one dimension for confirming a reference target. Moreover, if the extra reflection added by the fence forms a mirror target of the reference target, the velocities of the reference target and the mirror target are theoretically consistent, so the speed information of a detection target can serve as the other dimension for confirming a reference target.
Based on this, when specifically determining a mirror target, a reference target may first be determined inside the fence; mirror-target screening conditions are then determined according to the position information and speed information of the reference target, and whether a detection target located outside the fence is a mirror target is judged according to those screening conditions.
The position information in the embodiments of the present invention refers to a coordinate position in the vehicle body coordinate system. For example, the position information of each detection target includes, but is not limited to, its coordinate position in the vehicle body coordinate system; the position information of the single-side fence is likewise a position in the vehicle body coordinate system.
Specifically, the vehicle body coordinate system may be, but is not limited to, a two-dimensional coordinate system on a plane; it may also include depth information, constituting a three-dimensional coordinate system.
Fig. 3 illustrates a schematic diagram of a vehicle body coordinate system according to an embodiment of the present invention. In the design shown in Fig. 3, the vehicle center is the origin of the coordinate system, the travel direction (the direction of the vehicle head) is the X-axis direction, and the horizontal direction perpendicular to the travel direction (the direction of the right door) is the Y-axis direction. For convenience of explanation, the following description directly uses the X-axis and Y-axis directions shown in Fig. 3.
In addition, in practice the vehicle body coordinate system may be constructed in ways other than that shown in Fig. 3; for example, the Y axis may point toward the left door, or the origin may be placed at the vehicle head. These variants are not described in detail.
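For the illustrative sketches that follow in this embodiment, one possible minimal Python representation of a single detection target in the vehicle body coordinate system of Fig. 3 is assumed below; the class and field names are hypothetical and not taken from the patent.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        """One radar detection expressed in the body frame of Fig. 3."""
        x: float   # longitudinal position along the travel direction, metres
        y: float   # lateral position, metres (positive toward the right door)
        vx: float  # velocity component along the X axis, metres per second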
In the embodiment of the present invention, the mirror-target screening conditions include at least one of the following:
a distance from the symmetry point of the reference target, the symmetry point being the point symmetric to the reference target with the fence as the axis of symmetry;
a relative velocity with respect to the reference target.
It should be noted that, in an actual implementation scenario, the reference target and its corresponding mirror target are not perfectly symmetric. That is, if the symmetry point is obtained with the fence as the symmetry axis, that point may not exist in the target detection result, or it may coincide with a detection target expressed in the result. Furthermore, the symmetry point of the reference target may be its corresponding mirror target, or possibly an object actually present in the scene.
Based on the foregoing design, the specific determination of a mirror target is described below taking a single-side fence as an example. In this case, S204 may be implemented as shown in Fig. 4:
S2042-2: according to the position information of each detection target and the single-side fence information, acquiring a reference target that is located inside the fence and within a first preset range of the fence.
The first preset range limits the relative distance between the reference target and the single-side fence in the Y-axis direction. Specifically, a threshold of the first preset range may be preset in the Y-axis direction.
Assume the threshold of the first preset range is dis_mirror. For any detection target, it is first judged from the Y-axis component of its position information (its coordinates in the vehicle body coordinate system) whether the target is located inside the corresponding single-side fence and not on the fence (the fence has a certain width, so a detection target may fall on it). The perpendicular Y-axis distance between the target and the fence is then obtained from the fence's Y-axis component; if this distance is smaller than the threshold dis_mirror, the detection target is determined to be within the first preset range and is taken as a reference target.
S2042-4: taking the single-side fence corresponding to the reference target as the symmetry axis, acquiring the symmetry point of the reference target outside the fence.
As described above, the symmetry point is determined with the single-side fence as the symmetry axis; it may coincide with an actually existing detection target, or it may not exist at all and be merely a point in the image.
In addition, besides directly using the single-side fence as the symmetry axis, the symmetry point of the detection target may be obtained by taking the center point of the single-side fence on the horizontal line through the detection target's X-axis component as the center of symmetry.
S2042-6: obtaining the detection targets within a second preset range around the symmetry point, as first candidate targets.
The second preset range limits the allowed deviation between the specular reflection point of the reference target and the symmetry point. In a specific implementation, the second preset range may be preset as a region containing the symmetry point. In one possible design, the circular region of preset radius r centered on the symmetry point serves as the second preset range. In another possible design, any polygon may be used, with the polygon's coverage area preset as the second preset range and the symmetry point falling within it.
Based on the determined second preset range, the detection targets falling within it are obtained as first candidate targets.
In addition, if no detection target can be obtained within the second preset range, the reference target has no mirror target.
S2042-8: according to the speed information of the reference target, determining the mirror target of the reference target among the first candidate targets, as a mirror target generated outside the fence.
S2042-2 to S2042-6 above determine the first candidate targets, each of which may be a mirror target of the reference target. It remains to judge, from the speed information, which first candidate target's relative speed with respect to the reference target satisfies the screening condition, thereby obtaining the mirror target of the reference target.
As described above, the movement speed of a mirror target is theoretically the same as that of its reference target, so the speed difference between the two can be obtained; if the difference is small (smaller than a preset speed threshold), the first candidate target can be determined to be the mirror target.
In the embodiment of the present invention, the speed information may be contained in the target detection result acquired by the radar, or it may be acquired separately, or received from radar tracking.
Further, considering that the moving direction of a moving target may not completely coincide with that of the vehicle, if the moving direction of each detection target is taken into account and its speed is treated as a vector, the speed information of each detection target has not only a velocity component along the X axis but possibly also a component along the Y axis (determined by the moving direction).
In such an implementation scenario, since the moving direction of each detection target behind the vehicle, particularly a target in the same lane, is substantially the same as that of the vehicle, the mirror target can be determined using the X-axis velocity component of each detection target. S2042-8 can then be implemented as follows: according to the speed information of the reference target and of the first candidate target, acquire the first relative speed between them in the travel direction, and determine a first candidate target whose first relative speed is smaller than a preset first speed threshold as the mirror target of the reference target.
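As an illustrative, non-limiting sketch of S2042-2 through S2042-8 for a right-side fence, the following Python fragment assumes the fence is reduced to a single lateral coordinate y_fence at the target's longitudinal position and ignores the fence width; dis_mirror, r, and delta_v carry the names used in the text, but their values here are placeholders.

    import math

    def find_mirror_targets(detections, y_fence, dis_mirror=1.5, r=1.0,
                            delta_v=0.5):
        """Mirror targets outside a right-side fence at lateral y_fence."""
        mirrors = []
        # S2042-2: reference targets lie inside the fence (y < y_fence for a
        # right-side fence) and within the first preset range of it.
        refs = [d for d in detections
                if d.y < y_fence and (y_fence - d.y) < dis_mirror]
        for ref in refs:
            # S2042-4: symmetry point of the reference about the fence line.
            sym_x, sym_y = ref.x, 2.0 * y_fence - ref.y
            for cand in detections:
                if cand is ref or cand.y <= y_fence:
                    continue  # only outside-fence detections are candidates
                # S2042-6: second preset range taken as a circle of radius r.
                if math.hypot(cand.x - sym_x, cand.y - sym_y) > r:
                    continue
                # S2042-8: first relative speed along the travel direction.
                if abs(cand.vx - ref.vx) < delta_v:
                    mirrors.append(cand)
        return mirrors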
In the above manner, every mirror target outside the fence contained in the target detection result can be screened out and determined as a filtering target for deletion.
Furthermore, in another implementation scenario, the filtering targets may also include, but are not limited to, mirror targets located inside the fence. This accounts for the possibility that specular reflection points of detection targets outside the fence appear inside it; filtering them out avoids their influence on the lane information and further reduces the safety risk during vehicle travel.
Second, the determination of stationary targets near the fence.
Specifically, a stationary target is a target that is stationary relative to the ground. In practice, a stationary target may be determined from the relative speed between the detection target and the vehicle. A stationary target near the fence is delimited by a third preset range of the fence; in other words, the third preset range defines the neighborhood of the fence. The third preset range may be preset according to actual needs, which the embodiment of the present invention does not specifically limit.
In one possible design, the stationary targets near the fence may be determined, within the third preset range of the fence, from the relative speed of each detection target with respect to the vehicle together with the vehicle's own speed relative to the ground.
Consider any detection target within the third preset range that is stationary relative to the vehicle. If the vehicle itself is stationary relative to the ground, the detection target is also stationary and is a stationary target near the fence. If, instead, the vehicle's ground speed is C (C a positive number), the detection target's ground speed is also C, so the target is moving and is not a stationary target near the fence.
Now consider any detection target within the third preset range whose speed relative to the vehicle is B (B not 0). If the vehicle is stationary relative to the ground, the target's ground speed is B, so the target is moving and is not a stationary target near the fence. If, instead, the vehicle's ground speed is C: when the difference between C and B is 0, the target is stationary relative to the ground and is a stationary target near the fence; when the difference between C and B is not 0, the target is moving relative to the ground and is not a stationary target near the fence.
In another possible design, the determination of stationary targets may also be achieved by the implementation shown in Fig. 5:
S2044-2: according to the speed information of the detection targets within the third preset range of the fence, acquiring each target's second relative speed with respect to the vehicle in the travel direction.
For any detection target, denote its velocity component in the X-axis direction as vx and the vehicle's own speed in the travel direction as v_car. For ease of processing, the absolute value of the difference between the two, abs(vx - v_car), can be taken as its second relative speed.
S2044-4: determining the stationary targets near the fence according to the second relative speed and the vehicle's speed relative to the ground, and determining them as filtering targets.
This step may be processed in the manner described above. The motion state of the detection target relative to the vehicle is determined by the second relative speed: if the second relative speed is smaller than a preset second speed threshold, the detection target is determined to be stationary relative to the vehicle. The second speed threshold may be the same as or different from the first speed threshold; this is not specifically limited. Denoting the second speed threshold delta_v_threshold, for any detection target, if abs(vx - v_car) < delta_v_threshold is satisfied and the vehicle is in a stationary state, the detection target may be determined as a filtering target for deletion in a subsequent step.
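A possible, non-limiting sketch of S2044 is given below, assuming vx is an ego-motion-compensated ground-frame velocity component and v_car >= 0 is the vehicle's ground speed along the travel direction; delta_v_threshold carries the name used in the text, its value being a placeholder. In practice the signed relative velocity would disambiguate the rare target closing at twice the vehicle speed, which the scalar test below cannot.

    def stationary_near_fence(near_fence_detections, v_car,
                              delta_v_threshold=0.5):
        """Detections near the fence that are stationary over ground."""
        stationary = []
        for d in near_fence_detections:
            # Second relative speed with respect to the vehicle,
            # abs(vx - v_car) as in the text.
            rel_v = abs(d.vx - v_car)
            # The C - B test from the text: the target is stationary over
            # ground when the vehicle's ground speed cancels the relative
            # speed.
            if abs(v_car - rel_v) < delta_v_threshold:
                stationary.append(d)
        return stationary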
Taking the stationary targets near the fence as filtering targets in the above steps filters out the stationary point tracks near the fence, effectively reduces the number of stationary target points, saves limited storage and target tracking (track) resources, and improves the capability of tracking other traffic participants.
Third, the determination of the out-of-fence targets within a fourth preset range of the vehicle.
The fourth preset range is a region outside the fence close to the vehicle; it indicates that the positional relationship between an out-of-fence object and the vehicle is relatively close.
Targets in this region are relatively close to the radar mounted on the vehicle, so the accuracy of the speed information the radar acquires for them is low, which easily causes target misjudgment: the target output becomes discontinuous, and the targets in this region may flicker. Moreover, when the vehicle travels close to the fence, the out-of-fence targets within the fourth preset range are likely to fall within the range of Blind Spot Detection (BSD) or Lane Change Assist (LCA) detection, which also easily causes misjudgment by the ADAS system. The out-of-fence targets in the fourth preset range therefore strongly affect the accuracy with which the vehicle identifies road information, while the detection targets in this region are of little value to driving assistance during actual travel; they can thus be taken as filtering targets and removed.
Specifically, the out-of-fence targets are determined according to the position information of each detection target. In other words, an out-of-fence target is determined by the detection target's longitudinal distance to the vehicle (relative position along the X axis) and lateral distance to the fence (relative position along the Y axis).
In one possible design, the determination of the out-of-fence targets may be achieved by the implementation shown in Fig. 6:
S2046-2: according to the position information of each detection target and the single-side fence information, obtaining the second candidate targets located outside the fence.
A second candidate target is a detection target located outside the fence; its determination may be based on each detection target's coordinate component along the Y axis of the vehicle body coordinate system.
As described above, the relationship between a detection target and the single-side fence may be of three kinds: inside the fence, outside the fence, or on the fence. Therefore, when implementing this step, the second candidate targets must be determined from each detection target's Y-axis coordinate component together with the Y-axis component and the width of the single-side fence.
Specifically, the coordinate component of each detection target and of the single-side fence in the horizontal direction perpendicular to travel (the Y-axis direction) may be acquired from the position information of each detection target and the single-side fence information.
Because the single-side fence has a width, its Y-axis coordinate component can be obtained in at least three ways:
as the Y-axis coordinate component of the edge near the road on which the vehicle travels; or as the Y-axis coordinate component of the edge away from that road; or by interpolation over the single-side fence point set.
In the first implementation, if the Y-axis coordinate component of the edge away from the road is acquired, the fence width is already accounted for in that component, so the width need not be considered further, and the second candidate targets located outside the fence are obtained directly by comparing Y-axis coordinate components. In the vehicle body coordinate system shown in Fig. 3, the Y-axis coordinate component of a second candidate target outside the left fence is smaller than that of the left fence, and the Y-axis coordinate component of a second candidate target outside the right fence is greater than that of the right fence.
In the second implementation, if the Y-axis coordinate component of the edge near the road is acquired, the influence of the fence width must additionally be considered.
In this case, the third candidate targets located outside the single-side fence are first acquired from the Y-axis coordinate components of each detection target and of the fence; the second relative distance between each third candidate target and the corresponding fence in the Y-axis direction is then acquired, and the third candidate targets whose second relative distance is greater than the width of the single-side fence are taken as second candidate targets.
Taking the left fence in the vehicle body coordinate system of Fig. 3 as an example, the Y-axis coordinate component y_obj of each third candidate target is smaller than the Y-axis coordinate component y_roadside_left of the left fence, that is: y_obj < y_roadside_left. On this basis, the absolute value of their difference is obtained as the second relative distance, abs(y_obj - y_roadside_left); if this distance is greater than the left fence width y_roadside_width, the third candidate target is determined to be a second candidate target.
Further, in this implementation scenario, for any third candidate target, if abs(y_obj - y_roadside_left) < y_roadside_width is satisfied, that target is determined to be on the left fence; and for any detection target, if y_obj > y_roadside_left is satisfied, it is determined to be inside the left fence.
In the third implementation, the Y-axis coordinate component of the single-side fence is obtained by interpolation. Here the single-side fence appears as a line composed of a point-set sequence whose Y-axis component is not directly given; the fence's Y-axis coordinate component is obtained by interpolating over the (at least two) fence points in the point set that are closest to the vehicle along the X axis.
In this case, the Y-axis coordinate component of the single-side fence may be acquired as follows: obtain from the single-side fence point set the two fence points with the shortest distance to the vehicle in the travel direction (the X-axis direction); then obtain, from the coordinates of these two fence points, the interpolated coordinate corresponding to the current position of the vehicle, and determine the fence's Y-axis coordinate component from this interpolated coordinate.
For example, in the vehicle body coordinate system, for the left fence along which the vehicle travels, the two left-fence points closest to the vehicle along the X axis are obtained, with coordinates fence point 1 (X1, Y1) and fence point 2 (X2, Y2). To determine the Y-axis coordinate component of the left fence, interpolation is performed on the Y-axis coordinates Y1 and Y2 of the two points; in one possible implementation, the interpolation takes their average value.
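As an illustrative sketch of this interpolation, assuming the single-side fence is given as a sequence of (x, y) points in the body frame and the vehicle sits at the origin, the average-value variant named above may look as follows; the function name is hypothetical.

    def fence_y_at_vehicle(fence_points):
        """Interpolated Y-axis component of a one-sided fence at the vehicle.

        fence_points is a sequence of (x, y) tuples in the body frame.
        """
        # Pick the two fence points nearest the vehicle along the X axis.
        p1, p2 = sorted(fence_points, key=lambda p: abs(p[0]))[:2]
        # Interpolate their Y coordinates; taking the mean is one option.
        return (p1[1] + p2[1]) / 2.0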
In this implementation, the second candidate targets are determined in the same way as in the second implementation: after the third candidate targets are determined, secondary screening is performed according to the width of the single-side fence to obtain the second candidate targets, which is not repeated here. The criteria for "inside the single-side fence" and "on the single-side fence", however, differ. Continuing the above example, for any detection target, if y_obj > y_roadside_left and abs(y_obj - y_roadside_left) > y_roadside_width are satisfied, the target is determined to be inside the left fence; if y_obj > y_roadside_left and abs(y_obj - y_roadside_left) < y_roadside_width are satisfied, the target is determined to fall on the left fence.
S2046-4: obtaining the first relative distance between each second candidate target and the vehicle in the travel direction.
The previous step screens the detection targets by their positional relationship along the Y axis; this step screens them by their positional relationship along the X axis.
S2046-6: for any second candidate target, if its first relative distance is smaller than a preset first distance threshold, determining that it is an out-of-fence target within the fourth preset range.
The first distance threshold (x_filter_limit) may be preset as required and is not described further here.
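A possible, non-limiting sketch of S2046-2 through S2046-6 for a left-side fence is given below, reusing the names from the text (y_roadside_left for the fence edge near the road, y_roadside_width for the fence width, x_filter_limit for the first distance threshold); the threshold values are placeholders.

    def near_outside_fence_targets(detections, y_roadside_left,
                                   y_roadside_width, x_filter_limit=10.0):
        """Out-of-fence targets within the fourth preset range (left fence)."""
        filtered = []
        for d in detections:
            # S2046-2: outside the left fence by more than the fence width
            # (the second relative distance test from the text).
            outside = (d.y < y_roadside_left and
                       abs(d.y - y_roadside_left) > y_roadside_width)
            # S2046-4/-6: first relative distance along the travel direction
            # compared against the first distance threshold.
            if outside and abs(d.x) < x_filter_limit:
                filtered.append(d)
        return filtered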
In addition, when determining the out-of-fence targets within the fourth preset range of the vehicle, besides the order shown in Fig. 6, where targets are first screened by their lateral distance to the fence and then by their longitudinal distance to the vehicle, the screening may also be performed in the reverse order or simultaneously. In other words, in another implementation scenario, screening may be based first on the longitudinal distance to the vehicle and then on the lateral distance to the fence; or, in yet another implementation scenario, screening may be carried out simultaneously on the longitudinal distance to the vehicle and the lateral distance to the fence, and the intersection taken to obtain the out-of-fence targets within the fourth preset range of the vehicle.
By determining the out-of-fence targets within the fourth preset range as filtering targets in the above steps, the out-of-fence targets close to the vehicle are filtered out. This solves the problem of flickering target points in regions not of interest, enables judgment of the lateral spacing of traffic participants, improves the detection accuracy of BSD and/or LCA to a certain extent, reduces the probability of ADAS misjudgment, and improves the safety of vehicle travel.
In addition, when the determination of the filtering target is specifically implemented, other filtering targets may be further determined according to the actual scene, which is not particularly limited in the embodiment of the present invention.
In one possible design, the filtering targets may also include, but are not limited to, the detection targets falling on the fence. Their determination is described in step S2046-2 above and is not repeated here.
In another possible design, the filtering targets may also include, but are not limited to, the fence itself. This applies to implementation scenarios in which the fence is not needed as a reference, or in which the user instructs the fence marks to be removed.
In addition, the embodiment of the present invention does not specifically limit the number of radars; in an actual implementation scenario, the solution may be implemented using the target detection results obtained by at least one radar.
In one possible implementation, as shown in Fig. 7, two rearward radars may be mounted at the rear of the vehicle, detecting objects and obstacles behind the vehicle from their echoes.
In such a scenario, the detection ranges of the rearward radars are generally not exactly the same and, given the detection objective, there is generally a detection overlap region. As shown in Fig. 7, the two rearward radars mounted at the rear of the vehicle perform target detection for the left region and the right region behind the vehicle respectively, with a detection overlap region between them.
When the determination of filtering targets is implemented using the target detection results obtained by at least two radars, the filtering targets may be determined and removed on each target detection result separately, or the radars' target detection results may first be fused and the filtering targets determined and removed on the fused result using at least one of the implementations above. When fusing the target detection results, the fusion can be based on the detection overlap region of the radars' detection ranges: the targets in the overlap region are matched so as to match the at least two target detection results, and fusion is then performed on the matching result to obtain a complete fused target detection result.
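As a very rough, non-limiting sketch of such fusion, the fragment below matches the two radars' detections by nearest-neighbour position distance and keeps one track per match; the gate value, the matching rule, and the function name are assumptions, not the patent's method.

    import math

    def fuse_rear_radars(left_dets, right_dets, gate=0.5):
        """Fuse two rear radars' detection lists in a common body frame."""
        fused = list(left_dets)
        used = set()
        for r in right_dets:
            best_i, best_d = None, gate
            for i, l in enumerate(left_dets):
                if i in used:
                    continue
                d = math.hypot(r.x - l.x, r.y - l.y)
                if d < best_d:
                    best_i, best_d = i, d
            if best_i is None:
                fused.append(r)   # seen only by the right radar
            else:
                used.add(best_i)  # overlap-region match: keep a single track
        return fused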
Determining and removing the filtering targets yields more accurate lane information, facilitates same-lane judgment of moving targets, assists the judgment accuracy of the rearward radar in the ADAS system in remote procedure calls (RPC), reduces the probability of ADAS misjudgment, and improves the safety of vehicle travel. Moreover, through this processing, the number of output driving assistance targets is effectively reduced, so that when the method is executed in the radar, the number of output targets is effectively reduced, which helps reduce the occupation of radar output bandwidth and improve the performance of the on-board radar.
Based on any of the foregoing implementations, the driving assistance targets in the target detection results can be determined; these may further be used to assist the ADAS system in controlling the travel of the vehicle.
In this case, the method may further include the following step: tracking the driving assistance target by using the radar to obtain a tracking result, and outputting the tracking result to a controller of the vehicle or a controller of the ADAS system, so that that controller can control the travel of the vehicle according to the driving assistance target. A sketch of this tracking-and-output step is given below.
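As an illustration only, the following is a minimal sketch of the step above, assuming a constant-velocity alpha-beta tracker and a hypothetical send_to_controller hook standing in for the interface to the vehicle or ADAS controller; the tracker choice, gain values, and all names here are assumptions, since the embodiment does not prescribe a tracking algorithm.

def track_and_output(assist_targets, tracks, dt, alpha=0.85, beta=0.005,
                     send_to_controller=print):
    """assist_targets: {target_id: measured longitudinal position, m}.
    tracks: {target_id: (position estimate, velocity estimate)}, updated in place."""
    results = []
    for tid, meas in assist_targets.items():
        pos, vel = tracks.get(tid, (meas, 0.0))
        pos += vel * dt               # predict the position one cycle ahead
        resid = meas - pos            # innovation: measurement minus prediction
        pos += alpha * resid          # correct the position estimate
        vel += (beta / dt) * resid    # correct the velocity estimate
        tracks[tid] = (pos, vel)
        results.append({"id": tid, "pos": pos, "vel": vel})
    send_to_controller(results)       # e.g., forwarded to the vehicle/ADAS controller
    return results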
It is to be understood that some or all of the steps or operations in the above embodiments are merely examples; embodiments of the present application may also perform other operations or variations of the operations described. Furthermore, the steps may be performed in a different order than presented in the above embodiments, and not all of the operations described need be performed.
Although the terms "first," "second," etc. may be used in the present application to describe various elements, these elements should not be limited by these terms; the terms are only used to distinguish one element from another. For example, a first element could be termed a second element and, similarly, a second element could be termed a first element without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first element and the second element are both elements, but may not be the same element.
The terminology used in the present application is for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and in the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this disclosure is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises," "comprising," and variations thereof specify the presence of the recited features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Embodiment two
Based on the driving assistance target obtaining method provided in the first embodiment, the embodiment of the present invention further provides an apparatus embodiment for implementing each step and method in the foregoing method embodiment.
An embodiment of the present invention provides a driving assistance target obtaining apparatus. Referring to fig. 8, the driving assistance target obtaining apparatus 800 includes:
An obtaining module 81, configured to obtain a target detection result of a radar behind a vehicle, and obtain position information and speed information of each detected target behind the vehicle;
A determining module 82, configured to determine a filtering target according to the position information, the speed information, and the corresponding single-side fence information of each detection target, where the filtering target includes: mirror image targets generated outside the fence;
and a filtering module 83, configured to remove the filtering target from each detection target, and obtain a driving assistance target.
In the embodiment of the invention, the mirror image target is determined according to a reference target, where the reference target is a detection target that is located on the inner side of the fence and whose distance from the fence meets a preset condition.
Mirror image target screening conditions are determined according to the position information and the speed information of the reference target, and whether a detection target located outside the fence is the mirror image target is determined according to those screening conditions.
Specifically, the mirrored target screening conditions include at least one of:
a distance from the symmetry point of the reference target, the symmetry point being the point symmetric to the reference target with the fence as the axis of symmetry;
relative velocity with respect to the reference target.
In one possible design, the determining module 82 is specifically configured to:
acquiring a reference target which is positioned at the inner side of the fence and is positioned in a first preset range of the fence according to the position information of each detection target and the unilateral fence information;
Taking a unilateral fence corresponding to the reference target as a symmetry axis, and acquiring symmetry points of the reference target on the outer side of the fence;
Acquiring a detection target in a second preset range where the symmetry point is located, and taking the detection target as a first candidate target;
And determining a mirror image target of the reference target in the first candidate target according to the speed information of the reference target, so as to serve as a mirror image target generated outside the fence.
In another possible design, the determining module 82 is specifically configured to:
acquiring a first relative speed of the first candidate target and the reference target in the running direction according to the speed information of the reference target and the speed information of the first candidate target;
and determining the first candidate target with the first relative speed smaller than a preset first speed threshold as a mirror image target of the reference target (a code sketch of this mirror-target screen is given after this list).
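Purely as an illustrative sketch of the two designs above, the following code screens for mirror targets, locally approximating the single-side fence as the vertical line x = fence_x in the body frame. The thresholds RANGE_1, RANGE_2, and V_TH, and the inside_sign convention, are assumptions; the embodiment leaves the preset ranges and the first speed threshold unspecified.

RANGE_1 = 1.5  # m, assumed "first preset range" from the fence for reference targets
RANGE_2 = 1.0  # m, assumed "second preset range" around the symmetry point
V_TH = 0.5     # m/s, assumed "first speed threshold" on the first relative speed

def find_mirror_targets(targets, fence_x, inside_sign=-1):
    """targets: dicts with 'x' (lateral, m), 'y' (longitudinal, m), 'v' (speed, m/s).
    inside_sign selects which side of x = fence_x counts as inside the fence."""
    references = [t for t in targets
                  if inside_sign * (t["x"] - fence_x) > 0   # inside the fence
                  and abs(t["x"] - fence_x) <= RANGE_1]     # within the first range
    mirrors = []
    for ref in references:
        sym_x = 2 * fence_x - ref["x"]  # symmetry point, fence taken as the axis
        for cand in targets:
            if inside_sign * (cand["x"] - fence_x) >= 0:
                continue                # first candidates must lie outside the fence
            near_sym = ((cand["x"] - sym_x) ** 2
                        + (cand["y"] - ref["y"]) ** 2) ** 0.5 <= RANGE_2
            if near_sym and abs(cand["v"] - ref["v"]) < V_TH:
                mirrors.append(cand)    # nearly equal speed: treat as a mirror target
    return mirrors

A real fence is a polyline rather than a straight line; the reflection would then be taken about the local fence segment instead of a fixed x = fence_x.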
In another possible design, the filtering target further includes: a stationary object in the vicinity of the fence.
Wherein the stationary target is determined from the relative speed of the detection target and the vehicle.
In another possible design, the determining module 82 is specifically configured to:
Acquiring a second relative speed of a detection target relative to the vehicle in the running direction according to speed information of the detection target in a third preset range of the fence;
and determining a stationary target near the fence according to the second relative speed and the speed of the vehicle relative to the ground, and determining that stationary target as the filtering target (a sketch of this screen follows).
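A minimal sketch of this stationary-target screen, assuming each detection carries its speed relative to the vehicle along the driving direction in a field 'v_rel'; RANGE_3 and V_STATIC are assumed stand-ins for the unspecified third preset range and stationarity tolerance.

RANGE_3 = 1.0   # m, assumed "third preset range" around the fence
V_STATIC = 0.3  # m/s, assumed tolerance for treating a target as stationary

def find_stationary_near_fence(targets, fence_x, ego_speed):
    """targets: dicts with 'x' (lateral, m) and 'v_rel' (speed relative to the
    vehicle along the driving direction, m/s); ego_speed is over ground, m/s."""
    stationary = []
    for t in targets:
        if abs(t["x"] - fence_x) > RANGE_3:
            continue                           # only targets near the fence
        ground_speed = t["v_rel"] + ego_speed  # recover the over-ground speed
        if abs(ground_speed) < V_STATIC:
            stationary.append(t)               # e.g., fence posts, roadside clutter
    return stationary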
In another possible design, the filtering target further includes: and an out-of-fence target located within a fourth preset range of the vehicle.
Wherein the out-of-fence target is determined based on a longitudinal distance of the detection target relative to the vehicle and a lateral distance relative to the fence.
In another possible design, the determining module 82 is specifically configured to:
acquiring a second candidate target positioned outside the fence according to the position information of each detection target and the unilateral fence information;
Acquiring a first relative distance between each second candidate target and the vehicle in the running direction;
for any second candidate target, if the first relative distance is smaller than a preset first distance threshold, determining that second candidate target to be the out-of-fence target located within the fourth preset range (see the sketch below).
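A minimal sketch of this distance screen; DIST_1 is an assumed value for the unspecified first distance threshold, and the second candidates are taken to be targets already classified as outside the fence (see the design that follows).

DIST_1 = 5.0  # m, assumed "first distance threshold" along the driving direction

def find_close_out_of_fence(second_candidates):
    """second_candidates: out-of-fence targets with 'y' = longitudinal distance
    to the vehicle along the driving direction, m."""
    return [t for t in second_candidates if abs(t["y"]) < DIST_1]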
In another possible design, the determining module 82 is specifically configured to:
acquiring coordinate components of each detection target and of the single-side fence in the lateral and longitudinal directions relative to the driving direction, according to the position information of each detection target and the single-side fence information;
acquiring a third candidate target located outside the single-side fence according to the coordinate components of each detection target and of the single-side fence;
acquiring a second relative distance between the third candidate target and the corresponding single-side fence in the lateral direction relative to the driving direction;
and acquiring, as the second candidate target, the third candidate target whose second relative distance is greater than the width of the single-side fence.
In another possible design, the determining module 82 is specifically configured to:
acquiring, from the single-side fence point set, the two single-side fence points closest to the vehicle in the driving direction;
obtaining interpolated coordinates corresponding to the current position of the vehicle according to the coordinates of the two single-side fence points;
and determining the coordinate components of the single-side fence in the lateral and longitudinal directions relative to the driving direction according to the interpolated coordinates (a sketch of this interpolation follows).
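A minimal sketch of the interpolation and the outside-fence test above; the fence-point format, FENCE_WIDTH, and inside_sign are assumptions. Passing y_query = 0.0 corresponds to interpolating at the vehicle's current position as described; the sketch also allows interpolating at a target's own longitudinal coordinate.

FENCE_WIDTH = 0.3  # m, assumed single-side fence width

def fence_x_at(fence_points, y_query=0.0):
    """fence_points: [(x, y), ...] along the single-side fence in the body frame.
    Returns the fence's lateral coordinate linearly interpolated at y_query."""
    # Pick the two fence points closest to y_query along the driving direction.
    p0, p1 = sorted(fence_points, key=lambda p: abs(p[1] - y_query))[:2]
    if p1[1] == p0[1]:
        return p0[0]                             # degenerate: points share one y
    w = (y_query - p0[1]) / (p1[1] - p0[1])      # linear interpolation weight
    return p0[0] + w * (p1[0] - p0[0])

def is_second_candidate(target, fence_points, inside_sign=-1):
    """True when the target lies outside the fence by more than the fence width."""
    fx = fence_x_at(fence_points, target["y"])
    outward = -inside_sign * (target["x"] - fx)  # > 0 means outside the fence
    return outward > FENCE_WIDTH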
In an embodiment of the present invention, the location information may include, but is not limited to: the coordinate position of each detection target in a vehicle body coordinate system.
In the embodiment of the invention, the unilateral fence information is obtained through fence detection;
The single-sided fence information includes: a connecting line formed by the ordered point set of the single-side fence.
In addition, in the embodiment of the invention, the number of the radars is at least one;
the radar includes: millimeter wave radar.
In another possible design, the driving assistance target obtaining apparatus 800 further includes:
A tracking module (not shown in fig. 8) for tracking the driving assistance target by using the radar to obtain a tracking result;
An output module (not shown in fig. 8) for outputting the tracking result to a controller of the vehicle or a controller of a driving assistance ADAS system.
The driving assistance target obtaining apparatus 800 of the embodiment shown in fig. 8 may be used to implement the technical solution of the above method embodiment; for its implementation principle and technical effects, refer to the related description of the method embodiment. Alternatively, the driving assistance target obtaining apparatus 800 may be a radar, a part of the processing units in a radar processor, or another processor disposed separately from the radar.
It should be understood that the division of the driving assistance target obtaining apparatus 800 shown in fig. 8 into modules is merely a division of logical functions; in actual implementation, the modules may be wholly or partly integrated into one physical entity, or physically separated. These modules may all be implemented as software invoked by a processing element, all in hardware, or partly as software invoked by a processing element and partly in hardware. For example, the determining module 82 may be a separately established processing element, may be integrated into a chip of the driving assistance target obtaining device 800, for example a chip of a terminal, or may be stored in the memory of the device 800 in program form, with its functions invoked and executed by a processing element of the device 800. The other modules are implemented similarly. In addition, all or some of the modules can be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal-processing capability. In implementation, each step of the above method, or each of the above modules, can be completed by an integrated logic circuit of hardware in a processor element or by instructions in software form.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). For another example, when one of the above modules is implemented by a processing element scheduling a program, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can invoke the program. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Further, an embodiment of the present invention provides a driving assistance target obtaining apparatus. Referring to fig. 9, the driving assistance target obtaining apparatus 800 includes:
A memory 810;
A processor 820; and
A computer program;
Wherein the computer program is stored in the memory 810 and configured to be executed by the processor 820 to implement the method as described in the above embodiments.
The number of the processors 820 in the driving assistance target obtaining device 800 may be one or more, and the processors 820 may also be referred to as a processing unit, and may implement a certain control function. The processor 820 may be a general purpose processor or a special purpose processor, etc. In an alternative design, the processor 820 may also have instructions stored therein, which may be executed by the processor 820, to cause the driving assistance target obtaining device 800 to perform the method described in the above method embodiment.
In yet another possible design, the driving assistance target obtaining apparatus 800 may include a circuit that may implement the functions of transmission or reception or communication in the foregoing method embodiment.
Alternatively, the number of the memories 810 in the driving assistance target obtaining device 800 may be one or more, and the memories 810 may have instructions or intermediate data stored therein, where the instructions may be executed on the processor 820, so that the driving assistance target obtaining device 800 performs the method described in the above method embodiments. Optionally, other relevant data may also be stored in the memory 810. Instructions and/or data may also optionally be stored in processor 820. The processor 820 and the memory 810 may be provided separately or may be integrated.
In addition, as shown in fig. 9, a transceiver 830 is further provided in the driving assistance target obtaining apparatus 800. The transceiver 830 may also be referred to as a transceiver unit, a transceiver circuit, or a transceiver, and is used for data transmission or communication with a test device or other terminal devices; details are not repeated here.
As shown in fig. 9, the memory 810, the processor 820, and the transceiver 830 are connected and communicate by a bus.
If the driving assistance target obtaining apparatus 800 is used to implement the method corresponding to fig. 2, for example, the result of target detection by the radar behind the vehicle may be obtained through the transceiver 830, and the transceiver 830 may be used to output the filtered driving assistance targets. The processor 820 is configured to perform the corresponding determining or controlling operations; optionally, corresponding instructions may also be stored in the memory 810. For the specific manner of processing of the individual components, refer to the related description of the previous embodiments.
Furthermore, an embodiment of the present invention provides a readable storage medium having stored thereon a computer program to be executed by a processor to implement the method according to embodiment one.
The embodiment of the invention provides a radar. Referring to fig. 10, the radar 1000 includes:
radar signal transceiver 1010 for sending out detection signals and receiving feedback signals;
a processor 1020, configured to control transmission of the detection signal, and obtain a target detection result according to the feedback signal;
the driving assistance target obtaining device 800 is configured to perform, based on the target detection result, the method according to any implementation of the first embodiment.
Specifically, the driving assistance target obtaining apparatus 800 is configured to:
Obtaining target detection results of a radar behind a vehicle, and obtaining position information and speed information of each detection target behind the vehicle;
determining a filtering target according to the position information, the speed information and the corresponding single-side fence information of each detection target, wherein the filtering target comprises: mirror image targets generated outside the fence;
and removing the filtering target from each detection target to obtain a driving auxiliary target.
In the embodiment of the invention, the mirror image target is determined according to a reference target, where the reference target is a detection target that is located on the inner side of the fence and whose distance from the fence meets a preset condition.
Mirror image target screening conditions are determined according to the position information and the speed information of the reference target, and whether a detection target located outside the fence is the mirror image target is determined according to those screening conditions.
Specifically, the mirrored target screening conditions include at least one of:
a distance from the symmetry point of the reference target, the symmetry point being the point symmetric to the reference target with the fence as the axis of symmetry;
relative velocity with respect to the reference target.
In another possible design, the driving assistance target obtaining apparatus 800 is specifically configured to:
acquiring a reference target which is positioned at the inner side of the fence and is positioned in a first preset range of the fence according to the position information of each detection target and the unilateral fence information;
Taking a unilateral fence corresponding to the reference target as a symmetry axis, and acquiring symmetry points of the reference target on the outer side of the fence;
Acquiring a detection target in a second preset range where the symmetry point is located, and taking the detection target as a first candidate target;
And determining a mirror image target of the reference target in the first candidate target according to the speed information of the reference target, so as to serve as a mirror image target generated outside the fence.
In another possible design, the driving assistance target obtaining apparatus 800 is specifically configured to:
acquiring a first relative speed of the first candidate target and the reference target in the running direction according to the speed information of the reference target and the speed information of the first candidate target;
and determining the first candidate target with the first relative speed smaller than a preset first speed threshold as a mirror image target of the reference target.
In another possible design, the filtering target further includes: a stationary object in the vicinity of the fence.
Wherein the stationary target is determined from the relative speed of the detection target and the vehicle.
In another possible design, the driving assistance target obtaining apparatus 800 is specifically configured to:
Acquiring a second relative speed of a detection target relative to the vehicle in the running direction according to speed information of the detection target in a third preset range of the fence;
And determining a stationary target near the fence according to the second relative speed and the speed of the vehicle relative to the ground, and determining the stationary target as the filtering target.
In another possible design, the filtering target further includes: and an out-of-fence target located within a fourth preset range of the vehicle.
Wherein the out-of-fence target is determined based on a longitudinal distance of the detection target relative to the vehicle and a lateral distance relative to the fence.
In another possible design, the driving assistance target obtaining apparatus 800 is specifically configured to:
acquiring a second candidate target positioned outside the fence according to the position information of each detection target and the unilateral fence information;
Acquiring a first relative distance between each second candidate target and the vehicle in the running direction;
for any second candidate target, if the first relative distance is smaller than a preset first distance threshold, determining that second candidate target to be the out-of-fence target located within the fourth preset range.
In another possible design, the driving assistance target obtaining apparatus 800 is specifically configured to:
acquiring coordinate components of each detection target and of the single-side fence in the lateral and longitudinal directions relative to the driving direction, according to the position information of each detection target and the single-side fence information;
acquiring a third candidate target located outside the single-side fence according to the coordinate components of each detection target and of the single-side fence;
acquiring a second relative distance between the third candidate target and the corresponding single-side fence in the lateral direction relative to the driving direction;
and acquiring, as the second candidate target, the third candidate target whose second relative distance is greater than the width of the single-side fence.
In another possible design, the driving assistance target obtaining apparatus 800 is specifically configured to:
acquiring, from the single-side fence point set, the two single-side fence points closest to the vehicle in the driving direction;
obtaining interpolated coordinates corresponding to the current position of the vehicle according to the coordinates of the two single-side fence points;
and determining the coordinate components of the single-side fence in the lateral and longitudinal directions relative to the driving direction according to the interpolated coordinates.
In an embodiment of the present invention, the location information may include, but is not limited to: the coordinate position of each detection target in a vehicle body coordinate system.
In the embodiment of the invention, the unilateral fence information is obtained through fence detection;
The single-sided fence information includes: a connecting line formed by the ordered point set of the single-side fence.
In another possible design, the driving assistance target obtaining apparatus 800 is further configured to:
Tracking the driving auxiliary target by using the radar to obtain a tracking result;
and outputting the tracking result to a controller of the vehicle or a controller of a driving assistance ADAS system.
In addition, in the embodiment of the invention, the number of the radars is at least one;
the radar includes: millimeter wave radar.
The embodiment of the invention provides a running system. Referring to fig. 11, the driving system 1100 includes:
Radar 1000 as shown in fig. 10;
and a travel controller 1110 for controlling the travel of the vehicle according to the travel assist target transmitted by the radar 1000.
As shown in fig. 10, the radar 1000 includes a driving assistance target obtaining device 800 for performing the method according to any one of the implementations of the first embodiment.
In one possible design, the travel system 1100 may be an ADAS system.
The embodiment of the invention provides a vehicle. Referring to fig. 12, the vehicle 1200 includes:
a vehicle body 1210;
as shown in fig. 10, a radar 1000 is installed at the rear of the vehicle body 1210;
a controller 1220 for controlling the vehicle 1200 to travel according to the travel assist target transmitted by the radar 1000.
As shown in fig. 10, the radar 1000 includes a driving assistance target obtaining device 800 for performing the method according to any one of the implementations of the first embodiment.
The embodiment of the present invention does not particularly limit the vehicle body 1210; in a practical scenario, it may at least comprise a power system and a body frame.
Since each module in this embodiment is capable of executing the method shown in embodiment one, a part of this embodiment which is not described in detail can be referred to the description related to embodiment one.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware associated with program instructions, where the foregoing program may be stored in a computer readable storage medium, and when executed, the program performs steps including the above method embodiments; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (20)

1. A travel assist target acquisition method characterized by comprising:
Obtaining target detection results of a radar behind a vehicle, and obtaining position information and speed information of each detection target behind the vehicle;
determining a filtering target according to the position information, the speed information and the corresponding single-side fence information of each detection target, wherein the filtering target comprises: mirror image targets generated outside the fence;
removing the filtering target from each detection target to obtain a driving auxiliary target;
The mirror image target is determined according to a reference target, wherein the reference target is a detection target which is positioned at the inner side of the fence and the distance of the reference target relative to the fence meets a preset condition;
Determining the mirror image target generated outside the fence, comprising:
acquiring a reference target which is positioned at the inner side of the fence and is positioned in a first preset range of the fence according to the position information of each detection target and the unilateral fence information;
Taking a unilateral fence corresponding to the reference target as a symmetry axis, and acquiring symmetry points of the reference target on the outer side of the fence;
Acquiring a detection target in a second preset range where the symmetry point is located, and taking the detection target as a first candidate target;
And determining a mirror image target of the reference target in the first candidate target according to the speed information of the reference target, so as to serve as a mirror image target generated outside the fence.
2. The method according to claim 1, wherein a mirror image target screening condition is determined based on the position information and the speed information of the reference target, and whether the detection target located outside the fence is the mirror image target is determined based on the mirror image target screening condition.
3. The method of claim 2, wherein the mirrored target screening conditions include at least one of:
a distance from a symmetry point of the reference target, the symmetry point of the reference target being a symmetry point of the reference target about the fence as an axis of symmetry;
relative velocity with respect to the reference target.
4. The method of claim 1, wherein determining a mirrored target of the reference target from among the first candidate targets based on the velocity information of the reference target comprises:
acquiring a first relative speed of the first candidate target and the reference target in the running direction according to the speed information of the reference target and the speed information of the first candidate target;
and determining the first candidate target with the first relative speed smaller than a preset first speed threshold as a mirror image target of the reference target.
5. A method according to any one of claims 1-3, wherein the filtering target further comprises: a stationary object in the vicinity of the fence.
6. The method of claim 5, wherein the stationary target is determined based on a relative speed of the detection target and the vehicle.
7. The method of claim 5, wherein the determining the stationary target relative to the vehicle comprises:
acquiring a second relative speed of a detection target relative to the vehicle in a running direction according to speed information of the detection target in a third preset range of the fence;
And determining a stationary target near the fence according to the second relative speed and the speed of the vehicle relative to the ground, and determining the stationary target as the filtering target.
8. A method according to any one of claims 1-3, wherein the filtering target further comprises: and an out-of-fence target located within a fourth preset range of the vehicle.
9. The method of claim 8, wherein the out-of-fence target is determined based on a longitudinal distance of the detection target relative to the vehicle and a lateral distance relative to the fence.
10. The method of claim 8, wherein the determining the out-of-fence target that is within the fourth preset range of the vehicle comprises:
acquiring a second candidate target positioned outside the fence according to the position information of each detection target and the unilateral fence information;
Acquiring a first relative distance between each second candidate target and the vehicle in the running direction;
for any second candidate target, if the first relative distance is smaller than a preset first distance threshold, determining that the second candidate target is the out-of-fence target positioned in the fourth preset range.
11. The method of claim 10, wherein the acquiring a second candidate object located outside the fence comprises:
Acquiring coordinate components of each detection target and the single-side fence in the horizontal and vertical directions of the running direction according to the position information of each detection target and the single-side fence information;
acquiring a third candidate target positioned outside the single-side fence according to the coordinate components of each detection target and the single-side fence;
Acquiring a second relative distance between the third candidate target and the corresponding unilateral fence in the horizontal and vertical directions of the driving direction;
And acquiring the third candidate target with the second relative distance being greater than the width of the single-side fence as the second candidate target.
12. The method of claim 11, wherein the acquiring the coordinate component of the one-sided fence in the horizontal-vertical direction of the travel direction comprises:
Acquiring two unilateral fence points with shortest distance relative to the vehicle in the driving direction from the unilateral fence point set;
Obtaining interpolation coordinates corresponding to the current position of the vehicle according to the coordinates of the two single-side fence points;
And determining coordinate components of the single-side fence in the horizontal and vertical directions of the running direction according to the interpolation coordinates.
13. A method according to any one of claims 1-3, wherein the location information comprises: and the coordinate position of each detection target in a vehicle body coordinate system.
14. A method according to any one of claims 1 to 3, wherein the single-sided fence information is obtained by fence detection;
The single-sided fence information includes: and a connecting line formed by the point set sequences of the single-side fence.
15. A method according to any one of claims 1-3, wherein the number of radars is at least one;
the radar includes: millimeter wave radar.
16. A method according to any one of claims 1-3, wherein the method further comprises:
Tracking the driving auxiliary target by using the radar to obtain a tracking result;
and outputting the tracking result to a controller of the vehicle or a controller of a driving assistance ADAS system.
17. A travel assist target acquisition device characterized by comprising:
A memory;
A processor; and
A computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any one of claims 1-16.
18. A radar, comprising:
the radar signal receiving and transmitting device is used for sending out detection signals and receiving feedback signals;
A processor in communication with the radar signal transceiver, the processor being configured to control transmission of the probe signal, and the processor being configured to obtain a target detection result based on the feedback signal, and to perform the method of any of claims 1-16 based on the target detection result.
19. A travel system, characterized by comprising:
Radar comprising a driving assistance target obtaining means for performing the method of any one of claims 1-16;
And the running controller is used for controlling the vehicle to run according to the running auxiliary target sent by the radar.
20. A vehicle, characterized by comprising:
A vehicle body;
The radar of claim 18, mounted behind the vehicle body;
And the controller is used for controlling the vehicle to run according to the driving auxiliary target sent by the radar.
CN201980012097.9A 2019-05-30 2019-05-30 Driving auxiliary target acquisition method and device, radar, driving system and vehicle Active CN111699404B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/089309 WO2020237579A1 (en) 2019-05-30 2019-05-30 Method and device for obtaining driving assistance target, radar, driving system, and vehicle

Publications (2)

Publication Number Publication Date
CN111699404A CN111699404A (en) 2020-09-22
CN111699404B true CN111699404B (en) 2024-04-30

Family

ID=72476463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980012097.9A Active CN111699404B (en) 2019-05-30 2019-05-30 Driving auxiliary target acquisition method and device, radar, driving system and vehicle

Country Status (2)

Country Link
CN (1) CN111699404B (en)
WO (1) WO2020237579A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112672047B (en) * 2020-12-20 2022-08-26 英特睿达(山东)电子科技有限公司 Image acquisition system and image processing method
CN112799044A (en) * 2020-12-30 2021-05-14 南京市德赛西威汽车电子有限公司 BSD radar-based simple and efficient railing identification method
CN112835026B (en) * 2020-12-31 2024-02-20 福瑞泰克智能系统有限公司 Radar mirror image target detection method and device, radar equipment and vehicle
CN114779180A (en) * 2022-06-20 2022-07-22 成都瑞达物联科技有限公司 Multipath interference mirror image target filtering method for vehicle-road cooperative radar
CN115079121B (en) * 2022-07-19 2022-11-01 南京隼眼电子科技有限公司 Multipath target identification method and device for vehicle-mounted radar and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106019283A (en) * 2016-05-24 2016-10-12 深圳市雷博泰克科技有限公司 Automobile blind area detecting method, device and system
CN106383339A (en) * 2016-08-30 2017-02-08 电子科技大学 Mirror-image object inhibition method of multi-site radar signal-level combined detection
CN107656255A (en) * 2017-10-25 2018-02-02 中国人民解放军国防科技大学 Ultra-wideband radar moving target two-dimensional positioning method based on multipath echoes
CN108318864A (en) * 2018-02-06 2018-07-24 成都纳雷科技有限公司 A kind of method and device for eliminating multipath target in Radar Targets'Detection
CN108859952A (en) * 2018-05-22 2018-11-23 森思泰克河北科技有限公司 Vehicle lane change method for early warning, device and radar
CN109254289A (en) * 2018-11-01 2019-01-22 百度在线网络技术(北京)有限公司 The detection method and detection device of road guard

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5424959B2 (en) * 2010-03-31 2014-02-26 富士通テン株式会社 Signal processing device, radar device, vehicle control system, and signal processing method
SI2724175T1 (en) * 2011-06-21 2015-10-30 Kapsch Trafficcom Ag Method and device for detecting a rotating wheel
DE102012208852A1 (en) * 2012-05-25 2013-11-28 Robert Bosch Gmbh Detection of radar objects with a radar sensor of a motor vehicle
DE102014201026A1 (en) * 2014-01-21 2015-07-23 Robert Bosch Gmbh Method for angle estimation and radar sensor for motor vehicles
JP6275007B2 (en) * 2014-09-12 2018-02-07 アイシン精機株式会社 Parking assistance device
CN109283534B (en) * 2018-09-18 2020-04-21 中国科学院沈阳自动化研究所 Intersection merging auxiliary early warning system and method based on millimeter wave radar

Also Published As

Publication number Publication date
CN111699404A (en) 2020-09-22
WO2020237579A1 (en) 2020-12-03

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant