CN110705445A - Trailer and blind area target detection method and device - Google Patents


Info

Publication number
CN110705445A
Authority
CN
China
Prior art keywords
trailer
target
vehicle
area
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910924662.0A
Other languages
Chinese (zh)
Inventor
李金川
刘建伟
甄龙豹
李普
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Great Wall Motor Co Ltd
Original Assignee
Great Wall Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Great Wall Motor Co Ltd filed Critical Great Wall Motor Co Ltd
Priority to CN201910924662.0A
Publication of CN110705445A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G06V20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 - Recognition of vehicle lights or traffic lights
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 - Segmentation by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the field of vehicle technology and provides a trailer detection method and device and a blind area target detection method and device. The trailer detection method comprises the following steps: acquiring image information within a set range behind the host vehicle; performing edge point processing on the image information to obtain edge lines of targets within the set range; and judging whether any edge line of any target lies within a preset trailer area, and if so, determining that the target is a trailer. The method can identify a trailer without an additional trailer module, saving the cost of such a module; it also requires no calibration of the trailer dimensions during identification and is easy to implement.

Description

Trailer and blind area target detection method and device
Technical Field
The invention relates to the field of vehicle technology, and in particular to a trailer detection method and device and a blind area target detection method and device.
Background
At present, many vehicles are equipped with blind area detection systems. These systems mainly rely on millimeter wave radars mounted on both sides of the vehicle tail to detect vehicles in adjacent lanes; if a vehicle in an adjacent lane enters the blind area of the host vehicle's rearview mirrors, the driver is reminded to pay attention to driving safety.
However, if the vehicle tows a trailer, the trailer may interfere with the radar wave signals emitted by the millimeter wave radar and cause false alarms in the blind area detection system. Identifying the trailer therefore becomes important, and the prior art mainly uses the following two methods:
First, an additional trailer module is installed on the vehicle; after recognizing the trailer, this module sends a signal to the blind area detection system to switch off the blind area detection function. However, this solution requires an extra module on the vehicle, which increases the weight and cost of the whole vehicle.
Second, the trailer size is calibrated relying only on the millimeter wave radar. This scheme filters out false targets according to the magnitude of the reflected energy of the detected targets, which greatly increases the difficulty of calibrating the trailer.
It can therefore be seen that conventional trailer identification methods are limited in both cost and performance.
Disclosure of Invention
In view of the above, the present invention is directed to a trailer detection method, so as to at least partially solve the above technical problems.
To achieve this object, the technical solution of the invention is realized as follows:
A trailer detection method comprising: acquiring image information within a set range behind the host vehicle; performing edge point processing on the image information to obtain edge lines of targets within the set range; and judging whether any edge line of any target is within a preset trailer area, and if so, determining that the target is a trailer.
Further, the trailer detection method also comprises setting the trailer area, which includes: determining a plurality of calibration points for calibrating the trailer area, wherein the attribute parameters of the calibration points include the longitudinal distance, lateral distance, azimuth angle and trailer height of the trailer relative to the tail of the host vehicle; adjusting the ranges of the attribute parameters of the calibration points according to the profile information of the trailer, the connection mode between the trailer and the host vehicle, and/or the driving state of the host vehicle; and determining the trailer area based on the adjusted attribute parameters of the calibration points.
Further, the judging whether any edge line of any target is within a preset trailer area includes: acquiring attribute parameters of the edge points on the edge lines, wherein the attribute parameters comprise the longitudinal distance, lateral distance and azimuth angle of the edge points relative to the tail of the host vehicle, and height information of the edge points; and, within a preset continuous observation period, if the attribute parameters of the edge points on any edge line fall within the ranges corresponding to the calibration points of the trailer area, determining that the target corresponding to that edge line is the trailer.
Compared with the prior art, the trailer detection method provided by the invention obtains the edge lines of a target by edge point processing of the image information and judges whether the target is a trailer by determining whether an edge line enters the preset trailer area. The trailer can thus be identified without an additional trailer module, saving the cost of such a module; the trailer size does not need to be calibrated during identification, and the method is easy to implement.
Another object of the present invention is to provide a blind area target detection method to at least partially solve the above technical problems.
To achieve this object, the technical solution of the invention is realized as follows:
A blind area target detection method, applied to a vehicle that detects its blind area by means of a millimeter wave radar, comprising: acquiring image information within the blind area range of the host vehicle; performing edge point processing on the image information to obtain edge lines of targets within the blind area range; judging whether any edge line of any target is within a preset trailer area, and if so, determining that the target is a trailer and acquiring trailer contour information expressed in the form of edge points; and, after determining that the trailer exists, performing the following processing for a target other than the trailer:
acquiring the position information of the target detected by the millimeter wave radar;
and comparing the position information of the target with the corresponding edge line, and if they are consistent, determining that the target is a real target.
Further, the blind area target detection method also comprises setting the trailer area, which includes: determining a plurality of calibration points for calibrating the trailer area, wherein the attribute parameters of the calibration points include the longitudinal distance, lateral distance, azimuth angle and trailer height of the trailer relative to the tail of the host vehicle; adjusting the ranges of the attribute parameters of the calibration points according to the profile information of the trailer, the connection mode between the trailer and the host vehicle, and/or the driving state of the host vehicle; and determining the trailer area based on the adjusted attribute parameters of the calibration points.
Further, the judging whether any edge line of any target is within a preset trailer area includes: acquiring attribute parameters of the edge points on the edge lines, wherein the attribute parameters comprise the longitudinal distance, lateral distance and azimuth angle of the edge points relative to the tail of the host vehicle, and height information of the edge points; and, within a preset continuous observation period, if the attribute parameters of the edge points on any edge line fall within the ranges corresponding to the calibration points of the trailer area, determining that the target corresponding to that edge line is the trailer.
Further, the acquiring image information within the blind area range of the host vehicle includes: acquiring the image information through a camera mounted on the host vehicle.
Compared with the prior art, the blind area target detection method of the invention fuses the image information of the target with the position information detected by the millimeter wave radar. The trailer can therefore be detected, and real targets can be distinguished from the false targets produced by trailer interference, so that false alarms caused by the trailer can be avoided without switching off the vehicle's blind area detection function when a trailer is detected, ensuring driving safety and driving experience.
Another object of the present invention is to provide a trailer detection device, so as to at least partially solve the above technical problems.
To achieve this object, the technical solution of the invention is realized as follows:
A trailer detection device comprises a controller, wherein the controller is configured to execute the above trailer detection method.
Further, the trailer detection device also comprises a camera in communication with the controller, used to capture image information within the set range behind the host vehicle and transmit the image information to the controller.
Compared with the prior art, the trailer detection device has the same advantages as the trailer detection method, which are not repeated here.
Another object of the present invention is to provide a blind area target detection device, which at least partially solves the above technical problems.
To achieve this object, the technical solution of the invention is realized as follows:
A blind area target detection device comprises a controller, wherein the controller is configured to execute the above blind area target detection method.
Further, the blind area target detection device also comprises: a camera in communication with the controller, used to capture image information within the blind area range of the host vehicle and transmit the image information to the controller; and a millimeter wave radar in communication with the controller, used to detect the position information of targets within the blind area range and transmit the position information to the controller.
Compared with the prior art, the blind area target detection device and the blind area target detection method have the same advantages, and are not described herein again.
Another object of the present invention is to propose a machine readable storage medium to at least partially solve the above technical problem.
To achieve this object, the technical solution of the invention is realized as follows:
a machine-readable storage medium having instructions stored thereon for causing a machine to perform the above-described trailer detection method and the above-described blind zone target detection method.
The machine-readable storage medium has the same advantages as the trailer detection method and the blind area target detection method compared with the prior art, and is not described again here.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention.
In the drawings:
fig. 1 is a schematic flow chart of a trailer detection method according to a first embodiment of the present invention;
FIG. 2 is a schematic flow diagram of the trailer area setup in a preferred embodiment of the present invention;
FIG. 3 is a schematic view of the position of the host vehicle relative to the trailer in an embodiment of the present invention;
FIG. 4 is a schematic flow chart of a blind area target detection method according to a second embodiment of the present invention;
FIG. 5 is a schematic diagram of a millimeter wave radar combined with a camera to determine a real target in an example of an embodiment of the invention;
fig. 6 is a system architecture diagram of a trailer detection apparatus according to a third embodiment of the present invention; and
fig. 7 is a system architecture diagram of a blind area target detection apparatus according to a fourth embodiment of the present invention.
Description of reference numerals:
610, 710: central control module
620, 720: camera
630, 730: associated components
740: millimeter wave radar
750: human-machine interaction module
Detailed Description
The embodiments of the present invention and the features of the embodiments may be combined with each other as long as they do not conflict.
In the embodiments of the present invention, the vehicle to which the trailer is attached is referred to as the "host vehicle".
The present invention will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Example one
Fig. 1 is a schematic flow chart of a trailer detection method according to a first embodiment of the present invention. As shown in fig. 1, the trailer detection method may include the steps of:
and step S110, acquiring image information in a set range behind the vehicle.
The set range may be, for example, a blind area range defined by referring to ISO17387 standard.
Preferably, the image information can be obtained by shooting through a camera. The image information output by the camera reflects the free space of the vehicle, namely the space in which the vehicle can run, to a certain extent, and is defined as the road surface divided by the edges of the road shoulder, the static object and the moving object.
Step S120, performing edge point processing on the image information to obtain edge lines of the targets within the set range.
Edge point processing to extract edge points, edge lines and even target contours is a conventional image processing technique. For example, a pixel can be judged to be an edge point from the difference between it and the neighbouring pixels to its left and right or above and below; ID information is then assigned to each edge point, consecutive edge points are selected according to their IDs and connected in sequence to form edge lines, and the target contour and even the free space can be drawn from the edge lines. In addition, since actual vehicles are left-right symmetric about their central axis, when the target is a vehicle this symmetry can be used to simplify the contour drawing; other symmetric targets can be handled in the same way.
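As a rough illustration of this edge-point step, the Python sketch below marks pixels that differ strongly from their left/right or up/down neighbours and then groups adjacent edge points into edge lines. The function names, the simple neighbour-difference test and the fixed threshold are illustrative assumptions, not the operator actually used by the patent.

```python
import numpy as np
from scipy import ndimage


def find_edge_points(gray: np.ndarray, threshold: float = 30.0) -> np.ndarray:
    """Flag pixels whose left/right or up/down neighbours differ strongly.

    A minimal stand-in for the edge-point test described above; note that
    np.roll wraps at the image borders, which a real system would mask out.
    """
    g = gray.astype(float)
    gx = np.abs(np.roll(g, -1, axis=1) - np.roll(g, 1, axis=1))  # left/right difference
    gy = np.abs(np.roll(g, -1, axis=0) - np.roll(g, 1, axis=0))  # up/down difference
    return np.maximum(gx, gy) > threshold


def link_edge_lines(edge_mask: np.ndarray):
    """Group adjacent edge points into connected runs, each run playing the
    role of one 'edge line' with its own ID."""
    labels, count = ndimage.label(edge_mask)
    lines = []
    for line_id in range(1, count + 1):
        ys, xs = np.nonzero(labels == line_id)
        lines.append(list(zip(xs.tolist(), ys.tolist())))  # pixel coordinates of this edge line
    return lines
```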
Step S130, determining whether any edge line of any target is in a preset trailer area, and if so, determining that the target is a trailer.
In a preferred embodiment, the trailer detection method may further comprise setting the trailer area. Fig. 2 is a schematic flow chart of the trailer area setting in the preferred embodiment of the present invention; as shown in Fig. 2, it may include the following steps:
step S210, determining a plurality of calibration points for calibrating the trailer area.
Fig. 3 is a schematic diagram of the position of the host vehicle relative to the trailer in an embodiment of the invention. As shown, the trailer area can be described by the longitudinal distance dx, lateral distance dy and azimuth angle r of the trailer relative to the tail of the host vehicle; considering the size of the trailer, the trailer height dz (not shown in Fig. 3; dx, dy and dz are mutually perpendicular) is also included. Based on this, the attribute parameters of the plurality of calibration points may include the longitudinal distance dx, lateral distance dy, azimuth angle r and trailer height dz of the trailer relative to the tail of the host vehicle.
Step S220, adjusting the range of the attribute parameters of the plurality of calibration points according to the profile information of the trailer, the connection mode of the trailer and the host vehicle, and/or the driving state of the host vehicle.
The profile information of the trailer may, for example, be the dimension information of a conventional trailer, from which the range of the attribute parameters of each calibration point, such as the range of the trailer height dz, can be determined empirically. The connection between the trailer and the host vehicle may be a soft connection or a hard connection: a hard connection is one in which the distance between the trailer and the host vehicle is completely fixed, while a soft connection is one in which that distance can change during driving. Because of the soft connection mode, the longitudinal distance, lateral distance and azimuth angle between the host vehicle and the trailer change with the driving state of the host vehicle, such as constant-speed driving, acceleration, turning or braking, so the ranges of the attribute parameters of the plurality of calibration points can be adjusted according to the driving state of the host vehicle.
Step S230, determining the trailer area based on the adjusted attribute parameters of the calibration points.
For example, through adjustment of the attribute parameters of the calibration points, the trailer area may be determined as the area formed by dx = [0, -a] m, r = [b, -c] degrees, dy = [e, -f] and dz = [g + Δg]. This area is set with various actual scenarios in mind, so that a target entering it is generally a trailer. Here a, b, c, e, f, g and Δg are all calibration values.
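As an illustration of how such a calibrated area could be represented, the sketch below treats the trailer area as closed intervals on dx, dy, r and dz. The interval bounds stand in for the calibration values a, b, c, e, f, g and Δg and are not values given by the patent, and the soft-connection widening only mirrors the adjustment described in step S220.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class TrailerArea:
    """Calibrated trailer region behind the host vehicle (all bounds are placeholders)."""
    dx_range: Tuple[float, float]  # longitudinal distance to the rear of the host vehicle, metres
    dy_range: Tuple[float, float]  # lateral offset, metres
    r_range: Tuple[float, float]   # azimuth angle, degrees
    dz_range: Tuple[float, float]  # height, metres

    def contains(self, dx: float, dy: float, r: float, dz: float) -> bool:
        return (self.dx_range[0] <= dx <= self.dx_range[1]
                and self.dy_range[0] <= dy <= self.dy_range[1]
                and self.r_range[0] <= r <= self.r_range[1]
                and self.dz_range[0] <= dz <= self.dz_range[1])


def widen_for_soft_connection(area: TrailerArea, dy_margin: float = 0.5,
                              r_margin: float = 2.0) -> TrailerArea:
    """Step S220: enlarge the lateral and angular ranges when the coupling lets the
    trailer move relative to the host vehicle (soft connection); margins are illustrative."""
    return TrailerArea(
        area.dx_range,
        (area.dy_range[0] - dy_margin, area.dy_range[1] + dy_margin),
        (area.r_range[0] - r_margin, area.r_range[1] + r_margin),
        area.dz_range,
    )
```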
Further, after determining the trailer area, in a preferred embodiment, the step S130 may further include:
step S131, obtaining attribute parameters of the edge points on the edge line.
The attribute parameters of an edge point comprise the longitudinal distance, lateral distance and azimuth angle of the edge point relative to the tail of the host vehicle, and the height information of the edge point. As above, in this example these four attribute parameters may be represented as the longitudinal distance dx, lateral distance dy, azimuth angle r and height dz, respectively.
Step S132, in a preset continuous observation period, if the attribute parameter of the edge point on any edge line falls within the range corresponding to the calibration point of the trailer area, determining that the target corresponding to the edge line is the trailer.
For example, for the trailer area dx = [0, -a] m, r = [b, -c] degrees, dy = [e, -f] and dz = [g + Δg] described above, if during a continuous observation period of duration t the edge points of a certain edge line of a rear target remain within dx = [0, -a] m, r = [b, -c] degrees and dy = [e, -f] while their height stays within dz = [g + Δg] m, this indicates that the rear target has an edge line inside the preset trailer area, and the rear target is therefore considered to be a trailer.
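Continuing the sketch, the persistence test of step S132 could be written roughly as below, reusing the TrailerArea above. The EdgePoint fields assume the edge points have already been projected from the image into vehicle coordinates (which the patent leaves to conventional processing), and the cycle count is an illustrative stand-in for the observation period t.

```python
from collections import namedtuple

# Position of one edge point relative to the tail of the host vehicle (step S131).
EdgePoint = namedtuple("EdgePoint", "dx dy r dz")


def edge_line_in_area(edge_points, area: TrailerArea) -> bool:
    """True if every edge point of one edge line lies inside the trailer area."""
    return all(area.contains(p.dx, p.dy, p.r, p.dz) for p in edge_points)


class TrailerClassifier:
    """Declare a target to be the trailer only after one of its edge lines has stayed
    inside the trailer area for N consecutive cycles (the continuous observation period)."""

    def __init__(self, area: TrailerArea, required_cycles: int = 20):
        self.area = area
        self.required_cycles = required_cycles
        self._hits = {}  # target id -> number of consecutive in-area cycles

    def update(self, target_id, edge_lines) -> bool:
        inside = any(edge_line_in_area(line, self.area) for line in edge_lines)
        self._hits[target_id] = self._hits.get(target_id, 0) + 1 if inside else 0
        return self._hits[target_id] >= self.required_cycles
```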
Therefore, in the embodiment of the invention, the edge lines of a target are obtained by edge point processing of the image information, and whether the target is a trailer is judged by determining whether an edge line enters the preset trailer area. The trailer can thus be identified without an additional trailer module, saving the cost of such a module; the trailer size does not need to be calibrated during identification, and the method is easy to implement.
Example two
In the background section above, it has been mentioned that a trailer can interfere with the radar signals of the millimeter wave radar and cause false alarms in the blind area detection system. This interference mainly manifests itself in the trailer blocking part of the radar detection range and possibly reflecting the radar waves, producing ghost images, that is, false targets. To address this problem, in the solution that uses a trailer module, after the module recognizes the trailer it sends a signal to the blind area detection system to switch off the blind area detection function so as to avoid false alarms. Obviously, however, switching off the blind area detection function means the vehicle can no longer detect targets present in the blind area, which increases the risk of such a target colliding with the host vehicle or the trailer and is not conducive to driving safety.
In contrast, the second embodiment of the present invention provides a method for detecting a blind area target on the basis of the first embodiment. Fig. 4 is a flowchart illustrating a blind area target detection method according to a second embodiment of the present invention, which is applied to a vehicle that detects a blind area by a millimeter wave radar. As shown in fig. 4, the blind area detection method may include the steps of:
and step S410, acquiring image information in the range of the blind area of the vehicle.
The range of the blind zone can be defined by referring to the ISO17387 standard.
In a preferred embodiment, image information within a blind spot range of the own vehicle may be captured by a camera mounted on the own vehicle. The camera can be installed at the center of the tail of the vehicle, the top of the vehicle and the like, or a 360-degree looking-around camera can be adopted, so that the camera can shoot images in the whole blind area range.
Step S420, performing edge point processing on the image information to obtain an edge line of the target in the blind area range.
For a specific edge point processing process, reference may be made to embodiment one, and details are not described herein.
Step S430, determining whether any edge line of any target is in a preset trailer area, if so, determining that the target is a trailer, and acquiring trailer contour information expressed in the form of edge points.
For the setting of the trailer area and the specific determination process of the trailer, reference may also be made to the first embodiment, which is not described herein again. It should be noted that, in the process of determining the trailer by using the method of the first embodiment, the edge points of the trailer are obtained, and these edge points can be used for representing the contour of the trailer.
Step S440, after determining that the trailer exists, processing targets other than the trailer to determine whether they are real targets.
Specifically, the process of determining whether the target is a real target in step S440 may include: acquiring the position information of the target detected by the millimeter wave radar; and comparing the position information of the target with the corresponding edge line, and if the position information of the target is consistent with the corresponding edge line, determining that the target is a real target.
For example, fig. 5 is a schematic diagram illustrating the principle of combining a millimeter wave radar and a camera to determine a real target in an example of the embodiment of the present invention. As shown in fig. 5, the arc-shaped portion is the detection range of the millimeter wave radar, and the region bounded by the irregular lines is the detection range of the camera (i.e. the camera's free space). For target 1 and target 2, the position information detected by the millimeter wave radar is compared with the free space formed by the target edge lines detected by the camera. The position of target 1 determined by the millimeter wave radar is consistent with the edge line of target 1 determined by the camera, so target 1 is a real target; target 2 has no obvious edge line in the camera's free space, so target 2 is likely to be a false target.
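This consistency check can be sketched as a simple gating test: a radar detection is treated as real only if the camera also reports an object boundary close to that position. The 0.5 m gate and the assumption that both sensors already report positions in the same vehicle coordinate frame are illustrative choices, not values from the patent.

```python
import math


def is_real_target(radar_xy, edge_lines, gate_m: float = 0.5) -> bool:
    """Keep a radar detection only if some camera edge point lies within gate_m of it.

    Target 1 in Fig. 5: radar position backed by a nearby camera edge line -> real.
    Target 2 in Fig. 5: no edge line near the radar position -> likely a ghost reflection.
    Each edge line is a list of (x, y) points in the same frame as the radar output.
    """
    points = [(x, y) for line in edge_lines for (x, y) in line]
    if not points:
        return False  # the camera sees no boundary at all, so treat the detection as a ghost
    rx, ry = radar_xy
    return min(math.hypot(rx - x, ry - y) for (x, y) in points) <= gate_m
```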
In a preferred embodiment, after a real target is determined, a warning may be issued, for example by combining the TTC (Time To Collision) information output by the millimeter wave radar, so as to warn the driver about target 1.
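For reference, time to collision is conventionally the range divided by the closing speed; a minimal sketch of such a warning gate follows, with a 3-second threshold chosen purely for illustration rather than taken from the patent.

```python
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """TTC = range / closing speed; infinite when the target is not approaching."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps


def should_warn(range_m: float, closing_speed_mps: float,
                ttc_threshold_s: float = 3.0) -> bool:
    """Warn about a confirmed real target whose TTC drops below the threshold."""
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s
```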
The second embodiment of the invention in effect fuses the image information of the target with the position information detected by the millimeter wave radar. This ensures that the trailer can be detected and that real targets can be distinguished from the false targets produced by trailer interference, so that false alarms caused by the trailer can be avoided without switching off the vehicle's blind area detection function when a trailer is detected, ensuring driving safety and driving experience.
It should be noted that, in a preferred embodiment, besides the image information collected by the camera and the position information detected by the millimeter wave radar, information collected by other sensors, or the RCS (Radar Cross Section) and size information of the target, may also be combined to further refine the blind area target detection strategy and improve detection accuracy.
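Purely as an assumption about how such extra cues might be folded in (the weights and the RCS/size gates below are invented for illustration and are not part of the patent), one could blend them into a single plausibility score:

```python
def target_plausibility(camera_match: bool, rcs_dbsm: float, length_m: float,
                        w_camera: float = 0.6, w_rcs: float = 0.25,
                        w_size: float = 0.15) -> float:
    """Combine camera/radar consistency, RCS and estimated size into a 0..1 score;
    a detection could then be kept only when the score exceeds some calibrated cut."""
    rcs_score = 1.0 if rcs_dbsm > -5.0 else 0.0            # strong, vehicle-like reflector
    size_score = 1.0 if 1.0 <= length_m <= 20.0 else 0.0   # plausible vehicle length
    return w_camera * float(camera_match) + w_rcs * rcs_score + w_size * size_score
```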
For details and effects of the second embodiment, reference may be made to the first embodiment, and further description is omitted here.
Example three
The third embodiment of the invention provides a trailer detection device, which is provided with a controller, wherein the controller is used for executing the trailer detection method in the first embodiment.
Fig. 6 is a system architecture diagram of a trailer detection device according to a third embodiment of the present invention. As shown in fig. 6, the controller configured in the trailer detection device of this example, such as the central control module 610, is responsible for executing the trailer detection method of the first embodiment. Preferably, the trailer detection device may further include a camera 620, whose main function is to capture image information within the set range behind the host vehicle, as described in the first embodiment, and transmit it to the central control module 610 for image processing. The trailer detection device 600 may also include associated components 630, for example associated sensors and actuators such as a steering wheel angle sensor, a yaw acceleration sensor and an electronic stability system. The associated components 630 communicate with the central control module 610, so that the image information captured by the camera 620, the information obtained after image processing by the central control module 610, and so on can be obtained from the central control module 610 to assist vehicle operation. For example, the electronic stability system may obtain the azimuth angle r from the central control module 610 for trailer stability control. The associated components 630 may also provide the central control module 610 with information gathered by their sensors to enrich the trailer detection strategy, for example transmitting the steering wheel angle and yaw angular acceleration to the central control module 610 to help determine the driving state of the host vehicle.
For details and effects of the third embodiment, reference may be made to the first embodiment, and further description is omitted here.
Example four
The fourth embodiment of the present invention provides a blind area target detection device, which is configured with a controller, wherein the controller is configured to execute the blind area target detection method described in the second embodiment.
Fig. 7 is a system architecture diagram of a blind area target detection device according to a fourth embodiment of the present invention. As shown in fig. 7, the controller configured in the blind area target detection device of this example, such as the central control module 710, is responsible for executing the blind area target detection method of the second embodiment. Preferably, the blind area target detection device may further include a camera 720 and a millimeter wave radar 740. The camera 720 mainly captures image information within the blind area range of the host vehicle, as described in the second embodiment, and transmits it to the central control module 710 for image processing; the millimeter wave radar 740 mainly detects the position information of targets within the blind area range, as described in the second embodiment, and transmits it to the central control module 710 for comparison with the edge lines. The blind area target detection device 700 may further include associated components 730, whose function is the same as in the third embodiment and is not repeated here. In addition, the blind area target detection device 700 may further include a human-machine interaction module 750 in communication with the central control module 710, which may be used to present warnings from the central control module 710 when a real target is found in the blind area, and to transmit an instruction to switch off the blind area detection function to the central control module 710 in response to a driver operation, and so on.
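To show how the modules of Fig. 7 might interact, the hypothetical processing cycle below wires together the sketches from the earlier embodiments. Every interface here (camera.read(), perception.edge_lines(), radar.tracks(), hmi.warn() and the track fields) is a stand-in invented for illustration, not an API defined by the patent.

```python
def blind_zone_cycle(camera, radar, hmi, perception, classifier, ttc_threshold_s=3.0):
    """One processing cycle of a central control module like 710 (illustrative only)."""
    frame = camera.read()                             # image of the blind-area range
    edge_lines_by_id = perception.edge_lines(frame)   # target id -> edge lines in vehicle frame

    # Embodiment one: decide which image targets are the trailer.
    trailer_ids = {tid for tid, lines in edge_lines_by_id.items()
                   if classifier.update(tid, lines)}

    # Collect ground-plane points of all non-trailer edge lines for the fusion gate.
    other_lines = [[(p.dx, p.dy) for p in line]
                   for tid, lines in edge_lines_by_id.items()
                   if tid not in trailer_ids
                   for line in lines]

    # Embodiment two: confirm each radar track against the camera, then warn via the HMI.
    for track in radar.tracks():
        if is_real_target((track.x, track.y), other_lines) and \
                should_warn(track.range_m, track.closing_speed_mps, ttc_threshold_s):
            hmi.warn(track)
```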
For details and effects of the fourth embodiment, reference may be made to the second embodiment, which will not be repeated herein.
An embodiment of the present invention further provides a machine-readable storage medium storing instructions for enabling a machine to execute the trailer detection method of the first embodiment or the blind area target detection method of the second embodiment.
An embodiment of the invention further provides a processor for running a program which, when run, executes the trailer detection method of the first embodiment or the blind area target detection method of the second embodiment.
Embodiments of the present invention also provide a computer program product which, when executed on a vehicle-related component, is adapted to execute a program that initializes the following method steps: the trailer detection method of the first embodiment or the blind area target detection method of the second embodiment.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, apparatus or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
It should also be noted that the term "comprises/comprising" or any other variation thereof is intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (12)

1. A trailer detection method, comprising:
acquiring image information in a set range behind the vehicle;
performing edge point processing on the image information to obtain an edge line of the target in the set range; and
judging whether any edge line of any target is in a preset trailer area, and if so, determining that the target is a trailer.
2. The trailer detection method of claim 1, further comprising:
setting the trailer area, including:
determining a plurality of calibration points for calibrating the trailer area, wherein the attribute parameters of the plurality of calibration points include a longitudinal distance, a lateral distance, an azimuth angle and a trailer height of the trailer relative to the tail of the host vehicle;
adjusting the range of the attribute parameters of the plurality of calibration points according to the profile information of the trailer, the connection mode between the trailer and the host vehicle, and/or the driving state of the host vehicle; and
determining the trailer area based on the adjusted attribute parameters of the plurality of calibration points.
3. The trailer detection method of claim 2, wherein said determining whether any edge line of any target is in a preset trailer area comprises:
acquiring attribute parameters of edge points on the edge lines, wherein the attribute parameters comprise the longitudinal distance, lateral distance and azimuth angle of the edge points relative to the tail of the host vehicle, and height information of the edge points; and
and in a preset continuous observation period, if the attribute parameters of the edge points on any edge line fall into the range corresponding to the calibration point of the trailer area, determining that the target corresponding to the edge line is the trailer.
4. A blind area target detection method applied to a vehicle that detects a blind area through a millimeter wave radar, characterized by comprising the following steps:
acquiring image information within a blind area range of the host vehicle;
performing edge point processing on the image information to obtain an edge line of the target in the blind area range;
judging whether any edge line of any target is in a preset trailer area, if so, determining that the target is a trailer, and acquiring trailer contour information expressed in the form of edge points; and
after determining that the trailer is present, for a target other than the trailer, performing:
acquiring the position information of the target detected by the millimeter wave radar;
and comparing the position information of the target with the corresponding edge line, and if the position information of the target is consistent with the corresponding edge line, determining that the target is a real target.
5. The blind area target detection method according to claim 4, further comprising:
setting the trailer area, including:
determining a plurality of calibration points for calibrating the trailer area, wherein the attribute parameters of the plurality of calibration points include a longitudinal distance, a lateral distance, an azimuth angle and a trailer height of the trailer relative to the tail of the host vehicle;
adjusting the range of the attribute parameters of the plurality of calibration points according to the profile information of the trailer, the connection mode between the trailer and the host vehicle, and/or the driving state of the host vehicle; and
determining the trailer area based on the adjusted attribute parameters of the plurality of calibration points.
6. The blind area target detection method according to claim 5, wherein said determining whether any edge line of any target is in a preset trailer area comprises:
acquiring attribute parameters of edge points on the edge lines, wherein the attribute parameters comprise the longitudinal distance, lateral distance and azimuth angle of the edge points relative to the tail of the host vehicle, and height information of the edge points; and
and in a preset continuous observation period, if the attribute parameters of the edge points on any edge line fall into the range corresponding to the calibration point of the trailer area, determining that the target corresponding to the edge line is the trailer.
7. The blind area target detection method according to any one of claims 4 to 6, wherein the acquiring of image information within the blind area range of the host vehicle comprises:
acquiring the image information within the blind area range of the host vehicle through a camera mounted on the host vehicle.
8. A trailer detection device, comprising a controller, wherein the controller is configured to perform the trailer detection method of any one of claims 1 to 3.
9. The trailer detection device of claim 8, further comprising:
a camera in communication with the controller, configured to capture image information within the set range behind the host vehicle and transmit the image information to the controller.
10. A blind area target detection device, comprising a controller, wherein the controller is configured to perform the blind area target detection method according to any one of claims 4 to 7.
11. The blind area target detection device according to claim 10, characterized by further comprising:
a camera in communication with the controller, configured to capture image information within the blind area range of the host vehicle and transmit the image information to the controller; and
a millimeter wave radar in communication with the controller, configured to detect the position information of targets within the blind area range and transmit the position information to the controller.
12. A machine-readable storage medium having stored thereon instructions for causing a machine to perform the trailer detection method of any one of claims 1 to 3 and the blind area target detection method of any one of claims 4 to 7.
CN201910924662.0A 2019-09-27 2019-09-27 Trailer and blind area target detection method and device Pending CN110705445A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910924662.0A CN110705445A (en) 2019-09-27 2019-09-27 Trailer and blind area target detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910924662.0A CN110705445A (en) 2019-09-27 2019-09-27 Trailer and blind area target detection method and device

Publications (1)

Publication Number Publication Date
CN110705445A true CN110705445A (en) 2020-01-17

Family

ID=69196812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910924662.0A Pending CN110705445A (en) 2019-09-27 2019-09-27 Trailer and blind area target detection method and device

Country Status (1)

Country Link
CN (1) CN110705445A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111546986A (en) * 2020-04-30 2020-08-18 北京大椽科技有限公司 Trailer panoramic looking-around method
CN111746395A (en) * 2020-06-12 2020-10-09 东风商用车有限公司 Blind area detection method for preventing self-trailer from being mistakenly identified
CN114582165A (en) * 2022-03-02 2022-06-03 浙江海康智联科技有限公司 Collaborative lane change safety auxiliary early warning method and system based on V2X
CN114966673A (en) * 2022-05-31 2022-08-30 上海海拉电子有限公司 Radar-based trailer detection method and system and vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090005932A1 (en) * 2007-06-27 2009-01-01 Gm Global Technology Operations, Inc. Trailer articulation angle estimation
US20160101730A1 (en) * 2014-10-08 2016-04-14 Ford Global Technologies Llc Vehicle blind spot system operation with trailer tow
US20180045823A1 (en) * 2016-08-09 2018-02-15 Delphi Technologies, Inc. Trailer dimension estimation with two dimensional radar and camera
US20180061239A1 (en) * 2016-08-29 2018-03-01 Delphi Technologies, Inc. Camera based trailer identification and blind zone adjustment
US20180121742A1 (en) * 2016-11-02 2018-05-03 Lg Electronics Inc. Apparatus for providing around view image, and vehicle
US20190241126A1 (en) * 2018-02-06 2019-08-08 GM Global Technology Operations LLC Vehicle-trailer rearview vision system and method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090005932A1 (en) * 2007-06-27 2009-01-01 Gm Global Technology Operations, Inc. Trailer articulation angle estimation
US20160101730A1 (en) * 2014-10-08 2016-04-14 Ford Global Technologies Llc Vehicle blind spot system operation with trailer tow
US20180045823A1 (en) * 2016-08-09 2018-02-15 Delphi Technologies, Inc. Trailer dimension estimation with two dimensional radar and camera
US20180061239A1 (en) * 2016-08-29 2018-03-01 Delphi Technologies, Inc. Camera based trailer identification and blind zone adjustment
US20180121742A1 (en) * 2016-11-02 2018-05-03 Lg Electronics Inc. Apparatus for providing around view image, and vehicle
US20190241126A1 (en) * 2018-02-06 2019-08-08 GM Global Technology Operations LLC Vehicle-trailer rearview vision system and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111546986A (en) * 2020-04-30 2020-08-18 北京大椽科技有限公司 Trailer panoramic looking-around method
CN111746395A (en) * 2020-06-12 2020-10-09 东风商用车有限公司 Blind area detection method for preventing self-trailer from being mistakenly identified
CN114582165A (en) * 2022-03-02 2022-06-03 浙江海康智联科技有限公司 Collaborative lane change safety auxiliary early warning method and system based on V2X
CN114966673A (en) * 2022-05-31 2022-08-30 上海海拉电子有限公司 Radar-based trailer detection method and system and vehicle

Similar Documents

Publication Publication Date Title
CN110705445A (en) Trailer and blind area target detection method and device
CN109305165B (en) Intelligent ultrasonic system, vehicle rear collision warning device and control method thereof
JP4211809B2 (en) Object detection device
US9274213B2 (en) Method for calibrating a plurality of environment sensors in a vehicle
KR20200047886A (en) Driver assistance system and control method for the same
KR102352464B1 (en) Driver assistance system and control method for the same
US11351997B2 (en) Collision prediction apparatus and collision prediction method
US20160031371A1 (en) In-vehicle apparatus
US20190065878A1 (en) Fusion of radar and vision sensor systems
KR20200139443A (en) Apparatus and method for driver assistance
CN107886729B (en) Vehicle identification method and device and vehicle
KR20200129374A (en) Vehicle, and control method for the same
US11640172B2 (en) Vehicle controls based on reliability values calculated from infrastructure information
US20230415734A1 (en) Vehicular driving assist system using radar sensors and cameras
CN113050615B (en) Driving safety control method and device, electronic equipment and storage medium
CN116872957A (en) Early warning method and device for intelligent driving vehicle, electronic equipment and storage medium
CN107886036B (en) Vehicle control method and device and vehicle
US20230101872A1 (en) Vehicle and control method thereof
EP3701281B1 (en) Using data from a radar sensor for machine learning based perception
US12043310B2 (en) Driver assistance system and control method for the same
US11798287B2 (en) Driver assistance apparatus and method of controlling the same
US20220024428A1 (en) Vehicle and method of controlling the same
US11667295B2 (en) Apparatus and method for recognizing object
KR102675290B1 (en) Vehicle and control method thereof
JP2020192869A (en) Object detection device in vehicle, drive assistance control device in vehicle, drive assistance system, and object detection method and drive assistance method in vehicle

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination