CN108761479B - Trailer detection method and system and unmanned trailer - Google Patents


Info

Publication number
CN108761479B
CN108761479B (application CN201810578127.XA)
Authority
CN
China
Prior art keywords
trailer
line segment
axis
data
front plane
Prior art date
Legal status
Active
Application number
CN201810578127.XA
Other languages
Chinese (zh)
Other versions
CN108761479A (en)
Inventor
文扬
张丹
Current Assignee
Uisee Technologies Beijing Co Ltd
Original Assignee
Uisee Technologies Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Uisee Technologies Beijing Co Ltd filed Critical Uisee Technologies Beijing Co Ltd
Priority to CN201810578127.XA priority Critical patent/CN108761479B/en
Publication of CN108761479A publication Critical patent/CN108761479A/en
Application granted granted Critical
Publication of CN108761479B publication Critical patent/CN108761479B/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G01S17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The embodiment of the invention provides a trailer detection method and system and an unmanned trailer. The trailer detection method comprises the following steps: collecting trailer data, wherein the trailer data comprises information representing the position of a first plane of the trailer; determining a region of interest in the trailer data according to the position of the first plane in the trailer data when the trailer and the vehicle head are in a normal connection state; determining, in the region of interest, information about a line segment representing the first plane of the trailer based on the information representing the position of the first plane; and determining the position state of the trailer according to the information about the line segment representing the first plane. The technical scheme adopts a non-contact detection mode, so that the service life of the system is significantly prolonged and the detection result is more accurate.

Description

Trailer detection method and system and unmanned trailer
Technical Field
The invention relates to the field of automatic detection, in particular to a trailer detection method and system and an unmanned trailer.
Background
With the development of science and technology, automatic detection has been applied in many fields. For example, contact sensors have been used to detect the position state of a trailer.
In existing trailer detection systems, a contact sensor is typically used to determine whether the trailer is still attached to the vehicle head, and thus whether the trailer has become detached. Because such a system adopts a contact detection mode, it wears easily and has a short service life.
In still other trailer detection systems, the position state of the trailer is determined by calculating the distance between the center of the vehicle head and the center of the trailer. The detection accuracy of such systems is low and can hardly meet the actual requirements of users.
Disclosure of Invention
The present invention has been made in view of the above problems. According to one aspect of the invention, a trailer detection method is provided, comprising:
collecting trailer data, wherein the trailer data comprises information representative of a position of a first plane of a trailer;
determining a region of interest (ROI) in the trailer data according to the position of the first plane in the trailer data when the trailer and a vehicle head are in a normal connection state;
determining, in the ROI, information about a line segment representing the first plane of the trailer based on the information representing the position of the first plane; and
determining the position state of the trailer according to the information about the line segment representing the first plane.
Illustratively, the method further comprises: calibrating the trailer data to construct the trailer data in a vehicle coordinate system, wherein the vehicle coordinate system takes the transverse center of the vehicle head as the origin.
Illustratively, the determining the ROI in the trailer data comprises:
determining a portion of an X-axis of the vehicle coordinate system as an upper edge of the ROIAnd determining a distance D1 between a point P1 of the lower edge of the ROI, which is the largest in distance from the X-axis, and the X-axis according to a distance Dcd between the origin of the car coordinate system and the position of the first plane in the car data when the car is in a normal connection state with the car head, wherein a first threshold value<D1-Dcd<And a second threshold value, wherein the X axis of the vehicle coordinate system is a direction which passes through the origin and is parallel to the transverse direction of the vehicle head.
Illustratively, the lower edge of the ROI is a line segment parallel to the X-axis.
Illustratively, the point P1 is located on a Y-axis of the vehicle coordinate system, wherein the Y-axis of the vehicle coordinate system is a direction passing through the origin and parallel to the vehicle head longitudinal direction, the ROI is a region enclosed by a line segment L11, a line segment L12, a line segment L13, a line segment L14, a line segment L15 and a line segment L16, wherein,
the line segment L11 passes through the point P1, is parallel to the X-axis, is bisected by the Y-axis, and has a length equal to the length of the first plane in the trailer data;
the lower endpoint of the line segment L12 coincides with the left endpoint of the line segment L11, is at an angle of 125 degrees to the line segment L11, and has a length equal to the length of the line segment L11;
the line segment L14 coincides with the X axis, is bisected by the Y axis, and has a length equal to 2 times the length of the line segment L11;
the upper endpoint of the line segment L13 coincides with the left endpoint of the line segment L14, and the lower endpoint of the line segment L13 coincides with the upper endpoint of the line segment L12;
the line segment L15 and the line segment L16 are symmetrical to the line segment L13 and the line segment L12, respectively, about the Y axis.
Illustratively, the ROI is a region surrounded by a line segment L21, a line segment L22, a line segment L23, and a circular arc L24, wherein,
the line segment L21 is coincident with the X axis, bisected by the Y axis of the vehicle coordinate system, and has a length equal to 2 times the length of the first plane in the trailer data, wherein the Y axis of the vehicle coordinate system is the direction through the origin and parallel to the vehicle head longitudinal direction;
the circular arc L24 is a semicircular arc having a center at a point P2 on the Y axis, which is located at a distance from the point P1 equal to the length of the first plane in the trailer data, and having a radius equal to the length of the first plane in the trailer data, and having an opening facing the positive direction of the Y axis;
the upper end point of the line segment L22 coincides with the left end point of the line segment L21, and the lower end point of the line segment L22 coincides with the left end point of the arc L24; and
the line segment L23 is symmetrical to the line segment L22 about the Y-axis.
Illustratively, the trailer data is lidar point cloud data.
Illustratively, determining, in the ROI, information about a line segment representing the first plane of the trailer based on the information representing the position of the first plane includes:
reading laser radar point cloud data in the ROI;
and performing straight line fitting on the read laser radar point cloud data, and determining the information about the line segment representing the first plane according to a straight line fitting result.
Illustratively, the performing a straight line fitting on the read lidar point cloud data and determining the information about the line segment representing the first plane according to a straight line fitting result includes:
performing linear fitting operation for the read laser radar point cloud data;
for the case where the absolute value of the ordinate of the intersection of the straight line obtained by the straight line fitting operation and the Y-axis is greater than or equal to Dcd, determining the line segment representing the first plane from the obtained straight line, wherein the Y-axis is the direction passing through the origin and parallel to the vehicle head longitudinal direction;
for the case where the absolute value of the ordinate of the intersection of the straight line obtained by the straight line fitting operation and the Y-axis is less than Dcd, deleting the lidar point cloud data corresponding to the obtained straight line and performing the next straight line fitting operation on the remaining lidar point cloud data, until no lidar point cloud data remains or the absolute value of the ordinate of the intersection of the obtained straight line and the Y-axis is greater than or equal to Dcd, and determining the line segment representing the first plane according to the finally obtained straight line.
Illustratively, the straight line fitting for the read lidar point cloud data comprises:
and performing straight line fitting on the read laser radar point cloud data by using a random sample consensus (RANSAC) algorithm.
Illustratively, determining the position state of the trailer according to the information about the line segment representing the first plane includes:
counting the number of points in the lidar point cloud data in the ROI whose distance from the line segment representing the first plane of the trailer is smaller than a third threshold; and
determining that the trailer is detached from the vehicle head if the number is smaller than a fourth threshold.
Illustratively, collecting the trailer data comprises: acquiring the trailer data by using a single-line lidar sensor.
Illustratively, determining the position state of the trailer according to the information about the line segment representing the first plane includes:
determining the linear equation of the line segment representing the first plane; and
determining, according to the linear equation, the angle between the line segment representing the first plane and the transverse direction of the vehicle head, so as to determine the posture of the trailer.
According to another aspect of the present invention, there is also provided a trailer detection system comprising:
a data acquisition device for acquiring trailer data, wherein the trailer data comprises information indicative of a position of a first plane of the trailer;
the calculating device is used for determining the ROI in the trailer data according to the position of the first plane in the trailer data when the trailer and the vehicle head are in a normal connection state; determining, in the ROI, information on a line segment representing a first plane of a trailer based on the information representing a position of the first plane; and determining the position state of the trailer according to the information about the line segment representing the first plane.
Illustratively, the data acquisition device is a single line lidar sensor.
Illustratively, the system further comprises a display for displaying the position state of the trailer.
According to yet another aspect of the invention, there is also provided an unmanned trailer comprising the trailer detection system described above.
According to the trailer detection method and system, a plane of the trailer is represented by a line segment within the detection range, and the position state of the trailer is determined accordingly. The trailer detection method and system adopt a non-contact detection mode, so that the service life of the system is remarkably prolonged and the detection result is more accurate. The unmanned trailer can automatically and accurately detect the position state of the trailer by using the trailer detection system.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 shows a schematic flow diagram of a trailer detection method according to one embodiment of the present invention;
FIG. 2 shows a schematic representation of a trailer according to one embodiment of the present invention;
FIG. 3 illustrates trailer data when the trailer and the vehicle head are in a normal connection state, in accordance with one embodiment of the present invention;
FIG. 4 shows calibrated results of the trailer data shown in FIG. 3;
FIG. 5 illustrates an ROI of trailer data according to one embodiment of the invention;
FIG. 6 illustrates trailer data and an ROI therein according to another embodiment of the present invention;
FIG. 7 shows an ROI of trailer data according to a further embodiment of the invention;
FIG. 8 illustrates trailer data and an ROI therein according to one embodiment of the present invention; and
FIG. 9 shows the trailer data and ROI therein according to another embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of embodiments of the invention and not all embodiments of the invention, with the understanding that the invention is not limited to the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the invention described herein without inventive step, shall fall within the scope of protection of the invention.
In the trailer detection method of the embodiment of the invention, a plane of the trailer is represented by a line segment within the detection range, and the position state of the trailer is determined accordingly. Since the trailer has a fixed shape, the plane of the trailer may be a front plane, a side plane, a rear plane, etc. of the trailer, and the positions of these planes can clearly indicate the position of the trailer. It will be appreciated that detection of the front plane is the easiest to implement. For convenience of description, the trailer detection method is described below by taking the front plane of the trailer as an example.
A trailer detection method according to one embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 shows a schematic flow diagram of a trailer detection method 100 according to one embodiment of the present invention. As shown in FIG. 1, the method 100 includes the following steps.
Step S110: collect trailer data. The trailer data includes information about the position state of the trailer, such as information indicative of the position of the front plane of the trailer.
The trailer data may be any suitable data including the above information. For example, the trailer data may be data in various formats such as three-dimensional image data, lidar point cloud data, and the like. In the present embodiment, the trailer detection method 100 is illustrated taking lidar point cloud data as an example.
The trailer data may be collected by a data acquisition device (e.g., a lidar sensor) and transmitted to a computing device (e.g., a processor) for processing. In use, the data acquisition device can be fixed at a specific position on the vehicle head so as to collect trailer data in real time. It will be appreciated that the trailer data may include the position information of all objects within a certain detection range of the data acquisition device. FIG. 2 shows a schematic drawing of a trailer according to one embodiment of the present invention. In the trailer shown in FIG. 2, a lidar sensor is used for trailer data acquisition. The lidar sensor is mounted on the left edge of the vehicle head, for example on a protruding member fixed to the left edge of the vehicle head. The detection range covered by the trailer data is shown by the dashed box in FIG. 2. If the trailer and the vehicle head are in a connected state, point cloud data exists in the trailer data at the position corresponding to the front plane of the trailer, and the position of the front plane can be determined from this point cloud data. Conversely, if the trailer and the vehicle head are in a separated state, the corresponding point cloud data may not exist in the trailer data; in this case, the trailer data includes information indicating that the front plane of the trailer is not within the detection range.
In summary, the information indicating the position of the front plane of the trailer may, in different cases, include either data indicating the specific position of the front plane in the trailer data or information indicating that the front plane is not within the detection range corresponding to the trailer data.
The lidar sensor can detect the position state of the trailer in real time around the clock and is not easily disturbed by the environment. Optionally, the lidar sensor may be a single-line lidar sensor. A single-line lidar sensor meets the application requirements and is inexpensive. In addition, a single-line lidar sensor is typically already present on the unmanned trailer itself and can be shared with other functions. In short, using a single-line lidar sensor reduces the cost of the trailer detection system while ensuring detection accuracy.
In the context of the present application, the front-back direction of the vehicle head is referred to as the longitudinal direction and, correspondingly, the left-right direction of the vehicle head is referred to as the transverse direction. A plane coordinate system is constructed in the plane where the trailer is located; in the trailer data under this plane coordinate system, the point cloud data corresponding to the front plane of the trailer forms an approximate line segment, which can represent the position of the front plane of the trailer.
Step S120: determine an ROI in the trailer data according to the position of the front plane of the trailer in the trailer data when the trailer and the vehicle head are in a normal connection state.
The transverse section of the vehicle head or of the trailer is its cross section, and the transverse direction of the trailer is the left-right direction of the trailer. The longitudinal axis passing through the center of the cross section is referred to as the central longitudinal axis. The plane passing through the central longitudinal axis of the vehicle head or the trailer and perpendicular to the ground is called the vertical plane. The trailer and the vehicle head being in a normal connection state means that the central longitudinal axis of the trailer and the central longitudinal axis of the vehicle head lie in the same vertical plane and that the trailer and the vehicle head are connected. The central longitudinal axes of the trailer and the vehicle head are shown in FIG. 2, where the trailer and the vehicle head are in a normal connection state.
Fig. 3 shows trailer data when the trailer and the vehicle head are in a normal connection state according to one embodiment of the invention. The right angle at the top of FIG. 3 marks the position of the lidar sensor, and the irregular graph below it shows the lidar point cloud data representing the front plane of the trailer. As can be seen from the trailer data shown in FIG. 3, when the trailer and the vehicle head are in a normal connection state, the specific position of the front plane of the trailer in the lidar point cloud data can be determined. Typically, the point cloud data collected by the lidar sensor is expressed in a lidar sensor coordinate system, which takes the lidar sensor as its origin; the two sides of the right angle in FIG. 3 are the X-axis and the Y-axis of the lidar sensor coordinate system.
The ROI can be determined in the trailer data according to the position of the front plane of the trailer in the trailer data when the trailer and the vehicle head are in a normal connection state. In that state, ideal trailer data can be acquired, and the position of the front plane of the trailer in the trailer data is predetermined from this ideal data. The position information of the ROI is then determined from the position of the front plane in the ideal trailer data, and the ROI can be located by this position information in all subsequently collected trailer data to be detected.
The ROI includes at least a portion of the location of the front plane in the trailer data at the aforementioned state. For example, when the trailer and the vehicle head are in a normal connection state, the front plane approximately forms a line segment in the trailer data, that is, the point cloud data corresponding to the front plane, and the ROI may include all or at least a part of the line segment (for example, a middle part of the line segment).
Preferably, while the ROI includes at least a part of the position of the front plane in the trailer data when the trailer and the vehicle head are in a normal connection state, the ROI is as small as possible to avoid an error caused by introducing data corresponding to other planes or structures.
Thus, the position state of the trailer can be determined according to the ROI of the trailer data. In step S130, in the ROI, information on a line segment representing the front plane is determined based on the information representing the position of the front plane of the trailer.
As mentioned above, the point cloud data corresponding to the front plane of the trailer may form an approximate line segment. In other words, the line segment may be used to represent the front plane of the trailer. In step S130, information on the line segment is determined based on the information indicating the position of the front plane of the trailer in the ROI. Corresponding to the information indicating the position of the front plane, the information about the line segment may include information indicating a specific position of the line segment in the ROI and information indicating that the line segment does not exist in the ROI, respectively, in different cases.
In one example, step S130 may include the following sub-steps. First, lidar point cloud data in the ROI is read. Then, straight line fitting is performed on the read lidar point cloud data, and information on a line segment representing the front plane is determined according to the straight line fitting result.
Optionally, a straight line fitting is performed on the read lidar point cloud data by using a RANSAC algorithm. The RANSAC algorithm can ideally fit the laser radar point cloud data, and then more accurate information representing the line segment of the front plane is obtained.
Alternatively, a straight line fitting may also be performed on the read lidar point cloud data using a least squares method.
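As a concrete illustration (not part of the patent text), the following is a minimal sketch of a RANSAC line fit over 2-D points, assuming the ROI point cloud is available as an N x 2 NumPy array in the vehicle coordinate system; the function name fit_line_ransac, the iteration count and the inlier tolerance are illustrative choices.

```python
import numpy as np

def fit_line_ransac(points, n_iters=200, inlier_tol=0.05, rng=None):
    """Fit a 2-D line a*x + b*y + c = 0 (with a^2 + b^2 = 1) to points using RANSAC.

    points: (N, 2) array of lidar returns inside the ROI, vehicle coordinates (meters).
    Returns the best (a, b, c) and the boolean inlier mask, or (None, all-False) if no model is found.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_line = None
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.hypot(d[0], d[1])
        if norm < 1e-9:
            continue                       # degenerate sample: both points coincide
        a, b = -d[1] / norm, d[0] / norm   # unit normal of the line through p and q
        c = -(a * p[0] + b * p[1])
        dist = np.abs(points @ np.array([a, b]) + c)
        inliers = dist < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_line = inliers, (a, b, c)
    return best_line, best_inliers
```

The (a, b, c) form is convenient here because the ordinate of the intersection with the Y-axis, used later to distinguish the front plane from side planes, is simply -c / b.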
Step S140: determine the position state of the trailer according to the information about the line segment determined in step S130.
As described in step S120, the ROI is determined according to the position of the front plane in the trailer data when the trailer is normally connected to the vehicle head. Therefore, if it is determined in step S130 that there is no line segment representing the front plane in the ROI, it may be determined that the trailer is detached from the vehicle head. If it is determined in step S130 that a line segment representing the front plane exists in the ROI, and further a specific position of the line segment in the ROI is determined, a position state of the trailer, such as detachment or connection of the trailer from or to the vehicle head, a posture of the trailer, and the like, may be determined according to the specific position.
The trailer detection method 100 represents a plane of the trailer with a line segment within the detection range and determines the position state of the trailer from that line segment. The trailer detection method 100 adopts a non-contact detection mode, so that the service life of the system is remarkably prolonged and the detection result is more accurate.
It is to be understood that the trailer detection method 100 described above is merely an example and is not a limitation of the present invention. For example, step S120 may be performed prior to step S110, rather than necessarily in the order shown above.
Typically, raw trailer data is expressed in a lidar sensor coordinate system established with the lidar sensor as the origin, which complicates subsequent calculations. Optionally, for convenience of data calculation, the trailer detection method 100 further comprises the following step: calibrating the trailer data to construct the trailer data in a vehicle coordinate system, wherein the vehicle coordinate system takes the transverse center of the vehicle head as the origin. The calibration operation converts the original trailer data in the lidar sensor coordinate system into trailer data in the vehicle coordinate system through operations such as rotation and translation. FIG. 4 shows the result of calibrating the trailer data shown in FIG. 3. The two legs of the right angle in FIG. 4 are the positive X-axis and the positive Y-axis of a rectangular coordinate system. After calibration, the trailer data is more likely to be located in the middle of the coordinate system, as shown in FIG. 4. For example, when the trailer and the vehicle head are in a normal connection state, the point cloud data corresponding to the front plane of the trailer is symmetrical about the Y-axis. The calibration step thus makes the trailer data easier to represent and to compute with.
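For illustration only, a minimal sketch of such a calibration for a planar (single-line) scan, assuming the yaw angle and position of the lidar relative to the vehicle coordinate system have been measured offline; the names and sign conventions are assumptions, not the patent's implementation.

```python
import numpy as np

def lidar_to_vehicle(points_lidar, yaw_rad, t_xy):
    """Rotate and translate 2-D lidar points into the vehicle coordinate system.

    points_lidar: (N, 2) points in the lidar sensor coordinate system.
    yaw_rad: heading of the lidar frame relative to the vehicle frame (measured offline).
    t_xy: (2,) position of the lidar origin in the vehicle frame
          (whose origin is the transverse center of the vehicle head).
    """
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    rotation = np.array([[c, -s], [s, c]])
    return points_lidar @ rotation.T + np.asarray(t_xy)
```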
In the above embodiment, the rectangular coordinate system is used as an example for description, but it will be understood by those skilled in the art that other coordinate systems such as a polar coordinate system may be established to represent the trailer data.
Optionally, step S120 in the trailer detection method 100 comprises: determining a portion of the X-axis of the vehicle coordinate system as the upper edge of the ROI; and determining, according to the distance Dcd between the origin of the vehicle coordinate system and the position of the front plane in the trailer data when the trailer and the vehicle head are in a normal connection state, the distance D1 between the X-axis and the point P1 on the lower edge of the ROI that is farthest from the X-axis in the vehicle coordinate system, wherein first threshold < D1 - Dcd < second threshold, and the X-axis of the vehicle coordinate system is the direction passing through the origin and parallel to the transverse direction of the vehicle head.
The front plane of the trailer corresponds to some point cloud data in the trailer data, and this point cloud data approximately forms a line segment. The distance Dcd is the distance between the origin and this line segment; it roughly determines the span of the ROI in the longitudinal direction. Dcd can be measured in advance when the trailer is normally connected to the vehicle head, or set empirically.
FIG. 5 illustrates an ROI of the trailer data, indicated by the shaded portion, according to one embodiment of the present invention. The irregular pattern in the lower part of FIG. 5 is the lidar point cloud data corresponding to the front plane of the trailer. The distances Dcd and D1 are also shown in FIG. 5. Requiring the difference between D1 and Dcd to be greater than the first threshold allows for a certain error in the trailer data and ensures that point cloud data corresponding to the front plane of the trailer falls within the ROI whenever the trailer and the vehicle head are connected; even if there is some angle between the front plane of the trailer and the X-axis, for example 10 degrees when the trailer is turning, some point cloud data corresponding to the front plane remains within the ROI. Requiring the difference between D1 and Dcd to be smaller than the second threshold keeps the ROI small, effectively preventing invalid data from entering the computation and safeguarding the accuracy of the trailer detection result.
Optionally, the lower edge of the ROI is a line segment parallel to the X-axis. Thus, the ordinate of all points in the lower edge of the ROI is equal to the ordinate of the aforementioned point P1. For example, the ROI may be a rectangle. If the lower edge of the ROI is a line segment, trailer data in the ROI can be read more simply, and convenience is brought to trailer detection.
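A possible sketch of how such a rectangular ROI could be applied, assuming the points are already calibrated to the vehicle coordinate system and the front plane lies on the negative-Y side of the origin; the 0.10 m margin (which must lie between the first and second thresholds) and the half-width are illustrative values, not taken from the patent.

```python
import numpy as np

def rectangular_roi_mask(points, dcd, margin=0.10, half_width=1.5):
    """Select points inside a rectangular ROI whose upper edge lies on the X-axis.

    points: (N, 2) calibrated points in the vehicle coordinate system.
    dcd: origin-to-front-plane distance in the normal connection state (meters).
    margin: D1 - Dcd; assumed to lie between the first and second thresholds.
    half_width: half of the ROI width along the X-axis (placeholder value).
    """
    d1 = dcd + margin                                  # absolute ordinate of the lower edge
    x, y = points[:, 0], points[:, 1]
    # Assumed convention: the trailer front plane lies on the negative-Y side of the origin.
    return (y <= 0.0) & (y >= -d1) & (np.abs(x) <= half_width)
```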
FIG. 6 illustrates an ROI of trailer data according to another embodiment of the invention, where the irregular pattern in the lower part of FIG. 6 is lidar point cloud data. A detailed procedure for obtaining this ROI in step S120 is described below. Step S120 may include the following steps.
Step S121: determine the intersection point P1 of the lower edge of the ROI and the Y-axis of the vehicle coordinate system according to the distance Dcd between the origin of the vehicle coordinate system and the position of the front plane of the trailer in the trailer data when the trailer and the vehicle head are in a normal connection state. The Y-axis of the vehicle coordinate system is the direction passing through the origin and parallel to the longitudinal direction of the vehicle head. Specifically, the center point of the front plane of the trailer in the normal connection state can be translated downwards along the Y-axis by a distance corresponding to 8 to 12 cm in the world coordinate system to obtain the point P1. A line L11 parallel to the X-axis is then drawn through the point P1; a portion of this line L11 is a portion of the lower edge of the ROI.
Step S122: draw a circle Cir1 with the origin of the vehicle coordinate system as its center and the distance from the point P1 to the origin as its radius, and select on the circle Cir1 a point P2 located at 225 degrees and a point P3 located at 315 degrees.
Step S123: draw a tangent L12 and a tangent L16 of the circle Cir1 through the point P2 and the point P3, respectively, to obtain the intersection P4 of the tangent L12 with the line L11 and the intersection P5 of the tangent L16 with the line L11. The length of the line segment with the points P4 and P5 as endpoints determined in this manner is substantially equal to the length of the front plane of the trailer in the trailer data.
Step S124: take a point P6 on the tangent L12 such that the distance from the point P2 to the point P6 equals the distance from the point P2 to the point P4, and take a point P7 on the tangent L16 such that the distance from the point P3 to the point P7 equals the distance from the point P3 to the point P5.
Step S125: draw straight lines L13 and L15 parallel to the Y-axis through the point P6 and the point P7, respectively, and obtain the intersection points P8 and P9 of these two lines with the X-axis.
Step S126: connect the points P8, P6, P4, P5, P7 and P9 to form a polygonal region, which is taken as the ROI, as shown in FIG. 6.
As shown in FIG. 6, the intersection of the lower edge of the ROI with the Y-axis is the point P1. The ROI is the region enclosed by the line segments L11, L12, L13, L14, L15 and L16. The line segment L11 passes through the point P1, is parallel to the X-axis, is bisected by the Y-axis, and has a length substantially equal to the length of the front plane in the trailer data. The lower endpoint of the line segment L12 coincides with the left endpoint of the line segment L11, makes an angle of 125 degrees with the line segment L11, and has a length equal to that of the line segment L11. The line segment L14 coincides with the X-axis, is bisected by the Y-axis, and has a length equal to 2 times the length of the line segment L11. The upper endpoint of the line segment L13 coincides with the left endpoint of the line segment L14, and the lower endpoint of the line segment L13 coincides with the upper endpoint of the line segment L12. The line segments L15 and L16 are symmetrical to the line segments L13 and L12, respectively, about the Y-axis.
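The construction of steps S121 to S126 can be sketched as follows; this is an illustration under the stated geometry (Dcd plus an assumed 0.10 m shift gives P1, tangent points at 225 and 315 degrees), with matplotlib's Path used only for the point-in-polygon test. It is not the patent's reference implementation.

```python
import numpy as np
from matplotlib.path import Path

def build_polygon_roi(dcd, delta=0.10):
    """Polygonal ROI following steps S121-S126 (FIG. 6), in the vehicle coordinate system.

    dcd: origin-to-front-plane distance in the normal connection state (meters).
    delta: downward shift used to obtain P1 in step S121 (the 8-12 cm value; 0.10 m assumed).
    Returns a matplotlib Path whose interior is the ROI.
    """
    r = dcd + delta                                    # |OP1| = radius of the circle Cir1
    p2 = r * np.array([np.cos(np.deg2rad(225.0)), np.sin(np.deg2rad(225.0))])
    p3 = r * np.array([np.cos(np.deg2rad(315.0)), np.sin(np.deg2rad(315.0))])

    def tangent_hits_l11(p):
        # The tangent of Cir1 at p is perpendicular to the radius; intersect it with y = -r (line L11).
        direction = np.array([-p[1], p[0]]) / r
        t = (-r - p[1]) / direction[1]
        return p + t * direction

    p4 = tangent_hits_l11(p2)                          # tangent L12 meets L11
    p5 = tangent_hits_l11(p3)                          # tangent L16 meets L11
    p6 = 2.0 * p2 - p4                                 # |P2P6| = |P2P4|, on the other side of P2
    p7 = 2.0 * p3 - p5                                 # |P3P7| = |P3P5|
    p8 = np.array([p6[0], 0.0])                        # feet of the vertical lines L13, L15 on the X-axis
    p9 = np.array([p7[0], 0.0])
    vertices = np.array([p8, p6, p4, p5, p7, p9, p8])  # repeat P8 to close the polygon
    return Path(vertices, closed=True)

# Usage sketch: keep only the ROI points before line fitting.
# roi_mask = build_polygon_roi(dcd=1.5).contains_points(points_vehicle)
```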
This ROI eliminates a large amount of invalid data, reduces the amount of calculation, and preserves a sufficient number of valid data points, thereby significantly improving the accuracy of trailer detection.
It will be understood by those skilled in the art that although a specific implementation of step S120 of obtaining the ROI shown in fig. 6 is described in detail above, it is only illustrative and not limiting to the invention. For example, the ROI may be determined by first determining the line segment L14 and then sequentially determining other line segments from the line segment L14.
FIG. 7 shows an ROI of trailer data according to yet another embodiment of the invention. As shown in FIG. 7, this ROI is the region enclosed by the line segments L21, L22 and L23 and the circular arc L24. The line segment L21 coincides with the X-axis, is bisected by the Y-axis, and has a length equal to 2 times the length of the front plane in the trailer data. The circular arc L24 is a semicircular arc whose center is a point P2 on the Y-axis located at a distance from the point P1 equal to the length of the front plane in the trailer data, whose radius equals the length of the front plane in the trailer data, and whose opening faces the positive direction of the Y-axis. The upper endpoint of the line segment L22 coincides with the left endpoint of the line segment L21, and the lower endpoint of the line segment L22 coincides with the left endpoint of the arc L24; it will be appreciated that the line segment L22 is tangent to the circle on which the arc L24 lies. The line segment L23 is symmetrical to the line segment L22 about the Y-axis.
This ROI is easier to construct; it still eliminates invalid data to some extent, reduces the amount of calculation, and preserves a sufficient number of valid data points, thereby improving the accuracy of trailer detection.
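A corresponding sketch for the FIG. 7 region, again assuming vehicle-frame points with the trailer on the negative-Y side; w denotes the front-plane length taken from the normal-connection trailer data, and d1 the absolute ordinate of P1. Names and conventions are illustrative.

```python
import numpy as np

def arc_roi_mask(points, d1, w):
    """ROI of FIG. 7: upper edge on the X-axis, sides at x = +/-w, lower boundary a semicircular arc.

    points: (N, 2) points in the vehicle coordinate system (trailer on the negative-Y side).
    d1: absolute ordinate of P1, the lowest point of the ROI (D1 = Dcd + margin).
    w: length of the front plane in the trailer data (meters).
    """
    x, y = points[:, 0], points[:, 1]
    yc = -d1 + w                                       # arc center P2, located w above P1 on the Y-axis
    inside_band = (np.abs(x) <= w) & (y <= 0.0)        # between the side segments L22/L23 and below L21
    above_arc = y >= yc - np.sqrt(np.maximum(w ** 2 - x ** 2, 0.0))
    return inside_band & above_arc
```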
After the ROI is determined in step S120, information about a line segment representing the front plane of the trailer is determined in the ROI based on the information representing the position of the front plane. For example, straight line fitting is performed on the lidar point cloud data, and the information about the line segment representing the front plane is determined from the fitting result. It will be appreciated that point cloud data corresponding to several different planes may be included within the ROI. For example, FIG. 8 illustrates trailer data and an ROI therein according to one embodiment of the present invention, using the ROI shown in FIG. 6 as an example. As shown in FIG. 8, the ROI includes point cloud data corresponding to a side plane of the trailer in addition to the point cloud data corresponding to the front plane. Point cloud data corresponding to planes other than the front plane appears more readily when the trailer and the vehicle head are not in a normal connection state. During execution of the trailer detection method, a corresponding line segment can be fitted for each such plane. However, for the line segment representing the front plane, the absolute value of the ordinate of its intersection with the Y-axis is smallest, namely equal to Dcd, exactly when the trailer and the vehicle head are in a normal connection state; any fitted line whose Y-intercept has an absolute value smaller than Dcd therefore cannot correspond to the front plane. Based on this rule, step S130 can be implemented by the following steps.
First, a straight line fitting operation is performed once for the lidar point cloud data read in the ROI. This time straight line fitting operation can obtain a straight line corresponding to one plane.
And determining the intersection point of the straight line obtained by the straight line fitting operation and the Y axis of the vehicle coordinate system. As mentioned above, the Y-axis of the vehicle coordinate system is the direction passing through the origin of the vehicle coordinate system and parallel to the longitudinal direction of the vehicle head.
For the case where the absolute value of the ordinate of the intersection is greater than or equal to the distance Dcd between the origin of the vehicle coordinate system and the position of the front plane in the trailer data in the normal connection state, the obtained straight line is considered to correspond to the front plane of the trailer. FIG. 9 shows trailer data and the ROI therein according to another embodiment of the invention. When the trailer and the vehicle head are in a normal connection state, the point cloud data corresponding to the front plane lies between the point P1 and the point P12. For the straight line L31 obtained by the straight line fitting operation, the intersection with the Y-axis is the point P11. The absolute value of the ordinate of the point P11 is greater than the absolute value of the ordinate of the point P1, so it is necessarily greater than Dcd. The straight line L31 can therefore be considered to represent the front plane of the trailer, and the line segment representing the front plane, together with its position information, is obtained from it. Step S130 ends at this point, and step S140 is executed next.
Assume instead that the straight line obtained by the current straight line fitting operation is the straight line L32; the handling then differs from that of the straight line L31 above. The absolute value of the ordinate of the intersection point P10 of the straight line L32 with the Y-axis is smaller than the absolute value of the ordinate of the point P12, so it is necessarily smaller than Dcd. In this case, the lidar point cloud data corresponding to the obtained straight line L32 is deleted and the next straight line fitting operation is performed on the remaining lidar point cloud data. The straight line fitting operation is repeated until no lidar point cloud data remains (so that no further fitting is possible), or until the absolute value of the ordinate of the intersection of the obtained straight line with the Y-axis is greater than or equal to Dcd (in the embodiment of FIG. 9, the aforementioned straight line L31), in which case the line segment representing the front plane is determined from the last obtained straight line.
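The fit-and-filter loop described above can be sketched as follows, reusing the illustrative fit_line_ransac helper from the earlier sketch; the minimum point count and inlier tolerance are assumptions, not values from the patent.

```python
import numpy as np

def find_front_plane_line(points_roi, dcd, inlier_tol=0.05, min_points=10):
    """Repeat line fitting until the fitted line's |Y-intercept| >= Dcd, or the data runs out.

    points_roi: (N, 2) lidar points inside the ROI, vehicle coordinates.
    Returns (a, b, c) for the front-plane line, or None if no such line is found.
    """
    pts = points_roi.copy()
    while len(pts) >= min_points:
        line, inliers = fit_line_ransac(pts, inlier_tol=inlier_tol)
        if line is None:
            return None
        a, b, c = line
        if abs(b) < 1e-9:                 # line nearly parallel to the Y-axis: cannot be the front plane
            pts = pts[~inliers]
            continue
        y_intercept = -c / b              # ordinate of the intersection with the Y-axis (x = 0)
        if abs(y_intercept) >= dcd:
            return line                   # this line represents the front plane
        pts = pts[~inliers]               # treat it as a side plane: discard its points and refit
    return None
```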
This implementation of step S130 effectively singles out the line segment that actually represents the front plane of the trailer in the trailer data, avoids interference from straight lines corresponding to the side planes of the trailer and the like, and thus ensures the accuracy of trailer detection at the level of the data source.
After determining information on a line segment representing the front plane of the trailer in step S130, the position state of the trailer is determined from the information on the aforementioned line segment in step S140.
Optionally, step S140 includes the following sub-steps:
First, the number of points in the lidar point cloud data in the ROI whose distance from the line segment representing the front plane of the trailer is smaller than a third threshold is counted. If the distance of a point from the line segment is smaller than the third threshold, the point can be considered to lie on the line segment, i.e., the point was produced by the front plane of the trailer.
Then, if the number of counted points is smaller than a fourth threshold, it is determined that the trailer is detached from the vehicle head; otherwise, the trailer can be considered to be connected to the vehicle head.
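A sketch of this counting test, using the (a, b, c) line representation from the earlier sketches; the third and fourth thresholds (here 0.05 m and 20 points) are illustrative, not values taken from the patent.

```python
import numpy as np

def trailer_detached(points_roi, line, dist_tol=0.05, min_count=20):
    """Decide detachment by counting ROI points close to the front-plane line.

    dist_tol plays the role of the third threshold, min_count that of the fourth threshold.
    """
    if line is None:                       # no front-plane line segment was found in the ROI
        return True
    a, b, c = line
    dist = np.abs(points_roi @ np.array([a, b]) + c)
    return int((dist < dist_tol).sum()) < min_count
```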
It will be appreciated that in some cases, the information about the line segment representing the front plane of the trailer indicates that no line segment representing the front plane of the trailer is currently present in the ROI, from which it can be directly determined that the trailer is detached from the vehicle head.
Through the steps, whether the trailer is separated from the vehicle head or not can be accurately and automatically detected through limited calculation.
Optionally, step S140 further comprises the following sub-steps:
first, from the information on the line segment representing the front plane of the trailer, a straight line equation where the line segment representing the front plane of the trailer is located is determined.
Then, according to the linear equation, the angle between the line segment representing the front plane of the trailer and the transverse direction of the vehicle head is determined, so as to determine the posture of the trailer.
Through these two sub-steps, the relative angle between the trailer and the vehicle head can be detected accurately and automatically with a limited amount of calculation, and whether the trailer is turning can be judged automatically on that basis.
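A sketch of the angle computation for the posture estimate, again using the (a, b, c) line representation from the earlier sketches; folding the result into [-90, 90) degrees is an illustrative convention.

```python
import numpy as np

def trailer_yaw_deg(line):
    """Angle between the front-plane line a*x + b*y + c = 0 and the transverse (X) direction.

    0 degrees means the trailer front plane is parallel to the transverse axis of the vehicle head.
    """
    a, b, _ = line
    angle = np.degrees(np.arctan2(-a, b))   # direction vector of the line is (b, -a)
    return (angle + 90.0) % 180.0 - 90.0    # fold into [-90, 90) as a turning angle
```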
It will be appreciated by those skilled in the art that although the trailer detection method is described in the above embodiments taking lidar point cloud data as the trailer data, the method may also be implemented based on three-dimensional image data. In three-dimensional image data expressed in the aforementioned vehicle coordinate system, the front plane of the trailer may directly correspond to a line segment, and the position state of the trailer can be determined from information about this line segment, for example from the length of the line segment within the ROI and its angle to the X-axis. If the line segment within the ROI is short or absent, the trailer is considered to have detached from the vehicle head.
According to another aspect of the invention, a trailer detection system is also provided. The trailer detection system comprises a data acquisition device and a computing device. The data acquisition device is used for acquiring trailer data, wherein the trailer data comprises information representing the position of a first plane of the trailer. The computing device is used for determining an ROI in the trailer data according to the position of the first plane in the trailer data when the trailer and the vehicle head are in a normal connection state; determining, in the ROI, information about a line segment representing the first plane of the trailer based on the information representing the position of the first plane; and determining the position state of the trailer according to the information about the line segment representing the first plane.
Optionally, the computing device is further configured to implement other steps of the trailer detection method.
Optionally, the data acquisition device is a single line lidar sensor.
Optionally, the system further comprises a display for displaying the position state of the trailer. For example, the trailer is displayed in a different color depending on its state: when the trailer is detached from the vehicle head, the trailer is displayed in red; when the trailer is connected to the vehicle head, the trailer is displayed in green. For another example, the display is used to display the angle of the front plane of the trailer with respect to the transverse direction of the vehicle head.
The specific implementation of the trailer detection system and its technical effects can be understood by those skilled in the art from the above description, and for the sake of brevity, a detailed description is omitted here.
According to yet another aspect of the invention, there is also provided an unmanned trailer. The unmanned trailer comprises the trailer detection system described above. The unmanned trailer can use this trailer detection system to detect the position state of the trailer automatically and accurately, and moreover at a relatively low cost.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some of the modules in a trailer bed detection system according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (11)

1. A trailer detection method, comprising:
collecting trailer data, wherein the trailer data comprises information representing the position of a front plane of the trailer, and the collected trailer data comprises at least one of laser radar point cloud data corresponding to the front plane and a side plane of the trailer;
determining an area of interest based on Dcd, wherein the area of interest is under a vehicle coordinate system, an origin of the vehicle coordinate system is a center of a transverse direction of a vehicle head, an X axis is a direction passing through the origin and parallel to the transverse direction of the vehicle head, a Y axis is a direction passing through the origin and parallel to a longitudinal direction of the vehicle head, the Dcd is a distance between the origin of the vehicle coordinate system and a position of a front plane in trailer data when a trailer and the vehicle head are in a normal connection state, an upper edge of the area of interest is a part of the X axis, a distance between a point P1 with the largest distance from the X axis in a lower edge of the area of interest and the X axis is D1, and a first threshold value < D1-Dcd < a second threshold value;
reading laser radar point cloud data in the region of interest;
performing linear fitting operation for the read laser radar point cloud data;
determining a line segment representing the front plane from the straight line obtained in the straight line fitting operation for a case where an absolute value of a vertical coordinate of an intersection of the straight line obtained in the straight line fitting operation and the Y axis is greater than or equal to Dcd;
for the case that the absolute value of the ordinate of the intersection point of the straight line and the Y axis obtained by the straight line fitting operation is less than Dcd, deleting the lidar point cloud data corresponding to the obtained straight line and performing the next straight line fitting operation on the remaining lidar point cloud data until the lidar point cloud data does not exist or the absolute value of the ordinate of the intersection point of the obtained straight line and the Y axis is greater than or equal to Dcd and determining a line segment representing the front plane according to the straight line obtained last;
and determining the posture of the trailer according to the line segment representing the front plane.
2. The method of claim 1, wherein a lower edge of the region of interest is a line segment parallel to the X-axis.
3. The method as claimed in claim 2, wherein the point P1 lies on a Y-axis of the vehicle coordinate system, wherein the Y-axis of the vehicle coordinate system is a direction passing through the origin and parallel to the vehicle head longitudinal direction, the region of interest is a region enclosed by a line segment L11, a line segment L12, a line segment L13, a line segment L14, a line segment L15 and a line segment L16, wherein,
the line segment L11 passes through the point P1, is parallel to the X axis, is bisected by the Y axis, and has a length equal to the length of the front plane in the trailer data;
a lower endpoint of the line segment L12 coincides with a left endpoint of the line segment L11, the line segment L12 forms an angle of 125 degrees with the line segment L11, and the length of the line segment L12 is equal to the length of the line segment L11;
the line segment L14 coincides with the X axis, is bisected by the Y axis, and has a length equal to 2 times the length of the line segment L11;
the upper endpoint of the line segment L13 coincides with the left endpoint of the line segment L14, and the lower endpoint of the line segment L13 coincides with the upper endpoint of the line segment L12;
the line segment L15 and the line segment L16 are symmetrical to the line segment L13 and the line segment L12, respectively, about the Y axis.
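Purely as a geometric aid, the six-sided region of claims 2 and 3 can be constructed and queried as follows. The helper names, the use of matplotlib's Path for the point-in-polygon test, and the sign convention (trailer on the negative Y side of the vehicle frame) are assumptions of this sketch.

    import numpy as np
    from matplotlib.path import Path

    def hexagon_roi_vertices(front_plane_len, d1):
        # Vertices of the six-sided ROI of claims 2-3, traced L14 -> L13 -> L12 -> L11 -> L16 -> L15
        w = front_plane_len
        ang = np.deg2rad(125.0)                              # angle between L12 and L11
        l12_top = (-w / 2 + w * np.cos(ang), -d1 + w * np.sin(ang))
        return [(-w, 0.0),                                   # left end of L14 (upper edge on the X axis)
                l12_top,                                      # lower end of L13 / upper end of L12
                (-w / 2, -d1),                                # left end of L11 (L11 passes through P1)
                (w / 2, -d1),                                 # right end of L11
                (-l12_top[0], l12_top[1]),                    # mirror of the L12 top about the Y axis
                (w, 0.0)]                                     # right end of L14

    def hexagon_roi_mask(points, front_plane_len, d1):
        # Boolean mask selecting the lidar points that fall inside the hexagonal region of interest
        return Path(hexagon_roi_vertices(front_plane_len, d1)).contains_points(points)

Bound to concrete values of the front-plane length and D1 (for example via functools.partial), hexagon_roi_mask can serve as the roi_mask callable in the sketch after claim 1.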
4. The method of claim 1, wherein the region of interest is a region enclosed by a line segment L21, a line segment L22, a line segment L23, and a circular arc L24, wherein,
the line segment L21 coincides with the X axis, is bisected by the Y axis of the vehicle coordinate system, the Y axis being the direction passing through the origin and parallel to the longitudinal direction of the vehicle head, and has a length equal to 2 times the length of the front plane in the trailer data;
the circular arc L24 is a semicircular arc whose center is a point P2 on the Y axis located at a distance from the point P1 equal to the length of the front plane in the trailer data, whose radius is equal to the length of the front plane in the trailer data, and whose opening faces the positive direction of the Y axis;
the upper end point of the line segment L22 coincides with the left end point of the line segment L21, and the lower end point of the line segment L22 coincides with the left end point of the arc L24; and
the line segment L23 is symmetrical to the line segment L22 about the Y axis.
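The arc-bounded region of claim 4 admits an even more direct mask, written here under the same assumed sign convention; arc_roi_mask is a hypothetical name introduced only for this sketch.

    import numpy as np

    def arc_roi_mask(points, front_plane_len, d1):
        # ROI of claim 4: bounded above by L21 on the X axis, laterally by L22 and L23,
        # and below by the semicircular arc L24. P1 = (0, -d1) is the lowest point of the arc;
        # its center P2 = (0, -d1 + w) lies one front-plane length above P1 on the Y axis.
        w = front_plane_len
        x, y = points[:, 0], points[:, 1]
        y_p2 = -d1 + w
        in_band = (np.abs(x) <= w) & (y <= 0.0) & (y >= y_p2)        # between the X axis and the chord
        in_cap = (y < y_p2) & (x ** 2 + (y - y_p2) ** 2 <= w ** 2)   # below the chord, inside the circle
        return in_band | in_cap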
5. The method of claim 1, wherein the fitting of a straight line to the read lidar point cloud data comprises:
performing straight-line fitting on the read lidar point cloud data by using a random sample consensus (RANSAC) algorithm.
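Claim 5 names a random sample consensus fit; a bare-bones version for the model y = a*x + b might look like the following, where the trial count and inlier tolerance are illustrative values rather than the patent's.

    import numpy as np

    def ransac_line(points, n_trials=200, inlier_tol=0.05, rng=None):
        # points: Nx2 array; returns (a, b, inlier_mask) for the model y = a*x + b
        rng = np.random.default_rng() if rng is None else rng
        best_mask = None
        for _ in range(n_trials):
            i, j = rng.choice(len(points), size=2, replace=False)
            (x1, y1), (x2, y2) = points[i], points[j]
            if abs(x2 - x1) < 1e-9:
                continue                                  # skip samples that give a near-vertical line
            a = (y2 - y1) / (x2 - x1)
            b = y1 - a * x1
            # distance of every point to the line a*x - y + b = 0
            dist = np.abs(a * points[:, 0] - points[:, 1] + b) / np.hypot(a, 1.0)
            mask = dist < inlier_tol
            if best_mask is None or mask.sum() > best_mask.sum():
                best_mask = mask
        if best_mask is None or best_mask.sum() < 2:
            raise ValueError("no usable straight line found")
        # refine with a least-squares fit over the inliers of the best trial
        a, b = np.polyfit(points[best_mask, 0], points[best_mask, 1], deg=1)
        return a, b, best_mask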
6. The method of claim 1, further comprising:
counting, among the lidar point cloud data in the region of interest, the number of points whose distance from the line segment representing the front plane of the trailer is smaller than a third threshold; and
determining that the trailer has separated from the vehicle head in the case where the counted number is smaller than a fourth threshold.
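The separation test of claim 6 reduces to counting inliers of the fitted line; a direct rendering follows, with placeholder values standing in for the third and fourth thresholds.

    import numpy as np

    def trailer_separated(roi_points, a, b, third_threshold=0.1, fourth_threshold=20):
        # roi_points: Nx2 lidar points inside the region of interest
        # (a, b): fitted front-plane line y = a*x + b
        dist = np.abs(a * roi_points[:, 0] - roi_points[:, 1] + b) / np.hypot(a, 1.0)
        close_points = int((dist < third_threshold).sum())   # points closer than the third threshold
        return close_points < fourth_threshold               # fewer than the fourth threshold: separated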
7. The method of claim 1, wherein the collecting of the trailer data comprises: acquiring the trailer data by using a single-line lidar sensor.
8. The method of any one of claims 1 to 4, wherein said determining the attitude of the trailer from the line segment representing the front plane comprises:
determining the equation of the straight line on which the line segment representing the front plane lies; and
determining, according to the straight-line equation, the angle between the line segment representing the front plane and the transverse direction of the vehicle head, so as to determine the attitude of the trailer.
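Once the straight-line equation y = a*x + b is known, the angle of claim 8 is the arctangent of the slope, because the X axis of the vehicle coordinate system is the transverse direction of the vehicle head; a one-line sketch under that convention:

    import numpy as np

    def trailer_attitude_deg(a):
        # angle between the fitted front-plane line and the transverse (X) direction of the vehicle head
        return float(np.degrees(np.arctan(a)))   # 0 degrees when the trailer sits squarely behind the head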
9. A trailer detection system, comprising:
a lidar sensor for acquiring trailer data, wherein the trailer data comprises information representing the position of a front plane of a trailer, and the acquired trailer data comprises lidar point cloud data corresponding to at least one of the front plane and a side plane of the trailer; and
computing means for:
determining a region of interest based on Dcd, wherein the region of interest is defined in a vehicle coordinate system whose origin is the center of the vehicle head in its transverse direction, whose X axis passes through the origin parallel to the transverse direction of the vehicle head, and whose Y axis passes through the origin parallel to the longitudinal direction of the vehicle head; Dcd is the distance between the origin of the vehicle coordinate system and the position of the front plane in the trailer data when the trailer and the vehicle head are in a normal connection state; an upper edge of the region of interest is a part of the X axis; a point P1, being the point on a lower edge of the region of interest farthest from the X axis, lies at a distance D1 from the X axis; and a first threshold < D1 - Dcd < a second threshold;
reading the lidar point cloud data in the region of interest;
performing a straight-line fitting operation on the read lidar point cloud data;
in the case where the absolute value of the ordinate of the intersection of the straight line obtained by the straight-line fitting operation with the Y axis is greater than or equal to Dcd, determining a line segment representing the front plane from that straight line;
in the case where the absolute value of the ordinate of the intersection of the obtained straight line with the Y axis is less than Dcd, deleting the lidar point cloud data corresponding to the obtained straight line and performing the next straight-line fitting operation on the remaining lidar point cloud data, until either no lidar point cloud data remains or the absolute value of the ordinate of the intersection of the obtained straight line with the Y axis is greater than or equal to Dcd, and determining the line segment representing the front plane from the last straight line obtained; and
determining the attitude of the trailer according to the line segment representing the front plane.
10. The system of claim 9, further comprising a display for displaying the position state of the trailer.
11. An unmanned trailer comprising the trailer detection system of claim 9 or claim 10.
CN201810578127.XA 2018-06-07 2018-06-07 Trailer detection method and system and unmanned trailer Active CN108761479B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810578127.XA CN108761479B (en) 2018-06-07 2018-06-07 Trailer detection method and system and unmanned trailer

Publications (2)

Publication Number Publication Date
CN108761479A CN108761479A (en) 2018-11-06
CN108761479B (en) 2021-07-02

Family

ID=64000307

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810578127.XA Active CN108761479B (en) 2018-06-07 2018-06-07 Trailer detection method and system and unmanned trailer

Country Status (1)

Country Link
CN (1) CN108761479B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116184417A (en) * 2018-12-10 2023-05-30 北京图森智途科技有限公司 Trailer pinch angle measuring method and device and vehicle
CN110850431A (en) * 2019-11-25 2020-02-28 盟识(上海)科技有限公司 System and method for measuring trailer deflection angle
CN116424331B (en) * 2023-06-13 2023-09-22 九曜智能科技(浙江)有限公司 Tractor, docking method of towed target and electronic equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2286461Y (en) * 1996-11-19 1998-07-22 毕福利 Alarming device for trailer drawed by automobile when fault occurs
DE102005042957A1 (en) * 2005-09-09 2007-03-22 Daimlerchrysler Ag Drawbar angle and trailer angle determining method, involves determining geometric ratios and/or relative position of characteristics edges and lines of drawbar and front side of trailer to find drawbar angle and trailer angle
JP2007161148A (en) * 2005-12-15 2007-06-28 Nissan Motor Co Ltd Drive control device of rear side vehicle in combination vehicle
DE102014007898A1 (en) * 2014-05-27 2015-12-03 Man Truck & Bus Ag Method and driver assistance system for supporting a commercial vehicle team
CN104048646B (en) * 2014-06-27 2015-08-12 山东世纪矿山机电有限公司 Based on Derail detector and the method for laser image measurement
CN105223583B (en) * 2015-09-10 2017-06-13 清华大学 A kind of target vehicle course angle computational methods based on three-dimensional laser radar
CN106225723B (en) * 2016-07-25 2019-03-29 浙江零跑科技有限公司 A kind of hinged angle measuring method of multiple row vehicle based on backsight binocular camera
US20180068566A1 (en) * 2016-09-08 2018-03-08 Delphi Technologies, Inc. Trailer lane departure warning and sway alert

Also Published As

Publication number Publication date
CN108761479A (en) 2018-11-06

Similar Documents

Publication Publication Date Title
CN108761479B (en) Trailer detection method and system and unmanned trailer
JP6008605B2 (en) TOWED VEHICLE WHEELBASE DETECTING DEVICE AND VEHICLE CONTROL DEVICE HAVING TOWED VEHICLE WHEELBASE DETECTING DEVICE
CN106485949B (en) The sensor of video camera and V2V data for vehicle merges
CN107850446B (en) Self-position estimating device and self-position estimate method
JP2018092483A (en) Object recognition device
US9613421B2 (en) Optical tracking
CN110673107B (en) Road edge detection method and device based on multi-line laser radar
JP2020113268A (en) Method for calculating tow hitch position
CN104913775B (en) Measurement method, unmanned plane localization method and the device of unmanned plane distance away the ground
WO2015182147A1 (en) Stereo camera device and vehicle provided with stereo camera device
KR20190082298A (en) Self-calibration sensor system for wheel vehicles
CN112525147B (en) Distance measurement method for automatic driving equipment and related device
WO2019021876A1 (en) In-vehicle camera calibration device and method
JP5310027B2 (en) Lane recognition device and lane recognition method
CN113870357A (en) Camera external parameter calibration method and device, sensing equipment and storage medium
CN109883432B (en) Position determination method, device, equipment and computer readable storage medium
CN106803066B (en) Vehicle yaw angle determination method based on Hough transformation
CN111671360B (en) Sweeping robot position calculating method and device and sweeping robot
CN105043341B (en) The measuring method and device of unmanned plane distance away the ground
JP2018084987A (en) Object recognition device
JP4978938B2 (en) Distance measuring apparatus and method, and computer program
CN114644014A (en) Intelligent driving method based on lane line and related equipment
CN112183157A (en) Road geometry identification method and device
CN111256651A (en) Week vehicle distance measuring method and device based on monocular vehicle-mounted camera
CN114863089A (en) Automatic acquisition method, device, medium and equipment for automatic driving perception data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 1261871; Country of ref document: HK)
GR01 Patent grant