WO2020206639A1 - Target object fitting method, point cloud sensor and mobile platform - Google Patents

Target object fitting method, point cloud sensor and mobile platform

Info

Publication number
WO2020206639A1
WO2020206639A1 (PCT/CN2019/082119)
Authority
WO
WIPO (PCT)
Prior art keywords
angle
fitting
point cloud
visible
area
Prior art date
Application number
PCT/CN2019/082119
Other languages
English (en)
Chinese (zh)
Inventor
李星河
邱凡
刘寒颖
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2019/082119 priority Critical patent/WO2020206639A1/fr
Priority to CN201980005593.1A priority patent/CN111316289A/zh
Publication of WO2020206639A1 publication Critical patent/WO2020206639A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/255 - Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/06 - Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle

Definitions

  • This application relates to the field of autonomous driving, and in particular to a method for fitting a target object, a point cloud sensor and a mobile platform.
  • A vehicle point cloud is a set of discrete points. In actual processing, the point cloud determined to belong to the target vehicle often needs to be fitted, so that the orientation and size of the vehicle can be determined from the fitted result and the tracking task can be completed.
  • the embodiments of the present application provide a method for fitting a target object, a point cloud sensor and a mobile platform, which improve the accuracy of fitting the target object.
  • In a first aspect, an embodiment of the present application provides a method for fitting a target object, including: generating a two-dimensional point cloud in a top view according to a point cloud of the target object collected by a point cloud sensor, where the point cloud sensor is configured to be mounted on a mobile platform; determining the visible area of the two-dimensional point cloud facing the mobile platform; and determining a fitting feature of the target object according to the visible area and fitting the target object according to the fitting feature.
  • Determining the fitting feature of the target object according to the visible area includes: determining the fitting feature according to the visible area and the two-dimensional point cloud.
  • In one implementation, the fitting feature is the fitting angle of the target object, and determining the fitting feature according to the visible area and the two-dimensional point cloud includes: determining the fitting angle according to the visible area, the two-dimensional point cloud, and a preset angle range.
  • Determining the fitting angle according to the visible area, the two-dimensional point cloud, and the preset angle range includes: determining, according to the visible area, the two-dimensional point cloud, and the preset angle range, the fitting angle with the smallest corresponding visible edge cost.
  • Further, determining the fitting angle according to the visible area, the two-dimensional point cloud, and the preset angle range includes: selecting at least one reference angle from the preset angle range; and determining, according to the at least one reference angle, the visible area, and the two-dimensional point cloud, the fitting angle with the smallest corresponding visible edge cost.
  • In one implementation, determining the fitting angle with the smallest corresponding visible edge cost includes: for any first reference angle among the multiple reference angles, determining, according to the visible area and the two-dimensional point cloud, a first preselected angle with the smallest visible edge cost within a first neighborhood range of the first reference angle, where the first neighborhood range includes the first reference angle; and determining the preselected angle with the smallest corresponding visible edge cost among the preselected angles as the fitting angle.
  • Determining the first preselected angle with the smallest visible edge cost within the first neighborhood range of the first reference angle includes: obtaining a first visible edge cost corresponding to the first reference angle according to the visible area and the two-dimensional point cloud; updating the first reference angle to a first angle within the first neighborhood range, and obtaining a second visible edge cost corresponding to the first angle according to the visible area and the two-dimensional point cloud; updating the first angle to a second angle within the first neighborhood range according to the relationship between the second visible edge cost and the first visible edge cost; and repeating the operations of obtaining the visible edge costs corresponding to the angles within the first neighborhood range and updating the angle within the first neighborhood range according to the relationship between two adjacent visible edge costs, until the first preselected angle with the smallest visible edge cost is obtained.
  • In another implementation, determining the fitting angle with the smallest corresponding visible edge cost includes: determining, according to the visible area and the two-dimensional point cloud, the target reference angle with the smallest corresponding visible edge cost among the multiple reference angles; and determining, according to the visible area and the two-dimensional point cloud, the fitting angle with the smallest visible edge cost within a second neighborhood range of the target reference angle, where the second neighborhood range includes the target reference angle.
  • Determining the fitting angle includes: acquiring a plurality of sub-reference angles from within the second neighborhood range of the target reference angle; and determining, according to the visible area and the two-dimensional point cloud, the sub-reference angle with the smallest corresponding visible edge cost as the fitting angle.
  • Obtaining the first visible edge cost corresponding to the first reference angle includes: obtaining the first visible edge cost according to a first outer envelope rectangle of the two-dimensional point cloud in a first direction, the visible area, and the first boundary and second boundary corresponding to the visible area; where the first direction is the direction obtained by rotating by the first reference angle from the coordinate origin of the point cloud sensor coordinate system, and the coordinate origin of the point cloud sensor coordinate system is the position of the point cloud sensor.
  • Obtaining the first visible edge cost includes: obtaining a first area of a first region bounded by the visible area, the first boundary, and the second boundary; obtaining a second area of a second region bounded by the side of the first outer envelope rectangle facing the mobile platform, the first boundary, and the second boundary; obtaining a third area of a third region bounded by the sides of the first outer envelope rectangle lying outside the target area, the first boundary, and the second boundary, where the target area is the area between the first boundary and the second boundary and the target area includes the two-dimensional point cloud; and determining the first visible edge cost according to the first area, the second area, and the third area.
  • Determining the first visible edge cost includes: obtaining a first preselected visible edge cost according to the first area, the second area, and a first preset weight; obtaining a second preselected visible edge cost according to the third area and a second preset weight; and determining the first visible edge cost according to the first preselected visible edge cost and the second preselected visible edge cost.
  • The angles included in the preset angle range are angles between a first preset angle and a second preset angle, the difference between the first preset angle and the second preset angle is 90°, and the preset angle range includes the first preset angle and/or the second preset angle.
  • The fitting feature includes at least one of the following: a fitting angle of the target object, a fitting direction of the target object, a fitted bounding box of the target object, and a fitted visible edge of the target object; the fitting direction is the direction obtained by rotating by the fitting angle from the coordinate origin of the point cloud sensor coordinate system, and the coordinate origin of the point cloud sensor coordinate system is the position of the point cloud sensor.
  • Fitting the target object according to the fitting feature includes: obtaining a fitting height of the target object according to the point cloud of the target object; and fitting the target object according to the fitting feature and the fitting height.
  • When the fitting feature includes at least one of the following: the fitting angle of the target object, the fitting direction of the target object, and the fitted visible edge of the target object, fitting the target object according to the fitting feature and the fitting height includes: obtaining the fitted bounding box of the target object according to the fitting feature; and fitting the target object according to the fitted bounding box and the fitting height.
  • Obtaining the fitting height of the target object according to the point cloud of the target object includes: obtaining a fitted maximum height and a fitted minimum height of the point cloud of the target object; and obtaining the fitting height according to the fitted maximum height and the fitted minimum height.
  • an embodiment of the present application provides a point cloud sensor, which is used to be mounted on a mobile platform, and the point cloud sensor includes:
  • The processor is communicatively connected to the collector and is configured to execute the method described in the first aspect and any possible implementation of the first aspect. It can be understood that, in the second aspect, "generate a two-dimensional point cloud in the top view according to the point cloud of the target object collected by the point cloud sensor" means "generate a two-dimensional point cloud in the top view according to the point cloud of the target object collected by the collector of the point cloud sensor".
  • In a third aspect, an embodiment of the present application provides a mobile platform, including: a point cloud sensor for collecting the point cloud of the target object, where the point cloud sensor is mounted on the mobile platform; and a processor, communicatively connected to the point cloud sensor, configured to execute the method described in the first aspect and any possible implementation of the first aspect.
  • an embodiment of the present application provides a mobile platform, characterized in that the point cloud sensor described in the second aspect is mounted on the mobile platform.
  • An embodiment of the present application provides a computer-readable storage medium, including a program or instruction; when the program or instruction runs on a computer, the method described in the first aspect and any possible implementation of the first aspect is executed.
  • An embodiment of the present invention provides a computer-readable storage medium storing a computer program; the computer program includes at least one piece of code, and the at least one piece of code can be executed by a computer to control the computer to execute the method described in the first aspect and any possible implementation of the first aspect.
  • an embodiment of the present invention provides a computer program, when the computer program is executed by a computer, it is used to execute the foregoing first aspect and the method described in any possible implementation manner of the first aspect.
  • the program may be stored in whole or in part on a storage medium that is packaged with the processor, and may also be stored in part or all in a storage medium that is not packaged with the processor.
  • the storage medium is, for example, a memory.
  • Since only one or two surfaces of the target object, namely those facing the point cloud sensor or the mobile platform carrying the point cloud sensor, can be observed by the point cloud sensor, while the two or three surfaces facing away from the point cloud sensor or the mobile platform cannot, this application fits the target object based on what the point cloud sensor actually observes, that is, according to the visible area of the target object's two-dimensional point cloud facing the mobile platform. This can improve the accuracy of fitting the target object.
  • Figure 1 is a schematic diagram of an application scenario provided by an embodiment of the application
  • FIG. 2 is a first flowchart of a method for fitting a target object provided by an embodiment of the application
  • FIG. 3 is a schematic diagram of the visible area facing the mobile platform in the two-dimensional point cloud provided by an embodiment of the application;
  • FIG. 4 is a first schematic diagram of the fitted visible edges of a target object provided by an embodiment of this application.
  • FIG. 5 is a second schematic diagram of the fitted visible edges of a target object provided by an embodiment of this application.
  • FIG. 6 is a third schematic diagram of the fitted visible edges of a target object provided by an embodiment of this application.
  • FIG. 7 is a second flowchart of a method for fitting a target object provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of obtaining visible edge cost provided by an embodiment of the application.
  • FIG. 9 is a schematic structural diagram of a point cloud sensor provided by an embodiment of the application.
  • Fig. 10 is a schematic structural diagram of a mobile platform provided by an embodiment of the application.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the application.
  • A point cloud sensor 12 is mounted on a mobile platform 11.
  • the point cloud sensor 12 is used to collect point cloud data of the target object, and the point cloud data of the target object is used to fit the target object.
  • FIG. 2 is a first flowchart of a method for fitting a target object provided by an embodiment of the application. Referring to Fig. 2, the method of this embodiment includes:
  • Step S101 Generate a two-dimensional point cloud in the top view according to the point cloud of the target object collected by the point cloud sensor, where the point cloud sensor is used to be mounted on the mobile platform.
  • The point cloud of the target object collected by the point cloud sensor is a three-dimensional point cloud.
  • The three-dimensional point cloud can be projected along the vertical direction onto a plane to obtain a two-dimensional point cloud; that is, the two-dimensional point cloud in the top view is generated according to the point cloud of the target object.
  • the target object is a vehicle driving on the road
  • the collected three-dimensional point cloud of the vehicle can be projected on the plane of the road along the vertical direction, so as to obtain the two-dimensional point cloud in the top view.
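  • For illustration only, the following is a minimal sketch of this top-view projection, assuming the point cloud is given as an N x 3 NumPy array in a sensor frame whose z axis points vertically upward (the array layout and function name are illustrative, not part of this application):

```python
import numpy as np

def project_to_top_view(points_xyz: np.ndarray) -> np.ndarray:
    """Project a 3-D point cloud onto the horizontal plane (top view).

    Assumes points_xyz is an (N, 3) array whose z axis is vertical, so the
    top-view projection simply discards the vertical coordinate and keeps
    the (x, y) coordinates as the two-dimensional point cloud.
    """
    return points_xyz[:, :2].copy()
```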
  • the point cloud sensor in this embodiment may be a TOF sensor or a lidar.
  • the mobile platform in this embodiment may be a vehicle, and the target object may be other vehicles driving on the road.
  • the vehicle is equipped with a lidar, which may be one or more lidars, and may be a rotating lidar or a solid-state lidar. The lidar is used to obtain three-dimensional point cloud information around the vehicle.
  • Step S102 Determine the visible area facing the mobile platform in the two-dimensional point cloud.
  • Specifically, this embodiment performs fitting based on the faces of the target object that can be observed by the point cloud sensor; that is, the visible area facing the mobile platform in the two-dimensional point cloud is determined, and the target object is fitted according to that visible area.
  • The visible area of the two-dimensional point cloud of the target object facing the mobile platform is the part of the two-dimensional point cloud of the target object that lies within the sensing range and field of view of the point cloud sensor.
  • FIG. 3 is a schematic diagram of a visible area facing a mobile platform in a two-dimensional point cloud provided by an embodiment of the application.
  • Fig. 3 which illustrates some points in the two-dimensional point cloud.
  • the coordinate origin O shown in Fig. 3 is the position of the point cloud sensor
  • The irregular figure 21 in Fig. 3 is the contour or convex hull of the two-dimensional point cloud of the target object, and the thickened part 211 of that contour or convex hull is the visible area facing the mobile platform in the two-dimensional point cloud.
  • OA and OB are the boundaries corresponding to the visible area facing the mobile platform in the two-dimensional point cloud.
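  • As one possible (illustrative) way to realise Step S102, the visible area can be approximated by the convex-hull edges of the top-view cloud that face the sensor origin O, and the boundaries OA and OB by the rays through the hull vertices of extreme polar angle. The sketch below assumes SciPy is available and that the sensor sits at the coordinate origin; it is not the only way to determine the visible area.

```python
import numpy as np
from scipy.spatial import ConvexHull

def _cross2(u, v):
    # 2-D cross product (scalar z component).
    return u[0] * v[1] - u[1] * v[0]

def visible_area(points_2d: np.ndarray, sensor_xy=(0.0, 0.0)):
    """Approximate the visible area facing the sensor and the boundaries OA/OB.

    Returns the list of convex-hull edges that face the sensor (the thickened
    part 211 in Fig. 3) and the two extreme hull vertices defining the
    boundaries OA and OB.
    """
    o = np.asarray(sensor_xy, dtype=float)
    hull = ConvexHull(points_2d)
    verts = points_2d[hull.vertices]            # counter-clockwise order
    n = len(verts)

    visible_edges = []
    for i in range(n):
        p, q = verts[i], verts[(i + 1) % n]
        # For a CCW hull the interior lies to the left of each directed edge,
        # so the edge faces the sensor when the sensor is strictly on the right.
        if _cross2(q - p, o - p) < 0:
            visible_edges.append((p, q))

    # Boundary points A and B: hull vertices with extreme polar angle seen
    # from the sensor (a simplification that ignores angular wrap-around).
    angles = np.arctan2(verts[:, 1] - o[1], verts[:, 0] - o[0])
    boundary_a = verts[np.argmin(angles)]
    boundary_b = verts[np.argmax(angles)]
    return visible_edges, boundary_a, boundary_b
```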
  • Step S103 Determine the fitting feature of the target object according to the visible area facing the mobile platform in the two-dimensional point cloud of the target object, and fit the target object according to the fitting feature.
  • the fitting feature of the target object can be determined according to the visible area, and the target object can be fitted according to the fitting feature.
  • fitting the target object according to the fitting feature includes:
  • the fitted maximum height and the fitted minimum height of the point cloud of the target object are obtained, and the fitted height of the target object is obtained according to the fitted maximum height and the fitted minimum height.
  • the difference between the maximum fitting height and the minimum fitting height is the fitting height of the target object.
  • The maximum fitting height can be the height of the highest point in the point cloud of the target object, or a maximum height obtained by hierarchical aggregation, for example fitted from the heights of all points within a layer of the point cloud of a certain thickness; it can be understood that this layer of a certain thickness contains the highest point in the point cloud of the target object.
  • the minimum fitting height can be the height of the point with the smallest height in the point cloud of the target object, or the minimum height obtained according to hierarchical aggregation, such as fitting according to the height of all points in a point cloud with a certain thickness. It can be understood that the point cloud of a certain thickness includes the point with the smallest height in the point cloud of the target object.
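  • A minimal sketch of the fitting height computation described above, with an optional layer thickness standing in for the hierarchical aggregation (function and parameter names are illustrative):

```python
import numpy as np

def fitting_height(points_xyz: np.ndarray, layer_thickness=None) -> float:
    """Fitting height = fitted maximum height minus fitted minimum height.

    With layer_thickness=None the extreme z values are used directly.
    Otherwise the heights inside the top and bottom layers of the given
    thickness are averaged, a simple stand-in for hierarchical aggregation.
    """
    z = points_xyz[:, 2]
    if layer_thickness is None:
        z_max, z_min = z.max(), z.min()
    else:
        z_max = z[z >= z.max() - layer_thickness].mean()
        z_min = z[z <= z.min() + layer_thickness].mean()
    return float(z_max - z_min)
```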
  • the fitting feature of the target object may include at least one of the following: fitting angle of the target object, fitting direction of the target object, fitting bounding box of the target object, and fitting visible edge of the target object.
  • the fitting feature of the target object may also be other features, which is not limited in this embodiment.
  • When the fitting feature of the target object is the fitting angle of the target object, fitting the target object according to the fitting feature and the fitting height of the target object involves the following: the direction obtained by rotating by the fitting angle from the coordinate origin of the point cloud sensor coordinate system is the fitting direction of the target object, where the coordinate origin of the point cloud sensor coordinate system is the position of the point cloud sensor.
  • The preset angle range and the preselected angle range may be the same or different. The angles included in the preset angle range are angles between a first preset angle and a second preset angle, where the second preset angle minus the first preset angle equals 90° and the preset angle range includes the first preset angle and/or the second preset angle. The angles included in the preselected angle range are angles between a third preset angle and a fourth preset angle, where the fourth preset angle minus the third preset angle equals 90° and the preselected angle range includes the third preset angle and/or the fourth preset angle.
  • the pre-selected angle range is obtained according to the preset angle range.
  • the fitting angle of the target object is the angle corresponding to the minimum cost of the visible edge.
  • each angle corresponds to a visible edge cost.
  • The visible edge cost is a parameter that characterizes both the degree of fit between the two-dimensional point cloud of the target object and its outer envelope rectangle in the direction corresponding to the angle, and the degree to which the outer envelope rectangle overflows the target area; the direction corresponding to an angle is the direction obtained by rotating by that angle from the coordinate origin of the point cloud sensor coordinate system.
  • the target area is the area between the first boundary and the second boundary corresponding to the visible area facing the mobile platform in the two-dimensional point cloud of the target object, and the target area includes the two-dimensional point cloud of the target object; further, the The first boundary and the second boundary can surround the two-dimensional point cloud, and both the first boundary and the second boundary have intersection points with the convex hull of the two-dimensional point cloud.
  • OA in Figure 3 is the first boundary corresponding to the visible area facing the mobile platform in the two-dimensional point cloud of the target object, and OB is the second boundary.
  • The outer envelope rectangle of the two-dimensional point cloud of the target object in the direction corresponding to the angle satisfies the following conditions: two sides of the outer envelope rectangle are parallel to the direction corresponding to the angle, and the other two sides are perpendicular to that direction.
  • the outer envelope rectangle of the two-dimensional point cloud of the target object in the fitting direction is obtained, and the outer envelope rectangle is the fitting bounding box of the target object.
  • the outer envelope rectangle of the two-dimensional point cloud of the target object in the fitting direction satisfies the following conditions: the two sides of the outer envelope rectangle are in the same direction as the fitting direction of the target object, and the outer envelope The direction of the other two sides of the rectangle is perpendicular to the fitting direction of the target object.
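  • The outer envelope rectangle in a given direction can be computed by rotating the two-dimensional point cloud into a frame aligned with that direction, taking the axis-aligned bounding box there, and rotating the corners back. A minimal sketch (assuming NumPy and an angle measured in degrees about the sensor origin):

```python
import numpy as np

def outer_envelope_rectangle(points_2d: np.ndarray, angle_deg: float) -> np.ndarray:
    """Outer envelope rectangle of a 2-D point cloud in the direction
    corresponding to angle_deg.

    Two sides of the rectangle are parallel to that direction and the other
    two are perpendicular to it.  Returns the four corners as a (4, 2) array
    in counter-clockwise order, expressed in the original coordinate frame.
    """
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # Rotate the points by -angle so the fitting direction becomes the x axis.
    local = points_2d @ rot
    lo, hi = local.min(axis=0), local.max(axis=0)
    corners_local = np.array([[lo[0], lo[1]], [hi[0], lo[1]],
                              [hi[0], hi[1]], [lo[0], hi[1]]])
    # Rotate the axis-aligned corners back into the original frame.
    return corners_local @ rot.T
```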
  • In the process of obtaining the fitted bounding box of the target object according to the fitting direction, the fitted visible edge of the target object may first be obtained according to the fitting direction, and then, using the fitted visible edge as an edge, the outer envelope rectangle of the two-dimensional point cloud of the target object in the fitting direction is obtained.
  • The fitted visible edge of the target object can be referred to as the L-shaped feature or the I-shaped feature of the target object. It is understandable that the fitted visible edge satisfies the following conditions: (1) the two-dimensional point cloud of the target object is located on the same side of the fitted visible edge, and the convex hull of the two-dimensional point cloud has intersection points with the fitted visible edge; (2) when there are two fitted visible edges, one of them is parallel to the fitting direction of the target object and the other is perpendicular to it; when there is one fitted visible edge, that edge is parallel to the fitting direction of the target object. A sketch of selecting such edges is given below.
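  • As referenced above, one simple (illustrative) way to pick the fitted visible edge(s) is to keep the side(s) of the fitted bounding box that face the sensor origin, yielding the L-shaped feature when two sides are visible and the I-shaped feature when only one is:

```python
import numpy as np

def fitted_visible_edges(rect_corners: np.ndarray, sensor_xy=(0.0, 0.0)):
    """Select the rectangle side(s) facing the sensor.

    rect_corners: (4, 2) corners of the fitted bounding box in
    counter-clockwise order.  Two returned edges correspond to the L-shaped
    feature, a single returned edge to the I-shaped feature.
    """
    o = np.asarray(sensor_xy, dtype=float)
    edges = []
    for i in range(4):
        p, q = rect_corners[i], rect_corners[(i + 1) % 4]
        # The sensor lies on the outer side of a CCW edge when it is on the right.
        if (q[0] - p[0]) * (o[1] - p[1]) - (q[1] - p[1]) * (o[0] - p[0]) < 0:
            edges.append((p, q))
    return edges   # one edge -> I-shaped feature, two edges -> L-shaped feature
```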
  • FIG. 4 is the first schematic diagram of the fitted visible edges of the target object provided by an embodiment of this application.
  • FIG. 5 is the second schematic diagram of the fitted visible edges of the target object provided by an embodiment of this application.
  • FIG. 6 is the third schematic diagram of the fitted visible edges of the target object provided by an embodiment of this application.
  • 41 and 42 are the fitted visible edges of the target object, and the fitted visible edges at this time are the L-shaped features of the target object.
  • 51 is the fitted visible edge of the target object, and the fitted visible edge at this time is the I-type feature of the target object.
  • 61 is the fitted visible edge of the target object, and the fitted visible edge at this time is the I-type feature of the target object.
  • When the fitting feature of the target object is the fitting direction of the target object, the fitting feature is acquired according to the visible area of the two-dimensional point cloud of the target object facing the mobile platform, and the target object is fitted according to the fitting feature and the fitting height of the target object.
  • When the fitting feature of the target object is the fitted bounding box of the target object, the fitting feature is determined according to the visible area of the two-dimensional point cloud of the target object facing the mobile platform, and fitting the target object according to the fitting feature and the fitting height includes: fitting the target object according to the fitted bounding box of the target object and the fitting height of the target object.
  • When the fitting feature of the target object is the fitted visible edge of the target object, the fitting feature is obtained according to the visible area of the two-dimensional point cloud of the target object facing the mobile platform, and the target object is fitted according to the fitting feature and the fitting height of the target object.
  • In summary, in this embodiment the target object is fitted based on what the point cloud sensor can observe of the target object, that is, according to the visible area facing the mobile platform in the two-dimensional point cloud of the target object, which can improve the accuracy of fitting the target object.
  • FIG. 7 is the second flowchart of the target object fitting method provided by the embodiment of the application. Referring to FIG. 7, the method of this embodiment includes:
  • Step S201 Select at least one reference angle from a preset angle range.
  • the meaning of the preset angle range in this embodiment is the same as the meaning of the preset angle range in the previous embodiment.
  • At least one reference angle refers to one or more reference angles.
  • In one manner, the reference angle can be any angle within the preset angle range; for example, if the preset angle range is [0°, 90°], one reference angle may be 45°.
  • In another manner, reference angles can be selected at equal intervals within the preset angle range to obtain multiple reference angles.
  • the multiple reference angles may include: 0°, 10°, 20°, 30°, 40°, 50°, 60°, 70°, 80°, 90°
  • multiple reference angles can include: 0°, 10°, 20°, 30°, 40°, 50°, 60°, 70 °, 80°.
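  • A trivial sketch of sampling reference angles at equal intervals within the preset range (the defaults reproduce the 0° to 90° example above; names are illustrative):

```python
import numpy as np

def sample_reference_angles(start_deg=0.0, stop_deg=90.0, step_deg=10.0,
                            include_stop=True) -> np.ndarray:
    """Reference angles picked every step_deg within [start_deg, stop_deg]."""
    angles = np.arange(start_deg, stop_deg, step_deg)
    if include_stop:
        angles = np.append(angles, stop_deg)
    return angles
```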
  • Step S202 According to at least one reference angle, the visible area facing the mobile platform in the two-dimensional point cloud of the target object, and the two-dimensional point cloud, determine the corresponding fitting angle with the least visible edge cost.
  • the following takes multiple reference angles as an example to illustrate the method of “determining the corresponding fitting angle with the least visible edge cost based on the multiple reference angles and the visible area”.
  • For any first reference angle among the multiple reference angles, a first preselected angle with the smallest visible edge cost is determined within a first neighborhood range of the first reference angle, where the first neighborhood range includes the first reference angle.
  • Each reference angle has a neighborhood range, so there are multiple neighborhood ranges, and the union of these neighborhood ranges is the preselected angle range mentioned in the previous embodiment.
  • For example, suppose the preset angle range is [P1°, P2°), [P1°, P2°] or (P1°, P2°], and a reference angle is selected every Δ° within the preset angle range. In one manner, if the first reference angle is N°, the first neighborhood range is (N° - Δ°/2, N° + Δ°/2] or [N° - Δ°/2, N° + Δ°/2); in this case the preset angle range and the preselected angle range are not the same. In another manner, if the first reference angle is N° and N is equal to neither P1 nor P2, the first neighborhood range is (N° - Δ°/2, N° + Δ°/2] or [N° - Δ°/2, N° + Δ°/2); when N is equal to P1, the first neighborhood range is, for example, [N°, N° + Δ°/2], (N°, N° + Δ°/2] or [N°, N° + Δ°/2), and the case where N is equal to P2 is handled similarly.
  • the gradient descent method can be used to determine the first preselected angle with the least visible edge cost within the first neighborhood of the first reference angle, specifically:
  • The method for obtaining the first visible edge cost is as follows: according to the first outer envelope rectangle of the two-dimensional point cloud in the first direction, the visible area, and the first boundary and second boundary corresponding to the visible area, obtain the first visible edge cost corresponding to the first reference angle; the first direction is the direction obtained by rotating by the first reference angle from the coordinate origin of the point cloud sensor coordinate system.
  • the first boundary and the second boundary corresponding to the visible area are the same as those described in the previous embodiment.
  • FIG. 8 is a schematic diagram of obtaining the visible edge cost provided by an embodiment of the application.
  • The process of "obtaining the first visible edge cost corresponding to the first reference angle according to the first outer envelope rectangle of the two-dimensional point cloud in the first direction, the visible area, and the first boundary and second boundary corresponding to the visible area" is described below in conjunction with FIG. 8 and includes the following:
  • OA in FIG. 8 is the first boundary
  • OB is the second boundary
  • the first area is the area 81 in FIG. 8.
  • the first direction is the direction indicated by the arrow in FIG. 8
  • the second area is the area 82 in FIG. 8.
  • the target area includes a two-dimensional point cloud of the target object.
  • the third area is the area 83 in FIG. 8, that is, the area filled with solid lines.
  • determining the visible edge cost corresponding to the first reference angle according to the first area, the second area, and the third area includes:
  • the first preselected visible edge cost is used to characterize the degree of fit between the first outer envelope rectangle and the convex hull of the two-dimensional point cloud.
  • the first preselected visible edge cost is equal to the difference between the first area and the second area multiplied by the first preset weight.
  • the difference between the first area and the second area is the area of the area filled by the dotted line in the figure.
  • the second preselected visible edge cost is used to characterize the overflow degree of the first outer envelope rectangle overflowing the target area.
  • the second preselected visible edge cost is equal to the third area multiplied by the second preset weight.
  • In other words, the visible edge cost is a parameter that combines the degree of fit between the two-dimensional point cloud of the target object and its outer envelope rectangle in the direction corresponding to a certain angle with the degree to which the outer envelope rectangle overflows the target area; a combination sketch is given below.
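  • As referenced above, a hedged sketch of the cost combination follows. It assumes the three regions of FIG. 8 are already available as polygons (here via the shapely library) and that the two preselected costs are simply summed, which the description does not state explicitly:

```python
from shapely.geometry import Polygon

def visible_edge_cost(first_region: Polygon,
                      second_region: Polygon,
                      envelope_rect: Polygon,
                      target_wedge: Polygon,
                      w1: float = 1.0, w2: float = 1.0) -> float:
    """Weighted combination of the three areas sketched in FIG. 8.

    first_region:  region bounded by the visible contour and boundaries OA, OB
    second_region: region bounded by the rectangle's visible side and OA, OB
    envelope_rect: outer envelope rectangle at the candidate angle
    target_wedge:  region between OA and OB that contains the point cloud
    """
    # First preselected cost: (first area - second area) * first preset weight.
    fit_cost = w1 * (first_region.area - second_region.area)
    # Second preselected cost: area of the rectangle outside the target area
    # (the third area) * second preset weight.
    overflow_area = envelope_rect.difference(target_wedge).area
    overflow_cost = w2 * overflow_area
    return fit_cost + overflow_cost
```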
  • the first angle may be an angle greater than the first reference angle, or may be an angle smaller than the first reference angle.
  • the method of obtaining the second visible edge cost corresponding to the first angle is the same as the method of obtaining the first visible edge cost corresponding to the first reference angle, and will not be repeated here.
  • angle update rules are as follows:
  • If the first angle is greater than the first reference angle and the first visible edge cost is greater than the second visible edge cost, the first angle is updated to a second angle smaller than the first angle within the first neighborhood range.
  • If the first angle is greater than the first reference angle and the first visible edge cost is less than the second visible edge cost, the first angle is updated to a second angle larger than the first angle within the first neighborhood range.
  • If the first angle is smaller than the first reference angle and the first visible edge cost is greater than the second visible edge cost, the first angle is updated to a second angle larger than the first angle within the first neighborhood range.
  • If the first angle is smaller than the first reference angle and the first visible edge cost is smaller than the second visible edge cost, the first angle is updated to a second angle smaller than the first angle within the first neighborhood range.
  • The above update rules are a specific implementation of using the gradient descent method to determine the first preselected angle with the smallest visible edge cost within the first neighborhood of the first reference angle; a compact local-search sketch is given below.
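  • As referenced above, a compact local-search sketch in the spirit of the gradient descent described here; cost_fn is assumed to evaluate the visible edge cost of an angle, and the exact step schedule is an illustrative choice, not taken from this application:

```python
def refine_angle_in_neighborhood(cost_fn, reference_deg, half_width_deg,
                                 init_step_deg=1.0, tol_deg=1e-2, max_iters=100):
    """Search the neighborhood of a reference angle for the preselected angle.

    Starting from the reference angle, the search keeps stepping toward the
    neighboring angle with the lower visible edge cost, reversing and halving
    the step when the cost stops decreasing.
    """
    lo, hi = reference_deg - half_width_deg, reference_deg + half_width_deg
    angle, cost = reference_deg, cost_fn(reference_deg)
    step = init_step_deg
    for _ in range(max_iters):
        candidate = min(max(angle + step, lo), hi)   # stay inside the neighborhood
        new_cost = cost_fn(candidate)
        if new_cost < cost:
            angle, cost = candidate, new_cost        # keep moving in this direction
        else:
            step = -step * 0.5                        # reverse and shrink the step
            if abs(step) < tol_deg:
                break
    return angle, cost
```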
  • the preselected angles corresponding to the other reference angles are obtained according to the method of obtaining the first preselected angle corresponding to the first reference angle, and finally multiple preselected angles are obtained.
  • In another implementation, the fitting angle of the target object with the smallest visible edge cost is determined according to the at least one reference angle, the visible area, and the two-dimensional point cloud of the target object as follows:
  • m1. According to the visible area and the two-dimensional point cloud of the target object, determine the target reference angle with the smallest corresponding visible edge cost among the multiple reference angles.
  • the visible edge cost corresponding to each reference angle is obtained, and the reference angle with the smallest corresponding visible edge cost is determined as the target reference angle.
  • obtaining the visible edge cost corresponding to each reference angle refers to the method for obtaining the visible edge cost corresponding to the first reference angle.
  • For example, the second neighborhood range is (S1°, S2°), where S1° is the reference angle that is smaller than the target reference angle and has the smallest absolute difference from it, and S2° is the reference angle that is greater than the target reference angle and has the smallest absolute difference from it.
  • the gradient descent method can be used to determine the fitting angle of the target object with the least visible edge cost within the second neighborhood of the target reference angle based on the visible area and the two-dimensional point cloud of the target object .
  • the specific implementation of this manner refers to the method of determining the first preselected angle in the gradient descent method within the first neighborhood of the first reference angle, which will not be repeated here.
  • Alternatively, multiple sub-reference angles are obtained from within the second neighborhood of the target reference angle, the visible edge cost corresponding to each sub-reference angle is determined according to the visible area and the two-dimensional point cloud of the target object, and the sub-reference angle with the smallest visible edge cost is taken as the fitting angle of the target object.
  • obtaining the visible edge cost corresponding to each sub-reference angle refers to the obtaining method of obtaining the visible edge cost corresponding to the first reference angle, which will not be repeated here.
  • For example, a sub-reference angle may be obtained at equal angular intervals within the second neighborhood range of the target reference angle; a coarse-to-fine sketch of this search is given below.
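  • As referenced above, a coarse-to-fine sketch of the whole search: evaluate the reference angles first, pick the target reference angle, then compare sub-reference angles sampled inside its second neighborhood. The range [0°, 90°) and the step sizes are illustrative defaults:

```python
import numpy as np

def fit_angle_coarse_to_fine(cost_fn, coarse_step_deg=10.0, fine_step_deg=1.0):
    """Two-stage search for the fitting angle with the smallest visible edge cost."""
    coarse = np.arange(0.0, 90.0, coarse_step_deg)
    target = min(coarse, key=cost_fn)                  # target reference angle
    lo = target - coarse_step_deg                       # second neighborhood bounds
    hi = target + coarse_step_deg
    fine = np.arange(lo, hi + fine_step_deg, fine_step_deg)
    return min(fine, key=cost_fn)                       # fitting angle
```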
  • This embodiment provides a specific method for obtaining the fitting angle of the target object, and the method of this embodiment can further improve the accuracy of obtaining the fitting angle of the target object.
  • FIG. 9 is a schematic structural diagram of a point cloud sensor provided by an embodiment of the application.
  • the point cloud sensor provided in this embodiment is used to be mounted on a mobile platform.
  • the point cloud sensor includes a collector 91 and a processor 92.
  • the collector 91 is used to collect the point cloud of the target object
  • the processor 92 is communicatively connected to the collector and is configured to perform the following operations: generate a two-dimensional point cloud in the top view according to the point cloud of the target object collected by the collector; and determine the direction in the two-dimensional point cloud The visible area of the mobile platform; according to the visible area, the fitting feature of the target object is determined, and the target object is fitted according to the fitting feature.
  • The processor 92, when configured to perform the operation of determining the fitting feature of the target object according to the visible area, is specifically configured to: determine the fitting feature according to the visible area and the two-dimensional point cloud.
  • When the fitting feature is the fitting angle of the target object, the processor 92, when configured to perform the operation of determining the fitting feature according to the visible area and the two-dimensional point cloud, is specifically configured to: determine the fitting angle according to the visible area, the two-dimensional point cloud, and a preset angle range.
  • The processor 92, when configured to perform the operation of determining the fitting angle according to the visible area, the two-dimensional point cloud, and the preset angle range, is specifically configured to: determine, according to the visible area, the two-dimensional point cloud, and the preset angle range, the fitting angle with the smallest corresponding visible edge cost.
  • The processor 92, when configured to perform the operation of determining the fitting angle with the smallest corresponding visible edge cost according to the visible area, the two-dimensional point cloud, and the preset angle range, is specifically configured to: select at least one reference angle from the preset angle range; and determine the fitting angle with the smallest corresponding visible edge cost according to the at least one reference angle, the visible area, and the two-dimensional point cloud.
  • The processor 92, when configured to determine the fitting angle with the smallest corresponding visible edge cost according to the at least one reference angle, the visible area, and the two-dimensional point cloud, is specifically configured to: for any first reference angle among the multiple reference angles, determine, according to the visible area and the two-dimensional point cloud, a first preselected angle with the smallest visible edge cost within a first neighborhood range of the first reference angle, where the first neighborhood range includes the first reference angle; and determine the preselected angle with the smallest corresponding visible edge cost among the preselected angles as the fitting angle.
  • The processor 92, when configured to perform the operation of determining the first preselected angle with the smallest visible edge cost within the first neighborhood range of the first reference angle according to the visible area and the two-dimensional point cloud, is specifically configured to: obtain the first visible edge cost corresponding to the first reference angle according to the visible area and the two-dimensional point cloud; update the first reference angle to a first angle within the first neighborhood range, and obtain the second visible edge cost corresponding to the first angle according to the visible area and the two-dimensional point cloud; update the first angle to a second angle within the first neighborhood range according to the relationship between the second visible edge cost and the first visible edge cost; and repeat the operations of obtaining the visible edge costs corresponding to the angles within the first neighborhood range and updating the angle within the first neighborhood range according to the relationship between two adjacent visible edge costs, until the first preselected angle with the smallest visible edge cost is obtained.
  • Alternatively, the processor 92, when configured to determine the fitting angle with the smallest corresponding visible edge cost according to the at least one reference angle, the visible area, and the two-dimensional point cloud, is specifically configured to: determine, according to the visible area and the two-dimensional point cloud, the target reference angle with the smallest corresponding visible edge cost among the multiple reference angles; and determine, according to the visible area and the two-dimensional point cloud, the fitting angle with the smallest corresponding visible edge cost within a second neighborhood range of the target reference angle, where the second neighborhood range includes the target reference angle.
  • The processor 92, when configured to determine the fitting angle with the smallest visible edge cost within the second neighborhood range of the target reference angle according to the visible area and the two-dimensional point cloud, is specifically configured to: obtain multiple sub-reference angles from within the second neighborhood range of the target reference angle; and determine, according to the visible area and the two-dimensional point cloud, the sub-reference angle with the smallest corresponding visible edge cost as the fitting angle.
  • The processor 92, when configured to perform the operation of obtaining the first visible edge cost corresponding to the first reference angle according to the visible area and the two-dimensional point cloud, is specifically configured to: obtain the first visible edge cost according to the first outer envelope rectangle of the two-dimensional point cloud in the first direction, the visible area, and the first boundary and second boundary corresponding to the visible area; where the first direction is the direction obtained by rotating by the first reference angle from the coordinate origin of the point cloud sensor coordinate system, and the coordinate origin of the point cloud sensor coordinate system is the position of the point cloud sensor.
  • The processor 92, when configured to perform the operation of obtaining the first visible edge cost according to the first outer envelope rectangle of the two-dimensional point cloud in the first direction, the visible area, and the first boundary and second boundary corresponding to the visible area, is specifically configured to: obtain a first area of a first region bounded by the visible area, the first boundary, and the second boundary; obtain a second area of a second region bounded by the side of the first outer envelope rectangle facing the mobile platform, the first boundary, and the second boundary; obtain a third area of a third region bounded by the sides of the first outer envelope rectangle lying outside the target area, the first boundary, and the second boundary, where the target area is the area between the first boundary and the second boundary and includes the two-dimensional point cloud of the target object; and determine the first visible edge cost according to the first area, the second area, and the third area.
  • The processor 92, when configured to perform the operation of determining the first visible edge cost corresponding to the first reference angle according to the first area, the second area, and the third area, is specifically configured to: obtain the first preselected visible edge cost according to the first area, the second area, and the first preset weight; obtain the second preselected visible edge cost according to the third area and the second preset weight; and determine the first visible edge cost according to the first preselected visible edge cost and the second preselected visible edge cost.
  • The angles included in the preset angle range are angles between a first preset angle and a second preset angle, the difference between the first preset angle and the second preset angle is 90°, and the preset angle range includes the first preset angle and/or the second preset angle.
  • The fitting feature includes at least one of the following: the fitting angle of the target object, the fitting direction of the target object, the fitted bounding box of the target object, and the fitted visible edge of the target object.
  • The processor 92, when configured to perform the operation of fitting the target object according to the fitting feature, is specifically configured to: obtain the fitting height of the target object according to the point cloud of the target object; and fit the target object according to the fitting feature and the fitting height.
  • When the fitting feature includes at least one of the following: the fitting angle of the target object, the fitting direction of the target object, and the fitted visible edge of the target object, the processor 92, when configured to perform the operation of fitting the target object according to the fitting feature and the fitting height, is specifically configured to: obtain the fitted bounding box of the target object according to the fitting feature; and fit the target object according to the fitted bounding box and the fitting height.
  • The processor 92, when configured to perform the operation of obtaining the fitting height of the target object according to the point cloud of the target object, is specifically configured to: obtain the fitted maximum height and the fitted minimum height of the target object; and obtain the fitting height according to the fitted maximum height and the fitted minimum height.
  • the point cloud sensor in this embodiment can be used to implement the technical solutions in the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 10 is a schematic structural diagram of a mobile platform provided by an embodiment of this application.
  • the mobile platform of this embodiment includes: a point cloud sensor 101 and a processor 102.
  • the point cloud sensor 101 is used to collect the point cloud of the target object; wherein, the point cloud sensor 101 is mounted on the mobile platform;
  • the processor 102 is in communication connection with the point cloud sensor 101, and is configured to perform the following operations: generate a two-dimensional point cloud in the top view according to the point cloud of the target object collected by the point cloud sensor; and determine the two-dimensional point cloud In the visible area facing the mobile platform; according to the visible area, the fitting feature of the target object is determined, and the target object is fitted according to the fitting feature.
  • The processor 102, when configured to perform the operation of determining the fitting feature of the target object according to the visible area, is specifically configured to: determine the fitting feature according to the visible area and the two-dimensional point cloud.
  • When the fitting feature is the fitting angle of the target object, the processor 102, when configured to perform the operation of determining the fitting feature according to the visible area and the two-dimensional point cloud, is specifically configured to: determine the fitting angle according to the visible area, the two-dimensional point cloud, and a preset angle range.
  • The processor 102, when configured to perform the operation of determining the fitting angle according to the visible area, the two-dimensional point cloud, and the preset angle range, is specifically configured to: determine, according to the visible area, the two-dimensional point cloud, and the preset angle range, the fitting angle with the smallest corresponding visible edge cost.
  • The processor 102, when configured to perform the operation of determining the fitting angle with the smallest corresponding visible edge cost according to the visible area, the two-dimensional point cloud, and the preset angle range, is specifically configured to: select at least one reference angle from the preset angle range; and determine the fitting angle with the smallest corresponding visible edge cost according to the at least one reference angle, the visible area, and the two-dimensional point cloud.
  • The processor 102, when configured to determine the fitting angle with the smallest corresponding visible edge cost according to the at least one reference angle, the visible area, and the two-dimensional point cloud, is specifically configured to: for any first reference angle among the multiple reference angles, determine, according to the visible area and the two-dimensional point cloud, a first preselected angle with the smallest visible edge cost within a first neighborhood range of the first reference angle, where the first neighborhood range includes the first reference angle; and determine the preselected angle with the smallest corresponding visible edge cost among the preselected angles as the fitting angle.
  • The processor 102, when configured to perform the operation of determining the first preselected angle with the smallest visible edge cost within the first neighborhood range of the first reference angle according to the visible area and the two-dimensional point cloud, is specifically configured to: obtain the first visible edge cost corresponding to the first reference angle according to the visible area and the two-dimensional point cloud; update the first reference angle to a first angle within the first neighborhood range, and obtain the second visible edge cost corresponding to the first angle according to the visible area and the two-dimensional point cloud; update the first angle to a second angle within the first neighborhood range according to the relationship between the second visible edge cost and the first visible edge cost; and repeat the operations of obtaining the visible edge costs corresponding to the angles within the first neighborhood range and updating the angle within the first neighborhood range according to the relationship between two adjacent visible edge costs, until the first preselected angle with the smallest visible edge cost is obtained.
  • Alternatively, the processor 102, when configured to determine the fitting angle with the smallest corresponding visible edge cost according to the at least one reference angle, the visible area, and the two-dimensional point cloud, is specifically configured to: determine, according to the visible area and the two-dimensional point cloud, the target reference angle with the smallest corresponding visible edge cost among the multiple reference angles; and determine, according to the visible area and the two-dimensional point cloud, the fitting angle with the smallest corresponding visible edge cost within a second neighborhood range of the target reference angle, where the second neighborhood range includes the target reference angle.
  • The processor 102, when configured to determine the fitting angle with the smallest visible edge cost within the second neighborhood range of the target reference angle according to the visible area and the two-dimensional point cloud, is specifically configured to: obtain multiple sub-reference angles from within the second neighborhood range of the target reference angle; and determine, according to the visible area and the two-dimensional point cloud, the sub-reference angle with the smallest corresponding visible edge cost as the fitting angle.
  • the processor 102 when configured to perform the operation of obtaining the first visible edge cost corresponding to the first reference angle according to the visible area and the two-dimensional point cloud, it is specifically configured to: The first outer envelope rectangle of the two-dimensional point cloud in the first direction, the visible area and the first boundary and the second boundary corresponding to the visible area, to obtain the first visible edge cost; wherein, The first direction is a direction corresponding to the rotation of the first reference angle from the coordinate origin of the point cloud sensor coordinate system, and the coordinate origin of the point cloud sensor coordinate system is the position of the point cloud sensor.
  • the processor 102 is configured to execute the first outer envelope rectangle in the first direction according to the two-dimensional point cloud, the visible area and the first boundary and the second boundary corresponding to the visible area.
  • Boundary when the operation of obtaining the first visible edge cost is specifically used to: obtain the first area of the first area defined by the visible area, the first boundary, and the second boundary; The outer envelope rectangle faces the visible side of the mobile platform, and the second area of the second area defined by the first boundary and the second boundary; obtains the side of the first envelope rectangle outside the target area, the A third area of a third area defined by the first boundary and the second boundary, the target area is the area between the first boundary and the second boundary, and the target area includes the two-dimensional point cloud ; According to the first area, the second area and the third area, determine the first visible edge cost.
  • When the processor 102 performs the operation of determining the first visible edge cost corresponding to the first reference angle according to the first area, the second area, and the third area, it is specifically configured to: obtain a first preselected visible edge cost according to the first area, the second area, and a first preset weight; obtain a second preselected visible edge cost according to the third area and a second preset weight; and determine the first visible edge cost according to the first preselected visible edge cost and the second preselected visible edge cost.
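One plausible reading of the envelope-rectangle and area-based cost is sketched below, under explicit assumptions: the rectangle is the axis-aligned extent of the top-view points in a frame rotated by the candidate angle, the first preselected cost is a weighted mismatch between the first area (bounded by the visible area) and the second area (the rectangle's visible side), and the second preselected cost is the weighted third area lying outside the target area. The function and parameter names (`envelope_rectangle`, `w1`, `w2`) are illustrative and not taken from the application.

```python
import numpy as np

def envelope_rectangle(points_xy, theta):
    """Outer envelope rectangle of the top-view point cloud along direction theta."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, s], [-s, c]])        # sensor frame -> candidate frame
    p = points_xy @ rot.T                    # rotate points into the candidate frame
    (xmin, ymin), (xmax, ymax) = p.min(axis=0), p.max(axis=0)
    corners = np.array([[xmin, ymin], [xmax, ymin], [xmax, ymax], [xmin, ymax]])
    return corners @ rot                     # corners back in the sensor frame

def first_visible_edge_cost(area_1, area_2, area_3, w1=1.0, w2=1.0):
    """Weighted two-term combination of the three areas (one possible formulation)."""
    preselected_1 = w1 * abs(area_1 - area_2)   # visible-side mismatch, weight w1
    preselected_2 = w2 * area_3                 # rectangle area outside the target area, weight w2
    return preselected_1 + preselected_2
```

Whether the mismatch term is an absolute difference, a ratio, or some other penalty is not specified here; the sketch only illustrates the weighted two-term structure of the cost.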
  • The angles included in the preset angle range are angles between a first preset angle and a second preset angle, where the first preset angle minus the second preset angle equals 90°, and the preset angle range includes the first preset angle and/or the second preset angle.
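Since a rectangle's orientation is periodic with 90°, the reference angles only need to cover a 90° span; the short sketch below generates evenly spaced candidates between two such bounds. The specific bounds and the number of candidates are assumptions chosen for illustration.

```python
import numpy as np

first_preset_angle = np.deg2rad(45.0)                 # upper bound (illustrative)
second_preset_angle = first_preset_angle - np.pi / 2  # lower bound, exactly 90 degrees below

# Evenly spaced reference angles spanning the preset range, endpoints included.
reference_angles = np.linspace(second_preset_angle, first_preset_angle, num=10)
```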
  • The fitting feature includes at least one of the following: the fitting angle of the target object, the fitting direction of the target object, the fitting bounding box of the target object, and the fitting visible edge of the target object.
  • When the processor 102 performs the operation of fitting the target object according to the fitting feature, it is specifically configured to: obtain the fitting height of the target object according to the point cloud of the target object; and fit the target object according to the fitting feature and the fitting height.
  • When the fitting feature includes at least one of the following: the fitting angle of the target object, the fitting direction of the target object, and the fitting visible edge of the target object, and the processor 102 performs the operation of fitting the target object according to the fitting feature and the fitting height, it is specifically configured to: obtain the fitting bounding box of the target object according to the fitting feature; and fit the target object according to the fitting bounding box and the fitting height.
  • When the processor 102 performs the operation of obtaining the fitting height of the target object according to the point cloud of the target object, it is specifically configured to: obtain the fitting maximum height and the fitting minimum height of the target object; and obtain the fitting height according to the fitting maximum height and the fitting minimum height.
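As a rough illustration of combining the two-dimensional fitting result with the height, the sketch below rebuilds the top-view envelope rectangle at the fitting angle and stacks it between the minimum and maximum point heights; the corner layout and function name are assumptions, not the application's data structures.

```python
import numpy as np

def fit_target_box(points_xyz, fitting_angle):
    """Assemble a fitted 3-D box from the top-view rectangle and the point heights."""
    c, s = np.cos(fitting_angle), np.sin(fitting_angle)
    rot = np.array([[c, s], [-s, c]])
    p = points_xyz[:, :2] @ rot.T                       # top view in the fitting frame
    (xmin, ymin), (xmax, ymax) = p.min(axis=0), p.max(axis=0)
    corners = np.array([[xmin, ymin], [xmax, ymin],
                        [xmax, ymax], [xmin, ymax]]) @ rot
    z_min, z_max = points_xyz[:, 2].min(), points_xyz[:, 2].max()
    fitting_height = z_max - z_min                      # fitting max height minus fitting min height
    bottom = np.hstack([corners, np.full((4, 1), z_min)])
    top = np.hstack([corners, np.full((4, 1), z_max)])
    return np.vstack([bottom, top]), fitting_height     # 8 box corners and the height
```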
  • the mobile platform of this embodiment can be used to execute the technical solutions in the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
  • An embodiment of the present application also provides a computer-readable storage medium, including a program or instruction, and when the program or instruction runs on a computer, the method described in the foregoing method embodiment is executed.
  • The aforementioned program can be stored in a computer-readable storage medium; when the program is executed, the steps of the above-mentioned method embodiments are performed. The aforementioned storage medium includes media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention relates to a target object fitting method, a point cloud sensor and a mobile platform. The method comprises: generating a two-dimensional point cloud in a top view according to a point cloud of a target object collected by a point cloud sensor, the point cloud sensor being mounted on a mobile platform (S101); determining a visible area, facing the mobile platform, in the two-dimensional point cloud (S102); and determining a fitting feature of the target object according to the visible area of the target object facing the mobile platform in the two-dimensional point cloud, and fitting the target object according to the fitting feature (S103). This method can improve the accuracy of fitting a target object.
PCT/CN2019/082119 2019-04-10 2019-04-10 Target object fitting method, point cloud sensor and mobile platform WO2020206639A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/082119 WO2020206639A1 (fr) Target object fitting method, point cloud sensor and mobile platform
CN201980005593.1A CN111316289A (zh) Target object fitting method, point cloud sensor and mobile platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/082119 WO2020206639A1 (fr) Target object fitting method, point cloud sensor and mobile platform

Publications (1)

Publication Number Publication Date
WO2020206639A1 true WO2020206639A1 (fr) 2020-10-15

Family

ID=71161148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/082119 WO2020206639A1 (fr) Target object fitting method, point cloud sensor and mobile platform

Country Status (2)

Country Link
CN (1) CN111316289A (fr)
WO (1) WO2020206639A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113597568A (zh) * 2020-10-12 2021-11-02 深圳市大疆创新科技有限公司 Data processing method, control device and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005000652A1 (de) * 2005-01-04 2006-07-13 Robert Bosch Gmbh Method for object detection
JP6945785B2 (ja) * 2016-03-14 2021-10-06 イムラ ウーロプ ソシエテ・パ・アクシオンス・シンプリフィエ Method for processing a 3D point cloud
CN108875804B (zh) * 2018-05-31 2019-12-20 腾讯科技(深圳)有限公司 Data processing method based on laser point cloud data and related apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105404844A (zh) * 2014-09-12 2016-03-16 广州汽车集团股份有限公司 Road boundary detection method based on multi-line lidar
EP3332218A1 (fr) * 2015-08-03 2018-06-13 TomTom Global Content B.V. Methods and systems for generating and using localisation reference data
CN109061703A (zh) * 2018-06-11 2018-12-21 百度在线网络技术(北京)有限公司 Method, apparatus, device and computer-readable storage medium for positioning
CN109271880A (zh) * 2018-08-27 2019-01-25 深圳清创新科技有限公司 Vehicle detection method, apparatus, computer device and storage medium
CN109446886A (zh) * 2018-09-07 2019-03-08 百度在线网络技术(北京)有限公司 Obstacle detection method, apparatus, device and storage medium based on unmanned vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YE GANG: "Research on Multi-target Detection and Tracking Algorithm of Autonomous Vehicle Based on 3D Lidar in Urban Environment", MASTER THESIS, 31 August 2016 (2016-08-31), CN, pages 1 - 72, XP009523617 *

Also Published As

Publication number Publication date
CN111316289A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
EP3620823B1 (fr) Procédé et dispositif permettant de détecter la précision d'un paramètre interne d'un radar laser
US8072450B2 (en) System and method for measuring a three-dimensional object
CN110031824A (zh) 激光雷达联合标定方法及装置
KR101858902B1 (ko) 컴포넌트를 활용한 점군 데이터의 객체 위치정보 추출 시스템
CN107766405A (zh) 自动车辆道路模型定义系统
CN106887020A (zh) 一种基于LiDAR点云的道路纵横断面获取方法
JP2010541051A (ja) ユーザ選択可能な建物形状のオプションを備えた地球空間モデリング・システム及び関連した方法
WO2020168685A1 (fr) Procédé de planification de point de vue de balayage tridimensionnel, dispositif et support d'informations lisible par ordinateur
CN107622530B (zh) 一种高效鲁棒的三角网切割方法
JP2010541050A (ja) インペインティング及び誤り算出の構成を備えた地球空間モデリング・システム及び関連した方法
CN103761739A (zh) 一种基于半全局能量优化的影像配准方法
CN105184854B (zh) 针对地下空间扫描点云成果数据的快速建模方法
CN110503723B (zh) 一种牙列缺损数字模型观测线的确定方法
WO2020206639A1 (fr) Procédé d'ajustement d'objet cible, capteur de nuage de points et plateforme mobile
CN108563915B (zh) 车辆数字化仿真测试模型构建系统及方法、计算机程序
CN114299242A (zh) 高精地图中图像处理方法、装置、设备以及存储介质
CN110796735B (zh) Nurbs曲面有限元板壳网格划分方法及计算机实现系统
CN116958218A (zh) 一种基于标定板角点对齐的点云与图像配准方法及设备
CN104036096A (zh) 斜面上凸起特征映射为制造特征体积的映射方法
CN114820505A (zh) 一种动态目标的非接触测量方法
CN112950708A (zh) 一种定位方法、定位装置及机器人
CN107146286B (zh) 基于影像边缘特征的三维模型自动调整方法
CN112197773A (zh) 基于平面信息的视觉和激光定位建图方法
CN115861561B (zh) 一种基于语义约束的等高线生成方法和装置
CN117315183B (zh) 一种基于激光雷达构建三维地图和作业分析的方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19924327

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19924327

Country of ref document: EP

Kind code of ref document: A1