CN112099050A - Vehicle appearance recognition device and method, vehicle processing apparatus and method - Google Patents

Vehicle appearance recognition device and method, vehicle processing apparatus and method

Info

Publication number
CN112099050A
CN112099050A
Authority
CN
China
Prior art keywords
vehicle
laser radar
point cloud
power device
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010960657.8A
Other languages
Chinese (zh)
Inventor
鄂文轩
周子策
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Devil Fish Technology Co ltd
Original Assignee
Beijing Devil Fish Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Devil Fish Technology Co ltd filed Critical Beijing Devil Fish Technology Co ltd
Priority to CN202010960657.8A
Publication of CN112099050A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a vehicle appearance recognition device and method. The device comprises: a laser radar and a bearing part on which it is mounted, the laser radar being used to scan the space where the vehicle is located to obtain a three-dimensional point cloud; a point cloud cutting module used to cut the three-dimensional point cloud according to a preset service area range to form a service area point cloud; and a ground cutting module used to find and delete the planar point cloud representing the ground from the service area point cloud to form a vehicle appearance point cloud. The application also provides a vehicle processing apparatus and method. The device and method can recognize the vehicle appearance quickly, with high resolution and high precision, and on that basis the vehicle can be cleaned, inspected for flaws and coated.

Description

Vehicle appearance recognition device and method, vehicle processing apparatus and method
Technical Field
The present application relates to the field of vehicle technologies, and in particular to a vehicle appearance recognition apparatus and method and a vehicle processing device and method.
Background
With the popularization of vehicles, requirements in vehicle manufacturing, cleaning, flaw detection, maintenance and the like keep growing, and related supporting products have emerged accordingly. Taking vehicle cleaning as an example, automatic car washing equipment is increasingly favored by vehicle owners.
Existing automatic car washing equipment generally falls into two categories: contactless car washing and brush-type automatic car washing. Brush-type equipment mainly presses brushes against the vehicle body with a certain force to wash the vehicle, and is gradually being phased out because it tends to damage accessories such as vehicle antennas. Contactless automatic car washing equipment mainly uses a computer to control high-pressure water to wash the vehicle; its working principle is to detect the vehicle with a photoelectric system and then use the computer to calculate the optimal position and force of each action so as to achieve the best washing effect. Conventional contactless equipment usually obtains a vehicle contour curve from ranging sensors such as infrared, laser, millimeter-wave or ultrasonic sensors mounted at the top of the equipment, but such a curve only captures the approximate outline of the vehicle and cannot reproduce details such as the antenna and rearview mirrors, so the vehicle is easily damaged during cleaning and similar operations.
Disclosure of Invention
In order to solve the above problems, the present application provides a vehicle appearance recognition apparatus and method, and a vehicle processing apparatus and method, which can recognize the appearance of a vehicle with high resolution and high precision and, on that basis, perform processing such as cleaning, flaw detection and coating on the vehicle.
The vehicle appearance recognition device provided by the application includes: a laser radar and a bearing part thereof, the laser radar being used to scan the space where the vehicle is located to obtain a three-dimensional point cloud; a point cloud cutting module used to cut the three-dimensional point cloud according to a preset service area range to form a service area point cloud; and a ground cutting module used to find and delete the planar point cloud representing the ground from the service area point cloud to form a vehicle appearance point cloud; the laser radar is mounted on the bearing part.
Preferably, the vehicle appearance recognition device further includes: a parameter acquisition module used to receive a manually entered calibration value as a calibration parameter or to generate the calibration parameter through an automatic registration algorithm; a point cloud calibration module used to adjust each three-dimensional point cloud obtained by the laser radar according to the calibration parameters; and a point cloud splicing module used to splice the multiple three-dimensional point clouds obtained by the laser radar into a complete three-dimensional point cloud.
Preferably, the vehicle appearance recognition device further includes: a noise deletion module used to cluster the vehicle appearance point cloud according to a preset distance value and to delete the unclustered irrelevant points from the vehicle appearance point cloud, forming an optimized vehicle appearance point cloud.
Preferably, the bearing part comprises a first rail, a first power device, an electric control module and a fixing member; the laser radar is mounted on the first power device through the fixing member; the electric control module is mounted on the first power device, is electrically connected with it and controls it to move along the first rail arranged around the vehicle; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the first power device.
Preferably, the bearing part comprises a gantry, a gantry guide rail and a second power device; the laser radar is mounted on a beam of the gantry, and the second power device is mounted at the lower end of the gantry and drives the gantry to move along the gantry guide rail; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the gantry. Alternatively, the bearing part comprises a top-hung guide rail and a third power device; the laser radar is mounted on the third power device and moves along the top-hung guide rail driven by the third power device; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the third power device. Alternatively, the bearing part comprises a movable chassis and a fixing member, and the laser radar is mounted on the movable chassis through the fixing member; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the movable chassis.
Preferably, the three-dimensional point cloud is represented by the spatial coordinates (x, y + d·cos(α), z + d·sin(α)) of each spatial point, where x, y and z represent the spatial coordinates of the point where the laser radar is located, and d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located.
Preferably, the laser radar is a multi-line laser radar or a multi-axis rotating radar and is fixedly installed through the bearing part.
Preferably, the three-dimensional point cloud is represented by the following spatial coordinates of each spatial point:
( x + d·cos(α)·cos(β), y + d·cos(α)·sin(β), z + d·sin(α) )
where x, y and z represent the spatial coordinates of the point where the laser radar is located, d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located, and β represents the angle between the X axis and the projection onto the XOY plane of the line connecting the spatial point and the point where the laser radar is located.
The application also provides a vehicle processing apparatus, which comprises an equipment control device, a vehicle processing device and the above vehicle appearance recognition device; the equipment control device controls the motion path of the vehicle processing device according to the vehicle appearance point cloud generated by the vehicle appearance recognition device, and the vehicle processing device performs the processing operation on the vehicle.
Preferably, the vehicle processing device is a vehicle washing device, a vehicle flaw detection device or a vehicle painting device.
The application also provides a vehicle appearance recognition method. The vehicle appearance recognition device for executing the method comprises a laser radar mounted on a bearing part. The method comprises the following steps: scanning the space where the vehicle is located with the laser radar to obtain a three-dimensional point cloud; cutting the three-dimensional point cloud according to a preset service area range to form a service area point cloud; and finding and deleting the planar point cloud representing the ground from the service area point cloud to form a vehicle appearance point cloud.
Preferably, after the step of obtaining the three-dimensional point cloud by scanning the space where the vehicle is located with the laser radar, the method further comprises: adjusting each three-dimensional point cloud obtained by the laser radar according to calibration parameters; and splicing the multiple three-dimensional point clouds obtained by the laser radar into a complete three-dimensional point cloud; wherein the calibration parameter is a manually entered calibration value or is generated by an automatic registration algorithm.
Preferably, after the step of finding and deleting the planar point cloud representing the ground from the service area point cloud to form the vehicle appearance point cloud, the method further comprises: clustering the vehicle appearance point cloud according to a preset distance value, and deleting the unclustered irrelevant points from the vehicle appearance point cloud to form an optimized vehicle appearance point cloud.
Preferably, the bearing part comprises a first rail, a first power device, an electric control module and a fixing member; the laser radar is mounted on the first power device through the fixing member; the electric control module is mounted on the first power device, is electrically connected with it and controls it to move along the first rail arranged around the vehicle; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the first power device.
Preferably, the bearing part comprises a gantry, a gantry guide rail and a second power device; the laser radar is mounted on a beam of the gantry, and the second power device is mounted at the lower end of the gantry and drives the gantry to move along the gantry guide rail; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the gantry. Alternatively, the bearing part comprises a top-hung guide rail and a third power device; the laser radar is mounted on the third power device and moves along the top-hung guide rail driven by the third power device; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the third power device. Alternatively, the bearing part comprises a movable chassis and a fixing member, and the laser radar is mounted on the movable chassis through the fixing member; the laser radar is a single-line laser radar, and its rotation axis is parallel to the movement direction of the movable chassis.
Preferably, the three-dimensional point cloud is represented by the spatial coordinates (x, y + d·cos(α), z + d·sin(α)) of each spatial point, where x, y and z represent the spatial coordinates of the point where the laser radar is located, and d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located.
Preferably, the laser radar is a multi-line laser radar or a multi-axis rotating radar and is fixedly installed through the bearing part.
Preferably, the three-dimensional point cloud is represented by the following spatial coordinates of each spatial point:
( x + d·cos(α)·cos(β), y + d·cos(α)·sin(β), z + d·sin(α) )
where x, y and z represent the spatial coordinates of the point where the laser radar is located, d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located, and β represents the angle between the X axis and the projection onto the XOY plane of the line connecting the spatial point and the point where the laser radar is located.
The application also provides a vehicle processing method. The equipment for executing the method comprises a vehicle processing device and a vehicle appearance recognition device, and the method comprises the following steps: executing the above vehicle appearance recognition method to generate a vehicle appearance point cloud; and controlling the motion path of the vehicle processing device according to the vehicle appearance point cloud, the vehicle processing device performing the processing operation on the vehicle.
Preferably, the vehicle processing operation is vehicle cleaning, vehicle flaw detection or vehicle painting.
Compared with the prior art, the present application has the following advantages:
the preferred embodiment of this application uses laser radar as the collection equipment of point cloud data, and the discernment that can carry out high accuracy, high resolution to the vehicle appearance fast is to point cloud separation means such as corresponding region cutting, ground cutting to can help vehicle operating equipment effectively to be close to the automobile body, to the limited environment of operating space, if with the condition in underground garage as washing the garage, can the limited space of make full use of, realize reducing the purpose of equipment overall height. The resolution of 0.5mm can be realized under the matching of a proper laser radar and a power device, and by taking car washing equipment as an example, the whole height of the equipment can be controlled within 2.35m, so that the requirements of height-limited environments such as garages, basements and the like can be completely met.
High-quality vehicle appearance recognition also presents vehicle details in full and accurately expresses accessories such as rearview mirrors, antennas, wheel hubs and luggage racks. This not only allows the vehicle to be avoided accurately during cleaning, but also provides a solid data foundation for vehicle flaw detection, vehicle assembly, vehicle coating and the like.
In addition, with a laser of a suitable wavelength, the approach works on most paint colors and finishes and therefore has good universality.
Drawings
The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a schematic structural diagram of a vehicle appearance recognition apparatus according to a first embodiment of the present application;
FIG. 2 is a schematic structural diagram of a vehicle appearance recognition apparatus according to a second embodiment of the present application;
FIGS. 3a and 3b are schematic diagrams of the ground track type lidar driving mode according to the present application;
FIG. 4 is a schematic diagram of a driving method of the gantry rail type laser radar of the present application;
FIG. 5 is a schematic diagram of a driving method of a top-hung rail-mounted lidar according to the present application;
FIG. 6a is a schematic diagram of a driving method of the mobile chassis type laser radar of the present application;
FIG. 6b is a side view of the mobile chassis in FIG. 6a;
FIGS. 7a and 7b are schematic diagrams comparing the three-dimensional point cloud before and after service area cutting;
FIG. 7c is a schematic view of the vehicle appearance point cloud obtained after the ground point cloud is removed from the service area point cloud shown in FIG. 7b;
FIGS. 8a and 8b are schematic diagrams of the left and right point clouds obtained by scanning with the dual laser radar, respectively;
FIG. 8c is a schematic diagram of the spliced point cloud obtained by splicing FIG. 8a and FIG. 8b;
FIGS. 9a and 9b are schematic diagrams comparing the effects before and after point cloud calibration;
FIG. 10 is a flowchart of a first embodiment of the vehicle appearance recognition method of the present application;
FIG. 11 is a flowchart of a second embodiment of the vehicle appearance recognition method of the present application;
FIG. 12 is a schematic structural diagram of an embodiment of a vehicle processing apparatus according to the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The meaning of "plurality" is two or more unless specifically limited otherwise. The terms "comprising", "including" and the like are to be construed as open-ended terms, i.e., "including but not limited to". The term "based on" means "based, at least in part, on". The term "an embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment". Relevant definitions of other terms will be given in the following description.
Referring to Fig. 1, the composition of a first embodiment of the vehicle appearance recognition apparatus of the present application is shown, including a laser radar 20, a bearing part 10, a point cloud cutting module 60 and a ground cutting module 70, wherein:
The laser radar 20 is mounted on the bearing part 10; in order to obtain complete appearance data of the vehicle, the installation height of the laser radar 20 is preferably higher than the roof of the vehicle to be recognized. The laser radar 20 scans the entire space where the vehicle is located to obtain three-dimensional point cloud data of the space.
In a specific implementation, the three-dimensional point cloud data can be represented by three-dimensional coordinate values. Taking a single-line laser radar as an example, the radar outputs point data as it rotates while a power device drives it forward at a constant speed, so that the laser head advances through the space along a helix; by combining the data returned by the radar with the position of the power device, the actual position of each radar point in space can be recovered, converting the two-dimensional radar data into a three-dimensional point cloud. The effect is shown in Fig. 7a.
Assume that the spatial coordinates of the radar are (x, y, z), where y and z can be calculated from the installation position and x can be obtained from the power device. If the elevation angle between a spatial point and the radar (i.e., the angle between the line connecting the two points and the horizontal plane) is α, the distance between them is d, and the spatial point and the radar lie in a plane parallel to YOZ, then the spatial coordinates of the spatial point are (x, y + d·cos(α), z + d·sin(α)).
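By way of illustration only, and not as part of the claimed subject matter, the coordinate recovery described above can be sketched in Python for a single sweep; the function name, variable names and the use of NumPy are assumptions:

    import numpy as np

    def single_line_scan_to_points(x, y0, z0, distances, elevations):
        """Convert one single-line lidar sweep into 3-D points.

        x           -- carriage position along the travel axis (from the power device)
        y0, z0      -- lidar position in the YOZ plane (from the installation position)
        distances   -- measured ranges d for the sweep
        elevations  -- elevation angles alpha for the sweep, in radians
        Returns an (N, 3) array of points (x, y0 + d*cos(a), z0 + d*sin(a)).
        """
        d = np.asarray(distances, dtype=float)
        a = np.asarray(elevations, dtype=float)
        xs = np.full_like(d, x)
        return np.stack([xs, y0 + d * np.cos(a), z0 + d * np.sin(a)], axis=1)

    # Accumulating sweeps while the power device advances yields the full 3-D cloud:
    # cloud = np.vstack([single_line_scan_to_points(xi, y0, z0, di, ai)
    #                    for xi, (di, ai) in zip(carriage_positions, sweeps)])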
Then, the point cloud cutting module 60 cuts the three-dimensional point cloud according to the preset service area range to form a service area point cloud.
In a specific implementation, the service area range can be set to the middle region of the space, with a length and width slightly larger than those of the vehicle and a height slightly lower than the ceiling (e.g., 2.5 m), so that the ceiling is removed. The specific cutting method is as follows: taking point cloud data in three-dimensional coordinates as an example, all points (x, y, z) are traversed and those outside the set range are deleted; what remains is the service area point cloud. The practical effect is to cut a cuboid out of the space. The effect is shown in Fig. 7b.
The ground cutting module 70 then finds and deletes the planar point cloud representing the ground from the service area point cloud to form the vehicle appearance point cloud.
In a specific implementation, a plane can be searched for in the service area point cloud; with the ceiling already removed, the plane that is found is the ground. Of course, if the ground height is known and the ground is perfectly horizontal, the ground can be cut away directly by height, using a cutting method similar to that of the service area point cloud. The effect is shown in Fig. 7c.
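The application does not prescribe a particular plane-finding algorithm; as one possible illustration, a simple RANSAC-style search (thresholds and names assumed) can locate the dominant plane in the service area point cloud and remove it:

    import numpy as np

    def remove_ground(points, distance_threshold=0.02, iterations=200, rng=None):
        """Find the dominant plane (the ground, once the ceiling has been cut away)
        with a basic RANSAC search and return the cloud without it."""
        rng = np.random.default_rng() if rng is None else rng
        best_inliers = np.zeros(len(points), dtype=bool)
        for _ in range(iterations):
            # Fit a candidate plane through three randomly chosen points.
            p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
            normal = np.cross(p2 - p1, p3 - p1)
            norm = np.linalg.norm(normal)
            if norm < 1e-9:                  # degenerate (collinear) sample, skip it
                continue
            normal /= norm
            dist = np.abs((points - p1) @ normal)    # point-to-plane distances
            inliers = dist < distance_threshold
            if inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return points[~best_inliers]         # what remains is the vehicle appearance cloud

    # vehicle_cloud = remove_ground(service_cloud)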
Through the above means, the vehicle appearance can be recognized quickly with high precision and high resolution. Taking an automobile as an example, the complete contour can be recognized and accessories such as rearview mirrors, antennas and flagpoles can be fully expressed, which effectively solves the problem that the prior art cannot meet the real-time operation requirements of cleaning, flaw detection, coating and the like.
Referring to Fig. 2, the composition of a second embodiment of the vehicle appearance recognition apparatus of the present application is shown. Compared with the first apparatus embodiment, in addition to the laser radar 20, the bearing part 10, the point cloud cutting module 60 and the ground cutting module 70, the apparatus is further provided with a parameter acquisition module 30, a point cloud calibration module 40 and a point cloud splicing module 50, in order to handle the case where multiple laser radars scan the space where the vehicle is located and obtain multiple point clouds, as well as the case where a single laser radar moves around the vehicle along a U-shaped path and obtains multiple point clouds. Wherein:
the parameter acquisition module 30 is used to receive a manually entered calibration value as a calibration parameter or to generate the calibration parameter through an automatic registration algorithm;
the point cloud calibration module 40 is used to adjust each three-dimensional point cloud obtained by the laser radar according to the calibration parameters determined by the parameter acquisition module 30;
and the point cloud splicing module 50 is used to splice the multiple three-dimensional point clouds obtained by the laser radar into a complete three-dimensional point cloud.
Taking a dual laser radar as an example, the left point cloud is shown in Fig. 8a, the right point cloud in Fig. 8b, and the spliced point cloud in Fig. 8c.
In a specific implementation, a manual correction mode can be adopted: the scan data of a characteristic marker is manually shifted until it coincides with the virtual marker. The marker can be several distinctive barrier piles or the like, which also makes it convenient to measure the positional relationship between the object and the equipment.
When the calibration parameters are generated by an automatic registration algorithm, taking the Iterative Closest Point (ICP) algorithm as an example, they can be obtained with the algorithm packaged in the existing Visualization Toolkit (VTK): it is only necessary to input the two groups of point cloud data and the number of iterations, and the returned result is the rotation and offset matrix of one point cloud with respect to the other. The calibration effect is shown in Figs. 9a and 9b.
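Purely as a rough illustration of this registration step, the sketch below uses the open-source Open3D library's point-to-point ICP as a stand-in for the VTK-packaged ICP mentioned above; the library choice, parameter values and names are assumptions:

    import numpy as np
    import open3d as o3d

    def icp_calibration(source_points, target_points, max_dist=0.05, iterations=50):
        """Estimate the rigid rotation/offset that aligns one radar's point cloud
        with the other's, analogous to the ICP calibration described above."""
        src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_points))
        tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_points))
        result = o3d.pipelines.registration.registration_icp(
            src, tgt, max_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint(),
            o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=iterations))
        return result.transformation         # 4x4 homogeneous rotation + offset matrix

    # The matrix can be stored and reused as long as the rig does not deform:
    # T = icp_calibration(right_cloud, left_cloud)
    # right_aligned = (T[:3, :3] @ right_cloud.T).T + T[:3, 3]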
It should be noted that, for the same machine, the registration parameters do not need to be readjusted as long as there is no mechanical deformation, and they can be used for a long time.
In addition, in a specific implementation the order of area cutting and point cloud splicing can be swapped, but splicing first means that the area cutting only needs to be performed once; the order of ground separation and area cutting can also be swapped, but cutting the area first effectively reduces interference from irrelevant points and improves the success rate of ground separation.
In a further preferred embodiment, to deal with the possibility that sundries placed near the vehicle fall within the service area, the vehicle appearance recognition apparatus is further provided with a noise deletion module 80, which clusters the vehicle appearance point cloud according to a preset distance value and deletes the unclustered irrelevant points from it to form an optimized vehicle appearance point cloud.
In a specific implementation, the clustering algorithm groups neighboring points together according to the set distance. After the ground has been separated, the largest cluster near the center is usually the vehicle point cloud (if other objects are to be recognized, other selection criteria can be substituted). Of course, if the place where the vehicle is located is clean enough and no sundries (such as animals) enter the service area, a complete vehicle appearance point cloud is already obtained once the service area is cut and the ground is separated.
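For illustration only, the distance-based clustering could be realized with DBSCAN from scikit-learn; the library, the parameter values and the largest-cluster selection rule follow the description above but are not requirements of the application:

    import numpy as np
    from sklearn.cluster import DBSCAN

    def keep_vehicle_cluster(points, cluster_distance=0.1, min_samples=5):
        """Cluster the appearance cloud by a preset distance and keep the largest
        cluster; unclustered stray points (label -1) are discarded as noise."""
        labels = DBSCAN(eps=cluster_distance, min_samples=min_samples).fit_predict(points)
        clustered = labels >= 0
        if not clustered.any():
            return points                    # nothing clustered; leave the cloud as-is
        counts = np.bincount(labels[clustered])
        vehicle_label = np.argmax(counts)    # largest cluster is taken to be the vehicle
        return points[labels == vehicle_label]

    # optimized_cloud = keep_vehicle_cluster(vehicle_cloud, cluster_distance=0.1)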
In the above embodiments, the laser radar 20 may be a single-line laser radar, or a multi-line laser radar may be chosen according to the environment (for the same point cloud accuracy, a multi-line laser radar costs more); a multi-axis rotating radar may also be chosen according to the environment.
When a single-line laser radar is chosen, a matching power device must be configured to drive the radar so that complete point cloud data of the vehicle space can be obtained. Several implementations are possible:
referring to fig. 3a and 3b, a ground track type lidar driving mode of the present application is shown, including: the vehicle comprises a first rail 14 arranged on two sides of the vehicle, a first power device 11 arranged above the first rail 14, and an electric control module 13 which is arranged on the first power device 11, is electrically connected with the first power device 11 and controls the first power device 11 to move along the first rail 14. The laser radar 20 is mounted on the first power device 11 through the fixing member 12, and the rotation axis of the laser radar 20 is parallel to the movement direction of the first power device 11.
In another preferred embodiment, the rail can be arranged in a U-shaped structure; in this case only one set of power device is needed to drive the laser radar along the U-shaped rail.
Referring to Fig. 4, the gantry-rail laser radar drive of the present application is shown, comprising: a gantry 111, a gantry guide rail 141 and a second power device arranged in the gantry 111. The laser radar 20 is mounted on a beam of the gantry 111, and the gantry 111 moves along the gantry guide rail 141 driven by the second power device; the rotation axis of the laser radar 20 is parallel to the movement direction of the gantry 111.
Referring to Fig. 5, the top-hung-rail laser radar drive of the present application is shown, comprising: a top-hung guide rail 142 and a third power device 112. The laser radar 20 is mounted on the third power device 112 and moves along the top-hung guide rail 142 driven by the third power device 112; the rotation axis of the laser radar 20 is parallel to the movement direction of the third power device 112. It should be noted that the top-hung rail is not limited to the separated implementation shown in Fig. 5; a connecting-rod type top-hung arrangement may also be used.
Referring to Figs. 6a and 6b, the mobile-chassis laser radar drive of the present application is shown (Fig. 6b is a side view of the mobile chassis in Fig. 6a), comprising: a mobile chassis 113 and a fixing member; the laser radar 20 is mounted on the mobile chassis 113 through the fixing member; the rotation axis of the laser radar 20 is parallel to the movement direction of the mobile chassis 113.
In a specific implementation, the mobile chassis 113 may be an existing automated guided vehicle (AGV).
In these cases, the three-dimensional point cloud is represented by the spatial coordinates (x, y + d·cos(α), z + d·sin(α)) of each spatial point, where x, y and z represent the spatial coordinates of the point where the laser radar is located, and d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located.
When a multi-line laser radar or a multi-axis rotating radar is chosen, the radar can be mounted on a power device through the bearing part, or it can be fixedly mounted in a suitable position through the bearing part. In this case, the three-dimensional point cloud data obtained by scanning the space where the vehicle is located is represented by the following spatial coordinates of each spatial point:
( x + d·cos(α)·cos(β), y + d·cos(α)·sin(β), z + d·sin(α) )
where x, y and z represent the spatial coordinates of the point where the laser radar is located, d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located, and β represents the angle between the X axis and the projection onto the XOY plane of the line connecting the spatial point and the point where the laser radar is located.
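Written out as code, again purely as an illustrative sketch with the function name and the use of NumPy assumed, this mapping of a single radar return is:

    import numpy as np

    def radar_return_to_point(x, y, z, d, alpha, beta):
        """Map one multi-line / multi-axis radar return to world coordinates.

        (x, y, z) -- position of the laser radar
        d         -- measured range to the spatial point
        alpha     -- elevation angle of the return, in radians
        beta      -- azimuth: angle between the X axis and the projection of the
                     radar-to-point line onto the XOY plane, in radians
        """
        return np.array([x + d * np.cos(alpha) * np.cos(beta),
                         y + d * np.cos(alpha) * np.sin(beta),
                         z + d * np.sin(alpha)])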
Referring to Fig. 12, the composition of an embodiment of the vehicle processing apparatus of the present application is shown, comprising an equipment control device 200, a vehicle processing device 300 and the vehicle appearance recognition apparatus 100 described above. The equipment control device 200 controls the motion path of the vehicle processing device 300 according to the vehicle appearance point cloud generated by any one of the vehicle appearance recognition apparatus embodiments described above, and the vehicle processing device 300 performs the processing operation on the vehicle.
In a specific implementation, the vehicle processing device 300 may be a vehicle washing device, a vehicle flaw detection device, a vehicle painting device or the like.
Referring to Fig. 10, the flow of a first embodiment of the vehicle appearance recognition method of the present application is shown. The vehicle appearance recognition apparatus for performing the method includes a laser radar mounted on a bearing part; in order to obtain complete appearance data of the vehicle, the installation height of the laser radar is preferably higher than the roof of the vehicle. The method comprises the following steps:
step S10: scanning the space where the vehicle is located through the laser radar to obtain three-dimensional point cloud;
step S30: cutting the three-dimensional point cloud according to a preset service area range to form a service area point cloud; and the number of the first and second groups,
step S50: and searching and deleting the plane point cloud representing the ground from the service area point cloud to form the vehicle appearance point cloud.
Referring to Fig. 11, the flow of a second embodiment of the vehicle appearance recognition method of the present application is shown. It differs from the first embodiment in that, in order to handle the case where multiple laser radars scan the space where the vehicle is located and obtain multiple point clouds, as well as the case where a laser radar moves around the vehicle along a U-shaped path and obtains multiple point clouds, the method further includes, after step S10:
step S15: adjusting each three-dimensional point cloud obtained by scanning the laser radar according to the calibration parameters; and the number of the first and second groups,
step S20: splicing a plurality of three-dimensional point clouds obtained by scanning the laser radar to form a complete three-dimensional point cloud;
wherein the calibration parameter is a calibration value filled manually or generated by an automatic registration algorithm.
In a further preferred embodiment, to deal with the possibility that sundries placed near the vehicle fall within the service area, the method may further include, after step S50:
step S60: and clustering the vehicle appearance point cloud according to a preset distance value, and deleting irrelevant points which are not clustered in the vehicle appearance point cloud to form an optimized vehicle appearance point cloud.
In addition, the application also provides a vehicle processing method. The equipment for executing the method comprises a vehicle processing device and a vehicle appearance recognition device, and the method comprises the following steps:
first, executing any one of the vehicle appearance recognition method embodiments above to generate a vehicle appearance point cloud;
second, controlling the motion path of the vehicle processing device according to the vehicle appearance point cloud, with the vehicle processing device performing the processing operation on the vehicle. The vehicle processing operation may be car washing, flaw detection, painting and the like.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of actions, but those skilled in the art will recognize that the present application is not limited by the order of the actions described, since according to the present application some steps may be performed in other orders or concurrently; for example, the service area cutting of step S30 may be performed before the ground point cloud separation of step S50, or the ground point cloud separation of step S50 may be performed before the service area cutting of step S30. Furthermore, those skilled in the art should also appreciate that the embodiments described above are preferred embodiments and that the actions and modules involved are not necessarily required by the application.
The embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the others, and the parts that are the same or similar can be cross-referenced among the embodiments. The apparatus embodiments described above are merely illustrative; the modules described as separate parts may or may not be physically separate, and may be located in one component or distributed over several electrically connected components. Some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiments, which those skilled in the art can understand and implement without creative effort.
The principle and implementation of the present application are explained herein using specific examples, and the above description of the embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the scope of application according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (20)

1. A vehicle appearance recognition device characterized by comprising:
the laser radar and the bearing part thereof are used for scanning the space where the vehicle is located to obtain a three-dimensional point cloud;
the point cloud cutting module is used for cutting the three-dimensional point cloud according to a preset service area range to form a service area point cloud; and
the ground cutting module is used for finding and deleting the planar point cloud representing the ground from the service area point cloud to form the vehicle appearance point cloud;
the laser radar is mounted on the bearing part.
2. The vehicle appearance recognition device according to claim 1, further comprising:
the parameter acquisition module is used for receiving a manually entered calibration value as a calibration parameter or generating the calibration parameter through an automatic registration algorithm;
the point cloud calibration module is used for adjusting each three-dimensional point cloud obtained by scanning with the laser radar according to the calibration parameters; and
the point cloud splicing module is used for splicing the multiple three-dimensional point clouds obtained by scanning with the laser radar to form a complete three-dimensional point cloud.
3. The vehicle appearance recognition device according to claim 1 or 2, further comprising:
a noise deletion module used for clustering the vehicle appearance point cloud according to a preset distance value and deleting the unclustered irrelevant points from the vehicle appearance point cloud, so as to form an optimized vehicle appearance point cloud.
4. The vehicle appearance recognition device of claim 1, wherein the bearing part includes a first rail, a first power device, an electric control module, and a fixing member;
the laser radar is mounted on the first power device through the fixing member; the electric control module is mounted on the first power device, is electrically connected with the first power device and controls the first power device to move along the first rail arranged around the vehicle;
the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the first power device.
5. The vehicle appearance recognition device according to claim 1, wherein
the bearing part comprises a gantry, a gantry guide rail and a second power device; the laser radar is mounted on a beam of the gantry, and the second power device is mounted at the lower end of the gantry and drives the gantry to move along the gantry guide rail; the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the gantry;
or,
the bearing part comprises a top-hung guide rail and a third power device; the laser radar is mounted on the third power device and moves along the top-hung guide rail driven by the third power device; the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the third power device;
or,
the bearing part comprises a movable chassis and a fixing member, and the laser radar is mounted on the movable chassis through the fixing member; the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the movable chassis.
6. The vehicle appearance recognition device according to claim 4 or 5, wherein the three-dimensional point cloud is represented by the spatial coordinates (x, y + d·cos(α), z + d·sin(α)) of each spatial point;
wherein x, y and z represent the spatial coordinates of the point where the laser radar is located, and d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located.
7. The vehicle appearance recognition device according to claim 1, wherein the laser radar is a multi-line laser radar or a multi-axis rotating radar and is fixedly mounted through the bearing part.
8. The vehicle appearance recognition device according to claim 7, wherein the three-dimensional point cloud is represented by the following spatial coordinates of each spatial point:
( x + d·cos(α)·cos(β), y + d·cos(α)·sin(β), z + d·sin(α) )
wherein x, y and z represent the spatial coordinates of the point where the laser radar is located, d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located, and β represents the angle between the X axis and the projection onto the XOY plane of the line connecting the spatial point and the point where the laser radar is located.
9. A vehicle processing apparatus, comprising an equipment control device, a vehicle processing device, and the vehicle appearance recognition device according to any one of claims 1 to 8;
wherein the equipment control device controls the motion path of the vehicle processing device according to the vehicle appearance point cloud generated by the vehicle appearance recognition device, and the vehicle processing device performs the processing operation on the vehicle.
10. The vehicle processing apparatus according to claim 9, wherein the vehicle processing device is a vehicle washing device, a vehicle flaw detection device, or a vehicle painting device.
11. A vehicle appearance recognition method, characterized in that the vehicle appearance recognition device for executing the method comprises a laser radar mounted on a bearing part; the method comprises the following steps:
scanning the space where the vehicle is located with the laser radar to obtain a three-dimensional point cloud;
cutting the three-dimensional point cloud according to a preset service area range to form a service area point cloud; and
finding and deleting the planar point cloud representing the ground from the service area point cloud to form the vehicle appearance point cloud.
12. The vehicle appearance recognition method according to claim 11, further comprising, after the step of obtaining a three-dimensional point cloud by scanning the space where the vehicle is located with the laser radar:
adjusting each three-dimensional point cloud obtained by scanning with the laser radar according to the calibration parameters; and
splicing the multiple three-dimensional point clouds obtained by scanning with the laser radar to form a complete three-dimensional point cloud;
wherein the calibration parameter is a manually entered calibration value or is generated by an automatic registration algorithm.
13. The vehicle appearance recognition method according to claim 11 or 12, further comprising, after the step of finding and deleting the planar point cloud representing the ground from the service area point cloud to form the vehicle appearance point cloud:
clustering the vehicle appearance point cloud according to a preset distance value, and deleting the unclustered irrelevant points from the vehicle appearance point cloud to form an optimized vehicle appearance point cloud.
14. The vehicle appearance recognition method of claim 11, wherein the bearing part includes a first rail, a first power device, an electric control module, and a fixing member;
the laser radar is mounted on the first power device through the fixing member; the electric control module is mounted on the first power device, is electrically connected with the first power device and controls the first power device to move along the first rail arranged around the vehicle;
the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the first power device.
15. The vehicle appearance recognition method according to claim 11, wherein
the bearing part comprises a gantry, a gantry guide rail and a second power device; the laser radar is mounted on a beam of the gantry, and the second power device is mounted at the lower end of the gantry and drives the gantry to move along the gantry guide rail; the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the gantry;
or,
the bearing part comprises a top-hung guide rail and a third power device; the laser radar is mounted on the third power device and moves along the top-hung guide rail driven by the third power device; the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the third power device;
or,
the bearing part comprises a movable chassis and a fixing member, and the laser radar is mounted on the movable chassis through the fixing member; the laser radar is a single-line laser radar, and the rotation axis of the laser radar is parallel to the movement direction of the movable chassis.
16. The vehicle appearance recognition method according to claim 14 or 15, wherein the three-dimensional point cloud is represented by the spatial coordinates (x, y + d·cos(α), z + d·sin(α)) of each spatial point;
wherein x, y and z represent the spatial coordinates of the point where the laser radar is located, and d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located.
17. The vehicle appearance recognition method of claim 11, wherein the laser radar is a multi-line laser radar or a multi-axis rotating radar and is fixedly mounted through the bearing part.
18. The vehicle appearance recognition method according to claim 17, wherein the three-dimensional point cloud is represented by the following spatial coordinates of each spatial point:
( x + d·cos(α)·cos(β), y + d·cos(α)·sin(β), z + d·sin(α) )
wherein x, y and z represent the spatial coordinates of the point where the laser radar is located, d and α respectively represent the distance and the elevation angle between the spatial point and the point where the laser radar is located, and β represents the angle between the X axis and the projection onto the XOY plane of the line connecting the spatial point and the point where the laser radar is located.
19. A vehicle processing method, characterized in that the equipment for performing the method includes a vehicle processing device and a vehicle appearance recognition device, the method comprising:
performing the vehicle appearance recognition method of any one of claims 11-18 to generate a vehicle appearance point cloud; and
controlling the motion path of the vehicle processing device according to the vehicle appearance point cloud, the vehicle processing device performing the processing operation on the vehicle.
20. The vehicle processing method according to claim 19, wherein the vehicle processing operation is vehicle cleaning, vehicle flaw detection, or vehicle painting.
CN202010960657.8A 2020-09-14 2020-09-14 Vehicle appearance recognition device and method, vehicle processing apparatus and method Pending CN112099050A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010960657.8A CN112099050A (en) 2020-09-14 2020-09-14 Vehicle appearance recognition device and method, vehicle processing apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010960657.8A CN112099050A (en) 2020-09-14 2020-09-14 Vehicle appearance recognition device and method, vehicle processing apparatus and method

Publications (1)

Publication Number Publication Date
CN112099050A true CN112099050A (en) 2020-12-18

Family

ID=73751559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010960657.8A Pending CN112099050A (en) 2020-09-14 2020-09-14 Vehicle appearance recognition device and method, vehicle processing apparatus and method

Country Status (1)

Country Link
CN (1) CN112099050A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071112A (en) * 2021-10-18 2022-02-18 北京魔鬼鱼科技有限公司 Vehicle point cloud identification imaging method and system
CN114088041A (en) * 2021-10-18 2022-02-25 北京魔鬼鱼科技有限公司 Vehicle three-dimensional scanning imaging method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226833A (en) * 2013-05-08 2013-07-31 清华大学 Point cloud data partitioning method based on three-dimensional laser radar
CN107167090A (en) * 2017-03-13 2017-09-15 深圳市速腾聚创科技有限公司 Vehicle overall dimension measuring method and system
US20180017501A1 (en) * 2016-07-13 2018-01-18 Sightline Innovation Inc. System and method for surface inspection
CN110163904A (en) * 2018-09-11 2019-08-23 腾讯大地通途(北京)科技有限公司 Object marking method, control method for movement, device, equipment and storage medium
US20200158874A1 (en) * 2018-11-19 2020-05-21 Dalong Li Traffic recognition and adaptive ground removal based on lidar point cloud statistics
CN111192328A (en) * 2019-12-31 2020-05-22 芜湖哈特机器人产业技术研究院有限公司 Two-dimensional laser radar-based point cloud processing method for three-dimensional scanning system of compartment container

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226833A (en) * 2013-05-08 2013-07-31 清华大学 Point cloud data partitioning method based on three-dimensional laser radar
US20180017501A1 (en) * 2016-07-13 2018-01-18 Sightline Innovation Inc. System and method for surface inspection
CN107167090A (en) * 2017-03-13 2017-09-15 深圳市速腾聚创科技有限公司 Vehicle overall dimension measuring method and system
CN110163904A (en) * 2018-09-11 2019-08-23 腾讯大地通途(北京)科技有限公司 Object marking method, control method for movement, device, equipment and storage medium
US20200158874A1 (en) * 2018-11-19 2020-05-21 Dalong Li Traffic recognition and adaptive ground removal based on lidar point cloud statistics
CN111192328A (en) * 2019-12-31 2020-05-22 芜湖哈特机器人产业技术研究院有限公司 Two-dimensional laser radar-based point cloud processing method for three-dimensional scanning system of compartment container

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Jian (ed.): 《汽车现代测试技术》 [Modern Automotive Testing Technology], Beijing: National Defense Industry Press, pages 117-118 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071112A (en) * 2021-10-18 2022-02-18 北京魔鬼鱼科技有限公司 Vehicle point cloud identification imaging method and system
CN114088041A (en) * 2021-10-18 2022-02-25 北京魔鬼鱼科技有限公司 Vehicle three-dimensional scanning imaging method and system
WO2023066232A1 (en) * 2021-10-18 2023-04-27 北京魔鬼鱼科技有限公司 Three-dimensional scanning and imaging method and system for vehicle, and computer device and storage medium
CN114071112B (en) * 2021-10-18 2023-09-01 北京魔鬼鱼科技有限公司 Vehicle point cloud identification imaging method and system

Similar Documents

Publication Publication Date Title
KR101863360B1 (en) 3D laser scanning system using the laser scanner capable of tracking dynamic position in real time
Paek et al. K-radar: 4d radar object detection for autonomous driving in various weather conditions
CN112665556A (en) Generating three-dimensional maps of scenes using passive and active measurements
CN112099050A (en) Vehicle appearance recognition device and method, vehicle processing apparatus and method
CN111077506A (en) Method, device and system for calibrating millimeter wave radar
US11061122B2 (en) High-definition map acquisition system
KR101880593B1 (en) Lidar sensor device for automatic driving of unmanned vehicles
CN112379674B (en) Automatic driving equipment and system
CN105699985A (en) Single-line laser radar device
CN102476619A (en) Method for detecting the environment of a vehicle
CN114061446B (en) Carriage size measurement system and method based on multiple three-dimensional scanning equipment
CN110082783B (en) Cliff detection method and device
CN103050010A (en) Integrated laser scanning traffic survey device and integrated laser scanning traffic survey method
CN109375629A (en) A kind of cruiser and its barrier-avoiding method that navigates
EP3769115A1 (en) Methods and systems for identifying material composition of moving objects
WO2020241043A1 (en) Work analysis system, work analysis device, and work analysis program
EP3992662A1 (en) Three dimensional measurement device having a camera with a fisheye lens
CN113768419B (en) Method and device for determining sweeping direction of sweeper and sweeper
CN115166769A (en) Detection method, laser radar, vehicle, and computer-readable storage medium
US20220187428A1 (en) Autonomous mobile aircraft inspection system
CN218866094U (en) Detection device, laser radar, and vehicle
Fröhlich et al. Imaging laser radar for 3‐D modelling of real world environments
KR101784584B1 (en) Apparatus and method for determing 3d object using rotation of laser
CN111623744A (en) Curved surface appearance acquisition and measurement system
CN115509214B (en) Positioning control method and device, and autonomous charging control device, method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination