CN115291243A - Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium

Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium

Info

Publication number
CN115291243A
CN115291243A
Authority
CN
China
Prior art keywords
data
point cloud
laser radar
rotation angle
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210898161.1A
Other languages
Chinese (zh)
Inventor
吴海腾
玉正英
范洪达
熊发春
罗福良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Shenhao Technology Co Ltd
Original Assignee
Hangzhou Shenhao Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Shenhao Technology Co Ltd filed Critical Hangzhou Shenhao Technology Co Ltd
Priority to CN202210898161.1A
Publication of CN115291243A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

The application relates to the technical field of three-dimensional reconstruction, and in particular to a laser radar three-dimensional reconstruction method and device, an electronic device, and a storage medium. The method comprises the following steps: performing three-dimensional reconstruction scanning of a scene to be operated with a laser radar, and acquiring rotation angle data and radar measurement data of the laser radar in real time, where the rotation angle data comprise angle data and corresponding timestamp data, and the radar measurement data comprise point cloud data and corresponding timestamp data; comparing the timestamp data in the radar measurement data with the timestamp data in the rotation angle data, and screening out matched point cloud data and angle data according to the time difference between the two timestamps, so as to obtain laser radar data with an accurate angular pose; and parsing the screened point cloud data in a callback function, performing coordinate conversion on the parsed point cloud data, and splicing and fusing the point cloud data of different time sequences according to the rotation angle. The method and device effectively reduce the occupied structural space and improve the definition and accuracy of the three-dimensional reconstruction.

Description

Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of three-dimensional reconstruction technologies, and in particular, to a laser radar three-dimensional reconstruction method and apparatus, an electronic device, and a storage medium.
Background
When an outdoor live-working power distribution robot operates remotely or autonomously, the scene to be worked on must first be reconstructed in three dimensions to determine the working position of the mechanical arm and to provide basic data for the arm's obstacle avoidance strategy.
Three-dimensional reconstruction can be performed with visual perception devices such as a binocular camera or a laser radar. Under strong outdoor sunlight, however, a binocular camera tends to produce large hollow regions, so its reconstruction result is not ideal.
The ranging principle of a laser radar is that the laser center emits laser of a certain wavelength, and the size of the light spot produced by the internal ranging module grows in direct proportion to the ranging distance: the longer the range, the larger the spot. If the spot falls on a small target with a large object in the area behind it, multiple reflections of the spot easily occur, the laser ranging tends to give wrong results, and traditional laser radar modeling suffers from a smearing problem, so the modeling effect is not ideal.
When a laser radar is adopted as the front-end sensor, a sliding table is usually used in cooperation with a multi-line laser radar. In this arrangement the sliding table is generally designed long in order to enlarge the field of view, which occupies more operating platform space and increases the complexity of the structural design.
Disclosure of Invention
The embodiments of the application provide a laser radar three-dimensional reconstruction method, a laser radar three-dimensional reconstruction device, an electronic device and a storage medium, intended at least to solve the problems in the related art that, when a laser radar is used for three-dimensional reconstruction, a large amount of operating platform space is occupied and the modeling is neither clear nor accurate, because multiple reflections of the light spot easily produce wrong modeling results and smearing.
In a first aspect, an embodiment of the present application provides a laser radar three-dimensional reconstruction method, where the method includes:
the method comprises the following steps that S1, three-dimensional reconstruction scanning is carried out on a scene to be operated through a laser radar, and current rotation angle data and radar measurement data of the laser radar are obtained in real time; the rotation angle data comprises angle data and corresponding timestamp data, and the radar measurement data comprises point cloud data and corresponding timestamp data;
s2, comparing the timestamp data in the radar measurement data with the timestamp data in the rotation angle data, and screening out matched point cloud data and angle data according to the time difference between the timestamp data in the radar measurement data and the timestamp data in the rotation angle data so as to obtain laser radar data with accurate angle poses;
and S3, analyzing and processing the point cloud data screened in the step S2 through a callback function, performing coordinate conversion on the point cloud data after analysis and processing, and splicing and fusing the point cloud data under different time sequences according to the rotation angle of the point cloud data.
In some embodiments, in step S2, the acquired rotation angle data and radar measurement data are stored in a queue, the rotation angle data and the radar measurement data are respectively extracted in sequence from the head of the queue, a difference between timestamp data of the two sets of data is analyzed and compared, the difference is compared with a preset first threshold,
if the difference value is smaller than the first threshold value, adopting the angle data of the current timestamp as the rotation angle of the current point cloud data;
if the difference is greater than the first threshold, the magnitudes of the timestamp data in the radar measurement data and the timestamp data in the rotation angle data are further compared, the set of data with the smaller timestamp value is discarded, the next set of data is taken in order from the queue from which data were discarded, the difference between its timestamp and that of the non-discarded data is computed, and the difference is again compared with the preset first threshold.
In some embodiments, in step S1, before the lidar operation, it is determined whether the lidar is in a preset initial pose;
if the laser radar is in the preset initial pose, performing rotation measurement on the laser radar, and acquiring current rotation angle data and radar measurement data in real time;
and if the laser radar is not in the preset initial pose, controlling the laser radar to rotate to the preset initial pose and then performing rotation measurement, and acquiring current rotation angle data and radar measurement data in real time.
In some embodiments, step S3 comprises removing abnormal measurement points from the point cloud data screened in step S2 by point cloud filtering, which includes:
further segmenting the screened point cloud data to obtain a local point cloud set;
calculating the spatial distance between each point in the local point cloud set and the laser radar origin, and calculating the standard deviation of the local point cloud set with respect to a preset point cloud ranging mean value of that set;
comparing the standard deviation with a preset second threshold value,
if the standard deviation is larger than a preset second threshold value, judging that abnormal measuring points exist in the local point cloud collection area, executing filtering operation at the moment, and removing the abnormal measuring points;
and if the standard deviation is smaller than a preset second threshold, judging that the local point cloud set region has no abnormal measuring point, and judging other local point cloud sets at the moment.
In some embodiments, when the abnormal measurement points are rejected:
when the standard deviation is larger than a preset second threshold value, the points in the local point cloud set are judged,
if the spatial distance from a point in the local point cloud set to the laser radar origin is smaller than or equal to the preset point cloud ranging mean value, judging that the point is not abnormal, and keeping the point;
and if the spatial distance from one point in the local point cloud set to the laser radar origin is greater than a preset point cloud ranging average value, judging that the point is an abnormal point, and removing the point.
In some embodiments, in the step S3, during the point cloud data coordinate conversion, the radar measurement data after the point cloud filtering is subjected to rotational translation transformation according to the rotation angle data matched with the radar measurement data, so as to obtain a three-dimensional reconstruction result based on the initial pose state, and store the obtained result.
In a second aspect, an embodiment of the application provides a laser radar three-dimensional reconstruction device. The device comprises a laser radar and a rotating structure; the rotating structure enables the laser radar to perform rotational measurement in the horizontal direction, and the origin of the laser radar and the center of the rotating structure lie in the same coordinate system. The device is further provided with a transmission structure for uploading, in real time, the measurement data captured by the laser radar and the rotation angle data of the laser radar.
In some embodiments, the horizontal rotation range of the rotating structure is 0-180°, one end of this range is set as the initial pose of the laser radar, and a sensor for detecting whether the laser radar is in the initial pose is arranged at the initial pose position.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory storing a computer program, where the computer program, when executed by the processor, implements the method according to any one of the above items.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the method according to any one of the above.
From the above, compared with the prior art, the technical solution of the invention has the following beneficial effects:
1. the invention adopts single-point rotary three-dimensional mapping with a laser radar, so that large-range, high-precision and rapid three-dimensional scene mapping can be completed while occupying little structural space;
2. the point cloud data are matched to the rotation angle by timestamp, the point cloud data are processed with a filtering algorithm so that abnormal point cloud data are detected and eliminated, and the filtered point cloud data of different timestamps are fused and spliced according to the rotation angle; this avoids wrong modeling results caused by multiple reflections of the laser radar light spot, solves the smearing problem of traditional laser modeling, and effectively improves the definition and accuracy of the three-dimensional map.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart of a lidar three-dimensional reconstruction method according to an embodiment of the application;
FIG. 2 is a schematic structural diagram of a lidar three-dimensional reconstruction apparatus according to an embodiment of the application;
fig. 3 is a block diagram of an electronic device according to an embodiment of the present application.
Description of the reference numerals: 1. laser radar; 2. rotating structure; 3. mounting bracket; 4. sensor.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clearly understood, the present application is described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the application, and that it is also possible for a person skilled in the art to apply the application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, "a and/or B" may indicate: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
The embodiments of the application aim to provide a single-point rotary three-dimensional reconstruction technique based on a laser radar, in order to solve the problems that existing laser radar three-dimensional reconstruction occupies a large amount of operating space and that multiple reflections of the light spot easily produce wrong modeling results and smearing, so that the model is neither clear nor accurate.
In this technique, the radar measurement data obtained by the laser radar scan and the rotation angle data obtained from the rotating structure of the laser radar are compared by timestamp; point cloud data and angle data whose compared timestamps fall within the allowed error range are paired; the point cloud data are filtered to remove abnormal data; and the point cloud data that are not removed are fused using the angle data matched with them, so that a clear and accurate three-dimensional reconstruction result based on the initial pose of the laser radar can be obtained.
Fig. 1 is a flowchart of a lidar three-dimensional reconstruction method according to an embodiment of the application, as shown in fig. 1, the lidar three-dimensional reconstruction method includes the following steps:
the method comprises the following steps of S1, performing three-dimensional reconstruction scanning on a scene to be operated through a laser radar, and acquiring current rotation angle data and radar measurement data of the laser radar in real time; the rotation angle data comprises angle data and corresponding time stamp data, and the radar measurement data comprises point cloud data and corresponding time stamp data;
s2, comparing the timestamp data in the radar measurement data with the timestamp data in the rotation angle data, and screening out matched point cloud data and angle data according to the time difference between the timestamp data in the radar measurement data and the timestamp data in the rotation angle data so as to obtain laser radar data with accurate angle positions;
and S3, analyzing and processing the point cloud data screened in the step S2 through a callback function, performing coordinate conversion on the point cloud data after analysis and processing, and splicing and fusing the point cloud data under different time sequences according to the rotation angle of the point cloud data.
In order to more clearly explain the present application, specific examples are given and detailed below.
The method comprises the following steps that S1, three-dimensional reconstruction scanning is carried out on a scene to be operated through a laser radar, and current rotation angle data and radar measurement data of the laser radar are obtained in real time; the rotation angle data comprise angle data and corresponding time stamp data, and the radar measurement data comprise point cloud data and corresponding time stamp data.
As an example, the laser radar is mounted, horizontally rotatable via the rotating structure, at the middle position between the two mechanical arms of the working robot; the rotation angle range of the rotating structure is set to 0-180°; the pose in which the laser radar faces 0° is preset and recorded as the initial pose; and whether the laser radar is in the preset initial pose is determined by a sensor in cooperation with a preset program.
Further, before performing three-dimensional reconstruction, whether the laser radar is in a preset initial pose is judged:
if the laser radar is in a preset initial pose, the laser radar starts to carry out rotation measurement, and current rotation angle data and radar measurement data are acquired and uploaded in real time;
and if the laser radar is not at the preset initial pose, controlling the laser radar to rotate to the preset initial pose through the rotating structure, then starting the rotation measurement of the laser radar, and acquiring and uploading the current rotation angle data and the radar measurement data in real time.
The radar measurement data comprise point cloud data and timestamp data matched with the point cloud data; the rotation angle data comprise angle data and timestamp data matched with the angle data; the rotation angle data are uploaded at a frequency of 50 Hz.
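For illustration, a minimal sketch of the two timestamped message types described above is given below; the type and field names, the assumption that timestamps are expressed in seconds, and the placeholder payload are not part of the original disclosure.
#include <vector>

// Illustrative message structures only; the concrete payload of a radar frame
// is not specified here, so it is represented by a placeholder range buffer.
struct AngleMsg {
    double timestamp;    // acquisition time of this angle sample (assumed: seconds)
    double angle_deg;    // rotation angle of the rotating structure (0-180 degrees, 50 Hz)
};

struct ScanMsg {
    double timestamp;            // acquisition time of this radar measurement frame
    std::vector<float> ranges;   // placeholder for the point cloud / range payload
};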
The laser radar is connected with the rotating structure through the mounting bracket, and the origin of the laser radar and the center of the rotating structure are located in the same coordinate system.
Furthermore, the rotating shaft of the laser radar and the rotating shaft of the rotating structure are coaxially arranged, and the rotating shaft of the rotating structure is perpendicular to the horizontal plane.
Preferably, the sensor may be a photoelectric switch or a proximity switch provided at one end of the initial pose to determine whether the laser radar is in the initial pose.
Preferably, the laser radar is a 16-line 360-degree mechanical laser radar.
Preferably, the two arms of the operation robot are respectively arranged on the left side and the right side, the direction towards one of the arms of the operation robot is set to be a 0-degree direction, the direction towards the other arm of the operation robot is set to be a 180-degree direction, and when the view window of the laser radar faces to the 0-degree direction, the laser radar is in the initial pose.
As an example, the data structure of a single measurement point in the point cloud data is designed as:
struct Point {
    double x;          // distance from the laser emitter to the target point along the x axis
    double y;          // distance along the y axis
    double z;          // distance along the z axis
    double intensity;  // laser reflection intensity
    int ring;          // laser emitter number
    bool save;         // whether this point is kept after filtering
};
where x, y and z are the distances from the laser emitter to the target point along the x, y and z axes of the three-dimensional space, intensity is the laser reflection intensity, ring is the laser emitter number, and save indicates whether the point is kept.
And S2, comparing the timestamp data in the radar measurement data with the timestamp data in the rotation angle data, and screening out matched point cloud data and angle data according to the time difference between the timestamp data in the radar measurement data and the timestamp data in the rotation angle data so as to obtain laser radar data with accurate angle positions.
As an example, the acquired rotation angle data and the radar measurement data are respectively stored in a queue mode according to the sequence of data acquisition time, and the data acquired first are stored at the front end of the queue; and respectively extracting rotation angle data from the rotation angle data queue and radar measurement data from the radar measurement data queue in sequence from the head of the queue, analyzing corresponding timestamp data from the extracted rotation angle data and radar measurement data, calculating a difference value of the two timestamp data, and comparing the difference value with a preset first threshold value.
Further, when the difference value is compared with a preset first threshold value, if the difference value is smaller than the first threshold value, the angle data of the current timestamp is adopted as the rotation angle of the current point cloud data;
if the difference value is larger than the first threshold value, the size of the timestamp data in the radar measurement data and the size of the timestamp data in the rotation angle data are continuously judged:
if the timestamp of the radar measurement data is smaller than the timestamp of the rotation angle data, the timestamp of the radar measurement data is considered to be earlier time data, the radar measurement data is abandoned, the next radar measurement data is extracted from a radar measurement data queue, the timestamp data difference is recalculated, and the judgment is carried out;
and if the time stamp of the rotation angle data is smaller than that of the radar measurement data, the time stamp of the rotation angle data is considered to be earlier time data, the group of rotation angle data is abandoned, the next group of rotation angle data is extracted from the rotation angle data queue, the time stamp data difference value is recalculated, and the judgment is carried out.
And storing the point cloud data and the angle data which are matched in pair after screening, thereby obtaining the laser radar data with accurate angle positions.
When the difference between the two sets of timestamp data is greater than the preset first threshold, the set with the smaller timestamp value is discarded because the data queues are ordered in time from small to large: data are extracted in queue order for comparison, so a later entry in a queue necessarily has a larger timestamp than the entries before it. If instead the set with the larger timestamp value were discarded, the next set extracted would only make the difference between the rotation angle timestamp and the radar measurement timestamp larger than for the previous pair, which would not serve the purpose of the comparison.
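The matching rule described above can be sketched as follows; AngleMsg and ScanMsg are the illustrative structures from the earlier sketch, and the 10 ms default threshold stands in for the preset first threshold, whose value is not given in the disclosure.
#include <cmath>
#include <deque>
#include <utility>
#include <vector>

// Pair each radar frame with the angle sample closest in time: accept the pair
// when the timestamp difference is below the threshold, otherwise discard the
// older of the two front messages and try again. Sketch only.
std::vector<std::pair<ScanMsg, AngleMsg>> matchByTimestamp(
        std::deque<ScanMsg>& scans, std::deque<AngleMsg>& angles,
        double threshold = 0.01) {
    std::vector<std::pair<ScanMsg, AngleMsg>> matched;
    while (!scans.empty() && !angles.empty()) {
        const ScanMsg& s = scans.front();
        const AngleMsg& a = angles.front();
        if (std::fabs(s.timestamp - a.timestamp) < threshold) {
            matched.emplace_back(s, a);   // timestamps agree: accept the pair
            scans.pop_front();
            angles.pop_front();
        } else if (s.timestamp < a.timestamp) {
            scans.pop_front();            // the scan is the older message: discard it
        } else {
            angles.pop_front();           // the angle sample is the older message: discard it
        }
    }
    return matched;
}
Because both queues are ordered in time, discarding the older of the two front messages and retrying mirrors the reasoning above and steadily moves the comparison toward pairs whose timestamps differ by less than the threshold.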
And S3, analyzing and processing the point cloud data screened in the step S2 through a callback function, carrying out coordinate conversion on the point cloud data after analysis and processing, and splicing and fusing the point cloud data under different time sequences according to the rotation angle of the point cloud data.
As one example, the callback function includes point cloud filtering and point cloud coordinate conversion.
In the application scenario of a working robot, laser ranging errors caused by multiple reflections of the light spot emitted by the laser radar can occur.
As an example, the point cloud data screened in step S2 is parsed, the point cloud data is divided into 16 groups of point cloud files according to ring values (laser emitter numbers) of the point cloud data, and the point cloud data in each group of point cloud files are ordered according to angle values of 0 to 360 degrees;
traversing all the point cloud data in the 16 groups of point cloud files, and, while traversing each group, further segmenting the point cloud data in that group under two constraints, a preset angle value and a preset number of adjacent points, to obtain local point cloud sets N_k = {p_i, ..., p_n}.
Specifically, different angle values and the number of adjacent points can be adopted according to different application scenes to further distinguish the point cloud data.
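As an illustration of this grouping step, the sketch below splits a frame into 16 ring groups ordered by azimuth and then cuts each group into local point cloud sets bounded by an angular window and a point-count cap; the 2° window and the 30-point cap are assumed values, since the disclosure leaves the concrete constraints to the application scene.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// Uses the Point structure defined earlier in the description
// (x, y, z, intensity, ring, save).
std::vector<std::vector<Point>> segmentLocalSets(const std::vector<Point>& cloud,
                                                 double maxAngleDeg = 2.0,
                                                 std::size_t maxPoints = 30) {
    const double kPi = 3.14159265358979323846;
    auto azimuth = [kPi](const Point& p) {
        double a = std::atan2(p.y, p.x) * 180.0 / kPi;   // azimuth in degrees
        return a < 0.0 ? a + 360.0 : a;                  // wrap to 0..360
    };

    // Group the frame by laser emitter number (ring) for a 16-line lidar.
    std::array<std::vector<Point>, 16> rings;
    for (const Point& p : cloud) {
        if (p.ring >= 0 && p.ring < 16) rings[p.ring].push_back(p);
    }

    // Sort each ring by azimuth and cut it into local sets limited by an
    // angular window and a point-count cap (both values are assumptions).
    std::vector<std::vector<Point>> localSets;
    for (auto& ring : rings) {
        std::sort(ring.begin(), ring.end(),
                  [&azimuth](const Point& a, const Point& b) { return azimuth(a) < azimuth(b); });
        std::vector<Point> current;
        for (const Point& p : ring) {
            if (!current.empty() &&
                (azimuth(p) - azimuth(current.front()) > maxAngleDeg ||
                 current.size() >= maxPoints)) {
                localSets.push_back(current);
                current.clear();
            }
            current.push_back(p);
        }
        if (!current.empty()) localSets.push_back(current);
    }
    return localSets;
}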
Further, the standard deviation s of the set N_k is calculated as

s = sqrt( (1/n) · Σ_{i=1..n} ( p_i - μ_k )² )

where p_i = sqrt( x_i² + y_i² + z_i² ) denotes the spatial distance from point i of the local point cloud set N_k to the laser radar origin (0, 0, 0); (x_i, y_i, z_i) are the three-dimensional coordinates of point i; n is the number of points in the local point cloud set N_k; and μ_k is the preset point cloud ranging mean value of the local point cloud set N_k.
the standard deviation s is compared with a preset second threshold value,
If the standard deviation s is greater than the preset second threshold, it is determined that abnormal measurement points exist in the region of the local point cloud set N_k; the filtering operation is then executed and the abnormal measurement points are removed;
If the standard deviation s is smaller than the preset second threshold, it is determined that the region of the local point cloud set N_k contains no abnormal measurement points, and the other local point cloud sets are evaluated, until all point cloud data in the 16 groups of point cloud files have been traversed.
Further, when the standard deviation s is greater than a preset second threshold, removing the abnormal measurement points:
For the points in the local point cloud set N_k, let the point under evaluation be point i; the following decision rule is used when s is greater than the preset second threshold ThrAbnormal:

p_i.save = true,  if p_i ≤ μ_k
p_i.save = false, if p_i > μ_k

where ThrAbnormal is the preset second threshold; p_i denotes the spatial distance from point i of the local point cloud set N_k to the laser radar origin; p_i.save = false means the save field of point i is set to false; and p_i.save = true means the save field of point i is set to true.
The decision process is as follows:
if the spatial distance p_i from point i of the local point cloud set N_k to the laser radar origin is smaller than or equal to the preset point cloud ranging mean value μ_k, the point is judged not to be abnormal, p_i.save = true, and the point is kept;
if the spatial distance p_i from point i of the local point cloud set N_k to the laser radar origin is greater than the preset point cloud ranging mean value μ_k, the point is judged to be an abnormal point, p_i.save = false, and the point is removed.
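A sketch of this per-set filtering rule is given below: the standard deviation of the point-to-origin distances with respect to the preset mean μ_k is computed, and only when it exceeds the preset second threshold are the points farther than μ_k marked save = false; the function signature and the way μ_k and the threshold are supplied are assumptions.
#include <cmath>
#include <cstddef>
#include <vector>

// Filter one local point cloud set N_k in place. Point is the structure defined
// earlier in the description; mu_k (the preset ranging mean of this set) and
// thrAbnormal (the preset second threshold) are caller-supplied assumptions.
void filterLocalSet(std::vector<Point>& localSet, double mu_k, double thrAbnormal) {
    if (localSet.empty()) return;

    // Spatial distance from every point to the laser radar origin (0, 0, 0).
    std::vector<double> dist(localSet.size());
    double sumSq = 0.0;
    for (std::size_t i = 0; i < localSet.size(); ++i) {
        const Point& p = localSet[i];
        dist[i] = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
        sumSq += (dist[i] - mu_k) * (dist[i] - mu_k);
    }
    const double s = std::sqrt(sumSq / localSet.size());   // standard deviation against mu_k

    if (s <= thrAbnormal) {
        // No abnormal measurement points in this set: keep every point.
        for (Point& p : localSet) p.save = true;
        return;
    }
    // Abnormal points present: keep points at or below the preset ranging mean,
    // discard (save = false) points that lie beyond it.
    for (std::size_t i = 0; i < localSet.size(); ++i) {
        localSet[i].save = (dist[i] <= mu_k);
    }
}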
Further, the point cloud data after filtering are rotated and translated by means of a rotation-translation matrix according to the angle data and timestamp data matched with them, so as to obtain a three-dimensional reconstruction result based on the initial pose state; the result is stored after multi-frame fusion.
Specifically, since the rotating structure rotates about only one axis, the rotation-translation transform only involves rotation and translation about that axis.
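Under that assumption, the fusion step can be sketched as a rotation of each filtered frame about the z-axis by its matched angle; any residual offset between the lidar origin and the rotation center is taken to be zero here, which is itself an assumption consistent with both lying in the same coordinate system.
#include <cmath>
#include <vector>

// Express one filtered frame in the initial-pose coordinate system by rotating
// it about the vertical (z) axis by the angle matched to that frame, then append
// the kept points to the accumulated map. Point is the structure defined earlier.
void fuseFrame(const std::vector<Point>& frame, double angleDeg,
               std::vector<Point>& mapCloud) {
    const double kPi = 3.14159265358979323846;
    const double a = angleDeg * kPi / 180.0;
    const double c = std::cos(a), s = std::sin(a);
    for (const Point& p : frame) {
        if (!p.save) continue;          // skip points rejected by the filtering step
        Point q = p;
        q.x = c * p.x - s * p.y;        // rotation about the z-axis
        q.y = s * p.x + c * p.y;
        q.z = p.z;                      // the vertical coordinate is unchanged
        mapCloud.push_back(q);
    }
}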
Preferably, if some error is introduced during installation, the initial pose is calibrated manually: laser point cloud data captured in the initial pose state are used to scan a region with obvious features such as planes and corner points, the collected point cloud data are manually leveled, and the resulting calibration parameters are stored as fixed parameters and used in the program.
And S4, judging whether unprocessed data still exist in the radar measurement data queue, and if so, repeatedly executing the steps.
And S5, a three-dimensional reconstruction result centered on the rotating structure is obtained from the above steps, and a statistical filtering algorithm is applied to remove burr regions of the point cloud in the reconstruction result, so as to obtain smoother reconstructed point cloud data.
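The statistical filtering algorithm used for this smoothing step is not named in the disclosure; one common choice would be the StatisticalOutlierRemoval filter from the Point Cloud Library (PCL), sketched below with illustrative parameter values.
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/filters/statistical_outlier_removal.h>

// Remove burr/outlier regions from the fused reconstruction result.
// Sketch only: the disclosure does not prescribe PCL, and the MeanK and
// StddevMulThresh values below are illustrative assumptions.
pcl::PointCloud<pcl::PointXYZ>::Ptr
smoothReconstruction(const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud) {
    pcl::PointCloud<pcl::PointXYZ>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
    sor.setInputCloud(cloud);
    sor.setMeanK(50);               // neighbours used for the mean-distance estimate
    sor.setStddevMulThresh(1.0);    // points beyond one standard deviation are rejected
    sor.filter(*filtered);
    return filtered;
}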
Further, according to the space mapping relation between the laser central point and the central position of the mechanical arm, the point cloud result obtained in the step S5 is subjected to translation transformation, and finally the whole three-dimensional reconstruction task is completed.
From the above, the radar measurement data and the rotation angle data uploaded in real time are compared by timestamp, point cloud data and angle data that match each other are paired according to the timestamps, and abnormal data in the point cloud are eliminated by filtering the point cloud data in the callback function. Compared with the traditional method, this improves the overall precision of the three-dimensional reconstruction result, prevents abnormal data from affecting the modeling result, and resolves the smearing that arises in laser radar three-dimensional reconstruction.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The embodiment of the present application further provides a laser radar three-dimensional reconstruction device, which is used for implementing the foregoing embodiment and preferred embodiments, and details of the description are not repeated.
Fig. 2 is a schematic structural diagram of a lidar three-dimensional reconstruction device according to an embodiment of the application. As shown in Fig. 2, the device comprises a laser radar 1 and a rotating structure 2; the laser radar 1 is installed at the middle position between the two mechanical arms of the working robot and can be rotated horizontally for measurement by the rotating structure 2; the rotation angle range of the rotating structure 2 is 0-180°; the pose in which the laser radar 1 faces 0° is preset as the initial pose of the laser radar 1; and a sensor 4 for detecting whether the laser radar 1 is in the initial pose is arranged at the initial pose position.
The laser radar 1 is connected with the rotating structure 2 through the mounting bracket 3, and the origin of the laser radar 1 and the center of the rotating structure 2 lie in the same coordinate system. The device is further provided with a transmission structure for uploading, in real time, the measurement data captured by the laser radar 1 and the rotation angle data of the laser radar 1.
Further, the rotation axis of the laser radar 1 is coaxial with the rotation axis of the rotating structure 2, and the rotation axis of the rotating structure 2 is perpendicular to the horizontal plane.
Preferably, whether the laser radar 1 is in the preset initial pose is judged through the cooperation of the sensor 4 and a preset program.
Preferably, the sensor 4 may be a photoelectric switch or a proximity switch provided at one end of the initial attitude to determine whether the laser radar 1 is in the initial attitude.
Preferably, the laser radar 1 is a 16-line 360 ° mechanical laser radar.
Preferably, the two arms of the operation robot are respectively arranged on the left side and the right side, the direction towards one of the arms of the operation robot is set to be a 0-degree direction, the direction towards the other arm of the operation robot is set to be a 180-degree direction, and when the view window of the laser radar 1 faces to the 0-degree direction, the laser radar 1 is in the initial pose.
Preferably, the rotating structure 2 can be further installed on the working robot through a sliding table, so that the device can realize rotating scanning measurement along the sliding of the sliding table.
To sum up, the embodiment of the application is based on a three-dimensional reconstruction technique for a mechanical multi-line laser radar: the rotating structure drives the multi-line laser radar to perform scanning measurement, radar measurement data and rotation angle data are acquired and uploaded, matching point cloud data and angle data are selected by timestamp comparison, the selected point cloud data are filtered and abnormal points are rejected, and the three-dimensional reconstruction result is obtained after the point cloud data that have not been rejected are rotated, translated and fused.
In addition, an electronic device is further provided in an embodiment of the present application, and fig. 3 is a block diagram of a structure of the electronic device according to the embodiment of the present application, and as shown in fig. 3, the electronic device includes a processor 81 and a memory 82 in which computer program instructions are stored.
Specifically, the processor 81 may include a Central Processing Unit (CPU) or an Application Specific Integrated Circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
Memory 82 may include, among other things, mass storage for data or instructions. By way of example, and not limitation, memory 82 may include a Hard Disk Drive (Hard Disk Drive, abbreviated HDD), a floppy Disk Drive, a Solid State Drive (SSD), flash memory, an optical Disk, a magneto-optical Disk, tape, or a Universal Serial Bus (USB) Drive or a combination of two or more of these. Memory 82 may include removable or non-removable (or fixed) media, where appropriate. The memory 82 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, the memory 82 is a Non-Volatile (Non-Volatile) memory. In particular embodiments, memory 82 includes Read-Only Memory (ROM) and Random Access Memory (RAM). The ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically Erasable PROM (EEPROM), electrically rewritable ROM (earrom) or FLASH Memory (FLASH), or a combination of two or more of these, where appropriate. The RAM may be a Static Random-Access Memory (SRAM) or a Dynamic Random-Access Memory (DRAM), where the DRAM may be a Fast Page Mode Dynamic Random Access Memory (FPMDRAM), an Extended Data Out Dynamic Random Access Memory (EDODRAM), a Synchronous Dynamic Random Access Memory (SDRAM), and the like.
The memory 82 may be used to store or cache various data files for processing and/or communication use, as well as possible computer program instructions executed by the processor 81.
The processor 81 reads and executes the computer program instructions stored in the memory 82 to implement any one of the three-dimensional reconstruction methods of the lidar embodiments described above.
In some of these embodiments, the electronic device may also include a communication interface 83 and a bus 80. As shown in fig. 3, the processor 81, the memory 82, and the communication interface 83 are connected via the bus 80 to complete communication therebetween.
The communication interface 83 is used for implementing communication between modules, devices, units and/or equipment in the embodiment of the present application. The communication interface 83 may also enable communication with other components such as: the data communication is carried out among external equipment, image/data acquisition equipment, a database, external storage, an image/data processing workstation and the like.
The bus 80 includes hardware, software, or both to couple the components of the electronic device to one another. Bus 80 includes, but is not limited to, at least one of the following: data Bus (Data Bus), address Bus (Address Bus), control Bus (control Bus), expansion Bus (Expansion Bus), and Local Bus (Local Bus). By way of example and not limitation, bus 80 may include an Accelerated Graphics Port (AGP) or other Graphics Bus, an Enhanced Industry Standard Architecture (EISA) Bus, a front-side Bus (FSB), a Hyper Transport (HT) Interconnect, an ISA (ISA) Bus, an InfiniBand (InfiniBand) Interconnect, a Low Pin Count (LPC) Bus, a memory Bus, a microchannel Architecture (MCA) Bus, a PCI-Interconnect (PCI) Bus, a PCI-Express (PCI-X) Bus, a Serial Advanced Technology Attachment (SATA) Bus, a vlaudio Bus, a Video Bus, or a combination of two or more of these suitable buses. Bus 80 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the application.
In addition, in combination with the laser radar three-dimensional reconstruction method in the foregoing embodiment, the embodiment of the present application may provide a computer-readable storage medium to implement the method. The computer readable storage medium having stored thereon computer program instructions; the computer program instructions, when executed by a processor, implement any of the lidar three-dimensional reconstruction methods of the embodiments described above.
It should be understood by those skilled in the art that various technical features of the above-described embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above-described embodiments are not described, however, so long as there is no contradiction between the combinations of the technical features, they should be considered as being within the scope of the present description.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is specific and detailed, but not to be understood as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these are all within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A lidar three-dimensional reconstruction method, the method comprising:
the method comprises the following steps that S1, three-dimensional reconstruction scanning is carried out on a scene to be operated through a laser radar, and current rotation angle data and radar measurement data of the laser radar are obtained in real time; the rotation angle data comprises angle data and corresponding timestamp data, and the radar measurement data comprises point cloud data and corresponding timestamp data;
s2, comparing timestamp data in the radar measurement data with timestamp data in the rotation angle data, and screening out matched point cloud data and angle data according to the time difference between the timestamp data in the radar measurement data and the timestamp data in the rotation angle data so as to obtain laser radar data with accurate angle pose;
and S3, analyzing and processing the point cloud data screened in the step S2 through a callback function, carrying out coordinate conversion on the point cloud data after analysis and processing, and splicing and fusing the point cloud data under different time sequences according to the rotation angle of the point cloud data.
2. The method according to claim 1, wherein in step S2, the acquired rotation angle data and radar measurement data are stored in a queue, and are respectively extracted in sequence from the head of the queue, the difference between the timestamp data of the two sets of data is analyzed and compared, the difference is compared with a preset first threshold value,
if the difference value is smaller than the first threshold value, adopting the angle data of the current timestamp as the rotation angle of the current point cloud data;
if the difference value is larger than a first threshold value, the sizes of the timestamp data in the radar measurement data and the timestamp data in the rotation angle data are continuously judged, a group of data with smaller timestamp data values in the radar measurement data and the rotation angle data is abandoned, the difference value of the timestamp data of the next group of data and the data which are not abandoned is selected from the abandoned data queue in sequence, and the difference value is compared with a preset first threshold value.
3. The method according to claim 1, wherein in step S1, before the lidar operation, it is determined whether the lidar is in a preset initial pose;
if the laser radar is in the preset initial pose, performing rotation measurement on the laser radar, and acquiring current rotation angle data and radar measurement data in real time;
and if the laser radar is not in the preset initial pose, controlling the laser radar to rotate to the preset initial pose and then performing rotation measurement, and acquiring current rotation angle data and radar measurement data in real time.
4. The method according to claim 1, wherein the step S3 includes removing abnormal measuring points from the point cloud data filtered in the step S2 by using a point cloud filtering process, including:
further segmenting the screened point cloud data to obtain a local point cloud set;
calculating the spatial distance between a point in the local point cloud set and the origin of the laser radar, and matching the spatial distance with a preset point cloud ranging mean value in the local point cloud set to calculate the standard deviation of the local point cloud set;
the standard deviation is compared with a preset second threshold value,
if the standard deviation is larger than a preset second threshold value, judging that abnormal measuring points exist in the local point cloud set area, executing filtering operation at the moment, and removing the abnormal measuring points;
if the standard deviation is smaller than a preset second threshold value, judging that the local point cloud set area has no abnormal measuring point, and judging other local point cloud sets at the moment.
5. The method according to claim 4, wherein when the abnormal measurement points are eliminated:
when the standard deviation is larger than a preset second threshold value, judging the points in the local point cloud set,
if the spatial distance from one point in the local point cloud set to the laser radar origin is smaller than or equal to a preset point cloud ranging mean value, judging that the point is not abnormal, and storing the point;
and if the spatial distance from one point in the local point cloud set to the laser radar origin is greater than a preset point cloud ranging average value, judging that the point is an abnormal point, and removing the point.
6. The method according to claim 5, wherein in the step S3, during the point cloud data coordinate transformation, the radar measurement data after the point cloud filtering is subjected to rotational translation transformation according to the rotation angle data matched with the radar measurement data, so as to obtain a three-dimensional reconstruction result based on an initial pose state, and the obtained result is stored.
7. A laser radar three-dimensional reconstruction device is characterized in that: the device comprises a laser radar and a rotating structure, the rotating structure enables the laser radar to perform rotational measurement in the horizontal direction, the origin of the laser radar and the center of the rotating structure lie in the same coordinate system, and the device is further provided with a transmission structure for uploading, in real time, the measurement data captured by the laser radar and the rotation angle data of the laser radar.
8. The lidar three-dimensional reconstruction apparatus according to claim 7, wherein: the horizontal rotation range of the rotating structure is 0-180 degrees, one end of the laser radar in the horizontal rotation range is set as the initial pose of the laser radar, and a sensor for detecting whether the laser radar is in the initial pose is arranged at the initial pose.
9. An electronic device comprising a processor and a memory storing a computer program, wherein the computer program, when executed by the processor, implements the method of any of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
CN202210898161.1A 2022-07-28 2022-07-28 Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium Pending CN115291243A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210898161.1A CN115291243A (en) 2022-07-28 2022-07-28 Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210898161.1A CN115291243A (en) 2022-07-28 2022-07-28 Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115291243A true CN115291243A (en) 2022-11-04

Family

ID=83825110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210898161.1A Pending CN115291243A (en) 2022-07-28 2022-07-28 Laser radar three-dimensional reconstruction method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115291243A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116125446A (en) * 2023-01-31 2023-05-16 清华大学 Calibration method and device of rotary driving type multi-line laser radar three-dimensional reconstruction device
CN116125446B (en) * 2023-01-31 2023-09-05 清华大学 Calibration method and device of rotary driving type multi-line laser radar three-dimensional reconstruction device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination