CN113643382B - Method and device for acquiring dense colored point cloud based on rotary laser fusion camera - Google Patents

Method and device for acquiring dense colored point cloud based on rotary laser fusion camera

Info

Publication number
CN113643382B
Authority
CN
China
Prior art keywords
point cloud
camera
dimensional laser
acquisition
steering engine
Prior art date
Legal status
Active
Application number
CN202110964419.9A
Other languages
Chinese (zh)
Other versions
CN113643382A (en)
Inventor
王越
张群康
李彰
朱性利
汪曼
熊蓉
Current Assignee
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202110964419.9A
Publication of CN113643382A
Application granted
Publication of CN113643382B
Status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 - Image mosaicing, e.g. composing plane images from plane sub-images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a method and a device for acquiring dense colored point clouds based on a rotary laser fused with cameras. The device comprises a steering engine (servo motor), a bracket driven to rotate by the steering engine, and a two-dimensional laser radar mounted on the bracket; by driving the bracket, the steering engine rotates the two-dimensional laser radar. The two-dimensional laser radar performs rotary laser acquisition: its scanning plane is swept through 360 degrees, which increases the density of the acquired point cloud. The added steering engine gives the two-dimensional laser radar an extra degree of freedom for point cloud acquisition, and its rotation speed controls the density of the acquired point cloud. Because the scanning plane of the two-dimensional laser is parallel to the rotation axis of the steering engine, the acquired laser point cloud is lifted from two dimensions to three dimensions. RGB cameras are added so that RGB information of the scene is acquired together with the point cloud data, enabling point cloud coloring and visualization applications.

Description

Method and device for acquiring dense colored point cloud based on rotary laser fusion camera
Technical Field
The invention relates to a point cloud acquisition technology, in particular to a method and a device for acquiring dense colored point cloud based on a rotary laser fusion camera.
Background
In recent years, with the development of virtual reality (VR) and augmented reality (AR) technologies, higher demands are placed on the reconstruction of everyday scenes. In applications such as virtual conferences, virtual tours, ancient-site restoration and indoor navigation, three-dimensional scene reconstruction is indispensable, so efficiently acquiring a dense, colored three-dimensional point cloud of a scene has broad application prospects. At present, scene point clouds are mainly acquired with a depth camera, a multi-line laser radar or a static laser scanner, but these approaches struggle to balance detection range against point density: a depth camera produces dense data, but its field of view and effective measurement distance are small, making efficient capture of large scenes difficult; a multi-line laser radar offers a 360-degree scanning angle and a long measurement distance, but its point cloud is relatively sparse and carries no color, so complex geometric structures cannot be reconstructed for visualization-oriented three-dimensional reconstruction. Static laser scanners are expensive and therefore hard to deploy widely.
Disclosure of Invention
To solve the above problems of existing point cloud acquisition devices, the invention aims to design a complete device system combining a rotary two-dimensional laser radar with RGB cameras; the device and method are used to acquire dense colored point clouds and can efficiently acquire the colored point cloud of a dense scene.
The invention is realized by the following technical scheme:
The invention discloses a dense point cloud acquisition device based on a rotary two-dimensional laser fused with cameras, comprising a steering engine, a bracket driven to rotate by the steering engine, and a two-dimensional laser radar mounted on the bracket; by driving the bracket, the steering engine rotates the two-dimensional laser radar. The two-dimensional laser radar performs rotary laser acquisition: its scanning plane is swept through 360 degrees, which increases the density of the acquired point cloud.
The device further comprises one or more RGB cameras located around the two-dimensional laser radar for acquiring RGB information of the scene. To avoid time-synchronization errors between the cameras and the two-dimensional laser radar, the panoramic view is formed by several fixed RGB cameras rather than by rotating a single camera with the steering engine. After a fixed extrinsic transformation, the acquired three-dimensional point cloud is re-projected into each camera coordinate system to obtain the image color of the corresponding pixels, yielding a dense three-dimensional point cloud with RGB information.
As a further improvement, the acquisition plane of the two-dimensional laser radar is parallel to the steering engine rotating shaft.
As a further improvement, the RGB cameras according to the present invention are respectively numbered 1, 2, …, and camera No. 1 is considered as a main camera for subsequent coordinate transformation between other sensor coordinate systems. The number of cameras is not fixed, and can be flexibly adjusted according to actual requirements. Meanwhile, since the angles of view of the different types of cameras are different, there is also a difference in the number requirements. After the steps are finished, the whole system is built, and data acquisition and post-processing are carried out through related sensors. The whole device can be kept still in the process of data acquisition, so that the awareness of environment details is improved.
The invention also discloses an acquisition method of the dense point cloud acquisition device based on the rotary two-dimensional laser fusion camera, which comprises the following data acquisition steps: 1) point cloud acquisition, 2) image acquisition, 3) point cloud coloring.
As a further improvement, before step 1) the sensor parameters involved in the device are defined. The internal parameter (intrinsic) matrix of a camera is

$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$

the camera distortion coefficients are $k_1$, $k_2$, $k_3$; and the external transformation matrix between cameras is $T_{c_i}^{c_j}$, where $c_i$ and $c_j$ denote camera numbers and $T_{c_i}^{c_j}$ is the coordinate transformation matrix from the $c_i$ camera coordinate system to the $c_j$ camera coordinate system. Because the two-dimensional laser radar rotates continuously under the action of the steering engine, its coordinate system is not fixed as the steering engine turns. A conventional steering engine provides angular position readings, so, to fix the laser radar coordinate system, the two-dimensional laser radar coordinate system at the 0-degree position is defined as the point cloud coordinate system. All acquired laser point clouds can then be transformed into the point cloud coordinate system through the steering engine angle reading $\theta$ for stitching. Likewise, the external transformation matrix $T_l^{c_1}$ between the main camera and the point cloud coordinate system can be obtained by calibration. In this way, a unified coordinate system is available for all sensors.
As a further improvement, step 1), point cloud acquisition, comprises two steps: rotary laser acquisition and point cloud stitching. Rotary laser acquisition is specifically: the computer collects raw laser scan point clouds through the two-dimensional laser radar, and while collecting the two-dimensional laser point cloud data it also records the steering engine angle reading $\theta$ at each moment. Point cloud stitching is specifically: the raw data are processed by combining the laser-steering-engine extrinsics with the steering engine angle, and multiple frames of two-dimensional laser scans are stitched into a three-dimensional point cloud. Each angle $\theta$ corresponds to a transformation matrix $T_\theta$, and each frame of laser points is transformed from the current laser coordinate system into the point cloud coordinate system for stitching:

$P_l = T_\theta P_\theta$

where $P_l$ denotes the point cloud in the point cloud coordinate system and $P_\theta$ the point cloud acquired at the current moment.
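For illustration only, the stitching step can be sketched as follows. This is a minimal Python/NumPy sketch under stated assumptions: the steering engine axis is taken as the z-axis of the point cloud frame, and the function names and the `R_mount`/`t_mount` lever-arm calibration (with placeholder values) are hypothetical, not taken from the patent.

```python
import numpy as np

def T_theta(theta_rad, R_mount, t_mount):
    """Homogeneous transform from the lidar frame at angle theta to the
    point cloud frame (defined as the lidar frame at 0 degrees).
    R_mount, t_mount: rigid transform between the steering engine axis
    and the lidar origin, assumed known from calibration."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    # Rotation about the steering engine axis (assumed here to be z).
    Rz = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
    T = np.eye(4)
    T[:3, :3] = Rz @ R_mount
    T[:3, 3] = Rz @ t_mount
    return T

def stitch(scans):
    """scans: list of (theta_rad, Nx3 points in the current lidar frame).
    Returns the stitched cloud in the point cloud coordinate frame."""
    R_mount, t_mount = np.eye(3), np.zeros(3)  # placeholder calibration
    merged = []
    for theta, pts in scans:
        T = T_theta(theta, R_mount, t_mount)
        homo = np.hstack([pts, np.ones((pts.shape[0], 1))])
        merged.append((T @ homo.T).T[:, :3])   # P_l = T_theta * P_theta
    return np.vstack(merged)
```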
As a further improvement, the image acquisition in step 2) of the invention comprises two steps of adopting a camera to acquire images and adopting a computer to acquire camera RGB images through the camera, and the image de-distortion comprises the step of calibrating the acquired images by utilizing the internal parameter distortion coefficients of the camera so as to remove the influence caused by lens distortion.
As a further improvement, in the image de-distortion process, the invention aims at any point [ u, v ] on the image pixels during calibration]R is the point to the origin of the coordinate system [ u ] 0 ,v 0 ]Distance between k 1 、k 2 、k 3 For the distortion coefficients mentioned above, the calibrated pixel coordinates are:
u distorted =u(1+k 1 r 2 +k 2 r 4 +k 3 r 6 )
v distorted =v(1+k 1 r 2 +k 2 r 4 +k 3 r 6 )。
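For illustration only, a minimal Python/NumPy sketch of this correction follows. The patent gives only the radial model above, so measuring $r$ in focal-length-normalized coordinates and using nearest-neighbour resampling are assumptions made here; the function name is hypothetical.

```python
import numpy as np

def undistort_image(img, K, k1, k2, k3):
    """Remove radial lens distortion following the model above: for each
    undistorted pixel (u, v), the source pixel in the raw image is
    (u, v) scaled by (1 + k1 r^2 + k2 r^4 + k3 r^6) about the principal
    point. Assumption: r is measured in normalized image coordinates."""
    h, w = img.shape[:2]
    fx, fy = K[0, 0], K[1, 1]
    u0, v0 = K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Normalized coordinates relative to the principal point [u0, v0].
    x, y = (u - u0) / fx, (v - v0) / fy
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    # Location in the distorted image to sample for each output pixel.
    ud = np.clip((x * scale * fx + u0).round().astype(int), 0, w - 1)
    vd = np.clip((y * scale * fy + v0).round().astype(int), 0, h - 1)
    return img[vd, ud]  # nearest-neighbour resampling
```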
As a further improvement, step 3), point cloud coloring, is specifically: first transform all point clouds into the main camera coordinate system using $T_l^{c_1}$; then transform them into each camera coordinate system using the inter-camera extrinsics $T_{c_1}^{c_i}$; finally, through the camera intrinsic matrix $K$, project the point cloud in camera coordinate system $c_i$ onto the pixel plane:

$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{Z_{c_i}} K P_{c_i}$

where $Z_{c_i}$ denotes the depth value of the point (its z coordinate), and the RGB value at the pixel is the color information of the corresponding point. Color is extracted for all points that fall within the image range, all uncolored points are removed, the final dense colored three-dimensional point cloud is obtained, and data acquisition is completed.
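For illustration only, the coloring step can be sketched as follows (Python/NumPy). The function and argument names are hypothetical, and the rule that the first camera to see a point assigns its color is an assumption; the patent specifies only the transform chain and the projection $[u, v, 1]^T = \frac{1}{Z_{c_i}} K P_{c_i}$.

```python
import numpy as np

def color_points(P_l, images, T_l_c1, T_c1_ci, Ks):
    """Assign RGB to each point of the stitched cloud P_l (Nx3).
    images[i], T_c1_ci[i], Ks[i]: image, 4x4 main-camera-to-camera-i
    extrinsic, and intrinsic matrix of camera i (illustrative names)."""
    n = P_l.shape[0]
    colors = np.full((n, 3), -1, dtype=int)          # -1 marks uncolored
    homo = np.hstack([P_l, np.ones((n, 1))])
    P_c1 = (T_l_c1 @ homo.T).T                       # into main camera frame
    for img, T, K in zip(images, T_c1_ci, Ks):
        P_ci = (T @ P_c1.T).T[:, :3]                 # into camera i frame
        z = P_ci[:, 2]
        uv = (K @ P_ci.T).T
        u = uv[:, 0] / z                             # [u, v, 1]^T = K P / Z
        v = uv[:, 1] / z
        h, w = img.shape[:2]
        ok = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h) \
             & (colors[:, 0] < 0)                    # first camera wins
        colors[ok] = img[v[ok].astype(int), u[ok].astype(int)]
    keep = colors[:, 0] >= 0                         # drop uncolored points
    return P_l[keep], colors[keep]
```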
The core ideas of the invention are the following three points: 1. how to obtain a denser geometric point cloud; 2. how to color the obtained point cloud accurately; 3. how to realize a stable, efficient and convenient acquisition process. Compared with existing point cloud acquisition devices, the beneficial effects of the invention are:
1. By adding the steering engine, the invention gives the two-dimensional laser radar an extra degree of freedom for point cloud acquisition; by varying the rotation speed of the steering engine, point clouds of different densities can be acquired.
2. By placing the scanning plane of the two-dimensional laser parallel to the rotation axis of the steering engine, the acquired laser point cloud is lifted from two dimensions to three dimensions.
3. By adding RGB cameras, RGB information of the scene can be acquired together with the point cloud data, enabling point cloud coloring and visualization applications.
4. By combining the laser radar with RGB cameras, an RGB dense point cloud is obtained; compared with a conventional laser radar, the visual effect is better and the scene information is richer.
5. A static setup is used during data acquisition, avoiding errors caused by time synchronization among the two-dimensional laser, the cameras and the steering engine, and improving the accuracy of data acquisition.
Drawings
FIG. 1 is a schematic diagram of a rotary two-dimensional laser radar module of the present invention;
FIG. 2 is a schematic diagram of the overall apparatus of the present invention;
FIG. 3 is an overall flow diagram of the present invention;
In the figures, 101 is the two-dimensional laser radar, 102 is the steering engine, 201 is an RGB camera, 103 is the rotation axis of the two-dimensional laser radar, and 104 is the laser plane of the two-dimensional laser radar.
Detailed Description
The technical scheme of the invention is further described by the following specific examples:
the invention discloses a dense point cloud acquisition device based on a rotary two-dimensional laser fusion camera, and fig. 1 is a schematic diagram of a rotary two-dimensional laser radar module structure of the invention; FIG. 2 is a schematic diagram of the overall apparatus of the present invention; the device includes a steering wheel 102, drives rotatory L type support, the two-dimensional laser radar 101 of setting on L type support through steering wheel 102, and steering wheel 102 drives two-dimensional laser radar 101 through the rotatory L type support rotation of drive and rotates, and two-dimensional laser radar 101 is used for rotatory laser acquisition, still includes the RGB camera 201 of 4 wide angles, is located two-dimensional laser radar 101 periphery for acquire the RGB information of scene. The laser plane 104 of the two-dimensional laser radar is parallel to the rotation axis 103 of the two-dimensional laser radar, and the rotation axis 103 of the two-dimensional laser radar is the rotation axis of the steering engine 102; the RGB cameras 201 are respectively numbered 1, 2, …, and camera No. 1 is regarded as a main camera for subsequent coordinate transformation between other sensor coordinate systems. The purpose of selecting a plurality of RGB cameras 201 is to obtain a panoramic view angle while ensuring photographing quality. The two-dimensional laser radar is connected with the steering engine 102, and the scanning plane of the two-dimensional laser radar is rotated by 360 degrees, so that the density of the acquired point cloud is improved. Because the two-dimensional laser radar continuously rotates under the action of the steering engine 102, the coordinate system of the two-dimensional laser radar is not fixed along with the rotation of the steering engine 102, the conventional steering engine 102 can obtain angular position readings, and in order to fix the coordinate system of the laser radar, the coordinate system of the two-dimensional laser radar at the position of 0 degree is defined as a point cloud coordinate system. All the acquired laser point clouds can be converted into a point cloud coordinate system through the angle reading theta of the steering engine 102 to be spliced.
For the two-dimensional laser radar and the steering engine 102: since the steering engine 102 rotates, the rigid relationship between the laser and the steering engine 102 is the relationship between the rotation axis of the steering engine 102 and the origin of the laser coordinate system. To obtain more accurate measurements, the sensor parameters of the device must be calibrated before use, including the intrinsics $K$ of the RGB cameras 201, the inter-camera extrinsics $T_{c_i}^{c_j}$, and the extrinsics $T_l^{c_1}$ between the main camera and the point cloud coordinate system. The sensor parameters involved in the device are defined as: the camera intrinsic matrix $K$; the camera distortion coefficients $k_1$, $k_2$, $k_3$; and the inter-camera external transformation matrix $T_{c_i}^{c_j}$, where $c_i$ and $c_j$ denote camera numbers and $T_{c_i}^{c_j}$ is the coordinate transformation matrix from the $c_i$ camera coordinate system to the $c_j$ camera coordinate system.
Likewise, the external transformation matrix $T_l^{c_1}$ between the main camera and the point cloud coordinate system can be obtained by calibration; in this way, a unified coordinate system is available for all sensors. Data acquisition is performed with the whole device completely stationary, avoiding the distortion and errors of the two-dimensional laser, the cameras and the steering engine 102 that time-synchronization errors would otherwise cause.
The invention also discloses an acquisition method of the dense point cloud acquisition device based on the rotary two-dimensional laser fusion camera, which comprises the following data acquisition steps: 1) point cloud acquisition, 2) image acquisition, 3) point cloud coloring.
Data acquisition is carried out after the external parameters are calibrated. When the device is used for data acquisition, it is connected to a computer, and all data processing is performed on the computer. Fig. 3 is the overall flow diagram of the invention:
1) Point cloud acquisition: this comprises two steps, rotary laser acquisition and point cloud stitching. Rotary laser acquisition is specifically: the computer collects raw laser scan point clouds through the two-dimensional laser radar 101, and while collecting the two-dimensional laser point cloud data it also records the steering engine 102 angle reading $\theta$ at each moment. Point cloud stitching is specifically: the raw data are processed by combining the laser-steering-engine 102 extrinsics with the steering engine 102 angle, and multiple frames of two-dimensional laser scans are stitched into a three-dimensional point cloud. Each angle $\theta$ corresponds to a transformation matrix $T_\theta$, and each frame of laser points is transformed from the current laser coordinate system into the point cloud coordinate system for stitching: $P_l = T_\theta P_\theta$, where $P_l$ denotes the point cloud in the point cloud coordinate system and $P_\theta$ the point cloud acquired at the current moment.
2) Image acquisition: this comprises two steps, camera image acquisition and image de-distortion. Camera image acquisition: the computer acquires the RGB images of the cameras. Image de-distortion: the acquired images are corrected with the cameras' intrinsic distortion coefficients to remove the effects of lens distortion. In distortion correction, for any pixel $[u, v]$ in the image, let $r$ be the distance from that point to the principal point $[u_0, v_0]$ and $k_1$, $k_2$, $k_3$ the distortion coefficients defined above; the corrected pixel coordinates are:

$u_{distorted} = u(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$

$v_{distorted} = v(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$
3) Point cloud coloring: after the above steps are completed, the whole point cloud is colored. First all point clouds are transformed into the main camera coordinate system using $T_l^{c_1}$; then they are transformed into each camera coordinate system using the inter-camera extrinsics $T_{c_1}^{c_i}$; finally, through the camera intrinsic matrix $K$, the point cloud in camera coordinate system $c_i$ is projected onto the pixel plane: $[u, v, 1]^T = \frac{1}{Z_{c_i}} K P_{c_i}$, where $Z_{c_i}$ denotes the depth value (z coordinate) of the point, and the RGB value at the pixel is the color information of the corresponding point. Color is extracted for all points within the image range, all uncolored points are removed, the final dense colored three-dimensional point cloud is obtained, and data acquisition is completed.
The foregoing describes preferred embodiments of the present invention. The invention is not limited to the above examples; modifications and variations that would be apparent to one skilled in the art without departing from the spirit of the invention are intended to fall within its scope.

Claims (3)

1. An acquisition method of a dense point cloud acquisition device based on a rotary two-dimensional laser fused with cameras, characterized in that the device comprises a steering engine (102), a bracket driven to rotate by the steering engine (102), and a two-dimensional laser radar (101) mounted on the bracket, the steering engine (102) rotating the two-dimensional laser radar (101) by driving the bracket; the two-dimensional laser radar (101) is used for rotary laser acquisition; the device further comprises one or more RGB cameras (201) located around the two-dimensional laser radar (101) for acquiring RGB information of a scene; the laser plane (104) of the two-dimensional laser radar is parallel to the rotation axis (103) of the two-dimensional laser radar; the RGB cameras (201) are numbered 1, 2, …, and camera No. 1 is taken as the main camera for subsequent coordinate transformations between the other sensor coordinate systems;
the data acquisition step of the device comprises the following steps:
1) Point cloud acquisition: step 1) comprises two steps, rotary laser acquisition and point cloud stitching; the rotary laser acquisition is specifically: a computer collects raw laser scan point clouds through the two-dimensional laser radar (101), and while collecting the two-dimensional laser point cloud data it also records the steering engine (102) angle reading $\theta$ at each moment; the point cloud stitching is specifically: the raw data are processed by combining the laser-steering-engine (102) extrinsics with the steering engine (102) angle, multiple frames of two-dimensional laser scans are stitched into a three-dimensional point cloud, each angle $\theta$ corresponding to a transformation matrix $T_\theta$; each frame of laser points is transformed from the current laser coordinate system into the point cloud coordinate system for stitching: $P_l = T_\theta P_\theta$, where $P_l$ denotes the point cloud in the point cloud coordinate system and $P_\theta$ the point cloud acquired at the current moment;
2) Image acquisition: step 2) comprises two steps, camera image acquisition and image de-distortion; the camera image acquisition comprises the computer acquiring the RGB images of the cameras, and the image de-distortion comprises correcting the acquired images with the cameras' intrinsic distortion coefficients to remove the effects of lens distortion;
3) Point cloud coloring: step 3) is specifically: first transform all point clouds into the main camera coordinate system using $T_l^{c_1}$; then transform them into each camera coordinate system using the inter-camera extrinsics $T_{c_1}^{c_i}$; finally, through the camera intrinsic matrix $K$, project the point cloud in camera coordinate system $c_i$ onto the pixel plane: $[u, v, 1]^T = \frac{1}{Z_{c_i}} K P_{c_i}$, where $Z_{c_i}$ denotes the depth value (z coordinate) of the point, the RGB value at the pixel being the color information of the corresponding point; color is extracted for all points within the image range, all uncolored points are removed, the final dense colored three-dimensional point cloud is obtained, and data acquisition is completed.
2. The acquisition method of the dense point cloud acquisition device based on the rotary two-dimensional laser fused with cameras according to claim 1, characterized in that before step 1) the sensor parameters involved in the device are defined: the camera intrinsic matrix $K$; the camera distortion coefficients $k_1$, $k_2$, $k_3$; and the inter-camera external transformation matrix $T_{c_i}^{c_j}$, where $c_i$ and $c_j$ denote camera numbers and $T_{c_i}^{c_j}$ is the coordinate transformation matrix from the $c_i$ camera coordinate system to the $c_j$ camera coordinate system.
3. The acquisition method of the dense point cloud acquisition device based on the rotary two-dimensional laser fused with cameras according to claim 1, characterized in that, in the image de-distortion process, for any pixel $[u, v]$ in the image, $r$ is the distance from that point to the principal point $[u_0, v_0]$, and $k_1$, $k_2$, $k_3$ are the aforementioned distortion coefficients; the corrected pixel coordinates are:

$u_{distorted} = u(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$

$v_{distorted} = v(1 + k_1 r^2 + k_2 r^4 + k_3 r^6)$
CN202110964419.9A 2021-08-22 2021-08-22 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera Active CN113643382B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110964419.9A CN113643382B (en) 2021-08-22 2021-08-22 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110964419.9A CN113643382B (en) 2021-08-22 2021-08-22 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera

Publications (2)

Publication Number Publication Date
CN113643382A CN113643382A (en) 2021-11-12
CN113643382B (en) 2023-10-10

Family

ID=78423302

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110964419.9A Active CN113643382B (en) 2021-08-22 2021-08-22 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera

Country Status (1)

Country Link
CN (1) CN113643382B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114630096B (en) * 2022-01-05 2023-10-27 深圳技术大学 Method, device and equipment for densification of TOF camera point cloud and readable storage medium
CN114594489A (en) * 2022-02-16 2022-06-07 北京天玛智控科技股份有限公司 Mining three-dimensional color point cloud reconstruction system and method
CN116243275A (en) * 2022-12-21 2023-06-09 北京天玛智控科技股份有限公司 Panoramic three-dimensional inspection system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105928457A (en) * 2016-06-21 2016-09-07 大连理工大学 Omnidirectional three-dimensional laser color scanning system and method thereof
CN112384891A (en) * 2018-05-01 2021-02-19 联邦科学与工业研究组织 Method and system for point cloud coloring
CN109341689A (en) * 2018-09-12 2019-02-15 北京工业大学 Vision navigation method of mobile robot based on deep learning
WO2020233443A1 (en) * 2019-05-21 2020-11-26 菜鸟智能物流控股有限公司 Method and device for performing calibration between lidar and camera
CN111505606A (en) * 2020-04-14 2020-08-07 武汉大学 Method and device for calibrating relative pose of multi-camera and laser radar system
CN111366908A (en) * 2020-04-22 2020-07-03 北京国电富通科技发展有限责任公司 Laser radar rotary table and measuring device and measuring method thereof
CN112034484A (en) * 2020-09-02 2020-12-04 亿嘉和科技股份有限公司 Modeling system and method based on hemispherical laser radar
CN112304307A (en) * 2020-09-15 2021-02-02 浙江大华技术股份有限公司 Positioning method and device based on multi-sensor fusion and storage medium
CN113075683A (en) * 2021-03-05 2021-07-06 上海交通大学 Environment three-dimensional reconstruction method, device and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wu Shenghao; Zhong Ruofei. Fusion method of laser point clouds and digital images based on a mobile platform. Journal of Capital Normal University (Natural Science Edition), 2011, Vol. 32, No. 4, pp. 57-61. *
Yang Xuefei et al. Real-time rendering method for massive substation point clouds. Science and Technology & Innovation, 2021, No. 7, pp. 1-5. *

Also Published As

Publication number Publication date
CN113643382A (en) 2021-11-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant