CN115984512A - Three-dimensional reconstruction device and method for plane scene


Info

Publication number
CN115984512A
Authority
CN
China
Prior art keywords
plane
laser ranging
module
dimensional
point cloud
Prior art date
Legal status
Granted
Application number
CN202310282859.5A
Other languages
Chinese (zh)
Other versions
CN115984512B (en)
Inventor
毛靖宇
戴忠余
宋小亮
李杨
Current Assignee
Chengdu Liangxin Integrated Technology Co., Ltd.
Original Assignee
Chengdu Liangxin Integrated Technology Co., Ltd.
Priority date: 2023-03-22
Filing date: 2023-03-22
Publication date: 2023-04-18
Application filed by Chengdu Liangxin Integrated Technology Co., Ltd.
Priority to CN202310282859.5A
Publication of CN115984512A
Application granted
Publication of CN115984512B
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Processing Or Creating Images (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the technical field of three-dimensional reconstruction, and in particular to a device and method for three-dimensional reconstruction of a planar scene. The method comprises the following steps: five laser ranging modules are arranged so that the extension lines of their ranging directions intersect at one point; the visible light spots of the five laser ranging modules are projected onto five planar areas, namely the left, lower, right, upper and front planar areas, and the five modules are moved together while three-dimensional point cloud data of the five areas are continuously collected; plane equation parameters are fitted to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and a fitted plane is obtained from the four intersection points; when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, a new plane is spliced in based on the two sets of data. The invention adopts a sparse-point-cloud plane reconstruction device and method: the ranging modules have millimeter-level precision, the manufacturing cost of the equipment is reduced, and the measurement precision is improved.

Description

Three-dimensional reconstruction device and method for plane scene
Technical Field
The invention relates to the field of three-dimensional reconstruction, in particular to a three-dimensional reconstruction device and a three-dimensional reconstruction method for a plane scene.
Background
Existing three-dimensional reconstruction systems mainly complete three-dimensional reconstruction through laser SLAM algorithms, visual SLAM algorithms and multi-view point cloud splicing algorithms. However, all of these methods require dense point cloud data, so the equipment is usually very expensive; moreover, to guarantee the consistency of the point cloud data, ranging accuracy is sacrificed and is usually at the centimeter level, which cannot satisfy mapping applications that require millimeter-level accuracy.
Based on the above disadvantages, the present invention provides a sparse point cloud based planar reconstruction apparatus and method, so as to solve at least some of the above technical problems.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to provide a planar scene three-dimensional reconstruction device that solves at least some of the above problems.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a three-dimensional planar scene reconstruction device comprises a base, a control module arranged on the base, a first laser ranging module arranged on the base, a second laser ranging module arranged on the base, a third laser ranging module arranged on the base, a fourth laser ranging module arranged on the base, a fifth laser ranging module arranged on the base and a handheld structure arranged on the base, wherein the first laser ranging module is arranged on the base; the control module is respectively connected with the first laser ranging module, the second laser ranging module, the third laser ranging module, the fourth laser ranging module and the fifth laser ranging module; the first laser ranging module is fixed on the left side of the base; the second laser ranging module is fixed in the base downside, the third laser ranging module is fixed in the base right side, the fourth laser ranging module is fixed in the base upside, the fifth laser ranging module is fixed in the base front side, and the extension line of the range finding direction of first laser ranging module, second laser ranging module, third laser ranging module, fourth laser ranging module and fifth laser ranging module intersects in a point.
Further, the ranging precision of the first laser ranging module, the second laser ranging module, the third laser ranging module, the fourth laser ranging module and the fifth laser ranging module is at the millimeter level.
Further, the planar scene three-dimensional reconstruction device adopts the following control method:
the visible light spots of the first laser ranging module, the second laser ranging module, the third laser ranging module, the fourth laser ranging module and the fifth laser ranging module are respectively projected onto five planar areas, namely the left, lower, right, upper and front planar areas; the planar scene three-dimensional reconstruction device is moved while three-dimensional point cloud data of the five planar areas are continuously collected;
the control module fits plane equation parameters to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and obtains a fitted plane based on the four intersection points; when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, a new plane is spliced in based on the two sets of point cloud data.
A planar scene three-dimensional reconstruction method is characterized by comprising the following steps:
step 1, setting five laser ranging modules, wherein extension lines of ranging directions of the five laser ranging modules are intersected at one point;
step 2, projecting the visible light spots of the five laser ranging modules onto five planar areas, namely the left, lower, right, upper and front planar areas, moving the five laser ranging modules together, and continuously collecting three-dimensional point cloud data of the five planar areas;
step 3, fitting plane equation parameters to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and obtaining a fitted plane based on the four intersection points;
step 4, when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, splicing in a new plane based on the two sets of point cloud data.
The extension lines of the ranging directions of the five laser ranging modules in step 1 intersect at one point, and this point is the origin of the coordinate system established by the device.
Further, in the step 2, among the three-dimensional point cloud data of the five collected plane areas, the three-dimensional point cloud data of the previous frame and the three-dimensional point cloud data of the next frame have a common view.
Further, the step 3 comprises:
step 31, establishing an expression of the plane equation, namely ax + by + cz + d = 0;
step 32, substituting the three-dimensional point cloud data of the five planar areas into a ransac algorithm to fit 5 sets of plane parameters respectively;
step 33, substituting each set of plane parameters into the plane equation to obtain the three-dimensional coordinates of the 4 intersection points of the fitted plane;
step 34, obtaining the area and side lengths of the fitted plane based on these three-dimensional coordinates.
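As an illustration of step 34 only (not part of the claimed subject matter), the following minimal Python sketch computes the side lengths and area of a fitted plane from the four corner coordinates; the corner ordering and the split of the quadrilateral into two triangles are assumptions made for the example.

```python
import numpy as np

def quad_metrics(corners):
    """Side lengths and area of a (roughly planar) quadrilateral.

    corners: (4, 3) array of 3D corner coordinates, assumed ordered around
    the boundary (an illustrative assumption; the patent does not fix an order).
    """
    corners = np.asarray(corners, dtype=float)
    # Side lengths: distance between consecutive corners, wrapping around.
    sides = np.linalg.norm(np.roll(corners, -1, axis=0) - corners, axis=1)
    # Area: split the quadrilateral into two triangles and sum
    # half the magnitudes of their cross products.
    a, b, c, d = corners
    area = 0.5 * (np.linalg.norm(np.cross(b - a, c - a)) +
                  np.linalg.norm(np.cross(c - a, d - a)))
    return sides, area

# Example: a 4 m x 3 m wall section.
sides, area = quad_metrics([[0, 0, 0], [4, 0, 0], [4, 3, 0], [0, 3, 0]])
print(sides, area)  # [4. 3. 4. 3.] 12.0
```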
Further, the step 4 comprises:
step 41, while the five laser ranging modules are being moved, calculating the Euclidean distance between the three-dimensional coordinates of each intersection point of the next frame and of the previous frame, finding the correspondence between the intersection points of the two frames according to the minimum-Euclidean-distance principle, and, if the change in the Euclidean distance is larger than 10%, determining that a new plane has appeared;
when the side lengths of the fitted planes of the next frame and the previous frame change, a new plane has likewise appeared;
step 42, when a new plane appears, unifying all plane equations of the next frame into the coordinate system of the corresponding intersection point and splicing them with the plane equations of the previous frame;
step 43, solving all the plane equations of step 42 simultaneously to obtain the three-dimensional coordinates of the intersection points of the spliced new planes;
step 44, repeating steps 41 to 43, continuously splicing in new planes, and completing the three-dimensional reconstruction of the planar scene.
Compared with the prior art, the invention has the following beneficial effects: the invention completes three-dimensional reconstruction with a ransac algorithm, which reduces the reliance on dense, consistent point cloud data and lowers the manufacturing cost of the equipment; the precision of the laser ranging modules reaches the millimeter level, which improves the measurement precision of the device, so that the device is more economical and the precision of the three-dimensional reconstruction of the planar scene is improved.
The control module of the invention can directly process data, is convenient to carry, and increases the flexibility and convenience of the device.
Drawings
FIG. 1 is a schematic structural diagram of the present invention.
FIG. 2 is a flow chart of the three-dimensional reconstruction of the present invention.
Fig. 3 is a top view of a room for three-dimensional reconstruction of a planar scene.
Fig. 4 is a schematic front view of a room interior observed for three-dimensional reconstruction of a planar scene.
Fig. 5 is a schematic view of a right view observed inside a room for three-dimensional reconstruction of a planar scene.
In the figures: 1, base; 2, control module; 3, first laser ranging module; 4, second laser ranging module; 5, handheld structure; 6, third laser ranging module; 7, fourth laser ranging module; 8, fifth laser ranging module.
Description of the preferred embodiment
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "front", "upper", "lower", "left", "right", etc. indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present invention and simplifying the description; they do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should also be noted that, unless explicitly specified or limited otherwise, the term "connected" is to be interpreted broadly: it may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; and it may be a direct connection, an indirect connection through an intermediary, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
As shown in fig. 1, the invention provides a planar scene three-dimensional reconstruction device, which includes a base 1, a control module 2 arranged on the base 1, a first laser ranging module 3 arranged on the base, a second laser ranging module 4 arranged on the base, a third laser ranging module 6 arranged on the base, a fourth laser ranging module 7 arranged on the base, a fifth laser ranging module 8 arranged on the base, and a handheld structure 5 arranged on the base. The control module is connected to the first laser ranging module 3, the second laser ranging module 4, the third laser ranging module 6, the fourth laser ranging module 7 and the fifth laser ranging module 8 respectively. The first laser ranging module 3 is fixed to the left side of the base, the second laser ranging module 4 is fixed to the lower side of the base, the third laser ranging module 6 is fixed to the right side of the base, the fourth laser ranging module 7 is fixed to the upper side of the base, and the fifth laser ranging module 8 is fixed to the front side of the base; the extension lines of the ranging directions of the first laser ranging module 3, the second laser ranging module 4, the third laser ranging module 6, the fourth laser ranging module 7 and the fifth laser ranging module 8 intersect at one point.
In some embodiments, the ranging precision of the first laser ranging module 3, the second laser ranging module 4, the third laser ranging module 6, the fourth laser ranging module 7 and the fifth laser ranging module 8 is at the millimeter level. This improves the measurement precision of the planar scene three-dimensional reconstruction device and reduces the error of the three-dimensional reconstruction.
In some embodiments, the planar scene three-dimensional reconstruction device is operated by the following control method:
the visible light spots of the first laser ranging module 3, the second laser ranging module 4, the third laser ranging module 6, the fourth laser ranging module 7 and the fifth laser ranging module 8 are respectively projected onto five planar areas, namely the left, lower, right, upper and front planar areas; the planar scene three-dimensional reconstruction device is moved while three-dimensional point cloud data of the five planar areas are continuously collected;
the control module 2 fits plane equation parameters to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and obtains a fitted plane based on the four intersection points; when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, a new plane is spliced in based on the two sets of point cloud data.
The control module 2 is directly connected to the first laser ranging module 3, the second laser ranging module 4, the third laser ranging module 6, the fourth laser ranging module 7 and the fifth laser ranging module 8, so that it can conveniently receive the three-dimensional point cloud data uploaded by the five laser ranging modules and process those data.
As shown in fig. 2, a three-dimensional reconstruction method for a planar scene includes the following steps:
step 1, setting five laser ranging modules, wherein extension lines of ranging directions of the five laser ranging modules are intersected at one point;
step 2, projecting the visible light spots of the five laser ranging modules onto five planar areas, namely the left, lower, right, upper and front planar areas, moving the five laser ranging modules together, and continuously collecting three-dimensional point cloud data of the five planar areas;
step 3, fitting plane equation parameters to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and obtaining a fitted plane based on the four intersection points;
step 4, when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, splicing in a new plane based on the two sets of point cloud data.
The planar scene three-dimensional reconstruction method can be applied to the planar scene three-dimensional reconstruction device described above. The device is moved continuously while laser ranging modules with millimeter-level ranging precision measure towards the five areas on the left, below, on the right, above and in front; nine ranging positions are taken in each area, the three-dimensional point cloud data of each plane are collected and uploaded to the control module, and the control module fits a plane equation with a ransac algorithm to obtain the four intersection points of the fitted plane and to determine its area and side lengths. The ransac algorithm produces high-precision parameters, which improves the precision of the three-dimensional modeling of the planar scene.
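The paragraph above treats each laser ranging module as returning a distance along a fixed direction whose extension line passes through the device origin. As a hedged illustration of how such a reading could map to a point of the three-dimensional point cloud, the sketch below multiplies each distance by a unit direction vector in the device coordinate system; the direction vectors and readings are purely illustrative, since the patent does not give the calibration of the modules.

```python
import numpy as np

# Illustrative unit ranging directions in the device frame
# (left, down, right, up, front); a real device would use calibrated values.
DIRECTIONS = {
    "left":  np.array([-1.0, 0.0, 0.0]),
    "down":  np.array([0.0, -1.0, 0.0]),
    "right": np.array([1.0, 0.0, 0.0]),
    "up":    np.array([0.0, 1.0, 0.0]),
    "front": np.array([0.0, 0.0, 1.0]),
}

def ranges_to_points(ranges_m):
    """Convert the five module readings (in meters) to 3D points expressed in
    the device coordinate system, whose origin is the common intersection
    point of the ranging directions."""
    return {name: ranges_m[name] * direction
            for name, direction in DIRECTIONS.items()}

points = ranges_to_points({"left": 2.1, "down": 1.4, "right": 1.9,
                           "up": 1.2, "front": 3.5})
print(points["front"])  # [0.  0.  3.5]
```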
The extension lines of the ranging directions of the five laser ranging modules in step 1 intersect at one point, and this point is the origin of the coordinate system established by the planar scene three-dimensional reconstruction device.
In some embodiments, in step 2, of the acquired three-dimensional point cloud data of the five plane areas, the three-dimensional point cloud data of the previous frame and the three-dimensional point cloud data of the next frame have a common view.
The purpose of ensuring that the previous and subsequent frames have a common view is to add map point information to optimize the map.
In some embodiments, the step 3 includes:
step 31, establishing an expression of the plane equation, namely ax + by + cz + d = 0;
step 32, substituting the three-dimensional point cloud data of the five planar areas into a ransac algorithm to fit 5 sets of plane parameters respectively;
step 33, substituting each set of plane parameters into the plane equation to obtain the three-dimensional coordinates of the 4 intersection points of the fitted plane;
step 34, obtaining the area and side lengths of the fitted plane based on these three-dimensional coordinates.
The control module processes the three-dimensional point cloud data transmitted by the laser ranging modules directly with the ransac algorithm, without needing to be connected to a computer for the calculation, which makes three-dimensional modeling of the planar scene convenient. At the same time, the ransac algorithm starts from the smallest feasible set of initial data and enlarges it with a consistent data set, following the idea of finding a model that fits the data; this reduces the fitting error, allows the data of the facing plane to be obtained more accurately, and improves the accuracy of the three-dimensional reconstruction of the planar scene.
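As a hedged illustration of the fitting described above, the sketch below fits the parameters (a, b, c, d) of ax + by + cz + d = 0 to one area's point cloud with a basic RANSAC loop and intersects three fitted planes to recover a corner coordinate (in the spirit of steps 32, 33 and 43). The iteration count, inlier threshold and least-squares refinement are assumptions of the example, not the patent's exact procedure.

```python
import numpy as np

def fit_plane_ransac(points, n_iter=200, inlier_tol=0.005, rng=None):
    """Fit plane parameters (a, b, c, d), with ||(a, b, c)|| = 1, to a small
    point cloud using a basic RANSAC loop (illustrative only).
    inlier_tol is in the same unit as the points."""
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:            # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        inliers = np.abs(pts @ normal + d) < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if best_inliers is None:        # fallback for pathological input
        best_inliers = np.ones(len(pts), dtype=bool)
    # Refine on the inliers with a least-squares (SVD) fit.
    inlier_pts = pts[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    _, _, vh = np.linalg.svd(inlier_pts - centroid)
    normal = vh[-1]
    return np.append(normal, -normal @ centroid)    # (a, b, c, d)

def plane_intersection(p1, p2, p3):
    """Corner point shared by three planes: solve the 3x3 linear system
    formed by their (a, b, c) rows and -d right-hand sides."""
    A = np.array([p1[:3], p2[:3], p3[:3]])
    b = -np.array([p1[3], p2[3], p3[3]])
    return np.linalg.solve(A, b)

# With five fitted planes, a corner of the front plane would follow from, e.g.:
# corner = plane_intersection(front_plane, left_plane, upper_plane)
# (front_plane, left_plane and upper_plane are hypothetical names for the
#  outputs of fit_plane_ransac on the corresponding areas.)
```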
In some embodiments, the step 4 includes:
step 41, while the five laser ranging modules are being moved, calculating the Euclidean distance between the three-dimensional coordinates of each intersection point of the next frame and of the previous frame, finding the correspondence between the intersection points of the two frames according to the minimum-Euclidean-distance principle, and, if the change in the Euclidean distance is larger than 10%, determining that a new plane has appeared;
when the side lengths of the fitted planes of the next frame and the previous frame change, a new plane has likewise appeared;
step 42, when a new plane appears, unifying all plane equations of the next frame into the coordinate system of the corresponding intersection point and splicing them with the plane equations of the previous frame;
step 43, solving all the plane equations of step 42 simultaneously to obtain the three-dimensional coordinates of the intersection points of the spliced new planes;
step 44, repeating steps 41 to 43, continuously splicing in new planes, and completing the three-dimensional reconstruction of the planar scene.
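As a hedged illustration of step 41, the sketch below matches the intersection points of the next frame to those of the previous frame by the minimum-Euclidean-distance principle and flags a possible new plane when the matched displacement exceeds 10%. The patent does not state what the 10% is relative to, so taking it relative to the previous corner's distance from the device origin is an assumption of the example.

```python
import numpy as np

def match_corners(prev_corners, next_corners, rel_change=0.10):
    """Match each next-frame intersection point to the nearest previous-frame
    point (minimum-Euclidean-distance principle) and flag a possible new plane
    when the matched displacement exceeds rel_change of the previous point's
    distance from the device origin (the reference quantity for the 10% test
    is an assumption made for this example)."""
    prev_c = np.asarray(prev_corners, dtype=float)
    next_c = np.asarray(next_corners, dtype=float)
    # Pairwise Euclidean distances between next-frame and previous-frame corners.
    dists = np.linalg.norm(next_c[:, None, :] - prev_c[None, :, :], axis=2)
    matches = dists.argmin(axis=1)                       # nearest previous corner
    displacement = dists[np.arange(len(next_c)), matches]
    reference = np.linalg.norm(prev_c[matches], axis=1)  # range of matched corner
    new_plane = bool(np.any(displacement > rel_change * reference))
    return matches, new_plane

# Illustrative corners (meters): the last next-frame corner has moved noticeably.
prev = [[2.0, 1.5, 3.0], [-2.0, 1.5, 3.0], [-2.0, -1.5, 3.0], [2.0, -1.5, 3.0]]
nxt  = [[2.0, 1.5, 3.0], [-2.0, 1.5, 3.0], [-2.0, -1.5, 3.0], [2.9, -1.5, 3.0]]
print(match_corners(prev, nxt))   # (array([0, 1, 2, 3]), True)
```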
The top view of the room to be three-dimensionally reconstructed is shown in fig. 3, the front view in fig. 4 and the right view in fig. 5. When the front face and the right face of the room are spliced, the front face in fig. 4 is the previous frame of step 41 and the right face in fig. 5 is the next frame of step 41. Whether the next frame contains a new plane is determined according to step 41, and the correspondence between the intersection points of the previous frame and the next frame is found: the upper-right corner of the front face in fig. 4 and the upper-left corner of the right face in fig. 5 are corresponding common points, as are the lower-right corner of the front face and the lower-left corner of the right face. Based on these two common points, the planar scene three-dimensional reconstruction device splices the planes of the previous frame and the next frame together, so that all the planes of the whole space are obtained progressively. Whenever a new plane appears, the device splices it in based on the already spliced intersection points. Splicing is carried out between all the room planes, for example between the top surface and the front face and between the top surface and the right face, until the three-dimensional reconstruction of the planar scene is complete.
Finally, it should be noted that the above embodiments are only preferred embodiments used to illustrate the technical solutions of the present invention, and are not intended to limit those solutions or the patent scope of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, without the essence of the corresponding technical solutions departing from the scope of the technical solutions of the embodiments of the present invention. Solutions that do not substantially depart from the spirit and concept of the invention in addressing the technical problems it solves remain consistent with the invention and fall within its scope of protection. In addition, applications of the technical solution of the invention, whether direct or indirect, to other related technical fields are likewise included within the patent protection scope of the invention.

Claims (7)

1. A planar scene three-dimensional reconstruction device, characterized by comprising a base (1), a control module (2) arranged on the base (1), a first laser ranging module (3) arranged on the base (1), a second laser ranging module (4) arranged on the base (1), a third laser ranging module (6) arranged on the base (1), a fourth laser ranging module (7) arranged on the base (1), a fifth laser ranging module (8) arranged on the base (1), and a handheld structure (5) arranged on the base (1); the control module (2) is connected to the first laser ranging module (3), the second laser ranging module (4), the third laser ranging module (6), the fourth laser ranging module (7) and the fifth laser ranging module (8) respectively; the first laser ranging module (3) is fixed to the left side of the base (1), the second laser ranging module (4) is fixed to the lower side of the base (1), the third laser ranging module (6) is fixed to the right side of the base (1), the fourth laser ranging module (7) is fixed to the upper side of the base (1), and the fifth laser ranging module (8) is fixed to the front side of the base (1); the extension lines of the ranging directions of the first laser ranging module (3), the second laser ranging module (4), the third laser ranging module (6), the fourth laser ranging module (7) and the fifth laser ranging module (8) intersect at one point.
2. The planar scene three-dimensional reconstruction device according to claim 1, wherein the first laser ranging module (3), the second laser ranging module (4), the third laser ranging module (6), the fourth laser ranging module (7) and the fifth laser ranging module (8) have a millimeter-scale ranging precision.
3. The planar scene three-dimensional reconstruction device as claimed in claim 1, characterized in that the planar scene three-dimensional reconstruction device adopts the following control method:
the visible light spots of the first laser ranging module (3), the second laser ranging module (4), the third laser ranging module (6), the fourth laser ranging module (7) and the fifth laser ranging module (8) are respectively projected onto five planar areas, namely the left, lower, right, upper and front planar areas; the planar scene three-dimensional reconstruction device is moved while three-dimensional point cloud data of the five planar areas are continuously collected;
the control module fits plane equation parameters to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and obtains a fitted plane based on the four intersection points; when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, a new plane is spliced in based on the two sets of point cloud data.
4. A planar scene three-dimensional reconstruction method is characterized by comprising the following steps:
step 1, setting five laser ranging modules, wherein extension lines of ranging directions of the five laser ranging modules are intersected at one point;
step 2, projecting the visible light spots of the five laser ranging modules onto five planar areas, namely the left, lower, right, upper and front planar areas, moving the five laser ranging modules together, and continuously collecting three-dimensional point cloud data of the five planar areas;
step 3, fitting plane equation parameters to the three-dimensional point cloud data to obtain the three-dimensional coordinates of four intersection points, and obtaining a fitted plane based on the four intersection points;
step 4, when the fitted plane changes between two consecutively obtained sets of three-dimensional point cloud data, splicing in a new plane based on the two sets of point cloud data.
5. The method as claimed in claim 4, wherein in the step 2, the three-dimensional point cloud data of the previous frame and the three-dimensional point cloud data of the next frame have a common view in the collected three-dimensional point cloud data of the five plane areas.
6. The method for reconstructing a planar scene in three dimensions as claimed in claim 4, wherein said step 3 comprises:
step 31, establishing an expression of the plane equation, namely ax + by + cz + d = 0;
step 32, substituting the three-dimensional point cloud data of the five planar areas into a ransac algorithm to fit 5 sets of plane parameters respectively;
step 33, substituting each set of plane parameters into the plane equation to obtain the three-dimensional coordinates of the 4 intersection points of the fitted plane;
step 34, obtaining the area and side lengths of the fitted plane based on these three-dimensional coordinates.
7. The method for reconstructing a planar scene in three dimensions as claimed in claim 5, wherein said step 4 comprises:
step 41, while the five laser ranging modules are being moved, calculating the Euclidean distance between the three-dimensional coordinates of each intersection point of the next frame and of the previous frame, finding the correspondence between the intersection points of the two frames according to the minimum-Euclidean-distance principle, and, if the change in the Euclidean distance is larger than 10%, determining that a new plane has appeared;
when the side lengths of the fitted planes of the next frame and the previous frame change, a new plane has likewise appeared;
step 42, when a new plane appears, unifying all plane equations of the next frame into the coordinate system of the corresponding intersection point and splicing them with the plane equations of the previous frame;
step 43, solving all the plane equations of step 42 simultaneously to obtain the three-dimensional coordinates of the intersection points of the spliced new planes;
step 44, repeating steps 41 to 43, continuously splicing in new planes, and completing the three-dimensional reconstruction of the planar scene.
Application CN202310282859.5A, priority date 2023-03-22, filing date 2023-03-22: Three-dimensional reconstruction device and method for plane scene. Granted as CN115984512B (Active).

Priority Applications (1)

CN202310282859.5A (granted as CN115984512B), priority and filing date 2023-03-22: Three-dimensional reconstruction device and method for plane scene

Publications (2)

CN115984512A: published 2023-04-18
CN115984512B: published 2023-06-13

Family

ID=85963504




Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102867304A (en) * 2012-09-04 2013-01-09 南京航空航天大学 Method for establishing relation between scene stereoscopic depth and vision difference in binocular stereoscopic vision system
CN106652018A (en) * 2016-09-29 2017-05-10 北京京东尚科信息技术有限公司 Object three-dimensional reconstruction method, device and system
CN106441151A (en) * 2016-09-30 2017-02-22 中国科学院光电技术研究所 Measuring system for three-dimensional target Euclidean space reconstruction based on vision and active optical fusion
US20190392598A1 (en) * 2017-07-17 2019-12-26 Shining 3D Tech Co., Ltd. Three-Dimensional Reconstruction Method and Device Based on Monocular Three-dimensional Scanning System
CN108732587A (en) * 2018-06-07 2018-11-02 安徽理工大学 It is a kind of that power method is determined based on scanning element cloud ranging, angle measurement
CN109708618A (en) * 2018-12-29 2019-05-03 成都天佑智隧科技有限公司 A kind of combined type determines the scaling method of photographic scale after three-dimensional reconstruction
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
WO2021208442A1 (en) * 2020-04-14 2021-10-21 广东博智林机器人有限公司 Three-dimensional scene reconstruction system and method, device, and storage medium
CN112378349A (en) * 2020-09-28 2021-02-19 湖南海森格诺信息技术有限公司 Sagger flatness detection device based on binocular structured light and detection method thereof
CN113406604A (en) * 2021-06-30 2021-09-17 山东新一代信息产业技术研究院有限公司 Device and method for calibrating positions of laser radar and camera
CN114782636A (en) * 2022-05-17 2022-07-22 江苏集萃深度感知技术研究所有限公司 Three-dimensional reconstruction method, device and system
CN115657061A (en) * 2022-12-13 2023-01-31 成都量芯集成科技有限公司 Indoor wall surface three-dimensional scanning device and method
CN115598626A (en) * 2022-12-14 2023-01-13 成都量芯集成科技有限公司(Cn) Deviation calibration device and method for laser range finder

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
IOANNIS BRILAKIS 等: "Progressive 3D reconstruction of infrastructure with videogrammetry", AUTOMATION IN CONSTRUCTION, vol. 20, pages 884 - 895, XP028290277, DOI: 10.1016/j.autcon.2011.03.005 *
刘钦 等: "基于一维激光测距仪和云台的场景三维重建", 工具技术, vol. 45, no. 11, pages 76 - 79 *
胡芳侨: "基于深度学习的斜拉桥结构感知三维重建方法研究", 中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑, no. 03, pages 034 - 1045 *
闫阳阳 等: "三维激光点云联合无人机影像的三维场景重建研究", 测绘通报, no. 01, pages 84 - 87 *
黄凌潇: "基于三维激光扫描技术的点云滤波与平面拟合算法研究", 中国优秀硕士学位论文全文数据库 信息科技辑, no. 04, pages 135 - 92 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116824067A (en) * 2023-08-24 2023-09-29 成都量芯集成科技有限公司 Indoor three-dimensional reconstruction method and device thereof
CN116824067B (en) * 2023-08-24 2023-11-24 成都量芯集成科技有限公司 Indoor three-dimensional reconstruction method and device thereof
CN117115362A (en) * 2023-10-20 2023-11-24 成都量芯集成科技有限公司 Three-dimensional reconstruction method for indoor structured scene
CN117115362B (en) * 2023-10-20 2024-04-26 成都量芯集成科技有限公司 Three-dimensional reconstruction method for indoor structured scene

Also Published As

Publication number Publication date
CN115984512B (en) 2023-06-13

Similar Documents

Publication Publication Date Title
CN115984512A (en) Three-dimensional reconstruction device and method for plane scene
CN102425991B (en) Automation storage yard laser measurement device and application method thereof
CN105091911B (en) A kind of POS system dynamic locating accuracy detecting system and method
CN102184563B (en) Three-dimensional scanning method, three-dimensional scanning system and three-dimensional scanning device used for plant organ form
CN103110429B (en) The optical calibration method of ultrasound probe
CN108090959A (en) Indoor and outdoor one modeling method and device
CN107764270A (en) A kind of laser scan type indoor map generation and updating device and method
CN112945137B (en) Storage ore heap scanning method based on single-line laser radar and range finder equipment
CN114998499A (en) Binocular three-dimensional reconstruction method and system based on line laser galvanometer scanning
CN101504275A (en) Hand-hold line laser three-dimensional measuring system based on spacing wireless location
CN109751992B (en) Indoor three-dimensional space-oriented positioning correction method, positioning method and equipment thereof
CN103198481B (en) A kind of camera marking method
CN102506825B (en) Electric transmission and transformation equipment external insulated antifouling parameter photographic measurement method
CN112258590A (en) Laser-based depth camera external parameter calibration method, device and storage medium thereof
CN113681559B (en) Line laser scanning robot hand-eye calibration method based on standard cylinder
CN111640156A (en) Three-dimensional reconstruction method, equipment and storage equipment for outdoor weak texture target
CN113781576B (en) Binocular vision detection system, method and device for adjusting pose with multiple degrees of freedom in real time
CN114543787B (en) Millimeter-scale indoor map positioning method based on fringe projection profilometry
CN111308495B (en) Method for generating indoor house type 3D data through radar ranging
CN104180756A (en) Method for measuring relative displacement of butt-joint pieces through laser displacement sensors
CN108151717A (en) A kind of Full automatic quadrat method based on BIM systems
KR20180126475A (en) Three Dimensional Surveying System and Survey Method in Multimodal Mode
CN110728745A (en) Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model
CN112016157B (en) Method for constructing digital whole-plane coordinates of high-aspect-ratio airplane
CN112945145A (en) Non-contact corner measuring method based on multipoint ranging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant