CN109816774A - UAV-based three-dimensional reconstruction system and three-dimensional reconstruction method - Google Patents
- Publication number: CN109816774A (application CN201811651070.8A)
- Authority: CN (China)
- Legal status: Granted
Landscapes
- Length Measuring Devices by Optical Means
- Optical Radar Systems and Details Thereof
Abstract
The invention discloses a UAV-based three-dimensional reconstruction system and three-dimensional reconstruction method. The method comprises: obtaining joint calibration parameters; registering the laser data; projecting the laser data onto the vision data by the joint calibration method, so that part of the image obtains true three-dimensional coordinates, realizing the fusion of the data; and calibrating to obtain the final, fully registered laser point cloud data, realizing the reconstruction of the three-dimensional scene. The invention acquires data with a UAV carrying a three-dimensional LiDAR and a visible-light gimbal camera, guaranteeing the accuracy of the acquired data, i.e. a higher reconstruction accuracy. Compared with ordinary multi-view stereo cameras it is more practical, and it avoids the phantom scenes that can arise with a single camera.
Description
Technical field
The invention belongs to the field of three-dimensional reconstruction, and in particular relates to a UAV-based three-dimensional reconstruction system and three-dimensional reconstruction method.
Background art
One image-acquisition method for three-dimensional reconstruction controls at least two spatially separated light sources so that the brightness of each source varies periodically, and acquires the images for reconstruction with at least three cameras at at least three positions. Image-based three-dimensional reconstruction is the process of automatically computing and matching, by computer, two or more two-dimensional images taken of an object or scene, recovering the two-dimensional geometric information and depth information of the object or scene, and building a three-dimensional model. This approach has shortcomings. First, when a truly perceived image of the scene to be reconstructed cannot be obtained — for example when the object or scene does not exist, is imaginary, or is still at the design and planning stage and changing constantly — image-based modeling cannot be used. Second, since the objects in the scene all become two-dimensional objects in images, it is difficult for the user to interact with these two-dimensional graphical objects to obtain the required information. In addition, obtaining truly perceived images places certain demands on the cameras and photographic equipment, and the large number of image files requires considerable storage space.
In the photovoltaic industry, a pipeline is often found damaged but not immediately rebuilt on a large scale; instead the scene at that time is first recorded so that the damage can be located during later rectification. Three-dimensional reconstruction is widely applied in exactly such scenarios: a three-dimensional model of the whole environment is reconstructed at regular intervals, and since the environment changes each time, differences can be found by comparison during later maintenance. Traditional three-dimensional reconstruction scans the environment to be rebuilt with a hand-held laser, which is not highly reliable; moreover, the point cloud obtained by a LiDAR alone yields only a model skeleton and cannot truly restore the real scene, so a visual sensor must also be used to obtain shape, texture, color and other information.
Summary of the invention
To solve the above problems, the present invention provides a three-dimensional reconstruction method based on three-dimensional laser and camera, which fuses laser data with vision data and avoids the phantom scenes that can arise with a single camera.
The technical solution of the present invention is a three-dimensional reconstruction method based on three-dimensional laser and camera, comprising the following steps:
(1) Obtain joint calibration parameters: extract corner features from the laser data and the vision data of the same timestamp respectively, and compute the joint calibration parameters from the corner features;
(2) Register the laser data: continuously register the point cloud data of different moments, i.e. register the newly added point cloud data into the historical point cloud data, and calibrate the newly added point cloud data;
(3) Perform feature extraction on the vision data;
(4) Using the joint calibration method of step (1), project the laser data onto the vision data so that part of the image obtains true three-dimensional coordinates, realizing the fusion of the data;
(5) Calibrate to obtain the final, fully registered laser point cloud data, realizing the reconstruction of the three-dimensional scene.
Preferably, in step (1) a rotation matrix R[α_x, α_y, α_z] and a translation vector T[t_x, t_y, t_z] are found so that the mutual conversion holds, the laser data being converted into the camera coordinate system as shown in formula (a):
P_A = R·P_B + T   (a)
where P_B = [x_B, y_B, z_B]^T is the coordinate of the laser data in the laser coordinate system, and P_A = [x_A, y_A, z_A]^T is the coordinate of the laser data in the camera coordinate system.
Preferably, in step (2) the newly added point cloud data is calibrated using the positioning and navigation data obtained by RTK, and is further calibrated with the iterative closest point (ICP) algorithm, which comprises:
in each iteration, for every laser point of one frame of point cloud, searching the other frame of point cloud for its nearest point, and computing the rotation matrix from the corresponding point pairs so that the total distance between all point pairs is minimized, as shown in formula (b):
min_{R,T} Σ_i ‖R·q_i + T − p_i‖²   (b)
where P and Q are the two frames of laser point cloud data and (p_i, q_i), with p_i ∈ P and q_i ∈ Q, are the matched nearest-neighbor point pairs.
Preferably, in step (3) corner points are used as tracking points, the corner features being extracted by moving a window over the image and computing the gray-level change within the window: if the gray level changes little, there is no corner in this region; if the gray level changes greatly when the window moves in one direction but hardly changes when it moves in other directions, the region may contain a straight edge; if the gray level changes greatly when the window moves in every direction, the region is considered to contain a corner.
Preferably, in step (4), when fusing the laser data and the vision data, the laser data is first rotated and translated appropriately and then re-projected onto the corresponding visual image.
The present invention also provides a UAV-based three-dimensional reconstruction system, comprising a UAV body, a power module and a camera for acquiring vision data, characterized by further comprising a LiDAR for acquiring laser data; the UAV body includes a load plate for carrying and fixing the camera and the LiDAR, and the camera and the LiDAR are mounted on the two sides of the load plate respectively.
In the present invention the whole system requires power for the UAV's motors, the flight controller and related kits, the image transmission link, the LiDAR and the camera. All power comes from the UAV's flight batteries: two 6S batteries in parallel, each able to supply 24 V. Apart from the image transmission link, which requires a 12 V supply, everything can be powered directly by the batteries, so a step-down (buck) module is tapped off the power section to provide 12 V for the image transmission link.
Preferably, the UAV body further includes arms, on which fixing bolts for mounting the RTK antennas and GPS are provided.
Preferably, the distance between the two RTK antennas is greater than 30 centimeters, and the arrow of the GPS points in the heading direction of the UAV body.
Preferably, the system further includes a remote control server for remotely controlling the UAV body, the LiDAR and the camera.
A software environment is built in the present invention: the whole system is based on Linux with ROS added, used in cooperation with a Windows system.
To carry out development and operation commands on the UAV's onboard computer, XShell is installed on the user computer (Windows), which can remotely log in to the server, i.e. remotely log in to the onboard computer to carry out the corresponding development work.
After logging in to the UAV's onboard computer through XShell, its IP is modified immediately so that it stays on the same network segment as the LiDAR's IP, guaranteeing that the LiDAR works normally.
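Putting the onboard computer onto the LiDAR's network segment might look like the following; the interface name, subnet and the LiDAR's address are illustrative assumptions, since the patent does not state them:

```shell
# Assumed: LiDAR at 192.168.1.200/24, onboard NIC is eth0 (both illustrative)
sudo ip addr add 192.168.1.102/24 dev eth0   # put the onboard computer on the LiDAR's segment
ip addr show dev eth0                        # verify the address was applied
ping -c 3 192.168.1.200                      # confirm the LiDAR is reachable
```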
An MQTT client (mosquitto) is installed on the user computer (Linux) to issue flight commands to the onboard computer and to obtain UAV status information. (Remote communication is realized through the onboard computer's 4G link; the onboard computer runs an MQTT endpoint (paho-mqtt). The user computer can issue flight commands to the onboard computer via MQTT, triggering the onboard computer to execute tasks.)
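The command flow just described could be exercised from the command line with the mosquitto tools; the broker address, topic names and message payload below are illustrative assumptions, not values from the patent:

```shell
# On the user computer: listen for telemetry published by the onboard computer
mosquitto_sub -h 192.168.8.1 -t uav/telemetry -C 1 &
# Publish a flight command; the onboard paho-mqtt endpoint would act on it
mosquitto_pub -h 192.168.8.1 -t uav/cmd -m '{"action": "takeoff", "alt": 10}'
```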
To control the UAV's flight from the PC, a ground station is installed on the user computer (Windows), which shows the aircraft's flight state and position in real time, displays the video shot by the small gimbal, and allows the surroundings to be observed in real time.
For the acquisition and preservation of the laser point cloud, a packet sniffer captures the laser data packets in real time while the LiDAR is working. (A sniffer — wireshark or tcpdump — is installed on the onboard computer; logging in to the onboard computer remotely via XShell, a boot self-start script is created that captures the laser packets and saves them temporarily.)
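The boot self-start capture script is not given in the patent; a minimal sketch is below. The interface, UDP port and output path are assumptions (RoboSense sensors, which the mention of RSview suggests, commonly stream point cloud packets over UDP):

```shell
#!/bin/sh
# Illustrative boot-time capture script: save LiDAR UDP traffic in rotating 100 MB pcap files.
IFACE=eth0                 # assumed NIC facing the LiDAR
PORT=6699                  # assumed LiDAR data port
OUT=/tmp/lidar             # temporary save location, as in the text
mkdir -p "$OUT"
exec tcpdump -i "$IFACE" udp port "$PORT" -C 100 -w "$OUT/scan.pcap"
```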
RSview is installed on the user computer (Windows), with which the captured laser data can be saved and the point cloud viewed; alternatively, the point cloud is viewed and saved through rviz on the user computer (ROS).
Compared with the prior art, the beneficial effects of the present invention are:
The invention acquires data with a UAV carrying a three-dimensional LiDAR and a visible-light gimbal camera, guaranteeing the accuracy of the acquired data, i.e. a higher reconstruction accuracy. Compared with ordinary multi-view stereo cameras it is more practical, and it avoids the phantom scenes that can arise with a single camera.
Brief description of the drawings
Fig. 1 is the connection block diagram of the hardware system of the present invention.
Fig. 2 is a structural schematic diagram of the LiDAR and camera mounted on the UAV body.
Fig. 3 is the flow diagram of the three-dimensional reconstruction method of the present invention.
Fig. 4 is a schematic diagram of the mounting of the UAV's propellers.
Fig. 5 is a structural schematic diagram of the interior of the UAV body.
Fig. 6 is a structural schematic diagram of the UAV body.
Specific embodiment
Embodiment 1
This embodiment includes a UAV body, a flight control system, a data link system, a power module, etc. The invention designs a load plate that effectively carries the LiDAR and the visible-light gimbal camera, obtaining laser data and video information.
The hardware system block diagram is shown in Fig. 1 and contains the various components of the whole UAV system.
The main improvement of the present invention lies in the design and installation of the load plate for the LiDAR and the gimbal camera. A carbon fiber board was designed and made for hanging the laser and the small gimbal pod. The LiDAR is mounted lying down, ensuring that the UAV's landing gear does not occlude the laser. The LiDAR weighs 840 g, so to balance the whole the LiDAR is mounted on one side and the small gimbal pod on the other, together with the aerial ends of the image transmission link and the laser's power adapter.
Aluminum alloy fixing bolts for the RTK antennas and GPS are made on the arms, ensuring that the distance between the two RTK antennas is greater than 30 centimeters and that the arrows of the two GPS units point in the UAV's heading direction, ensuring stable navigation. The installation of the LiDAR and camera is shown in Fig. 2 (in the figure, blue marks the two RTK antennas; dark red marks the GPS, one GPS being omitted; purple represents the laser and its power adapter, the side of the cylinder being the laser's working region; yellow represents the small gimbal pod; the aerial end of the image transmission link is hidden by the small gimbal and is not detailed).
The power section of the UAV comprises the motors and the electronic speed controllers (ESCs); the UAV chosen in the present invention is a hexarotor, so the propellers are installed according to the arrows in Fig. 4.
As shown in Fig. 5 and Fig. 6, the fuselage's original carbon fiber board is removed in the present invention. In the space inside, the FCU main board of the flight controller (face up, its arrow aligned with the heading), the PMU and the RTK's NCU are mounted, and the remote-controller receiver is also mounted in an available gap. With the innermost layer fully used, the original carbon fiber board is replaced, and the onboard computer (equivalent to an industrial PC, usable for development) and the antenna end of the data link are mounted on this carbon fiber board.
The debugging procedure of the present invention is as follows: the tuning interface is connected to a computer by USB, and parameters are tuned with PC tuning software. First the multirotor type is set to hexarotor; the remote controller is calibrated and, according to personal habit, set to Mode 2 ("American hand"), Mode 1 ("Japanese hand") or Mode 3 ("Chinese hand") — here Mode 2 is used; the remote-control receiver is of the S-Bus type. ESC calibration, GPS calibration and horizontal calibration are then completed one by one, all according to the user manual.
This embodiment also relates to a three-dimensional reconstruction algorithm, the overall flow of which is shown in Fig. 3. It specifically includes the following steps:
1. The algorithm realizes the joint calibration of the three-dimensional laser and the gimbal camera, obtaining the joint calibration parameters.
Corner features are extracted from the laser data and the vision data of the same timestamp respectively, and the joint calibration parameters are computed from the corner features. The purpose is that the laser data can be transplanted into the visual coordinate system and, conversely, the vision data can migrate into the laser coordinate system, which facilitates the later fusion of the data.
Specifically: the joint calibration parameters of the laser and the camera must be obtained, i.e. a rotation matrix R[α_x, α_y, α_z] and a translation vector T[t_x, t_y, t_z] must be found so that the mutual conversion holds. As shown by the equation P_A = R·P_B + T, the laser data is converted into the camera coordinate system, where P_B = [x_B, y_B, z_B]^T is the coordinate of the laser data in the laser coordinate system and P_A = [x_A, y_A, z_A]^T is the coordinate of the laser data in the camera coordinate system.
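The conversion P_A = R·P_B + T can be sketched in a few lines of NumPy. This is an illustrative implementation; the Euler-angle composition order is an assumption, since the patent does not state one:

```python
import numpy as np

def rotation(ax, ay, az):
    """Rotation matrix R[ax, ay, az], composed as Rz @ Ry @ Rx
    (convention assumed; the patent does not state the Euler order)."""
    cx, sx = np.cos(ax), np.sin(ax)
    cy, sy = np.cos(ay), np.sin(ay)
    cz, sz = np.cos(az), np.sin(az)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def laser_to_camera(points_laser, R, T):
    """Apply formula (a), P_A = R @ P_B + T, row-wise to an (N, 3) array."""
    return points_laser @ R.T + T

# Example: rotate 90 degrees about z and translate; [1,0,0] maps to about [0.1, 1.0, 0.2]
R = rotation(0.0, 0.0, np.pi / 2)
T = np.array([0.1, 0.0, 0.2])
camera_points = laser_to_camera(np.array([[1.0, 0.0, 0.0]]), R, T)
```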
2. The algorithm realizes the registration of the laser data.
Three-dimensional reconstruction must continuously register the point clouds of different moments, i.e. register each newly added point cloud into the historical point cloud data. Here the newly added point cloud is calibrated using the positioning and navigation data obtained by RTK, refined with the iterative closest point (ICP) algorithm. Specifically: in each iteration, for every laser point of one frame of point cloud the nearest point in the other frame is found, and the rotation matrix is computed from the corresponding point pairs so that the total distance between all point pairs is minimized, as shown by the equation
min_{R,T} Σ_i ‖R·q_i + T − p_i‖²
where P and Q are the two frames of laser point cloud data and (p_i, q_i) are the matched nearest-neighbor point pairs.
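A minimal sketch of this iterative refinement, assuming brute-force nearest-neighbor search and the standard SVD (Kabsch) closed-form solution for R and T in each iteration — neither of which the patent specifies:

```python
import numpy as np

def icp(P, Q, iters=10):
    """Minimal ICP sketch: iteratively match each point of Q to its nearest
    neighbour in P and solve min_{R,T} sum ||R q_i + T - p_i||^2 in closed form.
    Brute-force matching; a k-d tree would be used on real point clouds."""
    Qc = Q.copy()
    for _ in range(iters):
        # nearest point in P for every point of Qc
        d2 = ((Qc[:, None, :] - P[None, :, :]) ** 2).sum(-1)
        tgt = P[d2.argmin(axis=1)]
        # closed-form R, T (Kabsch) for the current correspondences
        qm, pm = Qc.mean(0), tgt.mean(0)
        H = (Qc - qm).T @ (tgt - pm)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        T = pm - R @ qm
        Qc = Qc @ R.T + T             # move Q toward P
    return Qc

# Example: a grid cloud slightly rotated/translated is pulled back onto the original
g = np.arange(4.0)
P = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
th = np.deg2rad(3)
Rg = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]])
Q = P @ Rg.T + np.array([0.1, -0.05, 0.08])
aligned = icp(P, Q)
```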
3. The algorithm performs feature extraction on the vision data.
So that the image frames can be put into correspondence, corner features are extracted from the vision data; using corners as tracking points, adjacent images can be matched better. Specifically, corner features are extracted by moving a window over the image and computing the gray-level change within the window: if the gray level changes little, there is no corner in this region; if the gray level changes greatly when the window moves in one direction but hardly changes when it moves in other directions, the region may contain a straight edge; if the gray level changes greatly when the window moves in every direction, the region is considered to contain a corner.
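The moving-window gray-level test described above is essentially the Harris corner criterion. A sketch on a synthetic image follows; the 3×3 window and the constant k = 0.04 are conventional choices, not values from the patent:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris-style corner response implementing the window test above:
    large change in all directions -> corner (R > 0); change in only one
    direction -> straight edge (R < 0); little change -> flat (R ~ 0)."""
    gy, gx = np.gradient(img.astype(float))
    def box3(a):                          # 3x3 window sum (the moving window)
        p = np.pad(a, 1)
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    sxx, syy, sxy = box3(gx * gx), box3(gy * gy), box3(gx * gy)
    det = sxx * syy - sxy ** 2            # product of structure-tensor eigenvalues
    tr = sxx + syy                        # sum of structure-tensor eigenvalues
    return det - k * tr ** 2

# Synthetic image: a bright square whose top-left corner sits at pixel (16, 16)
img = np.zeros((32, 32))
img[16:, 16:] = 1.0
R = harris_response(img)
```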
4. The algorithm fuses the data.
Using the joint calibration method of step 1, the three-dimensional laser data is projected onto the visual image data, so that part of the image obtains true three-dimensional coordinates, realizing the fusion of the data. Specifically:
(1) if the laser used the same working frequency as the camera, the data could be mapped under identical timestamps, but in general the working frequencies are not identical;
(2) since the laser's working frequency differs from the camera's, the timestamps are not necessarily aligned; therefore, to fuse the two kinds of data, the laser data must first be rotated and translated appropriately and then re-projected onto the corresponding image.
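The rotate-translate-reproject step can be sketched with a pinhole camera model; the intrinsic parameters below (fx, fy, cx, cy) are illustrative assumptions, since the patent does not specify a camera model:

```python
import numpy as np

def project_to_image(points_laser, R, T, fx, fy, cx, cy):
    """Rotate/translate laser points into the camera frame (formula (a)) and
    re-project them onto the image with an assumed pinhole model."""
    pc = points_laser @ R.T + T            # laser frame -> camera frame
    X, Y, Z = pc[:, 0], pc[:, 1], pc[:, 2]
    u = fx * X / Z + cx                    # pixel column
    v = fy * Y / Z + cy                    # pixel row
    return np.stack([u, v], axis=1), Z     # keep depth: these pixels gain 3-D coordinates

# Example with identity extrinsics and assumed intrinsics fx=fy=500, cx=320, cy=240
pts = np.array([[0.0, 0.0, 2.0], [1.0, 0.0, 2.0]])
pix, depth = project_to_image(pts, np.eye(3), np.zeros(3), 500, 500, 320, 240)
```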
5. The algorithm reconstructs the three-dimensional scene from the fused data.
Conventional methods are further applied to match against the existing calibrations, including estimating their motion, to obtain the final, fully registered laser point cloud data and realize the reconstruction of the three-dimensional scene.
Claims (9)
1. A three-dimensional reconstruction method based on three-dimensional laser and camera, characterized by comprising the following steps:
(1) obtaining joint calibration parameters: extracting corner features from the laser data and the vision data of the same timestamp respectively, and computing the joint calibration parameters from the corner features;
(2) registering the laser data: continuously registering the point cloud data of different moments, i.e. registering the newly added point cloud data into the historical point cloud data, and calibrating the newly added point cloud data;
(3) performing feature extraction on the vision data;
(4) using the joint calibration method of step (1), projecting the laser data onto the vision data so that part of the image obtains true three-dimensional coordinates, realizing the fusion of the data;
(5) calibrating to obtain the final, fully registered laser point cloud data, realizing the reconstruction of the three-dimensional scene.
2. The three-dimensional reconstruction method based on three-dimensional laser and camera according to claim 1, characterized in that in step (1) a rotation matrix R[α_x, α_y, α_z] and a translation vector T[t_x, t_y, t_z] are found so that the mutual conversion holds, the laser data being converted into the camera coordinate system as shown in formula (a):
P_A = R·P_B + T   (a)
where P_B = [x_B, y_B, z_B]^T is the coordinate of the laser data in the laser coordinate system;
P_A = [x_A, y_A, z_A]^T is the coordinate of the laser data in the camera coordinate system.
3. The three-dimensional reconstruction method based on three-dimensional laser and camera according to claim 1, characterized in that in step (2) the newly added point cloud data is calibrated using the positioning and navigation data obtained by RTK, and is calibrated with the iterative closest point (ICP) algorithm, the ICP algorithm comprising:
in each iteration, for every laser point of one frame of point cloud, finding the nearest point in the other frame of point cloud, and computing the rotation matrix from the corresponding point pairs so that the total distance between all point pairs is minimized, as shown in formula (b):
min_{R,T} Σ_i ‖R·q_i + T − p_i‖²   (b)
where P and Q are the two frames of laser point cloud data and (p_i, q_i), with p_i ∈ P and q_i ∈ Q, are the matched nearest-neighbor point pairs.
4. The three-dimensional reconstruction method based on three-dimensional laser and camera according to claim 1, characterized in that in step (3) corner points are used as tracking points, the corner features being extracted by moving a window over the image and computing the gray-level change within the window: if the gray level changes little, there is no corner in this region; if the gray level changes greatly when the window moves in one direction but hardly changes when it moves in other directions, the region may contain a straight edge; if the gray level changes greatly when the window moves in every direction, the region is considered to contain a corner.
5. The three-dimensional reconstruction method based on three-dimensional laser and camera according to claim 1, characterized in that in step (4), when fusing the laser data and the vision data, the laser data is first rotated and translated appropriately and then re-projected onto the corresponding visual image.
6. A UAV-based three-dimensional reconstruction system, comprising a UAV body, a power module and a camera for acquiring vision data, characterized by further comprising a LiDAR for acquiring laser data, the UAV body including a load plate for carrying and fixing the camera and the LiDAR, the camera and the LiDAR being mounted on the two sides of the load plate respectively.
7. The UAV-based three-dimensional reconstruction system according to claim 6, characterized in that the UAV body further includes arms, fixing bolts for mounting the RTK antennas and GPS being provided on the arms.
8. The UAV-based three-dimensional reconstruction system according to claim 6, characterized in that the distance between the two RTK antennas is greater than 30 centimeters, and the arrow of the GPS points in the heading direction of the UAV body.
9. The UAV-based three-dimensional reconstruction system according to claim 6, characterized by further comprising a remote control server for remotely controlling the UAV body, the LiDAR and the camera.
Priority Applications (1)
- CN201811651070.8A (priority and filing date: 2018-12-31): Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle
Publications (2)
- CN109816774A, published 2019-05-28
- CN109816774B (granted), published 2023-11-17
Family ID: 66603299 (status: Active); country: CN (China)
Cited By (7)
- CN110672091A (2020-01-10): Time domain aircraft flexible towing pod positioning system
- CN110864725A (2020-03-06): Panoramic three-dimensional color laser scanning system and method based on lifting motion
- CN111047631A (2020-04-21): Multi-view three-dimensional point cloud registration method based on single Kinect and round box
- CN111199578A (2020-05-26): Unmanned aerial vehicle three-dimensional environment modeling method based on vision-assisted laser radar
- CN112419417A (2021-02-26): Unmanned aerial vehicle-based photographing point positioning method and related device
- CN112767475A (2021-05-07): Intelligent roadside sensing system based on C-V2X, radar and vision
- CN113379910A (2021-09-10): Mobile robot mine scene reconstruction method and system based on SLAM
Patent Citations (3)
- CN104268935A (2015-01-07): Feature-based airborne laser point cloud and image data fusion system and method
- WO2018072433A1 (2018-04-26): Three-dimensional scanning method including a plurality of lasers with different wavelengths, and scanner
- CN108828606A (2018-11-16): Joint measuring method based on laser radar and binocular visible-light camera
2018-12-31: Application CN201811651070.8A filed in China (CN); granted as patent CN109816774B, status Active
Non-Patent Citations (1)
Title |
---|
Du, Yunan et al.: "Scene 3D Reconstruction Based on Synchronized Laser and Stereo Vision Data", Software * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110672091A (en) * | 2019-09-29 | 2020-01-10 | 哈尔滨飞机工业集团有限责任公司 | Time domain aircraft flexible towing pod positioning system |
CN110672091B (en) * | 2019-09-29 | 2023-05-23 | 哈尔滨飞机工业集团有限责任公司 | Flexible drag nacelle positioning system of time domain aircraft |
CN110864725A (en) * | 2019-10-24 | 2020-03-06 | 大连理工大学 | Panoramic three-dimensional color laser scanning system and method based on lifting motion |
CN111047631A (en) * | 2019-12-04 | 2020-04-21 | 广西大学 | Multi-view three-dimensional point cloud registration method based on single Kinect and round box |
CN111047631B (en) * | 2019-12-04 | 2023-04-07 | 广西大学 | Multi-view three-dimensional point cloud registration method based on single Kinect and round box |
CN111199578A (en) * | 2019-12-31 | 2020-05-26 | 南京航空航天大学 | Unmanned aerial vehicle three-dimensional environment modeling method based on vision-assisted laser radar |
CN111199578B (en) * | 2019-12-31 | 2022-03-15 | 南京航空航天大学 | Unmanned aerial vehicle three-dimensional environment modeling method based on vision-assisted laser radar |
CN112767475A (en) * | 2020-12-30 | 2021-05-07 | 重庆邮电大学 | Intelligent roadside sensing system based on C-V2X, radar and vision |
CN112767475B (en) * | 2020-12-30 | 2022-10-18 | 重庆邮电大学 | Intelligent roadside sensing system based on C-V2X, radar and vision |
CN112419417A (en) * | 2021-01-25 | 2021-02-26 | 成都翼比特自动化设备有限公司 | Unmanned aerial vehicle-based photographing point positioning method and related device |
CN113379910A (en) * | 2021-06-09 | 2021-09-10 | 山东大学 | Mobile robot mine scene reconstruction method and system based on SLAM |
Also Published As
Publication number | Publication date |
---|---|
CN109816774B (en) | 2023-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109816774A (en) | Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle | |
US20210239815A1 (en) | Movable object performing real-time mapping using a payload assembly | |
CN107504957B (en) | Method for rapidly constructing three-dimensional terrain model by using unmanned aerial vehicle multi-view camera shooting | |
Udin et al. | Assessment of photogrammetric mapping accuracy based on variation flying altitude using unmanned aerial vehicle | |
CN111091613A (en) | Three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey | |
CN108168521A (en) | Method for realizing three-dimensional landscape visualization based on unmanned aerial vehicle | |
CN108317953A (en) | Binocular vision target surface 3D detection method and system based on unmanned aerial vehicle |
Yang et al. | A novel approach of efficient 3D reconstruction for real scene using unmanned aerial vehicle oblique photogrammetry with five cameras | |
CN108344397A (en) | Automated modeling method and system based on oblique photography technology, and auxiliary device therefor | |
CN108287164B (en) | Crack detection system | |
CN104118561B (en) | Method for monitoring large endangered wild animals based on unmanned aerial vehicle technology | |
CN109739254A (en) | Unmanned aerial vehicle using visual image positioning in electric power inspection, and positioning method thereof |
CN105758384A (en) | Unmanned aerial vehicle swinging oblique photography system | |
CN110660125B (en) | Three-dimensional modeling device for power distribution network system | |
CN115147538B (en) | Method for dynamically updating live-action three-dimensional modeling based on environment monitoring unmanned aerial vehicle | |
CN115937440A (en) | Method for fusing real-time video and three-dimensional scene of unmanned aerial vehicle | |
CN105096284A (en) | Method, device and system of generating road orthographic projection image | |
CN110864725A (en) | Panoramic three-dimensional color laser scanning system and method based on lifting motion | |
CN113031462A (en) | Port machine inspection route planning system and method for unmanned aerial vehicle | |
CN113415433B (en) | Pod attitude correction method and device based on three-dimensional scene model and unmanned aerial vehicle | |
CN205809689U (en) | Airframe inspection system | |
US20230079285A1 (en) | Display control device, display control method, and program | |
CN205594455U (en) | Three-dimensional modeling system of transmission line tower | |
CN112559662A (en) | Mobile internet map big data platform | |
CN205203414U (en) | Low latitude image acquisition device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 2019-12-30
Address after: No. 2 Tianhe Road, Xinbei District, Changzhou City, Jiangsu Province, 213031
Applicant after: Jiangsu Tianze Robot Technology Co., Ltd.
Address before: No. 2 Tianhe Road, Tianhe Photovoltaic Industrial Park, Xinbei District, Changzhou City, Jiangsu Province, 213022
Applicant before: TRINA SOLAR Co., Ltd.
|
GR01 | Patent grant | ||