CN110646808A - Forestry backpack-type laser radar multi-sensor integrated system - Google Patents

Forestry backpack-type laser radar multi-sensor integrated system

Info

Publication number
CN110646808A
Authority
CN
China
Prior art keywords
backpack
laser radar
time
forestry
gnss
Prior art date
Legal status
Pending
Application number
CN201911026626.9A
Other languages
Chinese (zh)
Inventor
Xing Tao
Chen Dongpeng
Wang Chong
Current Assignee
Northeast Forestry University
Original Assignee
Northeast Forestry University
Priority date
Filing date
Publication date
Application filed by Northeast Forestry University filed Critical Northeast Forestry University
Priority to CN201911026626.9A priority Critical patent/CN110646808A/en
Publication of CN110646808A publication Critical patent/CN110646808A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement, the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G - PHYSICS
    • G04 - HOROLOGY
    • G04R - RADIO-CONTROLLED TIME-PIECES
    • G04R20/00 - Setting the time according to the time information carried or implied by the radio signal
    • G04R20/02 - Setting the time according to the time information carried or implied by the radio signal, the radio signal being sent by a satellite, e.g. GPS

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to the field of forest parameter measurement, and in particular to a forestry backpack-type laser radar multi-sensor integrated system. The system consists of sensor components and a backpack mechanical structure, and comprises two true color cameras, a binocular camera, a GNSS (global navigation satellite system) receiver, two laser radar probes, an industrial personal computer, a combined navigation module, a battery, and the backpack mechanical structure. The invention closely combines mobile measurement technology with multi-sensor integration and is easy to move, highly adaptable, stable, and portable. It improves the time registration accuracy of the multiple sensors, can rapidly acquire three-dimensional spatial geographic information in real time, and reduces positioning errors. It expands the temporal and spatial coverage of the system, extending its spatio-temporal monitoring range, and it improves the overall anti-interference capability of the system, ensuring navigation continuity and system reliability.

Description

Forestry backpack-type laser radar multi-sensor integrated system
Technical Field
The invention relates to the field of forest parameter measurement, in particular to a forestry backpack type laser radar multi-sensor integrated system.
Background
Forests are the terrestrial ecosystem with the largest area, the widest distribution, the most complex composition, and the most abundant material resources. Traditional forest parameter measurement relies on field surveys that yield data only at scattered points, which is ill-suited to studying forest parameters at regional or larger scales. As remote sensing technology has matured, it has been applied successfully to large-area forest mapping, disaster monitoring, and similar tasks. In particular, LiDAR (light detection and ranging), one of the fastest-growing remote sensing technologies in recent years, overcomes the signal-saturation limitations of optical and microwave remote sensing and improves the accuracy of three-dimensional forest parameter estimation. Laser radar is an active remote sensing technology that has developed very rapidly worldwide and has been applied successfully to quantitative measurement and inversion of forest parameters. Its imaging mechanism differs from that of passive optical remote sensing: it has strong capability for detecting vegetation spatial structure and terrain, and for detecting forest height it offers advantages that other remote sensing data cannot match.
Existing three-dimensional laser scanning systems on different platforms, such as vehicle-mounted, airborne, and ground-based laser radar systems, are widely used for forest parameter measurement. As a rapidly developing measurement technology they are well suited to acquiring three-dimensional point cloud data in forest scenes: they can capture detailed data on individual trees as well as spatial data over a sample plot, constitute a high-precision non-contact three-dimensional measurement means, and can quickly obtain the structural parameters of a forest plot and a three-dimensional structure that describes the forest in detail. In practice, however, ground-based laser radar equipment is generally heavy, its scanning angle is limited, and trees occlude one another, which often reduces the scanning efficiency of the laser radar probe and thereby affects the whole measurement workflow. Among ground-based laser radar systems, backpack-type systems are easy to install, simple to operate, and portable, and can adapt to environments with dense forest and complex terrain. Nevertheless, in practical use conventional backpack laser radar systems still suffer from small scanning coverage, large positioning errors, and poor data transmission synchronization, which degrades the accuracy of forest parameter acquisition.
Disclosure of Invention
The invention overcomes the shortcomings of the prior art and realizes a forestry backpack-type laser radar multi-sensor integrated system. It addresses the small measurement range, insufficient time registration accuracy, and poor satellite navigation signal stability of existing backpack laser radar systems, thereby improving the accuracy with which the system acquires forest parameters such as tree height, diameter at breast height, and individual tree count.
In order to achieve the purpose, the technical scheme of the invention is as follows:
1. The invention provides a backpack-type laser radar multi-sensor integrated system for forest parameter measurement, comprising: two laser radar probes, a GNSS (global navigation satellite system) receiver, two true color cameras, a binocular camera, a combined navigation module, an industrial personal computer, a battery, and a backpack mechanical structure. The backpack mechanical structure comprises a support rod, a backpack frame, a backpack bottom plate, and a backpack shell. The support rod is fixed by a locking buckle that is screwed onto the backpack bottom plate and carries the GNSS receiver and one laser radar probe; the backpack frame is screwed onto the backpack bottom plate and carries the two true color cameras, the combined navigation module, the industrial personal computer, the binocular camera, and the other laser radar probe. The backpack bottom plate carries the backpack frame, the support rod, the battery, and the backpack shell; the backpack shell fits over the components mounted on the backpack frame and is screwed onto the backpack bottom plate.
2. The two laser radar probes extract laser point cloud data in the forest scene. The laser radar probe mounted horizontally at the top of the support rod emits laser pulses and receives return signals over a 270-degree rearward scanning range, and the returns are collected and stored as point cloud data. The other laser radar probe is mounted on the laser radar probe mounting plate of the backpack frame; taking its cross section as a reference plane, it scans within +/-15 degrees of that plane and extracts laser point cloud data on one side of the backpack shell.
3. The industrial personal computer is mounted on the right support plate of the backpack frame via copper standoffs. A microprocessor module is mounted on the mainboard of the industrial personal computer and coordinates and controls data acquisition and synchronous transmission among the sensors; the industrial personal computer is connected to the sensor components by cables.
4. The two true color cameras are arranged symmetrically above and below the binocular camera on the true color camera mounting plate of the backpack frame. By taking photographs continuously and at regular intervals in the two symmetric directions, they capture true color forest image information under the forest environment, improving the richness and accuracy of forest parameter acquisition. The photographs taken by the true color cameras are matched with the point cloud data collected by the laser radar probes, supplementing the tree texture and spectral information that the point cloud lacks.
5. The binocular camera is mounted on the binocular camera mounting plate of the backpack frame. Using a binocular stereo matching algorithm, depth information and a three-dimensional model of the forest scene can be acquired in real time. Once a disparity image has been obtained through stereo matching, a depth image can be derived, the three-dimensional scene information recovered, and three-dimensional point cloud information generated. The binocular camera acquires forest scene images from different spatial viewpoints; after calibrating the relative position of the two lenses, a stereo matching algorithm matches feature points between the two forest images, and an accurate, fast feature algorithm combined with binocular passive stereo vision performs three-dimensional reconstruction of the forest scene, thereby restoring the three-dimensional forest scene.
6. The combined navigation module is mounted on the left support plate of the backpack frame via copper standoffs and comprises a GPS satellite navigation signal receiver and an IMU (inertial measurement unit). Used together, they acquire high-precision three-dimensional position, velocity, and attitude information in real time, facilitate inertial attitude measurement under motion or vibration, and reduce measurement errors through nonlinearity compensation, orthogonality compensation, temperature compensation, and drift compensation. The combined navigation module has high short-term accuracy and can provide effective navigation information when the satellite navigation signal is lost, ensuring navigation continuity and system reliability; the combination of the GPS receiver and the IMU improves the overall anti-interference capability of the system.
7. The invention combines the spatio-temporal characteristics of the GNSS receiver, the two laser radar probes, the two true color cameras, the binocular camera, and the combined navigation module, and designs and realizes the control function of the microprocessor module. The microprocessor module adopts the GNSS time system during operation and can maintain that time system while GNSS signals are interrupted, thereby achieving high-precision spatio-temporal synchronization among the sensors. It unifies the asynchronous data from sensors observing the same target onto a common time reference, so that each sensor's observations of the same target correspond to the same instant. The microprocessor module obtains its time reference from the industrial personal computer system clock and the GNSS time system clock, controls image data acquisition by the true color cameras and the binocular camera and point cloud data acquisition by the laser radar probes, and unifies the time reference of the heterogeneous data, so that the laser point cloud and optical images can be transformed from relative coordinate systems into an absolute measurement coordinate system and data synchronization control is realized.
Because of various unstable factors such as interference signals, the invention uses the PPS (pulse-per-second) signal provided by the GPS satellite navigation signal receiver in the combined navigation module, together with the corresponding GNSS time, to achieve strict time synchronization of each sensor and eliminate the influence of interference signals on timing. With the PPS signal disciplined by GNSS, the synchronization accuracy can reach 0.1 ms. The microprocessor module receives the time and space information transmitted by the GNSS receiver together with the relative spatial information and configuration of the laser radar probes, establishes a time reference, and fuses the position, range, and time information. The scanning data rate of the two laser radar probes is adjustable up to 100 Hz, i.e. scan information can be obtained every 10 ms; the GNSS raw data and positioning output rates are at most 50 Hz, i.e. positioning data can be obtained every 20 ms, and the receiver also records time. To keep the point cloud data acquired by the laser radar probes synchronized with the GNSS data, the GNSS acquisition time is recorded while the laser radar probes acquire point cloud data; the GNSS time records can then be resolved in later sensor data processing so that time registration among the multiple sensors can be performed.
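The description above fixes the relationship between a local sensor timestamp, the most recent PPS edge, and GNSS time. The following Python sketch (not part of the patent; the data structure and function names are illustrative assumptions, real laser radar and GNSS drivers expose their own interfaces) shows that relationship in its simplest form:

```python
# Minimal sketch, assuming a simplified interface: map a sensor's local clock
# reading onto GNSS time using the most recent PPS edge and the GNSS second
# carried by the matching navigation message.

from dataclasses import dataclass

@dataclass
class PpsReference:
    local_time_s: float   # local (industrial PC) clock value captured at the PPS edge
    gnss_time_s: float    # whole-second GNSS time associated with that edge

def to_gnss_time(local_stamp_s: float, pps: PpsReference) -> float:
    """Convert a local sensor timestamp to GNSS time using the latest PPS edge."""
    return pps.gnss_time_s + (local_stamp_s - pps.local_time_s)

# Example: a scan stamped 12.3456 s on the local clock, with the last PPS edge
# seen at local time 12.0000 s and labelled GNSS second 518400.0
pps = PpsReference(local_time_s=12.0000, gnss_time_s=518400.0)
scan_gnss_time = to_gnss_time(12.3456, pps)   # -> 518400.3456 s
```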
Compared with the prior art, the invention has the beneficial effects that:
The low time registration accuracy of existing laser radar systems is addressed: combining the time delay characteristics of the GNSS receiver, the two laser radar probes, the two true color cameras, the binocular camera, and the combined navigation module, a time reference is established by recording and analyzing time marks while the multiple sensors are operating, achieving cooperative work and time synchronization among the sensors and improving time registration accuracy. The stability of the working environment of the backpack-type laser radar multi-sensor integrated system and its safety while moving are guaranteed, the survivability of the system is enhanced, and the reliability of data acquisition is improved. The temporal and spatial coverage is expanded, and the sensors overlap and cover one another, further extending the spatio-temporal monitoring range of the system. The combined navigation module improves the overall anti-interference capability of the system and ensures navigation continuity and system reliability.
Drawings
FIG. 1 is a front view of the internal structure of the present invention;
FIG. 2 is an external view of the present invention;
FIG. 3 is a view of the frame structure of the backpack of the present invention;
FIG. 4 is a schematic view of the installation of the support rod of the present invention;
FIG. 5 is a left isometric view of the present invention;
FIG. 6 is a right isometric view of the present invention;
FIG. 7 is a schematic diagram of multi-sensor time synchronization control.
Wherein: laser radar probe (1), GNSS (2), true color camera (3), binocular camera (4), combined navigation module (5), industrial personal computer (6), microprocessor module (601), FPGA (602), battery (7), support rod (801), backpack frame (802), laser radar probe mounting plate (8021), true color camera mounting plate (8022), binocular camera mounting plate (8023), left support plate (8024), right support plate (8025), backpack bottom plate (803), locking buckle (8031), backpack shell (804).
Detailed Description
The following description of the present invention will be made with reference to the accompanying drawings.
A forestry backpack type laser radar multi-sensor integrated system is composed of sensor components and a backpack mechanical structure.
As shown in fig. 1, the present invention provides a forestry backpack-type laser radar multi-sensor integrated system, which comprises: two laser radar probes (1), a GNSS (global navigation satellite system) (2), two true color cameras (3), a binocular camera (4), a combined navigation module (5), an industrial personal computer (6), a battery (7), and a backpack mechanical structure.
As shown in fig. 1, 2, and 3, the backpack mechanical structure includes a support pole (801), a backpack frame (802), a backpack floor (803), and a backpack housing (804).
As shown in fig. 4, the support rod (801) is fixed and installed by a locking buckle (8031) screwed on the backpack bottom plate (803) for installing the GNSS (2) and a laser radar probe (1);
as shown in fig. 1 and 3, the backpack frame (802) is fixed on the backpack bottom plate (803) and is used for mounting the two true color cameras (3), the combined navigation module (5), the industrial personal computer (6), the binocular camera (4) and the laser radar probe (1);
as shown in fig. 1, the backpack bottom plate (803) is used for mounting the backpack frame (802), the support rods (801), the battery (7) and the backpack shell (804);
as shown in fig. 2, the backpack shell (804) fits over the components mounted on the backpack frame (802) and is screwed onto the backpack bottom plate (803).
Further, as shown in fig. 1, the two laser radar probes (1) extract laser point cloud data in the forest scene. The laser radar probe (1) mounted horizontally at the top of the support rod (801) emits laser pulses and receives return signals over a 270-degree rearward scanning range, and the returns are collected and stored as point cloud data. The other laser radar probe (1) is mounted on the laser radar probe mounting plate (8021) of the backpack frame (802); taking its cross section as a reference plane, it scans within +/-15 degrees of that plane and extracts laser point cloud data on one side of the backpack shell (804).
Further, as shown in fig. 6, the industrial personal computer (6) is installed on a right support plate (8025) of the backpack frame (802) through a copper column, the microprocessor module (601) is installed on a mainboard of the industrial personal computer (6), the microprocessor module (601) coordinates and controls data acquisition and synchronous transmission among the sensors, and the industrial personal computer (6) is connected with the sensor components through cables.
Further, as shown in fig. 1, the two true color cameras (3) are arranged symmetrically above and below the binocular camera (4) on the true color camera mounting plate (8022) of the backpack frame (802). By taking photographs continuously and at regular intervals in the two symmetric directions, they collect true color image information in the forest environment, improving the richness and accuracy of forest parameter acquisition. The photographs taken by the true color cameras (3) are matched with the point cloud data collected by the laser radar probes (1), supplementing the tree texture and spectral information missing from the point cloud data.
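As one illustration of this matching step, the sketch below projects laser radar points into a true color photograph with a pinhole camera model and samples the pixel colors. The intrinsic matrix K and the laser-radar-to-camera extrinsics R and t are assumed to come from a prior calibration, which the patent text does not specify; this is a generic colorization sketch, not the patent's own procedure.

```python
# Minimal sketch, assuming calibrated intrinsics K and extrinsics (R, t):
# attach image color to laser radar points via pinhole projection.

import numpy as np

def colorize_points(points_lidar: np.ndarray,   # (N, 3) points in the laser radar frame
                    image: np.ndarray,          # (H, W, 3) true color image
                    K: np.ndarray,              # (3, 3) camera intrinsic matrix
                    R: np.ndarray,              # (3, 3) laser radar -> camera rotation
                    t: np.ndarray):             # (3,)   laser radar -> camera translation
    pts_cam = points_lidar @ R.T + t            # transform points into the camera frame
    in_front = pts_cam[:, 2] > 0                # keep only points in front of the camera
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T                      # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]                 # normalize by depth
    h, w = image.shape[:2]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = image[v[valid], u[valid]]          # RGB sampled at the projected pixels
    return pts_cam[valid], colors               # colored subset of the point cloud
```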
Further, as shown in fig. 1, the binocular camera (4) is mounted on the binocular camera mounting plate (8023) of the backpack frame (802); using a binocular stereo matching algorithm, forest scene depth information and a three-dimensional model can be acquired in real time. Once a disparity image has been obtained through stereo matching, a depth image can be derived, the three-dimensional scene information recovered, and three-dimensional point cloud information generated. The binocular camera (4) acquires forest scene images from different spatial viewpoints; after calibrating the relative position of the binocular camera (4), a stereo matching algorithm matches feature points between the two forest images, and an accurate, fast feature algorithm combined with binocular passive stereo vision performs three-dimensional reconstruction of the forest scene, restoring the three-dimensional forest scene.
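A minimal sketch of such a binocular pipeline is given below, using OpenCV's semi-global block matching as one possible stereo matching algorithm (the patent does not name a specific algorithm). The matcher parameters, the file names, and the reprojection matrix Q (normally produced by stereo calibration and rectification) are placeholder assumptions.

```python
# Sketch only: rectified left/right images -> disparity -> 3D forest point cloud.

import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified left image (assumed)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified right image (assumed)

# Semi-global block matching as one common stereo matching algorithm
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Q: 4x4 disparity-to-depth reprojection matrix from rectification (assumed available)
Q = np.load("Q.npy")
points_3d = cv2.reprojectImageTo3D(disparity, Q)   # (H, W, 3) scene coordinates
mask = disparity > 0                               # keep pixels with a valid match
cloud = points_3d[mask]                            # forest-scene point cloud
```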
Further, as shown in fig. 5, the combined navigation module (5) is mounted on the left support plate (8024) of the backpack frame (802) through a copper column, and includes a GPS satellite navigation signal receiver and an IMU inertial measurement unit, which are used in combination to obtain high-precision three-dimensional position, velocity, and attitude information in real time, so as to facilitate inertial attitude measurement in a motion or vibration state, and reduce measurement errors through nonlinear compensation, quadrature compensation, temperature compensation, and drift compensation. The combined navigation module has high short-term precision, can provide effective navigation information when the satellite navigation signal is lost, ensures the navigation continuity and the system reliability, and improves the comprehensive anti-jamming capability of the system by combining the GPS satellite navigation signal receiver and the IMU inertia measurement unit.
Further, the combined navigation module (5) computes motion trajectory information in the forest scene in real time, including position and velocity, attitude, and acceleration information. The GPS satellite navigation receiver and the IMU inertial measurement unit are driven in the forest scene; they are operated and their information fused according to the scene data computed by the system in real time, realizing combined navigation of the system in the forest scene. The real-time satellite navigation information generated by the GPS satellite navigation receiver and the real-time attitude and acceleration information output by the IMU inertial measurement unit are transmitted to the microprocessor module (601) through cables, realizing real-time coordinate calculation and yielding a positioning result in real time.
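A highly simplified sketch of the loosely coupled idea behind such GPS/IMU fusion follows. It is not the module's actual algorithm: attitude estimation, frame conversions, and the compensation steps described above are omitted, and the blending gain is an arbitrary assumption. It only illustrates how inertial dead reckoning between fixes can be corrected by each GPS position update.

```python
# Minimal loosely coupled sketch, assuming accelerations already rotated into a
# local level navigation frame and a fixed correction gain.

import numpy as np

class SimpleGnssImuFusion:
    def __init__(self, pos0, vel0, gain=0.2):
        self.pos = np.asarray(pos0, dtype=float)   # position in the navigation frame (m)
        self.vel = np.asarray(vel0, dtype=float)   # velocity (m/s)
        self.gain = gain                           # how strongly a GPS fix corrects drift

    def propagate_imu(self, accel_nav, dt):
        """Dead-reckon position and velocity from an IMU acceleration sample."""
        self.vel += np.asarray(accel_nav, dtype=float) * dt
        self.pos += self.vel * dt

    def update_gnss(self, pos_gnss):
        """Blend in a GPS position fix to bound the inertial drift."""
        self.pos += self.gain * (np.asarray(pos_gnss, dtype=float) - self.pos)
```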
furthermore, the microprocessor module (601) acquires time reference from an industrial personal computer (6) system clock and a GNSS (2) time system clock, controls image data acquisition of the true color camera (3) and the binocular camera (4) and point cloud data acquisition of the laser radar probe (1), and realizes unification of time reference of heterogeneous data, so that laser point cloud and optical images can be converted into an absolute measurement coordinate system from a relative coordinate system, and data synchronization control is realized.
Furthermore, the invention utilizes the PPS signal provided by the GPS satellite navigation signal receiver in the combined navigation module (5) and the corresponding GNSS (2) time to realize the strict time synchronization of each sensor and eliminate the influence of interference signals on the time. As shown in fig. 7, the synchronization accuracy can reach 0.1ms when the PPS signal is timed by GNSS (2). The microprocessor module (601) receives the time and space information transmitted by the GNSS (2), the relative space information and the setting information of the laser radar probe (1), establishes a time reference, and fuses the position information, the distance information and the time information.
Further, in view of the characteristics of the invention, the system clock of the industrial personal computer uses a high-stability 100 MHz crystal oscillator together with an FPGA (602) whose sub-modules work in parallel, so that debugging or modifying one sub-module does not affect the results of the other modules. The FPGA (602) completes the key timing part: under program control it implements microsecond, millisecond, and second pulse generation units; when the alignment pulse generation circuit detects a valid 1PPS pulse, it generates an alignment pulse that resets the millisecond and second pulse generation circuits, completing one timing alignment. That is, when the 1PPS pulse edge reaches the FPGA (602) circuit, the circuit clears the fractional part of the "seconds" count.
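A behavioural model of this timing scheme, written in Python rather than HDL, may make the reset logic clearer. Counter widths, method names, and the interface are illustrative assumptions rather than the FPGA design itself.

```python
# Behavioural sketch, assuming a 100 MHz system clock: tick counters accumulate
# the fractional second, and a valid 1PPS edge clears them so local time
# realigns with the GNSS whole second.

class PpsDisciplinedCounter:
    CLK_HZ = 100_000_000                 # 100 MHz system clock

    def __init__(self):
        self.ticks = 0                   # clock ticks within the current second
        self.seconds = 0                 # whole GNSS seconds

    def on_clock(self):
        """Called once per 100 MHz clock cycle."""
        self.ticks += 1
        if self.ticks >= self.CLK_HZ:    # a full second of ticks has elapsed
            self.ticks = 0
            self.seconds += 1

    def on_pps_edge(self, gnss_second: int):
        """Valid 1PPS edge: clear the fractional second and realign to GNSS time."""
        self.ticks = 0
        self.seconds = gnss_second

    @property
    def time_s(self) -> float:
        """Current time in seconds, on the GNSS time base."""
        return self.seconds + self.ticks / self.CLK_HZ
```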
Furthermore, the scanning data rate of the two laser radar probes (1) is adjustable up to 100 Hz, i.e. scan information can be obtained every 10 ms; the raw data and positioning output rates of the GNSS (2) are at most 50 Hz, i.e. positioning data can be obtained every 20 ms, and the receiver also records time. To keep the point cloud data acquired by the laser radar probes (1) synchronized with the GNSS (2) data, the GNSS (2) acquisition time is recorded while the laser radar probes (1) acquire data; the GNSS (2) time records can then be resolved in later sensor data processing so that time registration among the multiple sensors can be performed.
Furthermore, since the sampling frequencies of the sensors are different, interpolation and extrapolation processing needs to be performed on target data of different frequencies acquired by the sensors, and the observation time series of the sensor with the highest output frequency is selected as a time reference, and then the observation data acquired by the low-frequency sensor is estimated on the time reference, so that the accuracy of multi-sensor time registration is improved. The data acquisition and recording time of each sensor can reach synchronization and be unified, and the data acquisition and recording time is converted into a unified time coordinate system. The time registration is to accurately fuse the image information acquired by the two true color cameras (3) and the binocular camera (4), the three-dimensional space point cloud data acquired by the two laser radar probes (1) and the positioning and attitude determination data information acquired by the GPS satellite navigation receiver and the IMU inertial measurement unit in the combined navigation module (5) at each sampling moment, and keep the time sequence of data acquisition synchronous.
Further, the data output rate of the GPS satellite navigation receiver is set to Δf_GPS = 20 Hz, the data output rate of the laser radar probe (1) is Δf_LiDAR = 50 Hz, the sampling rate of the IMU inertial measurement unit is Δf_IMU = 20 Hz, and the GNSS output rate is Δf_GNSS. A BP neural network method is then applied to align the GPS satellite navigation receiver positioning data with the IMU inertial measurement unit observations, ensuring that the output intervals of all sensors of the system are unified to Δt = 1/Δf_GNSS = 1/Δf_LiDAR. The attitude and position parameters of the laser radar probe (1) data are thereby obtained, and the transformation between the world coordinate system and the spatial coordinate system of the backpack-type laser radar multi-sensor integrated system at the n-th instant is as follows:
[Transformation equation reproduced only as an image (Figure BDA0002248815070000041) in the original publication; not recoverable from the text.]
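Since the transformation itself is available only as an image, the following generic sketch shows what a per-epoch rigid transform of this kind does, with R_n and t_n standing for the attitude and position solution interpolated to the n-th instant. The symbols and function are assumptions for illustration, not a reproduction of the patented equation.

```python
# Generic sketch, assuming a rotation R_n and translation t_n per epoch n:
# map laser radar points from the system frame into the world coordinate system.

import numpy as np

def to_world(points_sys: np.ndarray, R_n: np.ndarray, t_n: np.ndarray) -> np.ndarray:
    """Rigid transform of (N, 3) points from the system frame to the world frame."""
    return points_sys @ R_n.T + t_n
```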
finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention as defined in the following claims. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (6)

1. A forestry backpack type laser radar multi-sensor integrated system is characterized by comprising: the system comprises two laser radar probes, a GNSS (global navigation satellite system), two true color cameras, a binocular camera, a combined navigation module, an industrial personal computer, a battery and a backpack mechanical structure. The backpack mechanical structure comprises a support rod, a backpack frame, a backpack bottom plate and a backpack shell. The support rod is fixedly installed through a locking buckle which is connected to a backpack bottom plate through threads and used for installing the GNSS and a laser radar probe; the backpack frame is connected to the backpack bottom plate through threads and used for mounting the two true color cameras, the combined navigation module, the industrial personal computer, the binocular camera and the laser radar probe; the backpack bottom plate is used for mounting the backpack frame, the supporting rod, the battery and the backpack shell; the backpack shell and all components on the backpack frame are nested and installed, and are installed on the backpack bottom plate through threads.
2. The forestry backpack type lidar integrated system of claim 1, wherein: the two laser radar probes are used for extracting laser point cloud data in a forest scene, wherein the laser radar probes horizontally arranged at the top of the supporting rod transmit laser energy and receive return signals in a scanning range with a backward included angle of 270 degrees, and the laser energy and the return signals are collected and stored as point cloud data; and the other laser radar probe is arranged on a laser radar probe mounting plate on the backpack frame, and the cross section of the laser radar probe is taken as a reference surface, forms a scanning range with an included angle of +/-15 degrees with the reference surface and is used for extracting laser point cloud data on one side of the backpack shell.
3. The forestry backpack type lidar integrated system of claim 1, wherein: the industrial personal computer is mounted on the right support plate of the backpack frame via copper standoffs; the microprocessor module is mounted on the mainboard of the industrial personal computer and coordinates and controls data acquisition and synchronous transmission among the sensors.
4. The forestry backpack type lidar integrated system of claim 1, wherein: the two true color cameras are arranged symmetrically above and below the binocular camera on the true color camera mounting plate of the backpack frame, and by taking photographs continuously and at regular intervals in the symmetric directions they collect true color forest image information in the forest environment, improving the richness and accuracy of forest parameter acquisition.
5. The forestry backpack type lidar integrated system of claim 1, wherein: the combined navigation module is arranged on a left support plate of the backpack frame through a copper column, comprises a GPS satellite navigation signal receiver and an IMU inertial measurement unit, and can acquire high-precision three-dimensional position, speed and attitude information in real time by combined use of the GPS satellite navigation signal receiver and the IMU inertial measurement unit, so that inertial attitude measurement in a motion or vibration state is facilitated; the combined navigation module has high short-time precision, can provide effective navigation information when the satellite navigation signal is lost, and ensures the continuity of navigation and the reliability of the system.
6. The forestry backpack type lidar integrated system of claim 3, wherein: the micro processor module introduces a GNSS time system in the operation process, acquires time reference from the industrial personal computer system clock and the GNSS time system clock, controls image data acquisition of the true color camera and the binocular camera and point cloud data acquisition of the laser radar probe, realizes time reference unification of heterogeneous data, and simultaneously realizes high-precision time-space synchronization among the multiple sensors.
CN201911026626.9A 2019-10-26 2019-10-26 Forestry backpack-type laser radar multi-sensor integrated system Pending CN110646808A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911026626.9A CN110646808A (en) 2019-10-26 2019-10-26 Forestry backpack-type laser radar multi-sensor integrated system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911026626.9A CN110646808A (en) 2019-10-26 2019-10-26 Forestry backpack-type laser radar multi-sensor integrated system

Publications (1)

Publication Number Publication Date
CN110646808A true CN110646808A (en) 2020-01-03

Family

ID=69013579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911026626.9A Pending CN110646808A (en) 2019-10-26 2019-10-26 Forestry backpack-type laser radar multi-sensor integrated system

Country Status (1)

Country Link
CN (1) CN110646808A (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106443687A (en) * 2016-08-31 2017-02-22 欧思徕(北京)智能科技有限公司 Piggyback mobile surveying and mapping system based on laser radar and panorama camera
CN207268586U (en) * 2017-07-04 2018-04-24 上海圭目机器人有限公司 A kind of autonomous type detecting a mine robot system
CN109597095A (en) * 2018-11-12 2019-04-09 北京大学 Backpack type 3 D laser scanning and three-dimensional imaging combined system and data capture method
CN211014643U (en) * 2019-10-26 2020-07-14 东北林业大学 Forestry knapsack formula laser radar multisensor integrated system

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111239758A (en) * 2020-03-13 2020-06-05 苏州天准科技股份有限公司 Map collecting device
CN111950336A (en) * 2020-04-14 2020-11-17 成都理工大学 Vegetation canopy ecological water estimation method based on backpack type laser radar
CN111696162B (en) * 2020-06-11 2022-02-22 中国科学院地理科学与资源研究所 Binocular stereo vision fine terrain measurement system and method
CN111696162A (en) * 2020-06-11 2020-09-22 中国科学院地理科学与资源研究所 Binocular stereo vision fine terrain measurement system and method
CN113167884A (en) * 2020-06-30 2021-07-23 深圳市大疆创新科技有限公司 Radar assembly and movable platform with same
CN111650605A (en) * 2020-07-08 2020-09-11 上海振华重工电气有限公司 Angle and height adjustable laser radar and camera combined installation device
CN112214019A (en) * 2020-09-21 2021-01-12 国网浙江省电力有限公司 Non-blind area intelligent feedback control system, method and terminal for unmanned inspection equipment
CN113280792A (en) * 2021-06-30 2021-08-20 钦州文泰建设工程有限公司 Indoor surveying and mapping system and method
KR20230089043A (en) * 2021-12-13 2023-06-20 한국전자기술연구원 Apparatus for fixing a plurality of sensors and system including the same
WO2023113191A1 (en) * 2021-12-13 2023-06-22 한국전자기술연구원 Device for fixing plurality of sensors and system comprising same
KR102642742B1 (en) * 2021-12-13 2024-03-04 한국전자기술연구원 Apparatus for fixing a plurality of sensors and system including the same
CN114180085A (en) * 2021-12-29 2022-03-15 上海机器人产业技术研究院有限公司 Unmanned aerial vehicle nacelle for three-dimensional true color environment modeling
CN114180085B (en) * 2021-12-29 2023-12-26 上海机器人产业技术研究院有限公司 Unmanned aerial vehicle nacelle for three-dimensional true color environment modeling
CN117782227A (en) * 2024-02-26 2024-03-29 中国铁路设计集团有限公司 Multisource aerial remote sensing data acquisition device, system and control method
CN117782227B (en) * 2024-02-26 2024-05-10 中国铁路设计集团有限公司 Multisource aerial remote sensing data acquisition device, system and control method

Similar Documents

Publication Publication Date Title
CN110646808A (en) Forestry backpack-type laser radar multi-sensor integrated system
CN106017463B (en) A kind of Aerial vehicle position method based on orientation sensing device
CN110108984A (en) The spatial relationship synchronous method of power-line patrolling laser radar system multisensor
CN107807365A (en) Small-sized digital photography there-dimensional laser scanning device for the unmanned airborne vehicle in low latitude
CN107289910B (en) Optical flow positioning system based on TOF
KR20190051704A (en) Method and system for acquiring three dimentional position coordinates in non-control points using stereo camera drone
CN105928493A (en) Binocular vision three-dimensional mapping system and method based on UAV
KR101105606B1 (en) The method and apparatus of topographical map data with movement multi sensor moudle
Nagai et al. UAV borne mapping by multi sensor integration
CN212008943U (en) High-flux three-dimensional scanning spectral imaging measuring device
CN112987065A (en) Handheld SLAM device integrating multiple sensors and control method thereof
CN113820735A (en) Method for determining position information, position measuring device, terminal, and storage medium
CN211014643U (en) Forestry backpack-type laser radar multi-sensor integrated system
CN202074965U (en) Full-function day and night laser distance measuring instrument
CN105045276A (en) Method and apparatus for controlling flight of unmanned plane
CN102654917B (en) Method and system for sensing motion gestures of moving body
CN115435784A (en) Device and method for building aerial work platform laser radar and inertial navigation fusion positioning image
Grejner-Brzezinska et al. From Mobile Mapping to Telegeoinformatics
CN107255446A (en) A kind of Cold region apple fruit tree canopy three-dimensional map constructing system and method
JPH10318743A (en) Method and apparatus for surveying by using flying object
CN114966793B (en) Three-dimensional measurement system, method and GNSS system
CN116957360A (en) Space observation and reconstruction method and system based on unmanned aerial vehicle
CN116027351A (en) Hand-held/knapsack type SLAM device and positioning method
CN103776425A (en) Imaging space information acquisition system
CN202084081U (en) Moving object motion attitude sensing system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
CB03 Change of inventor or designer information
CB03 Change of inventor or designer information

Inventor after: Yang Tiebin

Inventor after: Chen Dongpeng

Inventor after: Wang Chong

Inventor after: Xing Tao

Inventor after: Xing Yanqiu

Inventor before: Xing Tao

Inventor before: Chen Dongpeng

Inventor before: Wang Chong

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination