CN114155290A - System and method for large-field-of-view high-speed motion measurement - Google Patents


Info

Publication number
CN114155290A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle, position information, real-time
Prior art date
Legal status
Granted
Application number
CN202111372441.0A
Other languages
Chinese (zh)
Other versions
CN114155290B (en)
Inventor
Pan Dawei (潘大伟)
Wang Xiaofei (王晓飞)
Lei Xiujun (雷秀军)
Guo Hongxia (郭红霞)
Current Assignee
Hefei Zhongke Junda Vision Technology Co ltd
Original Assignee
Hefei Fuhuang Junda High Tech Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Fuhuang Junda High Tech Information Technology Co ltd filed Critical Hefei Fuhuang Junda High Tech Information Technology Co ltd
Priority claimed from CN202111372441.0A
Publication of CN114155290A
Application granted
Publication of CN114155290B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/40 Correcting position, velocity or attitude
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G01S 19/43 Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses a system and a method for large-field-of-view high-speed motion measurement, comprising: collecting flight images of an unmanned aerial vehicle in a large field of view with a binocular high-speed camera; acquiring real-time accurate position information of the unmanned aerial vehicle during its large-field-of-view flight; constructing a data conversion model from the flight images and the real-time accurate position information to obtain calibration parameters; and measuring a moving target in the large field of view based on the calibration parameters to obtain its motion parameters. By obtaining the accurate position information and flight images of the unmanned aerial vehicle throughout the large field of view, the calibration process is simple, fast, and efficient, and covers different heights of the large-field-of-view space; the subsequent measurement of moving targets based on the calibration parameters therefore spans the whole large-field-of-view space and is likewise fast and efficient.

Description

System and method for large-field-of-view high-speed motion measurement
Technical Field
The invention belongs to the field of binocular camera calibration, and particularly relates to a system and a method for measuring large-field high-speed motion.
Background
Non-contact measurement is the mainstream trend for measuring static or dynamic parameters of objects, and binocular camera calibration is the basis and premise of current optical measurement: its aim is to recover the mapping between the same object or position in the real world and in the computer image. The existing method for motion measurement over a large field of view mainly relies on a total station for calibration; after the calibration parameters are obtained, the motion of a target is measured according to the parameter equations. In that calibration method, an object with a distinctive feature is moved up and down, back and forth, and left and right within the field of view, the total station measures the centre of a fixed feature point of each stationary placement (for example, a black-and-white target mark 1 m in diameter), and the relation between the real world and the pixel coordinate system is then recovered. However, this method is only suitable for calibration within a limited height range above the ground: in a field environment it is difficult to place objects above 3 m and very difficult to erect them above 10 m, and wind makes the objects hard to hold still for measurement. In terms of efficiency, total-station calibration is also time-consuming, so measuring moving targets after such calibration suffers from long duration, low efficiency, and difficulty in covering moving targets across the whole field of view.
Disclosure of Invention
To address the problems in the prior art, the invention provides a system and a method for large-field-of-view high-speed motion measurement: calibration parameters are obtained from the coordinates and images of an unmanned aerial vehicle flying in the large field of view, and the parameters of a moving target are then measured from the calibration parameters. The calibration process is simple, fast, and efficient, and moving targets anywhere in the field of view can be detected.
A first aspect of an embodiment of the present invention provides an RTK unmanned aerial vehicle calibration system for large-field-of-view motion measurement, where the system includes:
the satellite positioning base station comprises a satellite positioning unit and a wireless communication unit, wherein the satellite positioning unit is used for acquiring satellite positioning information of the satellite positioning base station, and the wireless communication unit is used for combining the satellite positioning information and preset base station position information into positioning correction information and sending the positioning correction information to the unmanned aerial vehicle;
the unmanned aerial vehicle comprises an aircraft and an RTK unmanned aerial vehicle module, wherein the RTK unmanned aerial vehicle module is used for receiving positioning correction information sent by the satellite positioning base station, calculating real-time accurate position information of the unmanned aerial vehicle based on the positioning correction information and sending the real-time accurate position information to the central processing station;
the high-speed camera comprises an image acquisition unit and a time service trigger unit, wherein the time service trigger unit receives time service information of the positioning satellite and starts the image acquisition unit to acquire an image of the RTK unmanned aerial vehicle module;
and the central processing station is in communication connection with the high-speed camera and the RTK unmanned aerial vehicle module, and is used for receiving the image of the unmanned aerial vehicle and the real-time accurate position information of the unmanned aerial vehicle in real time, and constructing a data conversion model based on the received data to obtain the calibration parameters.
A second aspect of embodiments of the present invention provides a method for large-field high-speed motion measurement, the method comprising:
acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
acquiring real-time accurate position information of the unmanned aerial vehicle in a large-field-of-view flight process;
constructing a data conversion model based on the flight images and real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and measuring the moving target in the large view field based on the calibration parameters to obtain the moving parameters of the moving target.
As a further optimization of the above scheme, the method for planning the flight trajectory of the unmanned aerial vehicle in the large field of view comprises: dividing the whole field-of-view space by height into at least one height layer; within one layer the unmanned aerial vehicle flies an irregular pattern in a single plane, and once that plane is covered it moves into the adjacent layer, until the flight trajectory spans the whole field-of-view space.
As a further optimization of the above scheme, the flight trajectories of the unmanned aerial vehicle on two adjacent altitude space planes are respectively:
Formula (1) and formula (2) [the two trajectory functions are rendered only as images in the original publication]
wherein W represents the width of the field space, L represents the length of the field space, and a represents a constant; the flight tracks on the two adjacent height space planes are connected through two line segments which are perpendicular to each other and intersect on the x axis, one of the two line segments is perpendicular to the x-z plane, and the other line segment is perpendicular to the x-y plane.
As a further optimization of the above scheme, the real-time accurate position information obtaining process of the unmanned aerial vehicle includes:
acquiring satellite positioning information of satellite positioning base stations of a plurality of time nodes, wherein the satellite positioning information comprises positioning information of at least 4 satellites;
obtaining a plurality of position offset differences based on the satellite positioning information and preset position information of the satellite positioning base station;
carrying out offset correction on the satellite positioning information of the same unmanned aerial vehicle based on the plurality of position offset differences, and outputting a plurality of corrected positions of the unmanned aerial vehicle;
calculating the positioning error value of the plurality of corrected positions, and judging whether the positioning error value meets the preset precision;
and if the positioning error value of the corrected positions meets the preset precision, selecting the intermediate value of the corrected positions as the real-time accurate position information of the unmanned aerial vehicle.
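The correction-and-selection steps above can be sketched as follows. This is a minimal illustrative sketch in Python, not the patent's implementation: real RTK operates on carrier-phase observables rather than finished position fixes, the error check is reduced here to a per-axis spread test, and `max_error` is an assumed tolerance.

```python
import statistics

def rtk_correct(base_fixes, base_truth, drone_fixes, max_error=0.05):
    """Correct raw drone GNSS fixes using base-station offsets.

    base_fixes  : list of (x, y, z) satellite-derived base-station positions
    base_truth  : (x, y, z) preset (surveyed) base-station position
    drone_fixes : list of (x, y, z) raw drone positions, same time nodes
    Returns the per-axis median of the corrected positions, or None if the
    spread of corrected positions exceeds max_error (metres).
    """
    # One position-offset difference per time node: truth minus observed.
    offsets = [tuple(t - o for t, o in zip(base_truth, fix)) for fix in base_fixes]
    # Apply each node's offset to the drone fix from the same node.
    corrected = [tuple(d + off for d, off in zip(fix, offs))
                 for fix, offs in zip(drone_fixes, offsets)]
    # Positioning-error value: maximum per-axis spread of corrected coordinates.
    spread = max(max(c[i] for c in corrected) - min(c[i] for c in corrected)
                 for i in range(3))
    if spread > max_error:
        return None  # preset precision not met
    # Intermediate (median) value per axis as the real-time accurate position.
    return tuple(statistics.median(c[i] for c in corrected) for i in range(3))
```

With consistent base-station offsets, the corrected drone fixes collapse onto one position and the median is returned; inconsistent offsets fail the precision gate.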
As a further optimization of the above scheme, the calibration parameters are specifically obtained as follows:
acquiring real-time accurate position information and positioning time of the unmanned aerial vehicle, and flight images and image acquisition time of the unmanned aerial vehicle;
matching the real-time accurate position of the unmanned aerial vehicle with the same time node with the flight image based on the positioning time and the image acquisition time to form a model data set;
acquiring pixel position information of the unmanned aerial vehicle in the flight image in the model data set;
converting real-time accurate position information of the unmanned aerial vehicle into position information of the same type as the pixel position information of the unmanned aerial vehicle;
constructing a model function based on the converted real-time accurate position information of the unmanned aerial vehicle and the pixel position information of the unmanned aerial vehicle;
and obtaining calibration parameters based on the model function.
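The patent does not give the model function in closed form. As an illustrative stand-in, the sketch below fits a single 3x4 projection matrix by direct linear transformation (DLT) from matched world/pixel pairs; in the binocular case the same fit would be run once per camera. The DLT choice and all names are assumptions, not the patent's stated method.

```python
import numpy as np

def dlt_calibrate(world_pts, pixel_pts):
    """Estimate a 3x4 projection matrix P (the calibration parameters)
    from >= 6 matched world/pixel pairs by direct linear transformation."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, Xw):
    """Apply the fitted model: world point -> pixel coordinates."""
    x = P @ np.append(Xw, 1.0)
    return x[:2] / x[2]
```

Given exact correspondences from non-coplanar points, the recovered matrix reproduces the pixel positions up to scale of P.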
As a further optimization of the above scheme, the specific process of matching the real-time accurate position of the unmanned aerial vehicle with the flight image is as follows:
acquiring a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring an acquisition time node of each unmanned aerial vehicle flying image based on a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
obtaining a positioning time node of real-time accurate position information of each unmanned aerial vehicle based on a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
the method comprises the steps of comparing a collecting time node of a flight image with a positioning time node of real-time accurate position information to obtain the same time node, and matching the real-time accurate position of the unmanned aerial vehicle corresponding to the time node with the unmanned aerial vehicle image to form a model data set.
As a further optimization of the scheme, the starting time node of the binocular high-speed camera is the same as the takeoff time node of the unmanned aerial vehicle, and the preset acquisition frequency of the binocular high-speed camera is an integral multiple of the preset positioning frequency of the real-time accurate position information of the unmanned aerial vehicle.
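Assuming, as the scheme states, a shared start node and an acquisition frequency that is an integer multiple of the positioning frequency, the pairing of frames with position fixes can be sketched as follows (function and parameter names are illustrative):

```python
def match_time_nodes(cam_start, cam_freq, n_frames, uav_start, uav_freq, n_fixes):
    """Pair image frames with position fixes that share a time node.

    cam_freq (Hz) is assumed to be an integer multiple of uav_freq, and the
    camera start node equals the drone take-off node, so every fix time
    coincides with some frame time. Returns (frame index, fix index) pairs.
    """
    # Acquisition time node of each flight image, keyed for exact lookup.
    frame_times = {round(cam_start + i / cam_freq, 9): i for i in range(n_frames)}
    pairs = []
    for j in range(n_fixes):
        # Positioning time node of each real-time accurate position fix.
        t = round(uav_start + j / uav_freq, 9)
        if t in frame_times:
            pairs.append((frame_times[t], j))
    return pairs
```

For a 100 Hz camera and a 10 Hz positioning stream started together, every tenth frame is matched to a fix, which is exactly the model data set described above.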
As a further optimization of the above scheme, the method for acquiring the pixel position information of the unmanned aerial vehicle in the image in the model data set comprises the following steps:
acquiring the first ten images captured after the binocular high-speed camera starts;
segmenting a moving target in the image based on the dynamic change of the first ten images;
calculating pixel gray scale characteristics of the divided areas where the moving targets are located, calculating the similarity between the gray scale characteristics of different moving targets and preset unmanned aerial vehicle gray scale characteristics based on the gray scale characteristics of the moving targets, and selecting the unmanned aerial vehicle targets;
calculating the number of pixels occupied by the unmanned aerial vehicle image based on the selected unmanned aerial vehicle target, if the number of the pixels occupied by the unmanned aerial vehicle is smaller than a standard value, calculating histograms of color channels of the unmanned aerial vehicle target in the images shot by the two groups of high-speed cameras, and matching pixel position information of the unmanned aerial vehicle through the histograms;
if the number of pixels occupied by the unmanned aerial vehicle is larger than the standard value, acquiring corner points of the unmanned aerial vehicle target, calculating the contact ratio of the acquired corner points in the images shot by the two groups of high-speed cameras, and selecting corner point position information with the contact ratio larger than a threshold value as unmanned aerial vehicle pixel position information.
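A much-simplified, single-target sketch of the segmentation and selection steps above: frame differencing to segment motion, then a grey-level similarity gate against a preset drone grey feature. The histogram and corner-point branches are omitted, and the thresholds and grey "template" are assumed placeholders rather than the patent's values.

```python
import numpy as np

def locate_drone(frames, template_gray, diff_thresh=25, sim_thresh=0.9):
    """Segment moving pixels across the first frames, gate on grey similarity,
    and return the centroid (row, col) of the moving region, or None.

    frames        : list of 2-D uint8 grey images (e.g. the first ten frames)
    template_gray : preset expected mean grey level of the drone target
    """
    stack = np.stack([f.astype(np.int16) for f in frames])
    # Pixels whose value changes across the frames are treated as moving.
    motion = (stack.max(axis=0) - stack.min(axis=0)) > diff_thresh
    if not motion.any():
        return None
    ys, xs = np.nonzero(motion)
    # Crude grey feature: mean grey of the moving pixels in the last frame.
    mean_gray = frames[-1][ys, xs].mean()
    similarity = 1.0 - abs(mean_gray - template_gray) / 255.0
    if similarity < sim_thresh:
        return None  # moving region does not look like the drone
    return (float(ys.mean()), float(xs.mean()))  # pixel position of the drone
```

A production version would label connected components and score each separately; this sketch assumes a single moving target in view.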
A third aspect of embodiments of the present invention provides a system for large field of view high speed motion measurement, the system comprising:
the image acquisition module is used for acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
the positioning module is used for acquiring real-time accurate position information of the unmanned aerial vehicle in the large-field-of-view flight process;
the calibration module is used for constructing a data conversion model based on the flight images and real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and the moving target measuring module is used for measuring the moving target in the large visual field based on the calibration parameters to obtain the moving parameters of the moving target.
The system and the method for large-field-of-view high-speed motion measurement have the following beneficial effects:
1. The invention acquires real-time accurate position information of the unmanned aerial vehicle during its large-field-of-view flight through a satellite positioning system, so that position information at different heights of the large field of view can be collected; the whole acquisition process is fast and efficient and can cover the entire large-field-of-view space.
2. Through time matching, the real-time accurate position information of the unmanned aerial vehicle at a given time node is paired with the flight image collected by the binocular high-speed camera at the same node, a data conversion model is constructed, and the calibration parameters are computed; the motion parameters of a moving target in the large-field-of-view space are then measured from the calibration parameters. This realizes measurement of moving targets at different heights across the large field of view, while keeping both calibration and measurement fast and efficient.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is an overall frame diagram of an RTK unmanned aerial vehicle calibration system for large field of view motion measurement;
FIG. 2 is a general flow chart of a method for large field of view motion measurement of the present invention;
FIG. 3 is a flight trajectory diagram of the drone;
FIG. 4 is a diagram of the result verification system framework of the present invention;
fig. 5 is a statistical chart of the results of the experiment.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
The embodiment of the invention provides an RTK unmanned aerial vehicle calibration system for large-field-of-view motion measurement, which comprises:
the satellite positioning base station comprises a satellite positioning unit and a wireless communication unit, wherein the satellite positioning unit is used for acquiring satellite positioning information of the satellite positioning base station, and the wireless communication unit is used for combining the satellite positioning information and preset base station position information into positioning correction information and sending the positioning correction information to the unmanned aerial vehicle;
the unmanned aerial vehicle comprises an aircraft and an RTK unmanned aerial vehicle module, wherein the RTK unmanned aerial vehicle module is used for receiving positioning correction information sent by the satellite positioning base station, calculating real-time accurate position information of the unmanned aerial vehicle based on the positioning correction information and sending the real-time accurate position information to the central processing station;
the high-speed camera comprises an image acquisition unit and a time service trigger unit, wherein the time service trigger unit receives time service information of the positioning satellite and starts the image acquisition unit to acquire an image of the RTK unmanned aerial vehicle module;
and the central processing station is in communication connection with the high-speed camera and the RTK unmanned aerial vehicle module, and is used for receiving the image of the unmanned aerial vehicle and the real-time accurate position information of the unmanned aerial vehicle in real time, and constructing a data conversion model based on the received data to obtain the calibration parameters.
Referring to fig. 1, in this embodiment, the unmanned aerial vehicle flies in a large view field, and then the position of the unmanned aerial vehicle in the real world is obtained by satellite positioning, so that corresponding coordinates of each position of the large view field can be obtained more flexibly, different heights of the large view field are covered, and meanwhile, by obtaining real-time coordinates of the unmanned aerial vehicle in the motion process of the large view field, the motion measurement of the large view field is realized, and the calibration process is faster and more efficient.
Specifically, a base station position is first preset and the satellite positioning base station is placed there. The satellite positioning base station acquires its own satellite positioning information through a built-in satellite positioning unit, combines it with the preset base station position information into positioning correction information, and sends this to the unmanned aerial vehicle through the wireless communication unit. The unmanned aerial vehicle carries an aircraft body and an RTK module wirelessly connected to the satellite positioning base station; after receiving the positioning correction information, the RTK module corrects the real-time satellite positioning information of the unmanned aerial vehicle in flight, calculates its real-time accurate position, and sends it to the central processing station. Meanwhile, the high-speed camera collects images of the unmanned aerial vehicle in flight and transmits them to the central processing station over a wired link; the central processing station jointly analyses the images and the real-time accurate position information and computes the calibration parameters.
Based on above-mentioned a RTK unmanned aerial vehicle calibration system for big visual field motion measurement, above-mentioned RTK unmanned aerial vehicle module includes:
the output port of the first data transmission unit is connected with the positioning correction unit, and the first data transmission unit is used for receiving the positioning correction information sent by the satellite positioning base station and transmitting the positioning correction information to the positioning correction unit;
the positioning correction unit is used for acquiring real-time satellite positioning information of the unmanned aerial vehicle during movement and calculating real-time accurate position information of the unmanned aerial vehicle by combining the positioning correction information;
and an input port of the second data transmission unit is connected with the positioning correction unit and used for sending the real-time accurate position information and the positioning time of the unmanned aerial vehicle to the central processing station.
Further, the RTK module installed in the unmanned aerial vehicle makes it possible to acquire the accurate position of the unmanned aerial vehicle in real time while it moves. Specifically, the RTK module comprises a first data transmission unit, a positioning correction unit, and a second data transmission unit. The unmanned aerial vehicle is connected to the satellite positioning base station through the first data transmission unit, whose output port feeds the positioning correction unit; when the first data transmission unit receives positioning correction information from the base station, it passes it on to the positioning correction unit. The positioning correction unit also receives positioning information and positioning time from the positioning satellites; on receiving the satellite positioning information of the unmanned aerial vehicle, it corrects it with the positioning correction information and calculates the real-time accurate position. The positioning correction unit is further connected to the input port of the second data transmission unit, to which it passes the computed position and the positioning time; the second data transmission unit sends these to the central processing station, so the accurate position of the unmanned aerial vehicle is available in real time.
The embodiment of the invention provides a method for measuring large-field high-speed motion, which comprises the following steps:
acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
acquiring real-time accurate position information of the unmanned aerial vehicle in a large-field-of-view flight process;
constructing a data conversion model based on the flight images and real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and measuring the moving target in the large view field based on the calibration parameters to obtain the moving parameters of the moving target.
Referring to fig. 2, in this embodiment the unmanned aerial vehicle flies in the large field of view while satellite positioning provides its real-time accurate position, so the corresponding coordinates of every position in the large field of view can be acquired flexibly, covering its different heights. Calibrating the large field of view from the real-time accurate positions of the unmanned aerial vehicle during its flight makes the calibration process fast and efficient, and once calibration is complete, the motion parameters of moving targets in the large field of view can be measured.
Specifically, the flight images of the unmanned aerial vehicle in the large field of view are captured by the binocular high-speed camera, while its real-time accurate positions during flight are acquired through the satellite positioning system; the flight trajectory covers the whole large field of view, so positioning information for every position in it can be obtained quickly. Matching the captured flight images against the real-time accurate positions yields a data conversion model between the two, which is solved for the required calibration parameters. The binocular camera then captures motion images of any moving target in the large field of view; the target's positions are recovered from the calibration parameters, and its motion parameters follow from how those positions change over time.
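Once each camera's projection matrix is calibrated, the measurement step reduces to triangulating the target from its binocular pixel pair and differencing positions over time. A minimal linear-triangulation sketch follows; the two 3x4 matrices are assumed to be per-camera calibration results, and the names are illustrative.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation of one world point from a binocular pixel pair,
    given the two cameras' 3x4 projection matrices."""
    A = np.array([uv1[0] * P1[2] - P1[0],
                  uv1[1] * P1[2] - P1[1],
                  uv2[0] * P2[2] - P2[0],
                  uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)      # homogeneous least-squares solution
    X = Vt[-1]
    return X[:3] / X[3]              # dehomogenize to (x, y, z)

def speed(p_a, p_b, dt):
    """Motion parameter: mean speed between two triangulated positions."""
    return float(np.linalg.norm(np.asarray(p_b) - np.asarray(p_a)) / dt)
```

Repeating the triangulation per matched frame pair gives a position track, from which speed (and, by further differencing, acceleration) can be read off.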
Based on the above method, the planning of the flight trajectory of the unmanned aerial vehicle in the large field of view comprises: dividing the whole field-of-view space by height into at least one height layer; within one layer the unmanned aerial vehicle flies an irregular pattern in a single plane, and once that plane is covered it moves into the adjacent layer, until the flight trajectory spans the whole field-of-view space.
Referring to fig. 3, the large-field-of-view space is a cuboid, and the flight trajectory of the unmanned aerial vehicle inside it is irregular. The trajectory is divided as follows: the space is split into several height layers by planes parallel to the horizontal plane, with equal height differences between adjacent layers. A take-off layer is chosen; after start-up, the unmanned aerial vehicle begins to fly an irregular track in that layer's plane, starting from a corner point, and after covering the complete plane it flies into the layer adjacent to its current one, until its trajectory covers the whole large-field-of-view space. Flying in this way, the unmanned aerial vehicle finishes in the shortest time while its trajectory spans the entire field-of-view space.
Based on the method, the flight trajectories of the unmanned aerial vehicle on the two adjacent height space planes are respectively as follows:
[Formulas (1) and (2), rendered as images in the original: sinusoidal trajectory functions of x with parameters W, L, and a.]
wherein W represents the width of the field-of-view space, L represents its length, and a is a constant. The flight trajectories in two adjacent height-space planes are connected by two mutually perpendicular line segments intersecting on the x axis, one perpendicular to the x-z plane and the other perpendicular to the x-y plane.
Specifically, the flight trajectories of the unmanned aerial vehicle in two adjacent height-space planes can be represented by sine functions. The trajectory function in the first height-space plane is:

[Formula (1), rendered as an image in the original.]

The trajectory function of the drone in the adjacent second height-space plane is:

[Formula (2), rendered as an image in the original.]
The trajectory functions of the unmanned aerial vehicle in successive height-space planes alternate between formula (1) and formula (2).
Furthermore, the end point of the flight trajectory in one height-space plane is connected to the start point of the trajectory in the adjacent next height-space plane by two mutually perpendicular line segments intersecting on the x axis, one perpendicular to the x-z plane and the other perpendicular to the x-y plane.
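The serpentine plan above can be sketched in code. A minimal waypoint generator, assuming equal height steps as described and an in-plane trajectory of the form y(x) = (W/2)(1 + sin(ax)) mirrored on alternating planes; the exact trajectory formulas are rendered as images in the original, so this form and all names are illustrative assumptions:

```python
import math

def flight_waypoints(L, W, H, n_planes, a, n_samples=100):
    """Serpentine calibration path: a sinusoidal sweep in each horizontal
    plane, planes stacked with equal height differences, the x-direction
    reversed on alternate planes so successive trajectories join up.

    Assumed trajectory: y(x) = W/2 * (1 + sin(a*x)) on odd-numbered planes
    and its mirror on even-numbered planes.
    """
    dz = H / (n_planes + 1)          # equal height difference between planes
    waypoints = []
    for k in range(n_planes):
        z = dz * (k + 1)
        xs = [L * i / (n_samples - 1) for i in range(n_samples)]
        if k % 2 == 1:               # reverse direction on alternate planes
            xs.reverse()
        for x in xs:
            s = math.sin(a * x)
            y = W / 2 * (1 + (s if k % 2 == 0 else -s))
            waypoints.append((x, y, z))
    return waypoints

pts = flight_waypoints(L=100.0, W=50.0, H=30.0, n_planes=3, a=0.2)
print(len(pts))
```

With three planes and height H this reproduces the H/4, H/2, 3H/4 plane heights of the example below.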
For example, referring to fig. 2, the large-field-of-view space is a cuboid of length L, width W and height H. Coordinates are established with a long side as the X axis, a short side as the Y axis and the spatial height as the Z axis, and the space is divided into three height spaces with a height difference of H/4 between adjacent planes. The takeoff height is chosen as H/4: the unmanned aerial vehicle starts from point Q and flies in the z = H/4 plane along the trajectory of formula (1). After finishing that plane, it enters the next adjacent height space through line segments S1 and S2 and flies in the z = H/2 plane along the trajectory of formula (2).
Based on the method, the real-time accurate position information acquisition process of the unmanned aerial vehicle comprises the following steps:
acquiring satellite positioning information of satellite positioning base stations of a plurality of time nodes, wherein the satellite positioning information comprises positioning information of at least 4 satellites;
obtaining a plurality of position offset differences based on the satellite positioning information and preset position information of the satellite positioning base station;
performing offset correction on the satellite positioning information of the same unmanned aerial vehicle based on the plurality of position offset differences, and outputting a plurality of corrected position results;
calculating the positioning error of the plurality of corrected position results, and judging whether the positioning error meets a preset precision;
and if the positioning error of the corrected position information meets the preset precision, selecting the intermediate value of the corrected position information as the real-time accurate position information of the unmanned aerial vehicle.
In this embodiment, the satellite positioning information carries a certain error, so a position offset difference obtained through the satellite positioning base station is used to offset-correct the unmanned aerial vehicle's satellite positioning information and obtain its accurate position. First, the satellite positioning information of the base station and its preset (known) position are received, the satellite positioning information containing the fixes of at least 4 satellites. The position offset difference between the base station's satellite fix and its preset position is then computed, and this acquisition is repeated at several different time nodes. The resulting offset differences are applied to the satellite positioning information of the same unmanned aerial vehicle, yielding several corrected positions. If the difference between these corrected positions is less than 1 cm, the preset precision is met, and the intermediate value of the corrected positions is selected as the real-time accurate position information of the unmanned aerial vehicle.
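A minimal sketch of the differential-correction step described above, treating fixes as 3-D vectors. The function name, the one-to-one pairing of base-station and drone fixes per time node, the per-axis spread test, and the per-axis median are illustrative assumptions, not the patent's exact algorithm:

```python
import statistics

def correct_drone_fixes(base_fixes, base_truth, drone_fixes, tol=0.01):
    """Apply the offset between each base-station fix and the base station's
    surveyed position to the drone fix from the same time node; if the spread
    of the corrected positions is below `tol` (1 cm in the embodiment),
    return the per-axis median as the drone's accurate position."""
    corrected = []
    for base, fix in zip(base_fixes, drone_fixes):
        offset = tuple(b - t for b, t in zip(base, base_truth))  # measured bias
        corrected.append(tuple(f - o for f, o in zip(fix, offset)))
    # positioning error: largest per-axis spread across corrected positions
    spread = max(max(c[i] for c in corrected) - min(c[i] for c in corrected)
                 for i in range(3))
    if spread >= tol:
        return None                      # precision not met: keep collecting
    return tuple(statistics.median(c[i] for c in corrected) for i in range(3))
```

Returning `None` when the spread exceeds the tolerance mirrors the precision check in the step list above.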
Based on the method, the calibration parameters are obtained in the following specific steps:
acquiring real-time accurate position information and positioning time of the unmanned aerial vehicle, and flight images and image acquisition time of the unmanned aerial vehicle;
matching the real-time accurate position of the unmanned aerial vehicle with the same time node with the flight image based on the positioning time and the image acquisition time to form a model data set;
acquiring pixel position information of the unmanned aerial vehicle in the flight image in the model data set;
converting real-time accurate position information of the unmanned aerial vehicle into position information of the same type as the pixel position information of the unmanned aerial vehicle;
constructing a model function based on the converted real-time accurate position information of the unmanned aerial vehicle and the pixel position information of the unmanned aerial vehicle;
and obtaining calibration parameters based on the model function.
In this embodiment, the calibration parameters are computed by analysing the mapping between the unmanned aerial vehicle's accurate position information and its pixel position in the flight images. Because the unmanned aerial vehicle moves continuously, the central processing module must first match the acquired real-time accurate position information with the flight images; preferably, data sharing the same time node (according to positioning time and acquisition time) are matched into model data sets, each containing exactly one real-time accurate position record and one flight image. The pixel position of the unmanned aerial vehicle in each flight image is then extracted for joint analysis with the accurate position information. Since the two are not in the same coordinate system, the real-time accurate position information must first be converted; after conversion, the function model relating them can be constructed, and solving this model yields the calibration parameters.
Based on the calibration-parameter acquisition process, the real-time accurate positions of the unmanned aerial vehicle are matched with the flight images as follows:
acquiring a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring an acquisition time node of each unmanned aerial vehicle flying image based on a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
obtaining a positioning time node of real-time accurate position information of each unmanned aerial vehicle based on a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
and comparing the acquisition time nodes of the flight images with the positioning time nodes of the real-time accurate position information to find identical time nodes, then matching the real-time accurate position and the flight image corresponding to each such time node into a model data set.
Furthermore, because the unmanned aerial vehicle is in motion, its real-time accurate position information and its images do not correspond exactly, so the two must be matched. In this embodiment, the acquisition time nodes and positioning time nodes are derived from the preset acquisition frequency of the high-speed camera, the preset positioning frequency of the real-time accurate position information, and the time service signal sent by the positioning satellite; time matching then produces mutually corresponding model data sets.
The starting time node of the binocular high-speed camera is the same as the takeoff time node of the unmanned aerial vehicle, and the preset acquisition frequency of the binocular high-speed camera is integral multiple of the preset positioning frequency of the real-time accurate position information of the unmanned aerial vehicle.
Specifically, the binocular high-speed camera and the unmanned aerial vehicle are started simultaneously. The preset acquisition frequency of the binocular high-speed camera is preferably 100 frames per second and the preset positioning frequency of the real-time accurate position information 10 fixes per second, so a shared time node occurs every fixed number of milliseconds (every 100 ms here); the real-time accurate position information and the unmanned aerial vehicle image corresponding to that node can then be matched with each other.
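The time-node matching at these two rates can be sketched as follows. Integer-microsecond timestamps (to avoid float comparison issues) and the function signature are assumptions for illustration:

```python
def match_time_nodes(t_cam_start, f_cam, n_frames, t_fix_start, f_fix, n_fixes):
    """Pair camera frames with positioning fixes that share a time node.
    With a 100 Hz frame rate, a 10 Hz fix rate (an integer ratio, as the
    description requires) and a common start instant, every 10th frame
    lines up with a fix. Times are in microseconds."""
    cam_times = {round(t_cam_start + i * 1e6 / f_cam): i for i in range(n_frames)}
    pairs = []
    for j in range(n_fixes):
        t = round(t_fix_start + j * 1e6 / f_fix)
        if t in cam_times:
            pairs.append((cam_times[t], j))   # (frame index, fix index)
    return pairs
```

Starting both devices at t = 0 with 100 frames and 10 fixes per second yields one matched pair per fix, every 10th frame.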
Based on the specific calculation process of the calibration parameters, the method for acquiring the pixel position information of the unmanned aerial vehicle in the image in the model data set comprises the following steps:
acquiring the first ten images shot by the binocular high-speed camera after start-up;
segmenting a moving target in the image based on the dynamic change of the first ten images;
calculating pixel gray scale characteristics of the divided areas where the moving targets are located, calculating the similarity between the gray scale characteristics of different moving targets and preset unmanned aerial vehicle gray scale characteristics based on the gray scale characteristics of the moving targets, and selecting the unmanned aerial vehicle targets;
calculating the number of pixels occupied by the unmanned aerial vehicle based on the selected unmanned aerial vehicle target; if the number is smaller than a standard value, calculating the color-channel histograms of the unmanned aerial vehicle target in the images shot by the two high-speed cameras, and matching the unmanned aerial vehicle's pixel position information through the histograms;
if the number of pixels is larger than the standard value, acquiring corner points of the unmanned aerial vehicle target, calculating the coincidence degree of the acquired corner points between the images shot by the two high-speed cameras, and selecting the corner-point positions whose coincidence degree exceeds a threshold as the unmanned aerial vehicle's pixel position information.
In this embodiment, the moving targets are first extracted from the first ten images shot by the binocular high-speed camera. Because other moving targets in the field of view may interfere with the experiment, the extracted targets must be screened: the grey-level features of the different moving targets are computed, and their similarity to the preset unmanned aerial vehicle grey-level feature is used to select the unmanned aerial vehicle target. The pixel position of the unmanned aerial vehicle in the image is then determined from the selected target. To handle the case where the unmanned aerial vehicle's image is too small to judge, the number of its pixels is counted first. If it is below a standard value (preferably 30 pixels), the color-channel histograms of the target in the two cameras' images are computed and the pixel position is matched through the histograms; if it is above the standard value, corner points of the target are acquired, their coincidence degree between the two cameras' images is computed, and corner positions with coincidence degree above a threshold (preferably 0.8) are selected as the unmanned aerial vehicle's pixel position information.
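A simplified sketch of one detection step, assuming NumPy arrays, a single moving object per frame pair, and a single mean-grey-level feature in place of the full grey-level features; thresholds and all names are illustrative assumptions:

```python
import numpy as np

def locate_moving_target(prev, curr, template_gray, diff_thresh=25, sim_tol=40):
    """Frame differencing segments the moving region, its mean grey level is
    compared with a preset drone grey-level feature, and the region's
    centroid and pixel count are returned (the count decides histogram vs.
    corner matching downstream). Returns None when nothing moves or the
    region does not resemble the drone."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    ys, xs = np.nonzero(diff > diff_thresh)          # moving pixels
    if ys.size == 0:
        return None
    mean_gray = float(curr[ys, xs].mean())
    if abs(mean_gray - template_gray) > sim_tol:     # not the drone
        return None
    centroid = (float(xs.mean()), float(ys.mean()))
    return centroid, int(ys.size)                    # count < 30 → histogram match
```

In practice one detection would run per camera of the binocular pair, and the returned count would route the result to histogram or corner matching as described above.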
In the prior art, target tracking is preferably done by an optical-flow method, but the displacement between consecutive frames in parts of the video sequence is large, and sparse optical flow easily loses the target when the detection time is kept under 0.5 s. In view of this, the method performs difference calculations on the first ten consecutive frames from the binocular high-speed camera to screen out the moving-object regions, identifies the unmanned aerial vehicle target by the moving targets' feature values, verifies the accuracy of the target region with the optical-flow method to obtain an accurate unmanned aerial vehicle target, and selects the centroid of the target region as the unmanned aerial vehicle's pixel coordinates.
Based on the specific calculation process of the calibration parameters, the analysis process of the model function includes:
acquiring an unmanned aerial vehicle world coordinate P obtained by converting real-time accurate position information of the unmanned aerial vehicle;
acquiring a coordinate p corresponding to pixel position information of the unmanned aerial vehicle;
constructing the projection matrix of the camera based on the unmanned aerial vehicle world coordinates P, and normalizing the last element of the projection matrix to obtain the camera parameter matrix;
constructing a model function based on an imaging model of a camera, and constructing a parameter equation by taking a coordinate p corresponding to pixel position information of the unmanned aerial vehicle as a value of the function;
and solving a parameter equation to obtain a calibration parameter.
In the present embodiment, according to the camera imaging principle, a mapping between real-world coordinate points and the image coordinate points shot by the camera can be constructed. The real-time accurate position information of the unmanned aerial vehicle is therefore acquired first and converted into world coordinates P(X, Y, Z), from which the projection matrix of the camera is obtained, with the correspondence

s[u, v, 1]^T = K[R3×3 | T3×1][X, Y, Z, 1]^T = M3×4[X, Y, Z, 1]^T,

wherein K represents the internal parameters of the camera, R3×3 the rotation matrix of the camera, T3×1 the translation matrix of the camera, and the entries mij of the projection matrix M3×4 the calibration parameters of the camera; calibrating the camera amounts to determining these parameters. The last element m34 of the projection matrix is normalized to 1, so only the remaining 11 parameters of the camera need to be determined. Taking the coordinates p(u, v) corresponding to the pixel position information of the unmanned aerial vehicle as the function value, a parameter equation is constructed based on the imaging model (a pinhole imaging model):

u = (m11·X + m12·Y + m13·Z + m14)/(m31·X + m32·Y + m33·Z + 1)
v = (m21·X + m22·Y + m23·Z + m24)/(m31·X + m32·Y + m33·Z + 1)

Each world coordinate yields two equations of this form, so all camera parameters can be obtained from 6 world coordinates; the more world coordinates are used, the more accurate the parameters.
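The 11-parameter solution can be sketched as a standard direct linear transform (DLT). The least-squares formulation for more than 6 points is an assumption consistent with "the more world coordinates, the more accurate"; function and variable names are illustrative:

```python
import numpy as np

def calibrate_dlt(world_pts, pixel_pts):
    """With m34 normalized to 1, each (world, pixel) pair (X,Y,Z) <-> (u,v)
    contributes two linear equations in the remaining 11 projection-matrix
    entries; at least 6 points determine the camera, and extra points are
    absorbed by least squares. Returns the 3x4 projection matrix."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
    m, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.vstack([m[0:4], m[4:8], np.append(m[8:11], 1.0)])
```

Feeding in matched (world coordinate, pixel coordinate) pairs from the model data sets yields the projection matrix whose entries are the calibration parameters.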
The embodiment of the invention provides a system for measuring large-field high-speed motion, which comprises:
the image acquisition module is used for acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
the positioning module is used for acquiring real-time accurate position information of the unmanned aerial vehicle in the large-field-of-view flight process;
the calibration module is used for constructing a data conversion model based on the flight images and real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and the moving target measuring module is used for measuring the moving target in the large visual field based on the calibration parameters to obtain the moving parameters of the moving target.
Referring to fig. 4 and 5, in this embodiment a total station and a target are used, together with the calibration parameters obtained with the unmanned aerial vehicle, to verify the calibration precision of the system, and the calibration error is computed to judge the system's feasibility. Specifically, with the position of the binocular high-speed camera kept unchanged, the target is placed in the field of view, the coordinates of the target at measurement points of different heights are measured, and the pixel coordinates pix_l,i and pix_r,i, i = 0, 1, …, n−1, of the different measurement points shot by the binocular high-speed camera are recorded, where pix_l,i denotes the pixel coordinates shot by the left camera, pix_r,i those shot by the right camera, and n is the number of measurement points. The three-dimensional space coordinates W_i of the measurement points are solved from the measurement-point pixel coordinates and the camera parameters,

W_i = F(pix_l,i, pix_r,i, K, R, T, distcoeffs),

wherein K represents the internal parameters of the camera, R the rotation matrix of the camera, T the translation matrix of the camera, and distcoeffs the distortion matrix of the camera (the reconstruction function is rendered as an image in the original). Taking the target position at i = 0 as the reference origin, the distances of the other measurement points to it are computed as d_i = ||W_i − W_0||. The coordinates L_i, i = 0, 1, …, n−1, of the measurement points measured by the total station are then recorded, and, taking the i = 0 measurement point as reference origin again, the corresponding distances d̂_i = ||L_i − L_0|| are computed. From d_i and d̂_i, the calibrated reprojection error is computed as e_i = |d_i − d̂_i|.
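A compact sketch of this verification computation; the absolute-difference form of e_i is assumed, since the error formula is rendered as an image in the original, and the function name is illustrative:

```python
import math

def reprojection_errors(cam_pts, station_pts):
    """With the i = 0 target position as reference origin, compare each
    point's camera-derived distance d_i = ||W_i - W_0|| with the
    total-station distance d_hat_i = ||L_i - L_0||; the per-point
    calibration error is e_i = |d_i - d_hat_i|."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    W0, L0 = cam_pts[0], station_pts[0]
    return [abs(dist(Wi, W0) - dist(Li, L0))
            for Wi, Li in zip(cam_pts, station_pts)]
```

Here `cam_pts` would hold the stereo-reconstructed coordinates W_i and `station_pts` the total-station coordinates L_i of the same measurement points.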
Referring to fig. 5, as the number of calibration measurement points increases, the corresponding calibrated reprojection error gradually decreases; when the number of measurement points reaches 60 it stabilizes, with a final reprojection error of 1.41461 pixels, compared with a calibration error of 0.900 pixels for a total station in a small field of view. The calibration method of the system therefore offers good repeatability and convenience, and enables accurate measurement of the motion parameters of moving targets over a large field of view.
The present invention is not limited to the above-described embodiments; various modifications that a person skilled in the art can make from the above conception without creative effort fall within the scope of protection of the present invention.

Claims (10)

1. An RTK unmanned aerial vehicle calibration system for large field of view motion measurement, the system comprising:
the satellite positioning base station comprises a satellite positioning unit and a wireless communication unit, wherein the satellite positioning unit is used for acquiring satellite positioning information of the satellite positioning base station, and the wireless communication unit is used for combining the satellite positioning information and preset base station position information into positioning correction information and sending the positioning correction information to the unmanned aerial vehicle;
the unmanned aerial vehicle comprises an aircraft and an RTK unmanned aerial vehicle module, wherein the RTK unmanned aerial vehicle module is used for receiving positioning correction information sent by the satellite positioning base station, calculating real-time accurate position information of the unmanned aerial vehicle based on the positioning correction information and sending the real-time accurate position information to the central processing station;
the high-speed camera comprises an image acquisition unit and a time service trigger unit, wherein the time service trigger unit receives time service information of the positioning satellite and starts the image acquisition unit to acquire an image of the RTK unmanned aerial vehicle module;
and the central processing station is in communication connection with the high-speed camera and the RTK unmanned aerial vehicle module, and is used for receiving the image of the unmanned aerial vehicle and the real-time accurate position information of the unmanned aerial vehicle in real time, and constructing a data conversion model based on the received data to obtain the calibration parameters.
2. The method for large field of view high speed motion measurement of an RTK unmanned aerial vehicle calibration system for large field of view motion measurement as claimed in claim 1, the method comprising:
acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
acquiring real-time accurate position information of the unmanned aerial vehicle in a large-field-of-view flight process;
constructing a data conversion model based on the flight images and real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and measuring the moving target in the large view field based on the calibration parameters to obtain the moving parameters of the moving target.
3. The method of claim 2, wherein the method for planning the flight path of the unmanned aerial vehicle in the large field of view is: dividing the whole view field space into at least one height space in height, flying the unmanned aerial vehicle in the same height space in an irregular shape on the same plane, flying the unmanned aerial vehicle into the adjacent height space to fly after flying the whole plane until the flying track is distributed in the whole view field space.
4. The method of claim 3, wherein the flight trajectories of the drone on two adjacent altitude-space planes are respectively:
[Formulas (1) and (2), rendered as images in the original: sinusoidal trajectory functions of x with parameters W, L, and a.]
wherein W represents the width of the field-of-view space, L represents its length, and a is a constant; the flight trajectories in two adjacent height-space planes are connected by two mutually perpendicular line segments intersecting on the x axis, wherein one of the two line segments is perpendicular to the x-z plane and the other is perpendicular to the x-y plane.
5. The method of claim 2, wherein the real-time precise position information acquisition process of the drone comprises:
acquiring satellite positioning information of satellite positioning base stations of a plurality of time nodes, wherein the satellite positioning information comprises positioning information of at least 4 satellites;
obtaining a plurality of position offset differences based on the satellite positioning information and preset position information of the satellite positioning base station;
performing offset correction on the satellite positioning information of the same unmanned aerial vehicle based on the plurality of position offset differences, and outputting a plurality of corrected position results;
calculating the positioning error of the plurality of corrected position results, and judging whether the positioning error meets a preset precision;
and if the positioning error of the corrected position information meets the preset precision, selecting the intermediate value of the corrected position information as the real-time accurate position information of the unmanned aerial vehicle.
6. The method according to claim 2, wherein the calibration parameters are obtained specifically as follows:
acquiring real-time accurate position information and positioning time of the unmanned aerial vehicle, and flight images and image acquisition time of the unmanned aerial vehicle;
matching the real-time accurate position of the unmanned aerial vehicle with the same time node with the flight image based on the positioning time and the image acquisition time to form a model data set;
acquiring pixel position information of the unmanned aerial vehicle in the flight image in the model data set;
converting real-time accurate position information of the unmanned aerial vehicle into position information of the same type as the pixel position information of the unmanned aerial vehicle;
constructing a model function based on the converted real-time accurate position information of the unmanned aerial vehicle and the pixel position information of the unmanned aerial vehicle;
and obtaining calibration parameters based on the model function.
7. The method according to claim 6, wherein the real-time precise position and flight image of the drone are matched with each other by the following specific process:
acquiring a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring an acquisition time node of each unmanned aerial vehicle flying image based on a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
obtaining a positioning time node of real-time accurate position information of each unmanned aerial vehicle based on a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
and comparing the acquisition time nodes of the flight images with the positioning time nodes of the real-time accurate position information to obtain identical time nodes, and matching the real-time accurate position and the unmanned aerial vehicle image corresponding to each such time node to form a model data set.
8. The method according to claim 7, wherein a starting time node of the binocular high-speed camera is the same as a takeoff time node of the unmanned aerial vehicle, and the preset acquisition frequency of the binocular high-speed camera is an integral multiple of a preset positioning frequency of real-time accurate position information of the unmanned aerial vehicle.
9. The method of claim 8, wherein the method for obtaining the pixel position information of the drone in the image in the model data set is as follows:
acquiring the first ten images shot by the binocular high-speed camera after start-up;
segmenting a moving target in the image based on the dynamic change of the first ten images;
calculating pixel gray scale characteristics of the divided areas where the moving targets are located, calculating the similarity between the gray scale characteristics of different moving targets and preset unmanned aerial vehicle gray scale characteristics based on the gray scale characteristics of the moving targets, and selecting the unmanned aerial vehicle targets;
calculating the number of pixels occupied by the unmanned aerial vehicle based on the selected unmanned aerial vehicle target; if the number is smaller than a standard value, calculating the color-channel histograms of the unmanned aerial vehicle target in the images shot by the two high-speed cameras, and matching the unmanned aerial vehicle's pixel position information through the histograms;
if the number of pixels is larger than the standard value, acquiring corner points of the unmanned aerial vehicle target, calculating the coincidence degree of the acquired corner points between the images shot by the two high-speed cameras, and selecting the corner-point positions whose coincidence degree exceeds a threshold as the unmanned aerial vehicle's pixel position information.
10. A system for large field of view high speed motion measurement, the system comprising:
the image acquisition module is used for acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
the positioning module is used for acquiring real-time accurate position information of the unmanned aerial vehicle in the large-field-of-view flight process;
the calibration module is used for constructing a data conversion model based on the flight images and real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and the moving target measuring module is used for measuring the moving target in the large visual field based on the calibration parameters to obtain the moving parameters of the moving target.
CN202111372441.0A 2021-11-18 2021-11-18 System and method for large-field-of-view high-speed motion measurement Active CN114155290B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111372441.0A CN114155290B (en) 2021-11-18 2021-11-18 System and method for large-field-of-view high-speed motion measurement

Publications (2)

Publication Number Publication Date
CN114155290A true CN114155290A (en) 2022-03-08
CN114155290B CN114155290B (en) 2022-09-09

Family

ID=80456977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111372441.0A Active CN114155290B (en) 2021-11-18 2021-11-18 System and method for large-field-of-view high-speed motion measurement

Country Status (1)

Country Link
CN (1) CN114155290B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104501779A (en) * 2015-01-09 2015-04-08 中国人民解放军63961部队 High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
CN105346706A (en) * 2015-11-13 2016-02-24 深圳市道通智能航空技术有限公司 Flight device, and flight control system and method
CN107146256A (en) * 2017-04-10 2017-09-08 中国人民解放军国防科学技术大学 Camera marking method under outfield large viewing field condition based on differential global positioning system
CN108510551A (en) * 2018-04-25 2018-09-07 上海大学 Method and system for calibrating camera parameters under long-distance large-field-of-view condition
CN109035320A (en) * 2018-08-12 2018-12-18 浙江农林大学 Depth extraction method based on monocular vision
CN109360240A (en) * 2018-09-18 2019-02-19 华南理工大学 A kind of small drone localization method based on binocular vision
CN109407697A (en) * 2018-09-20 2019-03-01 北京机械设备研究所 A kind of unmanned plane pursuit movement goal systems and method based on binocular distance measurement
CN111879354A (en) * 2020-06-29 2020-11-03 广州中科智云科技有限公司 Unmanned aerial vehicle measurement system that becomes more meticulous
WO2021004312A1 (en) * 2019-07-08 2021-01-14 中原工学院 Intelligent vehicle trajectory measurement method based on binocular stereo vision system
CN112950671A (en) * 2020-08-06 2021-06-11 郑锴 Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN113406682A (en) * 2021-06-22 2021-09-17 腾讯科技(深圳)有限公司 Positioning method, positioning device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN114155290B (en) 2022-09-09

Similar Documents

Publication Publication Date Title
CA3027921C (en) Integrated sensor calibration in natural scenes
WO2022170878A1 (en) System and method for measuring distance between transmission line and image by unmanned aerial vehicle
CN110517325B (en) Coordinate transformation and method and system for positioning objects around vehicle body through coordinate transformation
CN106019264A (en) Binocular vision based UAV (Unmanned Aerial Vehicle) danger vehicle distance identifying system and method
CN112113542A (en) Method for checking and accepting land special data for aerial photography construction of unmanned aerial vehicle
CN109859269B (en) Shore-based video auxiliary positioning unmanned aerial vehicle large-range flow field measuring method and device
CN109920009B (en) Control point detection and management method and device based on two-dimensional code identification
CN113012292B (en) AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography
CN110706273B (en) Real-time collapse area measurement method based on unmanned aerial vehicle
CN110675448A (en) Ground light remote sensing monitoring method, system and storage medium based on civil aircraft
CN115359130B (en) Radar and camera combined calibration method and device, electronic equipment and storage medium
CN115683062B (en) Territorial space planning detection analysis system
CN110715670A (en) Method for constructing driving test panoramic three-dimensional map based on GNSS differential positioning
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
Bybee et al. Method for 3-D scene reconstruction using fused LiDAR and imagery from a texel camera
CN117036300A (en) Road surface crack identification method based on point cloud-RGB heterogeneous image multistage registration mapping
CN114529615A (en) Radar calibration method, device and storage medium
CN114155290B (en) System and method for large-field-of-view high-speed motion measurement
CN110989645A (en) Target space attitude processing method based on compound eye imaging principle
CN211349110U (en) Three-dimensional display system based on unmanned aerial vehicle gathers
CN114429515A (en) Point cloud map construction method, device and equipment
CN114708505A (en) Wheat yield estimation method based on unmanned aerial vehicle low-altitude visible light image and target detection network
CN114693807A (en) Method and system for reconstructing mapping data of power transmission line image and point cloud
CN114979956A (en) Unmanned aerial vehicle aerial photography ground target positioning method and system
KR100874425B1 (en) System for measuring size of signboard and method for measuring size of signboard using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Fuhuang New Vision Building, No. 77 Wutaishan Road, Baohe Economic Development Zone, Hefei City, Anhui Province, 230051

Applicant after: HEFEI FUHUANG JUNDA HIGH-TECH INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 230088 Room 107, Building 3, Tiandao 10 Software Park, Hefei High-tech Zone, Anhui Province

Applicant before: HEFEI FUHUANG JUNDA HIGH-TECH INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Fuhuang New Vision Building, No. 77 Wutaishan Road, Baohe Economic Development Zone, Hefei City, Anhui Province, 230051

Patentee after: Hefei Zhongke Junda Vision Technology Co.,Ltd.

Address before: Fuhuang New Vision Building, No. 77 Wutaishan Road, Baohe Economic Development Zone, Hefei City, Anhui Province, 230051

Patentee before: HEFEI FUHUANG JUNDA HIGH-TECH INFORMATION TECHNOLOGY Co.,Ltd.
