Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a system and a method for large-field-of-view high-speed motion measurement. Calibration parameters are obtained from the coordinates and images of an unmanned aerial vehicle flying in a large field of view, and the parameters of a moving target are then measured based on the calibration parameters. The calibration process is simple, short in time consumption, and high in efficiency, and moving targets can be detected throughout the whole field of view.
A first aspect of an embodiment of the present invention provides an RTK unmanned aerial vehicle calibration system for large-field-of-view motion measurement, where the system includes:
the satellite positioning base station comprises a satellite positioning unit and a wireless communication unit, wherein the satellite positioning unit is used for acquiring satellite positioning information of the satellite positioning base station, and the wireless communication unit is used for combining the satellite positioning information and preset base station position information into positioning correction information and sending the positioning correction information to the unmanned aerial vehicle;
the unmanned aerial vehicle comprises an aircraft and an RTK unmanned aerial vehicle module, wherein the RTK unmanned aerial vehicle module is used for receiving positioning correction information sent by the satellite positioning base station, calculating real-time accurate position information of the unmanned aerial vehicle based on the positioning correction information and sending the real-time accurate position information to the central processing station;
the high-speed camera comprises an image acquisition unit and a time service trigger unit, wherein the time service trigger unit receives time service information of the positioning satellite and starts the image acquisition unit to acquire an image of the RTK unmanned aerial vehicle module;
and the central processing station is in communication connection with the high-speed camera and the RTK unmanned aerial vehicle module, and is used for receiving the image of the unmanned aerial vehicle and the real-time accurate position information of the unmanned aerial vehicle in real time, and constructing a data conversion model based on the received data to obtain the calibration parameters.
A second aspect of embodiments of the present invention provides a method for large-field high-speed motion measurement, the method comprising:
acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
acquiring real-time accurate position information of the unmanned aerial vehicle in a large-field-of-view flight process;
constructing a data conversion model based on the flight images and the real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and measuring the moving target in the large view field based on the calibration parameters to obtain the moving parameters of the moving target.
As a further optimization of the above scheme, the method for planning the flight trajectory of the unmanned aerial vehicle in the large field of view comprises: dividing the whole field-of-view space by height into at least one height space; within a given height space, flying the unmanned aerial vehicle in an irregular pattern on a single plane; and, after the whole plane has been covered, flying the unmanned aerial vehicle into the adjacent height space, until the flight trajectory is distributed over the whole field-of-view space.
As a further optimization of the above scheme, the flight trajectories of the unmanned aerial vehicle on two adjacent altitude space planes are respectively:
wherein W represents the width of the field space, L represents the length of the field space, and a represents a constant; the flight tracks on the two adjacent height space planes are connected through two line segments which are perpendicular to each other and intersect on the x axis, one of the two line segments is perpendicular to the x-z plane, and the other line segment is perpendicular to the x-y plane.
As a further optimization of the above scheme, the real-time accurate position information obtaining process of the unmanned aerial vehicle includes:
acquiring satellite positioning information of satellite positioning base stations of a plurality of time nodes, wherein the satellite positioning information comprises positioning information of at least 4 satellites;
obtaining a plurality of position offset differences based on the satellite positioning information and preset position information of the satellite positioning base station;
carrying out offset correction on the satellite positioning information of the same unmanned aerial vehicle based on the plurality of position offset difference values, and outputting a plurality of corrected position information values for the unmanned aerial vehicle;
calculating the positioning error value of the plurality of corrected position information values, and judging whether the positioning error value meets a preset precision;
and if the positioning error value of the corrected position information meets the preset precision, selecting the intermediate value of the plurality of corrected position information values as the real-time accurate position information of the unmanned aerial vehicle.
As a further optimization of the above scheme, the calibration parameters are specifically obtained as follows:
acquiring real-time accurate position information and positioning time of the unmanned aerial vehicle, and flight images and image acquisition time of the unmanned aerial vehicle;
matching the real-time accurate position of the unmanned aerial vehicle with the same time node with the flight image based on the positioning time and the image acquisition time to form a model data set;
acquiring pixel position information of the unmanned aerial vehicle in the flight image in the model data set;
converting real-time accurate position information of the unmanned aerial vehicle into position information of the same type as the pixel position information of the unmanned aerial vehicle;
constructing a model function based on the converted real-time accurate position information of the unmanned aerial vehicle and the pixel position information of the unmanned aerial vehicle;
and obtaining calibration parameters based on the model function.
As a further optimization of the above scheme, the specific process of matching the real-time accurate position of the unmanned aerial vehicle with the flight image is as follows:
acquiring a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring an acquisition time node of each unmanned aerial vehicle flying image based on a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
obtaining a positioning time node of real-time accurate position information of each unmanned aerial vehicle based on a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
the method comprises the steps of comparing a collecting time node of a flight image with a positioning time node of real-time accurate position information to obtain the same time node, and matching the real-time accurate position of the unmanned aerial vehicle corresponding to the time node with the unmanned aerial vehicle image to form a model data set.
As a further optimization of the scheme, the starting time node of the binocular high-speed camera is the same as the takeoff time node of the unmanned aerial vehicle, and the preset acquisition frequency of the binocular high-speed camera is an integral multiple of the preset positioning frequency of the real-time accurate position information of the unmanned aerial vehicle.
As a further optimization of the above scheme, the method for acquiring the pixel position information of the unmanned aerial vehicle in the image in the model data set comprises the following steps:
acquiring the first ten images shot by the binocular high-speed camera after starting;
segmenting moving targets in the images based on the dynamic changes across the first ten images;
calculating pixel gray-scale characteristics of the segmented regions where the moving targets are located, calculating the similarity between the gray-scale characteristics of the different moving targets and preset unmanned aerial vehicle gray-scale characteristics, and selecting the unmanned aerial vehicle target;
calculating the number of pixels occupied by the unmanned aerial vehicle image based on the selected unmanned aerial vehicle target; if the number of pixels occupied by the unmanned aerial vehicle is smaller than a standard value, calculating histograms of the color channels of the unmanned aerial vehicle target in the images shot by the two groups of high-speed cameras, and matching the pixel position information of the unmanned aerial vehicle through the histograms;
and if the number of pixels occupied by the unmanned aerial vehicle is larger than the standard value, acquiring corner points of the unmanned aerial vehicle target, calculating the degree of coincidence of the acquired corner points between the images shot by the two groups of high-speed cameras, and selecting the corner-point position information whose degree of coincidence is larger than a threshold value as the unmanned aerial vehicle pixel position information.
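The size-dependent matching strategy above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the standard value of 50 pixels, the 16-bin histograms, and the histogram-intersection score are all assumptions, and the corner-coincidence branch is only stubbed out.

```python
import numpy as np

def drone_pixel_count(mask):
    """Number of pixels the segmented drone target occupies."""
    return int(mask.sum())

def histogram_match(patch_left, patch_right, bins=16):
    """Histogram-intersection similarity of the per-channel intensity
    histograms of the drone patch in the left and right camera images
    (strategy used when the target is small)."""
    scores = []
    for c in range(patch_left.shape[2]):
        h1, _ = np.histogram(patch_left[..., c], bins=bins, range=(0, 256))
        h2, _ = np.histogram(patch_right[..., c], bins=bins, range=(0, 256))
        h1 = h1 / max(h1.sum(), 1)           # normalise to unit mass
        h2 = h2 / max(h2.sum(), 1)
        scores.append(float(np.minimum(h1, h2).sum()))
    return sum(scores) / len(scores)         # 1.0 for identical histograms

def select_position(mask, patch_left, patch_right, size_threshold=50):
    """Choose the matching strategy by target size, as in the text:
    small target -> colour-histogram matching; large target -> corner
    matching (corner extraction/coincidence omitted in this sketch)."""
    if drone_pixel_count(mask) < size_threshold:
        return ("histogram", histogram_match(patch_left, patch_right))
    return ("corners", None)
```

A small drone patch that looks the same in both views yields a histogram score of 1.0, which would then be accepted as a valid stereo match.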
A third aspect of embodiments of the present invention provides a system for large field of view high speed motion measurement, the system comprising:
the image acquisition module is used for acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
the positioning module is used for acquiring real-time accurate position information of the unmanned aerial vehicle in the large-field-of-view flight process;
the calibration module is used for constructing a data conversion model based on the flight images and the real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and the moving target measuring module is used for measuring the moving target in the large visual field based on the calibration parameters to obtain the moving parameters of the moving target.
The system and the method for large-field-of-view high-speed motion measurement provided by the invention have the following beneficial effects:
1. The invention acquires real-time accurate position information of the unmanned aerial vehicle during its flight in the large field of view through the satellite positioning system, realizing acquisition of position information at different heights of the large field of view; the whole position-information acquisition process is short in time consumption and high in efficiency, and can cover the whole large-field-of-view space.
2. According to the invention, the real-time accurate position information of the unmanned aerial vehicle at a given time point is matched, through time matching, with the flight image of the unmanned aerial vehicle collected by the binocular high-speed camera; a data conversion model is constructed and the calibration parameters are calculated, and the motion parameters of a moving target in the large-field-of-view space are then measured according to the calibration parameters. This realizes measurement of moving targets covering different height positions of the large field of view, while keeping the calibration and measurement processes rapid and efficient.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
The embodiment of the invention provides an RTK unmanned aerial vehicle calibration system for large-field-of-view motion measurement, which comprises:
the satellite positioning base station comprises a satellite positioning unit and a wireless communication unit, wherein the satellite positioning unit is used for acquiring satellite positioning information of the satellite positioning base station, and the wireless communication unit is used for combining the satellite positioning information and preset base station position information into positioning correction information and sending the positioning correction information to the unmanned aerial vehicle;
the unmanned aerial vehicle comprises an aircraft and an RTK unmanned aerial vehicle module, wherein the RTK unmanned aerial vehicle module is used for receiving positioning correction information sent by the satellite positioning base station, calculating real-time accurate position information of the unmanned aerial vehicle based on the positioning correction information and sending the real-time accurate position information to the central processing station;
the high-speed camera comprises an image acquisition unit and a time service trigger unit, wherein the time service trigger unit receives time service information of the positioning satellite and starts the image acquisition unit to acquire an image of the RTK unmanned aerial vehicle module;
and the central processing station is in communication connection with the high-speed camera and the RTK unmanned aerial vehicle module, and is used for receiving the image of the unmanned aerial vehicle and the real-time accurate position information of the unmanned aerial vehicle in real time, and constructing a data conversion model based on the received data to obtain the calibration parameters.
Referring to fig. 1, in this embodiment, the unmanned aerial vehicle flies in the large field of view, and its position in the real world is obtained by satellite positioning, so the corresponding coordinates of each position in the large field of view can be obtained more flexibly, covering different heights of the large field of view. Meanwhile, by obtaining the real-time coordinates of the unmanned aerial vehicle as it moves through the large field of view, motion measurement over the large field of view is realized, and the calibration process is faster and more efficient.
Specifically, a base station position is preset and the satellite positioning base station is placed there. The satellite positioning base station acquires its own satellite positioning information through a built-in satellite positioning unit, combines the acquired satellite positioning information and the preset base station position information into positioning correction information, and sends the positioning correction information to the unmanned aerial vehicle through the wireless communication unit. The unmanned aerial vehicle is provided with an aircraft and an RTK unmanned aerial vehicle module, and the RTK unmanned aerial vehicle module is wirelessly connected with the satellite positioning base station. After receiving the positioning correction information sent by the satellite positioning base station, the RTK unmanned aerial vehicle module corrects the real-time satellite positioning information of the unmanned aerial vehicle during flight based on the positioning correction information, calculates the real-time accurate position information of the unmanned aerial vehicle, and sends it to the central processing station. Meanwhile, the high-speed camera collects images of the unmanned aerial vehicle during flight and transmits them to the central processing station through wired communication; the central processing station performs joint analysis on the images of the unmanned aerial vehicle and the real-time accurate position information, and calculates the parameters for calibration.
Based on the above RTK unmanned aerial vehicle calibration system for large-field-of-view motion measurement, the RTK unmanned aerial vehicle module comprises:
the output port of the first data transmission unit is connected with the positioning correction unit, and the first data transmission unit is used for receiving the positioning correction information sent by the satellite positioning base station and transmitting the positioning correction information to the positioning correction unit;
the positioning correction unit is used for acquiring real-time satellite positioning information of the unmanned aerial vehicle during movement and calculating real-time accurate position information of the unmanned aerial vehicle by combining the positioning correction information;
and an input port of the second data transmission unit is connected with the positioning correction unit and used for sending the real-time accurate position information and the positioning time of the unmanned aerial vehicle to the central processing station.
Further, the RTK unmanned aerial vehicle module arranged in the unmanned aerial vehicle makes it possible to acquire accurate position information of the unmanned aerial vehicle in real time while it is moving. Specifically, the RTK unmanned aerial vehicle module comprises a first data transmission unit, a second data transmission unit, and a positioning correction unit. The unmanned aerial vehicle is connected with the satellite positioning base station through the first data transmission unit, whose output port is connected with the positioning correction unit; when the first data transmission unit receives the positioning correction information sent by the satellite positioning base station, it transmits the positioning correction information to the positioning correction unit through the output port. The positioning correction unit also obtains the positioning information and positioning time sent by the positioning satellites; after receiving the satellite positioning information of the unmanned aerial vehicle, it corrects that information according to the positioning correction information and calculates the real-time accurate position information of the unmanned aerial vehicle. The positioning correction unit is further connected with the input port of the second data transmission unit; after the real-time accurate position information is calculated, the positioning correction unit transmits this information and the positioning time to the second data transmission unit, which sends them to the central processing station, so that accurate position information of the unmanned aerial vehicle is acquired in real time.
The embodiment of the invention provides a method for measuring large-field high-speed motion, which comprises the following steps:
acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
acquiring real-time accurate position information of the unmanned aerial vehicle in a large-field-of-view flight process;
constructing a data conversion model based on the flight images and the real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain calibration parameters;
and measuring the moving target in the large view field based on the calibration parameters to obtain the moving parameters of the moving target.
Referring to fig. 2, in this embodiment, the unmanned aerial vehicle flies in the large field of view and satellite positioning is used to acquire its real-time accurate position information, so that the corresponding coordinates of each position in the large field of view can be obtained more flexibly, covering different heights of the large field of view. Meanwhile, calibration of the large field of view is carried out by acquiring the real-time accurate position information of the unmanned aerial vehicle as it moves, making the calibration process faster and more efficient; after calibration is completed, the motion parameters of a moving target in the large field of view can be measured.
Specifically, the flight images of the unmanned aerial vehicle in the large field of view are acquired through the binocular high-speed camera, and the real-time accurate position information of the unmanned aerial vehicle while flying in the large field of view is acquired in real time based on the satellite positioning system; since the flight trajectory of the unmanned aerial vehicle covers the whole large field of view, positioning information for each position in the large field of view can be acquired, and the acquisition is fast. By matching the acquired flight images with the real-time accurate position information, a data conversion model between the two can be obtained, and solving this model yields the required calibration parameters. The binocular camera is then used to collect motion images of any moving target in the large field of view; the position information of the moving target can be obtained based on the calibration parameters, and the motion parameters of the moving target can be obtained from the changes of that position information over time.
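The final step, obtaining motion parameters from the change in a target's position over time, can be sketched as follows. This is a minimal illustration under the assumption that calibrated 3-D positions (in metres) and timestamps (in seconds) are already available; the function name and interface are hypothetical.

```python
import numpy as np

def motion_parameters(positions, timestamps):
    """Estimate per-interval velocity vectors and speeds of a moving target
    from a sequence of calibrated 3-D positions and their timestamps."""
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    dp = np.diff(positions, axis=0)     # displacement between frames
    dt = np.diff(timestamps)[:, None]   # time step between frames
    velocities = dp / dt                # velocity vector per interval
    speeds = np.linalg.norm(velocities, axis=1)
    return velocities, speeds
```

For example, a target moving 1 m along x every 0.5 s yields a constant speed of 2 m/s.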
Based on the above method, the method for planning the flight trajectory of the unmanned aerial vehicle in the large field of view comprises: dividing the whole field-of-view space by height into at least one height space; within a given height space, flying the unmanned aerial vehicle in an irregular pattern on a single plane; and, after the whole plane has been covered, flying the unmanned aerial vehicle into the adjacent height space, until the flight trajectory is distributed over the whole field-of-view space.
Referring to fig. 3, the large-field-of-view space is a cuboid spatial structure, and the flight trajectory of the unmanned aerial vehicle in the large field of view is irregular. The specific division of the flight trajectory is as follows: the large-field-of-view space is divided into a plurality of height spaces by a plurality of planes parallel to the horizontal plane, wherein the height difference between any two adjacent height spaces is equal. A takeoff height space is then chosen for the unmanned aerial vehicle; after the unmanned aerial vehicle is started, it begins to fly along an irregular trajectory in the plane of the takeoff height space, starting from a corner point of that plane. After covering the whole plane, the unmanned aerial vehicle flies into the height space adjacent to its current one, until its flight trajectory covers the whole large-field-of-view space. Flying in this way, the unmanned aerial vehicle can finish the flight in the shortest time, and the flight trajectory can cover the entire field-of-view space.
Based on the method, the flight trajectories of the unmanned aerial vehicle on the two adjacent height space planes are respectively as follows:
wherein W represents the width of the field space, L represents the length of the field space, and a represents a constant; the flight tracks on the two adjacent height space planes are connected through two line segments which are perpendicular to each other and intersect on the x axis, one of the two line segments is perpendicular to the x-z plane, and the other line segment is perpendicular to the x-y plane.
Specifically, the flight trajectories of the unmanned aerial vehicle in two adjacent altitude space planes can be represented by sine functions, wherein the flight trajectory function of the unmanned aerial vehicle in the first altitude space plane is:
the flight trajectory function of the drone in the second altitudinal space plane adjacent thereto is:
and the flight path function expressions of the unmanned aerial vehicle in the adjacent height space planes alternately appear as the formula (1) and the formula (2).
Furthermore, the terminal point of the flight path of the unmanned aerial vehicle in a certain altitude space plane is connected with the starting point of the flight path in the next adjacent altitude space plane through two line segments which are perpendicular to each other and intersect at the x axis, wherein one of the two line segments is perpendicular to the x-z plane, and the other line segment is perpendicular to the x-y plane.
For example, referring to fig. 2, the large-field-of-view space is a cuboid with length L, width W, and height H. Coordinates are established with a long side as the X axis, a short side as the Y axis, and the spatial height as the Z axis, and the large-field-of-view space is divided into three height spaces, where the height difference between two adjacent height spaces is H/4. The height of the takeoff height space of the unmanned aerial vehicle is selected as H/4; the unmanned aerial vehicle starts from point Q and flies in that plane, with the flight trajectory given by formula (1). After the flight in this plane is finished, the unmanned aerial vehicle passes through the line segments S1 and S2 into the next adjacent height space, in whose plane the flight trajectory is given by formula (2).
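The layered sweep described above can be sketched as a waypoint generator. Note that formulas (1) and (2) are not reproduced in this text, so the sinusoidal sweep below (width W swept along length L, mirrored on alternate planes, with constant a controlling the number of oscillations) is an assumed form consistent with the description, not the patent's actual formula.

```python
import math

def plan_trajectory(L, W, H, n_planes, a=4.0, points_per_plane=50):
    """Generate (x, y, z) waypoints for a layered sweep of an L x W x H
    cuboid: one sinusoidal pass per height plane, mirrored and reversed
    on alternate planes so the trajectories connect at the plane edges."""
    dz = H / (n_planes + 1)                  # equal spacing between planes
    waypoints = []
    for k in range(n_planes):
        z = dz * (k + 1)
        sign = 1.0 if k % 2 == 0 else -1.0   # formulas (1)/(2) alternate
        xs = [L * i / (points_per_plane - 1) for i in range(points_per_plane)]
        if k % 2 == 1:
            xs = xs[::-1]                    # fly back on the return plane
        for x in xs:
            y = W / 2 + sign * (W / 2) * math.sin(2 * math.pi * a * x / L)
            waypoints.append((x, y, z))
    return waypoints
```

With L=10, W=4, H=8 and three planes, this yields planes at z = 2, 4, 6 (spacing H/4, matching the worked example) and keeps every waypoint inside the 0 ≤ y ≤ W band.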
Based on the method, the real-time accurate position information acquisition process of the unmanned aerial vehicle comprises the following steps:
acquiring satellite positioning information of satellite positioning base stations of a plurality of time nodes, wherein the satellite positioning information comprises positioning information of at least 4 satellites;
obtaining a plurality of position offset differences based on the satellite positioning information and preset position information of the satellite positioning base station;
carrying out offset correction on the satellite positioning information of the same unmanned aerial vehicle based on the plurality of position offset difference values, and outputting a plurality of corrected position information values for the unmanned aerial vehicle;
calculating the positioning error value of the plurality of corrected position information values, and judging whether the positioning error value meets a preset precision;
and if the positioning error value of the corrected position information meets the preset precision, selecting the intermediate value of the plurality of corrected position information values as the real-time accurate position information of the unmanned aerial vehicle.
In this embodiment, the satellite positioning information has a certain error; therefore, a position offset difference value needs to be acquired through the satellite positioning base station to perform offset correction on the satellite positioning information of the unmanned aerial vehicle, so that accurate position information of the unmanned aerial vehicle is obtained. First, the satellite positioning information of the satellite positioning base station and the preset base station position information are received, wherein the satellite positioning information comprises positioning information of at least 4 satellites. The position offset difference between the satellite positioning information of the base station and its preset position information, which is known, is then calculated; based on this process, the position offset differences at a plurality of different time nodes are acquired. Offset correction is then carried out on the satellite positioning information of the same unmanned aerial vehicle using the plurality of position offset differences, and a plurality of corrected position information values for the unmanned aerial vehicle are output. Next, it is checked whether the differences between the plurality of corrected position information values meet a preset precision: if the differences are less than 1 cm, the corrected position information meets the preset precision, and the intermediate value of the corrected position information can be selected as the real-time accurate position information of the unmanned aerial vehicle.
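The correction-and-median procedure above can be sketched as follows. This is a simplified illustration, assuming the base-station offsets apply directly to the drone's raw fixes as additive corrections; the function name, the per-axis spread check, and the 1 cm tolerance default are assumptions matching the embodiment's description.

```python
import numpy as np

def rtk_correct(raw_fixes, base_fixes, base_truth, tol_m=0.01):
    """Correct a drone's raw satellite fixes using the base station's
    measured offsets, check the preset precision (1 cm by default), and
    return the per-axis median (the 'intermediate value') if it is met.
    raw_fixes:  candidate raw drone positions, one per time node
    base_fixes: base-station satellite fixes at the same time nodes
    base_truth: the preset (surveyed) base-station position"""
    raw_fixes = np.asarray(raw_fixes, dtype=float)
    offsets = np.asarray(base_fixes, dtype=float) - np.asarray(base_truth, dtype=float)
    corrected = raw_fixes - offsets              # one corrected fix per offset
    spread = corrected.max(axis=0) - corrected.min(axis=0)
    if np.any(spread > tol_m):                   # precision not met
        return None
    return np.median(corrected, axis=0)          # intermediate value
```

If the corrected candidates agree to within the tolerance, their median is returned as the real-time accurate position; otherwise the batch is rejected.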
Based on the method, the calibration parameters are obtained in the following specific steps:
acquiring real-time accurate position information and positioning time of the unmanned aerial vehicle, and flight images and image acquisition time of the unmanned aerial vehicle;
matching the real-time accurate position of the unmanned aerial vehicle with the same time node with the flight image based on the positioning time and the image acquisition time to form a model data set;
acquiring pixel position information of the unmanned aerial vehicle in the flight image in the model data set;
converting real-time accurate position information of the unmanned aerial vehicle into position information of the same type as the pixel position information of the unmanned aerial vehicle;
constructing a model function based on the converted real-time accurate position information of the unmanned aerial vehicle and the pixel position information of the unmanned aerial vehicle;
and obtaining calibration parameters based on the model function.
In this embodiment, the parameters for calibration are calculated by analyzing the mapping relationship between the accurate position information of the unmanned aerial vehicle and the pixel position information of the unmanned aerial vehicle in the flight images. Because the unmanned aerial vehicle moves continuously, the central processing module first needs to match the acquired real-time accurate position information with the flight images; preferably, data sharing the same time node, determined from the positioning time and the acquisition time, are matched to form the model data sets, wherein each model data set contains only one item of real-time accurate position information and one flight image. The pixel position information of the unmanned aerial vehicle in the flight image is then acquired for joint analysis with the real-time accurate position information. Because the two are not in the same coordinate system, the real-time accurate position information of the unmanned aerial vehicle needs to be converted; after conversion, the function model relating the two can be constructed, and the corresponding calibration parameters can be obtained by solving this function model.
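One standard way to realize such a world-to-pixel model function is a direct linear transform (DLT) fit of a 3x4 projection matrix. The patent does not specify its model function, so the following is a hedged sketch of that standard technique, not the invention's actual formulation.

```python
import numpy as np

def fit_projection(world_pts, pixel_pts):
    """Fit a 3x4 projection matrix P from matched (X, Y, Z) drone positions
    and (u, v) pixel positions via the direct linear transform: each match
    contributes two homogeneous equations, and P is the null vector of the
    stacked system (smallest-singular-value row of V^T)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, world_pt):
    """Apply the fitted parameters to a world point, returning pixels."""
    x = P @ np.array([*world_pt, 1.0])
    return x[0] / x[2], x[1] / x[2]
```

At least six well-distributed, non-coplanar drone positions are needed for a unique solution up to scale, which is one reason the flight trajectory is made to cover the whole field-of-view space.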
Based on the above calibration parameter acquisition process, the specific process of matching the real-time accurate position of the unmanned aerial vehicle with the flight image is as follows:
acquiring a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring an acquisition time node of each unmanned aerial vehicle flying image based on a starting time node and a preset acquisition frequency of the binocular high-speed camera;
acquiring a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
obtaining a positioning time node of real-time accurate position information of each unmanned aerial vehicle based on a takeoff time node and a preset positioning frequency of the unmanned aerial vehicle;
the method comprises the steps of comparing a collecting time node of a flight image with a positioning time node of real-time accurate position information to obtain the same time node, and matching the real-time accurate position of the unmanned aerial vehicle corresponding to the time node with the unmanned aerial vehicle image to form a model data set.
Furthermore, because the unmanned aerial vehicle is in motion, its real-time accurate position information and the unmanned aerial vehicle images do not correspond exactly, so the two must be matched. In this embodiment, the acquisition time nodes and positioning time nodes are obtained from the preset acquisition frequency of the high-speed camera, the preset positioning frequency of the real-time accurate position information of the unmanned aerial vehicle, and the time service signal sent by the positioning satellite, and time matching is then performed to obtain mutually corresponding model data sets.
The starting time node of the binocular high-speed camera is the same as the takeoff time node of the unmanned aerial vehicle, and the preset acquisition frequency of the binocular high-speed camera is an integral multiple of the preset positioning frequency of the real-time accurate position information of the unmanned aerial vehicle.
Specifically, the binocular high-speed camera and the unmanned aerial vehicle are first started simultaneously. The preset acquisition frequency of the binocular high-speed camera is preferably 100 frames per second, and the preset positioning frequency of the real-time accurate position information of the unmanned aerial vehicle is preferably 10 fixes per second, so that the two coincide at a common time node every fixed number of milliseconds; at each such time node, the corresponding real-time accurate position information and unmanned aerial vehicle image can be obtained, completing the matching between the two.
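The time-node matching described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the 100 Hz acquisition frequency and 10 Hz positioning frequency are the preferred values stated above, and the shared start time reflects the simultaneous startup of camera and unmanned aerial vehicle.

```python
def match_time_nodes(start_time, n_frames, cam_hz=100, rtk_hz=10):
    """Return (frame_index, fix_index, timestamp) triples at which a camera
    acquisition time node coincides with an RTK positioning time node.

    Assumes cam_hz is an integral multiple of rtk_hz, as required above.
    """
    step = cam_hz // rtk_hz  # frames between two consecutive RTK fixes
    return [(k, k // step, start_time + k / cam_hz)
            for k in range(0, n_frames, step)]
```

For example, with the preferred frequencies, every tenth frame shares a time node with one RTK fix, so each model data set pairs one image with exactly one position fix.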
Based on the specific calculation process of the calibration parameters, the method for acquiring the pixel position information of the unmanned aerial vehicle in the images of the model data sets comprises the following steps:
acquiring the first ten images shot by the binocular high-speed camera after starting;
segmenting a moving target in the image based on the dynamic change of the first ten images;
calculating pixel gray-scale features of the segmented regions where the moving targets are located, calculating the similarity between the gray-scale features of the different moving targets and preset unmanned aerial vehicle gray-scale features, and selecting the unmanned aerial vehicle target;
calculating the number of pixels occupied by the unmanned aerial vehicle image based on the selected unmanned aerial vehicle target; if the number of pixels occupied by the unmanned aerial vehicle is smaller than a standard value, calculating histograms of the color channels of the unmanned aerial vehicle target in the images shot by the two high-speed cameras, and matching the pixel position information of the unmanned aerial vehicle through the histograms;
if the number of pixels occupied by the unmanned aerial vehicle is larger than the standard value, acquiring corner points of the unmanned aerial vehicle target, calculating the coincidence degree of the acquired corner points between the images shot by the two high-speed cameras, and selecting the corner position information whose coincidence degree is larger than a threshold value as the unmanned aerial vehicle pixel position information.
In this embodiment, the moving targets are first obtained by analyzing the first ten images shot by the binocular high-speed camera. Because other moving targets in the field of view may interfere with the experiment, the obtained moving targets must be screened: the gray-scale features of the different moving targets are calculated, and their similarity to the gray-scale features of a preset unmanned aerial vehicle target is computed to select the unmanned aerial vehicle target. The pixel position information of the unmanned aerial vehicle in the image is then determined from the selected target. To address the problem that the unmanned aerial vehicle image may be too small to judge, the number of pixels of the unmanned aerial vehicle target is counted first. If the number of pixels is less than a standard value, preferably 30 pixels, the pixel position information of the unmanned aerial vehicle is matched through histograms of the color channels of the unmanned aerial vehicle target in the images shot by the two cameras of the binocular high-speed camera; if the number of pixels is larger than the standard value, corner points of the unmanned aerial vehicle target are acquired, the coincidence degree of the acquired corner points between the two cameras' images is calculated, and the corner position information whose coincidence degree is larger than a threshold value, preferably 0.8, is selected as the unmanned aerial vehicle pixel position information.
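The screening and branching logic above can be sketched as follows. This is an assumed illustration, not the patented code: the reference gray value `preset_gray` is a hypothetical stand-in for the preset unmanned aerial vehicle gray-scale feature, while the 30-pixel standard value comes from the text.

```python
import numpy as np

def select_drone_target(candidate_patches, preset_gray=128.0):
    """Pick the moving-target patch whose mean gray level is most similar
    to the preset drone gray feature (preset_gray is an assumed value)."""
    similarities = [-abs(float(np.mean(patch)) - preset_gray)
                    for patch in candidate_patches]
    return int(np.argmax(similarities))  # index of the drone candidate

def choose_matching_strategy(num_pixels, standard=30):
    """Branch described in the text: a target smaller than the standard
    value is matched by color-channel histograms; a larger target is
    matched by corner-point coincidence between the two camera images."""
    return "histogram" if num_pixels < standard else "corner"
```

In use, the selected patch's pixel count decides which stereo matching path localizes the drone in both camera images.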
In the prior art, the preferred target tracking method is the optical flow method, but the displacement between consecutive frames in parts of a video sequence is large, and sparse optical flow easily loses the target when the detection time is kept below 0.5 s. In view of this, the method obtains the first ten consecutive frames from the binocular high-speed camera and performs difference calculation to screen the regions of moving objects in the images, determines the unmanned aerial vehicle target through the feature values of the moving targets, performs accuracy checking on the unmanned aerial vehicle target region through the optical flow method to obtain an accurate unmanned aerial vehicle target, and selects the centroid of the unmanned aerial vehicle target region as the pixel coordinates of the unmanned aerial vehicle.
Based on the specific calculation process of the calibration parameters, the analysis process of the model function includes:
acquiring an unmanned aerial vehicle world coordinate P obtained by converting real-time accurate position information of the unmanned aerial vehicle;
acquiring a coordinate p corresponding to pixel position information of the unmanned aerial vehicle;
constructing a projection matrix of the camera based on the world coordinate P of the unmanned aerial vehicle, and carrying out normalization processing on the last item of the projection matrix of the camera to obtain a parameter matrix of the camera;
constructing a model function based on an imaging model of a camera, and constructing a parameter equation by taking a coordinate p corresponding to pixel position information of the unmanned aerial vehicle as a value of the function;
and solving a parameter equation to obtain a calibration parameter.
In the present embodiment, according to the imaging principle of the camera, the mapping relationship between a real-world coordinate point and the corresponding image coordinate point shot by the camera can be constructed. Therefore, the real-time accurate position information of the unmanned aerial vehicle is acquired first and converted into the world coordinate P(X, Y, Z) of the unmanned aerial vehicle; based on the world coordinate P(X, Y, Z), the projection matrix of the camera can be obtained, and the corresponding relation is

s·p = K·[R_{3×3} | T_{3×1}]·P = M·P

where K represents the intrinsic parameters of the camera, R_{3×3} is the rotation matrix of the camera, T_{3×1} is the translation matrix of the camera, and the entries r_{ij} of the 3×4 projection matrix M are the calibration parameters of the camera. To determine the camera parameters, the projection matrix is normalized so that its last entry r_{23} equals 1 (indices running from zero), so only the remaining 11 parameters of the camera need to be determined. Based on the imaging model, which is a pinhole imaging model, a parameter equation is constructed by taking the coordinate p(u, v) corresponding to the pixel position information of the unmanned aerial vehicle as the value of the function:

u = (r_{00}X + r_{01}Y + r_{02}Z + r_{03}) / (r_{20}X + r_{21}Y + r_{22}Z + 1)
v = (r_{10}X + r_{11}Y + r_{12}Z + r_{13}) / (r_{20}X + r_{21}Y + r_{22}Z + 1)    (1)
Each world coordinate yields two equations through equation (1), so all camera parameters can be obtained from 6 world coordinates; the larger the number of world coordinates, the more accurate the parameters.
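The solution of the parameter equations can be sketched as a standard direct linear transform. This is a minimal illustration under the stated assumptions (r_{23} normalized to 1, at least 6 world coordinates), not the patented implementation:

```python
import numpy as np

def solve_projection_matrix(world_pts, pixel_pts):
    """Stack the two linear equations per point from equation (1) and solve
    for the 11 free entries r_00..r_22 of the projection matrix (r_23 = 1)."""
    rows, rhs = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        rhs.append(u)
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        rhs.append(v)
    m, *_ = np.linalg.lstsq(np.array(rows, float), np.array(rhs, float),
                            rcond=None)
    return np.append(m, 1.0).reshape(3, 4)  # append r_23 = 1

def project(M, P):
    """Pinhole projection of a world point P = (X, Y, Z) to pixels (u, v)."""
    x = M @ np.append(np.asarray(P, float), 1.0)
    return x[:2] / x[2]
```

With more than 6 points, the least-squares solution averages out measurement noise, which matches the observation that more world coordinates give more accurate parameters.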
The embodiment of the invention provides a system for measuring large-field high-speed motion, which comprises:
the image acquisition module is used for acquiring a flight image of the unmanned aerial vehicle in a large view field based on a binocular high-speed camera;
the positioning module is used for acquiring real-time accurate position information of the unmanned aerial vehicle in the large-field-of-view flight process;
the calibration module is used for constructing a data conversion model based on the flight images and the real-time accurate position information of the unmanned aerial vehicle in the large field of view to obtain the calibration parameters;
and the moving target measuring module is used for measuring the moving target in the large visual field based on the calibration parameters to obtain the moving parameters of the moving target.
Referring to fig. 4 and 5, in this embodiment, a total station and a target are used, in combination with the calibration parameters obtained with the unmanned aerial vehicle, to measure and verify the calibration precision of the system, and the calibration error is calculated to determine the feasibility of the system. Specifically, the position of the binocular high-speed camera is kept unchanged, the target is placed in the field of view, the coordinates of the target at measurement points of different heights are measured, and the pixel coordinates pix_{l,i} and pix_{r,i}, i = 0, 1, …, n−1, of the different measurement points shot by the binocular high-speed camera are recorded, where pix_{l,i} represents the pixel coordinates in the left camera image, pix_{r,i} represents the pixel coordinates in the right camera image, and n is the number of measurement points. The three-dimensional space coordinate W_i of each measurement point is solved from the pixel coordinates of the measurement point and the camera parameters, where K represents the intrinsic parameters of the camera, R represents the rotation matrix of the camera, T is the translation matrix of the camera, and distCoeffs represents the distortion matrix of the camera.
Taking the measurement point at i = 0 as the reference origin, the distances d_i of the other measurement points relative to this point are calculated as d_i = ||W_i − W_0||. The coordinates L_i, i = 0, 1, …, n−1, of the measurement points measured by the total station are then recorded; taking the measurement point at i = 0 as the reference origin, the distances l_i = ||L_i − L_0|| of the other measurement points relative to this point are calculated. Based on the determined distances d_i and l_i, the calibrated reprojection error e_i is calculated.
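The distance comparison in this verification step can be sketched as follows. This is an assumed illustration: it computes the distances d_i and l_i from the reconstructed points W_i and total-station points L_i and their per-point discrepancy, while the conversion of that discrepancy into the pixel-valued error e_i is not reproduced here.

```python
import numpy as np

def verification_discrepancies(W, L):
    """Distances of each measurement point from the i = 0 reference origin,
    computed from both the stereo reconstruction W_i and the total station
    L_i, and the absolute per-point discrepancy |d_i - l_i|."""
    W = np.asarray(W, float)
    L = np.asarray(L, float)
    d = np.linalg.norm(W - W[0], axis=1)  # d_i = ||W_i - W_0||
    l = np.linalg.norm(L - L[0], axis=1)  # l_i = ||L_i - L_0||
    return np.abs(d - l)
```

Averaging this discrepancy over a growing number of measurement points is what drives the error curve in fig. 5 toward its stable value.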
Referring to fig. 5, as the number of calibrated measurement points increases, the corresponding calibrated reprojection error gradually decreases; when the number of calibrated measurement points reaches 60, the calibrated reprojection error gradually stabilizes, with a final reprojection error of 1.41461 pixels, compared with a calibration error of 0.900 pixels for a total station in a small field of view. This shows that the calibration method of the system offers good replaceability and convenience, and can achieve accurate measurement of the motion parameters of moving targets over a large field of view.
The present invention is not limited to the above-described embodiments; those skilled in the art can make various modifications based on the above conception without creative effort, and such modifications fall within the scope of the present invention.