CN115900639B - Heading angle correction method and server applied to a pan-tilt camera on an unmanned aerial vehicle

Heading angle correction method and server applied to a pan-tilt camera on an unmanned aerial vehicle

Info

Publication number
CN115900639B
CN115900639B (application CN202310213576.5A)
Authority
CN
China
Prior art keywords
state information
angle
image
determining
course angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310213576.5A
Other languages
Chinese (zh)
Other versions
CN115900639A
Inventor
刘建德
杨吉团
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Consys Technology Co ltd
Original Assignee
Shenzhen Consys Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Consys Technology Co ltd filed Critical Shenzhen Consys Technology Co ltd
Priority to CN202310213576.5A
Publication of CN115900639A
Application granted
Publication of CN115900639B
Status: Active

Landscapes

  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides a heading angle correction method and a server applied to a pan-tilt camera on an unmanned aerial vehicle, relating to heading angle correction technology. The method includes: acquiring first state information of the pan-tilt camera and a first image shot by the pan-tilt camera, and acquiring second state information and a second image after a certain time; determining the difference between the second state information and the first state information and, if the difference is within a preset range, determining the pixel displacement of the second image relative to the first image; determining a first heading angle of the pan-tilt camera from the first state information, the second state information, and the pixel displacement; and correcting the heading angle reading of the pan-tilt camera with the first heading angle. The first heading angle is determined from the pixel displacement between the two frames and the corresponding state information, and the heading angle measurement is then corrected with it. The correction method is suitable for correcting the heading angle measurement of a pan-tilt camera in motion, and the correction effect is better.

Description

Heading angle correction method and server applied to a pan-tilt camera on an unmanned aerial vehicle
Technical Field
The present disclosure relates to heading angle correction technology, and in particular to a heading angle correction method and a server applied to a pan-tilt camera on an unmanned aerial vehicle.
Background
Heading angle measurement for moving objects has important applications on dynamic carriers such as aircraft, ocean vessels, and vehicles. Traditional inertial autonomous navigation devices, such as the pan-tilt cameras used for aerial photography on unmanned aerial vehicles, mostly measure the heading angle with devices built around gyroscopes and accelerometers. In actual use, however, the heading angle measurement error of a pan-tilt camera on an unmanned aerial vehicle is large. How to correct the heading angle measurement of the pan-tilt camera while the unmanned aerial vehicle is in motion is a problem to be solved.
The prior art mostly uses static correction methods based on ground reference points. In such a scheme, several reference points are calibrated on the ground, the azimuth computed from their positions is compared with the heading angle reading of the sensor in a pan-tilt camera at a fixed position, and the sensor's heading angle measurement is corrected from the comparison. This approach works well for the inherent bias in static scenes.
However, when the unmanned aerial vehicle flies, the angle deviation changes dynamically as the attitude and speed change continuously, so this kind of correction of the heading angle measurement of a pan-tilt camera on a moving unmanned aerial vehicle performs poorly.
Disclosure of Invention
The present disclosure provides a heading angle correction method and a server applied to a pan-tilt camera on an unmanned aerial vehicle, to solve the prior-art problem that dynamic correction of the heading angle of a pan-tilt camera is ineffective.
According to a first aspect of the present disclosure, a course angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle is provided, including:
acquiring, from the unmanned aerial vehicle, first state information of a pan-tilt camera arranged on the unmanned aerial vehicle and a first image shot by the pan-tilt camera, and, after a certain time, acquiring second state information of the pan-tilt camera and a second image shot by the pan-tilt camera from the unmanned aerial vehicle;
determining a difference value between the second state information and the first state information, and if the difference value is determined to be within a preset range, determining pixel displacement of the second image relative to the first image according to the first image and the second image;
determining a first course angle of the cradle head camera according to the first state information, the second state information and the pixel displacement;
and correcting the course angle reading of the cradle head camera acquired from the unmanned aerial vehicle by using the first course angle.
According to a second aspect of the present disclosure, there is provided a course angle correction device applied to a pan-tilt camera on an unmanned aerial vehicle, including:
an acquisition unit, configured to acquire, from the unmanned aerial vehicle, first state information of a pan-tilt camera arranged on the unmanned aerial vehicle and a first image shot by the pan-tilt camera, and, after a certain time, to acquire second state information of the pan-tilt camera and a second image shot by the pan-tilt camera from the unmanned aerial vehicle;
a pixel displacement determining unit, configured to determine a difference between the second state information and the first state information, and if the difference is determined to be within a preset range, determine a pixel displacement of the second image relative to the first image according to the first image and the second image;
a course angle determining unit, configured to determine a first course angle of the pan-tilt camera according to the first state information, the second state information, and the pixel displacement;
and the correcting unit is used for correcting the course angle reading of the cradle head camera acquired from the unmanned aerial vehicle by using the first course angle.
According to a third aspect of the present disclosure, there is provided a server comprising a memory and a processor, wherein:
the memory is used for storing a computer program;
the processor is configured to read the computer program stored in the memory and, according to the computer program, execute the heading angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium, in which computer executable instructions are stored, which when executed by a processor, implement a course angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle according to the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a heading angle correction method as described in the first aspect applied to a pan-tilt camera on an unmanned aerial vehicle.
The heading angle correction method and server applied to a pan-tilt camera on an unmanned aerial vehicle provided by the present disclosure include: acquiring, from the unmanned aerial vehicle, first state information of a pan-tilt camera arranged on the unmanned aerial vehicle and a first image shot by the pan-tilt camera, and, after a certain time, acquiring second state information of the pan-tilt camera and a second image shot by the pan-tilt camera; determining the difference between the second state information and the first state information and, if the difference is within a preset range, determining the pixel displacement of the second image relative to the first image from the two images; determining a first heading angle of the pan-tilt camera from the first state information, the second state information, and the pixel displacement; and correcting the heading angle reading of the pan-tilt camera acquired from the unmanned aerial vehicle with the first heading angle. With this method, the first heading angle of the pan-tilt camera can be calculated from the pixel displacement between two frames and the state information of the pan-tilt camera at those two frames, and the heading angle measurement can then be corrected with the first heading angle. The correction method is suitable for correcting the heading angle measurement of a pan-tilt camera in motion and can achieve a better correction effect.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present disclosure, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart illustrating a course angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle according to an exemplary embodiment of the present disclosure;
fig. 2 is a flowchart illustrating a course angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle according to another exemplary embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating a solution of a first heading angle of a pan-tilt camera according to an exemplary embodiment of the present disclosure;
fig. 4 is a block diagram illustrating a course angle correction device applied to a pan-tilt camera on an unmanned aerial vehicle according to an exemplary embodiment of the present disclosure;
fig. 5 is a block diagram of a server shown in an exemplary embodiment of the present disclosure.
Detailed Description
Heading angle measurement for moving objects has important applications on dynamic carriers such as aircraft, ocean vessels, and vehicles. Traditional inertial autonomous navigation devices, such as the pan-tilt cameras used for aerial photography on unmanned aerial vehicles, mostly measure the heading angle with devices built around gyroscopes and accelerometers. In actual use, however, the heading angle measurement error of a pan-tilt camera on an unmanned aerial vehicle is large. In experiments, the heading angle measurement error of the pan-tilt camera varied roughly between 2° and -5° while the unmanned aerial vehicle was in motion; in particular, when the unmanned aerial vehicle turned around, the error in the heading angle measurement suddenly became large. How to correct the heading angle measurement of the pan-tilt camera while the unmanned aerial vehicle is in motion is a problem to be solved. The prior art mostly uses static correction methods based on ground reference points: several reference points are calibrated on the ground, the azimuth computed from their positions is compared with the heading angle reading of the sensor in a pan-tilt camera at a fixed position, and the sensor's heading angle measurement is corrected from the comparison. This approach works well for the inherent bias in static scenes.
However, when the unmanned aerial vehicle flies, the angle deviation changes dynamically as the attitude and speed change continuously, so this kind of correction of the heading angle measurement of a pan-tilt camera on a moving unmanned aerial vehicle performs poorly.
To solve the above technical problems, in the solution provided by the present disclosure, a first heading angle of the pan-tilt camera is calculated from the pixel displacement between two frames captured by the pan-tilt camera and the state information of the pan-tilt camera at those two frames, and the heading angle measurement is then corrected with the first heading angle. The correction method is suitable for correcting the heading angle measurement of a pan-tilt camera in motion and can achieve a better correction effect.
It should be noted that, the user information (including, but not limited to, user equipment information, user personal information, etc.) and the data (including, but not limited to, data for analysis, stored data, presented data, etc.) related to the present disclosure are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and be provided with corresponding operation entries for the user to select authorization or rejection.
The following describes the technical solutions of the present disclosure and how the technical solutions of the present disclosure solve the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present disclosure will be described below with reference to the accompanying drawings.
Fig. 1 is a flowchart illustrating a course angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle according to an exemplary embodiment of the present disclosure.
As shown in fig. 1, the heading angle correction method applied to the pan-tilt camera on the unmanned aerial vehicle provided in this embodiment includes:
Step 101, acquiring, from the unmanned aerial vehicle, first state information of a pan-tilt camera arranged on the unmanned aerial vehicle and a first image shot by the pan-tilt camera, and, after a certain time, acquiring second state information of the pan-tilt camera and a second image shot by the pan-tilt camera from the unmanned aerial vehicle.
The method provided by the present disclosure may be performed by a server with computing capability.
The server can acquire, from the unmanned aerial vehicle, the first state information of the pan-tilt camera arranged on the unmanned aerial vehicle and the first image shot by the pan-tilt camera. After a certain time, it acquires the second state information of the pan-tilt camera and the second image shot by the pan-tilt camera from the unmanned aerial vehicle again.
The state information of the pan-tilt camera may include position information, camera information, angle information, and the like of the pan-tilt camera.
Specifically, the certain time may be a preset time threshold. For example, it may be the time taken for the pan-tilt camera to capture 5 frames of images.
Step 102, determining a difference between the second state information and the first state information, and if the difference is determined to be within the preset range, determining the pixel displacement of the second image relative to the first image according to the first image and the second image.
The preset range is a value preset according to actual conditions.
The pixel displacement of the second image relative to the first image refers to the pixel-coordinate distance of the same object between the two images. For example, if a reference object is at (0, 10) in the first image and at (0, 50) in the second image, the pixel displacement is (0, 40).
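As a minimal illustration of this definition (the function name and tuple representation are our own, not taken from the patent), the displacement is simply a coordinate-wise difference:

```python
def pixel_displacement(p1, p2):
    # Displacement (dx, dy) of the same object from its position p1 in the
    # first image to its position p2 in the second image.
    return (p2[0] - p1[0], p2[1] - p1[1])

# The reference object from the example above: (0, 10) in the first image,
# (0, 50) in the second image.
shift = pixel_displacement((0, 10), (0, 50))  # (0, 40)
```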
Specifically, the second state information of the pan-tilt camera can be compared with the first state information to determine the difference between them; if the difference is within the preset range, it can be concluded that the second state information has changed little relative to the first state information.
Then, if the difference is within the preset range, the pixel displacement of the second image relative to the first image can be determined from the first image and the second image in a preset manner.
Step 103, determining a first course angle of the pan-tilt camera according to the first state information, the second state information and the pixel displacement.
The heading angle of the pan-tilt camera refers to the angle between magnetic north and the horizontal projection of the direction "directly above the picture" (the direction from the image center point to the midpoint of the top edge of the image); clockwise is positive and counterclockwise is negative.
For example, the first heading angle of the pan-tilt camera can be calculated in a preset manner from the position information of the pan-tilt camera included in the first state information and the second state information, and from the pixel displacement of the second image relative to the first image.
Step 104, correcting the heading angle reading of the pan-tilt camera acquired from the unmanned aerial vehicle by using the first heading angle.
Specifically, the heading angle reading of the pan-tilt camera arranged on the unmanned aerial vehicle can be obtained from the unmanned aerial vehicle. The reading is then corrected with the calculated first heading angle of the pan-tilt camera to obtain the corrected heading angle reading.
For example, the heading angle deviation between the heading angle reading and the first heading angle may be calculated; this deviation may then be added to the heading angle readings until a new first heading angle is calculated next time, yielding the corrected heading angle reading.
The heading angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle provided by the present disclosure includes: acquiring, from the unmanned aerial vehicle, first state information of a pan-tilt camera arranged on the unmanned aerial vehicle and a first image shot by the pan-tilt camera, and, after a certain time, acquiring second state information of the pan-tilt camera and a second image shot by the pan-tilt camera; determining the difference between the second state information and the first state information and, if the difference is within a preset range, determining the pixel displacement of the second image relative to the first image from the two images; determining a first heading angle of the pan-tilt camera from the first state information, the second state information, and the pixel displacement; and correcting the heading angle reading of the pan-tilt camera acquired from the unmanned aerial vehicle with the first heading angle. With this scheme, the first heading angle of the pan-tilt camera can be calculated from the pixel displacement between two frames and the state information of the pan-tilt camera at those two frames, and the heading angle measurement (that is, the heading angle reading) can then be corrected with the first heading angle. The correction method is suitable for correcting the heading angle measurement of a pan-tilt camera in motion and can achieve a better correction effect.
Fig. 2 is a flowchart illustrating a course angle correction method applied to a pan-tilt camera on an unmanned aerial vehicle according to another exemplary embodiment of the present disclosure.
As shown in fig. 2, the course angle correction method applied to the pan-tilt camera on the unmanned aerial vehicle provided in this embodiment includes:
step 201, acquiring first state information of a pan-tilt camera and a first image shot by the pan-tilt camera set on the unmanned aerial vehicle from the unmanned aerial vehicle, and acquiring second state information of the pan-tilt camera and a second image shot by the pan-tilt camera from the unmanned aerial vehicle after a certain time.
After step 201, either step 202 or step 203 may be performed.
Specifically, the first state information of the pan-tilt camera arranged on the unmanned aerial vehicle and the first image shot by the pan-tilt camera can be obtained from the unmanned aerial vehicle. After a certain time, the second state information of the pan-tilt camera and the second image shot by the pan-tilt camera are obtained from the unmanned aerial vehicle again.
In one implementation, the first state information includes a combination of one or more of the following: global positioning system information, zoom magnification, and angle information; the angle information includes at least one of: pitch angle, heading angle.
Specifically, the first state information of the pan-tilt camera may include one or more of the following combinations: global positioning system (Global Positioning System, GPS) information, zoom magnification, and angle information.
Wherein the GPS information includes longitude information, latitude information, and altitude information.
Wherein the angle information may include at least one of: pitch angle, heading angle.
The pitch angle of the pan-tilt camera refers to the angle between the optical axis of the pan-tilt camera and the horizontal; downward is negative and upward is positive.
Step 202, determining a difference between the second state information and the first state information; if the difference exceeds the preset range, continuing to perform the step of acquiring, after a certain time, second state information of the pan-tilt camera and a second image shot by the pan-tilt camera from the unmanned aerial vehicle, until the difference between two adjacent acquisitions of the pan-tilt camera's state information is detected to be within the preset range.
The first state information may include global positioning system information, zoom factors, and angle information, wherein the angle information may include a pitch angle, a heading angle. The second state information may include global positioning system information, zoom factors, and angle information, wherein the angle information may include a pitch angle, a heading angle.
Specifically, the global positioning system information, zoom magnification, pitch angle, and heading angle included in the first state information and the second state information can be compared item by item and the differences calculated. If at least one of the differences exceeds its preset range, the second state information of the pan-tilt camera has changed substantially relative to the first state information, and it is not appropriate to solve for the first heading angle from this pair of frames. In that case, the step of acquiring, after a certain time, second state information of the pan-tilt camera and a second image shot by the pan-tilt camera from the unmanned aerial vehicle may be performed again, until the difference between two adjacent acquisitions of the state information is detected to be within the preset range.
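A sketch of this gating check, assuming dictionary-style state snapshots; the field names and threshold values below are hypothetical, since the patent does not specify concrete values:

```python
def state_diff_within_range(state1, state2, thresholds):
    # True only if every tracked field changed by no more than its preset
    # threshold; otherwise the pair of frames is rejected and a new second
    # snapshot must be acquired.
    return all(abs(state2[k] - state1[k]) <= thresholds[k] for k in thresholds)

# Hypothetical snapshots: GPS (lon/lat/alt), zoom magnification, pitch and heading.
s1 = {"lon": 114.0578, "lat": 22.5431, "alt": 120.0, "zoom": 2.0, "pitch": -30.0, "yaw": 10.0}
s2 = {"lon": 114.0579, "lat": 22.5432, "alt": 121.0, "zoom": 2.0, "pitch": -31.0, "yaw": 12.0}
limits = {"lon": 0.001, "lat": 0.001, "alt": 5.0, "zoom": 0.5, "pitch": 5.0, "yaw": 5.0}

ok = state_diff_within_range(s1, s2, limits)  # True: proceed to pixel displacement
```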
Step 203, determining a difference value between the second state information and the first state information, and if the difference value is determined to be within the preset range, determining the pixel displacement of the second image relative to the first image according to the first image and the second image by adopting a fast tracking algorithm based on dense space-time context learning.
The first state information may include global positioning system information, zoom factors, and angle information, wherein the angle information may include a pitch angle, a heading angle. The second state information may include global positioning system information, zoom factors, and angle information, wherein the angle information may include a pitch angle, a heading angle.
Specifically, the global positioning system information, zoom magnification, pitch angle, and heading angle included in the first state information and the second state information can be compared item by item and the differences calculated. If the differences are within the preset range, the pixel displacement of the second image relative to the first image can be determined from the two images with the STC algorithm (Fast Tracking via Dense Spatio-Temporal Context Learning).
Step 204, determining a first position of the pan-tilt camera according to the global positioning system information included in the first state information; determining a second position of the pan-tilt camera according to the global positioning system information included in the second state information; and determining a line from the first position to the second position.
Specifically, the first position of the pan-tilt camera may be determined from the global positioning system information included in the first state information, and the second position from the global positioning system information included in the second state information; from these two positions, the line from the first position to the second position can then be determined.
As shown in fig. 3, A may represent the first position and B the second position, and the line from the first position to the second position may be expressed as the segment AB. The frame shown in fig. 3 is the frame in which the first image is located. The dashed line N indicates the direction of magnetic north.
Step 205, determining a first included angle between the connecting line and the longitudinal axis of the first image according to the pixel displacement.
The longitudinal axis of the first image refers to a line within the first image parallel to its y-axis. For example, if the resolution of the image is 1920×1080, the line from pixel (960, 1080) to pixel (960, 0) may be taken as the longitudinal axis of the image.
Specifically, as shown in fig. 3, the pixel displacement of the second image relative to the first image may be expressed as p(x, y). The first included angle between the line from the first position to the second position and the longitudinal axis of the first image (i.e., directly above the picture shown in fig. 3) may be denoted α.
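A sketch of how the first included angle could be computed from the pixel displacement p(x, y). The clockwise-positive sign convention and the use of the displacement vector directly (rather than its negation) are our assumptions, since the patent does not spell them out; note that image y-coordinates grow downward, so the image's "up" axis corresponds to -y:

```python
import math

def first_included_angle(dx, dy):
    # Angle (degrees, clockwise positive) between the displacement vector
    # (dx, dy) in pixel coordinates and the longitudinal (up) axis of the
    # image; the up axis is -y because image y grows downward.
    return math.degrees(math.atan2(dx, -dy))

alpha_up = first_included_angle(0, -10)    # displacement toward image top: 0.0
alpha_right = first_included_angle(10, 0)  # displacement toward image right: 90.0
```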
Step 206, determining a first course angle of the pan-tilt camera according to the first included angle.
Specifically, the first heading angle of the pan-tilt camera can be calculated in a preset manner from the calculated first included angle between the line from the first position to the second position and the longitudinal axis of the first image.
In one implementation, the second included angle between the line and magnetic north is determined according to the global positioning system information included in the first state information and the global positioning system information included in the second state information.
Specifically, a second included angle between the line from the first position to the second position and magnetic north may be determined from the longitude and latitude in the GPS information included in the first state information and in the second state information. As shown in fig. 3, this second included angle may be denoted β.
Then, the difference between the first included angle and the second included angle is determined as the first heading angle of the pan-tilt camera.
Specifically, as shown in fig. 3, ψ may represent the angle between magnetic north and the horizontal projection of the direction directly above the pan-tilt camera's picture (i.e., the longitudinal axis of the first image, directly above the picture shown in fig. 3), that is, the first heading angle of the pan-tilt camera. The difference between the first included angle α and the second included angle β is determined as the first heading angle, namely ψ = α − β.
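A sketch of how the second included angle (the bearing of the line from the first position to the second position) and the first heading angle (the difference between the first and second included angles) could be computed. The equirectangular small-distance bearing approximation and the wrap into [-180°, 180°) are our own simplifications; the patent does not specify the geodesic formula used:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    # Approximate bearing (degrees clockwise from north) of the line from
    # position 1 to position 2, using an equirectangular approximation that
    # is adequate for the short baseline between two nearby frames.
    dlat = lat2 - lat1
    dlon = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    return math.degrees(math.atan2(dlon, dlat))

def first_heading_angle(alpha, beta):
    # First heading angle = first included angle - second included angle,
    # wrapped into [-180, 180).
    return (alpha - beta + 180.0) % 360.0 - 180.0

beta = bearing_deg(22.5431, 114.0578, 22.5432, 114.0579)  # hypothetical positions
psi = first_heading_angle(30.0, beta)
```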
Step 207, obtaining a current heading angle reading of the pan-tilt camera from the unmanned aerial vehicle; obtaining the heading angle deviation at the previous moment; and determining the heading angle deviation at the current moment according to the current heading angle reading, the first heading angle, the heading angle deviation at the previous moment, and a preset sliding coefficient.
Specifically, the current heading angle reading of the pan-tilt camera can be obtained from the unmanned aerial vehicle, and the heading angle deviation at the previous moment can also be obtained.
Then, a first heading angle deviation may be determined based on the current heading angle reading and the first heading angle.
Then, the heading angle deviation at the current moment is determined from the first heading angle deviation, the heading angle deviation at the previous moment, and the preset sliding coefficient.
Illustratively, the heading angle deviation at the current time is as follows:
offset_avg := λ * offset_avg + (1 - λ) * offset
where offset_avg on the right side of the formula represents the heading angle deviation at the previous moment; offset_avg on the left side represents the heading angle deviation at the current moment; := denotes an update; offset represents the first heading angle deviation; and λ represents a preset sliding coefficient, for example λ = 0.9.
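The update rule above can be sketched directly, with λ = 0.9 as the example sliding coefficient:

```python
def update_offset_avg(offset_avg_prev, offset, lam=0.9):
    # Sliding (exponential moving) average of the heading angle deviation:
    # offset_avg := lam * offset_avg + (1 - lam) * offset
    return lam * offset_avg_prev + (1.0 - lam) * offset

avg = 0.0
for off in (2.0, 2.0, 2.0):  # repeated observations pull the average toward 2.0
    avg = update_offset_avg(avg, off)
# avg: 0 -> 0.2 -> 0.38 -> 0.542
```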
And step 208, determining the sum of the heading angle deviation at the current moment and the current heading angle reading as the corrected heading angle of the cradle head camera.
Specifically, the sum of the heading angle deviation at the current moment and the current heading angle reading can be determined as the corrected heading angle of the pan-tilt camera.
In summary, the method provided by this embodiment does not depend on an external angle sensor, so it has a wide application range and strong universality. It requires no prior calibration and no expensive instruments, and is simple and convenient to operate. The deviation is corrected dynamically with a moving average, which ensures both continuity and stability. The logic is simple, the computation is small and fast, and the correction effect is good.
Fig. 4 is a block diagram illustrating a course angle correction device applied to a pan-tilt camera on an unmanned aerial vehicle according to an exemplary embodiment of the present disclosure.
As shown in fig. 4, the course angle correction device 400 provided in the present disclosure applied to a pan-tilt camera on an unmanned aerial vehicle includes:
an acquiring unit 410, configured to acquire, from the unmanned aerial vehicle, first state information of a pan-tilt camera and a first image captured by the pan-tilt camera, which are set on the unmanned aerial vehicle, and acquire, after a certain time, second state information of the pan-tilt camera and a second image captured by the pan-tilt camera from the unmanned aerial vehicle;
a pixel displacement determining unit 420, configured to determine a difference between the second state information and the first state information, and if the difference is determined to be within a preset range, determine a pixel displacement of the second image relative to the first image according to the first image and the second image;
a course angle determining unit 430, configured to determine a first course angle of the pan-tilt camera according to the first state information, the second state information, and the pixel displacement;
and a correction unit 440 for correcting the heading angle reading of the pan-tilt camera acquired from the unmanned aerial vehicle by using the first heading angle.
The course angle determining unit 430 is specifically configured to determine a first position of the pan-tilt camera according to the global positioning system information included in the first state information; determining a second position of the pan-tilt camera according to the global positioning system information included in the second state information; determining a connection line from the first position to the second position;
determining a first included angle between the connecting line and a longitudinal axis of the first image according to the pixel displacement;
and determining a first course angle of the cradle head camera according to the first included angle.
The course angle determining unit 430 is specifically configured to determine a second included angle between the connection line and the magnetic north according to the global positioning system information included in the first state information and the global positioning system information included in the second state information;
and determining the difference between the first included angle and the second included angle as a first course angle of the cradle head camera.
The correction unit 440 is specifically configured to obtain a current heading angle reading of the pan-tilt camera from the unmanned aerial vehicle; the course angle deviation at the last moment is obtained;
determining the heading angle deviation at the current moment according to the current heading angle reading, the first heading angle, the heading angle deviation at the last moment and a preset sliding coefficient;
and determining the sum of the heading angle deviation at the current moment and the current heading angle reading as the corrected heading angle of the cradle head camera.
The pixel displacement determining unit 420 is specifically configured to determine, according to the first image and the second image, a pixel displacement of the second image relative to the first image by using a fast tracking algorithm based on dense space-time context learning.
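The patent specifies a fast tracking algorithm based on dense spatio-temporal context learning for this step. As a simpler, self-contained stand-in for estimating a purely translational pixel displacement, phase correlation via the FFT can be sketched as follows (this substitutes for, and is not, the patent's tracker):

```python
import numpy as np

def pixel_displacement(first_image, second_image):
    """Estimate the integer (dx, dy) translation of second_image relative to
    first_image via phase correlation (normalized FFT cross-power spectrum)."""
    f1 = np.fft.fft2(first_image)
    f2 = np.fft.fft2(second_image)
    cross = f2 * np.conj(f1)
    cross /= np.abs(cross) + 1e-12            # keep phase only; epsilon avoids /0
    corr = np.fft.ifft2(cross).real           # sharp peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint correspond to negative (wrapped) shifts.
    dy, dx = (int(p) if p <= s // 2 else int(p) - s
              for p, s in zip(peak, corr.shape))
    return dx, dy

# Circularly shift a random image by dx=5, dy=3 and recover the shift.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(np.roll(img, 3, axis=0), 5, axis=1)
print(pixel_displacement(img, shifted))  # (5, 3)
```

Phase correlation only recovers a global integer translation, which matches the role the pixel displacement plays here: the state-difference gate in unit 420 already ensures the two frames differ by little more than a translation.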
In one implementation, the first state information includes a combination of one or more of the following: global positioning system information, zoom magnification, and angle information;
the angle information includes at least one of: pitch angle, heading angle.
The pixel displacement determining unit 420 is further configured to, if the difference exceeds the preset range, continue to perform the step of obtaining the second state information of the pan-tilt camera and the second image captured by the pan-tilt camera from the unmanned aerial vehicle after a certain time, until the difference between two consecutive state information samples of the pan-tilt camera is detected to be within the preset range.
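The retry behaviour of this unit can be sketched as a polling loop; `get_state`, `get_image`, and `within_range` are hypothetical caller-supplied hooks, not interfaces from the patent:

```python
import time

def acquire_stable_pair(get_state, get_image, within_range,
                        interval_s=1.0, max_tries=10):
    """Keep sampling until two consecutive state samples differ by less than
    the preset range, then return ((state1, image1), (state2, image2)).
    get_state/get_image/within_range are caller-supplied hooks (assumed)."""
    s1, img1 = get_state(), get_image()
    for _ in range(max_tries):
        time.sleep(interval_s)          # "after a certain time"
        s2, img2 = get_state(), get_image()
        if within_range(s2, s1):        # difference within the preset range
            return (s1, img1), (s2, img2)
        s1, img1 = s2, img2             # slide forward: compare adjacent samples
    raise TimeoutError("state difference never entered the preset range")

# Usage with stubbed hooks: the 0 -> 10 jump is rejected, 10 -> 11 is accepted.
states = iter([0, 10, 11])
images = iter(["img0", "img1", "img2"])
pair = acquire_stable_pair(lambda: next(states), lambda: next(images),
                           lambda a, b: abs(a - b) <= 2, interval_s=0)
print(pair)  # ((10, 'img1'), (11, 'img2'))
```

Sliding the comparison window forward, rather than always comparing against the very first sample, is what the claim's "adjacent two times" wording describes.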
Fig. 5 is a block diagram of a server shown in an exemplary embodiment of the present disclosure.
As shown in fig. 5, the server provided in this embodiment includes:
a memory 501;
a processor 502; and
a computer program;
the computer program is stored in the memory 501 and configured to be executed by the processor 502 to implement any of the heading angle correction methods applied to the pan-tilt camera on the unmanned aerial vehicle as described above.
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program that is executed by a processor to implement any of the heading angle correction methods applied to a pan-tilt camera on an unmanned aerial vehicle as described above.
The embodiment also provides a computer program product, which comprises a computer program, and when the computer program is executed by a processor, the method for correcting the course angle of the cradle head camera on the unmanned aerial vehicle is realized.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (9)

1. The course angle correction method applied to the cradle head camera on the unmanned aerial vehicle is characterized by comprising the following steps of:
acquiring first state information of a cradle head camera and a first image shot by the cradle head camera which are arranged on an unmanned aerial vehicle from the unmanned aerial vehicle, and acquiring second state information of the cradle head camera and a second image shot by the cradle head camera from the unmanned aerial vehicle after a certain time;
determining a difference value between the second state information and the first state information, and if the difference value is determined to be within a preset range, determining pixel displacement of the second image relative to the first image according to the first image and the second image;
determining a first course angle of the cradle head camera according to the first state information, the second state information and the pixel displacement;
correcting the course angle reading of the cradle head camera acquired from the unmanned aerial vehicle by utilizing the first course angle;
the determining the first course angle of the pan-tilt camera according to the first state information, the second state information and the pixel displacement includes:
determining a first position of the pan-tilt camera according to global positioning system information included in the first state information;
determining a second position of the pan-tilt camera according to global positioning system information included in the second state information; and determining a connection line from the first location to the second location;
determining a first included angle between the connecting line and a longitudinal axis of the first image according to the pixel displacement;
and determining a first course angle of the cradle head camera according to the first included angle.
2. The method of claim 1, wherein determining the first heading angle of the pan-tilt camera based on the first included angle comprises:
determining a second included angle between the connecting line and the geomagnetic north according to the global positioning system information included in the first state information and the global positioning system information included in the second state information;
and determining the difference between the first included angle and the second included angle as a first course angle of the cradle head camera.
3. The method of claim 1, wherein correcting the heading angle reading of the pan-tilt camera obtained from the drone with the first heading angle comprises:
acquiring a current course angle reading of a pan-tilt camera from the unmanned aerial vehicle; the course angle deviation at the last moment is obtained;
determining the heading angle deviation at the current moment according to the current heading angle reading, the first heading angle, the heading angle deviation at the last moment and a preset sliding coefficient;
and determining the sum of the heading angle deviation at the current moment and the current heading angle reading as the corrected heading angle of the cradle head camera.
4. A method according to any one of claims 1-3, wherein said determining the pixel displacement of the second image relative to the first image from the first image and the second image comprises:
and determining the pixel displacement of the second image relative to the first image according to the first image and the second image by adopting a fast tracking algorithm based on dense space-time context learning.
5. A method according to any one of claims 1-3, wherein
the first status information further includes one or more of the following combinations: zoom multiple, and angle information;
the angle information includes at least one of: pitch angle, heading angle.
6. A method according to any one of claims 1-3, wherein the method further comprises:
if the difference is determined to be beyond the preset range, continuing to perform the step of obtaining the second state information of the pan-tilt camera and the second image captured by the pan-tilt camera from the unmanned aerial vehicle after the certain time, until the difference between two consecutive state information samples of the pan-tilt camera is detected to be within the preset range.
7. Course angle correcting device applied to cradle head camera on unmanned aerial vehicle, which is characterized by comprising:
the acquisition unit is used for acquiring first state information of a tripod head camera and a first image shot by the tripod head camera which are arranged on the unmanned aerial vehicle from the unmanned aerial vehicle, and acquiring second state information of the tripod head camera and a second image shot by the tripod head camera from the unmanned aerial vehicle after a certain time;
a pixel displacement determining unit, configured to determine a difference between the second state information and the first state information, and if the difference is determined to be within a preset range, determine a pixel displacement of the second image relative to the first image according to the first image and the second image;
a course angle determining unit, configured to determine a first course angle of the pan-tilt camera according to the first state information, the second state information, and the pixel displacement;
the correcting unit is used for correcting the course angle reading of the cradle head camera acquired from the unmanned aerial vehicle by using the first course angle;
the course angle determining unit is specifically configured to determine a first position of the pan-tilt camera according to global positioning system information included in the first state information;
determining a second position of the pan-tilt camera according to global positioning system information included in the second state information; and determining a connection line from the first location to the second location;
determining a first included angle between the connecting line and a longitudinal axis of the first image according to the pixel displacement;
and determining a first course angle of the cradle head camera according to the first included angle.
8. A server comprising a memory and a processor; wherein
the memory is used for storing a computer program;
the processor being configured to read a computer program stored in the memory and to perform the method according to any of the preceding claims 1-6 according to the computer program in the memory.
9. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor implement the method of any of the preceding claims 1-6.
CN202310213576.5A 2023-03-08 2023-03-08 Course angle correction method and server applied to cradle head camera on unmanned aerial vehicle Active CN115900639B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310213576.5A CN115900639B (en) 2023-03-08 2023-03-08 Course angle correction method and server applied to cradle head camera on unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN115900639A CN115900639A (en) 2023-04-04
CN115900639B true CN115900639B (en) 2023-05-30

Family

ID=85746755

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310213576.5A Active CN115900639B (en) 2023-03-08 2023-03-08 Course angle correction method and server applied to cradle head camera on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN115900639B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3081902B1 (en) * 2014-03-24 2019-04-17 SZ DJI Technology Co., Ltd. Method and apparatus for correcting aircraft state in real time
CN107270904B (en) * 2017-06-23 2020-07-03 西北工业大学 Unmanned aerial vehicle auxiliary guide control system and method based on image registration
US10685229B2 (en) * 2017-12-21 2020-06-16 Wing Aviation Llc Image based localization for unmanned aerial vehicles, and associated systems and methods
CN108917752B (en) * 2018-03-30 2022-11-11 深圳一清创新科技有限公司 Unmanned ship navigation method, device, computer equipment and storage medium


Similar Documents

Publication Publication Date Title
US7071970B2 (en) Video augmented orientation sensor
JP6663040B2 (en) Depth information acquisition method and apparatus, and image acquisition device
US10147201B2 (en) Method of determining a direction of an object on the basis of an image of the object
US9160980B2 (en) Camera-based inertial sensor alignment for PND
US8933986B2 (en) North centered orientation tracking in uninformed environments
CN106814753B (en) Target position correction method, device and system
CN108827341B (en) Method for determining a deviation in an inertial measurement unit of an image acquisition device
EP2915139B1 (en) Adaptive scale and gravity estimation
CN107560603B (en) Unmanned aerial vehicle oblique photography measurement system and measurement method
US20120317825A1 (en) Direction determining method and apparatus using a triaxial electronic compass
CN113029128B (en) Visual navigation method and related device, mobile terminal and storage medium
CN108780577A (en) Image processing method and equipment
US9451166B1 (en) System and method for imaging device motion compensation
CN112204946A (en) Data processing method, device, movable platform and computer readable storage medium
CN110337668B (en) Image stability augmentation method and device
CN108444452B (en) Method and device for detecting longitude and latitude of target and three-dimensional space attitude of shooting device
US20220174217A1 (en) Image processing method and device, electronic device, and computer-readable storage medium
CN116523748A (en) Image stitching method, device, storage medium and unmanned aerial vehicle
CN109462717A (en) Electronic image stabilization method and terminal
CN115900639B (en) Course angle correction method and server applied to cradle head camera on unmanned aerial vehicle
CN110800023A (en) Image processing method and equipment, camera device and unmanned aerial vehicle
TWI726536B (en) Image capturing method and image capturing apparatus
CN113063434B (en) Precision evaluation method and system for satellite pointing fixed star
US11415990B2 (en) Optical object tracking on focal plane with dynamic focal length
US9210384B2 (en) System and method for real time registration of images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant