WO2024114376A1 - Method, apparatus, device and storage medium for automatic target tracking by an unmanned aerial vehicle gimbal - Google Patents
Method, apparatus, device and storage medium for automatic target tracking by an unmanned aerial vehicle (UAV) gimbal - Download PDF / Info
- Publication number
- WO2024114376A1 (PCT application PCT/CN2023/131693)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target area
- particle
- gimbal
- current frame
- frame image
- Prior art date: 2022-12-02
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Definitions
- The present invention relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to a method, device, equipment and storage medium for automatic target tracking by a UAV gimbal.
- The embodiments of the present invention aim to provide a method, device, equipment and storage medium for automatic target tracking by a UAV gimbal, so as to solve the problems in the prior art that, when observing a specific object, the gimbal angle must be adjusted manually, which makes operation troublesome and causes the object to be easily lost.
- The embodiments of the present invention provide the following technical solutions:
- A method for a drone gimbal to automatically track a target, comprising: obtaining a selected target area and determining features of the selected target area; initializing a predetermined number of particles; obtaining a current frame image, obtaining a corresponding predicted particle for each particle according to a preset state transition equation, calculating, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculating the weight of the predicted particle according to the similarity; determining the position of the target area in the current frame image according to the weights of the predicted particles; and calculating, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and controlling the gimbal to rotate accordingly.
- A device for automatically tracking a target by a gimbal of an unmanned aerial vehicle, comprising:
- a feature determination module, configured to obtain a selected target area and determine features of the selected target area;
- a particle initialization module, configured to initialize a predetermined number of particles;
- a particle prediction module, configured to obtain a current frame image, obtain a corresponding predicted particle for each particle according to a preset state transition equation, calculate, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity;
- a target area determination module, configured to determine the position of the target area in the current frame image according to the weights of the predicted particles;
- a gimbal control module, configured to calculate, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and control the gimbal to rotate accordingly according to the gimbal rotation angle.
- An electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein when the processor executes the program, the steps of the method for automatically tracking a target by a drone gimbal as described in any one of the above items are implemented.
- A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform any of the above-described methods for automatically tracking a target by a drone gimbal.
- The method, device, equipment and storage medium for automatic target tracking by a UAV gimbal according to the embodiments of the present invention obtain a selected target area and determine its features, then initialize a predetermined number of particles. According to the state transition equation, each particle yields a predicted particle, and the feature similarity between the image at the position of each predicted particle and the selected target area is calculated, thereby determining the position of the target area in the current frame image. The gimbal rotation angle required to shift the target area to the center of the lens image is then calculated from that position, and the gimbal is controlled to rotate accordingly. The observed object is thus kept at the center of the picture without manual adjustment, which reduces labor and time costs, improves work efficiency, and ensures the stability of the gimbal and the camera image quality during adjustment.
- FIG1 is a flow chart of an optional method for automatically tracking a target by a drone gimbal provided in Embodiment 1 of the present invention;
- FIG2 is a schematic diagram of an optional target area and particle distribution area provided in Embodiment 1 of the present invention;
- FIG3 is a flow chart of another optional method for automatically tracking a target by a drone gimbal provided in Embodiment 1 of the present invention;
- FIG4 is a schematic structural diagram of an optional device for automatically tracking a target by a drone gimbal provided in Embodiment 2 of the present invention;
- FIG5 is a schematic structural diagram of an optional electronic device provided in Embodiment 3 of the present invention.
- Referring to FIG1, which is a flow chart of an optional method for automatically tracking a target by a drone gimbal provided in Embodiment 1 of the present invention; the method is applied to a drone.
- The method includes:
- Step S101: obtain a selected target area and determine the features of the selected target area.
- A user manually selects, in an image captured by the gimbal camera, a target area containing the tracked object; the target area can be represented by a rectangular box.
- A color histogram of the target area is calculated and used as the feature of the selected target area.
- A color histogram is a count of the number of pixels falling within given pixel-value ranges. For example, statistics can be collected for the three color channels of red, green and blue to obtain the pixel distribution on each channel.
- Using the color histogram as the feature of the selected target area reduces the influence of the distance of the tracked object, because its color distribution is roughly the same regardless of the distance.
- Contour features or texture features of the target area can further be extracted as features of the target area.
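To make the feature concrete, below is a minimal Python sketch of a normalized per-channel color histogram for an 8-bit RGB patch. The 8-bins-per-channel resolution and the function name are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def color_histogram(patch: np.ndarray, bins: int = 8) -> np.ndarray:
    """Normalized per-channel color histogram of an 8-bit RGB patch.

    patch: H x W x 3 uint8 array (the selected target region).
    Returns a flat vector of length 3 * bins whose entries sum to 1.
    """
    hists = []
    for c in range(3):                      # red, green, blue channels
        h, _ = np.histogram(patch[:, :, c], bins=bins, range=(0, 256))
        hists.append(h)
    hist = np.concatenate(hists).astype(np.float64)
    return hist / max(hist.sum(), 1e-12)    # normalize so patch size/scale does not matter
```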
- Step S102: initialize a predetermined number of particles.
- The structure of a particle includes information such as the particle size, position coordinates, rectangular window size and weight; all particles are initialized with the same parameter values.
- The rectangular window size of a particle is the size of the selected target area.
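The particle structure described above could be represented in code as flat arrays rather than individual structs; the field names and dictionary layout in the following sketch are assumptions for illustration only.

```python
import numpy as np

def init_particles(n: int, target_xywh: tuple) -> dict:
    """Initialize n particles with identical state (a sketch; field names assumed).

    target_xywh: (x, y, w, h) of the selected target rectangle; each particle
    stores a position, the target's rectangular window size and a weight.
    """
    x, y, w, h = target_xywh
    cx, cy = x + w / 2.0, y + h / 2.0
    return {
        "pos":    np.tile([cx, cy], (n, 1)).astype(np.float64),  # particle centers
        "window": (w, h),                                        # window = selected target size
        "weight": np.full(n, 1.0 / n),                           # identical initial weights
    }
```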
- Step S103: obtain the current frame image; according to a preset state transition equation, each particle yields a corresponding predicted particle; for each predicted particle, calculate the similarity between the features at its position in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity.
- The state transition equation is a Gaussian prediction state transition equation.
- In one embodiment, a predetermined number of particles are evenly distributed near the selected target area based on the state transition equation. As shown in FIG2, a particle distribution area is determined near the target area, and all particles are evenly arranged within the particle distribution area (including the target area).
- In other embodiments, the predetermined number of particles are distributed over the image plane according to a Gaussian distribution based on the state transition equation, with more particles closer to the target area and fewer particles farther from it.
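A sketch of the Gaussian prediction step described above: each particle's center is perturbed by zero-mean Gaussian noise, which naturally concentrates predicted particles near the current estimate. The noise standard deviation is a tuning assumption; the patent does not specify its value.

```python
import numpy as np

def predict(particles: dict, sigma: float = 15.0,
            rng: np.random.Generator = np.random.default_rng()) -> np.ndarray:
    """Gaussian state-transition sketch: each particle yields one predicted particle
    by adding zero-mean Gaussian noise to its (x, y) center.
    sigma is an assumed spread in pixels, not a value from the patent."""
    noise = rng.normal(0.0, sigma, size=particles["pos"].shape)
    return particles["pos"] + noise          # predicted particle centers
```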
- Each particle yields a corresponding predicted particle, and the features of each predicted particle in the current frame image are calculated.
- These features are calculated in the same way as the features of the selected target area.
- For example, if the features of the selected target area are obtained by calculating the color histogram of the target area, the features of a predicted particle are likewise obtained by calculating the color histogram of the image at the predicted particle's position in the current frame.
- The weight of a predicted particle is determined by comparing the similarity of the two color histograms: the higher the similarity, the higher the weight; the lower the similarity, the lower the weight.
- Calculating the weight of a predicted particle according to the similarity includes: first calculating the original weight of each predicted particle according to the similarity between that predicted particle and the selected target area, and then normalizing the original weights to obtain the weight of each predicted particle.
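A sketch of this weighting step follows. The patent only requires a histogram similarity; the Bhattacharyya coefficient is used here as one common choice (an assumption), and the resulting weights are normalized to sum to 1. `color_histogram` is the helper sketched earlier.

```python
import numpy as np

def weigh(pred_pos, frame, window, target_hist, bins: int = 8) -> np.ndarray:
    """Weight each predicted particle by the similarity between the color histogram
    at its position in the frame and the target histogram (Bhattacharyya coefficient
    as an assumed similarity measure), then normalize the weights to sum to 1."""
    w, h = window
    H, W = frame.shape[:2]
    weights = np.zeros(len(pred_pos))
    for i, (cx, cy) in enumerate(pred_pos):
        x0 = int(np.clip(cx - w / 2, 0, W - w))   # keep the window inside the frame
        y0 = int(np.clip(cy - h / 2, 0, H - h))
        patch = frame[y0:y0 + int(h), x0:x0 + int(w)]
        hist = color_histogram(patch, bins)               # feature at the predicted position
        weights[i] = np.sum(np.sqrt(hist * target_hist))  # Bhattacharyya coefficient
    return weights / max(weights.sum(), 1e-12)            # normalized weights
```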
- Step S104: determine the position of the target area in the current frame image according to the weights of the predicted particles.
- A predicted particle with a high weight indicates that the features at its position are more similar to those of the selected target area.
- The position of the predicted particle with the largest weight is taken as the position of the target area in the current frame image.
- Step S105: according to the position of the target area in the current frame image, calculate the gimbal rotation angle required to shift the target area to the center of the lens image, and control the gimbal to rotate accordingly according to the gimbal rotation angle.
- The target area can be determined by the coordinates of the upper-left corner of its rectangular box together with the length and width of the box.
- The center coordinates of the target area are obtained from the upper-left coordinates and the box's length and width, and the pixel difference between the center of the target area and the center of the lens image is calculated.
- The gimbal rotation angle required to shift the target area to the center of the lens image is calculated from this pixel difference.
- The gimbal is controlled to rotate accordingly so that the tracked object is displayed at the center of the picture.
- The gimbal rotation angle includes a pitch angle and a yaw angle.
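One way to turn the pixel difference into pitch and yaw commands is a pinhole-camera model with a known lens field of view. The patent does not give the conversion formula, so the function, field-of-view values and sign conventions below are illustrative assumptions only.

```python
import math

def gimbal_angles(target_cx, target_cy, img_w, img_h,
                  hfov_deg: float = 62.0, vfov_deg: float = 38.0):
    """Pixel offset -> (yaw, pitch) in degrees under a pinhole-camera assumption.
    hfov_deg / vfov_deg are illustrative field-of-view values, not from the patent."""
    dx = target_cx - img_w / 2.0              # positive: target right of image center
    dy = target_cy - img_h / 2.0              # positive: target below image center
    fx = (img_w / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)  # focal length in pixels
    fy = (img_h / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    yaw = math.degrees(math.atan2(dx, fx))    # positive yaw turns right (assumed convention)
    pitch = -math.degrees(math.atan2(dy, fy)) # positive pitch tilts up (assumed convention)
    return yaw, pitch
```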
- The particles are then resampled according to the weights of the predicted particles.
- The resampling method is: keep the total number of particles unchanged, sample high-weight particles more and low-weight particles less.
- Resampling is performed in proportion to the particle weight multiplied by the total number of particles, that is, a particle with a larger weight yields more sampled particles.
- The resampled particles are used as the samples for the next frame image and substituted into the state transition equation to obtain new predicted particles.
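A sketch of weight-proportional resampling with a fixed particle count (multinomial resampling), matching the description that high-weight particles are sampled more and low-weight particles less. Resetting the weights to uniform afterwards is a common convention, not something mandated by the patent.

```python
import numpy as np

def resample(pred_pos: np.ndarray, weights: np.ndarray,
             rng: np.random.Generator = np.random.default_rng()):
    """Multinomial resampling sketch: keep the total particle count unchanged and
    draw each new particle with probability proportional to its weight, so
    high-weight particles are duplicated and low-weight particles tend to vanish."""
    n = len(weights)
    idx = rng.choice(n, size=n, p=weights)      # weight-proportional draw
    new_pos = pred_pos[idx].copy()
    new_weights = np.full(n, 1.0 / n)           # weights reset to uniform after resampling
    return new_pos, new_weights
```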
- Referring to FIG3, which is a flow chart of another optional method for the drone gimbal to automatically track a target provided in Embodiment 1 of the present invention. The specific steps are as follows:
- Step S301 selecting a target area.
- the user can manually select a target area containing the tracked object in the image captured by the gimbal camera, or input coordinate information of the target area.
- Step S302 capturing the current frame image.
- Step S303 determining whether the current frame image is the first frame image after the target area is selected, if so, proceeding to step S304; otherwise, proceeding to step S306.
- Step S304 calculating the color histogram of the selected target area.
- This color histogram is the characteristic of the selected target area.
- Step S305 initialize a predetermined number of particles, and then return to step S302.
- Step S306 according to the preset state transition equation, each particle yields a predicted particle.
- Based on the state transition equation, the particles are evenly distributed near the selected target area.
- Step S307 calculate the color histogram at the location of each predicted particle, compare it with the color histogram of the selected target area, obtain the similarity between the two, and determine the weight of each predicted particle according to the similarity.
- Step S308 normalize the weight of each predicted particle so that the sum of the weights of all predicted particles is 1.
- Step S309 taking the position of the particle with the largest weight as the position of the target area in the current frame image.
- Step S310 according to the position of the target area in the current frame image, calculate the angle by which the gimbal needs to rotate to shift the target area to the center of the picture, and control the gimbal to rotate to the corresponding position based on this angle.
- The angle by which the gimbal needs to rotate includes a pitch angle and a yaw angle.
- Step S311 keeping the total number of particles unchanged, resample the particles so that high-weight particles are sampled more times and low-weight particles fewer times; a consolidated sketch of this loop is given below.
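Steps S301 to S311 can be tied together in a short loop. The following sketch composes the helper functions from the earlier snippets (all of them illustrative assumptions rather than the patent's implementation) and leaves the actual gimbal command, which is hardware-specific, as a comment.

```python
import numpy as np

def track(frames, target_xywh, n_particles: int = 200, bins: int = 8):
    """End-to-end sketch of steps S301-S311; `frames` is an iterator of RGB frames."""
    first = next(frames)                                             # S302 (first frame)
    x, y, w, h = target_xywh                                         # S301 (selected region)
    target_hist = color_histogram(first[y:y + h, x:x + w], bins)     # S303-S304
    particles = init_particles(n_particles, target_xywh)             # S305
    for frame in frames:                                             # S302 (later frames)
        pred = predict(particles)                                    # S306
        weights = weigh(pred, frame, particles["window"],
                        target_hist, bins)                           # S307-S308
        best = pred[int(np.argmax(weights))]                         # S309
        yaw, pitch = gimbal_angles(best[0], best[1],
                                   frame.shape[1], frame.shape[0])   # S310
        # send (yaw, pitch) to the gimbal controller here (hardware-specific)
        particles["pos"], particles["weight"] = resample(pred, weights)  # S311
```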
- The method for automatically tracking a target by a drone gimbal provided in this embodiment obtains a selected target area, determines the features of the selected target area, and then initializes a predetermined number of particles. According to the state transition equation, each particle yields a predicted particle, and the feature similarity between the image at the position of each predicted particle and the selected target area is calculated, thereby determining the position of the target area in the current frame image. The gimbal rotation angle required to shift the target area to the center of the lens image is then calculated from that position, and the gimbal is controlled to rotate accordingly, so that the observed object stays at the center of the picture without manual adjustment.
- a device for automatically tracking a target by a drone gimbal is provided.
- As shown in FIG4, which is a schematic structural diagram of an optional device for automatically tracking a target by a drone gimbal provided in Embodiment 2 of the present invention,
- the device 400 for automatically tracking a target by a drone gimbal includes:
- a feature determination module 402 is configured to obtain a selected target area and determine features of the selected target area
- a particle initialization module 404 is configured to initialize a predetermined number of particles
- the particle prediction module 406 is configured to obtain a current frame image, obtain a corresponding predicted particle for each particle according to a preset state transition equation, calculate, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity;
- a target region determination module 408 is configured to determine the position of the target region in the current frame image according to the weight of each predicted particle
- the gimbal control module 410 is configured to calculate, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and to control the gimbal to rotate accordingly according to the gimbal rotation angle.
- The above device can perform the method for automatically tracking a target by a drone gimbal described with reference to FIG1 to FIG3 in Embodiment 1, and has the functional modules and beneficial effects corresponding to the method.
- As shown in FIG5, which is a schematic structural diagram of an optional electronic device provided in Embodiment 3 of the present invention, the electronic device may include a processor 501, a communication interface 502, a memory 503 and a communication bus 504, where the processor 501, the communication interface 502 and the memory 503 communicate with one another via the communication bus 504.
- The processor 501 may call logic instructions in the memory 503 to perform the method for automatically tracking a target by a drone gimbal, the method including: obtaining a selected target area and determining the features of the selected target area; initializing a predetermined number of particles; obtaining the current frame image, obtaining a corresponding predicted particle for each particle according to a preset state transition equation, calculating, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculating the weight of the predicted particle according to the similarity; determining the position of the target area in the current frame image according to the weights of the predicted particles; and calculating, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and controlling the gimbal to rotate accordingly according to the gimbal rotation angle.
- The logic instructions in the memory 503 may be implemented in the form of software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part that contributes to the prior art, may be embodied in the form of a software product.
- The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method described with reference to FIG1 to FIG3 in Embodiment 1 of the present invention.
- The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
- The above product can perform the method for automatically tracking a target by a drone gimbal described with reference to FIG1 to FIG3 in Embodiment 1, and has the functional modules and beneficial effects corresponding to the method.
- For technical details not described in detail here, refer to the method for automatically tracking a target by a drone gimbal provided with reference to FIG1 to FIG3 in Embodiment 1 of the present invention.
- A computer-readable storage medium is provided, on which a computer program is stored.
- When the computer program is executed by a processor, the processor performs the steps of the method for automatically tracking a target by a drone gimbal described in Embodiment 1.
- The above product can perform any of the methods for automatically tracking a target by a drone gimbal described in Embodiment 1, and has the functional modules and beneficial effects corresponding to the method.
- For technical details not described in detail here, refer to the method for automatically tracking a target by a drone gimbal provided in Embodiment 1 of the present invention.
- Each embodiment can be implemented by means of software plus a general-purpose hardware platform, or, of course, by hardware.
- The above technical solution, in essence, or the part that contributes to the related art, may be embodied in the form of a software product; the computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the methods described in the embodiments or in parts of the embodiments.
- The method, device, equipment and storage medium for automatic target tracking by a UAV gimbal according to the embodiments of the present invention obtain a selected target area and determine its features, then initialize a predetermined number of particles. According to the state transition equation, each particle yields a predicted particle, and the feature similarity between the image at the position of each predicted particle and the selected target area is calculated, thereby determining the position of the target area in the current frame image. The gimbal rotation angle required to shift the target area to the center of the lens image is then calculated from that position, and the gimbal is controlled to rotate accordingly.
Landscapes
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
A method, apparatus (400), device and storage medium for automatic target tracking by a UAV gimbal. The method includes: obtaining a selected target area and determining features of the selected target area (S101); initializing a predetermined number of particles (S102); obtaining, for each particle, a predicted particle according to a state transition equation, and calculating the feature similarity between the image at the position of each predicted particle and the selected target area (S103); determining the position of the target area in the current frame image (S104); and calculating, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and controlling the gimbal to rotate accordingly according to the rotation angle (S105), thereby ensuring the stability of the gimbal and the camera image quality during adjustment.
Description
The present invention relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to a method, apparatus, device and storage medium for automatic target tracking by a UAV gimbal.
In recent years, with the rapid development of computer technology, automatic control technology and wireless communication technology, civil applications of UAVs have attracted increasing attention. An ordinary UAV gimbal camera can only take photos and record video, and the gimbal merely stabilizes the lens; when observing a specific object, the operator must manually adjust the gimbal angle in order to observe it. Because a UAV cruises at a relatively high speed, manual operation is troublesome and the object is easily lost.
Embodiments of the present invention aim to provide a method, apparatus, device and storage medium for automatic target tracking by a UAV gimbal, so as to solve the problems in the prior art that, when observing a specific object, the gimbal angle must be adjusted manually, which makes operation troublesome and causes the object to be easily lost.
To solve the above technical problem, embodiments of the present invention provide the following technical solutions:
According to one aspect of the present invention, a method for automatic target tracking by a UAV gimbal is provided, the method comprising:
obtaining a selected target area and determining features of the selected target area;
initializing a predetermined number of particles;
obtaining a current frame image, obtaining, for each particle, a corresponding predicted particle according to a preset state transition equation, calculating, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculating the weight of the predicted particle according to the similarity;
determining the position of the target area in the current frame image according to the weights of the predicted particles; and
calculating, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and controlling the gimbal to rotate accordingly according to the gimbal rotation angle.
According to another aspect of the present invention, an apparatus for automatic target tracking by a UAV gimbal is provided, the apparatus comprising:
a feature determination module, configured to obtain a selected target area and determine features of the selected target area;
a particle initialization module, configured to initialize a predetermined number of particles;
a particle prediction module, configured to obtain a current frame image, obtain, for each particle, a corresponding predicted particle according to a preset state transition equation, calculate, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity;
a target area determination module, configured to determine the position of the target area in the current frame image according to the weights of the predicted particles; and
a gimbal control module, configured to calculate, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and control the gimbal to rotate accordingly according to the gimbal rotation angle.
According to a further aspect of the present invention, an electronic device is provided, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein when the processor executes the program, the steps of the method for automatic target tracking by a UAV gimbal described in any one of the above items are implemented.
According to yet another aspect of the present invention, a computer-readable storage medium is provided, the computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the method for automatic target tracking by a UAV gimbal described in any one of the above items.
The method, apparatus, device and storage medium for automatic target tracking by a UAV gimbal according to the embodiments of the present invention obtain a selected target area and determine its features, then initialize a predetermined number of particles; according to the state transition equation, each particle yields a predicted particle, and the feature similarity between the image at the position of each predicted particle and the selected target area is calculated, thereby determining the position of the target area in the current frame image; the gimbal rotation angle required to shift the target area to the center of the lens image is then calculated from that position, and the gimbal is controlled to rotate accordingly. With the present invention, the observed object is kept at the center of the picture without manual adjustment, which reduces labor and time costs, improves work efficiency, and ensures the stability of the gimbal and the camera image quality during adjustment.
One or more embodiments are exemplarily illustrated by the figures in the corresponding drawings; these exemplary illustrations do not constitute limitations on the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures in the drawings are not drawn to scale.
FIG1 is a flow chart of an optional method for automatic target tracking by a UAV gimbal provided in Embodiment 1 of the present invention;
FIG2 is a schematic diagram of an optional target area and particle distribution area provided in Embodiment 1 of the present invention;
FIG3 is a flow chart of another optional method for automatic target tracking by a UAV gimbal provided in Embodiment 1 of the present invention;
FIG4 is a schematic structural diagram of an optional apparatus for automatic target tracking by a UAV gimbal provided in Embodiment 2 of the present invention;
FIG5 is a schematic structural diagram of an optional electronic device provided in Embodiment 3 of the present invention.
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
In addition, the technical features involved in the various embodiments of the present invention described below may be combined with one another as long as they do not conflict with one another.
Embodiment 1
Referring to FIG1, FIG1 is a flow chart of an optional method for automatic target tracking by a UAV gimbal provided in Embodiment 1 of the present invention; the method is applied to a UAV. The method includes:
Step S101: obtain a selected target area and determine features of the selected target area.
In one embodiment of the present invention, the user manually selects, in an image captured by the gimbal camera, a target area containing the tracked object; the target area may be represented by a rectangular box. A color histogram of the target area is calculated and used as the feature of the selected target area. A color histogram is a count of the number of pixels falling within given pixel-value ranges; for example, statistics may be collected for the three color channels of red, green and blue to obtain the pixel distribution on each channel. Using the color histogram as the feature of the selected target area reduces the influence of the distance of the tracked object, because its color distribution is roughly the same regardless of the distance. In some embodiments, contour features or texture features of the target area may further be extracted as features of the target area.
Step S102: initialize a predetermined number of particles.
Specifically, structures for a predetermined number of particles are initialized, and the parameters in the structures of all particles are initialized to the same values. The structure of a particle includes information such as the particle size, position coordinates, rectangular window size and weight, where the rectangular window size of a particle is the size of the selected target area.
Step S103: obtain the current frame image; according to a preset state transition equation, each particle yields a corresponding predicted particle; for each predicted particle, calculate the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity.
In one embodiment of the present invention, the state transition equation is a Gaussian prediction state transition equation. In one embodiment, based on the state transition equation, the predetermined number of particles are evenly distributed near the selected target area. As shown in FIG2, a particle distribution area is determined near the target area, and all particles are evenly arranged within the particle distribution area (including the target area). In other embodiments, based on the state transition equation, the predetermined number of particles are distributed over the image plane according to a Gaussian distribution, with more particles closer to the target area and fewer particles farther from it.
Through the state transition equation, each particle yields a corresponding predicted particle, and the features of each predicted particle in the current frame image are calculated. These features are calculated in the same way as the features of the selected target area. For example, if the features of the selected target area are obtained by calculating the color histogram of the target area, the features of a predicted particle are likewise obtained by calculating the color histogram of the image at the predicted particle's position in the current frame. The weight of a predicted particle is determined by comparing the similarity of the two color histograms: the higher the similarity, the higher the weight, and the lower the similarity, the lower the weight.
In one embodiment of the present invention, calculating the weight of a predicted particle according to the similarity includes: first calculating the original weight of each predicted particle according to the similarity between that predicted particle and the selected target area, and then normalizing the original weights of the predicted particles to obtain the weight of each predicted particle.
Step S104: determine the position of the target area in the current frame image according to the weights of the predicted particles.
A predicted particle with a high weight indicates that the features at its position are more similar to those of the selected target area. In one embodiment, the position of the predicted particle with the largest weight is taken as the position of the target area in the current frame image.
Step S105: according to the position of the target area in the current frame image, calculate the gimbal rotation angle required to shift the target area to the center of the lens image, and control the gimbal to rotate accordingly according to the gimbal rotation angle.
Specifically, the target area can be determined by the coordinates of the upper-left corner of its rectangular box and the length and width of the box. From the upper-left coordinates and the box's length and width, the center coordinates of the target area are obtained, the pixel difference between the center of the target area and the center of the lens image is calculated, and from this pixel difference the gimbal rotation angle required to shift the target area to the center of the lens image is calculated; the gimbal is then controlled to rotate accordingly so that the tracked object is displayed at the center of the picture. The gimbal rotation angle includes a pitch angle and a yaw angle.
Further, the particles are resampled according to the weights of the predicted particles. The resampling method is: keep the total number of particles unchanged, sample high-weight particles more and low-weight particles less. In one embodiment, resampling is performed in proportion to the particle weight multiplied by the total number of particles, that is, a particle with a larger weight yields more sampled particles. After resampling, the resampled particles are used as the samples for the next frame image and substituted into the state transition equation to obtain new predicted particles.
Referring to FIG3, FIG3 is a flow chart of another optional method for automatic target tracking by a UAV gimbal provided in Embodiment 1 of the present invention. The specific steps are as follows:
Step S301: select a target area.
Specifically, the user may manually box-select, in an image captured by the gimbal camera, a target area containing the tracked object, or input the coordinate information of the target area.
Step S302: capture the current frame image.
Step S303: determine whether the current frame image is the first frame image after the target area was selected; if so, proceed to step S304; otherwise, proceed to step S306.
Step S304: calculate the color histogram of the selected target area.
This color histogram is the feature of the selected target area.
Step S305: initialize a predetermined number of particles, then return to step S302.
Step S306: according to the preset state transition equation, each particle yields a predicted particle.
Based on the state transition equation, the particles are evenly distributed near the selected target area.
Step S307: calculate the color histogram at the position of each predicted particle, compare it with the color histogram of the selected target area to obtain the similarity between the two, and determine the weight of each predicted particle according to the similarity.
Step S308: normalize the weights of the predicted particles so that the sum of the weights of all predicted particles is 1.
Step S309: take the position of the particle with the largest weight as the position of the target area in the current frame image.
Step S310: according to the position of the target area in the current frame image, calculate the angle by which the gimbal needs to rotate to shift the target area to the center of the picture, and control the gimbal to rotate to the corresponding position based on this angle.
The angle by which the gimbal needs to rotate includes a pitch angle and a yaw angle.
Step S311: keeping the total number of particles unchanged, resample the particles so that high-weight particles are sampled more times and low-weight particles fewer times.
The method for automatic target tracking by a UAV gimbal provided in the embodiment of the present invention obtains a selected target area, determines the features of the selected target area, and initializes a predetermined number of particles; according to the state transition equation, each particle yields a predicted particle, and the feature similarity between the image at the position of each predicted particle and the selected target area is calculated, thereby determining the position of the target area in the current frame image; the gimbal rotation angle required to shift the target area to the center of the lens image is then calculated from that position, and the gimbal is controlled to rotate accordingly. With the present invention, the observed object is kept at the center of the picture without manual adjustment, which reduces labor and time costs, improves work efficiency, and ensures the stability of the gimbal and the camera image quality during adjustment.
Embodiment 2
According to an embodiment of the present invention, an apparatus for automatic target tracking by a UAV gimbal is provided. As shown in FIG4, which is a schematic structural diagram of an optional apparatus for automatic target tracking by a UAV gimbal provided in Embodiment 2 of the present invention, the apparatus 400 for automatic target tracking by a UAV gimbal includes:
a feature determination module 402, configured to obtain a selected target area and determine features of the selected target area;
a particle initialization module 404, configured to initialize a predetermined number of particles;
a particle prediction module 406, configured to obtain a current frame image, obtain, for each particle, a corresponding predicted particle according to a preset state transition equation, calculate, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity;
a target area determination module 408, configured to determine the position of the target area in the current frame image according to the weights of the predicted particles; and
a gimbal control module 410, configured to calculate, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and control the gimbal to rotate accordingly according to the gimbal rotation angle.
The above apparatus can perform the method for automatic target tracking by a UAV gimbal described with reference to FIG1 to FIG3 in Embodiment 1, and has the functional modules and beneficial effects corresponding to the method. For technical details not described in detail in this embodiment, refer to the method for automatic target tracking by a UAV gimbal provided with reference to FIG1 to FIG3 in Embodiment 1 of the present invention.
Embodiment 3
As shown in FIG5, which is a schematic structural diagram of an optional electronic device provided in Embodiment 3 of the present invention, the electronic device may include a processor 501, a communication interface 502, a memory 503 and a communication bus 504, where the processor 501, the communication interface 502 and the memory 503 communicate with one another via the communication bus 504. The processor 501 may call logic instructions in the memory 503 to perform the method for automatic target tracking by a UAV gimbal, the method including: obtaining a selected target area and determining features of the selected target area; initializing a predetermined number of particles; obtaining a current frame image, obtaining, for each particle, a corresponding predicted particle according to a preset state transition equation, calculating, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculating the weight of the predicted particle according to the similarity; determining the position of the target area in the current frame image according to the weights of the predicted particles; and calculating, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and controlling the gimbal to rotate accordingly according to the gimbal rotation angle.
In addition, the logic instructions in the memory 503 may be implemented in the form of software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part that contributes to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method described with reference to FIG1 to FIG3 in Embodiment 1 of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
The above product can perform the method for automatic target tracking by a UAV gimbal described with reference to FIG1 to FIG3 in Embodiment 1, and has the functional modules and beneficial effects corresponding to the method. For technical details not described in detail in this embodiment, refer to the method for automatic target tracking by a UAV gimbal provided with reference to FIG1 to FIG3 in Embodiment 1 of the present invention.
Embodiment 4
According to an embodiment of the present invention, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the processor performs the steps of the method for automatic target tracking by a UAV gimbal described in Embodiment 1.
The above product can perform any of the methods for automatic target tracking by a UAV gimbal described in Embodiment 1, and has the functional modules and beneficial effects corresponding to the method. For technical details not described in detail in this embodiment, refer to the method for automatic target tracking by a UAV gimbal provided in Embodiment 1 of the present invention.
From the above description of the embodiments, those skilled in the art can clearly understand that each embodiment may be implemented by means of software plus a general-purpose hardware platform, or, of course, by hardware. Based on this understanding, the above technical solution, in essence, or the part that contributes to the related art, may be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform the methods described in the embodiments or in parts of the embodiments.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Within the concept of the present invention, the technical features of the above embodiments or of different embodiments may also be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present invention as described above exist, which are not provided in detail for the sake of brevity. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
With the method, apparatus, device and storage medium for automatic target tracking by a UAV gimbal according to the embodiments of the present invention, a selected target area is obtained and its features are determined, a predetermined number of particles are initialized, each particle yields a predicted particle according to the state transition equation, and the feature similarity between the image at the position of each predicted particle and the selected target area is calculated, thereby determining the position of the target area in the current frame image; the gimbal rotation angle required to shift the target area to the center of the lens image is then calculated from that position, and the gimbal is controlled to rotate accordingly. With the present invention, the observed object is kept at the center of the picture without manual adjustment, which not only reduces labor and time costs and improves work efficiency, but also ensures the stability of the gimbal and the camera image quality during adjustment. The invention therefore has industrial applicability.
Claims (10)
- 1. A method for automatic target tracking by a UAV gimbal, the method comprising: obtaining a selected target area and determining features of the selected target area; initializing a predetermined number of particles; obtaining a current frame image, obtaining, for each particle, a corresponding predicted particle according to a preset state transition equation, calculating, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculating the weight of the predicted particle according to the similarity; determining the position of the target area in the current frame image according to the weights of the predicted particles; and calculating, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and controlling the gimbal to rotate accordingly according to the gimbal rotation angle.
- 2. The method according to claim 1, wherein obtaining the selected target area and determining the features of the selected target area comprises: obtaining a user-selected target area containing the tracked object; and calculating a color histogram of the selected target area and using it as the feature of the selected target area.
- 3. The method according to claim 1, wherein obtaining, for each particle, a corresponding predicted particle according to the preset state transition equation comprises: evenly distributing the predetermined number of particles near the selected target area according to the preset state transition equation.
- 4. The method according to claim 1, wherein calculating the weight of the predicted particle according to the similarity comprises: calculating an original weight of the predicted particle according to the similarity; and normalizing the original weights of the predicted particles to obtain the weight of each predicted particle.
- 5. The method according to claim 4, wherein determining the position of the target area in the current frame image according to the weights of the predicted particles comprises: taking the position of the predicted particle with the largest weight as the position of the target area in the current frame image.
- 6. The method according to claim 2, further comprising: resampling the particles according to the weights of the predicted particles, the resampling method being: keeping the total number of particles unchanged, sampling high-weight particles more and low-weight particles less.
- 7. The method according to claim 1, wherein the gimbal rotation angle comprises a pitch angle and a yaw angle.
- 8. An apparatus for automatic target tracking by a UAV gimbal, comprising: a feature determination module, configured to obtain a selected target area and determine features of the selected target area; a particle initialization module, configured to initialize a predetermined number of particles; a particle prediction module, configured to obtain a current frame image, obtain, for each particle, a corresponding predicted particle according to a preset state transition equation, calculate, for each predicted particle, the similarity between the features at the position of the predicted particle in the current frame image and the features of the selected target area, and calculate the weight of the predicted particle according to the similarity; a target area determination module, configured to determine the position of the target area in the current frame image according to the weights of the predicted particles; and a gimbal control module, configured to calculate, according to the position of the target area in the current frame image, the gimbal rotation angle required to shift the target area to the center of the lens image, and control the gimbal to rotate accordingly according to the gimbal rotation angle.
- 9. An electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein when the processor executes the program, the steps of the method for automatic target tracking by a UAV gimbal according to any one of claims 1-7 are implemented.
- 10. A computer-readable storage medium, on which a computer program is stored, wherein when the computer program is executed by a processor, the processor performs the method for automatic target tracking by a UAV gimbal according to any one of claims 1-7.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211534065.5 | 2022-12-02 | |
CN202211534065.5A (CN115903904A) | 2022-12-02 | 2022-12-02 | Method, apparatus and device for automatic target tracking by a UAV gimbal
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024114376A1 (zh) | 2024-06-06
Family
ID=86486568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/131693 (WO2024114376A1) | Method, apparatus, device and storage medium for automatic target tracking by a UAV gimbal | 2022-12-02 | 2023-11-15
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115903904A (zh) |
WO (1) | WO2024114376A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118394134A (zh) * | 2024-06-28 | 2024-07-26 | 深圳市浩瀚卓越科技有限公司 | Camera automatic tracking method and system based on gimbal control
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115903904A (zh) | 2022-12-02 | 2023-04-04 | 亿航智能设备(广州)有限公司 | Method, apparatus and device for automatic target tracking by a UAV gimbal
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010166104A (ja) * | 2008-12-16 | 2010-07-29 | Victor Co Of Japan Ltd | Target tracking device
JP2010219934A (ja) * | 2009-03-17 | 2010-09-30 | Victor Co Of Japan Ltd | Target tracking device
CN102184551A (zh) * | 2011-05-10 | 2011-09-14 | 东北大学 | Automatic target tracking method and system combining multiple feature matching and particle filtering
CN103024344A (zh) * | 2011-09-20 | 2013-04-03 | 佳都新太科技股份有限公司 | Particle-filter-based method for automatic PTZ target tracking
CN106254836A (zh) * | 2016-09-19 | 2016-12-21 | 南京航空航天大学 | UAV infrared image target tracking system and method
CN111369597A (zh) * | 2020-03-09 | 2020-07-03 | 南京理工大学 | Particle filter target tracking method based on multi-feature fusion
CN115903904A (zh) * | 2022-12-02 | 2023-04-04 | 亿航智能设备(广州)有限公司 | Method, apparatus and device for automatic target tracking by a UAV gimbal
- 2022-12-02: CN application CN202211534065.5A filed; published as CN115903904A (status: pending)
- 2023-11-15: PCT application PCT/CN2023/131693 filed; published as WO2024114376A1
Also Published As
Publication number | Publication date |
---|---|
CN115903904A (zh) | 2023-04-04 |
Similar Documents
Publication | Title
---|---
WO2024114376A1 (zh) | Method, apparatus, device and storage medium for automatic target tracking by a UAV gimbal
WO2021189456A1 (zh) | UAV inspection method and apparatus, and UAV
CN107516319B (zh) | High-precision simple interactive image matting method, storage device and terminal
CN108322666B (zh) | Camera shutter control method and apparatus, computer device and storage medium
US8903130B1 | Virtual camera operator
CN109658427B (zh) | Image processing method and apparatus
CN111917991B (zh) | Image quality control method, apparatus, device and storage medium
WO2019128534A1 (zh) | Camera module tilt testing method and apparatus, storage medium and electronic device
CN105120247A (zh) | White balance adjustment method and electronic device
CN106249508A (zh) | Autofocus method and system, and photographing apparatus
WO2020019257A1 (zh) | Control method and apparatus for multiple gimbals, unmanned aerial vehicle, medium and electronic device
CN113747071B (zh) | UAV photographing method and apparatus, UAV and storage medium
CN111491149B (zh) | Real-time matting method, apparatus, device and storage medium based on high-definition video
CN107454337A (zh) | Method for controlling camera rotation, terminal and related media product
CN111093022A (zh) | Image shooting method and apparatus, terminal and computer storage medium
CN112788322B (zh) | Adaptive white balance processing method and apparatus, medium and electronic device
CN113965664A (zh) | Image blurring method, storage medium and terminal device
WO2020257999A1 (zh) | Image processing method and apparatus, gimbal and storage medium
CN113592753B (zh) | Method and apparatus for processing images captured by an industrial camera, and computer device
CN111917986A (zh) | Image processing method, medium and electronic device
CN112312108A (zh) | White balance abnormality determination method and apparatus, storage medium and electronic device
CN110072050B (zh) | Adaptive adjustment method and apparatus for exposure parameters, and photographing device
CN116343652A (zh) | LED display screen splicing-seam compensation system, method, device and storage medium
WO2022000213A1 (zh) | Image shooting control method and apparatus, and storage medium
WO2022001733A1 (zh) | Method and apparatus for displaying a photographed object, storage medium and terminal