WO2023109716A1 - Vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium - Google Patents

Vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium

Info

Publication number
WO2023109716A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
target
unmanned vehicle
Prior art date
Application number
PCT/CN2022/138210
Other languages
English (en)
French (fr)
Inventor
徐坤
向耿召
李慧云
蔡宇翔
刘德梁
Original Assignee
深圳先进技术研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳先进技术研究院
Publication of WO2023109716A1 publication Critical patent/WO2023109716A1/zh

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition
    • G06K 7/1408: Methods for optical code recognition, the method being specifically adapted for the type of code
    • G06K 7/1417: 2D bar codes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition
    • G06K 7/1439: Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K 7/1443: Methods for optical code recognition including a method step for retrieval of the optical code, locating of the code in an image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle

Definitions

  • The present invention relates to the technical field of vehicle-machine collaboration, and in particular to a vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium.
  • Unmanned vehicles and UAVs have important civil, counter-terrorism, and military applications, and a UAV is often required to track an unmanned vehicle to perform a task. During tracking, factors such as strong wind and uneven road surfaces may change the motion state of the unmanned vehicle or the UAV, causing the UAV to lose the tracking target and greatly reducing work efficiency. In tracking-algorithm research, how to let the UAV re-acquire the tracking target after losing it is a problem that must be considered and solved.
  • The existing patent CN106023257A proposes a target tracking method based on a rotor-UAV platform.
  • Through multi-scale sample collection combined with a classifier updated in real time, the method achieves fast and accurate online real-time tracking of moving targets by the rotor-UAV platform. During tracking, whether the tracking performance of the current frame is stable is judged from the change of the maximum classifier response value between the current frame and the previous frame, combined with the tracking-performance judgment of the previous frame. When the tracking performance is unstable, the classifier's output is corrected in time, which effectively prevents losing the tracking target due to occlusion.
  • An existing target-tracking algorithm based on semi-supervised co-training over L1 graphs uses the tracking results of the first few frames to collect positive samples representing the target and negative samples representing the background, and extracts color and texture features of the samples to construct two sufficiently redundant views.
  • When a new frame arrives, candidate regions are randomly sampled as unlabeled samples under a particle-filter model.
  • A semi-supervised learning algorithm based on the L1 graph then trains a classifier on each view to compute the similarity of the unlabeled samples.
  • The views mutually select the unlabeled samples with the lowest similarity as newly labeled negative samples to update the classifiers.
  • Finally, the classifiers under the different views independently compute the similarity of the unlabeled samples, and the results are fused with their similarity entropy as weights to obtain the final result.
  • An existing remote-control simulation method based on the SBUS protocol designs a detailed control method for unmanned ground vehicles. Aiming at real-time performance and occlusion, a tracking scheme based on scaled-label recognition is investigated: the scheme tracks the labeled object when it is unoccluded and, when occlusion occurs, tracks the color features around the label, greatly improving tracking accuracy and occlusion handling. Finally, the scheme is applied to a heterogeneous system.
  • An existing quadrotor control method based on the SBUS protocol maintains the UAV's original stability, and a UGV with mechanical wheels and its control method are proposed.
  • Aiming at the same real-time and occlusion problems, a tracking scheme based on QR code (Quick Response code) recognition is proposed, and the scheme is likewise applied to a heterogeneous system.
  • An embodiment of the present application provides a vehicle-machine cooperative target-loss tracking method. The method includes: identifying the target to be tracked by the target recognition module on the UAV, sending the identified target information to the control system on the UAV, and at the same time starting the PID target-tracking control module on the UAV to track the target; when the UAV loses the tracking target, restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle; and when the target is still lost after the predicted trajectory has been tracked, raising the UAV's flight altitude and hovering, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state.
  • The target recognition module is built on a two-dimensional-code detection algorithm, which is any one of the ARTag, AprilTag, or ArUco detection algorithms.
  • Identifying the target to be tracked by the target recognition module on the UAV includes: identifying, by the fisheye monocular camera on the UAV, the QR-code image on the unmanned vehicle to be tracked, and outputting the position and attitude of the QR code in the camera coordinate system.
  • When the UAV loses the tracking target, restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle includes: the model-prediction module on the unmanned vehicle predicts the trajectory-point coordinates of the unmanned vehicle over a preset time and sends them to the predicted-trajectory tracking control module on the UAV; the predicted-trajectory tracking control module converts the trajectory-point coordinates into a trajectory in the UAV body coordinate system, and then tracks the predicted trajectory points one by one with the PID control method until the UAV finds the QR code.
  • When the target is still lost after the predicted trajectory has been tracked, raising the UAV's flight altitude and hovering, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state includes: using the lidar on the unmanned vehicle to judge whether there is an obstruction directly above it and, if so, controlling the unmanned vehicle to move a preset distance in the target direction to an unobstructed area and stop there; raising the UAV's flight altitude and circling in the air with radius R, recognizing images with the telephoto camera on the UAV, and directly tracking the unmanned vehicle with a target-tracking algorithm such as KCF, CSK, or C-COT; once the unmanned vehicle is recognized, the UAV stops circling, centers the unmanned vehicle in the image through PID control, and descends to a preset height; after the QR code is recognized, the UAV descends again to a preset height and resumes the normal tracking procedure.
  • An embodiment of the present application further provides a vehicle-machine cooperative target-loss tracking device, which includes: a target recognition module for identifying the target to be tracked, sending the identified target information to the control system on the UAV, and at the same time starting the PID target-tracking control module on the UAV to track the target; a tracking module for restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle when the UAV loses the tracking target; and a target search module for raising the UAV's flight altitude and hovering when the target is still lost after the predicted trajectory has been tracked, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state.
  • An embodiment of the present application further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the method described in any of the embodiments of the present application is implemented.
  • An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method described in any of the embodiments of the present application.
  • The vehicle-machine cooperative target-loss tracking method provided by the present invention can effectively solve the problem of the UAV losing the target when the unmanned vehicle's QR code is briefly occluded or falls outside the camera's field of view.
  • Combining QR-code tracking, trajectory-prediction tracking, and target-tracking methods greatly improves the robustness of the vehicle-machine collaboration system and of the UAV's tracking task.
  • FIG. 1 shows a schematic flowchart of a vehicle-machine cooperative target loss tracking method provided by an embodiment of the present application
  • FIG. 2 shows an exemplary structural block diagram of a vehicle-machine cooperative target loss tracking device 200 according to an embodiment of the present application
  • FIG. 3 shows a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present application
  • FIG. 4 shows another schematic flowchart of the vehicle-machine cooperative target-loss tracking method provided by an embodiment of the present application
  • FIG. 5 shows a schematic diagram of a vehicle kinematics model provided by an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be interpreted as indicating or implying relative importance or implicitly specifying the number of indicated technical features.
  • A feature defined as "first" or "second" may explicitly or implicitly include at least one such feature.
  • "Plurality" means at least two, such as two or three, unless specifically defined otherwise.
  • A first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediary.
  • A first feature being "over", "above", or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature.
  • A first feature being "under", "below", or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
  • FIG. 1 shows a schematic flowchart of the vehicle-machine cooperative target-loss tracking method provided by an embodiment of the present application.
  • the method includes:
  • Step 110: identify the target to be tracked by the target recognition module on the UAV, send the identified target information to the control system on the UAV, and at the same time start the PID (proportional-integral-derivative) target-tracking control module on the UAV to track the target;
  • Step 120: when the UAV loses the tracking target, restore the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle;
  • Step 130: when the target is still lost after the predicted trajectory has been tracked, raise the UAV's flight altitude and hover, and search for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state.
  • The PID target-tracking control module tracks at height h_0. From the UAV's current position, current heading angle, expected position, and expected heading angle in the body coordinate system, PID control outputs the UAV's velocities along the x, y, and z axes and its heading angular velocity in the body frame:
  • Δw_t = w_expect - w_current
  • I_t = I_{t-1} + Δw_t
  • I_t = cliff(I_t, -c, +c)
  • D_t = Δw_t - D_{t-1}
  • v_t = K_p·Δw_t + K_i·I_t + K_d·D_t
  • where w_current is the current position or attitude, w_expect is the expected position or attitude, I_t is the integral term, D_t is the difference term, v_t is the output velocity, and cliff(I_t, -c, +c) is a truncation function that clamps I_t between the constants -c and +c.
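The PID update rule above can be sketched as one control channel. This is a minimal illustration, not the patent's implementation: the gains Kp, Ki, Kd and the clamp constant c are placeholder values, and note that the text defines D_t = Δw_t - D_{t-1} (a more common discrete derivative would use Δw_t - Δw_{t-1}); the code follows the text as written.

```python
def cliff(value, lo, hi):
    """Truncation function from the text: clamp value into [lo, hi]."""
    return max(lo, min(hi, value))

class ClampedPID:
    """One PID channel following the update rule given in the text.

    Gains and clamp constant are illustrative, not taken from the patent.
    """
    def __init__(self, kp, ki, kd, c):
        self.kp, self.ki, self.kd, self.c = kp, ki, kd, c
        self.integral = 0.0   # I_t
        self.prev_diff = 0.0  # D_{t-1}

    def step(self, w_expect, w_current):
        dw = w_expect - w_current                                   # Δw_t
        self.integral = cliff(self.integral + dw, -self.c, self.c)  # clamped I_t
        d = dw - self.prev_diff                                     # D_t, as written in the text
        self.prev_diff = d
        # output velocity for this channel
        return self.kp * dw + self.ki * self.integral + self.kd * d
```

In practice one such channel would be instantiated for each of the x, y, z, and heading outputs; the integral clamp keeps a long occlusion from winding the integral term up to an unusable value.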
  • The above technical solution can effectively solve the problem of the UAV losing the target when the unmanned vehicle's QR code is briefly occluded or falls outside the camera's field of view.
  • Combining QR-code tracking, trajectory prediction, and target-tracking methods greatly improves the robustness of the vehicle-machine collaboration system and of the UAV's tracking of the unmanned vehicle.
  • The target recognition module in this application is built on a two-dimensional-code detection algorithm, which is any one of the ARTag, AprilTag, or ArUco detection algorithms.
  • Identifying the target to be tracked by the target recognition module on the UAV includes: identifying, by the fisheye monocular camera on the UAV, the QR-code image on the unmanned vehicle to be tracked, and outputting the position and attitude of the QR code in the camera coordinate system.
  • When the UAV loses the tracking target, restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle includes:
  • the model-predictive trajectory module on the unmanned vehicle predicts the trajectory-point coordinates of the unmanned vehicle over a preset time and sends them to the predicted-trajectory tracking control module on the UAV;
  • the predicted-trajectory tracking control module converts the trajectory-point coordinates into a trajectory in the UAV body coordinate system, and then tracks the predicted trajectory points one by one with the PID control method until the UAV finds the QR code.
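The coordinate conversion described above can be sketched as a planar rotation plus translation. This is a simplified illustration under assumed conventions (2-D world frame, body x-axis along the UAV heading); the function name and interface are not from the patent.

```python
import math

def world_to_body(point_xy, uav_xy, uav_yaw):
    """Convert a world-frame trajectory point into the UAV body frame.

    Subtract the UAV position, then rotate the offset by -yaw so that
    +x points along the UAV's heading. A 2-D rotation suffices for the
    planar case; altitude would be handled separately.
    """
    dx = point_xy[0] - uav_xy[0]
    dy = point_xy[1] - uav_xy[1]
    c, s = math.cos(-uav_yaw), math.sin(-uav_yaw)
    return (c * dx - s * dy, s * dx + c * dy)
```

Each converted point can then be fed as the expected position w_expect of the body-frame PID channels until the QR code is re-detected.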
  • After the predicted-trajectory tracking control module loses the target, the UAV climbs to height h_1 (h_1 > h_0) to expand the camera's field of view; the trajectory predicted by the trajectory-prediction module is converted into a trajectory in the UAV body coordinate system, and the predicted trajectory points are then tracked one by one with the PID control method.
  • When the QR code is re-identified, this module is stopped, height h_0 is restored, and the normal QR-code tracking program is started.
  • When the target is still lost after the predicted trajectory has been tracked, raising the UAV's flight altitude and hovering, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state includes: judging, by the lidar on the unmanned vehicle, whether there is an obstruction directly above the vehicle and, if so, controlling the vehicle to move a preset distance in the target direction to an unobstructed area and stop there; raising the UAV's flight altitude and circling in the air with radius R, recognizing images with the telephoto camera on the UAV, and directly tracking the unmanned vehicle with a target-tracking algorithm such as KCF, CSK, or C-COT; once the unmanned vehicle is recognized, the UAV stops circling, centers the vehicle in the image through PID control, and descends to a preset height; after the QR code on the vehicle is recognized, the UAV descends again to a preset height, resumes the normal tracking procedure, and simultaneously starts the unmanned vehicle on its task.
  • The target search module starts when the aforementioned predicted-trajectory tracking control module has tracked the N predicted trajectory points and the target is still lost.
  • At this point, the vertically upward-facing lidar on the vehicle first judges whether there is an obstruction directly above the unmanned vehicle; if so, the vehicle is controlled to move a short distance in the target direction to an unobstructed area and stop there.
  • Meanwhile, the UAV climbs vertically again to height h2 (h2 > h1), at which the ordinary monocular camera is out of focus, and circles in the air with radius R.
  • The telephoto camera is used to recognize the image, and a commonly used target-tracking algorithm such as KCF, CSK, or C-COT directly tracks the unmanned vehicle.
  • Once the unmanned vehicle is recognized, the UAV stops circling and centers it in the image through PID control. The UAV then descends to h1 and, after recognizing the QR code, descends again to h0, resumes the normal tracking procedure, and simultaneously starts the unmanned vehicle on its task.
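The three-stage recovery described above (QR tracking at h0, predicted-trajectory following at h1, hover-and-search at h2) can be distilled into a small mode machine. This is a sketch of the transition logic only; the mode names and input signals (qr_visible, predicted_points_left) are illustrative assumptions, not identifiers from the patent.

```python
from enum import Enum, auto

class Mode(Enum):
    QR_TRACKING = auto()        # fisheye camera tracks the QR code at h0
    TRAJECTORY_FOLLOW = auto()  # climb to h1, follow the N predicted points
    SEARCH = auto()             # climb to h2, circle with radius R, run KCF/CSK/C-COT

def next_mode(mode, qr_visible, predicted_points_left):
    """One step of the recovery logic distilled from the stages above."""
    if qr_visible:
        return Mode.QR_TRACKING          # re-acquired: descend to h0 and resume
    if mode is Mode.QR_TRACKING:
        return Mode.TRAJECTORY_FOLLOW    # target lost: start following predictions
    if mode is Mode.TRAJECTORY_FOLLOW and predicted_points_left == 0:
        return Mode.SEARCH               # all N points exhausted: hover-and-search
    return mode                          # otherwise stay in the current stage
```

The key design point is that re-detecting the QR code dominates every other transition, so any stage collapses back to normal tracking as soon as the marker reappears.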
  • FIG. 2 shows an exemplary structural block diagram of a vehicle-machine cooperative target loss tracking device 200 according to an embodiment of the present application.
  • the device includes:
  • the target recognition module 210 is used to identify the target that needs to be tracked, and sends the identified target information to the control system on the drone, and simultaneously starts the PID target tracking control module on the drone to track the target;
  • the tracking module 220 is used to restore the two-dimensional code tracking state by tracking and predicting the trajectory of the unmanned vehicle when the drone loses the tracking target;
  • the target search module 230 is used to increase the flying height of the drone and hover when the track of the predicted unmanned vehicle is still in the state of target loss after tracking, and search for the unmanned vehicle through the target tracking algorithm to restore the two-dimensional code tracking state.
  • the units or modules recorded in the device 200 correspond to the steps in the method described with reference to FIG. 1 . Therefore, the operations and features described above for the method are also applicable to the device 200 and the units contained therein, and will not be repeated here.
  • the apparatus 200 may be pre-implemented in the browser of the electronic device or other security applications, and may also be loaded into the browser of the electronic device or its security applications by downloading or other means.
  • the corresponding units in the apparatus 200 may cooperate with the units in the electronic device to implement the solutions of the embodiments of the present application.
  • FIG. 3 shows a schematic structural diagram of a computer system 300 suitable for implementing a terminal device or a server according to an embodiment of the present application.
  • A computer system 300 includes a central processing unit (CPU) 301, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage section 308 into a random-access memory (RAM) 303.
  • The RAM 303 also stores various programs and data required for the operation of the system 300.
  • the CPU 301, ROM 302, and RAM 303 are connected to each other through a bus 304.
  • An input/output (I/O) interface 305 is also connected to the bus 304 .
  • The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, etc.; an output section 307 including a cathode-ray tube (CRT), a liquid-crystal display (LCD), etc., and a speaker; a storage section 308 including a hard disk, etc.; and a communication section 309 including a network-interface card such as a LAN card or a modem.
  • the communication section 309 performs communication processing via a network such as the Internet.
  • a drive 310 is also connected to the I/O interface 305 as needed.
  • a removable medium 311, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, etc., is mounted on the drive 310 as necessary so that a computer program read therefrom is installed into the storage section 308 as necessary.
  • An embodiment of the present disclosure includes a computer program product, which includes a computer program tangibly embodied on a machine-readable medium; the computer program includes program code for executing the methods shown in FIGS. 1-2.
  • the computer program may be downloaded and installed from a network via communication portion 309 and/or installed from removable media 311 .
  • Each block in the flowchart or block diagrams may represent a module, program segment, or part of code that includes one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the units or modules involved in the embodiments described in the present application may be implemented by means of software or by means of hardware.
  • the described units or modules may also be set in a processor.
  • For example, the processor may be described as including a first sub-region generating unit, a second sub-region generating unit, and a display-region generating unit.
  • The names of these units or modules do not, under certain circumstances, constitute limitations on the units or modules themselves.
  • For example, the display-region generating unit may also be described as "a unit for generating a display region of text".
  • The present application also provides a computer-readable storage medium, which may be the computer-readable storage medium contained in the devices of the above embodiments, or a computer-readable storage medium that exists independently without being assembled into any device.
  • The computer-readable storage medium stores one or more programs, which are used by one or more processors to execute the vehicle-machine cooperative target-loss tracking method described in this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present application discloses a vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium. The method includes: identifying the target to be tracked, sending the identified target information to the control system on the UAV, and at the same time starting the PID target-tracking control module on the UAV to track the target; when the UAV loses the tracking target, restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle; and when the target is still lost after the predicted trajectory has been tracked, raising the UAV's flight altitude and hovering, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state. The solution provided by the present application can effectively solve the problem of the UAV losing the target when the unmanned vehicle's QR code is briefly occluded or falls outside the camera's field of view; combining QR-code tracking, trajectory-prediction tracking, and target-tracking methods greatly improves the vehicle-machine collaboration system.

Description

Vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium

Technical Field

The present invention relates to the technical field of vehicle-machine collaboration, and in particular to a vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium.
Background Art

In recent years, vehicle-machine collaboration has become a research hotspot. Unmanned vehicles and UAVs have important civil, counter-terrorism, and military applications, and a UAV is often required to track an unmanned vehicle to perform a task. During tracking, factors such as strong wind and uneven road surfaces may change the motion state of the unmanned vehicle or the UAV, causing the UAV to lose the tracking target and greatly reducing work efficiency. In tracking-algorithm research, how to let the UAV re-acquire the tracking target after losing it is a problem that must be considered and solved.
The existing patent CN106023257A proposes a target tracking method based on a rotor-UAV platform. Through multi-scale sample collection combined with a classifier updated in real time, the method achieves fast and accurate online real-time tracking of moving targets by the rotor-UAV platform. During tracking, whether the tracking performance of the current frame is stable is judged from the change of the maximum classifier response value between the current frame and the previous frame, combined with the tracking-performance judgment of the previous frame; when the tracking performance is unstable, the classifier's output is corrected in time, which effectively prevents losing the tracking target due to occlusion.

Meanwhile, an existing target-tracking algorithm based on semi-supervised co-training over L1 graphs uses the tracking results of the first few frames to collect positive samples representing the target and negative samples representing the background, and extracts color and texture features of the samples to construct two sufficiently redundant views. When a new frame arrives, candidate regions are randomly sampled as unlabeled samples under a particle-filter model. A semi-supervised learning algorithm based on the L1 graph then trains a classifier on each view to compute the similarity of the unlabeled samples, and the views mutually select the unlabeled samples with the lowest similarity as newly labeled negative samples to update the classifiers. Finally, the classifiers under the different views independently compute the similarity of the unlabeled samples, and the results are fused with their similarity entropy as weights to obtain the final result. An existing remote-control simulation method based on the SBUS protocol designs a detailed control method for unmanned ground vehicles; aiming at real-time performance and occlusion, a tracking scheme based on scaled-label recognition is investigated, which tracks the labeled object when it is unoccluded and tracks the color features around the label when occlusion occurs, greatly improving tracking accuracy and occlusion handling, and the scheme is finally applied to a heterogeneous system. An existing quadrotor control method based on the SBUS protocol maintains the UAV's original stability; a UGV with mechanical wheels and its control method are proposed; aiming at real-time performance and occlusion, a tracking scheme based on QR-code (Quick Response code) recognition is proposed and finally applied to a heterogeneous system.

However, much of the prior art merely optimizes the tracking algorithm in the hope of reducing the probability of losing the tracking target, and essentially no method for recovering from target loss has been proposed. Moreover, the prior art focuses on UAV tracking algorithms without considering vehicle-machine collaboration as an integrated system, and a single target-recognition algorithm cannot cope with all scenarios. For example, the accuracy of the target-recognition algorithm differs at different UAV flight altitudes, so the target cannot be tracked accurately at all altitudes.
Summary of the Invention

In view of the above defects or deficiencies in the prior art, it is desirable to provide a vehicle-machine cooperative target-loss tracking method, device, equipment and storage medium.

In a first aspect, an embodiment of the present application provides a vehicle-machine cooperative target-loss tracking method, including: identifying the target to be tracked by the target recognition module on the UAV, sending the identified target information to the control system on the UAV, and at the same time starting the PID target-tracking control module on the UAV to track the target; when the UAV loses the tracking target, restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle; and when the target is still lost after the predicted trajectory has been tracked, raising the UAV's flight altitude and hovering, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state.

In one embodiment, the target recognition module is built on a two-dimensional-code detection algorithm, which is any one of the ARTag, AprilTag, or ArUco detection algorithms.

In one embodiment, identifying the target to be tracked by the target recognition module on the UAV includes: identifying, by the fisheye monocular camera on the UAV, the QR-code image on the unmanned vehicle to be tracked, and outputting the position and attitude of the QR code in the camera coordinate system.

In one embodiment, when the UAV loses the tracking target, restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle includes: the model-predictive trajectory module on the unmanned vehicle predicts the trajectory-point coordinates of the unmanned vehicle over a preset time and sends them to the predicted-trajectory tracking control module on the UAV; the predicted-trajectory tracking control module converts the trajectory-point coordinates into a trajectory in the UAV body coordinate system, and then tracks the predicted trajectory points one by one with the PID control method until the UAV finds the QR code.

In one embodiment, when the target is still lost after the predicted trajectory has been tracked, raising the UAV's flight altitude and hovering, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state includes: using the lidar on the unmanned vehicle to judge whether there is an obstruction directly above it and, if so, controlling the unmanned vehicle to move a preset distance in the target direction to an unobstructed area and stop there; raising the UAV's flight altitude and circling in the air with radius R, recognizing images with the telephoto camera on the UAV, and directly tracking the unmanned vehicle with a target-tracking algorithm such as KCF, CSK, or C-COT; once the unmanned vehicle is recognized, the UAV stops circling, centers the unmanned vehicle in the image through PID control, and descends to a preset height; after the QR code on the vehicle is recognized, the UAV descends again to a preset height, resumes the normal tracking procedure, and simultaneously starts the unmanned vehicle on its task.

In a second aspect, an embodiment of the present application further provides a vehicle-machine cooperative target-loss tracking device, including: a target recognition module for identifying the target to be tracked, sending the identified target information to the control system on the UAV, and at the same time starting the PID target-tracking control module on the UAV to track the target; a tracking module for restoring the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle when the UAV loses the tracking target; and a target search module for raising the UAV's flight altitude and hovering when the target is still lost after the predicted trajectory has been tracked, and searching for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state.

In a third aspect, an embodiment of the present application further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the method described in any of the embodiments of the present application is implemented.

In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method described in any of the embodiments of the present application.
Beneficial effects of the present invention:

The vehicle-machine cooperative target-loss tracking method provided by the present invention can effectively solve the problem of the UAV losing the target when the unmanned vehicle's QR code is briefly occluded or falls outside the camera's field of view. Combining QR-code tracking, trajectory-prediction tracking, and target-tracking methods greatly improves the robustness of the vehicle-machine collaboration system and of the tracking task.
Brief Description of the Drawings

Other features, objects and advantages of the present application will become more apparent from the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:

Fig. 1 shows a schematic flowchart of the vehicle-machine cooperative target-loss tracking method provided by an embodiment of the present application;

Fig. 2 shows an exemplary structural block diagram of a vehicle-machine cooperative target-loss tracking device 200 according to an embodiment of the present application;

Fig. 3 shows a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present application;

Fig. 4 shows another schematic flowchart of the vehicle-machine cooperative target-loss tracking method provided by an embodiment of the present application;

Fig. 5 shows a schematic diagram of the vehicle kinematics model provided by an embodiment of the present application.
Detailed Description of the Embodiments

To make the above objects, features and advantages of the present invention more clearly understood, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings. Many specific details are set forth in the following description to facilitate a full understanding of the present invention. However, the present invention can be implemented in many ways different from those described here, and those skilled in the art can make similar improvements without departing from the spirit of the present invention; the present invention is therefore not limited to the specific embodiments disclosed below.

In the description of the present invention, it should be understood that orientation or position relationships indicated by terms such as "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial" and "circumferential" are based on the orientations or position relationships shown in the drawings, are only for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the referred device or element must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting the present invention.

In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the number of indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "plurality" means at least two, such as two or three, unless specifically defined otherwise.

In the present invention, unless otherwise clearly specified and limited, terms such as "mounted", "connected", "coupled" and "fixed" should be understood broadly; for example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; it may be direct, or indirect through an intermediary, or an internal communication between two elements or an interaction between two elements, unless otherwise clearly limited. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific situation.

In the present invention, unless otherwise clearly specified and limited, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediary. Moreover, a first feature being "over", "above" or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature; a first feature being "under", "below" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.

It should be noted that when an element is referred to as being "fixed to" or "disposed on" another element, it may be directly on the other element or an intervening element may be present. When an element is referred to as being "connected to" another element, it may be directly connected to the other element or intervening elements may be present. The terms "vertical", "horizontal", "upper", "lower", "left", "right" and similar expressions used herein are for illustrative purposes only and do not represent the only implementation.
Please refer to Fig. 1 in combination with Fig. 4; Fig. 1 shows a schematic flowchart of the vehicle-machine cooperative target-loss tracking method provided by an embodiment of the present application.

As shown in Fig. 1, the method includes:

Step 110: identify the target to be tracked by the target recognition module on the UAV, send the identified target information to the control system on the UAV, and at the same time start the PID (proportional-integral-derivative) target-tracking control module on the UAV to track the target;

Step 120: when the UAV loses the tracking target, restore the QR-code tracking state by tracking the predicted trajectory of the unmanned vehicle;

Step 130: when the target is still lost after the predicted trajectory has been tracked, raise the UAV's flight altitude and hover, and search for the unmanned vehicle with a target-tracking algorithm to restore the QR-code tracking state.
The PID target-tracking control module in this application tracks at height h_0. From the UAV's current position, current heading angle, expected position and expected heading angle in the body coordinate system, PID control outputs the UAV's velocities along the x, y and z axes and its heading angular velocity in the body frame:

Δw_t = w_expect - w_current,

I_t = I_{t-1} + Δw_t,

I_t = cliff(I_t, -c, +c),

D_t = Δw_t - D_{t-1},

v_t = K_p·Δw_t + K_i·I_t + K_d·D_t,

where w_current is the current position or attitude, w_expect is the expected position or attitude, I_t is the integral term, D_t is the difference term, v_t is the output velocity, and cliff(I_t, -c, +c) is a truncation function that clamps I_t between the constants -c and +c.
采用上述技术方案,能有效解决无人车二维码被短暂的遮挡或者二维码不在摄像头拍摄空间范围内而造成的无人机跟丢目标的问题,通过结合二维码跟踪、轨迹预测跟踪、目标跟踪方法极大的提升车机 协同系统,无人车跟踪无人机人物的鲁棒性。
In some embodiments, the target recognition module of the present application is built on a fiducial-marker detection algorithm, comprising any one of the ARTag, AprilTag or ArUco marker detection algorithms.
In some embodiments, recognizing, by the target recognition module on the UAV, the target to be tracked includes:
recognizing, by a fisheye monocular camera on the UAV, the QR-code image on the target unmanned vehicle to be tracked, and outputting the position and attitude of the QR code in the camera coordinate system.
In some embodiments, restoring the QR-code tracking state by tracking the predicted unmanned-vehicle trajectory after the UAV loses the tracked target includes:
a model-based unmanned-vehicle trajectory prediction module on the unmanned vehicle predicts the trajectory-point coordinates of the unmanned vehicle within a preset time period, and sends the trajectory-point coordinates to the predicted-trajectory tracking control module on the UAV;
the predicted-trajectory tracking control module converts the trajectory-point coordinates into a trajectory in the UAV body coordinate system, and then tracks the predicted trajectory points one by one by the PID control method until the UAV finds the QR code.
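Converting a trajectory point into the UAV body frame amounts to a planar rotation by the UAV's yaw plus a translation. The specification does not spell this transform out, so the frame convention below (x forward, y left, yaw measured in the world frame) is an assumption:

```python
import math

def world_to_body(point_xy, uav_xy, uav_yaw):
    """Express a world-frame waypoint in the UAV's body frame
    (x forward, y left), given the UAV pose (position, yaw)."""
    dx = point_xy[0] - uav_xy[0]
    dy = point_xy[1] - uav_xy[1]
    c, s = math.cos(uav_yaw), math.sin(uav_yaw)
    # Rotate the world-frame offset by -yaw to land in the body frame.
    return (c * dx + s * dy, -s * dx + c * dy)
```

The resulting body-frame offsets can be fed directly to the per-axis PID channels as position errors.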
Further, referring to FIG. 5, take the vehicle kinematic model as an example:
ẋ = v_r·cos φ,  ẏ = v_r·sin φ,  φ̇ = (v_r / L)·tan δ_f,
with unmanned-vehicle state
ξ = [x, y, φ]ᵀ
and unmanned-vehicle control input u = [v_r, δ_f], where v_r is the rear-axle velocity, δ_f is the front-wheel steering angle and L is the wheelbase. Discretizing the unmanned-vehicle kinematic model
by the forward Euler method yields ξ_{k+1} = f(ξ_k, u_k), with the unmanned-vehicle control input u = [v_r, ω], where ω = (v_r / L)·tan δ_f:
x_{k+1} = x_k + v_{r,k}·cos(φ_k)·T
y_{k+1} = y_k + v_{r,k}·sin(φ_k)·T,  φ_{k+1} = φ_k + ω_k·T
where T is the sampling period. Feeding N steps of control inputs into the discretized unmanned-vehicle model yields the unmanned-vehicle state at each step.
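The N-step forward-Euler rollout above can be sketched in a few lines; the sampling period `T` and the control sequence are illustrative:

```python
import math

def rollout(state, controls, T):
    """Propagate the discretized model xi_{k+1} = f(xi_k, u_k).

    state:    initial (x, y, phi)
    controls: sequence of (v_r, omega) pairs, one per step
    T:        sampling period in seconds
    Returns the list of states after each of the N steps.
    """
    x, y, phi = state
    trajectory = []
    for v_r, omega in controls:
        x += v_r * math.cos(phi) * T   # x_{k+1}
        y += v_r * math.sin(phi) * T   # y_{k+1}
        phi += omega * T               # phi_{k+1}
        trajectory.append((x, y, phi))
    return trajectory
```

The returned states are exactly the predicted trajectory points that the UAV's predicted-trajectory tracking control module consumes.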
After the target is lost, the predicted-trajectory tracking control module makes the UAV climb to altitude h1 (h1 > h0) to widen the camera's field of view. The trajectory output by the trajectory prediction module is converted into a trajectory in the UAV body coordinate system, and the predicted trajectory points are then tracked one by one by the PID control method. Once the QR code is recognized again, this module is stopped, the UAV returns to altitude h0, and the normal QR-code tracking routine resumes.
In some embodiments, raising the flight altitude of the UAV and circling, and searching for the unmanned vehicle with a target tracking algorithm to restore the QR-code tracking state, when the target is still lost after the predicted unmanned-vehicle trajectory has been fully tracked, includes: determining, by a LiDAR on the unmanned vehicle, whether there is an obstruction directly above the unmanned vehicle, and if so, moving the unmanned vehicle a preset distance in the target direction to an unobstructed area and stopping it there; raising the flight altitude of the UAV and circling in the air with radius R, recognizing images with a telephoto camera on the UAV, and tracking the unmanned vehicle directly with a target tracking algorithm such as the KCF, CSK or C-COT algorithm; after the unmanned vehicle is recognized, stopping the circling of the UAV and moving the unmanned vehicle to the center of the image by PID control, then lowering the UAV to a preset altitude; after the QR code on the unmanned vehicle is recognized, lowering the UAV again to a preset altitude, resuming the normal tracking routine, and at the same time starting the unmanned vehicle so that it begins executing its task.
Specifically, the target search module is started when the target is still lost after the aforementioned predicted-trajectory tracking control module has tracked all N predicted trajectory points. First, the vehicle-mounted, vertically upward-facing LiDAR determines whether there is an obstruction directly above the unmanned vehicle; if so, the unmanned vehicle is moved a short distance in the target direction to an unobstructed area and stopped there. Meanwhile, the UAV again climbs vertically to altitude h2 (h2 > h1), at which the ordinary monocular camera is out of focus, and circles in the air with radius R. The telephoto camera is used for image recognition, and a common target tracking algorithm such as KCF, CSK or C-COT tracks the unmanned vehicle directly. Once the unmanned vehicle is recognized, the UAV stops circling and moves the unmanned vehicle to the center of the image by PID control. The UAV then descends to altitude h1; after the QR code is recognized, it descends again to h0 and resumes the normal tracking routine, while the unmanned vehicle is started and begins executing its task.
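The circling search can be sketched as a sequence of waypoints on a circle of radius R around the last known vehicle position at the search altitude h2. The center point, number of waypoints and camera-facing yaw convention below are illustrative assumptions:

```python
import math

def search_orbit(center_xy, radius, altitude, n_points=12):
    """Waypoints (x, y, z, yaw) evenly spaced on a circle of the given
    radius, with the camera yawed back toward the circle center."""
    waypoints = []
    for i in range(n_points):
        theta = 2.0 * math.pi * i / n_points
        x = center_xy[0] + radius * math.cos(theta)
        y = center_xy[1] + radius * math.sin(theta)
        yaw = theta + math.pi  # face back toward the center
        waypoints.append((x, y, altitude, yaw))
    return waypoints
```

Flying these waypoints in order sweeps the telephoto camera's footprint around the last known position until the tracker reacquires the vehicle.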
Further, referring to FIG. 2, FIG. 2 is an exemplary structural block diagram of a vehicle-UAV cooperative lost-target tracking apparatus 200 according to an embodiment of the present application.
As shown in FIG. 2, the apparatus includes:
a target recognition module 210, configured to recognize the target to be tracked, send the recognized target information to the onboard control system of the UAV, and at the same time start the PID target-tracking control module on the UAV to track the target;
a tracking module 220, configured to restore the QR-code tracking state by tracking the predicted unmanned-vehicle trajectory after the UAV loses the tracked target;
a target search module 230, configured to raise the flight altitude of the UAV and circle, and search for the unmanned vehicle with a target tracking algorithm to restore the QR-code tracking state, when the target is still lost after the predicted unmanned-vehicle trajectory has been fully tracked.
It should be understood that the units or modules recorded in the apparatus 200 correspond to the respective steps of the method described with reference to FIG. 1. Accordingly, the operations and features described above for the method also apply to the apparatus 200 and the units contained therein, and are not repeated here. The apparatus 200 may be implemented in advance in a browser or another security application of an electronic device, or may be loaded into the browser or security application of an electronic device by downloading or the like. The corresponding units in the apparatus 200 can cooperate with units in the electronic device to implement the solutions of the embodiments of the present application.
Referring now to FIG. 3, which is a schematic structural diagram of a computer system 300 suitable for implementing the terminal device or server of an embodiment of the present application.
As shown in FIG. 3, the computer system 300 includes a central processing unit (CPU) 301, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage section 308 into a random access memory (RAM) 303. The RAM 303 also stores various programs and data required for the operation of the system 300. The CPU 301, the ROM 302 and the RAM 303 are connected to one another via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse and the like; an output section 307 including a cathode-ray tube (CRT), a liquid-crystal display (LCD) and the like, as well as a speaker; a storage section 308 including a hard disk and the like; and a communication section 309 including a network interface card such as a LAN card or a modem. The communication section 309 performs communication processing via a network such as the Internet. A drive 310 is also connected to the I/O interface 305 as needed. A removable medium 311, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 310 as needed, so that a computer program read therefrom is installed into the storage section 308 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to FIGS. 1-2 may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a vehicle-UAV cooperative lost-target tracking method comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for executing the methods of FIGS. 1-2. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 309, and/or installed from the removable medium 311.
The flowcharts and block diagrams in the drawings illustrate the possible architectures, functions and operations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical functions. It should also be noted that in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units or modules involved in the embodiments of the present application may be implemented in software or in hardware. The described units or modules may also be arranged in a processor; for example, a processor may be described as comprising a first sub-region generation unit, a second sub-region generation unit and a display-region generation unit. The names of these units or modules do not in some cases constitute a limitation of the units or modules themselves; for example, the display-region generation unit may also be described as "a unit for generating a display region of text from the first sub-region and the second sub-region".
As another aspect, the present application also provides a computer-readable storage medium, which may be the computer-readable storage medium contained in the aforementioned apparatus of the above embodiments, or a computer-readable storage medium that exists separately and is not assembled into a device. The computer-readable storage medium stores one or more programs, which are used by one or more processors to execute the text generation method applied to transparent-window envelopes described in the present application.
The above description is merely a preferred embodiment of the present application and an explanation of the technical principles employed. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the aforementioned inventive concept, for example, technical solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present application.

Claims (10)

  1. A vehicle-UAV cooperative lost-target tracking method, characterized in that the method comprises:
    recognizing, by a target recognition module on a UAV, a target to be tracked, sending the recognized target information to an onboard control system of the UAV, and at the same time starting a PID target-tracking control module on the UAV to track the target;
    when the UAV loses the tracked target, restoring a QR-code tracking state by tracking a predicted trajectory of an unmanned vehicle;
    when the target is still lost after the predicted unmanned-vehicle trajectory has been fully tracked, raising the flight altitude of the UAV and circling, and searching for the unmanned vehicle with a target tracking algorithm to restore the QR-code tracking state.
  2. The vehicle-UAV cooperative lost-target tracking method according to claim 1, characterized in that the target recognition module is built on a QR-code detection algorithm, comprising any one of the ARTag, AprilTag or ArUco marker detection algorithms.
  3. The vehicle-UAV cooperative lost-target tracking method according to claim 1, characterized in that recognizing, by the target recognition module on the UAV, the target to be tracked comprises:
    recognizing, by a fisheye monocular camera on the UAV, the QR-code image on the target unmanned vehicle to be tracked, and outputting the position and attitude of the QR code in the camera coordinate system.
  4. The vehicle-UAV cooperative lost-target tracking method according to claim 1, characterized in that restoring the QR-code tracking state by tracking the predicted unmanned-vehicle trajectory after the UAV loses the tracked target comprises:
    predicting, by a model-based unmanned-vehicle trajectory prediction module on the unmanned vehicle, trajectory-point coordinates of the unmanned vehicle within a preset time period, and sending the trajectory-point coordinates to a predicted-trajectory tracking control module on the UAV;
    converting, by the predicted-trajectory tracking control module, the trajectory-point coordinates into a trajectory in the UAV body coordinate system, and then tracking the predicted trajectory points one by one by a PID control method until the UAV finds the QR code.
  5. The vehicle-UAV cooperative lost-target tracking method according to claim 1, characterized in that raising the flight altitude of the UAV and circling, and searching for the unmanned vehicle with a target tracking algorithm to restore the QR-code tracking state, when the target is still lost after the predicted unmanned-vehicle trajectory has been fully tracked, comprises:
    determining, by a LiDAR on the unmanned vehicle, whether there is an obstruction directly above the unmanned vehicle, and if so, moving the unmanned vehicle a preset distance in the target direction to an unobstructed area and stopping it there;
    raising the flight altitude of the UAV and circling in the air with radius R, recognizing images with a telephoto camera on the UAV, and tracking the unmanned vehicle directly with a target tracking algorithm such as the KCF, CSK or C-COT algorithm;
    after the unmanned vehicle is recognized, stopping the circling of the UAV and moving the unmanned vehicle to the center of the image by PID control, then lowering the UAV to a preset altitude; after the QR code on the unmanned vehicle is recognized, lowering the UAV again to a preset altitude, resuming the normal tracking routine, and at the same time starting the unmanned vehicle so that it begins executing its task.
  6. A vehicle-UAV cooperative lost-target tracking apparatus, characterized in that the apparatus comprises:
    a target recognition module, configured to recognize a target to be tracked, send the recognized target information to an onboard control system of a UAV, and at the same time start a PID target-tracking control module on the UAV to track the target;
    a tracking module, configured to restore a QR-code tracking state by tracking a predicted unmanned-vehicle trajectory after the UAV loses the tracked target;
    a target search module, configured to raise the flight altitude of the UAV and circle, and search for the unmanned vehicle with a target tracking algorithm to restore the QR-code tracking state, when the target is still lost after the predicted unmanned-vehicle trajectory has been fully tracked.
  7. The vehicle-UAV cooperative lost-target tracking apparatus according to claim 6, characterized in that restoring the QR-code tracking state by tracking the predicted unmanned-vehicle trajectory after the UAV loses the tracked target involves:
    a model-based unmanned-vehicle trajectory prediction module, configured to predict trajectory-point coordinates of the unmanned vehicle within a preset time period and send the trajectory-point coordinates to a predicted-trajectory tracking control module on the UAV;
    the predicted-trajectory tracking control module, configured to convert the trajectory-point coordinates into a trajectory in the UAV body coordinate system and then track the predicted trajectory points one by one by a PID control method until the UAV finds the QR code.
  8. The vehicle-UAV cooperative lost-target tracking apparatus according to claim 6, characterized in that raising the flight altitude of the UAV and circling, and searching for the unmanned vehicle with a target tracking algorithm to restore the QR-code tracking state, when the target is still lost after the predicted unmanned-vehicle trajectory has been fully tracked, involves:
    a determination module, configured to determine, by a LiDAR on the unmanned vehicle, whether there is an obstruction directly above the unmanned vehicle, and if so, move the unmanned vehicle a preset distance in the target direction to an unobstructed area and stop it there;
    an image recognition module, configured to raise the flight altitude of the UAV and circle in the air with radius R, recognize images with a telephoto camera on the UAV, and track the unmanned vehicle directly with a target tracking algorithm such as the KCF, CSK or C-COT algorithm;
    wherein, after the unmanned vehicle is recognized, the UAV stops circling and moves the unmanned vehicle to the center of the image by PID control, then lowers its altitude to a preset altitude; after the QR code on the unmanned vehicle is recognized, the UAV lowers its altitude again to a preset altitude and resumes the normal tracking routine, and at the same time the unmanned vehicle is started and begins executing its task.
  9. A computer device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the method according to any one of claims 1-5.
  10. A computer-readable storage medium having a computer program stored thereon, the computer program being configured to:
    implement, when executed by a processor, the method according to any one of claims 1-5.
PCT/CN2022/138210 2021-12-13 2022-12-09 Vehicle-UAV cooperative lost-target tracking method, apparatus and device, and storage medium therefor WO2023109716A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111515854.X 2021-12-13
CN202111515854.XA CN114419095A (zh) 2021-12-13 2021-12-13 Vehicle-UAV cooperative lost-target tracking method, apparatus and device, and storage medium therefor

Publications (1)

Publication Number Publication Date
WO2023109716A1 true WO2023109716A1 (zh) 2023-06-22

Family

ID=81265971

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/138210 WO2023109716A1 (zh) 2021-12-13 2022-12-09 Vehicle-UAV cooperative lost-target tracking method, apparatus and device, and storage medium therefor

Country Status (2)

Country Link
CN (1) CN114419095A (zh)
WO (1) WO2023109716A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419095A (zh) * 2021-12-13 2022-04-29 深圳先进技术研究院 车机协同目标丢失跟踪方法、装置、设备及其存储介质
CN115963856B (zh) * 2023-01-03 2024-05-10 广东工业大学 一种四旋翼无人机快速目标跟踪方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106447697A (zh) * 2016-10-09 2017-02-22 湖南穗富眼电子科技有限公司 Fast tracking method for a specific moving target based on a moving platform
CN111580551A (zh) * 2020-05-06 2020-08-25 Hangzhou Dianzi University Navigation system and method based on visual positioning
CN211506262U (zh) * 2020-05-06 2020-09-15 Hangzhou Dianzi University Navigation system based on visual positioning
CN111932588A (zh) * 2020-08-07 2020-11-13 Zhejiang University Tracking method of an airborne UAV multi-target tracking system based on deep learning
CN113311873A (zh) * 2021-05-07 2021-08-27 Shenyang Institute of Automation, Chinese Academy of Sciences Vision-based UAV servo tracking method
CN114419095A (zh) * 2021-12-13 2022-04-29 Shenzhen Institute of Advanced Technology Vehicle-UAV cooperative lost-target tracking method, apparatus and device, and storage medium therefor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110109482A (zh) * 2019-06-14 2019-08-09 Shanghai Institute of Technology Target tracking system based on an SSD neural network


Also Published As

Publication number Publication date
CN114419095A (zh) 2022-04-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22906463

Country of ref document: EP

Kind code of ref document: A1