WO2023097769A1 - Collaborative autonomous tracking and landing method between an unmanned ground vehicle and an unmanned aerial vehicle - Google Patents

Collaborative autonomous tracking and landing method between an unmanned ground vehicle and an unmanned aerial vehicle Download PDF

Info

Publication number
WO2023097769A1
WO2023097769A1 (PCT/CN2021/137824)
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
drone
uav
vehicle
unmanned aerial
Prior art date
Application number
PCT/CN2021/137824
Other languages
English (en)
Chinese (zh)
Inventor
徐坤
向耿召
李慧云
蔡宇翔
潘仲鸣
Original Assignee
深圳先进技术研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳先进技术研究院 filed Critical 深圳先进技术研究院
Publication of WO2023097769A1

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions

Definitions

  • The present invention relates to the technical field of vehicle-machine (unmanned ground vehicle and unmanned aerial vehicle) cooperation and, more specifically, to a vehicle-machine cooperative autonomous tracking and landing method.
  • In one prior approach, QR codes of different sizes are used to guide the autonomous landing of the UAV in stages, thereby improving the accuracy of the UAV's autonomous landing; however, this method does not treat the vehicle and the UAV as an integrated cooperative system.
  • T. Yang et al. ("Hybrid camera array-based UAV auto-landing on moving UGV in GPS-denied environment", Remote Sensing, 2018, 10(11): 1829) combine a fisheye camera with a depth camera to improve the accuracy of target recognition and realize precise landing.
  • "A Neural Network-Based Rotary-Wing UAV Tracking Algorithm" (Chinese Patent Publication No. CN113253755A) splits the video of the moving target into several bitmap files according to the number of frames, selects and identifies the tracked target in those bitmap files, concatenates the target's positions into a trajectory, and displays the trajectory in a space coordinate system.
  • The computer of the ground base station can then observe and monitor the target on a display, show the specific orientation of the tracked target through the space coordinates, improve the motion analysis of the tracked target, and estimate the coordinates and direction of the target's next movement through trajectory guidance and trajectory prediction, thereby improving the UAV's tracking efficiency.
  • "UAV Autonomous Landing Method, Device, Electronic Equipment, and Storage Medium" (Chinese Patent Publication No. CN113359843A) first obtains a real-time video image from the UAV and calculates the similarity between the real-time video image and a pre-stored target landing video image.
  • The continuous image-similarity information in the real-time video is converted into continuous control of the UAV's flight direction and flight speed, which effectively improves the continuity, and thus the stability, of flight control during the UAV's autonomous landing.
  • However, the existing technical solutions consider only a single UAV tracking and landing on static ground, and do not account for disturbances such as UAV jitter in wind, which can cause tracking failure.
  • The vehicle and the UAV are not treated as a whole; only the UAV's tracking ability is improved.
  • In some cases the UAV's failure to track the unmanned vehicle is not the UAV's fault: the unmanned vehicle's motion changes so severely that it is hard to track.
  • Moreover, trajectory prediction and trajectory tracking with a neural network must process a large amount of image data in practice, which may lead to poor timeliness; and if the given trajectory is not smooth during tracking, the UAV cannot be guaranteed to fly in a good attitude, may jitter strongly, and may even crash.
  • The purpose of the present invention is to overcome the defects of the above prior art and to provide a vehicle-machine cooperative autonomous tracking and landing method.
  • The method includes the following steps:
  • output, through PID control, the heading velocity and heading angular velocity of the UAV in the body coordinate system, and cooperatively control the speed of the unmanned vehicle.
  • Compared with the prior art, the present invention has the advantage of proposing a vehicle-machine cooperative autonomous tracking and landing method in which a target recognition module identifies the target to be tracked, sends the target information to the UAV system, and starts the PID (proportional-integral-derivative) target tracking control module for tracking.
  • During tracking, the UAV can independently detect whether it is shaking severely; if shaking is detected, the adaptive stabilization module is activated. In this way, the UAV's shaking problem while tracking the target is resolved, so that the UAV maintains a good flying attitude.
  • Fig. 1 is a schematic diagram of the architecture of vehicle-machine cooperative autonomous tracking and landing according to an embodiment of the present invention;
  • Fig. 2 is a flow chart of a vehicle-machine cooperative autonomous tracking and landing method according to an embodiment of the present invention;
  • Fig. 3 is a schematic diagram of the effect of the vehicle-machine cooperative autonomous tracking and landing method according to an embodiment of the present invention.
  • Fig. 1 illustrates the proposed vehicle-machine collaborative autonomous tracking and landing architecture as functional modules; it generally includes a target recognition module, a target tracking control module, a shake detection module and an adaptive stabilization module.
  • The target recognition module determines the current position and attitude of the target (e.g., the two-dimensional code landmark) from the images collected by the camera.
  • The shake detection module judges whether the UAV is shaking from a series of positions and attitudes.
  • The adaptive stabilization module determines the parameters or instructions that need adjustment according to the detected shaking.
  • The target tracking control module determines the speed and course for the next moment in response to the adjustment instructions of the adaptive stabilization module.
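The four modules described above form a recognize, detect, adapt, act loop. A minimal, purely illustrative sketch of that loop follows; the callback names (`detect_shake`, `stabilize`, `track`) are assumptions standing in for the shake detection, adaptive stabilization and PID tracking modules, not names from the patent.

```python
def control_step(history, observation, detect_shake, stabilize, track):
    """One iteration of the loop: record the target recognition output,
    run shake detection over the recent history, apply adaptive
    adjustments when shaking is detected, and compute the next command."""
    history.append(observation)                    # target recognition output
    adjustments = stabilize(history) if detect_shake(history) else {}
    return track(observation, adjustments)         # next speed/course command
```

Each real module would replace its placeholder callback; the loop structure itself is what Fig. 1 conveys.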
  • The provided vehicle-machine cooperative autonomous tracking and landing method includes the following steps.
  • Step S210: use the two-dimensional code to detect the position and attitude of the target in the camera coordinate system.
  • The target recognition module uses the QR code as a landmark for detecting the drone's motion; a QR-code detection algorithm such as ARTag, AprilTag or ArUco outputs the position and attitude of the QR code in the camera coordinate system.
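Detectors such as ArUco typically return the marker pose as a rotation vector and a translation vector in the camera frame; converting that output into a position and a yaw angle can be sketched with plain NumPy as below. The function names are illustrative, and the Rodrigues conversion is written out by hand only so the sketch is self-contained.

```python
import numpy as np

def rotvec_to_matrix(rvec):
    """Rodrigues formula: rotation vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = np.asarray(rvec, dtype=float) / theta       # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])              # cross-product matrix of k
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def marker_pose(rvec, tvec):
    """Marker position in the camera frame and its yaw about the camera z-axis."""
    R = rotvec_to_matrix(rvec)
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return np.asarray(tvec, dtype=float), yaw
```

In practice one would feed this from the detector's per-marker `rvec`/`tvec` output each frame, and buffer the results for the shake detection step.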
  • Step S220: detect whether the drone shakes.
  • The shake detection module calculates the variance D1 of the QR code's position in the camera coordinate system over a window (for example, 1 second), and the variance D2 of the drone's attitude angle in the global coordinate system over the same window.
  • The thresholds are constants c1 and c2: when D1 > c1 or D2 > c2, the adaptive stabilization module is activated. The statistical window and the thresholds c1 and c2 can be set according to actual needs, for example according to the UAV's flight speed or the desired flight stability. Different levels of variance thresholds can also be set to grade the severity of the drone's shaking, such as weak, general or severe shaking.
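The variance test can be sketched as follows, assuming the marker positions and attitude angles have been buffered over the statistical window. Summing the per-axis variances into a single scalar is one possible reading; the text leaves the exact aggregation, like the thresholds c1 and c2, open.

```python
import numpy as np

def shake_detected(marker_positions, attitude_angles, c1, c2):
    """marker_positions: N x 3 QR-code positions in the camera frame over
    the window; attitude_angles: N x 3 UAV attitude angles in the global
    frame over the same window. Returns (shaking?, D1, D2)."""
    d1 = float(np.var(np.asarray(marker_positions, dtype=float), axis=0).sum())
    d2 = float(np.var(np.asarray(attitude_angles, dtype=float), axis=0).sum())
    return (d1 > c1 or d2 > c2), d1, d2
```

Graded severity levels (weak/general/severe) would simply compare D1 and D2 against a ladder of thresholds instead of a single pair.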
  • Step S230: determine the parameters or instructions to adjust according to the detected shaking.
  • The adaptive stabilization module covers the adaptive adjustment of the PID parameters, the raising of the UAV's tracking height, and the limiting of the unmanned vehicle's acceleration.
  • Adaptively reducing the P value of the PID parameters slows the drone's jitter and smooths the velocity output; to prevent losing the tracking target because of the increased tracking response time, the drone also climbs a certain height and sends the unmanned vehicle an acceleration-reduction command.
  • Raising the UAV's tracking height not only expands the field of view but also effectively reduces the UAV's shaking.
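The three adjustments can be sketched as a single function mapping a detected shake level to new setpoints. The constants k, h and a_const follow the text; their values here are placeholders, and the linear climb-per-level rule is an assumption for illustration.

```python
def stabilize(shake_level, p_gain, track_height, ugv_accel_limit,
              k=0.5, h=0.5, a_const=0.2):
    """shake_level 0 means no shaking detected: keep everything unchanged.
    Otherwise: scale the P gain down by k to smooth the velocity output,
    climb h metres per severity level to widen the field of view, and cap
    the unmanned vehicle's acceleration at a_const."""
    if shake_level == 0:
        return p_gain, track_height, ugv_accel_limit
    return (k * p_gain,
            track_height + h * shake_level,
            min(ugv_accel_limit, a_const))
```

The returned triple would be pushed to the PID controller, the altitude setpoint, and the unmanned vehicle's speed controller respectively.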
  • Step S240: in response to the adjustment instructions, cooperatively control the UAV and the unmanned vehicle.
  • The inputs of the PID target tracking control module are the UAV's current position and current heading angle in the body coordinate system, together with the desired position and desired heading angle.
  • With the error defined as Δw_t = w_expect − w_current, the control quantities are updated as:
  • I_t = I_{t−1} + Δw_t
  • D_t = Δw_t − Δw_{t−1}
  • output_t = k_p · Δw_t + k_i · clip(I_t, −c, +c) + k_d · D_t
  • where w_current is the current position or attitude, w_expect is the expected position or attitude, I_t is the integral quantity, D_t is the difference quantity, and clip(I_t, −c, +c) is a truncation function that clamps I_t between the constants −c and +c.
  • The adjusted parameter values, such as k, h and a_const, can be set to appropriate values according to the tracking scene or the drone's degree of shaking.
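The update equations above can be sketched as a discrete PID controller with the clamped integral term. The gains and the clamp bound c are illustrative; the patent does not fix their values.

```python
class ClampedPID:
    """Discrete PID whose integral is truncated to [-c, +c],
    matching clip(I_t, -c, +c) in the update equations."""

    def __init__(self, kp, ki, kd, c):
        self.kp, self.ki, self.kd, self.c = kp, ki, kd, c
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, w_current, w_expect):
        error = w_expect - w_current                         # Δw_t
        self.integral += error                               # I_t = I_{t-1} + Δw_t
        clamped = max(-self.c, min(self.c, self.integral))   # clip(I_t, -c, +c)
        derivative = error - self.prev_error                 # D_t
        self.prev_error = error
        return self.kp * error + self.ki * clamped + self.kd * derivative
```

One such controller would run per controlled quantity, e.g. one for the heading velocity and one for the heading angular velocity in the body coordinate system.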
  • Figure 3 shows the changes in the x, y, z position and yaw heading angle of the UAV relative to the QR code during tracking, where "unstable" (non-stabilization) denotes data recorded without the de-shaking module and "stable" (stabilization) denotes data recorded with it. Figures 3(a) to 3(d) show the UAV's tracking states under different degrees of shaking. As the figures show, after the de-shaking module is added, UAV tracking becomes very smooth and the jitter range very small, which significantly enhances the stability of UAV tracking and landing in the vehicle-machine coordination system.
  • The present invention can automatically detect whether the drone is shaking in flight and proposes a method for adaptively enhancing the drone's stability: by adjusting the PID parameters, raising the tracking height, limiting the unmanned vehicle's acceleration, and so on, it solves the drone's shaking problem during tracking and landing, keeping the flight attitude relatively stable. Moreover, the present invention treats the vehicle and the drone as one intelligent system: when abnormal shaking occurs, the unmanned vehicle cooperates with the drone to track collaboratively.
  • the present invention can be a system, method and/or computer program product.
  • a computer program product may include a computer readable storage medium having computer readable program instructions thereon for causing a processor to implement various aspects of the present invention.
  • a computer readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device.
  • a computer readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • A non-exhaustive list of computer-readable storage media includes: portable computer diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), memory sticks, floppy disks, mechanically encoded devices such as punch cards or raised structures in grooves having instructions recorded thereon, and any suitable combination of the above.
  • computer-readable storage media are not to be construed as transient signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., pulses of light through fiber optic cables), or transmitted electrical signals.
  • Computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or downloaded to an external computer or external storage device over a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • A network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within the respective computing/processing device.
  • Computer program instructions for carrying out operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk, C++ or Python, and conventional procedural programming languages such as the "C" language or similar programming languages.
  • Computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • The remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection can be made to an external computer (for example, through the Internet using an Internet service provider).
  • In some embodiments, an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA) or a programmable logic array (PLA), can execute the computer-readable program instructions, thereby implementing aspects of the present invention.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor, create means for implementing the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium and direct computers, programmable data processing apparatuses and/or other devices to work in a specific way, so that the computer-readable medium storing the instructions constitutes an article of manufacture that includes instructions implementing aspects of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • Each block in a flowchart or block diagram may represent a module, a program segment, or a portion of instructions comprising one or more executable instructions for implementing the specified logical functions.
  • The functions noted in the blocks may occur out of the order noted in the figures; for example, two successive blocks may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved.
  • Each block of the block diagrams and/or flowchart illustrations, and combinations of such blocks, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation in hardware, implementation in software, and implementation by a combination of software and hardware are all equivalent.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a collaborative autonomous tracking and landing method between an unmanned ground vehicle and an unmanned aerial vehicle. The method comprises: detecting the position and attitude of a two-dimensional code in a camera coordinate system; detecting the shaking state of the unmanned aerial vehicle according to the degree of change of the two-dimensional code's position in the camera coordinate system, or the degree of change of the unmanned aerial vehicle's attitude angle in a global coordinate system, within a set time period; determining adaptive adjustment parameters according to the shaking state of the unmanned aerial vehicle, the adaptive adjustment parameters being used to collaboratively adjust the unmanned aerial vehicle and an unmanned ground vehicle; and, according to the adaptive adjustment parameters, outputting the heading velocity and heading angular velocity of the unmanned aerial vehicle in a body coordinate system by means of PID control, and collaboratively controlling the speed of the unmanned ground vehicle. The present invention solves the shaking problem of the unmanned aerial vehicle during target tracking, so that the unmanned aerial vehicle maintains a good flight attitude.
PCT/CN2021/137824 2021-12-03 2021-12-14 Collaborative autonomous tracking and landing method between an unmanned ground vehicle and an unmanned aerial vehicle WO2023097769A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111468698.6 2021-12-03
CN202111468698.6A CN114326766A (zh) 2021-12-03 2021-12-03 Vehicle-machine collaborative autonomous tracking and landing method

Publications (1)

Publication Number Publication Date
WO2023097769A1 true WO2023097769A1 (fr) 2023-06-08

Family

ID=81049183

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137824 WO2023097769A1 (fr) 2021-12-03 2021-12-14 Collaborative autonomous tracking and landing method between an unmanned ground vehicle and an unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN114326766A (fr)
WO (1) WO2023097769A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116828132A (zh) * 2023-07-05 2023-09-29 广州磐碟塔信息科技有限公司 Virtual photography control method and system

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106197422A (zh) * 2016-06-27 2016-12-07 东南大学 UAV positioning and target tracking method based on two-dimensional tags
CN106275410A (zh) * 2016-11-17 2017-01-04 湖南科瑞特科技股份有限公司 Wind-disturbance-resistant unmanned aerial vehicle
CN107291094A (zh) * 2017-05-08 2017-10-24 大陆智源科技(北京)有限公司 UAV-robot cooperative operation system
CN108873935A (zh) * 2018-07-06 2018-11-23 山东农业大学 Control method, apparatus, device and storage medium for the landing of a logistics delivery UAV
CN109189088A (zh) * 2018-08-21 2019-01-11 中南林业科技大学 Adaptive cruise tracking method for a tethered UAV, terminal and storage medium
CN109791413A (zh) * 2016-10-10 2019-05-21 高通股份有限公司 Systems and methods for landing a UAV on a moving base
US20190176968A1 (en) * 2016-06-21 2019-06-13 Nec Corporation Moving body, moving body control system, moving body control method, interface device, and recording medium having program recorded thereon
CN110222612A (zh) * 2019-05-27 2019-09-10 北京交通大学 Dynamic target recognition and tracking method for autonomous UAV landing
CN110231836A (zh) * 2019-06-14 2019-09-13 北京查打先锋高科技有限责任公司 Method for guiding a UAV to land on a moving target
CN111240348A (zh) * 2020-01-22 2020-06-05 西安爱生无人机技术有限公司 UAV landing control method based on a moving base, computer-readable storage medium and control device
US20200333804A1 (en) * 2019-04-18 2020-10-22 GM Global Technology Operations LLC Drone landing system and method
CN112639874A (zh) * 2020-03-20 2021-04-09 深圳市大疆创新科技有限公司 Target following method, target following apparatus, movable device and storage medium
CN113568427A (zh) * 2021-07-08 2021-10-29 上海机器人产业技术研究院有限公司 Method and system for autonomous UAV landing on a mobile platform
CN113657256A (zh) * 2021-08-16 2021-11-16 大连海事大学 Sea-air cooperative visual tracking and autonomous recovery method for a UAV carried by an unmanned surface vehicle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106647814B (zh) * 2016-12-01 2019-08-13 华中科技大学 UAV vision-aided positioning and flight control system and method based on QR-code landmark recognition
CN106527487A (zh) * 2016-12-23 2017-03-22 北京理工大学 Autonomous precise landing system and landing method for a UAV on a moving platform
CN108549397A (zh) * 2018-04-19 2018-09-18 武汉大学 Autonomous UAV landing method and system assisted by QR codes and inertial navigation
CN112198888A (zh) * 2019-12-31 2021-01-08 北京理工大学 Adaptive PID control method considering autonomous takeoff and landing of a UAV on a maneuvering platform


Also Published As

Publication number Publication date
CN114326766A (zh) 2022-04-12

Similar Documents

Publication Publication Date Title
Roelofsen et al. Reciprocal collision avoidance for quadrotors using on-board visual detection
EP3128386B1 (fr) Procédé et dispositif de poursuite d'une cible mobile avec un véhicule aérien
Qi et al. Autonomous landing solution of low-cost quadrotor on a moving platform
Bipin et al. Autonomous navigation of generic monocular quadcopter in natural environment
Mills et al. Vision based control for fixed wing UAVs inspecting locally linear infrastructure using skid-to-turn maneuvers
Greatwood et al. Tracking control of a UAV with a parallel visual processor
Xiao et al. Flying through a narrow gap using end-to-end deep reinforcement learning augmented with curriculum learning and sim2real
Leishman et al. Relative navigation and control of a hexacopter
Potena et al. Effective target aware visual navigation for uavs
Su et al. Catching a flying ball with a vision-based quadrotor
CN115686052A (zh) UAV obstacle-avoidance path planning method, apparatus, computer device and storage medium
WO2023109716A1 (fr) Method and apparatus for lost-target tracking through cooperation between an unmanned ground vehicle and an unmanned aerial vehicle, device, and storage medium
WO2023097769A1 (fr) Collaborative autonomous tracking and landing method between an unmanned ground vehicle and an unmanned aerial vehicle
Ahmadinejad et al. Autonomous flight of quadcopters in the presence of ground effect
Bartholomew et al. Learning to predict obstacle aerodynamics from depth images for micro air vehicles
Jarrett et al. Controller comparisons for autonomous railway following with a fixed-wing UAV
Dinaux et al. FAITH: Fast iterative half-plane focus of expansion estimation using optic flow
CN113516013B (zh) 目标检测方法、装置、电子设备、路侧设备和云控平台
Kamath et al. Vision-based fast-terminal sliding mode super twisting controller for autonomous landing of a quadrotor on a static platform
Ho et al. Characterization of flow field divergence for MAVs vertical control landing
Aspragkathos et al. Event-triggered image moments predictive control for tracking evolving features using UAVs
Siddiquee et al. Flight test of quadcopter guidance with vision-based reinforcement learning
Lin et al. Autonomous Landing of a VTOL UAV on a Ship based on Tau Theory
Kainth et al. Chasing the Intruder: A Reinforcement Learning Approach for Tracking Unidentified Drones
Mitakidis et al. A Deep Reinforcement Learning Visual Servoing Control Strategy for Target Tracking Using a Multirotor UAV

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21966193

Country of ref document: EP

Kind code of ref document: A1