WO2020024104A1 - Return-to-home control method, apparatus, and device - Google Patents

Return-to-home control method, apparatus, and device

Info

Publication number
WO2020024104A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
drone
image
identity information
take
Prior art date
Application number
PCT/CN2018/097776
Other languages
English (en)
Chinese (zh)
Inventor
孟国涛
庞磊
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2018/097776
Priority to CN201880010540.4A
Publication of WO2020024104A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • The present invention relates to the field of communication technologies, and in particular to a return-to-home control method, apparatus, and device.
  • Embodiments of the present invention provide a return-to-home control method, apparatus, and device, which can automatically identify a target object during the return-to-home process and improve the convenience of operation.
  • A first aspect of the embodiments of the present invention is to provide a return-to-home control method, including:
  • A second aspect of the embodiments of the present invention is to provide a return-to-home control device, including:
  • the memory is used to store program code;
  • the processor calls the program code, and when the program code is executed, the processor is used to perform the following operations:
  • A third aspect of the embodiments of the present invention is to provide a drone, including:
  • a power system, installed on the fuselage and configured to provide power to the drone;
  • an image acquisition device, installed on the fuselage and configured to collect a first image containing an object.
  • The identity information of the target object is acquired during the take-off of the drone, a first image containing an object is collected by the image acquisition device during the return flight of the drone, the object is identified as the target object according to the first image and the identity information, and a corresponding operation is performed based on the target object, without the need for the user to re-designate the target object.
  • In this way, the target object can be automatically identified during the return flight, which improves the convenience of operation.
  • FIG. 1 is a flowchart of a return-to-home control method according to an embodiment of the present invention;
  • FIG. 2 is a schematic diagram of a drone take-off scenario provided by an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of a drone return scenario provided by an embodiment of the present invention;
  • FIG. 4 is a flowchart of another return-to-home control method according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of another drone return scenario provided by an embodiment of the present invention;
  • FIG. 6 is a structural diagram of a return-to-home control device according to an embodiment of the present invention.
  • When a component is said to be "fixed to" another component, it may be directly on the other component or an intervening component may exist. When a component is said to be "connected" to another component, it may be directly connected to the other component or an intervening component may exist at the same time.
  • An embodiment of the present invention provides a return-to-home control method: the identity information of a target object is acquired during the take-off of a drone; during the return flight of the drone, a first image containing an object is collected by an image acquisition device; the object is identified as the target object according to the first image and the identity information; and a corresponding operation is performed based on the target object.
  • In this way, the target object can be automatically identified without the user re-designating it, which improves the convenience of operation.
  • The take-off process of the drone refers to the process of moving the drone to the take-off point, where the geographic coordinates of the take-off point are the geographic coordinates selected by the user and the altitude of the take-off point is the altitude selected by the user.
  • For example, the user can select the geographic coordinates of the take-off point on the digital map displayed by the ground station and select the altitude from an altitude menu.
  • The ground station sends the geographic coordinates and altitude selected by the user to the drone.
  • The drone can then fly to the take-off point based on these geographic coordinates and altitude.
  • The altitude menu may include at least one altitude value or a slider for indicating at least one altitude value.
  • Alternatively, the user can operate the joystick of the remote controller to control the drone to fly to the take-off point.
  • The identity information may include the human body features and/or facial features of the target object.
  • The human body features may include the clothing style and colors of the target object, or a skeletal structure of the human body.
  • The facial features may include the structural relationships among the target object's eyes, nose, mouth, and chin. A sketch of one possible representation of this identity information is given below.
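  • As an illustration only (editor's sketch, not part of the patent), the identity information described above could be represented as a simple data structure; the field names below are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class IdentityInfo:
    """Hypothetical container for the identity information of the target object."""
    # Human body features, e.g. an appearance embedding covering clothing style and color.
    body_features: Optional[List[float]] = None
    # Facial features, e.g. an embedding encoding the structural relationship
    # among the eyes, nose, mouth, and chin.
    face_features: Optional[List[float]] = None

# Example: identity information that only contains body features so far.
target_identity = IdentityInfo(body_features=[0.12, -0.34, 0.56])
```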
  • During the take-off of the drone, the drone may collect an image containing the target object through the image acquisition device and process the image to obtain the identity information of the target object.
  • For example, the drone may collect a human body image containing the target object and process the human body image to obtain the human body features of the target object.
  • Alternatively, the drone may collect a face image containing the target object through the image acquisition device and process the face image to obtain the facial features of the target object.
  • For example, when the drone is taking off and the distance between the drone and the target object is relatively large, the drone can collect a human body image containing the target object through the image acquisition device and process the human body image to obtain the human body features of the target object; when the drone flies near the target object and the distance between them is relatively small, it can collect a face image containing the target object and process the face image to obtain the facial features of the target object.
  • For another example, when the drone is taking off and the distance between the drone and the target object is relatively large, the drone can collect a human body image containing the target object through the image acquisition device and process it to obtain the human body features of the target object; further, the drone may adjust the focal length of the image acquisition device, collect a face image containing the target object, and process the face image to obtain the facial features of the target object.
  • Alternatively, the drone may receive a second image containing the target object sent by the ground station, and process the second image to obtain the identity information of the target object. For example, before the drone flies to the take-off point, the user can find a second image containing the target object in the ground station's local storage; the ground station sends the second image found by the user to the drone, and the drone processes the second image to obtain the identity information of the target object. For another example, before the drone flies to the take-off point, the user can capture a second image containing the target object through the ground station; the ground station sends the captured second image to the drone, and the drone processes the image to obtain the identity information of the target object.
  • In the process of obtaining the identity information of the target object, if the image collected by the drone through the image acquisition device contains a single object, the drone can identify that object as the target object. If the image contains multiple objects, the drone can send the image to the ground station, and the ground station displays the image so that the user can select the target object among the multiple objects; the ground station marks the target object selected by the user and sends the marked image to the drone, and the drone identifies the target object in the marked image and obtains the identity information of the target object. A minimal sketch of this selection flow is given below.
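  • A minimal sketch of the selection flow just described; the `ground_station` interface and its methods are hypothetical placeholders, not an API from the patent.

```python
def choose_target(detections, ground_station):
    """Pick the target object among the objects detected in one image (sketch)."""
    if len(detections) == 1:
        # Only one object in the image: treat it directly as the target object.
        return detections[0]
    # Several objects: display them on the ground station and let the user pick one.
    ground_station.show(detections)
    selected_index = ground_station.wait_for_selection()  # blocks until the user chooses
    return detections[selected_index]
```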
  • Performing the corresponding operation based on the target object may specifically include the following three modes:
  • Entering the gesture control mode based on the target object: for example, if the drone enters the gesture control mode during take-off, it performs corresponding operations based on the gestures of the target object, and performs a return operation when a return instruction input by the user is received or the signal is lost; when the target object is identified during the return flight of the drone, it can automatically enter the gesture control mode again.
  • Performing the corresponding operation based on the gesture of the target object may specifically include: performing a spiral flight when a first gesture of the target object is detected; performing a circular flight centered on the preset position when a second gesture of the target object is detected; and so on.
  • Entering the selfie mode based on the target object: for example, if the drone enters the selfie mode during take-off, it performs the return operation when a return instruction input by the user is received or the signal is lost; when the target object is identified during the return flight of the drone, it automatically enters the selfie mode again.
  • Entering the target tracking mode based on the target object: for example, if the drone enters the target tracking mode during take-off, it performs the return operation when a return instruction input by the user is received or the signal is lost; when the target object is identified during the return flight of the drone, it can automatically enter the target tracking mode again. A minimal sketch of how the take-off mode could be restored after the return flight is given below.
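  • As a rough illustration (names invented, not from the patent), the mode restored after the target object is re-identified could simply be the mode that was active at take-off.

```python
from enum import Enum, auto

class FlightMode(Enum):
    GESTURE_CONTROL = auto()
    SELFIE = auto()
    TARGET_TRACKING = auto()

class ReturnController:
    """Remembers the mode entered at take-off and restores it once the
    target object has been re-identified during the return flight."""

    def __init__(self, takeoff_mode: FlightMode):
        self.takeoff_mode = takeoff_mode

    def on_target_reidentified(self) -> FlightMode:
        # Re-enter the same mode without the user re-designating the target object.
        return self.takeoff_mode

controller = ReturnController(FlightMode.SELFIE)
assert controller.on_target_reidentified() is FlightMode.SELFIE
```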
  • FIG. 1 is a flowchart of a return-to-home control method according to an embodiment of the present invention. As shown in FIG. 1, the method may include the following steps:
  • Step 101: During the take-off of the drone, obtain the identity information of the target object.
  • An image containing the target object may be collected by the image acquisition device, and the image may be processed to obtain the identity information of the target object.
  • The identity information may include the human body features and/or facial features of the target object.
  • The preset position may be a position on the ground, in the target object's hand, or on a desktop.
  • If the image acquisition device faces the target object during the take-off process, the drone can collect an image containing the target object through the image acquisition device before flying to the take-off point. If the image acquisition device does not face the target object during take-off, the drone can rotate the gimbal after flying to the take-off point so that the image acquisition device faces the target object, and then collect an image containing the target object.
  • the drone may receive the second image containing the target object sent by the ground station, and process the second image to obtain the identity information of the target object.
  • Step 102: During the return flight of the drone, collect a first image containing an object through the image acquisition device.
  • When the drone receives a return instruction sent by the ground station, or loses the signal from the ground station, the drone performs a return operation, that is, it flies from the current position to the take-off point or to the area around the take-off point.
  • The first image containing the object may include a human body image and/or a face image. Taking the drone return scenario shown in FIG. 3 as an example, the drone can fly to a position obliquely below the take-off point, so that a face image containing the object can be collected by the image acquisition device.
  • The drone may collect a first image containing an object through the image acquisition device at preset time intervals until it recognizes that the object contained in the first image is the target object; a minimal sketch of this acquisition loop is given below.
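  • A minimal sketch of this periodic acquisition loop; `capture_image` and `is_target` are hypothetical callables standing in for the image acquisition device and the identification step, and the timeout is added here purely for illustration.

```python
import time

def reacquire_target(capture_image, is_target, interval_s=1.0, timeout_s=60.0):
    """Collect an image every `interval_s` seconds during the return flight
    until the object it contains is recognized as the target object."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        image = capture_image()           # first image containing an object
        if image is not None and is_target(image):
            return image                  # target object re-identified
        time.sleep(interval_s)            # wait for the preset time interval
    return None                           # target not re-identified before timeout
```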
  • During the return flight, the drone can fly from its current position to the take-off point. During the flight from the current position to the take-off point, the drone can adjust the focal length of the image acquisition device so that it can collect a human body image containing the object. If the drone collects a human body image containing the object before flying to the take-off point, it can process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can identify the object as the target object and then perform the corresponding operation based on the target object.
  • Alternatively, the drone can fly from the current position to the take-off point. During this flight, the drone can adjust the focal length of the image acquisition device so that it can collect a face image containing the object. If the drone collects a face image containing the object before flying to the take-off point, it can process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object and then perform the corresponding operation based on the target object.
  • Alternatively, the drone can fly from the current position to a position below the take-off point and collect a human body image containing an object through the image acquisition device. If the drone collects a human body image containing the object before flying below the take-off point, it can process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can identify the object as the target object and then perform the corresponding operation based on the target object.
  • Alternatively, the drone can fly from the current position to a position below the take-off point and collect a face image containing an object through the image acquisition device. If the drone collects a face image containing the object before flying below the take-off point, it can process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object and then perform the corresponding operation based on the target object.
  • Alternatively, the drone can fly from the current position to the area around the take-off point and collect a human body image containing an object through the image acquisition device. If the drone collects a human body image containing the object before flying to the area around the take-off point, it can process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can descend to a certain height so that a face image of the object can be collected by the image acquisition device, and process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object and then perform the corresponding operation based on the target object.
  • Alternatively, the drone can fly from the current position to the area around the take-off point and collect a human body image containing an object through the image acquisition device. If the drone collects a human body image containing the object before flying to the area around the take-off point, it can process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can adjust the focal length of the image acquisition device so that a face image of the object can be collected, and process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object.
  • The flight trajectory of the drone from the current position to the take-off point and the flight trajectory of the drone from the take-off point to the current position may be the same or different; this is not specifically limited by the embodiments of the present invention.
  • Step 103: Identify the object as the target object according to the first image and the identity information.
  • The drone may process the first image to obtain the identity information of the object, and if the identity information of the object matches the identity information of the target object, identify the object as the target object.
  • The identity information of the object includes the human body features and/or facial features of the object.
  • The drone may determine, through pedestrian re-identification (ReID) technology, whether the human body features of the object match the human body features of the target object. If the human body features of the object match the human body features of the target object, the drone can identify the object as the target object.
  • For example, the drone may compare the human body features of the object with the human body features of the target object to obtain a distance value; if the distance value is less than a first preset distance threshold, the drone may determine that the human body features of the object match the human body features of the target object.
  • The drone may also use face recognition technology to determine whether the facial features of the object match the facial features of the target object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object.
  • For example, the drone may compare the facial features of the object with the facial features of the target object to obtain a distance value; if the distance value is less than a second preset distance threshold, the drone may determine that the facial features of the object match the facial features of the target object.
  • Alternatively, the drone may first determine whether the human body features of the object match the human body features of the target object; if they match, it then determines whether the facial features of the object match the facial features of the target object; and if the facial features also match, it identifies the object as the target object. If the human body features of the object match the human body features of the target object, the drone can determine that the similarity between the object and the target object is high; if, further, the facial features of the object match the facial features of the target object, the drone can determine that the object is the target object, which improves the accuracy of target object recognition. A minimal sketch of this two-stage matching is given below.
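  • A minimal sketch of the threshold-based matching and the two-stage (human body features first, then facial features) check described above, assuming the features are numeric vectors compared by Euclidean distance; the metric and thresholds are illustrative choices, not values specified by the patent.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def features_match(obj_feat, target_feat, threshold):
    # A smaller distance value means the two feature vectors are more similar.
    return euclidean(obj_feat, target_feat) < threshold

def is_target_object(obj, target, body_threshold=0.5, face_threshold=0.4):
    """Two-stage check: human body features first (e.g. from a ReID model),
    then facial features, to improve recognition accuracy."""
    if not features_match(obj["body"], target["body"], body_threshold):
        return False  # body features do not match: reject early
    return features_match(obj["face"], target["face"], face_threshold)

# Example with toy 3-dimensional feature vectors.
target = {"body": [0.1, 0.2, 0.3], "face": [0.4, 0.5, 0.6]}
candidate = {"body": [0.12, 0.19, 0.31], "face": [0.41, 0.52, 0.58]}
print(is_target_object(candidate, target))  # True for these toy vectors
```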
  • Step 104: Perform a corresponding operation based on the target object.
  • After the drone identifies the object as the target object, it can enter the gesture control mode, the selfie mode, or the target tracking mode based on the target object.
  • In this embodiment, the identity information of the target object is acquired during the take-off of the drone, the first image containing the object is collected by the image acquisition device during the return flight of the drone, the object is identified as the target object based on the first image and the identity information, and the corresponding operation is performed based on the target object. The user does not need to re-designate the target object, and the target object can be automatically identified during the return flight, which improves the convenience of operation.
  • FIG. 4 is a flowchart of another return-to-home control method according to an embodiment of the present invention. As shown in FIG. 4, the method may include the following steps:
  • Step 401: During the take-off of the drone, obtain the identity information of the target object.
  • An image containing the target object may be collected by the image acquisition device, and the image may be processed to obtain the identity information of the target object.
  • The identity information may include the human body features and/or facial features of the target object.
  • The preset position may be a position on the ground, in the target object's hand, or on a desktop.
  • If the image acquisition device faces the target object during the take-off process, the drone can collect an image containing the target object through the image acquisition device before flying to the take-off point. If the image acquisition device does not face the target object during take-off, the drone can rotate the gimbal after flying to the take-off point so that the image acquisition device faces the target object, and then collect an image containing the target object.
  • the drone may receive the second image containing the target object sent by the ground station, and process the second image to obtain the identity information of the target object.
  • Step 402: Obtain the take-off position of the drone during take-off.
  • The drone can obtain the geographic coordinates and altitude of the take-off position. For example, if the drone flies to the take-off point based on the geographic coordinates and altitude sent by the ground station, the drone can use the geographic coordinates and altitude sent by the ground station as the take-off position of the drone during take-off. For another example, if the drone flies to the take-off point based on the user's joystick operation on the remote controller, the drone can obtain the take-off position of the drone during take-off by using a preset positioning technology.
  • The preset positioning technology may include Global Positioning System (GPS) technology, base station communication positioning technology, or obtaining the altitude at which the drone flies to the take-off point by using a barometric pressure sensor. A minimal sketch of recording the take-off position is given below.
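  • A minimal sketch of recording the take-off position as described above; the GPS fix, barometer reading, and ground-station coordinates are hypothetical inputs.

```python
from dataclasses import dataclass

@dataclass
class TakeoffPosition:
    latitude: float
    longitude: float
    altitude_m: float

def record_takeoff_position(gps_fix=None, barometer_altitude_m=None,
                            ground_station_coords=None):
    """Prefer the coordinates and altitude sent by the ground station; otherwise
    fall back to the onboard GPS fix and the barometric altitude."""
    if ground_station_coords is not None:
        lat, lon, alt = ground_station_coords
        return TakeoffPosition(lat, lon, alt)
    lat, lon = gps_fix
    return TakeoffPosition(lat, lon, barometer_altitude_m)

# Example: take-off point selected on the ground station's digital map.
print(record_takeoff_position(ground_station_coords=(22.54, 113.95, 30.0)))
```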
  • Step 403: During the return flight of the drone, control the drone to return to the take-off position.
  • When the drone receives a return instruction sent by the ground station, or loses the signal from the ground station, the drone performs a return operation, that is, it flies from the current position to the take-off position. Taking the drone return scenario shown in FIG. 5 as an example, after the drone flies to the take-off point, it descends to a certain height so that a face image containing the object can be collected by the image acquisition device. A minimal sketch of this return sequence is given below.
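  • A rough sketch of this return sequence (fly back to the recorded take-off position, then descend and try to re-identify the target object); the `drone` interface is invented for illustration.

```python
def return_and_reacquire(drone, takeoff_position, descend_to_m=5.0):
    """Return-to-home sequence sketched from the description above."""
    drone.fly_to(takeoff_position)      # fly from the current position to the take-off position
    drone.descend_to(descend_to_m)      # descend so that face images can be collected
    image = drone.capture_image()       # first image containing an object
    if image is not None and drone.matches_target(image):
        drone.resume_takeoff_mode()     # e.g. selfie or target tracking mode
        return True
    return False
```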
  • Step 404: Collect a first image containing an object through the image acquisition device.
  • During the return flight, a first image containing an object may be collected by the image acquisition device.
  • The first image containing the object may include a human body image and/or a face image.
  • For example, the drone can fly from the current position to the take-off point, collect a human body image containing the object through the image acquisition device, and process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can identify the object as the target object.
  • Alternatively, the drone can fly from the current position to the take-off point and adjust the focal length of the image acquisition device so that it can collect a face image containing the object, and process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object.
  • Alternatively, the drone can fly from the current position to the take-off point, collect a human body image containing the object through the image acquisition device, and process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can descend to a certain height so that a face image of the object can be collected by the image acquisition device, and process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object.
  • Alternatively, the drone can fly from the current position to the take-off point, collect a human body image containing the object through the image acquisition device, and process the human body image to obtain the human body features of the object. If the human body features of the object match the human body features of the target object, the drone can adjust the focal length of the image acquisition device so that a face image of the object can be collected, and process the face image to obtain the facial features of the object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object.
  • The flight trajectory of the drone from the current position to the take-off point and the flight trajectory of the drone from the take-off point to the current position may be the same or different; this is not specifically limited by the embodiments of the present invention.
  • Step 405: Identify the object as the target object according to the first image and the identity information.
  • The drone may process the first image to obtain the identity information of the object, and if the identity information of the object matches the identity information of the target object, identify the object as the target object.
  • The identity information of the object includes the human body features and/or facial features of the object.
  • The drone may determine, through ReID technology, whether the human body features of the object match the human body features of the target object; if they match, the drone can identify the object as the target object. For example, the drone may compare the human body features of the object with the human body features of the target object to obtain a distance value; if the distance value is less than the first preset distance threshold, the drone may determine that the human body features of the object match the human body features of the target object.
  • The drone may also use face recognition technology to determine whether the facial features of the object match the facial features of the target object. If the facial features of the object match the facial features of the target object, the drone can identify the object as the target object.
  • For example, the drone may compare the facial features of the object with the facial features of the target object to obtain a distance value; if the distance value is less than the second preset distance threshold, the drone may determine that the facial features of the object match the facial features of the target object.
  • Alternatively, the drone may first determine whether the human body features of the object match the human body features of the target object; if they match, it then determines whether the facial features of the object match the facial features of the target object; and if the facial features also match, it identifies the object as the target object. If the human body features of the object match the human body features of the target object, the drone can determine that the similarity between the object and the target object is high; if, further, the facial features of the object match the facial features of the target object, the drone can determine that the object is the target object, which improves the accuracy of target object recognition.
  • Step 406: Perform a corresponding operation based on the target object.
  • After the drone identifies the object as the target object, it can enter the gesture control mode, the selfie mode, or the target tracking mode based on the target object.
  • In this embodiment, the identity information of the target object and the take-off position of the drone are acquired during take-off; during the return flight, the drone is controlled to return to the take-off position, the image acquisition device collects a first image containing an object, the object is identified as the target object based on the first image and the identity information, and the corresponding operation is performed based on the target object. The user does not need to re-designate the target object, and the target object can be automatically identified during the return flight, which improves the convenience of operation.
  • FIG. 6 is a structural diagram of a return-to-home control device according to an embodiment of the present invention.
  • The return-to-home control device 600 includes a memory 601 and a processor 602.
  • The memory 601 stores program code, and the processor 602 calls the program code in the memory 601.
  • When the program code is executed, the processor 602 performs the following operations: during the take-off of the drone, obtaining the identity information of the target object; during the return flight of the drone, collecting a first image containing an object through an image acquisition device; identifying the object as the target object according to the first image and the identity information; and performing a corresponding operation based on the target object.
  • When the processor 602 identifies the object as the target object according to the first image and the identity information, the processor 602 performs the following operations:
  • processing the first image to obtain the identity information of the object, where the identity information of the object includes the human body features and/or facial features of the object;
  • if the identity information of the object matches the identity information of the target object, identifying the object as the target object.
  • The identity information of the target object includes the human body features and facial features of the target object.
  • When the identity information of the object matches the identity information of the target object and the processor 602 identifies the object as the target object, the processor 602 performs the following operations:
  • When the processor 602 collects the first image containing the object through the image acquisition device during the return flight of the drone, the following operations are performed:
  • collecting the first image through the image acquisition device.
  • When the processor 602 collects the first image containing the object through the image acquisition device during the return flight of the drone, the following operations are performed:
  • collecting the first image containing the object through the image acquisition device at preset time intervals until it is recognized that the object contained in the first image is the target object.
  • When the processor 602 obtains the identity information of the target object during the take-off of the drone, the processor 602 performs the following operations:
  • When the processor 602 obtains the identity information of the target object during the take-off of the drone, the processor 602 performs the following operations:
  • When the processor 602 performs a corresponding operation based on the target object, the processor 602 performs the following operations:
  • The return-to-home control device provided in this embodiment can execute the return-to-home control method of the drone provided in the foregoing embodiments; its execution manner and beneficial effects are similar and will not be repeated here. A minimal sketch of such a memory-plus-processor device is given below.
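  • Purely as an illustration of the memory-plus-processor structure described above (not DJI's actual firmware interface), the return-to-home control device could be modelled as follows.

```python
class ReturnHomeControlDevice:
    """Toy model of the return-to-home control device 600: a memory holding
    program code (represented here by plain callables) and a processor that
    calls and executes that code."""

    def __init__(self):
        self.memory = {}                     # stands in for the memory 601

    def store(self, name, code):
        self.memory[name] = code             # store program code in the memory

    def execute(self, name, *args, **kwargs):
        # The processor 602 calls the stored program code and runs it.
        return self.memory[name](*args, **kwargs)

device = ReturnHomeControlDevice()
device.store("identify", lambda obj_id, target_id: obj_id == target_id)
print(device.execute("identify", "person_A", "person_A"))  # True
```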
  • An embodiment of the present invention also provides a drone.
  • the drone includes:
  • a power system installed on the fuselage and configured to provide power to the drone
  • An image acquisition device installed on the fuselage and configured to acquire a first image including an object
  • the drone further includes a position sensor
  • the position sensor is installed on the fuselage and is used to obtain a take-off position of the drone during take-off.
  • the drone further includes:
  • a communication device is installed on the fuselage and is used for information interaction with a ground station.
  • the disclosed apparatus and method may be implemented in other manners.
  • The device embodiments described above are merely illustrative.
  • The division of the units is only a division of logical functions.
  • In actual implementation, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, which may be electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objective of the solution of this embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • the above integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium.
  • The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some of the steps of the methods described in the embodiments of the present invention.
  • The aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the present invention relate to a return-to-home control method, apparatus, and device. The method includes: acquiring identity information of a target object during the take-off process of an unmanned aerial vehicle; during the return flight of the unmanned aerial vehicle, collecting a first image containing an object through an image acquisition device; identifying the object as the target object according to the first image and the identity information; and performing a corresponding operation based on the target object. Embodiments of the present invention can automatically identify a target object during the return-to-home process, thereby improving the convenience of operation.
PCT/CN2018/097776 2018-07-31 2018-07-31 Return-to-home control method, apparatus, and device WO2020024104A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/097776 WO2020024104A1 (fr) 2018-07-31 2018-07-31 Return-to-home control method, apparatus, and device
CN201880010540.4A CN110291482A (zh) 2018-07-31 2018-07-31 Return-to-home control method, apparatus, and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/097776 WO2020024104A1 (fr) 2018-07-31 2018-07-31 Return-to-home control method, apparatus, and device

Publications (1)

Publication Number Publication Date
WO2020024104A1 true WO2020024104A1 (fr) 2020-02-06

Family

ID=68001271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/097776 WO2020024104A1 (fr) 2018-07-31 2018-07-31 Return-to-home control method, apparatus, and device

Country Status (2)

Country Link
CN (1) CN110291482A (fr)
WO (1) WO2020024104A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110781821B (zh) * 2019-10-25 2022-11-01 上海商汤智能科技有限公司 基于无人机的目标检测方法及装置、电子设备和存储介质
CN111580546B (zh) * 2020-04-13 2023-06-06 深圳蚁石科技有限公司 无人机自动返航方法和装置
CN111629149A (zh) * 2020-06-09 2020-09-04 金陵科技学院 一种基于小型无人机的自拍装置及其控制方法
CN111813145B (zh) * 2020-06-30 2024-06-11 深圳市万翼数字技术有限公司 无人机巡航的控制方法及相关系统
CN111679695B (zh) * 2020-08-11 2020-11-10 中航金城无人系统有限公司 一种基于深度学习技术的无人机巡航及追踪系统和方法
WO2023159611A1 (fr) * 2022-02-28 2023-08-31 深圳市大疆创新科技有限公司 Procédé et dispositif de photographie d'image, et plateforme mobile

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105974940A (zh) * 2016-04-29 2016-09-28 优利科技有限公司 适用于飞行器的目标跟踪方法
CN106292721A (zh) * 2016-09-29 2017-01-04 腾讯科技(深圳)有限公司 一种控制飞行器跟踪目标对象的方法、设备及系统
CN106598071A (zh) * 2016-12-20 2017-04-26 北京小米移动软件有限公司 跟随式的飞行控制方法及装置、无人机
US20180149479A1 (en) * 2015-06-29 2018-05-31 Yuneec Technology Co., Limited Geo-location or navigation camera, and aircraft and navigation method therefor

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9442485B1 (en) * 2014-08-13 2016-09-13 Trace Live Network Inc. Pixel based image tracking system for unmanned aerial vehicle (UAV) action camera system
CN106020227B (zh) * 2016-08-12 2019-02-26 北京奇虎科技有限公司 无人机的控制方法、装置
KR20180051996A (ko) * 2016-11-09 2018-05-17 삼성전자주식회사 무인 비행 장치 및 이를 이용한 피사체 촬영 방법
WO2018098678A1 (fr) * 2016-11-30 2018-06-07 深圳市大疆创新科技有限公司 Procédé, dispositif et appareil de commande d'aéronef, et aéronef
CN106909172A (zh) * 2017-03-06 2017-06-30 重庆零度智控智能科技有限公司 环绕跟踪方法、装置和无人机
CN107749942A (zh) * 2017-09-13 2018-03-02 深圳天珑无线科技有限公司 悬浮拍摄方法、移动终端及计算机可读存储介质
CN107948583A (zh) * 2017-10-31 2018-04-20 易瓦特科技股份公司 基于无人机对目标对象进行标识的方法、系统及装置
CN107863153A (zh) * 2017-11-24 2018-03-30 中南大学 一种基于智能大数据的人体健康特征建模测量方法与平台

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180149479A1 (en) * 2015-06-29 2018-05-31 Yuneec Technology Co., Limited Geo-location or navigation camera, and aircraft and navigation method therefor
CN105974940A (zh) * 2016-04-29 2016-09-28 优利科技有限公司 适用于飞行器的目标跟踪方法
CN106292721A (zh) * 2016-09-29 2017-01-04 腾讯科技(深圳)有限公司 一种控制飞行器跟踪目标对象的方法、设备及系统
CN106598071A (zh) * 2016-12-20 2017-04-26 北京小米移动软件有限公司 跟随式的飞行控制方法及装置、无人机

Also Published As

Publication number Publication date
CN110291482A (zh) 2019-09-27

Similar Documents

Publication Publication Date Title
WO2020024104A1 (fr) Procédé, appareil et dispositif de commande de retour
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
US11566915B2 (en) Method, device and system for processing a flight task
US10674062B2 (en) Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle
US20190369613A1 (en) Electronic device and method for controlling multiple drones
CN105425815B (zh) 一种利用无人飞行器的牧场智能管理系统及方法
CN110494360B (zh) 用于提供自主摄影及摄像的系统和方法
WO2021223124A1 (fr) Procédé et dispositif d'obtention d'informations de position, et support de stockage
WO2018001245A1 (fr) Commande de robot à l'aide de gestes
WO2018209898A1 (fr) Dispositif de traitement d'informations, procédé de génération de trajet de photographie aérienne, système de génération de trajet de photographie aérienne, programme et support d'enregistrement
US20220270277A1 (en) Computing a point cloud from stitched images
CN110688914A (zh) 一种手势识别的方法、智能设备、存储介质和电子设备
EP4174716A1 (fr) Procédé et dispositif de suivi de piéton, et support d'enregistrement lisible par ordinateur
US11509809B2 (en) Following control method, control terminal, and unmanned aerial vehicle
WO2019061111A1 (fr) Procédé de réglage de trajet et véhicule aérien sans pilote
CN110546682A (zh) 信息处理装置、航拍路径生成方法、航拍路径生成系统、程序以及记录介质
CN113747069A (zh) 一种拍摄控制方法、装置及控制设备、拍摄设备
US9141191B2 (en) Capturing photos without a camera
US11945583B2 (en) Method for generating search information of unmanned aerial vehicle and unmanned aerial vehicle
CN115649501A (zh) 一种无人机夜行照明系统及方法
US20220157032A1 (en) Multi-modality localization of users
Abate et al. Remote 3D face reconstruction by means of autonomous unmanned aerial vehicles
CN112631333B (zh) 无人机的目标跟踪方法、装置及图像处理芯片
WO2022082440A1 (fr) Procédé, appareil et système pour déterminer une stratégie de suivi de cible et dispositif et support de stockage
EP4246463A1 (fr) Dispositif et procédé de détection d'objet à échelles multiples

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18928530

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18928530

Country of ref document: EP

Kind code of ref document: A1