WO2023178495A1 - UAV, control terminal, server and control method thereof - Google Patents

UAV, control terminal, server and control method thereof

Info

Publication number
WO2023178495A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
drone
image
environment
control terminal
Prior art date
Application number
PCT/CN2022/082119
Other languages
English (en)
French (fr)
Inventor
龚鼎
陆泽早
刘昂
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2022/082119 (WO2023178495A1)
Priority to CN202280050802.6A (CN117677911A)
Publication of WO2023178495A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions

Definitions

  • Embodiments of the present invention relate to the field of electronic control technology, and in particular, to a drone, a control terminal, a server and a control method thereof.
  • Drones are widely used in surveying, accident search and rescue, equipment inspection, and mapping. In these application fields, drones usually perform tasks independently, that is, each drone measures, photographs, and tracks the target object it is concerned with on its own, and information related to the target object is rarely shared. However, this mode of operation cannot meet the needs of certain operation scenarios. For example, when a tourist is lost in a mountainous area, rescuers may fly multiple drones over the area to search for the lost tourist. When one of the drones finds the tourist, it is often desirable for the other drones to learn the location or orientation of the tourist so that they can also observe the tourist. At present, however, information about a target object observed by one UAV cannot be shared with other UAVs, so this requirement cannot be met, which makes the collaborative operation of multiple UAVs on a target object less intelligent.
  • Embodiments of the present invention provide an unmanned aerial vehicle, a control terminal, a server and control methods thereof, so that a UAV can share information about an observed target object with other UAVs, enabling those UAVs to observe the target object using that information.
  • the first aspect of the embodiment of the present invention is to provide a method for controlling a drone.
  • the method includes:
  • the sensing data output by the drone's observation sensor sensing the target object in the environment is obtained, and the position of the target object is determined based on the sensing data;
  • the position of the target object is sent to another drone flying in the environment, or the position of the target object is sent to a first relay device so that the first relay device sends the position of the target object to another drone flying in the environment, whereupon the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the second aspect of the embodiment of the present invention is to provide a control method for a control terminal of an unmanned aerial vehicle.
  • the method includes:
  • the position of the target object in the environment sent by the drone is obtained, where the position is determined by the drone from the sensing data output by its configured observation sensor sensing the target object;
  • the position of the target object is sent to another drone flying in the environment, or the position of the target object is sent to a second relay device so that the second relay device sends the position of the target object to another drone flying in the environment, whereupon the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the third aspect of the embodiment of the present invention is to provide a server control method.
  • the method includes:
  • the position of the target object in the environment is obtained, where the position of the target object is determined from the sensing data output by a drone in the environment sensing the target object with its configured observation sensor; and the position of the target object is sent to another unmanned aerial vehicle flying in the environment, so that the other unmanned aerial vehicle adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the fourth aspect of the embodiment of the present invention is to provide another method for controlling a control terminal of a drone.
  • the method includes:
  • the position of a target object in the environment is obtained, where the position of the target object is determined from the sensing data output by another drone flying in the environment sensing the target object with its configured observation sensor; and the position of the target object is sent to the drone, so that the drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the fifth aspect of the embodiment of the present invention is to provide another method of controlling a UAV, the method including:
  • the position of the target object in the environment is obtained, where the position of the target object is determined from the sensing data output by another drone in the environment sensing the target object with its configured observation sensor;
  • the shooting direction of the shooting device is adjusted according to the position of the target object to face the target object.
  • the sixth aspect of the embodiments of the present invention is to provide a drone, including an observation sensor and a processor, where the processor is used to perform the following steps:
  • the sensing data output by the drone's observation sensor sensing the target object in the environment is obtained, and the position of the target object is determined based on the sensing data;
  • the position of the target object is sent to another drone flying in the environment, or the position of the target object is sent to a first relay device so that the first relay device sends the position of the target object to another drone flying in the environment, whereupon the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the seventh aspect of the embodiment of the present invention is to provide a control terminal for a drone, including a memory and a processor,
  • the memory is used to store program code
  • the processor is used to call and execute the program code to perform the following steps:
  • the position of the target object in the environment sent by the drone is obtained, where the position is determined by the drone from the sensing data output by its configured observation sensor sensing the target object;
  • the position of the target object is sent to another drone flying in the environment, or the position of the target object is sent to a second relay device so that the second relay device sends the position of the target object to another drone flying in the environment, whereupon the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the eighth aspect of the embodiment of the present invention is to provide a server, including a memory and a processor,
  • the memory is used to store program code
  • the processor is used to call and execute the program code to perform the following steps:
  • the position of the target object in the environment is obtained, where the position of the target object is determined from the sensing data output by a drone in the environment sensing the target object with its configured observation sensor; and the position of the target object is sent to another unmanned aerial vehicle flying in the environment, so that the other unmanned aerial vehicle adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the ninth aspect of the embodiment of the present invention is to provide another drone control terminal, including a memory and a processor,
  • the memory is used to store program code
  • the processor is used to call and execute the program code to perform the following steps:
  • the position of a target object in the environment is obtained, where the position of the target object is determined from the sensing data output by another drone flying in the environment sensing the target object with its configured observation sensor; and the position of the target object is sent to the drone, so that the drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the tenth aspect of the embodiments of the present invention is to provide another drone, including a shooting device and a processor, where the processor is used to perform the following steps:
  • the position of the target object in the environment is obtained, where the position of the target object is determined from the sensing data output by another drone in the environment sensing the target object with its configured observation sensor;
  • the shooting direction of the shooting device is adjusted according to the position of the target object to face the target object.
  • an eleventh aspect of the present invention is to provide a computer-readable storage medium. Program instructions are stored in the computer-readable storage medium, and the program instructions are used to implement the control method described in any one of the first to fifth aspects.
  • In the embodiments of the present invention, the drone can determine the position of the target object using its configured observation sensor and transmit the position of the target object to another drone, so that the other drone can adjust the shooting direction of its shooting device to face the target object according to the position of the target object. In this way, the other drone can observe the target object through its shooting device, which improves the intelligence of the collaborative operation of multiple drones on the target object.
  • Figure 1 is a schematic flow chart of a first UAV control method provided by an embodiment of the present invention
  • Figure 2 is a schematic flowchart of the control method of the first control terminal of the first drone provided by an embodiment of the present invention
  • FIG. 3 is a schematic flowchart of a server control method provided by an embodiment of the present invention.
  • Figure 4 is a schematic flowchart of the control method of the second control terminal of the second drone provided by an embodiment of the present invention
  • Figure 5 is a schematic flowchart of a second UAV control method provided by an embodiment of the present invention.
  • Figure 6 is a schematic structural diagram of the first UAV provided by an embodiment of the present invention.
  • Figure 7 is a schematic structural diagram of the first control terminal of the first drone provided by an embodiment of the present invention.
  • Figure 8 is a schematic structural diagram of a server provided by an embodiment of the present invention.
  • Figure 9 is a schematic structural diagram of the second control terminal of the second drone provided by an embodiment of the present invention.
  • Figure 10 is a schematic structural diagram of a second drone provided by an embodiment of the present invention.
  • Embodiments of the present invention provide a control method for a drone.
  • the execution subject of the control method may be a drone.
  • the drone here (i.e., the execution subject of the method) may be called the first drone, and the other drone mentioned below may be called the second drone.
  • the method includes:
  • S101 While the drone is flying in the environment, obtain the sensing data output by the drone's observation sensor sensing the target object in the environment, and determine the position of the target object based on the sensing data;
  • The control terminal of the UAV here (i.e., the control terminal of the first UAV) may be called the first control terminal, and the control terminal of the other UAV (i.e., the control terminal of the second UAV) may be called the second control terminal.
  • the control terminal may include one or more of a remote control, a smart phone, a tablet, and a wearable device.
  • the control terminal may conduct wireless communication connection with the drone and control the drone through the wireless communication connection.
  • the control terminal may include interactive devices such as joysticks, buttons, dial wheels, or touch display screens, and the control terminal can detect the various types of user operations described below through these interactive devices.
  • the first drone may include an observation sensor, and the observation sensor may be any sensor capable of outputting sensing data such as an image, a distance, or a position.
  • the first drone includes a pan/tilt for mounting the observation sensor and adjusting the observation direction of the observation sensor.
  • the observation direction of the observation sensor can be determined or adjusted according to the attitude of the drone's body and/or the attitude of the gimbal.
  • the first unmanned aerial vehicle can obtain the sensing data (images, distances, positions, etc., as mentioned above) output by the observation sensor sensing the target object in the environment, and determine the position of the target object according to the sensing data.
  • the target object may be selected by the user of the first drone.
  • the target object may be selected by the user operating the first control terminal of the first drone.
  • the position of the target object may be a three-dimensional position, such as longitude, latitude and altitude.
  • the location of the target object may be a two-dimensional location, such as longitude and latitude.
  • the position of the target object can be expressed in any position representation disclosed in the industry.
  • the coordinate system of the position of the target object may be a world coordinate system, a global coordinate system, a spherical coordinate system, etc.
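  • For illustration only, the following minimal sketch shows one way such a target-object position could be represented as a message, with an explicit coordinate system and an optional altitude for the two-dimensional case; the type and field names (TargetPosition, frame, source_uav_id) are assumptions and are not part of the patent.

```python
# Illustrative sketch (not from the patent): one possible representation of the
# target-object position described above, with an explicit coordinate frame tag.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetPosition:
    latitude: float            # degrees, WGS-84 assumed
    longitude: float           # degrees, WGS-84 assumed
    altitude: Optional[float]  # metres; None for a two-dimensional position
    frame: str = "WGS84"       # coordinate system of the position
    source_uav_id: str = ""    # identity information of the observing drone

# A three-dimensional position (longitude, latitude and altitude):
pos_3d = TargetPosition(latitude=22.54, longitude=113.95, altitude=87.0, source_uav_id="UAV-1")
# A two-dimensional position (longitude and latitude only):
pos_2d = TargetPosition(latitude=22.54, longitude=113.95, altitude=None, source_uav_id="UAV-1")
```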
  • S102 Send the position of the target object to another drone flying in the environment, or send the position of the target object to the first relay device so that the first relay device sends the position of the target object to another drone flying in the environment, whereupon the other drone adjusts the shooting direction of its configured shooting device toward the target object according to the position of the target object.
  • the first UAV sends the position of the target object to another UAV flying in the environment, or sends the position of the target object to the first relay device, which then sends the position of the target object to the other UAV flying in the environment.
  • the other UAV may be called a second UAV, and the second UAV may include a shooting device.
  • the second UAV may include a pan/tilt for mounting the photographing device and adjusting the photographing direction of the photographing device.
  • the second UAV can adjust the attitude of its body and/or the attitude of the pan/tilt according to the position of the target object so as to point the shooting direction of the shooting device toward the target object, so that the target object appears in the picture taken by the second UAV's shooting device.
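  • As a hedged illustration of this adjustment (not the patent's implementation), the sketch below computes the yaw and pitch that would point the second drone's shooting device toward the reported target position, using a flat-earth local approximation; all function and variable names are assumptions.

```python
# Illustrative sketch (assumptions only): compute the yaw and pitch that would point the
# second drone's shooting device at the target, given both positions as lat/lon/alt.
import math

EARTH_RADIUS = 6378137.0  # metres

def pointing_angles(uav_lat, uav_lon, uav_alt, tgt_lat, tgt_lon, tgt_alt):
    """Return (yaw_deg, pitch_deg): yaw measured from north, pitch negative when looking down."""
    d_north = math.radians(tgt_lat - uav_lat) * EARTH_RADIUS
    d_east = math.radians(tgt_lon - uav_lon) * EARTH_RADIUS * math.cos(math.radians(uav_lat))
    d_up = tgt_alt - uav_alt
    yaw = math.degrees(math.atan2(d_east, d_north))      # heading toward the target
    horizontal = math.hypot(d_north, d_east)
    pitch = math.degrees(math.atan2(d_up, horizontal))   # gimbal pitch toward the target
    return yaw, pitch

# Example: the second drone hovers above and away from the reported target position.
yaw, pitch = pointing_angles(22.5460, 113.9470, 120.0, 22.5440, 113.9500, 5.0)
print(f"command body/pan-tilt to yaw {yaw:.1f} deg, pitch {pitch:.1f} deg")
```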
  • the first drone and the second drone may be bound to the same owner (individual, company, or institution) or work group.
  • the first drone and the second drone may be bound to the same owner (individual, company, or institution) or work group through their respective identity information.
  • the identity information may be any information used to distinguish the drone from other drones.
  • the identity information may include the drone's serial number, verification code or QR code, etc.
  • the first drone can be bound to the same owner (individual, company, or institution) or work group together with multiple other drones including the second drone, and the second drone can be determined from these other drones by the user's drone selection operation on the first control terminal.
  • the first drone and other multiple drones can be bound to the same owner (individual, company, or institution) or work group through their respective identity information.
  • In the embodiments of the present invention, the drone can determine the position of the target object using its configured observation sensor and transmit the position of the target object to another drone, so that the other drone can adjust the shooting direction of its shooting device to face the target object according to the position of the target object. In this way, the other drone can observe the target object through its shooting device, which improves the intelligence of the collaborative operation of multiple drones on the target object.
  • the observation sensor includes a photographing device that outputs an image
  • obtaining the sensing data output by the observation sensor of the drone sensing a target object in the environment includes: obtaining an image output by the photographing device shooting the target object in the environment; determining the position of the target object according to the sensing data includes: determining the position of the target object in the image, and determining the position of the target object according to the position of the target object in the image.
  • the position of the target object may be determined based on images collected by the shooting device of the first drone.
  • the first drone can acquire an image output by the shooting device shooting the target object in the environment, determine the position of the target object in the image, and determine the position of the target object based on the position of the target object in the image.
  • the following provides a method for determining the position of the target object according to the position of the target object in the image.
  • the first drone can determine, according to the position of the target object in the image and the shooting direction of the shooting device, the relative orientation between the target object and the first drone, and then determine the position of the target object based on the relative orientation, the height of the first drone, and the position of the first drone.
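  • The sketch below is one hedged way to carry out this determination: it assumes a pinhole camera with a known field of view, flat ground at the target, and the drone's height above that ground, and intersects the pixel's viewing ray with the ground plane; the function name and parameters are assumptions, not the patent's method.

```python
# Illustrative sketch (assumptions only): estimate the ground position of the target from its
# pixel position in the image, the camera's shooting direction, and the drone's height/position.
import math

def target_ground_position(px, py, img_w, img_h, hfov_deg, vfov_deg,
                           cam_yaw_deg, cam_pitch_deg, uav_lat, uav_lon, uav_height):
    """Return (lat, lon) of the ray/ground intersection; cam_pitch is negative when looking down."""
    # Angular offset of the pixel from the optical axis.
    off_yaw = (px / img_w - 0.5) * hfov_deg
    off_pitch = -(py / img_h - 0.5) * vfov_deg
    yaw = math.radians(cam_yaw_deg + off_yaw)
    pitch = math.radians(cam_pitch_deg + off_pitch)
    if pitch >= 0:
        raise ValueError("ray does not intersect the ground plane")
    ground_range = uav_height / math.tan(-pitch)   # horizontal distance to the target
    d_north = ground_range * math.cos(yaw)
    d_east = ground_range * math.sin(yaw)
    earth_r = 6378137.0
    lat = uav_lat + math.degrees(d_north / earth_r)
    lon = uav_lon + math.degrees(d_east / (earth_r * math.cos(math.radians(uav_lat))))
    return lat, lon
```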
  • the target object may be selected by the user during a target object selection operation on the control terminal of the drone that displays the images collected by the shooting device.
  • the target object can be selected by the user operating the control terminal of the first drone; the first drone can send the image to the first control terminal through the wireless communication connection between the first drone and the first control terminal, so that the first control terminal displays the image in real time.
  • the user can perform a target selection operation on the first control terminal to select the target object in the displayed image.
  • the first control terminal determines target object indication information according to the detected target object selection operation, where the target object indication information may include the position of the target object in the image.
  • the first control terminal may include a touch display, the touch display displays the image, and the user may perform a clicking operation or a frame selection operation on the touch display screen to select a target object in the displayed image.
  • the first control terminal may send the target object indication information to the first drone, and the first drone may receive the target object indication information and select the target object in the environment according to the target indication information.
  • determining the position of the target object in the image may include: running an image tracking algorithm on the image according to the target object indication information to obtain the position of the target object in the image.
  • the user selects the target object in a displayed frame of image.
  • this frame of image is only one frame of the images output by the shooting device in real time. Since it is necessary to determine the position of the target object in the images output by the shooting device in real time, the first drone can run an image tracking algorithm on the real-time images output by the shooting device, according to the target object indication information, to obtain the position of the target object in each image.
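  • A minimal sketch of such an image tracking step is shown below, assuming the opencv-contrib-python package is available (in some OpenCV builds the tracker constructor is cv2.legacy.TrackerCSRT_create); the frame source and the initial box from the user's selection are placeholders, and the specific tracker is an assumption rather than the algorithm used by the patent.

```python
# Illustrative sketch of the image-tracking step: follow the user-selected box frame by frame.
import cv2

def track_target(frames, init_box):
    """frames: iterable of BGR images; init_box: (x, y, w, h) from the user's selection.
    Yields the target's box in each frame, i.e. its position in the image."""
    frames = iter(frames)
    first = next(frames)
    tracker = cv2.TrackerCSRT_create()
    tracker.init(first, init_box)
    yield init_box
    for frame in frames:
        ok, box = tracker.update(frame)
        if not ok:
            break                      # tracking lost; a re-detection step could follow
        yield tuple(int(v) for v in box)
```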
  • the first drone can send the image collected by the shooting device to the control terminal of the drone so that the control terminal displays the image.
  • the first control terminal can display a mark indicating the position of the target object in the image in the displayed image, so that the user can understand in real time which object in the image is the target object.
  • the first drone can send the determined position of the target object in the image, as described above, to the first control terminal, so that the first control terminal displays on the displayed image a mark indicating the position of the target object in the image.
  • Alternatively, the first control terminal runs an image tracking algorithm on the real-time image received from the first drone, according to the indication information of the target object as mentioned above, to obtain the position of the target object in the image, and displays a mark indicating the position of the target object in the image on the displayed image according to that position.
  • the mark may include at least one of text, symbols, shading, and graphics.
  • the observation sensor includes a ranging sensor
  • obtaining the sensing data output by the observation sensor of the drone sensing a target object in the environment includes: obtaining the distance to the target object output by the ranging sensor and the observation attitude of the ranging sensor; the position of the target object is determined according to the output distance to the target object and the observation attitude.
  • the observation sensor of the first drone may include a ranging sensor, where the ranging sensor may be various types of ranging sensors.
  • the ranging sensor may be an image-based ranging sensor, such as a binocular camera.
  • the ranging sensor may be a ranging sensor based on transmitting and receiving ranging signals.
  • the ranging sensor includes a transmitter for transmitting a ranging signal and a receiver for receiving the ranging signal reflected by the target object.
  • the ranging signal may be a radar signal, an optical signal, an acoustic signal, etc.
  • the ranging sensor may include a laser ranging sensor, a TOF sensor or various different types of radar.
  • the first UAV obtains the distance of the target object output by the ranging sensor.
  • the first UAV may obtain the observation attitude of the ranging sensor.
  • As mentioned above, the observation direction of the observation sensor can be determined based on the attitude of the UAV's body and/or the attitude of the pan/tilt on which the observation sensor is mounted, and the observation attitude of the ranging sensor can accordingly be determined based on the body attitude of the first UAV.
  • the first UAV determines the position of the target object based on the distance to the target object output by the ranging sensor and the observation direction. Further, the first UAV determines the position of the target object based on the distance to the target object output by the ranging sensor, the observation direction, and the position of the first drone.
  • the position of the first UAV may be acquired by the satellite positioning device of the first UAV.
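  • For illustration, a hedged sketch of this computation is shown below: the reported distance and the observation attitude (yaw and pitch) are converted into north/east/up offsets and added to the first drone's position under a flat-earth approximation; the names and angle conventions are assumptions.

```python
# Illustrative sketch (assumptions only): turn the ranging sensor's distance and observation
# attitude into a target position, starting from the first drone's own satellite-positioned location.
import math

def target_from_range(distance_m, obs_yaw_deg, obs_pitch_deg, uav_lat, uav_lon, uav_alt):
    """obs_pitch is negative when the sensor looks downward; returns (lat, lon, alt)."""
    pitch = math.radians(obs_pitch_deg)
    yaw = math.radians(obs_yaw_deg)
    horizontal = distance_m * math.cos(pitch)
    d_up = distance_m * math.sin(pitch)
    d_north = horizontal * math.cos(yaw)
    d_east = horizontal * math.sin(yaw)
    earth_r = 6378137.0
    lat = uav_lat + math.degrees(d_north / earth_r)
    lon = uav_lon + math.degrees(d_east / (earth_r * math.cos(math.radians(uav_lat))))
    alt = uav_alt + d_up
    return lat, lon, alt
```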
  • the target object may be selected by the user operating the control terminal of the first drone.
  • the operation performed by the user to the control terminal of the first UAV may include an observation direction adjustment operation performed by the user on the first control terminal.
  • the first control terminal can detect the user's observation direction adjustment operation and generate an observation direction adjustment instruction for the ranging sensor according to the detected observation direction adjustment operation, where the observation direction adjustment instruction is used to adjust the observation direction of the first UAV's ranging sensor.
  • the first control terminal and/or the second control terminal may include interactive devices such as joysticks, buttons, dial wheels, or touch display screens, and the observation direction adjustment operation may be performed on these interactive devices. The first control terminal may detect the user's observation direction adjustment operation through the interactive devices.
  • the first control terminal sends the observation direction adjustment instruction to the first drone, and the first drone can adjust the observation direction of the distance sensor to face the target object according to the observation direction adjustment instruction.
  • the first UAV may adjust the attitude of its body and/or the attitude of the pan/tilt on which the ranging sensor is mounted according to the observation direction adjustment instruction, so as to adjust the observation direction of the ranging sensor toward the target object.
  • the first drone may send the image collected by the shooting device to the control terminal of the drone so that the control terminal displays the image.
  • the first drone can determine the position of the target object in the image based on the position of the target object, and send the position of the target object in the image to the first control terminal of the first UAV, so that the first control terminal, according to the position of the target object in the image, displays on the displayed image a mark indicating the position of the target object in the image.
  • the first drone may determine the position of the target object in the image collected by the shooting device based on the relative positional relationship between the ranging sensor and the shooting device and the position of the target object.
  • Alternatively, the first drone may send the position of the target object to the first control terminal of the first drone; the first control terminal determines the position of the target object in the image based on the position of the target object, and displays a mark indicating the position of the target object in the image on the displayed image. Specifically, the first control terminal may determine the position of the target object in the image collected by the shooting device based on the relative positional relationship between the ranging sensor and the shooting device and the position of the target object.
  • the mark may be one or more of text, symbols, shading, and graphics.
  • the ranging sensor and the shooting device are fixedly installed, and the relative positional relationship between the ranging sensor and the shooting device is fixed.
  • the ranging sensor and the shooting device can be fixedly installed on the pan/tilt, where the observation direction of the ranging sensor is parallel to the shooting direction of the shooting device.
  • the ranging sensor and the photographing device may be movably installed, and the relative positional relationship between the ranging sensor and the photographing device may be determined in real time.
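  • The following hedged sketch shows one way the target's position, once expressed in the shooting device's camera frame via that relative positional relationship, could be projected to a pixel so a mark can be drawn there; it assumes a simple pinhole model with known intrinsics, which the patent does not specify.

```python
# Illustrative sketch (assumptions only): project a point given in the camera frame to a pixel
# so that a mark can be drawn at the target's position in the image.
def project_to_image(p_cam, fx, fy, cx, cy):
    """p_cam: (x, y, z) of the target in the camera frame, z along the optical axis, in metres.
    Returns (u, v) pixel coordinates, or None if the point is behind the camera."""
    x, y, z = p_cam
    if z <= 0:
        return None
    return fx * x / z + cx, fy * y / z + cy

# Example: a target roughly 40 m ahead, slightly right of and below the optical axis.
print(project_to_image((1.5, 0.8, 40.0), fx=1400.0, fy=1400.0, cx=960.0, cy=540.0))
```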
  • the first relay device includes at least one of the first control terminal of the first drone, a server, and the second control terminal of the second drone. Specifically, as mentioned above, the first drone sends the position of the target object to the second drone flying in the environment, or sends the position of the target object to the first relay device, which sends the position of the target object to the second UAV flying in the environment. In some cases, the first drone may establish a wireless communication connection with the second drone, and the first drone may send the position of the target object to the second drone through this wireless communication connection. In some cases, the first drone sends the position of the target object to the first relay device, and the first relay device can establish a direct or indirect wireless communication connection with the second drone.
  • the relay device may send the position of the target object to the second drone through the direct or indirect wireless communication connection.
  • In some cases, the first relay device may include the first control terminal, and the first drone may send the position of the target object to the first control terminal. The first control terminal may then send the position of the target object to the server. The server can send the position of the target object to the second control terminal, and the second control terminal can send the position of the target object to the second drone through the wireless communication connection between the second control terminal and the second drone; alternatively, the server can send the position of the target object received from the first control terminal directly to the second drone.
  • In some cases, the first relay device may include a server, and the first drone may send the position of the target object to the server. The server may send the position of the target object to the second control terminal, and the second control terminal may send the position of the target object to the second UAV through the wireless communication connection between the second control terminal and the second UAV. Alternatively, the server can send the position of the target object it has received directly to the second drone.
  • In some cases, the first relay device includes the second control terminal: the first unmanned aerial vehicle sends the position of the target object to the second control terminal, and the second control terminal sends the position of the target object to the second unmanned aerial vehicle.
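  • Purely as an illustration of the relay behaviour described above (the transport, message format, and routing table are assumptions), the sketch below shows a relay that forwards a received target-position message to the next hop for the addressed drone.

```python
# Illustrative sketch (assumptions only): a relay device (first control terminal, server, or
# second control terminal) forwards the received target position toward the addressed drone.
from typing import Callable, Dict

def make_relay(routes: Dict[str, Callable[[dict], None]]) -> Callable[[dict], None]:
    """routes maps a destination drone id to a function delivering a message to the next hop."""
    def forward(message: dict) -> None:
        deliver = routes.get(message["target_uav_id"])
        if deliver is None:
            raise KeyError(f"no route to drone {message['target_uav_id']}")
        deliver(message)
    return forward

# Example: the server relays a position toward the second control terminal, which would in turn
# send it to the second drone over its wireless link.
second_terminal_inbox = []
server_relay = make_relay({"UAV-2": second_terminal_inbox.append})
server_relay({"target_uav_id": "UAV-2", "position": {"lat": 22.544, "lon": 113.950, "alt": 5.0}})
print(second_terminal_inbox)
```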
  • In some cases, the position of the target object is transmitted to the second drone flying in the environment so that the second drone controls the zoom of its shooting device according to the position of the target object; and/or the position of the target object is transmitted to the second drone flying in the environment so that the second drone tracks the target object according to the position of the target object.
  • the position of the target object can be transmitted to the second drone as described above, and the second drone can control the lens zoom of the shooting device according to the position of the target object when shooting the target object.
  • the position of the target object can be transmitted to the second drone as described above, and the second drone can track the target object according to the position of the target object. Further, the second drone can determine whether the preset tracking conditions are met. When the preset tracking conditions are met, the second drone tracks the target object according to the position of the target object.
  • the second drone can track the target object according to the position of the target object and the position of the second UAV, and the position of the second UAV can be collected by the satellite positioning device of the second UAV.
  • the preset tracking conditions may include at least one of the following: the remaining power of the second drone is greater than or equal to a preset power threshold, the distance between the second drone and the first drone or the target object is less than or equal to a preset distance threshold, and the second drone is in a flight state.
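  • A hedged sketch of such a check is shown below; the patent allows any subset of these conditions ("at least one of"), while this example requires all three, and the thresholds and parameter names are assumptions.

```python
# Illustrative sketch (assumptions only) of the preset tracking conditions listed above.
def tracking_conditions_met(remaining_battery_pct, distance_to_source_m, distance_to_target_m,
                            in_flight, battery_threshold_pct=30.0, distance_threshold_m=2000.0):
    """Return True when the second drone should start tracking the shared target.
    The patent allows any subset of the conditions; this sketch checks all three."""
    battery_ok = remaining_battery_pct >= battery_threshold_pct
    distance_ok = min(distance_to_source_m, distance_to_target_m) <= distance_threshold_m
    return battery_ok and distance_ok and in_flight

print(tracking_conditions_met(72.0, 850.0, 640.0, in_flight=True))   # True
print(tracking_conditions_met(18.0, 850.0, 640.0, in_flight=True))   # False: battery too low
```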
  • the first drone can transmit the position of the first drone to the second drone in the same manner as the position of the target object; the second drone can determine the distance to the first drone according to the position of the first drone, and can determine the distance to the target object based on the position of the target object.
  • the second drone can first fly to a preset height, and then track the target object according to its location.
  • the first drone can determine the location of the target object in real time as described above.
  • the location of the target object can be transmitted to the second drone in real time as described above.
  • the second drone can then track the target object based on the position of the target object received in real time.
  • the target object may be the tracking object of the first drone, that is, the first drone tracks the target object.
  • the first drone can determine the speed of the target object based on the sensing data output by the observation sensor, and the first drone can transmit the speed of the target object to the second drone in the same manner as the position of the target object.
  • the second UAV can track the target object according to the speed and position of the target object. The speed of the target object may be determined according to the position of the target object, and the speed of the target object may be determined in real time by the first drone and transmitted to the second drone in real time.
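  • As a hedged illustration, the sketch below estimates the target's velocity from two successively received positions and extrapolates a short-horizon position that a tracking drone could aim for; the flat-earth conversion and all names are assumptions, not the patent's method.

```python
# Illustrative sketch (assumptions only): estimate target speed from successive positions and
# predict a near-future position for tracking.
import math

EARTH_R = 6378137.0

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """East/north offset in metres of (lat, lon) relative to (ref_lat, ref_lon)."""
    x = math.radians(lon - ref_lon) * EARTH_R * math.cos(math.radians(ref_lat))  # east
    y = math.radians(lat - ref_lat) * EARTH_R                                    # north
    return x, y

def estimate_velocity(p0, p1, dt):
    """p0, p1: (lat, lon) observed dt seconds apart; returns (v_east, v_north) in m/s."""
    x1, y1 = to_local_xy(p1[0], p1[1], p0[0], p0[1])
    return x1 / dt, y1 / dt

def predict_position(p1, velocity, horizon_s):
    """Extrapolate the latest position p1 forward by horizon_s seconds."""
    v_east, v_north = velocity
    lat = p1[0] + math.degrees(v_north * horizon_s / EARTH_R)
    lon = p1[1] + math.degrees(v_east * horizon_s / (EARTH_R * math.cos(math.radians(p1[0]))))
    return lat, lon

v = estimate_velocity((22.5440, 113.9500), (22.5441, 113.9502), dt=1.0)
print(predict_position((22.5441, 113.9502), v, horizon_s=3.0))
```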
  • Embodiments of the present invention provide a method for controlling a control terminal of a drone.
  • the execution subject of the method may be the control terminal of the drone.
  • the drone here may be the first drone as mentioned above, and the control terminal of the drone is the first control terminal as mentioned above.
  • the method includes:
  • S201 While the first drone is flying in the environment, obtain the position of the target object in the environment sent by the first drone, where the position is determined by the first drone from the sensing data output by its configured observation sensor sensing the target object;
  • S202 Send the position of the target object to a second UAV flying in the environment, or send the position of the target object to a second relay device so that the second relay device sends the position of the target object to the second drone flying in the environment, whereupon the second drone adjusts the shooting direction of its configured shooting device toward the target object according to the position of the target object.
  • In some cases, the first control terminal may display a map of the environment, detect the user's position point selection operation on the displayed map, and determine, based on the detected position point selection operation, the position of the position point selected by the user on the map. The position of the position point is sent to the second drone flying in the environment, or sent to the second relay device so that the second relay device sends the position of the position point to the second drone flying in the environment, whereupon the second drone adjusts the shooting direction of its configured shooting device toward the position point according to the position of the position point.
  • the first control terminal may include a touch display, and the touch display may display the map.
  • the user may perform a click operation on the touch display screen displaying the map, and the first control terminal detects the click operation through the touch display screen and determines the position of the position point selected by the user on the map.
  • the manner in which the first control terminal transmits the position of the position point to the second UAV may be the same as the manner in which the first control terminal transmits the position of the target object to the second UAV, and details will not be described again.
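  • For illustration only, the sketch below converts a click on a displayed map into a geographic position, assuming a simple north-up rectangular map view with known corner coordinates; real map widgets provide their own conversion, so this is only meant to show the idea.

```python
# Illustrative sketch (assumptions only): convert the clicked pixel on the displayed map into a
# geographic position for the selected position point.
def map_click_to_position(px, py, map_w, map_h, north_lat, south_lat, west_lon, east_lon):
    """(px, py) is the click in pixels with the origin at the top-left corner of the map view."""
    lat = north_lat + (py / map_h) * (south_lat - north_lat)   # latitude decreases downward
    lon = west_lon + (px / map_w) * (east_lon - west_lon)
    return lat, lon

# Example: a click in the middle of the map view returns the view's centre coordinate.
print(map_click_to_position(640, 360, 1280, 720, 22.56, 22.52, 113.93, 113.97))
```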
  • In some cases, the position of the target object is transmitted to the second drone flying in the environment so that the second drone adjusts the shooting direction of its configured shooting device toward the target object according to the orientation or position of the target object.
  • the observation sensor includes a shooting device
  • the first control terminal can receive and display the image collected by the shooting device sent by the first drone, detect the user's target object selection operation on the displayed image, determine target object indication information according to the detected target object selection operation, where the target object indication information includes the position of the target object in the image, and send the target object indication information to the first drone so that the first drone selects the target object in the environment.
  • the first control terminal displays a mark indicating the position of the target object in the image on the displayed image.
  • the first control terminal may display the identification in the two ways as mentioned above.
  • In one way, the first control terminal receives the position of the target object in the image sent by the drone and, according to the position of the target object in the image, displays on the displayed image a mark indicating the position of the target object in the image. In another way, the first control terminal runs an image tracking algorithm on the image received from the first drone according to the target object indication information, as mentioned above, to obtain the position of the target object in the image, and displays on the displayed image a mark indicating the position of the target object in the image according to that position.
  • the observation sensor includes a ranging sensor
  • the first drone includes a pan/tilt for installing the ranging sensor and adjusting the observing direction of the ranging sensor
  • the first control terminal detects the user's observation direction adjustment operation, generates an observation direction adjustment instruction according to the detected observation direction adjustment operation, and sends the observation direction adjustment instruction to the first UAV so that the first UAV adjusts the observation direction of the ranging sensor to face the target object according to the instruction.
  • In some cases, the first drone includes a shooting device; the first control terminal receives and displays the images collected by the shooting device sent by the first drone, receives the position of the target object in the image sent by the first drone, and displays on the displayed image a mark indicating the position of the target object in the image, where the position of the target object in the image is determined by the first drone based on the position of the target object.
  • Alternatively, the first control terminal determines the position of the target object in the image according to the position of the target object, and displays a mark indicating the position of the target object in the image on the displayed image.
  • the second relay device includes at least one of the aforementioned server and the aforementioned second control terminal of the second drone.
  • the second drone may be determined by the user performing a drone selection operation on the first control terminal.
  • the first control terminal may display indication information of multiple candidate drones, detect the user's drone selection operation, and determine, based on the detected drone selection operation, the indication information of the drone selected by the user from the indication information of the multiple candidate drones. Specifically, the touch display of the first control terminal can display the indication information of the multiple candidate drones.
  • the indication information of a drone may include at least one of the following: the identity information of the drone as mentioned above, the identity information of the drone's user (such as an ID number, user name, name, or nickname), and the location of the drone.
  • the multiple candidate drones may be drones whose distance from the first drone is less than or equal to a preset distance threshold.
  • the plurality of candidate drones may be other multiple drones bound to the same owner (individual, company, or institution) or work group as the first drone as mentioned above.
  • the first control terminal can detect the user's drone selection operation through the interactive device as described above and, based on the detected drone selection operation, determine from the indication information of the plurality of candidate drones the indication information of the drone selected by the user (that is, the indication information of the second drone). The position of the target object is then sent to the second drone flying in the environment corresponding to the selected indication information, or the position of the target object is sent to the second relay device so that the second relay device sends the position of the target object to the second drone flying in the environment corresponding to the selected indication information.
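  • A hedged sketch of assembling and resolving such a candidate list is shown below; the distance threshold, the binding check by owner, and all field names are assumptions used only for illustration.

```python
# Illustrative sketch (assumptions only): build the candidate list displayed by the first
# control terminal and resolve the user's drone selection operation to the second drone.
import math

def ground_distance_m(a, b):
    """Approximate distance between two (lat, lon) points in metres (flat-earth)."""
    r = 6378137.0
    d_north = math.radians(b[0] - a[0]) * r
    d_east = math.radians(b[1] - a[1]) * r * math.cos(math.radians(a[0]))
    return math.hypot(d_north, d_east)

def candidate_drones(all_drones, first_drone, max_distance_m=5000.0):
    """all_drones: list of dicts with 'id', 'owner', 'position' as (lat, lon)."""
    return [d for d in all_drones
            if d["owner"] == first_drone["owner"]
            and d["id"] != first_drone["id"]
            and ground_distance_m(d["position"], first_drone["position"]) <= max_distance_m]

def selected_drone(candidates, selected_id):
    """Resolve the user's drone selection operation to the second drone's record."""
    return next((d for d in candidates if d["id"] == selected_id), None)
```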
  • In some cases, the position of the target object is transmitted to the second drone flying in the environment so that the second drone controls the zoom of its shooting device according to the position of the target object; and/or the position of the target object is transmitted to the second drone flying in the environment so that the second drone tracks the target object according to the position of the target object.
  • An embodiment of the present invention provides a server control method.
  • the execution subject of the method may be the server as mentioned above.
  • the method includes:
  • S301 Obtain the position of the target object in the environment, where the position of the target object is determined from the sensing data output by the first drone in the environment sensing the target object with its configured observation sensor;
  • the server can obtain the position of the target object in the environment.
  • the server can obtain the position of the target object sent by the first drone, or the server can obtain the position of the target object sent by the control terminal of the first drone.
  • S302 Send the position of the target object to the second drone, or send the position of the target object to a third relay device so that the third relay device sends the position of the target object to the second drone;
  • the server can send the position of the target object to the second drone.
  • the server can send the position of the target object to the third relay device, where the third relay device may include the second control terminal of the second drone as described above, so that the second control terminal can send the position of the target object to the second drone.
  • In some cases, the first drone and the second drone may be bound to the same owner (individual, company, or institution) or work group. The server may determine the second drone bound to the same owner (individual, company, or institution) or work group as the first drone, and the server sends the position of the target object to the bound second drone, or sends the position of the target object to a third relay device so that the third relay device sends the position of the target object to the bound second UAV flying in the environment.
  • As mentioned above, the first UAV and the second UAV can be bound through their respective identity information (i.e., the identity information of the first UAV and the identity information of the second UAV). In this case, the server can obtain the identity information of the first drone and determine the second drone bound to the first drone based on the identity information of the first drone. The server may obtain the identity information of the first drone in the same manner as the server obtains the position of the target object in the environment.
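  • The sketch below illustrates, under assumed data structures, how a server-side registry keyed by drone identity information could be used to find the drones bound to the same owner or work group; the registry layout and field names are not from the patent.

```python
# Illustrative sketch (assumptions only): look up drones bound to the same owner or work group
# as the first drone, keyed by identity information such as the serial number.
def bound_drones(registry, first_drone_id):
    """registry: dict mapping drone identity (e.g. serial number) to {'owner': ..., 'workgroup': ...}.
    Returns the identities of the other drones bound to the same owner or work group."""
    first = registry[first_drone_id]
    return [uav_id for uav_id, info in registry.items()
            if uav_id != first_drone_id
            and (info["owner"] == first["owner"] or info["workgroup"] == first["workgroup"])]

registry = {
    "SN-001": {"owner": "rescue-team-a", "workgroup": "search-7"},
    "SN-002": {"owner": "rescue-team-a", "workgroup": "search-7"},
    "SN-003": {"owner": "rescue-team-b", "workgroup": "survey-2"},
}
print(bound_drones(registry, "SN-001"))   # ['SN-002']
```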
  • In some cases, the server determines the other drone located in the environment from a plurality of candidate drones, and the server may select the other drone from the plurality of candidate drones based on the user's drone selection operation.
  • In some cases, the position of the target object is sent to a plurality of candidate drones, or the position of the target object is sent to a third relay device so that the third relay device sends the position of the target object to the plurality of candidate drones flying in the environment, so that the plurality of candidate drones adjust the shooting direction of their configured shooting devices toward the target object according to the position of the target object, where the plurality of candidate drones includes the other drone.
  • the multiple candidate UAVs may be UAVs whose distance from the first UAV is less than or equal to a preset distance threshold. In some cases, the multiple candidate UAVs may be the other drones bound to the same owner (individual, company, or institution) or work group as the first drone, as mentioned above.
  • Embodiments of the present invention provide a method for controlling a control terminal of a drone.
  • the execution subject of the method may be the control terminal of the drone.
  • the drone here may be the second drone as mentioned above, and the control terminal of the drone is the second control terminal as mentioned above.
  • the method includes:
  • S401 Obtain the position of the target object in the environment, where the position of the target object is determined from the sensing data output by the first drone flying in the environment sensing the target object with its configured observation sensor;
  • the second control terminal can obtain the position of the target object in the environment.
  • the second control terminal can obtain the position of the target object sent by the first drone, or the position of the target object sent by the control terminal of the first drone, or the position of the target object sent by the server.
  • S402 Send the position of the target object to the second drone in the environment, so that the second drone adjusts the shooting direction of its configured shooting device toward the target object according to the position of the target object.
  • In some cases, the second control terminal may, in response to obtaining the position of the target object, display a mark indicating the position of the target object according to the position of the target object; and/or the second control terminal may, in response to obtaining the position of the target object, display a mark indicating the orientation of the target object according to the position of the target object. In this way, the user of the second drone can easily learn the position or the orientation of the target object. Further, the second control terminal can display a mark indicating the position of the target object according to the position of the target object and the position of the second drone, and/or the second control terminal can display a mark indicating the orientation of the target object according to the position of the second drone and the position of the target object.
  • the second control terminal can obtain and display the indication information of the first drone, so that the user can learn the relevant information of the drone that observed the position of the target object.
  • In some cases, the second control terminal obtains the indication information of the first drone.
  • the indication information of the drone may include at least one of the following: the identity information of the drone, the identity information of the drone's user (for example, an ID number, user name, name, or nickname), and the location of the drone.
  • the second control terminal may obtain the instruction information of the first drone in the same manner as the method of obtaining the location of the target object.
  • In some cases, the second control terminal, in response to obtaining the position of the target object, displays prompt information indicating that the position of the target object has been obtained. The prompt information may include the indication information of the first drone obtained by the second control terminal as described above.
  • In some cases, the second control terminal may, in response to obtaining the position of the target object, determine whether a preset sending condition is met; when it is met, send the obtained position of the target object to the second unmanned aerial vehicle, and otherwise refuse to send the position of the target object to the second UAV. In some cases, when the preset sending condition is not met, a prompt message indicating the refusal to send is displayed.
  • In some cases, the second control terminal determining whether the preset sending conditions are met includes: the second control terminal determines whether the user's allow-response operation is detected; when the allow-response operation is detected, it is determined that the preset sending conditions are met; otherwise, it is determined that the preset sending conditions are not met.
  • In some cases, the second control terminal determining whether the preset sending conditions are met includes: determining whether the second drone meets preset response conditions, where the preset response conditions include at least one of the following: the remaining power of the second drone is greater than or equal to a preset power threshold, the distance between the second drone and the first drone or the target object is less than or equal to a preset distance threshold, and the second drone is in a flight state. If it is determined that the second drone meets the preset response conditions, it is determined that the preset sending conditions are met; otherwise, it is determined that the preset sending conditions are not met.
  • In some cases, the second control terminal can acquire the image collected by the shooting device of the second drone, display the image, and display in the displayed image a mark indicating the position of the target object in the image. As the shooting direction of the shooting device of the second drone is adjusted toward the target object, the target object will appear in the picture taken by the shooting device of the second drone.
  • the second control terminal can display in the displayed image a mark indicating the position of the target object in the image; further, the second control terminal can obtain the position of the target object in the image and, according to the position of the target object in the image, display in the displayed image a mark indicating the position of the target object in the image.
  • the position of the target object in the image collected by the shooting device of the second drone may be determined based on the position of the target object and the shooting direction of the shooting device of the second drone.
  • the second control terminal obtaining the position of the target object in the image may include: the second control terminal obtains the position of the target object in the image sent by the second drone, where the position of the target object in the image collected by the shooting device of the second drone may be determined by the second drone based on the position of the target object and the shooting direction of the shooting device of the second drone.
  • Alternatively, the second control terminal obtaining the position of the target object in the image may include: the second control terminal determines the position of the target object in the image collected by the photographing device of the second drone according to the position of the target object and the shooting direction of the photographing device of the second drone, where the shooting direction of the photographing device of the second drone may be obtained from the second drone.
  • An embodiment of the present invention provides a method for controlling a drone.
  • the execution subject of the method may be a drone.
  • the drone here is the second drone as mentioned above.
  • the drone includes a shooting device.
  • the method includes:
  • S501 Obtain the position of the target object in the environment, where the position of the target object is determined from the sensing data output by the first drone in the environment sensing the target object with its configured observation sensor;
  • the second drone can acquire the position of the target object in the environment.
  • the second drone can acquire the position of the target object sent by the first drone, or the position of the target object sent by the first control terminal of the first UAV, or the position of the target object sent by the second control terminal of the second UAV, or the position of the target object sent by the server.
  • S502 Adjust the shooting direction of the shooting device to face the target object according to the position of the target object. Further, the second drone may adjust the shooting direction of the shooting device toward the target object by adjusting the attitude of its body and/or the attitude of the pan/tilt on which the shooting device is mounted.
  • the second drone may track the target object based on its location.
  • an embodiment of the present invention also provides an unmanned aerial vehicle 600.
  • the unmanned aerial vehicle here may be the first unmanned aerial vehicle as mentioned above, including an observation sensor 601 and a processor 602.
  • the processor is configured to perform the following steps:
  • the sensing data output by the drone's observation sensor sensing the target object in the environment is obtained, and the position of the target object is determined based on the sensing data;
  • the position of the target object is sent to another drone flying in the environment, or the position of the target object is sent to a first relay device so that the first relay device sends the position of the target object to another drone flying in the environment, whereupon the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the observation sensor includes a photographing device
  • the processor is configured to:
  • acquire an image output by the photographing device photographing the target object in the environment; determine the position of the target object in the image; and determine the position of the target object based on the position of the target object in the image (a ground-projection sketch is given after this block).
  • the target object is selected by the user through a target object selection operation on the control terminal of the drone that displays the images collected by the shooting device.
  • the processor is used to:
  • send the image collected by the shooting device to the control terminal of the drone so that the control terminal displays the image, and send the determined position of the target object in the image to the control terminal so that the control terminal displays, on the displayed image, a mark indicating the position of the target object in the image.
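One way to realize the step above (determining the target's position from its position in the image) is to intersect the camera ray through the selected pixel with the ground below the drone, combining the in-image position, the shooting direction, the drone's altitude and the drone's position. The sketch below assumes flat ground at a known height and invented intrinsics; the patent does not prescribe this particular geometry.

```python
import numpy as np

def pixel_to_ground(u, v, drone_enu, yaw, pitch,
                    fx=1000.0, fy=1000.0, cx=960.0, cy=540.0,
                    ground_z=0.0):
    """Estimate the ENU position of a ground target seen at pixel (u, v).

    Camera frame is x forward, y left, z up; yaw/pitch are radians in ENU,
    and positive pitch tilts the view downward in this convention.
    Returns None if the pixel ray does not point toward the ground plane.
    """
    # Ray direction in the camera frame for the chosen pixel.
    ray_cam = np.array([1.0, -(u - cx) / fx, -(v - cy) / fy])
    cy_, sy_ = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    r_yaw = np.array([[cy_, -sy_, 0], [sy_, cy_, 0], [0, 0, 1]])
    r_pitch = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    ray_world = r_yaw @ r_pitch @ ray_cam
    if ray_world[2] >= 0:
        return None  # ray points level or upward, it never reaches the ground
    scale = (ground_z - drone_enu[2]) / ray_world[2]
    return np.asarray(drone_enu, dtype=float) + scale * ray_world

# Example: pixel below the image centre, camera facing north and tilted 30 degrees down.
print(pixel_to_ground(960.0, 700.0, drone_enu=(0.0, 0.0, 60.0),
                      yaw=np.pi / 2, pitch=np.deg2rad(30.0)))
```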
  • the observation sensor includes a ranging sensor
  • the processor is configured to:
  • obtain the distance to the target object output by the ranging sensor and the observation attitude of the ranging sensor, and determine the position of the target object based on the output distance and the observation attitude (a range-based sketch is given after this block).
  • the ranging sensor includes a receiver for transmitting a ranging signal and receiving a ranging signal reflected by a target object.
  • the UAV includes a pan/tilt for installing the ranging sensor and adjusting the observation direction of the ranging sensor, and the processor is configured to:
  • obtain an observation direction adjustment instruction sent by the control terminal of the drone, and control the pan/tilt according to the instruction so that the observation direction of the ranging sensor faces the target object.
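For the ranging-sensor case, the target position can be obtained by adding the measured distance, taken along the sensor's observation direction, to the drone's own position. The sketch below assumes the observation attitude is expressed as yaw and pitch in a local ENU frame and that the drone position from its satellite positioning device has already been converted to that frame; it is an illustrative formula, not the patent's prescribed one.

```python
import math

def target_position_from_range(drone_enu, distance_m, obs_yaw_rad, obs_pitch_rad):
    """Target ENU position = drone position + distance along the observation direction.

    Yaw is measured from East toward North; pitch is positive when the sensor
    looks above the horizon and negative when it looks down.
    """
    direction = (
        math.cos(obs_pitch_rad) * math.cos(obs_yaw_rad),  # East component
        math.cos(obs_pitch_rad) * math.sin(obs_yaw_rad),  # North component
        math.sin(obs_pitch_rad),                          # Up component
    )
    return tuple(p + distance_m * d for p, d in zip(drone_enu, direction))

# Example: laser range of 120 m, sensor facing north and 25 degrees below the horizon.
print(target_position_from_range((10.0, 20.0, 80.0), 120.0,
                                 obs_yaw_rad=math.radians(90.0),
                                 obs_pitch_rad=math.radians(-25.0)))
```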
  • the drone includes a shooting device, and the processor is configured to: send the image collected by the shooting device to the control terminal of the drone so that the control terminal displays the image; and either determine the position of the target object in the image based on the position of the target object and send that in-image position to the control terminal so that it displays, on the displayed image, a mark indicating the position of the target object in the image, or send the position of the target object to the control terminal so that the control terminal determines the position of the target object in the image and displays such a mark.
  • the first relay device includes at least one of the control terminal of the drone, a server, and the control terminal of the other drone.
  • the position of the target object is transmitted to another drone flying in the environment, so that the other drone controls the zoom of its shooting device according to the position of the target object (a zoom sketch is given below); and/or,
  • the position of the target object is transmitted to another drone flying in the environment, so that the other drone tracks the target object according to the position of the target object.
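The zoom behaviour mentioned above can be sketched as choosing a focal length that keeps the target at a roughly constant apparent size as its distance, derived from the shared position, changes. The target size, sensor width and focal-length range below are invented for illustration.

```python
def focal_length_for_framing(distance_m, target_size_m=1.8,
                             desired_fraction=0.15, sensor_width_mm=13.2,
                             min_f_mm=8.8, max_f_mm=77.0):
    """Pick a focal length so the target spans roughly `desired_fraction` of the frame width.

    Thin-lens approximation: image size is roughly focal_length * object_size / distance.
    The result is clamped to the lens's zoom range.
    """
    desired_image_mm = desired_fraction * sensor_width_mm
    f_mm = desired_image_mm * distance_m / target_size_m
    return max(min_f_mm, min(max_f_mm, f_mm))

# As the shared target position shows the target getting farther away, zoom in.
for d in (20.0, 60.0, 150.0):
    print(f"distance {d:5.1f} m -> focal length {focal_length_for_framing(d):5.1f} mm")
```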
  • the drone and the other drone are drones bound to the same owner or workgroup.
  • the target object is selected by the user performing a target object selection operation on the control terminal of the drone.
  • an embodiment of the present invention also provides a control terminal 700 for a drone.
  • the control terminal here may be the first control terminal as described above, including a memory 701 and a processor 702.
  • the memory is used to store program code
  • the processor is used to call and execute the program code to perform the following steps:
  • during the flight of the drone in the environment, the position of the target object in the environment sent by the drone is obtained, where the position is determined from the sensing data output by the observation sensor configured on the drone sensing the target object;
  • the position of the target object is sent to another drone flying in the environment, or the position of the target object is sent to a second relay device so that the position of the target object is transmitted through the second relay device to another drone flying in the environment, so that the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the observation sensor includes a photographing device
  • the processor is configured to:
  • receive and display the image collected by the shooting device sent by the drone; detect the user's target object selection operation on the displayed image and determine target object indication information according to the detected selection operation, where the target object indication information includes the position of the target object in the image;
  • the target object indication information is sent to the drone so that the drone selects the target object in the environment.
  • the processor is used to:
  • receive and display the image collected by the shooting device sent by the drone, receive the position of the target object in the image sent by the drone, and display on the displayed image a mark indicating the position of the target object in the image.
  • the observation sensor includes a ranging sensor
  • the UAV includes a pan/tilt for installing the ranging sensor and adjusting the observation direction of the ranging sensor, and the processor is used for:
  • detect the user's observation direction adjustment operation, generate an observation direction adjustment instruction according to the detected operation, and send the observation direction adjustment instruction to the UAV so that the UAV adjusts the observation direction of the ranging sensor toward the target object according to the instruction.
  • the drone includes a shooting device, wherein the processor is configured to: receive and display the image collected by the shooting device sent by the drone; receive the position of the target object in the image sent by the drone and display, on the displayed image, a mark indicating that position, where the position in the image is determined by the drone based on the position of the target object; or,
  • determine the position of the target object in the image based on the position of the target object, and display, on the displayed image, a mark indicating the position of the target object in the image.
  • the second relay device includes at least one of a server and a control terminal of the other drone.
  • the processor is used to:
  • display the indication information of a plurality of candidate drones; detect the user's drone selection operation and determine, based on the detected selection operation, the indication information of the other drone selected by the user from the indication information of the plurality of candidate drones;
  • send the position of the target object to the other drone flying in the environment that corresponds to the selected indication information, or send the position of the target object to the second relay device so that it is transmitted through the second relay device to that drone.
  • the position of the target object is transmitted to another drone flying in the environment, so that the other drone controls the zoom of its shooting device according to the position of the target object; and/or,
  • the position of the target object is transmitted to another drone flying in the environment, so that the other drone tracks the target object according to the position of the target object.
  • the drone and the other drone are drones bound to the same owner or workgroup.
  • this embodiment of the present invention also provides a server 800.
  • the server here can be the server as described above, including a memory 801 and a processor 802.
  • the memory is used to store program code
  • the processor is used to call and execute the program code to perform the following steps:
  • obtain the position of the target object in the environment, where the position of the target object is determined from the sensing data output by the observation sensor configured on the drone in the environment sensing the target object;
  • send the position of the target object to another drone, or send the position of the target object to a third relay device so that the position of the target object is transmitted through the third relay device to another drone flying in the environment, so that the other drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object.
  • the processor is used to: determine the other drone bound to the same owner or workgroup as the drone;
  • in this case, sending the position of the target object to another drone or to the third relay device for forwarding to another drone flying in the environment includes:
  • sending the position of the target object to another drone bound to the drone, or sending the position of the target object to the third relay device so that the position of the target object is transmitted through the third relay device to the other drone, bound to the drone, that is flying in the environment.
  • the processor is used to: obtain the identity information of the drone, and determine the other drone bound to the drone according to the identity information.
  • the processor is used to: determine, from a plurality of candidate drones, the other drone located in the environment (a forwarding sketch is given after this block).
  • the drone and the other drone are drones bound to the same owner or workgroup.
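As a rough illustration of the server behaviour described above (using identity information to find the drones bound to the same owner or workgroup and forwarding the target position to them), consider the sketch below. The registry structure, the binding table and the outbox stand-in for real messaging are all hypothetical; a real deployment would use its own device registry and messaging layer.

```python
from collections import defaultdict

class SharingServer:
    """Toy in-memory server that relays a target position to bound drones."""

    def __init__(self):
        self.workgroup_of = {}                 # drone serial -> workgroup id
        self.members = defaultdict(set)        # workgroup id -> drone serials
        self.outbox = []                       # (serial, payload) pairs, stands in for real messaging

    def bind(self, serial, workgroup):
        self.workgroup_of[serial] = workgroup
        self.members[workgroup].add(serial)

    def share_target_position(self, source_serial, target_position):
        """Forward the position to every other drone bound to the source's workgroup."""
        workgroup = self.workgroup_of.get(source_serial)
        if workgroup is None:
            return []
        recipients = sorted(self.members[workgroup] - {source_serial})
        for serial in recipients:
            self.outbox.append((serial, {"target_position": target_position,
                                         "from": source_serial}))
        return recipients

server = SharingServer()
server.bind("UAV-A-001", "rescue-team-1")
server.bind("UAV-B-002", "rescue-team-1")
server.bind("UAV-C-003", "survey-team-2")
print(server.share_target_position("UAV-A-001", (113.95, 22.53, 35.0)))  # -> ['UAV-B-002']
```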
  • an embodiment of the present invention also provides a control terminal 900 for a drone.
  • the control terminal here can be the second control terminal as mentioned above, including a memory 901 and a processor 902.
  • the memory is used to store program code
  • the processor is used to call and execute the program code to perform the following steps:
  • obtain the position of the target object; send the position of the target object to the drone in the environment so that the drone adjusts the shooting direction of its configured shooting device to face the target object according to the position of the target object; wherein the position of the target object is determined by sensing data output by another drone flying in the environment based on its configured observation sensor sensing the target object.
  • the processor is used to: in response to obtaining the position of the target object, display a mark indicating the position of the target object according to the position of the target object; and/or display a mark indicating the orientation of the target object according to the position of the target object.
  • the processor is used to: in response to obtaining the position of the target object, determine whether a preset sending condition is satisfied; when it is satisfied, the obtained position of the target object is sent to the drone; otherwise, the position of the target object is refused to be sent to the drone.
  • the processor is used to: determine whether the user's permission response operation is detected; when the permission response operation is detected, determine that the preset sending condition is satisfied; otherwise, determine that it is not satisfied.
  • the processor is used to: determine whether the drone satisfies a preset response condition, where the preset response condition includes at least one of whether the remaining power of the drone is greater than or equal to a preset power threshold, whether the distance between the drone and the other drone or the target object is less than or equal to a preset distance threshold, and whether the drone is in a flight state; if the drone satisfies the preset response condition, determine that the preset sending condition is satisfied; otherwise, determine that it is not satisfied (a condition-check sketch is given below).
  • the processor is used to: obtain the image collected by the shooting device of the drone, display the image, and display, in the displayed image, a mark indicating the position of the target object in the image.
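The preset sending and response conditions described above (a permission response operation, remaining power, distance to the other drone or the target, and flight state) reduce to a simple predicate. The sketch below shows one way the second control terminal might evaluate them before forwarding the target position; the field names and thresholds are invented and do not come from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class DroneStatus:
    battery_pct: float
    position_enu: tuple   # (east, north, up) in metres
    in_flight: bool

def satisfies_response_condition(status, target_enu,
                                 min_battery_pct=30.0, max_distance_m=2000.0):
    """Return True if the drone may respond to a shared target position."""
    dist = math.dist(status.position_enu, target_enu)
    return (status.battery_pct >= min_battery_pct
            and dist <= max_distance_m
            and status.in_flight)

def handle_shared_position(status, target_enu, user_allowed=None):
    """Decide whether to forward the target position to the drone.

    If `user_allowed` is given it stands for the user's permission response
    operation; otherwise the automatic response condition is used.
    """
    ok = user_allowed if user_allowed is not None else satisfies_response_condition(status, target_enu)
    return "send position to drone" if ok else "refuse and show prompt"

status = DroneStatus(battery_pct=62.0, position_enu=(0.0, 0.0, 50.0), in_flight=True)
print(handle_shared_position(status, target_enu=(300.0, 400.0, 0.0)))   # automatic check
print(handle_shared_position(status, target_enu=(300.0, 400.0, 0.0), user_allowed=False))
```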
  • the drone and the other drone are drones bound to the same owner or workgroup.
  • An embodiment of the present invention also provides a drone 1000. The drone here may be the second drone mentioned above. The drone includes a shooting device 1001 and a processor 1002, and the processor is used to perform the following steps:
  • obtain the position of the target object in the environment, where the position of the target object is determined by sensing data output by another drone in the environment based on its configured observation sensor sensing the target object;
  • the shooting direction of the shooting device is adjusted according to the position of the target object to face the target object.
  • the processor is used to: track the target object according to the position of the target object.
  • the drone and the other drone are drones bound to the same owner or workgroup.
  • In the several embodiments provided by the present invention, it should be understood that the disclosed remote control devices and methods can be implemented in other ways.
  • The remote control device embodiments described above are only illustrative. The division of modules or units is only a logical function division; in actual implementation there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The coupling, direct coupling or communication connection between components shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the remote control devices or units may be electrical, mechanical or in other forms.
  • The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • Each functional unit in the various embodiments of the present invention can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit. The above integrated units can be implemented in the form of hardware or in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer processor to execute all or part of the steps of the methods described in the various embodiments of the present invention.
  • The aforementioned storage media include: USB flash drive, removable hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk, optical disk and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A drone, a control terminal, a server, and control methods therefor. The control method includes: during the flight of a drone in an environment, obtaining sensing data output by an observation sensor of the drone sensing a target object in the environment; determining the position of the target object based on the sensing data (S101); and sending the position of the target object to another drone flying in the environment, or sending the position of the target object to a first relay device so that the position of the target object is sent through the first relay device to another drone flying in the environment (S102), so that the other drone adjusts the shooting direction of its configured shooting device toward the target object according to the position of the target object. In this way, by having a drone in the environment share the position of an observed target object in the environment, the shooting device of another drone in the environment can observe the target object.

Description

无人机、控制终端、服务器及其控制方法 技术领域
本发明实施例涉及电子控制技术领域,尤其涉及一种无人机、控制终端、服务器及其控制方法。
背景技术
无人机被广泛地应用于勘测、事故搜救、设备巡检、测绘应领域中。在这些应用领域中,无人机往往是独立进行任务作业,即各自独立地对自己关注的目标物体进行测量、拍摄、跟踪等作业,目标物体相关的信息是不同共享的。然而,这种作业方式不能满足某些作业场景的需求。例如,在一个游客在山区迷路,救援人员利用多架无人机在山区上空飞行以搜寻这个陌路的游客,当其中的一架无人机搜寻到该游客时,此时,往往希望其他无人机能够得知该游客的位置或者方位,以便其他无人机也能够观测到该游客。然而,目前,无人机之间对观测到的目标物体的相关信息不能共享给其他无人机,导致不能实现上述需求,这样使得多无人机对目标物体进行协同作业的智能性不高。
发明内容
本发明实施例提供了一种无人机、控制终端、服务器及其控制方法,使得无人机能将观测的目标物体的相关信息分享给其他无人机,这样其他无人机可以利用目标物体的相关信息实现对目标物体的观测。
本发明实施例的第一方面是为了提供一种无人机的控制方法,所述方法包括:
在无人机在环境中飞行的过程中,获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据;
根据所述传感数据确定所述目标物体的位置;
将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目 标物体。
本发明实施例的第二方面是为了提供一种无人机的控制终端的控制方法,所述方法包括:
在无人机在环境中飞行的过程中,获取无人机发送的所述环境中的目标物体的位置,其中,所述位置是所述无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明实施例的第三方面是为了提供一种服务器的控制方法,所述方法包括:
获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环境中无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明实施例的第四方面是为了提供另一种无人机的控制终端的控制方法,所述方法包括:
获取目标物体的位置;
将所述目标物体的位置发送给在所述环境中的所述无人机,以使所述无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体;
其中,所述目标物体的位置是在所述环境中飞行的另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的。
本发明实施例的第五方面是为了提供另一种无人机的控制方法,所述方法包括:
获取环境中目标物体的位置,其中,所述目标物体的位置是在所述环境 中另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
根据所述目标物体的位置调整所述拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明实施例的第六方面是为了提供一种无人机的控制方法,所述处理器用于执行以下步骤:
在无人机在环境中飞行的过程中,获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,根据所述传感数据确定所述目标物体的位置;
将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明实施例的第七方面是为了提供一种无人机的控制终端,包括存储器和处理器,
所述存储器,用于存储程序代码;
所述处理器,用于调用并执行所述程序代码以执行以下步骤:
在无人机在环境中飞行的过程中,获取无人机发送的所述环境中的目标物体的位置,其中,所述位置是所述无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明实施例的第八方面是为了提供一种服务器,包括存储器和处理器,
所述存储器,用于存储程序代码;
所述处理器,用于调用并执行所述程序代码以执行以下步骤:
获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环 境中无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明实施例的第九方面是为了提供另一种无人机的控制终端,包括存储器和处理器,
所述存储器,用于存储程序代码;
所述处理器,用于调用并执行所述程序代码以执行以下步骤:
获取目标物体的位置;
将所述目标物体的位置发送给在所述环境中的所述无人机,以使所述无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体;
其中,所述目标物体的位置是在所述环境中飞行的另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的。
本发明实施例的第五方面是为了提供另一种无人机,所述处理器用于执行以下步骤:
获取环境中目标物体的位置,其中,所述目标物体的位置是在所述环境中另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
根据所述目标物体的位置调整所述拍摄装置的拍摄方向调整至朝向所述目标物体。
本发明的第十方面是为了提供一种计算机可读存储介质,所述存储介质为计算机可读存储介质,该计算机可读存储介质中存储有程序指令,所述程序指令用于第一方面至第五方面任一项方面所述的控制方法。
本发明实施例提供的人飞行器、控制终端、服务器及其控制方法,无人机可以将利用其配置的观测传感器确定的目标物体的位置,所述目标物体的位置被传送给另一无人机以使另一无人机可以根据目标物体的位置将其拍摄装置的拍摄方向调整至朝向所述目标物体。通过这种方式,另一无人机可以通过拍摄装置观测到目标物体,提高了多无人机对目标物体进行协同作业 的智能性。
附图说明
此处所说明的附图用来提供对本申请的进一步理解,构成本申请的一部分,本申请的示意性实施例及其说明用于解释本申请,并不构成对本申请的不当限定。在附图中:
图1为本发明实施例提供的第一无人机的控制方法的流程示意图;
图2为本发明实施例提供的第一无人机的第一控制终端的控制方法的流程示意图;
图3为本发明实施例提供的服务器的控制方法的流程示意图;
图4为本发明实施例提供的第二无人机的第二控制终端的控制方法的流程示意图;
图5为本发明实施例提供的第二无人机的控制方法的流程示意图;
图6为本发明实施例提供的第一无人机的结构示意图;
图7为本发明实施例提供的第一无人机的第一控制终端的结构示意图;
图8为本发明实施例提供的服务器的结构示意图;
图9为本发明实施例提供的第二无人机的第二控制终端的结构示意图;
图10为本发明实施例提供的第二无人机的结构示意图;
具体实施方式
为使本发明实施例的目的、技术方案和优点更加清楚,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
除非另有定义,本文所使用的所有的技术和科学术语与属于本发明的技术领域的技术人员通常理解的含义相同。本文中在本发明的说明书中所使用的术语只是为了描述具体的实施例的目的,不是旨在于限制本发明。
下面结合附图对本发明的技术方案进行详细地说明。
本发明实施例提供一种无人机的控制方法,所述控制方法的执行主体可以为无人机,为了与后文中的另一无人机进行区别并防止混淆,这里的无人 机(即该方法的执行主体)可以称为第一无人机,后文中的另一无人机可以称为第二无人机,所述方法包括:
S101:在无人机在环境中飞行的过程中,获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,根据所述传感数据确定所述目标物体的位置;
具体地,这里的无人机的控制终端(即第一无人机的控制终端)可以称为第一控制终端,所述另一无人机的控制终端(即第二无人机的控制终端)可以称为第二控制终端。其中,所述控制终端可以包括遥控器、智能手机、平板电脑、穿戴式设备中的一种或多种,控制终端可以与无人机进行无线通信连接并通过所述无线通信连接对无人机发送控制指令和/或通过所述无线通信连接接收无人机发送的数据(例如无人机的拍摄装置采集的图像、无人机的飞行状态信息和其他任何数据),控制终端可以包括操作杆、按键、波轮或者触摸板显示屏等交互装置,控制终端可以通过所述交互装置检测后文中的用户的各种类型的操作。
第一无人机可以包括观测传感器,所述第一观测传感器可以为任何能够输出图像、距离或者位置等传感数据的传感器。在某些情况中,第一无人机包括用于安装所述观测传感器和调整所述观测传感器的观测方向的云台。观测传感器的观测方向可以根据无人机的机身姿态和/或云台的姿态来确定或者调整。在第一无人在环境中飞行的过程中,第一无人机可以根据观测传感器对环境中的目标物体进行感测而输出的传感数据(如前所述的图像、距离或者位置等),根据所述传感数据确定所述目标物体的位置。其中,所述目标物体可以是第一无人机的用户选中的。进一步地,所述目标物体可以是用户对第一无人机的第一控制终端进行操作而选中。其中,所述目标物体的位置可以为三维位置,例如经度、纬度和高度。在某些情况中,所述目标物体的位置可以为二维位置,例如经度和纬度。目标物体的位置可以以业界公开的任何位置表示方式来表示。所述目标物体的位置的坐标系可以是世界坐标系或者全局坐标系或者球面坐标系等。
S102:将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所 述目标物体。
具体地,第一无人机将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机。如前所述,所述另一无人机可以称为第二无人机,第二无人机可以包括拍摄装置,在某些情况中,所述第二无人机可以包括用于安装所述拍摄装置和调整所述拍摄装置的拍摄方向的云台。这样,第二无人机可以根据所述目标物体的位置调整其机身的姿态和/或云台的姿态以将拍摄装置的拍摄方向调整至朝向所述目标物体,这样目标物体会出现在第二无人机的拍摄装置的拍摄画面中。
在某些实施例中,第一无人机和第二无人机可以是绑定至同一个所有者(个人、公司、或者机构)或者工作组的。所述第一无人机和第二无人机可以通过其各自的身份信息来绑定至同一个所有者(个人、公司、或者机构)或者工作组的。其中,所述身份信息可以是任何用于将该无人机同其他无人机进行区别的信息,例如,所述身份信息可以包括无人机的序列号、验证码或二维码等。
进一步地,第一无人机可以与包括所述第二无人机的其他多个无人机绑定至同一个所有者(个人、公司、或者机构)或者工作组,第二无人机可以是用户对第一控制终端进行无人机选中操作从所述其他多个无人机中确定的。其中,第一无人机和其他多个无人机可以通过其各自的身份信息来绑定至同一个所有者(个人、公司、或者机构)或者工作组的。
本发明实施例提供的人飞行器的控制方法,无人机可以将利用其配置的观测传感器确定的目标物体的位置,所述目标物体的位置被传输给另一无人机以使无人机可以根据目标物体的位置将其拍摄装置的拍摄方向调整至朝向所述目标物体。通过这种方式,另一无人机可以通过拍摄装置观测到目标物体,提高了多无人机对目标物体进行协同作业的智能性。
在某些实施例种,所述观测传感器包括输出图像的拍摄装置,所述获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,包括:获取拍摄装置对所述环境中的目标物体进行拍摄而输出的图像;所述根据所述传感数据确定所述目标物体的位置,包括:确定所述目标物体在所述图像中的位置;根据所述目标物体在所述图像中的位置确定所述目标物体的 位置。
具体地,所述目标物体的位置可以是根据第一无人机的拍摄装置采集的图像来确定的。第一无人机可以获取拍摄装置对所述环境中的目标物体进行拍摄而输出的图像,并确定所述目标物体在所述图像的位置,根据所述目标物体在所述图像中的位置确定所述目标物体的位置。下面提供一种根据所述目标物体在所述图像中的位置确定所述目标物体的位置的方法,第一无人机可以根据所述目标物体在所述图像中的位置和拍摄装置的拍摄方向确定目标物体和所述第一无人机之间的相对方位,根据所述相对方位、第一无人机的高度和第一无人机的位置确定所述目标物体的位置。
在某些实施例中,所述目标物体可以是用户在对显示所述拍摄装置采集到的图像的无人机的控制终端进行目标物体选择操作选中的。如前所述,所述目标物体可以是用户对第一无人机的控制终端进行操作而选中,第一无人机可以将所述图像通过第一无人机和第一控制终端之间的无线通信连接发送给所述第一控制终端以使所述第一控制终端实时显示所述图像,用户可以对第一控制终端进行目标选择操作以选中所述显示的图像中的目标物体,第一控制终端根据检测到的所述目标物体选择操作确定目标物体指示信息,其中,所述物体指示信息可以包括目标物体在所述图像中的位置。例如,第一控制终端可以包括触摸显示器,所述触摸显示所述图像,用户可以在触摸显示屏进行点选操作或者框选操作以选中所述显示的图像中的目标物体。第一控制终端可以将目标物体指示信息发送第一无人机,第一无人机可以接收所述目标物体指示信息,并根据所述目标指示信息选中所述环境中的目标物体。
进一步地,如前所述确定所述目标物体在所述图像的位置,可以包括:根据所述目标物体指示信息对所述图像运行图像跟踪算法以获取所述目标物体在图像中的位置。如前所述,用户在被显示的一帧图像中选中了目标物体,然而这帧图像只是拍摄装置实时输出的图像中的一帧,由于需要实时地确定目标物体在拍摄装置输出的图像中的位置,因此,第一无人机可以根据所述目标物体指示信息对拍摄装置输出的实时图像运行图像跟踪算法以获取所述目标物体在图像中的位置。
如前所述,第一无人机可以将所述拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像。第一控制终端可以在所述显示 的图像中显示指示所述目标物体在所述图像中的位置的标识,这样,用户可以实时地了解图像中的哪个物体是目标物体。具体地,一种可行的方式,第一无人机可以将按照如前所述方式的确定的所述目标物体在所述图像中的位置发送给第一控制终端以使第一控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。另一种可行的方式,第一控制终端根据如前所述的目标物体的指示信息对从第一无人机接收到的实时图像运行图像跟踪算法以获取目标物体在所述图像中的位置,并根据所述位置在所述显示的图像中显示指示所述目标物体在所述图像中的位置的标识。所述标识可以包括文字、符号、阴影、图形中的中的至少一种。
在某些实施例中,所述观测传感器包括测距传感器,所述获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,包括:获取所述测距传感器输出的目标物体的距离和所述测距传感器的观测姿态;根据所述输出的目标物体的距离和所述观测姿态确定所述目标物体的位置。
具体地,第一无人机的观测传感器可以包括测距传感器,其中,所述测距传感器可以为各种类型的测距传感器。所述测距传感器可以为基于图像的测距传感器,例如双目摄像头。所述测距传感器可以为基于发射和接收测距信号的测距传感器,所述测距传感器包括用于发射测距信号和接收被目标物体反射的测距信号的接收器,测距信号可以是雷达信号、光信号或声音信号等,所述测距传感器可以包括激光测距传感器、TOF传感器或各种不同类型的雷达。第一无人机获取所述测距传感器输出的目标物体的距离,另外,第一无人机可以获取所述测距传感器的观测姿态。其中,如前所述,观测传感器的观测方向可以根据无人机的机身姿态和/或安装观测传感器的云台的姿态来确定,测距传感器的观测方向可以根据第一无人机的机身姿态和/或云台的姿态来确定。第一无人机根据测距传感器输出的目标物体的距离和所述观测方向确定所述目标物体的位置,进一步地,第一无人机根据测距传感器输出的目标物体的距离、所述观测方向和第一无人机的位置确定所述目标物体的位置。第一无人机的位置可以是由所述第一无人机的卫星定位装置采集获取的。
如前所述,所述目标物体可以是用户对第一无人机的控制终端进行操作而选中。作为一种用户对第一无人机的控制终端进行操作而选中目标物体的实现方式,用户对第一无人机的控制终端进行的操作可以包括用户对第一控 制终端进行的观测方向调整操作,第一控制终端可以检测用户的观测方向调整操作,并根据所述检测到的观测方向调整操作生成测距传感器的观测方向调整指令,其中,所述观测方向调整指令用于调整第一无人机的测距传感器的观测方向。例如,第一控制终端和/或第二控制终端可以包括操作杆、按键、波轮或者触摸板显示屏等交互装置,可以对所述交互装置进行观测方向调整操作,第一控制终端可以通过交互装置检测用户的观测方向调整操作操作。第一控制终端将所述观测方向调整指令发送给所述第一无人机,第一无人机可以根据所述观测方向调整指令将所述距离传感器的观测方向调整至朝向所述目标物体。第一无人机可以根据所述观测方向调整指令调整第一无人机的机身的姿态和/或者所述安装所述测距传感器的云台的姿态来将所述距离传感器的观测方向调整至朝向所述目标物体。
进一步地,第一无人机可以将拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像。在根据所述测距传感器确定目标物体的位置的场景中,为了帮助用户了解第一控制终端上显示的图像上哪个物体是目标物体,第一无人机可以根据所述目标物体的位置确定所述目标物体在所述图像中的位置,将所述目标物体在所述图像中的位置发送给第一无人机的第一控制终端,这样第一控制终端根据目标物体在所述图像中的位置在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。进一步地,第一无人机可以根据测距传感器和拍摄装置之间的相对位置关系和所述目标物体的位置确定所述目标物体在拍摄装置采集的图像中的位置。或者,第一无人机可以将所述目标物体的位置发送给第一无人机的第一控制终端,第一控制终端根据所述目标物体的位置确定所述目标物体在所述图像中的位置并在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识,进一步地,第一控制终端可以根据测距传感器和拍摄装置之间的相对位置关系和所述目标物体的位置确定所述目标物体在拍摄装置采集的图像中的位置。如前所述,所述标识可以是文字、符号、阴影、图形中的中的一种或多种。在某些情况中,测距传感器和拍摄装置固定安装,测距传感器和拍摄装置之间的相对位置关系是固定不变的,测距传感器和拍摄装置可以固定安装在所述云台上,其中,所述测距传感器的观测方向和拍摄装置的拍摄方向平行。在某些情况中,所述测距传感器和所述拍摄装置之间可以是活动安装的,测距传感器和拍摄装置之间的相对位置关系可以是实时确定 的。
在某些实施例中,所述第一中继设备包括第一无人机的第一控制终端、服务器和所述第二无人机的第二控终端中的至少一个。具体地,如前所述,第一无人机将所述目标物体的位置发送给在所述环境中飞行的第二无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的第二无人机。在某些情况中,第一无人机可以与第二无人机建立无线通信连接,第一无人机可以通过所述无线通信连接将所述目标物体的位置发送给第二无人机。在某些情况中,第一无人机将所述目标物体的位置发送给第一中继设备,第一中继设备可以与第二无人机建立直接或者间接的无线通信连接,第一中继设备可以通过所述直接或者间接的无线通信连接将目标物体的位置发送给第二无人机。例如,所述第一中继设备可以包括第一控制终端,第一无人机可以将目标物体的位置发送给第一控制终端,在某些情况中,第一控制终端可以将目标物体的位置发送给服务器,服务器可以将所述目标物体的位置发送给第二控制终端,第二控制终端可以通过第二控制终端与第二无人机之间的无线通信连接将目标物体的位置发送给第二无人机,在某些情况中,服务器可以将从第一控制终端接收的目标物体的位置发送给第二无人机,在某些情况中,第一控制终端可以将目标物体的位置发送给第二无人机或者将目标物体的位置发送给第二控制终端以使第二控制终端将目标物体的位置发送第二无人机。例如,所述第一中继设备可以包括服务器,第一无人机将目标物体的位置发送给服务器,服务器可以将所述目标物体的位置发送给第二控制终端,第二控制终端可以通过第二控制终端与第二无人机之间的无线通信连接将目标物体的位置发送给第二无人机,在某些情况中,服务器可以将从第一控制终端接收的目标物体的位置发送给第二无人机。再例如,所述第一中继设备包括第二控制终端,第一无人机将目标物体的位置发送给第二控制终端,第二控制终端将所述目标物体的位置发送给第二无人机。
在某些实施例中,所述目标物体的位置被传送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,所述目标物体的位置被传送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述目标物体的位置跟踪所述目标物体。
具体地,所述目标物体的位置可以按照如前所述的方式传送给第二无人 机,第二无人机可以根据目标物体的位置控制拍摄装置的镜头调焦,以调节目标物体在拍摄装置中的拍摄画面中的大小。在某些情况中,所述目标物体的位置可以按照如前所述的方式传送给第二无人机,第二无人机可以根据目标物体的位置对目标物体进行跟踪。进一步地,第二无人机可以确定是否满足预设的跟踪条件,当满足预设的跟踪条件时,第二无人机根据目标物体的位置对目标物体进行跟踪,进一步地,第二无人机可以根据目标物体的位置和第二无人机的位置对目标物体进行跟踪,所述第二无人机的位置可以是由第二无人机的卫星定位装置采集的。预设的跟踪条件可以包括第二无人机的剩余电量大于或等于预设的电量阈值、第二无人机与第一无人机或者目标物体之间的距离小于或等于预设的距离阈值、第二无人机处于飞行状态中的至少一个。其中,第一无人机可以将第一无人机的位置按照与目标物体的位置相同的方式传送给第二无人机,第二无人机可以根据第一无人机的位置确定与第一无人机之间的距离,第一无人机可以根据目标物体的位置确定与目标物体的距离。第二无人机可以先飞行至预设高度,然后根据目标物体的位置对目标物体进行跟踪。第一无人机可以按照如前所述的方式实时地确定目标物体的位置,目标物体的位置可以按照如前所述的方式实时地传送给第二无人机,第二无人机可以按照实时接收到的目标物体的位置对目标物体进行跟踪。在某些情况中,目标物体可以是第一无人机的跟踪对象,即第一无人机对目标物体进行跟踪。在某些情况中,第一无人机可以根据观测传感器输出的传感数据确定目标物体的速度,第一无人机可以将目标物体的速度按照与目标物体的位置相同的方式传送给第二无人机,第二无人机可以根据目标物体的速度和位置对目标物体进行跟踪。其中,所述目标物体的速度可以是根据所述目标物体的位置确定的,所述目标物体的速度可以是由第一无人机实时确定的,并且向第二无人机实时传送。
本发明实施例提供一种无人机的控制终端的控制方法,所述方法的执行主体可以是无人机的控制终端,这里的无人机的控制终端可以为如前所述第一无人机的控制终端,即如前所述的第一控制终端,所述方法包括:
S201:在第一无人机在环境中飞行的过程中,获取第一无人机发送的所述环境中的目标物体的位置,其中,所述位置是所述第一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
S202:将所述目标物体的位置发送给在所述环境中飞行的第二无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,第一控制终端可以显示所述环境的地图,第一控制终端可以检测用户对所述显示的地图的位置点选择操作,根据所述检测到的位置点选择操作确定用户在所述地图上选中的位置点的位置,将所述位置点的位置发送给在所述环境中飞行的第二无人机或者将所述位置点的位置发送给第二中继设备,以通过第二中继设备将所述位置点的位置发送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述位置点的位置将其配置的拍摄装置的拍摄方向调整至朝向所述位置点的位置。进一步地,所述第一控制终端可以包括触摸显示器,所述触摸显示器可以显示所述地图,用户可以在显示所述地图的触摸显示屏上进行点选操作,第一控制终端通过触摸显示屏检测到的点选操作确定用户在所述地图上选中的位置点的位置所述位置点的位置。第一控制终端向第二无人机传送所述位置点的位置的方式可以与第一控制终端向第二无人机传送所述目标物体的位置的方式,具体不再赘述。
这样,所述目标物体的位置被传送给在所述环境中飞行的第二无人机以使所述第二无人机根据所述目标物体的方位或位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,所述观测传感器包括拍摄装置,第一控制终端可以接收并显示所述第一无人机发送的所述拍摄装置采集的图像,并检测用户对所述显示的图像的目标物体选择操作,根据检测到的所述目标物体选择操作确定目标物体指示信息,其中,所述目标物体指示信息包括目标物体在所述图像中的位置,将所述目标物体指示信息发送给所述第一无人机以使第一无人机选中所述环境中的目标物体。
在某些实施例中,第一控制终端在所述被显示的图像上显示用于指示目标物体在所述图像中的位置的标识。具体地,可以第一控制终端可以按照如前所述的两种方式来显示所述标识。一种方式,第一控制终端接收所述无人机发送的所述目标物体在所述图像中的位置,根据所述目标物体在所述图像 中的位置在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。另一种方式,第一控制终端根据如前所述的目标物体的指示信息对从第一无人机接收到的图像运行图像跟踪算法以获取目标物体在所述图像中的位置,并根据所述目标物体在所述图像中的位置在所述显示的图像中显示指示所述目标物体在所述图像中的位置的标识。
在某些实施例中,所述观测传感器包括测距传感器,所述第一无人机包括用于安装所述测距传感器和调整测距传感器的观测方向的云台,第一控制终端检测用户的观测方向调整操作,根据所述检测到的观测方向调整操作生成观测方向调整指令,并将所述观测方向调整指令发送给第一无人机以使第一无人机根据所述观测方向调整指令将所述距离传感器的观测方向调整至朝向所述目标物体。
在某些实施例中,第一无人机包括拍摄装置,其中,第一控制终端接收并显示第一无人机发送的所述拍摄装置采集的图像,第一控制终端接收第一无人机发送的所述目标物体在所述图像中的位置,并在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识,其中,所述目标物体在所述图像中的位置是第一无人机根据所述目标物体的位置确定的。或者,第一控制终端根据所述目标物体的位置确定所述目标物体在所述图像中的位置,并在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
在某些实施例中,所述第二中继设备包括如前所述的服务器和如前所述的第二无人机的第二控终端中的至少一个。
在某些实施例中,第二无人机可以是用户对第一控制终端进行无人机选中操作确定的。例如,第一控制终端可以显示多个候选的无人机的指示信息,检测用户的无人机选中操作,根据检测到的所述无人机选中操作确定用户从所述多个候选的无人机的身份信息中选中的无人机的指示信息;具体地,第一控制终端可以显示多个候选的无人机的指示信息,例如,第一控制终端的触摸显示器可以显示多个候选的无人机的指示信息。无人机的指示信息可以包括如前所述的无人机的身份信息、无人机的用户的身份信息(例如身份证号、用户名、姓名、昵称等)、无人机的位置中的至少一个。多个候选的无人机可以是与第一无人机之间的距离小于或等于预设的距离阈值的无人机。所述多个候选的无人机可以是如前所述的与第一无人机绑定至同一个所有 者(个人、公司、或者机构)或者工作组的其他多个无人机。第一控制终端可以通过如前所述的交互装置检测用户的无人机选中操作,根据检测到的所述无人机选中操作确定用户从所述多个候选的无人机的指示信息中选中的无人机的指示信息(即第二无人机的指示信息),将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的第二无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的第二无人机。
在某些实施例中,所述目标物体的位置被传送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,所述目标物体的位置被传送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述目标物体的位置跟踪所述目标物体。
本发明实施例提供一种服务器的控制方法,所述方法的执行主体可以是如前所述的服务器,所述方法包括:
S301:获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环境中第一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
具体地,服务器可以获取环境中的目标物体的位置,如前所述,服务器可以获取第一无人机发送的目标物体的位置,或者服务器可以获取第一无人机的控制终端发送的目标物体的位置。
S302:将所述目标物体的位置发送给第二无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的第二无人机,以使所述第二无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
具体地,如前所述,服务器可以将目标物体的位置发送给第二无人机,在某些情况中,服务器可以将所述目标物体的位置发送给第三中继设备,其中,第三中继设备可以包括如前所述的第二无人机的第二控制终端,这样,第二控制终端可以将目标的位置发送给第二无人机。
在某些实施例中,如前所述,第一无人机和第二无人机可以是绑定至同一个所有者(个人、公司、或者机构)或者工作组的,服务器可以确定与所述第一无人机绑定至同一个所有者(个人、公司、或者机构)或者工作组的 第二无人机,服务器将所述目标物体的位置发送给所述绑定的第二无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的所述绑定的第二无人机。
进一步地,如前所述,所述第一无人机和第二无人机可以通过其各自的身份信息(即第一无人机的身份信息和第二无人机的身份信息)来绑定的,服务器可以获取所述第一无人机的身份信息,根据所述第一无人机的身份信息确定与第一无人机绑定的第二无人机。其中,服务器获取第一无人机的身份信息的方式可以与获取环境中的目标物体的位置的方式相同。
在某些实施例中,服务器从多个候选的无人机中确定位于所述环境中的所述另一无人机,所述服务器可以基于用户的无人机选择操作多个候选的无人机中选中所述另一无人机。
在某些实施例中,将所述目标物体的位置发送给多个候选的无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的多个候选无人机,以使所述多个候选的无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体,其中,所述多个候选无人机包括所述另一无人机。
如前所述,多个候选的无人机可以是与第一无人机之间的距离小于或等于预设的距离阈值的无人机,在某些情况中,所述多个候选的无人机可以是如前所述的与第一无人机绑定至同一个所有者(个人、公司、或者机构)或者工作组的其他多个无人机。
本发明实施例提供一种无人机的控制终端的控制方法,所述方法的执行主体可以是无人机的控制终端,这里的无人机的控制终端可以为如前所述第二无人机的控制终端,即如前所述的第二控制终端,所述方法包括:
S401:获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环境中飞行的第一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
具体地,第二控制终端可以获取环境中的目标物体的位置,如前所述,第二控制终端可以获取第一无人机发送的目标物体的位置,或者第二控制终端可以获取第一无人机的控制终端发送的目标物体的位置,或者第二控制终端可以获取服务器发送的目标物体的位置。
S402:将所述目标物体的位置发送给在所述环境中的所述第二无人机,以使所述第二无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,第二控制终端可以响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的位置的标识;和/或,第二控制终端可以响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的方位的标识,这样,第二无人机的用户就可以便于了解目标物体的位置或者方位,进一步地,第二控制终端可以根据所述目标物体的位置和所述无人机的位置显示用于指示所述目标物体的位置的标识,和/或,第二控制终端可以根据第二无人机的位置和所述目标物体的位置显示用于指示所述目标物体的方位的标识。
在某些实施例中,第二控制终端可以获取并显示第一无人机的指示信息,这样,用户可以了解观测得到目标物体的位置的无人机的相关信息。其中,第二控制终端获取第一无人机的指示信息,如前所述,无人机的指示信息可以包括如前所述的无人机的身份信息、无人机的用户的身份信息(例如身份证号、用户名、姓名、昵称等)、无人机的位置中的至少一个,第二控制终端获取第一无人机的指示信息方式可以与获取目标物体的位置的方式相同。
在某些实施例中,第二控制终端响应于获取到所述目标物体的位置,显示获取到所述目标物体的位置的提示信息,所述提示信息可以包括如前所述的第二控制终端获取的第一无人机的指示信息。
在某些实施例中,第二控制终端可以响应于获取到所述目标物体的位置,确定是否满足预设的发送条件,当满足时,将获取到的目标物体的位置发送给第二无人机,否则,拒绝将所述目标物体的位置发送给第二无人机。在某些情况中,当不满足预设的发送条件时,显示拒绝发送提示信息。
在某些实施例中,所述第二控制终端确定是否满足预设的发送条件,包括:第二控制终端确定是否检测用户的允许响应操作;当检测到所述允许响应操作时,确定满足预设的发送条件,否则,确定不满足预设的发送条件
在某些实施例中,所述第二控制终端确定是否满足预设的发送条件,包括:确定第二无人机是否满足预设的响应条件,其中,所述预设的响应条件包括第二无人机的剩余电量是否大于或等于预设的电量阈值、第二无人机与 第一无人机或者目标物体之间的距离是否小于或等于预设的距离阈值、第二无人机是否处于飞行状态中的至少一个;若确定第二无人机满足预设的响应条件时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
在某些实施例中,第二控制终端可以获取第二无人机的拍摄装置采集的图像,并显示所述图像,并在所述显示的图像中显示用于指示目标物体在所述图像中的位置的标识。随着第二无人机的拍摄装置的拍摄方向调整至朝向所述目标物体,目标物体会在第二无人机的拍摄装置的拍摄画面中,为了便于第二无人机的用户了解第二控制终端显示的图像中哪个物体是目标物体,第二控制终端可以在所述显示的图像中显示用于指示目标物体在所述图像中的位置的标识,进一步地,所述第二控制终端可以获取目标物体在所述图像中的位置,并根据所述目标物体在所述图像中的位置在所述显示的图像中显示用于指示目标物体在所述图像中的位置的标识。其中,所述目标物体在第二无人机的拍摄装置采集到的所述图像中的位置可以是根据目标物体的位置和第二无人机的拍摄装置的拍摄方向确定的。其中,所述第二控制终端获取目标物体在所述图像中的位置,可以包括第二控制终端获取第二无人机发送的目标物体在所述图像中的位置,所述目标物体在第二无人机的拍摄装置采集到的所述图像中的位置可以是第二无人机根据目标物体的位置和第二无人机的拍摄装置的拍摄方向确定的。在某些情况中,所述第二控制终端获取目标物体在所述图像中的位置,可以包括第二控制终端根据目标物体的位置和第二无人机的拍摄装置的拍摄方向确定所述目标物体在第二无人机的拍摄装置采集到的所述图像中的位置,其中,所述第二无人机的拍摄装置的拍摄方向可以是从第二无人机获取的。
本发明实施例提供一种无人机的控制方法,所述方法的执行主体可以是无人机,这里的无人机为如前所述第二无人机,所述无人机包括拍摄装置,所述方法包括:
S501:获取环境中目标物体的位置,其中,所述目标物体的位置是在所述环境中第一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
具体地,第二无人机可以获取环境中的目标物体的位置,如前所述,第二无人机可以获取第一无人机发送的目标物体的位置,或者第二无人机可以 获取第一无人机的第一控制终端发送的目标物体的位置,或者第二无人机可以获取第二无人机的第二控制终端发送的目标物体的位置,或者第二无人机可以获取服务器发送的目标物体的位置。
S502:根据所述目标物体的位置调整所述拍摄装置的拍摄方向调整至朝向所述目标物体。进一步地,第二无人机可以通过调整机身和/或安装所述拍摄装置的云台的姿态来将拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些情况中,第二无人机可以根据所述目标物体的位置跟踪所述目标物体。
其中,第二无人机的详细工作原理可以参见前述部分。
如图6所示,本发明实施例还提供一种无人机600,这里的无人机可以是如前所述的第一无人机,包括观测传感器601和处理器602,所述处理器用于执行以下步骤:
在无人机在环境中飞行的过程中,获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,根据所述传感数据确定所述目标物体的位置;
将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,所述观测传感器包括拍摄装置,所述处理器用于:
获取拍摄装置对所述环境中的目标物体进行拍摄而输出的图像;
确定所述目标物体在所述图像中的位置;
根据所述目标物体在所述图像中的位置确定所述目标物体的位置。
在某些实施例中,所述目标物体是用户在对显示所述拍摄装置采集到的图像的无人机的控制终端进行目标物体选择操作选中的。
在某些实施例中,所述处理器用于:
将所述拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像;
将所述确定的所述目标物体在所述图像中的位置发送给无人机的控制 终端以使所述控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
在某些实施例中,所述观测传感器包括测距传感器,所述处理器用于:
获取所述测距传感器输出的目标物体的距离和所述测距传感器的观测姿态;
根据所述输出的目标物体的距离和所述观测姿态确定所述目标物体的位置。
在某些实施例中,所述测距传感器包括用于发射测距信号和接收被目标物体反射的测距信号的接收器。
在某些实施例中,所述无人机包括用于安装所述测距传感器和调整测距传感器的观测方向的云台,所述处理器用于:
获取所述无人机的控制终端发送的观测方向调整指令;
根据所述观测方向调整指令控制所述云台调整所述测距传感器的观测方向以使所述观测方向朝向所述目标物体。
在某些实施例中,所述无人机包括拍摄装置,所述处理器用于:
将拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像;
根据所述目标物体的位置确定所述目标物体在所述图像中的位置,将所述目标物体在所述图像中的位置发送给无人机的控制终端以使所述控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识;或者,
将所述目标物体的位置发送给无人机的控制终端,以使所述控制终端根据所述目标物体的位置确定所述目标物体在所述图像中的位置并在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
在某些实施例中,所述第一中继设备包括无人机的控制终端、服务器和所述另一无人机的控终端中的至少一个。
在某些实施例中,所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,
所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置跟踪所述目标物体。
在某些实施例中,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
在某些实施例中,所述目标物体是用户对所述无人机的控制终端进行目标物体选择操作而选中。
如图7所示,本发明实施例还提供一种无人机的控制终端700,这里的控制终端可以是如前所述的第一控制终端,包括存储器701和处理器702,
所述存储器,用于存储程序代码;
所述处理器,用于调用并执行所述程序代码以执行以下步骤:
在无人机在环境中飞行的过程中,获取无人机发送的所述环境中的目标物体的位置,其中,所述位置是所述无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,所述观测传感器包括拍摄装置,所述处理器用于:
接收并显示所述无人机发送的所述拍摄装置采集的图像;
检测用户对所述显示的图像的目标物体选择操作,根据检测到的所述目标物体选择操作确定目标物体指示信息,其中,所述目标物体指示信息包括目标物体在所述图像中的位置。
将所述目标物体指示信息发送给所述无人机以使无人机选中所述环境中的目标物体。
在某些实施例中,所述处理器用于:
接收并显示所述无人机发送的所述拍摄装置采集的图像;
接收所述无人机发送的所述目标物体在所述图像中的位置;
在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
在某些实施例中,所述观测传感器包括测距传感器,
所述无人机包括用于安装所述测距传感器和调整测距传感器的观测方 向的云台,所述处理器用于:
检测用户的观测方向调整操作,根据所述检测到的观测方向调整操作生成观测方向调整指令;
将所述观测方向调整指令发送给所述无人机以使所述无人机根据所述观测方向调整指令将所述距离传感器的观测方向调整至朝向所述目标物体。
在某些实施例中,所述无人机包括拍摄装置,其中,所述处理器用于:
接收并显示所述无人机发送的所述拍摄装置采集的图像;
接收无人机发送的所述目标物体在所述图像中的位置,在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识,其中,所述目标物体在所述图像中的位置是无人机根据所述目标物体的位置确定的;或者,
根据所述目标物体的位置确定所述目标物体在所述图像中的位置,并根据所述目标物体在所述图像中的位置在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
在某些实施例中,所述第二中继设备包括服务器和所述另一无人机的控终端中的至少一个。
在某些实施例中,所述处理器用于:
显示多个候选的无人机的指示信息;
检测用户的无人机选中操作,根据检测到的所述无人机选中操作确定用户从所述多个候选的无人机的指示信息中选中的所述另一无人机的指示信息;
将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的另一无人机。
在某些实施例中,所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,
所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置跟踪所述目标物体。
在某些实施例中,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
如图8所示,本发明实施例还提供一种服务器800,这里的服务器可以是如前所述的服务器,包括存储器801和处理器802,
所述存储器,用于存储程序代码;
所述处理器,用于调用并执行所述程序代码以执行以下步骤:
获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环境中无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,所述处理器用于:
确定与所述无人机绑定至同一个所有者或者工作组的所述另一无人机;
所述将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,包括:
将所述目标物体的位置发送给与所述无人机绑定的另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的与所述无人机绑定的另一无人机。
在某些实施例中,所述处理器用于:
获取所述无人机的身份信息;
根据所述身份信息确定与所述无人机绑定的所述另一无人机。
在某些实施例中,所述处理器用于:
从多个候选的无人机中确定位于所述环境中的所述另一无人机。
在某些实施例中,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
如图9所示,本发明实施例还提供一种无人机的控制终端900,这里的控制终端可以是如前所述的第二控制终端,包括存储器901和处理器902,
所述存储器,用于存储程序代码;
所述处理器,用于调用并执行所述程序代码以执行以下步骤:
获取目标物体的位置;
将所述目标物体的位置发送给在所述环境中的所述无人机,以使所述无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体;
其中,所述目标物体的位置是在所述环境中飞行的另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的。
在某些实施例中,所述处理器用于:
响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的位置的标识;和/或,
响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的方位的标识。
在某些实施例中,所述处理器用于:
响应于获取到所述目标物体的位置,确定是否满足预设的发送条件;
当满足时,将获取到的目标物体的位置发送给所述无人机,否则,拒绝将所述目标物体的位置发送给所述无人机。
在某些实施例中,所述处理器用于:
确定是否检测用户的允许响应操作;
当检测到所述允许响应操作时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
在某些实施例中,所述处理器用于:
确定所述无人机是否满足预设的响应条件,其中,所述预设的响应条件包括所述无人机的剩余电量是否大于或等于预设的电量阈值、所述无人机与另一无人机或者目标物体之间的距离是否小于或等于预设的距离阈值、所述无人机是否处于飞行状态中的至少一个;
若确定所述无人机满足预设的响应条件时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
在某些实施例中,所述处理器用于:
获取所述无人机的拍摄装置采集的图像,并显示所述图像;
在所述显示的图像中显示用于指示目标物体在所述图像中的位置的标 识。
在某些实施例中,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
如图10所示,本发明实施例还提供一种无人机1000,这里的无人机可以是如前所述的第二无人机,所述无人机包括拍摄装置1001和处理器1002,所述处理器用于执行以下步骤:
获取环境中目标物体的位置,其中,所述目标物体的位置是在所述环境中另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
根据所述目标物体的位置调整所述拍摄装置的拍摄方向调整至朝向所述目标物体。
在某些实施例中,所述处理器用于:
根据所述目标物体的位置跟踪所述目标物体。
在某些实施例中,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
以上各个实施例中的技术方案、技术特征在与本相冲突的情况下均可以单独,或者进行组合,只要未超出本领域技术人员的认知范围,均属于本申请保护范围内的等同实施例。
在本发明所提供的几个实施例中,应该理解到,所揭露的相关遥控装置和方法,可以通过其它的方式实现。例如,以上所描述的遥控装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,遥控装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本发明各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得计算机处理器(processor)执行本发明各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁盘或者光盘等各种可以存储程序代码的介质。
以上所述仅为本发明的实施例,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。
最后应说明的是:以上各实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述各实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (72)

  1. 一种无人机的控制方法,其特征在于,
    在无人机在环境中飞行的过程中,获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,根据所述传感数据确定所述目标物体的位置;
    将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
  2. 根据权利要求1所述的方法,其特征在于,所述观测传感器包括拍摄装置,
    所述获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,包括:
    获取拍摄装置对所述环境中的目标物体进行拍摄而输出的图像;
    所述根据所述传感数据确定所述目标物体的位置,包括:
    确定所述目标物体在所述图像中的位置;
    根据所述目标物体在所述图像中的位置确定所述目标物体的位置。
  3. 根据权利要求2所述的方法,其特征在于,所述目标物体是用户在对显示所述拍摄装置采集到的图像的无人机的控制终端进行目标物体选择操作选中的。
  4. 根据权利要求2或3所述的方法,其特征在于,所述方法还包括:
    将所述拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像;
    将所述确定的所述目标物体在所述图像中的位置发送给无人机的控制终端以使所述控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  5. 根据权利要求1所述的方法,其特征在于,所述观测传感器包括测距传感器,
    所述获取无人机的观测传感器对环境中的目标物体进行感测而输出的 传感数据,包括:
    获取所述测距传感器输出的目标物体的距离和所述测距传感器的观测姿态;
    根据所述输出的目标物体的距离和所述观测姿态确定所述目标物体的位置。
  6. 根据权利要求5所述的方法,其特征在于,所述测距传感器包括用于发射测距信号和接收被目标物体反射的测距信号的接收器。
  7. 根据权利要求5或6所述的方法,其特征在于,所述无人机包括用于安装所述测距传感器和调整测距传感器的观测方向的云台,所述方法还包括:
    获取所述无人机的控制终端发送的观测方向调整指令;
    根据所述观测方向调整指令控制所述云台调整所述测距传感器的观测方向以使所述观测方向朝向所述目标物体。
  8. 根据权利要求5-7任一项所述的方法,其特征在于,所述无人机包括拍摄装置,其中,所述方法还包括:
    将拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像;
    根据所述目标物体的位置确定所述目标物体在所述图像中的位置,将所述目标物体在所述图像中的位置发送给无人机的控制终端以使所述控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识;或者,
    将所述目标物体的位置发送给无人机的控制终端,以使所述控制终端根据所述目标物体的位置确定所述目标物体在所述图像中的位置并在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述第一中继设备包括无人机的控制终端、服务器和所述另一无人机的控终端中的至少一个。
  10. 根据权利要求所述1-9任一项所述的方法,其特征在于,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所 述另一无人机根据所述目标物体的位置跟踪所述目标物体。
  11. 根据权利要求1-10任一项所述的方法,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,所述目标物体是用户对所述无人机的控制终端进行目标物体选择操作而选中。
  13. 一种无人机的控制终端的控制方法,其特征在于,
    在无人机在环境中飞行的过程中,获取无人机发送的所述环境中的目标物体的位置,其中,所述位置是所述无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
    将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
  14. 根据权利要求13所述的方法,其特征在于,所述观测传感器包括拍摄装置,
    接收并显示所述无人机发送的所述拍摄装置采集的图像;
    检测用户对所述显示的图像的目标物体选择操作,根据检测到的所述目标物体选择操作确定目标物体指示信息,其中,所述目标物体指示信息包括目标物体在所述图像中的位置;
    将所述目标物体指示信息发送给所述无人机以使无人机选中所述环境中的目标物体。
  15. 根据权利要求13或14所述的方法,其特征在于,所述方法还包括:
    接收并显示所述无人机发送的所述拍摄装置采集的图像;
    接收所述无人机发送的所述目标物体在所述图像中的位置;
    在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  16. 根据权利要求13所述的方法,其特征在于,所述观测传感器包括测距传感器,
    所述无人机包括用于安装所述测距传感器和调整测距传感器的观测方向的云台,所述方法还包括:
    检测用户的观测方向调整操作,根据所述检测到的观测方向调整操作生成观测方向调整指令;
    将所述观测方向调整指令发送给所述无人机以使所述无人机根据所述观测方向调整指令将所述距离传感器的观测方向调整至朝向所述目标物体。
  17. 根据权利要求16所述的方法,其特征在于,所述无人机包括拍摄装置,其中,所述方法还包括:
    接收并显示所述无人机发送的所述拍摄装置采集的图像;
    接收无人机发送的所述目标物体在所述图像中的位置,在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识,其中,所述目标物体在所述图像中的位置是无人机根据所述目标物体的位置确定的;或者,
    根据所述目标物体的位置确定所述目标物体在所述图像中的位置,并根据所述目标物体在所述图像中的位置在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  18. 根据权利要求13-17任一项所述的方法,其特征在于,所述第二中继设备包括服务器和所述另一无人机的控终端中的至少一个。
  19. 根据权利要求13-18任一项所述的方法,其特征在于,所述方法还包括:
    显示多个候选的无人机的指示信息;
    检测用户的无人机选中操作,根据检测到的所述无人机选中操作确定用户从所述多个候选的无人机的指示信息中选中的所述另一无人机的指示信息;
    所述将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,包括:
    将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的另一无人机。
  20. 根据权利要求所述13-19任一项所述的方法,其特征在于,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置跟踪所述目标物体。
  21. 根据权利要求13-20任一项所述的方法,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  22. 一种服务器的控制方法,其特征在于,所述方法包括:
    获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环境中无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
    将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
  23. 根据权利要求22所述的方法,其特征在于,所述方法还包括:
    确定与所述无人机绑定至同一个所有者或者工作组的所述另一无人机;
    所述将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,包括:
    将所述目标物体的位置发送给与所述无人机绑定的另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的与所述无人机绑定的另一无人机。
  24. 根据权利要求23所述的方法,其特征在于,所述确定与所述无人机绑定至同一个所有者或者工作组的另一无人机,包括:
    获取所述无人机的身份信息;
    根据所述身份信息确定与所述无人机绑定的所述另一无人机。
  25. 根据权利要求22-24任一项所述的方法,其特征在于,所述方法还包括:
    从多个候选的无人机中确定位于所述环境中的所述另一无人机。
  26. 根据权利要求22-25任一项所述的方法,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  27. 一种无人机的控制终端的控制方法,其特征在于,所述方法包括:
    获取目标物体的位置;
    将所述目标物体的位置发送给在所述环境中的所述无人机,以使所述无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体;
    其中,所述目标物体的位置是在所述环境中飞行的另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的。
  28. 根据权利要求27所述的方法,其特征在于,所述方法还包括:
    响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的位置的标识;和/或,
    响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的方位的标识。
  29. 根据权利要求27或28所述的方法,其特征在于,所述方法还包括:
    响应于获取到所述目标物体的位置,确定是否满足预设的发送条件;
    当满足时,将获取到的目标物体的位置发送给所述无人机,否则,拒绝将所述目标物体的位置发送给所述无人机。
  30. 根据权利要求29所述的方法,其特征在于,所述确定是否满足预设的发送条件,包括:
    确定是否检测用户的允许响应操作;
    当检测到所述允许响应操作时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
  31. 根据权利要求29所述的方法,其特征在于,所述确定是否满足预设的发送条件,包括:
    确定所述无人机是否满足预设的响应条件,其中,所述预设的响应条件包括所述无人机的剩余电量是否大于或等于预设的电量阈值、所述无人机与另一无人机或者目标物体之间的距离是否小于或等于预设的距离阈值、所述无人机是否处于飞行状态中的至少一个;
    若确定所述无人机满足预设的响应条件时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
  32. 根据权利要求27-31任一项所述的方法,其特征在于,所述方法还包括:
    获取所述无人机的拍摄装置采集的图像,并显示所述图像;
    在所述显示的图像中显示用于指示目标物体在所述图像中的位置的标 识。
  33. 根据权利要求27-32任一项所述的方法,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  34. 一种无人机的控制方法,所述无人机包括拍摄装置,其特征在于,所述方法包括:
    获取环境中目标物体的位置,其中,所述目标物体的位置是在所述环境中另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
    根据所述目标物体的位置调整所述拍摄装置的拍摄方向调整至朝向所述目标物体。
  35. 根据权利要求34所述的方法,其特征在于,所述方法还包括:
    根据所述目标物体的位置跟踪所述目标物体。
  36. 根据权利要求34或35所述的方法,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  37. 一种无人机,包括观测传感器和处理器,其特征在于,所述处理器用于执行以下步骤:
    在无人机在环境中飞行的过程中,获取无人机的观测传感器对环境中的目标物体进行感测而输出的传感数据,根据所述传感数据确定所述目标物体的位置;
    将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第一中继设备,以通过第一中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
  38. 根据权利要求37所述的无人机,其特征在于,所述观测传感器包括拍摄装置,所述处理器用于:
    获取拍摄装置对所述环境中的目标物体进行拍摄而输出的图像;
    确定所述目标物体在所述图像中的位置;
    根据所述目标物体在所述图像中的位置确定所述目标物体的位置。
  39. 根据权利要求36所述的无人机,其特征在于,所述目标物体是用户在对显示所述拍摄装置采集到的图像的无人机的控制终端进行目标物体 选择操作选中的。
  40. 根据权利要求28或39所述的无人机,其特征在于,所述处理器用于:
    将所述拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像;
    将所述确定的所述目标物体在所述图像中的位置发送给无人机的控制终端以使所述控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  41. 根据权利要求37所述的无人机,其特征在于,所述观测传感器包括测距传感器,所述处理器用于:
    获取所述测距传感器输出的目标物体的距离和所述测距传感器的观测姿态;
    根据所述输出的目标物体的距离和所述观测姿态确定所述目标物体的位置。
  42. 根据权利要求41所述的无人机,其特征在于,所述测距传感器包括用于发射测距信号和接收被目标物体反射的测距信号的接收器。
  43. 根据权利要求41或42所述的无人机,其特征在于,所述无人机包括用于安装所述测距传感器和调整测距传感器的观测方向的云台,所述处理器用于:
    获取所述无人机的控制终端发送的观测方向调整指令;
    根据所述观测方向调整指令控制所述云台调整所述测距传感器的观测方向以使所述观测方向朝向所述目标物体。
  44. 根据权利要求41-43任一项所述的无人机,其特征在于,所述无人机包括拍摄装置,所述处理器用于:
    将拍摄装置采集到的图像发送给无人机的控制终端以使所述控制终端显示所述图像;
    根据所述目标物体的位置确定所述目标物体在所述图像中的位置,将所述目标物体在所述图像中的位置发送给无人机的控制终端以使所述控制终端在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识;或者,
    将所述目标物体的位置发送给无人机的控制终端,以使所述控制终端根 据所述目标物体的位置确定所述目标物体在所述图像中的位置并在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  45. 根据权利要求37-44任一项所述的无人机,其特征在于,所述第一中继设备包括无人机的控制终端、服务器和所述另一无人机的控终端中的至少一个。
  46. 根据权利要求所述37-45任一项所述的无人机,其特征在于,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置跟踪所述目标物体。
  47. 根据权利要求37-46任一项所述的无人机,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  48. 根据权利要求37-47任一项所述的无人机,其特征在于,所述目标物体是用户对所述无人机的控制终端进行目标物体选择操作而选中。
  49. 一种无人机的控制终端,其特征在于,包括存储器和处理器,
    所述存储器,用于存储程序代码;
    所述处理器,用于调用并执行所述程序代码以执行以下步骤:
    在无人机在环境中飞行的过程中,获取无人机发送的所述环境中的目标物体的位置,其中,所述位置是所述无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
    将所述目标物体的位置发送给在所述环境中飞行的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
  50. 根据权利要求49所述的控制终端,其特征在于,所述观测传感器包括拍摄装置,所述处理器用于:
    接收并显示所述无人机发送的所述拍摄装置采集的图像;
    检测用户对所述显示的图像的目标物体选择操作,根据检测到的所述目标物体选择操作确定目标物体指示信息,其中,所述目标物体指示信息包括目标物体在所述图像中的位置。
    将所述目标物体指示信息发送给所述无人机以使无人机选中所述环境中的目标物体。
  51. 根据权利要求49或50所述的控制终端,其特征在于,所述处理器用于:
    接收并显示所述无人机发送的所述拍摄装置采集的图像;
    接收所述无人机发送的所述目标物体在所述图像中的位置;
    在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  52. 根据权利要求49所述的控制终端,其特征在于,所述观测传感器包括测距传感器,
    所述无人机包括用于安装所述测距传感器和调整测距传感器的观测方向的云台,所述处理器用于:
    检测用户的观测方向调整操作,根据所述检测到的观测方向调整操作生成观测方向调整指令;
    将所述观测方向调整指令发送给所述无人机以使所述无人机根据所述观测方向调整指令将所述距离传感器的观测方向调整至朝向所述目标物体。
  53. 根据权利要求52所述的控制终端,其特征在于,所述无人机包括拍摄装置,其中,所述处理器用于:
    接收并显示所述无人机发送的所述拍摄装置采集的图像;
    接收无人机发送的所述目标物体在所述图像中的位置,在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识,其中,所述目标物体在所述图像中的位置是无人机根据所述目标物体的位置确定的;或者,
    根据所述目标物体的位置确定所述目标物体在所述图像中的位置,并根据所述目标物体在所述图像中的位置在所述被显示的图像上显示指示所述目标物体在所述图像中的位置的标识。
  54. 根据权利要求49-53任一项所述的控制终端,其特征在于,所述第二中继设备包括服务器和所述另一无人机的控终端中的至少一个。
  55. 根据权利要求49-54任一项所述的控制终端,其特征在于,所述处理器用于:
    显示多个候选的无人机的指示信息;
    检测用户的无人机选中操作,根据检测到的所述无人机选中操作确定用 户从所述多个候选的无人机的指示信息中选中的所述另一无人机的指示信息;
    将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的另一无人机或者将所述目标物体的位置发送给第二中继设备,以通过第二中继设备将所述目标物体的位置发送给在所述环境中飞行的与选中的指示信息对应的另一无人机。
  56. 根据权利要求所述49-55任一项所述的控制终端,其特征在于,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置控制所述拍摄装置的变焦;和/或,
    所述目标物体的位置被传送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置跟踪所述目标物体。
  57. 根据权利要求49-56任一项所述的控制终端,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  58. 一种服务器,其特征在于,包括存储器和处理器,
    所述存储器,用于存储程序代码;
    所述处理器,用于调用并执行所述程序代码以执行以下步骤:
    获取环境中的目标物体的位置,其中,所述目标物体的位置是在所述环境中无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
    将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,以使所述另一无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体。
  59. 根据权利要求58所述的服务器,其特征在于,所述处理器用于:
    确定与所述无人机绑定至同一个所有者或者工作组的所述另一无人机;
    所述将所述目标物体的位置发送给另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的另一无人机,包括:
    将所述目标物体的位置发送给与所述无人机绑定的另一无人机或者将所述目标物体的位置发送给第三中继设备,以通过第三中继设备将所述目标物体的位置发送给在所述环境中飞行的与所述无人机绑定的另一无人机。
  60. 根据权利要求59所述的服务器,其特征在于,所述处理器用于:
    获取所述无人机的身份信息;
    根据所述身份信息确定与所述无人机绑定的所述另一无人机。
  61. 根据权利要求48-60任一项所述的服务器,其特征在于,所述处理器用于:
    从多个候选的无人机中确定位于所述环境中的所述另一无人机。
  62. 根据权利要求48-61任一项所述的服务器,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  63. 一种无人机的控制终端,其特征在于,包括存储器和处理器,
    所述存储器,用于存储程序代码;
    所述处理器,用于调用并执行所述程序代码以执行以下步骤:
    获取目标物体的位置;
    将所述目标物体的位置发送给在所述环境中的所述无人机,以使所述无人机根据所述目标物体的位置将其配置的拍摄装置的拍摄方向调整至朝向所述目标物体;
    其中,所述目标物体的位置是在所述环境中飞行的另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的。
  64. 根据权利要求63所述的控制终端,其特征在于,所述处理器用于:
    响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的位置的标识;和/或,
    响应于获取到所述目标物体的位置,根据所述目标物体的位置显示用于指示所述目标物体的方位的标识。
  65. 根据权利要求63或64所述的控制终端,其特征在于,所述处理器用于:
    响应于获取到所述目标物体的位置,确定是否满足预设的发送条件;
    当满足时,将获取到的目标物体的位置发送给所述无人机,否则,拒绝将所述目标物体的位置发送给所述无人机。
  66. 根据权利要求65所述的控制终端,其特征在于,所述处理器用于:
    确定是否检测用户的允许响应操作;
    当检测到所述允许响应操作时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
  67. 根据权利要求65所述的控制终端,其特征在于,所述处理器用于:
    确定所述无人机是否满足预设的响应条件,其中,所述预设的响应条件包括所述无人机的剩余电量是否大于或等于预设的电量阈值、所述无人机与另一无人机或者目标物体之间的距离是否小于或等于预设的距离阈值、所述无人机是否处于飞行状态中的至少一个;
    若确定所述无人机满足预设的响应条件时,确定满足预设的发送条件,否则,确定不满足预设的发送条件。
  68. 根据权利要求63-67任一项所述的控制终端,其特征在于,所述处理器用于:
    获取所述无人机的拍摄装置采集的图像,并显示所述图像;
    在所述显示的图像中显示用于指示目标物体在所述图像中的位置的标识。
  69. 根据权利要求63-68任一项所述的控制终端,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
  70. 一种无人机,所述无人机包括拍摄装置和处理器,所述处理器用于执行以下步骤:
    获取环境中目标物体的位置,其中,所述目标物体的位置是在所述环境中另一无人机根据其配置观测传感器对所述目标物体进行感测而输出的传感数据确定的;
    根据所述目标物体的位置调整所述拍摄装置的拍摄方向调整至朝向所述目标物体。
  71. 根据权利要求70所述的无人机,其特征在于,所述处理器用于:
    根据所述目标物体的位置跟踪所述目标物体。
  72. 根据权利要求70或71所述的无人机,其特征在于,所述无人机和所述另一无人机是绑定至同一个所有者或者工作组的无人机。
PCT/CN2022/082119 2022-03-21 2022-03-21 无人机、控制终端、服务器及其控制方法 WO2023178495A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/082119 WO2023178495A1 (zh) 2022-03-21 2022-03-21 无人机、控制终端、服务器及其控制方法
CN202280050802.6A CN117677911A (zh) 2022-03-21 2022-03-21 无人机、控制终端、服务器及其控制方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/082119 WO2023178495A1 (zh) 2022-03-21 2022-03-21 无人机、控制终端、服务器及其控制方法

Publications (1)

Publication Number Publication Date
WO2023178495A1 true WO2023178495A1 (zh) 2023-09-28

Family

ID=88099555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/082119 WO2023178495A1 (zh) 2022-03-21 2022-03-21 无人机、控制终端、服务器及其控制方法

Country Status (2)

Country Link
CN (1) CN117677911A (zh)
WO (1) WO2023178495A1 (zh)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105979146A (zh) * 2016-06-22 2016-09-28 韦程耀 无人机的航拍控制系统
CN106325290A (zh) * 2016-09-30 2017-01-11 北京奇虎科技有限公司 一种基于无人机的监控系统及设备
US20170220037A1 (en) * 2016-02-03 2017-08-03 Sony Corporation System and method for utilization of multiple-camera network to capture static and/or motion scenes
CN108615243A (zh) * 2017-01-25 2018-10-02 北京三星通信技术研究有限公司 立体多媒体信息的确定方法、装置及系统
CN109859264A (zh) * 2017-11-30 2019-06-07 北京机电工程研究所 一种基于视觉导引的飞行器捕控跟踪系统
CN110658852A (zh) * 2019-09-16 2020-01-07 苏州米龙信息科技有限公司 用于无人机的智能目标搜寻方法及系统
CN111142567A (zh) * 2019-11-29 2020-05-12 西北工业大学 一种无人机系统中的无人机目标位置交换方法及装置
CN111487997A (zh) * 2020-05-12 2020-08-04 西安爱生技术集团公司 一种攻击型无人机双机协同制导方法
CN113472998A (zh) * 2020-03-31 2021-10-01 杭州海康机器人技术有限公司 图像处理方法、装置、电子设备和存储介质

Also Published As

Publication number Publication date
CN117677911A (zh) 2024-03-08

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22932564; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 202280050802.6; Country of ref document: CN)