WO2019144295A1 - Flight control method, device, aircraft, system and storage medium - Google Patents

Flight control method, device, aircraft, system and storage medium

Info

Publication number
WO2019144295A1
WO2019144295A1 (PCT/CN2018/073877)
Authority
WO
WIPO (PCT)
Prior art keywords
control
target user
aircraft
flight
gesture
Prior art date
Application number
PCT/CN2018/073877
Other languages
English (en)
French (fr)
Inventor
钱杰
陈侠
张李亮
赵丛
刘政哲
李思晋
庞磊
李昊南
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2018/073877 priority Critical patent/WO2019144295A1/zh
Priority to CN201880002091.9A priority patent/CN109196438A/zh
Publication of WO2019144295A1 publication Critical patent/WO2019144295A1/zh
Priority to US16/935,680 priority patent/US20200348663A1/en
Priority to US18/316,399 priority patent/US20230280745A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0033Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls

Definitions

  • the present invention relates to the field of control technologies, and in particular, to a flight control method, device, aircraft, system, and storage medium.
  • Embodiments of the present invention provide a flight control method, device, aircraft, system, and storage medium, which can control an aircraft relatively quickly.
  • an embodiment of the present invention provides a flight control method, which is applied to an aircraft, and the aircraft is mounted with a photographing device, and the method includes:
  • a control instruction is generated according to the control object to control the flight of the aircraft.
  • an embodiment of the present invention provides another flight control method, which is applied to an aircraft, and the aircraft is mounted with a photographing device, and the method includes:
  • a takeoff control command is generated to control the aircraft to take off.
  • an embodiment of the present invention provides a flight control device, including a memory and a processor;
  • the memory is configured to store program instructions
  • the processor executes program instructions stored in the memory, and when the program instructions are executed, the processor is configured to perform the following steps:
  • a control instruction is generated according to the control object to control the flight of the aircraft.
  • an embodiment of the present invention provides another flight control device, including a memory and a processor;
  • the memory is configured to store program instructions
  • the processor executes program instructions stored in the memory, and when the program instructions are executed, the processor is configured to perform the following steps:
  • a takeoff control command is generated to control the aircraft to take off.
  • an embodiment of the present invention provides an aircraft, including:
  • a power system disposed on the fuselage for providing flight power
  • a processor configured to acquire an environment image captured by the photographing device; determine a feature portion of the target user according to the environment image, determine a target image region according to the feature portion, and identify a control object of the target user in the target image region; and generate a control instruction according to the control object to control the flight of the aircraft.
  • an embodiment of the present invention provides another aircraft, including:
  • a power system disposed on the fuselage for providing flight power
  • a processor configured to acquire an environment image captured by the camera when a triggering operation that triggers the aircraft to enter the image control mode is acquired; perform gesture recognition on the control object of the target user in the environment image; and, if the gesture of the control object is recognized as a start flight gesture, generate a takeoff control command to control the aircraft to take off.
  • an embodiment of the present invention provides a flight control system, including: a flight control device and an aircraft;
  • the aircraft is configured to control a camera mounted on the aircraft to capture an environment image, and send the environment image to the flight control device;
  • the flight control device is configured to acquire the environment image captured by the camera, determine a feature portion of the target user according to the environment image, determine a target image region according to the feature portion, identify a control object of the target user in the target image region, and generate a control instruction according to the control object to control the flight of the aircraft;
  • the aircraft is further configured to control the aircraft to fly and perform an action corresponding to the flight control instruction in response to the flight control instruction.
  • an embodiment of the present invention provides another flight control system, including: a flight control device and an aircraft;
  • the flight control device is configured to acquire an environment image captured by the camera when a triggering operation that triggers the aircraft to enter the image control mode is acquired, perform gesture recognition on the control object of the target user in the environment image, and, if the gesture of the control object is recognized as a start flight gesture, generate a takeoff control command to control the aircraft to take off;
  • the aircraft is configured to control the aircraft to take off in response to the takeoff control command.
  • an embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the flight control method of the first aspect or the second aspect described above.
  • in the embodiments of the present invention, the flight control device obtains an environment image captured by the camera, determines a feature portion of the target user according to the environment image, determines a target image region according to the feature portion, and identifies the control object of the target user in that region, so that a control instruction generated according to the control object controls the flight of the aircraft. In this way, the aircraft is controlled more quickly, and the efficiency of controlling the flight, shooting, and landing of the aircraft is improved.
  • 1a is a schematic structural diagram of a flight control system according to an embodiment of the present invention.
  • FIG. 1b is a schematic diagram of flight control of an aircraft according to an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a flight control method according to an embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of another flight control method according to an embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of still another flight control method according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a flight control device according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of another flight control device according to an embodiment of the present invention.
  • the flight control method provided in the embodiments of the present invention may be performed by a flight control device, which may be disposed on an aircraft (such as a drone) on which a photographing device capable of capturing video is mounted.
  • the flight control method can be applied to control operations such as takeoff, flight, landing, photographing, video recording, etc. of the aircraft.
  • the flight control method can also be applied to a movable device such as a robot capable of autonomous movement; the following describes the flight control method as applied to an aircraft.
  • the flight control device may control takeoff of the aircraft: if the flight control device acquires a triggering operation that triggers the aircraft to enter an image control mode, the aircraft may be controlled to enter the image control mode.
  • the flight control device may acquire an environment image captured by a camera mounted on the aircraft, wherein the environment image is a preview image captured by the photographing device before the aircraft takes off.
  • the flight control device may perform gesture recognition on a control object of the target user in the environment image, and if the gesture of the control object is recognized as a start flight gesture, a takeoff control command may be generated to control the aircraft to take off.
  • the triggering operation may include any one or more of: a click operation on the aircraft power button, a double-click operation on the aircraft power button, a shaking operation on the aircraft, a voice input operation, a fingerprint input operation, a feature object scanning operation, an operation from a smart accessory (such as smart glasses, a smart watch, or a wristband), and the like.
  • the embodiment of the present invention does not limit the triggering operation.
  • the initiating flight gesture may be any specified gesture made by the target user, such as an "OK" gesture, a scissors hand gesture, etc., and the embodiment of the present invention does not limit the starting flight gesture.
  • the target user mainly refers to a person
  • the control object may be a palm of the target user, or another body part or body region, such as a feature part like the face, head, or shoulders;
  • the present invention does not limit the target user and the control object.
  • for example, assume that the triggering operation is a double-click operation on the aircraft power button, the target user is a person, the control object is the palm of the target user, and the start flight gesture is set to an “OK” gesture.
  • the flight control device may control the aircraft to enter an image control mode.
  • in the image control mode, the flight control device may acquire an environment image captured by the camera on the aircraft, where the environment image is a preview image used for control analysis rather than an image that needs to be stored, and the target user is included in the preview image.
  • in the image control mode, the flight control device may perform gesture recognition on the palm of the target user in the environment image, and may generate a takeoff control command to control the aircraft to take off if the gesture made by the palm of the target user is recognized as an “OK” gesture.
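The takeoff flow just described — acquire the triggering operation, enter the image control mode, scan the preview image, and take off once a start flight gesture is recognized — can be sketched as a small decision function. This is an illustrative sketch only; the gesture labels and command names are hypothetical placeholders, not the patent's actual implementation.

```python
# Hedged sketch of the takeoff flow: trigger -> image control mode ->
# gesture check on the preview image -> takeoff command.
# The gesture strings stand in for the output of any gesture classifier.

START_FLIGHT_GESTURES = {"OK", "scissors"}  # any designated start flight gesture

def takeoff_decision(trigger_received: bool, gesture: str) -> str:
    """Return the command the flight control device would issue."""
    if not trigger_received:
        return "IDLE"        # not yet in the image control mode
    if gesture in START_FLIGHT_GESTURES:
        return "TAKEOFF"     # generate the takeoff control command
    return "WAIT"            # keep scanning preview images
```

For instance, a double-click trigger followed by a recognized "OK" gesture yields the takeoff command, while any unrecognized gesture leaves the aircraft waiting on the ground.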
  • after acquiring the triggering operation and entering the image control mode, the flight control device first needs to identify the control object of the target user. Specifically, the flight control device may acquire an environment image by controlling the photographing device mounted on the aircraft, wherein the environment image is a preview image captured before the aircraft takes off. The flight control device may determine a feature portion of the target user from the preview image, determine a target image region according to the feature portion, and thereby identify the control object of the target user in the target image region. For example, assuming that the target user's control object is a palm, the flight control device may acquire such a preview image by controlling the photographing device mounted on the aircraft.
  • the flight control device may determine the human body of the target user as the feature portion, determine the target image region in which the human body is located, and identify the palm of the target user in that region.
  • the flight control device may control the camera to capture a flight environment image during flight of the aircraft, and perform gesture recognition on the control object of the target user in the flight environment image.
  • if the gesture recognition determines a flight control gesture, a control instruction may be generated according to the identified flight control gesture to control the aircraft to perform the action corresponding to the control instruction.
  • FIG. 1a is a schematic structural diagram of a flight control system according to an embodiment of the present invention.
  • the system includes a flight control device 11 and an aircraft 12.
  • the flight control device 11 may be disposed on the aircraft 12; in FIG. 1a the aircraft 12 and the flight control device 11 are drawn separately for convenience of explanation.
  • the communication connection between the aircraft 12 and the flight control device 11 may be a wired communication connection or a wireless communication connection.
  • the aircraft 12 may be a rotary wing type unmanned aerial vehicle, such as a quadrotor UAV, a six-rotor UAV, an eight-rotor UAV, or a fixed-wing UAV.
  • the aircraft 12 includes a power system 121 for providing flight power to the aircraft 12, wherein the power system 121 includes any one or more of a propeller, a motor, and an electronic speed controller. The aircraft 12 may further include a pan/tilt 122 and an imaging device 123, the imaging device 123 being mounted on the main body of the aircraft 12 via the pan/tilt 122.
  • the photographing device 123 is configured to capture a preview image before the aircraft 12 takes off, and to capture images or video during the flight of the aircraft 12; it includes but is not limited to a multispectral imager, a hyperspectral imager, a visible-light camera, an infrared camera, and the like. The pan/tilt 122 is a multi-axis transmission and stabilization system: its motors compensate the imaging angle of the imaging device by adjusting the rotation angles of the rotation axes, and an appropriate buffer mechanism prevents or reduces jitter of the imaging device.
  • after acquiring the triggering operation that triggers the aircraft 12 to enter the image control mode, and before controlling the aircraft 12 to take off, the flight control device 11 can turn on the camera 123 mounted on the aircraft 12 and control the rotation of the pan/tilt 122 to adjust its attitude angle, thereby controlling the camera 123 to perform scanning shooting within a preset shooting range until the environment image scanned in that range includes the feature portion of the target user. The flight control device 11 can then acquire the environment image, captured within the preset shooting range, that includes the feature portion of the target user, wherein the environment image is a preview image captured by the photographing device 123 before the aircraft 12 takes off.
  • the flight control device 11 detects the state of the target user based on the environment image before the aircraft 12 takes off. If the flight control device 11 detects that the state parameter of the target user satisfies a preset first condition, it may determine the feature portion of the target user to be a first feature portion, determine the target image region in which the first feature portion is located according to the first feature portion of the target user, and thereby identify the control object of the target user in the target image region.
  • the state parameter of the target user may include a size-ratio parameter of the image region in which the target user is located in the environment image; the state parameter satisfying the preset first condition then means that this size ratio is less than or equal to a preset first ratio threshold. Alternatively, the state parameter may include a distance parameter between the target user and the aircraft; the state parameter satisfying the preset first condition then means that the distance between the target user and the aircraft is greater than or equal to a preset first distance.
  • the first feature portion may be the human body of the target user, or another body part of the target user, which is not limited in the embodiment of the present invention.
  • for example, if the flight control device detects that the size ratio of the image region in which the target user is located in the environment image captured by the camera is less than 1/4, the flight control device may determine that the feature portion of the target user is the human body, determine the target image region in which the human body is located according to the human body of the target user, and identify the control object of the target user, such as a palm, in that target image region.
  • the flight control device 11 detects the state of the target user based on the environment image before the aircraft 12 takes off. If the flight control device 11 detects that the state parameter of the target user satisfies a preset second condition, it may determine the feature portion of the target user to be a second feature portion, determine the target image region in which the second feature portion is located according to the second feature portion of the target user, and thereby identify the control object of the target user in the target image region.
  • the state parameter of the target user may include a size-ratio parameter of the image region in which the target user is located in the environment image; the state parameter satisfying the preset second condition then means that this size ratio is greater than or equal to a preset second ratio threshold. Alternatively, the state parameter may include a distance parameter between the target user and the aircraft; the state parameter satisfying the preset second condition then means that the distance between the target user and the aircraft is less than or equal to a preset second distance.
  • the second feature portion includes the head of the target user, or may include other body parts such as the head and shoulders of the target user, which is not limited in the embodiment of the present invention.
  • for example, if the flight control device detects that the state parameter of the target user satisfies the preset second condition, it may determine that the feature portion of the target user is the head, determine the target image region in which the head is located according to the head of the target user, and identify the control object of the target user, such as a palm, in that target image region.
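The feature-portion selection described in the preceding paragraphs can be sketched as a threshold test on the size ratio of the user's image region. The 1/4 threshold follows the example above; the function name and the single shared threshold are illustrative assumptions (the text allows distinct first and second thresholds, and an equivalent distance-based test).

```python
def select_feature_part(user_area: float, image_area: float,
                        ratio_threshold: float = 0.25) -> str:
    """Decide which feature portion anchors the palm-search region.

    A small on-screen footprint (user far from the aircraft, first
    condition) selects the whole human body as the first feature portion;
    a large footprint (user close, second condition) selects the head as
    the second feature portion.
    """
    ratio = user_area / image_area
    return "human_body" if ratio <= ratio_threshold else "head"
```

A user occupying a tenth of the frame would thus be localized via the full body, while a close-up user would be localized via the head region before the palm is searched for.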
  • when the flight control device 11 identifies the control object of the target user before the aircraft 12 takes off, if at least one control object is identified in the target image region, the flight control device may determine a joint point of the target user according to a feature point of the target user, and determine the control object of the target user from the at least one control object according to the determined joint point.
  • the joint point includes the joint point of the feature part of the target user, which is not limited in the embodiment of the present invention.
  • when determining the control object of the target user from the at least one control object, the flight control device 11 may determine a target joint point from the determined joint points, and determine the control object closest to the target joint point among the at least one control object as the control object of the target user.
  • the target joint point may refer to a joint point of a specified arm part, such as the elbow joint, the joint between the arm and the shoulder, or the wrist; both the target joint point and the control object belong to the same target user.
  • for example, if two palms are identified in the target image region, the flight control device 11 can determine the joint points of the target user's arms and shoulders, and determine, of the two palms, the palm closest to the joint point of the arm and shoulder of the target user as the control object of the target user.
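The nearest-to-joint-point rule above can be sketched as a minimum-distance selection over candidate detections. The data layout (plain `(x, y)` pixel coordinates) is an assumption; `math.dist` computes the Euclidean distance.

```python
import math

def pick_control_object(candidates, target_joint):
    """From several detected palms, pick the one closest to the target
    joint point (e.g. the wrist or arm-shoulder joint of the target
    user's raised arm). Each point is an (x, y) pixel coordinate; all
    points are assumed to come from the same environment image."""
    return min(candidates, key=lambda palm: math.dist(palm, target_joint))
```

With two palms at `(10, 10)` and `(100, 100)` and the target user's wrist near `(95, 98)`, the second palm is selected, which rejects a bystander's palm elsewhere in the frame.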
  • the flight control device 11 may identify a flight control gesture of the control object; if the flight control device 11 identifies that the flight control gesture of the control object is a height control gesture, a height control command can be generated to control the aircraft 12 to adjust the altitude at which the aircraft 12 is flying.
  • the flight control device 11 may control the photographing device 123 to capture a set of images during the flight of the aircraft, and perform motion recognition on the control object according to the images in the set to obtain motion information of the control object, wherein the motion information includes information such as the motion direction of the control object.
  • the flight control device 11 may analyze the flight control gesture of the control object according to the motion information; if it determines that the flight control gesture is a height control gesture, it obtains the height control command corresponding to the height control gesture and controls the aircraft 12 to fly in the motion direction indicated by the height control command so as to adjust the height of the aircraft 12.
  • FIG. 1b is used as an example for description.
  • FIG. 1b is a schematic diagram of flight control of an aircraft according to an embodiment of the present invention.
  • the schematic diagram shown in FIG. 1b includes a target user 13 and an aircraft 12, wherein the target user 13 includes a control object 131, and the aircraft 12 includes a power system 121, a pan/tilt 122, and a camera 123 as described above with respect to FIG. 1a.
  • the explanation of the aircraft 12 is as described above, and will not be described herein.
  • the aircraft 12 is provided with the flight control device, and the control object 131 is a palm.
  • the flight control device can control the camera 123 to capture images of the control object 131; if the flight control gesture recognized from those images is a height control gesture indicating upward motion, a height control command may be generated to control the aircraft 12 to fly vertically upward, increasing the flying height of the aircraft 12.
  • if the identified flight control gesture is a movement control gesture, a movement control command may be generated to control the aircraft to fly in the direction indicated by the movement control command.
  • the direction indicated by the movement control instruction includes: a direction away from the control object or a direction close to the control object.
  • if the control object includes a first object and a second object, the flight control device 11 may perform motion recognition on the first object and the second object, obtain motion information of the first object and the second object, and obtain, according to the motion information, the action feature represented by the first object and the second object, wherein the action feature is used to indicate the change in the distance between the first object and the second object; the flight control device 11 can then acquire the movement control command corresponding to the action feature according to the distance change.
  • if the action feature indicates that the distance between the first object and the second object increases, the movement control command is for controlling the aircraft to fly in a direction away from the target user; if the action feature indicates that the distance decreases, the movement control command is for controlling the aircraft to fly in a direction toward the target user.
  • for example, the control object includes a first object and a second object, the first object being the person's left palm and the second object being the person's right palm. If the flight control device 11 detects the two palms that the target user lifts toward the photographing device of the aircraft 12, and detects that the two palms are "opening", that is, that the distance between the two palms in the horizontal direction is gradually increasing, the flight control device 11 may determine that the flight control gesture made by the two palms is a movement control gesture and generate a movement control command to control the aircraft 12 to fly away from the target user.
  • conversely, if the flight control device 11 detects that the distance between the two palms in the horizontal direction is gradually decreasing, it can determine that the flight control gesture made by the two palms is a movement control gesture and generate a movement control command that controls the aircraft 12 to fly in a direction close to the target user.
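The two-palm "opening"/"closing" mapping described above can be sketched by comparing the palm-to-palm distance across two consecutive frames. The coordinates, the jitter threshold `eps`, and the command names are illustrative assumptions, not the patent's implementation.

```python
import math

def movement_command(prev_left, prev_right, left, right, eps=1.0):
    """Map the change in distance between the two palms (first and
    second objects) across two frames to a movement control command.
    Points are (x, y) pixel coordinates; `eps` filters out jitter."""
    before = math.dist(prev_left, prev_right)
    after = math.dist(left, right)
    if after - before > eps:
        return "MOVE_AWAY"      # palms opening -> fly away from the user
    if before - after > eps:
        return "MOVE_CLOSER"    # palms closing -> fly toward the user
    return "HOLD"
```

In practice the per-frame distance change would be smoothed over the whole image set rather than read from a single frame pair.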
  • if the identified flight control gesture is a drag control gesture, a drag control command may be generated to control the aircraft to fly in the direction indicated by the drag control command.
  • the drag control gesture refers to the target user dragging the palm to the left or right in the horizontal direction.
  • for example, if the target user drags the palm horizontally to the left, a drag control command may be generated to control the aircraft to fly in a horizontally left direction.
  • if the identified flight control gesture is a rotation control gesture, a rotation control command may be generated to control the aircraft to fly rotationally in the direction indicated by the rotation control command.
  • the rotation control gesture refers to the palm of the target user rotating around the target user.
  • the flight control device 11 may perform motion recognition on the palm included in the control object and on the target user, according to the images in the image set captured by the imaging device 123, to obtain motion information of the palm and the target user, which may include the motion directions of the palm and the target user.
  • if a rotation control gesture is recognized from the motion information, a rotation control command may be generated to control the aircraft to fly in the rotation direction indicated by the rotation control command. For example, assuming that the flight control device 11 detects that the palm of the target user rotates clockwise around the target user together with the target user, the flight control device 11 may generate a rotation control command to control the aircraft 12 to rotate clockwise around the target user.
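One way to infer the rotation sense described above from two consecutive image positions of the palm is the sign of a 2D cross product of the radius vectors around the target user. This is an assumed method for illustration, not the patent's stated algorithm; note that image coordinates grow downward in y, so a positive cross product reads as clockwise on screen.

```python
def rotation_direction(center, prev_pos, pos):
    """Infer the rotation sense of the palm around the target user from
    two consecutive image positions, via the 2D cross product of the
    radius vectors (center -> prev_pos and center -> pos)."""
    v1 = (prev_pos[0] - center[0], prev_pos[1] - center[1])
    v2 = (pos[0] - center[0], pos[1] - center[1])
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    if cross > 0:
        return "CLOCKWISE"          # orbit the target user clockwise
    if cross < 0:
        return "COUNTERCLOCKWISE"   # orbit counterclockwise
    return "NONE"                   # no rotation detected
```

A real system would accumulate this sign over many frames of the image set before committing to an orbit direction.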
  • If the flight control device 11 recognizes that the flight control gesture of the control object is a landing gesture, a landing control command may be generated to control the aircraft to land.
  • The landing gesture may include a gesture in which the palm of the target user moves downward toward the ground, or the landing gesture may also be another gesture of the target user, which is not specifically limited in the embodiment of the present invention.
  • If the landing gesture is recognized, a landing control command may be generated to control the aircraft 12 to land to the target position.
  • The target position may be preset, or may be determined according to the height between the aircraft 12 and the ground detected by the aircraft 12, which is not limited in the embodiment of the present invention. If it is detected that the landing gesture stays at the target position for a time greater than a preset time threshold, the aircraft 12 may be controlled to land to the ground.
  • For example, assume that the preset time threshold is 3 s and the target position determined according to the height between the aircraft 12 and the ground is 0.5 m above the ground. If the flight control device 11 recognizes that the palm of the target user is moving downward toward the ground, a landing control command may be generated to control the aircraft 12 to land at a position 0.5 m above the ground; if the gesture of the target user's palm moving downward toward the ground stays at that position for more than 3 s, the aircraft 12 may be controlled to land to the ground.
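The two-stage landing described above (descend to a target position, then land fully only if the gesture is held there past the time threshold) can be sketched as a small decision function. The 0.5 m and 3 s values follow the example; everything else (names, command strings) is an illustrative assumption.

```python
def landing_action(height_m, gesture_hold_s,
                   target_height_m=0.5, hold_threshold_s=3.0):
    """Decide the landing behaviour from the current height above ground
    and how long the landing gesture has been held at the target position."""
    if height_m > target_height_m:
        return "DESCEND_TO_TARGET"   # first stage: land to the target position
    if gesture_hold_s > hold_threshold_s:
        return "LAND_TO_GROUND"      # gesture held long enough: complete landing
    return "HOVER_AT_TARGET"         # wait at the target position
```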
  • If no flight control gesture of the target user can be recognized, the aircraft may be controlled, according to the feature part of the target user, to take the target user as the following target and follow the movement of the target user.
  • The feature part refers to any body region of the target user, which is not specifically limited in the embodiment of the present invention.
  • Following the movement of the target user means: adjusting at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft while following the target user, so that the target user is in the image captured by the photographing device.
  • If the flight control device 11 cannot recognize the flight control gesture of the target user but identifies the first body region of the target user in the flight environment image, it may follow the first body region to control the aircraft to take the target user as the following target, follow the movement of the first body region, and, during the movement following the first body region, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • For example, if the first body region is the body region where the body torso is located, the flight control device 11 may follow the body region where the body torso is located, control the aircraft to take the target user as the following target, follow the movement of that body region, and, during the movement, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • If the second body region of the target user is identified, the aircraft 12 can be controlled to follow the movement of the second body region.
  • The flight control device 11 may follow the second body region to control the aircraft to take the target user as the following target, follow the movement of the second body region, and, during the movement following the second body region, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • For example, if the second body region is the body region where the head and shoulders are located, the flight control device 11 may follow the body region where the head and shoulders are located to control the aircraft to take the target user as the following target, follow the movement of that body region, and, during the movement, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
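The fallback order implied above — follow the body region where the torso is located when it is identified, otherwise the head-and-shoulders region — can be sketched as a priority lookup. The region labels are assumptions used only for illustration.

```python
def follow_region(detected_regions):
    """Choose which body region to follow when no flight control gesture
    is recognized. `detected_regions` is the set of region labels that
    were identified in the flight environment image."""
    for region in ("body_torso", "head_and_shoulders"):
        if region in detected_regions:
            return region
    return None  # no usable body region: nothing to follow
```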
  • If the flight control device 11 recognizes that the flight control gesture of the control object is a photographing gesture, a shooting control command may be generated to control the photographing device of the aircraft to capture the target image.
  • The photographing gesture may be any gesture that is set, such as an "O" gesture, which is not specifically limited in the embodiment of the present invention.
  • For example, assuming the photographing gesture is an "O" gesture, if the flight control device 11 recognizes that the gesture made by the palm of the target user is an "O" gesture, a shooting control command may be generated to control the photographing device of the aircraft to capture the target image.
  • If the flight control device 11 recognizes that the flight control gesture of the control object is a recording gesture, a recording control command may be generated to control the photographing device of the aircraft to capture a video; during the capturing of the video, if the recording gesture of the control object is recognized again, an end control command may be generated to control the photographing device of the aircraft to stop capturing the video.
  • The recording gesture may be any gesture that is set, which is not limited in the embodiment of the present invention. For example, assuming the recording gesture is a "1" gesture, if the flight control device 11 recognizes that the gesture made by the palm of the target user is a "1" gesture, a recording control command may be generated to control the photographing device of the aircraft to capture a video; during the capturing of the video, if the "1" gesture made by the target user is recognized again, an end control command may be generated to control the photographing device of the aircraft to stop capturing the video.
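The photo and recording behaviour above ("O" captures an image; the recording gesture starts a video, and the same gesture recognized again stops it) amounts to a small state machine. The class and command names below are illustrative assumptions.

```python
class ShootingController:
    """Minimal state machine for the photographing and recording gestures:
    'O' always takes a photo; '1' toggles video recording on and off."""

    def __init__(self):
        self.recording = False

    def on_gesture(self, gesture):
        if gesture == "O":
            return "SHOOT_PHOTO"
        if gesture == "1":
            self.recording = not self.recording
            return "START_RECORDING" if self.recording else "STOP_RECORDING"
        return None  # not a shooting-related gesture
```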
  • If a replacement control gesture of a replacement user is recognized, the replacement user may be taken as the new target user, the control object and the replacement control gesture of the new target user may be identified, and a control instruction may be generated according to the replacement control gesture to control the aircraft to perform the action corresponding to the control instruction.
  • The replacement control gesture may be any gesture that is set, which is not limited in the embodiment of the present invention.
  • For example, if the replacement user makes the replacement control gesture, the flight control device 11 may take the replacement user as the target user, and generate a shooting control command to control the photographing device of the aircraft to capture the target image according to the "O" gesture made by the replacement user.
  • FIG. 2 is a schematic flowchart of a flight control method according to an embodiment of the present invention.
  • The method may be performed by a flight control device; the flight control device may be disposed on an aircraft, and a photographing device is mounted on the aircraft. The specific explanation of the flight control device is as described above.
  • the method of the embodiment of the present invention includes the following steps.
  • S201 Acquire an environment image captured by the photographing device.
  • the flight control device can acquire an environment image captured by the photographing device mounted on the aircraft.
  • S202 Determine a feature part of the target user according to the environment image, and determine a target image area according to the feature part, and identify a control object of the target user in the target image area.
  • The flight control device may determine the feature part of the target user according to the environment image, determine a target image area according to the feature part, and identify the control object of the target user in the target image area.
  • the control object includes, but is not limited to, the palm of the target user.
  • In the process in which the flight control device determines the feature part of the target user according to the environment image, determines the target image area according to the feature part, and identifies the control object of the target user in the target image area: if the state parameter of the target user satisfies a preset first condition, the flight control device may determine that the feature part of the target user is the first feature part, determine the target image area in which the first feature part is located according to the first feature part of the target user, and identify the control object of the target user in the target image area.
  • In one case, the state parameter of the target user includes a size ratio parameter of the image area in which the target user is located in the environment image; the state parameter of the target user satisfying the preset first condition means that the size ratio of the image area in which the target user is located in the environment image is less than or equal to a preset first percentage threshold. In another case, the state parameter of the target user includes a distance parameter between the target user and the aircraft; the state parameter of the target user satisfying the preset first condition means that the distance between the target user and the aircraft is greater than or equal to a preset first distance.
  • The first feature part includes, but is not limited to, the human body of the target user.
  • For example, if the state parameter of the target user satisfies the preset first condition, the flight control device may determine that the feature part of the target user is the human body, determine the target image area in which the human body is located according to the human body of the target user, and identify the control object of the target user, such as the palm, in the target image area.
  • If the state parameter of the target user satisfies a preset second condition, the flight control device may determine that the feature part of the target user is the second feature part, determine the target image area in which the second feature part is located according to the second feature part of the target user, and identify the control object of the target user in the target image area.
  • In one case, the state parameter of the target user satisfying the preset second condition means that the size ratio parameter of the image area in which the target user is located in the environment image is greater than or equal to a preset second percentage threshold. In another case, the state parameter of the target user includes a distance parameter between the target user and the aircraft; the state parameter of the target user satisfying the preset second condition means that the distance between the target user and the aircraft is less than or equal to a preset second distance.
  • The second feature part includes the head of the target user, or the second feature part includes the head and shoulders of the target user, which is not limited in the embodiment of the present invention.
  • For example, if the state parameter of the target user satisfies the preset second condition, the flight control device may determine that the feature part of the target user is the head, determine the target image area in which the head is located according to the head of the target user, and identify the control object of the target user, such as the palm, in the target image area.
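The first/second condition above selects which feature part anchors the target image area: a small (far) user is located by the whole body, a large (near) user by the head or head-and-shoulders. A sketch using the size-ratio form of the state parameter; the threshold values and labels are assumptions for illustration, not values from the embodiment.

```python
def choose_feature_part(size_ratio, first_threshold=0.05, second_threshold=0.15):
    """Select the feature part used to locate the target image area from
    the size ratio of the target user's image area in the environment image."""
    if size_ratio <= first_threshold:
        return "human_body"          # first condition: user appears small/far
    if size_ratio >= second_threshold:
        return "head_and_shoulders"  # second condition: user appears large/near
    return "human_body"              # between thresholds: default to the body
```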
  • In the process of identifying the control object of the target user in the target image area, the flight control device may identify at least one control object in the target image area, determine the joint points of the target user according to the feature part of the target user, and determine, according to the determined joint points, the control object of the target user from the at least one control object.
  • When determining the control object of the target user from the at least one control object according to the determined joint points, the flight control device may determine a target joint point from the determined joint points, and determine, among the at least one control object, the control object closest to the target joint point as the control object of the target user.
  • The target joint point refers to a joint point of a specified arm part, such as the joint point of the elbow of the arm, the joint point of the arm and the shoulder, or the joint point of the wrist, and both the target joint point and the control object belong to the same target user.
  • For example, assume that the target image area determined by the flight control device is the target image area in which the human body of the target user is located, and that the flight control device recognizes two palms (control objects) in that target image area. The flight control device may then determine the joint point of the target user's arm and shoulder, and determine the palm of the two palms that is closest to that joint point as the control object of the target user.
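The nearest-palm selection above reduces to a minimum-distance search from the target joint point over the candidate control objects. A minimal sketch, with points as (x, y) image coordinates; the function name is an assumption.

```python
import math

def select_control_object(target_joint, palms):
    """Pick the candidate palm closest to the target joint point (e.g. the
    joint of the arm and shoulder) as the target user's control object."""
    return min(palms,
               key=lambda p: math.hypot(p[0] - target_joint[0],
                                        p[1] - target_joint[1]))
```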
  • The flight control device may generate a control instruction according to the control object to control the aircraft to fly.
  • Specifically, the flight control device may identify the action feature of the control object, acquire a control command according to the action feature, and control the aircraft to fly according to the control command.
  • In the embodiment of the present invention, the flight control device acquires the environment image captured by the photographing device, determines the target image area according to the feature part of the target user determined from the environment image, identifies the control object of the target user in the target image area, and generates a control instruction according to the control object to control the aircraft to fly. In this way, the control object of the target user is identified and the flight of the aircraft is controlled by identifying the action features of the control object, so that the aircraft can be controlled relatively quickly, thereby improving the efficiency of flight control.
  • FIG. 3 is a schematic flowchart of another flight control method according to an embodiment of the present invention; the method may be performed by a flight control device, wherein the specific explanation of the flight control device is as described above.
  • The difference between the embodiment of the present invention and the embodiment shown in FIG. 2 is that the embodiment of the present invention triggers the aircraft to enter an image control mode according to an obtained triggering operation, performs gesture recognition on the control object of the target user acquired in the image control mode, and generates a takeoff control command according to the recognized start flight gesture to control the aircraft to take off.
  • S301 Acquire an environment image captured by the photographing device if a triggering operation for triggering the aircraft to enter the image control mode is acquired.
  • If the flight control device acquires the triggering operation for triggering the aircraft to enter the image control mode, the environment image captured by the photographing device may be acquired, wherein the environment image is captured by the photographing device before the aircraft takes off.
  • The triggering operation may include: a click operation on the aircraft power button, a double-click operation on the aircraft power button, a shaking operation on the aircraft, a voice input operation, a fingerprint input operation, and the like.
  • The triggering operation may also be any one or more of scanning a feature object and an accessory interaction operation (such as with glasses, a watch, or a wristband); the embodiment of the present invention does not limit the triggering operation.
  • For example, assuming the triggering operation is a double-click operation on the aircraft power button, if the flight control device acquires the operation of the target user double-clicking the power button of the aircraft, the aircraft may be triggered to enter the image control mode, and an environment image captured by the photographing device mounted on the aircraft may be acquired.
  • S302 Perform gesture recognition on a control object of the target user in the environment image.
  • In the image control mode, the flight control device may perform gesture recognition on the control object of the target user in the environment image acquired by the photographing device of the aircraft.
  • The target user may be a movable object such as a person, an animal, or an unmanned automobile, and the control object may be the palm of the target user, or another body part or body area, such as the face; the target user and the control object are not limited in the embodiment of the present invention.
  • In an embodiment, when acquiring the environment image captured by the photographing device, the flight control device may, after the triggering operation is acquired, control the gimbal mounted on the aircraft so that the photographing device scans within a preset photographing range, and acquire an environment image, captured by the photographing device, that includes the feature part of the target user obtained by scanning within the preset photographing range.
  • If the flight control device recognizes that the gesture of the control object is a start flight gesture, a takeoff control command is generated to control the aircraft to take off.
  • Specifically, the flight control device may generate a takeoff control command to control the aircraft to take off and hover at the position corresponding to the target height.
  • The target height may be a preset height above the ground, or may be determined according to the location area of the target user in the environment image captured by the photographing device; the embodiment of the present invention does not limit the target height at which the aircraft hovers after takeoff.
  • The start flight gesture may be any gesture made by the target user, such as an "OK" gesture or a scissors hand gesture; the embodiment of the present invention does not limit the start flight gesture.
  • For example, assume that the triggering operation is a double-click operation on the aircraft power button, the control object is the palm of the target user, the start flight gesture is set to a scissors hand gesture, and the preset target height is 1.2 m above the ground.
  • If the flight control device detects that the target user double-clicks the power button of the aircraft, it controls the aircraft to enter the image control mode; in the image control mode, if the flight control device recognizes that the gesture made by the palm of the target user is a scissors hand gesture, a takeoff control command may be generated to control the aircraft to take off and hover at the target height of 1.2 m.
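The takeoff flow in this example (double-click enters image control mode; the start flight gesture then produces a takeoff command) can be sketched as a simple guard chain. The trigger and gesture labels and the command tuple are assumptions; only the scissors gesture and the 1.2 m height come from the example.

```python
def takeoff_command(trigger, gesture,
                    start_gesture="scissors", target_height_m=1.2):
    """Return the takeoff command, or None if the preconditions are not met."""
    if trigger != "double_click_power":
        return None                      # image control mode not entered
    if gesture != start_gesture:
        return None                      # no start flight gesture recognized
    return ("TAKEOFF_AND_HOVER", target_height_m)
```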
  • In the embodiment of the present invention, the flight control device enters the image control mode by acquiring the triggering operation for triggering the aircraft to enter the image control mode, performs gesture recognition on the control object of the target user in the acquired environment image captured by the photographing device, and, if the gesture of the control object is recognized as a start flight gesture, generates a takeoff control command to control the aircraft to take off. In this way, by controlling the takeoff of the aircraft through gesture recognition, the aircraft can be controlled relatively quickly, and the efficiency of controlling the takeoff of the aircraft is improved.
  • FIG. 4 is a schematic flowchart of still another flight control method according to an embodiment of the present invention.
  • The method may be performed by a flight control device, wherein the specific explanation of the flight control device is as described above.
  • The embodiment of the present invention differs from the embodiment described in FIG. 3 in that the embodiment of the present invention determines the flight control gesture by performing gesture recognition on the control object of the target user during the flight of the aircraft, and generates a control instruction according to the flight control gesture to control the aircraft to perform the action corresponding to the control instruction.
  • S401 Control the photographing device to capture a flight environment image during the flight of the aircraft.
  • During the flight of the aircraft, the flight control device may control the photographing device mounted on the aircraft to capture the flight environment image, wherein the flight environment image is an environment image captured by the photographing device by scanning during the flight of the aircraft.
  • S402 Perform gesture recognition on a control object of the target user in the flight environment image, and determine a flight control gesture.
  • the flight control device may perform gesture recognition on the control object of the target user in the flight environment image to determine a flight control gesture.
  • The control object may include, but is not limited to, the palm of the target user, as described above.
  • The flight control gesture includes any one or more of a height control gesture, a movement control gesture, a drag control gesture, a rotation control gesture, a landing gesture, a photographing gesture, a recording gesture, and a replacement control gesture; the embodiment of the present invention does not limit the flight control gesture.
  • S403 Generate, according to the identified flight control gesture, a control instruction to control the aircraft to perform an action corresponding to the control instruction.
  • the flight control device may generate, according to the identified flight control gesture, a control instruction to control the aircraft to perform an action corresponding to the control instruction.
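The mapping from recognized flight control gestures (S402) to the control instructions generated in S403 can be sketched as a dispatch table. All gesture and command names below are illustrative assumptions; the embodiment does not fix any particular encoding.

```python
# Illustrative gesture -> control command mapping for step S403.
GESTURE_COMMANDS = {
    "height":   "ADJUST_HEIGHT",
    "movement": "MOVE",
    "drag":     "DRAG_HORIZONTAL",
    "rotation": "ROTATE_AROUND_USER",
    "landing":  "LAND",
    "photo":    "SHOOT_PHOTO",
    "record":   "TOGGLE_RECORDING",
    "replace":  "SWITCH_TARGET_USER",
}

def control_instruction(gesture):
    """Return the control command for a recognized gesture, or None."""
    return GESTURE_COMMANDS.get(gesture)
```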
  • If the flight control device recognizes that the flight control gesture of the control object is a height control gesture, a height control command may be generated to control the aircraft to adjust its flight height.
  • The flight control device may perform motion recognition on the control object according to the images included in the image set to obtain motion information of the control object, where the motion information includes the motion direction of the control object, and the image set includes multiple environment images captured by the photographing device.
  • The flight control device may analyze the flight control gesture of the control object according to the motion information; if the obtained flight control gesture is a height control gesture, a height control command corresponding to the height control gesture is obtained, and the aircraft is controlled to fly according to the motion direction to adjust the height of the aircraft.
  • Taking FIG. 1b as an example, assume that during the flight of the aircraft, the flight control device disposed on the aircraft 12 may recognize the palm of the target user according to multiple environment images captured by the photographing device. If the flight control device recognizes that the palm 131 of the target user 13, facing the photographing device, is moving vertically downward toward the ground, it determines that the gesture of the palm 131 is a height control gesture, generates a height control command, and controls the aircraft 12 to fly vertically downward to lower the flight height of the aircraft 12.
  • Conversely, if the palm is recognized as moving vertically upward, a height control command may be generated to control the aircraft 12 to fly vertically upward to increase the flight height of the aircraft 12.
  • If the flight control device recognizes that the flight control gesture of the control object is a movement control gesture, a movement control command may be generated to control the aircraft to fly in the direction indicated by the movement control command.
  • The direction indicated by the movement control command comprises a direction away from the control object or a direction close to the control object.
  • If the flight control device performs motion recognition on the first object and the second object included in the control object according to the images included in the image set, motion information of the first object and the second object is obtained, where the image set includes multiple environment images captured by the photographing device.
  • The flight control device may obtain, according to the motion information, the action feature represented by the first object and the second object, where the action feature is used to indicate the change in distance between the first object and the second object, and acquire, according to the distance change, the movement control command corresponding to the action feature.
  • If the action feature indicates that the distance between the first object and the second object is increasing, the movement control command is used to control the aircraft to fly in a direction away from the target user; if the action feature indicates that the distance between the first object and the second object is decreasing, the movement control command is used to control the aircraft to fly in a direction toward the target user.
  • For example, assume that the control object includes a first object and a second object, where the first object is the left palm of the target user and the second object is the right palm of the target user. If the flight control device detects the two palms that the target user lifts toward the photographing device of the aircraft, and detects that the distance between the two palms in the horizontal direction is gradually increasing, the flight control device may determine that the flight control gestures made by the two palms are movement control gestures and generate a movement control command to control the aircraft to fly in a direction away from the target user.
  • Conversely, if the distance between the two palms is detected to be gradually decreasing, the flight control device may determine that the flight control gestures made by the two palms are movement control gestures, and generate a movement control command to control the aircraft to fly in a direction close to the target user.
  • If the flight control device recognizes that the flight control gesture of the control object is a drag control gesture, a drag control command may be generated to control the aircraft to fly in the horizontal direction indicated by the drag control command.
  • The drag control gesture refers to the target user dragging the palm to the left or right in the horizontal direction. For example, if the flight control device recognizes that the palm of the target user is dragged to the left in the horizontal direction, a drag control command is generated to control the aircraft to fly in the horizontally left direction.
  • If the flight control device recognizes that the flight control gesture of the control object is a rotation control gesture, a rotation control command may be generated to control the aircraft to rotate and fly in the direction indicated by the rotation control command.
  • The rotation control gesture refers to the palm of the target user rotating around the target user.
  • The flight control device may perform motion recognition on the palm and the target user included in the control object according to the images included in the image set, and obtain motion information of the palm and the target user, where the motion information includes the motion direction of the palm and the target user, and the image set includes multiple environment images captured by the photographing device.
  • If a rotation control gesture is recognized, a rotation control command may be generated to control the aircraft to rotate and fly in the direction indicated by the rotation control command. For example, assuming the flight control device detects that the palm of the target user rotates counterclockwise around the target user, the flight control device may generate a rotation control command to control the aircraft to rotate counterclockwise around the target user.
  • During the flight of the aircraft, if the flight control device recognizes that the flight control gesture of the control object is a landing gesture, a landing control command is generated to control the aircraft to land.
  • The landing gesture refers to a gesture in which the palm of the target user moves downward toward the ground, or the landing gesture may also be another gesture of the target user, which is not specifically limited in the embodiment of the present invention.
  • If the landing gesture is recognized, a landing control command may be generated to control the aircraft to land to the target position.
  • The target position may be determined in advance, or may be determined according to the height between the aircraft and the ground detected by the aircraft, which is not specifically limited in the embodiment of the present invention. If the flight control device detects that the landing gesture stays at the target position for a time greater than a preset time threshold, the aircraft may be controlled to land to the ground.
  • For example, assume that the preset time threshold is 3 s and the target position determined according to the height between the aircraft and the ground is 0.5 m above the ground. If the flight control device recognizes that the target user's palm is moving downward toward the ground, it may generate a landing control command to control the aircraft to land at a position 0.5 m above the ground; if it is detected that the gesture of the target user's palm moving downward toward the ground stays at that position for more than 3 s, the aircraft is controlled to land to the ground.
  • If the flight control device cannot recognize the flight control gesture of the target user but identifies the feature part of the target user in the flight environment image, it may control the aircraft, according to the feature part of the target user, to take the target user as the following target and follow the movement of the target user.
  • The feature part refers to any body region of the target user, which is not specifically limited in the embodiment of the present invention.
  • Following the movement of the target user means: adjusting at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft while following the target user, so that the target user is in the image captured by the photographing device.
  • If the first body region of the target user is identified, the flight control device may follow the first body region to control the aircraft to take the target user as the following target, follow the movement of the first body region, and, during the movement following the first body region, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • For example, if the first body region is the body region where the body torso is located, the flight control device may follow the body region where the body torso is located to control the aircraft to take the target user as the following target, follow the movement of that body region, and, during the movement, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • If the second body region of the target user is identified, the aircraft can be controlled to follow the movement of the second body region.
  • The flight control device may follow the second body region to control the aircraft to take the target user as the following target, follow the movement of the second body region, and, during the movement following the second body region, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • For example, if the second body region is the body region where the head and shoulders are located, the flight control device may follow the body region where the head and shoulders are located to control the aircraft to take the target user as the following target, follow the movement of that body region, and, during the movement, adjust at least one of the position of the aircraft, the posture of the gimbal mounted on the aircraft, and the attitude of the aircraft, so that the target user is in the image captured by the photographing device.
  • In an embodiment, the flight control device may, during the movement following the target user, identify the feature part included in the target user, obtain the image size information of the feature part in the image, and generate, according to the image size information, a control command to control the aircraft to move in the direction indicated by the control command. For example, assuming the feature part is the body of the target user, if it is detected that the body of the target user is moving forward and the image size of the body of the target user is becoming larger, the aircraft may be controlled to move in a direction away from the target user.
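The distance-keeping rule above (feature part growing in the image means the user is approaching, so move away; shrinking means move closer) can be sketched by comparing the feature part's image size across frames. The tolerance value and names are assumptions for illustration.

```python
def follow_distance_command(prev_area, curr_area, tolerance=0.1):
    """Keep the following distance from the change in the image size
    (e.g. bounding-box area) of the target user's feature part."""
    if prev_area <= 0:
        return "HOLD"  # no valid previous measurement
    ratio = curr_area / prev_area
    if ratio > 1 + tolerance:
        return "MOVE_AWAY"    # feature part grew: user is getting closer
    if ratio < 1 - tolerance:
        return "MOVE_CLOSER"  # feature part shrank: user is moving away
    return "HOLD"
```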
  • the photographing control instruction may be generated to control the photographing device of the aircraft to capture the target image.
  • the photographing gesture may be any gesture that is set, such as an “O” gesture, which is not specifically limited in the embodiment of the present invention.
  • For example, assume the photographing gesture is an “O” gesture.
  • If the flight control device recognizes that the gesture made by the palm of the target user is an “O” gesture, a shooting control command may be generated to control the camera of the aircraft to capture the target image.
  • if the recording gesture of the control object is recognized, a recording control command may be generated to control the camera of the aircraft to capture a video; while the camera of the aircraft is capturing the video, if the recording gesture of the control object is recognized again, an end control command is generated to control the camera to stop capturing the video.
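The start/stop behavior above is a simple two-state toggle: the first recognized recording gesture starts capture, the next one ends it. A minimal sketch follows; the gesture label and command strings are illustrative assumptions, not identifiers from the source.

```python
class RecordingController:
    """Toggle video recording when the recording gesture is seen again.

    Mirrors the described behavior: one recognized recording gesture
    issues a recording control command, the next issues an end command.
    """

    def __init__(self):
        self.recording = False

    def on_gesture(self, gesture):
        # Only the recording gesture affects this controller.
        if gesture != "record":
            return None
        if not self.recording:
            self.recording = True
            return "RECORD_START"   # recording control command
        self.recording = False
        return "RECORD_STOP"        # end control command
```

In practice the recognizer would debounce repeated detections of the same held gesture; that concern is omitted here for clarity.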
  • the recording gesture may be any gesture that is set, which is not limited in the embodiment of the present invention.
  • For example, assume the recording gesture is a "1" gesture.
  • If the flight control device recognizes that the gesture made by the palm of the target user is a "1" gesture, it generates a recording control command to control the camera of the aircraft to capture a video.
  • the replacement user is determined as the new target user; the control object of the new target user and the replacement control gesture are then identified, and a control instruction is generated according to the replacement control gesture to control the aircraft to perform the action corresponding to the control instruction.
  • the replacement control gesture may be any gesture that is set, which is not limited in the embodiment of the present invention.
  • the flight control device may use the replacement user as the target user, and generate a photographing control instruction to control the photographing device of the aircraft to capture the target image according to the “O” gesture made by the replacement user.
  • the flight control device controls the shooting device to capture a flight environment image during the flight of the aircraft, performs gesture recognition on the control object of the target user in the flight environment image to determine the flight control gesture, and, based on the identified flight control gesture, generates a control command to control the aircraft to perform the action corresponding to the control command.
  • By controlling, through gesture recognition, the action of the aircraft indicated by a gesture during flight, the operation steps for controlling the aircraft are simplified, the aircraft can be controlled relatively quickly, and the efficiency of controlling the aircraft is improved.
  • FIG. 5 is a schematic structural diagram of a flight control device according to an embodiment of the present invention.
  • the flight control device includes a memory 501, a processor 502, and a data interface 503.
  • the memory 501 may include a volatile memory; the memory 501 may also include a non-volatile memory; the memory 501 may also include a combination of the above types of memory.
  • the processor 502 can be a central processing unit (CPU).
  • the processor 502 may further include a hardware chip.
  • the hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof; the PLD may be, for example, a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), or any combination thereof.
  • the memory 501 is configured to store program instructions, and when the program instructions are executed, the processor 502 may call program instructions stored in the memory 501 for performing the following steps:
  • a control instruction is generated according to the control object to control the flight of the aircraft.
  • the processor 502 calls the program instructions stored in the memory 501 for performing the following steps:
  • the aircraft is controlled to fly in accordance with the control command.
  • control object includes a palm of the target user.
  • the processor 502 calls the program instructions stored in the memory 501 for performing the following steps:
  • the state parameter of the target user includes: a size ratio parameter of an image area in which the target user is located in the environment image, and the state parameter of the target user satisfies a preset first condition: The size ratio parameter of the image area where the target user is located in the environment image is less than or equal to a preset first ratio threshold; or
  • the state parameter of the target user includes: a distance parameter between the target user and the aircraft; and the state parameter of the target user satisfying the preset first condition means that the distance between the target user and the aircraft is greater than or equal to the preset first distance.
  • the first feature part is a human body of the target user.
  • the processor 502 calls the program instructions stored in the memory 501 for performing the following steps:
  • the state parameter of the target user includes: a size ratio parameter of an image area in which the target user is located in the environment image, and the state parameter of the target user satisfies a preset second condition: The size ratio parameter of the image area where the target user is located in the environment image is greater than or equal to a preset second ratio threshold; or
  • the state parameter of the target user includes: a distance parameter between the target user and the aircraft; and the state parameter of the target user satisfying the preset second condition means that the distance between the target user and the aircraft is less than or equal to the preset second distance.
  • the second feature portion includes a head of the target user; or the second feature portion includes a head and a shoulder of the target user.
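The first/second condition logic above (far or small user: detect the whole body; near or large user: detect the head, or head and shoulders) can be combined into one selection routine. This is a hedged sketch: all threshold values and the feature-part labels are illustrative assumptions, since the source leaves them unspecified.

```python
def select_feature_part(size_ratio=None, distance=None,
                        first_ratio=0.05, second_ratio=0.25,
                        first_distance=8.0, second_distance=3.0):
    """Choose which feature part to detect from the user's state parameters.

    First condition  (size ratio <= first_ratio, or distance >= first_distance):
        the user appears small/far, so the first feature part (the body) is used.
    Second condition (size ratio >= second_ratio, or distance <= second_distance):
        the user appears large/near, so the second feature part
        (head, or head and shoulders) is used.
    All numeric thresholds are placeholders, not values from the source.
    """
    if size_ratio is not None:
        if size_ratio <= first_ratio:
            return "body"            # first feature part
        if size_ratio >= second_ratio:
            return "head_shoulders"  # second feature part
    if distance is not None:
        if distance >= first_distance:
            return "body"
        if distance <= second_distance:
            return "head_shoulders"
    return "body"  # default when neither condition is met (an assumption)
```

The two parameter families (image-area ratio and measured distance) are alternatives in the source; the sketch checks whichever is available.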
  • the processor 502 calls the program instructions stored in the memory 501 for performing the following steps:
  • the processor 502 calls the program instructions stored in the memory 501 for performing the following steps:
  • a control object that is closest to the target joint point among the at least one control object is determined as a control object of the target user.
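The nearest-to-joint-point rule above (among all candidate control objects, pick the one closest to the target user's target joint point) can be sketched in a few lines. The tuple-based 2-D image coordinates and the function name are assumptions for illustration.

```python
import math

def pick_control_object(control_objects, target_joint):
    """Pick the candidate control object (e.g. a detected palm) closest
    to the target joint point of the target user.

    `control_objects` is a non-empty list of (x, y) centers in image
    coordinates; `target_joint` is the (x, y) joint position. Euclidean
    distance is an assumed metric; the source does not specify one.
    """
    return min(
        control_objects,
        key=lambda c: math.hypot(c[0] - target_joint[0],
                                 c[1] - target_joint[1]),
    )
```

Tying the control object to a joint of the tracked skeleton is what lets the device ignore other people's palms in the same frame.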
  • the flight control device acquires the environment image captured by the camera, determines the feature portion of the target user from the environment image, determines the target image region according to the feature portion, identifies the control object of the target user in the target image region, and generates a control instruction according to the control object to control the aircraft to fly. In this way, the flight of the aircraft is controlled by identifying the control object of the target user and the action features of that control object, so that the operation flow is simplified, the aircraft can be controlled relatively quickly, and the efficiency of flight control is improved.
  • FIG. 6 is a schematic structural diagram of another flight control device according to an embodiment of the present invention.
  • the flight control device includes: a memory 601, a processor 602, and a data interface 603.
  • the memory 601 may include a volatile memory; the memory 601 may also include a non-volatile memory; the memory 601 may also include a combination of the above types of memory.
  • the processor 602 can be a central processing unit (CPU).
  • the processor 602 may further include a hardware chip.
  • the hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof.
  • the PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), or any combination thereof.
  • the memory 601 is configured to store program instructions. When the program instructions are executed, the processor 602 can call the program instructions stored in the memory 601 for performing the following steps:
  • a takeoff control command is generated to control the aircraft to take off.
  • the triggering operation includes: one or more of a click operation on the aircraft power button, a double-click operation on the aircraft power button, a shaking operation on the aircraft, a voice input operation, and a fingerprint input operation.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • the processor 602 calls the program instructions stored in the memory 601 to also perform the following steps:
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • a height control command is generated to control the aircraft to adjust the height of the aircraft.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • the flight control gesture of the control object is a motion control gesture, generating a motion control instruction to control the aircraft to fly in a direction indicated by the motion control instruction;
  • the direction indicated by the movement control instruction includes: a direction away from the control object or a direction close to the control object.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • the flight control gesture of the control object is a drag control gesture, generating a drag control command to control the aircraft to fly in a horizontal direction indicated by the drag control command.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • the flight control gesture of the control object is a rotation control gesture, generating a rotation control command to control the aircraft to rotate in a direction indicated by the rotation control command.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • a landing control command is generated to control the aircraft to land.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • controlling, according to the feature part of the target user, the aircraft to take the target user as the following target and follow the target user's movement.
  • following the target user's movement refers to adjusting a shooting state so that the target user is located in the image captured by the camera in the adjusted shooting state; adjusting the shooting state includes adjusting any one or more of the position of the aircraft, the attitude of the gimbal mounted on the aircraft, and the attitude of the aircraft.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • if the flight control gesture of the control object is recognized as a photographing gesture, a photographing control command is generated to control the photographing device of the aircraft to capture the target image.
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • the flight control gesture of the control object is a recording gesture, generating a recording control command to control the camera of the aircraft to capture a video;
  • the processor 602 calls the program instructions stored in the memory 601 for performing the following steps:
  • the replacement user is determined as a new target user
  • the flight control device controls the photographing device to capture a flight environment image during the flight of the aircraft, performs gesture recognition on the control object of the target user in the flight environment image to determine the flight control gesture, and, based on the identified flight control gesture, generates a control command to control the aircraft to perform the action corresponding to the control command.
  • By controlling, through gesture recognition, the action of the aircraft indicated by a gesture during flight, the operation steps for controlling the aircraft are simplified, the aircraft can be controlled relatively quickly, and the efficiency of controlling the aircraft is improved.
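The gesture-to-command pairs listed across the preceding steps (height, movement, drag, rotation, landing, photographing, recording) amount to a small dispatch table. The sketch below summarizes them; every gesture label and command string is an illustrative assumption, not an identifier from the source.

```python
# Illustrative mapping from recognized flight control gestures to the
# control commands the embodiments describe; all names are assumptions.
GESTURE_COMMANDS = {
    "height": "HEIGHT_CONTROL",   # adjust the height of the aircraft
    "move":   "MOVE_CONTROL",     # fly toward or away from the control object
    "drag":   "DRAG_CONTROL",     # fly in the indicated horizontal direction
    "rotate": "ROTATE_CONTROL",   # rotate in the indicated direction
    "land":   "LANDING_CONTROL",  # land the aircraft
    "photo":  "SHOOT_CONTROL",    # capture a target image
    "record": "RECORD_CONTROL",   # start/stop video capture
}

def command_for(gesture):
    """Return the control command for a recognized gesture, or None when
    no flight control gesture is recognized (the embodiments then fall
    back to following the target user)."""
    return GESTURE_COMMANDS.get(gesture)
```

The None fallback matches the described behavior of reverting to follow mode when no flight control gesture is recognized in the flight environment image.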
  • An embodiment of the present invention further provides an aircraft, including: a fuselage; a power system disposed on the fuselage and configured to provide flight power; and a processor configured to acquire an environment image captured by the camera, determine a feature part of the target user according to the environment image, determine a target image area according to the feature part, identify a control object of the target user in the target image area, and generate a control instruction according to the control object to control the flight of the aircraft.
  • processor is configured to perform the following steps:
  • the aircraft is controlled to fly in accordance with the control command.
  • control object includes a palm of the target user.
  • processor is configured to perform the following steps:
  • the state parameter of the target user includes: a size ratio parameter of an image area in which the target user is located in the environment image, and the state parameter of the target user satisfies a preset first condition: The size ratio parameter of the image area where the target user is located in the environment image is less than or equal to a preset first ratio threshold; or
  • the state parameter of the target user includes: a distance parameter between the target user and the aircraft; and the state parameter of the target user satisfying the preset first condition means that the distance between the target user and the aircraft is greater than or equal to the preset first distance.
  • the first feature part is a human body of the target user.
  • processor is configured to perform the following steps:
  • the state parameter of the target user includes: a size ratio parameter of an image area in which the target user is located in the environment image, and the state parameter of the target user satisfies a preset second condition: The size ratio parameter of the image area where the target user is located in the environment image is greater than or equal to a preset second ratio threshold; or
  • the state parameter of the target user includes: a distance parameter between the target user and the aircraft; and the state parameter of the target user satisfying the preset second condition means that the distance between the target user and the aircraft is less than or equal to the preset second distance.
  • the second feature portion includes a head of the target user; or the second feature portion includes a head and a shoulder of the target user.
  • processor is configured to perform the following steps:
  • processor is configured to perform the following steps:
  • a control object that is closest to the target joint point among the at least one control object is determined as a control object of the target user.
  • the aforementioned aircraft may be a quad-rotor UAV, a hexa-rotor UAV, a multi-rotor UAV, or the like.
  • the power system may include a motor, an electronic speed controller (ESC), a propeller, and the like, wherein the motor drives the propeller of the aircraft and the ESC controls the rotational speed of the motor of the aircraft.
  • An embodiment of the present invention further provides another aircraft, including: a fuselage; a power system disposed on the fuselage and configured to provide flight power; and a processor configured to: if a triggering operation that triggers the aircraft to enter an image control mode is acquired, obtain an environment image captured by the photographing device; perform gesture recognition on the control object of the target user in the environment image; and, if the gesture of the control object is recognized as a start-flight gesture, generate a takeoff control command to control the aircraft to take off.
  • the triggering operation includes: one or more of a click operation on the aircraft power button, a double-click operation on the aircraft power button, a shaking operation on the aircraft, a voice input operation, and a fingerprint input operation.
  • processor is configured to perform the following steps:
  • processor is configured to perform the following steps:
  • processor is configured to perform the following steps:
  • a height control command is generated to control the aircraft to adjust the height of the aircraft.
  • processor is configured to perform the following steps:
  • the flight control gesture of the control object is a motion control gesture, generating a motion control instruction to control the aircraft to fly in a direction indicated by the motion control instruction;
  • the direction indicated by the movement control instruction includes: a direction away from the control object or a direction close to the control object.
  • processor is configured to perform the following steps:
  • the flight control gesture of the control object is a drag control gesture, generating a drag control command to control the aircraft to fly in a horizontal direction indicated by the drag control command.
  • processor is configured to perform the following steps:
  • the flight control gesture of the control object is a rotation control gesture, generating a rotation control command to control the aircraft to rotate in a direction indicated by the rotation control command.
  • processor is configured to perform the following steps:
  • a landing control command is generated to control the aircraft to land.
  • processor is configured to perform the following steps:
  • controlling, according to the feature part of the target user, the aircraft to take the target user as the following target and follow the target user's movement.
  • following the target user's movement refers to adjusting a shooting state so that the target user is located in the image captured by the camera in the adjusted shooting state; adjusting the shooting state includes adjusting any one or more of the position of the aircraft, the attitude of the gimbal mounted on the aircraft, and the attitude of the aircraft.
  • processor is configured to perform the following steps:
  • if the flight control gesture of the control object is recognized as a photographing gesture, a photographing control command is generated to control the photographing device of the aircraft to capture the target image.
  • processor is configured to perform the following steps:
  • the flight control gesture of the control object is a recording gesture, generating a recording control command to control the camera of the aircraft to capture a video;
  • processor is configured to perform the following steps:
  • the replacement user is determined as a new target user
  • An embodiment of the present invention further provides a flight control system, including: a flight control device and an aircraft;
  • the aircraft is configured to control a camera mounted on the aircraft to capture an environment image, and send the environment image to the flight control device;
  • the flight control device is configured to acquire an environment image captured by the camera, determine a feature portion of the target user according to the environment image, and determine a target image region according to the feature portion, and identify the target image region Determining a control object of the target user; controlling the flight of the aircraft according to the control object generation control instruction;
  • the aircraft is further configured to control the aircraft to fly and perform an action corresponding to the flight control instruction in response to the flight control instruction.
  • the flight control device is configured to identify an action feature of the control object, obtain a control instruction according to the action feature of the control object, and control the aircraft flight according to the control command.
  • the flight control device is configured to: if the state parameter of the target user meets a preset first condition, determine that the feature part of the target user is the first feature part; according to the first part of the target user The feature portion determines a target image region in which the first feature portion is located, and identifies a control object of the target user in the target image region.
  • the state parameter of the target user includes: a size ratio parameter of an image area in which the target user is located in the environment image, and the state parameter of the target user satisfies a preset first condition:
  • the size ratio parameter of the image area in which the target user is located in the environment image is less than or equal to a preset first ratio threshold; or the state parameter of the target user includes: a distance parameter between the target user and the aircraft;
  • the state parameter of the target user that meets the preset first condition means that the distance between the target user and the aircraft is greater than or equal to a preset first distance.
  • the first feature part is a human body of the target user.
  • the flight control device is configured to: if the state parameter of the target user meets a preset second condition, determine that the feature part of the target user is a second feature part; according to the second part of the target user The feature portion determines a target image region in which the second feature portion is located, and identifies a control object of the target user in the target image region.
  • the state parameter of the target user includes: a size ratio parameter of an image area in which the target user is located in the environment image, and the state parameter of the target user satisfies a preset second condition:
  • the size ratio of the image area in which the target user is located in the environment image is greater than or equal to a preset second ratio threshold; or the state parameter of the target user includes: a distance parameter between the target user and the aircraft;
  • the state parameter of the target user meeting the preset second condition means that the distance between the target user and the aircraft is less than or equal to a preset second distance.
  • the second feature portion includes a head of the target user; or the second feature portion includes a head and a shoulder of the target user.
  • the flight control device is configured to identify at least one control object in the target image region; determine a joint point of the target user according to the feature portion of the target user; according to the determined joint point, A control object of the target user is determined among the at least one control object.
  • the flight control device is configured to determine a target joint point from the determined joint points; and determine, as the target user, a control object that is closest to the target joint point among the at least one control object Object.
  • the flight control device acquires the environment image captured by the camera, determines the feature portion of the target user from the environment image, determines the target image region according to the feature portion, identifies the control object of the target user in the target image region, and generates a control instruction according to the control object to control the aircraft to fly. In this way, the flight of the aircraft is controlled by identifying the control object of the target user and the action features of that control object, simplifying the operation flow and improving the efficiency of flight control.
  • Another embodiment of the present invention provides a flight control system including: a flight control device and an aircraft;
  • the flight control device is configured to acquire an environment image captured by the camera when the triggering operation of triggering the aircraft into the image control mode is acquired; perform gesture recognition on the control object of the target user in the environment image; The gesture of the control object is to initiate a flight gesture, and generating a takeoff control command to control the aircraft to take off;
  • the aircraft is configured to control the aircraft to take off in response to the takeoff control command.
  • the triggering operation includes: one or more of a click operation on the aircraft power button, a double-click operation on the aircraft power button, a shaking operation on the aircraft, a voice input operation, and a fingerprint input operation.
  • the flight control device is configured to control the gimbal mounted on the aircraft to rotate after acquiring the triggering operation, so as to control the camera to scan and shoot within a preset shooting range;
  • the photographing device scans within the preset shooting range to capture an environment image including the feature portion of the target user.
  • the flight control device is further configured to: when the aircraft is in flight, control the photographing device to capture a flight environment image; perform gesture recognition on the control object of the target user in the flight environment image to determine a flight control gesture; and generate, according to the identified flight control gesture, a control command to control the aircraft to perform the action corresponding to the control command.
  • the flight control device is configured to generate a height control command to control the aircraft to adjust a height of the aircraft if the flight control gesture of the control object is recognized as a height control gesture.
  • the flight control device is configured to: if the flight control gesture of the control object is recognized as a movement control gesture, generate a movement control instruction to control the aircraft to fly in a direction indicated by the movement control instruction; wherein The direction indicated by the movement control instruction includes a direction away from the control object or a direction close to the control object.
  • the flight control device is configured to: if the flight control gesture of the control object is recognized as a drag control gesture, generate a drag control instruction to control the aircraft to fly in the horizontal direction indicated by the drag control instruction.
  • the flight control device is configured to generate a rotation control command to control the aircraft to rotate in a direction indicated by the rotation control instruction if the flight control gesture of the control object is recognized as a rotation control gesture.
  • the flight control device is configured to generate a landing control command to control the aircraft to land if the flight control gesture of the control object is recognized as a landing gesture.
  • the flight control device is configured to: if no flight control gesture is recognized, identify a feature portion of the target user in the flight environment image, and control the aircraft, according to the feature portion of the target user, to take the target user as the following target and follow the target user's movement.
  • following the target user's movement refers to adjusting a shooting state so that the target user is located in the image captured by the camera in the adjusted shooting state; adjusting the shooting state includes adjusting any one or more of the position of the aircraft, the attitude of the gimbal mounted on the aircraft, and the attitude of the aircraft.
  • the flight control device is configured to generate a shooting control command to control a shooting device of the aircraft to capture a target image if the flight control gesture of the control object is recognized as a photographing gesture.
  • the flight control device is configured to: if the flight control gesture of the control object is recognized as a recording gesture, generate a recording control instruction to control a camera of the aircraft to capture a video; and capture the camera in the aircraft During the video, if the recording gesture of the control object is recognized again, an end control command is generated to control the camera of the aircraft to stop capturing the video.
  • the flight control device is configured to: if the flight control gesture of the control object of the target user is not recognized and a replacement control gesture issued by the control object of a replacement user is recognized, determine the replacement user as the new target user; identify the control object of the new target user and the replacement control gesture, and generate a control instruction according to the replacement control gesture to control the aircraft to perform the action corresponding to the control instruction.
  • the flight control device controls the shooting device to capture a flight environment image during the flight of the aircraft, performs gesture recognition on the control object of the target user in the flight environment image to determine the flight control gesture, and, based on the identified flight control gesture, generates a control command to control the aircraft to perform the action corresponding to the control command.
  • By controlling, through gesture recognition, the action of the aircraft indicated by a gesture during flight, the operation steps for controlling the aircraft are simplified, the aircraft can be controlled relatively quickly, and the efficiency of controlling the aircraft is improved.
  • An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the flight control method of the embodiments corresponding to FIG. 1a, FIG. 2, or FIG. 4 of the present invention, and can also implement the flight control device of the embodiments shown in FIG. 5 or FIG. 6; details are not described herein again.
  • the computer readable storage medium may be an internal storage unit of the device described in any of the preceding embodiments, such as a hard disk or a memory of the device.
  • the computer readable storage medium may also be an external storage device of the device, such as a plug-in hard disk equipped on the device, a smart memory card (SMC), a secure digital (SD) card, a flash card, and the like.
  • the computer readable storage medium may also include both an internal storage unit of the device and an external storage device.
  • the computer readable storage medium is for storing the computer program and other programs and data required by the terminal.
  • the computer readable storage medium can also be used to temporarily store data that has been output or is about to be output.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Abstract

一种飞行控制方法、设备、飞行器、系统及存储介质,其中,方法包括:获取拍摄装置拍摄得到的环境图像(S201);根据环境图像确定出目标用户的特征部位,并根据特征部位确定出目标图像区域,在目标图像区域中识别出目标用户的控制对象(S202);根据控制对象生成控制指令控制飞行器飞行(S203)。通过这种方式,实现通过手势识别可较为快捷地对飞行器进行控制。

Description

一种飞行控制方法、设备、飞行器、系统及存储介质 技术领域
本发明涉及控制技术领域,尤其涉及一种飞行控制方法、设备、飞行器、系统及存储介质。
背景技术
随着计算机技术的发展,无人飞行器的发展越来越快,其中无人飞行器的飞行过程通常是由飞行控制器或者具有控制能力的移动设备控制的。然而,用户在使用这样的飞行控制器或者移动设备控制飞行器飞行之前需要学习对应的操控技巧,导致学习成本高,操作流程复杂。因此如何更好地控制飞行器成为研究的热点问题。
发明内容
本发明实施例提供了一种飞行控制方法、设备、飞行器、系统及存储介质,可较为快捷地对飞行器进行控制。
第一方面,本发明实施例提供了一种飞行控制方法,应用于飞行器,所述飞行器上挂载有拍摄装置,所述方法包括:
获取所述拍摄装置拍摄得到的环境图像;
根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;
根据所述控制对象生成控制指令控制所述飞行器飞行。
第二方面,本发明实施例提供了另一种飞行控制方法,应用于飞行器,所述飞行器上挂载有拍摄装置,所述方法包括:
如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取所述拍摄装置拍摄得到的环境图像;
对所述环境图像中目标用户的控制对象进行手势识别;
如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
第三方面,本发明实施例提供了一种飞行控制设备,包括存储器和处理器;
所述存储器,用于存储程序指令;
所述处理器,执行所述存储器存储的程序指令,当程序指令被执行时,所述处理器用于执行如下步骤:
获取拍摄装置拍摄得到的环境图像;
根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;
根据所述控制对象生成控制指令控制所述飞行器飞行。
第四方面,本发明实施例提供了另一种飞行控制设备,包括存储器和处理器;
所述存储器,用于存储程序指令;
所述处理器,执行所述存储器存储的程序指令,当程序指令被执行时,所述处理器用于执行如下步骤:
如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;
对所述环境图像中目标用户的控制对象进行手势识别;
如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
第五方面,本发明实施例提供了一种飞行器,包括:
机身;
设置在机身上的动力系统,用于提供飞行动力;
处理器,用于获取拍摄装置拍摄得到的环境图像;根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;根据所述控制对象生成控制指令控制所述飞行器飞行。
第六方面,本发明实施例提供了另一种飞行器,包括:
机身;
设置在机身上的动力系统,用于提供飞行动力;
处理器,用于如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;对所述环境图像中目标用户的控制对象进行手势识别;如果识别出所述控制对象的手势为启动飞行手势,则生成起飞 控制指令控制所述飞行器起飞。
第七方面,本发明实施例提供了一种飞行控制系统,包括:飞行控制设备和飞行器;
所述飞行器,用于控制挂载在所述飞行器上的拍摄装置拍摄得到环境图像,并将所述环境图像发送给所述飞行控制设备;
所述飞行控制设备,用于获取拍摄装置拍摄得到的环境图像;根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;根据所述控制对象生成控制指令控制所述飞行器飞行;
所述飞行器,还用于响应所述飞行控制指令,控制所述飞行器飞行并执行所述飞行控制指令对应的动作。
第八方面,本发明实施例提供了另一种飞行控制系统,包括:飞行控制设备和飞行器;
所述飞行控制设备,用于如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;对所述环境图像中目标用户的控制对象进行手势识别;如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞;
所述飞行器,用于响应所述起飞控制指令控制所述飞行器起飞。
第九方面,本发明实施例提供了一种计算机可读存储介质,该计算机可读存储介质存储有计算机程序,该计算机程序被处理器执行时实现如上述第一方面或第二方面所述的飞行控制方法。
本发明实施例中,飞行控制设备通过获取拍摄装置拍摄得到的环境图像,根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象,从而根据所述控制对象生成控制指令控制所述飞行器飞行。通过这种方式,实现了较为快捷地对飞行器进行控制,提高了控制飞行器飞行、拍摄、降落等操作的效率。
附图说明
为了更清楚地说明本发明实施例或现有技术中的技术方案,下面将对实施 例中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1a是本发明实施例提供的一种飞行控制系统的结构示意图;
图1b是本发明实施例提供的一种飞行器的飞行控制示意图;
图2是本发明实施例提供的一种飞行控制方法的流程示意图;
图3是本发明实施例提供的另一种飞行控制方法的流程示意图;
图4是本发明实施例提供的又一种飞行控制方法的流程示意图;
图5是本发明实施例提供的一种飞行控制设备的结构示意图;
图6是本发明实施例提供的另一种飞行控制设备的结构示意图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
下面结合附图,对本发明的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
本发明实施例中提供的飞行控制方法可以由一种飞行控制设备执行,该飞行控制设备可以设置在能够拍摄视频的飞行器(如无人机)上,所述飞行器上挂载有拍摄装置。所述飞行控制方法可以应用于控制所述飞行器的起飞、飞行、降落、拍照、录像等操作。在其他实施例中,所述飞行控制方法也可以应用于能够自主移动的机器人等可运动设备上,下面对应用于飞行器的飞行控制方法进行说明。
本发明实施例中,所述飞行控制设备可以控制所述飞行器的起飞,如果所述飞行控制设备获取到触发所述飞行器进入图像控制模式的触发操作,则可以控制所述飞行器进入所述图像控制模式。在所述图像控制模式下,所述飞行控制设备可以获取所述飞行器上挂载的拍摄装置拍摄得到的环境图像,其中,所述环境图像为所述拍摄装置在所述飞行器起飞之前拍摄到的预览图像。所述飞行控制设备可以对所述环境图像中目标用户的控制对象进行手势识别,如果识 别出所述控制对象的手势为启动飞行手势,则可以生成起飞控制指令控制所述飞行器起飞。
在一个实施例中,所述触发操作可以包括:对所述飞行器电源键的点击操作、对所述飞行器电源键的双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作等中的任意一种或多种,所述触发操作还可以是特征物体扫描操作、智能附件的交互操作(如智能眼镜,智能手表、手环等)等任意一种或多种,本发明实施例对所述触发操作不做限定。
在一个实施例中,所述启动飞行手势可以为所述目标用户所做的任何指定手势,如“OK”手势、剪刀手手势等,本发明实施例对所述启动飞行手势不做限定。
在一个实施例中,所述目标用户主要是指人,所述控制对象可以是所述目标用户的手掌、或其他身体部位、身体区域,如面部、头部、肩部等特征部位,本发明实施例对所述目标用户和所述控制对象不做限定。
具体可举例说明,假设所述触发操作为对所述飞行器电源键的双击操作,所述目标用户为人,所述控制对象为所述目标用户的手掌,所述启动飞行手势设置为“OK”手势,如果所述飞行控制设备检测到所述目标用户对所述飞行器的电源键的双击操作,则所述飞行控制设备可以控制所述飞行器进入图像控制模式。其中,在所述图像控制模式下,所述飞行控制设备可以获取所述飞行器上的拍摄装置拍摄得到的环境图像,所述环境图像为用于进行控制分析的预览图像,而并非拍摄的需要存储的图像,所述预览图像中包括所述目标用户。所述飞行控制设备可以在所述图像控制模式下对所述环境图像中目标用户的手掌进行手势识别,如果识别出所述目标用户的手掌所做的手势为“OK”手势,则可以生成起飞控制指令控制所述飞行器起飞。
在一个实施例中,所述飞行控制设备在获取到所述触发操作,且进入所述图像控制模式之后,首先需要识别出所述目标用户的控制对象。具体地,所述飞行控制设备可以通过控制挂载在所述飞行器上的拍摄装置拍摄获取到环境图像,其中,所述环境图像为所述飞行器起飞之前的预览图像。所述飞行控制设备可以根据所述预览图像,从所述预览图像中确定出所述目标用户的特征部位,并根据所述特征部位确定出目标图像区域,从而在所述目标图像区域中识别出所述目标用户的控制对象。例如,假设所述目标用户的控制对象为手掌, 所述飞行控制设备可以通过控制挂载在所述飞行器上的拍摄装置拍摄获取到环境图像,其中,所述环境图像为所述飞行器起飞之前的预览图像。假设所述飞行控制设备可以根据所述预览图像,从所述预览图像中确定出所述目标用户的特征部位为人体,则所述飞行控制设备可以根据所述目标用户的人体确定出所述人体在所述预览图像中所在的目标图像区域,从而在所述人体所在目标图像区域中识别出所述目标用户的手掌。
在本发明一些实施例中，所述飞行控制设备可以在所述飞行器飞行过程中控制所述拍摄装置拍摄获取飞行环境图像，并对所述飞行环境图像中目标用户的控制对象进行手势识别，根据所述手势识别确定出飞行控制手势，并可以根据识别出的所述飞行控制手势，生成控制指令控制所述飞行器执行所述控制指令对应的动作。
具体请参见图1a,图1a是本发明实施例提供的一种飞行控制系统的结构示意图。所述系统包括:飞行控制设备11和飞行器12。所述飞行控制设备11可以设置在所述飞行器12上,这里为了方便说明,将飞行器12和飞行控制设备11分别放置。其中,飞行器12和飞行控制设备11之间的通信连接可以为有线通信连接,也可以为无线通信连接。所述飞行器12可以是旋翼型无人机,例如四旋翼无人机、六旋翼无人机、八旋翼无人机,也可以是固定翼无人机等飞行器。所述飞行器12包括动力系统121,动力系统用于为飞行器12提供飞行动力,其中,动力系统121包括螺旋桨、电机、电调中的任意一种或多种,飞行器12还可以包括云台122以及拍摄装置123,拍摄装置123通过云台122搭载于飞行器12的主体上。所述拍摄装置123用于在所述飞行器12起飞前拍摄得到预览图像,以及在所述飞行器12飞行过程中拍摄图像或视频,所述拍摄装置123包括但不限于多光谱成像仪、高光谱成像仪、可见光相机及红外相机等,所述云台122为多轴传动及增稳系统,云台电机通过调整转动轴的转动角度来对成像设备的拍摄角度进行补偿,并通过设置适当的缓冲机构来防止或减小成像设备的抖动。
在一个实施例中,所述飞行控制设备11在获取到触发所述飞行器12进入图像控制模式的触发操作,进入所述图像控制模式之后,控制所述飞行器12起飞之前,可以开启挂载在所述飞行器12上的拍摄装置123,并控制挂载在所述飞行器12上的云台122转动,以调整所述云台122的姿态角度,从而控 制所述拍摄装置123在预设的拍摄范围内进行扫描拍摄,以使所述拍摄装置123在所述预设的拍摄范围内扫描拍摄得到的环境图像中包括所述目标用户的特征部位,从而使所述飞行控制设备11可以获取所述拍摄装置123在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像,其中,所述环境图像为所述拍摄装置123在所述飞行器12起飞之前拍摄得到的预览图像。
在一个实施例中,所述飞行控制设备11在所述飞行器12起飞之前,根据所述环境图像识别所述目标用户的控制对象时,如果所述飞行控制设备11检测到所述目标用户的状态参数满足预设的第一条件,则可以确定出所述目标用户的特征部位为第一特征部位,并根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,从而在所述目标图像区域中识别出所述目标用户的控制对象。在一个实施例中,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值;或者,所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。在一个实施例中,所述第一特征部位为所述目标用户的人体,或者所述第一特征部位可以为所述目标用户的其他身体部位,本发明实施例不做限定。例如,假设所述第一占比阈值是1/4,且所述第一特征部位为目标用户的人体,如果所述飞行控制设备检测到在获取到的所述拍摄装置拍摄得到的环境图像中,所述目标用户在所述环境图像中的图像区域的尺寸占比小于1/4,则所述飞行控制设备可以确定所述目标用户的特征部位为人体,并根据所述目标用户的人体确定出所述人体所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象,例如手掌。
在一个实施例中，所述飞行控制设备11在所述飞行器12起飞之前，根据所述环境图像识别所述目标用户的控制对象时，如果所述飞行控制设备11检测到所述目标用户的状态参数满足预设的第二条件，则可以确定所述目标用户的特征部位为第二特征部位，并根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域，从而在所述目标图像区域中识别出所述目标用户的控制对象。在一个实施例中，所述目标用户的状态参数包括：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数，所述目标用户的状态参数满足预设的第二条件是指：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值；或者，所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。在一个实施例中，所述第二特征部位包括所述目标用户的头部；或者，所述第二特征部位可以包括所述目标用户的头部和肩部等其他身体部位，本发明实施例不做限定。例如，假设所述第二占比阈值是1/3，且所述第二特征部位为目标用户的头部，如果所述飞行控制设备检测到在获取到的所述拍摄装置拍摄得到的环境图像中所述目标用户在所述环境图像中的图像区域的尺寸占比大于1/3，则所述飞行控制设备可以确定所述目标用户的特征部位为头部，并根据所述目标用户的头部确定出所述头部所在的目标图像区域，从而在所述目标图像区域中识别出所述目标用户的控制对象如手掌。
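上述根据目标用户状态参数选择特征部位（远距离时以人体为第一特征部位、近距离时以头部为第二特征部位）的判断逻辑，可用如下示意代码表示。需要说明的是，这只是基于占比阈值条件的一个示意性草案，其中的函数名与默认阈值（1/4 与 1/3）均为本说明所作的假设，并非实际实现：

```python
def select_feature_part(area_ratio, first_ratio_thresh=1/4, second_ratio_thresh=1/3):
    """根据目标用户在环境图像中所占尺寸比例, 选择用于定位控制对象的特征部位.

    area_ratio: 目标用户所在图像区域占整幅环境图像的尺寸占比参数.
    返回 'body'(第一特征部位, 人体) 或 'head'(第二特征部位, 头部);
    两个阈值之间的情形, 此处默认仍以人体为特征部位(示意假设).
    """
    if area_ratio <= first_ratio_thresh:
        # 满足预设的第一条件: 占比较小, 对应目标用户离飞行器较远
        return 'body'
    if area_ratio >= second_ratio_thresh:
        # 满足预设的第二条件: 占比较大, 对应目标用户离飞行器较近
        return 'head'
    return 'body'
```

选定特征部位后，即可在该特征部位所在的目标图像区域中进一步识别控制对象（如手掌）。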
在一个实施例中,所述飞行控制设备11在所述飞行器12起飞之前识别所述目标用户的控制对象时,如果在所述目标图像区域中识别出至少一个控制对象,则可以根据所述目标用户的特征部位,确定所述目标用户的关节点,并根据确定的所述关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。其中,所述关节点包括所述目标用户的特征部位的关节点,本发明实施例不做限定。
在一个实施例中,所述飞行控制设备11在从所述至少一个控制对象中确定所述目标用户的控制对象时,可以从确定的关节点中确定出目标关节点,将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。其中,所述目标关节点可以是指指定的手臂部位的关节点,例如手臂的肘关节的关节点、手臂与肩膀的关节点、手腕的关节点等任意一种或多种,所述目标关节点和所述控制对象的手指均属于同一个目标用户。例如,假设所述飞行控制设备11在所述目标图像区域中识别出2个手掌(控制对象),所述飞行控制设备11可以确定出所述目标用户的手臂与肩膀的关节点,并将这2个手掌中与所述目标用户的手臂与肩膀的关节点距离最近的手掌确定为所述目标用户的控制对象。
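按目标关节点距离从多个候选控制对象中筛选的过程，可用如下示意代码表示（仅为一种基于图像坐标欧氏距离的示意性草案，候选对象与关节点的坐标表示均为本说明的假设）：

```python
import math

def pick_control_object(candidates, target_joint):
    """从至少一个候选控制对象(如识别出的多个手掌)中,
    选出与目标关节点(如手臂与肩膀的关节点)距离最近的一个,
    作为目标用户的控制对象.

    candidates: [(x, y), ...] 各候选控制对象在图像中的位置.
    target_joint: (x, y) 目标关节点在图像中的位置.
    """
    return min(candidates, key=lambda p: math.dist(p, target_joint))
```

例如识别出 2 个手掌时，与目标用户手臂与肩膀的关节点距离最近的手掌被确定为控制对象。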
在一个实施例中,在所述飞行器12起飞之后的飞行过程中,所述飞行控制设备11可以识别所述控制对象的飞行控制手势,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为高度控制手势,则可以生成高度控制指令控制所述飞行器12调整所述飞行器12飞行的高度。具体地,所述飞行控制设备11可以在所述飞行器的飞行过程中,控制所述拍摄装置123拍摄到图像集合,并根据图像集合中包括的图像对所述控制对象进行运动识别,得到所述控制对象的运动信息,其中,所述运动信息包括所述控制对象的运动方向等运动信息。所述飞行控制设备11可以根据所述运动信息,分析得到所述控制对象的飞行控制手势,如果确定出所述飞行控制手势为高度控制手势,则可以获取与所述高度控制手势对应的高度控制指令,并控制所述飞行器12基于所述高度控制指令所指示的运动方向飞行,以调整所述飞行器12的高度。
具体可以图1b为例进行说明,图1b是本发明实施例提供的一种飞行器的飞行控制示意图。如图1b所示的示意图包括目标用户13和飞行器12,其中,所述目标用户13包括控制对象131,所述飞行器12如上述图1a所述,包括动力系统121、云台122和拍摄装置123,所述飞行器12的解释如上所述,在此不再赘述。需要说明的是,所述飞行器12上设置有所述飞行控制设备,假设所述控制对象131为手掌,在所述飞行器12飞行过程中,所述飞行控制设备可以控制所述拍摄装置123拍摄到多张环境图像,并从所述环境图像中识别出所述目标用户13的手掌131,如果所述飞行控制设备识别出所述目标用户13的手掌131的手势为正对所述拍摄装置以垂直地面向上或向下为运动方向移动,则可以确定所述手掌的手势为高度控制手势。如果所述飞行控制设备检测到垂直地面向上移动的手掌131,则可以生成高度控制指令,控制所述飞行器12向垂直地面向上的方向飞行,以调高所述飞行器12的飞行高度。
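根据多帧图像中手掌竖直方向的运动信息判定高度控制手势的过程，可用如下示意代码表示（仅为示意性草案，位移阈值与返回值名称均为本说明的假设；注意图像坐标系中 y 轴通常向下）：

```python
def height_command(palm_ys, min_displacement=20):
    """根据连续多帧图像中手掌的纵坐标序列, 识别高度控制手势.

    palm_ys: 按时间顺序排列的手掌纵坐标列表(图像坐标系, y 轴向下).
    返回 'up'(生成调高飞行高度的高度控制指令),
    'down'(调低飞行高度) 或 None(位移不足, 不构成高度控制手势).
    """
    dy = palm_ys[-1] - palm_ys[0]
    if dy <= -min_displacement:   # 图像中 y 减小 => 手掌垂直地面向上移动
        return 'up'
    if dy >= min_displacement:    # 图像中 y 增大 => 手掌垂直地面向下移动
        return 'down'
    return None
```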
在一个实施例中,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为移动控制手势,则可以生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行。其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。具体地,如果所述飞行控制设备11根据所述拍摄装置123拍摄到的图像集合中的图像包括第一对象和第二对象两个控制对象,则所述飞行控制设备11可以对所述第一对象和第二对象进行运动识别,得到所述第一对象和第 二对象的运动信息,并根据所述运动信息,得到所述第一对象和第二对象所表示的动作特征,其中,所述动作特征用于表示所述第一对象和第二对象之间的距离变化,所述飞行控制设备11可以根据所述距离变化获取所述动作特征对应的移动控制指令。
在一个实施例中,如果所述动作特征用于表示所述第一对象和第二对象之间的距离变化为距离增大的变化,则所述移动控制指令为用于控制所述飞行器向远离所述目标用户的方向飞行。如果所述动作特征用于表示所述第一对象和第二对象之间的距离变化为距离减小的变化,则所述移动控制指令为用于控制所述飞行器向靠近所述目标用户的方向飞行。
具体可举例说明,假设所述控制对象包括第一对象和第二对象,且所述第一对象为人的左手手掌,第二对象为人的右手手掌,如果所述飞行控制设备11检测到所述目标用户举起的正对所述飞行器12的拍摄装置的两个手掌,且检测到这两个手掌在做“开门”的动作,即两个手掌之间在水平方向上的距离逐渐变大,则所述飞行控制设备11可以确定这两个手掌所做的飞行控制手势为移动控制手势,并生成移动控制指令,控制所述飞行器12向远离所述目标用户的方向飞行。又例如,如果所述飞行控制设备11检测到这两个手掌在做“关门”的动作,即两个手掌之间在水平方向上的距离逐渐变小,则所述飞行控制设备11可以确定这两个手掌所做的飞行控制手势为移动控制手势,并生成移动控制指令,控制所述飞行器12向靠近所述目标用户的方向飞行。
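上述根据两个控制对象（左右手掌）之间水平距离变化生成移动控制指令的逻辑，可用如下示意代码表示（仅为示意性草案，距离变化阈值与返回值名称均为本说明的假设）：

```python
def move_command(first_obj_xs, second_obj_xs, min_change=15):
    """根据第一对象和第二对象(两个手掌)之间水平距离的变化, 识别移动控制手势.

    first_obj_xs / second_obj_xs: 按时间顺序排列的两个手掌的横坐标列表.
    距离逐渐变大("开门"动作) => 'away', 控制飞行器远离目标用户;
    距离逐渐变小("关门"动作) => 'approach', 控制飞行器靠近目标用户.
    """
    start = abs(first_obj_xs[0] - second_obj_xs[0])
    end = abs(first_obj_xs[-1] - second_obj_xs[-1])
    if end - start >= min_change:
        return 'away'
    if start - end >= min_change:
        return 'approach'
    return None
```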
在一个实施例中,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为拖动控制手势,则可以生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。其中,所述拖动控制手势是指所述目标用户的手掌沿水平方向向左或向右拖动。例如,如果所述飞行控制设备11识别出所述目标用户的手掌沿水平方向向左拖动,则可以生成拖动控制指令控制所述飞行器沿水平向左的方向飞行。
在一个实施例中,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为旋转控制手势,则可以生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。其中,所述旋转控制手势是指所述目标用户的手掌以所述目标用户为中心旋转。具体地,所述飞行控制设备11可以根据所述拍摄装置123拍摄得到的图像集合中 包括的图像,对所述控制对象中包括的手掌和目标用户进行运动识别,得到所述手掌和目标用户的运动信息,所述运动信息可以包括所述手掌和目标用户的运动方向。如果所述飞行控制设备11根据所述运动信息,确定出所述手掌和所述目标用户以所述目标用户为中心旋转,则可以生成旋转控制指令控制所述飞行器参考所述旋转控制指令所指示的方向进行旋转飞行。例如,假设所述飞行控制设备11检测到所述目标用户和所述目标用户的手掌以所述目标用户为中心顺时针旋转,则所述飞行控制设备11可以生成旋转控制指令控制所述飞行器12沿着以所述目标用户为中心顺时针方向旋转。
在一个实施例中,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为降落手势,则可以生成降落控制指令控制所述飞行器降落。在一个实施例中,所述降落手势可以包括所述目标用户的手掌正对地面向下移动的手势,或者,所述降落手势也可以为所述目标用户的其他手势,本发明实施例不做具体限定。具体地,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别出所述目标用户的手掌正对地面向下移动的手势,则可以生成降落控制指令控制所述飞行器12降落至目标位置,其中,所述目标位置可以是预先设置的,或者所述目标位置是根据所述飞行器12检测到的所述飞行器12与地面的高度来确定的,本发明实施例不做限定。如果检测到所述降落手势在所述目标位置处停留的时间大于预设的时间阈值,则可以控制所述飞行器12降落至地面。例如,假设所述预设的时间阈值为3s,且根据所述飞行器12检测到的所述飞行器12与地面的高度确定出的目标位置为距离地面0.5m,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别出所述目标用户的手掌正对地面向下移动的手势,则可以生成降落控制指令控制所述飞行器12降落至距离地面0.5m的位置处,如果检测到所述目标用户的手掌正对地面向下移动的手势在距离地面0.5m的位置处停留的时间超过3s,则可以控制所述飞行器12降落至地面。
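上述降落手势的两阶段处理（先降至目标位置悬停，手势在目标位置持续超过时间阈值后再降落至地面）可用如下示意代码表示（仅为示意性草案，0.5 米目标位置与 3 秒阈值取自上文示例，状态名均为本说明的假设）：

```python
def landing_step(gesture, altitude, dwell_time, target_altitude=0.5, dwell_thresh=3.0):
    """降落手势的分阶段决策示意.

    gesture: 当前识别出的手势, 'palm_down' 表示手掌正对地面向下移动的降落手势.
    altitude: 飞行器当前距地面高度(米); dwell_time: 手势在目标位置持续时间(秒).
    返回 'descend_to_target'(先降至目标位置), 'land'(降落至地面) 或 'hold'(保持).
    """
    if gesture != 'palm_down':
        return 'hold'
    if altitude > target_altitude:
        return 'descend_to_target'
    if dwell_time > dwell_thresh:
        return 'land'
    return 'hold'
```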
在一个实施例中,在所述飞行器12飞行过程中,如果所述飞行控制设备11不能识别确定所述目标用户的飞行控制手势,且识别出所述飞行环境图像中目标用户的特征部位,则可以根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。在一个实施例中,所述特征部位是指目标用户的任意身体区域,本发明实施例不做具体限定。在一个 实施例中,所述跟随所述目标用户移动是指:调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种跟随所述目标用户移动,以使所述目标用户在所述拍摄装置拍摄的图像中。具体地,在所述飞行器12飞行过程中,如果所述飞行控制设备11不能识别确定所述目标用户的飞行控制手势,且识别出所述飞行环境图像中目标用户的第一身体区域,则可以跟随所述第一身体区域控制所述飞行器以所述目标用户为跟随目标,跟随所述第一身体区域移动,并在跟随所述第一身体区域移动过程中,调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种,以使所述目标用户在所述拍摄装置拍摄的图像中。
具体可举例说明,在所述飞行器12飞行过程中,如果所述飞行控制设备11识别不到所述目标用户的手掌所做的手势,且识别到所述目标用户身体躯干所在的身体区域,则所述飞行控制设备11可以跟随所述身体躯干所在的身体区域,控制所述飞行器以所述目标用户为跟随目标,跟随所述身体躯干所在的身体区域移动,并在跟随所述身体躯干所在的身体区域移动过程中,调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种,以使所述目标用户在所述拍摄装置拍摄的图像中。
在一个实施例中,在所述飞行器12飞行过程中,如果所述飞行控制设备11不能识别确定所述目标用户的飞行控制手势,且在检测不到所述目标用户的第一身体区域时,识别出所述目标用户的第二身体区域,则可以控制所述飞行器12跟随所述第二身体区域移动。具体地,在所述飞行器12飞行过程中,如果所述飞行控制设备11不能识别确定所述目标用户的飞行控制手势,且在检测不到所述目标用户的第一身体区域时,识别出所述目标用户的第二身体区域,则所述飞行控制设备11可以跟随所述第二身体区域控制所述飞行器以所述目标用户为跟随目标,跟随所述第二身体区域移动,并在跟随所述第二身体区域移动过程中,调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种,以使所述目标用户在所述拍摄装置拍摄的图像中。
具体可举例说明，在所述飞行器12飞行过程中，如果所述飞行控制设备11识别不到所述目标用户的手掌所做的手势，且在识别不到所述目标用户的身体躯干所在的身体区域时，识别出所述目标用户头部和肩部所在的身体区域，则所述飞行控制设备11可以跟随所述头部和肩部所在的身体区域控制所述飞行器以所述目标用户为跟随目标，跟随所述头部和肩部所在的身体区域移动，并在跟随所述头部和肩部所在身体区域移动过程中，调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种，以使所述目标用户在所述拍摄装置拍摄的图像中。
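上述"手势优先、身体区域逐级回退"的跟随策略可用如下示意代码表示（仅为示意性草案，区域的表示方式与返回值均为本说明的假设）：

```python
def follow_target(control_gesture, first_region, second_region):
    """无法识别飞行控制手势时的跟随目标选择示意.

    control_gesture: 识别出的飞行控制手势, None 表示识别不到手势.
    first_region: 第一身体区域(如身体躯干), 检测不到时为 None.
    second_region: 第二身体区域(如头部和肩部), 检测不到时为 None.
    返回要跟随的身体区域; 识别到手势时返回 None, 按手势控制而不进入跟随.
    """
    if control_gesture is not None:
        return None
    if first_region is not None:      # 优先跟随第一身体区域
        return first_region
    return second_region              # 回退到第二身体区域
```

跟随过程中再调整飞行器位置、云台姿态或飞行器姿态，使目标用户保持在拍摄画面中。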
在一个实施例中,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为拍照手势,则可以生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。其中,所述拍照手势可以为设置的任意手势,如“O”手势,本发明实施例不做具体限定。例如,假设所述拍照手势为“O”手势,如果所述飞行控制设备11识别出所述目标用户的手掌所做的手势为“O”手势,则可以生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
在一个实施例中,如果所述飞行控制设备11识别出所述控制对象的飞行控制手势为录像手势,则可以生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频,在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则可以生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。其中,所述录像手势可以为设置的任意手势,本发明实施例不做限定。例如,假设所述录像手势为“1”手势,如果所述飞行控制设备11识别出所述目标用户的手掌所做的手势为“1”手势,则可以生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频,在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述目标用户所做的“1”手势,则可以生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
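上述"同一录像手势第一次开始录像、再次识别则停止录像"的开关逻辑可用如下示意代码表示（仅为示意性草案，手势名 'one' 取自上文"1"手势示例，指令名为本说明的假设）：

```python
def record_step(recording, gesture, record_gesture='one'):
    """录像手势的开关逻辑示意.

    recording: 当前是否正在录像; gesture: 本次识别出的手势.
    返回 (新的录像状态, 生成的控制指令或 None).
    第一次识别到录像手势生成录像控制指令开始拍摄视频,
    录像过程中再次识别到同一手势则生成结束控制指令停止拍摄.
    """
    if gesture != record_gesture:
        return recording, None
    if not recording:
        return True, 'start_record'
    return False, 'stop_record'
```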
在一个实施例中，如果所述飞行控制设备11识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象的替换控制手势，则可以将所述替换用户确定为新的目标用户，并识别所述新的目标用户的控制对象及替换控制手势，根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。其中，所述替换控制手势可以为设置的任意手势，本发明实施例不做限定。例如，如果所述飞行控制设备11识别不到所述目标用户的手掌所做的飞行控制手势，且识别出了替换用户正对所述飞行器12的拍摄装置所做的替换控制手势为“O”手势，则所述飞行控制设备11可以将所述替换用户确定为目标用户，并根据所述替换用户所做的“O”手势，生成拍照控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
下面结合附图对应用于飞行器的飞行控制方法进行举例说明。
请参见图2,图2是本发明实施例提供的一种飞行控制方法的流程示意图,所述方法可以由飞行控制设备执行,所述飞行控制设备可以设置在飞行器上,所述飞行器上挂载有拍摄装置,其中,所述飞行控制设备的具体解释如前所述。具体地,本发明实施例的所述方法包括如下步骤。
S201:获取拍摄装置拍摄得到的环境图像。
本发明实施例中,飞行控制设备可以获取挂载在所述飞行器上的拍摄装置拍摄得到的环境图像。
S202:根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象。
本发明实施例中,飞行控制设备可以根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象。在一个实施例中,所述控制对象包括但不限定于所述目标用户的手掌。
在一个实施例中,在所述飞行控制设备根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象时,如果所述目标用户的状态参数满足预设的第一条件,则所述飞行控制设备可以确定所述目标用户的特征部位为第一特征部位,根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。在一个实施例中,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值;或者,所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。在一个实施例中,所述第一特征部位包括但不限定于所述目标用户的人体。例如,假设所述第一占比阈值是1/3,且所述第一特征部位为目标用户的人体,如果所述飞行 控制设备检测到在获取到的所述拍摄装置拍摄得到的环境图像中所述目标用户在所述环境图像中的图像区域的尺寸占比小于1/3,则所述飞行控制设备可以确定所述目标用户的特征部位为人体,并根据所述目标用户的人体确定出所述人体所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象如手掌。
在一个实施例中，如果所述目标用户的状态参数满足预设的第二条件，则所述飞行控制设备可以确定所述目标用户的特征部位为第二特征部位，根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域，并在所述目标图像区域中识别出所述目标用户的控制对象。在一个实施例中，所述目标用户的状态参数满足预设的第二条件是指：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值；或者，所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。在一个实施例中，所述第二特征部位包括所述目标用户的头部，或者，所述第二特征部位包括所述目标用户的头部和肩部，本发明实施例不做限定。例如，假设所述第二占比阈值是1/2，且所述第二特征部位为目标用户的头部，如果所述飞行控制设备检测到在获取到的所述拍摄装置拍摄得到的环境图像中，所述目标用户在所述环境图像中的图像区域的尺寸占比大于1/2，则所述飞行控制设备可以确定所述目标用户的特征部位为头部，并根据所述目标用户的头部确定出所述头部所在的目标图像区域，并在所述目标图像区域中识别出所述目标用户的控制对象如手掌。
在一个实施例中,所述飞行控制设备在所述目标图像区域中识别所述目标用户的控制对象的过程中,所述飞行控制设备可以在所述目标图像区域中识别出至少一个控制对象,并根据所述目标用户的特征部位,确定所述目标用户的关节点,根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
在一个实施例中,所述飞行控制设备在根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象时,可以从确定的关节点中确定出目标关节点,并将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。其中,所述目标关节点是指指定的手 臂部位的关节点,例如手臂的肘关节的关节点、手臂与肩膀的关节点、手腕的关节点等任意一种或多种,且所述目标关节点和控制对象的手指均属于同一个目标用户。例如,假设所述飞行控制设备确定出的所述目标图像区域为所述目标用户的人体所在的目标图像区域,如果所述飞行控制设备在所述目标用户的人体所在的目标图像区域中识别出2个手掌(控制对象),则所述飞行控制设备可以确定出所述目标用户的手臂与肩膀的关节点,并将这2个手掌中与所述手臂与肩膀的关节点距离最近的手掌确定为所述目标用户的控制对象。
S203:根据所述控制对象生成控制指令控制所述飞行器飞行。
本发明实施例中,飞行控制设备可以根据所述控制对象生成控制指令控制所述飞行器飞行。在一个实施例中,所述飞行控制设备可以通过识别所述控制对象的动作特征,根据所述控制对象的动作特征获取控制指令,按照所述控制指令控制所述飞行器飞行。
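根据控制对象的动作特征获取控制指令的过程，本质上是一个"手势类别到控制指令"的映射，可用如下示意代码表示（仅为示意性草案，手势类别名与指令名均为本说明的假设，涵盖下文描述的各类飞行控制手势）：

```python
# 手势类别 -> 控制指令 的映射表(示意假设)
GESTURE_TO_COMMAND = {
    'height': 'adjust_height',    # 高度控制手势
    'move': 'move',               # 移动控制手势
    'drag': 'fly_horizontal',     # 拖动控制手势
    'rotate': 'rotate',           # 旋转控制手势
    'land': 'land',               # 降落手势
    'photo': 'take_photo',        # 拍照手势
    'record': 'toggle_record',    # 录像手势
}

def command_for(action_feature):
    """根据识别出的控制对象动作特征(飞行控制手势类别)查表生成控制指令;
    未识别的动作特征返回 None."""
    return GESTURE_TO_COMMAND.get(action_feature)
```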
本发明实施例中,飞行控制设备通过获取拍摄装置拍摄得到的环境图像,根据从所述环境图像确定出的目标用户的特征部位确定出目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象,以根据所述控制对象生成控制指令控制所述飞行器飞行。通过这种方式,识别出所述目标用户的控制对象,实现通过识别所述控制对象的动作特征来控制飞行器的飞行,可较为快捷地对飞行器进行控制,提高飞行控制的效率。
请参见图3,图3是本发明实施例提供的另一种飞行控制方法的流程示意图,所述方法可以由飞行控制设备执行,其中,飞行控制设备的具体解释如前所述。本发明实施例与上述图2所述实施例的区别在于,本发明实施例是根据获取到的触发操作触发所述飞行器进入图像控制模式,并在所述图像控制模式下对获取到的目标用户的控制对象进行手势识别,根据识别出的启动飞行手势生成起飞控制指令控制所述飞行器起飞。
S301:如果获取到触发飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像。
本发明实施例中,飞行控制设备如果获取到触发飞行器进入图像控制模式的触发操作,则可以获取拍摄装置拍摄得到的环境图像,其中,所述环境图像为所述飞行器起飞之前所述拍摄装置拍摄得到的预览图像。在一个实施例中,所述触发操作可以包括:对所述飞行器电源键的点击操作、对所述飞行器电源 键的双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作等中的任意一种或多种,所述触发操作还可以是扫描特征物体、附件交互操作(如眼镜、手表、手环等)等任意一种或多种,本发明实施例对所述触发操作不做限定。例如,假设所述触发操作为对所述飞行器电源键的双击操作,如果所述飞行控制设备获取到目标用户双击所述飞行器的电源键的操作,则可以触发飞行器进入图像控制模式,并获取挂载在所述飞行器上的拍摄装置拍摄得到的环境图像。
S302:对所述环境图像中目标用户的控制对象进行手势识别。
本发明实施例中,飞行控制设备可以对在所述图像控制模式下所述飞行器的拍摄装置获取到的所述环境图像中目标用户的控制对象进行手势识别。在一个实施例中,所述目标用户可以是人、动物、无人汽车等可移动的物体,所述控制对象可以是所述目标用户的手掌、或其他身体部位、身体区域等,如面部、头部、肩部等部位,本发明实施例对所述目标用户和所述控制对象不做限定。
在一个实施例中,所述飞行控制设备在获取拍摄装置拍摄得到的环境图像时,可以在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内扫描拍摄,并获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
S303:如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
本发明实施例中，如果所述飞行控制设备识别出所述控制对象的手势为启动飞行手势，则生成起飞控制指令控制所述飞行器起飞。具体地，所述飞行控制设备在所述图像控制模式下，如果识别出所述控制对象的手势为启动飞行手势，则可以生成起飞控制指令控制所述飞行器起飞至目标高度对应位置处悬停。其中，所述目标高度可以是预先设置的距离地面的高度，也可以是根据所述目标用户在所述拍摄装置拍摄得到的环境图像中的位置区域来确定的，本发明实施例对所述飞行器起飞后悬停的目标高度不做限定。在一个实施例中，所述启动飞行手势可以为所述目标用户所做的任何手势，如“OK”手势、剪刀手手势等，本发明实施例对所述启动飞行手势不做限定。例如，假设所述触发操作为对所述飞行器电源键的双击操作，所述控制对象为所述目标用户的手掌，启动飞行手势设置为剪刀手手势，且预先设置的目标高度为距离地面1.2m，则如果所述飞行控制设备检测到目标用户双击所述飞行器的电源键的操作，控制所述飞行器进入图像控制模式，在所述图像控制模式下，所述飞行控制设备如果识别出所述目标用户的手掌所做的手势为剪刀手手势，则可以生成起飞控制指令控制所述飞行器起飞至1.2m的目标高度对应位置处悬停。
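"触发操作进入图像控制模式、识别启动飞行手势后起飞悬停"的流程可用如下示意代码表示（仅为示意性草案，状态名、手势名 'scissor' 与目标高度 1.2 米均取自上文示例或为本说明的假设）：

```python
class TakeoffController:
    """起飞控制流程示意: 未触发时不响应手势,
    进入图像控制模式后识别到启动飞行手势才生成起飞控制指令."""

    def __init__(self, start_gesture='scissor', target_height=1.2):
        self.mode = 'idle'
        self.start_gesture = start_gesture
        self.target_height = target_height

    def on_trigger(self):
        """获取到触发操作(如对电源键的双击操作), 进入图像控制模式."""
        self.mode = 'image_control'

    def on_gesture(self, gesture):
        """在图像控制模式下识别到启动飞行手势时,
        返回起飞控制指令(起飞至目标高度处悬停); 其余情形返回 None."""
        if self.mode == 'image_control' and gesture == self.start_gesture:
            self.mode = 'flying'
            return ('takeoff_and_hover', self.target_height)
        return None
```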
本发明实施例中,飞行控制设备通过获取触发所述飞行器进入图像控制模式的触发操作,进入图像控制模式,并对获取到的所述拍摄装置拍摄得到的环境图像中目标用户的控制对象进行手势识别,如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。通过这种方式,实现通过手势识别控制飞行器起飞,可较为快捷地对飞行器进行控制,提高了控制飞行器起飞的效率。
请参见图4,图4是本发明实施例提供的又一种飞行控制方法的流程示意图,所述方法可以由飞行控制设备执行,其中,飞行控制设备的具体解释如前所述。本发明实施例与上述图3所述实施例的区别在于,本发明实施例是在所述飞行器飞行过程中,通过对目标用户的控制对象进行手势识别,确定出飞行控制手势,并根据所述飞行控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
S401:在飞行器飞行过程中,控制拍摄装置拍摄获取飞行环境图像。
本发明实施例中,在飞行器飞行过程中,飞行控制设备可以控制所述飞行器上挂载的拍摄装置拍摄获取到飞行环境图像,其中,所述飞行环境图像是挂载在所述飞行器上的拍摄装置在所述飞行器飞行过程中扫描拍摄得到的环境图像。
S402:对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势。
本发明实施例中,所述飞行控制设备可以对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势。其中,所述控制对象如上所述可以包括但不限定于目标用户的手掌。所述飞行控制手势包括高度控制手势、移动控制手势、拖动控制手势、旋转控制手势、降落手势、拍照手势、录像手势、替换控制手势等中的任意一种或多种手势,本发明实施例不做限定。
S403:根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
本发明实施例中,所述飞行控制设备可以根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
在一个实施例中,在所述飞行器飞行过程中,如果所述飞行控制设备识别出所述控制对象的飞行控制手势为高度控制手势,则可以生成高度控制指令控制所述飞行器调整所述飞行器飞行的高度。具体地,所述飞行控制设备可以根据图像集合中包括的图像对所述控制对象进行运动识别,得到所述控制对象的运动信息,所述运动信息包括所述控制对象的运动方向,其中,所述图像集合包括所述拍摄装置拍摄到的多张环境图像。所述飞行控制设备可以根据所述运动信息,分析得到所述控制对象的飞行控制手势,如果得到的飞行控制手势为高度控制手势,则可以获取与所述高度控制手势对应的高度控制指令,并控制所述飞行器基于所述运动方向飞行,以调整所述飞行器的高度。具体可以图1b为例进行说明,假设在所述飞行器飞行过程中,所述飞行器12上设置的飞行控制设备可以根据拍摄装置拍摄到的多张环境图像,识别出目标用户的手掌,如果所述飞行控制设备识别出所述目标用户13的手掌131为正对所述拍摄装置以垂直地面向下为运动方向移动,则可以确定所述手掌131的手势为高度控制手势,并生成高度控制指令,控制所述飞行器12向垂直地面向下的方向飞行,以调低所述飞行器12的飞行高度。又例如,如果所述飞行控制设备检测到垂直地面向上移动的手掌131,则可以生成高度控制指令,控制所述飞行器12向垂直地面向上的方向飞行,以调高所述飞行器12的飞行高度。
在一个实施例中,在所述飞行器飞行过程中,如果所述飞行控制设备识别出所述控制对象的飞行控制手势为移动控制手势,则可以生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行。在一个实施例中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。具体地,如果所述飞行控制设备根据图像集合中包括的图像对所述控制对象中包括的第一对象和第二对象进行运动识别,得到所述第一对象和第二对象的运动信息,其中,所述图像集合包括所述拍摄装置拍摄到的多张环境图像。所述飞行控制设备可以根据所述运动信息,得到所述第一对象和第二对象所表示的动作特征,所述动作特征用于表示所述第一对象和第二对象之间的距离变化,并根据所述距离变化获取所述动作特征对应的移动控制指令。
在一个实施例中,如果所述动作特征用于表示所述第一对象和第二对象之 间的距离变化为距离增大的变化,则所述移动控制指令为用于控制所述飞行器向远离所述目标用户的方向飞行。如果所述动作特征用于表示所述第一对象和第二对象之间的距离变化为距离减小的变化,则所述移动控制指令为用于控制所述飞行器向靠近所述目标用户的方向飞行。具体可举例说明,假设所述控制对象包括第一对象和第二对象,且所述第一对象为目标用户的左手手掌,第二对象为所述目标用户的右手手掌,如果所述飞行控制设备检测到所述目标用户举起的正对所述飞行器的拍摄装置的两个手掌,且检测到这两个手掌之间在水平方向上的距离逐渐变大,则所述飞行控制设备可以确定这两个手掌所做的飞行控制手势为移动控制手势,并生成移动控制指令,控制所述飞行器向远离所述目标用户的方向飞行。又例如,如果所述飞行控制设备检测到这两个手掌之间在水平方向上的距离逐渐变小,则所述飞行控制设备可以确定这两个手掌所做的飞行控制手势为移动控制手势,并生成移动控制指令,控制所述飞行器向靠近所述目标用户的方向飞行。
在一个实施例中,在所述飞行器飞行过程中,如果所述飞行控制设备识别出所述控制对象的飞行控制手势为拖动控制手势,则可以生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。其中,所述拖动控制手势是指所述目标用户的手掌沿水平方向向左或向右拖动。例如,如果所述飞行控制设备识别出所述目标用户的手掌沿水平方向向左拖动,则生成拖动控制指令控制所述飞行器沿水平向左的方向飞行。
在一个实施例中，在所述飞行器飞行过程中，如果所述飞行控制设备识别出所述控制对象的飞行控制手势为旋转控制手势，则可以生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。其中，所述旋转控制手势是指所述目标用户的手掌以所述目标用户为中心旋转。具体地，所述飞行控制设备可以根据图像集合中包括的图像对所述控制对象中包括的手掌和目标用户进行运动识别，得到所述手掌和目标用户的运动信息，所述运动信息包括所述手掌和目标用户的运动方向，所述图像集合包括所述拍摄装置拍摄到的多张环境图像。如果所述飞行控制设备根据所述运动信息，确定出所述手掌和所述目标用户以所述目标用户为中心旋转，则可以生成旋转控制指令控制所述飞行器参考所述旋转控制指令所指示的方向进行旋转飞行。例如，假设所述飞行控制设备检测到所述目标用户和所述目标用户的手掌以所述目标用户为中心逆时针旋转，则所述飞行控制设备可以生成旋转控制指令控制所述飞行器沿着以所述目标用户为中心逆时针方向旋转。
在一个实施例中,在所述飞行器飞行过程中,如果所述飞行控制设备识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。在一个实施例中,所述降落手势指所述目标用户的手掌正对地面向下移动的手势,或者,所述降落手势也可以为所述目标用户的其他手势,本发明实施例不做具体限定。具体地,在所述飞行器飞行过程中,如果所述飞行控制设备识别出所述目标用户的手掌正对地面向下移动的手势,则可以生成降落控制指令控制所述飞行器降落至目标位置。其中,所述目标位置可以是预先设置的,也可以是根据所述飞行器检测到的所述飞行器与地面的高度来确定的,本发明实施例不做具体限定。如果所述飞行控制设备检测到所述降落手势在所述目标位置处停留的时间大于预设的时间阈值,则可以控制所述飞行器降落至地面。例如,假设所述预设的时间阈值为3s,且根据所述飞行器12检测到的所述飞行器与地面的高度确定出的目标位置为距离地面0.5m,在所述飞行器飞行过程中,如果所述飞行控制设备识别出所述目标用户的手掌正对地面向下移动的手势,则可以生成降落控制指令控制所述飞行器降落至距离地面0.5m的位置处,如果检测到所述目标用户的手掌正对地面向下移动的手势在距离地面0.5m的位置处停留的时间超过3s,则控制所述飞行器降落至地面。
在一个实施例中,在所述飞行器飞行过程中,如果所述飞行控制设备不能识别出所述目标用户的飞行控制手势,而识别出了所述飞行环境图像中目标用户的特征部位,则可以根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。在一个实施例中,所述特征部位是指目标用户的任意身体区域,本发明实施例不做具体限定。在一个实施例中,所述跟随所述目标用户移动是指:调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种跟随所述目标用户移动,以使所述目标用户在所述拍摄装置拍摄的图像中。具体地,在所述飞行器飞行过程中,如果所述飞行控制设备不能识别所述目标用户的飞行控制手势,但却识别出所述飞行环境图像中目标用户的第一身体区域,则可以跟随所述第一身体区域控制所述飞行器以所述目标用户为跟随目标,跟随所述第一身体区域移动,并在跟随所述第一身体区域移动过程中,调整所述飞行器的位置、挂载在所述飞行 器上的云台的姿态、飞行器的姿态中的至少一种,以使所述目标用户在所述拍摄装置拍摄的图像中。
具体可举例说明,在所述飞行器飞行过程中,如果所述飞行控制设备识别不到所述目标用户的手掌所做的手势,且识别到所述目标用户身体躯干所在的身体区域,则所述飞行控制设备可以跟随所述身体躯干所在的身体区域控制所述飞行器以所述目标用户为跟随目标,跟随所述身体躯干所在的身体区域移动,并在跟随所述身体躯干所在的身体区域移动过程中,调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种,以使所述目标用户在所述拍摄装置拍摄的图像中。
在一个实施例中，在所述飞行器飞行过程中，如果所述飞行控制设备不能识别所述目标用户的飞行控制手势，且在检测不到所述目标用户的第一身体区域时，识别出所述目标用户的第二身体区域，则可以控制所述飞行器跟随所述第二身体区域移动。具体地，在所述飞行器飞行过程中，如果所述飞行控制设备不能识别确定所述目标用户的飞行控制手势，且在检测不到所述目标用户的第一身体区域时，识别出所述目标用户的第二身体区域，则所述飞行控制设备可以跟随所述第二身体区域控制所述飞行器以所述目标用户为跟随目标，跟随所述第二身体区域移动，并在跟随所述第二身体区域移动过程中，调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种，以使所述目标用户在所述拍摄装置拍摄的图像中。
具体可举例说明，在所述飞行器飞行过程中，如果所述飞行控制设备识别不到所述目标用户的手掌所做的手势，且在识别不到所述目标用户的身体躯干所在的身体区域时，识别出所述目标用户头部和肩部所在的身体区域，则所述飞行控制设备可以跟随所述头部和肩部所在的身体区域控制所述飞行器以所述目标用户为跟随目标，跟随所述头部和肩部所在的身体区域移动，并在跟随所述头部和肩部所在身体区域移动过程中，调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态中的至少一种，以使所述目标用户在所述拍摄装置拍摄的图像中。
在一个实施例中,所述飞行控制设备在跟随所述目标用户移动的过程中,所述飞行器可以对所述目标用户中包括的特征部位进行识别,得到所述特征部位在所述图像中的图像尺寸信息,并根据所述图像尺寸信息,生成控制指令控 制所述飞行器按照所述控制指令所指示的方向移动。例如,假设所述特征部位为目标用户的身体,如果检测到所述目标用户的身体在往前移动,且所述目标用户的身体的尺寸在变大,则可以控制所述飞行器向远离所述目标用户的方向移动。
在一个实施例中,如果所述飞行控制设备识别出所述控制对象的飞行控制手势为拍照手势,则可以生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。其中,所述拍照手势可以为设置的任意手势,如“O”手势,本发明实施例不做具体限定。例如,假设所述拍照手势为“O”手势,如果所述飞行控制设备识别出所述目标用户的手掌所做的手势为“O”手势,则可以生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
在一个实施例中,如果所述飞行控制设备识别出所述控制对象的飞行控制手势为录像手势,则可以生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频,在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。其中,所述录像手势可以为设置的任意手势,本发明实施例不做限定。例如,假设所述录像手势为“1”手势,如果所述飞行控制设备识别出所述目标用户的手掌所做的手势为“1”手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频,在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述目标用户所做的“1”手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
在一个实施例中，如果所述飞行控制设备识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象的替换控制手势，则以所述替换用户为新的目标用户，并识别所述新的目标用户的控制对象及替换控制手势，根据所述替换控制手势，生成控制指令控制所述飞行器执行所述控制指令对应的动作。其中，所述替换控制手势可以为设置的任意手势，本发明实施例不做限定。例如，如果所述飞行控制设备识别不到所述目标用户的手掌所做的飞行控制手势，且识别出了替换用户正对所述飞行器的拍摄装置所做的替换控制手势为“O”手势，则所述飞行控制设备可以将所述替换用户确定为目标用户，并根据所述替换用户所做的“O”手势，生成拍照控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
本发明实施例中,所述飞行控制设备通过在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像,并对所述飞行环境图像中目标用户的控制对象进行手势识别,确定出飞行控制手势,从而根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。通过这种方式,实现通过手势识别控制所述飞行器在飞行过程中执行所述手势所指示的动作,简化了对飞行器进行控制的操作步骤,可较为快捷地对飞行器进行控制,提高了控制飞行器的效率。
请参见图5,图5是本发明实施例提供的一种飞行控制设备的结构示意图。具体的,所述飞行控制设备包括:存储器501、处理器502以及数据接口503。
所述存储器501可以包括易失性存储器(volatile memory);存储器501也可以包括非易失性存储器(non-volatile memory);存储器501还可以包括上述种类的存储器的组合。所述处理器502可以是中央处理器(central processing unit,CPU)。所述处理器502还可以进一步包括硬件芯片。上述硬件芯片可以是专用集成电路(application-specific integrated circuit,ASIC),可编程逻辑器件(programmable logic device,PLD)或其组合。具体例如可以是复杂可编程逻辑器件(complex programmable logic device,CPLD),现场可编程逻辑门阵列(field-programmable gate array,FPGA)或其任意组合。
进一步地,所述存储器501用于存储程序指令,当程序指令被执行时所述处理器502可以调用存储器501中存储的程序指令,用于执行如下步骤:
获取拍摄装置拍摄得到的环境图像;
根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;
根据所述控制对象生成控制指令控制所述飞行器飞行。
所述处理器502调用存储器501中存储的程序指令用于执行如下步骤:
识别所述控制对象的动作特征,根据所述控制对象的动作特征获取控制指令;
按照所述控制指令控制所述飞行器飞行。
进一步地,所述控制对象包括所述目标用户的手掌。
所述处理器502调用存储器501中存储的程序指令用于执行如下步骤:
如果所述目标用户的状态参数满足预设的第一条件,则确定所述目标用户 的特征部位为第一特征部位;
根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
进一步地,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值;或者,
所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。
进一步地,所述第一特征部位为所述目标用户的人体。
所述处理器502调用存储器501中存储的程序指令用于执行如下步骤:
如果所述目标用户的状态参数满足预设的第二条件,则确定所述目标用户的特征部位为第二特征部位;
根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
进一步地,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第二条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值;或者,
所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。
进一步地,所述第二特征部位包括所述目标用户的头部;或者,所述第二特征部位包括所述目标用户的头部和肩部。
所述处理器502调用存储器501中存储的程序指令用于执行如下步骤:
在所述目标图像区域中识别出至少一个控制对象;
根据所述目标用户的特征部位,确定所述目标用户的关节点;
根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
所述处理器502调用存储器501中存储的程序指令用于执行如下步骤:
从确定的关节点中确定出目标关节点;
将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。
本发明实施例中,飞行控制设备通过获取拍摄装置拍摄得到的环境图像,根据从所述环境图像确定出的目标用户的特征部位确定出目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象,以根据所述控制对象生成控制指令控制所述飞行器飞行。通过这种方式,识别出所述目标用户的控制对象,实现通过识别所述控制对象的动作特征来控制飞行器的飞行,以简化操作流程,可较为快捷地对飞行器进行控制,提高飞行控制的效率。
请参见图6,图6是本发明实施例提供的另一种飞行控制设备的结构示意图。具体的,所述飞行控制设备包括:存储器601、处理器602以及数据接口603。
所述存储器601可以包括易失性存储器(volatile memory);存储器601也可以包括非易失性存储器(non-volatile memory);存储器601还可以包括上述种类的存储器的组合。所述处理器602可以是中央处理器(central processing unit,CPU)。所述处理器602还可以进一步包括硬件芯片。上述硬件芯片可以是专用集成电路(application-specific integrated circuit,ASIC),可编程逻辑器件(programmable logic device,PLD)或其组合。上述PLD可以是复杂可编程逻辑器件(complex programmable logic device,CPLD),现场可编程逻辑门阵列(field-programmable gate array,FPGA)或其任意组合。
进一步地,所述存储器601用于存储程序指令,当程序指令被执行时所述处理器602可以调用存储器601中存储的程序指令,用于执行如下步骤:
如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;
对所述环境图像中目标用户的控制对象进行手势识别;
如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
进一步地,所述触发操作包括:对所述飞行器电源键的点击操作、对所述飞行器电源键双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作中的任意一种或多种。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内扫描拍摄;
获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
所述处理器602调用存储器601中存储的程序指令还用于执行如下步骤:
在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像;
对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势;
根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为高度控制手势,则生成高度控制指令控制所述飞行器调整所述飞行器的高度。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为移动控制手势,则生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行;
其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为拖动控制手势,则生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为旋转控制手势,则生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果不能识别确定飞行控制手势,且识别出所述飞行环境图像中目标用户的特征部位;
根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。
进一步地,所述跟随所述目标用户移动是指:调整拍摄状态,在调整后的拍摄状态下所述目标用户位于所述拍摄装置拍摄的图像中,调整拍摄状态包括调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态的任意一种或多种。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为拍照手势,则生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为录像手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频;
在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
所述处理器602调用存储器601中存储的程序指令用于执行如下步骤:
如果识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象发出的替换控制手势,则将所述替换用户确定为新的目标用户;
识别所述新的目标用户的控制对象及替换控制手势,并根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
本发明实施例中,所述飞行控制设备通过在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像,并对所述飞行环境图像中目标用户的控制对象进行手势识别,确定出飞行控制手势,从而根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。通过这种方式,实现通过手势识别控制所述飞行器在飞行过程中执行所述手势所指示的 动作,简化了对飞行器进行控制的操作步骤,可较为快捷地对飞行器进行控制,提高了控制飞行器的效率。
本发明实施例还提供了一种飞行器,包括:机身;设置在机身上的动力系统,用于提供飞行动力;处理器,用于获取拍摄装置拍摄得到的环境图像;根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;根据所述控制对象生成控制指令控制所述飞行器飞行。
进一步地,所述处理器用于执行如下步骤:
识别所述控制对象的动作特征,根据所述控制对象的动作特征获取控制指令;
按照所述控制指令控制所述飞行器飞行。
进一步地,所述控制对象包括所述目标用户的手掌。
进一步地,所述处理器用于执行如下步骤:
如果所述目标用户的状态参数满足预设的第一条件,则确定所述目标用户的特征部位为第一特征部位;
根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
进一步地,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值;或者,
所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。
进一步地,所述第一特征部位为所述目标用户的人体。
进一步地,所述处理器用于执行如下步骤:
如果所述目标用户的状态参数满足预设的第二条件,则确定所述目标用户的特征部位为第二特征部位;
根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
进一步地,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第二条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值;或者,
所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。
进一步地,所述第二特征部位包括所述目标用户的头部;或者,所述第二特征部位包括所述目标用户的头部和肩部。
进一步地,所述处理器用于执行如下步骤:
在所述目标图像区域中识别出至少一个控制对象;
根据所述目标用户的特征部位,确定所述目标用户的关节点;
根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
进一步地,所述处理器用于执行如下步骤:
从确定的关节点中确定出目标关节点;
将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。
所述飞行器中处理器的具体实现可参考上述图2所对应实施例的飞行控制方法，在此不再赘述。其中，所述飞行器可以是四旋翼无人机、六旋翼无人机、多旋翼无人机等类型的飞行器。所述动力系统可以包括电机、电调、螺旋桨等结构，其中，电机负责带动飞行器螺旋桨，电调负责控制飞行器的电机的转速。
本发明实施例还提供了另一种飞行器,包括:机身;设置在机身上的动力系统,用于提供飞行动力;处理器,用于如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;对所述环境图像中目标用户的控制对象进行手势识别;如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
进一步地,所述触发操作包括:对所述飞行器电源键的点击操作、对所述飞行器电源键双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作中的任意一种或多种。
进一步地,所述处理器用于执行如下步骤:
在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内扫描拍摄;
获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
进一步地,所述处理器用于执行如下步骤:
在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像;
对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势;
根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为高度控制手势,则生成高度控制指令控制所述飞行器调整所述飞行器的高度。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为移动控制手势,则生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行;
其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为拖动控制手势,则生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为旋转控制手势,则生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。
进一步地,所述处理器用于执行如下步骤:
如果不能识别确定飞行控制手势,且识别出所述飞行环境图像中目标用 户的特征部位;
根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。
进一步地,所述跟随所述目标用户移动是指:调整拍摄状态,在调整后的拍摄状态下所述目标用户位于所述拍摄装置拍摄的图像中,调整拍摄状态包括调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态的任意一种或多种。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为拍照手势,则生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
进一步地,所述处理器用于执行如下步骤:
如果识别出所述控制对象的飞行控制手势为录像手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频;
在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
进一步地,所述处理器用于执行如下步骤:
如果识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象发出的替换控制手势,则将所述替换用户确定为新的目标用户;
识别所述新的目标用户的控制对象及替换控制手势,并根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
所述飞行器中处理器的具体实现可参考上述图3或图4所对应实施例的飞行控制方法，在此不再赘述。其中，所述飞行器的解释如上所述，此处不再赘述。
本发明实施例还提供了一种飞行控制系统,包括:飞行控制设备和飞行器;
所述飞行器,用于控制挂载在所述飞行器上的拍摄装置拍摄得到环境图像,并将所述环境图像发送给所述飞行控制设备;
所述飞行控制设备,用于获取拍摄装置拍摄得到的环境图像;根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;根据所述控制对象生成控制指令控制所述飞行器飞行;
所述飞行器,还用于响应所述飞行控制指令,控制所述飞行器飞行并执行所述飞行控制指令对应的动作。
进一步地,所述飞行控制设备,用于识别所述控制对象的动作特征,根据所述控制对象的动作特征获取控制指令;按照所述控制指令控制所述飞行器飞行。
进一步地,所述飞行控制设备,用于如果所述目标用户的状态参数满足预设的第一条件,则确定所述目标用户的特征部位为第一特征部位;根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
进一步地,所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值;或者,所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。
进一步地,所述第一特征部位为所述目标用户的人体。
进一步地,所述飞行控制设备,用于如果所述目标用户的状态参数满足预设的第二条件,则确定所述目标用户的特征部位为第二特征部位;根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
进一步地，所述目标用户的状态参数包括：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数，所述目标用户的状态参数满足预设的第二条件是指：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值；或者，所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。
进一步地,所述第二特征部位包括所述目标用户的头部;或者,所述第二特征部位包括所述目标用户的头部和肩部。
进一步地,所述飞行控制设备,用于在所述目标图像区域中识别出至少一个控制对象;根据所述目标用户的特征部位,确定所述目标用户的关节点;根 据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
进一步地,所述飞行控制设备,用于从确定的关节点中确定出目标关节点;将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。
本发明实施例中,飞行控制设备通过获取拍摄装置拍摄得到的环境图像,根据从所述环境图像确定出的目标用户的特征部位确定出目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象,以根据所述控制对象生成控制指令控制所述飞行器飞行。通过这种方式,识别出所述目标用户的控制对象,实现通过识别所述控制对象的动作特征来控制飞行器的飞行,以简化操作流程,提高飞行控制的效率。
本发明实施例还提供了另一种飞行控制系统,包括:飞行控制设备和飞行器;
所述飞行控制设备,用于如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;对所述环境图像中目标用户的控制对象进行手势识别;如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞;
所述飞行器,用于响应所述起飞控制指令控制所述飞行器起飞。
进一步地,所述触发操作包括:对所述飞行器电源键的点击操作、对所述飞行器电源键双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作中的任意一种或多种。
进一步地,所述飞行控制设备,用于在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内扫描拍摄;获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
进一步地,所述飞行控制设备,还用于在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像;对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势;根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制 手势为高度控制手势,则生成高度控制指令控制所述飞行器调整所述飞行器的高度。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制手势为移动控制手势,则生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行;其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制手势为拖动控制手势,则生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制手势为旋转控制手势,则生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。
进一步地,所述飞行控制设备,用于如果不能识别确定飞行控制手势,且识别出所述飞行环境图像中目标用户的特征部位;根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。
进一步地，所述跟随所述目标用户移动是指：调整拍摄状态，在调整后的拍摄状态下所述目标用户位于所述拍摄装置拍摄的图像中，调整拍摄状态包括调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、所述飞行器的姿态中的任意一种或多种。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制手势为拍照手势,则生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
进一步地,所述飞行控制设备,用于如果识别出所述控制对象的飞行控制手势为录像手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频;在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
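录像手势的"再次识别即停止"行为本质上是一个两态开关，可用如下示意代码表达（假设性实现，类名与返回的指令名均为示例假设，并非专利限定）：

```python
class RecordingToggle:
    """录像手势的启/停切换示意：第一次识别到录像手势时开始录像，
    录像过程中再次识别到同一手势则停止录像。"""

    def __init__(self):
        self.recording = False

    def on_record_gesture(self) -> str:
        # 每次识别到录像手势即翻转录像状态，并给出对应的控制指令
        self.recording = not self.recording
        return "start_video" if self.recording else "stop_video"
```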
进一步地，所述飞行控制设备，用于如果识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象发出的替换控制手势，则将所述替换用户确定为新的目标用户；识别所述新的目标用户的控制对象及替换控制手势，并根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
本发明实施例中,所述飞行控制设备通过在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像,并对所述飞行环境图像中目标用户的控制对象进行手势识别,确定出飞行控制手势,从而根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。通过这种方式,实现通过手势识别控制所述飞行器在飞行过程中执行所述手势所指示的动作,简化了对飞行器进行控制的操作步骤,可较为快捷地对飞行器进行控制,提高了控制飞行器的效率。
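上述由手势识别到控制指令的分发流程可概括为如下示意代码（假设性实现：手势名、指令名以及"无手势且无目标时悬停"的缺省策略均为示例假设，并非专利原文所限定）：

```python
# 飞行控制手势到控制指令的映射（示例假设值）
GESTURE_COMMANDS = {
    "altitude": "adjust_height",   # 高度控制手势
    "move": "move",                # 移动控制手势
    "drag": "drag_horizontal",     # 拖动控制手势
    "rotate": "rotate",            # 旋转控制手势
    "land": "land",                # 降落手势
    "photo": "take_photo",         # 拍照手势
}

def dispatch(gesture, target_visible: bool) -> str:
    """根据识别出的飞行控制手势生成对应的控制指令；
    识别不到手势但识别出目标用户特征部位时进入跟随模式。"""
    if gesture in GESTURE_COMMANDS:
        return GESTURE_COMMANDS[gesture]
    # 不能识别确定飞行控制手势、且识别出目标用户特征部位：跟随目标移动
    return "follow_target" if target_visible else "hover"
```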
在本发明的实施例中还提供了一种计算机可读存储介质，所述计算机可读存储介质存储有计算机程序，所述计算机程序被处理器执行时实现本发明图1a、图2、图3或图4所对应实施例中描述的飞行控制方法，也可实现图5或图6所对应实施例中所述的飞行控制设备，在此不再赘述。
所述计算机可读存储介质可以是前述任一实施例所述的设备的内部存储单元，例如设备的硬盘或内存。所述计算机可读存储介质也可以是所述设备的外部存储设备，例如所述设备上配备的插接式硬盘，智能存储卡（Smart Media Card，SMC），安全数字（Secure Digital，SD）卡，闪存卡（Flash Card）等。进一步地，所述计算机可读存储介质还可以既包括所述设备的内部存储单元也包括外部存储设备。所述计算机可读存储介质用于存储所述计算机程序以及所述设备所需的其他程序和数据。所述计算机可读存储介质还可以用于暂时地存储已经输出或者将要输出的数据。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(Random Access Memory,RAM)等。
以上所揭露的仅为本发明部分实施例而已,当然不能以此来限定本发明之权利范围,因此依本发明权利要求所作的等同变化,仍属本发明所涵盖的范围。

Claims (80)

  1. 一种飞行控制方法,其特征在于,应用于飞行器,所述飞行器上挂载有拍摄装置,所述方法包括:
    获取所述拍摄装置拍摄得到的环境图像;
    根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;
    根据所述控制对象生成控制指令控制所述飞行器飞行。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述控制对象生成控制指令控制所述飞行器飞行,包括:
    识别所述控制对象的动作特征,根据所述控制对象的动作特征获取控制指令;
    按照所述控制指令控制所述飞行器飞行。
  3. 根据权利要求1所述的方法,其特征在于,
    所述控制对象包括所述目标用户的手掌。
  4. 根据权利要求1所述的方法,其特征在于,所述根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象,包括:
    如果所述目标用户的状态参数满足预设的第一条件,则确定所述目标用户的特征部位为第一特征部位;
    根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
  5. 根据权利要求4所述的方法,其特征在于,
    所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设 第一占比阈值;或者,
    所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。
  6. 根据权利要求4所述的方法,其特征在于,
    所述第一特征部位为所述目标用户的人体。
  7. 根据权利要求1所述的方法,其特征在于,所述根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象,包括:
    如果所述目标用户的状态参数满足预设的第二条件,则确定所述目标用户的特征部位为第二特征部位;
    根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
  8. 根据权利要求7所述的方法,其特征在于,
    所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第二条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值;或者,
    所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第二条件是指:所述目标用户与所述飞行器的距离小于或等于预设第二距离。
  9. 根据权利要求8所述的方法,其特征在于,
    所述第二特征部位包括所述目标用户的头部;
    或者,所述第二特征部位包括所述目标用户的头部和肩部。
  10. 根据权利要求1-9任一项所述的方法，其特征在于，所述在所述目标图像区域中识别出所述目标用户的控制对象，包括：
    在所述目标图像区域中识别出至少一个控制对象;
    根据所述目标用户的特征部位,确定所述目标用户的关节点;
    根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
  11. 根据权利要求10所述的方法,其特征在于,所述根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象,包括:
    从确定的关节点中确定出目标关节点;
    将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。
  12. 一种飞行控制方法,其特征在于,应用于飞行器,所述飞行器上挂载有拍摄装置,所述方法包括:
    如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取所述拍摄装置拍摄得到的环境图像;
    对所述环境图像中目标用户的控制对象进行手势识别;
    如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
  13. 根据权利要求12所述的方法,其特征在于,
    所述触发操作包括:对所述飞行器电源键的点击操作、对所述飞行器电源键双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作中的任意一种或多种。
  14. 根据权利要求12所述的方法,其特征在于,所述获取所述拍摄装置拍摄得到的环境图像,包括:
    在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内进行扫描拍摄;
    获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
  15. 根据权利要求12所述的方法,其特征在于,还包括:
    在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像;
    对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势;
    根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
  16. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别出所述控制对象的飞行控制手势为高度控制手势,则生成高度控制指令控制所述飞行器调整所述飞行器的高度。
  17. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别出所述控制对象的飞行控制手势为移动控制手势,则生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行;
    其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。
  18. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别出所述控制对象的飞行控制手势为拖动控制手势,则生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。
  19. 根据权利要求15所述的方法，其特征在于，所述根据识别出的所述飞行控制手势，生成控制指令控制所述飞行器执行所述控制指令对应的动作，包括：
    如果识别出所述控制对象的飞行控制手势为旋转控制手势,则生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。
  20. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。
  21. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果不能识别确定飞行控制手势,且识别出所述飞行环境图像中目标用户的特征部位;
    根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。
  22. 根据权利要求21所述的方法,其特征在于,
    所述跟随所述目标用户移动是指:调整拍摄状态,在调整后的拍摄状态下所述目标用户位于所述拍摄装置拍摄的图像中,调整拍摄状态包括调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态的任意一种或多种。
  23. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别出所述控制对象的飞行控制手势为拍照手势,则生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
  24. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别出所述控制对象的飞行控制手势为录像手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频;
    在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
  25. 根据权利要求15所述的方法,其特征在于,所述根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作,包括:
    如果识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象发出的替换控制手势,则将所述替换用户确定为新的目标用户;
    识别所述新的目标用户的控制对象及替换控制手势,并根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
  26. 一种飞行控制设备,其特征在于,应用于飞行器,所述飞行器上挂载有拍摄装置,所述设备包括:处理器和存储器;
    所述存储器,用于存储程序指令;
    所述处理器,执行所述存储器存储的程序指令,当程序指令被执行时,所述处理器用于执行如下步骤:
    获取所述拍摄装置拍摄得到的环境图像;
    根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;
    根据所述控制对象生成控制指令控制所述飞行器飞行。
  27. 根据权利要求26所述的设备,其特征在于,
    所述处理器，用于：识别所述控制对象的动作特征，根据所述控制对象的动作特征获取控制指令；按照所述控制指令控制所述飞行器飞行。
  28. 根据权利要求26所述的设备,其特征在于,
    所述控制对象包括所述目标用户的手掌。
  29. 根据权利要求26所述的设备,其特征在于,
    所述处理器,用于:如果所述目标用户的状态参数满足预设的第一条件,则确定所述目标用户的特征部位为第一特征部位;根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
  30. 根据权利要求29所述的设备,其特征在于,
    所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第一条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值;或者,
    所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。
  31. 根据权利要求29所述的设备,其特征在于,
    所述第一特征部位为所述目标用户的人体。
  32. 根据权利要求26所述的设备,其特征在于,
    所述处理器,用于:如果所述目标用户的状态参数满足预设的第二条件,则确定所述目标用户的特征部位为第二特征部位;根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
  33. 根据权利要求32所述的设备,其特征在于,
    所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第二条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值;或者,
    所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。
  34. 根据权利要求33所述的设备,其特征在于,
    所述第二特征部位包括所述目标用户的头部;
    或者,所述第二特征部位包括所述目标用户的头部和肩部。
  35. 根据权利要求26-34任一项所述的设备,其特征在于,
    所述处理器,用于:在所述目标图像区域中识别出至少一个控制对象;根据所述目标用户的特征部位,确定所述目标用户的关节点;根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
  36. 根据权利要求35所述的设备,其特征在于,
    所述处理器,用于:从确定的关节点中确定出目标关节点;将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。
  37. 一种飞行控制设备，其特征在于，应用于飞行器，所述飞行器上挂载有拍摄装置，所述设备包括：处理器和存储器；
    所述存储器,用于存储程序指令;
    所述处理器,执行所述存储器存储的程序指令,当程序指令被执行时,所述处理器用于执行如下步骤:
    如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取所述拍摄装置拍摄得到的环境图像;
    对所述环境图像中目标用户的控制对象进行手势识别;
    如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞。
  38. 根据权利要求37所述的设备,其特征在于,
    所述触发操作包括:对所述飞行器电源键的点击操作、对所述飞行器电源键双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作中的任意一种或多种。
  39. 根据权利要求37所述的设备,其特征在于,
    所述处理器,用于:在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内扫描拍摄;获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
  40. 根据权利要求37所述的设备,其特征在于,
    所述处理器,还用于:在所述飞行器飞行过程中,控制所述拍摄装置拍摄获取飞行环境图像;对所述飞行环境图像中目标用户的控制对象进行手势识别,确定飞行控制手势;根据识别出的所述飞行控制手势,生成控制指令控制所述飞行器执行所述控制指令对应的动作。
  41. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为高度控制手势,则生成高度控制指令控制所述飞行器调整所述飞行器的高度。
  42. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为移动控制手势,则生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行;其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。
  43. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为拖动控制手势,则生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。
  44. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为旋转控制手势,则生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。
  45. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。
  46. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果不能识别确定飞行控制手势,且识别出所述飞行环境图像中目标用户的特征部位;根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标,跟随所述目标用户移动。
  47. 根据权利要求46所述的设备,其特征在于,
    所述跟随所述目标用户移动是指:调整拍摄状态,在调整后的拍摄状态下所述目标用户位于所述拍摄装置拍摄的图像中,调整拍摄状态包括调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态的任意一种或多种。
  48. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为拍照手势,则生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
  49. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别出所述控制对象的飞行控制手势为录像手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频;在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
  50. 根据权利要求40所述的设备,其特征在于,
    所述处理器,用于:如果识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象发出的替换控制手势,则将所述替换用户确定为新的目标用户;识别所述新的目标用户的控制对象及替换控制手势,并根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
  51. 一种飞行器,其特征在于,包括:
    机身;
    设置在机身上的动力系统,用于:提供飞行动力;
    处理器,用于:获取拍摄装置拍摄得到的环境图像;根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;根据所述控制对象生成控制指令控制所述飞行器飞行。
  52. 根据权利要求51所述的飞行器,其特征在于,
    所述处理器,用于:执行上述权利要求1-11任一项所述方法。
  53. 一种飞行器,其特征在于,包括:
    机身;
    设置在机身上的动力系统,用于:提供飞行动力;
    处理器，用于：如果获取到触发所述飞行器进入图像控制模式的触发操作，则获取拍摄装置拍摄得到的环境图像；对所述环境图像中目标用户的控制对象进行手势识别；如果识别出所述控制对象的手势为启动飞行手势，则生成起飞控制指令控制所述飞行器起飞。
  54. 根据权利要求53所述的飞行器,其特征在于,
    所述处理器,用于:执行上述权利要求12-25任一项所述方法。
  55. 一种飞行控制系统,其特征在于,包括:飞行控制设备和飞行器;
    所述飞行器,用于:控制挂载在所述飞行器上的拍摄装置拍摄得到环境图像,并将所述环境图像发送给所述飞行控制设备;
    所述飞行控制设备,用于:获取拍摄装置拍摄得到的环境图像;根据所述环境图像确定出目标用户的特征部位,并根据所述特征部位确定出目标图像区域,在所述目标图像区域中识别出所述目标用户的控制对象;根据所述控制对象生成控制指令控制所述飞行器飞行;
    所述飞行器，还用于：响应所述控制指令，控制所述飞行器飞行并执行所述控制指令对应的动作。
  56. 根据权利要求55所述的系统,其特征在于,
    所述飞行控制设备,用于:识别所述控制对象的动作特征,根据所述控制对象的动作特征获取控制指令;按照所述控制指令控制所述飞行器飞行。
  57. 根据权利要求55所述的系统,其特征在于,
    所述控制对象包括所述目标用户的手掌。
  58. 根据权利要求55所述的系统,其特征在于,
    所述飞行控制设备,用于:如果所述目标用户的状态参数满足预设的第一条件,则确定所述目标用户的特征部位为第一特征部位;根据所述目标用户的第一特征部位确定出所述第一特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
  59. 根据权利要求58所述的系统,其特征在于,
    所述目标用户的状态参数包括：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数，所述目标用户的状态参数满足预设的第一条件是指：所述环境图像中所述目标用户所在的图像区域的尺寸占比参数小于或等于预设第一占比阈值；或者，
    所述目标用户的状态参数包括:所述目标用户与所述飞行器的距离参数;所述目标用户的状态参数满足预设的第一条件是指:所述目标用户与所述飞行器的距离大于或等于预设第一距离。
  60. 根据权利要求58所述的系统,其特征在于,
    所述第一特征部位为所述目标用户的人体。
  61. 根据权利要求55所述的系统,其特征在于,
    所述飞行控制设备,用于:如果所述目标用户的状态参数满足预设的第二条件,则确定所述目标用户的特征部位为第二特征部位;根据所述目标用户的第二特征部位确定出所述第二特征部位所在的目标图像区域,并在所述目标图像区域中识别出所述目标用户的控制对象。
  62. 根据权利要求61所述的系统,其特征在于,
    所述目标用户的状态参数包括:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数,所述目标用户的状态参数满足预设的第二条件是指:所述环境图像中所述目标用户所在的图像区域的尺寸占比参数大于或等于预设第二占比阈值;或者,
    所述目标用户的状态参数包括：所述目标用户与所述飞行器的距离参数；所述目标用户的状态参数满足预设的第二条件是指：所述目标用户与所述飞行器的距离小于或等于预设第二距离。
  63. 根据权利要求62所述的系统,其特征在于,
    所述第二特征部位包括所述目标用户的头部;
    或者,所述第二特征部位包括所述目标用户的头部和肩部。
  64. 根据权利要求55-63任一项所述的系统,其特征在于,
    所述飞行控制设备,用于:在所述目标图像区域中识别出至少一个控制对象;根据所述目标用户的特征部位,确定所述目标用户的关节点;根据确定的关节点,从所述至少一个控制对象中确定出所述目标用户的控制对象。
  65. 根据权利要求64所述的系统,其特征在于,
    所述飞行控制设备,用于:从确定的关节点中确定出目标关节点;将所述至少一个控制对象中与所述目标关节点距离最近的控制对象确定为所述目标用户的控制对象。
  66. 一种飞行控制系统,其特征在于,包括:飞行控制设备和飞行器;
    所述飞行控制设备,用于:如果获取到触发所述飞行器进入图像控制模式的触发操作,则获取拍摄装置拍摄得到的环境图像;对所述环境图像中目标用户的控制对象进行手势识别;如果识别出所述控制对象的手势为启动飞行手势,则生成起飞控制指令控制所述飞行器起飞;
    所述飞行器,用于:响应所述起飞控制指令控制所述飞行器起飞。
  67. 根据权利要求66所述的系统,其特征在于,
    所述触发操作包括:对所述飞行器电源键的点击操作、对所述飞行器电源键双击操作、对所述飞行器的摇晃操作、语音输入操作、指纹输入操作中的任意一种或多种。
  68. 根据权利要求66所述的系统,其特征在于,
    所述飞行控制设备,用于:在获取到所述触发操作后,控制挂载在所述飞行器上的云台转动,以控制所述拍摄装置在预设的拍摄范围内扫描拍摄;获取所述拍摄装置在所述预设的拍摄范围内扫描拍摄得到的包括所述目标用户的特征部位的环境图像。
  69. 根据权利要求66所述的系统,其特征在于,
    所述飞行控制设备，还用于：在所述飞行器飞行过程中，控制所述拍摄装置拍摄获取飞行环境图像；对所述飞行环境图像中目标用户的控制对象进行手势识别，确定飞行控制手势；根据识别出的所述飞行控制手势，生成控制指令控制所述飞行器执行所述控制指令对应的动作。
  70. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为高度控制手势,则生成高度控制指令控制所述飞行器调整所述飞行器的高度。
  71. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为移动控制手势,则生成移动控制指令控制所述飞行器向所述移动控制指令所指示的方向飞行;其中,所述移动控制指令所指示的方向包括:远离所述控制对象的方向或靠近所述控制对象的方向。
  72. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为拖动控制手势,则生成拖动控制指令控制所述飞行器沿所述拖动控制指令所指示的水平方向飞行。
  73. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为旋转控制手势,则生成旋转控制指令控制所述飞行器沿所述旋转控制指令所指示的方向旋转飞行。
  74. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为降落手势,则生成降落控制指令控制所述飞行器降落。
  75. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备，用于：如果不能识别确定飞行控制手势，且识别出所述飞行环境图像中目标用户的特征部位；根据所述目标用户的特征部位控制所述飞行器以所述目标用户为跟随目标，跟随所述目标用户移动。
  76. 根据权利要求75所述的系统,其特征在于,
    所述跟随所述目标用户移动是指:调整拍摄状态,在调整后的拍摄状态下所述目标用户位于所述拍摄装置拍摄的图像中,调整拍摄状态包括调整所述飞行器的位置、挂载在所述飞行器上的云台的姿态、飞行器的姿态的任意一种或多种。
  77. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为拍照手势,则生成拍摄控制指令控制所述飞行器的拍摄装置拍摄得到目标图像。
  78. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别出所述控制对象的飞行控制手势为录像手势,则生成录像控制指令控制所述飞行器的拍摄装置拍摄得到视频;在所述飞行器的拍摄装置拍摄视频的过程中,如果再次识别到所述控制对象的录像手势,则生成结束控制指令控制所述飞行器的拍摄装置停止拍摄所述视频。
  79. 根据权利要求69所述的系统,其特征在于,
    所述飞行控制设备,用于:如果识别不到所述目标用户的控制对象的飞行控制手势、且识别出替换用户的控制对象发出的替换控制手势,则将所述替换用户确定为新的目标用户;识别所述新的目标用户的控制对象及替换控制手势,并根据所述替换控制手势生成控制指令控制所述飞行器执行所述控制指令对应的动作。
  80. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至25任一项所述方法。
PCT/CN2018/073877 2018-01-23 2018-01-23 一种飞行控制方法、设备、飞行器、系统及存储介质 WO2019144295A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/CN2018/073877 WO2019144295A1 (zh) 2018-01-23 2018-01-23 一种飞行控制方法、设备、飞行器、系统及存储介质
CN201880002091.9A CN109196438A (zh) 2018-01-23 2018-01-23 一种飞行控制方法、设备、飞行器、系统及存储介质
US16/935,680 US20200348663A1 (en) 2018-01-23 2020-07-22 Flight control method, device, aircraft, system, and storage medium
US18/316,399 US20230280745A1 (en) 2018-01-23 2023-05-12 Flight control method, device, aircraft, system, and storage medium


Publications (1)

Publication Number Publication Date
WO2019144295A1 true WO2019144295A1 (zh) 2019-08-01

Family

ID=64938216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/073877 WO2019144295A1 (zh) 2018-01-23 2018-01-23 一种飞行控制方法、设备、飞行器、系统及存储介质

Country Status (3)

Country Link
US (2) US20200348663A1 (zh)
CN (1) CN109196438A (zh)
WO (1) WO2019144295A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111343330A (zh) * 2019-03-29 2020-06-26 阿里巴巴集团控股有限公司 一种智能手机
US11106223B2 (en) * 2019-05-09 2021-08-31 GEOSAT Aerospace & Technology Apparatus and methods for landing unmanned aerial vehicle
CN112154652A (zh) * 2019-08-13 2020-12-29 深圳市大疆创新科技有限公司 手持云台的控制方法、控制装置、手持云台及存储介质
CN110650287A (zh) * 2019-09-05 2020-01-03 深圳市道通智能航空技术有限公司 一种拍摄控制方法、装置、飞行器及飞行系统
WO2021072766A1 (zh) * 2019-10-18 2021-04-22 深圳市大疆创新科技有限公司 飞行控制方法、系统、无人飞行器及存储介质
WO2021109068A1 (zh) * 2019-12-05 2021-06-10 深圳市大疆创新科技有限公司 手势控制方法及可移动平台

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808799A (zh) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 一种能够识别手势的无人机及其识别方法
WO2015200209A1 (en) * 2014-06-23 2015-12-30 Nixie Labs, Inc. Wearable unmanned aerial vehicles, launch- controlled unmanned aerial vehicles, and associated systems and methods
CN105892474A (zh) * 2016-03-31 2016-08-24 深圳奥比中光科技有限公司 无人机以及无人机控制方法
CN106200657A (zh) * 2016-07-09 2016-12-07 东莞市华睿电子科技有限公司 一种无人机控制方法
CN106774947A (zh) * 2017-02-08 2017-05-31 亿航智能设备(广州)有限公司 一种飞行器及其控制方法
CN106774945A (zh) * 2017-01-24 2017-05-31 腾讯科技(深圳)有限公司 一种飞行器飞行控制方法、装置、飞行器及系统
CN106980372A (zh) * 2017-03-24 2017-07-25 普宙飞行器科技(深圳)有限公司 一种无需地面操控终端的无人机操控方法及系统
CN107390713A (zh) * 2016-04-27 2017-11-24 阿特拉斯动力公司 基于手势的无人机控制

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7983450B2 (en) * 2009-03-16 2011-07-19 The Boeing Company Method, apparatus and computer program product for recognizing a gesture
US10026165B1 (en) * 2011-07-05 2018-07-17 Bernard Fryshman Object image recognition and instant active response
CN102662464A (zh) * 2012-03-26 2012-09-12 华南理工大学 一种手势漫游控制系统的手势控制方法
TW201339903A (zh) * 2012-03-26 2013-10-01 Hon Hai Prec Ind Co Ltd 無人飛行載具控制系統及方法
US20140173524A1 (en) * 2012-12-14 2014-06-19 Microsoft Corporation Target and press natural user input
US8930044B1 (en) * 2012-12-28 2015-01-06 Google Inc. Multi-part navigation process by an unmanned aerial vehicle for navigating to a medical situatiion
EP3014407A4 (en) * 2013-06-28 2017-08-02 Chia Ming Chen Controlling device operation according to hand gestures
CN103426282A (zh) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 遥控方法及终端
US9531784B2 (en) * 2013-12-17 2016-12-27 International Business Machines Corporation Identity service management in limited connectivity environments
CN104317385A (zh) * 2014-06-26 2015-01-28 青岛海信电器股份有限公司 一种手势识别方法和系统
CN105373215B (zh) * 2014-08-25 2018-01-30 中国人民解放军理工大学 基于手势编码与译码的动态无线手势识别方法
CN105807926B (zh) * 2016-03-08 2019-06-21 中山大学 一种基于三维连续动态手势识别的无人机人机交互方法
CN105867362A (zh) * 2016-04-20 2016-08-17 北京博瑞爱飞科技发展有限公司 终端设备和无人驾驶飞行器的控制系统
CN106227231A (zh) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 无人机的控制方法、体感交互装置以及无人机
CN106020227B (zh) * 2016-08-12 2019-02-26 北京奇虎科技有限公司 无人机的控制方法、装置
CN106650606A (zh) * 2016-10-21 2017-05-10 江苏理工学院 人脸图像的匹配及处理方法、人脸图像模型构建系统
CN106682091A (zh) * 2016-11-29 2017-05-17 深圳市元征科技股份有限公司 一种无人机控制方法及装置
CN110119154A (zh) * 2016-11-30 2019-08-13 深圳市大疆创新科技有限公司 飞行器的控制方法、装置和设备以及飞行器
CN106682585A (zh) * 2016-12-02 2017-05-17 南京理工大学 一种基于kinect2的动态手势识别方法
WO2018195979A1 (zh) * 2017-04-28 2018-11-01 深圳市大疆创新科技有限公司 一种跟踪控制方法、装置及飞行器
CN107357427A (zh) * 2017-07-03 2017-11-17 南京江南博睿高新技术研究院有限公司 一种用于虚拟现实设备的手势识别控制方法


Also Published As

Publication number Publication date
CN109196438A (zh) 2019-01-11
US20230280745A1 (en) 2023-09-07
US20200348663A1 (en) 2020-11-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902448

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18902448

Country of ref document: EP

Kind code of ref document: A1