WO2018176426A1 - Flight control method for unmanned aerial vehicle, and unmanned aerial vehicle - Google Patents

Flight control method for unmanned aerial vehicle, and unmanned aerial vehicle

Info

Publication number
WO2018176426A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
wearable device
target object
information
position information
Prior art date
Application number
PCT/CN2017/079134
Other languages
English (en)
French (fr)
Inventor
邬奇峰
钱杰
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to CN202210149842.8A
Priority to PCT/CN2017/079134
Priority to CN201780054731.6A
Publication of WO2018176426A1

Classifications

    • G PHYSICS
        • G05 CONTROLLING; REGULATING
            • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
                • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
                    • G05D 1/0011 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, associated with a remote control arrangement
                        • G05D 1/0016 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, associated with a remote control arrangement characterised by the operator's input device
                    • G05D 1/10 Simultaneous control of position or course in three dimensions
                        • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
        • G01 MEASURING; TESTING
            • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
                    • G01C 21/20 Instruments for performing navigational calculations

Definitions

  • the invention relates to the field of UAV communication, and in particular to a flight control method for a UAV and to a UAV.
  • the consumer drone market is currently booming, and most consumer-grade drones are used for aerial photography.
  • visual tracking technology is often used to control the drone to automatically follow the target object.
  • the existing visual tracking technology has difficulty obtaining the position of the target object and re-recognizing it after it passes behind an obstacle, so the drone easily loses the target object.
  • a remote controller is often used to control the drone, but when the user goes outdoors for activities such as skiing, mountaineering, or mountain biking, carrying a relatively bulky remote controller is inconvenient.
  • the technical problem to be solved by the present invention is to provide a flight control method for a drone, and a drone, which solve the problems that a drone using the visual tracking technology of the prior art easily loses the target object and that carrying a remote controller is inconvenient.
  • the first technical solution adopted by the present invention is to provide a flight control method for a drone, comprising: acquiring position information and action information of a wearable device worn by a target object; determining whether the action information of the wearable device matches a preset first action template; and, if it matches the first action template, adjusting at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
  • the wearable device is a wristband or a watch worn on the arm of the target object, and the first motion template corresponds to a waving motion of the target object.
  • the adjusting step includes: adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and the horizontal position information of the drone, so that the horizontal relative distance between the drone and the wearable device falls within a first predetermined distance range; adjusting the flying height of the drone according to the height information of the wearable device and the height information of the drone, so that the relative height of the drone and the wearable device falls within a second predetermined distance range; and calculating the angle of the line between the drone and the wearable device with respect to the horizontal or vertical direction according to the horizontal relative distance and the relative height, and adjusting the shooting angle of the imaging device according to that angle, so that the optical axis direction of the imaging device lies within a predetermined angle range relative to the line between the drone and the wearable device.
  • the method further comprises: visually recognizing the target object from within the captured image of the imaging device.
  • the step of visually recognizing the target object from the captured image of the imaging device includes: performing motion recognition on at least two candidate objects in the captured image to acquire motion information of each candidate object; matching the motion information of the candidate objects against the motion information of the wearable device or against the first motion template; and taking the matched candidate object as the target object.
  • the method further includes: controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
  • the step of controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition comprises: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
  • the step of controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition further comprises: determining whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a preset threshold; and, if it is greater than the preset threshold, readjusting at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device, such that the drone is within a predetermined distance range of the wearable device and the target object is within the predetermined shooting range of the imaging device.
  • after at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone is readjusted according to the position information of the wearable device, the target object is visually recognized again.
  • the method further includes: determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches a preset second action template; and, if it matches the second action template, adjusting the orientation of the drone.
  • the wearable device is a wristband or a watch worn on the arm of the target object, and the second motion template corresponds to a wrist-flipping action of the target object.
  • the method further includes: acquiring posture information of the wearable device; determining whether the posture information of the wearable device satisfies a preset orientation-adjustment trigger condition; and, if the trigger condition is satisfied, performing the step of determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template.
  • the step of determining whether the posture information of the wearable device satisfies the preset orientation-adjustment trigger condition comprises: determining, according to the posture information of the wearable device and the relative positional relationship between the drone and the wearable device, whether the pointing of the limb on which the target object wears the device is within a preset angle range relative to the line between the drone and the wearable device; if it is within the preset angle range, the trigger condition is satisfied.
  • the step of adjusting the orientation of the drone includes: adjusting the relative positional relationship between the drone and the wearable device according to the posture information of the wearable device obtained subsequently.
  • the step of adjusting the relative positional relationship between the UAV and the wearable device according to the subsequently obtained posture information of the wearable device includes: adjusting that relative positional relationship so that the pointing of the limb on which the target object wears the device always remains within a predetermined angle range relative to the line between the drone and the wearable device.
  • the step of adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device includes: recording the angle between the pointing of the limb on which the target object wears the device and the line between the drone and the wearable device; and adjusting the relative positional relationship according to the subsequently obtained posture information so that, after adjustment, the pointing of the limb coincides with the line between the drone and the wearable device.
  • the second technical solution adopted by the present invention is to provide a flight control method for a drone, comprising: acquiring position information of a wearable device worn by the target object; and controlling the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
  • the step of controlling the drone to track the target object according to the obtained position information of the wearable device and the position information of the target object obtained by visual recognition comprises: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
  • the method further includes: determining whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a first preset threshold; and, if it is greater than the first preset threshold, adjusting the flight position of the drone according to the position information of the wearable device so that the drone is within a predetermined distance of the wearable device.
  • the method further includes: adjusting the shooting angle of the imaging device mounted on the drone so that the target object is within a predetermined shooting range of the imaging device.
  • the method further comprises: visually recognizing the target object from within the captured image of the imaging device.
  • the method further includes: determining whether the duration for which the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition exceeds the first preset threshold is greater than a second preset threshold; if the duration is not greater than the second preset threshold, continuing to control the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; otherwise, readjusting the flight position of the drone according to the position information of the wearable device.
  • the third technical solution adopted by the present invention is to provide a drone, comprising: a wireless communication circuit, configured to acquire position information and action information of a wearable device worn by a target object; and a processor, coupled to the wireless communication circuit, configured to determine whether the action information of the wearable device matches a preset first action template and, when it matches the first action template, to adjust at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
  • the wearable device is a wristband or a watch worn on the arm of the target object, and the first motion template corresponds to a waving motion of the target object.
  • the processor adjusting at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device specifically includes: adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and the horizontal position information of the drone, so that the horizontal relative distance between the drone and the wearable device falls within a first predetermined distance range; adjusting the flying height of the drone according to the height information of the wearable device and the height information of the drone, so that the relative height of the drone and the wearable device falls within a second predetermined distance range; and calculating the angle of the line between the drone and the wearable device with respect to the horizontal or vertical direction according to the horizontal relative distance and the relative height, and adjusting the shooting angle of the imaging device according to that angle, so that the optical axis direction of the imaging device lies within a predetermined angle range relative to the line between the drone and the wearable device.
  • the processor is further configured to: visually recognize the target object from within the captured image of the imaging device.
  • the visually recognizing of the target object from the captured image of the imaging device includes: performing motion recognition on at least two candidate objects in the captured image to acquire motion information of each candidate object; matching the motion information of the candidate objects against the motion information of the wearable device or against the first motion template; and taking the matched candidate object as the target object.
  • the processor is further configured to: control the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
  • the processor controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition specifically includes: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
  • the processor controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition further includes: determining whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a preset threshold; and, if it is greater than the preset threshold, readjusting at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device.
  • after the processor readjusts at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device, the processor is further used to: visually recognize the target object again.
  • the processor is further configured to: determine whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches a preset second action template, and adjust the orientation of the drone when the action information matches the second action template.
  • the wearable device is a wristband or a watch worn on the arm of the target object, and the second motion template corresponds to a wrist-flipping action of the target object.
  • before the processor determines whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, the wireless communication circuit is further configured to acquire posture information of the wearable device.
  • the processor is further configured to determine whether the posture information of the wearable device satisfies a preset orientation-adjustment trigger condition and, when the trigger condition is satisfied, to perform the step of determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template.
  • the determining of whether the posture information of the wearable device satisfies the preset orientation-adjustment trigger condition specifically includes: determining, according to the posture information of the wearable device and the relative positional relationship between the drone and the wearable device, whether the pointing of the limb on which the target object wears the device is within a preset angle range relative to the line between the drone and the wearable device; when it is within the preset angle range, determining that the posture information of the wearable device satisfies the trigger condition.
  • the processor adjusting the orientation of the drone specifically includes: adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device.
  • the adjusting of the relative positional relationship between the UAV and the wearable device according to the subsequently obtained posture information of the wearable device includes: adjusting that relative positional relationship so that the pointing of the limb on which the target object wears the device always remains within a preset angle range relative to the line between the drone and the wearable device.
  • the processor adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device includes: recording the angle between the pointing of the limb on which the target object wears the device and the line between the drone and the wearable device; and adjusting the relative positional relationship according to the subsequently obtained posture information so that, after adjustment, the pointing of the limb coincides with the line between the drone and the wearable device.
  • the fourth technical solution adopted by the present invention is to provide a drone, comprising: a wireless communication circuit, configured to acquire position information of a wearable device worn by the target object; and a processor, coupled to the wireless communication circuit, configured to control the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
  • the processor controlling the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition specifically includes: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
  • the processor is further configured to: determine whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a first preset threshold and, when it is greater than the first preset threshold, adjust the flight position of the drone according to the position information of the wearable device such that the drone is within a predetermined distance of the wearable device.
  • after the processor adjusts the flight position of the drone according to the position information of the wearable device, the processor is further used to: adjust the shooting angle of the imaging device mounted on the drone so that the target object is within the predetermined shooting range of the imaging device.
  • the processor is further configured to: visually recognize the target object from within the captured image of the imaging device.
  • the processor is further configured to: determine whether the duration for which the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition exceeds the first preset threshold is greater than a second preset threshold; when the duration is not greater than the second preset threshold, control the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; otherwise, readjust the flight position of the drone according to the position information of the wearable device.
  • the embodiment of the present invention acquires the position information and action information of the wearable device worn by the target object, determines whether the action information of the wearable device matches the preset first action template, and, when they match, adjusts at least one or a combination of the flight position of the drone and the shooting angle of the imaging device mounted on the drone according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device; the user can thus control the drone flight through the portable wearable device without carrying a bulky remote controller, improving the convenience of drone control;
  • the embodiment of the present invention controls the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition, thereby using the position information of the wearable device to compensate for unstable visual recognition and improve the accuracy of target tracking.
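The GPS-plus-vision fallback described above can be sketched in a few lines. This is only an illustrative sketch: the function names and the 5-meter divergence threshold are assumptions, not values from the patent.

```python
import math

def horizontal_distance(p, q):
    """Euclidean distance between two (x, y) positions in meters."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def choose_tracking_source(wearable_pos, visual_pos, max_divergence_m=5.0):
    """Pick which position estimate should steer the drone.

    If vision has lost the target (None) or diverges too far from the
    wearable's GPS fix, fall back to the wearable position alone;
    otherwise fuse both: GPS steers the flight position while vision
    steers the shooting angle.
    """
    if visual_pos is None:
        return "wearable_only"
    if horizontal_distance(wearable_pos, visual_pos) > max_divergence_m:
        return "wearable_only"
    return "fused"
```

In this sketch the wearable's GPS fix always anchors the flight position, and vision is trusted for camera pointing only while it stays consistent with the GPS fix.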
  • FIG. 1 is a schematic flow chart of a first embodiment of a flight control method for a drone of the present invention
  • FIG. 2 is a schematic diagram showing the relative position change between the UAV and the target object before and after the execution of step S14 in FIG. 1;
  • FIG. 3 is a schematic flow chart of a second embodiment of a flight control method for a drone of the present invention.
  • FIG. 4 is a schematic flow chart of a third embodiment of a flight control method for a drone of the present invention.
  • Figure 5 is a schematic diagram showing the relative position change between the drone and the target object before and after step S3211;
  • Figure 6 is a schematic diagram showing the relative position change between the drone and the target object before and after step S3213;
  • FIG. 7 is a schematic diagram of the process of determining, in step S301, whether the UAV orientation-adjustment trigger condition is satisfied;
  • Figure 8 is a schematic view showing the structure of an embodiment of the drone of the present invention.
  • FIG. 1 is a schematic flow chart of a first embodiment of a flight control method for a drone of the present invention.
  • the flight control method of the drone of the present invention includes:
  • Step S10: acquiring location information and action information of the wearable device worn by the target object.
  • the wearable device is a wristband or a wrist watch worn on the arm of the target object; the position information is the GPS position data of the wearable device; the action information is at least one type of data of the wearable device, such as acceleration, angular velocity, or motion trajectory, and is detected by an inertial measurement unit of the wearable device or a sensor such as a magnetometer.
  • the wearable device may also be another type of device, such as a ring worn on the finger of the target object, which is not specifically limited herein.
  • the wearable device worn by the target object is bound to the drone in advance, and can communicate with the drone through a wireless communication link such as a 4G network, Wi-Fi, or Bluetooth.
  • Step S12: determining whether the action information of the wearable device matches the preset first action template.
  • the first action template is preset posture data, and is associated in advance with a control command of the drone.
  • the first motion template corresponds to a waving motion of the target object; that is, the first motion template is the motion information generated by the wearable device worn by the target object when the target object waves. When the drone obtains the motion information of the target object (for example, the acceleration), it compares that information against the first motion template. In other embodiments, the first action template may correspond to other actions, such as a hand-raising action of the target object.
  • the matching determination process may also adopt other methods, such as determining the variance, which is not specifically limited herein.
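As one concrete possibility for the matching step (the patent leaves the method open), the wearable's acceleration trace can be compared against the stored template by Pearson correlation. The threshold value and function names below are illustrative assumptions, not part of the disclosure.

```python
import math

def normalized_correlation(signal, template):
    """Pearson correlation between two equal-length 1-D sequences."""
    n = len(signal)
    assert n == len(template) and n > 1
    ms = sum(signal) / n
    mt = sum(template) / n
    num = sum((s - ms) * (t - mt) for s, t in zip(signal, template))
    den = math.sqrt(sum((s - ms) ** 2 for s in signal)
                    * sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

def matches_template(accel_trace, template, threshold=0.8):
    """Treat the gesture as recognized when correlation exceeds threshold."""
    return normalized_correlation(accel_trace, template) >= threshold
```

Because Pearson correlation is scale-invariant, a stronger or weaker wave of the same shape still matches the template.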
  • Step S14: if the action information matches the first motion template, adjusting at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
  • step S14 includes:
  • Step S141: adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and the horizontal position information of the drone, so that the horizontal relative distance between the drone and the wearable device falls within a first predetermined distance range;
  • the first predetermined distance range is a preset first distance threshold range, which may be set according to specific requirements and is not specifically limited herein; the horizontal position information of the wearable device may be obtained through the GPS position data transmitted by the wearable device, and the horizontal position information of the drone may be obtained by the GPS locator of the drone.
  • according to the horizontal position information of the wearable device 20 and the horizontal position information of the drone 10, that is, according to the GPS position data of the wearable device 20 and the GPS position data of the drone 10, the horizontal relative distance between the drone 10 and the wearable device 20 is calculated, and the horizontal flight position of the drone 10 is adjusted accordingly, for example by flying from position A to position B in FIG. 2, so that the horizontal relative distance between the drone 10 and the wearable device 20 is gradually reduced/increased to within the first predetermined distance range (for example, within a range of 1-2 meters), as reduced from X1 to X2 in FIG. 2.
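For illustration, the horizontal relative distance between two GPS fixes can be computed with the equirectangular approximation, which is adequate at the short ranges involved here. This is a sketch; the patent does not prescribe a particular distance formula, and the function names are assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation of ground distance between two
    GPS fixes (degrees); accurate for short ranges."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    return EARTH_RADIUS_M * math.hypot(x, y)

def within_first_range(dist_m, lo=1.0, hi=2.0):
    """True when the horizontal relative distance already lies in the
    first predetermined range (the 1-2 m example from the text)."""
    return lo <= dist_m <= hi
```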
  • Step S142: adjusting the flying height of the drone according to the height information of the wearable device and the height information of the drone, so that the relative height of the drone and the wearable device falls within a second predetermined distance range;
  • the second predetermined distance range is a preset second distance threshold range, which may be set according to specific requirements, and is not specifically limited herein.
  • the height information of the wearable device 20 may default to the same height as the ground, and the height information of the drone 10 may be obtained by an ultrasonic sensor or a barometer of the drone 10 or the like.
  • From these, the height difference between the wearable device 20 and the drone 10 can be calculated, and the flying height of the drone 10 adjusted, for example by flying from position A to position B in FIG. 2, so that the relative height between the drone 10 and the wearable device 20 is gradually reduced/increased to within the second predetermined distance range (for example, 3-4 meters), as reduced from H1 to H2 in FIG. 2.
  • In other embodiments, the height information of the wearable device 20 can also be set to another default height value or obtained by other sensors, and the altitude information of the drone 10 can also be measured by a binocular vision system, which is not specifically limited herein.
  • Step S143: calculating the angle of the line between the drone and the wearable device relative to the horizontal or vertical direction according to the horizontal relative distance and relative height of the drone and the wearable device, and adjusting the shooting angle of the imaging device according to this angle, so that the optical axis direction of the imaging device is within a predetermined angular range relative to the line between the drone and the wearable device.
  • the predetermined angle range is a preset angle threshold range, which can be set according to specific requirements, and is not specifically limited herein.
  • Specifically, the angle of the line between the drone 10 and the wearable device 20 (such as the CD line in FIG. 2) is calculated, and the shooting angle of the imaging device 101 is adjusted according to this angle, so that the optical axis direction of the imaging device 101 (such as the DE direction in FIG. 2) is within a predetermined angular range (e.g., 0-10 degrees) relative to the line between the drone 10 and the wearable device 20 (i.e., the CD line).
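The angle computation of step S143 is a simple arctangent of relative height over horizontal distance. A hedged Python sketch (function names and the 10-degree tolerance are illustrative assumptions, not from the patent):

```python
import math

def line_pitch_deg(horizontal_dist, relative_height):
    """Angle (degrees) of the drone-to-wearable line below the horizontal,
    computed from horizontal relative distance and relative height (S143)."""
    return math.degrees(math.atan2(relative_height, horizontal_dist))

def optical_axis_ok(gimbal_pitch_deg, horizontal_dist, relative_height,
                    max_offset_deg=10.0):
    """True when the imaging device's optical axis is within the
    predetermined angular range (e.g. 0-10 degrees) of the line."""
    target = line_pitch_deg(horizontal_dist, relative_height)
    return abs(gimbal_pitch_deg - target) <= max_offset_deg
```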
  • Step S144: adjusting the focal length of the imaging device according to the horizontal relative distance and relative height of the drone and the wearable device, so that the proportion of the imaging size of the target object in the entire captured image is within a predetermined ratio range.
  • The predetermined ratio range is a preset imaging-proportion range for the target object, which may be set according to specific requirements and is not specifically limited herein.
  • Since the target object 30 is already within the shooting range of the imaging device 101 mounted on the drone 10, the distance between the drone 10 and the wearable device 20 can be calculated, and the focal length of the imaging device 101 adjusted accordingly, gradually enlarging/reducing the shooting range so that the proportion of the imaging size of the target object 30 falls within the predetermined ratio range, for example, 25% to 35%.
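Under a simple pinhole-camera model (an assumption for illustration; the patent does not specify the camera model), the focal length needed for a given imaging proportion in step S144 can be sketched as:

```python
def imaging_ratio(target_height_m, distance_m, focal_mm, sensor_height_mm):
    """Fraction of the frame height occupied by the target under a
    pinhole model: image height on sensor = f * H / d."""
    image_mm = focal_mm * (target_height_m * 1000.0) / (distance_m * 1000.0)
    return image_mm / sensor_height_mm

def focal_for_ratio(target_height_m, distance_m, sensor_height_mm, ratio=0.30):
    """Focal length (mm) that makes the target occupy `ratio` of the
    frame height at the given distance (inverse of imaging_ratio)."""
    return ratio * sensor_height_mm * distance_m / target_height_m
```

As the distance computed from the drone and wearable positions changes, re-evaluating `focal_for_ratio` keeps the target's imaging proportion near the middle of the 25-35% example range.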
  • The above steps S141-S144 need not all be performed; one or more of them may be combined. For example, when the acquired action information matches the first action template but the horizontal relative distance between the drone and the wearable device is already within the first predetermined distance range, only steps S142 and S143 need to be performed.
  • In other embodiments, the flight position of the drone can also be adjusted according to the imaging size of the target object in the imaging device, so that the proportion of the target object's imaging size in the entire captured image is within the predetermined ratio range, while the horizontal relative distance and relative height between the drone and the wearable device are gradually increased/decreased to the predetermined distance ranges, which is not specifically limited herein.
  • In this embodiment, the position information and action information of the wearable device worn by the target object are obtained, it is determined whether the action information of the wearable device matches the preset first action template, and, when they match, at least one of, or a combination of, the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in that imaging device is adjusted according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device. The user can thus control the drone's flight through a portable wearable device without carrying a bulky controller, improving the convenience of drone control.
  • FIG. 3 is a schematic flow chart of a second embodiment of a flight control method for a drone of the present invention.
  • the flight control method of the drone of the present invention includes:
  • Step S21 Control the drone to track the target object by combining the obtained position information of the wearable device and the position information of the target object obtained by visual recognition.
  • step S21 specifically includes:
  • Step S211 Adjust the flight position of the drone according to the position information of the wearable device, and adjust the shooting angle of the drone according to the position information of the target object obtained by the visual recognition.
  • The wearable device is bound to the drone in advance. During flight, the flight position of the drone is adjusted according to the position information uploaded by the wearable device, so that the drone stays within a predetermined distance range of the wearable device, for example with the straight-line distance between the drone and the wearable device within 4-5 meters; the shooting angle of the drone is adjusted according to the position information of the target object obtained by visual recognition, so that the target object is within the predetermined shooting range of the imaging device. The predetermined distance range and the predetermined shooting range can be set according to actual needs and are not specifically limited herein.
  • Before step S21, the method further includes:
  • Step S20: visually recognizing the target object within the captured frame of the imaging device.
  • The drone may trigger the visual recognition function upon certain predetermined action information, or trigger it automatically, which is not specifically limited herein.
  • step S20 includes:
  • Step S201 performing motion recognition on at least two candidate objects in the captured image to respectively acquire motion information of at least two candidate objects;
  • The drone may first perform contour recognition on the captured frame, select at least two candidate objects whose contours are relatively close to a human contour for motion recognition, and acquire the action information of the at least two candidates, such as motion trajectories and accelerations, using a visual recognition algorithm.
  • In other embodiments, at least two objects in the frame may be selected at random as candidate objects; other selection manners may also be used, which are not specifically limited herein.
  • Step S202: matching the action information of the at least two candidate objects with the action information of the wearable device or with the first action template;
  • Step S203 The matched candidate object is used as the target object.
  • The first action template is preset posture data and includes at least one of acceleration, angular velocity, motion trajectory, or similar data for the corresponding action.
  • The action information of the at least two candidate objects is matched against the action information uploaded by the wearable device or against the first action template, for example by determining whether a candidate's motion trajectory is the same as the trajectory uploaded by the wearable device, or whether the difference between the two is within a preset allowable range; the matching candidate is then taken as the target object.
  • In other embodiments, only one candidate object may be selected for motion matching, or one object in the captured frame may be selected at random as the target object, which is not specifically limited herein.
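The candidate-matching of steps S202-S203 can be sketched as a nearest-trajectory search within an allowed tolerance. All names and the tolerance value are illustrative assumptions; a real implementation might compare trajectories with dynamic time warping or correlation instead of a mean absolute difference:

```python
def trajectory_difference(traj_a, traj_b):
    """Mean absolute difference between two equal-length motion traces
    (e.g. per-frame acceleration magnitudes)."""
    assert len(traj_a) == len(traj_b)
    return sum(abs(a - b) for a, b in zip(traj_a, traj_b)) / len(traj_a)

def pick_target(candidates, wearable_traj, tolerance=0.5):
    """Return the index of the candidate whose motion best matches the
    trace uploaded by the wearable device, within the allowed tolerance,
    or None if no candidate matches (steps S202-S203)."""
    best, best_diff = None, tolerance
    for i, traj in enumerate(candidates):
        d = trajectory_difference(traj, wearable_traj)
        if d <= best_diff:
            best, best_diff = i, d
    return best
```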
  • step S21 includes:
  • Step S212 determining whether the difference between the location information of the wearable device and the location information of the target object obtained by the visual recognition is greater than a first preset threshold
  • The first preset threshold is the maximum allowable error of the visual ranging; if it is exceeded, the visual ranging is considered unreliable. The specific value may be set according to specific requirements and is not specifically limited herein.
  • The drone can use a visual ranging algorithm, such as binocular vision ranging, to obtain the distance a between the target object and the drone; it can also calculate the distance b between the drone and the wearable device from the position information of the wearable device and of the drone. The difference between a and b can then be computed and compared against the preset threshold, for example 0.5 meters.
  • Step S213: if the difference is greater than the first preset threshold, adjust, according to the position information of the wearable device, at least one of, or a combination of, the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
  • the step S213 specifically includes:
  • Step S2131 If it is greater than the first preset threshold, adjust the flight position of the drone according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device.
  • When the drone loses the target object during visual tracking, or the visual tracking is in error, the drone can be controlled according to the position information of the wearable device so that it returns to within the preset range of the wearable device, preventing the drone from losing the target.
  • Step S2132 Adjust the shooting angle of the imaging device mounted on the drone so that the target object is within a predetermined shooting range of the imaging device.
  • Step S2133: adjusting the focal length of the imaging device according to the horizontal relative distance and relative height of the drone and the wearable device, so that the proportion of the imaging size of the target object in the entire captured image is within the predetermined ratio range.
  • steps S2131, S2132, and S2133 may also perform only one or a combination of any two, which is not specifically limited herein.
  • Step S214 Perform visual recognition on the target object again.
  • To do so, the drone can use a visual detection algorithm, for example identifying the target object by feature extraction, or it can select an object in the captured frame at random as the target object; the specific process may refer to step S20 above and is not repeated here.
  • Step S215 determining whether the difference between the position information of the wearable device and the position information of the target object obtained by the visual recognition is not greater than a duration of the first preset threshold is greater than a second preset threshold;
  • The second preset threshold is a set time threshold for the visual ranging; a duration exceeding it indicates that the visual ranging is stable and reliable. The specific value may be set according to specific requirements and is not specifically limited herein.
  • Step S216: if the duration is greater than the second preset threshold, control the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; otherwise, adjust the flight position of the drone according to the position information of the wearable device.
  • The drone continuously determines whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition exceeds the first preset threshold, and records the duration for which this difference stays within the first preset threshold. If the duration exceeds the second preset threshold (for example, 1 minute), the visual ranging, and hence the drone's visual tracking, is stable and reliable, and the drone can track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; otherwise, the visual tracking is unstable, and the flight position of the drone is adjusted according to the position information of the wearable device.
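The switching logic of steps S212-S216 amounts to a small state machine: fall back to wearable-GPS-only control when the two range estimates disagree, and trust the fused tracking again only after they have agreed for the second preset threshold. An illustrative Python sketch (the class, state names, and threshold values are assumptions):

```python
class TrackingArbiter:
    """Switches between fused (GPS + vision) tracking and GPS-only
    fallback, per steps S212-S216. Vision is trusted again only after
    the GPS/vision ranges agree within `max_err` for `hold_s` seconds."""

    def __init__(self, max_err=0.5, hold_s=60.0):
        self.max_err = max_err   # first preset threshold (meters)
        self.hold_s = hold_s     # second preset threshold (seconds)
        self._ok_since = None    # time the current agreement streak began

    def update(self, t, vision_range, gps_range):
        """Return 'fused' or 'gps_only' for the sample at time t (seconds)."""
        if abs(vision_range - gps_range) > self.max_err:
            self._ok_since = None        # vision unreliable: reset timer
            return "gps_only"
        if self._ok_since is None:
            self._ok_since = t
        return "fused" if t - self._ok_since >= self.hold_s else "gps_only"
```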
  • In this embodiment, by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition, the drone is controlled to track the target object; the position information of the wearable device thus compensates for the instability of visual recognition and improves the accuracy of target tracking.
  • This embodiment may be combined with the first embodiment of the flight control method of the drone of the present invention, and the steps of this embodiment may be executed after step S10 or step S14.
  • FIG. 4 is a schematic flow chart of a third embodiment of a flight control method for a drone of the present invention.
  • the flight control method of the drone of the present invention includes:
  • Step S31 determining whether the acquired action information of the wearable device or the action information obtained by performing motion recognition on the target object matches the preset second action template;
  • The wearable device is a wristband or a watch worn on the arm of the target object, and the second action template corresponds to a wrist-flipping action of the target object.
  • The action information is attitude data of the wearable device and includes at least one of acceleration, angular velocity, or motion trajectory; it is detected by sensors of the wearable device such as an inertial measurement unit or a magnetometer.
  • In other embodiments, the wearable device may be another type of device, such as a ring worn on a finger of the target object, and the second action template may correspond to other actions, such as the target object raising a hand, which is not specifically limited herein.
  • Step S32: if it matches the second action template, perform orientation adjustment of the drone.
  • The wearable device worn by the target object is bound to the drone in advance and uploads its action information to the drone over a wireless communication link. The drone determines whether the acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, for example by determining whether the motion trajectory of the wearable device, or the motion trajectory of the target object obtained by visual recognition, is the same as that of the preset second action template, or whether the difference between the two is within a preset allowable range. If the trajectories are the same, or the difference is within the allowable range, the action information is determined to match the second action template, and orientation adjustment of the drone can be performed; at the same time, the shooting angle of the imaging device carried by the drone can be synchronously adjusted so that the target object remains within the shooting range.
  • step S32 specifically includes:
  • Step S321 Adjust the relative positional relationship between the drone and the wearable device according to the posture information of the wearable device obtained subsequently.
  • the posture information is data such as an azimuth or Euler angle of the wearable device obtained by the magnetometer and/or the inertial measurement unit of the wearable device.
  • step S321 includes:
  • Step S3211: adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device, so that the pointing of the limb of the target object wearing the wearable device always stays within the preset angle range relative to the line between the drone and the wearable device.
  • In a specific implementation, the drone 10 synchronously adjusts its flight position according to the subsequently obtained attitude information of the wearable device 20, such as its Euler angles; for example, the deflection angle of the wearable device 20 is obtained from the subsequently obtained Euler angles and used to adjust the flight position.
  • step S321 specifically includes:
  • Step S3212: recording the angle between the pointing of the limb of the target object wearing the wearable device and the line between the drone and the wearable device;
  • The drone 10 can obtain the pointing of the limb 301 (for example, the arm) of the target object 30 wearing the wearable device 20 according to the subsequently obtained posture information of the wearable device 20, such as Euler angles, as shown by the AB direction in FIG. 6; the direction of the line between the drone 10 and the wearable device 20, that is, the AC direction in FIG. 6, can be calculated from the position information of the two, so that the angle α between the pointing of the arm 301 (the AB direction) and the line between the drone 10 and the wearable device 20 (the AC direction) can be obtained.
  • Step S3213: adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained attitude information of the wearable device, using this angle as a compensation value, so that after adjustment the pointing of the limb of the target object wearing the wearable device and the line between the drone and the wearable device coincide.
  • Specifically, the angle α is used as a compensation value to synchronously adjust the flight position of the drone 10: for example, the deflection angle θ of the wearable device 20 is obtained from its subsequently obtained attitude information, and the drone 10 flies with the deflection angle θ plus the compensation value α as its flight angle, on the horizontal plane where the drone 10 is located. The flight position of the drone 10 is thereby synchronously adjusted so that, after adjustment, the pointing of the limb 301 wearing the wearable device 20 (such as the DE direction in FIG. 6) and the line between the drone 10 and the wearable device 20 (such as the DF direction in FIG. 6) coincide.
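The compensation scheme of steps S3212-S3213 can be sketched with heading angles. This is an illustrative sketch only; the patent works with Euler angles and positions, and the heading-only simplification and all function names are assumptions:

```python
def wrap_deg(a):
    """Wrap an angle to [-180, 180) degrees."""
    return (a + 180.0) % 360.0 - 180.0

def record_compensation(arm_heading_deg, line_heading_deg):
    """Angle between the pointing of the limb wearing the device and the
    drone-wearable line (step S3212), recorded as the compensation value."""
    return wrap_deg(line_heading_deg - arm_heading_deg)

def drone_heading(arm_heading_deg, compensation_deg):
    """Heading, relative to the wearable, on which the drone should sit so
    that the limb pointing and the line coincide after adjustment (S3213)."""
    return wrap_deg(arm_heading_deg + compensation_deg)
```

Recording the offset once and adding it to every subsequent arm heading keeps the drone on the arm's pointing line without requiring the user to first aim the arm exactly at the drone.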
  • Before step S31, the method includes:
  • Step S300 Acquire posture information of the wearable device
  • Step S301 determining whether the posture information of the wearable device meets the preset orientation adjustment trigger condition
  • Step S302 If the orientation adjustment trigger condition is satisfied, step S31 is performed.
  • step S301 includes:
  • Step S3011: determining, according to the posture information of the wearable device and the relative positional relationship between the drone and the wearable device, whether the pointing of the limb of the target object wearing the wearable device is within a preset angle range relative to the line between the drone and the wearable device;
  • Step S3012 If it is within the preset angle range, the orientation adjustment trigger condition is satisfied.
  • the preset angle range is a preset maximum angle deviation range, and the specific value may be set according to actual requirements, and is not specifically limited herein.
  • In a specific implementation, the drone 10 can obtain the pointing of the limb 301 (for example, the arm) of the target object 30 wearing the wearable device 20 according to the obtained posture information of the wearable device 20, such as Euler angles, as shown by the AB direction in FIG. 7; the direction of the line between the drone 10 and the wearable device 20, that is, the AC direction in FIG. 7, can be calculated from the position information of the two. It is then judged whether the angle β between the pointing of the arm 301 (the AB direction) and the line between the drone 10 and the wearable device 20 (the AC direction) is within a preset angle range (for example, within ±20 degrees).
  • If it is, the orientation adjustment trigger condition is satisfied and step S31 can be performed, so that the drone 10 can synchronously adjust its flight position following the movement of the arm 301; otherwise, the orientation adjustment is not triggered, that is, step S31 is not performed.
  • During the orientation adjustment process, if the angle between the pointing of the arm 301 (the AB direction) and the line between the drone 10 and the wearable device 20 (the AC direction) exceeds the preset angle range, the orientation adjustment is determined to have ended.
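The trigger and termination test of steps S3011-S3012 reduces to comparing the arm heading with the bearing of the drone-wearable line. An illustrative sketch (the ±20 degree limit matches the example value in the text; the names are assumptions):

```python
def azimuth_trigger(arm_heading_deg, line_heading_deg, limit_deg=20.0):
    """Trigger condition of steps S3011-S3012: orientation adjustment is
    active only while the limb pointing is within `limit_deg` degrees of
    the drone-wearable line; leaving this range ends the adjustment."""
    diff = (line_heading_deg - arm_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= limit_deg
```

The modulo arithmetic keeps the comparison correct across the 0/360 degree wrap-around, e.g. an arm heading of 350 degrees and a line bearing of 5 degrees differ by 15 degrees, not 345.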
  • In this embodiment, by performing orientation adjustment on the drone, the relative position of the drone and the target object can be adjusted, and the shooting angle of the imaging device mounted on the drone can be synchronously adjusted so that the target object is always within the shooting range; flight control of the drone can thus be realized through a portable wearable device without carrying a bulky controller, improving the convenience of drone control.
  • This embodiment may be combined with the first and/or second embodiment of the flight control method of the drone of the present invention, and the steps of this embodiment may be executed after step S14 or step S21.
  • FIG. 8 is a schematic structural view of an embodiment of the drone of the present invention.
  • the drone 80 of the present invention comprises:
  • the wireless communication circuit 801 is configured to acquire location information and action information of the wearable device worn by the target object;
  • the processor 802, coupled to the wireless communication circuit 801, is configured to determine whether the action information of the wearable device matches the preset first action template and, when it does, to adjust, according to the position information of the wearable device, at least one of, or a combination of, the flight position of the drone 80, the shooting angle of the imaging device mounted on the drone 80, and the imaging size of the target object in the imaging device mounted on the drone 80, so that the drone 80 is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
  • The wearable device is a wristband or a watch worn on the arm of the target object, and the first action template corresponds to a waving action of the target object.
  • In other embodiments, the wearable device may be another type of device, such as a ring worn on a finger of the target object, and the first action template may correspond to other actions, such as shaking the arm up and down, which is not specifically limited herein.
  • the drone 80 further includes:
  • the memory 803 is coupled to the processor 802, and is configured to store instructions and data required for the processor 802 to operate, such as a first action template.
  • the locator 804 is coupled to the processor 802 for acquiring location information of the drone 80;
  • For the specific process by which the processor 802 determines whether the action information of the wearable device matches the preset first action template and adjusts the flight position of the drone 80, the shooting angle of the imaging device carried by the drone 80, and the imaging size of the target object in that imaging device according to the position information of the wearable device, reference may be made to the corresponding steps of the first embodiment of the flight control method of the drone of the present invention, which are not repeated here.
  • The processor 802 is further configured to visually recognize the target object from the captured frame of the imaging device, and to control the drone 80 to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; for the specific process, reference may be made to the corresponding steps of the second embodiment of the flight control method of the drone of the present invention, which are not repeated here.
  • The processor 802 is further configured to determine whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, and to perform orientation adjustment of the drone 80 when it does.
  • the second action template corresponds to a wrist flipping action of the target object.
  • the second motion template may also be corresponding to other actions such as raising a hand, and is not specifically limited herein.
  • The processor 802 is further configured to determine whether the posture information of the wearable device satisfies a preset orientation adjustment trigger condition before determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, and to perform that determination only when the trigger condition is satisfied.
  • For the specific process by which the processor 802 determines whether the posture information of the wearable device satisfies the preset orientation adjustment trigger condition, determines whether the action information matches the preset second action template, and performs orientation adjustment of the drone 80, reference may be made to the corresponding steps of the third embodiment of the flight control method of the drone of the present invention, which are not repeated here.
  • In other embodiments, the drone may further include other components such as an ultrasonic sensor or a magnetometer, which is not specifically limited herein.
  • In summary, the drone acquires the position information and action information of the wearable device worn by the target object, determines whether the action information of the wearable device matches the preset first action template, and, when they match, adjusts at least one of, or a combination of, its flight position, the shooting angle of the imaging device it carries, and the imaging size of the target object in that imaging device according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device. The user can thus control the drone's flight through a portable wearable device without carrying a bulky controller, improving the convenience of drone control. In addition, by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition to control the drone to track the target object, the position information of the wearable device compensates for the instability of visual recognition and improves the accuracy of target tracking.

Abstract

A flight control method for a drone, and a drone. The method includes: acquiring position information and action information of a wearable device worn by a target object (S10); determining whether the action information of the wearable device matches a preset first action template (S12); and, when they match, adjusting, according to the position information of the wearable device, at least one of, or a combination of, the flight position of the drone, the shooting angle of an imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device (S14). A user can thus control the drone's flight through a portable wearable device without carrying a bulky controller, improving the convenience of drone control.

Description

Flight control method for a drone, and drone [Technical Field]
The present invention relates to the field of drone communication, and in particular to a flight control method for a drone and to a drone.
[Background Art]
The consumer drone market is currently booming, and most consumer-grade drones are used for aerial photography. In aerial photography, visual tracking technology is often used to control the drone to automatically follow a target object. However, existing visual tracking technology has difficulty obtaining the position of the target object, and the target is hard to re-identify after it passes behind an obstacle, so the drone easily loses the target object.
In addition, in the prior art a remote controller is usually used to control the drone, but when a user goes outdoors for activities such as skiing, mountaineering, or mountain biking, carrying a relatively bulky remote controller is extremely inconvenient.
[Summary of the Invention]
The technical problem mainly solved by the present invention is to provide a flight control method for a drone, and a drone, which can solve the prior-art problems that a drone relying on visual tracking easily loses the target object and that carrying a remote controller is inconvenient.
To solve the above technical problem, a first technical solution adopted by the present invention is to provide a flight control method for a drone, including: acquiring position information and action information of a wearable device worn by a target object; determining whether the action information of the wearable device matches a preset first action template; and, if it matches the first action template, adjusting, according to the position information of the wearable device, at least one of, or a combination of, the flight position of the drone, the shooting angle of an imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
Therein, the wearable device is a wristband or a watch worn on the arm of the target object, and the first action template corresponds to a waving action of the target object.
Therein, the step of adjusting, according to the position information of the wearable device, at least one of, or a combination of, the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone includes: adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and of the drone, to bring the horizontal relative distance between the drone and the wearable device into a first predetermined distance range; adjusting the flying height of the drone according to the height information of the wearable device and of the drone, to bring the relative height between the drone and the wearable device into a second predetermined distance range; calculating the angle of the line between the drone and the wearable device relative to the horizontal or vertical direction according to the horizontal relative distance and relative height, and adjusting the shooting angle of the imaging device according to this angle, so that the optical axis direction of the imaging device is within a predetermined angular range relative to the line between the drone and the wearable device; and adjusting the focal length of the imaging device according to the horizontal relative distance and relative height, so that the proportion of the imaging size of the target object in the entire captured image is within a predetermined ratio range.
Therein, the method further includes: visually recognizing the target object within the captured frame of the imaging device.
Therein, the step of visually recognizing the target object within the captured frame of the imaging device includes: performing motion recognition on at least two candidate objects in the captured frame to respectively acquire their action information; matching the action information of the at least two candidate objects with the action information of the wearable device or with the first action template; and taking the matching candidate object as the target object.
Therein, the method further includes: controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
Therein, the step of controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition includes: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
Therein, that step further includes: determining whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a preset threshold; and, if it is, re-adjusting, according to the position information of the wearable device, at least one of, or a combination of, the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
Therein, after the step of re-adjusting according to the position information of the wearable device, the method further includes: performing visual recognition of the target object again.
其中,该方法进一步包括:判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配;若与第二动作模板相匹配,则对无人机进行方位调整。
其中,穿戴设备为佩戴于目标物体的手臂上的手环或手表,第二动作模板对应于目标物体的手腕翻转动作。
其中,判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤之前,进一步包括:获取穿戴设备的姿态信息;判断穿戴设备的姿态信息是否满足预设的方位调整触发条件;若满足方位调整触发条件,则执行判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤。
其中,判断穿戴设备的姿态信息是否满足预设的方位调整触发条件的步骤包括:根据穿戴设备的姿态信息和无人机与穿戴设备之间的相对位置关系,判断目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线是否处于预设角度范围内;若处于预设角度范围内,则满足方位调整触发条件。
其中,对无人机进行方位调整的步骤包括:根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系。
其中,根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系的步骤包括:根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系,以使得目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线始终保持在预设角度范围内。
其中,根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系的步骤包括:记录目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线之间的夹角;以该夹角作为补偿值根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系,以使得调整后的目标物体的佩戴有穿戴设备的肢体的指向和无人机与穿戴设备之间连线彼此重合。
为解决上述技术问题,本发明采用的第二个技术方案是:提供一种无人机的飞行控制方法,包括:获取目标物体所佩戴的穿戴设备的位置信息;结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪。
其中,结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪的步骤包括:根据穿戴设备的位置信息调整无人机的飞行位置,并根据视觉识别获得的目标物体的位置信息调整无人机的拍摄角度。
其中,该方法进一步包括:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异是否大于第一预设阈值;若大于第一预设阈值,则根据穿戴设备的位置信息对无人机的飞行位置进行调整,以使得无人机处于穿戴设备的预定距离范围内。
其中,根据穿戴设备的位置信息对无人机的飞行位置进行调整的步骤之后,进一步包括:对无人机的所搭载的成像设备的拍摄角度进行调整,以使得目标物体处于成像设备的预定拍摄范围内。
其中,该方法进一步包括:从成像设备的拍摄画面内对目标物体进行视觉识别。
其中,该方法进一步包括:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异不大于第一预设阈值的持续时间是否大于第二预设阈值;若持续时间大于第二预设阈值,则结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪;否则,重新根据穿戴设备的位置信息对无人机的飞行位置进行调整。
为解决上述技术问题,本发明采用的第三个技术方案是:提供一种无人机,包括:无线通信电路,用于获取目标物体所佩戴的穿戴设备的位置信息和动作信息;处理器,耦接无线通信电路,用于判断穿戴设备的动作信息是否与预设的第一动作模板相匹配,并在穿戴设备的动作信息与第一动作模板相匹配时,根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得无人机处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内。
其中,穿戴设备为佩戴于目标物体的手臂上的手环或手表,第一动作模板对应于目标物体的挥手动作。
其中,处理器根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整具体包括:根据穿戴设备的水平位置信息与无人机的水平位置信息调整无人机的水平飞行位置,以将无人机与穿戴设备的水平相对距离调整到第一预定距离范围内;根据穿戴设备的高度信息与无人机的高度信息调整无人机的飞行高度,以将无人机与穿戴设备的相对高度调整到第二预定距离范围内;根据无人机和穿戴设备的水平相对距离和相对高度计算无人机与穿戴设备之间连线相对于水平方向或竖直方向的夹角,并根据该夹角调整成像设备的拍摄角度,以使得成像设备的光轴方向调整到相对于无人机与穿戴设备之间连线处于预定角度范围内;根据无人机和穿戴设备的水平相对距离和相对高度调整成像设备的焦距,使得目标物体在成像设备中的成像大小占 整个拍摄画面的比例处于预定比例范围内。
其中,处理器进一步用于:从成像设备的拍摄画面内对目标物体进行视觉识别。
其中,处理器从成像设备的拍摄画面内对目标物体进行视觉识别具体包括:对拍摄画面中的至少两个候选物体进行动作识别,以分别获取至少两个候选物体的动作信息;将至少两个候选物体的动作信息与穿戴设备的动作信息或第一动作模板进行匹配;将相匹配的候选物体作为目标物体。
其中,处理器进一步用于:结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪。
其中,处理器结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪具体包括:根据穿戴设备的位置信息调整无人机的飞行位置,并根据视觉识别获得的目标物体的位置信息调整无人机的拍摄角度。
其中,处理器结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪的步骤进一步包括:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异是否大于预设阈值;若大于预设阈值,则重新根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整。
其中,处理器重新根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整之后,进一步用于:重新对目标物体进行视觉识别。
其中,处理器进一步用于:判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配,并在所获得的动作信息与第二动作模板相匹配时,对无人机进行方位调整。
其中,穿戴设备为佩戴于目标物体的手臂上的手环或手表,第二动作模板对应于目标物体的手腕翻转动作。
其中,处理器判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配之前,无线通信电路进一步用于获取穿戴设备的姿态信息;处理器进一步用于判断穿戴设备的姿态信息是否满足预设的方位调整触发条件,并在满足方位调整触发条件时,执行判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤。
其中,处理器判断穿戴设备的姿态信息是否满足预设的方位调整触发条件具体包括:根据穿戴设备的姿态信息和无人机与穿戴设备之间的相对位置关系,判断目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线是否处于预设角度范围内,并在处于预设角度范围内时,判定穿戴设备的姿态信息满足方位调整触发条件。
其中,处理器对无人机进行方位调整具体包括:根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系。
其中,处理器根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系具体包括:根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系,以使得目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线始终保持在预设角度范围内。
其中,处理器根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系具体包括:记录目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线之间的夹角;以该夹角作为补偿值根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系,以使得调整后的目标物体的佩戴有穿戴设备的肢体的指向和无人机与穿戴设备之间连线彼此重合。
为解决上述技术问题,本发明采用的第四个技术方案是:提供一种无人机, 包括:无线通信电路,用于获取目标物体所佩戴的穿戴设备的位置信息;处理器,耦接无线通信电路,用于结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪。
其中,处理器结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪具体包括:根据穿戴设备的位置信息调整无人机的飞行位置,并根据视觉识别获得的目标物体的位置信息调整无人机的拍摄角度。
其中,处理器进一步用于:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异是否大于第一预设阈值,并在大于第一预设阈值时,根据穿戴设备的位置信息对无人机的飞行位置进行调整,以使得无人机处于穿戴设备的预定距离范围内。
其中,处理器根据穿戴设备的位置信息对无人机的飞行位置进行调整之后,进一步用于:对无人机的所搭载的成像设备的拍摄角度进行调整,以使得目标物体处于成像设备的预定拍摄范围内。
其中,处理器进一步用于:从成像设备的拍摄画面内对目标物体进行视觉识别。
其中,处理器进一步用于:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异不大于第一预设阈值的持续时间是否大于第二预设阈值,并在持续时间大于第二预设阈值时,结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪,在持续时间不大于所述第二预设阈值时,重新根据穿戴设备的位置信息对无人机的飞行位置进行调整。
本发明的有益效果是:区别于现有技术的情况,本发明的实施例通过获取目标物体所佩戴的穿戴设备的位置信息和动作信息,判断穿戴设备的动作信息是否与预设的第一动作模板相匹配,并在匹配时,根据穿戴设备的位置信息对无人机的飞行位置和无人机的所搭载的成像设备的拍摄角度中的至少一者或组 合进行调整,以使得无人机处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内,从而用户通过便携的穿戴设备即可实现控制无人机飞行,而不需要携带笨重的控制器,提高无人机控制的便利性;
另外,本发明的实施例通过结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪,从而采用穿戴设备的位置信息弥补视觉识别不稳定的问题,提高目标跟踪的准确性。
【附图说明】
图1是本发明无人机的飞行控制方法第一实施方式的流程示意图;
图2是图1中步骤S14执行前后无人机和目标物体之间相对位置变化示意图;
图3是本发明无人机的飞行控制方法第二实施方式的流程示意图;
图4是本发明无人机的飞行控制方法第三实施方式的流程示意图;
图5是步骤S3211执行前后无人机和目标物体之间相对位置变化示意图;
图6是步骤S3213执行前后无人机和目标物体之间相对位置变化示意图;
图7是步骤S301判断是否满足无人机方位调整触发条件的过程示意图;
图8是本发明无人机一实施方式的结构示意图。
【具体实施方式】
为使本领域的技术人员更好地理解本发明的技术方案,下面结合附图和具体实施方式对本发明所提供的无人机的飞行控制方法及无人机做进一步详细描述。
请参阅图1,图1是本发明无人机的飞行控制方法第一实施方式的流程示意图。如图1所示,本发明无人机的飞行控制方法包括:
步骤S10:获取目标物体所佩戴的穿戴设备的位置信息和动作信息;
其中，穿戴设备为佩戴于目标物体的手臂上的手环或手表；位置信息是穿戴设备的GPS位置数据；动作信息是穿戴设备的姿态数据，包括加速度、角速度或者运动轨迹等至少一种数据，该动作信息是通过穿戴设备的惯性测量单元或者磁力计等传感器检测得到的。当然，在其他实施方式中，穿戴设备也可以是佩戴于目标物体手指上的戒指等其他类型的设备，此处不做具体限定。
具体地,目标物体所佩戴的穿戴设备预先与无人机绑定,通过无线通信链路,例如4G网络、wifi、蓝牙等,可以与无人机进行通信。
步骤S12:判断穿戴设备的动作信息是否与预设的第一动作模板相匹配;
其中，第一动作模板是预先设置的姿态数据，并且预先与无人机的某一个或某些控制指令相关联。
具体地，在一个应用例中，第一动作模板对应于目标物体的挥手动作，即第一动作模板是目标物体挥手时，目标物体所佩戴的穿戴设备产生的动作信息，当无人机获得的目标物体的动作信息，例如加速度，与第一动作模板的加速度数据相同或者两者的偏差小于一定范围时，则判定获取的动作信息与第一动作模板相匹配，否则不匹配。当然，在其他应用例中，第一动作模板可以对应于目标物体的抬手动作等其他动作，匹配判断过程也可以采用判断方差等其他方法，此处不做具体限定。
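The template-matching step above can be sketched in Python as follows. This is an illustrative sketch only, not code from the patent: the sample acceleration values and the fixed 0.5 m/s² tolerance are assumptions standing in for the unspecified "一定范围" (allowed deviation).

```python
def matches_template(motion, template, tol=0.5):
    """Return True if every acceleration sample deviates from the
    preset template by less than `tol` (m/s^2, an assumed tolerance)."""
    if len(motion) != len(template):
        return False
    return all(abs(a - b) < tol for a, b in zip(motion, template))

# A wave gesture recorded by the wearable vs. a preset first template.
wave_template = [0.0, 2.1, -1.8, 2.0, -1.9, 0.0]
measured      = [0.1, 2.0, -1.7, 2.2, -2.0, 0.1]
print(matches_template(measured, wave_template))  # True: within tolerance
```

A variance-based comparison, also mentioned in the text, would simply swap the per-sample check for a statistic over the whole sequence.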
步骤S14:若与第一动作模板相匹配,则根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得无人机处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内。
进一步地,步骤S14包括:
步骤S141:根据穿戴设备的水平位置信息与无人机的水平位置信息调整无人机的水平飞行位置,以将无人机与穿戴设备的水平相对距离调整到第一预定距离范围内;
其中,第一预定距离范围是预先设置的第一距离阈值范围,可以根据具体需求设置,此处不做具体限定;穿戴设备的水平位置信息可以通过穿戴设备上 传的GPS位置数据获得,无人机的水平位置信息可以通过无人机的GPS定位仪获得。
具体地,如图2所示,在上述应用例中,无人机10判定获取的动作信息与第一动作模板相匹配时,根据穿戴设备20的水平位置信息与无人机10的水平位置信息,即根据穿戴设备20的GPS位置数据和无人机10的GPS位置数据,计算无人机10和穿戴设备20之间的水平相对距离,进而调整无人机10的水平飞行位置,例如从图2中的A位置飞行到B位置,使得无人机10与穿戴设备20之间的水平相对距离逐渐减少/增加到第一预定距离范围内(例如1-2米范围内),如图2中从X1减少到X2。
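Step S141's horizontal-distance check can be illustrated as below. The coordinates are hypothetical, and the equirectangular approximation and the 1–2 m band are stand-ins for whatever the flight controller actually uses.

```python
import math

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation of horizontal distance, adequate
    for the short drone-to-wearable distances considered here."""
    r = 6371000.0  # mean Earth radius, metres
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)

def needs_horizontal_adjust(dist, lo=1.0, hi=2.0):
    """True if the drone must fly closer/farther to enter the first
    predetermined distance range (1-2 m in the text's example)."""
    return not (lo <= dist <= hi)

d = horizontal_distance_m(22.5431, 114.0579, 22.5432, 114.0579)
print(round(d, 1), needs_horizontal_adjust(d))  # ~11.1 m, adjustment needed
```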
步骤S142:根据穿戴设备的高度信息与无人机的高度信息调整无人机的飞行高度,以将无人机与穿戴设备的相对高度调整到第二预定距离范围内;
其中,第二预定距离范围是预先设置的第二距离阈值范围,可以根据具体需求设置,此处不做具体限定。
具体地，进一步参阅图2，在一个应用例中，穿戴设备20的高度信息可以默认为与地面同一高度，无人机10的高度信息可以通过无人机10的超声波传感器或气压计等设备获得；根据穿戴设备20的高度信息与无人机10的高度信息，可以计算穿戴设备20和无人机10之间的高度差，进而可以调整无人机10的飞行高度，例如从图2中的A位置飞行到B位置，以将无人机10和穿戴设备20之间的相对高度逐渐减少/增加到第二预定距离范围内（例如3-4米范围内），如图2中从H1减少到H2。当然，在其他应用例中，穿戴设备20的高度信息也可以设置为其他默认高度值，或者通过其他传感器获取，无人机10的高度信息也可以通过双目视觉系统测量，此处不做具体限定。
步骤S143:根据无人机和穿戴设备的水平相对距离和相对高度计算无人机与穿戴设备之间连线相对于水平方向或竖直方向的夹角,并根据该夹角调整成像设备的拍摄角度,以使得成像设备的光轴方向调整到相对于无人机与穿戴设备之间连线处于预定角度范围内。
其中,预定角度范围是预先设置的角度阈值范围,可以根据具体需求设置,此处不做具体限定。
具体地,如图2所示,根据无人机10和穿戴设备20的水平相对距离X2和相对高度H2,计算无人机10与穿戴设备20之间连线(如图2中的CD连线)相对于水平方向或竖直方向的夹角θ1或θ2,根据该夹角调整成像设备101的拍摄角度,使得成像设备101的光轴方向(如图2中的DE连线方向)调整到相对于无人机10与穿戴设备20之间连线(即CD连线)处于预定角度范围内(例如0-10度范围内),例如图2中的α角不大于10度。
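Step S143's angle computation can be sketched as follows. The numeric values (X2 = 2 m, H2 = 3.5 m, the 0–10° range) mirror the text's example; the function names are invented for illustration.

```python
import math

def line_angle_deg(horizontal_dist, relative_height):
    """Angle of the drone-to-wearable line below the horizontal (theta1)."""
    return math.degrees(math.atan2(relative_height, horizontal_dist))

def axis_within_range(camera_pitch_deg, horizontal_dist, relative_height,
                      max_offset_deg=10.0):
    """True if the optical axis is within the predetermined angular range
    (0-10 degrees here) of the drone-to-wearable line."""
    theta = line_angle_deg(horizontal_dist, relative_height)
    return abs(camera_pitch_deg - theta) <= max_offset_deg

theta = line_angle_deg(2.0, 3.5)          # X2 = 2 m, H2 = 3.5 m
print(round(theta, 1))                     # ~60.3 degrees below horizontal
print(axis_within_range(65.0, 2.0, 3.5))   # offset ~4.7 deg -> True
```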
步骤S144:根据无人机和穿戴设备的水平相对距离和相对高度调整成像设备的焦距,使得目标物体在成像设备中的成像大小占整个拍摄画面的比例处于预定比例范围内。
其中,预定比例范围是预先设置的目标物体成像的预设比例范围,可以根据具体需求设置,此处不做具体限定。
具体地,在上述应用例中,如图2所示,当无人机10位于B位置时,经过上述步骤,目标物体30已经处于无人机10所搭载的成像设备101的拍摄范围内,此时,可以基于光学成像原理,根据无人机10和穿戴设备20之间的水平相对距离X2以及相对高度H2,可以计算出无人机10与穿戴设备20之间的距离,从而根据该距离对应调整成像设备101的焦距,逐渐放大/缩小拍摄范围,使得目标物体30的成像大小占整个拍摄画面的比例处于预定比例范围内,例如25%-35%范围内。
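Step S144's focal-length adjustment can be sketched with a simple pinhole-camera model, as the text appeals to optical imaging principles. The sensor height, person height and 30% target ratio below are invented for illustration, not values from the patent.

```python
import math

def focal_for_ratio(target_height_m, distance_m, sensor_height_mm, ratio):
    """Pinhole-model focal length (mm) that makes the target's image
    occupy `ratio` of the frame height: f = ratio * sensor * D / S."""
    return ratio * sensor_height_mm * distance_m / target_height_m

# Drone at X2 = 2 m horizontal and H2 = 3.5 m height -> line-of-sight distance.
d = math.hypot(2.0, 3.5)
f = focal_for_ratio(1.7, d, 4.5, 0.30)  # 1.7 m person, 30% of a 4.5 mm sensor
print(round(d, 2), round(f, 2))
```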
本实施方式中,上述步骤S141-S144可以只执行其中的一个或多个的组合,例如,在判定获取的动作信息与第一动作模板匹配时,无人机与穿戴设备的相对高度已经在第二预设距离范围内,此时只需要执行步骤S142和S143即可。
当然,在其他实施方式中,也可以根据目标物体在成像设备中的成像大小,调整无人机的飞行位置,以使得目标物体在无人机所搭载的成像设备中的成像大小占整个拍摄画面的比例处于预定比例范围内,同时使得无人机与穿戴设备 之间的水平相对距离和相对高度逐渐增大/缩小到预定距离范围内,此处不做具体限定。
上述实施方式中,通过获取目标物体所佩戴的穿戴设备的位置信息和动作信息,判断穿戴设备的动作信息是否与预设的第一动作模板相匹配,并在匹配时,根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得无人机处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内,从而用户通过便携的穿戴设备即可实现控制无人机飞行,而不需要携带笨重的控制器,提高无人机控制的便利性。
请参阅图3,图3是本发明无人机的飞行控制方法第二实施方式的流程示意图。如图3所示,本发明无人机的飞行控制方法包括:
步骤S21:结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪。
进一步地,步骤S21具体包括:
步骤S211:根据穿戴设备的位置信息调整无人机的飞行位置,并根据视觉识别获得的目标物体的位置信息调整无人机的拍摄角度。
具体地,穿戴设备预先与无人机绑定,在无人机飞行过程中,根据穿戴设备上传的位置信息调整无人机的飞行位置,使得无人机处于穿戴设备的预定距离范围内,例如无人机与穿戴设备之间的直线距离处于预定距离范围内(例如4-5米范围内),并根据视觉识别获得的目标物体的位置信息调整无人机的拍摄角度,使得目标物体处于成像设备的预定拍摄范围内,其中,预定距离范围和预定拍摄范围可以根据实际需求设置,此处不做具体限定。
进一步地,步骤S21之前,包括:
步骤S20:从成像设备的拍摄画面内对目标物体进行视觉识别。
其中,无人机可以通过某一预定动作信息触发视觉识别功能,也可以自动触发视觉识别功能,此处不做具体限定。
在一个应用例中,步骤S20包括:
步骤S201:对拍摄画面中的至少两个候选物体进行动作识别,以分别获取至少两个候选物体的动作信息;
具体地,无人机可以先对拍摄画面进行轮廓识别,选取其中与人体轮廓比较接近的至少两个候选物体进行动作识别,通过视觉识别算法获取至少两个候选物体的动作信息,例如动作轨迹、加速度等信息。当然,在其他应用例中,也可以随机选取画面中的至少两个物体作为候选物体,也可以采用其他选取方式,此处不做具体限定。
步骤S202:将至少两个候选物体的动作信息与穿戴设备的动作信息或第一动作模板进行匹配;
步骤S203:将相匹配的候选物体作为目标物体。
其中,第一动作模板是预先设置的姿态数据,包括对应的动作的加速度、角速度、运动轨迹等至少一种数据。
具体地,将至少两个候选物体的动作信息与穿戴设备上传的动作信息或者是第一动作模板进行匹配,例如判断候选物体的动作的运动轨迹是否与穿戴设备上传的运动轨迹相同,或者在两者之间的差异在容许范围内,其中该容许范围是预先设置的;最后将相匹配的候选物体作为目标物体。
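Steps S201–S203 can be sketched as below: each candidate's visually recognised motion is compared with the wearable's uploaded motion, and the best match within tolerance becomes the target. The track values, names, and 0.3 tolerance are illustrative assumptions.

```python
def pick_target(candidates, wearable_track, tol=0.3):
    """Return the candidate whose visually recognised trajectory deviates
    least from the wearable's track, if the deviation is within `tol`."""
    def deviation(track):
        return max(abs(a - b) for a, b in zip(track, wearable_track))
    best = min(candidates, key=lambda name: deviation(candidates[name]))
    return best if deviation(candidates[best]) <= tol else None

candidates = {
    "person_a": [0.0, 1.0, 2.1, 1.0],   # close to the wearable's track
    "person_b": [0.0, 0.1, 0.0, 0.1],   # standing still
}
print(pick_target(candidates, [0.0, 1.1, 2.0, 0.9]))  # person_a
```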
当然,在其他实施方式中,选取候选物体时也可以只选取一个候选物体进行动作匹配,或者随机选取拍摄画面中的其中一个物体作为目标物体,此处不做具体限定。
进一步地,步骤S21包括:
步骤S212:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异是否大于第一预设阈值;
其中,该第一预设阈值是预先设置的视觉测距的最大容许误差,超过第一预设阈值,则表明视觉测距不可靠,其具体取值可以视具体需求设置,此处不做具体限定。
具体地,无人机可以利用视觉测距算法,例如双目视觉测距,获得目标物体与无人机之间的距离a,同时无人机还可以利用穿戴设备的位置信息和无人机的位置信息,计算无人机与穿戴设备之间的距离b,从而可以计算a、b之间的差异,进而判断该差异是否大于预设阈值,例如0.5米。
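The reliability check in step S212 can be sketched as follows, using the text's example threshold of 0.5 m; the positions and function name are illustrative.

```python
import math

def tracking_unreliable(visual_dist, drone_pos, wearable_pos, threshold=0.5):
    """Compare the visually measured distance `a` with the GPS-derived
    distance `b`; a gap above `threshold` (0.5 m in the text's example)
    means the visual track can no longer be trusted."""
    b = math.dist(drone_pos, wearable_pos)
    return abs(visual_dist - b) > threshold

# Visual ranging reports 4.2 m; GPS-derived distance is ~4.24 m -> reliable.
print(tracking_unreliable(4.2, (0, 0, 3), (3, 0, 0), threshold=0.5))  # False
```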
步骤S213:若大于第一预设阈值,则重新根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得无人机处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内。
其中,步骤S213具体包括:
步骤S2131:若大于第一预设阈值,则根据穿戴设备的位置信息对无人机的飞行位置进行调整,以使得无人机处于穿戴设备的预定距离范围内。
其中,本步骤的具体执行过程可以参考上述步骤S141和/或S142,此处不再重复。通过上述步骤,可以在无人机视觉跟踪跟丢目标物体或者视觉跟踪出错时,根据穿戴设备的位置信息控制无人机的飞行,使得无人机重新处于穿戴设备的预设范围内,从而防止无人机跟丢目标。
步骤S2132:对无人机的所搭载的成像设备的拍摄角度进行调整,以使得目标物体处于成像设备的预定拍摄范围内。
步骤S2133:根据无人机和穿戴设备的水平相对距离和相对高度调整成像设备的焦距,使得目标物体在成像设备中的成像大小占整个拍摄画面的比例处于预定比例范围内。
其中,本步骤的具体执行过程可以参考上述步骤S143,此处不再重复。在其他实施方式中,步骤S2131、S2132和S2133也可以只执行其中一个或任意两个的组合,此处不做具体限定。
步骤S214:重新对目标物体进行视觉识别。
具体地,无人机可以利用视觉检测算法,例如通过特征提取辨识,识别目标物体,也可以随机选取拍摄画面中的一个物体作为目标物体,具体过程可以 参考上述步骤S20,此处不再重复。
步骤S215:判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异不大于第一预设阈值的持续时间是否大于第二预设阈值;
其中,第二预设阈值是预先设置的视觉测距的稳定时间阈值,超过第二预设阈值则表明视觉测距稳定可靠,其具体取值可以视具体需求设置,此处不做具体限定。
步骤S216:若持续时间大于第二预设阈值,则结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪;否则,重新根据穿戴设备的位置信息对无人机的飞行位置进行调整。
具体地,无人机持续判断穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异是否不大于第一预设阈值,并记录穿戴设备的位置信息和视觉识别获得的目标物体的位置信息之间的差异不大于第一预设阈值的持续时间,若该持续时间大于第二预设阈值(例如1分钟)时,则表明该视觉测距稳定可靠,即无人机的视觉跟踪稳定可靠,此时,可以结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪;否则,表明无人机的视觉跟踪不稳定,重新根据穿戴设备的位置信息对无人机的飞行位置进行调整。
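The duration check in steps S215–S216 amounts to a hold timer that resets whenever the position difference exceeds the first threshold, as sketched below. The 0.5 m tolerance and 60 s hold time follow the text's examples; the sampling scheme is an assumption.

```python
def stable_enough(diff_samples, sample_dt, diff_tol=0.5, hold_time=60.0):
    """True once the GPS/visual position difference has stayed at or below
    `diff_tol` continuously for `hold_time` seconds (the second threshold)."""
    held = 0.0
    for diff in diff_samples:
        held = held + sample_dt if diff <= diff_tol else 0.0
        if held >= hold_time:
            return True
    return False

# 70 one-second samples all within tolerance -> stable after 60 s.
print(stable_enough([0.2] * 70, sample_dt=1.0))                          # True
# A single spike resets the timer, so 40 trailing seconds are not enough.
print(stable_enough([0.2] * 30 + [1.5] + [0.2] * 40, sample_dt=1.0))     # False
```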
上述实施方式中,通过结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪,从而采用穿戴设备的位置信息弥补视觉识别不稳定的问题,提高目标跟踪的准确性。
本实施方式可以与本发明无人机的飞行控制方法第一实施方式相结合,本实施方式步骤的执行可以在步骤S10或者步骤S14之后。
请参阅图4,图4是本发明无人机的飞行控制方法第三实施方式的流程示意图。如图4所示,本发明无人机的飞行控制方法包括:
步骤S31:判断获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配;
其中，穿戴设备为佩戴于目标物体的手臂上的手环或手表，第二动作模板对应于目标物体的手腕翻转动作。动作信息是穿戴设备的姿态数据，包括加速度、角速度或者运动轨迹等至少一种数据，该动作信息是通过穿戴设备的惯性测量单元或者磁力计等传感器检测得到的。当然，在其他实施方式中，穿戴设备也可以是佩戴于目标物体手指上的戒指等其他类型的设备，第二动作模板也可以对应于目标物体的抬手等其他动作，此处不做具体限定。
步骤S32:若与第二动作模板相匹配,则对无人机进行方位调整。
具体地,目标物体所佩戴的穿戴设备预先与无人机绑定,通过无线通信链路向无人机上传动作信息,无人机通过判断获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配,例如判断穿戴设备的运动轨迹或者视觉识别得到的目标物体的动作的运动轨迹是否与预设的第二动作模板的运动轨迹相同,或者两者之间的差异在容许范围内,其中该容许范围是预先设置的;若运动轨迹相同或差异在容许范围内,则判定获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息与预设的第二动作模板相匹配,可以对无人机进行方位调整;同时,可以同步调整无人机搭载的成像设备的拍摄角度,使得目标物体始终处于拍摄范围内。
进一步地,步骤S32具体包括:
步骤S321:根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系。
其中,姿态信息是通过穿戴设备的磁力计和/或惯性测量单元检测获得的穿戴设备的方位角或欧拉角等数据。
具体地,步骤S321包括:
步骤S3211:根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系,以使得目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线始终保持在预设角度范围内。
如图5所示,在一个应用例中,无人机10根据后续获得的穿戴设备20的 姿态信息,例如欧拉角,可以得到目标物体30的佩戴有穿戴设备20的肢体301(例如手臂)的指向,如图5中的AB连线方向;在目标物体30的佩戴有穿戴设备20的肢体301的指向发生移动时,如图5中从AB连线方向移动到DE连线方向,同步调整无人机10的飞行位置,例如根据后续获得的穿戴设备20的欧拉角,获取穿戴设备20的偏转角度δ,以该偏转角度δ作为无人机10的飞行角度,在无人机10所处的水平面上,并且在无人机10与目标物体30之间距离不变时,同步调整无人机10的飞行位置,即从M位置飞行到N位置,使得无人机10与穿戴设备20之间的连线AC/DF与手臂的指向AB/DE之间的角度差β始终保持在预设角度范围内(例如±15度内),其中无人机10与穿戴设备20之间的连线AC/DF的指向可以通过无人机10和穿戴设备20的位置信息计算得到;该预设角度范围可以根据实际需求设置,此处不做具体限定。
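The synchronous position adjustment in step S3211 (flying from M to N as the arm sweeps through deflection angle δ, at constant drone-to-target distance) is essentially a rotation of the drone's horizontal position about the target, as sketched below with illustrative coordinates.

```python
import math

def rotate_about_target(drone_xy, target_xy, delta_deg):
    """Fly the drone along a circle around the target: rotate its
    horizontal position by the wearable's deflection angle delta,
    keeping the drone-to-target distance unchanged."""
    dx = drone_xy[0] - target_xy[0]
    dy = drone_xy[1] - target_xy[1]
    d = math.radians(delta_deg)
    return (target_xy[0] + dx * math.cos(d) - dy * math.sin(d),
            target_xy[1] + dx * math.sin(d) + dy * math.cos(d))

# Arm sweeps 90 degrees; drone at (5, 0) moves to (0, 5) around the target.
new_pos = rotate_about_target((5.0, 0.0), (0.0, 0.0), 90.0)
print(round(new_pos[0], 6), round(new_pos[1], 6))
```

For step S3213, the same rotation would simply use δ plus the recorded compensation angle β instead of δ alone.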
或者,步骤S321具体包括:
步骤S3212:记录目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线之间的夹角;
具体地,如图6所示,在一个应用例中,无人机10根据后续获得的穿戴设备20的姿态信息,例如欧拉角,可以得到目标物体30的佩戴有穿戴设备20的肢体301(例如手臂)的指向,如图6中的AB连线方向;通过无人机10和穿戴设备20的位置信息可以计算得到无人机10与穿戴设备20之间的连线的指向,即图6中的AC连线方向,从而可以获得手臂301的指向(AB连线方向)相对于无人机10与穿戴设备20之间连线(AC连线方向)之间的夹角β。
步骤S3213:以该夹角作为补偿值根据后续获得的穿戴设备的姿态信息调整无人机与穿戴设备的相对位置关系,以使得调整后的目标物体的佩戴有穿戴设备的肢体的指向和无人机与穿戴设备之间连线彼此重合。
具体地,在上述应用例中,在目标物体30的佩戴有穿戴设备20的肢体301的指向发生移动时,如图5中从AB连线方向移动到DE连线方向,以该夹角β作为补偿值,同步调整无人机10的飞行位置,例如根据后续获得的穿戴设备20 的欧拉角,获取穿戴设备20的偏转角度δ,以该偏转角度δ加上补偿值β作为无人机10的飞行角度,在无人机10所处的水平面上,并且在无人机10与目标物体30之间距离不变时,同步调整无人机10的飞行位置,从而可以使得调整后的目标物体30的佩戴有穿戴设备20的肢体301的指向(如图6中的DE连线方向)和无人机10与穿戴设备20之间连线(如图6中的DF连线方向)彼此重合。
进一步地,步骤S31之前,包括:
步骤S300:获取穿戴设备的姿态信息;
步骤S301:判断穿戴设备的姿态信息是否满足预设的方位调整触发条件;
步骤S302:若满足方位调整触发条件,则执行步骤S31。
其中,步骤S301包括:
步骤S3011:根据穿戴设备的姿态信息和无人机与穿戴设备之间的相对位置关系,判断目标物体的佩戴有穿戴设备的肢体的指向相对于无人机与穿戴设备之间连线是否处于预设角度范围内;
步骤S3012:若处于预设角度范围内,则满足方位调整触发条件。
其中,预设角度范围是预先设置的角度最大偏差范围,其具体取值可以根据实际需求设置,此处不做具体限定。
具体地,如图7所示,在一个应用例中,无人机10根据获得的穿戴设备20的姿态信息,例如欧拉角,可以得到目标物体30的佩戴有穿戴设备20的肢体301(例如手臂)的指向,如图7中的AB连线方向,根据无人机10和穿戴设备20的位置信息可以计算得到无人机10与穿戴设备20之间连线的指向,如图7中的AC连线方向,判断手臂301指向(即AB连线方向)和无人机10与穿戴设备20之间连线的指向(即AC连线方向)之间的夹角β是否处于预设角度范围内(例如±20度内),若处于预设角度范围内,则表明手臂301的指向对准无人机10,则满足方位调整触发条件,可以继续执行步骤S31,从而使得无人机10随着手臂301的移动同步调整飞行位置;否则,不触发方位调整,即不执行 步骤S31。在方位调整过程中,若该手臂301的指向(即AB连线方向)和无人机10与穿戴设备20之间连线的指向(即AC连线方向)之间的夹角β超出预设角度范围,则判定方位调整结束。
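The trigger condition in steps S3011–S3012 reduces to comparing two bearings, as sketched below: the limb's pointing direction (from the wearable's attitude) against the wearable-to-drone line (from the two positions). The ±20° range follows the text's example; the 2-D setup and values are illustrative.

```python
import math

def bearing_deg(from_xy, to_xy):
    """Heading of the line from one horizontal position to another."""
    return math.degrees(math.atan2(to_xy[1] - from_xy[1],
                                   to_xy[0] - from_xy[0]))

def trigger_ok(limb_heading_deg, wearable_xy, drone_xy, max_beta=20.0):
    """True when the limb's pointing direction is within the preset range
    (+/-20 degrees here) of the wearable-to-drone line."""
    line = bearing_deg(wearable_xy, drone_xy)
    beta = (limb_heading_deg - line + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(beta) <= max_beta

print(trigger_ok(50.0, (0.0, 0.0), (3.0, 3.0)))   # line = 45 deg, beta = 5 -> True
print(trigger_ok(120.0, (0.0, 0.0), (3.0, 3.0)))  # beta = 75 -> False
```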
上述实施方式中,通过判断获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配,并在相匹配时,对无人机进行方位调整,从而可以调整无人机与目标物体的相对位置,同时也可以同步调整无人机搭载的成像设备的拍摄角度,使得目标物体始终处于拍摄范围内,进而通过便携的穿戴设备即可以实现无人机的飞行控制,而不需要携带笨重的控制器,提高无人机的控制便利性。
本实施方式可以与本发明无人机的飞行控制方法第一和/或第二实施方式相结合,本实施方式步骤的执行可以在步骤S14或者步骤S21之后。
请参阅图8,图8是本发明无人机一实施方式的结构示意图。如图8所示,本发明无人机80包括:
无线通信电路801,用于获取目标物体所佩戴的穿戴设备的位置信息和动作信息;
处理器802,耦接无线通信电路801,用于判断穿戴设备的动作信息是否与预设的第一动作模板相匹配,并在穿戴设备的动作信息与第一动作模板相匹配时,根据穿戴设备的位置信息对无人机80的飞行位置、无人机80的所搭载的成像设备的拍摄角度和目标物体在无人机80的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得无人机80处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内。
其中,穿戴设备为佩戴于目标物体的手臂上的手环或手表,第一动作模板对应于目标物体的挥手动作。当然,在其他实施方式中,穿戴设备也可以是佩戴于目标物体手指上的戒指等其他类型的设备,第一动作模板也可以对应上下晃动手臂等其他动作,此处不做具体限定。
进一步地,无人机80还包括:
存储器803,耦接处理器802,用于存储处理器802运行所需的指令和数据,例如第一动作模板;
定位仪804,耦接处理器802,用于获取无人机80的位置信息;
其中，处理器802判断穿戴设备的动作信息是否与预设的第一动作模板相匹配，以及根据穿戴设备的位置信息对无人机80的飞行位置、无人机80所搭载的成像设备的拍摄角度和目标物体在无人机80的所搭载的成像设备中的成像大小中的至少一者或组合进行调整的具体过程可以参考本发明无人机的飞行控制方法第一实施方式的对应步骤，此处不再重复。
进一步地,处理器802用于:从成像设备的拍摄画面内对目标物体进行视觉识别;并结合后续获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机80对目标物体进行跟踪;其具体过程可以参考本发明无人机的飞行控制方法第二实施方式的对应步骤,此处不再重复。
进一步地,处理器802还用于:判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配,并在所获得的动作信息与第二动作模板相匹配时,对无人机80进行方位调整。
其中,第二动作模板对应于目标物体的手腕翻转动作。当然,在其他实施方式中,第二动作模板也可以对应抬手等其他动作,此处不做具体限定。
处理器802判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配之前,进一步用于判断穿戴设备的姿态信息是否满足预设的方位调整触发条件,并在满足方位调整触发条件时,执行判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤。
其中,处理器802判断穿戴设备的姿态信息是否满足预设的方位调整触发条件,判断后续获取的穿戴设备的动作信息或对目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配,以及对无人机80进行方位调整的具体过程可以参考本发明无人机的飞行控制方法第三实施方式的对应步骤, 此处不再重复。
当然,在其他实施方式中,无人机还可以包括超声波传感器、磁力计等其他部件,此处不做具体限定。
上述实施方式中,无人机通过获取目标物体所佩戴的穿戴设备的位置信息和动作信息,判断穿戴设备的动作信息是否与预设的第一动作模板相匹配,并在匹配时,根据穿戴设备的位置信息对无人机的飞行位置、无人机的所搭载的成像设备的拍摄角度和目标物体在无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得无人机处于穿戴设备的预定距离范围内且目标物体处于成像设备的预定拍摄范围内,从而用户通过便携的穿戴设备即可实现控制无人机飞行,而不需要携带笨重的控制器,提高无人机控制的便利性;另外,通过结合获得的穿戴设备的位置信息和视觉识别获得的目标物体的位置信息控制无人机对目标物体进行跟踪,从而采用穿戴设备的位置信息弥补视觉识别不稳定的问题,提高目标跟踪的准确性。
以上所述仅为本发明的实施方式,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。

Claims (44)

  1. 一种无人机的飞行控制方法,其特征在于,所述方法包括:
    获取目标物体所佩戴的穿戴设备的位置信息和动作信息;
    判断所述穿戴设备的动作信息是否与预设的第一动作模板相匹配;
    若与所述第一动作模板相匹配,则根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得所述无人机处于所述穿戴设备的预定距离范围内且所述目标物体处于所述成像设备的预定拍摄范围内。
  2. 根据权利要求1所述的方法,其特征在于,所述穿戴设备为佩戴于所述目标物体的手臂上的手环或手表,所述第一动作模板对应于所述目标物体的挥手动作。
  3. 根据权利要求1所述的方法,其特征在于,所述根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整的步骤包括:
    根据所述穿戴设备的水平位置信息与所述无人机的水平位置信息调整所述无人机的水平飞行位置,以将所述无人机与所述穿戴设备的水平相对距离调整到第一预定距离范围内;
    根据所述穿戴设备的高度信息与所述无人机的高度信息调整所述无人机的飞行高度,以将所述无人机与所述穿戴设备的相对高度调整到第二预定距离范围内;
    根据所述无人机和所述穿戴设备的水平相对距离和相对高度计算所述无人机与所述穿戴设备之间连线相对于水平方向或竖直方向的夹角,并根据所述夹角调整所述成像设备的拍摄角度,以使得所述成像设备的光轴方向调整到相对 于所述无人机与所述穿戴设备之间连线处于预定角度范围内;
    根据所述无人机和所述穿戴设备的水平相对距离和相对高度调整所述成像设备的焦距,使得所述目标物体在所述成像设备中的成像大小占整个拍摄画面的比例处于预定比例范围内。
  4. 根据权利要求1所述的方法,其特征在于,所述方法进一步包括:
    从所述成像设备的拍摄画面内对所述目标物体进行视觉识别。
  5. 根据权利要求4所述的方法,其特征在于,所述从所述成像设备的拍摄画面内对所述目标物体进行视觉识别的步骤包括:
    对所述拍摄画面中的至少两个候选物体进行动作识别,以分别获取所述至少两个候选物体的动作信息;
    将所述至少两个候选物体的动作信息与所述穿戴设备的动作信息或所述第一动作模板进行匹配;
    将相匹配的所述候选物体作为所述目标物体。
  6. 根据权利要求1所述的方法,其特征在于,所述方法进一步包括:
    结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪。
  7. 根据权利要求6所述的方法,其特征在于,所述结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪的步骤包括:
    根据所述穿戴设备的位置信息调整所述无人机的飞行位置,并根据视觉识别获得的所述目标物体的位置信息调整所述无人机的拍摄角度。
  8. 根据权利要求6所述的方法,其特征在于,所述结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪的步骤进一步包括:
    判断所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息之间的差异是否大于预设阈值;
    若大于所述预设阈值,则重新根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得所述无人机处于所述穿戴设备的预定距离范围内且所述目标物体处于所述成像设备的预定拍摄范围内。
  9. 根据权利要求8所述的方法,其特征在于,所述重新根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整的步骤之后,进一步包括:
    重新对所述目标物体进行视觉识别。
  10. 根据权利要求4所述的方法,其特征在于,所述方法进一步包括:
    判断后续获取的所述穿戴设备的动作信息或对所述目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配;
    若与所述第二动作模板相匹配,则对所述无人机进行方位调整。
  11. 根据权利要求10所述的方法,其特征在于,所述穿戴设备为佩戴于所述目标物体的手臂上的手环或手表,所述第二动作模板对应于所述目标物体的手腕翻转动作。
  12. 根据权利要求10所述的方法,其特征在于,所述判断后续获取的所述穿戴设备的动作信息或对所述目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤之前,进一步包括:
    获取所述穿戴设备的姿态信息;
    判断所述穿戴设备的姿态信息是否满足预设的方位调整触发条件;
    若满足所述方位调整触发条件,则执行所述判断后续获取的所述穿戴设备的动作信息或对所述目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤。
  13. 根据权利要求12所述的方法,其特征在于,所述判断所述穿戴设备的姿 态信息是否满足预设的方位调整触发条件的步骤包括:
    根据所述穿戴设备的姿态信息和所述无人机与所述穿戴设备之间的相对位置关系,判断所述目标物体的佩戴有所述穿戴设备的肢体的指向相对于所述无人机与所述穿戴设备之间连线是否处于预设角度范围内;
    若处于所述预设角度范围内,则满足所述方位调整触发条件。
  14. 根据权利要求13所述的方法,其特征在于,所述对所述无人机进行方位调整的步骤包括:
    根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系。
  15. 根据权利要求14所述的方法,其特征在于,所述根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系的步骤包括:
    根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系,以使得所述目标物体的佩戴有所述穿戴设备的肢体的指向相对于所述无人机与所述穿戴设备之间连线始终保持在所述预设角度范围内。
  16. 根据权利要求14所述的方法,其特征在于,所述根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系的步骤包括:
    记录所述目标物体的佩戴有所述穿戴设备的肢体的指向相对于所述无人机与所述穿戴设备之间连线之间的夹角;
    以所述夹角作为补偿值根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系,以使得调整后的所述目标物体的佩戴有所述穿戴设备的肢体的指向和所述无人机与所述穿戴设备之间连线彼此重合。
  17. 一种无人机的飞行控制方法,其特征在于,包括:
    获取目标物体所佩戴的穿戴设备的位置信息;
    结合获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪。
  18. 根据权利要求17所述的方法,其特征在于,所述结合获得的所述穿戴设 备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪的步骤包括:
    根据所述穿戴设备的位置信息调整所述无人机的飞行位置,并根据视觉识别获得的所述目标物体的位置信息调整所述无人机的拍摄角度。
  19. 根据权利要求17所述的方法,其特征在于,所述方法进一步包括:
    判断所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息之间的差异是否大于第一预设阈值;
    若大于所述第一预设阈值,则根据所述穿戴设备的所述位置信息对所述无人机的飞行位置进行调整,以使得所述无人机处于所述穿戴设备的预定距离范围内。
  20. 根据权利要求19所述的方法,其特征在于,所述根据所述穿戴设备的所述位置信息对所述无人机的飞行位置进行调整的步骤之后,进一步包括:
    对所述无人机的所搭载的成像设备的拍摄角度进行调整,以使得所述目标物体处于所述成像设备的预定拍摄范围内。
  21. 根据权利要求17所述的方法,其特征在于,所述方法进一步包括:
    从所述成像设备的拍摄画面内对所述目标物体进行视觉识别。
  22. 根据权利要求19所述的方法,其特征在于,所述方法进一步包括:
    判断所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息之间的差异不大于第一预设阈值的持续时间是否大于第二预设阈值;
    若所述持续时间大于所述第二预设阈值,则结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪;否则,重新根据所述穿戴设备的所述位置信息对所述无人机的飞行位置进行调整。
  23. 一种无人机,其特征在于,包括:
    无线通信电路,用于获取目标物体所佩戴的穿戴设备的位置信息和动作信息;
    处理器,耦接所述无线通信电路,用于判断所述穿戴设备的动作信息是否与预设的第一动作模板相匹配,并在所述穿戴设备的动作信息与所述第一动作模板相匹配时,根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整,以使得所述无人机处于所述穿戴设备的预定距离范围内且所述目标物体处于所述成像设备的预定拍摄范围内。
  24. 根据权利要求23所述的无人机,其特征在于,所述穿戴设备为佩戴于所述目标物体的手臂上的手环或手表,所述第一动作模板对应于所述目标物体的挥手动作。
  25. 根据权利要求23所述的无人机,其特征在于,所述处理器根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整具体包括:
    根据所述穿戴设备的水平位置信息与所述无人机的水平位置信息调整所述无人机的水平飞行位置,以将所述无人机与所述穿戴设备的水平相对距离调整到第一预定距离范围内;
    根据所述穿戴设备的高度信息与所述无人机的高度信息调整所述无人机的飞行高度,以将所述无人机与所述穿戴设备的相对高度调整到第二预定距离范围内;
    根据所述无人机和所述穿戴设备的水平相对距离和相对高度计算所述无人机与所述穿戴设备之间连线相对于水平方向或竖直方向的夹角,并根据所述夹角调整所述成像设备的拍摄角度,以使得所述成像设备的光轴方向调整到相对于所述无人机与所述穿戴设备之间连线处于预定角度范围内;
    根据所述无人机和所述穿戴设备的水平相对距离和相对高度调整所述成像设备的焦距,使得所述目标物体在所述成像设备中的成像大小占整个拍摄画面 的比例处于预定比例范围内。
  26. 根据权利要求23所述的无人机,其特征在于,所述处理器进一步用于:
    从所述成像设备的拍摄画面内对所述目标物体进行视觉识别。
  27. 根据权利要求26所述的无人机,其特征在于,所述处理器从所述成像设备的拍摄画面内对所述目标物体进行视觉识别具体包括:
    对所述拍摄画面中的至少两个候选物体进行动作识别,以分别获取所述至少两个候选物体的动作信息;
    将所述至少两个候选物体的动作信息与所述穿戴设备的动作信息或所述第一动作模板进行匹配;
    将相匹配的所述候选物体作为所述目标物体。
  28. 根据权利要求23所述的无人机,其特征在于,所述处理器进一步用于:
    结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪。
  29. 根据权利要求28所述的无人机,其特征在于,所述处理器结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪具体包括:
    根据所述穿戴设备的位置信息调整所述无人机的飞行位置,并根据视觉识别获得的所述目标物体的位置信息调整所述无人机的拍摄角度。
  30. 根据权利要求28所述的无人机,其特征在于,所述处理器结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪的步骤进一步包括:
    判断所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息之间的差异是否大于预设阈值;
    若大于所述预设阈值,则重新根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整。
  31. 根据权利要求30所述的无人机,其特征在于,所述处理器重新根据所述穿戴设备的所述位置信息对所述无人机的飞行位置、所述无人机的所搭载的成像设备的拍摄角度和所述目标物体在所述无人机的所搭载的成像设备中的成像大小中的至少一者或组合进行调整之后,进一步用于:
    重新对所述目标物体进行视觉识别。
  32. 根据权利要求26所述的无人机,其特征在于,所述处理器进一步用于:
    判断后续获取的所述穿戴设备的动作信息或对所述目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配,并在所获得的所述动作信息与所述第二动作模板相匹配时,对所述无人机进行方位调整。
  33. 根据权利要求32所述的无人机,其特征在于,所述穿戴设备为佩戴于所述目标物体的手臂上的手环或手表,所述第二动作模板对应于所述目标物体的手腕翻转动作。
  34. 根据权利要求32所述的无人机,其特征在于,所述处理器判断后续获取的所述穿戴设备的动作信息或对所述目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配之前,
    所述无线通信电路进一步用于获取所述穿戴设备的姿态信息;
    所述处理器进一步用于判断所述穿戴设备的姿态信息是否满足预设的方位调整触发条件,并在满足所述方位调整触发条件时,执行所述判断后续获取的所述穿戴设备的动作信息或对所述目标物体进行动作识别所获得的动作信息是否与预设的第二动作模板相匹配的步骤。
  35. 根据权利要求34所述的无人机,其特征在于,所述处理器判断所述穿戴设备的姿态信息是否满足预设的方位调整触发条件具体包括:
    根据所述穿戴设备的姿态信息和所述无人机与所述穿戴设备之间的相对位置关系,判断所述目标物体的佩戴有所述穿戴设备的肢体的指向相对于所述无人机与所述穿戴设备之间连线是否处于预设角度范围内,并在处于所述预设角度范围内时,判定所述穿戴设备的姿态信息满足所述方位调整触发条件。
  36. 根据权利要求35所述的无人机,其特征在于,所述处理器对所述无人机进行方位调整具体包括:
    根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系。
  37. 根据权利要求36所述的无人机,其特征在于,所述处理器根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系具体包括:
    根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系,以使得所述目标物体的佩戴有所述穿戴设备的肢体的指向相对于所述无人机与所述穿戴设备之间连线始终保持在所述预设角度范围内。
  38. 根据权利要求36所述的无人机,其特征在于,所述处理器根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系具体包括:
    记录所述目标物体的佩戴有所述穿戴设备的肢体的指向相对于所述无人机与所述穿戴设备之间连线之间的夹角;
    以所述夹角作为补偿值根据后续获得的所述穿戴设备的姿态信息调整所述无人机与所述穿戴设备的相对位置关系,以使得调整后的所述目标物体的佩戴有所述穿戴设备的肢体的指向和所述无人机与所述穿戴设备之间连线彼此重合。
  39. 一种无人机,其特征在于,包括:
    无线通信电路,用于获取目标物体所佩戴的穿戴设备的位置信息;
    处理器,耦接所述无线通信电路,用于结合获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪。
  40. 根据权利要求39所述的无人机,其特征在于,所述处理器结合获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪具体包括:
    根据所述穿戴设备的位置信息调整所述无人机的飞行位置,并根据视觉识别获得的所述目标物体的位置信息调整所述无人机的拍摄角度。
  41. 根据权利要求39所述的无人机,其特征在于,所述处理器进一步用于:
    判断所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息之间的差异是否大于第一预设阈值,并在大于所述第一预设阈值时,根据所述穿戴设备的所述位置信息对所述无人机的飞行位置进行调整,以使得所述无人机处于所述穿戴设备的预定距离范围内。
  42. 根据权利要求41所述的无人机,其特征在于,所述处理器根据所述穿戴设备的所述位置信息对所述无人机的飞行位置进行调整之后,进一步用于:
    对所述无人机的所搭载的成像设备的拍摄角度进行调整,以使得所述目标物体处于所述成像设备的预定拍摄范围内。
  43. 根据权利要求39所述的无人机,其特征在于,所述处理器进一步用于:
    从所述成像设备的拍摄画面内对所述目标物体进行视觉识别。
  44. 根据权利要求41所述的无人机,其特征在于,所述处理器进一步用于:
    判断所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息之间的差异不大于第一预设阈值的持续时间是否大于第二预设阈值,并在所述持续时间大于所述第二预设阈值时,结合后续获得的所述穿戴设备的位置信息和视觉识别获得的所述目标物体的位置信息控制所述无人机对所述目标物体进行跟踪,在所述持续时间不大于所述第二预设阈值时,重新根据所述穿戴设备的所述位置信息对所述无人机的飞行位置进行调整。
PCT/CN2017/079134 2017-03-31 2017-03-31 一种无人机的飞行控制方法及无人机 WO2018176426A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202210149842.8A CN114510079A (zh) 2017-03-31 2017-03-31 一种无人机的飞行控制方法及无人机
PCT/CN2017/079134 WO2018176426A1 (zh) 2017-03-31 2017-03-31 一种无人机的飞行控制方法及无人机
CN201780054731.6A CN109690440B (zh) 2017-03-31 2017-03-31 一种无人机的飞行控制方法及无人机

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/079134 WO2018176426A1 (zh) 2017-03-31 2017-03-31 一种无人机的飞行控制方法及无人机

Publications (1)

Publication Number Publication Date
WO2018176426A1 true WO2018176426A1 (zh) 2018-10-04

Family

ID=63673997

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079134 WO2018176426A1 (zh) 2017-03-31 2017-03-31 一种无人机的飞行控制方法及无人机

Country Status (2)

Country Link
CN (2) CN109690440B (zh)
WO (1) WO2018176426A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112989982A (zh) * 2021-03-05 2021-06-18 佛山科学技术学院 一种无人车图像采集控制方法及系统

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105676860A (zh) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 一种可穿戴设备、无人机控制装置和控制实现方法
CN105955306A (zh) * 2016-07-20 2016-09-21 西安中科比奇创新科技有限责任公司 可穿戴设备、基于可穿戴设备的无人机控制方法及系统
JP6020872B1 (ja) * 2016-06-24 2016-11-02 株式会社アドインテ 分析システム及び分析方法
CN106161953A (zh) * 2016-08-12 2016-11-23 零度智控(北京)智能科技有限公司 一种跟踪拍摄方法和装置
CN106370184A (zh) * 2016-08-29 2017-02-01 北京奇虎科技有限公司 无人机自动跟踪拍摄的方法、无人机和移动终端设备
CN106446837A (zh) * 2016-09-28 2017-02-22 湖南优象科技有限公司 一种基于运动历史图像的挥手检测方法

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
US20120194551A1 (en) * 2010-02-28 2012-08-02 Osterhout Group, Inc. Ar glasses with user-action based command and control of external devices
WO2015179797A1 (en) * 2014-05-23 2015-11-26 Lily Robotics, Inc. Unmanned aerial copter for photography and/or videography
JP6784434B2 (ja) * 2014-07-30 2020-11-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd 方法、uav制御プログラム、無人航空機、及び制御システム
CN105843246A (zh) * 2015-11-27 2016-08-10 深圳市星图智控科技有限公司 无人机跟踪方法、系统及无人机
CN105759839B (zh) * 2016-03-01 2018-02-16 深圳市大疆创新科技有限公司 无人机视觉跟踪方法、装置以及无人机
CN106020492A (zh) * 2016-06-07 2016-10-12 赵武刚 通过手的动作与手势产生遥控无人机及附件的信号的方法
CN106155090B (zh) * 2016-08-29 2019-04-19 电子科技大学 基于体感的可穿戴无人机控制设备
CN106444843B (zh) * 2016-12-07 2019-02-15 北京奇虎科技有限公司 无人机相对方位控制方法及装置

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN105676860A (zh) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 一种可穿戴设备、无人机控制装置和控制实现方法
JP6020872B1 (ja) * 2016-06-24 2016-11-02 株式会社アドインテ 分析システム及び分析方法
CN105955306A (zh) * 2016-07-20 2016-09-21 西安中科比奇创新科技有限责任公司 可穿戴设备、基于可穿戴设备的无人机控制方法及系统
CN106161953A (zh) * 2016-08-12 2016-11-23 零度智控(北京)智能科技有限公司 一种跟踪拍摄方法和装置
CN106370184A (zh) * 2016-08-29 2017-02-01 北京奇虎科技有限公司 无人机自动跟踪拍摄的方法、无人机和移动终端设备
CN106446837A (zh) * 2016-09-28 2017-02-22 湖南优象科技有限公司 一种基于运动历史图像的挥手检测方法

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN112989982A (zh) * 2021-03-05 2021-06-18 佛山科学技术学院 一种无人车图像采集控制方法及系统
CN112989982B (zh) * 2021-03-05 2024-04-30 佛山科学技术学院 一种无人车图像采集控制方法及系统

Also Published As

Publication number Publication date
CN114510079A (zh) 2022-05-17
CN109690440B (zh) 2022-03-08
CN109690440A (zh) 2019-04-26

Similar Documents

Publication Publication Date Title
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
US11733692B2 (en) Systems and methods for controlling an unmanned aerial vehicle
US11125563B2 (en) Systems and methods for autonomous machine tracking and localization of mobile objects
US11454964B2 (en) Systems and methods for adjusting flight control of an unmanned aerial vehicle
CN110494360B (zh) 用于提供自主摄影及摄像的系统和方法
US10636150B2 (en) Subject tracking systems for a movable imaging system
WO2017197729A1 (zh) 一种跟踪系统及跟踪方法
WO2019233210A1 (zh) 一种智能眼镜、眼球轨迹的追踪方法、装置及存储介质
CN205610783U (zh) 一种带自动视觉跟踪的自稳定手持拍照摄像云台
KR20180075191A (ko) 무인 이동체를 제어하기 위한 방법 및 전자 장치
WO2021127888A1 (zh) 控制方法、智能眼镜、可移动平台、云台、控制系统及计算机可读存储介质
WO2019126958A1 (zh) 偏航姿态控制方法、无人机、计算机可读存储介质
US20210112194A1 (en) Method and device for taking group photo
EP3273318A1 (fr) Système autonome de prise de vues animées par un drone avec poursuite de cible et localisation améliorée de la cible
CN108957505A (zh) 一种定位方法、定位系统和手携式智能穿戴设备
KR101959366B1 (ko) 무인기와 무선단말기 간의 상호 인식 방법
Hausamann et al. Positional head-eye tracking outside the lab: an open-source solution
WO2018176426A1 (zh) 一种无人机的飞行控制方法及无人机
WO2020019113A1 (zh) 移动机器人的控制方法、装置及移动机器人系统
KR101599149B1 (ko) 피사체를 자동으로 추적하는 촬영장치
KR20190008257A (ko) 동영상 하이라이트 부분을 식별하기 위한 머리 회전 추적 기기
US11582395B1 (en) Gimbal device
KR102334509B1 (ko) 무인기와 무선단말기 간의 상호 인식 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17903139

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17903139

Country of ref document: EP

Kind code of ref document: A1