WO2018072155A1 - Wearable device for controlling an unmanned aerial vehicle, and unmanned aerial vehicle system - Google Patents

Wearable device for controlling an unmanned aerial vehicle, and unmanned aerial vehicle system

Info

Publication number
WO2018072155A1
WO2018072155A1 (PCT/CN2016/102615)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
wearable device
information
processor
control command
Prior art date
Application number
PCT/CN2016/102615
Other languages
English (en)
French (fr)
Inventor
丘华良
尤中乾
李进吉
赵有成
冯晓峰
冯建
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201910392512.XA (published as CN110045745A)
Priority to PCT/CN2016/102615 (published as WO2018072155A1)
Priority to CN201680004499.0A (published as CN107438804B)
Publication of WO2018072155A1
Priority to US16/388,168 (published as US20190243357A1)


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0022Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U30/00Means for producing lift; Empennages; Arrangements thereof
    • B64U30/20Rotors; Rotor supports
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406Control of illumination source
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/36Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals

Definitions

  • Embodiments of the present invention relate to the field of drones, and more particularly to a wearable device and a drone system for controlling a drone.
  • at present, the remote control of a drone is mainly realized by a hand-held remote control terminal that communicates wirelessly with the drone; such a terminal has disadvantages such as large size and inconvenient carrying.
  • the adjustment of the flight state of the drone and of the shooting angle of the imaging device mounted on the drone still depends on visual remote control by the operator, which places relatively high demands on the operator's experience and on proficiency with the hand-held remote terminal.
  • the embodiments of the invention provide a wearable device and a drone system for controlling a drone, so as to effectively improve the portability of the remote control end of the drone and reduce the operational complexity.
  • a technical solution adopted by the embodiment of the present invention is to provide a wearable device for controlling a drone, including a processor, at least one sensor, and a communication module.
  • At least one sensor is configured to detect first state information of the wearable device. The processor sends the first state information to the drone through the communication module, so that the drone generates a corresponding control command based on the first state information, or on the first state information together with second state information of the drone itself; alternatively, the processor generates the control command according to the first state information, or according to the first state information and the second state information received from the drone through the communication module, and sends the control command to the drone through the communication module.
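The split of computation described above — the wearable reports its state, and either the drone or the wearable's processor turns that state into a command — can be sketched as follows. This is a minimal illustration only; the message fields and function names are assumptions, not the patent's actual protocol.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StateInfo:
    lat: float     # latitude, degrees (illustrative fields)
    lon: float     # longitude, degrees
    height: float  # height, metres

def make_control_command(first: StateInfo, second: Optional[StateInfo]) -> dict:
    """Generate a control command from the wearable's first state
    information alone, or from it together with the drone's second
    state information (hypothetical command schema)."""
    if second is None:
        # Only the wearable's state is known: steer toward the wearer.
        return {"type": "goto", "lat": first.lat, "lon": first.lon}
    # Both states known: command a relative adjustment.
    return {"type": "adjust",
            "dlat": first.lat - second.lat,
            "dlon": first.lon - second.lon,
            "dheight": first.height - second.height}
```

Either processor could run the same function; only the transport of the state information differs.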
  • the at least one sensor includes a first positioning module, configured to detect first location information of the wearable device, where the first state information includes first location information.
  • the second state information includes the second location information of the drone itself, and the processor or the drone generates a flight control command according to the first location information and the second location information, and then adjusts the projection distance between the drone and the wearable device on the horizontal plane by means of the flight control command.
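Computing the horizontal projection distance from the two sets of location information is typically done with the haversine formula; a minimal sketch (the function name and the mean Earth radius are illustrative assumptions):

```python
import math

def horizontal_distance(lat1, lon1, lat2, lon2, radius_m=6371000.0):
    """Great-circle (haversine) distance in metres between the wearable's
    and the drone's positions, ignoring any height difference."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(a))
```

A flight control command would then drive this distance toward a desired set-point.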
  • the second state information includes the second location information and the orientation information of the drone itself, and the processor or the drone generates a flight control command or a shooting control command according to the first location information, the second location information, and the orientation information, and then adjusts a predetermined reference direction of the drone on the horizontal plane by the flight control command, or adjusts the shooting angle of the imaging device mounted on the drone on the horizontal plane by the shooting control command.
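Adjusting the drone's predetermined reference direction on the horizontal plane amounts to turning it toward the bearing from the drone's position to the wearable's position; a hedged sketch (function names and the clockwise-from-north convention are assumptions):

```python
import math

def bearing_to_target(lat_d, lon_d, lat_w, lon_w):
    """Initial bearing (degrees, 0 = north, clockwise) from the drone
    at (lat_d, lon_d) to the wearable at (lat_w, lon_w)."""
    phi1, phi2 = math.radians(lat_d), math.radians(lat_w)
    dlmb = math.radians(lon_w - lon_d)
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    return math.degrees(math.atan2(y, x)) % 360

def yaw_correction(current_heading, target_bearing):
    """Signed shortest turn (degrees) taking the drone's current heading
    onto the target bearing; this is the horizontal-plane adjustment."""
    return (target_bearing - current_heading + 180) % 360 - 180
```

The same correction could be applied to the gimbal pan instead of the airframe yaw, matching the two alternatives in the claim.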
  • the at least one sensor further includes a height sensor for detecting first height information of the wearable device, and the first state information further includes first height information.
  • the second status information includes the second height information of the drone itself, and the processor or the drone further generates a flight control command according to the first height information and the second height information, and then adjusts the relative height between the drone and the wearable device by means of the flight control command.
  • the second state information includes the second location information and the second height information of the drone, and the processor or the drone further generates a flight control command or a shooting control command according to the first location information, the first height information, the second location information, and the second height information, and then adjusts the predetermined reference direction of the drone on the vertical plane by the flight control command, or adjusts the shooting angle of the imaging device mounted on the drone on the vertical plane by the shooting control command.
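On the vertical plane, the required camera pitch follows from the height difference and the horizontal projection distance; a minimal sketch with illustrative names:

```python
import math

def pitch_to_target(drone_height_m, wearable_height_m, horizontal_dist_m):
    """Gimbal pitch in degrees (negative = pointing down) so that the
    camera axis passes through the wearable, given both heights and the
    horizontal projection distance between drone and wearable."""
    dz = wearable_height_m - drone_height_m   # usually negative in flight
    return math.degrees(math.atan2(dz, horizontal_dist_m))
```

For example, a drone 50 m above the wearer at 50 m horizontal range would pitch the camera down 45 degrees.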
  • the at least one sensor further includes an orientation sensor for detecting orientation information of the wearable device, the first state information further includes the orientation information, and the second state information includes the second location information of the drone itself; the processor or the drone generates a flight control command according to the first location information, the orientation information, and the second location information, and then adjusts the relative position of the drone and the wearable device by the flight control command.
  • the second state information includes the second location information of the drone, and the processor or the drone further records the first location information or the second location information, thereby generating a motion track of the wearable device or the drone, and further associates the images or videos captured by the drone with the motion track.
  • the processor or the drone further matches the second position information at the moment the drone captures an image or video against the first position information or the second position information on the motion track, and associates the image or video with the location point on the motion track corresponding to the drone's second position information at the time of capture.
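Matching a capture position against the recorded motion track can be done with a nearest-neighbour search over the track points; a minimal sketch (track points as (lat, lon) tuples; the helper name is an assumption):

```python
def nearest_track_point(capture_pos, track):
    """Return the index of the recorded track point closest to the
    position at which the image or video was captured.

    capture_pos: (lat, lon) of the drone at capture time.
    track:       list of (lat, lon) points forming the motion track.
    """
    def sq_dist(p, q):
        # Squared Euclidean distance in degrees is enough for ranking
        # nearby points; no metric conversion is needed just to compare.
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(range(len(track)), key=lambda i: sq_dist(capture_pos, track[i]))
```

The image or video would then be tagged with the returned track index, realizing the association described above.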
  • the at least one sensor further comprises a motion sensor configured to detect a motion parameter of the wearable device, and the processor or the drone generates a control command according to the motion parameter.
  • the wearable device or the drone further includes a memory for storing at least one action template and a control command associated with the action template; the processor or the drone matches an action command formed from the motion parameters against the action templates, and generates the control command associated with the matching action template.
  • the motion sensor comprises an inertial sensor, and the motion parameters output by the inertial sensor are integrated over time to form an action command.
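Integrating inertial-sensor output over time and comparing the result against stored action templates is one way to realize the matching described above; a minimal one-dimensional sketch — the template contents, tolerance, and command names are invented for illustration:

```python
def integrate(samples, dt):
    """Integrate raw motion-sensor samples over time (rectangle rule),
    e.g. turning angular rate into an accumulated angle."""
    return sum(s * dt for s in samples)

# Hypothetical action templates: accumulated value -> control command.
ACTION_TEMPLATES = {
    "raise_arm": {"value": 90.0,  "command": "TAKE_OFF"},
    "lower_arm": {"value": -90.0, "command": "LAND"},
}

def match_action(samples, dt, tolerance=15.0):
    """Form an action command by integration, then return the control
    command of the closest template within the tolerance, else None."""
    value = integrate(samples, dt)
    best = min(ACTION_TEMPLATES.values(), key=lambda t: abs(t["value"] - value))
    return best["command"] if abs(best["value"] - value) <= tolerance else None
```

A real implementation would integrate three-axis data and use a richer similarity measure, but the template-lookup structure is the same.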
  • the processor or the drone directly maps the motion parameters into flight control commands or shooting control commands; the flight control commands are used to control the flight state of the drone, the shooting control commands are used to control the shooting state of the imaging device mounted on the drone, and the flight state or shooting state is adjusted synchronously during the movement of the wearable device.
  • the processor or the drone generates a summoning control command according to the motion parameter, and further generates a flight control command or a shooting control command in response to the summoning control command; the flight control command is used to control the flight state of the drone, and the shooting control command is used to control the shooting state of the imaging device mounted on the drone.
  • the UAV adjusts the relative position between the UAV and the wearable device, or the shooting angle of the imaging device, according to the flight control command or the shooting control command, so as to capture the operator wearing the wearable device.
  • the processor or drone further visually recognizes the operator from the captured image or video.
  • the wearable device further includes at least one button, and the processor generates a control command according to the user's operation on the button.
  • the button includes a direction key; the direction key is used to generate a flight control command or a shooting control command, the flight control command is used to control the flight state of the drone, and the shooting control command is used to control the shooting state of the imaging device carried by the drone.
  • the button further includes a multiplex button, wherein the directional button is used to generate a flight control command when the multiplex button is in the first state, and the directional button is used to generate a shooting control command when the multiplex button is in the second state.
  • the button further includes at least one of, or a combination of, a take-off button, a landing button, a return button, and a follow button; the take-off button is used to control the drone to take off, the landing button is used to control the drone to land, the return button is used to control the drone to return to a preset position, and the follow button is used to control the drone to follow a preset target in flight.
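The multiplexed direction keys can be modelled as a small mode-dependent lookup: the same key yields a flight control command in one state of the multiplex button and a shooting control command in the other. A minimal sketch with illustrative command names:

```python
FLIGHT_MODE, SHOOT_MODE = "flight", "shoot"

def key_to_command(key, mode):
    """Map a direction-key press to a command depending on the
    state of the multiplex button (command names are hypothetical)."""
    flight = {"up": "ASCEND", "down": "DESCEND",
              "left": "YAW_LEFT", "right": "YAW_RIGHT"}
    shoot = {"up": "PITCH_UP", "down": "PITCH_DOWN",
             "left": "PAN_LEFT", "right": "PAN_RIGHT"}
    table = flight if mode == FLIGHT_MODE else shoot
    return table[key]
```

The take-off, landing, return, and follow buttons would map to fixed commands independent of the multiplex state.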
  • the wearable device is a watch or a wristband, and includes a casing and a wristband, wherein the antenna of the communication module or at least part of the sensor is disposed on the wristband.
  • the wearable device further comprises a display screen for displaying at least one of the first state information, the second state information, and the images and videos returned by the drone through the communication module.
  • the display screen comprises a transflective liquid crystal panel and a backlight module, and the wearable device further comprises a backlight control button or an ambient light sensor; the backlight module selectively provides backlighting for the transflective liquid crystal panel according to a backlight control command generated by the backlight control button, or according to the ambient light intensity detected by the ambient light sensor.
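The selective backlight behaviour — light the transflective panel on a manual button press, or when ambient light is too weak for reflective viewing — reduces to a simple predicate; a sketch in which the lux threshold is an assumed value:

```python
def backlight_on(button_pressed, ambient_lux, threshold_lux=50.0):
    """Decide whether the backlight module should light the transflective
    panel: always on manual request, otherwise only when ambient light
    is too weak for reflective viewing (threshold is illustrative)."""
    return bool(button_pressed) or ambient_lux < threshold_lux
```

In bright sunlight the panel works reflectively with the backlight off, which is the power-saving point of a transflective design.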
  • the communication module includes an ISM communication module and a WiFi communication module; the ISM communication module is used to communicate with the drone, and the WiFi communication module is used to communicate with a server, so as to download data from the server or upload data to the server.
  • a technical solution adopted by the embodiment of the present invention is to provide an unmanned aerial vehicle system, including a drone and a wearable device for controlling the drone.
  • the wearable device includes a first processor, at least one first sensor, and a first communication module; the drone includes a second processor, at least one second sensor, and a second communication module. The at least one first sensor is configured to detect the first state information of the wearable device, and the second sensor is configured to detect the second state information of the drone. The first processor sends the first state information to the drone through the first communication module and the second communication module, so that the second processor generates a corresponding control command according to the first state information, or according to the first state information and the second state information; alternatively, the first processor generates the control command according to the first state information, or according to the first state information and the second state information sent to the wearable device by the second processor through the second communication module, and sends the control command to the drone through the first communication module and the second communication module.
  • the at least one first sensor includes a first positioning module configured to detect the first location information of the wearable device, and the first state information includes the first location information; the at least one second sensor includes a second positioning module configured to detect the second location information of the drone, and the second state information includes the second location information.
  • the first processor or the second processor generates a flight control command according to the first position information and the second position information, and further adjusts a projection distance of the drone and the wearable device on a horizontal plane by using a flight control instruction.
  • the at least one second sensor includes an orientation sensor for detecting the orientation information of the drone, and the second state information includes the orientation information; the first processor or the second processor generates a flight control command or a shooting control command according to the first location information, the second location information, and the orientation information, and then adjusts the predetermined reference direction of the drone on the horizontal plane by the flight control command, or adjusts the shooting angle of the imaging device mounted on the drone on the horizontal plane by the shooting control command.
  • the at least one first sensor further includes a first height sensor configured to detect the first height information of the wearable device, and the first state information further includes the first height information; the at least one second sensor further includes a second height sensor for detecting the second height information of the drone, and the second state information further includes the second height information.
  • the first processor or the second processor further generates a flight control command according to the first height information and the second height information, and further adjusts a relative height between the drone and the wearable device by using the flight control command.
  • the first processor or the second processor further generates a flight control command or a shooting control command according to the first location information, the first height information, the second location information, and the second height information, thereby adjusting the predetermined reference direction of the drone on the vertical plane by the flight control command, or adjusting the shooting angle of the imaging device mounted on the drone on the vertical plane by the shooting control command.
  • the at least one first sensor further includes an orientation sensor configured to detect the orientation information of the wearable device, and the first state information further includes the orientation information; the first processor or the second processor generates a flight control command according to the first location information, the orientation information, and the second location information, and then adjusts the relative position of the drone and the wearable device by the flight control command.
  • the first processor or the second processor further records the first location information or the second location information, thereby generating a motion track of the wearable device or the drone, and further associates the images or videos captured by the drone with the motion track.
  • the first processor or the second processor further matches the second position information at the moment the drone captures an image or video against the first position information or the second position information on the motion track, and associates the image or video with the location point on the motion track corresponding to the drone's second position information at the time of capture.
  • the at least one first sensor further comprises a motion sensor configured to detect a motion parameter of the wearable device, and the first processor or the second processor generates a control command according to the motion parameter.
  • the wearable device or the drone further includes a memory for storing at least one action template and a control command associated with the action template; the first processor or the second processor matches an action command generated from the motion parameters against the action templates, and generates the control command associated with the matching action template.
  • the motion sensor comprises an inertial sensor, and the motion parameters output by the inertial sensor are integrated over time to form an action command.
  • the first processor or the second processor directly maps the motion parameters into flight control commands or shooting control commands; the flight control commands are used to control the flight state of the drone, the shooting control commands are used to control the shooting state of the imaging device mounted on the drone, and the flight state or shooting state is adjusted synchronously during the movement of the wearable device.
  • the first processor or the second processor generates a summoning control command according to the motion parameter, and further generates a flight control command or a shooting control command in response to the summoning control command; the flight control command is used to control the flight state of the drone, and the shooting control command is used to control the shooting state of the imaging device mounted on the drone.
  • the second processor adjusts the relative position between the drone and the wearable device, or the shooting angle of the imaging device, according to the flight control command or the shooting control command, so as to capture the operator wearing the wearable device.
  • the first processor or the second processor further visually recognizes the operator from the captured image or video.
  • the wearable device further includes at least one button, and the first processor generates a control command according to the user's operation on the button.
  • the button includes a direction key; the direction key is used to generate a flight control command or a shooting control command, the flight control command is used to control the flight state of the drone, and the shooting control command is used to control the shooting state of the imaging device carried by the drone.
  • the button further includes a multiplex button, wherein the directional button is used to generate a flight control command when the multiplex button is in the first state, and the directional button is used to generate a shooting control command when the multiplex button is in the second state.
  • the button further includes at least one of, or a combination of, a take-off button, a landing button, a return button, and a follow button; the take-off button is used to control the drone to take off, the landing button is used to control the drone to land, the return button is used to control the drone to return to a preset position, and the follow button is used to control the drone to follow a preset target in flight.
  • the wearable device is a watch or a wristband, and includes a casing and a wristband, wherein the antenna of the first communication module or at least part of the first sensor is disposed on the wristband.
  • the wearable device further includes a display screen configured to display at least one of the first state information, the second state information, and the images and videos returned by the second processor through the first communication module and the second communication module.
  • the display screen comprises a transflective liquid crystal panel and a backlight module, and the wearable device further comprises a backlight control button or an ambient light sensor; the backlight module selectively provides backlighting for the transflective liquid crystal panel according to a backlight control command generated by the backlight control button, or according to the ambient light intensity detected by the ambient light sensor.
  • the communication module includes an ISM communication module and a WiFi communication module; the ISM communication module is used for communication with the drone, and the WiFi communication module is used for communicating with a server, so as to download data from the server or upload data to the server.
  • the beneficial effects of the embodiments of the present invention are as follows: in the wearable device and drone system for controlling a drone provided by the embodiments, the ground control end of the drone is implemented in the form of a wearable device, which effectively improves the portability of the ground control terminal; furthermore, corresponding control commands are generated based on the detected state information of the wearable device, which effectively reduces the operational complexity.
  • FIG. 1 is a schematic view of a drone system in accordance with a first embodiment of the present invention
  • FIG. 2 is a schematic block diagram of a wearable device in accordance with a second embodiment of the present invention.
  • Figure 3 is a schematic block diagram of a drone according to a third embodiment of the present invention.
  • FIG. 4 is a schematic diagram of controlling a drone according to state information of a wearable device according to a fourth embodiment of the present invention.
  • FIG. 5 is a schematic diagram of controlling a drone according to state information of a wearable device according to a fifth embodiment of the present invention.
  • FIG. 6 is a schematic diagram of controlling a drone according to state information of a wearable device according to a sixth embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a motion path associated with an image and a video according to a seventh embodiment of the present invention.
  • Figure 8 is an external view of a wearable device in accordance with an eighth embodiment of the present invention.
  • Fig. 1 is a schematic view of a drone system in accordance with a first embodiment of the present invention.
  • the drone system of the present embodiment includes a wearable device 10 and a drone 20, wherein the drone 20 includes a flight body 21, a pan/tilt head 22, and an imaging device 23.
  • the flying body 21 includes a plurality of rotors 211 and a rotor motor 212 that drives the rotation of the rotor 211, thereby providing the power required for the drone 20 to fly.
  • the imaging device 23 is mounted on the flying body 21 via the pan/tilt head 22.
  • the imaging device 23 is used for image or video capture during flight of the drone 20, including but not limited to multi-spectral imagers, hyperspectral imagers, visible light cameras, and infrared cameras.
  • the pan/tilt head 22 is a multi-axis transmission and stabilization system including a plurality of rotating shafts 221 and a pan-tilt motor 222.
  • the pan/tilt motor 222 compensates for the photographing angle of the imaging device 23 by adjusting the rotational angle of the rotational shaft 221, and prevents or reduces the shake of the imaging device 23 by setting an appropriate buffer mechanism.
  • the imaging device 23 can be mounted on the flying body 21 directly or by other means.
  • the wearable device 10 is worn by an operator and communicates with the drone 20 wirelessly to control the flight process of the drone 20 and the photographing process of the imaging device 23.
  • FIG. 2 is a schematic block diagram of a wearable device in accordance with a second embodiment of the present invention.
  • the wearable device 10 of the present embodiment includes a processor 101, a communication module 102, and at least one sensor.
  • the sensor on the wearable device 10 is used to detect the state information of the wearable device 10, and the corresponding control command is generated by the wearable device 10 or the drone 20 according to at least the state information of the wearable device 10.
  • the control commands include, but are not limited to, a flight control command for controlling the flight state of the drone 20 (e.g., position, altitude, direction, speed, attitude, etc.) and a shooting control command for controlling the photographing state of the imaging device 23 mounted on the drone 20 (e.g., photographing angle, photographing time, exposure parameters, etc.).
  • In one specific implementation, the processor 101 of the wearable device 10 transmits the status information of the wearable device 10 to the drone 20 through the communication module 102, so that the drone 20 can generate corresponding control commands according to the status information of the wearable device 10, or according to both the status information of the wearable device 10 and the status information of the drone 20 itself. In another specific implementation, the processor 101 of the wearable device 10 generates control commands according to the status information of the wearable device 10, or according to both the status information of the wearable device 10 and the status information of the drone 20 received from the drone 20 through the communication module 102, and transmits the control commands to the drone 20 via the communication module 102.
  • the ground control end of the drone is set in the form of a wearable device, which can effectively improve the portability of the ground control end, and further generate corresponding control commands according to the detected state information of the wearable device, thereby effectively reducing the Operational complexity.
  • the sensors on the wearable device 10 include a positioning module 103, a height sensor 104, an orientation sensor 105, and a motion sensor 106.
  • the positioning module 103 is configured to detect the position information of the wearable device 10, and may be specifically implemented by a GPS satellite positioning module or a Beidou satellite positioning module, which can acquire the latitude and longitude coordinates of the wearable device 10, thereby implementing two-dimensional positioning of the wearable device 10 on the horizontal plane.
  • the height sensor 104 is used to detect the height information of the wearable device 10, and may be specifically implemented by a barometer, an ultrasonic range finder, an infrared range finder, or the like. Taking a barometer as an example, the barometer obtains the height information of the wearable device 10 by detecting the actual barometric pressure at the position of the wearable device 10, and the processor 101, the barometer's built-in processing module, or another processing module can derive the relative height of the wearable device 10 with respect to a reference position from the difference between the detected actual pressure value and the reference pressure value of that reference position.
  • Similarly, the relative height between the drone 20 and the wearable device 10 can be calculated from the difference between the barometric pressure measured by the barometer on the wearable device 10 and that measured by the barometer on the drone 20.
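As a rough illustration of this pressure-difference approach, the following sketch uses the international barometric formula; the constant values and function names are illustrative assumptions, not part of this disclosure.

```python
# Sketch of the pressure-difference approach above, using the international
# barometric formula. Constants and names are illustrative assumptions.
SEA_LEVEL_HPA = 1013.25

def pressure_to_altitude_m(pressure_hpa, reference_hpa=SEA_LEVEL_HPA):
    """Altitude above the reference pressure level (international
    barometric formula)."""
    return 44330.0 * (1.0 - (pressure_hpa / reference_hpa) ** (1.0 / 5.255))

def relative_height_m(wearable_hpa, drone_hpa):
    """Relative height of the drone above the wearable device, from the
    two barometer readings (higher altitude -> lower pressure)."""
    return pressure_to_altitude_m(drone_hpa) - pressure_to_altitude_m(wearable_hpa)
```

Because both readings are taken in the same air mass at nearly the same time, the common weather-dependent offset largely cancels in the difference.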
  • the orientation sensor 105 is configured to detect the orientation information of the wearable device 10, and may be implemented by a compass or the like.
  • the orientation information of the wearable device 10 may be represented by an angle between a predetermined reference direction of the wearable device 10 relative to a standard direction (eg, east, west, south, north).
  • the motion sensor 106 is configured to detect motion parameters (eg, direction, speed, acceleration, attitude, motion path, etc.) of the wearable device 10, and may be specifically implemented by an inertial sensor, an image sensor, or the like.
  • the positioning module 103, height sensor 104, orientation sensor 105, and motion sensor 106 referred to above are merely examples of sensors that can be disposed on the wearable device 10. In actual use, one or a combination of the above sensors may be selected according to actual needs to implement a specific function, or other sensors may be further added to implement the corresponding functions.
  • the processor 101, the communication module 102, the sensor, and other functional modules communicate via the bus 100. In other embodiments, the functional modules may communicate by other means.
  • FIG. 3 is a schematic block diagram of a drone according to a third embodiment of the present invention.
  • the UAV 20 of the present embodiment includes a processor 201, a communication module 202, and at least one sensor, wherein the sensors on the UAV 20 are used to detect status information of the UAV 20 and specifically include a positioning module 203, a height sensor 204, and an orientation sensor 205.
  • the positioning module 203 is configured to detect the position information of the drone 20, the height sensor 204 is used to detect the height information of the drone 20, and the orientation sensor 205 is used to detect the orientation information of the drone 20.
  • the specific implementation manner of the foregoing sensor has been described in detail above, and details are not described herein again.
  • the communication module 202 is configured to perform wireless communication with the communication module 102 to implement data transmission between the wearable device 10 and the drone 20.
  • In one specific implementation, the processor 101 of the wearable device 10 transmits the state information of the wearable device 10 to the drone 20 through the communication module 102 and the communication module 202, and the processor 201 of the drone 20 generates a corresponding control command according to the received state information of the wearable device 10, or according to both the state information of the wearable device 10 and the state information of the drone 20.
  • In another specific implementation, the processor 101 of the wearable device 10 directly generates a control command according to the state information of the wearable device 10; alternatively, the processor 201 of the drone 20 transmits the state information of the drone 20 to the wearable device 10 through the communication module 202 and the communication module 102, the processor 101 of the wearable device 10 then generates a control command according to the state information of the wearable device 10 and the state information of the drone 20, and further transmits the control command to the drone 20 through the communication module 102 and the communication module 202.
  • the positioning module 203, the height sensor 204, and the orientation sensor 205 mentioned above are merely examples of sensors that can be disposed on the drone 20. In actual use, one or a combination of the above sensors may be selected according to actual needs to implement a specific function, or other sensors may be further added to implement the corresponding functions.
  • the processor 201, the communication module 202, the above-mentioned sensors, and other functional modules communicate through the bus 200. In other embodiments, the modules may communicate by other means, and may be distributed on any one or a combination of the flight body 21, the pan/tilt head 22, and the imaging device 23.
  • FIG. 4 is a schematic diagram of controlling a drone according to state information of a wearable device according to a fourth embodiment of the present invention.
  • The distance between the projections of the drone 20 and the wearable device 10 on the horizontal plane (i.e., the projection distance L) can be calculated from the two pieces of position information. A flight control command is generated according to the comparison result between the calculated projection distance L and a preset distance range, and the rotor motor driver 206 of the drone 20 controls the rotational speed of the corresponding rotor motor 212, thereby controlling the drone 20 to advance or retreat relative to the wearable device 10 on the horizontal plane so that the projection distance L between the drone 20 and the wearable device 10 is maintained within the preset distance range. In this manner, horizontal distance tracking of the drone 20 relative to the wearable device 10 can be achieved.
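The distance-keeping logic just described can be sketched as follows; the command names and the preset distance range are illustrative assumptions.

```python
import math

# Sketch of the horizontal distance-keeping logic described above. The
# command names and the preset distance range are illustrative assumptions.
def horizontal_tracking_command(wearable_xy, drone_xy, min_dist, max_dist):
    """Compare the projection distance L with a preset range and return the
    motion the drone should make relative to the wearable device."""
    dx = drone_xy[0] - wearable_xy[0]
    dy = drone_xy[1] - wearable_xy[1]
    L = math.hypot(dx, dy)  # projection distance on the horizontal plane
    if L > max_dist:
        return "advance"    # too far: move toward the wearable device
    if L < min_dist:
        return "retreat"    # too close: move away from it
    return "hold"           # already within the preset distance range
```

A real controller would output a continuous speed command rather than a discrete decision, but the comparison against a preset range is the same.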
  • In this embodiment, the processor 101 of the wearable device 10 or the processor 201 of the drone 20 acquires the position information (x1, y1) of the wearable device 10 and the position information (x2, y2) and orientation information of the drone 20. A flight control command or a shooting control command may be generated according to the above information; the predetermined reference direction D1 of the drone 20 is then adjusted on the horizontal plane by the flight control command, or the shooting angle D2 of the imaging device 23 mounted on the drone 20 is adjusted on the horizontal plane by the shooting control command.
  • Specifically, the angle of the predetermined reference direction D1 of the drone 20 relative to a standard direction (e.g., east, south, west, or north) can be calculated from the orientation information of the drone 20, or the angle of the shooting angle D2 of the imaging device 23 relative to the standard direction can be calculated from the orientation information of the drone 20 and the rotation angle of each axis of the pan/tilt head 22.
  • Meanwhile, the line between the projections of the drone 20 and the wearable device 10 on the horizontal plane can be calculated from the position information (x1, y1) and (x2, y2), and the angle of the predetermined reference direction D1 or the shooting angle D2 relative to this connecting line can then be calculated. A flight control command or a shooting control command is generated accordingly, and the rotor motor driver 206 or the pan/tilt motor driver 207 of the drone 20 controls the rotational speed of the corresponding rotor motor 212 or the rotation angle of the pan/tilt motor 222, thereby causing the predetermined reference direction D1 or the shooting angle D2 to point to the wearable device 10. In this manner, horizontal shooting tracking of the drone 20 relative to the wearable device 10 can be achieved.
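The pointing geometry described above reduces to a yaw-error computation; the convention (headings in degrees clockwise from north) and all names in the following sketch are assumptions for illustration.

```python
import math

# Sketch of the horizontal pointing adjustment described above: the signed
# angle between the drone's reference direction D1 (or shooting angle D2)
# and the line toward the wearable device. Headings are assumed to be in
# degrees clockwise from north; all names are illustrative.
def yaw_error_deg(drone_xy, wearable_xy, heading_deg):
    """Signed rotation needed so the reference direction points at the
    wearable device's horizontal projection; result is in [-180, 180)."""
    dx = wearable_xy[0] - drone_xy[0]
    dy = wearable_xy[1] - drone_xy[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # connecting line vs. north
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0
```

Driving this error to zero with the rotor motors (for D1) or the pan/tilt motor (for D2) keeps the device centered in the shot.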
  • the above two adjustment methods can be used simultaneously or separately, or combined with other tracking methods, and are not limited herein.
  • any of the above two adjustment methods can be combined with the visual tracking method to ensure the accuracy of the visual tracking.
  • FIG. 5 is a schematic diagram of controlling a drone according to status information of a wearable device according to a fifth embodiment of the present invention.
  • the processor 101 of the wearable device 10 or the processor 201 of the drone 20 acquires the height information h1 of the wearable device 10 and the height information h2 of the drone 20, generates a flight control command according to the height information h1 and the height information h2, and adjusts the relative height h3 between the drone 20 and the wearable device 10 by the flight control command.
  • the height information h1 and the height information h2 may be air pressure values or other detected values indicating height, or may be values converted from such detected values.
  • Specifically, the relative height h3 of the drone 20 and the wearable device 10 can be calculated from the two pieces of height information, and a flight control command is generated according to the comparison result between the calculated relative height h3 and a preset height range; the rotor motor driver 206 of the drone 20 then controls the rotational speed of the corresponding rotor motor 212 to control the drone 20 to rise or fall relative to the wearable device 10 in the vertical direction, so that the relative height of the drone 20 with respect to the wearable device 10 is kept within the preset height range. In the above manner, vertical distance tracking of the drone 20 relative to the wearable device 10 can be achieved.
  • Further, a flight control command or a shooting control command may be generated according to the above information; the predetermined reference direction D1 of the drone 20 is then adjusted on the vertical plane by the flight control command, or the shooting angle D2 of the imaging device 23 mounted on the drone 20 is adjusted on the vertical plane by the shooting control command.
  • Specifically, the projection distance L between the wearable device 10 and the drone 20 can be calculated from the position information (x1, y1) and (x2, y2), and the relative height h3 between the wearable device 10 and the drone 20 can be calculated from the height information h1 and the height information h2; the angle of the line between the wearable device 10 and the drone 20 relative to the vertical direction can then be calculated from the horizontal projection distance L and the relative height h3. A flight control command or a shooting control command may be generated according to this angle, and the rotor motor driver 206 or the pan/tilt motor driver 207 of the drone 20 controls the rotational speed of the corresponding rotor motor 212 or the rotation angle of the pan/tilt motor 222, thereby adjusting the predetermined reference direction D1 or the shooting angle D2 to point to the wearable device 10. In the above manner, vertical shooting tracking of the drone 20 relative to the wearable device 10 can be achieved.
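The vertical pointing geometry above reduces to a single arctangent of the relative height over the projection distance; the following sketch assumes the drone is above the wearable device and uses illustrative names.

```python
import math

# Sketch of the vertical pointing geometry above: from the horizontal
# projection distance L and the relative height h3, the angle of the
# drone-to-wearable line below the horizontal follows from one arctangent.
# Assumes the drone is above the wearable device; names are illustrative.
def depression_angle_deg(projection_L, relative_h3):
    """Angle below the horizontal at which the imaging device must point
    to aim at the wearable device."""
    return math.degrees(math.atan2(relative_h3, projection_L))
```

For example, a drone 10 m up and 10 m away must tilt the camera 45 degrees below the horizon.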
  • the two adjustment modes shown in FIG. 5 can be further combined with the two adjustment modes shown in FIG. 4 to realize three-dimensional distance tracking and shooting tracking.
  • FIG. 6 is a schematic diagram of controlling a drone according to state information of a wearable device according to a sixth embodiment of the present invention.
  • the processor 101 of the wearable device 10 or the processor 201 of the drone 20 acquires the position information (x1, y1) and orientation information of the wearable device 10 and the position information (x2, y2) of the drone 20. A flight control command may be generated according to the above information, and the relative orientation of the drone 20 with respect to the preset reference direction of the wearable device 10 (e.g., front, rear, left, and right) is then adjusted by the flight control command.
  • Specifically, the angle of the preset reference direction D3 of the wearable device 10 relative to the standard direction may be determined according to the orientation information of the wearable device 10. Meanwhile, the angle, relative to the standard direction, of the line between the projections of the wearable device 10 and the drone 20 on the horizontal plane can be calculated from the position information (x1, y1) and (x2, y2), and the angle of this line relative to the preset reference direction D3 of the wearable device 10 can then be calculated from the above angles.
  • On this basis, the angle through which the drone 20 is to be adjusted around the wearable device 10 can be determined according to actual needs, a flight control command is generated according to this angle, and the rotor motor driver 206 of the drone 20 controls the rotational speed of the corresponding rotor motor 212 so that the drone 20 makes an orientation adjustment around the wearable device 10. For example, as shown in FIG. 6, the drone 20 is caused to fly from the left side of the wearable device 10 to the right side of the wearable device 10. Alternatively, as the wearable device 10 itself rotates, the drone 20 is always maintained within a predetermined orientation range relative to the wearable device 10, for example always on the right side of the wearable device 10.
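The azimuth determination described above can be sketched as follows, with headings in degrees clockwise from north; all names and the four-way front/right/rear/left split are assumptions for illustration.

```python
import math

# Sketch of the azimuth determination in FIG. 6: the angle of the line from
# the wearable device to the drone, measured against the device's preset
# reference direction D3. Headings in degrees clockwise from north; all
# names and the four-way split are illustrative assumptions.
def relative_azimuth_deg(wearable_xy, drone_xy, d3_heading_deg):
    dx = drone_xy[0] - wearable_xy[0]
    dy = drone_xy[1] - wearable_xy[1]
    line_vs_north = math.degrees(math.atan2(dx, dy)) % 360.0
    return (line_vs_north - d3_heading_deg) % 360.0  # 0 = straight ahead of D3

def side_of_device(rel_az_deg):
    """Coarse front/right/rear/left orientation of the drone relative to
    the wearable device's reference direction D3."""
    return ("front", "right", "rear", "left")[int(((rel_az_deg + 45.0) % 360.0) // 90.0)]
```

Keeping the drone "always on the right side" then amounts to holding the relative azimuth inside the 45–135 degree band as D3 rotates.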
  • the adjustment mode shown in FIG. 6 can be combined with the adjustment modes shown in FIGS. 4 and 5, so that the drone 20 maintains the azimuth adjustment while maintaining distance tracking and shooting tracking.
  • FIG. 7 is a schematic diagram of a motion path associated with an image and a video according to a seventh embodiment of the present invention.
  • the processor 101 of the wearable device 10 or the processor 201 of the drone 20 records the position information (x1, y1) of the wearable device 10 or the position information (x2, y2) of the drone 20 collected in the above embodiments, thereby generating the motion trajectory 700 of the wearable device 10 or the drone 20, and further associates the images or videos captured by the drone 20 with the motion trajectory. Specifically, the processor 201 of the drone 20 further records the position information (x2, y2) of the drone 20 when an image or video is captured, and the processor 101 of the wearable device 10 or the processor 201 of the drone 20 further matches this capture-time position information with the position information (x1, y1) or (x2, y2) on the motion trajectory, so as to associate the image or video with the position points on the motion trajectory 700 that match the capture-time position information. For example, in FIG. 7:
  • image 720 is associated with corresponding location point 710
  • image 740 is associated with corresponding location point 730
  • video 770 is associated with location points 750 and 760, wherein the start and the end of video 770 correspond to location points 750 and 760, respectively.
  • the video or image associated with the motion track 700 is preferably stored in the form of a thumbnail.
  • the specific association relationship may be stored in the form of a picture as shown in FIG. 7, or may be stored in other manners, such as in the form of a table.
  • the thumbnail of the video or image may also be set as a hyperlink pointing to the actual storage location of the video or image, so that a clearer and more complete image or video can be obtained by clicking the hyperlink.
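One possible table-style realization of this association, matching each captured item to the nearest trajectory point, might look like the following sketch; the data shapes and the matching threshold are illustrative assumptions.

```python
import math

# Sketch of the association in FIG. 7: each captured image or video carries
# the position recorded at capture time and is linked to the nearest point
# of the motion trajectory. Data shapes and the threshold are illustrative.
def associate_media(track_points, media_items, max_match_dist=5.0):
    """track_points: list of (x, y) trajectory points.
    media_items: list of (media_id, (x, y)) capture-time positions.
    Returns {media_id: index of the matched track point} (a table-style store)."""
    associations = {}
    for media_id, (mx, my) in media_items:
        best_i, best_d = None, max_match_dist
        for i, (tx, ty) in enumerate(track_points):
            d = math.hypot(mx - tx, my - ty)
            if d < best_d:
                best_i, best_d = i, d
        if best_i is not None:
            associations[media_id] = best_i
    return associations
```

Items captured too far from any recorded point are simply left unassociated rather than forced onto the trajectory.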
  • the processor 201 of the drone 20 can further record other state information of the drone 20, such as height information or orientation information, when an image or video is captured, and this information can be embodied on the motion trajectory 700 or on the image or video.
  • For example, the orientation of the drone 20 relative to the wearable device 10 is different when the image 720 and the image 740 are captured: the drone 20 is located on the right side of the wearable device 10 when the image 720 is captured, and on the left side of the wearable device 10 when the image 740 is captured.
  • the shooting angle of the drone 20 can also be represented based on the line between the image 720 or the image 740 and the corresponding position point.
  • the wearable device 10 further includes a motion sensor 106 for detecting motion parameters of the wearable device 10.
  • the processor 101 of the wearable device 10 or the processor 201 of the drone 20 generates a control command based on the motion parameters of the wearable device 10.
  • the generation of the control instruction according to the motion parameter of the wearable device 10 may include the following two methods:
  • In the first method, the wearable device 10 can be provided with a memory 107, or the drone 20 can be provided with a memory 208; the memory 107 or the memory 208 is configured to store at least one action template and the control command associated with each action template, and the processor 101 of the wearable device 10 or the processor 201 of the drone 20 matches the motion parameters against the action templates and generates the control command associated with the matched action template.
  • motion parameters detected by motion sensor 106 include, but are not limited to, direction, velocity, acceleration, attitude, motion path, and the like.
  • the motion sensor includes an inertial sensor, and the motion parameters output by the inertial sensor can be used directly, or further processed (e.g., integrated over time) to obtain derived motion parameters.
  • For example, an action template may specify that the direction, speed, or acceleration of the wearable device 10 satisfies a preset change rule, or that the wearable device 10 assumes a specific posture or follows a specific motion path. The processor 101 or the processor 201 can directly match the direction, speed, or acceleration detected by the motion sensor 106 against the change rule in the action template, or match the posture, motion path, and the like obtained by integrating the velocity and acceleration over time against the posture or motion path in the action template.
  • The above matching is preferably completed by the processor 101 on the wearable device 10, and only the resulting control command is sent to the drone 20.
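A minimal sketch of the template-matching scheme above, assuming templates are stored as short acceleration sequences and compared sample-by-sample; this is one possible "change rule" check, not the disclosed algorithm, and all names are illustrative.

```python
# Sketch of the action-template matching described above: each template is a
# preset change rule for, e.g., acceleration, and the detected motion
# parameters are compared against every stored template. All names are
# illustrative assumptions, not the disclosed algorithm.
def matches_template(samples, template, tolerance=0.5):
    """Equal-length sequences compared sample-by-sample: one possible
    realization of a 'preset change rule' check."""
    if len(samples) != len(template):
        return False
    return all(abs(s - t) <= tolerance for s, t in zip(samples, template))

def generate_command(samples, template_store):
    """template_store: {control_command: template}. Returns the control
    command associated with the first matching action template, else None."""
    for command, template in template_store.items():
        if matches_template(samples, template):
            return command
    return None
```

A production gesture recognizer would typically resample and normalize the signal (or use dynamic time warping) before comparing, but the store-match-generate flow is the same.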
  • For example, the processor 101 of the wearable device 10 or the processor 201 of the drone 20 can generate a summoning control command according to the motion parameters: the motion trajectory, or the change rule of direction, speed, or acceleration, satisfied by the beckoning action of the operator wearing the wearable device 10 is set as an action template and associated with the summoning control command. It is thus possible to detect, from the motion parameters detected by the motion sensor 106, whether the action of the operator wearing the wearable device 10 is a beckoning action, and to generate the summoning control command when it is.
  • the processor 101 or the processor 201 further generates a flight control command or a shooting control command in response to the summoning control command, wherein the flight control command is used to control the flight state of the drone 20, and the shooting control command is used to control the photographing state of the imaging device 23 mounted on the drone 20; the relative position of the drone 20 with respect to the wearable device 10 (e.g., the horizontal projection distance, relative height, or relative orientation described above) or the shooting angle of the imaging device 23 may then be adjusted according to the flight control command or the shooting control command, so as to photograph the operator wearing the wearable device 10 and acquire an image or video containing the operator.
  • the processor 101 or the processor 201 can visually recognize the operator from the captured image or video, for example by recognizing the operator's beckoning action or performing face recognition on the operator. This facilitates subsequent operations by the operator, such as controlling the subsequent actions of the drone 20 through visual recognition of the operator's subsequent actions.
  • In the second method, the processor 101 of the wearable device 10 or the processor 201 of the drone 20 may directly map the motion parameters into a flight control command or a shooting control command; the flight control command is used to control the flight state of the drone 20, and the shooting control command is used to control the shooting state of the imaging device 23 mounted on the drone 20, so that the flight state or the shooting state is adjusted synchronously during the movement of the wearable device 10.
  • For example, the processor 101 of the wearable device 10 or the processor 201 of the drone 20 directly maps motion parameters such as the direction, speed, acceleration, and posture of the wearable device 10 into flight control commands for the corresponding flight states (direction, speed, acceleration, attitude, etc.), such that the drone 20 moves synchronously with the wearable device 10 along the same motion trajectory or in the same posture.
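A minimal sketch of such a direct mapping, assuming velocity components and a yaw rate as the motion parameters; the field names and the dict-based command format are illustrative assumptions.

```python
# Sketch of the direct mapping described above: the wearable device's motion
# parameters are copied (optionally scaled) into a flight control command so
# the drone mirrors the device's motion. Field names and the dict format are
# illustrative assumptions.
def map_motion_to_flight(motion, scale=1.0):
    """motion: {'vx', 'vy', 'vz', 'yaw_rate'} measured on the wearable
    device; returns a flight control command with matching fields."""
    return {
        "vx": scale * motion["vx"],
        "vy": scale * motion["vy"],
        "vz": scale * motion["vz"],
        "yaw_rate": motion["yaw_rate"],  # attitude change followed 1:1
    }
```

A scale factor greater than one lets a small wrist motion translate into a large drone displacement, which is why such mappings usually clamp the output to the aircraft's speed limits.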
  • the positioning module 103, height sensor 104, orientation sensor 105, and motion sensor 106 referred to above are merely examples of sensors that can be disposed on the wearable device 10.
  • one or a combination of the above sensors may be selected according to actual needs to implement a specific function, or other sensors may be further added to implement the corresponding functions.
  • the inclination of the wearable device 10 can be detected by a gravity sensor, and a flight control command or a photographing control command can be generated to control the flight direction of the drone 20 or the photographing angle of the imaging device 23.
  • For example, a distance sensor and an orientation sensor can detect the distance and orientation information of the wearable device 10 relative to a target object; by substituting the target object for the wearable device 10 and combining the various tracking methods described above, the drone 20 can further be controlled to track the target object.
  • the wearable device 10 further includes at least one button, and the processor 101 of the wearable device 10 generates a control command according to the user's operation of the button.
  • the keys on the wearable device 10 include direction keys 108 for generating flight control commands or shooting control commands.
  • the flight control command is for controlling the flight state of the drone 20
  • the photographing control command is for controlling the photographing state of the image forming apparatus 23 mounted on the drone 20.
  • the wearable device 10 is provided with a multiplex key 109, wherein the direction keys 108 are used to generate flight control commands when the multiplex key 109 is in a first state, and to generate shooting control commands when the multiplex key 109 is in a second state.
  • the wearable device 10 is further provided with a takeoff key 110, a landing key 111, a return key 112, and a follow key 113.
  • the take-off button 110 is used to control the drone 20 to take off
  • the landing button 111 is used to control the drone 20 to land
  • the return button 112 is used to control the drone 20 to return to a preset position, for example, the location of the wearable device 10 or another location specified by the user.
  • the follow key 113 is used to control the drone 20 to follow the preset target for flight. For example, after the operator presses the follow button 113, the drone 20 can automatically take off and follow the wearable device 10 for flight in accordance with one or a combination of distance tracking, shot tracking, and azimuth tracking methods described above.
  • The buttons mentioned above are merely exemplary. In actual use, one or a combination of the above buttons may be selected according to actual needs to implement a specific function, or other buttons may be added to implement corresponding functions.
  • the above-mentioned button can be implemented by a physical button or a virtual button, which is not limited herein.
  • the wearable device 10 further includes a display screen 114 configured to display at least one of the status information of the wearable device 10 and the status information, images, and videos of the drone 20 returned by the drone 20 via the communication modules 202 and 102.
  • the display screen 114 includes a transflective liquid crystal panel 1141 and a backlight module 1142.
  • the wearable device 10 further includes a backlight control button 115 or an ambient light sensor 116.
  • the backlight module 1142 selectively provides backlighting for the transflective liquid crystal panel 1141 according to the backlight control command generated by the backlight control button 115 or the ambient light intensity detected by the ambient light sensor 116.
  • When the ambient light brightness is relatively high or the backlight control button 115 is in a first state, the backlight module 1142 does not provide a backlight, and the transflective liquid crystal panel 1141 relies only on the received external natural light for display. When the ambient light brightness is relatively low or the backlight control button 115 is in a second state, the backlight module 1142 provides a backlight, and the transflective liquid crystal panel 1141 relies mainly on the backlight for display. In this way, backlighting is provided only when needed, achieving power savings.
  • the specific control of the backlight module 1142 can be implemented by the processor 101, the built-in processing module of the display screen 114, or other processing modules, which is not limited herein.
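The selective backlight decision can be sketched as follows; the lux threshold and the "first"/"second" state names are illustrative assumptions, not values from this disclosure.

```python
# Sketch of the selective backlight decision described above. The lux
# threshold and the 'first'/'second' state names are illustrative
# assumptions, not values from this disclosure.
AMBIENT_THRESHOLD_LUX = 50.0

def backlight_on(ambient_lux=None, button_state=None):
    """A button state takes priority when present; otherwise fall back to
    the ambient light sensor reading."""
    if button_state == "second":
        return True                  # user forces the backlight on
    if button_state == "first":
        return False                 # user forces natural-light display
    if ambient_lux is not None:
        return ambient_lux < AMBIENT_THRESHOLD_LUX
    return False                     # no information: default to saving power
```

In practice a small hysteresis band around the threshold would prevent the backlight from flickering as the ambient reading hovers near it.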
  • the UAV system of the present embodiment further includes a server 30.
  • the communication module 102 of the wearable device 10 includes an ISM communication module 1021 and a WIFI communication module 1022, wherein the ISM communication module 1021 is used for communication with the drone 20, and the WIFI communication module 1022 is used for communication with the server 30, so as to download data from the server 30 or upload data to the server 30.
  • for example, status information of the wearable device 10, or status information, images, or video received from the drone 20, can be uploaded to the server 30, and installation or upgrade files required by the wearable device 10 can be downloaded from the server 30.
  • the UAV 20 and the server 30 can also communicate through the WIFI communication module, so that the status information, images or videos received by the UAV 20 can be directly uploaded to the server 30.
  • in a preferred embodiment, only status information or control commands are transmitted between the wearable device 10 and the drone 20, while other data is transferred between the drone 20 and the server 30 and between the server 30 and the wearable device 10.
  • for example, only the status information or uplink control commands of the wearable device 10 are transmitted between the wearable device 10 and the drone 20, while the status information of the drone 20 and the images or video captured by the drone 20 are transmitted between the drone 20 and the server 30, and are downloaded from the server 30 by the wearable device 10 as needed.
  • FIG. 8 is an external view of a wearable device according to an eighth embodiment of the present invention.
  • the wearable device is a watch or a wristband and includes a housing 81 and a wrist strap 82.
  • the wearable device can be designed in other forms, such as necklaces, glasses, headphones, or clothing.
  • the processor 101, the communication module 102, and various sensors described above are disposed in the housing 81 and covered by the display screen 83.
  • the housing 81 is also provided with physical buttons 85-89 for implementing the functions of the various buttons described above.
  • button 85 is a five-dimensional button that implements at least a portion of the control functions corresponding to the direction key 108, or to both the direction key 108 and the multiplexing key 109.
  • when the physical button 85 is in one of the pressed and unpressed states, flight control commands are generated by operations in the other dimensions of the physical button 85 to control the flight direction of the drone 20 (e.g., forward, backward, left, and right).
  • when the physical button 85 is in the other of the pressed and unpressed states, shooting control commands are generated by operations in the other dimensions of the physical button 85 to control the shooting angle of the imaging device 23.
  • the operator can select an operating parameter and confirm with the button 86 when the display 83 displays the parameters of the drone or camera.
  • the button 86 can also be used to control the shooting of the imaging device 23.
  • the button 87 is used to control the drone 20 to ascend.
  • the button 88 is used to control the drone 20 to descend.
  • the button 89 is used to power on the wearable device.
  • when the wearable device is not controlling the drone, the display screen 83 can display the current time, so the wearable device can also be used as a watch.
  • the antennas 841, 842 of the communication module 102 or of a portion of the sensors (e.g., the positioning module 103) described above may be disposed on the wristband 82, thereby simplifying the circuit layout within the housing 81.
  • the antennas 841 and 842 may also be disposed in the housing 81 or in other suitable positions of the wearable device, and are not limited to the embodiment.
  • in summary, setting the ground control end of the drone in the form of a wearable device can effectively improve the portability of the ground control end, and generating corresponding control commands according to the detected status information of the wearable device can effectively reduce operational complexity.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Selective Calling Equipment (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable device (10) for controlling an unmanned aerial vehicle (20), and a UAV system. The wearable device (10) comprises a processor (101), at least one sensor, and a communication module (102). The at least one sensor detects first status information of the wearable device (10). The processor (101) sends the first status information to the UAV (20) via the communication module (102), so that the UAV (20) generates a corresponding control command according to the first status information, or according to the first status information and second status information of the UAV (20) itself; alternatively, the processor (101) generates the control command according to the first status information, or according to the first status information and the second status information received from the UAV (20) via the communication module (102), and sends the control command to the UAV (20) via the communication module (102). Setting the ground control end of the UAV (20) in the form of the wearable device (10) effectively improves the portability of the ground control end, and generating corresponding control commands according to the detected status information of the wearable device (10) effectively reduces operational complexity.

Description

一种用于控制无人机的穿戴式设备及无人机系统 【技术领域】
本发明实施例涉及无人机领域,特别是涉及一种用于控制无人机的穿戴式设备及无人机系统。
【背景技术】
无人机作为一种新兴的飞行设备,已在娱乐、农业、地质、气象、电力、抢险救灾等多个领域得到了广泛的应用。目前,无人机的远程控制主要通过与无人机进行无线通信的手持式遥控终端实现,存在体积大、携带不便等诸多不利。同时,对于无人机的飞行状态以及无人机所搭载的成像设备的拍摄角度等的调整仍依靠操作手的目视遥控,对操作手的经验以及对手持式遥控终端的操作熟练度要求较高。
【发明内容】
本发明实施例提供一种用于控制无人机的穿戴式设备及无人机系统,以有效提高无人机的远程控制端的便携性,并进一步降低操作复杂度。
为解决上述技术问题,本发明实施例采用的一个技术方案是:提供一种用于控制无人机的穿戴式设备,包括处理器、至少一传感器以及通信模块。至少一传感器用于检测穿戴式设备的第一状态信息,处理器通过通信模块将第一状态信息发送至无人机,以使无人机根据第一状态信息或第一状态信息和无人机自身的第二状态信息产生相应的控制指令,或者处理器根据第一状态信息或第一状态信息和通过通信模块从无人机接收的第二状态信息产生控制指令,并通过通信模块将控制指令发送至无人机。
其中,至少一传感器包括第一定位模块,用于检测穿戴式设备的第一位置信息,第一状态信息包括第一位置信息。
其中,第二状态信息包括无人机自身的第二位置信息,处理器或无人机根据第一位置信息和第二位置信息产生飞行控制指令,进而通过飞行控制指令调整无人机与穿戴式设备在水平面上的投影距离。
其中,第二状态信息包括无人机自身的第二位置信息和方位信息,处理器或无人机根据第一位置信息、第二位置信息和方位信息产生飞行控制指令或拍 摄控制指令,进而通过飞行控制指令在水平面上调整无人机的预定参考方向,或者通过拍摄控制指令在水平面上调整无人机上所搭载的成像设备的拍摄角度。
其中,至少一传感器进一步包括高度传感器,用于检测穿戴式设备的第一高度信息,第一状态信息进一步包括第一高度信息。
其中,第二状态信息包括无人机自身的第二高度信息,处理器或无人机进一步根据第一高度信息和第二高度信息产生飞行控制指令,进而通过飞行控制指令调整无人机与穿戴式设备之间的相对高度。
其中,第二状态信息包括无人机自身的第二位置信息和第二高度信息,处理器或无人机进一步根据第一位置信息、第一高度信息、第二位置信息和第二高度信息产生飞行控制指令或拍摄控制指令,进而通过飞行控制指令在竖直面上调整无人机的预定参考方向,或者通过拍摄控制指令在竖直面上调整无人机上所搭载的成像设备的拍摄角度。
其中,至少一传感器进一步包括方位传感器,用于检测穿戴式设备的方位信息,第一状态信息进一步包括方位信息,第二状态信息包括无人机自身的第二位置信息,处理器或无人机根据第一位置信息、方位信息和第二位置信息产生飞行控制指令,进而通过飞行控制指令调整无人机与穿戴式设备的相对方位。
其中,第二状态信息包括无人机自身的第二位置信息,处理器或无人机进一步对第一位置信息或第二位置信息进行记录,进而生成穿戴式设备或无人机的运动轨迹,并进一步将无人机所拍摄的图像或视频与运动轨迹进行关联。
其中,处理器或无人机进一步将无人机拍摄图像或视频时的第二位置信息与运动轨迹上的第一位置信息或第二位置信息进行匹配,并将图像或视频与运动轨迹上和无人机拍摄图像或视频时的第二位置信息相匹配的位置点进行关联。
其中,至少一传感器进一步包括运动传感器,运动传感器用于检测穿戴式设备的运动参数,且处理器或无人机根据运动参数产生控制指令。
其中,穿戴式设备或无人机进一步存储器,存储器用于存储至少一动作模板以及动作模板相关联的控制指令,其中处理器或无人机将根据运动参数形成的动作指令与动作模板进行匹配,并产生与匹配的动作模板相关联的控制指令。
其中,运动传感器包括惯性传感器,惯性传感器输出的运动参数在时间上的积分形成动作指令。
其中,处理器或无人机将运动参数直接映射成飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机的飞行状态,拍摄控制指令用于控制无人机 所搭载的成像设备的拍摄状态,进而在穿戴式设备的运动过程中对飞行状态或拍摄状态进行同步调整。
其中,处理器或无人机根据运动参数产生召唤控制指令,处理器或无人机进一步响应召唤控制指令产生飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机的飞行状态,拍摄控制指令用于控制无人机所搭载的成像设备的拍摄状态。
其中,无人机根据飞行控制指令或拍摄控制指令对无人机与穿戴式设备的相对位置或成像设备的拍摄角度进行调整,进而实现对佩戴穿戴式设备的操作者的拍摄。
其中,处理器或无人机进一步从所拍摄的图像或视频中对操作者进行视觉识别。
其中,穿戴式设备进一步包括至少一按键,处理器根据用户对按键的操作产生控制指令。
其中,按键包括方向键,方向键用于产生飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机的飞行状态,拍摄控制指令用于控制无人机所搭载的成像设备的拍摄状态。
其中,按键进一步包括一复用键,其中在复用键处于第一状态时,方向键用于产生飞行控制指令,在复用键处于第二状态时,方向键用于产生拍摄控制指令。
其中,按键进一步包括起飞键、降落键、返航键以及跟随键中的至少一个或组合,其中起飞键用于控制无人机进行起飞,降落键用于控制无人机进行降落,返航键用于控制无人机返航至预设位置,跟随键用于控制无人机跟随预设目标进行飞行。
其中,穿戴式设备为手表或手环,且包括壳体和腕带,其中通信模块或至少部分传感器的天线设置于腕带上。
其中,穿戴式设备进一步包括显示屏,显示屏至少用于显示第一状态信息以及无人机通过通信模块回传的第二状态信息、图像和视频中的至少一种。
其中,显示屏包括半穿半反式液晶面板和背光模组,其中穿戴式设备进一步包括背光控制按键或环境光传感器,背光模组根据背光控制按键产生的背光控制指令或光传感器所检测环境光强度为半穿半反式液晶面板选择性提供背光。
其中,通信模块包括ISM通信模块和WIFI通信模块,其中ISM通信模块 用于与无人机进行通信,WIFI通信模块用于与服务端进行通信,进而从服务端下载数据或向服务器端上传数据。
为解决上述技术问题,本发明实施例采用的一个技术方案是:提供一种无人机系统,包括无人机以及用于控制无人机的穿戴式设备。穿戴式设备包括第一处理器、至少一第一传感器以及第一通信模块,无人机包括第二处理器、至少一第二传感器以及第二通信模块,其中至少一第一传感器用于检测穿戴式设备的第一状态信息,第二传感器用于检测无人机的第二状态信息,第一处理器通过第一通信模块和第二通信模块将第一状态信息发送至无人机,以使第二处理器根据第一状态信息或第一状态信息和第二状态信息产生相应的控制指令,或者第一处理器根据第一状态信息或第一状态信息和第二处理器通过第一通信模块和第二通信模块发送至穿戴式设备的第二状态信息产生控制指令,并通过第一通信模块和第二通信模块将控制指令发送至无人机。
其中,至少一第一传感器包括第一定位模块,用于检测穿戴式设备的第一位置信息,第一状态信息包括第一位置信息,至少一第二传感器包括第二定位模块,用于检测无人机的第二位置信息,第二状态信息包括第二位置信息。
其中,第一处理器或第二处理器根据第一位置信息和第二位置信息产生飞行控制指令,进而通过飞行控制指令调整无人机与穿戴式设备在水平面上的投影距离。
其中,至少一第二传感器包括方位传感器,用于检测无人机的方位信息,第二状态信息包括方位信息,第一处理器或第二处理器根据第一位置信息、第二位置信息和方位信息产生飞行控制指令或拍摄控制指令,进而通过飞行控制指令在水平面上调整无人机的预定参考方向,或者通过拍摄控制指令在水平面上调整无人机上所搭载的成像设备的拍摄角度。
其中,至少一第一传感器进一步包括第一高度传感器,用于检测穿戴式设备的第一高度信息,第一状态信息进一步包括第一高度信息,至少一第二传感器进一步包括第二高度传感器,用于检测无人机的第二高度信息,第二状态信息进一步包括第二高度信息。
其中,第一处理器或第二处理器进一步根据第一高度信息和第二高度信息产生飞行控制指令,进而通过飞行控制指令调整无人机与穿戴式设备之间的相对高度。
其中,第一处理器或第二处理器进一步根据第一位置信息、第一高度信息、 第二位置信息和第二高度信息产生飞行控制指令或拍摄控制指令,进而通过飞行控制指令在竖直面上调整无人机的预定参考方向,或者通过拍摄控制指令在竖直面上调整无人机上所搭载的成像设备的拍摄角度。
其中,至少一第一传感器进一步包括方位传感器,用于检测穿戴式设备的方位信息,第一状态信息进一步包括方位信息,第一处理器或第二处理器根据第一位置信息、方位信息和第二位置信息产生飞行控制指令,进而通过飞行控制指令调整无人机飞行与穿戴式设备的相对方位。
其中,第一处理器或第二处理器进一步对第一位置信息或第二位置信息进行记录,进而生成穿戴式设备或无人机的运动轨迹,并进一步将无人机所拍摄的图像或视频与运动轨迹进行关联。
其中,第一处理器或第二处理器进一步将无人机拍摄图像或视频时的第二位置信息与运动轨迹上的第一位置信息或第二位置信息进行匹配,并将图像或视频与运动轨迹上和无人机拍摄图像或视频时的第二位置信息相匹配的位置点进行关联。
其中,至少一第一传感器进一步包括运动传感器,运动传感器用于检测穿戴式设备的运动参数,且第一处理器或第二处理器根据运动参数产生控制指令。
其中,穿戴式设备或无人机进一步存储器,存储器用于存储至少一动作模板以及动作模板相关联的控制指令,其中第一处理器或第二处理器将根据运动参数形成的动作指令与动作模板进行匹配,并产生与匹配的动作模板相关联的控制指令。
其中,运动传感器包括惯性传感器,惯性传感器输出的运动参数在时间上的积分形成动作指令。
其中,第一处理器或第二处理器将运动参数直接映射成飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机的飞行状态,拍摄控制指令用于控制无人机所搭载的成像设备的拍摄状态,进而在穿戴式设备的运动过程中对飞行状态或拍摄状态进行同步调整。
其中,第一处理器或第二处理器根据运动参数产生召唤控制指令,处理器或无人机进一步响应召唤控制指令产生飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机的飞行状态,拍摄控制指令用于控制无人机所搭载的成像设备的拍摄状态。
其中,第二处理器根据飞行控制指令或拍摄控制指令对无人机与穿戴式设 备的相对位置或成像设备的拍摄角度进行调整,进而实现对佩戴穿戴式设备的操作者的拍摄。
其中,第一处理器或第二处理器进一步从所拍摄的图像或视频中对操作者进行视觉识别。
其中,穿戴式设备进一步包括至少一按键,第一处理器根据用户对按键的操作产生控制指令。
其中,按键包括方向键,方向键用于产生飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机的飞行状态,拍摄控制指令用于控制无人机所搭载的成像设备的拍摄状态。
其中,按键进一步包括一复用键,其中在复用键处于第一状态时,方向键用于产生飞行控制指令,在复用键处于第二状态时,方向键用于产生拍摄控制指令。
其中,按键进一步包括起飞键、降落键、返航键以及跟随键中的至少一个或组合,其中起飞键用于控制无人机进行起飞,降落键用于控制无人机进行降落,返航键用于控制无人机返航至预设位置,跟随键用于控制无人机跟随预设目标进行飞行。
其中,穿戴式设备为手表或手环,且包括壳体和腕带,其中第一通信模块或至少部分第一传感器的天线设置于腕带上。
其中,穿戴式设备进一步包括显示屏,显示屏至少用于显示第一状态信息以及第二处理器通过第一通信模块和第二通信模块回传的第二状态信息、图像和视频中的至少一种。
其中,显示屏包括半穿半反式液晶面板和背光模组,其中穿戴式设备进一步包括背光控制按键或环境光传感器,背光模组根据背光控制按键产生的背光控制指令或环境光传感器所检测环境光强度为半穿半反式液晶面板选择性提供背光。
其中,通信模块包括ISM通信模块和WIFI通信模块,其中ISM通信模块用于与无人机进行通信,WIFI通信模块用于与服务端进行通信,进而从服务端下载数据或向服务器端上传数据。
本发明实施例的有益效果是:在本发明实施例所提供的用于控制无人机的穿戴式设备及无人机系统中,将无人机的地面控制端设置成穿戴式设备的形式,可有效提高地面控制端的便携性,进一步根据所检测的穿戴式设备的状态信息 来产生相应控制指令,进而可有效降低操作复杂度。
【附图说明】
图1是根据本发明第一实施例的无人机系统的示意图;
图2是根据本发明第二实施例的穿戴式设备的示意框图;
图3是根据本发明第三实施例的无人机的示意框图;
图4是根据本发明第四实施例的根据穿戴式设备的状态信息对无人机进行控制的示意图;
图5是根据本发明第五实施例的根据穿戴式设备的状态信息对无人机进行控制的示意图;
图6是根据本发明第六实施例的根据穿戴式设备的状态信息对无人机进行控制的示意图;
图7是根据本发明第七实施例的运动路径与图像和视频进行关联的示意图;
图8是根据本发明第八实施例的穿戴式设备的外观图。
【具体实施方式】
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅是本发明的一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
如图1所示,图1是根据本发明第一实施例的无人机系统的示意图。本实施例的无人机系统包括穿戴式设备10和无人机20,其中无人机20包括飞行主体21、云台22以及成像设备23。在本实施例中,飞行主体21包括多个旋翼211以及驱动旋翼211转动的旋翼电机212,由此提供无人机20飞行所需动力。成像设备23通过云台22搭载于飞行主体21上。成像设备23用于在无人机20的飞行过程中进行图像或视频拍摄,包括但不限于多光谱成像仪、高光谱成像仪、可见光相机及红外相机等。云台22为多轴传动及增稳系统,包括多个转动轴221和云台电机222。云台电机222通过调整转动轴221的转动角度来对成像设备23的拍摄角度进行补偿,并通过设置适当的缓冲机构来防止或减小成像设备23的抖动。当然,在其他实施例中,成像设备23可以直接或通过其他方式搭载于飞行主体21上。穿戴式设备10由操作者佩戴,并通过无线通信方式与无人机 20进行通信,进而对无人机20的飞行过程及成像设备23的拍摄过程进行控制。
如图2所示,图2是根据本发明第二实施例的穿戴式设备的示意框图。本实施例的穿戴式设备10包括处理器101、通信模块102以及至少一传感器。其中,穿戴式设备10上的传感器用于检测穿戴式设备10的状态信息,并进一步由穿戴式设备10或无人机20至少根据穿戴式设备10的状态信息来产生相应的控制指令。控制指令包括但不限于飞行控制指令或拍摄控制指令,其中飞行控制指令用于控制无人机20的飞行状态(例如,位置、高度、方向、速度以及姿态等),拍摄控制指令用于控制无人机20所搭载的成像设备23的拍摄状态(例如,拍摄角度、拍摄时间以及曝光参数等)。
例如,在一具体实现方式中,穿戴式设备10的处理器101通过通信模块102将穿戴式设备10的状态信息发送至无人机20,以使无人机20根据穿戴式设备10的状态信息或穿戴式设备10的状态信息和无人机20自身的状态信息产生相应的控制指令;在另一具体实现方式中,穿戴式设备10的处理器101根据穿戴式设备10的状态信息或穿戴式设备10的状态信息和通信模块102从无人机20接收的无人机20的状态信息产生控制指令,并通过通信模块102将控制指令发送至无人机20。
通过上述方式,将无人机的地面控制端设置成穿戴式设备的形式,可有效提高地面控制端的便携性,进一步根据所检测的穿戴式设备的状态信息来产生相应控制指令,进而可有效降低操作复杂度。
在本实施例中,穿戴式设备10上的传感器包括定位模块103、高度传感器104、方位传感器105以及运动传感器106。其中,定位模块103用于检测穿戴式设备10的位置信息,并具体可由GPS卫星定位模块或北斗卫星定位模块等实现,并可以获取穿戴式设备10的经纬度坐标,进而实现穿戴式设备10在水平面上的二维定位。
高度传感器104用于检测穿戴式设备10的高度信息,具体可以由气压计、超声波测距仪、红外测距仪等实现。以气压计为例,气压计通过检测穿戴式设备10所处位置的实际气压值来获取穿戴式设备10的高度信息,处理器101、气压计的内置处理模块或其他处理模块则可根据检测的实际气压值与参考位置的参考气压值之间差值换算出穿戴式设备10所处位置相对于参考位置的相对高度。进一步,当无人机20上同样设置有气压计时,可通过穿戴式设备10上的气压计所测得的气压值与无人机20上的气压计所测得的气压值之间的气压差值可以 计算出无人机20与穿戴式设备10之间的相对高度。
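The pressure-difference-to-relative-height conversion described above can be sketched with a minimal Python example. It assumes the international standard-atmosphere model; the constants and the function names (`pressure_to_altitude`, `relative_height`) are illustrative assumptions, not part of the patent:

```python
import math

# Approximate standard-atmosphere constants (sea level).
SEA_LEVEL_PRESSURE_PA = 101325.0
SCALE_HEIGHT_M = 8434.5  # approximate scale height of the lower atmosphere

def pressure_to_altitude(pressure_pa: float) -> float:
    """Convert an absolute barometer reading to an approximate altitude in meters."""
    return -SCALE_HEIGHT_M * math.log(pressure_pa / SEA_LEVEL_PRESSURE_PA)

def relative_height(device_pressure_pa: float, drone_pressure_pa: float) -> float:
    """Relative height of the drone above the wearable device, from two barometer readings."""
    return pressure_to_altitude(drone_pressure_pa) - pressure_to_altitude(device_pressure_pa)
```

In use, the wearable device's barometer supplies the first reading and the drone's barometer the second; equal pressures yield zero relative height, and a lower pressure at the drone yields a positive height.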
方位传感器105用于检测穿戴式设备10的方位信息,具体可以由指南针等实现。穿戴式设备10的方位信息可以由穿戴式设备10的某一预设参考方向相对于标准方向(例如,东、西、南、北)之间的夹角来进行表示。
运动传感器106用于检测穿戴式设备10的运动参数(例如,方向、速度、加速度、姿态以及运动路径等),并具体可以由惯性传感器、图像传感器等实现。
如本领域技术人员所理解的,上文所提到的定位模块103、高度传感器104、方位传感器105以及运动传感器106仅是能够设置在穿戴式设备10上的传感器的示例。在实际使用中,可以根据实际需要选择上述传感器中的一种或组合来实现特定的功能,或者进一步增加其他传感器来实现相应的功能。进一步,上述处理器101、通信模块102、上述传感器以及其他功能模块通过总线100进行通信,在其他实施例中,上述功能模块也可以通过其他方式进行通信。
如图3所示,图3是根据本发明第三实施例的无人机的示意框图。本实施例的无人机20包括处理器201、通信模块202以及至少一传感器,其中无人机20上的传感器用于检测无人机20的状态信息,并具体可包括定位模块203、高度传感器204、方位传感器205。其中,定位模块203用于检测无人机20的位置信息,高度传感器204用于检测无人机20的高度信息,方位传感器205用于检测无人机20的方位信息。上述传感器的具体实现方式在上文中已经进行了详细描述,在此不再赘述。通信模块202用于与通信模块102进行无线通信,进而实现穿戴式设备10与无人机20之间的数据传输。
具体来说,在上述由穿戴式设备10产生控制指令的具体实现方式中,穿戴式设备10的处理器101通过通信模块102和通信模块202将穿戴式设备10的状态信息发送至无人机20,并由无人机20的处理器201根据接收的穿戴式设备10的状态信息产生相应的控制指令,或者由无人机20的处理器201根据穿戴式设备10的状态信息和无人机20的状态信息产生相应的控制指令。
在上述由无人机20产生控制指令的具体实现方式中,由穿戴式设备10的处理器101直接根据穿戴式设备10的状态信息产生控制指令,或者由无人机20的处理器201通过通信模块202和通信模块102将无人机20的状态信息发送至穿戴式设备10,再由穿戴式设备10的处理器101根据穿戴式设备10的状态信息和无人机20的状态信息产生控制指令,并进一步通过通信模块102和通信模块202将控制指令发送至无人机20。
如本领域技术人员所理解的,上文所提到的定位模块203、高度传感器204以及方位传感器205仅是能够设置在无人机20上的传感器的示例。在实际使用中,可以根据实际需要选择上述传感器中的一种或组合来实现特定的功能,或者进一步增加其他传感器来实现相应的功能。进一步,处理器201、通信模块202、上述传感器以及其他功能模块通过总线200进行通信,在其他实施例中,上述模块也可以通过其他方式进行通信,并且可以分布设置于飞行主体21、云台22以及成像设备23的任意一个或组合上。
下面将结合具体实例来对如何利用穿戴式设备10的状态信息或者利用穿戴式设备10的状态信息和无人机20的状态信息来产生控制指令的具体实例进行描述。
参见图4,图4是根据本发明第四实施例的根据穿戴式设备的状态信息对无人机进行控制的示意图。
在本实施例中,当穿戴式设备10的处理器101或无人机20的处理器201获取到穿戴式设备10的位置信息x1、v1以及无人机20的位置信息x2、v2时,可根据上述两个位置信息产生相应的飞行控制指令,进而通过该飞行控制指令调整无人机20与穿戴式设备10在水平面上的投影距离L。
例如,可根据上述两个位置信息计算出无人机20与穿戴式设备10在水平面上的投影之间的距离(即,投影距离L),根据计算得到的投影距离L与预设的距离范围的比较结果来产生该飞行控制指令,并通过无人机20的旋翼电机驱动器206控制对应旋翼电机212的转速,进而在水平面上控制无人机20相对于穿戴式设备10前进或后退,以使得无人机20与穿戴式设备10在水平面上的投影距离L保持在预设的距离范围内。通过上述方式,可以实现无人机20相对于穿戴式设备10的水平距离跟踪。
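The horizontal distance tracking just described can be sketched as follows: compute the projection distance from the two latitude/longitude fixes and compare it with a preset range to decide a forward/backward command. The equirectangular distance approximation, the 8–12 m range, and all names here are illustrative assumptions:

```python
import math

EARTH_RADIUS_M = 6371000.0

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Horizontal projection distance between two GPS fixes (equirectangular approximation)."""
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    return math.hypot(dx, dy)

def distance_keeping_command(distance_m, min_m=8.0, max_m=12.0):
    """Compare the measured distance with the preset range and return a flight command."""
    if distance_m > max_m:
        return "FORWARD"   # move the drone toward the wearable device
    if distance_m < min_m:
        return "BACKWARD"  # move the drone away from the wearable device
    return "HOLD"
```

The resulting command would be turned into rotor-speed adjustments by the drone's flight controller; only the range comparison is shown here.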
在本实施例中,当穿戴式设备10的处理器101或无人机20的处理器201获取到穿戴式设备10的位置信息x1、v1以及无人机20的位置信息x2、v2和方位信息时,可根据上述信息产生飞行控制指令或拍摄控制指令,进而通过飞行控制指令在水平面上调整无人机20的预定参考方向D1,或者通过拍摄控制指令在水平面上调整无人机20上所搭载的成像设备23的拍摄角度D2。
例如,通过无人机20的方位信息可以计算出无人机20的预定参考方向D1相对于标准方向(例如,东、南、西、北)之间的夹角,或者根据无人机20的方位信息以及云台22的各轴的转动角度计算出成像设备的拍摄角度D2相对于 标准方向(例如,东、南、西、北)之间的夹角,并进一步通过位置信息x1、v1以及位置信息x2、v2则可以计算出无人机20与穿戴式设备10在水平面上的投影之间连线相对于标准方向的夹角。通过上述夹角可以计算出预定参考方向D1或拍摄角度D2相对于上述连线之间的夹角,并产生飞行控制指令或拍摄控制指令,进而通过无人机20的旋翼电机驱动器206或云台电机驱动器207来控制对应旋翼电机212的转速或云台电机222的转角,进而使得预定参考方向D1或拍摄角度D2指向穿戴式设备10。通过上述方式,可以实现无人机20相对于穿戴式设备10的水平拍摄跟踪。
上述两种调整方式可以同时使用或单独使用,或者与其他跟踪方式进行结合,在此不做限定。例如,上述两种调整方式的任意一种方式可以与视觉跟踪方式进行结合来确保视觉跟踪的精度。
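The horizontal pointing adjustment described above — computing the compass bearing of the line between the two horizontal projections and rotating the reference direction D1 (or shooting angle D2) toward it — can be sketched as below. The great-circle bearing formula is standard; the function names and the signed-error convention are assumptions for illustration:

```python
import math

def bearing_deg(lat_from, lon_from, lat_to, lon_to):
    """Compass bearing (degrees clockwise from north) from one GPS fix to another."""
    d_lon = math.radians(lon_to - lon_from)
    lat1, lat2 = math.radians(lat_from), math.radians(lat_to)
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return math.degrees(math.atan2(x, y)) % 360.0

def yaw_error_deg(target_bearing, current_heading):
    """Signed rotation needed so the reference direction points at the target, in [-180, 180)."""
    return (target_bearing - current_heading + 180.0) % 360.0 - 180.0
```

The yaw error would be fed to the rotor motors (for the airframe heading D1) or to the gimbal yaw motor (for the shooting angle D2).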
参见图5,图5是根据本发明第五实施例的根据穿戴式设备的状态信息对无人机进行控制的示意图。
在本实施例中,当穿戴式设备10的处理器101或无人机20的处理器201获取到穿戴式设备10的高度信息h1以及无人机20的高度信息h2时,可根据高度信息h1和高度信息h2产生飞行控制指令,进而通过飞行控制指令调整无人机20与穿戴式设备10之间的相对高度h3。如上文所描述,在本实施例中,高度信息h1和高度信息h2可以是气压值或者能够表示高度的其他检测值,也可以是通过上述检测值换算获得的实际高度。
例如,可根据上述两个高度信息计算出无人机20与穿戴式设备10相对高度h3,根据计算得到的相对高度h3与预设高度范围的比较结果来产生该飞行控制指令,并通过无人机20的旋翼电机驱动器206控制对应旋翼电机212的转速,进而在竖直方向上控制无人机20相对于穿戴式设备10进行上升或下降,以使得无人机20与穿戴式设备10的相对高度保持在预设的高度范围内。通过上述方式,可以实现无人机20相对于穿戴式设备10的竖直距离跟踪。
在本实施例中,当穿戴式设备10的处理器101或无人机20的处理器201获取到穿戴式设备10的位置信息x1、v1和高度信息h1以及无人机20的位置信息x2、v2和高度信息h2时,可根据上述信息产生飞行控制指令或拍摄控制指令,进而通过飞行控制指令在竖直面上调整无人机20的预定参考方向D1,或者通过拍摄控制指令在竖直面上调整无人机20上所搭载的成像设备23的拍摄角度D2。
例如,通过位置信息x1、v1和位置信息x2、v2可以计算出穿戴式设备10与无人机20之间的投影距离L,并通过高度信息h1和高度信息h2可以计算出穿戴式设备10与无人机20之间的相对高度h3,则进一步根据水平投影距离L和相对高度h3可以计算出穿戴式设备10与无人机20之间连线相对于竖直方向的夹角。进一步可以根据该夹角产生飞行控制指令或拍摄控制指令,进而通过无人机20的旋翼电机驱动器206或云台电机驱动器207来控制对应旋翼电机212的转速或云台电机222的转角,进而使得预定参考方向D1或拍摄角度D2调整为指向穿戴式设备10。通过上述方式,可以实现无人机20相对于穿戴式设备10的竖直拍摄跟踪。
图5所示的两种调整方式可以与图4所示的两种调整方式进一步结合,进而实现三维的距离跟踪和拍摄跟踪。
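The vertical pointing adjustment above reduces to one angle: the elevation of the device-to-drone line, obtained from the horizontal projection distance L and the relative height h3. A minimal sketch (the sign convention — negative gimbal pitch looks downward — is an assumption):

```python
import math

def elevation_angle_deg(horizontal_distance_m, relative_height_m):
    """Angle of the device-to-drone line above the horizontal plane (L and h3 in the text)."""
    return math.degrees(math.atan2(relative_height_m, horizontal_distance_m))

def gimbal_pitch_deg(horizontal_distance_m, relative_height_m):
    """Pitch that points the camera from the drone down (or up) at the wearable device."""
    return -elevation_angle_deg(horizontal_distance_m, relative_height_m)
```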
参见图6,图6是根据本发明第六实施例的根据穿戴式设备的状态信息对无人机进行控制的示意图。
在本实施例中,当穿戴式设备10的处理器101或无人机20的处理器201获取到穿戴式设备10的位置信息x1、v1和方位信息以及无人机20的位置信息x2、v2时,可根据上述信息产生飞行控制指令,进而通过飞行控制指令调整无人机20与穿戴式设备10的相对方位(例如,相对于穿戴式设备10的预设参考方向的前、后、左、右)。
例如,根据穿戴式设备10的方位信息可以确定穿戴式设备10的预设参考方向D3相对于标准方向(例如,东、南、西、北)之间的夹角,同时通过穿戴式设备10的位置信息x1、v1和位置信息x2、v2可以计算出穿戴式设备10与无人机20在水平面上的投影之间连线相对于标准方向之间的夹角,并根据上述夹角可以计算出上述连线相对于穿戴式设备10的预设参考方向D3之间的夹角。进一步,可以根据实际需要确定无人机20绕穿戴式设备10的待调整角度,根据该待调整角度产生飞行控制指令,并通过无人机20的旋翼电机驱动器206控制对应旋翼电机212的转速,使得无人机20绕穿戴式设备10进行方位调整。例如,如图6所示的,使得无人机20从穿戴式设备10的左侧飞行到穿戴式设备10的右侧。或者,使得无人机20随着穿戴式设备10的自身转动始终相对穿戴式设备10保持在预定方位范围内,例如始终保持在穿戴式设备10的右侧。
图6所示的调整方式可以与图4和图5所示的调整方式进行结合,进而使得无人机20在进行方位调整的同时仍保持距离跟踪和拍摄跟踪。
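The relative-bearing adjustment described above can be sketched as follows: express the drone's bearing (as seen from the device) relative to the device's preset reference direction D3, then compute the signed orbit angle toward a desired side. The 90° "keep right" default and all names are illustrative assumptions:

```python
def relative_bearing_deg(device_to_drone_bearing, device_reference_heading):
    """Bearing of the drone relative to the device's preset reference direction D3, in [0, 360)."""
    return (device_to_drone_bearing - device_reference_heading) % 360.0

def side_of_device(rel_bearing):
    """Rough side classification used when deciding an orbit adjustment around the device."""
    return "RIGHT" if rel_bearing < 180.0 else "LEFT"

def orbit_adjustment_deg(rel_bearing, desired_rel_bearing=90.0):
    """Signed angle the drone should fly around the device to reach the desired side."""
    return (desired_rel_bearing - rel_bearing + 180.0) % 360.0 - 180.0
```

As the wearer turns, D3 changes, the relative bearing changes with it, and repeated orbit adjustments keep the drone on the same side of the device.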
如图7所示,图7是根据本发明第七实施例的运动路径与图像和视频进行关联的示意图。
在本实施例中,穿戴式设备10的处理器101或无人机20的处理器201对上述实施例中采集的穿戴式设备10的位置信息x1、v1或无人机20的位置信息x2、v2进行记录,进而生成穿戴式设备10或无人机20的运动轨迹700,并进一步将无人机20所拍摄的图像或视频与运动轨迹进行关联。
例如,无人机20的处理器201进一步记录拍摄图像或视频时无人机20的位置信息x2、v2,穿戴式设备10的处理器101或无人机20的处理器201则进一步将无人机20拍摄图像或视频时的位置信息x2、v2与运动轨迹上的位置信息x1、v1或x2、v2进行匹配,并将图像或视频与运动轨迹700上和拍摄图像或视频时的位置信息x2、v2相匹配的位置点进行关联。例如,在图7中,图像720与对应位置点710关联,图像740与对应位置点730关联,而视频770与位置点750和760关联,其中位置点750和760分别对应视频770的拍摄过程的起始位置和终止位置。
进一步,与运动轨迹700关联的视频或图像优选以缩略图形式存储,具体关联关系可以通过图7所示的图片形式进行存储,也可以通过其他方式进行存储,例如表格方式。进一步优选地,视频或图像的缩略图还可以设置超链接,进而通过点击该超链接而指向视频或图像的实际存储位置,并获取更清晰和完整的图像或视频。
此外,无人机20的处理器201还可以进一步记录拍摄图像或视频时无人机20的其他状态信息,例如高度信息或方位信息等等,并在运动轨迹700或图像或视频上进行体现。例如,通过将图像720和图像740分别设置于运动轨迹700的两侧来表示在拍摄图像720和图像740时,无人机20相对于穿戴式设备10的方位不同(例如,在拍摄图像720时,无人机20位于穿戴式设备10的右侧,在拍摄图像740时,无人机20位于穿戴式设备10的左侧)。进一步,还可以根据图像720和图像740与相应位置点之间的连线来表示无人机20的拍摄角度。
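The matching step described above — associating each image or video with the closest recorded trajectory point — can be sketched with a simple nearest-point search. Representing positions as planar (x, y) pairs and the media as a name-to-position mapping are illustrative simplifications:

```python
def nearest_track_point(track, photo_pos):
    """Index of the trajectory point closest to a photo's recorded capture position.

    `track` is a list of (x, y) positions recorded while flying; `photo_pos`
    is the drone position stored when the image or video was captured.
    """
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(range(len(track)), key=lambda i: sq_dist(track[i], photo_pos))

def associate_media(track, media):
    """Return {media_name: trajectory_index} linking each photo/video to a track point."""
    return {name: nearest_track_point(track, pos) for name, pos in media.items()}
```

A video would simply be associated twice, once with its start position and once with its end position, as the text describes for video 770.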
进一步如图2所示,穿戴式设备10还包括运动传感器106,运动传感器106用于检测穿戴式设备10的运动参数。穿戴式设备10的处理器101或无人机20的处理器201根据穿戴式设备10的运动参数产生控制指令。
根据穿戴式设备10的运动参数产生控制指令可以包括以下两种方式:
在一种方式中,穿戴式设备10上可设置存储器107,或者无人机20上可设 置存储器208,存储器107或者存储器208用于存储至少一动作模板以及动作模板相关联的控制指令,穿戴式设备10的处理器101或无人机20的处理器201将根据上述运动参数形成的动作指令与动作模板进行匹配,并产生与匹配的动作模板相关联的控制指令。具体来说,运动传感器106所检测的运动参数包括但不限于方向、速度、加速度、姿态、运动路径等。例如,运动传感器包括惯性传感器,惯性传感器输出的运动参数可以直接作为动作指令或者对运动参数进行计算形成动作指令(例如,在时间上进行积分)。因此,可以将某一动作模板设置成穿戴式设备10的方向、速度或加速度满足预设变化规律,或者将某一动作模板设置成穿戴式设备10满足特定姿态或特定运动路径。此时,处理器101或处理器201可以将运动传感器106所检测的方向、速度或加速度直接与上述动作模板中的变化规律进行匹配。或者,将通过对速度、加速度在时间上的积分获得的姿态、运动路径等与上述动作模板中的姿态或运动路径进行匹配。其中,由于计算和匹配运动参数所需的数据量相对较大,因此上述步骤优选在穿戴式设备10上由处理器101完成后仅将控制指令发送到无人机20。
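One plausible realization of the action-template matching just described is to integrate the inertial signal over time into a motion signature and compare it against stored templates with a distance threshold. The fixed-length signatures, the Euclidean distance, and the threshold value are all illustrative assumptions:

```python
def integrate(samples, dt):
    """Cumulative time integral of a 1-D sensor signal (e.g. acceleration -> velocity)."""
    total, out = 0.0, []
    for s in samples:
        total += s * dt
        out.append(total)
    return out

def match_template(signature, templates, threshold=0.5):
    """Return the command of the closest stored action template, or None if none is close.

    `templates` maps a control command (e.g. "SUMMON") to a stored motion signature.
    """
    best_cmd, best_dist = None, float("inf")
    for cmd, tpl in templates.items():
        if len(tpl) != len(signature):
            continue  # only compare signatures of equal length in this sketch
        dist = sum((a - b) ** 2 for a, b in zip(signature, tpl)) ** 0.5
        if dist < best_dist:
            best_cmd, best_dist = cmd, dist
    return best_cmd if best_dist <= threshold else None
```

As the text notes, this computation is comparatively heavy, so it would preferably run on the wearable device's processor 101, with only the matched control command sent to the drone.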
在一具体应用中,穿戴式设备10的处理器101或无人机20的处理器201可以根据运动参数产生召唤控制指令,例如将佩戴穿戴式设备10的操作者的招手动作所满足的运动轨迹或者方向、速度或加速度的变化规律设置成动作模板,并与召唤控制指令进行关联。由此,根据运动传感器106所检测的运动参数配合上述动作模板即可检测出佩戴穿戴式设备10的操作者的动作是否为招手动作,若是招手动作则产生召唤控制指令。此时,处理器101或处理器201进一步响应召唤控制指令产生飞行控制指令或拍摄控制指令,其中飞行控制指令用于控制无人机20的飞行状态,拍摄控制指令用于控制无人机20所搭载的成像设备23的拍摄状态。例如,处理器201可以进一步根据飞行控制指令或拍摄控制指令对无人机20与穿戴式设备10的相对位置(例如,上述的水平投影距离、相对高度或相对方位)或成像设备23的拍摄角度进行调整,进而实现对佩戴穿戴式设备10的操作者的拍摄,由此获取到含有上述操作者的图像或视频。
进一步,处理器101或处理器201可以从所拍摄的图像或视频中对操作者进行视觉识别,例如对操作者的招手动作进行视觉识别或者对操作者进行人脸识别。由此,可以便于操作者进行后续操作,例如通过对操作者的后续动作进行视觉识别来控制无人机20的后续动作。
在另一种方式中,穿戴式设备10的处理器101或无人机20的处理器201 可以将上述运动参数直接映射成飞行控制指令或拍摄控制指令,飞行控制指令用于控制无人机20的飞行状态,拍摄控制指令用于控制无人机20所搭载的成像设备23的拍摄状态,进而在穿戴式设备10的运动过程中对飞行状态或拍摄状态进行同步调整。
例如,穿戴式设备10的处理器101或无人机20的处理器201将穿戴式设备10的方向、速度、加速度、姿态等运动参数直接映射成用于控制无人机20的方向、速度、加速度、姿态等飞行状态的飞行控制指令,以使得无人机20随着穿戴式设备10按照相同的运动轨迹或姿态进行同步运动。
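The direct-mapping mode above can be sketched as a one-to-one translation of the device's motion parameters into flight-state setpoints. The dictionary keys, the velocity/yaw-rate parameterization, and the unit scale factor are assumptions for illustration:

```python
def map_motion_to_flight(motion, scale=1.0):
    """Directly map wearable-device motion parameters onto drone flight-state setpoints.

    `motion` carries the device's velocity components and yaw rate; the drone
    mirrors them (optionally scaled) so it moves synchronously with the wearer.
    """
    return {
        "vx": scale * motion["vx"],
        "vy": scale * motion["vy"],
        "vz": scale * motion["vz"],
        "yaw_rate": scale * motion["yaw_rate"],
    }
```

A gain greater than 1.0 would let small wrist motions command larger drone motions; 1.0 reproduces the same trajectory, as the text describes.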
如本领域技术人员所理解的,上文所提到的定位模块103、高度传感器104、方位传感器105以及运动传感器106仅是能够设置在穿戴式设备10上的传感器的示例。在实际使用中,可以根据实际需要选择上述传感器中的一种或组合来实现特定的功能,或者进一步增加其他传感器来实现相应的功能。例如,可以通过重力传感器来检测穿戴式设备10的倾角,并产生飞行控制指令或拍摄控制指令来控制无人机20的飞行方向或成像设备23的拍摄角度。进一步,可通过距离传感器和方位传感器检测穿戴式设备10相对于目标物体的距离和方位信息,利用目标物体代替穿戴式设备10,并进一步结合上文描述的各种跟踪方式控制无人机20对目标物体进行跟踪。
进一步如图2所示,穿戴式设备10进一步包括至少一按键,穿戴式设备10的处理器101根据用户对按键的操作产生控制指令。例如,穿戴式设备10上的按键包括方向键108,该方向键108用于产生飞行控制指令或拍摄控制指令。如上文所述的,飞行控制指令用于控制无人机20的飞行状态,拍摄控制指令用于控制无人机20所搭载的成像设备23的拍摄状态。进一步,穿戴式设备10设置有复用键109,其中在复用键109处于第一状态时,方向键108用于产生飞行控制指令,在复用键109处于第二状态时,方向键108用于产生拍摄控制指令。
进一步,穿戴式设备10还设置有起飞键110、降落键111、返航键112以及跟随键113。起飞键110用于控制无人机20进行起飞,降落键111用于控制无人机20进行降落,返航键112用于控制无人机20返航至预设位置,例如返航到穿戴式设备10当前所处位置或者用户指定的其他位置。跟随键113用于控制无人机20跟随预设目标进行飞行。例如,在操作者按下跟随键113后,无人机20可以自动起飞并根据上文描述的距离跟踪、拍摄跟踪和方位跟踪方式中的一种或结合来跟随穿戴式设备10进行飞行。
如本领域技术人员所理解的,上文所提到的上述按键仅是示例性的。在实际使用中,可以根据实际需要选择上述按键中的一种或组合来实现特定的功能,或者进一步增加其他按键来实现相应的功能。此外,上述按键可由实体按键或虚拟按键实现,在此不做限定。
进一步,穿戴式设备10进一步包括显示屏114,显示屏114至少用于显示穿戴式设备10的状态信息以及无人机20通过通信模块202、102回传的无人机20的状态信息、图像和视频中的至少一种。
在一优选实施例中,显示屏114包括半穿半反式液晶面板1141和背光模组1142,穿戴式设备10进一步包括背光控制按键115或环境光传感器116,背光模组1142根据背光控制按键115产生的背光控制指令或环境光传感器116所检测环境光强度为半穿半反式液晶面板1141选择性提供背光。例如,当环境光亮度相对较高或者背光控制按键115处于第一状态时,背光模组1142不提供背光,半穿半反式液晶面板1141仅依靠接收的外界自然光来进行显示。当环境光亮度相对较低或者背光控制按键115处于第二状态时,背光模组1142提供背光,半穿半反式液晶面板1141主要依靠背光来进行显示,由此可以达到省电目的。背光模组1142的具体控制可由处理器101、显示屏114的内置处理模块或其他处理模块实现,在此不做限定。
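The backlight selection logic just described reduces to a small decision: a button press overrides the ambient-light sensor, and otherwise a dim environment turns the backlight on. The lux threshold and the state labels are illustrative assumptions:

```python
def backlight_on(ambient_lux, button_state=None, lux_threshold=200.0):
    """Decide whether the backlight module should light the transflective LCD panel.

    The first button state forces the backlight off (sunlight-readable reflective
    mode); the second forces it on; with no button input, the ambient-light
    sensor decides.
    """
    if button_state == "FIRST":
        return False
    if button_state == "SECOND":
        return True
    return ambient_lux < lux_threshold  # dim environment -> provide backlight
```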
进一步如图2所示,本实施例的无人机系统进一步服务端30,穿戴式设备10的通信模块102包括ISM通信模块1021和WIFI通信模块1022,其中ISM通信模块1021用于与无人机20进行通信,WIFI通信模块1022用于与服务端30进行通信,进而从服务端30下载数据或向所述服务器端上传数据。例如,将穿戴式设备10的状态信息或从无人机20接收的状态信息、图像或视频上传到服务端30,并可以从服务端30下载穿戴式设备10所需的安装或升级文件。
此外,无人机20与服务端30之间也可以通过WIFI通信模块进行通信,以使得无人机20接收的状态信息、图像或视频可以直接上传到服务端30。进一步,在一优选实施例中,穿戴式设备10与无人机20之间仅传输状态信息或控制指令,而其他数据则在无人机20与服务端30以及服务端30与穿戴式设备10之间进行传输。例如穿戴式设备10与无人机20之间仅传输穿戴式设备10的状态信息或上行控制指令,而无人机20的状态信息以及无人机20所拍摄的图像或视频则在无人机20与服务端30之间进行传输,并由穿戴式设备10根据自身需要从服务端30下载。
如图8所示,图8是根据本发明第八实施例的穿戴式设备的外观图。在本实施例中,穿戴式设备为手表或手环,且包括壳体81和腕带82。当然,在其他实施例中,穿戴式设备可以设计成其他形式,例如项链、眼镜、耳机或衣服等。在本实施例中,上文描述的处理器101、通信模块102以及各种传感器设置于壳体81内,并由显示屏83所覆盖。此外,壳体81还设置有实体按键85-89,用于实现上文描述出的各种按键的功能。例如,按键85为一五维按键,其实现对应于方向键108或者同时实现方向键108和复用键109的至少部分控制功能。例如,在实体按键85处于按下或未按下状态中的一种状态时,通过实体按键85的其他维度的操作产生飞行控制指令来控制无人机20的飞行方向(例如,前、后、左、右),在实体按键85处于按下或未按下状态中的另一种状态时,通过实体按键85的其他维度的操作产生拍摄控制指令来控制成像设备23的拍摄角度。
此外,操作者可以在显示屏83显示无人机或者相机的参数时,通过按键86选择操作参数并确认。按键86还可以用于控制成像设备23的拍摄。按键87用于控制无人机20上升,按键88用于控制无人机20下降,按键89用于控制穿戴式设备开机。
可以理解的是,当穿戴式设备没有控制无人机时,所述显示屏83可以显示当前时间,因此,所述穿戴式设备可以当表用。
进一步,上文所描述的通信模块102或部分传感器(例如,定位模块103)的天线841、842可设置于腕带82上,由此简化壳体81内的电路布局。当然,在其它实施例中,天线841、842也可设置于壳体81内,或者设置于穿戴式设备的其它适当位置,并不限于本实施例。
综上所述,本领域技术人员容易理解,在本发明实施例所提供的用于控制无人机的穿戴式设备及无人机系统中,将无人机的地面控制端设置成穿戴式设备的形式,可有效提高地面控制端的便携性,进一步根据所检测的穿戴式设备的状态信息来产生相应控制指令,进而可有效降低操作复杂度。
以上所述仅为本发明的实施方式,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。

Claims (50)

  1. 一种用于控制无人机的穿戴式设备,其特征在于,所述穿戴式设备包括处理器、至少一传感器以及通信模块,其中所述至少一传感器用于检测所述穿戴式设备的第一状态信息,所述处理器通过所述通信模块将所述第一状态信息发送至所述无人机,以使所述无人机根据所述第一状态信息或所述第一状态信息和所述无人机自身的第二状态信息产生相应的控制指令,或者所述处理器根据所述第一状态信息或所述第一状态信息和通过所述通信模块从所述无人机接收的所述第二状态信息产生所述控制指令,并通过所述通信模块将所述控制指令发送至所述无人机。
  2. 根据权利要求1所述的穿戴式设备,其特征在于,所述至少一传感器包括第一定位模块,用于检测所述穿戴式设备的第一位置信息,所述第一状态信息包括所述第一位置信息。
  3. 根据权利要求2所述的穿戴式设备,其特征在于,所述第二状态信息包括所述无人机自身的第二位置信息,所述处理器或所述无人机根据所述第一位置信息和所述第二位置信息产生飞行控制指令,进而通过所述飞行控制指令调整所述无人机与所述穿戴式设备在水平面上的投影距离。
  4. 根据权利要求2所述的穿戴式设备,其特征在于,所述第二状态信息包括所述无人机自身的第二位置信息和方位信息,所述处理器或所述无人机根据所述第一位置信息、所述第二位置信息和所述方位信息产生飞行控制指令或拍摄控制指令,进而通过所述飞行控制指令在水平面上调整所述无人机的预定参考方向,或者通过所述拍摄控制指令在水平面上调整所述无人机上所搭载的成像设备的拍摄角度。
  5. 根据权利要求2所述的穿戴式设备,其特征在于,所述至少一传感器进一步包括高度传感器,用于检测所述穿戴式设备的第一高度信息,所述第一状态信息进一步包括所述第一高度信息。
  6. 根据权利要求5所述的穿戴式设备,其特征在于,所述第二状态信息包括所述无人机自身的第二高度信息,所述处理器或所述无人机进一步根据所述第一高度信息和所述第二高度信息产生飞行控制指令,进而通过所述飞行控制指令调整所述无人机与所述穿戴式设备之间的相对高度。
  7. 根据权利要求5所述的穿戴式设备,其特征在于,所述第二状态信息包 括所述无人机自身的第二位置信息和第二高度信息,所述处理器或所述无人机进一步根据所述第一位置信息、所述第一高度信息、所述第二位置信息和第二高度信息产生飞行控制指令或拍摄控制指令,进而通过所述飞行控制指令在竖直面上调整所述无人机的预定参考方向,或者通过所述拍摄控制指令在竖直面上调整所述无人机上所搭载的成像设备的拍摄角度。
  8. 根据权利要求2所述的穿戴式设备,其特征在于,所述至少一传感器进一步包括方位传感器,用于检测所述穿戴式设备的方位信息,所述第一状态信息进一步包括所述方位信息,所述第二状态信息包括所述无人机自身的第二位置信息,所述处理器或所述无人机根据所述第一位置信息、所述方位信息和所述第二位置信息产生飞行控制指令,进而通过所述飞行控制指令调整所述无人机与所述穿戴式设备的相对方位。
  9. 根据权利要求2所述的穿戴式设备,其特征在于,所述第二状态信息包括所述无人机自身的第二位置信息,所述处理器或所述无人机进一步对所述第一位置信息或所述第二位置信息进行记录,进而生成所述穿戴式设备或所述无人机的运动轨迹,并进一步将所述无人机所拍摄的图像或视频与所述运动轨迹进行关联。
  10. 根据权利要求9所述的穿戴式设备,其特征在于,所述处理器或所述无人机进一步将所述无人机拍摄图像或视频时的所述第二位置信息与所述运动轨迹上的所述第一位置信息或所述第二位置信息进行匹配,并将所述图像或视频与所述运动轨迹上和所述无人机拍摄图像或视频时的所述第二位置信息相匹配的位置点进行关联。
  11. 根据权利要求1所述的穿戴式设备,其特征在于,所述至少一传感器进一步包括运动传感器,所述运动传感器用于检测所述穿戴式设备的运动参数,所述第一状态信息包括所述运动参数,所述处理器或所述无人机根据所述运动参数产生所述控制指令。
  12. 根据权利要求11所述的穿戴式设备,其特征在于,所述穿戴式设备或所述无人机进一步存储器,所述存储器用于存储至少一动作模板以及所述动作模板相关联的所述控制指令,其中所述处理器或所述无人机将根据所述运动参数形成的动作指令与所述动作模板进行匹配,并产生与匹配的所述动作模板相关联的所述控制指令。
  13. 根据权利要求12所述的穿戴式设备,其特征在于,所述运动传感器包 括惯性传感器,所述惯性传感器输出的运动参数在时间上的积分形成所述动作指令。
  14. 根据权利要求11所述的穿戴式设备,其特征在于,所述处理器或所述无人机将所述运动参数直接映射成飞行控制指令或拍摄控制指令,所述飞行控制指令用于控制所述无人机的飞行状态,所述拍摄控制指令用于控制所述无人机所搭载的成像设备的拍摄状态,进而在所述穿戴式设备的运动过程中对所述飞行状态或所述拍摄状态进行同步调整。
  15. 根据权利要求11所述的穿戴式设备,其特征在于,所述处理器或所述无人机根据所述运动参数产生召唤控制指令,所述处理器或所述无人机进一步响应所述召唤控制指令产生飞行控制指令或拍摄控制指令,所述飞行控制指令用于控制所述无人机的飞行状态,所述拍摄控制指令用于控制所述无人机所搭载的成像设备的拍摄状态。
  16. 根据权利要求15所述的穿戴式设备,其特征在于,所述无人机根据所述飞行控制指令或拍摄控制指令对所述无人机与所述穿戴式设备的相对位置或所述成像设备的拍摄角度进行调整,进而实现对佩戴所述穿戴式设备的操作者的拍摄。
  17. 根据权利要求16所述的穿戴式设备,其特征在于,所述处理器或所述无人机进一步从所拍摄的图像或视频中对所述操作者进行视觉识别。
  18. 根据权利要求1所述的穿戴式设备,其特征在于,所述穿戴式设备进一步包括至少一按键,所述处理器根据用户对所述按键的操作产生所述控制指令。
  19. 根据权利要求18所述的穿戴式设备,其特征在于,所述按键包括方向键,所述方向键用于产生飞行控制指令或拍摄控制指令,所述飞行控制指令用于控制所述无人机的飞行状态,所述拍摄控制指令用于控制所述无人机所搭载的成像设备的拍摄状态。
  20. 根据权利要求19所述的穿戴式设备,其特征在于,所述按键进一步包括一复用键,其中在所述复用键处于第一状态时,所述方向键用于产生所述飞行控制指令,在所述复用键处于第二状态时,所述方向键用于产生所述拍摄控制指令。
  21. 根据权利要求18所述的穿戴式设备,其特征在于,所述按键进一步包括起飞键、降落键、返航键以及跟随键中的至少一个或组合,其中所述起飞键用于控制所述无人机进行起飞,所述降落键用于控制所述无人机进行降落,所 述返航键用于控制所述无人机返航至预设位置,所述跟随键用于控制所述无人机跟随预设目标进行飞行。
  22. 根据权利要求1所述的穿戴式设备,其特征在于,所述穿戴式设备为手表或手环,且包括壳体和腕带,其中所述通信模块或至少部分所述传感器的天线设置于所述腕带上。
  23. 根据权利要求1所述的穿戴式设备,其特征在于,所述穿戴式设备进一步包括显示屏,所述显示屏至少用于显示所述第一状态信息以及所述无人机通过所述通信模块回传的所述第二状态信息、图像和视频中的至少一种。
  24. 根据权利要求23所述的穿戴式设备,其特征在于,所述显示屏包括半穿半反式液晶面板和背光模组,其中所述穿戴式设备进一步包括背光控制按键或环境光传感器,所述背光模组根据所述背光控制按键产生的背光控制指令或所述光传感器所检测环境光强度为所述半穿半反式液晶面板选择性提供背光。
  25. 根据权利要求1所述的穿戴式设备,其特征在于,所述通信模块包括ISM通信模块和WIFI通信模块,其中所述ISM通信模块用于与所述无人机进行通信,所述WIFI通信模块用于与服务端进行通信,进而从所述服务端下载数据或向所述服务器端上传数据。
  26. 一种无人机系统,其特征在于,所述无人机系统包括无人机以及用于控制所述无人机的穿戴式设备,所述穿戴式设备包括第一处理器、至少一第一传感器以及第一通信模块,所述无人机包括第二处理器、至少一第二传感器以及第二通信模块,其中所述至少一第一传感器用于检测所述穿戴式设备的第一状态信息,所述第二传感器用于检测所述无人机的第二状态信息,所述第一处理器通过所述第一通信模块和所述第二通信模块将所述第一状态信息发送至所述无人机,以使所述第二处理器根据所述第一状态信息或所述第一状态信息和所述第二状态信息产生相应的控制指令,或者所述第一处理器根据所述第一状态信息或所述第一状态信息和所述第二处理器通过所述第一通信模块和所述第二通信模块发送至所述穿戴式设备的所述第二状态信息产生所述控制指令,并通过所述第一通信模块和所述第二通信模块将所述控制指令发送至所述无人机。
  27. 根据权利要求26所述的无人机系统,其特征在于,所述至少一第一传感器包括第一定位模块,用于检测所述穿戴式设备的第一位置信息,所述第一状态信息包括所述第一位置信息,所述至少一第二传感器包括第二定位模块,用于检测所述无人机的第二位置信息,所述第二状态信息包括所述第二位置信 息。
  28. 根据权利要求27所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器根据所述第一位置信息和所述第二位置信息产生飞行控制指令,进而通过所述飞行控制指令调整所述无人机与所述穿戴式设备在水平面上的投影距离。
  29. 根据权利要求27所述的无人机系统,其特征在于,所述至少一第二传感器包括方位传感器,用于检测所述无人机的方位信息,所述第二状态信息包括所述方位信息,所述第一处理器或所述第二处理器根据所述第一位置信息、所述第二位置信息和所述方位信息产生飞行控制指令或拍摄控制指令,进而通过所述飞行控制指令在水平面上调整所述无人机的预定参考方向,或者通过所述拍摄控制指令在水平面上调整所述无人机上所搭载的成像设备的拍摄角度。
  30. 根据权利要求27所述的无人机系统,其特征在于,所述至少一第一传感器进一步包括第一高度传感器,用于检测所述穿戴式设备的第一高度信息,所述第一状态信息进一步包括所述第一高度信息,所述至少一第二传感器进一步包括第二高度传感器,用于检测所述无人机的第二高度信息,所述第二状态信息进一步包括所述第二高度信息。
  31. 根据权利要求30所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器进一步根据所述第一高度信息和所述第二高度信息产生飞行控制指令,进而通过所述飞行控制指令调整所述无人机与所述穿戴式设备之间的相对高度。
  32. 根据权利要求30所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器进一步根据所述第一位置信息、所述第一高度信息、所述第二位置信息和第二高度信息产生飞行控制指令或拍摄控制指令,进而通过所述飞行控制指令在竖直面上调整所述无人机的预定参考方向,或者通过所述拍摄控制指令在竖直面上调整所述无人机上所搭载的成像设备的拍摄角度。
  33. 根据权利要求27所述的无人机系统,其特征在于,所述至少一第一传感器进一步包括方位传感器,用于检测所述穿戴式设备的方位信息,所述第一状态信息进一步包括所述方位信息,所述第一处理器或所述第二处理器根据所述第一位置信息、所述方位信息和所述第二位置信息产生飞行控制指令,进而通过所述飞行控制指令调整所述无人机飞行与所述穿戴式设备的相对方位。
  34. 根据权利要求27所述的无人机系统,其特征在于,所述第一处理器或 所述第二处理器进一步对所述第一位置信息或所述第二位置信息进行记录,进而生成所述穿戴式设备或所述无人机的运动轨迹,并进一步将所述无人机所拍摄的图像或视频与所述运动轨迹进行关联。
  35. 根据权利要求34所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器进一步将所述无人机拍摄图像或视频时的所述第二位置信息与所述运动轨迹上的所述第一位置信息或所述第二位置信息进行匹配,并将所述图像或视频与所述运动轨迹上和所述无人机拍摄图像或视频时的所述第二位置信息相匹配的位置点进行关联。
  36. 根据权利要求26所述的无人机系统,其特征在于,所述至少一第一传感器进一步包括运动传感器,所述运动传感器用于检测所述穿戴式设备的运动参数,且所述第一处理器或所述第二处理器根据所述运动参数产生所述控制指令。
  37. 根据权利要求36所述的无人机系统,其特征在于,所述穿戴式设备或所述无人机进一步存储器,所述存储器用于存储至少一动作模板以及所述动作模板相关联的所述控制指令,其中所述第一处理器或所述第二处理器将根据所述运动参数形成的动作指令与所述动作模板进行匹配,并产生与匹配的所述动作模板相关联的所述控制指令。
  38. 根据权利要求37所述的无人机系统,其特征在于,所述运动传感器包括惯性传感器,所述惯性传感器输出的运动参数在时间上的积分形成所述动作指令。
  39. 根据权利要求36所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器将所述运动参数直接映射成飞行控制指令或拍摄控制指令,所述飞行控制指令用于控制所述无人机的飞行状态,所述拍摄控制指令用于控制所述无人机所搭载的成像设备的拍摄状态,进而在所述穿戴式设备的运动过程中对所述飞行状态或所述拍摄状态进行同步调整。
  40. 根据权利要求36所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器根据所述运动参数产生召唤控制指令,所述处理器或所述无人机进一步响应所述召唤控制指令产生飞行控制指令或拍摄控制指令,所述飞行控制指令用于控制所述无人机的飞行状态,所述拍摄控制指令用于控制所述无人机所搭载的成像设备的拍摄状态。
  41. 根据权利要求40所述的无人机系统,其特征在于,所述第二处理器根 据所述飞行控制指令或拍摄控制指令对所述无人机与所述穿戴式设备的相对位置或所述成像设备的拍摄角度进行调整,进而实现对佩戴所述穿戴式设备的操作者的拍摄。
  42. 根据权利要求41所述的无人机系统,其特征在于,所述第一处理器或所述第二处理器进一步从所拍摄的图像或视频中对所述操作者进行视觉识别。
  43. 根据权利要求26所述的无人机系统,其特征在于,所述穿戴式设备进一步包括至少一按键,所述第一处理器根据用户对所述按键的操作产生所述控制指令。
  44. 根据权利要求43所述的无人机系统,其特征在于,所述按键包括方向键,所述方向键用于产生飞行控制指令或拍摄控制指令,所述飞行控制指令用于控制所述无人机的飞行状态,所述拍摄控制指令用于控制所述无人机所搭载的成像设备的拍摄状态。
  45. 根据权利要求44所述的无人机系统,其特征在于,所述按键进一步包括一复用键,其中在所述复用键处于第一状态时,所述方向键用于产生所述飞行控制指令,在所述复用键处于第二状态时,所述方向键用于产生所述拍摄控制指令。
  46. 根据权利要求43所述的无人机系统,其特征在于,所述按键进一步包括起飞键、降落键、返航键以及跟随键中的至少一个或组合,其中所述起飞键用于控制所述无人机进行起飞,所述降落键用于控制所述无人机进行降落,所述返航键用于控制所述无人机返航至预设位置,所述跟随键用于控制所述无人机跟随预设目标进行飞行。
  47. 根据权利要求26所述的无人机系统,其特征在于,所述穿戴式设备为手表或手环,且包括壳体和腕带,其中所述第一通信模块或至少部分所述第一传感器的天线设置于所述腕带上。
  48. 根据权利要求26所述的无人机系统,其特征在于,所述穿戴式设备进一步包括显示屏,所述显示屏至少用于显示所述第一状态信息以及所述第二处理器通过所述第一通信模块和所述第二通信模块回传的所述第二状态信息、图像和视频中的至少一种。
  49. 根据权利要求48所述的无人机系统,其特征在于,所述显示屏包括半穿半反式液晶面板和背光模组,其中所述穿戴式设备进一步包括背光控制按键或环境光传感器,所述背光模组根据所述背光控制按键产生的背光控制指令或 所述环境光传感器所检测环境光强度为所述半穿半反式液晶面板选择性提供背光。
  50. 根据权利要求26所述的无人机系统,其特征在于,所述通信模块包括ISM通信模块和WIFI通信模块,其中所述ISM通信模块用于与所述无人机进行通信,所述WIFI通信模块用于与服务端进行通信,进而从所述服务端下载数据或向所述服务器端上传数据。
PCT/CN2016/102615 2016-10-19 2016-10-19 一种用于控制无人机的穿戴式设备及无人机系统 WO2018072155A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201910392512.XA CN110045745A (zh) 2016-10-19 2016-10-19 一种用于控制无人机的穿戴式设备及无人机系统
PCT/CN2016/102615 WO2018072155A1 (zh) 2016-10-19 2016-10-19 一种用于控制无人机的穿戴式设备及无人机系统
CN201680004499.0A CN107438804B (zh) 2016-10-19 2016-10-19 一种用于控制无人机的穿戴式设备及无人机系统
US16/388,168 US20190243357A1 (en) 2016-10-19 2019-04-18 Wearable uav control device and uav system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/102615 WO2018072155A1 (zh) 2016-10-19 2016-10-19 一种用于控制无人机的穿戴式设备及无人机系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/388,168 Continuation US20190243357A1 (en) 2016-10-19 2019-04-18 Wearable uav control device and uav system

Publications (1)

Publication Number Publication Date
WO2018072155A1 true WO2018072155A1 (zh) 2018-04-26

Family

ID=60459076

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/102615 WO2018072155A1 (zh) 2016-10-19 2016-10-19 一种用于控制无人机的穿戴式设备及无人机系统

Country Status (3)

Country Link
US (1) US20190243357A1 (zh)
CN (2) CN110045745A (zh)
WO (1) WO2018072155A1 (zh)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108496141B (zh) * 2017-06-30 2021-11-12 深圳市大疆创新科技有限公司 控制可移动设备跟随的方法、控制设备和跟随系统
CN108268059A (zh) * 2018-01-18 2018-07-10 桂林智神信息技术有限公司 一种稳定器体感遥控系统的工作方法
CN108398689A (zh) * 2018-01-26 2018-08-14 广东容祺智能科技有限公司 一种基于无人机的鸟类识别引导装置及其引导方法
JP6616441B2 (ja) * 2018-02-26 2019-12-04 株式会社日本総合研究所 移動制御システム、制御システム、作業機械、及びプログラム
CN111566006B (zh) * 2018-02-28 2024-03-12 株式会社尼罗沃克 无人机和操作器及其控制方法、计算机可读取记录介质
CN108958300B (zh) * 2018-06-26 2023-06-20 北京小米移动软件有限公司 云台控制方法及装置
CN110187773B (zh) * 2019-06-04 2022-07-29 中科海微(北京)科技有限公司 增强现实眼镜控制的方法、设备及计算机存储介质
CN110263743B (zh) * 2019-06-26 2023-10-13 北京字节跳动网络技术有限公司 用于识别图像的方法和装置
USD1010004S1 (en) 2019-11-04 2024-01-02 Amax Group Usa, Llc Flying toy
US11157086B2 (en) * 2020-01-28 2021-10-26 Pison Technology, Inc. Determining a geographical location based on human gestures
US11199908B2 (en) * 2020-01-28 2021-12-14 Pison Technology, Inc. Wrist-worn device-based inputs for an operating system
US20210261247A1 (en) * 2020-02-26 2021-08-26 Nxp B.V. Systems and methodology for voice and/or gesture communication with device having v2x capability
CN112874797B (zh) * 2021-02-03 2022-07-19 维沃移动通信有限公司 飞行部件及智能穿戴设备
USD1003214S1 (en) 2021-06-09 2023-10-31 Amax Group Usa, Llc Quadcopter
USD1001009S1 (en) 2021-06-09 2023-10-10 Amax Group Usa, Llc Quadcopter
USD1035787S1 (en) 2022-06-24 2024-07-16 Amax Group Usa, Llc Flying toy
CN115884149B (zh) * 2023-01-17 2024-05-31 南京开天眼无人机科技有限公司 一种控制方法、智能穿戴终端、交互及救援系统

Citations (11)

Publication number Priority date Publication date Assignee Title
WO2007050218A2 (en) * 2005-10-21 2007-05-03 Motorola, Inc. Limiting controlled characteristics of a remotely controlled device
EP2256571A2 (en) * 2009-05-27 2010-12-01 Honeywell International Inc. Adaptive user interface for semi-automatic operation of a remot-controlled, unmanned vehicle
US20120229660A1 (en) * 2011-03-09 2012-09-13 Matthews Cynthia C Methods and apparatus for remote controlled devices
CN105185083A (zh) * 2015-09-21 2015-12-23 深圳飞豹航天航空科技有限公司 可控制移动设备做跟随的智能设备及系统
CN105676860A (zh) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 一种可穿戴设备、无人机控制装置和控制实现方法
CN105739525A (zh) * 2016-02-14 2016-07-06 普宙飞行器科技(深圳)有限公司 一种配合体感操作实现虚拟飞行的系统
CN105807788A (zh) * 2016-03-09 2016-07-27 广州极飞电子科技有限公司 无人机监控方法、系统以及无人机和地面站
CN105955306A (zh) * 2016-07-20 2016-09-21 西安中科比奇创新科技有限责任公司 可穿戴设备、基于可穿戴设备的无人机控制方法及系统
CN205613032U (zh) * 2016-04-25 2016-10-05 电子科技大学中山学院 一种可穿戴式航模无线遥控系统
CN106020492A (zh) * 2016-06-07 2016-10-12 赵武刚 通过手的动作与手势产生遥控无人机及附件的信号的方法
CN205643719U (zh) * 2015-12-31 2016-10-12 南宁慧视科技有限责任公司 一种无人机gps定位追踪系统

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
WO2004107012A1 (ja) * 2003-05-30 2004-12-09 Vixen Co., Ltd. Automatic celestial object locating device
CN102355574B (zh) * 2011-10-17 2013-12-25 Shanghai University Image stabilization method for an airborne gimbal moving-target autonomous tracking system
CN103188431A (zh) * 2011-12-27 2013-07-03 Hon Hai Precision Industry (Shenzhen) Co., Ltd. System and method for controlling an unmanned aerial vehicle to capture images
TWM473650U (zh) * 2013-12-04 2014-03-01 Timotion Technology Co Ltd Power-saving remote control device
CN106458318A (zh) * 2014-05-23 2017-02-22 Lily Robotics, Inc. Unmanned aerial copter for photo and/or video capture
WO2016049905A1 (zh) * 2014-09-30 2016-04-07 SZ DJI Technology Co., Ltd. Flight mission processing method, apparatus and system
CN105681713A (zh) * 2016-01-04 2016-06-15 Nubia Technology Co., Ltd. Video recording method and apparatus, and mobile terminal
CN105892474A (zh) * 2016-03-31 2016-08-24 Shenzhen Orbbec Co., Ltd. Drone and drone control method

Also Published As

Publication number Publication date
CN107438804A (zh) 2017-12-05
CN107438804B (zh) 2019-07-12
CN110045745A (zh) 2019-07-23
US20190243357A1 (en) 2019-08-08

Similar Documents

Publication Publication Date Title
WO2018072155A1 (zh) Wearable device for controlling a drone, and drone system
US11649052B2 (en) System and method for providing autonomous photography and videography
US11797009B2 (en) Unmanned aerial image capture platform
US11233943B2 (en) Multi-gimbal assembly
CN111596649B (zh) 用于空中系统的单手远程控制设备
WO2019242553A1 (zh) Method for controlling the shooting angle of a photographing device, control device, and wearable device
WO2018098704A1 (zh) Control method, device and system, drone, and movable platform
TW201831955A (zh) Display device and method for controlling a display device
US11513514B2 (en) Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium
WO2021127888A1 (zh) Control method, smart glasses, movable platform, gimbal, control system, and computer-readable storage medium
JP2003267295A (ja) Remote control system
CN107205111B (zh) Imaging device, mobile device, imaging system, imaging method, and recording medium
US11082639B2 (en) Image display method, image display system, flying object, program, and recording medium
WO2020048365A1 (zh) Aircraft flight control method and apparatus, terminal device, and flight control system
WO2018214155A1 (zh) Method, device, system, and computer-readable storage medium for device attitude adjustment
CN206294286U (zh) Remote virtual reality implementation system
CN204287973U (zh) Flying camera
WO2022109860A1 (zh) Method for tracking a target object, and gimbal
US20200183380A1 (en) Systems and methods for communicating with an unmanned aerial vehicle
JP6329219B2 (ja) Operation terminal and mobile body
WO2022061934A1 (zh) Image processing method, apparatus, system and platform, and computer-readable storage medium
WO2018010472A1 (zh) Smart display device for controlling rotation of a drone gimbal, and control system thereof
KR20170004407A (ko) Unmanned reconnaissance system and unmanned reconnaissance method
KR102542181B1 (ko) Method and apparatus for controlling an unmanned aerial vehicle for producing 360-degree VR video
WO2018188086A1 (zh) Drone and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 16919163; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 16919163; Country of ref document: EP; Kind code of ref document: A1