CN110383814B - Control method, unmanned aerial vehicle, remote control device and nonvolatile storage medium - Google Patents


Info

Publication number
CN110383814B
CN110383814B
Authority
CN
China
Prior art keywords: mode, flight, remotely controlled, electronic stability, stability augmentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201880014345.9A
Other languages
Chinese (zh)
Other versions
CN110383814A (en)
Inventor
朱炼
张洁明
李进吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN110383814A
Application granted
Publication of CN110383814B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A control method is applied to a remotely controlled device, where the remotely controlled device includes: a camera for capturing images; an electronic stability augmentation module for preventing camera shake; and a processor for receiving a flight mode instruction from a user and determining a current flight mode of the remotely controlled device. The control method includes: controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor. Another control method is applied to a remote control device and includes: sending a speed mode to a remotely controlled device controlled by the remote control device, so that the remotely controlled device can control the electronic stability augmentation module arranged on it to turn on or off based on the speed mode. A drone, a remote control device, and a non-volatile storage medium are also provided.

Description

Control method, unmanned aerial vehicle, remote control device and nonvolatile storage medium
Copyright declaration
The disclosure of this patent document contains material which is subject to copyright protection. The copyright is owned by the copyright owner. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office official records and files.
Technical Field
The embodiments of the present disclosure belong to the technical field of remote control, and in particular relate to a control method, an unmanned aerial vehicle, a remote control device, and a non-volatile storage medium. The embodiments of the present disclosure are applicable to combined flying-and-shooting scenarios for micro and small toy drones, racing (FPV) drones, aerial-photography drones, and the like.
Background
In traditional aerial photography, the user manually adjusts shooting parameters to take a picture suited to the current scene. Shaking of the flying vehicle body causes shaking of both the real-time view and the captured pictures, degrading the imaging result. In scenes with severe motion, the jitter is difficult to resolve. It is therefore desirable to provide a control method that stabilizes the real-time view, reduces viewer vertigo, and improves the shooting results, raising the proportion of footage usable in post-production.
Disclosure of Invention
One aspect of the embodiments of the present disclosure provides a control method applied to a remotely controlled device, where the remotely controlled device includes: a camera for capturing images; an electronic stability augmentation module for preventing camera shake; and a processor for receiving a flight mode instruction from a user and determining a current flight mode of the remotely controlled device. The control method includes: controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor.
Another aspect of the embodiments of the present disclosure provides a control method applied to a remote control device, the method including: sending a speed mode to a remotely controlled device controlled by the remote control device, so that the remotely controlled device can control the electronic stability augmentation module arranged on it to turn on or off based on the speed mode.
Another aspect of the embodiments of the present disclosure provides an unmanned aerial vehicle, including: a camera for capturing images; an electronic stability augmentation module for preventing camera shake; a processor for receiving a flight mode instruction from a user, determining a current flight mode of the drone, and controlling the electronic stability augmentation module to turn on or off based on the determined current flight mode; and a memory having stored thereon computer-readable instructions for execution by the processor.
Another aspect of the embodiments of the present disclosure provides a remote control device, including: a processor for sending a speed mode to a remotely controlled device controlled by the remote control device, so that the remotely controlled device can control the electronic stability augmentation module arranged on it to turn on or off based on the speed mode; and a memory having stored thereon computer-readable instructions for execution by the processor.
Another aspect of embodiments of the present disclosure provides a non-volatile storage medium having stored thereon executable instructions for performing any of the above methods.
Drawings
For a more complete understanding of the disclosed embodiments and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIG. 1 schematically illustrates an application scenario in accordance with an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of a control method according to an embodiment of the disclosure;
FIGS. 3-10 schematically illustrate display interfaces on a display screen of a remote control device according to some embodiments of the present disclosure;
FIG. 11 schematically illustrates a flow chart of a control method according to another embodiment of the present disclosure;
FIG. 12 schematically shows a flowchart of acquiring a shooting mode according to an embodiment of the present disclosure;
FIG. 13 schematically shows an information interaction diagram for acquiring a shooting mode according to another embodiment of the present disclosure;
FIG. 14 schematically illustrates a block diagram of a drone according to an embodiment of the present disclosure; and
FIG. 15 schematically shows a block diagram of a remote control device according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together). Where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" should be understood to include the possibilities of "A", "B", or "A and B".
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.).
In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The unmanned aerial vehicles (UAVs) described below include, for example, fixed-wing aircraft and rotorcraft such as helicopters, quadcopters, and aircraft having other numbers and/or configurations of rotors. The drone is provided with an image capture device for capturing target images, which can be used, for example, to take photos and shoot video.
Techniques related to shooting parameter settings of a drone while performing an aerial photography task are described herein.
Fig. 1 schematically shows an application scenario according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the drone 100 is provided with a load 110, and the load 110 may include, for example, a processor, a memory, an image capture device, sensors, an electronic stability augmentation module, a communication device, and the like. The sensors and the electronic stability augmentation module may be mounted directly on the drone 100, or fixed to the drone 100 through a gimbal, where the gimbal can be used to change the orientation of the image capture device.
In some embodiments, the image capture device includes one or more optical elements that direct light onto an imaging sensor. In some embodiments, the imaging sensor is, for example, a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, and/or an N-type metal-oxide-semiconductor (NMOS, Live MOS) sensor. In some embodiments, the image capture device is configured to capture high-definition or ultra-high-definition video (e.g., 720p, 1080i, 1080p, 1440p, 2000p, 2160p, 2540p, 4000p, 4320p, etc.).
According to embodiments of the present disclosure, the load 110 may also include one or more other sensors for obtaining environmental information around the drone as well as the flight speed of the drone itself.
In some embodiments, the one or more sensors comprise one or more audio transducers. For example, the audio detection system includes an audio output transducer (e.g., a speaker) and/or an audio input transducer (e.g., a microphone, such as a parabolic microphone). In some embodiments, a microphone and a speaker are used as components of the sonar system.
A sonar system is used, for example, to provide a three-dimensional map of the surroundings of the drone 100.
In some embodiments, the one or more sensors include one or more infrared sensors. In some embodiments, a distance measurement system for measuring the distance from the drone 100 to an object or surface includes one or more infrared sensors, such as left and right infrared sensors for stereo imaging and/or distance determination.
In some embodiments, the one or more sensors include one or more Global Positioning System (GPS) sensors, motion sensors (e.g., accelerometers), rotation sensors (e.g., gyroscopes), inertial sensors, proximity sensors (e.g., infrared sensors), and/or weather sensors (e.g., pressure sensors, temperature sensors, humidity sensors, and/or wind speed sensors).
In some embodiments, the drone 100 and/or the remote control device 200 use the sensed data generated by the one or more sensors to determine, for example, the location of the drone 100, the orientation of the drone 100, movement characteristics of the drone 100 (e.g., angular velocity, angular acceleration, translational velocity, translational acceleration, and/or direction of motion along one or more axes), and/or the locations of potential obstacles, targets, weather conditions, geographic features, and/or man-made structures that the drone 100 is approaching.
According to an embodiment of the present disclosure, the remote control device 200 may be, for example, a general remote controller or a dedicated remote controller of the drone 100, or alternatively, may also be a mobile terminal, such as a mobile phone, a tablet computer, etc., on which a corresponding application program runs, which may be used to remotely control the drone 100. The remote control device 200 may be provided with a presentation means 210, such as a display screen or a projection means, for example, which may be used to present images, video or text information, for example, transmitted by the drone 100.
In some embodiments, the remote control device 200 receives user input to control, for example, the pose, position, orientation, velocity, acceleration, navigation, and/or tracking of the drone 100 and its load 110. In some embodiments, the remote control device 200 is manipulated by a user to input control instructions for controlling the navigation of the drone 100. In some embodiments, the remote control device 200 is used to input a flight mode of the drone 100, such as autopilot or navigating according to a predetermined navigation path.
In some embodiments, the user controls the drone 100, such as the position, attitude, and/or orientation of the drone 100, by changing the position of the remote control device 200 (e.g., by tilting or otherwise moving the remote control device 200). For example, a change in position of the remote control device 200 is detected by, for example, one or more inertial sensors, and the output of the one or more inertial sensors is used to generate command data. In some embodiments, the remote control device 200 is used to adjust operating parameters of the load 110, such as camera parameters of the image capture device and/or position parameters of the load 110 relative to the drone 100.
In some embodiments, the drone 100 communicates with the remote control device 200, for example, through wireless communication. In some embodiments, the drone 100 receives information from the remote control device 200. For example, the information received by the drone 100 includes, for example, control instructions for controlling parameters of the drone 100. In some embodiments, the drone 100 transmits information to the remote control device 200. For example, the information transmitted by the drone 100 includes, for example, images and/or video captured by the drone 100.
In some embodiments, communications between the remote control device 200 and the drone 100 are sent via wireless image-transmission technology (e.g., Wi-Fi), a network, and/or the wireless signal transmitter of a cellular station (e.g., a long-range wireless signal transmitter). In some embodiments, a satellite may form part of the network and be used in place of a cellular tower. In some embodiments, the communication between the remote control device 200 and the drone 100 may also take place via a proprietary link, such as Lightbridge.
The control instructions include, for example, navigation instructions for controlling navigation parameters of the drone 100 and the load 110, such as position, orientation, attitude, and/or one or more movement characteristics. In some embodiments, the control instructions include instructions to control the flight of the drone.
In some embodiments, the control instructions are used to adjust one or more operating parameters of the load 110. For example, the control instructions include instructions for adjusting shooting parameters of the image capture device. In some embodiments, the control instructions include instructions for functions of the image capture device, including, for example, capturing an image, starting/stopping video capture, turning the image capture device on or off, adjusting the capture mode (e.g., capturing a single image or capturing video), adjusting the distance between the left and right components of a stereoscopic imaging system, or adjusting the position, orientation, and/or movement of the image capture device (including adjusting the image capture device via a gimbal).
In some embodiments, the control instructions select a speed mode based on a speed parameter, and control the electronic stability augmentation module to turn on or off based on the selected speed mode.
In some embodiments, when the drone 100 receives the control instructions, the control instructions change parameters stored in the memory and/or are themselves stored in the memory.
Fig. 2 schematically shows a flow chart of a control method according to an embodiment of the disclosure.
As shown in fig. 2, the method includes operations S210 and S220.
In operation S210, a current flight mode is determined.
In operation S220, the electronic stability augmentation module is controlled to be turned on or off based on the current flight mode determined by the processor.
In some embodiments, controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor includes:
when the flight mode instruction received from the user is a first mode, the processor determines that the current flight mode is the fast mode and turns off the electronic stability augmentation module; and
when the flight mode instruction received from the user is a second mode, the processor determines that the current flight mode is the slow mode and turns on the electronic stability augmentation module.
In some embodiments, when the processor determines that the current flight mode of the remotely controlled device is the smart flight mode, the electronic stability augmentation module is turned on even if the flight mode instruction received from the user is the first mode; or,
if the current mode is detected to be the first mode, the device automatically switches to the second mode.
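The mode-to-stabilization mapping described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure: the `FlightMode` names and the `eis_should_be_on` helper are assumptions chosen to mirror the first/second/smart mode rules.

```python
from enum import Enum

class FlightMode(Enum):
    FAST = "first mode"    # fast flight: prioritize a large field of view
    SLOW = "second mode"   # slow flight: prioritize a stable image
    SMART = "smart flight" # tracking, waypoint flight, etc.

def eis_should_be_on(mode: FlightMode) -> bool:
    """Decide whether the electronic stability augmentation module is on.

    The fast (first) mode turns stabilization off to keep the full field
    of view; the slow (second) mode turns it on; the smart flight mode
    keeps it on even when the user requested the fast mode.
    """
    if mode is FlightMode.SMART:
        return True
    return mode is FlightMode.SLOW

# The processor would apply this decision whenever the flight mode changes.
assert eis_should_be_on(FlightMode.FAST) is False
assert eis_should_be_on(FlightMode.SLOW) is True
assert eis_should_be_on(FlightMode.SMART) is True
```

The smart-mode override could equally be realized by switching the requested mode itself, as the alternative in the text suggests; the check-at-decision-time form above is just one way to express the same rule.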
With this method, the electronic stability augmentation module can be enabled intelligently at different speeds; for example, it is selected intelligently according to the flight mode and play mode: electronic image stabilization (EIS) is turned off when a large field of view (FOV) is required, and turned on when a stable image is desired.
According to an embodiment of the present disclosure, there are two speed modes in total: the first mode is the fast mode and the second mode is the slow mode. When the user selects the slow mode in the settings, electronic image stabilization (EIS) is enabled by default and a stable image can be obtained; when the fast mode is selected, EIS is disabled by default and a larger field of view (FOV) can be obtained. In short, the electronic stability augmentation behavior is matched to the speed mode selected by the user.
Also, when the slow mode or the fast mode is selected, at least one of the photographing modes is turned off by default.
According to the embodiment of the present disclosure, the photographing mode may include, for example, a night mode, a panorama mode, a following mode, a portrait/self-portrait mode, an indoor mode, a group photograph mode, a top-down photograph mode, a landscape mode, or the like.
The method can invoke the shooting parameters associated with the shooting mode to perform the shooting, which reduces the user's learning and operating cost and improves the user experience.
In some embodiments, the drone further comprises: a speed sensor to detect a speed parameter of the drone, wherein controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor comprises:
when the speed parameter value from the speed sensor is greater than or equal to a predetermined value, determining that the current flight mode is the first mode and turning off the electronic stability augmentation module; and when the speed parameter value from the speed sensor is less than the predetermined value, determining that the current flight mode is the second mode and turning on the electronic stability augmentation module.
In some embodiments, when the processor determines that the current flight mode of the drone is a smart flight mode, the electronic stability augmentation module is turned on even if the speed parameter value from the speed sensor is greater than or equal to a predetermined value.
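The speed-threshold variant can be sketched the same way. The threshold value (8 m/s here) and the function name are hypothetical; the disclosure only specifies "a predetermined value".

```python
def eis_on_for_speed(speed_mps: float,
                     threshold_mps: float = 8.0,
                     smart_flight: bool = False) -> bool:
    """Return True if electronic stabilization should be on for this speed.

    Speeds at or above the threshold select the first (fast) mode, which
    disables stabilization; slower speeds select the second (slow) mode,
    which enables it. Smart flight mode keeps stabilization on regardless
    of the measured speed.
    """
    if smart_flight:
        return True
    return speed_mps < threshold_mps

assert eis_on_for_speed(10.0) is False                    # fast mode: EIS off
assert eis_on_for_speed(3.0) is True                      # slow mode: EIS on
assert eis_on_for_speed(10.0, smart_flight=True) is True  # override
```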
Fig. 3-10 schematically illustrate display interfaces on a display screen of a remote control device according to an embodiment of the present disclosure.
As shown in fig. 3, the display interface includes a mode selection and display control 300, and in the state shown in fig. 3, the shooting mode is a normal mode, for example, a mode in which a user actively sets shooting parameters. When the user operates the mode selection and display control 300, the interface shown in FIG. 4 is entered.
In fig. 3, a flight speed item is also shown, from which the user can select the fast mode or the slow mode.
As shown in fig. 4, the interface presents multiple selectable shooting modes, such as follow mode, panoramic photo, time-lapse video, one-key short video, slow-motion video, light-trail video, indoor shooting, gesture selfie, group photo, night selfie, depth-of-field photo, orbit shot, and the like. When the user selects one of the shooting modes, the interface returns to that shown in fig. 3, and the content displayed on the mode selection and display control 300 is replaced with the selected mode name.
As shown in FIG. 5, some modes provide an associated sub-mode, presented via the sub-mode display control 310 in the figure, which allows user selection. A hide sub-mode control 320 is also arranged on the interface for hiding the sub-mode display control 310. The interface with the sub-modes hidden is shown in fig. 6. In the interface illustrated in fig. 6, a show sub-mode control 330 is provided for displaying the sub-mode display control 310 again and returning to the interface state of fig. 5.
As shown in FIG. 7, some modes provide partially configurable parameters, presented via the parameter display control 340 in the figure, which allows user configuration. A hide parameter control 350 is further disposed on the interface for hiding the parameter display control 340. The interface with the parameters hidden is shown in fig. 8. In the interface illustrated in fig. 8, a show parameter control 360 is provided for displaying the parameter display control 340 again and returning to the interface state of fig. 7.
As shown in fig. 9, a help control 370 is also provided in some mode interfaces. In response to user operation, the help control 370 presents a help interface corresponding to the current mode, as shown in fig. 10, which provides introductory information about the current mode. In the interface illustrated in fig. 10, a close control 380 is provided for closing the help interface and returning to the state of fig. 9.
According to embodiments of the present disclosure, at least one of the color of the speed shown on the display device and the color of the stick differs between the fast mode and the slow mode.
Fig. 11 schematically shows a flow chart of a control method according to another embodiment of the present disclosure.
As shown in fig. 11, the method may further include one or more of operations S410, S420, and S430 based on the embodiment illustrated in fig. 2.
In operation S410, when the slow mode is selected, the electronic stability augmentation module is turned on.
In operation S420, when the fast mode is selected, the electronic stability augmentation module is turned off.
In operation S430, when the intelligent flight mode is selected, the electronic stability augmentation module is turned on: even if the flight mode instruction received from the user is the first mode, the electronic stability augmentation module remains on; or, if the current mode is detected to be the first mode, the device automatically switches to the second mode.
The intelligent flight mode includes at least one of tracking flight, pointing (tap-to-fly) flight, hotspot following, course lock, waypoint flight, return-to-home lock, and point-of-interest orbiting.
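The automatic-switch variant of operation S430 can be sketched as follows; the mode labels and function name are illustrative assumptions, not identifiers from the disclosure.

```python
def resolve_flight_mode(requested: str, smart_flight_active: bool) -> str:
    """If an intelligent flight mode is active and the user requested the
    first (fast) mode, automatically fall back to the second (slow) mode
    so that the electronic stability augmentation module stays on."""
    if smart_flight_active and requested == "first":
        return "second"
    return requested

assert resolve_flight_mode("first", smart_flight_active=True) == "second"
assert resolve_flight_mode("first", smart_flight_active=False) == "first"
assert resolve_flight_mode("second", smart_flight_active=True) == "second"
```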
In some embodiments, drone photography requires coordination with specific motion and/or gimbal control to achieve the intended photographic effect. Some shooting modes provided by embodiments of the present disclosure therefore also correspond to motion parameters and/or gimbal control parameters. After the user confirms the shooting mode, the drone can automatically control itself, the gimbal, and the image capture device according to the shooting mode and perform the shooting, which greatly reduces the user's operating difficulty and complexity and improves the shooting success rate.
After ordinary aerial photography, the captured images or videos still need to be processed to finish the work, for example synthesizing a panoramic image from several images. Because manual control of the shooting process makes each result subject to strong uncontrollable factors, automatic processing is not feasible and the user must process the material manually, which imposes a high learning cost, processing difficulty, and time cost. Some of the shooting modes provided by the embodiments of the present disclosure therefore also correspond to processing modes. After shooting is completed in such a mode, the stability of the shooting process allows the results to be processed according to the processing mode automatically and in real time, improving the user experience.
According to the embodiment of the present disclosure, the various control parameters in the above operations S410 and S420 may also be determined by the remote control device based on the photographing mode.
In some embodiments, the remote control device may, for example, invoke a motion parameter based on the shooting mode and send the motion parameter to the drone, so that the drone can move based on the motion parameter. In particular, the remote control device may send the motion parameter to the drone so that the drone can change at least one of its speed, acceleration, or direction of motion based on the motion parameter. Alternatively, the remote control device may send the motion parameter to the drone so that the drone can move according to a first preset pattern based on the motion parameter.
In some embodiments, the remote control device may, for example, invoke pan-tilt control parameters based on the shooting mode, and send the pan-tilt control parameters to the drone, enabling the drone to control the state of the pan-tilt based on the pan-tilt control parameters. Specifically, the remote control device may, for example, send the pan/tilt control parameter to the drone, so that the drone can change the direction of the pan/tilt based on the pan/tilt control parameter. Or, the remote control device may send the pan/tilt control parameter to the unmanned aerial vehicle, so that the unmanned aerial vehicle can control the pan/tilt to move according to a second preset mode based on the pan/tilt control parameter.
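One way to picture the per-mode motion and pan-tilt parameters the remote control device looks up and sends is as a small preset table. All field names, mode entries, and numeric values below are hypothetical illustrations; the disclosure does not specify a message format.

```python
from dataclasses import dataclass

@dataclass
class ShootingModeCommand:
    """Parameters the remote control device could send per shooting mode.

    Field names are illustrative only: speed/heading stand in for the
    motion parameters, gimbal_pitch_deg for a pan-tilt control parameter.
    """
    mode: str
    speed_mps: float        # motion parameter: target speed
    heading_deg: float      # motion parameter: direction of motion
    gimbal_pitch_deg: float # pan-tilt control parameter

# Hypothetical presets: a panorama hovers with a level camera; a
# top-down shot creeps forward with the camera pointing straight down.
PRESETS = {
    "panoramic photo": ShootingModeCommand("panoramic photo", 0.0, 0.0, 0.0),
    "top-down photo": ShootingModeCommand("top-down photo", 0.5, 0.0, -90.0),
}

cmd = PRESETS["top-down photo"]
assert cmd.gimbal_pitch_deg == -90.0
```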
In some embodiments, the remote control device may, for example, receive in real-time a target image sent by the drone, invoke a processing mode based on the capture mode, process the target image based on the processing mode, and present the processed target image in real-time.
Fig. 12 schematically shows a flowchart for acquiring a shooting mode according to an embodiment of the present disclosure.
As shown in fig. 12, the method includes S211 and S212.
In operation S211, environment information is acquired.
In operation S212, a shooting mode is determined based on the environment information.
According to embodiments of the present disclosure, the environment information may include, for example, environment image information collected by the drone, sound information collected by the drone, positioning information of the drone, communication information of the drone, or distance information of objects around the drone. The communication information includes the communication signals the drone can receive in its current state; for example, in an indoor environment the GPS signal becomes weaker or even disappears.
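The indoor example above (a weak or absent GPS signal) can be sketched as a simple heuristic; the threshold values and names below are illustrative assumptions, not specified by the disclosure:

```python
def likely_indoor(gps_satellites: int, gps_snr_db: float) -> bool:
    """Guess whether the drone is indoors from GPS reception quality.
    Thresholds (6 satellites, 20 dB SNR) are invented for this sketch."""
    return gps_satellites < 6 or gps_snr_db < 20.0
```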
According to an embodiment of the present disclosure, determining the shooting mode based on the environment information includes determining the shooting mode based on environment image information containing a specific object. For example, when the environment image information contains a human face, the shooting mode is determined to be a self-portrait mode; when it contains a number of trees, the shooting mode is determined to be a landscape mode; and so on.
According to an embodiment of the present disclosure, determining the shooting mode based on the environment information includes determining the shooting mode based on environment image information containing a specific number of a specific object, or a specific object occupying a specific area ratio. For example, when the environment image information contains multiple faces, the shooting mode is determined to be the group-photo mode; when trees occupy 10% or more of the image area, the shooting mode is determined to be the landscape mode; and so on.
According to an embodiment of the present disclosure, determining the shooting mode based on the environment information includes determining the shooting mode based on environment image information containing a specific gesture or posture. The user may thus control the shooting mode used by the drone through preset gestures or postures. For example, when a gesture appears in the environment image information, the gesture is recognized and matched against a preset gesture-to-mode library; when the match succeeds, the matched mode is determined to be the shooting mode.
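The three rules above (specific object, object count or area ratio, and gesture matching) can be combined into one decision routine. This is a non-authoritative sketch; mode names, the priority order, and the 10% threshold reuse the examples in the text, while everything else is assumed:

```python
def determine_shooting_mode(num_faces, tree_area_ratio,
                            gesture=None, gesture_mode_library=None):
    """Pick a shooting mode from features extracted from the environment image."""
    # A recognized gesture is an explicit user request, so it takes priority.
    if gesture is not None and gesture_mode_library and gesture in gesture_mode_library:
        return gesture_mode_library[gesture]
    if num_faces >= 2:
        return "group_photo"        # multiple faces -> group-photo mode
    if num_faces == 1:
        return "selfie"             # single face -> self-portrait mode
    if tree_area_ratio >= 0.10:     # trees cover at least 10% of the frame
        return "landscape"
    return "default"
```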
Fig. 13 schematically shows an information interaction diagram for acquiring a photographing mode according to another embodiment of the present disclosure.
As shown in fig. 13, the method includes operations S610 to S650.
In operation S610, the drone acquires environmental information. This operation is similar to operation S211 described above and will not be described in detail here.
In operation S620, the drone determines a recommended speed mode based on the environment information, and transmits the recommended speed mode to the remote control device, and the remote control device receives the recommended speed mode.
In operation S630, the remote control device presents the candidate speed modes.
In operation S640, the remote control device determines at least one speed mode from the candidate speed modes in response to a user operation.
In operation S650, the remote control device transmits the speed mode to the drone, and the drone receives it. This completes the process of acquiring the speed mode according to the embodiment of the present disclosure.
In some embodiments, operations S630 to S650 may also be performed independently of S610 and S620, i.e., no speed mode is recommended when the candidate speed modes are presented.
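The presentation step (S630), with or without a recommendation from S620, can be sketched as ordering the candidate list so a UI can highlight the recommended entry first. The function name and ordering convention are assumptions for illustration:

```python
def present_candidates(all_modes, recommended=None):
    """Return the candidate speed modes in display order.
    If a recommended mode was received (operation S620), list it first so the
    UI can highlight it; otherwise keep the original order (pure S630 case)."""
    if recommended in all_modes:
        return [recommended] + [m for m in all_modes if m != recommended]
    return list(all_modes)
```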
Fig. 14 schematically shows a block diagram of a drone 1000 according to an embodiment of the present disclosure.
As shown in fig. 14, the drone 1000 includes a processor 1010, a computer-readable storage medium 1020, a communication device 1030, a speed sensor 1050, and an image acquisition device 1040. The drone 1000 may perform the method applied to the drone described above with reference to fig. 1 to 13 to enable automatic invocation of the shooting parameters according to the shooting mode.
In particular, processor 1010 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 1010 may also include on-board memory for caching purposes. The processor 1010 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows applied to the drone according to embodiments of the present disclosure described with reference to fig. 1-13.
Computer-readable storage medium 1020, for example, may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium 1020 may include a computer program 1021, which computer program 1021 may include code/computer-executable instructions that, when executed by the processor 1010, cause the processor 1010 to perform a method flow applied to a drone, such as described above in connection with fig. 2-9, and any variations thereof.
The computer program 1021 may be configured with computer program code, for example, comprising computer program modules.
For example, in an example embodiment, the code in computer program 1021 may include one or more program modules, for example module 1021A, module 1021B, and so on. Note that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of modules according to the actual situation. When these program modules are executed by the processor 1010, the processor 1010 may execute, for example, the method flow applied to the drone described above with reference to figs. 2 to 9, and any variants thereof.
According to an embodiment of the present disclosure, the processor 1010 may interact with the communication device 1030, the speed sensor 1050, and the image acquisition device 1040 to perform the method flow applied to the drone described above in connection with figs. 1 to 13, and any variations thereof.
Fig. 15 schematically shows a block diagram of a remote control device 1100 according to an embodiment of the present disclosure.
As shown in fig. 15, remote control device 1100 includes a processor 1110, a computer-readable storage medium 1120, a communication apparatus 1130, and a presentation apparatus 1140. The remote control device 1100 may perform the method applied to the remote control device described above with reference to fig. 1 to 13 to implement automatic invocation of the photographing parameters according to the photographing mode.
In particular, processor 1110 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 1110 may also include onboard memory for caching purposes. Processor 1110 may be a single processing unit or multiple processing units for performing the different actions of the method flows described with reference to fig. 2-9 as applied to a remote control device.
Computer-readable storage medium 1120 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
Computer-readable storage medium 1120 may include a computer program 1121, and computer program 1121 may include code/computer-executable instructions that, when executed by processor 1110, cause processor 1110 to perform a method flow, such as described above in connection with fig. 2-9, as applied to a remote control device, and any variations thereof.
The computer program 1121 may be configured with computer program code, for example comprising computer program modules.
For example, in an example embodiment, the code in computer program 1121 may include one or more program modules, for example module 1121A, module 1121B, and so on. Note that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of modules according to the actual situation. When these program modules are executed by the processor 1110, the processor 1110 may execute, for example, the method flows applied to the remote control device described above with reference to figs. 2 to 9, and any variations thereof.
According to an embodiment of the disclosure, processor 1110 may interact with communications apparatus 1130 and presentation apparatus 1140 to perform the method flows described above in connection with fig. 1-13 as applied to a remote control device and any variations thereof.
According to embodiments of the present disclosure, a non-transitory machine-readable medium having stored thereon machine-readable instructions which, when executed by a processor, may implement a method according to embodiments of the present disclosure described above.
Many features of embodiments of the present disclosure may be performed using, or with the assistance of, hardware, software, firmware, or a combination thereof. Thus, the features of the disclosed embodiments may be implemented using a processing system. Exemplary processing systems include, but are not limited to, one or more general purpose microprocessors (e.g., single or multi-core processors), application specific integrated circuits, special purpose instruction set processors, field programmable gate arrays, graphics processing units, physical processing units, digital signal processing units, co-processors, network processing units, audio processing units, cryptographic processing units, and the like.
The features of embodiments of the present disclosure may be implemented using, or with the aid of, a computer program product, such as a storage medium (media) or computer-readable storage medium (media) storing instructions for performing any of the features presented herein. The storage medium may include, but is not limited to, any type of disk, including floppy disks, optical disks, DVDs, CD-ROMs, microdrives, and magneto-optical disks; ROM, RAM, EPROM, EEPROM, DRAM, VRAM, DDR RAM, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs); or any type of media or device suitable for storing instructions and/or data.
The features of embodiments of the present disclosure, stored on any one of the machine-readable media (media), may be incorporated into software and/or firmware for controlling the hardware of a processing system and for enabling the processing system to interact with other mechanisms that utilize the results of embodiments of the present disclosure. Such software or firmware may include, but is not limited to, application code, device drivers, operating systems, and execution environments/containers.
A communication system as referred to herein optionally communicates via a wired and/or wireless communication connection. For example, communication systems optionally receive and transmit RF signals, also referred to as electromagnetic signals. RF circuitry of a communication system converts/transforms electrical signals into/from electromagnetic signals and communicates with communication networks and other communication devices via electromagnetic signals. The RF circuitry optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. The communication system optionally communicates with a network, such as the internet, also known as the World Wide Web (WWW), a local area network, and/or a wireless network, such as a cellular telephone network, a wireless Local Area Network (LAN), and/or a Metropolitan Area Network (MAN), among other wireless communication devices. 
The wireless communication connection optionally uses any of a number of communication standards, protocols, and technologies, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High-Speed Downlink Packet Access (HSDPA), High-Speed Uplink Packet Access (HSUPA), Evolution-Data Optimized (EV-DO), HSPA+, Dual-Cell HSPA (DC-HSPA), Long Term Evolution (LTE), Near Field Communication (NFC), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, e-mail protocols (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
As mentioned herein, electronic stability augmentation (EIS) trades camera sensor pixel utilization and image quality for image frame stability: after acquiring an original image with a large field of view (FOV), the EIS scheme crops a target region from the original image according to inertial measurement unit (IMU) information, corrects the image according to calibration parameters, and finally outputs a distortion-free small-FOV image. The resulting final image therefore has low sensor utilization and sacrifices considerable picture quality.
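The crop-and-utilization trade-off described above can be illustrated with a rough sketch. All names, the margin ratio, and the clamping scheme are assumptions; a real EIS pipeline would additionally perform distortion correction from calibration parameters:

```python
def eis_crop_window(frame_w, frame_h, margin_ratio, shake_dx, shake_dy):
    """Compute the crop rectangle EIS would cut from the full-FOV frame.

    margin_ratio: fraction of each dimension reserved on each side for shake
    compensation; shake_dx/dy: pixel offsets estimated from IMU data.
    Returns (x0, y0, crop_w, crop_h)."""
    crop_w = int(frame_w * (1 - 2 * margin_ratio))
    crop_h = int(frame_h * (1 - 2 * margin_ratio))
    max_dx = (frame_w - crop_w) // 2
    max_dy = (frame_h - crop_h) // 2
    # Clamp the compensation shift so the window stays inside the sensor.
    x0 = max_dx + max(-max_dx, min(max_dx, int(shake_dx)))
    y0 = max_dy + max(-max_dy, min(max_dy, int(shake_dy)))
    return x0, y0, crop_w, crop_h

def sensor_utilization(frame_w, frame_h, crop_w, crop_h):
    """Fraction of sensor pixels that survive the EIS crop."""
    return (crop_w * crop_h) / (frame_w * frame_h)
```

With a 10% margin on each side of a 1920x1080 frame, only 64% of the sensor pixels reach the output, which is the utilization loss the text refers to.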
Without the electronic stability augmentation module, picture stability can instead be achieved with a mechanical two-axis or three-axis gimbal, but this imposes higher requirements on the airframe and the mechanical gimbal structure, and increases cost.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (42)

1. A control method applied to a remotely controlled device, the remotely controlled device comprising:
a camera for capturing an image;
an electronic stability augmentation module for preventing camera shake; and
a processor for receiving a flight mode instruction from a user and determining a current flight mode of the remotely controlled device;
the control method being characterized by comprising:
controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor, comprising: when the flight mode instruction received from the user is a first mode, determining, by the processor, that the current flight mode is a fast mode and turning off the electronic stability augmentation module, so as to obtain a larger field of view; and when the flight mode instruction received from the user is a second mode, determining, by the processor, that the current flight mode is a slow mode and turning on the electronic stability augmentation module, so as to obtain a stable image.
2. The method of claim 1, wherein,
when the processor determines that the current flight mode is an intelligent flight mode, the electronic stability augmentation module is turned on.
3. The method of claim 2, wherein,
the intelligent flight mode comprises at least one of tracking flight, pointing flight, hotspot following, course lock, waypoint flight, return-flight lock, and point-of-interest orbit.
4. The method of claim 1, wherein,
when the processor determines that the current flight mode of the remotely controlled device is the intelligent flight mode, the electronic stability augmentation module remains turned on even if the flight mode instruction received from the user is the first mode; or,
if the current mode is detected to be the first mode, the first mode is automatically switched to the second mode.
5. The method of claim 4, wherein,
the intelligent flight mode comprises at least one of tracking flight, pointing flight, hotspot following, course lock, waypoint flight, return-flight lock, and point-of-interest orbit.
6. The method of claim 1, wherein
the remotely controlled device further comprises:
a speed sensor for detecting a speed of the remotely controlled device;
the control method further comprises the following steps:
acquiring a speed parameter of the remotely controlled device from the speed sensor;
wherein the step of determining the current flight mode of the remotely controlled device to control the turning on or off of the electronic stability augmentation module comprises:
determining the first mode to turn off the electronic stability augmentation module when the speed parameter value from the speed sensor is greater than or equal to a predetermined value; and
determining the second mode to turn on the electronic stability augmentation module when the speed parameter value from the speed sensor is less than the predetermined value.
7. The method of claim 6, wherein,
when the processor determines that the current flight mode of the remotely controlled device is the intelligent flight mode, the electronic stability augmentation module remains turned on even if the speed parameter value is greater than or equal to the predetermined value.
8. The method of claim 7, wherein,
the intelligent flight mode comprises at least one of tracking flight, pointing flight, hotspot following, course lock, waypoint flight, return-flight lock, and point-of-interest orbit.
9. The method of claim 1, wherein the remotely controlled device is a drone.
10. The method of claim 1 or 6, wherein at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes of the camera is turned off when the current flight mode of the remotely controlled device is determined to be the first mode.
11. The method of claim 1 or 6, wherein at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes of the camera is turned on when the current flight mode of the remotely controlled device is determined to be the second mode.
12. A control method applied to a remote control device, comprising:
sending a speed mode to a remotely controlled device controlled by the remote control device, so that the remotely controlled device can control an electronic stability augmentation module provided on the remotely controlled device to turn on or off based on the speed mode;
wherein the speed mode comprises a first mode and a second mode; when the first mode is sent, the processor of the remotely controlled device determines that the current flight mode is a fast mode and the remotely controlled device turns off the electronic stability augmentation module, so as to obtain a larger field of view; and when the second mode is sent, the processor of the remotely controlled device determines that the current flight mode is a slow mode and the remotely controlled device turns on the electronic stability augmentation module, so as to obtain a stable image.
13. The method of claim 12, wherein the remotely controlled device is a drone.
14. The method of claim 12, wherein:
when the first mode is sent, controlling at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes of a camera provided on the remotely controlled device to be turned off.
15. The method of claim 12, wherein:
when the second mode is sent, controlling at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes of a camera provided on the remotely controlled device to be turned on.
16. The method of claim 13, further comprising:
acquiring a speed parameter value of the remotely controlled device;
when the speed parameter value from the speed sensor is greater than or equal to a predetermined value, sending the first mode to the remotely controlled device to turn off the electronic stability augmentation module; and
when the speed parameter value from the speed sensor is less than the predetermined value, sending the second mode to the remotely controlled device to turn on the electronic stability augmentation module.
17. The method of claim 16, wherein,
if the current flight mode of the remotely controlled device is an intelligent flight mode, a control instruction is still sent to the remotely controlled device to turn on the electronic stability augmentation module even if the acquired speed parameter value of the remotely controlled device is greater than or equal to the predetermined value.
18. The method of claim 17, further comprising:
receiving and displaying, in real time, a target image sent by the remotely controlled device.
19. The method of claim 18, wherein said receiving and displaying in real time a target image transmitted by said remotely controlled device comprises:
receiving a target image transmitted by the remotely controlled device in real time;
setting at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes based on the target image.
20. The method of claim 12, wherein,
when the current flight mode is an intelligent flight mode, the remotely controlled device keeps the electronic stability augmentation module turned on.
21. An unmanned aerial vehicle, comprising:
a camera for capturing an image;
an electronic stability augmentation module for preventing camera shake;
a processor for receiving flight mode instructions from a user and determining a current flight mode of the drone, and controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor; and
a memory having computer-readable instructions stored thereon for execution by the processor;
wherein when the processor receives a first mode triggered by the user, the processor determines that the current flight mode is a fast mode and turns off the electronic stability augmentation module, so as to obtain a larger field of view; and when the processor receives a second mode triggered by the user, the processor determines that the current flight mode is a slow mode and turns on the electronic stability augmentation module, so as to obtain a stable image.
22. The drone of claim 21, wherein,
when the processor determines that the current flight mode is an intelligent flight mode, the electronic stability augmentation module is turned on.
23. The drone of claim 22, wherein,
the intelligent flight mode comprises at least one of tracking flight, pointing flight, hotspot following, course lock, waypoint flight, return-flight lock, and point-of-interest orbit.
24. The drone of claim 21, wherein,
when the processor determines that the current flight mode is the intelligent flight mode, the electronic stability augmentation module remains turned on even when the processor receives the first mode triggered by the user; or,
if the current mode is detected to be the first mode, the first mode is automatically switched to the second mode.
25. The drone of claim 24, wherein,
the intelligent flight mode comprises at least one of tracking flight, pointing flight, hotspot following, course lock, waypoint flight, return-flight lock, and point-of-interest orbit.
26. The drone of claim 21, wherein the drone further comprises:
a speed sensor for detecting a speed parameter of the drone.
27. The unmanned aerial vehicle of claim 26,
wherein controlling the electronic stability augmentation module to turn on or off based on the current flight mode determined by the processor comprises:
determining that the current flight mode is the first mode, so as to turn off the electronic stability augmentation module, when the speed parameter value from the speed sensor is greater than or equal to a predetermined value; and
determining that the current flight mode is the second mode, so as to turn on the electronic stability augmentation module, when the speed parameter value from the speed sensor is less than the predetermined value.
28. The drone of claim 27, wherein,
when the processor determines that the current flight mode of the drone is the intelligent flight mode, the electronic stability augmentation module remains turned on even if the speed parameter value from the speed sensor is greater than or equal to the predetermined value.
29. The drone of claim 28, wherein,
the intelligent flight mode comprises at least one of tracking flight, pointing flight, hotspot following, course lock, waypoint flight, return-flight lock, and point-of-interest orbit.
30. The drone of claim 21 or 27, wherein when the current flight mode is determined to be the first mode, at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, or a landscape mode among the shooting modes of the camera is turned off.
31. The drone of claim 21 or 27, wherein when the current flight mode is determined to be the second mode, at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, or a landscape mode among the shooting modes of the camera is turned on.
32. A remote control device, comprising:
a processor for sending a speed mode to a remotely controlled device controlled by the remote control device, so that the remotely controlled device can control an electronic stability augmentation module provided on the remotely controlled device to turn on or off based on the speed mode; and
a memory having computer-readable instructions stored thereon for execution by the processor;
wherein the speed mode comprises a first mode and a second mode; when the first mode is sent, the processor of the remotely controlled device determines that the current flight mode is a fast mode and the remotely controlled device turns off the electronic stability augmentation module, so as to obtain a larger field of view; and when the second mode is sent, the processor of the remotely controlled device determines that the current flight mode is a slow mode and the remotely controlled device turns on the electronic stability augmentation module, so as to obtain a stable image.
33. The remote control device of claim 32, wherein the remotely controlled device is a drone.
34. The remote control device according to claim 32, wherein when the first mode is sent, at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes of a camera provided on the remotely controlled device is turned off.
35. The remote control device according to claim 32, wherein when the second mode is sent, at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes of a camera provided on the remotely controlled device is turned on.
36. The remote control device of claim 32, wherein the processor is further configured to:
acquire a speed parameter value of the remotely controlled device;
send the first mode to the remotely controlled device to turn off the electronic stability augmentation module when the speed parameter value from the speed sensor is greater than or equal to a predetermined value; and
send the second mode to the remotely controlled device to turn on the electronic stability augmentation module when the speed parameter value from the speed sensor is less than the predetermined value.
37. The remote control device of claim 33,
if the current flight mode of the remotely controlled device is an intelligent flight mode, a control instruction is still sent to the remotely controlled device to turn on the electronic stability augmentation module even if the acquired speed parameter value of the remotely controlled device is greater than or equal to the predetermined value.
38. The remote control device of claim 37, wherein the processor is further configured to:
receive and display, in real time, a target image transmitted by the remotely controlled device.
39. The remote control device of claim 38, wherein the receiving and displaying in real time of a target image transmitted by the remotely controlled device comprises:
receiving a target image transmitted by the remotely controlled device in real time;
setting at least one of a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a group-photo mode, a top-down mode, or a landscape mode among the shooting modes based on the target image.
40. The remote control device of claim 36, wherein:
the color in which the speed of the first mode is displayed on the display device is different from the color in which the speed of the second mode is displayed.
41. The remote control device of claim 36, wherein:
the color in which the joystick is displayed on the display device in the first mode is different from the color in which the joystick is displayed in the second mode.
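Claims 40 and 41 require only that the speed readout and the joystick indicator be rendered in mode-dependent colors; the specific colors are not claimed. One hypothetical mapping, purely for illustration:

```python
# Hypothetical UI color table for claims 40-41: the speed readout and the
# joystick indicator are drawn in different colors in the two modes.
# The specific colors are assumptions; the claims require only that they differ.

MODE_COLORS = {
    "first mode": {"speed": "red", "joystick": "orange"},
    "second mode": {"speed": "green", "joystick": "blue"},
}


def widget_color(mode: str, widget: str) -> str:
    """Look up the display color for a widget in the given mode."""
    return MODE_COLORS[mode][widget]
```

The distinct colors give the operator an at-a-glance cue as to whether electronic stability augmentation is active.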
42. A non-volatile storage medium having stored thereon executable instructions for performing the method of any one of claims 1 to 20.
CN201880014345.9A 2018-01-05 2018-01-05 Control method, unmanned aerial vehicle, remote control device and nonvolatile storage medium Expired - Fee Related CN110383814B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/071594 WO2019134124A1 (en) 2018-01-05 2018-01-05 Control method, unmanned aerial vehicle, remote control device, and nonvolatile storage medium

Publications (2)

Publication Number Publication Date
CN110383814A CN110383814A (en) 2019-10-25
CN110383814B true CN110383814B (en) 2021-08-31

Family

ID=67144023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880014345.9A Expired - Fee Related CN110383814B (en) 2018-01-05 2018-01-05 Control method, unmanned aerial vehicle, remote control device and nonvolatile storage medium

Country Status (2)

Country Link
CN (1) CN110383814B (en)
WO (1) WO2019134124A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111103297B (en) * 2020-01-20 2024-07-12 无锡市建筑工程质量检测中心 Non-contact detection method and detection system for quality of building exterior wall surface layer
CN112684456B (en) * 2020-12-22 2024-05-17 安徽配隆天环保科技有限公司 Unmanned aerial vehicle ultrasonic three-dimensional imaging model system
WO2022134036A1 (en) * 2020-12-25 2022-06-30 深圳市大疆创新科技有限公司 Movable platform and control method and device therefor
CN115209029A (en) * 2021-04-08 2022-10-18 成都睿铂科技有限责任公司 Stability augmentation control method, device, system, equipment and storage medium
CN114706410A (en) * 2022-04-02 2022-07-05 深圳市道通智能航空技术股份有限公司 Flight control method, unmanned aerial vehicle and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106982324A (en) * 2017-03-10 2017-07-25 重庆零度智控智能科技有限公司 Unmanned plane, video capture method and apparatus
CN107087430A (en) * 2016-03-29 2017-08-22 深圳市大疆创新科技有限公司 Perform state indication method, device and unmanned plane
CN107223223A (en) * 2016-04-29 2017-09-29 深圳市大疆创新科技有限公司 The control method and system, intelligent glasses of a kind of visual angle of unmanned plane first flight
CN107450573A (en) * 2016-11-17 2017-12-08 广州亿航智能技术有限公司 Flight shoot control system and method, intelligent mobile communication terminal, aircraft

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160286128A1 (en) * 2002-10-01 2016-09-29 Dylan TX ZHOU Amphibious vtol super drone camera in a mobile case (phone case) with multiple aerial and aquatic flight modes for capturing panoramic virtual reality views, selfie and interactive video


Also Published As

Publication number Publication date
WO2019134124A1 (en) 2019-07-11
CN110383814A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN110383814B (en) Control method, unmanned aerial vehicle, remote control device and nonvolatile storage medium
WO2019119434A1 (en) Information processing method, unmanned aerial vehicle, remote control apparatus, and non-volatile storage medium
US20210211579A1 (en) Query response by a gimbal mounted camera
US10021339B2 (en) Electronic device for generating video data
US10484621B2 (en) Systems and methods for compressing video content
US10687050B2 (en) Methods and systems of reducing latency in communication of image data between devices
US11722647B2 (en) Unmanned aerial vehicle imaging control method, unmanned aerial vehicle imaging method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle
CN108780262B (en) Method and apparatus for moving optics relative to an image sensor in an imaging device
US11258949B1 (en) Electronic image stabilization to improve video analytics accuracy
CN109792543B (en) Method and system for creating video abstraction from image data captured by movable object
CN106791483B (en) Image transmission method and device and electronic equipment
WO2023036198A1 (en) Method and apparatus for controlling aerial vehicle to capture rotational delay video, and device and medium
US10419662B2 (en) Photographing method for intelligent flight device and intelligent flight device
WO2017166714A1 (en) Method, device, and system for capturing panoramic image
WO2022141369A1 (en) Systems and methods for supporting automatic video capture and video editing
JP2018055656A (en) Unmanned aircraft control system, and control method and program for the same
WO2020014953A1 (en) Image processing method and device
JP6726649B2 (en) Flight device, management device, shooting control method, and shooting control program
CN111295331A (en) System and method for synchronizing a plurality of control devices with a movable object
US10742887B1 (en) Apparatus comprising a processor configured to perform electronic image stabilization using a calculated warp table
US20210092306A1 (en) Movable body, image generation method, program, and recording medium
WO2022061934A1 (en) Image processing method and device, system, platform, and computer readable storage medium
CN112804441B (en) Unmanned aerial vehicle control method and device
CN113452925B (en) Automatic exposure method for high dynamic range image and unmanned aerial vehicle
JP2018129577A (en) Imaging system, control method thereof, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210831