WO2019119434A1 - Information processing method, unmanned aerial vehicle, remote control apparatus, and non-volatile storage medium
Information processing method, unmanned aerial vehicle, remote control apparatus, and non-volatile storage medium
- Publication number
- WO2019119434A1 (PCT/CN2017/118061)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drone
- mode
- shooting
- pan
- shooting mode
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
Definitions
- the invention belongs to the technical field of remote control, and particularly relates to an information processing method, a drone, a remote control device and a non-volatile storage medium.
- Traditional aerial photography requires the operator to evaluate the environment, light and other factors, and then manually adjust the shooting parameters to obtain photos suited to the current scene.
- A variety of camera parameters affect the resulting footage.
- The field of view in the air is wide, and a small change in position or angle leads to changes in light, scene, target, and so on.
- The most troublesome aspect of aerial photography is the need to constantly adjust the shooting scheme and shooting parameters for different scenes. This way of shooting is complicated, and the technical threshold is high for ordinary users; a moment of inattention or unfamiliarity can easily ruin the footage.
- An aspect of an embodiment of the present disclosure provides an information processing method applied to a drone, where the drone includes an image capture device. The method includes acquiring a shooting mode, invoking a shooting parameter based on the shooting mode, and controlling the image capture device to capture a target image based on the shooting parameter.
- Another aspect of an embodiment of the present disclosure provides an information processing method applied to a remote control device. The method includes transmitting a shooting mode to a drone controlled by the remote control device, so that the drone can invoke a shooting parameter based on the shooting mode, adjust the state of the image capture device on the drone based on the shooting parameter, and capture a target image.
- Another aspect of an embodiment of the present disclosure provides an information processing method applied to a remote control device. The method includes acquiring a shooting mode, invoking a shooting parameter based on the shooting mode, and transmitting the shooting parameter to the drone controlled by the remote control device, so that the drone can control the image capture device on the drone to capture a target image based on the shooting parameter.
- A drone is provided, comprising an image capture device, a processor, and a memory having stored thereon computer readable instructions that, when executed by the processor, cause the processor to acquire a shooting mode, invoke a shooting parameter based on the shooting mode, and control the image capture device to capture a target image based on the shooting parameter.
- A remote control device is provided, including a processor and a memory having stored thereon computer readable instructions that, when executed by the processor, cause the processor to send a shooting mode to the drone controlled by the remote control device, so that the drone can invoke a shooting parameter based on the shooting mode, adjust the state of the image capture device on the drone based on the shooting parameter, and capture a target image.
- A remote control device is provided, including a processor and a memory having stored thereon computer readable instructions that, when executed by the processor, cause the processor to acquire a shooting mode, invoke a shooting parameter based on the shooting mode, and transmit the shooting parameter to a drone controlled by the remote control device, so that the drone can control the image capture device on the drone to capture a target image based on the shooting parameter.
- Another aspect of an embodiment of the present disclosure provides a nonvolatile storage medium having stored thereon executable instructions for performing any of the above methods.
- FIG. 1 is a schematic diagram showing an application scenario according to an embodiment of the present invention
- FIG. 2 is a flow chart schematically showing an information processing method according to an embodiment of the present invention
- FIGS. 3A-3H schematically illustrate display interfaces on a display screen of a remote control device in accordance with some embodiments of the present invention.
- FIG. 4 is a flow chart schematically showing an information processing method according to another embodiment of the present invention.
- FIG. 5 is a flow chart schematically showing an acquisition shooting mode according to an embodiment of the present invention.
- FIG. 6 is a view schematically showing an information interaction diagram of acquiring a shooting mode according to another embodiment of the present invention.
- FIG. 7 is a view schematically showing an information interaction diagram of an information processing method according to another embodiment of the present invention.
- FIG. 8 is a flow chart schematically showing an information processing method according to another embodiment of the present invention.
- FIG. 9 is a flow chart schematically showing an information processing method according to another embodiment of the present invention.
- FIG. 10 is a block diagram schematically showing a drone according to an embodiment of the present invention.
- FIG. 11 is a block diagram schematically showing a remote control device in accordance with an embodiment of the present invention.
- the techniques of this disclosure may be implemented in the form of hardware and/or software (including firmware, microcode, etc.). Additionally, the techniques of this disclosure may take the form of a computer program product on a computer readable medium storing instructions for use by or in connection with an instruction execution system.
- a computer readable medium can be any medium that can contain, store, communicate, propagate or transport the instructions.
- a computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- Specific examples of the computer readable medium include: a magnetic storage device such as a magnetic tape or a hard disk (HDD); an optical storage device such as a compact disc (CD-ROM); a memory such as a random access memory (RAM) or a flash memory; and/or a wired/wireless communication link.
- Unmanned aerial vehicles described below include, for example, fixed-wing aircraft and rotary-wing aircraft, such as helicopters, quadcopters, and aircraft having other numbers of rotors and/or other rotor configurations.
- An image acquisition device is provided on the drone for collecting a target image, for example, for taking a photo, taking a video, and the like.
- FIG. 1 schematically shows an application scenario according to an embodiment of the present invention. It should be noted that FIG. 1 is only an example of a scenario in which an embodiment of the present disclosure may be applied, to help those skilled in the art understand the technical content of the present disclosure; it does not mean that embodiments of the present disclosure cannot be used in other devices, systems, environments or scenarios.
- the drone 100 is provided with a mounted object 110, and the mounted object 110 may include, for example, a processor, a memory, an image capture device, a communication device, and the like.
- The image capture device may be fixed directly to the drone 100, or may be mounted on the drone 100 through a gimbal (pan/tilt head), where the gimbal may be used to change the orientation of the image capture device.
- the image capture device includes one or more optical devices that affect the light reaching the focus of the imaging sensor.
- the image capture device includes, for example, a semiconductor charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS), and/or an N-type metal oxide semiconductor (NMOS, Live MOS).
- the image capture device is configured to capture high definition or ultra high definition video (eg, 720p, 1080i, 1080p, 1440p, 2000p, 2160p, 2540p, 4000p, 4320p, etc.).
- The mounted object 110 may also include one or more other sensors for obtaining environmental information around the drone 100.
- the one or more sensors include one or more audio transducers.
- the audio detection system includes an audio output transducer (eg, a speaker) and/or an audio input transducer (eg, a microphone, such as a parabolic microphone).
- a microphone and speaker are used as components of the sonar system.
- a three-dimensional map of the surroundings of the drone 100 is provided, for example, using a sonar system.
- the one or more sensors include one or more infrared sensors.
- The distance measurement system for measuring the distance from the drone 100 to an object or surface includes one or more infrared sensors, such as left and right infrared sensors used for stereo imaging and/or distance determination.
- The one or more sensors include one or more global positioning system (GPS) sensors, motion sensors (e.g., accelerometers), rotation sensors (e.g., gyroscopes), inertial sensors, proximity sensors (e.g., infrared sensors), and/or weather sensors (e.g., pressure sensors, temperature sensors, humidity sensors and/or wind speed sensors).
- The drone 100 and/or the remote control device 200 use sensing data generated by the one or more sensors to determine, for example, the location of the drone 100, the orientation of the drone 100, movement characteristics of the drone 100 (e.g., angular velocity, angular acceleration, translational velocity, translational acceleration, and/or direction of motion along one or more axes), and/or potential obstacles, targets, weather conditions, geographic features, and/or locations of man-made structures near the drone 100.
- the remote control device 200 may be, for example, a universal remote controller or a dedicated remote controller of the drone 100. Alternatively, it may also be a mobile terminal such as a mobile phone or a tablet computer, on which a corresponding application runs. It can be used to remotely control the drone 100.
- A display device 210, such as a display screen or projection device, can be provided on the remote control device 200 for displaying image, video or text information transmitted by, for example, the drone 100.
- The remote control device 200 receives user input to control, for example, the attitude, position, orientation, speed, acceleration, navigation, and/or tracking of the drone 100 and its mounted object 110.
- the remote control device 200 is manipulated by a user to input control instructions for controlling navigation of the drone 100.
- The remote control device 200 is used to input a flight mode of the drone 100, such as autopilot or navigation according to a predetermined navigation path.
- The user controls the position, attitude, and/or orientation of the drone 100 by changing the position of the remote control device 200 (e.g., by tilting or otherwise moving the remote control device 200).
- a change in position of the remote control device 200 is detected by, for example, one or more inertial sensors, and the output of the one or more inertial sensors is used to generate command data.
- The remote control device 200 is used to adjust operational parameters of the mounted object 110, such as capture parameters of the image capture device and/or positional parameters of the mounted object 110 relative to the drone 100.
- drone 100 is in communication with remote control device 200, such as by wireless communication.
- drone 100 receives information from remote control device 200.
- the information received by the drone 100 includes, for example, control instructions for controlling parameters of the drone 100.
- drone 100 transmits information to remote control device 200.
- the information transmitted by the drone 100 includes, for example, images and/or video captured by the drone 100.
- Communication between the remote control device 200 and the drone 100 is performed via a wireless image transmission technology (e.g., Wi-Fi), a network, and/or a wireless signal transmitter of a cellular station (e.g., a remote wireless signal transmitter).
- A satellite may be an integral part of the Internet and may be used in place of a cellular tower.
- The remote control device 200 and the drone 100 can also communicate over a private network, such as Lightbridge or the like.
- The control instructions include, for example, navigation instructions for controlling navigation parameters of the drone 100, such as the position, orientation, attitude, and/or one or more movement characteristics of the drone 100 and/or the mounted object 110.
- the control instructions include instructions to control drone flight.
- The control instructions are used to adjust one or more operational parameters of the mounted object 110.
- the control instructions include instructions for adjusting the shooting parameters of the image capture device.
- The control instructions include instructions for controlling functions of the image capture device, such as capturing images, starting/stopping video capture, turning the image capture device on or off, adjusting the acquisition mode (e.g., single-image capture or video capture), adjusting the distance between the left and right components of a stereoscopic imaging system, or adjusting the position, orientation, and/or movement of the image capture device (including adjusting the image capture device through the gimbal).
- When the drone 100 receives a control command, the control command changes parameters stored in the memory and/or is stored in the memory.
- FIG. 2 schematically shows a flow chart of an information processing method according to an embodiment of the present invention.
- the method includes operations S210-S230.
- In operation S210, a shooting mode is acquired. In operation S220, a shooting parameter is invoked based on the shooting mode.
- a mapping relationship between different shooting modes and various shooting parameters can be stored. After determining the shooting mode, the shooting parameters can be called based on the mapping relationship.
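- As an illustrative sketch only (not the implementation defined in this disclosure), such a mode-to-parameter mapping could be stored as a simple lookup table; the mode names and parameter fields below are hypothetical.

```python
# Hypothetical mapping from shooting modes to shooting parameters.
# The mode names and parameter fields are illustrative assumptions,
# not values specified in this disclosure.
SHOOTING_PARAMETERS = {
    "night_scene": {"iso": 1600, "shutter_s": 1 / 30,   "exposure_comp": 0.7},
    "panorama":    {"iso": 100,  "shutter_s": 1 / 500,  "overlap_ratio": 0.3},
    "portrait":    {"iso": 200,  "shutter_s": 1 / 250,  "aperture": 2.0},
    "landscape":   {"iso": 100,  "shutter_s": 1 / 1000, "aperture": 8.0},
}

def invoke_shooting_parameters(shooting_mode: str) -> dict:
    """Return the stored shooting parameters for the given shooting mode."""
    try:
        return SHOOTING_PARAMETERS[shooting_mode]
    except KeyError:
        raise ValueError(f"unknown shooting mode: {shooting_mode}")
```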
- the image acquisition device is controlled to acquire a target image based on the shooting parameters.
- The method can automatically invoke the shooting parameters according to the shooting mode, reducing the user's learning and operating costs and improving the user experience.
- The shooting mode may include, for example, a night scene mode, a panorama mode, a follow mode, a portrait/self-portrait mode, an indoor mode, a photo mode, a panning mode, a landscape mode, and the like.
- FIGS. 3A-3H schematically illustrate display interfaces on a display screen of a remote control device in accordance with some embodiments of the present invention.
- the display interface includes a mode selection and display control 300.
- The shooting mode is a normal mode, for example, a mode in which the user actively sets the shooting parameters.
- In response to a user operation on the mode selection and display control 300, the interface shown in FIG. 3B is entered.
- The interface shows multiple alternative shooting modes, such as follow mode, panoramic photo, time-lapse video, one-click video, slow motion video, track video, indoor shooting, gesture self-timer, photo, night scene self-timer, depth-of-field photo, surround shooting, and the like.
- Sub-modes associated with the selected mode are provided, as shown by the sub-mode display control 310 in the figure, and allow user selection.
- a hidden sub-mode control 320 is also provided on the interface for hiding the sub-mode display control 310.
- the hidden interface is shown in Figure 3D.
- a display sub-mode control 330 is provided for displaying the sub-mode display control 310, returning to the interface state of FIG. 3C.
- Some configurable parameters are provided, as shown by the parameter display control 340 in the figure, and allow user configuration.
- a hidden parameter control 350 is also provided on the interface for hiding the parameter display control 340.
- the hidden interface is shown in Figure 3F.
- a display parameter control 360 is provided for displaying the parameter display control 340, returning to the interface state of FIG. 3C.
- a help control 370 is also provided.
- In response to a user operation, the help control 370 presents a help interface corresponding to the current mode, as shown in FIG. 3H, for providing introductory information about the current mode.
- a close control 380 is provided for closing the help interface back to the state of FIG. 3G.
- FIG. 4 is a flow chart schematically showing an information processing method according to another embodiment of the present invention.
- The method may further include one or more of operations S410, S420 or S430 on the basis of the embodiment illustrated in FIG. 2.
- In operation S410, a motion parameter is invoked based on the shooting mode, and the drone is controlled to move based on the motion parameter.
- At least one of a motion speed, a motion acceleration, or a motion direction of the drone is changed based on the motion parameter.
- The drone is controlled to move according to a first preset mode based on the motion parameter.
- In operation S420, a gimbal control parameter is invoked, and the state of the gimbal is controlled based on the gimbal control parameter.
- The orientation of the gimbal is changed based on the gimbal control parameter.
- The gimbal is controlled to move according to a second preset mode based on the gimbal control parameter.
- Drone photography requires specific motion and/or gimbal control to achieve the desired photographic effect.
- Some shooting modes provided by the embodiments of the present disclosure therefore also correspond to motion parameters and/or gimbal control parameters.
- In this way, the drone can automatically control the drone itself, the gimbal, and the image capture device according to the shooting mode while shooting, which greatly reduces the user's operation difficulty and complexity and improves the shooting success rate.
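- The sketch below illustrates how a shooting mode might additionally map to motion parameters and gimbal control parameters; the mode names, fields, and values are hypothetical assumptions, not parameters defined in this disclosure.

```python
# Hypothetical per-mode motion and gimbal parameters (illustrative only).
MODE_CONTROL_PARAMETERS = {
    "surround": {"motion": {"pattern": "orbit",  "speed_mps": 2.0},
                 "gimbal": {"pitch_deg": -20.0, "follow_yaw": True}},
    "panorama": {"motion": {"pattern": "hover",  "speed_mps": 0.0},
                 "gimbal": {"pitch_deg": 0.0,   "yaw_sweep_deg": 360.0}},
    "follow":   {"motion": {"pattern": "follow", "speed_mps": 5.0},
                 "gimbal": {"pitch_deg": -10.0, "follow_yaw": True}},
}

def invoke_mode_control(shooting_mode: str) -> tuple:
    """Return the (motion, gimbal) parameter sets for a shooting mode."""
    params = MODE_CONTROL_PARAMETERS[shooting_mode]
    return params["motion"], params["gimbal"]

# Example: for "surround", the drone would orbit at 2 m/s while the gimbal
# holds a -20 degree pitch and follows the aircraft's yaw.
motion, gimbal = invoke_mode_control("surround")
```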
- In operation S430, a processing mode is invoked based on the shooting mode, and the target image is processed based on the processing mode.
- Some shooting modes provided by embodiments of the present disclosure also correspond to processing modes. After shooting in such a mode is completed, because the shooting process is standardized, the shooting result can be processed automatically and in real time according to a preset pattern, thereby improving the user experience.
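- One plausible way to associate processing modes with shooting modes is a dispatch table keyed by mode; the particular processing steps below (frame averaging for night scenes, naive side-by-side concatenation as a stand-in for panorama stitching) are illustrative assumptions, not processing modes specified in this disclosure.

```python
import numpy as np

def process_night_scene(frames):
    # Average several frames to reduce noise (illustrative stand-in for
    # a real multi-frame night-scene pipeline).
    return np.mean(np.stack(frames).astype(np.float32), axis=0).astype(np.uint8)

def process_panorama(frames):
    # Placeholder: a real implementation would stitch overlapping frames;
    # here frames of equal height are simply placed side by side.
    return np.concatenate(frames, axis=1)

PROCESSING_MODES = {
    "night_scene": process_night_scene,
    "panorama": process_panorama,
}

def process_target_images(shooting_mode, frames):
    """Apply the processing mode associated with the shooting mode, if any."""
    processor = PROCESSING_MODES.get(shooting_mode)
    return processor(frames) if processor else frames
```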
- various control parameters in the above operations S410, S420, and/or S430 may also be determined by the remote control device based on the shooting mode.
- The remote control device can, for example, invoke a motion parameter based on the shooting mode and transmit the motion parameter to the drone, so that the drone can control its movement based on the motion parameter.
- the remote control device may, for example, transmit the motion parameter to the drone to enable the drone to change at least one of a motion speed, a motion acceleration, or a motion direction of the drone based on the motion parameter.
- the remote control device may, for example, transmit the motion parameter to the drone to enable the drone to control the drone to move in accordance with the first preset mode based on the motion parameter.
- The remote control device can, for example, invoke a gimbal control parameter based on the shooting mode and transmit the gimbal control parameter to the drone, so that the drone can control the state of the gimbal based on the gimbal control parameter.
- The remote control device may, for example, transmit the gimbal control parameter to the drone, so that the drone can change the orientation of the gimbal based on the gimbal control parameter.
- The remote control device may, for example, transmit the gimbal control parameter to the drone, so that the drone can control the gimbal to move according to a second preset mode based on the gimbal control parameter.
- The remote control device may, for example, receive a target image transmitted by the drone in real time, invoke a processing mode based on the shooting mode, process the target image based on the processing mode, and display the processed target image in real time.
- FIG. 5 schematically shows a flow chart of acquiring a shooting mode according to an embodiment of the present invention.
- The method includes operations S211 and S212.
- In operation S211, environmental information is acquired. In operation S212, a shooting mode is determined based on the environmental information.
- The environmental information may include, for example, environmental image information collected by the drone, sound information collected by the drone, positioning information of the drone, communication information of the drone, or distance information of objects around the drone, and the like.
- the communication information includes a communication signal that the drone can receive in the current state. For example, in an indoor environment, the GPS signal becomes weak or even disappears.
- the determining a shooting mode based on the environmental information includes determining a shooting mode based on environmental image information including a specific object. For example, when the environmental image information includes a human face, it is determined that the shooting mode is a self-portrait mode, and when the environment image information includes a plurality of trees, the shooting mode is determined to be a landscape mode or the like.
- The determining of a shooting mode based on the environmental information includes determining a shooting mode based on environmental image information in which a specific object appears in a specific number or occupies a specific area ratio. For example, when the environmental image information includes a plurality of human faces, the shooting mode is determined to be a group photo mode, and when trees occupy more than 10% of the environmental image, the shooting mode is determined to be a landscape mode, and so on.
- The determining of a shooting mode based on the environmental information includes determining a shooting mode based on environmental image information including a specific gesture or a specific posture.
- The user can control the shooting mode used by the drone, for example, by a preset gesture or posture. For example, when a gesture appears in the environmental image information, the gesture is recognized and matched against a preset gesture-to-mode library, and when the matching succeeds, the matched mode is determined as the shooting mode.
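- A minimal sketch of such environment-based mode selection follows; the detector outputs, the gesture-to-mode library, and the mode names are hypothetical stand-ins, not components specified in this disclosure.

```python
# Hypothetical gesture-to-mode library (illustrative only).
GESTURE_MODE_LIBRARY = {
    "open_palm": "gesture_self_timer",
    "v_sign": "photo",
}

def determine_shooting_mode(detected_faces, tree_area_ratio, detected_gesture=None):
    """Map simple detection results extracted from an environmental image
    to a shooting mode."""
    if detected_gesture in GESTURE_MODE_LIBRARY:
        return GESTURE_MODE_LIBRARY[detected_gesture]
    if len(detected_faces) > 1:
        return "group_photo"
    if len(detected_faces) == 1:
        return "self_portrait"
    if tree_area_ratio > 0.10:
        return "landscape"
    return "normal"

# Example: one detected face and little vegetation -> self-portrait mode.
mode = determine_shooting_mode(detected_faces=["face_0"], tree_area_ratio=0.02)
```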
- FIG. 6 is a view schematically showing an information interaction diagram of acquiring a shooting mode according to another embodiment of the present invention.
- the method includes operations S610-S650.
- the drone acquires environmental information at operation S610. This operation is similar to the above operation S211, and will not be described again here.
- In operation S620, the drone determines a recommended shooting mode based on the environmental information and transmits the recommended shooting mode to the remote control device, and the remote control device receives the recommended shooting mode.
- In operation S630, the remote control device displays alternative shooting modes.
- the recommended shooting mode is a subset of the alternative shooting modes, and the recommended shooting mode may be displayed in a manner different from other alternative shooting modes, for example, displaying the recommended shooting mode in a highlighted manner. This method can further reduce the user's learning cost.
- In operation S640, the remote control device determines at least one shooting mode from the alternative shooting modes in response to a user operation.
- In operation S650, the remote control device transmits the shooting mode to the drone, and the drone receives the shooting mode. At this point, the process of acquiring the shooting mode according to this embodiment of the present disclosure is complete.
- Operations S630-S650 may also be performed independently of S610 and S620, i.e., no shooting mode is recommended while the alternative shooting modes are presented.
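- The sketch below walks through the recommend/display/select exchange described above (operations S620-S650); the mode list, class, and message handling are illustrative assumptions rather than an interface defined in this disclosure.

```python
# Illustrative only: the remote control device highlights the mode
# recommended by the drone among the alternative shooting modes.
ALTERNATIVE_MODES = ["follow", "panorama", "night_scene", "photo", "landscape"]

class RemoteControlUI:
    def display_alternatives(self, recommended=None):
        # S630: show the alternatives, highlighting the recommended one (S620).
        for mode in ALTERNATIVE_MODES:
            marker = " (recommended)" if mode == recommended else ""
            print(f"[{mode}]{marker}")

    def select_mode(self, user_choice):
        # S640: validate the user's selection among the alternatives.
        if user_choice not in ALTERNATIVE_MODES:
            raise ValueError(f"unknown mode: {user_choice}")
        return user_choice

ui = RemoteControlUI()
ui.display_alternatives(recommended="night_scene")
selected = ui.select_mode("night_scene")  # S650: sent back to the drone
```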
- FIG. 7 is a view schematically showing an information interaction diagram of an information processing method according to another embodiment of the present invention.
- On the basis that the drone controls the image capture device to capture the target image based on the shooting parameters, the method further includes operations S710 and S720.
- In operation S710, the target image is transmitted to at least one other electronic device, such as the remote control device, and the other electronic device, such as the remote control device, receives the target image.
- In operation S720, the other electronic device, such as the remote control device, displays the target image.
- the target image may be sent to the other electronic device in real time, and the other electronic device may display the target image or the processed target image in real time, so that the user can preview the shooting result in time and adjust the shooting plan in real time.
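- A minimal preview loop on the remote control device might look like the sketch below; it assumes OpenCV for display, and `receive_frame` is a hypothetical stand-in for the wireless image-transmission link rather than an API defined in this disclosure.

```python
import cv2

def preview_loop(receive_frame, process=None):
    """Display each frame received from the drone (optionally run through
    the current mode's processing step) until interrupted with 'q'."""
    while True:
        frame = receive_frame()          # BGR image from the drone, or None
        if frame is None:
            break
        if process is not None:
            frame = process(frame)       # e.g., the shooting mode's processing
        cv2.imshow("drone preview", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cv2.destroyAllWindows()
```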
- FIG. 8 is a flow chart schematically showing an information processing method according to another embodiment of the present invention.
- the method includes operations S810-S830.
- In operation S810, environmental information is acquired.
- The environmental information may be, for example, image information collected by the image capture device, or environmental information obtained by other sensing units, such as a radar map, an ultrasonic reflection result, or the like.
- In operation S820, a salient object in the environmental information is identified as the photographic subject.
- The environmental information acquired in operation S810 is analyzed; for example, for image information, the image may be processed by an existing image processing method to obtain a salient object in the image, such as a prominent building, a prominent natural landscape, or a person in the environment.
- The salient object is likely to be the target that the user wants to shoot with the drone. Therefore, the drone can use the salient object as the photographic subject, and then adjust the state of the drone, the gimbal, and/or the image capture device according to the position, size, and shape characteristics of the salient object, for example by adjusting the orientation or movement mode of the drone, the angle of the gimbal, and the like.
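- The sketch below shows one way a salient region could be located in an environmental image and returned as the subject's position and size; it assumes the OpenCV contrib saliency module (opencv-contrib-python), and the thresholding choices are illustrative rather than steps specified in this disclosure.

```python
import cv2
import numpy as np

def find_salient_subject(image_bgr):
    """Return the bounding box (x, y, w, h) of the most salient region,
    or None if no salient region is found."""
    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, saliency_map = saliency.computeSaliency(image_bgr)
    if not ok:
        return None
    mask = (saliency_map * 255).astype(np.uint8)
    _, mask = cv2.threshold(mask, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)  # drives drone/gimbal adjustment
```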
- In operation S830, the image capture device is controlled to capture a target image of the photographic subject.
- the photographic subject may also be determined by the remote control device in response to a user operation and transmitted to the drone through the remote control device.
- FIG. 9 is a flow chart schematically showing an information processing method according to another embodiment of the present invention.
- the method is applied to a remote control device, including operations S910-S930.
- In operation S910, a shooting mode is acquired. In operation S920, a shooting parameter is invoked based on the shooting mode.
- In operation S930, the shooting parameter is transmitted to the drone controlled by the remote control device, so that the drone can control the image capture device on the drone to capture the target image based on the shooting parameter.
- The method of this embodiment differs from the foregoing embodiments in that acquiring the shooting mode and invoking the shooting parameter based on the shooting mode are performed by the remote control device; other contents are similar to the foregoing embodiments and are not described again here.
- FIG. 10 schematically shows a block diagram of a drone 1000 in accordance with an embodiment of the present invention.
- the drone 1000 includes a processor 1010, a computer readable storage medium 1020, a communication device 1030, and an image capture device 1040.
- the drone 1000 can perform the method applied to the drone described above with reference to FIGS. 2 to 9 to automatically call the shooting parameters according to the shooting mode.
- processor 1010 can include, for example, a general purpose microprocessor, an instruction set processor, and/or a related chipset and/or a special purpose microprocessor (eg, an application specific integrated circuit (ASIC)), and the like.
- processor 1010 may also include onboard memory for caching purposes.
- the processor 1010 may be a single processing unit or a plurality of processing units for performing different actions of the method flow applied to the drone according to the embodiments of the present disclosure described with reference to FIGS. 2 to 9.
- Computer readable storage medium 1020 can be any medium that can contain, store, communicate, propagate or transport the instructions.
- a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- Specific examples of the readable storage medium include: a magnetic storage device such as a magnetic tape or a hard disk (HDD); an optical storage device such as a compact disc (CD-ROM); a memory such as a random access memory (RAM) or a flash memory; and/or a wired/wireless communication link.
- The computer readable storage medium 1020 can include a computer program 1021, which can include code/computer executable instructions that, when executed by the processor 1010, cause the processor 1010 to perform, for example, the method flow applied to the drone described above in connection with FIGS. 2-9 and any variations thereof.
- The computer program 1021 can be configured with computer program code, for example, including computer program modules.
- The code in the computer program 1021 may include one or more program modules, including, for example, module 1021A, module 1021B, and so on.
- the division manner and number of modules are not fixed, and those skilled in the art may use suitable program modules or program module combinations according to actual situations.
- When these program modules are executed by the processor 1010, the method flow applied to the drone described above in connection with FIGS. 2-9, and any variations thereof, may be performed.
- processor 1010 can interact with communication device 1030 and image acquisition device 1040 to perform the method flow applied to the drone described above in connection with Figures 2-9 and any variations thereof.
- FIG. 11 schematically shows a block diagram of a remote control device 1100 in accordance with an embodiment of the present invention.
- the remote control device 1100 includes a processor 1110, a computer readable storage medium 1120, a communication device 1130, and a display device 1140.
- the remote control device 1100 can perform the method applied to the remote control device described above with reference to FIGS. 2 to 9 to automatically call the shooting parameters according to the shooting mode.
- processor 1110 can include, for example, a general purpose microprocessor, an instruction set processor, and/or a related chipset and/or a special purpose microprocessor (eg, an application specific integrated circuit (ASIC)), and the like.
- the processor 1110 may also include an onboard memory for caching purposes.
- the processor 1110 may be a single processing unit or a plurality of processing units for performing different actions of the method flow applied to the remote control device described with reference to FIGS. 2 to 9.
- Computer readable storage medium 1120 can be, for example, any medium that can contain, store, communicate, propagate or transport the instructions.
- a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- Specific examples of the readable storage medium include: a magnetic storage device such as a magnetic tape or a hard disk (HDD); an optical storage device such as a compact disc (CD-ROM); a memory such as a random access memory (RAM) or a flash memory; and/or a wired/wireless communication link.
- The computer readable storage medium 1120 can include a computer program 1121, which can include code/computer executable instructions that, when executed by the processor 1110, cause the processor 1110 to perform, for example, the method flow applied to the remote control device described above in connection with FIGS. 2-9 and any variations thereof.
- The computer program 1121 can be configured with computer program code, for example, including computer program modules.
- The code in the computer program 1121 may include one or more program modules, including, for example, module 1121A, module 1121B, and so on.
- the division manner and number of modules are not fixed, and those skilled in the art may use suitable program modules or program module combinations according to actual conditions.
- When these program modules are executed by the processor 1110, the method flow applied to the remote control device described above in connection with FIGS. 2-9, and any variations thereof, may be performed.
- The processor 1110 can interact with the communication device 1130 and the display device 1140 to perform the method flow applied to the remote control device described above in connection with FIGS. 2-9 and any variations thereof.
- a non-transitory machine readable medium having stored thereon machine readable instructions that, when executed by a processor, implement a method in accordance with an embodiment of the present invention as described above.
- Processing systems include, but are not limited to, one or more general purpose microprocessors (e.g., single or multi-core processors), application specific integrated circuits, dedicated instruction set processors, field programmable gate arrays, graphics processing units, physical processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like.
- The storage medium may include, but is not limited to, any type of disk, including floppy disks, optical discs, DVDs, CD-ROMs, mini drives and magneto-optical disks, ROM, RAM, EPROM, EEPROM, DRAM, VRAM, DDR RAM, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of medium or device suitable for storing instructions and/or data.
- Any machine readable medium may incorporate software and/or firmware for controlling the hardware of a processing system and for enabling the processing system to interact with other mechanisms utilizing the results of the present invention.
- software or firmware may include, but is not limited to, application code, device drivers, operating systems, and execution environments/containers.
- Communication systems as referred to herein optionally communicate via wired and/or wireless communication connections.
- a communication system optionally receives and transmits RF signals, also referred to as electromagnetic signals.
- the RF circuitry of the communication system converts electrical signals into/from electromagnetic signals and communicates with communication networks and other communication devices via electromagnetic signals.
- The RF circuitry optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and the like.
- the communication system optionally communicates with a network such as the Internet, also known as the World Wide Web (WWW), a local area network, and/or a wireless network such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN), and other wireless communication devices.
- The wireless communication connection optionally uses any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolution-Data Optimized (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSDPA), Long Term Evolution (LTE), Near Field Communication (NFC), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, email protocols (such as Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), and instant messaging (such as Extensible Messaging and Presence Protocol (XMPP)).
Abstract
An information processing method applicable to an unmanned aerial vehicle comprising an image capture device is provided. The method includes: acquiring a shooting mode; invoking a shooting parameter based on the shooting mode; and controlling the image capture device to capture a target image based on the shooting parameter. Also provided is an information processing method applicable to a remote control apparatus. The method includes: transmitting a shooting mode to an unmanned aerial vehicle controlled by the remote control apparatus, so that the unmanned aerial vehicle can invoke a shooting parameter based on the shooting mode; and, based on the shooting parameter, adjusting a state of an image capture device on the unmanned aerial vehicle and capturing a target image. An unmanned aerial vehicle, a remote control apparatus, and a non-volatile storage medium are also provided.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780035243.0A CN109479090A (zh) | 2017-12-22 | 2017-12-22 | 信息处理方法、无人机、遥控设备以及非易失性存储介质 |
PCT/CN2017/118061 WO2019119434A1 (fr) | 2017-12-22 | 2017-12-22 | Procédé de traitement d'informations, véhicule aérien sans pilote, appareil de commande à distance, et support de stockage non volatile |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/118061 WO2019119434A1 (fr) | 2017-12-22 | 2017-12-22 | Procédé de traitement d'informations, véhicule aérien sans pilote, appareil de commande à distance, et support de stockage non volatile |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019119434A1 true WO2019119434A1 (fr) | 2019-06-27 |
Family
ID=65658526
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/118061 WO2019119434A1 (fr) | 2017-12-22 | 2017-12-22 | Procédé de traitement d'informations, véhicule aérien sans pilote, appareil de commande à distance, et support de stockage non volatile |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109479090A (fr) |
WO (1) | WO2019119434A1 (fr) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113965741A (zh) * | 2021-11-26 | 2022-01-21 | 北京萌特博智能机器人科技有限公司 | 娱乐设备自动部署方法、装置、设备以及存储介质 |
CN114088207A (zh) * | 2020-07-17 | 2022-02-25 | 北京京东尚科信息技术有限公司 | 温度检测方法和系统 |
US11880634B2 (en) * | 2021-03-23 | 2024-01-23 | Toyota Jidosha Kabushiki Kaisha | Volume control system, volume control method, and non-transitory computer readable medium storing volume control program |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111316636B (zh) * | 2019-03-27 | 2021-11-16 | 深圳市大疆创新科技有限公司 | 旋转式拍摄方法、控制装置、可移动平台及存储介质 |
CN112154649A (zh) * | 2019-07-30 | 2020-12-29 | 深圳市大疆创新科技有限公司 | 航测方法、拍摄控制方法、飞行器、终端、系统及存储介质 |
CN111583692A (zh) * | 2020-04-27 | 2020-08-25 | 新石器慧通(北京)科技有限公司 | 一种基于自动驾驶的远程视野获取方法、装置及系统 |
WO2022077238A1 (fr) * | 2020-10-13 | 2022-04-21 | 深圳市大疆创新科技有限公司 | Procédé d'affichage d'imagerie, terminal de commande à distance, dispositif, système, et support de stockage |
CN113095280A (zh) * | 2021-04-28 | 2021-07-09 | 广州极飞科技股份有限公司 | 无人机数据处理设备的推荐方法、装置、设备及介质 |
CN114051095A (zh) * | 2021-11-12 | 2022-02-15 | 苏州臻迪智能科技有限公司 | 视频流数据的远程处理方法以及拍摄系统 |
CN114666505A (zh) * | 2022-03-24 | 2022-06-24 | 臻迪科技股份有限公司 | 控制无人机拍摄的方法和系统以及无人机系统 |
CN114785960B (zh) * | 2022-06-16 | 2022-09-02 | 鹰驾科技(深圳)有限公司 | 一种基于无线传输技术的360度全景行车记录仪系统 |
CN115474001A (zh) * | 2022-08-30 | 2022-12-13 | 中化信息技术有限公司 | 自动连拍合成全景图像的方法、装置 |
CN117930879A (zh) * | 2024-01-19 | 2024-04-26 | 天津云圣智能科技有限责任公司 | 景区无人机自主拍摄的方法、装置、存储介质及电子设备 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105391939A (zh) * | 2015-11-04 | 2016-03-09 | 腾讯科技(深圳)有限公司 | 无人机拍摄控制方法和装置、无人机拍摄方法和无人机 |
CN105827929A (zh) * | 2015-01-06 | 2016-08-03 | 中兴通讯股份有限公司 | 一种移动终端相机的拍照方法及装置 |
CN106125767A (zh) * | 2016-08-31 | 2016-11-16 | 北京小米移动软件有限公司 | 飞行器的控制方法、装置及飞行器 |
CN106292720A (zh) * | 2015-04-21 | 2017-01-04 | 高域(北京)智能科技研究院有限公司 | 一种智能多模式飞行拍摄设备及其飞行控制方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102809969A (zh) * | 2011-06-03 | 2012-12-05 | 鸿富锦精密工业(深圳)有限公司 | 无人飞行载具控制系统及方法 |
CN104516355A (zh) * | 2014-12-31 | 2015-04-15 | 深圳雷柏科技股份有限公司 | 一种自拍型飞行器及其自拍方法 |
CN104796611A (zh) * | 2015-04-20 | 2015-07-22 | 零度智控(北京)智能科技有限公司 | 移动终端遥控无人机实现智能飞行拍摄的方法及系统 |
KR20170136750A (ko) * | 2016-06-02 | 2017-12-12 | 삼성전자주식회사 | 전자 장치 및 그의 동작 방법 |
-
2017
- 2017-12-22 WO PCT/CN2017/118061 patent/WO2019119434A1/fr active Application Filing
- 2017-12-22 CN CN201780035243.0A patent/CN109479090A/zh active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105827929A (zh) * | 2015-01-06 | 2016-08-03 | 中兴通讯股份有限公司 | 一种移动终端相机的拍照方法及装置 |
CN106292720A (zh) * | 2015-04-21 | 2017-01-04 | 高域(北京)智能科技研究院有限公司 | 一种智能多模式飞行拍摄设备及其飞行控制方法 |
CN105391939A (zh) * | 2015-11-04 | 2016-03-09 | 腾讯科技(深圳)有限公司 | 无人机拍摄控制方法和装置、无人机拍摄方法和无人机 |
CN106125767A (zh) * | 2016-08-31 | 2016-11-16 | 北京小米移动软件有限公司 | 飞行器的控制方法、装置及飞行器 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114088207A (zh) * | 2020-07-17 | 2022-02-25 | 北京京东尚科信息技术有限公司 | 温度检测方法和系统 |
US11880634B2 (en) * | 2021-03-23 | 2024-01-23 | Toyota Jidosha Kabushiki Kaisha | Volume control system, volume control method, and non-transitory computer readable medium storing volume control program |
CN113965741A (zh) * | 2021-11-26 | 2022-01-21 | 北京萌特博智能机器人科技有限公司 | 娱乐设备自动部署方法、装置、设备以及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN109479090A (zh) | 2019-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019119434A1 (fr) | Procédé de traitement d'informations, véhicule aérien sans pilote, appareil de commande à distance, et support de stockage non volatile | |
WO2019134124A1 (fr) | Procédé de commande, véhicule aérien sans pilote, dispositif de télécommande, et support de stockage non volatil | |
US10484621B2 (en) | Systems and methods for compressing video content | |
US10636150B2 (en) | Subject tracking systems for a movable imaging system | |
US10021339B2 (en) | Electronic device for generating video data | |
CN106406343B (zh) | 无人飞行器的控制方法、装置和系统 | |
WO2018133589A1 (fr) | Dispositif, procédé de photographie aérienne, et véhicule aérien sans pilote | |
JP7192592B2 (ja) | 撮像装置、画像通信システム、画像処理方法、プログラム | |
WO2018157464A1 (fr) | Procédé d'affichage d'image et dispositif électronique | |
JP6532958B2 (ja) | スマート飛行機器の撮影方法、スマート飛行機器、プログラム及び記録媒体 | |
CN106791483B (zh) | 图像传输方法及装置、电子设备 | |
US20200412943A1 (en) | Photography assistance for mobile devices | |
WO2023036198A1 (fr) | Procédé et appareil pour commander un véhicule aérien afin qu'il capture une vidéo à retard de rotation, et dispositif et support | |
US20230393808A1 (en) | Image capture device control using mobile platform voice recognition | |
WO2019241970A1 (fr) | Procédé et dispositif de commande de haut-parleur d'aéronef sans pilote | |
JP2018055656A (ja) | 無人航空機制御システム、その制御方法、及びプログラム | |
US20220006962A1 (en) | Coordinated Multi-viewpoint Image Capture With A Robotic Vehicle | |
TW202239184A (zh) | 成像裝置和系統、捕獲圖像的方法以及電腦可讀儲存媒體 | |
US20210092306A1 (en) | Movable body, image generation method, program, and recording medium | |
KR20200010895A (ko) | 무인기 착륙 유도 방법 | |
WO2021115192A1 (fr) | Dispositif de traitement d'image, procédé de traitement d'image, programme, et support d'enregistrement | |
WO2022000211A1 (fr) | Procédé de commande de système de photographie, dispositif, plateforme mobile et support de stockage | |
JP7081198B2 (ja) | 撮影システム及び撮影制御装置 | |
CN114051722B (zh) | 成像装置和捕获图像的方法 | |
JP2018129577A (ja) | 撮影システム、その制御方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17935320 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17935320 Country of ref document: EP Kind code of ref document: A1 |