WO2018184218A1 - Control method, processing device, processor, aircraft, and somatosensory system - Google Patents

Control method, processing device, processor, aircraft, and somatosensory system

Info

Publication number
WO2018184218A1
WO2018184218A1 (application PCT/CN2017/079756)
Authority
WO
WIPO (PCT)
Prior art keywords
control information
flight control
image
somatosensory
information
Prior art date
Application number
PCT/CN2017/079756
Other languages
English (en)
French (fr)
Inventor
张志鹏
尹小俊
王乃博
马宁
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201780005398.XA (published as CN108885101B)
Priority to CN202110227430.7A (published as CN113050669A)
Priority to PCT/CN2017/079756 (published as WO2018184218A1)
Publication of WO2018184218A1
Priority to US16/591,165 (published as US20200150691A1)

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808: Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 23/00: Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00: Equipment not otherwise provided for
    • B64D 47/08: Arrangements of cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0094: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target

Definitions

  • the present invention relates to the field of consumer electronics, and in particular, to a control method, a processing device, a processor, an aircraft, and a somatosensory system.
  • In the related art, video obtained by aerial photography from an aircraft does not include somatosensory information.
  • To give the user an experience across the various senses, somatosensory information is generally generated through post-production simulation; this generating process is complicated, costly, and consumes a great deal of time.
  • Embodiments of the present invention provide a control method, a processing device, a processor, an aircraft, and a somatosensory system.
  • A processing method provided by an embodiment of the present invention is for an aircraft, the aircraft including an imaging device and a flight control module, and the processing method includes the following steps:
  • controlling the imaging device to capture an image; and associating and saving the image and the flight control information of the flight control module at the time the imaging device captures the image.
  • An aircraft provided by an embodiment of the present invention includes an imaging device and a flight control module, the flight control module being configured to: control the imaging device to capture an image; and
  • associate and save the image and the flight control information of the flight control module at the time the imaging device captures the image.
  • A somatosensory system provided by an embodiment of the present invention includes an aircraft comprising an imaging device and a flight control module, a somatosensory device, and a processor;
  • the processor is configured to: control the imaging device to capture an image; and
  • associate and save the image and the flight control information of the flight control module at the time the imaging device captures the image.
  • a processing method provided by an embodiment of the present invention is for processing an image and flight control information, and the processing method includes the following steps:
  • the image and the flight control information are associated.
  • A processing device provided by an embodiment of the present invention is for processing an image and flight control information, and the processing device includes:
  • a first processing module, the first processing module being configured to associate the image with the flight control information.
  • The present invention provides a processor for processing an image and flight control information, the processor being configured to associate the image with the flight control information.
  • The control method, the processing device, the processor, the aircraft, and the somatosensory system of embodiments of the present invention associate and save the image and the flight control information, so that the flight control information and the image are synchronized in time, saving the user time and cost in post-production.
  • FIG. 1 is a schematic flowchart of a processing method according to an embodiment of the present invention.
  • FIG. 2 is a schematic block diagram of a somatosensory system according to an embodiment of the present invention.
  • FIG. 3 is another schematic block diagram of a somatosensory system according to an embodiment of the present invention.
  • FIG. 4 is another schematic flowchart of a processing method according to an embodiment of the present invention.
  • FIG. 5 is a schematic block diagram of an aircraft according to an embodiment of the present invention.
  • FIG. 6 is still another schematic flowchart of a processing method according to an embodiment of the present invention.
  • FIG. 7 is another schematic block diagram of an aircraft according to an embodiment of the present invention.
  • FIG. 8 is still another schematic block diagram of an aircraft according to an embodiment of the present invention.
  • FIG. 9 is yet another schematic flowchart of a processing method according to an embodiment of the present invention.
  • FIG. 10 is a schematic block diagram of a processing device according to an embodiment of the present invention.
  • FIG. 11 is a schematic block diagram of a somatosensory device according to an embodiment of the present invention.
  • Reference numerals of main elements: somatosensory system 1000, aircraft 100, imaging device 10, flight control module 20, timing device 30, angle sensor 40, rotor motor 50, pan/tilt head 60, somatosensory device 700, head somatosensory device 720, body somatosensory device 740, processing device 800, first processing module 820, second processing module 840, processor 900.
  • In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
  • Thus, features defined by "first" or "second" may explicitly or implicitly include one or more of the described features.
  • In the description of the present invention, "a plurality" means two or more, unless specifically defined otherwise.
  • In the description of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected", and "coupled" are to be understood broadly: the connection may, for example, be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection, an electrical connection, or mutual communication; it may be a direct connection or an indirect connection through an intermediate medium, and it may be internal communication between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood on a case-by-case basis.
  • the processing method of the embodiment of the present invention can be applied to the somatosensory system 1000.
  • the somatosensory system 1000 includes an aircraft 100 and a somatosensory device 700.
  • the aircraft 100 includes an imaging device 10 and a flight control module 20.
  • the processing method includes the following steps:
  • S1: controlling the imaging device 10 to capture an image; and
  • S2: associating and saving the image and the flight control information of the flight control module 20 at the time the imaging device 10 captures the image.
  • the somatosensory system 1000 of an embodiment of the present invention includes an aircraft 100, a somatosensory device 700, and a processor 900.
  • the aircraft 100 includes an imaging device 10 and a flight control module 20.
  • The processor 900 is configured to control the imaging device 10 to capture an image, and to associate and save the image and the flight control information of the flight control module 20 at the time of imaging. Images include both still and moving images, i.e., photos and videos. When the image is a photo, the flight control information of the flight control module 20 at the time the photo was captured is associated with it. When the image is a video, the flight control information of the flight control module 20 at the time each video frame was generated is associated with that frame.
  • the processing method of the embodiment of the present invention may be implemented by the somatosensory system 1000, wherein the steps S1 and S2 may be implemented by the processor 900.
  • the processor 900 can be applied to the aircraft 100, or the flight control module 20 includes the processor 900, that is, steps S1 and S2 can be implemented by the flight control module 20.
  • the processing device 800 of an embodiment of the present invention includes a first processing module 820.
  • the first processing module 820 is configured to associate images and flight control information.
  • the processing device 800 and the processor 900 of the embodiments of the present invention may be applied to the aircraft 100, the somatosensory device 700, or other electronic devices such as a mobile phone, a tablet computer, a personal computer, or the like.
  • The control method, the processing device 800, the processor 900, the aircraft 100, and the somatosensory system 1000 of embodiments of the present invention associate and save the image and the flight control information, so that the flight control information and the image are synchronized in time, saving the user time and cost in post-production.
  • aircraft 100 includes an unmanned aerial vehicle.
  • Step S2 includes the following steps: S22: associating and saving the image and the time information at which the imaging device 10 captured it; and S24: associating and saving the time information and the flight control information.
  • The processor 900 is configured to associate and save the image and the time information at which the imaging device 10 captured it, and to associate and save the time information and the flight control information.
  • step S22 and step S24 can be implemented by the processor 900.
  • the image can be associated with the flight control information.
  • the first processing module 820 is configured to associate images and flight control information according to time information.
  • Specifically, the image and the flight control information carry mutually independent time information, and the two can be associated according to this time information so that they are synchronized in time; that is, the image and the flight control information corresponding to the same time information are found and associated with each other.
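The patent describes time-based association only at this level of generality. As a concrete illustration (our own sketch, not the patent's implementation; the record fields and the nearest-timestamp-within-tolerance rule are assumptions), the matching could look like:

```python
from dataclasses import dataclass

@dataclass
class FlightRecord:
    t: float      # timestamp from the aircraft's timing device, seconds
    pitch: float  # gimbal pitch angle, degrees
    yaw: float
    roll: float

def associate(images, records, tolerance=0.05):
    """Pair each (timestamp, image) with the flight record whose
    timestamp is closest, accepting it only within `tolerance` seconds."""
    pairs = []
    for t_img, img in images:
        best = min(records, key=lambda r: abs(r.t - t_img))
        if abs(best.t - t_img) <= tolerance:
            pairs.append((img, best))
    return pairs

images = [(0.00, "frame0"), (0.04, "frame1"), (0.90, "frame2")]
records = [FlightRecord(0.01, 5.0, 0.0, 0.0), FlightRecord(0.05, 5.2, 0.0, 0.0)]
print(len(associate(images, records)))  # 2: frame2 has no record within tolerance
```

The tolerance guards against pairing a frame with a flight record logged far away in time, which is exactly the drift problem the next embodiment addresses.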
  • aircraft 100 includes a timing device 30 for providing time information.
  • time information can be obtained from the timing device 30.
  • When the imaging device 10 on the aircraft 100 captures an image, it can acquire the time information provided by the timing device 30 on the aircraft 100 and thereby obtain the time information of the image. Since both the imaging device 10 and the timing device 30 are disposed on the aircraft 100, the real-time character and accuracy of the image's time information can be ensured. In addition, the time information provided by the timing device 30 can also be associated with the flight control information, so that the flight control information carries time information.
  • Step S2 includes the following step: S26: synthesizing the flight control information into the image.
  • The processor 900 is configured to synthesize the flight control information into the image.
  • step S26 can be implemented by the processor 900.
  • the flight control information and the image can be synchronized in time.
  • the first processing module 820 is configured to synthesize flight control information into the image.
  • It can be understood that associating the image and the flight control information according to time information may introduce deviations during processing, causing the image and the flight control information to fall out of synchronization; synthesizing the flight control information into the image ensures that the two remain tightly synchronized in time, thereby reducing or avoiding errors.
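As an illustrative sketch of "synthesizing" the flight control information into the image, each frame and its flight record can be stored as one inseparable unit, so the two can never drift apart. The patent does not specify a container format; the length-prefixed JSON layout below is purely an assumption:

```python
import json
import struct

def synthesize(frame_bytes: bytes, flight_info: dict) -> bytes:
    """Prepend the flight-control record to the frame payload so both are
    saved as a single unit (a simple stand-in for embedding the record in a
    video stream's per-frame metadata)."""
    meta = json.dumps(flight_info).encode("utf-8")
    return struct.pack(">I", len(meta)) + meta + frame_bytes

def parse(blob: bytes):
    """Recover the flight-control record and the original frame bytes."""
    (n,) = struct.unpack(">I", blob[:4])
    meta = json.loads(blob[4:4 + n].decode("utf-8"))
    return meta, blob[4 + n:]

blob = synthesize(b"\x00\x01\x02", {"pitch": 5.0, "motor_rpm": 9500})
meta, frame = parse(blob)
print(meta["pitch"], frame)  # 5.0 b'\x00\x01\x02'
```

Because the record travels inside the same payload as the frame, no later matching step (and hence no matching error) is needed.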
  • aircraft 100 includes an angle sensor 40 and/or a rotor motor 50 .
  • the flight control information includes operational status information of the angle sensor 40 and/or the rotor motor 50.
  • That the aircraft 100 includes an angle sensor 40 and/or a rotor motor 50 covers three cases: the aircraft 100 includes an angle sensor 40; the aircraft 100 includes a rotor motor 50; or the aircraft 100 includes both. Correspondingly, the flight control information includes the working state information of the angle sensor 40, the working state information of the rotor motor 50, or both.
  • the operational state of the aircraft 100 can be determined by the operational status information of the angle sensor 40 and/or the rotor motor 50 such that the somatosensory device 700 can be controlled in accordance with the operational state of the aircraft 100.
  • The aircraft 100 includes a pan/tilt head 60; the angle sensor 40 is used for detecting attitude information of the pan/tilt head 60, and the working state information of the angle sensor 40 includes the pitch angle, yaw angle, and roll angle of the pan/tilt head 60.
  • the operating state of the pan/tilt head 60 can be obtained based on the operating state information of the angle sensor 40.
  • the pan/tilt head 60 is a three-axis pan/tilt head.
  • The working state of the pan/tilt head 60 includes a pitch state, a yaw state, and a roll state, and can be obtained correspondingly from the working state information of the angle sensor 40. For example, if the angle sensor 40 obtains a pitch angle of 5 degrees for the pan/tilt head 60, the working state of the pan/tilt head is tilted up by 5 degrees.
  • Thus the pitch angle, yaw angle, and roll angle of the pan/tilt head 60 can be quickly acquired through the working state information of the angle sensor 40, and the working state of the pan/tilt head 60 determined accordingly. It can be understood that in other embodiments the pan/tilt head 60 may be another type of pan/tilt head, which is not specifically limited herein.
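A minimal sketch of turning the angle sensor's working state information into a description of the pan/tilt head's working state (the sign conventions and textual labels are assumptions, not taken from the patent):

```python
def gimbal_state(pitch: float, yaw: float, roll: float) -> str:
    """Describe the gimbal's working state from its pitch/yaw/roll
    angles in degrees; positive pitch is assumed to mean 'tilted up'."""
    parts = []
    if pitch:
        parts.append(f"tilted {'up' if pitch > 0 else 'down'} {abs(pitch)} degrees")
    if yaw:
        parts.append(f"yawed {'right' if yaw > 0 else 'left'} {abs(yaw)} degrees")
    if roll:
        parts.append(f"rolled {abs(roll)} degrees")
    return ", ".join(parts) or "level"

print(gimbal_state(5, 0, 0))  # tilted up 5 degrees
```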
  • the processor 900 is configured to process flight control information to obtain somatosensory control information and to control the somatosensory device 700 using the somatosensory control information.
  • In this way, the somatosensory device 700 can obtain the somatosensory control information and be controlled according to it.
  • the processor 900 is applied to the aircraft 100 , ie, the flight control module 20 includes the processor 900 .
  • the aircraft 100 communicates with the somatosensory device 700, and the processing method includes the following steps:
  • S4: the flight control information and the image are sent to the somatosensory device 700, so that the somatosensory device 700 processes the flight control information to obtain the somatosensory control information and controls the somatosensory device 700 using the somatosensory control information.
  • the processor 900 is applied to the aircraft 100, ie, the flight control module 20 includes the processor 900.
  • The aircraft 100 communicates with the somatosensory device 700, and the flight control module 20 is used to send the flight control information and the image to the somatosensory device 700, so that the somatosensory device 700 processes the flight control information to obtain the somatosensory control information and controls the somatosensory device 700 using the somatosensory control information.
  • step S4 can be implemented by the processor 900, and the processor 900 can be applied to the flight control module 20.
  • the processing device 800 includes a second processing module 840.
  • the second processing module 840 is configured to process the flight control information to obtain the somatosensory control information.
  • the somatosensory control information may be obtained by the second processing module 840 or the processor 900.
  • The corresponding somatosensory control information can be quickly obtained by processing the flight control information, and the somatosensory device 700 can be controlled using that information, thereby producing the corresponding somatosensory experience.
  • the operational status information of the rotor motor 50 is used to determine attitude information for the aircraft 100.
  • the somatosensory device 700 includes a head somatosensory device 720 and a body somatosensory device 740.
  • the somatosensory control information includes head control information for controlling the head somatosensory device 720 and body control information for controlling the body somatosensory device 740.
  • the processor 900 is configured to determine head control information and body control information according to the posture information of the pan-tilt 60 and the posture information of the aircraft 100.
  • the head somatosensory device 720 and the body somatosensory device 740 can be controlled based on the posture information of the pan-tilt 60 and the posture information of the aircraft 100.
  • When the attitude information of the pan/tilt head 60 is upward, the head somatosensory device 720 can be controlled to produce a head-up sensation; when the attitude information of the pan/tilt head 60 is downward, the head somatosensory device 720 can be controlled to produce a head-down sensation.
  • When the attitude information of the aircraft 100 is hovering or ascending or descending at constant speed, the head somatosensory device 720 and the body somatosensory device 740 are controlled to produce a stationary sensation.
  • When the attitude information of the aircraft 100 is accelerating upward, the head somatosensory device 720 is controlled to produce a head-down sensation and the body somatosensory device 740 is controlled to produce an overweight sensation; when the attitude information of the aircraft 100 is accelerating downward, the head somatosensory device 720 is controlled to produce a head-up sensation and the body somatosensory device 740 is controlled to produce a weightless sensation.
  • When the attitude information of the aircraft 100 is constant-speed forward, constant-speed backward, or yaw, or accelerating forward or backward, the head somatosensory device 720 is controlled to produce a stationary head sensation and the body somatosensory device 740 to produce a body-tilt sensation, the angle and direction of the tilt being determined by the working state information of the rotor motor; when the attitude information of the aircraft 100 is rotation, the head somatosensory device 720 is controlled to produce a head-turning sensation.
  • It should be noted that the above cases of controlling the head somatosensory device 720 and the body somatosensory device 740 according to the attitude information of the pan/tilt head 60 and the attitude information of the aircraft 100 may be combined. For example, when the attitude information of the pan/tilt head 60 is upward and the attitude information of the aircraft 100 is accelerating upward, the head somatosensory device 720 can be controlled to produce a stationary head sensation and the body somatosensory device 740 to produce an overweight sensation. No restriction is made here.
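The attitude-to-sensation rules above can be sketched as a small lookup. This is a rough illustration only: the string labels for the attitudes, and the encoding of the combined case (gimbal up plus upward acceleration cancelling to a still head) are our own assumptions, not the patent's representation.

```python
def somatosensory_control(gimbal_attitude: str, aircraft_attitude: str):
    """Map pan/tilt and aircraft attitude to (head, body) somatosensory
    commands, following the rules described in the text above."""
    # default head command follows the gimbal; body defaults to still
    head = {"up": "head-up", "down": "head-down"}.get(gimbal_attitude, "still")
    body = "still"
    if aircraft_attitude == "accelerating-up":
        head, body = "head-down", "overweight"
    elif aircraft_attitude == "accelerating-down":
        head, body = "head-up", "weightless"
    elif aircraft_attitude in ("forward", "backward", "yaw",
                               "accelerating-forward", "accelerating-backward"):
        head, body = "still", "tilt"   # tilt angle/direction from rotor-motor state
    elif aircraft_attitude == "rotating":
        head = "head-turn"
    # combined case from the text: gimbal up + accelerating up -> still head
    if gimbal_attitude == "up" and aircraft_attitude == "accelerating-up":
        head = "still"
    return head, body

print(somatosensory_control("up", "accelerating-up"))  # ('still', 'overweight')
```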
  • a "computer-readable medium” can be any apparatus that can contain, store, communicate, propagate, or transport a program for use in an instruction execution system, apparatus, or device, or in conjunction with the instruction execution system, apparatus, or device.
  • More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM).
  • The computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
  • portions of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, as in another embodiment, any one of the following techniques known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
  • each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • The above integrated module may be implemented in the form of hardware or in the form of a software functional module.
  • The integrated module, if implemented in the form of a software functional module and sold or used as an independent product, may also be stored in a computer-readable storage medium.
  • the above mentioned storage medium may be a read only memory, a magnetic disk or an optical disk or the like.

Abstract

A processing method for an aircraft (100) provided with an imaging device (10) and a flight control module (20). The processing method includes the steps of: controlling the imaging device (10) to capture an image (S1); and associating and saving the image and the flight control information of the flight control module (20) at the time the imaging device (10) captures the image (S2). Also disclosed are a processing device (800), a processor (900), an aircraft (100), and a somatosensory system (1000).

Description

Control Method, Processing Device, Processor, Aircraft, and Somatosensory System
Technical Field
The present invention relates to the field of consumer electronics, and in particular to a control method, a processing device, a processor, an aircraft, and a somatosensory system.
Background
In the related art, video obtained by aerial photography from an aircraft does not contain somatosensory information. To give the user an experience across the various senses, somatosensory information is generally generated through post-production simulation; this generating process is complicated, costly, and consumes a great deal of time.
Summary of the Invention
Embodiments of the present invention provide a control method, a processing device, a processor, an aircraft, and a somatosensory system.
A processing method provided by an embodiment of the present invention is for an aircraft, the aircraft including an imaging device and a flight control module, and the processing method includes the following steps:
controlling the imaging device to capture an image; and
associating and saving the image and the flight control information of the flight control module at the time the imaging device captures the image.
An aircraft provided by an embodiment of the present invention includes:
an imaging device; and
a flight control module, the flight control module being configured to:
control the imaging device to capture an image; and
associate and save the image and the flight control information of the flight control module at the time the imaging device captures the image.
A somatosensory system provided by an embodiment of the present invention includes:
an aircraft, the aircraft including an imaging device and a flight control module;
a somatosensory device; and
a processor, the processor being configured to:
control the imaging device to capture an image; and
associate and save the image and the flight control information of the flight control module at the time the imaging device captures the image.
A processing method provided by an embodiment of the present invention is for processing an image and flight control information, and the processing method includes the following step:
associating the image and the flight control information.
A processing device provided by an embodiment of the present invention is for processing an image and flight control information, and the processing device includes:
a first processing module, the first processing module being configured to associate the image and the flight control information.
The present invention provides a processor for processing an image and flight control information, the processor being configured to associate the image and the flight control information.
The control method, processing device, processor, aircraft, and somatosensory system of embodiments of the present invention associate and save the image and the flight control information, so that the flight control information and the image are synchronized in time, saving the user time and cost in post-production.
Additional aspects and advantages of embodiments of the present invention will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of embodiments of the present invention.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of a processing method according to an embodiment of the present invention;
FIG. 2 is a schematic block diagram of a somatosensory system according to an embodiment of the present invention;
FIG. 3 is another schematic block diagram of a somatosensory system according to an embodiment of the present invention;
FIG. 4 is another schematic flowchart of a processing method according to an embodiment of the present invention;
FIG. 5 is a schematic block diagram of an aircraft according to an embodiment of the present invention;
FIG. 6 is still another schematic flowchart of a processing method according to an embodiment of the present invention;
FIG. 7 is another schematic block diagram of an aircraft according to an embodiment of the present invention;
FIG. 8 is still another schematic block diagram of an aircraft according to an embodiment of the present invention;
FIG. 9 is yet another schematic flowchart of a processing method according to an embodiment of the present invention;
FIG. 10 is a schematic block diagram of a processing device according to an embodiment of the present invention;
FIG. 11 is a schematic block diagram of a somatosensory device according to an embodiment of the present invention.
Description of reference numerals of main elements:
somatosensory system 1000, aircraft 100, imaging device 10, flight control module 20, timing device 30, angle sensor 40, rotor motor 50, pan/tilt head 60, somatosensory device 700, head somatosensory device 720, body somatosensory device 740, processing device 800, first processing module 820, second processing module 840, processor 900.
Detailed Description of the Embodiments
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined by "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more, unless specifically defined otherwise.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected", and "coupled" are to be understood broadly: the connection may, for example, be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection, an electrical connection, or mutual communication; it may be a direct connection or an indirect connection through an intermediate medium, and it may be internal communication between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood on a case-by-case basis.
The following disclosure provides many different embodiments or examples for realizing different structures of the present invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or reference letters in different examples; such repetition is for the purposes of simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. In addition, the present invention provides examples of various specific processes and materials, but those of ordinary skill in the art will be aware of the application of other processes and/or the use of other materials.
Referring to FIG. 1 and FIG. 2 together, the processing method of an embodiment of the present invention can be applied to a somatosensory system 1000. The somatosensory system 1000 includes an aircraft 100 and a somatosensory device 700. The aircraft 100 includes an imaging device 10 and a flight control module 20. The processing method includes the following steps:
S1: controlling the imaging device 10 to capture an image; and
S2: associating and saving the image and the flight control information of the flight control module 20 at the time the imaging device 10 captures the image.
Referring again to FIG. 2, the somatosensory system 1000 of an embodiment of the present invention includes an aircraft 100, a somatosensory device 700, and a processor 900. The aircraft 100 includes an imaging device 10 and a flight control module 20. The processor 900 is configured to control the imaging device 10 to capture an image, and to associate and save the image and the flight control information of the flight control module 20 at the time of imaging. Images include both still and moving images, i.e., photos and videos. When the image is a photo, the flight control information of the flight control module 20 at the time the photo was captured is associated with it; when the image is a video, the flight control information of the flight control module 20 at the time each video frame was generated is associated with that frame.
That is to say, the processing method of the embodiment of the present invention can be implemented by the somatosensory system 1000, where steps S1 and S2 can be implemented by the processor 900.
In some embodiments, the processor 900 can be applied to the aircraft 100; in other words, the flight control module 20 includes the processor 900, which is to say that steps S1 and S2 can be implemented by the flight control module 20.
Referring to FIG. 3, in some embodiments, the processing device 800 of an embodiment of the present invention includes a first processing module 820. The first processing module 820 is configured to associate the image and the flight control information. The processing device 800 and the processor 900 of embodiments of the present invention can be applied to the aircraft 100, the somatosensory device 700, or other electronic devices such as a mobile phone, a tablet computer, or a personal computer.
The control method, processing device 800, processor 900, aircraft 100, and somatosensory system 1000 of embodiments of the present invention associate and save the image and the flight control information, so that the flight control information and the image are synchronized in time, saving the user time and cost in post-production.
In some embodiments, the aircraft 100 includes an unmanned aerial vehicle.
Referring to FIG. 4, in one embodiment, step S2 includes the following steps:
S22: associating and saving the image and the time information at which the imaging device 10 captured it; and
S24: associating and saving the time information and the flight control information.
In one embodiment, the processor 900 is configured to associate and save the image and the time information at which the imaging device 10 captured it, and to associate and save the time information and the flight control information.
That is to say, steps S22 and S24 can be implemented by the processor 900.
In this way, the image can be associated with the flight control information.
Referring again to FIG. 3, in one embodiment, the first processing module 820 is configured to associate the image and the flight control information according to the time information.
Specifically, the image and the flight control information carry mutually independent time information, and can be associated according to this time information so that they are synchronized in time; that is, the image and the flight control information corresponding to the same time information are found and associated with each other.
Referring to FIG. 5, in one embodiment, the aircraft 100 includes a timing device 30 for providing the time information.
In this way, the time information can be obtained from the timing device 30.
It can be understood that when the imaging device 10 on the aircraft 100 captures an image, it can acquire the time information provided by the timing device 30 on the aircraft 100 and thereby obtain the time information of the image. Since both the imaging device 10 and the timing device 30 are disposed on the aircraft 100, the real-time character and accuracy of the image's time information can be ensured. In addition, the time information provided by the timing device 30 can also be associated with the flight control information, so that the flight control information carries time information.
Referring to FIG. 6, in one embodiment, step S2 includes the following step:
S26: synthesizing the flight control information into the image.
Referring again to FIG. 2, in one embodiment, the processor 900 is configured to synthesize the flight control information into the image.
That is to say, step S26 can be implemented by the processor 900.
In this way, the flight control information and the image can be synchronized in time.
Referring again to FIG. 3, in one embodiment, the first processing module 820 is configured to synthesize the flight control information into the image.
It can be understood that associating the image and the flight control information according to time information may introduce deviations during processing, causing the image and the flight control information to fall out of synchronization. Synthesizing the flight control information into the image ensures that the image and the flight control information remain tightly synchronized in time, thereby reducing or avoiding errors.
Referring to FIG. 7, in one embodiment, the aircraft 100 includes an angle sensor 40 and/or a rotor motor 50, and the flight control information includes working state information of the angle sensor 40 and/or the rotor motor 50.
In this way, the working state information of the angle sensor 40 and/or the rotor motor 50 can be obtained.
Specifically, that the aircraft 100 includes an angle sensor 40 and/or a rotor motor 50 covers three cases: the aircraft 100 includes an angle sensor 40; the aircraft 100 includes a rotor motor 50; or the aircraft 100 includes both. Correspondingly, the flight control information includes the working state information of the angle sensor 40, the working state information of the rotor motor 50, or both. The working state of the aircraft 100 can be determined from the working state information of the angle sensor 40 and/or the rotor motor 50, so that the somatosensory device 700 can be controlled according to the working state of the aircraft 100.
Referring to FIG. 8, in one embodiment, the aircraft 100 includes a pan/tilt head 60; the angle sensor 40 is used for detecting attitude information of the pan/tilt head 60, and the working state information of the angle sensor 40 includes the pitch angle, yaw angle, and roll angle of the pan/tilt head 60.
In this way, the working state of the pan/tilt head 60 can be obtained according to the working state information of the angle sensor 40.
In one embodiment, the pan/tilt head 60 is a three-axis pan/tilt head, and its working state includes a pitch state, a yaw state, and a roll state, which can be obtained correspondingly from the working state information of the angle sensor 40. For example, if the angle sensor 40 obtains a pitch angle of 5 degrees for the pan/tilt head 60, the working state of the pan/tilt head is tilted up by 5 degrees. Thus the pitch angle, yaw angle, and roll angle of the pan/tilt head 60 can be quickly acquired through the working state information of the angle sensor 40, and the working state of the pan/tilt head 60 determined accordingly. It can be understood that in other embodiments the pan/tilt head 60 may be another type of pan/tilt head, which is not specifically limited herein.
Referring again to FIG. 2, in one embodiment, the processor 900 is configured to process the flight control information to obtain somatosensory control information and to control the somatosensory device 700 using the somatosensory control information.
In this way, the somatosensory device 700 can obtain the somatosensory control information and be controlled according to it.
Referring to FIG. 9, in one embodiment, the processor 900 is applied to the aircraft 100, i.e., the flight control module 20 includes the processor 900. The aircraft 100 communicates with the somatosensory device 700, and the processing method includes the following step:
S4: sending the flight control information and the image to the somatosensory device 700, so that the somatosensory device 700 processes the flight control information to obtain the somatosensory control information and controls the somatosensory device 700 using the somatosensory control information.
Referring again to FIG. 2, in one embodiment, the processor 900 is applied to the aircraft 100, i.e., the flight control module 20 includes the processor 900. The aircraft 100 communicates with the somatosensory device 700, and the flight control module 20 is configured to send the flight control information and the image to the somatosensory device 700, so that the somatosensory device 700 processes the flight control information to obtain the somatosensory control information and controls the somatosensory device 700 using the somatosensory control information.
That is to say, step S4 can be implemented by the processor 900, and the processor 900 can be applied to the flight control module 20.
Referring to FIG. 10, in one embodiment, the processing device 800 includes a second processing module 840. The second processing module 840 is configured to process the flight control information to obtain the somatosensory control information.
Specifically, the somatosensory control information can be obtained by processing in the second processing module 840 or the processor 900. In this way, the corresponding somatosensory control information can be quickly obtained by processing the flight control information and can be used to control the somatosensory device 700, thereby producing the corresponding somatosensory experience.
In one embodiment, the working state information of the rotor motor 50 is used to determine attitude information of the aircraft 100. Referring to FIG. 11, the somatosensory device 700 includes a head somatosensory device 720 and a body somatosensory device 740, and the somatosensory control information includes head control information for controlling the head somatosensory device 720 and body control information for controlling the body somatosensory device 740. The processor 900 is configured to determine the head control information and the body control information according to the attitude information of the pan/tilt head 60 and the attitude information of the aircraft 100.
In this way, the head somatosensory device 720 and the body somatosensory device 740 can be controlled according to the attitude information of the pan/tilt head 60 and the attitude information of the aircraft 100.
Specifically, when the attitude information of the pan/tilt head 60 is upward, the head somatosensory device 720 can be controlled to produce a head-up sensation; when the attitude information of the pan/tilt head 60 is downward, the head somatosensory device 720 can be controlled to produce a head-down sensation. When the attitude information of the aircraft 100 is hovering or ascending or descending at constant speed, the head somatosensory device 720 and the body somatosensory device 740 are controlled to produce a stationary sensation. When the attitude information of the aircraft 100 is accelerating upward, the head somatosensory device 720 is controlled to produce a head-down sensation and the body somatosensory device 740 is controlled to produce an overweight sensation; when the attitude information of the aircraft 100 is accelerating downward, the head somatosensory device 720 is controlled to produce a head-up sensation and the body somatosensory device 740 is controlled to produce a weightless sensation. When the attitude information of the aircraft 100 is constant-speed forward, constant-speed backward, or yaw, the head somatosensory device 720 is controlled to produce a stationary head sensation and the body somatosensory device 740 to produce a body-tilt sensation, the angle and direction of the tilt being determined by the working state information of the rotor motor; when the attitude information of the aircraft 100 is accelerating forward or accelerating backward, the head somatosensory device 720 is likewise controlled to produce a stationary head sensation and the body somatosensory device 740 a body-tilt sensation, with the angle and direction of the tilt determined by the working state information of the rotor motor; when the attitude information of the aircraft 100 is rotation, the head somatosensory device 720 is controlled to produce a head-turning sensation.
It should be noted that the above cases of controlling the head somatosensory device 720 and the body somatosensory device 740 according to the attitude information of the pan/tilt head 60 and the attitude information of the aircraft 100 may be combined. For example, when the attitude information of the pan/tilt head 60 is upward and the attitude information of the aircraft 100 is accelerating upward, the head somatosensory device 720 can be controlled to produce a stationary head sensation and the body somatosensory device 740 to produce an overweight sensation. No restriction is made here.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an illustrative embodiment", "an example", "a specific example", or "some examples" and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, illustrative references to these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code including one or more executable instructions for implementing steps of a specific logic function or process, and the scope of preferred embodiments of the present invention includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The logic and/or steps represented in a flowchart or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logic functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transport a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one of the following techniques known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those of ordinary skill in the art can understand that all or part of the steps carried by the methods of the above embodiments can be completed by instructing relevant hardware through a program, which can be stored in a computer-readable storage medium and which, when executed, includes one of the steps of the method embodiments or a combination thereof.
In addition, each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as an independent product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like. Although embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present invention.

Claims (34)

  1. A processing method for an aircraft, wherein the aircraft includes an imaging device and a flight control module, and the processing method includes the following steps:
    controlling the imaging device to capture an image; and
    associating and saving the image and the flight control information of the flight control module at the time the imaging device captures the image.
  2. The processing method according to claim 1, wherein the step of associating and saving the image and the flight control information of the flight control module at the time the imaging device captures the image includes the following steps:
    associating and saving the image and the time information at which the imaging device captured it; and
    associating and saving the time information and the flight control information.
  3. The processing method according to claim 2, wherein the aircraft includes a timing device for providing the time information.
  4. The processing method according to claim 1, wherein the step of associating and saving the image and the flight control information of the flight control module at the time the imaging device captures the image includes the following step:
    synthesizing the flight control information into the image.
  5. The processing method according to claim 1, wherein the aircraft includes an angle sensor and/or a rotor motor, and the flight control information includes working state information of the angle sensor and/or the rotor motor.
  6. The processing method according to claim 5, wherein the aircraft includes a pan/tilt head, the angle sensor is used for detecting attitude information of the pan/tilt head, and the working state information of the angle sensor includes the pitch angle, yaw angle, and roll angle of the pan/tilt head.
  7. The processing method according to claim 1, wherein the aircraft communicates with a somatosensory device, and the processing method includes the following step:
    sending the flight control information and the image to the somatosensory device, so that the somatosensory device processes the flight control information to obtain somatosensory control information and controls the somatosensory device using the somatosensory control information.
  8. An aircraft, including:
    an imaging device; and
    a flight control module, the flight control module being configured to:
    control the imaging device to capture an image; and
    associate and save the image and the flight control information of the flight control module at the time the imaging device captures the image.
  9. The aircraft according to claim 8, wherein the flight control module is configured to:
    associate and save the image and the time information at which the imaging device captured it; and
    associate and save the time information and the flight control information.
  10. The aircraft according to claim 9, wherein the aircraft includes a timing device for providing the time information.
  11. The aircraft according to claim 8, wherein the flight control module is configured to synthesize the flight control information into the image.
  12. The aircraft according to claim 8, wherein the aircraft includes an angle sensor and/or a rotor motor, and the flight control information includes working state information of the angle sensor and/or the rotor motor.
  13. The aircraft according to claim 12, wherein the aircraft includes a pan/tilt head, the angle sensor is used for detecting attitude information of the pan/tilt head, and the working state information of the angle sensor includes the pitch angle, yaw angle, and roll angle of the pan/tilt head.
  14. The aircraft according to claim 8, wherein the aircraft communicates with a somatosensory device, and the flight control module is configured to send the flight control information and the image to the somatosensory device, so that the somatosensory device processes the flight control information to obtain somatosensory control information and controls the somatosensory device using the somatosensory control information.
  15. A somatosensory system, comprising:
    an aircraft comprising an imaging device and a flight control module;
    a somatosensory device; and
    a processor configured to:
    control the imaging device to capture an image; and
    associate and save the image and flight control information of the flight control module at the time the imaging device captured the image.
  16. The somatosensory system according to claim 15, wherein the processor is configured to:
    associate and save the image and time information of when the imaging device captured the image; and
    associate and save the time information and the flight control information.
  17. The somatosensory system according to claim 16, wherein the aircraft comprises a timing device configured to provide the time information.
  18. The somatosensory system according to claim 15, wherein the processor is configured to composite the flight control information into the image.
  19. The somatosensory system according to claim 15, wherein the aircraft comprises an angle sensor and/or a rotor motor, and the flight control information comprises operating state information of the angle sensor and/or the rotor motor.
  20. The somatosensory system according to claim 19, wherein the aircraft comprises a gimbal, the angle sensor is configured to detect attitude information of the gimbal, and the operating state information of the angle sensor comprises a pitch angle, a yaw angle, and a roll angle of the gimbal.
  21. The somatosensory system according to claim 20, wherein the processor is configured to process the flight control information to obtain somatosensory control information and to use the somatosensory control information to control the somatosensory device.
  22. The somatosensory system according to claim 21, wherein the operating state information of the rotor motor is used to determine attitude information of the aircraft; the somatosensory device comprises a head somatosensory device and a body somatosensory device; the somatosensory control information comprises head control information for controlling the head somatosensory device and body control information for controlling the body somatosensory device; and the processor is configured to determine the head control information and the body control information according to the attitude information of the gimbal and the attitude information of the aircraft.
  23. A processing method for processing an image and flight control information, the processing method comprising the following step:
    associating the image and the flight control information.
  24. The processing method according to claim 23, wherein the image and the flight control information each comprise time information, and the step of associating the image and the flight control information comprises the following step:
    associating the image and the flight control information according to the time information.
  25. The processing method according to claim 23, wherein the step of associating the image and the flight control information comprises the following step:
    compositing the flight control information into the image.
  26. The processing method according to claim 23, further comprising the following step:
    processing the flight control information to obtain somatosensory control information.
  27. A processing device for processing an image and flight control information, the processing device comprising:
    a first processing module configured to associate the image and the flight control information.
  28. The processing device according to claim 27, wherein the image and the flight control information each comprise time information, and the first processing module is configured to associate the image and the flight control information according to the time information.
  29. The processing device according to claim 27, wherein the first processing module is configured to composite the flight control information into the image.
  30. The processing device according to claim 27, wherein the processing device comprises:
    a second processing module configured to process the flight control information to obtain somatosensory control information.
  31. A processor for processing an image and flight control information, wherein the processor is configured to associate the image and the flight control information.
  32. The processor according to claim 31, wherein the image and the flight control information each comprise time information, and the processor is configured to associate the image and the flight control information according to the time information.
  33. The processor according to claim 31, wherein the processor is configured to composite the flight control information into the image.
  34. The processor according to claim 31, wherein the processor is configured to process the flight control information to obtain somatosensory control information.
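Claims 22 and 24 together describe the core data flow: pair each image with the flight-control record whose time information is closest to the image's, then split the associated attitude data into head control information (from the gimbal attitude) and body control information (from the aircraft attitude). The sketch below illustrates that flow under stated assumptions; it is not the patented implementation, and every name in it (`FlightControlRecord`, `associate_by_time`, `somatosensory_control`) is hypothetical, as is the simple pass-through mapping from attitude angles to control information.

```python
import bisect
from dataclasses import dataclass

@dataclass
class FlightControlRecord:
    timestamp: float        # seconds, from the aircraft's timing device
    gimbal_pitch: float     # gimbal attitude from the angle sensor, degrees
    gimbal_yaw: float
    gimbal_roll: float
    aircraft_pitch: float   # aircraft attitude derived from rotor-motor state
    aircraft_yaw: float
    aircraft_roll: float

def associate_by_time(image_timestamp, records):
    """Return the flight-control record closest in time to the image.

    Assumes `records` is sorted by ascending timestamp (cf. claim 24).
    """
    times = [r.timestamp for r in records]
    i = bisect.bisect_left(times, image_timestamp)
    # The nearest record is either just before or just after the insert point.
    candidates = records[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda r: abs(r.timestamp - image_timestamp))

def somatosensory_control(record):
    """Split the associated attitude data into head and body control info
    (cf. claim 22): gimbal attitude drives the head somatosensory device,
    aircraft attitude drives the body somatosensory device."""
    head = (record.gimbal_pitch, record.gimbal_yaw, record.gimbal_roll)
    body = (record.aircraft_pitch, record.aircraft_yaw, record.aircraft_roll)
    return head, body

# Illustrative data: three flight-control records, one image taken at t = 0.6 s.
records = [
    FlightControlRecord(0.0, 0, 0, 0, 0, 0, 0),
    FlightControlRecord(0.5, 10, 5, 0, 2, 5, 1),
    FlightControlRecord(1.0, 20, 10, 0, 4, 10, 2),
]
rec = associate_by_time(0.6, records)
head, body = somatosensory_control(rec)
print(rec.timestamp, head, body)
```

With the sample data, the image at 0.6 s is paired with the record at 0.5 s, so the head device would be driven by the gimbal angles (10, 5, 0) and the body device by the aircraft angles (2, 5, 1). A real system would interpolate between records and map angles through device-specific calibration rather than passing them through unchanged.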
PCT/CN2017/079756 2017-04-07 2017-04-07 Control method, processing device, processor, aircraft and somatosensory system WO2018184218A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201780005398.XA CN108885101B (zh) 2017-04-07 2017-04-07 Control method, processing device, processor, aircraft and somatosensory system
CN202110227430.7A CN113050669A (zh) 2017-04-07 2017-04-07 Control method, processing device, processor, aircraft and somatosensory system
PCT/CN2017/079756 WO2018184218A1 (zh) 2017-04-07 2017-04-07 Control method, processing device, processor, aircraft and somatosensory system
US16/591,165 US20200150691A1 (en) 2017-04-07 2019-10-02 Control method, processing device, processor, aircraft, and somatosensory system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/079756 WO2018184218A1 (zh) 2017-04-07 2017-04-07 Control method, processing device, processor, aircraft and somatosensory system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/591,165 Continuation US20200150691A1 (en) 2017-04-07 2019-10-02 Control method, processing device, processor, aircraft, and somatosensory system

Publications (1)

Publication Number Publication Date
WO2018184218A1 true WO2018184218A1 (zh) 2018-10-11

Family

ID=63711981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079756 WO2018184218A1 (zh) 2017-04-07 2017-04-07 Control method, processing device, processor, aircraft and somatosensory system

Country Status (3)

Country Link
US (1) US20200150691A1 (zh)
CN (2) CN113050669A (zh)
WO (1) WO2018184218A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113050669A (zh) * 2017-04-07 2021-06-29 深圳市大疆创新科技有限公司 Control method, processing device, processor, aircraft and somatosensory system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4802757A (en) * 1986-03-17 1989-02-07 Geospectra Corporation System for determining the attitude of a moving imaging sensor platform or the like
CN102607532A * 2011-01-25 2012-07-25 吴立新 Fast matching method for low-altitude images using flight control data
CN104111659A * 2013-04-19 2014-10-22 索尼公司 Control device, control method, and computer program
CN105222761A * 2015-10-29 2016-01-06 哈尔滨工业大学 First-person immersive UAV piloting system and piloting method implemented with virtual reality and binocular vision technology
CN205645015U * 2016-01-05 2016-10-12 上海交通大学 Ground cockpit and two-degree-of-freedom 360-degree flight-simulation cockpit motion platform
CN106155069A * 2016-07-04 2016-11-23 零度智控(北京)智能科技有限公司 UAV flight control device and method, and remote control terminal

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102348068B * 2011-08-03 2014-11-26 东北大学 Follow-up remote vision system based on head-attitude control
CN202632581U * 2012-05-28 2012-12-26 戴震宇 Flight simulation control and experience device based on a real aerial environment
US20140008496A1 * 2012-07-05 2014-01-09 Zhou Ye Using handheld device to control flying object
CN105573330B * 2015-03-03 2018-11-09 广州亿航智能技术有限公司 Aircraft control method based on a smart terminal
CN108883335A * 2015-04-14 2018-11-23 约翰·詹姆斯·丹尼尔斯 Wearable electronic multisensory interface for human-to-machine or human-to-human use
CN204741528U * 2015-04-22 2015-11-04 四川大学 Stereoscopic immersive somatosensory intelligent controller
CN105489083A * 2016-01-05 2016-04-13 上海交通大学 Two-degree-of-freedom 360-degree flight-simulation cockpit motion platform
CN105739525B * 2016-02-14 2019-09-03 普宙飞行器科技(深圳)有限公司 System for realizing virtual flight in cooperation with somatosensory operation
CN106125769A * 2016-07-22 2016-11-16 南阳理工学院 Design method for a wireless head-motion follow-up system
CN113050669A (zh) * 2017-04-07 2021-06-29 深圳市大疆创新科技有限公司 Control method, processing device, processor, aircraft and somatosensory system

Also Published As

Publication number Publication date
CN108885101A (zh) 2018-11-23
CN113050669A (zh) 2021-06-29
US20200150691A1 (en) 2020-05-14
CN108885101B (zh) 2021-03-19

Similar Documents

Publication Publication Date Title
EP3422699B1 (en) Camera module and control method
US10484600B2 (en) Electronic apparatus and controlling method thereof
US9635254B2 (en) Panoramic scene capturing and browsing mobile device, system and method
US9992483B2 (en) Imaging architecture for depth camera mode with mode switching
US20190043209A1 (en) Automatic tuning of image signal processors using reference images in image processing environments
US10685666B2 (en) Automatic gain adjustment for improved wake word recognition in audio systems
US20100106295A1 (en) System and method for stabilization control adopting vestibulo-ocular reflex
EP3988902A1 (en) Event data stream processing method and computing device
WO2019011091A1 (zh) 拍照提醒方法、装置、终端和计算机存储介质
WO2019127027A1 (zh) 无人机拍摄视频的处理方法、拍摄相机和遥控器
WO2018191963A1 (zh) 遥控器、云台及云台控制方法、装置、系统
US20170085740A1 (en) Systems and methods for storing images and sensor data
WO2019183914A1 (en) Dynamic video encoding and view adaptation in wireless computing environments
US11388343B2 (en) Photographing control method and controller with target localization based on sound detectors
WO2020051831A1 (zh) 手持云台的控制方法及手持云台、手持设备
WO2018191964A1 (zh) 云台的控制方法以及云台
US20210389764A1 (en) Relative image capture device orientation calibration
US9699381B2 (en) Digital photographing motion compensation system and method
WO2018019013A1 (zh) 拍照控制方法和装置
WO2018184218A1 (zh) 控制方法、处理装置、处理器、飞行器和体感系统
US20160006938A1 (en) Electronic apparatus, processing method and storage medium
US9250498B2 (en) Apparatus and method for controlling auto focus function in electronic device
CN103854447A (zh) 依据目标影像与倾斜角度提示坐姿调整的可携装置及方法
JP2017112439A (ja) 移動撮像装置および移動撮像装置の制御方法
US10165173B2 (en) Operating method and apparatus for detachable lens type camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17904487; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17904487; Country of ref document: EP; Kind code of ref document: A1)