WO2021026754A1 - Focus control method and apparatus for a photographing device, and unmanned aerial vehicle - Google Patents

Focus control method and apparatus for a photographing device, and unmanned aerial vehicle

Info

Publication number
WO2021026754A1
WO2021026754A1 · PCT/CN2019/100344 · CN2019100344W
Authority
WO
WIPO (PCT)
Prior art keywords
focus
target
photographing device
dimensional map
unmanned aerial
Prior art date
Application number
PCT/CN2019/100344
Other languages
English (en)
Chinese (zh)
Inventor
吴博
钱杰
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201980032926.XA priority Critical patent/CN112154650A/zh
Priority to PCT/CN2019/100344 priority patent/WO2021026754A1/fr
Publication of WO2021026754A1 publication Critical patent/WO2021026754A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Definitions

  • This application relates to the field of imaging technology, and in particular to a focus control method and device for a photographing device, and an unmanned aerial vehicle.
  • Shooting devices generally have an autofocus (AF) function.
  • SLR cameras usually use a phase difference detection method for focusing.
  • Repeated focusing and difficulty in focusing can occur, resulting in unclear imaging of the shooting target and a poor user experience.
  • One of the objectives of the present invention is to provide a focus control method and device for a shooting device, and an unmanned aerial vehicle, so as to at least accurately determine the focus parameters of the shooting device and use them to perform autofocus.
  • In a first aspect, an embodiment of the present invention provides a focus control method for a photographing device installed on an unmanned aerial vehicle. The method includes: acquiring a three-dimensional map of the surrounding environment where the photographing device is located; determining a focus target and focus parameters of the photographing device at least partially according to the three-dimensional map, the focus parameters including a target distance between the focus target and the photographing device; and adjusting the focal length of the photographing device according to the focus parameters.
  • In a second aspect, an embodiment of the present invention provides a focus control device for a photographing device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor. When executing the program, the processor implements: acquiring a three-dimensional map of the surrounding environment where the photographing device is located; determining a focus target and focus parameters of the photographing device at least partially according to the three-dimensional map, the focus parameters including a target distance between the focus target and the photographing device; and adjusting the focal length of the photographing device according to the focus parameters.
  • In a third aspect, an embodiment of the present invention provides an unmanned aerial vehicle comprising a photographing device and the focus control device of the photographing device described in the second aspect.
  • The focus control method, device, and unmanned aerial vehicle provided by the embodiments of the present invention obtain a three-dimensional map of the surrounding environment of the unmanned aerial vehicle, determine the focus target and focus parameters of the photographing device at least partially based on the three-dimensional map, and then adjust the focal length of the photographing device according to the focus parameters to control the photographing device to perform autofocus.
  • In this way, the present invention can accurately calculate the focus parameters and realize autofocus in a simple and efficient manner.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a focus control method of a photographing device according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a process of obtaining a three-dimensional map of the surrounding environment where the camera is located according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a process for determining the focus parameters of a shooting device according to a three-dimensional map according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of posture information of a photographing device provided by an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a designated area provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a three-dimensional map provided by an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a focus control device of a photographing device according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of an unmanned aerial vehicle provided by an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of another unmanned aerial vehicle provided by an embodiment of the present invention.
  • Focusing refers to adjusting the focus distance so that the photo is clear.
  • Current focusing methods include manual focus and autofocus.
  • Manual focus is a focusing method in which the distance between the lens groups of the camera lens is adjusted by manually turning the focusing ring on the lens until the image is clear. This method largely depends on the photographer's judgment of image clarity in the viewfinder.
  • Current SLR cameras usually adopt a phase-difference detection focusing scheme. However, in low-light conditions, the focusing speed and performance of this scheme are significantly reduced; in actual shooting, repeated focusing and difficulty in focusing occur, so the image of the shooting target is not clear and the user experience is poor.
  • Therefore, the embodiments of the present invention provide a focus control method and device for a photographing device, and an unmanned aerial vehicle, which leverage the application scenario of the unmanned aerial vehicle to achieve simple and efficient autofocus.
  • The UAV 10 includes a photographing device 101 and a depth sensor.
  • The depth sensor may be a binocular vision sensor (including a first camera 102A and a second camera 102B).
  • The binocular vision sensor can obtain a depth map of the surrounding environment of the unmanned aerial vehicle 10, and the positioning device and the attitude sensor (not shown) of the unmanned aerial vehicle 10 can obtain the position information and attitude information of the binocular vision sensor. From these, a three-dimensional map of the surrounding environment of the unmanned aerial vehicle 10 is obtained.
  • The optical center of the photographing device can be used as the starting point to form one or more direction vectors in the central area of the field of view (FOV) of the photographing device. By examining the areas that these direction vectors pass through in the three-dimensional map, the focus target and the depth information corresponding to the focus target can be obtained.
  • The focal length of the photographing device 101 can then be automatically adjusted according to the depth information corresponding to the focus target to achieve autofocus.
  • In this way, an object located in the center area of the frame in the image captured by the photographing device can always be clearly imaged.
  • The number of binocular vision sensors may be one or more, so that a three-dimensional map within a certain radius around the unmanned aerial vehicle can be generated in real time during its movement.
  • Multiple binocular vision sensors can be installed on the front, rear, left, and right sides of the UAV, respectively.
  • The embodiment of the present invention uses binocular vision sensors to generate three-dimensional maps without adding additional hardware such as other sensors.
  • The focus target is accurately determined, and the focal length of the photographing device is adjusted according to the depth information of the focus target to achieve autofocus, which is simple and efficient.
  • The photographing device is installed on an unmanned aerial vehicle.
  • The focus control method of the photographing device is applied to the focus control device of the photographing device.
  • The focus control device may be installed inside the photographing device, independently installed outside the photographing device, or partly inside and partly outside it; for example, one part may be installed in the body of the unmanned aerial vehicle and another part in the photographing device.
  • The method includes the following steps S201-S203:
  • S201. Acquire a three-dimensional map of the surrounding environment where the photographing device is located.
  • The three-dimensional map of the surrounding environment where the photographing device is located may be pre-established, and it contains three-dimensional information of that environment.
  • The three-dimensional information may be expressed in longitude, latitude, and altitude.
  • The three-dimensional map can be stored in a cloud processor.
  • The focus control device of the photographing device can acquire, in real time, a three-dimensional map within a certain radius centered on the unmanned aerial vehicle.
  • The three-dimensional map of the surrounding environment where the aforementioned photographing device is located may be updated in real time based on the depth sensor on the unmanned aerial vehicle.
  • The depth sensor can be a binocular vision sensor, an ultrasonic sensor, a millimeter-wave radar sensor, or a lidar sensor.
  • FIG. 3 is a schematic flowchart of obtaining a three-dimensional map of the surrounding environment of an unmanned aerial vehicle according to an embodiment of the present invention.
  • The foregoing acquiring of a three-dimensional map of the surrounding environment where the photographing device is located specifically includes the following steps S301-S303:
  • The aforementioned depth sensor is a binocular vision sensor.
  • By using the binocular vision sensor for distance measurement, the three-dimensional coordinates of an object can be quickly calculated and the three-dimensional space can be reconstructed, with the advantages of high flexibility and high accuracy.
  • The position of the depth sensor may be acquired through a positioning device, such as a GPS module, and the attitude of the depth sensor may be acquired through an attitude sensor, such as an inertial measurement unit (IMU).
  • The GPS module and the IMU are installed on the unmanned aerial vehicle.
  • Multiple depth sensors may be installed on the unmanned aerial vehicle, each generating depth images in real time. After the position and attitude of each depth sensor are obtained, the relative positional relationship between the depth sensors can be further obtained, and the coordinate conversion relationship between the depth images generated by the depth sensors can be determined. The depth images generated by the depth sensors are then stitched together according to the coordinate conversion relationship to obtain a three-dimensional map of the surrounding environment where the photographing device is located.
  • Alternatively, a single depth sensor may be installed on the unmanned aerial vehicle.
  • The depth sensor can generate multiple depth images in real time during the movement of the unmanned aerial vehicle. After the positions and attitudes of the depth sensor corresponding to the multiple depth images are obtained, the coordinate conversion relationship between these depth images can be determined, and the multiple depth images are then stitched together according to this relationship to obtain a three-dimensional map of the surrounding environment where the photographing device is located.
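  • As an illustration of the stitching step, the following minimal sketch (not part of the patent) back-projects each depth image with a pinhole camera model and applies each sensor's pose to bring all points into one shared frame; the function names, the intrinsics tuple, and the simple point-cloud "map" are assumptions made for the example.

      import numpy as np

      def depth_to_points(depth, fx, fy, cx, cy):
          """Back-project a depth image (in meters) to 3-D points in the sensor frame."""
          h, w = depth.shape
          u, v = np.meshgrid(np.arange(w), np.arange(h))
          x = (u - cx) * depth / fx
          y = (v - cy) * depth / fy
          pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
          return pts[pts[:, 2] > 0]                      # drop pixels with no depth

      def to_world(points_sensor, R_ws, t_ws):
          """Apply the sensor pose (rotation R_ws, translation t_ws) so that points
          from different sensors or different moments share one world frame."""
          return points_sensor @ R_ws.T + t_ws

      def fuse_frames(frames):
          """Stitch depth frames from one or several depth sensors into a single
          point cloud; each frame is (depth_image, (fx, fy, cx, cy), R_ws, t_ws)."""
          clouds = [to_world(depth_to_points(d, *k), R, t) for d, k, R, t in frames]
          return np.vstack(clouds)                       # crude three-dimensional map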
  • S202. Determine a focus target and focus parameters of the photographing device at least partially according to the three-dimensional map, where the focus parameters at least include a target distance between the focus target and the photographing device.
  • The focus target can be determined from the above-mentioned map, and the target distance between the focus target and the photographing device can be obtained according to the depth information of the focus target.
  • S203. Adjust the focal length of the photographing device according to the target distance to control it to perform autofocus, so that the focus target forms a clear image.
  • The focus control method of the photographing device thus obtains a three-dimensional map of the surrounding environment of the photographing device, determines the focus target and focus parameters of the photographing device from the three-dimensional map, and controls the photographing device to autofocus according to the determined focus parameters, which is both accurate and efficient.
  • The foregoing acquiring of a three-dimensional map of the surrounding environment where the photographing device is located includes the following.
  • In some cases, the focal length of the photographing device can be set to infinity; for example, when the distance between the focus target and the photographing device in the shooting scene is greater than 16 meters, the focal length of the photographing device can be set to infinity.
  • The acquired three-dimensional map of the surrounding environment of the photographing device is a three-dimensional map within a preset range.
  • The preset range may be centered on the carrier of the photographing device with a radius of 16 meters, 32 meters, or another distance. Limiting the range of the three-dimensional map reduces the amount of data storage and increases the processing speed.
  • When the focus target cannot be determined from the three-dimensional map, the photographing device can be controlled to adjust its focus to infinity, as sketched below.
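  • The following minimal sketch shows one pass of the S201-S203 loop with the infinity fallback; locate_focus_target and the camera methods are hypothetical placeholders, not an API defined by the patent.

      def focus_control_step(camera, map_3d, position, attitude):
          # S202: search the three-dimensional map for a focus target (placeholder helper)
          target = locate_focus_target(map_3d, position, attitude)
          if target is None:
              camera.set_focus_to_infinity()            # no target found in the map
              return
          _, target_distance = target                   # depth of the focus target
          camera.set_focus_distance(target_distance)    # S203: adjust the focal length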
  • FIG. 4 is a schematic diagram of a process for determining the focus parameters of a photographing device according to a three-dimensional map according to an embodiment of the present invention. As shown in FIG. 4, in the embodiment of the present invention, determining the focus parameters of the photographing device according to the three-dimensional map includes the following steps S401-S403:
  • S401. Acquire the position information and posture information of the photographing device.
  • A positioning device, such as a GPS module, is installed on the unmanned aerial vehicle.
  • The positioning device can obtain the position information of the unmanned aerial vehicle in real time. Since the relative positional relationship between the photographing device and the positioning device can be obtained in advance, the position information of the photographing device can be derived from the position information obtained by the positioning device.
  • The position information of the photographing device is the position information of the optical center of the photographing device.
  • The posture information of the aforementioned photographing device includes the exit direction of the center line of the field of view (FOV) of the photographing device 1011 (the dotted line shown in the figure).
  • The posture information of the photographing device is not limited to the above representation.
  • When the photographing device is installed on the unmanned aerial vehicle through a three-axis gimbal, the attitude angle of the gimbal can also be used to indicate the posture of the photographing device.
  • Based on the position information and posture information, one or more direction vectors within the field of view of the photographing device can be determined.
  • S402. Determine a focus target according to the position information and posture information of the photographing device and the three-dimensional map.
  • Based on the position information and posture information of the photographing device, the focus target can be located in the three-dimensional map.
  • S403. Determine the depth information of the focus target as the target distance between the focus target and the photographing device.
  • After the focus target is located, its depth information can be further obtained and determined as the target distance between the focus target and the photographing device.
  • Determining the focus target according to the position information, the posture information, and the three-dimensional map includes the following steps A10-A20:
  • Step A10. Determine one or more direction vectors according to the position information and posture information of the photographing device.
  • The optical center of the photographing device can be used as the starting point, and one or more direction vectors are formed in the central area of the field of view of the photographing device to search for the focus target in that central area.
  • The central area may be a cylindrical area 600 extending in the direction of the central axis of the photographing device, with the optical center of the photographing device as the center and a predetermined value as the radius; it corresponds to the center area of the image.
  • Each direction vector is a vector within the cylindrical area 600 that represents a ray emitted from the optical center of the photographing device under the designated coordinate system, and the vector may be a unit vector.
  • The above-mentioned designated coordinate system may be a three-dimensional coordinate system with the optical center of the photographing device as the origin O, the true east direction as the X-axis direction, the true north direction as the Y-axis direction, and the upward (sky-pointing) direction as the Z-axis direction.
  • The designated coordinate system may also be another coordinate system; a sketch of computing direction vectors in the east-north-up frame follows.
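  • A minimal sketch, assuming the centre-line direction is given as a compass azimuth (clockwise from true north) and an elevation above the horizon; the function names and the small ray bundle used to cover the central cylindrical area are illustrative choices, not defined by the patent.

      import numpy as np

      def center_ray(azimuth_deg, elevation_deg):
          """Unit direction vector of the FOV centre line in the designated frame
          (X = true east, Y = true north, Z = up)."""
          az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
          return np.array([np.sin(az) * np.cos(el),   # east  (X)
                           np.cos(az) * np.cos(el),   # north (Y)
                           np.sin(el)])               # up    (Z)

      def ray_bundle(center_dir, half_angle_deg=2.0, n=8):
          """One or more unit direction vectors spread around the centre line so the
          search covers the central (cylindrical) area rather than a single ray."""
          rays = [center_dir]
          a = np.cross(center_dir, [0.0, 0.0, 1.0])
          a = a / np.linalg.norm(a) if np.linalg.norm(a) > 1e-6 else np.array([1.0, 0.0, 0.0])
          b = np.cross(center_dir, a)
          spread = np.tan(np.radians(half_angle_deg))
          for k in range(n):
              theta = 2 * np.pi * k / n
              d = center_dir + spread * (np.cos(theta) * a + np.sin(theta) * b)
              rays.append(d / np.linalg.norm(d))
          return np.array(rays)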
  • FIG. 7 is a schematic diagram of a three-dimensional map provided by an embodiment of the present invention.
  • The three-dimensional map and the above-mentioned direction vectors are in the same coordinate system, or are in different coordinate systems with a known coordinate conversion relationship between them.
  • Step A20. Determine a focus target of the photographing device according to the direction vector and the three-dimensional map.
  • A target location point is determined from the three-dimensional map according to the above-mentioned direction vector, the target area where the target location point is located is identified, and when the target area corresponding to the target location point satisfies a preset condition, the target location point is determined as the focus target.
  • The target area corresponding to the target location point meets a preset condition when:
  • the area occupied by the target area is greater than a preset threshold and/or the location of the target area falls within a preset location area.
  • The area occupied by the target area may be the area occupied by the target area in the image after imaging.
  • Alternatively, it may be the ratio of the area occupied by the target area in the image after imaging to the entire image area.
  • It may also be the physical area corresponding to the target area.
  • It may also be the ratio of the physical area corresponding to the target area to a predetermined area.
  • The predetermined area may be the area of the circular cross-section of the cylindrical area 600.
  • Noise points may exist in the three-dimensional map of the surrounding environment where the photographing device is located, and they are usually small. Such noise points can be eliminated by checking whether the area occupied by the target area is greater than the preset threshold, which improves the accuracy of focusing.
  • The preset location area may be the central area of the image after the target area is imaged, and this central area may be a rectangular, square, or circular area.
  • The preset location area may also be a central area of the field of view of the photographing device in three-dimensional space, and this central area may be a rectangular parallelepiped, cube, cylinder, or sphere.
  • The preset location area may be a cylindrical area 601 at a certain distance from the photographing device.
  • The radius of the cylindrical area 601 may be the same as or different from that of the cylindrical area 600.
  • The cylindrical area 601 is coaxial with the cylindrical area 600.
  • The cylindrical area 601 may also be replaced with other shapes such as a cuboid, a cube, or a sphere.
  • The focus target is usually in the center area of the frame, so target areas at the edge of the frame can be filtered out by checking whether the location of the target area falls within the preset location area.
  • The focusing distance of the photographing device is usually limited; for example, the closest focusing distance of some photographing devices is 1.5 meters, meaning that when the target area is too close, the photographing device cannot focus. The preset location area can therefore be set at a certain distance from the photographing device to avoid situations where focusing is impossible.
  • Using as the focus target a target location point whose target area occupies an area greater than the preset threshold and/or lies within the preset location area improves the accuracy of focusing and avoids situations where focusing is impossible; a sketch of this search follows.
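  • A minimal sketch of steps A10-A20 over a point-cloud map, assuming the map is an N x 3 array of world-frame points; the step size, hit radius, area threshold, and closest focusing distance are illustrative values, and the area estimate is a deliberately crude stand-in for the patent's target-area check.

      import numpy as np

      def estimate_patch_area(patch, cell=0.05):
          """Rough surface-area estimate: project the patch onto its two most
          extended axes and count occupied grid cells of size `cell`."""
          spread = patch.max(axis=0) - patch.min(axis=0)
          keep = np.argsort(spread)[1:]                     # two largest-spread axes
          cells = np.unique(np.floor(patch[:, keep] / cell).astype(int), axis=0)
          return len(cells) * cell * cell

      def find_focus_target(map_points, origin, direction,
                            step=0.25, max_range=32.0, hit_radius=0.3,
                            min_area=0.05, min_dist=1.5):
          """March along one direction vector through the map and return
          (target_point, target_distance), or None if no target is found
          (the caller may then focus at infinity).  The preset conditions are
          a minimum patch area and a minimum distance from the camera."""
          direction = direction / np.linalg.norm(direction)
          for t in np.arange(min_dist, max_range, step):
              probe = origin + t * direction
              dists = np.linalg.norm(map_points - probe, axis=1)
              patch = map_points[dists < hit_radius]        # candidate target area
              if len(patch) and estimate_patch_area(patch) >= min_area:
                  target = patch.mean(axis=0)
                  return target, float(np.linalg.norm(target - origin))
          return None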
  • Adjusting the focal length of the photographing device according to the focus parameters includes the following steps B10-B30:
  • Step B10. For each focus target, acquire the image collected by the photographing device using the focus parameters corresponding to that focus target.
  • The distance between each focus target and the photographing device is obtained from the map, and the photographing device is controlled to autofocus using the distance corresponding to each focus target and to collect an image.
  • Step B20. Compare all the images to determine a target image.
  • The target image with the best imaging quality can be obtained according to the contrast information of all the images.
  • Alternatively, all the images can be input into a pre-trained model to obtain the target image with the best imaging quality.
  • Step B30. Control the photographing device to perform autofocus according to the focus parameters corresponding to the target image.
  • The distance corresponding to the target image with the best imaging quality is determined as the target distance for controlling the photographing device to perform autofocus, and the focal length of the photographing device is adjusted according to this target distance, making the focusing of the photographing device more accurate. A contrast-based sketch of this selection follows.
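  • A minimal sketch of the contrast comparison in B20, assuming grayscale frames as NumPy arrays and a hypothetical camera object with set_focus_distance() and capture_gray() methods; the Laplacian-based score is one common contrast proxy, not the patent's prescribed metric.

      import numpy as np

      def sharpness(gray):
          """Contrast proxy: mean squared response of a 4-neighbour Laplacian."""
          lap = (-4 * gray[1:-1, 1:-1]
                 + gray[:-2, 1:-1] + gray[2:, 1:-1]
                 + gray[1:-1, :-2] + gray[1:-1, 2:])
          return float(np.mean(lap ** 2))

      def pick_best_focus(camera, candidate_distances):
          """B10-B30: shoot one frame per candidate target distance, score each
          frame by contrast, and return the distance of the sharpest frame."""
          scores = []
          for dist in candidate_distances:
              camera.set_focus_distance(dist)     # B10: focus on this candidate
              scores.append(sharpness(camera.capture_gray().astype(float)))
          return candidate_distances[int(np.argmax(scores))]   # B30: refocus here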
  • The above-mentioned focus parameters further include state information of the photographing device corresponding to the target distance, where the state information of the photographing device includes posture information and position information.
  • The above-mentioned method further includes the following step C10:
  • Step C10. Store the target distance in association with the state information of the photographing device corresponding to that target distance.
  • The associated storage may take the form of a lookup table.
  • When shooting, the state information of the photographing device is acquired, the stored target distance corresponding to that state information is retrieved, and the photographing device is controlled to autofocus according to the target distance.
  • In this scenario, the flight trajectory of the drone is a pre-set fixed trajectory.
  • With the present invention, it is therefore possible to obtain in advance the state information that the photographing device will have at a later time, determine the focus target of the photographing device at that later time from the three-dimensional map of the surrounding environment, obtain the target distance between the focus target and the photographing device, and store it in association with the above-mentioned state information.
  • When the photographing device reaches that state, the target distance corresponding to the state information is retrieved and the focal length of the photographing device is adjusted according to it to perform autofocus. In this way, the focus parameters can be determined before the photographing device actually reaches the state at a given moment, which improves the efficiency of focal length adjustment. A sketch of such a lookup table follows.
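  • A minimal sketch of step C10 as an in-memory table, assuming the state is quantised so that nearby positions and attitudes map to the same key; the resolutions and the dictionary-based table are illustrative, not a structure specified by the patent.

      import numpy as np

      def state_key(position, attitude, pos_res=0.5, ang_res=2.0):
          """Quantise the device state (position in metres, attitude in degrees)
          so that states close to a pre-computed trajectory point share a key."""
          p = tuple(np.round(np.asarray(position, dtype=float) / pos_res).astype(int))
          a = tuple(np.round(np.asarray(attitude, dtype=float) / ang_res).astype(int))
          return p + a

      focus_table = {}   # state key -> pre-computed target distance (step C10)

      def store_focus(position, attitude, target_distance):
          focus_table[state_key(position, attitude)] = target_distance

      def lookup_focus(position, attitude):
          """Return the stored target distance for this state, or None if it was
          never pre-computed (the caller may then fall back to live focusing)."""
          return focus_table.get(state_key(position, attitude))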
  • FIG. 8 is a schematic structural diagram of a focus control device of a photographing device provided in an embodiment of the present invention.
  • the focus control device 800 of the photographing device includes a memory 802, a processor 801, and a computer program stored on the memory 802 and running on the processor 801.
  • When executing the program, the processor 801 implements: acquiring a three-dimensional map of the surrounding environment where the photographing device is located; determining a focus target and focus parameters of the photographing device at least partially according to the three-dimensional map, the focus parameters including a target distance between the focus target and the photographing device; and adjusting the focal length of the photographing device according to the focus parameters.
  • When executing the program, the processor 801 further obtains a three-dimensional map of the surrounding environment of the unmanned aerial vehicle.
  • The above-mentioned depth sensor includes a binocular vision sensor.
  • When executing the program, the processor 801 further determines the depth information of the focus target as the target distance between the focus target and the photographing device.
  • When executing the program, the processor 801 further determines a focus target of the photographing device.
  • When executing the program, the processor 801 further determines one or more direction vectors corresponding to the central position area within the field of view of the photographing device.
  • When executing the program, the processor 801 further determines the target location point as the focus target when the target area corresponding to the target location point satisfies a preset condition.
  • The foregoing preset condition includes: the area occupied by the target area is greater than a preset threshold and/or the location of the target area is located in a preset location area.
  • When executing the program, the processor 801 further adjusts the focal length of the photographing device according to the focus parameters corresponding to the target image.
  • When executing the program, the processor 801 further acquires the stored target distance corresponding to the state information of the photographing device and controls the photographing device to perform autofocus according to that target distance.
  • The device shown in FIG. 8 can execute the methods of the embodiments shown in FIG. 1 to FIG. 7.
  • For parts not described in detail in this embodiment, please refer to the related description of the embodiments shown in FIG. 1 to FIG. 7, which will not be repeated here.
  • FIG. 9 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
  • The UAV 1000 includes a photographing device 1002 and the focus control device 1001 of the photographing device described in any of the above embodiments.
  • The focus control device 1001 of the photographing device may be installed inside the photographing device 1002, independently installed outside the photographing device 1002, or partly inside and partly outside it; for example, one part may be installed in the body of the UAV 1000 and another part in the photographing device 1002.
  • The focus control device 1001 of the photographing device is used to obtain a three-dimensional map of the surrounding environment where the photographing device 1002 is located; determine a focus target and focus parameters of the photographing device at least partially according to the three-dimensional map, the focus parameters including the target distance between the focus target and the photographing device; and adjust the focal length of the photographing device according to the focus parameters.
  • The above-mentioned unmanned aerial vehicle 1000 further includes a depth sensor 1003.
  • The above-mentioned depth sensor 1003 is a binocular vision sensor.
  • By using the binocular vision sensor for distance measurement, the three-dimensional coordinates of an object can be quickly calculated and the three-dimensional space can be reconstructed, with the advantages of high flexibility and high accuracy.
  • The embodiment of the present invention generates a three-dimensional map by using binocular vision sensors without adding additional hardware such as other sensors. It can accurately determine the focus target and realize autofocus based on the depth information of the focus target, which is simple and efficient.
  • The disclosed device and method may be implemented in other ways.
  • The device embodiments described above are merely illustrative. For example, the division of units is only a logical functional division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another device, or some features may be ignored or not implemented.
  • The various component embodiments of the present invention may be implemented by hardware, by software modules running on one or more processors, or by a combination of the two.
  • In practice, a microprocessor or a digital signal processor (DSP) may be used to implement some or all of the functions of some modules according to the embodiments of the present invention.
  • The present invention can also be implemented as a device program (for example, a computer program and a computer program product) for executing part or all of the methods described herein.
  • Such a program for realizing the present invention may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals can be downloaded from Internet websites, provided on carrier signals, or provided in any other form.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are a focus control method for a photographing device (101), a focus control apparatus (800) for the photographing device (101), and an unmanned aerial vehicle. The photographing device (101) is mounted on the unmanned aerial vehicle (10); the focus control apparatus (800) for the photographing device (101) comprises a memory (802), a processor (801), and a computer program stored in the memory (802) and executable on the processor (801). The focus control method for the photographing device (101) comprises: acquiring a three-dimensional map of the surrounding environment of the unmanned aerial vehicle (10); determining a focus target and focus parameters of the photographing device (101) at least partially on the basis of the three-dimensional map, the focus parameters comprising a target distance between the focus target and the photographing device (101); and adjusting the focal length of the photographing device (101) on the basis of the focus parameters. The present focus control method can accurately and quickly calculate the focus target and the focus parameters and perform autofocus.
PCT/CN2019/100344 2019-08-13 2019-08-13 Focus control method and apparatus for a photographing device, and unmanned aerial vehicle WO2021026754A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980032926.XA CN112154650A (zh) 2019-08-13 2019-08-13 Focus control method and apparatus for a photographing device, and unmanned aerial vehicle
PCT/CN2019/100344 WO2021026754A1 (fr) 2019-08-13 2019-08-13 Focus control method and apparatus for a photographing device, and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/100344 WO2021026754A1 (fr) 2019-08-13 2019-08-13 Focus control method and apparatus for a photographing device, and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2021026754A1 true WO2021026754A1 (fr) 2021-02-18

Family

ID=73891288

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/100344 WO2021026754A1 (fr) 2019-08-13 2019-08-13 Focus control method and apparatus for a photographing device, and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN112154650A (fr)
WO (1) WO2021026754A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022141271A1 (fr) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 Control method and control device for a platform system, platform system, and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108881885A (zh) * 2017-04-10 2018-11-23 钰立微电子股份有限公司 Depth processing system
CN108174096A (zh) 2017-12-29 2018-06-15 广东欧珀移动通信有限公司 Method, apparatus, terminal and storage medium for setting shooting parameters
CN109089047B (zh) * 2018-09-29 2021-01-12 Oppo广东移动通信有限公司 Method and apparatus for controlling focus, storage medium, and electronic device
CN109905604B (zh) * 2019-03-29 2021-09-21 深圳市道通智能航空技术股份有限公司 Focusing method and apparatus, photographing device, and aircraft

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103179338A (zh) * 2011-12-22 2013-06-26 奥林巴斯映像株式会社 Tracking device and tracking method
JP2013221993A (ja) * 2012-04-13 2013-10-28 Olympus Corp Autofocus control device, autofocus control method, and imaging device
US20150201182A1 (en) * 2013-04-11 2015-07-16 Altek Semiconductor Corp. Auto focus method and auto focus apparatus
CN107409205A (zh) * 2015-03-16 2017-11-28 深圳市大疆创新科技有限公司 Apparatus and method for focal length adjustment and depth map determination
CN108351574A (zh) * 2015-10-20 2018-07-31 深圳市大疆创新科技有限公司 System, method and apparatus for setting camera parameters
CN107079102A (zh) * 2016-09-26 2017-08-18 深圳市大疆创新科技有限公司 Focusing method, imaging device, and unmanned aerial vehicle
CN108496350A (zh) * 2017-09-27 2018-09-04 深圳市大疆创新科技有限公司 Focus processing method and device
CN108702456A (zh) * 2017-11-30 2018-10-23 深圳市大疆创新科技有限公司 Focusing method, device, and readable storage medium

Also Published As

Publication number Publication date
CN112154650A (zh) 2020-12-29

Similar Documents

Publication Publication Date Title
WO2019113966A1 (fr) Procédé et dispositif d'évitement d'obstacle, et véhicule aérien autonome
CN107329490B (zh) 无人机避障方法及无人机
WO2020014909A1 (fr) Procédé et dispositif de photographie, et véhicule aérien sans pilote
WO2018210078A1 (fr) Procédé de mesure de distance pour un véhicule aérien sans pilote et véhicule aérien sans pilote
CN108235815B (zh) 摄像控制装置、摄像装置、摄像系统、移动体、摄像控制方法及介质
US20220086362A1 (en) Focusing method and apparatus, aerial camera and unmanned aerial vehicle
WO2018120350A1 (fr) Procédé et dispositif de positionnement de véhicule aérien sans pilote
CN110139038B (zh) 一种自主环绕拍摄方法、装置以及无人机
US20210109312A1 (en) Control apparatuses, mobile bodies, control methods, and programs
US20210120171A1 (en) Determination device, movable body, determination method, and program
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
WO2020237422A1 (fr) Procédé d'arpentage aérien, aéronef et support d'informations
WO2017203646A1 (fr) Dispositif de commande de capture d'image, dispositif de spécification de position d'ombre, système de capture d'image, objet mobile, procédé de commande de capture d'image, procédé de spécification de position d'ombre, et programme
WO2020019175A1 (fr) Procédé et dispositif de traitement d'image et dispositif photographique et véhicule aérien sans pilote
JP2021096865A (ja) 情報処理装置、飛行制御指示方法、プログラム、及び記録媒体
WO2021026754A1 (fr) Procédé et appareil de commande de mise au point pour appareil de photographie, et aéronef sans pilote
US20210105411A1 (en) Determination device, photographing system, movable body, composite system, determination method, and program
US10884415B2 (en) Unmanned aerial vehicle light flash synchronization
JP6515423B2 (ja) 制御装置、移動体、制御方法、及びプログラム
JP6543878B2 (ja) 制御装置、撮像装置、移動体、制御方法、およびプログラム
US20210112202A1 (en) Control apparatuses, mobile bodies, control methods, and programs
WO2019127192A1 (fr) Procédé et appareil de traitement d'image
US20210092306A1 (en) Movable body, image generation method, program, and recording medium
CN111602385B (zh) 确定装置、移动体、确定方法以及计算机可读记录介质
WO2021223107A1 (fr) Procédé de traitement de signal, dispositif électronique et support de stockage lisible par ordinateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19941520

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19941520

Country of ref document: EP

Kind code of ref document: A1