WO2020107372A1 - Control method, apparatus, device, and storage medium for a shooting device - Google Patents


Info

Publication number
WO2020107372A1
Authority
WO
WIPO (PCT)
Prior art keywords: target object, image, information, shooting, shooting device
Prior art date
Application number
PCT/CN2018/118410
Other languages
English (en)
French (fr)
Inventor
钱杰
郭晓东
邬奇峰
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2018/118410 (this application, published as WO2020107372A1)
Priority to CN201880040431.7A (published as CN110785993A)
Publication of WO2020107372A1
Priority to US17/334,735 (published as US20210289141A1)


Classifications

    • H: ELECTRICITY; H04: ELECTRIC COMMUNICATION TECHNIQUE; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/611 Control based on recognised objects where the recognised objects include parts of the human body
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/633 Electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N 23/634 Warning indications
    • H04N 23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • B: PERFORMING OPERATIONS; TRANSPORTING; B64: AIRCRAFT; AVIATION; COSMONAUTICS; B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • Embodiments of the present invention relate to the field of unmanned aerial vehicles, and in particular, to a control method, device, equipment, and storage medium for photographing equipment.
  • In the prior art, movable platforms such as mobile robots, unmanned aerial vehicles, and handheld gimbals are equipped with a shooting device, and the user can adjust the focal length of the shooting device to adjust the size of the target object in the image captured by the shooting device. However, if the target object becomes too large or too small in the image, the target object in the image cannot be accurately identified.
  • Embodiments of the present invention provide a control method, device, equipment, and storage medium of a photographing device, so as to improve the accuracy of identifying target objects in an image.
  • a first aspect of an embodiment of the present invention is to provide a method for controlling a shooting device, which is applied to a movable platform on which the shooting device is mounted.
  • the method includes: acquiring shooting information of the shooting device and monitoring information of a target object, where the target object is a shooting object of the shooting device; and, if a recognition trigger event for the target object is detected, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in the image collected by the shooting device is controlled within a preset range.
  • a second aspect of the embodiments of the present invention is to provide a control device for a shooting device, including: a memory and a processor;
  • the memory is used to store program codes
  • the processor calls the program code, and when the program code is executed, it is used to perform the following operations:
  • acquire shooting information of the shooting device and monitoring information of a target object, where the target object is a shooting object of the shooting device; and, if a recognition trigger event for the target object is detected, control the zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in the image collected by the shooting device is controlled within a preset range.
  • a third aspect of the embodiments of the present invention is to provide a movable platform, including: a fuselage; a power system, installed on the fuselage to provide power; and the control device according to the second aspect.
  • a fourth aspect of the embodiments of the present invention is to provide a computer-readable storage medium on which a computer program is stored, and the computer program is executed by a processor to implement the method of the first aspect.
  • According to the control method, apparatus, device, and storage medium of the shooting device provided by the embodiments, the shooting information of the shooting device and the monitoring information of the target object are acquired, and, when a recognition trigger event for the target object is detected, the zoom operation of the shooting device is controlled according to the shooting information and the monitoring information, so that the size of the target object in the image collected by the shooting device is controlled within a preset range. This prevents the target object from being too large or too small in the image collected by the shooting device, and improves the accuracy of identifying the target object.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present invention.
  • FIG. 2 is a flowchart of a control method of a shooting device according to an embodiment of the present invention.
  • FIG. 3 is a flowchart of a control method of a shooting device according to another embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a target object provided by another embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a target object provided by another embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a target object provided by another embodiment of the present invention.
  • FIG. 7 is a flowchart of a method for controlling a shooting device according to another embodiment of the present invention.
  • FIG. 8 is a schematic diagram of an application scenario provided by another embodiment of the present invention.
  • FIG. 9 is a structural diagram of a control device of a shooting device according to an embodiment of the present invention.
  • FIG. 10 is a structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
  • 61: framed area; 80: image; 90: control device; 110: communication system; 102: supporting equipment; 104: shooting equipment.
  • When a component is said to be "fixed to" another component, it can be directly on the other component, or an intervening component may also be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component, or an intervening component may be present at the same time.
  • An embodiment of the present invention provides a control method of a shooting device.
  • the control method of the shooting device is applied to a movable platform on which the shooting device is mounted.
  • the movable platform includes at least one of the following: an unmanned aerial vehicle, a movable robot, and a handheld gimbal.
  • the embodiment of the present invention takes an unmanned aerial vehicle as an example to introduce a control method of the shooting device.
  • the UAV 10 is equipped with a shooting device 11 that is connected to the fuselage of the UAV 10 through a supporting member such as a gimbal 12. 14 represents the ground control terminal, and 15 represents a processor in the UAV 10; the processor may specifically be a flight controller of the UAV 10, and the flight controller may be used to control the flight of the UAV 10, for example, to control the UAV 10 to intelligently follow the target object 41.
  • the ground control terminal 14 can be used to control the flying state parameters of the UAV 10, for example, to control the flying speed, flying altitude, attitude angle, etc. of the UAV 10.
  • the ground control terminal 14 can also control the shooting parameters of the shooting device 11 of the unmanned aerial vehicle 10, for example, control the focal length and resolution of the shooting device 11.
  • the ground control terminal 14 may be provided with a zoom component for adjusting the focal length of the shooting device 11, such as a zoom ring. The ground control terminal 14 generates a zoom command according to the user's operation on the zoom ring, and sends the zoom command to the shooting device 11; the shooting device 11 then performs zooming according to the zoom command, for example, optical zoom or digital zoom.
  • However, the user adjusting the focal length of the shooting device 11 through the ground control terminal 14 may cause the target object 41 to be too large or too small in the image captured by the shooting device 11, making it impossible to accurately identify the target object 41 in the image.
  • an embodiment of the present invention provides a method for controlling a shooting device. The method for controlling the shooting device will be described below in conjunction with specific embodiments.
  • FIG. 2 is a flowchart of a control method of a shooting device according to an embodiment of the present invention. As shown in FIG. 2, the method in this embodiment may include:
  • Step S201 Acquire shooting information of the shooting device and monitoring information of a target object, where the target object is a shooting object of the shooting device.
  • the shooting device 11 and the processor 15 of the UAV 10 are communicatively connected.
  • the shooting device 11 may send the image captured by the shooting device 11 and/or the focal length of the shooting device 11 to the processor 15 in real time.
  • the processor 15 may acquire the shooting information of the shooting device 11 and the monitoring information of the target object 41 according to the image shot by the shooting device 11 and/or the focal length of the shooting device 11.
  • the target object 41 is a photographing object of the photographing device 11, such as a human body, a human face, and the like.
  • the shooting information includes size information of the image
  • the monitoring information includes size information of the target object in the image.
  • the processor 15 may determine the size information of the image according to the image captured by the shooting device 11, identify the target object 41 in the image to determine the position information of the target object 41 in the image, and further determine the size information of the target object 41 in the image according to that position information.
  • the size information of the image may specifically be the width, height and/or area of the image, and the size information of the target object 41 in the image may specifically be the width, height and/or area of the target object 41 in the image.
  • the shooting information includes focal length information of the shooting device
  • the monitoring information includes distance information of the target object from the movable platform or the shooting device.
  • the processor 15 may determine the historical focal length of the photographing device 11 at a certain historical moment according to the focal length sent by the photographing device 11 in real time.
  • the processor 15 may also determine the first distance of the target object 41 relative to the UAV 10 or the shooting device 11 at a historical moment, according to the historical image captured by the shooting device 11 at that historical moment, the depth information of the historical image, and the internal and external parameters of the shooting device 11 when it captured the historical image. Similarly, the processor 15 can determine the second distance of the target object 41 relative to the UAV 10 or the shooting device 11 at the current time, based on the current image captured by the shooting device 11 at the current time, the depth information of the current image, and the internal and external parameters of the shooting device 11 at the current time.
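The distance determination described above can be sketched as follows. This is a minimal illustration and not part of the original disclosure: it assumes a pinhole camera model, a per-pixel depth map aligned with the image, and known internal parameters (fx, fy, cx, cy); the function name, the bounding-box convention, and the use of a median over the box are all illustrative assumptions.

```python
import numpy as np

def object_distance(depth_map, bbox, fx, fy, cx, cy):
    """Estimate the distance of the target object from the camera.

    depth_map : per-pixel depth in meters, aligned with the image (assumption)
    bbox      : (x0, y0, x1, y1) target bounding box in pixel coordinates
    fx, fy, cx, cy : pinhole intrinsics ("internal parameters")
    """
    x0, y0, x1, y1 = bbox
    region = depth_map[y0:y1, x0:x1]
    z = float(np.median(region))            # robust depth over the box
    # Back-project the box centre to camera coordinates.
    u, v = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return float(np.sqrt(x * x + y * y + z * z))
```

With the extrinsics (external parameters), the same point could further be expressed relative to the UAV body rather than the camera; that step is omitted here.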
  • Step S202: If a recognition trigger event for the target object is detected, control the zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in the image captured by the shooting device is controlled within a preset range.
  • When the UAV 10 intelligently follows the target object 41, or performs intelligent follow-up shooting of the target object 41, the followed target object 41 needs to be identified. The size of the target object 41 in the image collected by the shooting device 11 may affect the accuracy with which the UAV 10 identifies the target object 41; that is, if the target object 41 is too large or too small in the image collected by the shooting device 11, the UAV 10 may be unable to accurately recognize the target object 41. Therefore, it is necessary to control the size of the target object 41 in the image collected by the shooting device 11 within a preset range. Specifically, the processor 15 of the UAV 10 can detect a recognition trigger event for the target object 41 in real time; for example, the processor 15 can detect, in real time, the control command sent from the ground control terminal 14 to the UAV 10, and determine whether the target object 41 needs to be identified when executing that command, for example, whether the command controls the UAV 10 to intelligently follow or intelligently follow-shoot the target object 41. If it does, the processor 15 can control the zoom operation of the shooting device according to the shooting information of the shooting device and the monitoring information of the target object 41 determined in the above steps, for example, control the shooting device to stop zooming, or control the shooting device to zoom, so that the size of the target object 41 in the image captured by the shooting device 11 is controlled within the preset range.
  • In this embodiment, by acquiring the shooting information of the shooting device and the monitoring information of the target object, and, when a recognition trigger event for the target object is detected, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, the size of the target object in the image collected by the shooting device is controlled within a preset range. This prevents the target object from being too large or too small in the image collected by the shooting device, improving the accuracy of identifying the target object.
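The decision logic of steps S201 and S202 can be sketched as follows. This sketch is not part of the original disclosure: the threshold values gamma_lo and gamma_hi, the use of the larger of the two ratios, and the discrete action names are illustrative assumptions.

```python
def control_zoom(image_size, object_size, gamma_lo=0.1, gamma_hi=0.8,
                 recognition_triggered=True):
    """Decide a zoom action so the target stays within the preset range.

    image_size / object_size : (width, height) in pixels (shooting /
    monitoring information). gamma_lo, gamma_hi : preset range for the
    object-to-image ratio (assumed values). Returns one of
    'none', 'stop', 'zoom_in', 'zoom_out'.
    """
    if not recognition_triggered:
        return 'none'                 # no recognition trigger event detected
    W, H = image_size
    w, h = object_size
    ratio = max(w / W, h / H)         # larger of width and height ratios
    if ratio > gamma_hi:
        return 'zoom_out'             # target too large in the frame
    if ratio < gamma_lo:
        return 'zoom_in'              # target too small in the frame
    return 'stop'                     # within the preset range: stop zooming
```

A flight controller could run this check on every frame that the shooting device sends while intelligent following is active.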
  • FIG. 3 is a flowchart of a control method of a shooting device according to another embodiment of the present invention.
  • Optionally, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information includes: controlling the shooting device to stop zooming according to the shooting information and the monitoring information.
  • 41 represents the target object 41 that the unmanned aerial vehicle 10 follows.
  • the distance between the unmanned aerial vehicle 10 and the target object 41 may be fixed.
  • the user can adjust the focal length of the shooting device 11 of the UAV 10 through the ground control terminal 14 to adjust the size of the target object 41 in the image collected by the shooting device 11.
  • the user adjusts a zoom component on the ground control terminal 14, such as a zoom ring.
  • the ground control terminal 14 generates a zoom command according to the user's operation on the zoom ring, and sends the zoom command to the UAV 10 through wireless communication.
  • the zoom command may include a focal length value.
  • the communication interface 13 of the UAV 10 sends the zoom command to the shooting device 11, so that the shooting device 11 adjusts the focal length of the lens according to the zoom command to adjust the size of the target object 41 in the image; alternatively, the zoom command can be sent directly from the ground control terminal 14 to the shooting device 11, so that the shooting device 11 adjusts the focal length of the lens. The shooting device 11 then collects the image of the target object 41 at the adjusted focal length and sends the image to the processor 15. The processor 15 can determine, from the image of the target object 41 taken by the shooting device 11, the size of the image and the size of the target object 41 in that image. If the size of the target object 41 in the image exceeds the preset range, the processor 15 can control the shooting device 11 to stop zooming, to prevent the target object 41 from being too large or too small in the image and affecting the processor 15's accurate identification of the target object 41.
  • controlling the shooting device to stop zooming according to the shooting information and the monitoring information includes:
  • Step S301 Determine the size of the target object in the image according to the shooting information and the monitoring information.
  • the shooting information includes size information of the image
  • the monitoring information includes size information of the target object in the image.
  • After the processor 15 acquires the image of the target object captured by the shooting device 11, it determines the size information of the image and the size information of the target object in the image. Further, according to these two pieces of size information, it determines the size of the target object in the image, for example, whether the size of the target object in the image is within the preset range, or whether it exceeds the preset range.
  • Optionally, determining the size of the target object in the image according to the shooting information and the monitoring information includes: determining a size difference between the target object and the image according to the size information of the image and the size information of the target object in the image; and, if the size difference is greater than a first threshold, or the size difference is less than a second threshold, determining that the size of the target object in the image is outside the preset range.
  • 40 represents the image captured by the shooting device 11, and the processor 15 can determine the size information of the image 40 and the size information of the target object 41 in the image 40 based on the image 40 captured by the shooting device 11.
  • the shooting device 11 can also send the size information of the image 40 to the processor 15.
  • One feasible way for the processor 15 to determine the size information of the target object 41 in the image 40 is for the processor 15 to input the image 40 into a pre-trained neural network model. The neural network model may specifically be a model trained in advance on a large number of human body samples, and can be used to identify the target object 41, such as a human body, in the image 40. After the target object 41 is recognized, a target area 42 including at least a part of the target object 41 can be formed by expanding outward from a certain position in the target object 41 according to a preset rule. Further, the position information of the target object 41 in the image 40 may be output; the position information may specifically be the position information, in the image 40, of the upper-left and lower-right corners of the target area 42 including the target object 41. The size information of the target object 41 in the image 40 may specifically be the size information of the target area 42 in the image 40.
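The expansion from a recognized position to a target area can be illustrated as follows. The patent does not specify the preset rule, so the fixed-margin expansion, the margin values, and all names here are hypothetical.

```python
def expand_to_target_area(anchor, image_size, half_w=60, half_h=100):
    """Expand around a position inside the detected object to form a target area.

    anchor     : (u, v) pixel position inside the target object
    image_size : (W, H) of the image
    half_w, half_h : preset expansion margins (hypothetical rule)
    Returns the upper-left and lower-right corner positions of the target
    area, clipped to the image bounds.
    """
    u, v = anchor
    W, H = image_size
    x0, y0 = max(0, u - half_w), max(0, v - half_h)
    x1, y1 = min(W, u + half_w), min(H, v + half_h)
    return (x0, y0), (x1, y1)
```

The two returned corners are exactly the position information described above, from which the width and height of the target area 42 follow directly.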
  • the processor 15 may determine the size difference between the target object 41 and the image 40 according to the size information of the image 40 and the size information of the target object 41 in the image 40. The size difference may be the ratio, or the difference, between the size of the target object 41 in the image 40 and the size of the image 40.
  • the size information of the image may specifically be the width and/or height of the image
  • the size information of the target object in the image 40 may specifically be the width and/or height of the target area in the image.
  • H represents the height of the image 40.
  • h represents the height of the target area 42 in the image 40, that is, the height of the target object 41 in the image 40.
  • W represents the width of the image 40.
  • w represents the width of the target area 42 in the image 40, that is, the width of the target object 41 in the image 40.
  • the size difference between the target object 41 and the image 40 includes the difference between the height h of the target area 42 in the image 40 and the height H of the image 40, and/or the difference between the width w of the target area 42 in the image 40 and the width W of the image 40. Optionally, the difference between h and H is the ratio of h to H, and the difference between w and W is the ratio of w to W.
  • One feasible implementation is: if the ratio of the height h of the target area 42 in the image 40 to the height H of the image 40 is greater than a first threshold γ1, that is, h and H satisfy formula (1): h/H > γ1; or that ratio is less than a second threshold γ2, that is, h and H satisfy formula (2): h/H < γ2, where the first threshold γ1 is greater than the second threshold γ2, then it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation is: if the ratio of the width w of the target area 42 in the image 40 to the width W of the image 40 is greater than the first threshold γ1, that is, w and W satisfy formula (3): w/W > γ1; or that ratio is less than the second threshold γ2, that is, w and W satisfy formula (4): w/W < γ2, where the first threshold γ1 is greater than the second threshold γ2, then it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation is: if the ratio h/H is greater than the first threshold γ1 and the ratio w/W is greater than the first threshold γ1; or h/H is less than the second threshold γ2 and w/W is less than the second threshold γ2, where the first threshold γ1 is greater than the second threshold γ2, then it is determined that the size of the target object 41 in the image 40 is outside the preset range.
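The three implementations above (a height-ratio check, a width-ratio check, and a combined check) can be sketched in one helper. Only the comparisons come from the text; the default threshold values and the `mode` parameter are illustrative assumptions.

```python
def outside_preset_range(w, h, W, H, gamma1=0.8, gamma2=0.1, mode='height'):
    """Check whether the target is outside the preset range.

    w, h : width and height of the target area in the image
    W, H : width and height of the image
    gamma1 > gamma2 : first and second thresholds (example values)
    mode : 'height' uses h/H only, 'width' uses w/W only,
           'both' requires both ratios to exceed (or fall below) a threshold.
    """
    assert gamma1 > gamma2
    rh, rw = h / H, w / W
    if mode == 'height':
        return rh > gamma1 or rh < gamma2
    if mode == 'width':
        return rw > gamma1 or rw < gamma2
    return (rh > gamma1 and rw > gamma1) or (rh < gamma2 and rw < gamma2)
```

The absolute-difference variants described next follow the same pattern with |h - H| and |w - W| in place of the ratios (and correspondingly different threshold values).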
  • Optionally, the difference between the height h of the target area 42 in the image 40 and the height H of the image 40 is the absolute value of the difference between h and H, and the difference between the width w of the target area 42 in the image 40 and the width W of the image 40 is the absolute value of the difference between w and W.
  • determining that the size of the target object 41 in the image 40 is outside the preset range may include the following feasible implementation manners:
  • One feasible implementation is: if the absolute value of the difference between the height h of the target area 42 in the image 40 and the height H of the image 40 is greater than the first threshold, or that absolute value is less than the second threshold, where the first threshold is greater than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation is: if the absolute value of the difference between the width w of the target area 42 in the image 40 and the width W of the image 40 is greater than the first threshold, or that absolute value is less than the second threshold, where the first threshold is greater than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation is: if both absolute values, |h − H| and |w − W|, are greater than the first threshold; or both are less than the second threshold, where the first threshold is greater than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • the monitoring information includes size information of a framed area of the target object in the image, and the target object is at least partially located in the framed area.
  • 40 represents an image captured by the shooting device 11, and the processor 15 can send the image 40 to the ground control terminal 14 through the communication interface 13, and the ground control terminal 14 displays the image 40 in the display component.
  • the component may specifically be a touch screen.
  • the user can frame-select the target object 41 in the image 40 on the touch screen; for example, the user can frame the entire target object 41 on the touch screen, or frame a selected part of the target object 41. Taking the user's frame selection of a part of the target object 41 as an example, as shown in FIG., 61 represents the framed area of the target object 41 in the image 40, and at least part of the target object 41 is located in the framed area; for example, the target object 41 is a human body, and the user frames the human face, that is, the human face is located in the framed area.
  • the ground control terminal 14 further sends the position information of the framed area 61 in the image 40 to the UAV 10; after the processor 15 obtains this position information, it determines the size of the framed area 61 in the image 40, and uses that size as the monitoring information of the target object 41.
  • the size of the framed area 61 in the image 40 may specifically be the height and/or width of the framed area 61 in the image 40. As shown in FIG. 6, w1 represents the width of the frame selection area 61 and h1 represents the height of the frame selection area 61.
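Deriving the monitoring information (w1, h1) from the corner positions of the framed area can be sketched as follows; the corner-coordinate convention and the optional display-to-image scale factor are assumptions, not part of the disclosure.

```python
def framed_area_size(top_left, bottom_right, scale=1.0):
    """Derive the framed area's width w1 and height h1 from the corner
    positions sent by the ground control terminal.

    top_left, bottom_right : (x, y) corners of the framed area in pixels
    scale : factor mapping display coordinates back to original image
            coordinates (1.0 when the displayed image is unscaled) - assumption
    """
    (x0, y0), (x1, y1) = top_left, bottom_right
    w1 = (x1 - x0) * scale
    h1 = (y1 - y0) * scale
    return w1, h1
```

When the display component scales the image, applying `scale` keeps w1 and h1 comparable with the image's own width W and height H.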
  • Note that the size information of the framed area 61 need not be determined from the position information in the image 40 sent by the ground control terminal 14; other methods may be used. For example, the ground control terminal 14 may itself calculate the size information of the framed area 61 and send it to the UAV 10; the specific method is not limited here.
  • the frame selection area 61 may also be obtained by the user after clicking on the touch screen or through other interaction with the ground control terminal 14.
  • In addition, when the image 40 is displayed in the display component, its original size may be scaled proportionally to the display size of the display component, or it may be scaled disproportionately. Correspondingly, either the proportionally scaled size information or the disproportionately scaled size information may be used, which can be set according to specific needs.
  • Optionally, the framed area is a rectangular area; the size difference includes the difference between the width of the framed area and the width of the image, and/or the difference between the height of the framed area and the height of the image. That is, the size difference between the target object 41 and the image 40 may be the difference between the width of the framed area 61 and the width of the image 40, and/or the difference between the height of the framed area 61 and the height of the image 40.
  • the difference between the width of the framed area 61 and the width of the image 40 may specifically be the ratio of the width w1 of the framed area 61 to the width W of the image 40, and the difference between the height of the framed area 61 and the height of the image 40 may specifically be the ratio of the height h1 of the framed area 61 to the height H of the image 40.
  • determining that the size of the target object 41 in the image 40 is outside the preset range may include the following feasible implementation manners:
  • A feasible implementation manner is: if the ratio of the width w1 of the frame selection area 61 to the width W of the image 40 is greater than the first threshold θ 1 , or less than the second threshold θ 2 , where the first threshold θ 1 is greater than the second threshold θ 2 , it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation manner is: if the ratio of the height h1 of the framed area 61 to the height H of the image 40 is greater than the first threshold θ 1 , or less than the second threshold θ 2 , where the first threshold θ 1 is greater than the second threshold θ 2 , it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation manner is: if the ratio of the width w1 of the framed area 61 to the width W of the image 40 is greater than the first threshold θ 1 and the ratio of the height h1 of the framed area 61 to the height H of the image 40 is greater than the first threshold θ 1 ; or, the ratio of the width w1 to the width W is less than the second threshold θ 2 and the ratio of the height h1 to the height H is less than the second threshold θ 2 , where the first threshold θ 1 is greater than the second threshold θ 2 , it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • The difference between the width of the framed area 61 and the width of the image 40 may specifically be the absolute value of the difference between the width w1 of the framed area 61 and the width W of the image 40, and the difference between the height of the framed area 61 and the height of the image 40 may specifically be the absolute value of the difference between the height h1 of the framed area 61 and the height H of the image 40.
  • determining that the size of the target object 41 in the image 40 is outside the preset range may include the following feasible implementation manners:
  • A feasible implementation manner is: if the absolute value of the difference between the width w1 of the frame selection area 61 and the width W of the image 40 is greater than the first threshold, or less than the second threshold, where the first threshold is greater than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation manner is: if the absolute value of the difference between the height h1 of the framed area 61 and the height H of the image 40 is greater than the first threshold, or less than the second threshold, where the first threshold is greater than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation manner is: if the absolute value of the difference between the width w1 of the frame selection area 61 and the width W of the image 40 is greater than the first threshold and the absolute value of the difference between the height h1 of the frame selection area 61 and the height H of the image 40 is greater than the first threshold; or, the absolute value of the difference between the width w1 and the width W is less than the second threshold and the absolute value of the difference between the height h1 and the height H is less than the second threshold, where the first threshold is greater than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
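The threshold checks above can be sketched as follows. This is a minimal illustration, not code from the disclosure: the function names and the threshold values (0.7, 0.1 for ratios; 600, 100 pixels for absolute differences) are assumptions chosen only to make the example concrete.

```python
# Sketch of the width/height size check: the frame-selected area (w1, h1)
# is compared against the image (W, H). Thresholds are illustrative only.

def ratio_outside_range(w1, h1, W, H, t1=0.7, t2=0.1):
    """Ratio variant: the target is too large if both the width ratio and
    the height ratio exceed the first threshold t1, and too small if both
    fall below the second threshold t2 (t1 > t2)."""
    wr, hr = w1 / W, h1 / H
    return (wr > t1 and hr > t1) or (wr < t2 and hr < t2)

def absdiff_outside_range(w1, h1, W, H, t1=600, t2=100):
    """Absolute-difference variant: |w1 - W| and |h1 - H| are compared
    against the same pair of thresholds (t1 > t2). A large difference means
    the target is much smaller than the image; a small difference means it
    nearly fills the image."""
    dw, dh = abs(w1 - W), abs(h1 - H)
    return (dw > t1 and dh > t1) or (dw < t2 and dh < t2)

# A frame selection filling 90% x 80% of a 1000x1000 image is "too large":
print(ratio_outside_range(900, 800, 1000, 1000))  # True
```

Either variant could equally be written for width only or height only, matching the first two implementation manners described above.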
  • the shooting information includes area information of the image
  • the monitoring information includes area information of the target object in the image.
  • After the processor 15 acquires the image of the target object captured by the shooting device 11, it determines the area information of the image and the area information of the target object in the image. Further, according to the area information of the image and the area information of the target object in the image, it determines the size of the target object in the image, for example, determines whether the size of the target object in the image is within the preset range, or whether the size of the target object in the image exceeds the preset range.
  • The determining the size of the target object in the image based on the shooting information and the monitoring information includes: determining the area difference between the target object and the image according to the area information of the image and the area information of the target object in the image; and if the area difference is greater than the third threshold or less than the fourth threshold, determining that the size of the target object in the image is outside the preset range.
  • The area of the image 40 is the product of the height H of the image 40 and the width W of the image 40, and the area of the target object 41 in the image 40 is the product of the height and the width of the target area 42 including the target object 41 in the image 40.
  • The area difference between the target object 41 and the image 40 may be the ratio of the area of the target object 41 in the image 40 to the area of the image 40, or may be the absolute value of the difference between the area of the target object 41 in the image 40 and the area of the image 40. Determining that the size of the target object 41 in the image 40 is outside the preset range may include the following feasible implementation manners:
  • a feasible implementation method is: if the ratio of the area of the target object 41 in the image 40 to the area of the image 40 is greater than the third threshold, or the area of the target object 41 in the image 40 and the area of the image 40 The ratio of is smaller than the fourth threshold, where the third threshold is larger than the fourth threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation manner is: if the absolute value of the difference between the area of the target object 41 in the image 40 and the area of the image 40 is greater than the third threshold, or less than the fourth threshold, where the third threshold is greater than the fourth threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • the monitoring information includes area information of a framed area of the target object in the image, and the target object is located at least partially within the framed area.
  • As shown in FIG. 6, 61 represents the framed area of the target object 41 in the image 40, and at least part of the target object 41 is located in the framed area.
  • the processor 15 may also use the area of the framed area 61 in the image 40 as the monitoring information of the target object 41.
  • the area of the framed area 61 in the image 40 is the product of the height h1 of the framed area 61 and the width w1 of the framed area 61.
  • determining that the size of the target object 41 in the image 40 is outside the preset range may include the following feasible implementation manners:
  • A feasible implementation manner is: if the ratio of the area of the framed area 61 in the image 40 to the area of the image 40 is greater than the third threshold, or less than the fourth threshold, where the third threshold is greater than the fourth threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
  • Another feasible implementation manner is: if the absolute value of the difference between the area of the framed area 61 in the image 40 and the area of the image 40 is greater than the third threshold, or less than the fourth threshold, where the third threshold is greater than the fourth threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range.
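The area-based checks above (which apply equally to the target area 42 or the framed area 61) admit a similar sketch. The threshold values are illustrative assumptions, not values from this disclosure:

```python
# Sketch of the area check: the region w1*h1 is compared against the image
# area W*H either by ratio or by absolute difference (t3 > t4 in each case).

def area_ratio_outside_range(w1, h1, W, H, t3=0.5, t4=0.02):
    """Ratio variant: outside the preset range if the region occupies more
    than t3 or less than t4 of the image area."""
    ratio = (w1 * h1) / (W * H)
    return ratio > t3 or ratio < t4

def area_absdiff_outside_range(w1, h1, W, H, t3=900_000, t4=300_000):
    """Absolute-difference variant: |region area - image area| compared
    against the same pair of thresholds, in squared pixels."""
    diff = abs(w1 * h1 - W * H)
    return diff > t3 or diff < t4

# A 100x100 region in a 1000x1000 image occupies only 1% of it:
print(area_ratio_outside_range(100, 100, 1000, 1000))  # True
```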
  • Step S302 If the size of the target object in the image is outside the preset range, control the shooting device to stop zooming.
  • the shooting device 11 of the UAV 10 is controlled to stop zooming.
  • As the size of the target area 42 in the image 40 gradually increases, if the ratio of the height of the target area 42 in the image 40 to the height of the image 40 is greater than the first threshold, and/or the ratio of the width of the target area 42 in the image 40 to the width of the image 40 is greater than the first threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range. At this time, it is necessary to control the shooting device 11 to stop zooming, to prevent the size of the target area 42 in the image 40 from further increasing.
  • As the size of the target area 42 in the image 40 gradually decreases, if the ratio of the height of the target area 42 in the image 40 to the height of the image 40 is less than the second threshold, and/or the ratio of the width of the target area 42 in the image 40 to the width of the image 40 is less than the second threshold, it is determined that the size of the target object 41 in the image 40 is outside the preset range. At this time, it is necessary to control the shooting device 11 to stop zooming, to prevent the size of the target area 42 in the image 40 from further decreasing.
  • controlling the shooting device to stop zooming includes: controlling the shooting device to stop executing the zooming instruction received by the shooting device, the zooming instruction being used to adjust the focal length of the shooting device.
  • the ground control terminal 14 sends a zoom command to the UAV 10 through wireless communication.
  • the zoom command may include a focal length value.
  • After the processor 15 of the UAV 10 obtains the zoom command, it adjusts the current focal length of the shooting device 11 according to the focal length value in the zoom command; or, after the communication interface 13 of the UAV 10 receives the zoom command, it sends the zoom command to the shooting device 11, so that the shooting device 11 adjusts the focal length of the lens according to the zoom command; or, the zoom command can be sent directly from the ground control terminal 14 to the shooting device 11, so that the shooting device 11 adjusts the focal length of the lens according to the zoom command.
  • The shooting device 11 collects an image of the target object 41 according to the adjusted focal length and sends the image to the processor 15. If the processor 15 determines that the size of the target object 41 in the image 40 is outside the preset range, it controls the shooting device 11 to stop executing the zoom commands received by the shooting device 11; for example, the processor 15 of the UAV 10 sends a stop-zoom command to the shooting device 11 through the communication interface 13, so that the shooting device 11 stops executing received zoom commands according to the stop-zoom command.
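The control flow just described can be sketched as follows. The class and method names are hypothetical illustrations (the disclosure does not specify an API); the size check reuses the ratio test described earlier with assumed thresholds:

```python
# Sketch: on each new frame, the processor checks whether the target has
# left the preset range and, if so, tells the shooting device to stop
# executing further zoom commands. Names and thresholds are illustrative.

class ShootingDevice:
    def __init__(self):
        self.accepts_zoom = True  # device currently executes zoom commands

    def stop_zoom(self):
        # Corresponds to the stop-zoom command sent over the communication
        # interface; subsequent zoom commands are no longer executed.
        self.accepts_zoom = False

def on_new_frame(device, w1, h1, W, H, t1=0.7, t2=0.1):
    """Check the frame-selected area (w1, h1) against the image (W, H) and
    stop zooming when the target's size is outside the preset range."""
    wr, hr = w1 / W, h1 / H
    outside = (wr > t1 and hr > t1) or (wr < t2 and hr < t2)
    if outside and device.accepts_zoom:
        device.stop_zoom()
    return outside

dev = ShootingDevice()
on_new_frame(dev, 400, 300, 1000, 1000)  # within range: zooming continues
on_new_frame(dev, 900, 800, 1000, 1000)  # too large: zooming is stopped
```

In the disclosed system the stop would also be accompanied by a prompt message to the ground control terminal, as described below.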
  • the method further includes: sending a stop zooming prompt message to the control terminal, so that the control terminal prompts the user according to the stop zooming prompt message, the The control terminal is used to control the movable platform.
  • After the processor 15 controls the shooting device 11 to stop executing the zooming instruction, it can also send a stop-zooming prompt message to the ground control terminal 14, which can be used to prompt that the target object is too large or too small in the image 40, so that the ground control terminal 14 prompts the user according to the stop-zooming prompt information, and the user can stop adjusting the zoom component on the ground control terminal 14, such as a zoom ring, according to the prompt of the ground control terminal 14.
  • The size of the target object in the image is determined from the shooting information of the shooting device and the monitoring information of the target object. If the size of the target object in the image is outside the preset range, the shooting device is controlled to stop zooming, which prevents the user from adjusting the focal length of the shooting device through the ground control terminal in a way that makes the target object too large or too small in the image collected by the shooting device. The size of the target object in the image collected by the shooting device is thus controlled within a preset range, improving the accuracy of identifying the target object.
  • An embodiment of the present invention provides a control method of a shooting device.
  • FIG. 7 is a flowchart of a method for controlling a shooting device according to another embodiment of the present invention.
  • The controlling the zooming operation of the shooting device based on the shooting information and the monitoring information includes: controlling the shooting device to perform a zoom operation based on the shooting information and the monitoring information.
  • During shooting, the UAV 10 and/or the target object 41 may move, causing the distance between the UAV 10 and the target object 41, or the distance between the shooting device 11 and the target object 41, to change, which in turn changes the size of the target object 41 in the image collected by the shooting device 11. If the target object 41 is too large or too small in the image collected by the shooting device 11, it may affect the correct recognition of the target object 41 by the processor 15 of the UAV 10.
  • the shooting device 11 may be controlled to perform a zoom operation according to the shooting information of the shooting device 11 and the monitoring information of the target object 41.
  • the shooting information includes focal length information of the shooting device
  • The monitoring information includes distance information of the target object from the movable platform or the shooting device. That is, the shooting device 11 can be controlled to perform a zooming operation, for example to adjust its focal length, according to the focal length information of the shooting device 11 and the distance information of the target object 41 from the UAV 10 or the shooting device 11.
  • Specifically, the shooting device 11 may send its focal length and the images it captures to the processor 15 in real time, and the processor 15 may determine the focal length of the shooting device 11 at different moments according to the focal length sent by the shooting device 11 in real time.
  • the processor 15 can also determine the position information of the target object 41 at different times according to the images captured by the shooting device 11 at different times.
  • the position information of the target object 41 is specifically the three-dimensional coordinates of the target object 41 in the world coordinate system .
  • As shown in FIG. 8, 80 indicates an image captured by the shooting device 11 at a certain time.
  • the three-dimensional point on the target object 41 may be mapped into the image 80, and the mapping point of the three-dimensional point in the image 80 may specifically be a feature point in the image 80.
  • point A, point B, and point C are three-dimensional points on the target object 41 respectively
  • point a, point b, and point c respectively represent characteristic points in the image 80
  • point a is the mapping point of point A in the image 80
  • Point b is the mapping point of point B in image 80
  • point c is the mapping point of point C in image 80.
  • The relationship between the three-dimensional coordinates (x w , y w , z w ) of a three-dimensional point on the target object 41 in the world coordinate system and the position information, i.e., the pixel coordinates (μ, ν), of that point's mapping point in the image 80 can be obtained. The relationship is specifically shown in the following formula (5):

  z c ·[μ, ν, 1] T = K·[R T]·[x w , y w , z w , 1] T  (5)
  • z c represents the coordinate of the three-dimensional point on the Z axis of the camera coordinate system, that is, the depth information of the image 80.
  • K represents the internal parameters of the camera
  • R represents the rotation matrix of the camera
  • T represents the translation matrix of the camera.
  • According to the pixel coordinates of a point in the image 80 and K, R, T, and z c , the three-dimensional coordinates (x w , y w , z w ) of the corresponding three-dimensional point in the world coordinate system can be calculated.
  • the three-dimensional coordinates of the three-dimensional point A in the world coordinate system can be calculated according to the pixel coordinates, K, R, T, and z c of the point a in the image 80.
  • the three-dimensional coordinates of the three-dimensional point B in the world coordinate system can be calculated according to the pixel coordinates, K, R, T, and z c of the point b in the image 80.
  • the three-dimensional coordinates of the three-dimensional point C in the world coordinate system can be calculated according to the pixel coordinates, K, R, T, and z c of the point c in the image 80.
  • the three-dimensional coordinates of the target object 41 in the world coordinate system can be calculated.
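The back-projection steps above can be sketched in pure Python. The numeric values of K, R, T, and the depth z c below are illustrative assumptions (an assumed intrinsic matrix and an identity camera pose), chosen only so the example is self-contained:

```python
# Sketch of back-projecting a pixel to world coordinates per formula (5):
#   z_c * [u, v, 1]^T = K (R * X_w + T)
# so   X_w = R^{-1} (z_c * K^{-1} [u, v, 1]^T - T)

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_inv3(m):
    """3x3 matrix inverse via the adjugate / determinant."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [ (e * i - f * h), -(b * i - c * h),  (b * f - c * e)],
        [-(d * i - f * g),  (a * i - c * g), -(a * f - c * d)],
        [ (d * h - e * g), -(a * h - b * g),  (a * e - b * d)],
    ]
    return [[adj[r][s] / det for s in range(3)] for r in range(3)]

def pixel_to_world(u, v, z_c, K, R, T):
    """Recover (x_w, y_w, z_w) from pixel coordinates, depth, and K, R, T."""
    cam = [z_c * x for x in mat_vec(mat_inv3(K), [u, v, 1.0])]  # camera frame
    shifted = [cam[k] - T[k] for k in range(3)]
    return mat_vec(mat_inv3(R), shifted)

# Assumed intrinsics (focal 800 px, principal point (320, 240)), identity pose:
K = [[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
T = [0.0, 0.0, 0.0]
Xw = pixel_to_world(400.0, 300.0, 2.0, K, R, T)  # Xw ≈ [0.2, 0.15, 2.0]
```

Applying this to the feature points a, b, c yields the world coordinates of the three-dimensional points A, B, C, from which the position of the target object 41 can be derived as described above.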
  • The processor 15 can determine the three-dimensional coordinates of the target object 41 in the world coordinate system at a certain time based on the image 80 captured by the shooting device 11 at that time. Further, according to the three-dimensional coordinates of the target object 41 in the world coordinate system at that time and the positioning information of the UAV 10 at that time, the distance of the target object 41 relative to the UAV 10 at that time can be determined. Alternatively, the processor 15 may determine the distance of the target object 41 relative to the shooting device 11 at that moment based on the three-dimensional coordinates of the target object 41 in the world coordinate system at that moment, the positioning information of the UAV 10 at that moment, and the position and attitude of the shooting device 11 relative to the fuselage of the UAV 10.
  • the method includes: according to the shooting information and the monitoring information, controlling the shooting device to perform a zoom operation to maintain the size of the target object in the image collected by the shooting device.
  • When the distance between the target object 41 and the UAV 10 or the shooting device 11 increases, the shooting device 11 can be controlled to perform a zoom operation, for example, to increase the focal length of the shooting device 11, so that the size of the target object 41 in the image collected by the shooting device 11 remains unchanged or within a preset range.
  • Conversely, when that distance decreases, the shooting device 11 can be controlled to perform a zoom operation, for example, to reduce the focal length of the shooting device 11, so that the size of the target object 41 in the image collected by the shooting device 11 remains unchanged or within a preset range.
  • the controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information includes:
  • Step S701 Determine the target focal length of the shooting device according to the focal length information and the distance information.
  • the focal length information includes a historical focal length of the shooting device at a historical moment
  • The distance information includes a first distance of the target object relative to the movable platform or the shooting device at the historical moment, and a second distance of the target object relative to the movable platform or the shooting device at the current moment.
  • the processor 15 can determine the historical focal length of the shooting device 11 at a historical moment according to the focal length sent by the shooting device 11 in real time.
  • The processor 15 can determine the three-dimensional coordinates of the target object 41 in the world coordinate system at the historical moment according to the image captured by the shooting device 11 at that moment, and further determine the distance of the target object 41 relative to the UAV 10 or the shooting device 11 at the historical moment; here, this distance is recorded as the first distance.
  • The processor 15 can also determine the three-dimensional coordinates of the target object 41 in the world coordinate system at the current moment according to the image captured by the shooting device 11 at the current moment, and further determine the distance of the target object 41 relative to the UAV 10 or the shooting device 11 at the current moment; here, this distance is recorded as the second distance. Further, the target focal length of the shooting device 11 is determined according to the historical focal length of the shooting device 11 at the historical moment, the first distance of the target object 41 relative to the UAV 10 or the shooting device 11 at the historical moment, and the second distance of the target object 41 relative to the UAV 10 or the shooting device 11 at the current moment. The so-called target focal length means that the current focal length of the shooting device 11 is to be adjusted to the target focal length.
  • the determining the target focal length of the shooting device according to the focal length information and the distance information includes determining the target focal length of the shooting device based on the historical focal length, the first distance, and the second distance .
  • the historical moment is the moment when the movable platform begins to follow the target object.
  • the processor 15 may specifically be a flight controller of the UAV 10, and the flight controller may control the UAV 10 to intelligently follow the target object 41.
  • For example, the ground control terminal 14 sends an intelligent following control instruction to the UAV 10. After the flight controller obtains the intelligent following control instruction, it determines the distance of the target object 41 relative to the UAV 10 or the shooting device 11; when the distance is less than a preset distance, the flight controller sends a prompt message to the ground control terminal 14, so that the ground control terminal 14 prompts the user that the distance between the target object 41 and the UAV 10 or the shooting device 11 is too short and the target object 41 cannot be followed intelligently.
  • According to the prompt information, the user can increase the distance of the target object 41 relative to the UAV 10 or the shooting device 11 through the ground control terminal 14, and when the distance is adjusted to be greater than the preset distance, the flight controller starts controlling the UAV 10 to follow the target object 41 intelligently.
  • the size of the target object 41 in the image collected by the shooting device 11 is within a preset range.
  • Based on the focal length sent by the shooting device 11 to the flight controller in real time, the flight controller determines that, at the historical moment when the UAV 10 begins to follow the target object 41, the historical focal length of the shooting device 11 is f init .
  • The flight controller may also determine, according to the image captured by the shooting device 11 at the historical moment when the UAV 10 starts following the target object 41, that the first distance of the target object 41 relative to the UAV 10 or the shooting device 11 at that moment is d init .
  • the flight controller may also determine the second distance, d cur , of the target object 41 relative to the UAV 10 or the shooting device 11 at the current time according to the image of the target object 41 captured by the shooting device 11 at the current time.
  • the target focal length f cmd of the photographing device 11 is calculated according to the above formula (6).
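Formula (6) is referenced but not reproduced in this excerpt; a plausible form consistent with the surrounding description is f cmd = f init ·d cur /d init , since the target's projected size is proportional to focal length divided by distance, and keeping f/d constant keeps that size constant. The sketch below uses this assumed reconstruction:

```python
# Assumed reconstruction of formula (6): f_cmd = f_init * d_cur / d_init.
# Keeping f/d constant keeps the target's projected size in the image
# roughly constant as the target's distance changes.

def target_focal_length(f_init, d_init, d_cur):
    """Target focal length from the historical focal length f_init, the
    first (historical) distance d_init, and the second (current) distance
    d_cur."""
    return f_init * d_cur / d_init

# If the target moves from 10 m to 20 m away, a 24 mm historical focal
# length is driven to 48 mm to preserve the target's apparent size.
print(target_focal_length(24.0, 10.0, 20.0))  # 48.0
```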
  • the historical moment is the moment when the size of the target object in the image is within the preset range.
  • The ground control terminal 14 may send a zoom instruction to the shooting device 11 to adjust the focal length of the shooting device 11, thereby adjusting the size of the target object 41 in the image captured by the shooting device.
  • The processor 15 of the UAV 10 can record the historical moment at which the size of the target object 41 in the image is within the preset range, and calculate the target focal length f cmd of the shooting device 11 according to the above formula (6), based on the historical focal length f init of the shooting device 11 at that historical moment, the first distance d init of the target object 41 relative to the UAV 10 or the shooting device 11 at that moment, and the second distance d cur of the target object 41 relative to the UAV 10 or the shooting device 11 at the current moment.
  • The method further includes: during the shooting process of the shooting device, if a user's zoom instruction is received, adjusting the current focal length of the shooting device according to the zoom instruction, and taking the adjusted current focal length as the historical focal length.
  • During shooting, the shooting device 11 may send the images it captures to the ground control terminal 14, and the ground control terminal 14 displays the images in the display component. If the size of the target object 41 in the display component is not within the preset range, the user can send a zoom command to the UAV 10 through the ground control terminal 14. After the processor 15 of the UAV 10 obtains the zoom command, it adjusts the current focal length of the shooting device 11 according to the zoom command; for example, the processor 15 obtains a zoom instruction at time t1 and adjusts the current focal length of the shooting device 11 at time t1 according to the zoom instruction.
  • the first distance of the target object 41 with respect to the UAV 10 or the shooting device 11 at time t1 is d init .
  • Afterwards, the distance of the target object 41 relative to the UAV 10 or the shooting device 11 may change. For example, if the second distance of the target object 41 relative to the UAV 10 or the shooting device 11 at the current time t2 is d cur , the focal length to which the shooting device 11 was adjusted at time t1 can be used as the historical focal length f init , and the target focal length f cmd of the shooting device 11 can then be calculated according to the above formula (6).
  • Step S702 Control the current focal length of the shooting device to adjust to the target focal length.
  • the processor 15 of the UAV 10 can send the target focal length f cmd to the shooting device 11, and the shooting device 11 adjusts its current focal length to the target focal length f cmd , or the processor 15 may directly adjust the current focal length of the shooting device 11 to the target focal length f cmd according to the target focal length f cmd .
  • the focal length information of the shooting device and the distance information between the target object and the UAV or the shooting device are used to control the shooting device to perform a zoom operation to maintain the size of the target object in the image collected by the shooting device.
  • the size of the target object in the image collected by the shooting device is controlled within a preset range to improve the accuracy of identifying the target object.
  • An embodiment of the present invention provides a control device for a photographing device.
  • FIG. 9 is a structural diagram of a control device of a shooting device according to an embodiment of the present invention.
  • The control device 90 of the shooting device includes: a memory 91 and a processor 92. The memory 91 is used to store program code; the processor 92 calls the program code, and when the program code is executed, it is used to perform the following operations: acquiring shooting information of the shooting device and monitoring information of a target object, the target object being a shooting object of the shooting device; and, if a recognition trigger event of the target object is detected, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in the image collected by the shooting device is controlled within a preset range.
  • When the processor 92 controls the zooming operation of the shooting device according to the shooting information and the monitoring information, it is specifically used to: control the shooting device to stop zooming based on the shooting information and the monitoring information.
  • When the processor 92 controls the shooting device to stop zooming based on the shooting information and the monitoring information, it is specifically used to: determine the size of the target object in the image based on the shooting information and the monitoring information; and, if the size of the target object in the image is outside the preset range, control the shooting device to stop zooming.
  • the shooting information includes size information of the image
  • the monitoring information includes size information of the target object in the image
  • When the processor 92 determines the size of the target object in the image based on the shooting information and the monitoring information, it is specifically used to: determine the size difference between the target object and the image based on the size information of the image and the size information of the target object in the image; and, if the size difference is greater than the first threshold or less than the second threshold, determine that the size of the target object in the image is outside the preset range.
  • the monitoring information includes size information of a framed area of the target object in the image, and the target object is at least partially located in the framed area.
  • the frame selection area is a rectangular area; the size difference includes the difference between the width of the frame selection area and the width of the image, and/or the difference between the height of the frame selection area and the height of the image.
  • the shooting information includes area information of the image
  • the monitoring information includes area information of the target object in the image
  • When the processor 92 determines the size of the target object in the image based on the shooting information and the monitoring information, it is specifically used to: determine the area difference between the target object and the image based on the area information of the image and the area information of the target object in the image; and, if the area difference is greater than the third threshold or less than the fourth threshold, determine that the size of the target object in the image is outside the preset range.
  • the monitoring information includes area information of the target object in the framed area of the image, and the target object is at least partially located in the framed area.
  • when the processor 92 controls the shooting device to stop zooming, it is specifically configured to: control the shooting device to stop executing the zoom command received by the shooting device, the zoom command being used to adjust the focal length of the shooting device.
  • the control device further includes a communication interface 93; after the processor 92 controls the shooting device to stop zooming, it is further configured to: send stop-zoom prompt information to the control terminal through the communication interface 93, so that the control terminal prompts the user according to the stop-zoom prompt information, the control terminal being used to control the movable platform.
  • when the processor 92 controls the zoom operation of the shooting device according to the shooting information and the monitoring information, it is specifically configured to: control the shooting device to perform a zoom operation based on the shooting information and the monitoring information.
  • when the processor 92 controls the shooting device to perform a zoom operation according to the shooting information and the monitoring information, so that the size of the target object in the image collected by the shooting device is kept within a preset range, it is specifically configured to: control the shooting device to perform a zoom operation according to the shooting information and the monitoring information to maintain the size of the target object in the image collected by the shooting device.
  • the shooting information includes focal length information of the shooting device
  • the monitoring information includes distance information of the target object from the movable platform or the shooting device
  • when the processor 92 controls the shooting device to perform a zoom operation based on the shooting information and the monitoring information, it is specifically configured to: determine the target focal length of the shooting device based on the focal length information and the distance information; and control the current focal length of the shooting device to be adjusted to the target focal length.
  • the focal length information includes a historical focal length of the shooting device at a historical moment
  • the distance information includes a first distance of the target object relative to the movable platform or the shooting device at the historical moment, and a second distance of the target object relative to the movable platform or the shooting device at the current moment; when the processor 92 determines the target focal length of the shooting device according to the focal length information and the distance information, it is specifically configured to: determine the target focal length of the shooting device according to the historical focal length, the first distance, and the second distance.
  • the historical moment is the moment when the movable platform starts to follow the target object.
  • the historical moment is the moment when the size of the target object in the image is within the preset range.
  • the control device further includes a communication interface 93; the processor 92 is further configured to: if a user's zoom instruction is received through the communication interface 93 during shooting by the shooting device, adjust the current focal length of the shooting device according to the zoom instruction, and use the adjusted current focal length as the historical focal length.
  • the zoom operation of the shooting device is controlled according to the shooting information and the monitoring information, so that the size of the target object in the image collected by the shooting device is kept within a preset range, preventing the target object from being too large or too small in the image collected by the shooting device and improving the accuracy of identifying the target object.
  • FIG. 10 is a structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
  • the unmanned aerial vehicle 100 includes: a fuselage, a power system, a shooting device 104, and a control device 118.
  • the power system includes at least one of the following: a motor 107, a propeller 106, and an electronic governor 117.
  • the power system is installed on the fuselage to provide flight power; the shooting device 104 is used to collect images; and the specific principles and implementations of the control device 118 are the same as those of the control device described in the above embodiments and will not be repeated here.
  • the control device 118 may specifically be a flight controller.
  • the UAV 100 further includes: a sensing system 108, a communication system 110, a support device 102, and a shooting device 104, wherein the support device 102 may specifically be a pan-tilt, and the communication system 110 may specifically include The receiver is used to receive the wireless signal sent by the ground control terminal.
  • this embodiment also provides a computer-readable storage medium on which a computer program is stored, and the computer program is executed by a processor to implement the control method of the shooting device described in the above embodiment.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the units is only a division by logical function; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • the above integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium.
  • the above software functional units are stored in a storage medium and include several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute some of the steps of the methods described in the embodiments of the present invention.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disks, optical discs, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Aviation & Aerospace Engineering (AREA)

Abstract

Embodiments of the present invention provide a control method, apparatus, and device for a shooting device, and a storage medium. The method includes: acquiring shooting information of the shooting device and monitoring information of a target object, the target object being a subject photographed by the shooting device; and, if a recognition trigger event for the target object is detected, controlling a zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in an image captured by the shooting device is kept within a preset range. By acquiring the shooting information of the shooting device and the monitoring information of the target object and, when a recognition trigger event for the target object is detected, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information so that the size of the target object in the image captured by the shooting device is kept within a preset range, the embodiments of the present invention prevent the target object from being too large or too small in the image captured by the shooting device, thereby improving the accuracy of recognizing the target object.

Description

Control Method, Apparatus, and Device for a Shooting Device, and Storage Medium

Technical Field

Embodiments of the present invention relate to the field of unmanned aerial vehicles, and in particular to a control method, apparatus, and device for a shooting device, and a storage medium.

Background

Movable platforms in the prior art, such as movable robots, unmanned aerial vehicles, and handheld gimbals, carry shooting devices, and a user can control the focal length of the shooting device to adjust the size of a target object in the image captured by the shooting device.

However, if the target object is too large or too small in the image, the target object in the image cannot be accurately recognized.

Summary

Embodiments of the present invention provide a control method, apparatus, and device for a shooting device, and a storage medium, to improve the accuracy of recognizing a target object in an image.

A first aspect of the embodiments of the present invention provides a control method for a shooting device, applied to a movable platform carrying the shooting device, the method including:

acquiring shooting information of the shooting device and monitoring information of a target object, the target object being a subject photographed by the shooting device; and

if a recognition trigger event for the target object is detected, controlling a zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in an image captured by the shooting device is kept within a preset range.

A second aspect of the embodiments of the present invention provides a control apparatus for a shooting device, including a memory and a processor;

the memory is configured to store program code;

the processor invokes the program code and, when the program code is executed, performs the following operations:

acquiring shooting information of the shooting device and monitoring information of a target object, the target object being a subject photographed by the shooting device; and

if a recognition trigger event for the target object is detected, controlling a zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in an image captured by the shooting device is kept within a preset range.

A third aspect of the embodiments of the present invention provides a movable platform, including:

a fuselage;

a power system, mounted on the fuselage and configured to provide power;

a shooting device, configured to capture images; and

the control apparatus according to the second aspect.

A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium on which a computer program is stored, the computer program being executed by a processor to implement the method according to the first aspect.

According to the control method, apparatus, and device for a shooting device and the storage medium provided by these embodiments, the shooting information of the shooting device and the monitoring information of the target object are acquired, and, when a recognition trigger event for the target object is detected, the zoom operation of the shooting device is controlled according to the shooting information and the monitoring information, so that the size of the target object in the image captured by the shooting device is kept within a preset range; this prevents the target object from being too large or too small in the image captured by the shooting device, improving the accuracy of recognizing the target object.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Evidently, the drawings described below illustrate only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.

FIG. 1 is a schematic diagram of an application scenario according to an embodiment of the present invention;

FIG. 2 is a flowchart of a control method for a shooting device according to an embodiment of the present invention;

FIG. 3 is a flowchart of a control method for a shooting device according to another embodiment of the present invention;

FIG. 4 is a schematic diagram of a target object according to another embodiment of the present invention;

FIG. 5 is a schematic diagram of a target object according to another embodiment of the present invention;

FIG. 6 is a schematic diagram of a target object according to another embodiment of the present invention;

FIG. 7 is a flowchart of a control method for a shooting device according to another embodiment of the present invention;

FIG. 8 is a schematic diagram of an application scenario according to another embodiment of the present invention;

FIG. 9 is a structural diagram of a control apparatus for a shooting device according to an embodiment of the present invention;

FIG. 10 is a structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
Reference numerals:

10: unmanned aerial vehicle;    11: shooting device;       12: gimbal;

13: communication interface;    14: ground control terminal;    15: processor;

41: target object;      40: image;           42: target region;

61: framed area;      80: image;           90: control apparatus;

91: memory;        92: processor;         93: communication interface;

100: UAV;       107: motor;            106: propeller;

117: electronic speed controller;   118: control apparatus;      108: sensing system;

110: communication system;     102: support device;      104: shooting device.
Detailed Description

The technical solutions in the embodiments of the present invention are described clearly below with reference to the accompanying drawings. Evidently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.

It should be noted that when a component is referred to as being "fixed to" another component, it may be directly on the other component or an intermediate component may be present. When a component is considered to be "connected to" another component, it may be directly connected to the other component or an intermediate component may be present at the same time.

Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field of the present invention. The terms used herein in the specification of the present invention are only for the purpose of describing specific embodiments and are not intended to limit the present invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.

Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. Where no conflict arises, the following embodiments and the features in the embodiments may be combined with one another.

An embodiment of the present invention provides a control method for a shooting device. The control method is applied to a movable platform carrying the shooting device. The movable platform includes at least one of the following: an unmanned aerial vehicle, a movable robot, and a handheld gimbal. The embodiments of the present invention take an unmanned aerial vehicle as an example to introduce the control method. As shown in FIG. 1, an unmanned aerial vehicle 10 carries a shooting device 11, which is connected to the fuselage of the unmanned aerial vehicle 10 through a support component such as a gimbal 12. Images captured by the shooting device 11 can be sent to a ground control terminal 14 through a communication interface 13. Reference numeral 15 denotes a processor in the unmanned aerial vehicle 10, which may specifically be a flight controller of the unmanned aerial vehicle 10. The flight controller may be used to control the flight of the unmanned aerial vehicle 10, for example, to control the unmanned aerial vehicle 10 to intelligently follow a target object 41. The ground control terminal 14 may be used to control flight state parameters of the unmanned aerial vehicle 10, for example, its flight speed, flight altitude, and attitude angles. In some embodiments, the ground control terminal 14 may also control shooting parameters of the shooting device 11 of the unmanned aerial vehicle 10, for example, the focal length and resolution of the shooting device 11.

Specifically, the ground control terminal 14 may be provided with a zoom component for adjusting the focal length of the shooting device 11, for example a zoom ring. The ground control terminal 14 generates a zoom command according to the user's operation of the zoom ring and sends the zoom command to the shooting device 11, and the shooting device 11 zooms according to the zoom command, for example, performs optical zoom or digital zoom. However, when the user adjusts the focal length of the shooting device 11 through the ground control terminal 14, the target object 41 may become too large or too small in the image captured by the shooting device 11, so that the target object 41 in the image cannot be accurately recognized. To address this problem, the embodiments of the present invention provide a control method for a shooting device, which is introduced below with reference to specific embodiments.
FIG. 2 is a flowchart of a control method for a shooting device according to an embodiment of the present invention. As shown in FIG. 2, the method in this embodiment may include the following steps.

Step S201: acquire shooting information of the shooting device and monitoring information of a target object, the target object being a subject photographed by the shooting device.

As shown in FIG. 1, the shooting device 11 is communicatively connected to the processor 15 of the unmanned aerial vehicle 10 and can send the images it captures and/or its focal length to the processor 15 in real time. The processor 15 can acquire the shooting information of the shooting device 11 and the monitoring information of the target object 41 according to the images captured by the shooting device 11 and/or the focal length of the shooting device 11. The target object 41 is the subject photographed by the shooting device 11, for example a human body or a human face.

In some embodiments, the shooting information includes size information of the image, and the monitoring information includes size information of the target object in the image. For example, the processor 15 may determine the size information of the image captured by the shooting device 11, recognize the target object 41 in the image to determine the position information of the target object 41 in the image, and further determine the size information of the target object 41 in the image from that position information. The size information of the image may specifically be the width, height, and/or area of the image, and the size information of the target object 41 in the image may specifically be the width, height, and/or area of the target object 41 in the image.

In some other embodiments, the shooting information includes focal length information of the shooting device, and the monitoring information includes distance information between the target object and the movable platform or the shooting device. For example, the processor 15 may determine, according to the focal length sent by the shooting device 11 in real time, the historical focal length of the shooting device 11 at a certain historical moment. The processor 15 may also determine a first distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the historical moment, according to the historical image captured by the shooting device 11 at that moment, the depth information of the historical image, and the intrinsic and extrinsic parameters of the shooting device 11 when capturing that image. The processor 15 may further determine a second distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the current moment, according to the current image captured by the shooting device 11 at the current moment, the depth information of the current image, and the current intrinsic and extrinsic parameters of the shooting device 11.

Step S202: if a recognition trigger event for the target object is detected, control a zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in the image captured by the shooting device is kept within a preset range.

For example, when the unmanned aerial vehicle 10 intelligently follows the target object 41, or intelligently follows and shoots the target object 41, the followed target object 41 needs to be recognized, and the size of the target object 41 in the image captured by the shooting device 11 may affect the accuracy of this recognition; that is, if the target object 41 is too large or too small in the image captured by the shooting device 11, the unmanned aerial vehicle 10 may fail to recognize it accurately. Therefore, the size of the target object 41 in the image captured by the shooting device 11 needs to be kept within a preset range.

As one implementation, the processor 15 of the unmanned aerial vehicle 10 may detect recognition trigger events for the target object 41 in real time. For example, the processor 15 may monitor control commands sent by the ground control terminal 14 to the unmanned aerial vehicle 10 and determine whether executing a given command requires recognizing the target object 41, for example, whether the command instructs the unmanned aerial vehicle 10 to intelligently follow, or intelligently follow and shoot, the target object 41. If it does, the processor 15 may control the zoom operation of the shooting device according to the shooting information of the shooting device and the monitoring information of the target object 41 determined in the above steps, for example, control the shooting device to stop zooming, or control the shooting device to zoom, so that the size of the target object 41 in the image captured by the shooting device 11 is kept within the preset range.
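The decision logic of steps S201 and S202 can be sketched as a single control-loop iteration. This is a minimal illustration rather than the patented implementation: the function name, the tuple-based interfaces, and the threshold values are all illustrative assumptions, and only the stop-zoom branch of step S202 is shown.

```python
def zoom_control_step(image_size, target_size, recognition_triggered,
                      eps_low=0.05, eps_high=0.8):
    """One iteration of the zoom-control loop in steps S201/S202.

    image_size and target_size are (width, height) tuples measured in the
    same image; recognition_triggered is True when a recognition trigger
    event for the target object has been detected. Returns the command to
    issue for this frame.
    """
    if not recognition_triggered:
        return "no_op"            # no recognition trigger event detected
    w_ratio = target_size[0] / image_size[0]
    h_ratio = target_size[1] / image_size[1]
    if w_ratio > eps_high or h_ratio > eps_high:
        return "stop_zoom"        # target too large in the frame
    if w_ratio < eps_low or h_ratio < eps_low:
        return "stop_zoom"        # target too small in the frame
    return "no_op"                # size within the preset range
```

In this sketch a "stop_zoom" result would be forwarded to the shooting device as the stop-zoom command described later in this embodiment.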
In this embodiment, by acquiring the shooting information of the shooting device and the monitoring information of the target object and, when a recognition trigger event for the target object is detected, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information so that the size of the target object in the image captured by the shooting device is kept within a preset range, the target object is prevented from being too large or too small in the image captured by the shooting device, improving the accuracy of recognizing the target object.
An embodiment of the present invention provides a control method for a shooting device. FIG. 3 is a flowchart of a control method for a shooting device according to another embodiment of the present invention. As shown in FIG. 3, on the basis of the embodiment shown in FIG. 2, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information includes: controlling the shooting device to stop zooming according to the shooting information and the monitoring information.

As shown in FIG. 1, reference numeral 41 denotes the target object 41 followed by the unmanned aerial vehicle 10. When the unmanned aerial vehicle 10 flies in intelligent follow mode, the distance between the unmanned aerial vehicle 10 and the target object 41 may be fixed. The user can adjust the focal length of the shooting device 11 of the unmanned aerial vehicle 10 through the ground control terminal 14, so as to adjust the size of the target object 41 in the image captured by the shooting device 11. For example, the user operates a zoom component on the ground control terminal 14, such as a zoom ring. The ground control terminal 14 generates a zoom command according to the user's operation of the zoom ring and sends it to the unmanned aerial vehicle 10 by wireless communication; the zoom command may include a focal length value. After receiving the zoom command, the unmanned aerial vehicle 10 adjusts the current focal length of the shooting device 11 according to the focal length value in the command; or, after the communication interface 13 of the unmanned aerial vehicle 10 receives the zoom command, it forwards the command to the shooting device 11 so that the shooting device 11 adjusts the focal length of its lens accordingly, thereby adjusting the size of the target object 41 in the image; or, the zoom command may be sent by the ground control terminal 14 directly to the shooting device 11 so that the shooting device 11 adjusts the focal length of its lens according to the command. The shooting device 11 captures an image of the target object 41 at the adjusted focal length and sends the image to the processor 15, and the processor 15 can determine, from the image of the target object 41 captured by the shooting device 11, the size of the image and the size of the target object 41 in the image. If the size of the target object 41 in the image exceeds the preset range, the processor 15 can control the shooting device 11 to stop zooming, to prevent the target object 41 from being too large or too small in the image and thereby affecting the accurate recognition of the target object 41 by the processor 15.

Optionally, controlling the shooting device to stop zooming according to the shooting information and the monitoring information includes:

Step S301: determine the size of the target object in the image according to the shooting information and the monitoring information.

In some embodiments, the shooting information includes size information of the image, and the monitoring information includes size information of the target object in the image. After acquiring the image of the target object captured by the shooting device 11, the processor 15 determines the size information of the image and the size information of the target object in the image. Further, according to these, it determines the size of the target object in the image, for example, whether that size is within the preset range or exceeds it.

Optionally, determining the size of the target object in the image according to the shooting information and the monitoring information includes: determining the size difference between the target object and the image according to the size information of the image and the size information of the target object in the image; and if the size difference is greater than a first threshold or smaller than a second threshold, determining that the size of the target object in the image is outside the preset range.

As shown in FIG. 4, reference numeral 40 denotes an image captured by the shooting device 11. From the image 40, the processor 15 can determine the size information of the image 40 and the size information of the target object 41 in the image 40; in some embodiments, the shooting device 11 may also send the size information of the image 40 to the processor 15. One way for the processor 15 to determine the size information of the target object 41 in the image 40 is as follows: the processor 15 inputs the image 40 into a pre-trained neural network model, which may specifically be a model trained on a large number of human body samples and can be used to recognize the target object 41 in the image 40, for example a human body. After the target object 41 is recognized, a certain position in the target object 41 can be expanded outward according to a preset rule to form a target region 42 that includes at least part of the target object 41. Further, the position information of the target object 41 in the image 40 can be output; this position information may specifically be the positions of the upper-left and lower-right corners of the target region 42 containing the target object 41 in the image 40. The size information of the target object 41 in the image 40 may specifically be the size information of the target region 42 in the image 40. From the size information of the image 40 and the size information of the target object 41 in the image 40, the processor 15 can determine the size difference between the target object 41 and the image 40; the size difference may be the ratio or the difference between the size of the target object 41 in the image 40 and the size of the image 40.

In some embodiments, the size information of the image may specifically be the width and/or height of the image, and the size information of the target object in the image 40 may specifically be the width and/or height of the target region in the image.

As shown in FIG. 5, H denotes the height of the image 40; h denotes the height of the target region 42 in the image 40, that is, the height of the target object 41 in the image 40; W denotes the width of the image 40; and w denotes the width of the target region 42 in the image 40, that is, the width of the target object 41 in the image 40.

The size difference between the target object 41 and the image 40 includes the difference between the height h of the target region 42 in the image 40 and the height H of the image 40, and/or the difference between the width w of the target region 42 in the image 40 and the width W of the image 40.

In some embodiments, the difference between the height h of the target region 42 in the image 40 and the height H of the image 40 is the ratio of h to H, and the difference between the width w of the target region 42 in the image 40 and the width W of the image 40 is the ratio of w to W. In this case, determining that the size of the target object 41 in the image 40 is outside the preset range may be implemented in any of the following feasible ways:

One feasible implementation is: if the ratio of the height h of the target region 42 in the image 40 to the height H of the image 40 is greater than a first threshold ε₁, that is, h and H satisfy condition (1) below, or the ratio of h to H is smaller than a second threshold ε₂, that is, h and H satisfy condition (2) below, where the first threshold ε₁ is greater than the second threshold ε₂, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

$$\frac{h}{H} > \varepsilon_1 \tag{1}$$

$$\frac{h}{H} < \varepsilon_2 \tag{2}$$

Another feasible implementation is: if the ratio of the width w of the target region 42 in the image 40 to the width W of the image 40 is greater than the first threshold ε₁, that is, w and W satisfy condition (3) below, or the ratio of w to W is smaller than the second threshold ε₂, that is, w and W satisfy condition (4) below, where the first threshold ε₁ is greater than the second threshold ε₂, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

$$\frac{w}{W} > \varepsilon_1 \tag{3}$$

$$\frac{w}{W} < \varepsilon_2 \tag{4}$$

Yet another feasible implementation is: if the ratio of the height h of the target region 42 in the image 40 to the height H of the image 40 is greater than the first threshold ε₁ and the ratio of the width w of the target region 42 in the image 40 to the width W of the image 40 is greater than the first threshold ε₁; or if the ratio of h to H is smaller than the second threshold ε₂ and the ratio of w to W is smaller than the second threshold ε₂, where the first threshold ε₁ is greater than the second threshold ε₂, then the size of the target object 41 in the image 40 is determined to be outside the preset range.
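The ratio checks of formulas (1) to (4) can be sketched as a small predicate. This is a minimal sketch of the "or" variant (either dimension out of range triggers), with the function name and interface assumed for illustration:

```python
def size_out_of_range(h, H, w, W, eps1, eps2):
    """Return True when the target region (width w, height h) is outside the
    preset range relative to the image (width W, height H), per formulas
    (1)-(4): the height ratio h/H or the width ratio w/W exceeds the first
    threshold eps1 or falls below the second threshold eps2 (eps1 > eps2)."""
    assert eps1 > eps2, "the first threshold must exceed the second"
    return (h / H > eps1 or h / H < eps2 or
            w / W > eps1 or w / W < eps2)
```

The "and" variant described above would simply require both the height and width ratios to cross the same threshold before declaring the size out of range.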
In some other embodiments, the difference between the height h of the target region 42 in the image 40 and the height H of the image 40 is the absolute value of the difference between h and H, and the difference between the width w of the target region 42 in the image 40 and the width W of the image 40 is the absolute value of the difference between w and W. In this case, determining that the size of the target object 41 in the image 40 is outside the preset range may be implemented in any of the following feasible ways:

One feasible implementation is: if the absolute value of the difference between the height h of the target region 42 in the image 40 and the height H of the image 40 is greater than a first threshold, or smaller than a second threshold, where the first threshold is greater than the second threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Another feasible implementation is: if the absolute value of the difference between the width w of the target region 42 in the image 40 and the width W of the image 40 is greater than the first threshold, or smaller than the second threshold, where the first threshold is greater than the second threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Yet another feasible implementation is: if the absolute value of the difference between the height h of the target region 42 in the image 40 and the height H of the image 40 is greater than the first threshold and the absolute value of the difference between the width w of the target region 42 in the image 40 and the width W of the image 40 is greater than the first threshold; or if both absolute differences are smaller than the second threshold, where the first threshold is greater than the second threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

In some other embodiments, the monitoring information includes size information of a framed area of the target object in the image, and the target object is at least partially located within the framed area.

As shown in FIG. 6, reference numeral 40 denotes an image captured by the shooting device 11. The processor 15 can send the image 40 to the ground control terminal 14 through the communication interface 13, and the ground control terminal 14 displays the image 40 on a display component, which may specifically be a touch screen. The user can frame-select the target object 41 in the image 40 on the touch screen; for example, the user can frame the whole of the target object 41 or only part of it. Taking the case where the user frames part of the target object 41 as an example, as shown in FIG. 6, reference numeral 61 denotes the framed area of the target object 41 in the image 40, and at least part of the target object 41 is located within the framed area; for example, the target object 41 is a human body and what the user frames is its face, that is, the face is located within the framed area. The ground control terminal 14 further sends the position information of the framed area 61 in the image 40 to the unmanned aerial vehicle 10. After acquiring this position information, the processor 15 determines the size of the framed area 61 in the image 40 and uses that size as the monitoring information of the target object 41. The size of the framed area 61 in the image 40 may specifically be its height and/or width in the image 40. As shown in FIG. 6, w1 denotes the width of the framed area 61 and h1 denotes its height.

It can be understood that after the user performs the frame-selection operation on the ground control terminal 14, the size information of the framed area 61 need not be determined from the position information sent by the ground control terminal 14; other approaches may also be used, for example, the ground control terminal 14 may compute the size information of the framed area 61 and send it to the unmanned aerial vehicle 10. The specific approach is not limited here.

It should be noted that in the embodiment shown in FIG. 6, the framed area 61 may also be obtained after the user taps on the touch screen or through other interactions with the ground control terminal 14. In addition, the original size of the image 40 shown on the display component may or may not be scaled proportionally to the display size of the display component; when using the size information of the image 40, either the proportionally scaled or the non-proportionally scaled information may be used, as needed.

Optionally, the framed area is a rectangular area, and the size difference includes the difference between the width of the framed area and the width of the image, and/or the difference between the height of the framed area and the height of the image. As shown in FIG. 6, the framed area 61 is rectangular, and the size difference between the target object 41 and the image 40 may be the difference between the width of the framed area 61 and the width of the image 40, and/or the difference between the height of the framed area 61 and the height of the image 40.

In some embodiments, the difference between the width of the framed area 61 and the width of the image 40 may specifically be the ratio of the width w1 of the framed area 61 to the width W of the image 40, and the difference between the heights may specifically be the ratio of the height h1 of the framed area 61 to the height H of the image 40. In this case, determining that the size of the target object 41 in the image 40 is outside the preset range may be implemented in any of the following feasible ways:

One feasible implementation is: if the ratio of the width w1 of the framed area 61 to the width W of the image 40 is greater than the first threshold ε₁, or smaller than the second threshold ε₂, where the first threshold ε₁ is greater than the second threshold ε₂, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Another feasible implementation is: if the ratio of the height h1 of the framed area 61 to the height H of the image 40 is greater than the first threshold ε₁, or smaller than the second threshold ε₂, where the first threshold ε₁ is greater than the second threshold ε₂, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Yet another feasible implementation is: if the ratio of the width w1 of the framed area 61 to the width W of the image 40 is greater than the first threshold ε₁ and the ratio of the height h1 of the framed area 61 to the height H of the image 40 is greater than the first threshold ε₁; or if both ratios are smaller than the second threshold ε₂, where the first threshold ε₁ is greater than the second threshold ε₂, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

In some other embodiments, the difference between the width of the framed area 61 and the width of the image 40 may specifically be the absolute value of the difference between the width w1 of the framed area 61 and the width W of the image 40, and the difference between the heights may specifically be the absolute value of the difference between the height h1 of the framed area 61 and the height H of the image 40. In this case, determining that the size of the target object 41 in the image 40 is outside the preset range may be implemented in any of the following feasible ways:

One feasible implementation is: if the absolute value of the difference between the width w1 of the framed area 61 and the width W of the image 40 is greater than the first threshold, or smaller than the second threshold, where the first threshold is greater than the second threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Another feasible implementation is: if the absolute value of the difference between the height h1 of the framed area 61 and the height H of the image 40 is greater than the first threshold, or smaller than the second threshold, where the first threshold is greater than the second threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Yet another feasible implementation is: if the absolute value of the difference between the width w1 of the framed area 61 and the width W of the image 40 is greater than the first threshold and the absolute value of the difference between the height h1 of the framed area 61 and the height H of the image 40 is greater than the first threshold; or if both absolute differences are smaller than the second threshold, where the first threshold is greater than the second threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

In some other embodiments, the shooting information includes area information of the image, and the monitoring information includes area information of the target object in the image. After acquiring the image of the target object captured by the shooting device 11, the processor 15 determines the area information of the image and the area information of the target object in the image. Further, according to these, it determines the size of the target object in the image, for example, whether that size is within the preset range or exceeds it.

Optionally, determining the size of the target object in the image according to the shooting information and the monitoring information includes: determining the area difference between the target object and the image according to the area information of the image and the area information of the target object in the image; and if the area difference is greater than a third threshold or smaller than a fourth threshold, determining that the size of the target object in the image is outside the preset range.

As shown in FIG. 5, the area of the image 40 is the product of its height H and its width W, and the area of the target object 41 in the image 40 is the product of the height h and the width w of the target region 42 containing the target object 41. The area difference between the target object 41 and the image 40 may be the ratio of the area of the target object 41 in the image 40 to the area of the image 40, or the absolute value of the difference between those two areas. In this case, determining that the size of the target object 41 in the image 40 is outside the preset range may be implemented in any of the following feasible ways:

One feasible implementation is: if the ratio of the area of the target object 41 in the image 40 to the area of the image 40 is greater than a third threshold, or smaller than a fourth threshold, where the third threshold is greater than the fourth threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Another feasible implementation is: if the absolute value of the difference between the area of the target object 41 in the image 40 and the area of the image 40 is greater than the third threshold, or smaller than the fourth threshold, where the third threshold is greater than the fourth threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.
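The area-based ratio check can be sketched in the same way as the width/height checks. This is a minimal illustration with assumed names; the threshold values shown in the test are examples, not values from the embodiment:

```python
def area_out_of_range(target_area, image_area, eps3, eps4):
    """Area-based variant: the target is out of range when the ratio of its
    area to the image area exceeds the third threshold eps3 or falls below
    the fourth threshold eps4 (eps3 > eps4)."""
    assert eps3 > eps4, "the third threshold must exceed the fourth"
    ratio = target_area / image_area
    return ratio > eps3 or ratio < eps4
```

The absolute-difference variant described above would compare `abs(target_area - image_area)` against the two thresholds instead of the ratio.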
In some other embodiments, the monitoring information includes area information of the framed area of the target object in the image, and the target object is at least partially located within the framed area.

As shown in FIG. 6, reference numeral 61 denotes the framed area of the target object 41 in the image 40, and at least part of the target object 41 is located within the framed area. The processor 15 may also use the area of the framed area 61 in the image 40 as the monitoring information of the target object 41. The area of the framed area 61 in the image 40 is the product of the height h1 and the width w1 of the framed area 61. In this case, determining that the size of the target object 41 in the image 40 is outside the preset range may be implemented in any of the following feasible ways:

One feasible implementation is: if the ratio of the area of the framed area 61 in the image 40 to the area of the image 40 is greater than a third threshold, or smaller than a fourth threshold, where the third threshold is greater than the fourth threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Another feasible implementation is: if the absolute value of the difference between the area of the framed area 61 in the image 40 and the area of the image 40 is greater than the third threshold, or smaller than the fourth threshold, where the third threshold is greater than the fourth threshold, then the size of the target object 41 in the image 40 is determined to be outside the preset range.

Step S302: if the size of the target object in the image is outside the preset range, control the shooting device to stop zooming.

According to any of the methods in the above steps, if it is determined that the size of the target object 41 in the image 40 is outside the preset range, the shooting device 11 of the unmanned aerial vehicle 10 is controlled to stop zooming.

For example, when the user increases the focal length of the shooting device 11 through the ground control terminal 14, the size of the target region 42 in the image 40 gradually increases. If the ratio of the height of the target region 42 in the image 40 to the height of the image 40 is greater than the first threshold, and/or the ratio of the width of the target region 42 in the image 40 to the width of the image 40 is greater than the first threshold, the size of the target object 41 in the image 40 is determined to be outside the preset range; at this point, the shooting device 11 needs to be controlled to stop zooming, to prevent the target region 42 from growing further in the image 40. Similarly, when the user decreases the focal length of the shooting device 11 through the ground control terminal 14, the size of the target region 42 in the image 40 gradually decreases. If the ratio of the height of the target region 42 in the image 40 to the height of the image 40 is smaller than the second threshold, and/or the ratio of the width of the target region 42 in the image 40 to the width of the image 40 is smaller than the second threshold, the size of the target object 41 in the image 40 is determined to be outside the preset range; at this point, the shooting device 11 needs to be controlled to stop zooming, to prevent the target region 42 from shrinking further in the image 40.

In some embodiments, controlling the shooting device to stop zooming includes: controlling the shooting device to stop executing a zoom command received by the shooting device, the zoom command being used to adjust the focal length of the shooting device.

For example, the ground control terminal 14 sends a zoom command to the unmanned aerial vehicle 10 by wireless communication; the zoom command may include a focal length value. After acquiring the zoom command, the processor 15 of the unmanned aerial vehicle 10 adjusts the current focal length of the shooting device 11 according to the focal length value in the command; or, after the communication interface 13 of the unmanned aerial vehicle 10 receives the zoom command, it forwards the command to the shooting device 11 so that the shooting device 11 adjusts the focal length of its lens accordingly; or, the zoom command may be sent by the ground control terminal 14 directly to the shooting device 11 so that the shooting device 11 adjusts the focal length of its lens according to the command. The shooting device 11 captures an image of the target object 41 at the adjusted focal length and sends the image to the processor 15. If the processor 15 determines that the size of the target object 41 in the image 40 is outside the preset range, it controls the shooting device 11 to stop executing the zoom command the shooting device 11 received; for example, the processor 15 of the unmanned aerial vehicle 10 sends a stop-zoom command to the shooting device 11 through the communication interface 13, so that the shooting device 11 stops executing the received zoom command according to the stop-zoom command.

In some other embodiments, after controlling the shooting device to stop zooming, the method further includes: sending stop-zoom prompt information to a control terminal, so that the control terminal prompts the user according to the stop-zoom prompt information, the control terminal being used to control the movable platform. To improve the user experience, after the processor 15 controls the shooting device 11 to stop executing the zoom command, it may also send stop-zoom prompt information to the ground control terminal 14; the prompt information may be used to indicate that the target object is too large or too small in the image 40, so that the ground control terminal 14 prompts the user accordingly, and the user can, following the prompt, stop operating the zoom component on the ground control terminal 14, for example the zoom ring.

In this embodiment, the size of the target object in the image is determined from the shooting information of the shooting device and the monitoring information of the target object, and if that size is outside the preset range, the shooting device is controlled to stop zooming. This prevents the user's focal length adjustments through the ground control terminal from making the target object too large or too small in the image captured by the shooting device, so that the size of the target object in the captured image is kept within the preset range, improving the accuracy of recognizing the target object.
An embodiment of the present invention provides a control method for a shooting device. FIG. 7 is a flowchart of a control method for a shooting device according to another embodiment of the present invention. As shown in FIG. 7, on the basis of the above embodiments, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information includes: controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information.

As shown in FIG. 8, when the shooting device 11 of the unmanned aerial vehicle 10 intelligently follows and shoots the target object 41, the unmanned aerial vehicle 10 and/or the target object 41 may move, so that the distance between the unmanned aerial vehicle 10 and the target object 41, or between the shooting device 11 and the target object 41, changes, which in turn changes the size of the target object 41 in the image captured by the shooting device 11. If the target object 41 is too large or too small in that image, the processor 15 of the unmanned aerial vehicle 10 may fail to correctly recognize the target object 41. If the processor 15 cannot accurately recognize the target object 41 in the image, it may be unable to accurately determine the position information and/or velocity information of the target object 41, and thus unable to control the shooting device 11 to intelligently follow and shoot the target object 41. To address this problem, in this embodiment the shooting device 11 may be controlled to perform a zoom operation according to the shooting information of the shooting device 11 and the monitoring information of the target object 41. Optionally, the shooting information includes focal length information of the shooting device, and the monitoring information includes distance information between the target object and the movable platform or the shooting device. That is, the shooting device 11 may be controlled to perform a zoom operation, for example to adjust its focal length, according to the focal length information of the shooting device 11 and the distance information between the target object 41 and the unmanned aerial vehicle 10 or the shooting device 11.

As shown in FIG. 8, the shooting device 11 can send its focal length and the images it captures to the processor 15 in real time, and the processor 15 can determine the focal length of the shooting device 11 at different moments from the focal lengths sent in real time. In addition, the processor 15 can determine the position information of the target object 41 at different moments from the images captured by the shooting device 11 at those moments; the position information of the target object 41 is specifically its three-dimensional coordinates in the world coordinate system. As shown in FIG. 8, reference numeral 80 denotes an image captured by the shooting device 11 at a certain moment. It can be understood that three-dimensional points on the target object 41 can be mapped into the image 80, and the mapped points may specifically be feature points in the image 80. For example, points A, B, and C are three-dimensional points on the target object 41, and points a, b, and c are feature points in the image 80: point a is the mapping of point A into the image 80, point b the mapping of point B, and point c the mapping of point C. This is only an illustrative description and does not limit the mapping of three-dimensional points on the target object 41 into the image 80. According to the transformation between the world coordinate system and the pixel coordinate system, the relationship between the three-dimensional coordinates (x_w, y_w, z_w) of a three-dimensional point on the target object 41 in the world coordinate system and the position information of the mapped point in the image 80, for example its pixel coordinates (μ, υ), can be obtained, as shown in formula (5):

$$z_c \begin{bmatrix} \mu \\ \upsilon \\ 1 \end{bmatrix} = K \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \tag{5}$$

Here, z_c denotes the coordinate of the three-dimensional point along the Z axis of the camera coordinate system, that is, the depth information of the image 80; K denotes the camera intrinsic parameters, R the camera rotation matrix, and T the camera translation matrix. From (μ, υ), K, R, T, and z_c, the three-dimensional coordinates (x_w, y_w, z_w) of the point in the world coordinate system can be computed. Specifically, from the pixel coordinates of point a in the image 80 together with K, R, T, and z_c, the three-dimensional coordinates of point A in the world coordinate system can be computed; likewise, the coordinates of point B can be computed from point b, and those of point C from point c. Further, from the three-dimensional coordinates of points A, B, and C in the world coordinate system, the three-dimensional coordinates of the target object 41 in the world coordinate system can be computed.
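The inversion of formula (5) can be sketched with NumPy. This is a minimal illustration under the usual pinhole-camera assumptions (R orthonormal, T a 3-vector); the function names are assumed, not taken from the embodiment:

```python
import numpy as np

def world_to_pixel(X_w, K, R, T):
    """Forward direction of formula (5): project a world point X_w to pixel
    coordinates, returning ((u, v), z_c)."""
    p_cam = R @ X_w + T          # world -> camera frame
    uvw = K @ p_cam              # camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2], uvw[2]

def pixel_to_world(u, v, z_c, K, R, T):
    """Invert formula (5): recover the world coordinates (x_w, y_w, z_w) of a
    point from its pixel coordinates (u, v) and depth z_c, given the camera
    intrinsics K, rotation matrix R, and translation vector T."""
    p_cam = z_c * np.linalg.inv(K) @ np.array([u, v, 1.0])  # camera-frame point
    return R.T @ (p_cam - T)                                # back to world frame
```

Projecting a point forward and then back-projecting it with its depth recovers the original world coordinates, which is the computation the processor 15 performs for points a, b, and c.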
According to formula (5), the processor 15 can determine, from the image 80 captured by the shooting device 11 at a certain moment, the three-dimensional coordinates of the target object 41 in the world coordinate system at that moment; further, from those coordinates and the positioning information of the unmanned aerial vehicle 10 at that moment, the distance of the target object 41 relative to the unmanned aerial vehicle 10 at that moment can be determined. Alternatively, the processor 15 can determine the distance of the target object 41 relative to the shooting device 11 at that moment from the three-dimensional coordinates of the target object 41 in the world coordinate system at that moment, the positioning information of the unmanned aerial vehicle 10 at that moment, and the position and attitude of the shooting device 11 relative to the fuselage of the unmanned aerial vehicle 10.

Optionally, controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information, so that the size of the target object in the image captured by the shooting device is kept within a preset range, includes: controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information, to maintain the size of the target object in the image captured by the shooting device.

For example, when the distance of the unmanned aerial vehicle 10 or the shooting device 11 relative to the target object 41 increases, the target object 41 may become smaller in the image captured by the shooting device 11; at this point, the shooting device 11 can be controlled to perform a zoom operation, for example to increase its focal length, so that the size of the target object 41 in the captured image remains unchanged or remains within the preset range. When the distance of the unmanned aerial vehicle 10 or the shooting device 11 relative to the target object 41 decreases, the target object 41 may become larger in the captured image; at this point, the shooting device 11 can be controlled to perform a zoom operation, for example to decrease its focal length, so that the size of the target object 41 in the captured image remains unchanged or remains within the preset range.

In some embodiments, controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information includes:

Step S701: determine a target focal length of the shooting device according to the focal length information and the distance information.

The focal length information includes a historical focal length of the shooting device at a historical moment, and the distance information includes a first distance of the target object relative to the movable platform or the shooting device at the historical moment, and a second distance of the target object relative to the movable platform or the shooting device at the current moment.

For example, the processor 15 can determine the historical focal length of the shooting device 11 at a historical moment from the focal lengths sent in real time by the shooting device 11. From the image captured by the shooting device 11 at that historical moment, the processor 15 can determine the three-dimensional coordinates of the target object 41 in the world coordinate system at that moment, and further determine the distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at that moment, recorded here as the first distance. In addition, the processor 15 can determine, from the image captured at the current moment, the three-dimensional coordinates of the target object 41 in the world coordinate system at the current moment, and further the distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the current moment, recorded here as the second distance. Further, the target focal length of the shooting device 11 is determined from the historical focal length of the shooting device 11 at the historical moment, the first distance, and the second distance; the so-called target focal length is the value to which the current focal length of the shooting device 11 is to be adjusted.

Determining the target focal length of the shooting device according to the focal length information and the distance information includes: determining the target focal length of the shooting device according to the historical focal length, the first distance, and the second distance.

For example, denote the historical focal length of the shooting device 11 at the historical moment by f_init, the first distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the historical moment by d_init, and the second distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the current moment by d_cur. The target focal length f_cmd of the shooting device 11 can be computed by formula (6):

$$f_{cmd} = f_{init} \cdot \frac{d_{cur}}{d_{init}} \tag{6}$$
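Formula (6) is a one-line computation; the sketch below is illustrative only, with the function name assumed:

```python
def target_focal_length(f_init, d_init, d_cur):
    """Formula (6): scale the historical focal length f_init by the ratio of
    the current distance d_cur to the historical distance d_init, so that the
    target object keeps the same apparent size in the image."""
    return f_init * d_cur / d_init
```

Doubling the distance to the target doubles the commanded focal length, and halving the distance halves it, which is exactly the maintain-apparent-size behavior described above.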
In some embodiments, the historical moment is the moment when the movable platform starts to follow the target object. As shown in FIG. 1, the processor 15 may specifically be the flight controller of the unmanned aerial vehicle 10, which can control the unmanned aerial vehicle 10 to intelligently follow the target object 41. Specifically, the ground control terminal 14 sends an intelligent-follow control command to the unmanned aerial vehicle 10. After acquiring the command, the flight controller determines the distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11. When this distance is smaller than a preset distance, the flight controller sends prompt information to the ground control terminal 14 so that the ground control terminal 14 prompts the user that the target object 41 is too close to the unmanned aerial vehicle 10 or the shooting device 11 to be intelligently followed. According to the prompt, the user can use the ground control terminal 14 to increase the distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11, and once the distance is adjusted to be greater than the preset distance, the flight controller starts controlling the unmanned aerial vehicle 10 to intelligently follow the target object 41. Optionally, at the historical moment when the unmanned aerial vehicle 10 starts to follow the target object 41, the size of the target object 41 in the image captured by the shooting device 11 is within the preset range. From the focal lengths sent in real time by the shooting device 11, the flight controller determines the historical focal length of the shooting device 11 at the historical moment when the unmanned aerial vehicle 10 starts to follow the target object 41, namely f_init. From the image captured by the shooting device 11 at that historical moment, the flight controller can also determine the first distance d_init of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at that moment. In addition, from the image of the target object 41 captured by the shooting device 11 at the current moment, the flight controller can determine the second distance d_cur of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the current moment. Further, the target focal length f_cmd of the shooting device 11 is computed by formula (6).

In other embodiments, the historical moment is a moment when the size of the target object in the image is within the preset range. For example, while the unmanned aerial vehicle 10 intelligently follows the target object 41, the ground control terminal 14 may send a zoom command to the shooting device 11 to adjust its focal length and thereby the size of the target object 41 in the image captured by the shooting device 11. The processor 15 of the unmanned aerial vehicle 10 may record a historical moment at which the size of the target object 41 in the image is within the preset range, and compute the target focal length f_cmd of the shooting device 11 by formula (6) from the historical focal length f_init of the shooting device 11 at that historical moment, the first distance d_init of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at that historical moment, and the second distance d_cur of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the current moment.

In some other embodiments, the method further includes: during shooting by the shooting device, if a zoom command from the user is received, adjusting the current focal length of the shooting device according to the zoom command and using the adjusted current focal length as the historical focal length.

For example, while the shooting device 11 is shooting the target object 41, it can send the captured images to the ground control terminal 14, which displays them on a display component. If the size of the target object 41 on the display component is not within the preset range, the user can send a zoom command to the unmanned aerial vehicle 10 through the ground control terminal 14. After acquiring the zoom command, the processor 15 of the unmanned aerial vehicle 10 adjusts the current focal length of the shooting device 11 according to the command; for example, the processor 15 acquires the zoom command at moment t1 and adjusts the current focal length of the shooting device 11 at t1 accordingly. The first distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at t1 is d_init. After t1, the distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 may change; for example, if the second distance of the target object 41 relative to the unmanned aerial vehicle 10 or the shooting device 11 at the current moment t2 is d_cur, then the current focal length of the shooting device 11 as adjusted at t1 can be used as the historical focal length f_init, and the target focal length f_cmd of the shooting device 11 is computed by formula (6).

Step S702: control the current focal length of the shooting device to be adjusted to the target focal length.

After the target focal length f_cmd of the shooting device 11 is computed in the above steps, the processor 15 of the unmanned aerial vehicle 10 can send f_cmd to the shooting device 11, which then adjusts its current focal length to the target focal length f_cmd; or, the processor 15 can directly adjust the current focal length of the shooting device 11 to the target focal length f_cmd.

In this embodiment, the shooting device is controlled to perform a zoom operation according to the focal length information of the shooting device and the distance information between the target object and the unmanned aerial vehicle or the shooting device, to maintain the size of the target object in the image captured by the shooting device, so that this size is kept within the preset range, improving the accuracy of recognizing the target object.
An embodiment of the present invention provides a control apparatus for a shooting device. FIG. 9 is a structural diagram of a control apparatus for a shooting device according to an embodiment of the present invention. As shown in FIG. 9, the control apparatus 90 for a shooting device includes a memory 91 and a processor 92. The memory 91 is configured to store program code; the processor 92 invokes the program code and, when the program code is executed, performs the following operations: acquiring shooting information of the shooting device and monitoring information of a target object, the target object being a subject photographed by the shooting device; and, if a recognition trigger event for the target object is detected, controlling a zoom operation of the shooting device according to the shooting information and the monitoring information, so that the size of the target object in the image captured by the shooting device is kept within a preset range.

Optionally, when controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, the processor 92 is specifically configured to: control the shooting device to stop zooming according to the shooting information and the monitoring information.

Optionally, when controlling the shooting device to stop zooming according to the shooting information and the monitoring information, the processor 92 is specifically configured to: determine the size of the target object in the image according to the shooting information and the monitoring information; and if the size of the target object in the image is outside the preset range, control the shooting device to stop zooming.

Optionally, the shooting information includes size information of the image, and the monitoring information includes size information of the target object in the image; when determining the size of the target object in the image according to the shooting information and the monitoring information, the processor 92 is specifically configured to: determine the size difference between the target object and the image according to the size information of the image and the size information of the target object in the image; and if the size difference is greater than a first threshold or smaller than a second threshold, determine that the size of the target object in the image is outside the preset range.

Optionally, the monitoring information includes size information of a framed area of the target object in the image, and the target object is at least partially located within the framed area.

Optionally, the framed area is a rectangular area; the size difference includes the difference between the width of the framed area and the width of the image, and/or the difference between the height of the framed area and the height of the image.

Optionally, the shooting information includes area information of the image, and the monitoring information includes area information of the target object in the image; when determining the size of the target object in the image according to the shooting information and the monitoring information, the processor 92 is specifically configured to: determine the area difference between the target object and the image according to the area information of the image and the area information of the target object in the image; and if the area difference is greater than a third threshold or smaller than a fourth threshold, determine that the size of the target object in the image is outside the preset range.

Optionally, the monitoring information includes area information of the framed area of the target object in the image, and the target object is at least partially located within the framed area.

Optionally, when controlling the shooting device to stop zooming, the processor 92 is specifically configured to: control the shooting device to stop executing a zoom command received by the shooting device, the zoom command being used to adjust the focal length of the shooting device.

Optionally, the control apparatus further includes a communication interface 93; after controlling the shooting device to stop zooming, the processor 92 is further configured to: send stop-zoom prompt information to a control terminal through the communication interface 93, so that the control terminal prompts the user according to the stop-zoom prompt information, the control terminal being used to control the movable platform.

Optionally, when controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, the processor 92 is specifically configured to: control the shooting device to perform a zoom operation according to the shooting information and the monitoring information.

Optionally, when controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information so that the size of the target object in the image captured by the shooting device is kept within a preset range, the processor 92 is specifically configured to: control the shooting device to perform a zoom operation according to the shooting information and the monitoring information, to maintain the size of the target object in the image captured by the shooting device.

Optionally, the shooting information includes focal length information of the shooting device, and the monitoring information includes distance information between the target object and the movable platform or the shooting device; when controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information, the processor 92 is specifically configured to: determine a target focal length of the shooting device according to the focal length information and the distance information; and control the current focal length of the shooting device to be adjusted to the target focal length.

Optionally, the focal length information includes a historical focal length of the shooting device at a historical moment, and the distance information includes a first distance of the target object relative to the movable platform or the shooting device at the historical moment, and a second distance of the target object relative to the movable platform or the shooting device at the current moment; when determining the target focal length of the shooting device according to the focal length information and the distance information, the processor 92 is specifically configured to: determine the target focal length of the shooting device according to the historical focal length, the first distance, and the second distance.

Optionally, the historical moment is the moment when the movable platform starts to follow the target object.

Optionally, the historical moment is a moment when the size of the target object in the image is within the preset range.

Optionally, the control apparatus further includes a communication interface 93; the processor 92 is further configured to: during shooting by the shooting device, if a zoom command from the user is received through the communication interface 93, adjust the current focal length of the shooting device according to the zoom command and use the adjusted current focal length as the historical focal length.

The specific principles and implementations of the control apparatus provided by the embodiment of the present invention are similar to those of the above embodiments and are not repeated here.
In this embodiment, by acquiring the shooting information of the shooting device and the monitoring information of the target object and, when a recognition trigger event for the target object is detected, controlling the zoom operation of the shooting device according to the shooting information and the monitoring information so that the size of the target object in the image captured by the shooting device is kept within a preset range, the target object is prevented from being too large or too small in the image captured by the shooting device, improving the accuracy of recognizing the target object.

An embodiment of the present invention provides a movable platform. The movable platform includes at least one of the following: an unmanned aerial vehicle, a movable robot, and a handheld gimbal. Taking an unmanned aerial vehicle as an example, FIG. 10 is a structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention. As shown in FIG. 10, the unmanned aerial vehicle 100 includes: a fuselage, a power system, a shooting device 104, and a control apparatus 118. The power system includes at least one of the following: a motor 107, a propeller 106, and an electronic speed controller 117. The power system is mounted on the fuselage to provide flight power; the shooting device 104 is configured to capture images; and the specific principles and implementations of the control apparatus 118 are the same as those of the control apparatus described in the above embodiments and are not repeated here. In some embodiments, the control apparatus 118 may specifically be a flight controller.

In addition, as shown in FIG. 10, the unmanned aerial vehicle 100 further includes: a sensing system 108, a communication system 110, a support device 102, and the shooting device 104. The support device 102 may specifically be a gimbal, and the communication system 110 may specifically include a receiver configured to receive wireless signals sent by the ground control terminal.

In addition, this embodiment further provides a computer-readable storage medium on which a computer program is stored, the computer program being executed by a processor to implement the control method for a shooting device described in the above embodiments.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are only illustrative; for example, the division of the units is only a division by logical function, and in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.

The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.

The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.

Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. For the specific working process of the apparatus described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.

Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (38)

  1. 一种拍摄设备的控制方法,应用于可移动平台,所述可移动平台上搭载有所述拍摄设备,其特征在于,包括:
    获取所述拍摄设备的拍摄信息以及目标对象的监测信息,所述目标对象为所述拍摄设备的拍摄对象;
    若检测到对所述目标对象的识别触发事件,则根据所述拍摄信息和所述监测信息控制所述拍摄设备的变焦操作,以使所述目标对象在所述拍摄设备采集的图像中的大小控制在预设范围内。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述拍摄信息和所述监测信息控制所述拍摄设备的变焦操作,包括:
    根据所述拍摄信息和所述监测信息,控制所述拍摄设备停止变焦。
  3. 根据权利要求2所述的方法,其特征在于,所述根据所述拍摄信息和所述监测信息,控制所述拍摄设备停止变焦,包括:
    根据所述拍摄信息和所述监测信息,确定所述目标对象在所述图像中的大小;
    若所述目标对象在所述图像中的大小在所述预设范围外,则控制所述拍摄设备停止变焦。
  4. 根据权利要求3所述的方法,其特征在于,所述拍摄信息包括所述图像的尺寸信息,所述监测信息包括所述目标对象在所述图像中的尺寸信息;
    所述根据所述拍摄信息和所述监测信息,确定所述目标对象在所述图像中的大小,包括:
    根据所述图像的尺寸信息和所述目标对象在所述图像中的尺寸信息,确定所述目标对象与所述图像的尺寸差异;
    若所述尺寸差异大于第一阈值或所述尺寸差异小于第二阈值,则确定所述目标对象在所述图像中的大小在所述预设范围外。
  5. 根据权利要求4所述的方法,其特征在于,所述监测信息包括所述目标对象在所述图像中的框选区域的尺寸信息,所述目标对象至少部分位于所述框选区域内。
  6. 根据权利要求5所述的方法,其特征在于,所述框选区域为矩形 区域;所述尺寸差异包括所述框选区域的宽度与所述图像的宽度之间的差异,和/或,所述框选区域的高度与所述图像的高度之间的差异。
  7. 根据权利要求3所述的方法,其特征在于,所述拍摄信息包括所述图像的面积信息,所述监测信息包括所述目标对象在所述图像中的面积信息;
    所述根据所述拍摄信息和所述监测信息,确定所述目标对象在所述图像中的大小,包括:
    根据所述图像的面积信息和所述目标对象在所述图像中的面积信息,确定所述目标对象与所述图像的面积差异;
    若所述面积差异大于第三阈值或所述面积差异小于第四阈值,则确定所述目标对象在所述图像中的大小在所述预设范围外。
  8. 根据权利要求7所述的方法,其特征在于,所述监测信息包括所述目标对象在所述图像中的框选区域的面积信息,所述目标对象至少部分位于所述框选区域内。
  9. 根据权利要求2所述的方法,其特征在于,所述控制所述拍摄设备停止变焦,包括:
    控制所述拍摄设备停止执行所述拍摄设备接收到的变焦指令,所述变焦指令用于调整所述拍摄设备的焦距。
  10. 根据权利要求2所述的方法,其特征在于,所述控制所述拍摄设备停止变焦之后,还包括:
    向控制终端发送停止变焦提示信息,以使得所述控制终端根据所述停止变焦提示信息对用户进行提示,所述控制终端用于控制所述可移动平台。
  11. 根据权利要求1所述的方法,其特征在于,所述根据所述拍摄信息和所述监测信息控制所述拍摄设备的变焦操作,包括:
    根据所述拍摄信息和所述监测信息,控制所述拍摄设备进行变焦操作。
  12. 根据权利要求11所述的方法,其特征在于,所述根据所述拍摄信息和所述监测信息,控制所述拍摄设备进行变焦操作,包括:
    根据所述拍摄信息和所述监测信息,控制所述拍摄设备进行变焦操作,以维持所述目标对象在所述拍摄设备采集的图像中的大小。
  13. 根据权利要求12所述的方法,其特征在于,所述拍摄信息包括 所述拍摄设备的焦距信息,所述监测信息包括所述目标对象与所述可移动平台或所述拍摄设备的距离信息;
    所述根据所述拍摄信息和所述监测信息,控制所述拍摄设备进行变焦操作,包括:
    根据所述焦距信息和所述距离信息,确定所述拍摄设备的目标焦距;
    控制所述拍摄设备的当前焦距调整为所述目标焦距。
  14. 根据权利要求13所述的方法,其特征在于,所述焦距信息包括所述拍摄设备在历史时刻的历史焦距,所述距离信息包括所述目标对象在所述历史时刻相对于所述可移动平台或所述拍摄设备的第一距离、所述目标对象在当前时刻相对于所述可移动平台或所述拍摄设备的第二距离;
    所述根据所述焦距信息和所述距离信息,确定所述拍摄设备的目标焦距,包括:
    根据所述历史焦距、所述第一距离、所述第二距离,确定所述拍摄设备的目标焦距。
  15. 根据权利要求14所述的方法,其特征在于,所述历史时刻是所述可移动平台开始跟随所述目标对象的时刻。
  16. 根据权利要求14所述的方法,其特征在于,所述历史时刻是所述目标对象在所述图像中的大小在所述预设范围内的时刻。
  17. 根据权利要求14所述的方法,其特征在于,还包括:
    在所述拍摄设备的拍摄过程中,若接收到用户的变焦指令,则根据所述变焦指令调整所述拍摄设备的当前焦距,并将调整后的当前焦距作为所述历史焦距。
  18. 根据权利要求1所述的方法,其特征在于,所述可移动平台包括如下至少一种:
    无人飞行器、可移动机器人、手持云台。
  19. 一种拍摄设备的控制装置,其特征在于,包括:存储器和处理器;
    所述存储器用于存储程序代码;
    所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
    获取所述拍摄设备的拍摄信息以及目标对象的监测信息,所述目标对 象为所述拍摄设备的拍摄对象;
    若检测到对目标对象的识别触发事件,则根据所述拍摄信息和所述监测信息控制所述拍摄设备的变焦操作,以使所述目标对象在所述拍摄设备采集的图像中的大小控制在预设范围内。
  20. The control apparatus according to claim 19, wherein when controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, the processor is specifically configured to:
    control the shooting device to stop zooming according to the shooting information and the monitoring information.
  21. The control apparatus according to claim 20, wherein when controlling the shooting device to stop zooming according to the shooting information and the monitoring information, the processor is specifically configured to:
    determine the size of the target object in the image according to the shooting information and the monitoring information;
    if the size of the target object in the image is outside the preset range, control the shooting device to stop zooming.
  22. The control apparatus according to claim 21, wherein the shooting information comprises size information of the image, and the monitoring information comprises size information of the target object in the image;
    when determining the size of the target object in the image according to the shooting information and the monitoring information, the processor is specifically configured to:
    determine a size difference between the target object and the image according to the size information of the image and the size information of the target object in the image;
    if the size difference is greater than a first threshold or less than a second threshold, determine that the size of the target object in the image is outside the preset range.
  23. The control apparatus according to claim 22, wherein the monitoring information comprises size information of a box-selection region of the target object in the image, the target object being at least partially located within the box-selection region.
  24. The control apparatus according to claim 23, wherein the box-selection region is a rectangular region; and the size difference comprises a difference between a width of the box-selection region and a width of the image, and/or a difference between a height of the box-selection region and a height of the image.
  25. The control apparatus according to claim 21, wherein the shooting information comprises area information of the image, and the monitoring information comprises area information of the target object in the image;
    when determining the size of the target object in the image according to the shooting information and the monitoring information, the processor is specifically configured to:
    determine an area difference between the target object and the image according to the area information of the image and the area information of the target object in the image;
    if the area difference is greater than a third threshold or less than a fourth threshold, determine that the size of the target object in the image is outside the preset range.
  26. The control apparatus according to claim 25, wherein the monitoring information comprises area information of a box-selection region of the target object in the image, the target object being at least partially located within the box-selection region.
  27. The control apparatus according to claim 20, wherein when controlling the shooting device to stop zooming, the processor is specifically configured to:
    control the shooting device to stop executing a zoom instruction received by the shooting device, the zoom instruction being used to adjust a focal length of the shooting device.
  28. The control apparatus according to claim 20, wherein the control apparatus further comprises a communication interface;
    after controlling the shooting device to stop zooming, the processor is further configured to:
    send stop-zoom prompt information to a control terminal through the communication interface, so that the control terminal prompts a user according to the stop-zoom prompt information, the control terminal being used to control a movable platform.
  29. The control apparatus according to claim 19, wherein when controlling the zoom operation of the shooting device according to the shooting information and the monitoring information, the processor is specifically configured to:
    control the shooting device to perform a zoom operation according to the shooting information and the monitoring information.
  30. The control apparatus according to claim 29, wherein when controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information, the processor is specifically configured to:
    control the shooting device to perform a zoom operation according to the shooting information and the monitoring information, so as to maintain the size of the target object in the image captured by the shooting device.
  31. The control apparatus according to claim 30, wherein the shooting information comprises focal length information of the shooting device, and the monitoring information comprises distance information between the target object and a movable platform or the shooting device;
    when controlling the shooting device to perform a zoom operation according to the shooting information and the monitoring information, the processor is specifically configured to:
    determine a target focal length of the shooting device according to the focal length information and the distance information;
    control a current focal length of the shooting device to be adjusted to the target focal length.
  32. The control apparatus according to claim 31, wherein the focal length information comprises a historical focal length of the shooting device at a historical moment, and the distance information comprises a first distance of the target object relative to the movable platform or the shooting device at the historical moment and a second distance of the target object relative to the movable platform or the shooting device at a current moment;
    when determining the target focal length of the shooting device according to the focal length information and the distance information, the processor is specifically configured to:
    determine the target focal length of the shooting device according to the historical focal length, the first distance, and the second distance.
  33. The control apparatus according to claim 32, wherein the historical moment is a moment at which the movable platform starts to follow the target object.
  34. The control apparatus according to claim 32, wherein the historical moment is a moment at which the size of the target object in the image is within the preset range.
  35. The control apparatus according to claim 32, wherein the control apparatus further comprises a communication interface;
    the processor is further configured to:
    during shooting by the shooting device, if a zoom instruction from a user is received through the communication interface, adjust the current focal length of the shooting device according to the zoom instruction, and take the adjusted current focal length as the historical focal length.
  36. A movable platform, comprising:
    a body;
    a power system mounted on the body and configured to provide power;
    a shooting device configured to capture images; and
    the control apparatus according to any one of claims 19-35.
  37. The movable platform according to claim 36, wherein the movable platform comprises at least one of the following:
    an unmanned aerial vehicle, a movable robot, or a handheld gimbal.
  38. A computer-readable storage medium having a computer program stored thereon, the computer program being executed by a processor to implement the method according to any one of claims 1-18.
PCT/CN2018/118410 2018-11-30 2018-11-30 Control method and apparatus for photographing device, and device and storage medium WO2020107372A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2018/118410 WO2020107372A1 (zh) 2018-11-30 2018-11-30 Control method and apparatus for photographing device, and device and storage medium
CN201880040431.7A CN110785993A (zh) 2018-11-30 2018-11-30 Control method and apparatus for photographing device, and device and storage medium
US17/334,735 US20210289141A1 (en) 2018-11-30 2021-05-29 Control method and apparatus for photographing device, and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/118410 WO2020107372A1 (zh) 2018-11-30 2018-11-30 Control method and apparatus for photographing device, and device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/334,735 Continuation US20210289141A1 (en) 2018-11-30 2021-05-29 Control method and apparatus for photographing device, and device and storage medium

Publications (1)

Publication Number Publication Date
WO2020107372A1 true WO2020107372A1 (zh) 2020-06-04

Family

ID=69383083

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/118410 WO2020107372A1 (zh) 2018-11-30 2018-11-30 Control method and apparatus for photographing device, and device and storage medium

Country Status (3)

Country Link
US (1) US20210289141A1 (zh)
CN (1) CN110785993A (zh)
WO (1) WO2020107372A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113610027A (zh) * 2021-08-13 2021-11-05 青岛海信网络科技股份有限公司 Monitoring method and apparatus, electronic device, and computer-readable storage medium

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113645397A (zh) * 2020-04-27 2021-11-12 杭州海康机器人技术有限公司 Tracking method and apparatus for a moving target object, and tracking system
CN112639405A (zh) * 2020-05-07 2021-04-09 深圳市大疆创新科技有限公司 State information determination method, apparatus, and system, movable platform, and storage medium
CN113853781A (zh) * 2020-05-28 2021-12-28 深圳市大疆创新科技有限公司 Image processing method, head-mounted display device, and storage medium
CN113747085B (zh) * 2020-05-30 2023-01-06 华为技术有限公司 Method and apparatus for shooting video
CN111885303A (zh) * 2020-07-06 2020-11-03 雍朝良 Active tracking and video-recording vision method
CN113491102A (zh) * 2020-08-25 2021-10-08 深圳市大疆创新科技有限公司 Zoom video shooting method, shooting system, shooting apparatus, and storage medium
WO2022041013A1 (zh) * 2020-08-26 2022-03-03 深圳市大疆创新科技有限公司 Control method, handheld gimbal, system, and computer-readable storage medium
CN112102620A (zh) * 2020-11-09 2020-12-18 南京慧尔视智能科技有限公司 Monitoring method linking radar and dome camera
CN112839171B (zh) * 2020-12-31 2023-02-10 上海米哈游天命科技有限公司 Picture shooting method and apparatus, storage medium, and electronic device
CN113163112B (zh) * 2021-03-25 2022-12-13 中国电子科技集团公司第三研究所 Focusing and zooming control method and system
CN113163167B (zh) * 2021-03-31 2023-04-28 杭州海康机器人股份有限公司 Image acquisition method and apparatus
CN113938609B (zh) * 2021-11-04 2023-08-22 中国联合网络通信集团有限公司 Area monitoring method, apparatus, and device
CN114785909B (zh) * 2022-04-25 2024-10-11 歌尔股份有限公司 Shooting calibration method, apparatus, device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006229321A (ja) * 2005-02-15 2006-08-31 Matsushita Electric Ind Co Ltd Automatic tracking imaging device, automatic tracking imaging method, and program
CN102439966A (zh) * 2010-03-30 2012-05-02 索尼公司 Image processing apparatus, method, and program
CN102984454A (zh) * 2012-11-15 2013-03-20 广东欧珀移动通信有限公司 System and method for automatically adjusting camera focal length, and mobile phone
CN104717427A (zh) * 2015-03-06 2015-06-17 广东欧珀移动通信有限公司 Automatic zoom method and apparatus, and mobile terminal
JP2016122030A (ja) * 2014-12-24 2016-07-07 キヤノン株式会社 Zoom control device, imaging device, control method of zoom control device, control program of zoom control device, and storage medium
CN107079090A (zh) * 2016-04-21 2017-08-18 深圳市大疆创新科技有限公司 Unmanned aerial vehicle and camera assembly

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5040760B2 (ja) * 2008-03-24 2012-10-03 ソニー株式会社 Image processing device, imaging device, display control method, and program
US9497388B2 (en) * 2010-12-17 2016-11-15 Pelco, Inc. Zooming factor computation
KR101758681B1 (ko) * 2012-03-27 2017-07-14 한화테크윈 주식회사 Communication system and method of transmitting and receiving data in a communication system
JP6034656B2 (ja) * 2012-10-26 2016-11-30 キヤノン株式会社 Zoom lens and imaging device having the same
KR101983288B1 (ko) * 2012-11-22 2019-05-29 삼성전자주식회사 Apparatus and method for controlling camera shooting
CN103197491B (zh) * 2013-03-28 2016-03-30 华为技术有限公司 Fast autofocus method and image acquisition device
CN104349031B (zh) * 2013-07-31 2018-05-18 华为技术有限公司 Method for adjusting the framing range of an imaging device, imaging system, and operating device
CN103595917B (zh) * 2013-11-14 2016-08-17 上海华勤通讯技术有限公司 Mobile terminal and focusing method thereof
KR102101438B1 (ko) * 2015-01-29 2020-04-20 한국전자통신연구원 Multi-camera control apparatus and method for maintaining the position and size of an object in a continuous viewpoint switching service
US10397484B2 (en) * 2015-08-14 2019-08-27 Qualcomm Incorporated Camera zoom based on sensor data
US9781350B2 (en) * 2015-09-28 2017-10-03 Qualcomm Incorporated Systems and methods for performing automatic zoom
KR20170055213A (ko) * 2015-11-11 2017-05-19 삼성전자주식회사 Method and apparatus for photographing using a flight-capable electronic device
US10029790B2 (en) * 2016-01-28 2018-07-24 Panasonic Intellectual Property Corporation Of America Device that controls flight altitude of unmanned aerial vehicle
CN105867362A (zh) * 2016-04-20 2016-08-17 北京博瑞爱飞科技发展有限公司 Terminal device and control system for an unmanned aerial vehicle
CN106161941B (zh) * 2016-07-29 2022-03-11 南昌黑鲨科技有限公司 Dual-camera automatic focus tracking method and apparatus, and terminal
CN205883405U (zh) * 2016-07-29 2017-01-11 深圳众思科技有限公司 Automatic focus tracking apparatus and terminal
JP7057637B2 (ja) * 2017-08-23 2022-04-20 キヤノン株式会社 Control device, control system, control method, program, and storage medium
JP7045218B2 (ja) * 2018-02-28 2022-03-31 キヤノン株式会社 Information processing device, information processing method, and program



Also Published As

Publication number Publication date
CN110785993A (zh) 2020-02-11
US20210289141A1 (en) 2021-09-16

Similar Documents

Publication Publication Date Title
WO2020107372A1 (zh) Control method and apparatus for photographing device, and device and storage medium
JP6943988B2 (ja) Control method, device, and system for a movable object
CN107466385B (zh) Gimbal control method and system
WO2018227350A1 (zh) Unmanned aerial vehicle return control method, unmanned aerial vehicle, and machine-readable storage medium
US10597169B2 (en) Method of aerial vehicle-based image projection, device and aerial vehicle
CN109074168B (zh) Control method and device for an unmanned aerial vehicle, and unmanned aerial vehicle
CN110692027A (zh) System and method for providing easy-to-use release and auto-positioning for drone applications
CN108235815B (zh) Imaging control device, imaging device, imaging system, movable body, imaging control method, and medium
US11210796B2 (en) Imaging method and imaging control apparatus
CN110633629A (zh) Artificial-intelligence-based power grid inspection method, apparatus, device, and storage medium
WO2020014987A1 (zh) Control method, apparatus, and device for a mobile robot, and storage medium
CN112154649A (zh) Aerial survey method, shooting control method, aircraft, terminal, system, and storage medium
CN110337806A (zh) Group photo shooting method and apparatus
CN111316632A (zh) Shooting control method and movable platform
WO2020024182A1 (zh) Parameter processing method and apparatus, camera device, and aircraft
CN112514366A (zh) Image processing method, image processing apparatus, and image processing system
WO2021000225A1 (zh) Control method, apparatus, and device for a movable platform, and storage medium
WO2020062024A1 (zh) Unmanned aerial vehicle-based distance measurement method and apparatus, and unmanned aerial vehicle
WO2022141271A1 (zh) Control method and control device for a gimbal system, gimbal system, and storage medium
JP7501535B2 (ja) Information processing device, information processing method, and information processing program
WO2021146969A1 (zh) Distance measurement method, movable platform, device, and storage medium
KR20220068606A (ko) Automatic landing algorithm for a drone considering partial images
CN114270285A (zh) Movable body, information processing device, information processing method, and program
CN112292712A (zh) Device, imaging device, movable body, method, and program
WO2022000211A1 (zh) Control method and device for a shooting system, movable platform, and storage medium

Legal Events

Date Code Title Description
121: Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 18941133; Country of ref document: EP; Kind code of ref document: A1
NENP: Non-entry into the national phase. Ref country code: DE
122: Ep: pct application non-entry in european phase. Ref document number: 18941133; Country of ref document: EP; Kind code of ref document: A1