WO2019056312A1 - Method and Device for Tracking Shooting - Google Patents

Method and Device for Tracking Shooting (用于跟踪拍摄的方法和设备)

Info

Publication number
WO2019056312A1
WO2019056312A1 (PCT/CN2017/102992)
Authority
WO
WIPO (PCT)
Prior art keywords
target object
image
mobile device
image acquisition
control
Prior art date
Application number
PCT/CN2017/102992
Other languages
English (en)
French (fr)
Inventor
陶冶
苏冠樑
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2017/102992 (WO2019056312A1)
Priority to CN201780006943.7A (CN108521862B)
Publication of WO2019056312A1
Priority to US16/822,630 (US20200221005A1)

Classifications

    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • B64U2101/20 UAVs specially adapted for use as communications relays, e.g. high-altitude platforms
    • B64U2101/30 UAVs specially adapted for imaging, photography or videography
    • G01S13/72 Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S15/66 Sonar tracking systems
    • G01S17/66 Tracking systems using electromagnetic waves other than radio waves
    • G01S19/14 Receivers specially adapted for specific applications
    • G01S19/42 Determining position

Definitions

  • the present disclosure relates to the field of image processing, and more particularly, to a method and apparatus for tracking shots.
  • a drone with a camera can track the desired subject and transmit the captured image to the user in real time.
  • In the existing tracking mode, the mobile device sets its camera device to a global average focus mode; that is, it tries to keep not only the tracking target but also the other objects in the picture containing the target clearly imaged.
  • Likewise, when the imaging device performs photometry, global average photometry is employed.
  • The existing tracking mode cannot satisfy a requirement such as emphasizing the target against a blurred background, because the camera uses global average focus.
  • Because the existing tracking shooting mode adopts global average metering, in scenes of strong contrast the tracking target may be rendered too dark or too bright, resulting in a low-quality picture.
  • the present disclosure proposes a technical solution for providing local focus and/or metering in a tracking shooting mode.
  • With this technical solution, it is possible to focus and/or meter on the tracking target in the tracking shooting mode, so that the area outside the target can be blurred and/or the tracking picture can achieve appropriate brightness.
  • According to one aspect, a movable device is provided that communicates with a control device and accepts the control device's instructions.
  • The movable device includes: a sensing component that acquires position information of the target object; an image acquisition device that acquires an image frame including the target object; and a controller that adjusts the orientation of the image acquisition device so that it always faces the target object and acquires the image frame including the target object.
  • the controller is further configured to: control the image acquisition device to focus and/or meter the target object according to the position information of the target object.
  • a method performed by a mobile device is provided.
  • the mobile device is in communication with the control device and accepts control commands from the control device.
  • the mobile device includes a sensing element, an image acquisition device, and a controller.
  • The method includes: acquiring position information of the target object by the sensing element; acquiring an image frame including the target object by the image acquisition device; and adjusting the orientation of the image acquisition device by the controller so that it always faces the target object and acquires the image frame including the target object.
  • the controller controls the image acquiring device to focus and/or meter the target object according to the position information of the target object.
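  • The steps above can be sketched in Python as follows. This is an illustrative sketch only; the `Sensor` and `Camera` classes and their methods are hypothetical stand-ins, not part of the disclosure:

```python
class Sensor:
    """Hypothetical sensing element that reports the target's position over time."""
    def __init__(self, positions):
        self._positions = iter(positions)

    def position(self):
        return next(self._positions)


class Camera:
    """Hypothetical image acquisition device with local focus/metering on a point."""
    def __init__(self):
        self.focus_point = None

    def focus_and_meter(self, pos):
        self.focus_point = pos              # focus and meter on the target only

    def capture(self):
        return {"focus": self.focus_point}  # stand-in for an image frame


def track_and_shoot(sensor, camera, steps):
    """Sense the target position, focus/meter locally on it, and capture a frame."""
    frames = []
    for _ in range(steps):
        pos = sensor.position()             # position info from the sensing element
        camera.focus_and_meter(pos)         # local focus/metering per the disclosure
        frames.append(camera.capture())     # acquire frame containing the target
    return frames
```

The loop mirrors the claimed sequence: sense, aim focus/metering at the target, capture.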
  • According to another aspect, a control apparatus for controlling a mobile device is provided.
  • The control apparatus includes: a receiving unit that receives an image frame acquired by the mobile device; a display unit that displays the received image frame; an information input unit that acquires a control instruction input by the user; a processor that determines, according to the control instruction, a target object to be tracked in the image frame and takes that object as the object of focusing and/or metering; and a sending unit that transmits indication information to the mobile device, instructing it to capture the target object and to focus and/or meter on the target object.
  • According to another aspect, a method performed by a control device for controlling a movable device is provided. The method includes: receiving an image frame acquired by the mobile device; displaying the received image frame; acquiring a control instruction input by the user; determining, according to that instruction, a target object to be tracked in the image frame and taking the object as the object of focusing and/or metering; and transmitting indication information to the mobile device, instructing it to capture the target object and to focus and/or meter on it.
  • a tracking photographing system including a control device and a movable device, wherein the control device and the movable device are communicatively coupled.
  • The control device includes: a receiving unit that receives an image frame acquired by the movable device; a display unit that displays the received image frame; an information input unit that acquires a control instruction input by the user; a processor that determines, according to the control instruction, a target object to be tracked in the image frame and takes the object as the object of focusing and/or metering; and a sending unit that transmits indication information to the movable device, instructing it to capture the target object and to focus and/or meter on the target object.
  • The movable device includes: a sensing component that acquires position information of the target object; an image acquisition device that acquires an image frame including the target object; and a controller that adjusts the orientation of the image acquisition device so that it always faces the target object and acquires the image frame including the target object.
  • The controller is further configured to: control the image acquisition device to focus and/or meter on the target object according to its position information, and transmit the image frame including the target object to the control device.
  • According to another aspect, a method for a tracking shooting system is provided; the system includes a control device and a movable device that communicate with each other.
  • The method includes the following operations performed at the control device: receiving from the movable device an image frame it has acquired, and displaying the received frame; acquiring a control instruction input by the user; determining, according to that instruction, a target object to be tracked in the image frame and taking the object as the object of focusing and/or metering; and transmitting indication information to the movable device, instructing it to capture the target object and to focus and/or meter on it.
  • The method further includes the following operations performed at the movable device: acquiring position information of the target object; adjusting the orientation of the image acquisition device in the movable device so that it always faces the target object and acquires an image frame including the target object; and controlling the image acquisition device to focus and/or meter on the target object according to its position information, and transmitting the image frame including the target object to the control device.
  • In this way, the combination of intelligent tracking shooting and local focusing and/or metering can be realized through a simple interaction.
  • focusing and/or metering the tracking target enables blurring of areas outside the tracking target and/or achieving proper brightness of the tracking picture.
  • FIG. 1 is a block diagram showing a mobile device in accordance with one embodiment of the present disclosure.
  • FIG. 2 is a flow chart showing a method performed by a mobile device in accordance with one embodiment of the present disclosure.
  • FIG. 3 is a block diagram showing a control device according to an embodiment of the present disclosure.
  • FIG. 4 is a flow chart showing a method performed by a control device in accordance with one embodiment of the present disclosure.
  • FIGS. 5A-5D are schematic diagrams showing a tracking shooting application in accordance with one embodiment of the present disclosure.
  • FIG. 1 is a block diagram showing a removable device 10 in accordance with one embodiment of the present disclosure.
  • the mobile device 10 includes a sensing element 110, an image acquisition device 120, and a controller 130.
  • the mobile device 10 can be any form of device including, but not limited to, a drone, an unmanned vehicle, or an unmanned boat.
  • the sensing component 110 can acquire location information of the target object.
  • the "target object” refers to an object that is of interest to the user and thus needs to be tracked.
  • the sensing component 110 can also acquire location information of the mobile device 10 itself. Examples of the sensing element 110 may include one or more of a GPS sensor, a visual sensor, an ultrasonic sensor, an infrared sensor, a radar.
  • the image acquisition device 120 can acquire an image screen including a target object.
  • the image acquisition device 120 may include one or more cameras to capture an image frame including a target object in real time.
  • the image acquisition device 120 is a visible light camera. It can be understood that the image acquisition device 120 can also be other forms of devices that can acquire image images, such as an infrared camera.
  • The controller 130 adjusts the orientation of the image acquisition device 120 so that it always faces the target object and acquires an image frame including the target object. For example, in tracking mode the target object may move frequently and irregularly; in that case the controller 130 continuously acquires the real-time position information sensed by the sensing component 110 and adjusts the orientation of the image acquisition device 120 based on that information, so that the device can keep following and shooting the target object.
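  • The orientation adjustment can be sketched as a simple angle computation from the sensed positions. This is a minimal sketch under an assumed (x, y, z) coordinate convention; the disclosure does not specify gimbal mathematics:

```python
import math

def gimbal_angles(device_pos, target_pos):
    """Yaw/pitch (degrees) that point the camera from device_pos at target_pos.

    Positions are (x, y, z) in a shared frame; the frame and sign conventions
    here are illustrative assumptions.
    """
    dx = target_pos[0] - device_pos[0]
    dy = target_pos[1] - device_pos[1]
    dz = target_pos[2] - device_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))                    # heading to target
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation to target
    return yaw, pitch
```

Re-evaluating these angles each time new position information arrives keeps the image acquisition device facing the moving target.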
  • the controller 130 controls the image acquisition device 120 to focus and/or meter the target object based on the position information of the target object sensed by the sensing element 110.
  • the image acquisition device 120 can receive a control command from the controller 130 and initiate or stop focusing and/or metering the target object based on the received control command.
  • the controller 130 may control the image acquisition device 120 to set the focus point on the tracked target object according to the position information of the target object.
  • This ensures that the target object itself is imaged clearly, without regard to whether other objects in the image frame are in focus.
  • With a larger aperture, the degree of background blur is higher; with a smaller aperture, the background blur is less. Those skilled in the art can choose an appropriate aperture value to achieve the desired degree of background blurring as needed.
  • For metering, the image acquisition device 120 receives a control command issued by the controller 130 according to the position information of the target object, measures the brightness at the target object, and adjusts the brightness of the entire image frame according to that measurement. In this way, the target object maintains appropriate brightness within the whole picture and remains clearly distinguishable.
  • The image acquisition device 120 may also compensate the measured brightness of the target object after metering it, and set the brightness of the entire image frame based on the compensated value.
  • two thresholds can be set: a first threshold and a second threshold, wherein the first threshold is less than the second threshold. These two thresholds may be predetermined or may be constantly adjusted during the tracking process.
  • When the measured brightness of the target object is below the first threshold, the image acquisition device 120 applies compensation to increase the brightness of the target object; when it is above the second threshold, the device applies compensation to reduce the brightness of the target object.
  • the increased or decreased value can be set according to actual needs.
  • In this way, the compensated brightness of the target object is neither too dark nor too bright, so that the brightness of the entire picture, adjusted on that basis, is more appropriate.
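  • The two-threshold compensation described above can be sketched as follows; the adjustment amount `step` is a hypothetical value to be set according to actual needs:

```python
def compensate_brightness(measured, first_threshold, second_threshold, step=10):
    """Two-threshold brightness compensation for the metered target.

    `step` is an illustrative adjustment amount, not specified by the disclosure.
    """
    if measured < first_threshold:
        return measured + step   # too dark: increase target brightness
    if measured > second_threshold:
        return measured - step   # too bright: reduce target brightness
    return measured              # within range: no compensation needed
```

The overall frame exposure would then be set from the compensated value rather than the raw metering result.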
  • The controller 130 may further control the image acquisition device 120 to zoom in or out according to the position information of the target object, enlarging or reducing the acquired image frame.
  • For example, the controller 130 can control the image acquisition device 120 to zoom the lens in so that the target object is displayed larger on the screen, making it easier for the user to view its details.
  • Conversely, the controller 130 can control the image acquisition device 120 to zoom the lens out so that the captured image accommodates more of the scene, letting the user view the environment around the target object.
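  • A distance-based zoom policy of this kind can be sketched as follows; the thresholds and zoom factors are illustrative assumptions, not values from the disclosure:

```python
def choose_zoom(distance_m, near_m=5.0, far_m=30.0):
    """Hypothetical zoom policy driven by target distance.

    Zoom in on a distant target to show its details; zoom out on a near
    target to show its surroundings; otherwise keep the current framing.
    """
    if distance_m > far_m:
        return 2.0   # zoom in
    if distance_m < near_m:
        return 0.5   # zoom out
    return 1.0       # no change
```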
  • the controller 130 may also control the image acquisition device 120 to adjust the white balance and/or color filter.
  • the controller 130 can control the image acquisition device 120 to adjust the white balance setting and/or the color filter according to ambient lighting conditions to obtain better image quality and/or desired image style.
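  • One common way to derive white-balance gains from ambient lighting is the gray-world heuristic, sketched below. This is only one possible approach; the disclosure does not commit to a particular white-balance algorithm:

```python
def gray_world_gains(avg_r, avg_g, avg_b):
    """Per-channel gains under the gray-world assumption.

    The assumption: the scene's average color should be neutral gray, so each
    channel is scaled until the three channel averages match.
    """
    gray = (avg_r + avg_g + avg_b) / 3.0
    return gray / avg_r, gray / avg_g, gray / avg_b
```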
  • the controller 130 may also control the movable device 10 to follow the target object motion and maintain a preset distance from the target object.
  • When adjusting the orientation of the image acquisition device 120 so that it always faces the target object, the controller 130 may keep the orientation of the movable device 10 itself unchanged, or may also turn the movable device 10 toward the target object, which makes it easier to capture the tracked target.
  • the controller 130 can also control the movable device 10 to be located at a certain position and control the image acquisition device 120 to adjust the orientation in real time to capture the target object.
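  • Following the target while maintaining a preset distance can be sketched with a simple proportional controller. Proportional control and the gain value are illustrative choices, not specified by the disclosure:

```python
import math

def follow_command(device_pos, target_pos, preset_dist, gain=0.5):
    """2D velocity command that drives the separation toward preset_dist.

    Moves the device along the line to the target: forward when too far,
    backward when too close. `gain` is a hypothetical tuning parameter.
    """
    dx = target_pos[0] - device_pos[0]
    dy = target_pos[1] - device_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)            # coincident with the target: hold position
    err = dist - preset_dist         # positive when too far away
    return (gain * err * dx / dist, gain * err * dy / dist)
```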
  • The local focusing and local metering techniques of the present disclosure may be applied separately or in combination. Further, one or more of the lens adjustment (zoom in or out), white balance setting, and color filter setting described above may be applied together with local focusing and/or metering.
  • In this way, the combination of intelligent tracking shooting and local focusing and/or metering can be realized through a simple interaction.
  • focusing and/or metering the tracking target enables blurring of areas outside the tracking target and/or achieving proper brightness of the tracking picture.
  • FIG. 2 is a flow chart showing a method performed by a mobile device in accordance with one embodiment of the present disclosure.
  • the method can be performed by the removable device 10 shown in FIG.
  • the description of the details of the removable device 10 is omitted below for the sake of brevity.
  • In step S210, the position information of the target object is acquired by the sensing element.
  • the sensing element can also obtain location information of the mobile device.
  • the sensing element can include one or more of a GPS sensor, a visual sensor, an ultrasonic sensor, an infrared sensor, a radar.
  • In step S220, an image frame including the target object is acquired by the image acquisition device.
  • an image screen including a target object can be captured in real time by one or more cameras.
  • In step S230, the orientation of the image acquisition device is adjusted by the controller so that it always faces the target object and acquires an image frame including the target object.
  • In step S240, the image acquisition device is controlled to focus and/or meter on the target object according to the target object's position information.
  • focusing and metering can be applied separately or in combination.
  • One or more of the lens adjustment (zoom in or out), white balance setting, and color filter setting described above may also be applied together; they are not described again here.
  • FIG. 3 is a block diagram showing a control device according to an embodiment of the present disclosure.
  • the control device 30 includes a receiving unit 310, a display unit 320, an information input unit 330, a processor 340, and a transmitting unit 350.
  • Control device 30 can be a variety of devices including, but not limited to, a smartphone, a control terminal, a tablet PAD, and the like.
  • the principle of the present disclosure will be described with a smartphone as an example of the control device 30.
  • Those skilled in the art will appreciate that the principles described are equally applicable to other forms of control devices.
  • The receiving unit 310 receives, from the mobile device, the image frame acquired by the mobile device.
  • the display unit 320 displays the received image screen.
  • the information input unit 330 acquires a control command input by the user.
  • the information input unit 330 may include a touch screen, a control button, and the like.
  • The control instructions input by the user may instruct the mobile device to perform the tracking shooting function for the target object, and may also instruct it to perform the function of focusing and/or metering on the target object as described above.
  • the processor 340 determines a target object to be tracked in the image screen according to a control instruction input by the user, and uses the target object as an object of focusing and/or metering.
  • the transmitting unit 350 transmits the indication information to the mobile device, instructing the mobile device to capture the target object and focus and/or meter the target object.
  • The information input unit 330 acquires a frame-selection instruction made by the user on the display unit 320, and the processor 340 identifies the objects in the selected area and takes an object that meets a preset condition as the target object.
  • The processor 340 may set the indication box of the selected target object as the focus frame for focusing and/or the metering frame for metering.
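  • Reusing the selection box as both focus frame and metering frame can be sketched as a coordinate normalization. The pixel-box interface below is a hypothetical convention; the disclosure does not fix coordinate formats:

```python
def to_normalized_roi(box, screen_w, screen_h):
    """Convert an on-screen selection box (x, y, w, h in pixels) to a
    normalized [0, 1] region, usable as both focus frame and metering frame.
    """
    x, y, w, h = box
    return (x / screen_w, y / screen_h, w / screen_w, h / screen_h)
```

A normalized region stays valid regardless of the resolution difference between the control device's screen and the camera sensor.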
  • the receiving unit 310 can also receive real-time location information of the target object from the mobile device, so that the display unit 320 can display real-time location information of the target object.
  • target object refers to an object of interest to the user that needs to be tracked.
  • In the tracking shooting mode, the image acquisition device of the mobile device acquires a picture and transmits it back to the control device 30; the control device 30 determines the target object in the picture from the user's input, and after receiving the indication information sent by the sending unit 350, the mobile device can track the target object in real time using its image acquisition device (for example, one or more cameras).
  • The movable device may acquire real-time position information of the target object through the sensing element, focus and/or meter on the target object according to the received indication information, generate an image frame including the target object, and transmit it to the control device 30.
  • control device 30 may further include a parameter setting unit 360 (shown by a broken line in FIG. 3).
  • the parameter setting unit 360 may set parameters for photometry, and the transmitting unit 350 may transmit parameters set by the parameter setting unit 360 to the movable device. It can be understood that the parameter setting unit 360 can be a sub-unit of the information input unit 330 for setting parameters.
  • the parameter may include a compensation parameter for compensating for the photometric result of the target object.
  • two thresholds can be set: a first threshold and a second threshold, wherein the first threshold is less than the second threshold. These two thresholds may be predetermined or may be constantly adjusted during the tracking process.
  • When the brightness of the target object is less than the first threshold, the mobile device performs a compensation process to increase the brightness of the target object.
  • When the brightness of the target object is greater than the second threshold, the mobile device performs a compensation process to reduce the brightness of the target object.
  • the increased or decreased value can be set according to actual needs.
  • In this way, the compensated brightness of the target object is neither too dark nor too bright, so that the brightness of the entire picture, adjusted on that basis, is more appropriate.
  • After determining the target object to be tracked, the processor 340 generates, according to the control instruction input by the user, indication information instructing the mobile device to track the target object.
  • The indication information may instruct the mobile device to orient only its image acquisition device toward the target object; that is, the orientation of the movable device itself does not change, and only the orientation of the image acquisition device is adjusted.
  • Alternatively, the indication information may instruct both the mobile device and its image acquisition device to be directed toward the target object. In this case, both the mobile device itself and the image acquisition device it contains must be adjusted so that they all face the target object.
  • The indication information may further instruct the mobile device to follow the target object's motion while maintaining a preset distance from it.
  • FIG. 4 is a flow chart showing a method performed by a control device in accordance with one embodiment of the present disclosure.
  • the method can be performed by the control device 30 shown in FIG.
  • the description of the details of the control device 30 is omitted below for the sake of brevity.
  • In step S410, an image frame acquired by the mobile device is received from the mobile device.
  • In step S420, the received image frame is displayed.
  • In step S430, a control command input by the user is acquired.
  • In step S440, the target object to be tracked is determined in the image frame according to the control instruction input by the user, and the target object is taken as the object of focusing and/or metering.
  • a frame selection instruction made by the user on the display unit may be acquired, an object in the frame selection area is identified, and an object meeting the preset condition is taken as the target object.
  • the indicator box of the selected target object can be set as the focus frame and/or the metering frame of the metering.
  • In step S450, indication information is sent to the mobile device, instructing it to capture the target object and to focus and/or meter on the target object.
  • The indication information instructing the mobile device to track the target object may be generated according to the control instruction input by the user and then sent.
  • the indication information may indicate that the image acquisition device of the mobile device is toward the target object, or both the mobile device and the image acquisition device are toward the target object.
  • the indication information may further indicate that the movable device follows the target object motion and maintains a preset distance from the target object.
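  • One possible flat encoding of such indication information is sketched below; the field names and the encoding itself are hypothetical, since the disclosure does not specify a message format:

```python
from dataclasses import dataclass, asdict

@dataclass
class Indication:
    """Hypothetical indication-information message from control device to mobile device."""
    track: bool = True              # perform tracking shooting of the target
    aim_body: bool = False          # also turn the device itself toward the target
    focus_target: bool = True       # local focus on the target
    meter_target: bool = True       # local metering on the target
    follow: bool = False            # follow the target's motion
    preset_distance_m: float = 0.0  # distance to maintain while following

# The control device would serialize and send something like:
msg = asdict(Indication(follow=True, preset_distance_m=10.0))
```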
  • parameters for metering may be set and the set parameters are transmitted to the mobile device.
  • the parameter can include a compensation parameter for compensating for the photometric result of the target object, as described above.
  • Next, a schematic scene of a tracking shooting application according to an embodiment of the present disclosure will be described in detail with reference to FIGS. 5A-5D.
  • the principle of the technical solution of the present disclosure will be described using a smartphone as an example of a control device and an unmanned aerial vehicle as an example of a mobile device.
  • the principles of the present disclosure may be implemented in other types of control devices and mobile devices.
  • Figures 5A-5D show what is displayed on the screen of the smartphone.
  • The user can frame-select a certain person on the screen as the tracking target. Accordingly, the target object is determined as the area indicated by reference numeral 501.
  • Although the target object selected in FIG. 5A is near the center of the frame, those skilled in the art will understand that the position of the target object is not limited thereto and may be at any position on the screen.
  • Option box 502 includes a plurality of option buttons 502a-502c.
  • Option button 502a indicates that the drone is to perform the tracking function,
  • option button 502b indicates that the drone is to perform the "load tracking" function,
  • and option button 502c indicates focusing and/or metering on the target object.
  • The user can operate the corresponding button in option box 502 according to the function to be performed.
  • In the function corresponding to button 502a, both the nose of the drone and the image acquisition device face the target object; in the function corresponding to button 502b, only the image acquisition device faces the target object, while the drone keeps its own orientation unchanged or maintains another orientation.
  • For example, the user has selected the area indicated by reference numeral 501 for tracking and wants the drone to focus and/or meter on that area, thereby generating a tracking-shot picture. The user can then first tap button 502a to enable the tracking function of the drone, and then tap option button 502c to activate the drone's local focusing and/or metering function. In other words, tapping option button 502c will cause the drone to focus and/or meter only on target object 501 and to generate a real-time tracked picture based on the results of focusing and/or metering.
  • Preferably, after the user taps button 502c, a dialog box pops up on the screen, as shown in FIG. 5C.
  • The dialog box prompts the user that the drone will focus and/or meter on the target object. In this way, the user receives a helpful prompt when using the feature for the first time.
  • If the user does not want the dialog box displayed again the next time button 502c is tapped, the box before "Do not show again" can be checked.
  • Then, the next time the user taps 502c to enable local focusing and/or metering, the dialog box shown in FIG. 5C no longer pops up.
  • Of course, if the user no longer wants the drone to perform local focusing and/or metering, button 502c can be tapped again to disable the function.
  • FIG. 5D shows content similar to FIG. 5B, except that option box 502 further includes an option button 502d.
  • Option button 502d is used to control the flight state of the drone, including: the drone follows the motion of the target object and maintains a preset distance from it; or the position of the drone remains unchanged and only the image acquisition device (such as a camera) is adjusted to face the target object.
  • When the target object moves, the image acquisition device is adjusted in real time to ensure that it can always capture the target object.
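Keeping the image acquisition device pointed at a moving target amounts to recomputing, from each fresh position fix, the yaw and pitch that aim the camera at the target. A minimal geometric sketch, in which the function name and the shared world coordinate frame are assumptions for illustration:

```python
import math

def camera_angles(camera_pos, target_pos):
    """Yaw and pitch (degrees) that aim a camera at a target.

    Both positions are (x, y, z) tuples in a common world frame;
    yaw is measured in the x-y plane, pitch from the horizontal.
    A hypothetical flight controller would feed these angles to
    the gimbal on every position update.
    """
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```

A target level with the camera gives zero pitch; a target directly below gives a pitch of -90 degrees.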
  • Thus, the user can control the drone from the smartphone to perform the tracking function, and can conveniently enable/disable the drone's local focusing and/or metering during tracking, so that the picture captured by the drone can present background blur and the target object is more distinguishable in the picture.
  • In addition, users can view the tracking images captured by the drone in real time on their smartphones.
  • Alternatively, the user can also view the location of the target object and/or the drone in real time on the smartphone.
  • Such an arrangement of the present disclosure is typically provided as software, code, and/or other data structures arranged or encoded on a computer-readable medium such as an optical medium (e.g., CD-ROM), floppy disk, or hard disk; as firmware or microcode on one or more ROM, RAM, or PROM chips; or as downloadable software images or shared databases in one or more modules.
  • The software or firmware or such a configuration may be installed on a computing device so that one or more processors in the computing device perform the technical solutions described in the embodiments of the present disclosure.
  • Each functional module or individual feature of the devices used in the above embodiments may be implemented or executed by circuitry, typically one or more integrated circuits.
  • Circuitry designed to perform the various functions described in this specification may include general-purpose processors, digital signal processors (DSPs), application-specific integrated circuits (ASICs) or general-purpose integrated circuits, field-programmable gate arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic, discrete hardware components, or any combination of the above.
  • A general-purpose processor may be a microprocessor, or the processor may be an existing processor, controller, microcontroller, or state machine.
  • The above general-purpose processor or each circuit may be configured as a digital circuit or as a logic circuit.
  • Furthermore, when advances in semiconductor technology produce an advanced technology capable of replacing current integrated circuits, the present disclosure may also use integrated circuits obtained with that technology.
  • The program running on the device may be a program that causes a computer to implement the functions of the embodiments of the present invention by controlling a central processing unit (CPU).
  • The program, or information processed by the program, may be temporarily stored in a volatile memory (such as a random-access memory, RAM), a hard disk drive (HDD), a non-volatile memory (such as a flash memory), or another memory system.
  • A program for realizing the functions of the embodiments of the present invention can be recorded on a computer-readable recording medium.
  • The corresponding functions can be realized by causing a computer system to read the programs recorded on the recording medium and execute them.
  • The so-called "computer system" here may be a computer system embedded in the device and may include an operating system or hardware (such as peripheral devices).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Aviation & Aerospace Engineering (AREA)

Abstract

Provided is a movable device that is communicatively connected to a control device and accepts control instructions from the control device. The movable device includes: a sensing element that acquires position information of a target object; an image acquisition apparatus that acquires an image frame including the target object; and a controller that adjusts the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object. The controller is further configured to: control the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object. A control device and corresponding methods are also provided.

Description

Method and Device for Tracking Photographing
Technical Field
The present disclosure relates to the field of image processing and, more specifically, to a method and device for tracking photographing.
Background Art
At present, movable devices equipped with camera apparatuses are widely used. For example, a drone with a camera can track and photograph a desired object and transmit the captured images to a user in real time.
In a conventional tracking process, after the user selects a tracking target, the movable device sets its camera apparatus to a global average focusing mode. That is, it ensures not only clear imaging of the tracking target but also clear imaging of the other objects in the frame containing the tracking target. In addition, when the camera apparatus performs metering, global average metering is used.
However, when an image with a sharp target object and a blurred background is desired, the existing tracking shooting mode cannot meet this need, because it uses global average focusing. Moreover, because the existing tracking shooting mode uses global average metering, the tracking target may appear too dark or too bright in high-contrast scenes, resulting in low-quality images.
Summary of the Invention
To solve at least some of the above problems, the present disclosure proposes a technical solution that provides local focusing and/or metering in the tracking shooting mode. In this solution, focusing and/or metering can be performed on the tracking target in the tracking shooting mode, so that the area outside the tracking target can be blurred and/or a suitable brightness of the tracked frame can be achieved.
According to one aspect of the present disclosure, a movable device is provided, which is communicatively connected to a control device and accepts control instructions from the control device. The movable device includes: a sensing element that acquires position information of a target object; an image acquisition apparatus that acquires an image frame including the target object; and a controller that adjusts the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object. The controller is further configured to: control the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object.
According to another aspect of the present disclosure, a method performed by a movable device is provided. The movable device is communicatively connected to a control device and accepts control instructions from the control device, and includes a sensing element, an image acquisition apparatus, and a controller. The method includes: acquiring position information of a target object through the sensing element; acquiring an image frame including the target object through the image acquisition apparatus; and adjusting, through the controller, the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object. The controller controls the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object.
According to another aspect of the present disclosure, a control device for controlling a movable device is provided, including:
a receiving unit that receives, from the movable device, an image frame acquired by the movable device; a display unit that displays the received image frame; an information input unit that acquires a control instruction input by a user; a processor that determines, in the image frame, a target object to be tracked according to the control instruction input by the user, and sets the target object as the object of focusing and/or metering; and a sending unit that sends indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object.
According to another aspect of the present disclosure, a method performed by a control device for controlling a movable device is provided. The method includes: receiving, from the movable device, an image frame acquired by the movable device; displaying the received image frame; acquiring a control instruction input by a user; determining, in the image frame, a target object to be tracked according to the control instruction input by the user, and setting the target object as the object of focusing and/or metering; and sending indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object.
According to another aspect of the present disclosure, a tracking shooting system including a control device and a movable device is provided, in which the control device and the movable device are communicatively connected. The control device includes: a receiving unit that receives, from the movable device, an image frame acquired by the movable device; a display unit that displays the received image frame; an information input unit that acquires a control instruction input by a user; a processor that determines, in the image frame, a target object to be tracked according to the control instruction input by the user, and sets the target object as the object of focusing and/or metering; and a sending unit that sends indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object. The movable device includes: a sensing element that acquires position information of the target object; an image acquisition apparatus that acquires an image frame including the target object; and a controller that adjusts the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object. The controller is further configured to: control the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object, and send the image frame including the target object to the control device.
According to another aspect of the present disclosure, a method for a tracking shooting system is provided. The tracking shooting system includes a control device and a movable device that are communicatively connected. The method includes the following operations performed at the control device: receiving, from the movable device, an image frame acquired by the movable device, and displaying the received image frame; acquiring a control instruction input by a user; determining, in the image frame, a target object to be tracked according to the control instruction input by the user, and setting the target object as the object of focusing and/or metering; and sending indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object. The method further includes the following operations performed at the movable device: acquiring position information of the target object; adjusting the orientation of an image acquisition apparatus in the movable device so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object; and controlling, according to the position information of the target object, the image acquisition apparatus to focus and/or meter on the target object, and sending the image frame including the target object to the control device.
With the technical solution of the present disclosure, intelligent tracking shooting can be combined with local focusing and/or metering through a simple interaction. In the tracking shooting mode, focusing and/or metering is performed on the tracking target, so that the area outside the tracking target can be blurred and/or a suitable brightness of the tracked frame can be achieved.
Brief Description of the Drawings
The above and other features of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram illustrating a movable device according to an embodiment of the present disclosure.
FIG. 2 is a flowchart illustrating a method performed by a movable device according to an embodiment of the present disclosure.
FIG. 3 is a block diagram illustrating a control device according to an embodiment of the present disclosure.
FIG. 4 is a flowchart illustrating a method performed by a control device according to an embodiment of the present disclosure.
FIGS. 5A-5D are schematic diagrams illustrating a tracking shooting application according to an embodiment of the present disclosure.
Note that the drawings are not necessarily drawn to scale; the emphasis is on illustrating the principles of the technology disclosed herein. In addition, for clarity, like reference numerals refer to like elements throughout the drawings.
Detailed Description
The present disclosure is described in detail below with reference to the accompanying drawings and specific embodiments. It should be noted that the present disclosure is not limited to the specific embodiments described below. In addition, for brevity, detailed descriptions of well-known technologies not directly related to the present disclosure are omitted to avoid obscuring its understanding.
FIG. 1 is a block diagram illustrating a movable device 10 according to an embodiment of the present disclosure. As shown in FIG. 1, the movable device 10 includes a sensing element 110, an image acquisition apparatus 120, and a controller 130.
The movable device 10 may take various forms, including but not limited to a drone, an unmanned vehicle, or an unmanned ship. Hereinafter, the principles of the present disclosure are described using a drone as an example of the movable device 10. Those skilled in the art will understand that the described principles are equally applicable to other forms of movable devices.
The sensing element 110 may acquire position information of a target object. Here, the "target object" refers to an object that the user is interested in and therefore needs to track and photograph. In addition, the sensing element 110 may also acquire position information of the movable device 10 itself. Examples of the sensing element 110 may include one or more of a GPS sensor, a visual sensor, an ultrasonic sensor, an infrared sensor, and a radar.
The image acquisition apparatus 120 may acquire an image frame including the target object. For example, the image acquisition apparatus 120 may include one or more cameras that capture image frames including the target object in real time. In this embodiment, the image acquisition apparatus 120 is a visible-light camera. It will be understood that the image acquisition apparatus 120 may also be another apparatus capable of acquiring image frames, such as an infrared camera.
The controller 130 adjusts the orientation of the image acquisition apparatus 120 so that the image acquisition apparatus 120 always faces the target object and acquires image frames including the target object. For example, in the tracking shooting mode, the target object may move frequently and irregularly. In this case, the controller 130 needs to continuously acquire the real-time position information of the target object sensed by the sensing element 110 and adjust the orientation of the image acquisition apparatus 120 based on this position information, so that the image acquisition apparatus 120 can keep tracking and photographing the target object.
In the present disclosure, the controller 130 controls the image acquisition apparatus 120 to focus and/or meter on the target object according to the position information of the target object sensed by the sensing element 110. For example, the image acquisition apparatus 120 may receive a control instruction from the controller 130 and start or stop focusing and/or metering on the target object according to the received control instruction.
For example, if local focusing on only the target object is desired, the controller 130 may, according to the position information of the target object, control the image acquisition apparatus 120 to set the focus point on the tracked target object. In this way, in the image frame acquired by the image acquisition apparatus 120, the target object itself is sharp, without regard to whether the other objects in the frame are in focus. When shooting with a large aperture, the background blur is strong; with a small aperture, the background blur is weak. Those skilled in the art can use a suitable aperture value to achieve the corresponding background blur as needed.
On the other hand, if local metering on only the target object is needed, the image acquisition apparatus 120 receives a control instruction from the controller 130 according to the position information of the target object, acquires the brightness at the target object, and adjusts the brightness of the entire image frame according to that brightness. This ensures that the brightness of the target object is appropriate within the entire image frame, making the target object clearly distinguishable.
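The local metering described above, sampling brightness only at the target region and deriving the whole-frame adjustment from it, can be sketched as a toy model on a normalized-luminance frame; the ROI convention and the target level are assumptions for illustration:

```python
def spot_meter(frame, roi, target_level=0.5):
    """Mean luminance inside `roi` of a 2-D luminance frame.

    `frame` is a list of rows of values in [0, 1]; `roi` is
    (x, y, w, h). Returns the region's mean and the gain that
    would bring that region to `target_level` when applied to
    the whole frame. Names and defaults are illustrative.
    """
    x, y, w, h = roi
    pixels = [p for row in frame[y:y + h] for p in row[x:x + w]]
    mean = sum(pixels) / len(pixels)
    gain = target_level / mean if mean > 0 else 1.0
    return mean, gain
```

Only the target's pixels enter the mean, so a bright or dark background no longer drags the exposure away from the tracked object.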
Preferably, after metering the target object, the image acquisition apparatus 120 may compensate the brightness of the target object and set the brightness of the entire image frame based on the compensated brightness of the target object. For example, two thresholds may be set: a first threshold and a second threshold, the first threshold being smaller than the second threshold. These two thresholds may be predetermined or continuously adjusted during tracking shooting. When the brightness of the target object is below the first threshold, the image acquisition apparatus 120 performs compensation to raise the brightness of the target object. Conversely, when the brightness of the target object is above the second threshold, the image acquisition apparatus 120 performs compensation to lower the brightness of the target object. The amount raised or lowered can be set according to actual needs. In this way, the compensated brightness of the target object is neither too dark nor too bright, so the brightness of the entire frame adjusted based on it is also more appropriate.
Alternatively, the controller 130 may also control the image acquisition apparatus 120 to zoom the lens in or out according to the position information of the target object, enlarging or shrinking the acquired image frame. The advantage of this is that the user can adjust the size of the target object as needed. When the details of the target object need to be viewed, the controller 130 can control the image acquisition apparatus 120 to zoom in, so that the target object appears larger in the frame, making it easier for the user to view its details. When the overall environment of the target object needs to be viewed, the controller 130 can control the image acquisition apparatus 120 to zoom out, so that the captured frame can contain more information, making it easier for the user to view the surroundings of the target object.
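The zoom behavior described above (pulling the lens in to inspect detail, pulling it out to show surroundings) can be stated with a simple pinhole model. This is a hypothetical sketch; the function, the sensor height, and the size fraction are illustrative assumptions, not part of the disclosure:

```python
def zoom_for_target_size(distance, target_height, desired_frac,
                         sensor_height=0.0048):
    """Focal length (meters) at which a target of `target_height`
    (meters), seen at `distance` (meters), fills `desired_frac`
    of the sensor height, under a pinhole-camera model. The
    4.8 mm sensor height is an assumed placeholder.
    """
    image_height = desired_frac * sensor_height
    return distance * image_height / target_height
```

Raising `desired_frac` corresponds to zooming in on the target's details; lowering it corresponds to zooming out to show the environment.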
Alternatively, the controller 130 may also control the image acquisition apparatus 120 to adjust the white balance and/or color filter. For example, because weather conditions vary, a fixed white balance setting may not yield a good image. Therefore, the controller 130 can control the image acquisition apparatus 120 to adjust the white balance setting and/or color filter according to the surrounding lighting conditions, to obtain better image quality and/or a desired image style.
Alternatively, the controller 130 may also control the movable device 10 to follow the motion of the target object and maintain a preset distance from the target object. In addition, while adjusting the orientation of the image acquisition apparatus 120 so that the image acquisition apparatus 120 always faces the target object, the controller 130 may keep the orientation of the movable device 10 unchanged or adjust the movable device 10 so that it also faces the target object. In this way, the target object to be tracked can be photographed more conveniently. The controller 130 may also control the movable device 10 to stay at a particular position and control the image acquisition apparatus 120 to adjust its orientation in real time to capture the target object.
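Following the target while holding a preset distance, as described above, can be reduced to a proportional rule on the range error. A hypothetical two-dimensional sketch; the gain and the velocity-command interface are assumptions, not part of the disclosure:

```python
import math

def follow_command(device_pos, target_pos, preset_distance, gain=1.0):
    """Horizontal velocity command that holds `preset_distance`
    from the target.

    A positive range error (too far) commands motion toward the
    target; a negative one (too close) backs away along the same
    line. `gain` is an illustrative tuning constant.
    """
    dx = target_pos[0] - device_pos[0]
    dy = target_pos[1] - device_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)          # direction undefined: hold position
    error = dist - preset_distance
    return (gain * error * dx / dist, gain * error * dy / dist)
```

At exactly the preset distance the error is zero and the command vanishes, so the device hovers in place relative to the target.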
It will be understood that the local focusing and local metering techniques of the present disclosure may be applied separately or in combination. In addition, one or more of the lens adjustment (zooming in or out), white balance setting, and color filter setting described above may be applied together with local focusing and/or metering.
With the technical solution of the present disclosure, intelligent tracking shooting can be combined with local focusing and/or metering through a simple interaction. In the tracking shooting mode, focusing and/or metering is performed on the tracking target, so that the area outside the tracking target can be blurred and/or a suitable brightness of the tracked frame can be achieved.
FIG. 2 is a flowchart illustrating a method performed by a movable device according to an embodiment of the present disclosure. For example, the method may be performed by the movable device 10 shown in FIG. 1. For brevity, details of the movable device 10 are omitted below.
As shown in FIG. 2, in step S210, position information of a target object is acquired through the sensing element. In addition, the sensing element may also acquire position information of the movable device. As described above, the sensing element may include one or more of a GPS sensor, a visual sensor, an ultrasonic sensor, an infrared sensor, and a radar.
In step S220, an image frame including the target object is acquired through the image acquisition apparatus. For example, image frames including the target object may be captured in real time by one or more cameras.
In step S230, the orientation of the image acquisition apparatus is adjusted through the controller so that the image acquisition apparatus always faces the target object and acquires image frames including the target object.
In step S240, the image acquisition apparatus is controlled to focus and/or meter on the target object according to the position information of the target object. As described above, focusing and metering may be applied separately or in combination, and may also be applied together with one or more of the lens adjustment (zooming in or out), white balance setting, and color filter setting described above, which are not described in detail here.
The movable device according to an embodiment of the present disclosure and the method performed by it have been described above. Next, the control device that operates correspondingly with the movable device and its method are described in detail.
FIG. 3 is a block diagram illustrating a control device according to an embodiment of the present disclosure. As shown in FIG. 3, the control device 30 includes a receiving unit 310, a display unit 320, an information input unit 330, a processor 340, and a sending unit 350.
The control device 30 may take various forms, including but not limited to a smartphone, a control terminal, a tablet computer (PAD), etc. Hereinafter, a smartphone is used as an example of the control device 30 to describe the principles of the present disclosure. Those skilled in the art will understand that the described principles are equally applicable to other forms of control devices.
The receiving unit 310 receives image frames from the movable device. The display unit 320 displays the received image frames.
The information input unit 330 acquires control instructions input by a user. For example, the information input unit 330 may include a touch screen, control keys, etc. The control instructions input by the user may cause the movable device to perform the tracking shooting function on the target object, and may also cause the movable device to perform the focusing and/or metering functions on the target object described above.
The processor 340 determines, in the image frame, the target object to be tracked according to the control instruction input by the user, and sets the target object as the object of focusing and/or metering.
The sending unit 350 sends indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object. In one example, the information input unit 330 acquires a frame-selection instruction made by the user on the display unit 320, and the processor 340 identifies the objects in the selected area and takes an object meeting a preset condition as the target object. Alternatively, the processor 340 may set the indicator box of the frame-selected target object as the focus frame for focusing and/or the metering frame for metering.
In addition, the receiving unit 310 may also receive real-time position information of the target object from the movable device, so that the display unit 320 can display the real-time position information of the target object.
As described above, a "target object" refers to an object that the user is interested in and needs to track and photograph. The image acquisition apparatus of the movable device acquires a frame and transmits it back to the control device 30, and the target object is determined in the frame from the user input on the control device 30. After receiving the indication message sent by the sending unit 350, the movable device can track the target object in real time using its image acquisition apparatus (e.g., one or more cameras). The movable device can acquire the real-time position information of the target object through the sensing element, focus and/or meter on the target object according to the received indication information, generate an image frame including the target object, and transmit it to the control device 30.
Alternatively, the control device 30 may also include a parameter setting unit 360 (shown by the dashed line in FIG. 3). The parameter setting unit 360 may set parameters for metering, and the sending unit 350 may send the parameters set by the parameter setting unit 360 to the movable device. It will be understood that the parameter setting unit 360 may be a sub-unit of the information input unit 330 for setting parameters.
For example, the parameters may include a compensation parameter for compensating the metering result of the target object. Suppose two thresholds can be set: a first threshold and a second threshold, the first threshold being smaller than the second threshold. These two thresholds may be predetermined or continuously adjusted during tracking shooting. When the brightness of the target object is below the first threshold, the movable device performs compensation processing to raise the brightness of the target object. Conversely, when the brightness of the target object is above the second threshold, the movable device performs compensation processing to lower the brightness of the target object. The amount raised or lowered can be set according to actual needs. In this way, the compensated brightness of the target object is neither too dark nor too bright, so the brightness of the entire frame adjusted based on it is also more appropriate.
Specifically, after determining the target object to be tracked, the processor 340 generates, according to the control instruction input by the user, indication information instructing the movable device to track the target object. For example, the indication information may indicate: controlling the image acquisition apparatus of the movable device to face the target object. That is, the orientation of the movable device itself is not changed; only the orientation of the image acquisition apparatus is adjusted.
In addition, the indication information may indicate: controlling both the movable device and the image acquisition apparatus to face the target object. In this case, both the movable device itself and its image acquisition apparatus need to be adjusted so that they both face the target object. Preferably, when the indication information indicates controlling both the movable device and the image acquisition apparatus to face the target object, the indication information may further indicate: controlling the movable device to follow the motion of the target object and maintain a preset distance from the target object.
FIG. 4 is a flowchart illustrating a method performed by a control device according to an embodiment of the present disclosure. For example, the method may be performed by the control device 30 shown in FIG. 3. For brevity, details of the control device 30 are omitted below.
As shown in FIG. 4, in step S410, an image frame acquired by the movable device is received from the movable device. In step S420, the received image frame is displayed.
In step S430, a control instruction input by a user is acquired. In step S440, the target object to be tracked is determined in the image frame according to the control instruction input by the user, and the target object is set as the object of focusing and/or metering. For example, a frame-selection instruction made by the user on the display unit may be acquired, the objects in the selected area identified, and an object meeting a preset condition taken as the target object. The indicator box of the frame-selected target object may be set as the focus frame for focusing and/or the metering frame for metering.
In step S450, indication information is sent to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object.
Alternatively, after the target object to be tracked is determined, indication information instructing the movable device to track the target object may be generated according to the control instruction input by the user and then sent. For example, the indication information may indicate: that the image acquisition apparatus of the movable device faces the target object, or that both the movable device and the image acquisition apparatus face the target object. Preferably, when the indication information indicates controlling both the movable device and the image acquisition apparatus to face the target object, the indication information may further indicate: that the movable device follows the motion of the target object and maintains a preset distance from the target object.
Alternatively, parameters for metering may be set and sent to the movable device. For example, the parameters may include a compensation parameter for compensating the metering result of the target object, as described above.
A schematic scenario of a tracking shooting application according to an embodiment of the present disclosure is described in detail below with reference to FIGS. 5A-5D. In FIGS. 5A-5D, a smartphone is used as an example of the control device and a drone as an example of the movable device to illustrate the principles of the technical solution of the present disclosure. However, those skilled in the art will understand that the principles of the present disclosure can be implemented in other types of control devices and movable devices.
FIGS. 5A-5D show the content displayed on the screen of the smartphone.
In FIG. 5A, the user can frame-select a person on the screen as the tracking target. Accordingly, the target object is determined as the area indicated by reference numeral 501. Although the target object selected in FIG. 5A is near the center of the frame, those skilled in the art will understand that the position of the target object is not limited thereto and may be at any position on the screen.
In FIG. 5B, the user taps the circular button in the menu bar on the left side of the screen, and an option box 502 pops up. The option box 502 contains a plurality of option buttons 502a-502c. Option button 502a indicates that the drone is to perform the tracking function, option button 502b indicates that the drone is to perform the "load tracking" function, and option button 502c indicates focusing and/or metering on the target object. The user can operate the corresponding button in the option box 502 according to the function to be performed. In the function corresponding to button 502a, both the nose of the drone and the image acquisition apparatus face the target object; in the function corresponding to button 502b, only the image acquisition apparatus faces the target object, while the drone keeps its own orientation unchanged or maintains another orientation.
For example, the user has selected the area indicated by reference numeral 501 for tracking and wants the drone to focus and/or meter on that area, thereby generating a tracking-shot frame. The user can then first tap button 502a to enable the tracking function of the drone, and then tap option button 502c to activate the drone's local focusing and/or metering function. In other words, tapping option button 502c will cause the drone to focus and/or meter only on the target object 501 and to generate a real-time tracked frame based on the focusing and/or metering results.
Preferably, after the user taps button 502c, a dialog box pops up on the screen, as shown in FIG. 5C. The dialog box prompts the user that the drone will focus and/or meter on the target object. In this way, the user receives a helpful prompt when using the feature for the first time. If the user does not want the dialog box displayed again the next time button 502c is tapped, the box before "Do not show again" can be checked. Then, the next time the user taps 502c to enable local focusing and/or metering, the dialog box shown in FIG. 5C no longer pops up.
Of course, if the user no longer wants the drone to perform the local focusing and/or metering function, button 502c can be tapped again to disable the function.
FIG. 5D shows content similar to FIG. 5B, except that the option box 502 further includes an option button 502d. Option button 502d is used to control the flight state of the drone, including: the drone follows the motion of the target object and maintains a preset distance from the target object; or the position of the drone remains unchanged, only the image acquisition apparatus (such as a camera) is adjusted to face the target object, and when the target object moves, the image acquisition apparatus is adjusted in real time to ensure that it can always capture the target object.
Thus, the user can control the drone from the smartphone to perform the tracking function, and can conveniently enable/disable the drone's local focusing and/or metering function during tracking shooting, so that the frame captured by the drone can present background blur and the distinguishability of the target object in the frame is improved. In addition, the user can view the tracking frames captured by the drone in real time on the smartphone. Alternatively, the user can also view the location of the target object and/or the drone in real time on the smartphone.
The method of the present disclosure and the devices involved have been described above in conjunction with preferred embodiments. Those skilled in the art will understand that the methods shown above are merely exemplary. The method of the present disclosure is not limited to the steps and sequence shown above.
It should be understood that the above embodiments of the present disclosure may be implemented by software, hardware, or a combination of both. Such an arrangement of the present disclosure is typically provided as software, code, and/or other data structures arranged or encoded on a computer-readable medium such as an optical medium (e.g., CD-ROM), floppy disk, or hard disk; as firmware or microcode on one or more ROM, RAM, or PROM chips; or as downloadable software images or shared databases in one or more modules. The software or firmware or such a configuration may be installed on a computing device so that one or more processors in the computing device perform the technical solutions described in the embodiments of the present disclosure.
In addition, each functional module or feature of the devices used in each of the above embodiments may be implemented or executed by circuitry, typically one or more integrated circuits. Circuitry designed to perform the functions described in this specification may include a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC) or general-purpose integrated circuit, a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of the above. A general-purpose processor may be a microprocessor, or the processor may be an existing processor, controller, microcontroller, or state machine. The above general-purpose processor or each circuit may be configured as a digital circuit or as a logic circuit. Furthermore, when advances in semiconductor technology produce an advanced technology capable of replacing current integrated circuits, the present disclosure may also use integrated circuits obtained with that technology.
A program running on a device according to the present invention may be a program that causes a computer to implement the functions of the embodiments of the present invention by controlling a central processing unit (CPU). The program, or information processed by the program, may be temporarily stored in a volatile memory (such as a random-access memory, RAM), a hard disk drive (HDD), a non-volatile memory (such as a flash memory), or another memory system. A program for realizing the functions of the embodiments of the present invention may be recorded on a computer-readable recording medium. The corresponding functions can be realized by causing a computer system to read the programs recorded on the recording medium and execute them. The so-called "computer system" here may be a computer system embedded in the device and may include an operating system or hardware (such as peripheral devices).
As above, the embodiments of the present invention have been described in detail with reference to the accompanying drawings. However, the specific structure is not limited to the above embodiments; the present invention also includes any design modifications that do not depart from its gist. In addition, various modifications can be made to the present invention within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, components described in the above embodiments that have the same effect may be substituted for one another.

Claims (49)

  1. A movable device, communicatively connected to a control device and accepting control instructions from the control device, comprising:
    a sensing element that acquires position information of a target object;
    an image acquisition apparatus that acquires an image frame including the target object; and
    a controller that adjusts the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object;
    wherein the controller is further configured to: control the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object.
  2. The movable device according to claim 1, wherein the image acquisition apparatus is configured to:
    receive a control instruction from the controller; and
    start or stop focusing and/or metering on the target object according to the control instruction.
  3. The movable device according to claim 1, wherein the sensing element acquires real-time position information of the target object.
  4. The movable device according to claim 1 or 3, wherein the sensing element acquires the position information of the target object and position information of the movable device.
  5. The movable device according to claim 1, wherein the sensing element comprises one or more of a GPS sensor, a visual sensor, an ultrasonic sensor, an infrared sensor, and a radar.
  6. The movable device according to claim 1 or 2, wherein the image acquisition apparatus receives a control instruction from the controller, acquires the brightness of the area where the target object is located, and adjusts the brightness of the entire frame according to the brightness.
  7. The movable device according to claim 6, wherein the image acquisition apparatus is configured to:
    compensate the brightness of the target object after metering the target object; and
    set the brightness of the image based on the compensated brightness of the target object.
  8. The movable device according to claim 7, wherein the image acquisition apparatus is configured to:
    raise the brightness of the target object if the brightness of the target object is below a first threshold; or
    lower the brightness of the target object if the brightness of the target object is above a second threshold;
    wherein the second threshold is greater than the first threshold.
  9. The movable device according to claim 8, wherein the first threshold and the second threshold are predetermined or adjustable.
  10. The movable device according to claim 1 or 2, wherein the controller controls the image acquisition apparatus to zoom the lens in or out according to the position information of the target object, enlarging or shrinking the acquired image frame.
  11. The movable device according to claim 1 or 2, wherein the controller controls the image acquisition apparatus to adjust a white balance and/or a color filter.
  12. The movable device according to claim 1, wherein the controller further controls the movable device to follow the motion of the target object and maintain a preset distance from the target object.
  13. The movable device according to claim 1, wherein, while adjusting the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object, the controller keeps the orientation of the movable device unchanged or adjusts the movable device to also face the target object.
  14. The movable device according to any one of claims 1-13, wherein the movable device comprises a drone, an unmanned vehicle, or an unmanned ship.
  15. A method performed by a movable device, the movable device being communicatively connected to a control device and accepting control instructions from the control device, the movable device comprising a sensing element, an image acquisition apparatus, and a controller, the method comprising:
    acquiring position information of a target object through the sensing element;
    acquiring an image frame including the target object through the image acquisition apparatus; and
    adjusting, through the controller, the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object;
    wherein the controller controls the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object.
  16. The method according to claim 15, wherein the image acquisition apparatus receives a control instruction from the controller, and starts or stops focusing and/or metering on the target object according to the control instruction.
  17. The method according to claim 15, wherein the sensing element acquires real-time position information of the target object.
  18. The method according to claim 15 or 17, wherein the sensing element acquires the position information of the target object and position information of the movable device.
  19. The method according to claim 15, wherein the sensing element comprises one or more of a GPS sensor, a visual sensor, an ultrasonic sensor, an infrared sensor, and a radar.
  20. The method according to claim 15 or 16, wherein the image acquisition apparatus receives a control instruction from the controller, acquires the brightness of the area where the target object is located, and adjusts the brightness of the entire frame according to the brightness.
  21. The method according to claim 20, wherein the image acquisition apparatus compensates the brightness of the target object after metering the target object, and sets the brightness of the image based on the compensated brightness of the target object.
  22. The method according to claim 21, wherein
    the image acquisition apparatus raises the brightness of the target object if the brightness of the target object is below a first threshold; or
    the image acquisition apparatus lowers the brightness of the target object if the brightness of the target object is above a second threshold;
    wherein the second threshold is greater than the first threshold.
  23. The method according to claim 22, wherein the first threshold and the second threshold are predetermined or adjustable.
  24. The method according to claim 15 or 16, wherein the controller controls the image acquisition apparatus to zoom the lens in or out according to the position information of the target object, enlarging or shrinking the acquired image frame.
  25. The method according to claim 15 or 16, wherein the controller controls the image acquisition apparatus to adjust a white balance and/or a color filter.
  26. The method according to claim 15, wherein the controller further controls the movable device to follow the motion of the target object and maintain a preset distance from the target object.
  27. The method according to claim 15, wherein, while adjusting the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object, the controller keeps the orientation of the movable device unchanged or adjusts the movable device to also face the target object.
  28. The method according to claim 15, wherein the method further comprises sending the image frame acquired by the image acquisition apparatus to the control device.
  29. The method according to any one of claims 15-28, wherein the movable device comprises a drone, an unmanned vehicle, or an unmanned ship.
  30. A control device for controlling a movable device, comprising:
    a receiving unit that receives, from the movable device, an image frame acquired by the movable device;
    a display unit that displays the received image frame;
    an information input unit that acquires a control instruction input by a user;
    a processor that determines, in the image frame, a target object to be tracked according to the control instruction input by the user, and sets the target object as an object of focusing and/or metering; and
    a sending unit that sends indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object.
  31. The control device according to claim 30, wherein the information input unit acquires a frame-selection instruction input by the user on the display unit, and the processor identifies objects in the frame-selected area and takes an object meeting a preset condition as the target object.
  32. The control device according to claim 30, wherein the processor sets the indicator box of the frame-selected target object as a focus frame for focusing and/or a metering frame for metering.
  33. The control device according to claim 30, wherein the receiving unit is further configured to receive real-time position information of the target object, and the display unit is further configured to display the real-time position information of the target object.
  34. The control device according to claim 30, further comprising:
    a parameter setting unit configured to set a parameter for metering;
    wherein the sending unit is configured to send the parameter to the movable device.
  35. The control device according to claim 34, wherein the parameter comprises: a compensation parameter for compensating the metering result of the object.
  36. The control device according to claim 30, wherein, after determining the target object to be tracked, the processor generates, according to the control instruction input by the user, indication information instructing the movable device to track the target object.
  37. The control device according to claim 36, wherein the indication information comprises:
    controlling the image acquisition apparatus of the movable device to face the target object; or
    controlling both the movable device and the image acquisition apparatus to face the target object.
  38. The control device according to claim 37, wherein, when the indication information is to control both the movable device and the image acquisition apparatus to face the target object, the indication information further comprises:
    controlling the movable device to follow the motion of the target object and maintain a preset distance from the target object.
  39. A method performed by a control device for controlling a movable device, the method comprising:
    receiving, from the movable device, an image frame acquired by the movable device;
    displaying the received image frame;
    acquiring a control instruction input by a user;
    determining, in the image frame, a target object to be tracked according to the control instruction input by the user, and setting the target object as an object of focusing and/or metering; and
    sending indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object.
  40. The method according to claim 39, wherein a frame-selection instruction input by the user on a display unit is acquired, objects in the frame-selected area are identified, and an object meeting a preset condition is taken as the target object.
  41. The method according to claim 39, wherein the indicator box of the frame-selected target object is set as a focus frame for focusing and/or a metering frame for metering.
  42. The method according to claim 39, further comprising:
    receiving real-time position information of the target object; and
    displaying the real-time position information of the target object.
  43. The method according to claim 39, further comprising:
    setting a parameter for metering; and
    sending the parameter to the movable device.
  44. The method according to claim 43, wherein the parameter comprises: a compensation parameter for compensating the metering result of the object.
  45. The method according to claim 39, wherein, after the target object to be tracked is determined, indication information instructing the movable device to track the target object is generated according to the control instruction input by the user.
  46. The method according to claim 45, wherein the indication information comprises:
    controlling the image acquisition apparatus of the movable device to face the target object; or
    controlling both the movable device and the image acquisition apparatus to face the target object.
  47. The method according to claim 46, wherein, when the indication information is to control both the movable device and the image acquisition apparatus to face the target object, the indication information further comprises:
    controlling the movable device to follow the motion of the target object and maintain a preset distance from the target object.
  48. A tracking shooting system, comprising the control device according to any one of claims 30-38 and the movable device according to any one of claims 1-14, the control device and the movable device being communicatively connected, wherein
    the control device comprises:
    a receiving unit that receives, from the movable device, an image frame acquired by the movable device;
    a display unit that displays the received image frame;
    an information input unit that acquires a control instruction input by a user;
    a processor that determines, in the image frame, a target object to be tracked according to the control instruction input by the user, and sets the target object as an object of focusing and/or metering; and
    a sending unit that sends indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object;
    the movable device comprises:
    a sensing element that acquires position information of the target object;
    an image acquisition apparatus that acquires an image frame including the target object; and
    a controller that adjusts the orientation of the image acquisition apparatus so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object;
    wherein the controller is further configured to: control the image acquisition apparatus to focus and/or meter on the target object according to the position information of the target object, and send the image frame including the target object to the control device.
  49. A method for a tracking shooting system, the tracking shooting system comprising the control device according to any one of claims 30-38 and the movable device according to any one of claims 1-14, the control device and the movable device being communicatively connected, the method comprising:
    the following operations performed at the control device:
    receiving, from the movable device, an image frame acquired by the movable device, and displaying the received image frame;
    acquiring a control instruction input by a user;
    determining, in the image frame, a target object to be tracked according to the control instruction input by the user, and setting the target object as an object of focusing and/or metering; and
    sending indication information to the movable device, instructing the movable device to capture the target object and to focus and/or meter on the target object; and
    the following operations performed at the movable device:
    acquiring position information of the target object;
    adjusting the orientation of an image acquisition apparatus in the movable device so that the image acquisition apparatus always faces the target object and acquires an image frame including the target object; and
    controlling, according to the position information of the target object, the image acquisition apparatus to focus and/or meter on the target object, and sending the image frame including the target object to the control device.
PCT/CN2017/102992 2017-09-22 2017-09-22 用于跟踪拍摄的方法和设备 WO2019056312A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2017/102992 WO2019056312A1 (zh) 2017-09-22 2017-09-22 用于跟踪拍摄的方法和设备
CN201780006943.7A CN108521862B (zh) 2017-09-22 2017-09-22 用于跟踪拍摄的方法和设备
US16/822,630 US20200221005A1 (en) 2017-09-22 2020-03-18 Method and device for tracking photographing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/102992 WO2019056312A1 (zh) 2017-09-22 2017-09-22 用于跟踪拍摄的方法和设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/822,630 Continuation US20200221005A1 (en) 2017-09-22 2020-03-18 Method and device for tracking photographing

Publications (1)

Publication Number Publication Date
WO2019056312A1 true WO2019056312A1 (zh) 2019-03-28

Family

ID=63433075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/102992 WO2019056312A1 (zh) 2017-09-22 2017-09-22 用于跟踪拍摄的方法和设备

Country Status (3)

Country Link
US (1) US20200221005A1 (zh)
CN (1) CN108521862B (zh)
WO (1) WO2019056312A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102681582B1 (ko) * 2017-01-05 2024-07-05 삼성전자주식회사 전자 장치 및 그 제어 방법
CN112243582A (zh) * 2019-08-30 2021-01-19 深圳市大疆创新科技有限公司 补光控制方法、装置、系统和存储介质
US12041337B2 (en) 2019-09-03 2024-07-16 Sony Group Corporation Imaging control apparatus, imaging control method, program, and imaging device
CN112673380A (zh) * 2020-05-28 2021-04-16 深圳市大疆创新科技有限公司 图像处理的方法、装置、可移动平台以及系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103986865A (zh) * 2013-02-13 2014-08-13 索尼公司 成像设备、控制方法和程序
CN104571135A (zh) * 2013-10-20 2015-04-29 郁杰夫 一种云台追踪摄影系统和云台追踪摄影方法
CN105225254A (zh) * 2015-09-25 2016-01-06 凌云光技术集团有限责任公司 一种自动追踪局部目标的曝光方法及系统
CN105391939A (zh) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 无人机拍摄控制方法和装置、无人机拍摄方法和无人机
WO2017073310A1 (ja) * 2015-10-27 2017-05-04 三菱電機株式会社 構造物の形状測定用の画像撮影システム、構造物の形状測定に使用する構造物の画像を撮影する方法、機上制御装置、遠隔制御装置、プログラム、および記録媒体
CN107147882A (zh) * 2017-06-08 2017-09-08 柳州智视科技有限公司 一种自动实时跟踪目标对象的多分辨率观测系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101778214B (zh) * 2009-01-09 2011-08-31 华晶科技股份有限公司 具有亮度和对焦补偿的数字摄像装置及其影像补偿方法
CN104394326A (zh) * 2014-11-10 2015-03-04 广东欧珀移动通信有限公司 一种测光方法及终端
CN105721787B (zh) * 2016-02-01 2019-08-20 Oppo广东移动通信有限公司 调整局部区域曝光的方法、装置及移动终端
CN106506982B (zh) * 2016-12-07 2019-12-13 浙江宇视科技有限公司 一种获取测光参数的方法、装置及终端设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103986865A (zh) * 2013-02-13 2014-08-13 索尼公司 成像设备、控制方法和程序
CN104571135A (zh) * 2013-10-20 2015-04-29 郁杰夫 一种云台追踪摄影系统和云台追踪摄影方法
CN105225254A (zh) * 2015-09-25 2016-01-06 凌云光技术集团有限责任公司 一种自动追踪局部目标的曝光方法及系统
WO2017073310A1 (ja) * 2015-10-27 2017-05-04 三菱電機株式会社 構造物の形状測定用の画像撮影システム、構造物の形状測定に使用する構造物の画像を撮影する方法、機上制御装置、遠隔制御装置、プログラム、および記録媒体
CN105391939A (zh) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 无人机拍摄控制方法和装置、无人机拍摄方法和无人机
CN107147882A (zh) * 2017-06-08 2017-09-08 柳州智视科技有限公司 一种自动实时跟踪目标对象的多分辨率观测系统

Also Published As

Publication number Publication date
CN108521862A (zh) 2018-09-11
CN108521862B (zh) 2021-08-17
US20200221005A1 (en) 2020-07-09

Similar Documents

Publication Publication Date Title
US10798299B2 (en) Digital photographing apparatus, methods of controlling the same, and computer-readable storage medium to increase success rates in panoramic photography
US9876950B2 (en) Image capturing apparatus, control method thereof, and storage medium
US7565068B2 (en) Image-taking apparatus
US20200221005A1 (en) Method and device for tracking photographing
US8937677B2 (en) Digital photographing apparatus, method of controlling the same, and computer-readable medium
JP6501580B2 (ja) 撮像装置、撮像方法、およびプログラム
US20130027606A1 (en) Lens position
US10116856B2 (en) Imaging apparatus and imaging method for controlling a display while continuously adjusting focus of a focus lens
US20180139369A1 (en) Backlit face detection
US20140192246A1 (en) Digital photographing apparatus, method of controlling the same, and computer-readable recording medium
TW201331662A (zh) 相機系統以及自動對焦方法
WO2016062083A1 (zh) 一种对焦方法、装置及终端
US9800793B2 (en) Method for generating target gain value of wide dynamic range operation
JP5945444B2 (ja) 撮影機器
JP2017184182A (ja) 撮像装置
US11336802B2 (en) Imaging apparatus
US20210103201A1 (en) Flash metering for dual camera devices
JP2013135268A (ja) 画像処理装置及び画像処理方法
JP2018191141A (ja) 撮像装置
US20220232160A1 (en) Imaging apparatus and imaging control method
US20230209187A1 (en) Image pickup system that performs automatic shooting using multiple image pickup apparatuses, image pickup apparatus, control method therefor, and storage medium
US20230139034A1 (en) Image capturing apparatus and method for controlling image capturing apparatus
JP2018014659A (ja) 撮像装置
US9936158B2 (en) Image processing apparatus, method and program
JP2018074223A (ja) 撮像装置とその制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17925875

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17925875

Country of ref document: EP

Kind code of ref document: A1