WO2018214465A1 - Control device, imaging device, imaging system, moving body, control method, and program - Google Patents
Control device, imaging device, imaging system, moving body, control method, and program
- Publication number
- WO2018214465A1 (application PCT/CN2017/114806)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging device
- imaging
- reference region
- exposure
- Prior art date
Classifications
- G03B7/28 — Control of exposure: circuitry to measure or to take account of the object contrast
- B64U20/87 — Constructional aspects of UAVs: mounting of imaging devices, e.g. mounting of gimbals
- B64U30/20 — Means for producing lift: rotors; rotor supports
- G03B15/00 — Special procedures for taking photographs; apparatus therefor
- G06T7/70 — Image analysis: determining position or orientation of objects or cameras
- G06V10/147 — Image acquisition arrangements: details of sensors, e.g. sensor lenses
- G06V20/13 — Terrestrial scenes: satellite images
- G06V20/17 — Terrestrial scenes taken from planes or by drones
- H04N23/54 — Camera constructional details: mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/663 — Remote control of cameras or camera parts for controlling interchangeable camera parts based on electronic image sensor signals
- H04N23/6812 — Stable pick-up of the scene: motion detection based on additional sensors, e.g. acceleration sensors
- H04N23/71 — Circuitry for evaluating the brightness variation in the scene
- H04N23/72 — Brightness compensation: combination of two or more compensation controls
- H04N23/73 — Compensating brightness variation in the scene by influencing the exposure time
- H04N23/90 — Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- B64U10/14 — Rotorcraft UAVs: flying platforms with four distinct rotor axes, e.g. quadcopters
- B64U2101/30 — UAVs specially adapted for imaging, photography or videography
- B64U2201/20 — UAV flight controls: remote controls
- G06T2207/10032 — Image acquisition modality: satellite or aerial image; remote sensing
Definitions
- the present invention relates to a control device, an imaging device, an imaging system, a moving body, a control method, and a program.
- Patent Document 1 discloses a camera that calculates the film sensitivity suitable for photographing based on the measurement result of subject luminance.
- Patent Document 1 Japanese Patent Laid-Open Publication No. 2003-43548
- when the brightness of the imaging area imaged by the imaging device changes, the exposure of the imaging device may not be appropriately controlled.
- a control device may include a recognition unit that recognizes a first object existing in a reference region predetermined within the imaging range of the imaging device in a first image captured by the imaging device.
- the control device may include a prediction unit that predicts the position of the reference region of the second image captured after the first image based on the drive information for changing the position or orientation of the imaging device.
- the control device may include a control unit that, when the position of the reference region of the second image is included in the first image and is not on the first object, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
- when the position of the reference region of the second image is on the first object, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- the prediction unit can determine, based on the drive information, the amount of movement of the imaging device between the timing at which the imaging device captures the first image and the timing at which it captures the second image, and predict the position of the reference region of the second image based on the amount of movement.
- the prediction unit can determine the speed of the imaging device based on the drive information, and determine the amount of movement based on the speed and the time difference between the timing at which the imaging device captures the first image and the timing at which it captures the second image.
- the prediction unit may further determine, based on the drive information, the amount of change in the orientation of the imaging device between the timing at which the imaging device captures the first image and the timing at which it captures the second image, and predict the position of the reference region of the second image based on the amount of movement and the amount of change.
- the control unit may determine a first movement amount, which is the amount by which the imaging device, capturing the second image after the first image, should move until the position of the reference region of the second image is no longer on the first object.
- the recognition unit may further recognize a second object existing in a third image captured before the second image by another imaging device, which images an imaging range different from that of the imaging device.
- when the position of the reference region of the second image is included in the third image and is not on the first object or the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image corresponding to the reference region of the second image.
- when the position of the reference region of the second image is on the first object or the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- the prediction unit can determine, based on the drive information, the amount of movement of the imaging device between the timing at which the imaging device captures the first image and the timing at which it captures the second image, and predict the position of the reference region of the second image based on the amount of movement.
- the control unit can determine a first movement amount and a second movement amount; the first movement amount is the amount by which the imaging device should move after capturing the first image until the position of the reference region of the second image is no longer on the first object, and the second movement amount is the amount by which the imaging device should move after capturing the first image until the position of the reference region of the second image is on the second object. When the amount of movement is equal to or greater than the first movement amount and equal to or less than the second movement amount, the control unit can control the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image corresponding to the reference region of the second image.
- when the amount of movement is less than the first movement amount or greater than the second movement amount, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- when the position of the reference region of the second image is included in the third image and is not on the first object or the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image and the difference between the characteristics of images captured by the imaging device and the characteristics of images captured by the other imaging device.
- the imaging range of the other imaging device may be larger than the imaging range of the imaging device.
- the control unit may control exposure of the imaging device when imaging the second image based on the image data of the reference region of the first image.
- the recognition unit may further recognize the second object existing in the first image.
- when the position of the reference region of the second image is included in the first image and is not on the first object or the second object, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image. When the position of the reference region of the second image is on the first object or the second object, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- An imaging device may include the control device.
- the imaging device can perform imaging in accordance with the exposure controlled by the control unit.
- An imaging system may include the imaging device.
- the imaging system may include a support mechanism that supports the imaging device such that the posture of the imaging device can be changed.
- the imaging system may include another imaging device that captures an imaging range different from that of the imaging device.
- a moving body includes the imaging system and moves.
- the control method may include a step of recognizing a first object existing in a reference region predetermined within the imaging range of the imaging device in a first image captured by the imaging device.
- the control method may include a step of predicting the position of the reference region of the second image captured after the first image based on the drive information for changing the position or orientation of the imaging device.
- the control method may include a step of, when the position of the reference region of the second image is included in the first image and is not on the first object, controlling the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
- the program can cause the computer to execute a step of recognizing a first object existing in a reference region predetermined within the imaging range of the imaging device in a first image captured by the imaging device.
- the program can cause the computer to execute a step of predicting the position of the reference region of a second image captured after the first image, based on drive information for changing the position or orientation of the imaging device.
- the program can cause the computer to execute a step of, when the position of the reference region of the second image is included in the first image and is not on the first object, controlling the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
- according to the present invention, it is possible to prevent the exposure of the imaging device from being improperly controlled due to a change in the brightness of the imaging region imaged by the imaging device.
- FIG. 1 is a view showing an example of the appearance of a drone and a remote operation device.
- FIG. 2 is a view showing an example of functional blocks of the drone.
- FIG. 3A is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 3B is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 3C is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 3D is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 4A is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 4B is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 5 is a diagram for explaining a relationship between a reference area of an image and an object.
- FIG. 6 is a flowchart showing an example of a procedure of exposure control of the imaging device.
- FIG. 7 is a flowchart showing another example of the procedure of exposure control of the imaging device.
- FIG. 8 is a flowchart showing an example of a procedure for deriving an exposure control value.
- FIG. 9 is a diagram showing an example of a hardware configuration.
- various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed, or (2) a "part" of a device that is responsible for performing an operation.
- specific stages and "parts" can be implemented by programmable circuits and/or processors.
- Dedicated circuits may include digital and/or analog hardware circuits.
- integrated circuits (ICs) and/or discrete circuits may be included.
- the programmable circuit can include reconfigurable hardware circuitry.
- reconfigurable hardware circuits can include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
- the computer readable medium can comprise any tangible device that can store instructions that are executed by a suitable device.
- a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
- examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like.
- more specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a CD-ROM, a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
- the computer-readable instructions may include either source code or object code written in any combination of one or more programming languages. The source code or object code may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or code in an object-oriented programming language such as Smalltalk, JAVA (registered trademark), or C++, or in a conventional procedural programming language.
- the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
- the processor or programmable circuitry can execute computer readable instructions to form a means for performing the operations specified in the flowcharts or block diagrams.
- Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
- FIG. 1 shows an example of the appearance of the unmanned aerial vehicle (UAV) 10 and the remote operation device 300.
- the UAV 10 includes a UAV main body 20, a pan/tilt head 50, a plurality of imaging devices 60, and an imaging device 100.
- the pan/tilt head 50 and the imaging device 100 are examples of an imaging system.
- the UAV 10 is an example of a moving body propelled by the propulsion unit.
- the concept of the mobile body includes a flying object such as another aircraft moving in the air, a vehicle moving on the ground, a ship moving on the water, and the like.
- the UAV main body 20 is provided with a plurality of rotors.
- the plurality of rotors are an example of a propulsion unit.
- the UAV main body 20 flies the UAV 10 by controlling the rotation of the plurality of rotors.
- the UAV body 20 uses, for example, four rotors to fly the UAV 10.
- the number of rotors is not limited to four.
- the UAV 10 can also be a rotorless fixed wing aircraft.
- the imaging device 100 is an imaging camera that images an object included in a desired imaging range.
- the pan/tilt 50 supports the imaging device 100 so that the posture of the imaging device 100 can be changed.
- the pan/tilt head 50 rotatably supports the imaging device 100.
- the pan/tilt head 50 is an example of a support mechanism.
- the pan/tilt head 50 supports the imaging device 100 so as to be rotatable about the pitch axis by an actuator.
- the pan/tilt head 50 supports the image pickup apparatus 100 so as to be rotatable about the roll axis and the yaw axis by the actuator.
- the pan/tilt head 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
- the plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10 .
- two of the imaging devices 60 can be provided on the nose, that is, the front side, of the UAV 10.
- the other two imaging devices 60 may be provided on the bottom surface of the UAV 10.
- the two imaging devices 60 on the front side can be paired and function as a so-called stereo camera.
- the two imaging devices 60 on the bottom side may be paired and function as a stereo camera.
- the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of imaging devices 60.
- the number of imaging devices 60 included in the UAV 10 is not limited to four.
- the UAV 10 may have at least one imaging device 60.
- the UAV 10 can have at least one imaging device 60 on the head, the tail, the side, the bottom surface, and the top surface of the UAV 10, respectively.
- the viewing angle that the imaging device 60 can set can be larger than the viewing angle that the imaging device 100 can set. That is, the imaging range of the imaging device 60 can be larger than the imaging range of the imaging device 100.
- the camera device 60 can also have a fixed focus lens or a fisheye lens.
- the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
- the remote operating device 300 can communicate with the UAV 10 in a wireless manner.
- the remote operation device 300 transmits, to the UAV 10, drive information indicating various drive commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating.
- the drive information includes, for example, drive information that causes the height of the UAV 10 to rise.
- the drive information can indicate the height at which the UAV 10 should be.
- the UAV 10 moves to a height indicated by the drive information received from the remote operation device 300.
- FIG. 2 shows an example of functional blocks of the UAV 10.
- the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a pan/tilt head 50, an imaging device 60, and an imaging device 100.
- the communication interface 34 communicates with other devices such as the remote operation device 300.
- the communication interface 34 can receive indication information including various instructions of the remote operation device 300 for the UAV control unit 30.
- the memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the pan/tilt head 50, the imaging device 60, and the imaging device 100.
- the memory 32 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and a flash memory such as a USB memory.
- the memory 32 can be provided inside the UAV main body 20. It can be configured to be detachable from the UAV body 20.
- the UAV control unit 30 controls the flight and imaging of the UAV 10 based on the program stored in the memory 32.
- the UAV control unit 30 can be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
- the UAV control unit 30 controls the flight and imaging of the UAV 10 based on an instruction received from the remote operation device 300 via the communication interface 34.
- the propulsion unit 40 advances the UAV 10.
- the propulsion unit 40 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
- the propulsion unit 40 rotates the plurality of rotors by a plurality of drive motors in accordance with a drive command from the UAV control unit 30 to fly the UAV 10 .
- the UAV control unit 30 can analyze a plurality of images captured by the plurality of imaging devices 60 for sensing, thereby identifying the environment around the UAV 10.
- the UAV control unit 30 controls the flight in accordance with the environment around the UAV 10, for example, avoiding obstacles.
- the UAV control unit 30 can generate three-dimensional spatial data around the UAV 10 based on a plurality of images taken by the plurality of imaging devices 60, and control the flight based on the three-dimensional spatial data.
- the GPS receiver 41 receives a plurality of signals indicating the time of transmission from a plurality of GPS satellites.
- the GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV 10 based on the received plurality of signals.
- the IMU 42 detects the posture of the UAV 10. As the posture of the UAV 10, the IMU 42 detects the acceleration in the three-axis direction of the front, rear, left and right, and up and down of the UAV 10, and the angular velocities in the three axial directions of pitch, roll, and yaw.
- the magnetic compass 43 detects the orientation of the nose of the UAV 10.
- the barometric altimeter 44 detects the flying height of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure to a height to detect the height.
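As a hedged illustration of the pressure-to-height conversion the barometric altimeter performs (the text does not give a formula; this sketch assumes the international standard atmosphere):

```python
def pressure_to_altitude(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Convert measured air pressure (hPa) to altitude (m) using the
    standard-atmosphere relation; p0_hpa is the sea-level pressure."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# Example: ~12 hPa below sea-level pressure corresponds to roughly 100 m.
print(round(pressure_to_altitude(1001.3), 1))
```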
- the imaging device 100 includes an imaging unit 102 and a lens unit 200.
- the lens unit 200 is an example of a lens device.
- the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
- the image sensor 120 may be composed of a CCD or a CMOS.
- the image sensor 120 outputs image data of an optical image imaged by the plurality of lenses 210 to the imaging control section 110.
- the imaging control unit 110 can be configured by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
- the imaging control unit 110 can control the imaging device 100 based on an operation command for the imaging device 100 from the UAV control unit 30.
- the memory 130 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and a flash memory such as a USB memory.
- the memory 130 stores a program or the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
- the memory 130 may be provided inside the casing of the image pickup apparatus 100.
- the memory 130 may be disposed to be detachable from the housing of the image pickup apparatus 100.
- the lens unit 200 has a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220.
- the plurality of lenses 210 can function as a focus lens and a zoom lens. At least a portion or all of the plurality of lenses 210 are configured to be movable along the optical axis.
- the lens unit 200 may be an interchangeable lens that is detachably attachable to the imaging unit 102.
- the lens shifting mechanism 212 moves at least a part or all of the plurality of lenses 210 along the optical axis.
- the lens control unit 220 drives the lens shifting mechanism 212 in accordance with the lens control command from the image capturing unit 102 to move the one or more lenses 210 in the optical axis direction.
- the lens control commands are, for example, a zoom control command and a focus control command.
- the imaging apparatus 100 configured as described above controls the exposure of the imaging apparatus 100 based on the image data of the reference area determined in advance in the imaging range of the imaging apparatus 100.
- the image pickup apparatus 100 can derive an evaluation value of the brightness of the reference area within the image, and derive an exposure control value (EV value) based on the evaluation value of the brightness.
- the imaging apparatus 100 can control the aperture of the imaging apparatus 100, the shutter speed, the output gain of the image sensor 120, and the like according to the exposure control value, thereby controlling the exposure of the imaging apparatus 100.
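A minimal sketch of the kind of derivation described above, assuming an 8-bit luminance image and a mid-gray target; the function names and the log2 mapping are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def brightness_evaluation(image: np.ndarray, roi: tuple) -> float:
    """Evaluation value of brightness: mean luminance of the
    reference area (x, y, w, h) in an 8-bit image."""
    x, y, w, h = roi
    return float(image[y:y + h, x:x + w].mean())

def ev_correction(evaluation: float, target: float = 118.0) -> float:
    """Exposure correction in EV stops that would bring the reference
    area to the target level; log2 because one EV doubles the light."""
    return float(np.log2(target / max(evaluation, 1.0)))

# The resulting correction can then be split across the aperture,
# shutter speed, and the output gain of the image sensor, as the
# text notes.
```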
- the reference area may be a region of interest (ROI) that is predetermined for the imaging range of the imaging apparatus 100 in order to control exposure of the imaging apparatus 100.
- the position of the reference area may be a central portion of the imaging range of the imaging apparatus 100.
- the position of the reference area can be predetermined in accordance with the photographing mode of each imaging apparatus 100.
- the position of the reference area can be set to an arbitrary position within the imaging range of the imaging apparatus 100 in accordance with an instruction from the user.
- the shape and size of the reference area can be changed according to the photographing mode or the user's instruction.
- the reference area can be divided into a plurality of areas, and the divided areas can be scattered within the imaging range of the imaging device 100.
- the imaging apparatus 100 can derive an exposure control value for an image taken next time, for example, based on the luminance of the reference area within the current image.
- the image pickup apparatus 100 can image the next image based on the derived exposure control value.
- the image pickup apparatus 100 can sequentially image an image at a predetermined frame rate.
- the imaging apparatus 100 can derive an exposure control value when imaging the next frame based on the image data of the reference area of the current frame (image).
- the imaging range of the imaging apparatus 100 mounted on a mobile body such as the UAV 10 changes with the movement of the UAV 10 until the next image is captured.
- the imaging range of the imaging apparatus 100 supported by the pan/tilt head 50 changes with the driving of the pan/tilt head 50 until the next image is taken. Because of such a change, the brightness of the imaging range is changed, and the exposure of the imaging apparatus 100 when imaging the next image may not be properly controlled.
- the image 501 taken by the imaging apparatus 100 includes the object 400, and the reference area 511 is on the object 400.
- the reference area 512 is not on the object 400.
- the imaging apparatus 100 may not appropriately control the exposure at the time of imaging the image 502 based on the image data of the reference area 511 of the image 501.
- when the imaging device 100 captures the image 502, overexposure or underexposure may occur. For example, when the UAV 10 uses the imaging device 100 to photograph with a high-rise building as the target and the high-rise building falls outside the reference area of the image, overexposure may occur.
- the imaging apparatus 100 predicts the position of the reference region 512 of the image 502 to be captured next time.
- the imaging device 100 controls the exposure used when capturing the image 502 based on the image data of the image area 521 in the image 501 corresponding to the reference area 512 of the image 502. Thereby, even if the brightness of the imaging range of the imaging device 100 changes, the exposure of the imaging device 100 can be appropriately controlled.
- the image pickup apparatus 100 controls the exposure of the image pickup apparatus 100 when the image 502 is imaged based on the image data of the reference region 511 of the image 501.
- when the exposure is controlled based on the image data of the reference area 511 of the image 501, the imaging device 100 does not overexpose or underexpose when capturing the image 502. Therefore, in this case, the imaging device 100 does not need to execute the process of moving the reference area in order to control the exposure, which prevents an unnecessary increase in processing load.
- the imaging apparatus 100 controls exposure of the imaging apparatus 100 when imaging the image 502 based on the image data of the reference area 511 of the image 501.
- the imaging apparatus 100 controls the exposure based on the image data of the reference area 511 of the image 501, and the possibility of overexposure or underexposure can also be reduced.
- when the imaging device 100 photographs a group of high-rise buildings, even if the high-rise building present in the reference area of the current image is not included in the reference area of the next image, as long as another high-rise building is included, controlling the exposure based on the image data of the predetermined reference area of the current image also reduces the possibility of overexposure or underexposure. Therefore, in this case, the imaging device 100 does not need to perform the process of moving the reference area in order to control the exposure, which prevents an unnecessary increase in processing load.
- the imaging control unit 110 includes the recognition unit 112, the prediction unit 114, and the exposure control unit 116.
- the exposure control unit 116 is an example of a control unit.
- the recognition unit 112 recognizes, for example, an object 701 located in the reference region 611 predetermined in the imaging range of the imaging device 100 among the images 601 captured by the imaging device 100.
- the recognition unit 112 can recognize an object located within a predetermined distance from the imaging apparatus 100 as an object.
- the prediction unit 114 predicts the position of the reference area 612 of the image 602 captured after the image 601, based on drive information for changing the position or orientation of the imaging device 100.
- the prediction unit 114 can determine, based on the drive information, the movement amount D of the imaging device between the timing at which the imaging device 100 captures the image 601 and the timing at which it captures the image 602, and predict the position of the reference area 612 of the image 602 based on the movement amount D.
- the prediction unit 114 can determine the speed of the imaging device 100 based on the drive information, and determine the movement amount D based on the speed and the time difference between the timing at which the imaging device 100 captures the image 601 and the timing at which it captures the image 602.
- the prediction unit 114 can determine the speed of the imaging apparatus 100 based on the driving information of the UAV 10 transmitted by the remote operation device 300.
- the prediction unit 114 can determine the movement amount D based on the speed v of the imaging apparatus 100 and the frame rate f(fps) of the imaging apparatus 100.
- the prediction unit 114 can determine the movement amount D by calculating v ⁇ (1/f).
- the prediction unit 114 may further determine, based on the drive information, the amount of change H of the orientation of the image pickup apparatus 100 between the timing at which the image pickup apparatus 100 images the image 601 and the timing at which the image 602 is imaged, and based on the movement amount D and the amount of change H The position of the reference area 612 of the image 602 is predicted.
- the prediction unit 114 can determine the amount of change H in the orientation of the imaging device 100 based on at least one of the drive information of the UAV 10 and the drive information of the pan/tilt 50 transmitted by the remote operation device 300.
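The prediction above can be sketched as follows, using the stated relation D = v × (1/f); the pixel-scale factors that map the movement amount D and the change amount H to an image-plane offset are assumptions standing in for the camera geometry:

```python
def predict_reference_area_shift(v: float, f: float, yaw_rate: float,
                                 px_per_meter: float, px_per_degree: float):
    """Movement amount D = v * (1/f); change amount H = yaw_rate * (1/f).
    Returns (D, H, predicted pixel shift of the reference area)."""
    dt = 1.0 / f
    D = v * dt
    H = yaw_rate * dt
    return D, H, D * px_per_meter + H * px_per_degree

# Example: 5 m/s forward at 30 fps with a 12 deg/s pan command.
D, H, shift_px = predict_reference_area_shift(5.0, 30.0, 12.0, 40.0, 25.0)
```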
- when the position of the reference area 612 of the image 602 is included in the image 601 and is not on the object 701, the exposure control unit 116 controls the exposure of the imaging device 100 when capturing the image 602 based on the image data of the image area 621 in the image 601 corresponding to the reference area 612 of the image 602. As shown in FIG. 4B, when the position of the reference area 612 of the image 602 is on the object 701, the exposure control unit 116 controls the exposure of the imaging device 100 when capturing the image 602 based on the image data of the reference area 611 of the image 601.
- the exposure control unit 116 can determine the standard movement amount d0, which is the amount by which the imaging device 100, capturing the image 602 after the image 601, should move until the position of the reference area 612 of the image 602 is no longer on the object 701.
- the standard movement amount d0 is an example of the first movement amount.
- the exposure control unit 116 can determine, as the standard movement amount d0, the distance from the end of the reference area 611 of the image 601 on the moving-direction side of the imaging device 100 to the end of the object 701 on the moving-direction side of the imaging device 100.
- the exposure control unit 116 can also determine, as the standard movement amount d0, the distance from the end of the reference area 611 of the image 601 on the moving-direction side of the imaging device 100 to the end of the object 701 farthest from that end.
- the exposure control unit 116 can also determine, as the standard movement amount d0, the distance from the end of the reference area 611 of the image 601 opposite to the moving-direction side of the imaging device 100 to the end of the object 701 on the moving-direction side of the imaging device 100.
- when the movement amount D is larger than the standard movement amount d0, the exposure control unit 116 may determine that the position of the reference area 612 of the image 602 is not on the object 701.
- when the movement amount D is equal to or smaller than the standard movement amount d0, the exposure control unit 116 may determine that the position of the reference area 612 of the image 602 is on the object 701.
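The decision in the last two paragraphs reduces to a comparison of the movement amount D against the standard movement amount d0; a sketch with illustrative names:

```python
def reference_area_on_object(D: float, d0: float) -> bool:
    """True when the predicted movement D does not exceed the standard
    movement amount d0, i.e. the reference area of the next image is
    still expected to lie on the object."""
    return D <= d0
```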
- when the imaging device 100 moves quickly, that is, when the UAV 10 moves quickly or the like, the reference area of the image to be captured next is sometimes located outside the current imaging range of the imaging device 100. At this time, it is not possible to determine, based on the image captured by the imaging device 100, what kind of object is located in the reference area of the image to be captured next. Therefore, in this case, the imaging device 100 can control the exposure without moving the reference area.
- the UAV 10 includes an imaging device 60 that captures an imaging range different from the imaging device 100 in addition to the imaging device 100.
- the imaging device 60 functions as a sensor for detecting an obstacle around the UAV 10.
- the recognition unit 112 can recognize an object outside the imaging range of the imaging device 100 using the image of the imaging device 60.
- an object 701 and an object 702 are included in an image 800 taken by the imaging device 60.
- the image 601 taken by the imaging apparatus 100 includes the object 701, but does not include the object 702.
- in the image 601, there is no image area corresponding to the reference area 612 of the image 602 captured by the imaging device 100 after the image 601. In the image 800, however, there is an image area 821 corresponding to the reference area 612 of the image 602.
- the imaging apparatus 100 can control the exposure of the imaging apparatus 100 based on the image data of the image area 821 in the image 800 captured by the imaging apparatus 60.
- the recognition unit 112 can also recognize the object 702 that exists in the image 800 taken by the imaging device 60 that images the imaging range different from the imaging device 100 before the image 602.
- the exposure control unit 116 may control the exposure of the imaging device 100 when capturing the image 602 based on the image data of the image area 821 in the image 800 corresponding to the reference area 612 of the image 602. Otherwise, the exposure control unit 116 may control the exposure of the imaging device 100 when capturing the image 602 based on the image data of the reference area 611 of the image 601.
- the characteristics of the image sensor 120, the lens 210, and the like of the imaging apparatus 100 may be different from those of the image sensor and the lens of the imaging apparatus 60. At this time, the characteristics of the image taken by the imaging apparatus 100 and the characteristics of the image captured by the imaging apparatus 60 may be different. Therefore, when the exposure of the image pickup apparatus 100 is controlled in accordance with the image taken by the image pickup device 60, it is preferable to perform the correction.
- the exposure control unit 116 may control the exposure of the imaging device 100 when capturing the image 602 based on the image data of the image area 821 in the image 800 and the difference between the characteristics of the image captured by the imaging device 100 and the characteristics of the image captured by the imaging device 60.
- the exposure control unit 116 may interpolate the luminance of the image area 821 in the image 800 according to a predetermined interpolation coefficient, derive an evaluation value of the luminance of the image area 821 from the interpolated luminance, and derive the exposure control value of the imaging device 100 from the derived evaluation value of the brightness.
- the interpolation coefficient can be determined based on the difference between the characteristics of the image captured by the imaging device 100 and the characteristics of the image captured by the imaging device 60.
- the imaging apparatus 100 and the imaging apparatus 60 can image the same subject, and determine the interpolation coefficient in advance by comparing the captured images with each other.
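One plausible form of this calibration, assuming the coefficient is taken as a ratio of mean luminances over the shared subject (the text only states that the coefficient reflects the difference in characteristics):

```python
import numpy as np

def calibrate_interpolation_coefficient(main_img: np.ndarray,
                                        sensing_img: np.ndarray) -> float:
    """Both cameras image the same subject; the luminance ratio maps
    sensing-camera brightness onto the main camera's scale."""
    return float(main_img.mean()) / max(float(sensing_img.mean()), 1e-6)

def interpolated_evaluation(sensing_area: np.ndarray, coeff: float) -> float:
    """Evaluation value of the image area 821 after interpolating its
    luminance with the calibrated coefficient."""
    return float(np.clip(sensing_area * coeff, 0, 255).mean())
```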
- the prediction unit 114 can determine, based on the drive information, the movement amount D of the imaging device 100 between the timing at which the imaging device 100 captures the image 601 and the timing at which it captures the image 602, and predict the position of the reference area 612 of the image 602 based on the movement amount D.
- the exposure control unit 116 can determine a first standard movement amount d0 and a second standard movement amount d1. The first standard movement amount d0 is the amount by which the imaging device 100 should move after capturing the image 601 until the position of the reference area 612 of the image 602 is no longer on the object 701.
- the second standard movement amount d1 is the amount by which the imaging device 100 should move after capturing the image 601 until the position of the reference area 612 of the image 602 is on the object 702.
- when the movement amount D is equal to or greater than the first standard movement amount d0 and equal to or less than the second standard movement amount d1, the exposure control unit 116 can control the exposure of the imaging device 100 when capturing the image 602 based on the image data of the image area 821 in the image 800. The first standard movement amount d0 is an example of the first movement amount, and the second standard movement amount d1 is an example of the second movement amount.
- otherwise, the exposure control unit 116 can control the exposure of the imaging device 100 when capturing the image 602 based on the image data of the reference area 611 of the image 601.
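The d0/d1 branch can be summarized in a few lines; a sketch under the assumption that the two thresholds behave as just described (identifiers are illustrative):

```python
def select_metering_source(D: float, d0: float, d1: float) -> str:
    """Between the two standard movement amounts, the predicted
    reference area has left object 701 and is not yet on object 702,
    so the sensing camera's image area 821 is metered; otherwise the
    current reference area 611 is kept."""
    if d0 <= D <= d1:
        return "image_area_821_in_sensing_image_800"
    return "reference_area_611_in_image_601"
```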
- the prediction by the prediction unit 114 of the position of the reference region of the next image is performed based on the drive information for controlling the UAV 10 or the pan/tilt 50 before imaging the next image.
- however, before the next image is captured, the UAV 10 or the pan/tilt head 50 may be controlled based on additionally received drive information. In this case, the position of the reference area predicted by the prediction unit 114 may become inaccurate.
- the exposure control unit 116 may control exposure of the imaging device when the image 602 is imaged based on the image data of the reference region 611 of the image 601.
- FIG. 6 is a flowchart showing an example of a procedure of exposure control by the imaging device 100 executed by the imaging control unit 110.
- the recognition unit 112 determines whether or not the object exists in the reference region of the current image captured by the imaging device 100 (S100).
- the recognition unit 112 can determine whether an object exists in the reference area of the current image according to whether an object within a predetermined distance from the imaging device 100 is present in that area. If there is no object in the reference area, the exposure control unit 116 derives the exposure control value of the imaging device 100 based on the evaluation value of the brightness of the reference area in the current image (S114). Then, the exposure control unit 116 applies the exposure control value derived for the reference area in the current image to the next imaging and captures the next image (S116).
- the prediction unit 114 determines whether the UAV control unit 30 receives the drive command of the UAV 10 or the pan/tilt 50 (S102). If the drive command is not received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image. When receiving the drive command indicating that the UAV 10 is hovering, it is also determined that the UAV 10 has not moved, and the exposure control unit 116 can apply the exposure control value derived for the reference region in the current image to the next imaging and image the next image.
- when the drive command is received, the prediction unit 114 determines the speed of the UAV 10 from the drive command and the time until the next imaging from the frame rate of the imaging device 100, and predicts the position of the reference area of the next image based on the speed and the time (S104).
- the exposure control unit 116 determines whether or not the position of the reference region of the next image is on the object located in the reference region of the current image (S106). When the position of the reference area of the next image is on the object located in the reference area of the current image, the exposure control section 116 applies the exposure control value derived for the reference area in the current image to the next imaging and performs the next image. Camera.
- when the position of the reference area of the next image is not on the object located in the reference area of the current image, the exposure control unit 116 derives the exposure control value based on the evaluation value of the brightness of the image area in the current image corresponding to the reference area of the next image (S108). Then, the exposure control unit 116 determines whether the UAV control unit 30 receives an additional drive command for the UAV 10 or the pan/tilt head 50 before the next shooting (S110). When an additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the reference area in the current image to the next imaging and captures the next image. In S110, the determination can be made, for example, after waiting a predetermined time of about 1 second. This handles the case where the UAV 10 responds to an input that moves it in a direction different from the initial moving direction; in that case, the reference area 611 remains on the object 701, and the exposure is not changed.
- the exposure control unit 116 applies the exposure control value derived for the image region in the current image in step S108 to the next imaging and images the next image (S112).
- as described above, the imaging device 100 predicts the position of the reference area of the next image, and if no object located in the reference area of the current image is at the predicted position of the reference area, controls the exposure used when capturing the next image based on the image data of the image area in the current image corresponding to the predicted reference area of the next image.
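A compact, non-authoritative rendering of the FIG. 6 flow (S100-S116); `object_in`, `predict_roi`, and `meter` are hypothetical helpers standing in for the units described above:

```python
def figure6_exposure_step(img, roi, drive_cmd, object_in, predict_roi, meter):
    """Sketch of the FIG. 6 procedure.

    S100: no object in the reference area -> meter on it (S114/S116).
    S102: no drive command -> reuse the current exposure value.
    S104: predict the next reference area from the drive command.
    S106: still on the object -> keep the current value.
    S108: otherwise meter on the corresponding image area.
    """
    if not object_in(img, roi):             # S100
        return meter(img, roi)              # S114
    if drive_cmd is None:                   # S102
        return meter(img, roi)
    next_roi = predict_roi(roi, drive_cmd)  # S104
    if object_in(img, next_roi):            # S106
        return meter(img, roi)
    return meter(img, next_roi)             # S108
```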
- FIG. 7 is a flowchart showing another example of the procedure of the exposure control of the imaging apparatus 100 executed by the imaging control unit 110.
- the recognition unit 112 determines whether or not an object is present based on the image captured by the imaging device 60 for sensing (S200). If there is no object, the exposure control unit 116 derives the exposure control value of the imaging apparatus 100 based on the luminance evaluation value of the reference region in the current image (S224). Then, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and images the next image (S226).
- the recognition unit 112 determines whether or not an object exists in the reference area of the current image captured by the imaging apparatus 100 (S202). If there is no object in the reference area of the current image, the exposure control unit 116 applies the exposure control value derived for the reference area in the current image to the next imaging and images the next image.
- the prediction unit 114 determines whether the UAV control unit 30 receives the drive command of the UAV 10 or the pan/tilt 50 (S204). If the drive command is not received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
- when the drive command is received, the prediction unit 114 determines the speed of the UAV 10 from the drive command and the time until the next imaging from the frame rate of the imaging device 100, and predicts the position of the reference area of the next image based on the speed and the time (S206).
- next, the exposure control unit 116 derives the distance d0 from the end on the moving-direction side of the UAV 10 of the reference area of the current image to the end on the moving-direction side of the object, and the distance D that the imaging device 100 moves before capturing the next image (S208).
- the exposure control unit 116 determines whether or not the distance D is equal to or smaller than the distance d0 (S210). When the distance D is equal to or smaller than the distance d0, it is determined that the reference region of the next image has an object, and the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and performs the next image. Camera.
- when the distance D is larger than the distance d0, the exposure control unit 116 determines whether or not another object exists (S212).
- the exposure control unit 116 can determine whether or not another object exists based on the detection result of the object recognized by the recognition unit 112 based on the image of the imaging device 60. In other words, the exposure control unit 116 determines whether or not there is an object outside the imaging range, in addition to determining whether or not there is an object in the imaging range of the imaging device 100.
- when no other object exists, the exposure control unit 116 derives the exposure control value from the evaluation value of the brightness of the image area in the current image captured by the imaging device 100 corresponding to the reference area of the next image, or of the image area in the image captured by the imaging device 60 (S218).
- the exposure control unit 116 determines whether or not the reference region of the next image exists in the current image of the imaging apparatus 100 (S300). When the reference area of the next image exists in the current image of the image pickup apparatus 100, the exposure control section 116 derives the exposure based on the evaluation value of the brightness of the image area in the current image of the image pickup apparatus 100 corresponding to the reference area of the next image. Control value (S302).
- when the reference area of the next image does not exist in the current image of the imaging device 100, the exposure control unit 116 determines an image area corresponding to the reference area of the next image from the image captured by the sensing imaging device 60 (S304). The exposure control unit 116 derives an exposure control value based on the evaluation value of the brightness of the determined image area in the image captured by the imaging device 60 (S306).
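A sketch of this selection (S300-S306), assuming the interpolation coefficient calibrated earlier; the helper `sensing_roi_lookup` and parameter names are illustrative:

```python
import numpy as np

def derive_exposure_evaluation(next_roi, main_img, sensing_img,
                               sensing_roi_lookup, coeff):
    """S300: if the next reference area lies inside the main camera's
    current image, evaluate brightness there (S302); otherwise locate
    the corresponding area in the sensing camera's image (S304) and
    evaluate its interpolated brightness (S306)."""
    x, y, w, h = next_roi
    H, W = main_img.shape[:2]
    if 0 <= x and 0 <= y and x + w <= W and y + h <= H:      # S300
        return float(main_img[y:y + h, x:x + w].mean())      # S302
    sx, sy, sw, sh = sensing_roi_lookup(next_roi)            # S304
    area = sensing_img[sy:sy + sh, sx:sx + sw]
    return float(np.clip(area * coeff, 0, 255).mean())       # S306
```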
- the exposure control unit 116 then determines whether or not the UAV control unit 30 receives an additional drive command for the UAV 10 or the pan/tilt head 50 before the next shooting (S220). When an additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the reference area in the current image to the next imaging and captures the next image.
- the exposure control unit 116 applies the exposure control value derived in step S218 to the next imaging and images the next image (S222).
- when another object exists, the exposure control unit 116 derives the distance d1 from the end on the moving-direction side of the UAV 10 of the reference area of the current image to the end of the other object opposite to its end on the moving-direction side (S214).
- the exposure control unit 116 determines whether or not the distance D is equal to or greater than the distance d0 and equal to or smaller than the distance d1. When the distance D is larger than the distance d1, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and images the next image.
- When the distance D is equal to or greater than the distance d0 and equal to or smaller than the distance d1, the exposure control unit 116 derives the exposure control value from the evaluation value of the luminance of the image region, corresponding to the reference region of the next image, in the current image captured by the imaging device 100 or in the image captured by the imaging device 60 (S218). After the exposure control value is derived, the exposure control unit 116 determines whether or not the UAV control unit 30 receives an additional drive command for the UAV 10 or the gimbal 50 before the next shot (S220). When an additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
- When no additional drive command is received, the exposure control unit 116 applies the exposure control value derived in step S218 to the next imaging and captures the next image (S222).
- As described above, according to the imaging device 100 of the present embodiment, if no object present in the current image captured by the imaging device 100 or the imaging device 60 is in the reference region of the image to be captured next, the exposure of the imaging device 100 when capturing the next image is controlled based on the image data of the image region corresponding to the reference region of the next image.
- As a result, even when the UAV 10 or the gimbal 50 is driven and the brightness of the imaging range of the imaging device 100 changes before the next image is captured, the exposure control of the imaging device 100 can be prevented from becoming inappropriate.
- FIG. 9 shows an example of a computer 1200 that implements various aspects of the present invention, in whole or in part.
- A program installed in the computer 1200 can cause the computer 1200 to function as the operations associated with the device according to the embodiment of the present invention or as one or more "units" of the device. Alternatively, the program can cause the computer 1200 to execute the operation or the one or more "units".
- The program enables the computer 1200 to execute the processes according to embodiments of the present invention or the stages of those processes. Such a program may be executed by the CPU 1212 in order for the computer 1200 to perform the specified operations associated with some or all of the blocks of the flowcharts and block diagrams described in this specification.
- the computer 1200 in the present embodiment includes a CPU 1212 and a RAM 1214, and the CPU 1212 and the RAM 1214 are connected to each other by a host controller 1210.
- The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220.
- Computer 1200 also includes a ROM 1230.
- the CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling the respective units.
- Communication interface 1222 communicates with other electronic devices over a network.
- the hard disk drive can store programs and data for use by the CPU 1212 within the computer 1200.
- The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup and/or programs dependent on the hardware of the computer 1200.
- The program can be provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
- The program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212.
- The information processing described in these programs is read by the computer 1200 and brings about cooperation between the program and the various types of hardware resources.
- A device or method may be constituted by realizing the operation or processing of information through the use of the computer 1200.
- For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 can execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing in accordance with the processing described in the communication program.
- Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, sends the read transmission data to the network, or writes reception data received via the network into a reception buffer or the like provided in the recording medium.
- The CPU 1212 can read all or a necessary part of the files or databases stored in an external recording medium such as a USB memory into the RAM 1214, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
- Various types of information such as various types of programs, data, tables, and databases can be stored in a recording medium and subjected to information processing.
- The CPU 1212 can perform various types of processing on the data read from the RAM 1214 and write the results back into the RAM 1214; this includes the various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like that are specified by the program's instruction sequences and described throughout the present disclosure.
- The CPU 1212 can also retrieve information in files, databases, and the like in the recording medium.
- For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute is stored in the recording medium, the CPU 1212 can retrieve from the plurality of entries an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies a predetermined condition.
- The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium near the computer 1200.
- A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, so that the programs can be supplied to the computer 1200 via the network.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Mechanical Engineering (AREA)
- General Health & Medical Sciences (AREA)
- Vascular Medicine (AREA)
- Health & Medical Sciences (AREA)
- Astronomy & Astrophysics (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Exposure Control For Cameras (AREA)
Abstract
The problem addressed by the present invention is that the brightness of the imaging area imaged by an imaging device sometimes changes, so that the exposure of the imaging device cannot be controlled appropriately. A control device may include a recognition unit that recognizes a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device. The control device includes a prediction unit that predicts, based on drive information for changing the position or orientation of the imaging device, the position of the reference region of a second image captured after the first image. The control device includes a control unit that, when the position of the reference region of the second image is contained in the first image and is not on the first object, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
Description
The present invention relates to a control device, an imaging device, an imaging system, a movable body, a control method, and a program.
Patent Document 1 discloses a camera that computes, from a measurement result of subject luminance, the film speed suitable for shooting.
Patent Document 1: Japanese Patent Application Laid-Open No. 2003-43548
Summary of the Invention
Technical problem to be solved by the invention
The brightness of the imaging area imaged by an imaging device sometimes changes, so that the exposure of the imaging device cannot be controlled appropriately.
Means for solving the technical problem
A control device according to one aspect of the present invention may include a recognition unit that recognizes a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device. The control device may include a prediction unit that predicts, based on drive information for changing the position or orientation of the imaging device, the position of the reference region of a second image captured after the first image. The control device may include a control unit that, when the position of the reference region of the second image is contained in the first image and is not on the first object, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
When the position of the reference region of the second image is on the first object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
The prediction unit may determine, based on the drive information, the amount of movement of the imaging device between the timing at which the imaging device captures the first image and the timing at which it captures the second image, and predict the position of the reference region of the second image based on the amount of movement.
The prediction unit may determine the speed of the imaging device based on the drive information, and determine the amount of movement from the speed and the difference between the timing at which the imaging device captures the first image and the timing at which it captures the second image.
The prediction unit may further determine, based on the drive information, the amount of change in the orientation of the imaging device between the timing at which the first image is captured and the timing at which the second image is captured, and predict the position of the reference region of the second image based on the amount of movement and the amount of change.
The control unit may determine a first movement amount, which is the amount the imaging device should move, during the period after capturing the first image and before capturing the second image, until the position of the reference region of the second image is no longer on the first object; when the amount of movement of the imaging device is equal to or greater than the first movement amount, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image.
The recognition unit may further recognize a second object present in a third image captured before the second image by another imaging device that images an imaging range different from that of the imaging device. When the position of the reference region of the second image is not contained in the first image but is contained in the third image, and is on neither the first object nor the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image corresponding to the reference region of the second image.
When the position of the reference region of the second image is not contained in the first image but is contained in the third image, and is on the first object or the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
The prediction unit may determine, based on the drive information, the amount of movement of the imaging device between the timing at which the imaging device captures the first image and the timing at which it captures the second image, and predict the position of the reference region of the second image based on the amount of movement. The control unit may determine a first movement amount, which is the amount the imaging device should move after capturing the first image until the position of the reference region of the second image is no longer on the first object, and a second movement amount, which is the amount the imaging device should move after capturing the first image until the position of the reference region of the second image is on the second object; when the amount of movement of the imaging device is equal to or greater than the first movement amount and less than the second movement amount, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image.
When the amount of movement of the imaging device is less than the first movement amount, or is equal to or greater than the second movement amount, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
When the position of the reference region of the second image is contained in the third image and is on neither the first object nor the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image and on the difference between the characteristics of images captured by the imaging device and the characteristics of images captured by the other imaging device.
The imaging range of the other imaging device may be larger than the imaging range of the imaging device.
When, before the imaging device captures the second image, other drive information for changing the position or orientation of the imaging device that differs from the drive information is detected, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image even when the position of the reference region of the second image is contained in the first image and is not on the first object.
The recognition unit may further recognize a second object present in the first image. When the position of the reference region of the second image is contained in the first image and is on neither the first object nor the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image. When the position of the reference region of the second image is on the first object or the second object, the control unit may control the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
An imaging device according to one aspect of the present invention may include the above control device. The imaging device may capture images with the exposure controlled by the control unit.
An imaging system according to one aspect of the present invention may include the above imaging device. The imaging system may include a support mechanism that supports the imaging device such that the orientation of the imaging device can be changed.
The imaging system may include another imaging device that images an imaging range different from that of the imaging device.
A movable body according to one aspect of the present invention includes the above imaging system and moves.
A control method according to one aspect of the present invention may include a stage of recognizing a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device. The control method may include a stage of predicting, based on drive information for changing the position or orientation of the imaging device, the position of the reference region of a second image captured after the first image. The control method may include a stage of, when the position of the reference region of the second image is contained in the first image and is not on the first object, controlling the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
A program according to one aspect of the present invention may cause a computer to execute a stage of recognizing a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device. The program may cause the computer to execute a stage of predicting, based on drive information for changing the position or orientation of the imaging device, the position of the reference region of a second image captured after the first image. The program may cause the computer to execute a stage of, when the position of the reference region of the second image is contained in the first image and is not on the first object, controlling the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
According to one aspect of the present invention, it is possible to prevent the exposure of an imaging device from becoming impossible to control appropriately because the brightness of the imaging area imaged by the imaging device changes.
The above summary does not enumerate all the features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 2 is a diagram showing an example of the functional blocks of the unmanned aerial vehicle.
FIG. 3A is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 3B is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 3C is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 3D is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 4A is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 4B is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 5 is a diagram for explaining the relationship between the reference region of an image and an object.
FIG. 6 is a flowchart showing an example of the exposure control procedure of the imaging device.
FIG. 7 is a flowchart showing another example of the exposure control procedure of the imaging device.
FIG. 8 is a flowchart showing an example of the procedure for deriving an exposure control value.
FIG. 9 is a diagram showing an example of a hardware configuration.
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of the features described in the embodiments are essential to the solutions of the invention. It will be apparent to those skilled in the art that various changes and improvements can be made to the following embodiments, and it is apparent from the claims that forms incorporating such changes or improvements can be included in the technical scope of the present invention.
The claims, the description, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner will not object to anyone reproducing these documents as they appear in the files or records of the Patent Office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed, or (2) a "unit" of a device having the role of performing the operation. The identified stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits, which may include memory elements such as logic AND, logic OR, logic XOR, logic NAND, logic NOR and other logic operations, flip-flops, registers, field-programmable gate arrays (FPGAs), programmable logic arrays (PLAs), and the like.
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored therein constitutes a product including executable instructions that form means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages, which may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk, JAVA (registered trademark) and C++, the "C" programming language, or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to form means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a movable body propelled by a propulsion unit. In addition to UAVs, the concept of a movable body includes flying objects such as other aircraft moving in the air, vehicles moving on the ground, ships moving on water, and the like.
The UAV body 20 includes a plurality of rotors, which are an example of the propulsion unit. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the rotors, for example using four rotors. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is a camera that images a subject contained in a desired imaging range. The gimbal 50 supports the imaging device 100 such that the attitude of the imaging device 100 can be changed, that is, it supports the imaging device 100 rotatably. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 supports the imaging device 100 such that an actuator can rotate it about the pitch axis, and further supports it such that actuators can also rotate it about the roll axis and the yaw axis, respectively. By rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis, the gimbal 50 can change the attitude of the imaging device 100.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the nose, i.e. the front, of the UAV 10, and another two on its underside. The two front imaging devices 60 may be paired and function as a so-called stereo camera; the two underside imaging devices 60 may likewise be paired and function as a stereo camera. Three-dimensional spatial data of the surroundings of the UAV 10 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 provided on the UAV 10 is not limited to four; it suffices that the UAV 10 has at least one imaging device 60. The UAV 10 may have at least one imaging device 60 on each of its nose, tail, sides, underside, and top. The angle of view settable on the imaging devices 60 may be larger than that settable on the imaging device 100; that is, the imaging range of the imaging devices 60 may be larger than that of the imaging device 100. The imaging devices 60 may have fixed-focus lenses or fisheye lenses.
The remote operation device 300 communicates with the UAV 10 to operate it remotely, and may do so wirelessly. The remote operation device 300 sends the UAV 10 drive information indicating various drive commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The drive information includes, for example, drive information that raises the altitude of the UAV 10, and may indicate the altitude at which the UAV 10 should be. The UAV 10 moves to the altitude indicated by the drive information received from the remote operation device 300.
FIG. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a gimbal 50, imaging devices 60, and an imaging device 100.
The communication interface 34 communicates with other devices such as the remote operation device 300, and may receive instruction information containing various commands from the remote operation device 300 to the UAV control unit 30. The memory 32 stores programs and the like that the UAV control unit 30 needs to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 32 may be provided inside the UAV body 20, and may be provided detachably from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 according to the programs stored in the memory 32. The UAV control unit 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 according to commands received from the remote operation device 300 via the communication interface 34. The propulsion unit 40 propels the UAV 10; it has a plurality of rotors and a plurality of drive motors that rotate them. In response to drive commands from the UAV control unit 30, the propulsion unit 40 uses the drive motors to rotate the rotors and make the UAV 10 fly.
The UAV control unit 30 may analyze the plurality of images captured by the plurality of sensing imaging devices 60 to recognize the environment around the UAV 10, and control the flight according to that environment, for example avoiding obstacles. The UAV control unit 30 may generate three-dimensional spatial data of the surroundings of the UAV 10 from the images captured by the plurality of imaging devices 60, and control the flight based on this three-dimensional spatial data.
The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites, and calculates from the received signals the position of the GPS receiver 41, i.e. the position of the UAV 10. The IMU 42 detects the attitude of the UAV 10: the accelerations in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10 by detecting the air pressure around the UAV 10 and converting the detected pressure into an altitude.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 has an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be composed of a CCD or CMOS, and outputs to the imaging control unit 110 the image data of the optical image formed through the plurality of lenses 210. The imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like, and may control the imaging device 100 according to operation commands for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores programs and the like that the imaging control unit 110 needs to control the image sensor 120 and so on. The memory 130 may be provided inside the housing of the imaging device 100, and may be provided detachably from the housing of the imaging device 100.
The lens unit 200 has a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the lenses 210 are arranged movably along the optical axis. The lens unit 200 may be an interchangeable lens provided detachably from the imaging unit 102. The lens moving mechanism 212 moves at least some or all of the lenses 210 along the optical axis. The lens control unit 220 drives the lens moving mechanism 212 in response to lens control commands from the imaging unit 102, such as zoom control commands and focus control commands, to move one or more lenses 210 along the optical axis.
The imaging device 100 configured in this way controls its exposure based on the image data of a reference region predetermined for the imaging range of the imaging device 100. The imaging device 100 may derive an evaluation value of the brightness of the reference region in an image, and derive an exposure control value (EV value) from the brightness evaluation value. Based on the exposure control value, the imaging device 100 may control its exposure by controlling the aperture, the shutter speed, the output gain of the image sensor 120, and the like.
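A minimal sketch of this metering chain follows, assuming an 8-bit grayscale image, a mid-gray target of 118, and a simple policy that spends the EV adjustment on shutter speed before sensor gain; none of these choices is specified in the present disclosure.

```python
import math
import numpy as np

def brightness_evaluation(image, region):
    """Mean luminance of the reference region (x, y, w, h) of a grayscale image."""
    x, y, w, h = region
    return float(np.mean(image[y:y + h, x:x + w]))

def exposure_control_value(evaluation, target=118.0):
    """EV offset that steers the measured luminance toward a mid-gray target."""
    return math.log2(target / max(evaluation, 1e-6))

def apply_exposure_control(ev_offset, shutter_s, iso, max_shutter_s=1.0 / 60.0):
    """Split the EV adjustment between shutter speed and sensor gain;
    the split policy and the motion-blur limit are illustrative only."""
    shutter = shutter_s * (2.0 ** ev_offset)
    if shutter <= max_shutter_s:
        return shutter, iso
    # Clamp the shutter and put the remaining adjustment into gain.
    remaining_ev = ev_offset - math.log2(max_shutter_s / shutter_s)
    return max_shutter_s, iso * (2.0 ** remaining_ev)
```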
The reference region may be a region of interest (ROI) predetermined for the imaging range of the imaging device 100 in order to control the exposure of the imaging device 100. The position of the reference region may be the central portion of the imaging range, may be predetermined for each shooting mode of the imaging device 100, or may be set at any position within the imaging range according to a user's instruction. The shape and size of the reference region may be changed according to the shooting mode or a user's instruction. The reference region may be divided into a plurality of regions, with the divided regions scattered throughout the imaging range of the imaging device 100.
The imaging device 100 may, for example, derive the exposure control value to be used for the next captured image from the luminance of the reference region in the current image, and capture the next image with the derived exposure control value. The imaging device 100 may capture images sequentially at a predetermined frame rate, deriving from the image data of the reference region of the current frame (image) the exposure control value for capturing the next frame.
The imaging range of an imaging device 100 mounted on a movable body such as the UAV 10 changes with the movement of the UAV 10 before the next image is captured. The imaging range of an imaging device 100 supported by the gimbal 50 changes as the gimbal 50 is driven before the next image is captured. Because of such changes, the brightness of the imaging range changes, and the exposure of the imaging device 100 for capturing the next image sometimes cannot be controlled appropriately.
For example, as shown in FIG. 3A, an image 501 captured by the imaging device 100 contains an object 400, and the reference region 511 is on the object 400. In the image 502 captured by the imaging device 100 after the image 501, however, the reference region 512 is not on the object 400. Here, when the brightness of the object 400 differs greatly from the brightness of its background, the imaging device 100 sometimes cannot appropriately control the exposure for capturing the image 502 from the image data of the reference region 511 of the image 501; if the imaging device 100 captures the image 502 anyway, overexposure or underexposure occurs. For example, when the UAV 10 is flying while the imaging device 100 images a scene with a high-rise building as the object against a background of sky, overexposure can occur once the building falls outside the reference region of the image.
Therefore, in the imaging device 100 according to the present embodiment, as shown in FIG. 3B, the imaging device 100 predicts the position of the reference region 512 of the image 502 to be captured next. When the position of the reference region 512 of the image 502 is contained in the previously captured image 501 and the reference region 512 is not on the object 400, the imaging device 100 controls the exposure for capturing the image 502 based on the image data of the image region 521 in the image 501 corresponding to the reference region 512 of the image 502. As a result, the exposure of the imaging device 100 can be controlled appropriately even when the brightness of its imaging range changes.
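A sketch of this mapping follows; the (x, y, w, h) region format and the sign convention for the on-image shift are assumptions for illustration, not part of the present disclosure.

```python
def corresponding_region(next_reference_region, shift_px):
    """Map the next image's reference region back into the current image
    (region 521 in FIG. 3B). shift_px = (dx, dy) is the predicted apparent
    displacement of the scene on the image between the two frames."""
    x, y, w, h = next_reference_region
    dx, dy = shift_px
    return (x + dx, y + dy, w, h)

def contained_in(region, image_shape):
    """Whether the mapped region still lies inside the current image."""
    x, y, w, h = region
    rows, cols = image_shape[:2]
    return x >= 0 and y >= 0 and x + w <= cols and y + h <= rows
```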
On the other hand, as shown in FIG. 3C, when the position of the reference region 512 of the image 502 is contained in the image 501 and the reference region 512 is on an object 401 located in the reference region 511 of the image 501, the imaging device 100 controls the exposure for capturing the image 502 based on the image data of the reference region 511 of the image 501. When the object 401 located in the reference region 511 of the image 501 is also located in the reference region 512 of the image 502, controlling the exposure from the image data of the reference region 511 of the image 501 does not cause overexposure or underexposure when the imaging device 100 captures the image 502. In this case, therefore, the imaging device 100 need not perform the process of moving the reference region in order to control the exposure, which prevents an unnecessary increase in the processing load.
Also, as shown in FIG. 3D, when a plurality of objects 402 and 403 exist in the image 501, even if the object 402 is not present in the reference region 512 of the image 502, the object 403 may be located in the reference region 512 of the image 502. In this case, as in the case of FIG. 3C, the imaging device 100 controls the exposure for capturing the image 502 based on the image data of the reference region 511 of the image 501. When another object exists in the reference region 512, the brightness changes less than when no other object exists. Therefore, even when the imaging device 100 controls the exposure from the image data of the reference region 511 of the image 501, the possibility of overexposure or underexposure is reduced. For example, when the imaging device 100 images a group of high-rise buildings, even if the building present in the reference region of the current image is not contained in the reference region of the next image, as long as another building is contained there, controlling the exposure from the image data of the predetermined reference region of the current image still reduces the possibility of overexposure or underexposure. In this case, too, the imaging device 100 need not perform the process of moving the reference region in order to control the exposure, which prevents an unnecessary increase in the processing load.
As described above, in order to control the exposure more appropriately, the imaging control unit 110 has a recognition unit 112, a prediction unit 114, and an exposure control unit 116. The exposure control unit 116 is an example of a control unit.
As shown in FIG. 4A, the recognition unit 112 recognizes, for example, an object 701 located, in an image 601 captured by the imaging device 100, in a reference region 611 predetermined for the imaging range of the imaging device 100. The recognition unit 112 may recognize as an object any physical object located within a predetermined distance from the imaging device 100.
The prediction unit 114 predicts the position of the reference region 612 of an image 602 captured after the image 601, based on drive information for changing the position or orientation of the imaging device 100. The prediction unit 114 may determine, from the drive information, the movement amount D of the imaging device between the timing at which the imaging device 100 captures the image 601 and the timing at which it captures the image 602, and predict the position of the reference region 612 of the image 602 from the movement amount D.
The prediction unit 114 may determine the speed of the imaging device 100 from the drive information, and determine the movement amount D from the speed and the difference between the timing at which the imaging device 100 captures the image 601 and the timing at which it captures the image 602. The prediction unit 114 may determine the speed of the imaging device 100 from the drive information of the UAV 10 sent by the remote operation device 300. The prediction unit 114 may determine the movement amount D from the speed v of the imaging device 100 and the frame rate f (fps) of the imaging device 100, by calculating v × (1/f).
The prediction unit 114 may further determine, from the drive information, the amount of change H in the orientation of the imaging device 100 between the timing at which the image 601 is captured and the timing at which the image 602 is captured, and predict the position of the reference region 612 of the image 602 from the movement amount D and the change amount H. The prediction unit 114 may determine the change amount H of the orientation of the imaging device 100 from at least one of the drive information of the UAV 10 and the drive information of the gimbal 50 sent by the remote operation device 300.
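The following sketch restates this calculation; the calibration constants metres_per_pixel and pixels_per_degree, which project the movement amount D and the orientation change H onto the image, are assumptions introduced for illustration and are not given in the present disclosure.

```python
def movement_amount(speed_mps, frame_rate_fps):
    """D = v * (1 / f): distance moved between two consecutive frames."""
    return speed_mps * (1.0 / frame_rate_fps)

def on_image_shift(speed_mps, frame_rate_fps, metres_per_pixel,
                   yaw_rate_dps=0.0, pixels_per_degree=0.0):
    """Predicted shift of the reference region in pixels, combining the
    translation D with the orientation change H between the two frames."""
    d_px = movement_amount(speed_mps, frame_rate_fps) / metres_per_pixel
    h_px = (yaw_rate_dps / frame_rate_fps) * pixels_per_degree
    return d_px + h_px
```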
As shown in FIG. 4A, when the position of the reference region 612 of the image 602 is contained in the image 601 and is not on the object 701, the exposure control unit 116 controls the exposure of the imaging device 100 for capturing the image 602 based on the image data of the image region 621 in the image 601 corresponding to the reference region 612 of the image 602. As shown in FIG. 4B, when the position of the reference region 612 of the image 602 is on the object 701, the exposure control unit 116 controls the exposure of the imaging device 100 for capturing the image 602 based on the image data of the reference region 611 of the image 601.
The exposure control unit 116 may determine a standard movement amount d0, which is the amount the imaging device 100 should move, during the period after capturing the image 601 and before capturing the image 602, until the position of the reference region 612 of the image 602 is no longer on the object 701. The standard movement amount d0 is an example of a first movement amount. When the movement amount D of the imaging device 100 is equal to or greater than the standard movement amount d0, the exposure of the imaging device 100 for capturing the image 602 may be controlled based on the image data of the image region 621 in the image 601. The exposure control unit 116 may determine as the standard movement amount d0 the distance from the end of the reference region 611 of the image 601 on the moving-direction side of the imaging device 100 to the end of the object 701 on the moving-direction side. The exposure control unit 116 may determine as the standard movement amount d0 the distance from the end of the reference region 611 on the moving-direction side to the moving-direction-side end of the object 701 farthest from that end. The exposure control unit 116 may determine as the standard movement amount d0 the distance from the end of the reference region 611 opposite to its moving-direction end to the end of the object 701 on the moving-direction side.
When at least part of the reference region 612 of the image 602 does not contain the object 701, the exposure control unit 116 may judge that the position of the reference region 612 of the image 602 is not on the object 701. When the reference region 612 of the image 602 is completely occupied by the object 701, the exposure control unit 116 may judge that the position of the reference region 612 of the image 602 is on the object 701. When the object 701 contained in the reference region 611 of the image 601 occupies no more than a predetermined ratio W of the reference region 612 of the image 602, the exposure control unit 116 may judge that the position of the reference region 612 is not on the object 701; when it occupies more than the predetermined ratio W, the exposure control unit 116 may judge that the position of the reference region 612 is on the object 701.
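A sketch of this occupancy test follows, assuming a per-pixel boolean object mask; the value W = 0.5 is illustrative only, as the present disclosure refers simply to a predetermined ratio W.

```python
import numpy as np

def region_on_object(object_mask, region, ratio_w=0.5):
    """Judge whether the object occupies more than the ratio W of the
    region (x, y, w, h); object_mask is a boolean per-pixel mask."""
    x, y, w, h = region
    patch = object_mask[y:y + h, x:x + w]
    return float(np.mean(patch)) > ratio_w  # fraction of object pixels
```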
Here, for example when the imaging device 100 moves fast, i.e. when the UAV 10 moves fast, the reference region of the next captured image is sometimes outside the current imaging range of the imaging device 100. In that case, it cannot be judged from the image captured by the imaging device 100 what kind of object lies in the reference region of the next image. In this case, therefore, the imaging device 100 may control the exposure without moving the reference region.
On the other hand, the UAV 10 includes, in addition to the imaging device 100, imaging devices 60 that image imaging ranges different from that of the imaging device 100. The imaging devices 60 function as sensing cameras for detecting obstacles around the UAV 10. The recognition unit 112 can use the images of the imaging devices 60 to recognize objects outside the imaging range of the imaging device 100.
For example, as shown in FIG. 5, an image 800 captured by the imaging device 60 contains an object 701 and an object 702, while the image 601 captured by the imaging device 100 contains the object 701 but not the object 702. Furthermore, the image 601 contains no image region corresponding to the reference region 612 of the image 602 captured by the imaging device 100 after the image 601, whereas the image 800 contains an image region 821 corresponding to the reference region 612 of the image 602. In this case, the imaging device 100 may control its exposure based on the image data of the image region 821 in the image 800 captured by the imaging device 60.
Therefore, the recognition unit 112 may further recognize the object 702 present in the image 800 captured, before the image 602, by the imaging device 60, which images an imaging range different from that of the imaging device 100. When the position of the reference region 612 of the image 602 is not contained in the image 601 but is contained in the image 800, and is on neither the object 701 nor the object 702, the exposure control unit 116 may control the exposure of the imaging device 100 for capturing the image 602 based on the image data of the image region 821 in the image 800 corresponding to the reference region 612 of the image 602. When the position of the reference region 612 of the image 602 is not contained in the image 601 but is contained in the image 800, and is on the object 701 or the object 702, the exposure control unit 116 may control the exposure of the imaging device 100 for capturing the image 602 based on the image data of the reference region 611 of the image 601.
Here, the characteristics of the image sensor 120, the lenses 210, and so on of the imaging device 100 may differ from those of the image sensor and lens of the imaging device 60, in which case the characteristics of the images captured by the imaging device 100 may differ from those of the images captured by the imaging device 60. Therefore, when controlling the exposure of the imaging device 100 from an image captured by the imaging device 60, correction is preferably performed. When the position of the reference region 612 of the image 602 is contained in the image 800 and is on neither the object 701 nor the object 702, the exposure control unit 116 may control the exposure of the imaging device 100 for capturing the image 602 based on the image data of the image region 821 in the image 800 and on the difference between the characteristics of images captured by the imaging device 100 and those of images captured by the imaging device 60. For example, the exposure control unit 116 may interpolate the luminance of the image region 821 in the image 800 using a predetermined interpolation coefficient, derive a brightness evaluation value of the image region 821 from the interpolated luminance, and derive the exposure control value of the imaging device 100 from the derived brightness evaluation value. The interpolation coefficient may be determined from the difference between the characteristics of images captured by the imaging device 100 and those of images captured by the imaging device 60; the imaging device 100 and the imaging device 60 may image the same subject, and the interpolation coefficient may be predetermined by comparing the captured images with each other.
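A sketch of this correction follows, assuming a simple multiplicative mapping between the two cameras' luminance scales; the present disclosure does not specify the functional form of the interpolation.

```python
import numpy as np

def interpolation_coefficient(main_patch, sensing_patch):
    """Calibrate the coefficient k by imaging the same subject with both
    cameras and comparing the mean luminances of the two patches."""
    return float(np.mean(main_patch)) / float(np.mean(sensing_patch))

def corrected_brightness_evaluation(sensing_image, region, k):
    """Brightness evaluation of region 821 in the sensing image, mapped
    into the main camera's luminance scale by the coefficient k."""
    x, y, w, h = region
    return k * float(np.mean(sensing_image[y:y + h, x:x + w]))
```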
The prediction unit 114 may determine, from the drive information, the movement amount D of the imaging device 100 between the timing at which the imaging device 100 captures the image 601 and the timing at which it captures the image 602, and predict the position of the reference region 612 of the image 602 from the movement amount D. The exposure control unit 116 may determine a first standard movement amount d0 and a second standard movement amount d1: d0 is the amount the imaging device 100 should move after capturing the image 601 until the position of the reference region 612 of the image 602 is no longer on the object 701, and d1 is the amount the imaging device 100 should move after capturing the image 601 until the position of the reference region 612 of the image 602 is on the object 702. When the movement amount D of the imaging device 100 is equal to or greater than the first standard movement amount d0 and less than the second standard movement amount d1, the exposure control unit 116 may control the exposure of the imaging device 100 for capturing the image 602 based on the image data of the image region 821 in the image 800. The first standard movement amount d0 is an example of a first movement amount; the second standard movement amount d1 is an example of a second movement amount.
When the movement amount D of the imaging device 100 is less than the first standard movement amount d0, or is equal to or greater than the second standard movement amount d1, the exposure control unit 116 may control the exposure of the imaging device 100 for capturing the image 602 based on the image data of the reference region 611 of the image 601.
The prediction of the position of the reference region of the next image by the prediction unit 114 is performed, before the next image is captured, from the drive information used to control the UAV 10 or the gimbal 50. Here, after a prediction is made from the drive information, the UAV 10 or the gimbal 50 is sometimes further controlled by additionally received drive information. In that case, the position of the reference region predicted by the prediction unit 114 may be inaccurate. Therefore, for example when, before the imaging device 100 captures the image 602, other drive information for changing the position or orientation of the imaging device 100 that differs from the previous drive information is detected, the exposure control unit 116 may control the exposure of the imaging device for capturing the image 602 based on the image data of the reference region 611 of the image 601 even when the position of the reference region 612 of the image 602 is contained in the image 601 and is not on the object 701.
FIG. 6 is a flowchart showing an example of the exposure control procedure of the imaging device 100 executed by the imaging control unit 110.
The recognition unit 112 judges whether an object is present in the reference region of the current image captured by the imaging device 100 (S100). The recognition unit 112 may judge whether an object is present in the reference region of the current image according to whether there is a physical object within a predetermined distance from the imaging device 100 inside the reference region. If there is no object in the reference region, the exposure control unit 116 derives the exposure control value of the imaging device 100 from the brightness evaluation value of the reference region in the current image (S114). Then, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image (S116).
When an object is present in the reference region of the current image, the prediction unit 114 judges whether the UAV control unit 30 has received a drive command for the UAV 10 or the gimbal 50 (S102). If no drive command has been received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image. When a drive command indicating that the UAV 10 should hover is received, it is likewise judged that the UAV 10 is not moving, and the exposure control unit 116 may apply the exposure control value derived for the reference region in the current image to the next imaging and capture the next image.
When the UAV control unit 30 receives a drive command, the prediction unit 114 determines, from the drive command, the time until the next imaging based on the speed of the UAV 10 and the frame rate of the imaging device 100, and predicts the position of the reference region of the next image from the speed and the time (S104).
Then, the exposure control unit 116 judges whether the position of the reference region of the next image is on the object located in the reference region of the current image (S106). When the position of the reference region of the next image is on the object located in the reference region of the current image, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
When the position of the reference region of the next image is not on the object located in the reference region of the current image, the exposure control unit 116 derives the exposure control value from the brightness evaluation value of the image region in the current image corresponding to the reference region of the next image (S108). Then, the exposure control unit 116 judges whether the UAV control unit 30 receives an additional drive command for the UAV 10 or the gimbal 50 before the next shot (S110). If an additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image. In S110, the UAV control unit 30 may be driven, for example, after waiting a prescribed time of about one second. This accommodates an input that moves the UAV 10 in a direction different from the initial moving direction; in that case, the reference region 611 remains within the object 701 and the exposure does not change.
On the other hand, if no additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the image region in the current image in step S108 to the next imaging and captures the next image (S112).
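The branch structure of FIG. 6 can be summarized as the following sketch, where each boolean parameter stands for one of the judgments described above; the parameter names and returned labels are illustrative only.

```python
def figure6_decision(object_in_reference_region, drive_command_received,
                     hovering, next_region_on_object,
                     additional_command_before_next_shot):
    """FIG. 6 (S100-S116), reduced to the question of which image data
    the next frame's exposure control value is derived from."""
    if not object_in_reference_region:                 # S100 -> S114/S116
        return "reference region of current image"
    if not drive_command_received or hovering:         # S102
        return "reference region of current image"
    # S104: the prediction unit predicts the next reference region here.
    if next_region_on_object:                          # S106
        return "reference region of current image"
    if additional_command_before_next_shot:            # S110
        return "reference region of current image"
    return "corresponding region in current image"     # S108/S112
```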
As described above, the imaging device 100 predicts the position of the reference region of the next image, and if the object located in the reference region of the current image is not at the predicted position of the reference region, controls the exposure for capturing the next image based on the image data of the image region in the current image corresponding to the predicted reference region of the next image. As a result, even when driving the UAV 10 or the gimbal 50 makes the imaging range of the imaging device 100 brighter before the next image is captured, the exposure control of the imaging device 100 can be prevented from becoming inappropriate.
FIG. 7 is a flowchart showing another example of the exposure control procedure of the imaging device 100 executed by the imaging control unit 110.
The recognition unit 112 judges, from the images captured by the sensing imaging devices 60, whether an object exists (S200). If no object exists, the exposure control unit 116 derives the exposure control value of the imaging device 100 from the brightness evaluation value of the reference region in the current image (S224). Then, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image (S226).
When an object exists, the recognition unit 112 judges whether an object is present in the reference region of the current image captured by the imaging device 100 (S202). If no object is present in the reference region of the current image, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
If an object is present in the reference region of the current image, the prediction unit 114 judges whether the UAV control unit 30 has received a drive command for the UAV 10 or the gimbal 50 (S204). If no drive command has been received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
If the UAV control unit 30 receives a drive command, the prediction unit 114 determines, from the drive command, the time until the next imaging based on the speed of the UAV 10 and the frame rate of the imaging device 100, and predicts the position of the reference region of the next image from the speed and the time (S206).
The exposure control unit 116 derives the distance d0 from the end of the reference region of the current image on the moving-direction side of the UAV 10 to the moving-direction-side end of the object in the reference region, and the distance D that the imaging device 100 moves before capturing the next image (S208). The exposure control unit 116 judges whether the distance D is equal to or smaller than the distance d0 (S210). If the distance D is equal to or smaller than the distance d0, it is judged that an object is present in the reference region of the next image, and the exposure control unit 116 therefore applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
When the distance D is greater than the distance d0, the exposure control unit 116 judges whether another object exists (S212). The exposure control unit 116 may judge whether another object exists from the detection results of the objects recognized by the recognition unit 112 from the images of the imaging devices 60. That is, in addition to judging whether an object exists within the imaging range of the imaging device 100, the exposure control unit 116 also judges whether an object exists outside the imaging range.
When no other object exists, the exposure control unit 116 derives the exposure control value from the brightness evaluation value of the image region, corresponding to the reference region of the next image, in the current image captured by the imaging device 100 or in the image captured by the imaging device 60 (S218).
More specifically, as shown in the flowchart of FIG. 8, the exposure control unit 116 judges whether the reference region of the next image exists within the current image of the imaging device 100 (S300). When the reference region of the next image exists within the current image of the imaging device 100, the exposure control unit 116 derives the exposure control value from the brightness evaluation value of the image region in the current image of the imaging device 100 corresponding to the reference region of the next image (S302).
When the reference region of the next image does not exist within the current image of the imaging device 100, the exposure control unit 116 determines, from the image captured by the sensing imaging device 60, the image region corresponding to the reference region of the next image (S304). The exposure control unit 116 derives the exposure control value from the brightness evaluation value of the determined image region in the image captured by the imaging device 60 (S306).
After deriving the exposure control value, the exposure control unit 116 judges whether the UAV control unit 30 receives an additional drive command for the UAV 10 or the gimbal 50 before the next shot (S220). If an additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
On the other hand, if no additional drive command is received, the exposure control unit 116 applies the exposure control value derived in step S218 to the next imaging and captures the next image (S222).
When the judgment result of step S212 is that another object exists, the exposure control unit 116 derives the distance d1 from the end of the reference region of the current image on the moving-direction side of the UAV 10 to the end of the other object opposite to its moving-direction end (S214). The exposure control unit 116 judges whether the distance D is equal to or greater than the distance d0 and equal to or smaller than the distance d1. When the distance D is greater than the distance d1, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
If the distance D is equal to or greater than the distance d0 and equal to or smaller than the distance d1, the exposure control unit 116 derives the exposure control value from the brightness evaluation value of the image region, corresponding to the reference region of the next image, in the current image captured by the imaging device 100 or in the image captured by the imaging device 60 (S218). After deriving the exposure control value, the exposure control unit 116 judges whether the UAV control unit 30 receives an additional drive command for the UAV 10 or the gimbal 50 before the next shot (S220). If an additional drive command is received, the exposure control unit 116 applies the exposure control value derived for the reference region in the current image to the next imaging and captures the next image.
On the other hand, if no additional drive command is received, the exposure control unit 116 applies the exposure control value derived in step S218 to the next imaging and captures the next image (S222).
As described above, according to the imaging device 100 of the present embodiment, if no object present in the current image captured by the imaging device 100 or the imaging device 60 is in the reference region of the image to be captured next, the exposure of the imaging device 100 when capturing the next image is controlled based on the image data of the image region corresponding to the reference region of the next captured image. As a result, even when driving the UAV 10 or the gimbal 50 changes the brightness of the imaging range of the imaging device 100 before the next image is captured, the exposure control of the imaging device 100 can be prevented from becoming inappropriate.
FIG. 9 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as the operations associated with a device according to an embodiment of the present invention or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to execute that operation or those one or more "units". The program can cause the computer 1200 to execute a process according to an embodiment of the present invention or the stages of that process. Such a program may be executed by the CPU 1212 so that the computer 1200 performs the specified operations associated with some or all of the blocks of the flowcharts and block diagrams described in this specification.
The computer 1200 in the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or programs dependent on the hardware of the computer 1200. The program can be provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network. The program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the above-mentioned various types of hardware resources. A device or method may be constituted by realizing the operation or processing of information through the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded into the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, sends the read transmission data to the network, or writes reception data received via the network into a reception buffer or the like provided in the recording medium.
Furthermore, the CPU 1212 may read all or a necessary part of a file or database stored in an external recording medium such as a USB memory into the RAM 1214, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. The CPU 1212 may perform on the data read from the RAM 1214 the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the results back into the RAM 1214. The CPU 1212 may also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having the attribute value of a first attribute associated with the attribute value of a second attribute is stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies a predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, so that the programs can be supplied to the computer 1200 via the network.
It should be noted that the order of execution of operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the description, and the drawings may be realized in any order unless explicitly indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used in a subsequent process. Even if operational flows in the claims, the description, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be performed in this order.
Description of reference numerals
10 UAV
20 UAV body
30 UAV control unit
32 Memory
34 Communication interface
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Imaging control unit
112 Recognition unit
114 Prediction unit
116 Exposure control unit
120 Image sensor
130 Memory
200 Lens unit
210 Lens
212 Lens moving mechanism
220 Lens control unit
300 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM
Claims (20)
- A control device, comprising: a recognition unit that recognizes a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device; a prediction unit that predicts, based on drive information for changing the position or orientation of the imaging device, the position of the reference region in a second image captured after the first image; and a control unit that, when the position of the reference region of the second image is contained in the first image and is not on the first object, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
- The control device according to claim 1, wherein, when the position of the reference region of the second image is on the first object, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- The control device according to claim 1, wherein the prediction unit determines, based on the drive information, the amount of movement of the imaging device between the timing at which the imaging device captures the first image and the timing at which the imaging device captures the second image, and predicts the position of the reference region of the second image based on the amount of movement.
- The control device according to claim 3, wherein the prediction unit determines the speed of the imaging device based on the drive information, and determines the amount of movement based on the speed and the difference between the timing at which the imaging device captures the first image and the timing at which the imaging device captures the second image.
- The control device according to claim 3, wherein the prediction unit further determines, based on the drive information, the amount of change in the orientation of the imaging device between the timing at which the first image is captured and the timing at which the second image is captured, and predicts the position of the reference region of the second image based on the amount of movement and the amount of change.
- The control device according to claim 3, wherein the control unit determines a first movement amount, which is the amount the imaging device should move, during the period after capturing the first image and before capturing the second image, until the position of the reference region of the second image is no longer on the first object, and, when the amount of movement of the imaging device is equal to or greater than the first movement amount, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image.
- The control device according to claim 1, wherein the recognition unit further recognizes a second object present in a third image captured before the second image by another imaging device that images an imaging range different from that of the imaging device, and the control unit, when the position of the reference region of the second image is not contained in the first image but is contained in the third image and is on neither the first object nor the second object, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image corresponding to the reference region of the second image.
- The control device according to claim 7, wherein, when the position of the reference region of the second image is not contained in the first image but is contained in the third image and is on the first object or the second object, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- The control device according to claim 7, wherein the prediction unit determines, based on the drive information, the amount of movement of the imaging device between the timing at which the imaging device captures the first image and the timing at which the imaging device captures the second image, and predicts the position of the reference region of the second image based on the amount of movement, and the control unit determines a first movement amount, which is the amount the imaging device should move after capturing the first image until the position of the reference region of the second image is no longer on the first object, and a second movement amount, which is the amount the imaging device should move after capturing the first image until the position of the reference region of the second image is on the second object, and, when the amount of movement of the imaging device is equal to or greater than the first movement amount and less than the second movement amount, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image.
- The control device according to claim 9, wherein, when the amount of movement of the imaging device is less than the first movement amount or is equal to or greater than the second movement amount, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- The control device according to claim 7, wherein, when the position of the reference region of the second image is contained in the third image and is on neither the first object nor the second object, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the third image and on the difference between the characteristics of images captured by the imaging device and the characteristics of images captured by the other imaging device.
- The control device according to claim 7, wherein the imaging range of the other imaging device is larger than the imaging range of the imaging device.
- The control device according to claim 1, wherein, when, before the imaging device captures the second image, other drive information for changing the position or orientation of the imaging device that differs from the drive information is detected, the control unit controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image even when the position of the reference region of the second image is contained in the first image and is not on the first object.
- The control device according to claim 1, wherein the recognition unit further recognizes a second object present in the first image, and the control unit, when the position of the reference region of the second image is contained in the first image and is on neither the first object nor the second object, controls the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image, and, when the position of the reference region of the second image is on the first object or the second object, controls the exposure of the imaging device when capturing the second image based on the image data of the reference region of the first image.
- An imaging device comprising the control device according to any one of claims 1 to 14, and capturing images with the exposure controlled by the control unit.
- An imaging system comprising: the imaging device according to claim 15; and a support mechanism that supports the imaging device such that the orientation of the imaging device can be changed.
- The imaging system according to claim 16, further comprising another imaging device that images an imaging range different from that of the imaging device.
- A movable body that includes the imaging system according to claim 17 and moves.
- A control method comprising the stages of: recognizing a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device; predicting, based on drive information for changing the position or orientation of the imaging device, the position of the reference region of a second image captured after the first image; and, when the position of the reference region of the second image is contained in the first image and is not on the first object, controlling the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
- A program for causing a computer to execute the stages of: recognizing a first object present, in a first image captured by an imaging device, in a reference region predetermined for the imaging range of the imaging device; predicting, based on drive information for changing the position or orientation of the imaging device, the position of the reference region of a second image captured after the first image; and, when the position of the reference region of the second image is contained in the first image and is not on the first object, controlling the exposure of the imaging device when capturing the second image based on the image data of the image region in the first image corresponding to the reference region of the second image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780064276.8A CN109863741B (zh) | 2017-05-24 | 2017-12-06 | Control device, imaging device, imaging system, movable body, control method, and computer-readable storage medium
US16/685,772 US20200092455A1 (en) | 2017-05-24 | 2019-11-15 | Control device, photographing device, photographing system, and movable object |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017102646A JP6384000B1 (ja) | 2017-05-24 | 2017-05-24 | Control device, imaging device, imaging system, movable body, control method, and program
JP2017-102646 | 2017-05-24 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/685,772 Continuation US20200092455A1 (en) | 2017-05-24 | 2019-11-15 | Control device, photographing device, photographing system, and movable object |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018214465A1 true WO2018214465A1 (zh) | 2018-11-29 |
Family
ID=63444175
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/114806 WO2018214465A1 (zh) | 2017-05-24 | 2017-12-06 | Control device, imaging device, imaging system, movable body, control method, and program
Country Status (4)
Country | Link |
---|---|
US (1) | US20200092455A1 (zh) |
JP (1) | JP6384000B1 (zh) |
CN (1) | CN109863741B (zh) |
WO (1) | WO2018214465A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112313943A (zh) * | 2019-08-20 | 2021-02-02 | SZ DJI Technology Co., Ltd. | Device, imaging device, movable body, method, and program |
WO2021031840A1 (zh) * | 2019-08-20 | 2021-02-25 | SZ DJI Technology Co., Ltd. | Device, imaging device, movable body, method, and program |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110213494B (zh) * | 2019-07-03 | 2021-05-11 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Photographing method and device, electronic apparatus, and computer-readable storage medium |
DE112021004974T5 * | 2020-09-23 | 2023-07-27 | Sony Group Corporation | Information processing device, information processing method, and program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050253956A1 (en) * | 2004-04-30 | 2005-11-17 | Shuji Ono | Image capturing apparatus and an image capturing method |
JP2014066958A (ja) * | 2012-09-27 | 2014-04-17 | Xacti Corp | 撮像装置 |
CN104137529A (zh) * | 2012-02-13 | 2014-11-05 | 诺基亚公司 | 用于数字摄影的焦点、曝光和白平衡的增强型自动调节的方法和装置 |
CN106331518A (zh) * | 2016-09-30 | 2017-01-11 | 北京旷视科技有限公司 | 图像处理方法及装置和电子系统 |
CN106534709A (zh) * | 2015-09-10 | 2017-03-22 | 鹦鹉无人机股份有限公司 | 具有用天空图像分割来自动曝光控制的前视相机的无人机 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002314851A (ja) * | 2001-04-10 | 2002-10-25 | Nikon Corp | Imaging apparatus |
JP4865587B2 (ja) * | 2007-02-20 | 2012-02-01 | Canon Inc. | Installed-type imaging device |
JP2013187665A (ja) * | 2012-03-07 | 2013-09-19 | Nikon Corp | Imaging device |
JP5737306B2 (ja) * | 2013-01-23 | 2015-06-17 | Denso Corporation | Exposure control device |
-
2017
- 2017-05-24 JP JP2017102646A patent/JP6384000B1/ja not_active Expired - Fee Related
- 2017-12-06 WO PCT/CN2017/114806 patent/WO2018214465A1/zh active Application Filing
- 2017-12-06 CN CN201780064276.8A patent/CN109863741B/zh not_active Expired - Fee Related
-
2019
- 2019-11-15 US US16/685,772 patent/US20200092455A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050253956A1 (en) * | 2004-04-30 | 2005-11-17 | Shuji Ono | Image capturing apparatus and an image capturing method |
CN104137529A (zh) * | 2012-02-13 | 2014-11-05 | 诺基亚公司 | 用于数字摄影的焦点、曝光和白平衡的增强型自动调节的方法和装置 |
JP2014066958A (ja) * | 2012-09-27 | 2014-04-17 | Xacti Corp | 撮像装置 |
CN106534709A (zh) * | 2015-09-10 | 2017-03-22 | 鹦鹉无人机股份有限公司 | 具有用天空图像分割来自动曝光控制的前视相机的无人机 |
CN106331518A (zh) * | 2016-09-30 | 2017-01-11 | 北京旷视科技有限公司 | 图像处理方法及装置和电子系统 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112313943A (zh) * | 2019-08-20 | 2021-02-02 | SZ DJI Technology Co., Ltd. | Device, imaging device, movable body, method, and program |
WO2021031840A1 (zh) * | 2019-08-20 | 2021-02-25 | SZ DJI Technology Co., Ltd. | Device, imaging device, movable body, method, and program |
Also Published As
Publication number | Publication date |
---|---|
US20200092455A1 (en) | 2020-03-19 |
JP2018198393A (ja) | 2018-12-13 |
CN109863741A (zh) | 2019-06-07 |
JP6384000B1 (ja) | 2018-09-05 |
CN109863741B (zh) | 2020-12-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018214465A1 (zh) | Control device, imaging device, imaging system, movable body, control method, and program | |
US11070735B2 | Photographing device, photographing system, mobile body, control method and program | |
WO2019238044A1 (zh) | Determination device, movable body, determination method, and program | |
US20210014427A1 | Control device, imaging device, mobile object, control method and program | |
WO2020011230A1 (zh) | Control device, movable body, control method, and program | |
US20190258255A1 | Control device, imaging system, movable object, control method, and program | |
WO2019085771A1 (zh) | Control device, lens device, imaging device, flying body, and control method | |
US11066182B2 | Control apparatus, camera apparatus, flying object, control method and program | |
WO2019174343A1 (zh) | Moving object detection device, control device, movable body, moving object detection method, and program | |
WO2020098603A1 (zh) | Determination device, imaging device, imaging system, movable body, determination method, and program | |
JP6481228B1 (ja) | Determination device, control device, imaging system, flying body, determination method, and program | |
JP6790318B2 (ja) | Unmanned aerial vehicle, control method, and program | |
JP6515423B2 (ja) | Control device, movable body, control method, and program | |
JP6501091B1 (ja) | Control device, imaging device, movable body, control method, and program | |
WO2019061887A1 (zh) | Control device, imaging device, flying body, control method, and program | |
WO2019223614A1 (zh) | Control device, imaging device, movable body, control method, and program | |
WO2020020042A1 (zh) | Control device, movable body, control method, and program | |
WO2020011198A1 (zh) | Control device, movable body, control method, and program | |
JP6543879B2 (ja) | Unmanned aerial vehicle, determination method, and program | |
WO2018163300A1 (ja) | Control device, imaging device, imaging system, movable body, control method, and program | |
WO2020216057A1 (zh) | Control device, imaging device, movable body, control method, and program | |
JP6413170B1 (ja) | Determination device, imaging device, imaging system, movable body, determination method, and program | |
WO2019085794A1 (zh) | Control device, imaging device, flying body, control method, and program | |
JP6569157B1 (ja) | Control device, imaging device, movable body, control method, and program | |
WO2020063770A1 (zh) | Control device, imaging device, movable body, control method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17910545 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17910545 Country of ref document: EP Kind code of ref document: A1 |