WO2020259255A1 - Control device, imaging device, imaging system, mobile body, control method, and program


Info

Publication number
WO2020259255A1
WO2020259255A1 (PCT/CN2020/094642, CN2020094642W)
Authority
WO
WIPO (PCT)
Prior art keywords
value
transmittance
imaging device
polarizing filter
optical device
Prior art date
Application number
PCT/CN2020/094642
Other languages
English (en)
French (fr)
Inventor
本庄谦一
中川敬太
中辻达也
永山佳范
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080003326.3A priority Critical patent/CN112335229A/zh
Publication of WO2020259255A1 publication Critical patent/WO2020259255A1/zh

Classifications

    • G03B11/00 Filters or other obturators specially adapted for photographic purposes
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/093 Digital circuits for control of exposure time
    • G03B7/095 Digital circuits for control of aperture
    • G03B7/097 Digital circuits for control of both exposure time and aperture
    • H04N23/60 Control of cameras or camera modules
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/75 Circuitry for compensating brightness variation in the scene by influencing optical camera components

Definitions

  • the invention relates to a control device, an imaging device, an imaging system, a mobile body, a control method, and a program.
  • Patent Document 1 describes that the brightness level of the input image is changed by rotating the PL polarizer.
  • Patent Document 1 Japanese Patent Laid-Open No. 2014-74838
  • the exposure based on the automatic exposure control is not necessarily the exposure that meets the user's intention.
  • the control device may be a control device that controls at least one of the aperture value and the exposure time of the imaging device based on the target exposure value of the imaging device.
  • the control device may include a circuit configured to change the target exposure value based on the light transmittance of the optical device, and the optical device transmits light incident on an image sensor included in the imaging device.
  • the circuit may be configured to acquire instruction data for changing at least one of the aperture value and the exposure time of the imaging device.
  • the circuit may be configured to determine the first aperture value and the first exposure time based on the indication data and the target exposure value.
  • the circuit may be configured to change the transmittance of the optical device when the first aperture value is not within the predetermined aperture value range or the first exposure time is not within the predetermined exposure time range.
  • the circuit may be configured to change the target exposure value based on the changed transmittance of the optical device.
  • the circuit may be configured to determine the second aperture value within the predetermined aperture value range and the second exposure time within the predetermined exposure time range based on the indication data and the target exposure value.
  • the circuit may be configured to change the transmittance of the optical device based on the difference between the target exposure value before the change and the exposure value based on the second aperture value and the second exposure time.
  • the optical device may include a first polarizing filter, a second polarizing filter that overlaps the first polarizing filter in the optical axis direction, and a rotating mechanism that rotates the first polarizing filter.
  • the circuit may be configured to adjust the relative positional relationship between the polarization direction of the transmitted light of the first polarizing filter and the polarization direction of the second polarizing filter through a rotating mechanism, thereby changing the transmittance of the optical device.
  • the camera device is rotatably supported by the supporting mechanism.
  • the rotating mechanism can further rotate the second polarizing filter.
  • the circuit may be configured to, based on a control command for the support mechanism that rotates the imaging device in a first direction, control the rotating mechanism to rotate the first polarizing filter and the second polarizing filter in a second direction opposite to the first direction while maintaining the relative position between the polarization direction of the first polarizing filter and the polarization direction of the second polarizing filter.
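The counter-rotation above can be sketched in a few lines; this is a minimal illustration with assumed function names and a degree-based interface, not code from the patent:

```python
# Hypothetical sketch: when the support mechanism rotates the camera in a
# first direction, both polarizing filters are rotated in the opposite
# (second) direction so their polarization directions stay fixed in real
# space and relative to each other.

def counter_rotation(camera_rotation_deg: float) -> float:
    """Angle to apply to both polarizing filters, opposite to the
    camera's rotation."""
    return -camera_rotation_deg

def rotate_with_camera(filter1_deg: float, filter2_deg: float,
                       camera_rotation_deg: float):
    # Both filters receive the same counter-rotation, so the relative
    # angle between them -- and hence the transmittance of the optical
    # device -- is unchanged by the camera's rotation.
    delta = counter_rotation(camera_rotation_deg)
    return filter1_deg + delta, filter2_deg + delta
```

Because both filters move by the same amount, the exposure through the filter pair stays constant while the camera pans or rolls.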
  • the circuit may be configured to change the transmittance of the optical device at a first speed when the imaging device captures a still image, and change the transmittance of the optical device at a second speed slower than the first speed when the imaging device captures a moving image.
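A sketch of the two change speeds; the per-tick rates below are illustrative assumptions, since the text only states that the still-image speed is faster than the moving-image speed:

```python
def step_transmittance(current: float, target: float, mode: str) -> float:
    """Move the transmittance one control tick toward the target.
    Rates (0.20 per tick for "still", 0.02 per tick for "video") are
    assumed values for illustration only."""
    rate = 0.20 if mode == "still" else 0.02
    if abs(target - current) <= rate:
        return target
    return current + rate if target > current else current - rate
```

A slower rate for moving images avoids a visible brightness jump between consecutive frames.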
  • the imaging device may include the above-mentioned control device, optical device, and image sensor.
  • An imaging system may include the above-mentioned imaging device and a supporting mechanism that rotatably supports the imaging device.
  • the mobile body according to one aspect of the present invention may be a mobile body that includes the aforementioned camera system and moves.
  • the control method according to an aspect of the present invention may be a control method of controlling at least one of the aperture value and the exposure time of the imaging device based on the target exposure value of the imaging device.
  • the control method may include changing the target exposure value based on the light transmittance of the optical device, and the optical device transmits light incident on the image sensor of the imaging device.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the aforementioned control device.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aircraft and a remote operation device.
  • Fig. 2 is a diagram showing an example of functional blocks of an unmanned aircraft.
  • FIG. 3 is a diagram showing an example of the relationship between the rotation angle of the polarizing filter and the transmittance.
  • FIG. 4 is a diagram showing an example of the change speed of the transmittance of the optical device when the imaging device captures a still image, and the change speed of the transmittance of the optical device when the imaging device captures a moving image.
  • FIG. 5 is a diagram showing an example of the relationship between the number of steps of the drive motor of the polarization filter driving section and the polarization plane angle of the polarization filter.
  • FIG. 6 is a diagram showing the driving state of the driving motor corresponding to the rotation of the imaging device.
  • Fig. 7 is a diagram showing an example of a hardware configuration.
  • the blocks may show (1) stages of a process in which operations are performed or (2) "parts" of a device that perform operations. Specific stages and "parts" can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • Reconfigurable hardware circuits may include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, memory elements such as flip-flops and registers, and devices such as field-programmable gate arrays (FPGA) and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. Therefore, a computer-readable medium having instructions stored thereon includes a product containing instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • the computer-readable medium may include a floppy (registered trademark) disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the source code or object code may include assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, and conventional procedural programming languages.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • the UAV 10 includes a UAV main body 20, a universal joint 50, a plurality of imaging devices 60, and the imaging device 100.
  • the universal joint 50 and the camera device 100 are an example of a camera system.
  • the UAV 10 is an example of a mobile body. A mobile body is a concept that includes flying objects moving in the air, vehicles moving on the ground, and ships moving on the water. A flying object moving in the air is a concept that includes not only the UAV but also other aircraft, airships, helicopters, and the like moving in the air.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the universal joint 50 rotatably supports the imaging device 100.
  • the universal joint 50 is an example of a supporting mechanism.
  • the universal joint 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis.
  • the universal joint 50 uses an actuator to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are cameras for sensing that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 can be installed on the nose of the UAV 10, that is, on the front side.
  • the other two camera devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 may include at least one imaging device 60.
  • the UAV 10 may be equipped with at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can wirelessly communicate with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may show the height at which the UAV 10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending instruction to raise the UAV10.
  • the UAV 10 rises while it is receiving the ascent command. When the height of the UAV 10 has reached its upper limit, the UAV 10 can be restricted from ascending even if it accepts the ascent command.
  • FIG. 2 shows an example of the functional blocks of UAV10.
  • the UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the universal joint 50, the imaging devices 60, and the imaging device 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 can receive instruction information including various instructions for the UAV control unit 30 from the remote operation device 300.
  • the memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, GPS receiver 41, inertial measurement unit (IMU) 42, magnetic compass 43, barometric altimeter 44, temperature sensor 45, humidity sensor 46, universal joint 50, imaging devices 60, and imaging device 100.
  • the memory 37 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory and a solid state drive (SSD).
  • the memory 37 may be provided inside the UAV main body 20, or may be detachably installed on the UAV main body 20.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 according to a program stored in the memory 37.
  • the UAV control unit 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 propels the UAV10.
  • the propulsion part 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to cause the UAV 10 to fly.
  • the GPS receiver 41 receives a plurality of signals indicating time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received signals.
  • the IMU42 detects the posture of the UAV10.
  • the IMU 42 detects the acceleration of the UAV 10 in the three-axis directions of front and rear, left and right, and up and down, and the angular velocities of the pitch axis, the roll axis, and the yaw axis as the attitude of the UAV 10.
  • the magnetic compass 43 detects the orientation of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying altitude of the UAV10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into altitude to detect the altitude.
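The pressure-to-altitude conversion a barometric altimeter performs can be sketched with the international standard atmosphere model; the constants below come from the standard barometric formula, not from the patent:

```python
def pressure_to_altitude_m(pressure_hpa: float,
                           sea_level_hpa: float = 1013.25) -> float:
    """Convert ambient air pressure (hPa) to altitude (m) using the
    international standard atmosphere model."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

Lower ambient pressure maps to higher altitude, which is why the sensor reading can stand in for flying height.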
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the lens part 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 130, and an acceleration sensor 140.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 captures optical images formed through the plurality of lenses 210 and outputs the captured images to the imaging control unit 110.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 is an example of a circuit.
  • the imaging control unit 110 can control the imaging device 100 according to an operation instruction of the imaging device 100 from the UAV control unit 30.
  • the imaging control unit 110 is an example of a first control unit and a second control unit.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory and a solid state drive (SSD).
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the storage 130 may be configured to be detachable from the housing of the imaging device 100.
  • the acceleration sensor 140 detects the acceleration in the front and rear, left and right, and up and down directions of the imaging device 100.
  • the imaging control unit 110 obtains from the acceleration sensor 140 information indicating the acceleration of the imaging device 100 in the three-axis directions of front and rear, left and right, and up and down as posture information indicating the posture state of the imaging device 100.
  • the lens part 200 includes a plurality of lenses 210, a plurality of lens driving parts 212, and a lens control part 220.
  • the plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focusing lens. At least some or all of the plurality of lenses 210 are configured to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens that is provided to be detachable from the imaging unit 102.
  • the lens driving unit 212 moves at least a part or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring.
  • the lens driving part 212 may include an actuator.
  • the actuator may include a stepper motor.
  • the lens control unit 220 drives the lens driving unit 212 in accordance with a lens control instruction from the imaging unit 102 to move one or more lenses 210 in the optical axis direction via a mechanism member.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens part 200 further includes a memory 222 and a position sensor 214.
  • the lens control unit 220 controls the movement of the lens 210 in the optical axis direction via the lens drive unit 212 in accordance with a lens operation command from the imaging unit 102.
  • Part or all of the lens 210 moves along the optical axis.
  • the lens control section 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis.
  • the position sensor 214 detects the position of the lens 210.
  • the position sensor 214 can detect the current zoom position or focus position.
  • the lens driving part 212 may include a shake correction mechanism.
  • the lens control section 220 may move the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via a shake correction mechanism to perform shake correction.
  • the lens driving part 212 may drive a shake correction mechanism by a stepping motor to perform shake correction.
  • the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis to perform shake correction.
  • the memory 222 stores control values of the plurality of lenses 210 moved by the lens driving unit 212.
  • the memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the lens part 200 further includes an optical device 230.
  • the optical device 230 may be a variable ND filter whose transmittance can be changed.
  • the optical device 230 includes a polarization filter driving part 232, a polarization filter 234 and a polarization filter 236.
  • the polarization filter 234 and the polarization filter 236 are optical filters that transmit light in a specific polarization direction.
  • the polarizing filter 234 and the polarizing filter 236 may be arranged on the optical axis of the imaging device 100.
  • the polarizing filter 234 and the polarizing filter 236 may be arranged between the lens 210 and the image sensor 120.
  • the polarizing filter driving unit 232 includes a rotating mechanism that rotates the polarizing filter 234.
  • the polarizing filter driving unit 232 can rotate the polarizing filter 234 around the optical axis.
  • the polarizing filter driving part 232 includes a driving motor.
  • the rotating mechanism may include at least one gear that transmits power from the driving motor to the polarizing filter 234.
  • the driving motor may be a stepping motor.
  • the polarization filter driving unit 232 changes the polarization direction of the light transmitted by the polarization filter 234 by rotating the polarization filter 234.
  • the polarizing filter 236 may be fixed on the optical axis.
  • the polarization filter driving unit 232 includes a rotation detection sensor that detects the rotation of the polarization filter 234.
  • the rotation detection sensor may be a photo-interrupter, a variable resistor, a Hall element, or the like.
  • the polarization filter drive unit 232 rotates the polarization filter 234 to change the polarization direction of the transmitted light of the polarization filter 234 in the absolute space, that is, the actual space.
  • the polarization filter 234 changes the angle of the polarization plane of the transmitted light of the polarization filter 234 in real space.
  • the polarization filter driving unit 232 changes the rotation angle of the polarization filter 234 by rotating the polarization filter 234 to change the polarization direction of the transmitted light of the polarization filter 234.
  • the optical device 230 can change the transmittance of the optical device 230 by adjusting the relative positional relationship between the polarization direction of the polarization filter 234 and the polarization direction of the polarization filter 236.
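The transmittance of two stacked polarizers as a function of their relative angle follows Malus's law, which is presumably the relationship plotted in FIG. 3. A sketch, where the maximum transmittance `t_max` is an assumed value (an ideal polarizer already halves unpolarized light), not a figure from the patent:

```python
import math

def optical_device_transmittance(relative_angle_deg: float,
                                 t_max: float = 0.5) -> float:
    """Transmittance of two stacked polarizing filters versus the
    relative angle between their polarization directions (Malus's law:
    T = t_max * cos^2(theta))."""
    return t_max * math.cos(math.radians(relative_angle_deg)) ** 2
```

At 0 degrees the filters pass the most light; at 90 degrees (crossed polarizers) transmittance drops essentially to zero, which is what lets the rotating mechanism sweep the ND strength continuously.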
  • the optical device 230 is not limited to a variable ND filter; it may be another type of filter capable of electrically adjusting the light transmittance.
  • the optical device 230 may be, for example, an electrochromic element.
  • the electrochromic element includes an electrochromic material that reversibly produces optical absorption by applying a voltage or passing a current.
  • the optical device 230 may be a liquid crystal type ND filter capable of adjusting the transmittance by changing the arrangement of the liquid crystal by applying a voltage.
  • the imaging unit 102 includes a photometric sensor 150.
  • the photometric sensor 150 detects the amount of light passing through the lens 210.
  • the light metering sensor 150 is a TTL exposure meter.
  • the imaging control unit 110 includes an automatic exposure control unit 112 and a transmittance control unit 114.
  • the automatic exposure control section 112 performs automatic exposure control by controlling the aperture and shutter.
  • the transmittance control unit 114 controls the polarization filter driving unit 232 to control the transmittance of the optical device 230.
  • the automatic exposure control unit 112 determines an appropriate exposure, that is, a target exposure value (EV value) based on the amount of light detected by the photometric sensor 150.
  • the automatic exposure control section 112 determines the aperture value of the aperture and the shutter speed (exposure time) of the shutter based on the target exposure value.
  • the automatic exposure control section 112 determines the AV value as an index of the aperture value and the TV value as an index of the shutter speed based on the target exposure value.
  • the automatic exposure control unit 112 controls the aperture based on the AV value, and controls the shutter speed based on the TV value.
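The AV, TV, and EV indices used here appear to follow the APEX convention, in which AV = log2(N²) for f-number N, TV = log2(1/t) for exposure time t in seconds, and EV = AV + TV for a given scene brightness and ISO sensitivity. A sketch (the framing as APEX is an inference from the text, though the relations themselves are standard):

```python
import math

def av_from_f_number(n: float) -> float:
    """APEX aperture value: AV = log2(N^2)."""
    return math.log2(n * n)

def tv_from_exposure_time(t: float) -> float:
    """APEX time value: TV = log2(1/t), t in seconds."""
    return math.log2(1.0 / t)

def ev_from_settings(n: float, t: float) -> float:
    """Exposure value implied by aperture and shutter: EV = AV + TV."""
    return av_from_f_number(n) + tv_from_exposure_time(t)
```

Under this convention the document's example of AV 6 and TV 8 corresponds to f/8 at 1/256 s, giving EV 14.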
  • the automatic exposure control unit 112 may also adjust the ISO sensitivity to control the exposure value to an appropriate exposure.
  • an image captured at the aperture value (AV value) and shutter speed (TV value) determined by the automatic exposure control section 112 is not necessarily the image desired by the user.
  • if the shutter speed determined by the automatic exposure control unit 112 is too fast, the result may be a choppy, frame-by-frame image.
  • if the aperture value is increased to compensate for a slower shutter speed, a suitable exposure may still be obtained.
  • for example, suppose the EV value for proper exposure is 14, the AV value is 6, and the TV value is 8.
  • if the shutter speed is slowed while the aperture value is fixed, the imaging device 100 cannot capture an image with appropriate exposure.
  • the transmittance control unit 114 adjusts the transmittance of the optical device 230.
  • the transmittance control unit 114 adjusts the transmittance of the optical device 230 so that the amount of light incident on the image sensor 120 is darkened by two levels.
  • the imaging device 100 can capture an image with the same brightness as that of an image captured with a suitable exposure.
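Darkening by two levels, as in the example above, corresponds to a transmittance of 2⁻² = 25%, since each exposure level (stop) halves the light. A sketch of this stop-to-transmittance relation (illustrative, not from the patent):

```python
def transmittance_for_stops(stops: float) -> float:
    """Relative transmittance that darkens the light reaching the image
    sensor by the given number of exposure levels: each stop halves
    the light."""
    return 2.0 ** (-stops)
```

So to keep the image brightness unchanged after opening the exposure by two levels, the optical device is set to pass a quarter of the light.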
  • the imaging control unit 110 determines a target exposure value for appropriate exposure based on the amount of light from the light metering sensor 150 and the set ISO sensitivity.
  • the imaging control unit 110 changes the target exposure value based on the light transmittance of the optical device 230, where the optical device 230 transmits light incident to the image sensor 120.
  • the imaging control unit 110 obtains instruction data for changing at least one of the aperture value and the exposure time.
  • the imaging control unit 110 acquires instruction data from the user.
  • the imaging control unit 110 obtains the instruction data via the remote operation device 300, for example, instruction data for changing the TV value from "8" to "6".
  • the automatic exposure control section 112 determines the first aperture value and the first exposure time based on the instruction data and the target exposure value. For example, the automatic exposure control unit 112 determines the TV value specified by the instruction data as the first exposure time, and determines the AV value at which a suitable exposure can be obtained from the TV value as the first aperture value.
  • the transmittance control unit 114 changes the transmittance of the optical device 230. For example, when the aperture value is fixed, the AV value cannot be changed with the change of the TV value. Therefore, the transmittance control unit 114 determines that the first aperture value is not within the predetermined aperture value range.
  • the predetermined aperture value range and the predetermined exposure time range can be set by the user, or can be set according to a photography mode such as aperture priority and shutter priority.
  • the predetermined aperture value range may span multiple levels of AV values, such as a range from a first AV value to a second AV value, or may be a single AV level.
  • the preset exposure time range may be multiple levels of TV values, for example, the range from the first TV value to the second TV value, or any level of TV value.
  • the transmittance control section 114 determines the second aperture value within the predetermined aperture value range and the second exposure time within the predetermined range based on the instruction data and the target exposure value. For example, when the aperture value is fixed, the transmittance control unit 114 determines the TV value indicated by the instruction data as the second exposure time, and determines the current AV value as the second aperture value. The transmittance control unit 114 changes the transmittance of the optical device 230 based on the difference between the target exposure value before the change and the exposure value based on the second aperture value and the second exposure time.
  • the transmittance control unit 114 changes the transmittance of the optical device 230 according to the number of steps of the difference between the target exposure value before the change and the exposure value based on the second aperture value and the second exposure time.
  • the transmittance control unit 114 changes the transmittance of the optical device 230 so that the change in transmittance corresponds to the number of stops of the difference.
  • the transmittance control unit 114 changes the transmittance of the optical device 230 so that the amount of light input to the image sensor 120 is reduced by two stops.
  • the transmittance control unit 114 changes the transmittance of the optical device 230 by controlling the rotation angle of the polarization filter 234.
  • the automatic exposure control unit 112 changes the target exposure value based on the changed transmittance of the optical device 230. For example, the automatic exposure control unit 112 changes the target exposure value from “14" to "12" based on the changed transmittance of the optical device 230. Thereby, the imaging device 100 can capture an image with the same brightness as when the target exposure value is "14".
  • FIG. 3 shows an example of the relationship between the rotation angle of the polarizing filter 234 and the transmittance K.
  • the transmittance control unit 114 changes the appropriate exposure by changing the transmittance K of the optical device 230. For example, by reducing the transmittance K, the EV value for proper exposure can be reduced.
  • FIG. 4 is a diagram showing an example of the change speed of the transmittance of the optical device 230 when the imaging device 100 captures a still image, and the change speed of the transmittance of the optical device 230 when the imaging device 100 captures a moving image.
  • the transmittance control unit 114 can change the transmittance of the optical device 230 at a first speed when the imaging device 100 captures a still image, and at a second speed slower than the first speed when the imaging device 100 captures a moving image.
  • the posture of the imaging device 100 changes over time.
  • the polarization direction of the light passing through the polarization filter 234 and the polarization filter 236 changes. If the polarization direction changes, the intensity of the light in the specific polarization direction incident on the image sensor 120 may change. For example, the intensity of sunlight reflected on the water surface and incident on the image sensor 120 may vary. That is, the imaging device 100 may not be able to capture an image in a style that matches the user's intention.
  • the polarizing filter driving unit 232 may also rotate the polarizing filter 236 around the optical axis. In this case, it may be considered to detect the posture of the imaging device 100 and change the rotation angles of the polarizing filter 234 and the polarizing filter 236 according to the posture of the imaging device 100.
  • when the rotation angles of the polarizing filter 234 and the polarizing filter 236 are changed by feedback in this way, it is not always possible to control them appropriately while the posture of the imaging device 100 changes over time.
  • the transmittance control unit 114 can, by feed-forward, rotate the polarizing filter 234 and the polarizing filter 236 by a given rotation angle while maintaining the relative positional relationship between the polarization direction of the polarizing filter 234 and that of the polarizing filter 236.
  • the transmittance control unit 114 can, while maintaining the relative positional relationship between the polarization directions of the polarizing filter 234 and the polarizing filter 236, control the polarizing filter driving unit 232 based on a control command to the universal joint 50 (which rotatably supports the imaging device 100) for rotating the imaging device 100 in a first direction, so that the polarizing filter 234 and the polarizing filter 236 rotate in a second direction opposite to the first direction.
  • the transmittance control unit 114 controls the polarizing filter driving unit 232 so that the polarizing filter 234 rotates in the second direction by the amount of the imaging device 100's rotation in the first direction based on the control command.
  • the transmittance control unit 114 can thereby maintain the transmittance of the optical device 230 while rotating the polarizing filter 234 and the polarizing filter 236 in the second direction opposite to the first direction.
  • FIG. 5 shows the relationship between the number of steps of the driving motor of the polarizing filter driving section 232 and the polarization angle of the polarizing filter 234.
  • the polarization plane angle of the polarization filter 234 is the angle between the polarization plane of the light passing through the polarization filter 234 and the horizontal plane of the imaging device 100.
  • the polarization plane is the plane containing the polarization direction.
  • the horizontal plane of the imaging device 100 may be a plane including the optical axis when the imaging device 100 is in the reference posture (the optical axis faces the horizontal direction).
  • the polarizing filter 234 and the polarizing filter 236 rotate in the first direction (positive direction).
  • when the number of steps of the drive motor is +N, the rotation angle of the polarizing filter 234 and the polarizing filter 236 is +180 degrees.
  • when the number of steps of the drive motor is -N, the rotation angle of the polarizing filter 234 and the polarizing filter 236 is -180 degrees.
  • FIG. 6 shows the driving situation of the driving motor corresponding to the rotation of the imaging device 100. It shows a situation where the imaging device 100 is capturing a subject in a state where the UAV 10 is flying.
  • the flying state of the UAV 10 is a horizontal state with respect to the ground
  • the posture state of the camera device 100 supported by the universal joint 50 is also a horizontal state with respect to the ground. That is, the plane including the optical axis of the imaging device 100 is parallel to the ground.
  • the UAV control unit 30 outputs to the universal joint 50 a control command to rotate the posture state of the imaging device 100 with respect to the ground by +30 degrees around the optical axis. Thus, it moves from step 1 to step 2.
  • in step 2, the imaging control unit 110 outputs to the drive motor, in synchronization with the control command, a rotation drive command that rotates the polarizing filter 234 and the polarizing filter 236 by -30 degrees (-N/6 steps) while maintaining the relative positional relationship between their polarization directions.
  • the polarization direction of the light passing through the polarization filter 234 and the polarization filter 236 can be maintained. That is, while maintaining the transmittance of the optical device 230, the rotation angles of the polarizing filter 234 and the polarizing filter 236 can be maintained at a fixed angle with respect to the ground.
  • in step 3, with the UAV 10 flying level with respect to the ground, the posture of the imaging device 100 supported by the universal joint 50 is rotated by -30 degrees about the optical axis with respect to the ground.
  • the UAV control unit 30 outputs to the universal joint 50 a control command to rotate the posture state of the imaging device 100 with respect to the ground from +30 degrees to -30 degrees around the optical axis.
  • the imaging control unit 110 outputs to the drive motor, in synchronization with the control command, a rotation drive command that rotates the polarizing filter 234 and the polarizing filter 236 from -30 degrees to +30 degrees (from -N/6 steps to +N/6 steps).
  • the imaging device 100 rotates in the second direction (negative direction) with the optical axis as the center at a constant rotation speed. Therefore, the polarization direction of the light passing through the polarization filter 234 and the polarization filter 236 can be maintained. That is, while maintaining the transmittance of the optical device 230, the rotation angles of the polarizing filter 234 and the polarizing filter 236 can be maintained at a fixed angle with respect to the ground.
  • the universal joint 50 controls the posture of the imaging device 100 to be level with respect to the ground. During this period, the imaging control unit 110 drives the drive motor so as to maintain the relative positional relationship between the polarization directions of the polarizing filter 234 and the polarizing filter 236 while, for example, keeping the rotation angle of the polarizing filter 234 horizontal (0 degrees) with respect to the ground.
  • the imaging control unit 110 rotates the polarizing filter 234 and the polarizing filter 236 through feed forward in accordance with the control command for the universal joint 50. Therefore, while maintaining the transmittance of the optical device 230, the polarization direction of the polarization filter 234 and the polarization filter 236 can be maintained in a fixed direction with respect to the ground. Therefore, it is possible to prevent the polarization direction of the polarizing filter 234 and the polarizing filter 236 from changing due to a change in the posture of the image pickup device 100, causing the image pickup device 100 to fail to capture images in a style that meets the user's intention.
  • FIG. 7 shows an example of a computer 1200 that can embody aspects of the present invention in whole or in part.
  • the program installed on the computer 1200 can make the computer 1200 function as an operation associated with the device according to the embodiment of the present invention or one or more "parts" of the device. Alternatively, the program can cause the computer 1200 to perform the operation or the one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222, an input/output unit, which is connected to the host controller 1210 through the input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and causes cooperation between the programs and the various types of hardware resources described above.
  • the apparatus or method may be constituted by implementing operations or processing of information according to the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • an external recording medium such as a USB memory
  • the CPU 1212 can perform the various types of processing described throughout this disclosure and specified by the program's instruction sequences, including operations, information processing, condition judgment, conditional branching, unconditional branching, and information search/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may search the entries for one that matches a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.
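The exposure bookkeeping sketched in the bullets above (EV = AV + TV) can be illustrated with a short script. This is a minimal sketch; the function and variable names are my own, not from the patent:

```python
# Hedged sketch of the exposure bookkeeping described above (EV = AV + TV).
# When the aperture (AV) cannot move and the user forces a slower shutter (TV),
# the variable ND filter absorbs the surplus light.

def plan_exposure(target_ev, requested_tv, fixed_av):
    """Return (stops to attenuate, ND transmittance factor, new target EV)."""
    achieved_ev = fixed_av + requested_tv      # exposure the fixed settings give
    stops_short = target_ev - achieved_ev      # positive => too much light reaches the sensor
    nd_factor = 2.0 ** (-stops_short)          # transmittance multiplier for the optical device
    new_target_ev = target_ev - stops_short    # target EV after the transmittance change
    return stops_short, nd_factor, new_target_ev

# The worked example from the text: target EV 14, TV changed from 8 to 6, AV held at 6.
stops, factor, new_ev = plan_exposure(target_ev=14, requested_tv=6, fixed_av=6)
print(stops, factor, new_ev)  # 2 stops darker, transmittance x0.25, new target EV 12
```

With the transmittance reduced by two stops, the camera reproduces the brightness of the EV 14 exposure while keeping the user's aperture and shutter settings.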


Abstract

The exposure obtained by automatic exposure control is not always the exposure intended by the user. The control device may be a control device that controls at least one of an aperture value and an exposure time of an imaging device based on a target exposure value of the imaging device. The control device may include a circuit configured to change the target exposure value based on the transmittance of an optical device that transmits light incident on an image sensor of the imaging device. The circuit may be configured to: acquire instruction data for changing at least one of the aperture value and the exposure time of the imaging device; determine a first aperture value and a first exposure time based on the instruction data and the target exposure value; change the transmittance of the optical device when the first aperture value is outside a predetermined aperture value range or the first exposure time is outside a predetermined exposure time range; and change the target exposure value based on the changed transmittance of the optical device.

Description

Control device, imaging device, imaging system, movable object, control method, and program
This application claims priority to Japanese Patent Application No. JP2019-122137, filed on June 28, 2019, the content of which is incorporated herein by reference.
Technical Field
The present invention relates to a control device, an imaging device, an imaging system, a movable object, a control method, and a program.
Background Art
Patent Document 1 describes changing the brightness level of an input image by rotating a PL polarizing filter.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Patent Application Publication No. 2014-74838
Summary of the Invention
Technical Problem to Be Solved
The exposure obtained by automatic exposure control is not always the exposure intended by the user.
A control device according to one aspect of the present invention may be a control device that controls at least one of an aperture value and an exposure time of an imaging device based on a target exposure value of the imaging device. The control device may include a circuit configured to change the target exposure value based on the transmittance of an optical device that transmits light incident on an image sensor of the imaging device.
The circuit may be configured to acquire instruction data for changing at least one of the aperture value and the exposure time of the imaging device. The circuit may be configured to determine a first aperture value and a first exposure time based on the instruction data and the target exposure value. The circuit may be configured to change the transmittance of the optical device when the first aperture value is outside a predetermined aperture value range or the first exposure time is outside a predetermined exposure time range. The circuit may be configured to change the target exposure value based on the changed transmittance of the optical device.
The circuit may be configured to determine a second aperture value within the predetermined aperture value range and a second exposure time within the predetermined exposure time range based on the instruction data and the target exposure value. The circuit may be configured to change the transmittance of the optical device according to the difference between the target exposure value before the change and an exposure value based on the second aperture value and the second exposure time.
The optical device may include a first polarizing filter, a second polarizing filter that overlaps the first polarizing filter in the optical axis direction, and a rotation mechanism that rotates the first polarizing filter. The circuit may be configured to change the transmittance of the optical device by adjusting, via the rotation mechanism, the relative positional relationship between the polarization direction of light transmitted by the first polarizing filter and the polarization direction of light transmitted by the second polarizing filter.
The imaging device may be rotatably supported by a support mechanism. The rotation mechanism may further rotate the second polarizing filter. The circuit may be configured to, based on a control command to the support mechanism for rotating the imaging device in a first direction, control the rotation mechanism so that the first polarizing filter and the second polarizing filter rotate in a second direction opposite to the first direction while the relative positional relationship between the polarization direction of the first polarizing filter and the polarization direction of the second polarizing filter is maintained.
The circuit may be configured to change the transmittance of the optical device at a first speed when the imaging device captures a still image, and at a second speed slower than the first speed when the imaging device captures a moving image.
An imaging device may include the above control device, the optical device, and the image sensor.
An imaging system according to one aspect of the present invention may include the above imaging device and a support mechanism that rotatably supports the imaging device.
A movable object according to one aspect of the present invention may be a movable object that includes the above imaging system and moves.
A control method according to one aspect of the present invention may be a control method for controlling at least one of an aperture value and an exposure time of an imaging device based on a target exposure value of the imaging device. The control method may include changing the target exposure value based on the transmittance of an optical device that transmits light incident on an image sensor of the imaging device.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above control device.
According to one aspect of the present invention, exposure that matches the user's intention can be achieved more easily.
The above summary does not enumerate all necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 2 is a diagram showing an example of functional blocks of the unmanned aerial vehicle.
FIG. 3 is a diagram showing an example of the relationship between the rotation angle of a polarizing filter and the transmittance.
FIG. 4 is a diagram showing an example of the speed at which the transmittance of the optical device changes when the imaging device captures a still image and when it captures a moving image.
FIG. 5 is a diagram showing an example of the relationship between the number of steps of the drive motor of the polarizing filter drive unit and the polarization plane angle of the polarizing filter.
FIG. 6 is a diagram showing how the drive motor is driven in response to rotation of the imaging device.
FIG. 7 is a diagram showing an example of a hardware configuration.
Description of Reference Numerals
10 UAV
20 UAV body
30 UAV control unit
36 Communication interface
37 Memory
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
45 Temperature sensor
46 Humidity sensor
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Imaging control unit
112 Automatic exposure control unit
114 Transmittance control unit
120 Image sensor
130 Memory
140 Acceleration sensor
150 Light metering sensor
200 Lens unit
210 Lens
212 Lens drive unit
214 Position sensor
220 Lens control unit
222 Memory
230 Optical device
232 Polarizing filter drive unit
234, 236 Polarizing filters
300 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM
Detailed Description
The present invention will be described below through embodiments, but the following embodiments do not limit the invention defined by the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those skilled in the art that various changes and improvements can be made to the following embodiments. It is apparent from the claims that embodiments incorporating such changes or improvements are included in the technical scope of the present invention.
The claims, description, drawings, and abstract contain matters subject to copyright protection. The copyright holder will not object to the reproduction of these documents as they appear in the files or records of the patent office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device responsible for performing an operation. Particular stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits, which may include memory elements such as logic AND, logic OR, logic XOR, logic NAND, logic NOR and other logic operations, flip-flops, registers, field-programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples may include floppy disk (registered trademark), diskette, hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages, including conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, as well as the "C" programming language or similar programming languages. Computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10, i.e., a movable object, is a concept that includes flying objects moving in the air, vehicles moving on the ground, ships moving on water, and the like. A flying object moving in the air is a concept that includes not only UAVs but also other aircraft, airships, helicopters, and the like moving in the air.
The UAV body 20 includes a plurality of rotors, which are an example of a propulsion unit. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the rotors, using, for example, four rotors. The number of rotors is not limited to four, and the UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100 and is an example of a support mechanism. For example, the gimbal 50 uses actuators to rotatably support the imaging device 100 about the pitch axis, and further about the roll axis and the yaw axis. The gimbal 50 can change the posture of the imaging device 100 by rotating it about at least one of the yaw, pitch, and roll axes.
The plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the nose, i.e., the front, of the UAV 10, and the other two on its underside. The two front imaging devices 60 may be paired to function as a so-called stereo camera, as may the two underside imaging devices 60. Three-dimensional spatial data of the surroundings of the UAV 10 may be generated from the images captured by the imaging devices 60. The number of imaging devices 60 on the UAV 10 is not limited to four; it suffices for the UAV 10 to have at least one. The UAV 10 may also have at least one imaging device 60 on each of its nose, tail, sides, underside, and top. The angle of view settable on the imaging devices 60 may be wider than that settable on the imaging device 100, and the imaging devices 60 may have a single-focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV 10 to operate it remotely, and may do so wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information to raise the altitude of the UAV 10, and may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascent command, and the UAV 10 ascends while it is receiving the ascent command. When the altitude of the UAV 10 has reached its upper limit, the ascent of the UAV 10 may be restricted even if an ascent command is received.
FIG. 2 shows an example of functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, imaging devices 60, and an imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 300, and may receive from it instruction information including various commands for the UAV control unit 30. The memory 37 stores the programs and the like that the UAV control unit 30 needs to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 37 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drive (SSD). The memory 37 may be provided inside the UAV body 20, or detachably on the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with programs stored in the memory 37, and may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with commands received from the remote operation device 300 via the communication interface 36. The propulsion unit 40 propels the UAV 10; it includes a plurality of rotors and a plurality of drive motors that rotate them, and rotates the rotors via the drive motors in accordance with commands from the UAV control unit 30 to cause the UAV 10 to fly.
The GPS receiver 41 receives a plurality of time-indicating signals transmitted from a plurality of GPS satellites and calculates from them the position (latitude and longitude) of the GPS receiver 41, i.e., of the UAV 10. The IMU 42 detects the attitude of the UAV 10: the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10 by detecting the surrounding air pressure and converting it into an altitude. The temperature sensor 45 detects the temperature around the UAV 10, and the humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 130, and an acceleration sensor 140. The image sensor 120 may be composed of a CCD or CMOS sensor; it captures an optical image formed through a plurality of lenses 210 and outputs the captured image to the imaging control unit 110. The imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like, and is an example of a circuit. The imaging control unit 110 may control the imaging device 100 in accordance with operation commands for the imaging device 100 from the UAV control unit 30, and is an example of a first control unit and a second control unit. The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drive (SSD). The memory 130 stores the programs and the like that the imaging control unit 110 needs to control the image sensor 120 and other components, and may be provided inside the housing of the imaging device 100 or detachably from it.
The acceleration sensor 140 detects the accelerations of the imaging device 100 in the three axial directions of front-rear, left-right, and up-down. The imaging control unit 110 obtains from the acceleration sensor 140 information indicating these accelerations as attitude information indicating the attitude state of the imaging device 100.
The lens unit 200 includes a plurality of lenses 210, a plurality of lens drive units 212, and a lens control unit 220. The lenses 210 may serve as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the lenses 210 are arranged to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided to be attachable to and detachable from the imaging unit 102. The lens drive unit 212 moves at least some or all of the lenses 210 along the optical axis via a mechanism member such as a cam ring, and may include an actuator, which may include a stepping motor. The lens control unit 220 drives the lens drive unit 212 in accordance with lens control commands from the imaging unit 102, such as zoom control commands and focus control commands, to move one or more lenses 210 along the optical axis direction via the mechanism member.
The lens unit 200 further includes a memory 222 and a position sensor 214. The lens control unit 220 controls the movement of the lenses 210 in the optical axis direction via the lens drive unit 212 in accordance with lens operation commands from the imaging unit 102. Some or all of the lenses 210 move along the optical axis. The lens control unit 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the positions of the lenses 210 and may detect the current zoom position or focus position.
The lens drive unit 212 may include a shake correction mechanism. The lens control unit 220 may perform shake correction by moving the lens 210 in a direction along the optical axis or perpendicular to it via the shake correction mechanism, which may be driven by a stepping motor. Alternatively, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or perpendicular to it to perform shake correction.
The memory 222 stores control values of the lenses 210 moved via the lens drive unit 212, and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
The lens unit 200 further includes an optical device 230. The optical device 230 may be a variable ND filter whose transmittance can be changed. The optical device 230 includes a polarizing filter drive unit 232, a polarizing filter 234, and a polarizing filter 236. The polarizing filters 234 and 236 are optical filters that transmit light of a specific polarization direction, and may be arranged on the optical axis of the imaging device 100.
The polarizing filters 234 and 236 may be arranged between the lenses 210 and the image sensor 120. The polarizing filter drive unit 232 includes a rotation mechanism that rotates the polarizing filter 234, and may rotate the polarizing filter 234 about the optical axis. The polarizing filter drive unit 232 includes a drive motor, which may be a stepping motor; the rotation mechanism may include at least one gear that transmits power from the drive motor to the polarizing filter 234. The polarizing filter drive unit 232 changes the polarization direction of the light transmitted by the polarizing filter 234 by rotating it. The polarizing filter 236 may be fixed on the optical axis. The polarizing filter drive unit 232 includes a rotation detection sensor that detects the rotation of the polarizing filter 234; this sensor may be a photo-interrupter, a variable resistor, a Hall element, or the like.
By rotating the polarizing filter 234, the polarizing filter drive unit 232 changes, in absolute space (i.e., real space), the polarization direction of the light transmitted by the polarizing filter 234; that is, it changes the angle of the polarization plane of the transmitted light in real space. The optical device 230 can change its transmittance by adjusting the relative positional relationship between the polarization direction of the polarizing filter 234 and that of the polarizing filter 236. While the optical device 230 described here is a variable ND filter built from polarizers, it may also be another type of filter whose transmittance can be adjusted electrically. The optical device 230 may be, for example, an electrochromic element, which contains an electrochromic material that reversibly produces optical absorption when a voltage is applied or a current flows. The optical device 230 may also be a liquid-crystal-type ND filter whose transmittance can be adjusted by changing the liquid crystal alignment through an applied voltage.
The imaging unit 102 includes a light metering sensor 150, which detects the amount of light passing through the lenses 210. The light metering sensor 150 is a TTL exposure meter. The imaging control unit 110 includes an automatic exposure control unit 112 and a transmittance control unit 114. The automatic exposure control unit 112 performs automatic exposure control by controlling the aperture and the shutter. The transmittance control unit 114 controls the polarizing filter drive unit 232 to control the transmittance of the optical device 230.
The automatic exposure control unit 112 determines, based on the amount of light detected by the light metering sensor 150, a target exposure value (EV value) that gives proper exposure. Based on the target exposure value, it determines the aperture value of the aperture and the shutter speed (exposure time): it determines the AV value as the aperture-value index and the TV value as the shutter-speed index, controls the aperture based on the AV value, and controls the shutter speed based on the TV value. The automatic exposure control unit 112 may also control the exposure toward proper exposure by adjusting the ISO sensitivity.
However, an image captured at the aperture value (AV value) and shutter speed (TV value) determined by the automatic exposure control unit 112 is not necessarily the image the user wants. For example, when a moving image is captured with the imaging device 100, the shutter speed determined by the automatic exposure control unit 112 may be too fast, sometimes producing choppy, disjointed, comic-like footage.
In the above case, slowing the shutter speed to lengthen the exposure time can be considered. If the aperture value is then increased, proper exposure may be obtained. For example, suppose automatic exposure control has set the EV value for proper exposure to 14, the AV value to 6, and the TV value to 8. To slow the shutter speed, the TV value is set to 6. Since the sum of the AV value and the TV value equals the EV value (EV = AV + TV), proper exposure would be obtained in this case if the AV value could be set to 8. However, to obtain the image the user wants, it is sometimes undesirable to change the aperture value, i.e., to set the AV value to 8. In that case, if the shutter speed is slowed, the imaging device 100 cannot capture images at proper exposure.
Therefore, in this embodiment, the transmittance control unit 114 adjusts the transmittance of the optical device 230. When the AV value cannot be changed from 6 to 8, the transmittance control unit 114 adjusts the transmittance of the optical device 230 so that the amount of light incident on the image sensor 120 is reduced by two stops. Thus, even without changing the aperture value and while slowing the shutter speed, the imaging device 100 can capture an image with roughly the same brightness as one captured at proper exposure.
The imaging control unit 110 determines the target exposure value for proper exposure based on the amount of light from the light metering sensor 150 and the set ISO sensitivity. The imaging control unit 110 changes the target exposure value based on the transmittance of the optical device 230, which transmits the light incident on the image sensor 120. The imaging control unit 110 obtains instruction data for changing at least one of the aperture value and the exposure time. The imaging control unit 110 obtains the instruction data from the user via the remote operation device 300. As the instruction data, the imaging control unit 110 obtains, for example, data instructing that the TV value be changed from "8" to "6".
The automatic exposure control unit 112 determines a first aperture value and a first exposure time based on the instruction data and the target exposure value. For example, the automatic exposure control unit 112 determines the TV value specified by the instruction data as the first exposure time, and determines the AV value that would give proper exposure at that TV value as the first aperture value.
When the first aperture value is outside a predetermined aperture value range, or the first exposure time is outside a predetermined exposure time range, the transmittance control unit 114 changes the transmittance of the optical device 230. For example, when the aperture value is fixed, the AV value cannot change as the TV value changes, so the transmittance control unit 114 determines that the first aperture value is outside the predetermined aperture value range. The predetermined aperture value range and the predetermined exposure time range may be set by the user, or may be set according to a shooting mode such as aperture priority or shutter priority. The predetermined aperture value range may span multiple levels of AV values, for example a range from a first AV value to a second AV value, or may be a single AV level. Likewise, the predetermined exposure time range may span multiple levels of TV values, for example a range from a first TV value to a second TV value, or may be a single TV level.
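A minimal sketch of the range check described above (the names are my own, not from the patent): when the AV value required for proper exposure falls outside the predetermined aperture value range, the current AV is kept and the ND transmittance absorbs the difference.

```python
# Illustrative range check: a fixed aperture is modeled as a range that
# collapses to a single AV level.

def outside_range(value, value_range):
    """True when the candidate value lies outside the predetermined range."""
    lo, hi = value_range
    return not (lo <= value <= hi)

# Aperture fixed at AV 6; the first aperture value of 8 computed from the new
# TV value is rejected, which triggers the transmittance change:
print(outside_range(8, (6, 6)))  # True -> change the optical device transmittance
```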
When the first aperture value is outside the predetermined aperture value range, the transmittance control unit 114 determines, based on the instruction data and the target exposure value, a second aperture value within the predetermined aperture value range and a second exposure time within the predetermined exposure time range. For example, when the aperture value is fixed, the transmittance control unit 114 determines the TV value indicated by the instruction data as the second exposure time and the current AV value as the second aperture value. The transmittance control unit 114 then changes the transmittance of the optical device 230 according to the difference between the target exposure value before the change and the exposure value based on the second aperture value and the second exposure time, i.e., according to the number of stops of that difference, so that the change in transmittance corresponds to that number of stops.
For example, when the target exposure value before the change is "14", the AV value corresponding to the second aperture value is "6", and the TV value corresponding to the second exposure time is "6", the exposure value based on the second aperture value and the second exposure time is "12". Since the amount of light input to the image sensor 120 is then two stops too high, the transmittance control unit 114 changes the transmittance of the optical device 230 so that the amount of light input to the image sensor 120 is reduced by two stops. The transmittance control unit 114 changes the transmittance of the optical device 230 by controlling the rotation angle of the polarizing filter 234.
The automatic exposure control unit 112 changes the target exposure value based on the changed transmittance of the optical device 230, for example from "14" to "12". The imaging device 100 can thereby capture an image with the same brightness as when the target exposure value was "14".
FIG. 3 shows an example of the relationship between the rotation angle of the polarizing filter 234 and the transmittance K. The transmittance control unit 114 changes the proper exposure by changing the transmittance K of the optical device 230. For example, by lowering the transmittance K, the EV value for proper exposure can be reduced.
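The patent only plots the curve of FIG. 3. For an ideal pair of linear polarizers, the angle-to-transmittance relationship would follow Malus's law, K(θ) = K_max·cos²θ; this idealization is an assumption for illustration, not the device's measured curve, and the function names are my own:

```python
import math

def transmittance(theta_deg, k_max=1.0):
    """Ideal crossed-polarizer transmittance: K = K_max * cos^2(theta)."""
    return k_max * math.cos(math.radians(theta_deg)) ** 2

def angle_for_stops(stops, k_max=1.0):
    """Relative rotation angle that attenuates the light by `stops` EV stops."""
    target_k = k_max * 2.0 ** (-stops)
    return math.degrees(math.acos(math.sqrt(target_k / k_max)))

print(round(angle_for_stops(2), 1))  # 60.0 degrees for a 2-stop reduction
```

Under this model, the two-stop reduction of the worked example corresponds to rotating the movable filter 60 degrees relative to the fixed one.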
FIG. 4 shows an example of the speed at which the transmittance of the optical device 230 changes when the imaging device 100 captures a still image and when it captures a moving image. When the imaging device 100 captures a moving image, an abrupt change in the transmittance of the optical device 230 may adversely affect the image, for example by causing flicker. Therefore, when the imaging device 100 captures a still image, the transmittance control unit 114 may change the transmittance of the optical device 230 at a first speed, and when the imaging device 100 captures a moving image, at a second speed slower than the first speed.
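The two change speeds can be sketched as a simple linear ramp over control cycles; the update counts below are assumptions for illustration, not values from the patent:

```python
def ramp(current_k, target_k, n_updates):
    """Linearly interpolate transmittance over n_updates control cycles."""
    return [current_k + (target_k - current_k) * i / n_updates
            for i in range(1, n_updates + 1)]

fast = ramp(1.0, 0.25, n_updates=3)    # still image: reach the target almost at once
slow = ramp(1.0, 0.25, n_updates=30)   # moving image: slower ramp avoids visible flicker
print(fast[-1] == slow[-1] == 0.25)    # True - both reach the same target transmittance
```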
However, as the attitude of the UAV 10 changes over time, so does the attitude of the imaging device 100. When the attitude of the imaging device 100 changes, the polarization direction of the light passing through the polarizing filters 234 and 236 changes. If the polarization direction changes, the intensity of light of a specific polarization direction incident on the image sensor 120 may change; for example, the intensity of sunlight reflected off a water surface and incident on the image sensor 120 may vary. That is, the imaging device 100 may become unable to capture images in the style the user intends.
Here, in addition to the polarizing filter 234, the polarizing filter drive unit 232 may also rotate the polarizing filter 236 about the optical axis. In this case, one could consider detecting the attitude of the imaging device 100 and changing the rotation angles of the polarizing filters 234 and 236 according to that attitude. However, when the rotation angles of the polarizing filters 234 and 236 are changed by feedback in this way, it is not always possible to control them appropriately while the attitude of the imaging device 100 changes over time.
Therefore, the transmittance control unit 114 may, by feed-forward, rotate the polarizing filters 234 and 236 by a given rotation angle while maintaining the relative positional relationship between the polarization direction of the polarizing filter 234 and that of the polarizing filter 236.
The transmittance control unit 114 may, while maintaining the relative positional relationship between the polarization directions of the polarizing filters 234 and 236, control the polarizing filter drive unit 232 based on a control command to the gimbal 50 (which rotatably supports the imaging device 100) for rotating the imaging device 100 in a first direction, so that the polarizing filters 234 and 236 rotate in a second direction opposite to the first direction. The transmittance control unit 114 controls the polarizing filter drive unit 232 so that the polarizing filter 234 rotates in the second direction by the amount of the imaging device 100's rotation in the first direction based on the control command. In this way, the transmittance control unit 114 can maintain the transmittance of the optical device 230 while rotating the polarizing filters 234 and 236 in the second direction opposite to the first direction.
FIG. 5 shows the relationship between the number of steps of the drive motor of the polarizing filter drive unit 232 and the polarization plane angle of the polarizing filter 234. The polarization plane angle of the polarizing filter 234 is the angle between the polarization plane of the light passing through the polarizing filter 234 and the horizontal plane of the imaging device 100. The polarization plane is the plane containing the polarization direction. The horizontal plane of the imaging device 100 may be the plane containing the optical axis when the imaging device 100 is in its reference attitude (optical axis facing the horizontal direction).
When the step count increases, the polarizing filters 234 and 236 rotate in the first direction (positive direction). When the drive motor's step count is +N, the rotation angle of the polarizing filters 234 and 236 is +180 degrees; when it is -N, the rotation angle is -180 degrees.
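The linear step-to-angle mapping of FIG. 5 can be sketched as follows. N is the step count corresponding to a half turn; its concrete value is not given in the text, so the value used below is an assumption for illustration only:

```python
def steps_to_angle(steps, n_half_turn):
    """Convert drive-motor steps to the filter rotation angle in degrees."""
    return 180.0 * steps / n_half_turn

N = 600  # assumed motor resolution (steps per 180 degrees), illustrative only
print(steps_to_angle(N, N), steps_to_angle(-N, N), steps_to_angle(-N // 6, N))
# 180.0 -180.0 -30.0
```

The -N/6 case corresponds to the -30 degree counter-rotation used in the FIG. 6 example.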
FIG. 6 shows how the drive motor is driven in response to rotation of the imaging device 100, depicting the imaging device 100 capturing a subject while the UAV 10 is in flight. In step 1, the UAV 10 is flying level with respect to the ground, and the attitude of the imaging device 100 supported by the gimbal 50 is also level with respect to the ground; that is, the plane containing the optical axis of the imaging device 100 is parallel to the ground. In this state, the UAV control unit 30 outputs to the gimbal 50 a control command to rotate the attitude of the imaging device 100 by +30 degrees about the optical axis with respect to the ground. The process thus moves from step 1 to step 2.
In step 2, the imaging control unit 110 outputs to the drive motor, in synchronization with the control command, a rotation drive command that rotates the polarizing filters 234 and 236 by -30 degrees (-N/6 steps) while maintaining the relative positional relationship between their polarization directions. The polarization direction of the light passing through the polarizing filters 234 and 236 can thereby be maintained. That is, while the transmittance of the optical device 230 is maintained, the rotation angles of the polarizing filters 234 and 236 can each be held at a fixed angle with respect to the ground.
Next, in step 3, with the UAV 10 flying level with respect to the ground, the attitude of the imaging device 100 supported by the gimbal 50 is rotated by -30 degrees about the optical axis with respect to the ground.
In step 3, the UAV control unit 30 outputs to the gimbal 50 a control command to rotate the attitude of the imaging device 100, about the optical axis and with respect to the ground, from +30 degrees to -30 degrees. In synchronization with this control command, the imaging control unit 110 outputs to the drive motor a rotation drive command that rotates the polarizing filters 234 and 236 from -30 degrees to +30 degrees (from -N/6 steps to +N/6 steps). The imaging device 100 thus rotates about the optical axis in the second direction (negative direction) at a constant rotation speed. The polarization direction of the light passing through the polarizing filters 234 and 236 can therefore be maintained; that is, while the transmittance of the optical device 230 is maintained, the rotation angles of the polarizing filters 234 and 236 can each be held at a fixed angle with respect to the ground.
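The feed-forward counter-rotation of steps 2 and 3 reduces to negating the commanded camera roll and converting it to motor steps. A minimal sketch (names and the step constant N are assumptions for illustration):

```python
def filter_step_command(gimbal_roll_deg, n_half_turn):
    """Motor steps that cancel a camera roll about the optical axis.

    The filters are driven in the direction opposite to the gimbal command so
    that the polarization plane stays fixed relative to the ground.
    """
    return round(-gimbal_roll_deg / 180.0 * n_half_turn)

N = 600  # assumed steps per 180 degrees, illustrative only
print(filter_step_command(+30.0, N))  # -100 steps, i.e. -N/6 (step 2 in FIG. 6)
print(filter_step_command(-30.0, N))  # +100 steps, i.e. +N/6 (step 3 in FIG. 6)
```

Because the command is derived from the gimbal control signal itself rather than from a measured attitude, no feedback delay enters the loop.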
The gimbal 50 then controls the attitude of the imaging device 100 to be level with respect to the ground. During this time, the imaging control unit 110 drives the drive motor so as to maintain the relative positional relationship between the polarization directions of the polarizing filters 234 and 236 while, for example, keeping the rotation angle of the polarizing filter 234 horizontal (0 degrees) with respect to the ground.
As described above, the imaging control unit 110 rotates the polarizing filters 234 and 236 by feed-forward in accordance with the control command to the gimbal 50. Thus, while the transmittance of the optical device 230 is maintained, the polarization directions of the polarizing filters 234 and 236 can each be held in a fixed direction with respect to the ground. This prevents a change in the attitude of the imaging device 100 from changing the polarization directions of the polarizing filters 234 and 236 and thereby rendering the imaging device 100 unable to capture images in the style the user intends.
FIG. 7 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention, or as one or more "units" of that device, or can cause the computer 1200 to perform that operation or those "units". The program can cause the computer 1200 to execute a process, or a stage of a process, according to an embodiment of the present invention. Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program executed by the computer 1200 at startup and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network. A program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing operations or processing of information according to the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
Furthermore, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as programs, data, tables, and databases may be stored in a recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may perform the various types of processing described throughout this disclosure and specified by the program's instruction sequences, including operations, information processing, condition judgment, conditional branching, unconditional branching, and information search/replacement, and write the results back to the RAM 1214. The CPU 1212 may also search for information in files, databases, and the like in a recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in a recording medium, the CPU 1212 may search the entries for one that matches a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as a computer-readable storage medium, so that a program can be provided to the computer 1200 via the network.
It should be noted that the order of execution of operations, procedures, steps, stages, and other processing in the devices, systems, programs, and methods shown in the claims, description, and drawings may be realized in any order, as long as "before", "prior to", and the like are not explicitly indicated, and as long as the output of a preceding process is not used in a subsequent process. Even where operational flows in the claims, description, and drawings are described using terms such as "first" and "next" for convenience, this does not mean that they must be carried out in that order.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and improvements can be made to the above embodiments. It is apparent from the claims that embodiments incorporating such changes or improvements are included in the technical scope of the present invention.

Claims (11)

  1. 一种基于摄像装置的目标曝光值控制所述摄像装置的光圈值以及曝光时间中的至少一个的控制装置,其特征在于,
    包括构成为基于光学装置的透光率而变更所述目标曝光值的电路,所述光学装置透射入射至所述摄像装置所具备的图像传感器的光。
  2. 根据权利要求1所述的控制装置,其特征在于,所述电路构成如下:
    获取用于变更所述摄像装置的光圈值和曝光时间中的至少一个的指示数据,
    基于所述指示数据和所述目标曝光值来确定第一光圈值和第一曝光时间,
    当所述第一光圈值不在预定光圈值范围内或者所述第一曝光时间不在预定曝光时间范围内时,变更所述光学装置的透射率,
    基于经变更的所述光学装置的透射率来变更所述目标曝光值。
  3. 根据权利要求2所述的控制装置,其特征在于,所述电路构成如下:
    基于所述指示数据和所述目标曝光值来确定所述预定光圈值范围内的第二光圈值和所述预定范围内的第二曝光时间,
    基于变更前的所述目标曝光值与根据所述第二光圈值和所述第二曝光时间的曝光值之差,来变更所述光学装置的透射率。
  4. 根据权利要求1所述的控制装置,其特征在于,
    所述光学装置包括第一偏光滤光镜、与第一偏光滤光镜在光轴方向上重合的第二偏光滤光镜以及使所述第一偏光滤光镜旋转的旋转机构,
    所述电路构成为:通过所述旋转机构调整所述第一偏光滤光镜的透射光的偏振方向与所述第二偏光滤光镜的透射光的偏振方向的相对位置关系,从而变更所述光学装置的透射率。
  5. 根据权利要求4所述的控制装置,其特征在于,
    所述摄像装置可旋转地由支撑机构支撑,
    所述旋转机构进一步使所述第二偏光滤光镜旋转,
    所述电路构成为:基于针对所述支撑机构的用于使所述摄像装置向第一方向旋转的控制指令,在维持所述第一偏光滤光镜的偏振方向与所述第二偏光滤光镜的偏振方向之间的相对位置关系的同时,控制所述旋转机构使所述第一偏光滤光镜和所述第二偏光滤光镜向与所述第一方向相反的第二方向旋转。
  6. 根据权利要求1所述的控制装置,其特征在于,
    所述电路构成为:当所述摄像装置拍摄静态图像时,以第一速度变更所述光学装置的透射率,当所述摄像装置拍摄动态图像时,以比所述第一速度慢的第二速度变更所述光学装置的透射率。
  7. 一种摄像装置,其特征在于,包括:
    根据权利要求1至6中任一项所述的控制装置、
    所述光学装置、以及
    所述图像传感器。
  8. 一种摄像系统,其特征在于,包括:
    根据权利要求7所述的摄像装置;以及
    支撑机构,其可旋转地支撑所述摄像装置。
  9. 一种移动体,其特征在于,
    包括根据权利要求8所述的摄像系统并进行移动。
  10. 一种基于摄像装置的目标曝光值来控制所述摄像装置的光圈值和曝光时间中的至少一个的控制方法,其特征在于,包括:基于光学装置的透光率变更目标曝光值,其中,所述光学装置透射入射至所述摄像装置所具备的图像传感器的光。
  11. A program, characterized in that the program causes a computer to function as the control device according to any one of claims 1 to 6.
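As a non-authoritative illustration of the control logic in claims 2 through 4, the following Python sketch clamps an out-of-range first aperture value and first exposure time to in-range second values, absorbs the resulting exposure-value gap by changing the filter transmittance, and updates the target exposure value. The ranges, function names, transmittance bounds, and the idealized Malus's-law model are all assumptions for illustration, not taken from the specification.

```python
import math

# Illustrative assumptions only; these ranges are not from the patent.
APERTURE_RANGE = (2.8, 11.0)              # assumed predetermined aperture range
EXPOSURE_TIME_RANGE = (1 / 4000, 1 / 30)  # assumed predetermined exposure-time range (s)

def exposure_value(f_number, exposure_time):
    # Standard exposure value: EV = log2(N^2 / t).
    return math.log2(f_number ** 2 / exposure_time)

def clamp(value, lo, hi):
    return min(max(value, lo), hi)

def polarizer_pair_transmittance(theta_deg):
    # Claim 4, idealized: two stacked polarizing filters follow Malus's law,
    # T = cos^2(theta) for the relative angle between their polarization
    # directions (losses of real filters are ignored in this sketch).
    return math.cos(math.radians(theta_deg)) ** 2

def adjust_target_ev(target_ev, first_f, first_t, transmittance):
    # Claims 2-3 sketch: if the first aperture value or first exposure time
    # lies outside its predetermined range, clamp to in-range "second"
    # values, absorb the EV gap by changing the optical device's
    # transmittance, and change the target EV accordingly (claim 1).
    if (APERTURE_RANGE[0] <= first_f <= APERTURE_RANGE[1]
            and EXPOSURE_TIME_RANGE[0] <= first_t <= EXPOSURE_TIME_RANGE[1]):
        return target_ev, transmittance  # in range: nothing to change
    second_f = clamp(first_f, *APERTURE_RANGE)
    second_t = clamp(first_t, *EXPOSURE_TIME_RANGE)
    ev_gap = target_ev - exposure_value(second_f, second_t)
    # One EV step is a factor of two in light, so scale by 2^gap.
    new_transmittance = clamp(transmittance / 2.0 ** ev_gap, 0.01, 1.0)
    new_target_ev = target_ev + math.log2(new_transmittance / transmittance)
    return new_target_ev, new_transmittance
```

When the transmittance bound does not bite, the new target exposure value coincides with the exposure value of the in-range second settings, so aperture and shutter stay within their limits while the filter absorbs the remainder.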
PCT/CN2020/094642 2019-06-28 2020-06-05 Control device, imaging device, imaging system, movable body, control method, and program WO2020259255A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080003326.3A CN112335229A (zh) 2019-06-28 2020-06-05 Control device, imaging device, imaging system, movable body, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019122137A JP6794601B1 (ja) 2019-06-28 2019-06-28 Control device, imaging device, imaging system, movable body, control method, and program
JP2019-122137 2019-06-28

Publications (1)

Publication Number Publication Date
WO2020259255A1 (zh) 2020-12-30

Family

ID=73544680

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/094642 WO2020259255A1 (zh) Control device, imaging device, imaging system, movable body, control method, and program

Country Status (3)

Country Link
JP (1) JP6794601B1 (zh)
CN (1) CN112335229A (zh)
WO (1) WO2020259255A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023102921A1 (zh) * 2021-12-10 2023-06-15 Shenzhen Transsion Holdings Co., Ltd. Camera module, photographing method, intelligent terminal, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100026828A1 (en) * 2008-07-31 2010-02-04 Sony Corporation Optical imaging device, and lens control method and apparatus
CN103595921A (zh) * 2012-08-13 2014-02-19 Olympus Imaging Corp. Imaging device
CN103765274A (zh) * 2011-08-31 2014-04-30 Fujifilm Corporation Lens device and imaging device having the lens device
US20140307124A1 (en) * 2011-12-28 2014-10-16 Fujifilm Corporation Imaging apparatus, control method of imaging apparatus, interchangeable lens and lens-interchangeable type imaging apparatus body
WO2019071543A1 (en) * 2017-10-12 2019-04-18 SZ DJI Technology Co., Ltd. SYSTEMS AND METHODS FOR AUTOMATIC DETECTION AND CORRECTION OF LUMINANCE VARIATIONS ON IMAGES

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5238930A (en) * 1975-09-23 1977-03-25 Fuji Photo Film Co Ltd Automatic exposure control device for three variables camera
JPH0242426A (ja) * 1988-08-02 1990-02-13 Konica Corp Automatic exposure camera
JPH10246903A (ja) * 1997-03-06 1998-09-14 Canon Inc Optical apparatus
JP2004333553A (ja) * 2003-04-30 2004-11-25 Canon Inc Light amount adjusting device, photographing device, and filter
JP2013109008A (ja) * 2011-11-17 2013-06-06 Canon Inc Imaging system
JP2019066563A (ja) * 2017-09-28 2019-04-25 SZ DJI Technology Co., Ltd. Control device, lens device, imaging device, control method, and program



Also Published As

Publication number Publication date
JP2021009204A (ja) 2021-01-28
CN112335229A (zh) 2021-02-05
JP6794601B1 (ja) 2020-12-02

Similar Documents

Publication Publication Date Title
US11070735B2 (en) Photographing device, photographing system, mobile body, control method and program
US20210120171A1 (en) Determination device, movable body, determination method, and program
JP2019110462A (ja) Control device, system, control method, and program
JP2019066563A (ja) Control device, lens device, imaging device, control method, and program
JP2020012878A (ja) Control device, movable body, control method, and program
US20210105411A1 (en) Determination device, photographing system, movable body, composite system, determination method, and program
WO2020259255A1 (zh) Control device, imaging device, imaging system, movable body, control method, and program
US10942331B2 (en) Control apparatus, lens apparatus, photographic apparatus, flying body, and control method
JP6543875B2 (ja) Control device, imaging device, flying body, control method, and program
JP6501091B1 (ja) Control device, imaging device, movable body, control method, and program
JP7048019B2 (ja) Control device, flying body, control method, and program
JP6544542B2 (ja) Control device, imaging device, unmanned aerial vehicle, control method, and program
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
US11265456B2 (en) Control device, photographing device, mobile object, control method, and program for image acquisition
WO2019041678A1 (zh) Control device, imaging device, movable body, control method, and program
WO2020020042A1 (zh) Control device, movable body, control method, and program
JP6641574B1 (ja) Determination device, movable body, determination method, and program
WO2020011198A1 (zh) Control device, movable body, control method, and program
WO2021249245A1 (zh) Device, imaging device, imaging system, and movable body
US20200241570A1 (en) Control device, camera device, flight body, control method and program
CN111213369B (zh) Control device, method, imaging device, movable body, and computer-readable storage medium
WO2020216057A1 (zh) Control device, imaging device, movable body, control method, and program
JP6569157B1 (ja) Control device, imaging device, movable body, control method, and program
WO2018163298A1 (ja) Control device, lens device, imaging device, imaging system, movable body, control method, and program
JP2020052220A (ja) Control device, imaging device, movable body, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20831515

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20831515

Country of ref document: EP

Kind code of ref document: A1