WO2021017914A1 - Control device, imaging device, mobile body, control method, and program - Google Patents

Control device, imaging device, mobile body, control method, and program

Info

Publication number
WO2021017914A1
Authority: WO (WIPO/PCT)
Prior art keywords: imaging device, upper limit, exposure time, value, imaging
Application number: PCT/CN2020/102917
Other languages: English (en), French (fr)
Inventors: 家富邦彦, 周剑斌, 陈喆君
Original Assignee: 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080003377.6A (published as CN112335230A)
Publication of WO2021017914A1
Priority to US17/524,623 (published as US20220141371A1)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/62: Control of parameters via user interfaces
    • H04N 23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Definitions

  • the invention relates to a control device, an imaging device, a mobile body, a control method, and a program.
  • Patent Document 1 describes a technique of determining the point at which the imaging sensitivity is changed in a predetermined program graph based on the upper limit or lower limit of the imaging sensitivity that has been set.
  • the control device may include a circuit configured to set an upper limit of the exposure time.
  • the circuit may be configured to determine the exposure time of the imaging device within a range below the upper limit value based on the exposure control value of the imaging device.
  • the circuit can be configured to set the upper limit value that has been input.
  • the circuit may be configured to display information for inputting the upper limit value and an exposure time determined based on the current exposure control value of the imaging device within a range below the upper limit value on the display device.
  • the circuit may be configured to update the exposure time displayed on the display device when the imaging device is operating after setting the upper limit value.
  • The circuit may be configured such that, when the upper limit value is not set, the exposure time is determined according to the exposure control value within a range of a preset value or less, the preset value being longer than the maximum upper limit value that can be input.
  • the circuit may be configured to generate a program graph showing the relationship between the exposure value, imaging sensitivity, and exposure time in a range below the upper limit value when the upper limit value is set.
  • the circuit can be configured to determine the exposure time and imaging sensitivity according to the exposure control value and the program graph.
  • the circuit may be configured to determine the exposure time when the imaging sensitivity is fixed to a preset value within a range below the upper limit value according to the exposure control value.
  • The circuit may be configured such that, when an exposure time with the imaging sensitivity fixed to the preset value cannot be determined within the range below the upper limit value, the exposure time is determined to be the upper limit value, and the imaging sensitivity to be used with the exposure time fixed at the upper limit value is determined according to the exposure control value.
  • the imaging device may be an imaging device with a fixed aperture.
  • the imaging device may include a first imaging device that performs imaging with light in a first wavelength region and a second imaging device that performs imaging with light in a second wavelength region.
  • the circuit may be configured to determine the exposure time of the first imaging device and the exposure time of the second imaging device within a range below the upper limit value.
  • the upper limit value may include a first upper limit value of the exposure time of light in the first wavelength region and a second upper limit value of the exposure time of light in the second wavelength region.
  • the circuit may be configured to determine the exposure time of the first imaging device within the range of the first upper limit value according to the exposure control value corresponding to the light in the first wavelength region.
  • the circuit may be configured to determine the exposure time of the second imaging device within the range of the second upper limit value according to the exposure control value corresponding to the light in the second wavelength region.
  • the circuit may be configured to set the upper limit value according to the moving speed of the imaging device.
  • the circuit may be configured to set the upper limit value according to the speed of change of the direction of the imaging device.
  • the circuit may be configured such that when the aperture of the imaging device is fixed, the exposure time is determined according to the exposure control value in a range below the upper limit value.
  • the circuit can be configured to determine the F value and the exposure time according to the exposure control value when the aperture of the imaging device is not fixed.
  • the imaging device may include the above-mentioned control device.
  • the moving body according to an aspect of the present invention may be a moving body that includes the aforementioned imaging device and moves.
  • the control method involved in one aspect of the present invention may include the following steps: setting an upper limit of the exposure time.
  • the control method may include the steps of determining the exposure time of the imaging device within a range below the upper limit value according to the exposure control value of the imaging device.
  • the program involved in one aspect of the present invention can make the computer execute the following steps: setting the upper limit of the exposure time.
  • the program may cause the computer to execute the following steps: according to the exposure control value of the imaging device, determine the exposure time of the imaging device within the range below the upper limit value.
  • the upper limit value of the exposure time can be set, so that the extreme increase of the exposure time can be suppressed.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • FIG. 2 is a diagram showing an example of the appearance of the imaging device 100 mounted on the UAV 10.
  • FIG. 3 shows an example of the functional blocks of UAV10.
  • FIG. 4 shows an example of functional blocks of the imaging device 100.
  • FIG. 5 shows the range of the program graph generated when the user has set the upper limit of the exposure time.
  • FIG. 6 shows the range of the program graph generated when the user has set the upper limit of the exposure time.
  • FIG. 7 shows a screen for the user to set the upper limit value of the exposure time.
  • FIG. 8 shows another example of a screen for the user to set the upper limit value of the exposure time.
  • FIG. 9 is a flowchart showing an example of the execution procedure of the imaging control unit 182.
  • FIG. 10 is a diagram showing another example of the appearance of the imaging device 100 mounted on the UAV 10.
  • FIG. 11 is a diagram showing an example of the hardware configuration.
  • The blocks may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device responsible for performing an operation. Specific stages and "parts" may be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • The programmable circuit may include a reconfigurable hardware circuit.
  • Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium having instructions stored thereon includes a product including instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like may be included.
  • More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the source code or object code includes traditional procedural programming languages.
  • Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, or state-setting data, alongside object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • The computer-readable instructions may be provided locally, or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • The UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100.
  • The gimbal 50 and the imaging device 100 are an example of an imaging system.
  • The UAV 10 is an example of a moving body. The moving body is a concept that includes flying objects moving in the air, vehicles moving on the ground, ships moving on the water, and the like. The concept of a flying object moving in the air includes not only the UAV but also other aircraft, airships, helicopters, and the like that move in the air.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without rotors.
  • The imaging device 100 is a multispectral camera for imaging that captures an object within a desired imaging range in each of a plurality of wavelength bands.
  • The gimbal 50 rotatably supports the imaging device 100.
  • The gimbal 50 is an example of a supporting mechanism.
  • The gimbal 50 can support the imaging device 100 rotatably around the pitch axis using an actuator.
  • The gimbal 50 can further support the imaging device 100 rotatably around the roll axis and the yaw axis, respectively, using actuators.
  • The gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • Two imaging devices 60 can be installed on the nose, that is, the front side, of the UAV 10.
  • The other two imaging devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the imaging device 60 can detect the existence of an object included in the imaging range of the imaging device 60 and measure the distance to the object.
  • the imaging device 60 is an example of a measuring device that measures an object existing in the imaging direction of the imaging device 100.
  • the measuring device may be another sensor such as an infrared sensor or an ultrasonic sensor that measures an object existing in the imaging direction of the imaging device 100.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 may include at least one imaging device 60.
  • The UAV 10 may include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can wirelessly communicate with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may indicate the height at which the UAV10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • The instruction information may include an ascend command to raise the UAV 10. The UAV 10 ascends while it is receiving the ascend command. When the height of the UAV 10 has reached its upper limit, the ascent of the UAV 10 can be restricted even if the ascend command is accepted.
  • the remote operation device 300 includes a display device 302.
  • the display device 302 displays an image captured by the imaging device 100.
  • the display device 302 also functions as an input device that receives information input by the user to remotely operate the UAV 10.
  • the display device 302 receives the setting information of the imaging device 100.
  • the remote operation device 300 transmits instruction information indicating various instructions related to the operation of the imaging device 100 to the UAV 10 based on the setting information received from the user.
  • FIG. 2 is a diagram showing an example of the appearance of the imaging device 100 mounted on the UAV 10.
  • the imaging device 100 is a multispectral camera that captures image data of each of a plurality of preset wavelength bands.
  • the imaging device 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR.
  • the imaging device 100 can record respective image data captured by the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR as a multispectral image.
  • multispectral images can be used to predict the health and vitality of crops.
  • FIG. 3 shows an example of the functional blocks of UAV10.
  • The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42 (referred to as the IMU 42), a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 can receive instruction information including various instructions for the UAV control unit 30 from the remote operation device 300.
  • The memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the IMU 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • The memory 32 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • The memory 32 may be provided inside the UAV main body 20, and may be configured to be detachable from the UAV main body 20.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with the program stored in the memory 32.
  • The UAV control unit 30 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 propels the UAV10.
  • the propulsion part 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to cause the UAV 10 to fly.
  • the GPS receiver 41 receives a plurality of signals indicating time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received signals.
  • the IMU42 detects the posture of the UAV10.
  • the IMU 42 detects the acceleration of the UAV 10 in the three-axis directions of front and rear, left and right, and up and down, and the angular velocities of the pitch axis, the roll axis, and the yaw axis as the attitude of the UAV 10.
  • The magnetic compass 43 detects the heading of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying altitude of the UAV10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure to altitude to detect the altitude.
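The conversion itself is not specified in the text; a common choice is the international standard atmosphere (ISA) barometric formula, sketched here with the standard sea-level constants as assumptions.

```python
def pressure_to_altitude(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Convert barometric pressure (hPa) to altitude (m) via the ISA formula."""
    # h = 44330 * (1 - (p / p0)^(1 / 5.255)), valid within the troposphere
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```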
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • FIG. 4 shows an example of functional blocks of the imaging device 100.
  • the imaging device 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR.
  • the imaging device 100 includes a processor 180, a transmission unit 190, and a memory 192.
  • the imaging device 110 for R includes an image sensor 112 for R and an optical system 114.
  • the image sensor 112 for R captures an image formed by the optical system 114.
  • the R image sensor 112 includes a filter that transmits light in the red region, and outputs an R image signal that is an image signal in the red region.
  • the wavelength band of the red region is 620 nm to 750 nm.
  • the wavelength band of the red region may be a specific wavelength band in the red region, for example, it may be 663 nm to 673 nm.
  • the imaging device 120 for G includes an image sensor 122 for G and an optical system 124.
  • the image sensor 122 for G captures an image formed by the optical system 124.
  • the G image sensor 122 includes a filter that transmits light in the green region, and outputs a G image signal that is an image signal in the green region.
  • the wavelength band of the green region is 500 nm to 570 nm.
  • the wavelength band of the green region may be a specific wavelength band in the green region, for example, it may be 550 nm to 570 nm.
  • the imaging device 130 for B includes an image sensor 132 for B and an optical system 134.
  • the image sensor 132 for B captures an image formed by the optical system 134.
  • the image sensor for B 132 includes a filter that transmits light in the blue region, and outputs a B image signal that is an image signal in the blue region.
  • the wavelength band of the blue region is 450 nm to 500 nm.
  • the wavelength band of the blue region may be a specific wavelength band in the blue region, for example, it may be 465 nm to 485 nm.
  • the imaging device 140 for RE includes an image sensor 142 for RE and an optical system 144.
  • the image sensor 142 for RE captures an image formed by the optical system 144.
  • the RE image sensor 142 includes a filter that transmits light in the red edge region, and outputs an RE image signal that is an image signal in the red edge region.
  • the wavelength band of the red edge region is 705 nm to 745 nm.
  • the wavelength band of the red edge region may be 712 nm to 722 nm.
  • the NIR imaging device 150 includes an NIR image sensor 152 and an optical system 154.
  • the image sensor 152 for NIR captures the image formed by the optical system 154.
  • the NIR image sensor 152 includes a filter that transmits light in the near-infrared region, and outputs an image signal in the near-infrared region, that is, an NIR image signal.
  • the wavelength band of the near infrared region is 800 nm to 2500 nm.
  • the wavelength band of the near infrared region may be 800 nm to 900 nm.
  • the processor 180 includes a multiplexer 170, an input receiving unit 172, a demosaicing processing unit 174, and a recording processing unit 178.
  • the processor 180 is an example of a circuit.
  • The processor 180 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the multiplexer 170 receives the image signal output from each image sensor, selects the image signal output from any image sensor according to a preset condition, and inputs it to the input receiving unit 172.
  • the demosaic processing unit 174 generates display image data based on the R image signal, G image signal, and B image signal input to the input receiving unit 172.
  • the demosaic processing unit 174 generates display image data by performing demosaic processing on the R image signal, the G image signal, and the B image signal.
  • The demosaic processing unit 174 can perform thinning-out processing on the R image signal, the G image signal, and the B image signal, and convert the thinned-out R, G, and B image signals into Bayer-array image signals to generate the display image data.
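As an illustrative interpretation only (the internal processing of the demosaic processing unit 174 is not published), packing three separately captured, thinned planes into a Bayer-array signal could look like this; the RGGB layout is an assumption.

```python
import numpy as np

def to_bayer(r: np.ndarray, g: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pack thinned R/G/B planes of equal shape into one RGGB Bayer mosaic."""
    h, w = r.shape
    bayer = np.empty((h, w), dtype=r.dtype)
    bayer[0::2, 0::2] = r[0::2, 0::2]  # R at even rows, even columns
    bayer[0::2, 1::2] = g[0::2, 1::2]  # G at even rows, odd columns
    bayer[1::2, 0::2] = g[1::2, 0::2]  # G at odd rows, even columns
    bayer[1::2, 1::2] = b[1::2, 1::2]  # B at odd rows, odd columns
    return bayer
```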
  • the transmitting unit 190 transmits the image data for display to the display device.
  • the transmitting unit 190 may transmit the image data for display to the remote operation device 300.
  • the remote operation device 300 can display the image data for display on the display device 302 as a live view image.
  • the recording processing unit 178 generates recording image data in accordance with a preset recording format based on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal input to the input receiving unit 172.
  • the recording processing unit 178 can generate RAW data as recording image data from the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal according to the RAW format.
  • the recording processing unit 178 may generate all-pixel recording image data without performing thinning-out processing on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal.
  • the recording processing unit 178 may store the image data for recording in the memory 192.
  • The memory 192 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 192 may be provided inside the housing of the imaging device 100.
  • the memory 192 may be configured to be detachable from the housing of the imaging device 100.
  • the processor 180 also includes a receiving unit 184 and a switching unit 186.
  • the receiving unit 184 receives a storage instruction to store the image data for recording in the memory 192.
  • the receiving unit 184 may receive a storage instruction from the user through an external terminal such as the remote operation device 300.
  • the receiving unit 184 may receive the storage instruction from the UAV control unit 30.
  • When the UAV control unit 30 determines that the position of the imaging device 100 is a predetermined position, the receiving unit 184 may receive the storage instruction from the UAV control unit 30.
  • the imaging device 100 may include a GPS receiver.
  • the processor 180 can determine whether the position of the camera 100 is a preset position according to the position information from its own GPS receiver.
  • The switching unit 186 switches between the following two modes: one in which the demosaic processing unit 174 generates display image data based on the R, G, and B image signals input to the input receiving unit 172, and one in which the recording processing unit 178 generates recording image data in a preset recording format based on the R, G, B, RE, and NIR image signals input to the input receiving unit 172.
  • the exposure control in the imaging device 100 will be described.
  • the imaging control unit 182 sets the upper limit value of the exposure time (the charge accumulation time of the image sensor).
  • the imaging control unit 182 determines the exposure time of the imaging device 100 within the range below the upper limit value based on the exposure control value of the imaging device 100.
  • the imaging control unit 182 sets the input upper limit value.
  • The imaging control unit 182 displays, on the display device 302, information for inputting the upper limit value and the exposure time determined within the range below the upper limit value based on the current exposure control value of the imaging device 100. After the upper limit value is set, the imaging control unit 182 updates the exposure time displayed on the display device 302 while the imaging device 100 is operating.
  • When the upper limit value is not set, the imaging control unit 182 determines the exposure time, based on the exposure control value, within a range of a preset value or less, the preset value being longer than the maximum upper limit value that can be input.
  • When the upper limit value has been set, the imaging control unit 182 generates a program graph within the range below the upper limit value.
  • the program graph shows the relationship between exposure value, imaging sensitivity and exposure time.
  • the imaging control unit 182 determines the exposure time and imaging sensitivity based on the exposure control value and the program graph.
  • The imaging control unit 182 determines, based on the exposure control value, the exposure time with the imaging sensitivity fixed to a preset value within the range below the upper limit value. When such an exposure time cannot be determined within that range, the imaging control unit 182 determines the exposure time to be the upper limit value and determines, based on the exposure control value, the imaging sensitivity to be used with the exposure time fixed at the upper limit value. For example, when the imaging control unit 182 determines that underexposure would occur with the imaging sensitivity fixed to the preset value and the exposure time set to the upper limit value, it determines the exposure time to be the upper limit value and also increases the imaging sensitivity based on the exposure control value.
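To make this fallback concrete, here is a minimal sketch of the decision logic. It is an illustration, not the patent's implementation: the function name, the use of the standard exposure relation t = N² × 100 / (2^EV × ISO) referenced to ISO 100, and the constants (F1.4, ISO 100 to 6400, shortest time 1/8000 second) are assumptions drawn from the example values given later in this section.

```python
BASE_ISO = 100        # preset ("fixed") imaging sensitivity
MIN_TIME = 1 / 8000   # shortest supported exposure time, in seconds
MAX_ISO = 6400        # upper end of the ISO range in the example

def decide_exposure(ev: float, upper_limit: float, f_number: float = 1.4):
    """Return (exposure_time, iso) for a metered EV, honoring the upper limit."""
    # First try: keep the imaging sensitivity fixed at the preset value.
    t = f_number ** 2 * 100 / (2 ** ev * BASE_ISO)
    if t <= upper_limit:
        return max(t, MIN_TIME), BASE_ISO
    # Otherwise the required time exceeds the upper limit (underexposure), so
    # fix the time at the upper limit and raise the sensitivity instead.
    iso = f_number ** 2 * 100 / (2 ** ev * upper_limit)
    return upper_limit, min(iso, MAX_ISO)
```

For example, with the 1/250-second upper limit used in the later figures, decide_exposure(5.0, 1/250) pins the exposure time at 1/250 second and raises the sensitivity to roughly ISO 1530, rather than letting the exposure time grow to about 1/16 second at ISO 100.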
  • the imaging device 100 may include a first imaging device that performs imaging with light in a first wavelength region and a second imaging device that performs imaging with light in a second wavelength region.
  • the imaging control unit 182 determines the exposure time of the first imaging device and the exposure time of the second imaging device within the range below the upper limit value.
  • the upper limit value includes the first upper limit value of the exposure time of light in the first wavelength region and the second upper limit value of the exposure time of light in the second wavelength region.
  • the imaging control unit 182 determines the exposure time of the first imaging device within the range of the first upper limit based on the exposure control value corresponding to the light in the first wavelength region.
  • the imaging control unit 182 determines the exposure time of the second imaging device within the range of the second upper limit based on the exposure control value corresponding to the light in the second wavelength region.
  • The first imaging device may be one of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, and the second imaging device may be another of these imaging devices.
  • the imaging device 100 may be an imaging device with a fixed aperture.
  • the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR may each be an imaging device with a fixed aperture.
  • the imaging control unit 182 may set the upper limit value according to the moving speed of the imaging device 100. For example, the imaging control unit 182 may set the upper limit value according to the moving speed of the UAV 10. The imaging control unit 182 may set the upper limit value according to the relative speed between the UAV 10 and the subject.
  • the imaging control unit 182 may set the upper limit value according to the speed of change of the direction of the imaging device 100.
  • The imaging control unit 182 may set the upper limit value according to the rotation speed of the imaging device 100 by the gimbal 50.
  • the imaging control unit 182 may determine the exposure time according to the exposure control value in a range below the upper limit value when the aperture of the imaging device 100 is fixed.
  • the imaging control unit 182 may determine the F value and the exposure time based on the exposure control value when the aperture of the imaging device 100 is not fixed.
  • the term "when the aperture of the imaging device 100 is fixed” includes, for example, the following cases: the imaging device 100 is an imaging device with a variable aperture, and the imaging mode is set to the aperture priority mode.
  • the term "when the aperture of the imaging device 100 is fixed” includes, for example, the case where the imaging device 100 is an imaging device having a variable aperture, and underexposure occurs when the F value is set to the minimum value.
  • the term "when the aperture of the imaging device 100 is fixed” includes, for example, the case where the imaging device 100 does not have an aperture. Even if there is no aperture in the imaging device 100, it is equivalent to a fixed aperture.
  • the imaging device 110 for R has a function of changing the exposure time in the range of 1/8000 second to 1/8 second.
  • the imaging device 110 for R has a function of changing the ISO sensitivity in the range of 100 to 6,400.
  • Figure 5 shows the case where the F value is 1.4.
  • Fig. 6 shows the case where the F value is 2.0.
  • The shortest exposure time is not limited to 1/8000 second; for example, it may be 1/20000 second.
  • the horizontal axis of FIGS. 5 and 6 represents the exposure time, and the vertical axis represents the imaging sensitivity.
  • the range 500 shown in FIGS. 5 and 6 represents the range of the program graph generated by the imaging control unit 182 when the upper limit of the exposure time is set to 1/250 second.
  • the imaging control unit 182 determines an exposure control value based on an image obtained by the imaging device 110 for R. For example, the imaging control unit 182 determines the EV value (exposure value) based on the brightness information of the image obtained by the imaging device 110 for R.
  • the imaging control unit 182 determines the exposure time and imaging sensitivity based on the program graph and the EV value in the range 500 generated based on the upper limit of the exposure time. As described above, the imaging control unit 182 does not set the exposure time to be greater than 1/250 second in the imaging device 110 for R. That is, the imaging control unit 182 prohibits the imaging device 110 for R from being exposed for an exposure time longer than 1/250 second.
  • the maximum value of the upper limit value of the exposure time settable by the user in the imaging device 110 for R may be less than 1/8 second.
  • the maximum value of the upper limit value of the exposure time that can be set by the user may be 1/15 second.
  • When the user has not set an upper limit value of the exposure time, the imaging control unit 182 determines the exposure time within the range of 1/8 second or less, which is the maximum exposure time that can be set in the imaging device 110 for R. That is, when the user does not set the upper limit value of the exposure time, the imaging control unit 182 may determine an exposure time longer than 1/15 second.
  • The imaging control unit 182 sets the exposure time and imaging sensitivity of the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR in the same way as it sets those of the imaging device 110 for R. That is, the imaging control unit 182 generates, for each of these imaging devices, a program graph within the range below the set upper limit value of the exposure time. The imaging control unit 182 then determines the EV value (exposure value) from the brightness information of the images obtained by the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, and determines the exposure time and imaging sensitivity of each imaging device from the program graph and the EV value.
  • The imaging control unit 182 may determine the EV value of each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR based on the brightness information of a specific area in the images obtained by these imaging devices.
  • the specific area may be, for example, an area including the same subject.
  • The imaging control unit 182 can calculate a vegetation index for each pixel based on the image information from the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, and determine an image area in which the vegetation index satisfies a predetermined condition as the specific area.
  • Examples of the vegetation index include the Normalized Difference Vegetation Index (NDVI), the Soil Adjusted Vegetation Index (SAVI), the Green Normalized Difference Vegetation Index (gNDVI), and the Normalized Difference Red Edge Index (NDRE).
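For reference, these index definitions are standard in remote sensing and are not spelled out in the text; two of them, computed per pixel from the band images, look like this (the numpy-based helpers are illustrative):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - R) / (NIR + R); values near 1 indicate dense vegetation."""
    return (nir - red) / (nir + red + 1e-12)

def ndre(nir: np.ndarray, red_edge: np.ndarray) -> np.ndarray:
    """NDRE = (NIR - RE) / (NIR + RE), computed with the red edge band."""
    return (nir - red_edge) / (nir + red_edge + 1e-12)
```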
  • The imaging control unit 182 can fix the imaging sensitivity to a preset value (for example, ISO 100) and, from the EV value and that imaging sensitivity, determine the exposure time that gives a proper exposure within the range below the upper limit value set by the user.
  • When a proper exposure cannot be obtained in this way, the imaging control unit 182 determines the exposure time to be the upper limit value set by the user and determines, from the EV value and that exposure time, the imaging sensitivity that gives a proper exposure. When shooting in a dark environment, underexposure becomes stronger; even in this case, the imaging control unit 182 does not change the exposure time beyond the upper limit value that has been set.
  • Instead, the imaging control unit 182 raises the gain applied to the image sensors of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR to increase the exposure.
  • The imaging control unit 182 raises the gain only when the exposure cannot be adjusted without doing so. This makes it possible to prevent deterioration of the image quality acquired by the image sensors.
  • ISO sensitivity is the sensitivity to specific light. Therefore, adjusting the gain may be equivalent to adjusting the ISO sensitivity.
  • FIG. 7 shows a screen 700 for the user to set the upper limit value of the exposure time.
  • the screen 700 is displayed on the display device 302.
  • the screen 700 is a screen when the respective imaging devices of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR are collectively controlled.
  • the screen 700 includes a display area 710 of exposure conditions and a preset area 712 for setting the upper limit of the exposure time.
  • the display area 710 displays the aperture value (Iris), imaging sensitivity (ISO), and exposure time (shutter speed) of the imaging device 100.
  • the preset area 712 displays a slide bar 720 that accepts the change of the upper limit of the exposure time.
  • the slide bar 720 is an example of information for the user to input the upper limit value of the exposure time.
  • the user changes the upper limit of the exposure time by moving the position of the button (mark) 730 of the slide bar 720.
  • the screen 700 shows a state where the upper limit of the exposure time is set to 1/250 second.
  • the remote operation device 300 transmits instruction information indicating the upper limit of the exposure time specified by the user to the UAV 10.
  • the imaging control unit 182 determines the exposure time and imaging sensitivity below the upper limit of the exposure time indicated by the instruction information received by the UAV control unit 30.
  • the transmitting unit 190 transmits the exposure conditions including the aperture value of the imaging device 100, the determined exposure time, and the imaging sensitivity to the remote operation device 300.
  • the remote operation device 300 updates the screen 700 according to the received exposure conditions.
  • The remote operation device 300 can thus present to the user the current exposure conditions of the imaging device 100 together with the slide bar 720.
  • While the imaging device 100 is operating, the exposure time and the imaging sensitivity can change. For example, the imaging control unit 182 may set the exposure time to 1/1000 second and change the imaging sensitivity (ISO sensitivity) to 200 to obtain a proper exposure.
  • the shutter speed (exposure time) displayed in the display area 710 can be updated in real time.
  • For example, the shutter speed displayed in the display area 710 may become 1/2000 second or 1/10000 second depending on the exposure.
  • the gain is maintained at, for example, 1.0.
  • The gain information can be displayed in the display area 710 together with the ISO sensitivity information.
  • Instead of the gain, the ISO sensitivity may be adjusted in the same way as the gain. That is, with ISO 500 set as the predetermined value, when underexposure becomes stronger, the sensitivity may only then be raised above ISO 500, for example to ISO 800.
  • the ISO sensitivity, gain, and shutter speed can be updated in real time (sequentially) when the imaging device 100 is shooting (working). In this way, the user can grasp the shooting situation in real time.
  • The shutter speed display in the display area 710 may be replaced by the slide bar 720. In this case, once the upper limit value of the shutter speed is set, the displayed shutter speed is thereafter updated in real time.
  • FIG. 8 shows another example of a screen for the user to set the upper limit value of the exposure time.
  • the screen 800 is displayed on the display device 302.
  • the screen 800 is a screen for controlling the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, respectively.
  • the screen 800 includes a display area 810 of exposure conditions and a preset area 812 for setting the upper limit of the exposure time.
  • the display area 810 displays the aperture value (Iris), imaging sensitivity (ISO), and exposure time of each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR ( Shutter speed).
  • The preset area 812 displays slide bars 821, 822, 823, 824, and 825.
  • The slide bars 821, 822, 823, 824, and 825 are examples of information for receiving changes to the upper limit value of the exposure time of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, respectively.
  • the user changes the upper limit of the exposure time of the imaging device 110 for R by moving the position of the button 831 of the slide bar 821.
  • Similarly, the user can change the upper limit value of the exposure time of the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR individually by moving the positions of the respective buttons of the slide bars 822 to 825.
  • FIG. 9 is a flowchart showing an example of the execution procedure of the imaging control unit 182.
  • the imaging control unit 182 determines the type of event related to the exposure condition.
  • The events include a setting event of the upper limit value of the exposure time, a shooting event, and a shooting end event.
  • the setting event of the upper limit value of the exposure time occurs when the user uses the remote operation device 300 to change the upper limit value of the exposure time.
  • the photographing event occurs when the user uses the remote operation device 300 to instruct photographing.
  • the shooting end event occurs when the user uses the remote operation device 300 to instruct the end of shooting.
  • When a setting event of the upper limit value of the exposure time occurs, in S904 the imaging control unit 182 generates a program graph within the range of the upper limit value of the exposure time that has been set.
  • The imaging control unit 182 can generate a program graph for each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • The imaging control unit 182 determines the target EV value based on the current EV value and the brightness information of the image captured by the imaging device 100. Specifically, the imaging control unit 182 can determine the EV value of each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • Then, in S914, the exposure time and imaging sensitivity are determined. The imaging control unit 182 can determine the exposure time and imaging sensitivity of each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • The imaging control unit 182 causes the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR to capture images based on the exposure time and imaging sensitivity determined in S914.
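The flow of FIG. 9 can be summarized by the following sketch. The event names, the dictionary layout, and the reuse of the decide_exposure helper sketched earlier are assumptions for illustration; the patent describes the flow only at the flowchart level.

```python
def imaging_control_loop(events, cameras):
    """Simplified rendering of the Fig. 9 procedure of imaging control unit 182."""
    for event in events:
        if event["kind"] == "set_upper_limit":       # upper-limit setting event
            # S904: regenerate the program graph range for the chosen camera.
            cameras[event["camera"]]["upper_limit"] = event["value"]
        elif event["kind"] == "shoot":               # shooting event
            for name, cam in cameras.items():
                ev = cam["metered_ev"]               # from image brightness
                # S914: pick time and ISO within this camera's upper limit.
                t, iso = decide_exposure(ev, cam["upper_limit"])
                print(f"{name}: capture with t={t:.5f}s, ISO={iso:.0f}")
        elif event["kind"] == "end":                 # shooting end event
            break
```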
  • Since the upper limit value of the exposure time can be set in the imaging device 100, an extremely long exposure time under automatic exposure control can be suppressed. As a result, image blur and extreme overexposure can be suppressed. For example, even when crops on the ground are shot while the UAV 10 is flying at high speed, the exposure time is kept from becoming long, so image blur of the crops can be reduced and the health of the crops can be analyzed accurately.
  • In the imaging device 100, the upper limit value of the exposure time can be set for each wavelength region of the subject light. For example, between fields where crops have been harvested and fields where they have not, the intensity of the R component and the intensity of the G component differ greatly. Even in such a case, setting an upper limit value of the exposure time for each wavelength region of the subject light makes it possible to suppress shooting in an overexposed state.
  • the exposure processing described corresponding to the above-mentioned embodiment is applicable to an imaging device with a fixed aperture that does not have a variable aperture.
  • the imaging device 100 can be constructed at low cost.
  • the exposure processing described corresponding to the above-mentioned embodiment can also be applied to a single imaging device.
  • The imaging control unit 182 may also set the upper limit value of the exposure time according to the moving speed of the imaging device 100; for example, the higher the moving speed of the imaging device 100, the smaller the value the imaging control unit 182 may determine as the upper limit value of the exposure time. In addition, the imaging control unit 182 may set the upper limit value according to the relative speed between the UAV 10 and the subject, or according to the speed at which the direction of the imaging device 100 changes. The imaging control unit 182 may set the upper limit value according to the rotation speed of the imaging device 100 by the gimbal 50; the higher that rotation speed, the smaller the value the imaging control unit 182 can determine as the upper limit value of the exposure time.
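The text states only the monotonic relationship (the faster the motion, the smaller the upper limit value). A hypothetical mapping consistent with that statement might look like the following; the blur budget and the angular criterion are invented constants, not values from the patent.

```python
def upper_limit_from_motion(ground_speed_mps: float,
                            angular_speed_dps: float,
                            blur_budget_m: float = 0.02) -> float:
    """Cap the exposure time so that motion blur stays within a rough budget."""
    candidates = [1 / 15]  # longest cap when the UAV is nearly stationary
    if ground_speed_mps > 0:
        candidates.append(blur_budget_m / ground_speed_mps)
    if angular_speed_dps > 0:
        # crude criterion: allow about 0.1 degree of gimbal rotation per frame
        candidates.append(0.1 / angular_speed_dps)
    return max(min(candidates), 1 / 8000)  # clamp to the supported range
```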
  • FIG. 10 is a diagram showing another example of the appearance of the imaging device 100 mounted on the UAV 10.
  • The imaging device 100 includes the imaging device 160 for RGB in addition to the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR. In this respect it differs from the imaging device 100 shown in FIG. 2.
  • the RGB imaging device 160 may be the same as a normal camera and includes an optical system and an image sensor.
  • the image sensor may include a filter that is configured by a Bayer array and transmits light in the red region, a filter that transmits light in the green region, and a filter that transmits light in the blue region.
  • the RGB imaging device 160 can output RGB images.
  • the wavelength band of the red region may be 620 nm to 750 nm.
  • the wavelength band of the green region may be 500 nm to 570 nm.
  • the wavelength band of the blue region is 450 nm to 500 nm.
  • As with the exposure control of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, the imaging control unit 182 generates for the imaging device 160 for RGB a program graph covering the range below the upper limit value of the exposure time, and determines the exposure time and imaging sensitivity from the program graph and the EV value.
  • FIG. 11 shows an example of a computer 1200 that may fully or partially embody aspects of the present invention.
  • A program installed on the computer 1200 can cause the computer 1200 to function as operations associated with the device according to the embodiment of the present invention or as one or more "parts" of that device, or can cause the computer 1200 to execute those operations or those "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • The computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • The program is provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200, and causes cooperation between the programs and the various types of hardware resources described above.
  • the apparatus or method may be constituted by implementing operations or processing of information according to the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, sends the read transmission data to the network, and writes reception data received from the network into a reception buffer or the like provided in the recording medium.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • The CPU 1212 can perform various types of operations, information processing, conditional judgments, conditional branches, unconditional branches, information retrieval and replacement, and other processing specified by the instruction sequences of the programs described throughout this disclosure, and write the results back to the RAM 1214.
  • The CPU 1212 can retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 can retrieve, from the plurality of entries, an entry matching the condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

Due to automatic exposure control, the exposure time may become extremely long. A control device includes a circuit configured to set an upper limit value of the exposure time. The circuit is configured to determine the exposure time of an imaging device within a range below the upper limit value based on an exposure control value of the imaging device. A control method includes a step of setting an upper limit value of the exposure time, and a step of determining the exposure time of the imaging device within the range below the upper limit value based on the exposure control value of the imaging device.

Description

Control device, imaging device, mobile body, control method, and program
[Technical Field]
The present invention relates to a control device, an imaging device, a mobile body, a control method, and a program.
[Background Art]
Patent Document 1 describes a technique of determining, in a predetermined program graph, the point at which the imaging sensitivity is changed, based on a set upper limit value or lower limit value of the imaging sensitivity.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Unexamined Patent Application Publication No. 2012-19427
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
For example, when shooting a relatively dark subject, the exposure time may become extremely long due to automatic exposure control.
[Technical Means for Solving the Problem]
A control device according to one aspect of the present invention may include a circuit configured to set an upper limit value of an exposure time. The circuit may be configured to determine the exposure time of an imaging device within a range below the upper limit value based on an exposure control value of the imaging device.
The circuit may be configured to set an upper limit value that has been input.
The circuit may be configured to display, on a display device, information for inputting the upper limit value and the exposure time determined within the range below the upper limit value based on the current exposure control value of the imaging device.
The circuit may be configured to update the exposure time displayed on the display device while the imaging device is operating after the upper limit value is set.
The circuit may be configured to, when no upper limit value is set, determine the exposure time based on the exposure control value within a range of a preset value or less, the preset value being longer than the maximum upper limit value that can be input.
The circuit may be configured to, when an upper limit value is set, generate a program graph representing the relationship among the exposure value, the imaging sensitivity, and the exposure time within the range below the upper limit value. The circuit may be configured to determine the exposure time and the imaging sensitivity based on the exposure control value and the program graph.
The circuit may be configured to determine, based on the exposure control value, the exposure time with the imaging sensitivity fixed to a preset value within the range below the upper limit value. The circuit may be configured to, when such an exposure time cannot be determined within the range below the upper limit value, determine the exposure time to be the upper limit value and determine, based on the exposure control value, the imaging sensitivity with the exposure time fixed at the upper limit value.
The imaging device may be an imaging device with a fixed aperture.
The imaging device may include a first imaging device that captures images with light in a first wavelength region and a second imaging device that captures images with light in a second wavelength region. The circuit may be configured to determine the exposure time of the first imaging device and the exposure time of the second imaging device within the range below the upper limit value.
The upper limit value may include a first upper limit value for the exposure time of light in the first wavelength region and a second upper limit value for the exposure time of light in the second wavelength region. The circuit may be configured to determine the exposure time of the first imaging device within the range of the first upper limit value based on the exposure control value corresponding to light in the first wavelength region, and to determine the exposure time of the second imaging device within the range of the second upper limit value based on the exposure control value corresponding to light in the second wavelength region.
The circuit may be configured to set the upper limit value according to the moving speed of the imaging device.
The circuit may be configured to set the upper limit value according to the speed at which the direction of the imaging device changes.
The circuit may be configured to determine the exposure time based on the exposure control value within the range below the upper limit value when the aperture of the imaging device is fixed, and to determine the F value and the exposure time based on the exposure control value when the aperture of the imaging device is not fixed.
An imaging device according to one aspect of the present invention may include the above control device.
A mobile body according to one aspect of the present invention may be a mobile body that includes the above imaging device and moves.
A control method according to one aspect of the present invention may include a step of setting an upper limit value of the exposure time and a step of determining the exposure time of an imaging device within the range below the upper limit value based on an exposure control value of the imaging device.
A program according to one aspect of the present invention may cause a computer to execute a step of setting an upper limit value of the exposure time and a step of determining the exposure time of an imaging device within the range below the upper limit value based on an exposure control value of the imaging device.
According to one aspect of the present invention, an upper limit value of the exposure time can be set, which can suppress the exposure time from becoming extremely long.
The above summary of the invention does not enumerate all the necessary features of the present invention. Sub-combinations of these feature groups may also constitute an invention.
[Brief Description of the Drawings]
Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
Fig. 2 shows an example of the appearance of the imaging device 100 mounted on the UAV 10.
Fig. 3 shows an example of the functional blocks of the UAV 10.
Fig. 4 shows an example of the functional blocks of the imaging device 100.
Fig. 5 shows the range of the program graph generated when the user has set an upper limit value of the exposure time.
Fig. 6 shows the range of the program graph generated when the user has set an upper limit value of the exposure time.
Fig. 7 shows a screen on which the user sets the upper limit value of the exposure time.
Fig. 8 shows another example of a screen on which the user sets the upper limit value of the exposure time.
Fig. 9 is a flowchart showing an example of the execution procedure of the imaging control unit 182.
Fig. 10 shows another example of the appearance of the imaging device 100 mounted on the UAV 10.
Fig. 11 shows an example of a hardware configuration.
[Description of Embodiments]
Hereinafter, the present invention will be described through embodiments, but the following embodiments do not limit the invention defined by the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It is apparent to those of ordinary skill in the art that various changes or improvements can be made to the following embodiments, and it is apparent from the claims that embodiments with such changes or improvements are also included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner does not object to anyone reproducing these documents as they appear in the files or records of the Patent Office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device responsible for performing an operation. Specific stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits, which may include memory elements such as logical AND, logical OR, logical XOR, logical NAND, logical NOR and other logical operations, flip-flops, registers, field-programmable gate arrays (FPGAs), programmable logic arrays (PLAs), and the like.
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in a flowchart or block diagram. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or a programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a mobile body. The concept of a mobile body includes a flying object moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. The concept of a flying object moving in the air includes not only UAVs but also other aircraft, airships, helicopters, and the like moving in the air.
The UAV body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion unit. The UAV body 20 makes the UAV 10 fly by controlling the rotation of the rotors. The UAV body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is a multispectral camera that photographs an object within a desired imaging range in each of a plurality of wavelength bands. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 can use an actuator to support the imaging device 100 rotatably about the pitch axis. The gimbal 50 can further use actuators to support the imaging device 100 rotatably about the roll axis and the yaw axis, respectively. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be installed on the nose, that is, the front, of the UAV 10, and two other imaging devices 60 may be installed on its bottom surface. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera, and the two imaging devices 60 on the bottom side may likewise be paired to function as a stereo camera. An imaging device 60 can detect the presence of an object within its imaging range and measure the distance to the object. An imaging device 60 is an example of a measuring device that measures an object present in the imaging direction of the imaging device 100. The measuring device may also be another sensor, such as an infrared sensor or an ultrasonic sensor, that measures an object present in the imaging direction of the imaging device 100. Three-dimensional spatial data of the surroundings of the UAV 10 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it suffices that the UAV 10 includes at least one imaging device 60. The UAV 10 may also include at least one imaging device 60 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable on an imaging device 60 may be larger than that settable on the imaging device 100. An imaging device 60 may also have a single-focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV 10 to operate it remotely. The remote operation device 300 can communicate with the UAV 10 wirelessly. The remote operation device 300 sends the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located, and the UAV 10 moves so as to be located at the indicated altitude. The instruction information may include an ascent command to make the UAV 10 ascend. The UAV 10 ascends while it accepts the ascent command. When the altitude of the UAV 10 has reached its upper limit, the ascent of the UAV 10 may be restricted even if the ascent command is accepted.
The remote operation device 300 includes a display device 302. The display device 302 displays images captured by the imaging device 100. The display device 302 also functions as an input device that receives information input by the user in order to operate the UAV 10 remotely. For example, the display device 302 receives setting information for the imaging device 100. The remote operation device 300 sends the UAV 10 instruction information indicating various commands related to the operation of the imaging device 100, according to the setting information received from the user.
Fig. 2 shows an example of the appearance of the imaging device 100 mounted on the UAV 10. The imaging device 100 is a multispectral camera that captures image data in each of a plurality of preset wavelength bands. The imaging device 100 includes an imaging device for R 110, an imaging device for G 120, an imaging device for B 130, an imaging device for RE 140, and an imaging device for NIR 150. The imaging device 100 can record the respective image data captured by these five imaging devices as a multispectral image. For example, a multispectral image can be used to predict the health and vigor of crops.
Fig. 3 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42 (referred to as the IMU 42), a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 can receive, from the remote operation device 300, instruction information including various commands for the UAV control unit 30. The memory 32 stores the programs and the like required for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the IMU 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 32 may be provided inside the UAV body 20, and may be provided so as to be detachable from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 according to the programs stored in the memory 32. The UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 according to commands received from the remote operation device 300 via the communication interface 36. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate them. The propulsion unit 40 rotates the rotors via the drive motors according to commands from the UAV control unit 30 to make the UAV 10 fly.
The GPS receiver 41 receives a plurality of time-indicating signals transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates its own position (latitude and longitude), that is, the position (latitude and longitude) of the UAV 10, from the received signals. The IMU 42 detects the attitude of the UAV 10. The IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10: it detects the air pressure around the UAV 10 and converts the detected pressure into an altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
Fig. 4 shows an example of the functional blocks of the imaging device 100. The imaging device 100 includes the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. The imaging device 100 includes a processor 180, a transmission unit 190, and a memory 192.
The imaging device for R 110 includes an image sensor for R 112 and an optical system 114. The image sensor for R 112 captures the image formed by the optical system 114. The image sensor for R 112 includes a filter that passes light in the red wavelength band, and outputs an R image signal, which is the image signal of the red wavelength band. The red wavelength band is, for example, 620 nm to 750 nm. It may also be a specific band within the red region, for example 663 nm to 673 nm.
The imaging device for G 120 includes an image sensor for G 122 and an optical system 124. The image sensor for G 122 captures the image formed by the optical system 124. The image sensor for G 122 includes a filter that passes light in the green wavelength band, and outputs a G image signal, which is the image signal of the green wavelength band. The green wavelength band is, for example, 500 nm to 570 nm. It may also be a specific band within the green region, for example 550 nm to 570 nm.
The imaging device for B 130 includes an image sensor for B 132 and an optical system 134. The image sensor for B 132 captures the image formed by the optical system 134. The image sensor for B 132 includes a filter that passes light in the blue wavelength band, and outputs a B image signal, which is the image signal of the blue wavelength band. The blue wavelength band is, for example, 450 nm to 500 nm. It may also be a specific band within the blue region, for example 465 nm to 485 nm.
The imaging device for RE 140 includes an image sensor for RE 142 and an optical system 144. The image sensor for RE 142 captures the image formed by the optical system 144. The image sensor for RE 142 includes a filter that passes light in the red edge wavelength band, and outputs an RE image signal, which is the image signal of the red edge wavelength band. The red edge wavelength band is, for example, 705 nm to 745 nm. It may also be 712 nm to 722 nm.
The imaging device for NIR 150 includes an image sensor for NIR 152 and an optical system 154. The image sensor for NIR 152 captures the image formed by the optical system 154. The image sensor for NIR 152 includes a filter that passes light in the near-infrared wavelength band, and outputs an NIR image signal, which is the image signal of the near-infrared wavelength band. The near-infrared wavelength band is, for example, 800 nm to 2500 nm. It may also be 800 nm to 900 nm.
The processor 180 includes a multiplexer 170, an input receiving unit 172, a demosaic processing unit 174, and a recording processing unit 178. The processor 180 is an example of a circuit. The processor 180 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
The multiplexer 170 receives the image signals output from the respective image sensors, selects, according to a preset condition, the image signal output from any one of the image sensors, and inputs it to the input receiving unit 172.
The demosaic processing unit 174 generates display image data from the R image signal, the G image signal, and the B image signal input to the input receiving unit 172. The demosaic processing unit 174 generates the display image data by applying demosaic processing to the R, G, and B image signals. The demosaic processing unit 174 may apply thinning to the R, G, and B image signals, convert the thinned signals into image signals of a Bayer array, and generate the display image data from them. The transmission unit 190 sends the display image data to a display device. For example, the transmission unit 190 may send the display image data to the remote operation device 300, and the remote operation device 300 may display it on the display device 302 as a live-view image.
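As an editorial illustration of the thinning-to-Bayer step just described, the following is a minimal sketch in Python; the RGGB layout, the subsampling choice, and the NumPy data types are assumptions for illustration, not details taken from this disclosure.

    import numpy as np

    def planes_to_bayer(r, g, b):
        """Assemble same-sized R, G, B planes into an RGGB Bayer mosaic by
        keeping, at each photosite, only the channel the pattern calls for."""
        h, w = r.shape
        bayer = np.empty((h, w), dtype=r.dtype)
        bayer[0::2, 0::2] = r[0::2, 0::2]  # R photosites
        bayer[0::2, 1::2] = g[0::2, 1::2]  # G photosites on even rows
        bayer[1::2, 0::2] = g[1::2, 0::2]  # G photosites on odd rows
        bayer[1::2, 1::2] = b[1::2, 1::2]  # B photosites
        return bayer

The mosaic can then be handed to an ordinary Bayer demosaic routine to produce the display image data.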
The recording processing unit 178 generates recording image data in a preset recording format from the R, G, B, RE, and NIR image signals input to the input receiving unit 172. The recording processing unit 178 may generate, as the recording image data, RAW data in the RAW format from the R, G, B, RE, and NIR image signals. The recording processing unit 178 may generate full-pixel recording image data without thinning the R, G, B, RE, and NIR image signals. The recording processing unit 178 may store the recording image data in the memory 192. The memory 192 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 192 may be provided inside the housing of the imaging device 100, and may be provided so as to be detachable from the housing of the imaging device 100.
The processor 180 further includes a receiving unit 184 and a switching unit 186. The receiving unit 184 receives a storage instruction to store the recording image data in the memory 192. The receiving unit 184 may receive the storage instruction from the user via an external terminal such as the remote operation device 300. When the position of the imaging device 100 is a predetermined position, the receiving unit 184 may receive the storage instruction from the UAV control unit 30. When the position of the UAV 10 is a predetermined position, the UAV control unit 30 determines that the position of the imaging device 100 is the predetermined position, and the receiving unit 184 may receive the storage instruction from the UAV control unit 30. The imaging device 100 may include its own GPS receiver; in that case, the processor 180 may determine, from the position information of that GPS receiver, whether the position of the imaging device 100 is the preset position. The switching unit 186 switches between two modes: one in which the demosaic processing unit 174 generates display image data from the R, G, and B image signals input to the input receiving unit 172, and one in which the recording processing unit 178 generates recording image data in the preset recording format from the R, G, B, RE, and NIR image signals input to the input receiving unit 172.
The exposure control in the imaging device 100 will now be described. The imaging control unit 182 sets an upper limit value of the exposure time (the charge accumulation time of the image sensors). The imaging control unit 182 determines the exposure time of the imaging device 100 within a range not exceeding the upper limit value, according to the exposure control value of the imaging device 100. The imaging control unit 182 sets an upper limit value that has been input.
The imaging control unit 182 displays on the display device 302 the information for inputting the upper limit value, together with the exposure time determined within the range not exceeding the upper limit value according to the current exposure control value of the imaging device 100. After the upper limit value is set, the imaging control unit 182 updates the exposure time displayed on the display device 302 while the imaging device 100 operates.
When no upper limit value is set, the imaging control unit 182 determines the exposure time according to the exposure control value, within a range not exceeding a preset value longer than the maximum upper limit value that can be input.
When an upper limit value has been set, the imaging control unit 182 generates a program graph within the range not exceeding the upper limit value. The program graph represents the relationship among exposure value, imaging sensitivity, and exposure time. The imaging control unit 182 determines the exposure time and the imaging sensitivity according to the exposure control value and the program graph.
The imaging control unit 182 determines, according to the exposure control value, the exposure time with the imaging sensitivity fixed at a preset value, within the range not exceeding the upper limit value. When such an exposure time cannot be determined within the range not exceeding the upper limit value, the imaging control unit 182 sets the exposure time to the upper limit value and determines, according to the exposure control value, the imaging sensitivity with the exposure time fixed at the upper limit value. For example, when the imaging control unit 182 determines that underexposure would occur with the imaging sensitivity fixed at the preset value and the exposure time set to the upper limit value, it sets the exposure time to the upper limit value and also raises the imaging sensitivity according to the exposure control value.
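The fixed-sensitivity rule above can be sketched as follows. This is a minimal illustration, not the device's implementation: it assumes the EV value is normalized so that the proper exposure time at the base sensitivity is t = 2^-EV seconds (the fixed F-number is folded into the EV), and the numeric limits are the ones used in the Fig. 5 example.

    def resolve_exposure(ev, upper_limit_s, base_iso=100,
                         iso_max=6400, min_time_s=1 / 8000):
        """Return (exposure_time_s, iso). Hold ISO at base_iso and choose
        the exposure time within the upper limit; if even the limit would
        underexpose, clamp the time at the limit and raise ISO instead."""
        t_needed = 2.0 ** -ev              # proper exposure time at base_iso
        t = max(t_needed, min_time_s)
        if t <= upper_limit_s:
            return t, base_iso             # within the limit: sensitivity stays fixed
        shortfall = t / upper_limit_s      # how underexposed the clamped time would be
        return upper_limit_s, min(base_iso * shortfall, iso_max)

For example, with a 1/250 s limit, a scene that would need 1/60 s at ISO 100 is instead shot at 1/250 s and roughly ISO 400.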
The imaging device 100 may include a first imaging device that captures light in a first wavelength region and a second imaging device that captures light in a second wavelength region. The imaging control unit 182 determines the exposure time of the first imaging device and the exposure time of the second imaging device within the range not exceeding the upper limit value. For example, the upper limit value includes a first upper limit value of the exposure time for light in the first wavelength region and a second upper limit value of the exposure time for light in the second wavelength region. The imaging control unit 182 determines the exposure time of the first imaging device within the range of the first upper limit value, according to the exposure control value corresponding to light in the first wavelength region, and determines the exposure time of the second imaging device within the range of the second upper limit value, according to the exposure control value corresponding to light in the second wavelength region. The first imaging device may be one of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150, and the second imaging device may be another of them.
The imaging device 100 may be an imaging device with a fixed aperture. For example, the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 may each be an imaging device with a fixed aperture.
The imaging control unit 182 may set the upper limit value according to the movement speed of the imaging device 100. For example, the imaging control unit 182 may set the upper limit value according to the movement speed of the UAV 10, or according to the relative speed between the UAV 10 and the subject.
The imaging control unit 182 may set the upper limit value according to the speed at which the orientation of the imaging device 100 changes. The imaging control unit 182 may set the upper limit value according to the rotation speed of the imaging device 100 driven by the gimbal 50.
The imaging control unit 182 may determine the exposure time within the range not exceeding the upper limit value according to the exposure control value when the aperture of the imaging device 100 is fixed, and may determine the F-number and the exposure time according to the exposure control value when the aperture of the imaging device 100 is not fixed. Here, "when the aperture of the imaging device 100 is fixed" includes, for example, the case where the imaging device 100 has a variable aperture and the imaging mode is set to an aperture-priority mode; the case where the imaging device 100 has a variable aperture and underexposure occurs with the F-number set to its minimum value; and the case where the imaging device 100 has no aperture at all. Even if the imaging device 100 has no aperture, this is equivalent to the aperture being fixed.
Figs. 5 and 6 show the range of the program graph generated when the user sets an upper limit value of the exposure time. Here, the imaging device for R 110 will be described. The imaging device for R 110 has the function of changing the exposure time within the range of 1/8000 second to 1/8 second, and the function of changing the ISO sensitivity within the range of 100 to 6400. Fig. 5 shows the case where the F-number is 1.4, and Fig. 6 the case where the F-number is 2.0. Note that 1/8000 second is not the shortest possible exposure time; the shortest time may be, for example, 1/20000 second.
In Figs. 5 and 6, the horizontal axis represents the exposure time and the vertical axis represents the imaging sensitivity. The range 500 shown in Figs. 5 and 6 represents the range of the program graph generated by the imaging control unit 182 when the upper limit value of the exposure time is set to 1/250 second.
When performing automatic exposure control, the imaging control unit 182 determines the exposure control value from the image obtained by the imaging device for R 110. For example, the imaging control unit 182 determines an EV value (exposure value) from the brightness information of the image obtained by the imaging device for R 110.
The imaging control unit 182 determines the exposure time and the imaging sensitivity from the EV value and the program graph generated within the range 500 based on the upper limit of the exposure time. As described above, the imaging control unit 182 does not set an exposure time longer than 1/250 second for the imaging device for R 110. In other words, the imaging control unit 182 prohibits the imaging device for R 110 from exposing for longer than 1/250 second.
The maximum upper limit value of the exposure time that the user can set for the imaging device for R 110 may be shorter than 1/8 second; for example, it may be 1/15 second. When the user has not set an upper limit value of the exposure time, the imaging control unit 182 determines the exposure time within a range not exceeding 1/8 second, the maximum exposure time settable on the imaging device for R 110. That is, when the user has not set an upper limit value, the imaging control unit 182 may determine an exposure time longer than 1/15 second.
The imaging control unit 182 sets the exposure time and the imaging sensitivity of each of the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 in the same way as it does for the imaging device for R 110. That is, for each of these imaging devices, the imaging control unit 182 generates a program graph within the range not exceeding the set upper limit value of the exposure time. The imaging control unit 182 also determines an EV value (exposure value) from the brightness information of the image obtained by each of these imaging devices, and determines the exposure time and the imaging sensitivity of each imaging device from the generated program graph and the EV value.
The imaging control unit 182 may also determine the respective EV values of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 from the brightness information of a specific region in each of them. The specific region may be, for example, a region containing the same subject. The imaging control unit 182 may calculate a vegetation index for each pixel from the image information of the specific region in each imaging device, and determine an image region whose vegetation index satisfies a predetermined condition as the specific region. Examples of vegetation indices include the Normalized Difference Vegetation Index (NDVI), the SAVI (Soil Adjusted Vegetation Index), the gNDVI (Green Normalized Difference Vegetation Index), and the NDRE (Normalized Difference Red Edge Index).
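For reference, the per-pixel NDVI computation mentioned above can be sketched as follows; the threshold used to pick the specific region is illustrative, and the disclosure does not fix one.

    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalized Difference Vegetation Index, computed per pixel."""
        nir = nir.astype(np.float64)
        red = red.astype(np.float64)
        return (nir - red) / (nir + red + eps)  # eps avoids division by zero

    def specific_region_mask(nir, red, threshold=0.4):
        """Boolean mask of pixels whose NDVI exceeds an illustrative threshold."""
        return ndvi(nir, red) > threshold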
When the user has set an upper limit value of the exposure time, the imaging control unit 182 may fix the imaging sensitivity at a preset value (for example, ISO 100) and determine, from the EV value and the imaging sensitivity, the exposure time giving a proper exposure within the range not exceeding the user-set upper limit value. When underexposure would occur even at the upper limit exposure time, the imaging control unit 182 may set the exposure time to the user-set upper limit value and determine, from the EV value and that exposure time, the imaging sensitivity giving a proper exposure. When shooting in a dark environment, the underexposure becomes stronger. Even in that case, the imaging control unit 182 does not change the exposure time so that it exceeds the set upper limit. Instead of lengthening the exposure time, the imaging control unit 182 raises the exposure by increasing the gain applied to the image sensors of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. Normally, the imaging control unit 182 keeps the gain applied to the image sensors at a predetermined value (for example, gain = 1.0), and raises the gain only when the exposure cannot be adjusted without doing so. This prevents degradation of the image quality obtained by the image sensors. The ISO sensitivity is the sensitivity to a specific amount of light, so adjusting the gain can be regarded as equivalent to adjusting the ISO sensitivity.
Fig. 7 shows a screen 700 on which the user sets the upper limit value of the exposure time. The screen 700 is displayed on the display device 302. The screen 700 is used when the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 are controlled collectively. The screen 700 includes a display area 710 for the exposure conditions and a preset area 712 for setting the upper limit value of the exposure time.
The display area 710 displays the aperture value (Iris), the imaging sensitivity (ISO), and the exposure time (shutter speed) of the imaging device 100. The preset area 712 displays a slider bar 720 that accepts changes to the upper limit value of the exposure time.
The slider bar 720 is an example of information with which the user inputs the upper limit value of the exposure time. The user changes the upper limit value of the exposure time by moving the position of the button (marker) 730 of the slider bar 720. The screen 700 shows a state in which the upper limit value of the exposure time is set to 1/250 second. The remote operation device 300 sends the UAV 10 instruction information indicating the upper limit value of the exposure time designated by the user. The imaging control unit 182 determines an exposure time not exceeding the upper limit value indicated by the instruction information accepted via the UAV control unit 30, together with an imaging sensitivity. The transmission unit 190 sends the remote operation device 300 the exposure conditions, including the aperture value of the imaging device 100 and the determined exposure time and imaging sensitivity. The remote operation device 300 updates the screen 700 according to the received exposure conditions.
In this way, the remote operation device 300 can present the user with the current exposure conditions of the imaging device 100 together with the slider bar 720. When the user moves the button 730 of the slider bar 720 to change the upper limit value, the exposure time and the imaging sensitivity can change accordingly. For example, when the user changes the upper limit value of the exposure time from the state shown in Fig. 7 to 1/1000 second, then, assuming the EV value is fixed, the imaging control unit 182 sets the exposure time to 1/1000 second and changes the imaging sensitivity (ISO sensitivity) to 200 to obtain a proper exposure. The shutter speed (exposure time) displayed in the display area 710 can be updated in real time. For example, when the button 730 of the slider bar 720 is moved to set the upper limit value of the exposure time to 1/1000 second, the shutter speed displayed in the display area 710 becomes 1/2000 second or 1/10000 second depending on the exposure conditions. At this time, the gain is maintained at, for example, 1.0. The underexposure may then become strong enough that the shutter speed can no longer be adjusted; in that case, the gain becomes greater than 1.0. The gain information may be displayed in the display area 710 together with the ISO sensitivity information. Instead of the gain, the ISO sensitivity may be adjusted in the same manner as the gain. That is, with ISO 500 as the predetermined value, only when the underexposure becomes strong is the sensitivity raised above ISO 500, for example to ISO 800. This suppresses degradation of the image quality. After the upper limit value of the exposure time is set, the ISO sensitivity, the gain, and the shutter speed can be updated in real time (successively) while the imaging device 100 is shooting (operating), so the user can grasp the shooting conditions in real time. The shutter speed in the display area 710 may also be replaced with the slider bar 720; in that case, once the upper limit value of the shutter speed is set, the displayed shutter speed is subsequently switched in real time.
Fig. 8 shows another example of a screen on which the user sets the upper limit value of the exposure time. The screen 800 is displayed on the display device 302. The screen 800 is used to control the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 individually. The screen 800 includes a display area 810 for the exposure conditions and a preset area 812 for setting the upper limit values of the exposure times.
The display area 810 displays the aperture value (Iris), the imaging sensitivity (ISO), and the exposure time (shutter speed) of each of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. The preset area 812 displays slider bars 821, 822, 823, 824, and 825. The slider bars 821 to 825 are an example of information for accepting changes to the upper limit values of the exposure times of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150, respectively.
The user changes the upper limit value of the exposure time of the imaging device for R 110 by moving the position of the button 831 of the slider bar 821. Likewise, by moving the buttons of the slider bars 822 to 825, the user can individually change the upper limit values of the exposure times of the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150.
Fig. 9 is a flowchart showing an example of the execution procedure of the imaging control unit 182. In S902, the imaging control unit 182 determines the type of event related to the exposure conditions. Here, the events include an event of setting the upper limit value of the exposure time, a shooting event, and an end event. The event of setting the upper limit value of the exposure time occurs when the user changes the upper limit value of the exposure time using the remote operation device 300. The shooting event occurs when the user instructs shooting using the remote operation device 300. The shooting end event occurs when the user instructs the end of shooting using the remote operation device 300.
When the event of setting the upper limit value of the exposure time occurs in S902, the imaging control unit 182 generates, in S904, a program graph within the range of the set upper limit value of the exposure time. The imaging control unit 182 may generate a program graph for each of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150.
When a shooting event occurs in S902, the imaging control unit 182 determines, in S912, a target EV value from the current EV value and the brightness information of the image captured by the imaging device 100. Specifically, the imaging control unit 182 may determine the EV value of each of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150.
In S914, the imaging control unit 182 determines the exposure time and the imaging sensitivity from the program graph generated in S904 and the EV value determined in S912. The imaging control unit 182 may determine the exposure time and the imaging sensitivity of each of the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150.
In S916, the imaging control unit 182 makes the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 capture images with the exposure time and the imaging sensitivity determined in S914.
When the end event occurs in S902, the processing of this flowchart ends.
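The flow of Fig. 9 can be summarized in code as the event loop below. It is a sketch only: the event objects and the four callbacks standing in for the imaging control unit 182 are hypothetical, not part of the disclosure.

    from collections import namedtuple

    Event = namedtuple("Event", ["kind", "value"])

    def imaging_control_loop(events, make_program_graph, measure_target_ev,
                             resolve_exposure, capture):
        graph = None
        for event in events:                               # S902: classify the event
            if event.kind == "set_upper_limit":
                graph = make_program_graph(event.value)    # S904
            elif event.kind == "shoot":
                target_ev = measure_target_ev()            # S912
                time_s, iso = resolve_exposure(graph, target_ev)  # S914
                capture(time_s, iso)                       # S916
            elif event.kind == "end":
                break                                      # end event: leave the loop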
As described above, the imaging device 100 allows an upper limit value of the exposure time to be set, so that the exposure time can be kept from becoming extremely long under automatic exposure control. This suppresses image blur and prevents extreme overexposure. For example, even when the UAV 10 photographs crops on the ground while flying at high speed, the upper limit can be set so that the exposure time does not become long, which reduces blur in the crop images and in turn allows the health of the crops and the like to be analyzed well. Moreover, the imaging device 100 allows an upper limit value of the exposure time to be set for each wavelength region of the subject light. For example, the R-component intensity and the G-component intensity differ greatly between a place where crops have been harvested and a place where they have not. Even in such a case, setting an upper limit value of the exposure time for each wavelength region of the subject light can keep the shooting from being performed in an overexposed state.
The exposure processing described in connection with the above embodiment is suited to fixed-aperture imaging devices without a variable aperture. By performing this exposure processing, the imaging device 100 can be configured at low cost from a plurality of relatively simple imaging devices without variable apertures. The exposure processing described in connection with the above embodiment can also be applied to a single imaging device.
The above embodiment mainly describes the case where the user sets the upper limit value of the exposure time. However, the imaging control unit 182 may also set the upper limit value of the exposure time according to the movement speed of the imaging device 100. For example, the higher the movement speed of the imaging device 100, the smaller the value the imaging control unit 182 may determine as the upper limit value of the exposure time. The imaging control unit 182 may also set the upper limit value according to the relative speed between the UAV 10 and the subject, or according to the speed at which the orientation of the imaging device 100 changes. The imaging control unit 182 may set the upper limit value according to the rotation speed of the imaging device 100 driven by the gimbal 50: the higher the rotation speed of the imaging device 100 on the gimbal 50, the smaller the value the imaging control unit 182 may determine as the upper limit value of the exposure time.
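As one way to realize the speed-dependent setting described above (an assumption, since the disclosure only requires that a higher speed yield a smaller upper limit value), the limit can be chosen so that motion blur stays within a pixel budget:

    def upper_limit_from_speed(speed_mps, gsd_m=0.05, max_blur_px=1.0):
        """Exposure-time upper limit (seconds) that keeps motion blur below
        max_blur_px pixels. gsd_m is the ground distance covered by one
        pixel; both parameters are illustrative, not from the disclosure."""
        if speed_mps <= 0:
            return float("inf")   # stationary: no motion-blur constraint
        return max_blur_px * gsd_m / speed_mps

    # e.g. 10 m/s over 5 cm/pixel ground sampling gives a limit of 1/200 s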
Fig. 10 shows another example of the appearance of the imaging device 100 mounted on the UAV 10. This imaging device 100 differs from the one shown in Fig. 2 in that it includes an imaging device for RGB 160 in addition to the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. The imaging device for RGB 160 may be the same as an ordinary camera and include an optical system and an image sensor. The image sensor may include a filter that passes light in the red wavelength band, a filter that passes light in the green wavelength band, and a filter that passes light in the blue wavelength band, arranged in a Bayer array. The imaging device for RGB 160 can output RGB images. The red wavelength band may be, for example, 620 nm to 750 nm; the green wavelength band, for example, 500 nm to 570 nm; and the blue wavelength band, for example, 450 nm to 500 nm.
The imaging control unit 182 generates, for the imaging device for RGB 160, a program graph over the range not exceeding the upper limit value of the exposure time in the same manner as in the exposure control for the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150, and determines the exposure time and the imaging sensitivity according to the program graph and the EV value.
Fig. 11 shows an example of a computer 1200 in which multiple aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can make the computer 1200 function as an operation associated with a device according to an embodiment of the present invention or as one or more "units" of that device. Alternatively, the program can make the computer 1200 execute that operation or those one or more "units". The program can make the computer 1200 execute a process according to an embodiment of the present invention or the stages of that process. Such a program may be executed by a CPU 1212 to make the computer 1200 execute specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of this embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program and the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. The programs are provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing operations or processing of information according to the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read data to the network, or writes data received from the network into a reception buffer provided in the recording medium.
In addition, the CPU 1212 can make the RAM 1214 read all or a necessary part of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
Various types of information such as various types of programs, data, tables, and databases may be stored in recording media and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 can perform the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. The CPU 1212 can also retrieve information in files, databases, and the like in the recording media. For example, when multiple entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in a recording medium, the CPU 1212 can retrieve from those entries an entry whose first-attribute value matches a specified condition, read the second-attribute value stored in that entry, and thereby obtain the second-attribute value associated with the first attribute that satisfies the predetermined condition.
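The retrieval just described amounts to a conditional lookup over associated attribute pairs; the following is a minimal sketch with a hypothetical entry layout, for illustration only.

    entries = [
        {"first": "field-a", "second": 0.71},   # hypothetical attribute pairs
        {"first": "field-b", "second": 0.42},
    ]

    def second_values(entries, condition):
        """Second-attribute values of entries whose first attribute matches."""
        return [e["second"] for e in entries if condition(e["first"])]

    print(second_values(entries, lambda v: v == "field-a"))  # [0.71]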
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
It should be noted that the order of execution of the operations, procedures, steps, stages, and other processing in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", and the like are not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even if the operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be implemented in this order.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those of ordinary skill in the art that various changes or improvements can be made to the above embodiments, and it is apparent from the claims that embodiments with such changes or improvements are also included in the technical scope of the present invention.
[Description of Reference Numerals]
10 UAV
20 UAV body
30 UAV control unit
32 Memory
36 Communication interface
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
45 Temperature sensor
46 Humidity sensor
50 Gimbal
60 Imaging device
100 Imaging device
110 Imaging device for R
112 Image sensor for R
114 Optical system
120 Imaging device for G
122 Image sensor for G
124 Optical system
130 Imaging device for B
132 Image sensor for B
134 Optical system
140 Imaging device for RE
142 Image sensor for RE
144 Optical system
150 Imaging device for NIR
152 Image sensor for NIR
154 Optical system
160 Imaging device for RGB
170 Multiplexer
172 Input receiving unit
174 Demosaic processing unit
178 Recording processing unit
180 Processor
182 Imaging control unit
184 Receiving unit
186 Switching unit
190 Transmission unit
192 Memory
300 Remote operation device
302 Display device
500 Range
700, 800 Screens
710, 810 Display areas
712, 812 Preset areas
720 Slider bar
730, 831 Buttons
821, 822, 823, 824, 825 Slider bars
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM

Claims (17)

  1. A control device, characterized by comprising a circuit, the circuit being configured to: set an upper limit value of an exposure time; and
    determine an exposure time of an imaging device within a range not exceeding the upper limit value, according to an exposure control value of the imaging device.
  2. The control device according to claim 1, wherein the circuit is configured to set an upper limit value that has been input.
  3. The control device according to claim 2, wherein the circuit is configured to display, on a display device, information for inputting the upper limit value and an exposure time determined within the range not exceeding the upper limit value according to a current exposure control value of the imaging device.
  4. The control device according to claim 3, wherein the circuit is configured to: after the upper limit value is set, update the exposure time displayed on the display device while the imaging device operates.
  5. The control device according to any one of claims 2 to 4, wherein the circuit is configured to: when no upper limit value is set, determine the exposure time according to the exposure control value within a range not exceeding a preset value longer than the maximum upper limit value that can be input.
  6. The control device according to any one of claims 1 to 4, wherein the circuit is configured to:
    when the upper limit value is set, generate, within the range not exceeding the upper limit value, a program graph representing the relationship among exposure value, imaging sensitivity, and exposure time; and
    determine the exposure time and the imaging sensitivity according to the exposure control value and the program graph.
  7. The control device according to any one of claims 1 to 4, wherein the circuit is configured to:
    determine, according to the exposure control value, the exposure time with the imaging sensitivity fixed at a preset value, within the range not exceeding the upper limit value; and
    when the exposure time with the imaging sensitivity fixed at the preset value cannot be determined within the range not exceeding the upper limit value, set the exposure time to the upper limit value and determine, according to the exposure control value, the imaging sensitivity with the exposure time fixed at the upper limit value.
  8. The control device according to any one of claims 1 to 4, wherein the imaging device is an imaging device with a fixed aperture.
  9. The control device according to any one of claims 1 to 4, wherein the imaging device includes a first imaging device that captures light in a first wavelength region and a second imaging device that captures light in a second wavelength region,
    and the circuit is configured to determine an exposure time of the first imaging device and an exposure time of the second imaging device within the range not exceeding the upper limit value.
  10. The control device according to claim 9, wherein the upper limit value includes a first upper limit value of the exposure time for light in the first wavelength region and a second upper limit value of the exposure time for light in the second wavelength region,
    and the circuit is configured to:
    determine the exposure time of the first imaging device within the range of the first upper limit value, according to an exposure control value corresponding to light in the first wavelength region; and
    determine the exposure time of the second imaging device within the range of the second upper limit value, according to an exposure control value corresponding to light in the second wavelength region.
  11. The control device according to claim 1, wherein the circuit is configured to set the upper limit value according to a movement speed of the imaging device.
  12. The control device according to claim 1, wherein the circuit is configured to set the upper limit value according to a speed at which the orientation of the imaging device changes.
  13. The control device according to any one of claims 1 to 4, wherein the circuit is configured to:
    when the aperture of the imaging device is fixed, determine the exposure time within the range not exceeding the upper limit value according to the exposure control value; and
    when the aperture of the imaging device is not fixed, determine an F-number and the exposure time according to the exposure control value.
  14. An imaging device, characterized by comprising the control device according to any one of claims 1 to 4.
  15. A mobile body, characterized by comprising the imaging device according to claim 14 and being configured to move.
  16. A control method, characterized by comprising the steps of: setting an upper limit value of an exposure time; and
    determining an exposure time of an imaging device within a range not exceeding the upper limit value, according to an exposure control value of the imaging device.
  17. A program, characterized by causing a computer to execute the steps of: setting an upper limit value of an exposure time; and
    determining an exposure time of an imaging device within a range not exceeding the upper limit value, according to an exposure control value of the imaging device.
PCT/CN2020/102917 2019-07-31 2020-07-20 Control device, imaging device, mobile body, control method, and program WO2021017914A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080003377.6A 2019-07-31 2020-07-20 Control device, imaging device, mobile body, control method, and program
US17/524,623 US20220141371A1 (en) 2019-07-31 2021-11-11 Control apparatuses, photographing apparatuses, movable objects, control methods, and programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-141589 2019-07-31
JP2019141589A 2019-07-31 2019-07-31 Control device, imaging device, mobile body, control method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/524,623 Continuation US20220141371A1 (en) 2019-07-31 2021-11-11 Control apparatuses, photographing apparatuses, movable objects, control methods, and programs

Publications (1)

Publication Number Publication Date
WO2021017914A1 true WO2021017914A1 (zh) 2021-02-04

Family

ID=74229513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/102917 WO2021017914A1 (zh) 2019-07-31 2020-07-20 控制装置、摄像装置、移动体、控制方法以及程序

Country Status (4)

Country Link
US (1) US20220141371A1 (zh)
JP (1) JP2021027409A (zh)
CN (1) CN112335230A (zh)
WO (1) WO2021017914A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024018691A1 (ja) * 2022-07-19 2024-01-25 FUJIFILM Corporation Control device, control method, and program


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004266458A (ja) * 2003-02-28 2004-09-24 Shimadzu Corp Imaging device and synchronized imaging timing controller
WO2005064922A1 (ja) * 2003-12-25 2005-07-14 Niles Co., Ltd. Imaging system
JP2006050350A (ja) * 2004-08-05 2006-02-16 Sanyo Electric Co Ltd Imaging device and control circuit for an imaging element
US7630000B2 (en) * 2005-07-29 2009-12-08 Olympus Imaging Corp. Electronic blurring correction apparatus
JP2012004949A (ja) * 2010-06-18 2012-01-05 Panasonic Corp Camera body, interchangeable lens unit, and imaging device
JP2013115514A (ja) * 2011-11-25 2013-06-10 Nikon Corp Display control device and display control program
JP2016100686A (ja) * 2014-11-19 2016-05-30 Samsung Electronics Co., Ltd. Imaging device and imaging method
JP2018046787A (ja) * 2016-09-23 2018-03-29 Drone Japan Co., Ltd. Agricultural management prediction system, agricultural management prediction method, and server device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012019427A (ja) * 2010-07-09 2012-01-26 Hoya Corp Exposure control device and electronic camera provided with the same
CN104980664A (zh) * 2014-04-03 2015-10-14 Canon Inc. Image processing apparatus, control method therefor, and imaging apparatus
JP2018037747A (ja) * 2016-08-29 2018-03-08 Canon Inc. Information processing device, control method therefor, and program
CN108419026A (zh) * 2018-05-31 2018-08-17 Goertek Inc. Camera exposure time adjustment method, apparatus, and device
CN109862283A (zh) * 2019-02-19 2019-06-07 Hefei Taihe Optoelectronic Technology Co., Ltd. Method and device for automatically adjusting exposure time

Also Published As

Publication number Publication date
CN112335230A (zh) 2021-02-05
US20220141371A1 (en) 2022-05-05
JP2021027409A (ja) 2021-02-22

Similar Documents

Publication Publication Date Title
US11070735B2 (en) Photographing device, photographing system, mobile body, control method and program
JP6496955B1 (ja) 制御装置、システム、制御方法、及びプログラム
CN111567032B (zh) 确定装置、移动体、确定方法以及计算机可读记录介质
JP6384000B1 (ja) 制御装置、撮像装置、撮像システム、移動体、制御方法、及びプログラム
CN110383228B (zh) 生成装置、生成系统、摄像系统、移动体以及生成方法
US20210235044A1 (en) Image processing device, camera device, mobile body, image processing method, and program
WO2021017914A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
WO2020192385A1 (zh) 确定装置、摄像系统及移动体
JP6651693B2 (ja) 制御装置、移動体、制御方法、及びプログラム
WO2020020042A1 (zh) 控制装置、移动体、控制方法以及程序
WO2019223614A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
WO2020216057A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
JP6896962B2 (ja) 決定装置、飛行体、決定方法、及びプログラム
WO2021249245A1 (zh) 装置、摄像装置、摄像系统及移动体
WO2018163300A1 (ja) 制御装置、撮像装置、撮像システム、移動体、制御方法、及びプログラム
WO2021115166A1 (zh) 确定装置、飞行体、确定方法以及程序
WO2021083049A1 (zh) 图像处理装置、图像处理方法以及程序
JP6818987B1 (ja) 画像処理装置、撮像装置、移動体、画像処理方法、及びプログラム
WO2021143425A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
US20200241570A1 (en) Control device, camera device, flight body, control method and program
JP2022053417A (ja) 制御装置、撮像装置、移動体、制御方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20847996

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20847996

Country of ref document: EP

Kind code of ref document: A1