WO2020192385A1 - Determination device, camera system, and moving object - Google Patents

Determination device, camera system, and moving object

Info

Publication number
WO2020192385A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
imaging device
wavelength region
imaging
Prior art date
Application number
PCT/CN2020/078018
Other languages
French (fr)
Chinese (zh)
Inventor
家富邦彦
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080002806.8A priority Critical patent/CN112154646A/en
Publication of WO2020192385A1 publication Critical patent/WO2020192385A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N5/00 Details of television systems
    • H04N5/30 Transforming light or analogous information into electric information
    • H04N5/33 Transforming infrared radiation

Definitions

  • The invention relates to a determination device, a camera system, and a moving object.
  • Patent Document 1 discloses a sensing system that, when sensing an area containing an object under inspection together with a reference reflection area, senses the reference reflection area in each wavelength band along with the object under inspection; the reference reflection area has a reflectance corresponding to that of the object under inspection.
  • Patent Document 1 International Publication No. 2017/221756
  • a multi-spectral camera that captures images in multiple wavelength bands may not be able to appropriately set imaging conditions suitable for a specific object.
  • The determination device may include a circuit configured to determine a first region in a first image of a first wavelength region captured by a first image sensor included in a first imaging device, and to determine an imaging control value of a second imaging device based on image information of a second region, corresponding to the first region, in a second image of a second wavelength region captured by a second image sensor included in the second imaging device.
  • the image information may be luminance information
  • the imaging control value may be an exposure control value
  • the circuit can determine the first area according to the first image and the second image.
  • The circuit may select, as the images for determining the first region, either the first image and the second image, or the first image and a third image of a third wavelength region captured by a third image sensor of a third imaging device.
  • The circuit may determine the first region based on the selected pair of images.
  • The circuit may determine the imaging control value of the first imaging device based on the image information of the first region in the first image, determine the imaging control value of the second imaging device based on the image information of the second region in the second image, and determine the imaging control value of the third imaging device based on the image information of a third region, corresponding to the first region, in the third image.
  • the circuit can select the first image and the second image or the first image and the third image according to the characteristics of the object.
  • the first wavelength region may be a wavelength region of the near infrared region.
  • the second wavelength region may be a wavelength region of the red region.
  • the third wavelength region may be the wavelength region of the green region or the red edge region.
  • The circuit may select, as the images for determining the first region, the first image and the second image; the first image and a third image of a third wavelength region captured by a third image sensor of a third imaging device; or the first image and a fourth image of a fourth wavelength region captured by a fourth image sensor of a fourth imaging device.
  • the circuit may determine the first area according to the selected first image and second image, first image and third image, or first image and fourth image.
  • The circuit may determine the imaging control value of the first imaging device based on the image information of the first region in the first image, the imaging control value of the second imaging device based on the image information of the second region in the second image, the imaging control value of the third imaging device based on the image information of a third region, corresponding to the first region, in the third image, and the imaging control value of the fourth imaging device based on the image information of a fourth region, corresponding to the first region, in the fourth image.
  • the circuit can select the first image and the second image, the first image and the third image, or the first image and the fourth image according to the characteristics of the subject.
  • the first wavelength region may be a wavelength region of the near infrared region.
  • the second wavelength region may be a wavelength region of the red region.
  • the third wavelength region may be a wavelength region of the green region.
  • the fourth wavelength region may be the wavelength region of the red edge region.
  • The determination device may include a circuit configured to determine a first region in a first image based on the first image of a first wavelength region captured by a first image sensor included in a first imaging device and a second image of a second wavelength region captured by a second image sensor included in a second imaging device, and to determine the imaging control value of a third imaging device based on the image information of a third region, corresponding to the first region, in a third image of a third wavelength region captured by a third image sensor of the third imaging device.
  • the image information may be luminance information
  • the imaging control value may be an exposure control value
  • the first wavelength region may be a wavelength region of the near infrared region.
  • the second wavelength region may be a wavelength region of a red region, a green region, or a red edge region.
  • the third wavelength region may be a wavelength region of a red region, a green region, and a blue region.
  • the camera system may include the above-mentioned determining device.
  • the camera system may include a first camera device and a second camera device.
  • The moving object according to one aspect of the present invention may be a moving object that is equipped with the above-mentioned camera system and moves.
  • the determining method may include a stage of determining the first region in the first image based on the first image of the first wavelength region captured by the first image sensor included in the first imaging device.
  • The determining method may include a stage of determining the imaging control value of the second imaging device based on image information of a second region, corresponding to the first region, in a second image of a second wavelength region captured by a second image sensor included in the second imaging device.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned determining device.
  • According to one aspect of the present invention, a multispectral camera that separately captures images in a plurality of wavelength bands can appropriately set shooting conditions suitable for a specific subject.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) and a remote operation device.
  • Fig. 2 is a diagram showing an example of the appearance of an imaging system mounted on a UAV.
  • Fig. 3 is a diagram showing an example of functional blocks of UAV.
  • Fig. 4 is a diagram showing an example of functional blocks of the camera system.
  • Fig. 5 is a diagram showing an example of an image taken by an imaging system.
  • Fig. 6 is a luminance distribution diagram of a part of the image shown in Fig. 5.
  • Fig. 7 is a diagram for explaining blocks formed by dividing an image.
  • Fig. 8 is a diagram for explaining a region of interest in an image.
  • Fig. 9 is a diagram showing an example of an image taken by an imaging system.
  • Fig. 10 is a flowchart showing an example of a procedure for determining an exposure control value.
  • FIG. 11 is a diagram showing another example of the appearance of the imaging system mounted on the UAV.
  • Fig. 12 is a diagram for explaining an example of the hardware configuration.
  • a block may represent (1) a stage of a process of performing an operation or (2) a "part" of a device that performs an operation.
  • Specific stages and “parts” can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • Reconfigurable hardware circuits may include logic operations such as logical AND, logical OR, logical exclusive-OR, logical NAND, and logical NOR, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions for execution by a suitable device.
  • A computer-readable medium on which instructions are stored constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of computer-readable media include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • The computer-readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the source code or object code includes traditional procedural programming languages.
  • Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, or state-setting data; object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++; and the "C" programming language or similar programming languages.
  • The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • The UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and a camera system 100.
  • The gimbal 50 and the camera system 100 together are an example of an imaging system.
  • The UAV 10 is an example of a moving object.
  • The concept of a moving object includes flying objects moving in the air, vehicles moving on the ground, ships moving on the water, and the like. Flying objects moving in the air include not only UAVs but also other aircraft, airships, helicopters, and the like.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without rotors.
  • the imaging system 100 is a multispectral camera for imaging that captures objects within a desired imaging range in a plurality of wavelength bands.
  • The gimbal 50 rotatably supports the camera system 100.
  • The gimbal 50 is an example of a supporting mechanism.
  • The gimbal 50 uses an actuator to rotatably support the camera system 100 around the pitch axis.
  • The gimbal 50 uses actuators to further rotatably support the camera system 100 around each of the roll axis and the yaw axis.
  • the gimbal 50 can change the posture of the camera system 100 by rotating the camera system 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 can be installed on the nose of the UAV 10, that is, on the front side.
  • the other two camera devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the imaging device 60 can detect the existence of an object included in the imaging range of the imaging device 60 and measure the distance to the object.
  • the imaging device 60 is an example of a measuring device that measures an object existing in the imaging direction of the imaging system 100.
  • the measuring device may be another sensor such as an infrared sensor or an ultrasonic sensor that measures an object existing in the imaging direction of the imaging system 100.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 may include at least one imaging device 60.
  • The UAV 10 may be equipped with at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10.
  • the viewing angle that can be set in the camera device 60 may be larger than the viewing angle that can be set in the camera system 100.
  • the imaging device 60 may also include a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can wirelessly communicate with the UAV 10.
  • The remote operation device 300 transmits, to the UAV 10, instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may indicate the height at which the UAV10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • The instruction information may include an ascent command to raise the UAV 10. The UAV 10 ascends while it is receiving the ascent command. When the UAV 10 has reached its upper altitude limit, its ascent may be restricted even if the ascent command is received.
  • FIG. 2 is a diagram showing an example of the appearance of the imaging system 100 mounted on the UAV 10.
  • the imaging system 100 is a multispectral camera that separately captures image data of each of a plurality of preset wavebands.
  • the imaging system 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR.
  • the imaging system 100 can record each image data captured by the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR as a multispectral image.
  • multispectral images can be used to predict the health and vitality of crops.
  • FIG. 3 shows an example of the functional blocks of UAV10.
  • The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the camera system 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 can receive instruction information including various instructions to the UAV control unit 30 from the remote operation device 300.
  • The memory 32 stores programs and the like that the UAV control unit 30 needs in order to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the camera system 100.
  • the memory 32 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 32 may be provided inside the UAV main body 20. It can be configured to be detachable from the UAV main body 20.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with the program stored in the memory 32.
  • The UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, or a microcontroller such as an MCU.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 propels the UAV10.
  • the propulsion part 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to cause the UAV 10 to fly.
  • the GPS receiver 41 receives a plurality of signals indicating the time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received signals.
  • the IMU42 detects the posture of the UAV10.
  • the IMU 42 detects the acceleration of the UAV 10 in the three-axis directions of front and rear, left and right, and up and down, and the angular velocities of the pitch axis, the roll axis, and the yaw axis as the attitude of the UAV 10.
  • The magnetic compass 43 detects the heading of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying altitude of the UAV10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure to altitude to detect the altitude.
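The pressure-to-altitude conversion described above can be sketched with the standard-atmosphere barometric formula; the specific constants (sea-level pressure 1013.25 hPa, exponent 1/5.255) are illustrative assumptions, not values from the patent:

```python
def pressure_to_altitude(pressure_hpa: float,
                         sea_level_hpa: float = 1013.25) -> float:
    """Convert barometric pressure (hPa) to altitude (meters) using the
    standard-atmosphere formula. Constants are illustrative assumptions."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```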
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • FIG. 4 shows an example of functional blocks of the camera system 100.
  • the imaging system 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR.
  • the imaging system 100 includes an image processor 180, a transmission unit 190, and a memory 192.
  • the imaging device 110 for R includes an image sensor 112 for R and an optical system 114.
  • the image sensor 112 for R captures an image formed by the optical system 114.
  • the R image sensor 112 includes a filter that transmits light in the red region, and outputs an R image signal that is an image signal in the red region.
  • the wavelength band of the red region is 620 nm to 750 nm.
  • the wavelength band of the red region may be a specific wavelength band in the red region, for example, it may be 663 nm to 673 nm.
  • the imaging device 120 for G includes an image sensor 122 for G and an optical system 124.
  • the image sensor 122 for G captures an image formed by the optical system 124.
  • the G image sensor 122 includes a filter that transmits light in the green region, and outputs a G image signal that is an image signal in the green region.
  • the wavelength band of the green region is 500 nm to 570 nm.
  • the wavelength band of the green region may be a specific wavelength band in the green region, for example, it may be 550 nm to 570 nm.
  • the imaging device 130 for B includes an image sensor 132 for B and an optical system 134.
  • the image sensor 132 for B captures an image formed by the optical system 134.
  • the image sensor for B 132 includes a filter that transmits light in the blue region, and outputs a B image signal that is an image signal in the blue region.
  • the wavelength band of the blue region is 450 nm to 500 nm.
  • the wavelength band of the blue region may be a specific wavelength band in the blue region, for example, it may be 465 nm to 485 nm.
  • the imaging device 140 for RE includes an image sensor 142 for RE and an optical system 144.
  • the image sensor 142 for RE captures an image formed by the optical system 144.
  • the RE image sensor 142 includes a filter that transmits light in the red edge region, and outputs an RE image signal that is an image signal in the red edge region.
  • the wavelength band of the red edge region is 705 nm to 745 nm.
  • the wavelength band of the red edge region may be 712 nm to 722 nm.
  • the NIR imaging device 150 includes an NIR image sensor 152 and an optical system 154.
  • the image sensor 152 for NIR captures the image formed by the optical system 154.
  • the image sensor for NIR 152 includes a filter that transmits light in the near infrared region, and outputs an image signal in the near infrared region, that is, an NIR image signal.
  • the wavelength band of the near infrared region is 800 nm to 2500 nm.
  • the wavelength band of the near infrared region may be 800 nm to 900 nm.
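The five wavelength regions listed above can be summarized in a small lookup table; the band limits are taken from the text, while the helper function is purely illustrative:

```python
# Wavelength bands of the five imaging devices, as described in the text.
# Each entry maps a band name to (low_nm, high_nm); the narrower example
# sub-bands from the text are noted in comments.
BANDS_NM = {
    "R":   (620, 750),   # red region (example sub-band: 663-673 nm)
    "G":   (500, 570),   # green region (example sub-band: 550-570 nm)
    "B":   (450, 500),   # blue region (example sub-band: 465-485 nm)
    "RE":  (705, 745),   # red edge region (example sub-band: 712-722 nm)
    "NIR": (800, 2500),  # near-infrared region (example sub-band: 800-900 nm)
}

def bands_of(wavelength_nm: float) -> list:
    """Return the names of all bands that contain the given wavelength."""
    return [name for name, (lo, hi) in BANDS_NM.items()
            if lo <= wavelength_nm <= hi]
```

Note that the broad band definitions overlap: a 710 nm wavelength falls in both the red and red-edge regions, which is why the example sub-bands in the text are narrower.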
  • the image processor 180 includes a multiplexer 170, an input receiving section 172, a demosaicing processing section 174, and a recording processing section 178.
  • the image processor 180 is an example of a circuit.
  • The image processor 180 may be composed of a microprocessor such as a CPU or an MPU, or a microcontroller such as an MCU.
  • the multiplexer 170 receives the image signal output from each image sensor, selects the image signal output from any image sensor according to a preset condition, and inputs it to the input receiving unit 172.
  • the demosaic processing unit 174 generates display image data based on the R image signal, the G image signal, and the B image signal input to the input receiving unit 172.
  • the demosaic processing unit 174 generates display image data by performing demosaic processing on the R image signal, the G image signal, and the B image signal.
  • the demosaic processing unit 174 can perform thinning processing on the R image signal, G image signal, and B image signal, and convert the thinning-processed R image signal, G image signal, and B image signal into Bayer array image signals to generate display signals.
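The thinning-to-Bayer step above could look like the following sketch; the RGGB pattern, even image dimensions, and ideal plane alignment are assumptions, since the text does not specify them:

```python
import numpy as np

def to_bayer_rggb(r: np.ndarray, g: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Sketch of the thinning step: combine three full-resolution color
    planes into a single RGGB Bayer mosaic by sampling each plane only at
    the pixel sites that the Bayer pattern assigns to that color.
    Assumes even dimensions and ideal plane alignment (illustrative)."""
    h, w = r.shape
    bayer = np.zeros((h, w), dtype=r.dtype)
    bayer[0::2, 0::2] = r[0::2, 0::2]  # R sites (even rows, even cols)
    bayer[0::2, 1::2] = g[0::2, 1::2]  # G sites on even rows
    bayer[1::2, 0::2] = g[1::2, 0::2]  # G sites on odd rows
    bayer[1::2, 1::2] = b[1::2, 1::2]  # B sites (odd rows, odd cols)
    return bayer
```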
  • the transmitting unit 190 transmits the image data for display to the display device.
  • the transmitting unit 190 may transmit the image data for display to the remote operation device 300.
  • the remote operation device 300 may display the image data for display on the display unit as an image of a live view.
  • the recording processing unit 178 generates recording image data based on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal input to the input receiving unit 172 and based on a preset recording format.
  • the recording processing unit 178 can generate RAW data as recording image data from the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal according to the RAW format.
  • the recording processing unit 178 may generate all-pixel recording image data without performing thinning-out processing on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal.
  • the recording processing unit 178 may store the image data for recording in the memory 192.
  • the memory 192 may be a computer-readable storage medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 192 may be provided inside the housing of the camera system 100.
  • the memory 192 may be configured to be detachable from the housing of the camera system 100.
  • the image processor 180 also includes a receiving unit 184 and a switching unit 186.
  • the receiving unit 184 receives a storage instruction to store the image data for recording in the memory 192.
  • the receiving unit 184 may receive a storage instruction from the user through an external terminal such as the remote operation device 300.
  • the receiving unit 184 may receive a storage instruction from the UAV control unit 30.
  • When the UAV control unit 30 determines that the position of the camera system 100 is a preset position, the receiving unit 184 may receive the storage instruction from the UAV control unit 30.
  • the camera system 100 may also include a GPS receiver.
  • the image processor 180 can determine whether the position of the camera system 100 is a preset position according to the position information from its own GPS receiver.
  • The switching unit 186 switches between the following two operations: generating display image data in the demosaic processing unit 174 based on the R image signal, G image signal, and B image signal input to the input receiving unit 172; and generating recording image data in the recording processing unit 178, according to a preset recording format, based on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal input to the input receiving unit 172.
  • In the imaging system 100 configured as described above, a region of interest (ROI) can be determined based on the image signals captured by the imaging device 110 for R, the imaging device 120 for G, and the imaging device 130 for B, and the shooting conditions can be determined based on that ROI.
  • the image processor 180 can determine the ROI based on the pixel values (luminance values) shown by the image signals captured by the R imaging device 110, the G imaging device 120, and the B imaging device 130, respectively.
  • However, if the image processor 180 determines the imaging conditions of each of the imaging device 110 for R, the imaging device 120 for G, and the imaging device 130 for B based on such an ROI, the imaging conditions may not be suitable for a specific subject.
  • For example, when the specific subject is a plant such as a crop and a vehicle or a road exists around the plant, the image processor 180 may determine an area containing the vehicle or the road as the ROI.
  • The luminance (brightness) distribution along the straight-line portion indicated by reference numeral 501 in the image 500 is as shown in Fig. 6. That is, because the luminance value of the vehicle 512 is high, the image processor 180 would determine the area containing the vehicle 512 as the ROI and determine the exposure based on image information such as the luminance values within that ROI. In this case, as shown in Fig. 5, the exposure is not suitable for the crop 510, and the crop 510 in the image 500 appears dark.
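The failure mode just described can be illustrated with a naive block-wise luminance ROI pick (cf. the blocks of Fig. 7); the block size and "brightest block wins" rule are illustrative assumptions:

```python
import numpy as np

def brightest_block_roi(luma: np.ndarray, block: int = 16) -> tuple:
    """Naive luminance-based ROI selection: divide the image into blocks
    and return the (row, col) index of the block with the highest mean
    luminance. As the text notes, a bright vehicle or road wins over
    darker crops, which is exactly the failure mode described."""
    h, w = luma.shape
    best, best_idx = -1.0, (0, 0)
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            m = luma[i:i + block, j:j + block].mean()
            if m > best:
                best, best_idx = m, (i // block, j // block)
    return best_idx
```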
  • In the imaging system 100, the ROI is therefore determined using an index corresponding to a characteristic of the specific subject, so that appropriate shooting conditions can be determined for that subject.
  • The characteristics of the subject are characteristics from which the location of the subject can be estimated or determined based on the image captured by the imaging device, such as the reflectance of the subject at each wavelength, its color and shape, the shooting position of the subject, the time, the season, and the like.
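For a plant subject, one index that reflects the subject's wavelength-dependent reflectance is NDVI, computed from the NIR and R images; using NDVI specifically, and the 0.4 threshold, are assumptions for illustration, since the text only calls for "an index corresponding to a characteristic of the subject":

```python
import numpy as np

def ndvi_roi_mask(nir: np.ndarray, red: np.ndarray,
                  threshold: float = 0.4, eps: float = 1e-6) -> np.ndarray:
    """Determine a vegetation ROI mask from the NIR and R images using
    NDVI = (NIR - R) / (NIR + R), a common vegetation index (a sketch;
    the index and threshold are illustrative assumptions).
    `nir` and `red` are float arrays normalized to [0, 1]."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    ndvi = (nir - red) / (nir + red + eps)
    return ndvi > threshold  # True where vegetation is likely
```

Because healthy vegetation reflects strongly in NIR and weakly in red, such a mask excludes bright non-plant areas like vehicles or roads from the ROI.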
  • the image processor 180 includes an area determination unit 181 and a shooting condition determination unit 182.
  • The region determination unit 181 determines, as the ROI, a first region within an image of a specific wavelength region captured by at least one of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • The imaging condition determination unit 182 determines the imaging control values of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR based on the image information of the region, in the image of each of those imaging devices, that corresponds to the first region.
  • the area corresponding to the first area refers to an area containing the same subject as the subject included in the first area.
  • The second area of the imaging device 110 for R corresponding to the first area of the imaging device 150 for NIR is an area, within the image captured by the imaging device 110 for R, that includes the same subject as the subject included in the first area of the imaging device 150 for NIR.
  • Similarly, the third area of the imaging device 120 for G, the fourth area of the imaging device 130 for B, and the fifth area of the imaging device 140 for RE corresponding to the first area of the imaging device 150 for NIR are areas, within the images captured by the imaging device 120 for G, the imaging device 130 for B, and the imaging device 140 for RE respectively, that include the same subject as the subject included in the first area.
  • Image information refers to information in the images captured by each imaging device, such as the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, or the imaging device 150 for NIR, and includes luminance information, color information, and the like.
  • The imaging control value may be at least one of a luminance value serving as a control value when automatic exposure control is executed, a gain value of the image sensor, an aperture value (iris value), and a shutter speed (accumulation time).
  • the imaging control value may be a focus position (lens position) that is a control value when performing autofocus control.
  • the imaging control value may be an R gain value and a B gain value that are control values when performing automatic white balance control.
  • the memory 192 can store information indicating the correspondence of the respective image coordinate systems of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • the imaging condition determining unit 182 may determine the second area, the third area, the fourth area, and the fifth area corresponding to the first area based on the information indicating the correspondence relationship.
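The correspondence between the image coordinate systems of the five imaging devices could be applied, for instance, as a per-pair geometric transform. The following is a minimal sketch, assuming the stored correspondence is representable as a calibrated 2×3 affine matrix per camera pair; the function name and the affine representation are illustrative assumptions, not details from the patent:

```python
import numpy as np

def map_region(bbox, affine):
    """Map an axis-aligned region (x, y, w, h) from one camera's image
    coordinates into another's, using a 2x3 affine transform assumed
    to be calibrated in advance (a hypothetical representation of the
    correspondence information held in memory 192)."""
    x, y, w, h = bbox
    # Transform the four corners, then take the bounding box of the result.
    corners = np.array([[x, y, 1], [x + w, y, 1],
                        [x, y + h, 1], [x + w, y + h, 1]], dtype=float)
    mapped = corners @ np.asarray(affine, dtype=float).T  # shape (4, 2)
    x0, y0 = mapped.min(axis=0)
    x1, y1 = mapped.max(axis=0)
    return (x0, y0, x1 - x0, y1 - y0)
```

With the identity transform a region maps onto itself; a calibrated transform would account for the parallax and alignment offsets between the separate lens/sensor pairs.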
  • the parameters used when determining the ROI can be determined according to the characteristics of the light reflectance unique to a specific subject. For example, when a specific subject reflects relatively much light in the near-infrared region, the region determining unit 181 may determine the ROI based on the near-infrared reflectance.
  • the region determining unit 181 may determine the first region in the NIR image as an ROI based on the NIR image of the near infrared region captured by the NIR imaging device 150.
  • the imaging condition determination unit 182 may determine the exposure control value of the R imaging device 110 based on the luminance information of the second region corresponding to the first region in the red image of the red region captured by the R imaging device 110.
  • the imaging condition determination unit 182 may determine the exposure control value of the G imaging device 120 based on the luminance information of the third region corresponding to the first region in the green image of the green region captured by the G imaging device 120.
  • the imaging condition determination unit 182 may determine the exposure control value of the B imaging device 130 based on the luminance information of the fourth region corresponding to the first region in the blue image of the blue region captured by the B imaging device 130.
  • the imaging condition determination unit 182 may determine the exposure control value of the RE imaging device 140 based on the luminance information of the fifth region corresponding to the first region in the red edge image of the red edge region captured by the RE imaging device 140.
  • the imaging condition determining unit 182 may determine the exposure control value of the NIR imaging device 150 based on the luminance information of the first region in the near infrared image of the near infrared region captured by the NIR imaging device 150.
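As an illustration of how a per-device exposure control value could be derived from the luminance of its corresponding region, the sketch below converts a region's mean luminance into an EV adjustment toward a mid-gray target. The target level of 118 and the linear-luminance assumption are illustrative choices, not values from this document:

```python
import math

def exposure_adjust_ev(roi_mean_luminance, target=118.0, eps=1e-6):
    """Return the EV change that would bring a region's mean luminance
    (0-255 scale) to the target level, assuming luminance scales
    linearly with exposure. target=118 (mid-gray) is an assumed value."""
    return math.log2(target / max(roi_mean_luminance, eps))
```

Each of the five imaging devices would apply its own adjustment computed from its own corresponding region, so a band in which the subject appears dark receives more exposure than one in which it appears bright.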
  • the area determining unit 181 may determine the first area based on the near infrared image and the red image. The area determining unit 181 may determine the first area based on the near infrared image and the green image. The area determining unit 181 may determine the first area based on the near infrared image and the red edge image.
  • When the subject is a plant, the normalized difference vegetation index (NDVI) takes a characteristic value.
  • For a healthy plant, the value of NDVI will be at or above a preset value.
  • The number of branches and leaves varies with the type of plant, and the degree of activity varies with its health. Therefore, when the specific subject is a specific plant, NDVI takes a value within a preset range.
  • the preset range can be determined according to the type of plant and the health status of the plant to be observed.
  • NDVI is represented by the following formula:

    NDVI = (IR - R) / (IR + R)

    where IR represents the reflectance in the near-infrared region and R represents the reflectance of red in the visible light region.
  • The region determining unit 181 calculates NDVI from the near-infrared image and the red image, and determines a first area whose NDVI is greater than or equal to a preset value as the ROI for each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • the area determination unit 181 divides the near-infrared image and the red image into a plurality of blocks, and can calculate the NDVI of each block based on the pixel value of each block.
  • The region determining unit 181 may determine at least one block whose NDVI is at or above a preset threshold as the ROI.
  • For example, as shown in FIG. 7, the region determining unit 181 divides the image 500 into a plurality of blocks 600 and calculates the NDVI of each block. As shown in FIG. 8, it then determines the block with the highest NDVI as the ROI 601.
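The block-wise ROI selection described above can be sketched as follows, computing NDVI from the mean NIR and red reflectance of each grid cell and picking the highest-scoring block. The 4×4 grid and the use of block means are illustrative assumptions:

```python
import numpy as np

def block_ndvi_roi(nir, red, blocks=(4, 4)):
    """Divide co-registered NIR and red reflectance images into a grid
    of blocks, compute NDVI per block from the block means, and return
    the (row, col) of the block with the highest NDVI plus the NDVI
    grid. A simplified sketch of the steps illustrated in FIGs. 7-8."""
    rows, cols = blocks
    h, w = nir.shape
    bh, bw = h // rows, w // cols
    ndvi = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            n = nir[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean()
            v = red[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean()
            ndvi[r, c] = (n - v) / (n + v + 1e-12)  # guard against 0/0
    roi = np.unravel_index(np.argmax(ndvi), ndvi.shape)
    return roi, ndvi
```

Thresholding the returned grid instead of taking the argmax would yield the "at least one block at or above a preset threshold" variant.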
  • The imaging condition determination unit 182 determines the exposure control value of each imaging device based on the luminance information of the region, in that device's image, corresponding to the first area. Because each device's exposure control value is determined from the luminance information within the NDVI-based ROI, each imaging device can capture an image with exposure suitable for the crops in the ROI. Therefore, for example, as shown in FIG. 9, an image 530 with exposure suitable for the crop 532 can be obtained.
  • A subject brighter than the crop 532, such as the vehicle 534, may be overexposed.
  • When a subject is extremely darker than the crop 532, that subject may be underexposed in the image. However, since the intent is to obtain an image focused on the crop 532, this does not affect the evaluation of the image.
  • It may be preferable to determine the ROI by an index other than NDVI.
  • The reflectance characteristics may change depending on the location, time, or season. In such cases, it is sometimes preferable to switch the index used to determine the ROI according to the location, time, season, and so on. That is, the region determining unit 181 may select the index for determining the ROI according to at least one of the type of the subject and the location, time period, or season in which the subject is photographed.
  • The region determining unit 181 may determine the ROI according to SAVI (Soil Adjusted Vegetation Index).
  • SAVI differs from NDVI in that it takes the difference in soil reflectance into account. Its standard published form is:

    SAVI = (1 + L) × (IR - R) / (IR + R + L)

    where L is 1 when the vegetation is sparse and 0.25 when it is dense.
  • The region determining unit 181 may determine the ROI according to gNDVI (Green Normalized Difference Vegetation Index). Its standard published form is:

    gNDVI = (IR - G) / (IR + G)

    where G represents the green reflectance in the visible light region.
  • The region determining unit 181 may determine the ROI according to NDRE (Normalized Difference Red Edge Index). Its standard published form is:

    NDRE = (NIR - RE) / (NIR + RE)

    where NIR represents the near-infrared reflectance and RE represents the red-edge reflectance.
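The four indices named above have compact closed forms. The definitions below follow the standard published formulas (stated as an assumption, since the document's own formula images are not reproduced here), with all inputs given as reflectances:

```python
def ndvi(nir, r):
    """Normalized Difference Vegetation Index."""
    return (nir - r) / (nir + r)

def savi(nir, r, l=0.5):
    """Soil Adjusted Vegetation Index; L ranges from 1 (sparse
    vegetation) down to 0.25 (dense vegetation)."""
    return (1 + l) * (nir - r) / (nir + r + l)

def gndvi(nir, g):
    """Green NDVI: the green band replaces the red band."""
    return (nir - g) / (nir + g)

def ndre(nir, re):
    """Normalized Difference Red Edge: the red-edge band replaces red."""
    return (nir - re) / (nir + re)
```

Selecting among these according to the subject type, location, time, or season, as the text describes, then amounts to choosing which function, and which pair of band images, feeds the ROI decision.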
  • the region determining unit 181 may select a near-infrared image and a red image or a near-infrared image and a green image as the image for determining the ROI according to the characteristics of the subject.
  • the region determining unit 181 may determine the ROI based on the selected near-infrared image and red image or near-infrared image and green image.
  • the region determining unit 181 may select a near-infrared image and a red image, a near-infrared image and a green image, or a near-infrared image and a red edge image as the image for determining the ROI according to the characteristics of the subject.
  • The region determining unit 181 may determine the ROI based on the selected near-infrared image and red image, near-infrared image and green image, or near-infrared image and red edge image.
  • the region determining unit 181 selects NDVI, SAVI, gNDVI, or NDRE according to the characteristics of the subject, and can determine the ROI according to the selected NDVI, SAVI, gNDVI, or NDRE.
  • the area determination unit 181 may select NDVI, SAVI, gNDVI, or NDRE according to the characteristics of the plant as the subject.
  • the area determining unit 181 may select NDVI, SAVI, gNDVI, or NDRE according to at least one of the type of the subject, the location where the subject was photographed, the time period, or the season.
  • The region determining unit 181 may select NDVI, SAVI, gNDVI, or NDRE as the index for determining the ROI according to the type of subject specified by the user and received through the receiving unit 184.
  • the region determining unit 181 may select NDVI, SAVI, gNDVI, or NDRE as an index for determining the ROI according to the type of crop.
  • Fig. 10 is a flowchart showing an example of a procedure for determining an exposure control value.
  • The region determining unit 181 acquires the images necessary for calculating the index used to determine the ROI (S100). For example, in order to derive NDVI, the region determining unit 181 acquires a near-infrared image captured by the imaging device 150 for NIR and a red image captured by the imaging device 110 for R.
  • the area specifying unit 181 sets a plurality of blocks in the image (S102).
  • the area specifying unit 181 may set a plurality of blocks in a coordinate system common to each of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • the area determination unit 181 derives the NDVI of each block based on the near-infrared reflectance of each block derived from each block of the near-infrared image and the red area reflectance of each block derived from each block of the red image (S104).
  • the area determination unit 181 extracts at least one block representing the NDVI corresponding to the plant from the NDVI of each of the plurality of blocks (S106).
  • the area determination section 181 may extract at least one block representing NDVI above a preset threshold.
  • the area determining part 181 may extract at least one block representing the largest NDVI above a preset threshold.
  • the area determination part 181 may extract at least one block representing NDVI that is above the preset lower limit threshold and below the preset upper threshold.
  • the area determination section 181 sets the block indicating the NDVI corresponding to the plant as an ROI (S108).
  • When no block representing the NDVI corresponding to the plant can be extracted, the region specifying unit 181 sets all the blocks as the ROI (S110).
  • the imaging condition determination unit 182 determines the exposure control value of each imaging device based on the image information of each ROI of each imaging device (S112).
  • The imaging condition determination unit 182 can determine the exposure control values of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR based on the luminance information of the region corresponding to the ROI in each of the images captured by those devices.
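Step S112 can be sketched as follows, assuming ROI blocks have already been selected and that all band images are co-registered on a common block grid. The EV-toward-target formulation and the target level are illustrative assumptions, not part of this document:

```python
import math
import numpy as np

def exposure_per_device(band_images, roi_blocks, blocks=(4, 4), target=118.0):
    """For each band image (e.g. {'R': ..., 'NIR': ...}), average the
    luminance over the selected ROI blocks and return the EV change
    that would bring that average to the target level (S112 sketch)."""
    rows, cols = blocks
    out = {}
    for name, img in band_images.items():
        h, w = img.shape
        bh, bw = h // rows, w // cols
        means = [img[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean()
                 for r, c in roi_blocks]
        roi_mean = sum(means) / len(means)
        out[name] = math.log2(target / max(roi_mean, 1e-6))
    return out
```

Because each device is adjusted from its own band's ROI luminance, a band in which the crop reflects little light receives a larger positive EV adjustment than one in which it reflects strongly.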
  • According to the imaging system 100, the ROI can be determined based on an index corresponding to the characteristics of the subject, and the imaging conditions can then be determined with reference to that ROI. As a result, for example, even when another high-brightness subject such as a vehicle is present around the plant of interest, the imaging system 100 can capture an image with exposure optimal for the plant.
  • FIG. 11 is a diagram showing another example of the appearance of the imaging system 100 mounted on the UAV 10.
  • The imaging system 100 shown in FIG. 11 differs from the imaging system 100 shown in FIG. 2 in that it further includes an imaging device 160 for RGB.
  • The imaging device 160 for RGB may be configured like a normal camera, including an optical system and an image sensor.
  • the image sensor may include a filter that is arranged in a Bayer array and transmits light in the red region, a filter that transmits light in the green region, and a filter that transmits light in the blue region.
  • the RGB imaging device 160 can output RGB images.
  • the wavelength band of the red region may be 620 nm to 750 nm.
  • the wavelength band of the green region may be 500 nm to 570 nm.
  • The wavelength band of the blue region may be 450 nm to 500 nm.
  • the region determining unit 181 may determine the first region in the near-infrared image as an ROI based on the near-infrared image captured by the NIR imaging device 150 and the red image captured by the R imaging device 110.
  • The region determining unit 181 may derive NDVI from the near-infrared image captured by the imaging device 150 for NIR and the red image captured by the imaging device 110 for R, and determine a first region in the near-infrared image where NDVI is at or above a preset threshold as the ROI.
  • the imaging condition determination unit 182 may determine the imaging control value of the RGB imaging device 160 based on the image information of the region corresponding to the first region in the RGB image captured by the RGB imaging device 160.
  • the imaging condition determination unit 182 may determine the exposure control value of the RGB imaging device 160 based on the luminance information of the region corresponding to the first region in the RGB image captured by the RGB imaging device 160.
  • FIG. 12 shows an example of a computer 1200 that may fully or partially embody various aspects of the present invention.
  • The program installed on the computer 1200 can cause the computer 1200 to perform operations associated with the device according to the embodiments of the present invention, or to function as one or more "parts" of that device. Alternatively, the program can cause the computer 1200 to perform those operations or to realize those one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • The computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program dependent on the hardware of the computer 1200.
  • The program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • An apparatus or method may be constituted by realizing operations or processing of information through the use of the computer 1200.
  • the CPU 1212 may execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • The communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • Various types of information such as various types of programs, data, tables, and databases can be stored in a recording medium and subjected to information processing.
  • The CPU 1212 can perform various types of processing on the data, as specified by the program's instruction sequence and described throughout this disclosure, including various operations, information processing, conditional judgments, conditional branches, unconditional branches, and information retrieval/replacement, and can write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium.
  • The CPU 1212 may retrieve, from among the multiple entries, an entry whose value of a specified first attribute meets a predetermined condition, read the value of the second attribute stored in that entry, and thereby obtain the value of the second attribute associated with the first attribute that meets the predetermined condition.
  • the above-mentioned programs or software modules may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

A multispectral camera capable of capturing images at multiple spectral bands sometimes cannot appropriately configure photographing conditions suitable for a specific object. Provided is a determination device comprising a circuit, the circuit being configured to: determine, according to a first image of a first wavelength region captured by a first image sensor included in a first camera device, a first region in the first image; and determine, according to image information of a second region in a second image of a second wavelength region captured by a second image sensor included in a second camera device, a camera control value of the second camera device, the second region corresponding to the first region.

Description

Determination device, camera system, and moving body
Technical field
The present invention relates to a determination device, a camera system, and a moving body.
Background art
Patent Document 1 discloses a sensing system that, when sensing an area containing an object under inspection and a reference reflection area, senses the reference reflection area in each of the wavebands used for sensing the object under inspection, the reference reflection area having a reflectance corresponding to the object under inspection.
Background art documents:
[Patent Documents]
[Patent Document 1] International Publication No. 2017/221756
Summary of the invention
Technical problem to be solved by the invention:
A multispectral camera that captures images in multiple wavelength bands may be unable to appropriately set imaging conditions suitable for a specific object.
Means for solving the technical problem:
A determination device according to one aspect of the present invention may include a circuit configured to: determine a first region in a first image based on the first image of a first wavelength region captured by a first image sensor included in a first imaging device; and determine an imaging control value of a second imaging device based on image information of a second region, corresponding to the first region, in a second image of a second wavelength region captured by a second image sensor included in the second imaging device.
The image information may be luminance information, and the imaging control value may be an exposure control value.
The circuit may determine the first region based on the first image and the second image.
The circuit may select, as the images used for determining the first region, the first image and the second image, or the first image and a third image of a third wavelength region captured by a third image sensor included in a third imaging device. The circuit may determine the first region based on the selected first image and second image, or first image and third image. The circuit may determine the imaging control value of the first imaging device based on the image information of the first region in the first image, determine the imaging control value of the second imaging device based on the image information of the second region in the second image, and determine the imaging control value of the third imaging device based on the image information of a third region, corresponding to the first region, in the third image.
The circuit may select the first image and the second image, or the first image and the third image, according to the characteristics of the subject.
The first wavelength region may be a wavelength region of the near-infrared region. The second wavelength region may be a wavelength region of the red region. The third wavelength region may be a wavelength region of the green region or the red edge region.
The circuit may select, as the images used for determining the first region, the first image and the second image, the first image and a third image of a third wavelength region captured by a third image sensor included in a third imaging device, or the first image and a fourth image of a fourth wavelength region captured by a fourth image sensor included in a fourth imaging device. The circuit may determine the first region based on the selected first image and second image, first image and third image, or first image and fourth image. The circuit may determine the imaging control value of the first imaging device based on the image information of the first region in the first image, determine the imaging control value of the second imaging device based on the image information of the second region in the second image, determine the imaging control value of the third imaging device based on the image information of a third region, corresponding to the first region, in the third image, and determine the imaging control value of the fourth imaging device based on the image information of a fourth region, corresponding to the first region, in the fourth image.
The circuit may select the first image and the second image, the first image and the third image, or the first image and the fourth image according to the characteristics of the subject.
The first wavelength region may be a wavelength region of the near-infrared region. The second wavelength region may be a wavelength region of the red region. The third wavelength region may be a wavelength region of the green region. The fourth wavelength region may be a wavelength region of the red edge region.
A determination device according to one aspect of the present invention may include a circuit configured to: determine a first region in a first image based on the first image of a first wavelength region captured by a first image sensor included in a first imaging device and a second image of a second wavelength region captured by a second image sensor included in a second imaging device; and determine an imaging control value of a third imaging device based on image information of a third region, corresponding to the first region, in a third image of a third wavelength region captured by a third image sensor included in the third imaging device.
The image information may be luminance information, and the imaging control value may be an exposure control value.
The first wavelength region may be a wavelength region of the near-infrared region. The second wavelength region may be a wavelength region of the red region, the green region, or the red edge region. The third wavelength region may be a wavelength region of the red region, the green region, and the blue region.
A camera system according to one aspect of the present invention may include the above determination device. The camera system may include the first imaging device and the second imaging device.
A moving body according to one aspect of the present invention may be a moving body that includes the above camera system and moves.
A determination method according to one aspect of the present invention may include a stage of determining a first region in a first image based on the first image of a first wavelength region captured by a first image sensor included in a first imaging device. The determination method may include a stage of determining an imaging control value of a second imaging device based on image information of a second region, corresponding to the first region, in a second image of a second wavelength region captured by a second image sensor included in the second imaging device.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above determination device.
According to one aspect of the present invention, a multispectral camera that captures images in multiple wavelength bands can appropriately set shooting conditions suitable for a specific subject.
Note that the above summary does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Description of the drawings
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) and a remote operation device.
FIG. 2 is a diagram showing an example of the appearance of an imaging system mounted on the UAV.
FIG. 3 is a diagram showing an example of the functional blocks of the UAV.
FIG. 4 is a diagram showing an example of the functional blocks of the imaging system.
FIG. 5 is a diagram showing an example of an image captured by the imaging system.
FIG. 6 is a luminance distribution diagram of a part of the image shown in FIG. 5.
FIG. 7 is a diagram for explaining the blocks into which an image is divided.
FIG. 8 is a diagram for explaining a region of interest in an image.
FIG. 9 is a diagram showing an example of an image captured by the imaging system.
FIG. 10 is a flowchart showing an example of a procedure for determining an exposure control value.
FIG. 11 is a diagram showing another example of the appearance of the imaging system mounted on the UAV.
FIG. 12 is a diagram for explaining an example of the hardware configuration.
符号说明:Symbol Description:
10   UAV10 UAV
20   UAV主体20 UAV subject
30   UAV控制部30 UAV Control Department
32   存储器32 Memory
36   通信接口36 Communication interface
40   推进部40 Promotion Department
41   GPS接收器41 GPS receiver
42   惯性测量装置42 Inertial measurement device
43   磁罗盘43 Magnetic compass
44   气压高度计44 Barometric altimeter
45   温度传感器45 Temperature sensor
46   湿度传感器46 Humidity sensor
50   万向节50 Universal joint
60   摄像装置60 Camera device
100  摄像系统100 Camera system
110  R用摄像装置110 Camera device for R
112  R用图像传感器112 Image sensor for R
114  光学系统114 Optical system
120  G用摄像装置120 Camera device for G
122  G用图像传感器122 Image sensor for G
124  光学系统124 Optical system
130  B用摄像装置130 Camera device for B
132  B用图像传感器132 Image sensor for B
134  光学系统134 Optical system
140  RE用摄像装置140 Camera device for RE
142  RE用图像传感器142 Image sensor for RE
144  光学系统144 Optical system
150  NIR用摄像装置150 NIR camera device
152  NIR用图像传感器152 Image sensor for NIR
154  光学系统154 Optical system
160  RGB用摄像装置160 RGB camera
170  复用器170 Multiplexer
172  输入接收部172 Input receiving unit
174  去马赛克处理部174 Demosaic processing unit
178  记录处理部178 Recording processing unit
180  图像处理器180 Image processor
181  区域确定部181 Region determination unit
182  拍摄条件确定部182 Imaging condition determination unit
184  接收部184 Receiving unit
186  切换部186 Switching unit
190  发送部190 Transmitting unit
192  存储器192 Memory
300  远程操作装置300 remote operation device
1200 计算机1200 Computer
1210 主机控制器1210 Host Controller
1212 CPU1212 CPU
1214 RAM1214 RAM
1220 输入/输出控制器1220 Input/Output Controller
1222 通信接口1222 Communication interface
1230 ROM1230 ROM
具体实施方式Detailed Description
以下,通过发明的实施方式来对本发明进行说明,但是以下的实施方式并不限定权利要求书所涉及的发明。此外,实施方式中所说明的特征的所有组合对于发明的解决方案未必是必须的。对本领域普通技术人员来说,显然可以对以下实施方式加以各种变更或改良。从权利要求书的描述显而易见的是,加以了这样的变更或改良的方式都可包含在本发明的技术范围之内。Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, all the combinations of the features described in the embodiments are not necessarily necessary for the solution of the invention. It is obvious to a person skilled in the art that various changes or improvements can be made to the following embodiments. It is obvious from the description of the claims that all such changes or improvements can be included in the technical scope of the present invention.
权利要求书、说明书、说明书附图以及说明书摘要中包含作为著作权所保护对象的事项。任何人只要如专利局的文档或者记录所表示的那样进行这些文件的复制,著作权人就无法异议。但是,在除此以外的情况下,保留一切的著作权。The claims, the description, the drawings of the description, and the abstract of the description include matters that are the subject of copyright protection. As long as anyone makes copies of these files as indicated in the patent office files or records, the copyright owner cannot object. However, in other cases, all copyrights are reserved.
本发明的各种实施方式可参照流程图及框图来描述，这里，框可表示(1)执行操作的过程的阶段或者(2)具有执行操作的作用的装置的“部”。特定的阶段和“部”可以通过可编程电路和/或处理器来实现。专用电路可以包括数字和/或模拟硬件电路。可以包括集成电路(IC)和/或分立电路。可编程电路可以包括可重构硬件电路。可重构硬件电路可以包括逻辑与、逻辑或、逻辑异或、逻辑与非、逻辑或非、及其它逻辑操作、触发器、寄存器、现场可编程门阵列(FPGA)、可编程逻辑阵列(PLA)等存储器元件等。Various embodiments of the present invention may be described with reference to flowcharts and block diagrams. Here, a block may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device that has the function of performing an operation. Specific stages and "parts" can be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as flip-flops, registers, and memory elements such as field programmable gate arrays (FPGA) and programmable logic arrays (PLA).
计算机可读介质可以包括能够存储由合适设备执行的指令的任何有形设备。其结果是，其上存储有指令的计算机可读介质具备一种包括指令的产品，该指令可被执行以创建用于执行流程图或框图所指定的操作的手段。作为计算机可读介质的示例，可以包括电子存储介质、磁存储介质、光存储介质、电磁存储介质、半导体存储介质等。作为计算机可读介质的更具体的示例，可以包括floppy(注册商标)disk、软磁盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或者闪存)、电可擦可编程只读存储器(EEPROM)、静态随机存取存储器(SRAM)、光盘只读存储器(CD-ROM)、数字多用途光盘(DVD)、蓝光(RTM)光盘、记忆棒、集成电路卡等。The computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product including instructions that can be executed to create means for performing the operations specified by the flowchart or block diagram. Examples of the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
计算机可读指令可以包括由一种或多种编程语言的任意组合描述的源代码或者目标代码中的任意一个。源代码或者目标代码包括传统的程序式编程语言。传统的程序式编程语言可以为汇编指令、指令集架构(ISA)指令、机器指令、与机器相关的指令、微代码、固件指令、状态设置数据、或者Smalltalk(注册商标)、JAVA(注册商标)、C++等面向对象编程语言以及“C”编程语言或者类似的编程语言。计算机可读指令可以在本地或者经由局域网(LAN)、互联网等广域网(WAN)提供给通用计算机、专用计算机或者其它可编程数据处理装置的处理器或可编程电路。处理器或可编程电路可以执行计算机可读指令，以创建用于执行流程图或框图所指定操作的手段。作为处理器的示例，包括计算机处理器、处理单元、微处理器、数字信号处理器、控制器、微控制器等。The computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, as well as the "C" programming language or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing device. The processor or programmable circuit can execute the computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram. Examples of the processor include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
图1表示无人驾驶航空器(UAV)10及远程操作装置300的外观的一个示例。UAV10包括UAV主体20、万向节50、多个摄像装置60、以及摄像系统100。万向节50及摄像系统100为摄像系统的一个示例。UAV10为移动体的一个示例。移动体包括在空中移动的飞行物体、在地面移动的车辆、在水上移动的船舶等的概念。在空中移动的飞行体是指不仅包括UAV、还包括在空中移动的其它的飞行器、飞艇、直升机等的概念。FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV main body 20, a universal joint 50, a plurality of camera devices 60, and a camera system 100. The universal joint 50 and the camera system 100 are an example of a camera system. UAV10 is an example of a moving body. Moving objects include concepts such as flying objects moving in the air, vehicles moving on the ground, and ships moving on the water. Flying objects moving in the air refer to concepts that include not only UAVs, but also other aircraft, airships, and helicopters that move in the air.
UAV主体20包括多个旋翼。多个旋翼为推进部的一个示例。UAV主体20通过控制多个旋翼的旋转而使UAV10飞行。UAV主体20使用例如四个旋翼来使UAV10飞行。旋翼的数量不限于四个。另外,UAV10也可以是没有旋翼的固定翼机。The UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section. The UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors. The UAV main body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four. In addition, UAV10 can also be a fixed-wing aircraft without rotors.
摄像系统100是在多个波段分别对所期望的摄像范围内的对象进行拍摄的拍摄用多光谱照相机。万向节50可旋转地支撑摄像系统100。万向节50为支撑机构的一个示例。例如,万向节50使用致动器以俯仰轴可旋转地支撑摄像系统100。万向节50使用致动器进一步分别以滚转轴和偏航轴为中心可旋转地支撑摄像系统100。万向节50可通过使摄像系统100以偏航轴、俯仰轴以及滚转轴中的至少一个为中心旋转,来变更摄像系统100的姿势。The imaging system 100 is a multispectral camera for imaging that captures objects within a desired imaging range in a plurality of wavelength bands. The universal joint 50 rotatably supports the camera system 100. The universal joint 50 is an example of a supporting mechanism. For example, the gimbal 50 uses an actuator to rotatably support the camera system 100 with a pitch axis. The universal joint 50 uses an actuator to further rotatably support the camera system 100 around the roll axis and the yaw axis, respectively. The gimbal 50 can change the posture of the camera system 100 by rotating the camera system 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
多个摄像装置60是为了控制UAV10的飞行而对UAV10的周围进行拍摄的传感用相机。两个摄像装置60可以设置于UAV10的机头、即正面。并且，其它两个摄像装置60可以设置于UAV10的底面。正面侧的两个摄像装置60可以成对，起到所谓的立体相机的作用。底面侧的两个摄像装置60也可以成对，起到立体相机的作用。摄像装置60可以检测到摄像装置60的摄像范围所包含的对象的存在以及测量出与对象间的距离。摄像装置60为对存在于摄像系统100的摄像方向上的对象进行测量的测量装置的一个示例。测量装置也可以是对存在于摄像系统100的摄像方向上的对象进行测量的红外传感器、超声波传感器等的其它的传感器。可以根据由多个摄像装置60所拍摄的图像来生成UAV10周围的三维空间数据。UAV10所具备的摄像装置60的数量不限于四个。UAV10具备至少一个摄像装置60即可。UAV10也可以在UAV10的机头、机尾、侧面、底面及顶面分别具备至少一个摄像装置60。摄像装置60中可设定的视角可大于摄像系统100中可设定的视角。摄像装置60也可以包括单焦点镜头或鱼眼镜头。The plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10. Two camera devices 60 can be installed on the nose of the UAV 10, that is, on the front side. In addition, the other two camera devices 60 may be provided on the bottom surface of the UAV 10. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. The imaging device 60 can detect the existence of an object included in the imaging range of the imaging device 60 and measure the distance to the object. The imaging device 60 is an example of a measuring device that measures an object existing in the imaging direction of the imaging system 100. The measuring device may be another sensor such as an infrared sensor or an ultrasonic sensor that measures an object existing in the imaging direction of the imaging system 100. The three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four. The UAV 10 may include at least one imaging device 60. The UAV 10 may be equipped with at least one camera 60 on each of the nose, tail, side, bottom, and top of the UAV 10. The viewing angle that can be set in the camera device 60 may be larger than the viewing angle that can be set in the camera system 100. The imaging device 60 may also include a single focus lens or a fisheye lens.
远程操作装置300与UAV10通信,以远程操作UAV10。远程操作装置300可以与UAV10进行无线通信。远程操作装置300向UAV10发送表示上升、下降、加速、减速、前进、后退、旋转等与UAV10的移动有关的各种指令的指示信息。指示信息包括例如使UAV10的高度上升的指示信息。指示信息可以表示UAV10应该位于的高度。UAV10进行移动,以位于从远程操作装置300接收的指示信息所表示的高度。指示信息可以包括使UAV10上升的上升指令。UAV10在接受上升指令的期间上升。在UAV10的高度已达到上限高度时,即使接受上升指令,也可以限制UAV10上升。The remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10. The remote operation device 300 can wirelessly communicate with the UAV 10. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating. The instruction information includes, for example, instruction information for raising the height of the UAV 10. The indication information may indicate the height at which the UAV10 should be located. The UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascending instruction to raise the UAV10. UAV10 rises while receiving the rise command. When the height of UAV10 has reached the upper limit height, even if the ascending instruction is accepted, the ascent of UAV10 can be restricted.
图2是示出搭载于UAV10上的摄像系统100的外观的一个示例的图。摄像系统100是对预设的多个波段每个波段的图像数据分别进行拍摄的多光谱照相机。摄像系统100包括R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140以及NIR用摄像装置150。摄像系统100能够将由R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140以及NIR用摄像装置150拍摄的各个图像数据作为多光谱图像进行记录。例如,多光谱图像可用于对农作物的健康状态以及活力进行预测。FIG. 2 is a diagram showing an example of the appearance of the imaging system 100 mounted on the UAV 10. The imaging system 100 is a multispectral camera that separately captures image data of each of a plurality of preset wavebands. The imaging system 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR. The imaging system 100 can record each image data captured by the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR as a multispectral image. For example, multispectral images can be used to predict the health and vitality of crops.
图3示出UAV10的功能块的一个示例。UAV10包括UAV控制部30、存储器32、通信接口36、推进部40、GPS接收器41、惯性测量装置42、磁罗盘43、气压高度计44、温度传感器45、湿度传感器46、万向节50、摄像装置60及摄像系统100。FIG. 3 shows an example of the functional blocks of UAV10. UAV10 includes UAV control unit 30, memory 32, communication interface 36, propulsion unit 40, GPS receiver 41, inertial measurement device 42, magnetic compass 43, barometric altimeter 44, temperature sensor 45, humidity sensor 46, universal joint 50, camera The device 60 and the camera system 100.
通信接口36与远程操作装置300等其它装置通信。通信接口36可以从远程操作装置300接收包括对UAV控制部30的各种指令的指示信息。存储器32存储UAV控制部30对推进部40、GPS接收器41、惯性测量装置(IMU)42、磁罗盘43、气压高度计44、温度传感器45、湿度传感器46、万向节50、摄像装置60及摄像系统100进行控制所需的程序等。存储器32可以为计算机可读记录介质，可以包括SRAM、DRAM、EPROM、EEPROM、USB存储器等闪存中的至少一个。存储器32可以设置在UAV主体20的内部。其可以设置成可从UAV主体20上拆卸下来。The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 can receive instruction information including various instructions to the UAV control unit 30 from the remote operation device 300. The memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the universal joint 50, the imaging device 60, and the imaging system 100. The memory 32 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 32 may be provided inside the UAV main body 20. It may be configured to be detachable from the UAV main body 20.
UAV控制部30按照储存在存储器32中的程序来控制UAV10的飞行及拍摄。UAV控制部30可以由CPU或MPU等微处理器、以及MCU等微控制器等构成。UAV控制部30按照经由通信接口36从远程操作装置300接收到的指令来控制UAV10的飞行及拍摄。推进部40推进UAV10。推进部40包括多个旋翼和使多个旋翼旋转的多个驱动电机。推进部40按照来自UAV控制部30的指令,经由多个驱动电机使多个旋翼旋转,以使UAV10飞行。The UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with the program stored in the memory 32. The UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, and a microcontroller such as an MCU. The UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36. The propulsion unit 40 propels the UAV10. The propulsion part 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors. The propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to cause the UAV 10 to fly.
GPS接收器41接收表示从多个GPS卫星发送的时间的多个信号。GPS接收器41根据所接收的多个信号来计算出GPS接收器41的位置(纬度及经度)、即UAV10的位置(纬度及经度)。IMU42检测UAV10的姿势。IMU42检测UAV10的前后、左右以及上下的三轴方向的加速度和俯仰轴、滚转轴以及偏航轴的三轴方向的角速度，作为UAV10的姿势。磁罗盘43检测UAV10的机头的方位。气压高度计44检测UAV10的飞行高度。气压高度计44检测UAV10周围的气压，并将检测到的气压换算为高度，以检测高度。温度传感器45检测UAV10周围的温度。湿度传感器46检测UAV10周围的湿度。The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10, based on the received signals. The IMU 42 detects the posture of the UAV 10. The IMU 42 detects, as the posture of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flying altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
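The pressure-to-altitude conversion performed by the barometric altimeter 44 can be sketched as follows. The function name and the standard-atmosphere constants (1013.25 hPa reference pressure, 44330 m, exponent 5.255) are illustrative assumptions from the international barometric formula, not values taken from this description.

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Convert a measured barometric pressure (hPa) to an altitude (m)
    using the international standard atmosphere formula (troposphere)."""
    # 44330 m and exponent 1/5.255 are standard-atmosphere constants.
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# At exactly sea-level pressure the computed altitude is zero.
print(round(pressure_to_altitude_m(1013.25), 3))  # 0.0
```

In practice the sea-level reference pressure would be updated from local weather data before flight, since the conversion is only as accurate as that reference.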
图4示出了摄像系统100的功能块的一个示例。摄像系统100包括R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140以及NIR用摄像装置150。摄像系统100包括图像处理器180、发送部190及存储器192。FIG. 4 shows an example of functional blocks of the camera system 100. The imaging system 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR. The imaging system 100 includes an image processor 180, a transmission unit 190, and a memory 192.
R用摄像装置110包括R用图像传感器112及光学系统114。R用图像传感器112对由光学系统114所成的像进行拍摄。R用图像传感器112包括使红色区域波段的光透过的滤光片,并输出红色区域波段的图像信号即R图像信号。例如,红色区域的波段是620nm~750nm。红色区域的波段可以是红色区域中特定的波段,例如可以是663nm~673nm。The imaging device 110 for R includes an image sensor 112 for R and an optical system 114. The image sensor 112 for R captures an image formed by the optical system 114. The R image sensor 112 includes a filter that transmits light in the red region, and outputs an R image signal that is an image signal in the red region. For example, the wavelength band of the red region is 620 nm to 750 nm. The wavelength band of the red region may be a specific wavelength band in the red region, for example, it may be 663 nm to 673 nm.
G用摄像装置120包括G用图像传感器122及光学系统124。G用图像传感器122对由光学系统124所成的像进行拍摄。G用图像传感器122包括使绿色区域波段的光透过的滤波器,并输出绿色区域波段的图像信号即G图像信号。例如,绿色区域的波段是500nm~570nm。绿色区域的波段可以是绿色区域中特定的波段,例如可以是550nm~570nm。The imaging device 120 for G includes an image sensor 122 for G and an optical system 124. The image sensor 122 for G captures an image formed by the optical system 124. The G image sensor 122 includes a filter that transmits light in the green region, and outputs a G image signal that is an image signal in the green region. For example, the wavelength band of the green region is 500 nm to 570 nm. The wavelength band of the green region may be a specific wavelength band in the green region, for example, it may be 550 nm to 570 nm.
B用摄像装置130包括B用图像传感器132及光学系统134。B用图像传感器132对由光学系统134所成的像进行拍摄。B用图像传感器132包括使蓝色区域波段的光透过的滤光片，并输出蓝色区域波段的图像信号即B图像信号。例如，蓝色区域的波段是450nm~500nm。蓝色区域的波段可以是蓝色区域中特定的波段，例如可以是465nm~485nm。The imaging device 130 for B includes an image sensor 132 for B and an optical system 134. The image sensor 132 for B captures an image formed by the optical system 134. The image sensor 132 for B includes a filter that transmits light in the blue region, and outputs a B image signal that is an image signal in the blue region. For example, the wavelength band of the blue region is 450 nm to 500 nm. The wavelength band of the blue region may be a specific wavelength band in the blue region, for example, it may be 465 nm to 485 nm.
RE用摄像装置140包括RE用图像传感器142及光学系统144。RE用图像传感器142对由光学系统144所成的像进行拍摄。RE用图像传感器142包括使红色边缘区域波段的光透过的滤光片,并输出红色边缘区域波段的图像信号即RE图像信号。例如,红色边缘区域的波段是705nm~745nm。红色边缘区域的波段可以是712nm~722nm。The imaging device 140 for RE includes an image sensor 142 for RE and an optical system 144. The image sensor 142 for RE captures an image formed by the optical system 144. The RE image sensor 142 includes a filter that transmits light in the red edge region, and outputs an RE image signal that is an image signal in the red edge region. For example, the wavelength band of the red edge region is 705 nm to 745 nm. The wavelength band of the red edge region may be 712 nm to 722 nm.
NIR用摄像装置150包括NIR用图像传感器152及光学系统154。NIR用图像传感器152对由光学系统154所成的像进行拍摄。NIR用图像传感器152包括使近红外线区域波段的光透过的滤光片,并输出近红外线区域波段的图像信号即NIR图像信号。例如,近红外线区域的波段是800nm~2500nm。近红外线区域的波段可以是800nm至900nm。The NIR imaging device 150 includes an NIR image sensor 152 and an optical system 154. The image sensor 152 for NIR captures the image formed by the optical system 154. The image sensor for NIR 152 includes a filter that transmits light in the near infrared region, and outputs an image signal in the near infrared region, that is, an NIR image signal. For example, the wavelength band of the near infrared region is 800 nm to 2500 nm. The wavelength band of the near infrared region may be 800 nm to 900 nm.
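The example wavelength bands given for the five imaging devices can be collected into a small lookup table. The limits below are the example values from the text; the `BANDS_NM` name and the `bands_containing` helper are illustrative additions.

```python
# Example band limits (nm) taken from the description above.
BANDS_NM = {
    "R":   (620, 750),   # red region
    "G":   (500, 570),   # green region
    "B":   (450, 500),   # blue region
    "RE":  (705, 745),   # red edge region
    "NIR": (800, 2500),  # near-infrared region
}

def bands_containing(wavelength_nm):
    """Return the names of all bands whose range covers the wavelength."""
    return [name for name, (lo, hi) in BANDS_NM.items()
            if lo <= wavelength_nm <= hi]

print(bands_containing(710))  # ['R', 'RE'] -- red and red edge overlap here
```

Note that the broad red region overlaps the red edge region, which is why each device narrows its filter to a specific sub-band in practice.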
图像处理器180包括复用器170、输入接收部172、去马赛克处理部174以及记录处理部178。图像处理器180是电路的一个示例。图像处理器180可以由CPU或者MPU等微处理器、MCU等微控制器组成。The image processor 180 includes a multiplexer 170, an input receiving section 172, a demosaicing processing section 174, and a recording processing section 178. The image processor 180 is an example of a circuit. The image processor 180 may be composed of a microprocessor such as a CPU or an MPU, and a microcontroller such as an MCU.
复用器170接收从各个图像传感器输出的图像信号,并根据预设条件对从任意一个图像传感器输出的图像信号进行选择,输入到输入接收部172。The multiplexer 170 receives the image signal output from each image sensor, selects the image signal output from any image sensor according to a preset condition, and inputs it to the input receiving unit 172.
去马赛克处理部174根据输入到输入接收部172中的R图像信号、G图像信号以及B图像信号生成显示用图像数据。去马赛克处理部174通过对R图像信号、G图像信号以及B图像信号实施去马赛克处理从而生成显示用图像数据。去马赛克处理部174可以通过对R图像信号、G图像信号以及B图像信号实施稀疏处理,将稀疏处理后的R图像信号、G图像信号以及B图像信号转换为拜耳阵列的图像信号,生成显示用图像数据。发送部190将显示用图像数据发送到显示装置。例如,发送部190可以向远程操作装置300发送显示用图像数据。远程操作装置300可以在显示部上对显示用图像数据进行显示以作为实时视图的图像。The demosaic processing unit 174 generates display image data based on the R image signal, the G image signal, and the B image signal input to the input receiving unit 172. The demosaic processing unit 174 generates display image data by performing demosaic processing on the R image signal, the G image signal, and the B image signal. The demosaic processing unit 174 can perform thinning processing on the R image signal, G image signal, and B image signal, and convert the thinning-processed R image signal, G image signal, and B image signal into Bayer array image signals to generate display signals. Image data. The transmitting unit 190 transmits the image data for display to the display device. For example, the transmitting unit 190 may transmit the image data for display to the remote operation device 300. The remote operation device 300 may display the image data for display on the display unit as an image of a live view.
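The thinning of full-resolution R, G, and B signals into a Bayer-array image signal can be sketched as follows. The RGGB layout and the function name are assumptions for illustration; the description does not fix a particular pattern.

```python
import numpy as np

def to_bayer_rggb(r, g, b):
    """Combine full-resolution R, G, B planes (one per sensor) into a
    single RGGB Bayer mosaic by keeping one channel per pixel site."""
    h, w = r.shape
    bayer = np.empty((h, w), dtype=r.dtype)
    bayer[0::2, 0::2] = r[0::2, 0::2]  # R at even row, even col
    bayer[0::2, 1::2] = g[0::2, 1::2]  # G at even row, odd col
    bayer[1::2, 0::2] = g[1::2, 0::2]  # G at odd row, even col
    bayer[1::2, 1::2] = b[1::2, 1::2]  # B at odd row, odd col
    return bayer

r = np.full((4, 4), 100); g = np.full((4, 4), 150); b = np.full((4, 4), 200)
m = to_bayer_rggb(r, g, b)
print(m[0, 0], m[0, 1], m[1, 1])  # 100 150 200
```

The resulting mosaic carries one sample per pixel, which is why it is suitable as a compact live-view signal while the full-resolution planes are reserved for recording.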
记录处理部178根据输入到输入接收部172中的R图像信号、G图像信号、B图像信号、RE图像信号以及NIR图像信号并根据预设的记录格式生成记录用图像数据。记录处理部178可以根据RAW格式将R图像信号、G图像信号、B图像信号、RE图像信号以及NIR图像信号生成RAW数据作为记录用图像数据。记录处理部178可以不对R图像信号、G图像信号、B图像信号、RE图像信号以及NIR图像信号分别进行稀疏处理而生成全像素的记录用图像数据。记录处理部178可以将记录用图像数据存储在存储器192中。存储器192可以为计算机可读存储介质,可以包含SRAM、DRAM、EPROM、EEPROM及USB存储器等闪存中的至少一个。存储器192可以设置于摄像系统100的壳体内部。存储器192可以设置成可从摄像系统100的壳体上拆卸下来。The recording processing unit 178 generates recording image data based on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal input to the input receiving unit 172 and based on a preset recording format. The recording processing unit 178 can generate RAW data as recording image data from the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal according to the RAW format. The recording processing unit 178 may generate all-pixel recording image data without performing thinning-out processing on the R image signal, G image signal, B image signal, RE image signal, and NIR image signal. The recording processing unit 178 may store the image data for recording in the memory 192. The memory 192 may be a computer-readable storage medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 192 may be provided inside the housing of the camera system 100. The memory 192 may be configured to be detachable from the housing of the camera system 100.
图像处理器180还包括接收部184及切换部186。接收部184接收将记录用图像数据存储到存储器192中的存储指示。接收部184可以通过远程操作装置300等外部终端接收来自用户的存储指示。当摄像系统100的位置为预设位置时,接收部184可以从UAV控制部30接收存储指示。当UAV10的位置为预设位置时,UAV控制部30判断出摄像系统100的位置为预设位置,从而接收部184可以从UAV控制部30接收存储指示。摄像系统100也可以包括GPS接收器。在该情况下,图像处理器180可以根据来自自身GPS接收器的位置信息判断摄像系统100的位置是否为预设位置。切换部186对下述两种方式进行切换,一是根据输入到输入接收部172中的R图像信号、G图像信号以及B图像信号在去马赛克处理部174生成显示用图像数据,二是根据输入到输入接收部172中的R图像信号、G图像信号、B图像信号、RE图像信号以及NIR图像信号并根据预设的记录格式在记录处理部178生成记录用图像数据。The image processor 180 also includes a receiving unit 184 and a switching unit 186. The receiving unit 184 receives a storage instruction to store the image data for recording in the memory 192. The receiving unit 184 may receive a storage instruction from the user through an external terminal such as the remote operation device 300. When the position of the camera system 100 is a preset position, the receiving unit 184 may receive a storage instruction from the UAV control unit 30. When the position of the UAV 10 is the preset position, the UAV control unit 30 determines that the position of the camera system 100 is the preset position, so the receiving unit 184 can receive the storage instruction from the UAV control unit 30. The camera system 100 may also include a GPS receiver. In this case, the image processor 180 can determine whether the position of the camera system 100 is a preset position according to the position information from its own GPS receiver. The switching unit 186 switches between the following two methods. One is to generate display image data in the demosaicing processing unit 174 based on the R image signal, G image signal, and B image signal input to the input receiving unit 172, and the other is based on the input The R image signal, G image signal, B image signal, RE image signal, and NIR image signal sent to the input receiving unit 172 are generated in the recording processing unit 178 according to a preset recording format.
在如上所述构成的摄像系统100中,可以根据分别由R用摄像装置110、G用摄像装置120以及B用摄像装置130拍摄的图像信号来确定关注区域(ROI),所述关注区域用于对拍摄条件进行确定。图像处理器180可以根据分别由R用摄像装置110、G用摄像装置120以及B用摄像装置130拍摄的图像信号所示的像素值(辉度值)来确定ROI。In the imaging system 100 configured as described above, it is possible to determine a region of interest (ROI) based on image signals captured by the imaging device 110 for R, the imaging device 120 for G, and the imaging device 130 for B, respectively. Determine the shooting conditions. The image processor 180 can determine the ROI based on the pixel values (luminance values) shown by the image signals captured by the R imaging device 110, the G imaging device 120, and the B imaging device 130, respectively.
然而,图像处理器180根据这样的ROI来确定R用摄像装置110、G用摄像装置120以及B用摄像装置130的每个的拍摄条件时,有时并不是适合特定对象物的拍摄条件。例如,当特定对象物是农作物等植物,且在植物的周围存在车辆或者道路时,图像处理器180有时会将包含车辆或者道路的区域确定为ROI。当包含车辆或者道路区域的像素值高于包含植物区域的像素值时,图像处理器180会将包含车辆或者道路的区域确定为ROI。However, when the image processor 180 determines the imaging conditions of each of the imaging device 110 for R, the imaging device 120 for G, and the imaging device 130 for B based on such an ROI, the imaging conditions may not be suitable for a specific object. For example, when the specific object is a plant such as a crop, and a vehicle or a road exists around the plant, the image processor 180 may determine an area including the vehicle or the road as an ROI. When the pixel value of the area containing the vehicle or road is higher than the pixel value of the area containing the plant, the image processor 180 will determine the area containing the vehicle or the road as the ROI.
例如，当包含图5所示的农作物510和车辆512的图像500被摄像系统100拍摄时，图像500内的符号501所示的直线部分的辉度值(亮度)分布如图6所示。即，车辆512的辉度值较高，图像处理器180会将包含车辆512的区域确定为ROI。图像处理器180会将这样的区域确定为ROI，通过该ROI内的辉度值等图像信息来确定曝光。在该情况下，如图5所示，并非是适合农作物510的曝光，图像500内的农作物510较暗。For example, when the image 500 including the crop 510 and the vehicle 512 shown in FIG. 5 is captured by the imaging system 100, the luminance value (brightness) distribution along the straight line indicated by the symbol 501 in the image 500 is as shown in FIG. 6. That is, the luminance value of the vehicle 512 is high, and the image processor 180 will determine the region containing the vehicle 512 as the ROI. The image processor 180 determines such a region as the ROI and determines the exposure based on image information such as the luminance values within that ROI. In this case, as shown in FIG. 5, the exposure is not suitable for the crop 510, and the crop 510 in the image 500 appears dark.
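A minimal sketch of why luminance-driven ROI selection fails here: dividing the image into blocks and picking the block with the highest mean luminance lands on the bright vehicle patch rather than the dark crops. All array sizes, values, and names are illustrative, mirroring the FIG. 5/6 situation.

```python
import numpy as np

# Toy luminance image: dark crop field (value 60) with a bright
# vehicle patch (value 230) in the upper-right area.
img = np.full((8, 8), 60.0)
img[2:4, 4:6] = 230.0

def brightest_block(img, bs=2):
    """Return the (row, col) origin of the bs x bs block with the
    highest mean luminance -- the naive ROI choice described above."""
    h, w = img.shape
    best, best_rc = -1.0, None
    for r in range(0, h, bs):
        for c in range(0, w, bs):
            m = img[r:r+bs, c:c+bs].mean()
            if m > best:
                best, best_rc = m, (r, c)
    return best_rc

print(brightest_block(img))  # (2, 4) -- the vehicle block, not the crops
```

Exposure computed from that block is set for a mean luminance of 230, leaving the value-60 crop pixels strongly underexposed, which is exactly the failure the text describes.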
因此，在本实施方式中，通过与特定的被摄体的特性对应的指标来确定ROI，以能够根据特定的被摄体来确定合适的拍摄条件。被摄体的特性是指能够根据由摄像装置拍摄的图像估计或者确定该被摄体的存在位置的特性，即被摄体对各个波长的反射率、颜色、形状、被摄体的拍摄位置、时期、季节等。Therefore, in this embodiment, the ROI is determined by an index corresponding to the characteristics of a specific subject, so that appropriate shooting conditions can be determined for that specific subject. The characteristics of the subject refer to characteristics from which the location of the subject can be estimated or determined based on the image captured by the imaging device, that is, the reflectance of the subject at each wavelength, its color, its shape, the shooting position of the subject, the time of year, the season, and the like.
图像处理器180包括区域确定部181及拍摄条件确定部182。拍摄条件确定部182根据由R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140以及NIR用摄像装置150中的至少一个拍摄的特定波长区域的图像将该图像内的第一区域确定为ROI。拍摄条件确定部182根据R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140以及NIR用摄像装置150中的与各第一区域相对应的各区域的图像信息来确定R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140以及NIR用摄像装置150的各摄像控制值。The image processor 180 includes a region determination unit 181 and an imaging condition determination unit 182. The imaging condition determination unit 182 determines the first region in an image of a specific wavelength region, captured by at least one of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR, as the ROI. The imaging condition determination unit 182 determines the respective imaging control values of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR based on the image information of the regions, in each of these imaging devices, that correspond to the first region.
其中，与第一区域相对应的区域是指包含与第一区域所包含的被摄体相同的被摄体的区域。例如，与NIR用摄像装置150的第一区域相对应的R用摄像装置110的第二区域是指包含与NIR用摄像装置150的第一区域所包含的被摄体相同的被摄体且由R用摄像装置110拍摄的图像内的区域。与NIR用摄像装置150的第一区域相对应的G用摄像装置120的第三区域、B用摄像装置130的第四区域及RE用摄像装置140的第五区域是指包含与由NIR用摄像装置150、G用摄像装置120、B用摄像装置130及RE用摄像装置140拍摄的图像内的第一区域所包含的被摄体相同的被摄体的区域。Here, a region corresponding to the first region refers to a region containing the same subject as the subject included in the first region. For example, the second region of the imaging device 110 for R corresponding to the first region of the imaging device 150 for NIR refers to a region, in the image captured by the imaging device 110 for R, that contains the same subject as the subject included in the first region of the imaging device 150 for NIR. Likewise, the third region of the imaging device 120 for G, the fourth region of the imaging device 130 for B, and the fifth region of the imaging device 140 for RE corresponding to the first region of the imaging device 150 for NIR refer to regions, in the images captured by the imaging device 120 for G, the imaging device 130 for B, and the imaging device 140 for RE, that contain the same subject as the subject included in the first region captured by the imaging device 150 for NIR.
图像信息是涉及由R用摄像装置110、G用摄像装置120、B用摄像装置130、RE用摄像装置140或者NIR用摄像装置150等各摄像装置拍摄的图像信息，包括辉度信息及颜色信息等。例如，摄像控制值可以是执行自动曝光控制时的控制值即辉度值、图像传感器的增益值、光圈值(Iris值)及快门速度(蓄积时间)中的至少一个。摄像控制值可以是执行自动聚焦控制时的控制值即对焦位置(镜头位置)。另外，摄像控制值可以是执行自动白平衡控制时的控制值即R增益值及B增益值。The image information relates to the images captured by each imaging device such as the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, or the imaging device 150 for NIR, and includes luminance information, color information, and the like. For example, the imaging control value may be at least one of a luminance value, a gain value of the image sensor, an aperture value (iris value), and a shutter speed (accumulation time), which are control values used when automatic exposure control is executed. The imaging control value may be a focus position (lens position), which is a control value used when autofocus control is executed. In addition, the imaging control values may be an R gain value and a B gain value, which are control values used when automatic white balance control is executed.
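As one illustration of deriving an exposure control value, an EV correction can be computed from the mean luminance of the measured region. The target value of 118 (mid-grey on an 8-bit scale) and the function name are assumptions for the sketch, not values from the description.

```python
import math

def exposure_adjustment_ev(roi_mean, target=118.0):
    """Return the EV correction that would bring the ROI mean
    luminance to the target (mid-grey on an 8-bit scale).
    target=118 is an illustrative choice."""
    return math.log2(target / roi_mean)

# A dark crop ROI (mean 29.5) needs +2 EV; a bright ROI (236) needs -1 EV.
print(round(exposure_adjustment_ev(29.5), 2))  # 2.0
```

The resulting EV correction would then be distributed across gain, aperture, and shutter speed according to whatever priority the camera applies.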
The memory 192 may store information indicating the correspondence between the respective image coordinate systems of the R imaging device 110, the G imaging device 120, the B imaging device 130, the RE imaging device 140, and the NIR imaging device 150. The imaging condition determination unit 182 may determine the second, third, fourth, and fifth regions corresponding to the first region based on the information indicating this correspondence.
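The text does not specify the form of the stored correspondence information; the following is a minimal sketch assuming it reduces to a per-camera-pair affine relation (scale plus offset), with illustrative calibration values.

```python
# Hypothetical sketch of mapping a region between two cameras' image
# coordinate systems. The affine form and the calibration values are
# assumptions for illustration; the patent leaves the representation open.

def map_region(region, correspondence):
    """Map a rectangular region (x, y, w, h) from one camera's image
    coordinates into another camera's image coordinates."""
    sx, sy, dx, dy = correspondence  # per-axis scale and offset
    x, y, w, h = region
    return (x * sx + dx, y * sy + dy, w * sx, h * sy)

# First region in the NIR image, mapped into the R camera's image:
nir_to_r = (1.0, 1.0, 12.0, -8.0)   # assumed calibration values
second_region = map_region((100, 200, 64, 64), nir_to_r)
```

In practice the stored information could equally be a full homography per camera pair; the lookup-and-map structure would stay the same.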
The parameters used to determine the ROI may be determined according to the light-reflectance characteristics peculiar to a specific subject. For example, when a specific subject reflects a relatively large amount of light in the near-infrared region, the region determination unit 181 may determine the ROI based on near-infrared reflectance.
The region determination unit 181 may determine the first region in the NIR image as the ROI based on the NIR image of the near-infrared region captured by the NIR imaging device 150. The imaging condition determination unit 182 may determine the exposure control value of the R imaging device 110 based on the luminance information of the second region, corresponding to the first region, in the red image of the red region captured by the R imaging device 110. It may determine the exposure control value of the G imaging device 120 based on the luminance information of the third region, corresponding to the first region, in the green image of the green region captured by the G imaging device 120. It may determine the exposure control value of the B imaging device 130 based on the luminance information of the fourth region, corresponding to the first region, in the blue image of the blue region captured by the B imaging device 130. It may determine the exposure control value of the RE imaging device 140 based on the luminance information of the fifth region, corresponding to the first region, in the red-edge image of the red-edge region captured by the RE imaging device 140. It may determine the exposure control value of the NIR imaging device 150 based on the luminance information of the first region in the near-infrared image of the near-infrared region captured by the NIR imaging device 150.
The region determination unit 181 may determine the first region based on the near-infrared image and the red image, based on the near-infrared image and the green image, or based on the near-infrared image and the red-edge image.
For example, when the specific subject is a plant, the Normalized Difference Vegetation Index (NDVI) takes characteristic values. The more plants there are, the higher the NDVI value; the more branches and leaves a plant has, the higher the NDVI value; and the higher a plant's activity, the higher the NDVI value. When the specific subject is a plant, the NDVI value is at or above a preset value. The amount of foliage varies with the plant species, and the activity varies with the plant's state of health. Therefore, when the specific subject is a specific plant, the NDVI falls within a preset range. The preset range can be determined according to the plant species, the state of health of the plant to be observed, and so on.
Here, NDVI is expressed by the following formula:

[Formula 1]

NDVI = (IR − R) / (IR + R)

where IR represents the reflectance in the near-infrared region and R represents the red reflectance in the visible-light region.
The region determination unit 181 calculates NDVI from the near-infrared image and the red image, and determines the first region, in which NDVI is at or above the preset value, as the ROI for each of the R imaging device 110, the G imaging device 120, the B imaging device 130, the RE imaging device 140, and the NIR imaging device 150. The region determination unit 181 may divide each of the near-infrared image and the red image into a plurality of blocks and calculate the NDVI of each block from its pixel values. It may determine at least one block whose NDVI is at or above a preset threshold as the ROI. For example, as shown in FIG. 7, the region determination unit 181 divides the image 500 into a plurality of blocks 600 and calculates the NDVI of each block. As shown in FIG. 8, it determines the block with the highest NDVI as the ROI 601. The imaging condition determination unit 182 determines the exposure control value of each imaging device based on the luminance information of the region of that device corresponding to the first region. Because the exposure control value of each imaging device is determined from the luminance information within the NDVI-based ROI, each imaging device can capture an image with exposure suited to the crop within the ROI. Therefore, as shown in FIG. 9, an image 530 with exposure suited to the crop 532 can be obtained. When exposure suited to the crop 532 is performed, a subject brighter than the crop 532, such as the vehicle 534, may be overexposed; conversely, a subject far darker than the crop 532 may be underexposed in the image. However, since the aim is to obtain an image focused on the crop 532, this does not affect the evaluation of the image.
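The block-wise NDVI and ROI selection described above can be sketched as follows. The block reflectance values and the 0.4 threshold are illustrative assumptions; the patent only requires "at or above a preset threshold".

```python
# Minimal sketch of block-wise ROI selection. Each image is reduced to a
# list of per-block mean reflectances; the threshold is an assumed value.

def block_ndvi(nir_blocks, red_blocks):
    """Compute NDVI per block from mean NIR and red reflectances."""
    return [(nir - red) / (nir + red) if (nir + red) else 0.0
            for nir, red in zip(nir_blocks, red_blocks)]

def select_roi(ndvi_values, threshold=0.4):
    """Return indices of blocks whose NDVI is at or above the threshold;
    if none qualify, fall back to all blocks."""
    roi = [i for i, v in enumerate(ndvi_values) if v >= threshold]
    return roi if roi else list(range(len(ndvi_values)))

ndvi = block_ndvi([0.50, 0.42, 0.10], [0.08, 0.30, 0.09])
roi_blocks = select_roi(ndvi)  # only the first block exceeds the threshold
```

The fallback to all blocks mirrors the behavior described later for steps S108/S110 of Fig. 10.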
Depending on the subject photographed by the imaging system 100, it may be preferable to determine the ROI with an index other than NDVI. For example, even for the same crop, the reflectance characteristics may change depending on the location, period, or season. In such cases, it is sometimes preferable to switch the index used to determine the ROI according to the location, period, season, and so on. That is, the region determination unit 181 may select the index used to determine the ROI according to at least one of the type of the subject and the location, period, or season in which the subject is photographed.
For example, the region determination unit 181 may determine the ROI according to SAVI (Soil Adjusted Vegetation Index). SAVI is expressed by the following formula:
[Formula 2]

SAVI = ((IR − R) / (IR + R + L)) × (1 + L)
SAVI is an index that takes differences in soil reflectance into account, and it differs from NDVI on this point. L is 1 when the vegetation cover is small and 0.25 when it is large.
The region determination unit 181 may determine the ROI according to gNDVI (Green Normalized Difference Vegetation Index). gNDVI is expressed by the following formula:
[Formula 3]

gNDVI = (IR − G) / (IR + G)
where G represents the green reflectance in the visible-light region.
The region determination unit 181 may determine the ROI according to NDRE (Normalized Difference Red Edge Index). NDRE is expressed by the following formula:
[Formula 4]

NDRE = (NIR − RE) / (NIR + RE)
where NIR represents the near-infrared reflectance and RE represents the red-edge reflectance. Using NDRE enables a deeper analysis of the vegetation distribution. For example, the difference between cedar and cypress can be analyzed, so the region determination unit 181 can distinguish cedar regions from cypress regions when determining the ROI.
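The four indices defined above, together with the index switching described for the region determination unit 181, can be sketched as follows. The subject-type keys in the dispatch table are illustrative assumptions; the text only says the index is chosen from subject characteristics.

```python
# Sketch of the four vegetation indices and an assumed selection dispatch.
# All reflectance arguments are band reflectances in [0, 1].

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def savi(nir, red, l=0.5):
    # l is 1 for sparse vegetation, 0.25 for dense vegetation (per the text);
    # 0.5 is a commonly used intermediate default.
    return (nir - red) * (1 + l) / (nir + red + l)

def gndvi(nir, green):
    return (nir - green) / (nir + green)

def ndre(nir, red_edge):
    return (nir - red_edge) / (nir + red_edge)

def select_index(subject_type):
    """Illustrative mapping from subject characteristics to an index."""
    table = {"bare-soil field": savi, "leafy crop": gndvi, "forest": ndre}
    return table.get(subject_type, ndvi)  # NDVI as the default index
```

A real implementation would likely key the table on the user-specified subject type received through the receiving unit 184, possibly combined with location or season.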
The region determination unit 181 may select, according to the characteristics of the subject, the near-infrared image and the red image or the near-infrared image and the green image as the images used to determine the ROI, and may determine the ROI from the selected pair. It may likewise select, according to the characteristics of the subject, the near-infrared image and the red image, the near-infrared image and the green image, or the near-infrared image and the red-edge image, and determine the ROI from the selected pair.
The region determination unit 181 selects NDVI, SAVI, gNDVI, or NDRE according to the characteristics of the subject, and may determine the ROI according to the selected index. It may select NDVI, SAVI, gNDVI, or NDRE according to the characteristics of the plant serving as the subject, or according to at least one of the type of the subject and the location, period, or season in which the subject is photographed.
The region determination unit 181 may select NDVI, SAVI, gNDVI, or NDRE as the index for determining the ROI according to the subject type specified by the user and received through the receiving unit 184, or according to the type of crop.
Fig. 10 is a flowchart showing an example of the procedure for determining the exposure control values. The region determination unit 181 acquires the images needed to calculate the index used to determine the ROI (S100). For example, to derive NDVI, the region determination unit 181 acquires the near-infrared image captured by the NIR imaging device 150 and the red image captured by the R imaging device 110. The region determination unit 181 sets a plurality of blocks in the image (S102). It may set the blocks in a coordinate system common to the R imaging device 110, the G imaging device 120, the B imaging device 130, the RE imaging device 140, and the NIR imaging device 150.
The region determination unit 181 derives the NDVI of each block from the near-infrared reflectance of each block derived from the near-infrared image and the red-region reflectance of each block derived from the red image (S104). It then extracts, from the NDVI values of the plurality of blocks, at least one block whose NDVI corresponds to a plant (S106). It may extract at least one block whose NDVI is at or above a preset threshold, at least one block showing the largest NDVI at or above a preset threshold, or at least one block whose NDVI is at or above a preset lower threshold and at or below a preset upper threshold.
When a block whose NDVI corresponds to a plant exists, the region determination unit 181 sets that block as the ROI (S108). On the other hand, when no block whose NDVI corresponds to a plant exists, the region determination unit 181 sets all the blocks as the ROI (S110).
The imaging condition determination unit 182 determines the exposure control value of each imaging device based on the image information of that device's ROI (S112). It may determine the respective exposure control values of the R imaging device 110, the G imaging device 120, the B imaging device 130, the RE imaging device 140, and the NIR imaging device 150 based on the luminance information of the region corresponding to the ROI in each image captured by the respective device.
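The steps S100 to S112 above can be sketched end to end as follows. Images are modeled as flat lists of per-block values, and the exposure rule (target luminance divided by measured ROI luminance) is an illustrative assumption, not the control law specified by the patent.

```python
# Hedged sketch of the Fig. 10 flow: per-block NDVI (S104), plant-block
# extraction with all-blocks fallback (S106-S110), and one exposure gain
# per device from its mean ROI luminance (S112).

def determine_exposure(nir_blocks, red_blocks, luminance_per_device,
                       threshold=0.4, target=0.5):
    # S104: per-block NDVI from the two acquired images
    ndvi = [(n - r) / (n + r) for n, r in zip(nir_blocks, red_blocks)]
    # S106-S110: blocks corresponding to a plant, or all blocks as fallback
    roi = [i for i, v in enumerate(ndvi) if v >= threshold]
    if not roi:
        roi = list(range(len(ndvi)))
    # S112: assumed exposure gain = target luminance / mean ROI luminance
    gains = {}
    for device, lum in luminance_per_device.items():
        mean_lum = sum(lum[i] for i in roi) / len(roi)
        gains[device] = target / mean_lum
    return roi, gains

roi, gains = determine_exposure(
    [0.50, 0.10], [0.08, 0.09],
    {"R": [0.25, 0.9], "NIR": [0.5, 0.7]})
```

The single common ROI driving several per-device gains reflects the shared coordinate system mentioned for step S102.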
As described above, according to the imaging system 100 of the present embodiment, the ROI that must be referenced in order to determine the imaging conditions can be determined with an index matched to the characteristics of the subject. As a result, even when other subjects with high luminance, such as a vehicle, are present around the plant of interest, the imaging system 100 can capture each image with the exposure optimal for the plant.
FIG. 11 is a diagram showing another example of the appearance of the imaging system 100 mounted on the UAV 10. This imaging system 100 differs from the one shown in FIG. 2 in that, in addition to the G imaging device 120, the B imaging device 130, the RE imaging device 140, and the NIR imaging device 150, it includes an RGB imaging device 160. The RGB imaging device 160 may be the same as an ordinary camera, including an optical system and an image sensor. The image sensor may include filters arranged in a Bayer array that transmit light in the red band, light in the green band, and light in the blue band, respectively. The RGB imaging device 160 can output an RGB image. For example, the red band may be 620 nm to 750 nm, the green band may be 500 nm to 570 nm, and the blue band may be 450 nm to 500 nm.
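The wavelength bands quoted above can be captured in a small lookup. The band edges come from the text; the classifier function itself is an assumption added for illustration.

```python
# Wavelength bands from the text (nm); the classifier is illustrative only.
BANDS_NM = {
    "blue": (450, 500),
    "green": (500, 570),
    "red": (620, 750),
}

def classify_wavelength(nm):
    """Return the color band a wavelength falls in, or None if outside
    all three bands (e.g., the 570-620 nm gap or the NIR region)."""
    for name, (lo, hi) in BANDS_NM.items():
        if lo <= nm <= hi:
            return name
    return None
```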
The region determination unit 181 may determine the first region in the near-infrared image as the ROI based on the near-infrared image captured by the NIR imaging device 150 and the red image captured by the R imaging device 110. It may derive NDVI from these two images and determine the first region of the near-infrared image, in which NDVI is at or above the preset threshold, as the ROI. The imaging condition determination unit 182 may determine the imaging control value of the RGB imaging device 160 based on the image information of the region corresponding to the first region in the RGB image captured by the RGB imaging device 160. In particular, it may determine the exposure control value of the RGB imaging device 160 based on the luminance information of that region.
FIG. 12 shows an example of a computer 1200 in which aspects of the present invention may be wholly or partly embodied. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with the device according to the embodiment of the present invention, or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to perform that operation or those one or more "units". The program enables the computer 1200 to execute the process, or the stages of the process, according to the embodiment of the present invention. Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 according to the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220, and further includes a ROM 1230. The CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program executed by the computer 1200 at startup and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. A program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing the operation or processing of information through the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded into the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. For data read from the RAM 1214, the CPU 1212 may perform the various types of processing described throughout this disclosure and specified by the program's instruction sequence, including various operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. The CPU 1212 may also retrieve information from files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry matching a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
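The entry-retrieval pattern described above can be sketched as follows. The entry contents and the condition are illustrative assumptions; the text only requires pairing a first-attribute value with a second-attribute value and filtering on the first.

```python
# Minimal sketch of retrieving second-attribute values for entries whose
# first attribute satisfies a specified condition. Entry data is made up.

entries = [
    {"first": "NIR", "second": 150},
    {"first": "RE", "second": 140},
    {"first": "R", "second": 110},
]

def lookup_second(entries, condition):
    """Return the second-attribute values of all entries whose first
    attribute satisfies the given condition, in stored order."""
    return [e["second"] for e in entries if condition(e["first"])]

values = lookup_second(entries, lambda first: first.startswith("R"))
```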
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet may be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
It should be noted that the order of execution of the actions, sequences, steps, stages, and the like of the devices, systems, programs, and methods shown in the claims, specification, and drawings may be realized in any order, as long as "before", "prior to", and the like are not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flows in the claims, specification, and drawings are described using "first", "next", and the like for convenience, this does not mean that they must be performed in that order.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to a person skilled in the art that various changes or improvements can be made to the above embodiments, and it is apparent from the description of the claims that embodiments incorporating such changes or improvements can also be included in the technical scope of the present invention.

Claims (16)

  1. A determination device, characterized by comprising circuitry configured to: determine a first region within a first image of a first wavelength region captured by a first image sensor of a first imaging device, based on the first image; and
    determine an imaging control value of a second imaging device based on image information of a second region, corresponding to the first region, within a second image of a second wavelength region captured by a second image sensor of the second imaging device.
  2. The determination device according to claim 1, characterized in that the image information is luminance information and the imaging control value is an exposure control value.
  3. The determination device according to claim 1, characterized in that the circuitry determines the first region based on the first image and the second image.
  4. The determination device according to claim 1, characterized in that the circuitry:
    selects, as images used to determine the first region, the first image and the second image, or the first image and a third image of a third wavelength region captured by a third image sensor of a third imaging device;
    determines the first region based on the selected first image and second image, or first image and third image; and
    determines an imaging control value of the first imaging device based on image information of the first region within the first image, determines the imaging control value of the second imaging device based on the image information of the second region within the second image, and determines an imaging control value of the third imaging device based on image information of a third region, corresponding to the first region, within the third image.
  5. The determination device according to claim 4, characterized in that the circuitry selects the first image and the second image, or the first image and the third image, according to characteristics of a subject.
  6. The determination device according to claim 4, characterized in that the first wavelength region is a wavelength region of the near-infrared region,
    the second wavelength region is a wavelength region of the red region, and
    the third wavelength region is a wavelength region of the green region or the red-edge region.
  7. The determination device according to claim 1, characterized in that the circuitry:
    selects, as images used to determine the first region, the first image and the second image; the first image and a third image of a third wavelength region captured by a third image sensor of a third imaging device; or the first image and a fourth image of a fourth wavelength region captured by a fourth image sensor of a fourth imaging device;
    determines the first region based on the selected first image and second image, first image and third image, or first image and fourth image; and
    determines an imaging control value of the first imaging device based on image information of the first region within the first image, determines the imaging control value of the second imaging device based on the image information of the second region within the second image, determines an imaging control value of the third imaging device based on image information of a third region, corresponding to the first region, within the third image, and determines an imaging control value of the fourth imaging device based on image information of a fourth region, corresponding to the first region, within the fourth image.
  8. The determination device according to claim 7, characterized in that the circuitry selects the first image and the second image, the first image and the third image, or the first image and the fourth image according to characteristics of a subject.
  9. The determination device according to claim 7, characterized in that the first wavelength region is a wavelength region of the near-infrared region,
    the second wavelength region is a wavelength region of the red region,
    the third wavelength region is a wavelength region of the green region, and
    the fourth wavelength region is a wavelength region of the red-edge region.
  10. A determination device, characterized by comprising circuitry configured to: determine a first region within a first image of a first wavelength region captured by a first image sensor of a first imaging device, based on the first image and a second image of a second wavelength region captured by a second image sensor of a second imaging device; and
    determine an imaging control value of a third imaging device based on image information of a third region, corresponding to the first region, within a third image of a third wavelength region captured by a third image sensor of the third imaging device.
  11. The determining device according to claim 10, characterized in that the image information is luminance information and the imaging control value is an exposure control value.
  12. The determining device according to claim 10, characterized in that the first wavelength region is a wavelength region of the near-infrared region,
    the second wavelength region is a wavelength region of the red region, the green region, or the red edge region,
    and the third wavelength region is a wavelength region of the red region, the green region, and the blue region.
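For orientation, the wavelength regions named in claims 9 and 12 correspond to bands commonly used in agricultural multispectral imaging. The numeric ranges below are typical conventions only; the claims themselves do not specify any wavelengths:

```python
# Illustrative wavelength ranges (nm) for the bands named in the claims.
# These numbers are common conventions from multispectral remote sensing,
# not values defined by this patent.
WAVELENGTH_BANDS_NM = {
    "blue": (450, 495),
    "green": (495, 570),
    "red": (620, 700),
    "red_edge": (700, 760),
    "near_infrared": (760, 1000),
}

def band_of(wavelength_nm):
    """Return the name of the band containing the given wavelength, or None."""
    for name, (lo, hi) in WAVELENGTH_BANDS_NM.items():
        if lo <= wavelength_nm < hi:
            return name
    return None
```

Under these illustrative ranges, a 730 nm channel falls in the red edge band and an 850 nm channel in the near-infrared band.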
  13. A camera system, characterized by comprising: the determining device according to any one of claims 1 to 12,
    the first imaging device, and
    the second imaging device.
  14. A movable body, characterized by being equipped with the camera system according to claim 13 and moving.
  15. A determining method, characterized by comprising: a step of determining a first region within a first image based on the first image, in a first wavelength region, captured by a first image sensor included in a first imaging device; and
    a step of determining an imaging control value of a second imaging device based on image information of a second region, corresponding to the first region, in a second image, in a second wavelength region, captured by a second image sensor included in the second imaging device.
  16. A program, characterized by causing a computer to function as the determining device according to any one of claims 1 to 12.
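Claims 11 and 15 together describe the core procedure: find a region in an image from one wavelength band, then set the exposure of a second imaging device from the luminance of the corresponding region in its own band. The sketch below illustrates that flow with NumPy; the thresholding step and the 18% mid-grey target are illustrative assumptions, as the claims specify neither:

```python
import numpy as np

def determine_region(nir_image, threshold=0.5):
    """Determine a region of interest in the first (near-infrared) image.

    A plain intensity threshold stands in here for the claims'
    unspecified region-determination logic.
    """
    return nir_image > threshold

def exposure_control_value(image, region, target_luminance=0.18):
    """Derive an exposure control value for a second imaging device from
    the luminance of the region corresponding to the one found in the
    first image. The 18% mid-grey target is an illustrative convention.
    """
    mean_luminance = float(image[region].mean())
    # Gain that would bring the region's mean luminance to the target.
    return target_luminance / max(mean_luminance, 1e-6)

# Two co-registered images of the same scene in different wavelength bands.
rng = np.random.default_rng(0)
nir = rng.random((64, 64))   # stand-in for the near-infrared image
red = rng.random((64, 64))   # stand-in for the red-band image

region = determine_region(nir)
ev = exposure_control_value(red, region)
```

The point of metering only the corresponding region, rather than the whole frame, is that the exposure of the second device is driven by the subject found in the first band (for example, vegetation bright in near-infrared) rather than by the surrounding background.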
PCT/CN2020/078018 2019-03-26 2020-03-05 Determination device, camera system, and moving object WO2020192385A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080002806.8A CN112154646A (en) 2019-03-26 2020-03-05 Specifying device, imaging system, and moving object

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-057752 2019-03-26
JP2019057752A JP6690106B1 (en) 2019-03-26 2019-03-26 Determination device, imaging system, and moving body

Publications (1)

Publication Number Publication Date
WO2020192385A1 true WO2020192385A1 (en) 2020-10-01

Family

ID=70413788

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078018 WO2020192385A1 (en) 2019-03-26 2020-03-05 Determination device, camera system, and moving object

Country Status (3)

Country Link
JP (1) JP6690106B1 (en)
CN (1) CN112154646A (en)
WO (1) WO2020192385A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102483808A (en) * 2009-06-11 2012-05-30 Pa有限责任公司 Vegetation Indices For Measuring Multilayer Microcrop Density And Growth
WO2016123201A1 (en) * 2015-01-27 2016-08-04 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for robotic remote sensing for precision agriculture
CN106170058A (en) * 2016-08-30 2016-11-30 维沃移动通信有限公司 A kind of exposure method and mobile terminal
CN108235815A (en) * 2017-04-07 2018-06-29 深圳市大疆创新科技有限公司 Video camera controller, photographic device, camera system, moving body, camera shooting control method and program
CN108369635A (en) * 2015-11-08 2018-08-03 阿格洛英公司 The method with analysis is obtained for aerial image
JP2019029734A (en) * 2017-07-27 2019-02-21 東京電力ホールディングス株式会社 Observation device

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002232775A (en) * 2001-01-31 2002-08-16 Fuji Photo Film Co Ltd Monochrome imaging method of digital camera
WO2003083773A2 (en) * 2002-03-27 2003-10-09 The Trustees Of Columbia University In The City Of New York Imaging method and system
JP2004246252A (en) * 2003-02-17 2004-09-02 Takenaka Komuten Co Ltd Apparatus and method for collecting image information
WO2012023603A1 (en) * 2010-08-20 2012-02-23 株式会社日立国際電気 Image monitoring system and camera
JP2012068762A (en) * 2010-09-21 2012-04-05 Sony Corp Detection device, detection method, program, and electronic apparatus
DE102012221580B4 (en) * 2012-11-26 2019-10-24 Deutsches Zentrum für Luft- und Raumfahrt e.V. A method for locating living things from the air and flying object for locating living things from the air
JP6303764B2 (en) * 2014-04-23 2018-04-04 日本電気株式会社 Data fusion device, land cover classification system, method and program
US20160006954A1 (en) * 2014-07-03 2016-01-07 Snap Vision Technologies LLC Multispectral Detection and Processing From a Moving Platform
WO2016208415A1 (en) * 2015-06-26 2016-12-29 ソニー株式会社 Inspection apparatus, sensing apparatus, sensitivity control apparatus, inspection method, and program
CN105160647B (en) * 2015-10-28 2018-10-19 中国地质大学(武汉) A kind of panchromatic multispectral image fusion method
KR102465212B1 (en) * 2015-10-30 2022-11-10 삼성전자주식회사 Photographing apparatus using multiple exposure sensor and photographing method thereof
WO2017221756A1 (en) * 2016-06-22 2017-12-28 ソニー株式会社 Sensing system, sensing method, and sensing device
CN107197170A (en) * 2017-07-14 2017-09-22 维沃移动通信有限公司 A kind of exposal control method and mobile terminal
US11570371B2 (en) * 2017-08-01 2023-01-31 Sony Group Corporation Imaging apparatus, imaging method, and program
CN109187398A (en) * 2018-11-08 2019-01-11 河南省农业科学院植物营养与资源环境研究所 A kind of EO-1 hyperion measuring method of wheat plant nitrogen content


Also Published As

Publication number Publication date
CN112154646A (en) 2020-12-29
JP2020161917A (en) 2020-10-01
JP6690106B1 (en) 2020-04-28

Similar Documents

Publication Publication Date Title
JP6496955B1 (en) Control device, system, control method, and program
WO2019238044A1 (en) Determination device, mobile object, determination method and program
JP6384000B1 (en) Control device, imaging device, imaging system, moving object, control method, and program
US20210235044A1 (en) Image processing device, camera device, mobile body, image processing method, and program
WO2019206076A1 (en) Control device, camera, moving body, control method and program
WO2019105222A1 (en) Generation device, generation system, image capturing system, moving body, and generation method
WO2021017914A1 (en) Control device, camera device, movable body, control method, and program
WO2020192385A1 (en) Determination device, camera system, and moving object
JP6481228B1 (en) Determination device, control device, imaging system, flying object, determination method, and program
WO2019174343A1 (en) Active body detection device, control device, moving body, active body detection method and program
JP2019220834A (en) Unmanned aircraft, control method, and program
WO2021083049A1 (en) Image processsing device, image processing method and program
JP6896962B2 (en) Decision device, aircraft, decision method, and program
WO2021115166A1 (en) Determining device, flying object, determining method, and program
WO2019223614A1 (en) Control apparatus, photographing apparatus, moving body, control method, and program
WO2020020042A1 (en) Control device, moving body, control method and program
WO2020216057A1 (en) Control device, photographing device, mobile body, control method and program
JP2022053417A (en) Control device, imaging device, moving object, control method, and program
WO2018163300A1 (en) Control device, imaging device, imaging system, moving body, control method, and program
WO2021249245A1 (en) Device, camera device, camera system, and movable member
WO2021143425A1 (en) Control device, photographing device, moving body, control method, and program
JP6818987B1 (en) Image processing equipment, imaging equipment, moving objects, image processing methods, and programs
WO2019085794A1 (en) Control device, camera device, flight body, control method and program
WO2020125414A1 (en) Control apparatus, photography apparatus, photography system, moving body, control method and program
JP2022011712A (en) Scene recognition apparatus, imaging apparatus, scene recognition method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20778643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20778643

Country of ref document: EP

Kind code of ref document: A1