WO2021083049A1 - Image processing device, image processing method, and program - Google Patents

Image processing device, image processing method, and program

Info

Publication number
WO2021083049A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
camera device
offset
imaging
Prior art date
Application number
PCT/CN2020/123276
Other languages
English (en)
French (fr)
Inventor
家富邦彦
陈斌
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080004297.2A priority Critical patent/CN112955925A/zh
Publication of WO2021083049A1 publication Critical patent/WO2021083049A1/zh

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G01C 11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Definitions

  • the invention relates to an image processing device, an image processing method and a program.
  • Patent Document 1 describes that a normalized vegetation index as an inspection object is calculated based on measurement values sensed by two sensing devices.
  • Patent Document 1 International Publication No. 2017/221756.
  • the image processing device may be an image processing device that processes each image taken by the first imaging device and the second imaging device arranged in a preset positional relationship.
  • the image processing device may include a circuit configured to obtain offset information about the offset between the first optical axis of the first imaging device and the second optical axis of the second imaging device, and height information indicating the heights of the first imaging device and the second imaging device.
  • the circuit may be configured to acquire a first image taken by the first camera device and a second image taken by the second camera device.
  • the circuit may be configured to align the first image with the second image according to the offset information and the height information.
  • the offset information may include information indicating the angle formed by the first optical axis and the second optical axis.
  • the offset information may include information indicating the distance between a first intersection point between the first imaging surface of the first imaging device and the first optical axis and a second intersection point between the second imaging surface of the second imaging device and the second optical axis.
  • the offset information may include information indicating the relationship between the height and the positional offset amount between the first image and the second image.
  • the offset information may include information indicating an offset amount of the position between the first image and the second image with respect to a preset height.
  • the first imaging device and the second imaging device may be mounted on a mobile body.
  • the first imaging device and the second imaging device can be mounted on a mobile body by a support mechanism that supports them such that their postures can be adjusted.
  • the circuit may be configured to further obtain posture information indicating the posture states of the first camera device and the second camera device.
  • the circuit may be configured to further align the first image with the second image according to the posture information.
  • the moving body may be a flying body.
  • the first imaging device and the second imaging device may be mounted on the movable body by a support mechanism that can adjust the postures of the first imaging device and the second imaging device to support them.
  • the circuit may be configured to acquire posture information indicating the posture states of the first imaging device and the second imaging device.
  • the circuit may be configured to determine, according to the posture information, whether the angle formed between the imaging directions of the first imaging device and the second imaging device and the vertically downward direction is less than or equal to a preset angle.
  • the circuit may be configured to align the first image with the second image according to the offset information and the height information when the angle is less than or equal to the preset angle.
  • the circuit may be configured to align the first image with the second image according to offset information, height information, and posture information when the angle is greater than the preset angle.
  • the first camera device can take images of the first waveband.
  • the second camera device can take images of the second waveband.
  • the first waveband may be a waveband in the near-infrared region.
  • the second waveband can be a waveband in the red region, the green region, or the red edge region.
  • the image processing device may be an image processing device that processes each image taken by the first camera device and the second camera device arranged in a preset positional relationship.
  • the image processing device may include a circuit configured to obtain offset information about the offset between the first optical axis of the first imaging device and the second optical axis of the second imaging device, and distance information indicating the distance to the subject photographed by the first imaging device and the second imaging device.
  • the circuit may be configured to acquire a first image including the subject captured by the first imaging device and a second image including the subject captured by the second imaging device.
  • the circuit may be configured to align the first image with the second image according to the offset information and the distance information.
  • the first camera device can take images of the first waveband.
  • the second camera device can take images of the second waveband.
  • the first waveband may be a waveband in the near-infrared region.
  • the second waveband can be a waveband in the red region, the green region, or the red edge region.
  • the offset information may include information indicating the angle formed by the first optical axis and the second optical axis.
  • the offset information may include information indicating the distance between the first intersection point between the first imaging surface of the first imaging device and the first optical axis and the second intersection point between the second imaging surface of the second imaging device and the second optical axis.
  • the offset information may include information indicating the relationship between the distance to the subject and the positional offset amount between the first image and the second image.
  • the image processing method may be an image processing method that processes each image taken by the first imaging device and the second imaging device arranged in a preset positional relationship.
  • the image processing method may include a stage of acquiring offset information about the offset between the first optical axis of the first imaging device and the second optical axis of the second imaging device, and height information indicating the heights of the first imaging device and the second imaging device.
  • the image processing method may include a stage of acquiring a first image taken by a first camera and a second image taken by a second camera.
  • the image processing method may include a stage of aligning the first image with the second image based on the offset information and the height information.
  • the image processing method related to one aspect of the present invention may be an image processing method that processes each image taken by the first camera device and the second camera device arranged in a preset positional relationship.
  • the image processing method may include a stage of acquiring offset information about the offset between the first optical axis of the first imaging device and the second optical axis of the second imaging device, and distance information indicating the distance to the subject photographed by the first imaging device and the second imaging device.
  • the image processing method may include a stage of acquiring a first image including a subject captured by a first camera and a second image including a subject captured by a second camera.
  • the image processing method may include a stage of aligning the first image with the second image based on the offset information and the distance information.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned image processing apparatus.
  • the load of the alignment process between images can be suppressed.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) and a remote operation device;
  • FIG. 2 is a diagram showing an example of the appearance of the camera system mounted on the UAV;
  • FIG. 3 is a diagram showing an example of functional blocks of the UAV;
  • FIG. 4 is a diagram showing an example of functional blocks of the camera system;
  • FIG. 5 is a diagram showing an example of a shooting situation of a camera system mounted on a UAV;
  • FIG. 6 is a diagram showing an example of the shift amount of the optical axis;
  • FIG. 7 is a diagram for explaining image alignment;
  • FIG. 8 is a diagram showing an example of the shooting state of the camera system mounted on the UAV;
  • FIG. 9 is a flowchart showing an example of an image alignment process performed by an image processor;
  • FIG. 10 is a diagram showing another example of the appearance of the camera system mounted on the UAV;
  • FIG. 11 is a diagram showing an example of the hardware configuration.
  • the blocks can represent (1) a stage of a process of performing operations or (2) a "part" of a device that performs operations.
  • Specific stages and “sections” can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits can include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium on which instructions are stored includes a product that includes instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • Examples of the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • More specific examples of computer-readable media may include floppy (registered trademark) disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, etc.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the source code or object code includes traditional procedural programming languages.
  • Traditional procedural programming languages can be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing device.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • the UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of camera devices 60, and a camera system 100.
  • the gimbal 50 and the camera system 100 are an example of a camera system.
  • UAV10 is an example of a moving body.
  • Moving objects include concepts such as flying objects moving in the air, vehicles moving on the ground, and ships moving on the water. Flying objects moving in the air refer to concepts that include not only UAVs, but also other aircraft, airships, and helicopters that move in the air.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 controls the rotation of a plurality of rotors to make the UAV 10 fly.
  • the UAV main body 20 uses four rotors to make the UAV 10 fly.
  • the number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without rotors.
  • the imaging system 100 is a multispectral camera for imaging that captures objects within a desired imaging range in a plurality of wavelength bands, respectively.
  • the gimbal 50 rotatably supports the camera system 100.
  • the gimbal 50 is an example of a supporting mechanism.
  • the gimbal 50 uses an actuator to rotatably support the camera system 100 with a pitch axis.
  • the gimbal 50 uses actuators to further rotatably support the camera system 100 around each of the roll axis and the yaw axis.
  • the gimbal 50 can change the posture of the camera system 100 by rotating the camera system 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 can be installed on the nose of the UAV 10, that is, on the front side.
  • the other two camera devices 60 may be provided on the bottom surface of the UAV 10.
  • the two camera devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the imaging device 60 can measure the existence of the object included in the imaging range of the imaging device 60 and the distance to the object.
  • the imaging device 60 is an example of a measuring device that measures an object existing in the imaging direction of the imaging system 100.
  • the measuring device may be another sensor such as an infrared sensor or an ultrasonic sensor that measures an object existing in the imaging direction of the imaging system 100.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 only needs to include at least one imaging device 60.
  • the UAV 10 may also include at least one camera device 60 on the nose, tail, side, bottom, and top surfaces of the UAV 10, respectively.
  • the angle of view that can be set in the camera device 60 may be larger than the angle of view that can be set in the camera system 100.
  • the imaging device 60 may also include a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can communicate with the UAV 10 wirelessly.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may indicate the height at which the UAV10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending instruction to raise the UAV10.
  • UAV10 rises while receiving the rise command. When the height of UAV10 has reached the upper limit height, UAV10 can limit the ascent even if the ascent command is accepted.
  • FIG. 2 is a diagram showing an example of the appearance of the imaging system 100 mounted on the UAV 10.
  • the imaging system 100 is a multispectral camera that separately captures image data of each of a plurality of preset wavelength bands.
  • the imaging system 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR.
  • the imaging system 100 can record each image data captured by the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR as a multispectral image.
  • multispectral images can be used to predict the health and vitality of crops.
  • FIG. 3 shows an example of the functional blocks of UAV10.
  • UAV 10 includes a UAV control unit 30, memory 32, communication interface 36, propulsion unit 40, GPS receiver 41, inertial measurement device 42, magnetic compass 43, barometric altimeter 44, temperature sensor 45, humidity sensor 46, gimbal 50, camera device 60, and camera system 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 can receive instruction information including various instructions to the UAV control unit 30 from the remote operation device 300.
  • the memory 32 stores the programs and the like required for the UAV control unit 30 to control the propulsion unit 40, GPS receiver 41, inertial measurement unit (IMU) 42, magnetic compass 43, barometric altimeter 44, temperature sensor 45, humidity sensor 46, gimbal 50, imaging device 60, and imaging system 100.
  • the memory 32 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 32 may be provided inside the UAV main body 20. It can be configured to be detachable from the UAV main body 20.
  • the UAV control unit 30 controls the flying and shooting of the UAV 10 in accordance with a program stored in the memory 32.
  • the UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, and a microcontroller such as an MCU.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 propels the UAV10.
  • the propulsion part 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to cause the UAV 10 to fly.
  • the GPS receiver 41 receives a plurality of signals indicating the time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude, longitude, and altitude) of the GPS receiver 41, that is, the position (latitude, longitude, and altitude) of the UAV 10 based on the received signals.
  • IMU42 detects the posture of UAV10.
  • the IMU 42 detects, as the posture of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down of the UAV 10 and the angular velocities about the three axes of the pitch axis, the roll axis, and the yaw axis.
  • the magnetic compass 43 detects the heading of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying altitude of the UAV10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure to altitude to detect the altitude.
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • FIG. 4 shows an example of functional blocks of the camera system 100.
  • the imaging system 100 includes an imaging device 110 for R, an imaging device 120 for G, an imaging device 130 for B, an imaging device 140 for RE, and an imaging device 150 for NIR.
  • the imaging system 100 includes an image processor 180, a transmission unit 190, and a memory 192.
  • the imaging device 110 for R includes an image sensor 112 for R and an optical system 114.
  • the image sensor 112 for R captures an image formed by the optical system 114.
  • the R image sensor 112 includes a filter that transmits light in the red region, and outputs an R image signal that is an image signal in the red region.
  • the wavelength band of the red region is 620 nm to 750 nm.
  • the wavelength band of the red region may be a specific wavelength band in the red region, for example, it may be 663 nm to 673 nm.
  • the imaging device 120 for G includes an image sensor 122 for G and an optical system 124.
  • the image sensor 122 for G captures an image formed by the optical system 124.
  • the G image sensor 122 includes a filter that transmits light in the green region, and outputs a G image signal that is an image signal in the green region.
  • the wavelength band of the green region is 500 nm to 570 nm.
  • the wavelength band of the green region may be a specific wavelength band in the green region, for example, it may be 550 nm to 570 nm.
  • the imaging device 130 for B includes an image sensor 132 for B and an optical system 134.
  • the image sensor 132 for B captures an image formed by the optical system 134.
  • the image sensor for B 132 includes a filter that transmits light in the blue region, and outputs a B image signal that is an image signal in the blue region.
  • the wavelength band of the blue region is 450 nm to 500 nm.
  • the wavelength band of the blue region may be a designated wavelength band in the blue region, for example, it may be 465 nm to 485 nm.
  • the imaging device 140 for RE includes an image sensor 142 for RE and an optical system 144.
  • the image sensor 142 for RE captures an image formed by the optical system 144.
  • the RE image sensor 142 includes a filter that transmits light in the red edge region, and outputs an RE image signal that is an image signal in the red edge region.
  • the wavelength band of the red edge region is 705 nm to 745 nm.
  • the wavelength band of the red edge region may be 712 nm to 722 nm.
  • the NIR imaging device 150 includes an NIR image sensor 152 and an optical system 154.
  • the image sensor 152 for NIR captures an image formed by the optical system 154.
  • the NIR image sensor 152 includes a filter that transmits light in the near-infrared region, and outputs an image signal in the near-infrared region, that is, an NIR image signal.
  • the wavelength band of the near infrared region is 800 nm to 2500 nm.
  • the wavelength band of the near infrared region may be 800 nm to 900 nm.
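  • for reference, the band layout above can be collected into a small configuration table. The following is a minimal sketch in Python; the dictionary name and structure are illustrative only, with the band limits taken from the values listed in this section (the "narrow" entries are the optional per-region bands mentioned above):

```python
# Wavelength bands (nm) of each imaging device of the imaging system 100,
# as listed in this section. "full" is the band of the whole region;
# "narrow" is the optional specific band mentioned for that region.
WAVELENGTH_BANDS_NM = {
    "R":   {"full": (620, 750),  "narrow": (663, 673)},
    "G":   {"full": (500, 570),  "narrow": (550, 570)},
    "B":   {"full": (450, 500),  "narrow": (465, 485)},
    "RE":  {"full": (705, 745),  "narrow": (712, 722)},
    "NIR": {"full": (800, 2500), "narrow": (800, 900)},
}
```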
  • the image processor 180 performs preset image processing on the respective image signals output from the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, and the imaging device 150 for NIR.
  • the image processor 180 is an example of a circuit.
  • the image processor 180 may be composed of a microprocessor such as a CPU or an MPU, and a microcontroller such as an MCU.
  • the image processor 180 calculates an index representing the state of the plant based on the image signal output from any one of the image sensors. For example, the image processor 180 may calculate a normalized vegetation index (NDVI, Normalized Difference Vegetation Index) according to the R image signal and the NIR image signal.
  • NDVI is represented by the following formula: NDVI = (IR - R) / (IR + R), where IR represents the reflectance in the near-infrared region and R represents the red reflectance in the visible light region.
  • the image processor 180 may calculate gNDVI (Green Normalized Difference Vegetation Index) according to the G image signal and the NIR image signal. gNDVI is represented by the following formula: gNDVI = (NIR - G) / (NIR + G), where G represents the green reflectance in the visible light region.
  • the image processor 180 may calculate SAVI (Soil Adjusted Vegetation Index) according to the R image signal and the NIR image signal. SAVI is represented by the following formula: SAVI = (1 + L) × (NIR - R) / (NIR + R + L).
  • SAVI is an index that takes the difference in soil reflectance into account, and differs from NDVI in this respect. The coefficient L is 1 when vegetation is sparse and 0.25 when vegetation is dense.
  • the image processor 180 may calculate NDRE (Normalized Difference Red Edge Index) according to the NIR image signal and the RE image signal.
  • NDRE is represented by the following formula: NDRE = (NIR - RE) / (NIR + RE), where NIR represents the near-infrared reflectance and RE represents the red edge reflectance.
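  • the per-pixel index calculations above can be sketched as follows. This is a non-limiting illustration in Python with NumPy, assuming the reflectance arrays are already aligned; the small epsilon guarding against division by zero is an addition not found in the present disclosure:

```python
import numpy as np

EPS = 1e-6  # guard against division by zero on dark pixels (an assumption)

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (IR - R) / (IR + R)."""
    return (nir - red) / (nir + red + EPS)

def gndvi(nir: np.ndarray, green: np.ndarray) -> np.ndarray:
    """Green NDVI: (NIR - G) / (NIR + G)."""
    return (nir - green) / (nir + green + EPS)

def savi(nir: np.ndarray, red: np.ndarray, L: float = 0.5) -> np.ndarray:
    """Soil Adjusted Vegetation Index: (1 + L)(NIR - R) / (NIR + R + L).
    Per the text, L is 1 for sparse vegetation and 0.25 for dense vegetation."""
    return (1.0 + L) * (nir - red) / (nir + red + L + EPS)

def ndre(nir: np.ndarray, re: np.ndarray) -> np.ndarray:
    """Normalized Difference Red Edge index: (NIR - RE) / (NIR + RE)."""
    return (nir - re) / (nir + re + EPS)
```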
  • the image processor 180 includes an offset calculation unit 170, an offset correction unit 172, and an image generation unit 174.
  • the image generating unit 174 selects an image signal output from any one image sensor from the image signals output by each image sensor according to a preset condition.
  • the image generating unit 174 can select an R image signal, a G image signal, and a B image signal.
  • the image generating unit 174 generates image data for display based on the R image signal, the G image signal, and the B image signal.
  • the image generation unit 174 can select the R image signal and the NIR image signal.
  • the image generating unit 174 generates image data representing NDVI based on the R image signal and the NIR image signal.
  • the image generation unit 174 can select the G image signal and the NIR image signal.
  • the image generating unit 174 generates image data representing gNDVI based on the G image signal and the NIR image signal.
  • the transmitting unit 190 may transmit image data for display, image data representing NDVI, and image data representing gNDVI to the display device.
  • the transmitting unit 190 may transmit image data for display to the remote operation device 300.
  • the remote operation device 300 may display image data for display, image data representing NDVI, or image data representing gNDVI as a live view image on the display unit.
  • the image generating unit 174 can generate image data for recording according to a preset recording format based on the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal.
  • the image generating unit 174 may generate RAW data in the RAW format from the R image signal, G image signal, B image signal, RE image signal, and NIR image signal as image data for recording.
  • the image generating unit 174 may generate image data for recording of all pixels without performing thinning-out processing on the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal, respectively.
  • the image generating unit 174 may store image data for recording in the memory 192.
  • the memory 192 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 192 may be provided inside the housing of the camera system 100.
  • the memory 192 may be configured to be detachable from the housing of the camera system 100.
  • the optical axes of the respective imaging devices in the imaging system 100 configured as described above are located at different positions. Therefore, in order for the image generating unit 174 to generate image data synthesized from a plurality of image signals, it is necessary to align the image data generated from the respective image signals with each other.
  • the alignment of each image data may be performed in consideration of the number of pixels corresponding to the distance between the optical axes.
  • the optical axis of each imaging device sometimes shifts so that the optical axes are not parallel to each other.
  • each imaging device captures images in a state where the imaging direction is vertically downward.
  • the optical axis 501 of the first imaging device such as the R imaging device 110 and the optical axis 502 of the second imaging device such as the NIR imaging device 150 or the RE imaging device 140 are respectively inclined with respect to the reference axis 500.
  • the amount of positional shift between the first image captured by the first imaging device and the second image captured by the second imaging device may change depending on the distance from the imaging system 100 to the subject. That is, according to the distance from the imaging system 100 to the subject, the shift amount of pixels between the first image captured by the first imaging device and the second image captured by the second imaging device may change.
  • the image processor 180 corrects the pixel shift between the first image taken by the first camera device and the second image taken by the second camera device according to the distance from the first camera device and the second camera device to the subject. For example, when the UAV 10 is flying while photographing plants on the ground with the imaging direction vertically downward, the height h of the first camera device and the second camera device can be regarded as the distance from the first camera device and the second camera device to the subject. Therefore, the image processor 180 corrects the pixel shift between the first image captured by the first camera device and the second image captured by the second camera device according to the height h of the first camera device and the second camera device.
  • the image processor 180 generates an image obtained by combining the corrected first image and second image, and displays it on the display part of the remote operation device 300. For example, when the first image is an R image and the second image is an NIR image, the image processor 180 calculates NDVI for each pixel of the corrected first image and second image, generates an image with the NDVI values as pixel values, and displays it on the display part of the remote operation device 300.
  • the image processor 180 includes an offset calculation unit 170 and an offset correction unit 172 in order to correct the positional shift between the images that accompanies the shift of the optical axes.
  • the offset calculation unit 170 acquires offset information about the offset between the first optical axis 501 of the first imaging device and the second optical axis 502 of the second imaging device, and height information indicating the height h of the first imaging device and the second imaging device.
  • the offset information may include information indicating the angle formed by the first optical axis 501 and the second optical axis 502.
  • the offset information may include, as information on the angle formed by the first optical axis 501 and the second optical axis 502, the angles (θ1x, θ1y) between the first optical axis 501 and the reference axis 500 and the angles (θ2x, θ2y) between the second optical axis 502 and the reference axis 500. θ1x and θ2x indicate the angles formed with the reference axis 500 by the first optical axis 501 and the second optical axis 502, respectively, projected onto a plane parallel to the paper surface of FIG. 5.
  • the offset information may include information on the angles in the x direction and the y direction between the optical axis of the imaging device 110 for R and the reference axis, and on the angles in the x direction and the y direction between the optical axis of the imaging device 150 for NIR and the reference axis.
  • the offset information may include information indicating the distance between the first camera device and the second camera device. It may include information indicating the distance a between the first intersection point of the first imaging surface of the first imaging device and the first optical axis 501 and the second intersection point of the second imaging surface of the second imaging device and the second optical axis 502.
  • the offset information may include information indicating the amount of offset between the first imaging surface and the second imaging surface.
  • the offset information may include information representing the relationship between the height and the offset amount of the position between the first image and the second image.
  • the offset information may indicate, for each of the vertical direction (x direction) and the horizontal direction (y direction), the number of pixels, corresponding to the height, by which the image should be moved in the preset XY coordinate system.
  • the offset information may include information indicating the relationship between the number of pixels and the height in each of the vertical direction (x direction) and the horizontal direction (y direction).
  • the offset information may include information indicating the relationship between the number of pixels in each of the vertical direction (x direction) and the horizontal direction (y direction) and the distance to the subject.
  • the offset information may include information indicating an offset amount of the position between the first image and the second image with respect to a preset height.
  • the offset information may indicate, for each of the vertical direction (x direction) and the horizontal direction (y direction), the number of pixels by which the image should be moved in the XY coordinate system for the preset height.
  • the offset information may include information indicating a positional offset amount between the first image and the second image at a preset distance to the subject.
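  • one way to hold the offset information described above is a small record per imaging device. The sketch below is a hypothetical structure, not the patent's data layout; it keeps the optical-axis angles and the offset of the intersection point on the imaging surface, in the terms used in this section:

```python
from dataclasses import dataclass

@dataclass
class OffsetInfo:
    """Offset information for one imaging device relative to the reference axis.

    theta_x, theta_y: angles (radians) between the optical axis and the
        reference axis in the x and y directions.
    intersection_offset_xy: offset (in pixels) in the x and y directions
        between the reference axis and the intersection of the imaging
        surface with the optical axis.
    """
    theta_x: float
    theta_y: float
    intersection_offset_xy: tuple = (0.0, 0.0)
```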
  • the offset calculation unit 170 may obtain offset information from a memory such as the memory 32.
  • the offset calculation unit 170 may obtain altitude information from the GPS receiver 41 or the barometric altimeter 44 or the like.
  • the offset calculation unit 170 may acquire height information indicating the height of the UAV 10 as the height information indicating the height of each imaging device.
  • the offset calculation unit 170 may also obtain altitude information indicating the altitude of the UAV 10 from the control information of the UAV 10.
  • the offset calculation unit 170 may calculate the direction in which each image should move in the XY coordinate system and the number of pixels as the offset based on the offset information and the height information.
  • the offset calculation unit 170 can calculate the movement amount diff1x of the first image in the x direction as h×tan(θ1x) and the movement amount diff1y of the first image in the y direction as h×tan(θ1y).
  • the offset calculation unit 170 may calculate the movement amount (number of pixels) of the first image in the x direction and the movement amount (number of pixels) of the first image in the y direction by adding, to the movement amount in each direction, the distances in the x direction and the y direction between the reference axis 500 and the first intersection of the first imaging surface of the first imaging device with the first optical axis 501.
  • the offset calculation unit 170 can calculate the movement amount diff2x of the second image in the x direction as h×tan(θ2x) and the movement amount diff2y of the second image in the y direction as h×tan(θ2y).
  • the offset calculation unit 170 can calculate the movement amount (number of pixels) of the second image in the x direction and the movement amount (number of pixels) of the second image in the y direction by adding, to the movement amount in each direction, the distances in the x direction and the y direction between the reference axis 500 and the second intersection of the second imaging surface of the second imaging device with the second optical axis 502.
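  • the movement amounts above (h×tan(θ) plus the intersection offset) can then be computed per image. Below is a minimal sketch using the hypothetical OffsetInfo record from before; the px_per_meter factor that converts the ground displacement into a pixel count is an assumption, since the text only speaks of the number of pixels corresponding to the height:

```python
import math

def shift_amount_px(info: OffsetInfo, distance_m: float,
                    px_per_meter: float) -> tuple:
    """Movement amount (number of pixels) of one image on the XY system.

    distance_m is the height h, or the distance T to the subject when the
    imaging direction is not vertically downward.
    """
    dx = distance_m * math.tan(info.theta_x) * px_per_meter
    dy = distance_m * math.tan(info.theta_y) * px_per_meter
    # add the offset of the optical-axis intersection on the imaging surface
    return (dx + info.intersection_offset_xy[0],
            dy + info.intersection_offset_xy[1])
```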
  • the offset correction unit 172 moves each image according to the offset amount calculated by the offset amount calculation unit 170 to perform alignment between the images on the XY coordinate system.
  • the image generating unit 174 may generate one image by superimposing the aligned images.
  • the image generating unit 174 may generate one image by synthesizing the aligned images.
  • the offset correction unit 172 moves the first image 601 by the movement amount diff1x (number of pixels) in the x direction and by the movement amount diff1y (number of pixels) in the y direction.
  • the offset correction unit 172 can move the second image 603 by the movement amount diff2x (number of pixels) in the x direction and by the movement amount diff2y (number of pixels) in the y direction, to align the first image with the second image on the XY coordinate system.
  • the image generating unit 174 may synthesize the aligned first image 602 and the second image 604 to generate a synthesized image 605.
  • the image generating unit 174 may calculate an index value such as NDVI according to the pixel value of each pixel of the aligned first image 602 and the second image 604 to generate an image with each index value as the pixel value as the composite image 605.
  • the image generating unit 174 may cut out only the overlapping part of the aligned first image 602 and the second image 604 as the composite image 605.
  • the image generating unit 174 may generate an image of a preset size by adding a preset image around the cut out composite image 605.
  • the image generating unit 174 may generate an image by enlarging the cut out composite image 605 into an image of a preset size.
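  • the alignment and cut-out of the overlapping part can be sketched as follows, again as a non-limiting illustration: both images are placed on a shared XY coordinate system using their movement amounts, and only the overlap (the region that becomes the composite image 605) is kept. Integer shifts and equally sized single-channel images are assumed; sub-pixel interpolation is omitted:

```python
import numpy as np

def align_overlap(img1: np.ndarray, shift1: tuple,
                  img2: np.ndarray, shift2: tuple):
    """Return the overlapping crops of two equally sized images after
    placing each on the XY coordinate system by its (dx, dy) movement."""
    dx = int(round(shift2[0] - shift1[0]))  # shift of img2 relative to img1
    dy = int(round(shift2[1] - shift1[1]))
    h, w = img1.shape
    # overlap window expressed in img1 pixel coordinates
    x0, x1 = max(0, dx), min(w, w + dx)
    y0, y1 = max(0, dy), min(h, h + dy)
    crop1 = img1[y0:y1, x0:x1]
    crop2 = img2[y0 - dy:y1 - dy, x0 - dx:x1 - dx]
    return crop1, crop2  # empty arrays if the images do not overlap
```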
  • as shown in FIG. 8, the imaging direction of the imaging system 100 is sometimes not vertically downward.
  • in this case, if the height is regarded as the distance to the subject, the accuracy of the alignment may be reduced.
  • that is, if the offset calculation unit 170 calculates the offset amount regarding the height h as the distance to the subject, the alignment accuracy between the images may be reduced.
  • the offset calculation unit 170 may calculate the distance T to the subject from h/cos( ⁇ g) based on the height h.
  • the offset calculation unit 170 may obtain posture information indicating the posture of the imaging system 100 from the gimbal 50.
  • the offset calculation unit 170 may acquire information indicating the imaging direction of the imaging system 100 as the posture information of the imaging system 100.
  • the offset calculation unit 170 may acquire, as the posture information of the imaging system 100, information indicating the angle θg formed by the vertically downward axis and the reference axis of the imaging system 100.
  • the offset calculation unit 170 may calculate the distance T to the subject based on the angle ⁇ g and the height h.
  • the offset calculation unit 170 may calculate the movement amount diff1x of the first image in the x direction as T×tan(θ1x) and the movement amount diff1y of the first image in the y direction as T×tan(θ1y).
  • the offset calculation unit 170 can calculate the movement amount diff2x of the second image in the x direction as T×tan(θ2x) and the movement amount diff2y of the second image in the y direction as T×tan(θ2y).
  • the offset calculation unit 170 may acquire distance information indicating the distance from the imaging system 100 to the subject, and calculate the offset amount based on the distance information and the offset information.
  • when the imaging direction of the imaging system 100 is vertically downward, the offset calculation unit 170 may regard the height as the distance to the subject and calculate the offset amount.
  • when the imaging direction is not vertically downward, the offset calculation unit 170 may calculate the distance to the subject based on the height information and the posture information, and calculate the offset amount of each image according to that distance.
  • the offset calculation unit 170 may also obtain distance information indicating the distance to the subject through a distance measuring sensor.
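  • the distance calculation for a tilted imaging direction can be sketched in a few lines; the 5-degree threshold below, under which the height itself is used, is an assumed value, as the text only refers to a preset threshold:

```python
import math

def distance_to_subject(height_m: float, theta_g: float) -> float:
    """Distance T to the subject: T = h / cos(theta_g), where theta_g is
    the angle (radians) between the imaging direction and the vertically
    downward direction."""
    if abs(theta_g) <= math.radians(5.0):  # preset threshold (assumption)
        return height_m
    return height_m / math.cos(theta_g)
```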
  • FIG. 9 is a flowchart showing an example of an image alignment processing procedure performed by the image processor 180.
  • the offset calculation unit 170 acquires offset information, height information, and posture information (S100). The offset calculation unit 170 calculates the offset amount of each image based on the offset information, the height information, and the posture information (S102). The offset calculation unit 170 calculates the angle θg formed between the imaging direction of the imaging system 100 and the vertically downward direction based on the posture information. The offset calculation unit 170 can calculate the distance to the subject as h/cos(θg) based on the height h indicated by the height information and the angle θg. If the angle θg is less than or equal to a preset threshold, the offset calculation unit 170 may regard the height h as the distance to the subject. The offset calculation unit 170 may determine whether the imaging direction of the imaging system 100 is vertically downward based on posture information of the imaging system 100 from an acceleration sensor included in the imaging system 100.
  • the offset calculation unit 170 can calculate the offset amount of the first image captured by the first camera device and the offset amount of the second image captured by the second camera device based on the offset information and the distance to the subject.
  • when the offset information shows the angles (θ1x, θ1y) formed by the first optical axis 501 and the reference axis 500, the offset calculation unit 170 can calculate the movement amount diff1x of the first image in the x direction as T×tan(θ1x) and the movement amount diff1y of the first image in the y direction as T×tan(θ1y).
  • the offset calculation unit 170 can calculate the movement amount diff2x of the second image in the x direction as T×tan(θ2x) and the movement amount diff2y of the second image in the y direction as T×tan(θ2y).
  • the offset correction unit 172 acquires the first image captured by the first imaging device and the second image captured by the second imaging device (S104).
  • the offset correction unit 172 aligns the first image with the second image in the XY coordinate system according to the offset amount (S106).
  • the offset correction unit 172 corrects the position of the first image on the XY coordinate system by moving the first image by diff1x in the x direction and by diff1y in the y direction on the XY coordinate system, and corrects the position of the second image by moving the second image by diff2x in the x direction and by diff2y in the y direction, thereby aligning the first image with the second image.
  • the image generating unit 174 synthesizes the aligned first image and second image to generate a display image (S108).
  • the image generating section 174 may calculate NDVI for each pixel of each of the first image as the R image and the second image as the IR image, and generate the NDVI image as a display image.
  • the image generating unit 174 may generate an RGB image by aligning the R image, G image, and B image.
  • the image generating unit 174 may generate a superimposed image in which the NDVI image and the RGB image are superimposed as a display image.
  • the image generation unit 174 transmits the display image to the display unit or the like of the remote operation device 300 (S110).
  • the display unit can display the NDVI image generated from the image taken by the imaging system 100 mounted on the UAV 10 in real time.
  • the image generating unit 174 may store the display image in the memory 192 or the like.
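  • tying the stages S100 to S110 together, one pass of the flow of FIG. 9 could look like the sketch below, reusing the illustrative helpers from the previous sketches. The function names and parameters are hypothetical, not the patent's API:

```python
import numpy as np

def process_frame(offset_r: OffsetInfo, offset_nir: OffsetInfo,
                  height_m: float, theta_g: float, px_per_meter: float,
                  r_img: np.ndarray, nir_img: np.ndarray) -> np.ndarray:
    """One pass of S100-S110 for an R/NIR image pair."""
    # S100/S102: offset, height, and posture info -> per-image movement amounts
    t = distance_to_subject(height_m, theta_g)
    s_r = shift_amount_px(offset_r, t, px_per_meter)
    s_nir = shift_amount_px(offset_nir, t, px_per_meter)
    # S104/S106: acquire the two images and align them on the XY system
    r_aligned, nir_aligned = align_overlap(r_img, s_r, nir_img, s_nir)
    # S108: composite display image with NDVI values as pixel values
    display_image = ndvi(nir_aligned, r_aligned)
    # S110: the display image would then be transmitted to the display unit
    # of the remote operation device 300 (transmission is outside this sketch)
    return display_image
```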
  • alignment between the images could also be performed by pattern matching. However, pattern matching requires the processing capability of the image processor 180 to be increased, and from the viewpoints of power saving, weight reduction, and cost reduction, it is sometimes not preferable to increase the processing capability of the image processor 180 mounted on the UAV 10.
  • in addition, it is not easy to perform pattern matching with high accuracy between images of different wavelength bands, for example, between an IR image or RE image and an R image or G image.
  • according to the present embodiment, the alignment between images of different wavelength bands can be performed more easily than with pattern matching.
  • FIG. 10 is a diagram showing another example of the appearance of the imaging system 100 mounted on the UAV 10.
  • the imaging system 100 differs from the imaging system 100 shown in FIG. 2 in that it further includes an imaging device 160 for RGB.
  • the RGB imaging device 160 may be the same as a normal camera, and includes an optical system and an image sensor.
  • the image sensor may include a filter configured by a Bayer array and transmitting light in the red region, a filter transmitting light in the green region, and a filter transmitting light in the blue region.
  • the RGB imaging device 160 can output RGB images.
  • the wavelength band of the red region may be 620 nm to 750 nm.
  • the wavelength band of the green region may be 500 nm to 570 nm.
  • the wavelength band of the blue region is 450 nm to 500 nm.
  • in the above description, two imaging devices, the first imaging device and the second imaging device, are used as an example.
  • however, the number of imaging devices is not limited to two. Even with three or more imaging devices, the offset between the reference axis and each optical axis can be calculated for image alignment.
  • the combination of the first imaging device and the second imaging device may be any combination of the imaging device 110 for R, the imaging device 120 for G, the imaging device 130 for B, the imaging device 140 for RE, the imaging device 150 for NIR, and the imaging device 160 for RGB. That is, the first image may be an R image and the second image an NIR image, or the first image may be a G image and the second image an RGB image.
  • FIG. 11 shows an example of a computer 1200 that may fully or partially embody various aspects of the present invention.
  • the program installed on the computer 1200 can cause the computer 1200 to function as one or more "parts" of the device according to the embodiments of the present invention, or to perform operations associated with that device.
  • the program can cause the computer 1200 to perform the operation or the one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specific operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program dependent on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by realizing the operation or processing of information in accordance with the usage of the computer 1200.
  • the CPU 1212 may execute a communication program loaded in the RAM 1214, and instruct the communication interface 1222 to perform communication processing according to the processing described in the communication program.
  • the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer provided on the recording medium, etc.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • the CPU 1212 can perform the various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and other types of processing described in various parts of the present disclosure and specified by the instruction sequence of the program, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in files, databases, and the like in the recording medium. For example, when multiple entries each including an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the multiple entries, an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that meets the preset condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)

Abstract

It is desired to suppress the load of alignment processing between images. An image processing device processes the images captured by a first imaging device and a second imaging device arranged in a preset positional relationship. The image processing device may include a circuit configured to: acquire offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and height information indicating the heights of the first imaging device and the second imaging device; acquire a first image captured by the first imaging device and a second image captured by the second imaging device; and align the first image with the second image according to the offset information and the height information.

Description

Image processing device, image processing method, and program
Technical Field
The present invention relates to an image processing device, an image processing method, and a program.
Background Art
Patent Document 1 describes calculating a normalized difference vegetation index of an inspection object based on measurement values sensed by two sensing devices.
[Patent Document 1] International Publication No. 2017/221756.
Summary of the Invention
When an index such as a normalized difference vegetation index is calculated from images captured by a plurality of imaging devices, it is desirable to be able to suppress the load of alignment processing between the images.
The image processing device according to the present invention may be an image processing device that processes the images captured by a first imaging device and a second imaging device arranged in a preset positional relationship. The image processing device may include a circuit configured to acquire offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and height information indicating the heights of the first imaging device and the second imaging device. The circuit may be configured to acquire a first image captured by the first imaging device and a second image captured by the second imaging device. The circuit may be configured to align the first image with the second image according to the offset information and the height information.
The offset information may include information indicating the angle formed by the first optical axis and the second optical axis.
The offset information may include information indicating the distance between a first intersection point of the first imaging surface of the first imaging device with the first optical axis and a second intersection point of the second imaging surface of the second imaging device with the second optical axis.
The offset information may include information indicating the relationship between the height and the positional offset amount between the first image and the second image.
The offset information may include information indicating the positional offset amount between the first image and the second image with respect to a preset height.
The first imaging device and the second imaging device may be mounted on a mobile body.
The first imaging device and the second imaging device may be mounted on the mobile body via a support mechanism that supports the first imaging device and the second imaging device such that their postures can be adjusted.
The circuit may be configured to further acquire posture information indicating the posture states of the first imaging device and the second imaging device. The circuit may be configured to further align the first image with the second image according to the posture information.
The mobile body may be a flying body.
The first imaging device and the second imaging device may be mounted on the mobile body via a support mechanism that supports the first imaging device and the second imaging device such that their postures can be adjusted. The circuit may be configured to acquire posture information indicating the posture states of the first imaging device and the second imaging device. The circuit may be configured to determine, according to the posture information, whether the angle formed between the imaging directions of the first imaging device and the second imaging device and the vertically downward direction is less than or equal to a preset angle. The circuit may be configured to align the first image with the second image according to the offset information and the height information when the angle is less than or equal to the preset angle. The circuit may be configured to align the first image with the second image according to the offset information, the height information, and the posture information when the angle is greater than the preset angle.
The first imaging device may capture images of a first waveband. The second imaging device may capture images of a second waveband.
The first waveband may be a waveband in the near-infrared region. The second waveband may be a waveband in the red region, the green region, or the red edge region.
An image processing device according to one aspect of the present invention may be an image processing device that processes the images captured by a first imaging device and a second imaging device arranged in a preset positional relationship. The image processing device may include a circuit configured to acquire offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and distance information indicating the distance to a subject photographed by the first imaging device and the second imaging device. The circuit may be configured to acquire a first image including the subject captured by the first imaging device and a second image including the subject captured by the second imaging device. The circuit may be configured to align the first image with the second image according to the offset information and the distance information.
The first imaging device may capture images of a first waveband. The second imaging device may capture images of a second waveband.
The first waveband may be a waveband in the near-infrared region. The second waveband may be a waveband in the red region, the green region, or the red edge region.
The offset information may include information indicating the angle formed by the first optical axis and the second optical axis.
The offset information may include information indicating the distance between a first intersection point of the first imaging surface of the first imaging device with the first optical axis and a second intersection point of the second imaging surface of the second imaging device with the second optical axis.
The offset information may include information indicating the relationship between the distance to the subject and the positional offset amount between the first image and the second image.
An image processing method according to one aspect of the present invention may be an image processing method that processes the images captured by a first imaging device and a second imaging device arranged in a preset positional relationship. The image processing method may include a stage of acquiring offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and height information indicating the heights of the first imaging device and the second imaging device. The image processing method may include a stage of acquiring a first image captured by the first imaging device and a second image captured by the second imaging device. The image processing method may include a stage of aligning the first image with the second image according to the offset information and the height information.
An image processing method according to one aspect of the present invention may be an image processing method that processes the images captured by a first imaging device and a second imaging device arranged in a preset positional relationship. The image processing method may include a stage of acquiring offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and distance information indicating the distance to a subject photographed by the first imaging device and the second imaging device. The image processing method may include a stage of acquiring a first image including the subject captured by the first imaging device and a second image including the subject captured by the second imaging device. The image processing method may include a stage of aligning the first image with the second image according to the offset information and the distance information.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above-described image processing device.
According to one aspect of the present invention, the load of alignment processing between images can be suppressed.
In addition, the above summary of the invention does not enumerate all the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) and a remote operation device;
FIG. 2 is a diagram showing an example of the appearance of the imaging system mounted on the UAV;
FIG. 3 is a diagram showing an example of functional blocks of the UAV;
FIG. 4 is a diagram showing an example of functional blocks of the imaging system;
FIG. 5 is a diagram showing an example of a shooting situation of the imaging system mounted on the UAV;
FIG. 6 is a diagram showing an example of the shift amount of the optical axis;
FIG. 7 is a diagram for explaining image alignment;
FIG. 8 is a diagram showing an example of the shooting state of the imaging system mounted on the UAV;
FIG. 9 is a flowchart showing an example of an image alignment process performed by the image processor;
FIG. 10 is a diagram showing another example of the appearance of the imaging system mounted on the UAV;
FIG. 11 is a diagram showing an example of the hardware configuration.
[Description of Reference Numerals]
10 UAV
20 UAV main body
30 UAV control unit
32 Memory
36 Communication interface
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
45 Temperature sensor
46 Humidity sensor
50 Gimbal
60 Imaging device
100 Imaging system
110 Imaging device for R
112 Image sensor for R
114 Optical system
120 Imaging device for G
122 Image sensor for G
124 Optical system
130 Imaging device for B
132 Image sensor for B
134 Optical system
140 Imaging device for RE
142 Image sensor for RE
144 Optical system
150 Imaging device for NIR
152 Image sensor for NIR
154 Optical system
160 Imaging device for RGB
170 Offset calculation unit
172 Offset correction unit
174 Image generation unit
180 Image processor
190 Transmission unit
192 Memory
300 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM
Detailed Description

Hereinafter, the present invention will be described through embodiments, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It is apparent to persons skilled in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that forms with such changes or improvements can also be included within the technical scope of the present invention.

The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner will not object to anyone reproducing these documents as they appear in the files or records of the Patent Office. In all other cases, however, all copyrights are reserved.

Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is executed or (2) a "unit" of a device having the role of executing the operation. Specified stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits, which may include memory elements and the like such as logical AND, logical OR, logical XOR, logical NAND, logical NOR and other logical operations, flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).

A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for executing the operations specified in the flowcharts or block diagrams. Examples of the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.

Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages, including conventional procedural programming languages. The source code or object code may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for executing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging system 100. The gimbal 50 and the imaging system 100 are an example of an imaging system. The UAV 10 is an example of a mobile body. The concept of a mobile body includes a flying object moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. The concept of a flying object moving in the air includes not only the UAV but also other aircraft, airships, helicopters, and the like that move in the air.

The UAV body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion unit. The UAV body 20 flies the UAV 10 by controlling the rotation of the rotors. For example, the UAV body 20 uses four rotors to fly the UAV 10, but the number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.

The imaging system 100 is a multispectral camera that captures objects within a desired imaging range in each of a plurality of wavelength bands. The gimbal 50 rotatably supports the imaging system 100 and is an example of a support mechanism. For example, the gimbal 50 uses an actuator to rotatably support the imaging system 100 about the pitch axis, and further uses actuators to rotatably support it about each of the roll axis and the yaw axis. The gimbal 50 can change the attitude of the imaging system 100 by rotating the imaging system 100 about at least one of the yaw axis, the pitch axis, and the roll axis.

The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the nose, that is, the front, of the UAV 10, and two other imaging devices 60 may be provided on the bottom surface. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera, and the two imaging devices 60 on the bottom side may likewise be paired to function as a stereo camera. An imaging device 60 can measure the presence of an object within its imaging range and the distance to the object. An imaging device 60 is an example of a measurement device that measures an object present in the imaging direction of the imaging system 100; the measurement device may also be another sensor, such as an infrared sensor or an ultrasonic sensor, that measures an object present in the imaging direction of the imaging system 100. Three-dimensional spatial data around the UAV 10 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it suffices that the UAV 10 includes at least one imaging device 60. The UAV 10 may include at least one imaging device 60 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable in the imaging devices 60 may be larger than the angle of view settable in the imaging system 100. The imaging devices 60 may include a single-focus lens or a fisheye lens.
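Although the patent does not disclose how the imaging devices 60 derive distance, the measurement performed by such a stereo pair can be illustrated with the standard pinhole-stereo relation Z = f × B / d; the function name and the example values in the following sketch are hypothetical.

```python
# Illustrative sketch only: distance from a stereo pair via the pinhole
# relation Z = f * B / d. The focal length, baseline, and disparity values
# are hypothetical, not parameters disclosed for the imaging devices 60.

def stereo_distance_m(focal_length_px: float, baseline_m: float,
                      disparity_px: float) -> float:
    """Distance (m) to a point seen by both cameras of a stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 800 px, baseline = 0.1 m, disparity = 16 px -> 5.0 m.
print(stereo_distance_m(800.0, 0.1, 16.0))
```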
The remote operation device 300 communicates with the UAV 10 to operate it remotely, and may communicate with the UAV 10 wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10, and may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascent command that causes the UAV 10 to ascend. The UAV 10 ascends while it receives the ascent command; when its altitude has reached an upper limit, the UAV 10 may restrict its ascent even if it receives the ascent command.

Fig. 2 is a diagram showing an example of the appearance of the imaging system 100 mounted on the UAV 10. The imaging system 100 is a multispectral camera that separately captures image data in each of a plurality of preset wavelength bands. The imaging system 100 includes an imaging device for R 110, an imaging device for G 120, an imaging device for B 130, an imaging device for RE 140, and an imaging device for NIR 150. The imaging system 100 can record the image data captured by the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150 as a multispectral image. For example, multispectral images can be used to predict the health and vigor of crops.
图3示出UAV10的功能块的一个示例。UAV10包括UAV控制部30、存储器32、通信接口36、推进部40、GPS接收器41、惯性测量装置42、磁罗盘43、气压高度计44、温度传感器45、湿度传感器46、万向节50、摄像装置60及摄像系统100。
通信接口36与远程操作装置300等其它装置通信。通信接口36可以从远程操作装置300接收包括对UAV控制部30的各种指令的指示信息。存储器32存储UAV控制部30对推进部40、GPS接收器41、惯性测量装置(IMU)42、磁罗盘43、气压高度计44、温度传感器45、湿度传感器46、万向节50、摄像装置60及摄像系统100进行控制所需的程序等。存储器32可以为计算机可读记录介质,可以包括SRAM、DRAM、EPROM、EEPROM、以及USB存储器等的闪存中的至少一个。存储器32可以设置于UAV主体20的内部。其可以设置成可从UAV主体20中拆卸下来。
UAV控制部30按照储存在存储器32中的程序来控制UAV10的飞行及拍摄。UAV控制部30可以由CPU或MPU等微处理器、以及MCU等微控制器等构成。UAV控制部30按照经由通信接口36从远程操作装置300接收到的指令来控制UAV10的飞行及拍摄。推进部40推进UAV10。推进部40包括多个旋翼和使多个旋翼旋转的多个驱动电机。推进部40按照来自UAV控制部30的指令,经由多个驱动电机使多个旋翼旋转,以使UAV10飞行。
GPS接收器41接收表示从多个GPS卫星发送的时间的多个信号。GPS接收器41根据所接收的多个信号计算GPS接收器41的位置(纬度、经度及高度)、即UAV10的位置(纬度、经度及高度)。IMU42检测UAV10的姿态。IMU42按照UAV10的姿态检测UAV10的前后、左右以及上下的三轴方向的加速度和俯仰轴、滚转轴以及偏航轴的三轴方向的角速度。磁罗盘43检测UAV10的机头的方位。气压高度计44检测UAV10的飞行高度。气压高度计44检测UAV10周围的气压,并将检测到的气压换算为高度,以检测高度。温度传感器45检测UAV10周围的温度。湿度传感器46检测UAV10周围的湿度。
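The pressure-to-altitude conversion performed by a barometric altimeter can be sketched with the international barometric formula. This is only a standard-atmosphere illustration (1013.25 hPa and 15 °C at sea level), not the conversion actually implemented in the altimeter 44.

```python
# Sketch of barometric altitude: international barometric formula under
# standard-atmosphere assumptions. Not the altimeter 44's actual firmware.

def pressure_to_altitude_m(pressure_hpa: float,
                           sea_level_hpa: float = 1013.25) -> float:
    """Convert measured air pressure (hPa) to altitude above sea level (m)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Example: 900 hPa corresponds to roughly 989 m above sea level.
print(round(pressure_to_altitude_m(900.0)))
```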
Fig. 4 shows an example of functional blocks of the imaging system 100. The imaging system 100 includes the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. The imaging system 100 also includes an image processor 180, a transmission unit 190, and a memory 192.

The imaging device for R 110 includes an image sensor for R 112 and an optical system 114. The image sensor for R 112 captures the image formed by the optical system 114. The image sensor for R 112 includes a filter that transmits light in the wavelength band of the red region, and outputs an R image signal, that is, an image signal in the wavelength band of the red region. The wavelength band of the red region is, for example, 620 nm to 750 nm, and may be a specific band within the red region, for example, 663 nm to 673 nm.

The imaging device for G 120 includes an image sensor for G 122 and an optical system 124. The image sensor for G 122 captures the image formed by the optical system 124. The image sensor for G 122 includes a filter that transmits light in the wavelength band of the green region, and outputs a G image signal, that is, an image signal in the wavelength band of the green region. The wavelength band of the green region is, for example, 500 nm to 570 nm, and may be a specific band within the green region, for example, 550 nm to 570 nm.

The imaging device for B 130 includes an image sensor for B 132 and an optical system 134. The image sensor for B 132 captures the image formed by the optical system 134. The image sensor for B 132 includes a filter that transmits light in the wavelength band of the blue region, and outputs a B image signal, that is, an image signal in the wavelength band of the blue region. The wavelength band of the blue region is, for example, 450 nm to 500 nm, and may be a specific band within the blue region, for example, 465 nm to 485 nm.

The imaging device for RE 140 includes an image sensor for RE 142 and an optical system 144. The image sensor for RE 142 captures the image formed by the optical system 144. The image sensor for RE 142 includes a filter that transmits light in the wavelength band of the red-edge region, and outputs an RE image signal, that is, an image signal in the wavelength band of the red-edge region. The wavelength band of the red-edge region is, for example, 705 nm to 745 nm, and may be 712 nm to 722 nm.

The imaging device for NIR 150 includes an image sensor for NIR 152 and an optical system 154. The image sensor for NIR 152 captures the image formed by the optical system 154. The image sensor for NIR 152 includes a filter that transmits light in the wavelength band of the near-infrared region, and outputs an NIR image signal, that is, an image signal in the wavelength band of the near-infrared region. The wavelength band of the near-infrared region is, for example, 800 nm to 2500 nm, and may be 800 nm to 900 nm.
The image processor 180 performs preset image processing on the image signals output from the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. The image processor 180 is an example of a circuit, and may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.

The image processor 180 calculates an index representing the state of plants from the image signal output from any of the image sensors. For example, the image processor 180 can calculate the Normalized Difference Vegetation Index (NDVI) from the R image signal and the NIR image signal. The more vegetation there is, the more foliage the plants have, and the higher the activity of the plants, the higher the NDVI value.

NDVI is expressed by the following equation, where IR denotes the reflectance in the near-infrared region and R denotes the reflectance of red in the visible light region:

NDVI = (IR - R) / (IR + R)
For example, the image processor 180 can calculate gNDVI (Green Normalized Difference Vegetation Index) from the G image signal and the NIR image signal. gNDVI is expressed by the following equation, where G denotes the reflectance of green in the visible light region:

gNDVI = (IR - G) / (IR + G)
The image processor 180 can calculate SAVI (Soil Adjusted Vegetation Index) from the R image signal and the NIR image signal. SAVI is expressed by the following equation:

SAVI = (1 + L) × (IR - R) / (IR + R + L)

SAVI is an index that takes differences in soil reflectance into account, and it differs from NDVI in this respect. L is 1 when the vegetation cover is small and 0.25 when it is large.
The image processor 180 can calculate NDRE (Normalized Difference Red Edge index) from the NIR image signal and the RE image signal. NDRE is expressed by the following equation, where NIR denotes the near-infrared reflectance and RE denotes the red-edge reflectance:

NDRE = (NIR - RE) / (NIR + RE)

By using NDRE, the vegetation distribution can be analyzed in more depth; for example, the difference between Japanese cedar and Japanese cypress can be analyzed.
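The four indices above can be computed per pixel from reflectance arrays, for example as in the following NumPy sketch. The function and array names are illustrative, and the SAVI default of L = 0.5 is a common convention rather than a value given in the text, which specifies 1 for sparse and 0.25 for dense vegetation.

```python
# Per-pixel vegetation indices from reflectance arrays, following the
# formulas above. Names and the SAVI default L=0.5 are illustrative.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (IR - R) / (IR + R)."""
    return (nir - red) / (nir + red)

def gndvi(nir: np.ndarray, green: np.ndarray) -> np.ndarray:
    """gNDVI = (IR - G) / (IR + G)."""
    return (nir - green) / (nir + green)

def savi(nir: np.ndarray, red: np.ndarray, l: float = 0.5) -> np.ndarray:
    """SAVI = (1 + L)(IR - R) / (IR + R + L); per the text, L is 1 for
    sparse vegetation and 0.25 for dense vegetation."""
    return (1.0 + l) * (nir - red) / (nir + red + l)

def ndre(nir: np.ndarray, red_edge: np.ndarray) -> np.ndarray:
    """NDRE = (NIR - RE) / (NIR + RE)."""
    return (nir - red_edge) / (nir + red_edge)
```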
The image processor 180 includes an offset calculation unit 170, an offset correction unit 172, and an image generation unit 174. The image generation unit 174 selects, according to preset conditions, an image signal output from any of the image sensors among the image signals output from the individual image sensors.

The image generation unit 174 may select the R image signal, the G image signal, and the B image signal, and generate image data for display from the R image signal, the G image signal, and the B image signal.

The image generation unit 174 may select the R image signal and the NIR image signal, and generate image data representing NDVI from the R image signal and the NIR image signal.
The image generation unit 174 may select the G image signal and the NIR image signal, and generate image data representing gNDVI from the G image signal and the NIR image signal.
The transmission unit 190 may transmit the image data for display, the image data representing NDVI, and the image data representing gNDVI to a display device. For example, the transmission unit 190 may transmit the image data for display to the remote operation device 300. The remote operation device 300 may display the image data for display, the image data representing NDVI, or the image data representing gNDVI as a live view image on its display unit.

The image generation unit 174 may generate image data for recording in a preset recording format from the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal. For example, the image generation unit 174 may generate RAW data in RAW format from the R image signal, the G image signal, the B image signal, the RE image signal, and the NIR image signal as the image data for recording. The image generation unit 174 may generate full-pixel image data for recording without thinning out each of these image signals. The image generation unit 174 may store the image data for recording in the memory 192. The memory 192 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 192 may be provided inside the housing of the imaging system 100, and may be provided so as to be detachable from the housing of the imaging system 100.
In the imaging system 100 configured as described above, the optical axes of the individual imaging devices are located at different positions. Therefore, in order to generate image data composited from a plurality of image signals, the image generation unit 174 needs to align the image data generated from the individual image signals with one another.

If the optical axis of each imaging device were not tilted, it would suffice to align the image data by taking into account the number of pixels corresponding to the distance between the optical axes. In the manufacturing process, however, the optical axes are sometimes offset into a tilted state.

For example, as shown in Fig. 5, each imaging device captures images with the imaging direction vertically downward while the UAV 10 flies.

Here, the optical axis 501 of a first imaging device such as the imaging device for R 110 and the optical axis 502 of a second imaging device such as the imaging device for NIR 150 or the imaging device for RE 140 are each inclined with respect to a reference axis 500. In this case, the amount of positional offset between the first image captured by the first imaging device and the second image captured by the second imaging device changes according to the distance from the imaging system 100 to the subject. That is, the pixel offset between the first image captured by the first imaging device and the second image captured by the second imaging device changes according to the distance from the imaging system 100 to the subject.

Therefore, the image processor 180 corrects the pixel offset between the first image captured by the first imaging device and the second image captured by the second imaging device according to the distance from the first and second imaging devices to the subject. For example, when plants on the ground and the like are imaged with the imaging direction vertically downward while the UAV 10 flies, the height h of the first and second imaging devices can be regarded as the distance from the first and second imaging devices to the subject. The image processor 180 therefore corrects the pixel offset between the first image and the second image according to the height h of the first and second imaging devices. The image processor 180 generates an image obtained by compositing the corrected first image and second image, and displays it on the display unit of the remote operation device 300. For example, when the first image is an R image and the second image is an NIR image, the image processor 180 calculates NDVI for each pixel of the corrected first and second images, generates an image in which the NDVI values are represented as pixel values, and displays it on the display unit of the remote operation device 300.
The image processor 180 includes the offset calculation unit 170 and the offset correction unit 172 in order to correct the positional offset between images that accompanies the offset of the optical axes. The offset calculation unit 170 acquires offset information about the offset between the first optical axis 501 of the first imaging device and the second optical axis 502 of the second imaging device, and height information indicating the height h of the first and second imaging devices.

The offset information may include information indicating the angle formed by the first optical axis 501 and the second optical axis 502. The offset information may include, as the information indicating the angle formed by the first optical axis 501 and the second optical axis 502, the angle (θ1x, θ1y) formed by the first optical axis 501 with the reference axis 500 and the angle (θ2x, θ2y) formed by the second optical axis 502 with the reference axis 500. θ1x and θ2x denote the angles formed with the reference axis 500 by the first optical axis 501 and the second optical axis 502 projected onto a plane parallel to the drawing sheet of Fig. 5. θ1y and θ2y denote the angles formed with the reference axis 500 by the first optical axis 501 and the second optical axis 502 projected onto a plane perpendicular to the drawing sheet of Fig. 5 and containing the reference axis 500. As shown in Fig. 6, when an image representing NDVI is generated, the offset information may include information indicating the angles formed with the reference axis by the optical axis of the imaging device for R 110 in the x and y directions and by the optical axis of the imaging device for NIR 150 in the x and y directions.

The offset information may include information indicating the distance between the first imaging device and the second imaging device. The offset information may include information indicating the distance a between the first intersection of the first imaging surface of the first imaging device with the first optical axis 501 and the second intersection of the second imaging surface of the second imaging device with the second optical axis 502. The offset information may include information indicating the amount of offset between the first imaging surface and the second imaging surface. The offset information may include information indicating the relationship between height and the amount of positional offset between the first image and the second image. The offset information may indicate, for each of the vertical direction (x direction) and the horizontal direction (y direction), the number of pixels corresponding to the height by which an image should be moved in a preset XY coordinate system.

The offset information may include information indicating the relationship between the number of pixels and the height for each of the vertical direction (x direction) and the horizontal direction (y direction). The offset information may include information indicating the relationship between the number of pixels and the distance to the subject for each of the vertical direction (x direction) and the horizontal direction (y direction).

The offset information may include information indicating the amount of positional offset between the first image and the second image at a preset height. The offset information may indicate, for each of the vertical direction (x direction) and the horizontal direction (y direction), the number of pixels by which an image should be moved in the XY coordinate system at the preset height. The offset information may include information indicating the amount of positional offset between the first image and the second image at a preset distance to the subject.
The offset calculation unit 170 may acquire the offset information from a memory such as the memory 32, and may acquire the height information from the GPS receiver 41, the barometric altimeter 44, or the like. The offset calculation unit 170 may acquire height information indicating the altitude of the UAV 10 as the height information indicating the height of each imaging device. When the UAV 10 is set to fly at a preset altitude, the offset calculation unit 170 may also acquire the height information indicating the altitude of the UAV 10 from the control information of the UAV 10.

The offset calculation unit 170 may calculate, as the offset amount, the direction and the number of pixels by which each image should be moved in the XY coordinate system according to the offset information and the height information. When the offset information of the first image indicates the angle (θ1x, θ1y) formed by the first optical axis 501 with the reference axis 500, the offset calculation unit 170 may calculate the movement amount diff1x of the first image in the x direction as h × tan(θ1x), and the movement amount diff1y of the first image in the y direction as h × tan(θ1y). The offset calculation unit 170 may calculate the movement amount (number of pixels) of the first image in the x direction and the movement amount (number of pixels) of the first image in the y direction by adding, to the movement amount in each direction, the distance in that direction between the reference axis 500 and the first intersection of the first imaging surface of the first imaging device with the first optical axis 501. Likewise, when the offset information of the second image indicates the angle (θ2x, θ2y) formed by the second optical axis 502 with the reference axis 500, the offset calculation unit 170 may calculate the movement amount diff2x of the second image in the x direction as h × tan(θ2x), and the movement amount diff2y of the second image in the y direction as h × tan(θ2y). The offset calculation unit 170 may calculate the movement amount (number of pixels) of the second image in the x direction and the movement amount (number of pixels) of the second image in the y direction by adding, to the movement amount in each direction, the distance in that direction between the reference axis 500 and the second intersection of the second imaging surface of the second imaging device with the second optical axis 502.
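As a minimal sketch of this computation, the ground offset h × tan(θ) can be converted to pixels with a known ground-sample distance. The meters-per-pixel factor is an assumption for illustration; the text itself specifies only the tangent term plus the on-sensor intersection offset.

```python
# Sketch of the offset calculation unit 170. The meters-per-pixel factor
# (ground-sample distance) used to convert the ground offset into pixels is
# an illustrative assumption; function and parameter names are hypothetical.
import math

def image_shift_px(distance_m: float, tilt_x_rad: float, tilt_y_rad: float,
                   intersection_offset_px: tuple[float, float],
                   meters_per_px: float) -> tuple[float, float]:
    """(x, y) shift in pixels for one image in the common XY coordinate system.

    distance_m is the height h (vertically-downward case) or the subject
    distance T; tilt_*_rad are the angles of the optical axis to the
    reference axis; intersection_offset_px is the offset of the optical-axis
    intersection on the imaging surface from the reference axis, in pixels.
    """
    diff_x = distance_m * math.tan(tilt_x_rad) / meters_per_px
    diff_y = distance_m * math.tan(tilt_y_rad) / meters_per_px
    return (diff_x + intersection_offset_px[0],
            diff_y + intersection_offset_px[1])
```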
The offset correction unit 172 moves each image according to the offset amount calculated by the offset calculation unit 170, and aligns the images with one another in the XY coordinate system. The image generation unit 174 may generate a single image by superimposing the aligned images, or by compositing the aligned images.

For example, as shown in Fig. 7, the offset correction unit 172 moves a first image 601 by the movement amount diff1x (number of pixels) in the x direction of the first image and by the movement amount diff1y (number of pixels) in the y direction of the first image. The offset correction unit 172 can align the first image with a second image in the XY coordinate system by moving a second image 603 by the movement amount diff2x (number of pixels) in the x direction of the second image 603 and by the movement amount diff2y (number of pixels) in the y direction of the second image 603. The image generation unit 174 may composite the aligned first image 602 and second image 604 to generate a composite image 605. The image generation unit 174 may calculate an index value such as NDVI for each pixel from the pixel values of the aligned first image 602 and second image 604, and generate, as the composite image 605, an image having the index values as pixel values. The image generation unit 174 may cut out only the overlapping portion of the aligned first image 602 and second image 604 as the composite image 605. The image generation unit 174 may generate an image of a preset size by adding a preset image around the cut-out composite image 605, or may generate an image by enlarging the cut-out composite image 605 to a preset size.
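A minimal sketch of this shift-and-composite step is given below. SciPy's shift function is one possible resampling choice rather than the method the text mandates, and cropping to the overlapping region is omitted for brevity.

```python
# Sketch of the offset correction unit 172 / image generation unit 174:
# shift each image by its computed offset, then build a per-pixel NDVI
# composite from the aligned pair. Resampling via scipy is an assumption.
import numpy as np
from scipy.ndimage import shift

def align_and_composite(r_image: np.ndarray, nir_image: np.ndarray,
                        r_shift_xy: tuple[float, float],
                        nir_shift_xy: tuple[float, float]) -> np.ndarray:
    # scipy's shift takes offsets in (row, col) = (y, x) order.
    r_aligned = shift(r_image, (r_shift_xy[1], r_shift_xy[0]), order=1)
    nir_aligned = shift(nir_image, (nir_shift_xy[1], nir_shift_xy[0]), order=1)
    # NDVI of the aligned pair; the epsilon guards against division by zero.
    return (nir_aligned - r_aligned) / (nir_aligned + r_aligned + 1e-12)
```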
However, there are also cases where the imaging direction of the imaging system 100 is not vertically downward. When the imaging direction is not vertically downward, regarding the height as the distance to the subject may reduce the accuracy of the alignment.

For example, as shown in Fig. 8, the imaging direction of the imaging system 100 may be a direction forming an angle θg between the axis 510 indicating the vertically downward direction and the reference axis 500 of the imaging system 100. In this case, if the offset calculation unit 170 calculates the offset amount regarding the height h as the distance to the subject, the accuracy of the alignment between the images may be reduced.

To address this, the offset calculation unit 170 can calculate the distance T to the subject from the height h as h / cos(θg).

The offset calculation unit 170 may acquire attitude information indicating the attitude of the imaging system 100 from the gimbal 50. The offset calculation unit 170 may acquire information indicating the imaging direction of the imaging system 100 as the attitude information of the imaging system 100, and may acquire information indicating the angle θg formed between the axis of the vertically downward direction and the reference axis of the imaging system 100 as the attitude information of the imaging system 100. The offset calculation unit 170 may calculate the distance T to the subject from the angle θg and the height h. For example, when the offset information of the first image indicates the angle (θ1x, θ1y) formed by the first optical axis 501 with the reference axis 500, the offset calculation unit 170 may calculate the movement amount diff1x of the first image in the x direction as T × tan(θ1x), and the movement amount diff1y of the first image in the y direction as T × tan(θ1y). Likewise, when the offset information of the second image indicates the angle (θ2x, θ2y) formed by the second optical axis 502 with the reference axis 500, the offset calculation unit 170 may calculate the movement amount diff2x of the second image in the x direction as T × tan(θ2x), and the movement amount diff2y of the second image in the y direction as T × tan(θ2y).

The offset calculation unit 170 may acquire distance information indicating the distance from the imaging system 100 to the subject, and calculate the offset amount from the distance information and the offset information. When the angle formed by the imaging direction of the imaging system 100 and the vertically downward direction is less than or equal to a preset angle, for example, less than or equal to 10 degrees or less than or equal to 5 degrees, the offset calculation unit 170 may calculate the offset amount regarding the height as the distance to the subject. When the angle formed by the imaging direction of the imaging system 100 and the vertically downward direction is greater than the preset angle, the offset calculation unit 170 may calculate the distance to the subject from the height information and the attitude information, and calculate the offset amount of each image from that distance. The offset calculation unit 170 may also acquire the distance information indicating the distance to the subject by means of a distance-measuring sensor.
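This distance selection can be summarized as the following sketch, assuming the 5-degree figure mentioned above as the threshold:

```python
# Sketch of subject-distance selection: use the height h as the subject
# distance when the imaging direction is within a preset angle of vertically
# downward, and T = h / cos(theta_g) otherwise.
import math

def subject_distance_m(height_m: float, theta_g_rad: float,
                       threshold_rad: float = math.radians(5.0)) -> float:
    if abs(theta_g_rad) <= threshold_rad:
        return height_m
    return height_m / math.cos(theta_g_rad)
```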
Fig. 9 is a flowchart showing an example of the image alignment procedure performed by the image processor 180.

The offset calculation unit 170 acquires the offset information, the height information, and the attitude information (S100). The offset calculation unit 170 calculates the offset amount of each image from the offset information, the height information, and the attitude information (S102). From the attitude information, the offset calculation unit 170 calculates the angle θg formed between the imaging direction of the imaging system 100 and the vertically downward direction. From the height h indicated by the height information and the angle θg formed between the imaging direction of the imaging system 100 and the vertically downward direction, the offset calculation unit 170 can calculate the distance to the subject as h / cos(θg). If the angle θg is less than or equal to a preset threshold, the offset calculation unit 170 may take the height h as the distance to the subject. The offset calculation unit 170 may determine whether the imaging direction of the imaging system 100 is vertically downward from attitude information of the imaging system 100 obtained from an acceleration sensor included in the imaging system 100.

The offset calculation unit 170 can calculate the offset amount of the first image captured by the first imaging device and the offset amount of the second image captured by the second imaging device from the offset information and the distance to the subject. When the offset information of the first image indicates the angle (θ1x, θ1y) formed by the first optical axis 501 with the reference axis 500, the offset calculation unit 170 can calculate the movement amount diff1x of the first image in the x direction as T × tan(θ1x), and the movement amount diff1y of the first image in the y direction as T × tan(θ1y). Likewise, when the offset information of the second image indicates the angle (θ2x, θ2y) formed by the second optical axis 502 with the reference axis 500, it can calculate the movement amount diff2x of the second image in the x direction as T × tan(θ2x), and the movement amount diff2y of the second image in the y direction as T × tan(θ2y).

The offset correction unit 172 acquires the first image captured by the first imaging device and the second image captured by the second imaging device (S104). The offset correction unit 172 aligns the first image with the second image in the XY coordinate system according to the offset amounts (S106). The offset correction unit 172 corrects the position of the first image in the XY coordinate system by moving it by diff1x in the x direction and by diff1y in the y direction, corrects the position of the second image in the XY coordinate system by moving it by diff2x in the x direction and by diff2y in the y direction, and thereby aligns the first image with the second image.

The image generation unit 174 composites the aligned first image and second image to generate a display image (S108). When generating an NDVI image in which NDVI values are represented as pixel values, the image generation unit 174 can calculate NDVI for each pixel of the first image, which is an R image, and the second image, which is an NIR image, and generate the NDVI image as the display image. In addition, the image generation unit 174 may generate an RGB image by aligning the R image, the G image, and the B image, and may generate, as the display image, a superimposed image in which the NDVI image is overlaid on the RGB image.

The image generation unit 174 transmits the display image to the display unit of the remote operation device 300 or the like (S110). The display unit can display, in real time, the NDVI image generated from the images captured by the imaging system 100 mounted on the UAV 10. The image generation unit 174 may store the display image in the memory 192 or the like.
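Tying steps S100 to S110 together, an end-to-end sketch might look as follows. It reuses the illustrative helpers sketched above (subject_distance_m, image_shift_px, align_and_composite), and the dictionary layout of the offset information is an assumption.

```python
# End-to-end sketch of S100-S110 for one R/NIR frame pair. All names are
# illustrative; this is not the API of the actual image processor 180.

def process_frame(offset_info: dict, height_m: float, theta_g_rad: float,
                  r_image, nir_image, meters_per_px: float):
    # S100/S102: subject distance, then per-image shifts from the offset info.
    distance = subject_distance_m(height_m, theta_g_rad)
    r_shift = image_shift_px(distance, *offset_info["r_axis_angles_rad"],
                             offset_info["r_intersection_px"], meters_per_px)
    nir_shift = image_shift_px(distance, *offset_info["nir_axis_angles_rad"],
                               offset_info["nir_intersection_px"], meters_per_px)
    # S104-S108: align the pair and generate the NDVI display image.
    return align_and_composite(r_image, nir_image, r_shift, nir_shift)
```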
It is also conceivable for the image processor 180 to align the images by pattern matching. Pattern matching, however, requires a higher processing capability of the image processor 180, and from the viewpoints of power saving, weight reduction, and cost reduction, increasing the processing capability of the image processor 180 mounted on the UAV 10 is sometimes not preferable.

In addition, compared with pattern matching between images of the same wavelength band, such as between RGB images, performing high-accuracy pattern matching between images of different wavelength bands, for example between an IR image or an RE image and an R image or a G image, is not easy.

According to the present embodiment, the offset amounts of the images can be calculated from the height of the imaging system 100 or the distance to the subject, and the images can be aligned accordingly. Alignment between images of different wavelength bands can therefore also be performed more easily than with pattern matching.
Fig. 10 is a diagram showing another example of the appearance of the imaging system 100 mounted on the UAV 10. This imaging system 100 differs from the imaging system 100 shown in Fig. 2 in that it includes an imaging device for RGB 160 in addition to the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, and the imaging device for NIR 150. The imaging device for RGB 160 may be the same as an ordinary camera and include an optical system and an image sensor. The image sensor may include, arranged in a Bayer array, a filter that transmits light in the wavelength band of the red region, a filter that transmits light in the wavelength band of the green region, and a filter that transmits light in the wavelength band of the blue region. The imaging device for RGB 160 can output an RGB image. The wavelength band of the red region may be, for example, 620 nm to 750 nm; that of the green region, for example, 500 nm to 570 nm; and that of the blue region, for example, 450 nm to 500 nm. For simplicity, two imaging devices, the first imaging device and the second imaging device, were used as an example in the description. However, as explained in the embodiment, the number of imaging devices is not limited to two. Even with three or more imaging devices, such as the imaging device for R 110, the imaging device for G 120, the imaging device for B 130, the imaging device for RE 140, the imaging device for NIR 150, and the imaging device for RGB 160, the images can likewise be aligned by calculating the offset between the reference axis and each optical axis. For example, the first image may be an R image and the second image an NIR image; alternatively, the first image may be a G image and the second image an RGB image.
Fig. 11 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention, or as one or more "units" of the device, or can cause the computer 1200 to execute the operation or the one or more "units". The program can cause the computer 1200 to execute a process or stages of a process according to an embodiment of the present invention. Such a program may be executed by a CPU 1212 to cause the computer 1200 to execute specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.

The computer 1200 of this embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling the units.

The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or a method may be constituted by realizing the operation or processing of information through the use of the computer 1200.

For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided on a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network to a reception buffer or the like provided on the recording medium.

In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored on an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.

Various types of information such as various types of programs, data, tables, and databases may be stored on the recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. The CPU 1212 may also search for information in files, databases, and the like on the recording medium. For example, when multiple entries each associating an attribute value of a first attribute with an attribute value of a second attribute are stored on the recording medium, the CPU 1212 may retrieve from those entries an entry matching the condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies the preset condition.

The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
It should be noted that the order of execution of operations, procedures, steps, stages, and the like of the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", and the like are not explicitly indicated and the output of a preceding process is not used in a subsequent process. Even where the operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be performed in that order.

Although the present invention has been described using the embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to persons skilled in the art that various changes or improvements can be made to the above embodiments. It is apparent from the description of the claims that forms with such changes or improvements can also be included within the technical scope of the present invention.

Claims (20)

  1. An image processing device that processes images captured by a first imaging device and a second imaging device arranged in a preset positional relationship, characterized in that
    the image processing device comprises a circuit, the circuit being configured to: acquire offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and height information indicating the height of the first imaging device and the second imaging device;
    acquire a first image captured by the first imaging device and a second image captured by the second imaging device; and
    align the first image with the second image according to the offset information and the height information.
  2. The image processing device according to claim 1, characterized in that the offset information includes information indicating the angle formed by the first optical axis and the second optical axis.
  3. The image processing device according to claim 2, characterized in that the offset information includes information indicating the distance between a first intersection of a first imaging surface of the first imaging device with the first optical axis and a second intersection of a second imaging surface of the second imaging device with the second optical axis.
  4. The image processing device according to claim 1, characterized in that the offset information includes information indicating the relationship between height and the amount of positional offset between the first image and the second image.
  5. The image processing device according to claim 1, characterized in that the offset information includes information indicating the amount of positional offset between the first image and the second image at a preset height.
  6. The image processing device according to claim 1, characterized in that the first imaging device and the second imaging device are mounted on a mobile body.
  7. The image processing device according to claim 6, characterized in that the first imaging device and the second imaging device are mounted on the mobile body via a support mechanism that supports the first imaging device and the second imaging device such that their attitude can be adjusted, and
    the circuit is configured to: further acquire attitude information indicating the attitude state of the first imaging device and the second imaging device; and
    align the first image with the second image further according to the attitude information.
  8. The image processing device according to claim 6, characterized in that the mobile body is a flying body.
  9. The image processing device according to claim 6, characterized in that the first imaging device and the second imaging device are mounted on the mobile body via a support mechanism that supports the first imaging device and the second imaging device such that their attitude can be adjusted, and
    the circuit is configured to:
    acquire attitude information indicating the attitude state of the first imaging device and the second imaging device;
    determine, according to the attitude information, whether the angle formed by the imaging direction of the first imaging device and the second imaging device and the vertically downward direction is less than or equal to a preset angle;
    align the first image with the second image according to the offset information and the height information when the angle is less than or equal to the preset angle; and
    align the first image with the second image according to the offset information, the height information, and the attitude information when the angle is greater than the preset angle.
  10. The image processing device according to claim 1, characterized in that the first imaging device captures images of a first wavelength band, and
    the second imaging device captures images of a second wavelength band.
  11. The image processing device according to claim 10, characterized in that the first wavelength band is a band in the near-infrared region, and
    the second wavelength band is a band in the red region, the green region, or the red-edge region.
  12. An image processing device that processes images captured by a first imaging device and a second imaging device arranged in a preset positional relationship, characterized in that
    the image processing device comprises a circuit, the circuit being configured to: acquire offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and distance information indicating the distance to a subject imaged by the first imaging device and the second imaging device;
    acquire a first image containing the subject captured by the first imaging device and a second image containing the subject captured by the second imaging device; and
    align the first image with the second image according to the offset information and the distance information.
  13. The image processing device according to claim 12, characterized in that the first imaging device captures images of a first wavelength band, and
    the second imaging device captures images of a second wavelength band.
  14. The image processing device according to claim 13, characterized in that the first wavelength band is a band in the near-infrared region, and
    the second wavelength band is a band in the red region, the green region, or the red-edge region.
  15. The image processing device according to claim 12, characterized in that the offset information includes information indicating the angle formed by the first optical axis and the second optical axis.
  16. The image processing device according to claim 12, characterized in that the offset information includes information indicating the distance between a first intersection of a first imaging surface of the first imaging device with the first optical axis and a second intersection of a second imaging surface of the second imaging device with the second optical axis.
  17. The image processing device according to claim 12, characterized in that the offset information includes information indicating the relationship between the distance to the subject and the amount of positional offset between the first image and the second image.
  18. An image processing method for processing images captured by a first imaging device and a second imaging device arranged in a preset positional relationship, characterized by comprising:
    a stage of acquiring offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and height information indicating the height of the first imaging device and the second imaging device;
    a stage of acquiring a first image captured by the first imaging device and a second image captured by the second imaging device; and
    a stage of aligning the first image with the second image according to the offset information and the height information.
  19. An image processing method for processing images captured by a first imaging device and a second imaging device arranged in a preset positional relationship, characterized by comprising:
    a stage of acquiring offset information about the offset between a first optical axis of the first imaging device and a second optical axis of the second imaging device, and distance information indicating the distance to a subject imaged by the first imaging device and the second imaging device;
    a stage of acquiring a first image containing the subject captured by the first imaging device and a second image containing the subject captured by the second imaging device; and
    a stage of aligning the first image with the second image according to the offset information and the distance information.
  20. A program, characterized in that it causes a computer to function as the image processing device according to any one of claims 1 to 17.
PCT/CN2020/123276 2019-11-01 2020-10-23 Image processing device, image processing method, and program WO2021083049A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080004297.2A CN112955925A (zh) Image processing device, image processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019200149A JP6880380B2 (ja) Image processing device, image processing method, and program
JP2019-200149 2019-11-01

Publications (1)

Publication Number Publication Date
WO2021083049A1 true WO2021083049A1 (zh) 2021-05-06

Family

ID=75712993

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123276 WO2021083049A1 (zh) Image processing device, image processing method, and program

Country Status (3)

Country Link
JP (1) JP6880380B2 (zh)
CN (1) CN112955925A (zh)
WO (1) WO2021083049A1 (zh)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101796817A (zh) * 2007-07-06 2010-08-04 前视红外系统股份公司 照相机和校准照相机的方法
CN103037172A (zh) * 2011-10-04 2013-04-10 弗卢克公司 具有红外镜头聚焦调节装置的热成像摄像机
CN106506941A (zh) * 2016-10-20 2017-03-15 深圳市道通智能航空技术有限公司 图像处理的方法及装置、飞行器
CN106572307A (zh) * 2016-11-01 2017-04-19 深圳岚锋创视网络科技有限公司 一种全景图像的生成方法、系统及拍摄装置
US20170372137A1 (en) * 2015-01-27 2017-12-28 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for robotic remote sensing for precision agriculture
CN107798656A (zh) * 2017-11-09 2018-03-13 南京齿贝犀科技有限公司 一种基于距离传感器和陀螺仪的口腔全景图像拼接方法
CN108257183A (zh) * 2017-12-20 2018-07-06 歌尔科技有限公司 一种相机镜头光轴校准方法和装置
CN109118425A (zh) * 2017-06-22 2019-01-01 华为技术有限公司 一种双鱼眼镜头的图像拼接参数校正方法及摄像设备
CN109362234A (zh) * 2016-04-28 2019-02-19 深圳市大疆创新科技有限公司 用于获得球面全景图像的系统和方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013045032A (ja) * 2011-08-26 2013-03-04 Fujifilm Corp 多眼撮像装置
JP6751155B2 (ja) * 2016-11-24 2020-09-02 富士フイルム株式会社 画像処理装置、撮像装置、及び画像処理方法
JP7069609B2 (ja) * 2017-09-01 2022-05-18 コニカミノルタ株式会社 作物栽培支援装置
JP6948917B2 (ja) * 2017-11-10 2021-10-13 ヤンマーパワーテクノロジー株式会社 散布作業機


Also Published As

Publication number Publication date
JP2021071453A (ja) 2021-05-06
CN112955925A (zh) 2021-06-11
JP6880380B2 (ja) 2021-06-02

Similar Documents

Publication Publication Date Title
US10475209B2 (en) Camera calibration
US20220206515A1 (en) Uav hardware architecture
JP5947634B2 (ja) Aerial photograph imaging method and aerial photograph imaging system
CN103134475B (zh) Aerial photography image pickup method and aerial photography image pickup apparatus
WO2018198634A1 (ja) Information processing device, information processing method, information processing program, image processing device, and image processing system
JP6496955B1 (ja) Control device, system, control method, and program
CN110914780A (zh) Motion plan creation system, method, and program for an unmanned aerial vehicle
WO2019230604A1 (ja) Inspection system
CN111344650B (zh) Information processing device, flight path generation method, program, and recording medium
US20210235044A1 (en) Image processing device, camera device, mobile body, image processing method, and program
WO2019189381A1 (ja) Mobile body, control device, and control program
WO2020225979A1 (ja) Information processing device, information processing method, program, and information processing system
JP6681101B2 (ja) Inspection system
WO2021083049A1 (zh) Image processing device, image processing method, and program
JP6481228B1 (ja) Determination device, control device, imaging system, flying body, determination method, and program
Gabdullin et al. Analysis of onboard sensor-based odometry for a quadrotor uav in outdoor environment
WO2021017914A1 (zh) Control device, imaging device, mobile body, control method, and program
WO2021115166A1 (zh) Determination device, flying body, determination method, and program
WO2020192385A1 (zh) Determination device, imaging system, and mobile body
WO2021115167A1 (zh) Determination device, flying body, determination method, and program
WO2021035746A1 (zh) Image processing method and device, and movable platform
WO2023047799A1 (ja) Image processing device, image processing method, and program
JP2022053417A (ja) Control device, imaging device, mobile body, control method, and program
WO2020125414A1 (zh) Control device, imaging device, imaging system, mobile body, control method, and program
WO2018188086A1 (zh) Unmanned aerial vehicle and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20882410

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20882410

Country of ref document: EP

Kind code of ref document: A1