WO2020001335A1 - Control device, imaging device, moving body, control method, and program - Google Patents

Control device, imaging device, moving body, control method, and program

Info

Publication number
WO2020001335A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
imaging device
range
image
captured
Prior art date
Application number
PCT/CN2019/091780
Other languages
English (en)
French (fr)
Inventor
本庄谦一
邵明
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201980005098.0A priority Critical patent/CN111226263A/zh
Publication of WO2020001335A1 publication Critical patent/WO2020001335A1/zh

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B15/00Optical objectives with means for varying the magnification
    • G02B15/14Optical objectives with means for varying the magnification by axial movement of one or more lenses or groups of lenses relative to the image plane for continuously varying the equivalent focal length of the objective
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/16Stereoscopic photography by sequential viewing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/571Depth or shape recovery from multiple images from focus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present invention relates to a control device, an imaging device, a moving body, a control method, and a program.
  • An imaging device is disclosed that causes an image processing unit to generate moving-image data while moving the focus position of an optical system from the nearest-end side to the infinity-end side, and that extracts, from a plurality of frame images included in the moving-image data, a still image focused on a specified area.
  • Patent Document 1: International Publication No. WO 2017/006538
  • The present invention seeks to generate a depth map while suppressing the amount of movement of the focus lens.
  • The control device according to one aspect of the present invention may include a control unit that causes the imaging device to photograph a first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a first positional relationship, and causes the imaging device to photograph a second imaging range in a state where the imaging surface and the focus lens are in a second positional relationship. The second imaging range is different from the first imaging range and includes a first repetition range that overlaps the first imaging range.
  • The control device may include an acquisition unit that acquires a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device.
  • The control device may include a calculation unit that calculates the respective blur amounts of a first image corresponding to the first repetition range included in the first captured image and a second image corresponding to the first repetition range included in the second captured image.
  • The control device may include a generation unit that generates, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first repetition range.
  • The first imaging range may overlap the second imaging range by more than half.
  • The control unit may cause the imaging device to photograph a third imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a third positional relationship. The third imaging range is different from the second imaging range and includes a second repetition range that overlaps the second imaging range.
  • The acquisition unit may acquire a third captured image of the third imaging range captured by the imaging device.
  • The calculation unit may calculate the respective blur amounts of a third image corresponding to the second repetition range included in the second captured image and a fourth image corresponding to the second repetition range included in the third captured image.
  • The generation unit may generate, based on the respective blur amounts of the third image and the fourth image, a depth map further including depth information corresponding to the second repetition range.
  • The second imaging range may overlap the third imaging range by more than half.
  • The control unit may, while the imaging direction of the imaging device is changing, cause the imaging device to photograph the first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and cause the imaging device to photograph the second imaging range in a state where the imaging surface and the focus lens are in the second positional relationship.
  • The control unit may cause the imaging device to capture the first captured image and the second captured image during a first rotation of the imaging device around a first point that changes the imaging direction of the imaging device.
  • The control unit may, during a second rotation of the imaging device around the first point that changes the imaging direction of the imaging device, control the position of the focus lens of the imaging device in accordance with the depth map and cause the imaging device to capture a plurality of captured images anew.
  • The control unit may store the depth map and the plurality of captured images in a storage unit in association with each other.
  • The control unit may, while the imaging device moves along a first trajectory, cause the imaging device to photograph the first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and cause the imaging device to photograph the second imaging range in a state where the imaging surface and the focus lens are in the second positional relationship.
  • The control unit may cause the imaging device to capture the first captured image and the second captured image during a first movement of the imaging device along the first trajectory, and, during a second movement of the imaging device along the first trajectory, control the position of the focus lens of the imaging device in accordance with the depth map and cause the imaging device to capture a plurality of captured images anew.
  • The control unit may store the depth map and the plurality of captured images in a storage unit in association with each other.
  • The imaging device according to one aspect of the present invention may include the control device described above.
  • The imaging device may include the focus lens.
  • The moving body according to one aspect of the present invention may be a moving body that includes the imaging device described above and moves.
  • The control method according to one aspect of the present invention may include a stage of causing the imaging device to photograph a first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a first positional relationship, and causing the imaging device to photograph a second imaging range in a state where the imaging surface and the focus lens are in a second positional relationship, the second imaging range being different from the first imaging range and including a first repetition range that overlaps the first imaging range.
  • The control method may include a stage of acquiring a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device.
  • The control method may include a stage of calculating the respective blur amounts of a first image corresponding to the first repetition range included in the first captured image and a second image corresponding to the first repetition range included in the second captured image.
  • The control method may include a stage of generating, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first repetition range.
  • The program according to one aspect of the present invention may be a program for causing a computer to function as the control device described above.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aircraft and a remote operation device.
  • FIG. 2 is a diagram showing an example of functional blocks of an unmanned aircraft.
  • FIG. 3 is a diagram showing an example of a curve showing a relationship between a blur amount and a lens position.
  • FIG. 4 is a diagram illustrating an example of a process of calculating a distance to an object based on a blur amount.
  • FIG. 5 is a diagram for explaining the relationship among the object position, the lens position, and the focal length.
  • FIG. 6 is a diagram for explaining a manner in which the imaging device performs imaging while the unmanned aerial vehicle is rotating.
  • FIG. 7A is a diagram for explaining a manner in which the imaging device performs imaging while the unmanned aerial vehicle is rotating.
  • FIG. 7B is a diagram illustrating an example of a relationship between a captured image and a focusing distance of a focusing lens.
  • FIG. 8 is a diagram illustrating an example of a panoramic image generated from a plurality of captured images.
  • FIG. 9 is a flowchart showing an example of an imaging process by an imaging device mounted on a UAV.
  • FIG. 10 is a diagram showing an example of a hardware configuration.
  • Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device having the role of performing the operation.
  • Specific stages and "units" may be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
  • A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • As a result, a computer-readable medium having instructions stored therein constitutes a product that includes instructions that can be executed to create means for performing the operations specified by the flowcharts or block diagrams.
  • Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like.
  • More specific examples of the computer-readable medium may include a floppy disk (registered trademark), a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
  • Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages.
  • The source code or object code includes conventional procedural programming languages.
  • Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device.
  • The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified by the flowcharts or block diagrams.
  • Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • The gimbal 50 and the imaging device 100 are an example of an imaging system.
  • The UAV 10, i.e., a moving body, is a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on the water, and the like.
  • A flying body moving in the air is a concept that includes not only the UAV but also other aircraft, airships, helicopters, and the like moving in the air.
  • The UAV body 20 includes a plurality of rotors. The plurality of rotors are an example of a propulsion unit.
  • The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the plurality of rotors.
  • The UAV body 20 uses, for example, four rotors to cause the UAV 10 to fly.
  • The number of rotors is not limited to four.
  • The UAV 10 may also be a fixed-wing aircraft without rotors.
  • The imaging device 100 is an imaging camera that images a subject included in a desired imaging range.
  • The gimbal 50 rotatably supports the imaging device 100.
  • The gimbal 50 is an example of a support mechanism.
  • For example, the gimbal 50 rotatably supports the imaging device 100 about the pitch axis using an actuator.
  • The gimbal 50 further rotatably supports the imaging device 100 about the roll axis and the yaw axis, respectively, using actuators.
  • The gimbal 50 can change the attitude of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • Two imaging devices 60 may be installed on the nose, that is, the front, of the UAV 10.
  • Two other imaging devices 60 may be installed on the bottom surface of the UAV 10.
  • The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • Three-dimensional spatial data around the UAV 10 can be generated from the images captured by the plurality of imaging devices 60.
  • The number of imaging devices 60 included in the UAV 10 is not limited to four.
  • It is sufficient that the UAV 10 include at least one imaging device 60.
  • The UAV 10 may include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10.
  • The angle of view settable in the imaging devices 60 may be larger than the angle of view settable in the imaging device 100.
  • The imaging devices 60 may have a single-focus lens or a fisheye lens.
  • The remote operation device 300 communicates with the UAV 10 to operate the UAV 10 remotely.
  • The remote operation device 300 may communicate with the UAV 10 wirelessly.
  • The remote operation device 300 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10, such as ascent, descent, acceleration, deceleration, forward movement, backward movement, and rotation.
  • The instruction information includes, for example, instruction information for raising the altitude of the UAV 10.
  • The instruction information may indicate the altitude at which the UAV 10 should be located.
  • The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300.
  • The instruction information may include an ascent instruction for causing the UAV 10 to ascend. The UAV 10 ascends while receiving the ascent instruction. When the altitude of the UAV 10 has reached its upper limit, the UAV 10 may restrict further ascent even if it receives the ascent instruction.
  • FIG. 2 shows an example of the functional blocks of the UAV 10.
  • The UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, imaging devices 60, and an imaging device 100.
  • The communication interface 36 communicates with other devices such as the remote operation device 300.
  • The communication interface 36 may receive, from the remote operation device 300, instruction information including various instructions to the UAV control unit 30.
  • The memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • The memory 37 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drives (SSD).
  • The memory 37 may be provided inside the UAV body 20. It may be provided so as to be detachable from the UAV body 20.
  • The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with the programs stored in the memory 37.
  • The UAV control unit 30 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • The propulsion unit 40 propels the UAV 10.
  • The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • The propulsion unit 40 causes the UAV 10 to fly by rotating the plurality of rotors via the plurality of drive motors in accordance with instructions from the UAV control unit 30.
  • The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites.
  • The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10, based on the plurality of received signals.
  • The IMU 42 detects the attitude of the UAV 10.
  • The IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • The magnetic compass 43 detects the heading of the nose of the UAV 10.
  • The barometric altimeter 44 detects the flight altitude of the UAV 10.
  • The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude.
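The publication does not specify the pressure-to-altitude conversion, so the following is an illustrative sketch using the standard international barometric formula; the constants (1013.25 hPa reference pressure, 44330 m scale, exponent 1/5.255) are standard-atmosphere assumptions, not values from the patent.

```python
# Illustrative pressure-to-altitude conversion for a barometric
# altimeter such as the barometric altimeter 44 (standard-atmosphere
# constants assumed).
def pressure_to_altitude_m(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Return altitude in metres for a measured pressure p_hpa (hPa)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# e.g. pressure_to_altitude_m(1001.3) is roughly 100 m above sea level.
```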
  • The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
  • The imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • The lens unit 200 is an example of a lens device.
  • The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • The image sensor 120 may be constituted by a CCD or a CMOS.
  • The image sensor 120 captures an optical image formed through the plurality of lenses 210 and outputs the captured image to the imaging control unit 110.
  • The imaging control unit 110 may be constituted by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • The imaging control unit 110 may control the imaging device 100 in accordance with operation instructions for the imaging device 100 from the UAV control unit 30.
  • The imaging control unit 110 is an example of a first control unit and a second control unit.
  • The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drives (SSD).
  • The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • The memory 130 may be provided inside the casing of the imaging device 100.
  • The memory 130 may be provided so as to be detachable from the casing of the imaging device 100.
  • The lens unit 200 includes a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220.
  • The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis.
  • The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102.
  • The lens driving units 212 move at least some or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring.
  • The lens driving units 212 may include an actuator.
  • The actuator may include a stepping motor.
  • The lens control unit 220 drives the lens driving units 212 in accordance with lens control instructions from the imaging unit 102 to move one or more of the lenses 210 along the optical axis direction via the mechanism member.
  • The lens control instructions are, for example, zoom control instructions and focus control instructions.
  • The lens unit 200 further includes a memory 222 and a position sensor 214.
  • The lens control unit 220 controls the movement of the lenses 210 in the optical axis direction via the lens driving units 212 in accordance with lens operation instructions from the imaging unit 102.
  • Some or all of the lenses 210 move along the optical axis.
  • The lens control unit 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis.
  • The position sensor 214 detects the positions of the lenses 210.
  • The position sensor 214 may detect the current zoom position or focus position.
  • The lens driving units 212 may include a shake correction mechanism.
  • The lens control unit 220 may perform shake correction by moving the lenses 210 in a direction along the optical axis or in a direction perpendicular to the optical axis via the shake correction mechanism.
  • The lens driving units 212 may drive the shake correction mechanism with a stepping motor to perform shake correction.
  • Alternatively, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or in a direction perpendicular to the optical axis to perform shake correction.
  • The memory 222 stores control values for the plurality of lenses 210 moved via the lens driving units 212.
  • The memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • The imaging device 100 configured as described above has a function of determining the distance from the lens to the subject (the subject distance) in order to perform autofocus processing (AF processing) and the like.
  • As a method for determining the subject distance, there is a method of determining it based on the blur amounts of a plurality of images captured in different states of the positional relationship between the lens and the imaging surface. Herein, this method is referred to as the Bokeh Detection Auto Focus (BDAF) method.
  • For example, the blur amount (Cost) of an image can be expressed by the following formula (1) using a Gaussian function. In formula (1), x represents the pixel position in the horizontal direction, and σ represents the standard deviation. A sketch of this model follows below.
  • FIG. 3 shows an example of the curve represented by formula (1).
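Formula (1) itself is reproduced only as an image in this publication, so the following is a minimal sketch under the stated assumption that the blur amount is modeled with a Gaussian of standard deviation σ over horizontal pixel position x; the `blur_cost()` proxy is a hypothetical stand-in, not the patent's literal definition.

```python
# A minimal sketch of a Gaussian blur-amount model in the spirit of
# formula (1); the exact formula is shown only as an image in the
# original publication, so this is an assumption.
import numpy as np

def gaussian(x: np.ndarray, sigma: float) -> np.ndarray:
    """Gaussian of horizontal pixel position x with standard deviation sigma."""
    return np.exp(-(x ** 2) / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def blur_cost(patch: np.ndarray) -> float:
    """Hypothetical blur-amount proxy: a sharper patch has stronger
    horizontal gradients, so the inverse mean gradient grows with blur."""
    grad = np.abs(np.diff(patch.astype(np.float64), axis=1))
    return float(1.0 / (grad.mean() + 1e-9))  # larger value = more blur
```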
  • FIG. 4 is a flowchart showing an example of the distance calculation process in the BDAF method.
  • With the lens and the imaging surface in a first positional relationship, a first image I1 is captured by the imaging device 100 and stored in the memory 130. The lens and the imaging surface are then placed in a second positional relationship, and a second image I2 is captured by the imaging device 100 and stored in the memory 130 (S101).
  • For example, the focus lens or the imaging surface of the image sensor 120 is moved along the optical axis direction without passing the in-focus point.
  • The amount of movement of the focus lens or of the imaging surface of the image sensor 120 may be, for example, 10 μm.
  • Next, the imaging device 100 divides the image I1 into a plurality of regions (S102).
  • For example, a feature amount may be calculated for each pixel in the image I1, and pixel groups having similar feature amounts may be taken as regions to divide the image I1 into a plurality of regions.
  • Alternatively, only the pixel group set as the range of the AF processing frame in the image I1 may be divided into a plurality of regions.
  • The imaging device 100 divides the image I2 into a plurality of regions corresponding to the plurality of regions of the image I1.
  • The imaging device 100 calculates, for each of the plurality of regions, the distance to the object included in that region, based on the blur amounts of the corresponding regions of the image I1 and the image I2 (S103). A sketch of this flow follows below.
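A minimal sketch of the S101-S103 flow, reusing the `blur_cost()` helper above; `blur_pair_to_distance()` is a hypothetical placeholder for the formula (2)-(4) distance model, which in practice depends on lens calibration data.

```python
# Sketch of the BDAF flow: two images at two lens/imaging-surface
# positions (S101), region division (S102), per-region distance (S103).
import numpy as np

def split_into_blocks(img: np.ndarray, n: int) -> list[np.ndarray]:
    """Divide an image into an n x n grid of regions (S102)."""
    rows = np.array_split(img, n, axis=0)
    return [blk for row in rows for blk in np.array_split(row, n, axis=1)]

def bdaf_distances(img1: np.ndarray, img2: np.ndarray, n: int = 8) -> list[float]:
    """Per-region subject distances from the blur amounts of I1 and I2 (S103)."""
    regions1 = split_into_blocks(img1, n)
    regions2 = split_into_blocks(img2, n)
    # blur_pair_to_distance() is a hypothetical mapping from a pair of
    # blur amounts to a subject distance (formulas (2)-(4)).
    return [blur_pair_to_distance(blur_cost(r1), blur_cost(r2))
            for r1, r2 in zip(regions1, regions2)]
```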
  • The calculation process of the distance is further described with reference to FIG. 5.
  • Let the distance from the lens L to the subject 510 be A, the distance from the lens L to the position where the subject 510 is imaged be B, and the focal length be F.
  • The relationship among the distance A, the distance B, and the focal length F can be expressed by the following formula (2) in accordance with the lens formula.
  • The focal length F is determined by the lens position. Therefore, if the distance B at which the subject 510 is imaged on the imaging surface can be determined, formula (2) can be used to determine the distance A from the lens L to the subject 510.
  • The position at which the subject 510 is imaged can be calculated from the size of the blur of the subject 510 (the circles of confusion 512 and 514) projected on the imaging surface, so the distance B, and in turn the distance A, can be determined. That is, the imaging position can be determined by exploiting the fact that the size of the blur (the blur amount) is proportional to the distance between the imaging surface and the imaging position.
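For concreteness, assuming formula (2) is the standard thin-lens relation (the formula is rendered only as an image in this extraction), the distance A follows from B and F as below.

```latex
% Thin-lens relation assumed for formula (2), solved for the subject
% distance A, with a small worked example.
\[
  \frac{1}{A} + \frac{1}{B} = \frac{1}{F}
  \qquad\Longrightarrow\qquad
  A = \frac{BF}{B - F}
\]
% Example: F = 10 mm and B = 10.1 mm give
% A = (10.1)(10)/0.1 = 1010 mm, i.e. the subject 510 is about
% 1 m from the lens L.
```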
  • Let the distance from the image I1, which is closer to the imaging surface, to the lens L be D1, and let the distance from the image I2, which is farther from the imaging surface, to the lens L be D2. Each image is blurred.
  • Let the point spread function at this time be PSF, and let the images at D1 and D2 be I_d1 and I_d2, respectively.
  • In this case, the image I1 can be expressed by the following formula (3) as a convolution operation.
  • Let the Fourier transforms of the image data I_d1 and I_d2 be f, and let the optical transfer functions obtained by performing Fourier transforms on the point spread functions PSF1 and PSF2 of the images I_d1 and I_d2 be OTF1 and OTF2; their ratio is then obtained as shown in the following formula (4).
  • The C value shown in formula (4) is the amount of change in the blur amounts of the images I_d1 and I_d2; that is, the C value corresponds to the difference between the blur amount of the image I_d1 and the blur amount of the image I_d2.
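A sketch of the C value of formula (4): dividing the Fourier transforms of the two defocused images cancels the unknown in-focus scene spectrum and leaves the ratio OTF1/OTF2, i.e., the change in blur amount. The magnitude-based implementation below is a simplification of the complex-valued ratio.

```python
# Sketch of formula (4): C = F(I_d1) / F(I_d2) = OTF_1 / OTF_2 per
# spatial frequency, since the in-focus scene spectrum cancels out.
import numpy as np

def blur_change_ratio(img_d1: np.ndarray, img_d2: np.ndarray) -> np.ndarray:
    f1 = np.fft.fft2(img_d1.astype(np.float64))
    f2 = np.fft.fft2(img_d2.astype(np.float64))
    eps = 1e-9  # guard against division by zero at empty frequencies
    return np.abs(f1) / (np.abs(f2) + eps)
```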
  • In the present embodiment, a depth map is generated while suppressing the amount of movement of the focus lens.
  • The depth map is data representing the distance to the subject for each pixel, or for each block including a plurality of pixels.
  • The imaging control unit 110 includes an acquisition unit 112, a calculation unit 114, a generation unit 116, and a synthesis unit 118.
  • While the imaging direction of the imaging device 100 is changing, or while the imaging device 100 is moving along a first trajectory, the imaging control unit 110 causes the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs.
  • For example, the imaging control unit 110 may cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs, while the UAV 10 rotates while hovering at a predetermined location.
  • The imaging range is the range of space captured by the imaging device 100.
  • The imaging control unit 110 may likewise cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs.
  • The imaging control unit 110 may cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges while changing the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100, while the UAV 10 is moving in a direction different from the imaging direction of the imaging device 100.
  • The imaging control unit 110 may move the focus lens while the UAV 10 rotates while hovering at a predetermined location, and cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges.
  • The imaging control unit 110 may move the focus lens while the UAV 10 moves along the first trajectory, and cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges.
  • The imaging control unit 110 may alternately switch the position of the focus lens between a first position and a second position while the UAV 10 rotates while hovering at a predetermined location, and at the same time cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges. The imaging control unit 110 may likewise alternately switch the position of the focus lens between the first position and the second position while the UAV 10 moves along the first trajectory, and at the same time cause the imaging device 100 to capture a plurality of captured images in a plurality of imaging ranges including repetition ranges. A sketch of such an alternating capture schedule follows below.
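A minimal sketch of the alternating capture schedule described above; `set_focus_distance()` and `capture()` are hypothetical device calls, and the 1.0 m / 0.5 m values echo the focusing-distance example given with FIG. 7B below.

```python
# Alternate the focus lens between two focusing distances while the UAV
# rotates (or moves along the first trajectory), so adjacent overlapping
# imaging ranges are captured in different imaging-surface/focus-lens
# positional relationships.
FOCUS_DISTANCES_M = [1.0, 0.5]  # first and second positions (illustrative)

def capture_during_rotation(num_shots: int) -> list:
    images = []
    for i in range(num_shots):
        set_focus_distance(FOCUS_DISTANCES_M[i % 2])  # hypothetical device call
        images.append(capture())                      # hypothetical device call
    return images
```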
  • As shown in FIG. 6, the imaging control unit 110 may cause the imaging device 100 to photograph a first imaging range 601 in a state where the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 are in a first positional relationship.
  • The imaging control unit 110 may cause the imaging device 100 to photograph a second imaging range 602 in a state where the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 are in a second positional relationship, where the second imaging range 602 is different from the first imaging range 601 and includes a first repetition range 611 that overlaps the first imaging range 601.
  • The imaging control unit 110 may cause the imaging device 100 to photograph a third imaging range 603 in a state where the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 are in a third positional relationship, where the third imaging range 603 is different from the second imaging range 602 and includes a second repetition range 612 that overlaps the second imaging range 602.
  • The first imaging range 601 is different from the second imaging range 602, but may overlap it by more than half.
  • The second imaging range 602 is different from the third imaging range 603, but may overlap it by more than half.
  • The second imaging range 602 is different from both the first imaging range 601 and the third imaging range 603, but may overlap each of them by more than half.
  • The acquisition unit 112 acquires the plurality of captured images captured by the imaging device 100 while the imaging direction of the imaging device 100 is changing or while the imaging device 100 moves along the first trajectory.
  • The acquisition unit 112 may acquire the captured images of the respective imaging ranges captured by the imaging device 100, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs, while the UAV 10 rotates while hovering at a predetermined location.
  • The acquisition unit 112 may acquire the captured images of the plurality of imaging ranges captured by the imaging device 100, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs, while the UAV 10 moves along the first trajectory.
  • The calculation unit 114 calculates the respective blur amounts of the images of the repetition ranges included in each of the plurality of captured images.
  • The calculation unit 114 may calculate the blur amount (Cost) of each image based on formula (1) using the Gaussian function.
  • The generation unit 116 generates, based on the respective blur amounts of the images of the plurality of repetition ranges, a depth map including depth information corresponding to each of the plurality of repetition ranges.
  • The generation unit 116 may generate, based on the respective blur amounts of the images of the plurality of repetition ranges, a depth map including depth information for each pixel, or for each block including a plurality of pixels, of the images of the plurality of repetition ranges.
  • The depth information indicates the distance to the subject.
  • For example, the imaging control unit 110 may set the focusing distance of the focus lens to a first position (1.0 m) while the UAV 10 rotates while hovering at a predetermined location, and cause the imaging device 100 to photograph the first imaging range 601.
  • The acquisition unit 112 may acquire a first captured image 701 of the first imaging range 601.
  • The imaging control unit 110 may set the focusing distance of the focus lens to a second position (0.5 m) while the UAV 10 rotates while hovering at the predetermined location, and cause the imaging device 100 to photograph the second imaging range 602.
  • The acquisition unit 112 may acquire a second captured image 702 of the second imaging range 602.
  • The imaging control unit 110 may set the focusing distance of the focus lens back to the first position (1.0 m) while the UAV 10 rotates while hovering at the predetermined location, and cause the imaging device 100 to photograph the third imaging range 603.
  • The acquisition unit 112 may acquire a third captured image 703 of the third imaging range 603.
  • The calculation unit 114 may calculate the respective blur amounts of a first image 710 corresponding to the first repetition range 611 included in the first captured image 701 and a second image 711 corresponding to the first repetition range 611 included in the second captured image 702. The calculation unit 114 may calculate the respective blur amounts of a third image 712 corresponding to the second repetition range 612 included in the second captured image 702 and a fourth image 713 corresponding to the second repetition range 612 included in the third captured image 703.
  • The generation unit 116 may generate a depth map including depth information corresponding to the first repetition range 611 based on the respective blur amounts of the first image 710 and the second image 711.
  • The generation unit 116 may generate a depth map further including depth information corresponding to the second repetition range 612 based on the respective blur amounts of the third image 712 and the fourth image 713. A sketch of this assembly step follows below.
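A sketch of assembling the depth map from the repetition ranges, reusing the `bdaf_distances()` sketch above; the fixed horizontal overlap and strip geometry are simplifying assumptions about how ranges such as 611 and 612 map into the captured frames.

```python
# Build depth strips: each repetition range is covered by two captured
# images taken at different focus positions, so its depth can be
# computed from their blur amounts and concatenated into a depth map.
def build_depth_strips(images: list, overlap: float = 0.5) -> list:
    strips = []
    for prev_img, next_img in zip(images, images[1:]):
        h, w = prev_img.shape[:2]
        cut = int(w * (1.0 - overlap))
        rep_prev = prev_img[:, cut:]       # repetition range in earlier frame
        rep_next = next_img[:, :w - cut]   # same scene region in later frame
        strips.append(bdaf_distances(rep_prev, rep_next))
    return strips  # one strip of depth information per repetition range
```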
  • As described above, in the present embodiment, a depth map corresponding to the repetition ranges is generated in accordance with the BDAF method, based on a plurality of captured images whose imaging ranges differ from one another and partially overlap.
  • The distance to the subject can thus be determined without moving the focus lens from the nearest-end side to the infinity end.
  • In addition, the depth map is generated based on captured images captured while the imaging direction of the imaging device 100 is being changed or while the imaging device 100 is moving along the first trajectory. Therefore, a depth map including depth information over a wide range can be generated in a relatively short time.
  • The imaging control unit 110 may control the position of the focus lens of the imaging device 100 based on the depth map and cause the imaging device 100 to capture a further plurality of captured images.
  • The imaging control unit 110 may cause the imaging device 100 to capture the first captured image and the second captured image during a first rotation of the imaging device 100 around a first point that changes the imaging direction of the imaging device 100.
  • The imaging control unit 110 may, during a second rotation of the imaging device around the first point that changes the imaging direction of the imaging device 100, control the position of the focus lens of the imaging device 100 in accordance with the depth map and cause the imaging device 100 to capture a plurality of captured images anew.
  • The imaging control unit 110 may cause the imaging device 100 to capture the first captured image and the second captured image during a first movement of the imaging device 100 along the first trajectory, and, during a second movement of the imaging device 100 along the first trajectory, control the position of the focus lens of the imaging device 100 in accordance with the depth map and cause the imaging device 100 to capture a plurality of captured images anew.
  • For example, the imaging control unit 110 may cause the imaging device 100 to capture a plurality of captured images of imaging ranges that differ from one another and partially overlap, while the UAV 10 hovers and performs a first rotation around the first point.
  • The acquisition unit 112 may acquire the plurality of captured images of the imaging ranges that differ from one another and partially overlap.
  • The calculation unit 114 may calculate the blur amounts of the images of the repetition ranges.
  • The generation unit 116 may generate a depth map including depth information corresponding to the repetition ranges based on the blur amounts of the images of the repetition ranges.
  • The imaging control unit 110 may determine the distance to a desired subject in accordance with the depth map.
  • The imaging control unit 110 may control the position of the focus lens in accordance with the distance to the determined desired subject while the UAV 10 hovers and performs a second rotation around the first point, and cause the imaging device 100 to capture a plurality of captured images.
  • During the first rotation around the first point while the UAV 10 hovers, the imaging control unit 110 may control the focus lens to a predetermined focusing distance, without following a depth map, and cause the imaging device 100 to capture the plurality of captured images.
  • The imaging control unit 110 may store the depth map and the plurality of captured images in a storage unit such as the memory 130 in association with each other.
  • The synthesis unit 118 may synthesize the plurality of captured images to generate a panoramic image as shown in FIG. 8.
  • The synthesis unit 118 may store the panoramic image and the depth map in a storage unit such as the memory 130 in association with each other.
  • The ratio of the repetition range between adjacent imaging ranges may be any ratio.
  • For example, the ratio of the repetition range between adjacent imaging ranges may be a ratio less than 1 and greater than 1/2.
  • Alternatively, the ratio of the repetition range between adjacent imaging ranges may be a ratio less than 1/2 and greater than 0. The relationship between this ratio and the rotation step is sketched below.
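The relationship between the repetition ratio and the rotation step can be sketched as follows; treating the angular overlap as proportional to the imaging-range overlap is a simplification that ignores projection effects, and is not a formula from the publication.

```python
# Rotating by fov * (1 - overlap) between shots makes adjacent imaging
# ranges share approximately the chosen overlap ratio.
def rotation_step_deg(fov_deg: float, overlap: float) -> float:
    return fov_deg * (1.0 - overlap)

# e.g. a 60-degree angle of view with overlap 1/2 gives a 30-degree
# step, so a full 360-degree sweep needs 12 captured images.
```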
  • FIG. 9 is a flowchart showing an example of an imaging process of the imaging device 100 mounted on the UAV 10.
  • The UAV 10 starts flying (S200).
  • The imaging mode of the imaging device 100 is set to a panorama-with-depth-map mode (S202).
  • The UAV control unit 30 moves the UAV 10 to a desired first location.
  • The imaging control unit 110 causes the UAV 10, via the UAV control unit 30, to rotate while hovering at the first location and moves the focus lens, while causing the imaging device 100 to capture a plurality of captured images of imaging ranges that differ from one another and partially overlap (S204).
  • The imaging control unit 110 may alternately switch the position of the focus lens between the first position and the second position, and at the same time cause the imaging device 100 to capture a plurality of captured images of imaging ranges that differ from one another and partially overlap.
  • The calculation unit 114 calculates the blur amounts of the images of the repetition ranges, and the generation unit 116 generates a depth map including depth information corresponding to the repetition ranges based on the blur amounts (S206).
  • The imaging control unit 110 causes the UAV 10, via the UAV control unit 30, to rotate again while hovering at the first location, controls the focus lens in accordance with the depth map, and causes the imaging device 100 to capture a plurality of captured images (S208).
  • The imaging control unit 110 may determine the distance to a desired subject in accordance with the depth map and adjust the position of the focus lens in accordance with that distance.
  • The imaging control unit 110 may rotate the UAV 10, via the UAV control unit 30, while hovering at the first location, and capture a plurality of captured images at the adjusted focus lens position.
  • The imaging control unit 110 may rotate the UAV 10 once per focusing distance to cause the imaging device 100 to perform imaging.
  • When a plurality of imaging devices 100 are provided, the imaging control unit 110 may make the focusing distances of the plurality of imaging devices 100 different from one another and, during a single rotation of the UAV 10, cause each of them to capture a plurality of captured images at different focusing distances.
  • The imaging control unit 110 stores the plurality of captured images and the depth map in the memory 130 in association with each other (S210).
  • The synthesis unit 118 may synthesize the plurality of captured images to generate a panoramic image, and the imaging control unit 110 may store the panoramic image and the depth map in the memory 130 in association with each other. The overall flow of S200 to S210 is sketched below.
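An end-to-end sketch of the FIG. 9 flow (S200-S210); `uav`, `camera`, and the helpers `capture_during_rotation()`, `build_depth_strips()`, and `distance_for_range()` are hypothetical stand-ins for the UAV control unit, the imaging control unit, and the sketches above, not an API from the publication.

```python
# End-to-end sketch of the imaging process of FIG. 9.
def panorama_with_depth_map(uav, camera, num_shots: int = 12):
    uav.take_off()                                   # S200
    camera.set_mode("panorama_with_depth_map")       # S202
    uav.move_to_first_location()
    first_pass = capture_during_rotation(num_shots)  # S204: rotate, alternate focus
    depth_map = build_depth_strips(first_pass)       # S206: blur -> depth
    second_pass = []
    for rng in range(num_shots):                     # S208: rotate again
        camera.focus_to(distance_for_range(depth_map, rng))  # hypothetical helper
        second_pass.append(camera.capture())
    camera.store(depth_map, second_pass)             # S210: store in association
```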
  • An example in which the UAV 10 rotates 360 degrees and the generation unit 116 generates a depth map including depth information over a 360-degree range has been described above.
  • However, the UAV 10 may be rotated by a rotation angle smaller than 360 degrees, and the generation unit 116 may generate a depth map including depth information corresponding to the range of that rotation angle.
  • The imaging device 100 may also perform imaging in three-dimensional space. When the imaging device 100 performs imaging in three-dimensional space, the attitude of the imaging device 100 is adjusted by controlling the gimbal 50 so that the imaging device 100 can perform the imaging.
  • The attitude of the imaging device 100 may also be controlled by controlling the attitude of the UAV 10 instead of controlling the attitude of the imaging device 100 through the gimbal. In this way, a depth map can be obtained in three-dimensional space. If a plurality of imaging devices 100 are provided in order to obtain additional repetition ranges, a depth map can be obtained in fewer man-hours than when a single imaging device 100 is used.
  • FIG. 10 shows an example of a computer 1200 that may embody aspects of the present invention in whole or in part.
  • A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention, or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to perform that operation or the function of those one or more "units".
  • The program can cause the computer 1200 to execute a process according to an embodiment of the present invention, or a stage of that process.
  • Such a program may be executed by a CPU 1212 to cause the computer 1200 to perform specific operations associated with some or all of the flowcharts and block diagrams described in this specification.
  • The computer 1200 of the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210.
  • The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220.
  • The computer 1200 also includes a ROM 1230.
  • The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • The communication interface 1222 communicates with other electronic devices via a network.
  • A hard disk drive may store programs and data used by the CPU 1212 of the computer 1200.
  • The ROM 1230 stores a boot program and the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200.
  • Programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • A program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212.
  • The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • A device or method may be constituted by realizing operations or processing of information through the use of the computer 1200.
  • For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
  • Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
  • The CPU 1212 can cause the RAM 1214 to read all or necessary portions of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • Various types of information, such as various programs, data, tables, and databases, can be stored in a recording medium and subjected to information processing.
  • For the data read from the RAM 1214, the CPU 1212 can perform various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • In addition, the CPU 1212 can retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve, from the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, and read the attribute value of the second attribute stored in that entry, thereby obtaining the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

A depth map is generated while suppressing the amount of movement of the focus lens. The control device may include: a control unit that causes an imaging device to photograph a first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a first positional relationship, and causes the imaging device to photograph a second imaging range in a state where the imaging surface and the focus lens are in a second positional relationship, the second imaging range being different from the first imaging range and including a first repetition range that overlaps the first imaging range; an acquisition unit that acquires a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device; a calculation unit that calculates the respective blur amounts of a first image corresponding to the first repetition range included in the first captured image and a second image corresponding to the first repetition range included in the second captured image; and a generation unit that generates, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first repetition range.

Description

Control device, imaging device, moving body, control method, and program
Technical Field
The present invention relates to a control device, an imaging device, a moving body, a control method, and a program.
Background Art
An imaging device is disclosed that causes an image processing unit to generate moving-image data while moving the focus position of an optical system from the nearest-end side to the infinity-end side, and that extracts, from a plurality of frame images included in the moving-image data, a still image focused on a specified area.
Patent Document 1: International Publication No. WO 2017/006538
Summary of the Invention
The present invention seeks to generate a depth map while suppressing the amount of movement of the focus lens.
The control device according to one aspect of the present invention may include a control unit that causes an imaging device to photograph a first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a first positional relationship, and causes the imaging device to photograph a second imaging range in a state where the imaging surface and the focus lens are in a second positional relationship, the second imaging range being different from the first imaging range and including a first repetition range that overlaps the first imaging range. The control device may include an acquisition unit that acquires a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device. The control device may include a calculation unit that calculates the respective blur amounts of a first image corresponding to the first repetition range included in the first captured image and a second image corresponding to the first repetition range included in the second captured image. The control device may include a generation unit that generates, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first repetition range.
The first imaging range may overlap the second imaging range by more than half.
The control unit may cause the imaging device to photograph a third imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a third positional relationship, the third imaging range being different from the second imaging range and including a second repetition range that overlaps the second imaging range. The acquisition unit may acquire a third captured image of the third imaging range captured by the imaging device. The calculation unit may calculate the respective blur amounts of a third image corresponding to the second repetition range included in the second captured image and a fourth image corresponding to the second repetition range included in the third captured image. The generation unit may generate, based on the respective blur amounts of the third image and the fourth image, a depth map further including depth information corresponding to the second repetition range.
The second imaging range may overlap the third imaging range by more than half.
The control unit may, while the imaging direction of the imaging device is changing, cause the imaging device to photograph the first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and cause the imaging device to photograph the second imaging range in a state where the imaging surface and the focus lens are in the second positional relationship.
The control unit may cause the imaging device to capture the first captured image and the second captured image during a first rotation of the imaging device around a first point that changes the imaging direction of the imaging device. The control unit may, during a second rotation of the imaging device around the first point that changes the imaging direction of the imaging device, control the position of the focus lens of the imaging device in accordance with the depth map and cause the imaging device to capture a plurality of captured images anew.
The control unit may store the depth map and the plurality of captured images in a storage unit in association with each other.
The control unit may, while the imaging device moves along a first trajectory, cause the imaging device to photograph the first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and cause the imaging device to photograph the second imaging range in a state where the imaging surface and the focus lens are in the second positional relationship.
The control unit may cause the imaging device to capture the first captured image and the second captured image during a first movement of the imaging device along the first trajectory, and, during a second movement of the imaging device along the first trajectory, control the position of the focus lens of the imaging device in accordance with the depth map and cause the imaging device to capture a plurality of captured images anew.
The control unit may store the depth map and the plurality of captured images in a storage unit in association with each other.
The imaging device according to one aspect of the present invention may include the control device described above. The imaging device may include the focus lens.
The moving body according to one aspect of the present invention may be a moving body that includes the imaging device described above and moves.
The control method according to one aspect of the present invention may include a stage of causing an imaging device to photograph a first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a first positional relationship, and causing the imaging device to photograph a second imaging range in a state where the imaging surface and the focus lens are in a second positional relationship, the second imaging range being different from the first imaging range and including a first repetition range that overlaps the first imaging range. The control method may include a stage of acquiring a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device. The control method may include a stage of calculating the respective blur amounts of a first image corresponding to the first repetition range included in the first captured image and a second image corresponding to the first repetition range included in the second captured image. The control method may include a stage of generating, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first repetition range.
The program according to one aspect of the present invention may be a program for causing a computer to function as the control device described above.
According to one aspect of the present invention, a depth map can be generated while suppressing the amount of movement of the focus lens.
The above summary of the present invention does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
The accompanying drawings are provided for further understanding of the present invention and constitute a part of the specification. Together with the following detailed description, they serve to explain the present invention, but do not limit it. In the drawings:
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 2 is a diagram showing an example of the functional blocks of the unmanned aerial vehicle.
FIG. 3 is a diagram showing an example of a curve representing the relationship between the blur amount and the lens position.
FIG. 4 is a diagram showing an example of a process of calculating the distance to an object based on the blur amount.
FIG. 5 is a diagram for explaining the relationship among the object position, the lens position, and the focal length.
FIG. 6 is a diagram for explaining a manner in which the imaging device performs imaging while the unmanned aerial vehicle is rotating.
FIG. 7A is a diagram for explaining a manner in which the imaging device performs imaging while the unmanned aerial vehicle is rotating.
FIG. 7B is a diagram showing an example of the relationship between captured images and the focusing distance of the focus lens.
FIG. 8 is a diagram showing an example of a panoramic image generated from a plurality of captured images.
FIG. 9 is a flowchart showing an example of an imaging process by the imaging device mounted on the UAV.
FIG. 10 is a diagram showing an example of a hardware configuration.
[Description of Reference Numerals]
10 UAV
20 UAV body
30 UAV control unit
36 Communication interface
37 Memory
40 Propulsion unit
41 GPS receiver
42 Inertial measurement unit
43 Magnetic compass
44 Barometric altimeter
45 Temperature sensor
46 Humidity sensor
50 Gimbal
60 Imaging device
100 Imaging device
102 Imaging unit
110 Imaging control unit
112 Acquisition unit
114 Calculation unit
116 Generation unit
118 Synthesis unit
120 Image sensor
130 Memory
200 Lens unit
210 Lens
212 Lens driving unit
214 Position sensor
220 Lens control unit
222 Memory
300 Remote operation device
1200 Computer
1210 Host controller
1212 CPU
1214 RAM
1220 Input/output controller
1222 Communication interface
1230 ROM
Detailed Description of the Embodiments
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are essential to the solution of the invention. It will be apparent to those of ordinary skill in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that modes with such changes or improvements are also included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner will not object to anyone's reproduction of these documents as they appear in the files or records of the Patent Office. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device having the role of performing the operation. Specific stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field-programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored therein constitutes a product that includes instructions that can be executed to create means for performing the operations specified by the flowcharts or block diagrams. Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples of the computer-readable medium may include a floppy disk (registered trademark), a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified by the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10, i.e., a mobile body, is a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. A flying body moving in the air is a concept that includes not only a UAV but also other aircraft, airships, helicopters, and the like that move in the air.
The UAV body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion unit. The UAV body 20 makes the UAV 10 fly by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to make the UAV 10 fly. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that images a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis. The gimbal 50 further uses actuators to rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively. The gimbal 50 can change the attitude of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10. Two imaging devices 60 may be provided on the nose, i.e., the front, of the UAV 10. The other two imaging devices 60 may be provided on the bottom surface of the UAV 10. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 10 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it suffices that the UAV 10 includes at least one imaging device 60. The UAV 10 may include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 10. The angle of view settable in the imaging device 60 may be larger than the angle of view settable in the imaging device 100. The imaging device 60 may have a single-focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV 10 to operate the UAV 10 remotely. The remote operation device 300 may communicate with the UAV 10 wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascend command that makes the UAV 10 ascend. The UAV 10 ascends while it is receiving the ascend command. When the altitude of the UAV 10 has reached the upper limit, the UAV 10 may restrict its ascent even if it receives the ascend command.
FIG. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, an imaging device 60, and an imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 may receive from the remote operation device 300 instruction information including various commands for the UAV control unit 30. The memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging device 60, and the imaging device 100. The memory 37 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drive (SSD). The memory 37 may be provided inside the UAV body 20. It may be provided so as to be removable from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 according to programs stored in the memory 37. The UAV control unit 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 according to commands received from the remote operation device 300 via the communication interface 36. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 has a plurality of rotors and a plurality of drive motors that rotate the rotors. The propulsion unit 40 rotates the plurality of rotors via the plurality of drive motors according to commands from the UAV control unit 30 to make the UAV 10 fly.
The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, i.e., the position (latitude and longitude) of the UAV 10, from the received signals. The IMU 42 detects the attitude of the UAV 10. The IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down of the UAV 10 and the angular velocities around the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 has an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be composed of a CCD or CMOS. The image sensor 120 captures an optical image formed via a plurality of lenses 210 and outputs the captured image to the imaging control unit 110. The imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 may control the imaging device 100 according to operation commands for the imaging device 100 from the UAV control unit 30. The imaging control unit 110 is an example of a first control unit and a second control unit. The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drive (SSD). The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100. The memory 130 may be provided so as to be removable from the housing of the imaging device 100.
The lens unit 200 has a plurality of lenses 210, a plurality of lens drive units 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens drive unit 212 moves at least some or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring. The lens drive unit 212 may include an actuator. The actuator may include a stepping motor. The lens control unit 220 drives the lens drive unit 212 according to lens control commands from the imaging unit 102 to move one or more lenses 210 along the optical axis via the mechanism member. The lens control commands are, for example, zoom control commands and focus control commands.
The lens unit 200 also has a memory 222 and a position sensor 214. The lens control unit 220 controls the movement of the lenses 210 in the optical axis direction via the lens drive unit 212 according to lens operation commands from the imaging unit 102. Some or all of the lenses 210 move along the optical axis. The lens control unit 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the position of the lenses 210. The position sensor 214 may detect the current zoom position or focus position.
The lens drive unit 212 may include a shake correction mechanism. The lens control unit 220 may perform shake correction by moving the lenses 210 in the direction along the optical axis or in the direction perpendicular to the optical axis via the shake correction mechanism. The lens drive unit 212 may drive the shake correction mechanism with a stepping motor to perform shake correction. Alternatively, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in the direction along the optical axis or in the direction perpendicular to the optical axis to perform shake correction.
The memory 222 stores control values of the plurality of lenses 210 moved via the lens drive unit 212. The memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
The imaging device 100 configured in this way has a function of determining the distance from the lens to the subject (subject distance) in order to perform autofocus processing (AF processing) and the like. One method for determining the subject distance is to determine it based on the blur amounts of a plurality of images captured in different states of the positional relationship between the lens and the imaging surface. Herein, this method is referred to as the blur detection autofocus (Bokeh Detection Auto Focus: BDAF) method.
For example, the blur amount (cost) of an image can be expressed by the following equation (1) using a Gaussian function. In equation (1), x denotes the pixel position in the horizontal direction and σ denotes the standard deviation.
$$\mathrm{Cost}(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right)\qquad(1)$$
FIG. 3 shows an example of the curve represented by equation (1). By bringing the focus lens to the lens position corresponding to the lowest point 502 of the curve 500, focus can be achieved on the object included in the image I.
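For illustration only (this sketch is not part of the original disclosure), locating the lowest point of such a cost curve can be written as a few lines of Python. Here blur_cost is a hypothetical gradient-based stand-in for the Gaussian cost of equation (1), not the actual implementation:

```python
import numpy as np

def blur_cost(image: np.ndarray) -> float:
    """Illustrative proxy for the blur amount (Cost): small when the image is sharp."""
    gy, gx = np.gradient(image.astype(np.float64))
    sharpness = float(np.mean(gx ** 2 + gy ** 2))
    return 1.0 / (sharpness + 1e-12)

def best_lens_position(images, lens_positions):
    """Return the lens position at the lowest point of the cost curve (FIG. 3)."""
    costs = [blur_cost(img) for img in images]
    return lens_positions[int(np.argmin(costs))]
```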
FIG. 4 is a flowchart showing an example of the distance calculation procedure of the BDAF method. First, with the lens and the imaging surface in a first positional relationship, a first image I1 is captured by the imaging device 100 and stored in the memory 130. Next, the focus lens or the imaging surface of the image sensor 120 is moved along the optical axis direction so that the lens and the imaging surface are in a second positional relationship, and a second image I2 is captured by the imaging device 100 and stored in the memory 130 (S101). For example, as in so-called hill-climbing AF, the focus lens or the imaging surface of the image sensor 120 is moved along the optical axis direction without passing the in-focus point. The amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
Next, the imaging device 100 divides the image I1 into a plurality of regions (S102). A feature amount may be calculated for each pixel in the image I1, and groups of pixels having similar feature amounts may be taken as regions, thereby dividing the image I1 into the plurality of regions. Alternatively, only the pixel group in the range set as the AF processing frame in the image I1 may be divided into a plurality of regions. The imaging device 100 divides the image I2 into a plurality of regions corresponding to the plurality of regions of the image I1. The imaging device 100 calculates, for each of the plurality of regions, the distance to the object included in that region, based on the blur amount of each of the plurality of regions of the image I1 and the blur amount of each of the plurality of regions of the image I2 (S103).
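A minimal sketch of the S102/S103 flow follows (illustrative only; split_into_regions and distance_from_blur are hypothetical helpers standing in for the feature-based segmentation and the equation (2)-(4) distance computation described below):

```python
def bdaf_region_distances(img1, img2, split_into_regions, distance_from_blur):
    """Segment I1, reuse the segmentation on I2, and turn the per-region
    blur change between the two focus states into a per-region distance."""
    regions = split_into_regions(img1)  # S102: pixels with similar features
    distances = {}
    for region_id, mask in regions.items():
        # S103: same region observed under two lens/imaging-surface relationships
        distances[region_id] = distance_from_blur(img1[mask], img2[mask])
    return distances
```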
The distance calculation procedure will be further described with reference to FIG. 5. Let A be the distance from the lens L (principal point) to the subject 510 (object plane), B the distance from the lens L (principal point) to the position where the subject 510 forms an image (image plane), and F the focal length. In this case, the relationship among the distance A, the distance B, and the focal length F can be expressed by the following equation (2) from the lens formula.
$$\frac{1}{A}+\frac{1}{B}=\frac{1}{F}\qquad(2)$$
The focal length F is determined by the lens position. Therefore, if the distance B at which the subject 510 forms an image can be determined, the distance A from the lens L to the subject 510 can be determined using equation (2).
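As a direct transcription of equation (2) solved for A (an illustrative sketch, not part of the disclosure):

```python
def object_distance(B: float, F: float) -> float:
    """Solve the lens formula 1/A + 1/B = 1/F for the object distance A.

    B: distance from the lens principal point to the image plane
    F: focal length (same units as B)
    """
    if B <= F:
        raise ValueError("the image distance B must exceed the focal length F")
    return 1.0 / (1.0 / F - 1.0 / B)
```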
As shown in FIG. 5, the distance B, and in turn the distance A, can be determined by calculating the position where the subject 510 forms an image, based on the size of the blur of the subject 510 projected on the imaging surface (the circles of confusion 512 and 514). That is, the imaging position can be determined using the fact that the size of the blur (blur amount) is proportional to the distance between the imaging surface and the imaging position.
Here, let D1 be the distance from the lens L to the image I1, which is closer to the imaging surface, and D2 the distance from the lens L to the image I2, which is farther from the imaging surface. Each image is blurred. Let PSF be the point spread function at this time, and let Id1 and Id2 be the images at D1 and D2, respectively. In this case, the image I1, for example, can be expressed by the following equation (3) using a convolution operation.
$$I_{1}=\mathrm{PSF}\ast I_{d1}\qquad(3)$$
Furthermore, let f be the Fourier transform of the image data Id1 and Id2, and let OTF1 and OTF2 be the optical transfer functions obtained by Fourier-transforming the point spread functions PSF1 and PSF2 of the images Id1 and Id2. The ratio is then obtained as in the following equation (4).
$$C=\frac{f(I_{d1})}{f(I_{d2})}=\frac{\mathrm{OTF}_{1}}{\mathrm{OTF}_{2}}\qquad(4)$$
The value C shown in equation (4) represents the amount of change between the respective blur amounts of the images Id1 and Id2; that is, C corresponds to the difference between the blur amount of the image Id1 and the blur amount of the image Id2.
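A rough numerical sketch of equation (4) is shown below (illustrative assumptions: grayscale patches of identical size; a small epsilon guards against division by zero — none of this is mandated by the disclosure):

```python
import numpy as np

def blur_change_ratio(patch1: np.ndarray, patch2: np.ndarray) -> np.ndarray:
    """Spectral ratio C of equation (4): the scene content cancels in the
    ratio, leaving OTF1/OTF2, i.e. the change in blur between the patches."""
    F1 = np.fft.fft2(patch1.astype(np.float64))
    F2 = np.fft.fft2(patch2.astype(np.float64))
    return np.abs(F1) / (np.abs(F2) + 1e-12)
```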
Based on the images captured by the imaging device 100 configured as above, a depth map is generated while the amount of movement of the focus lens is kept small. A depth map is data representing the distance to the subject for each pixel or for each block including a plurality of pixels.
The imaging control unit 110 includes an acquisition unit 112, a calculation unit 114, a generation unit 116, and a synthesis unit 118.
While the imaging direction of the imaging device 100 changes or while the imaging device 100 moves along a first trajectory, the imaging control unit 110 causes the imaging device 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs. The imaging control unit 110 may do so while the UAV 10 rotates while hovering at a predetermined point. Here, an imaging range is the range of space captured by the imaging device 100.
The imaging control unit 110 may cause the imaging device 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs, while the UAV 10 is moving. The imaging control unit 110 may do so while the UAV 10 moves in a direction different from the imaging direction of the imaging device 100.
The imaging control unit 110 may move the focus lens and cause the imaging device 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, while the UAV 10 rotates while hovering at a predetermined point, or while the UAV 10 moves along the first trajectory.
The imaging control unit 110 may alternately switch the position of the focus lens between a first position and a second position while causing the imaging device 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, while the UAV 10 rotates while hovering at a predetermined point, or while the UAV 10 moves along the first trajectory.
As shown in FIG. 6, the imaging control unit 110 may cause the imaging device 100 to capture a first imaging range 601 in a state where the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 are in a first positional relationship, while the UAV 10 rotates while hovering at a predetermined point. The imaging control unit 110 may cause the imaging device 100 to capture a second imaging range 602 in a state where the imaging surface and the focus lens are in a second positional relationship, the second imaging range 602 being different from the first imaging range 601 and including a first overlapping range 611 that overlaps the first imaging range 601. The imaging control unit 110 may cause the imaging device 100 to capture a third imaging range 603 in a state where the imaging surface and the focus lens are in a third positional relationship, the third imaging range 603 being different from the second imaging range 602 and including a second overlapping range 612 that overlaps the second imaging range 602. The first imaging range 601 differs from the second imaging range 602 but may overlap the second imaging range 602 by half or more. Likewise, the second imaging range 602 differs from the third imaging range 603 but may overlap the third imaging range 603 by half or more.
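One way to read FIG. 6 is as a capture schedule; a minimal sketch follows (the numbers are assumptions for illustration: a 60° horizontal field of view, 30° yaw steps giving a half overlap, and the focus lens alternating between two distances):

```python
def capture_schedule(fov_deg=60.0, focus_distances=(1.0, 0.5)):
    """Yield (yaw_deg, focus_distance) pairs for one full rotation.

    A yaw step of half the field of view makes adjacent imaging ranges
    overlap by half, and alternating the focus lens between two positions
    makes neighbouring frames differ in focus state, as in FIG. 6.
    """
    step_deg = fov_deg / 2.0
    n_shots = int(round(360.0 / step_deg))
    for i in range(n_shots):
        yield i * step_deg, focus_distances[i % len(focus_distances)]
```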
The acquisition unit 112 acquires the plurality of captured images captured by the imaging device 100 while the imaging direction of the imaging device 100 changes or while the imaging device 100 moves along the first trajectory. The acquisition unit 112 may acquire the captured images of the plurality of imaging ranges captured by the imaging device 100 in states where the positional relationship between the imaging surface of the imaging device 100 and the focus lens of the imaging device 100 differs, while the UAV 10 rotates while hovering at a predetermined point, or while the UAV 10 moves along the first trajectory.
The calculation unit 114 calculates the respective blur amounts of the images of the overlapping ranges included in each of the plurality of captured images. The calculation unit 114 may calculate the blur amount (cost) of each image based on equation (1) using the Gaussian function.
The generation unit 116 generates, based on the respective blur amounts of the images of the plurality of overlapping ranges, a depth map including depth information corresponding to each of the plurality of overlapping ranges. The generation unit 116 may generate, by the BDAF method, a depth map including depth information indicating the distance to the subject for each pixel, or for each block including a plurality of pixels, of the images of the overlapping ranges.
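An illustrative per-block sketch of this generation step is given below (block_distance is a hypothetical helper applying the BDAF math of equations (2)-(4) to one pair of co-located blocks; the block size of 16 is an assumption):

```python
import numpy as np

def depth_map_for_overlap(img_a, img_b, block_distance, block=16):
    """Coarse depth map over one overlapping range: img_a and img_b show
    the same range captured at two different focus-lens positions."""
    h, w = img_a.shape
    depth = np.zeros((h // block, w // block))
    for by in range(depth.shape[0]):
        for bx in range(depth.shape[1]):
            ys, xs = by * block, bx * block
            depth[by, bx] = block_distance(
                img_a[ys:ys + block, xs:xs + block],
                img_b[ys:ys + block, xs:xs + block],
            )
    return depth
```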
As shown in FIGS. 7A and 7B, while the UAV 10 rotates while hovering at a predetermined point, the imaging control unit 110 may set the focus distance of the focus lens to a first position (1.0 m) and cause the imaging device 100 to capture the first imaging range 601; the acquisition unit 112 may acquire a first captured image 701 of the first imaging range 601. The imaging control unit 110 may then set the focus distance of the focus lens to a second position (0.5 m) and cause the imaging device 100 to capture the second imaging range 602; the acquisition unit 112 may acquire a second captured image 702 of the second imaging range 602. Furthermore, the imaging control unit 110 may set the focus distance of the focus lens back to the first position (1.0 m) and cause the imaging device 100 to capture the third imaging range 603; the acquisition unit 112 may acquire a third captured image 703 of the third imaging range 603.
The calculation unit 114 may calculate the respective blur amounts of a first image 710 included in the first captured image 701 and corresponding to the first overlapping range 611, and a second image 711 included in the second captured image 702 and corresponding to the first overlapping range 611. The calculation unit 114 may calculate the respective blur amounts of a third image 712 included in the second captured image 702 and corresponding to the second overlapping range 612, and a fourth image 713 included in the third captured image 703 and corresponding to the second overlapping range 612.
The generation unit 116 may generate a depth map including depth information corresponding to the first overlapping range 611 based on the respective blur amounts of the first image 710 and the second image 711. The generation unit 116 may generate the depth map further including depth information corresponding to the second overlapping range 612 based on the respective blur amounts of the third image 712 and the fourth image 713.
As described above, the imaging device 100 according to the present embodiment generates, by the BDAF method, a depth map corresponding to the overlapping ranges, based on a plurality of captured images of imaging ranges that differ from each other and partially overlap. With the BDAF method, the distance to the subject can be determined without moving the focus lens from the closest end to the infinity end. Furthermore, the depth map is generated based on captured images taken while the imaging direction of the imaging device 100 is changed or while the imaging device 100 is moved along the first trajectory. Therefore, a depth map including depth information over a wide range can be generated in a relatively short time.
The imaging control unit 110 may control the position of the focus lens of the imaging device 100 based on the depth map and further cause the imaging device 100 to capture a plurality of captured images.
The imaging control unit 110 may cause the imaging device 100 to capture the first captured image and the second captured image while the imaging device 100 makes a first rotation around a first point so that the imaging direction of the imaging device 100 changes. While the imaging device 100 makes a second rotation around the first point so that its imaging direction changes, the imaging control unit 110 may control the position of the focus lens of the imaging device 100 according to the depth map and cause the imaging device 100 to capture a plurality of captured images anew.
The imaging control unit 110 may cause the imaging device 100 to capture the first captured image and the second captured image while the imaging device 100 moves along the first trajectory for the first time, and, while the imaging device 100 moves along the first trajectory for the second time, control the position of the focus lens of the imaging device 100 according to the depth map and cause the imaging device 100 to capture a plurality of captured images anew.
The imaging control unit 110 may cause the imaging device 100 to capture a plurality of captured images of imaging ranges that differ from each other and partially overlap, while the UAV 10 makes a first rotation around a first point while hovering. The acquisition unit 112 may acquire the plurality of captured images of the mutually different and partially overlapping imaging ranges. The calculation unit 114 may calculate the blur amounts of the images of the overlapping ranges. The generation unit 116 may then generate a depth map including depth information corresponding to the overlapping ranges based on those blur amounts.
The imaging control unit 110 may determine the distance to a desired subject according to the depth map. While the UAV 10 makes a second rotation around the first point while hovering, the imaging control unit 110 may control the position of the focus lens according to the distance to the identified desired subject and cause the imaging device 100 to capture a plurality of captured images. During the first rotation around the first point while hovering, the imaging control unit 110 may control the focus lens to a predetermined focus distance without using the depth map and cause the imaging device 100 to capture a plurality of captured images.
The imaging control unit 110 may store the depth map and the plurality of captured images in association with each other in a storage unit such as the memory 130.
The synthesis unit 118 may synthesize the plurality of captured images to generate a panoramic image as shown in FIG. 8. The synthesis unit 118 may associate the panoramic image with the depth map and store them in a storage unit such as the memory 130.
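The synthesis could, for example, rely on an off-the-shelf stitcher. A hedged sketch using OpenCV follows (OpenCV is an assumption for illustration; the disclosure does not name a particular library):

```python
import cv2

def make_panorama(images):
    """Stitch the overlapping captured images into one panoramic image."""
    stitcher = cv2.Stitcher_create()
    status, panorama = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```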
As long as the imaging ranges differ, the proportion of the overlapping range between adjacent imaging ranges may be any proportion. To generate a depth map including depth information for the entire imaging range captured by the imaging device 100, the proportion of the overlapping range between adjacent imaging ranges must be less than 1 and at least 1/2. On the other hand, to generate a depth map including depth information for only part of the imaging range captured by the imaging device 100, the proportion of the overlapping range between adjacent imaging ranges may be less than 1/2 and greater than 0.
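As a worked example (the angles are illustrative, not from the disclosure): if the horizontal angle of view is $\theta$ and consecutive captures are rotated by $\Delta$, adjacent imaging ranges overlap by the fraction $1-\Delta/\theta$. Requiring an overlap of at least $1/2$ for full depth coverage gives $\Delta \le \theta/2$; for $\theta = 60°$, the yaw step between captures must therefore be at most 30°.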
FIG. 9 is a flowchart showing an example of the imaging process of the imaging device 100 mounted on the UAV 10.
The UAV 10 starts flying (S200). According to a command from the user via the remote operation device 300, the imaging mode of the imaging device 100 is set to a panorama-with-depth-map mode (S202). The UAV control unit 30 moves the UAV 10 to a desired first point. Then, the imaging control unit 110 rotates the UAV 10 via the UAV control unit 30 while it hovers at the first point, moving the focus lens while causing the imaging device 100 to capture a plurality of captured images of imaging ranges that differ from each other and partially overlap (S204). The imaging control unit 110 may alternately switch the position of the focus lens between the first position and the second position while causing the imaging device 100 to capture the plurality of captured images of the mutually different and partially overlapping imaging ranges.
The calculation unit 114 calculates the blur amounts of the images of the overlapping ranges, and the generation unit 116 generates a depth map including depth information corresponding to the overlapping ranges based on those blur amounts (S206).
Next, the imaging control unit 110 rotates the UAV 10 again via the UAV control unit 30 while it hovers at the first point, controlling the focus lens according to the depth map while causing the imaging device 100 to capture a plurality of captured images (S208). The imaging control unit 110 may determine the distance to a desired subject according to the depth map and adjust the position of the focus lens according to this distance. The imaging control unit 110 may then rotate the UAV 10 via the UAV control unit 30 while it hovers at the first point and capture a plurality of captured images at the adjusted focus lens position. For example, when causing the imaging device 100 to shoot at each of a plurality of focus distances, the imaging control unit 110 may rotate the UAV 10 once per focus distance and have the imaging device 100 shoot each time. When the UAV 10 carries a plurality of imaging devices 100, the imaging control unit 110 may set a different focus distance for each of the imaging devices 100 and, during a single rotation of the UAV 10, have each of the imaging devices 100 capture a plurality of captured images at its own focus distance.
The imaging control unit 110 stores the captured images and the depth map in association with each other in the memory 130 (S210). The synthesis unit 118 may synthesize the plurality of captured images to generate a panoramic image, and the imaging control unit 110 may store this panoramic image and the depth map in association with each other in the memory 130.
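The overall two-pass flow of FIG. 9 might be orchestrated as in the following sketch (every name here is a hypothetical placeholder for the units described above, not an API defined by the disclosure):

```python
def panorama_with_depth_mode(uav, camera, memory):
    """FIG. 9 flow, S204-S210, with hypothetical placeholder calls."""
    # S204: first rotation - alternate focus positions while capturing
    # mutually different, partially overlapping imaging ranges.
    first_pass = capture_rotation(uav, camera, focus_distances=(1.0, 0.5))
    # S206: blur amounts over the overlapping ranges -> depth map (BDAF).
    depth_map = build_depth_map(first_pass)
    # S208: second rotation - control the focus lens per the depth map.
    second_pass = capture_rotation(uav, camera, depth_map=depth_map)
    # S210: store the new images in association with the depth map.
    memory.store(images=second_pass, depth_map=depth_map)
```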
The above description covers an example in which the UAV 10 rotates 360 degrees and the generation unit 116 generates a depth map including depth information over the 360-degree range. However, the UAV 10 may rotate through an angle smaller than 360 degrees, and the generation unit 116 may generate a depth map including depth information corresponding to that rotation angle range. An example was also described in which the imaging device 100 of the UAV 10 rotates 360 degrees around the yaw axis, but the imaging device 100 may also shoot in three-dimensional space. When the imaging device 100 shoots in three-dimensional space, the attitude of the imaging device 100 is adjusted by controlling the gimbal 50 so that the imaging device 100 can shoot. For example, the pitch axis and roll axis of the gimbal 50 are first fixed and the imaging device 100 is rotated around the yaw axis; the imaging device 100 is then tilted in the positive or negative direction around the pitch axis and rotated around the yaw axis again. If the UAV 10 is balanced, the attitude of the imaging device 100 may also be controlled by controlling the attitude of the UAV 10 rather than via the gimbal. In this way, a depth map can be acquired in three-dimensional space. If a plurality of imaging devices 100 are provided so that further overlapping ranges can be obtained, the depth map can be obtained in less time than with a single imaging device 100.
FIG. 10 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention, or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to perform that operation or those one or more "units". The program can cause the computer 1200 to perform a process according to an embodiment of the present invention or stages of that process. Such a program may be executed by a CPU 1212 to cause the computer 1200 to perform specific operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
The computer 1200 according to the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing according to the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as programs, data, tables, and databases may be stored in the recording medium and subjected to information processing. For the data read from the RAM 1214, the CPU 1212 may perform various types of processing described throughout the present disclosure and specified by instruction sequences of programs, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, retrieval/replacement of information, and the like, and write the results back to the RAM 1214. The CPU 1212 may also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry that matches a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as the computer-readable storage medium, whereby the programs are provided to the computer 1200 via the network.
It should be noted that the order of execution of the operations, procedures, steps, stages, and other processes in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be implemented in any order, unless specifically indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be implemented in this order.
Although the present invention has been described above using embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and improvements can be made to the above embodiments. It is apparent from the claims that embodiments incorporating such changes or improvements can be included in the technical scope of the present invention.

Claims (14)

  1. A control device, characterized by comprising:
    a control unit that causes an imaging device to capture a first imaging range in a state where an imaging surface of the imaging device and a focus lens of the imaging device are in a first positional relationship, and causes the imaging device to capture a second imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a second positional relationship, the second imaging range being different from the first imaging range and including a first overlapping range that overlaps the first imaging range;
    an acquisition unit that acquires a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device;
    a calculation unit that calculates respective blur amounts of a first image included in the first captured image and corresponding to the first overlapping range, and a second image included in the second captured image and corresponding to the first overlapping range; and
    a generation unit that generates, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first overlapping range.
  2. The control device according to claim 1, wherein the first imaging range overlaps the second imaging range by half or more.
  3. The control device according to claim 1, wherein:
    the control unit causes the imaging device to capture a third imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a third positional relationship, the third imaging range being different from the second imaging range and including a second overlapping range that overlaps the second imaging range;
    the acquisition unit further acquires a third captured image of the third imaging range captured by the imaging device;
    the calculation unit further calculates respective blur amounts of a third image included in the second captured image and corresponding to the second overlapping range, and a fourth image included in the third captured image and corresponding to the second overlapping range; and
    the generation unit generates, based on the respective blur amounts of the third image and the fourth image, the depth map further including depth information corresponding to the second overlapping range.
  4. The control device according to claim 3, wherein the second imaging range overlaps the third imaging range by half or more.
  5. The control device according to claim 1, wherein, while the imaging direction of the imaging device changes, the control unit causes the imaging device to capture the first imaging range in the state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and causes the imaging device to capture the second imaging range in the state where the imaging surface of the imaging device and the focus lens of the imaging device are in the second positional relationship.
  6. The control device according to claim 5, wherein the control unit causes the imaging device to capture the first captured image and the second captured image while the imaging device makes a first rotation around a first point so that the imaging direction of the imaging device changes, and, while the imaging device makes a second rotation around the first point so that the imaging direction of the imaging device changes, controls the position of the focus lens of the imaging device according to the depth map and causes the imaging device to capture a plurality of captured images anew.
  7. The control device according to claim 6, wherein the control unit stores the depth map and the plurality of captured images in a storage unit in association with each other.
  8. The control device according to claim 1, wherein, while the imaging device moves along a first trajectory, the control unit causes the imaging device to capture the first imaging range in the state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and causes the imaging device to capture the second imaging range in the state where the imaging surface of the imaging device and the focus lens of the imaging device are in the second positional relationship.
  9. The control device according to claim 8, wherein the control unit causes the imaging device to capture the first captured image and the second captured image while the imaging device moves along the first trajectory for the first time, and, while the imaging device moves along the first trajectory for the second time, controls the position of the focus lens of the imaging device according to the depth map and causes the imaging device to capture a plurality of captured images anew.
  10. The control device according to claim 9, wherein the control unit stores the depth map and the plurality of captured images in a storage unit in association with each other.
  11. An imaging device, characterized by comprising:
    the control device according to any one of claims 1 to 10; and
    the focus lens.
  12. A mobile body, characterized in that it includes the imaging device according to claim 11 and moves.
  13. A control method, characterized by comprising:
    a stage of causing an imaging device to capture a first imaging range in a state where an imaging surface of the imaging device and a focus lens of the imaging device are in a first positional relationship, and causing the imaging device to capture a second imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a second positional relationship, wherein the second imaging range is different from the first imaging range and includes a first overlapping range that overlaps the first imaging range;
    a stage of acquiring a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device;
    a stage of calculating respective blur amounts of a first image included in the first captured image and corresponding to the first overlapping range, and a second image included in the second captured image and corresponding to the first overlapping range; and
    a stage of generating, based on the respective blur amounts of the first image and the second image, a depth map including depth information corresponding to the first overlapping range.
  14. A program, characterized in that it causes a computer to function as the control device according to any one of claims 1 to 10.