WO2019242616A1 - Determination device, imaging system, moving body, synthesis system, determination method, and program


Info

Publication number
WO2019242616A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2019/091742
Other languages
English (en)
French (fr)
Inventor
本庄谦一
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to CN201980005209.8A (published as CN111264055A)
Publication of WO2019242616A1
Priority to US17/122,948 (published as US20210105411A1)


Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18 Focusing aids
    • G03B13/30 Focusing aids indicating depth of field
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/671 Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/676 Bracketing for image capture at varying focusing conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H04N23/685 Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687 Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/20 Remote controls
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/006 Apparatus mounted on flying objects

Definitions

  • the present invention relates to a determination device, a camera system, a moving body, a synthesis system, a determination method, and a program.
  • Patent Document 1 discloses extracting a still image focused on a specified area from a plurality of frame images included in the moving image data.
  • Patent Document 2 discloses selecting, from a plurality of image data captured while shifting the focus position by a predetermined amount, the image data to be synthesized based on the shift amount of the focus position and the resolution of the image data, and synthesizing the selected plurality of image data.
  • Patent Document 1 International Publication No. 2017/006538
  • Patent Document 2 Japanese Patent Application Publication No. 2015-231058
  • the magnification of a subject imaged on the imaging surface changes when the focusing distance changes.
  • the determination device may include an acquisition section that acquires a plurality of focus distances corresponding to each of a plurality of subjects included in an imaging range of the imaging device.
  • the determination device may include a determination unit that determines a distance between the imaging device and the plurality of subjects when the imaging device photographs each of the plurality of subjects based on the plurality of focusing distances.
  • the determination unit may set the distance between the imaging device and the plurality of subjects to be longer when photographing a subject that is farther away from the imaging device among the plurality of subjects.
  • alternatively, the determination unit may set the distance between the imaging device and the plurality of subjects to be shorter when photographing a subject that is farther away from the imaging device among the plurality of subjects.
  • the determination device may include a first control unit that, in a state where the distance between the imaging device and the plurality of subjects is set to the distance determined by the determination unit, causes the imaging device to photograph each of the plurality of subjects at each of the plurality of focusing distances acquired by the acquisition unit.
  • the determination device may include a second control unit that causes the imaging device to capture a plurality of images while changing the position of the focus lens of the imaging device in a state where the imaging device is maintained at a first position.
  • the determination device may include a specifying section that specifies a plurality of focus distances for a plurality of subjects based on a plurality of images.
  • the determination unit may reduce, for a subject closer to the imaging device among the plurality of subjects, the difference between the distance between the imaging device and the plurality of subjects when photographing that subject and the distance between the first position of the imaging device and the plurality of subjects.
  • the determination unit may determine the distance between the imaging device and the plurality of subjects when photographing the subject closest to the imaging device among the plurality of subjects to be the distance between the first position of the imaging device and the plurality of subjects.
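The distance determination sketched in the bullets above can be illustrated with a small example. This is our illustration, not code from the publication: the table of effective focal length versus focusing distance is invented, and magnification is approximated as m ≈ f/u. Each shot's camera distance is then chosen so that magnification matches the shot of the closest subject, which is taken from the first position:

```python
# Hypothetical sketch: choose per-subject shooting distances so that the
# on-sensor magnification stays constant despite focus breathing.
# The focal-length table and the m ~ f/u model are assumptions for illustration.

def effective_focal_length(focus_distance_m, table):
    """Linearly interpolate effective focal length (mm) at a focus distance (m)."""
    pts = sorted(table.items())
    if focus_distance_m <= pts[0][0]:
        return pts[0][1]
    if focus_distance_m >= pts[-1][0]:
        return pts[-1][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if d0 <= focus_distance_m <= d1:
            t = (focus_distance_m - d0) / (d1 - d0)
            return f0 + t * (f1 - f0)

def shooting_distances(focus_distances_m, table):
    """For each subject's focusing distance, pick a camera distance (m) so
    that magnification (~ f/u) matches that of the closest subject, which
    is photographed from the first position."""
    closest = min(focus_distances_m)
    m_ref = effective_focal_length(closest, table) / (closest * 1000.0)
    return [effective_focal_length(d, table) / (m_ref * 1000.0)
            for d in focus_distances_m]

# Invented lens: effective focal length grows slightly toward far focus.
TABLE = {1.0: 34.0, 5.0: 35.0, 20.0: 35.5}
print(shooting_distances([1.0, 5.0, 20.0], TABLE))
```

With this invented table the focal length grows toward far focus, so the computed distances grow for farther subjects (the "longer" variant above); a lens whose effective focal length shrinks at far focus would yield the "shorter" variant.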
  • the imaging system may include the determination device described above.
  • the imaging system may include an imaging device having a lens system that includes a focus lens.
  • the lens system may be a single focus lens.
  • the moving body may be a moving body that carries the imaging system and moves.
  • the moving body may include a third control section that controls the movement of the moving body such that the distance between the imaging device and the plurality of subjects is a distance determined by the determination section.
  • a synthesis system may include the above-mentioned determination device.
  • the synthesis system may include a synthesis section that synthesizes the plurality of images obtained when, in a state where the distance between the imaging device and the plurality of subjects is set to the distance determined by the determination unit, the first control unit causes the imaging device to photograph each of the plurality of subjects at each of the plurality of focusing distances acquired by the acquisition unit.
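As a rough illustration of the synthesis step (our sketch; the publication does not give an algorithm), a set of images focused at different distances can be combined by taking, per pixel, the value from the image with the highest local contrast there:

```python
# Focus-stacking sketch (illustrative assumption, not the patent's method).
# Images are 2D grayscale lists of equal size.

def local_contrast(img, x, y):
    """Absolute difference to the right-hand neighbor (edge pixels use the
    left-hand neighbor) as a crude per-pixel sharpness measure."""
    if x + 1 < len(img[0]):
        return abs(img[y][x + 1] - img[y][x])
    return abs(img[y][x] - img[y][x - 1])

def stack(images):
    """Per pixel, copy the value from the image that is sharpest there."""
    h, w = len(images[0]), len(images[0][0])
    return [[max(images, key=lambda im: local_contrast(im, x, y))[y][x]
             for x in range(w)] for y in range(h)]
```

Note that this per-pixel selection only works if the subjects have the same size and position in every frame, which is exactly why the determination unit adjusts the shooting distance beforehand.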
  • the determination method may include a step of acquiring a plurality of focusing distances corresponding to each of a plurality of subjects included in an imaging range of the imaging device.
  • the determination method may include a step of determining a distance between the imaging device and the plurality of subjects when the imaging device photographs each of the plurality of subjects based on the plurality of focusing distances.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned determination device.
  • according to the present invention, when a plurality of images are captured while changing the focusing distance using an imaging device in which the magnification of a subject imaged on the imaging surface changes with the focusing distance, changes in the size of the subject included in each of the plurality of images can be suppressed.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) and a remote operation device.
  • FIG. 2 is a diagram showing an example of a functional block of a UAV.
  • FIG. 3 is a diagram illustrating an example of a positional relationship between a plurality of subjects and an imaging device.
  • FIG. 4 is an example of an image including a plurality of subjects in a positional relationship shown in FIG. 3.
  • FIG. 5 is a diagram illustrating an example of a correspondence relationship between a temporal change in the focusing distance of the imaging device and a temporal change in the distance from the closest subject to the imaging device.
  • FIG. 6 is a diagram showing an example of a relationship between an evaluation value of contrast and a focusing distance.
  • FIG. 7 is a diagram for explaining a case where the imaging device captures a plurality of images while moving the UAV.
  • FIG. 8 is a diagram for explaining a case where a composite image is generated from a plurality of images captured by an imaging device.
  • FIG. 9 is a flowchart showing an example of an imaging process of an imaging device mounted on a UAV.
  • FIG. 10 is a diagram showing an example of a hardware configuration.
  • the blocks may indicate (1) a stage of a process of performing an operation or (2) a "part" of a device having a function of performing an operation.
  • Certain stages and "parts" may be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits may include memory elements such as flip-flops and registers, logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as field-programmable gate arrays (FPGAs) and programmable logic arrays (PLAs).
  • the computer-readable medium may include any tangible device that can store instructions executed by a suitable device.
  • a computer-readable medium including instructions stored therein includes a product including instructions that can be executed to create a means for performing the operations specified by the flowchart or block diagram.
  • the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like.
  • as more specific examples of a computer-readable medium, a floppy (registered trademark) disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like may be included.
  • the computer-readable instructions may include any one of source code or object code described by any combination of one or more programming languages.
  • the source code or object code may include assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or code written in an object-oriented programming language such as Smalltalk, JAVA (registered trademark), or C++, or in a conventional procedural programming language.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • a processor or programmable circuit can execute computer-readable instructions to create a means for performing the operations specified in the flowchart or block diagram.
  • Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • the UAV 10 includes a UAV body 20, a universal joint 50, a plurality of imaging devices 60, and an imaging device 100.
  • the gimbal 50 and the imaging device 100 are examples of an imaging system.
  • UAV10 is an example of a moving body.
  • a moving body is a concept including a flying body moving in the air, a vehicle moving on the ground, and a ship moving on the water.
  • a flying object moving in the air refers to a concept that includes not only UAVs, but also other aircraft, airships, and helicopters that move in the air.
  • the UAV body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion part.
  • the UAV body 20 controls the rotation of a plurality of rotors to fly the UAV 10.
  • the UAV body 20 uses, for example, four rotors to fly the UAV 10.
  • the number of rotors is not limited to four.
  • UAV 10 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures an object included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the universal joint 50 is an example of a support mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 around a pitch axis.
  • the gimbal 50 uses an actuator to further rotatably support the imaging device 100 around a roll axis and a yaw axis, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of a yaw axis, a pitch axis, and a roll axis.
  • the plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 may be installed on the nose of the UAV 10, that is, on the front side.
  • the other two camera devices 60 may be disposed on the bottom surface of the UAV 10.
  • the two image pickup devices 60 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom surface side may be paired to function as a stereo camera.
  • the imaging device 60 can measure the presence of an object included in the imaging range of the imaging device 60 and the distance to the object.
  • the imaging device 60 is an example of a measurement device for measuring an object existing in the imaging direction of the imaging device 100.
  • the measurement device may be another sensor, such as an infrared sensor or an ultrasonic sensor for measuring an object existing in the imaging direction of the imaging device 100.
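For the paired sensing cameras described above, distance to an object can be recovered from stereo disparity with the standard relation Z = f·B/d (f: focal length in pixels, B: baseline, d: disparity in pixels). A minimal sketch, ours and with invented calibration numbers rather than anything from the publication:

```python
# Illustrative stereo ranging sketch; focal length, baseline, and disparity
# values below are invented for demonstration.

def stereo_distance(focal_px, baseline_m, disparity_px):
    """Distance in meters to an object observed with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# e.g. a 700 px focal length, 10 cm baseline, and 35 px disparity:
print(stereo_distance(700.0, 0.1, 35.0))  # 2.0 (meters)
```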
  • the three-dimensional space data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60.
  • the number of the imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 may include at least one imaging device 60.
  • the UAV 10 may also include at least one camera device 60 on the nose, tail, side, bottom, and top surfaces of the UAV 10.
  • the angle of view settable in the imaging device 60 may be greater than the angle of view settable in the imaging device 100.
  • the imaging device 60 may include a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can perform wireless communication with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10 such as ascent, descent, acceleration, deceleration, forward, backward, and rotation.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the instruction information may indicate the height at which the UAV 10 should be located.
  • the UAV 10 moves to a height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascent instruction for causing the UAV 10 to ascend. The UAV 10 ascends while it is receiving the ascent instruction. When the altitude of the UAV 10 has reached its upper limit, the UAV 10 may limit its ascent even if it receives an ascent instruction.
  • FIG. 2 shows an example of the functional blocks of the UAV 10.
  • the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 may receive instruction information including various instructions to the UAV control section 30 from the remote operation device 300.
  • the memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the memory 32 may be a computer-readable recording medium, and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 32 may be provided inside the UAV body 20, or may be provided so as to be detachable from the UAV body 20.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with a program stored in the memory 32.
  • the UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, and a microcontroller such as an MCU.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the advancing unit 40 advances the UAV 10.
  • the propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors through a plurality of drive motors in accordance with a command from the UAV control unit 30 to fly the UAV 10.
  • the UAV control section 30 is an example of a third control section.
  • the GPS receiver 41 receives a plurality of signals indicating the time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received multiple signals.
  • the IMU 42 detects the attitude of the UAV 10.
  • the IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 43 detects the orientation of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying altitude of the UAV 10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude.
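A barometric altimeter of this kind typically converts measured pressure to altitude with the international barometric formula. The sketch below is our illustration of that conversion, not code from the publication:

```python
# Pressure-to-altitude conversion via the international barometric formula
# (valid as an approximation in the lower troposphere).

def pressure_to_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude in meters from air pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(pressure_to_altitude(1013.25))  # standard sea-level pressure -> 0.0
print(pressure_to_altitude(900.0))    # roughly 1 km up
```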
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the imaging device 100 includes an imaging section 102 and a lens section 200.
  • the lens unit 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be composed of a CCD or a CMOS.
  • the image sensor 120 captures an optical image formed through the plurality of lenses 210, and outputs the captured image data to the imaging control section 110.
  • the imaging control unit 110 may be composed of a microprocessor such as a CPU or an MPU, and a microcontroller such as an MCU.
  • the imaging control unit 110 may control the imaging device 100 in accordance with an operation instruction for the imaging device 100 from the UAV control unit 30.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be disposed inside a casing of the imaging device 100.
  • the memory 130 may be provided so as to be detachable from a casing of the imaging apparatus 100.
  • the lens unit 200 includes a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220.
  • the plurality of lenses 210 may function as a focusing lens.
  • the lens unit 200 may be a single-focus lens. Some or all of the plurality of lenses 210 are configured to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens provided to be removable from the imaging unit 102.
  • the lens driving unit 212 moves at least a part or all of the plurality of lenses 210 along an optical axis via a mechanism member such as a cam ring.
  • the lens driving section 212 may include an actuator.
  • the actuator may include a stepper motor.
  • the lens control section 220 drives the lens driving section 212 in accordance with a lens control instruction from the imaging section 102 to move one or more lenses 210 along the optical axis direction via a mechanism member.
  • the lens control instruction is, for example, a focus control instruction.
  • the lens unit 200 further includes a memory 222 and a position sensor 214.
  • the lens control unit 220 controls the movement of the lens 210 in the optical axis direction via the lens driving unit 212 in accordance with a lens operation instruction from the imaging unit 102.
  • a part or all of the lens 210 moves along the optical axis.
  • the lens control section 220 performs a focusing operation by moving at least one of the lenses 210 along the optical axis.
  • the position sensor 214 detects the position of the lens 210.
  • the position sensor 214 can detect a current focus position.
  • the lens driving section 212 may include a shake correction mechanism.
  • the lens control section 220 may perform the shake correction by moving the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via a shake correction mechanism.
  • the lens driving section 212 may drive a shake correction mechanism by a stepping motor to perform shake correction.
  • the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or in a direction perpendicular to the optical axis to perform shake correction.
  • the memory 222 stores control values of the plurality of lenses 210 that are moved via the lens driving unit 212.
  • the memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the imaging device 100 mounted on the UAV 10 described above captures a plurality of images while changing its focusing distance.
  • the focusing distance is a distance from the imaging device 100 to a subject in a focused state.
  • the in-focus state is, for example, a state in which the contrast evaluation value of a region including the subject of interest in an image captured by the imaging device 100 is equal to or greater than a predetermined value. The focusing distance is changed by changing the lens position of the focus lens.
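One common way to realize such a contrast evaluation value (our sketch; the publication does not specify the measure) is the mean squared difference between neighboring pixels of a grayscale region, which peaks when the region is in focus:

```python
# Illustrative contrast evaluation value: mean squared horizontal gradient.
# "region" is a 2D list of grayscale values.

def contrast_evaluation(region):
    """Return a sharpness score; higher means more fine detail (better focus)."""
    total, count = 0.0, 0
    for y in range(len(region)):
        for x in range(len(region[0]) - 1):
            d = region[y][x + 1] - region[y][x]
            total += d * d
            count += 1
    return total / count if count else 0.0

sharp  = [[0, 255, 0, 255], [255, 0, 255, 0]]      # high-frequency detail
blurry = [[120, 128, 132, 128], [125, 130, 128, 126]]
assert contrast_evaluation(sharp) > contrast_evaluation(blurry)
```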
  • among the lens systems that the imaging device 100 may include, there are lens systems in which the magnification of a subject imaged on the imaging surface changes due to a change in the focusing distance.
  • with such a lens system, the size of the subject included in each of the plurality of images may change. For example, when synthesizing a plurality of images captured at different focusing distances, an appropriate composite image may not be obtained if the size of the subject differs from image to image.
  • the distance between the subject and the imaging device 100 is changed by moving the UAV 10 relative to the subject in consideration of a change in magnification accompanying a change in the focus distance. Therefore, a change in the size of a subject included in each of a plurality of images captured at different focusing distances is suppressed.
  • the UAV control unit 30 includes a designation unit 111, an acquisition unit 112, a determination unit 113, an imaging instruction unit 114, and a synthesis unit 115.
  • devices other than the UAV control section 30 may include at least one of a designation section 111, an acquisition section 112, a determination section 113, an imaging instruction section 114, and a synthesis section 115.
  • the imaging control unit 110 or the remote operation device 300 may include at least one of a designation unit 111, an acquisition unit 112, a determination unit 113, an imaging instruction unit 114, and a synthesis unit 115.
  • the UAV control unit 30 specifies respective focusing distances for focusing on each of a plurality of subjects included in the imaging region of the imaging device 100.
  • the imaging instruction unit 114 changes the position of the focus lens of the imaging device 100 while the imaging device 100 is maintained at the first position, and causes the imaging device 100 to capture a plurality of images.
  • the UAV control unit 30 can maintain the imaging device 100 in the first position by hovering the UAV 10 in the first position.
  • the imaging instruction unit 114 moves the focus lens from the infinity side to the closest end side via the imaging control unit 110 and the lens control unit 220 while the UAV 10 hovers.
  • the imaging instruction unit 114 causes the imaging device 100 to capture a plurality of images while the focus lens is moving from the infinity side to the closest end side.
  • the specifying unit 111 specifies a plurality of focus distances for a plurality of subjects based on a plurality of images captured in different states of the focus lens.
  • the specifying section 111 acquires a contrast evaluation value of each of a plurality of images captured during the movement of the focus lens.
  • the specifying section 111 can acquire an evaluation value of the contrast of each of the plurality of regions constituting the image.
  • the specifying unit 111 specifies a focusing distance corresponding to the lens position of the focusing lens where the evaluation value of the contrast peaks. If there is an area where the evaluation value of the contrast reaches a peak in a plurality of areas in the image, the specifying unit 111 may specify a focus distance corresponding to the lens position of the focus lens when the image is captured.
  • the specifying unit 111 can specify the focusing distance by referring to a table in which the lens position of the focusing lens and the focusing distance are associated.
  • the designation section 111 can acquire a result of the contrast autofocus processing performed by the imaging control section 110. Based on the result, the specifying unit 111 may specify a focus distance corresponding to the lens position of the focus lens where the evaluation value of the contrast peaks.
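The peak-contrast search described above can be sketched as follows. The contrast metric, the region grid, and the lens-position-to-distance table are illustrative assumptions; the patent does not prescribe a particular metric.

```python
import numpy as np

def contrast_eval(region: np.ndarray) -> float:
    """Simple contrast metric: mean squared image gradient over the region."""
    gy, gx = np.gradient(region.astype(float))
    return float(np.mean(gx**2 + gy**2))

def specify_focus_distances(images, lens_positions, pos_to_distance, grid=(2, 2)):
    """For each region of the frame, find the lens position whose image has
    peak contrast and map it to a focus distance via a lookup table
    (analogous to the specifying unit 111)."""
    h, w = images[0].shape
    rows, cols = grid
    distances = {}
    for r in range(rows):
        for c in range(cols):
            region = (slice(r * h // rows, (r + 1) * h // rows),
                      slice(c * w // cols, (c + 1) * w // cols))
            scores = [contrast_eval(img[region]) for img in images]
            best = int(np.argmax(scores))  # lens position with peak contrast
            distances[(r, c)] = pos_to_distance[lens_positions[best]]
    return distances
```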
  • the specifying unit 111 may also include the imaging control unit 110.
  • the acquisition unit 112 acquires a plurality of focus distances corresponding to each of a plurality of subjects included in the imaging range of the imaging apparatus 100.
  • the acquisition unit 112 may acquire respective focus distances corresponding to the lens positions of the respective focus lenses whose contrast evaluation values specified by the designation unit 111 peak, as a plurality of focus distances corresponding to each of a plurality of subjects.
  • the determination unit 113 determines, based on the plurality of focusing distances, the distance between the imaging device 100 and the plurality of subjects when the imaging device 100 photographs each of the plurality of subjects. The determination section 113 may determine the distance between the imaging device 100 and the plurality of subjects so as to suppress a change in the size of a subject included in the plurality of images captured at the plurality of focusing distances. The determination section 113 may determine the distance between the imaging device 100 and the plurality of subjects so that the size of the same subject included in the plurality of images captured at the plurality of focusing distances does not change.
  • when the imaging device 100 includes a lens system whose magnification increases as the focus distance becomes longer, the determination unit 113 may determine a longer distance between the imaging device 100 and the plurality of subjects when shooting a subject farther away from the imaging device 100 among the plurality of subjects.
  • when the imaging device 100 includes a lens system whose magnification decreases as the focus distance becomes longer, the determination unit 113 may determine a shorter distance between the imaging device 100 and the plurality of subjects when shooting a subject farther away from the imaging device 100 among the plurality of subjects.
  • the determination unit 113 may reduce the difference between the distance between the imaging device 100 and the plurality of subjects when shooting a subject closer to the imaging device 100 among the plurality of subjects, and the distance between the plurality of subjects and the first position of the imaging device 100, i.e., its position when the focus distances were specified.
  • the determination unit 113 may determine the distance between the imaging device 100 and the plurality of subjects when shooting a subject closer to the imaging device 100 among the plurality of subjects to be the distance between the plurality of subjects and the first position of the imaging device 100 when the focus distances were specified.
  • the memory 130 or the memory 32 may store a table indicating the relationship between the focusing distance corresponding to the characteristics of the lens system included in the imaging device 100 and the distance to the subject when the imaging device 100 shoots.
  • the determination unit 113 may refer to a table to determine the distance between the imaging device 100 and the plurality of subjects when the imaging device 100 photographs each of the plurality of subjects.
  • the table may include, for example, a distance that should be determined by the determination section 113 for each focus distance.
  • the memory 32 or the memory 130 may store such a table for each focus distance of the subject at the closest distance.
  • the closest distance is the distance from the imaging device 100 to the closest subject.
  • for example, the determination unit 113 may refer to the table to determine the distances corresponding to the focus distances of the subjects other than the subject at the closest distance.
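A minimal sketch of such a table lookup, assuming a lens whose magnification grows with focus distance; the distance values simply mirror the 0.5 m / 0.55 m / 0.6 m example used later in the description and are not taken from any real lens.

```python
# Hypothetical table: for each focus distance, the distance the imaging
# device should keep to the closest subject so that subject size stays
# constant across captures (assumed values for illustration).
SHOOTING_DISTANCE_TABLE = {
    0.5: 0.50,           # focus at 0.5 m   -> shoot from 0.5 m
    1.0: 0.55,           # focus at 1.0 m   -> back off to 0.55 m
    float("inf"): 0.60,  # focus at infinity -> back off to 0.6 m
}

def determine_distance(focus_distance: float) -> float:
    """Look up the device-to-closest-subject distance for a focus distance,
    as the determination unit 113 might do with a stored table."""
    return SHOOTING_DISTANCE_TABLE[focus_distance]
```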
  • in a state where the distance between the imaging device 100 and the plurality of subjects is set to the distance determined by the determination section 113, the imaging instruction section 114 may cause the imaging device 100 to shoot each of the plurality of subjects at each of the plurality of focusing distances acquired by the acquisition section 112.
  • depending on the distance determined by the determination unit 113, the imaging instruction unit 114 may also finely adjust, within the range allowed by the magnification change, the focus distance at which the imaging device 100 shoots at that distance.
  • the imaging instruction section 114 is an example of a first control section and a second control section.
  • the synthesizing section 115 generates a synthetic image that synthesizes a plurality of images taken at each focus distance.
  • for example, the UAV control unit 30 derives the difference between the distance to the subject closest to the imaging device 100 at the time the focus distances were specified and the distance determined by the determination unit 113 for shooting each of the plurality of subjects.
  • when the imaging device 100 includes a lens system whose magnification increases as the focus distance becomes longer, the UAV control unit 30 may drive the propulsion unit 40 so that, when shooting each of the plurality of subjects, the imaging device 100 moves away from the plurality of subjects by the difference distance along the imaging direction of the imaging device 100.
  • when the imaging device 100 includes a lens system whose magnification decreases as the focus distance becomes longer, the UAV control unit 30 may drive the propulsion unit 40 so that, when shooting each of the plurality of subjects, the imaging device 100 approaches the plurality of subjects by the difference distance along the imaging direction of the imaging device 100.
  • FIG. 3 illustrates an example of a positional relationship between a plurality of subjects captured by the imaging apparatus 100 and the imaging apparatus 100.
  • the imaging range 500 of the imaging device 100 includes a plurality of subjects 501, 502, and 503 with different distances from the imaging device 100.
  • the subject 501 is a subject closest to the imaging device 100, for example, a flower.
  • the subject 503 is a subject at a position that is infinitely far away from the imaging device 100, such as a mountain.
  • the subject 502 is a subject located between the subject 501 and the subject 503, for example, a person.
  • at least two subjects may be included in the imaging range 500 of the imaging device 100.
  • the imaging range 500 of the imaging device 100 may include an infinity subject and at least one subject closer to the imaging device 100 than the infinity subject.
  • FIG. 4 is an example of an image 600 including a plurality of subjects in a positional relationship shown in FIG. 3, which is captured by the imaging device 100.
  • the image 600 includes the subjects 501, 502, and 503 at different distances from the imaging device 100 in the first region 611 at the lower left, the second region 612 at the center, and the third region 613 at the upper right, respectively, which are the regions where the focus distances are derived.
  • the magnifications of the image focused on the first region 611, the image focused on the second region 612, and the image focused on the third region 613 are different.
  • when the imaging device 100 includes a lens system whose magnification increases as the focus distance becomes longer, the magnification of the image focused on the third region 613 is greater than the magnification of the image focused on the first region 611.
  • when the imaging device 100 includes a lens system whose magnification decreases as the focus distance becomes longer, the magnification of the image focused on the third region 613 is smaller than the magnification of the image focused on the first region 611.
  • therefore, in the imaging device 100 mounted on the UAV 10 according to this embodiment, the distance between the imaging device 100 and the plurality of subjects is changed according to the focus distances of the plurality of subjects.
  • FIG. 5 illustrates an example of a correspondence relationship between a temporal change in the focusing distance of the imaging device 100 and a temporal change in the distance from the closest subject to the imaging device 100.
  • FIG. 5 shows an example in which the imaging device 100 includes a lens system whose magnification increases as the focus distance becomes longer.
  • the focus distance is changed from the infinity side to the closest end side from time t0 to time t1, so that the imaging device 100 captures multiple images.
  • the imaging device 100 can derive an evaluation value of the contrast of each of the plurality of images.
  • FIG. 6 shows the magnitude of the evaluation value of the contrast for each focusing distance.
  • the peak value of the evaluation value of the contrast appears when the focusing distance is infinity, 1.0 m, and 0.5 m. That is, there are subjects at infinity, 1.0 m, and 0.5 m from the imaging device 100.
  • in order to cause the imaging device 100 to shoot with the subject at infinity in focus, the UAV 10 first moves to a position farther from the subjects than the first position. For example, during the period from time t1 to time t2, the UAV 10 moves in the imaging direction of the imaging device 100 so that the distance to the closest subject becomes 0.6 m. During this period, the imaging device 100 adjusts the lens position of the focus lens so that the focus distance is infinity.
  • UAV 10 hovers to maintain the distance to the closest subject at 0.6m.
  • the imaging device 100 maintains the focus distance at infinity to capture a plurality of images.
  • the imaging device 100 can maintain a focal distance at infinity to capture a moving image.
  • the UAV 10 moves in the imaging direction of the imaging device 100 so that the distance to the closest subject is 0.55m.
  • the imaging device 100 adjusts the lens position of the focusing lens so that the focusing distance is 1.0 m.
  • UAV 10 is hovered to keep the distance to the closest subject at 0.55m.
  • the imaging device 100 maintains a focus distance of 1.0 m to capture a plurality of images.
  • the imaging device 100 can maintain a focus distance of 1.0 m to capture a moving image.
  • the UAV 10 moves in the imaging direction of the imaging device 100 to restore the distance to the closest subject to 0.5 m.
  • the imaging device 100 adjusts the lens position of the focusing lens so that the focusing distance is 0.5 m.
  • the UAV 10 is hovered so that the distance to the closest subject is maintained at 0.5 m.
  • the imaging device 100 maintains a focus distance of 0.5 m to capture a plurality of images.
  • the imaging device 100 can maintain a focal distance of 0.5 m to capture a moving image.
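The move-hover-focus-capture loop in this timeline can be expressed as a small driver. The callbacks stand in for the UAV control unit, the lens control unit, and the image sensor; they are placeholders, not the actual interfaces of the units in FIG. 2.

```python
from typing import Callable

def capture_sequence(plan,
                     move_to: Callable[[float], None],
                     set_focus: Callable[[float], None],
                     capture: Callable[[], object]):
    """Shoot at each (focus_distance, subject_distance) pair in order:
    reposition and hover, set the focus distance, then capture.
    `plan` is an iterable of (focus_distance, subject_distance) tuples."""
    images = {}
    for focus_distance, subject_distance in plan:
        move_to(subject_distance)   # UAV repositions and hovers
        set_focus(focus_distance)   # focus lens moved to target distance
        images[focus_distance] = capture()
    return images
```

Driving it with the 0.6 m / 0.55 m / 0.5 m plan from the timeline reproduces the far-to-near order of operations.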
  • FIG. 7 illustrates an example when the imaging apparatus 100 includes a lens system with a larger magnification as the focusing distance is longer.
  • the imaging device 100 changes the position of the focus lens to capture a plurality of images.
  • the acquisition unit 112 acquires an in-focus distance corresponding to the subject 501, the subject 502, and the subject 503 based on a plurality of images. For example, the acquisition section 112 acquires 0.5 m, 1.0 m, and infinity as the focusing distances corresponding to the subject 501, the subject 502, and the subject 503.
  • the determination unit 113 determines the distance from the imaging device 100 to the subject 501 when shooting with the focus distance set to 0.5 m to be 0.5 m.
  • the determination unit 113 determines the distance from the imaging device 100 to the subject 501 when shooting with the focus distance set to 1.0 m to be 0.55 m.
  • the determination unit 113 determines the distance from the imaging device 100 to the subject 501 when shooting with the focus distance set to infinity to be 0.6 m.
  • first, the UAV 10 moves from the position 803 to the position 801, away from the subjects.
  • for example, the UAV 10 moves so that the distance from the imaging device 100 to the subject 501 changes from 0.5 m to 0.6 m.
  • the UAV 10 hovers at the position 801 so that the distance from the imaging device 100 to the subject 501 is maintained at 0.6 m.
  • the imaging device 100 sets the focus distance to infinity to capture a plurality of images.
  • the UAV 10 moves from the position 801 to the position 802 in the imaging direction 800 to approach the subject.
  • the UAV 10 moves so that the distance from the imaging device 100 to the subject 501 changes from 0.6 m to 0.55 m.
  • the UAV 10 is hovering at the position 802 so that the distance from the imaging device 100 to the subject 501 is maintained at 0.55m.
  • the imaging device 100 captures multiple images by setting the focus distance to 1.0 m.
  • the UAV 10 moves from the position 802 to the position 803 in the imaging direction 800 to approach the subject. For example, the UAV 10 moves to change the distance from the imaging device 100 to the subject 501 from 0.55 m to 0.5 m.
  • the UAV 10 is hovering at the position 803 so that the distance from the imaging device 100 to the subject 501 is maintained at 0.5 m.
  • the imaging device 100 sets a focus distance to 0.5 m to capture a plurality of images.
  • the imaging device 100 captures a plurality of images 601 so as to focus on the subject 503 at an infinite distance from the imaging device 100.
  • the imaging device 100 captures a plurality of images 602 so as to focus on a subject 502 having a distance of 1.0 m from the imaging device 100.
  • the imaging device 100 captures a plurality of images 603 so as to focus on the subject 501 at a distance of 0.5 m from the imaging device 100. The distance to the subjects is changed when shooting each image so as to cancel the magnification change accompanying the change in focus distance. Accordingly, the sizes of the subject 501, the subject 502, and the subject 503 included in the images 601, 602, and 603 are the same.
  • the combining unit 115 selects the image 601 having the highest evaluation value of the contrast of the subject 503 from the plurality of images 601.
  • the combining unit 115 selects the image 602 having the highest evaluation value of the contrast of the subject 502 from the plurality of images 602.
  • the combining unit 115 selects the image 603 having the highest evaluation value of the contrast of the subject 501 from the plurality of images 603.
  • the combining unit 115 combines the selected images 601, 602, and 603 to generate an image 610, which is a combined image.
  • the contrast of each of the plurality of subjects included in the generated composite image is high. That is, a composite image with a deeper depth of field can be obtained. Furthermore, the magnification of each subject included in the image does not change with the focus distance, so the relative sizes of the subjects included in the composite image do not appear unnatural.
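The per-region selection and merge performed by the combining unit can be approximated by per-pixel focus stacking, as sketched below. This is a crude stand-in for the synthesis step: the gradient-based sharpness measure is an assumption, and a real pipeline would also align the images and smooth the selection masks.

```python
import numpy as np

def depth_composite(images):
    """Per-pixel focus stacking: for each pixel, take the value from the
    input image whose local contrast (squared gradient) is highest there."""
    stack = np.stack([img.astype(float) for img in images])  # (N, H, W)
    gy = np.gradient(stack, axis=1)
    gx = np.gradient(stack, axis=2)
    sharpness = gx**2 + gy**2
    best = np.argmax(sharpness, axis=0)  # index of sharpest image per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```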
  • FIG. 9 is a flowchart showing an example of an imaging process of the imaging device 100 mounted on the UAV 10.
  • UAV 10 begins to fly (S100).
  • the UAV 10 moves to a first position, and at the first position, a plurality of desired subjects can be captured by the imaging device 100.
  • the user sets the imaging mode of the imaging device 100 to the depth synthesis mode via the remote operation device 300 (S102).
  • the UAV control unit 30 may automatically change the imaging mode of the imaging device 100 to a depth synthesis mode.
  • the imaging instruction section 114 instructs the imaging control section 110 to move the focus lens from the infinity side to the closest end side.
  • the imaging control unit 110 moves the focus lens from the infinity side to the closest end side via the lens control unit 220.
  • the imaging control unit 110 causes the imaging device 100 to capture a plurality of images while the focus lens is moving.
  • the imaging control unit 110 derives an evaluation value of the contrast of each of the plurality of images (S104).
  • the designation unit 111 designates a plurality of focus distances at which a peak of the contrast evaluation value is obtained based on the evaluation value of the contrast derived by the imaging control unit 110 (S106).
  • the determination section 113 determines the distance to the subject when the imaging device 100 shoots at each of a plurality of focusing distances (S108).
  • the UAV control unit 30 moves the UAV 10 so that the distance from the imaging device 100 to the subject becomes a determined distance. During the movement of the UAV 10, the imaging device 100 performs shooting at each focus distance (S110). The image data captured by the imaging device 100 is stored in the memory 32 (S112).
  • the combining unit 115 combines a plurality of images captured at each of a plurality of focusing distances, and generates a combined image (S114).
  • a composite image may be displayed on a display section included in an external device such as the remote operation device 300.
  • the display unit may surround and display a region including each subject in which the focus distance is specified in the composite image with a frame.
  • the composition unit 115 may generate a composite image by using, as a moving image, a region of each subject in which the focus distance is specified in the composite image.
  • the focus distance of the imaging device 100 is changed by changing the lens position of the focus lens.
  • the focus distance of the imaging device 100 may be changed by sequentially switching interchangeable lenses with different focal distances.
  • a plurality of imaging devices 100 including lens systems with different focal distances may be mounted on the UAV 10, and the focusing distance of the imaging device 100 may be changed by sequentially switching the plurality of imaging devices 100.
  • FIG. 10 illustrates an example of a computer 1200 that may fully or partially embody aspects of the present invention.
  • the program installed on the computer 1200 enables the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention or one or more “parts” of the device. Alternatively, the program can cause the computer 1200 to perform the operation or the one or more "parts".
  • This program enables the computer 1200 to execute a process or a stage of the process according to an embodiment of the present invention.
  • Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform specific operations associated with some or all of the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 also includes a communication interface 1222, an input / output unit, and they are connected to the host controller 1210 through an input / output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program and the like executed by the computer 1200 at the time of running, and / or a program that depends on the hardware of the computer 1200.
  • the program is provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network.
  • the program is installed in a RAM 1214 or a ROM 1230 which is also an example of a computer-readable recording medium, and is executed by the CPU 1212.
  • the information processing described in these programs is read by the computer 1200, and brings cooperation between the programs and the above-mentioned various types of hardware resources.
  • a specific device or method may thereby be constituted by realizing operations or processing of information according to the use of the computer 1200.
  • the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing according to the processing described in the communication program.
  • the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
  • the CPU 1212 can cause the RAM 1214 to read all or required portions of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • Various types of information such as various types of programs, data, tables, and databases can be stored in a recording medium and subjected to information processing.
  • the CPU 1212 can perform, on data read from the RAM 1214, various types of processing described throughout this disclosure and specified by the instruction sequences of programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the program or software module described above may be stored on the computer 1200 or a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the computer 1200 through the network.


Abstract

An imaging device in which the magnification of a subject imaged on the imaging surface changes when the focus distance changes captures a plurality of images while changing the focus distance, and a change in the size of the subject included in each of the plurality of images is suppressed. A determination device may include an acquisition unit that acquires a plurality of focus distances corresponding to each of a plurality of subjects included in the imaging range of the imaging device. The determination device may include a determination unit that determines, based on the plurality of focus distances, the distance between the imaging device and the plurality of subjects when the imaging device shoots each of the subjects.

Description

Determination device, imaging system, movable object, synthesis system, determination method, and program [Technical Field]
The present invention relates to a determination device, an imaging system, a movable object, a synthesis system, a determination method, and a program.
[Background Art]
Patent Document 1 discloses extracting, from a plurality of frame images included in moving-image data, a still image focused on a designated region. Patent Document 2 discloses selecting image data to be combined, based on the shift amount of the focal position and the resolution of the image data, from a plurality of image data captured while shifting the focal position by a predetermined amount, and combining the selected plurality of image data.
Patent Document 1: International Publication No. 2017/006538
Patent Document 2: Japanese Patent Application Publication No. 2015-231058
[Summary of the Invention]
[Technical Problem to be Solved by the Invention]
Depending on the characteristics of the lens system included in an imaging device, the magnification of a subject imaged on the imaging surface may change when the focus distance changes. When such an imaging device is used to capture a plurality of images while changing the focus distance, it is desirable to suppress a change in the size of the subject included in each of the plurality of images.
[Technical Means for Solving the Problem]
A determination device according to one aspect of the present invention may include an acquisition unit that acquires a plurality of focus distances corresponding to each of a plurality of subjects included in the imaging range of an imaging device. The determination device may include a determination unit that determines, based on the plurality of focus distances, the distance between the imaging device and the plurality of subjects when the imaging device shoots each of the plurality of subjects.
When the imaging device includes a lens system whose magnification increases as the focus distance becomes longer, the determination unit may determine a longer distance between the imaging device and the plurality of subjects when shooting a subject farther away from the imaging device among the plurality of subjects.
When the imaging device includes a lens system whose magnification decreases as the focus distance becomes longer, the determination unit may determine a shorter distance between the imaging device and the plurality of subjects when shooting a subject farther away from the imaging device among the plurality of subjects.
The determination device may include a first control unit that, in a state where the distance between the imaging device and the plurality of subjects is set to the distance determined by the determination unit, causes the imaging device to shoot each of the plurality of subjects at each of the plurality of focus distances acquired by the acquisition unit.
The determination device may include a second control unit that, while the imaging device is maintained at a first position, causes the imaging device to capture a plurality of images with the focus lens of the imaging device at different positions. The determination device may include a specifying unit that specifies the plurality of focus distances for the plurality of subjects based on the plurality of images.
The determination unit may reduce the difference between the distance between the imaging device and the plurality of subjects when shooting a subject closer to the imaging device among the plurality of subjects and the distance between the first position of the imaging device and the plurality of subjects.
The determination unit may determine the distance between the imaging device and the plurality of subjects when shooting the subject closest to the imaging device among the plurality of subjects to be the distance between the first position of the imaging device and the plurality of subjects.
An imaging system according to one aspect of the present invention may include the determination device described above. The imaging system may include an imaging device including a lens system with a focus lens.
The lens system may be a single-focus lens.
A movable object according to one aspect of the present invention may be a movable object that carries the imaging system described above and moves. The movable object may include a third control unit that controls the movement of the movable object so that the distance between the imaging device and the plurality of subjects becomes the distance determined by the determination unit.
A synthesis system according to one aspect of the present invention may include the determination device described above. The synthesis system may include a synthesis unit that combines the plurality of images obtained when the first control unit, in a state where the distance between the imaging device and the plurality of subjects is set to the distance determined by the determination unit, causes the imaging device to shoot each of the plurality of subjects at each of the plurality of focus distances acquired by the acquisition unit.
A determination method according to one aspect of the present invention may include a stage of acquiring a plurality of focus distances corresponding to each of a plurality of subjects included in the imaging range of an imaging device. The determination method may include a stage of determining, based on the plurality of focus distances, the distance between the imaging device and the plurality of subjects when the imaging device shoots each of the plurality of subjects.
A program according to one aspect of the present invention may be a program for causing a computer to function as the determination device described above.
According to one aspect of the present invention, when an imaging device in which the magnification of a subject imaged on the imaging surface changes when the focus distance changes is used to capture a plurality of images while changing the focus distance, a change in the size of the subject included in each of the plurality of images can be suppressed.
In addition, the above summary of the present invention does not enumerate all the necessary features of the present invention. Subcombinations of these feature groups may also constitute inventions.
[Brief Description of the Drawings]
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) and a remote operation device.
FIG. 2 is a diagram showing an example of functional blocks of the UAV.
FIG. 3 is a diagram showing an example of the positional relationship between a plurality of subjects and the imaging device.
FIG. 4 is an example of an image including the plurality of subjects in the positional relationship shown in FIG. 3.
FIG. 5 is a diagram showing an example of the correspondence between the temporal change in the focus distance of the imaging device and the temporal change in the distance from the closest subject to the imaging device.
FIG. 6 is a diagram showing an example of the relationship between the contrast evaluation value and the focus distance.
FIG. 7 is a diagram for explaining a case where the imaging device captures a plurality of images while the UAV moves.
FIG. 8 is a diagram for explaining a case where a composite image is generated from a plurality of images captured by the imaging device.
FIG. 9 is a flowchart showing an example of an imaging process of the imaging device mounted on the UAV.
FIG. 10 is a diagram showing an example of a hardware configuration.
[Detailed Description]
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are essential to the solution of the invention. It is apparent to those skilled in the art that various changes or improvements can be made to the following embodiments. It is apparent from the claims that forms incorporating such changes or improvements can also be included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner does not object to reproduction of these documents by anyone as they appear in the files or records of the Patent Office. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device having the role of performing the operation. Specific stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as flip-flops, registers, and memory elements such as field-programmable gate arrays (FPGAs) and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored therein constitutes a product including instructions that can be executed to create means for performing the operations specified in flowcharts or block diagrams. Examples of the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of the computer-readable medium may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages, including conventional procedural programming languages. Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. Computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a movable object. A movable object is a concept that includes a flying object moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. A flying object moving in the air is a concept that includes not only a UAV but also other aircraft, airships, helicopters, and the like moving in the air.
The UAV body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion unit. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the plurality of rotors. The UAV body 20 causes the UAV 10 to fly using, for example, four rotors. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 rotatably supports the imaging device 100 about the pitch axis using an actuator. The gimbal 50 further rotatably supports the imaging device 100 about each of the roll axis and the yaw axis using actuators. The gimbal 50 can change the attitude of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 10 in order to control the flight of the UAV 10. Two imaging devices 60 may be provided on the nose, i.e., the front, of the UAV 10, and two other imaging devices 60 may be provided on the bottom of the UAV 10. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. The imaging devices 60 can measure the presence of an object included in their imaging range and the distance to the object. An imaging device 60 is an example of a measurement device for measuring an object existing in the imaging direction of the imaging device 100. The measurement device may also be another sensor, such as an infrared sensor or an ultrasonic sensor, that measures an object existing in the imaging direction of the imaging device 100. Three-dimensional spatial data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it is sufficient that the UAV 10 includes at least one imaging device 60. The UAV 10 may also include at least one imaging device 60 on each of its nose, tail, sides, bottom, and top. The angle of view settable on the imaging devices 60 may be larger than the angle of view settable on the imaging device 100. The imaging devices 60 may have single-focus lenses or fisheye lenses.
The remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10. The remote operation device 300 may communicate wirelessly with the UAV 10. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascend command for raising the UAV 10. The UAV 10 ascends while receiving the ascend command. When the altitude of the UAV 10 has reached an upper-limit altitude, the UAV 10 may limit its ascent even if it receives the ascend command.
FIG. 2 shows an example of functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 may receive, from the remote operation device 300, instruction information including various commands for the UAV control unit 30. The memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 32 may be provided inside the UAV body 20. It may be provided so as to be removable from the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 according to the programs stored in the memory 32. The UAV control unit 30 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 according to commands received from the remote operation device 300 via the communication interface 36. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors. The propulsion unit 40 rotates the plurality of rotors via the plurality of drive motors according to commands from the UAV control unit 30 to cause the UAV 10 to fly. The UAV control unit 30 is an example of a third control unit.
The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10, based on the received signals. The IMU 42 detects the attitude of the UAV 10. The IMU 42 detects, as the attitude of the UAV 10, accelerations in the three axial directions of front-back, left-right, and up-down of the UAV 10, and angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the orientation of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be constituted by a CCD or a CMOS. The image sensor 120 captures an optical image formed via a plurality of lenses 210 and outputs the captured image data to the imaging control unit 110. The imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 may control the imaging device 100 according to operation commands for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, or may be provided so as to be removable from the housing of the imaging device 100.
The lens unit 200 includes a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220. The plurality of lenses 210 may function as a focus lens. The lens unit 200 may be a single-focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens driving unit 212 moves at least some or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring. The lens driving unit 212 may include an actuator. The actuator may include a stepping motor. The lens control unit 220 drives the lens driving unit 212 according to lens control commands from the imaging unit 102 to move one or more lenses 210 along the optical axis direction via the mechanism member. The lens control commands are, for example, focus control commands.
The lens unit 200 further includes a memory 222 and a position sensor 214. The lens control unit 220 controls the movement of the lenses 210 in the optical axis direction via the lens driving unit 212 according to lens operation commands from the imaging unit 102. Some or all of the lenses 210 move along the optical axis. The lens control unit 220 performs a focusing operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the position of the lenses 210. The position sensor 214 may detect the current focus position.
The lens driving unit 212 may include a shake correction mechanism. The lens control unit 220 may perform shake correction by moving the lenses 210 in a direction along the optical axis or in a direction perpendicular to the optical axis via the shake correction mechanism. The lens driving unit 212 may drive the shake correction mechanism with a stepping motor to perform shake correction. Alternatively, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or in a direction perpendicular to the optical axis to perform shake correction.
The memory 222 stores control values of the plurality of lenses 210 moved via the lens driving units 212. The memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
搭载于如上所述的UAV 10的摄像装置100中,在改变摄像装置100的对焦距离的同时,使摄像装置100拍摄多个图像。这里,对焦距离是从摄像装置100到对焦状态下的被摄体的距离。对焦状态是例如包括由摄像装置100所拍摄图像中受关注被摄体的区域的对比度评估值大于等于预定值的状态。通过改变聚焦镜头的镜头位置来改变对焦距离。
在摄像装置100所包括的镜头系统中,存在因对焦距离改变而使成像在摄像面的被摄体的倍率改变的镜头系统。在使用包括这种镜头系统的摄像装置100在改变对焦距离的同时拍摄多个图像的情况下,包含在多个图像的每一个中的被摄体的尺寸会改变。例如,在对在对焦距离不同的状态下拍摄的多个图 像进行合成时,若每个图像的被摄体的尺寸不同,可能无法获得适当的合成图像。
因此,根据本实施方式的摄像装置100,考虑到伴随着对焦距离改变的倍率改变,使UAV 10相对于被摄体移动来改变被摄体与摄像装置100之间的距离。由此,抑制包含在不同对焦距离下拍摄的多个图像的每一个中的被摄体的尺寸改变。
UAV控制部30包括指定部111、获取部112、确定部113、摄像指示部114以及合成部115。另外,除UAV控制部30以外的其他装置可以包括指定部111、获取部112、确定部113、摄像指示部114以及合成部115中的至少一个。例如,摄像控制部110或者远程操作装置300可以具有指定部111、获取部112、确定部113、摄像指示部114以及合成部115中的至少一个。
UAV控制部30指定与包含在摄像装置100的摄像区域中的多个被摄体的每一个对焦的各对焦距离。摄像指示部114为了指定对焦距离,在使摄像装置100维持在第一位置的状态下,改变摄像装置100的聚焦镜头的位置来使摄像装置100拍摄多个图像。
UAV控制部30可以通过使UAV 10悬停在第一位置,来将摄像装置100维持在第一位置。摄像指示部114,在UAV 10悬停期间,经由摄像控制部110及镜头控制部220使聚焦镜头从无限远侧移动到最近端侧。摄像指示部114,在聚焦镜头从无限远侧移动到最近端侧期间,使摄像装置100拍摄多个图像。
指定部111基于在聚焦镜头的位置不同的状态下拍摄的多个图像,指定针对多个被摄体的多个对焦距离。指定部111获取在聚焦镜头移动期间拍摄的多个图像的每一个的对比度评估值。指定部111可以获取构成图像的多个区域中 的每个区域的对比度的评估值。指定部111指定与对比度的评估值达到峰值的聚焦镜头的镜头位置对应的对焦距离。若图像内的多个区域中存在对比度的评估值达到峰值的区域,则指定部111可以指定与拍摄该图像时的聚焦镜头的镜头位置对应的对焦距离。
The designation unit 111 may designate the focus distance by, for example, referring to a table that associates lens positions of the focus lens with focus distances. The designation unit 111 may acquire the result of the contrast autofocus processing executed by the imaging control unit 110 and, based on that result, designate the focus distance corresponding to the lens position at which the contrast evaluation value peaks. The designation unit 111 may also include the imaging control unit 110.
The acquisition unit 112 acquires a plurality of focus distances, each corresponding to one of the plurality of subjects included in the imaging range of the imaging device 100. The acquisition unit 112 may acquire, as the plurality of focus distances corresponding to the plurality of subjects, the focus distances corresponding to the respective lens positions of the focus lens at which the contrast evaluation value peaks, as designated by the designation unit 111.
The determination unit 113 determines, based on the plurality of focus distances, the distance between the imaging device 100 and the plurality of subjects when the imaging device 100 captures each of the plurality of subjects. The determination unit 113 may determine the distance between the imaging device 100 and the plurality of subjects so as to suppress changes in the size of the subjects included in the plurality of images captured at the plurality of focus distances. The determination unit 113 may determine the distance between the imaging device 100 and the plurality of subjects so that the size of the same subject included in the plurality of images captured at the plurality of focus distances does not change.
When the imaging device 100 includes a lens system whose magnification increases as the focus distance increases, the determination unit 113 may determine a longer distance between the imaging device 100 and the plurality of subjects when capturing a subject farther from the imaging device 100 among the plurality of subjects. When the imaging device 100 includes a lens system whose magnification decreases as the focus distance increases, the determination unit 113 may determine a shorter distance between the imaging device 100 and the plurality of subjects when capturing a subject farther from the imaging device 100 among the plurality of subjects.
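One way to see why the distance moves in that direction is a first-order model in which the on-sensor subject size is proportional to magnification divided by subject distance; keeping the size constant relative to a baseline then fixes the new distance. The model, the function name, and the magnification values are assumptions for illustration only:

```python
def compensated_distance(d0, m0, m_new):
    """Distance that keeps the on-sensor subject size constant when the
    lens magnification changes from m0 to m_new, assuming
    size ~ magnification / distance."""
    return d0 * m_new / m0

# Lens whose magnification grows with focus distance: to hold subject
# size constant, shoot the far-focus frame from farther away.
d_near_focus = compensated_distance(0.5, 1.00, 1.00)  # focus 0.5 m
d_far_focus = compensated_distance(0.5, 1.00, 1.20)   # focus at infinity
print(d_near_focus, d_far_focus)  # 0.5 0.6
```

For a lens whose magnification shrinks with focus distance, `m_new < m0` and the same formula yields a shorter distance, matching the second case above.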
The determination unit 113 may reduce the difference between the distance between the imaging device 100 and the plurality of subjects when capturing a subject closer to the imaging device 100 among the plurality of subjects, and the distance between the plurality of subjects and the first position of the imaging device 100, i.e., its position at the time the focus distances were designated. The determination unit 113 may determine the distance between the imaging device 100 and the plurality of subjects when capturing a subject closer to the imaging device 100 among the plurality of subjects to be the distance between the plurality of subjects and the first position of the imaging device 100 at the time the focus distances were designated.
The memory 130 or the memory 32 may store a table representing the relationship, according to the characteristics of the lens system included in the imaging device 100, between the focus distance and the distance to the subject at the time the imaging device 100 captures. The determination unit 113 may refer to the table to determine the distance between the imaging device 100 and the plurality of subjects when the imaging device 100 captures each of the plurality of subjects. The table may include, for example, the distance to be determined by the determination unit 113 for each focus distance. The memory 32 or the memory 130 may store such a table of the closest subject for each focus distance. The closest distance is the distance from the imaging device 100 to the nearest subject. For example, the determination unit 113 may refer to the table to determine the distances corresponding to the focus distances of the subjects other than the closest subject.
The imaging instruction unit 114 may cause the imaging device 100 to capture each of the plurality of subjects at each of the plurality of focus distances acquired by the acquisition unit 112, in a state in which the distance between the imaging device 100 and the plurality of subjects is set to the distance determined by the determination unit 113. The imaging instruction unit 114 may also fine-tune, within a range in which the magnification change is acceptable, the focus distance used when the imaging device 100 captures at the distance determined by the determination unit 113. The imaging instruction unit 114 is an example of a first control unit and a second control unit. The compositing unit 115 generates a composite image in which the plurality of images captured at the respective focus distances are composited.
For example, the UAV control unit 30 derives the difference distance between the distance to the subject closest to the imaging device 100 at the time the focus distances were designated and the distance determined by the determination unit 113 for capturing each of the plurality of subjects. When the imaging device 100 includes a lens system whose magnification increases as the focus distance increases, the UAV control unit 30 may drive the propulsion unit 40 so that the imaging device 100 moves away from the plurality of subjects by the difference distance along the imaging direction of the imaging device 100 when capturing each of the plurality of subjects. When the imaging device 100 includes a lens system whose magnification decreases as the focus distance increases, the UAV control unit 30 may drive the propulsion unit 40 so that the imaging device 100 approaches the plurality of subjects by the difference distance along the imaging direction of the imaging device 100 when capturing each of the plurality of subjects.
Fig. 3 shows an example of the positional relationship between the imaging device 100 and a plurality of subjects captured by the imaging device 100. The imaging range 500 of the imaging device 100 includes a plurality of subjects 501, 502, and 503 at different distances from the imaging device 100. The subject 501 is the subject closest to the imaging device 100, for example a flower. The subject 503 is a subject at a position infinitely far from the imaging device 100, for example a mountain. The subject 502 is a subject located between the subject 501 and the subject 503, for example a person. It suffices that at least two subjects are included in the imaging range 500 of the imaging device 100. It suffices that the imaging range 500 of the imaging device 100 includes a subject at infinity and at least one subject closer to the imaging device 100 than the subject at infinity.
Fig. 4 is an example of an image 600, captured by the imaging device 100, that includes the plurality of subjects in the positional relationship shown in Fig. 3. In the image 600, the regions in which focus distances are derived, namely a first region 611 at the lower left, a second region 612 at the center, and a third region 613 at the upper right, respectively include the subjects 501, 502, and 503 at different distances from the imaging device 100.
For example, when the imaging device 100 captures from the same position, an image focused on the first region 611, an image focused on the second region 612, and an image focused on the third region 613 have different magnifications. When the imaging device 100 includes a lens system whose magnification increases as the focus distance increases, the magnification of the image focused on the third region 613 is greater than that of the image focused on the first region 611. When the imaging device 100 includes a lens system whose magnification decreases as the focus distance increases, the magnification of the image focused on the third region 613 is smaller than that of the image focused on the first region 611. Therefore, the imaging device 100 mounted on the UAV 10 according to the present embodiment changes the distance between the imaging device 100 and the plurality of subjects according to the focus distance of each of the plurality of subjects.
Fig. 5 shows an example of the correspondence between the temporal change of the focus distance of the imaging device 100 and the temporal change of the distance from the closest subject to the imaging device 100. Fig. 5 shows an example in which the imaging device 100 includes a lens system whose magnification increases as the focus distance increases.
While the UAV 10 hovers at the first position, from time t0 to time t1, the focus distance is changed from the infinity side to the closest side and the imaging device 100 is caused to capture a plurality of images. The imaging device 100 may derive a contrast evaluation value for each of the plurality of images. Fig. 6 shows the magnitude of the contrast evaluation value for each focus distance. In the example shown in Fig. 6, peaks of the contrast evaluation value appear at focus distances of infinity, 1.0 m, and 0.5 m. That is, subjects exist at infinity, 1.0 m, and 0.5 m from the imaging device 100.
In this case, to capture with the imaging device 100 focused on the subject at infinity, the UAV 10 first moves to a position farther from the subjects than the first position. For example, from time t1 to time t2, the UAV 10 moves along the imaging direction of the imaging device 100 so that the distance to the closest subject becomes 0.6 m. From time t1 to time t2, the imaging device 100 adjusts the lens position of the focus lens so that the focus distance becomes infinity.
Next, from time t2 to time t3, the UAV 10 hovers so that the distance to the closest subject is maintained at 0.6 m. From time t2 to time t3, the imaging device 100 captures a plurality of images with the focus distance kept at infinity. From time t2 to time t3, the imaging device 100 may capture a moving image with the focus distance kept at infinity.
From time t3 to time t4, the UAV 10 moves along the imaging direction of the imaging device 100 so that the distance to the closest subject becomes 0.55 m. From time t3 to time t4, the imaging device 100 adjusts the lens position of the focus lens so that the focus distance becomes 1.0 m.
Next, from time t4 to time t5, the UAV 10 hovers so that the distance to the closest subject is maintained at 0.55 m. From time t4 to time t5, the imaging device 100 captures a plurality of images with the focus distance kept at 1.0 m. From time t4 to time t5, the imaging device 100 may capture a moving image with the focus distance kept at 1.0 m.
From time t5 to time t6, the UAV 10 moves along the imaging direction of the imaging device 100 so that the distance to the closest subject returns to 0.5 m. From time t5 to time t6, the imaging device 100 adjusts the lens position of the focus lens so that the focus distance becomes 0.5 m.
Next, from time t6 to time t7, the UAV 10 hovers so that the distance to the closest subject is maintained at 0.5 m. From time t6 to time t7, the imaging device 100 captures a plurality of images with the focus distance kept at 0.5 m. From time t6 to time t7, the imaging device 100 may capture a moving image with the focus distance kept at 0.5 m.
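The alternation of "move and refocus" and "hover and shoot" phases in the t1–t7 timeline above can be sketched as a schedule generator; the plan values come from the Fig. 5 example, while the function name and tuple layout are illustrative assumptions:

```python
# One (focus distance, distance to the closest subject) pair per subject,
# ordered from farthest focus to nearest; float('inf') means infinity.
PLAN = [(float('inf'), 0.6), (1.0, 0.55), (0.5, 0.5)]

def shooting_schedule(plan):
    """Expand the plan into alternating move/shoot phases: fly so the
    closest subject sits at the planned distance, then hover and capture
    at the planned focus distance."""
    schedule = []
    for focus, distance in plan:
        schedule.append(("move", distance, focus))
        schedule.append(("shoot", distance, focus))
    return schedule

for phase in shooting_schedule(PLAN):
    print(phase)
```

Each "shoot" phase corresponds to one hovering interval (t2–t3, t4–t5, t6–t7) in Fig. 5.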
For example, as shown in Fig. 7, the imaging device 100 is caused to capture a plurality of images while the UAV 10 moves. Fig. 7 shows an example in which the imaging device 100 includes a lens system whose magnification increases as the focus distance increases. With the UAV 10 hovering at a position 803, the imaging device 100 captures a plurality of images while changing the position of the focus lens. Based on the plurality of images, the acquisition unit 112 acquires the focus distances corresponding to the subjects 501, 502, and 503. For example, the acquisition unit 112 acquires 0.5 m, 1.0 m, and infinity as the focus distances corresponding to the subjects 501, 502, and 503. The determination unit 113 determines the distance from the imaging device 100 to the subject 501 when capturing with the focus distance set to 0.5 m as 0.5 m. The determination unit 113 determines the distance from the imaging device 100 to the subject 501 when capturing with the focus distance set to 1.0 m as 0.55 m. The determination unit 113 determines the distance from the imaging device 100 to the subject 501 when capturing with the focus distance set to infinity as 0.6 m.
To capture with the imaging device 100 at the focus distance at which the subject 503 is in focus, the UAV 10 moves from the position 803 to a position 801. For example, the UAV 10 moves so that the distance from the imaging device 100 to the subject 501 changes from 0.5 m to 0.6 m. The UAV 10 hovers at the position 801 so that the distance from the imaging device 100 to the subject 501 is maintained at 0.6 m. The imaging device 100 captures a plurality of images with the focus distance set to infinity.
Next, the UAV 10 moves from the position 801 to a position 802 along the imaging direction 800 to approach the subjects. For example, the UAV 10 moves so that the distance from the imaging device 100 to the subject 501 changes from 0.6 m to 0.55 m. The UAV 10 hovers at the position 802 so that the distance from the imaging device 100 to the subject 501 is maintained at 0.55 m. The imaging device 100 captures a plurality of images with the focus distance set to 1.0 m.
Further, the UAV 10 moves from the position 802 to the position 803 along the imaging direction 800 to approach the subjects. For example, the UAV 10 moves so that the distance from the imaging device 100 to the subject 501 changes from 0.55 m to 0.5 m. The UAV 10 hovers at the position 803 so that the distance from the imaging device 100 to the subject 501 is maintained at 0.5 m. The imaging device 100 captures a plurality of images with the focus distance set to 0.5 m.
For example, as shown in Fig. 8, the imaging device 100 captures a plurality of images 601 so as to focus on the subject 503, whose distance from the imaging device 100 is infinity. The imaging device 100 captures a plurality of images 602 so as to focus on the subject 502, whose distance from the imaging device 100 is 1.0 m. The imaging device 100 captures a plurality of images 603 so as to focus on the subject 501, whose distance from the imaging device 100 is 0.5 m. The distance to the subjects is changed for each capture so as to cancel the magnification change that accompanies the change of focus distance. As a result, the sizes of the subjects 501, 502, and 503 included in the images 601, 602, and 603 match.
The compositing unit 115 selects, from the plurality of images 601, the one image 601 with the highest contrast evaluation value for the subject 503. The compositing unit 115 selects, from the plurality of images 602, the one image 602 with the highest contrast evaluation value for the subject 502. The compositing unit 115 selects, from the plurality of images 603, the one image 603 with the highest contrast evaluation value for the subject 501. The compositing unit 115 composites the selected images 601, 602, and 603 to generate an image 610, which is the composite image.
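The per-burst selection step above can be sketched as picking, from each burst, the frame whose designated subject region scores the highest contrast; the toy image representation (a dict of region scores) and all names are illustrative assumptions:

```python
def best_image(stack, region, score):
    """Pick, from one burst of images, the frame whose designated
    region has the highest contrast evaluation value."""
    return max(stack, key=lambda img: score(img, region))

def select_for_composite(stacks, regions, score):
    """One burst and one region per subject; return the per-subject
    winners to be merged into the deep-depth-of-field composite."""
    return [best_image(stack, region, score)
            for stack, region in zip(stacks, regions)]

# Toy stand-in: each 'image' is a dict mapping region name -> contrast score.
score = lambda img, region: img[region]
stacks = [
    [{"far": 0.2, "near": 0.9}, {"far": 0.8, "near": 0.1}],  # focus = infinity
    [{"far": 0.3, "near": 0.4}, {"far": 0.1, "near": 0.7}],  # focus = 0.5 m
]
winners = select_for_composite(stacks, ["far", "near"], score)
print(winners[0]["far"], winners[1]["near"])  # 0.8 0.7
```

A real implementation would score pixel regions (e.g. with the contrast measure sketched earlier) and blend the winning frames region by region.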
As a result, each of the plurality of subjects included in the generated composite image has high contrast. That is, a composite image with a deep depth of field can be obtained. Moreover, the magnification of each subject included in the images does not change with the focus distance. Therefore, the relative sizes of the subjects included in the composite image do not look unnatural.
Fig. 9 is a flowchart showing an example of the imaging procedure of the imaging device 100 mounted on the UAV 10.
The UAV 10 starts flying (S100). The UAV 10 moves to a first position at which the desired plurality of subjects can be captured by the imaging device 100. The user sets the imaging mode of the imaging device 100 to a depth compositing mode via the remote operation device 300 (S102). When the UAV 10 reaches the predetermined first position, the UAV control unit 30 may automatically change the imaging mode of the imaging device 100 to the depth compositing mode.
The UAV 10 hovers at the first position. The imaging instruction unit 114 instructs the imaging control unit 110 to move the focus lens from the infinity side to the closest side. The imaging control unit 110 moves the focus lens from the infinity side to the closest side via the lens control unit 220. While the focus lens moves, the imaging control unit 110 causes the imaging device 100 to capture a plurality of images. The imaging control unit 110 derives a contrast evaluation value for each of the plurality of images (S104).
Based on the contrast evaluation values derived by the imaging control unit 110, the designation unit 111 designates the plurality of focus distances at which the contrast evaluation value peaks (S106). The determination unit 113 determines the distance to the subjects when the imaging device 100 captures at each of the plurality of focus distances (S108).
The UAV control unit 30 moves the UAV 10 so that the distance from the imaging device 100 to the subjects becomes the determined distance. While the UAV 10 moves, the imaging device 100 captures at each of the focus distances (S110). The image data captured by the imaging device 100 is stored in the memory 32 (S112).
The compositing unit 115 composites the plurality of images captured at each of the plurality of focus distances and generates a composite image (S114).
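The S104–S110 portion of the flow above can be tied together in miniature: find contrast peaks, map them to focus distances, and look up the capture distance for each. All names, the index-to-focus mapping, and the table values are illustrative assumptions:

```python
def depth_composite_jobs(contrast_sweep, focus_of_index, distance_table):
    """S106-S110 in miniature: pick contrast peaks from the lens sweep,
    map each peak to a focus distance, and pair it with the capture
    distance from the lens system's table."""
    peaks = [i for i in range(1, len(contrast_sweep) - 1)
             if contrast_sweep[i] > contrast_sweep[i - 1]
             and contrast_sweep[i] > contrast_sweep[i + 1]]   # S106
    focus_distances = [focus_of_index[i] for i in peaks]
    return [(distance_table[f], f) for f in focus_distances]  # S108

sweep = [0.1, 0.9, 0.2, 0.8, 0.1]                 # contrast per sampled lens position
focus_of_index = {1: float('inf'), 3: 0.5}        # lens position -> focus distance
table = {float('inf'): 0.6, 0.5: 0.5}             # focus distance -> capture distance
print(depth_composite_jobs(sweep, focus_of_index, table))
# [(0.6, inf), (0.5, 0.5)]
```

Each returned (distance, focus) pair corresponds to one move-hover-shoot cycle of S110, whose frames are then composited in S114.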
For example, the composite image may be displayed on a display unit included in an external device such as the remote operation device 300. The display unit may display, each surrounded by a frame, the regions of the composite image that include the subjects for which focus distances were designated. The compositing unit 115 may also generate the composite image with the region of each subject for which a focus distance was designated rendered as a moving image.
In the embodiment described above, an example was described in which the focus distance of the imaging device 100 is changed by changing the lens position of the focus lens. However, the focus distance of the imaging device 100 may also be changed by, for example, sequentially switching interchangeable lenses with different focal lengths. Furthermore, a plurality of imaging devices 100 including lens systems with different focal lengths may be mounted on the UAV 10, and the focus distance may be changed by sequentially switching among the plurality of imaging devices 100.
Fig. 10 shows an example of a computer 1200 in which aspects of the present invention may be wholly or partly embodied. A program installed on the computer 1200 can cause the computer 1200 to function as operations associated with a device according to the embodiments of the present invention, or as one or more "units" of that device. Alternatively, the program can cause the computer 1200 to execute those operations or the one or more "units". The program can cause the computer 1200 to execute the processes according to the embodiments of the present invention or stages of those processes. Such a program may be executed by a CPU 1212 to cause the computer 1200 to execute specific operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
The computer 1200 according to the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 further includes a ROM 1230. The CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program and the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. Programs are provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. Programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or a method may be constituted by realizing operations or processing of information according to the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
Furthermore, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as programs, data, tables, and databases may be stored in recording media and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may perform various types of processing described throughout this disclosure and specified by the instruction sequences of programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. In addition, the CPU 1212 may retrieve information in files, databases, and the like in the recording media. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in a recording medium, the CPU 1212 may retrieve from the plurality of entries an entry matching a condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet may be used as a computer-readable storage medium, whereby the programs are provided to the computer 1200 through the network.
It should be noted that the order of execution of operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the description, and the drawings may be implemented in any order, as long as "before", "prior to", or the like is not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even if the operation flows in the claims, the description, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in that order.
While the present invention has been described above using the embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes or improvements can be made to the above embodiments. As is clear from the claims, embodiments with such changes or improvements are also included in the technical scope of the present invention.
[Reference Numerals]
10 UAV
20 UAV body
30 UAV control unit
32 memory
36 communication interface
40 propulsion unit
41 GPS receiver
42 inertial measurement unit
43 magnetic compass
44 barometric altimeter
45 temperature sensor
46 humidity sensor
50 gimbal
60 imaging device
100 imaging device
102 imaging unit
110 imaging control unit
111 designation unit
112 acquisition unit
113 determination unit
114 imaging instruction unit
115 compositing unit
120 image sensor
130 memory
200 lens unit
210 lens
212 lens drive unit
214 position sensor
220 lens control unit
222 memory
300 remote operation device
500 imaging range
501, 502, 503 subject
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (13)

  1. A determination device, comprising: an acquisition unit that acquires a plurality of focus distances, each corresponding to one of a plurality of subjects included in an imaging range of an imaging device; and
    a determination unit that determines, based on the plurality of focus distances, a distance between the imaging device and the plurality of subjects when the imaging device captures the plurality of subjects.
  2. The determination device according to claim 1, wherein, when the imaging device includes a lens system whose magnification increases as the focus distance increases,
    the determination unit determines a longer distance between the imaging device and the plurality of subjects when capturing a subject farther from the imaging device among the plurality of subjects.
  3. The determination device according to claim 1, wherein, when the imaging device includes a lens system whose magnification decreases as the focus distance increases,
    the determination unit determines a shorter distance between the imaging device and the plurality of subjects when capturing a subject farther from the imaging device among the plurality of subjects.
  4. The determination device according to claim 1, further comprising: a first control unit that causes the imaging device to capture each of the plurality of subjects at each of the plurality of focus distances acquired by the acquisition unit, in a state in which the distance between the imaging device and the plurality of subjects is set to the distance determined by the determination unit.
  5. The determination device according to claim 1, further comprising: a second control unit that causes the imaging device to capture a plurality of images with the position of a focus lens of the imaging device varied, while the imaging device is maintained at a first position; and
    a designation unit that designates the plurality of focus distances for the plurality of subjects based on the plurality of images.
  6. The determination device according to claim 5, wherein the determination unit reduces the difference between the distance between the imaging device and the plurality of subjects when capturing a subject closer to the imaging device among the plurality of subjects, and the distance between the first position of the imaging device and the plurality of subjects.
  7. The determination device according to claim 6, wherein the determination unit determines the distance between the imaging device and the plurality of subjects when capturing the subject closest to the imaging device among the plurality of subjects to be the distance between the first position of the imaging device and the plurality of subjects.
  8. An imaging system, comprising: the determination device according to any one of claims 1 to 7; and
    an imaging device including a lens system including a focus lens.
  9. The imaging system according to claim 8, wherein the lens system is a fixed-focal-length lens.
  10. A movable body that moves with the imaging system according to claim 9 mounted thereon, comprising:
    a third control unit that controls movement of the movable body so that the distance between the imaging device and the plurality of subjects becomes the distance determined by the determination unit.
  11. A compositing system, comprising: the determination device according to claim 4; and
    a compositing unit that composites a plurality of images obtained by the first control unit causing the imaging device to capture each of the plurality of subjects at each of the plurality of focus distances acquired by the acquisition unit, in a state in which the distance between the imaging device and the plurality of subjects is set to the distance determined by the determination unit.
  12. A determination method, comprising: a stage of acquiring a plurality of focus distances, each corresponding to one of a plurality of subjects included in an imaging range of an imaging device; and
    a stage of determining, based on the plurality of focus distances, a distance between the imaging device and the plurality of subjects when the imaging device captures each of the plurality of subjects.
  13. A program for causing a computer to function as the determination device according to any one of claims 1 to 7.
PCT/CN2019/091742 2018-06-19 2019-06-18 Determination device, photographing system, movable body, composite system, determination method, and program WO2019242616A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980005209.8A CN111264055A (zh) 2018-06-19 2019-06-18 Determination device, photographing system, movable body, composite system, determination method, and program
US17/122,948 US20210105411A1 (en) 2018-06-19 2020-12-15 Determination device, photographing system, movable body, composite system, determination method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018116418A JP6790318B2 (ja) 2018-06-19 2018-06-19 Unmanned aerial vehicle, control method, and program
JP2018-116418 2018-06-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/122,948 Continuation US20210105411A1 (en) 2018-06-19 2020-12-15 Determination device, photographing system, movable body, composite system, determination method, and program

Publications (1)

Publication Number Publication Date
WO2019242616A1 true WO2019242616A1 (zh) 2019-12-26

Family

ID=68982772

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/091742 WO2019242616A1 (zh) 2018-06-19 2019-06-18 确定装置、摄像系统、移动体、合成系统、确定方法以及程序

Country Status (4)

Country Link
US (1) US20210105411A1 (zh)
JP (1) JP6790318B2 (zh)
CN (1) CN111264055A (zh)
WO (1) WO2019242616A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7336330B2 (ja) * 2019-09-24 2023-08-31 キヤノン株式会社 Control device, imaging device, control method, and program
US20210231801A1 (en) * 2020-01-28 2021-07-29 ProTek Technologies, Inc. Monitoring device for use with an alert management process

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011247979A (ja) * 2010-05-25 2011-12-08 Fujifilm Corp Imaging device and imaging distance acquisition method
CN103533232A (zh) * 2012-07-05 2014-01-22 卡西欧计算机株式会社 Image processing device and image processing method
CN104243828A (zh) * 2014-09-24 2014-12-24 宇龙计算机通信科技(深圳)有限公司 Method, device, and terminal for taking photographs
CN106303192A (zh) * 2015-05-25 2017-01-04 小米科技有限责任公司 Terminal control method and terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0348212A (ja) * 1989-07-17 1991-03-01 Fuji Photo Film Co Ltd Zoom lens device
CN104613930B (zh) * 2015-01-04 2017-05-17 宇龙计算机通信科技(深圳)有限公司 Distance measuring method, device, and mobile terminal
CN104869316B (zh) * 2015-05-29 2018-07-03 北京京东尚科信息技术有限公司 Multi-target imaging method and device
JP6758828B2 (ja) * 2015-12-15 2020-09-23 キヤノン株式会社 Imaging system and control method therefor

Also Published As

Publication number Publication date
JP6790318B2 (ja) 2020-11-25
JP2019220834A (ja) 2019-12-26
US20210105411A1 (en) 2021-04-08
CN111264055A (zh) 2020-06-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19822716; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19822716; Country of ref document: EP; Kind code of ref document: A1)