WO2019061887A1 - Control device, imaging device, flying body, control method, and program - Google Patents

Control device, imaging device, flying body, control method, and program

Info

Publication number: WO2019061887A1
Authority: WO (WIPO/PCT)
Prior art keywords: change, amount, lens, zoom, distance
Application number: PCT/CN2017/117981
Other languages: English (en), French (fr)
Inventors: 永山佳范, 本庄谦一
Original assignee: 深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201780065011.XA (CN109844634B)
Publication of WO2019061887A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B7/04Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B7/09Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted for automatic focusing or varying magnification
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18Focusing aids
    • G03B13/20Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
    • G03B13/22Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera coupling providing for compensation upon change of camera lens
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the invention relates to a control device, an imaging device, a flying body, a control method and a program.
  • Patent Document 1 discloses a lens apparatus that performs zoom tracking control for moving a focus lens in order to correct a focus shift accompanying movement of a zoom lens.
  • Patent Document 1 Japanese Patent Laid-Open No. 2016-224096
  • zoom tracking control is based on the premise that the distance from the imaging device to the subject is fixed. However, this distance is not always fixed, and sometimes the accuracy of zoom tracking is reduced.
  • a control device controls a lens device including a zoom lens and a focus lens.
  • the control device may be provided with a first determining portion that, when the zoom position indicating the position of the zoom lens changes, determines a first amount of change in a focus position, which indicates the position of the focus lens, for maintaining the in-focus state of a subject whose distance from the lens device does not change.
  • the control device may be provided with a second determining portion that determines a second amount of change in the focus position for maintaining the in-focus state of a subject whose distance from the lens device changes.
  • the control device may be provided with a control unit that controls the focus position based on the first amount of change and the second amount of change.
  • the first determining portion may determine a first amount of change when the zoom lens is moved from the first zoom position to the second zoom position according to an instruction for moving the zoom lens from the first zoom position to the second zoom position. According to the instruction, the second determining portion may determine the second amount of change based on the plurality of images captured before the zoom lens is moved to the second zoom position.
  • the second determining portion may determine the second amount of change based on the plurality of images including the first image captured when the zoom lens is at the first zoom position and the second image captured before the first image.
  • the second determining portion may determine a first distance representing a distance from the lens device to the object based on the first image, determine a second distance representing a distance from the lens device to the object based on the second image, and based on the first distance And a second distance to determine the second amount of change.
  • the second determining portion may determine the second amount of change based on a difference between the first distance and the second distance.
  • the first determining portion may determine the first amount of change based on information indicating a relationship between the zoom position of the zoom lens and the focus position, which indicates the position of the focus lens for maintaining the in-focus state of a subject whose distance from the lens device does not change.
  • when the lens device does not satisfy a predetermined imaging condition, the control portion may control the focus position based on the first amount of change and the second amount of change.
  • when the lens device satisfies the predetermined imaging condition, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
  • when the predetermined imaging condition that the distance from the lens device to the ground is equal to or greater than a predetermined distance is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
  • when the predetermined imaging condition that the distance from the lens device to the ground is equal to or greater than a predetermined distance and the imaging direction of the lens device includes a component in a predetermined direction is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
  • the predetermined direction can be a vertical direction.
  • when the predetermined imaging condition that the distance from the lens device to the subject is equal to or greater than a predetermined distance is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
  • when the predetermined imaging condition that the lens device operates in a predetermined shooting mode is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
  • the lens device can be mounted on a flying body.
  • when the predetermined imaging condition that the flying body is in flight is not satisfied, the control portion may control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the control portion may control the focus position based on the first amount of change without being based on the second amount of change.
  • An imaging device includes the control device and a lens device.
  • a flying body includes the above imaging device and flies with it.
  • the flying body may be provided with a support mechanism that rotatably supports the imaging device.
  • a control method controls a lens device including a zoom lens and a focus lens.
  • the control method may include a stage of determining, when the zoom position indicating the position of the zoom lens changes, a first amount of change in the focus position, which indicates the position of the focus lens, for maintaining the in-focus state of a subject whose distance from the lens device does not change.
  • the control method may include a stage of determining a second amount of change in the focus position for maintaining the in-focus state of a subject whose distance from the lens device changes.
  • the control method may include a stage of controlling the focus position based on the first amount of change and the second amount of change.
  • a program according to an aspect of the present invention is a program for causing a computer to control a lens device including a zoom lens and a focus lens.
  • the program may cause the computer to execute a stage of determining, when the zoom position indicating the position of the zoom lens changes, a first amount of change in the focus position, which indicates the position of the focus lens, for maintaining the in-focus state of a subject whose distance from the lens device does not change.
  • the program may cause the computer to execute a stage of determining a second amount of change in the focus position for maintaining the in-focus state of a subject whose distance from the lens device changes.
  • the program may cause the computer to execute a stage of controlling the focus position based on the first amount of change and the second amount of change.
  • according to one aspect of the present invention, a reduction in the accuracy of the zoom tracking can be suppressed when the distance from the lens device to the subject changes.
  • FIG. 1 is a diagram showing an example of an appearance of an unmanned aerial vehicle and a remote operation device.
  • FIG. 2 is a diagram showing one example of functional blocks of an unmanned aerial vehicle.
  • FIG. 3 is a diagram showing one example of a zoom tracking curve.
  • FIG. 4 is a flow chart showing one example of a zoom control sequence of an image pickup apparatus.
  • FIG. 5 is a flowchart showing one example of a derivation order of the second variation amount of the focus position based on the change in the distance to the subject.
  • FIG. 6 is a view for explaining a subject distance, a focal length, and the like in an optical system of a lens unit.
  • FIG. 7 is a view for explaining a subject distance, a focal length, and the like in an optical system of a lens unit.
  • FIG. 8 is a flowchart showing one example of a derivation order of a first change amount of a focus position based on a change in a zoom position.
  • FIG. 9 is a diagram for explaining a determination procedure of a focus position based on a zoom tracking curve.
  • FIG. 10 is a diagram for explaining a change in a focus position caused by zoom tracking control.
  • FIG. 11 is a diagram for explaining an example of a hardware configuration.
  • the various embodiments of the present invention can be described with reference to the flowcharts and block diagrams, and the blocks herein may represent (1) a stage of a process of performing an operation, or (2) a "part" of a device having an effect of performing an operation.
  • the identified stages and "parts" can be implemented using programmable circuitry and/or a processor.
  • Dedicated circuits may include digital and/or analog hardware circuits.
  • integrated circuits (ICs) and/or discrete circuits may be included.
  • the programmable circuit can include reconfigurable hardware circuitry.
  • Reconfigurable hardware circuits can include memory elements such as logic AND, logic OR, logic XOR, logic NAND, logic NOR and other logic operations, flip-flops, registers, field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), and the like.
  • the computer readable medium can comprise any tangible device that can store instructions that are executed by a suitable device.
  • a computer readable medium having stored instructions internally is provided with a product comprising executable instructions for forming means for performing the operations identified in the flowchart or block diagram.
  • an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, or the like can be included.
  • more specific examples of the computer readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
  • the computer readable instructions can comprise any of the source code or object code described in any combination of one or more programming languages.
  • the source code or object code includes conventional procedural programming languages.
  • the conventional procedural programming language may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or an object-oriented programming language such as Smalltalk, JAVA (registered trademark) or C++, as well as the "C" programming language or a similar programming language.
  • the computer readable instructions may be provided locally, or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general purpose computer, special purpose computer, or other programmable data processing apparatus.
  • the processor or programmable circuitry can execute computer readable instructions to form a means for performing the operations identified in the flowcharts or block diagrams.
  • Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • the UAV 10 includes a UAV main body 20, a pan/tilt head 50, a plurality of imaging devices 60, and an imaging device 100.
  • the pan/tilt head 50 and the camera device 100 are one example of a camera system.
  • the UAV 10 is an example of a flying body that moves in the air. In addition to the UAV, the concept of a flying body includes other aircraft, airships, helicopters, etc. that move in the air.
  • the UAV main body 20 is provided with a plurality of rotors.
  • a plurality of rotors are an example of a propulsion section.
  • the UAV body 20 causes the UAV 10 to fly by controlling the rotation of a plurality of rotors.
  • the UAV body 20 uses, for example, four rotors to fly the UAV 10.
  • the number of rotors is not limited to four.
  • the UAV 10 can also be a fixed wing aircraft without a rotor.
  • the imaging device 100 is an imaging camera that images an object included in a desired imaging range.
  • the pan/tilt head 50 rotatably supports the image pickup apparatus 100.
  • the pan/tilt head 50 is an example of a support mechanism.
  • for example, the pan/tilt head 50 uses an actuator to support the image pickup apparatus 100 so that it can rotate about the pitch axis.
  • the pan/tilt head 50 uses an actuator to support the image pickup apparatus 100 so as to be rotatable about the roll axis and the yaw axis, respectively.
  • the pan/tilt head 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • two of the imaging devices 60 may be disposed on the nose of the UAV 10, that is, on the front side. Further, the other two imaging devices 60 may be disposed on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side can be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60.
  • the number of imaging devices 60 provided in the UAV 10 is not limited to four.
  • the UAV 10 only needs to have at least one imaging device 60.
  • the UAV 10 may also be provided with at least one imaging device 60 on the nose, the tail, the side, the bottom surface and the top surface of the UAV 10, respectively.
  • the angle of view that can be set in the imaging device 60 can be larger than the angle of view that can be set in the imaging device 100.
  • the imaging device 60 may have a fixed focus lens or a fisheye lens.
  • the remote operating device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operating device 300 can communicate wirelessly with the UAV 10.
  • the remote operation device 300 transmits, to the UAV 10, instruction information indicating various commands related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating.
  • the indication information includes, for example, indication information that causes the height of the UAV 10 to rise.
  • the indication information may indicate the height at which the UAV 10 should be located.
  • the UAV 10 moves at a height indicated by the indication information received from the remote operation device 300.
  • the indication information may include a rising instruction that causes the UAV 10 to rise. UAV10 rises during the period of accepting the rising command. When the height of the UAV 10 has reached the upper limit height, the UAV 10 can limit the rise even if the rise command is accepted.
  • FIG. 2 shows an example of functional blocks of the UAV 10.
  • the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a pan/tilt head 50, an imaging device 60, and an imaging device 100.
  • Communication interface 36 is in communication with other devices, such as remote operating device 300.
  • the communication interface 36 can receive indication information including various instructions to the UAV control section 30 from the remote operation device 300.
  • the memory 32 stores programs and the like required for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the pan/tilt head 50, the imaging device 60, and the imaging device 100.
  • the memory 32 may be a computer readable recording medium, and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 32 can be disposed inside the UAV main body 20, or can be arranged so as to be detachable from the UAV main body 20.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with a program stored in the memory 32.
  • the UAV control unit 30 can be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with an instruction received from the remote operation device 300 through the communication interface 36.
  • the propulsion unit 40 advances the UAV 10.
  • the propulsion unit 40 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates the plurality of rotors by a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to fly the UAV 10.
  • the GPS receiver 41 receives a plurality of signals indicating the time of transmission from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received plurality of signals.
  • the IMU 42 detects the posture of the UAV 10.
  • the IMU 42 detects the acceleration in the three-axis direction of the front, rear, left and right, and up and down of the UAV 10 and the angular velocity in the three-axis direction of the pitch axis, the roll axis, and the yaw axis as the posture of the UAV 10.
  • the magnetic compass 43 detects the orientation of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying height of the UAV 10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into a height to detect the altitude.
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the lens portion 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be composed of a CCD or a CMOS.
  • the image sensor 120 outputs image data of an optical image imaged by the plurality of lenses 210 to the imaging control section 110.
  • the imaging control unit 110 can be configured by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 can control the imaging device 100 in accordance with an operation instruction for the imaging device 100 from the UAV control unit 30.
  • the memory 130 may be a computer readable recording medium, and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 130 stores a program or the like necessary for the image control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be disposed inside the casing of the image pickup apparatus 100.
  • the memory 130 may be disposed to be detachable from the housing of the image pickup apparatus 100.
  • the lens unit 200 has a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220.
  • the plurality of lenses 210 can function as a zoom lens and a focus lens. At least a portion or all of the plurality of lenses 210 are configured to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens that can be detachably provided with respect to the imaging unit 102.
  • the lens driving unit 212 moves at least a part or all of the plurality of lenses 210 along the optical axis by a mechanism member such as a cam ring.
  • the lens driving portion 212 may include an actuator.
  • the actuator can include a stepper motor.
  • the lens control unit 220 drives the lens driving unit 212 in accordance with a lens control command from the imaging unit 102, and moves one or a plurality of lenses 210 in the optical axis direction by the mechanism member.
  • the lens control commands are, for example, a zoom control command and a focus control command.
  • the lens portion 200 further has a memory 222 and a position sensor 214.
  • the lens control unit 220 controls the movement of the lens 210 in the optical axis direction by the lens driving unit 212 based on the lens operation command from the imaging unit 102.
  • a part or all of the lens 210 moves along the optical axis.
  • the lens control section 220 performs at least one of a zooming motion and a focusing motion by moving at least one of the lenses 210 along the optical axis.
  • the position sensor 214 detects the position of the lens 210.
  • the position sensor 214 can detect the current zoom position or focus position.
  • the lens driving section 212 may include a shake correction mechanism.
  • the lens control section 220 can move the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis by the shake correction mechanism, thereby performing shake correction.
  • the lens driving section 212 can drive the shake correction mechanism by a stepping motor to perform shake correction.
  • the shake correction mechanism may be driven by the stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis, thereby performing shake correction.
  • the memory 222 stores control values of the plurality of lenses 210 that are moved by the lens driving unit 212.
  • the memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the image pickup apparatus 100 thus constructed has a so-called zoom tracking function.
  • the zoom tracking function is a function of changing the focus position, which indicates the position of the focus lens, in accordance with a change in the zoom position so as to maintain the in-focus state of a subject whose distance from the image pickup apparatus 100 does not change when the zoom position indicating the position of the zoom lens is changed.
  • the zoom tracking is performed, for example, by moving the focus lens to a target focus position corresponding to the target zoom position determined in accordance with the zoom tracking curve corresponding to the subject distance shown in FIG. 3.
  • the zoom tracking curve is an example of information indicating a relationship between a focus position indicating a position of the focus lens and a zoom position of the zoom lens for maintaining the focus state of the subject that is constant with respect to the lens portion 200. This information is determined by the characteristics of the optical system of the lens unit 200, and can be stored, for example, in the memory 222.
  • when the optical system of the lens unit 200 changes from the state of the zoom position (1) and the focus position (1) to the state of the zoom position (2), the imaging apparatus 100 determines the focus position (2) corresponding to the zoom position (2) in accordance with the zoom tracking curve 400.
  • the image pickup apparatus 100 derives the amount of change in the zoom position based on the zoom position (1) and the zoom position (2), and derives the amount of change in the focus position based on the focus position (1) and the focus position (2). Then, the imaging apparatus 100 controls the zoom position and the focus position based on the respective amounts of change.
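  • As an editorial illustration only (not part of the patent text), the following Python sketch shows one way such a stored zoom tracking curve could be represented and queried; the sample values, the table layout, and the linear interpolation are assumptions, not details taken from the disclosure.

```python
# Hypothetical zoom tracking curve for one fixed subject distance:
# focus positions (motor steps) sampled at discrete zoom positions.
ZOOM_TRACKING_CURVE = {0: 120, 10: 135, 20: 158, 30: 190, 40: 233, 50: 290}

def focus_for_zoom(zoom_pos, curve=ZOOM_TRACKING_CURVE):
    """Focus position on the curve for an arbitrary zoom position,
    linearly interpolated between the stored sample points."""
    zs = sorted(curve)
    if zoom_pos <= zs[0]:
        return float(curve[zs[0]])
    if zoom_pos >= zs[-1]:
        return float(curve[zs[-1]])
    for lo, hi in zip(zs, zs[1:]):
        if lo <= zoom_pos <= hi:
            t = (zoom_pos - lo) / (hi - lo)
            return curve[lo] + t * (curve[hi] - curve[lo])

# Amount of change in the focus position when moving from zoom position (1)
# to zoom position (2), with the subject distance assumed constant:
delta_focus = focus_for_zoom(35) - focus_for_zoom(15)
```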
  • the premise of the zoom tracking as described above is that the distance from the lens unit 200 to the subject is fixed during the movement of the zoom lens to the desired zoom position. However, this distance may change during the change of the zoom position. For example, when the imaging device 100 is mounted on a flying object such as the UAV 10, the accuracy of the zoom tracking may be lowered due to a change in the position of the flying body or the subject during the flight of the flying body. Therefore, according to the present embodiment, even when the distance from the lens unit 200 to the subject changes, the accuracy of the zoom tracking is suppressed from being lowered.
  • the imaging control unit 110 has a determination unit 112, a zoom position control unit 114, and a focus position control unit 116.
  • when the zoom position indicating the position of the zoom lens changes, the determining portion 112 determines the first amount of change in the focus position, which indicates the position of the focus lens, for maintaining the in-focus state of a subject whose distance from the lens portion 200 does not change.
  • the determining portion 112 determines a second amount of change in the focus position for maintaining the in-focus state of the subject with respect to the change in the distance from the lens portion 200.
  • the determination section 112 is an example of the first determination section and the second determination section.
  • the zoom position may not necessarily directly indicate the physical position of at least one lens constituting the zoom lens in the optical axis direction.
  • for example, when a cam mechanism is used to move the zoom lens, the zoom position may be a value corresponding to the displacement position of a cam member. Further, when a cylindrical cam is used to move the zoom lens, the zoom position may be a value equivalent to the rotation angle of the cylindrical cam.
  • the focus position may not necessarily directly indicate the physical position of at least one lens constituting the focus lens in the optical axis direction.
  • for example, the focus position may be a value equivalent to the number of steps, from a reference rotational position, of the stepping motor that drives the drive mechanism for moving the focus lens in the optical axis direction.
  • the determination portion 112 determines a first amount of change when the zoom lens is moved from the first zoom position to the second zoom position.
  • the determining portion 112 further determines the second amount of change based on the plurality of images captured before the zoom lens is moved to the second zoom position.
  • the determining portion 112 may determine the second amount of change based on the plurality of images including the first image captured when the zoom lens is in the first zoom position and the second image captured before the first image.
  • the determination section 112 may determine a first distance indicating a distance from the lens section 200 to the subject based on the first image.
  • the determination section 112 may determine a second distance indicating a distance from the lens section 200 to the subject based on the second image.
  • the determining portion 112 may determine the second amount of change based on the first distance and the second distance.
  • the determining portion 112 may determine the second amount of change based on a difference between the first distance and the second distance.
  • the focus position control section 116 controls the focus position based on the first change amount and the second change amount.
  • the zoom position control section 114 controls the position of the zoom lens in accordance with an instruction for moving the zoom lens from the first zoom position to the second zoom position.
  • in addition to the first amount of change determined based on the positional change of the zoom lens, the focus position control section 116 controls the focus position in accordance with the second amount of change determined based on the change in the distance to the subject.
  • thereby, the imaging control unit 110 can perform the zoom tracking satisfactorily.
  • the determination portion 112 determines the amount of change in the distance to the subject based on at least the first image captured at the first zoom position and the second image captured before the first image.
  • based on this amount of change, the determining portion 112 determines the second amount of change in the focus position before the zoom lens moves to the second zoom position.
  • while the zoom lens moves from the first zoom position to the second zoom position, the focus position control section 116 can control the position of the focus lens based not only on the first amount of change derived from the positional change of the zoom lens but also on the second amount of change derived from the change in the distance to the subject. Therefore, even when the distance to the subject changes, a reduction in the accuracy of the zoom tracking can be suppressed.
  • when the lens portion 200 does not satisfy a predetermined imaging condition, the focus position control section 116 can control the focus position based on the first amount of change and the second amount of change.
  • when the predetermined imaging condition is satisfied, the focus position control unit 116 may control the focus position based on the first amount of change without being based on the second amount of change.
  • for example, when the predetermined imaging condition that the distance from the lens portion 200 to the ground is equal to or greater than a predetermined distance is not satisfied, the focus position control unit 116 can control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the focus position control unit 116 may control the focus position based on the first amount of change without being based on the second amount of change.
  • the distance from the lens portion 200 to the ground can be determined based on, for example, a signal from an infrared sensor provided in the UAV main body 20 or the image pickup apparatus 100.
  • the infrared ray sensor may be disposed on, for example, the UAV main body 20 in such a manner that infrared rays are emitted downward in the vertical direction.
  • the infrared sensor can detect the distance from the infrared sensor to the reflecting surface based on the reflected light of the emitted infrared rays, and takes the distance as the distance from the lens portion 200 to the ground.
  • when the predetermined imaging condition that the distance from the lens unit 200 to the ground is equal to or greater than a predetermined distance and the imaging direction of the lens unit 200 includes a component in a predetermined direction is not satisfied, the focus position control unit 116 may control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the focus position control unit 116 may control the focus position based on the first amount of change without being based on the second amount of change.
  • the predetermined direction may be a direction in which the imaging apparatus 100 can image the ground, for example, vertically downward.
  • for example, when the imaging device 100 is mounted on the UAV 10 and the ground is photographed from the air during the flight of the UAV 10, the distance to the subject is long, so even if the distance to the subject changes, the accuracy of the zoom tracking is hardly affected. Therefore, when the above-described condition is satisfied, the focus position control unit 116 can control the focus position based on the first amount of change without being based on the second amount of change.
  • on the other hand, even when the imaging direction of the lens portion 200 includes, for example, a component in the vertical direction, if the imaging apparatus 100 photographs a subject at a relatively short distance, a change in the distance to the subject may affect the accuracy of the zoom tracking.
  • in this case, the focus position control unit 116 can control the focus position based on the first amount of change and the second amount of change.
  • when the predetermined imaging condition that the imaging apparatus 100 operates in a predetermined shooting mode is not satisfied, the focus position control unit 116 can control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the focus position control unit 116 can control the focus position based on the first amount of change without being based on the second amount of change. For example, when the imaging apparatus 100 operates in a shooting mode other than the predetermined shooting mode, such as a portrait mode in which the subject distance is assumed to be short, the focus position control unit 116 can determine that the predetermined imaging condition is not satisfied and control the focus position based on the first amount of change and the second amount of change.
  • the predetermined photographing mode may be an arbitrary photographing mode for photographing a scene that hardly affects the accuracy of the zoom tracking even if the distance to the subject changes.
  • when the predetermined imaging condition that the distance from the lens unit 200 to the subject is equal to or greater than a predetermined distance is not satisfied, the focus position control unit 116 can control the focus position based on the first amount of change and the second amount of change.
  • when that condition is satisfied, the focus position control unit 116 may control the focus position based on the first amount of change without being based on the second amount of change.
  • the same applies to the other predetermined imaging conditions: when the condition is not satisfied, the focus position control section 116 can control the focus position based on the first amount of change and the second amount of change.
  • when the condition is satisfied, the focus position control section 116 may control the focus position based on the first amount of change without being based on the second amount of change.
  • in that case, the zoom tracking is performed without using the second amount of change based on the change in the distance to the subject, whereby the processing load of the zoom tracking control can be reduced.
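  • As an editorial illustration only, the decision of whether to apply the second amount of change can be summarized as a simple predicate. The following sketch is an assumption about how the conditions described above (flight state, distance to the ground, imaging direction, shooting mode) might be combined; all field names, threshold values, and mode names are hypothetical, and the disclosure also treats these conditions as usable individually.

```python
from dataclasses import dataclass

@dataclass
class ImagingState:
    in_flight: bool               # the flying body is in flight
    distance_to_ground_m: float   # e.g. from an infrared sensor
    points_downward: bool         # imaging direction includes a vertical (downward) component
    shooting_mode: str            # e.g. "landscape", "portrait"

# Hypothetical threshold and mode set; the disclosure leaves the concrete values open.
MIN_GROUND_DISTANCE_M = 30.0
LONG_RANGE_MODES = {"landscape", "aerial"}

def skip_second_change_amount(s: ImagingState) -> bool:
    """Return True when the predetermined imaging condition is satisfied, i.e. the
    change in subject distance can be ignored and only the first amount of change
    (plain zoom tracking) is used to control the focus position."""
    return (
        s.in_flight
        and s.points_downward
        and s.distance_to_ground_m >= MIN_GROUND_DISTANCE_M
        and s.shooting_mode in LONG_RANGE_MODES
    )
```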
  • FIG. 4 is a flowchart showing one example of a zoom control sequence of the image pickup apparatus 100.
  • the imaging control unit 110 receives a zoom instruction for moving the zoom lens based on a user operation (S100).
  • the imaging control unit 110 starts the zoom tracking operation based on the zoom command.
  • the imaging control unit 110 determines whether or not the predetermined imaging condition that can perform the zoom tracking operation without considering the change in the distance of the subject is satisfied (S102). When the predetermined imaging condition is not satisfied, the imaging control unit 110 determines that the zoom tracking operation should be performed in consideration of the change in the distance of the subject. Then, the imaging control unit 110 derives the amount of change in the area of the main subject from the imaged image (S104).
  • the imaging control section 110 first determines the region of the main subject from the image. For example, the imaging control unit 110 can determine the region of the main subject by treating the subject present at the center of the image as the main subject. The imaging control unit 110 may also determine the region of the main subject by treating the subject with the largest motion within a predetermined period of time as the main subject. The imaging control unit 110 may derive, as the amount of change in the area of the main subject, the difference between the region of the main subject in the image captured at a first time point and the region of the main subject in the image captured at a second time point before the first time point.
  • the imaging control unit 110 determines whether or not the amount of change is equal to or greater than a predetermined threshold (S106). If the amount of change is smaller than a predetermined threshold value, the imaging control unit 110 determines that the change in the distance to the subject has little influence on the accuracy of the zoom tracking, and returns to step S102. If the amount of change is equal to or greater than a predetermined threshold value, the imaging control unit 110 determines that the change in the distance to the subject may affect the accuracy of the zoom tracking, and derives the second amount of change in the focus position (S108).
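  • For illustration only, steps S104 and S106 could be sketched as below; the bounding-box representation of the main subject region and the threshold value are assumptions, not details from the disclosure.

```python
def main_subject_area_change(region_t1, region_t2):
    """Sketch of S104: compare the main subject's area between an image captured
    at a first time point and one captured earlier. Regions are (x, y, width,
    height) tuples produced by an assumed subject detector."""
    area = lambda r: r[2] * r[3]
    return abs(area(region_t2) - area(region_t1))

# S106: treat the subject distance as changed only when the area change reaches
# a threshold (the concrete value is not given in the disclosure).
AREA_CHANGE_THRESHOLD = 500  # pixels^2, illustrative
changed = main_subject_area_change((300, 200, 120, 180),
                                   (295, 198, 130, 195)) >= AREA_CHANGE_THRESHOLD
```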
  • when the zoom tracking operation is started, the imaging control section 110 holds the current zoom position (S110).
  • the imaging control section 110 can acquire information indicating the current zooming position of the zoom lens detected by the position sensor 214 from the lens control section 220, and hold the zooming position as the current zooming position. Further, the imaging control unit 110 derives the target zoom position based on the zoom instruction (S112).
  • the imaging control section 110 can determine the zoom position indicated by the zoom instruction as the target zoom position.
  • the imaging control section 110 can determine the target zooming position based on the amount of movement of the zoom lens and the current zooming position shown by the zoom instruction.
  • the imaging control unit 110 derives a first amount of change in the focus position corresponding to the amount of movement of the focus lens to be moved based on the change in the zoom position (S114).
  • the focus position control unit 116 determines the amount of change in the focus position (S116). In the case where the second amount of change is derived, the focus position control section 116 can determine the amount of change in the focus position by adding the second amount of change to the first amount of change. In the case where the second amount of change is not derived, the focus position control unit 116 may determine the first amount of change as the amount of change in the focus position.
  • alternatively, the focus position control unit 116 may first determine one of the first change amount and the second change amount as the change amount of the focus position, and thereafter determine the other of the first change amount and the second change amount as the change amount of the focus position.
  • the focus position control unit 116 drives the focus lens based on the determined amount of change by the lens control unit 220 (S118).
  • the focus position control section 116 can change the focus position of the focus lens to the desired focus position at one time based on the first change amount and the second change amount.
  • alternatively, the focus position control section 116 may change the focus position of the focus lens based on one of the first change amount and the second change amount, and then further change it based on the other. That is, the focus position control unit 116 may change the focus position of the focus lens to the desired focus position stepwise based on the first change amount and the second change amount.
  • the zoom position control unit 114 determines the amount of change in the zoom position based on the current zoom position and the target zoom position (S120).
  • the zoom position control unit 114 drives the zoom lens based on the determined amount of change in the zoom position by the lens control unit 220 (S122).
  • the imaging control section 110 determines whether or not the zoom position has reached the final zoom position determined in accordance with the zoom instruction (S124). If the zoom position does not reach the final zoom position determined in accordance with the zoom instruction, the imaging control portion 110 repeats the processing from step S110 onward. On the other hand, if the final zoom position has been reached, the imaging control section 110 ends the processing.
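  • Read procedurally, steps S100 to S124 amount to a loop of roughly the following shape. This Python sketch is an editorial paraphrase under an assumed camera interface; none of the method names come from the disclosure, and the branch back to S102 when the area change is small is simplified to skipping the second amount of change.

```python
def zoom_tracking_loop(cam, zoom_cmd):
    """Loose sketch of the FIG. 4 sequence: S100 receive the zoom command,
    S102-S108 optionally derive the second change amount, S110-S122 drive the
    focus and zoom lenses, S124 repeat until the final zoom position."""
    while True:
        second_change = 0
        if not cam.predetermined_condition_satisfied():           # S102
            area_change = cam.main_subject_area_change()           # S104
            if area_change >= cam.AREA_CHANGE_THRESHOLD:           # S106
                second_change = cam.derive_second_change_amount()  # S108

        current_zoom = cam.current_zoom_position()                 # S110
        target_zoom = cam.target_zoom_position(zoom_cmd)           # S112
        first_change = cam.derive_first_change_amount(current_zoom,
                                                      target_zoom) # S114

        focus_change = first_change + second_change                # S116
        cam.drive_focus_lens(focus_change)                         # S118
        cam.drive_zoom_lens(target_zoom - current_zoom)            # S120, S122

        if cam.final_zoom_position_reached(zoom_cmd):              # S124
            break
```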
  • FIG. 5 is a flowchart showing one example of a derivation order of the second variation amount of the focus position based on the change in the distance to the subject.
  • FIGS. 6 and 7 are views for explaining the subject distance L1, the focal length f, and the like in the optical system of the lens unit 200.
  • using the design-value image magnification IM1, the determination section 112 derives the subject distance L1 and the actual subject size Y (S204).
  • the determination section 112 determines the subject size y2 of the subject 504 on the image plane imaged by the lens 210 with respect to the image captured in the current frame (S206).
  • the determination unit 112 derives the amount of change ⁇ L of the subject distance L (S212).
  • the determining portion 112 can derive the second amount of change in the focus position based on the image plane change amount ⁇ x.
  • the determination unit 112 can derive the second amount of change in the focus position by multiplying the image plane change amount ⁇ x by a predetermined coefficient K.
  • the coefficient K is a coefficient determined in advance for the focus position conversion with respect to the change in the image plane position.
  • the determining section 112 may predict the position of the subject in the image of the future frame imaged at the second zoom position based on the image captured in the current frame and the at least one image captured in the previous frame, and derive the focus according to the prediction. The second amount of change in position.
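  • As an editorial illustration of the kind of geometry involved, the following sketch estimates the subject distance from the subject's size on the image plane and converts a distance change into an image-plane shift Δx and then into a focus-position change via a coefficient K. It uses the standard thin-lens magnification and Newton relations as assumptions; the disclosure's exact expressions, and values such as the real subject size and the coefficient, are not reproduced here.

```python
def estimate_subject_distance(f_mm, real_size_mm, image_size_mm):
    """Thin-lens magnification m = image_size / real_size = f / (L - f),
    solved for the subject distance L."""
    m = image_size_mm / real_size_mm
    return f_mm * (1.0 / m + 1.0)

def second_change_amount(f_mm, real_size_mm, y1_mm, y2_mm, k_steps_per_mm):
    """Second amount of change of the focus position, derived from how much the
    subject's size on the image plane changed between two frames (y1 -> y2).
    Newton's relation x' = f^2 / (L - f) gives the image-plane extension for each
    distance; k converts the image-plane shift into focus-motor steps."""
    L1 = estimate_subject_distance(f_mm, real_size_mm, y1_mm)
    L2 = estimate_subject_distance(f_mm, real_size_mm, y2_mm)
    x1 = f_mm ** 2 / (L1 - f_mm)
    x2 = f_mm ** 2 / (L2 - f_mm)
    delta_x = x2 - x1                 # image-plane change amount (Δx in the text)
    return k_steps_per_mm * delta_x   # second change amount ≈ K * Δx

# Example: 50 mm focal length, 1.7 m tall subject whose image grows from 2.0 mm to 2.2 mm.
steps = second_change_amount(f_mm=50.0, real_size_mm=1700.0,
                             y1_mm=2.0, y2_mm=2.2, k_steps_per_mm=100.0)
```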
  • FIG. 8 is a flowchart showing one example of a derivation order of a first change amount of a focus position based on a change in a zoom position.
  • the determination section 112 acquires the current zoom position (S300).
  • the determination section 112 can acquire the current zoom position of the zoom lens through the lens control section 220.
  • the determination section 112 acquires the current focus position (S302).
  • the determination section 112 can acquire the current focus position of the focus lens through the lens control section 220.
  • the determination section 112 derives the driving range at the current zoom position (S304).
  • for example, the determining portion 112 can determine the focus position corresponding to the current zoom position on the zoom tracking curve 400 at the infinity end and the focus position corresponding to the current zoom position on the zoom tracking curve 402 at the closest end, and take the range between the two as the drive range 404.
  • the determination section 112 derives a ratio of the focus position to the drive range at the current zoom position (S306).
  • the determining unit 112 derives the ratio b/a in the driving range 404 shown in FIG. 9, for example.
  • the determination section 112 derives the target zoom position based on the zoom instruction (S308).
  • the determination section 112 derives the driving range at the target zoom position (S310).
  • the determination section 112 derives, for example, the drive range 406 at the target zoom position shown in FIG. 9.
  • the determination section 112 derives the target focus position from the ratio and the drive range at the target zoom position (S312).
  • the determining portion 112 can derive the target focus position based on, for example, the ratio b/a and the driving range 406 shown in FIG.
  • the determining portion 112 derives a first amount of change in the focus position based on the current focus position and the target focus position (S314).
  • the determination section 112 can derive a zoom tracking curve corresponding to the current subject distance based on zoom tracking curves corresponding to at least two predefined subject distances. Further, the determining portion 112 may derive the target focus position for the case where the distance to the subject is constant, in accordance with the derived zoom tracking curve.
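  • For illustration only, the FIG. 8/9 procedure can be sketched as keeping the ratio b/a of the current focus position within the drive range between the infinity-end and closest-end zoom tracking curves, and re-applying that ratio at the target zoom position. The curve representation below (plain callables) and the sample numbers are assumptions.

```python
def first_change_amount(curve_inf, curve_near, current_zoom, target_zoom,
                        current_focus):
    """First amount of change of the focus position: preserve the ratio b/a
    inside the drive range (404 -> 406) when moving to the target zoom position.
    curve_inf / curve_near map a zoom position to a focus position."""
    # Drive range 404 at the current zoom position and the ratio b/a within it.
    inf_now, near_now = curve_inf(current_zoom), curve_near(current_zoom)
    a = near_now - inf_now                     # full drive range
    b = current_focus - inf_now                # offset of the current focus position
    ratio = b / a if a else 0.0

    # Drive range 406 at the target zoom position; the same ratio gives the target focus.
    inf_tgt, near_tgt = curve_inf(target_zoom), curve_near(target_zoom)
    target_focus = inf_tgt + ratio * (near_tgt - inf_tgt)
    return target_focus - current_focus        # first amount of change

# Example with two made-up straight-line curves (focus steps as a function of zoom steps):
inf_curve  = lambda z: 100 + 1.5 * z
near_curve = lambda z: 180 + 2.5 * z
d1 = first_change_amount(inf_curve, near_curve, current_zoom=10, target_zoom=40,
                         current_focus=150)
```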
  • the zoom lens is moved from the first zoom position to the second zoom position.
  • the determination section 112 determines the temporary focus position (1) in accordance with the zoom tracking curve 401 determined from the first zoom position and the current focus position, and determines the first change amount 412.
  • the determination section 112 determines the second variation amount 414 of the focus position based on the amount of change in the distance to the subject.
  • the determining portion 112 may determine the amount of change in the distance to the subject based on the first image captured at the first zoom position and the second image captured before the first image, and determine the second amount of change 414 of the focus position.
  • the focus position control portion 116 moves the focus lens to the first focus position by the change amount 416, which is the sum of the first change amount 412 and the second change amount 414. While the zoom lens moves from the first zoom position to the second zoom position, the focus position control portion 116 can move the focus lens by the change amount 416 at one time. Alternatively, during the movement from the first zoom position to the second zoom position, the focus position control portion 116 may move the focus lens by the first change amount 412 and then further move it by the second change amount 414.
  • the zoom lens is moved from the second zoom position to the third zoom position.
  • the determination section 112 determines the temporary focus position (2) in accordance with the zoom tracking curve 402, and determines the first variation amount 422. Further, the determination section 112 determines the second variation amount 424 of the focus position based on the amount of change in the distance to the subject.
  • the focus position control portion 116 may move the focus lens to the second focus position by the change amount 426, which is the sum of the first change amount 422 and the second change amount 424. By adding the first change amount 422 and the second change amount 424 and moving the focus lens at one time, the amount of movement of the focus lens can be reduced.
  • alternatively, the focus position control portion 116 may move the focus lens by the second change amount 424 after moving it by the first change amount 422, thereby moving the focus lens to the second focus position.
  • the focus position can be adjusted during the zoom tracking based on the amount of change in the distance to the subject. Therefore, it is possible to suppress a decrease in zoom tracking accuracy.
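  • For illustration, the two application strategies described for FIG. 10 reduce to the small helper below; drive_focus is an assumed callback that moves the focus lens by a relative number of steps. Applying the summed change in one move keeps the total lens travel, and thus the actuation time, lower than the stepwise variant.

```python
def apply_focus_changes(drive_focus, first_change, second_change, stepwise=False):
    """Move the focus lens either once by the summed change amount
    (e.g. 416 = 412 + 414) or in two steps while the zoom lens travels
    between the two zoom positions."""
    if stepwise:
        drive_focus(first_change)    # zoom-tracking component (e.g. 412)
        drive_focus(second_change)   # subject-distance component (e.g. 414)
    else:
        drive_focus(first_change + second_change)  # single move (e.g. 416)
```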
  • FIG. 11 shows an example of a computer 1200 in which a plurality of aspects of the present invention may be embodied in whole or in part.
  • the program installed in the computer 1200 enables the computer 1200 to function as operations associated with the device according to the embodiments of the present invention, or as one or more "parts" of the device. Alternatively, the program can cause the computer 1200 to perform the operations or the one or more "parts".
  • the program enables computer 1200 to perform the processes involved in embodiments of the present invention or the stages of the process.
  • Such a program may be executed by the CPU 1212 in order for the computer 1200 to perform a plurality of or all of the determined operations associated with the flowcharts and block diagrams of the present description.
  • the computer 1200 in the present embodiment includes a CPU 1212 and a RAM 1214, and the CPU 1212 and the RAM 1214 are connected to each other by a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220.
  • Computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling the respective units.
  • Communication interface 1222 communicates with other electronic devices over a network.
  • the hard disk drive can store programs and data for use by the CPU 1212 within the computer 1200.
  • the ROM 1230 stores a boot program or the like executed by the computer 1200 at the time of activation and/or a program depending on the hardware of the computer 1200.
  • the program can be provided by a computer readable recording medium such as a CD-ROM, a USB memory, or an IC card.
  • the program is installed in the RAM 1214 or the ROM 1230 which is also an example of a computer readable recording medium and is executed by the CPU 1212.
  • the information processing described within these programs is read by computer 1200 to enable cooperation between the program and the various types of hardware resources.
  • the apparatus or method can be constructed by realizing the operation or processing of information by using the computer 1200.
  • the CPU 1212 can execute a communication program loaded on the RAM 1214, and instructs the communication interface 1222 to perform communication processing in accordance with the processing described in the communication program.
  • the communication interface 1222 reads the transmission data stored in the transmission buffer provided in the recording medium such as the RAM 1214 or the USB memory under the control of the CPU 1212, transmits the read transmission data to the network, or writes the received data received through the network. It is entered into a receiving buffer or the like provided on the recording medium.
  • the CPU 1212 can read all or a part of files or databases stored in an external recording medium such as a USB memory into the RAM 1214, and perform various types of processing on the data on the RAM 1214. Next, the CPU 1212 can write back the processed data to the external recording medium.
  • an external recording medium such as a USB memory
  • Various types of information such as various types of programs, data, tables, and databases can be stored in a recording medium and subjected to information processing.
  • the CPU 1212 can perform various types of processing on the data read from the RAM 1214, including the various types of operations, information processing, conditional judgments, conditional branching, unconditional branching, information retrieval/replacement, and the like specified by the instruction sequence of the program described elsewhere in the present disclosure, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in a file, a database, and the like in the recording medium.
  • for example, when a plurality of entries, each associating an attribute value of a first attribute with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute that satisfies the predetermined condition.
  • the above described programs or software modules may be stored on computer 1200 or in a computer readable storage medium in the vicinity of computer 1200.
  • a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer readable storage medium, and thus the program can be supplied to the computer 1200 through the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Lens Barrels (AREA)
  • Studio Devices (AREA)

Abstract

A control device (110) for controlling a lens device (200) including a zoom lens and a focus lens, comprising: a first determining unit (112) that, when a zoom position indicating the position of the zoom lens changes, determines a first amount of change (412, 422) in a focus position, which indicates the position of the focus lens, for maintaining the in-focus state of a subject whose distance from the lens device (200) does not change; a second determining unit (112) that determines a second amount of change (414, 424) in the focus position for maintaining the in-focus state of a subject whose distance from the lens device (200) changes; and a control unit (116) that controls the focus position based on the first amount of change (412, 422) and the second amount of change (414, 424). The control device (110) having the first determining unit (112), the second determining unit (112), and the control unit (116) can suppress a reduction in the accuracy of zoom tracking.

Description

Control device, imaging device, flying body, control method, and program
Technical Field
The present invention relates to a control device, an imaging device, a flying body, a control method, and a program.
Background Art
Patent Document 1 discloses a lens device that performs zoom tracking control, which moves a focus lens in order to correct a focus shift accompanying movement of a zoom lens.
Patent Document 1: Japanese Patent Laid-Open No. 2016-224096
Summary of the Invention
Technical Problem to Be Solved by the Invention
The above zoom tracking control is premised on the distance from the imaging device to the subject being fixed. However, this distance is not always fixed, and the accuracy of the zoom tracking is sometimes reduced.
Means for Solving the Technical Problem
A control device according to one aspect of the present invention controls a lens device including a zoom lens and a focus lens. The control device may include a first determining unit that, when a zoom position indicating the position of the zoom lens changes, determines a first amount of change in a focus position, which indicates the position of the focus lens, for maintaining the in-focus state of a subject whose distance from the lens device does not change. The control device may include a second determining unit that determines a second amount of change in the focus position for maintaining the in-focus state of a subject whose distance from the lens device changes. The control device may include a control unit that controls the focus position based on the first amount of change and the second amount of change.
In accordance with an instruction for moving the zoom lens from a first zoom position to a second zoom position, the first determining unit may determine the first amount of change for when the zoom lens moves from the first zoom position to the second zoom position. In accordance with that instruction, the second determining unit may determine the second amount of change based on a plurality of images captured before the zoom lens moves to the second zoom position.
The second determining unit may determine the second amount of change based on a plurality of images including a first image captured when the zoom lens is at the first zoom position and a second image captured before the first image.
The second determining unit may determine a first distance indicating the distance from the lens device to the subject based on the first image, determine a second distance indicating the distance from the lens device to the subject based on the second image, and determine the second amount of change based on the first distance and the second distance.
The second determining unit may determine the second amount of change based on the difference between the first distance and the second distance.
The first determining unit may determine the first amount of change based on information indicating a relationship between the zoom position of the zoom lens and the focus position, which indicates the position of the focus lens for maintaining the in-focus state of a subject whose distance from the lens device does not change.
When the lens device does not satisfy a predetermined imaging condition, the control unit may control the focus position based on the first amount of change and the second amount of change. When the lens device satisfies the predetermined imaging condition, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
When the predetermined imaging condition that the distance from the lens device to the ground is equal to or greater than a predetermined distance is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change. When that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
When the predetermined imaging condition that the distance from the lens device to the ground is equal to or greater than a predetermined distance and the imaging direction of the lens device includes a component in a predetermined direction is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change. When that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change. The predetermined direction may be the vertical direction.
When the predetermined imaging condition that the distance from the lens device to the subject is equal to or greater than a predetermined distance is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change. When that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
When the predetermined imaging condition that the lens device operates in a predetermined shooting mode is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change. When that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
The lens device may be mounted on a flying body. When the predetermined imaging condition that the flying body is in flight is not satisfied, the control unit may control the focus position based on the first amount of change and the second amount of change. When that condition is satisfied, the control unit may control the focus position based on the first amount of change without being based on the second amount of change.
本发明的一个方式所涉及的摄像装置,其具备上述控制装置和镜头装置。
本发明的一个方式所涉及的飞行体,其具备上述摄像装置进行飞行。飞行体可以具备可旋转地支持摄像装置的支持机构。
本发明的一个方式所涉及的控制方法,其对具备变焦镜头和聚焦镜头的镜头装置进行控制。控制方法可以具备在表示变焦镜头的位置的变焦位置发生变化时,确定用于维持对于与镜头装置的距离不变的被摄体的对焦状态的、表示聚焦镜头的位置的聚焦位置的第一变化量的阶段。控制方法可以具备确定用于维持对于与镜头装置的距离变化的被摄体的对焦状态的聚焦位置的第二变化量的阶段。控制方法可以具备基于第一变化量和第二变化量来控制聚焦位置的阶段。
本发明的一个方式所涉及的程序,其为用于使计算机对具备变焦镜头及聚焦镜头的镜头装置进行控制的程序。程序可以使计算机执行在表示变焦镜头的位置的变焦位置发生变化时,确定用于维持对于与镜头装置的距离不变的被摄体的对焦状态的、表示聚焦镜头的位置的聚焦位置的第一变化量的阶段。程序可以使计算机执行确定用于维持对于与镜头装置的距离变化的被摄体的对焦状态的聚焦位置的第二变化量的阶段。程序可以使计算机执行基于第一变化量和第二变化量来控制聚焦位置的阶段。
根据本发明的一个方式,在从镜头装置到被摄体的距离发生了变化时,能够抑制变焦跟踪的精度的降低。
此外,上述的发明内容中没有穷举本发明的所有必要特征。此外,这些特征组的子组合也可以成为发明。
附图说明
图1是示出无人飞行器及远程操作装置的外观的一个示例的图。
图2是示出无人飞行器的功能块的一个示例的图。
图3是示出变焦跟踪曲线的一个示例的图。
图4是示出摄像装置的变焦控制顺序的一个示例的流程图。
图5是示出基于到被摄体的距离的变化的聚焦位置的第二变化量的导出顺序的一个示例的流程图。
图6是用于对镜头部的光学系统中的被摄体距离、焦点距离等进行说明的图。
图7是用于对镜头部的光学系统中的被摄体距离、焦点距离等进行说明的图。
图8是示出基于变焦位置的变化的聚焦位置的第一变化量的导出顺序的一个示例的流程图。
图9是用于对基于变焦跟踪曲线的聚焦位置的确定顺序进行说明的图。
图10是用于对由变焦跟踪控制引起的聚焦位置的变化情况进行说明的图。
图11是用于对硬件构成的一个示例进行说明的图。
具体实施方式
以下,通过发明的实施方式来对本发明进行说明,但是以下实施方式并非限制权利要求书所涉及的发明。此外,实施方式中说明的特征的组合并非全部是发明的解决方案所必须的。对本领域普通技术人员来说,显然可以对以下实施方式加以各种变更或改良。从权利要求书的记载即可明白,加以了这样的变更或改良的方式都可包含在本发明的技术范围之内。
权利要求书、说明书、说明书附图以及说明书摘要中包含作为著作权所保护对象的事项。任何人只要如专利局的文档或者记录所表示的那样进行这些文件的复制,著作权人则不会提出异议。但是,在除此以外的情况下,保留一切的著作权。
本发明的各种实施方式可参照流程图及框图而记载,这里的框可表示(1)操作执行的过程的阶段、或(2)具有执行操作的作用的装置的“部”。确定的阶段及“部”可利用可编程电路及/或处理器进行安装。专用电路可包含数字及/或模拟硬件电路。可包含集成电路(IC)及/或离散电路。可编程电路可包含可重构硬件电路。可重构硬件电路可包含逻辑AND、逻辑OR、逻辑XOR、逻辑NAND、逻辑NOR及其它逻辑操作、触发器、寄存器、现场可编程门阵列(FPGA)、可编程逻辑阵列(PLA)等存储器要素等。
计算机可读介质可包含能储存通过适当的设备而执行的指令的任意的有形设备。结果,内部具有所储存的指令的计算机可读介质成为具备包括可执行的指令的产品,该可执行的指令用于形成用以执行流程图或框图中确定的操作的手段。作为计算机可读介质的示例,可包括电子存储介质、磁性存储介质、光存储介质、电磁存储介质、半导体存储介质等。作为计算机可读介质的更具体的示例,可包含软(floppy,登记商标)盘、软磁盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或闪存)、电可擦除可编程只读存储器(EEPROM)、静态随机存取存储器(SRAM)、微型光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)、蓝光(RTM)光盘、记忆棒、集成电路卡等。
计算机可读指令可包含一个或多个编程语言的任意组合中所描述的源代码或对象代码的任一种。源代码或对象代码包含现有的过程性编程语言。现有的过程性编程语言可为汇编程序指令、指令集架构(ISA)指令、机器指令、机器依存指令、微代码、固件指令、状态设定数据、或Smalltalk、JAVA(登记商标)、C++等面向对象编程语言及“C”编程语言或同样的编程语言。计算机可读指令可由本地提供 或通过局域网(LAN)、互联网等广域网(WAN)提供给通用计算机、特殊用途的计算机、或其他可编程的数据处理装置的处理器或可编程电路。处理器或可编程电路可执行计算机可读指令以形成用于执行流程图或框图中确定的操作的手段。作为处理器的示例,包含计算机处理器、处理单元、微处理器、数字信号处理器、控制器、微控制器等。
图1是示出无人飞行器(UAV)10和远程操作装置300的外观的一个示例的图。UAV10具备UAV主体20、云台50、多个摄像装置60、以及摄像装置100。云台50和摄像装置100为摄像系统的一个示例。UAV10为在空中移动的飞行体的一个示例。飞行体的概念除UAV以外还包括在空中移动的其它飞行器、飞艇、直升机等。
UAV主体20具备多个旋翼。多个旋翼为推进部的一个示例。UAV主体20通过控制多个旋翼的旋转而使UAV10飞行。UAV主体20例如使用四个旋翼来使UAV10飞行。旋翼的数量不限于四个。此外,UAV10也可以为不具有旋翼的固定翼机。
摄像装置100为对期望的摄像范围内包含的被摄体进行摄像的摄像用相机。云台50可旋转地支持摄像装置100。云台50为支持机构的一个示例。例如,云台50利用致动器使摄像装置100可以俯仰轴旋转地对其进行支持。云台50利用致动器使摄像装置100还能分别以横滚轴及偏航轴为中心旋转地对其进行支持。云台50可以通过使摄像装置100以偏航轴、俯仰轴和横滚轴中的至少一个为中心旋转,从而变更摄像装置100的姿势。
多个摄像装置60是为了控制UAV10的飞行而对UAV10周围进行摄像的传感用相机。两个摄像装置60可以设置于UAV10的机头即正面。并且,其它两个摄像装置60可以设置于UAV10的底面。正面侧的两个摄像装置60可以成对,起到所谓立体相机的作用。底面侧的两个摄像装置60也可以成对,起到立体相机的作用。可以基于由多个摄像装置60摄像的图像,来生成UAV10周围的三维空间数据。 UAV10所具备的摄像装置60的数量不限于四个。UAV10只要具备至少一个摄像装置60即可。UAV10也可以在UAV10的机头、机尾、侧面、底面及顶面分别具备至少一个摄像装置60。摄像装置60中可设定的视角可以比摄像装置100中可设定的视角更大。摄像装置60可以具有定焦镜头或鱼眼镜头。
远程操作装置300与UAV10通信,从而远程操作UAV10。远程操作装置300可以与UAV10无线通信。远程操作装置300向UAV10发送表示上升、下降、加速、减速、前进、后退、旋转等与UAV10的移动有关的各种指令的指示信息。指示信息包括例如使UAV10的高度上升的指示信息。指示信息可以表示UAV10应该位于的高度。UAV10移动以位于从远程操作装置300接收的指示信息所示出的高度。指示信息可以包括使UAV10上升的上升指令。UAV10在接受上升指令的期间上升。在UAV10的高度已达到上限高度时,即使接受上升指令,UAV10也可以限制上升。
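作为对上述上升指令处理的补充,下面给出一个极简的Python示意代码:在接受上升指令的期间使高度增加,达到上限高度后不再继续上升。其中的类名、上限高度与每周期上升量等数值均为本说明书之外的假设,仅用于示意,并非UAV10的实际实现。

```python
class AscentController:
    """示意:在接受上升指令的期间上升,达到上限高度后限制上升(假设性示例)。"""

    def __init__(self, max_altitude_m=120.0, step_m=1.0):
        self.max_altitude_m = max_altitude_m  # 上限高度(假设值)
        self.step_m = step_m                  # 每个控制周期的上升量(假设值)

    def update(self, current_altitude_m, ascend_command_active):
        """接受上升指令且未达上限时上升一步,否则保持当前高度。"""
        if ascend_command_active and current_altitude_m < self.max_altitude_m:
            return min(current_altitude_m + self.step_m, self.max_altitude_m)
        return current_altitude_m
```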
图2示出UAV10的功能块的一个示例。UAV10具备UAV控制部30、内存32、通信接口36、推进部40、GPS接收器41、惯性测量装置42、磁罗盘43、气压高度计44、温度传感器45、湿度传感器46、云台50、摄像装置60和摄像装置100。
通信接口36与远程操作装置300等其它装置通信。通信接口36可以从远程操作装置300接收包括对UAV控制部30的各种指令的指示信息。内存32储存UAV控制部30对推进部40、GPS接收器41、惯性测量装置(IMU)42、磁罗盘43、气压高度计44、温度传感器45、湿度传感器46、云台50、摄像装置60和摄像装置100进行控制所需的程序等。内存32可以为计算机可读记录介质,可以包括SRAM、DRAM、EPROM、EEPROM和USB存储器等闪存中的至少一个。内存32可以设置于UAV主体20的内部。内存32也可以设置成可从UAV主体20中拆卸下来。
UAV控制部30按照储存在内存32中的程序来控制UAV10的飞行和摄像。UAV控制部30可以由CPU或MPU等微处理器、MCU等微控制器等构成。UAV控制部30按照通过通信接口36从远程操作装置300接收到的指令来控制UAV10的飞行和摄像。推进部40推进UAV10。推进部40具有多个旋翼和使多个旋翼旋转的多个驱动马达。推进部40按照来自UAV控制部30的指令,通过多个驱动马达使多个旋翼旋转,从而使UAV10飞行。
GPS接收器41接收表示从多个GPS卫星发送的时刻的多个信号。GPS接收器41基于接收到的多个信号来计算出GPS接收器41的位置(纬度和经度)、即UAV10的位置(纬度和经度)。IMU42检测UAV10的姿势。IMU42检测UAV10的前后、左右及上下的三轴方向的加速度和俯仰轴、横滚轴以及偏航轴的三轴方向的角速度,作为UAV10的姿势。磁罗盘43检测UAV10的机头的方位。气压高度计44检测UAV10的飞行高度。气压高度计44检测UAV10周围的气压,并将检测到的气压换算为高度,来检测高度。温度传感器45检测UAV10周围的温度。湿度传感器46检测UAV10周围的湿度。
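关于气压高度计44"将检测到的气压换算为高度"这一处理,下面给出一个假设性的Python示意。这里采用的是标准大气模型下常用的气压-高度换算公式,该公式及其常数并非本说明书记载的内容,仅作为一种可能的换算方式的示例。

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """按标准大气模型把气压(hPa)换算为高度(m)。公式与常数为一般性假设。"""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```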
摄像装置100具备摄像部102和镜头部200。镜头部200为镜头装置的一个示例。摄像部102具有图像传感器120、摄像控制部110和内存130。图像传感器120可以由CCD或CMOS构成。图像传感器120将通过多个镜头210成像的光学图像的图像数据输出至摄像控制部110。摄像控制部110可以由CPU或MPU等微处理器、MCU等微控制器等构成。摄像控制部110可以根据来自UAV控制部30的摄像装置100的动作指令来控制摄像装置100。内存130可以为计算机可读记录介质,可以包括SRAM、DRAM、EPROM、EEPROM和USB存储器等闪存中的至少一个。内存130储存摄像控制部110对图像传感器120等进行控制所需的程序等。内存130可以设置于摄像装置100的壳体的内部。内存130可以设置成可从摄像设备100的壳体中拆卸下来。
镜头部200具有多个镜头210、多个镜头驱动部212和镜头控制部220。多个镜头210可以起到变焦镜头、可变焦距镜头和聚焦镜头的作用。多个镜头210中的至少一部分或全部被配置为能够沿着光轴移动。镜头部200可以为能够相对于摄像部102拆装地设置的可换镜头。镜头驱动部212通过凸轮环等机构部件使多个镜头210中的至少一部分或全部沿着光轴移动。镜头驱动部212可以包括致动器。致动器可以包括步进马达。镜头控制部220按照来自摄像部102的镜头控制指令来驱动镜头驱动部212,通过机构部件使一个或多个镜头210沿着光轴方向移动。镜头控制指令例如为变焦控制指令和聚焦控制指令。
镜头部200进一步具有内存222和位置传感器214。镜头控制部220根据来自摄像部102的镜头动作指令,通过镜头驱动部212来控制镜头210向光轴方向的移动。镜头210的一部分或全部沿着光轴移动。镜头控制部220通过使镜头210中的至少一个沿着光轴移动,来执行变焦动作和聚焦动作中的至少一个。位置传感器214检测镜头210的位置。位置传感器214可以检测当前的变焦位置或聚焦位置。
镜头驱动部212可以包括抖动校正机构。镜头控制部220可以通过抖动校正机构来使镜头210在沿着光轴的方向或垂直于光轴的方向上移动,从而执行抖动校正。镜头驱动部212可以由步进马达驱动抖动校正机构,以执行抖动校正。另外,抖动校正机构可以由步进马达驱动,以使图像传感器120在沿着光轴的方向或垂直于光轴的方向上移动,从而执行抖动校正。
内存222存储通过镜头驱动部212而移动的多个镜头210的控制值。内存222可以包括SRAM、DRAM、EPROM、EEPROM和USB存储器等闪存中的至少一个。
这样构成的摄像装置100具有所谓变焦跟踪功能。变焦跟踪功能为如下功能,在表示变焦镜头的位置的变焦位置发生变化时,为了维持对于与摄像装置100的距离不变的被摄体的对焦状态,根据变焦位置的变化使表示聚焦镜头的位置的聚焦位置发生变化。变焦跟踪例如通过使聚焦镜头移动至按照图3所示的与被摄体距离对应的变焦跟踪曲线来确定的与目标变焦位置对应的目标聚焦位置来执行。变焦跟踪曲线为表示用于维持对于与镜头部200的距离不变的被摄体的对焦状态的表示聚焦镜头的位置的聚焦位置与变焦镜头的变焦位置的关系的信息的一个示例。该信息由镜头部200的光学系统的特性来决定,例如可以储存于内存222。
例如,在被摄体存在于无穷远端的情况下,在镜头部200的光学系统从变焦位置(1)和聚焦位置(1)的状态变更为变焦位置(2)的状态时,摄像装置100按照变焦跟踪曲线400确定与变焦位置(2)对应的聚焦位置(2)。摄像装置100基于变焦位置(1)和变焦位置(2)导出变焦位置的变化量,并基于聚焦位置(1)和聚焦位置(2)导出聚焦位置的变化量。然后,摄像装置100基于各变化量来控制变焦位置和聚焦位置。
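为了便于理解上述按照变焦跟踪曲线确定目标聚焦位置并导出各变化量的处理,下面给出一个假设性的Python示意。其中变焦跟踪曲线用离散的(变焦位置,聚焦位置)采样点近似,并通过线性插值查询;曲线数据与数值均为虚构,并非内存222中实际储存的内容。

```python
import bisect

def focus_from_curve(curve, zoom_pos):
    """在按变焦位置升序排列的变焦跟踪曲线采样点上,用线性插值求聚焦位置。"""
    zooms = [z for z, _ in curve]
    focuses = [f for _, f in curve]
    i = bisect.bisect_left(zooms, zoom_pos)
    if i == 0:
        return focuses[0]
    if i >= len(zooms):
        return focuses[-1]
    z0, z1 = zooms[i - 1], zooms[i]
    f0, f1 = focuses[i - 1], focuses[i]
    return f0 + (zoom_pos - z0) / (z1 - z0) * (f1 - f0)

# 虚构的无穷远端变焦跟踪曲线:(变焦位置, 聚焦位置)采样点
curve_infinity = [(0, 100), (50, 140), (100, 210), (150, 320)]

focus_1 = focus_from_curve(curve_infinity, 40)    # 与变焦位置(1)对应的聚焦位置(1)
focus_2 = focus_from_curve(curve_infinity, 120)   # 与变焦位置(2)对应的聚焦位置(2)
zoom_change = 120 - 40                            # 变焦位置的变化量
focus_change = focus_2 - focus_1                  # 聚焦位置的变化量
```

实际的变焦跟踪曲线由镜头部200的光学系统特性决定,采样点数量与插值方式可以根据所需精度来选择。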
上述那样的变焦跟踪的前提是:在将变焦镜头移动到期望的变焦位置的过程中,从镜头部200到被摄体的距离固定。然而,该距离可能会在变焦位置变化的期间发生变化。例如,在摄像装置100搭载于UAV10等飞行体的情况下,由于在飞行体的飞行过程中的飞行体或被摄体的位置的变化,变焦跟踪的精度可能会降低。因此,根据本实施方式,即使在从镜头部200到被摄体的距离发生变化时,也会抑制变焦跟踪的精度的降低。
摄像控制部110具有确定部112、变焦位置控制部114和聚焦位置控制部116。确定部112在表示变焦镜头的位置的变焦位置发生变化时,确定用于维持对于与镜头部200的距离不变的被摄体的对焦状态的、表示聚焦镜头的位置的聚焦位置的第一变化量。确定部112确定用于维持对于与镜头部200的距离变化的被摄体的对焦状态的聚焦位置的第二变化量。确定部112为第一确定部和第二确定部的一个示例。变焦位置也可以未必直接表示构成变焦镜头的至少一个镜头在光轴方向上的物理位置。例如,在为了使变焦镜头移动而使用凸轮机构时,变焦位置也可以为与凸轮部件的位移位置相当的值。此外,在使用圆柱形凸轮来移动变焦镜头时,变焦位置也可以为与圆柱形凸轮的旋转角度相当的值。聚焦位置也可以未必直接表示构成聚焦镜头的至少一个镜头在光轴方向上的物理位置。例如,聚焦位置也可以为与用于驱动驱动机构的步进马达从基准旋转位置开始的步数相当的值,该驱动机构用于使聚焦镜头在光轴方向上移动。
根据用于使变焦镜头从第一变焦位置移动至第二变焦位置的指令,确定部112确定变焦镜头从第一变焦位置移动至第二变焦位置时的第一变化量。确定部112进一步基于变焦镜头移动至第二变焦位置之前摄像的多个图像,来确定第二变化量。确定部112可以基于包括变焦镜头处于第一变焦位置时摄像的第一图像和在第一图像之前摄像的第二图像的多个图像,来确定第二变化量。确定部112可以基于第一图像来确定表示从镜头部200到被摄体的距离的第一距离。确定部112可以基于第二图像来确定表示从镜头部200到被摄体的距离的第二距离。确定部112可以基于第一距离和第二距离来确定第二变化量。确定部112可以基于第一距离与第二距离之间的差,来确定第二变化量。
聚焦位置控制部116基于第一变化量和第二变化量来控制聚焦位置。变焦位置控制部114根据用于使变焦镜头从第一变焦位置移动至第二变焦位置的指令,来控制变焦镜头的位置。
在变焦跟踪中,聚焦位置控制部116除了根据基于变焦镜头的位置变化而确定的第一变化量之外,还根据基于到被摄体的距离变化而确定的第二变化量,来控制聚焦位置。由此,即使在到被摄体的距离发生变化的情况下,摄像控制部110也能够进行控制以良好地执行变焦跟踪。在变焦镜头从第一变焦位置移动至第二变焦位置之前,确定部112至少基于在第一变焦位置摄像的第一图像和在第一图像之前摄像的第二图像,来确定到被摄体的距离的变化量。确定部112基于该变化量,在变焦镜头移动至第二变焦位置之前,确定聚焦位置的第二变化量。聚焦位置控制部116在变焦镜头从第一变焦位置移动至第二变焦位置的期间,除了根据基于变焦镜头的位置变化的第一变化量之外,还能根据基于到被摄体的距离变化的第二变化量,来控制聚焦镜头的位置。因此,即使在到被摄体的距离发生变化的情况下,也能够抑制变焦跟踪的精度的降低。
在此,例如当从镜头部200到被摄体的距离比较长时,即使到被摄体的距离发生变化,也几乎不会影响变焦跟踪的精度。即,根据摄像装置100的摄像条件,有时也不一定需要根据基于到被摄体的距离变化的第二变化量来进行变焦跟踪。因此,在摄像装置100不满足预先决定的摄像条件时,聚焦位置控制部116可以基于第一变化量和第二变化量来控制聚焦位置。另一方面,在摄像装置100满足预先决定的摄像条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。
在不满足从镜头部200到地面的距离在预先决定的距离以上这一预先决定的摄像条件时,聚焦位置控制部116可以基于第一变化量和第二变化量来控制聚焦位置。在满足从镜头部200到地面的距离在预先决定的距离以上这一预先决定的摄像条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。从镜头部200到地面的距离例如可以基于来自设置于UAV主体20或摄像装置100中的红外线传感器的信号来确定。红外线传感器可以以使红外线沿竖直方向向下发射的方式设置在例如UAV主体20上。红外线传感器可以基于发射的红外线的反射光来检测从红外线传感器到反射面的距离,将该距离作为从镜头部200到地面的距离。
在不满足从镜头部200到地面的距离在预先决定的距离以上、并 且镜头部200的摄像方向包括预先决定的方向的成分这一预先决定的摄像条件时,聚焦位置控制部116可以基于第一变化量和第二变化量,来控制聚焦位置。在满足从镜头部200到地面的距离在预先决定的距离以上、并且镜头部200的摄像方向包括预先决定的方向的成分这一预先决定的摄像条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。预先决定的方向可以为摄像装置100能够对地面方向进行摄像的方向,例如可以为竖直方向向下。例如,摄像装置100搭载于UAV10,在UAV10的飞行过程中,通过空中摄影来对地表进行摄影。在这种情况下,到被摄体的距离长,因此即使到被摄体的距离发生变化,也几乎不会影响变焦跟踪的精度。因此,在满足上述那样的条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。另一方面,即使镜头部200的摄像方向包括例如竖直方向的成分,在摄像装置100对距离比较短的被摄体进行摄影时,到被摄体的距离的变化也可能会影响变焦跟踪的精度。例如,在使UAV10飞行、且摄像装置100进行肖像摄影时,到被摄体的距离的变化可能会影响变焦跟踪的精度。因此,在这样的条件下进行摄影时,聚焦位置控制部116也可以基于第一变化量和第二变化量来控制聚焦位置。
在不满足摄像装置100以预先决定的摄影模式动作这一预先决定的摄像条件时,聚焦位置控制部116可以基于第一变化量和第二变化量来控制聚焦位置。在满足以预先决定的摄影模式动作这一预先决定的摄像条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。例如,在摄像装置100以肖像模式等假定到被摄体距离较短的场景的、预先决定的摄影模式以外的摄影模式动作时,聚焦位置控制部116可以判断为不满足预先决定的摄像条件,而基于第一变化量和第二变化量来控制聚焦位置。预先决定的摄影模式可以为,对即使到被摄体的距离发生变化也几乎不会影响变焦跟踪的精度的场景进行摄影的任意的摄影模式。
在不满足从镜头部200到被摄体的距离在预先决定的距离以上这一预先决定的摄像条件时,聚焦位置控制部116可以基于第一变化量和第二变化量,来控制聚焦位置。在满足从镜头部200到被摄体的距离在预先决定的距离以上这一预先决定的摄像条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。
在不满足UAV10在飞行过程中这一预先决定的摄像条件时,聚焦位置控制部116可以基于第一变化量和第二变化量,来控制聚焦位置。在满足UAV10在飞行过程中这一预先决定的摄像条件时,聚焦位置控制部116可以不基于第二变化量而基于第一变化量来控制聚焦位置。
如上所述,在即使到被摄体的距离发生变化、也几乎不会影响变焦跟踪的精度的情况下,不根据基于到被摄体的距离变化的第二变化量进行变焦跟踪。由此,能够减小变焦跟踪控制中的处理负担。
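下面用一个假设性的Python示意来整理这类"是否需要基于第二变化量"的判断。其中的高度阈值、距离阈值、摄影模式名称以及各条件的组合方式均为示例性假设;实际可以如上文所述,采用其中任意一个条件或其组合。

```python
def use_second_change(height_to_ground_m, imaging_direction_is_downward,
                      subject_distance_m, shooting_mode, is_flying,
                      min_height_m=30.0, min_subject_distance_m=50.0):
    """不满足预先决定的摄像条件时返回True(需要同时使用第二变化量)。

    阈值、模式名称与条件的组合方式均为假设,仅用于示意。
    """
    condition_met = (
        (height_to_ground_m >= min_height_m and imaging_direction_is_downward)
        or subject_distance_m >= min_subject_distance_m
        or shooting_mode != "portrait"   # 假设:肖像模式以外视为假定远距离场景的预先决定的摄影模式
        or is_flying
    )
    return not condition_met
```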
图4是示出摄像装置100的变焦控制顺序的一个示例的流程图。摄像控制部110接收基于用户操作使变焦镜头移动的变焦指令(S100)。摄像控制部110根据该变焦指令,开始变焦跟踪动作。进一步地,摄像控制部110判定是否满足可以无需考虑到被摄体的距离的变化而进行变焦跟踪动作的预先决定的摄像条件(S102)。在不满足预先决定的摄像条件时,摄像控制部110判断为应该考虑到被摄体的距离的变化来执行变焦跟踪动作。然后,摄像控制部110从已摄像的图像中导出主被摄体的区域的变化量(S104)。摄像控制部110首先从图像中确定主被摄体的区域。例如,摄像控制部110可以将存在于图像的中心的被摄体判断为主被摄体,来确定主被摄体的区域。摄像控制部110也可以将在预先决定的期间内动作最大的被摄体判断为主被摄体,来确定主被摄体的区域。摄像控制部110可以将在第一时间点摄像的图像的主被摄体的区域与在第一时间点之前的第二时间点摄像的图像的主被摄体的区域之间的差分,作为主被摄体的区域的变化量导出。
接着,摄像控制部110判定该变化量是否在预先决定的阈值以上(S106)。如果该变化量小于预先决定的阈值,则摄像控制部110判断为到被摄体的距离的变化对变焦跟踪的精度的影响小,并返回步骤S102。如果该变化量在预先决定的阈值以上,则摄像控制部110判断为到被摄体的距离的变化有可能影响变焦跟踪的精度,并导出聚焦位置的第二变化量(S108)。
当变焦跟踪动作开始时,摄像控制部110保持当前的变焦位置(S110)。摄像控制部110可以从镜头控制部220获取表示由位置传感器214检测到的变焦镜头的当前的变焦位置的信息,并将该变焦位置保持为当前的变焦位置。进一步地,摄像控制部110基于变焦指令来导出目标变焦位置(S112)。摄像控制部110可以将由变焦指令示出的变焦位置确定为目标变焦位置。摄像控制部110可以根据由变焦指令示出的变焦镜头的移动量和当前的变焦位置,来确定目标变焦位置。
接着,摄像控制部110基于变焦位置的变化,来导出与要移动的聚焦镜头的移动量相当的聚焦位置的第一变化量(S114)。聚焦位置控制部116决定聚焦位置的变化量(S116)。在导出第二变化量的情况下,聚焦位置控制部116可以通过将第二变化量加到第一变化量上,来决定聚焦位置的变化量。在未导出第二变化量的情况下,聚焦位置控制部116可以将第一变化量决定为聚焦位置的变化量。或者,聚焦位置控制部116可以将第一变化量和第二变化量中的一方决定为聚焦位置的变化量,之后,聚焦位置控制部116可以将第一变化量和第二变化量中的另一方决定为聚焦位置的变化量。聚焦位置控制部116通过镜头控制部220,基于所决定的变化量来驱动聚焦镜头(S118)。聚焦位置控制部116可以基于第一变化量和第二变化量,将聚焦镜头的聚焦位置一次性改变为期望的聚焦位置。或者,聚焦位置控制部116可以在先基于第一变化量和第二变化量中的一方改变聚焦镜头的聚焦位置后,进一步基于第一变化量和第二变化量中的另一方来改变聚焦镜头的聚焦位置。即,聚焦位置控制部116也可以基 于第一变化量和第二变化量,将聚焦镜头的聚焦位置阶段性地改变为期望的聚焦位置。
另一方面,变焦位置控制部114根据当前的变焦位置和目标变焦位置,来决定变焦位置的变化量(S120)。变焦位置控制部114通过镜头控制部220,基于所决定的变焦位置的变化量来驱动变焦镜头(S122)。
摄像控制部110判定变焦位置是否已经到达按照变焦指令确定的最终变焦位置(S124)。如果变焦位置没有到达按照变焦指令确定的最终变焦位置,则摄像控制部110重复步骤S110以后的处理。另一方面,如果已经到达最终变焦位置,则摄像控制部110结束处理。
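图4的控制顺序可以概括为如下的Python风格示意代码。这里把流程简化为线性顺序,省略了S106判定不成立时返回S102的循环;camera对象上的各方法(如derive_first_change、derive_second_change等)只是对上文各步骤的假设性占位,并非实际接口。

```python
def zoom_control(camera, zoom_command):
    """图4变焦控制顺序的示意。camera上的各方法均为假设性占位。"""
    second_change = 0.0
    if not camera.predetermined_condition_met():                  # S102
        region_change = camera.main_subject_region_change()       # S104
        if region_change >= camera.region_change_threshold:       # S106
            second_change = camera.derive_second_change()         # S108

    while True:
        current_zoom = camera.current_zoom_position()             # S110
        target_zoom = camera.target_zoom_position(zoom_command)   # S112
        first_change = camera.derive_first_change(current_zoom, target_zoom)  # S114
        camera.drive_focus_lens(first_change + second_change)     # S116、S118
        camera.drive_zoom_lens(target_zoom - current_zoom)        # S120、S122
        if camera.reached_final_zoom_position(zoom_command):      # S124
            break
```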
图5是示出基于到被摄体的距离的变化的聚焦位置的第二变化量的导出顺序的一个示例的流程图。图6和图7是用于对镜头部200的光学系统中的被摄体距离L1、焦点距离f等进行说明的图。
确定部112对于在以前帧中摄像的图像确定像面上的被摄体尺寸y1。如图6所示,确定部112确定通过镜头210成像的像面上的被摄体502的尺寸(S200)。确定部112导出以前帧的图像倍率α1(S202)。图像倍率α1基于以前帧中的变焦位置和聚焦位置唯一地导出。确定部112可以基于表示变焦位置和聚焦位置与图像倍率的关系的预先决定的信息,来导出图像倍率α1(=IM1:设计值图像倍率)。
进一步地,确定部112导出被摄体距离L1和实际的被摄体尺寸Y(S204)。确定部112可以按照式L1=f/IM1,来导出被摄体距离L1。进一步地,确定部112可以按照式Y=y1×L1/f,来导出被摄体尺寸Y。
随后,如图7所示,确定部112对于在当前帧中摄像的图像确定通过镜头210成像的像面上的被摄体504的被摄体尺寸y2(S206)。确定部112基于当前帧中的变焦位置和聚焦位置,来导出当前帧中的图像倍率α2(=IM2:设计值图像倍率)(S208)。进一步地,确定部112导出被摄体距离L2和实际的被摄体尺寸Y(S210)。确定部112可以按照式L2=f/IM2,来导出被摄体距离L2。确定部112可以按照式Y=y2×L2/f,来导出被摄体尺寸Y。
确定部112导出被摄体距离L的变化量ΔL(S212)。确定部112可以按照式ΔL=L1-L2,来导出变化量ΔL。进一步地,确定部112可以导出像面变化量Δx。确定部112可以按照式Δx=L1×(1-(IM1/IM2))×IM2²,来导出像面变化量Δx。确定部112可以基于像面变化量Δx,来导出聚焦位置的第二变化量。确定部112可以通过将像面变化量Δx乘以预先决定的系数K,来导出聚焦位置的第二变化量。系数K是为了将像面位置的变化换算为聚焦位置的变化而预先决定的系数。
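上述S200~S212的计算可以整理为如下Python示意。式中的记号沿用上文:IM1、IM2为由变焦位置和聚焦位置导出的设计值图像倍率,f为焦点距离,K为预先决定的系数;函数的输入假定已按上文方式求得,数值本身不在本说明书记载范围内。

```python
def second_change_amount(im1, im2, f, k):
    """根据以前帧与当前帧的图像倍率导出聚焦位置的第二变化量(S200~S212的示意)。"""
    l1 = f / im1                                   # 以前帧的被摄体距离:L1 = f / IM1(S204)
    l2 = f / im2                                   # 当前帧的被摄体距离:L2 = f / IM2(S210)
    delta_l = l1 - l2                              # 被摄体距离的变化量:ΔL = L1 - L2(S212)
    delta_x = l1 * (1.0 - im1 / im2) * im2 ** 2    # 像面变化量:Δx = L1×(1-IM1/IM2)×IM2²
    return delta_l, k * delta_x                    # 第二变化量 = K×Δx
```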
确定部112可以基于在当前帧中摄像的图像和在以前帧中摄像的至少一个图像,来预测在第二变焦位置摄像的未来帧的图像中的被摄体的位置,并根据该预测导出聚焦位置的第二变化量。
图8是示出基于变焦位置的变化的聚焦位置的第一变化量的导出顺序的一个示例的流程图。
确定部112获取当前的变焦位置(S300)。确定部112可以通过镜头控制部220来获取变焦镜头的当前的变焦位置。确定部112获取当前的聚焦位置(S302)。确定部112可以通过镜头控制部220来获取聚焦镜头的当前的聚焦位置。确定部112导出当前的变焦位置处的驱动范围(S304)。如图9所示,确定部112可以确定与无穷远端的变焦跟踪曲线400上的当前的变焦位置对应的聚焦位置、和与最近端的变焦跟踪曲线402上的当前的变焦位置对应的聚焦位置之间的驱动范围404。进一步地,确定部112导出聚焦位置相对于当前的变焦位置处的驱动范围的比率(S306)。确定部112导出例如图9所示的驱动范围404中的比率b/a。
进一步地,确定部112基于变焦指令导出目标变焦位置(S308)。确定部112导出目标变焦位置处的驱动范围(S310)。确定部112导出例如图9所示的目标变焦位置处的驱动范围406。确定部112根据目标变焦位置处的比率和驱动范围导出目标聚焦位置。确定部112可以根据例如图9所示的比率b/a和驱动范围406导出目标聚焦位置。确定部112根据当前的聚焦位置和目标聚焦位置导出聚焦位置的第一变化量(S314)。
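上述S300~S314的处理可以用如下Python示意,其中沿用前面草图中定义的focus_from_curve插值函数;比率b/a按"当前聚焦位置在无穷远端与最近端两条曲线之间所占的线性比例"来理解,这一理解方式以及曲线数据均为假设,具体以图9为准。

```python
def first_change_amount(curve_inf, curve_near, current_zoom, current_focus, target_zoom):
    """按驱动范围比率导出聚焦位置的第一变化量(S300~S314的示意)。"""
    inf_now = focus_from_curve(curve_inf, current_zoom)       # 无穷远端曲线上的聚焦位置
    near_now = focus_from_curve(curve_near, current_zoom)     # 最近端曲线上的聚焦位置
    ratio = (current_focus - inf_now) / (near_now - inf_now)  # 当前驱动范围中的比率 b/a(S306)

    inf_target = focus_from_curve(curve_inf, target_zoom)     # 目标变焦位置处的驱动范围(S310)
    near_target = focus_from_curve(curve_near, target_zoom)
    target_focus = inf_target + ratio * (near_target - inf_target)  # 目标聚焦位置

    return target_focus - current_focus                       # 第一变化量(S314)
```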
确定部112可以根据与到预先定义的被摄体的至少两个距离对应的变焦跟踪曲线,导出与到当前的被摄体的距离对应的变焦跟踪曲线。进一步地,确定部112可以根据导出的变焦跟踪曲线,导出到被摄体的距离不变时的目标聚焦位置。
例如,如图10所示,变焦镜头从第一变焦位置移动至第二变焦位置。在这种情况下,确定部112按照由第一变焦位置和当前的聚焦位置确定的变焦跟踪曲线401,确定临时聚焦位置(1),并确定第一变化量412。然而,实际上有时到被摄体的距离已变化。因此,确定部112基于到被摄体的距离的变化量,确定聚焦位置的第二变化量414。确定部112可以基于在第一变焦位置摄像的第一图像和在第一图像之前摄像的第二图像,确定到被摄体的距离的变化量,并确定聚焦位置的第二变化量414。在变焦镜头从第一变焦位置移动至第二变焦位置的期间,聚焦位置控制部116使聚焦镜头移动将第一变化量412和第二变化量414加起来的变化量416,从而移动至第一聚焦位置。在变焦镜头从第一变焦位置移动至第二变焦位置的期间,聚焦位置控制部116可以使聚焦镜头一次性移动变化量416。此外,在从第一变焦位置移动至第二变焦位置的期间,聚焦位置控制部116可以在使聚焦镜头移动第一变化量412后,进一步使聚焦镜头移动第二变化量414。
进一步地,变焦镜头从第二变焦位置移动至第三变焦位置。在这种情况下,确定部112按照变焦跟踪曲线402,确定临时聚焦位置(2),并确定第一变化量422。进一步地,确定部112基于到被摄体的距离的变化量,确定聚焦位置的第二变化量424。在从第二变焦位置移动至第三变焦位置的期间,聚焦位置控制部116可以使聚焦镜头移动将第一变化量422和第二变化量424加起来的变化量426,使聚焦镜头移动至第二聚焦位置。通过将第一变化量422和第二变化量424加起来一次性移动聚焦镜头,能够减小聚焦镜头的移动量。另外,在从第二变焦位置移动至第三变焦位置的期间,聚焦位置控制部116可以通过在使聚焦镜头移动第一变化量422后、使聚焦镜头移动第二变化量424,从而使聚焦镜头移动至第二聚焦位置。
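对于上述"一次性移动加总后的变化量"与"分两步先后移动第一变化量、第二变化量"这两种方式,可用如下简短的Python示意加以对比。其中drive_focus_lens为假设的聚焦镜头驱动接口,仅用于示意。

```python
def move_focus(drive_focus_lens, first_change, second_change, staged=False):
    """示意:一次性移动加总后的变化量,或分两步先后移动两个变化量。"""
    if staged:
        drive_focus_lens(first_change)    # 先按第一变化量移动
        drive_focus_lens(second_change)   # 再按第二变化量移动
    else:
        drive_focus_lens(first_change + second_change)  # 一次性移动(如变化量416、426)
```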
根据本实施方式所涉及的摄像装置100,即使在从镜头部200到被摄体的距离发生变化的情况下,也能够基于到被摄体的距离的变化量,在变焦跟踪过程中调整聚焦位置,因此能够抑制变焦跟踪精度的降低。
图11示出表示整体地或部分地具体实现本发明的多个方式的计算机1200的一例。计算机1200中安装的程序能使计算机1200作为本发明的实施方式所涉及的装置的相关操作或该装置的一个或多个“部”而发挥功能。或者,该程序能使计算机1200执行该操作或该一个或多个“部”。该程序能使计算机1200执行本发明的实施方式所涉及的过程或该过程的阶段。这种程序为了使计算机1200执行本说明书中记载的流程图及框图的框中的若干个或全部所相关的确定的操作,可由CPU1212执行。
本实施方式中的计算机1200包含CPU1212及RAM1214,CPU1212及RAM1214通过主机控制器1210彼此连接。计算机1200还包含通信接口1222、输入/输出单元,通信接口1222与输入/输出单元经由输入/输出控制器1220而连接于主机控制器1210。计算机1200还包含ROM1230。CPU1212按照ROM1230及RAM1214内储存的程序进行动作,由此控制各单元。
通信接口1222通过网络而与其他电子设备进行通信。硬盘驱动器可储存供计算机1200内的CPU1212使用的程序及数据。ROM1230中储存激活时由计算机1200执行的引导程序等及/或依赖于计算机1200硬件的程序。可通过CD-ROM、USB存储器或IC卡等计算机可读记录介质或网络提供程序。程序被安装于也为计算机可读记录介质的示例的RAM1214或ROM1230中且由CPU1212执行。这些程序内描述的信息处理被计算机1200读取,使程序与所述各种类型的硬件资源之间实现协作。装置或方法可通过利用计算机1200实现信息的操作或处理来构成。
例如,当在计算机1200及外部设备之间执行通信时,CPU1212可执行RAM1214上载入的通信程序,根据通信程序中描述的处理,命令通信接口1222进行通信处理。通信接口1222在CPU1212的控制下,读取RAM1214或USB存储器等记录介质内所提供的发送缓冲区中储存的发送数据,将读取的发送数据发送到网络,或将通过网络接收的接收数据写入到记录介质上所提供的接收缓冲区等内。
而且,CPU1212可将USB存储器等外部记录介质中储存的全部或需要的一部分文件或数据库读取到RAM1214中,且对RAM1214上的数据执行各种类型的处理。接着,CPU1212可将经过处理的数据回写到外部记录介质内。
各种类型的程序、数据、表格及数据库等各种类型的信息可储存在记录介质中,并接受信息处理。CPU1212可对于从RAM1214读取的数据执行各种类型的处理,且将结果回写到RAM1214内,所述各种类型的处理包含本公开中随处记载的、由程序的指令序列所指定的各种类型的操作、信息处理、条件判断、条件分支、无条件分支、信息的检索/置换等。而且,CPU1212可检索记录介质内的文件、数据库等中的信息。例如,当记录介质内储存分别具有与第二属性的属性值相关的第一属性的属性值的多个条目时,CPU1212可从该多个条目中检索出第一属性的属性值所指定的与条件一致的条目,读取该条目内储存的第二属性的属性值,由此获取满足预先确定的条件的与第一属性相关的第二属性的属性值。
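上述条目检索处理可以用如下Python示意:从多个条目中找出第一属性的属性值满足指定条件的条目,并读取其中储存的第二属性的属性值。条目的数据结构与条件均为虚构示例。

```python
def lookup_second_attribute(entries, condition):
    """返回第一属性的属性值满足condition的各条目中储存的第二属性的属性值。"""
    return [entry["second"] for entry in entries if condition(entry["first"])]

# 虚构示例:条目以字典表示
entries = [{"first": 10, "second": "a"}, {"first": 25, "second": "b"}]
values = lookup_second_attribute(entries, lambda v: v >= 20)   # -> ["b"]
```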
上文说明的程序或软件模块可储存在计算机1200上或计算机1200近旁的计算机可读存储介质中。而且,连接于专用通信网络或互联网的服务器系统内所提供的硬盘或RAM等记录介质可用作计算机可读存储介质,这样,程序可通过网络提供给计算机1200。
以上通过实施方式对本发明进行了说明,但是本发明的技术范围并不限于上述实施方式所记载的范围。对本领域普通技术人员来说,显然可对上述实施方式加以各种变更或改良。从权利要求书的记载即可明白,加以了这样的变更或改良的方式也都可包含在本发明的技术范围之内。
要注意的是,权利要求书、说明书、以及说明书附图中所示的装置、系统、程序以及方法中的动作、顺序、步骤、以及阶段等各项处理的执行顺序,只要没有特别明示“在...之前”、“事先”等,而且只要前面处理的输出并不用在后面的处理中,则可以以任意顺序实现。关于权利要求书、说明书以及说明书附图中的动作流程,为方便起见而使用“首先”、“接着”等进行了说明,但并不意味着必须按照这样的顺序实施。
符号说明
10 UAV
20 UAV主体
30 UAV控制部
32 内存
36 通信接口
40 推进部
41 GPS接收器
42 惯性测量装置
43 磁罗盘
44 气压高度计
45 温度传感器
46 湿度传感器
50 云台
60 摄像装置
100 摄像装置
102 摄像部
110 摄像控制部
112 确定部
114 变焦位置控制部
116 聚焦位置控制部
120 图像传感器
130 内存
200 镜头部
210 镜头
212 镜头驱动部
214 位置传感器
220 镜头控制部
222 内存
300 远程操作装置
1200 计算机
1210 主机控制器
1212 CPU
1214 RAM
1220 输入/输出控制器
1222 通信接口
1230 ROM

Claims (17)

  1. 一种控制装置,其对具备变焦镜头和聚焦镜头的镜头装置进行控制,其具备:
    第一确定部,其在表示所述变焦镜头的位置的变焦位置发生变化时,为了维持对于与所述镜头装置的距离不变的被摄体的对焦状态、确定表示所述聚焦镜头的位置的聚焦位置的第一变化量;
    第二确定部,其为了维持对于与所述镜头装置的距离变化的被摄体的对焦状态,确定所述聚焦位置的第二变化量;以及
    控制部,其基于所述第一变化量和所述第二变化量来控制所述聚焦位置。
  2. 如权利要求1所述的控制装置,其中,
    根据用于使所述变焦镜头从第一变焦位置移动至第二变焦位置的指令,
    所述第一确定部确定所述变焦镜头从所述第一变焦位置移动至所述第二变焦位置时的所述第一变化量,
    所述第二确定部基于所述变焦镜头移动至所述第二变焦位置之前摄像的多个图像,来确定所述第二变化量。
  3. 如权利要求2所述的控制装置,其中,
    所述第二确定部基于包括所述变焦镜头处于所述第一变焦位置时摄像的第一图像和在所述第一图像之前摄像的第二图像的所述多个图像,来确定所述第二变化量。
  4. 如权利要求3所述的控制装置,其中,
    所述第二确定部基于所述第一图像确定表示从所述镜头装置到所述被摄体的距离的第一距离,基于所述第二图像确定表示从所述镜头装置到所述被摄体的距离的第二距离,并基于所述第一距离和所述第二距离,来确定所述第二变化量。
  5. 如权利要求4所述的控制装置,其中,
    所述第二确定部基于所述第一距离与所述第二距离的差,来确定所述第二变化量。
  6. 如权利要求1所述的控制装置,其中,
    所述第一确定部,基于表示用于维持对于与所述镜头装置的距离不变的被摄体的对焦状态的表示所述聚焦镜头的位置的聚焦位置与所述变焦镜头的变焦位置的关系的信息,来确定所述第一变化量。
  7. 如权利要求1所述的控制装置,其中,
    所述控制部,在所述镜头装置不满足预先决定的摄像条件时,基于所述第一变化量和所述第二变化量来控制所述聚焦位置,在所述镜头装置满足预先决定的摄像条件时,不基于所述第二变化量而基于所述第一变化量来控制所述聚焦位置。
  8. 如权利要求7所述的控制装置,其中,
    所述控制部,在不满足从所述镜头装置到地面的距离在预先决定的距离以上这一所述预先决定的摄像条件时,基于所述第一变化量和所述第二变化量来控制所述聚焦位置,在满足从所述镜头装置到地面的距离在预先决定的距离以上这一所述预先决定的摄像条件时,不基于所述第二变化量而基于所述第一变化量来控制所述聚焦位置。
  9. 如权利要求7所述的控制装置,其中,
    所述控制部,在不满足从所述镜头装置到地面的距离在预先决定的距离以上、并且所述镜头装置的摄像方向包括预先决定的方向的成分这一所述预先决定的摄像条件时,基于所述第一变化量和所述第二变化量来控制所述聚焦位置,在满足从所述镜头装置到地面的距离在预先决定的距离以上、并且所述镜头装置的摄像方向包括预先决定的方向的成分这一所述预先决定的摄像条件时,不基于所述第二变化量而基于所述第一变化量来控制所述聚焦位置。
  10. 如权利要求7所述的控制装置,其中,
    所述控制部,在不满足从所述镜头装置到所述被摄体的距离在预先决定的距离以上这一所述预先决定的摄像条件时,基于所述第一变化量和所述第二变化量来控制所述聚焦位置,在满足从所述镜头装置到所述被摄体的距离在预先决定的距离以上这一所述预先决定的摄像条件时,不基于所述第二变化量而基于所述第一变化量来控制所述聚焦位置。
  11. 如权利要求7所述的控制装置,其中,
    所述控制部,在不满足所述镜头装置以预先决定的摄影模式动作这一所述预先决定的摄像条件时,基于所述第一变化量和所述第二变化量来控制所述聚焦位置,在满足所述镜头装置以所述预先决定的摄影模式动作这一所述预先决定的摄像条件时,不基于所述第二变化量而基于所述第一变化量来控制所述聚焦位置。
  12. 如权利要求7所述的控制装置,其中,
    所述镜头装置搭载在飞行体上,
    所述控制部在不满足所述飞行体在飞行过程中这一所述预先决定的摄像条件时,基于所述第一变化量和所述第二变化量来控制所述聚焦位置,在满足所述飞行体在飞行过程中这一所述预先决定的摄像条件时,不基于所述第二变化量而基于所述第一变化量来控制所述聚焦位置。
  13. 一种摄像装置,其具备:
    权利要求1至12中任一项所述的控制装置;以及
    所述镜头装置。
  14. 一种飞行体,其具备权利要求13所述的摄像装置进行飞行。
  15. 如权利要求14所述的飞行体,其中,
    其进一步具备可旋转地支持所述摄像装置的支持机构。
  16. 一种控制方法,其是对具备变焦镜头和聚焦镜头的镜头装置进行控制的控制方法,其具备以下阶段:
    在表示所述变焦镜头的位置的变焦位置发生变化时,确定用于维持对于与所述镜头装置的距离不变的被摄体的对焦状态的、表示所述聚焦镜头位置的聚焦位置的第一变化量的阶段;
    确定用于维持对于与所述镜头装置的距离变化的被摄体的对焦状态的所述聚焦位置的第二变化量的阶段;以及
    基于所述第一变化量和所述第二变化量来控制所述聚焦位置的阶段。
  17. 一种程序,其是使计算机对具备变焦镜头和聚焦镜头的镜头装置进行控制的程序,其用于使所述计算机执行以下阶段:
    在表示所述变焦镜头的位置的变焦位置发生变化时,确定用于维持对于与所述镜头装置的距离不变的被摄体的对焦状态的、表示所述聚焦镜头的位置的聚焦位置的第一变化量的阶段;
    确定用于维持对于与所述镜头装置的距离变化的被摄体的对焦状态的所述聚焦位置的第二变化量的阶段;以及
    基于所述第一变化量和所述第二变化量来控制所述聚焦位置的阶段。
PCT/CN2017/117981 2017-09-28 2017-12-22 控制装置、摄像装置、飞行体、控制方法以及程序 WO2019061887A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780065011.XA CN109844634B (zh) 2017-09-28 2017-12-22 控制装置、摄像装置、飞行体、控制方法以及程序

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017189081A JP6543875B2 (ja) 2017-09-28 2017-09-28 制御装置、撮像装置、飛行体、制御方法、プログラム
JP2017-189081 2017-09-28

Publications (1)

Publication Number Publication Date
WO2019061887A1 true WO2019061887A1 (zh) 2019-04-04

Family

ID=65900493

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/117981 WO2019061887A1 (zh) 2017-09-28 2017-12-22 控制装置、摄像装置、飞行体、控制方法以及程序

Country Status (3)

Country Link
JP (1) JP6543875B2 (zh)
CN (1) CN109844634B (zh)
WO (1) WO2019061887A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021031833A1 (zh) * 2019-08-21 2021-02-25 深圳市大疆创新科技有限公司 控制装置、摄像系统、控制方法以及程序

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111086636B (zh) * 2020-01-09 2023-05-12 海南中农航服科技有限公司 一种航拍无人机的摄像装置

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100573302C (zh) * 2006-03-14 2009-12-23 致伸科技股份有限公司 数码相机的变焦追踪方法
US20100290772A1 (en) * 2009-05-15 2010-11-18 Sanyo Electric Co., Ltd. Electronic camera
CN103019002A (zh) * 2012-02-21 2013-04-03 深圳市阿格斯科技有限公司 光学变焦相机的缩放跟踪自动焦点控制装置及其控制方法
JP2013130827A (ja) * 2011-12-22 2013-07-04 Canon Inc レンズ制御装置
JP2015212746A (ja) * 2014-05-02 2015-11-26 キヤノン株式会社 ズームトラッキング制御装置、ズームトラッキング制御プログラム、光学機器および撮像装置
CN105227835A (zh) * 2015-09-11 2016-01-06 浙江宇视科技有限公司 一种辅助聚焦方法和装置
CN105635571A (zh) * 2015-12-24 2016-06-01 广东欧珀移动通信有限公司 拍照控制方法、拍照控制装置及拍照系统
CN106341598A (zh) * 2016-09-14 2017-01-18 北京环境特性研究所 光学镜头自动聚焦系统及方法

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4986346B2 (ja) * 2001-08-09 2012-07-25 パナソニック株式会社 撮像装置のレンズ駆動方法及び撮像装置並びにカメラシステム
US7415200B2 (en) * 2003-10-14 2008-08-19 Canon Kabushiki Kaisha Imaging device
CN1932632A (zh) * 2005-09-15 2007-03-21 乐金电子(中国)研究开发中心有限公司 移动通信终端中的镜头自动调焦方法及装置
JP2008046351A (ja) * 2006-08-16 2008-02-28 Canon Inc 自動焦点調節装置および撮像装置
JP5888837B2 (ja) * 2010-06-30 2016-03-22 キヤノン株式会社 撮像装置及び撮像装置の制御方法
JP5953187B2 (ja) * 2011-10-11 2016-07-20 オリンパス株式会社 合焦制御装置、内視鏡システム及び合焦制御方法
CN104040404B (zh) * 2012-01-13 2016-11-09 佳能株式会社 镜头单元及其控制方法和摄像设备及其控制方法
JP6659100B2 (ja) * 2015-08-06 2020-03-04 キヤノン株式会社 撮像装置
JP6685713B2 (ja) * 2015-12-15 2020-04-22 キヤノン株式会社 撮像システムおよびその制御方法、通信装置、移動撮像装置

Also Published As

Publication number Publication date
JP2019066556A (ja) 2019-04-25
CN109844634B (zh) 2021-08-17
JP6543875B2 (ja) 2019-07-17
CN109844634A (zh) 2019-06-04

Similar Documents

Publication Publication Date Title
WO2019120082A1 (zh) 控制装置、系统、控制方法以及程序
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
JP2019216343A (ja) 決定装置、移動体、決定方法、及びプログラム
JP2020012878A (ja) 制御装置、移動体、制御方法、及びプログラム
US20200132963A1 (en) Control apparatus, lens apparatus, photographic apparatus, flying body, and control method
US20210105411A1 (en) Determination device, photographing system, movable body, composite system, determination method, and program
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
WO2019061887A1 (zh) 控制装置、摄像装置、飞行体、控制方法以及程序
CN111630838B (zh) 确定装置、摄像装置、摄像系统、移动体、确定方法以及程序
WO2019174343A1 (zh) 活动体检测装置、控制装置、移动体、活动体检测方法及程序
WO2018185940A1 (ja) 撮像制御装置、撮像装置、撮像システム、移動体、撮像制御方法、及びプログラム
JP6641574B1 (ja) 決定装置、移動体、決定方法、及びプログラム
WO2020020042A1 (zh) 控制装置、移动体、控制方法以及程序
WO2020011198A1 (zh) 控制装置、移动体、控制方法以及程序
JP6569157B1 (ja) 制御装置、撮像装置、移動体、制御方法、及びプログラム
CN111213369B (zh) 控制装置、方法、摄像装置、移动体以及计算机可读存储介质
JP6878738B1 (ja) 制御装置、撮像システム、移動体、制御方法、及びプログラム
JP7009698B1 (ja) 制御装置、撮像装置、制御方法、及びプログラム
JP6896963B1 (ja) 制御装置、撮像装置、移動体、制御方法、及びプログラム
WO2019085794A1 (zh) 控制装置、摄像装置、飞行体、控制方法以及程序
JP6413170B1 (ja) 決定装置、撮像装置、撮像システム、移動体、決定方法、及びプログラム
WO2020125414A1 (zh) 控制装置、摄像装置、摄像系统、移动体、控制方法以及程序

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17926969

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17926969

Country of ref document: EP

Kind code of ref document: A1