WO2020125414A1 - Control device, imaging device, imaging system, moving body, control method, and program - Google Patents

Control device, imaging device, imaging system, moving body, control method, and program

Info

Publication number
WO2020125414A1
WO2020125414A1 (PCT/CN2019/122976)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
imaging device
focus
lens
imaging
Prior art date
Application number
PCT/CN2019/122976
Other languages
English (en)
French (fr)
Inventor
本庄谦一
安田知长
高宫诚
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to CN201980009116.2A, published as CN111615663A
Publication of WO2020125414A1

Classifications

    • G02B 7/08: Mountings for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism (G: PHYSICS; G02: OPTICS; G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS)
    • G02B 7/28: Systems for automatic generation of focusing signals
    • G03B 13/00: Viewfinders; focusing aids for cameras; means for focusing for cameras; autofocus systems for cameras (G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM)
    • G03B 13/32: Means for focusing
    • G03B 13/34: Power focusing
    • G03B 13/36: Autofocus systems
    • G03B 15/00: Special procedures for taking photographs; apparatus therefor
    • G03B 17/00: Details of cameras or camera bodies; accessories therefor
    • G03B 17/56: Accessories
    • G03B 37/00: Panoramic or wide-screen photography; photographing extended surfaces, e.g. for surveying; photographing internal surfaces, e.g. of pipe
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof (H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION)
    • H04N 23/60: Control of cameras or camera modules
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; studio devices; studio equipment

Definitions

  • the invention relates to a control device, an imaging device, an imaging system, a moving body, a control method and a program.
  • Patent Literature 1 discloses a distance measuring device that derives the distance to a subject based on the time when directional light is emitted and the time when reflected light is received.
  • Patent Document 1: International Publication No. WO 2015/166713
  • the control device may include an acquisition unit that acquires, via a distance measuring sensor, a subject distance indicating the distance from the imaging device to the subject.
  • the control device may include a specifying unit that specifies the position of the focus lens included in the imaging device.
  • the control device may include a determination unit that determines the moving direction of the focus lens based on the subject distance and the position of the focus lens.
  • the control device may include a control unit that controls the position of the focus lens based on the image captured by the imaging device after moving the focus lens in the moving direction.
  • when the subject distance is longer than the focusing distance corresponding to the position of the focus lens, the determining section determines the moving direction of the focus lens as the direction of infinity.
  • when the subject distance is shorter than the focusing distance corresponding to the position of the focus lens, the determining section determines the moving direction of the focus lens as the direction of the closest end.
  • control section may control the position of the focus lens based on the subject distance instead of controlling the position of the focus lens based on the image captured by the imaging device.
  • control section may control the position of the focus lens based on the image captured by the imaging device.
  • the acquiring unit may acquire the respective subject distances of the plurality of ranging areas through the ranging sensor.
  • the determination unit may determine the moving direction of the focus lens based on the position of the focus lens and on the subject distance of, among the plurality of distance measurement areas, the distance measurement area corresponding to the focus area within the imaging area of the imaging device.
  • the acquisition unit may also acquire the temperature of the imaging device through the temperature sensor.
  • the determining unit may determine the focusing distance corresponding to the position of the focus lens specified by the specifying unit, based on a relationship, predetermined according to the temperature of the imaging device, between the distance from the image-side focal point of the imaging device to the imaging surface and the distance from the object-side focal point of the imaging device to the focus point.
  • the ranging sensor may be a TOF sensor.
  • An imaging device may include the above-described control device and focusing lens.
  • An imaging system may include the above-described imaging device, a distance measuring sensor, and a support mechanism that supports the imaging device so that the attitude of the imaging device can be adjusted.
  • when the posture of the imaging device is controlled by the support mechanism so that the distance measurement area of the distance measuring sensor is included in the imaging area of the imaging device, the control unit may control the position of the focus lens based on the image captured by the imaging device after moving the focus lens in the moving direction.
  • the moving body according to one aspect of the present invention may include the above-described imaging system and may move while carrying the imaging system.
  • the acquiring unit may acquire the subject distance through the ranging sensor when the moving body is not moving.
  • the moving body may be a flying body.
  • the acquiring unit may acquire the subject distance through the distance measuring sensor when the flying body is hovering.
  • the control method may include a step of acquiring a subject distance by a distance measuring sensor, the subject distance representing the distance from the imaging device to the subject.
  • the control method may include a step of determining the position of the focus lens included in the imaging device.
  • the control method may include a step of determining the moving direction of the focus lens based on the subject distance and the position of the focus lens.
  • the control method may include a step of controlling the position of the focus lens based on the image captured by the imaging device after moving the focus lens in the moving direction.
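Taken together, the four steps of the control method above can be sketched as one routine. The sketch below is a hypothetical illustration, not code from the publication: the helper names (`focusing_distance`, `decide_direction`, `control_focus`) and the linear position-to-distance mapping are assumptions.

```python
# Hypothetical sketch of the claimed control method: measure the subject
# distance, read the focus lens position, pick a movement direction, then
# refine with image-based (contrast) focus control.

def focusing_distance(lens_position):
    # Assumed monotonic mapping from lens position to focusing distance
    # (cf. FIG. 4); a linear stand-in used only for illustration.
    return 0.5 + 0.1 * lens_position

def decide_direction(subject_distance, lens_position):
    """Return +1 (toward infinity) or -1 (toward the closest end)."""
    if subject_distance > focusing_distance(lens_position):
        return +1   # subject farther than current focus -> infinity side
    return -1       # subject nearer -> closest-end side

def control_focus(subject_distance, lens_position):
    direction = decide_direction(subject_distance, lens_position)
    # The fourth step would move the lens in `direction` and run contrast
    # AF on captured images; here we just report the chosen direction.
    return direction
```

The point of the direction decision is that the subsequent image-based search only has to scan one side of the current lens position.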
  • the program according to one aspect of the present invention may be a program for causing a computer to function as a control device.
  • with this, focus control can be performed more efficiently using the distance measuring sensor.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device;
  • FIG. 2 is a diagram showing an example of the functional blocks of the unmanned aerial vehicle;
  • FIG. 3 is a diagram for explaining the ranging range of the distance measuring sensor;
  • FIG. 4 is a diagram showing an example of the relationship between the position of the focus lens and the focusing distance;
  • FIG. 5 is a diagram for explaining the parameters to be considered when determining the focusing distance of the imaging device;
  • FIG. 6 is a diagram showing an example of the relationship between the distance x1 and the distance x2 predetermined according to the temperature of the imaging device;
  • FIG. 7 is a diagram showing an example of the positional relationship between the imaging range of the imaging device and the ranging range of the distance measuring sensor;
  • FIG. 8 is a diagram showing another example of the positional relationship between the imaging range of the imaging device and the ranging range of the distance measuring sensor;
  • FIG. 9 is a diagram showing an example of a screen on which the subject distance of each ranging area is superimposed on an image captured by the imaging device;
  • FIG. 10 is a flowchart showing an example of a focus control procedure of the imaging device;
  • FIG. 11 is a diagram showing an example of a hardware configuration.
  • the blocks may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device having the function of performing the operation.
  • the specified stages and "units" may be implemented by programmable circuits and/or processors.
  • the dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • the reconfigurable hardware circuit may include logic operation elements such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as flip-flops, registers, and memory elements such as field programmable gate arrays (FPGA) and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device capable of storing instructions executed by a suitable device.
  • a computer-readable medium having instructions stored thereon constitutes a product including instructions that can be executed to create means for performing the operations specified in the flowchart or block diagram.
  • electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, etc. may be included.
  • more specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), and a Blu-ray (RTM) disc.
  • the computer-readable instructions may include any one of source code or object code described by any combination of one or more programming languages.
  • the source code or object code includes conventional procedural programming languages.
  • the computer-readable instructions may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk and C++, or conventional procedural programming languages such as the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a dedicated computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • a processor or programmable circuit can execute computer readable instructions to create means for performing the operations specified by the flowchart or block diagram.
  • Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and so on.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote control device 300.
  • the UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 10 is an example of a moving body. A moving body is a concept that includes flying bodies moving in the air, vehicles moving on the ground, ships moving on the water, and the like.
  • a flying body moving in the air includes not only a UAV but also other aircraft, airships, helicopters, and the like that move in the air.
  • the UAV body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion unit.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV body 20 uses, for example, four rotors to make the UAV 10 fly.
  • the number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without a rotor.
  • the imaging device 100 is an imaging camera that captures an object included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the gimbal 50 is an example of a support mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 about the pitch axis.
  • the gimbal 50 uses actuators to further rotatably support the imaging device 100 about the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 10 in order to control the UAV 10's flight.
  • the two camera devices 60 can be installed on the front of the head of the UAV10.
  • the other two camera devices 60 may be installed on the bottom surface of the UAV 10.
  • the two camera devices 60 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom surface side may also be paired to function as a stereo camera.
  • the three-dimensional space data around the UAV 10 can be generated from the images captured by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 only needs to have at least one imaging device 60.
  • UAV10 may include at least one camera 60 on the nose, tail, side, bottom, and top of UAV10, respectively.
  • the angle of view settable in the camera 60 may be larger than the angle of view settable in the camera 100.
  • the camera 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV10 to remotely operate the UAV10.
  • the remote operation device 300 can perform wireless communication with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10 such as ascent, descent, acceleration, deceleration, forward, backward, and rotation.
  • the instruction information includes, for example, instruction information for increasing the height of the UAV 10.
  • the indication information may indicate the height at which UAV10 should be located.
  • the UAV 10 moves to be at the height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending command to ascend UAV10. UAV10 rises while receiving the rise command. When the height of UAV10 has reached the upper limit, even if the ascending command is accepted, UAV10 can be restricted from ascending.
  • FIG. 2 shows an example of the functional blocks of UAV10.
  • the UAV 10 includes a UAV control unit 30, a memory 37, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, the gimbal 50, the imaging devices 60, the imaging device 100, and the distance measuring sensor 250.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 can receive instruction information including various instructions to the UAV control unit 30 from the remote operation device 300.
  • the memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the memory 37 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drive (SSD).
  • the memory 37 may be provided inside the UAV body 20, or may be provided so as to be detachable from the UAV body 20.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 according to the program stored in the memory 37.
  • the UAV control unit 30 may be composed of a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 advances the UAV10.
  • the propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors according to an instruction from the UAV control unit 30 to make the UAV 10 fly.
  • the GPS receiver 41 receives a plurality of signals indicating the time of transmission from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received multiple signals.
  • IMU42 detects the posture of UAV10.
  • the IMU 42 detects, as the posture of the UAV 10, the accelerations in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 43 detects the orientation of the head of the UAV10.
  • the barometric altimeter 44 detects the flying altitude of the UAV10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10, and converts the detected air pressure into an altitude to detect the altitude.
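The pressure-to-altitude conversion performed by the barometric altimeter 44 is commonly done with the international standard atmosphere formula. The publication does not state which conversion is used; the function below is a generic sketch of the principle.

```python
# Generic barometric conversion (international standard atmosphere,
# troposphere): h = 44330 * (1 - (P / P0) ** (1 / 5.255)).
# Illustrative only; not the conversion specified in the publication.

def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Convert an air pressure in hPa to an altitude in meters."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At standard sea-level pressure the formula returns 0 m, and lower measured pressure maps to higher altitude.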
  • the temperature sensor 45 detects the temperature around the UAV10.
  • the humidity sensor 46 detects the humidity around the UAV10.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the lens unit 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 captures an optical image formed through a plurality of lenses 210, and outputs the captured image to the imaging control section 110.
  • the imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 may control the imaging device 100 according to the operation instruction of the imaging device 100 from the UAV control unit 30.
  • the imaging control unit 110 is an example of the first control unit and the second control unit.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drive (SSD).
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the camera 100.
  • the memory 130 may be configured to be detachable from the housing of the camera device 100.
  • the lens section 200 includes a focus lens 210, a zoom lens 211, a lens driving section 212, a lens driving section 213, and a lens control section 220.
  • the focusing lens 210 is an example of a focusing lens system.
  • the zoom lens 211 is an example of a zoom lens system.
  • the focusing lens 210 and the zoom lens 211 may include at least one lens. At least a part or all of the focus lens 210 and the zoom lens 211 are configured to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens provided to be detachable from the imaging unit 102.
  • the lens driving section 212 moves at least a part or all of the focusing lens 210 along the optical axis via a mechanism member such as a cam ring and a guide shaft.
  • the lens driving section 213 moves at least a part or all of the zoom lens 211 along the optical axis via a mechanism member such as a cam ring and a guide shaft.
  • the lens control section 220 drives at least one of the lens driving section 212 and the lens driving section 213 in accordance with a lens control instruction from the imaging section 102, and moves at least one of the focus lens 210 and the zoom lens 211 along the optical axis direction via the mechanism members to perform at least one of a zooming operation and a focusing operation.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens section 200 further includes a memory 222, a position sensor 214, and a position sensor 215.
  • the memory 222 stores the control values of the focus lens 210 and the zoom lens 211 that are moved via the lens driving section 212 and the lens driving section 213.
  • the memory 222 may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the position sensor 214 detects the lens position of the focusing lens 210.
  • the position sensor 214 can detect the current focus position.
  • the position sensor 215 detects the lens position of the zoom lens 211.
  • the position sensor 215 can detect the current zoom position of the zoom lens 211.
  • the distance measuring sensor 250 is a sensor for measuring the distance to an object existing within the measurement target range.
  • the ranging sensor 250 may be, for example, a TOF (Time of Flight) sensor.
  • the TOF sensor is a sensor that measures the distance to an object based on the time difference between the time when directional light such as infrared rays or laser light is emitted and the time when the reflected light reflected by the object is received.
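The TOF measurement principle described above reduces to halving the round-trip travel time of the light. A minimal sketch:

```python
# A TOF sensor derives distance from the round-trip time of emitted light:
# the light travels to the object and back, so distance = c * dt / 2.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance_m(round_trip_time_s):
    """Distance to the reflecting object, given the emit-to-receive delay."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For example, a round-trip delay of about 6.67 nanoseconds corresponds to an object one meter away.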
  • the distance measuring sensor 250 can measure the distance to an object existing in the imaging direction of the imaging device 100.
  • the distance measuring sensor 250 may be provided on the gimbal 50 together with the imaging device 100.
  • the ranging sensor 250 may be provided on the UAV body 20.
  • the distance measuring sensor 250 may be provided inside or outside the housing of the imaging device 100. When the imaging direction of the imaging device 100 is changed by driving the gimbal 50, the direction of the measurement target of the distance measuring sensor 250 may also change.
  • the distance measuring sensor 250 can measure the distances to objects existing in each of a plurality of ranging areas.
  • the imaging device 100 implements focus control more efficiently based on the distance to the subject measured by the distance measuring sensor 250.
  • the imaging control unit 110 includes an acquisition unit 112, a specifying unit 114, a determining unit 116, and a focus control unit 118.
  • the acquisition unit 112 acquires, via the distance measuring sensor 250, the subject distance, which represents the distance from the imaging device 100 to the subject.
  • the acquiring unit 112 may acquire the respective subject distances of the plurality of ranging areas through the ranging sensor 250.
  • the specifying unit 114 specifies the position of the focus lens 210.
  • the specifying unit 114 specifies the current position of the focus lens 210.
  • the specifying unit 114 may convert the rotation amount of the drive motor included in the lens driving unit 212 into a number of pulses, and determine the position of the focus lens 210 relative to a reference position on the infinity side based on the converted number of pulses.
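As a hypothetical illustration of this pulse-counting scheme, the snippet below maps a pulse count (measured from the infinity-side reference position) to a lens position; the encoder resolution is an assumed value, not taken from the publication.

```python
# Hypothetical sketch: the specifying unit counts drive-motor pulses from
# a reference position at infinity and converts them to a lens position.

PULSES_PER_MOTOR_REV = 512  # assumed encoder resolution (illustrative)

def lens_position_from_pulses(pulse_count):
    """Lens position, in motor revolutions, from the infinity-side reference."""
    return pulse_count / PULSES_PER_MOTOR_REV
```

Because the count is relative to a fixed reference, the same pulse total always maps to the same lens position, which is what lets the position sensor be replaced or cross-checked by counting.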
  • the determination section 116 determines the moving direction of the focus lens 210 based on the subject distance and the position of the focus lens 210.
  • the determining section 116 may determine the moving direction of the focus lens 210 based on the position of the focus lens 210 and on the subject distance of, among the plurality of ranging areas, the ranging area corresponding to the focus area within the imaging area of the imaging device 100.
  • the focus area may be an area including a subject to be focused within the imaging area of the imaging device 100.
  • the focus area may be a predetermined area within the imaging area of the imaging device 100.
  • the focus area may be an area selected by the user within the imaging area of the imaging device 100.
  • the focus area may be an area set in the imaging area of the imaging device 100 according to the imaging mode.
  • the correspondence between the imaging area of the imaging device 100 and the ranging area of the ranging sensor 250 may be predetermined.
  • the imaging device 100 may store a conversion table between the coordinate system of the imaging area of the imaging device 100 and the coordinate system of the ranging area of the ranging sensor 250 in the memory 130 or the like.
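Such a correspondence between the two coordinate systems can be as simple as a scale mapping from image pixels to the sensor's grid of ranging areas. The grid dimensions and the uniform mapping below are assumptions for illustration; the publication only states that a conversion table may be stored.

```python
# Minimal sketch of mapping an imaging-area coordinate to the ranging area
# that contains it, assuming a uniform grid of ranging areas covering the
# imaging area. All sizes below are assumed, illustrative values.

IMAGE_W, IMAGE_H = 4000, 3000   # imaging area in pixels (assumed)
RANGE_COLS, RANGE_ROWS = 8, 6   # grid of ranging areas (assumed)

def ranging_area_for(px, py):
    """Return (col, row) of the ranging area containing image pixel (px, py)."""
    col = min(px * RANGE_COLS // IMAGE_W, RANGE_COLS - 1)
    row = min(py * RANGE_ROWS // IMAGE_H, RANGE_ROWS - 1)
    return col, row
```

With this, the subject distance used by the determining unit is simply the measurement of the ranging area returned for the focus area's center pixel.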
  • when the subject distance is longer than the focusing distance corresponding to the current position of the focus lens 210, the determining section 116 may determine the moving direction of the focus lens 210 as the direction of infinity.
  • when the subject distance is shorter than the focusing distance corresponding to the current position of the focus lens 210, the determining section 116 may determine the moving direction of the focus lens 210 as the direction of the closest end.
  • the focusing distance may be the distance from the imaging device 100 to a subject at which a predetermined in-focus state is obtained at the current position of the focus lens 210.
  • the predetermined in-focus state may be a state in which the contrast value of the subject photographed by the imaging device 100 is equal to or greater than a predetermined threshold value.
  • the focus control unit 118 controls the position of the focus lens 210 based on the image captured by the imaging device 100 after moving the focus lens 210 in the movement direction determined by the determination unit 116.
  • the focus control unit 118 may control the position of the focus lens 210 based on the contrast values of the plurality of images captured by the imaging device 100 while moving the focus lens 210 in the movement direction determined by the determination unit 116.
  • according to the hill-climbing method, the focus control section 118 may control the position of the focus lens 210 by determining the position of the focus lens 210 at which the contrast value peaks and moving the focus lens 210 to the determined position.
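The hill-climbing search named above can be sketched as follows; `contrast_at` stands in for capturing an image at a given lens position and evaluating its contrast (a hypothetical helper, not an API from the publication).

```python
# Minimal hill-climbing sketch: step the focus lens in the chosen direction
# while the contrast value keeps rising, and stop once it starts to fall,
# i.e. just past the peak.

def hill_climb(contrast_at, start, direction, steps=100):
    """Return the lens position where the contrast value peaked."""
    pos = start
    best = contrast_at(pos)
    for _ in range(steps):
        nxt = pos + direction
        c = contrast_at(nxt)
        if c <= best:        # contrast fell -> previous position was the peak
            return pos
        pos, best = nxt, c
    return pos
```

This is why the determined moving direction matters: the search only ever steps one way, so starting on the wrong side of the peak would stop immediately at the starting position.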
  • the range that the distance measuring sensor 250 can measure extends from the shortest distance 501 to the longest distance 502.
  • the distance measuring sensor 250 can measure the subject distance from the imaging device 100 to the object 601 and the object 602 existing between the distance 501 and the distance 502.
  • the range 500 represents the range of distances from the imaging device 100 to a subject at which a predetermined in-focus state can be obtained at the current position of the focus lens 210.
  • the range 500 indicates the range of the focusing distance corresponding to the current position of the focus lens 210.
  • the determining unit 116 determines the moving direction of the focus lens 210 as the direction of the closest end.
  • the determining unit 116 determines the moving direction of the focus lens 210 as the direction of infinity.
  • FIG. 4 shows an example of the relationship between the position of the focusing lens 210 and the focusing distance.
  • the determination unit 116 determines the moving direction of the focus lens 210 as the direction of the closest end.
  • the determining unit 116 determines the moving direction of the focus lens 210 as the direction of infinity.
  • the subject distance measured by a distance measuring sensor 250 such as a TOF sensor is not always accurate. Therefore, even if the focus control unit 118 controls the position of the focus lens 210 so that the focusing distance matches the subject distance measured by the distance measuring sensor 250, a predetermined in-focus state for the object is not always obtained. Therefore, after the moving direction of the focus lens 210 has been determined, the focus control unit 118 may further search, based on images captured by the imaging device 100, for the position of the focus lens 210 at which the contrast value of the object peaks.
  • the focus control section 118 may control the position of the focus lens 210 based on the subject distance measured by the ranging sensor 250, instead of based on The image captured by the camera 100 controls the position of the focus lens 210.
  • the focus control section 118 may control the position of the focus lens 210 based on the image captured by the imaging device 100.
  • the first subject distance may be determined in advance based on the optical characteristics of the lens section 200.
  • the first subject distance may be a subject distance where the change amount of the position of the focus lens 210 relative to the focus distance is a predetermined threshold value or less.
  • the first subject distance depends on the optical characteristics of the imaging device 100, so it can be set according to actual measured values during the manufacturing stage.
  • the optical characteristics of the lens system of the image pickup device 100 may change due to temperature.
  • the focusing distance corresponding to the position of the focusing lens 210 specified by the determining unit 116 also changes due to temperature.
  • the determination unit 116 preferably also considers the temperature.
  • FIG. 5 shows parameters considered when determining the focusing distance of the imaging device 100.
  • D represents the distance from the imaging surface 700 to the subject 710.
  • f is the focal length.
  • x 1 represents the distance from the image-side focus 701 of the imaging device 100 to the imaging surface 700.
  • x 2 represents the distance from the object-side focus 702 to the focus point 711 of the imaging device 100.
  • x1 · x2 = f^2, from which 1/x2 = x1/f^2 follows.
  • FIG. 6 shows an example of the relationship, predetermined according to the temperature of the imaging device 100, between the distance x1 from the image-side focal point 701 of the imaging device 100 to the imaging surface and the distance x2 from the object-side focal point 702 of the imaging device 100 to the in-focus point 711.
  • the predetermined relationship may be derived from values actually measured at the manufacturing stage of the imaging device 100. For example, at a predetermined temperature T0, the distance x1 is derived for two points whose distances from the imaging device 100 are known. Let the values of x1 at these two points be A and B. After performing autofocus such as contrast AF several times, A and B can be derived from the averages.
  • the vertical axis is set to x1
  • the horizontal axis is set to 1/x2
  • the straight line 800 passing through the origin, A, and B is set as the calibration data at the temperature T0.
  • as the temperature changes, the straight line 800 translates in parallel along the x1 axis.
  • an offset C corresponding to the difference between the temperature T0 and the current temperature can be derived. Therefore, from the calibration data indicating the relationship between the distance x1 and the distance x2 at the reference temperature T0 and the offset C, the determination unit 116 can derive the focusing distance, calibrated according to the temperature of the imaging device 100, corresponding to the position of the focus lens 210.
  • the acquisition unit 112 may acquire the internal temperature of the imaging device 100 via a temperature sensor provided on the imaging device 100. The determination unit 116 may then determine the focusing distance corresponding to the position of the focus lens 210 determined by the specifying unit 114, based on the relationship, predetermined according to the temperature of the imaging device 100 as shown in FIG. 6, between the distance x1 from the image-side focal point 701 of the imaging device 100 to the imaging surface 700 and the distance x2 from the object-side focal point 702 of the imaging device 100 to the in-focus point 711.
  • the determining section 116 may translate the calibration data at the reference temperature T0 in parallel along the x1 axis by the offset C corresponding to the temperature acquired by the acquiring section 112, and determine the focusing distance corresponding to the position of the focus lens 210 based on the temperature-compensated calibration data. Thus, even though the optical characteristics of the lens system of the imaging device 100 change with the temperature of the imaging device 100, deviation of the focusing distance corresponding to the position of the focus lens 210 determined by the determining section 116 can be suppressed.
  • the distance measuring sensor 250 can measure the distance of the subject existing within the focus area.
  • when the distance measuring sensor 250 is provided not on the imaging device 100 but on the UAV body 20 or the like, there may be cases where the imaging range 722 corresponding to the angle of view of the imaging device 100 does not overlap the ranging range 723 of the distance measuring sensor 250. In this case, the distance measuring sensor 250 may be used to measure in advance the distance to the subject to be focused on. Then, when the UAV control unit 30 controls the gimbal 50 so that the imaging range 722 of the imaging device 100 is included in the ranging range 723 of the distance measuring sensor 250, the focus control unit 118 can perform focus control by starting to move the focus lens 210 in the moving direction determined by the determination unit 116.
  • the focus control section 118 can implement focus control by starting to move the focus lens 210 in the movement direction determined by the determination section 116.
  • the acquisition unit 112 can acquire the subject distance in the ranging area corresponding to the focus area measured by the ranging sensor 250.
  • the specifying section 114 determines the current position of the focus lens 210, and the determination section 116 determines the moving direction of the focus lens 210 by comparing the focusing distance corresponding to the current position of the focus lens 210 with the subject distance, and performs focus control by starting to move the focus lens 210 in that moving direction.
  • the focus control unit 118 may determine the position of the focus lens at which the contrast value peaks, in accordance with the hill-climbing method, based on the plurality of images captured by the imaging unit 102.
  • the focus control unit 118 may, after moving the focus lens 210 to infinity, determine the position of the focus lens at which the contrast value peaks in accordance with the hill-climbing method while moving the focus lens 210 toward the closest end.
  • the focus control unit 118 may determine the position of the focus lens at which the contrast value peaks in accordance with the hill-climbing method while moving the focus lens 210 from its current position toward the closest end.
  • the acquisition unit 112 can acquire the subject distance through the distance measuring sensor 250 when the UAV 10 is not moving.
  • the acquiring unit 112 may acquire the subject distance through the distance measuring sensor 250 while the UAV 10 is hovering.
  • the acquiring unit 112 can acquire the respective subject distances of the plurality of ranging areas within the imaging area of the imaging device 100.
  • the imaging control section 110 may notify the user of the respective subject distances of the plurality of ranging areas acquired by the acquisition section 112. For example, as shown in FIG. 9, the imaging control unit 110 may notify the user by displaying the subject distance of each ranging area acquired by the acquisition unit 112 superimposed on the image captured by the imaging device 100 on the display unit of the remote operation device 300.
  • FIG. 10 is a flowchart showing an example of the procedure of the focus control of the imaging apparatus 100.
  • the acquiring unit 112 acquires the respective subject distances of the plurality of ranging areas through the ranging sensor 250 (S100).
  • the acquiring section 112 may acquire the subject distance in the ranging area that the ranging sensor 250 can range from among the multiple ranging areas.
  • the specifying unit 114 determines whether the UAV 10 is moving, that is, whether it is hovering (S102). If the UAV 10 is moving, that is, not hovering, the acquiring unit 112 again acquires the respective subject distances of the multiple ranging areas through the ranging sensor 250.
  • the specific unit 114 sets the focus area from the imaging area of the imaging device 100 (S104).
  • the specific unit 114 may set an area selected by the user from the imaging area of the imaging device 100 as the focus area.
  • the specific unit 114 may set an area containing a predetermined subject from the imaging area of the imaging device 100 as the focus area.
  • the specifying section 114 determines, from among the subject distances provided by the ranging sensor 250, the subject distance of the ranging area corresponding to the focus area, and the current position of the focus lens 210 (S106).
  • the determination unit 116 determines whether the determined object distance is greater than the focusing distance corresponding to the position of the focus lens 210 (S108).
  • the determining unit 116 may derive, from the calibration data indicating the relationship between the distance x1 and the distance x2 at the reference temperature T0 as shown in FIG. 6 and the offset C, the focusing distance, calibrated according to the temperature of the imaging device 100, corresponding to the position of the focus lens 210.
  • the determination unit 116 determines the moving direction of the focus lens 210 at the start of focus control to be the direction of infinity (S110). After moving the focus lens 210 toward infinity, the focus control unit 118 searches for the position of the focus lens 210 at which the contrast value peaks based on the contrast values of the plurality of images captured by the imaging device 100, and performs autofocus (S112).
  • the determining unit 116 determines the moving direction of the focus lens 210 at the start of focus control to be the direction of the closest end (S114). After moving the focus lens 210 toward the closest end, the focus control unit 118 searches for the position of the focus lens 210 at which the contrast value peaks based on the contrast values of the plurality of images captured by the imaging device 100 (S116).
  • the moving direction of the focus lens 210 is determined based on the subject distance of the subject to be measured measured by the distance measuring sensor 250.
  • the subject distance measured by a distance measuring sensor 250 such as a TOF sensor is not necessarily accurate, and even if the focusing distance is made to match the subject distance, the desired focus state may not be obtained.
  • however, the subject distance makes it possible to roughly grasp the distance between the subject and the imaging device 100, and the focus control unit 118 uses this distance to determine the moving direction of the focus lens 210. Therefore, when the focus control unit 118 performs focus control, the focus lens 210 can be moved efficiently.
  • FIG. 11 shows an example of a computer 1200 that can embody various aspects of the present invention in whole or in part.
  • the program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with the device according to the embodiment of the present invention or one or more "parts" of the device.
  • the program can enable the computer 1200 to perform the operation or the one or more "parts".
  • This program enables the computer 1200 to execute the process according to the embodiment of the present invention or the stage of the process.
  • Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform determination operations associated with some or all blocks in the flowchart and block diagrams described in this specification.
  • the computer 1200 includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates according to the programs stored in the ROM 1230 and RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program and the like executed by the computer 1200 during operation, and/or a program dependent on the hardware of the computer 1200.
  • the program is provided through a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or through a network.
  • the program is installed in the RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by the CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and causes cooperation between the programs and the various types of hardware resources described above.
  • the operation or processing of information may be realized with the use of the computer 1200, thereby constituting an apparatus or method.
  • the CPU 1212 may execute the communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium, or the like.
  • the CPU 1212 may cause the RAM 1214 to read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Next, the CPU 1212 can write the processed data back to the external recording medium.
  • an external recording medium such as a USB memory
  • Various types of information such as various types of programs, data, tables, and databases can be stored in the recording medium and subjected to information processing.
  • for data read from the RAM 1214, the CPU 1212 can perform various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in files, databases, etc. in the recording medium.
  • the CPU 1212 may retrieve, from among the multiple entries, an entry that matches the condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the above program or software module may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)
  • Lens Barrels (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Accessories Of Cameras (AREA)

Abstract

It is desirable to perform focus control more efficiently using a distance-measuring device. A control device may include an acquisition section that acquires, via a distance-measuring sensor, a subject distance indicating the distance from an imaging device to a subject. The control device may include a specifying section that determines the position of a focus lens included in the imaging device. The control device may include a determining section that determines the moving direction of the focus lens based on the subject distance and the position of the focus lens. The control device may include a control section that, after moving the focus lens in the moving direction, controls the position of the focus lens based on an image captured by the imaging device.

Description

Control device, imaging device, imaging system, mobile body, control method, and program
Technical Field
The present invention relates to a control device, an imaging device, an imaging system, a mobile body, a control method, and a program.
Background Art
Patent Document 1 discloses a distance-measuring device that derives the distance to a subject based on the time at which directional light is emitted and the time at which the reflected light is received.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] International Publication No. 2015/166713
Summary of the Invention
Problem to Be Solved by the Invention
It is desirable to perform focus control more efficiently using, for example, the distance-measuring device described in Patent Document 1.
Means for Solving the Problem
A control device according to one aspect of the present invention may include an acquisition section that acquires, via a distance-measuring sensor, a subject distance indicating the distance from an imaging device to a subject. The control device may include a specifying section that determines the position of a focus lens included in the imaging device. The control device may include a determining section that determines the moving direction of the focus lens based on the subject distance and the position of the focus lens. The control device may include a control section that, after moving the focus lens in the moving direction, controls the position of the focus lens based on an image captured by the imaging device.
When the subject distance is greater than the focusing distance corresponding to the position of the focus lens determined by the specifying section, the determining section determines the moving direction of the focus lens to be the direction of infinity.
When the subject distance is less than the focusing distance corresponding to the position of the focus lens determined by the specifying section, the determining section determines the moving direction of the focus lens to be the direction of the closest end.
When the subject distance is greater than a first subject distance, the control section may control the position of the focus lens based on the subject distance, instead of based on an image captured by the imaging device.
When the subject distance is less than the first subject distance, the control section may control the position of the focus lens based on an image captured by the imaging device.
The acquisition section may acquire the respective subject distances of a plurality of ranging areas via the distance-measuring sensor. The determining section may determine the moving direction of the focus lens based on the position of the focus lens and the subject distance of, among the plurality of ranging areas, the ranging area corresponding to a focus area within the imaging area of the imaging device.
The acquisition section may further acquire the temperature of the imaging device via a temperature sensor. The determining section may determine the focusing distance corresponding to the position of the focus lens determined by the specifying section based on a relationship, predetermined according to the temperature of the imaging device, between the distance from the image-side focal point of the imaging device to the imaging surface and the distance from the object-side focal point of the imaging device to the in-focus point.
The distance-measuring sensor may be a TOF sensor.
An imaging device according to one aspect of the present invention may include the above control device and the focus lens.
An imaging system according to one aspect of the present invention may include the above imaging device, the distance-measuring sensor, and a support mechanism that supports the imaging device such that the attitude of the imaging device is adjustable.
When the ranging area of the distance-measuring sensor comes to be included in the imaging area of the imaging device as the support mechanism controls the attitude of the imaging device, the control section may, after moving the focus lens in the moving direction, control the position of the focus lens based on an image captured by the imaging device.
A mobile body according to one aspect of the present invention may include the above imaging system and move.
The acquisition section may acquire the subject distance via the distance-measuring sensor when the mobile body is not moving.
The mobile body may be a flying body. The acquisition section may acquire the subject distance via the distance-measuring sensor while the flying body is hovering.
A control method according to one aspect of the present invention may include a step of acquiring, via a distance-measuring sensor, a subject distance indicating the distance from an imaging device to a subject. The control method may include a step of determining the position of a focus lens included in the imaging device. The control method may include a step of determining the moving direction of the focus lens based on the subject distance and the position of the focus lens. The control method may include a step of, after moving the focus lens in the moving direction, controlling the position of the focus lens based on an image captured by the imaging device.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above control device.
According to one aspect of the present invention, focus control can be performed more efficiently using a distance-measuring device.
The above summary does not enumerate all necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device;
FIG. 2 is a diagram showing an example of the functional blocks of the unmanned aerial vehicle;
FIG. 3 is a diagram for explaining the measurable range of the distance-measuring sensor;
FIG. 4 is a diagram showing an example of the relationship between the position of the focus lens and the focusing distance;
FIG. 5 is a diagram for explaining the parameters considered when determining the focusing distance of the imaging device;
FIG. 6 is a diagram showing an example of the relationship, predetermined according to the temperature of the imaging device, between the distance x1 and the distance x2;
FIG. 7 is a diagram showing an example of the positional relationship between the imaging range of the imaging device and the ranging range of the distance-measuring sensor;
FIG. 8 is a diagram showing an example of the positional relationship between the imaging range of the imaging device and the ranging range of the distance-measuring sensor;
FIG. 9 is a diagram showing an example of a screen on which the subject distance of each ranging area is displayed superimposed on an image captured by the imaging device;
FIG. 10 is a flowchart showing an example of the focus control procedure of the imaging device;
FIG. 11 is a diagram showing an example of a hardware configuration;
Description of Reference Numerals:
10 UAV, 20 UAV body, 30 UAV control section, 36 communication interface, 37 memory, 40 propulsion section, 41 GPS receiver, 42 inertial measurement unit, 43 magnetic compass, 44 barometric altimeter, 45 temperature sensor, 46 humidity sensor, 50 gimbal, 60 imaging device, 100 imaging device, 102 imaging section, 110 imaging control section, 112 acquisition section, 114 specifying section, 116 determining section, 118 focus control section, 120 image sensor, 130 memory, 200 lens section, 210 focus lens, 212 lens driving section, 214 position sensor, 220 lens control section, 222 memory, 250 distance-measuring sensor, 300 remote operation device, 1200 computer, 1210 host controller, 1212 CPU, 1214 RAM, 1220 input/output controller, 1222 communication interface, 1230 ROM.
Detailed Description
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those skilled in the art that various modifications or improvements can be made to the following embodiments. It is apparent from the description of the claims that forms incorporating such modifications or improvements can also be included within the technical scope of the present invention.
The claims, specification, drawings, and abstract contain matters subject to copyright protection. The copyright holder will not object to the reproduction of these documents by anyone as they appear in the files or records of the Patent Office. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "section" of a device that has the role of performing the operation. Specified stages and "sections" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include memory elements such as logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, flip-flops, registers, field-programmable gate arrays (FPGA), programmable logic arrays (PLA), and the like.
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk and C++, and the "C" programming language or similar programming languages. Computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10, i.e., a mobile body, is a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on the water, and the like. A flying body moving in the air is a concept that includes not only the UAV but also other aircraft, airships, helicopters, and the like moving in the air.
The UAV body 20 includes a plurality of rotors. The plurality of rotors is an example of a propulsion section. The UAV body 20 causes the UAV 10 to fly by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to cause the UAV 10 to fly. The number of rotors is not limited to four. In addition, the UAV 10 may be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 uses an actuator to rotatably support the imaging device 100 about a pitch axis. The gimbal 50 uses actuators to further rotatably support the imaging device 100 about a roll axis and a yaw axis, respectively. The gimbal 50 can change the attitude of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the nose, i.e., the front, of the UAV 10. The other two imaging devices 60 may be provided on the bottom of the UAV 10. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 10 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it suffices for the UAV 10 to include at least one imaging device 60. The UAV 10 may include at least one imaging device 60 on each of its nose, tail, sides, bottom, and top. The angle of view settable in the imaging devices 60 may be larger than that settable in the imaging device 100. The imaging devices 60 may have single-focus lenses or fisheye lenses.
The remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10. The remote operation device 300 may communicate with the UAV 10 wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascent command to raise the UAV 10. The UAV 10 ascends while receiving the ascent command. When the altitude of the UAV 10 has reached an upper-limit altitude, the ascent may be restricted even if the ascent command is received.
FIG. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control section 30, a memory 37, a communication interface 36, a propulsion section 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, imaging devices 60, an imaging device 100, and a distance-measuring sensor 250.
The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 may receive from the remote operation device 300 instruction information including various commands for the UAV control section 30. The memory 37 stores programs and the like necessary for the UAV control section 30 to control the propulsion section 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 37 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drives (SSD). The memory 37 may be provided inside the UAV body 20, or may be provided so as to be removable from the UAV body 20.
The UAV control section 30 controls the flight and imaging of the UAV 10 in accordance with the programs stored in the memory 37. The UAV control section 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control section 30 controls the flight and imaging of the UAV 10 in accordance with commands received from the remote operation device 300 via the communication interface 36. The propulsion section 40 propels the UAV 10. The propulsion section 40 includes a plurality of rotors and a plurality of drive motors that rotate the rotors. The propulsion section 40 rotates the plurality of rotors via the plurality of drive motors in accordance with commands from the UAV control section 30 to cause the UAV 10 to fly.
The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, i.e., the position (latitude and longitude) of the UAV 10, based on the received signals. The IMU 42 detects the attitude of the UAV 10. The IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging section 102 and a lens section 200. The lens section 200 is an example of a lens device. The imaging section 102 includes an image sensor 120, an imaging control section 110, and a memory 130. The image sensor 120 may be composed of a CCD or CMOS. The image sensor 120 captures an optical image formed via a plurality of lenses 210 and outputs the captured image to the imaging control section 110. The imaging control section 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control section 110 may control the imaging device 100 in accordance with operation commands for the imaging device 100 from the UAV control section 30. The imaging control section 110 is an example of a first control section and a second control section. The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid-state drives (SSD). The memory 130 stores programs and the like necessary for the imaging control section 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100, or may be provided so as to be removable from the housing of the imaging device 100.
The lens section 200 includes a focus lens 210, a zoom lens 211, a lens driving section 212, a lens driving section 213, and a lens control section 220. The focus lens 210 is an example of a focus lens system. The zoom lens 211 is an example of a zoom lens system. The focus lens 210 and the zoom lens 211 may each include at least one lens. At least a part or all of the focus lens 210 and the zoom lens 211 is arranged to be movable along the optical axis. The lens section 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging section 102. The lens driving section 212 moves at least a part or all of the focus lens 210 along the optical axis via mechanism members such as a cam ring and a guide shaft. The lens driving section 213 moves at least a part or all of the zoom lens 211 along the optical axis via mechanism members such as a cam ring and a guide shaft. The lens control section 220 drives at least one of the lens driving section 212 and the lens driving section 213 in accordance with lens control commands from the imaging section 102, and moves at least one of the focus lens 210 and the zoom lens 211 along the optical-axis direction via the mechanism members to perform at least one of a zoom operation and a focus operation. The lens control commands are, for example, a zoom control command and a focus control command.
The lens section 200 further includes a memory 222, a position sensor 214, and a position sensor 215. The memory 222 stores control values of the focus lens 210 and the zoom lens 211 moved via the lens driving section 212 and the lens driving section 213. The memory 222 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The position sensor 214 detects the lens position of the focus lens 210 and can detect the current focus position. The position sensor 215 detects the lens position of the zoom lens 211 and can detect the current zoom position of the zoom lens 211.
The distance-measuring sensor 250 is a sensor for measuring the distance to an object existing within a measurement target range. The distance-measuring sensor 250 may be, for example, a TOF (Time of Flight) sensor. A TOF sensor measures the distance to an object based on the time difference between the time at which directional light such as infrared light or laser light is emitted and the time at which the reflected light, reflected by the object, is received. The distance-measuring sensor 250 can measure the distance to an object existing in the imaging direction of the imaging device 100. The distance-measuring sensor 250 may be provided on the gimbal 50 together with the imaging device 100, or may be provided on the UAV body 20. The distance-measuring sensor 250 may be provided inside or outside the housing of the imaging device 100. When the imaging direction of the imaging device 100 changes as the gimbal 50 is driven, the direction of the measurement target of the distance-measuring sensor 250 may change as well. The distance-measuring sensor 250 can measure the distances to objects existing in each of a plurality of ranging areas.
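The time-of-flight principle described above can be sketched in a few lines. The following Python snippet is a minimal illustration; the timestamps are hypothetical example values, not taken from this disclosure:

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(t_emit_s, t_receive_s):
    """Distance to the object, from emission/reception timestamps in seconds."""
    round_trip = t_receive_s - t_emit_s
    return SPEED_OF_LIGHT * round_trip / 2.0

# A reflection received 66.7 ns after emission corresponds to roughly 10 m.
d = tof_distance(0.0, 66.7e-9)
```

The half factor accounts for the light travelling to the object and back before the sensor receives it.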
In the UAV 10 configured in this way, the imaging device 100 performs focus control more efficiently based on the distance to the subject measured by the distance-measuring sensor 250.
To this end, the imaging control section 110 includes an acquisition section 112, a specifying section 114, a determining section 116, and a focus control section 118. The acquisition section 112 acquires, via the distance-measuring sensor 250, a subject distance indicating the distance from the imaging device 100 to the subject. The acquisition section 112 may acquire the respective subject distances of a plurality of ranging areas via the distance-measuring sensor 250.
The specifying section 114 determines the position of the focus lens 210, i.e., the current position of the focus lens 210. The specifying section 114 may convert the rotation of the drive motor included in the lens driving section 212 into a pulse count, and determine the position of the focus lens 210 from a reference position, with infinity as the reference, based on the converted pulse count.
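The pulse-count conversion just described can be sketched as follows; the resolution constant is a hypothetical value for illustration only:

```python
PULSES_PER_MM = 40  # hypothetical drive-motor resolution (pulses per mm of travel)

def lens_position_mm(pulse_count):
    """Focus-lens position relative to the infinity-side reference position,
    converted from the drive motor's accumulated pulse count."""
    return pulse_count / PULSES_PER_MM

# 200 accumulated pulses would place the lens 5 mm from the reference.
pos = lens_position_mm(200)
```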
The determining section 116 determines the moving direction of the focus lens 210 based on the subject distance and the position of the focus lens 210. The determining section 116 may determine the moving direction of the focus lens 210 based on the position of the focus lens 210 and the subject distance of, among the plurality of ranging areas, the ranging area corresponding to the focus area within the imaging area of the imaging device 100. The focus area may be an area containing the subject to be focused on within the imaging area of the imaging device 100. The focus area may be an area predetermined within the imaging area of the imaging device 100, an area selected by the user within the imaging area of the imaging device 100, or an area set within the imaging area of the imaging device 100 according to the shooting mode. The correspondence between the imaging area of the imaging device 100 and the ranging areas of the distance-measuring sensor 250 may be determined in advance. The imaging device 100 may store, in the memory 130 or the like, a conversion table between the coordinate system of the imaging area of the imaging device 100 and the coordinate system of the ranging areas of the distance-measuring sensor 250.
When the subject distance is greater than the focusing distance corresponding to the position of the focus lens 210 determined by the specifying section 114, the determining section 116 may determine the moving direction of the focus lens 210 to be the direction of infinity. When the subject distance is less than the focusing distance corresponding to the position of the focus lens 210 determined by the specifying section 114, the determining section 116 may determine the moving direction of the focus lens 210 to be the direction of the closest end. The focusing distance may be the distance from the imaging device 100 to a subject for which a predetermined focus state can be obtained at the current position of the focus lens 210. The predetermined focus state may be a state in which the contrast value of the subject captured by the imaging device 100 is equal to or greater than a predetermined threshold.
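The comparison driving the direction decision can be sketched as follows, in Python; the function and return-value names are illustrative, not from the disclosure:

```python
def decide_move_direction(subject_distance_m, focusing_distance_m):
    """Direction in which to start moving the focus lens.

    subject_distance_m comes from the distance-measuring sensor;
    focusing_distance_m corresponds to the current focus-lens position.
    """
    if subject_distance_m > focusing_distance_m:
        return "infinity"      # subject is farther than where we focus now
    if subject_distance_m < focusing_distance_m:
        return "closest_end"   # subject is nearer than where we focus now
    return "none"              # already in focus at this distance

d1 = decide_move_direction(12.0, 5.0)  # subject farther -> move toward infinity
d2 = decide_move_direction(2.0, 5.0)   # subject closer  -> move toward closest end
```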
The focus control section 118, after moving the focus lens 210 in the moving direction determined by the determining section 116, controls the position of the focus lens 210 based on images captured by the imaging device 100. The focus control section 118 may control the position of the focus lens 210 based on the contrast values of a plurality of images captured by the imaging device 100 while moving the focus lens 210 in the moving direction determined by the determining section 116. In accordance with the hill-climbing method, the focus control section 118 may control the position of the focus lens 210 by determining the position at which the contrast value peaks and moving the focus lens 210 to that position.
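A minimal sketch of the hill-climbing (contrast-maximizing) search, in Python; the contrast function and the step size are hypothetical stand-ins for values that would come from captured images:

```python
def hill_climb(contrast_at, position, step, limit):
    """Step the lens in one direction while contrast keeps rising;
    stop one step after it falls, and return the peak position."""
    best_pos, best_val = position, contrast_at(position)
    while 0 <= position + step <= limit:
        position += step
        val = contrast_at(position)
        if val < best_val:        # passed the peak: stop searching
            break
        best_pos, best_val = position, val
    return best_pos

# Hypothetical contrast curve that peaks at lens position 60.
contrast = lambda p: -(p - 60) ** 2
peak = hill_climb(contrast, 0, 10, 100)  # -> 60
```

A real implementation would also refine the step near the peak; this sketch only shows the coarse climb the text describes.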
Here, there is a limit to the range that the distance-measuring sensor 250 can measure. For example, as shown in FIG. 3, the measurable range of the distance-measuring sensor 250 extends from a shortest-distance limit 501 to a longest-distance limit 502. The distance-measuring sensor 250 can measure the subject distances from the imaging device 100 to an object 601 and an object 602 existing between the limit 501 and the limit 502. A range 500 represents the range of distances from the imaging device 100 to a subject for which the predetermined focus state can be obtained at the current position of the focus lens 210; that is, the range 500 represents the range of focusing distances corresponding to the current position of the focus lens 210. When the distance-measuring sensor 250 measures, as the subject distance, the distance to the object 601, which is closer to the imaging device 100 than the focusing-distance range 500, the determining section 116 determines the moving direction of the focus lens 210 to be the direction of the closest end. When the distance-measuring sensor 250 measures, as the subject distance, the distance to the object 602, which is farther from the imaging device 100 than the focusing-distance range 500, the determining section 116 determines the moving direction of the focus lens 210 to be the direction of infinity.
FIG. 4 shows an example of the relationship between the position of the focus lens 210 and the focusing distance. When the object to be focused on exists at a distance smaller than the focusing distance corresponding to the current position of the focus lens 210, the determining section 116 determines the moving direction of the focus lens 210 to be the direction of the closest end. When the object to be focused on exists at a distance greater than the focusing distance corresponding to the current position of the focus lens 210, the determining section 116 determines the moving direction of the focus lens 210 to be the direction of infinity.
The subject distance measured by a distance-measuring sensor 250 such as a TOF sensor is not necessarily accurate. Therefore, even if the focus control section 118 controls the position of the focus lens 210 so that the focusing distance matches the subject distance measured by the distance-measuring sensor 250, the predetermined focus state cannot always be obtained for the object. Therefore, after determining the moving direction of the focus lens 210, the focus control section 118 may further search, based on images captured by the imaging device 100, for the position of the focus lens 210 at which the contrast value of the object peaks.
The shorter the focusing distance, the larger the change in the position of the focus lens 210 corresponding to a change in the focusing distance. The longer the focusing distance, the smaller that change. In other words, when the distance to the object is long, even if the subject distance measured by the distance-measuring sensor 250 contains some error, the predetermined focus state is likely to be obtained by setting the position of the focus lens 210 to the focusing distance corresponding to that subject distance. For example, when the distance to the object is long, the predetermined focus state is likely to be obtained even without performing contrast AF.
Therefore, when the subject distance measured by the distance-measuring sensor 250 is greater than a first subject distance, the focus control section 118 may control the position of the focus lens 210 based on the subject distance measured by the distance-measuring sensor 250, instead of based on an image captured by the imaging device 100. When the subject distance measured by the distance-measuring sensor 250 is less than the first subject distance, the focus control section 118 may control the position of the focus lens 210 based on images captured by the imaging device 100. The first subject distance may be determined in advance based on the optical characteristics of the lens section 200. The first subject distance may be a subject distance at which the amount of change in the position of the focus lens 210 with respect to the focusing distance is equal to or less than a predetermined threshold. Since the first subject distance depends on the optical characteristics of the imaging device 100, it can be set based on actually measured values at the manufacturing stage.
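The choice between distance-based control and image-based (contrast AF) control can be sketched as a simple policy. The threshold value below is a hypothetical example, not a value from the disclosure:

```python
FIRST_SUBJECT_DISTANCE_M = 30.0  # hypothetical threshold from lens characteristics

def focus_policy(subject_distance_m):
    """Pick the control source: the sensor distance for far subjects,
    contrast AF on captured images for near subjects."""
    if subject_distance_m > FIRST_SUBJECT_DISTANCE_M:
        return "distance_based"  # lens-position error is negligible far away
    return "contrast_af"         # near subjects need the image-based search

p1 = focus_policy(50.0)  # far subject  -> distance-based control
p2 = focus_policy(3.0)   # near subject -> contrast AF
```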
Here, the optical characteristics of the lens system of the imaging device 100 change with temperature. The focusing distance corresponding to the position of the focus lens 210 determined by the determining section 116 also changes with temperature. To determine the focusing distance with higher accuracy, the determining section 116 preferably also takes the temperature into account.
FIG. 5 shows the parameters considered when determining the focusing distance of the imaging device 100. D represents the distance from the imaging surface 700 to the subject 710. f represents the focal length. x1 represents the distance from the image-side focal point 701 of the imaging device 100 to the imaging surface 700. x2 represents the distance from the object-side focal point 702 of the imaging device 100 to the in-focus point 711. From the lens formula, x1 · x2 = f^2 and 1/x2 = x1/f^2 can be derived. Furthermore, from D = x2 + 2·f + x1, the focusing distance corresponding to the position of the focus lens 210 can be determined.
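The relations above can be checked numerically. A small Python sketch with illustrative values (the focal length and x1 are examples, not values from the disclosure):

```python
def focusing_distance(f_mm, x1_mm):
    """Subject distance D from the Newtonian lens relation x1 * x2 = f^2,
    combined with D = x2 + 2*f + x1 (all quantities in millimetres)."""
    x2_mm = f_mm ** 2 / x1_mm      # x2 = f^2 / x1
    return x2_mm + 2.0 * f_mm + x1_mm

# Example: f = 50 mm and x1 = 0.5 mm give x2 = 5000 mm, so D = 5100.5 mm.
D = focusing_distance(50.0, 0.5)
```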
FIG. 6 shows an example of the relationship, predetermined according to the temperature of the imaging device 100, between the distance x1 from the image-side focal point 701 of the imaging device 100 to the imaging surface and the distance x2 from the object-side focal point 702 of the imaging device 100 to the in-focus point 711. The predetermined relationship can be derived from values actually measured at the manufacturing stage of the imaging device 100. For example, at a predetermined temperature T0, the distance x1 is derived for two points whose distances from the imaging device 100 are known. Let the values of x1 at these two points be A and B. After performing autofocus such as contrast AF several times, A and B can be derived from the averages. Then, with the vertical axis as x1 and the horizontal axis as 1/x2, the straight line 800 passing through the origin, A, and B is taken as the calibration data at the temperature T0. As the temperature changes, the straight line 800 translates in parallel along the x1 axis. From values actually measured at each temperature, an offset C corresponding to the difference between the temperature T0 and the current temperature can be derived. Therefore, from the calibration data indicating the relationship between the distance x1 and the distance x2 at the reference temperature T0 and the offset C, the determining section 116 can derive the focusing distance, calibrated according to the temperature of the imaging device 100, corresponding to the position of the focus lens 210.
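The temperature compensation just described — a straight line x1 = f^2 · (1/x2) through the origin at T0, shifted along the x1 axis by an offset C at other temperatures — can be sketched as follows; the slope and offset values are illustrative only:

```python
def x1_from_x2(x2_mm, f_mm, offset_c_mm=0.0):
    """Calibration line: x1 = f^2 * (1/x2) + C, where the offset C shifts
    the T0 line along the x1 axis to compensate for temperature."""
    return f_mm ** 2 / x2_mm + offset_c_mm

# At the reference temperature T0 the line passes through the origin (C = 0).
x1_ref = x1_from_x2(5000.0, 50.0)          # -> 0.5
# At another temperature, apply the measured offset C (hypothetical 0.02 mm).
x1_hot = x1_from_x2(5000.0, 50.0, 0.02)
```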
The acquisition section 112 may acquire the internal temperature of the imaging device 100 via a temperature sensor provided on the imaging device 100. The determining section 116 may then determine the focusing distance corresponding to the position of the focus lens 210 determined by the specifying section 114, based on the relationship, predetermined according to the temperature of the imaging device 100 as shown in FIG. 6, between the distance x1 from the image-side focal point 701 of the imaging device 100 to the imaging surface 700 and the distance x2 from the object-side focal point 702 of the imaging device 100 to the in-focus point 711. The determining section 116 may translate the calibration data at the reference temperature T0 in parallel along the x1 axis by the offset C corresponding to the temperature acquired by the acquisition section 112, and determine the focusing distance corresponding to the position of the focus lens 210 based on the temperature-compensated calibration data. Thus, even though the optical characteristics of the lens system of the imaging device 100 change with the temperature of the imaging device 100, deviation of the focusing distance corresponding to the position of the focus lens 210 determined by the determining section 116 can be suppressed.
FIG. 7 shows an example of the positional relationship between the imaging range 720 of the imaging device 100 and the ranging range 721 of the distance-measuring sensor 250. If the distance-measuring sensor 250 is provided on the imaging device 100, the positional relationship between the imaging range 720 of the imaging device 100 and the ranging range 721 of the distance-measuring sensor 250 does not change even when the attitude of the imaging device 100 changes via the gimbal 50. That is, the correspondence between the imaging area of the imaging device 100 and the ranging area of the distance-measuring sensor 250 does not change. By setting the focus area within the ranging range 721 of the distance-measuring sensor 250 inside the imaging range 720 corresponding to the angle of view of the imaging device 100, the distance-measuring sensor 250 can measure the distance to a subject existing within the focus area.
On the other hand, as shown in FIG. 8, when the distance-measuring sensor 250 is provided not on the imaging device 100 but on the UAV body 20 or the like, there may be cases where, depending on the attitude of the imaging device 100, the imaging range 722 corresponding to the angle of view of the imaging device 100 does not overlap the ranging range 723 of the distance-measuring sensor 250. In this case, the distance-measuring sensor 250 can be used to measure in advance the distance to the subject to be focused on. Then, when the UAV control section 30 controls the gimbal 50 so that the imaging range 722 of the imaging device 100 is included in the ranging range 723 of the distance-measuring sensor 250, the focus control section 118 can perform focus control by starting to move the focus lens 210 in the moving direction determined by the determining section 116. That is, as the ranging area of the distance-measuring sensor 250 comes to be included in the imaging area of the imaging device 100, the focus control section 118 can perform focus control by starting to move the focus lens 210 in the moving direction determined by the determining section 116. As the ranging area of the distance-measuring sensor 250 comes to be included in the imaging area of the imaging device 100, the acquisition section 112 can acquire the subject distance, measured by the distance-measuring sensor 250, in the ranging area corresponding to the focus area. Then, the specifying section 114 determines the current position of the focus lens 210, and the determining section 116 determines the moving direction of the focus lens 210 by comparing the focusing distance corresponding to the current position of the focus lens 210 with the subject distance, and performs focus control by starting to move the focus lens 210 in that moving direction.
While moving the focus lens 210 from the infinity side toward the closest-end side, the focus control section 118 may determine, in accordance with the hill-climbing method, the position of the focus lens at which the contrast value peaks, based on a plurality of images captured by the imaging section 102.
When the subject distance measured by the distance-measuring sensor 250 is greater than the focusing distance corresponding to the current position of the focus lens 210, the focus control section 118 may move the focus lens 210 to infinity and then, while moving the focus lens 210 toward the closest-end side, determine the position of the focus lens at which the contrast value peaks in accordance with the hill-climbing method.
When the subject distance measured by the distance-measuring sensor 250 is less than the focusing distance corresponding to the current position of the focus lens 210, the focus control section 118 may determine the position of the focus lens at which the contrast value peaks in accordance with the hill-climbing method while moving the focus lens 210 from its current position toward the closest-end side.
When the UAV 10 moves, the distance from the imaging device 100 to the subject to be focused on changes. When the UAV 10 is moving, even if the moving direction of the focus lens 210 is determined from the subject distance measured by the distance-measuring sensor 250, that moving direction is not necessarily appropriate. Therefore, the acquisition section 112 may acquire the subject distance via the distance-measuring sensor 250 when the UAV 10 is not moving. The acquisition section 112 may acquire the subject distance via the distance-measuring sensor 250 while the UAV 10 is hovering.
The acquisition section 112 can acquire the respective subject distances of a plurality of ranging areas within the imaging area of the imaging device 100. The imaging control section 110 may notify the user of the respective subject distances of the plurality of ranging areas acquired by the acquisition section 112. For example, as shown in FIG. 9, the imaging control section 110 may notify the user by displaying the subject distance of each ranging area acquired by the acquisition section 112 superimposed on the image captured by the imaging device 100 on the display section of the remote operation device 300.
FIG. 10 is a flowchart showing an example of the focus control procedure of the imaging device 100. While the UAV 10 is in flight, the acquisition section 112 acquires the respective subject distances of the plurality of ranging areas via the distance-measuring sensor 250 (S100). The acquisition section 112 may acquire, from among the plurality of ranging areas, the subject distances of the ranging areas that the distance-measuring sensor 250 is able to measure. The specifying section 114 determines whether the UAV 10 is moving, that is, whether it is hovering (S102). If the UAV 10 is moving, i.e., not hovering, the acquisition section 112 again acquires the respective subject distances of the plurality of ranging areas via the distance-measuring sensor 250.
If the UAV 10 is not moving, i.e., is hovering, the specifying section 114 sets a focus area within the imaging area of the imaging device 100 (S104). The specifying section 114 may set an area selected by the user within the imaging area of the imaging device 100 as the focus area. The specifying section 114 may set an area containing a predetermined subject within the imaging area of the imaging device 100 as the focus area.
The specifying section 114 determines, from among the subject distances from the distance-measuring sensor 250, the subject distance of the ranging area corresponding to the focus area, and the current position of the focus lens 210 (S106). Next, the determining section 116 determines whether the determined subject distance is greater than the focusing distance corresponding to the position of the focus lens 210 (S108). To derive the focusing distance corresponding to the position of the focus lens 210, the determining section 116 may derive, from the calibration data indicating the relationship between the distance x1 and the distance x2 at the reference temperature T0 as shown in FIG. 6 and the offset C, the focusing distance, calibrated according to the temperature of the imaging device 100, corresponding to the position of the focus lens 210.
When the subject distance is greater than the focusing distance corresponding to the current position of the focus lens 210, the determining section 116 determines the moving direction of the focus lens 210 at the start of focus control to be the direction of infinity (S110). After moving the focus lens 210 toward infinity, the focus control section 118 searches for the position of the focus lens 210 at which the contrast value peaks based on the contrast values of a plurality of images captured by the imaging device 100, and performs autofocus (S112).
On the other hand, when the subject distance is less than the focusing distance corresponding to the current position of the focus lens 210, the determining section 116 determines the moving direction of the focus lens 210 at the start of focus control to be the direction of the closest end (S114). After moving the focus lens 210 toward the closest end, the focus control section 118 searches for the position of the focus lens 210 at which the contrast value peaks based on the contrast values of a plurality of images captured by the imaging device 100, and performs autofocus (S116).
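The S100–S116 procedure can be summarized in pseudocode-like Python. Every function and object name here is an illustrative stand-in for the corresponding step, not an API defined by this disclosure:

```python
def focus_control(sensor, uav, lens, camera):
    """Sketch of FIG. 10: wait for hover, pick the focus area's distance,
    choose a starting direction, then run the contrast-peak search."""
    while True:
        distances = sensor.measure_all_areas()        # S100: all ranging areas
        if uav.is_hovering():                         # S102: skip while moving
            break
    area = camera.set_focus_area()                    # S104: pick focus area
    subject_d = distances[area]                       # S106: its subject distance
    focusing_d = lens.focusing_distance(lens.position())
    if subject_d > focusing_d:                        # S108: compare distances
        lens.move_toward("infinity")                  # S110
    else:
        lens.move_toward("closest_end")               # S114
    return lens.search_contrast_peak()                # S112 / S116: contrast AF
```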
As described above, when performing autofocus, the moving direction of the focus lens 210 is determined based on the subject distance, measured by the distance-measuring sensor 250, of the subject to be focused on. The subject distance measured by a distance-measuring sensor 250 such as a TOF sensor is not necessarily accurate, and even if the focusing distance is made to match the subject distance, the desired focus state may not be obtained. However, the subject distance makes it possible to roughly grasp the distance between the subject and the imaging device 100. The focus control section 118 uses this distance to determine the moving direction of the focus lens 210. Therefore, when the focus control section 118 performs focus control, the focus lens 210 can be moved efficiently.
FIG. 11 shows an example of a computer 1200 that can embody aspects of the present invention in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with a device according to an embodiment of the present invention or as one or more "sections" of the device. Alternatively, the program can cause the computer 1200 to perform the operation or the one or more "sections". The program can cause the computer 1200 to execute a process according to an embodiment of the present invention or stages of the process. Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 according to the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program and the like executed by the computer 1200 at the time of activation, and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. A program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing the operation or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium, or the like.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as various types of programs, data, tables, and databases may be stored in a recording medium and subjected to information processing. For data read from the RAM 1214, the CPU 1212 can perform various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. The CPU 1212 can also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve, from among the plurality of entries, an entry that matches the condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The above programs or software modules may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, whereby the programs can be provided to the computer 1200 via the network.
It should be noted that, as long as "before", "prior to", and the like are not explicitly indicated, and as long as the output of a preceding process is not used in a subsequent process, the order of execution of the processes, such as actions, sequences, steps, and stages in the devices, systems, programs, and methods shown in the claims, specification, and drawings, may be realized in any order. Even if the operational flows in the claims, specification, and drawings are described using "first", "next", and the like for convenience, this does not mean that they must be performed in that order.
Although the present invention has been described above using embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various modifications or improvements can be made to the above embodiments. It is apparent from the description of the claims that forms incorporating such modifications or improvements can also be included within the technical scope of the present invention.

Claims (16)

  1. 一种控制装置,其特征在于,包括:
    获取部,其通过测距传感器获取被摄体距离,所述被摄体距离表示从摄像装置到被摄体的距离;
    特定部,其确定所述摄像装置具备的聚焦镜头的位置;
    确定部,其基于所述被摄体距离以及所述聚焦镜头的位置,确定所述聚焦镜头的移动方向;以及
    控制部,其在使所述聚焦镜头向所述移动方向移动后,基于由所述摄像装置拍摄的图像,控制所述聚焦镜头的位置。
  2. 根据权利要求1所述的控制装置,其特征在于,当所述被摄体距离大于由所述特定部确定的所述聚焦镜头的位置所对应的对焦距离时,所述确定部将所述聚焦镜头的移动方向确定为无限远的方向。
  3. 根据权利要求1所述的控制装置,其特征在于,当所述被摄体距离小于与由所述特定部确定的所述聚焦镜头的位置对应的对焦距离时,所述确定部将所述聚焦镜头的移动方向确定为最近端的方向。
  4. 根据权利要求1所述的控制装置,其特征在于,当所述被摄体距离大于第一被摄体距离时,所述控制部基于所述被摄体距离控制所述聚焦镜头的位置,代替基于由所述摄像装置拍摄的图像控制所述聚焦镜头的位置。
  5. 根据权利要求4所述的控制装置,其特征在于,当所述被摄体距离小于所述第一被摄体距离时,所述控制部基于由所述摄像装置拍摄的图像控制所述聚焦镜头的位置。
  6. 根据权利要求1所述的控制装置,其特征在于,所述获取部通过所述测距传感器获取多个测距区域各自的被摄体距离,
    所述确定部基于所述多个测距区域中所述摄像装置的摄像区域内与对焦区域对应的测距区域的所述被摄体距离以及所述聚焦镜头的位置,确定所述聚焦镜头的移动方向。
  7. 根据权利要求2所述的控制装置,其特征在于,所述获取部还通过温度传感 器获取所述摄像装置的温度,
    所述确定部基于从所述摄像装置的像侧焦点至摄像面的距离与从所述摄像装置的物体侧焦点至对焦点的距离之间根据所述摄像装置的温度预先确定的关系,确定与由所述特定部确定的所述聚焦镜头的位置对应的所述对焦距离。
  8. 根据权利要求1所述的控制装置,其特征在于,所述测距传感器为TOF传感器。
  9. 一种摄像装置,其特征在于,包括:根据权利要求1至8中任一项所述的控制装置;以及
    所述聚焦镜头。
  10. 一种摄像系统,其特征在于,包括:根据权利要求9所述的摄像装置;
    所述测距传感器;以及
    以可调整所述摄像装置的姿势的方式支撑所述摄像装置的支撑机构。
  11. 根据权利要求10所述的摄像系统,其特征在于,对应于所述支撑机构控制所述摄像装置的姿势而使所述测距传感器的测距区域包含在所述摄像装置的摄像区域内,所述控制部在使所述聚焦镜头向所述移动方向移动后,基于由所述摄像装置拍摄的图像控制所述聚焦镜头的位置。
  12. A mobile body, characterized in that it includes the imaging system according to claim 10 and moves.
  13. The mobile body according to claim 12, wherein the acquiring unit acquires the subject distance via the ranging sensor while the mobile body is not moving.
  14. The mobile body according to claim 12, wherein the mobile body is a flying body, and
    the acquiring unit acquires the subject distance via the ranging sensor while the flying body is hovering.
  15. A control method, characterized by comprising the steps of:
    acquiring a subject distance via a ranging sensor, the subject distance indicating a distance from an imaging device to a subject;
    identifying a position of a focus lens included in the imaging device;
    determining a movement direction of the focus lens based on the subject distance and the position of the focus lens; and
    after moving the focus lens in the movement direction, controlling the position of the focus lens based on an image captured by the imaging device.
  16. A program, characterized in that it causes a computer to function as the control device according to any one of claims 1 to 7.
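Purely as an illustrative sketch of the focusing flow recited in claims 1 to 5 (the function names, string labels, and return shapes are assumptions for illustration, not part of the claims):

```python
def decide_direction(subject_distance, focus_distance):
    # Claims 2-3: if the subject is farther than the distance currently in
    # focus, move the focus lens toward the infinity side; if it is closer,
    # move it toward the nearest end.
    if subject_distance > focus_distance:
        return "infinity"
    if subject_distance < focus_distance:
        return "nearest"
    return "none"  # distances match: no initial move needed

def control_focus(subject_distance, focus_distance, first_subject_distance):
    # Claims 4-5: beyond a threshold (the "first subject distance"), rely on
    # the ranging-sensor distance alone; otherwise move in the decided
    # direction and then refine with image-based (e.g. contrast) autofocus.
    direction = decide_direction(subject_distance, focus_distance)
    if subject_distance > first_subject_distance:
        return ("distance_based", direction)
    return ("image_based", direction)

print(control_focus(50.0, 10.0, 30.0))  # ('distance_based', 'infinity')
print(control_focus(5.0, 10.0, 30.0))   # ('image_based', 'nearest')
```

The two-stage design matches the claims' motivation: a ranging sensor gives the correct search direction (or, at long range, the focus position itself) without hunting, while image-based control supplies the final precision at close range.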
PCT/CN2019/122976 2018-12-19 2019-12-04 Control device, imaging device, imaging system, mobile body, control method, and program WO2020125414A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980009116.2A CN111615663A (zh) 2018-12-19 2019-12-04 Control device, imaging device, imaging system, mobile body, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018236887A JP6746856B2 (ja) 2018-12-19 2018-12-19 Control device, imaging system, mobile body, control method, and program
JP2018-236887 2018-12-19

Publications (1)

Publication Number Publication Date
WO2020125414A1 true WO2020125414A1 (zh) 2020-06-25

Family

ID=71100735

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/122976 WO2020125414A1 (zh) 2018-12-19 2019-12-04 Control device, imaging device, imaging system, mobile body, control method, and program

Country Status (3)

Country Link
JP (1) JP6746856B2 (zh)
CN (1) CN111615663A (zh)
WO (1) WO2020125414A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130010177A1 (en) * 2011-07-07 2013-01-10 Samsung Electronics Co., Ltd. Digital photographing apparatus, method of controlling the same, and auto-focusing method
CN103019002A (zh) * 2012-02-21 2013-04-03 深圳市阿格斯科技有限公司 Zoom tracking autofocus control device for an optical zoom camera and control method thereof
CN205311921U (zh) * 2015-11-25 2016-06-15 深圳市大疆创新科技有限公司 Aerial photography follow-focus control system and aircraft
CN106290246A (zh) * 2016-08-09 2017-01-04 上海禾赛光电科技有限公司 Ground positioning device for a GPS-free unmanned aerial vehicle and gas telemetry system
WO2018123013A1 (ja) * 2016-12-28 2018-07-05 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Control device, mobile body, control method, and program
CN108351574A (zh) * 2015-10-20 2018-07-31 深圳市大疆创新科技有限公司 System, method, and device for setting camera parameters

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01123955A (ja) * 1987-11-10 1989-05-16 Sanyo Electric Co Ltd Supply and exhaust device for a cryogenic refrigerator
JP3599483B2 (ja) * 1996-05-21 2004-12-08 キヤノン株式会社 Optical apparatus
JPH1123955A (ja) * 1997-07-08 1999-01-29 Nikon Corp Autofocus camera
JP2000098217A (ja) * 1998-09-22 2000-04-07 Casio Comput Co Ltd Autofocus device and focusing method
JP2003279839A (ja) * 2002-03-20 2003-10-02 Ricoh Co Ltd Imaging device
JP4928190B2 (ja) * 2006-08-11 2012-05-09 キヤノン株式会社 Focus control device and imaging device
CN101419379A (zh) * 2007-10-23 2009-04-29 鸿富锦精密工业(深圳)有限公司 Camera module with autofocus function and focusing method thereof
JP2009294416A (ja) * 2008-06-05 2009-12-17 Sony Corp Imaging device and control method thereof
JP2011248181A (ja) * 2010-05-28 2011-12-08 Hitachi Ltd Imaging device
JP5901423B2 (ja) * 2012-05-18 2016-04-13 キヤノン株式会社 Lens device, imaging system, and lens device control method
EP3040754B1 (en) * 2013-12-03 2019-03-27 Sony Corporation Imaging device, method, and program
JP6614822B2 (ja) * 2015-06-22 2019-12-04 キヤノン株式会社 Image encoding device, control method thereof, program, and storage medium


Also Published As

Publication number Publication date
CN111615663A (zh) 2020-09-01
JP2020098289A (ja) 2020-06-25
JP6746856B2 (ja) 2020-08-26

Similar Documents

Publication Publication Date Title
WO2019120082A1 (zh) Control device, system, control method, and program
WO2019238044A1 (zh) Determination device, mobile body, determination method, and program
WO2020011230A1 (zh) Control device, mobile body, control method, and program
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
WO2020098603A1 (zh) Determination device, imaging device, imaging system, mobile body, determination method, and program
CN110337609B (zh) Control device, lens device, imaging device, flying body, and control method
WO2019242616A1 (zh) Determination device, imaging system, mobile body, synthesis system, determination method, and program
WO2019174343A1 (zh) Moving-object detection device, control device, mobile body, moving-object detection method, and program
JP6543875B2 (ja) Control device, imaging device, flying body, control method, and program
WO2020020042A1 (zh) Control device, mobile body, control method, and program
CN110785997B (zh) Control device, imaging device, mobile body, and control method
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
JP2019096965A (ja) Determination device, control device, imaging system, flying body, determination method, and program
WO2020125414A1 (zh) Control device, imaging device, imaging system, mobile body, control method, and program
WO2020011198A1 (zh) Control device, mobile body, control method, and program
WO2020108284A1 (zh) Determination device, mobile body, determination method, and program
JP6878738B1 (ja) Control device, imaging system, mobile body, control method, and program
CN111213369B (zh) Control device, method, imaging device, mobile body, and computer-readable storage medium
WO2020216057A1 (zh) Control device, imaging device, mobile body, control method, and program
WO2021143425A1 (zh) Control device, imaging device, mobile body, control method, and program
JP6413170B1 (ja) Determination device, imaging device, imaging system, mobile body, determination method, and program
WO2020001335A1 (zh) Control device, imaging device, mobile body, control method, and program
JP2020052220A (ja) Control device, imaging device, mobile body, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19899623

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19899623

Country of ref document: EP

Kind code of ref document: A1