WO2020025051A1 - Control device, method, and program - Google Patents

Control device, method, and program

Info

Publication number
WO2020025051A1
WO2020025051A1 (PCT/CN2019/099056)
Authority
WO
WIPO (PCT)
Prior art keywords
lens
evaluation value
contrast evaluation
imaging device
time
Prior art date
Application number
PCT/CN2019/099056
Other languages
English (en)
French (fr)
Inventor
本庄谦一
永山佳范
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201980005056.7A (published as CN 111226153 A)
Publication of WO2020025051A1
Priority to US17/161,535 (published as US 2021/0152744 A1)

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B 7/04 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B 7/08 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B 7/04 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B 7/09 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted for automatic focusing or varying magnification
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B 7/38 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method

Definitions

  • The invention relates to a control device, a method, and a program.
  • In an imaging device that captures images, an autofocus (AF) method called contrast AF is used to automatically control the position of the lens to achieve focus.
  • In contrast AF, for example, a DC (Direct Current) motor that drives a gear to move the lens and a rotation sensor that detects the amount of rotation of the gear are used to control autofocus.
  • When the DC motor drives the gear in one direction, the lens moves in a preset direction parallel to the optical axis.
  • When the DC motor drives the gear in the opposite direction, the lens moves in the direction opposite to that preset direction.
  • The first step detects the peak of the contrast evaluation value of the image obtained from the imaging element while the DC motor drives the gear in one direction.
  • The second step then drives the gear in the opposite direction with the DC motor until the contrast evaluation value has passed the peak once.
  • The third step then drives the gear in the first direction again with the DC motor, stopping the gear with the peak of the contrast evaluation value as the target.
  • Throughout these steps, the position of the lens is determined based on the amount of rotation of the gear detected by the rotation sensor.
  • The contrast evaluation value of an image is an evaluation value indicating the degree of contrast of the image; in this specification, a larger contrast evaluation value indicates stronger contrast.
  • Patent Document 1 discloses a technique of detecting a position in the optical axis direction of a focus lens by an MR (Magneto Resistive) sensor (refer to paragraph 0039 of Patent Document 1).
  • Patent Document 1 discloses a case where the position of the lens is detected by the MR sensor, but does not disclose control of autofocus.
  • The MR sensor is a sensor that uses a magnetoresistive effect element to measure the magnitude of a magnetic field, exploiting the magnetoresistive effect in which the electrical resistance of a solid changes with an applied magnetic field.
  • Known magnetoresistive effects include the anisotropic magnetoresistive effect (AMR: Anisotropic Magneto Resistive effect), in which resistance changes with an external magnetic field; the giant magnetoresistive effect (GMR: Giant Magneto Resistive effect); and the tunnel magnetoresistive effect (TMR: Tunnel Magneto Resistive effect), in which applying a magnetic field changes the resistance of a magnetic tunnel junction element through which a tunnel current flows.
  • Patent Document 1: JP 2004-37121 A
  • In a conventional imaging device, a step is first performed in which the gear is moved in one direction to detect the peak of the contrast evaluation value. Next, a step is performed in which the gear is moved in the opposite direction until the contrast evaluation value has passed the peak once. Then, a step is performed in which the gear is moved in the first direction again and stopped with the peak as the target. Because all of these steps are necessary, the lens movement required to achieve the in-focus state often takes a long time.
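The conventional three-step sequence above can be sketched in Python. The toy contrast function, unit step size, and step counter below are illustrative assumptions, not the patent's implementation; the point is that the lens crosses the peak three times before stopping.

```python
def contrast(pos, peak_pos=50):
    # Toy contrast evaluation value: highest at the in-focus position.
    return 100 - abs(pos - peak_pos)

def conventional_af(start=0, step=1, limit=200):
    """Simulate conventional contrast AF; returns (final position, steps taken)."""
    pos, steps = start, 0
    # Step 1: drive in one direction until the contrast value passes its peak.
    prev = contrast(pos)
    peak = prev
    while steps < limit:
        pos += step
        steps += 1
        cur = contrast(pos)
        if cur < prev:          # the peak has just been passed
            peak = prev
            break
        prev = cur
    # Step 2: reverse until the contrast value has passed the peak once more.
    prev = contrast(pos)
    while steps < limit:
        pos -= step
        steps += 1
        cur = contrast(pos)
        if cur < prev:
            break
        prev = cur
    # Step 3: drive forward again and stop at the peak.
    while steps < limit and contrast(pos + step) > contrast(pos):
        pos += step
        steps += 1
    return pos, steps
```

With the peak at position 50 and a start at 0, the simulation lands on the peak only after 54 unit moves; the extra back-and-forth travel is exactly the time cost the invention aims to remove.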
  • The present invention has been made in view of this situation, and its object is to provide a control device, a method, and a program capable of shortening the time required to achieve the in-focus state.
  • A control device includes a position detection unit that detects the position of a lens, and a control unit that moves the lens in a first direction along an optical axis, acquires an image through the lens, and acquires a contrast evaluation value from the image.
  • The control unit stores the correspondence between the contrast evaluation value and the position of the lens in a memory; after detecting the in-focus position of the lens based on the contrast evaluation value, it reads the correspondence from the memory and moves the lens in a second direction opposite to the first direction, to the lens position corresponding to the in-focus position.
  • The position detection unit may include a magnetoresistive effect element provided on a lens frame that fixes the lens, and a magnetic member arranged along the optical axis and having a first polarity and a second polarity different from the first polarity.
  • Alternatively, the position detection unit may include a magnetic member with a first polarity and a second polarity different from the first polarity arranged along the optical axis, a lens frame provided with the magnetic member, and a magnetoresistive effect element provided on a guide shaft that moves the lens frame in the direction of the optical axis.
  • The length of the magnetic member may be longer than the sum of the range over which the lens is movable and the amount of backlash caused by the movement of the lens.
  • The control unit may store the correspondence in the memory when the in-focus position is detected.
  • the lens may be moved by a gear coupling mechanism.
  • the lens may be moved by a DC motor or a stepping motor.
  • A method includes: a stage in which a control device moves a lens in a first direction along an optical axis and acquires an image through the lens; a stage of acquiring a contrast evaluation value from the image; a stage of storing the correspondence between the contrast evaluation value and the position of the lens in a memory; and, after the in-focus position of the lens is detected based on the contrast evaluation value, a stage of reading the correspondence from the memory and moving the lens in a second direction opposite to the first direction, to the lens position corresponding to the in-focus position.
  • A program causes a computer to execute: a stage of moving a lens in a first direction along an optical axis and acquiring an image through the lens; a stage of acquiring a contrast evaluation value from the image; a stage of storing the correspondence between the contrast evaluation value and the position of the lens in a memory; and, after the in-focus position of the lens is detected based on the contrast evaluation value, a stage of reading the correspondence from the memory and moving the lens in a second direction opposite to the first direction, to the lens position corresponding to the in-focus position.
  • FIG. 1 is a diagram showing a schematic external structure of an imaging device according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing a schematic functional arrangement of an imaging device according to an embodiment of the present invention.
  • FIG. 3 is a diagram for explaining an example of autofocus processing performed by the imaging apparatus according to the embodiment of the present invention.
  • FIG. 4 is a diagram for explaining an example of a peak detection process of a contrast evaluation value performed by the imaging device according to the embodiment of the present invention.
  • FIG. 5 is a diagram for explaining an example of a peak detection process of a contrast evaluation value performed by the imaging device according to the embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of a flow of an autofocus process performed by an imaging apparatus according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating another example of the arrangement of the MR sensor and the magnetic member in the imaging device according to the embodiment of the present invention.
  • FIG. 8 is a diagram illustrating an example of a schematic functional configuration of an imaging apparatus according to another embodiment of the present invention.
  • FIG. 9 is a diagram showing an example of a hardware configuration of a control section and a memory.
  • FIG. 10 is a diagram showing an example of the appearance of an unmanned aircraft and a remote operation device.
  • FIG. 11 is a diagram for explaining an outline of a conventional contrast AF.
  • ... focusing section, 201-203 ... points, 211, 212 ... straight lines, 221 ... intersection point, 311 ... guide shaft, 401 ... position detection element, 501 ... computer, 511 ... host controller, 512 ... CPU, 513 ... RAM, 514 ... input/output controller, 515 ... communication interface, 601 ... unmanned aerial vehicle, 602 ... remote operation device, 611 ... UAV body, 612 ... universal joint, 1001 ... pulse count characteristic, 1002 ... contrast characteristic, P1, P11 ... detection focus points, P2, P12, P13 ... focus points.
  • FIG. 1 is a diagram showing a structure of a schematic appearance of an imaging device 1 according to an embodiment of the present invention.
  • the imaging device 1 includes a body portion 11 and a lens barrel portion 12.
  • the lens barrel portion 12 includes a lens 13.
  • the main body portion 11 includes buttons 14 to 16 and a viewfinder 17.
  • each of the buttons 14 to 16 is operated by a user and receives predetermined instructions related to, for example, power, shutter, exposure, and the like.
  • the structure of the imaging device 1 is not limited to the structure shown in FIG. 1, and other structures may be adopted.
  • FIG. 2 is a diagram showing a schematic functional arrangement of the imaging device 1 according to an embodiment of the present invention.
  • the imaging device 1 includes a lens 13, an annular rotary cam 21, a lens frame 31, a cam groove 32, a cam pin 33, an MR sensor 41, and a magnetic member 42 in the lens barrel portion 12.
  • the imaging device 1 includes a gear box 22 and an imaging unit 23 in the body portion 11.
  • the gear box 22 includes a DC (Direct Current) motor 61, a gear 62, and a two-phase rotation sensor 63.
  • the imaging section 23 includes an imaging element 71, an operation section 72, a display section 73, a memory 74, and a control section 75.
  • the control section 75 includes an acquisition section 81, a movement control section 82, a detection section 83, a determination section 84, and a focus section 85.
  • A device including the MR sensor 41, the control section 75, and the memory 74 is an example of the control device.
  • the imaging device 1 is also an example of a control device.
  • the configuration of the lens barrel portion 12 will be described.
  • the lens 13 is attached to and supported by a lens frame 31.
  • the lens frame 31 fixes the lens 13.
  • the lens frame 31 is provided with a cam pin 33 fitted in a cam groove 32 provided in the rotary cam 21. Then, the lens frame 31 can be moved along the cam groove 32 by the rotation mechanism of the rotation cam 21. Accordingly, the lens 13 mounted on the lens frame 31 can move along a preset movable axis D1.
  • the movable axis D1 of the lens 13 is an axis parallel to the optical axis of the lens 13. That is, the lens 13 can move along the optical axis of the lens 13.
  • FIG. 2 shows a movable axis D1 of the lens 13.
  • the lens frame 31 is provided with an MR sensor 41 which is a magnetoresistance effect element.
  • the MR sensor 41 is fixed to the lens frame 31.
  • the magnetic member 42 is fixed to the rotary cam 21.
  • The magnetic member 42 has N poles (an example of a first polarity) and S poles (an example of a second polarity different from the first polarity) arranged alternately in a direction parallel to the movable axis D1 of the lens 13.
  • a waveform corresponding to the amount of movement of the lens 13 is detected by the MR sensor 41. That is, each time the polarity of the magnetic member 42 facing the position of the MR sensor 41 is alternately changed between the N and S poles, pulses having the same shape are generated. Then, while the MR sensor 41 is moving in the same direction, the number of pulses generated is proportional to the amount of movement of the lens 13.
  • the length of the magnetic member 42 is preferably longer than a preset value.
  • the preset value is a length obtained by adding the length of the movable range of the lens 13 and the length of the backlash amount caused by the movement of the lens 13.
  • By covering a length corresponding to this preset value, the magnetic member 42 allows the amount of movement of the lens 13 to be determined accurately.
  • the MR sensor 41 detects the number of generated pulses as information corresponding to the amount of movement.
  • the MR sensor 41 detects a two-phase waveform such as a sine wave and a cosine wave as a waveform corresponding to the amount of movement.
  • the MR sensor 41 outputs information indicating the detected movement amount and movement direction to the control unit 75.
  • As the information indicating the amount of movement of the lens 13, for example, the amount of movement itself or a pulse count proportional to the amount of movement may be used.
  • In the example of FIG. 2, the MR sensor 41, which is a magnetoresistive effect element, and the magnetic member 42 form a position detection unit that detects the position of the lens 13.
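The two-phase waveform mentioned above can be turned into a movement amount and direction with a standard quadrature decode. The sketch below is illustrative (the sensor samples are assumed to arrive as digitized (A, B) level pairs); it is not the patent's circuitry.

```python
# Gray-code order of (A, B) states for movement in the forward direction.
_SEQ = [(0, 0), (1, 0), (1, 1), (0, 1)]

def decode_quadrature(samples):
    """Return the net step count from a sequence of (A, B) samples.

    A positive result means movement in the forward direction; a negative
    result means movement in reverse, which is how a two-phase sensor
    distinguishes direction where a one-phase sensor cannot.
    """
    count = 0
    for prev, cur in zip(samples, samples[1:]):
        if prev == cur:
            continue                       # no state change, no movement
        i, j = _SEQ.index(prev), _SEQ.index(cur)
        if (i + 1) % 4 == j:               # forward transition
            count += 1
        elif (i - 1) % 4 == j:             # reverse transition
            count -= 1
        # transitions that skip a state are invalid samples and are ignored
    return count
```

Running the same state sequence forward and backward yields counts of equal magnitude and opposite sign, which is why the control unit 75 can recover both the movement amount and the movement direction from the sensor output.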
  • various lenses may be used, and for example, an interchangeable lens may be used.
  • the gear 62 has a gear coupling mechanism, for example.
  • the gear coupling mechanism is a mechanism in which a plurality of gears mesh. Therefore, in the gear coupling mechanism, a position error may occur between rotation in one direction and rotation in the opposite direction.
  • the DC motor 61 is controlled by the control unit 75 to rotate the gear 62.
  • the rotation of the gear 62 causes the rotation cam 21 to rotate, thereby forming a structure in which the lens 13 moves along the movable axis D1.
  • When the gear 62 rotates in a predetermined rotation direction, the lens 13 moves in a predetermined direction along the movable axis D1.
  • When the gear 62 rotates in the opposite direction, the lens 13 moves in the opposite direction along the movable axis D1. That is, the lens 13 is moved by the DC motor 61 through the gear coupling mechanism.
  • the rotation sensor 63 detects the amount of rotation of the gear 62.
  • the rotation sensor 63 includes, for example, a light emitting portion on one side and a light receiving portion on the other side of the gear 62.
  • the light emitting section emits light, and the light receiving section receives the light.
  • the gear 62 is provided with teeth at regular intervals along the circumference. When there are no teeth between the light emitting section and the light receiving section, light from the light emitting section is received by the light receiving section. On the other hand, when there is a tooth between the light emitting section and the light receiving section, light from the light emitting section is blocked by the teeth and is not received by the light receiving section. Therefore, the rotation sensor 63 detects a waveform corresponding to the amount of rotation of the gear 62.
  • the number of pulses generated is proportional to the amount of rotation of the gear 62.
  • the amount of rotation of the gear 62 is proportional to the amount of movement of the lens 13.
  • the rotation sensor 63 detects the number of generated pulses as information indicating the amount of rotation.
  • the rotation sensor 63 detects a two-phase waveform such as a sine wave and a cosine wave as a waveform corresponding to the amount of rotation.
  • the rotation sensor 63 can determine the amount of rotation of the gear 62 and the direction in which the gear 62 rotates.
  • the rotation sensor 63 outputs information indicating the detected rotation amount and rotation direction to the control unit 75.
  • the configuration of the imaging unit 23 will be described.
  • the imaging element 71 is arranged on the optical axis of the lens 13.
  • the imaging element 71 captures an image obtained by the light passing through the lens 13.
  • The imaging element 71 outputs the captured image to the control unit 75.
  • the operation unit 72 is a button or the like operated by a user. In the present embodiment, the operation unit 72 includes buttons 14 to 16 and the like shown in FIG. 1.
  • the display unit 73 has a screen that displays an image or the like obtained from the imaging element 71. In the present embodiment, the display unit 73 displays an image on the screen of the viewfinder 17 shown in FIG. 1.
  • the memory 74 stores information. In the present embodiment, the memory 74 is controlled by the control unit 75. In the present embodiment, a case where the imaging unit 23 includes a memory 74 has been described. However, a configuration including a memory in the lens 13 or the like may be adopted.
  • the control section 75 performs various controls related to shooting.
  • The control unit 75 performs, for example, the following processing: receiving operations on the operation section 72; displaying images on the screen of the display section 73; receiving images from the imaging element 71; storing information in and deleting information from the memory 74; driving the DC motor 61; receiving information indicating the rotation amount and direction from the rotation sensor 63; and receiving information indicating the movement amount and direction from the MR sensor 41.
  • the acquisition unit 81 acquires an image obtained by the imaging element 71. This image is displayed on the screen of the display unit 73. This image is stored in the memory 74 as a captured image when the shutter button is pressed.
  • the shutter button is, for example, the button 15.
  • the acquisition unit 81 acquires the amount of movement and the direction of movement detected by the MR sensor 41 from the MR sensor 41.
  • The amount of movement detected by the MR sensor 41 does not determine the absolute position of the lens 13 but determines its position relative to a preset reference position.
  • Either a configuration in which information indicating the reference position is stored in the memory 74 in advance, or a configuration in which the control unit 75 detects the reference position and stores the information indicating it in the memory 74, may be adopted.
  • a memory (not shown) may be provided outside the memory 74 and inside the lens barrel portion 12.
  • the memory in the lens barrel section 12 may store the same information as the information stored in the memory 74.
  • the memory in the lens barrel section 12 can store information for driving the lens 13.
  • An arbitrary position can be adopted as the reference position.
  • For example, one end of the range in which the lens 13 can move may be used as the reference position.
  • The control unit 75 determines the absolute position of the lens 13 based on the reference position of the lens 13 and the amount of movement from the reference position. That is, when the movement amount at the reference position is set to an initial value such as 0, the position is represented as the difference from that initial value.
  • The control unit 75 adds the number of pulses generated while the moving direction of the lens 13 is the preset direction, and subtracts the number of pulses generated while the moving direction is opposite to the preset direction. The control unit 75 then takes the total of these pulse numbers as the pulse count. Consequently, the pulse count, which indicates the amount of movement from the reference position, corresponds one-to-one to the position of the lens 13.
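The signed accumulation described above can be sketched as a small counter. The class name, the `forward` flag, and the millimetre pitch are illustrative assumptions; the patent only specifies that pulses are added in one direction and subtracted in the other.

```python
class PulseCounter:
    """Signed pulse count relative to the reference position (count == 0)."""

    def __init__(self):
        self.count = 0

    def on_pulses(self, n, forward):
        """Accumulate n pulses; forward=True means the preset direction."""
        self.count += n if forward else -n

    def position(self, mm_per_pulse):
        # Absolute position = reference position + count * pulse pitch.
        return self.count * mm_per_pulse
```

For example, 120 pulses in the preset direction followed by 20 pulses in the opposite direction leaves a count of 100; with a hypothetical pitch of 0.01 mm per pulse, the lens is 1.0 mm from the reference position.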
  • the reference position for example, the initial position of the lens 13 or the position of the lens 13 when the imaging device 1 is powered on may be predetermined.
  • the reference position can be detected by any method.
  • For example, a light-emitting portion that emits light may be provided, at the reference position, in a portion other than the lens 13.
  • A light-receiving portion that receives the light from the light-emitting portion may be provided on the opposite side of the movement path of the lens 13, and the lens 13 side may be provided with a shielding member that blocks the light. In this configuration, when the lens 13 is located at the reference position, the light from the light-emitting portion is blocked by the shielding member and is not received by the light-receiving portion.
  • The control unit 75 then determines that the lens 13 is at the reference position when the light from the light-emitting portion is not received by the light-receiving portion.
  • the control unit 75 may set the amount of movement when the lens 13 is at the reference position to an initial value such as 0.
  • In the present embodiment, the position of the lens 13 is represented by an absolute position, that is, the combination of the reference position and a movement amount.
  • Alternatively, only the movement amount relative to the reference position may be used; it is sufficient that the position of the lens 13 at which the peak value was obtained can be reproduced.
  • the acquisition unit 81 acquires the rotation amount and the rotation direction detected by the rotation sensor 63 from the rotation sensor 63.
  • the amount of rotation detected by the rotation sensor 63 does not determine the absolute position of the lens 13 but determines the relative position of the lens 13.
  • In the present embodiment, a case where a two-phase MR sensor 41 is used is shown, but an MR sensor with three or more phases may be used.
  • a one-phase MR sensor may be used.
  • the moving direction of the lens 13 may be determined by the control unit 75 based on the driving direction of the DC motor 61 or the like.
  • In the present embodiment, a case where a two-phase rotation sensor 63 is used is shown, but a rotation sensor with three or more phases may be used.
  • a one-phase rotation sensor may be used.
  • the moving direction of the lens 13 may be determined by the control unit 75 based on the driving direction of the DC motor 61 and the like, for example.
  • the movement control unit 82 controls the driving of the DC motor 61 to move the lens 13 along the movable axis D1. Thereby, the lens 13 moves along the movable axis D1.
  • the detection unit 83 calculates a preset contrast evaluation value for the image acquired by the acquisition unit 81. Then, the detection unit 83 detects the peak of the contrast evaluation value while the lens 13 is moving in a predetermined direction.
  • Any method may be adopted for calculating the contrast evaluation value of an image. In general, the higher the contrast evaluation value of a captured image, the closer the image is considered to be to the in-focus state.
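Since the patent leaves the calculation method open, here is one common focus measure as an illustrative sketch: the sum of squared differences between horizontally adjacent pixels of a grayscale image. Sharp edges produce large neighbour differences, so in-focus images score higher.

```python
def contrast_value(image):
    """Sum of squared horizontal pixel differences over a grayscale image.

    image: list of rows, each row a list of pixel intensities (0-255).
    A sharper (more in-focus) image yields a larger value.
    """
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += (b - a) ** 2
    return total
```

A hard edge such as `[0, 0, 255, 255]` scores 255² = 65025, while a blurred version of the same edge, `[0, 128, 200, 255]`, scores only 128² + 72² + 55² = 24593, consistent with the hill-climbing behaviour the AF relies on.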
  • the determination unit 84 determines the position of the lens 13 at the time when the peak value detected by the detection unit 83 is obtained based on the amount of movement and the movement direction detected by the MR sensor 41.
  • the control unit 75 stores information that can determine the position of the lens 13 and information that can determine the contrast evaluation value of the image acquired by the acquisition unit 81 in the memory 74 in association.
  • The association may be made through time: the two pieces of information can be related via the correspondence between time and the information that determines the position of the lens 13, and the correspondence between time and the information that determines the contrast evaluation value.
  • The correspondence between time and the information that determines the position of the lens 13 may be obtained discretely at a certain time interval.
  • The correspondence between time and the information that determines the contrast evaluation value may also be obtained discretely at that time interval.
  • The information that determines the position of the lens 13 may be monitored by the determination unit 84 and calculated at the predetermined time interval.
  • The information that determines the contrast evaluation value of the image acquired by the acquisition unit 81 may be monitored by the detection unit 83 and calculated at the predetermined time interval. Either or both of the detection unit 83 and the determination unit 84 may perform their respective calculations based on the information stored in the memory 74 in this way.
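The time-keyed association described above can be sketched as a nearest-timestamp join between two sampled logs. The log formats and the nearest-neighbour rule are illustrative assumptions; the patent only requires that both streams be relatable through time.

```python
def nearest(log, t):
    """Return the value in log ([(time, value), ...]) closest in time to t."""
    return min(log, key=lambda entry: abs(entry[0] - t))[1]

def associate(position_log, contrast_log):
    """Pair each contrast sample with the lens position nearest in time.

    position_log: [(time, lens_position), ...] sampled at some interval.
    contrast_log: [(time, contrast_value), ...] sampled per frame.
    Returns [(lens_position, contrast_value), ...].
    """
    return [(nearest(position_log, t), c) for t, c in contrast_log]
```

For example, a position log sampled every 10 time units and contrast samples at arbitrary frame times produce (position, contrast) pairs from which the peak, and the lens position that produced it, can be read back.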
  • The focusing section 85 performs the following control: by controlling the drive of the DC motor 61 through the movement control section 82, it moves the lens 13 to the position determined by the determination section 84, that is, the position of the lens 13 at the time when the peak of the contrast evaluation value was obtained.
  • An example of the autofocus processing performed by the imaging device 1 according to an embodiment of the present invention will be described with reference to FIG. 3.
  • the horizontal axis represents time.
  • the vertical axis shows a pulse count characteristic 1001 indicating a pulse count of the MR sensor 41 with respect to time and a contrast characteristic 1002 showing a contrast evaluation value with respect to time.
  • Times t1 to t4 are shown in order from earliest to latest, with time 0 as the origin.
  • At time 0, the pulse count is set to zero.
  • the movement control unit 82 controls the driving of the DC motor 61 so that the gear 62 rotates at a constant speed in a predetermined rotation direction.
  • the detection unit 83 detects the peak around time t2 after the peak of the contrast evaluation value occurs at time t1.
  • the determination unit 84 determines the position of the lens 13 at the time when the peak occurs.
  • the focus unit 85 controls the driving of the DC motor 61 around the time t2 by the movement control unit 82 to temporarily stop the gear 62.
  • the focus unit 85 controls the driving of the DC motor 61 through the movement control unit 82, so that the gear 62 rotates at a constant speed in a direction opposite to the preset rotation direction.
  • the focusing unit 85 moves the lens 13 to a position at which the peak value occurs based on the pulse count detected by the MR sensor 41 and stops. Therefore, in the imaging device 1, the in-focus state is achieved at time t4.
  • The same rotation speed is adopted for the gear 62 regardless of the rotation direction.
  • the contrast evaluation value rises from time 0, reaches a peak value at time t1, and becomes a maximum value. Then, the contrast evaluation value gradually decreases from time t1 to time t2.
  • the detection unit 83 detects the contrast evaluation value at time t1 as a peak value, and stores the position of the lens 13 at time t1 as a detection focus P1 in the memory 74. Then, the contrast evaluation value does not change due to the backlash from time t2 to time t3. Then, the contrast evaluation value rises from time t3 and reaches a maximum value corresponding to the peak value at time t4.
  • the contrast evaluation value at time t4 coincides with the contrast evaluation value of the detection focus P1.
  • the position of the lens 13 at time t4 is set as the focus point P2.
  • the driving of the DC motor 61 is controlled from time 0 to time t3 to move the lens 13 while referring to the amount of rotation detected by the rotation sensor 63. Then, in the imaging device 1, during the period from time t3 to time t4, the driving of the DC motor 61 is controlled based on the pulse count detected by the MR sensor 41, and the position of the lens 13 is controlled so that the pulse count coincides with the pulse count at the detected peak.
  • the pulse count detected by the MR sensor 41 is a discrete value. Therefore, the determination unit 84 may directly determine the position of the lens 13 using, for example, the pulse count detected by the MR sensor 41. The determination unit 84 may also perform interpolation processing on the pulse count detected by the MR sensor 41 to determine the position of the lens 13, for example.
  • as the method of determining the position of the lens 13 by interpolating the pulse count, an arbitrary method may be adopted.
  • as a method of interpolation, for example, a method of associating the peak value with the pulse count at the midpoint of the exposure time, calculated based on the exposure time (shutter speed), may be adopted.
  • as another method of interpolation, for example, when a rolling shutter is used, a method of associating the peak value with the pulse count at the time corresponding to the AF coordinates, calculated based on the frame rate or the like, may be adopted.
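The shutter-speed-based interpolation described above can be sketched as follows: the discrete MR-sensor pulse count is linearly interpolated at the midpoint of a frame's exposure interval, so that each contrast evaluation value can be associated with a fractional lens position. The function names and the sampling scheme are illustrative assumptions, not taken from the patent.

```python
def pulse_count_at(t, sample_times, sample_counts):
    """Linearly interpolate the discrete pulse count at time t.

    sample_times / sample_counts are assumed to be monotonically
    increasing readings taken from the MR sensor while the lens moves.
    """
    for i in range(1, len(sample_times)):
        if t <= sample_times[i]:
            t0, t1 = sample_times[i - 1], sample_times[i]
            c0, c1 = sample_counts[i - 1], sample_counts[i]
            return c0 + (c1 - c0) * (t - t0) / (t1 - t0)
    return sample_counts[-1]  # t is past the last sample


def peak_pulse_count(frame_start, exposure_time, sample_times, sample_counts):
    """Associate a frame's contrast value with the pulse count at the
    midpoint of its exposure interval (shutter-speed based method)."""
    midpoint = frame_start + exposure_time / 2.0
    return pulse_count_at(midpoint, sample_times, sample_counts)
```

For a rolling shutter, the same idea applies with the midpoint replaced by the readout time of the rows covering the AF coordinates, derived from the frame rate.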
  • an example of the peak detection processing of the contrast evaluation value performed by the imaging device 1 according to an embodiment of the present invention will be described with reference to FIGS. 4 and 5.
  • the horizontal axis represents time.
  • the vertical axis represents the contrast evaluation value.
  • This contrast evaluation value is a contrast evaluation value calculated by the detection unit 83.
  • time t11 to time t13 are shown in order from earliest to latest, with time 0 as the origin.
  • the times t11 to t13 are set in order of time advancement at a constant interval.
  • points 201 to 203 at which respective contrast evaluation values are obtained from time t11 to time t13 are set.
  • the detection unit 83 adopts the condition that, as time progresses, the contrast evaluation value of the second point 202 is higher than the contrast evaluation value of the first point 201, and the contrast evaluation value of the third point 203 is lower than the contrast evaluation value of the second point 202.
  • in the imaging device 1, when three points satisfying this condition are detected, it is determined that there is a peak of the contrast evaluation value between the first point 201 and the third point 203. In the example of FIG. 4, this condition is satisfied.
  • the detection unit 83 determines which is larger: the absolute value of the slope of the straight line passing through the first point 201 and the second point 202, or the absolute value of the slope of the straight line passing through the second point 202 and the third point 203. Then, in the imaging device 1, the detection unit 83 selects the straight line having the larger absolute value of slope. In the example of FIG. 4, the absolute value of the slope of the straight line passing through the first point 201 and the second point 202 is larger, and this straight line is selected. When the absolute values of the slopes of the two straight lines are the same, either one may be selected.
  • the detection unit 83 calculates the intersection point between the selected straight line and a straight line that has a slope of opposite sign and passes through the remaining point.
  • the selected straight line is the straight line 211.
  • a straight line that has a slope of opposite sign to the slope of the straight line 211 and passes through the remaining point, that is, the third point 203, is the straight line 212.
  • the intersection point of the two straight lines is the intersection point 221.
  • the detection unit 83 determines that the calculated intersection 221 is a point where the contrast evaluation value has a peak. By performing such a peak detection process of the contrast evaluation value in the imaging device 1, a point where the contrast evaluation value becomes a peak can be detected in a short processing time with a simple calculation.
  • the detection unit 83 determines a point at which the contrast evaluation value has a peak value based on the obtained point.
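The two-line peak estimation above can be sketched in a few lines. The sketch assumes three (position, contrast) samples where the middle sample is the highest; `detect_peak` is a hypothetical name, and the geometry follows the description: select the steeper of the two lines, mirror its slope through the remaining point, and intersect.

```python
def detect_peak(p1, p2, p3):
    """Estimate the contrast peak from three (position, value) samples
    where p2 is higher than both p1 and p3. Returns (position, value)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    assert y2 > y1 and y2 > y3, "p2 must be the highest sample"
    s12 = (y2 - y1) / (x2 - x1)  # slope through points 1 and 2
    s23 = (y3 - y2) / (x3 - x2)  # slope through points 2 and 3
    if abs(s12) >= abs(s23):
        # keep the line through p1 and p2; the remaining point is p3
        a, (xa, ya), (xr, yr) = s12, (x2, y2), (x3, y3)
    else:
        # keep the line through p2 and p3; the remaining point is p1
        a, (xa, ya), (xr, yr) = s23, (x2, y2), (x1, y1)
    # selected line: y = ya + a*(x - xa)
    # mirrored line of opposite-sign slope through the remaining point:
    #   y = yr - a*(x - xr)
    x = (yr - ya + a * (xa + xr)) / (2 * a)
    y = ya + a * (x - xa)
    return x, y
```

For a symmetric triangle of samples the estimate coincides with the middle sample; for asymmetric samples it lands between the second and third points, as in FIG. 5.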
  • FIG. 6 is a diagram illustrating an example of a flow of an autofocus process performed by the imaging apparatus 1 according to an embodiment of the present invention.
  • in step S1, the movement control unit 82 controls the driving of the DC motor 61 and rotates the gear 62. Accordingly, in the imaging device 1, the movement control unit 82 moves the lens 13 in a predetermined direction along the movable axis D1. Then, the process proceeds to step S2.
  • in step S2, information identifying the position of the lens 13 detected by the MR sensor 41 is stored in the memory 74. In this embodiment, a pulse count is used as the information for determining the position of the lens 13, and the correspondence between the pulse count and the contrast evaluation value is stored in the memory 74. Then, the process moves to step S3.
  • in step S3, it is determined whether the detection unit 83 has detected an in-focus position where the contrast evaluation value has a peak. When it is determined that the detection unit 83 has detected a peak of the contrast evaluation value in the imaging device 1 (step S3: YES), the process proceeds to step S4. On the other hand, when it is determined that the detection unit 83 has not detected a peak of the contrast evaluation value (step S3: NO), the process returns to step S2.
  • in step S4, the determination unit 84 determines the position of the lens 13 at the peak of the contrast evaluation value as the in-focus position, based on the information specifying the position of the lens 13 detected by the MR sensor 41. Then, in the imaging device 1, since the lens 13 has passed the peak while moving in one direction, the focusing unit 85 moves the lens 13 in the direction opposite to that one direction via the movement control unit 82. In the imaging device 1, the focusing unit 85 moves the lens 13 to the in-focus position and stops it. Then, the processing of this flow ends.
  • the control unit 75 may store the correspondence between the contrast evaluation value and the position of the lens 13 in the memory 74 when the focus position is detected.
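The flow of steps S1 to S4 can be sketched as follows. The `FakeCamera` class and all of its method names are invented stand-ins for the imaging device 1, with contrast modeled as a triangular function of the pulse count; only the control flow mirrors FIG. 6.

```python
class FakeCamera:
    """Toy stand-in for the imaging device: contrast is a triangular
    function of the (hidden) lens position, counted in MR pulses."""
    def __init__(self, focus_count=7):
        self.count = 0
        self.focus = focus_count
        self.direction = +1

    def move_lens(self, direction):
        self.direction = direction

    def capture(self):
        self.count += self.direction   # lens moves one pulse per frame
        return self.count

    def pulse_count(self):
        return self.count

    def contrast(self, frame):
        return -abs(frame - self.focus)  # peaks at the focus position

    def stop_at(self, target):
        self.count = target              # servo to the stored pulse count


def autofocus(cam):
    history = []                         # (pulse_count, contrast) pairs
    cam.move_lens(+1)                    # S1: move in one direction
    while True:
        frame = cam.capture()            # S2: acquire image, evaluate
        history.append((cam.pulse_count(), cam.contrast(frame)))
        if len(history) >= 3 and history[-2][1] > history[-1][1] \
                and history[-2][1] > history[-3][1]:
            break                        # S3: rise-then-fall = peak passed
    peak_count = max(history, key=lambda h: h[1])[0]   # S4: in-focus pos.
    cam.move_lens(-1)                    # reverse direction once
    cam.stop_at(peak_count)              # stop when pulse counts coincide
    return peak_count
```

Note that, unlike the conventional hill-climbing flow, the lens reverses direction only once and stops directly at the stored pulse count.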
  • FIG. 7 is a diagram showing another example of the arrangement of the MR sensor 301 and the magnetic member 302 in the imaging device 1 according to the embodiment of the present invention.
  • FIG. 7 shows a configuration example of the lens barrel portion 12 according to the modification.
  • the arrangement position of the MR sensor 301, which is a magnetoresistive effect element, and the arrangement position of the magnetic member 302 are different, while the other points are the same. That is, in the example of FIG. 7, the lens frame 31 is provided on the magnetic member 302 and is fixed to the magnetic member 302. In the example of FIG. 7, the MR sensor 301 is provided on the guide shaft 311 and is fixed to the guide shaft 311.
  • the guide shaft 311 includes a mechanism that moves the lens frame 31 in the direction of the optical axis of the lens 13.
  • the information identifying the position of the lens 13 can be detected by the MR sensor 301.
  • a MR sensor 301 and a magnetic member 302 that are magnetoresistive effect elements are used to form a position detection unit that detects the position of the lens 13.
  • the lens 13 is moved in one direction to detect the peak of the contrast evaluation value. Then, in the imaging device 1 according to the present embodiment, the lens 13 is moved in the direction opposite to the one direction, based on the amount of movement detected by the MR sensor 41, until the lens 13 reaches the in-focus position corresponding to the peak. Therefore, in the imaging device 1 according to the present embodiment, the time required to achieve the in-focus state under autofocus control can be shortened, and the processing up to the in-focus state can be sped up.
  • an MR sensor 41 capable of accurately detecting the position of the lens 13 is used, and the position of the lens 13 and the focus position are matched. Therefore, in the imaging device 1 according to the present embodiment, it is possible to improve the accuracy of the in-focus state achieved by the control of the autofocus. That is, in the imaging device 1 according to the present embodiment, by using the MR sensor 41, it is possible to perform highly accurate autofocus control.
  • the movement control unit 82 moves the lens 13 in a predetermined direction (first direction) along the optical axis.
  • an image is acquired through a lens 13.
  • the detection unit 83 acquires a contrast evaluation value from the image.
  • the MR sensor 41 detects information corresponding to the amount of movement of the lens 13.
  • the imaging device 1 stores the correspondence between the contrast evaluation value and the position of the lens 13 in the memory 74.
  • the detection unit 83 detects the peak value of the contrast evaluation value of the image obtained from the imaging element 71 by the light passing through the lens 13 while the lens 13 is moving in the preset direction.
  • the determination unit 84 determines the position of the lens 13 at the time when the peak is detected by the detection unit 83, based on the information detected by the MR sensor 41. After the detection unit 83 detects the peak, the focusing unit 85 moves the lens 13 via the movement control unit 82 in the direction opposite to the preset direction (a second direction opposite to the first direction), to the position of the lens 13 determined by the determination unit 84. In this way, in the imaging device 1, after detecting the in-focus position of the lens 13 based on the contrast evaluation value, the correspondence relationship is read from the memory 74, the lens 13 is moved in the direction opposite to the preset direction, and the lens 13 is moved to the position of the lens 13 corresponding to the in-focus position.
  • the memory 74 stores a correspondence relationship between information specifying the position of the lens 13 based on the information detected by the MR sensor 41 and information specifying the contrast evaluation value.
  • the determination unit 84 determines the position of the lens 13 based on the correspondence relationship stored in the memory 74.
  • the focusing section 85 moves the lens 13 to the position of the lens 13 determined by the determination section 84 based on the information detected by the MR sensor 41.
  • the detection unit 83 detects three points in time order such that the contrast evaluation value of the second point 202 is higher than the contrast evaluation value of the first point 201 and the contrast evaluation value of the third point 203 is lower than the contrast evaluation value of the second point 202.
  • of the straight line passing through the first point 201 and the second point 202 and the straight line passing through the second point 202 and the third point 203, the straight line having the larger absolute value of slope is selected (in the example of FIGS. 4 and 5, the straight line 211).
  • the intersection point (in the example of FIG. 5, the intersection point 221) of the selected straight line and a straight line of opposite-sign slope (in the example of FIG. 5, the straight line 212) passing through the remaining point (in the example of FIGS. 4 and 5, the third point 203) is detected as the point corresponding to the peak.
  • the conventional contrast AF will be described, and the effects obtained by the imaging device 1 according to the present embodiment will be shown in comparison with the conventional ones.
  • An outline of a conventional contrast AF will be described with reference to FIG. 11.
  • the horizontal axis represents time.
  • the vertical axis shows a pulse count characteristic 2001 indicating a pulse count of a rotation sensor with respect to time and a contrast characteristic 2002 indicating a contrast value with respect to time.
  • as the rotation sensor, a two-phase rotation sensor is used.
  • the rotation sensor counts and detects the number of discrete pulses based on the amount of rotation of the gear.
  • the count value is a pulse count.
  • in the imaging device according to the example of FIG. 11, for the DC motor, the gear, and the rotation sensor, for example, the same devices as the DC motor 61, the gear 62, and the rotation sensor 63 shown in FIG. 2 are used.
  • time t101 to time t106 are shown in order from earliest to latest, with time 0 as the origin.
  • the pulse count is set to 0, and from time 0, the gear is rotated at a predetermined speed in a preset rotation direction by driving the DC motor.
  • the gear is rotated at a constant speed in a direction opposite to the preset rotation direction by driving the DC motor.
  • as the speed of the gear rotation, the same speed is adopted regardless of the rotation direction.
  • FIG. 11 shows the focus point P12 realized at time t104 and the focus point P13 realized at time t106 as points having theoretically the same contrast evaluation value as the detection focus point P11.
  • the contrast evaluation value rises from time 0, reaches a peak value at time t101, and becomes a maximum value. Then, the contrast evaluation value gradually decreases from time t101 to time t102. Next, the contrast evaluation value does not change due to backlash from time t102 to time t103. Then, the contrast evaluation value rises from time t103 and reaches a maximum value corresponding to the peak value at time t104.
  • the contrast evaluation value at time t104 coincides with the contrast evaluation value of the detection focus P11.
  • the contrast evaluation value gradually decreases from time t104 to time t105.
  • the contrast evaluation value does not change due to backlash during a short period starting at time t105.
  • the contrast evaluation value rises and reaches a maximum value corresponding to the peak value at time t106.
  • the contrast evaluation value at time t106 coincides with the contrast evaluation value of the detection focus P11.
  • the position of the lens at time t106 is set to the focus point P13.
  • the gear is controlled by hill-climbing control, and the lens passes the focus point P12 at time t104 without stopping there.
  • the time interval T101 from time t104 to time t106 is spent on these additional flows and is long. Therefore, the time interval T101, from when the lens crosses the focus point P12 at time t104 until the lens returns by hill-climbing control to the focus point P13 at time t106, becomes a delay in achieving the in-focus state under autofocus control.
  • the pulse count characteristic 2001 and the contrast characteristic 2002 are basically indicated by a solid line, but a portion corresponding to the time interval T101 is indicated by a dotted line.
  • in the conventional contrast AF, it is necessary to perform hill-climbing control to achieve the in-focus state. Therefore, in the conventional contrast AF, a flow of moving the lens in one direction, a flow of moving the lens in the opposite direction, and a flow of moving the lens in the one direction again are required. Therefore, in the conventional contrast AF, the processing time required until the in-focus state is achieved by autofocus control may be long.
  • in the conventional contrast AF, autofocus control is performed by feeding back the rotation amount of the gear detected by the rotation sensor. However, because of the backlash of the gears, the accuracy of the in-focus state achieved by the autofocus control may be poor.
  • the in-focus state can be realized by a flow of moving the lens 13 in one direction and a flow of moving the lens 13 in the opposite direction. Therefore, in the imaging device 1 according to the present embodiment, the flow of operations can be reduced, and the processing required until the in-focus state is achieved by the control of the autofocus can be shortened. In addition, in the imaging device 1 according to the present embodiment, by using the MR sensor 41, it is possible to improve the accuracy of the in-focus state achieved by the control of the autofocus.
  • since a magnetic mechanism composed of the MR sensor 41 and the magnetic member 42 is provided near the lens, the present embodiment is preferably applied to, for example, a medium-sized camera that handles a large lens, rather than a small camera. That is, compared with a small camera, a medium-sized camera often has more space for installing the magnetic mechanism according to the present embodiment, which is particularly effective when driving a large lens.
  • the imaging device 1 may include a stepping motor instead of the DC motor 61.
  • the stepping motor is controlled by the control unit 75 to rotate the gear 62.
  • the lens 13 is driven via a stepping motor.
  • the stepping motor may have the same kind of positional error as the DC motor 61. However, this defect can be eliminated in the imaging device 1 according to the present embodiment.
  • FIG. 8 is a diagram showing a schematic functional arrangement of an imaging device 2 according to another embodiment of the present invention.
  • the imaging device 2 shown in FIG. 8 includes a position detection element 401 other than the MR element instead of the MR sensor 41 and the magnetic member 42 provided near the lens 13.
  • the configuration of the imaging device 2 shown in FIG. 8 is the same as that of the imaging device 1 shown in FIG. 2, and the same reference numerals are used for explanation.
  • the position detection element 401 for example, an element that directly detects the position of the lens 13 using the lens 13 itself or an element or a component that moves with the lens 13 is employed.
  • the position detection element 401 an arbitrary element that can uniquely determine the position of the lens 13 may be used.
  • a sensor using a Hall element, a sensor that detects the position of the lens 13 from the divided voltage of a variable resistor, a sensor that detects the position of the lens 13 from a hologram pattern, or a sensor that detects the position of the lens 13 by light can be used.
  • a Hall element is an element that uses the Hall effect to detect a magnetic field.
  • the sensor using the Hall element includes, for example, a Hall element and a magnetic member 42 instead of the MR sensor 41 and the magnetic member 42 shown in FIG. 2.
  • This sensor detects information that can determine the position of the lens 13 based on the relative position of the Hall element and the magnetic member 42.
  • the sensor that detects the position of the lens 13 based on the divided voltage of a variable resistor includes, for example, instead of the MR sensor 41 and the magnetic member 42 shown in FIG. 2, a variable resistor whose resistance value changes depending on the position of the lens 13 and a detection unit that detects the resistance value of the variable resistor from the divided voltage. The sensor detects information that can determine the position of the lens 13 based on the divided voltage detected by the detection unit.
  • the sensor that detects the position of the lens 13 through the pattern of the hologram includes, for example, a mechanism in which the pattern of the hologram changes according to the position of the lens 13.
  • the sensor detects information capable of determining the position of the lens 13 based on the pattern.
  • the sensor that detects the position of the lens 13 by light includes, for example, instead of the MR sensor 41 and the magnetic member 42 shown in FIG. 2, a light-emitting portion that emits light and a light-receiving portion in which light-receiving elements are arranged along the movable axis D1 of the lens 13. The sensor detects information capable of determining the position of the lens 13 based on which light-receiving element in the light-receiving portion receives the light from the light-emitting portion.
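As a toy illustration of the divided-voltage alternative, the following sketch assumes an idealized linear variable resistor whose wiper tracks the lens; the function name and values are hypothetical.

```python
def lens_position_from_divider(v_out, v_ref, travel_mm):
    """Map the divided voltage onto lens travel, assuming the wiper of
    the variable resistor moves linearly with the lens position."""
    if not 0.0 <= v_out <= v_ref:
        raise ValueError("divided voltage out of range")
    return travel_mm * v_out / v_ref
```

Unlike the incremental pulse count of the MR sensor, such a readout is absolute: a single sample uniquely determines the lens position.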
  • the imaging device 2 detects the contrast evaluation value of an image obtained from the imaging element 71 while moving the lens 13 in one direction. Then, in the imaging device 2, when the contrast evaluation value rises, peaks, and then drops, the lens 13 is moved in the direction opposite to the one direction, and the lens 13 is controlled to move to the in-focus position based on the detection value of the position detection element 401.
  • the movement control unit 82 moves the lens 13 along the optical axis.
  • the position detection element 401 detects information corresponding to the amount of movement of the lens 13.
  • the detection section 83 detects a peak value of a contrast evaluation value of an image obtained from the imaging element 71 by the light passing through the lens 13 while the lens 13 is moving in a predetermined direction.
  • the determination unit 84 determines the position of the lens 13 at the time when the peak is detected by the detection unit 83 based on the information detected by the position detection element 401.
  • the movement control section 82 moves the lens 13 in a direction opposite to the preset direction, and moves the lens 13 to the position of the lens 13 determined by the determination section 84.
  • FIG. 9 is a diagram illustrating an example of a hardware configuration of the control section 75 and the memory 74.
  • the control unit 75 and the memory 74 may be configured by a computer 501 as an example.
  • the computer 501 includes a host controller 511, a CPU (Central Processing Unit) 512, a RAM (Random Access Memory) 513, an input/output controller 514, a communication interface 515, and a ROM (Read-Only Memory) 516.
  • the memory 74 may be a RAM 513 or a ROM 516.
  • the host controller 511 is connected to the CPU 512, the RAM 513, and the input / output controller 514, respectively, and connects them to each other.
  • the input / output controller 514 is connected to the communication interface 515 and the ROM 516, respectively, and connects them to the host controller 511.
  • the CPU 512 executes various processes or controls by reading and executing programs stored in the RAM 513 or the ROM 516, for example.
  • the communication interface 515 communicates with other devices, for example, via a network. In the example of FIG. 2, other devices may be the imaging element 71, the operation portion 72, the display portion 73, the DC motor 61, the rotation sensor 63, and the MR sensor 41.
  • a program for realizing the functions of each device according to the embodiment (for example, the imaging devices 1 and 2) may be recorded on a computer-readable recording medium, and a computer system may read and execute the program recorded on the recording medium to perform the processing.
  • the "computer system" referred to herein may include an OS (operating system) and hardware such as peripheral devices.
  • the "computer-readable recording medium" refers to removable media such as a flexible disk, a magneto-optical disk, a ROM, a flash memory, and a DVD (Digital Versatile Disc), as well as storage devices such as hard disks built into computer systems.
  • the recording medium may be, for example, a recording medium that temporarily records data.
  • the "computer-readable recording medium" also includes a device that holds the program for a certain period of time, such as the volatile memory (for example, DRAM (Dynamic Random Access Memory)) inside a computer system serving as a server or client when the program is transmitted via a network such as the Internet or a telephone line.
  • the program may be transmitted from a computer system storing the program in a storage device or the like to another computer system via a transmission medium or a transmission wave in the transmission medium.
  • the "transmission medium” for transmitting a program refers to a medium having a function of transmitting information like a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
  • the program described above may be used to implement a part of the aforementioned functions.
  • the program may be a so-called difference file (difference program) that realizes the functions described above in combination with a program recorded in a computer system.
  • FIG. 10 is a diagram showing an example of the appearance of an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) 601 and a remote operation device 602.
  • the UAV 601 includes a UAV body 611, a universal joint 612, and a plurality of camera devices 613-615.
  • the UAV 601 is an example of a flying body that flies by means of rotary wings.
  • a flying body is a concept that includes, in addition to UAVs, other aircraft that move through the air.
  • the UAV body 611 includes a plurality of rotating wings.
  • the UAV body 611 controls the rotation of multiple rotary wings to make the UAV 601 fly.
  • the UAV body 611 uses, for example, four rotating wings to fly the UAV 601.
  • the number of rotating wings is not limited to four.
  • the imaging device 615 is an imaging camera for imaging a subject included in a desired imaging range.
  • the gimbal 612 supports the imaging device 615 so that the posture of the imaging device 615 can be changed.
  • the gimbal 612 supports the imaging device 615 in a rotatable manner.
  • the gimbal 612 employs an actuator, and the imaging device 615 is rotatable about a pitch axis.
  • the gimbal 612 employs an actuator to support the imaging device 615 so as to be rotatable about a roll axis and a yaw axis, respectively.
  • the gimbal 612 can also change the posture of the imaging device 615 by rotating the imaging device 615 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the imaging device 613 and the imaging device 614 are sensing cameras for capturing the surroundings of the UAV 601 in order to control the flight of the UAV 601.
  • the two camera devices 613 and 614 may also be set on the front of the UAV 601.
  • two other imaging devices may be provided on the bottom surface of the UAV 601.
  • the two imaging devices 613 and 614 on the front side form a pair and can also function as a so-called stereo camera.
  • the two imaging devices (not shown) on the bottom side are also paired, and can also function as a stereo camera.
  • the number of imaging devices 613 and 614 included in UAV 601 is not limited to four.
  • the UAV 601 may include at least one imaging device 613, 614.
  • the UAV 601 may also have at least one imaging device 613, 614 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 601.
  • the angle of view settable by the imaging devices 613 and 614 may be wider than the angle of view settable by the imaging device 615. That is, the imaging range of the imaging devices 613 and 614 may be wider than the imaging range of the imaging device 615.
  • the imaging devices 613 and 614 may include a single focus lens or a fisheye lens.
  • the remote operation device 602 communicates with the UAV 601 and performs remote operation on the UAV 601.
  • the remote operation device 602 may also wirelessly communicate with the UAV 601.
  • the remote operation device 602 sends various driving commands related to the movement of the UAV 601 such as ascent, descent, acceleration, deceleration, forward, backward, and rotation to the UAV 601.
  • the UAV 601 receives a command transmitted from the remote operation device 602 and executes various processes in accordance with the command.
  • the imaging device 1 shown in FIG. 2 or the imaging device 2 shown in FIG. 8 may be adopted as one or more of the imaging devices 613 to 615 shown in FIG. 10.
  • when the imaging devices 1 and 2 are configured by a computer, they may be implemented by a program executed by a processor of the computer.


Abstract

In order to shorten the time required to achieve an in-focus state, a control device is provided that includes: a position detection unit that detects the position of a lens; and a control unit that moves the lens along the optical axis in a first direction, acquires an image through the lens, obtains a contrast evaluation value from the image, stores the correspondence between the contrast evaluation value and the position of the lens in a memory, and, after detecting the in-focus position of the lens from the contrast evaluation value, reads the correspondence from the memory, moves the lens in a second direction opposite to the first direction, and moves the lens to the position of the lens corresponding to the in-focus position.

Description

Control device, method, and program — Technical Field
The present invention relates to a control device, a method, and a program.
Background Art
In an imaging device that captures images, an AF method called contrast AF (Autofocus) is used to automatically control the position of a lens to achieve focus. In contrast AF, for example, autofocus control is performed using a DC (Direct Current) motor that drives a gear for moving the lens and a rotation sensor that detects the amount of rotation of the gear. Here, when the DC motor drives the gear in one direction, the lens moves in a preset direction parallel to the optical axis. When the DC motor drives the gear in the direction opposite to that one direction, the lens moves in the direction opposite to the preset direction.
Specifically, contrast AF performs the following three flows of processing. The first flow detects the peak of the contrast evaluation value of the image obtained from the imaging element while the DC motor drives the gear in one direction. The second flow then drives the gear in the opposite direction with the DC motor until the contrast evaluation value once passes the peak. The third flow then drives the gear in the original direction again, targeting the peak of the contrast evaluation value, and stops the gear. In this process, the position of the lens is determined from the amount of rotation of the gear detected by the rotation sensor. In such contrast AF, focusing is performed by driving the gear in the direction that raises the contrast evaluation value, taking the play of the gears (so-called backlash) into account. This direction is called the hill-climbing direction, and control in this direction is called hill-climbing control. Here, the contrast evaluation value of an image is an evaluation value indicating the strength of the contrast of the image; in this specification, a larger contrast evaluation value indicates stronger contrast.
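The three flows above can be made concrete with a toy simulation of conventional hill-climbing contrast AF, with backlash modeled as a few dead pulses consumed after each reversal of the drive direction. The `GearLens` class and all numbers are invented for illustration and are not taken from the patent.

```python
BACKLASH = 2   # dead pulses consumed when the drive direction reverses


class GearLens:
    """Toy model: the lens position follows the gear except for backlash."""
    def __init__(self, focus=5):
        self.pos = 0
        self.focus = focus
        self.slack = 0          # remaining backlash after a reversal
        self.last_dir = +1

    def move(self, direction):
        if direction != self.last_dir:
            self.slack = BACKLASH
            self.last_dir = direction
        if self.slack:
            self.slack -= 1     # gear turns, but the lens does not move
        else:
            self.pos += direction

    def contrast(self):
        return -abs(self.pos - self.focus)  # peaks at the focus position


def conventional_contrast_af(lens):
    moves = 0
    # Flow 1: drive one way until the contrast peak is crossed.
    prev = lens.contrast()
    while True:
        lens.move(+1); moves += 1
        if lens.contrast() < prev:
            break
        prev = lens.contrast()
    peak = prev
    # Flow 2: reverse until the contrast passes the peak again and drops.
    prev = lens.contrast()
    while True:
        lens.move(-1); moves += 1
        c = lens.contrast()
        if c < prev and prev == peak:
            break
        prev = c
    # Flow 3: approach the peak once more in the original direction.
    while lens.contrast() < peak:
        lens.move(+1); moves += 1
    return moves
```

The simulation reproduces the delay discussed below: the lens crosses the in-focus position during flow 2 and must climb back to it in flow 3, paying the backlash twice.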
Patent Document 1 discloses a technique in which the position of a focus lens in the optical-axis direction is detected by an MR (Magneto Resistive) sensor (see paragraph 0039 of Patent Document 1). However, although Patent Document 1 discloses detecting the position of the lens with an MR sensor, it does not disclose autofocus control.
Here, an MR sensor is a known sensor that uses a magnetoresistive effect element and measures the magnitude of a magnetic field by means of the magnetoresistive effect, in which the electrical resistance of a solid changes with the magnetic field. Known magnetoresistive effects include the anisotropic magnetoresistive effect (AMR: Anisotropic Magneto Resistive effect), in which resistance changes with an external magnetic field; the giant magnetoresistive effect (GMR: Giant Magneto Resistive effect), which produces a very large change in resistance; and the tunnel magnetoresistive effect (TMR: Tunnel Magneto Resistive effect), in which applying a magnetic field to a magnetic tunnel junction element changes the resistance via the tunnel current flowing through it.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2004-37121
Summary of the Invention
[Problem to be solved by the invention]
As described above, when an imaging device performs autofocus control based on contrast AF, backlash makes the following flows necessary: first, the gear is moved in one direction to detect the peak of the contrast evaluation value; then the gear is moved in the opposite direction until the contrast evaluation value once passes the peak; then the gear is moved in the original direction again and stopped, targeting the peak. Because all of these flows are required, the lens movement needed to achieve the in-focus state involves many flows and may take a long time.
The present invention has been made in view of such circumstances, and its object is to provide a control device, a method, and a program capable of shortening the time required to achieve the in-focus state.
[Technical means for solving the problem]
A control device according to one aspect of the present invention includes: a position detection unit that detects the position of a lens; and a control unit that moves the lens along the optical axis in a first direction, acquires an image through the lens, obtains a contrast evaluation value from the image, stores the correspondence between the contrast evaluation value and the position of the lens in a memory, and, after detecting the in-focus position of the lens from the contrast evaluation value, reads the correspondence from the memory, moves the lens in a second direction opposite to the first direction, and moves the lens to the position of the lens corresponding to the in-focus position.
In the control device according to one aspect of the present invention, the position detection unit may include: a magnetoresistive effect element provided on a lens frame that fixes the lens; and a magnetic member arranged along the optical axis and composed of a first polarity and a second polarity different from the first polarity.
In the control device according to one aspect of the present invention, the position detection unit may include: a magnetic member arranged along the optical axis and composed of a first polarity and a second polarity different from the first polarity; a lens frame provided on the magnetic member; and a magnetoresistive effect element provided on a guide shaft that moves the lens frame in the direction of the optical axis.
In the control device according to one aspect of the present invention, the length of the magnetic member may be longer than the sum of the movable range of the lens and the amount of backlash that occurs when the lens moves.
In the control device according to one aspect of the present invention, the control unit may store the correspondence in the memory when the in-focus position is detected.
In the control device according to one aspect of the present invention, the lens may be moved through a gear coupling mechanism.
In the control device according to one aspect of the present invention, the lens may be moved by a DC motor or a stepping motor.
A method according to one aspect of the present invention includes: a stage in which a control device moves a lens along the optical axis in a first direction and acquires an image through the lens; a stage of obtaining a contrast evaluation value from the image; a stage of storing the correspondence between the contrast evaluation value and the lens position in a memory; and a stage of, after detecting the in-focus position of the lens from the contrast evaluation value, reading the correspondence from the memory, moving the lens in a second direction opposite to the first direction, and moving the lens to the position of the lens corresponding to the in-focus position.
A program according to one aspect of the present invention causes a computer to execute: a stage of moving a lens along the optical axis in a first direction and acquiring an image through the lens; a stage of obtaining a contrast evaluation value from the image; a stage of storing the correspondence between the contrast evaluation value and the lens position in a memory; and a stage of, after detecting the in-focus position of the lens from the contrast evaluation value, reading the correspondence from the memory, moving the lens in a second direction opposite to the first direction, and moving the lens to the position of the lens corresponding to the in-focus position.
[Effects of the invention]
According to one aspect of the present invention, the time required to achieve the in-focus state can be shortened.
Brief Description of the Drawings
FIG. 1 is a diagram showing a schematic external structure of an imaging device according to an embodiment of the present invention.
FIG. 2 is a diagram showing a schematic functional configuration of an imaging device according to an embodiment of the present invention.
FIG. 3 is a diagram for explaining an example of autofocus processing performed by the imaging device according to an embodiment of the present invention.
FIG. 4 is a diagram for explaining an example of peak detection processing of the contrast evaluation value performed by the imaging device according to an embodiment of the present invention.
FIG. 5 is a diagram for explaining an example of peak detection processing of the contrast evaluation value performed by the imaging device according to an embodiment of the present invention.
FIG. 6 is a diagram showing an example of the flow of autofocus processing performed by the imaging device according to an embodiment of the present invention.
FIG. 7 is a diagram showing another example of the arrangement of the MR sensor and the magnetic member in the imaging device according to an embodiment of the present invention.
FIG. 8 is a diagram showing an example of a schematic functional configuration of an imaging device according to another embodiment of the present invention.
FIG. 9 is a diagram showing an example of the hardware configuration of the control unit and the memory.
FIG. 10 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 11 is a diagram for explaining an outline of conventional contrast AF.
[Description of reference numerals]
1, 2, 613-615... imaging device; 11... body portion; 12... lens barrel portion; 13... lens; 14-16... buttons; 17... viewfinder; 21... rotation cam; 22... gear box; 23... imaging unit; 31... lens frame; 32... cam groove; 33... cam pin; 41, 301... MR sensor; 42, 302... magnetic member; 61... DC motor; 62... gear; 63... rotation sensor; 71... imaging element; 72... operation unit; 73... display unit; 74... memory; 75... control unit; 81... acquisition unit; 82... movement control unit; 83... detection unit; 84... determination unit; 85... focusing unit; 201-203... points; 211, 212... straight lines; 221... intersection point; 311... guide shaft; 401... position detection element; 501... computer; 511... host controller; 512... CPU; 513... RAM; 514... input/output controller; 515... communication interface; 601... unmanned aerial vehicle; 602... remote operation device; 611... UAV body; 612... gimbal; 1001... pulse count characteristic; 1002... contrast characteristic; P1, P11... detection focus point; P2, P12, P13... focus point.
具体实施方式
以下,参照附图,说明本发明的实施方式。图1是示出本发明的一个实施方式所涉及的摄像装置1的概略外观的构造的图。摄像装置1概略地包括本体部11和镜筒部12。镜筒部12包括镜头13。本体部11包括按钮14~16和取景器17。这里,各按钮14~16由用户操作,接收例如与电源、快门、曝光等相关的预定指示。另外,摄像装置1的构造不限于图1所示构造,也可以采用其他构造。
图2是示出本发明的一个实施方式所涉及的摄像装置1的概略的功能配置的图。摄像装置1在镜筒部12包括镜头13、环状的旋转凸轮21、镜头框31、凸轮槽32、凸轮销33、MR传感器41和磁性部件42。摄像装置1在本体部11包括齿轮箱22和摄像部23。齿轮箱22包括DC(Direct Current:直流)马达61、齿轮62和两相旋转传感器63。摄像部23包括摄像元件71、操作部72、显示部73、存储器74和控制部75。控制部75包括获取部81、移动控制部82、检测部83、判定部84和对焦部85。另外,包含MR传感器41、控制部75和存储器74的部分的装置是控制装置的一个示例。本实施方式中,摄像装置1也是控制装置的一个示例。
对镜筒部12的构成进行说明。镜头13安装到镜头框31并由其支撑。镜头框31固定镜头13。在镜头框31设有嵌入设于旋转凸轮21的凸轮槽32的凸轮销33。然后,通过旋转凸轮21的旋转机构,镜头框31可沿凸轮槽32移动。从而,在镜头框31安装的镜头13能够沿预设的可移动轴D1移动。镜头13的可移动轴D1是与镜头13的光轴平行的轴。即,镜头13能够沿该镜头13的光轴移动。旋转凸轮21若在预设的旋转方向旋转,则镜头13在沿着可移动轴D1的预设的一个方向移动。反之,旋转凸轮21若在与该预设的旋转方向相反的方向旋转,则镜头13在沿着可移动轴D1的该一个方向的相反的方向移动。另外,图2示出了镜头13的可移动轴D1。
在镜头框31设有磁阻效应元件即MR传感器41。本实施方式中,MR传感器41固定于镜头框31。另外,磁性部件42固定设于旋转凸轮21。磁性部件42具有沿着与镜头13的可移动轴D1平行的方向使N极(第一极性的一个示例)和S极(不同于第一极性的第二极性的一个示例)按一定间隔以相反极性排列而成的磁阵列。即,该磁阵列沿镜头13的光轴排列。根据该构成,镜头13若沿可移动轴D1移动,则MR传感器41与磁性部件42的相对位置关系变化。从而,由MR传感器41检测与镜头13的移动量相应的波形。即,每次与MR传感器41的位置相向的磁性部件42的极性按N极和S极而交替变化时,产生相同形状的脉冲。然后,在MR传感器41在同一个方向移动期间,产生的脉冲数和镜头13的移动量成比例。另外,磁性部件42的长度最好比预设值长。 该预设值是镜头13可移动范围的长度和因镜头13移动而发生的齿隙量的长度相加的长度。磁性部件42通过覆盖与该预设值相当的部分,能够准确把握镜头13的移动量。
本实施方式中,MR传感器41检测产生的脉冲数,作为与移动量相应的信息。另外,本实施方式中,MR传感器41检测正弦波和余弦波这样的两相的波形,作为与移动量相应的波形。从而,通过MR传感器41,可确定镜头13的移动量以及镜头13移动的方向。MR传感器41向控制部75输出表示检测的移动量及移动方向的信息。这里,作为表示镜头13的移动量的信息,例如可以采用移动量本身,或采用与移动量成比例等的脉冲数等。图2的例中,采用磁阻效应元件即MR传感器41和磁性部件42,构成检测镜头13的位置的位置检测部。另外,作为镜头13,可以采用各种各样的镜头,例如,也可以采用可更换镜头。
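对于上述两相波形的处理,可以用如下 Python 草图来示意:根据A/B两相二值化信号的状态转移,累计带符号的脉冲计数(其中的状态编码和函数名均为本文为说明而作的假设,并非本实施方式的实际实现):

```python
# 两相正交信号按 (A, B) 编码为状态 0~3。
# 假设正向移动时状态按格雷码循环 0→1→3→2→0 变化,
# 反向移动时按相反顺序变化(实际器件的编码可能不同)。
_ORDER = [0, 1, 3, 2]

def quadrature_count(samples):
    """samples: [(a, b), ...] 形式的二值化两相采样序列。
    返回带符号的脉冲计数:正值表示预设的第一朝向,
    负值表示相反朝向,与镜头13的移动量成比例。"""
    count = 0
    prev = _ORDER.index(samples[0][0] + 2 * samples[0][1])
    for a, b in samples[1:]:
        cur = _ORDER.index(a + 2 * b)
        if cur == (prev + 1) % 4:    # 正向的状态转移
            count += 1
        elif cur == (prev - 1) % 4:  # 反向的状态转移
            count -= 1
        prev = cur
    return count
```

由此,仅用单相时无法区分的移动方向,可以由两相的相位关系直接判定,这对应于正文中"可确定镜头13的移动量以及镜头13移动的方向"的说明。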
对齿轮箱22的构成进行说明。这里,齿轮62例如具有齿轮连结机构。这里,齿轮连结机构是多个齿轮啮合的机构。因而,在齿轮连结机构中,在一个方向的旋转和与其相反方向的旋转之间可能产生位置误差。但是,本实施方式中,为了便于说明,着眼一个齿轮进行说明。DC马达61由控制部75控制,使齿轮62旋转。由于齿轮62的旋转使旋转凸轮21旋转,从而形成镜头13沿可移动轴D1移动的结构。齿轮62若沿着预设的旋转方向旋转,则镜头13在沿着可移动轴D1的预设的一个方向移动。反之,齿轮62若沿着与该预设的旋转方向相反的方向旋转,则镜头13在与沿着可移动轴D1的该一个方向相反的方向移动。即,通过DC马达61使镜头13移动。另外,通过齿轮连结机构使镜头13移动。
旋转传感器63检测齿轮62的旋转量。旋转传感器63例如在齿轮62的两侧中的一侧包括发光部,在另一侧包括受光部。发光部发射光,受光部接收该光。在齿轮62沿着圆周以一定的间隔设有齿。发光部和受光部之间不存在齿时,来自发光部的光由受光部接收。另一方面,当发光部和受光部之间存在齿时,来自发光部的光被该齿遮光,不由受光部接收。因而,通过旋转传感器63,检测与齿轮62的旋转量相应的波形。即,每次在发光部和受光部之间通过各个齿时,发生相同形状的脉冲。齿轮62在同一个方向旋转期间,发生的脉冲数和齿轮62的旋转量成比例。另外,齿轮62的旋转量和镜头13的移动量成比例。
本实施方式中,旋转传感器63检测发生的脉冲数,作为表示旋转量的信息。另外,本实施方式中,旋转传感器63检测正弦波和余弦波这样的两相波形,作为与旋转量相应的波形。从而,通过旋转传感器63,能够确定齿轮62的旋转量以及齿轮62旋转的方向。旋转传感器63向控制部75输出表示检测出的旋转量及旋转方向的信息。
对摄像部23的构成进行说明。摄像元件71配置在镜头13的光轴上。摄像元件71拍摄由穿过镜头13的光获得的图像。摄像元件71向控制部75输出拍摄的图像。操作部72是由用户操作的按钮等。本实施方式中,操作部72包括图1所示的按钮14~16等。显示部73具有显示从摄像元件71获得的图像等画面。本实施方式中,显示部73在图1所示取景器17的画面显示图像。存储器74存储信息。本实施方式中,存储器74由控制部75控制。另外,本实施方式中,说明了在摄像部23包括存储器74的情况,但是,也可以采用在镜头13等具有存储器的构成。
控制部75执行与拍摄相关的各种控制。控制部75例如执行以下的控制:接收对操作部72的操作的处理;在显示部73的画面显示图像的处理;从摄像元件71接收图像的处理;在存储器74存储信息的处理;从存储器74删除信息的处理;使DC马达61驱动的处理;从旋转传感器63接收表示旋转量及方向的信息的处理;从MR传感器41接收表示移动量及移动方向的信息的处理等。
获取部81获取由摄像元件71获得的图像。该图像在显示部73的画面显示。另外,该图像在快门按钮被按下时,作为拍摄的图像存储到存储器74。该快门按钮是例如按钮15。
另外,获取部81从MR传感器41获取由该MR传感器41检测的移动量及移动方向。这里,由MR传感器41检测的移动量不是确定镜头13的绝对位置,而是确定镜头13相对于预设的基准位置的相对位置。本实施方式中,例如,采用将表示这样的基准位置的信息预先设定在存储器74的结构,或,由控制部75检测这样的基准位置并将表示该基准位置的信息存储到存储器74的结构。另外,除存储器74外,也可以在镜筒部12内设置存储器(未图示)。镜筒部12内的存储器可以存储与存储器74存储的信息相同的信息。镜筒部12内的存储器可以存储用于驱动镜头13的信息。
作为这样的基准位置,可以采用任意的位置。作为该基准位置,例如也可以采用镜头13可移动的范围内的一端作为基准位置。该情况下,在控制部75中,使镜头13向该端的位置移动并且由MR传感器41不再检测到脉冲时,判定镜头13与该端的位置抵接而停止。然后,在控制部75中,根据镜头13的基准位置和离该基准位置的移动量,确定镜头13的绝对位置。即,该移动量在基准位置中的移动量设为0等的初始值时,表示距该初始值的差分。
另外,在控制部75中,将镜头13的移动方向为预设的方向时发生的脉冲数相加。反之,在控制部75中,减去在镜头13的移动方向为与该预设的方向相反的方向时发生的脉冲数。然后,控制部75中,算出这些脉冲数的总和,作为脉冲计数。从而,控制部75中,例如,能够使表示离基准位置的移动量的脉冲计数和镜头13的位置一一对应。
这里,作为基准位置,也可以采用其他位置,例如,也可以采用预定的镜头13的初始位置或摄像装置1接通电源时的镜头13的位置等。另外,基准位置也可以用任意的方法检测。例如,也可以采用在基准位置的镜头13以外的部分设置发射光的发光部,在该发光部的相反侧隔着镜头13的移动通路设置接收来自该发光部的光的受光部,在镜头13的侧面设置遮蔽光的遮蔽部件的构成。该构成中,镜头13位于该基准位置时,来自该发光部的光由该遮蔽部件遮蔽,不会被受光部接收。在该情况下,控制部75中,在来自发光部的光未被受光部接收时,判定镜头13处于基准位置。另外,控制部75中,也可以将镜头13处于基准位置时的移动量设为0等初始值。另外,本实施方式中,镜头13的位置采用由基准位置和移动量组合的绝对位置来表示,但是作为其他例,也可以采用相对于基准位置的移动量来表示。即,本实施方式中,只要能够重现获得峰值时的镜头13的位置即可。
另外,获取部81从旋转传感器63获取该旋转传感器63检测的旋转量及旋转方向。这里,由旋转传感器63检测的旋转量不是确定镜头13的绝对位置,而是确定镜头13的相对位置。
另外,本实施方式中,示出了采用两相的MR传感器41的情况,但是也可以采用三相或以上的MR传感器。另外,也可以采用一相的MR传感器。采用一相的MR传感器时,镜头13的移动方向例如也可以由控制部75根据DC马达61的驱动方向等来判定。同样,本实施方式中,示出了采用两相的旋转传感器63的情况,但是也可以采用三相或以上的旋转传感器。另外,也可以采用一相的旋转传感器。采用一相的旋转传感器时,镜头13的移动方向例如也可以由控制部75根据DC马达61的驱动方向等来判定。
移动控制部82控制DC马达61的驱动,使镜头13沿可移动轴D1移动。从而,镜头13沿可移动轴D1移动。检测部83对由获取部81获取的图像,算出预设的对比度评价值。然后,检测部83在镜头13沿着预设的方向移动期间,检测该对比度评价值的峰值。另外,作为算出图像的对比度评价值的手法,也可以采用任意的手法。一般认为是,拍摄的图像的对比度评价值越高,越是在接近对焦状态的状态下拍摄的。
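正文指出对比度评价值的算出可采用任意手法。作为一个假设性的示例(并非本实施方式指定的算法),下面用相邻像素差分平方和这一常见的清晰度指标来示意:图像越接近对焦状态,边缘越锐利,该值越大。

```python
def contrast_score(img):
    """img: 二维灰度值列表。返回相邻像素差分平方和,
    作为对比度评价值的一个示例:值越大越接近对焦状态。"""
    h, w = len(img), len(img[0])
    score = 0
    # 水平方向的相邻差分
    for y in range(h):
        for x in range(w - 1):
            score += (img[y][x + 1] - img[y][x]) ** 2
    # 垂直方向的相邻差分
    for y in range(h - 1):
        for x in range(w):
            score += (img[y + 1][x] - img[y][x]) ** 2
    return score
```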
判定部84根据由MR传感器41检测的移动量及移动方向,判定获得由检测部83检测的峰值的时刻的镜头13的位置。
这里,本实施方式中,控制部75将可确定镜头13的位置的信息和可确定由获取部81获取的图像的对比度评价值的信息关联地存储在存储器74。另外,该关联例如也可以经由时间实现,即,通过时间与可确定镜头13的位置的信息的对应以及时间与可确定对比度评价值的信息的对应关系,使两者的信息关联。时间与可确定镜头13的位置的信息的对应关系,例如,也可以按一定的时间间隔,离散地获得。同样,时间与可确定对比度评价值的信息的对应关系也可以按该一定的时间间隔,离散地获得。
可确定镜头13的位置的信息例如也可以由判定部84监视由获取部81获取的信息,按预设的时间间隔算出。另外,可确定由获取部81获取的图像的对比度评价值的信息例如也可以由检测部83监视由获取部81获取的信息,按该预设的时间间隔算出。检测部83或判定部84中的一个或双方也可以如此根据存储器74存储的信息而进行各自的运算。
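上述"经由时间使两者的信息关联"的做法,可以用如下草图示意:按一定间隔分别离散采样 (时间, 脉冲计数) 和 (时间, 对比度评价值),再以最近的时间戳把评价值与镜头位置对应起来(函数名与数据结构为本文假设):

```python
def associate_by_time(pos_log, contrast_log):
    """pos_log: [(时间, 脉冲计数), ...];contrast_log: [(时间, 评价值), ...]。
    对每个评价值采样,取时间上最接近的位置采样,
    返回 {脉冲计数: 对比度评价值} 的对应关系。"""
    mapping = {}
    for t_c, c in contrast_log:
        # 找到时间上最接近的位置采样
        _, p = min(pos_log, key=lambda tp: abs(tp[0] - t_c))
        mapping[p] = c
    return mapping
```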
对焦部85执行以下的控制:通过由移动控制部82控制DC马达61的驱动,使镜头13移动到由判定部84判定的镜头13的位置,即,获得对比度评价值的峰值的时刻的镜头13的位置。
用图3说明由本发明的一个实施方式所涉及的摄像装置1执行的自动聚焦处理的一个示例。图3所示曲线图中,横轴表示时间。另外,该曲线图中,纵轴示出表示相对于时间的MR传感器41的脉冲计数的脉冲计数特性1001及表示相对于时间的对比度评价值的对比度特性1002。
图3的例中,以时间0为原点,按时间从早到晚的顺序,表示了时间t1~时间t4。摄像装置1中,在时间0,脉冲计数设为0。摄像装置1中,首先,从时间0开始,移动控制部82控制DC马达61的驱动,使齿轮62沿着预设的旋转方向以一定的速度旋转。此时,摄像装置1中,检测部83在时间t1发生对比度评价值的峰值后的时间t2附近,检测该峰值。然后,摄像装置1中,判定部84判定发生该峰值的时刻的镜头13的位置。与此对应,摄像装置1中,对焦部85通过移动控制部82在时间t2附近控制DC马达61的驱动,使齿轮62暂时停止。接着,摄像装置1中,对焦部85通过移动控制部82控制DC马达61的驱动,使齿轮62沿着与该预设的旋转方向相反的方向以一定的速度旋转。然后,摄像装置1中,对焦部85根据MR传感器41检测的脉冲计数,使镜头13移动到发生峰值时刻的位置后停止。从而,摄像装置1中,在时间t4实现对焦状态。本实施方式中,作为齿轮62旋转的速度,不管旋转方向如何,采用相同速度。
这里,图3的例中,对比度评价值从时间0开始上升,在时间t1达到峰值,成为最大值。然后,对比度评价值从时间t1到时间t2逐渐下降。摄像装置1中,由检测部83检测出时间t1中的对比度评价值作为峰值,将时间t1中的镜头13的位置作为检测对焦点P1存储在存储器74。接着,对比度评价值从时间t2到时间t3为止,因齿隙而不变。然后,对比度评价值从时间t3开始上升,在时间t4达到与峰值相当的最大值。这里,理论上,时间t4中的对比度评价值与检测对焦点P1的对比度评价值一致。摄像装置1中,将时间t4中的镜头13的位置设为对焦点P2。
另外,本实施方式中,控制镜头13的位置,使得检测对焦点P1中的脉冲计数(=100)和对焦点P2中的脉冲计数(=100)一致。因而,摄像装置1中,在镜头13的位置沿着预设的移动方向越过了对比度评价值成为峰值的检测对焦点P1的状态下,能够预测对比度评价值成为该峰值的镜头13的位置。从而,摄像装置1中,使镜头13的位置沿着相反方向移动,能够直接与成为对焦点P2的镜头13的位置一致。
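使检测对焦点P1与对焦点P2的脉冲计数一致的反向驱动,可以写成如下玩具模型式的草图(其中 LensModel 等名称为本文假设,齿隙以"反向开始后若干步内计数不变"来简化模拟):

```python
class LensModel:
    """简化的镜头驱动模型:反向驱动时先消耗齿隙,
    齿隙期间镜头不动、脉冲计数不变。"""
    def __init__(self, count, backlash):
        self.count = count        # 当前 MR 传感器脉冲计数
        self.backlash = backlash  # 反向时的齿隙步数

    def step_reverse(self):
        if self.backlash > 0:
            self.backlash -= 1    # 齿隙:计数不变
        else:
            self.count -= 1       # 镜头实际反向移动

def return_to_peak(lens, peak_count, max_steps=10000):
    """反复反向驱动,直到脉冲计数与检测对焦点的计数一致。
    返回实际驱动的步数。"""
    steps = 0
    while lens.count != peak_count and steps < max_steps:
        lens.step_reverse()
        steps += 1
    return steps
```

由于控制目标是"计数一致"而不是"按时间回退固定量",齿隙只增加若干无效步数,而不会引起最终位置的偏差。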
另外,图3的例中,摄像装置1中,从时间0到时间t3为止的期间,控制DC马达61的驱动,使镜头13移动的同时,参照由旋转传感器63检测的旋转量。然后,摄像装置1中,从时间t3到时间t4为止的期间,根据由MR传感器41检测的脉冲计数,控制DC马达61的驱动,控制镜头13的位置,使得与已检测的检测对焦点P1的脉冲计数一致。
这里,本实施方式中,由MR传感器41检测的脉冲计数成为离散的值。因而,判定部84例如也可以直接使用由MR传感器41检测的脉冲计数来判定镜头13的位置。另外,判定部84例如也可以对由MR传感器41检测的脉冲计数进行插值处理,来判定镜头13的位置。作为对脉冲计数插值来判定镜头13的位置的手法,也可以采用任意的手法。作为插值的手法,例如,也可以采用根据曝光时间(快门速度),将曝光时间的中点的时间中的脉冲计数与峰值关联地算出的手法。另外,作为插值的手法,例如,也可以采用在使用滚动快门时,根据帧速率等,将与AF座标相当的点的时间中的脉冲计数与峰值关联地算出的手法。
用图4及图5说明由本发明的一个实施方式所涉及的摄像装置1执行的对比度评价值的峰值检测处理的一个示例。图4所示曲线图中,横轴表示时间。该曲线图中,纵轴表示对比度评价值。该对比度评价值是由检测部83算出的对比度评价值。图4的例中,以时间0为原点,按照时间早晚顺序,表示了时间t11~时间t13。时间t11~时间t13是设为按照时间前进的顺序并具有一定间隔的时间。
摄像装置1中,设为在时间t11~时间t13分别获得了各自的对比度评价值的点201~203。此时,摄像装置1中,检测部83采用随着时间的前进,第二点202的对比度评价值变得比第一点201的对比度评价值高且第三点203的对比度评价值变得比该第二点202的对比度评价值低的条件。然后,摄像装置1中,在检测到满足该条件的3点时,判定在第一点201和第三点203之间存在对比度评价值的峰值。图4的例中,满足这样的条件。
接着,摄像装置1中,检测部83在穿过第一点201和第二点202的直线的斜率的绝对值与穿过第二点202和第三点203的直线的斜率的绝对值中,判定哪一个大。然后,摄像装置1中,检测部83选择斜率的绝对值大的直线。图4的例中,穿过第一点201和第二点202的直线的斜率的绝对值大,该直线被选择。另外,这两条直线的斜率的绝对值相同时,也可以选择任一个。
接着,摄像装置1中,检测部83算出所选择的直线和具有与该直线的斜率的正负相反的斜率且穿过剩余的1点的直线的交点。图5的例中,选择的直线 是直线211。另外,具有与该直线211的斜率的正负相反的斜率且穿过剩余的1点即第三点203的直线是直线212。另外,这两条直线的交点是交点221。这两条直线211、212穿过交点221,相对于与纵轴平行的直线,形成对称。
然后,摄像装置1中,检测部83判定算出的交点221是对比度评价值成为峰值的点。摄像装置1中,通过执行这样的对比度评价值的峰值检测处理,能够以简易运算在短处理时间检测到对比度评价值成为峰值的点。
另外,摄像装置1中,检测部83在第二点的对比度评价值和第三点的对比度评价值的大小相同时,例如,进一步求出第四点以后的对比度评价值,直到发现对比度评价值低于这些对比度评价值的点。然后,摄像装置1中,检测部83根据求出的点,求出对比度评价值成为峰值的点。
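图4及图5所说明的峰值检测,可以归纳为如下草图:对按时间顺序的3点,取两侧直线中斜率绝对值较大者,与过剩余一点、斜率符号相反的直线求交点(函数名为本文假设):

```python
def peak_from_three_points(p1, p2, p3):
    """p1~p3: (时间, 对比度评价值),按时间顺序,
    且第二点高于第一点、第三点低于第二点。
    返回交点 (时间, 评价值) 作为峰值点的推定。"""
    (t1, c1), (t2, c2), (t3, c3) = p1, p2, p3
    s12 = (c2 - c1) / (t2 - t1)  # 穿过第一、二点的直线斜率
    s23 = (c3 - c2) / (t3 - t2)  # 穿过第二、三点的直线斜率
    if abs(s12) >= abs(s23):
        # 选左侧直线;镜像直线过第三点,斜率为 -s12
        s, (tp, cp), (tq, cq) = s12, (t1, c1), (t3, c3)
    else:
        # 选右侧直线;镜像直线过第一点,斜率为 -s23
        s, (tp, cp), (tq, cq) = s23, (t2, c2), (t1, c1)
    # 解 cp + s*(t - tp) = cq - s*(t - tq) 得交点
    t_peak = (cq - cp + s * (tp + tq)) / (2 * s)
    c_peak = cp + s * (t_peak - tp)
    return t_peak, c_peak
```

两条直线关于过交点且与纵轴平行的直线对称,交点即被推定为对比度评价值的峰值点,这与图5中直线211、212和交点221的关系一致。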
图6是示出由本发明的一个实施方式所涉及的摄像装置1执行的自动聚焦处理的流程的一个示例的图。
(步骤S1)
摄像装置1中,移动控制部82控制DC马达61的驱动,使齿轮62旋转。从而,摄像装置1中,移动控制部82使镜头13在沿着可移动轴D1的预设的一个方向移动。然后,转移到步骤S2的处理。
(步骤S2)
摄像装置1中,在镜头13移动期间,将确定由MR传感器41检测的镜头13的位置的信息存储在存储器74。然后,转移到步骤S3的处理。本实施方式中,采用脉冲计数作为确定镜头13的位置的信息。然后,在存储器74存储该脉冲计数和对比度评价值的对应关系。
(步骤S3)
摄像装置1中,判定检测部83是否检测到对比度评价值成为峰值的对焦位置。该判定的结果为,摄像装置1中,判定检测部83检测到对比度评价值的峰值时(步骤S3:是),转移到步骤S4的处理。另一方面,该判定的结果为,摄像装置1中,判定检测部83未检测到对比度评价值的峰值时(步骤S3:否),转移到步骤S2的处理。
(步骤S4)
摄像装置1中,判定部84根据确定由MR传感器41检测的镜头13的位置的信息,将对比度评价值的峰值中的镜头13的位置判定为对焦位置。然后,摄像装置1中,对焦部85根据镜头13在预设的一个方向越过了峰值的状态,通过移动控制部82,使镜头13向与该一个方向相反的方向移动。摄像装置1中,对焦部85使镜头13移动到对焦位置后停止。然后,本流程的处理结束。另外,控制部75在检测到对焦位置时,也可以在存储器74存储对比度评价值和镜头13的位置的对应关系。
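步骤S1~S4的整体流程可以用如下草图串起来(drive_forward 等接口名为本文假设;峰值判定简化为"评价值升后降"的三点条件):

```python
def autofocus(drive_forward, drive_reverse, read_count, grab_contrast):
    """正向扫描并记录 (脉冲计数, 对比度评价值)(步骤S1、S2),
    检测到越过峰值(步骤S3)后,反向驱动使计数回到
    峰值对应的脉冲计数(步骤S4)。返回对焦位置的计数。"""
    log = []
    while True:
        drive_forward()
        log.append((read_count(), grab_contrast()))
        if (len(log) >= 3 and log[-2][1] > log[-3][1]
                and log[-1][1] < log[-2][1]):
            break  # 评价值升后降:已越过峰值
    peak_count = max(log, key=lambda e: e[1])[0]
    while read_count() != peak_count:
        drive_reverse()
    return peak_count
```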
图7是示出本发明的一个实施方式所涉及的摄像装置1中的MR传感器301及磁性部件302的配置的其他示例的图。图7示出了变形例所涉及镜筒部12的构成例。图7的例中,与图2的例比,磁阻效应元件即MR传感器301的配置位置和磁性部件302的配置位置不同,其他点相同。即,图7的例中,镜头框31设于磁性部件302。另外,图7的例中,镜头框31固定于磁性部件302。另外,图7的例中,MR传感器301设于导轴311。图7的例中,MR传感器301固定于导轴311。导轴311包括使镜头框31沿着镜头13的光轴的方向移动的机构。通过这样的配置,与图2所示配置同样,也能够由MR传感器301检测确定镜头13的位置的信息。图7的例中,采用磁阻效应元件即MR传感器301和磁性部件302,构成检测镜头13的位置的位置检测部。
如上所述,本实施方式所涉及的摄像装置1中,使镜头13在一个方向移动,检测对比度评价值的峰值。然后,本实施方式所涉及的摄像装置1中,根据MR传感器41检测的移动量,使镜头13在该一个方向相反的方向移动,直到与相当于该峰值的对焦状态一致。因而,本实施方式所涉及的摄像装置1中,能够缩短自动聚焦的控制中实现对焦状态所需的时间。从而,本实施方式所涉及的摄像装置1中,能够实现由自动聚焦的控制实现的到对焦状态为止的处理的高速化。
另外,本实施方式所涉及的摄像装置1中,采用可精密检测镜头13的位置的MR传感器41,使镜头13的位置与对焦位置一致。因而,本实施方式所涉及的摄像装置1中,能够提高由自动聚焦的控制实现的对焦状态的精度。即,本实施方式所涉及的摄像装置1中,通过采用MR传感器41,能够进行精度高的自动聚焦的控制。
具体地说,本实施方式所涉及的摄像装置1中,移动控制部82使镜头13沿光轴在预设的方向(第一朝向)移动。此时,摄像装置1中,通过镜头13获取图像。摄像装置1中,检测部83从该图像获取对比度评价值。MR传感器41检测与镜头13的移动量相应的信息。摄像装置1中,将对比度评价值和镜头13的位置的对应关系存储到存储器74。检测部83在镜头13沿着该预设的方向移动期间,检测由通过镜头13的光从摄像元件71获得的图像的对比度评价值的峰值。判定部84根据由MR传感器41检测的信息,判定检测部83检测到峰值的时刻的镜头13的位置。对焦部85在检测部83检测到峰值后,通过移动控制部82使镜头13在该预设的方向相反的方向(与第一朝向相反的第二朝向)移动,使镜头13移动到由判定部84判定的镜头13的位置。这样,摄像装置1中,根据对比度评价值检测到镜头13的对焦位置后,从存储器74读出该对应关系,使镜头13在预设的方向相反的方向移动,使镜头13移动到与对焦位置对应的镜头13的位置。
本实施方式所涉及的摄像装置1中,在存储器74存储确定基于由MR传感器41检测的信息的镜头13的位置的信息和确定对比度评价值的信息的对应关系。判定部84根据存储器74存储的对应关系,判定镜头13的位置。对焦部85根据由MR传感器41检测的信息,使镜头13移动到由判定部84判定的镜头13的位置。
本实施方式所涉及的摄像装置1中,如图4及图5所示,检测部83检测到按时间前进的顺序的3点且第二点202的对比度评价值比第一点201的对比度评价值高、第三点203的对比度评价值比第二点202的对比度评价值低时,选择穿过第一点201和第二点202的直线211与穿过第二点202和第三点203的直线中斜率的绝对值大的直线(图4及图5的例中,直线211),将所选择的直线与穿过第一点201或第三点203中该直线未穿过的一个点(图4及图5的例中,第三点203)且具有与该直线的斜率的正负相反的斜率的直线(图5的例中,直线212)的交点(图5的例中,交点221)检测出作为峰值对应的点。
这里,说明现有的对比度AF,通过与现有的比较,示出由本实施方式所涉及的摄像装置1获得的效果。用图11说明现有的对比度AF的概略情况。图11所示曲线图中,横轴表示时间。该曲线图中,纵轴示出表示相对于时间的旋转传感器的脉冲计数的脉冲计数特性2001及表示相对于时间的对比度评价值的对比度特性2002。这里,作为旋转传感器,采用两相的旋转传感器。另外,旋转传感器根据齿轮的旋转量,对离散的脉冲数计数,进行检测。该计数值是脉冲计数。另外,图11的例所涉及摄像装置中,作为DC马达、齿轮、旋转传感器,例如,采用与图2所示DC马达61、齿轮62、旋转传感器63同样的装置。
图11的例中,以时间0为原点,按照时间早晚顺序,示出了时间t101~时间t106。摄像装置中,首先,在时间0,脉冲计数设为0,从时间0开始,通过DC马达的驱动,使齿轮在预设的旋转方向以一定的速度旋转。此时,摄像装置中,在时间t101发生对比度评价值的峰值后,在时间t102附近检测该峰值。与此对应,摄像装置中,在时间t102附近,控制DC马达的驱动,使齿轮暂时停止。接着,摄像装置中,通过DC马达的驱动,使齿轮在该预设的旋转方向相反的方向以一定的速度旋转。另外,作为齿轮旋转的速度,不管旋转方向如何,采用相同速度。
接着,摄像装置中,在时间t104再次发生对比度评价值的峰值后,在时间t105附近检测该峰值。与此对应,摄像装置中,在时间t105附近,控制DC马达的驱动,使齿轮暂时停止。接着,摄像装置中,通过DC马达的驱动,使齿轮再次在预设的旋转方向以一定的速度旋转。通过这样的登山控制,摄像装置在时间t106使镜头到达检测到对比度评价值的峰值的时刻的镜头位置。另外,图11将在时间t104实现的对焦点P12及在时间t106实现的对焦点P13示出为理论上与检测对焦点P11具有相同对比度评价值的点。
这里,图11的例中,对比度评价值从时间0开始上升,在时间t101达到峰值,成为最大值。然后,对比度评价值从时间t101到时间t102逐渐下降。接着,对比度评价值从时间t102到时间t103为止,因齿隙而不变。然后,对比度评价值从时间t103开始上升,在时间t104成为与峰值相当的最大值。这里,理论上,时间t104中的对比度评价值与检测对焦点P11的对比度评价值一致。
接着,对比度评价值从时间t104到时间t105逐渐下降。接着,对比度评价值在时间t105开始的短暂期间,因齿隙而不变。然后,对比度评价值上升,在时间t106成为与峰值相当的最大值。这里,理论上,时间t106中的对比度评价值与检测对焦点P11的对比度评价值一致。摄像装置中,将时间t106中的镜头的位置设为对焦点P13。
但是,这样的对比度AF中,由于发生齿隙,通过登山控制来控制齿轮后,镜头位置与时间t104中的对焦点P12不一致,从时间t104到时间t106为止的时间间隔T101被耗费在额外的流程上。因而,从镜头在时间t104越过对焦点P12时开始,到通过登山控制使该镜头返回时间t106的对焦点P13为止的时间间隔T101,成为在自动聚焦的控制中实现对焦状态时的迟延时间。另外,图11的例中,脉冲计数特性2001及对比度特性2002基本用实线表示,但与时间间隔T101相当的部分用虚线表示。
另外,这样的对比度AF中,由于发生齿隙,在检测对焦点P11中的脉冲计数(=100)和对焦点P12中的脉冲计数(=80)之间发生偏移。摄像装置中,使镜头向时间t106中的最终对焦点P13移动时,利用惯性,控制使镜头移动到大致的对焦点。因而,摄像装置中,在最终对焦点产生偏差,由自动聚焦的控制实现的对焦状态的精度可能不良。
这样,现有的对比度AF中,在实现对焦状态时,必须进行登山控制。因而,现有的对比度AF中,需要使镜头向一个方向移动的流程、使该镜头向该一个方向相反的方向移动的流程和使该镜头再次向该一个方向移动的流程。因此,现有的对比度AF中,通过自动聚焦的控制实现对焦状态为止的处理所需的时间可能变长。另外,现有的对比度AF中,通过反馈由旋转传感器检测的齿轮的旋转量来进行自动聚焦的控制,但是,由于存在齿轮的齿隙,由自动聚焦的控制实现的对焦状态的精度可能不良。
相对地,本实施方式所涉及的摄像装置1中,能够由使镜头13向一个方向移动的流程和使镜头13向该方向相反的方向移动的流程来实现对焦状态。从而,本实施方式所涉及的摄像装置1中,能够减少操作的流程,能够缩短通过自动聚焦的控制实现对焦状态为止的处理所需的时间。另外,本实施方式所涉及的摄像装置1中,通过采用MR传感器41,能够提高由自动聚焦的控制实现的对焦状态的精度。
另外,本实施方式中,由于在镜头的附近设置由MR传感器41及磁性部件42组成的磁性机构,因此,例如,与小型摄像机相比,优选应用于处理大型镜头的中型摄像机。即,与小型摄像机比,在中型摄像机中,可设置本实施方式所涉及的磁性机构的空间往往更大,特别是在驱动大型镜头时有效。
另外,例如,摄像装置1也可以包括步进马达,取代DC马达61。该情况下,该步进马达由控制部75控制,使齿轮62旋转。然后,镜头13经由步进马达被驱动。另外,步进马达也可能发生与DC马达61同样的位置误差,但是,通过本实施方式所涉及的摄像装置1,能够消除该缺陷。
用图8说明其他实施方式。图8是示出本发明的其他实施方式所涉及的摄像装置2的概略的功能配置的图。图8所示摄像装置2与图2所示摄像装置1相比,取代在镜头13的附近设置的MR传感器41及磁性部件42,而在镜头13的附近设置MR元件以外的位置检测元件401。对于其他部分,图8所示摄像装置2的构成与图2所示摄像装置1的构成相同,附上同一个符号进行说明。
作为位置检测元件401,例如,采用利用镜头13本身或与镜头13一起移动的元件或部件来直接地检测镜头13的位置的元件。作为位置检测元件401,也可以采用能够唯一确定镜头13的位置的任意的元件。作为位置检测元件401,例如,也可以采用使用霍尔元件的传感器、通过可变电阻的分压来检测镜头13的位置的传感器、通过全息图的图案来检测镜头13的位置的传感器、通过光来检测镜头13的位置的传感器等。
霍尔元件是利用霍尔效应来检测磁场的元件。使用霍尔元件的传感器例如包括霍尔元件和磁性部件42,取代图2所示MR传感器41和磁性部件42。该传感器检测能够根据霍尔元件与磁性部件42的相对位置来确定镜头13的位置的信息。通过可变电阻的分压来检测镜头13的位置的传感器例如包括电阻值构成为随镜头13的位置而变化的可变电阻和通过分压检测该可变电阻的电阻值的检测部,取代图2所示MR传感器41和磁性部件42。该传感器检测能够根据该检测部检测的分压来确定镜头13的位置的信息。
通过全息图的图案来检测镜头13的位置的传感器例如包括全息图的图案随镜头13的位置而变化的机构。该传感器检测能够根据该图案来确定镜头13的位置的信息。通过光来检测镜头13的位置的传感器例如包括发射光的发光部和由接收该光的受光元件沿镜头13的可移动轴D1排列而成的受光部,取代图2所示MR传感器41和磁性部件42。然后,该传感器检测能够根据该受光部中接收该发光部的光的受光元件来确定镜头13的位置的信息。
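其中"通过光来检测镜头13的位置"的构成,可以用如下最简草图示意:受光元件沿可移动轴排列,由接收到光的元件下标换算出位置(假设恰有一个元件受光,间距等参数为本文假设):

```python
def position_from_light(received, pitch):
    """received: 各受光元件是否接收到发光部的光的布尔列表,
    按沿可移动轴D1的排列顺序给出;pitch: 相邻元件的间隔。
    返回镜头位置(以第0个元件处为原点)。"""
    idx = received.index(True)  # 接收到光的受光元件的下标
    return idx * pitch
```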
这样,摄像装置2中,采用其他位置检测元件401取代图2所示MR传感器41及磁性部件42时,也能够减少自动聚焦的控制所需的镜头移动的流程。即,摄像装置2中,使镜头13在一个方向移动的同时,检测从摄像元件71获得的图像的对比度评价值。然后,摄像装置2中,在该对比度评价值上升后出现峰值并下降时,使镜头13向该一个方向相反的方向移动,根据位置检测元件401的检测值,控制镜头13使其移动到对焦位置。
具体地说,摄像装置2中,移动控制部82使镜头13沿光轴移动。位置检测元件401检测与镜头13的移动量相应的信息。检测部83在镜头13沿着预设的方向移动期间,检测由通过镜头13的光从摄像元件71获得的图像的对比度评价值的峰值。判定部84根据由位置检测元件401检测的信息,判定由检测部83检测到峰值的时刻的镜头13的位置。对焦部85在检测部83检测到峰值后,通过移动控制部82使镜头13向该预设的方向相反的方向移动,使镜头13移动到由判定部84判定的镜头13的位置。
图9是示出控制部75及存储器74的硬件配置的一个示例的图。控制部75及存储器74作为一个示例,也可以由计算机501构成。计算机501包括主机控制器511、CPU(Central Processing Unit:中央处理器)512、RAM(Random Access Memory:随机存取存储器)513、输入/输出控制器514、通信接口515和ROM(Read Only Memory:只读存储器)516。图2的例中,存储器74也可以是RAM 513、ROM 516。
主机控制器511分别与CPU 512、RAM 513、输入/输出控制器514连接,将它们相互连接。另外,输入/输出控制器514分别与通信接口515和ROM 516连接,将它们与主机控制器511连接。CPU 512例如通过读出并执行RAM 513或ROM 516存储的程序,执行各种的处理或控制。通信接口515例如经由网络与其他器件通信。图2的例中,其他器件也可以是摄像元件71、操作部72、显示部73、DC马达61、旋转传感器63、MR传感器41。
例如,也可以在计算机可读取记录介质(存储介质)记录用于实现实施方式所涉及的各装置(例如,摄像装置1、2等)的功能的程序,通过由计算机系统读入、执行该记录介质记录的程序,来进行处理。另外,这里所说的“计算机系统”也可以包含操作系统(OS:Operating System)或周边设备等的硬件。另外,“计算机可读取记录介质”是指柔性盘、光磁盘、ROM、闪速存储器等的可写入非易失性存储器、DVD(Digital Versatile Disc:数字多功能盘)等的可移动介质、内置于计算机系统的硬盘等的存储装置。另外,作为记录介质,例如也可以是暂时地记录数据的记录介质。
而且,“计算机可读取记录介质”也包含以一定时间保持程序的装置,如成为经由因特网等的网络或电话线路等的通信线路发送程序时的服务器或客户机的计算机系统内部的易失性存储器(例如DRAM(Dynamic Random Access Memory:动态随机存取存储器))。另外,上述程序也可以从在存储装置等存储该程序的计算机系统,经由传送介质或传送介质中的传送波向其他计算机系统传送。这里,传送程序的“传送介质”是指具有因特网等的网络(通信网)或电话线路等的通信线路(通信线)那样传送信息的功能的介质。另外,上述的程序也可以用于实现前述功能的一部分。而且,上述程序也可以是通过与计算机系统已记录的程序的组合来实现前述功能的所谓差分文件(差分程序)。
图10是示出无人驾驶航空器(UAV:Unmanned Aerial Vehicle)601及远程操作装置602的外观的一个示例的图。UAV 601包括UAV本体611、万向节612和多个摄像装置613~615。UAV 601是通过旋转翼飞行的飞行体的一个示例。飞行体是除了UAV外还包含空中移动的其他航空器等的概念。
UAV本体611包括多个旋转翼。UAV本体611通过控制多个旋转翼的旋转,使UAV 601飞行。UAV本体611例如采用4个旋转翼,使UAV 601飞行。旋转翼的数目不限于4个。
摄像装置615是拍摄期望的拍摄范围所包含的被摄体的拍摄用摄像机。万向节612以可变更摄像装置615的姿势的方式支撑摄像装置615。万向节612以可旋转方式支撑摄像装置615。例如,万向节612采用致动器,以可绕俯仰轴旋转的方式支撑摄像装置615。而且,万向节612采用致动器,以可分别以滚转轴及偏航轴为中心旋转的方式支撑摄像装置615。万向节612通过以偏航轴、俯仰轴及滚转轴的至少一个为中心使摄像装置615旋转,也可以变更摄像装置615的姿势。
摄像装置613及摄像装置614是为了控制UAV 601的飞行而拍摄UAV 601的周围的传感用摄像机。两个摄像装置613、614也可以设置在UAV 601的机首即正面。而且,也可以在UAV 601的底面设置另外两个摄像装置(图示省略)。正面侧的两个摄像装置613、614成对,也可以起到所谓立体摄像机的功能。底面侧的两个摄像装置(图示省略)也成对,也可以起到立体摄像机的功能。
根据由摄像装置613及摄像装置614拍摄的图像,也可以生成UAV 601的周围的3维空间数据。UAV 601包括的摄像装置613、614的数量不限于4个。UAV 601包括至少一个摄像装置613、614即可。UAV 601也可以在UAV 601的机首、机尾、侧面、底面及顶面分别具有至少一个摄像装置613、614。可由摄像装置613、614设定的视角也可以比可由摄像装置615设定的视角宽。即,摄像装置613、614的拍摄范围也可以比摄像装置615的拍摄范围宽。摄像装置613、614也可以具有单焦点镜头或鱼眼镜头。
远程操作装置602与UAV 601通信,对UAV 601进行远程操作。远程操作装置602也可以与UAV 601无线通信。远程操作装置602向UAV 601发送与上升、下降、加速、减速、前进、后退、旋转等的UAV 601的移动相关的各种驱动命令。UAV 601接收从远程操作装置602发送的命令,按照该命令执行各种的处理。本实施方式中,例如,也可以采用图2所示摄像装置1或图8所示摄像装置2,作为图10所示摄像装置613~615中的一个以上。
以上,参照附图详述了本发明的实施方式,但是具体的构成不限于该实施方式,也包含在不脱离本发明的要旨的范围内的设计变更等。
另外,也可以实施包括与由摄像装置1、2执行的处理的阶段同样处理的阶段的方法。另外,由计算机构成摄像装置1、2时,也可以实施由该计算机的处理器执行的程序。

Claims (9)

  1. 一种控制装置,其特征在于,包括:位置检测部,其检测镜头的位置;控制部,其使所述镜头沿光轴向第一朝向移动,通过所述镜头获取图像,从所述图像获取对比度评价值,将所述对比度评价值和所述镜头的位置的对应关系存储到存储器,根据所述对比度评价值检测到所述镜头的对焦位置后,从所述存储器读出所述对应关系,使所述镜头向与所述第一朝向相反的第二朝向移动,使所述镜头移动到与所述对焦位置对应的所述镜头的位置。
  2. 权利要求1所述的控制装置,其特征在于,所述位置检测部包括:设于固定所述镜头的镜头框的磁阻效应元件;沿所述光轴排列的由第一极性和不同于所述第一极性的第二极性构成的磁性部件。
  3. 权利要求1所述的控制装置,其特征在于,所述位置检测部包括:沿所述光轴排列的由第一极性和不同于所述第一极性的第二极性构成的磁性部件;设于所述磁性部件的镜头框;以及设于使所述镜头框向所述光轴的方向移动的导轴的磁阻效应元件。
  4. 权利要求2或3所述的控制装置,其特征在于,所述磁性部件的长度比所述镜头可移动的范围和因所述镜头移动而发生的齿隙量相加的长度更长。
  5. 权利要求1至4的任一项所述的控制装置,其特征在于,所述控制部在检测到所述对焦位置时,将所述对应关系存储到所述存储器。
  6. 权利要求1至5的任一项所述的控制装置,其特征在于,通过齿轮连结机构而使所述镜头移动。
  7. 权利要求1至6的任一项所述的控制装置,其特征在于,通过DC马达或步进马达而使所述镜头移动。
  8. 一种方法,其特征在于,包括:控制装置使镜头沿光轴向第一朝向移动,通过所述镜头获取图像的阶段;从所述图像获取对比度评价值的阶段;将所述对比度评价值和所述镜头的位置的对应关系存储到存储器的阶段;根据所述对比度评价值检测到所述镜头的对焦位置后,从所述存储器读出所述对应关系,使所述镜头向与所述第一朝向相反的第二朝向移动,使所述镜头移动到与所述对焦位置对应的所述镜头的位置的阶段。
  9. 一种程序,其特征在于,用于使计算机执行:使镜头沿光轴向第一朝向移动,通过所述镜头获取图像的阶段;从所述图像获取对比度评价值的阶段;将所述对比度评价值和所述镜头的位置的对应关系存储到存储器的阶段;根据所述对比度评价值检测到所述镜头的对焦位置后,从所述存储器读出所述对应关系,使所述镜头向与所述第一朝向相反的第二朝向移动,使所述镜头移动到与所述对焦位置对应的所述镜头的位置的阶段。
PCT/CN2019/099056 2018-08-02 2019-08-02 控制装置、方法及程序 WO2020025051A1 (zh)
