WO2023058347A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2023058347A1
Authority
WO
WIPO (PCT)
Prior art keywords
correction
correction method
breathing
zoom lens
trimming
Prior art date
Application number
PCT/JP2022/031703
Other languages
French (fr)
Japanese (ja)
Inventor
Kohei Uemura
Original Assignee
Sony Group Corporation
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2023058347A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B7/04 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B7/08 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems

Definitions

  • the present technology relates to an information processing device, its method, and a program, and particularly relates to processing technology for breathing correction, which is correction of a change in angle of view that accompanies focusing.
  • Patent Document 1 listed below discloses a technique for correcting breathing. Specifically, for a configuration in which both breathing correction by driving the zoom lens group and breathing correction by trimming the captured image are possible, Patent Document 1 discloses a technique in which breathing correction is performed by driving the zoom lens when the focus lens group is driven in the direction that increases the shooting magnification, and breathing correction is performed by trimming when the focus lens group is driven in the direction that reduces the shooting magnification.
  • This technology has been developed in view of the above circumstances, and aims to appropriately determine the method of breathing correction that should be performed according to the situation of the imaging device.
  • An information processing apparatus according to the present technology has a correction method determination unit that determines, based on information related to the amount of movement of the focus lens group or on scene recognition information, which of a first correction method and a second correction method, both being breathing correction methods, should be used to perform the breathing correction.
  • breathing refers to a phenomenon in which the angle of view changes with focusing
  • breathing correction refers to correction of such a change in angle of view that accompanies focusing.
  • which one of the first and second correction methods should be used for the breathing correction is determined based on the information regarding the amount of movement of the focus lens or the scene recognition information.
  • When the correction method is determined based on information related to the amount of movement of the focus lens group, for example in the case where the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, respectively, a correction method appropriate for the amount by which the focus lens group must be moved can be selected.
  • When the correction method is determined based on the scene recognition information, for example in the case where the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, it becomes possible, in a scene requiring quietness, to perform breathing correction by trimming instead of breathing correction by driving the zoom lens, which would produce operation noise associated with lens movement. A minimal sketch of this kind of decision logic is given below.
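  • (Reference example) The following Python sketch is only a minimal illustration of the determination described above, not an implementation from this disclosure; the names SceneInfo, decide_correction_method, and correction_time_ok are assumptions introduced for illustration, and the concrete criteria are described in the embodiments below.

      # Minimal sketch of the breathing-correction method decision (illustrative only).
      from dataclasses import dataclass

      ZOOM_LENS_CORRECTION = "zoom_lens"   # first correction method: drive the zoom lens group
      TRIMMING_CORRECTION = "trimming"     # second correction method: trim the captured image

      @dataclass
      class SceneInfo:
          quiet_scene: bool  # scene recognition judged that operation noise should be avoided

      def decide_correction_method(scene: SceneInfo, correction_time_ok: bool) -> str:
          # correction_time_ok: True when zoom-lens correction can finish quickly enough
          # for the required focus lens movement (movement-amount criterion).
          if scene.quiet_scene:
              return TRIMMING_CORRECTION  # avoid zoom-drive noise in quiet scenes
          return ZOOM_LENS_CORRECTION if correction_time_ok else TRIMMING_CORRECTION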
  • An information processing method according to the present technology is an information processing method in which an information processing apparatus determines, based on information related to the amount of movement of the focus lens group or on scene recognition information, which of the first correction method and the second correction method, both being breathing correction methods, should be used to perform the breathing correction.
  • A program according to the present technology is a program readable by a computer device, and causes the computer device to realize a function of determining, based on information related to the amount of movement of the focus lens group or on scene recognition information, which of the first correction method and the second correction method, both being breathing correction methods, should be used to perform the breathing correction.
  • FIG. 1 is a diagram showing a configuration example of a camera system as an embodiment according to the present technology.
  • FIG. 2 is a block diagram showing an internal configuration example of an interchangeable lens and an imaging device (information processing device) as a first embodiment.
  • FIG. 3 is an explanatory diagram of terms related to AF control.
  • FIG. 4 is an explanatory diagram of an aspect of the change in angle of view as breathing and an outline of breathing correction.
  • FIG. 5 is a functional block diagram showing functions related to focus-related processing that the imaging device as the first embodiment has.
  • FIG. 6 is a diagram showing an example of focus movement trajectory information.
  • FIG. 7 is a diagram showing an example of an angle-of-view table.
  • FIG. 8 is a diagram showing an example of a lens group moving speed table.
  • FIG. 9 is a diagram showing an example of a trimming magnification table.
  • FIG. 10 is an explanatory diagram of an example of a technique for acquiring the position coordinates of a plurality of zoom lens groups.
  • FIG. 11 is an explanatory diagram of an example of a method of acquiring a target zoom lens position in the embodiment.
  • FIG. 12 is a flowchart showing a specific processing procedure example for realizing the focus-related processing as the first embodiment.
  • FIG. 13 is a block diagram showing an internal configuration example of a lens device and an imaging device that constitute a camera system as a second embodiment.
  • FIG. 14 is a functional block diagram showing functions according to the second embodiment that the imaging device of the second embodiment has.
  • FIG. 15 is an explanatory diagram of lens positions when breathing correction is performed by the zoom lens correction method.
  • FIG. 16 is a diagram showing an example of the lens group moving speed table in the second embodiment.
  • FIG. 17 is a diagram showing an example of a focus group angle-of-view variation rate table.
  • FIG. 18 is a diagram showing an example of a zoom group angle-of-view variation rate table.
  • FIG. 19 is a flowchart showing a specific processing procedure example for realizing control as the second embodiment.
  • FIG. 20 is a flowchart of processing as a first modified example.
  • FIG. 21 is a flowchart of processing as a second modified example.
  • FIG. 22 is a flowchart of processing as a third modified example.
  • FIG. 23 is a simplified configuration explanatory diagram of a camera system as a fourth modified example.
  • FIG. 1 is a diagram showing a configuration example of a camera system as an embodiment according to the present technology.
  • the camera system includes an interchangeable lens 1 and an imaging device (body) 2 that is an embodiment of an information processing device according to the present technology.
  • The interchangeable lens 1 is a lens unit that can be freely attached to and detached from the imaging device 2. Inside the interchangeable lens 1 there are various lenses such as a focus lens and a zoom lens, and the interchangeable lens 1 has a mount part with connection and communication functions, and the like. A specific configuration example of the interchangeable lens 1 will be described later with reference to FIG. 2.
  • the imaging device 2 is configured as a digital camera device in which the interchangeable lens 1 is detachably attached.
  • the imaging device 2 has not only a still image imaging function but also a moving image imaging function.
  • The imaging device 2 includes an imaging element 55 that captures a subject image incident through the interchangeable lens 1, a display unit 61 that can display images captured by the imaging element 55 and a GUI (Graphical User Interface) such as various operation screens, and an operation unit 65 with which the user performs various operation inputs.
  • In addition to the configuration shown in FIG. 1, the imaging device 2 includes a configuration for performing communication with the interchangeable lens 1, and the like.
  • FIG. 2 is a block diagram showing an internal configuration example of the interchangeable lens 1 and the imaging device 2.
  • the interchangeable lens 1 includes a mount section 11 detachably attached to a mount section 51 of the imaging device 2 .
  • the mount section 11 has a plurality of terminals for electrical connection with the imaging device 2 .
  • the interchangeable lens 1 also includes a lens-side control section 12 , a zoom lens 13 , an image stabilization lens 14 , an aperture 15 , a focus lens 16 , an operation section 31 , a memory 32 and a power supply control section 33 .
  • the interchangeable lens 1 includes a zoom lens driving section 21 , a camera shake control section 22 , an aperture control section 23 , a focus lens driving section 24 and a detection section 17 .
  • The lens-side control unit 12 includes, for example, a microcomputer having a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and performs overall control of the interchangeable lens 1 by reading a program stored in a storage device such as the ROM into the RAM and executing it.
  • the lens-side control unit 12 adjusts the position of the zoom lens 13 based on an instruction from the imaging device 2 supplied via a predetermined communication terminal of the mount unit 11 or a user's operation received by the operation unit 31. Control.
  • For example, the lens-side control unit 12 acquires the current position of the zoom lens 13 from the detection unit 17 having, for example, a magnetic sensor (MR sensor) or the like, determines the driving direction and driving amount for moving the zoom lens 13 to a predetermined position based on the acquired result, and outputs the determined driving direction and driving amount to the zoom lens driving section 21 together with a movement command.
  • the zoom lens drive unit 21 moves the zoom lens 13 in the optical axis direction based on the movement command supplied from the lens side control unit 12 so as to achieve the instructed drive direction and drive amount.
  • The detection unit 17 comprehensively represents the configuration for detecting the state of the interchangeable lens 1, such as the positions of the zoom lens 13, the image stabilization lens 14, and the focus lens 16, and the aperture diameter of the diaphragm 15.
  • detection of the lens position can be performed by, for example, a magnetic sensor, a photodiode array, a potentiometer, a reflective encoder, or the like.
  • the detection unit 17 may be configured to include a temperature sensor for detecting temperature.
  • The lens-side control unit 12 controls the camera shake correction lens 14 so as to correct camera shake. Specifically, the lens-side control unit 12 determines the driving direction and driving amount of the camera shake correction lens 14 in the direction of canceling the camera shake amount detected by the camera shake detection sensor provided in the detection unit 17, and outputs the determined driving direction and driving amount to the camera shake control unit 22 together with a movement command.
  • the shake detection sensor in the detection unit 17 is configured by, for example, both or either of a gyro sensor and a triaxial acceleration sensor. The gyro sensor is used to detect a deviation (shake) in the direction corresponding to pitch or yaw as the correction direction of the camera shake correction lens 14.
  • The camera shake control unit 22 moves the camera shake correction lens 14 so as to achieve the instructed driving direction and driving amount. Further, the lens-side control unit 12 performs control to mechanically lock the image stabilization lens 14 when the power supply is turned off. That is, while power is supplied from the imaging device 2 to the interchangeable lens 1, the camera shake correction lens 14 is kept at a predetermined position by control via the camera shake control unit 22; when the power supply is turned off, this position control by the camera shake control unit 22 stops, and the camera shake correction lens 14 would drop by a predetermined amount in the direction of gravity.
  • the lens-side control unit 12 mechanically locks the camera-shake correction lens 14 via the camera-shake control unit 22 in accordance with the timing at which the power supply is turned off to prevent it from falling.
  • the shake control section 22 mechanically locks the shake correction lens 14 based on the fixing command supplied from the lens side control section 12 .
  • the lens-side control section 12 controls (the aperture diameter of) the diaphragm 15 in accordance with an instruction or the like from the imaging device 2 supplied via a predetermined communication terminal of the mount section 11 .
  • Specifically, the lens-side control unit 12 acquires the aperture diameter of the diaphragm 15 detected by the diaphragm detection sensor in the detection unit 17, and issues a command to the diaphragm control unit 23 to drive the diaphragm 15 so that the F value instructed by the imaging device 2 is obtained.
  • the diaphragm controller 23 drives the diaphragm 15 so that the aperture diameter instructed by the lens-side controller 12 is obtained.
  • the lens-side control section 12 controls the position of the focus lens 16 based on instructions from the imaging device 2 supplied via a predetermined communication terminal of the mount section 11 .
  • During AF (Auto Focus), information on the target focus lens position is instructed from the imaging device 2 to the lens-side control section 12.
  • The lens-side control unit 12 acquires the current position of the focus lens 16 from the detection unit 17, determines, based on the acquired information on the current position and the information on the target focus lens position instructed by the imaging device 2, the driving direction and driving amount for moving the focus lens 16 to the target position, and outputs the determined driving direction and driving amount to the focus lens driving section 24 together with a movement command.
  • the focus lens driving unit 24 moves the focus lens 16 in the optical axis direction so as to achieve the instructed drive direction and drive amount.
  • the focus lens 16 is configured as a "focus lens group” including one or more optical elements.
  • When the focus lens group includes a plurality of optical elements, those optical elements are displaced together during focusing.
  • Similarly, the zoom lens 13 is configured as a "zoom lens group" including one or more optical elements; when the zoom lens group includes a plurality of optical elements, those optical elements are displaced together during zooming.
  • In this example, the zoom lens 13 and the focus lens 16 are configured as one zoom lens group and one focus lens group, respectively, but it is also possible to configure them as a plurality of zoom lens groups and a plurality of focus lens groups.
  • The lens-side control unit 12 performs processing for transmitting the position of the zoom lens 13 (hereinafter referred to as "zoom lens position") and the position of the focus lens (hereinafter referred to as "focus lens position") detected by the detection unit 17 to the imaging device 2 (body-side control unit 52). The zoom lens position and the focus lens position are each used in the AF processing performed on the imaging device 2 side. In this example, AF processing is performed for each frame of the captured image; therefore, in the interchangeable lens 1 of this example, the detection unit 17 detects the zoom lens position and the focus lens position for each frame, and the lens-side control unit 12 sequentially transmits the information on the zoom lens position and the focus lens position detected for each frame to the imaging device 2 (body-side control unit 52).
  • the focus lens drive unit 24 can be configured to have, for example, an ultrasonic motor, a DC motor, a linear actuator, a stepping motor, a piezo element (piezoelectric element), etc., as a lens drive source.
  • focusing can also be configured to be performed according to the user's operation received by the operation unit 31.
  • The memory 32 is composed of a non-volatile memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory), and can be used to store an operation program for the lens-side controller 12 and various data.
  • the memory 32 stores focus movement locus information J1, an angle of view table J2, a lens group movement speed table J3, and a trimming magnification table J4, which will be described later.
  • The power supply control unit 33 detects the amount of power supplied from the imaging device 2 and, based on the detected amount of power, supplies power to each unit in the interchangeable lens 1 (the lens-side control unit 12 and the various driving units) while optimally distributing the available power.
  • the imaging device 2 on the body side is provided with a mount section 51 to which the interchangeable lens 1 is detachably attached.
  • the mount section 51 has a plurality of terminals for electrical connection with the mount section 11 of the interchangeable lens 1 .
  • Terminals to be connected include, for example, a terminal for supplying power (power supply terminal), a terminal for transmitting commands and data (communication terminal), and a terminal for transmitting a synchronization signal (synchronization signal terminal).
  • the imaging device 2 further includes a body-side control unit 52, a shutter 53, a shutter control unit 54, an imaging device 55, an ADC (Analog to Digital Converter) 56, a frame memory 57, an image signal processing unit 58, a recording unit 59, a recording medium 60 , a display unit 61 , a memory 62 , a power control unit 63 , a power supply unit 64 and an operation unit 65 .
  • the power control unit 63 supplies the power supplied from the power supply unit 64 to each unit of the imaging device 2 including the body-side control unit 52 .
  • the power control unit 63 also calculates the amount of power that can be supplied to the interchangeable lens 1 based on the operating state of the imaging device 2 and supplies power to the interchangeable lens 1 via the mount unit 51 .
  • the power supply unit 64 includes, for example, a secondary battery such as a NiCd battery, a NiMH battery, or a Li battery.
  • the power supply unit 64 may be configured to be able to receive power supply from a commercial AC power supply via an AC adapter or the like.
  • the body-side control unit 52 includes a microcomputer having a CPU, a ROM, a RAM, etc.
  • In the body-side control unit 52, the CPU reads a program stored in a predetermined storage device such as the ROM or the memory 62 into the RAM and executes it, thereby performing overall control of the imaging device 2 and the camera system.
  • the memory 62 is composed of a non-volatile memory such as an EEPROM, and can be used to store an operation program for the body-side control section 52 and various data.
  • the body-side control unit 52 causes the imaging device 55 to perform imaging processing based on the operation signal representing the user's operation supplied from the operation unit 65 . Furthermore, a predetermined command is transmitted to the interchangeable lens 1 side via the mount section 51 to drive the focus lens 16, the zoom lens 13, and the like.
  • the body-side control unit 52 can acquire information indicating the lens position of the focus lens 16, information indicating the lens position of the zoom lens 13, and the like from the detection unit 17 in the interchangeable lens 1, for example.
  • the shutter 53 is arranged in front of the imaging device 55 (subject side) and opens and closes under the control of the shutter control section 54 .
  • the shutter control unit 54 detects the open/closed state of the shutter 53 and supplies information indicating the detection result to the body side control unit 52 .
  • the shutter controller 54 drives the shutter 53 to open or close based on the control of the body-side controller 52 .
  • the imaging element 55 is configured as an image sensor such as a CCD (Charge Coupled Device) sensor, a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like, and images an object to generate and output captured image data. If the imaging device 55 is composed of a CCD sensor or a CMOS sensor, an electronic shutter can be used, so the shutter 53 can be omitted. When the shutter 53 is omitted, the shutter control section 54 used for its control is also omitted.
  • The imaging element 55 includes pixels for image pickup (RGB pixels) and phase difference detection pixels for acquiring detection information used in AF (Auto Focus) processing by the image plane phase difference method, that is, for acquiring phase difference information between a pair of images formed by pupil division.
  • the phase difference detection pixels are discretely arranged on a pixel arrangement plane where RGB pixels are two-dimensionally arranged according to a predetermined arrangement pattern such as a Bayer arrangement.
  • The received light signal obtained by photoelectric conversion at the phase difference detection pixels is converted into a digital signal by the ADC 56 and supplied to the body-side control section 52. Hereinafter, the signal obtained by digitally converting the light receiving signal of the phase difference detection pixels is denoted as the "phase difference pixel signal Sp".
  • The body-side control unit 52 analyzes the phase difference between the pair of images based on the phase difference pixel signal Sp supplied via the ADC 56, and calculates the focus shift amount with respect to the subject to be focused (focusing target), that is, the defocus amount DF. The body-side control unit 52 performs AF control based on the defocus amount DF thus calculated, as will be described later.
  • the body-side control unit 52 performs processing related to breathing correction.
  • Breathing here means a phenomenon in which the angle of view changes with focusing
  • breathing correction means correction of such a change in angle of view that accompanies focusing.
  • the breathing correction is performed by driving the zoom lens 13 or trimming (electronically cutting out) the captured image.
  • the breathing correction can be performed by switching between correction by driving the zoom lens 13 and correction by trimming the captured image.
  • As breathing correction methods, the correction method by driving the zoom lens 13 is hereinafter referred to as the "zoom lens correction method", and the correction method by trimming the captured image is referred to as the "trimming correction method". The details of the breathing correction processing performed by the body-side control unit 52 will be described later.
  • the image signal processing unit 58 performs predetermined image signal processing on the captured image input via the frame memory 57 .
  • Examples of the image signal processing here include demosaic processing, white balance (WB) adjustment, gamma correction processing, and the like.
  • For example, the image signal processing unit 58 performs image signal processing on the captured image as a RAW image input via the frame memory 57, converts it into image data in a predetermined file format, and causes the recording medium 60 to record the data via the recording unit 59.
  • the image signal processing unit 58 also converts the captured image after the image signal processing into an image signal in accordance with a predetermined display format, supplies the image signal to the display unit 61, and displays the captured image.
  • the image signal processing unit 58 in this embodiment in particular is capable of trimming the captured image.
  • the image signal processing unit 58 performs trimming processing on the captured image based on the instruction from the body side control unit 52 .
  • the recording medium 60 is composed of a non-volatile memory, and the recording unit 59 is configured to be able to write data to the recording medium 60 and read data recorded on the recording medium 60 .
  • the recording medium 60 may be detachable from the imaging device 2 .
  • the display unit 61 is composed of a panel-type display device such as a liquid crystal panel or an organic EL panel, and is capable of displaying images.
  • The display unit 61 is mounted on the rear surface of the imaging device 2 opposite to the front surface where the mount unit 51 is arranged, and can display a so-called through image, images read from the recording medium 60, and a GUI such as various operation screens.
  • The operation unit 65 comprehensively represents the operation elements with which the user performs various operation inputs, including various hardware keys such as a shutter button, a mode dial, and a zoom button, and a touch panel provided to detect touch operations on the display screen of the display unit 61.
  • the operation unit 65 receives a user's operation and supplies an operation signal corresponding to the operation to the body-side control unit 52 .
  • the "subject position” literally represents the position where the subject exists, and the “subject distance” represents the distance from the imaging device 2 to the subject.
  • "Focus position” represents a position where focus is achieved, and can be rephrased as “focus position”.
  • “Focus distance” means the distance from the imaging device 2 to the focus position.
  • The subject distance and the focus distance are distances to positions outside the interchangeable lens 1, and are values expressed as actual distances such as 2 m, 3 m, 4 m, and so on.
  • "Focus lens position" means the position of the focus lens 16 within the movable range of the focus lens 16 in the interchangeable lens 1 as illustrated in the drawing, and "zoom lens position" means the position of the zoom lens 13 within the movable range of the zoom lens 13.
  • the defocus amount DF in this case does not directly represent the error amount of the focus lens position.
  • Based on the defocus amount DF, the body-side control unit 52 obtains the target position of the focus lens 16 necessary for focusing on the focus target (hereinafter referred to as the "target focus lens position"), and instructs the interchangeable lens 1 side with information on the target focus lens position.
  • In FIG. 4, what is shown as "before correction" in the upper part is an example of the change in angle of view with respect to a change in focus position from infinity to the closest distance.
  • In this example, the size of the subject image in the captured image (the letter A in the figure) is largest at infinity and smallest at the closest distance; when the focus position is intermediate between infinity and the closest distance, the image size is smaller than at infinity and larger than at the closest distance.
  • In this case, breathing correction is performed so that, as indicated by "after correction" in the lower part of the figure, the angle of view is gradually narrowed as the focus moves from the infinity side toward the closest distance side and, conversely, gradually widened as the focus moves from the closest distance side toward the infinity side.
  • Breathing correction by trimming is performed by setting the trimming magnification at infinity to "1.0" (that is, no trimming) and gradually increasing the trimming magnification in response to changes in the focus position toward the closest distance.
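  • (Reference example) As a rough sketch of how a trimming magnification could be applied to one captured frame, the following Python code crops the central 1/magnification region and resamples it back to the original size with nearest-neighbour indexing; this is an assumption-based outline, not the actual processing of the image signal processing unit 58.

      import numpy as np

      def trim_for_breathing(frame: np.ndarray, magnification: float) -> np.ndarray:
          # frame: (H, W, C) array; magnification >= 1.0, where 1.0 means no trimming.
          h, w = frame.shape[:2]
          crop_h = int(round(h / magnification))
          crop_w = int(round(w / magnification))
          top = (h - crop_h) // 2
          left = (w - crop_w) // 2
          cropped = frame[top:top + crop_h, left:left + crop_w]
          # Scale the crop back to the full output size (nearest-neighbour, dependency-free).
          ys = (np.arange(h) * crop_h / h).astype(int)
          xs = (np.arange(w) * crop_w / w).astype(int)
          return cropped[ys][:, xs]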
  • FIG. 5 is a functional block diagram showing functions related to focus-related processing as an embodiment of the body-side control unit 52 of the imaging device 2 as the first embodiment.
  • the body-side control section 52 has functions as an information acquisition processing section F1, an AF processing section F2, a first correction control section F3, a second correction control section F4, and a correction method determination section F5.
  • The information acquisition processing unit F1 performs acquisition processing of lens characteristic information used for breathing correction. Specifically, the information acquisition processing unit F1 in this example performs a process of acquiring from the interchangeable lens 1, as the lens characteristic information, the focus movement locus information J1, the angle-of-view table J2, the lens group moving speed table J3, and the trimming magnification table J4 stored in the interchangeable lens 1. The information acquisition processing unit F1 requests the lens-side control unit 12 to transmit the lens characteristic information, and acquires the lens characteristic information transmitted from the lens-side control unit 12 to the imaging device 2 in response to the request.
  • FIG. 6 is a diagram showing an example of the focus movement trajectory information J1.
  • the focus movement locus information J1 is information indicating the relationship between the zoom lens position, the focus lens position, and the focus position.
  • the focus movement locus information J1 in this example is information indicating the focus lens position for each combination of the zoom lens position and the focus position, as shown in the drawing.
  • Here, the zoom lens positions indicated on the vertical axis cover the range from the zoom lens position at one end of the movable range of the zoom lens 13 shown in FIG. 3 to the zoom lens position at the other end.
  • the focus position on the horizontal axis represents each focus position from the focus position corresponding to infinity to the focus position corresponding to the closest distance.
  • The step sizes of the zoom lens position and the focus position are arbitrary.
  • Since the focus movement trajectory information J1 as described above may differ depending on the type and the individual unit of the interchangeable lens 1, in this example the focus movement trajectory information J1 corresponding to the characteristics of each interchangeable lens 1 is stored in its memory 32.
  • FIG. 7 is a diagram showing an example of the angle-of-view table J2.
  • the angle-of-view table J2 is information indicating the focal length (angle of view) for each combination of the zoom lens position and the focus position. Also in this case, the range of the zoom lens position, the focus position, and the increments of these positions are the same as in the case of the focus movement trajectory information J1.
  • the field angle table J2 shows the change characteristics of the field angle for each zoom lens position when the focus position is changed from infinity to the closest distance.
  • the angle-of-view table J2 is used to determine the amount of movement of the zoom lens 13 required when performing breathing correction by the zoom lens correction method, but this point will be explained later.
  • FIG. 8 is a diagram showing an example of the lens group moving speed table J3.
  • The lens group moving speed table J3 is information about the zoom group moving speed (mm/sec), which is the amount of movement per unit time of the zoom lens 13 (of the actuator in the zoom lens drive unit 21) when the zoom lens 13 is driven with a predetermined drive signal value, and indicates the change characteristics of the zoom group moving speed with respect to a predetermined speed change factor.
  • the lens group moving speed table J3 is information indicating the change characteristics of the lens group moving speed with respect to temperature. With such a lens group moving speed table J3, the zoom group moving speed can be obtained with high accuracy with respect to speed change factors such as temperature.
  • The lens group moving speed table J3 is used when calculating the correction time (which can be rephrased as the movement time of the zoom lens 13 required for correction: a correction time Tla to be described later) in the case of performing breathing correction by the zoom lens correction method.
  • In the lens group moving speed table J3, it is also possible to use, as the speed change factor, other factors such as the amount of electric power supplied from the imaging device 2 to the interchangeable lens 1, in addition to the temperature. A small interpolation sketch for such a table is shown below.
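  • (Reference example) The following sketch shows how a moving speed could be looked up from a table of the J3 kind with temperature as the speed change factor; the breakpoints and speed values are placeholder numbers, not values from this disclosure.

      import bisect

      # (temperature in degC, zoom group moving speed in mm/sec) - assumed sample points
      ZOOM_SPEED_TABLE = [(-10.0, 8.0), (0.0, 9.0), (25.0, 10.0), (40.0, 9.5)]

      def zoom_group_speed(temp_c: float) -> float:
          # Linearly interpolate the zoom group moving speed for a given temperature.
          temps = [t for t, _ in ZOOM_SPEED_TABLE]
          if temp_c <= temps[0]:
              return ZOOM_SPEED_TABLE[0][1]
          if temp_c >= temps[-1]:
              return ZOOM_SPEED_TABLE[-1][1]
          i = bisect.bisect_right(temps, temp_c)
          (t0, v0), (t1, v1) = ZOOM_SPEED_TABLE[i - 1], ZOOM_SPEED_TABLE[i]
          return v0 + (v1 - v0) * (temp_c - t0) / (t1 - t0)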
  • FIG. 9 is a diagram showing an example of the trimming magnification table J4.
  • the trimming magnification table J4 is information indicating the trimming magnification for breathing correction for each combination of zoom lens position and focus position. Also in this case, the range of the zoom lens position, the focus position, and the step size of these positions are the same as in the case of the focus movement locus information J1.
  • Such a trimming magnification table J4 indicates, for each zoom lens position, the trimming magnification change characteristic for canceling out the change in the angle of view that accompanies the change in the focus position.
  • In AF, if a target focus position for focusing on the focus target is obtained based on the defocus amount DF, the information on the trimming magnification required for correction can be acquired from the trimming magnification table J4 by using the information on the target focus position and the information on the current zoom lens position.
  • As with the focus movement locus information J1 shown in FIG. 6, the angle-of-view table J2, the lens group moving speed table J3, and the trimming magnification table J4 may also depend on the type and the individual unit of the interchangeable lens 1. Therefore, in this example, the angle-of-view table J2, the lens group moving speed table J3, and the trimming magnification table J4 set according to the characteristics of each interchangeable lens 1 are also stored in its memory 32.
  • the zoom lens 13 and the focus lens 16 can be configured with a plurality of zoom lens groups and a plurality of focus lens groups, respectively. That is, there is a configuration in which zoom adjustment is performed by displacing a plurality of zoom lens groups, and a configuration in which focusing is performed by displacing a plurality of focus lens groups.
  • the "zoom lens position" is information on a combination of the positions of the zoom lens groups.
  • the information on the "focus lens position" is information on a combination of the positions of the focus lens groups.
  • For converting between the zoom lens position, which is the combination information of the positions of the plurality of zoom lens groups as described above, and the position of each individual zoom lens group, information as shown in FIG. 10 should be used; specifically, table information as shown in FIG. 10A or function information as shown in FIG. 10B may be used.
  • Here, an example in which the zoom lens 13 is composed of two lens groups, a first zoom lens group and a second zoom lens group, is shown. In FIG. 10A, a table storing combination information of the position coordinates of the first zoom lens group and the position coordinates of the second zoom lens group is used. The "zoom lens position" in this case can be represented by either the position of the first zoom lens group or the position of the second zoom lens group, but in substance it is still a combination of the position of the first zoom lens group and the position of the second zoom lens group.
  • When the focus lens 16 is composed of a plurality of focus lens groups, similar tables and functions may be stored in the interchangeable lens 1 and acquired by the imaging device 2 as necessary. A simple sketch of the FIG. 10 idea follows.
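  • (Reference example) The sketch below restates the FIG. 10 idea in code: when the zoom lens consists of two groups, one "zoom lens position" value stands for a combination of the two group coordinates, held either as a table (FIG. 10A style) or as a function (FIG. 10B style). All numeric values and the linear relation are placeholder assumptions.

      # FIG. 10A style: table from zoom lens position index to (first group, second group) coordinates.
      ZOOM_GROUP_COORDS = {
          0: (0.0, 0.0),
          1: (0.8, 0.3),
          2: (1.5, 0.7),
          3: (2.1, 1.2),
      }

      # FIG. 10B style: the second group coordinate expressed as a function of the first.
      def second_group_from_first(first_group_pos: float) -> float:
          return 0.55 * first_group_pos  # placeholder relation; a real lens stores its own curve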
  • The AF processing unit F2 performs AF-related processing; specifically, it performs the processing for obtaining the defocus amount DF described above and the processing for obtaining, based on the defocus amount DF, a target focus lens position for focusing on the focus target.
  • Specifically, the AF processing unit F2 obtains the focus position for focusing on the focus target (hereinafter referred to as the "target focus position") based on the current (current-frame) zoom lens position and focus lens position information transmitted from the interchangeable lens 1 side and on the focus movement locus information J1. That is, first, the current focus position is obtained based on the current zoom lens position and focus lens position information and the focus movement locus information J1. Then, the target focus position is calculated based on the current focus position and the defocus amount DF. Next, the AF processing unit F2 acquires the target focus lens position based on the information on the target focus position, the current zoom lens position, and the focus movement trajectory information J1.
  • the AF processing unit F2 instructs the lens-side control unit 12 with the information on the target focus lens position acquired as described above. Thereby, in the interchangeable lens 1, the focus lens 16 is driven so that the focus lens position matches the target focus lens position, and AF is realized.
  • For the focus movement trajectory information J1, it is not realistic in terms of data capacity to cover all zoom lens positions, focus positions, and focus lens positions, so a given combination of positions may not exist in the focus movement trajectory information J1. In that case, interpolation processing such as linear interpolation is performed to obtain, for example, the focus position corresponding to the combination of the current focus lens position and zoom lens position, as sketched below.
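  • (Reference example) The following Python sketch illustrates the J1-based conversions with linear interpolation; j1_row is assumed to be the row of J1 for the current zoom lens position, with focus lens positions that increase monotonically together with the focus positions. The function and variable names are assumptions introduced for illustration only.

      import numpy as np

      def focus_lens_to_focus_position(j1_row, focus_positions, focus_lens_pos):
          # Invert one zoom row of J1: focus lens position -> focus position (linear interpolation).
          return float(np.interp(focus_lens_pos, j1_row, focus_positions))

      def focus_position_to_focus_lens(j1_row, focus_positions, focus_pos):
          # Forward lookup of one zoom row of J1: focus position -> focus lens position.
          return float(np.interp(focus_pos, focus_positions, j1_row))

      def target_focus_lens_position(j1_row, focus_positions,
                                     current_focus_lens_pos, focus_shift_from_df):
          current_focus_pos = focus_lens_to_focus_position(
              j1_row, focus_positions, current_focus_lens_pos)
          target_focus_pos = current_focus_pos + focus_shift_from_df  # shift derived from DF
          return focus_position_to_focus_lens(j1_row, focus_positions, target_focus_pos)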
  • the processing for obtaining the target focus lens position from the defocus amount DF is performed on the imaging device 2 side, but the processing for obtaining the target focus lens position from the defocus amount DF can also be performed on the interchangeable lens 1 side.
  • In that case, the body-side control unit 52 transmits information on the defocus amount DF to the lens-side control unit 12, and the lens-side control unit 12 acquires the target focus lens position based on the focus movement locus information J1 stored in the memory 32.
  • the first correction control unit F3 and the second correction control unit F4 perform control for realizing correction of breathing correction by different correction methods. Specifically, in this example, the first correction control unit F3 performs control for realizing breathing correction by the zoom lens correction method, and the second correction control unit F4 performs control for realizing breathing correction by the trimming correction method. control.
  • Specifically, the first correction control unit F3 instructs the lens-side control unit 12 with the information on the target zoom lens position for realizing correction by the zoom lens correction method, which is obtained by the processing of the correction method determination unit F5 described later. Thereby, the breathing correction by the zoom lens correction method is executed on the interchangeable lens 1 side.
  • The second correction control unit F4 in this example instructs the image signal processing unit 58 with the information on the trimming magnification acquired based on the trimming magnification table J4 as described later, that is, the trimming magnification for realizing breathing correction by the trimming correction method, and thereby causes the image signal processing unit 58 to execute trimming processing for breathing correction.
  • The correction method determination unit F5 determines, based on the information related to the amount of movement of the focus lens group, which of the first correction method and the second correction method, both being breathing correction methods, should be used for the breathing correction. That is, in this example, it determines which of the zoom lens correction method and the trimming correction method is to be executed.
  • Specifically, the determination of the correction method here is performed by calculating, based on the information relating to the amount of movement of the focus lens group, the correction time Tla in the case where the zoom lens correction method is adopted, and judging whether or not the correction time Tla is equal to or less than a predetermined threshold value Tth.
  • First, for the change of the focus position caused by focusing, the position to which the zoom lens 13 should be moved to correct breathing when the zoom lens correction method is adopted is specified using the angle-of-view table J2. In other words, the target position of the zoom lens for breathing correction when the zoom lens correction method is adopted is specified. Hereinafter, this target position is referred to as the "target zoom lens position".
  • Although FIG. 11 shows an example in which information on the angle of view is stored in the angle-of-view table J2, information on the focal length can also be stored.
  • For example, suppose that the current zoom lens position is Zm[1] in the figure and that the change of the focus position caused by focusing would change the value from 17.0 (mm) to 16.8 (mm). In this case, the zoom lens position should be changed from Zm[1] to Zm[2] so that 17.0 is maintained; that is, the target zoom lens position in this case can be identified as Zm[2].
  • the target zoom lens position can be obtained by referring to the angle-of-view table J2 based on information on the current zoom lens position and the focus position after the change due to focusing. Specifically, a zoom lens position that can maintain the angle of view before the change at the changed focus position is acquired as the target zoom lens position.
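  • (Reference example) The search over the angle-of-view table J2 described above can be sketched as follows, with J2 held as a 2D array indexed by zoom position and focus position; the index-based representation is an assumption for illustration.

      import numpy as np

      def target_zoom_index(j2: np.ndarray, current_zoom_idx: int,
                            current_focus_idx: int, new_focus_idx: int) -> int:
          # Return the zoom index whose angle of view (or focal length) at the changed
          # focus position is closest to the angle of view before the change.
          angle_to_keep = j2[current_zoom_idx, current_focus_idx]
          column_after_change = j2[:, new_focus_idx]
          return int(np.argmin(np.abs(column_after_change - angle_to_keep)))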
  • The correction method determination unit F5 calculates the correction time Tla, which is the time required for correction when breathing correction is performed by the zoom lens correction method. Specifically, the amount of movement of the zoom lens 13 required for correction is calculated from the target zoom lens position and the current zoom lens position as the "correction zoom lens movement amount", and the correction time Tla is calculated based on the correction zoom lens movement amount and the zoom lens group moving speed information obtained from the lens group moving speed table J3 shown in FIG. 8. In this example, since the lens group moving speed table J3 is information defining the zoom lens group moving speed with respect to temperature, the correction method determination unit F5 acquires the temperature information of the interchangeable lens 1 detected by the detection unit 17, obtains the corresponding zoom lens group moving speed, and calculates the correction time Tla based on the obtained zoom lens group moving speed and the correction zoom lens movement amount. Specifically, the correction time Tla is calculated by dividing the correction zoom lens movement amount by the zoom lens group moving speed.
  • The correction method determination unit F5 determines whether the correction time Tla thus calculated is equal to or less than a predetermined threshold value Tth; if the correction time Tla is equal to or less than the threshold Tth, breathing correction is performed by the zoom lens correction method, and if not, breathing correction is performed by the trimming correction method. A sketch of this decision is given below.
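  • (Reference example) The decision based on the correction time Tla can be sketched as below; the function signature is an assumption, and the zoom group moving speed is assumed to have been obtained beforehand from the lens group moving speed table J3 (for example by the temperature lookup sketched earlier).

      def choose_breathing_correction(current_zoom_pos_mm: float,
                                      target_zoom_pos_mm: float,
                                      zoom_group_speed_mm_per_sec: float,
                                      tth_sec: float) -> str:
          movement_mm = abs(target_zoom_pos_mm - current_zoom_pos_mm)  # correction zoom lens movement amount
          tla_sec = movement_mm / zoom_group_speed_mm_per_sec          # correction time Tla
          return "zoom_lens" if tla_sec <= tth_sec else "trimming"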
  • When breathing correction is performed by the zoom lens correction method, the above-described first correction control unit F3 instructs the lens-side control unit 12 with the target zoom lens position, so that breathing correction by the zoom lens correction method is executed on the interchangeable lens 1 side.
  • When breathing correction is performed by the trimming correction method, the second correction control unit F4 instructs the image signal processing unit 58 with the trimming magnification acquired based on the trimming magnification table J4, and causes it to execute trimming processing for correction.
  • With the trimming magnification table J4, once the information on the focus position after the change due to focusing is obtained, the trimming magnification for correction can be obtained based on that information and the information on the current zoom lens position.
  • For the trimming magnification table J4 as well, it is not realistic in terms of data capacity to include information covering all combinations of zoom lens positions and focus positions, so the combination of the current zoom lens position and focus position may not exist in the trimming magnification table J4. In that case, interpolation processing such as linear interpolation is performed to obtain the corresponding trimming magnification.
  • A specific processing procedure example for realizing the focus-related processing as the first embodiment described above will be described with reference to the flowchart of FIG. 12.
  • the processing shown in FIG. 12 is executed by the body-side control unit 52 as software processing based on the program stored in the aforementioned ROM or the like.
  • a processing example will be described on the premise that focusing is performed by AF.
  • In step S101, the body-side control unit 52 waits for the start of moving image capturing; that is, it waits for the moving image capturing operation to start based on the user's operation input or the like.
  • When it is determined that moving image capturing has started, the body-side control unit 52 performs processing for acquiring the current zoom lens position and focus lens position in step S102; that is, it acquires the zoom lens position and the focus lens position sequentially transmitted from the lens-side control unit 12 for each frame.
  • In step S103, the body-side control unit 52 acquires the current focus position based on the focus movement trajectory information J1; that is, the current focus position is acquired based on the information on the current zoom lens position and focus lens position acquired in step S102 and the focus movement locus information J1.
  • In step S104, the body-side control unit 52 acquires the defocus amount DF by AF; that is, the defocus amount DF is acquired based on the phase difference pixel signal Sp, as described for the AF processing unit F2.
  • In step S105, the body-side control unit 52 calculates the target focus position based on the current focus position and the defocus amount DF. Then, in step S106 following step S105, the body-side control unit 52 acquires the zoom lens position for breathing correction, that is, the aforementioned target zoom lens position, based on the target focus position, the current zoom lens position, and the angle-of-view table J2. The method of obtaining the target zoom lens position based on the angle-of-view table J2 is as already described, with the target focus position used as the above-mentioned "focus position after change", so a duplicate explanation is avoided.
  • In step S107, the body-side control unit 52 calculates the zoom lens movement amount for breathing correction, that is, the correction zoom lens movement amount described above, based on the target zoom lens position and the current zoom lens position.
  • In step S108, the body-side control unit 52 calculates the correction time Tla for the zoom lens based on the correction zoom lens movement amount and the lens group moving speed table J3. The method of calculating the correction time Tla has already been explained, so a duplicate explanation is avoided.
  • In step S109, the body-side control unit 52 determines whether or not the correction time Tla is equal to or less than the threshold value Tth. If the correction time Tla is equal to or less than the threshold value Tth, the body-side control unit 52 proceeds to step S110 and performs zoom lens movement control to the target zoom lens position; that is, the target zoom lens position acquired in step S106 is instructed to the lens-side control unit 12 to execute breathing correction by the zoom lens correction method.
  • If the correction time Tla is not equal to or less than the threshold value Tth, the body-side control unit 52 proceeds to step S111 and acquires the trimming magnification for breathing correction based on the trimming magnification table J4. Specifically, the trimming magnification for breathing correction is acquired based on the current zoom lens position acquired in step S102, the target focus position information acquired in step S105, and the trimming magnification table J4. Then, in subsequent step S112, the body-side control section 52 instructs the image signal processing section 58 on the acquired trimming magnification. Thereby, breathing correction by the trimming correction method is realized.
  • the body-side control unit 52 advances the process to step S113 when the process of step S110 is executed and when the process of step S112 is executed.
  • In step S113, the body-side control unit 52 performs frame imaging standby processing; that is, it waits for the completion of the imaging operation for one frame.
  • In step S114, the body-side control unit 52 determines whether or not the moving image capturing has ended; that is, it waits for the end of the moving image capturing operation.
  • If the moving image capturing has not ended, the body-side control unit 52 returns to step S102.
  • Thereby, the processing of steps S102 to S113 is repeatedly executed in frame cycles as processing for breathing correction with respect to focusing by AF. A rough code outline of this per-frame flow is given after this list of steps.
  • If the moving image capturing has ended, the body-side control unit 52 ends the series of processes shown in FIG. 12.
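  • (Reference example) The per-frame flow of steps S102 to S112 can be outlined in code as follows. The array-based representations of J1, J2, and J4 and all function and parameter names are assumptions for illustration; a real implementation would also interpolate between table entries as described above.

      import numpy as np

      def breathing_correction_for_frame(zoom_idx, focus_lens_pos, focus_shift_from_df,
                                         j1_row, focus_positions, j2, j4,
                                         zoom_positions_mm, zoom_speed_mm_per_sec, tth_sec):
          # Returns ("zoom_lens", target_zoom_idx) or ("trimming", magnification).
          # S103: current focus position from J1 (row for the current zoom position).
          current_focus = float(np.interp(focus_lens_pos, j1_row, focus_positions))
          # S105: target focus position from the defocus amount DF.
          target_focus = current_focus + focus_shift_from_df
          # S106: target zoom position that keeps the current angle of view (J2).
          focus_idx = int(np.argmin(np.abs(focus_positions - current_focus)))
          new_focus_idx = int(np.argmin(np.abs(focus_positions - target_focus)))
          angle_to_keep = j2[zoom_idx, focus_idx]
          target_zoom_idx = int(np.argmin(np.abs(j2[:, new_focus_idx] - angle_to_keep)))
          # S107, S108: correction zoom lens movement amount and correction time Tla.
          movement_mm = abs(zoom_positions_mm[target_zoom_idx] - zoom_positions_mm[zoom_idx])
          tla_sec = movement_mm / zoom_speed_mm_per_sec
          # S109 to S112: choose and apply the correction method.
          if tla_sec <= tth_sec:
              return ("zoom_lens", target_zoom_idx)           # S110: zoom lens correction method
          magnification = float(j4[zoom_idx, new_focus_idx])  # S111: J4 lookup
          return ("trimming", magnification)                  # S112: trimming correction method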
  • the processing for switching the correction method described above can also be preferably applied when focusing is performed by manual operation instead of focusing by AF.
  • the focus position after change by manual operation may be applied as the "changed focus position" in acquiring the target zoom lens position based on the angle-of-view table J2.
  • the "information on the amount of movement of the focus lens” corresponds to information on the amount of movement of the focus lens detected for displacement of the focus lens by manual operation.
  • Note that the "acquisition" of information here is synonymous with storing the target information in a predetermined storage device (for example, a RAM or a register) in a state that can be processed by the processor.
  • <Second Embodiment> Next, a second embodiment will be described.
  • the second embodiment relates to processing to be executed during breathing correction by the zoom lens correction method.
  • the same reference numerals will be given to the same parts as those already explained, and the explanation will be omitted.
  • FIG. 13 is a block diagram showing an internal configuration example of an interchangeable lens 1A and an imaging device 2A that configure a camera system according to the second embodiment.
  • The interchangeable lens 1A differs from the interchangeable lens 1 in that it stores a lens group moving speed table J3A instead of the lens group moving speed table J3 in the memory 32, and further stores a focus group angle-of-view variation rate table J5 and a zoom group angle-of-view variation rate table J6.
  • the lens group moving speed table J3A, the focus group angle of view variation rate table J5, and the zoom group angle of view variation rate table J6 will be described again.
  • the imaging device 2A differs from the imaging device 2 in that a body-side control section 52A is provided instead of the body-side control section 52. As shown in FIG. 14, the body-side control section 52A differs from the body-side control section 52 in functions.
  • FIG. 14 is a functional block diagram showing the functions of the body-side control section 52A according to the second embodiment.
  • Compared with the body-side control unit 52 (see FIG. 5), the body-side control unit 52A differs in that it has an information acquisition processing unit F1A instead of the information acquisition processing unit F1 and a first correction control unit F3A instead of the first correction control unit F3.
  • The information acquisition processing unit F1A performs a process of acquiring, from the interchangeable lens 1A, the focus movement trajectory information J1, the angle-of-view table J2, the lens group moving speed table J3A, the trimming magnification table J4, the focus group angle-of-view variation rate table J5, and the zoom group angle-of-view variation rate table J6.
  • Compared with the first correction control unit F3, the first correction control unit F3A differs in that it performs different processing as the zoom lens movement control processing of step S110 described in the first embodiment, which is executed during breathing correction by the zoom lens correction method.
  • FIG. 15 schematically shows, for the case where breathing correction is performed by the zoom lens correction method, the positional relationship between the target zoom lens position, the position of the zoom lens 13 before moving to the target zoom lens position, the target focus lens position by AF, and the position of the focus lens 16 before it is moved to the target focus lens position.
  • the focus lens 16 moves from the current focus lens position toward the target focus lens position, and the zoom lens 13 moves toward the target zoom lens position.
  • If there is a difference between the moving speed of the focus lens 16 and that of the zoom lens 13 during this movement, the change in angle of view due to breathing cannot be sufficiently corrected, and the user may perceive the change in angle of view.
  • the first correction control unit F3A performs the following processing in order to suppress the change in angle of view caused by the difference in moving speed between the focus lens 16 and the zoom lens 13.
  • In the second embodiment, the driving of the zoom lens 13 and the focus lens 16 is performed in cycles with a minute time Δt as the unit period.
  • In the following, an example is described in which the zoom lens 13 is driven with its drive signal value set to a fixed value such as the maximum value, and the moving speed of the focus lens 16 (the speed Vf described later) is adjusted.
  • the first correction control unit F3A performs a process of displacing the zoom lens position and the focus lens position to their respective target positions while adjusting the focus group moving speed Vf so as to satisfy the condition of [Equation 1] below.
  • Here, Vz means the zoom group moving speed, which is the moving speed of the zoom lens 13 (see FIG. 8), and Vf means the focus group moving speed, which is the moving speed of the focus lens 16.
  • Az means the rate of change in angle of view (%/mm) per unit drive of the zoom lens 13 when the zoom group moving speed Vz is a predetermined reference speed (hereinafter referred to as "zoom group angle of view change rate Az ”).
  • the "unit drive” referred to here means lens drive for one cycle by the minute time .delta.t described above.
  • Af means the rate of change in angle of view per unit drive of the focus lens 16 when the focus group moving speed Vf is a predetermined reference speed (hereinafter referred to as "focus group angle of view change rate Af").
  • Each time one cycle of driving with the minute time Δt is performed, the first correction control unit F3A obtains the angle-of-view change rate "Vz × Az" due to the movement of the zoom lens and the angle-of-view change rate "Vf × Af" due to the movement of the focus lens, and determines whether or not the absolute value of their difference (|Vz × Az − Vf × Af|) is equal to or less than a predetermined threshold value Gth, that is, whether or not the condition of [Equation 1] is satisfied.
  • When the condition of [Equation 1] is satisfied, the first correction control unit F3A drives the zoom lens 13 and the focus lens 16 for one cycle, with the zoom lens 13 driven at the zoom group moving speed Vz and the focus lens 16 driven at the speed indicated by the focus group moving speed Vf.
  • On the other hand, when the condition of [Equation 1] is not satisfied, the first correction control unit F3A obtains a focus group moving speed Vf that satisfies the condition of [Equation 1], and drives the focus lens 16 at the obtained focus group moving speed Vf and the zoom lens 13 at the zoom group moving speed Vz for one cycle.
  • The focus group moving speed Vf can be acquired based on the lens group moving speed table J3A illustrated in FIG. 16.
  • The lens group moving speed table J3A is obtained by adding, to the lens group moving speed table J3 described above, information indicating the change characteristics of the focus group moving speed Vf with respect to a predetermined speed change factor.
  • As the predetermined speed change factor, temperature is used as an example in this case as well.
  • the zoom group moving speed Vz and the focus group moving speed Vf can be obtained with high accuracy with respect to speed change factors such as temperature.
  • The focus group angle-of-view variation rate Af is acquired based on the focus group angle-of-view variation rate table J5 illustrated in FIG. 17.
  • the focus group field angle variation rate table J5 is information indicating the focus group field angle variation rate Af for each combination of the zoom lens position and the focus lens position.
  • When making the determination using [Equation 1], the first correction control unit F3A acquires, based on information on the current zoom lens position and focus lens position, the corresponding focus group angle-of-view variation rate Af from the information content of the focus group angle-of-view variation rate table J5.
  • The first correction control unit F3A acquires the zoom group angle-of-view variation rate Az based on the zoom group angle-of-view variation rate table J6 illustrated in FIG. 18.
  • the zoom group view angle variation rate table J6 is information indicating the zoom group view angle variation rate Az for each zoom lens position. Based on the information on the current zoom lens position, the first correction control unit F3A acquires information on the corresponding zoom group angle of view variation rate Az based on the information content of the zoom group angle of view variation rate table J6.
  • FIG. 19 is a flow chart showing a specific processing procedure example for realizing the control as the second embodiment described above.
  • The body-side control section 52A executes the process shown in FIG. 19 as the process of step S110 shown in FIG. 12.
  • the body-side control unit 52A first performs the process of acquiring the zoom group moving speed Vz and the focus group moving speed Vf in step S201. That is, based on the lens group moving speed table J3A illustrated in FIG. 16 and the temperature information of the interchangeable lens 1A detected by the detection unit 17, the zoom group moving speed Vz and the focus group moving speed Vf are obtained.
  • In step S202 following step S201, the body-side control unit 52A acquires the zoom group angle-of-view variation rate Az and the focus group angle-of-view variation rate Af corresponding to the current one cycle of driving. That is, they are acquired based on the information on the zoom lens position and the focus lens position detected for the current cycle, the zoom group angle-of-view variation rate table J6, and the focus group angle-of-view variation rate table J5. Specifically, the corresponding zoom group angle-of-view variation rate Az is acquired from the zoom group angle-of-view variation rate table J6 based on the information on the zoom lens position, and the corresponding focus group angle-of-view variation rate Af is acquired from the focus group angle-of-view variation rate table J5 based on the information on the zoom lens position and the focus lens position.
  • In step S203, the body-side control unit 52A determines whether or not the condition of [Equation 1], that is, |Vz × Az − Vf × Af| ≤ Gth, is satisfied.
  • If the condition is not satisfied, the body-side control unit 52A proceeds to step S204 and performs processing to obtain a focus group moving speed Vf that satisfies the condition, that is, a focus group moving speed Vf that satisfies [Equation 1]. Then, the body-side control unit 52A executes the process of step S205 using the focus group moving speed Vf obtained in step S204. As a result, lens driving can be controlled so that the difference in moving speed between the zoom lens 13 and the focus lens 16 does not exceed a constant difference based on the threshold value Gth, and it is possible to suppress the change in angle of view resulting from that speed difference.
  • In step S206, the body-side control unit 52A determines whether or not the zoom lens 13 and the focus lens 16 have reached their target positions. If the zoom lens 13 and the focus lens 16 have not reached their respective target positions, the body-side control unit 52A returns to step S202. As a result, until the zoom lens 13 and the focus lens 16 reach their respective target positions, the lens drive control based on [Equation 1] is executed for each period of the minute time Δt described above.
  • If the zoom lens 13 and the focus lens 16 have reached their respective target positions, the body-side control section 52A ends the processing of step S110.
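  • To make the per-cycle control of steps S201 to S206 (FIG. 19) easier to follow, the following is a minimal, self-contained Python sketch of that loop. The function and parameter names, the constant Az/Af table functions, and the numeric values are assumptions for illustration only and are not part of the disclosed configuration; the sketch simply mirrors the order of the steps described above.

```python
# Illustrative sketch of the per-cycle control of steps S201-S206 (FIG. 19).
# Names and values are assumed for explanation; Vz and Vf (step S201) are passed in
# as if they had already been read from the lens group moving speed table J3A.

def zoom_lens_movement_control(zoom_pos, focus_pos, zoom_target, focus_target,
                               vz, vf, az_table, af_table, g_th, dt):
    """Drive both lenses toward their targets while satisfying [Equation 1]."""
    while zoom_pos < zoom_target or focus_pos < focus_target:
        az = az_table(zoom_pos)              # S202: zoom group rate Az (table J6)
        af = af_table(zoom_pos, focus_pos)   # S202: focus group rate Af (table J5)

        if abs(vz * az - vf * af) > g_th and af != 0:   # S203: [Equation 1] violated?
            vf = vz * az / af                           # S204: Vf satisfying [Equation 1]

        # S205: drive for one cycle of dt (clamped so the targets are not overshot).
        zoom_pos = min(zoom_target, zoom_pos + vz * dt)
        focus_pos = min(focus_target, focus_pos + vf * dt)
    # S206: both lenses have reached their targets, so step S110 ends.
    return zoom_pos, focus_pos


# Example run with purely illustrative numbers and constant table functions.
if __name__ == "__main__":
    final = zoom_lens_movement_control(
        zoom_pos=0.0, focus_pos=0.0, zoom_target=1.0, focus_target=2.0,
        vz=10.0, vf=25.0,
        az_table=lambda z: 0.8,         # %/mm, assumed constant here
        af_table=lambda z, f: 0.4,      # %/mm, assumed constant here
        g_th=0.5, dt=0.001)
    print(final)
```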
  • the embodiment is not limited to the specific examples described above, and various modifications can be made.
  • As a first modified example, even when the correction execution condition by the trimming correction method is satisfied, control can be performed so that breathing correction is performed by the zoom lens correction method when the trimming magnification required for the breathing correction exceeds a threshold value (see FIG. 20).
  • In this case, when the correction execution condition by the trimming correction method based on the information related to the movement amount of the focus lens group is satisfied, the body-side control unit 52 performs the processing of acquiring the trimming magnification for correction in step S111, and then determines in step S301 whether or not the trimming magnification is equal to or less than a predetermined threshold value Rth.
  • This threshold value Rth is a value determined as a permissible value of the trimming magnification in terms of image quality.
  • If the trimming magnification is equal to or less than the threshold value Rth, the body-side control section 52 proceeds to step S112 and instructs the image signal processing section 58 on the trimming magnification so that the breathing correction by the trimming correction method is executed.
  • On the other hand, if the trimming magnification is not equal to or less than the threshold value Rth, the body-side control unit 52 proceeds to step S110 and performs zoom lens movement control to the target zoom lens position. That is, when the trimming magnification exceeds the threshold value Rth defined as an allowable value in terms of image quality, the breathing correction by the trimming correction method is not performed, and the breathing correction by the zoom lens correction method is performed instead.
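  • The branch added in FIG. 20 (step S301) can be pictured with the short sketch below; the function name and the example threshold value are assumptions for illustration, not values from the disclosure.

```python
# Illustrative sketch of the first modified example (FIG. 20): even when the trimming
# correction method would otherwise be selected, fall back to the zoom lens correction
# method if the trimming magnification for correction exceeds the image-quality limit Rth.

def select_correction_for_trimming_case(correction_trimming_magnification, r_th):
    # Step S301: is the trimming magnification within the allowable range?
    if correction_trimming_magnification <= r_th:
        return "trimming"    # proceed to step S112 (trimming correction)
    return "zoom_lens"       # proceed to step S110 (zoom lens correction)


# Example with an assumed allowance of Rth = 1.2.
print(select_correction_for_trimming_case(1.05, 1.2))  # -> trimming
print(select_correction_for_trimming_case(1.30, 1.2))  # -> zoom_lens
```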
  • In a second modified example, the breathing correction method is switched based on scene recognition information (see FIG. 21). Here, the scene recognition information means information for recognizing the imaged scene.
  • In this case, when the correction execution condition by the zoom lens correction method is satisfied, the body-side control unit 52 (or 52A; the same applies hereinafter) determines in step S401 whether or not the scene requires silence. That is, it determines, based on the scene recognition information, whether or not the scene is a specific scene requiring silence.
  • As the scene recognition information, it is conceivable to use scene recognition information obtained by image recognition processing of the captured image. For example, recognition of a specific scene, such as a concert scene or a scene of a wild animal such as a wild bird, is performed by image recognition processing of the captured image.
  • the scene recognition information may be information based on a detection signal from an illuminance sensor or the like provided in the imaging device 2 (or 2A; the same shall apply hereinafter).
  • the scene recognition information may be recognition information based on an audio signal picked up by a microphone. For example, it is conceivable to determine whether or not a specific scene is a specific scene by pattern matching of audio waveforms, such as determining whether or not a characteristic audio waveform is obtained.
  • Scene recognition may also be performed based on detection signals from motion sensors such as acceleration sensors and angular velocity sensors.
  • scene recognition may be image recognition processing using AI (artificial intelligence) technology.
  • If it is determined in step S401 that the scene is not a specific scene requiring silence, the body-side control unit 52 proceeds to step S110. That is, in this case, breathing correction is performed by the zoom lens correction method. On the other hand, if it is determined that the scene is a specific scene requiring silence, the body-side control unit 52 proceeds to step S111. That is, in this case, the breathing correction is performed by the trimming correction method through the processing of steps S111 and S112.
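  • The silence determination of step S401 can be pictured with the following sketch. The scene labels and the idea of a precomputed scene set are assumptions for illustration; the embodiment leaves open how the scene is recognized (image recognition, sensor signals, audio, or AI-based recognition).

```python
# Illustrative sketch of the second modified example (FIG. 21): when the zoom lens
# correction method would normally be used, switch to the trimming correction method
# for specific scenes that require silence. The scene set is an assumed example.

SILENT_SCENES = {"concert", "wild_bird", "wild_animal"}

def select_correction_for_zoom_case(scene_label):
    # Step S401: is this a specific scene requiring silence?
    if scene_label in SILENT_SCENES:
        return "trimming"    # steps S111 and S112: no lens-drive operation noise
    return "zoom_lens"       # step S110: zoom lens movement control


print(select_correction_for_zoom_case("concert"))    # -> trimming
print(select_correction_for_zoom_case("landscape"))  # -> zoom_lens
```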
  • The first modified example and the second modified example described above can also be combined. That is, it is conceivable to perform the breathing correction by the trimming correction method if the scene is determined to be a specific scene requiring silence, even though the trimming magnification exceeds the threshold value Rth and the requirement in terms of image quality is not satisfied. Alternatively, if the trimming magnification exceeds the threshold value Rth in a specific scene that requires silence, the breathing correction may be performed by the zoom lens correction method.
  • As a third modified example, which of the first correction method (the zoom lens correction method in this example) and the second correction method (the trimming correction method in this example) should be used for the breathing correction may be determined based on the scene recognition information (see FIG. 22). In this case, the body-side control unit 52 performs processing for obtaining the target zoom lens position in step S106, and then performs processing for obtaining the trimming magnification in step S111.
  • In the subsequent step S401, it is determined whether or not the scene requires silence. The processing of step S401 in this case is the same as that described with reference to FIG. 21.
  • If it is determined that the scene is a specific scene requiring silence, the body-side control unit 52 advances the process to step S112 (that is, breathing correction is performed by the trimming correction method). On the other hand, if it is determined that the scene is not a specific scene requiring silence, the body-side control unit 52 advances the process to step S110 (that is, breathing correction is performed by the zoom lens correction method).
  • In the first embodiment, the breathing correction method is determined based on the correction time Tla obtained from the correction zoom lens movement amount, that is, the zoom lens movement amount required for the breathing correction by the zoom lens correction method, but the determination is not limited to this.
  • the breathing correction method can also be determined based on the amount of movement of the focus lens 16 during focusing.
  • The amount of movement of the focus lens 16 due to focusing can be obtained based on the current focus lens position and the target focus lens position in the case of AF focusing. Since the amount of movement of the focus lens 16 also correlates with the correction time Tla, if the determination is made based on the amount of movement of the focus lens 16 as described above, an appropriate breathing correction method can be determined based on the amount of movement of the focus lens 16, for example, by performing the correction by the trimming correction method when the amount of movement of the focus lens is large (when the correction time Tla is long).
  • the breathing correction method can be determined based on the trimming magnification (correction trimming magnification) when performing the breathing correction by the trimming correction method.
  • The trimming magnification for correction can be obtained based on information on the focus position after the change due to focusing (the target focus position in the case of AF), the zoom lens position, and the trimming magnification table J4.
  • Since the correction trimming magnification also correlates with the correction time Tla, if the determination is made based on the correction trimming magnification as described above, an appropriate breathing correction method can be determined based on the trimming magnification for correction, for example, by performing the correction by the trimming correction method when the correction trimming magnification is large (when the correction time Tla is long).
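  • Because both the focus lens movement amount and the correction trimming magnification correlate with the correction time Tla, either can stand in for it as the decision variable. The sketch below shows the two alternative determinations side by side; the threshold names are assumptions, not values from the disclosure.

```python
# Illustrative sketches of the two alternative determinations described above.
# Threshold names (focus_move_th, trim_mag_th) are assumed placeholders.

def decide_by_focus_movement(current_focus_pos, target_focus_pos, focus_move_th):
    # A large focus lens movement implies a long correction time Tla -> use trimming.
    if abs(target_focus_pos - current_focus_pos) > focus_move_th:
        return "trimming"
    return "zoom_lens"

def decide_by_trimming_magnification(correction_trimming_magnification, trim_mag_th):
    # A large correction trimming magnification also implies a long Tla -> use trimming.
    if correction_trimming_magnification > trim_mag_th:
        return "trimming"
    return "zoom_lens"
```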
  • In the above description, the lens characteristic information used for breathing correction is stored in the interchangeable lens 1 or 1A, but this is not the only option.
  • the lens characteristic information can be stored in a cloud server, and the imaging device 2 or 2A can be configured to acquire it from the cloud server as needed.
  • FIG. 23 shows a simplified configuration explanatory diagram of a camera system as a fourth modified example.
  • The interchangeable lens 1A' differs from the interchangeable lens 1A in that the lens characteristic information described in the second embodiment (the focus movement locus information J1, the angle of view table J2, the lens group moving speed table J3A, the trimming magnification table J4, the focus group angle-of-view variation rate table J5, and the zoom group angle-of-view variation rate table J6) is not stored in the memory 32.
  • the memory 32 stores lens identification information J7 for identifying the interchangeable lens 1A'.
  • The imaging device 2A' in the drawing differs from the imaging device 2A in that it includes a body-side control section 52A' in place of the body-side control section 52A and also includes a communication section 70.
  • the communication unit 70 communicates with an external device via a network NT such as the Internet.
  • the cloud server 80 stores lens characteristic information for each interchangeable lens 1A' as the lens characteristic information described in the second embodiment.
  • The body-side control unit 52A' acquires the lens identification information J7 from the attached interchangeable lens 1A' and makes an inquiry to the cloud server 80 via the communication unit 70 using the acquired lens identification information J7, thereby acquiring the lens characteristic information corresponding to the mounted interchangeable lens 1A'.
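  • A rough sketch of this inquiry flow of the fourth modified example is shown below. The endpoint URL, the field names, and the HTTP transport are assumptions for illustration only; the disclosure only specifies that the lens identification information J7 is used to query the cloud server 80 via the communication unit 70.

```python
# Illustrative sketch of the fourth modified example (FIG. 23): the body-side control
# unit 52A' queries the cloud server 80 with the lens identification information J7 and
# receives the lens characteristic information for the mounted interchangeable lens 1A'.
# The URL and field names below are assumed placeholders.
import json
import urllib.request

def fetch_lens_characteristics(lens_id_j7, server_url="https://example.com/lens-info"):
    request = urllib.request.Request(
        f"{server_url}?lens_id={lens_id_j7}",
        headers={"Accept": "application/json"})
    with urllib.request.urlopen(request) as response:
        # Expected to contain J1, J2, J3A, J4, J5 and J6 for the identified lens.
        return json.loads(response.read().decode("utf-8"))
```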
  • In the above description, the determination of the breathing correction method is a determination of which of the zoom lens correction method and the trimming correction method should be used for the breathing correction, but the correction methods subject to the determination are not limited to the zoom lens correction method and the trimming correction method. For example, in a case where there are, as zoom lens correction methods, a first zoom lens correction method in which the zoom lens moving speed is slow but the zoom lens position control accuracy (that is, the breathing correction accuracy) is high and a second zoom lens correction method in which the position control accuracy is lower, it is also possible to make the determination targeting these first and second zoom lens correction methods.
  • In that case, for example, it is conceivable to perform the correction by the first zoom lens correction method if the correction time by the first zoom lens correction method is short (if it is equal to or less than a threshold value), and to perform the correction by the second zoom lens correction method if the correction time by the first zoom lens correction method is long (if it is not equal to or less than the threshold value).
  • In the above description, the information processing device that determines the breathing correction method is configured as an imaging device including an imaging unit, but it is not essential that the information processing device include an imaging unit.
  • As described above, the information processing apparatuses (imaging apparatuses 2, 2A, 2A') of the embodiments include a correction method determination unit (correction method determination unit F5) that determines, based on information related to the amount of movement of the focus lens group or scene recognition information, which of the first correction method and the second correction method, which are breathing correction methods, should be used for the breathing correction.
  • With this configuration, which of the first and second correction methods should be used for the breathing correction is determined based on the information related to the amount of movement of the focus lens group or the scene recognition information. If the correction method is determined based on the information related to the amount of movement of the focus lens group, for example when the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, respectively, the time required for breathing correction when the zoom lens group is driven can be estimated from the information related to the amount of movement of the focus lens group, and the breathing correction by trimming can be performed when that time is long, so that breathing correction is performed while minimizing the use of trimming that causes image quality deterioration.
  • If the correction method is determined based on the scene recognition information, for example when the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, it is possible, in a specific scene requiring silence, to perform breathing correction by trimming without performing breathing correction by driving the zoom lens, which produces operation noise associated with lens movement. Therefore, according to the above configuration, it is possible to appropriately determine the breathing correction method to be executed according to the situation of the imaging apparatus.
  • Further, in the embodiments, one of the first and second correction methods is the zoom lens correction method in which breathing correction is performed by driving the zoom lens group, and the other is the trimming correction method in which breathing correction is performed by trimming the captured image.
  • As a result, it is possible to determine an appropriate correction method according to the situation of the imaging apparatus between the zoom lens correction method, which is advantageous in terms of image quality, and the trimming correction method, which causes image quality deterioration but is advantageous in terms of correction speed.
  • Further, in the embodiments, the correction method determination unit acquires the zoom lens movement amount required for breathing correction by the zoom lens correction method based on the information related to the movement amount of the focus lens group, and determines, based on the zoom lens movement amount, whether the breathing correction should be performed by the zoom lens correction method or by the trimming correction method.
  • The zoom lens movement amount required for breathing correction by the zoom lens correction method correlates with the correction time when the breathing correction is performed by the zoom lens correction method. Therefore, according to the above configuration, it is possible, for example, to perform the correction by the trimming correction method when the zoom lens movement amount is large (when the correction time by the zoom lens correction method is long), and an appropriate correction method can be selected based on the zoom lens movement amount.
  • Further, in the embodiments, the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method when the zoom lens movement amount is equal to or less than a predetermined threshold value, and so that breathing correction is performed by the trimming correction method when the zoom lens movement amount exceeds the threshold value.
  • Further, in the embodiments, the correction method determination unit acquires the correction time for the case where breathing correction is performed by the zoom lens correction method based on the zoom lens movement amount, and determines, based on the correction time, whether the breathing correction should be performed by the zoom lens correction method or by the trimming correction method (see FIG. 12). As a result, for example, the breathing correction by the trimming correction method can be performed when the correction time of the zoom lens correction method is long, and an appropriate correction method can be selected based on the correction time.
  • Further, in the embodiments, the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method when the correction time is equal to or less than a predetermined threshold value (Tth), and so that breathing correction is performed by the trimming correction method when the correction time exceeds the threshold value (see FIG. 12).
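  • Expressed as a single decision, the control described here (FIG. 12) can be sketched as follows; Tth is the threshold named in the disclosure, while the function name is an assumption.

```python
# Illustrative sketch of the correction-time-based determination (FIG. 12):
# use the zoom lens correction method only when its correction time Tla fits within Tth.

def decide_by_correction_time(correction_time_tla, t_th):
    return "zoom_lens" if correction_time_tla <= t_th else "trimming"
```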
  • Further, in the embodiments, the correction method determination unit acquires the focus lens movement amount, which is the movement amount of the focus lens group, based on the information related to the movement amount of the focus lens group, and determines, based on the focus lens movement amount, whether the breathing correction should be performed by the zoom lens correction method or by the trimming correction method.
  • The focus lens movement amount correlates with the correction time when breathing correction is performed by the zoom lens correction method. Therefore, according to the above configuration, it is possible, for example, to perform the correction by the trimming correction method when the focus lens movement amount is large (when the correction time of the zoom lens correction method is long), and an appropriate correction method can be determined based on the focus lens movement amount.
  • Further, in the embodiments, the correction method determination unit acquires the trimming magnification for performing breathing correction by the trimming correction method based on the information related to the amount of movement of the focus lens group and trimming magnification characteristic information (the trimming magnification table J4), and determines, based on the trimming magnification, whether the breathing correction should be performed by the zoom lens correction method or by the trimming correction method.
  • The trimming magnification in the case of performing breathing correction by the trimming correction method also correlates with the correction time in the case of performing breathing correction by the zoom lens correction method. Therefore, according to the above configuration, it is possible, for example, to perform the correction by the trimming correction method when the trimming magnification for breathing correction is large (when the correction time by the zoom lens correction method is long), and an appropriate correction method can be determined based on the trimming magnification for correction.
  • Further, in the embodiments, even when the correction execution condition by the trimming correction method based on the information related to the movement amount of the focus lens group is satisfied, the correction method determination unit performs control so that the breathing correction is performed by the zoom lens correction method when the trimming magnification required for the breathing correction by the trimming correction method exceeds a threshold value (Rth) (see FIG. 20).
  • As a result, the zoom lens correction method is used when it is predicted that the image quality would deteriorate due to trimming. Therefore, while minimizing the use of trimming that causes image quality deterioration, it is possible to keep image quality deterioration within an allowable range when correction by trimming is performed.
  • Further, in the embodiments, even when the correction execution condition by the zoom lens correction method based on the information related to the movement amount of the focus lens group is satisfied, the correction method determination unit performs control so that the breathing correction is performed by the trimming correction method when the scene is determined to be a specific scene based on the scene recognition information (see FIG. 21).
  • As a result, in a specific scene that requires silence, for example, the breathing correction can be performed by the trimming correction method, which does not generate operation noise associated with lens movement. Therefore, it is possible to appropriately switch the breathing correction method according to the situation of the imaging apparatus.
  • Further, in the embodiments, a correction control unit (the first correction control unit F3A) is provided that controls the moving speed of at least one of the zoom lens group and the focus lens group based on the moving speed of the focus lens group and the moving speed of the zoom lens group during breathing correction by the zoom lens correction method (see FIG. 19, etc.). This makes it possible to suppress the change in angle of view caused by the difference in moving speed between the zoom lens group and the focus lens group during the breathing correction by the zoom lens correction method.
  • Further, in the embodiments, the correction method determination unit determines, based on the scene recognition information, which of the first correction method and the second correction method should be used for the breathing correction (see FIG. 22).
  • As a result, for example, when the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, respectively, it is possible, in a specific scene requiring silence, to perform breathing correction by trimming without performing breathing correction by driving the zoom lens, which produces operation noise associated with lens movement. Therefore, it is possible to appropriately determine the breathing correction method according to the situation of the imaging apparatus.
  • Further, in the embodiments, one of the first and second correction methods is the zoom lens correction method in which breathing correction is performed by driving the zoom lens group, and the other is the trimming correction method in which breathing correction is performed by trimming the captured image, and the correction method determination unit performs control so that the breathing correction is performed by the trimming correction method when the scene is determined to be a specific scene based on the scene recognition information, and so that the breathing correction is performed by the zoom lens correction method when the scene is determined not to be the specific scene (see FIG. 22).
  • the scene recognition information is information based on at least one of image recognition processing, a detection signal from an illuminance sensor, a sound pickup signal from a microphone, and a detection signal from a motion sensor.
  • the image recognition processing, the detection signal from the illuminance sensor, the picked-up sound signal from the microphone, and the detection signal from the motion sensor are each information that enables recognition of the captured scene. Therefore, it is possible to appropriately determine whether or not the scene is a specific scene, and to appropriately determine the breathing correction method according to the situation of the imaging apparatus.
  • processing is performed to acquire lens characteristic information used for breathing correction from the lens device. Accordingly, in order to realize appropriate breathing correction according to lens characteristics, it is sufficient to provide at least communication means with the lens device as communication means. Therefore, it is possible to reduce the number of device parts and the cost.
  • processing is performed to acquire lens characteristic information used for breathing correction from the cloud server (see FIG. 23). This eliminates the need to store the lens characteristic information in the lens device in order to realize appropriate breathing correction according to the lens characteristic. Therefore, it is possible to reduce the memory capacity of the lens device.
  • Further, in the embodiments, the information processing apparatus is configured as an imaging apparatus including an imaging unit. That is, an information processing apparatus configured as an imaging apparatus determines, based on information related to the amount of movement of the focus lens group or scene recognition information, which of the first and second correction methods should be used for the breathing correction. Therefore, it is possible to realize an imaging apparatus that can appropriately determine the breathing correction method according to the situation of the apparatus itself.
  • The information processing method of the embodiment is an information processing method in which an information processing apparatus determines, based on information related to the amount of movement of the focus lens group or scene recognition information, which of the first correction method and the second correction method, which are breathing correction methods, should be used for the breathing correction. According to such an information processing method, it is possible to obtain the same actions and effects as those of the information processing apparatus of the above-described embodiments.
  • The program of the embodiment is a program readable by a computer device, the program causing the computer device to realize a function of determining, based on information related to the amount of movement of the focus lens group or scene recognition information, which of the first correction method and the second correction method, which are breathing correction methods, should be used for the breathing correction. With such a program, the function of the above-described correction method determination unit F5 can be realized in a device such as the imaging device 2.
  • the program as described above can be recorded in advance in a HDD as a recording medium built in equipment such as a computer device, or in a ROM or the like in a microcomputer having a CPU.
  • Alternatively, the program can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disc, a semiconductor memory, or a memory card.
  • Such removable recording media can be provided as so-called package software.
  • In addition to being installed from a removable recording medium into a personal computer or the like, such a program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
  • Such a program is suitable for wide provision of the correction method determination unit F5 of the embodiment.
  • For example, by downloading the program to a personal computer, a portable information processing device, a mobile phone, a game device, a video device, a PDA (Personal Digital Assistant), or the like, the personal computer or the like can be made to function as a device that realizes the processing of the correction method determination unit F5 of the present disclosure.
  • An information processing device comprising a correction method determination unit that determines, based on information related to the amount of movement of a focus lens group or scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for the breathing correction.
  • One of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving a zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming a captured image.
  • The information processing apparatus wherein the correction method determination unit acquires a zoom lens movement amount required for breathing correction by the zoom lens correction method based on information related to the movement amount of the focus lens group, and a determination is made, based on the zoom lens movement amount, as to which of the zoom lens correction method and the trimming correction method should be used for the breathing correction.
  • The information processing apparatus according to (3) above, wherein the correction method determination unit performs control such that breathing correction is performed by the zoom lens correction method when the zoom lens movement amount is equal to or less than a predetermined threshold, and such that breathing correction is performed by the trimming correction method when the zoom lens movement amount exceeds the threshold.
  • The information processing apparatus wherein the correction method determination unit acquires a correction time for the case where breathing correction is performed by the zoom lens correction method based on the zoom lens movement amount, and a determination is made, based on the correction time, as to which of the zoom lens correction method and the trimming correction method should be used for the breathing correction.
  • The information processing apparatus according to (5) above, wherein the correction method determination unit performs control such that breathing correction is performed by the zoom lens correction method when the correction time is equal to or less than a predetermined threshold, and such that breathing correction is performed by the trimming correction method when the correction time exceeds the threshold.
  • The information processing apparatus wherein the correction method determination unit acquires a focus lens movement amount, which is the movement amount of the focus lens group, based on information related to the movement amount of the focus lens group, and a determination is made, based on the focus lens movement amount, as to which of the zoom lens correction method and the trimming correction method should be used for the breathing correction.
  • The information processing apparatus according to (2) above, wherein the correction method determination unit acquires a trimming magnification for breathing correction by the trimming correction method based on information related to the amount of movement of the focus lens group and trimming magnification characteristic information indicating a trimming magnification characteristic for performing breathing correction by the trimming correction method, and determines, based on the trimming magnification, whether breathing correction should be performed by the zoom lens correction method or by the trimming correction method.
  • The information processing apparatus according to any one of (2) to (7) above, wherein, even when a correction execution condition by the trimming correction method based on information related to the amount of movement of the focus lens group is satisfied, the correction method determination unit performs control such that the breathing correction is performed by the zoom lens correction method when a trimming magnification required for breathing correction by the trimming correction method exceeds a threshold value.
  • The information processing apparatus according to any one of (2) to (7) and (9) above, wherein, even when a correction execution condition by the zoom lens correction method based on the information regarding the amount of movement of the focus lens group is satisfied, the correction method determination unit performs control such that the breathing correction is performed by the trimming correction method when the scene is determined to be a specific scene based on the scene recognition information.
  • The information processing apparatus according to any one of (2) to (9) above, comprising a correction control unit that controls the movement speed of at least one of the zoom lens group and the focus lens group based on the movement speed of the focus lens group and the movement speed of the zoom lens group during breathing correction by the zoom lens correction method.
  • The information processing apparatus wherein the correction method determination unit determines, based on the scene recognition information, which of the first correction method and the second correction method should be used for the breathing correction.
  • The information processing apparatus according to (12) above, wherein one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving a zoom lens group and the other is a trimming correction method in which breathing correction is performed by trimming a captured image, and the correction method determination unit performs control such that breathing correction is performed by the trimming correction method when the scene is determined to be a specific scene based on the scene recognition information, and such that breathing correction is performed by the zoom lens correction method when the scene is determined not to be the specific scene.
  • The information processing apparatus wherein the scene recognition information is information based on at least one of an image recognition process, a detection signal from an illuminance sensor, a picked-up sound signal from a microphone, and a detection signal from a motion sensor.
  • a process of acquiring lens characteristic information used for the breathing correction from the lens device is performed.
  • a process of acquiring lens characteristic information used for the breathing correction from a cloud server is performed.
  • the information processing device configured as an imaging device including an imaging unit.
  • An information processing method in which an information processing device determines, based on information related to the amount of movement of a focus lens group or scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for the breathing correction.
  • A program readable by a computer device, the program causing the computer device to realize a function of determining, based on information related to the amount of movement of the focus lens group or scene recognition information, which of the first correction method and the second correction method, which are breathing correction methods, should be used for the breathing correction.

Abstract

The present invention appropriately determines a method for breathing compensation to be performed according to the situation of an imaging device. An information processing device according to the present technology is provided with a compensation method determination unit that, on the basis of information related to the amount of movement of a focus lens group or scene recognition information, determines whether to perform breathing compensation by either of a first compensation method and a second compensation method, which are methods for the breathing compensation.

Description

Information processing device, information processing method, and program
The present technology relates to an information processing device, its method, and a program, and particularly relates to processing technology for breathing correction, which is correction of a change in angle of view that accompanies focusing.
In a camera system configured to enable focusing, it is known that a phenomenon occurs in which the angle of view changes with focusing (so-called breathing).
Patent Literature 1 listed below discloses a technique for correcting breathing. Specifically, in Patent Document 1, in the case where breathing correction by driving the zoom lens group and breathing correction by trimming the captured image are possible, when the focus lens group is driven in the direction of increasing the shooting magnification, zooming is performed. A technology is disclosed in which breathing correction is performed by lens driving, and breathing correction is performed by trimming when the focus lens group is driven in a direction in which the photographing magnification is reduced.
JP 2012-168371 A
However, since correction by zoom lens driving takes longer than correction by trimming, when the correction method is switched according to the direction of change in the photographing magnification as in Patent Document 1, the correction by zoom lens driving may not catch up before the method is switched to correction by trimming in a scene where the direction of change in the photographing magnification switches, and there is a possibility that an abrupt change in angle of view occurs and the correction gives a sense of incongruity.
In addition, with the technique of Patent Document 1, correction by trimming is always performed with a probability of 1/2, and there is concern about deterioration in image quality.
This technology has been developed in view of the above circumstances, and aims to appropriately determine the method of breathing correction that should be performed according to the situation of the imaging device.
The information processing apparatus according to the present technology includes a correction method determination unit that determines, based on information related to the amount of movement of a focus lens group or scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for the breathing correction.
In this specification, breathing refers to a phenomenon in which the angle of view changes with focusing, and breathing correction refers to correction of such a change in angle of view that accompanies focusing.
According to the above configuration, which one of the first and second correction methods should be used for the breathing correction is determined based on the information regarding the amount of movement of the focus lens or the scene recognition information. If the correction method is determined based on information related to the amount of movement of the focus lens group, for example, if the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, respectively. In the case of adopting a correction method by driving the zoom lens group, it is possible to estimate the time required for breathing correction from information related to the amount of movement of the focus lens group, and if the time required for correction is long, breathing correction by trimming It is possible to perform the procedure. This makes it possible to perform breathing correction while minimizing the use of trimming that causes image quality deterioration.
Further, if the correction method is determined based on the scene recognition information, for example, when the first and second correction methods are the breathing correction method by driving the zoom lens group and the breathing correction method by trimming the captured image, the quietness In a specific scene, it is possible to perform breathing correction by trimming without performing breathing correction by driving the zoom lens, which produces an operation sound associated with lens movement.
Further, an information processing method according to the present technology is an information processing method in which an information processing apparatus determines, based on information related to the amount of movement of a focus lens group or scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for the breathing correction.
Further, a program according to the present technology is a program readable by a computer device, and is based on information relating to the amount of movement of the focus lens group or scene recognition information, and is a method of correcting breathing, namely a first correction method and a second correction method. It is a program that causes the computer device to realize a function of determining which of the correction methods should be used for the breathing correction.
These information processing methods and programs make it possible to realize the information processing apparatus as the present technology described above.
FIG. 1 is a diagram showing a configuration example of a camera system as an embodiment according to the present technology.
FIG. 2 is a block diagram showing an internal configuration example of an interchangeable lens and an imaging device (information processing device) as a first embodiment.
FIG. 3 is an explanatory diagram of terms related to AF control.
FIG. 4 is an explanatory diagram of an aspect of a change in angle of view as breathing and an outline of breathing correction.
FIG. 5 is a functional block diagram showing functions related to focus-related processing of the imaging device as the first embodiment.
FIG. 6 is a diagram showing an example of focus movement trajectory information.
FIG. 7 is a diagram showing an example of an angle-of-view table.
FIG. 8 is a diagram showing an example of a lens group moving speed table.
FIG. 9 is a diagram showing an example of a trimming magnification table.
FIG. 10 is an explanatory diagram of an example of a technique for acquiring position coordinates of a plurality of zoom lens groups.
FIG. 11 is an explanatory diagram of an example of a method of acquiring a target zoom lens position in the embodiment.
FIG. 12 is a flowchart showing a specific processing procedure example for implementing focus-related processing as the first embodiment.
FIG. 13 is a block diagram showing an internal configuration example of a lens device and an imaging device that constitute a camera system as a second embodiment.
FIG. 14 is a functional block diagram showing functions according to the second embodiment of the imaging device of the second embodiment.
FIG. 15 is an explanatory diagram of lens positions when breathing correction is performed by a zoom lens correction method.
FIG. 16 is a diagram showing an example of a lens group moving speed table in the second embodiment.
FIG. 17 is a diagram showing an example of a focus group angle-of-view variation rate table.
FIG. 18 is a diagram showing an example of a zoom group angle-of-view variation rate table.
FIG. 19 is a flowchart showing a specific processing procedure example for realizing control as the second embodiment.
FIG. 20 is a flowchart of processing as a first modified example.
FIG. 21 is a flowchart of processing as a second modified example.
FIG. 22 is a flowchart of processing as a third modified example.
FIG. 23 is a simplified configuration explanatory diagram of a camera system as a fourth modified example.
Hereinafter, embodiments will be described in the following order.
<1. First Embodiment>
(1-1. Equipment configuration example)
(1-2. Focus-related processing as an embodiment)
(1-3. Processing procedure)
<2. Second Embodiment>
<3. Variation>
<4. Summary of Embodiments>
<5. This technology>
<1. First Embodiment>
(1-1. Equipment configuration example)
FIG. 1 is a diagram showing a configuration example of a camera system as an embodiment according to the present technology.
The camera system includes an interchangeable lens 1 and an imaging device (body) 2 that is an embodiment of an information processing device according to the present technology.
The interchangeable lens 1 is a lens unit that can be freely attached to and detached from the imaging device 2 .
Inside the interchangeable lens 1, there are various lenses such as a focus lens and a zoom lens, as well as a drive section that drives these lenses and a control section that outputs drive signals to the drive section. The interchangeable lens 1 further has a mount section and the like provided with a connection function and a communication function for the imaging device 2. A specific configuration example of the interchangeable lens 1 will be described later with reference to FIG. 2.
The imaging device 2 is configured as a digital camera device in which the interchangeable lens 1 is detachably attached. In this example, the imaging device 2 has not only a still image imaging function but also a moving image imaging function.
The imaging device 2 includes an imaging element 55 that captures a subject image incident through the interchangeable lens 1, a display unit 61 capable of displaying images captured by the imaging element 55 and GUIs (Graphical User Interfaces) such as various operation screens, an operation unit 65 for the user to perform various operation inputs, and the like.
As will also be described below with reference to FIG. 2, the imaging device 2 includes, in addition to the configuration shown in FIG. 1, a configuration for recording images captured by the imaging element 55, a configuration for performing image signal processing on the captured images, a configuration for communicating with the interchangeable lens 1, and the like.
FIG. 2 is a block diagram showing an internal configuration example of the interchangeable lens 1 and the imaging device 2.
The interchangeable lens 1 includes a mount section 11 detachably attached to a mount section 51 of the imaging device 2 . The mount section 11 has a plurality of terminals for electrical connection with the imaging device 2 .
The interchangeable lens 1 also includes a lens-side control section 12 , a zoom lens 13 , an image stabilization lens 14 , an aperture 15 , a focus lens 16 , an operation section 31 , a memory 32 and a power supply control section 33 .
Further, the interchangeable lens 1 includes a zoom lens driving section 21 , a camera shake control section 22 , an aperture control section 23 , a focus lens driving section 24 and a detection section 17 .
The lens-side control unit 12 includes, for example, a microcomputer having a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. The overall control of the interchangeable lens 1 is performed by reading the program stored in the storage device into the RAM and executing it.
For example, the lens-side control unit 12 controls the position of the zoom lens 13 based on an instruction from the imaging device 2 supplied via a predetermined communication terminal of the mount section 11 or a user's operation received by the operation unit 31. Specifically, the lens-side control unit 12 acquires the current position of the zoom lens 13 from the detection unit 17, which has, for example, a magnetic sensor (MR sensor) or the like, determines the drive direction and drive amount for moving the zoom lens 13 to a predetermined position based on the acquired result, and outputs the determined drive direction and drive amount to the zoom lens driving section 21 together with a movement command. The zoom lens driving section 21 moves the zoom lens 13 in the optical axis direction based on the movement command supplied from the lens-side control unit 12 so as to achieve the instructed drive direction and drive amount.
Here, the detection unit 17 comprehensively represents the configuration for detecting the state of the interchangeable lens 1, such as the positions of the zoom lens 13, the image stabilization lens 14, and the focus lens 16, and the aperture diameter of the diaphragm 15. In the detection unit 17, detection of the lens position can be performed by, for example, a magnetic sensor, a photodiode array, a potentiometer, a reflective encoder, or the like.
Further, the detection unit 17 may be configured to include a temperature sensor for detecting temperature.
The lens-side control unit 12 controls the camera shake correction lens 14 so as to correct camera shake. Specifically, the lens-side control unit 12 determines the driving direction and driving amount of the camera shake correction lens 14 in the direction of canceling the camera shake amount based on the camera shake amount detected by the camera shake detection sensor provided in the detection unit 17. , the determined drive direction and drive amount are output to the camera shake control unit 22 together with a movement command. The shake detection sensor in the detection unit 17 is configured by, for example, both or either of a gyro sensor and a triaxial acceleration sensor. The gyro sensor is used to detect a deviation (shake) in the direction corresponding to pitch or yaw as the correction direction of the camera shake correction lens 14. It is used to detect a deviation (blur) in the directions of the X-axis and the Y-axis when they are taken as axes. Based on the movement command supplied from the lens-side control unit 12, the camera shake control unit 22 moves the camera shake correction lens 14 so as to achieve the instructed drive direction and drive amount.
Further, the lens-side control unit 12 performs control to mechanically lock the image stabilization lens 14 when the power supply is turned off. That is, in a state in which power is supplied from the imaging device 2 to the interchangeable lens 1, the camera shake correction lens 14 is kept at a predetermined position by control via the camera shake control unit 22. is turned off, the position control by the camera shake control unit 22 is stopped, and the camera shake correction lens 14 drops by a predetermined amount in the direction of gravity. The lens-side control unit 12 mechanically locks the camera-shake correction lens 14 via the camera-shake control unit 22 in accordance with the timing at which the power supply is turned off to prevent it from falling. The shake control section 22 mechanically locks the shake correction lens 14 based on the fixing command supplied from the lens side control section 12 .
The lens-side control unit 12 also controls (the aperture diameter of) the diaphragm 15 in accordance with an instruction or the like from the imaging device 2 supplied via a predetermined communication terminal of the mount unit 11. Specifically, the lens-side control unit 12 acquires the aperture diameter of the diaphragm 15 detected by a diaphragm detection sensor in the detection unit 17, issues a command to the diaphragm control unit 23 so that the F-number instructed by the imaging device 2 is obtained, and causes the diaphragm 15 to be driven. The diaphragm control unit 23 drives the diaphragm 15 so that the aperture diameter instructed by the lens-side control unit 12 is obtained.
Further, the lens-side control unit 12 controls the position of the focus lens 16 based on an instruction from the imaging device 2 supplied via a predetermined communication terminal of the mount unit 11.
Here, in this example, in AF (Auto Focus) control, information on a target focus lens position is instructed from the imaging device 2 to the lens-side control unit 12. The lens-side control unit 12 acquires the current position of the focus lens 16 from the detection unit 17, determines a drive direction and a drive amount for moving the focus lens 16 to the target position based on the acquired current position information and the target focus lens position information instructed by the imaging device 2, and outputs the determined drive direction and drive amount to the focus lens drive unit 24 together with a movement command. The focus lens drive unit 24 moves the focus lens 16 in the optical axis direction in accordance with the instructed drive direction and drive amount.
Here, the focus lens 16 is configured as a "focus lens group" including one or more optical elements. When the focus lens group includes a plurality of optical elements, those optical elements are displaced together during focusing.
The same applies to the zoom lens 13. That is, the zoom lens 13 is configured as a "zoom lens group" including one or more optical elements, and when the zoom lens group includes a plurality of optical elements, those optical elements are displaced together during zoom adjustment.
In this example, the zoom lens 13 and the focus lens 16 are each configured by one zoom lens group and one focus lens group, respectively, but a configuration including a plurality of zoom lens groups and a plurality of focus lens groups is also possible.
The lens-side control unit 12 also performs processing for transmitting the position of the zoom lens 13 detected by the detection unit 17 (hereinafter referred to as the "zoom lens position") and the position of the focus lens (hereinafter referred to as the "focus lens position") to the imaging device 2 (body-side control unit 52).
The zoom lens position and the focus lens position are each used in AF processing performed on the imaging device 2 side. In this example, the AF processing is performed for each frame of the captured image. Therefore, in the interchangeable lens 1 of this example, the detection unit 17 detects the zoom lens position and the focus lens position for each frame, and the lens-side control unit 12 sequentially transmits the zoom lens position and focus lens position information detected for each frame in this way to the imaging device 2 (body-side control unit 52).
The focus lens drive unit 24 can be configured to have, for example, an ultrasonic motor, a DC motor, a linear actuator, a stepping motor, a piezo element (piezoelectric element), or the like as a lens drive source.
Note that focusing can also be configured to be performed in accordance with a user operation received by the operation unit 31.
The memory 32 is composed of a non-volatile memory such as an EEPROM (Electrically Erasable Programmable Read Only Memory), and can be used to store the operation program of the lens-side control unit 12 and various data.
In this example, the memory 32 stores focus movement trajectory information J1, an angle-of-view table J2, a lens group movement speed table J3, and a trimming magnification table J4, which will be described later.
The power supply control unit 33 detects the amount of power supplied from the imaging device 2 and, based on the detected amount of power, supplies power to each unit in the interchangeable lens 1 (the lens-side control unit 12 and the various drive units) while optimally distributing the amount of power.
The imaging device 2 on the body side is provided with a mount unit 51 to which the interchangeable lens 1 is detachably attached. The mount unit 51 has a plurality of terminals for electrical connection with the mount unit 11 of the interchangeable lens 1.
When the interchangeable lens 1 is attached to the mount unit 51 of the imaging device 2, corresponding terminals are electrically and physically connected between the mount unit 51 and the mount unit 11 of the interchangeable lens 1. The connected terminals include, for example, a terminal for supplying power (power supply terminal), a terminal for transmitting commands and data (communication terminal), and a terminal for transmitting a synchronization signal (synchronization signal terminal).
The imaging device 2 further includes a body-side control unit 52, a shutter 53, a shutter control unit 54, an imaging element 55, an ADC (Analog to Digital Converter) 56, a frame memory 57, an image signal processing unit 58, a recording unit 59, a recording medium 60, a display unit 61, a memory 62, a power supply control unit 63, a power supply unit 64, and an operation unit 65.
The power supply control unit 63 supplies the power supplied from the power supply unit 64 to each unit of the imaging device 2, including the body-side control unit 52. The power supply control unit 63 also calculates the amount of power that can be supplied to the interchangeable lens 1 based on the operating state of the imaging device 2, and supplies power to the interchangeable lens 1 via the mount unit 51.
The power supply unit 64 includes, for example, a secondary battery such as a NiCd battery, a NiMH battery, or a Li battery. The power supply unit 64 may also be configured to be able to receive power from a commercial AC power supply via an AC adapter or the like.
The body-side control unit 52 includes a microcomputer having a CPU, a ROM, a RAM, and the like, and performs overall control of the imaging device 2 and the camera system by having the CPU read a program stored in a predetermined storage device such as the ROM or the memory 62 into the RAM and execute it.
The memory 62 is composed of a non-volatile memory such as an EEPROM, and can be used to store the operation program of the body-side control unit 52 and various data.
The body-side control unit 52 causes the imaging element 55 to execute imaging processing based on an operation signal representing a user operation supplied from the operation unit 65. Furthermore, it transmits predetermined commands to the interchangeable lens 1 side via the mount unit 51 to drive the focus lens 16, the zoom lens 13, and the like.
The body-side control unit 52 can also acquire, for example, information indicating the lens position of the focus lens 16, information indicating the lens position of the zoom lens 13, and the like from the detection unit 17 of the interchangeable lens 1.
The shutter 53 is arranged in front of the imaging element 55 (on the subject side) and opens and closes under the control of the shutter control unit 54. When the shutter 53 is closed, light from the subject that has passed through the optical system of the interchangeable lens 1 is blocked. The shutter control unit 54 detects the open/closed state of the shutter 53 and supplies information indicating the detection result to the body-side control unit 52. The shutter control unit 54 drives the shutter 53 to the open or closed state based on the control of the body-side control unit 52.
The imaging element 55 is configured as an image sensor such as a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor, and images a subject to generate and output captured image data.
When the imaging element 55 is composed of a CCD sensor or a CMOS sensor, an electronic shutter can be used, so the shutter 53 can be omitted. When the shutter 53 is omitted, the shutter control unit 54 used for its control is also omitted.
In this example, the imaging element 55 has pixels for image capture (RGB pixels) and pixels for acquiring detection information used in AF (Auto Focus) processing by the image-plane phase difference method, that is, phase difference detection pixels for acquiring phase difference information between a pair of images (phase difference information between a pair of images formed by pupil division).
In the imaging element 55, the phase difference detection pixels are discretely arranged on a pixel array plane in which the RGB pixels are two-dimensionally arranged according to a predetermined array pattern such as a Bayer array.
In the imaging element 55, light reception signals obtained by photoelectric conversion of the RGB pixels are converted into digital signals by the ADC 56, temporarily held in the frame memory 57, and then input to the image signal processing unit 58.
In FIG. 2, the captured image signal obtained by digitally converting the light reception signals of the RGB pixels as described above is denoted as the "captured image signal Si".
On the other hand, in the imaging element 55, the light reception signals obtained by photoelectric conversion of the phase difference detection pixels are converted into digital signals by the ADC 56 and supplied to the body-side control unit 52.
In FIG. 2, the signal obtained by digitally converting the light reception signals of the phase difference detection pixels in this way is denoted as the "phase difference pixel signal Sp".
Based on the phase difference pixel signal Sp supplied via the ADC 56, the body-side control unit 52 analyzes the phase difference between the pair of images and calculates the amount of focus deviation with respect to the subject to be focused on (the focusing target), that is, the defocus amount DF.
The body-side control unit 52 performs AF control based on the defocus amount DF calculated in this way, which will be described later.
The body-side control unit 52 also performs processing related to breathing correction.
Breathing here means a phenomenon in which the angle of view changes with focusing, and breathing correction means correction of such a change in angle of view that accompanies focusing.
In this example, breathing correction is performed by driving the zoom lens 13 or by trimming (electronic cropping) of the captured image. Specifically, in the camera system of this example, breathing correction can be performed by switching between correction by driving the zoom lens 13 and correction by trimming the captured image.
Hereinafter, as breathing correction methods, the correction method by driving the zoom lens 13 is referred to as the "zoom lens correction method", and the correction method by trimming the captured image is referred to as the "trimming correction method".
The details of the processing related to breathing correction performed by the body-side control unit 52 will be described later.
The image signal processing unit 58 performs predetermined image signal processing on the captured image input via the frame memory 57. Examples of the image signal processing here include demosaic processing, white balance (WB) adjustment, and gamma correction processing.
The image signal processing unit 58 performs image signal processing on the captured image input as a RAW image via the frame memory 57, converts it into image data in a predetermined file format, and records it on the recording medium 60 via the recording unit 59.
The image signal processing unit 58 also converts the captured image after the image signal processing into an image signal in accordance with a predetermined display format, supplies it to the display unit 61, and causes the captured image to be displayed.
In particular, the image signal processing unit 58 in this embodiment is capable of performing trimming processing on the captured image. The image signal processing unit 58 performs trimming processing on the captured image based on an instruction from the body-side control unit 52.
The recording medium 60 is composed of a non-volatile memory, and the recording unit 59 is configured to be able to write data to the recording medium 60 and read data recorded on the recording medium 60. The recording medium 60 may be detachable from the imaging device 2.
The display unit 61 is composed of a panel-type display device such as a liquid crystal panel or an organic EL panel, and is capable of displaying images.
The display unit 61 is mounted on the rear surface of the imaging device 2, opposite to the front surface on which the mount unit 51 is arranged, and can display so-called through images, images read from the recording medium 60, GUIs serving as various operation screens, and the like.
The operation unit 65 comprehensively represents the operators with which the user performs operation input to the imaging device 2, such as various hardware keys including a shutter button, a mode dial, and a zoom button, and a touch panel provided so as to be able to detect touch operations on the display screen of the display unit 61.
The operation unit 65 receives a user operation and supplies an operation signal corresponding to the operation to the body-side control unit 52.
In the following description, AF control will be discussed. In this specification, the terms "subject position", "subject distance", "focus position (in-focus position)", "focusing distance", "focus lens position", and "zoom lens position" are used as terms relating to AF control.
The definitions of these terms are explained with reference to FIG. 3.
First, the "subject position" literally represents the position where the subject exists, and the "subject distance" represents the distance from the imaging device 2 to the subject.
The "focus position" represents the position where focus is achieved, and can be rephrased as the "in-focus position". The "focusing distance" means the distance from the imaging device 2 to the focus position.
Here, as will be understood with reference to FIG. 3, the subject distance and the focusing distance are distances to positions outside the interchangeable lens 1, and take values expressed as actual distances such as 2 m, 3 m, 4 m, and so on.
The "focus lens position" means the position of the focus lens 16 within the movable range of the focus lens 16 in the interchangeable lens 1 as illustrated in the figure, and the "zoom lens position" similarly means the position of the zoom lens 13 within the movable range of the zoom lens 13 in the interchangeable lens 1.
Here, assuming that the "subject position" in FIG. 3 is the position of the focusing target, the defocus amount DF obtained by the image-plane phase difference method represents the amount of deviation between the "subject position" and the "focus position". In other words, the defocus amount DF in this case does not directly represent the error amount of the focus lens position.
The basic flow of AF control assumed in this example is that the body-side control unit 52 obtains, based on the defocus amount DF, the target position of the focus lens 16 required to focus on the focusing target (hereinafter referred to as the "target focus lens position"), and instructs the interchangeable lens 1 side with the target focus lens position information.
Further, with reference to FIG. 4, the manner of the angle-of-view change as breathing and an overview of breathing correction are described.
In FIG. 4, what is shown as "before correction" in the upper part is an example of the change in angle of view with respect to a change in focus position from infinity to the closest distance. As illustrated, the size of the image in the captured image (the letter A in the illustrated example) is largest at infinity, smallest at the closest distance, and at a focus position intermediate between infinity and the closest distance the image size is smaller than at infinity and larger than at the closest distance. As can be understood from this, the angle-of-view change as breathing occurs in such a manner that the angle of view is narrowest at infinity and gradually widens as the focus position changes toward the closest-distance side.
For this reason, as indicated by "after correction" in the lower part of the figure, breathing correction is performed so that the angle of view gradually narrows from the infinity side toward the closest-distance side, and conversely, so that the angle of view gradually widens from the closest-distance side toward the infinity side.
Breathing correction by trimming is performed by setting the trimming magnification at infinity to "1.0" (that is, no trimming) and gradually increasing the trimming magnification as the focus position changes toward the closest-distance side.
By performing breathing correction, the angle of view of the captured image can be kept from changing even if the focus position changes (that is, even if focusing is performed).
(1-2. Focus-related processing as an embodiment)
FIG. 5 is a functional block diagram showing the functions related to the focus-related processing as an embodiment, which the body-side control unit 52 of the imaging device 2 has in the first embodiment.
As illustrated, the body-side control unit 52 has functions as an information acquisition processing unit F1, an AF processing unit F2, a first correction control unit F3, a second correction control unit F4, and a correction method determination unit F5.
The information acquisition processing unit F1 performs acquisition processing of lens characteristic information used for breathing correction. Specifically, the information acquisition processing unit F1 in this example performs processing of acquiring from the interchangeable lens 1 the focus movement trajectory information J1, the angle-of-view table J2, the lens group movement speed table J3, and the trimming magnification table J4 stored in the interchangeable lens 1 as the lens characteristic information.
The information acquisition processing unit F1 requests the lens-side control unit 12 to transmit the lens characteristic information, and acquires the lens characteristic information that the lens-side control unit 12 transmits to the imaging device 2 in response to the request.
FIG. 6 is a diagram showing an example of the focus movement trajectory information J1.
The focus movement trajectory information J1 is information indicating the relationship among the zoom lens position, the focus lens position, and the focus position. Specifically, the focus movement trajectory information J1 of this example is information indicating the focus lens position for each combination of zoom lens position and focus position, as illustrated.
In the focus movement trajectory information J1, the zoom lens positions indicated on the vertical axis represent the zoom lens positions from the position at one end of the zoom lens movable range shown in FIG. 3 (the widest-angle position) to the position at the other end (the most telephoto position), and the focus positions on the horizontal axis represent the focus positions from the focus position corresponding to infinity to the focus position corresponding to the closest distance.
Here, in the focus movement trajectory information J1, the step sizes of the zoom lens positions and the focus positions are arbitrary.
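As a rough illustration of how a table like J1 can be queried, the Python sketch below stores a focus lens position for each (zoom lens position, focus position) grid point and returns the entry for a requested combination. The grid values and names are hypothetical placeholders, not values from the embodiment; handling of combinations that fall between grid points is discussed later.

# Hypothetical focus movement trajectory table J1:
# J1[zoom_index][focus_index] = focus lens position (mm)
ZOOM_GRID  = [0.0, 2.0, 4.0, 6.0]            # zoom lens positions, wide-angle end to telephoto end (assumed)
FOCUS_GRID = [float('inf'), 5.0, 3.0, 1.0]   # focus positions, infinity to closest distance (assumed)
J1 = [
    [10.0, 10.4, 10.9, 11.6],
    [11.2, 11.7, 12.3, 13.1],
    [12.5, 13.1, 13.8, 14.7],
    [13.9, 14.6, 15.4, 16.4],
]

def focus_lens_position(zoom_pos: float, focus_pos: float) -> float:
    """Look up the focus lens position for a (zoom lens position, focus position) grid point."""
    zi = ZOOM_GRID.index(zoom_pos)
    fi = FOCUS_GRID.index(focus_pos)
    return J1[zi][fi]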
With such focus movement trajectory information J1, given the zoom lens position and focus position information, the focus lens position information corresponding to that combination of zoom lens position and focus position can be acquired. Conversely, given the zoom lens position and focus lens position information, the focus position information corresponding to that combination of zoom lens position and focus lens position can also be acquired.
Since the characteristics of the focus movement trajectory information J1 described above may differ depending on the type and individual unit of the interchangeable lens 1, in this example, focus movement trajectory information J1 corresponding to the characteristics of each interchangeable lens 1 is stored in the memory 32 of that interchangeable lens 1.
FIG. 7 is a diagram showing an example of the angle-of-view table J2.
The angle-of-view table J2 is information indicating the focal length (angle of view) for each combination of zoom lens position and focus position. In this case as well, the ranges of the zoom lens positions and focus positions and their step sizes are the same as in the case of the focus movement trajectory information J1.
The angle-of-view table J2 thus indicates, for each zoom lens position, the change characteristic of the angle of view when the focus position is changed from infinity to the closest distance.
In this example, the angle-of-view table J2 is used to obtain the movement amount of the zoom lens 13 required when performing breathing correction by the zoom lens correction method; this point will be explained again later.
FIG. 8 is a diagram showing an example of the lens group movement speed table J3.
The lens group movement speed table J3 is information relating to the zoom group movement speed (mm/sec), that is, the amount of movement of the zoom lens 13 per unit time when the zoom lens 13 (the actuator in the zoom lens drive unit 21) is driven with a predetermined drive signal value, and indicates the change characteristic of the zoom group movement speed with respect to a predetermined speed change factor. Specifically, in this example, the lens group movement speed table J3 is information indicating the change characteristic of the lens group movement speed with respect to temperature.
With such a lens group movement speed table J3, the zoom group movement speed can be obtained with high accuracy with respect to speed change factors such as temperature.
In this example, the lens group movement speed table J3 is used when calculating the correction time (which can be rephrased as the movement time of the zoom lens 13 required for the correction: the correction time Tla described later) in the case of performing breathing correction by the zoom lens correction method.
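The sketch below shows one possible way to read a zoom group movement speed from a temperature-indexed table such as J3, interpolating linearly between the nearest table temperatures. The table values are invented for illustration only.

# Hypothetical lens group movement speed table J3: (temperature in deg C, speed in mm/sec)
J3 = [(-10.0, 3.2), (0.0, 3.6), (25.0, 4.0), (40.0, 3.8)]

def zoom_group_speed(temp_c: float) -> float:
    """Return the zoom group movement speed at the given temperature,
    interpolating linearly between neighbouring table entries."""
    if temp_c <= J3[0][0]:
        return J3[0][1]
    if temp_c >= J3[-1][0]:
        return J3[-1][1]
    for (t0, v0), (t1, v1) in zip(J3, J3[1:]):
        if t0 <= temp_c <= t1:
            ratio = (temp_c - t0) / (t1 - t0)
            return v0 + ratio * (v1 - v0)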
Regarding the lens group movement speed table J3, as the speed change factor, other factors such as the amount of power supplied from the imaging device 2 to the interchangeable lens 1 can also be applied in addition to temperature.
FIG. 9 is a diagram showing an example of the trimming magnification table J4.
As illustrated, the trimming magnification table J4 is information indicating the trimming magnification for breathing correction for each combination of zoom lens position and focus position. In this case as well, the ranges of the zoom lens positions and focus positions and their step sizes are the same as in the case of the focus movement trajectory information J1.
Such a trimming magnification table J4 indicates, for each zoom lens position, the change characteristic of the trimming magnification for canceling out the change in angle of view that accompanies a change in focus position.
In AF, once the target focus position for focusing on the focusing target has been obtained based on the defocus amount DF, the trimming magnification required for the correction can be acquired based on the trimming magnification table J4 from the information on this target focus position and the information on the current zoom lens position.
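As an illustrative sketch only, the following shows how a trimming magnification taken from a table like J4 could be turned into a centered crop of the captured frame, which would then be resized back to the output size. The helper name and the assumption of a centered crop are illustrative and are not taken from the embodiment.

def trimming_rectangle(width: int, height: int, magnification: float):
    """Return (left, top, right, bottom) of a centered crop whose linear size
    is 1/magnification of the full frame (magnification 1.0 = no trimming)."""
    crop_w = int(round(width / magnification))
    crop_h = int(round(height / magnification))
    left = (width - crop_w) // 2
    top = (height - crop_h) // 2
    return left, top, left + crop_w, top + crop_h

# Example: a trimming magnification of 1.05 applied to a 3840x2160 frame
print(trimming_rectangle(3840, 2160, 1.05))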
Here, the angle-of-view table J2, the lens group movement speed table J3, and the trimming magnification table J4 shown in FIGS. 7 to 9, like the focus movement trajectory information J1 shown in FIG. 6, may have characteristics that differ depending on the type and individual unit of the interchangeable lens 1. Therefore, in this example, for the angle-of-view table J2, the lens group movement speed table J3, and the trimming magnification table J4 as well, information corresponding to the characteristics of each interchangeable lens 1 is stored in the memory 32 of that interchangeable lens 1.
As described above, the zoom lens 13 and the focus lens 16 can each be configured with a plurality of zoom lens groups and a plurality of focus lens groups, respectively. That is, there may be a configuration in which zoom adjustment is performed by displacement of a plurality of zoom lens groups, or a configuration in which focusing is performed by displacement of a plurality of focus lens groups.
When a configuration is adopted in which zoom adjustment is performed by displacement of a plurality of zoom lens groups, the "zoom lens position" is information on the combination of the positions of the zoom lens groups.
Similarly, when a configuration is adopted in which focusing is performed by displacement of a plurality of focus lens groups, the "focus lens position" information is information on the combination of the positions of the focus lens groups.
In order to grasp the correspondence between the "zoom lens position", which is the combination information of the positions of the plurality of zoom lens groups as described above, and the position of each zoom lens group, information such as that shown in FIG. 10 may be used, for example.
Specifically, table information as shown in FIG. 10A or function information as shown in FIG. 10B may be used.
Here, an example is shown in which the zoom lens 13 is composed of two lens groups, a first zoom lens group and a second zoom lens group. As the table information of FIG. 10A, a table is used that stores, for each zoom lens position, combination information of the position coordinate of the first zoom lens group and the position coordinate of the second zoom lens group.
As the function information of FIG. 10B, a function indicating the correspondence between the position coordinate of the first zoom lens group and the position coordinate of the second zoom lens group is used, such as the function Fz shown in the figure (for example, a fifth-order function such as "Y = ax^5 + bx^4 + cx^3 + dx^2 + ex + f", where "^" means exponentiation). In this case, either the position of the first zoom lens group or the position of the second zoom lens group can substitute for the "zoom lens position", but conceptually the "zoom lens position" is still the combination of the position of the first zoom lens group and the position of the second zoom lens group.
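A minimal sketch of evaluating a fifth-order function such as Fz to obtain the second zoom lens group coordinate Y from the first zoom lens group coordinate x is shown below. The coefficient values are placeholders for illustration, not values from the embodiment.

# Hypothetical coefficients (a, b, c, d, e, f) of Y = a*x^5 + b*x^4 + c*x^3 + d*x^2 + e*x + f
FZ_COEFFS = (1.2e-5, -3.4e-4, 2.1e-3, -7.5e-3, 0.92, 0.05)

def second_group_position(first_group_pos: float) -> float:
    """Evaluate the function Fz relating the first and second zoom lens group position coordinates."""
    a, b, c, d, e, f = FZ_COEFFS
    x = first_group_pos
    return a * x**5 + b * x**4 + c * x**3 + d * x**2 + e * x + f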
Since the table and function information exemplified in FIG. 10 may also differ for each interchangeable lens 1, the corresponding information is stored in the memory 32 of each interchangeable lens 1.
Note that, when the focus lens 16 side is also composed of a plurality of focus lens groups, similar tables and functions may be stored on the interchangeable lens 1 side and acquired by the imaging device 2 as necessary.
Returning to FIG. 5.
The AF processing unit F2 performs processing related to AF, specifically, processing for acquiring the defocus amount DF described above and processing for acquiring the target focus lens position for focusing on the focusing target based on the defocus amount DF.
In obtaining the target focus lens position from the defocus amount DF, the zoom lens position and focus lens position information sequentially transmitted from the interchangeable lens 1 side and the focus movement trajectory information J1 are used.
Specifically, the AF processing unit F2 obtains the focus position for focusing on the focusing target (hereinafter referred to as the "target focus position") based on the current (current-frame) zoom lens position and focus lens position information transmitted from the interchangeable lens 1 side and the focus movement trajectory information J1. That is, first, the current focus position is acquired based on the current zoom lens position and focus lens position information and the focus movement trajectory information J1. Then, the target focus position is calculated based on the current focus position and the defocus amount DF.
Next, the AF processing unit F2 acquires the target focus lens position based on the target focus position, the current zoom lens position information, and the focus movement trajectory information J1.
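The sketch below strings these steps together in plain Python: the current zoom lens position and focus lens position plus J1 give the current focus position, the defocus amount DF shifts it to the target focus position, and J1 is consulted again for the target focus lens position. The helper callables passed in are assumed stand-ins for the J1 lookups, and the simple additive conversion from DF to a focus position shift is an assumption made only for illustration; the embodiment does not specify the exact conversion.

def target_focus_lens_position(zoom_pos, focus_lens_pos, defocus_df,
                               current_focus_position, focus_lens_position_for):
    """Sketch of the AF processing unit F2 flow, with assumed helper lookups into J1."""
    # 1) Current focus position from (zoom lens position, focus lens position) via J1.
    focus_now = current_focus_position(zoom_pos, focus_lens_pos)
    # 2) Target focus position from the current focus position and the defocus amount DF
    #    (a simple additive shift is assumed here purely for illustration).
    focus_target = focus_now + defocus_df
    # 3) Target focus lens position from (zoom lens position, target focus position) via J1.
    return focus_lens_position_for(zoom_pos, focus_target)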
The AF processing unit F2 instructs the lens-side control unit 12 with the target focus lens position information acquired as described above. As a result, in the interchangeable lens 1, the focus lens 16 is driven so that the focus lens position matches the target focus lens position, and AF is realized.
Here, it is not realistic, in terms of data volume and the like, for the focus movement trajectory information J1 to cover all zoom lens positions, focus positions, and focus lens positions, and there may be cases where the current combination of focus lens position and zoom lens position does not exist in the focus movement trajectory information J1. In that case, interpolation processing such as linear interpolation is performed to obtain the focus position corresponding to the current combination of focus lens position and zoom lens position.
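The following sketch illustrates the kind of linear (bilinear) interpolation that could be used when the queried point falls between the grid points of a two-dimensional table such as J1. The grid handling and names are assumptions for illustration.

from bisect import bisect_right

def interp_table(grid_x, grid_y, table, x, y):
    """Bilinearly interpolate table[i][j], defined on (grid_x[i], grid_y[j]), at the point (x, y)."""
    i = max(1, min(bisect_right(grid_x, x), len(grid_x) - 1))
    j = max(1, min(bisect_right(grid_y, y), len(grid_y) - 1))
    x0, x1 = grid_x[i - 1], grid_x[i]
    y0, y1 = grid_y[j - 1], grid_y[j]
    tx = (x - x0) / (x1 - x0)
    ty = (y - y0) / (y1 - y0)
    top = table[i - 1][j - 1] * (1 - ty) + table[i - 1][j] * ty
    bottom = table[i][j - 1] * (1 - ty) + table[i][j] * ty
    return top * (1 - tx) + bottom * tx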
In the above, the processing for obtaining the target focus lens position from the defocus amount DF is performed on the imaging device 2 side, but this processing can also be performed on the interchangeable lens 1 side. In that case, the body-side control unit 52 transmits the defocus amount DF information to the lens-side control unit 12, and the lens-side control unit 12 acquires the target focus lens position based on the focus movement trajectory information J1 stored in the memory 32.
The first correction control unit F3 and the second correction control unit F4 each perform control for realizing breathing correction by a different correction method. Specifically, in this example, the first correction control unit F3 performs control for realizing breathing correction by the zoom lens correction method, and the second correction control unit F4 performs control for realizing breathing correction by the trimming correction method.
In this example, the first correction control unit F3 instructs the lens-side control unit 12 with the target zoom lens position information for realizing the correction by the zoom lens correction method, which is obtained by the processing of the correction method determination unit F5 described later, thereby causing the interchangeable lens 1 side to execute breathing correction by the zoom lens correction method.
The second correction control unit F4 in this example instructs the image signal processing unit 58 with the trimming magnification information acquired based on the trimming magnification table J4 as described later, that is, the trimming magnification for realizing breathing correction by the trimming correction method, thereby causing the image signal processing unit 58 to execute trimming processing for breathing correction.
The correction method determination unit F5 determines, based on information relating to the movement amount of the focus lens group, by which of the breathing correction methods, the first correction method or the second correction method, the breathing correction is to be performed. That is, in this example, the correction method to be executed is determined between the zoom lens correction method and the trimming correction method.
The determination of the correction method here is performed by obtaining, based on the information relating to the movement amount of the focus lens group, the correction time Tla in the case of adopting the zoom lens correction method, and determining whether or not the correction time Tla is equal to or less than a predetermined threshold value Tth.
As a basic processing flow, first, with respect to a change in focus position caused by focusing, the position to which the zoom lens 13 should be moved for breathing correction in the case of adopting the zoom lens correction method is specified using the angle-of-view table J2. In other words, the target position of the zoom lens position for breathing correction in the case of adopting the zoom lens correction method is specified.
Hereinafter, this target position is referred to as the "target zoom lens position".
A specific method is described with reference to FIG. 11.
Although FIG. 11 shows an example in which angle-of-view information is stored in the angle-of-view table J2, focal length information can also be stored.
For example, as indicated by (1) in the figure, assume that with a change in focus position (a change from Fc[2] to Fc[1] in the figure), the angle of view has changed from 17.0 (mm) to 16.8 in relation to the current zoom lens position (Zm[1] in the figure).
In this case, in order to realize breathing correction, as indicated by (2) in the figure, it can be seen that the zoom lens position should be changed from Zm[1] to Zm[2] so that the original angle of view (before the focus position change) of 17.0 is kept at the changed focus position (Fc[1]). That is, the target zoom lens position in this case can be specified as Zm[2].
In this way, the target zoom lens position can be acquired by referring to the angle-of-view table J2 based on the current zoom lens position and the information on the focus position after the change due to focusing. Specifically, the zoom lens position at which the angle of view before the change can be maintained at the focus position after the change is acquired as the target zoom lens position.
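As a sketch of the lookup described around FIG. 11, the following searches an angle-of-view table like J2 for the zoom lens position whose angle of view at the post-change focus position is closest to the angle of view that was in effect before the change. The table contents mirror the FIG. 11 example but are otherwise hypothetical.

# Hypothetical angle-of-view table J2: J2[zoom_pos][focus_pos] = angle of view
J2 = {
    "Zm[1]": {"Fc[1]": 16.8, "Fc[2]": 17.0},
    "Zm[2]": {"Fc[1]": 17.0, "Fc[2]": 17.2},
}

def target_zoom_lens_position(current_zoom, focus_before, focus_after):
    """Find the zoom lens position that keeps, at focus_after, the angle of view
    that current_zoom had at focus_before."""
    angle_to_keep = J2[current_zoom][focus_before]
    return min(J2, key=lambda zoom: abs(J2[zoom][focus_after] - angle_to_keep))

# Example matching FIG. 11: the focus position changes from Fc[2] to Fc[1] at Zm[1]
print(target_zoom_lens_position("Zm[1]", "Fc[2]", "Fc[1]"))  # -> "Zm[2]"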
Based on the acquired target zoom lens position, the correction method determination unit F5 calculates the correction time Tla, which is the time required for the correction in the case of performing breathing correction by the zoom lens correction method. Specifically, the movement amount of the zoom lens 13 required for the correction is calculated from the target zoom lens position and the current zoom lens position as the "correction zoom lens movement amount", and the correction time Tla is calculated based on this correction zoom lens movement amount and the zoom lens group movement speed information obtained based on the lens group movement speed table J3 shown in FIG. 8.
In this example, since the lens group movement speed table J3 is information defining the zoom lens group movement speed with respect to temperature, the correction method determination unit F5 acquires the temperature information of the interchangeable lens 1 detected by the detection unit 17, and acquires the zoom lens group movement speed information corresponding to that temperature based on the lens group movement speed table J3.
Then, the correction time Tla is calculated based on the acquired zoom lens group movement speed and the correction zoom lens movement amount described above. Specifically, the correction time Tla is calculated by dividing the correction zoom lens movement amount by the zoom lens group movement speed.
The correction method determination unit F5 determines whether or not the correction time Tla calculated in this way is equal to or less than the predetermined threshold value Tth. If the correction time Tla is equal to or less than the threshold value Tth, control is performed so that breathing correction by the zoom lens correction method is performed, and if the correction time Tla is not equal to or less than the threshold value Tth, control is performed so that breathing correction by the trimming correction method is performed.
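The determination just described can be summarized in a few lines. The sketch below computes the correction time Tla from the correction zoom lens movement amount and a temperature-dependent group speed, then selects the correction method; the default threshold value is a placeholder assumption and not a value given in the embodiment.

ZOOM_CORRECTION = "zoom lens correction method"
TRIMMING_CORRECTION = "trimming correction method"

def decide_correction_method(current_zoom_mm, target_zoom_mm,
                             group_speed_mm_per_sec, threshold_tth_sec=1 / 30):
    """Compute the correction time Tla and choose the breathing correction method.
    threshold_tth_sec is a placeholder value, not one specified in the embodiment."""
    correction_movement = abs(target_zoom_mm - current_zoom_mm)   # correction zoom lens movement amount
    tla = correction_movement / group_speed_mm_per_sec            # correction time Tla
    return ZOOM_CORRECTION if tla <= threshold_tth_sec else TRIMMING_CORRECTION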
Here, when performing breathing correction by the zoom lens correction method, the above-described first correction control unit F3 instructs the lens-side control unit 12 with the target zoom lens position, thereby causing the interchangeable lens 1 side to execute breathing correction by the zoom lens correction method.
When performing breathing correction by the trimming correction method, the second correction control unit F4 instructs the image signal processing unit 58 with the trimming magnification acquired based on the trimming magnification table J4, thereby causing the image signal processing unit 58 to execute trimming processing for breathing correction.
Here, as can be seen with reference to FIG. 9, according to the trimming magnification table J4, once the information on the focus position after the change due to focusing is obtained, the trimming magnification for the correction can be acquired based on the current zoom lens position information.
As for the trimming magnification table J4 as well, it is not realistic, in terms of data volume and the like, for it to cover all combinations of zoom lens positions and focus positions, and there may be cases where the current combination of zoom lens position and focus position does not exist in the trimming magnification table J4. In that case, interpolation processing such as linear interpolation is performed to obtain the trimming magnification corresponding to the current combination of zoom lens position and focus position.
(1-3. Processing procedure)
Next, with reference to the flowchart of FIG. 12, a specific processing procedure example for realizing the focus-related processing as the first embodiment described above is explained.
In this example, the processing shown in FIG. 12 is executed by the body-side control unit 52 as software processing based on a program stored in the aforementioned ROM or the like.
Also, a processing example is described here on the premise that focusing is performed by AF.
First, in step S101, the body-side control unit 52 waits for the start of moving image capturing. That is, processing is performed to wait for the moving image capturing operation to enter the started state based on a user operation input or the like.
When it is determined that moving image capturing has started, the body-side control unit 52 performs processing for acquiring the current zoom lens position and focus lens position in step S102. That is, it acquires the zoom lens position and focus lens position sequentially transmitted from the lens-side control unit 12 for each frame.
In step S103 following step S102, the body-side control unit 52 acquires the current focus position based on the focus movement trajectory information J1. That is, the current focus position is acquired based on the current zoom lens position and focus lens position information acquired in step S102 and the focus movement trajectory information J1.
In step S104 following step S103, the body-side control unit 52 acquires the defocus amount DF for AF. That is, as explained above as the processing of the AF processing unit F2, the defocus amount DF is acquired based on the phase difference pixel signal Sp.
In step S105 following step S104, the body-side control unit 52 calculates the target focus position based on the current focus position and the defocus amount DF.
Then, in step S106 following step S105, the body-side control unit 52 acquires the zoom lens position for breathing correction, that is, the aforementioned target zoom lens position, based on the target focus position, the current zoom lens position, and the angle-of-view table J2.
Regarding the method of acquiring the target zoom lens position based on the angle-of-view table J2 in this case, the aforementioned "focus position after the change" may simply be read as the target focus position; the other parts have already been explained, so redundant explanation is avoided.
In step S107 following step S106, the body-side control unit 52 calculates the zoom lens movement amount for breathing correction, that is, the aforementioned correction zoom lens movement amount, based on the target zoom lens position and the current zoom lens position.
Next, in step S108, the body-side control unit 52 calculates the correction time Tla by the zoom lens based on the correction zoom lens movement amount and the lens group movement speed table J3.
The method of calculating the correction time Tla has also already been explained, so redundant explanation is avoided.
In step S109 following step S108, the body-side control unit 52 determines whether or not the correction time Tla is equal to or less than the threshold value Tth.
If the correction time Tla is equal to or less than the threshold value Tth, the body-side control unit 52 proceeds to step S110 and performs zoom lens movement control to the target zoom lens position. That is, it instructs the lens-side control unit 12 with the target zoom lens position acquired in step S106 to execute breathing correction by the zoom lens correction method.
On the other hand, if the correction time Tla is not equal to or less than the threshold value Tth, the body-side control unit 52 proceeds to step S111 and acquires the trimming magnification for breathing correction based on the trimming magnification table J4. Specifically, the trimming magnification for breathing correction is acquired based on the current zoom lens position acquired in step S102, the target focus position information acquired in step S105, and the trimming magnification table J4.
Then, in the following step S112, the body-side control unit 52 instructs the image signal processing unit 58 with the acquired trimming magnification. As a result, breathing correction by the trimming correction method is realized.
Both when the processing of step S110 has been executed and when the processing of step S112 has been executed, the body-side control unit 52 advances the processing to step S113.
In step S113, the body-side control unit 52 performs frame imaging standby processing. That is, it waits for completion of the imaging operation for one frame.
In step S114 following step S113, the body-side control unit 52 determines whether or not moving-image capturing has ended. That is, this is processing for waiting for the end of the moving-image capturing operation.
If it is determined that moving-image capturing has not ended, the body-side control unit 52 returns to step S102.
As a result, while a moving image is being captured, the processing from steps S102 to S113 is repeatedly executed at the frame period as processing for breathing correction with respect to focusing by AF.
On the other hand, when it is determined that moving-image capturing has ended, the body-side control unit 52 ends the series of processes shown in FIG. 12.
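The per-frame flow of FIG. 12 can be summarized in code form. The following is a minimal Python sketch of that loop; the helper names (lens, body, tables and their methods), the table objects, and the mapping from defocus amount to target focus position are hypothetical stand-ins for the processing described above, not the actual firmware interfaces.

```python
# Minimal sketch of the per-frame breathing-correction loop of FIG. 12.
# All helper objects and methods are hypothetical placeholders.

def breathing_correction_loop(lens, body, tables, t_threshold):
    while not body.movie_capture_finished():                              # S114
        zoom_pos, focus_pos = lens.current_positions()                    # S102
        focus_dist = tables.focus_trajectory.lookup(zoom_pos, focus_pos)  # S103 (J1)
        defocus = body.af_defocus_amount()                                # S104 (phase-difference AF)
        target_focus_dist = focus_dist + defocus                          # S105 (simplified mapping)
        target_zoom_pos = tables.view_angle.lookup(target_focus_dist,
                                                   zoom_pos)              # S106 (J2)
        zoom_move = abs(target_zoom_pos - zoom_pos)                       # S107
        t_correction = zoom_move / tables.lens_speed.zoom_speed()         # S108 (J3)

        if t_correction <= t_threshold:                                   # S109
            lens.move_zoom_to(target_zoom_pos)                            # S110: zoom lens correction
        else:
            ratio = tables.trimming.lookup(zoom_pos, target_focus_dist)   # S111 (J4)
            body.set_trimming_ratio(ratio)                                # S112: trimming correction

        body.wait_for_frame()                                             # S113
```

The sketch only shows the control flow; how the tables J1 to J4 are actually indexed and interpolated is left abstract here.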
Here, in the processing example described with reference to FIG. 12, which assumes breathing correction for focusing by AF, the "information on the movement amount of the focus lens" corresponds to the information on the "current focus position" acquired based on the current focus lens position and the focus movement trajectory information J1, and to the information on the "target focus position" obtained based on the current focus position and the defocus amount DF from AF.
The processing for switching the correction method described above can also be suitably applied when focusing is performed by manual operation rather than by AF. When supporting focusing by manual operation, the focus position after the change by the manual operation may be applied as the "focus position after the change" when acquiring the target zoom lens position based on the angle-of-view table J2.
When supporting focusing by manual operation, the "information on the movement amount of the focus lens" corresponds to the information on the focus lens movement amount detected for the focus lens displacement caused by the manual operation.
In the above, expressions such as "acquire the current zoom lens position and focus lens position" (S102) and "acquire the current focus position based on the focus movement trajectory information J1" (S103) are used; in this specification, "acquiring" information is synonymous with storing the target information in a predetermined storage device (for example, a RAM or a register) in a state in which it can be processed by the processor.
<2. Second Embodiment>
Next, a second embodiment will be described.
The second embodiment relates to processing to be executed during breathing correction by the zoom lens correction method.
In the following description, parts that are the same as parts already described are denoted by the same reference numerals, and their description is omitted.
FIG. 13 is a block diagram showing an internal configuration example of an interchangeable lens 1A and an imaging device 2A constituting a camera system as the second embodiment.
Compared with the interchangeable lens 1 (see FIG. 2), the interchangeable lens 1A differs in that the memory 32 stores a lens group movement speed table J3A in place of the lens group movement speed table J3, and further stores a focus group angle-of-view variation rate table J5 and a zoom group angle-of-view variation rate table J6.
The lens group movement speed table J3A, the focus group angle-of-view variation rate table J5, and the zoom group angle-of-view variation rate table J6 will be described later.
The imaging device 2A differs from the imaging device 2 in that a body-side control unit 52A is provided in place of the body-side control unit 52. As shown in FIG. 14, the body-side control unit 52A differs from the body-side control unit 52 in the functions it has.
FIG. 14 is a functional block diagram showing the functions of the body-side control unit 52A according to the second embodiment.
Compared with the body-side control unit 52 (see FIG. 5), the body-side control unit 52A differs in that it has an information acquisition processing unit F1A in place of the information acquisition processing unit F1, and a first correction control unit F3A in place of the first correction control unit F3.
The information acquisition processing unit F1A performs processing for acquiring, from the interchangeable lens 1A, the focus movement trajectory information J1, the angle-of-view table J2, the lens group movement speed table J3A, the trimming magnification table J4, the focus group angle-of-view variation rate table J5, and the zoom group angle-of-view variation rate table J6.
Compared with the first correction control unit F3, the first correction control unit F3A differs in that, as the zoom lens movement control processing of step S110 performed during breathing correction by the zoom lens correction method, it performs processing different from the processing described in the first embodiment.
Here, FIG. 15 schematically shows the relationship between the target zoom lens position, the position of the zoom lens 13 before it moves to the target zoom lens position, the target focus lens position determined by AF, and the position of the focus lens 16 before it moves to the target focus lens position, in the case where breathing correction is performed by the zoom lens correction method.
When correcting breathing caused by focusing by AF, the focus lens 16 moves from the current focus lens position toward the target focus lens position, and the zoom lens 13 moves toward the target zoom lens position.
At this time, if there is a difference in movement speed between the zoom lens 13 and the focus lens 16, the variation in the angle of view due to breathing cannot be sufficiently corrected, and the user may perceive the change in the angle of view.
The first correction control unit F3A performs the following processing in order to suppress the variation in the angle of view caused by such a difference in movement speed between the focus lens 16 and the zoom lens 13.
Here, in the present embodiment, it is assumed that the zoom lens 13 and the focus lens 16 are driven in cycles whose unit period is a minute time δt.
In the following description, as an example, the movement speed on the focus lens 16 side (the speed Vf described later) is varied on the premise that the zoom lens 13 is driven with its drive signal value set to a fixed value such as the maximum value.
The first correction control unit F3A performs processing for displacing the zoom lens position and the focus lens position to their respective target positions while adjusting the focus group movement speed Vf so as to satisfy the condition of [Expression 1] below.

|Vz × Az − Vf × Af| ≤ Gth ... [Expression 1]

Here, in [Expression 1], Vz denotes the zoom group movement speed (see FIG. 8), and Vf denotes the focus group movement speed, that is, the movement speed of the focus lens 16. Az denotes the angle-of-view variation rate (%/mm) per unit drive of the zoom lens 13 when the zoom group movement speed Vz is a predetermined reference speed (hereinafter referred to as the "zoom group angle-of-view variation rate Az"). The "unit drive" referred to here means lens driving for one cycle of the above-described minute time δt.
Af denotes the angle-of-view variation rate per unit drive of the focus lens 16 when the focus group movement speed Vf is a predetermined reference speed (hereinafter referred to as the "focus group angle-of-view variation rate Af").
That is, the faster the movement speed of the zoom lens 13 as the zoom group movement speed Vz (the larger its value), the larger the angle-of-view variation rate caused by the zoom lens movement, calculated as "Vz × Az" in [Expression 1]; similarly, the faster the movement speed of the focus lens 16 as the focus group movement speed Vf (the larger its value), the larger the angle-of-view variation rate caused by the focus lens movement, calculated as "Vf × Af" in [Expression 1].
Each time one cycle of driving with the minute time δt is performed, the first correction control unit F3A obtains the angle-of-view variation rate "Vz × Az" caused by the zoom lens movement and the angle-of-view variation rate "Vf × Af" caused by the focus lens movement for that one cycle of driving, and determines whether or not the absolute value of their difference (|Vz × Az − Vf × Af|) is equal to or less than the threshold value Gth, which defines the angle-of-view variation rate that is permissible within the unit time of one cycle.
If |Vz × Az − Vf × Af| is equal to or less than the threshold value Gth, the first correction control unit F3A causes the zoom lens 13 to be driven at the speed indicated by the zoom group movement speed Vz and the focus lens 16 to be driven at the speed indicated by the focus group movement speed Vf, as the driving of the zoom lens 13 and the focus lens 16 for one cycle.
On the other hand, if |Vz × Az − Vf × Af| is not equal to or less than the threshold value Gth, the first correction control unit F3A obtains a focus group movement speed Vf that satisfies the condition of [Expression 1], and causes the focus lens 16 and the zoom lens 13 to be driven for one cycle at the obtained focus group movement speed Vf and at the zoom group movement speed Vz, respectively.
The focus group movement speed Vf can be acquired based on the lens group movement speed table J3A illustrated in FIG. 16. The lens group movement speed table J3A is obtained by adding, to the lens group movement speed table J3 described earlier, information indicating the change characteristic of the focus group movement speed Vf with respect to a predetermined speed change factor (temperature is again used as the example here).
With such a lens group movement speed table J3A, the zoom group movement speed Vz and the focus group movement speed Vf can be obtained accurately with respect to speed change factors such as temperature.
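As a rough illustration of how a temperature-keyed speed table such as J3A might be consulted, the sketch below interpolates the zoom-group and focus-group speeds between tabulated temperature points. The table contents, units, and grid spacing are invented for illustration only; the actual table structure is not specified here.

```python
# Hypothetical lens-group speed table keyed by temperature (illustration of J3A).
# Entries: temperature [deg C] -> (zoom group speed Vz, focus group speed Vf), arbitrary units.
SPEED_TABLE = {0: (8.0, 12.0), 25: (10.0, 15.0), 50: (9.0, 13.5)}

def lens_group_speeds(temp_c):
    """Linearly interpolate (Vz, Vf) for the measured lens temperature."""
    temps = sorted(SPEED_TABLE)
    temp_c = min(max(temp_c, temps[0]), temps[-1])   # clamp to the tabulated range
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            w = (temp_c - lo) / (hi - lo)
            vz = SPEED_TABLE[lo][0] * (1 - w) + SPEED_TABLE[hi][0] * w
            vf = SPEED_TABLE[lo][1] * (1 - w) + SPEED_TABLE[hi][1] * w
            return vz, vf
    return SPEED_TABLE[temps[-1]]
```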
The focus group angle-of-view variation rate Af is acquired based on the focus group angle-of-view variation rate table J5 illustrated in FIG. 17.
As illustrated, the focus group angle-of-view variation rate table J5 is information indicating the focus group angle-of-view variation rate Af for each combination of zoom lens position and focus lens position.
When making the determination using [Expression 1], the first correction control unit F3A acquires the corresponding focus group angle-of-view variation rate Af from the contents of the focus group angle-of-view variation rate table J5, based on the information on the current zoom lens position and focus lens position.
When making the determination using [Expression 1], the first correction control unit F3A also acquires the zoom group angle-of-view variation rate Az based on the zoom group angle-of-view variation rate table J6 illustrated in FIG. 18.
As illustrated, the zoom group angle-of-view variation rate table J6 is information indicating the zoom group angle-of-view variation rate Az for each zoom lens position.
Based on the information on the current zoom lens position, the first correction control unit F3A acquires the corresponding zoom group angle-of-view variation rate Az from the contents of the zoom group angle-of-view variation rate table J6.
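One possible way to hold tables J5 and J6 in memory is as lookups keyed on lens positions, as in the sketch below, which simply returns the nearest tabulated entry. The numeric values, grid points, and nearest-entry strategy are assumptions for illustration; a real implementation could equally interpolate.

```python
# Hypothetical nearest-entry lookups for the angle-of-view variation-rate tables.
# J6: zoom lens position -> zoom-group variation rate Az [% per unit drive]
# J5: (zoom position, focus position) -> focus-group variation rate Af [% per unit drive]
J6_TABLE = {0.0: 0.30, 5.0: 0.25, 10.0: 0.20}
J5_TABLE = {(0.0, 0.0): 0.10, (0.0, 3.0): 0.12,
            (5.0, 0.0): 0.08, (5.0, 3.0): 0.09,
            (10.0, 0.0): 0.06, (10.0, 3.0): 0.07}

def zoom_variation_rate(zoom_pos):
    key = min(J6_TABLE, key=lambda z: abs(z - zoom_pos))
    return J6_TABLE[key]

def focus_variation_rate(zoom_pos, focus_pos):
    key = min(J5_TABLE, key=lambda k: abs(k[0] - zoom_pos) + abs(k[1] - focus_pos))
    return J5_TABLE[key]
```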
FIG. 19 is a flowchart showing a specific example of a processing procedure for realizing the control as the second embodiment described above.
In the second embodiment, the body-side control unit 52A executes the processing shown in FIG. 19 as the processing of step S110 shown in FIG. 12.
In the processing of step S110 in this case, the body-side control unit 52A first performs, in step S201, processing for acquiring the zoom group movement speed Vz and the focus group movement speed Vf. That is, the zoom group movement speed Vz and the focus group movement speed Vf are acquired based on the lens group movement speed table J3A illustrated in FIG. 16 and on the temperature information of the interchangeable lens 1A detected by the detection unit 17.
In step S202 following step S201, the body-side control unit 52A acquires the zoom group angle-of-view variation rate Az and the focus group angle-of-view variation rate Af corresponding to the driving of the current one cycle. That is, they are acquired based on the information on the zoom lens position and focus lens position detected for the current one-cycle period, the zoom group angle-of-view variation rate table J6, and the focus group angle-of-view variation rate table J5.
Specifically, the corresponding zoom group angle-of-view variation rate Az is acquired from the zoom group angle-of-view variation rate table J6 based on the information on the zoom lens position, and the corresponding focus group angle-of-view variation rate Af is acquired from the focus group angle-of-view variation rate table J5 based on the information on the zoom lens position and focus lens position.
In step S203 following step S202, the body-side control unit 52A determines whether or not |Vz × Az − Vf × Af| is equal to or less than the threshold value Gth.
If |Vz × Az − Vf × Af| is equal to or less than the threshold value Gth, the body-side control unit 52A proceeds to step S205 and causes the zoom lens 13 and the focus lens 16 to be driven for one cycle at the zoom group movement speed Vz and the focus group movement speed Vf, respectively. That is, the zoom lens 13 is driven at the speed indicated by the zoom group movement speed Vz, and the focus lens 16 is driven at the speed indicated by the focus group movement speed Vf.
On the other hand, if |Vz × Az − Vf × Af| is not equal to or less than the threshold value Gth, the body-side control unit 52A proceeds to step S204 and performs processing for obtaining a focus group movement speed Vf that satisfies the condition. That is, a focus group movement speed Vf that satisfies the condition of [Expression 1] is obtained.
The body-side control unit 52A then executes the processing of step S205 using the focus group movement speed Vf obtained in step S204.
As a result, the lens driving can be controlled so that the difference in movement speed between the zoom lens 13 and the focus lens 16 does not exceed a certain speed difference based on the threshold value Gth, and the variation in the angle of view caused by the difference in movement speed between the focus lens 16 and the zoom lens 13 can be suppressed.
In step S206 following step S205, the body-side control unit 52A determines whether or not the zoom lens 13 and the focus lens 16 have reached their target positions.
If the zoom lens 13 and the focus lens 16 have not reached their respective target positions, the body-side control unit 52A returns to step S202. As a result, until the zoom lens 13 and the focus lens 16 reach their respective target positions, the lens drive control based on [Expression 1] is executed for each cycle of the above-described minute time δt.
On the other hand, when the zoom lens 13 and the focus lens 16 have reached their respective target positions, the body-side control unit 52A ends the processing of step S110.
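Combining the condition of [Expression 1] with the FIG. 19 flow gives roughly the following per-cycle control. The sketch reuses the hypothetical lookups sketched earlier (lens_group_speeds, zoom_variation_rate, focus_variation_rate); the lens drive interface is likewise a stand-in, and the case where Af is zero is simply skipped here, which the actual implementation would have to handle explicitly.

```python
# Sketch of the per-cycle drive control of FIG. 19 (zoom lens correction method).
# Keeps |Vz*Az - Vf*Af| <= Gth by adjusting the focus-group speed Vf when needed.

def drive_with_matched_speeds(lens, gth, dt):
    vz, vf = lens_group_speeds(lens.temperature())                      # S201 (J3A)
    while not lens.at_target_positions():                               # S206
        zoom_pos, focus_pos = lens.current_positions()
        az = zoom_variation_rate(zoom_pos)                              # S202 (J6)
        af = focus_variation_rate(zoom_pos, focus_pos)                  # S202 (J5)

        if abs(vz * az - vf * af) > gth and af != 0.0:                  # S203
            vf = vz * az / af                                           # S204: choose Vf so [Expression 1] holds
        lens.drive_one_cycle(zoom_speed=vz, focus_speed=vf, period=dt)  # S205
```

The sketch keeps Vz fixed, matching the premise above that only the focus-group speed is varied; adjusting both speeds, as mentioned below, would replace the single assignment in S204 with a search over (Vz, Vf) pairs.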
In the above, the focus group movement speed Vf is varied on the premise that the zoom lens is driven with the drive signal value of the zoom lens 13 set to a fixed value such as the maximum value; however, the drive signal value of the zoom lens 13 is not limited to a fixed value, and it is also possible to adopt a configuration in which, on the premise that the zoom lens 13 is driven at a value lower than the maximum value, both the zoom group movement speed Vz and the focus group movement speed Vf are adjusted so as to satisfy the condition of [Expression 1].
In this case, if there are a plurality of pairs of solutions for the zoom group movement speed Vz and the focus group movement speed Vf that satisfy the condition of [Expression 1], a pair in which the zoom group movement speed Vz and the focus group movement speed Vf are larger may be selected.
<3. Modifications>
The embodiments are not limited to the specific examples described above, and configurations as various modifications can be adopted.
For example, as in the processing of the first modified example shown in the flowchart of FIG. 20, even when the condition for executing correction by the trimming correction method based on the information on the movement amount of the focus lens is satisfied, control can also be performed so that breathing correction by the zoom lens correction method is performed when the trimming magnification required for breathing correction by the trimming correction method exceeds a threshold value.
Specifically, in the example of FIG. 20, when it is determined in step S109 that the correction time Tla is not equal to or less than the threshold value Tth, that is, when the condition for executing correction by the trimming correction method based on the information on the movement amount of the focus lens is satisfied, the body-side control unit 52 (or 52A; the same applies hereinafter) performs the processing for acquiring the trimming magnification for correction in step S111, and then determines in step S301 whether or not the trimming magnification is equal to or less than a predetermined threshold value Rth. The threshold value Rth is a value defined as the permissible value of the trimming magnification in terms of image quality.
When the trimming magnification is equal to or less than the threshold value Rth, the body-side control unit 52 proceeds to step S112, instructs the image signal processing unit 58 with the trimming magnification, and causes breathing correction by the trimming correction method to be executed.
On the other hand, when the trimming magnification is not equal to or less than the threshold value Rth, the body-side control unit 52 proceeds to step S110 and performs zoom lens movement control toward the target zoom lens position. That is, when the trimming magnification exceeds the threshold value Rth defined as the permissible value in terms of image quality, breathing correction by the trimming correction method is not performed, and breathing correction by the zoom lens correction method is performed instead.
With the processing of the first modified example described above, even when correction by the trimming correction method should be performed from the standpoint of correction time, correction by the zoom lens correction method is performed when the image quality deterioration due to trimming is predicted to be large.
Accordingly, while suppressing as far as possible the use of trimming, which causes image quality deterioration, the image quality deterioration when correction by trimming is performed can be kept within a permissible range.
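In code, the first modified example only adds one check on top of the FIG. 12 decision. The sketch below shows that branch in isolation as a plain function; the argument and threshold names are hypothetical.

```python
# Sketch of the first modified example (FIG. 20): cap the trimming magnification at Rth.
def choose_correction_with_trim_limit(t_correction, t_threshold, trim_ratio, r_threshold):
    """Return 'zoom' or 'trim' as the breathing-correction method to execute."""
    if t_correction <= t_threshold:      # S109: zoom correction is fast enough
        return "zoom"                    # S110
    if trim_ratio <= r_threshold:        # S301: trimming stays within the image-quality limit
        return "trim"                    # S112
    return "zoom"                        # S110: fall back to the zoom lens correction method
```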
Further, as in the processing of the second modified example shown in the flowchart of FIG. 21, even when the condition for executing correction by the zoom lens correction method based on the information on the movement amount of the focus lens is satisfied, control can also be performed so that breathing correction by the trimming correction method is performed when it is determined, based on scene recognition information, that the scene is a specific scene.
Here, "scene recognition information" means information for recognizing the scene being imaged.
In the example of FIG. 21, when it is determined in step S109 that the correction time Tla is equal to or less than the threshold value Tth, that is, when the condition for executing correction by the zoom lens correction method based on the information on the movement amount of the focus lens is satisfied, the body-side control unit 52 (or 52A; the same applies hereinafter) determines in step S401 whether or not the scene requires silence. That is, it determines, based on the scene recognition information, whether or not the scene is a specific scene that requires silence.
The scene recognition information here may be scene recognition information obtained by image recognition processing of the captured image. By image recognition processing of the captured image, scene recognition is performed for specific scenes such as a concert or a scene of imaging wild animals such as wild birds.
Alternatively, a dark scene such as at night may be regarded as a specific scene requiring silence, and a bright scene such as in the daytime may be regarded as a non-specific scene. In that case, the scene recognition information may be information based on a detection signal from an illuminance sensor or the like provided in the imaging device 2 (or 2A; the same applies hereinafter).
Alternatively, the scene recognition information may be recognition information based on an audio signal picked up by a microphone. For example, it is conceivable to determine whether or not the scene is a specific scene by pattern matching of the audio waveform, such as determining whether or not an audio waveform characteristic of the specific scene is obtained.
Scene recognition may also be performed based on detection signals from motion sensors such as an acceleration sensor or an angular velocity sensor. For example, it is conceivable to determine whether or not the scene is a specific scene by pattern matching of the waveform of the motion detection signal, such as determining whether or not a motion detection signal waveform characteristic of use of the imaging device 2 in the specific scene is obtained.
Furthermore, the scene recognition may be image recognition processing using AI (artificial intelligence) technology.
If it is determined in step S401 that the scene is not a specific scene requiring silence, the body-side control unit 52 proceeds to step S110. In this case, breathing correction by the zoom lens correction method is performed.
On the other hand, if it is determined that the scene is a specific scene requiring silence, the body-side control unit 52 proceeds to step S111. In this case, breathing correction by the trimming correction method is performed through the processing of steps S111 and S112.
With the processing of the second modified example described above, even when correction by the zoom lens correction method should be performed from the standpoint of correction time, it becomes possible, in a specific scene where silence is required, to perform correction by the trimming correction method, which does not generate operating noise associated with lens movement.
Accordingly, the breathing correction method can be switched appropriately in accordance with the situation of the imaging device 2.
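The second modified example can be written as a small extension of the same decision function. In the sketch below, is_quiet_scene stands in for whichever scene-recognition signal (image-, illuminance-, audio-, motion- or AI-based) the implementation actually uses; how that flag is produced is left abstract.

```python
# Sketch of the second modified example (FIG. 21): a quiet scene forces the trimming method.
def choose_correction_with_scene(t_correction, t_threshold, is_quiet_scene):
    if t_correction <= t_threshold:      # S109: zoom lens correction condition holds
        if is_quiet_scene:               # S401: lens-drive noise is not acceptable
            return "trim"                # S111/S112
        return "zoom"                    # S110
    return "trim"                        # S111/S112
```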
The first modified example and the second modified example described above can be combined.
That is, even when the trimming magnification exceeds the threshold value Rth and thus does not satisfy the requirement in terms of image quality, breathing correction by the trimming correction method may be performed when the scene is determined to be a specific scene requiring silence.
Alternatively, even when the scene is a specific scene in which silence is required, breathing correction by the zoom lens correction method may be performed when the trimming magnification exceeds the threshold value Rth.
As a third modified example, as shown in the flowchart of FIG. 22, it is also possible to determine, based on the scene recognition information, which of the first correction method (the zoom lens correction method in this example) and the second correction method (the trimming correction method in this example) should be used for breathing correction.
Specifically, in the example of FIG. 22, the body-side control unit 52 (or 52A; the same applies hereinafter) performs the processing for acquiring the target zoom lens position in step S106, then executes the processing for acquiring the trimming magnification in step S111, and then determines in step S401 whether or not the scene requires silence. The processing of step S401 in this case is the same as that described with reference to FIG. 21.
If it is determined that the scene is a specific scene requiring silence, the body-side control unit 52 advances the processing to step S112 (that is, breathing correction by the trimming correction method is performed).
On the other hand, if it is determined that the scene is not a specific scene requiring silence, the body-side control unit 52 advances the processing to step S110 (that is, breathing correction by the zoom lens correction method is performed).
With the processing of the third modified example described above, in a specific scene where silence is required, correction can be performed by the trimming correction method without performing correction by the zoom lens correction method, which generates operating noise associated with lens movement.
Accordingly, the breathing correction method can be switched appropriately in accordance with the situation of the imaging device 2 (or 2A).
In the description so far, an example has been given in which the breathing correction method is determined based on the correction time Tla obtained from the correction zoom lens movement amount, that is, the zoom lens movement amount required for breathing correction by the zoom lens correction method; instead of this, the breathing correction method can also be determined based on the correction zoom lens movement amount itself. In that case, it is determined whether or not the correction zoom lens movement amount is equal to or less than a predetermined threshold value Zth; if the correction zoom lens movement amount is equal to or less than the threshold value Zth, breathing correction by the zoom lens correction method is performed, and if it is not, breathing correction by the trimming correction method is performed.
Since the correction zoom lens movement amount correlates with the correction time Tla, the same effects as in the first and second embodiments can also be obtained when the determination is based on the correction zoom lens movement amount as described above.
The determination of the breathing correction method can also be made based on the movement amount of the focus lens 16 due to focusing. In the case of focusing by AF, the movement amount of the focus lens 16 due to focusing can be obtained based on the current focus lens position and the target focus lens position.
Since the movement amount of the focus lens 16 also correlates with the correction time Tla, a configuration in which the determination is based on the movement amount of the focus lens 16 as described above makes it possible to determine breathing correction appropriately based on the movement amount of the focus lens 16, for example by performing correction by the trimming correction method when the movement amount of the focus lens is large (when the correction time Tla is long).
Alternatively, the determination of the breathing correction method can be made based on the trimming magnification (the correction trimming magnification) used when breathing correction is performed by the trimming correction method. As can be understood from the description so far, the correction trimming magnification can be acquired based on the information on the focus position after the change due to focusing (the target focus position in the case of AF) and the zoom lens position, and on the trimming magnification table J4.
Since the correction trimming magnification also correlates with the correction time Tla, a configuration in which the determination is based on the correction trimming magnification as described above makes it possible to determine breathing correction appropriately based on the correction trimming magnification, for example by performing correction by the trimming correction method when the correction trimming magnification is large (when the correction time Tla is long).
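The same switching logic works with any quantity that correlates with the correction time, as the sketch below illustrates for the three alternatives mentioned above. The threshold names are hypothetical; only the direction of each comparison follows the description above.

```python
# Sketch: the method decision keyed on quantities correlated with the correction time Tla.

def choose_by_zoom_movement(zoom_move, z_threshold):
    """Small correction zoom lens movement -> zoom lens correction; otherwise trimming."""
    return "zoom" if zoom_move <= z_threshold else "trim"

def choose_by_focus_movement(focus_move, f_threshold):
    """Small focus lens movement -> zoom lens correction; otherwise trimming."""
    return "zoom" if focus_move <= f_threshold else "trim"

def choose_by_trim_ratio(trim_ratio, r_threshold):
    """Small correction trimming magnification -> zoom lens correction; otherwise trimming."""
    return "zoom" if trim_ratio <= r_threshold else "trim"
```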
In the description so far, an example has been given in which the lens characteristic information for breathing correction is stored in the interchangeable lens 1 or 1A, but this is not limiting.
For example, as a fourth modified example, the lens characteristic information may be stored in a cloud server, and the imaging device 2 or 2A may be configured to acquire it from the cloud server as needed.
FIG. 23 shows a simplified configuration explanatory diagram of a camera system as the fourth modified example.
In the figure, the interchangeable lens 1A' differs from the interchangeable lens 1A in that the lens characteristic information described in the second embodiment (the focus movement trajectory information J1, the angle-of-view table J2, the lens group movement speed table J3A, the trimming magnification table J4, the focus group angle-of-view variation rate table J5, and the zoom group angle-of-view variation rate table J6) is not stored in the memory 32. In this case, the memory 32 stores lens identification information J7 for identifying the interchangeable lens 1A'.
The imaging device 2A' in the figure differs from the imaging device 2A in that it includes a body-side control unit 52A' in place of the body-side control unit 52A and further includes a communication unit 70. The communication unit 70 communicates with external devices via a network NT such as the Internet.
The cloud server 80 stores lens characteristic information for each interchangeable lens 1A' as the lens characteristic information described in the second embodiment.
In the imaging device 2A', the body-side control unit 52A' acquires the lens identification information J7 from the attached interchangeable lens 1A', makes an inquiry to the cloud server 80 via the communication unit 70 using the acquired lens identification information J7, and thereby acquires the lens characteristic information corresponding to the attached interchangeable lens 1A'.
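A simplified sketch of the fourth modified example's start-up sequence follows: the body reads the lens ID and asks a server for the matching characteristic tables. The endpoint URL, the JSON response format, and the local cache are all assumptions made for illustration; the patent does not specify the communication protocol.

```python
# Sketch of the fourth modified example (FIG. 23): fetch lens characteristics from a server.
# The URL, the response format, and the caching behaviour are illustrative assumptions only.
import json
import urllib.request

_CHARACTERISTICS_CACHE = {}

def fetch_lens_characteristics(lens_id):
    """Return the characteristic tables (J1, J2, J3A, J4, J5, J6) for the attached lens."""
    if lens_id in _CHARACTERISTICS_CACHE:          # avoid repeated queries for the same lens
        return _CHARACTERISTICS_CACHE[lens_id]
    url = f"https://example.com/lens-characteristics/{lens_id}"   # hypothetical endpoint
    with urllib.request.urlopen(url) as resp:
        tables = json.loads(resp.read().decode("utf-8"))
    _CHARACTERISTICS_CACHE[lens_id] = tables
    return tables
```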
By adopting the system configuration of the fourth modified example described above, it is no longer necessary to store the lens characteristic information in the interchangeable lens 1A' in order to realize appropriate breathing correction in accordance with the lens characteristics.
Accordingly, the memory capacity of the interchangeable lens 1A' can be reduced.
In the above, an example has been given in which the configuration of the fourth modified example is applied to the camera system of the second embodiment, but the configuration of the fourth modified example can also be applied to the camera system of the first embodiment.
In the description so far, the determination of the breathing correction method has been a determination of whether to perform breathing correction by the zoom lens correction method or by the trimming correction method, but the correction methods subject to the determination are not limited to the zoom lens correction method and the trimming correction method.
For example, in a case where there are, as zoom lens correction methods, a first zoom lens correction method whose zoom lens movement speed is slow but whose zoom lens position control accuracy is high (that is, whose breathing correction accuracy is high) and a second zoom lens correction method whose zoom lens movement speed is fast but whose zoom lens position control accuracy is low, the determination may also be made between the first zoom lens correction method and the second zoom lens correction method.
In this case, it is conceivable that correction by the first zoom lens correction method is performed if the correction time of the first zoom lens correction method is short (equal to or less than a threshold value), and that correction by the second zoom lens correction method is performed if the correction time of the first zoom lens correction method is long (not equal to or less than the threshold value).
This makes it possible to enable breathing correction by the method with higher correction accuracy while minimizing, as far as possible, the failure to correct the change in the angle of view caused by a delayed correction response.
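The judgement between two zoom lens correction methods with different speed and accuracy trade-offs follows the same pattern as the earlier decisions; a minimal sketch, with hypothetical names:

```python
# Sketch: selecting between a slow/precise and a fast/coarse zoom lens correction method.
def choose_zoom_method(t_precise, t_threshold):
    """t_precise: estimated correction time of the slower, more accurate method."""
    return "precise_zoom" if t_precise <= t_threshold else "fast_zoom"
```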
In the description so far, an example has been given in which the imaging device is configured so that the interchangeable lens can be attached and detached, but the present technology can also be suitably applied to a case where the imaging device takes the form of a device with an integrated lens.
In the description so far, AF has been performed by the image plane phase difference method as an example, but the present technology is not limited to cases where AF is performed by the image plane phase difference method, and can be widely and suitably applied to cases where AF is performed by a phase difference method.
In the description so far, an example has been given in which the information processing device that determines the breathing correction method is configured as an imaging device provided with an imaging unit, but it is not essential that the information processing device be provided with an imaging unit.
<4. Summary of Embodiments>
As described above, the information processing devices of the embodiments (the imaging devices 2, 2A, and 2A') include a correction method determination unit (F5) that determines, based on information on the movement amount of the focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are methods of breathing correction, should be used for the breathing correction.
According to the above configuration, which of the first and second correction methods should be used for the breathing correction is determined based on the information on the movement amount of the focus lens group or on the scene recognition information. If the correction method is determined based on the information on the movement amount of the focus lens group, then, for example in a case where the first and second correction methods are respectively a breathing correction method by driving the zoom lens group and a breathing correction method by trimming the captured image, the time required for breathing correction when the correction method by driving the zoom lens group is adopted can be estimated from the information on the movement amount of the focus lens group, and the breathing correction method by trimming can be used when the time required for the correction is long. This makes it possible to perform breathing correction while suppressing, as far as possible, the use of trimming, which causes image quality deterioration.
If the correction method is determined based on the scene recognition information, then, for example in a case where the first and second correction methods are respectively a breathing correction method by driving the zoom lens group and a breathing correction method by trimming the captured image, it becomes possible, in a specific scene where silence is required, to perform breathing correction by trimming without performing breathing correction by zoom lens driving, which generates operating noise associated with lens movement.
Accordingly, with the above configuration, the breathing correction method to be executed can be appropriately determined in accordance with the situation of the imaging device.
In the information processing devices of the embodiments, one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving the zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming the captured image.
This makes it possible to determine the appropriate correction method in accordance with the situation of the imaging device, between the zoom lens correction method, which is advantageous in terms of image quality, and the trimming correction method, which causes image quality deterioration but is advantageous in terms of correction speed.
Further, in the information processing devices of the embodiments, the correction method determination unit acquires, based on the information on the movement amount of the focus lens group, the zoom lens movement amount required for breathing correction by the zoom lens correction method, and determines, based on the zoom lens movement amount, which of the zoom lens correction method and the trimming correction method should be used for the breathing correction.
The zoom lens movement amount required for breathing correction by the zoom lens correction method correlates with the correction time when breathing correction is performed by the zoom lens correction method.
Accordingly, with the above configuration, the correction method can be switched appropriately based on the zoom lens movement amount required for breathing correction, for example by performing correction by the trimming correction method when the zoom lens movement amount is large (when the correction time of the zoom lens correction method is long).
Furthermore, in the information processing devices of the embodiments, the correction method determination unit performs control so that breathing correction by the zoom lens correction method is performed when the zoom lens movement amount is equal to or less than a predetermined threshold value, and so that breathing correction by the trimming correction method is performed when the zoom lens movement amount exceeds the threshold value.
This makes it possible to perform breathing correction by the trimming correction method in the case where the correction time of the zoom lens correction method is estimated to be long from the zoom lens movement amount for correction.
Accordingly, breathing correction can be performed while suppressing, as far as possible, the use of trimming, which causes image quality deterioration.
In the information processing devices of the embodiments, the correction method determination unit acquires, based on the zoom lens movement amount, the correction time for the case where breathing correction is performed by the zoom lens correction method, and determines, based on the correction time, which of the zoom lens correction method and the trimming correction method should be used for the breathing correction (see FIG. 12).
This makes it possible to switch the correction method appropriately based on the information on the correction time for the case where breathing correction is performed, for example by performing breathing correction by the trimming correction method in the case where the correction time of the zoom lens correction method is long.
Further, in the information processing devices of the embodiments, the correction method determination unit performs control so that breathing correction by the zoom lens correction method is performed when the correction time is equal to or less than a predetermined threshold value (Tth), and so that breathing correction by the trimming correction method is performed when the correction time exceeds the threshold value (see FIG. 12).
This makes it possible to perform breathing correction by the trimming correction method in the case where the correction time of the zoom lens correction method is determined to be long.
Accordingly, breathing correction can be performed while suppressing, as far as possible, the use of trimming, which causes image quality deterioration.
Furthermore, in the information processing devices of the embodiments, the correction method determination unit acquires, based on the information on the movement amount of the focus lens group, the focus lens movement amount, which is the movement amount of the focus lens group, and determines, based on the focus lens movement amount, which of the zoom lens correction method and the trimming correction method should be used for the breathing correction.
The focus lens movement amount correlates with the correction time when breathing correction is performed by the zoom lens correction method.
Accordingly, with the above configuration, the appropriate correction method can be determined based on the focus lens movement amount, for example by performing correction by the trimming correction method when the focus lens movement amount is large (when the correction time of the zoom lens correction method is long).
 また、実施形態の情報処理装置においては、補正手法判定部は、フォーカスレンズ群の移動量に係る情報と、トリミング補正手法によるブリージング補正を行うためのトリミング倍率特性を示すトリミング倍率特性情報(トリミング倍率テーブルJ4)とに基づいて、トリミング補正手法によるブリージング補正を行う場合のトリミング倍率を取得し、該トリミング倍率に基づき、ズームレンズ補正手法とトリミング補正手法の何れの手法によるブリージング補正を行うかについての判定を行っている。
 トリミング補正手法によるブリージング補正を行う場合におけるトリミング倍率としても、ズームレンズ補正手法によるブリージング補正を行う場合の補正時間と相関する。
 従って、上記構成によれば、例えばブリージング補正のためのトリミング倍率が大きい場合(ズームレンズ補正手法での補正時間が長い場合)にはトリミング補正手法での補正が行われるようにする等、ブリージング補正のためのトリミング倍率に基づいた適切な補正手法の判定を行うことができる。
Further, in the information processing apparatus of the embodiment, the correction method determination unit includes information related to the amount of movement of the focus lens group, and trimming magnification characteristic information (trimming magnification Based on Table J4), a trimming magnification for performing breathing correction by a trimming correction method is acquired, and based on the trimming magnification, it is determined whether to perform breathing correction by either the zoom lens correction method or the trimming correction method. making judgments.
The trimming magnification in the case of performing the breathing correction by the trimming correction method also correlates with the correction time in the case of performing the breathing correction by the zoom lens correction method.
Therefore, according to the above configuration, for example, when the trimming magnification for breathing correction is large (when the correction time by the zoom lens correction method is long), the correction by the trimming correction method is performed. It is possible to determine an appropriate correction method based on the trimming magnification for .
Furthermore, in the information processing apparatus of the embodiment, even when the condition for executing correction by the trimming correction method based on the information related to the movement amount of the focus lens group is satisfied, the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method if the trimming magnification required for breathing correction by the trimming correction method exceeds a threshold value (Rth) (see FIG. 20).
As a result, even when the trimming correction method would be preferable in terms of correction time, correction is performed by the zoom lens correction method when the image quality deterioration due to trimming is predicted to be large.
Therefore, while minimizing the use of trimming, which causes image quality deterioration, the image quality deterioration that occurs when correction by trimming is performed can be kept within an allowable range.
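A minimal sketch of the quality guard described above, under assumed names and values, might look as follows; the value used here for Rth is a hypothetical numeric limit on the trimming magnification.

# Illustrative sketch only: even when the trimming condition holds in terms
# of correction time, cap the allowable trimming magnification at Rth and
# fall back to the zoom lens correction above it.

R_TH = 1.10  # hypothetical upper limit on the trimming magnification

def choose_with_quality_guard(correction_time: float, trim_mag: float,
                              t_th: float = 0.1) -> str:
    if correction_time <= t_th:
        return "zoom_lens"   # optical correction is fast enough
    if trim_mag > R_TH:
        return "zoom_lens"   # trimming would degrade image quality too much
    return "trimming"        # trimming is fast and quality is acceptable

print(choose_with_quality_guard(0.5, 1.05))  # "trimming"
print(choose_with_quality_guard(0.5, 1.20))  # "zoom_lens" (quality guard)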
Furthermore, in the information processing apparatus of the embodiment, even when the condition for executing correction by the zoom lens correction method based on the information related to the movement amount of the focus lens group is satisfied, the correction method determination unit performs control so that breathing correction is performed by the trimming correction method if the scene is determined to be a specific scene based on the scene recognition information (see FIG. 21).
As a result, even when the zoom lens correction method would be preferable in terms of correction time, in a specific scene where quietness is required, correction can be performed by the trimming correction method, which does not generate the operating noise associated with lens movement.
Therefore, the breathing correction method can be switched appropriately according to the situation of the imaging apparatus.
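As an illustrative sketch only, the scene-based override could be expressed as follows; the "quiet scene" flag and the threshold value are assumptions introduced for explanation.

# Illustrative sketch only: override the time-based decision in a quiet
# scene, where the drive noise of the zoom lens is undesirable.

def choose_with_scene_override(correction_time: float, is_quiet_scene: bool,
                               t_th: float = 0.1) -> str:
    if is_quiet_scene:
        return "trimming"    # avoid zoom-lens drive noise in the quiet scene
    return "zoom_lens" if correction_time <= t_th else "trimming"

print(choose_with_scene_override(0.05, is_quiet_scene=False))  # "zoom_lens"
print(choose_with_scene_override(0.05, is_quiet_scene=True))   # "trimming"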
Further, the information processing apparatus of the embodiment (imaging apparatus 2A, 2A') includes a correction control unit (first correction control unit F3A) that, during breathing correction by the zoom lens correction method, controls the moving speed of at least one of the zoom lens group and the focus lens group based on the moving speed of the focus lens group and the moving speed of the zoom lens group (see FIG. 19 and the like).
This makes it possible to suppress the angle-of-view variation that arises from the difference in moving speed between the focus lens group and the zoom lens group during breathing correction by the zoom lens correction method.
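As one possible illustration of such speed control, under the simplifying assumption that the angle-of-view variation is reduced when both lens groups complete their travel at the same time, the zoom lens speed could be chosen as follows; the model and the names are hypothetical and not taken from the disclosure.

# Illustrative sketch only: match the zoom lens drive speed to the focus lens
# motion so that both groups finish moving together during the correction.

def matched_zoom_speed(focus_movement_mm: float, focus_speed_mm_s: float,
                       zoom_movement_mm: float) -> float:
    focus_time = focus_movement_mm / focus_speed_mm_s
    # Drive the zoom lens so its travel spans the same duration as the focus travel.
    return zoom_movement_mm / focus_time

# Example: a 10 mm focus move at 20 mm/s with a 0.5 mm compensating zoom move
# would call for a zoom drive speed of 1.0 mm/s.
print(matched_zoom_speed(10.0, 20.0, 0.5))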
Furthermore, in the information processing apparatus of the embodiment, the correction method determination unit determines, based on the scene recognition information, whether breathing correction should be performed by the first correction method or by the second correction method (see FIG. 22).
Thus, for example, in a case where the first and second correction methods are a breathing correction method by driving the zoom lens group and a breathing correction method by trimming the captured image, respectively, it is possible in a specific scene where quietness is required to perform breathing correction by trimming, without performing breathing correction by driving the zoom lens, which would generate the operating noise associated with lens movement.
Therefore, the breathing correction method can be determined appropriately according to the situation of the imaging apparatus.
Furthermore, in the information processing apparatus of the embodiment, one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving the zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming the captured image; the correction method determination unit performs control so that breathing correction is performed by the trimming correction method when the scene is determined to be a specific scene based on the scene recognition information, and so that breathing correction is performed by the zoom lens correction method when the scene is determined not to be the specific scene (see FIG. 22).
This makes it possible, in a specific scene where quietness is required, to perform correction by the trimming correction method instead of by the zoom lens correction method, which generates the operating noise associated with lens movement.
Therefore, the breathing correction method can be switched appropriately according to the situation of the imaging apparatus.
Further, in the information processing apparatus of the embodiment, the scene recognition information is information based on at least one of image recognition processing, a detection signal from an illuminance sensor, a sound pickup signal from a microphone, and a detection signal from a motion sensor.
Image recognition processing, a detection signal from an illuminance sensor, a sound pickup signal from a microphone, and a detection signal from a motion sensor are each information from which the imaging scene can be recognized.
Therefore, it is possible to appropriately determine whether or not the scene is a specific scene, and to appropriately determine the breathing correction method according to the situation of the imaging apparatus.
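A purely illustrative sketch of deriving a specific-scene judgment from such sources is given below; the thresholds, labels, and combination rule are assumptions introduced for explanation and are not taken from the disclosure.

# Illustrative sketch only: classify a quiet "specific scene" from simple
# sensor cues; an image recognition label can be one of the inputs.

def is_specific_scene(mic_level_db: float, illuminance_lux: float,
                      motion_magnitude: float, image_label: str = "") -> bool:
    quiet_audio = mic_level_db < 30.0        # little ambient sound picked up
    dim_venue = illuminance_lux < 50.0       # e.g. a dimmed hall or stage
    camera_still = motion_magnitude < 0.1    # tripod-like, deliberate framing
    labelled_quiet = image_label in {"concert", "ceremony"}
    return labelled_quiet or (quiet_audio and (dim_venue or camera_still))

print(is_specific_scene(25.0, 30.0, 0.02))   # True: quiet, dim, still
print(is_specific_scene(70.0, 800.0, 1.5))   # False: noisy daylight scene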
Further, in the information processing apparatus of the embodiment, processing is performed to acquire the lens characteristic information used for breathing correction from the lens device.
Accordingly, in order to realize appropriate breathing correction according to the lens characteristics, it suffices to provide, as communication means, at least a communication means with the lens device.
Therefore, the number of device components and the cost can be reduced.
Furthermore, in the information processing apparatus of the embodiment, processing is performed to acquire the lens characteristic information used for breathing correction from a cloud server (see FIG. 23).
This eliminates the need to store the lens characteristic information in the lens device in order to realize appropriate breathing correction according to the lens characteristics.
Therefore, the memory capacity of the lens device can be reduced.
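As an illustration only, acquiring the lens characteristic information from either source could be structured as follows; the class and method names are hypothetical stand-ins and are not APIs from the disclosure.

# Illustrative sketch only: obtain the lens characteristic tables used for
# breathing correction either from the attached lens device or from a cloud
# server keyed by the lens identification information.

class LensDevice:
    def identification(self) -> str:
        return "LENS-001"

    def read_characteristics(self) -> dict:
        # Tables held in the lens's own memory (e.g. angle-of-view and
        # trimming magnification tables).
        return {"angle_of_view_table": [], "trimming_magnification_table": []}

class CloudServer:
    def fetch_characteristics(self, lens_id: str) -> dict:
        # Tables looked up on the server, so the lens need not store them.
        return {"lens_id": lens_id,
                "angle_of_view_table": [],
                "trimming_magnification_table": []}

def get_lens_characteristics(lens, cloud=None) -> dict:
    if cloud is not None:
        return cloud.fetch_characteristics(lens.identification())
    return lens.read_characteristics()

print(get_lens_characteristics(LensDevice()))                 # from the lens
print(get_lens_characteristics(LensDevice(), CloudServer()))  # from the cloud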
Furthermore, the information processing apparatus of the embodiment is configured as an imaging apparatus including an imaging unit.
That is, the information processing apparatus configured as an imaging apparatus determines, based on the information related to the movement amount of the focus lens group or on the scene recognition information, which of the first and second correction methods should be used for breathing correction.
Therefore, it is possible to realize an imaging apparatus that can appropriately determine the breathing correction method according to the situation of the apparatus itself.
Further, the information processing method of the embodiment is an information processing method in which an information processing apparatus determines, based on information related to the movement amount of the focus lens group or on scene recognition information, which of the first correction method and the second correction method, which are breathing correction methods, should be used for breathing correction.
According to such an information processing method, the same operations and effects as those of the information processing apparatus of the embodiment described above can be obtained.
Here, as an embodiment, a program that causes, for example, a CPU, a DSP (Digital Signal Processor), or the like, or a device including these, to execute the processing of the correction method determination unit F5 described with reference to FIG. 12, FIG. 22, and the like can be considered.
That is, the program of the embodiment is a program readable by a computer device, and causes the computer device to realize a function of determining, based on information related to the movement amount of the focus lens group or on scene recognition information, which of the first correction method and the second correction method, which are breathing correction methods, should be used for breathing correction.
With such a program, the function of the above-described correction method determination unit F5 can be realized in a device such as the imaging apparatus 2.
The program as described above can be recorded in advance in an HDD as a recording medium built into equipment such as a computer device, or in a ROM or the like in a microcomputer having a CPU.
Alternatively, the program can be stored (recorded) temporarily or permanently in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card. Such a removable recording medium can be provided as so-called package software.
In addition to being installed from a removable recording medium into a personal computer or the like, such a program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
Moreover, such a program is suitable for widely providing the correction method determination unit F5 of the embodiment. For example, by downloading the program to a personal computer, a portable information processing device, a mobile phone, a game device, a video device, a PDA (Personal Digital Assistant), or the like, that device can be made to function as a device that realizes the processing of the correction method determination unit F5 of the present disclosure.
Note that the effects described in this specification are merely examples and are not limited thereto, and other effects may also be provided.
<5. Present technology>
Note that the present technology can also adopt the following configurations.
(1)
An information processing apparatus including a correction method determination unit that determines, based on information related to a movement amount of a focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for breathing correction.
(2)
The information processing apparatus according to (1), in which one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving a zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming a captured image.
(3)
The information processing apparatus according to (2), in which the correction method determination unit acquires, based on the information related to the movement amount of the focus lens group, a zoom lens movement amount required for breathing correction by the zoom lens correction method, and determines, based on the zoom lens movement amount, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
(4)
The information processing apparatus according to (3), in which the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method when the zoom lens movement amount is equal to or less than a predetermined threshold value, and so that breathing correction is performed by the trimming correction method when the zoom lens movement amount exceeds the threshold value.
(5)
The information processing apparatus according to (3), in which the correction method determination unit acquires, based on the zoom lens movement amount, a correction time in a case where breathing correction is performed by the zoom lens correction method, and determines, based on the correction time, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
(6)
The information processing apparatus according to (5), in which the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method when the correction time is equal to or less than a predetermined threshold value, and so that breathing correction is performed by the trimming correction method when the correction time exceeds the threshold value.
(7)
The information processing apparatus according to (2), in which the correction method determination unit acquires, based on the information related to the movement amount of the focus lens group, a focus lens movement amount that is the movement amount of the focus lens group, and determines, based on the focus lens movement amount, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
(8)
The information processing apparatus according to (2), in which the correction method determination unit acquires, based on the information related to the movement amount of the focus lens group and on trimming magnification characteristic information indicating a trimming magnification characteristic for performing breathing correction by the trimming correction method, a trimming magnification in a case where breathing correction is performed by the trimming correction method, and determines, based on the trimming magnification, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
(9)
The information processing apparatus according to any one of (2) to (7), in which the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method in a case where the trimming magnification required for breathing correction by the trimming correction method exceeds a threshold value, even when a condition for executing correction by the trimming correction method based on the information related to the movement amount of the focus lens group is satisfied.
(10)
The information processing apparatus according to any one of (2) to (7) and (9), in which the correction method determination unit performs control so that breathing correction is performed by the trimming correction method in a case where a specific scene is determined based on the scene recognition information, even when a condition for executing correction by the zoom lens correction method based on the information related to the movement amount of the focus lens group is satisfied.
(11)
The information processing apparatus according to any one of (2) to (9), further including a correction control unit that, during breathing correction by the zoom lens correction method, controls a moving speed of at least one of the zoom lens group and the focus lens group based on a moving speed of the focus lens group and a moving speed of the zoom lens group.
(12)
The information processing apparatus according to (1) or (2), in which the correction method determination unit determines, based on the scene recognition information, which of the first correction method and the second correction method should be used for breathing correction.
(13)
The information processing apparatus according to (12), in which one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving a zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming a captured image, and the correction method determination unit performs control so that breathing correction is performed by the trimming correction method when a specific scene is determined based on the scene recognition information, and so that breathing correction is performed by the zoom lens correction method when the specific scene is not determined.
(14)
The information processing apparatus according to (12), in which the scene recognition information is information based on at least one of image recognition processing, a detection signal from an illuminance sensor, a sound pickup signal from a microphone, and a detection signal from a motion sensor.
(15)
The information processing apparatus according to any one of (1) to (14), in which processing of acquiring lens characteristic information used for the breathing correction from a lens device is performed.
(16)
The information processing apparatus according to any one of (1) to (14), in which processing of acquiring lens characteristic information used for the breathing correction from a cloud server is performed.
(17)
The information processing apparatus according to any one of (1) to (16), configured as an imaging apparatus including an imaging unit.
(18)
An information processing method in which an information processing apparatus determines, based on information related to a movement amount of a focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for breathing correction.
(19)
A program readable by a computer device, the program causing the computer device to realize a function of determining, based on information related to a movement amount of a focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for breathing correction.
1, 1A, 1A' Interchangeable lens
2, 2A, 2A' Imaging apparatus (body)
11 Mount section
12, 12A Lens-side control section
13 Zoom lens
14 Camera shake correction lens
15 Aperture
16 Focus lens
17 Detection section
21 Zoom lens drive section
22 Camera shake control section
23 Aperture control section
24 Focus lens drive section
31 Operation section
32 Memory
33 Power supply control section
51 Mount section
52, 52A, 52A' Body-side control section
53 Shutter
54 Shutter control section
55 Image sensor
56 ADC
57 Frame memory
58 Image signal processing section
59 Recording section
60 Recording medium
61 Display section
62 Memory
63 Power supply control section
64 Power supply section
65 Operation section
70 Communication section
80 Cloud server
J1 Focus movement trajectory information
J2 Angle-of-view table
J3, J3A Lens group movement speed table
J4 Trimming magnification table
J5 Focus group angle-of-view variation rate table
J6 Zoom group angle-of-view variation rate table
J7 Lens identification information
F1, F1A Information acquisition processing unit
F2 AF processing unit
F3, F3A First correction control unit
F4 Second correction control unit
F5 Correction method determination unit

Claims (19)

  1.  An information processing apparatus comprising a correction method determination unit that determines, based on information related to a movement amount of a focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for breathing correction.
  2.  The information processing apparatus according to claim 1, wherein one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving a zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming a captured image.
  3.  The information processing apparatus according to claim 2, wherein the correction method determination unit acquires, based on the information related to the movement amount of the focus lens group, a zoom lens movement amount required for breathing correction by the zoom lens correction method, and determines, based on the zoom lens movement amount, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
  4.  The information processing apparatus according to claim 3, wherein the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method when the zoom lens movement amount is equal to or less than a predetermined threshold value, and so that breathing correction is performed by the trimming correction method when the zoom lens movement amount exceeds the threshold value.
  5.  The information processing apparatus according to claim 3, wherein the correction method determination unit acquires, based on the zoom lens movement amount, a correction time in a case where breathing correction is performed by the zoom lens correction method, and determines, based on the correction time, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
  6.  The information processing apparatus according to claim 5, wherein the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method when the correction time is equal to or less than a predetermined threshold value, and so that breathing correction is performed by the trimming correction method when the correction time exceeds the threshold value.
  7.  The information processing apparatus according to claim 2, wherein the correction method determination unit acquires, based on the information related to the movement amount of the focus lens group, a focus lens movement amount that is the movement amount of the focus lens group, and determines, based on the focus lens movement amount, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
  8.  The information processing apparatus according to claim 2, wherein the correction method determination unit acquires, based on the information related to the movement amount of the focus lens group and on trimming magnification characteristic information indicating a trimming magnification characteristic for performing breathing correction by the trimming correction method, a trimming magnification in a case where breathing correction is performed by the trimming correction method, and determines, based on the trimming magnification, which of the zoom lens correction method and the trimming correction method should be used for breathing correction.
  9.  The information processing apparatus according to claim 2, wherein the correction method determination unit performs control so that breathing correction is performed by the zoom lens correction method in a case where the trimming magnification required for breathing correction by the trimming correction method exceeds a threshold value, even when a condition for executing correction by the trimming correction method based on the information related to the movement amount of the focus lens group is satisfied.
  10.  The information processing apparatus according to claim 2, wherein the correction method determination unit performs control so that breathing correction is performed by the trimming correction method in a case where a specific scene is determined based on the scene recognition information, even when a condition for executing correction by the zoom lens correction method based on the information related to the movement amount of the focus lens group is satisfied.
  11.  The information processing apparatus according to claim 2, further comprising a correction control unit that, during breathing correction by the zoom lens correction method, controls a moving speed of at least one of the zoom lens group and the focus lens group based on a moving speed of the focus lens group and a moving speed of the zoom lens group.
  12.  The information processing apparatus according to claim 1, wherein the correction method determination unit determines, based on the scene recognition information, which of the first correction method and the second correction method should be used for breathing correction.
  13.  The information processing apparatus according to claim 12, wherein one of the first and second correction methods is a zoom lens correction method in which breathing correction is performed by driving a zoom lens group, and the other is a trimming correction method in which breathing correction is performed by trimming a captured image, and the correction method determination unit performs control so that breathing correction is performed by the trimming correction method when a specific scene is determined based on the scene recognition information, and so that breathing correction is performed by the zoom lens correction method when the specific scene is not determined.
  14.  The information processing apparatus according to claim 12, wherein the scene recognition information is information based on at least one of image recognition processing, a detection signal from an illuminance sensor, a sound pickup signal from a microphone, and a detection signal from a motion sensor.
  15.  The information processing apparatus according to claim 1, wherein processing of acquiring lens characteristic information used for the breathing correction from a lens device is performed.
  16.  The information processing apparatus according to claim 1, wherein processing of acquiring lens characteristic information used for the breathing correction from a cloud server is performed.
  17.  The information processing apparatus according to claim 1, configured as an imaging apparatus comprising an imaging unit.
  18.  An information processing method in which an information processing apparatus determines, based on information related to a movement amount of a focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for breathing correction.
  19.  A program readable by a computer device, the program causing the computer device to realize a function of determining, based on information related to a movement amount of a focus lens group or on scene recognition information, which of a first correction method and a second correction method, which are breathing correction methods, should be used for breathing correction.
PCT/JP2022/031703 2021-10-07 2022-08-23 Information processing device, information processing method, and program WO2023058347A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-165364 2021-10-07
JP2021165364 2021-10-07

Publications (1)

Publication Number Publication Date
WO2023058347A1 true WO2023058347A1 (en) 2023-04-13

Family

ID=85803380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/031703 WO2023058347A1 (en) 2021-10-07 2022-08-23 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2023058347A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001042199A (en) * 1999-07-28 2001-02-16 Fuji Photo Optical Co Ltd Lens device
JP2004064674A (en) * 2002-07-31 2004-02-26 Ricoh Co Ltd Imaging unit and zoom control method therefor
JP2005176015A (en) * 2003-12-12 2005-06-30 Canon Inc Imaging device and method therefor
WO2019065260A1 (en) * 2017-09-27 2019-04-04 ソニー株式会社 Information processing device, information processing method, and program, and interchangeable lens
JP2019109270A (en) * 2017-12-15 2019-07-04 キヤノン株式会社 Imaging device
JP2019207334A (en) * 2018-05-29 2019-12-05 キヤノン株式会社 Lens device, imaging device, and imaging system


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22878218

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023552734

Country of ref document: JP