WO2023221141A1 - Imaging device and focus control program - Google Patents

Imaging device and focus control program

Info

Publication number
WO2023221141A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera unit
focus
optical system
image
focus control
Prior art date
Application number
PCT/CN2022/094276
Other languages
English (en)
French (fr)
Inventor
小黒祐介
Original Assignee
北京小米移动软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小米移动软件有限公司
Priority to KR1020237037882A priority Critical patent/KR20230167074A/ko
Priority to CN202280001794.6A priority patent/CN117546475A/zh
Priority to PCT/CN2022/094276 priority patent/WO2023221141A1/zh
Publication of WO2023221141A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 3/00 Simple or compound lenses
    • G02B 3/12 Fluid-filled or evacuated lenses
    • G02B 3/14 Fluid-filled or evacuated lenses of variable focal length
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • the present invention relates to a photographing device and a focus control program.
  • an imaging device that performs automatic focus control using a phase difference signal obtained on the image plane of an imaging element.
  • An example of an imaging element that outputs such a phase difference signal is one in which normal pixels that output an image signal dedicated to generating an image and phase pixels that output a phase difference signal dedicated to autofocus are two-dimensionally arranged (for example, refer to Patent Document 1).
  • Compared with an imaging element in which every pixel is pupil-divided for each microlens, having two photoelectric conversion sections and being able to switch between outputting an image signal and outputting a phase difference signal, such an imaging element has the advantage of lower manufacturing cost.
  • Patent document 1 Japanese Patent Application Publication No. 2016-90785
  • Since phase pixels exclusively output phase difference signals and do not output image signals for generating images, the pixel values at the addresses where the phase pixels are arranged are generated by interpolation processing based on the pixel values of surrounding pixels. Therefore, if the ratio of phase pixels among the pixels forming the imaging element is increased, the quality of the generated image is reduced.
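The interpolation described above can be sketched as follows. This is a minimal illustration, not the patent's actual method: real devices typically use direction-aware interpolation, while here a phase-pixel address (marked as NaN) is simply filled with the average of its valid neighbours. The function name and the NaN marker are assumptions for the sketch.

```python
import numpy as np

def interpolate_phase_pixel(frame: np.ndarray, row: int, col: int) -> float:
    """Estimate the image value at a phase-pixel address by averaging the
    valid (normal-pixel) neighbours in the surrounding 3x3 window."""
    h, w = frame.shape
    neighbours = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue  # skip the phase pixel itself
            r, c = row + dr, col + dc
            if 0 <= r < h and 0 <= c < w and not np.isnan(frame[r, c]):
                neighbours.append(frame[r, c])
    return float(np.mean(neighbours))

# A tiny frame where the centre pixel is a phase pixel (marked NaN).
frame = np.array([[10.0, 10.0, 10.0],
                  [10.0, np.nan, 10.0],
                  [10.0, 10.0, 10.0]])
print(interpolate_phase_pixel(frame, 1, 1))  # -> 10.0
```

The more phase pixels there are, the more image values must be invented this way, which is why the text argues their ratio should stay small.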
  • For this reason, the phase pixels are discretely arranged so as to be surrounded by the normal pixels, and their ratio is preferably small.
  • However, when phase pixels are arranged in this way, it is difficult to detect a phase difference signal reflecting a subject that is relatively small relative to the entire image captured by the imaging element, and the subject may not be brought into focus.
  • The present invention has been made to solve this problem, and provides a photographing device and the like that can accurately autofocus on the main subject even when generating a wide-angle image in which the main subject appears relatively small.
  • The imaging device in the first aspect of the present invention includes: a first camera unit having a first optical system; a second camera unit having a second optical system facing the same direction as the first camera unit; and a focus control unit that performs focus control of the first camera unit and the second camera unit.
  • The first camera unit and the second camera unit each have an imaging element in which ordinary pixels that output an image signal for forming an image, and phase pixels that are discretely arranged surrounded by the ordinary pixels and output a phase difference signal for detecting focus, are two-dimensionally arranged. A wide-angle image is formed using the image signal output from the imaging element of the first camera unit, and the focus control section performs focus control of the first optical system with reference to second defocus information acquired from the phase difference signal output from the imaging element of the second camera unit.
  • the focus control program in the second aspect of the present invention performs focus control on the first camera unit and the second camera unit of the photographing device.
  • The photographing device includes: a first camera unit having a first optical system; and a second camera unit having a second optical system facing the same direction as the first camera unit. The first camera unit and the second camera unit each have an imaging element in which ordinary pixels that output an image signal for forming an image, and phase pixels that are discretely arranged surrounded by the ordinary pixels and output a phase difference signal for detecting focus, are two-dimensionally arranged, and a wide-angle image is formed using the image signal output from the imaging element of the first camera unit.
  • The program causes a computer to execute the following steps: an acquisition step of acquiring second defocus information based on the phase difference signal output from the imaging element of the second camera unit; and a driving step of driving the focus lens of the first optical system with reference to the second defocus information.
  • According to the present invention, it is possible to provide a photographing device and the like that can accurately autofocus on the main subject even when generating a wide-angle image in which the main subject appears relatively small.
  • FIG. 1 is a diagram showing the appearance of the imaging device according to this embodiment.
  • FIG. 2 is a diagram showing the main hardware configuration of the imaging device.
  • FIG. 3 is a diagram illustrating the pixel arrangement of the imaging element.
  • FIG. 4 is a diagram showing an example of a scene to be photographed.
  • FIG. 5 is a diagram showing an example of an image obtained from the first camera unit when autofocus control is performed by a phase difference signal from the first camera unit.
  • FIG. 6 is a diagram showing an example of an image obtained from the second camera unit when autofocus control is performed on the same scene by a phase difference signal from the second camera unit.
  • FIG. 7 is a diagram showing an example of an image obtained from the first camera unit when autofocus control of the first optical system is performed with reference to second defocus information.
  • FIG. 8 is a diagram explaining the correspondence relationship between two focus areas.
  • FIG. 9 is a diagram showing a processing procedure until a wide-angle image is generated.
  • FIG. 1 is a diagram showing the appearance of the imaging device 100 according to this embodiment.
  • (A) of FIG. 1 is a diagram mainly showing the first side of the imaging device 100
  • (B) of FIG. 1 is a diagram mainly showing the second side opposite to the first side.
  • the imaging device 100 of this embodiment is a so-called smartphone.
  • it is a smartphone that also functions as an imaging device.
  • the photographing function related to the present invention will be described, and other functions of the smartphone that utilize image data generated by photographing will be omitted.
  • the imaging device 100 is described using a smartphone as an example.
  • However, the imaging device may be a stand-alone camera, or may be incorporated into a device with an imaging function, such as a tablet terminal.
  • the imaging device 100 includes a first camera unit 110 and a second camera unit 120 arranged in the same direction on the first surface side.
  • the first camera unit 110 is a camera unit for generating a wide-angle image.
  • the second camera unit 120 is a camera unit for generating telephoto images.
  • When the user wants to obtain a wide-angle image, the user designates the first camera unit for shooting, and when the user wants to obtain a telephoto image, the user designates the second camera unit for shooting.
  • the first camera unit 110 and the second camera unit 120 are arranged parallel to the long side of the photographing device 100 in the figure.
  • However, the arrangement of the two camera units is not limited to this; for example, they may be arranged along a straight line obliquely intersecting the long side.
  • the positions of the first camera unit 110 and the second camera unit 120 may be opposite to those in the figure.
  • the imaging device 100 includes a display 130 on the second side.
  • the display 130 is, for example, a display device using an organic EL (Electro Luminescence; electroluminescence) panel, and displays a real-time image of the subject before shooting (live view display) or displays an image after shooting.
  • a camera unit for selfies that is independent of the first camera unit 110 and the second camera unit 120 may be provided on the second surface side.
  • a shutter button 161 is provided on the side of the imaging device 100 .
  • the user can provide a photographing instruction to the photographing device 100 by pressing the shutter button 161 .
  • a touch panel 162 is provided to overlap the display 130 .
  • the user can also provide a shooting instruction to the photographing device 100 by tapping the shutter button displayed on the display 130 .
  • users can also tap any part of the subject image displayed in live view to designate a certain area including that part as the focus area.
  • the user can also switch between the first camera unit 110 and the second camera unit 120 or select displayed menu items through contact actions such as tapping.
  • FIG. 2 is a diagram showing the main hardware configuration of the imaging device 100 .
  • the imaging device 100 is composed of the first camera unit 110 , the second camera unit 120 , and the display 130 , as well as a system control unit 150 that controls these, and peripheral elements that cooperate with the system control unit 150 .
  • the first camera unit 110 is a camera unit for generating wide-angle images, and mainly includes a first optical system 111 , a first drive mechanism 112 , a first imaging element 113 and a first analog front end (AFE) 114 .
  • the first optical system 111 is an optical system for imaging the incident subject light beam on the imaging surface of the first imaging element 113 . It is represented by one lens in the figure, but it is generally composed of multiple lenses, and at least part of it is a focusing lens that can advance and retreat along the optical axis direction.
  • the first drive mechanism 112 is a drive mechanism for moving the focus lens of the first optical system 111 along the optical axis direction, and includes an actuator that operates in accordance with instructions from the system control unit 150 .
  • the first imaging element 113 is, for example, a CMOS image sensor.
  • the first imaging element 113 will be described in detail later.
  • the first imaging element 113 transmits pixel signals (image signals and phase difference signals to be described later) as output signals to the first analog front end 114 in accordance with instructions from the system control unit 150 .
  • the first analog front end 114 adjusts the level of the pixel signal according to the gain instructed by the system control unit 150 , A/D converts the pixel signal into digital data, and transmits it to the working memory 151 .
  • the second camera unit 120 is a camera unit for generating telephoto images, and mainly includes a second optical system 121 , a second driving mechanism 122 , a second imaging element 123 and a second analog front end (AFE) 124 .
  • the second optical system 121 is an optical system for imaging the incident subject light beam on the imaging surface of the second imaging element 123 .
  • The second optical system 121, like the first optical system 111, is generally composed of a plurality of lenses, at least a part of which is a focus lens that can advance and retreat along the optical axis direction.
  • the second drive mechanism 122 is a drive mechanism for moving the focus lens of the second optical system 121 along the optical axis direction, and includes an actuator that operates in accordance with instructions from the system control unit 150 .
  • the second imaging element 123 is, for example, a CMOS image sensor.
  • the second imaging element 123 and the first imaging element 113 will be described in detail later.
  • the second imaging element 123 transmits the pixel signal as an output signal to the second analog front end 124 according to the instruction of the system control unit 150 .
  • the second analog front end 124 adjusts the level of the pixel signal according to the gain instructed by the system control unit 150 , A/D converts the pixel signal into digital data, and transmits it to the working memory 151 .
  • The first optical system 111 and the second optical system 121 are both single-focus optical systems with a fixed focal length, but at least one of them may be a variable-focal-length optical system (zoom lens).
  • the focal length of the first optical system 111 may be set shorter than the focal length of the second optical system 121 .
  • the angle of view of the second optical system 121 is set to a telephoto angle of view compared to the angle of view of the first optical system 111 .
  • the system control unit 150 is a processor (CPU: Central Processing Unit) that directly or indirectly controls each element constituting the imaging device 100 .
  • The system control unit 150 functions as various functional control units according to the control program being executed. For example, when executing focus control of the first camera unit 110 and the second camera unit 120, it functions as a focus control unit, and when displaying a captured image on the display 130, it functions as a display control unit.
  • The imaging device 100 mainly includes a work memory 151 , an image processing unit 153 , an operation unit 160 , a storage unit 170 , and a communication interface 180 as peripheral elements that cooperate with the system control unit 150 .
  • the working memory 151 is a volatile high-speed memory, and is composed of, for example, SRAM (Static Random Access Memory).
  • The working memory 151 receives the pixel data successively converted by the first analog front end 114 and the second analog front end 124; if the data was converted from an image signal, it is stored collectively as one frame of frame data, and if it was converted from a phase difference signal, it is stored collectively as two waveform data.
  • the work memory 151 transfers frame data to the image processing unit 153 and waveform data to the system control unit 150 .
  • the work memory 151 is also appropriately used as a temporary storage area during the processing stage when the image processing unit 153 performs image processing or the processing stage when the system control unit 150 performs focus processing.
  • The image processing unit 153 is composed of, for example, an ASIC (Application Specific Integrated Circuit) dedicated to image processing. It performs various image processing, such as interpolation processing, on the received frame data to generate image data conforming to a predetermined format. If the generated image data is intended for storage, it is stored in the storage unit 170; if it is intended for display, it is displayed on the display 130.
  • the operation unit 160 is an input device including the shutter button 161 or the touch panel 162 , and is a member that is operated when the user provides an instruction to the imaging device 100 .
  • the operation unit 160 may further include a microphone.
  • the storage unit 170 is a non-volatile memory, and is composed of, for example, an SSD (Solid State Drive).
  • the storage unit 170 not only stores image data generated by photography, but also stores constants, variables, setting values, control programs, and the like required for operating the imaging device 100 .
  • The communication interface 180 may include a communication unit for a 5G line or a wireless LAN, and is used, for example, to transmit generated image data to an external device.
  • FIG. 3 is a diagram illustrating the pixel arrangement of the first imaging element 113 .
  • Since the second imaging element 123 is the same as the first imaging element 113, the first imaging element 113 will be described here as a representative.
  • The first imaging element 113 is an imaging element in which normal pixels 210 that output image signals exclusively for generating images and phase pixels 220 that output phase difference signals exclusively for detecting focus are two-dimensionally arranged.
  • the normal pixel 210 is a substantially square pixel in which one photoelectric conversion unit is arranged without displacement relative to one microlens. In the normal pixel 210, a color filter of either RGB is arranged between the microlens and the photoelectric conversion unit.
  • The phase pixel 220 is a pixel in which a substantially rectangular photoelectric conversion portion, shaped like one half of the photoelectric conversion portion of a normal pixel divided in two, is arranged shifted relative to one microlens.
  • the phase pixel 220 does not have a color filter disposed between the microlens and the photoelectric conversion unit.
  • All pixels adjacent to a phase pixel 220 are normal pixels 210; in other words, each phase pixel 220 is surrounded by normal pixels 210, and the phase pixels are arranged discretely from each other.
  • Alternatively, the photoelectric conversion part may have the same structure as that of the normal pixel 210, with a light-shielding mask having a shifted opening disposed between the microlens and the photoelectric conversion part; the shifted opening produces the same effect as shifting the photoelectric conversion part as described above.
  • The phase pixels 220 include first phase pixels 221 in which the photoelectric conversion part is shifted in a first direction (the lower side in the figure), and second phase pixels 222 in which the photoelectric conversion part is shifted in the direction opposite to the first direction (the upper side in the figure).
  • The first phase pixels 221 and the second phase pixels 222 are each arranged in a predetermined pattern. Specifically, a plurality of detection lines are set along the shift direction of the photoelectric conversion parts (the up-down direction in the figure); the first phase pixels 221 are arranged periodically on one side of each detection line (the right side in the figure), and the second phase pixels 222 are arranged with the same period, out of phase, on the other side (the left side in the figure).
  • the first phase waveform is formed based on the phase difference signal that is the output signal of the first phase pixel 221
  • the second phase waveform is formed based on the phase difference signal that is the output signal of the second phase pixel 222 .
  • The system control unit 150 calculates the defocus amount, which is the relative shift between the first phase waveform and the second phase waveform; the defocus direction, which is the direction of the shift; and the focus evaluation value, which indicates the degree of overlap of the two waveforms, thereby acquiring defocus information.
  • The system control unit 150 performs focus processing based on the acquired defocus information to focus on a predetermined subject. The focus processing will be described in detail later.
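The defocus calculation described above can be sketched as a simple waveform-alignment search. This is an illustrative approximation, not the patented algorithm: the relative shift between the two phase waveforms is found by minimizing the sum of absolute differences over candidate shifts, the sign of the shift plays the role of the defocus direction, and the residual at the best shift stands in for the focus evaluation value. All names are assumptions for the sketch.

```python
import numpy as np

def estimate_defocus(first_wave, second_wave, max_shift=8):
    """Find the relative shift that best aligns the two phase waveforms.
    Returns (shift, focus_evaluation): shift magnitude ~ defocus amount,
    shift sign ~ defocus direction, and a score ~ waveform overlap."""
    first = np.asarray(first_wave, dtype=float)
    second = np.asarray(second_wave, dtype=float)
    best_shift, best_score = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        score = np.abs(first - np.roll(second, shift)).sum()
        if score < best_score:
            best_shift, best_score = shift, score
    # A small residual means the waveforms overlap well: high confidence.
    focus_evaluation = 1.0 / (1.0 + best_score)
    return best_shift, focus_evaluation

# Two identical pulses offset by 3 samples.
base = np.zeros(32)
base[10:14] = 1.0
shifted = np.roll(base, 3)
shift, score = estimate_defocus(base, shifted)
print(shift)  # -> -3 (the second waveform must move back 3 samples to align)
```

In an actual device the shift would then be converted, via the optical system's geometry, into a lens movement amount and direction.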
  • As long as the first imaging element 113 and the second imaging element 123 are two-dimensionally arrayed imaging elements composed of normal pixels and of phase pixels discretely arranged so as to be surrounded by the normal pixels, they need not be identical imaging elements.
  • the respective imaging elements may differ from each other in, for example, the overall number of pixels or the arrangement pattern of phase pixels.
  • In the example of the figure, the detection lines are set along one axis in the up-down direction, but detection lines may also be set in the orthogonal direction (the left-right direction in the example of the figure), with phase pixels suited to those lines (pixels shifted to the right and pixels shifted to the left in the example of the figure) arranged along them. In this case as well, each phase pixel is preferably surrounded by normal pixels. If detection lines are set along two orthogonal axes in this manner, focusing accuracy can be improved.
  • As described above, since phase pixels exclusively output phase difference signals and do not output image signals for generating images, the pixel value at the address where a phase pixel is arranged is generally generated by interpolation processing based on the pixel values of surrounding pixels. Therefore, if the ratio of phase pixels among the pixels forming the imaging element is increased, the quality of the generated image is reduced.
  • the phase pixels are discretely arranged so as to be surrounded by ordinary pixels, and the ratio thereof is preferably small.
  • the total number of phase pixels 220 is less than 5% of the total number of pixels.
  • As a result, the focus control unit may be unable to obtain accurate defocus information reflecting a subject that is relatively small relative to the entire image captured by the imaging element.
  • In that case, the subject cannot be brought into focus. For example, in a scene where a slender main subject stands in front of a background containing many high-frequency components, it is easy to fail to focus on the main subject and to focus on the background side instead. This kind of failure is more likely to occur with an optical system having a wide angle of view.
  • Therefore, the imaging device 100 in this embodiment performs focus control of the first optical system 111 with reference to the second defocus information obtained from the phase difference signal output from the second imaging element 123 of the second camera unit 120.
  • FIG. 4 is a diagram showing an example of a scene to be photographed. Specifically, a case is shown in which the user attempts to photograph, with the photographing device 100, a scene in which the forest 920 spreads behind the person 910 as the main subject. Here, it is assumed that the user wants to focus on the person 910. The user can determine the composition while confirming the live view image continuously acquired by the first camera unit 110 or the second camera unit 120 and displayed on the display 130.
  • FIG. 5 is a diagram showing an example of an image obtained from the first camera unit 110 when autofocus control of the first optical system 111 is performed only by the phase difference signal from the first camera unit 110 .
  • the forest 920 as the background of the scene is a collection of many trees. Therefore, within the angle of view captured by the first camera unit 110, the image area (background area) of the forest 920 is an area with high spatial frequency.
  • the image area (main area) of the person 910 as the main subject only occupies a very small part of the angle of view captured by the first camera unit 110, and is an area with relatively low spatial frequency.
  • As a result, the focus control unit calculates defocus information for the background area, which has a high spatial frequency.
  • When the focus lens of the first optical system 111 is driven based on the defocus information calculated in this way, the first optical system 111 focuses on the forest 920 in the background, and in the wide-angle image 301 generated after this focusing, as shown in the figure, the person 910 as the main subject is blurred.
  • Moreover, since the phase pixels 220 arranged along the detection lines are discrete, the number of phase pixels 220 included in a small main area is small, and it is difficult for the focus control section to calculate defocus information from the waveform formed by the phase difference signals output by the phase pixels 220 included in the main area.
  • FIG. 6 is a diagram showing an example of an image obtained from the second camera unit 120 when autofocus control of the second optical system 121 is performed on the same scene by a phase difference signal from the second camera unit 120 .
  • the second optical system 121 of the second camera unit 120 has a telephoto viewing angle compared to the first optical system 111. Therefore, the image area (main area) of the person 910 accounts for a larger proportion of the entire area than in the case of FIG. 5 . Then, the number of phase pixels 220 included in the main area also increases, and more detailed parts of the person 910 are resolved, so the spatial frequency thereof becomes higher.
  • the focus control unit can calculate defocus information for the main area.
  • When the focus lens of the second optical system 121 is driven based on the defocus information calculated in this way, the second optical system 121 focuses on the person 910 as the main subject, and as shown in the figure, the telephoto image 302 generated after this focusing is an image in which the person 910 is in focus.
  • By referring to this result obtained by the second camera unit 120, the possibility that the first optical system 111 also focuses on the person 910 can be improved.
  • the focus control unit drives the second imaging element 123 to cause the phase pixels 220 to output a phase difference signal and acquire second defocus information.
  • the focus control unit moves the focus lens of the first optical system 111 to focus on the person 910 by referring to the second defocus information.
  • The second defocus information includes area information of the second focus area 320, which is the area used for focus evaluation in the second imaging element 123, as well as the defocus amount, defocus direction, and focus evaluation value in the second focus area.
  • FIG. 7 is a diagram showing an example of an image obtained from the first camera unit 110 when autofocus control of the first optical system 111 is performed with reference to the second defocus information.
  • the focus control unit determines the first focus area 310 as the focus area in the first camera unit 110 based on the area information of the second focus area 320 included in the second defocus information.
  • the first camera unit 110 and the second camera unit 120 are arranged close to each other, and the optical axis of the first optical system 111 and the optical axis of the second optical system 121 are parallel to each other. Therefore, for simplicity, the depth of the subject may not be considered.
  • The focus control unit determines the first focus area 310 from the area information of the second focus area 320 using a conversion formula or reference table prepared in advance.
  • After determining the first focus area 310, the focus control unit executes focus control of the first optical system 111 on the premise that the main subject exists in that area. Specifically, among the phase difference signals output by the first imaging element 113, phase waveforms are generated only from the phase difference signals of the phase pixels included in the first focus area 310. Furthermore, the focus range (that is, the depth range in which the main subject is assumed to exist) is limited with reference to the defocus amount and defocus direction of the second defocus information, so as not to be affected by the background area. Under the conditions defined in this way, the defocus amount and defocus direction serving as the first defocus information are determined. The focus control unit determines the movement direction and movement amount of the focus lens of the first optical system 111 based on the determined defocus amount and defocus direction. By moving the focus lens in this manner, it becomes possible to focus on the person 910 as the main subject.
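The range limitation described above can be illustrated with a small sketch. This is an assumption-laden simplification: the first unit's phase analysis is reduced to a list of candidate defocus values (in practice, peaks of the waveform correlation), and candidates outside a window centred on the second unit's defocus value are discarded so the high-frequency background cannot capture the focus. The function name and window parameter are hypothetical.

```python
def select_first_defocus(candidates, second_defocus, tolerance):
    """Keep only the first unit's candidate defocus values inside a search
    window centred on the second unit's defocus (already converted to the
    first unit's scale), then pick the candidate closest to the reference."""
    lo, hi = second_defocus - tolerance, second_defocus + tolerance
    in_range = [d for d in candidates if lo <= d <= hi]
    if not in_range:
        return None  # no plausible main-subject signal in the limited range
    return min(in_range, key=lambda d: abs(d - second_defocus))

# Background peak at +6.0, main-subject peak at -1.0; the telephoto unit's
# reference defocus says the subject is near -1.2.
print(select_first_defocus([6.0, -1.0], second_defocus=-1.2, tolerance=2.0))  # -> -1.0
```

Without the window, the stronger background peak at +6.0 would win; with it, the lens is driven toward the main subject.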
  • After driving the focus lens, the focus control unit causes the first imaging element 113 to output the phase difference signal again, evaluates its phase waveform, and determines whether a focused state has been reached. If a focused state has been reached, photographing processing is performed to generate the wide-angle image 301. If not, the first defocus information may be acquired again to correct the position of the focus lens.
  • Alternatively, the position of the focus lens may be corrected while wobbling the focus lens so that the contrast of the partial image generated from the image signals output by the ordinary pixels included in the first focus area 310 becomes highest; the latter is what is called contrast AF.
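The wobbling correction just mentioned is essentially a hill-climb on contrast. The sketch below shows the idea under simplifying assumptions: `contrast_at` is a stand-in for measuring contrast from the ordinary pixels of the first focus area at a given lens position, and the step schedule is invented for illustration.

```python
def contrast_af(lens_pos, contrast_at, step=0.5, iterations=10):
    """Hill-climb on contrast: wobble the lens one small step to either side
    of the current position and move toward the side with higher contrast;
    near the peak, shrink the wobble to refine the position."""
    pos = lens_pos
    for _ in range(iterations):
        c_minus = contrast_at(pos - step)
        c_here = contrast_at(pos)
        c_plus = contrast_at(pos + step)
        if c_plus > c_here and c_plus >= c_minus:
            pos += step
        elif c_minus > c_here:
            pos -= step
        else:
            step /= 2  # both sides are worse: we straddle the peak
    return pos

# Toy contrast curve peaking at lens position 3.0.
contrast = lambda p: -(p - 3.0) ** 2
print(contrast_af(0.0, contrast))  # -> 3.0
```

Because each step requires fresh image readouts, contrast AF is slower than phase-difference AF; that is why the text uses it only as a final correction around an already-determined focus area.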
  • Note that the focus control unit may perform contrast AF targeting the first focus area 310 immediately after determining the first focus area 310 based on the second focus area 320, without acquiring the first defocus information from the first imaging element 113.
  • In addition, the focus control unit may refer to the second defocus information only when it is difficult to execute focus control of the first optical system 111 based on the first defocus information. That is, when it is determined that focus control of the first optical system 111 can be executed based on the first defocus information, the focus control of the first optical system may be executed without referring to the second defocus information. Whether it is difficult to perform focus control of the first optical system 111 based on the first defocus information can be determined, for example, by whether the focus evaluation value in the first defocus information falls below a threshold.
  • Alternatively, the second defocus information may be acquired in parallel with the first defocus information, and the judgment may be made by comparing their respective defocus amounts and defocus directions. Specifically, if the subject depth calculated from the defocus amount and defocus direction included in the first defocus information and the subject depth calculated from the defocus amount and defocus direction included in the second defocus information fall within a fixed range of each other, it is assumed that the same subject is captured, and it is determined that focus control of the first optical system 111 can be performed based on the first defocus information. If they do not fall within the fixed range, it is assumed that different subjects are captured, and it is determined that it is difficult to perform focus control of the first optical system 111 based on the first defocus information.
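The depth-comparison decision above reduces to a single predicate. The sketch below shows that decision in isolation; the depths are assumed to have already been calculated from the respective defocus amounts, directions, and lens positions, and the names are hypothetical.

```python
def use_first_defocus(depth_first: float, depth_second: float,
                      fixed_range: float) -> bool:
    """Decide whether the two camera units appear to have captured the same
    subject: if their estimated subject depths agree within a fixed range,
    the first unit's own defocus information can be trusted; otherwise the
    second unit's information should be referred to."""
    return abs(depth_first - depth_second) <= fixed_range

# Depths agree within 0.3 m: same subject, use the first defocus information.
print(use_first_defocus(2.4, 2.5, 0.3))  # -> True
# Depths disagree: likely the background was captured by the first unit.
print(use_first_defocus(9.0, 2.5, 0.3))  # -> False
```

In the background-capture failure case of FIG. 5, the first unit would report the forest's depth while the second reports the person's, so this test would correctly trigger the fallback to the second defocus information.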
• In the above, for simplicity, an example using a conversion formula or the like to determine the first focus area 310 from the second focus area 320 was described; in practice, however, the first focus area 310 corresponding to the second focus area 320 varies with the depth of the subject (the distance to the subject). FIG. 8 is an explanatory diagram illustrating the correspondence between the two focus areas. Specifically, it shows the first camera unit 110 and the second camera unit 120 each capturing a person at distance d1 and a person at distance d2 from the imaging device 100 , together with the telephoto image 302 and the wide-angle image 301 thus captured.
• As illustrated, both the person at the near distance d1 and the person at the far distance d2 are assumed to lie offset toward the second camera unit 120 side.
• In this case, comparing the telephoto image 302 generated by the second camera unit 120 with the wide-angle image 301 generated by the first camera unit, the distance between the second focus area 320 and the first focus area 310 capturing the person is larger when photographing the person at the near distance d1 than when photographing the person at the far distance d2. Likewise, comparing within the same wide-angle image 301 , the first focus area 310 is shifted farther to the right when the person is at distance d1 than when the person is at distance d2.
• This correspondence can be calculated by triangulation. Specifically, once the distance to the person serving as the main subject captured in the second focus area 320 of the telephoto image 302 is calculated, the first focus area 310 in the wide-angle image 301 can be determined using the baseline length, which is the distance between the optical axes of the two optical systems, and the ratio of the two systems' angles of view. The distance to the person can be calculated from the defocus amount and defocus direction in the second defocus information together with the focus lens position at that time.
• If the focus control unit determines the first focus area 310 corresponding to the second focus area 320 more accurately in this manner, the accuracy of the focus control of the first optical system 111 can be further improved.
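The triangulation-based mapping above can be sketched roughly as follows. The pinhole-style parallax model and all numeric values here are illustrative assumptions; the patent only names the baseline length and the angle-of-view ratio as inputs.

```python
# Sketch of mapping the second focus area onto the wide-angle image by
# triangulation. The parallax model and constants are illustrative
# assumptions, not the patent's actual computation.

def map_focus_area_center(x_tele, distance, baseline, view_ratio):
    # x_tele:     horizontal offset of the second focus area 320 from the
    #             telephoto image centre, in normalized image units.
    # distance:   estimated distance to the subject (from the second
    #             defocus information and the focus lens position).
    # baseline:   distance between the two optical axes (baseline length).
    # view_ratio: telephoto angle of view / wide angle of view (< 1).
    x_wide = x_tele * view_ratio                    # rescale for the wider view
    parallax = (baseline / distance) * view_ratio   # shift shrinks with distance
    return x_wide + parallax

near = map_focus_area_center(0.0, 0.5, 0.02, 0.4)  # person at near distance d1
far = map_focus_area_center(0.0, 3.0, 0.02, 0.4)   # person at far distance d2
print(near > far)  # nearer subject -> first focus area shifted farther sideways
```

This reproduces the behaviour described for FIG. 8: the closer the subject, the larger the offset between the corresponding focus areas.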
• When the user does not specify a focus area, the focus control unit may determine the main subject to focus on using a common algorithm such as near-point priority (subjects closer to the imaging device take precedence) or center priority (subjects near the center of the angle of view take precedence). In this case, when the subject position calculated from the defocus amount and defocus direction contained in the first defocus information is farther than the subject position calculated from those contained in the second defocus information, or lies far from the center, it may be determined that it is difficult to perform focus control of the first optical system 111 based on the first defocus information.
• When the user specifies a certain focus area, that area may be used as the first focus area 310 .
• Likewise, when an area is specified by a face-detection program or the like, that area may be used as the first focus area 310 . Even in these cases, if it is determined that it is difficult to perform focus control of the first optical system 111 based on the first defocus information, the focus control of the first optical system 111 may be executed by determining the second focus area 320 and referring to the second defocus information in that area.
• Next, an example of a series of processes in which the system control unit 150 functions mainly as the focus control unit when the user selects the first camera unit 110 to capture a wide-angle image will be described. FIG. 9 is a flowchart showing the main processing procedure up to the point where the system control unit 150 generates the wide-angle image. The flow starts, for example, when the user presses the shutter button 161 .
• In step S101, the focus control unit acquires the first defocus information. Specifically, as described above, it drives the first imaging element 113 to make the phase pixels 220 output phase difference signals, and performs various computations to obtain the first defocus information.
• In step S102, the focus control unit acquires the second defocus information in the same manner as the first defocus information. The processing of step S102 may be performed in parallel with step S101.
• In step S103, the focus control unit determines whether focus control of the first optical system 111 can be executed based on the first defocus information. If it determines that it can, step S104 is skipped and the process proceeds to step S105; if not, the process proceeds to step S104.
• In step S104, the focus control unit determines the first focus area 310 on the first imaging element 113 with reference to the second defocus information acquired in step S102, and proceeds to step S105.
• In step S105, if step S104 was skipped, the focus control unit moves the focus lens of the first optical system 111 based on the first defocus information to focus on the main subject. If the process passed through step S104, the first defocus information is acquired again within the determined first focus area 310 , for example by applying the limiting conditions described above, and the focus lens of the first optical system 111 is moved based on that information to focus on the main subject.
• In step S106, the focus control unit causes the first imaging element 113 to output phase difference signals again, evaluates the phase waveforms, and determines whether an in-focus state has been reached. If it determines that it has, step S107 is skipped and the process proceeds to step S108; otherwise, the process proceeds to step S107.
• In step S107, as described above, the focus control unit executes contrast AF to correct the position of the focus lens and focus on the main subject. After that, the process proceeds to step S108.
• In step S108, the system control unit 150 drives the first imaging element 113 to make the ordinary pixels output image signals, and causes the image processing unit 153 to generate image data.
• The system control unit 150 then, according to preset instructions, stores the generated image data in the storage unit 170 , displays it on the display 130 , or transmits it to an external device via the communication interface 180 , completing the series of processes.
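The control flow of steps S101 to S108 above can be sketched as follows. The helper method names are placeholders for the operations the patent describes, stubbed here so the flow itself can run; this is an illustrative transcription of the flowchart, not the actual implementation.

```python
# Control-flow sketch of steps S101-S108 described above. Method names
# are hypothetical stand-ins for the operations named in the patent.

def autofocus_and_capture(first_unit, second_unit):
    first_info = first_unit.acquire_defocus_info()        # S101
    second_info = second_unit.acquire_defocus_info()      # S102 (may run in parallel)
    if not first_unit.defocus_usable(first_info):         # S103
        area = first_unit.area_from(second_info)          # S104: first focus area 310
        first_info = first_unit.acquire_defocus_info(area)
    first_unit.move_focus_lens(first_info)                # S105
    if not first_unit.in_focus():                         # S106
        first_unit.contrast_af()                          # S107
    return first_unit.capture_image()                     # S108

class StubUnit:
    # Minimal stand-in: the first defocus information alone is unusable,
    # so the flow passes through S104, then reaches focus without S107.
    def __init__(self):
        self.log = []
    def acquire_defocus_info(self, area=None):
        self.log.append("acquire" if area is None else "acquire-in-area")
        return {"area": area}
    def defocus_usable(self, info):
        return False
    def area_from(self, second_info):
        self.log.append("map-area")
        return "first_focus_area_310"
    def move_focus_lens(self, info):
        self.log.append("move")
    def in_focus(self):
        return True
    def capture_image(self):
        self.log.append("capture")
        return "wide_angle_image_301"

unit = StubUnit()
print(autofocus_and_capture(unit, unit))  # -> wide_angle_image_301
```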
• The embodiment described above assumes that the first camera unit 110 captures a still image, but the same focus control can be performed even when capturing a moving image. For example, if frame images are continuously generated from the image signals among the pixel signals output by the first imaging element 113 and the first defocus information is generated from the phase difference signals, the same focus control as above can be executed even during moving-image capture, with reference to the second defocus information generated in parallel.
• The embodiment described above concerns an imaging device 100 with two camera units, but the same focus control can be performed even in an imaging device with three or more camera units. For example, if three camera units respectively have telephoto, standard, and wide-angle optical systems, focus control of the wide-angle camera unit can refer to the defocus information acquired from the standard camera unit and the telephoto camera unit, and focus control of the standard camera unit can refer to the defocus information acquired from the telephoto camera unit.
• The embodiment described above performs focus control based on a single piece of defocus information acquired with each focus lens of the first optical system 111 and the second optical system 121 in an arbitrary state, but focus control may also be performed by acquiring multiple pieces of defocus information while changing the focus lens position. For example, with an optical system having a small aperture F-number or a long focal length, the focus lens position can be changed several times according to its characteristics, acquiring defocus information each time. When the telephoto camera unit uses a liquid lens whose focal length changes with an applied voltage, the movement range of the focus lens can be determined based on that applied voltage. That is, since such a liquid lens can serve as a high-magnification zoom lens, in the telephoto range the focus lens can, for example, be allowed to move only within the range corresponding to subjects 1.5 m or farther away, shortening the acquisition time of the defocus information.
• In another possible embodiment, the liquid lens may be a wide-angle or telephoto lens. Because a liquid lens requires a motor to change the thickness of its liquid unit when focusing, a long stroke or a slow motor speed during focusing may limit the focusing speed. In that case, focus control of the optical system containing the liquid lens can be assisted by the defocus information of the other optical systems. For example, multiple focus distance intervals can be determined so that the liquid lens only needs to focus within a small range, increasing the focusing speed.
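The range-limiting idea above can be sketched as follows. The mapping from subject distance to lens position and all constants are illustrative assumptions; the patent only states that the movement range can be restricted (e.g. to subjects 1.5 m or farther) to shorten defocus-information acquisition.

```python
# Sketch of limiting the focus-lens search range for subjects at or
# beyond a minimum distance, as described above. The distance-to-position
# mapping and the full stroke range are hypothetical.

def lens_position_for(distance_m):
    # Hypothetical monotonic mapping: closer subjects need larger positions.
    return 1.0 / distance_m

def search_range(min_distance_m, full_range=(0.0, 2.0)):
    # Restrict the sweep to subjects at min_distance_m or farther,
    # instead of sweeping the lens over its full stroke.
    lo, hi = full_range
    return (lo, min(hi, lens_position_for(min_distance_m)))

print(search_range(1.5))  # telephoto premise: subjects 1.5 m or farther
```

Limiting the sweep this way shortens each defocus-information acquisition because fewer lens positions need to be visited.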
• 100...imaging device, 110...first camera unit, 111...first optical system, 112...first drive mechanism, 113...first imaging element, 114...first analog front end (AFE), 120...second camera unit, 121...second optical system, 122...second drive mechanism, 123...second imaging element, 124...second analog front end (AFE), 130...display, 150...system control unit, 151...working memory, 152...image processing unit, 160...operation unit, 161...shutter button, 162...touch panel, 170...storage unit, 180...communication interface, 210...ordinary pixel, 220...phase pixel, 221...first phase pixel, 222...second phase pixel, 301...wide-angle image, 302...telephoto image, 310...first focus area, 320...second focus area, 910...person, 920...forest.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)

Abstract

Provided is an imaging device including: a first camera unit having a first optical system; a second camera unit having a second optical system arranged to face the same direction as the first camera unit; and a focus control unit that executes focus control of the first camera unit and the second camera unit. The first camera unit and the second camera unit each have an imaging element formed by a two-dimensional arrangement of ordinary pixels that output image signals for forming an image and discretely arranged phase pixels, each surrounded by the ordinary pixels, that output phase difference signals for focus detection. When forming an image using the image signals output from the imaging element of the first camera unit, the focus control unit executes focus control of the first optical system with reference to second defocus information acquired from the phase difference signals output by the imaging element of the second camera unit.

Description

Imaging device and focus control program
Technical Field
The present invention relates to an imaging device and a focus control program.
Background Art
Imaging devices are known that perform autofocus control using phase difference signals obtained at the image plane of the imaging element. An example of an imaging element that outputs such phase difference signals is one formed by a two-dimensional arrangement of ordinary pixels that output image signals dedicated to image generation and phase pixels that output phase difference signals dedicated to autofocus (see, for example, Patent Document 1). Compared with an imaging element in which every pixel has two photoelectric conversion units pupil-divided with respect to its microlens and can switch between outputting image signals and phase difference signals, such an imaging element has the advantage of being manufacturable at lower cost.
Prior Art Documents
Patent Documents
Patent Document 1: JP 2016-90785 A
Summary of the Invention
Problem to Be Solved by the Invention
Since phase pixels dedicated to outputting phase difference signals do not output image signals for image generation, when an image is generated, the pixel values at the addresses where phase pixels are arranged are generated by interpolation from the values of surrounding pixels. Therefore, increasing the proportion of phase pixels among the pixels forming the imaging element degrades the quality of the generated image. In particular, when multiple phase pixels are arranged adjacently and continuously, they may appear as visible noise in the generated image. For this reason, the phase pixels are preferably arranged discretely so as to be surrounded by ordinary pixels, and their proportion is preferably small. However, with phase pixels arranged in this way, it is difficult to detect phase difference signals for a subject that appears relatively small within the whole image captured by the imaging element, and the device may fail to focus on that subject. For example, in a scene where a slender main subject stands in front of a background rich in high-frequency components, a failure in which the background rather than the main subject is brought into focus easily occurs.
The present invention was made to solve this problem, and provides an imaging device and the like that can accurately autofocus on a main subject even when generating a wide-angle image in which the main subject appears relatively small.
Means for Solving the Problem
An imaging device according to a first aspect of the present invention includes: a first camera unit having a first optical system; a second camera unit having a second optical system arranged to face the same direction as the first camera unit; and a focus control unit that executes focus control of the first camera unit and the second camera unit, wherein the first camera unit and the second camera unit each have an imaging element formed by a two-dimensional arrangement of ordinary pixels that output image signals for forming an image and discretely arranged phase pixels, surrounded by the ordinary pixels, that output phase difference signals for focus detection, and wherein, when forming the image using the image signals output from the imaging element of the first camera unit, the focus control unit executes focus control of the first optical system with reference to second defocus information acquired from the phase difference signals output by the imaging element of the second camera unit.
A focus control program according to a second aspect of the present invention performs focus control of a first camera unit and a second camera unit of an imaging device, the imaging device including: the first camera unit, which has a first optical system; and the second camera unit, which has a second optical system arranged to face the same direction as the first camera unit, the first camera unit and the second camera unit each having an imaging element formed by a two-dimensional arrangement of ordinary pixels that output image signals for forming an image and discretely arranged phase pixels, surrounded by the ordinary pixels, that output phase difference signals for focus detection. When forming the image using the image signals output from the imaging element of the first camera unit, the program causes a computer to execute: an acquisition step of acquiring second defocus information based on the phase difference signals output from the imaging element of the second camera unit; and a driving step of driving a focus lens of the first optical system with reference to the second defocus information.
Effect of the Invention
According to the present invention, an imaging device and the like are provided that can accurately autofocus on a main subject even when generating a wide-angle image in which the main subject appears relatively small.
Brief Description of the Drawings
FIG. 1 shows the appearance of the imaging device of the present embodiment.
FIG. 2 shows the main hardware configuration of the imaging device.
FIG. 3 illustrates the pixel arrangement of the imaging element.
FIG. 4 shows an example of a scene to be photographed.
FIG. 5 shows an example of an image obtained from the first camera unit when autofocus control is performed using the phase difference signals from the first camera unit.
FIG. 6 shows an example of an image obtained from the second camera unit when autofocus control is performed on the same scene using the phase difference signals from the second camera unit.
FIG. 7 shows an example of an image obtained from the first camera unit when autofocus control of the first optical system is performed with reference to the second defocus information.
FIG. 8 illustrates the correspondence between the two focus areas.
FIG. 9 shows the processing procedure up to generation of the wide-angle image.
Detailed Description of the Embodiments
The present invention will be described below through embodiments, but the invention defined by the claims is not limited to the following embodiments. Moreover, not all of the configurations described in the embodiments are essential as means for solving the technical problem. In the figures, when multiple identical or identically configured structures are present, reference numerals may be attached to only some of them and omitted for the rest to avoid cluttering the drawings.
FIG. 1 shows the appearance of the imaging device 100 of the present embodiment. In particular, FIG. 1(A) mainly shows the first surface side of the imaging device 100, and FIG. 1(B) mainly shows the second surface side opposite to the first surface. The imaging device 100 of the present embodiment is a so-called smartphone, in other words, a smartphone that also functions as an imaging device. In the following, only the imaging function relevant to the present invention is described among the smartphone functions, and the other smartphone functions, such as use of image data generated by shooting, are omitted. Although a smartphone is taken as an example in this embodiment, the imaging device 100 may of course also be a standalone camera, or a device with an imaging function such as a tablet terminal.
The imaging device 100 includes a first camera unit 110 and a second camera unit 120 arranged on the first surface side facing the same direction. The first camera unit 110 is a camera unit for generating wide-angle images. The second camera unit 120 is a camera unit for generating telephoto images. The user designates the first camera unit to shoot when a wide-angle image is desired, and the second camera unit when a telephoto image is desired. In the figure, the first camera unit 110 and the second camera unit 120 are arranged parallel to the long side of the imaging device 100, but their arrangement is not limited to this; for example, they may be arranged along a line oblique to the long side, and their positions may also be swapped.
The imaging device 100 has a display 130 on the second surface side. The display 130 is, for example, a display device using an organic EL (electroluminescence) panel, which displays a live view of the subject before shooting or displays captured images afterwards. A selfie camera unit independent of the first camera unit 110 and the second camera unit 120 may also be provided on the second surface side.
A shutter button 161 is provided on the side of the imaging device 100. The user can give the imaging device 100 a shooting instruction by pressing the shutter button 161. A touch panel 162 is arranged overlapping the display 130; instead of pressing the shutter button 161, the user can also give a shooting instruction by tapping a shutter button displayed on the display 130. The user can also tap any part of the live-view subject image to designate a certain area containing that part as the focus area, and can switch between the first camera unit 110 and the second camera unit 120 or select displayed menu items by tapping or other touch operations.
FIG. 2 shows the main hardware configuration of the imaging device 100. Besides the first camera unit 110, the second camera unit 120, and the display 130 described above, the imaging device 100 includes a system control unit 150 that controls them, and peripheral elements that cooperate with the system control unit 150.
As described above, the first camera unit 110 is a camera unit for generating wide-angle images and mainly includes a first optical system 111, a first drive mechanism 112, a first imaging element 113, and a first analog front end (AFE) 114. The first optical system 111 forms the incident subject light flux into an image on the imaging surface of the first imaging element 113. Although represented by a single lens in the figure, it generally consists of multiple lenses and is a focus lens at least part of which can advance and retreat along the optical axis. The first drive mechanism 112 moves the focus lens of the first optical system 111 along the optical axis and includes an actuator that operates according to instructions from the system control unit 150.
The first imaging element 113 is, for example, a CMOS image sensor, and is described in detail later. According to instructions from the system control unit 150, the first imaging element 113 transmits pixel signals (image signals and phase difference signals described later) as output signals to the first analog front end 114. The first analog front end 114 level-adjusts the pixel signals with the gain instructed by the system control unit 150, A/D-converts them into digital data, and transfers the data to the working memory 151.
As described above, the second camera unit 120 is a camera unit for generating telephoto images and mainly includes a second optical system 121, a second drive mechanism 122, a second imaging element 123, and a second analog front end (AFE) 124. The second optical system 121 forms the incident subject light flux into an image on the imaging surface of the second imaging element 123. Although represented by a single lens in the figure, the second optical system 121, like the first optical system 111, generally consists of multiple lenses and is a focus lens at least part of which can advance and retreat along the optical axis. The second drive mechanism 122 moves the focus lens of the second optical system 121 along the optical axis and includes an actuator that operates according to instructions from the system control unit 150.
The second imaging element 123 is, for example, a CMOS image sensor, and is described in detail later together with the first imaging element 113. According to instructions from the system control unit 150, the second imaging element 123 transmits pixel signals as output signals to the second analog front end 124. The second analog front end 124 level-adjusts the pixel signals with the gain instructed by the system control unit 150, A/D-converts them into digital data, and transfers the data to the working memory 151.
In this embodiment, both the first optical system 111 and the second optical system 121 are assumed to be single-focus optical systems with fixed focal lengths, but at least one of them may be a variable-focus optical system (zoom lens) whose focal length can be changed. Even when a variable-focus optical system is used, the focal length of the first optical system 111 may be set shorter than that of the second optical system 121. In other words, the angle of view of the second optical system 121 is set to a telephoto angle of view relative to that of the first optical system 111.
The system control unit 150 is a processor (CPU: Central Processing Unit) that directly or indirectly controls each element of the imaging device 100. The system control unit 150 serves as various function control units according to the control program being executed; for example, it functions as the focus control unit when executing focus control of the first camera unit 110 and the second camera unit 120, and as the display control unit when displaying captured images on the display 130.
The imaging device 100 mainly includes a working memory 151, an image processing unit 152, an operation unit 160, a storage unit 170, and a communication interface 180 as peripheral elements that cooperate with the system control unit 150. The working memory 151 is a volatile high-speed memory, for example an SRAM (Static Random Access Memory). The working memory 151 receives the pixel data successively converted by the first analog front end 114 and the second analog front end 124; if the pixel data is converted from image signals, it assembles and stores it as one frame of frame data, and if converted from phase difference signals, it stores it as two sets of waveform data. The working memory 151 transfers frame data to the image processing unit 153 and waveform data to the system control unit 150, and is also used as a temporary storage area as appropriate while the image processing unit 153 performs image processing or the system control unit 150 performs focusing processing.
The image processing unit 153 is composed of, for example, an ASIC (Application Specific Integrated Circuit) dedicated to image processing; it applies interpolation and other image processing to the received frame data and generates image data conforming to a predetermined format. Generated image data is stored in the storage unit 170 if intended for storage, or shown on the display 130 if intended for display.
The operation unit 160 is an input device including the shutter button 161 and the touch panel 162, operated when the user gives instructions to the imaging device 100. If the imaging device 100 accepts voice input, the operation unit 160 may also include a microphone. The storage unit 170 is a nonvolatile memory, for example an SSD (Solid State Drive). Besides image data generated by shooting, the storage unit 170 stores the constants, variables, setting values, control programs, and the like needed for the operation of the imaging device 100. The communication interface 180 may include a 5G or wireless LAN communication unit, and is used, for example, to transfer generated image data to external devices.
FIG. 3 illustrates the pixel arrangement of the first imaging element 113. In this embodiment, the second imaging element 123 is identical to the first imaging element 113, so only the first imaging element 113 is described here.
The first imaging element 113 is an imaging element formed by a two-dimensional arrangement of ordinary pixels 210 that output image signals dedicated to image generation and phase pixels 220 that output phase difference signals dedicated to focus detection. An ordinary pixel 210 is a pixel in which one roughly square photoelectric conversion unit is arranged without displacement relative to one microlens. In the ordinary pixel 210, one of the RGB color filters is arranged between the microlens and the photoelectric conversion unit.
A phase pixel 220 is a pixel in which one roughly rectangular photoelectric conversion unit, shaped like one half of an ordinary pixel's photoelectric conversion unit divided in two with respect to one microlens, is displaced relative to the optical axis of that microlens. In the phase pixel 220, no color filter is arranged between the microlens and the photoelectric conversion unit. All pixels adjacent to a phase pixel 220 are ordinary pixels 210; in other words, each phase pixel 220 is surrounded by ordinary pixels 210, and the phase pixels are arranged discretely with respect to one another. Alternatively, the photoelectric conversion unit of the phase pixel 220 may have the same configuration as that of the ordinary pixel 210, with a light-shielding mask having a displaced opening arranged between the microlens and the photoelectric conversion unit, the displaced opening producing the same effect as displacing the photoelectric conversion unit as described above.
There are two types of phase pixels 220: first phase pixels 221, whose photoelectric conversion unit is displaced in a first direction (downward in the figure), and second phase pixels 222, whose photoelectric conversion unit is displaced in the direction opposite to the first (upward in the figure). The first phase pixels 221 and the second phase pixels 222 are each arranged in a prescribed pattern. Specifically, multiple detection lines are set along the displacement direction of the displaced pixels (the vertical direction in the figure); the first phase pixels 221 are arranged periodically on one side of each detection line (the right side in the figure), and the second phase pixels 222 are arranged with the same period but out of phase on the other side (the left side in the figure).
A first phase waveform is formed from the phase difference signals output by the first phase pixels 221, and a second phase waveform is formed from the phase difference signals output by the second phase pixels 222. In focus control, the system control unit 150 calculates the defocus amount, which is the relative shift between the first phase waveform and the second phase waveform, the defocus direction, which is the direction of that shift, and a focus evaluation value obtained from the degree of overlap of the two waveforms, thereby acquiring the defocus information. Based on the acquired defocus information, the system control unit 150 performs focusing processing so as to focus on a prescribed subject. The focusing processing is described in detail later.
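The shift-and-overlap evaluation described above can be sketched as follows. A simple sum-of-absolute-differences search is used here as an illustrative assumption; the patent does not specify the actual matching computation, only that the relative shift gives the defocus amount and direction and the overlap quality gives the focus evaluation value.

```python
# Sketch of deriving defocus amount/direction from the two phase
# waveforms: find the relative shift that best overlaps them.
# The SAD matching below is an illustrative assumption.

def defocus_from_waveforms(w1, w2, max_shift=4):
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Overlapping samples of w1 shifted by s against w2.
        pairs = [(w1[i + s], w2[i]) for i in range(len(w2))
                 if 0 <= i + s < len(w1)]
        cost = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    # |shift| plays the role of the defocus amount, its sign the defocus
    # direction; best_cost (overlap quality) corresponds to the focus
    # evaluation value.
    return best_shift, best_cost

w2 = [0, 1, 5, 9, 5, 1, 0, 0]       # second phase waveform
w1 = [0, 0, 0, 1, 5, 9, 5, 1]       # same profile shifted by +2 samples
print(defocus_from_waveforms(w1, w2)[0])  # -> 2
```

An in-focus subject yields a shift near zero; the sign of the shift tells which way the focus lens must move.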
As long as the first imaging element 113 and the second imaging element 123 are each formed by a two-dimensional arrangement of ordinary pixels and discretely arranged phase pixels surrounded by ordinary pixels, they need not be identical imaging elements; for example, they may differ from each other in total pixel count or in the arrangement pattern of the phase pixels.
In the illustrated example, the detection lines are set along one axis, the vertical direction, but detection lines may also be set in the orthogonal direction (the horizontal direction in the illustrated example), with phase pixels suited to those detection lines (pixels displaced to the right and pixels displaced to the left in the illustrated example) arranged along them. In that case too, each phase pixel is preferably surrounded by ordinary pixels. Setting detection lines along two orthogonal axes in this way can improve focusing accuracy.
Since phase pixels dedicated to outputting phase difference signals do not output image signals for image generation, when an image is generated from the image signals output by the ordinary pixels, the pixel value at the address of each phase pixel is generally generated by interpolation from the values of surrounding pixels. Therefore, increasing the proportion of phase pixels among the pixels forming the imaging element degrades the quality of the generated image. In particular, when multiple phase pixels are arranged adjacently and continuously, they may appear as visible noise in the generated image. Hence, as in the first imaging element 113 and the second imaging element 123 of this embodiment, the phase pixels are preferably arranged discretely, surrounded by ordinary pixels, and their proportion is preferably small. In the first imaging element 113 and the second imaging element 123 of this embodiment, the total number of phase pixels 220 is less than 5% of the total pixel count.
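The interpolation at phase-pixel addresses described above can be sketched as follows. Simple four-neighbour averaging is used here as an illustrative assumption; real pipelines use more elaborate direction-aware filters.

```python
# Sketch of filling in image values at phase-pixel addresses from
# surrounding ordinary pixels. Four-neighbour averaging is an
# illustrative assumption, not the patent's actual interpolation.

def fill_phase_pixels(values, phase_addresses):
    h, w = len(values), len(values[0])
    out = [row[:] for row in values]  # leave the input untouched
    for (r, c) in phase_addresses:
        neighbours = [values[rr][cc]
                      for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                      if 0 <= rr < h and 0 <= cc < w
                      and (rr, cc) not in phase_addresses]
        out[r][c] = sum(neighbours) / len(neighbours)
    return out

img = [[10, 10, 10],
       [10,  0, 10],   # centre pixel is a phase pixel: no image signal
       [10, 10, 10]]
print(fill_phase_pixels(img, {(1, 1)})[1][1])  # -> 10.0
```

This also illustrates why adjacent, continuous phase pixels are harmful: when a phase pixel's neighbours are themselves phase pixels, fewer real samples feed the interpolation and the error becomes visible as noise.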
However, when the phase pixels are arranged in small numbers and discretely as described above, the focus control unit cannot acquire accurate defocus information for a subject that appears relatively small within the whole image captured by the imaging element, and as a result, may fail to focus on that subject. For example, in a scene where a slender main subject stands in front of a background rich in high-frequency components, a failure in which the background rather than the main subject is brought into focus easily occurs. Such failures occur all the more easily when a wide-angle optical system is used for capture. Therefore, when forming a wide-angle image using the image signals output from the first imaging element 113 of the first camera unit 110, the imaging device 100 of this embodiment executes focus control of the first optical system 111 with reference to the second defocus information acquired from the phase difference signals output by the second imaging element 123 of the second camera unit 120. This focus control is described below in order, using a concrete scene.
FIG. 4 shows an example of a scene to be photographed. Specifically, it shows a user attempting to photograph, with the imaging device 100, a scene in which a forest 920 spreads out behind a person 910 serving as the main subject. Here, the user is assumed to want to focus on the person 910. The user can decide the composition while checking the live-view image continuously acquired by the first camera unit 110 or the second camera unit 120 and shown on the display 130.
FIG. 5 shows an example of the image obtained from the first camera unit 110 when autofocus control of the first optical system 111 is performed using only the phase difference signals from the first camera unit 110. The forest 920 forming the background of the scene is a collection of many trees, so within the angle of view captured by the first camera unit 110, the image region of the forest 920 (background region) is a region of high spatial frequency. In contrast, the image region of the person 910 as the main subject (main region) occupies only a very small portion of the angle of view captured by the first camera unit 110 and is a region of relatively low spatial frequency.
When a detection line crosses both the high-spatial-frequency background region and the low-spatial-frequency main region, a so-called near-far conflict arises, and the focus control unit calculates defocus information for the high-spatial-frequency background region. When the focus lens of the first optical system 111 is driven based on defocus information calculated in this way, the first optical system 111 focuses on the forest 920 in the background, and in the wide-angle image 301 generated after this focusing, as illustrated, the person 910 as the main subject is blurred. In particular, when the phase pixels 220 arranged along a detection line are discrete, the number of phase pixels 220 contained in the small main region becomes even smaller, making it even harder for the focus control unit to calculate defocus information from the waveform formed by the phase difference signals output from the phase pixels 220 in that main region.
That is, in the scene of FIG. 4, when autofocus control is performed using only the first defocus information acquired from the phase difference signals output by the first imaging element 113, it can be said that it is difficult for the user to focus on the person 910 they intend to focus on.
FIG. 6 shows an example of the image obtained from the second camera unit 120 when autofocus control of the second optical system 121 is performed on the same scene using the phase difference signals from the second camera unit 120. Since the second optical system 121 of the second camera unit 120 has a telephoto angle of view relative to the first optical system 111, the proportion of the whole area occupied by the image region of the person 910 (main region) is larger than in the case of FIG. 5. The number of phase pixels 220 contained in the main region also increases, and finer parts of the person 910 are resolved, so the spatial frequency of the region becomes higher.
Even when a detection line crosses both the main region and the background region, if the proportion of the main region is large and the waveform formed by the phase difference signals output from the phase pixels 220 within the main region is dominant, a near-far conflict does not easily arise. The focus control unit can therefore calculate defocus information for the main region. When the focus lens of the second optical system 121 is driven based on defocus information calculated in this way, the second optical system 121 focuses on the person 910 as the main subject, and as illustrated, the telephoto image 302 generated after this focusing is an image in which the person 910 is in focus.
That is, in the scene of FIG. 4, if autofocus control is performed based on the second defocus information acquired from the phase difference signals output by the second imaging element 123, it can be said that the user can easily focus on the person 910 they intend to focus on. Therefore, even when the user selects the first camera unit 110 to obtain a wide-angle image, referring to the second defocus information acquired from the phase difference signals output by the second imaging element 123 can increase the likelihood that the first optical system 111 also focuses on the person 910.
To execute such focus control, even when the user has selected the first camera unit 110, the focus control unit drives the second imaging element 123 to make the phase pixels 220 output phase difference signals and acquires the second defocus information. By referring to the second defocus information, the focus control unit moves the focus lens of the first optical system 111 to focus on the person 910. The second defocus information specifically contains area information of the second focus area 320, which is the area on the second imaging element 123 where focus evaluation is performed, together with the defocus amount, defocus direction, and focus evaluation value in that second focus area.
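The contents of the second defocus information enumerated above can be illustrated with a small structure. The field names and sample values are assumptions; the patent only lists which pieces of information are carried.

```python
# Illustrative sketch of the second defocus information described above.
# Field names and values are hypothetical; the patent enumerates only
# the area information, defocus amount, defocus direction, and focus
# evaluation value.
from dataclasses import dataclass

@dataclass
class DefocusInfo:
    area: tuple          # second focus area 320: (x, y, width, height) on the element
    amount: float        # defocus amount in that area
    direction: int       # defocus direction: +1 or -1 along the optical axis
    evaluation: float    # focus evaluation value (overlap of the two phase waveforms)

info = DefocusInfo(area=(1200, 800, 160, 160), amount=0.03,
                   direction=+1, evaluation=0.92)
print(info.amount * info.direction)  # signed defocus used to drive the lens
```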
FIG. 7 shows an example of the image obtained from the first camera unit 110 when autofocus control of the first optical system 111 is performed with reference to the second defocus information. The focus control unit determines the first focus area 310, which is the focus area in the first camera unit 110, from the area information of the second focus area 320 contained in the second defocus information. Because the first camera unit 110 and the second camera unit 120 are arranged close to each other and the optical axis of the first optical system 111 and the optical axis of the second optical system 121 are parallel, for simplicity, a conversion formula or lookup table giving a one-to-one correspondence based on the respective angles of view can be prepared in advance without considering the depth of the subject. Using such a conversion formula or lookup table, the focus control unit determines the first focus area 310 from the area information of the second focus area 320.
Once the first focus area 310 is determined, the focus control unit executes focus control of the first optical system 111 on the premise that the main subject exists in that area. Specifically, among the phase difference signals output by the first imaging element 113, phase waveforms are generated only from the phase difference signals contained in the first focus area 310. Further, the focus range (i.e., the depth range where the main subject is assumed to exist) is limited by referring to the defocus amount and defocus direction of the second defocus information, so that it is not affected by the background region. Under conditions limited in this way, the defocus amount and defocus direction are determined as the first defocus information. The focus control unit determines the movement direction and movement amount of the focus lens of the first optical system 111 from the determined defocus amount and defocus direction, and by moving the lens in this way, can focus on the person 910 as the main subject.
After moving the focus lens, the focus control unit again makes the first imaging element 113 output phase difference signals, evaluates the phase waveforms, and determines whether an in-focus state has been reached. If it determines that the in-focus state has been reached, it executes shooting processing to generate the wide-angle image 301. If it determines that the in-focus state has not been reached, it can acquire the first defocus information again and correct the position of the focus lens. Alternatively, the position of the focus lens may be corrected while wobbling the lens so that the contrast of the partial image generated from the image signals output by the ordinary pixels contained in the first focus area 310 becomes highest; the latter is so-called contrast AF. More simply, the focus control unit may perform contrast AF targeting the first focus area 310 immediately after determining it from the second focus area 320, without acquiring the first defocus information from the first imaging element 113.
In addition, when the focus control unit determines that it is difficult to execute focus control of the first optical system 111 based on the first defocus information, it may execute the focus control of the first optical system with reference to the second defocus information. Conversely, when it determines that focus control of the first optical system 111 can be executed based on the first defocus information, it may execute the focus control without referring to the second defocus information. Whether focus control of the first optical system 111 based on the first defocus information is difficult can be determined, for example, by whether the focus evaluation value in the first defocus information is below a threshold. Alternatively, the second defocus information may be acquired in parallel with the first defocus information, and the judgment made by comparing the respective defocus amounts and defocus directions. Specifically, if the subject depth calculated from the defocus amount and defocus direction contained in the first defocus information and the subject depth calculated from those contained in the second defocus information fall within a fixed range of each other, both are assumed to capture the same subject, and it is determined that focus control of the first optical system 111 can be performed based on the first defocus information. If they are not within the fixed range, different subjects are assumed to be captured, and it is determined that focus control of the first optical system 111 based on the first defocus information is difficult.
In the above, for simplicity, an example using a conversion formula or the like to determine the first focus area 310 from the second focus area 320 was described; in practice, however, the first focus area 310 corresponding to the second focus area 320 varies with the depth of the subject (the distance to the subject). FIG. 8 is an explanatory diagram illustrating the correspondence between the two focus areas. Specifically, it shows the first camera unit 110 and the second camera unit 120 each capturing a person at distance d1 and a person at distance d2 from the imaging device 100, together with the telephoto image 302 and the wide-angle image 301 thus captured.
As illustrated, both the person at the near distance d1 and the person at the far distance d2 are assumed to lie offset toward the second camera unit 120 side. In this case, comparing the telephoto image 302 generated by the second camera unit 120 with the wide-angle image 301 generated by the first camera unit, the distance between the second focus area 320 and the first focus area 310 capturing the person is larger when photographing the person at the near distance d1 than when photographing the person at the far distance d2. Likewise, comparing within the same wide-angle image 301, the first focus area 310 is shifted farther to the right when the person is at distance d1 than when the person is at distance d2.
This correspondence can be calculated by triangulation. Specifically, once the distance to the person serving as the main subject captured in the second focus area 320 of the telephoto image 302 is calculated, the first focus area 310 in the wide-angle image 301 can be determined using the baseline length, which is the distance between the optical axes of the two optical systems, and the ratio of the two systems' angles of view. The distance to the person can be calculated from the defocus amount and defocus direction in the second defocus information together with the focus lens position at that time.
If the focus control unit determines the first focus area 310 corresponding to the second focus area 320 more accurately in this manner, the accuracy of the focus control of the first optical system 111 can be further improved.
When the user does not specify a focus area, the focus control unit may determine the main subject to focus on using a common algorithm such as near-point priority (subjects closer to the imaging device take precedence) or center priority (subjects near the center of the angle of view take precedence). In this case, when the subject position calculated from the defocus amount and defocus direction contained in the first defocus information is farther than the subject position calculated from those contained in the second defocus information, or lies far from the center, it may be determined that it is difficult to perform focus control of the first optical system 111 based on the first defocus information.
When the user specifies a certain focus area, that area may be used as the first focus area 310. Likewise, when an area is specified by a face-detection program or the like, that area may be used as the first focus area 310. Even in these cases, if it is determined that it is difficult to perform focus control of the first optical system 111 based on the first defocus information, the focus control of the first optical system 111 may be executed by determining the second focus area 320 and referring to the second defocus information in that area.
Next, an example of a series of processes in which the system control unit 150 functions mainly as the focus control unit when the user selects the first camera unit 110 to capture a wide-angle image will be described. FIG. 9 is a flowchart showing the main processing procedure up to the point where the system control unit 150 generates the wide-angle image. The flow starts, for example, when the user presses the shutter button 161.
In step S101, the focus control unit acquires the first defocus information. Specifically, as described above, it drives the first imaging element 113 to make the phase pixels 220 output phase difference signals, and performs various computations to obtain the first defocus information. In step S102, the focus control unit acquires the second defocus information in the same manner as the first defocus information. The processing of step S102 may be performed in parallel with step S101.
In step S103, the focus control unit determines whether focus control of the first optical system 111 can be executed based on the first defocus information. If it determines that it can, step S104 is skipped and the process proceeds to step S105; if it determines that it cannot, the process proceeds to step S104.
When the process proceeds to step S104, the focus control unit determines the first focus area 310 on the first imaging element 113 with reference to the second defocus information acquired in step S102, and proceeds to step S105.
In step S105, if step S104 was skipped, the focus control unit moves the focus lens of the first optical system 111 based on the first defocus information to focus on the main subject. If the process passed through step S104, the first defocus information is acquired again within the determined first focus area 310, for example by applying the limiting conditions described above, and the focus lens of the first optical system 111 is moved based on that first defocus information to focus on the main subject.
Proceeding to step S106, the focus control unit again causes the first imaging element 113 to output phase difference signals, evaluates the phase waveforms, and determines whether an in-focus state has been reached. If it determines that it has, step S107 is skipped and the process proceeds to step S108; if not, the process proceeds to step S107.
When the process proceeds to step S107, as described above, the focus control unit executes contrast AF to correct the position of the focus lens and focus on the main subject. After that, the process proceeds to step S108.
When the process proceeds to step S108, the system control unit 150 drives the first imaging element 113 to make the ordinary pixels output image signals, and causes the image processing unit 153 to generate image data. The system control unit 150 then, according to preset instructions, stores the generated image data in the storage unit 170, displays it on the display 130, or transmits it to an external device via the communication interface 180, completing the series of processes.
The embodiment described above assumes that the first camera unit 110 captures a still image, but the same focus control can be performed even when capturing a moving image. For example, if frame images are continuously generated from the image signals among the pixel signals output by the first imaging element 113 and the first defocus information is generated from the phase difference signals, the same focus control as above can be executed even during moving-image capture, with reference to the second defocus information generated in parallel.
The embodiment described above concerns an imaging device 100 with two camera units, but the same focus control can be performed even in an imaging device with three or more camera units. For example, if three camera units respectively have telephoto, standard, and wide-angle optical systems, focus control of the wide-angle camera unit can refer to the defocus information acquired from the standard camera unit and the telephoto camera unit, and focus control of the standard camera unit can refer to the defocus information acquired from the telephoto camera unit.
The embodiment described above performs focus control based on a single piece of defocus information acquired with each focus lens of the first optical system 111 and the second optical system 121 in an arbitrary state, but focus control may also be performed by acquiring multiple pieces of defocus information while changing the focus lens position. For example, with an optical system having a small aperture F-number or a long focal length, the focus lens position can be changed several times according to its characteristics, acquiring defocus information each time. In this case, when the telephoto camera unit uses a liquid lens whose focal length can be changed by an applied voltage, the movement range of the focus lens can be determined based on that applied voltage. That is, since such a liquid lens can serve as a high-magnification zoom lens, in the telephoto range the focus lens can, for example, be allowed to move only within the range corresponding to subjects 1.5 m or farther away, shortening the acquisition time of the defocus information.
In another possible embodiment, the liquid lens may be a wide-angle or telephoto lens. Because a liquid lens requires a motor to change the thickness of its liquid unit when focusing, a long stroke or a slow motor speed during focusing may limit the focusing speed. In that case, focus control of the optical system containing the liquid lens can be assisted by the defocus information of the other optical systems. For example, multiple focus distance intervals can be determined so that the liquid lens only needs to focus within a small range, increasing the focusing speed.
Description of Reference Numerals:
100...imaging device, 110...first camera unit, 111...first optical system, 112...first drive mechanism, 113...first imaging element, 114...first analog front end (AFE), 120...second camera unit, 121...second optical system, 122...second drive mechanism, 123...second imaging element, 124...second analog front end (AFE), 130...display, 150...system control unit, 151...working memory, 152...image processing unit, 160...operation unit, 161...shutter button, 162...touch panel, 170...storage unit, 180...communication interface, 210...ordinary pixel, 220...phase pixel, 221...first phase pixel, 222...second phase pixel, 301...wide-angle image, 302...telephoto image, 310...first focus area, 320...second focus area, 910...person, 920...forest.

Claims (8)

  1. An imaging device, characterized by comprising:
    a first camera unit having a first optical system;
    a second camera unit arranged to face the same direction as the first camera unit; and
    a focus control unit that executes focus control of the first camera unit and the second camera unit,
    wherein the first camera unit and the second camera unit each have an imaging element formed by a two-dimensional arrangement of ordinary pixels that output image signals for forming an image and discretely arranged phase pixels, surrounded by the ordinary pixels, that output phase difference signals for focus detection,
    and wherein, when forming the image using the image signals output from the imaging element of the first camera unit, the focus control unit executes focus control of the first optical system with reference to second defocus information acquired from the phase difference signals output by the imaging element of the second camera unit.
  2. The imaging device according to claim 1, wherein
    the focus control unit determines the focus area in the focus control of the first optical system based on a baseline length determined by the arrangement of the first camera unit and the second camera unit.
  3. The imaging device according to claim 1 or 2, wherein
    if the focus control unit determines that the second defocus information is needed to assist the focus control of the first optical system, it executes the focus control of the first optical system based on the second defocus information.
  4. The imaging device according to claim 3, wherein
    the focus control unit acquires the first defocus information and the second defocus information in parallel.
  5. The imaging device according to claim 4, wherein
    the focus control unit determines whether it is difficult to execute the focus control of the first optical system by comparing the first defocus information with the second defocus information.
  6. The imaging device according to any one of claims 1 to 5, wherein
    after driving the focus lens of the first optical system with reference to the second defocus information, the focus control unit corrects the position of the focus lens based on contrast information calculated using the image signals output from the imaging element of the first camera unit.
  7. The imaging device according to any one of claims 1 to 6, wherein
    the first optical system includes a liquid lens capable of focusing according to an electrical signal, and
    the focus control unit controls the first optical system to focus based on the electrical signal determined from the first defocus information and/or the second defocus information.
  8. A focus control program, characterized in that
    the program performs focus control of a first camera unit and a second camera unit of an imaging device,
    the imaging device comprising: the first camera unit, which has a first optical system; and the second camera unit, which has a second optical system arranged to face the same direction as the first camera unit,
    the first camera unit and the second camera unit each having an imaging element formed by a two-dimensional arrangement of ordinary pixels that output image signals for forming an image and discretely arranged phase pixels, surrounded by the ordinary pixels, that output phase difference signals for focus detection,
    and that, when forming the image using the image signals output from the imaging element of the first camera unit,
    the program causes a computer to execute:
    an acquisition step of acquiring second defocus information based on the phase difference signals output from the imaging element of the second camera unit; and
    a driving step of driving a focus lens of the first optical system with reference to the second defocus information.
PCT/CN2022/094276 2022-05-20 2022-05-20 Imaging device and focus control program WO2023221141A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020237037882A KR20230167074A (ko) 2022-05-20 2022-05-20 Imaging device and focus control program
CN202280001794.6A CN117546475A (zh) 2022-05-20 2022-05-20 Imaging device and focus control program
PCT/CN2022/094276 WO2023221141A1 (zh) 2022-05-20 2022-05-20 Imaging device and focus control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/094276 WO2023221141A1 (zh) 2022-05-20 2022-05-20 拍摄装置以及对焦控制程序

Publications (1)

Publication Number Publication Date
WO2023221141A1 true WO2023221141A1 (zh) 2023-11-23

Family

ID=88834417

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/094276 WO2023221141A1 (zh) 2022-05-20 2022-05-20 拍摄装置以及对焦控制程序

Country Status (3)

Country Link
KR (1) KR20230167074A (zh)
CN (1) CN117546475A (zh)
WO (1) WO2023221141A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016090785A (ja) 2014-11-04 2016-05-23 Canon Inc. Imaging apparatus and control method therefor
CN106576143A (zh) 2014-07-25 2017-04-19 Samsung Electronics Co., Ltd. Image capturing apparatus and image capturing method
CN106603911A (zh) 2016-11-29 2017-04-26 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Control method, control device, and electronic device
CN112004026A (zh) 2020-09-01 2020-11-27 Beijing Xiaomi Mobile Software Co., Ltd. Phase focusing apparatus and method, shooting method and apparatus, terminal device, and medium
WO2021106529A1 (ja) 2019-11-29 2021-06-03 FUJIFILM Corporation Information processing device, information processing method, and program
CN114079734A (zh) 2015-11-24 2022-02-22 Samsung Electronics Co., Ltd. Digital photographing apparatus and operation method thereof


Also Published As

Publication number Publication date
CN117546475A (zh) 2024-02-09
KR20230167074A (ko) 2023-12-07


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202280001794.6

Country of ref document: CN

ENP Entry into the national phase

Ref document number: 2022538156

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20237037882

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020237037882

Country of ref document: KR