CN108337425B - Image capturing device and focusing method thereof - Google Patents


Info

Publication number
CN108337425B
CN108337425B (application CN201710043901.2A)
Authority
CN
China
Prior art keywords
pixel
focusing
image
region
similarity
Prior art date
Legal status
Active
Application number
CN201710043901.2A
Other languages
Chinese (zh)
Other versions
CN108337425A (en)
Inventor
和佑
Current Assignee
Acer Inc
Original Assignee
Acer Inc
Priority date
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN201710043901.2A
Publication of CN108337425A
Application granted
Publication of CN108337425B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an image capturing device and a focusing method thereof, suitable for an image capturing device having an image sensor and a touch screen, comprising the following steps. First, a real-time image is captured with the image sensor and displayed on the touch screen. Next, a touch operation on the touch screen is detected, and a region of interest associated with the touch operation is selected from the real-time image. A first focus value of the region of interest is then calculated according to the similarity of each pixel of the region of interest with respect to at least one central pixel, and it is determined whether the first focus value meets a focusing condition. When the first focus value meets the focusing condition, the lens of the image sensor is controlled to move to a focusing position corresponding to the first focus value.

Description

Image capturing device and focusing method thereof
Technical Field
The present disclosure relates to image capturing devices, and particularly to an image capturing device and a focusing method thereof.
Background
With the development of science and technology, various intelligent image capturing devices, such as tablet computers, personal digital assistants and smart phones, have become indispensable tools for modern people. The camera lenses carried by high-end intelligent image capturing devices are comparable to, and can even replace, traditional consumer cameras, and a few high-end models offer pixel counts and image quality approaching those of digital single-lens reflex cameras.
Generally, the focusing operation of an image capturing device calculates image sharpness while moving the lens, so as to obtain the best-focused image. However, in a scene with rich background information, this focusing function often focuses on the wrong subject, so that the captured picture is not focused as the user intended.
Disclosure of Invention
In view of the above, the present invention provides an image capturing device and a focusing method thereof, which can reduce focusing failures and greatly improve the user experience.
In an embodiment of the invention, the focusing method of the image capturing device is suitable for an image capturing device having an image sensor and a touch screen, and includes the following steps. First, a real-time image is captured with the image sensor and displayed on the touch screen. Next, a touch operation on the touch screen is detected, and a region of interest associated with the touch operation is selected from the real-time image. A first focus value of the region of interest is then calculated according to the similarity of each pixel of the region of interest with respect to at least one central pixel, and it is determined whether the first focus value meets a focusing condition. When the first focus value meets the focusing condition, the lens of the image sensor is controlled to move to a focusing position corresponding to the first focus value.
In an embodiment of the invention, the image capturing apparatus includes an image sensor, a touch screen, a memory and a processor, wherein the image sensor includes a lens and the processor is coupled to the image sensor, the touch screen and the memory. The processor captures a real-time image with the image sensor, displays the real-time image on the touch screen, detects a touch operation on the touch screen, selects a region of interest associated with the touch operation from the real-time image, calculates a first focus value of the region of interest according to the similarity of each pixel of the region of interest with respect to at least one central pixel, and determines whether the first focus value meets a focusing condition. When the first focus value meets the focusing condition, the processor controls the lens to move to a focusing position corresponding to the first focus value.
Drawings
Fig. 1 is a block diagram of an image capturing apparatus according to an embodiment of the invention.
Fig. 2 is a flowchart illustrating a focusing method of an image capturing device according to an embodiment of the invention.
Fig. 3 is a flowchart illustrating a focusing method of an image capturing device according to an embodiment of the invention.
Fig. 4 is a diagram illustrating a relationship between a focus value and a lens position according to an embodiment of the invention.
Fig. 5 is a flowchart illustrating a focusing method of an image capturing device according to an embodiment of the invention.
Description of reference numerals:
100: an image capturing device;
10: an image sensor;
15: a lens;
20: a touch screen;
30: a memory;
40: a processor;
S202 to S214: steps of the focusing method;
302, 502: real-time images;
ROIa, ROIb: regions of interest;
Ca, Cb: touch positions;
304, 504: all pixels of the region of interest;
306a, 306b, 306, 506a, 506b, 506: weight distribution maps;
C1, C2: curves of focus value versus lens position;
Pf: the focusing position of the foreground object;
Pb: the focusing position of the background area.
Detailed Description
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Some embodiments of the invention are described in detail below with reference to the drawings, where like reference numerals refer to like or similar elements throughout the several views. These embodiments are only a part of the invention and do not disclose all of its possible implementations; rather, they are merely examples of the methods and apparatuses set forth in the claims.
Fig. 1 is a block diagram of an image capturing apparatus according to an embodiment of the present invention, provided for convenience of illustration only and not intended to limit the invention. Fig. 1 first describes all the components of the image capturing apparatus and their configuration; details of their functions will be disclosed together with fig. 2.
Referring to fig. 1, the image capturing apparatus 100 includes an image sensor 10, a touch screen 20, a memory 30 and a processor 40. In this embodiment, the image capturing device 100 is, for example, a digital camera, a single-lens reflex camera, a digital video camera, or another device with image capturing functions such as a smart phone, a tablet computer, a personal digital assistant, or a head-mounted display, but the invention is not limited thereto.
The image sensor 10 includes a lens 15, an actuator (not shown) and a photosensitive element (not shown), wherein the lens 15 includes one or more lens elements. The actuator may be, for example, a stepping motor, a voice coil motor (VCM), a piezoelectric actuator, or another actuator that can mechanically move the lens, and the invention is not limited thereto. The photosensitive element senses the intensity of the light entering the lens and generates an image accordingly. The photosensitive element may be, for example, a charge coupled device (CCD), a complementary metal-oxide semiconductor (CMOS) element, or another element, which is not limited herein.
The touch screen 20 is a display device integrated with touch detection elements, providing both display and input functions. The display device may be, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, a field emission display (FED), or another type of display, but is not limited thereto. The touch detection elements are arranged in rows and columns on the display device for detecting a touch of a user's finger, palm or other object on the touch screen 20. The touch detection elements may be, for example but not limited to, capacitive, surface-acoustic-wave, electromagnetic, or near-field-imaging touch detection elements.
The memory 30 is used for storing images and data, and may be any type of fixed or removable Random Access Memory (RAM), read-only memory (ROM), flash memory (flash memory), hard disk, or the like, or any combination thereof.
The processor 40 may be, for example, a Central Processing Unit (CPU), or other programmable general purpose or special purpose microprocessor (microprocessor), Digital Signal Processor (DSP), programmable controller, Application Specific Integrated Circuit (ASIC), Programmable Logic Device (PLD), or other similar device or combination thereof. The processor 40 is coupled to the image sensor 10, the touch screen 20 and the memory 30, and is configured to execute a focusing process to improve the quality of the image captured by the image capturing apparatus 100.
Fig. 2 is a flowchart illustrating a focusing method of the image capturing apparatus according to an embodiment of the present invention, and the focusing method of the image capturing apparatus of fig. 2 can be implemented by various elements of the image capturing apparatus 100 of fig. 1.
Referring to fig. 1 and fig. 2, first, the processor 40 of the image capturing apparatus 100 captures a real-time image by using the image sensor 10 (step S202) and displays the real-time image on the touch screen 20 (step S204). In detail, the processor 40 may continuously capture real-time images in front of the image capturing apparatus 100 by using the image sensor 10 while displaying the captured real-time images on the touch screen 20 for the user to view. In one embodiment, the processor 40 may control the lens 15 to move to the focusing position by using a focusing technique such as hill climbing to obtain a real-time image with maximum sharpness. Since the real-time image is only for the user to view the composition, the lens 15 can be moved in larger steps to speed up focusing. In another embodiment, however, the processor 40 may execute the next step S206 while the lens 15 has not yet finished focusing, and the invention is not limited in this regard.
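The hill-climbing search mentioned above can be sketched in Python. This is an illustrative sketch only, not part of the disclosure: the sharpness function, lens-position range, and step sizes are all assumptions.

```python
def hill_climb_focus(sharpness, lo=0, hi=100, coarse=8, fine=1):
    """Coarse-to-fine hill climbing over lens positions.

    `sharpness` is an assumed callable returning a focus value for a
    given lens position; the step sizes are illustrative, not from the
    patent.
    """
    best_pos, best_val = lo, sharpness(lo)
    pos = lo
    # Coarse scan: advance in large steps until the focus value drops.
    while pos + coarse <= hi:
        pos += coarse
        val = sharpness(pos)
        if val < best_val:  # passed the peak, stop the coarse scan
            break
        best_pos, best_val = pos, val
    # Fine scan around the coarse peak.
    for p in range(max(lo, best_pos - coarse),
                   min(hi, best_pos + coarse) + 1, fine):
        v = sharpness(p)
        if v > best_val:
            best_pos, best_val = p, v
    return best_pos
```

The coarse pass trades accuracy for speed (as the text notes, larger steps suffice for a preview image), and the fine pass recovers the exact peak only in the neighbourhood where it must lie.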
Next, the processor 40 detects a touch operation on the touch screen 20 to select a region of interest (ROI) associated with the touch operation from the real-time image (step S206). Here, the image capturing apparatus 100 displays the real-time image on the touch screen 20 so that the user can intuitively select the region of interest for focusing. When the user performs a touch operation such as tapping or pressing on the touch screen 20, the processor 40 detects the position of the touch operation and sets a region of a predetermined size, centered on the corresponding position in the real-time image, as the region of interest. The predetermined size may be a default value stored in the memory 30 and may be manually adjusted by the user at any time. For example, the processor 40 may display the region of interest in the real-time image as a frame, and the user may adjust its size by, for example, a dragging operation on the touch screen 20.
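Selecting a fixed-size region of interest around the touch point, as described above, can be sketched as follows; the 128-pixel default size and the clamp-to-image-bounds policy are illustrative assumptions, not stated in the patent.

```python
def roi_from_touch(touch_x, touch_y, img_w, img_h, roi_w=128, roi_h=128):
    """Return (left, top, right, bottom) of a predetermined-size ROI
    centred on the touch point, shifted as needed to stay inside the
    image. The default size and shifting policy are assumptions."""
    left = min(max(touch_x - roi_w // 2, 0), img_w - roi_w)
    top = min(max(touch_y - roi_h // 2, 0), img_h - roi_h)
    return left, top, left + roi_w, top + roi_h
```

A touch near an image corner then yields an ROI flush with that corner rather than a window extending outside the image.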
Compared with the global search of the conventional focusing process, since the focusing region is locked to the region of interest, the processor 40 can perform the focusing process only on the region of interest, thereby shortening the focusing time. It is reasonable to assume that the object at the position of the touch operation is the target object on which the user wants to focus. Since the region of interest is defined with the position of the touch operation as its center, the central region of the region of interest is where the target object is located. Here, the processor 40 calculates a first focus value of the region of interest according to the similarity of each pixel of the region of interest with respect to at least one central pixel (step S208), where a central pixel is a pixel located at the center of the region of interest; the number of central pixels is not limited to a single pixel, so as to reduce susceptibility to noise or artifacts.
In this embodiment, the processor 40 first calculates the similarity of each pixel of the region of interest with respect to the central pixel. In other words, the processor 40 calculates the similarity between each pixel of the region of interest and the pixel corresponding to the user's touch operation. The similarity can be expressed by at least one of the image-coordinate distance between each pixel and the central pixel and the chromaticity-coordinate distance between them, where the chromaticity coordinates can be based on color spaces such as RGB or CMYK, which is not limited herein. Next, the processor 40 assigns each pixel a weight value according to its similarity with respect to the central pixel, where the greater the similarity, the greater the assigned weight value. In other words, pixels that are adjacent to the target object corresponding to the user's touch operation and have color characteristics similar to the target object are given larger weight values.
Specifically, a generally known focusing algorithm simply sums the contrast over all pixels in the focusing range, for example as in formula (1):

FV(x) = FV_fg(x) + FV_bg(x) = Σ_(i,j) G(i,j,x) (1)

where (i, j) is the image coordinate of a pixel of the region of interest, x is the position of the lens, G(i,j,x) is the contrast of the pixel at (i, j) when the lens is at position x, FV_fg(x) is the summed contrast of the foreground region, FV_bg(x) is the summed contrast of the background region, and FV(x) is the summed contrast of the region of interest (i.e., the focus value). On the other hand, the processor 40 of the present embodiment obtains the focus value of the region of interest (i.e., the aforementioned first focus value) from the contrast and the weight value of each pixel in the region of interest, for example as in formula (2):
C(x) = C_fg(x) + C_bg(x) = Σ_(i,j) W(i,j) · G(i,j,x) (2)

where (i, j) is the image coordinate of a pixel of the region of interest, (i0, j0) is the image coordinate of the central pixel, c is the chromaticity coordinate of the pixel with image coordinate (i, j), c0 is the chromaticity coordinate of the central pixel, x is the position of the lens 15, W(i,j) is the weight value assigned to the pixel at (i, j) according to its image-coordinate distance to (i0, j0) and its chromaticity-coordinate distance to c0, C_fg(x) is the weighted contrast sum of the foreground region, C_bg(x) is the weighted contrast sum of the background region, and C(x) is the weighted contrast sum of the region of interest (i.e., the first focus value). Incidentally, the image coordinate of the central pixel may be the single central point of the region of interest or any position within its central region. Likewise, the chromaticity coordinate of the central pixel may be that of the single central point of the region of interest or the average chromaticity coordinate of its central region.
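The weighted contrast summation described above can be sketched as follows. This is an illustrative Python sketch: the 4-neighbour Laplacian used as the per-pixel contrast measure is an assumed stand-in, since the patent does not fix a particular contrast operator.

```python
import numpy as np

def weighted_focus_value(gray, weights):
    """Weighted contrast sum over a region of interest.

    `gray` is the ROI as a 2-D float array captured at the current lens
    position; `weights` is the per-pixel weight map W(i, j). The
    4-neighbour Laplacian contrast measure is an illustrative choice.
    """
    lap = np.abs(4 * gray[1:-1, 1:-1]
                 - gray[:-2, 1:-1] - gray[2:, 1:-1]
                 - gray[1:-1, :-2] - gray[1:-1, 2:])
    # Sum contrast over interior pixels, weighted by W(i, j).
    return float(np.sum(lap * weights[1:-1, 1:-1]))
```

With a uniform weight map this reduces to the conventional unweighted sum of formula (1); a weight map concentrated on the touched object suppresses the background's contribution to the focus value.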
After calculating the first focus value, the processor 40 determines whether the first focus value meets the focusing condition (step S210), that is, whether the lens position corresponding to the first focus value is a position at which a focused image can be generated. The focusing condition here refers to whether the first focus value is a local maximum focus value. If so, the processor 40 controls the lens 15 to move to the focusing position corresponding to the first focus value (step S212). If not, the processor 40 obtains a new focusing position having a local maximum focus value and controls the lens 15 to move to it (step S214), where the new focusing position and the lens position corresponding to the first focus value share the same focusing area, so as to ensure that the lens 15 focuses on the target object desired by the user.
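The focusing-condition check, i.e. whether the first focus value is a local maximum of the focus-value curve, can be sketched as follows; sampling the curve at discrete lens positions is an illustrative assumption.

```python
def is_local_max(values, idx):
    """True if values[idx] is a local maximum of a sampled focus-value
    curve, i.e. no smaller than both of its neighbours (boundary
    samples have only one neighbour). A minimal reading of the
    focusing condition described above."""
    left = values[idx - 1] if idx > 0 else float('-inf')
    right = values[idx + 1] if idx + 1 < len(values) else float('-inf')
    return values[idx] >= left and values[idx] >= right
```

If the check fails, the search would continue along the curve until a position passing this test is found, corresponding to step S214.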
For the sake of clarity, practical applications of the focusing method are described in more detail below.
Fig. 3 is a flowchart illustrating a focusing method of an image capturing device according to an embodiment of the invention.
Referring to fig. 1 and fig. 3, the image 302 is a real-time image captured by the processor 40. Since the image 302 has a complicated background region, a general focusing process would cause the lens 15 to focus on the background region, presenting a sharply focused background and a blurred, out-of-focus foreground object. Now assume that the user performs a touch operation at a position Ca, where the foreground object is located. The processor 40 defines a region of interest ROIa centered on the position Ca and assigns weight values to all pixels 304 of the region of interest ROIa.
In the present embodiment, the processor 40 assigns the weight values of all pixels 304 with a two-dimensional Gaussian function, for example as in formula (3):

W(i,j) = exp(-((i - i0)² + (j - j0)²) / (2σ_s²)) · exp(-((a - a0)² + (b - b0)²) / (2σ_c²)) (3)

where c = (a, b) is the chromaticity coordinate of the pixel with image coordinate (i, j), c0 = (a0, b0) is the chromaticity coordinate of the central pixel, and σ_s and σ_c are the standard deviations of the spatial and chromatic Gaussian terms, respectively. In the weight distribution map 306a, based on image-coordinate distance, regions of the image 302 closer to the position Ca are given larger weight values. In the weight distribution map 306b, based on chromaticity-coordinate distance, regions whose color is closer to that at the position Ca are given larger weight values. In both maps, a lower pixel value represents a lower weight value. In this embodiment, the processor 40 may combine the weight distribution maps 306a and 306b, for example by averaging, to generate the weight distribution map 306, which effectively gives higher weight values to the foreground object and weight values that are low or approximately 0 to the background area.
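Combining a spatial Gaussian term with a chromatic Gaussian term into a single weight map, as described above, might look as follows; the two-channel chroma representation and the sigma values are illustrative assumptions.

```python
import numpy as np

def gaussian_weights(chroma, ci, cj, sigma_s=20.0, sigma_c=0.1):
    """Per-pixel weight map: a spatial Gaussian around the touched
    pixel (ci, cj) multiplied by a chroma Gaussian around that pixel's
    colour. `chroma` is an H x W x 2 array of chromaticity coordinates
    (a, b); the sigma values are assumed tuning parameters."""
    h, w = chroma.shape[:2]
    ii, jj = np.mgrid[0:h, 0:w]
    spatial = np.exp(-((ii - ci) ** 2 + (jj - cj) ** 2)
                     / (2 * sigma_s ** 2))
    c0 = chroma[ci, cj]
    color = np.exp(-np.sum((chroma - c0) ** 2, axis=2)
                   / (2 * sigma_c ** 2))
    return spatial * color
```

Multiplying the two terms (rather than averaging them) is one possible way to realize formula (3); either way, pixels that are both near the touch point and similar in colour receive the largest weights.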
Take the relationship between focus value and lens position illustrated in fig. 4 according to an embodiment of the invention as an example. For the curve C1 generated by the conventional focusing process, the lens would move to the focusing position Pb of the background area. For the curve C2 generated by the processor 40 after assigning similarity-based weight values, however, the lens 15 moves to the focusing position Pf of the foreground object, so as to capture a sharper picture of the foreground object.
Fig. 5 is a flowchart illustrating a focusing method of an image capturing device according to another embodiment of the invention.
Referring to fig. 1 and fig. 5, the image 502 is a real-time image captured by the processor 40. Now assume that the user performs a touch operation at a position Cb, which lies in the background area. The processor 40 defines a region of interest ROIb centered on the position Cb and assigns weight values to all pixels 504 of the region of interest ROIb. In this embodiment, the processor 40 may likewise assign the weight values of all pixels 504 according to formula (3). In the weight distribution map 506a, based on image-coordinate distance, regions of the image 502 closer to the position Cb are given larger weight values. In the weight distribution map 506b, based on chromaticity-coordinate distance, regions whose color is closer to that at the position Cb are given larger weight values. In both maps, a lower pixel value represents a lower weight value. In this embodiment, the processor 40 combines the weight distribution maps 506a and 506b, for example by averaging, to generate the weight distribution map 506, which effectively gives higher weight values to the background area and weight values that are low or approximately 0 to the foreground object. The processor 40 then controls the lens 15 to move to the focusing position in a manner similar to that of fig. 4, so as to capture a picture with a sharper background area.
In summary, the image capturing apparatus and the focusing method thereof provided by the invention define the region of interest through the user's touch operation on the real-time image and perform pixel-similarity calculations on the region of interest to compute a more accurate focus value. The invention can effectively speed up focusing and reduce focusing failures, thereby greatly improving the user experience.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (9)

1. A focusing method of an image capturing device, suitable for an image capturing device having an image sensor and a touch screen, wherein the image sensor comprises a lens, the focusing method comprising the following steps:
capturing a real-time image by using the image sensor, and displaying the real-time image on the touch screen;
detecting a touch operation on the touch screen to select a region of interest associated with the touch operation from the real-time image;
calculating a first focus value of the region of interest according to the similarity of each pixel of the region of interest with respect to at least one central pixel;
determining whether the first focus value meets a focusing condition; and
when the first focus value meets the focusing condition, controlling the lens to move to a focusing position corresponding to the first focus value;
wherein the step of calculating the first focus value of the region of interest comprises:
calculating the similarity of each of the pixels with respect to the central pixel;
calculating a weight value corresponding to the similarity of each pixel, wherein the greater the similarity, the greater the corresponding weight value; and
obtaining the first focus value of the region of interest according to the contrast and the weight value of each pixel.
2. The focusing method of claim 1, wherein the step of selecting the region of interest associated with the touch operation from the real-time image comprises:
setting a region of a predetermined size as the region of interest, centered on the position in the real-time image corresponding to the touch operation.
3. The focusing method of claim 1, wherein the similarity of each pixel with respect to the central pixel is associated with an image coordinate distance between the pixel and the central pixel.
4. The focusing method of claim 1, wherein the similarity of each pixel with respect to the central pixel is related to a chromaticity coordinate distance between the pixel and the central pixel.
5. The focusing method of claim 1, wherein the step of calculating the weight value corresponding to the similarity of each pixel comprises:
and assigning the weight value corresponding to the similarity of each pixel by a Gaussian function.
6. The focusing method of claim 1, wherein the step of determining whether the first focus value meets the focusing condition comprises:
determining whether the first focus value is a local maximum focus value.
7. The focusing method according to claim 1, further comprising:
when the first focus value does not meet the focusing condition, obtaining a new focusing position and moving the lens to the new focusing position.
8. An image capturing device, comprising:
an image sensor including a lens;
a touch screen;
a memory; and
a processor coupled to the image sensor, the touch screen and the memory, and configured to:
capturing a real-time image by using the image sensor, and displaying the real-time image on the touch screen;
detecting a touch operation on the touch screen to select a region of interest associated with the touch operation from the real-time image;
calculating a first focus value of the region of interest according to the similarity of each pixel of the region of interest with respect to at least one central pixel;
determining whether the first focus value meets a focusing condition; and
when the first focus value meets the focusing condition, controlling the lens to move to a focusing position corresponding to the first focus value;
wherein the step of calculating the first focus value of the region of interest comprises:
calculating the similarity of each of the pixels with respect to the central pixel;
calculating a weight value corresponding to the similarity of each pixel, wherein the greater the similarity, the greater the corresponding weight value; and
obtaining the first focus value of the region of interest according to the contrast and the weight value of each pixel.
9. The image capturing device of claim 8, wherein the processor is further configured to obtain a new focusing position and move the lens to the new focusing position when the first focus value does not meet the focusing condition.
CN201710043901.2A 2017-01-19 2017-01-19 Image capturing device and focusing method thereof Active CN108337425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710043901.2A CN108337425B (en) 2017-01-19 2017-01-19 Image capturing device and focusing method thereof


Publications (2)

Publication Number Publication Date
CN108337425A CN108337425A (en) 2018-07-27
CN108337425B true CN108337425B (en) 2020-06-19

Family

ID=62922894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710043901.2A Active CN108337425B (en) 2017-01-19 2017-01-19 Image capturing device and focusing method thereof

Country Status (1)

Country Link
CN (1) CN108337425B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408709A (en) * 2007-10-10 2009-04-15 鸿富锦精密工业(深圳)有限公司 Image viewfinding device and automatic focusing method thereof
CN101441388A (en) * 2007-11-21 2009-05-27 三星Techwin株式会社 Focusing apparatus and method
CN105744167A (en) * 2016-03-28 2016-07-06 努比亚技术有限公司 Image taking method and device, and mobile terminal
CN106131401A (en) * 2016-06-29 2016-11-16 深圳市金立通信设备有限公司 A kind of image pickup method and terminal

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP4492697B2 (en) * 2007-12-28 2010-06-30 カシオ計算機株式会社 Imaging apparatus and program




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant