CN105007422A - Phase focusing method and user terminal - Google Patents

Phase focusing method and user terminal

Info

Publication number
CN105007422A
Authority
CN
China
Prior art keywords
image
noise
pixel
color
image data
Prior art date
Legal status
Granted
Application number
CN201510411747.0A
Other languages
Chinese (zh)
Other versions
CN105007422B (en)
Inventor
吴磊
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201510411747.0A
Publication of CN105007422A
Application granted
Publication of CN105007422B
Status: Active

Abstract

The embodiments of the invention disclose a phase focusing method and a user terminal. The method comprises: in a low-light focusing mode, acquiring a plurality of first images and a plurality of second images through an image sensor within a preset time period using a preset sensitivity value, wherein the first images are acquired by the left pixels of the image sensor and the second images are acquired by the right pixels of the image sensor; performing color-noise filtering on all of the first images and all of the second images; combining all of the color-noise-filtered first images into a first target image and combining all of the color-noise-filtered second images into a second target image; and focusing according to the image data waveform of the first target image and the image data waveform of the second target image. With the phase focusing method, the success rate of phase focusing can be improved when the ambient light intensity is low.

Description

Phase focusing method and user terminal
Technical field
The present invention relates to the field of electronic technology, and in particular to a phase focusing method and a user terminal.
Background technology
When taking photos with an intelligent terminal, a user often needs to focus. Focusing is the process by which the focusing mechanism of the intelligent terminal changes the object distance and the image distance so that the subject is imaged clearly. There are many focusing techniques, including phase focusing, contrast focusing and laser focusing. Among them, phase focusing is favored by more and more users because of its fast focusing speed.
In phase focusing, the pixels of the image sensor are arranged in left-right pairs. The intelligent terminal obtains a first image through the left pixels and a second image through the right pixels, and then generates the image data waveform of the first image and the image data waveform of the second image. When the two image data waveforms coincide, the captured picture is sharpest; the intelligent terminal can therefore derive the in-focus position from the phase difference between the two image data waveforms.
It has been found in practice that phase focusing works well when the light intensity is high. When the light intensity is low, however, the intelligent terminal uses a larger sensitivity value, which causes a large amount of color noise in the first image and the second image. The color noise strongly distorts the image data waveforms and makes the two waveforms inconsistent, so the intelligent terminal cannot calculate the in-focus position accurately. How to improve the success rate of phase focusing under low ambient light intensity is therefore a problem that needs to be solved.
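The patent itself contains no code. Purely as an illustration of the background principle, the following minimal Python sketch estimates the phase difference between a left-pixel waveform and a right-pixel waveform by cross-correlation; the function name and the use of NumPy are assumptions, not part of the patent.

```python
import numpy as np

def estimate_phase_shift(left_wave: np.ndarray, right_wave: np.ndarray) -> int:
    """Estimate the shift (in samples) between two 1-D image data waveforms
    by locating the peak of their cross-correlation."""
    left = left_wave - left_wave.mean()
    right = right_wave - right_wave.mean()
    corr = np.correlate(left, right, mode="full")
    # Convert the peak index into a lag relative to zero shift.
    return int(np.argmax(corr) - (len(right) - 1))

# Example: a waveform and a copy of it shifted by 3 samples.
wave = np.sin(np.linspace(0.0, 6.0 * np.pi, 200))
print(estimate_phase_shift(wave, np.roll(wave, 3)))  # close to -3 (sign depends on convention)
```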
Summary of the invention
The embodiments of the invention disclose a phase focusing method and a user terminal, which can improve the success rate of phase focusing when the ambient light intensity is low.
A first aspect of the embodiments of the invention discloses a phase focusing method, the method comprising:
in a low-light focusing mode, acquiring a plurality of first images and a plurality of second images through an image sensor within a preset time period using a preset sensitivity value, wherein the first images are acquired by the left pixels of the image sensor and the second images are acquired by the right pixels of the image sensor;
performing color-noise filtering on all of the first images and all of the second images;
combining all of the color-noise-filtered first images into a first target image, and combining all of the color-noise-filtered second images into a second target image; and
focusing according to the image data waveform of the first target image and the image data waveform of the second target image.
In a first possible implementation of the first aspect of the embodiments of the invention, before the acquiring, in the low-light focusing mode, of the plurality of first images and the plurality of second images through the image sensor within the preset time period using the preset sensitivity value, the method further comprises:
detecting whether the light intensity value of the current environment is less than a preset threshold; and
entering the low-light focusing mode when the light intensity value of the current environment is less than the preset threshold.
In a second possible implementation of the first aspect of the embodiments of the invention, the number of first images is a multiple of 2, the number of second images is also a multiple of 2, and the performing color-noise filtering on all of the first images and the second images comprises:
dividing the first images into groups of two and determining the color-noise pixels of the first images in each group, and dividing the second images into groups of two and determining the color-noise pixels of the second images in each group; and
performing color-noise filtering on the color-noise pixels of the first images in each group, and performing color-noise filtering on the color-noise pixels of the second images in each group.
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect of the embodiments of the invention, the color-noise pixels of a first image are the pixels whose image data differs from that of the other first image of the same group at the same position, and the color-noise pixels of a second image are the pixels whose image data differs from that of the other second image of the same group at the same position.
With reference to the first aspect, or the first, second or third possible implementation of the first aspect, in a fourth possible implementation of the first aspect of the embodiments of the invention, the combining of all of the color-noise-filtered first images into a first target image and of all of the color-noise-filtered second images into a second target image comprises:
accumulating the image data of the pixels at the same position of all of the color-noise-filtered first images to obtain first accumulated image data for the pixel at each position;
setting the first accumulated image data as the image data of the first target image at the pixel at the same position, so as to combine all of the color-noise-filtered first images into the first target image;
accumulating the image data of the pixels at the same position of all of the color-noise-filtered second images to obtain second accumulated image data for the pixel at each position; and
setting the second accumulated image data as the image data of the second target image at the pixel at the same position, so as to combine all of the color-noise-filtered second images into the second target image.
A second aspect of the embodiments of the invention further discloses a user terminal, the user terminal comprising:
an acquisition module, configured to acquire, in a low-light focusing mode, a plurality of first images and a plurality of second images through an image sensor within a preset time period using a preset sensitivity value, wherein the first images are acquired by the left pixels of the image sensor and the second images are acquired by the right pixels of the image sensor;
a color-noise filtering module, configured to perform color-noise filtering on all of the first images and all of the second images;
a merging module, configured to combine all of the color-noise-filtered first images into a first target image and to combine all of the color-noise-filtered second images into a second target image; and
a focusing module, configured to focus according to the image data waveform of the first target image and the image data waveform of the second target image.
In a first possible implementation of the second aspect of the embodiments of the invention, the user terminal further comprises:
a detection module, configured to detect whether the light intensity value of the current environment is less than a preset threshold before the acquisition module acquires, in the low-light focusing mode, the plurality of first images and the plurality of second images through the image sensor within the preset time period using the preset sensitivity value; and
a mode switching module, configured to enter the low-light focusing mode when the light intensity value of the current environment is less than the preset threshold.
In a second possible implementation of the second aspect of the embodiments of the invention, the number of first images is a multiple of 2, the number of second images is also a multiple of 2, and the color-noise filtering module comprises:
a determining unit, configured to divide the first images into groups of two and determine the color-noise pixels of the first images in each group, and to divide the second images into groups of two and determine the color-noise pixels of the second images in each group; and
a color-noise filtering unit, configured to perform color-noise filtering on the color-noise pixels of the first images in each group, and to perform color-noise filtering on the color-noise pixels of the second images in each group.
With reference to the second possible implementation of the second aspect, in a third possible implementation of the second aspect of the embodiments of the invention, the color-noise pixels of a first image are the pixels whose image data differs from that of the other first image of the same group at the same position, and the color-noise pixels of a second image are the pixels whose image data differs from that of the other second image of the same group at the same position.
With reference to the second aspect, or the first, second or third possible implementation of the second aspect, in a fourth possible implementation of the second aspect of the embodiments of the invention, the merging module comprises:
an accumulation unit, configured to accumulate the image data of the pixels at the same position of all of the color-noise-filtered first images to obtain first accumulated image data for the pixel at each position;
a merging unit, configured to set the first accumulated image data as the image data of the first target image at the pixel at the same position, so as to combine all of the color-noise-filtered first images into the first target image;
the accumulation unit being further configured to accumulate the image data of the pixels at the same position of all of the color-noise-filtered second images to obtain second accumulated image data for the pixel at each position; and
the merging unit being further configured to set the second accumulated image data as the image data of the second target image at the pixel at the same position, so as to combine all of the color-noise-filtered second images into the second target image.
By implementing the embodiments of the invention, the success rate of phase focusing can be improved when the ambient light intensity is low.
Brief description of the drawings
In order to illustrate the technical solutions in the embodiments of the invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a phase focusing method disclosed in an embodiment of the invention;
Fig. 2 is a schematic flowchart of another phase focusing method disclosed in an embodiment of the invention;
Fig. 3 is a schematic structural diagram of a user terminal disclosed in an embodiment of the invention;
Fig. 4 is a schematic structural diagram of another user terminal disclosed in an embodiment of the invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the invention are described clearly and completely below with reference to the drawings in the embodiments of the invention. Obviously, the described embodiments are only some of the embodiments of the invention rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the protection scope of the invention.
The embodiments of the invention disclose a phase focusing method and a user terminal, which can improve the success rate of phase focusing when the ambient light intensity is low. They are described in detail below.
Referring to Fig. 1, Fig. 1 is a schematic flowchart of a phase focusing method disclosed in an embodiment of the invention. As shown in Fig. 1, the phase focusing method may comprise the following steps.
S101: in a low-light focusing mode, acquire a plurality of first images and a plurality of second images through an image sensor within a preset time period, using a preset sensitivity value.
In this embodiment of the invention, the user terminal acquires, in the low-light focusing mode, a plurality of first images and a plurality of second images through the image sensor within a preset time period, using a preset sensitivity value. The user terminal may include, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer or any other user terminal with a camera function. The operating system of the user terminal may include, but is not limited to, Android, iOS, Symbian, BlackBerry OS, Windows and so on; the embodiments of the invention are not limited in this respect.
In this embodiment of the invention, the image sensor is the image sensor required by the phase focusing technique; its pixels are arranged in left-right pairs.
In this embodiment of the invention, the user terminal acquires the plurality of first images through the left pixels of the image sensor and the plurality of second images through the right pixels of the image sensor.
In this embodiment of the invention, optionally, the user may tap a focusing-mode setting button to set the current focusing mode to the low-light focusing mode. Alternatively, the user terminal enters the low-light focusing mode when it detects that the light intensity value of the current environment is less than a preset threshold.
It has been found in practice that the lower the ambient light intensity, the larger the sensitivity value the user terminal uses to capture an image, and the more color noise the captured image contains. The more color noise an image contains, the more its image data waveform is affected, and phase focusing may fail as a result. Therefore, by acquiring a plurality of first images and a plurality of second images through the image sensor within a preset time period using a preset sensitivity value (which may be a relatively small sensitivity value), the embodiments of the invention can reduce the color noise of the first images and the second images, which helps to improve the success rate of phase focusing.
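No capture API is named in the patent. As a hedged sketch of step S101 only, the following code collects left/right image pairs at a fixed preset sensitivity for a preset time period; FakePhaseSensor, capture_pair and the specific ISO and duration values are invented stand-ins for a real camera driver.

```python
import time
import numpy as np

class FakePhaseSensor:
    """Stand-in for a phase-detection image sensor; a real driver would differ."""
    def capture_pair(self, iso: int):
        # Simulated left-pixel and right-pixel images as noisy 8-bit RGB frames.
        shape = (60, 80, 3)
        left = np.random.randint(0, 256, shape, dtype=np.uint8)
        right = np.random.randint(0, 256, shape, dtype=np.uint8)
        return left, right

def capture_frames(sensor, preset_iso: int = 400, period_s: float = 0.1):
    """Collect first (left-pixel) and second (right-pixel) images for
    period_s seconds at the fixed, preset sensitivity (step S101)."""
    firsts, seconds = [], []
    deadline = time.monotonic() + period_s
    while time.monotonic() < deadline:
        left, right = sensor.capture_pair(iso=preset_iso)
        firsts.append(left)
        seconds.append(right)
    return firsts, seconds

firsts, seconds = capture_frames(FakePhaseSensor())
print(len(firsts), "first images and", len(seconds), "second images captured")
```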
S102: perform color-noise filtering on all of the first images and all of the second images.
In this embodiment of the invention, after obtaining the first images and the second images, the user terminal performs color-noise filtering on all of them. Specifically, the user terminal determines the color-noise pixels of all of the first images and second images, and then performs color-noise filtering on those color-noise pixels.
In this embodiment of the invention, the user terminal may filter the color-noise pixels of the first images and second images in either of two ways. In the first way, the user terminal sets the RGB value of a color-noise pixel to the RGB value of the adjacent pixel with the lowest brightness value. In the second way, the user terminal sets the RGB value of a color-noise pixel to a preset RGB value.
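The two filtering ways are described only in prose. The sketch below is one possible reading of them, assuming an H x W x 3 uint8 image and a boolean mask that marks the color-noise pixels; the parameter names and the choice of mean RGB as the brightness measure are assumptions.

```python
import numpy as np

def filter_color_noise(image: np.ndarray, noise_mask: np.ndarray,
                       mode: str = "darkest_neighbor",
                       preset_rgb=(0, 0, 0)) -> np.ndarray:
    """Replace each color-noise pixel either with the RGB value of the adjacent
    pixel having the lowest brightness ("darkest_neighbor") or with a preset
    RGB value ("preset")."""
    out = image.copy()
    h, w, _ = image.shape
    brightness = image.astype(np.float32).mean(axis=2)
    for y, x in zip(*np.nonzero(noise_mask)):
        if mode == "preset":
            out[y, x] = preset_rgb
            continue
        best_rgb, best_brightness = None, np.inf
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy, dx) == (0, 0) or not (0 <= ny < h and 0 <= nx < w):
                    continue
                if brightness[ny, nx] < best_brightness:
                    best_brightness, best_rgb = brightness[ny, nx], image[ny, nx]
        if best_rgb is not None:
            out[y, x] = best_rgb
    return out

img = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1, 1] = True
print(filter_color_noise(img, mask)[1, 1])
```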
As an optional implementation, the user terminal may perform color-noise filtering on all of the first images through the following steps:
11) the user terminal compares the image data of the pixels at the same position of all of the first images, and judges whether the image data of all of the first images at that position is identical;
12) if the image data of all of the first images at a given position is identical, the user terminal determines that the pixel at that position is not a color-noise pixel;
13) if the image data of all of the first images at a given position is not all identical, the user terminal determines that the pixel at that position is a color-noise pixel of the first images;
14) the user terminal performs color-noise filtering on the color-noise pixels of all of the first images.
In this implementation, the image data may include, but is not limited to, an RGB value, a brightness value or a color saturation; the embodiments of the invention are not limited in this respect.
For example, if there are a first image 1 and a first image 2, the user terminal compares whether the image data of the pixel in the first row and first column of first image 1 is identical to that of first image 2. If not, it determines that the pixels in the first row and first column of first image 1 and first image 2 are color-noise pixels; if so, it determines that they are not color-noise pixels. Likewise, the user terminal judges whether first image 1 and first image 2 are identical at the pixels of every other position.
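A hedged sketch of steps 11) to 13): build a mask of the positions where the image data of the frames is not identical. Comparing raw RGB equality is one possible reading; the patent leaves the exact image data and comparison open.

```python
import numpy as np

def color_noise_mask(frames: list) -> np.ndarray:
    """Return a boolean H x W mask that is True wherever the frames do not all
    carry identical image data at the same pixel position (steps 11-13)."""
    stack = np.stack(frames)                      # shape (N, H, W, 3)
    identical = (stack == stack[0]).all(axis=(0, 3))
    return ~identical

# Example: two frames identical everywhere except one pixel.
a = np.zeros((3, 3, 3), dtype=np.uint8)
b = a.copy()
b[2, 2] = (255, 0, 0)
print(color_noise_mask([a, b]))  # True only at position (2, 2)
```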
As an optional implementation, the user terminal may perform color-noise filtering on all of the second images through the following steps:
21) the user terminal compares the image data of the pixels at the same position of all of the second images, and judges whether the image data of all of the second images at that position is identical;
22) if the image data of all of the second images at a given position is identical, the user terminal determines that the pixel at that position is not a color-noise pixel;
23) if the image data of all of the second images at a given position is not all identical, the user terminal determines that the pixel at that position is a color-noise pixel of the second images;
24) the user terminal performs color-noise filtering on the color-noise pixels of all of the second images.
In this implementation, the image data may include, but is not limited to, an RGB value, a brightness value or a color saturation; the embodiments of the invention are not limited in this respect. The principle by which the user terminal performs color-noise filtering on all of the second images is the same as that for all of the first images, and is not repeated here.
As an optional implementation, the number of first images is a multiple of 2, the number of second images is also a multiple of 2, and the user terminal may perform color-noise filtering on all of the first images and second images through the following steps:
31) divide the first images into groups of two and determine the color-noise pixels of the first images in each group, and divide the second images into groups of two and determine the color-noise pixels of the second images in each group;
32) perform color-noise filtering on the color-noise pixels of the first images in each group, and perform color-noise filtering on the color-noise pixels of the second images in each group.
The specific implementation of this alternative is described in the embodiment shown in Fig. 2.
S103: combine all of the color-noise-filtered first images into a first target image, and combine all of the color-noise-filtered second images into a second target image.
In this embodiment of the invention, after filtering all of the first images and second images, the user terminal combines all of the color-noise-filtered first images into a first target image and combines all of the color-noise-filtered second images into a second target image.
As an optional implementation, the user terminal may combine all of the color-noise-filtered first images into the first target image and combine all of the color-noise-filtered second images into the second target image through the following steps:
21) the user terminal accumulates the image data of the pixels at the same position of all of the color-noise-filtered first images to obtain first accumulated image data for the pixel at each position;
22) the user terminal sets the first accumulated image data as the image data of the first target image at the pixel at the same position, so as to combine all of the color-noise-filtered first images into the first target image;
23) the user terminal accumulates the image data of the pixels at the same position of all of the color-noise-filtered second images to obtain second accumulated image data for the pixel at each position;
24) the user terminal sets the second accumulated image data as the image data of the second target image at the pixel at the same position, so as to combine all of the color-noise-filtered second images into the second target image.
In this implementation, the image data is an RGB value. For example, if there are a first image 1, a first image 2 and a first image 3, the user terminal adds up the RGB values of the pixel in the first row and first column of first image 1, first image 2 and first image 3, and takes the accumulated RGB value as the first accumulated image data of the pixel in the first row and first column. Likewise, the user terminal adds up the RGB values of every other same position of first image 1, first image 2 and first image 3, and takes each accumulated RGB value as the first accumulated image data of the corresponding position. After obtaining the first accumulated image data of the first row and first column, the user terminal uses it as the RGB value of the pixel in the first row and first column of the first target image. Likewise, the user terminal uses the first accumulated image data of every other position as the RGB value of the first target image at the pixel of that position.
In this implementation, the principle by which the user terminal accumulates the image data of the pixels at the same position of all of the color-noise-filtered second images, obtains the second accumulated image data of the pixel at each position and sets it as the image data of the second target image at that position is the same as that for the first images, and is not repeated here.
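As an illustration only, the per-pixel accumulation of step S103 might look like the sketch below; the patent does not fix a data type, so the wide uint32 accumulator used to avoid 8-bit overflow is an implementation assumption.

```python
import numpy as np

def merge_frames(frames: list) -> np.ndarray:
    """Combine the filtered frames into one target image by accumulating the
    image data (RGB values) of the pixels at each position (step S103)."""
    acc = np.zeros(frames[0].shape, dtype=np.uint32)  # wide accumulator to avoid 8-bit overflow
    for frame in frames:
        acc += frame
    return acc

frames = [np.full((2, 2, 3), v, dtype=np.uint8) for v in (10, 20, 30)]
print(merge_frames(frames)[0, 0])  # [60 60 60]
```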
S104: focus according to the image data waveform of the first target image and the image data waveform of the second target image.
In this embodiment of the invention, after generating the first target image and the second target image, the user terminal focuses according to the image data waveform of the first target image and the image data waveform of the second target image. The image data waveform of the first target image is generated by the user terminal from the RGB values of the first target image, and the image data waveform of the second target image is generated by the user terminal from the RGB values of the second target image.
By combining all of the color-noise-filtered first images into the first target image and all of the color-noise-filtered second images into the second target image, the embodiments of the invention increase the signal strength of the image data waveforms, which helps to improve the success rate of focusing.
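The patent does not specify how the image data waveform is derived from the RGB values. One hedged possibility is to average the RGB intensity over each column of a target image to obtain a 1-D waveform, which could then be compared using the estimate_phase_shift sketch given earlier.

```python
import numpy as np

def waveform(target_image: np.ndarray) -> np.ndarray:
    """One possible image data waveform: the mean RGB intensity of each column
    of the target image. Illustrative assumption only."""
    return target_image.astype(np.float64).mean(axis=(0, 2))

# Focusing would then use the phase difference between the two waveforms, e.g.
# shift = estimate_phase_shift(waveform(first_target), waveform(second_target)).
```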
In the method described in Fig. 1, the user terminal acquires, in the low-light focusing mode, a plurality of first images and a plurality of second images through the image sensor within a preset time period using a preset sensitivity value, and performs color-noise filtering on all of the first images and second images; the user terminal combines all of the color-noise-filtered first images into a first target image and all of the color-noise-filtered second images into a second target image; and the user terminal focuses according to the image data waveform of the first target image and the image data waveform of the second target image. It can be seen that, by implementing this embodiment of the invention, the success rate of phase focusing can be improved when the ambient light intensity is low.
Referring to Fig. 2, Fig. 2 is a schematic flowchart of another phase focusing method disclosed in an embodiment of the invention. As shown in Fig. 2, the phase focusing method may comprise the following steps.
S201: the user terminal detects whether the light intensity value of the current environment is less than a preset threshold.
In this embodiment of the invention, the user terminal detects the light intensity value of the current environment and judges whether it is less than the preset threshold. If the light intensity value of the current environment is less than the preset threshold, step S202 is performed; if the light intensity value of the current environment is greater than the preset threshold, the user terminal continues to detect whether the light intensity value of the current environment is less than the preset threshold.
In this embodiment of the invention, the user terminal enters the low-light focusing mode when it detects that the light intensity value of the current environment is less than the preset threshold, which saves the user from manually selecting the low-light focusing mode and makes the switching of focusing modes more intelligent.
S202: when the light intensity value of the current environment is less than the preset threshold, the user terminal enters the low-light focusing mode.
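A minimal sketch of steps S201 and S202, assuming a hypothetical read_ambient_light() reading and a threshold expressed in lux; neither the name, the unit nor the threshold value comes from the patent.

```python
import time

LOW_LIGHT_THRESHOLD_LUX = 50.0  # assumed preset threshold, for illustration only

def wait_for_low_light(read_ambient_light, poll_interval_s: float = 0.1) -> str:
    """Keep sampling the ambient light level and enter the low-light focusing
    mode once it drops below the preset threshold (steps S201-S202)."""
    while True:
        if read_ambient_light() < LOW_LIGHT_THRESHOLD_LUX:
            return "low_light_focusing_mode"
        time.sleep(poll_interval_s)

# Example with a fake light sensor whose readings darken over time.
readings = iter([120.0, 80.0, 30.0])
print(wait_for_low_light(lambda: next(readings), poll_interval_s=0.0))
```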
S203: in the low-light focusing mode, the user terminal acquires a plurality of first images and a plurality of second images through the image sensor within a preset time period, using a preset sensitivity value.
In this embodiment of the invention, the number of first images is a multiple of 2 and the number of second images is also a multiple of 2.
S204: the user terminal divides the first images into groups of two and determines the color-noise pixels of the first images in each group, and divides the second images into groups of two and determines the color-noise pixels of the second images in each group.
In this embodiment of the invention, the user terminal divides the first images into groups of two and divides the second images into groups of two. For example, if there are a first image 1, a first image 2, a first image 3 and a first image 4, the user terminal may divide them into two groups. Optionally, the user terminal may group first image 1 with first image 2 and first image 3 with first image 4, or group first image 1 with first image 3 and first image 2 with first image 4, or group first image 1 with first image 4 and first image 2 with first image 3; the embodiments of the invention are not limited in this respect.
In this embodiment of the invention, after dividing the first images into groups of two, the user terminal determines the color-noise pixels of the first images in each group; likewise, after dividing the second images into groups of two, the user terminal determines the color-noise pixels of the second images in each group. The color-noise pixels of a first image are the pixels whose image data differs from that of the other first image of the same group at the same position, and the color-noise pixels of a second image are the pixels whose image data differs from that of the other second image of the same group at the same position.
For example, if first image 1 and first image 2 form a group and the image data of the pixel in the first row and first column of first image 1 differs from that of first image 2, the user terminal determines that the pixels in the first row and first column of first image 1 and first image 2 are color-noise pixels. The color-noise pixels of the second images are determined in the same way as those of the first images, and this is not repeated here.
As an optional implementation, the color-noise pixels of a first image are those pixels, among the pixels whose image data differs from that of the other first image of the same group at the same position, whose brightness value is greater than a preset brightness value; the color-noise pixels of a second image are those pixels, among the pixels whose image data differs from that of the other second image of the same group at the same position, whose brightness value is greater than a preset brightness value.
For example, if first image 1 and first image 2 form a group and the image data of the pixel in the first row and first column of first image 1 differs from that of first image 2, the user terminal judges whether the brightness values of the pixels in the first row and first column of first image 1 and first image 2 are greater than the preset brightness value. If the brightness value of the pixel in the first row and first column of first image 1 is greater than the preset brightness value, that pixel is determined to be a color-noise pixel; if it is less than the preset brightness value, that pixel is determined not to be a color-noise pixel. If the brightness value of the pixel in the first row and first column of first image 2 is greater than the preset brightness value, that pixel is determined to be a color-noise pixel; if it is less than the preset brightness value, that pixel is determined not to be a color-noise pixel. The color-noise pixels of the second images are determined in the same way as those of the first images, and this is not repeated here.
By dividing the first images into groups of two and the second images into groups of two and determining the color-noise pixels within each group, the method avoids filtering out too much image data.
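The grouped, brightness-thresholded variant of step S204 might be sketched as follows; pairing consecutive frames, using the mean of RGB as the brightness value and the particular preset brightness value are all assumptions for illustration. Each resulting mask would then be handed to a filtering routine such as the filter_color_noise sketch above.

```python
import numpy as np

PRESET_BRIGHTNESS = 128.0  # assumed preset brightness value

def grouped_noise_masks(frames: list) -> list:
    """Pair the frames two by two and, within each pair, mark as color-noise
    pixels the positions where the two frames differ and whose brightness
    exceeds the preset brightness value (optional variant of step S204)."""
    masks = [np.zeros(frame.shape[:2], dtype=bool) for frame in frames]
    for i in range(0, len(frames) - 1, 2):        # groups of two: (0, 1), (2, 3), ...
        a, b = frames[i], frames[i + 1]
        differs = (a != b).any(axis=2)
        for j, img in ((i, a), (i + 1, b)):
            bright = img.astype(np.float32).mean(axis=2) > PRESET_BRIGHTNESS
            masks[j] = differs & bright
    return masks

frames = [np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(4)]
print([int(m.sum()) for m in grouped_noise_masks(frames)])
```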
S205: the user terminal performs color-noise filtering on the color-noise pixels of the first images in each group, and performs color-noise filtering on the color-noise pixels of the second images in each group.
In this embodiment of the invention, the user terminal may filter the color-noise pixels of the first images and second images in each group in either of two ways. In the first way, the user terminal sets the RGB value of a color-noise pixel to the RGB value of the adjacent pixel with the lowest brightness value. In the second way, the user terminal sets the RGB value of a color-noise pixel to a preset RGB value.
S206: the user terminal combines all of the color-noise-filtered first images into a first target image, and combines all of the color-noise-filtered second images into a second target image.
S207: the user terminal focuses according to the image data waveform of the first target image and the image data waveform of the second target image.
In the method described in Fig. 2, the user terminal acquires, in the low-light focusing mode, a plurality of first images and a plurality of second images through the image sensor within a preset time period using a preset sensitivity value; divides the first images into groups of two and determines the color-noise pixels of the first images in each group, and divides the second images into groups of two and determines the color-noise pixels of the second images in each group; performs color-noise filtering on the color-noise pixels of the first images and second images in each group; combines all of the color-noise-filtered first images into a first target image and all of the color-noise-filtered second images into a second target image; and focuses according to the image data waveform of the first target image and the image data waveform of the second target image. It can be seen that, by implementing this embodiment of the invention, the success rate of phase focusing can be improved when the ambient light intensity is low.
Referring to Fig. 3, Fig. 3 is a schematic structural diagram of a user terminal disclosed in an embodiment of the invention. The user terminal shown in Fig. 3 may comprise an acquisition module 301, a color-noise filtering module 302, a merging module 303 and a focusing module 304, where:
the acquisition module 301 is configured to acquire, in a low-light focusing mode, a plurality of first images and a plurality of second images through an image sensor within a preset time period using a preset sensitivity value, wherein the first images are acquired by the left pixels of the image sensor and the second images are acquired by the right pixels of the image sensor.
In this embodiment of the invention, the acquisition module 301 of the user terminal acquires, in the low-light focusing mode, a plurality of first images and a plurality of second images through the image sensor within a preset time period, using a preset sensitivity value. The user terminal may include, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer or any other user terminal with a camera function. The operating system of the user terminal may include, but is not limited to, Android, iOS, Symbian, BlackBerry OS, Windows and so on; the embodiments of the invention are not limited in this respect.
In this embodiment of the invention, the image sensor is the image sensor required by the phase focusing technique; its pixels are arranged in left-right pairs.
In this embodiment of the invention, the acquisition module 301 acquires the plurality of first images through the left pixels of the image sensor and the plurality of second images through the right pixels of the image sensor.
In this embodiment of the invention, optionally, the user may tap a focusing-mode setting button to set the current focusing mode to the low-light focusing mode. Alternatively, the user terminal enters the low-light focusing mode when it detects that the light intensity value of the current environment is less than a preset threshold.
It has been found in practice that the lower the ambient light intensity, the larger the sensitivity value the user terminal uses to capture an image, and the more color noise the captured image contains. The more color noise an image contains, the more its image data waveform is affected, and phase focusing may fail as a result. Therefore, by having the acquisition module 301 acquire a plurality of first images and a plurality of second images through the image sensor within a preset time period using a preset sensitivity value (which may be a relatively small sensitivity value), the embodiments of the invention can reduce the color noise of the first images and the second images, which helps to improve the success rate of phase focusing.
The color-noise filtering module 302 is configured to perform color-noise filtering on all of the first images and all of the second images.
In this embodiment of the invention, after the acquisition module 301 obtains the first images and the second images, the color-noise filtering module 302 performs color-noise filtering on all of them. Specifically, the color-noise filtering module 302 determines the color-noise pixels of all of the first images and second images, and then performs color-noise filtering on those color-noise pixels.
In this embodiment of the invention, the color-noise filtering module 302 may filter the color-noise pixels of the first images and second images in either of two ways. In the first way, it sets the RGB value of a color-noise pixel to the RGB value of the adjacent pixel with the lowest brightness value. In the second way, it sets the RGB value of a color-noise pixel to a preset RGB value.
As an optional implementation, the color-noise filtering module 302 may perform color-noise filtering on all of the first images as follows:
the color-noise filtering module 302 compares the image data of the pixels at the same position of all of the first images, and judges whether the image data of all of the first images at that position is identical;
when the image data of all of the first images at a given position is identical, the color-noise filtering module 302 determines that the pixel at that position is not a color-noise pixel;
when the image data of all of the first images at a given position is not all identical, the color-noise filtering module 302 determines that the pixel at that position is a color-noise pixel of the first images;
the color-noise filtering module 302 performs color-noise filtering on the color-noise pixels of all of the first images.
In this implementation, the image data may include, but is not limited to, an RGB value, a brightness value or a color saturation; the embodiments of the invention are not limited in this respect.
For example, if there are a first image 1 and a first image 2, the color-noise filtering module 302 compares whether the image data of the pixel in the first row and first column of first image 1 is identical to that of first image 2. If not, it determines that the pixels in the first row and first column of first image 1 and first image 2 are color-noise pixels; if so, it determines that they are not color-noise pixels. Likewise, the color-noise filtering module 302 judges whether first image 1 and first image 2 are identical at the pixels of every other position.
As an optional implementation, the color-noise filtering module 302 may perform color-noise filtering on all of the second images as follows:
the color-noise filtering module 302 compares the image data of the pixels at the same position of all of the second images, and judges whether the image data of all of the second images at that position is identical;
when the image data of all of the second images at a given position is identical, the color-noise filtering module 302 determines that the pixel at that position is not a color-noise pixel;
when the image data of all of the second images at a given position is not all identical, the color-noise filtering module 302 determines that the pixel at that position is a color-noise pixel of the second images;
the color-noise filtering module 302 performs color-noise filtering on the color-noise pixels of all of the second images.
In this implementation, the image data may include, but is not limited to, an RGB value, a brightness value or a color saturation; the embodiments of the invention are not limited in this respect. The principle by which the color-noise filtering module 302 performs color-noise filtering on all of the second images is the same as that for all of the first images, and is not repeated here.
As an optional implementation, the number of first images is a multiple of 2, the number of second images is also a multiple of 2, and the color-noise filtering module 302 comprises a determining unit and a color-noise filtering unit, where:
the determining unit is configured to divide the first images into groups of two and determine the color-noise pixels of the first images in each group, and to divide the second images into groups of two and determine the color-noise pixels of the second images in each group;
the color-noise filtering unit is configured to perform color-noise filtering on the color-noise pixels of the first images in each group, and to perform color-noise filtering on the color-noise pixels of the second images in each group.
In this implementation, the determining unit divides the first images into groups of two and divides the second images into groups of two. For example, if there are a first image 1, a first image 2, a first image 3 and a first image 4, the user terminal may divide them into two groups. Optionally, the user terminal may group first image 1 with first image 2 and first image 3 with first image 4, or group first image 1 with first image 3 and first image 2 with first image 4, or group first image 1 with first image 4 and first image 2 with first image 3; the embodiments of the invention are not limited in this respect.
In this embodiment of the invention, after dividing the first images into groups of two, the determining unit determines the color-noise pixels of the first images in each group; likewise, after dividing the second images into groups of two, the determining unit determines the color-noise pixels of the second images in each group. The color-noise pixels of a first image are the pixels whose image data differs from that of the other first image of the same group at the same position, and the color-noise pixels of a second image are the pixels whose image data differs from that of the other second image of the same group at the same position.
For example, if first image 1 and first image 2 form a group and the image data of the pixel in the first row and first column of first image 1 differs from that of first image 2, the determining unit determines that the pixels in the first row and first column of first image 1 and first image 2 are color-noise pixels. The color-noise pixels of the second images are determined in the same way as those of the first images, and this is not repeated here.
As an optional implementation, the color-noise pixels of a first image are those pixels, among the pixels whose image data differs from that of the other first image of the same group at the same position, whose brightness value is greater than a preset brightness value; the color-noise pixels of a second image are those pixels, among the pixels whose image data differs from that of the other second image of the same group at the same position, whose brightness value is greater than a preset brightness value.
For example, if first image 1 and first image 2 form a group and the image data of the pixel in the first row and first column of first image 1 differs from that of first image 2, the determining unit judges whether the brightness values of the pixels in the first row and first column of first image 1 and first image 2 are greater than the preset brightness value. If the brightness value of the pixel in the first row and first column of first image 1 is greater than the preset brightness value, the determining unit determines that pixel to be a color-noise pixel; if it is less than the preset brightness value, the determining unit determines that pixel not to be a color-noise pixel. If the brightness value of the pixel in the first row and first column of first image 2 is greater than the preset brightness value, the determining unit determines that pixel to be a color-noise pixel; if it is less than the preset brightness value, the determining unit determines that pixel not to be a color-noise pixel. The color-noise pixels of the second images are determined in the same way as those of the first images, and this is not repeated here.
By dividing the first images into groups of two and the second images into groups of two and determining the color-noise pixels within each group, the user terminal avoids filtering out too much image data.
The merging module 303 is configured to combine all of the color-noise-filtered first images into a first target image and to combine all of the color-noise-filtered second images into a second target image.
In this embodiment of the invention, after the color-noise filtering module 302 has filtered all of the first images and second images, the merging module 303 combines all of the color-noise-filtered first images into a first target image and combines all of the color-noise-filtered second images into a second target image.
As an optional implementation, the merging module comprises an accumulation unit and a merging unit, where:
the accumulation unit is configured to accumulate the image data of the pixels at the same position of all of the color-noise-filtered first images to obtain first accumulated image data for the pixel at each position;
the merging unit is configured to set the first accumulated image data as the image data of the first target image at the pixel at the same position, so as to combine all of the color-noise-filtered first images into the first target image;
the accumulation unit is further configured to accumulate the image data of the pixels at the same position of all of the color-noise-filtered second images to obtain second accumulated image data for the pixel at each position; and
the merging unit is further configured to set the second accumulated image data as the image data of the second target image at the pixel at the same position, so as to combine all of the color-noise-filtered second images into the second target image.
In this implementation, the image data is an RGB value. For example, if there are a first image 1, a first image 2 and a first image 3, the accumulation unit adds up the RGB values of the pixel in the first row and first column of first image 1, first image 2 and first image 3, and takes the accumulated RGB value as the first accumulated image data of the pixel in the first row and first column. Likewise, the accumulation unit adds up the RGB values of every other same position of first image 1, first image 2 and first image 3, and takes each accumulated RGB value as the first accumulated image data of the corresponding position. After the accumulation unit obtains the first accumulated image data of the first row and first column, the merging unit uses it as the RGB value of the pixel in the first row and first column of the first target image. Likewise, the merging unit uses the first accumulated image data of every other position as the RGB value of the first target image at the pixel of that position.
In this implementation, the principle by which the accumulation unit accumulates the image data of the pixels at the same position of all of the color-noise-filtered second images to obtain the second accumulated image data of the pixel at each position, and the merging unit sets it as the image data of the second target image at that position, is the same as that for the first images, and is not repeated here.
The focusing module 304 is configured to focus according to the image data waveform of the first target image and the image data waveform of the second target image.
In this embodiment of the invention, after the merging module 303 generates the first target image and the second target image, the focusing module 304 focuses according to the image data waveform of the first target image and the image data waveform of the second target image. The image data waveform of the first target image is generated by the user terminal from the RGB values of the first target image, and the image data waveform of the second target image is generated by the user terminal from the RGB values of the second target image.
By combining all of the color-noise-filtered first images into the first target image and all of the color-noise-filtered second images into the second target image, the embodiments of the invention increase the signal strength of the image data waveforms, which helps to improve the success rate of focusing.
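Purely as an illustration of how the module decomposition of Fig. 3 might be mirrored in code, the class below wires the four modules together; the class and method names are assumptions, and the helpers it calls are the hypothetical sketches given earlier, so it is an outline rather than a standalone implementation.

```python
class UserTerminal:
    """Hypothetical mapping of the Fig. 3 modules onto one class; each module
    becomes a method that delegates to the earlier illustrative sketches."""

    def __init__(self, sensor):
        self.sensor = sensor

    def acquire(self, preset_iso, period_s):          # acquisition module 301
        return capture_frames(self.sensor, preset_iso, period_s)

    def filter_all(self, frames):                     # color-noise filtering module 302
        mask = color_noise_mask(frames)
        return [filter_color_noise(frame, mask) for frame in frames]

    def merge(self, frames):                          # merging module 303
        return merge_frames(frames)

    def focus(self, first_target, second_target):     # focusing module 304
        return estimate_phase_shift(waveform(first_target), waveform(second_target))
```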
Referring to Fig. 4, Fig. 4 is a structural diagram of another user terminal disclosed in an embodiment of the present invention. The user terminal shown in Fig. 4 is obtained by optimizing the user terminal shown in Fig. 3. Compared with the user terminal shown in Fig. 3, the user terminal shown in Fig. 4 includes, in addition to all of the modules and units of the user terminal shown in Fig. 3, a detection module 305 and a mode switching module 306. Specifically:
The detection module 305 is configured to detect whether the light intensity value of the current environment is less than a predetermined threshold before the acquisition module 301, under the low-light focusing mode, uses the preset sensitivity value to acquire the plurality of first images and the plurality of second images through the image sensor within the preset time period.
In this embodiment of the present invention, the detection module 305 detects the light intensity value of the current environment and judges whether it is less than the predetermined threshold. When the detection module 305 detects that the light intensity value of the current environment is less than the predetermined threshold, the mode switching module 306 enters the low-light focusing mode; when the detection module 305 detects that the light intensity value of the current environment is greater than the predetermined threshold, the detection module 305 continues to detect whether the light intensity value of the current environment is less than the predetermined threshold.
The mode switching module 306 is configured to enter the low-light focusing mode when the detection module 305 detects that the light intensity value of the current environment is less than the predetermined threshold.
In this embodiment of the present invention, because the mode switching module 306 enters the low-light focusing mode when the detection module 305 detects that the light intensity value of the current environment is less than the predetermined threshold, the user does not need to select the low-light focusing mode manually, which makes the switching of focusing modes more intelligent.
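A minimal sketch of the behaviour of detection module 305 and mode switching module 306 described above. The light-sensor callable, the 50 lux threshold and the polling interval are placeholders assumed here; the patent only speaks of a light intensity value and a predetermined threshold.

```python
import time

def wait_for_low_light_mode(read_light_lux, threshold_lux=50.0, poll_interval_s=0.2):
    """Keep detecting the ambient light intensity; enter the low-light focusing
    mode as soon as it drops below the threshold, otherwise keep detecting,
    mirroring the loop described for detection module 305 and mode switching
    module 306.

    read_light_lux: callable returning the current light level (placeholder
    for the terminal's light-sensor API).
    """
    while True:
        if read_light_lux() < threshold_lux:
            return "low_light_focusing_mode"
        time.sleep(poll_interval_s)

# Toy usage with a simulated sensor whose readings gradually drop.
readings = iter([120.0, 80.0, 30.0])
mode = wait_for_low_light_mode(lambda: next(readings), poll_interval_s=0.0)
```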
In the user terminals described in Fig. 3 and Fig. 4, the acquisition module, under the low-light focusing mode, uses the preset sensitivity value to acquire a plurality of first images and a plurality of second images through the image sensor within the preset time period; the color-noise filtering module performs color-noise filtering on all of the first images and the second images; the merging module combines all of the color-noise-filtered first images into the first target image and combines all of the color-noise-filtered second images into the second target image; and the focusing module focuses according to the image data waveform of the first target image and the image data waveform of the second target image. It can be seen that implementing the embodiments of the present invention can improve the success rate of phase focusing when the ambient light intensity is low.
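The color-noise filtering referred to in this summary is specified (in claims 3 and 4) only as far as pairing the frames two by two and identifying, within each pair, the pixels whose image data differ. The sketch below follows that pairing rule; replacing each such pixel with the average of the pair is an assumption made here for illustration and is not a step stated in the patent.

```python
import numpy as np

def filter_color_noise_in_pairs(frames):
    """Divide the frames into groups of two; within a group, a pixel whose
    RGB data is not identical across the two frames is treated as a
    color-noise pixel. Filtering such pixels by averaging the pair is an
    illustrative assumption only.
    """
    assert len(frames) % 2 == 0, "the number of frames is a multiple of 2"
    filtered = []
    for a, b in zip(frames[0::2], frames[1::2]):
        noisy = np.any(a != b, axis=-1, keepdims=True)  # color-noise pixel mask
        paired_average = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(a.dtype)
        filtered.append(np.where(noisy, paired_average, a))
        filtered.append(np.where(noisy, paired_average, b))
    return filtered

# Toy usage with four 4x4 "first images"; the count must be a multiple of 2.
first_images = [np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(4)]
filtered_first_images = filter_color_noise_in_pairs(first_images)
```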
It should be noted that, in the above embodiments, the description of each embodiment has its own emphasis; for a part not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments. In addition, those skilled in the art should appreciate that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
The order of the steps in the methods of the embodiments of the present invention may be adjusted, and steps may be combined or deleted according to actual needs.
The modules or units of the user terminals of the embodiments of the present invention may be combined, divided or deleted according to actual needs.
A person of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be completed by a program instructing the relevant hardware of the terminal device; the program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
The phase focusing method and the user terminal disclosed in the embodiments of the present invention have been described in detail above. Specific examples are used herein to set forth the principles and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, a person of ordinary skill in the art may make changes to the specific implementation and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A phase focusing method, characterized in that the method comprises:
under a low-light focusing mode, using a preset sensitivity value to acquire a plurality of first images and a plurality of second images through an image sensor within a preset time period, wherein the first images are acquired by left pixels of the image sensor and the second images are acquired by right pixels of the image sensor;
performing color-noise filtering on all of the first images and the second images;
combining all of the color-noise-filtered first images into a first target image, and combining all of the color-noise-filtered second images into a second target image;
focusing according to an image data waveform of the first target image and an image data waveform of the second target image.
2. The method according to claim 1, characterized in that, before using, under the low-light focusing mode, the preset sensitivity value to acquire the plurality of first images and the plurality of second images through the image sensor within the preset time period, the method further comprises:
detecting whether a light intensity value of a current environment is less than a predetermined threshold;
entering the low-light focusing mode when the light intensity value of the current environment is less than the predetermined threshold.
3. The method according to claim 1, characterized in that the number of the first images is a multiple of 2, the number of the second images is also a multiple of 2, and the performing color-noise filtering on all of the first images and the second images comprises:
dividing the first images into groups of two and determining the color-noise pixels of each first image in each group, and dividing the second images into groups of two and determining the color-noise pixels of each second image in each group;
performing color-noise filtering on the color-noise pixels of each first image in each group, and performing color-noise filtering on the color-noise pixels of each second image in each group.
4. The method according to claim 3, characterized in that the color-noise pixels of a first image are the pixels whose image data is not identical, at the same position, to that of the other first image of the same group, and the color-noise pixels of a second image are the pixels whose image data is not identical, at the same position, to that of the other second image of the same group.
5. The method according to any one of claims 1 to 4, characterized in that the combining all of the color-noise-filtered first images into the first target image and combining all of the color-noise-filtered second images into the second target image comprises:
accumulating the image data of the pixels at each same position of all of the color-noise-filtered first images, to obtain first accumulated image data for the pixel at each same position;
setting the first accumulated image data as the image data of the first target image at the pixel of the same position, so as to combine all of the color-noise-filtered first images into the first target image;
accumulating the image data of the pixels at each same position of all of the color-noise-filtered second images, to obtain second accumulated image data for the pixel at each same position;
setting the second accumulated image data as the image data of the second target image at the pixel of the same position, so as to combine all of the color-noise-filtered second images into the second target image.
6. A user terminal, characterized in that the user terminal comprises:
an acquisition module, configured to use, under a low-light focusing mode, a preset sensitivity value to acquire a plurality of first images and a plurality of second images through an image sensor within a preset time period, wherein the first images are acquired by left pixels of the image sensor and the second images are acquired by right pixels of the image sensor;
a color-noise filtering module, configured to perform color-noise filtering on all of the first images and the second images;
a merging module, configured to combine all of the color-noise-filtered first images into a first target image and to combine all of the color-noise-filtered second images into a second target image;
a focusing module, configured to focus according to an image data waveform of the first target image and an image data waveform of the second target image.
7. The user terminal according to claim 6, characterized in that the user terminal further comprises:
a detection module, configured to detect whether a light intensity value of a current environment is less than a predetermined threshold before the acquisition module, under the low-light focusing mode, uses the preset sensitivity value to acquire the plurality of first images and the plurality of second images through the image sensor within the preset time period;
a mode switching module, configured to enter the low-light focusing mode when the light intensity value of the current environment is less than the predetermined threshold.
8. The user terminal according to claim 6, characterized in that the number of the first images is a multiple of 2, the number of the second images is also a multiple of 2, and the color-noise filtering module comprises:
a determining unit, configured to divide the first images into groups of two and determine the color-noise pixels of each first image in each group, and to divide the second images into groups of two and determine the color-noise pixels of each second image in each group;
a color-noise filtering unit, configured to perform color-noise filtering on the color-noise pixels of each first image in each group, and to perform color-noise filtering on the color-noise pixels of each second image in each group.
9. The user terminal according to claim 8, characterized in that the color-noise pixels of a first image are the pixels whose image data is not identical, at the same position, to that of the other first image of the same group, and the color-noise pixels of a second image are the pixels whose image data is not identical, at the same position, to that of the other second image of the same group.
10. The user terminal according to any one of claims 6 to 9, characterized in that the merging module comprises:
an accumulation unit, configured to accumulate the image data of the pixels at each same position of all of the color-noise-filtered first images, to obtain first accumulated image data for the pixel at each same position;
a merging unit, configured to set the first accumulated image data as the image data of the first target image at the pixel of the same position, so as to combine all of the color-noise-filtered first images into the first target image;
the accumulation unit being further configured to accumulate the image data of the pixels at each same position of all of the color-noise-filtered second images, to obtain second accumulated image data for the pixel at each same position;
the merging unit being further configured to set the second accumulated image data as the image data of the second target image at the pixel of the same position, so as to combine all of the color-noise-filtered second images into the second target image.
CN201510411747.0A 2015-07-14 2015-07-14 A kind of phase focusing method and user terminal Active CN105007422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510411747.0A CN105007422B (en) 2015-07-14 2015-07-14 A kind of phase focusing method and user terminal

Publications (2)

Publication Number Publication Date
CN105007422A (en) 2015-10-28
CN105007422B CN105007422B (en) 2018-01-23

Family

ID=54379923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510411747.0A Active CN105007422B (en) 2015-07-14 2015-07-14 A kind of phase focusing method and user terminal

Country Status (1)

Country Link
CN (1) CN105007422B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101075073A (en) * 2006-05-16 2007-11-21 奥林巴斯映像株式会社 Digital camera and camera system
CN101188773A (en) * 2006-11-24 2008-05-28 奥林巴斯映像株式会社 Camera and image processing method
JP2009188963A (en) * 2008-02-09 2009-08-20 Fujitsu Microelectronics Ltd Image processing circuit, image processing method, and camera
CN102457675A (en) * 2010-10-27 2012-05-16 展讯通信(上海)有限公司 Image shooting anti-shaking manner for handheld camera equipment

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105763766B (en) * 2016-02-29 2020-05-15 Oppo广东移动通信有限公司 Control method, control device and electronic device
CN105763766A (en) * 2016-02-29 2016-07-13 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN105791680A (en) * 2016-02-29 2016-07-20 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN105704378A (en) * 2016-02-29 2016-06-22 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN106027889A (en) * 2016-05-20 2016-10-12 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN106027889B (en) * 2016-05-20 2019-04-12 Oppo广东移动通信有限公司 Control method, control device and electronic device
US10554877B2 (en) 2016-07-29 2020-02-04 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image synthesis method and apparatus for mobile terminal, and mobile terminal
US10848678B2 (en) 2016-07-29 2020-11-24 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image synthesis method and apparatus for mobile terminal, and mobile terminal
CN107809586A (en) * 2017-10-31 2018-03-16 努比亚技术有限公司 Switching method, mobile terminal and the storage medium of mobile terminal focal modes
CN107809586B (en) * 2017-10-31 2020-09-01 努比亚技术有限公司 Mobile terminal focusing mode switching method, mobile terminal and storage medium
CN109862261A (en) * 2018-12-29 2019-06-07 努比亚技术有限公司 Filming apparatus, shooting processing method, terminal and computer readable storage medium
CN110475071A (en) * 2019-09-19 2019-11-19 厦门美图之家科技有限公司 Phase focusing method, device, electronic equipment and machine readable storage medium
CN110475071B (en) * 2019-09-19 2021-06-04 厦门美图之家科技有限公司 Phase focusing method, phase focusing device, electronic equipment and machine-readable storage medium

Also Published As

Publication number Publication date
CN105007422B (en) 2018-01-23

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong Province 523860

Patentee after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong Province 523860

Patentee before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

CP01 Change in the name or title of a patent holder