CN105933589A - Image processing method and terminal - Google Patents

Image processing method and terminal

Info

Publication number
CN105933589A
CN105933589A (application CN201610503757.1A)
Authority
CN
China
Prior art keywords
focusing area
preview image
terminal
area
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610503757.1A
Other languages
Chinese (zh)
Other versions
CN105933589B (en)
Inventor
Zhang Haiping (张海平)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201610503757.1A priority Critical patent/CN105933589B/en
Publication of CN105933589A publication Critical patent/CN105933589A/en
Application granted granted Critical
Publication of CN105933589B publication Critical patent/CN105933589B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/2624 Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Abstract

An embodiment of the invention discloses an image processing method comprising the steps of: determining a focusing area and a non-focusing area of a preview image captured by a camera of a terminal; determining, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and, while keeping the focusing area clear, blurring the non-focusing area according to the N distance values. An embodiment of the invention further provides a terminal. With the image processing method and terminal provided by the embodiments of the invention, the operation required to achieve bokeh is simplified, and the non-focusing area is blurred according to measured distances, so the resulting bokeh image looks more lifelike.

Description

Image processing method and terminal
Technical field
The present invention relates to the technical field of electronic devices, and in particular to an image processing method and a terminal.
Background technology
With the rapid development of information technology, terminals (such as mobile phones and tablet computers) are used more and more frequently, and more and more functions are integrated into them. Photographing has become an important selling point for every handset manufacturer, and improving the photographing effect and offering more differentiated functions has become a focus of competition among manufacturers.
At present, a terminal cannot easily produce a photograph in which the near scene is sharp while the distant scene is blurred. The dual-camera phones popular on the market are mainly aimed at this effect: the two cameras view slightly different regions A and B, and the distance difference between A and B is estimated from the two views. If the user manually selects region A as the focusing area, the image of region A is kept sharp and region B is blurred, and the degree of blurring generally has to be controlled by the user with an on-screen slider. Achieving a sharp near scene and a blurred distant scene in this way has two drawbacks. On the one hand, the operation is not intelligent enough, because the user must manually select the focusing area. On the other hand, what the two cameras measure is only the relative distance between A and B (that is, A minus B) rather than the distance from region A or region B to the lens, which is not accurate enough for selecting focus, and the relative distance between A and B is itself only an estimate with a relatively large error.
Summary of the invention
Embodiments of the present invention provide an image processing method and a terminal, so as to simplify the operation required to achieve background blurring and to make the background-blurred image more lifelike.
A first aspect of the embodiments of the present invention provides an image processing method, including:
determining a focusing area and a non-focusing area of a preview image captured by a camera of a terminal;
determining, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and
while keeping the focusing area clear, blurring the non-focusing area according to the N distance values.
A second aspect of the embodiments of the present invention provides a terminal, including:
a first determining unit, configured to determine a focusing area and a non-focusing area of a preview image captured by a camera of the terminal;
a second determining unit, configured to determine, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and
a processing unit, configured to blur, while keeping the focusing area clear, the non-focusing area determined by the first determining unit according to the N distance values determined by the second determining unit.
A third aspect of the embodiments of the present invention provides a terminal, including:
a processor and a memory;
where the processor is configured to call executable program code stored in the memory to perform some or all of the steps of the first aspect.
Implementing the embodiments of the present invention has the following beneficial effects:
It can be seen that, according to the embodiments of the present invention, the focusing area and the non-focusing area of the preview image captured by the camera of the terminal are determined; the laser range sensor of the terminal is used to determine the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and, while the focusing area is kept clear, the non-focusing area is blurred according to the N distance values. Therefore, once the focusing area and the non-focusing area have been determined, the non-focusing area can be blurred according to the distance values between the terminal and the spatial positions indicated by the pixels. This simplifies the operation required to achieve background blurring in the prior art, and because the blurring of the non-focusing area is driven by measured distances, the resulting background-blurred image is more lifelike.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required by the embodiments are briefly introduced below. Obviously, the accompanying drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a first embodiment of an image processing method disclosed in an embodiment of the present invention;
Fig. 1a is a schematic diagram of dividing a preview image into isolated areas according to an embodiment of the present invention;
Fig. 1b is a schematic diagram of ranging with a laser range sensor according to an embodiment of the present invention;
Fig. 1c is a plan-view illustration of the laser range sensor ranging of Fig. 1b according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a second embodiment of an image processing method disclosed in an embodiment of the present invention;
Fig. 3a is a schematic structural diagram of a first embodiment of a terminal disclosed in an embodiment of the present invention;
Fig. 3b is a schematic structural diagram of the first determining unit of the terminal shown in Fig. 3a according to an embodiment of the present invention;
Fig. 3c is a schematic structural diagram of the processing unit of the terminal shown in Fig. 3a according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a second embodiment of a terminal disclosed in an embodiment of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The terms "first", "second", "third" and "fourth" in the specification, claims and accompanying drawings are used to distinguish different objects rather than to describe a particular order. In addition, the terms "include" and "have" and any variants thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, product or device that contains a series of steps or units is not limited to the listed steps or units, but may optionally further include steps or units that are not listed, or may optionally further include other steps or units inherent to the process, method, product or device.
Reference to "an embodiment" herein means that a particular feature, structure or characteristic described in connection with the embodiment may be included in at least one embodiment of the present invention. The appearance of this phrase in various places in the specification does not necessarily refer to the same embodiment, nor to an independent or alternative embodiment mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
The terminal described in the embodiments of the present invention may include a smartphone (such as an Android phone, an iOS phone or a Windows Phone handset), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID, Mobile Internet Devices) or a wearable device. The above terminals are only examples rather than an exhaustive list; the terminal includes but is not limited to the above devices.
It should be noted that the principle of the laser range sensor is as follows: the laser range sensor emits modulated near-infrared light, which is reflected back when it meets an object; the sensor calculates the time difference or phase difference between emission and reception of the light and converts it into the distance between the photographed scene and the lens. For example, a laser diode may first be aimed at the target and emit a laser pulse; after being reflected by the target, the laser light is scattered in all directions, and part of the scattered light returns to the sensor receiver, where it is imaged onto an avalanche photodiode through an optical system. The avalanche photodiode is an optical sensor with internal amplification, so it can detect extremely weak optical signals. By recording and processing the time elapsed from the emission of the light pulse to the reception of its return, the target distance can be measured.
Optionally, in the above manner the distance between the spatial position indicated by each pixel in the preview image and the camera can be recorded. Each pixel in the preview image corresponds to a certain position of a certain object in the photographed scene, and that position is referred to as the spatial position indicated by the pixel. The preview image is two-dimensional data while the spatial position is three-dimensional; the two are associated because, during shooting, when an object in three-dimensional space is photographed it is imaged in the preview image of the camera, so each pixel corresponds to a spatial position in three-dimensional space. Assuming the preview image contains N pixels, N distance values can be obtained by the above method, each distance value representing the distance between the camera and the spatial position indicated by one pixel, where N is a positive integer.
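For illustration only (this sketch is not part of the original disclosure), the time-of-flight relationship described above can be written as d = c * t / 2 and applied per pixel. The image size and the per-pixel round-trip times below are hypothetical placeholders for whatever the laser range sensor actually reports.

    import numpy as np

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def distances_from_round_trip(round_trip_s: np.ndarray) -> np.ndarray:
        """Convert per-pixel round-trip times (seconds) into one-way distances
        (metres): the pulse travels to the scene point and back, so d = c * t / 2."""
        return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

    # Hypothetical example: a preview image of H x W pixels yields N = H * W
    # round-trip measurements from the laser range sensor.
    H, W = 480, 640
    round_trip_s = np.full((H, W), 2.0e-8)      # placeholder data, roughly 3 m away
    distance_map_m = distances_from_round_trip(round_trip_s)
    print(distance_map_m[0, 0])                 # -> about 2.998 m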
Referring to Fig. 1, Fig. 1 is a schematic flowchart of a first embodiment of an image processing method provided by an embodiment of the present invention. The image processing method described in this embodiment includes the following steps.
101. Determine a focusing area and a non-focusing area of a preview image captured by a camera of a terminal.
The camera of the terminal may be at least one of a visible-light camera, an infrared camera, an ultraviolet camera and the like.
Optionally, a target area may be selected in the preview image captured by the camera of the terminal, the target area is taken as the focusing area, and the region other than the focusing area is taken as the non-focusing area, where the preview image contains a plurality of isolated areas and the target area is at least one of the plurality of isolated areas. As shown in Fig. 1a, the preview image is divided into 9 isolated areas (the isolated areas formed by the intersecting dotted lines in the figure), and the target area may be any of the 9 isolated areas of the preview image. Of course, the preview image may also be divided into 2, 3, 4 or another number of isolated areas, which are not listed one by one here.
Still further optionally, selecting the target area in the preview image captured by the camera of the terminal may include the following steps:
1) receiving a selection instruction;
2) taking the isolated area indicated by the selection instruction in the preview image captured by the camera of the terminal as the focusing area, and taking the region other than the focusing area as the non-focusing area.
In step 1, the user may select at least one of the plurality of isolated areas; after the selection of the at least one isolated area has been confirmed, the selection instruction is generated. In step 2, the at least one isolated area is then taken as the focusing area. For example, a selection instruction for at least one of the above 9 isolated areas may be received from the user; assuming isolated area i is one of the 9 isolated areas, after this isolated area has been selected, isolated area i is taken as the focusing area and the region other than the focusing area is taken as the non-focusing area.
Optionally, the laser range sensor has N built-in receiving areas, each receiving area being independent and able to receive external laser energy. A two-lens design ensures that the laser range sensor can receive distance signals from N regions simultaneously, and that the N receiving areas of the laser range sensor coincide with the N regions into which the camera preview image is divided in advance. For example, as shown in Fig. 1b and Fig. 1c: in Fig. 1b, when the camera is turned on, the photographed scene forms a preview image which is divided into 9 regions, i.e. the photographed scene is divided into 9 regions, and the laser range sensor can separately detect the distance between each of these 9 regions and the camera. Specifically, the laser range sensor emits modulated near-infrared light (represented in the figure by the dotted lines emitted by the laser range sensor); the light is reflected by objects in the photographed scene and received by the laser range sensor, and the distance to the photographed scene is obtained by calculating the time difference or phase difference between emission and reception. Assuming a first detection unit is the first to detect a distance value, that distance value is fed back to the terminal. In Fig. 1c, the photographed scene of the camera yields the preview image, and the detection regions of the laser range sensor (P1, P2, P3, P4, P5, P6, P7, P8 and P9) correspond one-to-one to the 9 regions into which the preview image is divided. The distance between the detection region P1 corresponding to the first detection unit and the camera is taken as the distance between the corresponding region of the preview image and the camera; in this way a distance value is obtained for every detection region, i.e. the distance between the camera and the spatial position of each of the N pre-divided regions of the preview image captured by the camera can be determined. The region with the smallest distance value can then be taken as the focusing area, as sketched below. Of course, the laser range sensor can also be used to measure the distance between the camera and the spatial position of each individual pixel in the preview image.
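A minimal sketch of this region-to-region mapping, assuming the sensor reports one averaged distance per detection region P1..P9 laid out row by row and that the preview image is split into the same 3 x 3 grid; the grid split and the region_distances_m values are illustrative and not taken from the disclosure.

    import numpy as np

    def split_into_grid(image: np.ndarray, rows: int = 3, cols: int = 3):
        """Split a preview image into rows * cols isolated areas, in row-major
        order so that index i corresponds to detection region P(i+1) of Fig. 1c."""
        h, w = image.shape[:2]
        return [image[r * h // rows:(r + 1) * h // rows,
                      c * w // cols:(c + 1) * w // cols]
                for r in range(rows) for c in range(cols)]

    preview = np.zeros((480, 640, 3), dtype=np.uint8)    # placeholder preview image
    regions = split_into_grid(preview)

    # Hypothetical averaged distances (metres) reported for P1..P9.
    region_distances_m = np.array([4.2, 4.1, 4.3,
                                   2.7, 1.1, 2.9,
                                   4.0, 3.8, 4.4])

    # The region closest to the camera is taken as the focusing area; every
    # other region belongs to the non-focusing area.
    focus_index = int(np.argmin(region_distances_m))     # -> 4, the centre region P5
    non_focus_indices = [i for i in range(9) if i != focus_index]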
Further, selecting the target area in the preview image captured by the camera of the terminal may also be:
taking a designated region in the preview image as the focusing area, and taking the region other than the focusing area as the non-focusing area. For example, a certain region of the preview screen may be designated in advance as the designated region; the region of the preview image other than the designated region is then the non-focusing area.
Optionally, step 101 may also include the following steps:
3) performing target detection on the preview image captured by the camera of the terminal;
4) taking the region where the detected target is located as the focusing area.
The target in step 3 may be a person, a car, a cat, a dog and so on, which is not limited here. The target detection may use an infrared sensor for temperature detection, or an image recognition algorithm. For example, when an infrared camera is used for shooting, the target in the preview image can be identified according to the distribution of different temperatures.
Further, step 3 may include the following steps:
31) performing binarization on the preview image captured by the camera of the terminal to obtain a binarized preview image;
32) extracting contours of the binarized preview image;
33) performing image recognition on the contours to identify the target within them.
The preview image captured by the camera can be binarized; the binarized preview image obtained after binarization has only two grey levels, i.e. each pixel value is either 0 or 255. The binarization threshold may be the average brightness value of the preview image, or the average brightness value of the focusing area. Alternatively, several pixel values may be sampled from each of the plurality of isolated areas mentioned above and the mean of all sampled pixel values used as the binarization threshold. Because only part of the pixels of the preview image are sampled, the computation is faster, and because the samples are drawn from all the isolated areas they are more representative, so the chosen binarization threshold is both accurate and fast to obtain.
Optionally, when the preview image is a colour image, the luminance component image of the preview image may be extracted and target detection performed on the luminance component image according to steps 31, 32 and 33: first binarization, then contour extraction, and finally image recognition. Since only a single luminance component image is processed, the complexity of the preview image is reduced and the processing speed can be increased; a sketch of these steps is given below.
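Steps 31 to 33 could be sketched with OpenCV as follows; the sampling stride used for the threshold and the use of the largest contour's bounding box as a stand-in for the recognition step are assumptions made for illustration only.

    import cv2
    import numpy as np

    def detect_target_region(preview_bgr: np.ndarray):
        """Sketch of steps 31-33: binarize, extract contours, pick a candidate."""
        # Work on the luminance component only, as suggested for colour previews.
        luma = cv2.cvtColor(preview_bgr, cv2.COLOR_BGR2GRAY)

        # Binarization threshold: mean of a sparse sample of pixels spread over
        # the whole preview (every 16th pixel here; the stride is an assumption).
        threshold = float(luma[::16, ::16].mean())
        _, binary = cv2.threshold(luma, threshold, 255, cv2.THRESH_BINARY)

        # Contour extraction on the binarized preview image.
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None

        # Placeholder "recognition": take the bounding box of the largest contour
        # as the detected target region (a real implementation would classify it).
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        return (x, y, w, h)    # focusing area; everything outside is non-focusing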
Optionally, when the camera of the terminal shoots, the captured image may be focused, and the focal zone generated during focusing may be used as the focusing area.
102. Determine, by means of the laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer.
103. While keeping the focusing area clear, blur the non-focusing area according to the N distance values.
Optionally, the focusing area may be kept clear while the non-focusing area is blurred according to the N distance values; the resulting image then has a clear focusing area and a blurred non-focusing area. In the embodiments of the present invention the main blurring technique may be a Gaussian blur algorithm, although other algorithms may of course be used, which is not specifically limited here.
It should be noted that keeping the focusing area clear may be understood as performing no processing at all on the focusing area, or performing a certain image enhancement on the focusing area, or applying a certain beautification effect to the focusing area. The image enhancement may use histogram equalization, grey-level stretching, white-balance processing, colour-temperature adjustment, image restoration and so on, which is not limited here. In short, keeping the focusing area clear means ensuring that the clarity of the focusing area is not lower than its clarity when no processing is applied to it.
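As one illustration of the enhancement options listed above, a histogram equalization limited to the focusing area might look like the following sketch, assuming the focusing area is given as a rectangle (x, y, w, h); this shows only one of the listed options, not the method itself.

    import cv2
    import numpy as np

    def enhance_focus_region(image_bgr: np.ndarray, rect) -> np.ndarray:
        """Equalize the luminance histogram inside the focusing area only,
        leaving the rest of the image untouched."""
        x, y, w, h = rect
        roi = image_bgr[y:y + h, x:x + w]
        ycrcb = cv2.cvtColor(roi, cv2.COLOR_BGR2YCrCb)
        ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])   # enhance luminance only
        image_bgr[y:y + h, x:x + w] = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
        return image_bgr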
Still further optionally, while the focusing area is kept clear, the non-focusing area may be blurred as follows:
first, determine the mean of all distance values corresponding to the focusing area to obtain a first average;
second, determine, for each isolated area in the non-focusing area, the mean of all distance values corresponding to that isolated area, obtaining a plurality of second averages;
then, determine the blur coefficient of each isolated area in the non-focusing area according to the first average and the plurality of second averages;
finally, blur each isolated area according to its blur coefficient.
Specifically, the mean of the distance values corresponding to each isolated area is computed, and the ratio of the distance average of each isolated area in the non-focusing area to the distance average of the focusing area is used as the blur coefficient. For example, if the distance average of the focusing area is A and the distance average of a certain isolated area is B, the blur coefficient is B/A: the larger B is, the larger the blur coefficient and the stronger the blurring; the smaller B is, the smaller the blur coefficient and the weaker the blurring. A sketch of this computation is given below.
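A sketch of the per-region blur coefficient B/A described above, reusing the 3 x 3 grid split; how the coefficient is turned into a Gaussian kernel size is an assumption for illustration, since the embodiment only fixes the ratio itself.

    import cv2
    import numpy as np

    def blur_non_focus_regions(image_bgr, distance_map_m, focus_index,
                               rows=3, cols=3):
        """Blur each non-focusing isolated area with strength B/A, where A is the
        mean distance of the focusing area and B that of the area in question."""
        h, w = image_bgr.shape[:2]
        out = image_bgr.copy()

        def rect(i):
            r, c = divmod(i, cols)
            return (c * w // cols, r * h // rows,
                    (c + 1) * w // cols, (r + 1) * h // rows)

        x0, y0, x1, y1 = rect(focus_index)
        a = float(distance_map_m[y0:y1, x0:x1].mean())      # first average

        for i in range(rows * cols):
            if i == focus_index:
                continue                                    # keep the focusing area clear
            x0, y0, x1, y1 = rect(i)
            b = float(distance_map_m[y0:y1, x0:x1].mean())  # second average
            coeff = b / a                                   # blur coefficient B/A
            # Map the coefficient to an odd Gaussian kernel size (assumed mapping).
            k = 2 * max(1, int(round(coeff * 4))) + 1
            out[y0:y1, x0:x1] = cv2.GaussianBlur(out[y0:y1, x0:x1], (k, k), 0)
        return out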
Optionally, the focusing area may of course be blurred instead while the non-focusing area is kept clear, yielding the effect of a clear distant scene and a blurred near scene.
It can be seen that, according to this embodiment of the present invention, the focusing area and the non-focusing area of the preview image captured by the camera of the terminal are determined; the laser range sensor of the terminal is used to determine the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and, while the focusing area is kept clear, the non-focusing area is blurred according to the N distance values. Therefore, once the focusing area and the non-focusing area have been determined, the non-focusing area can be blurred according to the distance values between the terminal and the spatial positions indicated by the pixels. This simplifies the operation required to achieve background blurring in the prior art, and because the blurring is driven by measured distances, the resulting background-blurred image is more lifelike.
Consistent with the above embodiment, referring to Fig. 2, Fig. 2 is a schematic flowchart of a second embodiment of an image processing method provided by an embodiment of the present invention. The image processing method described in this embodiment includes the following steps.
201. Determine a focusing area and a non-focusing area of a preview image captured by a camera of a terminal.
202. Determine, by means of the laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer.
203. Determine an average distance value according to M distance values, where the M distance values are the distance values between the camera and all or part of the pixels in the focusing area, and M is a positive integer less than N.
Optionally, the mean of all the M distance values may be taken. Assuming the focusing area contains J pixels, the M distance values may be the distance values between part of the pixels in the focusing area and the camera, in which case M is less than J; of course, the M distance values may also be the distance values between all the pixels in the focusing area and the camera, in which case M equals J. It can be seen that M is less than or equal to J, and J is less than N.
204. Calculate the difference between the distance value corresponding to each pixel in the non-focusing area and the average distance value, obtaining a plurality of differences.
Optionally, the difference between the distance value corresponding to each pixel in the non-focusing area and the average distance value is computed; a difference may be greater than, equal to or less than 0, and the different differences serve as the basis for how the corresponding pixel is blurred.
205. Blur the non-focusing area according to the plurality of differences.
Step 205 may use two different blurring modes, as follows.
In the first blurring mode, the absolute value of each of the above differences is taken, yielding a plurality of absolute values, and the non-focusing area is blurred according to these absolute values: the larger the absolute value, the stronger the blurring, and the smaller the absolute value, the weaker the blurring.
Further, the absolute values may of course be divided into a plurality of grades in ascending order. Suppose the grades include A, B and C, the Gaussian coefficient corresponding to grade A is a, that corresponding to grade B is b, and that corresponding to grade C is c; the absolute values of grade A are smaller than those of grade B, and those of grade B are smaller than those of grade C, so the blurring strength in ascending order is A < B < C. Then each pixel whose absolute value falls in grade A is blurred with Gaussian coefficient a, each pixel whose absolute value falls in grade B is blurred with Gaussian coefficient b, and each pixel whose absolute value falls in grade C is blurred with Gaussian coefficient c.
In the second blurring mode, each of the above differences is judged: when a difference is less than or equal to 0, the corresponding pixel is not blurred, the reasoning being that the spatial position corresponding to that pixel belongs to the near scene and should be kept clear. If the difference is greater than 0, the spatial position corresponding to that pixel is considered distant scene and the pixel is blurred: the larger the difference, the stronger the blurring, and the smaller the difference, the weaker the blurring. Of course, the second blurring mode may also adopt the graded blurring of the first blurring mode, i.e. the differences greater than 0 are divided into a plurality of grades and each grade is blurred with its corresponding Gaussian coefficient, as sketched below.
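The second blurring mode can be sketched as follows: pixels whose distance does not exceed the focusing-area average are left untouched, and the positive differences are split into three grades, each blurred with its own Gaussian coefficient. The grade boundaries and the sigma values (playing the roles of a < b < c) are illustrative assumptions, and focus_mask is a hypothetical boolean mask marking the focusing-area pixels.

    import cv2
    import numpy as np

    def blur_by_distance_difference(image_bgr, distance_map_m, focus_mask):
        """Blur non-focusing pixels whose distance exceeds the focusing-area
        average, more strongly for larger differences (second blurring mode)."""
        # Average distance of (all or part of) the focusing-area pixels.
        avg = float(distance_map_m[focus_mask].mean())
        diff = distance_map_m - avg

        # One pre-blurred copy per grade, with Gaussian sigmas a < b < c.
        grades = [cv2.GaussianBlur(image_bgr, (0, 0), sigma)
                  for sigma in (1.0, 2.5, 5.0)]
        bounds = [(0.0, 0.5), (0.5, 1.5), (1.5, np.inf)]    # metres, assumed split

        out = image_bgr.copy()
        for blurred, (lo, hi) in zip(grades, bounds):
            sel = (~focus_mask) & (diff > lo) & (diff <= hi)   # diff <= 0 stays sharp
            out[sel] = blurred[sel]
        return out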
For the other steps of the embodiment described in Fig. 2, refer to the detailed description of the image processing method in Fig. 1, which is not repeated here.
The following describes apparatuses for implementing the image processing method described in Fig. 1 or Fig. 2, as follows.
Referring to Fig. 3a, Fig. 3a is a schematic structural diagram of a first embodiment of a terminal provided by an embodiment of the present invention. The terminal described in this embodiment includes a first determining unit 301, a second determining unit 302 and a processing unit 303, as follows.
The first determining unit 301 is configured to determine a focusing area and a non-focusing area of a preview image captured by a camera of the terminal.
The second determining unit 302 is configured to determine, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer.
The processing unit 303 is configured to blur, while keeping the focusing area clear, the non-focusing area determined by the first determining unit 301 according to the N distance values determined by the second determining unit 302.
Optionally, the first determining unit 301 is specifically configured to:
select a target area in the preview image captured by the camera of the terminal, take the target area as the focusing area, and take the region other than the focusing area as the non-focusing area, where the preview image contains a plurality of isolated areas and the target area is at least one of the plurality of isolated areas.
Optionally, the first determining unit 301 is further specifically configured to:
receive a selection instruction, take the isolated area indicated by the selection instruction in the preview image captured by the camera of the terminal as the focusing area, and take the region other than the focusing area as the non-focusing area;
or,
take a designated region in the preview image as the focusing area, and take the region other than the focusing area as the non-focusing area.
Optionally, as shown in Fig. 3b, the first determining unit 301 of the terminal described in Fig. 3a includes:
a detection module 3011, configured to perform target detection on the preview image captured by the camera of the terminal;
a first determining module 3012, configured to take the region where the detected target is located as the focusing area.
Further, the detection module 3011 includes:
a binarization module (not shown in the figure), configured to binarize the preview image captured by the camera of the terminal to obtain a binarized preview image;
an extraction module (not shown in the figure), configured to extract contours of the binarized preview image;
an identification module (not shown in the figure), configured to perform image recognition on the contours to identify the target within them.
Optionally, as shown in Fig. 3c, the processing unit 303 of the terminal described in Fig. 3a includes:
a second determining module 3031, configured to determine an average distance value according to M distance values, where the M distance values are the distance values between the camera and all or part of the pixels in the focusing area, and M is a positive integer less than N;
a computing module 3032, configured to calculate the difference between the distance value corresponding to each pixel in the non-focusing area and the average distance value, obtaining a plurality of differences;
a blurring module 3033, configured to blur the non-focusing area according to the plurality of differences.
It can be seen that, with the terminal described in this embodiment of the present invention, the focusing area and the non-focusing area of the preview image captured by the camera of the terminal can be determined; the laser range sensor of the terminal is used to determine the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and, while the focusing area is kept clear, the non-focusing area is blurred according to the N distance values. Therefore, once the focusing area and the non-focusing area have been determined, the non-focusing area can be blurred according to the distance values between the terminal and the spatial positions indicated by the pixels. This simplifies the operation required to achieve background blurring in the prior art, and because the blurring is driven by measured distances, the resulting background-blurred image is more lifelike.
Referring to Fig. 4, Fig. 4 is a schematic structural diagram of a second embodiment of a terminal provided by an embodiment of the present invention. The terminal described in this embodiment includes at least one input device 1000, at least one output device 2000, at least one processor 3000 (for example a CPU) and a memory 4000, where the input device 1000, the output device 2000, the processor 3000 and the memory 4000 are connected through a bus 5000.
The input device 1000 may specifically be a touch panel, a physical button, a mouse, a fingerprint recognition module or the like.
The output device 2000 may specifically be a display screen.
The memory 4000 may be a high-speed RAM memory or a non-volatile memory, such as a disk memory. The memory 4000 is configured to store a set of program code, and the input device 1000, the output device 2000 and the processor 3000 are configured to call the program code stored in the memory 4000 to perform the following operations.
The processor 3000 is configured to:
determine a focusing area and a non-focusing area of a preview image captured by a camera of the terminal;
determine, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and
while keeping the focusing area clear, blur the non-focusing area according to the N distance values.
Optionally, determining, by the processor 3000, the focusing area and the non-focusing area of the preview image captured by the camera of the terminal includes:
selecting a target area in the preview image captured by the camera of the terminal, taking the target area as the focusing area, and taking the region other than the focusing area as the non-focusing area, where the preview image contains a plurality of isolated areas and the target area is at least one of the plurality of isolated areas.
Optionally, selecting, by the processor 3000, the target area in the preview image captured by the camera of the terminal includes:
receiving a selection instruction;
taking the isolated area indicated by the selection instruction in the preview image captured by the camera of the terminal as the focusing area, and taking the region other than the focusing area as the non-focusing area;
or,
taking a designated region in the preview image as the focusing area, and taking the region other than the focusing area as the non-focusing area.
Optionally, determining, by the processor 3000, the focusing area of the preview image captured by the camera of the terminal includes:
performing target detection on the preview image captured by the camera of the terminal;
taking the region where the detected target is located as the focusing area.
Optionally, performing, by the processor 3000, target detection on the preview image captured by the camera of the terminal includes:
binarizing the preview image captured by the camera of the terminal to obtain a binarized preview image;
extracting contours of the binarized preview image;
performing image recognition on the contours to identify the target within them.
Optionally, blurring, by the processor 3000, the non-focusing area according to the N distance values includes:
determining an average distance value according to M distance values, where the M distance values are the distance values between the camera and all or part of the pixels in the focusing area, and M is a positive integer less than N;
calculating the difference between the distance value corresponding to each pixel in the non-focusing area and the average distance value, obtaining a plurality of differences;
blurring the non-focusing area according to the plurality of differences.
An embodiment of the present invention further provides a computer storage medium, where the computer storage medium may store a program, and when the program is executed it includes some or all of the steps of any image processing method described in the above method embodiments.
Although the present invention is described herein with reference to the embodiments, in the course of implementing the claimed invention, those skilled in the art, by reviewing the accompanying drawings, the disclosure and the appended claims, can understand and implement other variations of the disclosed embodiments. In the claims, the word "comprising" does not exclude other components or steps, and "a" or "an" does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, an apparatus (device) or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk memories, CD-ROMs and optical memories) containing computer-usable program code. The computer program is stored/distributed in a suitable medium and provided together with other hardware or as part of hardware, and may also be distributed in other forms, for example via the Internet or other wired or wireless telecommunication systems.
The present invention is described with reference to the flowcharts and/or block diagrams of the method, apparatus (device) and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although the present invention is described in combination with specific features and embodiments, it is apparent that various modifications and combinations can be made to it without departing from the spirit and scope of the present invention. Accordingly, the specification and drawings are merely exemplary illustrations of the present invention as defined by the appended claims, and are deemed to cover any and all modifications, variations, combinations or equivalents falling within the scope of the present invention. Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these changes and modifications.

Claims (13)

1. An image processing method, characterized by comprising:
determining a focusing area and a non-focusing area of a preview image captured by a camera of a terminal;
determining, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and
while keeping the focusing area clear, blurring the non-focusing area according to the N distance values.
2. The method according to claim 1, characterized in that the determining a focusing area and a non-focusing area of a preview image captured by a camera of a terminal comprises:
selecting a target area in the preview image captured by the camera of the terminal, taking the target area as the focusing area, and taking the region other than the focusing area as the non-focusing area, wherein the preview image comprises a plurality of isolated areas and the target area is at least one of the plurality of isolated areas.
3. The method according to claim 2, characterized in that the selecting a target area in the preview image captured by the camera of the terminal comprises:
receiving a selection instruction;
taking the isolated area indicated by the selection instruction in the preview image captured by the camera of the terminal as the focusing area, and taking the region other than the focusing area as the non-focusing area;
or,
taking a designated region in the preview image as the focusing area, and taking the region other than the focusing area as the non-focusing area.
4. The method according to claim 1, characterized in that the determining a focusing area of a preview image captured by a camera of a terminal comprises:
performing target detection on the preview image captured by the camera of the terminal;
taking the region where the detected target is located as the focusing area.
5. The method according to claim 4, characterized in that the performing target detection on the preview image captured by the camera of the terminal comprises:
binarizing the preview image captured by the camera of the terminal to obtain a binarized preview image;
extracting contours of the binarized preview image;
performing image recognition on the contours to identify the target within them.
6. The method according to any one of claims 1 to 5, characterized in that the blurring the non-focusing area according to the N distance values comprises:
determining an average distance value according to M distance values, wherein the M distance values are the distance values between the camera and all or part of the pixels in the focusing area, and M is a positive integer less than N;
calculating the difference between the distance value corresponding to each pixel in the non-focusing area and the average distance value, obtaining a plurality of differences;
blurring the non-focusing area according to the plurality of differences.
7. A terminal, characterized by comprising:
a first determining unit, configured to determine a focusing area and a non-focusing area of a preview image captured by a camera of the terminal;
a second determining unit, configured to determine, by means of a laser range sensor of the terminal, the distance between the spatial position indicated by each pixel in the preview image and the camera, so as to obtain N distance values, where N is the number of pixels in the preview image and N is a positive integer; and
a processing unit, configured to blur, while keeping the focusing area clear, the non-focusing area determined by the first determining unit according to the N distance values determined by the second determining unit.
8. The terminal according to claim 7, characterized in that the first determining unit is specifically configured to:
select a target area in the preview image captured by the camera of the terminal, take the target area as the focusing area, and take the region other than the focusing area as the non-focusing area, wherein the preview image comprises a plurality of isolated areas and the target area is at least one of the plurality of isolated areas.
9. The terminal according to claim 8, characterized in that the first determining unit is further specifically configured to:
receive a selection instruction, take the isolated area indicated by the selection instruction in the preview image captured by the camera of the terminal as the focusing area, and take the region other than the focusing area as the non-focusing area;
or,
take a designated region in the preview image as the focusing area, and take the region other than the focusing area as the non-focusing area.
10. The terminal according to claim 7, characterized in that the first determining unit comprises:
a detection module, configured to perform target detection on the preview image captured by the camera of the terminal;
a first determining module, configured to take the region where the detected target is located as the focusing area.
11. The terminal according to claim 10, characterized in that the detection module comprises:
a binarization module, configured to binarize the preview image captured by the camera of the terminal to obtain a binarized preview image;
an extraction module, configured to extract contours of the binarized preview image;
an identification module, configured to perform image recognition on the contours to identify the target within them.
12. The terminal according to any one of claims 7 to 11, characterized in that the processing unit comprises:
a second determining module, configured to determine an average distance value according to M distance values, wherein the M distance values are the distance values between the camera and all or part of the pixels in the focusing area, and M is a positive integer less than N;
a computing module, configured to calculate the difference between the distance value corresponding to each pixel in the non-focusing area and the average distance value, obtaining a plurality of differences;
a blurring module, configured to blur the non-focusing area according to the plurality of differences.
13. A terminal, characterized by comprising:
a processor and a memory; wherein the processor calls code or instructions in the memory to perform the method according to any one of claims 1 to 6.
CN201610503757.1A 2016-06-28 2016-06-28 Image processing method and terminal Active CN105933589B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610503757.1A CN105933589B (en) 2016-06-28 2016-06-28 Image processing method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610503757.1A CN105933589B (en) 2016-06-28 2016-06-28 Image processing method and terminal

Publications (2)

Publication Number Publication Date
CN105933589A true CN105933589A (en) 2016-09-07
CN105933589B CN105933589B (en) 2019-05-28

Family

ID=56828711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610503757.1A Active CN105933589B (en) 2016-06-28 2016-06-28 Image processing method and terminal

Country Status (1)

Country Link
CN (1) CN105933589B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110881103B (en) * 2019-09-19 2022-01-28 Oppo广东移动通信有限公司 Focusing control method and device, electronic equipment and computer readable storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101933040A (en) * 2007-06-06 2010-12-29 索尼株式会社 Image processing device, image processing method, and image processing program
US20090160966A1 (en) * 2007-12-25 2009-06-25 Hon Hai Precision Industry Co., Ltd. Digital image capture device and digital image processing method thereof
CN101764925A (en) * 2008-12-25 2010-06-30 华晶科技股份有限公司 Simulation method for shallow field depth of digital image
CN105025226A (en) * 2015-07-07 2015-11-04 广东欧珀移动通信有限公司 Shooting control method and user terminal
CN105227838A (en) * 2015-09-28 2016-01-06 广东欧珀移动通信有限公司 A kind of image processing method and mobile terminal

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485790A (en) * 2016-09-30 2017-03-08 珠海市魅族科技有限公司 Method and device that a kind of picture shows
CN106454123A (en) * 2016-11-25 2017-02-22 滁州昭阳电信通讯设备科技有限公司 Shooting focusing method and mobile terminal
CN106454123B (en) * 2016-11-25 2019-02-22 盐城丝凯文化传播有限公司 Shooting focusing method and mobile terminal
CN106775238A (en) * 2016-12-14 2017-05-31 深圳市金立通信设备有限公司 A kind of photographic method and terminal
CN106657782A (en) * 2016-12-21 2017-05-10 努比亚技术有限公司 Picture processing method and terminal
CN106657782B (en) * 2016-12-21 2020-02-18 努比亚技术有限公司 Picture processing method and terminal
CN106993091A (en) * 2017-03-29 2017-07-28 维沃移动通信有限公司 A kind of image weakening method and mobile terminal
CN106993091B (en) * 2017-03-29 2020-05-12 维沃移动通信有限公司 Image blurring method and mobile terminal
CN107426493A (en) * 2017-05-23 2017-12-01 深圳市金立通信设备有限公司 A kind of image pickup method and terminal for blurring background
WO2018214502A1 (en) * 2017-05-24 2018-11-29 中兴通讯股份有限公司 Background blurring method and device, terminal, and storage medium
CN107395965B (en) * 2017-07-14 2019-11-29 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107395965A (en) * 2017-07-14 2017-11-24 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107277372B (en) * 2017-07-27 2021-04-23 Oppo广东移动通信有限公司 Focusing method, focusing device, computer readable storage medium and mobile terminal
CN107277372A (en) * 2017-07-27 2017-10-20 广东欧珀移动通信有限公司 Focusing method, device, computer-readable recording medium and mobile terminal
CN107295262B (en) * 2017-07-28 2021-03-26 努比亚技术有限公司 Image processing method, mobile terminal and computer storage medium
CN107295262A (en) * 2017-07-28 2017-10-24 努比亚技术有限公司 Image processing method, mobile terminal and computer-readable storage medium
CN107592466A (en) * 2017-10-13 2018-01-16 维沃移动通信有限公司 A kind of photographic method and mobile terminal
CN108174085A (en) * 2017-12-19 2018-06-15 信利光电股份有限公司 A kind of image pickup method of multi-cam, filming apparatus, mobile terminal and readable storage medium storing program for executing
CN109696788A (en) * 2019-01-08 2019-04-30 武汉精立电子技术有限公司 A kind of fast automatic focusing method based on display panel
CN109696788B (en) * 2019-01-08 2021-12-14 武汉精立电子技术有限公司 Quick automatic focusing method based on display panel
CN113126111B (en) * 2019-12-30 2024-02-09 Oppo广东移动通信有限公司 Time-of-flight module and electronic device
CN113126111A (en) * 2019-12-30 2021-07-16 Oppo广东移动通信有限公司 Time-of-flight module and electronic equipment
CN111182211B (en) * 2019-12-31 2021-09-24 维沃移动通信有限公司 Shooting method, image processing method and electronic equipment
CN111182211A (en) * 2019-12-31 2020-05-19 维沃移动通信有限公司 Shooting method, image processing method and electronic equipment
CN111246092A (en) * 2020-01-16 2020-06-05 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN113138387A (en) * 2020-01-17 2021-07-20 北京小米移动软件有限公司 Image acquisition method and device, mobile terminal and storage medium
CN113138387B (en) * 2020-01-17 2024-03-08 北京小米移动软件有限公司 Image acquisition method and device, mobile terminal and storage medium
CN112733346A (en) * 2020-12-31 2021-04-30 博迈科海洋工程股份有限公司 Method for planning delightful area in electrical operation room

Also Published As

Publication number Publication date
CN105933589B (en) 2019-05-28

Similar Documents

Publication Publication Date Title
CN105933589A (en) Image processing method and terminal
JP5871862B2 (en) Image blur based on 3D depth information
CN105659580B (en) A kind of automatic focusing method, device and electronic equipment
CN102843509B (en) Image processing device and image processing method
US9300858B2 (en) Control device and storage medium for controlling capture of images
CN107087107A (en) Image processing apparatus and method based on dual camera
US20160328853A1 (en) Image processing method and apparatus
CN107920211A (en) A kind of photographic method, terminal and computer-readable recording medium
CN106911922B (en) The depth map generated from single sensor
CN107787463B (en) The capture of optimization focusing storehouse
JP5246078B2 (en) Object location program and camera
CN105247567B (en) A kind of image focusing device, method, system and non-transient program storage device again
KR20210028218A (en) Image processing methods and devices, electronic devices and storage media
CN106101540B (en) Focus point determines method and device
CN105227838A (en) A kind of image processing method and mobile terminal
CN106415348B (en) Photographic device and focusing control method
CN106030366A (en) Imaging device, and focus control method
CN106226976A (en) A kind of dual camera image pickup method, system and terminal
CN108200335A (en) Photographic method, terminal and computer readable storage medium based on dual camera
CN110213491B (en) Focusing method, device and storage medium
CN109714539B (en) Image acquisition method and device based on gesture recognition and electronic equipment
CN104184935A (en) Image shooting device and method
CN111667420A (en) Image processing method and device
CN106062607A (en) Imaging device and focus control method
CN110677580B (en) Shooting method, shooting device, storage medium and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong

Applicant before: Guangdong OPPO Mobile Communications Co., Ltd.

GR01 Patent grant