CN109951645A - Focus adjustment method, mobile terminal and computer-readable storage medium


Info

Publication number: CN109951645A (granted publication: CN109951645B)
Application number: CN201910359659.9A
Original language: Chinese (zh)
Inventor: 樊聿聪
Applicant and current assignee: Nubia Technology Co., Ltd.
Application filed by Nubia Technology Co., Ltd.; application granted; legal status: Active


Abstract

The focus adjustment method disclosed by the invention performs a coarse focus search on an object to be photographed and determines a first focus position; divides the preview image of the object to be photographed into a first area and a second area according to the first focus position, where the first area includes the focus area of the preview image and the second area includes the region of the preview image outside the first area; calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position; and, according to D11 and D21, determines a target focus position during the fine focus search stage, where D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position. In this way, the fine focus search is guided by a comparison of regional blur degrees, which avoids the situation where the focus area ends up blurrier than the background area and thus improves the shooting experience.

Description

Focus adjustment method, mobile terminal and computer-readable storage medium
Technical field
The present invention relates to the field of photographing technology, and in particular to a focus adjustment method, a mobile terminal and a computer-readable storage medium.
Background art
With the continuous development of electronic technology, mobile terminals (such as mobile phones and cameras) can perform auto-focusing when taking pictures: during shooting they can automatically recognize a face and focus on the face region, which greatly simplifies the shooting process.
In a bright, evenly lit environment, when focusing on a face, a face focus frame appears in the shooting preview image, the focus lands on the face region, and the face region of the image is very sharp once focusing is complete. Under certain unevenly lit conditions, however, the focus area may end up blurred. For example, when photographing a person outside a shopping mall at dusk, the mall background is much brighter than the subject; when focusing on the face, the highlighted background pulls the point of sharpest focus onto the background rather than onto the face region. As shown in Fig. 1, although the face focus frame sits on the face region, the high brightness of the background causes the background area to come out sharper than the face area.
It can be seen that focusing methods in the prior art are prone to producing a focus area that is blurrier than the background area, which degrades the shooting experience.
Summary of the invention
In view of this, the present invention proposes a focus adjustment method, a mobile terminal and a computer-readable storage medium to solve the above technical problem.
First, to achieve the above object, the present invention proposes a focus adjustment method applied to a mobile terminal. The method includes:
during shooting, performing a coarse focus search on an object to be photographed and determining a first focus position;
dividing the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position, and the second area includes the region of the preview image outside the first area;
separately calculating the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position;
according to D11 and D21, determining a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the method further includes:
performing a coarse focus search on the object to be photographed and determining a second focus position, wherein the second focus position is a far-focus position;
separately calculating the blur degree D12 of the first area and the blur degree D22 of the second area at the second focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D12 and/or D2i is closer to D22, performing the fine focus search in a first direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the first direction is the direction opposite to the far-focus direction.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the method further includes:
performing a coarse focus search on the object to be photographed and determining a third focus position, wherein the third focus position is a near-focus position;
separately calculating the blur degree D13 of the first area and the blur degree D23 of the second area at the third focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D13 and/or D2i is closer to D23, performing the fine focus search in a second direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the second direction is the direction opposite to the near-focus direction.
Optionally, the coarse focus search is performed with a first step size and the fine focus search is performed with a second step size, wherein the first step size is larger than the second step size.
Optionally, the dividing the preview image of the object to be photographed into a first area and a second area according to the first focus position includes:
dividing the preview image into a plurality of sub-regions, wherein the focus area of the preview image at the first focus position serves as one sub-region;
determining the focus area and the regions adjacent to the focus area as the first area;
determining the region of the preview image outside the first area as the second area.
Optionally, the dividing the preview image into a plurality of sub-regions includes:
determining the size of the focus area of the preview image of the object to be photographed at the first focus position;
dividing the preview image into a plurality of sub-regions according to the size of the focus area, wherein the size of each sub-region is the size of the focus area.
Optionally, the calculating the blur degree D11 of the first area at the first focus position includes:
separately calculating the blur degree of each sub-region of the first area at the first focus position;
calculating the weighted average of the blur degrees of the sub-regions of the first area as the blur degree D11 of the first area, wherein the focus area has the largest weight.
Optionally, the calculating the blur degree D21 of the second area at the first focus position includes:
separately calculating the blur degree of each sub-region of the second area at the first focus position;
calculating the average of the blur degrees of the sub-regions of the second area as the blur degree D21 of the second area.
Further, to achieve the above object, the present invention also provides a mobile terminal. The mobile terminal includes a memory, at least one processor, and at least one program stored on the memory and executable by the at least one processor, wherein the at least one program, when executed by the at least one processor, implements the steps of any of the methods described above.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium storing at least one program executable by a computer, wherein the at least one program, when executed by the computer, causes the computer to execute the steps of the above method.
Compared with the prior art, the focus adjustment method proposed by the present invention performs, during shooting, a coarse focus search on an object to be photographed and determines a first focus position; divides the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position and the second area includes the region of the preview image outside the first area; separately calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position; and, according to D11 and D21, determines a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position. The focus adjustment method provided by the invention thus guides the fine focus search by comparing regional blur degrees, which avoids the situation where the focus area is blurrier than the background area and improves the shooting experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal for implementing the embodiments of the present invention;
Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the present invention;
Fig. 3 is a first flow diagram of the focus adjustment method provided by an embodiment of the present invention;
Fig. 4 is a second flow diagram of the focus adjustment method provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of region division provided by an embodiment of the present invention.
The realization of the objects, the functional features and the advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be understood that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are adopted only to facilitate the description of the invention and have no specific meaning in themselves; therefore, "module", "component" and "unit" may be used interchangeably.
A terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, tablet computers, laptops, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
In the following description a mobile terminal is taken as an example, and those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Referring to Fig. 1, a schematic diagram of the hardware structure of a mobile terminal for implementing the embodiments of the present invention, the mobile terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110 and a power supply 111, the number of processors 110 being at least one. Those skilled in the art will appreciate that the mobile terminal structure shown in Fig. 1 does not limit the mobile terminal, which may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
The components of the mobile terminal are described below with reference to Fig. 1:
The radio frequency unit 101 can be used to receive and send signals during messaging or a call; specifically, it delivers downlink information received from a base station to the processor 110 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier and a duplexer. In addition, the radio frequency unit 101 can also communicate with networks and other devices through wireless communication, which may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing - Long Term Evolution) and TDD-LTE (Time Division Duplexing - Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102 the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media and so on, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the mobile terminal and can be omitted as needed without changing the essence of the invention.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the mobile terminal 100 is in a mode such as a call-signal reception mode, a call mode, a recording mode, a speech recognition mode or a broadcast reception mode. The audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call-signal reception sound or a message reception sound), and may include a loudspeaker, a buzzer and the like.
The A/V input unit 104 is used to receive audio or video signals and may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode; the processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or sent via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in an operating mode such as a phone call mode, a recording mode or a speech recognition mode, and can process such sound into audio data; in the phone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The mobile terminal 100 further includes at least one sensor 105, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when static, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the phone (such as portrait/landscape switching, related games and magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The phone can also be configured with other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor, which are not described here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key-signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects touch operations of the user on or near it (for example operations performed by the user on or near the touch panel 1071 with a finger, a stylus or any other suitable object or accessory) and drives the corresponding connection apparatus according to a preset program. The touch panel 1071 may include a touch detection apparatus and a touch controller: the touch detection apparatus detects the touch orientation of the user, detects the signal produced by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 can be implemented as a resistive, capacitive, infrared or surface-acoustic-wave panel, among other types. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse and a joystick, and are not limited here.
Further, the touch panel 1071 can cover the display panel 1061. After detecting a touch operation on or near it, the touch panel 1071 transmits it to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 are two independent components used to realize the input and output functions of the mobile terminal, in certain embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the mobile terminal, which is not limited here.
The interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and so on. The interface unit 108 can be used to receive input (for example data information or electric power) from an external device and transfer the received input to at least one element in the mobile terminal 100, or to transmit data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area can store the operating system and at least one application program required by a function (such as a sound playing function or an image playing function), and the data storage area can store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device or another non-volatile solid-state storage device.
The processor 110 is the control centre of the mobile terminal. It connects all parts of the entire mobile terminal by means of various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include at least one processing unit; preferably, the processor 110 can integrate an application processor, which mainly handles the operating system, the user interface and application programs, and a modem processor, which mainly handles wireless communication. It will be understood that the modem processor may also not be integrated into the processor 110.
The mobile terminal 100 may also include a power supply 111 (such as a battery) that supplies power to all components. Preferably, the power supply 111 can be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management and power-consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, which are not described here.
To facilitate the understanding of the embodiments of the present invention, the communication network system on which the mobile terminal of the present invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the present invention. The communication network system is an LTE system of the universal mobile communication technology and includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203 and an operator IP service 204, which are communicatively connected in sequence.
Specifically, the UE 201 may be the above-described terminal 100, which is not described here again.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022. The eNodeB 2021 can be connected to the other eNodeBs 2022 through backhaul (for example an X2 interface), and the eNodeB 2021 is connected to the EPC 203 and can provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035 and a PCRF (Policy and Charging Rules Function) 2036. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203 and provides bearer and connection management. The HSS 2032 provides registers to manage functions such as the home location register (not shown) and holds user-specific information such as service features and data rates. All user data can be sent through the SGW 2034; the PGW 2035 can provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
The IP service 204 may include the Internet, an intranet, an IMS (IP Multimedia Subsystem) or other IP services.
Although the above description takes the LTE system as an example, those skilled in the art should understand that the present invention is not only applicable to the LTE system but is also applicable to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA and future new network systems, which are not limited here.
Based on the above hardware structure of the mobile terminal 100 and the above communication network system, the embodiments of the method of the present invention are proposed. The mobile terminal in the embodiments of the present invention may be any mobile terminal with a shooting function.
Referring to Fig. 3, Fig. 3 is a step flow chart of the focus adjustment method provided by an embodiment of the present invention. The method is applied to a mobile terminal. As shown in Fig. 3, the method includes:
Step 301: during shooting, perform a coarse focus search on the object to be photographed and determine a first focus position.
In this step, during shooting, the mobile terminal performs a coarse focus search on the object to be photographed and determines the first focus position. The first focus position is the focus position determined after the coarse focus search.
Step 302: divide the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position, and the second area includes the region of the preview image outside the first area.
In this step, the mobile terminal divides the preview image of the object to be photographed into a first area and a second area according to the first focus position. The first area includes at least the focus area of the preview image at the first focus position, and the second area includes the region of the preview image outside the first area. It will be understood that the first area may include only the focus area of the preview image at the first focus position, or it may include the focus area together with the regions adjacent to the focus area.
In some embodiments of the invention, the mobile terminal may divide the preview image into a plurality of sub-regions, with the focus area of the preview image at the first focus position serving as one sub-region; it then determines the focus area and the regions adjacent to the focus area as the first area, and the region of the preview image outside the first area as the second area.
Step 303: separately calculate the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position.
In this step, the mobile terminal separately calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position.
The human visual system can only perceive blur around an edge when it exceeds a certain threshold; this threshold is called the just noticeable blur (Just Noticeable Blur, JNB) and is built on the concept of the just noticeable difference (Just Noticeable Difference, JND). The just noticeable difference is the minimum difference in gray level between foreground and background at which a change in gray level can be perceived visually. Accordingly, the just noticeable blur is the minimum amount of blur that can be perceived around an edge when the contrast between foreground and background exceeds the just noticeable difference.
The probability that blur can be perceived follows a psychometric function and can be modeled as
P(x) = 1 - \exp\left(-\left(x/\sigma_{JNB}\right)^{\beta}\right),
where \sigma_{JNB} and \beta are constants, and \beta usually takes the value 3.6.
For an edge pixel e_i, let \omega(e_i) denote the edge width and \omega_{JNB}(e_i) the edge width corresponding to just-noticeable blur; the probability that blur can be perceived around the edge pixel is then
P_{BLUR}(e_i) = 1 - \exp\left(-\left|\omega(e_i)/\omega_{JNB}(e_i)\right|^{\beta}\right).
The edge width can be computed by counting, along one direction, the pixels over which the gray value increases and, along the opposite direction, the pixels over which the gray value decreases, and taking the sum of the two counts for the edge pixel.
Once the model of the probability of perceiving blur at a single edge pixel is obtained, a probability-summation model can be used to express the probability of perceiving blur over a whole region R, and the blur degree over the region R can be calculated as
D(R) = 1 - \prod_{e_i \in R}\left(1 - P_{BLUR}(e_i)\right).
To reduce the amount of computation, the probability of perceiving blur over the whole image can be computed by multi-region processing: the whole image is divided into a plurality of regions and the blur degree is calculated for each region, the blur degree of each region R_b being calculated with the same formula:
D(R_b) = 1 - \prod_{e_i \in R_b}\left(1 - P_{BLUR}(e_i)\right).
It will be understood that the lower the blur degree, the sharper the image.
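As an illustration of how such a region blur degree might be computed, the following Python sketch assumes a fixed just-noticeable edge width, a simple gradient threshold for edge detection and a horizontal-only edge-width measurement; none of these choices are mandated by the description above, and the constants are illustrative.

```python
import numpy as np

BETA = 3.6    # psychometric slope, as noted above
W_JNB = 5     # just-noticeable edge width in pixels (fixed here as a simplifying assumption)

def edge_width(gray, y, x):
    """Edge width at (y, x): pixels with increasing gray value on one side of the
    edge plus pixels with decreasing gray value on the other side, measured along
    the horizontal direction only for simplicity."""
    row = gray[y]
    w, j = 1, x
    while j + 1 < row.size and row[j + 1] > row[j]:
        w += 1
        j += 1
    j = x
    while j - 1 >= 0 and row[j - 1] < row[j]:
        w += 1
        j -= 1
    return w

def region_blur(gray, region_mask, edge_thresh=40.0):
    """Blur degree of a region: pool the per-edge probabilities
    P = 1 - exp(-(w / W_JNB) ** BETA) with the probability-summation model."""
    gy, gx = np.gradient(gray.astype(float))
    edges = (np.hypot(gx, gy) > edge_thresh) & region_mask
    log_no_blur = 0.0
    for y, x in zip(*np.nonzero(edges)):
        p = 1.0 - np.exp(-(edge_width(gray, y, x) / W_JNB) ** BETA)
        log_no_blur += np.log(max(1.0 - p, 1e-12))
    return 1.0 - np.exp(log_no_blur)   # equals 1 - prod(1 - P(e_i)) over the region
```

A sharp region yields narrow edges, small per-edge probabilities and therefore a low blur degree, consistent with the remark above that a lower blur degree corresponds to a sharper image.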
Step 304: according to D11 and D21, determine a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position.
In this step, the mobile terminal determines the target focus position during the fine focus search stage according to D11 and D21, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position.
Specifically, the mobile terminal performs the fine focus search in the direction of positions close to the first focus position, and then determines the focus position that satisfies the above condition as the target focus position. In some embodiments of the invention, after the target focus position is determined, the mobile terminal photographs the object to be photographed at the target focus position.
In this embodiment, the focus adjustment method performs, during shooting, a coarse focus search on the object to be photographed and determines a first focus position; divides the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position and the second area includes the region of the preview image outside the first area; separately calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position; and, according to D11 and D21, determines a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position. In this way, the focus adjustment method guides the fine focus search by comparing regional blur degrees, which avoids the situation where the focus area is blurrier than the background area and thus improves the shooting experience.
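The embodiment of Fig. 3 can be pictured with the following minimal Python sketch. The camera interface and the helpers coarse_search, fine_search_positions, split_regions and area_blur are assumed placeholders rather than part of the disclosure (region division and per-area blur pooling are sketched later in this description), and the sum of absolute differences is only one possible reading of the "D1m closest to D11 and/or D2m closest to D21" criterion.

```python
COARSE_STEP = 8   # first step size (coarse search), illustrative value
FINE_STEP = 1     # second, smaller step size (fine search), illustrative value

def adjust_focus(camera):
    """Sketch of the flow of Fig. 3 under the assumptions stated above."""
    # Step 301: coarse focus search -> first focus position
    first_pos = coarse_search(camera, step=COARSE_STEP)

    # Step 302: split the preview image into the first area (focus area and its
    # neighbours) and the second area (everything else)
    preview = camera.preview_at(first_pos)
    first_area, second_area = split_regions(preview.shape[0], preview.shape[1],
                                            camera.focus_rect(first_pos))

    # Step 303: blur degrees of both areas at the first focus position
    d11 = area_blur(preview, first_area)
    d21 = area_blur(preview, second_area)

    # Step 304: fine focus search near the first focus position; keep the
    # position whose regional blur degrees are closest to (D11, D21)
    best_pos, best_diff = first_pos, float("inf")
    for pos in fine_search_positions(first_pos, step=FINE_STEP):
        img = camera.preview_at(pos)
        diff = abs(area_blur(img, first_area) - d11) + abs(area_blur(img, second_area) - d21)
        if diff < best_diff:
            best_pos, best_diff = pos, diff
    return best_pos   # target focus position, used for the final shot
```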
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the method further includes:
performing a coarse focus search on the object to be photographed and determining a second focus position, wherein the second focus position is a far-focus position;
separately calculating the blur degree D12 of the first area and the blur degree D22 of the second area at the second focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D12 and/or D2i is closer to D22, performing the fine focus search in a first direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the first direction is the direction opposite to the far-focus direction.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the method further includes:
performing a coarse focus search on the object to be photographed and determining a third focus position, wherein the third focus position is a near-focus position;
separately calculating the blur degree D13 of the first area and the blur degree D23 of the second area at the third focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D13 and/or D2i is closer to D23, performing the fine focus search in a second direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the second direction is the direction opposite to the near-focus direction.
Optionally, the coarse focus search is performed with a first step size and the fine focus search is performed with a second step size, wherein the first step size is larger than the second step size.
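One way to picture the optional direction test of the two clauses above is the following sketch, which compares the blur degrees measured at the current fine-search position against the far-focus reference (D12, D22) and the near-focus reference (D13, D23); the distance measure and the return values are illustrative assumptions, not part of the disclosure.

```python
def fine_search_direction(d1i, d2i, far_ref, near_ref):
    """Choose the fine-search direction from the blur degrees (d1i, d2i) measured
    at the current fine-search focus position.  far_ref = (D12, D22) at the
    far-focus position, near_ref = (D13, D23) at the near-focus position."""
    dist_far = abs(d1i - far_ref[0]) + abs(d2i - far_ref[1])
    dist_near = abs(d1i - near_ref[0]) + abs(d2i - near_ref[1])
    if dist_far <= dist_near:
        return "away_from_far_focus"   # first direction: opposite to the far-focus direction
    return "away_from_near_focus"      # second direction: opposite to the near-focus direction
```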
Referring to Fig. 4, Fig. 4 is a second flow diagram of the focus adjustment method provided by an embodiment of the present invention. The focus adjustment method is applied to a mobile terminal. As shown in Fig. 4, the method includes:
Step 401: during shooting, perform a coarse focus search on the object to be photographed and determine a first focus position.
Step 401 is the same as step 301 in the embodiment of the invention shown in Fig. 3 and is not described here again.
Step 402: divide the preview image into a plurality of sub-regions, wherein the focus area of the preview image at the first focus position serves as one sub-region.
In this step, the mobile terminal divides the preview image into a plurality of sub-regions, wherein the focus area of the preview image at the first focus position serves as one sub-region. In the embodiment of the invention, the mobile terminal may divide the preview image into a plurality of sub-regions of the same size according to the size of the focus area, or may divide it into a plurality of sub-regions of different sizes, for example by super-pixel segmentation or by macro-block division.
Step 403: determine the focus area and the regions adjacent to the focus area as the first area.
In this step, the mobile terminal determines the focus area and the regions adjacent to the focus area as the first area.
Step 404: determine the region of the preview image outside the first area as the second area.
In this step, the mobile terminal determines the region of the preview image outside the first area as the second area.
The region division is described in detail below, taking sub-regions of identical size as an example. Referring to Fig. 5, a schematic diagram of region division provided by an embodiment of the present invention, suppose the focus area is region B4. The mobile terminal can divide the preview image into a plurality of sub-regions of the same size as region B4, and the regions adjacent to the focus area B4 are regions A3, A4, A5, B3, B5, C3, C4 and C5. The mobile terminal can then determine the area formed by regions A3, A4, A5, B3, B4, B5, C3, C4 and C5 as the first area.
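Under the uniform-grid assumption of Fig. 5, one possible realization of this region division (and of the split_regions helper assumed in the earlier sketch) is shown below; the focus rectangle format is an assumption made for illustration.

```python
def split_regions(image_h, image_w, focus_rect):
    """Split the preview frame into cells the size of the focus area and return
    the cell indices of the first area (focus cell plus its 8 neighbours) and of
    the second area (all remaining cells)."""
    fx, fy, fw, fh = focus_rect           # focus area: top-left corner (fx, fy) and size (fw, fh)
    cols = image_w // fw
    rows = image_h // fh
    fc, fr = fx // fw, fy // fh           # grid coordinates of the focus cell

    first_area, second_area = [], []
    for r in range(rows):
        for c in range(cols):
            if abs(r - fr) <= 1 and abs(c - fc) <= 1:
                first_area.append((r, c))  # focus cell and its adjacent cells
            else:
                second_area.append((r, c))
    return first_area, second_area
```

For the layout of Fig. 5 this puts the focus cell B4 and its neighbours A3, A4, A5, B3, B5, C3, C4 and C5 into the first area and every remaining cell into the second area.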
Step 405: separately calculate the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position.
Step 406: according to D11 and D21, determine a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position.
Steps 405 and 406 are the same as steps 303 and 304 in the embodiment of the invention shown in Fig. 3 and are not described here again.
In this embodiment, the focus adjustment method performs, during shooting, a coarse focus search on the object to be photographed and determines a first focus position; divides the preview image into a plurality of sub-regions, with the focus area of the preview image at the first focus position serving as one sub-region; determines the focus area and the regions adjacent to the focus area as the first area; determines the region of the preview image outside the first area as the second area; separately calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position; and, according to D11 and D21, determines a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position. In this way, the focus adjustment method guides the fine focus search by comparing regional blur degrees, which avoids the situation where the focus area is blurrier than the background area and thus improves the shooting experience.
Optionally, the dividing the preview image into a plurality of sub-regions includes:
determining the size of the focus area of the preview image of the object to be photographed at the first focus position;
dividing the preview image into a plurality of sub-regions according to the size of the focus area, wherein the size of each sub-region is the size of the focus area.
In this embodiment, the mobile terminal divides the preview image into a plurality of sub-regions according to the size of the focus area, the size of each sub-region being the size of the focus area.
Optionally, the calculating the blur degree D11 of the first area at the first focus position includes:
separately calculating the blur degree of each sub-region of the first area at the first focus position;
calculating the weighted average of the blur degrees of the sub-regions of the first area as the blur degree D11 of the first area, wherein the focus area has the largest weight.
In this embodiment, the mobile terminal may separately calculate the blur degree of each sub-region of the first area at the first focus position and then calculate the weighted average of the blur degrees of these sub-regions as the blur degree D11 of the first area, with the focus area carrying the largest weight.
For example, as shown in Fig. 5, assuming that the weight of the focus area B4 is 0.2 and the weight of each of the other sub-regions is 0.1, the blur degree of the first area is D11 = 0.2*D_B4 + 0.1*(D_A3 + D_A4 + D_A5 + D_B3 + D_B5 + D_C3 + D_C4 + D_C5).
Optionally, the calculating the blur degree D21 of the second area at the first focus position includes:
separately calculating the blur degree of each sub-region of the second area at the first focus position;
calculating the average of the blur degrees of the sub-regions of the second area as the blur degree D21 of the second area.
In this embodiment, the mobile terminal may separately calculate the blur degree of each sub-region of the second area at the first focus position and then calculate the average of the blur degrees of these sub-regions as the blur degree D21 of the second area.
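Combining the per-sub-region blur degrees into D11 and D21 as just described can be sketched as follows; the 0.2/0.1 weights are taken from the worked example above and are illustrative only.

```python
def area_blur_degrees(cell_blur, first_area, second_area, focus_cell,
                      focus_weight=0.2, other_weight=0.1):
    """cell_blur maps a grid cell (row, col) to its blur degree.
    Returns (D11, D21).  The weights should sum to 1 over the first area so that
    D11 is a weighted average, with the focus cell carrying the largest weight."""
    d11 = sum((focus_weight if cell == focus_cell else other_weight) * cell_blur[cell]
              for cell in first_area)
    d21 = sum(cell_blur[cell] for cell in second_area) / len(second_area)
    return d11, d21
```

For the Fig. 5 layout with nine first-area cells and focus cell B4, this reproduces D11 = 0.2*D_B4 + 0.1*(D_A3 + D_A4 + D_A5 + D_B3 + D_B5 + D_C3 + D_C4 + D_C5).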
Those of ordinary skill in the art will appreciate that all or part of the steps of the methods of the above embodiments can be completed by hardware related to at least one program instruction. The at least one program can be stored in the memory 109 of the mobile terminal 100 shown in Fig. 1 and executed by the processor 110, and the at least one program, when executed by the processor 110, implements the following steps:
during shooting, performing a coarse focus search on an object to be photographed and determining a first focus position;
dividing the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position, and the second area includes the region of the preview image outside the first area;
separately calculating the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position;
according to D11 and D21, determining a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the processor 110 can further implement the following steps:
performing a coarse focus search on the object to be photographed and determining a second focus position, wherein the second focus position is a far-focus position;
separately calculating the blur degree D12 of the first area and the blur degree D22 of the second area at the second focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D12 and/or D2i is closer to D22, performing the fine focus search in a first direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the first direction is the direction opposite to the far-focus direction.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the processor 110 can further implement the following steps:
performing a coarse focus search on the object to be photographed and determining a third focus position, wherein the third focus position is a near-focus position;
separately calculating the blur degree D13 of the first area and the blur degree D23 of the second area at the third focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D13 and/or D2i is closer to D23, performing the fine focus search in a second direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the second direction is the direction opposite to the near-focus direction.
Optionally, the coarse focus search is performed with a first step size and the fine focus search is performed with a second step size, wherein the first step size is larger than the second step size.
Optionally, the dividing the preview image of the object to be photographed into a first area and a second area according to the first focus position includes:
dividing the preview image into a plurality of sub-regions, wherein the focus area of the preview image at the first focus position serves as one sub-region;
determining the focus area and the regions adjacent to the focus area as the first area;
determining the region of the preview image outside the first area as the second area.
Optionally, the dividing the preview image into a plurality of sub-regions includes:
determining the size of the focus area of the preview image of the object to be photographed at the first focus position;
dividing the preview image into a plurality of sub-regions according to the size of the focus area, wherein the size of each sub-region is the size of the focus area.
Optionally, the calculating the blur degree D11 of the first area at the first focus position includes:
separately calculating the blur degree of each sub-region of the first area at the first focus position;
calculating the weighted average of the blur degrees of the sub-regions of the first area as the blur degree D11 of the first area, wherein the focus area has the largest weight.
Optionally, the calculating the blur degree D21 of the second area at the first focus position includes:
separately calculating the blur degree of each sub-region of the second area at the first focus position;
calculating the average of the blur degrees of the sub-regions of the second area as the blur degree D21 of the second area.
In this embodiment, the mobile terminal performs, during shooting, a coarse focus search on the object to be photographed and determines a first focus position; divides the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position and the second area includes the region of the preview image outside the first area; separately calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position; and, according to D11 and D21, determines a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position. In this way, the mobile terminal guides the fine focus search by comparing regional blur degrees, which avoids the situation where the focus area is blurrier than the background area and thus improves the shooting experience.
Those of ordinary skill in the art will appreciate that all or part of the steps of the methods of the above embodiments can be completed by hardware related to at least one program instruction. The at least one program can be stored in a computer-readable storage medium and, when executed, performs the following steps:
during shooting, performing a coarse focus search on an object to be photographed and determining a first focus position;
dividing the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position, and the second area includes the region of the preview image outside the first area;
separately calculating the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position;
according to D11 and D21, determining a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the at least one program, when executed, can further implement the following steps:
performing a coarse focus search on the object to be photographed and determining a second focus position, wherein the second focus position is a far-focus position;
separately calculating the blur degree D12 of the first area and the blur degree D22 of the second area at the second focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D12 and/or D2i is closer to D22, performing the fine focus search in a first direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the first direction is the direction opposite to the far-focus direction.
Optionally, after the preview image of the object to be photographed is divided into the first area and the second area according to the first focus position, the at least one program, when executed, can further implement the following steps:
performing a coarse focus search on the object to be photographed and determining a third focus position, wherein the third focus position is a near-focus position;
separately calculating the blur degree D13 of the first area and the blur degree D23 of the second area at the third focus position;
and the determining a target focus position during the fine focus search stage according to D11 and D21 includes:
during the fine focus search, if D1i is closer to D13 and/or D2i is closer to D23, performing the fine focus search in a second direction and determining the target focus position, wherein D1i is the blur degree of the first area at the current fine-search focus position, D2i is the blur degree of the second area at the current fine-search focus position, and the second direction is the direction opposite to the near-focus direction.
Optionally, the coarse focus search is performed with a first step size and the fine focus search is performed with a second step size, wherein the first step size is larger than the second step size.
Optionally, the dividing the preview image of the object to be photographed into a first area and a second area according to the first focus position includes:
dividing the preview image into a plurality of sub-regions, wherein the focus area of the preview image at the first focus position serves as one sub-region;
determining the focus area and the regions adjacent to the focus area as the first area;
determining the region of the preview image outside the first area as the second area.
Optionally, the dividing the preview image into a plurality of sub-regions includes:
determining the size of the focus area of the preview image of the object to be photographed at the first focus position;
dividing the preview image into a plurality of sub-regions according to the size of the focus area, wherein the size of each sub-region is the size of the focus area.
Optionally, the calculating the blur degree D11 of the first area at the first focus position includes:
separately calculating the blur degree of each sub-region of the first area at the first focus position;
calculating the weighted average of the blur degrees of the sub-regions of the first area as the blur degree D11 of the first area, wherein the focus area has the largest weight.
Optionally, the calculating the blur degree D21 of the second area at the first focus position includes:
separately calculating the blur degree of each sub-region of the second area at the first focus position;
calculating the average of the blur degrees of the sub-regions of the second area as the blur degree D21 of the second area.
In this embodiment, the at least one program, when executed, performs, during shooting, a coarse focus search on the object to be photographed and determines a first focus position; divides the preview image of the object to be photographed into a first area and a second area according to the first focus position, wherein the first area includes the focus area of the preview image at the first focus position and the second area includes the region of the preview image outside the first area; separately calculates the blur degree D11 of the first area and the blur degree D21 of the second area at the first focus position; and, according to D11 and D21, determines a target focus position during the fine focus search stage, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the blur degree of the first area at the target focus position and D2m being the blur degree of the second area at the target focus position. In this way, when the at least one program is executed, the fine focus search is guided by comparing regional blur degrees, which avoids the situation where the focus area is blurrier than the background area and thus improves the shooting experience.
It should be noted that, in this document, the terms "include", "comprise" or its any other variant are intended to non-row His property includes, so that the process, method, article or the device that include a series of elements not only include those elements, and And further include other elements that are not explicitly listed, or further include for this process, method, article or device institute it is intrinsic Element.In the absence of more restrictions, the element limited by sentence "including a ...", it is not excluded that including being somebody's turn to do There is also other identical elements in the process, method of element, article or device.
The serial number of the above embodiments of the invention is only for description, does not represent the advantages or disadvantages of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the preferable implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The above is only a preferred embodiment of the present invention and is not intended to limit the scope of the present invention. Any equivalent structure or equivalent process transformation made by using the contents of the specification and the accompanying drawings of the present invention, applied directly or indirectly in other related technical fields, falls within the scope of protection of the present invention.

Claims (10)

1. A focal adjustment method applied to a mobile terminal, characterized in that the method comprises:
during shooting, performing a focusing coarse search on an object to be captured to determine a first focusing position;
dividing a preview image of the object to be captured into a first area and a second area according to the first focusing position, wherein the first area includes a focusing area of the preview image under the first focusing position, and the second area includes a region of the preview image outside the first area;
calculating separately a fuzziness D11 of the first area and a fuzziness D21 of the second area under the first focusing position;
determining a target focusing position in a fine-search phase according to D11 and D21, wherein D1m is closest to D11 and/or D2m is closest to D21, D1m being the fuzziness of the first area under the target focusing position, and D2m being the fuzziness of the second area under the target focusing position.
2. The focal adjustment method according to claim 1, characterized in that after the dividing the preview image of the object to be captured into the first area and the second area according to the first focusing position, the method further comprises:
performing a focusing coarse search on the object to be captured to determine a second focusing position, wherein the second focusing position is a far-focus position;
calculating separately a fuzziness D12 of the first area and a fuzziness D22 of the second area under the second focusing position;
the determining the target focusing position in the fine-search phase according to D11 and D21 comprises:
during the fine search, if D1i is closer to D12 and/or D2i is closer to D22, performing the fine search in a first direction to determine the target focusing position, wherein D1i is the fuzziness of the first area under a focusing position of the fine search, D2i is the fuzziness of the second area under the focusing position of the fine search, and the first direction is the direction opposite to the far-focus direction.
3. The focal adjustment method according to claim 1 or 2, characterized in that after the dividing the preview image of the object to be captured into the first area and the second area according to the first focusing position, the method further comprises:
performing a focusing coarse search on the object to be captured to determine a third focusing position, wherein the third focusing position is a near-focus position;
calculating separately a fuzziness D13 of the first area and a fuzziness D23 of the second area under the third focusing position;
the determining the target focusing position in the fine-search phase according to D11 and D21 comprises:
during the fine search, if D1i is closer to D13 and/or D2i is closer to D23, performing the fine search in a second direction to determine the target focusing position, wherein D1i is the fuzziness of the first area under a focusing position of the fine search, D2i is the fuzziness of the second area under the focusing position of the fine search, and the second direction is the direction opposite to the near-focus direction.
4. The focal adjustment method according to claim 1, characterized in that the focusing coarse search scans at intervals of a first step length and the fine search scans at a second step length, wherein the first step length is greater than the second step length.
5. The focal adjustment method according to claim 1, characterized in that the dividing the preview image of the object to be captured into the first area and the second area according to the first focusing position comprises:
dividing the preview image into multiple subregions, wherein the focusing area of the preview image under the first focusing position serves as one subregion;
determining that the focusing area and the regions adjacent to the focusing area constitute the first area;
determining that the region of the preview image outside the first area is the second area.
6. The focal adjustment method according to claim 5, characterized in that the dividing the preview image into multiple subregions comprises:
determining the size of the focusing area of the preview image of the object to be captured under the first focusing position;
dividing the preview image into multiple subregions according to the size of the focusing area, wherein the size of each subregion is the size of the focusing area.
7. The focal adjustment method according to claim 5 or 6, characterized in that the calculating the fuzziness D11 of the first area under the first focusing position comprises:
calculating separately the fuzziness of each subregion of the first area under the first focusing position;
taking the weighted average of the fuzziness values of the subregions of the first area as the fuzziness D11 of the first area, wherein the weight of the focusing area is the largest.
8. The focal adjustment method according to claim 5 or 6, characterized in that the calculating the fuzziness D21 of the second area under the first focusing position comprises:
calculating separately the fuzziness of each subregion of the second area under the first focusing position;
taking the average of the fuzziness values of the subregions of the second area as the fuzziness D21 of the second area.
9. A mobile terminal, characterized in that the mobile terminal comprises a memory, at least one processor, and at least one program stored on the memory and executable on the at least one processor, wherein the at least one program, when executed by the at least one processor, implements the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium storing at least one computer-executable program, characterized in that the at least one program, when executed by a computer, causes the computer to perform the steps of the method according to any one of claims 1 to 8.
CN201910359659.9A 2019-04-30 2019-04-30 Focusing adjustment method, mobile terminal and computer readable storage medium Active CN109951645B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910359659.9A CN109951645B (en) 2019-04-30 2019-04-30 Focusing adjustment method, mobile terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910359659.9A CN109951645B (en) 2019-04-30 2019-04-30 Focusing adjustment method, mobile terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN109951645A true CN109951645A (en) 2019-06-28
CN109951645B CN109951645B (en) 2020-10-30

Family

ID=67016761

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910359659.9A Active CN109951645B (en) 2019-04-30 2019-04-30 Focusing adjustment method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN109951645B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101236348A (en) * 2008-03-13 2008-08-06 北京中星微电子有限公司 Automatic focusing method and image collecting device
EP2166408B1 (en) * 2008-09-17 2014-03-12 Ricoh Company, Ltd. Imaging device and imaging method using the same
CN101795361A (en) * 2009-01-30 2010-08-04 索尼公司 Two-dimensional polynomial model for depth estimation based on two-picture matching
US20130070145A1 (en) * 2011-09-20 2013-03-21 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof
CN103217855A (en) * 2013-04-02 2013-07-24 金三立视频科技(深圳)有限公司 Automatic focusing method of camera
CN104516170A (en) * 2013-10-03 2015-04-15 群光电子股份有限公司 Three-dimensional focusing method and system thereof
CN103945126A (en) * 2014-04-21 2014-07-23 中国人民解放军国防科学技术大学 Automatic focusing and locating method
US20160309091A1 (en) * 2015-04-16 2016-10-20 Canon Kabushiki Kaisha Display apparatus, display control method, and image capturing apparatus
CN106331499A (en) * 2016-09-13 2017-01-11 努比亚技术有限公司 Focusing method and shooting equipment
CN108076278A (en) * 2016-11-10 2018-05-25 阿里巴巴集团控股有限公司 A kind of Atomatic focusing method, device and electronic equipment

Also Published As

Publication number Publication date
CN109951645B (en) 2020-10-30

Similar Documents

Publication Publication Date Title
CN107093418A (en) A kind of screen display method, computer equipment and storage medium
CN108495056A (en) Photographic method, mobile terminal and computer readable storage medium
CN109254715A (en) Screen display direction control method, device, mobile terminal and readable storage medium storing program for executing
CN109167910A (en) focusing method, mobile terminal and computer readable storage medium
CN108900778A (en) A kind of image pickup method, mobile terminal and computer readable storage medium
CN108063901A (en) A kind of image-pickup method, terminal and computer readable storage medium
CN107317963A (en) A kind of double-camera mobile terminal control method, mobile terminal and storage medium
CN110035176A (en) A kind of brightness adjusting method of mobile terminal, mobile terminal and storage medium
CN108459799B (en) Picture processing method, mobile terminal and computer readable storage medium
CN109819163A (en) A kind of image processing control, terminal and computer readable storage medium
CN108989873A (en) Barrage information display method, mobile terminal and computer readable storage medium
CN108900780A (en) A kind of screen light compensation method, mobile terminal and storage medium
CN110086993A (en) Image processing method, device, mobile terminal and computer readable storage medium
CN109348137A (en) Mobile terminal camera control method, device, mobile terminal and storage medium
CN109300099A (en) A kind of image processing method, mobile terminal and computer readable storage medium
CN108053371A (en) A kind of image processing method, terminal and computer readable storage medium
CN109672822A (en) A kind of method for processing video frequency of mobile terminal, mobile terminal and storage medium
CN109992183A (en) Preview picture and the method for selection, terminal and storage medium
CN109525783A (en) A kind of exposure image pickup method, terminal and computer readable storage medium
CN110278481A (en) Picture-in-picture implementing method, terminal and computer readable storage medium
CN110177207A (en) Image pickup method, mobile terminal and the computer readable storage medium of backlight image
CN110189368A (en) Method for registering images, mobile terminal and computer readable storage medium
CN109005354A (en) Image pickup method, mobile terminal and computer readable storage medium
CN109739414A (en) A kind of image processing method, mobile terminal, computer readable storage medium
CN108234979A (en) A kind of image pickup method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant