CN107580181A - Focusing method, device, and computer-readable storage medium - Google Patents
- Publication number: CN107580181A
- Application number: CN201710752706.7A
- Authority: CN (China)
- Legal status: Pending (an assumption; Google has not performed a legal analysis)
Abstract
The embodiment of the invention discloses a focusing method, the method including: obtaining at least two first focusing areas; for each first focusing area, obtaining a first distance between the corresponding object to be captured and the image acquisition device; determining a second distance based on the first distances; and determining the first focusing area corresponding to the object to be captured whose distance is the second distance as the second focusing area. The embodiment of the invention also discloses a focusing device and a computer-readable storage medium. The terminal thus automatically determines the focusing area according to the object to be captured, performs auto-focusing, and obtains a clear image, improving the quality of the captured pictures.
Description
Technical field
The present invention relates to intelligent shooting technology in the field of photography, and in particular to a focusing method, device, and computer-readable storage medium.
Background technology
With the popularization and development of terminals, terminal performance has become ever stronger and terminal functions ever richer, including the photographing function. At present, when a terminal is used for macro photography, the focusing area is typically a default area; when users want to focus on a specific region, they must continually move the terminal or touch its display screen so that the object to be captured falls within the focusing area.

With this existing focusing technique, the terminal cannot automatically determine the focusing area according to the region in which the object to be captured appears in the viewfinder, so its degree of intelligence is not high. Moreover, if the object to be captured is not in the default focusing area and the user does not wish to move the terminal, the resulting picture quality is poor.
Summary of the invention
In view of this, embodiments of the present invention aim to provide a focusing method, device, and computer-readable storage medium that solve the prior-art problem that a terminal cannot automatically determine a focusing area according to the region of the object to be captured in the viewfinder, so that the terminal can intelligently determine the focusing area according to where the object to be captured appears in the viewfinder and thereby guarantee the quality of the captured pictures.
To achieve the above purpose, the technical solution of the invention is realized as follows:
A focusing method, the method including:

obtaining at least two first focusing areas;

for each first focusing area, obtaining a first distance between the corresponding object to be captured and the image acquisition device;

determining a second distance based on the first distances;

determining the first focusing area corresponding to the object to be captured whose distance is the second distance as the second focusing area.
Optionally, the obtaining at least two first focusing areas includes:

obtaining a characteristic parameter of each object to be captured in the viewfinder of the image acquisition device, wherein the characteristic parameter represents the definition of the object to be captured as collected in the viewfinder;

determining a third focusing area based on the characteristic parameters;

obtaining the first focusing areas based on the third focusing area.
Optionally, the determining a third focusing area based on the characteristic parameters includes:

determining, based on the relations between the characteristic parameters, the region whose characteristic parameter meets a preset condition as a first region;

adjusting the first region according to a preset adjustment rule to obtain the third focusing area.
Optionally, the obtaining the first focusing areas based on the third focusing area includes:

dividing the third focusing area according to a preset division rule to obtain the first focusing areas.
Optionally, the determining a second distance based on the first distances includes:

comparing the magnitudes of the first distances;

determining, based on the magnitude relations between the first distances, the first distance with the smallest value as the second distance.
Optionally, after the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area, the method further includes:

determining the object to be captured corresponding to the second focusing area as the target focusing object;

performing focusing processing on the target focusing object, and collecting the objects to be captured through the image acquisition device to obtain a target image, wherein the objects to be captured include the target focusing object after focusing processing.
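The claimed steps can be sketched as a minimal Python function (names such as `first_areas` and `get_distance` are illustrative, not from the patent):

```python
def select_second_focusing_area(first_areas, get_distance):
    """Pick the focusing area(s) whose object is nearest the camera.

    first_areas: list of first-focusing-area identifiers (at least two).
    get_distance: callable mapping an area to the first distance between
    its object to be captured and the image acquisition device.
    """
    # Steps 1-2: obtain a first distance for each first focusing area.
    distances = {area: get_distance(area) for area in first_areas}
    # Step 3: the second distance is the minimum first distance.
    second_distance = min(distances.values())
    # Step 4: every area at that distance becomes a second focusing area
    # (ties are possible, so more than one area may be returned).
    return [a for a, d in distances.items() if d == second_distance]

areas = select_second_focusing_area(["A", "B", "C"],
                                    {"A": 1.2, "B": 0.4, "C": 0.4}.get)
# areas == ["B", "C"]
```

Returning a list rather than a single area mirrors the patent's note, in the detailed description, that multiple first distances can share the same minimum value.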
A focusing device, the device including: a processor, a memory, a communication bus, and an image acquisition device;

the communication bus being used to realize connection and communication between the processor and the memory;

the processor being used to execute a focusing program stored in the memory, to realize the following steps:

obtaining at least two first focusing areas;

for each first focusing area, obtaining a first distance between the corresponding object to be captured and the image acquisition device;

determining a second distance based on the first distances;

determining the first focusing area corresponding to the object to be captured whose distance is the second distance as the second focusing area.
Optionally, the processor is further used to execute the focusing program to realize the following steps:

obtaining a characteristic parameter of each object to be captured in the viewfinder of the image acquisition device, wherein the characteristic parameter represents the definition of the object to be captured as collected in the viewfinder;

determining a third focusing area based on the characteristic parameters;

obtaining the first focusing areas based on the third focusing area.
Optionally, the processor is further used to execute the focusing program to realize the following step:

determining, based on the magnitude relations between the first distances, the first distance with the smallest value as the second distance.
A computer-readable storage medium storing a focusing program which, when executed by a processor, realizes the steps of any of the focusing methods described above.
With the focusing method, device, and computer-readable storage medium provided by embodiments of the invention, after at least two first focusing areas are obtained, the first distance between the object to be captured corresponding to each first focusing area and the image acquisition device is obtained; a second distance is then determined based on the first distances, and the first focusing area corresponding to the object to be captured at the second distance is determined as the second focusing area. In this way, the terminal can first determine multiple focusing areas and then, according to the relations between the distances from the objects to be captured to the image acquisition device, obtain from those areas the focusing area corresponding to the object that needs focusing processing. This solves the prior-art problem that a terminal cannot actively determine a focusing area according to the object to be captured and realize auto-focusing, enables the terminal to determine the focusing area automatically, perform auto-focusing, and obtain a clear image, and improves the quality of the captured pictures.
Brief description of the drawings
Fig. 1 is a hardware structure diagram of an optional mobile terminal for realizing embodiments of the present invention;

Fig. 2 is an architecture diagram of a communications network system provided by an embodiment of the present invention;

Fig. 3 is a flow diagram of a focusing method provided by an embodiment of the present invention;

Fig. 4 is a flow diagram of another focusing method provided by an embodiment of the present invention;

Fig. 5 is a schematic diagram of an application scenario of a focusing device provided by an embodiment of the present invention;

Fig. 6 is a schematic diagram of another application scenario of a focusing device provided by an embodiment of the present invention;

Fig. 7 is a schematic diagram of yet another application scenario of a focusing device provided by an embodiment of the present invention;

Fig. 8 is a structural diagram of a focusing device provided by an embodiment of the present invention.
Embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
In the subsequent description, suffixes such as "module", "part", or "unit" used to denote elements are adopted only to facilitate the explanation of the present invention and have no specific meaning in themselves; "module", "part", and "unit" may therefore be used interchangeably.
Terminals can be implemented in a variety of forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The subsequent description takes a mobile terminal as an example; those skilled in the art will appreciate that, apart from elements used specifically for mobile purposes, the construction according to embodiments of the present invention can also be applied to terminals of the fixed type.
Referring to Fig. 1, which is a hardware structure diagram of a mobile terminal realizing embodiments of the present invention, the mobile terminal 100 can include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111, among other parts. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not limit the mobile terminal; the mobile terminal may include more or fewer parts than illustrated, combine some parts, or arrange the parts differently.
The parts of the mobile terminal are introduced in detail below with reference to Fig. 1:
The radio frequency unit 101 can be used for receiving and sending signals during messaging or a call; specifically, it receives downlink information from a base station and hands it to the processor 110 for handling, and it sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with networks and other devices through wireless communication. The above wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology; through the WiFi module 102 the mobile terminal can help users send and receive e-mail, browse web pages, access streaming video, and so on, providing users with wireless broadband internet access. Although Fig. 1 shows the WiFi module 102, it will be understood that it is not an essential part of the mobile terminal and can be omitted as needed without changing the essence of the invention.
The audio output unit 103 can, when the mobile terminal 100 is in a mode such as a call-signal reception mode, a call mode, a recording mode, a speech recognition mode, or a broadcast reception mode, convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call-signal reception sound or a message reception sound). The audio output unit 103 can include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 can include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042; the graphics processor 1041 processes the image data of static pictures or video obtained by an image capture apparatus (such as a camera) in a video acquisition mode or an image capture mode. The processed picture frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operational modes such as a telephone call mode, a recording mode, or a speech recognition mode, and can process such sound into audio data. In the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated while receiving and sending audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can close the display panel 1061 and/or its backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes) and, when static, the magnitude and direction of gravity; it can be used for applications that identify the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer pose calibration) and for vibration-identification-related functions (such as a pedometer or tapping). The mobile phone can also be configured with other sensors such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be repeated here.
The display unit 106 is used to display information input by the user or supplied to the user. The display unit 106 can include a display panel 1061, which can be configured in a form such as a liquid crystal display (Liquid Crystal Display, LCD) or an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display.
The user input unit 107 can be used to receive input numeric or character information and to produce key-signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connecting means according to a preset formula. The touch panel 1071 may include two parts: a touch detecting device and a touch controller. The touch detecting device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detecting device, converts it into contact coordinates, sends them to the processor 110, and receives and executes the commands sent by the processor 110. In addition, the touch panel 1071 can be realized in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 can also include other input devices 1072. Specifically, the other input devices 1072 can include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and switch keys), a trackball, a mouse, and a joystick, without specific limitation here.
Further, the touch panel 1071 can cover the display panel 1061; after the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 realize the input and output functions of the mobile terminal as two independent parts, in certain embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the mobile terminal, without specific limitation here.
The interface unit 108 serves as an interface through which at least one external device can be connected with the mobile terminal 100. For example, the external device can include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input from an external device (for example, data information or electric power) and transfer the received input to one or more elements in the mobile terminal 100, or can be used to transmit data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 can mainly include a program storage area and a data storage area; the program storage area can store an operating system, application programs needed for at least one function (such as a sound playing function or an image playing function), and so on, while the data storage area can store data created according to the use of the mobile phone (such as audio data or a phone book). In addition, the memory 109 can include high-speed random access memory and can also include non-volatile memory, for example at least one disk memory device, flash memory device, or other solid-state memory part.
The processor 110 is the control center of the mobile terminal; using various interfaces and lines it connects the parts of the whole mobile terminal, and by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, it performs the various functions of the mobile terminal and processes data, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 can integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, user interface, application programs, and so on, and the modem processor mainly handles wireless communication. It is understood that the above modem processor may also not be integrated into the processor 110.
The mobile terminal 100 can also include the power supply 111 (such as a battery) that supplies power to the parts; preferably, the power supply 111 can be logically connected with the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 can also include a Bluetooth module and the like, which will not be repeated here.
For ease of understanding the embodiments of the present invention, the communications network system on which the mobile terminal of the present invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communications network system provided by an embodiment of the present invention. The communications network system is an LTE system of the universal mobile communications technology, and the LTE system includes, communicatively connected in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204.
Specifically, the UE 201 can be the above terminal 100, which will not be repeated here.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022, among others. The eNodeB 2021 can be connected with other eNodeBs 2022 through a backhaul (such as an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 can provide the UE 201 with access to the EPC 203.
The EPC 203 can include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and so on. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203 and provides bearer and connection management. The HSS 2032 is used to provide registers to manage functions such as a home location register (not shown) and preserves user-specific information about service features, data rates, and the like. All user data can be transmitted through the SGW 2034; the PGW 2035 can provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for the policy and charging execution function unit (not shown).
The IP services 204 can include the internet, intranets, an IMS (IP Multimedia Subsystem), or other IP services.
Although the above is described with the LTE system as an example, those skilled in the art should understand that the present invention is not only suitable for the LTE system but is also applicable to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems, without limitation here.
Based on the above mobile terminal hardware structure and communications network system, the embodiments of the method of the invention are proposed.
Embodiments of the invention provide a focusing method; referring to Fig. 3, the method includes the following steps:
Step 301, obtaining at least two first focusing areas.

Specifically, step 301, obtaining at least two first focusing areas, can be realized by a focusing device. The focusing device can be a terminal, and the terminal can be a mobile terminal with a camera function.

After receiving an image capture instruction, the terminal analyzes the image of the objects to be captured shown in the viewfinder to determine a first area, then adjusts the first area according to a preset adjustment rule to obtain a third focusing area, and finally divides the third focusing area according to a preset division rule, thereby obtaining at least two first focusing areas. The image capture instruction can be a photographing instruction sent to the terminal in a manner such as touching the terminal's display screen after the user opens the photographing application program in the terminal, or through voice input; for example, it can be an instruction to start the photographing application program to collect images.
Step 302, for each first focusing area, obtaining a first distance between the corresponding object to be captured and the image acquisition device.

Specifically, step 302, obtaining for each first focusing area the first distance between the corresponding object to be captured and the image acquisition device, can be realized by the focusing device. The terminal can use a preset distance acquisition method to obtain the distance between the object to be captured corresponding to each first focusing area and the image acquisition device, wherein each focusing area corresponds to at least one object to be captured. When a focusing area corresponds to at least two objects to be captured, at least two corresponding first distances between objects to be captured and the image acquisition device are obtained in that first focusing area.
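The per-area collection described above can be sketched as follows (the names `areas_to_objects` and `measure` are illustrative, not from the patent; `measure` stands in for any of the ranging methods described next):

```python
def first_distances(areas_to_objects, measure):
    """For each first focusing area, obtain the first distance(s) between
    its object(s) to be captured and the image acquisition device.

    areas_to_objects: mapping area -> list of its objects (at least one each).
    measure: callable object -> distance, e.g. a dual-camera ranging routine.
    """
    # An area with two objects yields two first distances, as the text notes.
    return {area: [measure(obj) for obj in objs]
            for area, objs in areas_to_objects.items()}

d = first_distances({"left": ["cup"], "right": ["vase", "book"]},
                    {"cup": 0.5, "vase": 0.3, "book": 0.9}.get)
# d == {"left": [0.5], "right": [0.3, 0.9]}
```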
The distance acquisition methods preset in the terminal include: dual-camera ranging, phase ranging, and structured-light ranging. The basic principle of dual-camera ranging is that, for the two images of a target point obtained by the two cameras, the difference between the imaging coordinates in the same coordinate system is inversely proportional to the distance from the target point to the imaging plane, from which the distance between the target point and the cameras can be obtained; it should be noted that when the terminal performs dual-camera ranging, the dual cameras are either both rear-facing or both front-facing. The basic principle of phase ranging is that the terminal sends a modulated light wave and receives the modulated light wave reflected by the object to be captured; by measuring the phase delay caused by the round trip of the modulated light wave, the time the modulated light wave spends on the round trip through the air is obtained, from which the required distance can be calculated. Structured-light ranging can work as follows: the terminal sends multiple laser beams and then receives the beams reflected by the object to be captured, recording for each beam the time needed from emission to reception; the corresponding distance can then be obtained from the relation between the speed of light and that time.
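The dual-camera principle above is the standard stereo-triangulation relation Z = f·B/d (focal length f, camera baseline B, disparity d): disparity is inversely proportional to depth. A minimal sketch, with illustrative numbers not taken from the patent:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d.

    The imaging-coordinate difference (disparity, in pixels) between the two
    cameras is inversely proportional to the target point's distance.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A point with larger disparity is closer than one with smaller disparity.
near = stereo_depth(focal_px=1400, baseline_m=0.012, disparity_px=70)  # ~0.24 m
far = stereo_depth(focal_px=1400, baseline_m=0.012, disparity_px=20)   # ~0.84 m
```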
Step 303, determining a second distance based on the first distances.

Specifically, step 303, determining a second distance based on the first distances, can be realized by the focusing device. The second distance can be a first distance that meets a certain condition. For example, the values of the obtained first distances can be compared, and the first distance with the smallest value taken as the second distance.
Step 304, determining the first focusing area corresponding to the object to be captured whose distance is the second distance as the second focusing area.

Specifically, step 304, determining the first focusing area corresponding to the object to be captured at the second distance as the second focusing area, can be realized by the focusing device. Since the first distances are obtained per first focusing area, there is a one-to-one correspondence between the first distance obtained for each object to be captured and each first focusing area; therefore, once the distance between an object to be captured and the image acquisition device is determined to be the second distance, the corresponding first focusing area can be determined. It should be noted that the first distances can contain multiple identical minimum values, so the finally determined second focusing area can include at least one first focusing area.
After the second focusing area is determined, focusing processing is performed on the object to be captured corresponding to the second focusing area, and then all objects to be captured are collected to obtain an image in which focusing processing has been performed on the object corresponding to the second focusing area. In the obtained image, apart from the image of the object corresponding to the second focusing area, which is clear, the images of the other objects to be captured can be either clear or blurred. The focusing modes the terminal uses can include: laser focusing, phase-detection focusing, and contrast focusing.
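Contrast focusing, one of the modes named above, is commonly implemented as a hill-climb over lens positions; the patent does not fix the algorithm, so the sketch below is one plausible version, with `sharpness_at` standing in for the camera's contrast measurement:

```python
def contrast_autofocus(lens_positions, sharpness_at):
    """Sweep lens positions and keep the one with maximum contrast.

    lens_positions: iterable of candidate focus-motor positions, in order.
    sharpness_at: callable position -> contrast score of the focusing area.
    """
    best_pos, best_score = None, float("-inf")
    for pos in lens_positions:
        score = sharpness_at(pos)
        if score > best_score:
            best_pos, best_score = pos, score
        elif score < best_score:
            # Contrast is falling off: the peak has been passed,
            # so stop the sweep early, as real hill-climbs do.
            break
    return best_pos

# Contrast rises and then falls around the in-focus position 3.
assert contrast_autofocus(range(6), [2, 5, 8, 9, 7, 4].__getitem__) == 3
```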
With the focusing method provided by embodiments of the invention, after at least two first focusing areas are obtained, the first distance between the object to be captured corresponding to each first focusing area and the image acquisition device is obtained; a second distance is then determined based on the first distances, and the first focusing area corresponding to the object to be captured at the second distance is determined as the second focusing area. In this way, the terminal can first determine multiple focusing areas and then, according to the relations between the distances from the objects to be captured to the image acquisition device, obtain from those areas the focusing area corresponding to the object that needs focusing processing. This solves the prior-art problem that a terminal cannot actively determine a focusing area according to the object to be captured and realize auto-focusing, enables the terminal to determine the focusing area automatically according to the object to be captured, perform auto-focusing, and obtain a clear image, and improves the quality of the captured pictures.
Based on the previous embodiment, embodiments of the invention provide a focusing method; referring to Fig. 4, the method includes the following steps:
Step 401, focus apparatus obtain characteristic parameter of each object to be captured in the view-finder of image acquisition device.
Wherein, characteristic parameter is for the definition for the object to be captured for representing to collect in view-finder.
Specifically, taking the characteristic parameter being contrast and the image acquisition device being a camera as an example: when the terminal receives an image capture instruction for close-up shooting, the terminal obtains the image information presented by all recognisable objects to be captured displayed in the view-finder within the acquisition range of the terminal camera, analyses the obtained image information, and obtains the contrast corresponding to each object to be captured. For example, if three objects to be captured can be recognised within the acquisition range of the current terminal camera, the obtained contrasts are D1, D2 and D3 respectively.
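The patent does not specify how the contrast values D1-D3 are computed. A minimal sketch, assuming the common choice of Laplacian-response variance as the contrast/definition score (all function and variable names here are illustrative, not the patent's API):

```python
import numpy as np

def contrast_score(region: np.ndarray) -> float:
    # Variance of a 4-neighbour Laplacian response: higher variance
    # means stronger edges, i.e. a sharper (clearer) region.
    lap = (-4.0 * region[1:-1, 1:-1]
           + region[:-2, 1:-1] + region[2:, 1:-1]
           + region[1:-1, :-2] + region[1:-1, 2:])
    return float(lap.var())

# Three stand-in object regions: a sharp checkerboard, a single soft
# edge and a flat patch, playing the roles of D1, D2, D3.
sharp = (np.indices((8, 8)).sum(axis=0) % 2) * 255.0
soft = np.tile([0.0, 0.0, 0.0, 64.0, 192.0, 255.0, 255.0, 255.0], (8, 1))
flat = np.full((8, 8), 128.0)

d1, d2, d3 = (contrast_score(r) for r in (sharp, soft, flat))
assert d1 > d2 > d3  # the sharpest region gets the highest contrast
```

Under this particular score, larger values indicate a clearer image, which corresponds to the "larger is clearer" case discussed below.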
Step 402: the focus apparatus determines a third focusing area based on the characteristic parameters.
Specifically, the terminal analyses all the obtained characteristic parameters and determines the region corresponding to a characteristic parameter that meets a certain condition as the third focusing area.
Wherein, in step 402, determining the third focusing area based on the characteristic parameters may be realised through the following steps:
Step 402a: the focus apparatus determines, based on the relation between the characteristic parameters, the region whose characteristic parameter meets a preset condition as the first area.
Specifically, the preset condition may be a condition under which the characteristic parameters are sorted and one or more of the highest-ranked (or lowest-ranked) parameters are selected according to the sorting result; it may also be a condition under which a subsequent operation is performed once a characteristic parameter meets a predetermined threshold. For example, the terminal may compare the magnitudes of the characteristic parameters and select, as the first area, the region in the view-finder where the object to be captured corresponding to the value representing the highest image definition is located. Wherein, if a larger characteristic parameter indicates a clearer image, the region in the view-finder where the object to be captured corresponding to the largest of the obtained characteristic parameters is located, i.e. the region occupied by the outline of that object in the view-finder, is the first area; if a smaller characteristic parameter indicates a clearer image, the region in the view-finder where the object to be captured corresponding to the smallest of the obtained characteristic parameters is located is the first area.
For example, if among the three contrasts D1, D2 and D3 the value D3 is the smallest, the region in the view-finder where the object to be captured corresponding to contrast D3 is located can be determined as the first area.
Alternatively, the terminal may compare each obtained characteristic parameter with a predetermined threshold; if a characteristic parameter meets the predetermined threshold, the region in the view-finder where the object to be captured corresponding to that characteristic parameter is located is determined as the first area.
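Both variants of the preset condition in step 402a (ranking and thresholding) can be sketched as follows; this is an illustrative helper, not the patent's implementation, and the region identifiers are hypothetical:

```python
def pick_first_area(scores, threshold=None, larger_is_clearer=True):
    # Ranking condition: keep the single best-ranked region.
    # Threshold condition: keep every region whose parameter passes it.
    if threshold is not None:
        return [rid for rid, s in scores.items()
                if (s >= threshold if larger_is_clearer else s <= threshold)]
    best = max(scores.items(), key=lambda kv: kv[1]) if larger_is_clearer \
        else min(scores.items(), key=lambda kv: kv[1])
    return [best[0]]

# "Smaller is clearer" case from the text: D3 is the smallest contrast,
# so the region of its object becomes the first area.
contrasts = {"D1": 9.0, "D2": 7.0, "D3": 3.0}
assert pick_first_area(contrasts, larger_is_clearer=False) == ["D3"]
```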
Step 402b: the focus apparatus adjusts the first area according to a preset adjustment rule to obtain the third focusing area.
Specifically, the preset adjustment rule may be stored in the terminal in advance and may indicate by what factor the determined first area is to be enlarged, or what proportion of the view-finder it should occupy; it may also be a rule under which the first area is adjusted by a factor selected or input by the user. Wherein, during adjustment, how to adjust can be determined according to the region of the view-finder in which the first area is distributed.
For example, when producing the terminal, the terminal manufacturer may set a preset coordinate system based on the display screen of the terminal. The preset coordinate system may be determined by two intersecting edges of the display screen: the intersection point of the two edges may be set as the origin of the preset coordinate system, one of the two edges as the abscissa (x-axis) and the other as the ordinate (y-axis). If the length of the terminal's view-finder in the y-axis direction is h, then: when the first area lies within the range 0~1/3h, the adjustment may keep the edge of the first area close to y=0 fixed in the y-axis direction and resize the first area by moving the left and right edges and the edge away from y=0, yielding the third focusing area; when the first area lies within the range 1/3h~2/3h, the centre of the first area may be kept fixed and the first area expanded outward in all directions, yielding the third focusing area; when the first area lies within the range 2/3h~h, the edge close to y=h may be kept fixed in the y-axis direction and the first area resized by moving the left and right edges and the edge close to y=0, yielding the third focusing area.
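The three position-dependent anchoring rules above can be sketched as one function. This is a minimal illustration assuming a rectangle `(x, y, w, h)` with y growing downward from y=0 and a uniform scale factor; all names are hypothetical:

```python
def adjust_first_area(x, y, w, h, frame_h, scale=2.0):
    # Which vertical third of the view-finder the area's centre falls in
    # decides which edge stays anchored while the area is enlarged.
    cy = y + h / 2
    nw, nh = w * scale, h * scale
    if cy < frame_h / 3:          # top third: y=0-side edge stays put
        ny = y
    elif cy < 2 * frame_h / 3:    # middle third: centre stays put
        ny = cy - nh / 2
    else:                         # bottom third: y=frame_h-side edge stays put
        ny = (y + h) - nh
    nx = (x + w / 2) - nw / 2     # left/right edges grow symmetrically
    return nx, ny, nw, nh

# Middle-third example: the vertical centre (45) is unchanged by scaling.
assert adjust_first_area(10, 40, 20, 10, frame_h=90) == (0, 35, 40, 20)
```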
Step 403: the focus apparatus obtains the first focusing areas based on the third focusing area.
Specifically, the terminal performs certain processing on the third focusing area; for example, the third focusing area may be divided according to a certain rule to obtain the first focusing areas.
Wherein, in step 403, obtaining the first focusing areas based on the third focusing area may specifically be realised through the following step:
The third focusing area is divided according to a preset division rule to obtain the first focusing areas.
Specifically, the preset division rule may be a rule describing how the third area is divided, for example a rule dividing the third focusing area into M×M or M×N equal parts, where M and N are positive integers. In an embodiment of the invention, the third focusing area may be divided into 5×5 equal parts, yielding 25 first focusing areas.
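The M×N equal-part division can be sketched as a simple grid split; the rectangle convention `(x, y, w, h)` is an assumption of this sketch:

```python
def divide(region, m, n):
    # Split the third focusing area (x, y, w, h) into an m-by-n grid,
    # returning the first focusing areas row by row.
    x, y, w, h = region
    return [(x + j * w / n, y + i * h / m, w / n, h / m)
            for i in range(m) for j in range(n)]

cells = divide((0, 0, 100, 100), 5, 5)
assert len(cells) == 25  # the 5x5 split of the embodiment
```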
Step 404: the focus apparatus obtains the first distance between the object to be captured corresponding to each first focusing area and the image acquisition device.
Specifically, the terminal can determine the first distance between the object to be captured corresponding to each first focusing area and the camera by means of dual-camera ranging; for example, the first distances between the objects to be captured corresponding to the 25 obtained first focusing areas and the camera may be d1, d2, …, d25 respectively.
Step 405: the focus apparatus compares the magnitudes of the first distances.
Specifically, the terminal can compare the magnitudes of the 25 first distances d1, d2, …, d25 to obtain the magnitude relation among them.
Step 406: based on the magnitude relation among the first distances, the focus apparatus determines the smallest first distance as the second distance.
Specifically, the second distance is the distance between the camera and the nearest object to be captured that the camera can recognise. For example, if d11 is the smallest value, the second distance is d11.
Step 407: the focus apparatus determines the first focusing area corresponding to the object to be captured whose distance is the second distance as the second focusing area.
Specifically, the terminal determines the first focusing area in the view-finder where the object to be captured corresponding to the second distance is located as the second focusing area. It should be noted that at least one first distance may take the minimum value, and the corresponding object(s) to be captured may occupy at least one first focusing area; that is, the second focusing area includes at least one first focusing area. For example, the first focusing area in the view-finder corresponding to the object to be captured whose distance from the terminal camera is d11 is determined as the second focusing area.
Step 408: the focus apparatus determines the object to be captured corresponding to the second focusing area as the target focusing object.
Specifically, the terminal can determine the second focusing area as the focusing area, and can then determine the object to be captured corresponding to the second focusing area as the target focusing object.
Step 409: the focus apparatus performs focusing processing on the target focusing object, and acquires the objects to be captured through the image acquisition device to obtain a target image.
Wherein the objects to be captured include the target focusing object after the focusing processing.
Specifically, the terminal performs focusing processing on the determined target focusing object, so that a clear image of the target focusing object can be obtained. After the focusing processing, the terminal acquires the objects to be captured within the acquisition range of the camera to obtain the target image, wherein the target focusing object lies within the acquisition range of the camera.
In an application scenario of the focus apparatus according to an embodiment of the invention, take as an example a terminal that is a mobile phone whose camera is a horizontally arranged rear dual camera. The mobile phone determines the definition of each object to be captured by calculating the contrast of the image corresponding to each object to be captured in the view-finder. As shown in FIG. 5, if the terminal determines that the contrast of the object to be captured corresponding to region A in the view-finder is the largest, region A can be determined as the first area. If the preset adjustment rule is to adjust the first area according to a user input instruction, the user may input a control instruction "enlarge the first area to 50% of the view-finder area" by touching the display screen of the terminal; in response to the control instruction, the terminal enlarges the first area to occupy 50% of the view-finder area, obtaining the third area B shown in FIG. 6. If the preset division rule is to divide the third area into 2×2 equal parts, the terminal can divide the third area into 4 equal parts in a 2×2 pattern, obtaining 4 first focusing areas. The distances between the objects to be captured corresponding to these 4 first focusing areas and the mobile phone camera are then obtained through the rear dual camera, and the terminal determines which first focusing area in region B corresponds to the object to be captured nearest to the mobile phone. If it is determined that the object to be captured corresponding to region C shown in FIG. 7 is the nearest, region C is the second focusing area. Focusing processing is performed on the object to be captured in region C, and image acquisition is performed on the objects to be captured within the acquisition range of the camera, so that a clear image of the object to be captured corresponding to region C, together with the images of the other objects to be captured within the acquisition range, is obtained.
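The FIG. 5-7 scenario reduces to choosing, among the four first focusing areas, the cell whose object is nearest. A minimal sketch (coordinates and distances are illustrative, not taken from the figures):

```python
def second_focusing_area(cells, cell_distances):
    # The cell whose object is nearest to the phone (region C in FIG. 7)
    # becomes the second focusing area.
    nearest = min(range(len(cells)), key=lambda i: cell_distances[i])
    return cells[nearest]

# Third area B split 2x2 into four first focusing areas; here the
# top-right cell's object happens to be nearest, so it is chosen.
cells = [(0, 0, 50, 50), (50, 0, 50, 50), (0, 50, 50, 50), (50, 50, 50, 50)]
assert second_focusing_area(cells, [1.4, 0.6, 2.0, 1.1]) == (50, 0, 50, 50)
```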
It should be noted that, for explanations of steps or concepts that this embodiment shares with other embodiments, reference may be made to the descriptions in those other embodiments, which are not repeated here.
In the focusing method provided by the embodiments of the invention, after at least two first focusing areas are obtained, the first distance between the object to be captured corresponding to each first focusing area and the image acquisition device is obtained; the second distance is then determined based on the first distances, and the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area. In this way, the terminal can first determine multiple focusing areas and then, according to the relation between the distances of the objects to be captured from the image acquisition device, obtain from the multiple focusing areas the focusing area corresponding to the object to be captured that needs focusing processing. This solves the prior-art problem that a terminal cannot actively determine a focusing area according to the object to be captured so as to realise auto-focusing, enables the terminal to actively determine the focusing area according to the object to be captured, perform auto-focusing and obtain a clear image, and improves the quality of the captured image.
Based on the foregoing embodiments, an embodiment of the invention provides a focus apparatus 5, which can be applied in the focusing method provided by the embodiments corresponding to FIGS. 3-4. With reference to FIG. 8, the focus apparatus may include: a processor 51, a memory 52, a communication bus 53 and an image acquisition device 54, wherein:
the communication bus 53 is used to realise the connection communication between the processor 51 and the memory 52;
the processor 51 is used to execute the focusing program stored in the memory 52, so as to implement the following steps:
obtaining at least two first focusing areas;
obtaining a first distance between the object to be captured corresponding to each first focusing area and the image acquisition device 54;
determining a second distance based on the first distances;
determining the first focusing area corresponding to the object to be captured whose distance is the second distance as a second focusing area.
Specifically, in other embodiments of the invention, the processor 51 is further configured to execute the focusing program to implement the following steps:
obtaining a characteristic parameter of each object to be captured in the view-finder of the image acquisition device, wherein the characteristic parameter is used to represent the definition of the object to be captured as collected in the view-finder;
determining a third focusing area based on the characteristic parameters;
obtaining the first focusing areas based on the third focusing area.
Specifically, in other embodiments of the invention, the processor 51 is further configured to execute the focusing program to implement the following steps:
determining, based on the relation between the characteristic parameters, the region whose characteristic parameter meets a preset condition as a first area;
adjusting the first area according to a preset adjustment rule to obtain the third focusing area.
Specifically, in other embodiments of the invention, the processor 51 is further configured to execute the focusing program to implement the following step:
dividing the third focusing area according to a preset division rule to obtain the first focusing areas.
Specifically, in other embodiments of the invention, the processor 51 is further configured to execute the focusing program to implement the following steps:
comparing the magnitudes of the first distances;
determining, based on the magnitude relation among the first distances, the smallest first distance as the second distance.
Specifically, in other embodiments of the invention, after the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area, the processor 51 is further configured to execute the focusing program to implement the following steps:
determining the object to be captured corresponding to the second focusing area as a target focusing object;
performing focusing processing on the target focusing object, and acquiring the objects to be captured through the image acquisition device to obtain a target image;
wherein the objects to be captured include the target focusing object after the focusing processing.
It should be noted that, for the interaction between the steps implemented by the processor in this embodiment, reference may be made to the interaction in the focusing method provided by the embodiments corresponding to FIGS. 3-4, which is not repeated here.
With the focus apparatus provided by the embodiments of the invention, after at least two first focusing areas are obtained, the first distance between the object to be captured corresponding to each first focusing area and the image acquisition device is obtained; the second distance is then determined based on the first distances, and the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area. In this way, the terminal can first determine multiple focusing areas and then, according to the relation between the distances of the objects to be captured from the image acquisition device, obtain from the multiple focusing areas the focusing area corresponding to the object to be captured that needs focusing processing. This solves the prior-art problem that a terminal cannot actively determine a focusing area according to the object to be captured so as to realise auto-focusing, enables the terminal to actively determine the focusing area according to the object to be captured, perform auto-focusing and obtain a clear image, and improves the quality of the captured image.
Based on the foregoing embodiments, an embodiment of the invention provides a computer-readable storage medium storing one or more focusing programs, which can be executed by one or more processors to implement the following steps:
obtaining at least two first focusing areas;
obtaining a first distance between the object to be captured corresponding to each first focusing area and an image acquisition device;
determining a second distance based on the first distances;
determining the first focusing area corresponding to the object to be captured whose distance is the second distance as a second focusing area.
Specifically, in other embodiments of the invention, obtaining at least two first focusing areas comprises the following steps:
obtaining a characteristic parameter of each object to be captured in the view-finder of the image acquisition device, wherein the characteristic parameter is used to represent the definition of the object to be captured as collected in the view-finder;
determining a third focusing area based on the characteristic parameters;
obtaining the first focusing areas based on the third focusing area.
Specifically, in other embodiments of the invention, determining a third focusing area based on the characteristic parameters comprises the following steps:
determining, based on the relation between the characteristic parameters, the region whose characteristic parameter meets a preset condition as a first area;
adjusting the first area according to a preset adjustment rule to obtain the third focusing area.
Specifically, in other embodiments of the invention, obtaining the first focusing areas based on the third focusing area comprises the following step:
dividing the third focusing area according to a preset division rule to obtain the first focusing areas.
Specifically, in other embodiments of the invention, determining a second distance based on the first distances comprises the following steps:
comparing the magnitudes of the first distances;
determining, based on the magnitude relation among the first distances, the smallest first distance as the second distance.
Specifically, in other embodiments of the invention, after the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area, the following steps are further included:
determining the object to be captured corresponding to the second focusing area as a target focusing object;
performing focusing processing on the target focusing object, and acquiring the objects to be captured through the image acquisition device to obtain a target image;
wherein the objects to be captured include the target focusing object after the focusing processing.
It should be noted that, for the interaction between the steps implemented by the processor in this embodiment, reference may be made to the interaction in the focusing method provided by the embodiments corresponding to FIGS. 3-4, which is not repeated here.
With the computer-readable storage medium provided by the embodiments of the invention, after at least two first focusing areas are obtained, the first distance between the object to be captured corresponding to each first focusing area and the image acquisition device is obtained; the second distance is then determined based on the first distances, and the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area. In this way, the terminal can first determine multiple focusing areas and then, according to the relation between the distances of the objects to be captured from the image acquisition device, obtain from the multiple focusing areas the focusing area corresponding to the object to be captured that needs focusing processing. This solves the prior-art problem that a terminal cannot actively determine a focusing area according to the object to be captured so as to realise auto-focusing, enables the terminal to actively determine the focusing area according to the object to be captured, perform auto-focusing and obtain a clear image, and improves the quality of the captured image.
It should be noted that, in this document, the terms "comprise", "include" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element qualified by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes that element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on such an understanding, the part of the technical solution of the present invention that in essence contributes to the prior art can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to perform the methods described in the embodiments of the present invention.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system) and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realising the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of guiding a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce a manufactured article including an instruction device, which realises the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for realising the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the invention. Any equivalent structural or flow transformation made using the contents of the specification and drawings of the present invention, whether used directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.
Claims (10)
1. A focusing method, characterised in that the method comprises:
obtaining at least two first focusing areas;
obtaining a first distance between an object to be captured corresponding to each first focusing area and an image acquisition device;
determining a second distance based on the first distances;
determining the first focusing area corresponding to the object to be captured whose distance is the second distance as a second focusing area.
2. The method according to claim 1, characterised in that the step of obtaining at least two first focusing areas comprises:
obtaining a characteristic parameter of each object to be captured in a view-finder of the image acquisition device, wherein the characteristic parameter is used to represent the definition of the object to be captured as collected in the view-finder;
determining a third focusing area based on the characteristic parameters;
obtaining the first focusing areas based on the third focusing area.
3. The method according to claim 2, characterised in that the step of determining a third focusing area based on the characteristic parameters comprises:
determining, based on the relation between the characteristic parameters, the region whose characteristic parameter meets a preset condition as a first area;
adjusting the first area according to a preset adjustment rule to obtain the third focusing area.
4. The method according to claim 2, characterised in that the step of obtaining the first focusing areas based on the third focusing area comprises:
dividing the third focusing area according to a preset division rule to obtain the first focusing areas.
5. The method according to claim 1, characterised in that determining a second distance based on the first distances comprises:
comparing the magnitudes of the first distances;
determining, based on the magnitude relation among the first distances, the smallest first distance as the second distance.
6. The method according to claim 1, characterised in that, after the first focusing area corresponding to the object to be captured whose distance is the second distance is determined as the second focusing area, the method further comprises:
determining the object to be captured corresponding to the second focusing area as a target focusing object;
performing focusing processing on the target focusing object, and acquiring the objects to be captured through the image acquisition device to obtain a target image, wherein the objects to be captured include the target focusing object after the focusing processing.
7. A focus apparatus, characterised in that the apparatus comprises: a processor, a memory, a communication bus and an image acquisition device, wherein:
the communication bus is used to realise the connection communication between the processor and the memory;
the processor is used to execute the focusing program stored in the memory, so as to implement the following steps:
obtaining at least two first focusing areas;
obtaining a first distance between an object to be captured corresponding to each first focusing area and the image acquisition device;
determining a second distance based on the first distances;
determining the first focusing area corresponding to the object to be captured whose distance is the second distance as a second focusing area.
8. The apparatus according to claim 7, characterised in that the processor is further configured to execute the focusing program to implement the following steps:
obtaining a characteristic parameter of each object to be captured in a view-finder of the image acquisition device, wherein the characteristic parameter is used to represent the definition of the object to be captured as collected in the view-finder;
determining a third focusing area based on the characteristic parameters;
obtaining the first focusing areas based on the third focusing area.
9. The apparatus according to claim 7, characterised in that the processor is further configured to execute the focusing program to implement the following step:
determining, based on the magnitude relation among the first distances, the smallest first distance as the second distance.
10. A computer-readable storage medium, characterised in that a focusing program is stored on the computer-readable storage medium, and when the focusing program is executed by a processor, the steps of the focusing method according to any one of claims 1 to 6 are realised.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710752706.7A CN107580181A (en) | 2017-08-28 | 2017-08-28 | A kind of focusing method, equipment and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107580181A true CN107580181A (en) | 2018-01-12 |
Family
ID=61029705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710752706.7A Pending CN107580181A (en) | 2017-08-28 | 2017-08-28 | A kind of focusing method, equipment and computer-readable recording medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107580181A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103997598A (en) * | 2013-02-14 | 2014-08-20 | 三星电子株式会社 | Method of tracking object using camera and camera system for object tracking |
CN104243800A (en) * | 2013-06-14 | 2014-12-24 | 索尼公司 | Control device and storage medium |
CN106534619A (en) * | 2016-11-29 | 2017-03-22 | 努比亚技术有限公司 | Method and apparatus for adjusting focusing area, and terminal |
JP2017107132A (en) * | 2015-12-11 | 2017-06-15 | 株式会社ニコン | Electronic device |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103997598A (en) * | 2013-02-14 | 2014-08-20 | Samsung Electronics Co., Ltd. | Method of tracking object using camera and camera system for object tracking |
CN104243800A (en) * | 2013-06-14 | 2014-12-24 | Sony Corp. | Control device and storage medium |
JP2017107132A (en) * | 2015-12-11 | 2017-06-15 | Nikon Corp. | Electronic device |
CN106534619A (en) * | 2016-11-29 | 2017-03-22 | Nubia Technology Co., Ltd. | Method and apparatus for adjusting focusing area, and terminal |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020024196A1 (en) * | 2018-08-01 | 2020-02-06 | SZ DJI Technology Co., Ltd. | Method for adjusting parameters of photographing device, control device and photographing system |
CN110771147A (en) * | 2018-08-01 | 2020-02-07 | SZ DJI Technology Co., Ltd. | Method for adjusting parameters of shooting device, control equipment and shooting system |
WO2021134311A1 (en) * | 2019-12-30 | 2021-07-08 | Suzhou Zhendi Intelligent Technology Co., Ltd. | Method and apparatus for switching object to be photographed, and image processing method and apparatus |
CN114930798A (en) * | 2019-12-30 | 2022-08-19 | Suzhou Zhendi Intelligent Technology Co., Ltd. | Shooting object switching method and device, and image processing method and device |
CN112672045A (en) * | 2020-12-17 | 2021-04-16 | Xi'an Wingtech Electronic Technology Co., Ltd. | Shooting mode setting method and device and electronic equipment |
CN113141468A (en) * | 2021-05-24 | 2021-07-20 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Focusing method and device and electronic equipment |
CN113141468B (en) * | 2021-05-24 | 2022-08-19 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Focusing method and device and electronic equipment |
CN114125300A (en) * | 2021-11-29 | 2022-03-01 | Vivo Mobile Communication Co., Ltd. | Photographing method, photographing apparatus, electronic device, and readable storage medium |
CN114125300B (en) * | 2021-11-29 | 2023-11-21 | Vivo Mobile Communication Co., Ltd. | Shooting method, shooting device, electronic equipment and readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107592451A (en) | A kind of multi-mode auxiliary photo-taking method, apparatus and computer-readable recording medium | |
CN107580181A (en) | A kind of focusing method, equipment and computer-readable recording medium | |
CN107820014A (en) | A kind of image pickup method, mobile terminal and computer-readable storage medium | |
CN107896303A (en) | A kind of image-pickup method, system and equipment and computer-readable recording medium | |
CN107566635A (en) | Screen intensity method to set up, mobile terminal and computer-readable recording medium | |
CN108063901A (en) | A kind of image-pickup method, terminal and computer readable storage medium | |
CN107317963A (en) | A kind of double-camera mobile terminal control method, mobile terminal and storage medium | |
CN107682627A (en) | A kind of acquisition parameters method to set up, mobile terminal and computer-readable recording medium | |
CN107517405A (en) | The method, apparatus and computer-readable recording medium of a kind of Video processing | |
CN107948530A (en) | A kind of image processing method, terminal and computer-readable recording medium | |
CN107329682A (en) | Edge exchange method and mobile terminal | |
CN107493426A (en) | A kind of information collecting method, equipment and computer-readable recording medium | |
CN107959795A (en) | A kind of information collecting method, equipment and computer-readable recording medium | |
CN107835369A (en) | Camera control method, mobile terminal and computer-readable recording medium | |
CN107124552A (en) | A kind of image pickup method, terminal and computer-readable recording medium | |
CN107613208A (en) | Adjusting method and terminal, the computer-readable storage medium of a kind of focusing area | |
CN110086993A (en) | Image processing method, device, mobile terminal and computer readable storage medium | |
CN107239205A (en) | A kind of photographic method, mobile terminal and storage medium | |
CN107979727A (en) | A kind of document image processing method, mobile terminal and computer-readable storage medium | |
CN107749947A (en) | Photographic method, mobile terminal and computer-readable recording medium | |
CN107682630A (en) | Dual camera anti-fluttering method, mobile terminal and computer-readable recording medium | |
CN107509168A (en) | WiFi hot spot scannings method and mobile terminal | |
CN107483804A (en) | A kind of image pickup method, mobile terminal and computer-readable recording medium | |
CN107102803A (en) | A kind of image display method, equipment and computer-readable recording medium | |
CN107566735A (en) | A kind of dual camera focusing method, mobile terminal and computer-readable recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2018-01-12