CN106713718A - Dual camera-based focusing method and mobile terminal - Google Patents
Info
- Publication number
- CN106713718A (application CN201710109441.9A)
- Authority
- CN
- China
- Prior art keywords
- camera
- auxiliary
- focus point
- depth
- field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Telephone Function (AREA)
Abstract
The invention discloses a dual camera-based focusing method and a mobile terminal. The method comprises the steps of: determining a focal length range according to the depth of field; setting the initial point of the primary position of the focus point of a primary camera at a position x within the focal length range; setting the initial point of the auxiliary position of an auxiliary camera at position x+y; judging whether the definition of the image obtained by the auxiliary camera with its focus point at the auxiliary position is greater than that of the image obtained by the primary camera with its focus point at the primary position; if so, increasing both the primary position and the auxiliary position by 2y; judging again whether the definition of the auxiliary camera's image is greater than that of the primary camera's image; if so, returning to the previous step; and if not, setting the focus point corresponding to the primary camera as the in-focus point. The dual camera-based focusing method improves the focusing speed, achieves a better blurring (bokeh) effect in large-aperture shooting, and improves the user experience.
Description
Technical field
The present invention relates to the field of terminal technology, and more particularly to a dual camera-based focusing method and a mobile terminal.
Background art
With the development of electronic equipment, terminals with a camera function have become popular in people's daily lives. Terminals with increasingly rich functions greatly facilitate people's lives. In recent years, image processing technology has developed rapidly, the camera function of terminals has grown ever stronger, and, as terminals are easy to carry, more and more users prefer to take pictures with them.
To improve the picture-taking quality, more and more terminals use dual cameras. The focusing methods currently used when shooting with dual cameras fall into several mainstream modes: contrast focusing, phase detection and laser assistance. Among these, contrast focusing is the most traditional and most commonly used focusing mode; its shortcoming is that it requires many focusing steps and the processor must compute a large amount of data, so that the focusing time is relatively long.
Summary of the invention
It is a primary object of the present invention to propose a dual camera-based focusing method and a mobile terminal, intended to shorten the focusing time and quickly find the in-focus point.
To achieve the above object, the present invention proposes a dual camera-based focusing method, characterised in that the method includes:
the primary camera detects the depth of field of the current environment and determines the focal length range from it;
the initial point of the primary position of the focus point of the primary camera is set at x, where x lies within the focal length range; at the same moment, the initial point of the auxiliary position of the focus point of the auxiliary camera is set at x+y;
it is judged whether the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the definition of the image obtained by the primary camera when its focus point is at the primary position; if so,
the primary position of the focus point of the primary camera is increased by 2y and, at the same moment, the auxiliary position of the focus point of the auxiliary camera is increased by 2y;
it is judged whether the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the definition of the image obtained by the primary camera when its focus point is at the primary position; if so, the method returns to the previous step; if not,
the focus point corresponding to the primary camera is set as the in-focus point.
In a possible design, the focal length range is set, according to the depth of field of the current environment, to a macro range, a middle-distance range or a long-distance range.
In a possible design, x is located at one third or one half of the focal length range.
In a possible design, the specific value of y is set differently according to different focal length ranges.
In a possible design, before the initial point of the auxiliary position of the focus point of the auxiliary camera is set at x+y, the method further includes:
calculating the primary depth of field according to the depth of field of the current environment detected by the primary camera;
calculating the auxiliary depth of field according to the depth of field of the current environment detected by the auxiliary camera; and
calculating y according to the primary depth of field and the auxiliary depth of field.
Additionally, to achieve the above object, the present invention also provides a mobile terminal having a primary camera and an auxiliary camera, characterised by including:
an environment determining unit, configured to determine the focal length range according to the depth of field of the current environment detected by the primary camera;
an initial focus point determining unit, configured to set the initial point of the primary position of the focus point of the primary camera at x, where x lies within the focal length range, and, at the same moment, set the initial point of the auxiliary position of the focus point of the auxiliary camera at x+y;
a judging unit, configured to judge whether the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the definition of the image obtained by the primary camera when its focus point is at the primary position;
a focus point adjustment unit, configured to, if the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the definition of the image obtained by the primary camera when its focus point is at the primary position, increase the primary position of the focus point of the primary camera by 2y and, at the same moment, increase the auxiliary position of the focus point of the auxiliary camera by 2y; and
a focusing determining unit, configured to, if the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is less than the definition of the image obtained by the primary camera when its focus point is at the primary position, set the focus point corresponding to the primary camera as the in-focus point.
In a possible design, the focal length range is set, according to the depth of field of the current environment, to a macro range, a middle-distance range or a long-distance range.
In a possible design, x is located at one third or one half of the focal length range.
In a possible design, the specific value of y is set differently according to different focal length ranges.
In a possible design, the focus point adjustment unit is further configured to:
calculate the primary depth of field according to the depth of field of the current environment detected by the primary camera;
calculate the auxiliary depth of field according to the depth of field of the current environment detected by the auxiliary camera; and
calculate y according to the primary depth of field and the auxiliary depth of field.
The dual camera-based focusing method and mobile terminal proposed by the present invention not only improve the focusing speed but also achieve a better blurring (bokeh) effect in large-aperture shooting, while improving the user experience.
Brief description of the drawings
Fig. 1 is a hardware structure diagram of a mobile terminal for realizing each embodiment of the invention;
Fig. 2 is a schematic diagram of one embodiment of the dual camera-based focusing method in the embodiments of the present invention;
Fig. 3 is a schematic diagram of one embodiment of the method for calculating the increment y in the embodiments of the present invention;
Fig. 4 is a schematic structural diagram of one embodiment of the mobile terminal in the embodiments of the present invention;
The realization of the objects, functional characteristics and advantages of the invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit the present invention.
A mobile terminal of each embodiment of the invention will now be described with reference to the drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are used only to aid the description of the invention and have no specific meaning in themselves. Therefore, "module" and "part" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable media player) and a navigation device, and fixed terminals such as a digital TV and a desktop computer. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, except for elements used particularly for mobile purposes, a construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a hardware structure diagram of an optional intelligent terminal for realizing each embodiment of the invention.
The intelligent terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on.
Fig. 1 shows the mobile terminal 100 with various components, but it should be understood that not all of the illustrated components are required to be implemented. More or fewer components may alternatively be implemented. The elements of the mobile terminal 100 will be described in detail below.
The wireless communication unit 110 may generally include one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast related information, or a server that receives previously generated broadcast signals and/or broadcast related information and sends them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal and the like. Moreover, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast related information may also be provided via a mobile communication network, in which case the broadcast related information may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signal broadcasts by using various types of broadcast systems. In particular, the broadcast receiving module 111 may receive digital broadcasts by using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the data broadcasting system of forward link media (MediaFLO®), integrated services digital broadcasting-terrestrial (ISDB-T), and so on. The broadcast receiving module 111 may be constructed to be suitable for various broadcast systems providing broadcast signals as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless Internet access technology involved in the module may include WLAN (wireless LAN) (Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and the like.
The short range communication module 114 is a module for supporting short range communication. Some examples of short range communication technology include Bluetooth™, radio frequency identification (RFID), the Infrared Data Association (IrDA) standard, ultra wideband (UWB), ZigBee™ and the like.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal. A typical example of the location information module 115 is a GPS (global positioning system) module. According to the current technology, the GPS module calculates distance information from three or more satellites together with accurate time information and applies triangulation to the calculated information, so as to accurately calculate three-dimensional current location information in terms of longitude, latitude and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the errors of the calculated position and time information by using a further satellite. Additionally, the GPS module can calculate speed information by continuously calculating the current location information in real time.
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by the image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or sent via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal 100. The microphone 122 may receive sound (audio data) in operation modes such as a telephone call mode, a recording mode and a speech recognition mode, and can process such sound into audio data. In the case of the telephone call mode, the processed audio (voice) data may be converted into a form that can be sent to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 may implement various types of noise cancelling (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and sending audio signals.
The user input unit 130 may generate key input data according to orders input by the user, to control various operations of the mobile terminal 100. The user input unit 130 allows the user to input various types of information and may include a keyboard, a metal dome, a touch pad (for example, a sensitive component that detects changes in resistance, pressure, capacitance and the like caused by being touched), a scroll wheel, a rocker and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of contact (that is, touch input) by the user with the mobile terminal 100, the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates orders or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a sliding-type mobile phone, the sensing unit 140 can sense whether the sliding-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141.
The interface unit 170 serves as an interface through which at least one external device can be connected with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The identification module may store various information for verifying a user using the mobile terminal 100 and may include a user identification module (UIM), a subscriber identification module (SIM), a universal subscriber identification module (USIM) and so on. In addition, the device having the identification module (hereinafter referred to as the "identifying device") may take the form of a smart card; therefore, the identifying device may be connected with the mobile terminal 100 via a port or other attachment means. The interface unit 170 can be used to receive input (for example, data information, power and the like) from the external device and transfer the received input to one or more elements in the mobile terminal 100, or can be used to transfer data between the mobile terminal 100 and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which power is provided from the cradle to the mobile terminal 100, or can serve as a path through which various command signals input from the cradle are transferred to the mobile terminal 100. The various command signals or power input from the cradle can serve as signals for recognizing whether the mobile terminal 100 is accurately mounted on the cradle. The output unit 150 is configured to provide output signals in a visual, audio and/or tactile manner (for example, audio signals, video signals, alarm signals, vibration signals and the like). The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153 and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the telephone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to a call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in the video call mode or the image capture mode, the display unit 151 can display the captured image and/or the received image, a UI or GUI showing the video or image and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display and the like. Some of these displays may be constructed to be transparent to allow the user to watch from the outside; these may be called transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display or the like. According to a particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal 100 may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect the touch input pressure as well as the touch input position and the touch input area.
The audio output module 152 may, when the mobile terminal 100 is in modes such as a call signal reception mode, a call mode, a recording mode, a speech recognition mode and a broadcast reception mode, convert the audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a loudspeaker, a buzzer and so on.
The alarm unit 153 can provide output to notify the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input and so on. In addition to audio or video output, the alarm unit 153 can provide output in different manners to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 can provide a tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations performed by the controller 180 and the like, or may temporarily store data that has been output or is to be output (for example, a telephone directory, messages, still images, video and the like). Moreover, the memory 160 may store data on the vibration and audio signals of various modes output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, an SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disc and so on. Moreover, the mobile terminal 100 may cooperate, through a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180, or may be constructed separately from the controller 180. The controller 180 can perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power needed to operate each element and component.
The various implementations described herein can be implemented with a computer-readable medium using, for example, computer software, hardware or any combination thereof. For hardware implementation, the implementations described herein can be implemented by using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor and an electronic unit designed to perform the functions described herein; in some cases, such an implementation can be implemented in the controller 180. For software implementation, an implementation such as a process or a function can be implemented with a separate software module that allows at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any appropriate programming language, and the software code can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal 100 has been described according to its functions. In addition, the mobile terminal 100 in the embodiments of the present invention can be a folding-type, bar-type, swing-type, sliding-type or any other of various types of mobile terminals, which is not specifically limited herein.
When imaging with the dual cameras of a mobile terminal, an in-focus point must first be found. More specifically, when light parallel to the optical axis enters a convex lens, an ideal lens should converge all of the light to one point, after which the light spreads out in a cone; the point at which all the light converges is called the focus.
Before and after the focus, the light passes from convergence to diffusion, and the image of the point becomes blurred, forming an enlarged circle; this circle is called the circle of confusion.
In practice, the captured image is viewed in some way (for example, projected, or enlarged into a photograph). The image perceived by the human eye is closely related to the magnification ratio, the projection distance and the viewing distance; if the diameter of the circle of confusion is smaller than the resolving power of the human eye, the blur actually produced in the image is, within certain limits, unrecognizable. Such an unrecognizable circle of confusion is known as the permissible circle of confusion. There is one permissible circle of confusion before the focus and one after it, and the distance between these two circles is the depth of field. That is, before and after the subject (the focus point), there is a range within which the image is still sharp, and this range is the depth of field. In other words, within a certain depth in front of and behind the subject, the degree of blur presented on the film plane remains within the limit of the permissible circle of confusion.
The depth of field varies with the focal length, the f-number and the shooting distance of the lens. For a fixed focal length and shooting distance, the smaller the aperture, the larger the depth of field.
Taking the camera position as the reference, the distance from the focus to the nearer permissible circle of confusion is the front depth of field, and the distance from the focus to the farther permissible circle of confusion is the rear depth of field.
The depth of field can be calculated with the following equations:
Front depth of field ΔL1 = FδL²/(f² + FδL);
Rear depth of field ΔL2 = FδL²/(f² − FδL);
Depth of field ΔL = ΔL1 + ΔL2 = 2f²FδL²/(f⁴ − F²δ²L²)
where δ is the permissible circle of confusion diameter, f is the lens focal length, F is the shooting f-number of the lens, and L is the focusing distance.
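The equations above can be sketched as follows. This is a minimal illustration only; the function name, parameter names and example values are the editor's hypothetical choices (a circle-of-confusion diameter of 0.03 mm is a common default, not a value from this patent):

```python
def depth_of_field(f_mm, F, L_mm, delta_mm=0.03):
    """Front, rear and total depth of field per the patent's formulas.

    f_mm:     lens focal length f (mm)
    F:        shooting f-number of the lens
    L_mm:     focusing distance L (mm)
    delta_mm: permissible circle-of-confusion diameter delta (mm)
    """
    front = F * delta_mm * L_mm**2 / (f_mm**2 + F * delta_mm * L_mm)
    rear = F * delta_mm * L_mm**2 / (f_mm**2 - F * delta_mm * L_mm)
    # The sum simplifies to 2 f^2 F delta L^2 / (f^4 - F^2 delta^2 L^2)
    total = front + rear
    return front, rear, total

# Example: a 50 mm lens at f/2.8 focused at 3 m
front, rear, total = depth_of_field(50, 2.8, 3000)
```

For these inputs the front depth of field comes out around 0.27 m and the rear around 0.34 m, consistent with the rear depth of field being the larger of the two; varying the inputs reproduces the three qualitative factors listed below (aperture, focal length, shooting distance).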
It can be seen that the depth of field is related to the lens aperture, the lens focal length, the shooting distance and the requirement on image quality (expressed as the size of the permissible circle of confusion). The influence of these principal factors on the depth of field is as follows (assuming the other conditions do not change):
(1) Lens aperture: the larger the aperture, the smaller the depth of field; the smaller the aperture, the larger the depth of field;
(2) Lens focal length: the longer the focal length, the smaller the depth of field; the shorter the focal length, the larger the depth of field;
(3) Shooting distance: the farther the distance, the larger the depth of field; the nearer the distance, the smaller the depth of field.
Based on the above mobile terminal hardware configuration and imaging principles, the embodiments of the method of the invention are proposed.
As shown in Fig. 2, one embodiment of the dual camera-based focusing method in the embodiments of the present invention includes the steps of:
201. Start;
202. The primary camera detects the depth of field of the current environment and determines the focal length range from it;
The focal length range can further be determined in combination with the selected photographing mode; for example, if the selected photographing mode is "macro mode", the corresponding focal length range is the macro range;
203. The initial point of the primary position of the focus point of the primary camera is set at x, where x lies within the focal length range; at the same moment, the initial point of the auxiliary position of the focus point of the auxiliary camera is set at x+y;
That is, through the primary camera and the auxiliary camera, images at different focus points can be obtained at the same moment; here, y may be a negative value;
204. It is judged whether the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the definition of the image obtained by the primary camera when its focus point is at the primary position; if so, proceed to step 205; if not, proceed to step 207;
205. The primary position of the focus point of the primary camera is increased by 2y and, at the same moment, the auxiliary position of the focus point of the auxiliary camera is increased by 2y;
206. It is judged whether the definition of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the definition of the image obtained by the primary camera when its focus point is at the primary position; if so, return to step 205; if not, proceed to step 207;
207. The focus point corresponding to the primary camera is set as the in-focus point;
208. End.
It should be noted that, throughout the whole process, the main position of the focus point of the main camera and the auxiliary position of the focus point of the auxiliary camera always remain within the focal range.
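As an illustrative sketch (not part of the patent text), steps 202 to 207 amount to a hill-climb in which the auxiliary camera probes one step ahead of the main camera; the sharpness callable below stands in for a real contrast metric computed on camera frames:

```python
def dual_camera_focus(sharpness, x, y, lo, hi):
    """Steps 203-207: the main camera starts at x, the auxiliary camera
    at x + y; while the auxiliary (probing) image is sharper, both focus
    points advance by 2y.  lo and hi delimit the focal range, which both
    positions must stay inside throughout."""
    main, aux = x, x + y
    while (lo <= main + 2 * y <= hi and lo <= aux + 2 * y <= hi
           and sharpness(aux) > sharpness(main)):
        main += 2 * y          # step 205: shift the main focus point
        aux += 2 * y           # ... and the auxiliary one, by 2y each
    return main                # step 207: the main position is in focus

# Toy sharpness curve peaking at position 60:
best = dual_camera_focus(lambda p: -abs(p - 60), x=10, y=5, lo=0, hi=100)
```

With the toy curve above, the loop halts as soon as the auxiliary probe overshoots the peak, leaving the main camera at position 60.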
Optionally, on the basis of the embodiment corresponding to Fig. 2, in a second embodiment of the dual camera-based focusing method provided by the embodiments of the present invention, the focal range is set, according to the depth of field of the current environment, to a macro range, a middle range or a long range.
The initial point x of the main position of the focus point of the main camera may be the minimum value of the corresponding focal range, in which case y takes a positive value; or x may be the maximum value of the corresponding focal range, in which case y takes a negative value.
More specifically, the depth of field of the current environment detected by the auxiliary camera may be combined to determine whether x is the minimum or the maximum value of the corresponding focal range. Usually, the main depth of field is first calculated from the depth of field of the current environment detected by the main camera, and the auxiliary depth of field is calculated from the depth of field detected by the auxiliary camera; the two are then compared: if the main depth of field is greater than the auxiliary depth of field, x is set to the maximum value of the corresponding focal range; if the main depth of field is less than the auxiliary depth of field, x is set to the minimum value of the corresponding focal range.
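A minimal sketch of this selection rule (illustrative only; the case where the two depths of field are equal is resolved arbitrarily here, as the patent leaves it unspecified):

```python
def initial_point(main_dof, aux_dof, range_min, range_max):
    """Second embodiment: if the main depth of field exceeds the
    auxiliary one, start x at the range maximum and search with a
    negative step; otherwise start at the range minimum with a
    positive step.  Returns (x, sign of y)."""
    if main_dof > aux_dof:
        return range_max, -1   # x at the maximum, y negative
    return range_min, +1       # x at the minimum, y positive
```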
Optionally, on the basis of the embodiment corresponding to Fig. 2, in a third embodiment of the dual camera-based focusing method provided by the embodiments of the present invention, x is located at one third or one half of the focal range.
In this case, when increasing the main position of the focus point of the main camera by 2y and the auxiliary position of the focus point of the auxiliary camera by 2y, y may first take a positive value while the step of judging whether the sharpness of the image obtained by the auxiliary camera at the auxiliary position is greater than that of the image obtained by the main camera at the main position is performed; y then takes a negative value and the same judging step is performed again.
In this way, the in-focus position can be found more quickly.
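Illustratively (not part of the patent text), the direction choice of this embodiment can be sketched as a single probe in each direction, with the sharpness callable standing in for the comparison between the two cameras' images:

```python
def pick_direction(sharpness, x, y):
    """Third embodiment: with x at one third or one half of the focal
    range, try the positive step first; if the image at x + y is not
    sharper than the image at x, search in the negative direction."""
    if sharpness(x + y) > sharpness(x):
        return y               # positive direction improves sharpness
    return -y                  # otherwise search toward smaller values

# Sharpness peak left of the start point -> negative direction chosen:
direction = pick_direction(lambda p: -abs(p - 20), x=50, y=5)
```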
Optionally, on the basis of the embodiment corresponding to Fig. 2, in a fourth embodiment of the dual camera-based focusing method provided by the embodiments of the present invention, the specific value of y is set differently according to the focal range. For example, if the current focal range is the macro range, the corresponding value of y is y1; if the current focal range is the middle range, the corresponding value of y is y2; if the current focal range is the long range, the corresponding value of y is y3; here, y1 ≤ y2 ≤ y3.
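A sketch of the per-range step table (the range labels and numeric values below are illustrative assumptions; the patent only requires y1 ≤ y2 ≤ y3):

```python
# Fourth embodiment: the step y grows with the focal range.
STEP_BY_RANGE = {"macro": 1, "middle": 2, "long": 4}   # y1 <= y2 <= y3

def step_for(focal_range):
    """Return the step value y for the given focal range."""
    return STEP_BY_RANGE[focal_range]
```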
Optionally, on the basis of the embodiment corresponding to Fig. 2, in a fifth embodiment of the dual camera-based focusing method provided by the embodiments of the present invention, before the initial point of the auxiliary position of the focus point of the auxiliary camera is set at x + y, as shown in Fig. 3, the method further includes:
301. Start.
302. Calculate the main depth of field from the depth of field of the current environment detected by the main camera.
303. Calculate the auxiliary depth of field from the depth of field of the current environment detected by the auxiliary camera.
304. Calculate y from the main depth of field and the auxiliary depth of field.
In one embodiment of the present invention, y is the difference between the main depth of field and the auxiliary depth of field; in another embodiment of the present invention, the distance between the main camera and the auxiliary camera may also be combined to optimize the value of y.
305. End.
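The first variant of step 304 can be sketched directly (illustrative only; the baseline-based optimisation of the second variant is omitted because the patent does not specify it):

```python
def offset_y(main_dof, aux_dof):
    """Steps 302-304 of Fig. 3: take y as the difference between the
    main and auxiliary depth-of-field estimates; a negative result
    simply reverses the search direction."""
    return main_dof - aux_dof
```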
The dual camera-based focusing method in the embodiments of the present invention has been described above; the mobile terminal in the embodiments of the present invention is described below.
As shown in Fig. 4, one embodiment of the mobile terminal in the embodiments of the present invention includes an environment determining unit 10, an initial focus point determining unit 20, a judging unit 30, a focus point adjustment unit 40 and a focusing determining unit 50, wherein:
the environment determining unit 10 is configured to determine the focal range according to the depth of field of the current environment detected by the main camera; in a specific implementation, the focal range may be further determined in combination with the selected shooting mode; for example, if the selected shooting mode is "macro mode", the corresponding focal range is the macro range;
the initial focus point determining unit 20 is configured to set the initial point of the main position of the focus point of the main camera at x, where x lies within the focal range, and, at the same moment, to set the initial point of the auxiliary position of the focus point of the auxiliary camera at x + y; that is, through the main camera and the auxiliary camera, images with different focus points can be obtained at the same moment, and y may be a negative value;
the judging unit 30 is configured to judge whether the sharpness of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the sharpness of the image obtained by the main camera when its focus point is at the main position;
the focus point adjustment unit 40 is configured to, if the sharpness of the image obtained by the auxiliary camera at the auxiliary position is greater than that of the image obtained by the main camera at the main position, increase the main position of the focus point of the main camera by 2y and, at the same moment, increase the auxiliary position of the focus point of the auxiliary camera by 2y; it should be noted that, throughout the whole process, the main position of the focus point of the main camera and the auxiliary position of the focus point of the auxiliary camera always remain within the focal range;
the focusing determining unit 50 is configured to, if the sharpness of the image obtained by the auxiliary camera at the auxiliary position is less than that of the image obtained by the main camera at the main position, set the focus point corresponding to the main camera as the in-focus position.
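Purely as an illustration of the Fig. 4 decomposition (the unit names follow the patent; everything else, including the sharpness callables that stand in for the two camera pipelines, is an assumption), the judging, adjustment and determining units can be folded into one controller class:

```python
class FocusController:
    """Sketch of Fig. 4: the environment and initial focus point
    determination are assumed already done (lo, hi, x, y); the judging
    unit and focus point adjustment unit become the loop body, and the
    focusing determining unit is the returned position."""

    def __init__(self, main_sharpness, aux_sharpness, lo, hi):
        self.main_sharpness = main_sharpness   # main camera metric
        self.aux_sharpness = aux_sharpness     # auxiliary camera metric
        self.lo, self.hi = lo, hi              # focal range limits

    def focus(self, x, y):
        main, aux = x, x + y
        while (self.lo <= main + 2 * y <= self.hi
               and self.lo <= aux + 2 * y <= self.hi
               and self.aux_sharpness(aux) > self.main_sharpness(main)):
            main += 2 * y                      # adjustment unit: shift 2y
            aux += 2 * y
        return main                            # determining unit: in focus

curve = lambda p: -abs(p - 40)                 # toy sharpness, peak at 40
controller = FocusController(curve, curve, lo=0, hi=100)
```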
Optionally, on the basis of the embodiment corresponding to Fig. 4, in another embodiment of the mobile terminal provided by the embodiments of the present invention, the focal range is set, according to the depth of field of the current environment, to a macro range, a middle range or a long range.
The initial point x of the main position of the focus point of the main camera may be the minimum value of the corresponding focal range, in which case y takes a positive value; or x may be the maximum value of the corresponding focal range, in which case y takes a negative value.
More specifically, the initial focus point determining unit 20 may combine the depth of field of the current environment detected by the auxiliary camera to determine whether x is the minimum or the maximum value of the corresponding focal range. Usually, the main depth of field is first calculated from the depth of field of the current environment detected by the main camera, and the auxiliary depth of field from the depth of field detected by the auxiliary camera; the two are then compared: if the main depth of field is greater than the auxiliary depth of field, x is set to the maximum value of the corresponding focal range; if the main depth of field is less than the auxiliary depth of field, x is set to the minimum value of the corresponding focal range.
Optionally, on the basis of the embodiment corresponding to Fig. 4, in another embodiment of the mobile terminal provided by the embodiments of the present invention, the initial focus point determining unit may locate x at one third or one half of the focal range.
In this case, the focus point adjustment unit may first let y take a positive value and perform the step of judging whether the sharpness of the image obtained by the auxiliary camera at the auxiliary position is greater than that of the image obtained by the main camera at the main position, and then let y take a negative value and perform the same judging step again. In this way, the in-focus position can be found more quickly.
Optionally, on the basis of the embodiment corresponding to Fig. 4, in another embodiment of the mobile terminal provided by the embodiments of the present invention, the focus point adjustment unit may set the specific value of y differently according to the focal range. For example, if the current focal range is the macro range, the corresponding value of y is y1; if the current focal range is the middle range, the corresponding value of y is y2; if the current focal range is the long range, the corresponding value of y is y3; here, y1 ≤ y2 ≤ y3.
Optionally, on the basis of the embodiment corresponding to Fig. 4, in another embodiment of the mobile terminal provided by the embodiments of the present invention, the focus point adjustment unit is further configured to: calculate the main depth of field from the depth of field of the current environment detected by the main camera; calculate the auxiliary depth of field from the depth of field of the current environment detected by the auxiliary camera; and calculate y from the main depth of field and the auxiliary depth of field. In the calculation, the difference between the main depth of field and the auxiliary depth of field may be taken as y, and the distance between the main camera and the auxiliary camera may also be combined to optimize the value of y.
In the dual camera-based focusing method and mobile terminal proposed by the present invention, the main camera performs a preliminary calculation of the depth of field of the subject in the current environment and classifies the environment as macro, middle range or long range; the main camera and the auxiliary camera then perform contrast focusing within the determined focal range to determine the in-focus position. The present invention not only improves the focusing speed, but also achieves a better bokeh effect in large-aperture shooting, thereby improving the user experience.
It should be noted that, in this document, the terms "comprising", "including" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article or device that comprises it.
The numbering of the above embodiments of the present invention is for description only and does not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention, or the part thereof contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, magnetic disk or optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of the claims of the present invention; any equivalent structural or flow transformation made using the contents of the specification and accompanying drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the present invention.
Claims (10)
1. A dual camera-based focusing method, characterized in that the method comprises:
detecting, by a main camera, the depth of field of the current environment, and determining a focal range from it;
setting the initial point of the main position of the focus point of the main camera at x, where x lies within the focal range, and, at the same moment, setting the initial point of the auxiliary position of the focus point of an auxiliary camera at x + y;
judging whether the sharpness of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the sharpness of the image obtained by the main camera when its focus point is at the main position; if so,
increasing the main position of the focus point of the main camera by 2y and, at the same moment, increasing the auxiliary position of the focus point of the auxiliary camera by 2y;
judging again whether the sharpness of the image obtained by the auxiliary camera at the auxiliary position is greater than that of the image obtained by the main camera at the main position; if so, returning to the previous step; if not,
setting the focus point corresponding to the main camera as the in-focus position.
2. The dual camera-based focusing method according to claim 1, characterized in that the focal range is set, according to the depth of field of the current environment, to a macro range, a middle range or a long range.
3. The dual camera-based focusing method according to claim 1, characterized in that x is located at one third or one half of the focal range.
4. The dual camera-based focusing method according to claim 1, characterized in that the specific value of y is set differently according to the focal range.
5. The dual camera-based focusing method according to claim 1, characterized in that, before the initial point of the auxiliary position of the focus point of the auxiliary camera is set at x + y, the method further comprises:
calculating a main depth of field from the depth of field of the current environment detected by the main camera;
calculating an auxiliary depth of field from the depth of field of the current environment detected by the auxiliary camera; and
calculating y from the main depth of field and the auxiliary depth of field.
6. A mobile terminal having a main camera and an auxiliary camera, characterized by comprising:
an environment determining unit, configured to determine a focal range according to the depth of field of the current environment detected by the main camera;
an initial focus point determining unit, configured to set the initial point of the main position of the focus point of the main camera at x, where x lies within the focal range, and, at the same moment, to set the initial point of the auxiliary position of the focus point of the auxiliary camera at x + y;
a judging unit, configured to judge whether the sharpness of the image obtained by the auxiliary camera when its focus point is at the auxiliary position is greater than the sharpness of the image obtained by the main camera when its focus point is at the main position;
a focus point adjustment unit, configured to, if the sharpness of the image obtained by the auxiliary camera at the auxiliary position is greater than that of the image obtained by the main camera at the main position, increase the main position of the focus point of the main camera by 2y and, at the same moment, increase the auxiliary position of the focus point of the auxiliary camera by 2y; and
a focusing determining unit, configured to, if the sharpness of the image obtained by the auxiliary camera at the auxiliary position is less than that of the image obtained by the main camera at the main position, set the focus point corresponding to the main camera as the in-focus position.
7. The mobile terminal according to claim 6, characterized in that the focal range is set, according to the depth of field of the current environment, to a macro range, a middle range or a long range.
8. The mobile terminal according to claim 6, characterized in that x is located at one third or one half of the focal range.
9. The mobile terminal according to claim 6, characterized in that the specific value of y is set differently according to the focal range.
10. The mobile terminal according to claim 6, characterized in that the focus point adjustment unit is further configured to:
calculate the main depth of field from the depth of field of the current environment detected by the main camera;
calculate the auxiliary depth of field from the depth of field of the current environment detected by the auxiliary camera; and
calculate y from the main depth of field and the auxiliary depth of field.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710109441.9A CN106713718A (en) | 2017-02-27 | 2017-02-27 | Dual camera-based focusing method and mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106713718A true CN106713718A (en) | 2017-05-24 |
Family
ID=58917823
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710109441.9A Pending CN106713718A (en) | 2017-02-27 | 2017-02-27 | Dual camera-based focusing method and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106713718A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0341692A2 (en) * | 1988-05-11 | 1989-11-15 | Sanyo Electric Co., Ltd. | Image sensing apparatus having automatic focusing function for automatically matching focus in response to video signal |
US20030160886A1 (en) * | 2002-02-22 | 2003-08-28 | Fuji Photo Film Co., Ltd. | Digital camera |
US20110150446A1 (en) * | 2009-12-23 | 2011-06-23 | Samsung Electronics Co., Ltd. | Method and Apparatus for Processing Digital Image by Using Fast Autofocus |
CN103369247A (en) * | 2013-07-15 | 2013-10-23 | 天津大学 | Camera unified focal length calibration method applied to multi-camera visual information processing |
EP2670127A1 (en) * | 2012-06-01 | 2013-12-04 | BlackBerry Limited | Method and digital camera having improved autofocus |
CN104333703A (en) * | 2014-11-28 | 2015-02-04 | 广东欧珀移动通信有限公司 | Method and terminal for photographing by virtue of two cameras |
CN105959538A (en) * | 2016-05-05 | 2016-09-21 | 乐视控股(北京)有限公司 | Quick focusing method and quick focusing system for double cameras |
US20160295097A1 (en) * | 2015-03-31 | 2016-10-06 | Qualcomm Incorporated | Dual camera autofocus |
CN106161960A (en) * | 2016-08-26 | 2016-11-23 | 曾美枝 | Photographic method and device |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107343158A (en) * | 2017-07-25 | 2017-11-10 | 广东欧珀移动通信有限公司 | Accelerate the convergent method and devices of AEC, terminal device |
WO2019019820A1 (en) * | 2017-07-25 | 2019-01-31 | Oppo广东移动通信有限公司 | Method and apparatus for accelerating aec convergence, and terminal device |
US11196935B2 (en) | 2017-07-25 | 2021-12-07 | Shenzhen Heytap Technology Corp., Ltd. | Method and apparatus for accelerating AEC convergence, and terminal device |
CN107566735A (en) * | 2017-09-30 | 2018-01-09 | 努比亚技术有限公司 | A kind of dual camera focusing method, mobile terminal and computer-readable recording medium |
CN110248101A (en) * | 2019-07-19 | 2019-09-17 | Oppo广东移动通信有限公司 | Focusing method and device, electronic equipment, computer readable storage medium |
CN110248101B (en) * | 2019-07-19 | 2021-07-09 | Oppo广东移动通信有限公司 | Focusing method and device, electronic equipment and computer readable storage medium |
CN116233605A (en) * | 2023-05-08 | 2023-06-06 | 此芯科技(武汉)有限公司 | Focusing implementation method and device, storage medium and image pickup equipment |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170524 |