CN105578024A - Camera focusing method, focusing device and mobile terminal


Info

Publication number
CN105578024A
CN105578024A (application CN201510282122.9A)
Authority
CN
China
Prior art keywords
camera
photosurface
information
camera module
image
Prior art date
Legal status
Granted
Application number
CN201510282122.9A
Other languages
Chinese (zh)
Other versions
CN105578024B (en)
Inventor
蔡凤成
Current Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd filed Critical Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN201510282122.9A
Publication of CN105578024A
Application granted
Publication of CN105578024B
Status: Active

Landscapes

  • Studio Devices (AREA)

Abstract

The invention provides a camera focusing method comprising the following steps: when a camera start instruction is received, the camera is started, a first camera module and a second camera module are initialized, and a preview screen is entered; the first camera module images a reference object in the preview screen to form a first image on a first photosensitive surface, and the second camera module images the same reference object to form a second image on a second photosensitive surface; first height information and first distance information of the first image and second height information and second distance information of the second image are acquired, and the target distance from the reference object to the first and second photosensitive surfaces is calculated from the first height information, the second height information, the first distance information and the second distance information; a preset focal length mapping table is then queried according to the target distance, and the first and second camera modules are driven to focus according to the query result. The method enables rapid and precise focusing. The invention also provides a focusing device and a mobile terminal.

Description

Camera focusing method, focusing device and mobile terminal
Technical field
The present invention relates to the field of optics, and in particular to a camera focusing method, a focusing device, and a mobile terminal applying the camera focusing method and the focusing device.
Background art
With the development of communication technology and the improvement of design and manufacturing capability, equipping smart terminal products such as mobile phones and tablet computers with high-pixel cameras has become a major trend in product development. In recent years, to further improve the shooting experience of smart terminals, terminals equipped with dual camera modules have also entered the market and won the favor of consumers. Compared with a single-camera terminal, a terminal equipped with a dual camera module comprises two independent lens modules, one being a wide-angle lens module and the other a zoom lens module, which gives it better depth-of-field analysis capability; at the same time, under dim lighting conditions the image quality can be significantly improved and noise reduced, so that a better shooting effect is obtained.
At present, how to make full use of the outstanding depth-of-field analysis capability of a dual camera module to achieve more accurate and faster focusing during shooting is the key to further improving the shooting experience of dual-camera terminals. Fig. 1 shows a prior-art solution for focusing a dual camera module. A shooting target D is projected through the lenses onto a first photosensitive device 1 and a second photosensitive device 2 to obtain a first image and a second image respectively. The distance from the midpoint of the first photosensitive device 1 to the midpoint of the second photosensitive device 2 is S, the first incidence angle is ∠A and the second incidence angle is ∠B, the distances of the shooting target D from the image centers (i.e. the midpoints of the photosensitive devices) in the first image and the second image are S1 and S2 respectively, the distance from the shooting target D to the straight line on which the photosensitive devices lie is X, and the distance from the projection of D onto that straight line to the midpoint of the first photosensitive device 1 is X1. By triangulation, X/(X1 + S1) = tan∠A and X/(X1 + S + S2) = tan∠B, which gives the distance from the shooting target D to the line of the photosensitive devices as X = (S + S2 − S1)/(1/tan∠A − 1/tan∠B); rapid focusing on the shooting target D is then performed according to this distance X. However, as the denominator of X = (S + S2 − S1)/(1/tan∠A − 1/tan∠B) shows, even a slight error in the tangent of the first incidence angle ∠A or the second incidence angle ∠B may cause a large error in the calculated distance X from the shooting target D to the line of the photosensitive devices, and hence cause focusing to fail.
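For illustration only, the short numerical sketch below evaluates the prior-art formula quoted above; the baseline S, the offsets S1 and S2, the incidence angles and the function name are hypothetical and not taken from the patent. It shows how a half-degree error in one incidence angle already shifts the computed distance X by several percent.
```python
import math

def prior_art_distance(s, s1, s2, angle_a, angle_b):
    """Prior-art scheme: distance X from target D to the line of the
    photosensitive devices, X = (S + S2 - S1) / (1/tan(A) - 1/tan(B))."""
    return (s + s2 - s1) / (1.0 / math.tan(angle_a) - 1.0 / math.tan(angle_b))

# Hypothetical values in millimetres and degrees.
s, s1, s2 = 20.0, 2.0, 3.0
a, b = math.radians(70.0), math.radians(80.0)

x_true = prior_art_distance(s, s1, s2, a, b)                       # ~111.9 mm
x_err = prior_art_distance(s, s1, s2, a + math.radians(0.5), b)    # ~118.1 mm
print(x_true, x_err)  # a 0.5-degree error in angle A changes X by roughly 5-6%
```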
Summary of the invention
The invention provides a camera focusing method of higher precision. By obtaining the sizes of the images that a reference object forms on the photosensitive surfaces of the two camera modules and the distances from the lenses to the photosensitive surfaces, the target distance of the reference object from the photosensitive surfaces is calculated, and the dual camera module is then driven to focus according to this target distance, which effectively improves focusing speed and precision.
The invention further provides a focusing device applying the camera focusing method.
The invention further provides a mobile terminal applying the camera focusing method and the focusing device.
A camera focusing method comprises the steps of:
when a camera start instruction is received, starting the camera, initializing a first camera module and a second camera module, and entering a preview screen;
the first camera module imaging a reference object in the preview screen to form a first image on a first photosensitive surface, and the second camera module imaging the same reference object in the preview screen to form a second image on a second photosensitive surface;
obtaining first height information and first distance information corresponding to the first image, and second height information and second distance information corresponding to the second image, and calculating a target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information;
querying a preset focal length mapping table according to the target distance, and driving the first camera module to adjust the position of a first lens and/or driving the second camera module to adjust the position of a second lens according to the query result.
The reference object comprises a first reference point and a second reference point; the first image comprises a first projection point corresponding to the first reference point and a second projection point corresponding to the second reference point; and the second image comprises a third projection point corresponding to the first reference point and a fourth projection point corresponding to the second reference point.
The first height information is the distance between the first projection point and the second projection point; the first distance information is the initial distance from the first lens to the first photosensitive surface; the second height information is the distance between the third projection point and the fourth projection point; the second distance information is the initial distance from the second lens to the second photosensitive surface; and the first photosensitive surface and the second photosensitive surface are coplanar.
The step of calculating the target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information specifically comprises:
deriving, from the theory of similar triangles, a relational expression between the target distance and the first height information, the second height information, the first distance information and the second distance information;
calculating the target distance from the reference object to the first and second photosensitive surfaces according to the relational expression and the first height information, the second height information, the first distance information and the second distance information.
The preset focal length mapping table comprises a plurality of different target distances and, for each target distance, a corresponding first lens focal length and second lens focal length. Querying the preset focal length mapping table with the calculated target distance yields the first lens focal length and second lens focal length corresponding to that target distance, and the first camera module is then driven to adjust the position of the first lens according to the first lens focal length and/or the second camera module is driven to adjust the position of the second lens according to the second lens focal length.
The camera focusing method may further comprise a step of monitoring shooting dynamics. After the preview screen is entered, monitoring is started to detect whether a shooting instruction is received. Before a shooting instruction is received, whether the position or viewing angle of a view in the preview screen changes is monitored; when a change in the position or viewing angle of the view is detected, the steps of the first camera module imaging a reference object in the preview screen to form a first image on the first photosensitive surface and the second camera module imaging the same reference object to form a second image on the second photosensitive surface are triggered again.
When a shooting instruction is received, the steps of obtaining the first height information and first distance information corresponding to the first image and the second height information and second distance information corresponding to the second image and calculating the target distance from the reference object to the first and second photosensitive surfaces are performed, the preset focal length mapping table is queried according to the target distance so as to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the query result, and the shooting instruction is then executed.
A focusing device comprises:
an initialization unit, configured to start the camera, initialize the first camera module and the second camera module, and enter the preview screen when a camera start instruction is received;
an imaging unit, configured to image a reference object in the preview screen with the first camera module to form a first image on a first photosensitive surface, and to image the same reference object in the preview screen with the second camera module to form a second image on a second photosensitive surface;
a calculation unit, configured to obtain first height information and first distance information corresponding to the first image and second height information and second distance information corresponding to the second image, and to calculate a target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information;
a focusing unit, configured to query a preset focal length mapping table according to the target distance, and to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the query result, so as to achieve rapid focusing.
The calculation unit may further comprise a relation calculation module, configured to derive, from the theory of similar triangles, the relational expression between the target distance and the first height information, the second height information, the first distance information and the second distance information, and to calculate the target distance according to the relational expression and these items of information.
The focusing unit may further comprise a mapping module, configured to preset and store a plurality of different target distances and the first lens focal length and second lens focal length corresponding to each target distance.
The focusing unit may further comprise a query module, configured to query the mapping module with the calculated target distance to obtain the first lens focal length and second lens focal length corresponding to that target distance.
The focusing device may further comprise a shooting dynamics monitoring unit, configured to monitor whether the position or viewing angle of a view in the preview screen changes and whether the focusing device receives a shooting instruction; according to the monitoring result, the focusing device repeats the focusing operation or continues the current focusing operation in order until focusing is completed.
A mobile terminal comprises a dual camera module and a focusing device; the dual camera module comprises a first camera module and a second camera module, and the focusing device comprises:
an initialization unit, configured to start the camera, initialize the first camera module and the second camera module, and enter the preview screen when a camera start instruction is received;
an imaging unit, configured to image a reference object in the preview screen with the first camera module to form a first image on a first photosensitive surface, and to image the same reference object in the preview screen with the second camera module to form a second image on a second photosensitive surface;
a calculation unit, configured to obtain first height information and first distance information corresponding to the first image and second height information and second distance information corresponding to the second image, and to calculate a target distance from the reference object to the first and second photosensitive surfaces according to these items of information;
a focusing unit, configured to query a preset focal length mapping table according to the target distance, and to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the query result.
In the camera focusing method of the invention, the height information of the first image and of the second image is obtained and combined with the distance from the center of the first lens to the first photosensitive surface and the distance from the center of the second lens to the second photosensitive surface to calculate the target distance from the reference object to the first and second photosensitive surfaces. A preset focal length mapping table is then queried with this target distance to obtain the first lens focal length and second lens focal length corresponding to it, and the first camera module and the second camera module are driven to complete focusing. The height information of the first image, the height information of the second image, the distance from the center of the first lens to the first photosensitive surface and the distance from the center of the second lens to the second photosensitive surface can all be obtained quickly and accurately, so the camera focusing method achieves both good focusing speed and good focusing precision.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art may obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a prior-art solution for focusing a dual camera module.
Fig. 2 is a flowchart of the camera focusing method according to the first embodiment of the invention.
Fig. 3 is a schematic diagram of an application scenario of the camera focusing method shown in Fig. 2.
Fig. 4 is a structural diagram of the focusing device according to the second embodiment of the invention.
Fig. 5 is a structural diagram of the mobile terminal according to the third embodiment of the invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
Referring to Fig. 2 and Fig. 3, the first embodiment of the invention provides a camera focusing method comprising the following steps:
Step S1: when a camera start instruction is received, start the camera, initialize the first camera module and the second camera module, and enter the preview screen;
Step S2: the first camera module images a reference object 30 in the preview screen to form a first image on a first photosensitive surface 31, and the second camera module images the same reference object 30 in the preview screen to form a second image on a second photosensitive surface 33;
Step S3: obtain first height information p1 and first distance information f1 corresponding to the first image, and second height information p2 and second distance information f2 corresponding to the second image, and calculate the target distance y from the reference object 30 to the first photosensitive surface 31 and the second photosensitive surface 33 according to p1, p2, f1 and f2;
Step S4: query a preset focal length mapping table according to the target distance y, and drive the first camera module to adjust the position of the first lens O1 and/or drive the second camera module to adjust the position of the second lens O2 according to the query result.
The reference object 30 is an object that is wholly or partly shown in the preview screen. The reference object 30 may be selected by the camera's default center-focusing mechanism: after the camera is started and the preview screen is entered, the camera by default selects the object roughly at the geometric center of the current preview screen as the reference object 30. Alternatively, the reference object 30 may be selected through a selection instruction received from the user; for example, when the reference object is too large to be shown in the preview screen except in part, the camera completes the selection of the reference object 30 by receiving the selection instruction sent by the user.
In this embodiment, the actual height of the reference object 30 is x. The first height information p1 is the height of the first image on the first photosensitive surface 31; the first distance information f1 is the initial distance from the center point of the first lens O1 to the first photosensitive surface 31; the second height information p2 is the height of the second image on the second photosensitive surface 33; and the second distance information f2 is the initial distance from the center point of the second lens O2 to the second photosensitive surface 33. The first photosensitive surface 31 and the second photosensitive surface 33 are coplanar. It will be appreciated that f1 is the perpendicular distance from the center point of the first lens O1 to the first photosensitive surface 31, f2 is the perpendicular distance from the center point of the second lens O2 to the second photosensitive surface 33, and the target distance y is the perpendicular distance from the reference object 30 to the first photosensitive surface 31 and the second photosensitive surface 33.
The actual height of the reference object 30 and the corresponding heights of the first image and the second image may be the vertical distance between two specified reference points relative to the horizontal plane, or the distance between two specified reference points in some other direction.
Referring to Fig. 3, the reference object 30 comprises a first reference point A and a second reference point B, and the distance between them is x. The perpendicular distance from the reference object 30 to the first photosensitive surface 31 and the second photosensitive surface 33 is the target distance y. After the reference object 30 is projected through the first lens O1, the first image is formed on the first photosensitive surface 31; the first image comprises a first projection point a1 corresponding to the first reference point A and a second projection point b1 corresponding to the second reference point B, and the distance between a1 and b1 (i.e. the first height information) is p1. After the reference object 30 is projected through the second lens O2, the second image is formed on the second photosensitive surface 33; the second image comprises a third projection point a2 corresponding to the first reference point A and a fourth projection point b2 corresponding to the second reference point B, and the distance between a2 and b2 (i.e. the second height information) is p2.
It will be appreciated that the triangle formed by the center point of the first lens O1, the first reference point A and the second reference point B is similar to the triangle formed by the center point of the first lens O1, the first projection point a1 and the second projection point b1, with a first similarity ratio between them. The initial distance f1 from the center point of the first lens O1 to the first photosensitive surface 31 is the altitude of triangle O1a1b1 on side a1b1, and the distance y − f1 from the center point of the first lens O1 to the reference object 30 is the altitude of triangle O1AB on side AB. Likewise, the triangle formed by the center point of the second lens O2, the first reference point A and the second reference point B is similar to the triangle formed by the center point of the second lens O2, the third projection point a2 and the fourth projection point b2, with a second similarity ratio between them; the initial distance f2 from the center point of the second lens O2 to the second photosensitive surface 33 is the altitude of triangle O2a2b2 on side a2b2, and the distance y − f2 from the center point of the second lens O2 to the reference object is the altitude of triangle O2AB on side AB.
From the properties of similar triangles, corresponding sides are proportional, and the ratios of corresponding altitudes, corresponding medians and corresponding angle bisectors all equal the similarity ratio. Calculating the target distance y from the reference object 30 to the first photosensitive surface 31 and the second photosensitive surface 33 from p1, p2, f1 and f2 therefore proceeds as follows.
The ratio of the distance between the first projection point a1 and the second projection point b1 of the first image (i.e. the first height information p1) to the distance between the first reference point A and the second reference point B of the reference object 30 (i.e. its actual height x) equals the first similarity ratio, and the ratio of the initial distance from the center point of the first lens O1 to the first photosensitive surface 31 (i.e. the first distance information f1) to the distance y − f1 from the center point of the first lens O1 to the reference object 30 also equals the first similarity ratio, giving the equation:
p1 / x = f1 / (y − f1)   (1)
Similarly, the ratio of the distance between the third projection point a2 and the fourth projection point b2 of the second image (i.e. the second height information p2) to the actual height x of the reference object 30 equals the second similarity ratio, and the ratio of the initial distance from the center point of the second lens O2 to the second photosensitive surface 33 (i.e. the second distance information f2) to the distance y − f2 from the center point of the second lens O2 to the reference object 30 also equals the second similarity ratio, giving the equation:
p2 / x = f2 / (y − f2)   (2)
Equations (1) and (2) together form the following system of equations:
p1 / x = f1 / (y − f1),   p2 / x = f2 / (y − f2)   (3)
Solving the system (3) gives:
y = f1·f2·(p2 − p1) / (f1·p2 − f2·p1)   (4)
As equation (4) shows, in step S3, once the first height information p1 and the first distance information f1 corresponding to the first image and the second height information p2 and the second distance information f2 corresponding to the second image have been obtained, the target distance y from the reference object 30 to the first photosensitive surface 31 and the second photosensitive surface 33 can be calculated by equation (4).
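As a quick plausibility check of equation (4), the sketch below recovers a known target distance y from image heights generated with relations (1) and (2); the function name and the sample figures are illustrative assumptions, not values from the patent.
```python
def target_distance(p1, f1, p2, f2):
    """Target distance y of the reference object from the coplanar photosensitive
    surfaces, per equation (4): y = f1*f2*(p2 - p1) / (f1*p2 - f2*p1).
    p1, p2 are the image heights; f1, f2 are the initial lens-to-surface distances."""
    denom = f1 * p2 - f2 * p1
    if denom == 0:
        raise ValueError("degenerate geometry: f1*p2 == f2*p1")
    return f1 * f2 * (p2 - p1) / denom

# Hypothetical check: with x = 100 mm, y = 500 mm, f1 = 4.0 mm, f2 = 5.0 mm,
# relations (1) and (2) give p1 = x*f1/(y - f1) and p2 = x*f2/(y - f2),
# and equation (4) should reproduce y.
x, y, f1, f2 = 100.0, 500.0, 4.0, 5.0
p1 = x * f1 / (y - f1)
p2 = x * f2 / (y - f2)
print(target_distance(p1, f1, p2, f2))  # -> approximately 500.0
```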
In step S4, the preset focal length mapping table comprises a plurality of different target distances y and the first lens focal length and second lens focal length corresponding to each target distance y. Querying the preset focal length mapping table with the target distance y calculated by equation (4) yields the first lens focal length and second lens focal length corresponding to that target distance.
Specifically, the data in the preset focal length mapping table may be obtained by step-by-step trial focusing during lens calibration. For example, a test target object is chosen, the preset target distance y between the lens and the test target object is adjusted repeatedly, and the first lens focal length and second lens focal length that give the sharpest image at each preset target distance y are recorded, yielding the focal length mapping table for the different preset target distances y. It will be appreciated that the preset target distances y in the mapping table can be chosen according to the required lens adjustment accuracy; when higher accuracy is required, the spacing between adjacent preset target distances y is reduced.
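A minimal sketch of how such a preset focal length mapping table might be stored and queried follows; the table entries and the nearest-distance lookup rule are assumptions for illustration only, since the patent only specifies that calibrated first and second lens focal lengths are recorded for each preset target distance y.
```python
# Hypothetical calibration data: (preset target distance y in mm,
# first lens focal setting, second lens focal setting).
FOCAL_LENGTH_TABLE = [
    (100.0, 4.20, 5.25),
    (300.0, 4.08, 5.10),
    (500.0, 4.03, 5.04),
    (1000.0, 4.01, 5.02),
]

def lookup_focal_lengths(target_y):
    """Return the (first, second) lens settings for the preset target
    distance closest to the computed target distance y."""
    _, lens1, lens2 = min(FOCAL_LENGTH_TABLE, key=lambda row: abs(row[0] - target_y))
    return lens1, lens2

print(lookup_focal_lengths(480.0))  # -> (4.03, 5.04)
```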
In step S4, driving the first camera module to adjust the position of the first lens O1 and/or driving the second camera module to adjust the position of the second lens O2 according to the query result specifically comprises:
driving the first camera module to adjust the position of the first lens O1 according to the focal length of the first lens O1 corresponding to the target distance y obtained by querying the preset focal length mapping table, so as to move the first lens O1 to the corresponding focal position; and/or
driving the second camera module to adjust the position of the second lens O2 according to the focal length of the second lens O2 corresponding to the target distance y obtained by querying the preset focal length mapping table, so as to move the second lens O2 to the corresponding focal position.
It will be appreciated that the camera focusing method may also comprise a step of monitoring shooting dynamics: after the preview screen is entered, monitoring is started to detect whether a shooting instruction from the user is received. Before a shooting instruction is received, whether the position or viewing angle of a view in the preview screen (for example the geometric center of the preview screen) changes is monitored; when such a change is detected, step S2 is triggered again, in which the first camera module images a reference object in the preview screen to form a first image on the first photosensitive surface and the second camera module images the same reference object to form a second image on the second photosensitive surface. When a shooting instruction from the user is received, execution continues in order from the currently executing step until focusing is completed at step S4, and the shooting instruction is then executed to complete image capture. For example, if the shooting instruction arrives while step S3 is being executed, step S4 is still executed after step S3 to complete rapid focusing, and the shooting instruction is then executed to complete image capture.
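A minimal control-loop sketch of this monitoring step is given below, under the assumption that the camera exposes simple callbacks; all function names here are hypothetical placeholders, not APIs from the patent.
```python
def preview_loop(view_changed, shutter_pressed, form_images,
                 compute_target_distance, drive_lenses, capture):
    """Rough flow of steps S2-S4 combined with the shooting-dynamics monitoring step."""
    while True:
        images = form_images()               # step S2: image the reference object
        y = compute_target_distance(images)  # step S3: target distance via equation (4)
        drive_lenses(y)                      # step S4: query the table, move the lenses
        # Monitor until either the view changes (refocus) or a shooting
        # instruction arrives (capture with the current focus).
        while True:
            if shutter_pressed():
                capture()                    # execute the shooting instruction
                return
            if view_changed():               # position or viewing angle changed
                break                        # repeat from step S2
```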
In the camera focusing method, the height information of the first image formed by the first camera module on the first photosensitive surface and the height information of the second image formed by the second camera module on the second photosensitive surface are obtained and combined with the distance from the center of the first lens to the first photosensitive surface and the distance from the center of the second lens to the second photosensitive surface; the target distance from the reference object to the first and second photosensitive surfaces is calculated from the theory of similar triangles; the preset focal length mapping table is then queried with this target distance to obtain the corresponding first lens focal length and second lens focal length; and the first camera module is driven to adjust the position of the first lens and/or the second camera module is driven to adjust the position of the second lens, so as to achieve rapid focusing. Since the height information of the first image, the height information of the second image, the distance from the center of the first lens to the first photosensitive surface and the distance from the center of the second lens to the second photosensitive surface can all be obtained quickly and accurately, the camera focusing method achieves both good focusing speed and good focusing precision.
Referring to Fig. 4, the second embodiment of the invention provides a focusing device 400 applying the above rapid focusing method. The focusing device 400 comprises:
an initialization unit 410, configured to start the camera, initialize the first camera module and the second camera module, and enter the preview screen when a camera start instruction is received;
an imaging unit 430, configured to image a reference object in the preview screen with the first camera module to form a first image on a first photosensitive surface, and to image the same reference object in the preview screen with the second camera module to form a second image on a second photosensitive surface;
a calculation unit 450, configured to obtain first height information and first distance information corresponding to the first image and second height information and second distance information corresponding to the second image, and to calculate the target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information;
a focusing unit 470, configured to query a preset focal length mapping table according to the target distance, and to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the query result, so as to achieve rapid focusing.
The reference object is an object wholly or partly shown in the preview screen. The reference object may be selected by the camera's default center-focusing mechanism: after the camera is started and the preview screen is entered, the camera by default selects the object roughly at the geometric center of the current preview screen as the reference object. Alternatively, the reference object may be selected through a selection instruction received from the user; for example, when the reference object is too large to be shown in the preview screen except in part, the camera completes the selection of the reference object by receiving the selection instruction sent by the user.
The actual height of the reference object is x; the first height information is the height of the first image on the first photosensitive surface; the first distance information is the initial distance from the center point of the first lens to the first photosensitive surface; the second height information is the height of the second image on the second photosensitive surface; and the second distance information is the initial distance from the center point of the second lens to the second photosensitive surface. The first photosensitive surface and the second photosensitive surface are coplanar. It will be appreciated that the first distance information is the perpendicular distance from the center point of the first lens to the first photosensitive surface, the second distance information is the perpendicular distance from the center point of the second lens to the second photosensitive surface, and the target distance is the perpendicular distance from the reference object to the first and second photosensitive surfaces.
The actual height of the reference object and the corresponding heights of the first image and the second image may be the vertical distance between two specified reference points relative to the horizontal plane, or the distance between two specified reference points in some other direction.
In an alternative embodiment, the calculation unit 450 further comprises a relation calculation module 451, configured to derive, from the theory of similar triangles, the relational expression between the target distance and the first height information, the second height information, the first distance information and the second distance information, and to calculate the target distance according to the relational expression and these items of information.
In an alternative embodiment, the focusing unit 470 further comprises a mapping module 471, configured to preset and store a plurality of different target distances and the first lens focal length and second lens focal length corresponding to each target distance.
Specifically, the plurality of different target distances and the corresponding first and second lens focal lengths are stored in the mapping module 471 as a focal length mapping table. The data in the focal length mapping table may be obtained by step-by-step trial focusing during lens calibration: a test target object is chosen, the preset target distance between the lens and the test target object is adjusted repeatedly, and the first lens focal length and second lens focal length that give the sharpest image at each preset target distance are recorded, yielding the focal length mapping table for the different preset target distances. It will be appreciated that the preset target distances in the mapping table can be chosen according to the required lens adjustment accuracy; when higher accuracy is required, the spacing between adjacent preset target distances is reduced.
In an alternative embodiment, the focusing unit 470 further comprises a query module 473, configured to query the mapping module with the calculated target distance to obtain the first lens focal length and second lens focal length corresponding to that target distance, and then to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the obtained focal lengths, so as to achieve rapid focusing.
Specifically, using the target distance from the reference object to the first and second photosensitive surfaces calculated by the calculation unit 450, the query module 473 queries the focal length mapping table prestored in the mapping module 471 to obtain the focal length of the first lens and the focal length of the second lens corresponding to that target distance, and then drives the first camera module to adjust the position of the first lens and/or drives the second camera module to adjust the position of the second lens according to those focal lengths.
In an alternative embodiment, the focusing device 400 further comprises a shooting dynamics monitoring unit 490, configured to monitor whether the position or viewing angle of a view in the preview screen changes and whether the focusing device receives a shooting instruction. Specifically, before a shooting instruction from the user is received, whether the position or viewing angle of a view in the preview screen (for example the geometric center of the preview screen) changes is monitored; when such a change is detected, step S2 is triggered again, in which the first camera module images a reference object in the preview screen to form a first image on the first photosensitive surface and the second camera module images the same reference object to form a second image on the second photosensitive surface. When a shooting instruction from the user is received, execution continues in order from the currently executing focusing operation until focusing is completed, and the shooting instruction is then executed to complete image capture. For example, if the shooting instruction arrives while step S3 is being executed, step S4 is still executed after step S3 to complete rapid focusing, and the shooting instruction is then executed to complete image capture.
It will be appreciated that the units of the focusing device 400 may be hardware units, software units, or combinations of hardware and software. The focusing device 400 may run in a mobile terminal (such as a smartphone) or be used together with the mobile terminal as a third-party accessory.
In the focusing device, the calculation unit obtains the height information of the first image formed by the first camera module on the first photosensitive surface, the height information of the second image formed by the second camera module on the second photosensitive surface, the distance from the center of the first lens to the first photosensitive surface and the distance from the center of the second lens to the second photosensitive surface, and calculates the target distance from the reference object to the first and second photosensitive surfaces from the theory of similar triangles; the focusing unit then queries the preset focal length mapping table with the target distance to obtain the corresponding first lens focal length and second lens focal length, and drives the first camera module to adjust the position of the first lens and/or the second camera module to adjust the position of the second lens, thereby achieving fast and accurate focusing of the dual camera module.
Referring to Fig. 5, the third embodiment of the invention provides a mobile terminal 500 applying the above rapid focusing method and focusing device. The mobile terminal 500 comprises:
at least one processor 510 (such as a CPU), at least one communication bus 520, a user interface 530, at least one communication interface 540, a dual camera module 550 and a memory 560. The communication bus 520 implements the communication connections between the components of the mobile terminal 500. The user interface 530 may comprise a keyboard or a display screen and, in an alternative embodiment, may also comprise a standard wired or wireless interface. The communication interface 540 may comprise a standard wired interface (such as a data cable or network cable interface) or wireless interface (such as a Wi-Fi interface, a Bluetooth interface or a near-field communication interface). The dual camera module 550 comprises a first camera module 551 and a second camera module 552. The memory 560 may be a high-speed RAM memory or a non-volatile memory, for example at least one disk memory; in an alternative embodiment, the memory 560 may also be at least one storage device located remotely from the processor 510. As shown in Fig. 5, the memory 560, as a computer storage medium, comprises an operating system, a user interface module and a focusing management module. The operating system coordinates the operation of the components of the mobile terminal 500; the user interface module stores and maintains the user data of the mobile terminal 500; and the focusing management module stores a focusing management program.
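As a rough illustration of this software layout (not the patent's actual implementation), the sketch below shows a focusing management program stored as a module and invoked by the processor when the camera start instruction arrives; all class and method names are invented for this example.
```python
class FocusingManager:
    """Hypothetical stand-in for the focusing management program stored in memory 560."""

    def __init__(self, first_module, second_module, lookup_focal_lengths):
        self.first_module = first_module            # first camera module 551
        self.second_module = second_module          # second camera module 552
        self.lookup_focal_lengths = lookup_focal_lengths

    def on_camera_start(self):
        # Step S1: initialize both camera modules and enter the preview screen.
        self.first_module.initialize()
        self.second_module.initialize()

    def focus(self, p1, f1, p2, f2):
        # Steps S3-S4: compute the target distance per equation (4), query the
        # preset focal length mapping table, and drive both lenses.
        y = f1 * f2 * (p2 - p1) / (f1 * p2 - f2 * p1)
        lens1_pos, lens2_pos = self.lookup_focal_lengths(y)
        self.first_module.drive_lens_to(lens1_pos)
        self.second_module.drive_lens_to(lens2_pos)
```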
In this embodiment, the focusing management program may be the focusing device 400 described in the second embodiment of the invention. It will be appreciated that, when applied in this embodiment, the focusing device 400 is a software unit that is stored in the memory 560 as the focusing management program and is called and executed by the processor 510.
Specifically, when the mobile terminal 500 receives a camera start instruction, the processor 510 calls and executes the focusing management program stored in the memory 560, starts the camera, initializes the first camera module 551 and the second camera module 552, and displays the preview screen through the display screen of the user interface 530.
The first camera module 551 and the second camera module 552 each image a reference object in the preview screen, forming a first image on the first photosensitive surface of the first camera module 551 and a second image on the second photosensitive surface of the second camera module 552.
By calling the focusing management program, the processor 510 obtains the first height information and first distance information corresponding to the first image and the second height information and second distance information corresponding to the second image, and calculates the target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information.
By calling the focusing management program, the processor 510 queries a preset focal length mapping table according to the target distance and drives the first camera module and/or the second camera module to adjust their focal lengths according to the query result, so as to achieve fast and accurate focusing.
It will be appreciated that the mobile terminal 500 in this embodiment may be a smartphone, a tablet computer, a notebook computer, a personal digital assistant, a wearable device or any other imaging apparatus equipped with a dual camera module.
By storing the focusing management program in the memory and, when a camera start instruction is received, having the processor call the focusing management program to control the first camera module and the second camera module to achieve fast and accurate focusing, the mobile terminal improves the user experience of shooting and focusing with its dual camera module and can effectively improve shooting quality.
The above disclosure is only a preferred embodiment of the present invention and certainly cannot limit the scope of the claims. Those of ordinary skill in the art will understand that implementations realizing all or part of the above process, and equivalent variations made according to the claims of the present invention, still fall within the scope of the invention.

Claims (13)

1. A camera focusing method, characterized in that the method comprises the steps of:
when a camera start instruction is received, starting the camera, initializing a first camera module and a second camera module, and entering a preview screen;
the first camera module imaging a reference object in the preview screen to form a first image on a first photosensitive surface, and the second camera module imaging the same reference object in the preview screen to form a second image on a second photosensitive surface;
obtaining first height information and first distance information corresponding to the first image, and second height information and second distance information corresponding to the second image, and calculating a target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information;
querying a preset focal length mapping table according to the target distance, and driving the first camera module to adjust the position of a first lens and/or driving the second camera module to adjust the position of a second lens according to the query result.
2. The camera focusing method according to claim 1, characterized in that the reference object comprises a first reference point and a second reference point, the first image comprises a first projection point corresponding to the first reference point and a second projection point corresponding to the second reference point, and the second image comprises a third projection point corresponding to the first reference point and a fourth projection point corresponding to the second reference point.
3. The camera focusing method according to claim 2, characterized in that the first height information is the distance between the first projection point and the second projection point, the first distance information is the initial distance from the first lens to the first photosensitive surface, the second height information is the distance between the third projection point and the fourth projection point, the second distance information is the initial distance from the second lens to the second photosensitive surface, and the first photosensitive surface and the second photosensitive surface are coplanar.
4. The camera focusing method according to claim 3, characterized in that the step of calculating the target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information specifically comprises:
deriving, from the theory of similar triangles, a relational expression between the target distance and the first height information, the second height information, the first distance information and the second distance information;
calculating the target distance from the reference object to the first and second photosensitive surfaces according to the relational expression and the first height information, the second height information, the first distance information and the second distance information.
5. The camera focusing method according to claim 4, characterized in that the preset focal length mapping table comprises a plurality of different target distances and the first lens focal length and second lens focal length corresponding to each target distance; querying the preset focal length mapping table with the calculated target distance yields the first lens focal length and second lens focal length corresponding to that target distance, and the first camera module is then driven to adjust the position of the first lens according to the first lens focal length and/or the second camera module is driven to adjust the position of the second lens according to the second lens focal length.
6. The camera focusing method according to claim 1, characterized in that the camera focusing method further comprises a step of monitoring shooting dynamics: after the step of entering the preview screen, monitoring is started to detect whether a shooting instruction is received; before a shooting instruction is received, whether the position or viewing angle of a view in the preview screen changes is monitored, and when a change in the position or viewing angle of the view is detected, the steps of the first camera module imaging a reference object in the preview screen to form a first image on the first photosensitive surface and the second camera module imaging the same reference object in the preview screen to form a second image on the second photosensitive surface are triggered again.
7. The camera focusing method according to claim 6, characterized in that, when a shooting instruction is received, the steps of obtaining the first height information and first distance information corresponding to the first image and the second height information and second distance information corresponding to the second image and calculating the target distance from the reference object to the first and second photosensitive surfaces are performed, the preset focal length mapping table is queried according to the target distance so as to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the query result, and the shooting instruction is then executed.
8. A focusing device, characterized in that the focusing device comprises:
an initialization unit, configured to start the camera, initialize the first camera module and the second camera module, and enter the preview screen when a camera start instruction is received;
an imaging unit, configured to image a reference object in the preview screen with the first camera module to form a first image on a first photosensitive surface, and to image the same reference object in the preview screen with the second camera module to form a second image on a second photosensitive surface;
a calculation unit, configured to obtain first height information and first distance information corresponding to the first image and second height information and second distance information corresponding to the second image, and to calculate a target distance from the reference object to the first and second photosensitive surfaces according to the first height information, the second height information, the first distance information and the second distance information;
a focusing unit, configured to query a preset focal length mapping table according to the target distance, and to drive the first camera module to adjust the position of the first lens and/or drive the second camera module to adjust the position of the second lens according to the query result, so as to achieve rapid focusing.
9. The focusing device as claimed in claim 8, characterized in that the computing unit further comprises a relation calculating module, configured to derive, according to the principle of similar triangles, a relational expression between the target distance from the reference object to the first photosurface and the second photosurface and the first height information, the second height information, the first distance information and the second distance information, so that the target distance is calculated from the relational expression and the first height information, the second height information, the first distance information and the second distance information.
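The claim does not reproduce the relational expression itself, so the sketch below falls back on the textbook similar-triangle (stereo disparity) result for two laterally offset cameras, Z = B * f / (x1 - x2). Treat it as an illustrative stand-in rather than the patent's own formula; every parameter name and number here is assumed.

```python
def stereo_target_distance(baseline_m, image_distance_m, x1_m, x2_m):
    """Textbook similar-triangle estimate of object distance for two cameras.

    baseline_m       : lateral separation between the two camera modules
    image_distance_m : lens-to-photosurface distance (roughly the focal length)
    x1_m, x2_m       : positions of the same reference object's image on the first
                       and second photosurfaces, measured from each optical axis
    """
    disparity = x1_m - x2_m
    if disparity == 0:
        return float("inf")          # no disparity: object effectively at infinity
    return baseline_m * image_distance_m / disparity

# Example: 25 mm baseline, 4 mm image distance, 62.5 micron disparity -> 1.6 m
print(stereo_target_distance(0.025, 0.004, 62.5e-6, 0.0))  # -> 1.6
```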
10. The focusing device as claimed in claim 8, characterized in that the focusing unit further comprises a mapping module, configured to preset and store a plurality of groups of different target distances and the first lens focal length and the second lens focal length corresponding to each group of target distances.
11. The focusing device as claimed in claim 10, characterized in that the focusing unit further comprises a query module, configured to query the mapping module according to the calculated target distance so as to obtain the first lens focal length and the second lens focal length corresponding to the target distance.
12. The focusing device as claimed in claim 8, characterized in that the focusing device further comprises a shooting-dynamics monitoring unit, configured to monitor whether the position or viewing angle of a view in the preview screen changes and whether the focusing device receives a shooting instruction; according to the monitoring result of the shooting-dynamics monitoring unit, the focusing device repeats the focusing operation or proceeds with the current focusing operation until focusing is completed.
13. A mobile terminal, comprising a dual camera module, the dual camera module comprising a first camera module and a second camera module, characterized in that the mobile terminal further comprises the focusing device as claimed in any one of claims 8 to 12.
CN201510282122.9A 2015-05-27 2015-05-27 Camera focusing method, focusing device and mobile terminal Active CN105578024B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510282122.9A CN105578024B (en) 2015-05-27 2015-05-27 Camera focusing method, focusing device and mobile terminal

Publications (2)

Publication Number Publication Date
CN105578024A true CN105578024A (en) 2016-05-11
CN105578024B CN105578024B (en) 2018-01-09

Family

ID=55887636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510282122.9A Active CN105578024B (en) 2015-05-27 2015-05-27 Camera focusing method, focusing device and mobile terminal

Country Status (1)

Country Link
CN (1) CN105578024B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020025154A1 (en) * 2000-08-25 2002-02-28 Olympus Optical Co., Ltd. Focusing apparatus
US20090116830A1 (en) * 2007-11-05 2009-05-07 Sony Corporation Imaging apparatus and method for controlling the same
US20120019708A1 (en) * 2010-07-23 2012-01-26 Morihisa Taijiro Image capturing apparatus and image capturing method
CN103229085A (en) * 2010-09-28 2013-07-31 株式会社理光 Imaging apparatus
CN103246130A (en) * 2013-04-16 2013-08-14 广东欧珀移动通信有限公司 Focusing method and device

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106303224A (en) * 2016-07-29 2017-01-04 维沃移动通信有限公司 A kind of focusing method and mobile terminal
CN106303224B (en) * 2016-07-29 2019-06-07 维沃移动通信有限公司 A kind of focusing method and mobile terminal
WO2018103299A1 (en) * 2016-12-09 2018-06-14 中兴通讯股份有限公司 Focusing method, and focusing device
CN108616685A (en) * 2016-12-09 2018-10-02 中兴通讯股份有限公司 A kind of focusing method and focusing mechanism
CN107920209A (en) * 2017-12-27 2018-04-17 国网通用航空有限公司 A kind of high speed camera autofocus system, method and processor, computer equipment
CN108200335A (en) * 2017-12-28 2018-06-22 深圳市金立通信设备有限公司 Photographic method, terminal and computer readable storage medium based on dual camera
CN108200335B (en) * 2017-12-28 2020-01-14 深圳市金立通信设备有限公司 Photographing method based on double cameras, terminal and computer readable storage medium
CN111182198A (en) * 2018-11-13 2020-05-19 奇酷互联网络科技(深圳)有限公司 Shooting focusing method based on double cameras, mobile device and device
CN114051091B (en) * 2018-11-20 2024-03-19 中山市远尚光电科技有限公司 Automatic focusing device based on automatic ranging long focus lens
CN114051091A (en) * 2018-11-20 2022-02-15 中山市远尚光电科技有限公司 Device for realizing automatic focusing of telephoto lens based on automatic distance measurement
CN110973763B (en) * 2019-12-12 2020-12-22 天目爱视(北京)科技有限公司 Foot intelligence 3D information acquisition measuring equipment
CN110973763A (en) * 2019-12-12 2020-04-10 天目爱视(北京)科技有限公司 Foot intelligence 3D information acquisition measuring equipment
CN111325858A (en) * 2020-03-06 2020-06-23 赛特斯信息科技股份有限公司 Method for realizing automatic charging management aiming at roadside temporary parking space
CN114070997A (en) * 2020-07-30 2022-02-18 宁波舜宇光电信息有限公司 Multi-camera module, camera system, electronic equipment and automatic zooming imaging method
CN113747067A (en) * 2021-09-07 2021-12-03 维沃移动通信有限公司 Photographing method and device, electronic equipment and storage medium
CN113747067B (en) * 2021-09-07 2024-02-02 维沃移动通信有限公司 Photographing method, photographing device, electronic equipment and storage medium
CN115278101A (en) * 2022-07-29 2022-11-01 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN115278101B (en) * 2022-07-29 2024-02-27 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN115514894A (en) * 2022-09-30 2022-12-23 联想(北京)有限公司 Processing method and electronic equipment
CN115514894B (en) * 2022-09-30 2024-05-28 联想(北京)有限公司 Processing method and electronic equipment

Also Published As

Publication number Publication date
CN105578024B (en) 2018-01-09

Similar Documents

Publication Publication Date Title
CN105578024A (en) Camera focusing method, focusing device and mobile terminal
US10623626B2 (en) Multiple lenses system, operation method and electronic device employing the same
CN107924104B (en) Depth sensing autofocus multi-camera system
EP3067746B1 (en) Photographing method for dual-camera device and dual-camera device
EP3033733B1 (en) Stereo yaw correction using autofocus feedback
US20160269620A1 (en) Phase detection autofocus using subaperture images
WO2019105214A1 (en) Image blurring method and apparatus, mobile terminal and storage medium
US9456141B2 (en) Light-field based autofocus
CN104980644B (en) A kind of image pickup method and device
CN106226976B (en) A kind of dual camera image pickup method, system and terminal
CN104363379A (en) Shooting method by use of cameras with different focal lengths and terminal
US20110025830A1 (en) Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
US9300858B2 (en) Control device and storage medium for controlling capture of images
CN108886582A (en) Photographic device and focusing controlling method
EP3416370A1 (en) Photography focusing method, device, and apparatus for terminal
CN106067947A (en) A kind of photographic method and terminal
WO2018099004A1 (en) Method and unit for focusing dual camera, and terminal device
CN103685905A (en) Photographing method and electronic equipment
CN110336951A (en) Contrast formula focusing method, device and electronic equipment
US20210400193A1 (en) Multiple camera system for wide angle imaging
WO2011014421A2 (en) Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
CN103402058A (en) Shot image processing method and device
CN106131415A (en) Image in 2 D code scan method, device and mobile terminal
US20160241790A1 (en) Method and apparatus for selecting target image
CN104680563B (en) The generation method and device of a kind of image data

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant