CN103747183A - Mobile phone shooting focusing method - Google Patents
- Publication number
- CN103747183A CN103747183A CN201410018564.8A CN201410018564A CN103747183A CN 103747183 A CN103747183 A CN 103747183A CN 201410018564 A CN201410018564 A CN 201410018564A CN 103747183 A CN103747183 A CN 103747183A
- Authority
- CN
- China
- Prior art keywords
- reference system
- processing unit
- central processing
- facing camera
- display screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Studio Devices (AREA)
- Telephone Function (AREA)
Abstract
The invention provides a mobile-phone shooting focusing method, executed on a smartphone that has both a front camera and a rear camera. While the rear camera is shooting, the method comprises the following steps: (1) the front camera captures the position coordinates, in the front-camera reference frame, of the phone holder's pupil centers while the holder watches the visual focus on the display screen; (2) the central processing unit converts the pupil-center coordinates into the position coordinates of the visual focus in the rear-camera reference frame; (3) the central processing unit sends the corresponding zoom control signals to the rear camera according to changes in the visual-focus coordinates, changing the rear camera's focusing position; steps 1 to 3 are then repeated. The method greatly improves the focusing image quality of the smartphone and at the same time avoids the vibration that manual focusing imposes on shutter response.
Description
Technical field
The present invention relates to a focusing method for photographic equipment, and in particular to a focusing method that exploits a physiological characteristic, namely the user's gaze.
Background technology
Smartphones are now in ever wider use, and each of their functions tends increasingly toward intelligent operation. It can be predicted that intelligent operation on mobile phones will attract more and more attention.
The combination of camera and mobile phone is very common, and over the past years it has been repeatedly optimized. Its intelligent functions, however, remain limited and cannot meet users' demands. On phones that have both a front-facing and a rear-facing lens, the hardware has not been effectively exploited: it is confined to a limited set of software calls, and much of the equipment sits idle.
When photographing with a smartphone, autofocus is complex to implement and often unsatisfactory in response speed, so the user must touch the viewfinder image with a finger to focus manually. This often leads to repeated refocusing and to degraded shots caused by defocus.
As shown in Figure 1, a smartphone usually has a display screen 03 on the front face of its housing 01, with a touch screen 02 fitted over the display. A front-facing camera 05 is embedded in the housing 01 above the touch screen 02, and a rear-facing camera 04 is embedded near the top of the rear face of the housing 01. The front camera 05 is used for video calls and self-portraits; the rear camera 04 is used for photography and video recording. The smartphone's operating system is normally stored in memory 08; the built-in central processing unit 07 invokes processing modules in memory 08, establishes control models for the peripheral circuits and special-purpose chips on the system board, processes the incoming data and signal streams, and produces the corresponding result-data and control-signal streams that drive the relevant devices. The central processing unit 07 typically renders the functional areas on the display screen 03 and, from the area coordinates fed back by the touch screen 02, loads the corresponding processing module. With this existing hardware architecture it is possible to use the two cameras together to control zoom continuously throughout shooting, while avoiding the mutual interference of both hands gripping the phone.
Summary of the invention
The object of the invention is to provide a mobile-phone shooting focusing method that solves the technical problem of being unable to focus continuously without disturbance.
The mobile-phone shooting focusing method of the invention is executed on a smartphone having a front camera and a rear camera. While the rear camera is shooting, the method comprises the following steps:
Step 1: the front camera captures the position coordinates, in the front-camera reference frame, of the phone holder's pupil center while the holder watches the visual focus on the display screen;
Step 2: the central processing unit converts the pupil-center coordinates into the position coordinates of the visual focus in the rear-camera reference frame;
Step 3: the central processing unit sends the corresponding zoom control signal to the rear camera according to the change in the visual-focus coordinates, changing the rear camera's focusing position;
Steps 1 to 3 are then repeated.
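For illustration only (this sketch is not part of the patent disclosure), the three repeated steps can be expressed as a small control loop. All device-facing names here are hypothetical stand-ins: `capture` for the front-camera pupil sampler, `mapping` for the CPU's reference-frame conversion, and `send_zoom_command` for the rear-camera focus controller; a simple affine map stands in for the conversion of step 2.

```python
# Sketch of the patent's three-step continuous-focus loop.
# All device-facing functions and the affine mapping are hypothetical.

def pupil_to_rear_focus(pupil_xy, mapping):
    """Step 2: convert a pupil-center coordinate (front-camera frame)
    to a visual-focus coordinate (rear-camera frame) via an affine map."""
    x, y = pupil_xy
    ax, bx, cx, ay, by, cy = mapping
    return (ax * x + bx * y + cx, ay * x + by * y + cy)

def focus_loop(pupil_samples, mapping, send_zoom_command):
    """Steps 1-3, repeated: issue a focus command only when the focus moves."""
    last_focus = None
    for pupil_xy in pupil_samples:                      # step 1: sampled pupil centers
        focus = pupil_to_rear_focus(pupil_xy, mapping)  # step 2: frame conversion
        if focus != last_focus:                         # step 3: focus position changed
            send_zoom_command(focus)
            last_focus = focus

# Example: identity-like mapping, three gaze samples, collect issued commands.
commands = []
focus_loop([(10, 20), (10, 20), (30, 40)],
           mapping=(1, 0, 0, 0, 1, 0),
           send_zoom_command=commands.append)
print(commands)   # only gaze movements trigger a zoom command
```

The duplicate-suppression check mirrors the patent's intent that the zoom signal is sent "according to the change" of the visual-focus coordinates rather than on every frame.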
Step 1 described above comprises the following sub-steps:
Step a: the central processing unit establishes reference frames for the display screen, the front camera and the rear camera, and forms a mapping-relation model between them;
Step b: the central processing unit uses the marker devices on the smartphone to establish the datum for coordinate conversion between the display-screen reference frame and the front-camera reference frame;
Step c: the front camera captures an eyeball image; the central processing unit determines the position coordinates of the marker devices in the front-camera reference frame, then determines the positions of the pupil and of the pupil center.
Step 2 described above comprises the following sub-steps:
Step d: the central processing unit determines, in the front-camera reference frame, the vector relation between the pupil center and the marker devices;
Step e: through the mapping-relation model, the central processing unit converts that vector relation into the coordinate position of the corresponding visual focus in the rear-camera reference frame.
The marker device 06 is an LED light source at the edge of the display screen or a luminous ring around the front camera, or a retroreflector at the corresponding position; or a specific luminous or blinking pattern formed by the display matrix at the centre and the two ends of the screen's upper edge.
Step b above comprises the following sub-steps:
S110: the central processing unit activates the second marker device at one end of the top of the display screen to produce a local luminance difference, the second spot, and determines the second spot's coordinate position in the display-screen reference frame;
S120: the central processing unit activates the third marker device at the other end of the top of the display screen to produce a local luminance difference, the third spot, and determines the third spot's coordinate position in the display-screen reference frame;
S130: the central processing unit activates the first marker device at the upper edge of the display screen to produce a local luminance difference; using the physical relative-position parameters of the front camera and the display screen, it corrects the coordinate position of this local luminance in the display-screen reference frame, taking the front camera's coordinate position in the display-screen frame as the first spot's coordinate position.
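A minimal sketch of the marker geometry of S110–S130, under assumed dimensions (the screen width, camera offset, and function name `marker_spots` are all illustrative, not the patent's parameters): the second and third spots sit at the two ends of the screen's top edge, and the first spot is the front camera itself, mapped into the screen frame through its physical offset.

```python
# Illustrative marker setup for S110-S130; names and values are assumptions.

def marker_spots(screen_w, cam_offset):
    """Return the three spot coordinates in the display-screen frame.

    spot2 / spot3: markers at the two ends of the screen's top edge (S110, S120).
    spot1: the front camera itself, whose screen-frame coordinate comes from
    the camera's physical offset relative to the screen origin (S130).
    """
    spot2 = (0.0, 0.0)              # top-left corner of the screen
    spot3 = (float(screen_w), 0.0)  # top-right corner
    dx, dy = cam_offset             # camera position relative to screen origin
    spot1 = (float(dx), float(dy))  # camera mapped into the screen frame
    return spot1, spot2, spot3

# Assumed geometry: a 68 mm wide screen, camera 5 mm above the top edge.
s1, s2, s3 = marker_spots(screen_w=68.0, cam_offset=(31.0, -5.0))
print(s1, s2, s3)
```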
Step c above comprises the following sub-steps:
S150: the central processing unit extracts the eyeball coordinate region, in the front-camera reference frame, from the facial picture;
S160: the central processing unit determines, within the eyeball region, the front-camera-frame coordinate regions of the first, second and third spots;
S180: the central processing unit extracts the pupil coordinate region from the eyeball region;
S190: the central processing unit extracts the pupil-center coordinates from the pupil region.
Step d above comprises the following sub-steps:
S170: the central processing unit determines the projection coordinate of the first spot's centre, i.e. its projection, in the front-camera reference frame, onto the line joining the centres of the second and third spot regions;
S200: from the projection coordinate and the pupil-center coordinates, the central processing unit forms the line-of-sight vector in the front-camera reference frame, which reflects the corresponding visual-focus position coordinates in the display-screen reference frame.
Step e above comprises the following sub-step:
the central processing unit converts the gaze position coordinates in the display-screen reference frame into the corresponding visual-focus position coordinates in the rear-camera reference frame.
Step e above further comprises the following sub-steps:
S210: the central processing unit uses the change in the spacing between the centres of the second and third spots in the front-camera reference frame as the first correction parameter, eliminating the interference caused to the line-of-sight vector by longitudinal shake of the distance between the front camera and the eyeball;
S220: the central processing unit uses the change in the ratio of the second-spot-centre-to-projection distance to the third-spot-centre-to-projection distance in the front-camera reference frame as the second correction parameter, eliminating the interference caused to the line-of-sight vector by horizontal shake between the front camera and the eyeball.
With the gaze-tracking focusing method of the invention, the phone focuses according to the user's line of sight while photographing or filming. First, it avoids the slow continuous focusing of autofocus modes, whose quality is limited by the optimization level of the phone's software and hardware. Second, by tracking fast-moving objects in the viewfinder picture with the gaze, it can achieve the function of the manual focus ring of a high-end single-lens-reflex camera, markedly improving the smartphone's focusing image quality. At the same time it avoids the vibration that manual focusing imposes on shutter response.
Embodiments of the invention are described further below with reference to the accompanying drawings.
Description of the drawings
Fig. 1 is a cross-sectional schematic of the main hardware structure of a smartphone used for shooting;
Fig. 2 is a schematic of the relative positions of the phone and the eyeball in the shooting focusing method of the invention;
Fig. 3 is a schematic of the relation between the light rays on the eyeball and the pupil position in the front camera in the shooting focusing method of the invention;
Fig. 4 is a flow diagram of continuous focusing in the shooting focusing method of the invention.
Embodiment
As shown in Figure 1, this embodiment uses the existing hardware structure. While the rear camera 04 is shooting, the front camera 05 continuously captures the position coordinates, in the front-camera reference frame, of the phone holder's pupil center as the holder watches the visual focus on the display screen. The central processing unit 07 converts the pupil-center coordinates into the visual-focus coordinates in the rear-camera 04 reference frame and, according to changes in those coordinates, sends the corresponding zoom control signals to the rear camera 04, continuously changing its focusing position.
The phone's physical structure fixes the relative positions of the display screen 03, the front camera 05 and the rear camera 04, so the display-screen, front-camera and rear-camera reference frames have fixed coordinate conversions and a fixed mapping model.
By placing luminous points or reflective points as marker devices 06 at fixed positions around the display screen 03, reference base points can be established in the display-screen frame; the light they cast onto the eyeball, once imaged by the front camera 05, provides the reference data for coordinate conversion between the front-camera and display-screen reference frames.
The marker device 06 can be an LED light source at the edge of the display screen 03 or a luminous ring around the front camera 05. It can also be a retroreflector at the corresponding position, or a specific luminous (or blinking) pattern formed by the display matrix at the centre and the two ends of the screen's upper edge. Because the display-screen frame can map the coordinates of the front camera 05 and of the luminous pattern onto each other, the front camera's own coordinates can serve as a marker device 06.
As shown in Figure 2, while the front camera 05 acquires the phone holder's facial image, the central processing unit 07 uses image-processing modules in the operating system, such as continuous-tone color-difference methods, to overcome color interference in the image and quickly locate the eyeball's coordinate region in the image captured by the front camera 05.
The iris of the eyeball covers the pupil: the pupil has the lowest gray value and the iris the second lowest, and the sharp gray-level change at their junction is the edge strength. The point at which the edge strength reaches its maximum must be a boundary point.
According to the gray-level distribution of the iris image, take the point Pc(Xc, Yc) with the minimum gray value; this point necessarily lies inside the pupil. Using Pc as the starting point of the pupil-boundary search, perform one-dimensional scans in different directions, combining the pixel gray values, gradient values and boundary-curve smoothness in the eyeball image. Formula 1 determines the boundary position r(Xr, Yr) as the coordinate of the boundary point at which the edge strength reaches its maximum:

F(r) = k0·G(r) − k1·(I(r) − Ip) − k2·(r − r(n−1))    (Formula 1)

where F is the criterion function, I(r) is the gray value at the search point, Ip is the pupil threshold, G(r) is the gradient at the search point computed with the one-dimensional gradient operator [−1 0 1], r(n−1) is the boundary-point radius found on the adjacent search line, and k0, k1, k2 are weight parameters determined by experiment.
As shown in Figure 3, after processing by the optical system of the front camera 05, the pupil images as a plane ellipse in the front-camera reference frame. Fitting the extracted boundary points to an ellipse by least squares with Formula 2 determines the pupil center P(Xp, Yp):

x² + A·xy + B·y² + C·x + D·y + E = 0    (Formula 2)

which forms the elliptical boundary. Discrete boundary-point coordinates chosen in five directions determine the parameters A, B, C, D and E; Formula 3 then gives the ellipse center, i.e. the pupil center:

x0 = (2BC − AD)/(A² − 4B),  y0 = (2D − AC)/(A² − 4B)    (Formula 3)
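A sketch of Formulas 2 and 3, not the patent's implementation: boundary points are fitted to the conic of Formula 2 by linear least squares, and the centre then follows from Formula 3. The synthetic, noise-free boundary points and the function name `fit_ellipse_center` are illustrative assumptions.

```python
import numpy as np

def fit_ellipse_center(xs, ys):
    """Fit x^2 + A*xy + B*y^2 + C*x + D*y + E = 0 (Formula 2) by least
    squares, then return the centre via Formula 3:
        x0 = (2BC - AD) / (A^2 - 4B),  y0 = (2D - AC) / (A^2 - 4B)
    """
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    # Move x^2 to the right-hand side: [xy, y^2, x, y, 1] . [A,B,C,D,E] = -x^2
    M = np.column_stack([xs * ys, ys**2, xs, ys, np.ones_like(xs)])
    A, B, C, D, E = np.linalg.lstsq(M, -xs**2, rcond=None)[0]
    den = A**2 - 4 * B
    return (2 * B * C - A * D) / den, (2 * D - A * C) / den

# Synthetic pupil boundary: ellipse centred at (3, -2) with semi-axes 4 and 2.
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
x = 3 + 4 * np.cos(t)
y = -2 + 2 * np.sin(t)
x0, y0 = fit_ellipse_center(x, y)
print(round(x0, 3), round(y0, 3))   # recovers the centre (3.0, -2.0)
```

Fixing the x² coefficient at 1 makes the fit a plain linear system, which matches the five-parameter form of Formula 2.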
Under ideal shooting conditions the distance between the front camera 05 and the eyeball undergoes no front-back or lateral shake, and the eyeball approximates a sphere. When the eyeball rotates to fixate on different image positions on the display screen 03, the relative coordinates of the spots in the front-camera reference frame do not change, while the pupil-center position shifts relative to the spots as the eyeball rotates. Measuring the offset of the pupil center relative to the spots therefore completes the measurement of the gaze point.
As shown in Figure 3, in the front-camera reference frame the measurement uses the projection point O of the first spot on the line joining the second and third spots, together with the position vector OP from O to the pupil-center point P. Let (Xp, Yp) denote the image coordinates of the pupil center and (Xg, Yg) the image coordinates of the spot reference position, point O in Figure 3. The relative displacement OP between the projection point O and the pupil center P is:

dx = Xp − Xg,  dy = Yp − Yg    (Formula 4)

Let (Xs, Ys) denote the screen coordinates of the eye's fixation point. A polynomial establishes the mapping f: (dx, dy) → (Xs, Ys):

Xs = a0 + a1·dx + a2·dy + a3·dx·dy + a4·dx² + a5·dy²    (Formula 5)

Ys = b0 + b1·dx + b2·dy + b3·dx·dy + b4·dx² + b5·dy²    (Formula 6)

These results locate the gaze point accurately; finally the phone issues commands to the focusing mechanism according to the coordinates of the fixation point, achieving the focusing objective.
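Formulas 4–6 can be sketched as a fit-then-predict step. This is an illustrative sketch under assumed data: in practice the coefficients a0…a5 and b0…b5 would come from a calibration pass in which the user fixates known screen targets; here a known synthetic quadratic map stands in for that calibration.

```python
import numpy as np

def poly_features(dx, dy):
    """Feature vector of Formulas 5/6: [1, dx, dy, dx*dy, dx^2, dy^2]."""
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])

def fit_gaze_map(dx, dy, Xs, Ys):
    """Fit coefficients a0..a5 and b0..b5 from calibration samples."""
    M = poly_features(dx, dy)
    a = np.linalg.lstsq(M, np.asarray(Xs, float), rcond=None)[0]
    b = np.linalg.lstsq(M, np.asarray(Ys, float), rcond=None)[0]
    return a, b

def gaze_point(a, b, dx, dy):
    """Apply Formulas 5 and 6 to one offset (dx, dy) = (Xp-Xg, Yp-Yg)."""
    M = poly_features([dx], [dy])
    return (M @ a).item(), (M @ b).item()

# Synthetic calibration data generated from an assumed quadratic map.
rng = np.random.default_rng(0)
dx = rng.uniform(-1, 1, 30)
dy = rng.uniform(-1, 1, 30)
Xs = 5 + 2 * dx + 0.5 * dy + 0.3 * dx * dy + 0.1 * dx**2
Ys = -3 + 1.5 * dy + 0.2 * dy**2
a, b = fit_gaze_map(dx, dy, Xs, Ys)
print(gaze_point(a, b, 0.0, 0.0))   # should recover roughly (5.0, -3.0)
```

Thirty samples over-determine the six coefficients per axis, so the least-squares fit also tolerates the measurement noise a real calibration would contain.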
Under non-ideal shooting conditions, the spacing between the front camera 05 and the eyeball does shake front-to-back or laterally; the objective fact that the spots' relative coordinates in the front-camera reference frame remain unchanged is used to eliminate this.
As shown in Figure 3, the method uses, in the front-camera reference frame, the projection point O of the first spot on the second-third spot line together with the second and third spots. When the head moves forward or backward, the change in the spacing between the second and third spot centres serves as one correction parameter, eliminating the interference that longitudinal camera-eye shake causes to the line-of-sight vector. When the head moves horizontally, the change in the ratio of the second-spot-to-projection distance to the third-spot-to-projection distance serves as the other correction parameter, eliminating the interference that horizontal camera-eye shake causes to the line-of-sight vector.
Beyond overcoming distance shake, the gray values of the spots in the front-camera reference frame can further serve as correction values, giving the central processing unit 07 a control reference for adjusting the lens aperture of the front camera 05, increasing the camera's luminous flux and improving the sharpness of the image captured in the front-camera reference frame.
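The two correction parameters of S210/S220 can be sketched as follows, under assumptions: the baseline spacing would come from calibration, the specific normalisation (dividing the offset by the spot-spacing scale) is one plausible reading of the patent's description rather than its stated algorithm, and all names and values are illustrative.

```python
import numpy as np

def shake_corrected_offsets(p2, p3, o, p, baseline_spacing):
    """Normalise the gaze offset against head shake (sketch of S210/S220).

    p2, p3 : second and third spot centres in the front-camera frame
    o      : projection of the first spot onto the p2-p3 line
    p      : pupil centre
    S210   : the p2-p3 spacing scales with camera-eye distance, so rescaling
             by it removes longitudinal (front-back) shake.
    S220   : the ratio |p2-o| / |p3-o| shifts with lateral head motion and
             serves as the second correction parameter.
    """
    p2, p3, o, p = (np.asarray(v, float) for v in (p2, p3, o, p))
    spacing = np.linalg.norm(p3 - p2)
    scale = baseline_spacing / spacing        # first correction parameter (S210)
    lateral_ratio = np.linalg.norm(p2 - o) / np.linalg.norm(p3 - o)  # S220
    d = (p - o) * scale                       # distance-normalised offset
    return d[0], d[1], lateral_ratio

# Head twice as far from the phone as at calibration: spots appear half as far
# apart, but the corrected offset matches the calibration geometry.
dx, dy, ratio = shake_corrected_offsets(
    p2=(0, 0), p3=(50, 0), o=(25, 0), p=(28, 6), baseline_spacing=100.0)
print(dx, dy, round(ratio, 3))
```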
As shown in Figure 4, in practice, once the phone starts the shooting process, continuous zooming proceeds mainly through the following steps:
S100: the central processing unit 07 establishes reference frames for the display screen 03, the front camera 05 and the rear camera 04, and forms the mapping-relation model;
S110: the central processing unit 07 activates the second marker device at one end of the top of the display screen 03 to produce a local luminance difference, the second spot, and determines the second spot's coordinate position in the display-screen 03 reference frame;
S120: the central processing unit 07 activates the third marker device at the other end of the top of the display screen 03 to produce a local luminance difference, the third spot, and determines the third spot's coordinate position in the display-screen 03 reference frame;
S130: the central processing unit 07 activates the first marker device at the upper edge of the display screen 03 to produce a local luminance difference, corrects its coordinate position in the display-screen 03 reference frame according to the physical relative-position parameters of the front camera 05 and the display screen 03, and takes the coordinate position of the front camera 05 in the display-screen 03 reference frame as the first spot's coordinate position;
S140: the central processing unit 07 starts the rear camera 04 to capture the scene to be shot and show it on the display screen 03, and starts the front camera 05 to capture the phone holder's facial picture;
S150: the central processing unit 07 extracts the eyeball coordinate region, in the front-camera 05 reference frame, from the facial picture;
S160: the central processing unit 07 determines, within the eyeball region, the front-camera-frame coordinate regions of the first, second and third spots;
S170: the central processing unit 07 determines the projection coordinate of the first spot's centre, i.e. its projection, in the front-camera reference frame, onto the line joining the centres of the second and third spot regions;
S180: the central processing unit 07 extracts the pupil coordinate region from the eyeball region;
S190: the central processing unit 07 extracts the pupil-center coordinates from the pupil region;
S200: from the projection coordinate and the pupil-center coordinates, the central processing unit 07 forms the line-of-sight vector in the front-camera reference frame, which reflects the corresponding visual-focus coordinates in the display-screen reference frame;
S210: the central processing unit 07 uses the change in the spacing between the second and third spot centres in the front-camera reference frame as the first correction parameter, eliminating the interference of longitudinal camera-eye shake with the line-of-sight vector;
S220: the central processing unit 07 uses the change in the ratio of the second-spot-centre-to-projection distance to the third-spot-centre-to-projection distance as the second correction parameter, eliminating the interference of horizontal camera-eye shake with the line-of-sight vector;
S230: the central processing unit 07 converts the gaze position coordinates in the display-screen reference frame into the corresponding visual-focus coordinates in the rear-camera reference frame, and converts that scene-position coordinate into the corresponding focusing control signal for the controller of the rear camera's zoom lens; the flow then returns to step S150.
The focusing process above continues until the photographer, via the touch screen 02, feeds back control data that stop continuous focusing, whereupon the central processing unit 07 issues the corresponding stop or abort command.
The embodiment above describes a preferred implementation of the invention and does not limit its scope; any variations and improvements that those of ordinary skill in the art make to the technical scheme of the invention, without departing from the spirit of its design, shall fall within the scope of protection determined by the claims of the invention.
Claims (9)
1. A mobile-phone shooting focusing method, executed on a smartphone having a front camera and a rear camera, characterized in that, during shooting with the rear camera, it comprises the following steps:
Step 1: capturing through the front camera the position coordinates, in the front-camera reference frame, of the pupil center of the phone holder while the holder watches the visual focus on the display screen;
Step 2: the central processing unit converting the pupil-center position coordinates into the position coordinates of the visual focus in the rear-camera reference frame;
Step 3: the central processing unit sending a corresponding zoom control signal to the rear camera according to the change in the visual-focus position coordinates, changing the focusing position of the rear camera;
repeating steps 1 to 3.
2. The mobile-phone shooting focusing method according to claim 1, characterized in that step 1 comprises the following steps:
Step a: the central processing unit (07) establishing reference frames for the display screen (03), the front camera (05) and the rear camera (04), and forming a mapping-relation model;
Step b: the central processing unit (07) using the marker devices on the smartphone to establish the datum for coordinate conversion between the display-screen reference frame and the front-camera reference frame;
Step c: the front camera (05) capturing an eyeball image, and the central processing unit (07) determining the position coordinates of the marker devices in the front-camera reference frame, then the positions of the pupil and the pupil center.
3. The mobile-phone shooting focusing method according to claim 2, characterized in that step 2 comprises the following steps:
Step d: the central processing unit (07) determining, in the front-camera reference frame, the vector relation between the pupil center and the marker devices;
Step e: the central processing unit (07) converting that vector relation, through the mapping-relation model, into the coordinate position of the corresponding visual focus in the rear-camera (04) reference frame.
4. The mobile-phone shooting focusing method according to claim 3, characterized in that the marker device (06) is an LED light source at the edge of the display screen (03) or a luminous ring around the front camera (05), or a retroreflector at the corresponding position; or a specific luminous or blinking pattern formed by the display matrix at the centre and the two ends of the upper edge of the display screen (03).
5. The mobile-phone shooting focusing method according to claim 3, characterized in that step b of claim 2 comprises the following steps:
S110: the central processing unit (07) activating the second marker device at one end of the top of the display screen (03) to produce a local luminance difference as the second spot, and determining the second spot's coordinate position in the display-screen (03) reference frame;
S120: the central processing unit (07) activating the third marker device at the other end of the top of the display screen (03) to produce a local luminance difference as the third spot, and determining the third spot's coordinate position in the display-screen (03) reference frame;
S130: the central processing unit (07) activating the first marker device at the upper edge of the display screen (03) to produce a local luminance difference, correcting its coordinate position in the display-screen (03) reference frame according to the physical relative-position parameters of the front camera (05) and the display screen (03), and taking the coordinate position of the front camera (05) in the display-screen (03) reference frame as the first spot's coordinate position.
6. The mobile-phone shooting focusing method according to claim 5, characterized in that step c of claim 2 comprises the following steps:
S150: the central processing unit (07) extracting the eyeball coordinate region, in the front-camera (05) reference frame, from the facial picture;
S160: the central processing unit (07) determining, within the eyeball coordinate region, the front-camera-frame coordinate regions of the first, second and third spots;
S180: the central processing unit (07) extracting the pupil coordinate region from the eyeball coordinate region;
S190: the central processing unit (07) extracting the pupil-center coordinates from the pupil coordinate region.
7. The mobile-phone shooting focusing method according to claim 6, characterized in that step d of claim 3 comprises the following steps:
S170: the central processing unit (07) determining the projection coordinate of the first spot's centre, i.e. its projection, in the front-camera reference frame, onto the line joining the centres of the second and third spot regions;
S200: the central processing unit (07) forming, from the projection coordinate and the pupil-center coordinates, the line-of-sight vector in the front-camera reference frame, used to reflect the corresponding visual-focus position coordinates in the display-screen reference frame.
8. The mobile-phone shooting focusing method according to claim 7, characterized in that step e of claim 3 comprises the following step:
the central processing unit (07) converting the corresponding gaze position coordinates in the display-screen reference frame into the corresponding visual-focus position coordinates in the rear-camera reference frame.
9. The mobile-phone shooting focusing method according to claim 8, characterized in that step e of claim 3 further comprises the following steps:
S210: the central processing unit (07) using the change in the spacing between the centres of the second and third spots in the front-camera reference frame as the first correction parameter, eliminating the interference caused to the line-of-sight vector by longitudinal shake of the distance between the front camera and the eyeball;
S220: the central processing unit (07) using the change in the ratio of the second-spot-centre-to-projection distance to the third-spot-centre-to-projection distance in the front-camera reference frame as the second correction parameter, eliminating the interference caused to the line-of-sight vector by horizontal shake between the front camera and the eyeball.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410018564.8A CN103747183B (en) | 2014-01-15 | 2014-01-15 | Mobile phone shooting focusing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103747183A true CN103747183A (en) | 2014-04-23 |
CN103747183B CN103747183B (en) | 2017-02-15 |
Family
ID=50504169
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410018564.8A Expired - Fee Related CN103747183B (en) | 2014-01-15 | 2014-01-15 | Mobile phone shooting focusing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103747183B (en) |
- 2014-01-15: CN CN201410018564.8A patent/CN103747183B/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2447807A1 (en) * | 2010-10-27 | 2012-05-02 | Sony Ericsson Mobile Communications AB | Loading of data to an electronic device |
CN103063314A (en) * | 2012-01-12 | 2013-04-24 | 杭州美盛红外光电技术有限公司 | Thermal imaging device and thermal imaging shooting method |
CN103248822A (en) * | 2013-03-29 | 2013-08-14 | 东莞宇龙通信科技有限公司 | Focusing method of camera shooting terminal and camera shooting terminal |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9661215B2 (en) | 2014-04-22 | 2017-05-23 | Snapaid Ltd. | System and method for controlling a camera based on processing an image captured by other camera |
WO2015162605A2 (en) | 2014-04-22 | 2015-10-29 | Snapaid Ltd | System and method for controlling a camera based on processing an image captured by other camera |
US9866748B2 (en) | 2014-04-22 | 2018-01-09 | Snap-Aid Patents Ltd. | System and method for controlling a camera based on processing an image captured by other camera |
EP4250738A2 (en) | 2014-04-22 | 2023-09-27 | Snap-Aid Patents Ltd. | Method for controlling a camera based on processing an image captured by other camera |
CN104092935B (en) * | 2014-06-05 | 2018-06-26 | 西安中兴新软件有限责任公司 | A kind for the treatment of method and apparatus of image taking |
CN104092935A (en) * | 2014-06-05 | 2014-10-08 | 西安中兴新软件有限责任公司 | Method and device for processing image shooting |
CN104238239A (en) * | 2014-09-30 | 2014-12-24 | 西安电子科技大学 | System and method for focusing cameras on basis of vision drop points |
CN104238239B (en) * | 2014-09-30 | 2017-01-04 | 西安电子科技大学 | A kind of camera focusing system based on sight line drop point and method |
WO2016101481A1 (en) * | 2014-12-26 | 2016-06-30 | 小米科技有限责任公司 | Automatic focusing method and device |
US9729775B2 (en) | 2014-12-26 | 2017-08-08 | Xiaomi Inc. | Auto-focusing method and auto-focusing device |
US10594916B2 (en) | 2015-04-27 | 2020-03-17 | Snap-Aid Patents Ltd. | Estimating and using relative head pose and camera field-of-view |
US11019246B2 (en) | 2015-04-27 | 2021-05-25 | Snap-Aid Patents Ltd. | Estimating and using relative head pose and camera field-of-view |
US10419655B2 (en) | 2015-04-27 | 2019-09-17 | Snap-Aid Patents Ltd. | Estimating and using relative head pose and camera field-of-view |
CN105700786A (en) * | 2015-12-30 | 2016-06-22 | 联想(北京)有限公司 | Information processing method and apparatus, and electronic device |
US11906290B2 (en) | 2016-03-04 | 2024-02-20 | May Patents Ltd. | Method and apparatus for cooperative usage of multiple distance meters |
US11255663B2 (en) | 2016-03-04 | 2022-02-22 | May Patents Ltd. | Method and apparatus for cooperative usage of multiple distance meters |
CN105892685A (en) * | 2016-04-29 | 2016-08-24 | 广东小天才科技有限公司 | Question searching method and device of intelligent equipment |
CN105892685B (en) * | 2016-04-29 | 2019-02-15 | 广东小天才科技有限公司 | Question searching method and device of intelligent equipment |
CN106231178A (en) * | 2016-07-22 | 2016-12-14 | 维沃移动通信有限公司 | A kind of self-timer method and mobile terminal |
CN106231178B (en) * | 2016-07-22 | 2019-07-26 | 维沃移动通信有限公司 | A kind of self-timer method and mobile terminal |
CN106454123B (en) * | 2016-11-25 | 2019-02-22 | 盐城丝凯文化传播有限公司 | A kind of method and mobile terminal of focusing of taking pictures |
CN106454123A (en) * | 2016-11-25 | 2017-02-22 | 滁州昭阳电信通讯设备科技有限公司 | Shooting focusing method and mobile terminal |
CN107145086B (en) * | 2017-05-17 | 2023-06-16 | 上海青研科技有限公司 | Calibration-free sight tracking device and method |
CN107145086A (en) * | 2017-05-17 | 2017-09-08 | 上海青研科技有限公司 | A kind of Eye-controlling focus device and method for exempting from calibration |
CN110809115A (en) * | 2019-10-31 | 2020-02-18 | 维沃移动通信有限公司 | Shooting method and electronic equipment |
CN112804504A (en) * | 2020-12-31 | 2021-05-14 | 成都极米科技股份有限公司 | Image quality adjusting method, image quality adjusting device, projector and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN103747183B (en) | 2017-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103747183B (en) | Mobile phone shooting focusing method | |
JP5136669B2 (en) | Image processing apparatus, image processing method, and program | |
US9867532B2 (en) | System for detecting optical parameter of eye, and method for detecting optical parameter of eye | |
JP3829773B2 (en) | Imaging apparatus and centering information acquisition method | |
CN108111731A (en) | A kind of camera module | |
WO2015035823A1 (en) | Image collection with increased accuracy | |
CN107483791A (en) | A kind of multi-cam module | |
CN109725423B (en) | Method for automatically adjusting brightness of monocular AR (augmented reality) glasses and storage medium | |
CN103475805A (en) | Active range focusing system and active range focusing method | |
KR20170011362A (en) | Imaging apparatus and method for the same | |
CN113504692B (en) | Camera shooting and projection integrated module and control method thereof | |
CN104000555A (en) | Ocular fundus information acquisition device, method and program | |
CN110062143A (en) | Camera module, photographing equipment and photographing method | |
JP2013017218A (en) | Image processing device, image processing method, and program | |
CN203012315U (en) | Device for realizing phase focusing | |
JP2016200629A (en) | Image pickup apparatus, and control method, and program for the same | |
US10248859B2 (en) | View finder apparatus and method of operating the same | |
JP5153021B2 (en) | Imaging apparatus, imaging method, and program | |
US20220329740A1 (en) | Electronic apparatus, method for controlling electronic apparatus, and non-transitory computer readable storage medium | |
US11822714B2 (en) | Electronic device and control method for capturing an image of an eye | |
JP2013205675A (en) | Imaging apparatus | |
US20230136191A1 (en) | Image capturing system and method for adjusting focus | |
CN110971814A (en) | Shooting adjustment method and device, electronic equipment and storage medium | |
JPWO2019065820A1 (en) | Imaging device and its control method and control program | |
CN113973171B (en) | Multi-camera shooting module, camera shooting system, electronic equipment and imaging method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | | Granted publication date: 20170215; Termination date: 20210115 |