CN104238239B - Camera focusing system and method based on gaze point (sight line drop point) - Google Patents
- Publication number: CN104238239B (application CN201410522265.8A)
- Authority: CN (China)
- Prior art keywords: user, sight line, focusing, drop point, circuit
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention discloses a gaze-point-based camera focusing system and method. The system comprises an image transmitter, a gaze-point locating system, a focusing system and an AF circuit. The method includes: calibrating and identifying the user's gaze point; acquiring a single-eye image of the user through the viewfinder; computing, from the model generated during the user's calibration, the gaze landing position on the viewfinder corresponding to the single-eye image; and controlling that landing position through the shutter button so as to focus on the subject. The autofocus method provided by the embodiments of the invention achieves per-user gaze-point calibration and uses the shutter button together with the line of sight to locate the focus point and focus, which is flexible and matches user habits.
Description
Technical field
The present invention relates to the technical field of focusing, and in particular to a gaze-point-based camera focusing system and method.
Background art
With the popularization of cameras, focusing technology, which is closely tied to imaging, has developed rapidly. Autofocus systems are now standard on high-end cameras such as SLR and mirrorless models. The focusing principle of a camera with autofocus is roughly as follows: a sensor on the camera gathers light information, the camera's processor computes the focus, and a focusing circuit then drives the lens group to focus.
When shooting with an autofocus camera, the user often wants the subject at a particular position within the viewfinder's framing area. Existing autofocus cameras require the user to set the focus point in advance and then focus by half-pressing the shutter button while shooting. Cameras also offer other autofocus functions, for example determining the focus point from contours. But neither mode can always adjust the focus point in real time: the former is cumbersome and easily misses the moment, while the latter often fails to select the focus point the user actually wants.
Existing camera viewfinders generally come in two kinds: display-screen viewfinders and eyepiece viewfinders. Display-screen viewfinders are common on simple shooting devices such as compact cameras and mobile phones, while eyepiece viewfinders are used on more professional equipment such as SLR and mirrorless cameras. In general, a display-screen viewfinder is simple to operate: the focus point can be chosen with a finger. SLR cameras and the like rely on an eyepiece viewfinder, which is convenient in use but requires multiple steps to set the focus point.
Modern advanced fighter aircraft track the pilot's eye movement in real time so that the weapon system follows the position the pilot is aiming at: once the pilot finds a target, aiming at it and issuing the attack order completes the attack. However, continuously servoing the focus in this way would overload the power supply of a portable camera and make its standby time too short, which does not meet the requirements of portable use. Moreover, before a pilot can attack, the aircraft requires detailed setup and parameter entry for that specific pilot; given the many people who use cameras, such a fighter-style autofocus system and method is unsuitable for a civilian camera autofocus system and method.
Summary of the invention
In view of how cumbersome and imprecise focus-point selection is on cameras with eyepiece viewfinders, the present invention provides a single-eye autofocus system and method for eyepiece viewfinders, allowing the camera user, while shooting, to focus precisely on the intended subject in real time, easily and flexibly, according to the shutter-button state and the user's current need.
To achieve this, the present invention adopts the following technical solution:
A gaze-point-based camera focusing system, the system comprising:
an image transmitter, which sends the scene to be captured, received through the lens group, to the user's eye via a circuit or an optical path;
a gaze-point locating system, which acquires the user's single-eye image, performs calibration and information storage, and determines the focus point from the single-eye image during the normal-use stage;
a focusing system, which controls the AF circuit according to the focus point and thereby controls the lens group;
an AF circuit, which is controlled by the focusing circuit and controls the lens group.
The image transmitter receives the scene from the lens group and transmits it through the viewfinder, via a circuit or an optical path, into the user's line of sight; after calibration and information storage by the gaze-point locating system, and after the focus point has been determined from the single-eye image, the scene is focused by the focusing system, which acts through the AF circuit on the lens group, forming a loop.
Further, the gaze-point locating system comprises a single-eye image acquirer, an information processor and an output controller.
Correspondingly, the present invention provides a gaze-point-based camera focusing method, comprising the following steps:
1) the scene enters the camera through the lens group;
2) the scene from the lens group is transmitted by the image transmitter to the viewfinder and into the user's eye;
3) the user gazes at the desired focus position in the viewfinder's framing area and half-presses the shutter; the gaze-point locating system acquires the current user's single-eye image;
4) using the user's single-eye image and the model produced by the user's calibration process, the gaze-point locating system computes the user's gaze landing point from the current single-eye image;
5) from the user's gaze landing point, the gaze-point locating system determines the focus point, directs the focusing system to focus on it, and controls the AF circuit;
6) the AF circuit controls the lens group to form the image; the focused scene is transmitted by the image transmitter into the user's eye, and the user sees the focused image;
7) the user judges the focused image: if the user decides to shoot, the shutter button is pressed and the shot is taken at the current focal plane; if the shutter button is held down and continuous shooting is possible, the gaze-point locating system collects single-eye image information in real time to determine the focus point, and the focusing system and AF circuit fine-tune the lens group's focal plane in real time until the shutter button is released; if the shutter button is released, the AF circuit holds the current focal plane.
Further, for a first use or when the user triggers the calibration function, the method calibrates the user and builds a model, and then, once the user enters the normal-use stage, computes the landing position from that model; the user may also request gaze-point calibration at any time.
Further, for a first use or when the user triggers the calibration function, the concrete method includes:
1) iris detection to judge whether this is the user's first use of the method; if so, go to step 2), otherwise go to step 6);
2) the viewfinder's framing area displays several crosshairs, highlighted in turn; the user aims at each and the information is recorded;
3) the user's gaze-point model is computed and saved;
4) the viewfinder's information display area shows a calibration-success message, which disappears after a few seconds; go to step 6);
5) if the user needs calibration, go to step 2), otherwise go to step 6);
6) the user enters the normal-use stage.
Further, the normal-use stage is the process in which, after the user's information has been determined, the user cycles through framing, adjusting and shooting.
Further, the model produced by the user's calibration process is realized by the following method:
1) the single-eye image acquirer captures the user's single-eye image while the user gazes at a crosshair whose gaze landing coordinates on the viewfinder are (x0, y0);
2) from the image captured by the single-eye image acquirer, the information processor extracts the pupil-center coordinates (x1, y1) and the iris-center coordinates (x2, y2) in that image;
3) with the pupil-center coordinates (x1, y1) and iris-center coordinates (x2, y2) as the sample input and (x0, y0) as the sample output, the above steps are repeated; after multiple samples have been collected, a neural network is built and trained on them, yielding a neural network model with the pupil-center coordinates (x1, y1) and iris-center coordinates (x2, y2) as input and the user's gaze landing coordinates (x0, y0) as output.
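The patent does not disclose the network topology, activation function or training rule, so the following is only a minimal sketch of this calibration step under generic assumptions: a one-hidden-layer tanh network fitted by batch gradient descent, with all names (`train_gaze_model`, `predict_gaze`, the bias terms, the input normalization) our own.

```python
import numpy as np

def train_gaze_model(samples, n_hidden=8, lr=0.1, epochs=4000, seed=0):
    """Fit a one-hidden-layer network mapping (pupil, iris) center
    coordinates (x1, y1, x2, y2) to viewfinder gaze coordinates (x0, y0).
    samples: iterable of ((x1, y1, x2, y2), (x0, y0)) calibration pairs."""
    rng = np.random.default_rng(seed)
    X = np.array([s[0] for s in samples], float)      # (N, 4) eye features
    Y = np.array([s[1] for s in samples], float)      # (N, 2) gaze targets
    mu, sd = X.mean(0), X.std(0) + 1e-9               # normalize for stability
    Xn = (X - mu) / sd
    A = rng.normal(0.0, 0.5, (4, n_hidden))           # input -> hidden weights
    a0 = np.zeros(n_hidden)                           # hidden biases
    B = rng.normal(0.0, 0.5, (n_hidden, 2))           # hidden -> output weights
    b0 = np.zeros(2)                                  # output biases
    n = len(X)
    for _ in range(epochs):
        H = np.tanh(Xn @ A + a0)                      # hidden activations
        E = (H @ B + b0) - Y                          # prediction error
        dH = (E @ B.T) * (1.0 - H ** 2)               # backprop through tanh
        B -= lr * (H.T @ E) / n
        b0 -= lr * E.mean(0)
        A -= lr * (Xn.T @ dH) / n
        a0 -= lr * dH.mean(0)
    return {"A": A, "a0": a0, "B": B, "b0": b0, "mu": mu, "sd": sd}

def predict_gaze(m, eye_features):
    """Map rows of (x1, y1, x2, y2) to predicted (x0, y0) gaze points."""
    Xn = (np.asarray(eye_features, float) - m["mu"]) / m["sd"]
    return np.tanh(Xn @ m["A"] + m["a0"]) @ m["B"] + m["b0"]
```

In use, each highlighted crosshair contributes one sample, and `predict_gaze` is what the normal-use stage would call on every half-press.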
Further, computing the user's gaze landing point from the current single-eye image is realized by the following method:
1) the single-eye image acquirer captures the user's single-eye image during normal use;
2) from the current user's single-eye image, the information processor computes the pupil-center coordinates (x'1, y'1) and the iris-center coordinates (x'2, y'2);
3) using the neural network model generated by the user's calibration training, the information processor takes (x'1, y'1) and (x'2, y'2) as input and applies the trained inter-neuron weights: if during training the weights from the pupil-center and iris-center coordinates to the i-th neuron are a1i, a2i, a3i, a4i, and the weights from the i-th neuron to the horizontal and vertical gaze coordinates are bi1, bi2, then the current gaze landing coordinates (x'3, y'3) are given by the network's output.
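The formula itself appears to have been an image in the original and did not survive extraction. Given the naming above — a1i…a4i the trained weights from the four inputs to the i-th neuron, and bi1, bi2 its weights to the two output coordinates — a standard single-hidden-layer forward pass consistent with those names would be (the activation f and the summation form are our reconstruction, not the patent's disclosed formula):

```latex
x'_3 = \sum_i b_{i1}\, f\big(a_{1i}x'_1 + a_{2i}y'_1 + a_{3i}x'_2 + a_{4i}y'_2\big),
\qquad
y'_3 = \sum_i b_{i2}\, f\big(a_{1i}x'_1 + a_{2i}y'_1 + a_{3i}x'_2 + a_{4i}y'_2\big)
```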
Further, the user's judgment from the focused image is controlled through the shutter button by the following method:
1) when the shutter button is released, the AF circuit holds the current focal plane for a period of time;
2) when the shutter button is fully pressed and the gaze falls within the framing area, the AF circuit fine-tunes the focal plane in real time according to the user's gaze landing point;
3) when the shutter button is half-pressed and the gaze falls within the framing area, the AF circuit focuses according to the user's gaze and then holds the current focal plane until the shutter button is released;
4) when the shutter button is half-pressed and the gaze does not fall within the framing area, the AF circuit holds the current focal plane for a period of time;
5) when the shutter button is fully pressed and the gaze does not fall within the framing area, the AF circuit holds the current focal plane for a period of time.
Further, when the length of time the AF circuit holds the current focal plane is set to "fast", it is 0.01-0.1 s; when it is set to "slow", it is 0.1-0.3 s.
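The five cases reduce to a small decision table keyed on the shutter state and on whether the gaze lies in the framing area. A minimal sketch follows; the enum values, action names and the `hold_s` parameter are ours, not the patent's:

```python
from enum import Enum

class Shutter(Enum):
    RELEASED = "released"
    HALF = "half-pressed"
    FULL = "pressed"

def af_action(shutter: Shutter, gaze_in_frame: bool, hold_s: float = 0.1):
    """Return the AF-circuit action for one (shutter, gaze) observation.
    hold_s is the user-set hold time: 'fast' is 0.01-0.1 s, 'slow' 0.1-0.3 s."""
    if shutter is Shutter.RELEASED:
        return ("hold_focal_plane", hold_s)       # case 1
    if not gaze_in_frame:
        return ("hold_focal_plane", hold_s)       # cases 4 and 5
    if shutter is Shutter.FULL:
        return ("track_gaze_realtime", None)      # case 2: fine-tune in real time
    return ("focus_once_then_hold", None)         # case 3: half-press in frame
```

Collapsing cases 1, 4 and 5 into a single hold action mirrors the text, since the AF circuit behaves identically in all three.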
By calibrating and identifying the user's gaze point, the present invention builds a per-user single-eye gaze-point locating model, making gaze-point locating more accurate than focusing at a preset focus point or at the edges of the current scene; computing the gaze landing position on the viewfinder from the model generated at calibration makes gaze-point computation faster during the normal-use stage; and controlling the landing position through the shutter button focuses on the subject. The autofocus method provided by the embodiments of the invention achieves per-user gaze-point calibration, giving higher focusing precision across different users, and uses the shutter button and the line of sight to locate the focus point and focus, which is flexible and matches user habits.
Brief description of the drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is the hardware architecture diagram of a gaze-point-based camera focusing system of the present invention;
Fig. 2 is the flow chart of a gaze-point-based camera focusing method of the present invention;
Fig. 3 is the model structure diagram of a gaze-point-based camera focusing method of the present invention;
Fig. 4 is the focus-point control flow chart of a gaze-point-based camera of the present invention;
Fig. 5 is one viewfinder field-of-view image of a gaze-point-based camera focusing system of the present invention;
Fig. 6 is another viewfinder field-of-view image of a gaze-point-based camera focusing system of the present invention;
Fig. 7 is the gaze-point computation schematic of a gaze-point-based camera focusing method of the present invention;
Fig. 8 is the viewfinder-eyepiece structure diagram of the gaze-point-based camera focusing system of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
Fig. 1 is the hardware architecture diagram of a gaze-point-based camera focusing system of the present invention. Referring to Fig. 1, the system includes: an image transmitter, which sends the scene to be captured, received through the lens group, to the user's eye via a circuit or an optical path; a gaze-point locating system, which acquires the user's single-eye image, performs calibration and information storage, and determines the focus point from the single-eye image during the normal-use stage; a focusing system, which controls the AF circuit according to the focus point and thereby controls the lens group; and an AF circuit, controlled by the focusing circuit, which controls the lens group. The image transmitter receives the scene from the lens group and transmits it through the viewfinder, via a circuit or an optical path, into the user's line of sight; after calibration and information storage by the gaze-point locating system, and after the focus point has been determined from the single-eye image, the scene is focused by the focusing system, which acts through the AF circuit on the lens group, forming a loop.
The gaze-point locating system comprises a single-eye image acquirer, an information processor and an output controller.
For optical eyepiece viewfinders, including those of SLR cameras, the process of acquiring the image and determining the focus point is specifically:
The scene enters the camera through the lens group;
the scene from the lens group is transferred by the image transmitter (optically, via a reflex mirror to a pentaprism; electronically, via a photosensitive element to a display screen) to the viewfinder and into the user's eye;
the user gazes at the desired focus position in the viewfinder's framing area and half-presses the shutter; the gaze-point locating system acquires the current user's single-eye image;
using the user's single-eye image and the model produced by the user's calibration process, the gaze-point locating system computes the user's gaze landing point from the current single-eye image;
from the user's gaze landing point, the gaze-point locating system determines the focus point, directs the focusing system to focus on it, and controls the AF circuit;
the AF circuit controls the lens group to form the image; the focused scene is transmitted by the image transmitter (optically, via a reflex mirror to a pentaprism; electronically, via a photosensitive element to a display screen) into the user's eye, and the user sees the focused image;
the user judges the focused image: if the user decides to shoot, the shutter button can be pressed, and the shot is taken at the current focal plane. If the shutter button is held down and continuous shooting is possible, the gaze-point locating system collects single-eye image information in real time to determine the focus point, and the focusing system and AF circuit fine-tune the lens group's focal plane in real time until the shutter button is released. If the shutter button is released, the AF circuit holds the current focal plane.
For a first use or when the user triggers the calibration function, the user can be calibrated and a model built; after the user enters the normal-use stage, the landing position is computed from that model. The user may also request gaze-point calibration at any time.
As shown in Fig. 2, for a first use or when the user triggers the calibration function, the concrete method includes:
1) iris detection to judge whether this is the user's first use of the method; if so, go to step 2), otherwise go to step 6);
2) the viewfinder's framing area displays several crosshairs, highlighted in turn; the user aims at each and the information is recorded;
3) the user's gaze-point model is computed and saved;
4) the viewfinder's information display area shows a calibration-success message, which disappears after a few seconds; go to step 6);
5) if the user needs calibration, go to step 2), otherwise go to step 6);
6) the user enters the normal-use stage. The normal-use stage is the process in which, after the user's information has been determined, the user cycles through framing, adjusting and shooting.
Fig. 3 shows the model produced by the user's calibration process, realized by the following method:
1) the single-eye image acquirer captures the user's single-eye image while the user gazes at a crosshair whose gaze landing coordinates on the viewfinder are (x0, y0);
2) from the image captured by the single-eye image acquirer, the information processor extracts the pupil-center coordinates (x1, y1) and the iris-center coordinates (x2, y2) in that image;
3) with the pupil-center coordinates (x1, y1) and iris-center coordinates (x2, y2) as the sample input and (x0, y0) as the sample output, the above steps are repeated; after multiple samples have been collected, a neural network is built and trained on them, yielding a neural network model with the pupil-center coordinates (x1, y1) and iris-center coordinates (x2, y2) as input and the user's gaze landing coordinates (x0, y0) as output.
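The patent does not say how the information processor segments the pupil and iris. As one illustration of the idea only: the pupil is usually the darkest region and the iris the larger region darker than the sclera, so two intensity thresholds and a centroid give a crude estimate. The thresholds and function names below are assumptions for this sketch, not the patent's algorithm.

```python
import numpy as np

def dark_region_center(gray, thresh):
    """Centroid (x, y) of all pixels darker than thresh, or None if none."""
    ys, xs = np.nonzero(gray < thresh)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def eye_centers(gray, pupil_thresh=40, iris_thresh=90):
    """Return ((x1, y1), (x2, y2)): pupil-center and iris-center estimates.
    The pupil is the darkest blob; the iris threshold also admits the pupil
    pixels, which is acceptable when the pupil sits inside the iris disk."""
    return (dark_region_center(gray, pupil_thresh),
            dark_region_center(gray, iris_thresh))
```

A production system would use robust ellipse fitting rather than plain centroids, but the two coordinate pairs produced here are exactly the (x1, y1), (x2, y2) inputs the calibration step requires.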
In the method of the invention, computing the user's gaze landing point from the current single-eye image is realized as follows:
1) the single-eye image acquirer captures the user's single-eye image during normal use;
2) from the current user's single-eye image, the information processor computes the pupil-center coordinates (x'1, y'1) and the iris-center coordinates (x'2, y'2);
3) using the neural network model generated by the user's calibration training, the information processor takes (x'1, y'1) and (x'2, y'2) as input and applies the trained inter-neuron weights: if during training the weights from the pupil-center and iris-center coordinates to the i-th neuron are a1i, a2i, a3i, a4i, and the weights from the i-th neuron to the horizontal and vertical gaze coordinates are bi1, bi2, then the current gaze landing coordinates (x'3, y'3) are given by the network's output.
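As a concrete reading of step 3), the weight naming maps directly onto a matrix forward pass. The sketch below assumes (our assumption — the original formula was an image and is missing from this text) a plain single-hidden-layer network with activation f and no bias terms:

```python
import numpy as np

def gaze_forward(a, b, pupil, iris, f=np.tanh):
    """Compute (x'3, y'3) from trained weights.
    a[j, i]: weight from input j (inputs ordered x'1, y'1, x'2, y'2) to
    hidden neuron i, i.e. column i holds a1i..a4i;
    b[i, k]: weight from neuron i to output k, i.e. row i holds bi1, bi2."""
    x = np.array([pupil[0], pupil[1], iris[0], iris[1]], dtype=float)
    h = f(x @ a)           # response of each hidden neuron
    return h @ b           # (x'3, y'3)
```

With the weights produced at calibration time, this is the only computation needed per frame, which is consistent with the patent's claim that gaze-point computation is fast during the normal-use stage.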
As shown in Fig. 4, the shutter-button state is judged and divided into pressed, released and half-pressed. Combined with whether the user's gaze landing point lies in the framing area, this gives the following five cases:
1) when the shutter button is released, the AF circuit holds the current focal plane for a period of time;
2) when the shutter button is fully pressed and the gaze falls within the framing area, the AF circuit fine-tunes the focal plane in real time according to the user's gaze landing point;
3) when the shutter button is half-pressed and the gaze falls within the framing area, the AF circuit focuses according to the user's gaze and then holds the current focal plane until the shutter button is released;
4) when the shutter button is half-pressed and the gaze does not fall within the framing area, the AF circuit holds the current focal plane for a period of time;
5) when the shutter button is fully pressed and the gaze does not fall within the framing area, the AF circuit holds the current focal plane for a period of time. The length of time the AF circuit holds the current focal plane can be set by the user: set to "fast", it is 0.01-0.1 s; set to "slow", it is 0.1-0.3 s.
It can be seen that the AF-circuit control mode in the five cases above depends on the shutter-button state and the user's gaze landing point. For the user, half-pressing the shutter button is the composing-and-framing state. When the shutter button is released, or the user's gaze landing point is not in the framing area, the AF circuit simply holds the current focal plane, which reduces the camera's power consumption and avoids an overly short standby time.
Fig. 5 is one viewfinder field-of-view image of a gaze-point-based camera focusing system of the present invention. The large central part of the field of view is the framing area, surrounded by the information display area. This field-of-view image is that of the optical viewfinder of a currently popular SLR camera. By the described method, the gaze landing point can be judged to lie either in the framing area or on the information display area.
Fig. 6 is another viewfinder field-of-view image of a gaze-point-based camera focusing system of the present invention. The whole field of view is the framing area, with a small block within it serving as the information display area. This field-of-view image is that of the electronic viewfinder of a currently popular mirrorless camera. By the described method, the gaze landing point can be judged to lie either in the framing area or on the information display area.
Fig. 7 is the gaze-point computation schematic of a gaze-point-based camera focusing method of the present invention. Referring to Fig. 7, calibration points 1 through 8 are the user's pupil-center positions corresponding to the crosshairs shown on the viewfinder during calibration. When computing the user's gaze landing point, the processor computes the pupil-center position in the user's current single-eye image and, from the positions of calibration points 1 through 8, computes the viewfinder landing position corresponding to the current position.
Fig. 8 is the viewfinder-eyepiece structure diagram of the gaze-point-based camera focusing system of the present invention. A camera module photographs and analyzes the user's single-eye image, while the user observes the scene through the eyepiece viewfinder, which is indistinguishable from an ordinary eyepiece viewfinder.
It should be noted that the algorithm flows shown in Fig. 2 and Fig. 4 are only preferred implementations and do not limit the algorithms of the autofocus system and method of the present invention. Likewise, the hardware architecture shown in Fig. 1, the hardware structures shown in Fig. 7 and Fig. 8, and the viewfinder field-of-view images shown in Fig. 5 and Fig. 6 are only preferred implementations and do not limit the hardware architecture of the autofocus system of the present invention. Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two; to illustrate the interchangeability of hardware and software clearly, the composition and steps of each example have been described above generally by function. Whether these functions are executed in hardware or software depends on the particular application and the design constraints of the technical solution. Skilled practitioners may implement the described functions differently for each particular application, but such implementations should not be considered beyond the scope of the present invention.
The steps of the methods or algorithms described in connection with the embodiments disclosed herein can be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium well known in the technical field.
The above description of the disclosed embodiments enables those skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (9)
1. a camera focusing method based on sight line drop point, it is characterised in that the method comprises the steps:
1) scene enters camera by lens group;
2) it is transferred to view finder by the scene of lens group by image transmitter enter in user eye;
3) user is watched view finder attentively and is found a view and expect the position of focusing in district, partly trips, sight line impact point positioning
System obtains currently used person's single eye images;
4) mould that sight line impact location system, MILS produces according to user single eye images and user calibration process
Type, according to the current single eye images of user, calculates user sight line drop point;
5) sight line impact location system, MILS is according to user sight line drop point, determines focusing, controls focusing system pair
Focusing is focused, and controls AF circuit;
6) AF circuit controls lens group imaging, and scene imaging enters in user eye by image transmitter,
User obtains defocused image;
7) user judges according to defocused image, can shoot if user is thought, then press shutter
Button, presses shutter release button and then shoots when front focal plane according to described;If the shutter release button time of pressing can
Carry out continuous shooting, then sight line impact location system, MILS real-time collecting single eye images information determines focusing, focusing system
Lens fine-tuning group focal plane real-time with AF circuit is unclamped to shutter release button;If shutter release button unclamps, then AF electricity
Road keeps working as front focal plane;
The model produced by the user calibration process is obtained as follows:
1) the single-eye image getter acquires the user's single-eye image while the user gazes at a crosshair whose sight line drop point coordinates on the viewfinder are (x0, y0);
2) the message handler extracts from that single-eye image the pupil centre coordinates (x1, y1) and the iris centre coordinates (x2, y2);
3) taking the above pupil centre coordinates (x1, y1) and iris centre coordinates (x2, y2) as the sample input and (x0, y0) as the sample output, the above steps are repeated; after multiple samples have been collected, a neural network is built and trained on the samples, thereby obtaining a neural network model whose input is the pupil centre coordinates (x1, y1) and iris centre coordinates (x2, y2) and whose output is the user's sight line drop point coordinates (x0, y0).
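The calibration training above can be sketched as follows, under the assumption of a single hidden layer (the patent fixes only the four inputs and two outputs, not the architecture); the layer size, learning rate and epoch count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_gaze_model(samples, targets, hidden=8, lr=0.05, epochs=4000):
    """samples: (N, 4) rows of (x1, y1, x2, y2); targets: (N, 2) rows of (x0, y0).
    Trains a one-hidden-layer tanh network by full-batch gradient descent."""
    X = np.asarray(samples, dtype=float)
    Y = np.asarray(targets, dtype=float)
    A = rng.normal(scale=0.5, size=(4, hidden))   # input weights (a_ki)
    B = rng.normal(scale=0.5, size=(hidden, 2))   # output weights (b_ij)
    for _ in range(epochs):
        H = np.tanh(X @ A)                 # hidden-layer activations
        E = H @ B - Y                      # prediction error
        gB = H.T @ E / len(X)              # gradient w.r.t. output weights
        gA = X.T @ ((E @ B.T) * (1 - H ** 2)) / len(X)   # gradient w.r.t. input weights
        A -= lr * gA
        B -= lr * gB
    return A, B

def predict_gaze(model, eye):
    """eye: (x1, y1, x2, y2) -> estimated drop point (x0, y0)."""
    A, B = model
    return np.tanh(np.asarray(eye, dtype=float) @ A) @ B
```

In practice the pixel coordinates would be normalised (e.g. to [-1, 1]) before training so the tanh units do not saturate.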
2. The camera focusing method based on sight line drop point according to claim 1, characterised in that, on first use or when the user triggers the calibration function, the method calibrates the user and builds the model, and then calculates the drop point position from that model once the user enters the normal use stage; the user may also request a sight line drop point calibration at any time.
3. The camera focusing method based on sight line drop point according to claim 2, characterised in that, on first use or when the user triggers the calibration function, the concrete method includes:
1) iris detection, judging whether the user is using the method for the first time; if so, go to step 2), otherwise go to step 6);
2) the framing area of the viewfinder displays several crosshairs, highlighted one at a time; the user aims at each in turn and the information is recorded;
3) the user's sight line drop point model is calculated and saved;
4) the viewfinder information display area shows a calibration-successful message, which disappears after a few seconds; go to step 6);
5) ask whether the user needs calibration; if yes, go to step 2), otherwise go to step 6);
6) the user enters the normal use stage.
4. The camera focusing method based on sight line drop point according to claim 3, characterised in that the normal use stage is the repeated cycle in which, after the user's information has been determined, the user frames, adjusts and shoots in succession.
5. The camera focusing method based on sight line drop point according to claim 1, characterised in that calculating the user's sight line drop point from the user's current single-eye image is realised as follows:
1) the single-eye image getter acquires the user's single-eye image during normal use;
2) the message handler calculates the pupil centre coordinates (x'1, y'1) and iris centre coordinates (x'2, y'2) from the current user's single-eye image;
3) using the user neural network model generated by the calibration training, the message handler takes (x'1, y'1) and (x'2, y'2) as input and applies the weights between neurons obtained in training; let the i-th neuron's trained weights for the pupil centre and iris centre coordinates be a1i, a2i, a3i, a4i, and the i-th neuron's weights toward the abscissa and ordinate of the sight line drop point be bi1, bi2; then
the current sight line drop point coordinates (x'3, y'3) are
x'3 = Σi bi1 · f(a1i·x'1 + a2i·y'1 + a3i·x'2 + a4i·y'2),
y'3 = Σi bi2 · f(a1i·x'1 + a2i·y'1 + a3i·x'2 + a4i·y'2),
where f is the neuron activation function and the sum runs over all hidden-layer neurons.
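The feed-forward evaluation described in step 3) can be sketched as follows; the weight layout and the choice of activation `f` are illustrative assumptions (the patent does not name an activation function):

```python
import math

def gaze_from_eye(eye, a, b, f=math.tanh):
    """eye: (x'1, y'1, x'2, y'2).
    a[k][i]: the i-th neuron's weight for the k-th input coordinate.
    b[i][0], b[i][1]: the i-th neuron's weights toward the drop point
    abscissa and ordinate. Returns (x'3, y'3)."""
    hidden = [f(sum(a[k][i] * eye[k] for k in range(4))) for i in range(len(b))]
    x3 = sum(b[i][0] * h for i, h in enumerate(hidden))
    y3 = sum(b[i][1] * h for i, h in enumerate(hidden))
    return x3, y3
```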
6. The camera focusing method based on sight line drop point according to claim 1, characterised in that the user's judging from the focused image and controlling shooting by the shutter button proceeds as follows:
1) when the shutter button is released, the AF circuit holds the current focal plane for a period of time;
2) when the shutter button is held down and the sight line falls within the framing area, the AF circuit fine-tunes the focal plane in real time according to the user's sight line drop point;
3) when the shutter button is half-pressed and the sight line falls within the framing area, the AF circuit focuses according to the user's sight line; after focusing, the AF circuit holds the current focal plane until the shutter button is released;
4) when the shutter button is half-pressed and the sight line does not fall within the framing area, the AF circuit holds the current focal plane for a period of time;
5) when the shutter button is pressed and the sight line does not fall within the framing area, the AF circuit holds the current focal plane for a period of time.
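The five cases of claim 6 form a small decision table over (shutter button state, sight line inside framing area). A sketch under the assumption that button states are labelled 'released', 'half' and 'pressed' (the labels and return values are mine, not the patent's):

```python
# Decision table for claim 6: what the AF circuit does for each
# combination of shutter button state and gaze position.
def af_action(button, gaze_in_frame):
    """button: 'released', 'half' or 'pressed'."""
    if button == 'released':
        return 'hold'            # case 1: keep current focal plane a while
    if gaze_in_frame:
        if button == 'pressed':
            return 'track'       # case 2: fine-tune focal plane in real time
        return 'focus-and-hold'  # case 3: focus at gaze, hold until release
    return 'hold'                # cases 4 and 5: gaze outside framing area
```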
7. The camera focusing method based on sight line drop point according to claim 6, characterised in that the length of time for which the AF circuit holds the current focal plane is 0.01-0.1 s when set to fast, and 0.1-0.3 s when set to slow.
8. A camera focusing system using the method of claim 1, characterised in that the system includes:
an image transmitter, which sends the scene to be captured through the camera lens into the user's eye by a circuit or a light path;
a sight line drop point positioning system, which acquires the user's single-eye image, performs calibration and information storage, and determines the focusing point from the single-eye image during the normal use stage;
a focusing system, which controls the AF circuit according to the focusing point and thereby controls the lens group;
an AF circuit, which is controlled by the focusing circuit and controls the lens group;
the image transmitter receives the scene from the lens group and transmits it through the viewfinder into the user's sight line by a circuit or light path; after calibration and information storage by the sight line drop point positioning system and determination of the focusing point from the single-eye image, the scene is focused through the focusing system and through the AF circuit acting on the lens group, forming a loop.
9. The system according to claim 8, characterised in that the sight line drop point positioning system includes a single-eye image getter, a message handler and an output controller.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410522265.8A CN104238239B (en) | 2014-09-30 | A kind of camera focusing system based on sight line drop point and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104238239A CN104238239A (en) | 2014-12-24 |
CN104238239B true CN104238239B (en) | 2017-01-04 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103246044A (en) * | 2012-02-09 | 2013-08-14 | 联想(北京)有限公司 | Automatic focusing method, automatic focusing system, and camera and camcorder provided with automatic focusing system |
CN103747183A (en) * | 2014-01-15 | 2014-04-23 | 北京百纳威尔科技有限公司 | Mobile phone shooting focusing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |