CN103185568A - Ranging apparatus, ranging method, and interactive display system - Google Patents

Ranging apparatus, ranging method, and interactive display system

Info

Publication number
CN103185568A
Authority
CN
China
Prior art keywords
image
distance
lens
object distance
variation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102949924A
Other languages
Chinese (zh)
Other versions
CN103185568B (en)
Inventor
张铨仲
王淇霖
陈永霖
张奇伟
刁国栋
林显昌
陈加珍
黄维嘉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI
Publication of CN103185568A
Application granted
Publication of CN103185568B
Legal status: Active


Abstract

A ranging apparatus including an image sensor, an imaging lens, and a processor is provided. The imaging lens is configured to image an object on the image sensor to produce an image signal having at least one image parameter, wherein the at least one image parameter changes with a change of an object distance of the object. The processor is configured to determine the change of the object distance according to a change of the at least one image parameter. A ranging method and an interactive display system are also provided.

Description

Ranging apparatus, ranging method, and interactive display system
Technical field
The present invention relates to a measuring apparatus, a measuring method, and a display system, and more particularly to a ranging apparatus, a ranging method, and an interactive display system.
Background
In current touch technology, taking a tablet computer or a smartphone as an example, control is mostly performed by the user's finger actually pressing or sliding on the panel of the device. Besides touch panels, two or more lenses can also be used to obtain the relative distance of each object in a scene by parallax. However, to obtain high-precision distance detection at short range, taking a dual-lens system as an example, the distance resolution is proportional to the separation between the lenses, which makes it difficult to shrink the overall system volume.
Alternatively, an additional probe light can be emitted toward the scene to be measured, and the distance can be interpreted from the time of flight of the light or from the variation of a projected structured-light pattern. Taking the time-of-flight method as an example, because of the high speed of light, electronic circuits that handle high-frequency signals are needed to judge the distance of nearby objects. For structured-light projection, the additional light source and projector make it difficult to achieve low energy consumption and a reduced system volume. There are also methods that scan the scene to be measured with an additional light source and achieve human-computer interaction through differences in the positions of the light reflected from objects; these share the same problems as structured-light projection.
Other approaches capture the scene repeatedly with a single lens and interpret the distance from the feedback signal of an autofocus mechanism. However, autofocus usually takes some time to complete, which is unfavorable for meeting the demands of real-time human-computer interaction.
Summary of the invention
An embodiment of the invention provides a ranging apparatus that includes an imaging lens, an image sensing unit, and a processing unit. The imaging lens gives the captured image a plurality of image parameters, and these image parameters vary differently as the object distance varies. The imaging lens images an object onto the image sensing unit to form an image, and the image sensing unit converts the image into a signal. The processing unit judges whether the object falls within a preset object-distance range according to the signal and the different variations of the image parameters with object distance.
An embodiment of the invention provides a ranging method that includes the following steps: giving the captured image a plurality of image parameters by means of an imaging lens, where these image parameters vary differently as the object distance varies; imaging an object with the imaging lens to obtain an image; and judging whether the object falls within a preset object-distance range according to the obtained image and the different variations of the image parameters with object distance.
An embodiment of the invention provides an interactive display system that includes an image generation unit and the above ranging apparatus. The image generation unit forms a displayed image in space. When the processing unit judges that the object falls within the preset object-distance range, the processing unit judges that the object touches the displayed image.
To make the above features of the invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1A is a schematic diagram of an interactive display system according to an embodiment of the invention.
Fig. 1B illustrates the sub-units of the processing unit in Fig. 1A.
Fig. 2 is a schematic diagram of the ranging apparatus in Fig. 1A.
Fig. 3A illustrates the energy distributions of the point spread function of the imaging lens in Fig. 2 at different object distances.
Fig. 3B plots the image parameters produced by the imaging lens in Fig. 2 as functions of object distance.
Fig. 4 plots the difference between the x-direction and y-direction blur metric values of Fig. 3B as a function of object distance.
Fig. 5 is a schematic diagram of an interactive display system according to another embodiment of the invention.
Fig. 6 is a schematic diagram of an interactive display system according to yet another embodiment of the invention.
Fig. 7 illustrates the energy distributions of the point spread functions of red light and green light of the imaging lens in Fig. 2 at different object distances.
Fig. 8 shows the through-focus modulation transfer function produced by the imaging lens in Fig. 2 at a spatial frequency of 30 line pairs per millimeter.
Fig. 9 plots the red-light and green-light blur metric values in the x and y directions produced by the imaging lens in Fig. 2 as functions of object distance.
Fig. 10 plots the differences between the blur metric curves of Fig. 9 (green minus red in the x direction, green minus red in the y direction, red y-direction minus red x-direction, and green y-direction minus green x-direction) as functions of object distance D.
Fig. 11 is a schematic diagram of an imaging lens according to a further embodiment of the invention.
Fig. 12A to Fig. 12G are curves of the through-focus modulation transfer function of the imaging lens of Fig. 11 at spatial frequencies of 10, 20, 30, 40, 50, 100, and 200 line pairs per millimeter, respectively.
Fig. 13A to Fig. 13C are the energy distributions of the point spread function of the imaging lens of Fig. 11 at object distances of 34 cm, 33 cm, and 32 cm, respectively.
Fig. 14 shows the blur metric values of the imaging lens of Fig. 11 in the x and y directions at a specific spatial frequency.
Fig. 15 plots, against object distance, the slopes of the change of the x-direction and y-direction blur metric values of Fig. 14 at the specific spatial frequency.
Fig. 16 is a schematic diagram of an imaging lens according to another embodiment of the invention.
Fig. 17 shows the energy distributions of the red-light and green-light point spread functions of the imaging lens of Fig. 16.
Fig. 18 plots the x-direction red, y-direction red, x-direction green, and y-direction green blur metric values produced by the imaging lens of Fig. 16 as functions of object distance.
Fig. 19 plots the differences between the green and red blur metric values in the x direction and in the y direction of Fig. 18 as functions of object distance D.
Fig. 20 illustrates an embodiment of the processing flow of the processing unit of Fig. 1.
Fig. 21 illustrates an embodiment of the flow by which the processing unit computes image blur metric values.
Fig. 22 shows modulation transfer function curves of an imaging lens with axial chromatic aberration according to an embodiment of the invention.
Fig. 23 is a flowchart of the ranging method of an embodiment of the invention.
Reference numerals:
50: object; 60: eyes
100, 100a, 100b: interactive display system
110, 110b: image generation unit
112, 112b: displayed image
120: central processing unit
200: ranging apparatus; 210: image sensing unit
220: processing unit; 222: position interpretation sub-unit
224: image segmentation sub-unit; 226: image calculation sub-unit
228: distance estimation sub-unit
300, 300c, 300d: imaging lens
310, 310c, 310d: first lens
320, 320c, 320d: second lens
330, 330c, 330d: third lens
340, 340c, 340d: fourth lens
350, 350c, 350d: fifth lens
A: optical axis; D: object distance
E: signal
P110, P120, P132, P134, P136, P140, P150, P160, Q110, Q122, Q124, Q130, Q140, S110-S130: steps
R1, R2: regions
S1-S10, S1c-S10c, S1d-S10d: surfaces
Embodiments
Fig. 1A is a schematic diagram of an interactive display system according to an embodiment of the invention, Fig. 1B illustrates the sub-units of the processing unit in Fig. 1A, Fig. 2 is a schematic diagram of the ranging apparatus in Fig. 1A, Fig. 3A illustrates the energy distributions of the point spread function of the imaging lens in Fig. 2 at different object distances, and Fig. 3B plots the image parameters produced by the imaging lens in Fig. 2 as functions of object distance. Referring to Fig. 1A, Fig. 2, Fig. 3A, and Fig. 3B, the interactive display system 100 of this embodiment includes an image generation unit 110 and a ranging apparatus 200. The image generation unit 110 forms a displayed image 112 in space. In this embodiment the displayed image 112 is, for example, a real image; in other embodiments it may instead be a virtual image. The image generation unit 110 is, for example, a projection device, a three-dimensional display, or any image forming device capable of forming a real or virtual image in space.
The ranging apparatus 200 includes an imaging lens 300, an image sensing unit 210, and a processing unit 220. The imaging lens 300 gives the captured image a plurality of image parameters, and these image parameters vary differently as the object distance D varies. The object distance D is the distance from an object in space to the imaging lens 300, for example the distance along the optical axis A from the side of the object 50 closest to the imaging lens 300 to the surface S1 of the first lens 310 closest to the object side. In this embodiment, the image parameters include two image blur metric values (also called sharpness values) in two different directions, and the two directions may be substantially perpendicular to each other. For example, the space containing the object 50 and the ranging apparatus 200 can be defined with a rectangular coordinate system of mutually perpendicular x, y, and z axes, and the two directions are, for example, the x direction and the y direction.
An image blur metric (also called sharpness) is a value related to the degree of blur of an image. Methods for computing such a value are described, for example, in: F. Crete, T. Dolmiere, P. Ladret, and M. Nicolas (Laboratoire des Images et des Signaux), "The Blur Effect: Perception and Estimation with a New No-Reference Perceptual Blur Metric," SPIE Electronic Imaging Symposium, Conf. Human Vision and Electronic Imaging, San Jose, USA (2007), hal-00232709, version 1 (Feb. 2008); Y.-C. Chung, S.-L. Chang, J.-M. Wang, and S.-W. Chen, "An Edge Analysis Based Blur Measure for Image Processing Applications," Journal of Taiwan Normal University: Mathematics, Science & Technology, 51(1), 21-31 (2006); A. Ciancio, A. L. N. T. da Costa, E. A. B. da Silva, A. Said, R. Samadani, and P. Obrador, "Objective no-reference image blur metric based on local phase coherence," Electronics Letters, 45(23) (5 Nov. 2009); Y. Han, X. Xu, and Y. Cai, "Novel no-reference image blur metric based on block-based discrete cosine transform statistics," Optical Engineering Letters, 49(5), 050501 (May 2010); A. Anchuri, "Image Blur Metrics," Stanford University, MS 2011 (ref: Dr. Joyce Farrell, PSYCH221), available at http://scien.stanford.edu/pages/labsite/2010/psych221/projects/2010/AdityaAnchuri/main.pdf; L. Liang, J. Chen, S. Ma, D. Zhao, and W. Gao, "A No-Reference Perceptual Blur Metric Using Histogram of Gradient Profile Sharpness," 2009 IEEE International Conference on Image Processing (ICIP); N. D. Narvekar and L. J. Karam, "A No-Reference Image Blur Metric Based on the Cumulative Probability of Blur Detection (CPBD)," IEEE Transactions on Image Processing, 20(9) (Sep. 2011); and P. Marziliano, F. Dufaux, S. Winkler, and T. Ebrahimi, "A No-Reference Perceptual Blur Metric," 2002 IEEE International Conference on Image Processing (ICIP), III-57 to III-60.
In this embodiment, a larger blur metric value represents a sharper image. In other embodiments, depending on how the blur metric is computed, a larger value may instead represent a blurrier, i.e. less sharp, image.
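As a concrete illustration, the short Python sketch below computes a gradient-energy sharpness value separately along the x and y directions. It is a minimal stand-in with the same larger-is-sharper convention, not the specific metric of the patent or of any of the works cited above.

```python
import numpy as np

def directional_blur_metric(img: np.ndarray) -> tuple[float, float]:
    """Gradient-energy sharpness along x and y; larger means sharper.

    A minimal illustrative metric, not the computation prescribed by
    the patent or by the cited references.
    """
    img = img.astype(np.float64)
    gx = np.diff(img, axis=1)  # horizontal (x-direction) differences
    gy = np.diff(img, axis=0)  # vertical (y-direction) differences
    return float(np.mean(gx**2)), float(np.mean(gy**2))
```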
The imaging lens 300 images the object 50 onto the image sensing unit 210 to form an image, and the image sensing unit 210 converts the image into a signal E. In this embodiment the signal E is, for example, an electrical signal. The object 50 is, for example, a user's finger, hand, a stylus, or another object. The image sensing unit 210 is, for example, a charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS) sensor, or another suitable image sensing element. The processing unit 220 judges whether the object 50 falls within a preset object-distance range according to the signal E and the different variations of the image parameters with object distance D. In this embodiment, when the processing unit 220 judges that the object 50 falls within the preset object-distance range, it judges that the object 50 touches the displayed image 112. In other words, the distance between the displayed image 112 and the imaging lens 300 falls within this preset object-distance range.
In this embodiment, the intensity of the energy distribution of the point spread function of the imaging lens 300 in the two different directions reaches an extreme value at different object distances D. For example, in Fig. 3A the spot diagrams from left to right show the energy distribution of the point spread function as the object distance D increases. As Fig. 3A shows, when the object distance is about 30 cm the point spread function is most concentrated in the x direction (e.g. the horizontal direction); in other words, the blur metric value of the imaging lens 300 in the x direction is then maximal (i.e. the image is sharpest), as illustrated in Fig. 3B. On the other hand, when the object distance is about 26.8 cm the point spread function is most concentrated in the y direction (e.g. the vertical direction); that is, the blur metric value in the y direction is then maximal (i.e. the image is sharpest), as illustrated in Fig. 3B.
In this embodiment, the imaging lens has at least one non-axially-symmetric lens (the first lens 310 in this embodiment, as illustrated in Fig. 2). The non-axially-symmetric lens (e.g. the first lens 310) has at least one non-axisymmetric curved surface (the surface S2 of the first lens 310 in this embodiment), and the profile of the non-axisymmetric surface (e.g. surface S2) differs between the two directions (e.g. the x and y directions). The energy distribution of the point spread function therefore becomes most concentrated in the x direction and in the y direction at different object distances D; equivalently, the blur metric value in the x direction and the blur metric value in the y direction reach their maxima at different object distances D.
In this embodiment, the imaging lens 300 includes, arranged in order from the object side to the image side, a first lens 310, a second lens 320, a third lens 330, a fourth lens 340, and a fifth lens 350, whose refractive powers are respectively positive, negative, negative, positive, and negative. The aperture stop may be located at the surface S2 of the first lens 310.
Specifically, viewed from the side (along the x coordinate direction), the first lens 310 is, for example, a positive meniscus lens with its convex surface facing the object side, the second lens 320 is, for example, a negative meniscus lens with its convex surface facing the object side, the third lens 330 is, for example, a negative meniscus lens with its convex surface facing the image side, the fourth lens 340 is, for example, a biconvex lens, and the fifth lens 350 is, for example, a biconcave lens.
The following describes one embodiment of the imaging lens 300. Note that the data listed in Table 1 below are not intended to limit the invention; those skilled in the art may make suitable changes to the parameters or settings after studying this disclosure, and such changes still fall within the scope of the invention.
(Table 1)
In Table 1, the spacing is the straight-line distance along the optical axis A between two adjacent surfaces; for example, the spacing of surface S1 is the straight-line distance along the optical axis A from surface S1 to surface S2. The thickness and material of each lens correspond to the spacing and material values listed in the same row of the remarks column.
Also in Table 1, surfaces S1 and S2 are the two surfaces of the first lens 310, surfaces S3 and S4 are the two surfaces of the second lens 320, surfaces S5 and S6 are the two surfaces of the third lens 330, surfaces S7 and S8 are the two surfaces of the fourth lens 340, and surfaces S9 and S10 are the two surfaces of the fifth lens 350. For the radius of curvature, spacing, and other parameter values of each surface, refer to Table 1; they are not repeated here. The spacing value in the row of surface S10 is the straight-line distance along the optical axis A from surface S10 to the image sensing unit 210. In the radius-of-curvature entry of the row for surface S2, the value following "x:" is the radius of curvature of surface S2 in the x direction, and the value following "y:" is its radius of curvature in the y direction.
Moreover, the surfaces S1 and S3-S10 mentioned above are aspheric and can be expressed by the following formula:

$$Z = \frac{cr^2}{1+\sqrt{1-(1+k)c^2r^2}} + A_1 r^2 + A_2 r^4 + A_3 r^6 + A_4 r^8 + A_5 r^{10} + \cdots$$

In the formula, Z is the sag along the optical axis A, and c is the reciprocal of the radius of the osculating sphere, i.e. the reciprocal of the radius of curvature near the optical axis A (such as the radii of curvature of surfaces S1 and S3-S10 listed in the table). k is the conic coefficient, r is the aspheric height, measured from the lens center toward the lens edge, and A1, A2, A3, A4, A5, ... are the aspheric coefficients, with A1 equal to 0. Table 2 lists the parameter values of surfaces S1 and S3-S10.
(Table 2)
In addition, the curvature of surface S2 varies differently in the x direction and the y direction, and surface S2 can be described by the following formula:

$$Z = \frac{cr^2}{1+\sqrt{1-(1+k)c^2r^2}} + \sum_{j=2}^{66} C_j x^m y^n$$

where

$$j = \frac{(m+n)^2 + m + 3n}{2} + 1$$

In the formula, Z is the sag along the optical axis A, and c is the reciprocal of the radius of the osculating sphere, i.e. the reciprocal of the radius of curvature near the optical axis A (such as the radius of curvature of surface S2 listed in the table). k is the conic coefficient, r is the surface height, measured from the lens center toward the lens edge, and C_j is the coefficient of x^m y^n, where m and n are zero or positive integers but not both zero, x and y are the coordinate positions, and the position x = 0, y = 0 lies on the optical axis A. In this embodiment, C4 = 2.000E-04 (i.e. 2.000 x 10^-4), C6 = -2.000E-04, C8 = 1.000, and the remaining C_j are all essentially 0.
(Table 3)
Parameter: Specification
Image height: 3 mm (1/3 inch, 2 megapixels)
Focal length: 9.5 mm
F-number: 1.2
Field angle: 33.6 degrees
Object distance: 300 mm
Relative illumination: > 50%
Optical distortion: < 2%
Chief ray angle: maximum chief ray angle < 28.6 degrees
Table 3 lists an embodiment of the specification of the imaging lens 300, but the invention is not limited to these values. The parameter names are given in the left column of Table 3 and the corresponding specifications in the right column. The image height refers to the image height of the image sensing unit 210 used, which is 3 mm, the image sensing unit 210 being a 1/3-inch, 2-megapixel image sensor.
In the ranging apparatus 200 of this embodiment, because the blur metric value in the x direction and the blur metric value in the y direction vary differently with object distance D, the processing unit 220 can determine the object distance of the object 50 from the two image values in the signal E that correspond respectively to the x-direction and y-direction blur metric values. In this embodiment, the processing unit 220 may further use a threshold, obtained in advance by calibration of the image parameters, to decide whether to begin judging from the signal whether the object 50 falls within the preset object-distance range. For example, referring to Fig. 3B, with a threshold of 0.4: when at least one of the image value corresponding to the x-direction blur metric and the image value corresponding to the y-direction blur metric in the signal E is 0.4 or higher, the processing unit 220 begins judging from the signal whether the object 50 falls within the preset object-distance range, and at this point the processing unit 220 can learn from the relation of Fig. 3B that the object distance D falls roughly within the range of 25.2 cm to 31.8 cm. The relation of Fig. 3B can be established experimentally in advance (e.g. before the ranging apparatus 200 leaves the factory), for example by varying the object distance between a reference object and the ranging apparatus 200 while computing and recording the blur metric values at each object distance, and this relation can be stored in the ranging apparatus 200, for example in its memory. Moreover, by judging which of the image value corresponding to the x-direction blur metric and the image value corresponding to the y-direction blur metric is larger, the processing unit 220 can decide whether the object distance D falls within the range of 25.2 cm to 27.8 cm or within the range of 27.8 cm to 31.8 cm. For example, when the image value in the x direction is smaller than the image value in the y direction, the processing unit 220 can judge that the object distance D falls within the range of 25.2 cm to 27.8 cm.
Then, if the image value in the signal E corresponding to the y-direction blur metric is 0.6, the processing unit 220 can further narrow the possible range of the object distance D, for example to 26 cm or 27.5 cm. Because the x-direction blur metric values corresponding to object distances of 26 cm and 27.5 cm are different, the processing unit 220 can at this point judge from the image value corresponding to the x-direction blur metric whether the object distance D is 26 cm or 27.5 cm. If the distance between the displayed image 112 and the imaging lens 300 is 26 cm, then when the processing unit 220 judges that the object distance D is about 26 cm it can judge that the object 50 falls within the preset object-distance range, i.e. that the object 50 touches the displayed image 112. If the processing unit 220 judges that the object distance D is 27.5 cm, it judges that the object 50 does not yet fall within the preset object-distance range, i.e. that the object 50 has not yet touched the displayed image 112, though the object distance D of 27.5 cm is still known. Likewise, when the image value is any other value above 0.4, the processing unit 220 can judge the object distance D by comparing the image values in the signal E with the x-direction and y-direction blur metric curves of Fig. 3B.
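A minimal Python sketch of this lookup logic follows, assuming the Fig. 3B relation has been tabulated during calibration as arrays; the joint nearest-fit search is one simple way to combine the two curves, not necessarily the patent's exact procedure.

```python
import numpy as np

def estimate_distance(bx: float, by: float,
                      dist_grid: np.ndarray,
                      curve_x: np.ndarray,
                      curve_y: np.ndarray,
                      threshold: float = 0.4) -> float | None:
    """Match measured blur metrics against the calibrated curves of Fig. 3B.

    dist_grid holds the calibrated object distances (cm); curve_x and
    curve_y hold the x- and y-direction blur metric values recorded at
    each distance. Returns None when both measured values are below the
    threshold, i.e. outside the analyzable range.
    """
    if bx < threshold and by < threshold:
        return None
    # The y curve narrows D to a few candidates and the x curve
    # disambiguates them; a joint least-squares match over both curves
    # accomplishes this in one step.
    err = (curve_x - bx) ** 2 + (curve_y - by) ** 2
    return float(dist_grid[int(np.argmin(err))])
```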
In the ranging apparatus 200 of this embodiment, because the x-direction blur metric value and the y-direction blur metric value reach their extreme values at different object distances D and vary differently with object distance D, the processing unit 220 can judge the object distance D accordingly. That is, the processing unit 220 can select a threshold with respect to the extreme values of the image parameters (the x-direction and y-direction blur metric values) as functions of object distance, and determine the object distance of the object 50 from the extreme values and the threshold.
Furthermore, because the ranging apparatus 200 of this embodiment uses multiple image parameters (such as the x-direction and y-direction blur metric values) to judge the object distance D, the processing unit 220 can judge the object distance D of the object 50 from the signal E obtained by the image sensing unit 210 in a single shot, that is, from the single image formed by the imaging lens 300 imaging the object 50. Compared with existing ranging apparatuses that must complete a focusing period before the object distance can be known, or that require multiple shots, the ranging apparatus 200 of this embodiment can obtain the object distance D immediately after a single shot through the processing of the processing unit 220. The ranging apparatus 200 of this embodiment can therefore judge the object distance D quickly and immediately, and when the ranging apparatus 200 shoots repeatedly and judges the object distance D immediately after each shot, it achieves real-time detection of the object distance D of the object 50.
In addition, because the ranging apparatus 200 of this embodiment can judge the object distance D with a single lens, its volume can be smaller than that of a dual-lens ranging system. Moreover, because it can judge the object distance D without emitting probe light, no additional light source is needed, which also keeps the volume of the ranging apparatus 200 small. And because the ranging apparatus 200 judges the object distance D by analyzing image parameters, it does not need the electronic circuits capable of handling high-frequency signals that a time-of-flight ranging apparatus must use, which reduces cost.
Referring to Fig. 1A and Fig. 1B, in this embodiment the processing unit 220 includes a position interpretation sub-unit 222, an image segmentation sub-unit 224, an image calculation sub-unit 226, and a distance estimation sub-unit 228. The position interpretation sub-unit 222 determines from the signal E the position of the object 50 in the directions perpendicular to the object distance D; in this embodiment it judges the x and y coordinates of the object 50. Specifically, from the x and y coordinates within the image measured by the image sensing unit 210 of the part corresponding to the object 50, it back-calculates the actual x and y coordinates of the object 50.
The image segmentation sub-unit 224 selects from the image the region to be analyzed that corresponds to the object 50. For example, it can analyze the part of the image near the x and y coordinates obtained by the position interpretation sub-unit 222 to obtain the region to be analyzed. If the object 50 is a finger, the image segmentation sub-unit 224 can select as the region to be analyzed the part of the image that has skin color and lies near the x and y coordinates obtained by the position interpretation sub-unit 222.
The image calculation sub-unit 226 computes, from the selected region to be analyzed, a plurality of image values corresponding respectively to the image parameters (such as the x-direction and y-direction blur metric values). The distance estimation sub-unit 228 then determines the object distance D of the object 50 from the computed image values; that is, the distance estimation sub-unit 228 compares the image values with the relation of the image parameters described above. The details of this were given earlier and are not repeated here.
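As an illustration of how the four sub-units chain together, here is a sketch reusing directional_blur_metric and estimate_distance from the earlier examples; locate_object and crop_roi are hypothetical helpers (e.g. skin-color segmentation around the detected position) standing in for sub-units 222 and 224.

```python
import numpy as np

def measure_object_distance(frame: np.ndarray,
                            dist_grid: np.ndarray,
                            curve_x: np.ndarray,
                            curve_y: np.ndarray) -> float | None:
    """Run one captured frame through the four sub-units of processing unit 220."""
    x, y = locate_object(frame)            # sub-unit 222: x/y position
    roi = crop_roi(frame, x, y)            # sub-unit 224: region to analyze
    bx, by = directional_blur_metric(roi)  # sub-unit 226: image values
    return estimate_distance(bx, by, dist_grid, curve_x, curve_y)  # sub-unit 228
```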
In this embodiment, the interactive display system 100 further includes a central processing unit 120, which is electrically connected to the processing unit 220 and the image generation unit 110. In other embodiments, however, the central processing unit 120 and the processing unit 220 may instead be linked wirelessly, for example through wireless transmission, Wi-Fi, or another wireless link. When the processing unit 220 judges that the object 50 touches the displayed image 112, the central processing unit 120 can decide, from the x and y coordinates of the object 50 judged by the processing unit 220, which item in the displayed image 112 the object 50 touches; this can be achieved by the central processing unit 120 mapping the x and y coordinates of the object 50 to the x and y coordinates of the displayed image 112. The central processing unit 120 can then decide, according to how the object 50 touches, how to command the image generation unit 110 to change the displayed image 112. For example, when the object 50 clicks or drags an item in the displayed image 112, the central processing unit 120 can command the displayed image 112 to show the content corresponding to the selected function or to move the position of the item in the displayed image 112.
In this embodiment the number of objects 50 can be one or more, and when there are multiple objects the processing unit 220 can analyze the parts of the image corresponding to the multiple objects, enabling multi-touch functionality in the interactive display system 100. In another embodiment, the processing unit 220 may also be combined with the central processing unit 120.
Furthermore, because the interactive display system 100 of this embodiment uses the ranging apparatus 200, the user can interact with the displayed image 112 floating in the air, simulating the effect of the user contacting and touch-controlling the floating displayed image 112. In detail, since the position of the displayed image 112 is known when the image generation unit 110 produces it, once the ranging apparatus 200 determines the position of the object 50 (e.g. the user's finger), it can be known whether the object 50 has met the floating displayed image 112, and how to change the display content of the displayed image 112 can then be decided according to the motion of the object 50.
Fig. 4 plots the difference between the x-direction blur metric value and the y-direction blur metric value of Fig. 3B as a function of object distance. Referring to Fig. 1A and Fig. 4, in another embodiment the processing unit 220 can judge whether the object 50 falls within the preset object-distance range from the variation with object distance D of the difference of the image parameters (for example, the y-direction blur metric value minus the x-direction blur metric value, or the x-direction blur metric value minus the y-direction blur metric value). In this case a threshold can be selected, and the processing unit 220 can decide whether to begin further analyzing the object distance D by judging whether the difference of the image values in the signal E (for example, the image value corresponding to the x-direction blur metric minus the image value corresponding to the y-direction blur metric) is greater than zero. For example, when the difference of the image values is greater than zero and equals 0.2, the processing unit 220 can judge that the object distance D is 25.5 cm or 27.3 cm. Alternatively, whether the object 50 falls within the preset object-distance range can be judged by whether this difference reaches an extreme value. For example, when the distance between the displayed image 112 and the imaging lens 300 is 26.8 cm, the processing unit 220 can decide whether the object 50 falls within the preset object-distance range, i.e. whether the object 50 touches the displayed image 112, by judging whether the difference reaches its maximum.
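One way to implement the extremum test is to watch the difference over successive shots and report a touch when it has just peaked; the following is a toy sketch under that assumption.

```python
def difference_peaked(diff_history: list[float], tol: float = 1e-3) -> bool:
    """True when the blur-metric difference has just passed a local maximum,
    i.e. the object crossed the calibrated display distance (26.8 cm in
    the example above). Three recent samples are assumed to be enough to
    spot the peak.
    """
    if len(diff_history) < 3:
        return False
    a, b, c = diff_history[-3:]
    return b - a > tol and b - c > tol  # middle sample is a local peak
```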
Fig. 5 is a schematic diagram of an interactive display system according to another embodiment of the invention. Referring to Fig. 5, the interactive display system 100a of this embodiment is similar to the interactive display system 100 of Fig. 1A, with the following differences. In the interactive display system 100 of Fig. 1A, the image generation unit 110 and the ranging apparatus 200 are on the same side of the displayed image 112, and the user's eyes 60 and the image generation unit 110 are on opposite sides of the displayed image 112. In the interactive display system 100a of this embodiment, however, the image generation unit 110 and the ranging apparatus 200 are on opposite sides of the displayed image 112, and the user's eyes 60 and the ranging apparatus 200 are on the same side of the displayed image 112.
Fig. 6 is a schematic diagram of an interactive display system according to yet another embodiment of the invention. Referring to Fig. 6, the interactive display system 100b of this embodiment is similar to the interactive display system 100a of Fig. 5, with the following differences. In the interactive display system 100b of this embodiment, the image generation unit 110b is, for example, a head-mounted display (HMD) disposed in front of the user's eyes 60, and the displayed image 112b is a virtual image. In this embodiment, the image generation unit 110b, the user's eyes 60, and the ranging apparatus 200 are all on the same side of the displayed image 112b.
Fig. 7 illustrates the energy distributions of the point spread functions of red light and green light of the imaging lens of Fig. 2 at different object distances, Fig. 8 shows the through-focus modulation transfer function (through-focus MTF) produced by the imaging lens of Fig. 2 at a spatial frequency of 30 line pairs per millimeter, and Fig. 9 plots the red-light blur metric value in the x direction, the green-light blur metric value in the x direction, the red-light blur metric value in the y direction, and the green-light blur metric value in the y direction produced by the imaging lens of Fig. 2 as functions of object distance. Referring to Fig. 7 to Fig. 9, in this embodiment the energy distributions of the point spread functions of several different colors of light of the imaging lens 300 vary differently with the object distance D. Specifically, in this embodiment the axial chromatic aberration of the imaging lens 300 divided by the focal length falls within the range from 0.0010 to 0.0100 or from -0.0010 to -0.0100, where this axial chromatic aberration is, for example, the axial chromatic aberration between the colors of light to which the image parameters correspond. For example, in this embodiment the image parameters correspond to red light and green light, so the axial chromatic aberration is, for example, that between red light and green light, where the wavelength of the red light is, for example, 640 nm and the wavelength of the green light is, for example, 550 nm. In another embodiment, when the image parameters correspond to red light and blue light, the axial chromatic aberration can instead be that between red light and blue light. Unlike general lens designs, which reduce axial chromatic aberration as much as possible, the imaging lens 300 of this embodiment deliberately has significant axial chromatic aberration, so that the red-light blur metric value and the green-light blur metric value vary with object distance D in clearly different ways. In Fig. 7, the upper row of point-spread-function energy distributions varying with object distance belongs to red light and the lower row to green light; Fig. 7 clearly shows that the variation with object distance of the red-light point spread function differs markedly from that of the green-light point spread function. In addition, as shown in Fig. 8, the through-focus MTFs of red and green light in the x and y directions also vary differently with the degree of defocus (i.e. with object distance D). Moreover, as shown in Fig. 9, the red-light x-direction, green-light x-direction, red-light y-direction, and green-light y-direction blur metric values produced by the imaging lens 300 vary differently with object distance D.
Compared with the embodiment of Fig. 3B, in which the processing unit 220 uses the relation of two curves as reference data and compares the corresponding image values in the signal E against those two curves to judge the object distance D, this embodiment provides four curves as reference data. When the corresponding image values in the signal E are compared there are therefore more bases for comparison, so the processing unit 220 of this embodiment can judge the object distance D more accurately. In this embodiment, the processing unit 220 determines the object distance D of the object 50 from the extreme values, as functions of object distance D, of the image parameters (the red-light x-direction, green-light x-direction, red-light y-direction, and green-light y-direction blur metric values). For example, as shown in Fig. 9, when the red-light blur metric value in the y direction and the green-light blur metric value in the x direction both reach their extreme values (e.g. maxima), the object distance D of the object 50 can be judged to be about 30 cm, i.e. the object 50 is judged to fall within the preset object-distance range. When the distance between the displayed image 112 and the imaging lens 300 is about 30 cm, the processing unit 220 can then judge that the object 50 touches the displayed image 112.
Fig. 10 plots, as functions of object distance D, the differences from Fig. 9: the green-light x-direction blur metric minus the red-light x-direction blur metric, the green-light y-direction blur metric minus the red-light y-direction blur metric, the red-light y-direction blur metric minus the red-light x-direction blur metric, and the green-light y-direction blur metric minus the green-light x-direction blur metric. Referring to Fig. 1A and Fig. 10, in another embodiment the processing unit 220 can judge whether the object 50 falls within the preset object-distance range from the variation with object distance D of the differences of the image parameters, where the differences are, for example: the green-light x-direction blur metric minus the red-light x-direction blur metric (or, in other embodiments, the reverse), the green-light y-direction blur metric minus the red-light y-direction blur metric (or the reverse), the red-light y-direction blur metric minus the red-light x-direction blur metric (or the reverse), and the green-light y-direction blur metric minus the green-light x-direction blur metric (or the reverse). In this embodiment, four different differences varying with object distance D can thus be used to judge whether the object 50 falls within the preset object-distance range, and the object distance D can be obtained. Because more differences are available for the judgment than in the embodiment of Fig. 4, whether the object 50 falls within the preset object-distance range can be judged more accurately. For example, the difference of the red-light y-direction blur metric minus the red-light x-direction blur metric and the difference of the green-light y-direction blur metric minus the green-light x-direction blur metric can be used to decide whether the processing unit 220 should begin judging whether the object 50 enters the preset object-distance range, or begin detailed analysis of the signal E to obtain the object distance D. Afterwards, the difference of the green-light x-direction blur metric minus the red-light x-direction blur metric and the difference of the green-light y-direction blur metric minus the red-light y-direction blur metric can be used to judge whether the object 50 falls within the preset object-distance range, or to compute the object distance D of the object 50.
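The four differences read directly off the blur metric values of one shot; the following sketch uses illustrative key names (the names are assumptions, the arithmetic is the embodiment's).

```python
def color_direction_differences(b: dict[str, float]) -> dict[str, float]:
    """The four blur-metric differences of Fig. 10 from one shot's values.

    b maps 'red_x', 'red_y', 'green_x', 'green_y' to measured blur
    metric values.
    """
    return {
        "green_x_minus_red_x": b["green_x"] - b["red_x"],
        "green_y_minus_red_y": b["green_y"] - b["red_y"],
        "red_y_minus_red_x": b["red_y"] - b["red_x"],
        "green_y_minus_green_x": b["green_y"] - b["green_x"],
    }
```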
Fig. 11 is a schematic diagram of an imaging lens according to a further embodiment of the invention, and Fig. 12A to Fig. 12G are curves of the through-focus modulation transfer function of the imaging lens of Fig. 11 at spatial frequencies of 10, 20, 30, 40, 50, 100, and 200 line pairs per millimeter, respectively. Each of Fig. 12A to Fig. 12G shows the through-focus MTF curves in the tangential direction and the sagittal direction at a field of 0.6 (i.e. a field angle of 9.79 degrees) and a field of 1.0 (i.e. a field angle of 16.05 degrees); the line style to the left of the letter "T" identifies the curve of the tangential through-focus MTF drawn with the same line style, and the line style to the left of the letter "R" identifies the curve of the sagittal through-focus MTF drawn with the same line style. For example, the line style to the left of "T" beside "0.6" identifies the tangential through-focus MTF curve at a field of 0.6, the line style to the left of "R" beside "0.6" identifies the sagittal through-focus MTF curve at a field of 0.6, and the meanings of the remaining line styles follow by analogy. Fig. 13A to Fig. 13C are the energy distributions of the point spread function of the imaging lens of Fig. 11 at object distances of 34 cm, 33 cm, and 32 cm, respectively, and Fig. 14 shows the blur metric values of the imaging lens of Fig. 11 in the x and y directions at a specific spatial frequency. Referring first to Fig. 11 and Fig. 13A to Fig. 13C, the imaging lens 300c of this embodiment can replace the imaging lens 300 of Fig. 1A and Fig. 2 for use in the ranging apparatus 200 and the interactive display system 100 of Fig. 1A. The energy distribution of the point spread function of the imaging lens 300c of this embodiment does not vary substantially differently in different directions (e.g. the x and y directions) as the object distance D varies; that is, the point spread function has essentially no directional difference in its variation with object distance D. As Fig. 13A to Fig. 13C show, when the object distance changes from 34 cm to 33 cm the point spread function becomes more concentrated in all directions simultaneously, and when the object distance changes from 33 cm to 32 cm the point spread function becomes more dispersed in all directions simultaneously. The Strehl ratio listed in Fig. 13A to Fig. 13C is related to the intensity of the point spread function: the larger the Strehl ratio, the more concentrated the point spread function.
The imaging lens 300c of this embodiment includes, arranged in order from the object side to the image side, a first lens 310c, a second lens 320c, a third lens 330c, a fourth lens 340c, and a fifth lens 350c, whose refractive powers are respectively positive, negative, positive, positive, and negative. In this embodiment, the first lens 310c is, for example, a positive meniscus lens with its convex surface facing the object side, the second lens 320c is, for example, a negative meniscus lens with its convex surface facing the object side, the third lens 330c is, for example, a biconvex lens, the fourth lens 340c is, for example, a positive meniscus lens with its convex surface facing the object side, and the fifth lens 350c is, for example, a biconcave lens.
The following lists an embodiment of the imaging lens 300c, but the invention is not limited thereto.
(Table 4)
The physical quantities in Table 4 are explained as for Table 1. In Table 4, surfaces S1c and S2c are the two surfaces of the first lens 310c, where surface S1c is the aperture stop. Surfaces S3c and S4c are the two surfaces of the second lens 320c, surfaces S5c and S6c are the two surfaces of the third lens 330c, surfaces S7c and S8c are the two surfaces of the fourth lens 340c, and surfaces S9c and S10c are the two surfaces of the fifth lens 350c. For the radius of curvature, spacing, and other parameter values of each surface, refer to Table 4; they are not repeated here. The spacing value in the row of surface S10c is the straight-line distance along the optical axis A from surface S10c to the image sensing unit 210.
Moreover, the surfaces S1c-S10c described above are aspheric and can be expressed by the aspheric formula given above for surfaces S1 and S3-S10; for the meaning of each parameter in the formula, refer to the explanation given there, which is not repeated here. In this embodiment the coefficient A1 is 0. Table 5 lists the aspheric parameter values of surfaces S1c-S10c.
(Table 5)
[Table 5 is reproduced as an image in the original publication; it lists the aspheric parameter values of surfaces S1c to S10c.]
(Table 6)
Image height: 3 millimeters (1/3 inch, 2 megapixels)
Focal length: 10 millimeters
F-number: 1.2
Field angle: 32 degrees
Object distance: 300 millimeters
Relative illumination: >50%
Optical distortion: <2%
Chief ray angle: maximum chief ray angle < 28.6 degrees
Table 6 lists one embodiment of the specification of the imaging lens 300c, but the invention is not limited thereto. For the meaning of each parameter in Table 6, refer to the notes on the parameters of Table 3 above.
Referring again to Figure 11 and Figures 12A to 12G, it can be seen from Figures 12A to 12G that when the object 50 is located at the best object distance of the imaging lens 300c, the image captured by the imaging lens 300c retains more high-frequency signal and is therefore comparatively sharp, and the image sharpness gradually declines as the object distance D departs from the best object distance of the imaging lens 300c. Exploiting this characteristic, the processing unit 220 can analyze the image parameter corresponding to each spatial frequency in the image, compare the difference variations of the image parameters across a plurality of spatial frequencies, or further compare the difference variations of the image parameters of different colors of light, and then compare the resulting relation with the signal E to judge whether the object 50 falls within the preset object distance range, or to calculate the object distance D of the object 50. In this embodiment, the image parameters include the image blur values at a plurality of different spatial frequencies.
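For illustration only, the following minimal Python sketch shows one way such spatial-frequency-resolved image parameters could be computed from an image patch; the band edges and the energy-ratio metric are assumptions for this sketch and are not prescribed by the embodiment.

    import numpy as np

    def band_energies(patch, bands):
        """Relative spectral energy of a grayscale patch in several
        spatial-frequency bands (one possible realization of 'image
        parameters at a plurality of spatial frequencies').

        patch: 2-D float array; bands: list of (lo, hi) in cycles/pixel.
        """
        spec = np.abs(np.fft.fftshift(np.fft.fft2(patch - patch.mean())))
        fy = np.fft.fftshift(np.fft.fftfreq(patch.shape[0]))[:, None]
        fx = np.fft.fftshift(np.fft.fftfreq(patch.shape[1]))[None, :]
        radius = np.hypot(fx, fy)              # radial spatial frequency
        total = spec.sum() + 1e-12
        return [spec[(radius >= lo) & (radius < hi)].sum() / total
                for lo, hi in bands]

    # Example: three bands, roughly low, mid, and high spatial frequency.
    # energies = band_energies(patch, [(0.02, 0.1), (0.1, 0.2), (0.2, 0.5)])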
In addition, the image blur value in the x direction and the image blur value in the y direction can also be obtained from the image at different spatial frequencies. Figure 14 illustrates this: it is a graph of the x-direction image blur value and the y-direction image blur value at a certain spatial frequency as functions of the object distance D. By comparing the relation between the two curves of Figure 14, and then comparing the signal E with that relation, the processing unit 220 can judge whether the object 50 falls within the preset object distance range, or calculate the object distance D of the object 50. The details of using the two curves of Figure 14 for this judgment or calculation can follow methods similar to those of the other embodiments above and are not repeated here.
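Purely as an illustrative sketch (the patent does not fix a formula for the blur value), the x- and y-direction blur values used with the Figure 14 curves could be computed along the following lines:

    import numpy as np

    def directional_blur(patch, eps=1e-9):
        """Blur values along x and y, taken here as the inverse of the mean
        squared second difference in each direction: the more blurred the
        patch is along a direction, the weaker its second differences there.
        """
        p = patch.astype(float)
        ddx = np.diff(p, n=2, axis=1)          # second difference along x
        ddy = np.diff(p, n=2, axis=0)          # second difference along y
        return 1.0 / (np.mean(ddx ** 2) + eps), 1.0 / (np.mean(ddy ** 2) + eps)

Comparing the measured pair of blur values against the two calibrated curves of Figure 14 then brackets the object distance D in the manner described above.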
Figure 15 is a graph of the slope of the x-direction blur value variation and the slope of the y-direction blur value variation of Figure 14, at the specific spatial frequency, with respect to the object distance. Referring to Figures 11, 14, and 15, the two curves of Figure 15 can be regarded as the derivative functions obtained by differentiating the two curves of Figure 14 with respect to the object distance. When the sampled object distance values are discrete, the ordinate of a point on a Figure 15 curve at a certain object distance is the slope of the line connecting the point at that object distance on the corresponding Figure 14 curve with its neighboring point. In this embodiment, the processing unit 220 further judges whether the object 50 falls within the preset object distance range according to the difference variations, with respect to the object distance D, that the slopes of the image parameters produce as the object distance D varies. In other words, as Figure 15 illustrates, the slope of the x-direction blur value variation and the slope of the y-direction blur value variation vary differently with the object distance D, so the processing unit 220 can compare them against the signal E to judge whether the object 50 falls within the preset object distance range, or to calculate the object distance D of the object 50.
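For discrete calibration samples, the Figure 15 slope curves can be approximated numerically; a brief sketch follows, with purely illustrative numbers (the sample distances and blur values below are hypothetical, not taken from the patent):

    import numpy as np

    # Hypothetical calibration samples: object distances (mm) and the
    # Figure 14 blur values measured at them (illustrative numbers only).
    d_mm   = np.array([300.0, 310.0, 320.0, 330.0, 340.0])
    blur_x = np.array([1.8, 1.4, 1.1, 1.3, 1.7])
    blur_y = np.array([1.2, 1.0, 1.1, 1.5, 2.0])

    # np.gradient takes the secant through neighboring samples, which matches
    # the adjacent-point slope construction of Figure 15.
    slope_x = np.gradient(blur_x, d_mm)
    slope_y = np.gradient(blur_y, d_mm)
    slope_gap = slope_x - slope_y   # direction-dependent slope difference vs. D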
Figure 16 is a schematic diagram of an imaging lens according to another embodiment of the invention; Figure 17 shows the energy distributions of the red-light point spread function and the green-light point spread function of the imaging lens of Figure 16; and Figure 18 is a graph of the x-direction red-light blur value, the y-direction red-light blur value, the x-direction green-light blur value, and the y-direction green-light blur value produced by the imaging lens of Figure 16 as functions of the object distance. Referring to Figures 16, 17, and 18, the imaging lens 300d of this embodiment may also replace the imaging lens 300 of Figures 1A and 2, for use in the ranging apparatus 200 and the interactive display system 100 of Figure 1A. Compared with the imaging lens 300c of Figure 11, the imaging lens 300d of this embodiment not only exhibits blur value variation at different object distances, but also separates the best imaging object distances of different colors of light (for example red light and green light) by more than the distance resolution to be achieved. For example, the difference between the best imaging object distances of red light and green light can be made greater than 1 centimeter, improving the accuracy with which the processing unit 220 judges the distance of the object 50.
The imaging lens 300d of this embodiment includes, in order from the object side to the image side, a first lens 310d, a second lens 320d, a third lens 330d, a fourth lens 340d, and a fifth lens 350d, whose refractive powers are respectively positive, negative, negative, positive, and negative. In this embodiment, the first lens 310d is, for example, a biconvex lens; the second lens 320d is, for example, a negative meniscus lens with its convex surface facing the object side; the third lens 330d is, for example, a negative meniscus lens with its convex surface facing the image side; the fourth lens 340d is, for example, a biconvex lens; and the fifth lens 350d is, for example, a biconcave lens.
An embodiment of the imaging lens 300d is given below, but the invention is not limited thereto.
(Table 7)
[Table 7 is reproduced as an image in the original publication; it lists the radius of curvature, spacing, and related parameters of surfaces S1d to S10d.]
For the meaning of each physical quantity in Table 7, refer to the notes to Table 1. In Table 7, surfaces S1d and S2d are the two surfaces of the first lens 310d, where surface S1d is the aperture stop; surfaces S3d and S4d are the two surfaces of the second lens 320d; surfaces S5d and S6d are the two surfaces of the third lens 330d; surfaces S7d and S8d are the two surfaces of the fourth lens 340d; and surfaces S9d and S10d are the two surfaces of the fifth lens 350d. For the values of parameters such as the radius of curvature and spacing of each surface, refer to Table 7; they are not repeated here. In addition, the spacing value in the row of surface S10d is the straight-line distance along the optical axis A from surface S10d to the image sensing unit 210.
Moreover, the above surfaces S1d to S10d are aspheric, and they can be described by the aspheric formula given above for surfaces S1 and S3 to S10; for the meaning of each parameter in that formula, refer to its notes for S1 and S3 to S10, which are not repeated here. In this embodiment, the coefficient A1 is 0. Table 8 lists the aspheric parameter values of surfaces S1d to S10d.
(Table 8)
[Table 8 is reproduced as an image in the original publication; it lists the aspheric parameter values of surfaces S1d to S10d.]
(Table 9)
Image height: 3 millimeters (1/3 inch, 2 megapixels)
Focal length: 9.46 millimeters
F-number: 1.2
Field angle: 34 degrees
Object distance: 300 millimeters
Relative illumination: >50%
Optical distortion: <2%
Chief ray angle: maximum chief ray angle < 28.6 degrees
Table 9 lists one embodiment of the specification of the imaging lens 300d, but the invention is not limited thereto. For the meaning of each parameter in Table 9, refer to the notes on the parameters of Table 3 above.
As shown in Figure 17, the point spread function of the imaging lens 300d of this embodiment not only concentrates or disperses in its energy distribution as the object distance D varies, but also has a different best object distance for each color of light. For example, the upper row of Figure 17 shows the energy distribution of the point spread function of red light and the lower row shows that of green light; the best object distance of red light is at 32 centimeters, while that of green light is at 28 centimeters. Therefore, according to the image sharpness for different colors of light, or by further analyzing the image parameters (for example, blur values) corresponding to different spatial frequencies in the images of the different colors of light, the processing unit 220 can likewise judge whether the object distance D of the object 50 falls within the preset object distance range, or calculate the object distance D of the object 50. Specifically, as shown in Figure 18, the signal E can be compared with the relation of the four curves of Figure 18 to judge whether the object distance D of the object 50 falls within the preset object distance range, or to calculate the object distance D of the object 50.
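A minimal sketch of the per-channel sharpness comparison, assuming an RGB image as a NumPy array; the gradient-magnitude metric here is our assumption, not the patent's prescribed measure:

    import numpy as np

    def channel_sharpness(rgb):
        """Mean gradient magnitude of the red and green channels; the sharper
        channel indicates which color's best object distance the object is
        nearer to (32 cm for red vs. 28 cm for green, per Figure 17)."""
        scores = {}
        for name, idx in (("red", 0), ("green", 1)):
            chan = rgb[..., idx].astype(float)
            gy, gx = np.gradient(chan)
            scores[name] = float(np.mean(np.hypot(gx, gy)))
        return scores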
Figure 19 is a graph of the difference between the green-light blur value and the red-light blur value in the x direction of Figure 18, and the difference between the green-light blur value and the red-light blur value in the y direction, as functions of the object distance D. Referring to Figure 19, in another embodiment the processing unit 220 can judge whether the object 50 falls within the preset object distance range according to the differences between these image parameters at different object distances D, where the differences are, for example, the green-light blur value in the x direction minus the red-light blur value in the x direction (in other embodiments, the red-light blur value in the x direction minus the green-light blur value in the x direction) and the green-light blur value in the y direction minus the red-light blur value in the y direction (in other embodiments, the red-light blur value in the y direction minus the green-light blur value in the y direction). In this embodiment, these two differences, which vary differently with the object distance D, can be used to judge whether the object 50 falls within the preset object distance range, and the object distance D can be computed from them. Compared with the embodiment of Figure 18, an embodiment adopting the curve relations of Figure 19 need not select thresholds for the image parameters; instead, the extrema and zero points of the differences determine whether the processing unit 220 begins judging whether the object 50 falls within the preset object distance range, or begins calculating the object distance of the object 50.
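A short sketch of using the Figure 19 difference curve without thresholds; the sample distances and blur values below are hypothetical, for illustration only:

    import numpy as np

    # Hypothetical samples of the Figure 18 curves: green- and red-light blur
    # values on the x direction over a range of object distances.
    d_mm   = np.array([260.0, 280.0, 300.0, 320.0, 340.0])
    blur_g = np.array([0.9, 0.6, 0.9, 1.4, 1.9])
    blur_r = np.array([1.5, 1.1, 0.8, 0.7, 1.0])

    diff_x = blur_g - blur_r                                 # the Figure 19 difference curve
    crossings = np.where(np.diff(np.sign(diff_x)) != 0)[0]   # indices bracketing zero points
    d_extremum = d_mm[np.argmax(np.abs(diff_x))]             # distance of the extreme difference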
Besides judging whether the object 50 falls within the preset object distance range, or calculating its object distance D, from the image values obtained in a single shot as in the embodiments above, in other embodiments the processing unit 220 can also determine the object distance D of the object 50 by comparing the relative variation of the image values obtained at different times, thereby improving the accuracy of the object distance judgment. Alternatively, the processing unit 220 can determine the object distance D of the object 50 by comparing the image values obtained at different times with thresholds of the corresponding image parameters obtained by calibration in advance.
In the embodiments above, the processing unit 220 is, for example, a processing circuit or software stored in a computer-readable medium.
Figure 20 illustrates an embodiment of the processing flow of the processing unit of Figure 1. Referring to Figure 20, the procedure of the processing unit 220 may include the following steps. First, the processing unit 220 executes step P110, performing an initialization calibration to obtain at least one of the various curve relations involving the image parameters described above, for example the relation illustrated in one of Figures 3B, 4, 9, 10, 14, 15, 18, and 19. Next, step P120 captures an image, that is, acquires the signal E from the image sensing unit 210 to obtain the image data. Then, steps P132, P134, and P136 analyze the image data captured in step P120 and obtain the image values corresponding to the different image parameters (such as the different image blur values of the embodiments above); there are, for example, N kinds of image values, where N is a positive integer greater than or equal to 2. Afterward, step P140 judges whether the object 50 falls within the preset object distance range by comparing the N kinds of image values obtained in steps P132, P134, and P136 with the image parameter relations obtained during the initialization calibration of step P110. Next, step P150 determines the x coordinate and the y coordinate of the object 50 from the image data captured in step P120; when the object 50 is a fingertip, its x and y coordinates can be judged from the position of the skin-color region in the image. Then, step P160 produces the interaction of the user interface. Specifically, how the user interface interaction is produced can be decided from the fingertip touch position judged in step P150, for example clicking, dragging, or other operations on objects in the picture. Moreover, after the initialization calibration of step P110, the user may recalibrate the parameters according to usage needs to further improve the accuracy of the user interface interaction, and through the user interface the user may also instruct the processing unit 220 to capture an image again to judge the object distance D and the position of the object 50. In other embodiments, step P150 may be performed before step P140, for example between step P120 and steps P132, P134, and P136; in that case step P150 may perform image segmentation, extracting the image of the object 50 (such as a fingertip) from the image obtained in step P120, so that the subsequent steps P132, P134, and P136 analyze only the segmented image of the object 50, simplifying the data and the analysis.
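The flow P110 to P160 could be organized as below. This is a schematic sketch only, following the variant in which segmentation precedes the blur analysis; the calibration matching rule, the tolerance value, and the helper functions (capture, segment_fingertip, blur_metrics, ui) are assumptions, not the patent's concrete implementation:

    import numpy as np

    def calibrate(measure_params, distances_mm):
        """Step P110: record the N image-parameter values at known distances."""
        return {d: np.asarray(measure_params(d)) for d in distances_mm}

    def judge_distance(values, calibration, tolerance):
        """Step P140: match the N measured values to the calibrated curves;
        return the best-matching distance, or None if outside the preset range."""
        best_d, best_err = None, np.inf
        for d, ref in calibration.items():
            err = float(np.linalg.norm(np.asarray(values) - ref))
            if err < best_err:
                best_d, best_err = d, err
        return best_d if best_err < tolerance else None

    def frame_step(capture, segment_fingertip, blur_metrics, calibration, ui):
        image = capture()                        # step P120: acquire signal E
        region, xy = segment_fingertip(image)    # step P150: skin-color x, y
        values = blur_metrics(region)            # steps P132/P134/P136: N values
        d = judge_distance(values, calibration, tolerance=0.5)
        if d is not None:                        # object inside the preset range
            ui.interact(xy, d)                   # step P160: click, drag, etc.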
Figure 21 illustrates an embodiment of the flow by which the processing unit calculates the image blur value. Referring to Figure 21, the blur computation of the above embodiments may adopt any method in the papers cited above, or other blur computation methods; one such method is given here as an example, but the invention is not limited thereto. The method of Figure 21 regards a blurred image as the convolution of a sharp image with a Gaussian function, re-blurs (re-blur) the original image, and infers the degree of blur from the difference between the images before and after the re-blurring. For example, as Figure 21 illustrates, the flow for computing the image blur value includes the following steps. First, as shown in step Q110, the source image is obtained from the image sensing unit 210, that is, the raw image data are obtained from the signal E of the image sensing unit 210. Next, steps Q122 and Q124 blur the source image with two different blur radii, blur radius 1 and blur radius 2, to obtain blurred image 1 and blurred image 2 respectively, the blur radius being related to the width of the Gaussian function; in other words, the source image is convolved with Gaussian functions of two different widths to obtain two different blurred images. Afterward, step Q130 estimates the blur value by comparing blurred image 1 and blurred image 2 with the source image. Then, at step Q140, the blur value is obtained.
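A minimal Python sketch of this re-blur flow (steps Q110 to Q140), assuming a grayscale source image and using the well-known gradient-ratio form of the re-blur estimate; the inversion formula is our assumption, since the patent describes only the flow:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def reblur_blur_value(source, sigma1=1.0, sigma2=2.0, eps=1e-9):
        """Estimate the defocus blur sigma0 of 'source' (2-D float array).

        Steps Q122/Q124: re-blur with two Gaussian radii.
        Step Q130: compare the re-blurred images via the ratio of their peak
        gradient magnitudes; for a step edge of blur sigma0 that ratio is
        R = sqrt((sigma0^2 + sigma2^2) / (sigma0^2 + sigma1^2)).
        Step Q140: invert the ratio into the blur value sigma0.
        """
        b1 = gaussian_filter(source, sigma1)     # blurred image 1
        b2 = gaussian_filter(source, sigma2)     # blurred image 2
        g1 = np.hypot(*np.gradient(b1)).max()
        g2 = np.hypot(*np.gradient(b2)).max()
        ratio = g1 / (g2 + eps)
        if ratio <= 1.0:
            return np.inf                        # too blurred to resolve
        if ratio >= sigma2 / sigma1:
            return 0.0                           # effectively in focus
        return np.sqrt((sigma2**2 - ratio**2 * sigma1**2) / (ratio**2 - 1.0))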
When this blur computation is applied to an embodiment that produces astigmatism-type aberration, such as that of Figure 3A, the re-blur kernel can be split into two kinds of blur, one along the x axis and one along the y axis, and whether the object 50 falls within the preset object distance range is judged by comparing the difference between the x-axis and y-axis blur values. When the blur computation is applied to an embodiment that produces chromatic-type aberration, such as that of Figure 17, the image can be split into a red-channel image and a green-channel image, and the blur values of the two channels computed separately. In that case the blur value exhibits the double-Gaussian distribution with respect to the object distance D illustrated in Figure 22, where Figure 22 is a graph of the modulation transfer function of an imaging lens with chromatic-type aberration according to an embodiment of the invention. Pixels whose blur values exceed a certain threshold can be regarded as lying in the in-focus region: region R1 is the in-focus region of green light, and region R2 is the in-focus region of red light. Taking the intersection of the red-channel and green-channel in-focus regions (the hatched area in Figure 22) as the trigger region at this point (for example, the point at which the apparatus begins judging whether the object 50 falls within the preset object distance range, or begins calculating the object distance D of the object 50) improves the accuracy of the object distance judgment.
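A short sketch of the trigger-region test of Figure 22, with the per-pixel focus maps and the two thresholds assumed as inputs:

    import numpy as np

    def trigger_mask(focus_r, focus_g, thr_r, thr_g):
        """Pixels judged in focus in BOTH the red and green channels, i.e.,
        the hatched intersection of regions R1 (green) and R2 (red) in
        Figure 22. focus_r / focus_g are per-pixel focus measures for the two
        channels; thr_r / thr_g are the per-channel thresholds named above."""
        return (focus_r > thr_r) & (focus_g > thr_g)

    # Ranging (or touch detection) is triggered only where the mask is True,
    # inside the overlap of the two channels' in-focus regions.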
Figure 23 is a flowchart of the ranging method of an embodiment of the invention. Referring to Figures 1A and 23, the ranging method of this embodiment can be applied to the ranging apparatus 200 of Figure 1A or to the ranging apparatuses of other embodiments; the following description uses the ranging apparatus 200 of Figure 1A as an example. The ranging method of this embodiment includes the following steps. First, step S110 produces a plurality of image parameters through the imaging lens 300, the image parameters varying differently as the object distance D varies; for the details of how the imaging lens 300 produces the image parameters, refer to the embodiments above, which are not repeated here. Next, step S120 images the object 50 through the imaging lens 300 to obtain an image, for example by forming the image on the image sensing unit 210; for the details of forming the image on the image sensing unit, refer to the explanation of the embodiments above, not repeated here. Afterward, step S130 judges whether the object 50 falls within the preset object distance range according to the obtained image and the different variations that the image parameters produce as the object distance varies. In this embodiment, the obtained image is the image data contained in the signal E of Figure 1A; for the details of judging whether the object 50 falls within the preset object distance range and of computing the object distance D, refer to the explanation of the embodiments above, not repeated here. In other words, step S130 may be performed by the processing unit 220. In addition, the ranging method of this embodiment may analyze the image, for example by analyzing the signal E, to obtain the image values corresponding respectively to the image parameters, and may select from the image, for example from the image data of the signal E, the range of the image to be analyzed corresponding to the object 50.
In summary, in the ranging apparatus and ranging method of the embodiments of the invention, whether an object falls within a preset object distance range is judged, or the object distance of the object is calculated, from the different variations that a plurality of image parameters produce as the object distance varies. The accuracy of the judgment and calculation is thereby improved, and the ranging speed is increased, achieving real-time ranging. Furthermore, because the interactive display system of the embodiments of the invention adopts the above ranging apparatus, the user can interact with a display image floating in the air, simulating the effect of the user touching and operating the floating display image.
Although the invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Anyone skilled in the art may make slight changes and refinements without departing from the spirit and scope of the invention, so the protection scope of the invention shall be defined by the appended claims.

Claims (42)

1. A ranging apparatus, comprising:
an imaging lens, wherein an image obtained through the imaging lens possesses a plurality of image parameters, and the plurality of image parameters vary differently as an object distance varies;
an image sensing unit, wherein the imaging lens images an object on the image sensing unit to form an image, and the image sensing unit converts the image into a signal; and
a processing unit, judging whether the object falls within a preset object distance range according to the signal and the plurality of different variations that the plurality of image parameters produce as the object distance varies.
2. The ranging apparatus as claimed in claim 1, wherein the plurality of image parameters comprise two image blur values in two different directions.
3. The ranging apparatus as claimed in claim 2, wherein the two different directions are substantially perpendicular to each other.
4. The ranging apparatus as claimed in claim 1, wherein the plurality of image parameters comprise image blur values of a plurality of different colors of light.
5. The ranging apparatus as claimed in claim 1, wherein the plurality of image parameters comprise image blur values of a plurality of different spatial frequencies.
6. The ranging apparatus as claimed in claim 1, wherein the processing unit further judges whether the object falls within the preset object distance range according to differences among the plurality of image parameters as the object distance varies.
7. The ranging apparatus as claimed in claim 1, wherein the processing unit further judges whether the object falls within the preset object distance range according to the difference variations that the slopes of the variations of the plurality of image parameters with the object distance produce with respect to the variation of the object distance.
8. The ranging apparatus as claimed in claim 1, wherein the intensities of the energy distribution of the point spread function of the imaging lens in two different directions reach extreme values at different object distances.
9. The ranging apparatus as claimed in claim 8, wherein the two different directions are substantially perpendicular to each other.
10. The ranging apparatus as claimed in claim 8, wherein the imaging lens has at least one non-axially-symmetric lens, the non-axially-symmetric lens has at least one non-axially-symmetric curved surface, and the profiles of the non-axially-symmetric curved surface in the two different directions are different.
11. The ranging apparatus as claimed in claim 1, wherein the energy distributions of the point spread functions of a plurality of different colors of light of the imaging lens vary differently as the object distance varies.
12. The ranging apparatus as claimed in claim 11, wherein the axial chromatic aberration of the imaging lens divided by the focal length falls within a range from 0.0010 to 0.0100 or from -0.0010 to -0.0100, and the axial chromatic aberration is the axial chromatic aberration between the different colors of light to which the plurality of image parameters correspond.
13. The ranging apparatus as claimed in claim 1, wherein the processing unit further decides, according to the plurality of image parameters and thresholds obtained by calibration in advance, whether to begin judging according to the signal whether the object falls within the preset object distance range.
14. The ranging apparatus as claimed in claim 13, wherein the processing unit further determines the object distance of the object according to extreme values of the variations of the plurality of image parameters with respect to the object distance.
15. The ranging apparatus as claimed in claim 1, wherein the processing unit further determines the object distance of the object according to extreme values of the variations of the plurality of image parameters with respect to the object distance.
16. The ranging apparatus as claimed in claim 1, wherein the processing unit processes the signal to obtain a plurality of image values corresponding respectively to the plurality of image parameters, and the processing unit determines the object distance of the object by comparing the relative variation of the image values obtained at different times.
17. The ranging apparatus as claimed in claim 1, wherein the processing unit processes the signal to obtain a plurality of image values corresponding respectively to the plurality of image parameters, and the processing unit determines the object distance of the object by comparing the image values obtained at different times with thresholds of the plurality of image parameters obtained by calibration in advance.
18. The ranging apparatus as claimed in claim 1, wherein the processing unit judges the object distance of the object according to the signal obtained by the image sensing unit in a single shot.
19. The ranging apparatus as claimed in claim 1, wherein the processing unit comprises:
a position interpretation subunit, determining the position of the object in a direction perpendicular to the object distance according to the signal;
an image segmentation subunit, selecting from the image the range of the image to be analyzed corresponding to the object;
an image computation subunit, calculating, according to the selected image to be analyzed, a plurality of image values corresponding respectively to the plurality of image parameters; and
a distance estimation subunit, determining the object distance of the object according to the calculated image values.
20. A ranging method, comprising:
causing an image obtained through an imaging lens to possess a plurality of image parameters, wherein the plurality of image parameters vary differently as an object distance varies;
imaging an object through the imaging lens to obtain an image; and
judging whether the object falls within a preset object distance range according to the obtained image and the plurality of different variations that the plurality of image parameters produce as the object distance varies.
21. The ranging method as claimed in claim 20, wherein the plurality of image parameters comprise two image blur values in two different directions.
22. The ranging method as claimed in claim 21, wherein the two different directions are substantially perpendicular to each other.
23. The ranging method as claimed in claim 20, wherein the plurality of image parameters comprise image blur values of a plurality of different colors of light.
24. The ranging method as claimed in claim 20, wherein the plurality of image parameters comprise image blur values of a plurality of different spatial frequencies.
25. The ranging method as claimed in claim 20, wherein judging whether the object falls within the preset object distance range comprises:
judging whether the object falls within the preset object distance range according to differences among the plurality of image parameters as the object distance varies.
26. The ranging method as claimed in claim 20, wherein judging whether the object falls within the preset object distance range comprises:
judging whether the object falls within the preset object distance range according to the difference variations that the slopes of the variations of the plurality of image parameters with the object distance produce with respect to the variation of the object distance.
27. The ranging method as claimed in claim 20, wherein producing the plurality of image parameters through the imaging lens comprises causing the intensities of the energy distribution of the point spread function of the imaging lens in two different directions to reach extreme values at different object distances.
28. The ranging method as claimed in claim 27, wherein the two different directions are substantially perpendicular to each other.
29. The ranging method as claimed in claim 20, wherein producing the plurality of image parameters through the imaging lens comprises:
causing the energy distributions of the point spread functions of a plurality of different colors of light of the imaging lens to vary differently as the object distance varies.
30. The ranging method as claimed in claim 29, wherein causing the energy distributions of the point spread functions of the plurality of different colors of light of the imaging lens to vary differently as the object distance varies comprises:
causing the axial chromatic aberration of the imaging lens divided by the focal length to fall within a range from 0.0010 to 0.0100 or from -0.0010 to -0.0100, wherein the axial chromatic aberration is the axial chromatic aberration between the different colors of light to which the plurality of image parameters correspond.
31. The ranging method as claimed in claim 20, wherein judging whether the object falls within the preset object distance range comprises:
deciding, according to the plurality of image parameters and thresholds obtained by calibration in advance, whether to begin judging according to the image whether the object falls within the preset object distance range.
32. The ranging method as claimed in claim 31, further comprising:
determining the object distance of the object according to extreme values of the variations of the plurality of image parameters with respect to the object distance.
33. The ranging method as claimed in claim 20, further comprising:
determining the object distance of the object according to extreme values of the variations of the plurality of image parameters with respect to the object distance.
34. The ranging method as claimed in claim 20, further comprising:
analyzing the image to obtain a plurality of image values corresponding respectively to the plurality of image parameters; and
determining the object distance of the object by comparing the relative variation of the image values obtained at different times.
35. The ranging method as claimed in claim 20, further comprising:
analyzing the image to obtain a plurality of image values corresponding respectively to the plurality of image parameters; and
determining the object distance of the object by comparing the image values obtained at different times with thresholds of the plurality of image parameters obtained by calibration in advance.
36. The ranging method as claimed in claim 20, further comprising:
judging the object distance of the object according to the image obtained in a single shot of the object through the imaging lens.
37. The ranging method as claimed in claim 20, further comprising:
determining, according to the image, the position of the object in a direction perpendicular to the object distance;
selecting from the image the range of the image to be analyzed corresponding to the object;
calculating, according to the selected image to be analyzed, a plurality of image values corresponding respectively to the plurality of image parameters; and
determining the object distance of the object according to the calculated image values.
38. An interactive display system, comprising:
an image generation unit, forming a display image in space; and
a ranging apparatus, comprising:
an imaging lens, producing a plurality of image parameters, wherein the plurality of image parameters vary differently as an object distance varies;
an image sensing unit, wherein the imaging lens images an object on the image sensing unit to form an image of the object, and the image sensing unit converts the image of the object into a signal; and
a processing unit, judging whether the object falls within a preset object distance range according to the signal and the plurality of different variations that the plurality of image parameters produce as the object distance varies, wherein when the processing unit judges that the object falls within the preset object distance range, the processing unit judges that the object touches the display image.
39. The interactive display system as claimed in claim 38, wherein the display image is a real image.
40. The interactive display system as claimed in claim 38, wherein the display image is a virtual image.
41. The interactive display system as claimed in claim 38, wherein the image generation unit and the ranging apparatus are located on opposite sides of the display image.
42. The interactive display system as claimed in claim 38, wherein the image generation unit and the ranging apparatus are located on the same side of the display image.
CN201210294992.4A 2011-12-29 2012-08-17 Ranging apparatus, ranging method, and interactive display system Active CN103185568B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161581098P 2011-12-29 2011-12-29
US61/581,098 2011-12-29

Publications (2)

Publication Number Publication Date
CN103185568A true CN103185568A (en) 2013-07-03
CN103185568B CN103185568B (en) 2015-05-13

Family

ID=48676868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210294992.4A Active CN103185568B (en) 2011-12-29 2012-08-17 Ranging apparatus, ranging method, and interactive display system

Country Status (2)

Country Link
CN (1) CN103185568B (en)
TW (1) TW201326755A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108141534A (en) * 2015-10-05 2018-06-08 Google LLC Autofocus method and device using modulation transfer function curves

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI782280B (en) * 2020-06-01 2022-11-01 財團法人國家實驗研究院 Auto focus method for a remote sensing satellite and the satellite therefor


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6747666B2 (en) * 2000-01-26 2004-06-08 New York University Method and system for facilitating wireless, full-body, real-time user interaction with digitally generated text data
US20060197756A1 (en) * 2004-05-24 2006-09-07 Keytec, Inc. Multi-mode optical pointer for interactive display system
CN101243458A (en) * 2005-08-15 2008-08-13 索尼电子有限公司 Image acquisition system for creating a depth map
US20090141163A1 (en) * 2007-12-04 2009-06-04 Dblur Technologies Ltd. Compact camera optics
US20090147999A1 (en) * 2007-12-10 2009-06-11 Fujifilm Corporation Image processing system, image processing method, and computer readable medium
US20100315541A1 (en) * 2009-06-12 2010-12-16 Yoshitaka Egawa Solid-state imaging device including image sensor
US20110025845A1 (en) * 2009-07-31 2011-02-03 Samsung Electro-Mechanics Co., Ltd. Apparatus and method for measuring location and distance of object by using camera
US20110187678A1 (en) * 2010-01-29 2011-08-04 Tyco Electronics Corporation Touch system using optical components to image multiple fields of view on an image sensor
US20110222734A1 (en) * 2010-03-10 2011-09-15 Industrial Technology Research Institute Methods for evaluating distances in a scene and apparatus and machine readable medium using the same
US20110267508A1 (en) * 2010-04-30 2011-11-03 Kane Paul J Digital camera with coded aperture rangefinder


Also Published As

Publication number Publication date
CN103185568B (en) 2015-05-13
TW201326755A (en) 2013-07-01

Similar Documents

Publication Publication Date Title
US9098147B2 (en) Ranging apparatus, ranging method, and interactive display system
CN110570371B (en) Image defogging method based on multi-scale residual error learning
CN109489620B (en) Monocular vision distance measuring method
CN104656230B (en) Image capturing array system and fingerprint identification device
CN102656605B (en) Image processing apparatus, image processing method
US10375378B2 (en) Dual camera system for real-time depth map generation
WO2020043155A1 (en) Multiple scale image fusion method and device, storage medium, and terminal
CN104574423B (en) Single-lens imaging PSF (point spread function) estimation method based on spherical aberration calibration
EP2903256B1 (en) Image processing device, image processing method and program
EP2849148A2 (en) Three-dimensional printing system and method for three-dimensional printing
CN105023249A (en) Highlight image restoration method and device based on optical field
CN105678736B (en) Change the image processing system and its operating method of estimation of Depth with aperture
CN109656033B (en) Method and device for distinguishing dust and defects of liquid crystal display screen
CN103426149A (en) Large-viewing-angle image distortion correction and processing method
US20200218343A1 (en) Gaze point compensation method and apparatus in display device, and display device
CN104363377A (en) Method and apparatus for displaying focus frame as well as terminal
CN105452926A (en) Image capture device and focus control method
Koppal et al. Toward wide-angle microvision sensors
CN105979248B (en) Image processing system and its operating method with interacting depth estimation
WO2017014933A1 (en) Systems and methods for selecting an image transform
WO2016184152A1 (en) Measuring method and apparatus, mobile terminal and storage medium
CN206400179U (en) A kind of camera lens
CN103185568B (en) Ranging apparatus, ranging method, and interactive display system
CN114554085A (en) Focusing method and device, electronic equipment and storage medium
CN107256563B (en) Underwater three-dimensional reconstruction system and method based on difference liquid level image sequence

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant