WO1999048041A1 - Device for scanning and mapping a surface - Google Patents

Device for scanning and mapping a surface

Info

Publication number
WO1999048041A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
reference points
image
providing
features
Prior art date
Application number
PCT/US1999/005559
Other languages
English (en)
Inventor
Lars KÜCKENDAHL
Original Assignee
Isc/Us, Inc.
Priority date
Filing date
Publication date
Application filed by Isc/Us, Inc. filed Critical Isc/Us, Inc.
Priority to AU30872/99A priority Critical patent/AU3087299A/en
Priority to EP99912509A priority patent/EP1062624A4/fr
Publication of WO1999048041A1 publication Critical patent/WO1999048041A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor
    • G06V40/1312 Sensors therefor direct reading, e.g. contactless acquisition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145 Illumination specially adapted for pattern recognition, e.g. using gratings

Definitions

  • This invention relates to a device and method for scanning and mapping a surface and, more particularly, to a device which enables a touchless method for mapping a surface.
  • If the surface being captured contacts another element during the capturing process, the surface becomes distorted and a true image of the surface cannot be obtained.
  • Conventional fingerprint capture includes inking the fingers of the subject and then having the subject roll the inked fingers, one at a time, over prescribed locations on a specially designed card to transfer images of their fingerprints to the card. If the subject is a suspected criminal, a very young person, or a person with arthritis or another disability, they may be unwilling or unable to roll their fingers in a manner that is suitable for satisfactorily transferring the fingerprint to the card.
  • The problems attendant on capturing fingerprints with ink have been reduced by using optical, thermal or conductive-resistance devices.
  • In optical devices, the finger is placed on a transparent platen and the fingerprint is photographed or captured electronically by a mechanism on the other side of the platen.
  • The residue left by a prior user or by a prior finger might be read simultaneously with the fingerprint of the subject, creating an image having the appearance of a double exposure; in those instances where insufficient pressure has been applied, the residue may fill the empty space without the operator realizing it.
  • The result is a defective fingerprint of which no one is aware until long after it is taken.
  • The finger is flattened when it is placed on the platen, which causes it to be distorted.
  • An uncooperative subject may apply uneven pressure across the fingertip while the fingerprint is being captured, thereby distorting the fingerprint without the person supervising the process realizing it.
  • Thermal and conductive-resistive devices solve some of these problems. However, they are still contact devices; hence, the problem of distortion remains. Further, uncooperative or incapable subjects can defeat these devices just as they defeat ink-based systems.
  • None of these systems is capable of creating an image of a surface comparable to that achieved by actually rolling the surface over a substrate on which the image of the surface is to be captured. Accordingly, the amount of surface area captured has often not been sufficient to accurately classify and/or compare images with sufficient detail to be sorted, classified or compared. This is especially important in the case of fingerprint identification.
  • In one aspect, the invention relates to a device for scanning the surface of an item comprising a scanning zone and means for projecting a pattern of light dots onto the surface to be scanned when it is in the scanning zone.
  • Means are provided for detecting the pattern of light dots.
  • Means are also provided for making a grey scale image of the surface, and means are provided for combining the light dot pattern with the grey scale image to create a two dimensional reproduction of the item that was scanned.
  • In another aspect, the invention relates to a method of scanning and capturing the image of a surface, the surface having a plurality of features, each feature being in a particular place on the surface.
  • The method comprises placing an object whose surface is to be scanned in a scanning zone and placing a plurality of reference points on the surface so that some of the reference points correspond to some of the features.
  • The location of the features on the surface is determined by locating the reference points that correspond to the features so that the image is captured.
  • Figure 1 is a perspective view of a device constructed in accordance with a presently preferred form of the invention.
  • Figure 2 is a side view, partially in section of the interior of the device illustrated in Figure 1.
  • Figure 3 is a block diagram that generally describes the method of the invention.
  • Figure 4 is a plan view of a part of the surface of a finger or other generally cylindrical object with a pattern of light dots projected on it in accordance with the invention.
  • Figure 5 is a grey scale (photographic) image of that part of the surface of a finger or other generally cylindrical object which is illustrated in Figure 4, showing the features of its surface.
  • Figure 6 is a plan view of that part of the surface of a finger or other generally cylindrical object which is illustrated in Figures 4 and 5, with the pattern of dots superimposed on the features of the surface.
  • Figure 7 is a partial section view taken along line 7-7 of Figure 2.
  • Figure 8 is a partial section view taken along line 8-8 of Figure 2.
  • Figure 9 is a plan view of one of the detection plates detecting the first pattern of light clusters.
  • Figure 10 is a plan view of the same part of the surface of a finger or other generally cylindrical object as shown in Figure 4, but with a second pattern of light dots projected on it.
  • Figure 11 is a plan view of the detection plate shown in Figure 9, but detecting a second pattern of light clusters.
  • Figure 12 is a block diagram that generally shows the steps in the enhancement of the light clusters.
  • Figures 13 and 14 show the steps in determining which light clusters are the reflections of light dots.
  • Figure 15 is a plan view of the detection plate shown in Figure 11 after the light clusters are further processed.
  • Figure 16 shows a further step in determining which light clusters are the reflections of light dots.
  • FIGS 17, 18 and 19 show three methods for finding the centers of the light clusters.
  • Figure 20 is a plan view of the detection plate showing the centers of the light clusters.
  • Figure 21 is a schematic showing the method for locating the three dimensional position of the light dots.
  • Figure 22 is a schematic showing the method for mapping three dimensional coordinates into a two dimensional plane.
  • Figure 23 is a pictorial view of a plurality of devices constructed in accordance with the invention arranged to scan the surface of an elongated item.
  • Figure 24 shows a step in creating a composite grey scale image.
  • Figure 25 shows a completed composite grey scale image.
  • Figures 26, 27 and 28 show other systems for creating the light dots.
  • Figure 29 shows another system for finding the three dimensional coordinates of an item being scanned.
  • Figures 30 and 31 show a composite scanned image based on three detection systems.
  • Figures 32 and 33 show a composite scanned image based on four detection systems.
  • A scanning device 10 of a type contemplated by the invention is illustrated in Figure 1.
  • The device can scan the image of a curved or otherwise irregular surface as though the surface were in rolling contact with the medium on which it will be captured.
  • The device 10 comprises a housing 12 and a transparent end wall 14.
  • The housing 12 contains a projection system 20, a detection system 22, a lighting system 24, a timing circuit 26 and a programmable computer.
  • The projection system 20 projects a pattern of light dots 32A onto the surface 38 of an item 40 to be scanned. Then, as seen in Figure 5, the surface to be scanned 38 is lit by the lighting system 24 to illuminate its features.
  • The item to be scanned 40 is placed over the device 10.
  • The detection system 22 detects both the pattern of light dots 32A reflected from the surface to be scanned 38 (Figure 4) and a grey scale (photographic) image (Figure 5) of the surface 38 as illuminated by the lighting system 24.
  • The coordinates of the three dimensional position of each of the light dots 32A are then determined at 36. Consequently, the coordinates of all of the light dots 32A comprise a statement of the shape of the surface, including relative heights, widths and lengths among the various light dots 32A.
  • Each particular light dot 32A is associated with a particular part of the grey scale (photographic) image of the surface 38 being scanned. Since the three dimensional location of each of the light dots 32A is known, the particular part of the grey scale image associated with that particular light dot 32A is also known.
  • A two dimensional drawing of the surface 38 may be made, such as on an FBI fingerprint card 44A, or an image of the surface can be projected onto a viewing screen or monitor 44B for real time or later viewing.
  • The information can be stored 44C in either its three dimensional form or its two dimensional form for later use, such as for comparison to permit access to secure areas, detect unauthorized reproductions or forgeries of items, study sculptures, record and compare facial images or other body parts, and the like.
  • The projection system 20 comprises a projection axis 46, a projection plate 48 and a lens system 50.
  • The projection axis 46 extends through the transparent end wall 14, the projection plate 48 and the lens system 50.
  • The lens system 50 has a focal point 58 which lies along axis 46.
  • The projection plate 48 comprises a large number, e.g., several hundred, of miniature projectors 52.
  • The projectors may be selected so that they project conventional white light onto the surface 38 of the item being scanned.
  • It is preferred, however, that infrared or near infrared light be used, since better imaging will be achieved: the visible white light can then be filtered out by glass filters, which makes the device 10 usable even when exposed to daylight and yields a high contrast picture.
  • The projectors are preferably arranged in a formation such as the rectangular grid shown.
  • A row 60 of projectors 52 and a column 62 of projectors are identified as neutral axes which define a cross 64.
  • The projection axis 46 passes through the intersection of row 60 and column 62, which is the center 66 of the cross 64.
  • Row 60 may be identified as R0.
  • The rows above row R0 may be identified as rows R+1, R+2, R+3, R+4, ..., R+n.
  • The rows below row R0 may be identified as rows R-1, R-2, R-3, R-4, ..., R-n.
  • Column 62 may be identified as C0.
  • The columns to the right of column C0 may be identified as columns C+1, C+2, C+3, C+4, ..., C+m.
  • The columns to the left of column C0 may be identified as columns C-1, C-2, C-3, C-4, ..., C-m.
  • Each projector is at the intersection of a row and a column, with the address of the intersection of row 60 and column 62 being R0, C0 and the location and address of every other projector being R±n, C±m; where R and C identify row and column respectively, + or - indicates the side of the neutral axis on which the projector 52 is located, and ±n indicates the particular row while ±m indicates the particular column.
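The row-and-column addressing scheme described above can be illustrated in code. This is only a sketch: the grid size and the function name `projector_addresses` are ours, not the patent's.

```python
def projector_addresses(n, m):
    """Enumerate projector addresses from R+n down to R-n and from
    C-m across to C+m, with the neutral axes (row R0 and column C0)
    crossing at address (0, 0), the center 66 of the cross 64."""
    return [(r, c)
            for r in range(n, -n - 1, -1)   # rows, top to bottom
            for c in range(-m, m + 1)]      # columns, left to right

addrs = projector_addresses(2, 2)  # a 5 x 5 grid of projectors
```

A projector's address is then simply its (row, column) pair relative to the neutral axes, with sign indicating the side of each axis.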
  • The shape of the projection plate 48 and the number of projectors in each row 60 or column 62 are not critical. Further, there can be a different number of projectors 52 in the rows 60 as compared to the columns 62, or some rows 60 and columns 62 may have more or fewer projectors 52 than other rows and columns.
  • Each of the projectors 52 projects a light beam 54 through the lens system 50 and the transparent end wall 14, which creates a pattern of light dots 32A on the surface 38 of the item being scanned, with each light dot 32A corresponding to the location of the projector 52 on the projection plate 48 that created it. Since the location and address of each projector 52 is known, the position of each beam 54 relative to the other beams 54 is also known, as will be described more fully.

The Detection System
  • The detection system 22 comprises at least one detection axis 68 that extends through the transparent end wall 14. It is presently preferred that there be at least two detection systems 22 and that the axis of each of them extend through the transparent end wall 14. However, a device with only one detection system 22 would function in the same manner as the device described.
  • The detection axes 68 are angularly disposed with respect to each other and on opposite sides of the projection axis 46 to scan about 150 degrees. Nonetheless, the principal method of the invention is the same without regard to the number of detection axes 68 present; the sole difference being that with a larger number of detection axes 68, more of the surface 38 can be seen.
  • The detection system 22 also includes a CCD (charge-coupled device) camera 70 disposed along each detection axis 68.
  • The CCD camera 70 is a well known photographic device that takes a conventional picture through a conventional lens system 76. However, as seen in Figure 8, in its focal plane, instead of an emulsion film, it has a detection plate 80 with a large number, i.e., many thousands, of miniature optical detectors 84, each of which may comprise one pixel of the image. (It should be understood that the term "pixel" is taken to mean the smallest unit of an image having identical color and brightness throughout its area. Several adjacent detectors 84 that detect the identical color and brightness may also be referred to as a "pixel".) The detectors 84 are arranged in a regular grid so that the location and address of each of them is known.
  • The rows of detectors 84 may be identified as RR0, RR+1, RR+2, RR+3, RR+4, ..., RR+n.
  • The columns may be identified as CC0, CC+1, CC+2, ..., CC+m.
  • Each detector 84 is at the intersection of a row and a column, with the address of the intersection in the upper left corner of the plate 80 being RR0, CC0 and the location and address of every other detector 84 being RR+n, CC+m; where RR and CC identify row and column respectively.
  • Light striking each CCD detector 84 causes it to generate an electrical signal, such as a voltage, which is proportional to the intensity of the light that it receives.
  • The lens system 76 of each CCD camera 70 has a focal point 88 which lies along detection axis 68. Since the location and address of each detector 84 is known, the position of each reflected beam 54' relative to the other reflected beams 54' is also known, as will be described more fully. As stated earlier, there are many thousands of detectors on plate 80, but only hundreds of projectors 52 on projection plate 48.
  • The difference in number is necessary since, while the source of each beam of light 54, i.e., the location of each projector 52, can be planned, the location on the detection plate 80 where the reflected beam 54' lands cannot be planned, since where it lands is determined by the shape of the surface 38 being scanned. Therefore, a larger number of detectors is necessary to reasonably assure accuracy in determining the three dimensional coordinates of the light dots 32A. Nonetheless, the number of projectors 52 and detectors 84 could be substantially reduced without departing from the invention. However, with a reduced number of projectors 52 and detectors 84, the accuracy and reliability of a device constructed in accordance with the invention would be diminished.
  • The lighting system 24 may include conventional white or infrared lamps 94 that have a substantially instantaneous illumination and decay cycle for lighting the surface 38 in a conventional manner for the creation of the grey scale (photographic) image shown in Figure 5, as will be more fully explained.
  • The programmable computer controls the timing circuit 26, which in turn controls the projection system 20, the detection system 22, and the lighting system 24.
  • The timing circuit 26 energizes the projection system 20 twice, the lighting system 24 once, and the detection system 22 three times, all in a fraction of a second, so that an item 40 passing through a scanning zone 100 adjacent to and overlying the transparent wall 14 will have its image scanned several times over a brief period, with each scanning cycle comprising two energizations of the projection system 20 and one energization of the lighting system 24.
  • The detection system 22 is energized in parallel with the projection system 20 and lighting system 24 to capture the images that those systems create.
  • The scanning zone 100 may have an upper limit which is defined by a plate 102 that prevents the item being scanned 40 from being moved out of range of the projection and detection systems 20 and 22, and a support 102B to keep the item 40 from touching the transparent end wall 14.
  • The surface 38 is scanned by energizing the timing circuit 26 so that the projection 20 - detection 22 and lighting 24 - detection 22 systems are energized in rapid succession.
  • The item 40 is scanned about 20 times a second. The best scans are selected for use in the method.
  • The item 40 which is to be scanned is placed in the scanning zone 100.
  • The surface 38 is "photographed" by light emanating from the projection system 20 and lighting system 24.
  • The first scan detected in a scanning cycle is of light reflected from the lamps 94 or from the projectors 52.
  • In the method as described here, the first two scans in a scanning cycle are from the projectors 52.
  • The projectors 52 project a first pattern of light dots 32A onto the surface 38, which are reflected by the surface 38 onto the detection plate 80 as light clusters 32B (Figure 9), where they are detected by the detectors 84.
  • As to the light dots 32A, there are a sufficient number of projectors 52 to place them at one millimeter intervals to assure an accurate reproduction of the surface being scanned. This is especially important if the surface being scanned 38 has fine detail that might be lost if the light dots were further apart.
  • The same projectors 52 then project a second pattern of light dots 34A onto the surface 38 (Figure 10), which are reflected onto the detection plate 80 as light clusters 34B (Figure 11).
  • The second pattern of light dots 34A is used as a reference pattern for matching into sets the light beams 54 from particular projectors 52 and the reflected light beams 54' that created particular light dots 32A on the surface 38.
  • The second pattern is the same as the first pattern, except that some of the projectors 52 are marked so that their reflections 34B on the detection plate 80 can be identified.
  • While each light cluster 32B, 34B detected by the detectors 84 is in the same location on the surface 38 relative to the other light clusters 32B, 34B as their projectors 52 were on the projection plate 48, their locations on the detection plate 80 may be displaced from their expected positions due to irregularities in the surface 38, including features such as ridges, arches, bifurcations, ellipses, islands, loops, end points of islands, rods, spirals, tented arches, whorls, depressions, nicks, blisters, scars, pimples, warts, hills, bumps, valleys, holes and the like.
  • The irregularities could result from the fact that the item, or portions of the item whose surface 38 is to be scanned, is curved, cylindrical, wavy or tapered, so that not all portions of the surface are the same distance from the transparent wall 14. Therefore, the angle of a particular reflected light beam 54' cannot be predicted, nor can the location on the detection plate 80 where the light clusters 32B, 34B that it creates are detected be predicted, so the second pattern of light clusters 34B is necessary for the identification.
  • Once each light dot 32A in the first pattern of light dots on the surface 38 is identified, the three-dimensional coordinates that correspond to the position of that light dot 32A are found by a suitable method, such as triangulation. This is done for each particular light dot 32A by determining which projector 52 created it and which detector 84 detected it.
  • Each projected beam of light 54 passes through focal point 58 and each reflected beam of light 54' passes through focal point 88. Since the distance between the focal points 58 and 88 is easily determined when the device 10 is constructed, once the angles made by the beams of light 54 and 54' in each set of beams from and to the projector 52 and detector 84 that created and detected them are known, sufficient information exists to locate the light dot 32A in three dimensions. The method by which this is done will be explained.
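The triangulation can be sketched in simplified two-dimensional form. The patent gives no formulas, so the angle convention and the function name here are assumptions:

```python
import math

def triangulate(baseline, theta_p, theta_d):
    """Locate a light dot 32A from one beam pair, in 2-D.
    The projection focal point 58 sits at the origin and the detection
    focal point 88 at (baseline, 0); theta_p and theta_d are the
    angles (radians) that the projected beam 54 and the reflected
    beam 54' make with the baseline between the focal points."""
    tp, td = math.tan(theta_p), math.tan(theta_d)
    # Intersect y = x*tan(theta_p) with y = (baseline - x)*tan(theta_d).
    x = baseline * td / (tp + td)
    return (x, x * tp)
```

With both angles at 45 degrees and the focal points two units apart, the dot is found one unit out and one unit up; the real device does this in three dimensions for every projector/detector pair.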
  • The lamps 94 are energized and the detectors 84 capture the features of the surface 38 as a grey scale (photographic) image.
  • Each particular light dot 32A must be identified.
  • The reflection of a particular light dot 32A will be detected as a light cluster 32B by many detectors 84, since there are many more detectors 84 than projectors 52 and they are much smaller and closer together than the projectors 52.
  • Each light dot 32A, 34B (32A on the surface 38; 34B on the detection plate 80) is ultimately identified by the location of the one detector 84 which is at its center.
  • The light clusters 32B are in the same locations on detection plate 80 as the light clusters 34B.
  • The first and second light dot patterns are reconciled so that it can be learned which projector 52 and light beam 54 correspond to each of the detectors 84 that detects each light beam 54' reflected from the surface 38.
  • The detectors 84 on the detection plate 80 simply detect the reflected light dots 32A, 34A in both light dot patterns (Figure 9 and Figure 11) as ambiguous light clusters 32B, 34B.
  • The ambiguity arises from the fact that it is not known whether the detectors 84 on the detection plate 80 are actually detecting a reflected light dot 32A, 34A, stray ambient light, or a response to a stray transient current. To remove this ambiguity, the images of the light clusters 32B, 34B are enhanced for further processing, as shown in Figure 12.
  • Figure 12 shows that the enhancement includes, for both sets of light clusters 32B and 34B, smoothing 104, increasing their intensity 106, and increasing their contrast 108.
  • The detected light clusters 32B, 34B are examined by a smoother 104 which detects two light clusters 32B, 32B or 34B, 34B that are separated by a gap 116, 118 having a width which is below a predetermined value.
  • A low pass filter (not shown) may be used as the smoother 104 to restore the shape of the light cluster 32B, 34B so that the gap 116, 118 disappears.
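In code, a gap-closing smoother for a single row of detectors might look like this. It is a simplified stand-in for the low pass filter; the function name and gap threshold are ours:

```python
def smooth_row(bits, max_gap):
    """Close interior gaps narrower than max_gap between lit detectors
    in one row of the detection plate, so that a cluster that was
    split by a gap 116, 118 is rejoined into one cluster."""
    out = list(bits)
    i = 0
    while i < len(out):
        if out[i] == 0:
            j = i
            while j < len(out) and out[j] == 0:
                j += 1
            # Fill the gap only if it is interior and narrow enough.
            if 0 < i and j < len(out) and (j - i) < max_gap:
                for k in range(i, j):
                    out[k] = 1
            i = j
        else:
            i += 1
    return out
```

A narrow interior gap is filled, rejoining the cluster, while the dark regions flanking a cluster are left untouched.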
  • The intensity of the light clusters 32B, 34B is increased to make subsequent processing possible. This is accomplished by increasing the signal strength, as at 106, from those detectors 84 in groups where all the detectors detect light clusters 32B, 34B.
  • The increase in intensity may be necessary since those light clusters 32B, 34B reflected from the bottom of the finger or item 40 being mapped will be substantially brighter than those that are reflected from the side of the finger or item 40, since the bottom surfaces receive the light beams 54 at a nearly vertical angle.
  • The side surfaces of the finger or item 40 receive and reflect the light beams at an oblique angle. It is simplest and easiest to increase the intensity of all the light clusters 32B, 34B. However, if desired, only the intensity of the less intense light clusters 32B, 34B may be increased.
  • The contrast of the light clusters 32B, 34B is increased as at 108.
  • A suitable way of achieving this is by changing the value of all of the signals from all of the detectors 84 which are not already at a binary "1", which corresponds to the detection of light, or a binary "0", which corresponds to a failure to detect light, to either a "0" or a "1" depending on whether the voltage that detector generates is above or below a predetermined level.
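The contrast step amounts to thresholding every detector signal to binary; a minimal sketch (the threshold value is illustrative):

```python
def binarize(voltages, threshold):
    """Force each detector 84 signal to binary: 1 (light detected) if
    its voltage is at or above the predetermined level, otherwise 0."""
    return [1 if v >= threshold else 0 for v in voltages]
```

After this step the edge of every light cluster is sharply defined, which is what the shape analysis below relies on.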
  • The second pattern of light clusters 34B has the appearance shown in Figure 15, and processing of the second pattern of light clusters 34B, which is used for reconciliation, stops, as the second light cluster pattern is suitable for that purpose.
  • The first pattern of light clusters 32B detected by detection plate 80 (Figure 9) is further processed until the center of each light cluster 32B on detection plate 80 is determined, as will now be described.
  • Each light cluster 32B in the first light dot pattern (Figure 9) is examined to detect its shape and its distance from adjacent light clusters 32B. This is relatively straightforward since each of the detectors 84 is at either a binary "0" or "1", so that the edge of each light cluster 32B is now clearly defined.
  • There are at least two possible conditions (Figure 16) that can be detected. The first is where the light clusters 32B are spaced at a distance 124 which is above a minimum predetermined distance and the light cluster 32B is elliptical 32C or circular 32D. This condition indicates a satisfactory light cluster 32B that is ready for further processing.
  • A light cluster 32B may instead be detected as having an hour glass shape 32E (Figure 16).
  • The hour glass shaped light cluster 32E is likely to be caused by two separate light clusters 32B and 32B overlapping each other. This might occur when the reflected light has been diffused by the skin, so that while a sharply focused light beam 54 strikes the skin, a much wider beam 54' is reflected. When this occurs on adjacent beams 54', their reflections will overlap.
  • The hour glass shaped light clusters 32E are further processed by being split at their narrowest place 126 into two light clusters 32B.
  • The smoothing step 104, i.e., the removal of gaps 116 (Figure 13), must occur before the splitting step. This is because if these steps were reversed, a light cluster 32B such as that comprised of the two light cluster parts shown in Figure 13 would be split into two light clusters 32B and 32B rather than being united into one light cluster 32B as desired. Further, upon detecting two light clusters close to each other just after they had been split, the smoother would try to reassemble them using the low pass filter.
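The split at the narrowest place 126 can be sketched on a one-dimensional width profile of a cluster. This is a simplification of the two-dimensional operation, and the representation is our assumption:

```python
def split_hourglass(row_widths):
    """Split an hour glass shaped cluster at its narrowest row (the
    waist, place 126). row_widths[i] is the number of lit detectors
    in row i of the cluster; the waist row itself is discarded and
    the two halves become separate clusters."""
    waist = min(range(len(row_widths)), key=lambda i: row_widths[i])
    return row_widths[:waist], row_widths[waist + 1:]
```

A wide-narrow-wide profile such as [4, 3, 1, 3, 4] is split at the single-detector waist into two clusters.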
  • Next, the size of each light cluster 32B is gradually reduced. This is accomplished by scanning each light cluster 32B several times. On each scan, the detectors 84 that are on the edge of the light cluster are removed.
  • Light cluster 32B comprises many detectors 84.
  • Light cluster 32B' comprises only a few detectors 84. After, for example, three scans 132, 134 and 136, light cluster 32B' will disappear and can be considered as not having been the reflection of a light dot 32A.
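The repeated edge-removal scans behave like binary erosion; a sketch on a small grid (the four-neighbour rule is our assumption, since the patent does not specify which detectors count as the edge):

```python
def erode(grid):
    """One edge-removal scan: a detector survives only if it and all
    four of its edge neighbours are lit, so the detectors on a
    cluster's edge are removed; clusters comprising only a few
    detectors vanish after a few scans."""
    h, w = len(grid), len(grid[0])
    def lit(r, c):
        return 0 <= r < h and 0 <= c < w and grid[r][c] == 1
    return [[1 if lit(r, c) and lit(r - 1, c) and lit(r + 1, c)
                  and lit(r, c - 1) and lit(r, c + 1) else 0
             for c in range(w)] for r in range(h)]
```

A 3 x 3 block survives one scan (only its center remains) and disappears on the second, so small stray clusters are discarded while genuine reflections survive.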
  • Each surviving light cluster 32B is comprised of a number of detectors 84.
  • The center of each surviving light cluster 32B is now located.
  • The center is considered to be the location of the light cluster 32B.
  • If a surviving light cluster 32B comprises only one detector 84, the location of that detector is the location of the center of the light cluster.
  • If a surviving light cluster 32B contains more than one detector 84 (Figure 17), its center may be located by examining the light cluster 32B row by row and column by column to determine the row and column having the largest number of detectors 84, i.e., "1"s; that row and column define the location of the center of that light cluster 32B and hence its location.
  • Alternatively, each surviving light cluster 32B can be located by finding the brightest spot in it. This may be accomplished by determining the average area of a surviving light cluster 32B and then defining an area 144 which is smaller than that average area. The area 144 is moved incrementally through each surviving light cluster 32B and the average brightness of the area 144 is determined at each location across the entire light cluster 32B, and ultimately across each surviving light cluster 32B in the first pattern of light dots (Figure 9). The locations that provide the brightest areas, i.e., the areas having the highest values, are the centers of the respective surviving light clusters 32B.
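The moving-area search can be sketched as a brute-force sliding window; the window size, map representation and names are illustrative, not from the patent:

```python
def brightest_window(brightness, size):
    """Slide a size x size area 144 over a cluster's brightness map;
    the position where the window's total (hence average) brightness
    is highest marks the cluster's center. Returns the top-left
    corner of the winning window."""
    h, w = len(brightness), len(brightness[0])
    best, best_pos = -1.0, None
    for r in range(h - size + 1):
        for c in range(w - size + 1):
            total = sum(brightness[r + i][c + j]
                        for i in range(size) for j in range(size))
            if total > best:
                best, best_pos = total, (r, c)
    return best_pos
```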
  • Still a third method of locating the centers of the surviving light clusters 32B is shown in Figure 19. This method comprises the steps of determining the brightest spot 150 in a surviving light cluster 32B, which spot 150 is the center of the light cluster 32B, and finding the average distance d1, d2, d3, d4, d5, etc. between adjacent surviving light clusters 32B for all surviving light clusters detected by the entire detection plate 80.
  • Spots whose brightness is above a predetermined value and that are further away from spot 150 than one half of the average distance between surviving light clusters 32B are assumed to be the centers of those light clusters 32B.
  • Spots whose brightness is below the predetermined value, or that are closer to another spot than one half the average distance between bright spots, are assumed not to be centers of the surviving light clusters 32B.
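The acceptance rule of this third method, that a spot must be bright enough and lie at least half the average cluster spacing from any center already accepted, might be sketched as follows (the names and data layout are ours):

```python
def filter_centers(spots, brightness, min_level, avg_spacing):
    """Accept a bright spot as a cluster center only if its brightness
    is at least min_level and it lies at least half the average
    spacing between clusters away from every center already accepted.
    Distances are compared squared to avoid a square root."""
    accepted = []
    for (x, y) in spots:
        if brightness[(x, y)] < min_level:
            continue
        if all((x - ax) ** 2 + (y - ay) ** 2 >= (avg_spacing / 2) ** 2
               for (ax, ay) in accepted):
            accepted.append((x, y))
    return accepted
```

A spot crowding an already-accepted center is rejected as part of the same cluster, while a sufficiently distant bright spot is taken as the center of the next cluster.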
  • Figure 20 the centers of the light clusters 32B on the detection plate 80 are shown. Their irregular arrangement is caused by the shape of the surface 38 from which they were reflected.
  • the coordinates of the location of each light cluster 32B is based on the address of the detector 84 on detection plate 80 which corresponds to the center of that light cluster, e.g., RR ⁇ n and CC ⁇ m .
  • the first light dot pattern ( Figure 4 and Figure 9) is ready to be reconciled with the light clusters 34B in the second light dot pattern ( Figure 10 and Figure 11) so that the light beams 54 and their projectors 52 can be matched with the particular light dot clusters 32B that they created.
  • the first pattern of light dots 32A is accomplished by energizing all of the projectors 52 on projection plate 48
  • the second pattern of light dots 34A ( Figure 10) is accomplished by energizing all of the projectors 52 on projection plate 48 except those in one row 60 and one column 62 ( Figures 10 and 11) that define cross 64.
  • the light dots 34A projected by those projectors 52 are reflected from the surface 38 and detected as light clusters 34B by the detectors 84 on detection plate 80 (Figure 11) in the same pattern as the centers of the light clusters 32B except for the reflection of the cross 64' (Figure 11).
  • each other light cluster 34B created by projectors 52 in the second pattern of light dots will be in the same location as the center of the light cluster 32B created by the same projector 52 in the first pattern of light dots.
  • the cross 64 and its reflection 64' are useful as a frame of reference since it is easily found on the detection plate 80 because of its distinctive shape. Further, its center 66, 66' is easily found since it is at the only location in the pattern of light clusters 32B and 34B that is surrounded by only four light clusters instead of eight light clusters. However, any other geometric shape that provides an easily identifiable reference point can be used.
  • the projector 52' ( Figure 7) at the intersection of the row and column corresponding to the center 66 of the cross 64 is used as the starting place in reconciling the first and second light dot patterns.
  • the intersection of the row and column is on the center of the projection plate 48 such as on the projection axis 46, but the location is not critical.
  • the projector 52' at the center 66 of the cross 64 on the projection plate 48 is easily recognized since it will be the only projector 52 with only four of the eight adjacent projectors 52 energized. This is because the two adjacent projectors on row 60 and the two adjacent projectors on column 62 are not energized since they are on the arms of the cross.
  • the arms of the cross will be the row 60 and column 62 of unenergized projectors 52 which extend from them.
  • the cross 64 is detected by the arrangement of light clusters 34B.
  • the center 66' of the reflected cross 64' is recognized as being a space where there had been a light cluster 32B, but there is no light cluster 34B in that location in the second light pattern, and the space is surrounded by only four other light clusters 34B.
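The recognition rule just stated — a vacant position that held a cluster in the first pattern, holds none in the second, and is surrounded by only four (diagonal) clusters — can be sketched on an idealized grid of cluster positions. Representing each pattern as a set of (row, col) positions is an assumption for illustration.

```python
def find_cross_center(first, second):
    """Locate the reflected cross center 66': a grid position occupied in
    the first pattern but not the second, whose four diagonal neighbours
    are occupied in the second pattern while its four orthogonal
    neighbours (on the unenergized arms) are not."""
    for (r, c) in first - second:
        diagonals = [(r - 1, c - 1), (r - 1, c + 1),
                     (r + 1, c - 1), (r + 1, c + 1)]
        orthogonals = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        if (all(d in second for d in diagonals)
                and not any(o in second for o in orthogonals)):
            return (r, c)
    return None
```

On a regular grid only the intersection of the two unenergized arms satisfies both conditions, which is why the center is unambiguous.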
  • the location of the detectors 84' at the center 66' of the cross 64' is known since the coordinate address of all the detectors 84 is known.
  • the coordinate address of the projector 52' corresponds to the coordinate address of the detectors 84'. Then, starting from the just-found relationship between projector 52' and detector 84', the row and column that intersect to form the center 66' of the cross 64' are related to their corresponding row and column of projectors that intersect to form the center 66 of the cross 64.
  • since both patterns of light clusters 32B and 34B are virtually identical, the only difference being the presence of the cross 64 in the second light pattern, all of the centers of light clusters 32B in the first pattern of light clusters 32B (Figure 20) must fall within the corresponding light clusters 34B in the second pattern of light clusters unless they are on the cross 64'.
  • the arms of the cross 64 can be found.
  • each light cluster 32B and the projector 52 that created it can be paired on a row by row and column by column basis. There are as many pairs as there are light dots 32A.
  • the rest of the projectors and centers of light clusters 32B are paired.
  • a center of a light cluster 32B detected in the upper left hand quadrant defined by cross 64' which is closest to row 60 and column 62 is known to have been projected by the projector 52 on projection plate 48 that was in the upper left hand quadrant of plate 48 and closest to row 60 and column 62.
  • the center of the light cluster 32B immediately above the center of light cluster 32B just identified was necessarily created by the projector 52 immediately above the projector 52 which was just identified.
  • the coordinates of the detectors that are the centers of the light clusters 32B, and of their respective projectors with which they have been paired, can be restated using coordinates that define their positions relative to the row R0 and column C0 on the projection plate 48 and the row RR0 and column CC0 on the detection plate 80, i.e., relative to the cross 64, 64'.
  • the center 66 of the cross 64 on the projection plate 48 is identified as being at row R0 and column C0.
  • the center 66' of the cross 64' on the detection plate 80 is identified as being at row RR0 and column CC0.
  • the rows R±n and columns C±m represent the rows and columns on the projection plate that are on either side of the neutral axes defined by row R0 and column C0.
  • the rows RR±n and columns CC±m represent the rows and columns on the detection plate 80 that are spaced from the neutral axes defined by row RR0 and column CC0.
  • the coordinates of each pair of projectors 52 and detectors 84 are used to determine the three dimensional position of each of the light dots 32A and consequently the position of that part of the surface 38 from which it was reflected.
  • each light dot 32A is determined by solving two triangles, one in a plane parallel to the rows 60 and 60' and one in a plane parallel to the columns 62 and 62' .
  • the triangles are solved by knowing the angle (s) at which the light beams 54 and 54' were projected and detected and the distance between the focal points 58 and 88 of the projector and detector systems, 20 and 22, respectively.
  • the angle(s) at which each beam 54 was projected is determined by the distance of its projector 52 on the projection plate 48 from the projection axis 46 in both the x direction, which may be parallel to the rows 60, and the y direction, which may be parallel to the columns 62; or they can be located by polar coordinates or any other convenient and well known system.
  • x and y axes are preferably selected so that their intersection passes through the axis 46 of the projection system 20.
  • the angle of the projected light beam 54 is the arctangent of the ratio between the distance from the axis 46 of the projection system 20 to the particular projector 52 that created the light dot 32A whose location is being determined, and the distance between the projection plate 48 and the focal point 58 of the projection system 20, determined for each of the x and y axes.
  • the method for determining the angle (s) at which the light beam 54' is reflected onto the detection plate 80 is similar to that just described.
  • the location of the x and y axes is selected so that their intersection passes through the axis 68 of the detection system 22. Then, with the distance between the detection plate 80 and the focal point 88 of the detection system 22 along axis 68 known on the one hand, and the distance from the center of each light cluster 32B to the x and y axes known on the other, the two angles, one for the x plane and one for the y plane, can be solved as above to identify the angle at which each reflected light beam 54' is received.
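The arctangent relation and the triangle solution it feeds can be sketched in one plane as follows. This is a simplified model in which both angles are measured from the baseline joining the two focal points rather than from the projection and detection axes as in the patent (a fixed 90-degree offset converts between the conventions); `beam_angle` and `triangulate` are illustrative names.

```python
import math

def beam_angle(offset, focal_dist):
    # Arctangent relation: offset of the projector (or detected cluster
    # center) from the optical axis on its plate, over the plate-to-
    # focal-point distance, gives the beam's angle for that axis.
    return math.atan2(offset, focal_dist)

def triangulate(baseline, angle_p, angle_d):
    # Solve the triangle formed by the two focal points and the light
    # dot.  The projection focal point sits at x = 0, the detection
    # focal point at x = baseline; both angles are measured from the
    # baseline toward the dot.  The full method solves one such triangle
    # in the x plane and one in the y plane.
    tp, td = math.tan(angle_p), math.tan(angle_d)
    x = baseline * td / (tp + td)   # intersection of the two rays
    return x, tp * x
```

With equal 45-degree angles the dot lies midway along the baseline at a height of half the baseline, which is a quick sanity check on the geometry.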
  • While the position of the light dot 32A relative to the device 10 is known, or can be easily calculated, it is not relevant, since the only meaningful information about the location of the light dot 32A is its position relative to the other light dots 32A; it is their relative positions that define the surface 38, and not their distance from the device 10.
  • a two dimensional model corresponding to an item 40 such as a finger which has been rolled along a flat medium such as a fingerprint card is created, i.e., in addition to the bottom of the item 40 being modeled, its sides are also modeled.
  • the creation of the two dimensional model is achieved by identifying those coordinates in a flat plane that correspond to the coordinates of the light dots 32A in the three dimensional model.
  • compensation must be made for the fact that the conversion from three dimensions to two dimensions will cause a distortion in the apparent location of adjacent light dots 32A.
  • This type of distortion is well recognized by cartographers (map makers) and others who are confronted with providing two dimensional models of three dimensional objects.
  • a well known example of this type of distortion in cartography is the Mercator Projection which has a distortion in the polar regions.
  • the conversion to a two dimensional model is accomplished by using a suitable set of parameters that place the coordinates that correspond to the locations of the light dots 32A in the three dimensional model in the correct positions in the two dimensional model with either invariance of angles or invariance of area, i.e., without altering either the angular relationships or the areas defined by the light dots 32A.
  • the creation of the two dimensional model is initiated by identifying those light dots 32A that lie on an axis 156 of the surface 38 that corresponds to the line of contact that would be present if the actual item 40 or finger were placed on a substrate 158 prior to rolling.
  • the coordinates in the two dimensional plane are determined by selecting them such that the sum of a function of the differences between the distances between the light dots in the row being constructed and the light dots 32A in the previous row in the two dimensional model, on the one hand, and the distances between their counterpart light dots 32A in the previous row in the three dimensional model, on the other, is a minimum value.
  • the distances used are those to the next immediate light dots 32A to one side of the axis 156 and those immediately above and below the light dot 32A under consideration, a technique which is especially useful for simulating the rolling process, as when capturing a fingerprint.
  • if more light dots 32A are used simultaneously, the accuracy of the dot position will be increased.
  • the process is repeated for the light dots 32A on the other side 156L of the item 40, starting at the axis 156 and then progressing to rows 156R1, 156R2, 156R3, etc., since the conversion to coordinates in the two dimensional plane is a simulation of the rolling process.
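The row-by-row placement described above amounts to a small minimisation for each new dot. The sketch below brute-forces it on a grid: `anchors` holds already-placed neighbouring dots together with the distances to them taken from the three dimensional model, and the returned position minimises the sum of squared distance errors. The grid search, step size and squared-error cost are assumptions; the patent only requires that "a function of the differences" be minimised.

```python
def place_point(anchors, step=0.01, span=1.5):
    """Place one light dot in the 2-D model.  `anchors` is a list of
    ((x, y), target_dist) pairs: already-placed neighbouring dots and
    the distances to them measured in the 3-D model.  Returns the grid
    point near the first anchor minimising the summed squared
    difference between 2-D and 3-D distances."""
    (x0, y0), _ = anchors[0]
    best, best_xy = float("inf"), (x0, y0)
    n = int(span / step)
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            x, y = x0 + i * step, y0 + j * step
            err = sum((((x - ax) ** 2 + (y - ay) ** 2) ** 0.5 - d) ** 2
                      for (ax, ay), d in anchors)
            if err < best:
                best, best_xy = err, (x, y)
    return best_xy
```

A production version would use a proper optimiser and process whole rows at once, but the cost function is the part the text specifies.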
  • each light dot 32A in the two dimensional model is identified by a vector relating it to the detector 84 at the center of the light dot 32A in the three dimensional coordinate system on which it is based.
  • the coordinate addresses of the detectors 84 that were not identified as the centers of light dots 32A are mapped by interpolation using the coordinate addresses of the detectors that were determined to be the centers of the light dots 32A.
  • the coordinates of the two dimensional model just created can be printed or displayed if desired. However, it is probably not worthwhile since its preferred utility occurs when it is combined with the grey scale image (Figure 6). Accordingly, it is preferred that the two dimensional model be maintained as a data base of x-y coordinates, each of which corresponds to the position of a light dot 32A in a two dimensional plane.
  • a grey scale image ( Figure 6) corresponding to a rolled fingerprint or other item can now be established with accuracy since the two dimensional location of all the light dots 32A is known relative to their three dimensional coordinates.
  • the grey scale image ( Figure 6) is combined with the two dimensional coordinate data base ( Figure 22) using the coordinates of the features of the grey scale image and the coordinates of the two dimensional model. Since the grey scale image ( Figure 6) is actually physically larger than the image corresponding to the two dimensional coordinates, the larger grey scale image is combined into the two dimensional model since if it went the other way, there would be large spaces where the data from the two dimensional image did not fill the grey scale image.
  • each of those detectors 84 has a grey scale value that corresponds to the amount of light that it received. Also the coordinates of each detector 84 are known. Accordingly, for each light dot 32A "seen" by a particular detector 84, there is a corresponding part of the grey scale image "seen” by that same detector 84.
  • the same shift is applied to the part of the grey scale image seen by that detector 84 to create a set of two dimensional coordinates for each part of the grey scale image that accurately places that part of the grey scale image in a location that corresponds to its true position relative to the other parts of the grey scale image.
  • the parts of the grey scale image that have the same coordinates as their respective corresponding light dots 32A are mapped into the two dimensional model. Then the parts of the grey scale image that are not on the light dots 32A are relocated to their true positions relative to the two dimensional model by interpolation using the shifts in position of the light dots 32A nearest to them.
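The interpolation step can be illustrated with a simple inverse-distance weighting of the known dot shifts; the patent does not specify the interpolation formula, so the weighting scheme, the `k`-nearest selection and the names below are assumptions.

```python
def interpolate_shift(pixel, dot_shifts, k=3):
    """Estimate the 2-D shift for a grey-scale pixel that does not sit
    on a light dot, by inverse-distance weighting the shifts of the k
    nearest dots.  `dot_shifts` maps a dot's (row, col) to its (dx, dy)
    shift into the 2-D model."""
    px, py = pixel
    nearest = sorted(dot_shifts,
                     key=lambda d: (d[0] - px) ** 2 + (d[1] - py) ** 2)[:k]
    weights = []
    for d in nearest:
        dist2 = (d[0] - px) ** 2 + (d[1] - py) ** 2
        if dist2 == 0:
            return dot_shifts[d]  # pixel lies on a dot: use its shift
        weights.append((1.0 / dist2, dot_shifts[d]))
    total = sum(w for w, _ in weights)
    dx = sum(w * s[0] for w, s in weights) / total
    dy = sum(w * s[1] for w, s in weights) / total
    return dx, dy
```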
  • the two detection systems 22 are angularly disposed with respect to each other so that a larger portion of the surface 38 of the item 40 can be seen than if only one detection system 22 were used.
  • the two CCD cameras 70 can scan the sides of an item 40 through an included angle of up to 150 degrees. By increasing the angle between detection systems 22, the included angle can exceed 180 degrees.
  • in an elongated device 10, a plurality of projection systems 20 and detection systems 22 similar to those described are located along the longitudinal axis of the item to be scanned 40.
  • Such an arrangement is able to examine large objects such as a limb or the entire body of a person or animal.
  • a device of sufficient size operating according to the principles of the invention just described could scan a manufactured item or an art object having a surface texture. Such scans would be useful for identification or the detection of forgeries or alterations.
  • each detection system 22 processes the light dots 32A and grey scale image that it "sees” in a manner that is identical to that which has been described. However, the portion of the light dot patterns 32A and 34A and the portions of the grey scale image seen by each of them are for a different part of the item 40 than was seen by the other detection system 22. 40
  • the grey scale images created by each detection system 22, whether in a configuration such as shown in Figure 2 or that shown in Figure 23, must be combined, and any part of the surface 38 that was scanned by more than one detection system 22 must be identified so that they can be overlapped, removed, or compensated for in some other fashion.
  • FIG. 24 A composite image made from the multiple detection system of the device 10 shown in Figure 2 will be described. As seen in Figure 24, since a cross 64 was used while capturing both the first and second light dot patterns, it will appear in the light dot patterns 32B seen by each detector system 22. Since the detector systems 22 are circumferentially spaced around the item 40, the cross 64 will be reflected onto each detection plate 80 in a different location from the other detection plate 80.
  • the coordinates for each light dot 32A are determined.
  • the coordinate system of light dots 32A on both detection plates 80 can be combined into one coordinate system.
  • the light dots 32A on one of the detection plates 80 having coordinates identical to the coordinates of a light dot 32A on the other detection plate 80, and their corresponding grey scale images can be discarded since they are merely the same light dots 32A and grey scale images that are seen by more than one detection system.
  • light dots 32A which appear in the images seen by both detector systems 22 and their corresponding grey scale images can be identified and the extent of overlapping be determined.
  • a suitable line, such as a line of light dots 160 that appears on both images, is selected.
  • the two images can be merged by assembling the parts of the scanned images that are on the outside of the line of light dots 160 which appears on both images. This is because the portion of the image between the lines of light dots 160 on each of the images is on the outside of the line of light dots on the other image and hence becomes a part of the composite image.
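Once both scans share one coordinate system, the assembly along the common line of light dots 160 can be sketched as follows; representing each image as a dict of (x, y) to grey value and using a straight vertical cut line are simplifying assumptions.

```python
def merge_images(left_img, right_img, cut_x):
    """Assemble a composite from two overlapping scans.  Each image maps
    (x, y) in the common coordinate system to a grey value; `cut_x` is
    the x-coordinate of the shared line of light dots.  The composite
    keeps each image's own side of the cut line, dropping the
    duplicated overlap."""
    composite = {}
    composite.update({p: g for p, g in left_img.items() if p[0] <= cut_x})
    composite.update({p: g for p, g in right_img.items() if p[0] >= cut_x})
    return composite
```

Points on the cut line itself appear in both images with the same grey value, so either copy can be kept.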
  • since the grey scale value for the coordinates of each part of each scanned image is known, the grey scale value for the coordinates of each part of the composite grey scale image is known.
  • the result is a data base of coordinates that define a composite grey scale image that corresponds to what the image of a rolled fingerprint or other item would look like.
  • the data base can be stored for later use or can be displayed on a monitor or printed on a fingerprint card or other suitable medium for storage or comparison.
  • a narrow beam light source 170, a rotating mirror 172 and a pivoting mirror 174 create the light beams 54 and light dots 32A and 34A.
  • the narrow beam can be created by a laser, or by an optical system.
  • a suitable circuit 176 is provided for energizing the light source 170 at high frequencies.
  • the beam of light 180 that it generates is aimed at the perimeter of the rotating mirror 172.
  • the perimeter of the rotating mirror 172 has a plurality of reflective surfaces 182.
  • the light beams 186 are aimed at the pivoting mirror 174 where they are reflected as a row of light beams 54 which create a row of light dots 32A on the surface 38 of the item 40 being scanned.
  • by pivoting the mirror incrementally about axis 190, and with an appropriate lens system (not shown), a plurality of rows of light dots 32A will be created on the surface 38 of the item 40 being scanned.
  • the light dots 32A are detected by the detection plates 80 as light clusters 32B as have been described.
  • a second pattern of light dots 34A having a cross 64 or other marking device, such as by simply being larger than the other light dots 32A, can be projected onto the surface 38. Then, as described, by relying on the distance between the focal points of the projection and detection systems and the angles of the pairs of projected light beams 54 and reflected light beams 54' relative to their respective projection and detection axes, the three dimensional coordinates of each of the light dots 32A can be found.
  • a still further system for creating the light dot pattern 32 on the surface to be scanned 38 is shown in Figure 28. It includes a wide beam light source 196 and a mask 198 having a pattern of holes 202 that correspond to the desired pattern of light dots 32A. At least one of the holes 204 in the mask 198 has a distinctive shape. The mask breaks the wide beam into a plurality of separate light beams 54. Each of the light beams 54 creates one of the light dots 32A.
  • the light dot 206 created by the hole 204 in the mask 198 has a distinctive shape so that it can be used to help match the projected light beams 54 and reflected light beams 54' into pairs as was explained.
  • Yet another system for creating the pattern of light dots 32A comprises a plurality of projection systems.
  • the systems may be identical or different. They may generate the same number of light dots 32A or a different number of light dots, provided that the light dots 32A cover the surface 38 of the item being scanned 40 in sufficient number so as to enable the creation of an accurate three dimensional model of the surface 38.
  • when using a distinctive light dot for the reconciliation, the dot must be found before the step of smoothing 104, since the smoothing might destroy the distinctive light dot, rendering identification of the light dots impossible.
  • an algorithm designed to specifically detect the distinctive light dot is used.
  • FIGs 30 and 31 a composite scanned image 220 based on three detection systems 22 and a distinctive light dot 224 is shown.
  • the distinctive light dot 224 is seen in the light dot patterns 228A, 228B and 228C in Figure 30; each of which was scanned by a different detector system 22.
  • the light dot patterns 228A, 228B and 228C are shown assembled along cut lines 160 into a composite image in a manner similar to that described with respect to the composite image shown in Figure 25.
  • the distinctive dot 224, seen in each of the light dot patterns 228A, 228B and 228C is used for aligning the images when creating the composite image 220.
  • FIGs 32 and 33 a composite scanned image 240 based on four detection systems 22 and a distinctive light dot 244 is shown.
  • the distinctive light dot 244 is seen in the light dot patterns 248A, 248B, 248C and 248D in Figure 32; each of which was scanned by a different detector system 22.
  • the light dot patterns 248A, 248B, 248C and 248D are shown assembled into a composite image along cut lines 160 in a manner similar to that described with respect to the composite image shown in Figure 25.
  • the distinctive dot 244, seen in each of the light dot patterns 248A, 248B, 248C and 248D, is used for aligning the images when creating the composite image 240.
  • an alternative to the method for finding the coordinates of the three dimensional model comprises the step of creating a model of a perfect cylinder 214 such as seen in Figure 29 which is assumed to be the item being scanned 40.
  • the diameter of the perfect cylinder is based on the average item width seen by the detection system 22.
  • the location of each light dot 32A on it can be anticipated. Then, if the actual light dot 32A is not where the anticipated dot is expected to be, that part of the finger may be fatter or thinner than the ideal cylinder. Thus, if the actual light dot 32A falls above the anticipated light dot 32A, that part of the finger is fatter than the perfect cylinder. If it falls below, then the finger is thinner.
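The comparison against the anticipated dot on the perfect cylinder reduces to a radial test, sketched below; the axis-parallel-to-z convention and the function name are assumptions for illustration.

```python
import math

def radial_deviation(dot, axis_point, cylinder_radius):
    """Compare an actual light dot against the anticipated dot on the
    ideal cylinder: a positive result means the finger is fatter than
    the cylinder at that dot, negative means thinner.  `dot` and
    `axis_point` are (x, y, z); the cylinder axis is taken parallel to
    z through `axis_point`."""
    dx, dy = dot[0] - axis_point[0], dot[1] - axis_point[1]
    # Radial distance of the dot from the cylinder axis, minus the
    # radius of the anticipated (perfect) cylinder.
    return math.hypot(dx, dy) - cylinder_radius
```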
  • the device and method of the invention can also be used to scan the surfaces of other three dimensional objects such as rectangular solids, cubes, pyramids, polyhedrons, spheres, cones, elliptical solids and combinations of these shapes.
  • the invention can be used to map the surfaces of relatively flat body parts such as palms, footprints and "slap prints", i.e., four fingers printed at the same time.
  • manufactured items such as forgings, castings and items made by other manufacturing processes can be examined to detect imperfections or to determine if manufacturing tolerances are met.


Abstract

The invention concerns a device (10) for scanning the surface (38) of an item (40), comprising a scanning zone (100) and a system (52) for projecting a pattern of light dots (32A) onto the surface (38) to be scanned when the latter is within the scanning zone (100). Systems (84) serve to detect the light pattern (32A). Further systems serve to create a grey scale image (84) of the surface (38). Finally, the device comprises systems for combining the light dot pattern with the grey scale image to create a two dimensional reproduction of the scanned item.
PCT/US1999/005559 1998-03-17 1999-03-16 Dispositif destine au balayage et au mappage d'une surface WO1999048041A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU30872/99A AU3087299A (en) 1998-03-17 1999-03-16 Device and method for scanning and mapping a surface
EP99912509A EP1062624A4 (fr) 1998-03-17 1999-03-16 Dispositif destine au balayage et au mappage d'une surface

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US7832598P 1998-03-17 1998-03-17
US60/078,325 1998-03-17
US09/080,900 1998-05-18
US09/080,900 US20020097896A1 (en) 1998-03-17 1998-05-18 Device and method for scanning and mapping a surface

Publications (1)

Publication Number Publication Date
WO1999048041A1 true WO1999048041A1 (fr) 1999-09-23

Family

ID=26760403

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/005559 WO1999048041A1 (fr) 1998-03-17 1999-03-16 Dispositif destine au balayage et au mappage d'une surface

Country Status (4)

Country Link
US (1) US20020097896A1 (fr)
EP (1) EP1062624A4 (fr)
AU (1) AU3087299A (fr)
WO (1) WO1999048041A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003040709A2 (fr) * 2001-11-07 2003-05-15 Applied Materials, Inc. Systeme d'imagerie a reseau de taches
GB2390717A9 (en) * 2002-06-28 2005-03-21 Hewlett Packard Development Co Object-recognition lock
US6946655B2 (en) 2001-11-07 2005-09-20 Applied Materials, Inc. Spot grid array electron imaging system
EP1830306A3 (fr) * 2006-03-03 2007-09-12 Fujitsu Ltd. Appareil de capture d'images possédant une fonction de mesure de distance
DE10153808B4 (de) * 2001-11-05 2010-04-15 Tst Biometrics Holding Ag Verfahren zur berührungslosen, optischen Erzeugung von abgerollten Fingerabdrücken sowie Vorrichtung zur Durchführung des Verfahrens

Families Citing this family (15)

Publication number Priority date Publication date Assignee Title
JP2003006627A (ja) * 2001-06-18 2003-01-10 Nec Corp 指紋入力装置
EP1353292B1 (fr) * 2002-04-12 2011-10-26 STMicroelectronics (Research & Development) Limited Appareil et procédés de saisie biométrique
US20040109608A1 (en) * 2002-07-12 2004-06-10 Love Patrick B. Systems and methods for analyzing two-dimensional images
WO2008153539A1 (fr) * 2006-09-19 2008-12-18 University Of Massachusetts Balayage en ligne sans contact circonférentiel d'objets biométriques
US20110007951A1 (en) * 2009-05-11 2011-01-13 University Of Massachusetts Lowell System and method for identification of fingerprints and mapping of blood vessels in a finger
US8514284B2 (en) * 2009-12-17 2013-08-20 Raytheon Company Textured pattern sensing and detection, and using a charge-scavenging photodiode array for the same
DE102010016109A1 (de) 2010-03-24 2011-09-29 Tst Biometrics Holding Ag Verfahren zum Erfassen biometrischer Merkmale
US8660324B2 (en) * 2010-03-29 2014-02-25 Raytheon Company Textured pattern sensing using partial-coherence speckle interferometry
US8780182B2 (en) 2010-03-31 2014-07-15 Raytheon Company Imaging system and method using partial-coherence speckle interference tomography
US9912847B1 (en) * 2012-09-25 2018-03-06 Amazon Technologies, Inc. Image capture guidance to reduce specular reflection effects
US9091628B2 (en) 2012-12-21 2015-07-28 L-3 Communications Security And Detection Systems, Inc. 3D mapping with two orthogonal imaging views
EP2966614A4 (fr) * 2013-03-06 2016-11-16 Nec Corp Dispositif, système, procédé et programme de conversion d'image d'empreinte
JP2016112947A (ja) * 2014-12-12 2016-06-23 三菱航空機株式会社 航空機の外観検査方法およびシステム
US20170262979A1 (en) * 2016-03-14 2017-09-14 Sensors Unlimited, Inc. Image correction and metrology for object quantification
CN109886055B (zh) * 2019-03-25 2024-06-25 南京新智客信息科技有限公司 一种圆柱形物体表面信息在线采集方法及系统

Citations (2)

Publication number Priority date Publication date Assignee Title
US4863268A (en) * 1984-02-14 1989-09-05 Diffracto Ltd. Diffractosight improvements
US5812252A (en) * 1995-01-31 1998-09-22 Arete Associates Fingerprint--Acquisition apparatus for access control; personal weapon and other systems controlled thereby

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US4641350A (en) * 1984-05-17 1987-02-03 Bunn Robert F Fingerprint identification system
US4696046A (en) * 1985-08-02 1987-09-22 Fingermatrix, Inc. Matcher

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
US4863268A (en) * 1984-02-14 1989-09-05 Diffracto Ltd. Diffractosight improvements
US5812252A (en) * 1995-01-31 1998-09-22 Arete Associates Fingerprint--Acquisition apparatus for access control; personal weapon and other systems controlled thereby

Non-Patent Citations (1)

Title
See also references of EP1062624A4 *

Cited By (10)

Publication number Priority date Publication date Assignee Title
DE10153808B4 (de) * 2001-11-05 2010-04-15 Tst Biometrics Holding Ag Verfahren zur berührungslosen, optischen Erzeugung von abgerollten Fingerabdrücken sowie Vorrichtung zur Durchführung des Verfahrens
WO2003040709A2 (fr) * 2001-11-07 2003-05-15 Applied Materials, Inc. Systeme d'imagerie a reseau de taches
WO2003040709A3 (fr) * 2001-11-07 2004-01-08 Applied Materials Inc Systeme d'imagerie a reseau de taches
US6946655B2 (en) 2001-11-07 2005-09-20 Applied Materials, Inc. Spot grid array electron imaging system
GB2390717A9 (en) * 2002-06-28 2005-03-21 Hewlett Packard Development Co Object-recognition lock
GB2390717B (en) * 2002-06-28 2005-09-07 Hewlett Packard Development Co Object-recognition lock
US7045763B2 (en) 2002-06-28 2006-05-16 Hewlett-Packard Development Company, L.P. Object-recognition lock
EP1830306A3 (fr) * 2006-03-03 2007-09-12 Fujitsu Ltd. Appareil de capture d'images possédant une fonction de mesure de distance
KR100919041B1 (ko) * 2006-03-03 2009-09-24 후지쯔 가부시끼가이샤 거리 측정 기능을 갖는 촬상 장치
US7777808B2 (en) 2006-03-03 2010-08-17 Fujitsu Limited Image capturing apparatus having distance measurement function

Also Published As

Publication number Publication date
US20020097896A1 (en) 2002-07-25
EP1062624A4 (fr) 2002-02-13
AU3087299A (en) 1999-10-11
EP1062624A1 (fr) 2000-12-27

Similar Documents

Publication Publication Date Title
US20020097896A1 (en) Device and method for scanning and mapping a surface
CA2079817C (fr) Systeme de saisie tridimensionnelle en temps reel
EP0294577B1 (fr) Appareil optique de mesure de contours de surfaces
Rocchini et al. A low cost 3D scanner based on structured light
US5747822A (en) Method and apparatus for optically digitizing a three-dimensional object
US6813035B2 (en) Method for determining three-dimensional surface coordinates
US5064291A (en) Method and apparatus for inspection of solder joints utilizing shape determination from shading
US5528355A (en) Electro-optic palm scanner system employing a non-planar platen
US6191850B1 (en) System and method for inspecting an object using structured illumination
CA2529498A1 (fr) Procede et systeme de reconstruction de surface tridimensionnelle d'un objet
US20080319704A1 (en) Device and Method for Determining Spatial Co-Ordinates of an Object
CA2516604A1 (fr) Methode et montage pour l'enregistrement optique de donnees digitales biometriques
JP2517062B2 (ja) 3次元計測装置
CA2307439C (fr) Methode et appareil d'evaluation d'un facteur d'echelle et d'un angle de rotation dans le traitement des images
JP2004334288A (ja) 刻印文字認識装置及び認識方法
JPH085348A (ja) 3次元形状検査方法
WO1994012949A1 (fr) Procede et appareil de correlation eclair
RU2085839C1 (ru) Способ измерения поверхности объекта
JP3219884B2 (ja) エンボス版の製造方法
JPH09128537A (ja) 印鑑照合方法及び印鑑照合装置
JPH0749935B2 (ja) 物体認識装置
JP2000065547A (ja) 黒色ワークの形状測定装置及び取出装置
EP0450786A2 (fr) Technique de caractérisation d'empreintes digitales
JPS63100305A (ja) 物体のセンシング装置
AU689237C (en) An electro-optic palm scanner system employing a non-planar platen

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: KR

WWE Wipo information: entry into national phase

Ref document number: 1999912509

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1999912509

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWW Wipo information: withdrawn in national office

Ref document number: 1999912509

Country of ref document: EP