CA2306755A1 - Method for receiving and storing optically detectable data - Google Patents

Method for receiving and storing optically detectable data

Info

Publication number
CA2306755A1
Authority
CA
Canada
Prior art keywords
camera
individual recordings
sequence
recording
areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002306755A
Other languages
French (fr)
Inventor
Markus R. Muller
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2306755A1
Legal status: Abandoned


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/97 Determining parameters from multiple pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20228 Disparity calculation for image-based rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Image Input (AREA)
  • Image Processing (AREA)
  • Studio Circuits (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Editing Of Facsimile Originals (AREA)
  • Holo Graphy (AREA)
  • Optical Recording Or Reproduction (AREA)

Abstract

The invention relates to a method for receiving and storing optically detectable data of an object on a storage medium. According to the invention, a camera is used to take a sequence of several individual images of the object at different spatial settings of the relative position between the object and the camera. The sharply imaged areas of the individual images are determined, and one or more resulting images are assembled from them.

Description

Method for Receiving and Storing Optically Detectable Data

The present invention relates to a method for recording and storing optically detectable data of an object on a storage medium, as defined generically in Patent Claim 1.
Methods of this kind are used in various wavelength ranges, for example in the domain of infrared or ultraviolet radiation, of visible light, or of thermal radiation. Appropriate cameras and the associated optics are used, depending on the wavelength range. The object of interest is either recorded as a whole or in separate sections. Each of the individual recordings is of a specific size. Because of the settings of the optical components of the camera and of the spatial distance between the object and the camera, the recording will incorporate areas that are sharply focused and areas that are not. A recording that is sharp in all of its areas cannot be achieved, for only those parts of the object that lie within the focus of the optical system that is used will be clearly focused. The focus is spatially limited and is, in most instances, smaller than the object. Furthermore, if the object is three-dimensional, not all areas of the surface, or of the layers that lie immediately beneath the surface, can be sharply imaged with the aid of one recording. A further disadvantage is that the diaphragm of the camera cannot be opened wide, so that the brilliance of the recording is reduced, since a widely opened diaphragm reduces the depth of focus, with the result that only a correspondingly small part of the object can be sharply imaged.
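For background only, and not as part of the patent disclosure, the trade-off between aperture and depth of field described above can be summarized with the standard close-up approximation, where N is the f-number of the diaphragm, c the permissible circle of confusion, and m the imaging magnification:

```latex
% Standard close-up approximation for the total depth of field (DOF):
%   N = f-number of the diaphragm, c = permissible circle of confusion,
%   m = imaging magnification.
\[
  \mathrm{DOF} \;\approx\; \frac{2\,N\,c\,(1+m)}{m^{2}}
\]
% Opening the diaphragm lowers N and therefore shrinks the depth of field,
% so only a thin slice of a three-dimensional object remains in sharp focus.
```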
The prior art (DE 39 31 934 C2, DE 39 05 619 A1) describes an image input and output device that incorporates a focusing system. Using this focusing system, the optical components of the image input device are adjusted sharply to a plurality of different object planes. In order to record image information, a plurality of images of one object are recorded using various settings of the optical components, and the information obtained by doing this is combined. Digital methods are used in order to do this. Using this known device, it is a disadvantage that the variable adjustment of the optical components requires a mechanism that moves the optical components with a very high degree of precision. Such a mechanism is costly to manufacture, and is vulnerable to damage, wear, and other impairments when it is used. In addition, because of the various optical settings that are used, the information can be assembled only at great cost, since the scale of the images changes for each recording because of the changes to the optical components.
In contrast to the foregoing, the method according to the present invention, which is used for recording and storing optically detectable data of an object, entails the advantage that, using one camera, a sequence of a plurality of individual recordings of the object is made using different spatial settings with respect to the position between the object and the camera. The setting used for the optical components, and the resulting focus, remain unchanged when this is done. Because of this, the method is simpler to use than the methods already known from the prior art.
A mechanism for effecting precise changes to the optical components is rendered unnecessary. Because of this, application of this method is more cost-effective than previously known methods, and it is less vulnerable to impairments, disruptions, or wear when it is used.
The sharply imaged areas of the individual recordings are determined and are assembled, and one or a plurality of resulting images are formed therefrom. Since the optical components remain unchanged during the different individual recordings, the scale does not change. As a result, the individual recordings do not need to be matched with respect to scale when they are assembled. Assembly of the information is therefore less costly than in the case of the known methods.
This method can be used both to record individual images of objects and to record films. The method can be applied manually, by cutting out and pasting the sharply focused areas or by means of screening; however, it can also be used with the aid of a computer. In the case of two-dimensional objects, or in the case of objects for which a two-dimensional resulting image will suffice, it can be sufficient to assemble the sharp areas to form one single resulting image. In the case of three-dimensional objects, the sharp areas of different planes of the object can be assembled to form one or a plurality of resulting images. The latter case entails the advantage that various features will be shown in different resulting images. Because of this, it becomes simpler to process the images, in particular with respect to recognizing the features. It is also possible to assemble different resulting images for the different depths of penetration into the object that are achieved with the radiation that is used. More advantageously, the individual recordings can be made with the camera lens at larger diaphragm apertures. When this is done, sharp resulting images will be obtained despite the large diaphragm apertures. This increases the light sensitivity of the recordings. The camera does not have to be sharply focused for each individual recording, since the sharp images are selected electronically and stored, and images that are not sharp are not stored. Even if the object moves during the recording process, the resulting image will be sharp. The number of individual recordings that are made for each object will depend on the particular application. As a rule, about twenty individual recordings are sufficient; however, in certain cases many more may be needed, for example more than 100 individual recordings, or fewer, for example five individual recordings.
Exposure times will be selected depending on the object and the camera that is used. The exposure time also depends on the number of individual recordings that are desired per second or per minute. Many types of cameras, for example CCD cameras, make it possible to reduce the exposure time electronically.
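By way of illustration only, and not as part of the patent disclosure, a capture loop of the kind described above might be sketched in Python as follows; the use of OpenCV, the device index 0, and the frame count of twenty are assumptions:

```python
# Hedged sketch: capture a short sequence with fixed focus and aperture.
# Requires OpenCV (pip install opencv-python); device index 0 is assumed.
import cv2

NUM_FRAMES = 20  # "about twenty individual recordings" as a rule of thumb

def capture_sequence(device_index: int = 0, num_frames: int = NUM_FRAMES):
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("camera could not be opened")
    frames = []
    try:
        while len(frames) < num_frames:
            ok, frame = cap.read()  # the optics stay untouched between reads
            if ok:
                frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    finally:
        cap.release()
    return frames

if __name__ == "__main__":
    sequence = capture_sequence()
    print(f"captured {len(sequence)} individual recordings")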
The objects that are recorded can be machines, structural elements, works of art, jewellery, or other valuable items, or they can be individuals or animals. Biometric or anatomical features are used in order to recognise or identify persons or animals, in particular breeding animals, and these features are recorded in the individual images. Both intentional and unintentional movements of the object can be used in order to obtain information. Parallel shifts or rotations that are perpendicular to the optical axis are used in order to achieve greater resolution from the camera. Higher resolution can also be achieved by computation. In the case of parallel shifts of the object, only one camera is used to record a sequence of individual recordings. When this is done, one exploits the fact that specific areas of the recorded object are imaged in the sequence of individual recordings with continuously changing sharpness. In this way, it is also possible to obtain information regarding the topography and the surface configuration of an object.
Using a sequence of a plurality of recordings made at pre-set intervals of time, it is also possible to identify dynamic processes of the object. Thus, this method permits the examination of the object over time. This means that the movement of an object can be followed and recorded. The recorded movement can be used, for example, to disclose or identify the object or to control specific processes. For example, faulty elements in a production process can be revealed, or an individual can be identified. Intentional movements made by an individual can also supply additional information.
The restricted depth of focus can be used in order to identify, image, and evaluate features beneath the surface of the object.
The method according to the present invention permits the use of a large diaphragm aperture. This makes it possible to obtain an image with the specific degree of sharpness.
According to one preferred embodiment of the present invention, the individual recordings are stored in a computer, and the sharply imaged areas of the individual recordings are determined by the computer using digital methods. The resulting images are assembled with the aid of the computer. Specific and suitable software is used for this purpose. This software also determines the limits of the sharply imaged areas. When the resulting image is being assembled, it is also possible to use knowledge of the trellis method that is known from information theory and from signal-processing methods. As an example, the individual recordings are stored in RAM or on the hard disk of the computer. The sequence of individual recordings is only required until the resulting images have been generated; once this has been done, the sequence of individual recordings is erased.
It is possible to use different methods in order to generate a resulting image. In a first method, the individual images are filtered with a high-pass filter and the sharp areas are copied.
When this is done, the transition frequency of the filter is matched to the ranges of sharp focus. This filter can also be made up of a number of different filters. For this purpose, it is possible to use digital methods such as Fourier transformations, wavelet transformations, digital filters, differential or difference formation, as well as Bessel, Butterworth, or Gaussian filters. It is also possible to evaluate other information in addition to the sharply imaged areas of the individual images; examples of this are the enlargement or reduction of the imaging relative to the plane of focus in the areas on both sides of the plane of focus. Assembly of the sharply imaged areas of the individual images is effected, for example, with the aid of known digital processes. One or more resulting images will be assembled, depending on the shape of the object and its surface configuration, as well as on the number of strata depths or the types of features that are of interest.
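The first method can be sketched digitally as follows. This is an illustrative Python implementation only, under the assumptions that the individual recordings are already aligned greyscale arrays of equal size and that the high-pass filter is formed by subtracting a Gaussian low-pass of an assumed kernel size from each image; the per-pixel argmax then copies, for every location, the value from the recording in which that location has the strongest high-frequency content:

```python
# Hedged sketch of the first method: high-pass filtering + copying sharp areas.
import numpy as np
import cv2

def highpass_energy(img: np.ndarray, ksize: int = 9) -> np.ndarray:
    """High-pass response: the image minus its low-pass (Gaussian) version."""
    img = img.astype(np.float32)
    lowpass = cv2.GaussianBlur(img, (ksize, ksize), 0)
    return np.abs(img - lowpass)

def assemble_resulting_image(recordings: list[np.ndarray]) -> np.ndarray:
    """Per pixel, take the value from the recording with the highest
    high-pass energy, i.e. the recording in which that area is sharpest."""
    stack = np.stack(recordings).astype(np.float32)          # (n, h, w)
    energy = np.stack([highpass_energy(r) for r in recordings])
    sharpest = np.argmax(energy, axis=0)                     # (h, w) indices
    rows, cols = np.indices(sharpest.shape)
    return stack[sharpest, rows, cols].astype(np.uint8)
```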
In a second method, as compared to the first method, the topology or morphology of the characteristic features of the object is also taken into consideration. As an example, if the object to be recorded is a finger, then, using this method, different principal layers and glands, for example the papillary layer and the sweat and sebaceous glands, can be evaluated.
When this is done, it is possible to take into account the fact that the papillary lines are largely joined and are on the surface.
In a third method, three-dimensional resulting images are generated from the sequence of individual recordings with the aid of digital functions. Such an image can subsequently be rotated, tilted, inclined, or moved in any other way, so that the user can see various views of the object on the display screen. This method is particularly suitable in those instances when the data recorded using the method according to the present invention are to be recognized in a data set that is recorded subsequently. Any rotation or shifting of the object in the first data set relative to the second data set can be corrected and compensated for, so that recognition is nonetheless possible.
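One simple digital route to such a three-dimensional resulting image, sketched here only as an illustration, is a depth-from-focus map: the index of the individual recording in which each pixel is sharpest is used as a relative depth value. The per-recording sharpness maps are assumed to come from a measure such as the one in the previous sketch:

```python
# Hedged sketch of the third method: a relative depth map from the sequence.
import numpy as np

def depth_from_focus(energies: list[np.ndarray]) -> np.ndarray:
    """energies[i] is the per-pixel sharpness of recording i.
    The index of maximal sharpness is used as a relative depth."""
    return np.argmax(np.stack(energies), axis=0).astype(np.float32)

# The resulting (h, w) array can be treated as a height field and displayed
# as a rotatable surface (e.g. with matplotlib's plot_surface), so that the
# user can view the object from different sides as described above.
```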
According to another advantageous version of the present invention, the sharply imaged areas are determined by way of numerical formation of the derivative. The derivative is formed in both dimensions of the two-dimensional individual recordings. The derivative is maximal or minimal at the sharply imaged locations. The sharply imaged areas can also be obtained, when suitable filters are used, by comparing the images with images recorded using different filters.
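A hedged sketch of this derivative-based determination follows; the threshold value is an assumption that would have to be tuned to the camera and the object:

```python
# Hedged sketch: sharpness from the numerical derivative in both dimensions.
import numpy as np

def derivative_sharpness(img: np.ndarray) -> np.ndarray:
    img = img.astype(np.float32)
    dy, dx = np.gradient(img)   # numerical derivative along rows and columns
    return np.hypot(dx, dy)     # large magnitude where edges are sharply imaged

def sharp_mask(img: np.ndarray, threshold: float = 20.0) -> np.ndarray:
    """Boolean mask of the sharply imaged areas; the threshold is an
    assumption and would be tuned to the camera and the object."""
    return derivative_sharpness(img) > threshold
```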
In another advantageous version of the present invention, the parameters for recording the sequence of individual recordings are predetermined by a computer, and the recording sequence is controlled by this same computer.
According to another advantageous version of the present invention, recording the sequence of individual recordings is started automatically. Thus, for example, recording can be started at a specific time or when the object is in a specific position. Recording can also be started when the computer that is processing the individual recordings identifies sharply imaged areas.
In another advantageous version of the present invention, recording the sequence of individual recordings is started by a photoelectric barrier. This method is particularly suitable if the object moves towards and away from the camera during the recording process. The recording is then started automatically if the object approaches to within a specific distance from the camera.
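Illustratively, and only as a sketch (the sensor-access function below is a placeholder, since the patent does not specify a particular interface or trigger distance), the trigger logic could look like this:

```python
# Hedged sketch: start the sequence when the object breaks a distance threshold.
import time

TRIGGER_DISTANCE_MM = 150.0   # assumed trigger distance

def read_distance_sensor() -> float:
    """Placeholder for the real photoelectric barrier / distance sensor."""
    raise NotImplementedError("replace with actual sensor access")

def wait_for_object_and_record(capture_sequence):
    while True:
        if read_distance_sensor() <= TRIGGER_DISTANCE_MM:
            return capture_sequence()   # object is close enough: start recording
        time.sleep(0.01)                # poll the barrier at roughly 100 Hz
```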
According to another advantageous version of the present invention, the individual recordings are made at precisely fixed intervals of time. Thus, the camera can take twenty-five individual recordings as full images, or fifty individual recordings as half images, each second, and these are then transferred to the computer memory. These values apply in the case of the CCIR standard; other values will apply in the case of other standards. Not all of the individual recordings have to be stored in memory. The time at which a recording begins and the time at which storage in the computer memory begins can be different. The underlying principle in this case is that recording the sequence of individual recordings and storing them in the computer memory are processes that are not linked to each other.
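The decoupling of capture and storage can be illustrated with a producer/consumer sketch; the frame rate of twenty-five per second, the queue size, and the use of Python threads are assumptions, not part of the disclosure:

```python
# Hedged sketch: frame capture and storage as two decoupled processes.
import queue
import threading
import time

def producer(grab_frame, out_queue: queue.Queue, interval_s: float = 1 / 25, n: int = 50):
    """Grab n frames at fixed time intervals (25 per second, as with CCIR)."""
    for _ in range(n):
        out_queue.put(grab_frame())
        time.sleep(interval_s)
    out_queue.put(None)                 # sentinel: capture finished

def consumer(in_queue: queue.Queue, keep_frame, stored: list):
    """Independently decide which frames are worth storing."""
    while (frame := in_queue.get()) is not None:
        if keep_frame(frame):
            stored.append(frame)

def run(grab_frame, keep_frame):
    q, stored = queue.Queue(maxsize=8), []
    t1 = threading.Thread(target=producer, args=(grab_frame, q))
    t2 = threading.Thread(target=consumer, args=(q, keep_frame, stored))
    t1.start(); t2.start(); t1.join(); t2.join()
    return stored
```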
According to another advantageous version of the present invention, the individual recordings are made at fixed relative distances between the camera and the object. This can be done, for example, by appropriately arranged photoelectric barriers.
In another advantageous version of the present invention, a CCD camera is used to record the sequence of individual recordings. A line camera or a scanner can also be used in place of the CCD camera.
According to another advantageous version of the present invention, initially all the individual recordings of the sequence are stored in the computer. Once the sequence has been recorded, the sharply imaged areas of the individual recordings are identified and assembled to form a resulting image.

In a further advantageous configuration of the present invention, the sharp areas of each individual recording of the sequence are identified immediately after it has been recorded, and are then incorporated into the resulting image. The individual recordings are not stored. Provided the CPU of the computer is operating fast enough, identification of the sharp areas and their incorporation into the resulting image can take place in real time. If this is not the case, then the data relevant to the individual recordings must be placed in intermediate storage. If a plurality of resulting images are generated from the individual recordings, the assembly of the individual resulting images can be effected using different methods. In order to further speed up the recording of the data and their storage in the computer, a plurality of processors can be used for assembling one or more resulting images. The interaction of the processors can be organised from different standpoints. On the one hand, the digital computations involved in Methods 1 to 3 described above can be divided into as many sections as can run concurrently. Each section will then be processed by a different processor. The processors are synchronised by input, by output, or by the end of the process for each section. The data are passed on, or a RAM with more than one access (multiported RAM) is used. The assembly of a plurality of resulting images can be effected partly in parallel. Thus, all the resulting images can be formed even as an object is approaching the camera. To the extent that this is not possible, the missing resulting images will be computed subsequently. This results in grid patterns containing all the information that has been read out or computed.
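An illustrative sketch of this streaming assembly follows, under the assumptions of aligned greyscale frames and a per-pixel sharpness map such as those computed in the earlier sketches; each individual recording is folded into the resulting image as soon as it arrives and is then discarded rather than stored:

```python
# Hedged sketch: incremental assembly without storing the sequence.
import numpy as np

class IncrementalAssembler:
    """Keeps only the best-so-far resulting image and its sharpness map."""

    def __init__(self):
        self.result = None   # current resulting image
        self.best = None     # per-pixel sharpness of the values in `result`

    def add_recording(self, frame: np.ndarray, sharpness: np.ndarray) -> None:
        frame = frame.astype(np.float32)
        if self.result is None:
            self.result, self.best = frame.copy(), sharpness.copy()
            return
        better = sharpness > self.best          # pixels that are sharper now
        self.result[better] = frame[better]     # overwrite only those pixels
        self.best[better] = sharpness[better]
        # `frame` is not retained anywhere else, so the sequence is not stored.
```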
According to another advantageous version of the present invention, a plurality of resulting images will be assembled from the sequence of individual recordings, with a different area of the object being shown in the resulting images in each instance.
According to another advantageous configuration of the present invention, the plane of the image is divided into a plurality of areas, and these areas are then processed in parallel. This method is particularly suitable if a plurality of processors is available for processing. The areas involved can be squares, rectangles, circles, ovals, or other shapes. These can be adjacent to each other or can overlap each other.
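As an illustration of such tiling (rectangular, non-overlapping areas and a thread pool are assumptions; process_tile stands for any of the sharpness computations above and must return an array of the same shape as its input):

```python
# Hedged sketch: split the image plane into tiles and process them in parallel.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def tile_slices(height: int, width: int, tile: int = 128):
    """Yield rectangular, non-overlapping areas of the image plane."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield np.s_[y:y + tile, x:x + tile]

def process_in_parallel(img: np.ndarray, process_tile, workers: int = 4) -> np.ndarray:
    out = np.empty_like(img, dtype=np.float32)
    slices = list(tile_slices(*img.shape[:2]))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # process_tile must preserve the shape of the tile it receives
        for s, tile_result in pool.map(lambda s: (s, process_tile(img[s])), slices):
            out[s] = tile_result
    return out
```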
According to another advantageous version of the present invention, the method is used to identify the features of a finger, in particular of a fingertip. In order to record the data, the finger is brought close to a camera. The process for recording the sequence of individual recordings is started during this approach. Still more individual recordings can be made as the finger is moving away from the camera. For purposes of recognition, particularly characteristic features at the fingertips are identified from the resulting image and are looked for during a repeated recording of the finger. The sweat and sebaceous glands, the papillary layer, and the openings of the glands on the surface of the skin, which form the dermis and the epidermis, are particularly characteristic features of a fingertip. The papillae are also the basis for the behaviour of the skin. The papillary layer, the sweat and sebaceous glands, and the openings of the glands on the surface of the skin can be recorded in different resulting images. This simplifies recognition. Using the method according to the present invention, it can also be determined whether or not blood is flowing through the finger. If the finger is illuminated with a source of infra-red light, a sequence of individual recordings can be made that shows variations of brightness as a function of the individual's heartbeat. Furthermore, as blood flows through the finger, it causes a periodic shift of the cells in the blood vessels within the finger, and this can also be identified with the aid of the method according to the present invention.
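To make the brightness-variation check concrete, the following illustrative sketch (the frame rate and the assumed pulse band of roughly 0.7 to 3 Hz, i.e. about 40 to 180 beats per minute, are not part of the disclosure) looks for a dominant periodicity in the mean brightness of the sequence:

```python
# Hedged sketch: detect a heartbeat-like periodicity in the brightness of
# a sequence of infra-red individual recordings of the finger.
import numpy as np

def dominant_pulse_hz(frames: list[np.ndarray], fps: float = 25.0):
    """Return the dominant frequency in the 0.7-3 Hz band, or None if the
    sequence is too short to resolve it."""
    signal = np.array([f.mean() for f in frames], dtype=np.float64)
    if signal.size < 2 * fps:           # need a couple of seconds of data
        return None
    signal -= signal.mean()             # remove the constant illumination level
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    if not band.any():
        return None
    return float(freqs[band][np.argmax(spectrum[band])])
```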
According to another advantageous version of the present invention, the object is illuminated with a light source.

According to another advantageous version of the present invention, a pulsed light source is used, and this is synchronised with the camera. The object is only illuminated when an individual recording is to be made.
According to another advantageous configuration of the present invention, the object is illuminated by a plurality of light sources of different wavelength ranges and in different arrangements. Different types of illumination can be used.
Because of the different spatial arrangements, the light will arrive at different angles of incidence. In this way, different spatial, geometric or perspective individual recordings can be made. As an example, flash tubes with different optical filters can be used as the light sources.

Because of the filters, electromagnetic radiation in various wavelength ranges is obtained with the aid of one light source.
According to another advantageous version of the present invention, the object is illuminated only whilst it is moving toward and away from the camera. The individual recordings are made during this interval of time. In this way, one obtains individual recordings made at different distances from the camera, which thus correspond to various depths of focus.

According to another advantageous version of the present invention, only those areas of the object that are within the focus of the camera are illuminated. This is made possible by the fact that the focus of the camera is not changed between the recording of the individual images. The evaluation and assembly of the individual recordings are simplified in that there is no information from the unsharp areas in the individual recordings.
It is advantageous that a system that incorporates a computer, a camera, and a control device is used to carry out the method according to the present invention.
Additional advantages and advantageous configurations of the present invention are set out in the Patent Claims.
According to the present invention, all of the features that are set out in the description and in the claims can be used either singly or in any combination with each other, as essential to the present invention.

Claims (20)

Claims
1. Method for recording and storing the optically detectable data of anobject on a storage medium, characterized in that a sequence of a plurality of individual recordings of the object are made with a camera at various spatial settings with respect to the relative position between the object and the camera, without the setting of the optical components and the resulting focus being changed; in that the sharply imaged areas of the individual recordings are identified;
and in that the sharply imaged areas of all the individual recordings are assembled to form one or a plurality of resulting images.
2. Method as defined in Claim 1, characterized in that the individual recordings are stored in a computer; in that the sharply imaged areas of the individual recordings are identified by the computer with the aid of digital methods;
and in that the resulting images are assembled with the aid of the computer.
3. Method as defined in Claim 2, characterized in that the sharply imaged areas are determined by digital formation of the derivative.
4. Method as defined in Claim 1, Claim 2, or Claim 3, characterized in that the parameters for recording the sequence of individual recordings are determined by a computer; and in that the recording sequence is controlled by this computer.
5. Method as defined in one of the preceding Claims, characterized in that the recording of the sequence of individual recordings is started automatically.
6. Method as defined in Claim 5, characterized in that the recording of the sequence of individual recordings is started by means of a photoelectric barrier.
7. Method as defined in one of the preceding Claims, characterized in that the individual recordings are made at fixed, predetermined time intervals.
8. Method as defined in one of the preceding Claims, characterized in that the individual recordings are made at fixed, predetermined relative distances between the camera and the object.
9. Method as defined in one of the preceding Claims, characterized in that a CCD camera is used as the camera for recording the sequence of individual recordings.
10. Method as defined in one of the preceding Claims, characterized in that initially all the individual recordings of the sequence are stored in the computer; and in that the sharply imaged areas are identified after recording of the sequence of individual recordings has been concluded.
11. Method as defined in one of the preceding Claims, characterized in that the sharply imaged areas of each individual recording of the sequence are identified and incorporated into the resulting image immediately after they have been recorded.
12. Method as defined in one of the preceding Claims, characterized in that a plurality of resulting images is assembled from the sequence of individual recordings, different areas of the object or different features of the object being shown in the resulting images in each instance.
13. Method as defined in one of the preceding Claims, characterized in that the image plane is divided into a plurality of areas; and in that the areas are processed in parallel.
14. Method as defined in one of the preceding Claims, characterized in that it is used to identify the features of a finger.
15. Method as defined in one of the preceding Claims, characterized in that the object is illuminated with a light source.
16. Method as defined in Claim 15, characterized in that a pulsed light source that is synchronized with the camera is used.
17. Method as defined in Claim 15 or 16, characterized in that the object is illuminated by a plurality of light sources of different wavelength ranges and in different arrangements.
18. Method as defined in one of the Claims 15 to 17, characterized in that the object is illuminated as long as it is moving towards the camera and away from the camera.
19. Method as defined in one of the Claims 15 to 18, characterized in that only the areas of the object that are within the focus of the camera are illuminated.
20. Apparatus for carrying out a method according to one of the Claims 1 to 18, characterized in that a computer, a camera, and a control device are provided.
CA002306755A 1997-09-11 1998-09-11 Method for receiving and storing optically detectable data Abandoned CA2306755A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE19740038 1997-09-11
DE19740038.8 1997-09-11
PCT/IB1998/001516 WO1999013431A1 (en) 1997-09-11 1998-09-11 Method for receiving and storing optically detectable data

Publications (1)

Publication Number Publication Date
CA2306755A1 true CA2306755A1 (en) 1999-03-18

Family

ID=7842077

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002306755A Abandoned CA2306755A1 (en) 1997-09-11 1998-09-11 Method for receiving and storing optically detectable data

Country Status (10)

Country Link
EP (1) EP1012790B1 (en)
JP (1) JP2001516108A (en)
CN (1) CN1271447A (en)
AT (1) ATE227037T1 (en)
AU (1) AU768161B2 (en)
BR (1) BR9812641A (en)
CA (1) CA2306755A1 (en)
DE (2) DE19841555A1 (en)
RU (1) RU2000109297A (en)
WO (1) WO1999013431A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8736751B2 (en) 2008-08-26 2014-05-27 Empire Technology Development Llc Digital presenter for displaying image captured by camera with illumination system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004006246A1 (en) * 2004-02-05 2005-10-13 X3D Technologies Gmbh Areal scenes and objects recording method, involves creating two-dimensional image from sharp image areas of observed n-sequence, by which two-dimensional image and depth chart are obtained

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05227460A (en) * 1992-02-14 1993-09-03 Scala Kk Image pickup method and system for obtaining large focus depth
JP3084130B2 (en) * 1992-05-12 2000-09-04 オリンパス光学工業株式会社 Image input device
JPH08320285A (en) * 1995-05-25 1996-12-03 Hitachi Ltd Particle analyzing device
SE512350C2 (en) * 1996-01-09 2000-03-06 Kjell Olsson Increased depth of field in photographic image

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8736751B2 (en) 2008-08-26 2014-05-27 Empire Technology Development Llc Digital presenter for displaying image captured by camera with illumination system

Also Published As

Publication number Publication date
DE59806129D1 (en) 2002-12-05
ATE227037T1 (en) 2002-11-15
RU2000109297A (en) 2002-03-27
WO1999013431A1 (en) 1999-03-18
BR9812641A (en) 2002-02-05
EP1012790B1 (en) 2002-10-30
AU768161B2 (en) 2003-12-04
EP1012790A1 (en) 2000-06-28
CN1271447A (en) 2000-10-25
JP2001516108A (en) 2001-09-25
AU9639198A (en) 1999-03-29
DE19841555A1 (en) 1999-06-17

Similar Documents

Publication Publication Date Title
US6299306B1 (en) Method and apparatus for positioning subjects using a holographic optical element
JP3943591B2 (en) Automated non-invasive iris recognition system and method
US6394602B1 (en) Eye tracking system
KR100406296B1 (en) System for contactless recognition of hand and finger lines
US4637056A (en) Optical correlator using electronic image preprocessing
JP5001286B2 (en) Object reconstruction method and system
EP1349487B1 (en) Image capturing device with reflex reduction
US5305092A (en) Apparatus for obtaining three-dimensional volume data of an object
AU727389B2 (en) Apparatus for the iris acquiring images
GB2380348A (en) Determination of features of interest by analysing the movement of said features over a plurality of frames
JP2007219625A (en) Blood vessel image input device and personal identification system
US20060039048A1 (en) Systems and methods of capturing prints with a holographic optical element
DE10246411A1 (en) Device for the contactless optical detection of biometric properties of at least one part of the body
CA2306755A1 (en) Method for receiving and storing optically detectable data
US5812248A (en) Process and device for generating graphical real-time directional information for detected object traces
KR101987922B1 (en) Automatic system for capturing image by controlling light cell actively
US7394552B2 (en) Method for measuring the separation of extended objects in conjunction with an optical observation system and microscope for carrying out the same
CA1095295A (en) Rotationally independent optical correlation for position determination
Yoon et al. Nonintrusive iris image acquisition system based on a pan-tilt-zoom camera and light stripe projection
Hariharan Extending Depth of Field via Multifocus Fusion
JPS5813946B2 (en) pattern identification device
DE68917297T2 (en) Method and device for fingerprint verification.
Zhou et al. Background Noise Reduction in an Integrated Volume Holographic Imaging Element for Eye-Gaze Detection
CN117750167A (en) Image acquisition apparatus and method
JP2002031512A (en) Three-dimensional digitizer

Legal Events

Date Code Title Description
FZDE Discontinued