US5541652A - Hyperacuity sensing - Google Patents
- Publication number
- US5541652A (application US08/426,439)
- Authority
- US
- United States
- Prior art keywords
- hyperacuity
- sensor
- sensor elements
- array
- radiation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Links
- 238000003384 imaging method Methods 0.000 claims abstract description 18
- 238000009826 distribution Methods 0.000 claims abstract description 15
- 230000005855 radiation Effects 0.000 claims 11
- 238000000034 method Methods 0.000 abstract description 10
- 230000001419 dependent effect Effects 0.000 abstract description 2
- 230000000694 effects Effects 0.000 abstract description 2
- 230000003834 intracellular effect Effects 0.000 abstract description 2
- 238000005286 illumination Methods 0.000 description 6
- 229910021417 amorphous silicon Inorganic materials 0.000 description 4
- AMGQUBHHOARCQH-UHFFFAOYSA-N indium;oxotin Chemical compound [In].[Sn]=O AMGQUBHHOARCQH-UHFFFAOYSA-N 0.000 description 4
- 239000000758 substrate Substances 0.000 description 4
- 238000003491 array Methods 0.000 description 3
- 238000004519 manufacturing process Methods 0.000 description 3
- 239000010409 thin film Substances 0.000 description 3
- VYZAMTAEIAYCRO-UHFFFAOYSA-N Chromium Chemical compound [Cr] VYZAMTAEIAYCRO-UHFFFAOYSA-N 0.000 description 2
- ZOKXTWBITQBERF-UHFFFAOYSA-N Molybdenum Chemical compound [Mo] ZOKXTWBITQBERF-UHFFFAOYSA-N 0.000 description 2
- 229910052782 aluminium Inorganic materials 0.000 description 2
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 description 2
- 239000010408 film Substances 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 238000012935 Averaging Methods 0.000 description 1
- 229910052581 Si3N4 Inorganic materials 0.000 description 1
- 238000000151 deposition Methods 0.000 description 1
- 230000008021 deposition Effects 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 229910021424 microcrystalline silicon Inorganic materials 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000008447 perception Effects 0.000 description 1
- 230000000135 prohibitive effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- HQVNEWCFYHHQES-UHFFFAOYSA-N silicon nitride Chemical compound N12[Si]34N5[Si]62N3[Si]51N64 HQVNEWCFYHHQES-UHFFFAOYSA-N 0.000 description 1
- 238000004544 sputter deposition Methods 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N3/00—Scanning details of television systems; Combination thereof with generation of supply voltages
- H04N3/10—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical
- H04N3/14—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices
- H04N3/15—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices for picture signal generation
- H04N3/1506—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices for picture signal generation with addressing of the image-sensor elements
- H04N3/1512—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices for picture signal generation with addressing of the image-sensor elements for MOS image-sensors, e.g. MOS-CCD
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N3/00—Scanning details of television systems; Combination thereof with generation of supply voltages
- H04N3/10—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical
- H04N3/14—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices
- H04N3/15—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical by means of electrically scanned solid-state devices for picture signal generation
- H04N3/155—Control of the image-sensor operation, e.g. image processing within the image-sensor
Definitions
- FIG. 1 shows a 4 × 5 element section of an array 10 onto which an image having an edge 12 is projected.
- edge is used herein to mean the border between light-illuminated areas and areas under ambient conditions. It is assumed that the area of the array 10 above the edge 12 is illuminated, while the area below the edge is dark.
- the twenty elements are organized into rows A through D, and columns R through V.
- the illumination state of each of the elements is determined using matrix addressing techniques. If a particular element is sufficiently illuminated, for example the element at row A, column R, the element is sensed as being at a first state (ON). If a particular element is not sufficiently illuminated, say the element at row D, column V, that element is sensed as being in a second state (OFF). If a particular element is partially illuminated, its state depends upon how much of the element is illuminated, and the intensity of that illumination. An interrogation of all of the illustrated elements of the array 10 results in the rather coarse approximation to the image as shown in FIG.
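The matrix-addressed interrogation described above amounts to thresholding each element's integrated illumination. A minimal sketch, in which the 4 × 5 row/column labels follow the figure and the array contents and the 0.5 threshold are illustrative:

```python
# Sketch of the coarse, per-element sensing described above: each element
# is reported ON if its fractional illumination exceeds a threshold and
# OFF otherwise, yielding the coarse approximation of the image.

ROWS = "ABCD"
COLS = "RSTUV"

def interrogate(intensities, threshold=0.5):
    """Map each element's fractional illumination (0.0-1.0) to ON/OFF."""
    return {
        (r, c): ("ON" if intensities[(r, c)] > threshold else "OFF")
        for r in ROWS
        for c in COLS
    }

# Fully lit element at (A, R), dark element at (D, V):
sample = {(r, c): 0.0 for r in ROWS for c in COLS}
sample[("A", "R")] = 1.0
states = interrogate(sample)
print(states[("A", "R")], states[("D", "V")])  # -> ON OFF
```

A partially illuminated element lands on one side of the threshold depending on how much of its area is lit, which is exactly why this scheme can only localize an edge to whole-pixel precision.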
- In addition to the discrete sensor elements described above, another type of light-sensitive element exists, called a position sensitive detector.
- An example of a position sensitive detector is the detector 200 shown in FIG. 2. This detector outputs photogenerated analog currents 202, 204, 206, and 208, that can be used to determine the position of the centroid of the illuminating spot 210.
- the centroid of the light spot in the x-direction (horizontal) can be computed from the quantity (I206 - I208)/(I206 + I208), while the centroid of the light spot in the y-direction (vertical) can be computed from (I202 - I204)/(I202 + I204), where I20x is the current from the corresponding lateral element.
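The centroid arithmetic above can be sketched directly. A minimal sketch, in which the function name is illustrative and the current arguments are labeled after the figure's reference numerals 202 through 208:

```python
# Sketch of the centroid computation for a position sensitive detector,
# using the ratio formulas quoted above. The detector geometry and the
# normalization to [-1, 1] are illustrative assumptions.

def psd_centroid(i202, i204, i206, i208):
    """Return the (x, y) centroid of a light spot from the four lateral
    photocurrents, normalized to the range [-1, 1]."""
    x = (i206 - i208) / (i206 + i208)  # horizontal centroid
    y = (i202 - i204) / (i202 + i204)  # vertical centroid
    return x, y

# A spot centered on the detector produces four equal currents:
print(psd_centroid(1.0, 1.0, 1.0, 1.0))  # -> (0.0, 0.0)
```

A spot displaced to one side skews the corresponding current pair, moving the computed centroid toward that electrode.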
- because position sensitive detectors are typically large (from about 1 cm × 1 cm to 5 cm × 5 cm), they have not been used in imaging arrays.
- an imaging device should be able to match the ability of the human visual system to determine edge positions, a capability known as edge acuity. Because of the difficulties in achieving high spatial resolution by increasing the pixel density, current image scanners cannot match the high edge acuity of human perception. Thus, new imaging and scanning techniques are necessary. Such new techniques would be particularly valuable if they could identify the positions of an edge to a fraction of the interpixel spacing. The ability to resolve edge spacings finer than the interpixel spacing is referred to as hyperacuity.
- the present invention implements hyperacuity sensing.
- Hyperacuity sensing is implemented using an array of sensors whose output signals are dependent upon the internal intensity distribution of the impinging light. The output signals are processed to determine the intra-sensor intensity distribution, which is then used to produce a hyperacuity image.
- Such hyperacuity images enhance the definition of edges, and reduce undesirable artifacts such as jaggies and moire effects.
- FIG. 1 shows a schematic depiction of a 4 element by 5 element section of a prior art imaging array
- FIG. 2 shows a simplified depiction of a prior art position sensitive detector
- FIG. 3 provides depictions of a sensor suitable for extracting light distribution information within a pixel
- FIG. 4 helps illustrate a method for using position sensitive detectors to extract edge information
- FIG. 5 shows a schematic depiction of a 2 element by 3 element section of an imaging array that is in accordance with the principles of the present invention
- FIG. 6 shows the intensity information obtained using an array of intensity distribution sensitive pixels in accordance with the principles of the present invention.
- the acuity of prior art imaging scanners is limited by the separation of the individual elements of the scanner's sensor array (for a given image magnification).
- This limitation is overcome in the present invention using hyperacuity techniques which approximate the edges of light illuminating the position sensors.
- the following description first describes a hyperacuity sensor and its use in implementing the hyperacuity techniques, then describes a hyperacuity array, and finally describes the fabrication of the hyperacuity sensor.
- a sensor 300 that is suitable for use in hyperacuity sensing is illustrated in FIG. 3.
- the sensor 300 is an amorphous silicon position sensitive detector that is fabricated small enough to detect spatial distributions of light intensity at about 400 spots per inch.
- the sensor 300 has a substrate 301 on which is located a pair of lower electrodes 304 and 306.
- a lower, microcrystalline resistive layer 308 is formed over the lower electrodes 304 and 306.
- Over the resistive layer 308 is a vertical p-i-n diode 310 that is overlaid by an upper transparent, resistive layer 312 of a material such as indium tin oxide.
- Over the resistive layer 312 is a transparent insulating layer 313.
- Openings through the insulating layer 313 to the resistive layer 312 are formed using standard photolithographic methods. Then, top electrodes 314 and 316, which electrically connect to the resistive layer 312, are located over the formed openings. Except for various enhancements in materials and dimensions, the sensor 300 is similar to the sensor 200 shown in FIG. 2.
- the currents are analyzed as described below to determine a hyperacuity approximation for the distribution of the light which illuminates the sensor.
- the subpixel accuracy relates to the ability to identify spatial intensity distributions within the sensor cell as distinguished from intensity averaging over the entire cell. This intracellular spatial sensitivity allows one subsequently to render images with similar subpixel accuracy.
- a parametric model for the distribution of light within the sensor is first determined; then, using the currents I1 through I4, the parameters for the model are determined and applied to the model.
- the result is the hyperacuity approximation of the illuminating light distribution.
- a particularly useful model for hyperacuity sensing is the edge model.
- the impinging light intensity is assumed to be delineated by a straight edge 322 between black regions 324 and uniform gray-level regions 326 that subtend the active area of the sensor 300. Although approximating the edge of the illuminating light, which may not be straight, by a straight line may at first appear unacceptable, the linear approximation is quite good because the size of the sensor 300 is comparable with, or smaller than, the smallest curvature of interest.
- the locations at which the edge approximation intersects the boundary of the sensor, as well as the gray level, can be determined from the four currents from the element 300 (see below). The edges of an image can therefore be assigned to a much smaller spatial dimension (more accurately determined) than the size of the sensor 300.
- the second condition occurs when (x, y) falls within the black region of the lower, center panel in FIG. 4.
- the edge approximation intercepts h1 and h2 are given by (2 - 3x)yL/(1 - 3x(1 - x)) and (-1 + 3x)yL/(1 - 3x(1 - x)), respectively.
- This condition, which corresponds to the illumination condition in the upper, center panel of FIG. 4, is referred to as the type 2 condition.
- (x,y) falls within the black region depicted in the lower, right panel of FIG.
- This condition, which corresponds to the illumination condition shown in the upper, right panel of FIG. 4, is referred to as the type 3 condition.
- Other values of (x, y) yield edge approximation intercepts with the sensor boundary by various rotation and mirror operations. The determination of a linear approximation to an edge is therefore unique.
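Under the edge model, the type 2 intercept formulas quoted above can be evaluated directly. A minimal sketch, in which the function name, the parameter names, and the sample values are illustrative (L is taken to be the sensor side length and (x, y) the model parameters):

```python
# Sketch evaluating the type 2 edge-boundary intercepts quoted above:
#   h1 = (2 - 3x) * y * L / (1 - 3x(1 - x))
#   h2 = (-1 + 3x) * y * L / (1 - 3x(1 - x))
# The interpretation of L as the sensor side length is an assumption.

def type2_intercepts(x, y, side_length):
    """Return the boundary intercepts (h1, h2) of the type 2 edge
    approximation for model parameters (x, y)."""
    denom = 1.0 - 3.0 * x * (1.0 - x)
    h1 = (2.0 - 3.0 * x) * y * side_length / denom
    h2 = (-1.0 + 3.0 * x) * y * side_length / denom
    return h1, h2
```

At x = 0.5 the two intercepts coincide, as expected from the symmetry of the formulas.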
- the hyperacuity sensing described above used the edge model.
- Other intra-sensor intensity models are possible.
- a second model that is useful in hyperacuity sensing produces a gray-scale approximation of the illuminating light. That model involves a parameterization in which the intensity within the sensor is taken to be a plane in a 3-dimensional (x, y, φ) space, where x and y are spatial dimensions and φ is the light intensity.
- the linear approximation is determined from the current outputs from the sensor 300.
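As a sketch of the planar gray-scale model, the plane parameters can be fit to intensity samples. The corner-sampling scheme below is a stand-in for the actual analysis of the sensor currents, which the text leaves to the sensor physics; numpy is assumed available.

```python
# Sketch of the planar intensity model described above: the light within
# the sensor is modeled as a plane phi(x, y) = a*x + b*y + c, and the
# three parameters are recovered by least squares from intensity samples.
# Sampling at the four electrode corners is an illustrative assumption.

import numpy as np

def fit_intensity_plane(samples):
    """samples: list of ((x, y), phi) pairs; returns (a, b, c)."""
    pts = np.array([[x, y, 1.0] for (x, y), _ in samples])
    phi = np.array([v for _, v in samples])
    coeffs, *_ = np.linalg.lstsq(pts, phi, rcond=None)
    return tuple(coeffs)

# A plane phi = 2x + 3y + 1 sampled at four corners is recovered exactly:
corners = [((0, 0), 1.0), ((1, 0), 3.0), ((0, 1), 4.0), ((1, 1), 6.0)]
a, b, c = fit_intensity_plane(corners)
print(round(a, 6), round(b, 6), round(c, 6))  # -> 2.0 3.0 1.0
```

With four samples and three parameters the fit is overdetermined, so least squares also tolerates a small amount of measurement noise.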
- FIG. 5 shows a 2 ⁇ 3 section 500 of an array of hyperacuity sensors 502.
- Each sensor 502 has four electrodes 510 which connect to current amplifiers 512 (only four of which are shown). The currents from each sensor 502 are resolved as described above to form an approximation of the edge of the impinging light in each sensor 502.
- FIG. 5 shows an edge 520 defined by the boundary between illuminated and dark areas.
- the position of the edge 520 in each sensor 502 is approximated using the model described above.
- the location of the overall edge in the section 500 is determined by piecewise fitting together the approximations from each sensor 502.
- the accuracy of the approximation of the edge 520 is superior to that produced in prior art imaging scanners whose discrete sensor elements are separated by distances similar to those separating the present sensors 502.
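The piecewise fitting described above can be sketched as chaining the straight segment recovered in each sensor into one polyline. Averaging the junction between consecutive segments is an illustrative detail, not a method the text prescribes:

```python
# Sketch of assembling a global edge from per-sensor edge segments.
# Each segment is a pair of endpoints in global coordinates, ordered
# along the edge; adjacent endpoints are averaged at the junctions.

def stitch_edge(segments):
    """segments: list of ((x0, y0), (x1, y1)), one per sensor, ordered
    along the edge; returns the chained polyline as a list of points."""
    if not segments:
        return []
    polyline = [segments[0][0]]
    for (_, end), nxt in zip(segments, segments[1:]):
        (ex, ey), (sx, sy) = end, nxt[0]
        polyline.append(((ex + sx) / 2.0, (ey + sy) / 2.0))  # junction
    polyline.append(segments[-1][1])
    return polyline
```

Because each segment is already localized to subpixel accuracy within its sensor, the assembled polyline tracks the true boundary far more closely than the ON/OFF grid of FIG. 1.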
- FIG. 6 shows the improvement in image acuity that arises when the present invention is applied to the same illumination as that used in FIG. 1.
- the array 600 consists of individual pixel sensors 614 which, in this example, are position sensitive detectors.
- the outputs in conjunction with the edge model have been used to determine the subpixel edge positions as shown by the lines 616.
- the improved accuracy in the approximation to the illuminating boundary 612 is evident.
- a hyperacuity array can be fabricated on a glass substrate as follows. First, a chrome/molybdenum metal layer is deposited on the substrate by sputtering. Then, the chrome/molybdenum metal layer is patterned to form the lower electrode pairs (which correspond to the electrodes 304 and 306 in FIG. 3). Next, a laterally resistive thin film of doped microcrystalline silicon (which corresponds to the resistive layer 308) is deposited uniformly over the substrate and lower electrodes. An undoped, hydrogenated amorphous silicon layer approximately 300 nm thick and a thin p-type amorphous silicon contacting layer are laid over the resistive thin film by using plasma deposition.
- a transparent layer of indium tin oxide (ITO), which corresponds to the resistive layer 312, is then deposited over the contacting layer.
- an insulating film of silicon nitride is laid down over the indium tin oxide. That insulating film is then patterned to open trenches to the indium tin oxide layer.
- Aluminum is then deposited over the exposed top surface. That aluminum is then patterned to form the top electrode contacts, the vias, and the leads which apply the current signals to contact pads or to thin film pass transistors.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Facsimile Heads (AREA)
- Solid State Image Pick-Up Elements (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/426,439 US5541652A (en) | 1993-11-12 | 1995-04-21 | Hyperacuity sensing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15204493A | 1993-11-12 | 1993-11-12 | |
US08/426,439 US5541652A (en) | 1993-11-12 | 1995-04-21 | Hyperacuity sensing |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15204493A Continuation | 1993-11-12 | 1993-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
US5541652A true US5541652A (en) | 1996-07-30 |
Family
ID=22541310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/426,439 Expired - Lifetime US5541652A (en) | 1993-11-12 | 1995-04-21 | Hyperacuity sensing |
Country Status (2)
Country | Link |
---|---|
US (1) | US5541652A (ja) |
JP (1) | JP3881039B2 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5754690A (en) * | 1995-10-27 | 1998-05-19 | Xerox Corporation | Position sensitive detector based image conversion system capable of preserving subpixel information |
US5790699A (en) * | 1995-10-27 | 1998-08-04 | Xerox Corporation | Macrodetector based image conversion system |
EP0877231A1 (fr) * | 1997-05-09 | 1998-11-11 | Vishay S.A. | Dispositif de mesure de position et de déplacement sans contact |
US6704462B1 (en) * | 2000-07-31 | 2004-03-09 | Hewlett-Packard Development Company, L.P. | Scaling control for image scanners |
US20040094717A1 (en) * | 2002-11-14 | 2004-05-20 | Griffin Dennis P. | Sensor having a plurality of active areas |
US20040239650A1 (en) * | 2003-06-02 | 2004-12-02 | Mackey Bob Lee | Sensor patterns for a capacitive sensing apparatus |
US7202859B1 (en) | 2002-08-09 | 2007-04-10 | Synaptics, Inc. | Capacitive sensing pattern |
US10366506B2 (en) | 2014-11-07 | 2019-07-30 | Lamina Systems, Inc. | Hyperacuity system and methods for real time and analog detection and kinematic state tracking |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105538396B (zh) * | 2015-12-30 | 2018-01-05 | 长园和鹰智能科技有限公司 | Calibration system for the material pick-up zone of a cutting machine, pick-up system, and calibration method therefor |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4739384A (en) * | 1984-10-22 | 1988-04-19 | Fuji Photo Film Co., Ltd. | Solid-state imaging device with polycrystalline film |
US4765732A (en) * | 1987-03-20 | 1988-08-23 | The Regents Of The University Of California | Hyperacuity testing instrument for evaluating visual function |
US4771471A (en) * | 1985-03-07 | 1988-09-13 | Dainippon Screen Mfg. Co., Ltd. | Smoothing method for binary-coded image data and apparatus therefor |
US5151787A (en) * | 1989-12-23 | 1992-09-29 | Samsung Electronics Co., Ltd. | Method and circuit for correcting image edge |
US5204910A (en) * | 1991-05-24 | 1993-04-20 | Motorola, Inc. | Method for detection of defects lacking distinct edges |
US5231677A (en) * | 1984-12-28 | 1993-07-27 | Canon Kabushiki Kaisha | Image processing method and apparatus |
-
1994
- 1994-11-07 JP JP27197994A patent/JP3881039B2/ja not_active Expired - Lifetime
-
1995
- 1995-04-21 US US08/426,439 patent/US5541652A/en not_active Expired - Lifetime
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4739384A (en) * | 1984-10-22 | 1988-04-19 | Fuji Photo Film Co., Ltd. | Solid-state imaging device with polycrystalline film |
US5231677A (en) * | 1984-12-28 | 1993-07-27 | Canon Kabushiki Kaisha | Image processing method and apparatus |
US4771471A (en) * | 1985-03-07 | 1988-09-13 | Dainippon Screen Mfg. Co., Ltd. | Smoothing method for binary-coded image data and apparatus therefor |
US4765732A (en) * | 1987-03-20 | 1988-08-23 | The Regents Of The University Of California | Hyperacuity testing instrument for evaluating visual function |
US5151787A (en) * | 1989-12-23 | 1992-09-29 | Samsung Electronics Co., Ltd. | Method and circuit for correcting image edge |
US5204910A (en) * | 1991-05-24 | 1993-04-20 | Motorola, Inc. | Method for detection of defects lacking distinct edges |
Non-Patent Citations (5)
Title |
---|
A simple method of segmentation with sub-pixel accuracy, Forte, P., IEE, 1989. *
Boundary description and measurement with sub-pixel/voxel accuracy, 1992. *
Boundary description and measurement with sub-pixel/voxel accuracy, 1992.
Precise localization of geometrically known image edges in noisy environment, Xu, C., Nov. 1990. *
Subpixel measurements using a moment-based edge operator, Lyvers, E. P., IEEE, vol. 11, no. 12, Dec. 1989. *
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5754690A (en) * | 1995-10-27 | 1998-05-19 | Xerox Corporation | Position sensitive detector based image conversion system capable of preserving subpixel information |
US5790699A (en) * | 1995-10-27 | 1998-08-04 | Xerox Corporation | Macrodetector based image conversion system |
EP0877231A1 (fr) * | 1997-05-09 | 1998-11-11 | Vishay S.A. | Dispositif de mesure de position et de déplacement sans contact |
FR2763122A1 (fr) * | 1997-05-09 | 1998-11-13 | Vishay Sa | Dispositif de mesure de position et de deplacement sans contact |
US6034765A (en) * | 1997-05-09 | 2000-03-07 | Vishay Sa | Contactless position and displacement measuring device |
US6704462B1 (en) * | 2000-07-31 | 2004-03-09 | Hewlett-Packard Development Company, L.P. | Scaling control for image scanners |
US7202859B1 (en) | 2002-08-09 | 2007-04-10 | Synaptics, Inc. | Capacitive sensing pattern |
US20040094717A1 (en) * | 2002-11-14 | 2004-05-20 | Griffin Dennis P. | Sensor having a plurality of active areas |
US6828559B2 (en) * | 2002-11-14 | 2004-12-07 | Delphi Technologies, Inc | Sensor having a plurality of active areas |
US20040239650A1 (en) * | 2003-06-02 | 2004-12-02 | Mackey Bob Lee | Sensor patterns for a capacitive sensing apparatus |
US7129935B2 (en) | 2003-06-02 | 2006-10-31 | Synaptics Incorporated | Sensor patterns for a capacitive sensing apparatus |
US10366506B2 (en) | 2014-11-07 | 2019-07-30 | Lamina Systems, Inc. | Hyperacuity system and methods for real time and analog detection and kinematic state tracking |
Also Published As
Publication number | Publication date |
---|---|
JPH07193678A (ja) | 1995-07-28 |
JP3881039B2 (ja) | 2007-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5717201A (en) | Double four-quadrant angle-position detector | |
US5754690A (en) | Position sensitive detector based image conversion system capable of preserving subpixel information | |
US4644406A (en) | Large scale contact type image reading unit using two-dimensional sensor array | |
US5619033A (en) | Layered solid state photodiode sensor array | |
US4558365A (en) | High-resolution high-sensitivity solid-state imaging sensor | |
US5790699A (en) | Macrodetector based image conversion system | |
US5561287A (en) | Dual photodetector for determining peak intensity of pixels in an array using a winner take all photodiode intensity circuit and a lateral effect transistor pad position circuit | |
US5541652A (en) | Hyperacuity sensing | |
US20030127647A1 (en) | Image sensor with performance enhancing structures | |
EP0710987A1 (en) | An image sensor | |
JPH09247533A (ja) | Flat-panel radiation imaging device having a patterned common electrode | |
KR20190079355A (ko) | X-ray detection device | |
US4764682A (en) | Photosensitive pixel sized and shaped to optimize packing density and eliminate optical cross-talk | |
US5578837A (en) | Integrating hyperacuity sensors and arrays thereof | |
US5629517A (en) | Sensor element array having overlapping detection zones | |
EP0747973B1 (en) | Sensor element | |
EP0523784A1 (en) | An image detector and a method of manufacturing such an image detector | |
US4910412A (en) | Light biased photoresponsive array | |
US5026980A (en) | Light biased photoresponsive array having low conductivity regions separating individual cells | |
CN102981682B (zh) | Image sensor and optical touch system having the same | |
Lemmi et al. | Active matrix of amorphous silicon multijunction color sensors for document imaging | |
JP4574988B2 (ja) | Focal plane detector | |
Vieira et al. | Optically addressed read–write device based on tandem heterostructure | |
CN116615807A (zh) | Detection substrate, noise reduction method therefor, and detection device | |
Hayase et al. | Full-Contact Type Linear Image Sensor by Amorphous Silicon |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: BANK ONE, NA, AS ADMINISTRATIVE AGENT, ILLINOIS Free format text: SECURITY INTEREST;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:013153/0001 Effective date: 20020621 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, AS COLLATERAL AGENT, TEXAS Free format text: SECURITY AGREEMENT;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:015134/0476 Effective date: 20030625 Owner name: JPMORGAN CHASE BANK, AS COLLATERAL AGENT,TEXAS Free format text: SECURITY AGREEMENT;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:015134/0476 Effective date: 20030625 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FPAY | Fee payment |
Year of fee payment: 12 |
|
AS | Assignment |
Owner name: XEROX CORPORATION, CONNECTICUT Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A. AS SUCCESSOR-IN-INTEREST ADMINISTRATIVE AGENT AND COLLATERAL AGENT TO JPMORGAN CHASE BANK;REEL/FRAME:066728/0193 Effective date: 20220822 |