CN215219712U - Sensing device using sensing pixels with partial spectral sensing area - Google Patents


Info

Publication number
CN215219712U
Authority
CN
China
Prior art keywords
sensing
light
region
pixels
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202121684312.0U
Other languages
Chinese (zh)
Inventor
周正三
傅同龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Egis Technology Inc
Original Assignee
Egis Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Egis Technology Inc filed Critical Egis Technology Inc
Application granted granted Critical
Publication of CN215219712U publication Critical patent/CN215219712U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F: DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F 9/00: Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F 9/30: Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements

Abstract

A sensing device includes at least a plurality of sensing pixels whose specific-spectrum sensing areas occupy different ratios of each pixel, so that spectral information of a detected light can be defined according to the sensing values the sensing pixels obtain by sensing the detected light, together with those different area ratios.

Description

Sensing device using sensing pixels with partial spectral sensing area
Technical Field
The present utility model relates to a sensing device, and more particularly to a sensing device using sensing pixels with partial spectrum sensing areas. The sensing values obtained by such pixels can yield spectral information for a specific spectrum, which in turn can serve as a basis for liveness determination (e.g., distinguishing real from fake fingers) or property determination.
Background
Today's mobile electronic devices (e.g., mobile phones, tablet computers, and notebook computers) are usually equipped with biometric systems, using fingerprint, facial, iris, and other technologies to protect personal data. As portable devices such as mobile phones and smart watches also take on mobile-payment functions, biometric identification of the user has become a standard feature. Meanwhile, the development of such devices trends toward full-screen (or ultra-narrow-bezel) designs, so the conventional capacitive fingerprint key can no longer be used, and new miniaturized optical imaging devices have been developed, some of which closely resemble a conventional camera module in having a Complementary Metal-Oxide-Semiconductor (CMOS) Image Sensor (CIS) and an optical lens module. The miniaturized optical imaging device is disposed under a screen (referred to as under-screen placement), particularly an Organic Light-Emitting Diode (OLED) screen, and can capture an image of an object pressed on the screen, particularly a fingerprint image, through part of the screen; this is referred to as under-screen fingerprint sensing (FOD).
In addition to correctly sensing a fingerprint, under-screen fingerprint sensing needs to determine whether the finger is real, to prevent an impostor from passing authentication with a fake fingerprint or a fake finger imitating another person. For example, a mold can be made from a 2D image or a 3D print and filled with various kinds of silicone and pigment to produce a fake finger, or another person's fingerprint can be copied onto a transparent or skin-colored film attached to the surface of a finger; a fake finger bearing such a transparent film is difficult to identify. In a conventional identification method, a 2D sensor array (consisting essentially of white pixels that detect the visible spectrum) has a number of color pixels distributed within it (with color filters, e.g., RGB filters, fabricated above them), and these color pixels detect the intensity of the light reflected from the finger so that a threshold on the light-signal intensity can be set to reject fake fingers. However, the intensity judgment is easily affected by changes in the ambient light field, which often shifts the intensity threshold and greatly degrades the accuracy of the determination.
In view of the above, the mechanism for determining whether a finger is real needs further improvement, so that fake fingers cannot pass fingerprint sensing.
SUMMARY OF THE UTILITY MODEL
Therefore, an object of the present utility model is to provide a sensing device using sensing pixels having partial spectrum sensing areas, which uses the area ratios of the partial regions of the sensing pixels, together with the sensing values the pixels obtain from a measured portion, as the basis for liveness or property determination, and which can define the spectral information of the measured portion corresponding to a specific spectrum.
To achieve the above object, the present utility model provides a sensing device comprising at least a plurality of sensing pixels, wherein the sensing pixels have specific-spectrum sensing areas with different ratios, so that spectral information of a detected light can be defined according to the sensing values the sensing pixels obtain by sensing the detected light, together with those different area ratios.
With the sensing device of the above embodiment, spectral information of the measured portion corresponding to a specific spectrum can be obtained from the area ratios of the partial regions of the sensing pixels and the sensing values obtained by sensing the measured portion. From this spectral information, determinations such as blood oxygen concentration, skin characteristics, or characteristics of other subcutaneous tissue can be made, without sacrificing the spectral information of the measured portion.
In order to make the above and other objects of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1A is a schematic diagram of a sensing pixel according to a preferred embodiment of the present invention.
Fig. 1B is a schematic diagram illustrating an application of a sensing device according to a preferred embodiment of the present invention.
FIG. 2 shows a planar layout of the sensing pixels of FIG. 1B.
Fig. 3 to 5 show three perspective views of the three sensing pixels of fig. 1B.
Reference numerals:
A11: effective area
A13: effective area
F: object
L1: first light
L2: second light
M1: first ratio
M2: third ratio
M3: fifth ratio
N1: second ratio
N2: fourth ratio
N3: sixth ratio
S1: first sensing value
S11, S12: received values
S2: second sensing value
S21, S22: received values
S3: third sensing value
S4: fourth sensing value
VIP: site (virtually identical position)
10: sensing pixel
11: light sensing element
11A: first portion
11B: second portion
12: first region
13: spectral separation element
13A: first light region
13B: second light region
14: second region
20: second sensing pixel
22: third region
24: fourth region
30: third sensing pixel
32: fifth region
34: sixth region
40: fourth sensing pixel
45: sensing substrate
50: signal processing unit
51: pixel group
52: pixel group
53: pixel group
60: database
70: optical-mechanical structure
80: display
100: sensing device
Detailed Description
Fig. 1A is a schematic diagram of a sensing pixel according to a preferred embodiment of the present utility model. As shown in fig. 1A, the present embodiment provides a sensing pixel 10 that includes at least a complete light sensing element 11 and a spectral separation element 13 disposed on one side of the light sensing element 11. The spectral separation element 13 has a first light region 13A and a second light region 13B, which process light of different spectral compositions. A first portion 11A and a second portion 11B of the light sensing element 11 sense a detected light from an object F through the first light region 13A and the second light region 13B, respectively, to obtain a combined signal. The first light region 13A and the first portion 11A form a first region 12 of the sensing pixel 10, and the second light region 13B and the second portion 11B form a second region 14. In this embodiment, the spectral separation element 13 is a filter, and the effective area A13 of the first light region 13A is smaller than the effective area A11 over which the light sensing element 11 can perform light sensing. In one example, the first light region 13A passes only a first light L1 of the detected light (having, e.g., the red spectrum) to the light sensing element 11, while the second light region 13B passes all of the detected light (e.g., the white spectrum). In another example, the first light region 13A passes the first light L1 (e.g., the red spectrum) and the second light region 13B passes a second light L2 of the detected light (e.g., the blue or green spectrum).
With this design, the sensing pixel 10 has a partial specific-spectrum sensing area and selectively filters the spectrum of the detected light at a specific ratio, which supports the sensing-device applications described below. In a practical design, the ratio of effective area A13 to effective area A11 can be adjusted to configure sensing pixels with different area ratios. It is understood that the spectral separation element 13 of this embodiment may be located on or above the light sensing element 11.
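As a minimal sketch, the output of such a pixel can be modeled as an area-weighted mix of the filtered and unfiltered intensities. The function and variable names below are illustrative, not taken from the utility model:

```python
# Hypothetical signal model for a partial-spectrum sensing pixel:
# a fraction m of the photodiode sits under a color filter (receiving
# intensity i_filtered) and a fraction n is unfiltered (receiving
# intensity i_full). Names are illustrative, not from the patent.

def pixel_output(m: float, n: float, i_filtered: float, i_full: float) -> float:
    """Combined sensing value of one pixel with a partial spectral filter."""
    assert 0.0 <= m <= 1.0 and 0.0 <= n <= 1.0 and m + n <= 1.0
    return m * i_filtered + n * i_full

# A pixel whose red-filter region covers 25% of the photodiode,
# with red intensity 120 and white intensity 480:
s1 = pixel_output(0.25, 0.75, 120.0, 480.0)  # -> 390.0
```

The same function with ratios 0.5/0.5 or 0.75/0.25 models the second and third sensing pixels introduced later.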
Fig. 1B is a schematic diagram illustrating an application of a sensing device according to a preferred embodiment of the present utility model. FIG. 2 shows a planar layout of the sensing pixels of FIG. 1B. As shown in fig. 1B and fig. 2, the sensing device 100 of this embodiment includes at least a plurality of sensing pixels arranged in an array, whose light sensing elements are formed on a sensing substrate 45. The sensing pixels include at least the sensing pixel 10 (which may be called a first sensing pixel) and may further include second, third, and fourth sensing pixels 20, 30, and 40, each having specific-spectrum sensing regions with different area ratios. The sensing device 100 senses a site VIP (a measured portion, or virtually identical position) of the object F, and the sensing pixel 10 obtains a first sensing value S1 based on a reflected light (the detected light) from the site VIP. In this embodiment, the sensing pixel 10 has a first region 12 with a first ratio M1 and a second region 14 with a second ratio N1; the first region 12 senses a first light L1 of the reflected light (a specific spectrum, such as the red (R), green (G), or blue (B) spectrum). In the non-limiting example of fig. 2, M1 is 25% and N1 is 75%, but in other examples M1 + N1 may be less than 100%. Similarly, the second sensing pixel 20 obtains a second sensing value S2 based on the reflected light from the site VIP; it has a third region 22 with a third ratio M2 and a fourth region 24 with a fourth ratio N2, the third region 22 sensing the first light L1 of the reflected light. The spectral information of the site VIP corresponding to the first light L1 (or the spectral information of the detected light) can then be defined according to at least M1, N1, M2, N2, S1, and S2.
The sensing device 100 may also optionally include a signal processing unit 50 and a database 60. The signal processing unit 50 is electrically connected to the database 60 and the sensing pixels, and can compare M1, N1, M2, N2, S1, and S2 against data pre-stored in the database 60 to determine the authenticity of the object F. In the non-limiting example of fig. 2, M2 is 50% and N2 is 50%, but in other examples M2 + N2 may be less than 100%.
The so-called virtually identical position VIP is illustrated as follows. For a fingerprint, the center-to-center distance (pitch) of fingerprint ridges is about 400 to 500 μm, while the pitch of the sensing pixels is about 10 to 20 μm, so two or three adjacent sensing pixels actually measure the same portion of the finger: the distance from the finger (object F) to the sensing pixels is about 1,000 μm, and since the lateral spacing of the pixels is small compared to that distance, two or three adjacent sensing pixels see almost the same position on the measured object. Virtually identical positions, representing the same location, have substantially the same depth information, and their corresponding spectral information is likewise the same.
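The geometry behind this can be checked with a small calculation. The pitch and distance values come from the text; using the angular difference between adjacent pixels' lines of sight is an illustrative way to quantify "almost the same position":

```python
import math

# Back-of-the-envelope check of the virtually-identical-position (VIP)
# argument: adjacent sensing pixels are ~10-20 um apart, while the
# finger sits ~1,000 um above them, so the viewing directions of
# neighboring pixels differ by less than one degree.

pixel_pitch_um = 15.0        # within the 10-20 um range given in the text
finger_distance_um = 1000.0  # finger-to-pixel distance from the text

angle_deg = math.degrees(math.atan(pixel_pitch_um / finger_distance_um))
# roughly 0.86 degrees between adjacent pixels' lines of sight
```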
In this example, the signal processing unit 50 can determine the authenticity of the object F from six parameters, namely M1, N1, S1, M2, N2, and S2, and can of course also define the spectral information of the site VIP. In a variation, the six parameters can be output to the electronic device equipped with the sensing device 100 (such as a mobile phone, tablet computer, or notebook computer), whose central processing unit then performs the processing of these and the following data. In that case, the electrical connection between the signal processing unit 50 and the database 60 is not an essential element of the sensing device 100.
Of course, the details of the above embodiments may be varied and extended to suit different applications and to improve the result of authenticity identification. This is the core spirit of the present application: sensing pixels that sense a specific spectrum with partial specific-spectrum sensing areas of different ratios can give rise to a wide variety of variations.
The third sensing pixel 30 obtains a third sensing value S3 based on the reflected light from the site VIP and has a fifth region 32 with a fifth ratio M3 and a sixth region 34 with a sixth ratio N3, the fifth region 32 sensing the first light L1 of the reflected light. Authenticity determination can thus be performed according to at least the nine parameters M1, N1, M2, N2, M3, N3, S1, S2, and S3, and the spectral information of the site VIP corresponding to the first light L1 can be defined. Alternatively, the signal processing unit 50 may compare the nine parameters with data pre-stored in the database 60 to determine whether the object F is real. In the non-limiting example of fig. 2, M3 is 75% and N3 is 25%, but in other examples M3 + N3 may be less than 100%. M1, N1, M2, N2, M3, and N3 are all between 0% and 100%.
In this example, the fourth sensing pixel 40 does not have a specific-spectrum sensing area as the sensing pixels 10, 20, and 30 do, so a fourth sensing value S4 representing the full-spectrum information of the site VIP can be obtained from the reflected light. The fourth sensing pixel 40 may provide a reference and may also sense other biological characteristics of the object F, such as fingerprints, blood-vessel images, and blood-oxygen-concentration images. In a practical design, a number of fourth sensing pixels 40 are interspersed with sensing pixels 10 and second sensing pixels 20 (and possibly also third sensing pixels 30); the signals sensed by the fourth sensing pixels 40 provide the biological features, while the sensing pixels 10 and 20 (and possibly 30) provide the data for spectral-information determination.
The first, third, and fifth regions 12, 22, and 32 sense the first light L1 from the site VIP to obtain received values S11, S21, and S31, respectively. The second, fourth, and sixth regions 14, 24, and 34 sense the second light L2 from the site VIP to obtain received values S12, S22, and S32, respectively. When the first light L1 is red, green, or blue light (e.g., the finger reflecting the illumination from the display screen), the second light L2 may be a pure color or a mixed light other than pure red/green/blue, so that L2 partially overlaps or does not overlap L1.
It should be noted that a biometric or fingerprint sensor may contain a plurality of sensing pixels 10, second sensing pixels 20, third sensing pixels 30, and fourth sensing pixels 40, which sense light of different spectra passing through a display 80 and an opto-mechanical structure 70 and are arranged in an array formed on the sensing substrate 45. The display 80 may be an LCD, an OLED display, a micro-LED display, or another existing or future display. The display 80 may provide the light that illuminates the finger (object F), although other light sources may additionally be provided. In another example, the display 80 is omitted, and a transparent cover plate covers the opto-mechanical structure 70 for the finger (object F) to touch.
The implementation of the opto-mechanical structure 70 is also not particularly limited. In one example, the opto-mechanical structure 70 includes a plurality of microlens structures and a light-shielding layer. In another example, it is a collimator structure without microlens structures.
In practical implementation, the sensing pixel 10 may be formed by a light sensing element and a spectrum separating element, wherein the area of the spectrum separating element divided by the area of the light sensing element is equal to 25%. The spectral separation element, for example a red/green/blue filter, only allows the red/green/blue light of the opto-mechanical structure 70 to enter the light-sensing element. The second sensing pixel 20 and the third sensing pixel 30 can be designed in the same way.
In the above embodiment, the first region 12 senses red/green/blue spectral values and the second region 14 senses white spectral values. However, the present utility model is not limited thereto; any arrangement that resolves individual spectral values from different mixed spectra can be used for the authenticity judgment. For example, the first region 12 may sense red spectral values and the second region 14 green spectral values, and so on. By providing spectrum sensing regions with different occupancy ratios, the downstream signal processing unit can therefore extract signals of different spectra for the applications described above.
Fig. 3 to 5 show perspective views of the three sensing pixels of fig. 1B. As shown in fig. 1B to fig. 5, the first light L1 is the red spectrum, the second light L2 is the white spectrum, and the sensing pixel 10, the second sensing pixel 20, and the third sensing pixel 30 include specific-spectrum sensing regions with area ratios of 25%, 50%, and 75%, respectively, as an example; the disclosure is not limited thereto. In other examples, each sensing pixel may include sensing regions with other suitable ratios, and the first and second lights may be spectra that do not overlap each other.
Referring to fig. 1B to 4, the received values S11 and S12 combine into the first sensing value S1, and the received values S21 and S22 combine into the second sensing value S2. Thus, in the pixel group 53 of fig. 2, the following Equations 1 and 2 hold:
S1 = M1·I1 + N1·I2 = 0.25·I1 + 0.75·I2 [Equation 1]
S2 = M2·I1 + N2·I2 = 0.50·I1 + 0.50·I2 [Equation 2]
Here I1 represents the intensity of the first light (red) and I2 the intensity of the second light (white spectrum). Since there are only two unknowns, I1 and I2, they can be solved from the two equations: provided M2 is not equal to M1 and N2 is not equal to N1, the intensity I1 of the first light L1 and the intensity I2 of the second light L2 are obtained from S1, S2, M1, M2, N1, and N2. The solving may be performed by the signal processing unit 50 or by the central processor of the electronic device. Alternatively, given the form of Equations 1 and 2, the signal processing unit 50 may compare S1 and S2 directly against a database 60 built by experimentally measuring real and fake fingers, determining the authenticity of the object F without first solving for I1 and I2; this provides another practical approach. Still alternatively, the signal processing unit 50 may obtain the spectral information of the site VIP from S1, S2, M1, M2, N1, and N2.
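A minimal sketch of this two-equation solve, assuming the 25%/75% and 50%/50% ratios of the embodiment. NumPy is used purely for illustration; the utility model does not prescribe any particular implementation:

```python
import numpy as np

# Solving Equations 1 and 2 for the two unknown intensities I1 (red)
# and I2 (white), given the area ratios and the two sensing values.
# Ratios follow the embodiment: M1=0.25, N1=0.75, M2=0.5, N2=0.5.

A = np.array([[0.25, 0.75],    # S1 = M1*I1 + N1*I2
              [0.50, 0.50]])   # S2 = M2*I1 + N2*I2
s = np.array([390.0, 300.0])   # measured sensing values (S1, S2)

i1, i2 = np.linalg.solve(A, s)  # unique solution when the rows are independent
# i1 -> 120.0 (red intensity), i2 -> 480.0 (white intensity)
```

The rows of `A` are independent exactly when the two pixels' ratios differ, matching the text's condition that M2 differs from M1.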
Referring to fig. 1B to 5, in the pixel groups 51 and 52 the sensing pixel 10, the second sensing pixel 20, and the third sensing pixel 30 are used, and the received values S31 and S32 combine into the third sensing value S3. Accordingly, Equations 1 and 2 above and Equation 3 below hold:
S3 = M3·I1 + N3·I2 = 0.75·I1 + 0.25·I2 [Equation 3]
In Equations 1 to 3 there are still only two unknowns, I1 and I2, so the signal processing unit 50 could solve them from any two of the equations; the third sensing value S3, however, makes the system over-determined, which helps suppress sensing error. The signal processing unit 50 can therefore use S1, S2, S3, M1, M2, M3, N1, N2, and N3 to find I1 and I2 in a way that minimizes error (e.g., the least-squares method). Alternatively, given the form of the three equations, the signal processing unit 50 may compare S1, S2, and S3 against a database 60 built by measuring real and fake fingers, determining the authenticity of the object F without first solving for the intensities I1 and I2; this provides yet another implementation. Still alternatively, the signal processing unit 50 may obtain the spectral information of the site VIP from S1, S2, S3, M1, M2, M3, N1, N2, and N3.
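The over-determined three-pixel case can be sketched with a least-squares fit. The noisy sensing values below are invented for illustration around the true (I1, I2) = (120, 480); only the 25/50/75 ratios come from the embodiment:

```python
import numpy as np

# Over-determined system: three pixels (ratios 25/75, 50/50, 75/25)
# observe the same two unknown intensities I1 and I2. Least squares
# averages out per-pixel sensing noise, as the text suggests.

A = np.array([[0.25, 0.75],   # Equation 1
              [0.50, 0.50],   # Equation 2
              [0.75, 0.25]])  # Equation 3
# Simulated noisy sensing values around the true (I1, I2) = (120, 480),
# whose noiseless values would be (390, 300, 210):
s = np.array([391.0, 299.0, 210.5])

(i1, i2), residuals, rank, _ = np.linalg.lstsq(A, s, rcond=None)
# i1 and i2 land close to 120 and 480 despite the injected noise
```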
When the sensing pixel 10 and the fourth sensing pixel 40 are used, the intensity I2 of the second light L2 is obtained directly, and the signal processing unit 50 can solve I1 and I2 from S1, S4, M1, and N1 according to Equation 1 above and Equation 4 below:
S4 = I2 [Equation 4]
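With a full-spectrum reference pixel, no matrix solve is needed: S4 gives I2 directly and Equation 1 is rearranged for I1. The function name is illustrative; the 25%/75% ratios follow the embodiment and the sensing values continue the worked example:

```python
# Solving with the fourth (full-spectrum) sensing pixel as reference:
# Equation 4 gives I2 = S4, then Equation 1 is rearranged for I1.
# Hypothetical helper; names are not from the patent.

def solve_with_reference(s1: float, s4: float,
                         m1: float = 0.25, n1: float = 0.75):
    i2 = s4                    # Equation 4: S4 = I2
    i1 = (s1 - n1 * i2) / m1   # Equation 1 rearranged for I1
    return i1, i2

i1, i2 = solve_with_reference(s1=390.0, s4=480.0)  # -> (120.0, 480.0)
```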
In the pixel group 51, the sensing pixels 10, second sensing pixels 20, and third sensing pixels 30 are arranged in a line. In the pixel group 52, they are arranged alternately, with three fourth sensing pixels 40 also interspersed. This shows that the sensing pixels of the present disclosure may have various arrangements; the arrangement is not particularly limited.
A practical numerical example follows. Assume the light emitted from the virtually identical site has intensity values (r, g, b) = (120, 160, 200), the first sensing pixel has a 25% red spectral sensing region and a 75% white region, and the second sensing pixel has 50% of each. With I1 and I2 representing the red and white intensity values, theoretically (I1, I2) = (120, 120+160+200) = (120, 480). In actual sensing, the first and second sensing pixels obtain (S1, S2) = (390, 300). From Equation 1 and Equation 2 one solves (I1, I2) = (120, 480), matching the theoretical value. Alternatively, the following simultaneous equations in the unknowns (r, g, b) may be formed from (S1, S2) = (390, 300):
r + 0.75·g + 0.75·b = 390 [Equation 5]
r + 0.50·g + 0.50·b = 300 [Equation 6]
Taking [Equation 5] × 2 minus [Equation 6] × 3 eliminates g and b and gives r = 120, from which g + b = 360, r + g + b = 480, r/(g+b) = 1/3, and r/(r+g+b) = 1/4. Authenticity can thus be judged from the range of r/(g+b) or r/(r+g+b), or by comparison with the database. Moreover, since both sensing pixels contain white-light components, those components can be compensated back into the white-light sensing value of a complete sensing pixel to serve, for example, as a fingerprint sensing value, avoiding the data loss that would occur if these two pixels happened to sit exactly on a fingerprint feature point. If the third sensing pixel is also adopted, a better estimate of the specific optical-signal intensity can be obtained by numerical solution.
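The worked example above can be reproduced step by step; all values come from the text:

```python
# Reproducing the numerical example: true emitted intensities
# (r, g, b) = (120, 160, 200), and the two ratio features used
# for real/fake discrimination.

r, g, b = 120.0, 160.0, 200.0
s1 = r + 0.75 * (g + b)   # Equation 5 left-hand side -> 390
s2 = r + 0.50 * (g + b)   # Equation 6 left-hand side -> 300

# [Equation 5] * 2 minus [Equation 6] * 3 leaves -r on the left:
r_recovered = -(2 * s1 - 3 * s2)           # -> 120
gb = (s1 - r_recovered) / 0.75             # g + b -> 360
ratio1 = r_recovered / gb                  # r/(g+b)     -> 1/3
ratio2 = r_recovered / (r_recovered + gb)  # r/(r+g+b)   -> 1/4
```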
It can be understood that when the first sensing pixel has a 25% red and 75% green spectral sensing area, and the second sensing pixel has 50% of each, (r, g) and r/g can be solved from the simultaneous equations as the basis for authenticity determination. Other configurations follow by analogy.
In applying the sensing device, two phenomena of a real finger can be exploited. First, the micro blood vessels in a human finger are unevenly distributed, so the finger color is unevenly distributed in space, as can be seen by normal visual observation. Second, when the finger begins to press a surface (such as a phone screen over an under-screen optical fingerprint sensor), the micro blood vessels in the finger are compressed and blood flow is blocked, so the skin color changes further, producing a color change in the time domain or the spatial domain. Using either or both phenomena, the characteristics of a real finger can be identified and fake-finger attacks can be rejected.
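A hypothetical sketch of the time-domain check just described: tracking the recovered red ratio over successive frames and flagging a finger as live if the ratio shifts as capillaries are compressed. The threshold and frame values are invented for illustration; the patent does not specify a decision rule:

```python
# Hypothetical liveness check based on the temporal color change the
# text describes: pressing a live finger compresses its capillaries,
# so r/(g+b) drifts across frames, while a silicone fake stays nearly
# constant. The 0.05 threshold is illustrative, not from the patent.

def looks_live(ratio_frames: list[float], min_shift: float = 0.05) -> bool:
    """True if the red ratio drifts by more than min_shift across frames."""
    return max(ratio_frames) - min(ratio_frames) > min_shift

live = looks_live([0.33, 0.30, 0.26])       # drifting ratio: live finger
fake = looks_live([0.33, 0.33, 0.32])       # nearly static ratio: suspect
```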
Through the above embodiments, the spectral information of the measured portion corresponding to a specific spectrum is defined according to the area ratios of the partial regions of the sensing pixels and the sensing values obtained by sensing the measured portion. A sensing device using sensing pixels whose spectrum sensing regions have different occupancy ratios can thus define the spectral information of the measured portion as a basis for liveness determination, without sacrificing that spectral information.
The embodiments presented in this detailed description serve only to illustrate the technical content of the present utility model, which is not narrowly limited to them; various modifications can be made without departing from its spirit and the scope of the claims.

Claims (12)

1. A sensing device, comprising at least a plurality of sensing pixels, wherein the sensing pixels have specific-spectrum sensing areas with different ratios, so that spectral information of a detected light can be defined according to a plurality of sensing values the sensing pixels obtain by sensing the detected light, together with the different-ratio specific-spectrum sensing areas.
2. The sensing device of claim 1, wherein the sensing pixel comprises:
a first sensing pixel, obtaining a first sensing value S1 based on the detected light from a portion of an object, and having a first region of a first ratio M1 and a second region of a second ratio N1, the first region sensing a first light of the detected light; and
a second sensing pixel, obtaining a second sensing value S2 based on the detected light, and having a third region of a third ratio M2 and a fourth region of a fourth ratio N2, the third region sensing the first light of the detected light, so that spectral information of the portion corresponding to the first light can be defined according to at least M1, N1, M2, N2, S1, and S2.
3. The sensing device as claimed in claim 2, wherein the second region and the fourth region are for sensing a second light of the site.
4. The sensing device of claim 2, wherein the sensing pixels further comprise a third sensing pixel obtaining a third sensing value S3 based on the detected light and having a fifth region of a fifth ratio M3 and a sixth region of a sixth ratio N3, the fifth region sensing the first light of the detected light, so that the spectral information of the portion corresponding to the first light can be defined according to at least M1, N1, M2, N2, M3, N3, S1, S2, and S3.
5. The sensing device as claimed in claim 4, wherein the second, fourth and sixth regions are for sensing the second light of the detected light.
6. The sensing device as claimed in claim 5, further comprising a signal processing unit for comparing M1, N1, M2, N2, M3, N3, S1, S2 and S3 with data pre-stored in a database to determine whether the object is true or false.
7. The sensing device of claim 4, wherein the sensing pixels further comprise a fourth sensing pixel for obtaining a fourth sensing value S4 representing full-spectrum information of the portion based on the detected light.
8. The sensing device of claim 7, wherein the second region is configured to sense a second light from the site.
9. The sensing device of claim 3, 5 or 8, wherein the second light partially overlaps the first light.
10. The sensing device of claim 3, 5 or 8, wherein the second light is of a white light spectrum and the first light is of a red, green or blue light spectrum.
11. The sensing device of claim 3, 5 or 8, wherein the second light does not overlap the first light.
12. The sensing device as claimed in claim 2, further comprising a signal processing unit for determining whether the object is authentic by comparing M1, N1, M2, N2, S1 and S2 with data pre-stored in a database.
CN202121684312.0U 2020-09-08 2021-07-23 Sensing device using sensing pixels with partial spectral sensing area Active CN215219712U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063075472P 2020-09-08 2020-09-08
US63/075,472 2020-09-08

Publications (1)

Publication Number Publication Date
CN215219712U true CN215219712U (en) 2021-12-17

Family

ID=77882275

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202121684312.0U Active CN215219712U (en) 2020-09-08 2021-07-23 Sensing device using sensing pixels with partial spectral sensing area
CN202110839966.4A Pending CN113469131A (en) 2020-09-08 2021-07-23 Sensing device using sensing pixels with partial spectral sensing area

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110839966.4A Pending CN113469131A (en) 2020-09-08 2021-07-23 Sensing device using sensing pixels with partial spectral sensing area

Country Status (2)

Country Link
CN (2) CN215219712U (en)
TW (2) TWM618757U (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8165355B2 (en) * 2006-09-11 2012-04-24 Validity Sensors, Inc. Method and apparatus for fingerprint motion tracking using an in-line array for use in navigation applications
EP1977370A4 (en) * 2006-01-23 2011-02-23 Digimarc Corp Methods, systems, and subcombinations useful with physical articles
US8391568B2 (en) * 2008-11-10 2013-03-05 Validity Sensors, Inc. System and method for improved scanning of fingerprint edges
WO2013071312A1 (en) * 2011-11-12 2013-05-16 Cross Match Technologies, Inc. Ambient light illumination for non-imaging contact sensors
TWI658410B (en) * 2016-09-07 2019-05-01 李美燕 Optical imaging system with variable light field for biometrics application
TWI689742B (en) * 2018-12-06 2020-04-01 財團法人工業技術研究院 Method and device for detecting spot position
CN109690567B (en) * 2018-12-14 2020-10-02 深圳市汇顶科技股份有限公司 Fingerprint identification device and electronic equipment
CN111052142B (en) * 2019-09-06 2023-09-26 深圳市汇顶科技股份有限公司 Fingerprint identification device and electronic equipment

Also Published As

Publication number Publication date
TW202211669A (en) 2022-03-16
CN113469131A (en) 2021-10-01
TWM618757U (en) 2021-10-21
TWI773453B (en) 2022-08-01

Similar Documents

Publication Publication Date Title
CN109643379B (en) Fingerprint identification method and device and electronic equipment
WO2020124511A1 (en) Fingerprint recognition method, fingerprint recognition device and electronic apparatus
CN108496180B (en) Optical fingerprint sensor under display
CN111448570B (en) Optically sensing a fingerprint or other pattern on or near a display screen with an optical detector integrated into the display screen
US11288483B2 (en) Fingerprint recognition device, fingerprint recognition method, and display device
US10282582B2 (en) Finger biometric sensor for generating three dimensional fingerprint ridge data and related methods
CN109690567B (en) Fingerprint identification device and electronic equipment
CN107430681A (en) Include the electronic equipment and correlation technique of the pinhole array mask above optical image sensor
CN109196525A (en) Refuse the anti-spoofing sensing of false fingerprint pattern in optical sensor module under the screen for shielding upper fingerprint sensing
JP3231956U (en) Integrated spectrum sensing device for judging the real finger
CN104077748B (en) Image correction apparatus, image correction method, and biometric authentication apparatus
CN109074475A (en) Electronic equipment and correlation technique including the pinhole array exposure mask laterally adjacent above optical image sensor and with light source
US11176347B2 (en) Fingerprint-on-display recognition
CN111837128A (en) Fingerprint anti-counterfeiting method, fingerprint identification device and electronic equipment
CN112528953A (en) Fingerprint identification device, electronic equipment and fingerprint identification method
CN106778674A (en) Fingerprint identification device and fingerprint identification method
CN111353405A (en) Fingerprint identification device, fingerprint identification system and electronic equipment
US11741745B2 (en) Multicolor illumination in an optical fingerprint sensor for anti-spoofing
CN215219712U (en) Sensing device using sensing pixels with partial spectral sensing area
CN211529170U (en) Fingerprint identification device and electronic equipment
WO2018185992A1 (en) Biometric authentication device and method
CN206627976U (en) Fingerprint identification device and electronic equipment
KR20200137777A (en) Fingerprint Acquisition Sensor for Mobile Device, the Mobile Device Comprising the Sensor and Fingerprint Image Acqusition Method thereof
CN110543821A (en) Grain recognition device and operation method thereof
EP4172859A1 (en) Electronic device comprising a biometric optical sensor module

Legal Events

Date Code Title Description
GR01 Patent grant