WO2018192308A1 - Method and device for iris recognition - Google Patents
Method and device for iris recognition
- Publication number
- WO2018192308A1 (PCT/CN2018/077876)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- iris
- information
- unit
- instruction
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
Definitions
- the present invention relates to the field of electronic device control, and in particular, to a method and device for iris recognition.
- Touch display panels have been widely used in devices that require human-computer interaction interfaces, such as the operating screens of industrial computers, tablet computers, and the touch screens of smart phones.
- VR/AR (virtual reality or augmented reality)
- Among identity recognition methods, none conforms to the user's sensory experience better than the biometric recognition technology of mobile devices; fingerprint recognition, for example, is already a mature technology.
- Identity recognition usually associates the user's biometric information with operation instructions, so that operations can be triggered through biometric identification; iris recognition is an important example.
- the iris is the texture of the muscle fiber tissue of the colored part of the human eyeball.
- Iris recognition captures the iris feature information of the eyeball and identifies it, infers the user's identity and needs, and responds accordingly, so that the device is controlled by recognizing iris feature information.
- In existing solutions, an infrared camera outside the display screen of the mobile device is generally used to capture changes in eyeball features. Because such a camera is usually placed independently at the edge of the device (for example, at the top of a mobile phone), in wearable electronic products it deviates from the optical axis of eyeball imaging. The existing device structure therefore cannot accurately capture the user's iris information, leading to problems such as low recognition efficiency and poor recognition accuracy.
- The technical problem to be solved by the present invention is to provide a technical solution for iris recognition that addresses the following situation: because the camera outside the display screen of a wearable or mobile device is offset from the optical axis, the iris information of the user's eyeball cannot be captured accurately and in time, the iris feature information cannot be captured completely, the recognition efficiency and accuracy are low, and the user's sensory experience is poor.
- The technical solution adopted by the present invention is a method for iris recognition. The method is applied to an iris recognition device; the device includes a display unit, a sensing unit, and an infrared light source. The display unit is provided with an iris recognition area, the sensing unit is located below the iris recognition area, and the sensing unit includes an infrared sensing layer; the infrared light source is used to emit infrared light, and the infrared sensing layer is used to sense the infrared light signal reflected by the user's iris and capture the iris information of the user. The method comprises the steps of:
- presetting operation configuration information, where the operation configuration information includes a correspondence between iris information and operation instructions;
- capturing the iris information of the user on the iris recognition area, and determining whether the captured iris information matches the preset iris information in the operation configuration information; if so, executing the operation instruction corresponding to the iris information, otherwise not executing the operation instruction.
- Further, the step of "presetting the operation configuration information" includes:
- displaying an operation instruction identifier list, where the operation instruction identifier list includes identifiers corresponding to one or more operation instructions, and each operation instruction identifier corresponds to an operation instruction;
- receiving the user's selection instruction for an operation instruction identifier, establishing a correspondence between the operation instruction corresponding to the selected operation instruction identifier and the captured iris information, and saving the operation configuration information.
- Further, the operation instruction is a screen switching instruction, and the step of "determining whether the captured iris information matches the preset iris information in the operation configuration information, and if so, executing the operation instruction corresponding to the iris information, otherwise not executing the operation instruction" includes: determining whether the captured iris information matches the iris information corresponding to the screen switching instruction, and if so, switching the screen, otherwise not switching the screen.
- Further, the operation instruction is a payment instruction, and the above determining step includes: determining whether the captured iris information matches the iris information corresponding to the payment instruction, and if so, executing the payment instruction so that the payment succeeds, otherwise not executing the payment instruction so that the payment fails.
- Further, the operation instruction is a user identity information registration instruction, and the above determining step includes: determining whether the captured iris information matches the iris information corresponding to the user identity information registration instruction, and if so, executing the registration instruction so that the registration succeeds, otherwise not executing it so that the registration fails.
- the sensing unit comprises a TFT image sensing array film
- the infrared sensing layer comprises an infrared photodiode or an infrared photosensitive phototransistor.
- Further, the step of "determining whether the captured iris information matches the preset iris information in the operation configuration information" specifically includes:
- calculating a feature value from the captured iris information and comparing it with the feature value of the preset iris information in the operation configuration information; when the error is less than a preset value, the two are determined to match, otherwise they are determined not to match.
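- As a concrete illustration of this matching rule, the sketch below compares the feature value computed from a captured iris image against each preset feature value and reports a match only when the error is below a threshold. This is a minimal Python sketch under assumptions of ours: the binary feature vectors, the Hamming-style error measure, and the threshold value are illustrative and are not prescribed by the description.

```python
import numpy as np

MATCH_THRESHOLD = 0.32  # hypothetical "preset value" for the allowed error

def feature_error(captured: np.ndarray, preset: np.ndarray) -> float:
    """Fraction of differing bits between two binary iris feature vectors."""
    return float(np.mean(captured != preset))

def find_matching_instruction(captured: np.ndarray, operation_config: dict):
    """Return the operation instruction whose preset feature value matches the capture, else None."""
    for instruction, preset_feature in operation_config.items():
        if feature_error(captured, preset_feature) < MATCH_THRESHOLD:
            return instruction
    return None
```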
- Further, the method includes the step of issuing prompt information when it is determined that no preset iris information in the operation configuration information matches the captured iris information.
- the prompt information includes one or more of voice prompt information, image prompt information, light prompt information, and video prompt information.
- Further, the display unit comprises an AMOLED display or an LCD (liquid crystal display).
- Further, when an LCD is used, a backlight unit is disposed under the sensing unit, and the sensing unit is disposed between the backlight unit and the LCD.
- the iris recognition area includes a plurality of iris recognition sub-areas, and a sensing unit is disposed under each iris recognition sub-area.
- the device further includes a sensing unit control circuit, the method further comprising:
- receiving a user's activation instruction for an iris recognition sub-area, whereupon the sensing unit control circuit turns on the sensing unit below that sub-area; and receiving a user's closing instruction for an iris recognition sub-area, whereupon the sensing unit control circuit turns off the sensing unit below that sub-area.
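- The sub-area switching behavior just described can be sketched as a small state holder. The class and method names below are hypothetical; the description only specifies that an activation instruction turns on the sensing unit under a sub-area and a closing instruction turns it off.

```python
class SensingUnitControlCircuit:
    """Illustrative model of the control circuit that switches sensing units per sub-area."""

    def __init__(self, sub_area_ids):
        # Each iris recognition sub-area has its own sensing unit, initially off.
        self._enabled = {sub_area: False for sub_area in sub_area_ids}

    def handle_instruction(self, sub_area, instruction):
        # "activate" turns on the sensing unit below the sub-area; "close" turns it off.
        if instruction == "activate":
            self._enabled[sub_area] = True
        elif instruction == "close":
            self._enabled[sub_area] = False

    def is_enabled(self, sub_area):
        return self._enabled[sub_area]
```

For example, `SensingUnitControlCircuit(["upper", "lower"]).handle_instruction("upper", "activate")` would enable capture only under the upper sub-area.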
- the inventors also provide an apparatus for iris recognition, the apparatus comprising a display unit, a sensing unit and an infrared light source; the display unit is provided with an iris recognition area, and the sensing unit is located below the iris recognition area
- the sensing unit includes an infrared sensing layer; the infrared light source is configured to emit infrared light, the infrared sensing layer is configured to sense an infrared light signal reflected by a user's iris, and capture iris information of the user;
- the device further includes an operation information setting unit, a determining unit, and a processing unit;
- the operation information setting unit is configured to preset operation configuration information, where the operation configuration information includes a correspondence relationship between the iris information and the operation instruction;
- the sensing unit is configured to capture the iris information of the user on the iris recognition area; the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information, and if so, the processing unit is configured to execute the operation instruction corresponding to the iris information, otherwise the processing unit does not execute the operation instruction.
- the device includes an operation instruction receiving unit, and the “operation information setting unit is configured to preset operation configuration information” includes:
- the operation instruction receiving unit is configured to receive a user setting command, and the display unit is configured to display an iris recognition area;
- the sensing unit is configured to capture and save the iris information of the user
- the display unit is configured to display an operation instruction identifier list;
- the operation instruction identifier list includes an identifier corresponding to one or more operation instructions, and each operation instruction identifier corresponds to an operation instruction;
- the operation instruction receiving unit is configured to receive a selection instruction of the operation instruction identifier by the user, and the processing unit is configured to establish a correspondence between the operation instruction corresponding to the selected operation instruction identifier and the captured iris information, and save the operation configuration information.
- Further, the operation instruction is a screen switching instruction, and "the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information, and if so, the processing unit is configured to execute the operation instruction corresponding to the iris information, otherwise the processing unit does not execute the operation instruction" includes:
- the determining unit is configured to determine whether the iris information of the captured user matches the iris information corresponding to the screen switching instruction, and if yes, the processing unit is configured to switch the screen, otherwise the processing unit does not switch the screen.
- Further, the operation instruction is a payment instruction, and "the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information, and if so, the processing unit is configured to execute the operation instruction corresponding to the iris information, otherwise the processing unit does not execute the operation instruction" includes:
- the determining unit is configured to determine whether the captured iris information matches the iris information corresponding to the payment instruction, and if yes, the processing unit executes the payment instruction, and the payment is successful, otherwise the processing unit does not execute the payment instruction, and the payment fails.
- Further, the operation instruction is a user identity information registration instruction, and "the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information, and if so, the processing unit is configured to execute the operation instruction corresponding to the iris information, otherwise the processing unit does not execute the operation instruction" includes:
- the determining unit is configured to determine whether the captured iris information matches the iris information corresponding to the user identity information registration instruction; if so, the processing unit executes the user identity information registration instruction and the login succeeds, otherwise the processing unit does not execute the user identity information login instruction and the login fails.
- the sensing unit comprises a TFT image sensing array film
- the infrared sensing layer comprises an array formed by an infrared photodiode or an infrared photosensitive transistor.
- the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information, and specifically includes:
- the determining unit is configured to calculate the feature value according to the captured iris information, and compare with the feature value of the iris information preset in the operation configuration information; when the error is less than the preset value, it is determined to be matched, otherwise the determination is not matched.
- The processing unit is further configured to issue prompt information when the determining unit determines that no preset iris information in the operation configuration information matches the captured iris information.
- the prompt information includes one or more of voice prompt information, image prompt information, light prompt information, and video prompt information.
- Further, the display unit comprises an AMOLED display or an LCD (liquid crystal display).
- Further, when an LCD is used, a backlight unit is disposed under the sensing unit, and the sensing unit is disposed between the backlight unit and the LCD.
- the iris recognition area includes a plurality of iris recognition sub-areas, and a sensing unit is disposed under each iris recognition sub-area.
- Further, the device includes a sensing unit control circuit and an operation instruction receiving unit; the operation instruction receiving unit is configured to receive a user's activation instruction for an iris recognition sub-area, and the sensing unit control circuit is configured to turn on the sensing unit below that sub-area; the operation instruction receiving unit is also configured to receive a user's closing instruction for an iris recognition sub-area, and the sensing unit control circuit is configured to turn off the sensing unit below that sub-area.
- The beneficial effect of the above technical solution is that, by providing a sensing unit under the iris recognition area of the display unit, the projection of the user's iris through the optical device falls on the iris recognition area, and the center of the sensing unit can be placed at, or close to, the optical axis of eyeball imaging. Compared with a structure in which the camera is disposed at an edge position independent of the display screen, the invention can capture the user's iris feature information in time, compare it with the preset iris information, and execute the operation instruction corresponding to that iris information, which effectively improves the accuracy of iris recognition and enhances the user experience.
- In addition, the sensing unit is disposed under the display unit, which effectively reduces the overall thickness of the device compared with a structure in which the camera protrudes independently from the display area. This makes the wearable or mobile device thinner and lighter and better suited to flexible wearable or mobile devices, meeting market demand.
- FIG. 1 is a flowchart of a method for iris recognition according to an embodiment of the present invention.
- FIG. 2 is a flowchart of a method for iris recognition according to another embodiment of the present invention.
- FIG. 3 is a schematic diagram of an apparatus for iris recognition according to an embodiment of the present invention.
- FIG. 4 is a schematic diagram of an apparatus for iris recognition according to another embodiment of the present invention.
- FIG. 5 is a schematic diagram of an application scenario of an apparatus for iris recognition according to an embodiment of the present invention.
- FIG. 6 is a schematic diagram of a conventional sensing unit.
- FIG. 7 is a schematic diagram of a sensing unit according to an embodiment of the present invention.
- FIG. 8 is a schematic diagram of a sensing unit according to another embodiment of the present invention.
- 104: operation information setting unit
- 105: operation instruction receiving unit
- FIG. 1 is a flowchart of a method for iris recognition according to an embodiment of the present invention.
- The method is applied to an iris recognition device, which is an electronic device having a display screen or a touch display screen, such as a smart mobile device (a mobile phone, a tablet computer, or a personal digital assistant), a personal computer, an industrial computer, or the like.
- The device can also be combined with an optical imaging device disposed between the display unit and the user's eyes (i.e., above the display screen). As shown in FIG. 5, the user's eyeball is first projected through the optical imaging device, and the resulting projection falls within the iris recognition area on the display unit, so that the iris information of the user's eye can be captured by the sensing unit below the iris recognition area.
- the effect of simulating a VR/AR device can be achieved by the cooperation between the optical imaging device and the display unit.
- the device comprises a display unit, a sensing unit and an infrared light source; the display unit is provided with an iris recognition area, the sensing unit is located below the iris recognition area, and the sensing unit comprises an infrared sensing layer;
- the infrared light source is used to emit infrared light, and the infrared sensing layer is used to sense the infrared light signal reflected by the user's iris and capture the iris information of the user.
- the display unit includes an AMOLED display or an LCD liquid crystal display; in other embodiments, the display unit may also be other electronic components having a display function.
- the infrared light source is an electronic device having an infrared light emitting function, such as an infrared LED device. The method includes the following steps:
- the process proceeds to step S101 to preset operation configuration information, where the operation configuration information includes a correspondence relationship between the iris information and the operation instruction.
- the operation instruction includes one or more of a text operation instruction, an image operation instruction, a video operation instruction, and an application operation instruction.
- the text operation instruction includes a selected text instruction, a delete text instruction, a copy text instruction, and the like;
- the image operation instruction includes a selected image instruction, a copy image instruction, a cut image instruction, an image deletion instruction, a switching image screen, and the like;
- the video operation instructions include capturing, pausing, saving, deleting, fast-forwarding, rewinding, zooming, and adjusting the volume of the video;
- the application operation instructions include starting, deleting, selecting, moving, and so on of a software application (such as a mobile phone app).
- The iris information in the operation configuration information is iris information that the user has entered and stored in advance; each piece of iris information can be associated with a plurality of operation instructions, so that once the captured iris information is authenticated, the user can perform multiple operations on the device.
- the operation configuration information may be stored in a storage unit of the device, such as a memory of the mobile phone or a hard disk of the computer, or may be stored in a storage unit of the server.
- When the operation configuration information needs to be acquired, the device only needs to establish a communication connection with the server and then obtain the previously stored operation configuration information from it; the communication connection may be wired or wireless.
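- The operation configuration information itself is just a correspondence between stored iris information and operation instructions, held locally or retrieved from a server once a communication connection exists. The sketch below illustrates that storage choice; the file name, the JSON format, and the server-fetch callback are assumptions made for the example.

```python
import json
from pathlib import Path

LOCAL_CONFIG_PATH = Path("operation_config.json")  # hypothetical local storage location

def load_operation_config(fetch_from_server=None):
    """Return the mapping {iris_template_id: [operation_instruction, ...]}.

    Uses the locally stored configuration if present; otherwise obtains the previously
    stored configuration from the server over a wired or wireless connection.
    """
    if LOCAL_CONFIG_PATH.exists():
        return json.loads(LOCAL_CONFIG_PATH.read_text())
    if fetch_from_server is not None:
        config = fetch_from_server()                      # e.g. an HTTPS request to the server
        LOCAL_CONFIG_PATH.write_text(json.dumps(config))  # cache for later use
        return config
    return {}
```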
- The process then proceeds to step S102 to capture the iris information of the user on the iris recognition area. The coverage of the sensing unit is adapted to the size of the display unit; in this embodiment the sensing unit is rectangular and centered on the display unit, so that it does not deviate from the optical axis of eyeball imaging. This ensures that, as long as the user's eyes face the display unit, the sensing unit can acquire the user's iris information accurately and quickly regardless of how the user's eyes move.
- The process proceeds to step S103 to determine whether the captured iris information matches the preset iris information in the operation configuration information. If so, the process proceeds to step S104 to execute the operation instruction corresponding to the iris information; otherwise it proceeds to step S105 and the operation instruction is not executed.
- The iris information comparison can be implemented with an iris feature recognition algorithm, which can be stored in the storage unit of the device. When the infrared sensing layer of the sensing unit acquires iris information, the processor of the device calls the iris feature recognition algorithm in the storage unit to compare the acquired iris information with the iris information in the preset operation configuration information and determine whether the two match. The iris feature recognition algorithm includes steps such as preprocessing of the iris feature information, feature extraction, feature matching, and iris recognition; it can be implemented by various algorithms, which are mature existing technologies already applied in many fields and are not described in detail here.
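- The comparison steps listed above can be arranged as a short pipeline: preprocess the infrared frame, extract a feature representation, then match against each preset template. The sketch below is only an assumed skeleton; the placeholder preprocessing and feature extraction stand in for whichever mature iris-recognition algorithm the device actually uses.

```python
import numpy as np

MATCH_THRESHOLD = 0.32  # hypothetical error bound ("preset value")

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Placeholder preprocessing: scale pixel values to [0, 1] (real systems segment and unwrap the iris)."""
    span = float(frame.max() - frame.min()) or 1.0
    return (frame - frame.min()) / span

def extract_features(image: np.ndarray) -> np.ndarray:
    """Placeholder feature extraction: binarize against the mean (real systems use texture encoders)."""
    return (image > image.mean()).astype(np.uint8).ravel()

def recognize(raw_frame: np.ndarray, operation_config: dict):
    """Preprocess, extract features, and match against each preset iris template."""
    features = extract_features(preprocess(raw_frame))
    for instruction, preset in operation_config.items():
        if np.mean(features != preset) < MATCH_THRESHOLD:  # error below the preset value
            return instruction                             # S104: execute this instruction
    return None                                            # S105: no instruction is executed
```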
- FIG. 2 is a flowchart of a method for iris recognition according to another embodiment of the present invention.
- the step of “presetting the operation configuration information” includes:
- The process proceeds to step S201 to receive a user setting instruction and display the iris recognition area.
- the setting instruction can be triggered by the user clicking a button in the setting column on the screen, and after receiving the setting instruction, the device will display the iris recognition area, so that the user can input the iris information.
- displaying the iris recognition area may include: increasing the brightness of the iris recognition area or displaying a prompt input box on the iris recognition area.
- Before receiving the user setting instruction, the method further includes receiving user account information, where the account information includes a user ID and a password.
- The user needs to input the correct user ID and password by means of voice control, eye control, or key/password control, and the setting instruction can be triggered only after the user account is logged in, which improves the security of setting the operation configuration information.
- The process proceeds to step S202 to capture and save the iris information of the user.
- the collected iris information is preset iris information, which can be stored in the storage unit.
- The step of "capturing the iris information of the user and saving it" includes: determining whether the iris information captured during the user setting process has already been stored in the storage unit; if so, the user is informed that the iris information has already been entered and prompt information is issued to remind the user; if not, the iris information is saved to the storage unit. This effectively avoids repeated entry of iris information.
- The process proceeds to step S203 to display an operation instruction identifier list, receive the user's selection instruction for an operation instruction identifier, establish a correspondence between the operation instruction corresponding to the selected identifier and the captured iris information, and save the operation configuration information.
- the operation instruction identifier list includes identifiers corresponding to one or more operation instructions, and each operation instruction identifier corresponds to an operation instruction.
- the operation instruction identifier can be displayed in the form of text or picture, and the selection instruction can be triggered by the user clicking a check, double clicking, or the like.
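- Steps S201 to S203 form a small interactive setup flow: show the iris recognition area, capture and store the iris, show the instruction identifier list, and bind the selected instruction to the stored iris. The sketch below mirrors that order; the `device` and `storage` objects and their method names are hypothetical stand-ins, not an API defined by the patent.

```python
def configure_operation(device, storage):
    """Sketch of steps S201-S203 for building the operation configuration information."""
    device.display_iris_recognition_area()                 # S201: respond to the user setting instruction
    iris_template = device.capture_iris()                  # S202: capture the user's iris information
    if storage.contains(iris_template):
        device.prompt("Iris information already entered")  # avoid repeated entry
    else:
        storage.save_template(iris_template)
    identifiers = device.show_instruction_identifiers()    # S203: display the identifier list
    chosen = device.wait_for_selection(identifiers)        # user clicks/checks/double-clicks an identifier
    storage.save_mapping(iris_template, chosen)            # save the correspondence into the configuration
```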
- the sensing unit includes a TFT image sensing array film.
- The main structure providing the TFT switching function is a metal oxide semiconductor field-effect transistor (MOSFET), and well-known semiconductor layer materials mainly include amorphous silicon, polycrystalline silicon, indium gallium zinc oxide (IGZO), organic compounds mixed with carbon nanomaterials, and so on.
- Since the structure of the light-sensing diode can also be prepared from such semiconductor materials, and its production equipment is compatible with that of TFT arrays, TFT photodetecting diodes have in recent years been produced by TFT array preparation methods.
- the TFT image sensing array film described in this embodiment is the above-mentioned TFT photodetecting diode (such as the photo sensing diode region in FIG. 6 ).
- the production process of the TFT image sensing array film is different from that of the display panel TFT in that the pixel opening area of the display panel is changed to the light sensing area in the production process.
- the TFT can be prepared by using a thin glass substrate or a high temperature resistant plastic material as described in US Pat. No. 6,943,070 B2.
- The sensing unit shown in FIG. 6 is susceptible to reflection or refraction of visible light from ambient light or from the display pixels, causing optical interference that seriously degrades the signal-to-noise ratio (SNR) of a TFT image sensing array film embedded under the display panel. In order to improve the signal-to-noise ratio, the sensing unit of the present invention is further improved on the basis of the sensing unit shown in FIG. 6, so that the TFT image sensing array film can detect and recognize the infrared signal reflected by the user's eyeball.
- the infrared sensing layer is an array formed by an infrared photosensitive transistor.
- In this embodiment, the TFT image sensing array film of FIG. 6 is improved by replacing the original photodetecting diodes in the array film with TFT photovoltaic field-effect transistors (PVFETs); reference may be made to "Photovoltage field-effect transistors", Nature 542, 324-327 (16 February 2017).
- the infrared sensing layer is also designed as a Field Effect Transistor structure.
- The method is as follows: an amorphous silicon layer is prepared by chemical vapor deposition, a metal layer and a transparent electrode layer are prepared by physical sputtering, and the patterned layers required for each device layer are defined by photomask lithography during preparation.
- the potential difference between the drain and the source of the prepared infrared photosensitive transistor is an operating parameter for converting infrared light into an electrical signal.
- the gate of the infrared photosensitive transistor is made of a photovoltaic material sensitive to infrared light.
- The photovoltaic material can be quantum dots, carbon nanomaterials, metal oxide film materials, or the like modified by bandgap engineering; these materials are mixed into an organic or inorganic compound, stirred into a gel or liquid, and then formed into the gate by coating, printing, or the like.
- When infrared light is incident, the electron-hole pairs excited in the photovoltaic material induce a conducting channel between the drain and the source of the infrared photosensitive transistor, the potential difference between drain and source decreases, and the drain output current increases. By using the TFT switching characteristic for scanning, the infrared-light image electrical signal can be read out, thereby capturing the user's iris information.
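- The readout just described amounts to scanning the array row by row and sampling the drain output current of each infrared photosensitive transistor to build an image. The sketch below models only that logic; the `read_drain_current` callback is a hypothetical stand-in for the analog front end and drive electronics.

```python
import numpy as np

def scan_infrared_array(read_drain_current, rows: int, cols: int) -> np.ndarray:
    """Scan the TFT array and assemble the infrared-light image electrical signal.

    read_drain_current(row, col) should return the sampled drain current of one
    infrared photosensitive transistor; more incident infrared light means more current.
    """
    frame = np.zeros((rows, cols))
    for r in range(rows):          # the TFT switching characteristic selects one row at a time
        for c in range(cols):
            frame[r, c] = read_drain_current(r, c)
    return frame                   # infrared image used for iris capture
```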
- the infrared sensing layer is an array formed by infrared photosensitive diodes.
- In this embodiment, the TFT image sensing array film of FIG. 6 is improved by replacing its photodiode layer with infrared photosensitive diodes; the infrared photodiode comprises a microcrystalline silicon photodiode or an amorphous silicon photodiode.
- Embodiment 1: the amorphous silicon p-type/i-type/n-type photodiode structure (the photodiode layer in FIG. 6) is changed to a microcrystalline silicon p-type/i-type/n-type photodiode structure. The degree of microcrystallization is mainly controlled during chemical vapor deposition by mixing gaseous silane (SiH4) with a suitable hydrogen concentration so that hydrogen atoms bond to the dangling bonds of the amorphous silicon. By adjusting the hydrogen concentration during chemical vapor deposition of the microcrystalline silicon p-type/i-type/n-type photodiode structure, the operating wavelength range of the microcrystalline photodiode can be extended to 600 nm to 1000 nm.
- the microcrystalline silicon photodiode may also be formed by stacking a p-type/i-type/n-type structure having a double junction or more.
- the first junction p-type/i-type/n-type material of the photodiode is still an amorphous structure, and the p-type/i-type/n-type material above the second junction layer may be a microcrystalline structure or a polycrystalline structure.
- Embodiment 2: the amorphous silicon p-type/i-type/n-type photodiode structure (the photodiode layer in FIG. 6) is changed to a p-type/i-type/n-type photodiode structure of an amorphous silicon compound doped to extend the photosensitive wavelength range; a preferred example of such a compound is amorphous silicon germanium. During chemical vapor deposition, germane gas (GeH4) is mixed with silane (SiH4) in the intrinsic layer (i-type) of the photodiode, so that the photosensitive range of the amorphous silicon germanium p-type/i-type/n-type photodiode reaches a wavelength range of 600 nm to 1000 nm.
- In order to increase the quantum efficiency of photoelectric conversion, the amorphous silicon photodiode may also be formed by stacking a p-type/i-type/n-type structure with two or more junctions.
- the first junction p-type/i-type/n-type material of the photodiode is still an amorphous silicon structure, and the p-type/i-type/n-type material above the second junction layer may be a microcrystalline structure, a polycrystalline structure or a doped Compound materials that extend the range of photosensitive wavelengths.
- When the infrared sensing layer is an array formed of infrared photosensitive diodes, a TFT can be used for scan driving while a bias voltage is applied across the p-type/i-type/n-type photodiodes. This keeps the infrared photodiodes in a state of detecting infrared light, and the infrared light signal reflected by the user's iris is converted into an infrared-light image electrical signal and output, thereby capturing the user's eyeball activity information.
- In some embodiments, the operation instruction is a screen switching instruction, and the step of "determining whether the captured iris information matches the preset iris information in the operation configuration information, and if so, executing the operation instruction corresponding to the iris information, otherwise not executing the operation instruction" includes: determining whether the captured iris information matches the iris information corresponding to the screen switching instruction, and if so, switching the screen, otherwise not switching the screen. Since video stream data consists of successive image frames, the method of this embodiment is also applicable to video stream data.
- In some embodiments, the operation instruction is a payment instruction, and the above determining step includes: determining whether the captured iris information matches the iris information corresponding to the payment instruction, and if so, the payment succeeds, otherwise the payment fails. Linking the payment instruction to iris recognition effectively enhances the security of transaction payment and avoids unnecessary losses to the owner caused by other users.
- In some embodiments, the operation instruction is a user identity information login instruction, and the above determining step includes: determining whether the captured iris information matches the iris information corresponding to the user identity information login instruction; if so, the user identity information login succeeds, otherwise it fails. Tying the user identity login to iris recognition effectively enhances the security of the login process.
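- The three instruction types above (screen switching, payment, user identity login) all follow one pattern: verify the captured iris against the template bound to the requested instruction, then either carry it out or refuse and prompt. A hedged sketch of that dispatch, with hypothetical handler and matcher names:

```python
def execute_if_authorized(instruction, captured_iris, operation_config, device, iris_matches):
    """Execute the operation instruction only when the captured iris matches its preset template."""
    preset = operation_config.get(instruction)
    if preset is None or not iris_matches(captured_iris, preset):
        device.prompt("Iris recognition failed")   # voice, image, light, or video prompt information
        return False
    if instruction == "switch_screen":
        device.switch_screen()                     # screen switching succeeds
    elif instruction == "pay":
        device.execute_payment()                   # payment succeeds only after iris authentication
    elif instruction == "login":
        device.log_in_user()                       # user identity information login succeeds
    return True
```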
- the step of “determining whether the captured iris information matches the preset iris information in the operation configuration information” specifically includes: calculating a feature value according to the captured iris information, and preset with the operation configuration information. The characteristic values of the iris information are compared; when the error is less than the preset value, it is determined to be matched, otherwise it is determined to be a mismatch.
- the method further comprises the step of: issuing a prompt message when it is determined that there is no preset iris information in the operational configuration information that matches the captured iris information.
- the prompt information includes one or more of voice prompt information, image prompt information, light prompt information, and video prompt information.
- The voice prompt information includes a voice prompt informing the user that iris recognition has failed; the image prompt information includes a pop-up prompt informing the user that iris recognition has failed; the video prompt information includes a video prompt indicating that iris recognition has failed; and the light prompt information includes changing the brightness of the screen or having the display emit light of different colors.
- When the display unit is an LCD, a backlight unit is further disposed under the sensing unit, and the sensing unit is disposed between the backlight unit and the LCD. Since the LCD is not a self-illuminating element, a backlight unit must be added below the sensing unit during installation.
- The backlight unit may be an LCD backlight module or another electronic component with a self-luminous function.
- When the display unit is an AMOLED display screen, since the OLED display is a self-luminous element, there is no need to provide a backlight unit.
- the iris recognition area includes a plurality of iris recognition sub-areas, and a sensing unit is disposed under each iris recognition sub-area.
- The device further includes a sensing unit control circuit, and the method further comprises: receiving a user's activation instruction for an iris recognition sub-area, whereupon the sensing unit control circuit turns on the sensing unit below that sub-area; and receiving a user's closing instruction for an iris recognition sub-area, whereupon the sensing unit control circuit turns off the sensing unit below that sub-area.
- The two iris recognition sub-areas may be evenly distributed on the screen, one above the other or side by side, or may be arranged on the screen in other layouts.
- When the iris recognition sub-areas together cover the entire display screen and both are set to the on state, the imaging projection of the user's eyeball always falls within the range of a sensing unit, which effectively improves the capture of the user's eye features and enhances the user experience.
- Alternatively, the two iris recognition sub-areas may occupy 2/3, 3/4, or another fraction of the entire display area, as long as the iris recognition sub-areas do not deviate from the optical axis of eyeball imaging.
- the user can also set one iris recognition sub-area to open according to his own preference, and another iris recognition sub-area is closed. It is also possible to set both identification sub-areas to the off state when no operation is required on the device.
- the number of iris recognition sub-regions may also be other values, which may be set according to actual needs.
- the sensing unit under each iris recognition sub-area is turned on or off, and can be set according to the user's own preferences.
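- The sub-area arrangement described here (for example two sub-areas stacked vertically or placed side by side, together covering the whole screen or a fraction such as 2/3 or 3/4, each switchable according to the user's preference) can be captured in a small configuration structure. The field names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class IrisSubArea:
    name: str             # e.g. "upper"/"lower" or "left"/"right"
    coverage: float       # fraction of the display area, e.g. 0.5, 2/3, 3/4
    enabled: bool = True  # the user may switch each sub-area on or off independently

# Two sub-areas, one above the other, together covering the entire display.
sub_areas = [IrisSubArea(name="upper", coverage=0.5), IrisSubArea(name="lower", coverage=0.5)]

def active_coverage(areas):
    """Total fraction of the display currently available for iris capture."""
    return sum(a.coverage for a in areas if a.enabled)
```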
- FIG. 3 is a schematic diagram of an apparatus for iris recognition according to an embodiment of the present invention.
- The device comprises a display unit 101, a sensing unit 102, and an infrared light source; the display unit 101 is provided with an iris recognition area, the sensing unit 102 is located below the iris recognition area, and the sensing unit comprises an infrared sensing layer. The infrared light source is configured to emit infrared light, and the infrared sensing layer is configured to sense the infrared light signal reflected by the user's iris and capture the iris information of the user. The device further includes an operation information setting unit 104, a determining unit 108, and a processing unit 106.
- the operation information setting unit 104 is configured to preset operation configuration information, where the operation configuration information includes a correspondence relationship between the iris information and the operation instruction.
- the operation instruction includes one or more of a text operation instruction, an image operation instruction, a video operation instruction, and an application operation instruction.
- the text operation instruction includes a selected text instruction, a delete text instruction, a copy text instruction, and the like;
- the image operation instruction includes a selected image instruction, a copy image instruction, a cut image instruction, an image deletion instruction, a switching image screen, and the like;
- the video operation instructions include capturing, pausing, saving, deleting, fast-forwarding, rewinding, zooming, and adjusting the volume of the video;
- the application operation instructions include starting, deleting, selecting, moving, and so on of a software application (such as a mobile phone app).
- The iris information in the operation configuration information is iris information that the user has entered and stored in advance; each piece of iris information can be associated with a plurality of operation instructions, so that once the captured iris information is authenticated, the user can perform multiple operations on the device.
- The operation configuration information may be stored in the storage unit 107 of the device, such as the memory of a mobile phone or the hard disk of a computer, or may be stored in a storage unit of a server.
- When the operation configuration information needs to be acquired, the device only needs to establish a communication connection with the server and then obtain the previously stored operation configuration information from it; the communication connection may be wired or wireless.
- the sensing unit 102 is configured to capture iris information of the user on the iris recognition area.
- The coverage of the sensing unit is adapted to the size of the display unit; in this embodiment the sensing unit is rectangular and centered on the display unit, so that it does not deviate from the optical axis of eyeball imaging. This ensures that, as long as the user's eyes face the display unit, the sensing unit can acquire the user's iris information accurately and quickly regardless of how the user's eyes move.
- The determining unit 108 is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information; if so, the processing unit 106 is configured to execute the operation instruction corresponding to the iris information, otherwise the processing unit 106 does not execute the operation instruction.
- The iris information comparison can be implemented with an iris feature recognition algorithm, which can be stored in the storage unit of the device. When the infrared sensing layer of the sensing unit acquires iris information, the processor of the device calls the iris feature recognition algorithm in the storage unit to compare the acquired iris information with the iris information in the preset operation configuration information and determine whether the two match. The iris feature recognition algorithm includes steps such as preprocessing of the iris feature information, feature extraction, feature matching, and iris recognition; it can be implemented by various algorithms, which are mature existing technologies already applied in many fields and are not described in detail here.
- The apparatus includes an operation instruction receiving unit 105, and "the operation information setting unit is configured to preset operation configuration information" includes: the operation instruction receiving unit is configured to receive a user setting instruction, and the display unit is configured to display the iris recognition area.
- the setting instruction can be triggered by the user clicking a button in the setting column on the screen, and after receiving the setting instruction, the device will display the iris recognition area, so that the user can input the iris information.
- displaying the iris recognition area may include: increasing the brightness of the iris recognition area or displaying a prompt input box on the iris recognition area.
- Before receiving the user setting instruction, the method further includes receiving user account information, where the account information includes a user ID and a password.
- The user needs to input the correct user ID and password by means of voice control, eye control, or key/password control, and the setting instruction can be triggered only after the user account is logged in, which improves the security of setting the operation configuration information.
- the sensing unit is configured to capture and save the iris information of the user.
- the collected iris information is preset iris information, which can be stored in the storage unit.
- The step of "capturing the iris information of the user and saving it" includes: determining whether the iris information captured during the user setting process has already been stored in the storage unit; if so, the user is informed that the iris information has already been entered and prompt information is issued to remind the user; if not, the iris information is saved to the storage unit. This effectively avoids repeated entry of iris information.
- the display unit is configured to display a list of operation instruction identifiers
- the operation instruction receiving unit is configured to receive a selection instruction of the operation instruction identifier by the user
- the processing unit is configured to establish a correspondence between the operation instruction corresponding to the selected operation instruction identifier and the captured iris information, and save it to the operation configuration information.
- the operation instruction identification list includes an identifier corresponding to one or more operation instructions, and each operation instruction identifier corresponds to an operation instruction.
- the operation instruction identifier can be displayed in the form of text or a picture, and the selection instruction can be triggered by the user clicking a check, double clicking, or the like.
- In some embodiments, the operation instruction is a screen switching instruction, and "the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information, and if so, the processing unit is configured to execute the operation instruction corresponding to the iris information, otherwise the processing unit does not execute the operation instruction" includes: the determining unit is configured to determine whether the captured iris information matches the iris information corresponding to the screen switching instruction, and if so, the processing unit switches the screen, otherwise the processing unit does not switch the screen. Since video stream data consists of successive image frames, this embodiment is also applicable to video stream data.
- In some embodiments, the operation instruction is a payment instruction, and the above configuration includes: the determining unit is configured to determine whether the captured iris information matches the iris information corresponding to the payment instruction, and if so, the processing unit executes the payment instruction and the payment succeeds, otherwise the processing unit does not execute the payment instruction and the payment fails. Linking the payment instruction to iris recognition effectively enhances the security of transaction payment and avoids unnecessary losses to the owner caused by other users.
- In some embodiments, the operation instruction is a user identity information login instruction, and the above configuration includes: the determining unit is configured to determine whether the captured iris information matches the iris information corresponding to the user identity information login instruction, and if so, the processing unit executes the user identity information login instruction and the login succeeds, otherwise the processing unit does not execute the login instruction and the login fails. Tying the user identity login to iris recognition effectively enhances the security of the login process.
- the sensing unit comprises a TFT image sensing array film, and the infrared sensing layer comprises an array formed of infrared photodiodes or infrared phototransistors.
- the determining unit is configured to determine whether the captured iris information matches the preset iris information in the operation configuration information by computing a feature value from the captured iris information and comparing it with the feature value of the preset iris information in the operation configuration information; when the error is less than a preset value, a match is determined, otherwise a mismatch is determined.
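The text only requires computing a feature value and treating errors below a preset value as a match; it does not fix a particular feature or distance. A common concrete choice, shown here purely as an illustrative stand-in and not as the patent's algorithm, is a binary iris code compared by normalized Hamming distance:

```python
import numpy as np

def hamming_error(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Normalized Hamming distance between two binary iris feature codes."""
    assert code_a.shape == code_b.shape
    return float(np.count_nonzero(code_a != code_b)) / code_a.size

def is_match(code_a: np.ndarray, code_b: np.ndarray, preset: float = 0.32) -> bool:
    """Match when the error is below the preset value, as described above."""
    return hamming_error(code_a, code_b) < preset

# Illustrative 2048-bit codes
rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048, dtype=np.uint8)
probe = enrolled.copy()
probe[:100] ^= 1                      # flip 100 bits, roughly 4.9% error
print(is_match(probe, enrolled))      # True
```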
- the processing unit is further configured to issue prompt information when the determining unit determines that no preset iris information in the operation configuration information matches the captured iris information.
- the prompt information includes one or more of voice prompt information, image prompt information, light prompt information, and video prompt information.
- the voice prompt information includes a voice prompt informing the user that iris recognition has failed;
- the image prompt information includes a pop-up prompt informing the user that iris recognition has failed;
- the video prompt information includes a video prompt indicating that iris recognition has failed;
- the light prompt information includes changing the brightness of the screen or having the display emit light of a different color.
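A small sketch of how the several prompt types listed above might be dispatched on a failed recognition; the channel names and the print calls are placeholders for the device's own voice, pop-up, video, and lighting APIs.

```python
def prompt_failure(channels=("voice", "popup", "light")):
    """Emit one or more of the prompt types; output calls are placeholders."""
    actions = {
        "voice": lambda: print("[voice] iris recognition failed"),
        "popup": lambda: print("[popup] iris recognition failed"),
        "video": lambda: print("[video] playing failure clip"),
        "light": lambda: print("[light] dimming screen / changing color"),
    }
    for channel in channels:
        actions.get(channel, lambda: None)()

prompt_failure()
```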
- when the display unit is an LCD liquid crystal display, a backlight unit 103 is disposed under the sensing unit, and the sensing unit is disposed between the backlight unit and the LCD liquid crystal display. Since an LCD liquid crystal display is not a self-luminous element, a backlight unit needs to be added below the sensing unit.
- the backlight unit may be an LCD backlight module or other electronic components having a self-luminous function.
- when the display unit is an AMOLED display screen, no backlight unit needs to be provided, since an OLED display screen is a self-luminous element.
- the beneficial effect of the above technical solution is that, by providing a sensing unit under the iris recognition area of the display unit, the projection of the user's eye iris formed through the optical device falls on the iris recognition area, and the center of the sensing unit can be placed on or near the optical axis of eyeball imaging. Compared with a structure in which a camera is disposed at an edge position independently of the display screen, the invention can capture the user's iris feature information in time, compare it with the preset iris information, and execute the operation instruction corresponding to that iris information, which effectively improves the accuracy of iris recognition and enhances the user experience.
- in addition, the sensing unit is disposed under the display unit, which, compared with a structure in which a camera protrudes outside the display area, effectively reduces the overall thickness of the mobile device, making a wearable or mobile device thinner and lighter, better suited to flexible wearable or mobile devices, and better able to meet market demands.
- the computer device includes but is not limited to: a personal computer, a server, a general purpose computer, a special purpose computer, a network device, an embedded device, a programmable device, a smart mobile terminal, a smart home device, a wearable smart device, a vehicle smart device, and the like;
- the storage medium includes, but is not limited to, a RAM, a ROM, a magnetic disk, a magnetic tape, an optical disk, a flash memory, a USB flash drive, a mobile hard disk, a memory card, a memory stick, a network server storage, a network cloud storage, and the like.
- the computer program instructions can also be stored in a computer-readable memory that can direct the computer device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
- these computer program instructions can also be loaded onto a computer device, such that a series of operational steps are performed on the computer device to produce computer-implemented processing, so that the instructions executed on the computer device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Collating Specific Patterns (AREA)
- Image Input (AREA)
Abstract
A method and device for iris recognition. A sensing unit (102) is disposed under the iris recognition area of a display unit (101), and the projection of the user's eye iris falls on the iris recognition area, so the method can capture the user's iris information in time, compare it with preset iris information, and execute the operation instruction corresponding to that iris information, effectively improving the accuracy of iris recognition. In addition, since the sensing unit (102) is disposed under the display unit (101), compared with a structure in which a camera protrudes outside the display area, the overall thickness of the mobile device can be effectively reduced, making a wearable or mobile device thinner and lighter, better suited to flexible wearable or mobile devices, and better able to meet market demands.
Description
The present invention relates to the field of electronic device control, and in particular to a method and device for iris recognition.
With advances in science and technology, touch display panels are now widely used in devices that need a human-machine interface, such as the operating screens of industrial computers, tablet computers, and the touch screens of smartphones. For wearable electronic devices, however, human-machine interface technology still has much room for improvement. Taking virtual reality or augmented reality (VR/AR) devices as an example, identity recognition that matches the user's sensory experience is not yet as mature as the biometric technologies of mobile devices, such as fingerprint recognition. Identity recognition usually combines the user's biometric information with operation instructions so that the device can be operated through biometric recognition, and iris recognition is an important example.
The iris is the textured muscle-fiber tissue of the colored part of the human eyeball. Iris recognition captures and identifies the iris feature information of the eyeball, predicts the user's identity and needs, and responds accordingly, so that the device is controlled by recognizing iris feature information. At present, an infrared camera outside the display screen of a mobile device is generally used to capture changes in eyeball features. Because such a camera is usually placed independently at an edge of the device (for example at the top of a mobile phone), in wearable applications it deviates from the optical axis of eyeball imaging; with the existing device structure the user's iris information cannot be captured accurately, and problems such as low recognition efficiency and poor recognition accuracy arise.
Summary of the invention
The technical problem to be solved by the present invention is to provide a technical solution for iris recognition, so as to solve the problems that, because the camera outside the display screen of a wearable or mobile device is positioned off the optical axis, the feature information of the user's eye iris cannot be captured accurately and in time, resulting in incomplete capture of iris feature information, low recognition efficiency, low recognition accuracy, and a poor user sensory experience.
为解决上述技术问题,本发明采取的技术方案为:一种虹膜识别的方法,所述方法应用于虹膜识别的装置,所述装置包括显示单元、传感单元和红外光源;所述显示单元上设置有虹膜识别区,所述传感单元位于所述虹膜识别区的下方,所述传感单元包括红外感应层;所述红外光源用于发出红外光,所述红外感应层用于感知用户虹膜反射的红外光信号,并捕捉用户的虹膜信息;所述方法包括以下步骤:
预先设置操作配置信息,所述操作配置信息包括虹膜信息与操作指令的对应关系;
捕捉用户在虹膜识别区上的虹膜信息,判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令。
进一步地,所述步骤“预先设置操作配置信息”包括:
接收用户设置命令,显示虹膜识别区;
捕捉用户的虹膜信息并保存;
显示一操作指令标识列表,所述操作指令标识列表中包含着一个或多个操作指令对应的标识,每一操作指令标识对应一操作指令;
接收用户对操作指令标识的选择指令,建立所选中的操作指令标识对应的操作指令与所捕捉的虹膜信息的对应关系,并保存至操作配置信息。
进一步地,所述操作指令为画面切换指令;步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令”包括:
判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是则对画面进行切换,否则不对画面进行切换。
进一步地,所述操作指令为支付指令;步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令”包括:
判断捕捉到的虹膜信息与支付指令对应的虹膜信息是否匹配,若是则支付成功,否则支付失败。
进一步地,所述操作指令为用户身份信息登录指令;步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令”包括:
判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是用户身份信息登录成功,否则用户身份信息登录失败。
进一步地,所述传感单元包括TFT影像感测阵列薄膜,所述红外感应层包括红外光敏二极管或红外光敏电晶管。
进一步地,步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配”具体包括:
根据捕捉到虹膜信息计算其特征值,并与操作配置信息中预设的虹膜信息的特征值进行对比;当误差小于预设值时,判定为相匹配,否则判定为不匹配。
进一步地,所述方法还包括步骤:
当判定操作配置信息中没有与所捕捉的虹膜信息相匹配的预设虹膜信息时,发出提示信息。
进一步地,所述提示信息包括声音提示信息、图像提示信息、光线提示信息、视频提示信息中的一种或多种。
进一步地,所述显示单元包括AMOLED显示屏或LCD液晶显示屏。
进一步地,当所述显示单元为LCD液晶显示屏时,所述传感单元的下方还设置有背光单元,所述传感单元设置于背光单元和LCD液晶显示屏之间。
进一步地,所述虹膜识别区包括多个虹膜识别子区域,每一虹膜识别子区域的下方对应设置一传感单元。
进一步地,所述装置还包括传感单元控制电路,所述方法还包括:
接收用户对虹膜识别子区域的启动指令,传感单元控制电路开启所述虹膜识别子区域的下方的传感单元,以及接收用户对虹膜识别子区域的关闭指令,传感单元控制电路关闭所述虹膜识别子区域的下方的传感单元。
发明人还提供了一种虹膜识别的装置,所述装置包括显示单元、传感单元和红外光源;所述显示单元上设置有虹膜识别区,所述传感单元位于所述虹膜识别区的下方,所述传感单元包括红外感应层;所述红外光源用于发出红外光, 所述红外感应层用于感知用户虹膜反射的红外光信号,并捕捉用户的虹膜信息;所述装置还包括操作信息设置单元、判断单元和处理单元;
所述操作信息设置单元用于预先设置操作配置信息,所述操作配置信息包括虹膜信息与操作指令的对应关系;
所述传感单元用于捕捉用户在虹膜识别区上的虹膜信息,所述判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令。
进一步地,所述装置包括操作指令接收单元,所述“操作信息设置单元用于预先设置操作配置信息”包括:
所述操作指令接收单元用于接收用户设置命令,所述显示单元用于显示虹膜识别区;
所述传感单元用于捕捉用户的虹膜信息并保存;
所述显示单元用于显示一操作指令标识列表;所述操作指令标识列表中包含着一个或多个操作指令对应的标识,每一操作指令标识对应一操作指令;
所述操作指令接收单元用于接收用户对操作指令标识的选择指令,处理单元用于建立所选中的操作指令标识对应的操作指令与所捕捉的虹膜信息的对应关系,并保存至操作配置信息。
进一步地,所述操作指令为画面切换指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:
判断单元用于判断捕捉到的用户的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是则处理单元用于对画面进行切换,否则处理单元不对画面进行切换。
进一步地,所述操作指令为支付指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:
判断单元用于判断判断捕捉到的虹膜信息与支付指令对应的虹膜信息是否匹配,若是则处理单元执行所述支付指令,支付成功,否则处理单元不执行所述支付指令,支付失败。
进一步地,所述操作指令为用户身份信息登录指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:
判断单元用于判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是处理单元执行所述用户信息身份登录指令,登录成功,否则处理单元不执行所述用户信息身份登录指令,登录失败。
进一步地,所述传感单元包括TFT影像感测阵列薄膜,所述红外感应层包括红外光敏二极管或红外光敏电晶管所形成的阵列。
进一步地,所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配”具体包括:
判断单元用于根据捕捉到虹膜信息计算其特征值,并与操作配置信息中预设的虹膜信息的特征值进行对比;当误差小于预设值时,判定为相匹配,否则判定为不匹配。
进一步地,所述处理单元还用于在判断单元判定操作配置信息中没有与所捕捉的虹膜信息相匹配的预设虹膜信息时,发出提示信息。
进一步地,所述提示信息包括声音提示信息、图像提示信息、光线提示信息、视频提示信息中的一种或多种。
进一步地,所述显示单元包括AMOLED显示屏或LCD液晶显示屏。
进一步地,当所述显示单元为LCD液晶显示屏时,所述传感单元的下方还设置有背光单元,所述传感单元设置于背光单元和LCD液晶显示屏之间。
进一步地,所述虹膜识别区包括多个虹膜识别子区域,每一虹膜识别子区域的下方对应设置一传感单元。
进一步地,所述装置还包括传感单元控制电路和操作指令接收单元,所述操作指令接收单元用于接收用户对虹膜识别子区域的启动指令,所述传感单元控制电路用于开启所述虹膜识别子区域的下方的传感单元,以及所述操作指令接收单元用于接收用户对虹膜识别子区域的关闭指令,所述传感单元控制电路用于关闭所述虹膜识别子区域的下方的传感单元。
采用以上技术方案后的有益效果为:通过在显示单元的虹膜识别区下方设置传感单元,用户眼球虹膜通过光学器件成像的投影位于所述虹膜识别区上,传感单元的中心可设置于眼球成像光轴位置或是近轴位置,相较于摄像头独立于显示屏外设置在边缘位置的结构,本发明可以及时捕捉到用户虹膜特征信息,进而与预设的虹膜信息进行比对,执行该虹膜信息对应的操作指令,有效提高虹膜信息的精准度,提升用户体验。此外,传感单元设置于显示单元的下方,相较于摄像头独立突出设置于显示屏区域外的结构,可以有效缩小移动设备的整体厚度,使得穿戴式设备或是移动设备更加轻薄、更适用于柔性穿戴式设备或是移动设备、满足市场的需求。
FIG. 1 is a flowchart of a method for iris recognition according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for iris recognition according to another embodiment of the present invention;
FIG. 3 is a schematic diagram of a device for iris recognition according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a device for iris recognition according to another embodiment of the present invention;
FIG. 5 is a schematic diagram of an application scenario of a device for iris recognition according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an existing sensing unit;
FIG. 7 is a schematic diagram of a sensing unit according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a sensing unit according to another embodiment of the present invention.
Description of reference numerals:
101, display unit;
102, sensing unit;
103, backlight unit;
104, operation information setting unit;
105, operation instruction receiving unit;
106, processing unit;
107, storage unit;
108, determining unit.
To explain in detail the technical content, structural features, objects, and effects of the present invention, the following description is given in conjunction with the embodiments and the accompanying drawings.
Referring to FIG. 1, which is a flowchart of a method for iris recognition according to an embodiment of the present invention. The method is applied to a device for iris recognition, which is an electronic device having a display screen or a touch display screen, such as a smart mobile device like a mobile phone, tablet computer, or personal digital assistant, or an electronic device such as a personal computer or an industrial computer. The device can also be combined with an optical imaging component disposed between the display unit and the user's eye (that is, above the display screen). As shown in FIG. 5, the projection of the user's eyeball is first imaged in the optical imaging component, and the imaged projection falls within the iris recognition area on the display unit, so that the iris information of the user's eyeball can be captured by the sensing unit under the iris recognition area. Through the cooperation between the optical imaging component and the display unit, the effect of simulating a VR/AR device can be achieved.
The device includes a display unit, a sensing unit, and an infrared light source; an iris recognition area is provided on the display unit, the sensing unit is located under the iris recognition area, and the sensing unit includes an infrared sensing layer; the infrared light source is used to emit infrared light, and the infrared sensing layer is used to sense the infrared light signal reflected by the user's iris and to capture the user's iris information. In this embodiment the display unit includes an AMOLED display screen or an LCD liquid crystal display screen; in other embodiments the display unit may be another electronic component with a display function. The infrared light source is an electronic component capable of emitting infrared light, such as an infrared LED. The method includes the following steps.
First, step S101 is entered to preset operation configuration information, where the operation configuration information includes the correspondence between iris information and operation instructions. In this embodiment the operation instructions include one or more of text operation instructions, image operation instructions, video operation instructions, and application operation instructions. The text operation instructions include selecting, deleting, and copying text; the image operation instructions include selecting, copying, capturing, and deleting an image and switching image frames; the video operation instructions include capturing, pausing, saving, deleting, fast-forwarding, rewinding, zooming, and adjusting the volume of a video; the application operation instructions include launching, deleting, selecting, and moving a software application (such as a mobile phone APP).
The iris information in the operation configuration information is the iris information entered and stored by the user in advance. Each piece of iris information can be matched with multiple operation instructions, so that once the captured iris information of the user passes authentication, the user can perform multiple operations on the device. The operation configuration information can be stored in a storage unit of the device, such as the memory of a mobile phone or the hard disk of a computer, or in a storage unit of a server; when the operation configuration information is needed, the device only has to establish a communication connection with the server and then obtain the previously stored operation configuration information from it, the communication connection being wired or wireless.
Then step S102 is entered to capture the user's iris information on the iris recognition area. In this embodiment the coverage of the sensing unit is adapted to the size of the display unit; preferably the sensing unit is rectangular and located at the center of the display unit, so that it does not deviate from the optical axis of eyeball imaging. This ensures that as long as the user's eyes are aimed at the display unit, the sensing unit can accurately and quickly collect the user's iris information no matter how the eyeball moves.
Then step S103 is entered to determine whether the captured iris information matches the preset iris information in the operation configuration information; if so, step S104 is entered to execute the operation command corresponding to that iris information, otherwise step S105 is entered and the operation instruction is not executed. The comparison of iris information can be implemented by an iris feature recognition algorithm stored in the storage unit of the device. After the infrared sensing layer of the sensing unit acquires the iris information, the processor of the device invokes the iris feature recognition algorithm in the storage unit and compares the acquired iris information with the iris information in the preset operation configuration information to determine whether the two match. The iris feature recognition algorithm includes steps such as preprocessing the iris feature information, feature extraction, feature matching, and iris recognition; it can be implemented with a variety of algorithms, all of which are mature prior art already applied in many fields, and is not described in detail here.
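A compact sketch of steps S102 to S105 as described above, assuming that the preprocessing/feature-extraction, matching, and instruction-execution routines are supplied by the device; all names and the error threshold are illustrative, not taken from the patent.

```python
def recognize_and_execute(captured_image, config, extract_fn, match_fn,
                          execute_fn, preset_error=0.32):
    """Extract features from the captured iris image, compare them with every
    enrolled template in the operation configuration, and execute the bound
    instruction(s) on a match.

    extract_fn -- preprocessing + feature extraction (device-specific)
    match_fn   -- error between two feature sets
    execute_fn -- runs an operation instruction by identifier
    """
    probe = extract_fn(captured_image)
    for iris_id, template in config["templates"].items():
        if match_fn(probe, template) < preset_error:
            for instruction_id in config["bindings"].get(iris_id, []):
                execute_fn(instruction_id)
            return True          # S104: matched, instruction(s) executed
    return False                 # S105: no match, nothing executed

# Illustrative usage with stand-in routines
config = {
    "templates": {"user_a": {1, 2, 3}},
    "bindings": {"user_a": ["switch_screen"]},
}
matched = recognize_and_execute(
    captured_image=None,
    config=config,
    extract_fn=lambda img: {1, 2, 3},
    match_fn=lambda a, b: 0.0 if a == b else 1.0,
    execute_fn=lambda instr: print(f"[execute] {instr}"),
)
print(matched)   # True
```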
Referring to FIG. 2, which is a flowchart of a method for iris recognition according to another embodiment of the present invention, the step of presetting the operation configuration information includes the following.
First, step S201 is entered to receive a user setting command and display the iris recognition area. The setting instruction can be triggered by the user clicking a button in a settings bar on the screen; after receiving the setting instruction, the device displays the iris recognition area so that the user can input iris information. In this embodiment, displaying the iris recognition area may include increasing the brightness of the iris recognition area or displaying a prompt input box on it. In some embodiments, before the user setting instruction is received, the user's account information, including a user ID and a password, is also received. The user must enter the correct user ID and password by voice control, eyeball control, or key/password control and log in to the user account before the setting instruction can be triggered. On one hand this improves the security of setting the operation configuration information; on the other hand it makes it possible to distinguish different users on one device and to store different iris information and the operation instructions corresponding to it.
Then step S202 is entered to capture and save the user's iris information. The collected iris information is the preset iris information and can be stored in the storage unit. In this embodiment, the step of capturing and saving the user's iris information includes: determining whether the iris information in the user setting process has already been stored in the storage unit; if so, prompting the user that this iris information has already been entered and issuing prompt information to remind the user; if not, saving the iris information to the storage unit. This effectively avoids duplicate entry of iris information.
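The duplicate-entry check in step S202 can be sketched as follows, assuming a feature-comparison function of the same kind used for matching; the threshold and names are illustrative only.

```python
def enroll_iris(new_template, store, match_fn, preset_error=0.32):
    """Save the captured template only if it is not already enrolled;
    otherwise prompt the user that the iris has already been entered."""
    for existing in store:
        if match_fn(new_template, existing) < preset_error:
            print("[prompt] this iris has already been enrolled")
            return False
    store.append(new_template)
    return True

# Illustrative usage with a stand-in one-dimensional "feature"
store = []
similar = lambda a, b: abs(a - b)          # stand-in feature comparison
print(enroll_iris(0.50, store, similar))   # True, first entry saved
print(enroll_iris(0.51, store, similar))   # False, too close to an existing entry
```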
Then step S203 is entered to display an operation instruction identifier list, receive the user's selection of an operation instruction identifier, establish the correspondence between the operation instruction corresponding to the selected identifier and the captured iris information, and save it to the operation configuration information. The operation instruction identifier list contains the identifiers of one or more operation instructions, and each identifier corresponds to one operation instruction. An operation instruction identifier can be displayed as text or as a picture, and the selection instruction can be triggered by the user ticking a checkbox, double-clicking, or the like.
As shown in FIG. 6, in this embodiment the sensing unit includes a TFT image sensing array film. Existing liquid crystal display (LCD) panels and organic light-emitting diode (OLED) display panels both use TFT structures to drive and scan individual pixels so as to display the pixel array on the panel. The main structure providing the TFT switching function is the metal-oxide-semiconductor field-effect transistor (MOSFET), and well-known semiconductor layer materials include amorphous silicon, polycrystalline silicon, indium gallium zinc oxide (IGZO), and organic compounds mixed with carbon nanomaterials. Because photo-sensing diodes can also be fabricated from such semiconductor materials, and the production equipment is compatible with that used for TFT arrays, TFT photo-detection diodes have in recent years begun to be produced by TFT array fabrication processes. The TFT image sensing array film described in this embodiment is such a TFT photo-detection diode (the photo-sensing diode region in FIG. 6); for the specific structure, reference may be made to the descriptions of the sensing unit structure in US patent US6943070B2 and Chinese patent CN204808361U. The production process of the TFT image sensing array film differs from the TFT structure of a display panel in that the pixel aperture region of the display panel is changed to a photo-sensing region. The TFT can be fabricated on thin glass or on a high-temperature-resistant plastic substrate, as described in US6943070B2.
The sensing unit shown in FIG. 6 is susceptible to optical interference caused by reflection and refraction of ambient light or of the visible light emitted by the display pixels, which seriously degrades the signal-to-noise ratio (SNR) of a TFT image sensing array film embedded under the display panel. To improve the signal-to-noise ratio, the sensing unit of the present invention further improves on the sensing unit shown in FIG. 6 so that the TFT image sensing array film can detect and identify the infrared signal reflected back from the user's eyeball.
As shown in FIG. 7, in some embodiments the infrared sensing layer is an array formed of infrared phototransistors. To extend the wavelength range of the light signals that the TFT image sensing array film can recognize from the visible range into the infrared range, the TFT image sensing array film of FIG. 6 is improved by replacing the photo-detection diodes in the original array film with TFT photovoltaic field-effect transistors (PVFETs); see "Photovoltage field-effect transistors", Nature 542, 324-327 (16 February 2017). In this embodiment the infrared sensing layer is also designed as a field-effect transistor structure. To fabricate this transistor structure so that it is fully compatible with the TFT image sensing array film shown in FIG. 6, the amorphous silicon layer can be prepared by chemical vapor deposition and the metal layer and transparent electrode layer by physical sputtering, with photolithographic etching used during fabrication to define the pattern required for each layer of the device.
The potential difference between the drain and the source of the fabricated infrared phototransistor is the operating parameter for converting infrared light into an electrical signal. To this end, the gate of the infrared phototransistor is made of a photovoltaic material sensitive to infrared light, for example quantum dots modified by bandgap engineering, carbon nanomaterials, or metal-oxide thin-film materials; these materials are mixed into an organic or inorganic compound and stirred into a colloid or liquid, which is then applied by coating, printing, or the like to form the gate material.
When the infrared light reflected back from the iris of the user's eyeball strikes the array formed of these infrared phototransistors, the electron-hole pairs excited in the photovoltaic material induce the formation of an electron channel between the drain and the source of the infrared phototransistor, so the potential difference between the drain and the source drops and, with the TFT used for scan driving, the drain output current rises. This switching characteristic enables readout of the infrared image electrical signal and thus the capture of the user's iris information.
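The readout behavior just described (drain current rising under TFT scan driving when reflected infrared light induces a channel) can be pictured with a toy numerical model; the current values, gain factor, and thresholding rule below are purely illustrative and are not taken from the patent.

```python
import numpy as np

def read_ir_frame(drain_current, dark_current=1e-9, gain=20.0):
    """Toy model of reading the phototransistor array row by row: a pixel is
    treated as illuminated when its scanned drain current rises well above
    the dark level."""
    frame = np.zeros_like(drain_current, dtype=np.uint8)
    for row in range(drain_current.shape[0]):          # TFT row scan
        frame[row] = (drain_current[row] > gain * dark_current).astype(np.uint8)
    return frame

# Simulated 8x8 array with a bright 3x3 patch of reflected infrared light
current = np.full((8, 8), 1e-9)
current[2:5, 2:5] = 5e-8
print(read_ir_frame(current))
```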
As shown in FIG. 8, in some embodiments the infrared sensing layer is an array formed of infrared photodiodes. To extend the wavelength range of the light signals that the TFT image sensing array film can recognize from the visible range into the infrared range, the TFT image sensing array film of FIG. 6 is improved by replacing its photodiode layer with infrared photodiodes, which include microcrystalline silicon photodiodes or amorphous silicon photodiodes.
Embodiment 1: the amorphous silicon p-type/i-type/n-type photodiode structure (the photodiode layer in FIG. 6) is replaced by a microcrystalline silicon p-type/i-type/n-type photodiode structure. In this embodiment, the degree of microcrystallization of the photodiode is controlled mainly by mixing silane gas (SiH4) with an appropriate hydrogen concentration during chemical vapor deposition, so that hydrogen atoms bond to the dangling bonds of the amorphous silicon, thereby depositing the microcrystalline silicon p-type/i-type/n-type photodiode structure. By adjusting the hydrogen concentration during chemical vapor deposition, the operating wavelength range of the microcrystalline photodiode can be extended to light wavelengths from 600 nm to 1000 nm.
In the embodiment using microcrystalline photodiodes, to further improve the quantum efficiency of photoelectric conversion, the microcrystalline silicon photodiode can also be formed by stacking two or more p-type/i-type/n-type junctions. The p-type/i-type/n-type material of the first junction layer of the photodiode remains amorphous, while the p-type/i-type/n-type material of the second and higher junction layers can have a microcrystalline or polycrystalline structure.
Embodiment 2: the amorphous silicon p-type/i-type/n-type photodiode structure (the photodiode layer in FIG. 6) is replaced by a p-type/i-type/n-type photodiode structure doped with an amorphous silicon compound that extends the photosensitive wavelength range; a preferred compound is amorphous silicon germanium. In this embodiment, germane gas (GeH4) is mixed with silane (SiH4) during chemical vapor deposition of the intrinsic (i-type) layer of the photodiode, so that the photosensitive range of the amorphous silicon germanium p-type/i-type/n-type photodiode reaches light wavelengths from 600 nm to 1000 nm.
In the embodiment using amorphous silicon compound photodiodes, to improve the quantum efficiency of photoelectric conversion, the amorphous silicon photodiode can also be formed by stacking two or more p-type/i-type/n-type junctions. The p-type/i-type/n-type material of the first junction layer of the photodiode remains amorphous silicon, while the p-type/i-type/n-type material of the second and higher junction layers can have a microcrystalline or polycrystalline structure or be doped with a compound material that extends the photosensitive wavelength range.
When the infrared sensing layer is an array formed of infrared photodiodes, in practical use a bias voltage can be applied across the p-type/i-type/n-type photodiodes while the TFTs perform scan driving, putting the infrared photodiodes in a state of detecting infrared light signals; the infrared light signal reflected back from the iris of the user's eyeball is then converted into an infrared image electrical signal and output, thereby capturing the user's eye activity information.
In some embodiments the operation instruction is a screen switching instruction; the step of determining whether the captured iris information matches the preset iris information in the operation configuration information and, if so, executing the operation command corresponding to that iris information, otherwise not executing the operation instruction, includes: determining whether the captured iris information matches the iris information corresponding to the screen switching instruction; if so, switching the screen, otherwise not switching the screen. Since video stream data is composed of successive image frames, the method of this embodiment is equally applicable to video stream data.
In some embodiments the operation instruction is a payment instruction; the step of determining whether the captured iris information matches the preset iris information in the operation configuration information and, if so, executing the operation command corresponding to that iris information, otherwise not executing the operation instruction, includes: determining whether the captured iris information matches the iris information corresponding to the payment instruction; if so, the payment succeeds, otherwise the payment fails. Linking the payment instruction to iris recognition effectively enhances the security of transaction payment and avoids unnecessary losses to the owner caused by other users' misoperation.
In some embodiments the operation instruction is a user identity information login instruction; the step of determining whether the captured iris information matches the preset iris information in the operation configuration information and, if so, executing the operation command corresponding to that iris information, otherwise not executing the operation instruction, includes: determining whether the captured iris information matches the iris information corresponding to the user identity information login instruction; if so, the user identity information login succeeds, otherwise it fails. Linking user identity login to iris recognition effectively enhances the security of the login process.
In some embodiments, the step of determining whether the captured iris information matches the preset iris information in the operation configuration information specifically includes: computing a feature value from the captured iris information and comparing it with the feature value of the preset iris information in the operation configuration information; when the error is less than a preset value, a match is determined, otherwise a mismatch is determined.
In some embodiments the method further includes: issuing prompt information when it is determined that no preset iris information in the operation configuration information matches the captured iris information. The prompt information includes one or more of voice prompt information, image prompt information, light prompt information, and video prompt information. The voice prompt information includes a voice prompt informing the user that iris recognition has failed, the image prompt information includes a pop-up prompt informing the user that iris recognition has failed, the video prompt information includes a video prompt indicating that iris recognition has failed, and the light prompt information includes changing the brightness of the screen or having the display emit light of a different color.
As shown in FIG. 4, in some embodiments, when the display unit is an LCD liquid crystal display screen, a backlight unit is further disposed under the sensing unit, and the sensing unit is disposed between the backlight unit and the LCD liquid crystal display screen. Since an LCD liquid crystal display screen is not a self-luminous element, a backlight unit needs to be added below the sensing unit. The backlight unit can be an LCD backlight module or another electronic component with a self-luminous function. In other embodiments, when the display unit is an AMOLED display screen, no backlight unit is needed, because an OLED display screen is a self-luminous element. These two arrangements can effectively meet the production needs of different manufacturers and broaden the applicability of the device.
In this embodiment the iris recognition area includes multiple iris recognition sub-areas, and a sensing unit is correspondingly disposed under each iris recognition sub-area. The device further includes a sensing unit control circuit, and the method further includes: receiving the user's activation instruction for an iris recognition sub-area, the sensing unit control circuit turning on the sensing unit under that sub-area; and receiving the user's deactivation instruction for an iris recognition sub-area, the sensing unit control circuit turning off the sensing unit under that sub-area.
Taking two iris recognition sub-areas as an example, the two sub-areas can be distributed evenly on the screen one above the other or side by side, or arranged in another way. The application of a device with two iris recognition sub-areas is described below: in use, the user issues an activation instruction to set both iris recognition sub-areas to the on state. In a preferred embodiment, the area formed by the two sub-areas covers the entire display screen, which ensures that when both sub-areas are on, the imaged projection of the user's eyeball always falls within the range of the sensing units, effectively improving the capture of the user's eyeball features and the user experience. In other embodiments the area formed by the two sub-areas may occupy 2/3 or 3/4 of the display screen, as long as the centers of the sub-areas do not deviate from the optical axis of eyeball imaging. The user can of course also, according to preference, set one sub-area on and the other off; when the device does not need to be operated, both sub-areas can be set to the off state.
In other embodiments the number of iris recognition sub-areas can be another value, set according to actual needs, and whether the sensing unit under each sub-area is on or off can be set according to the user's own preference.
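A minimal sketch of the sensing unit control circuit's on/off bookkeeping for the sub-areas described above; the region names and command strings are placeholders for the real activation and deactivation instructions and driver calls.

```python
class SensingUnitController:
    """Tracks which sensing units (one per iris recognition sub-area)
    are currently powered; two sub-areas, e.g. upper/lower, by default."""

    def __init__(self, regions=("upper", "lower")):
        self.state = {region: False for region in regions}

    def handle(self, region: str, command: str) -> None:
        if command == "start":
            self.state[region] = True    # power the sensing unit under it
        elif command == "close":
            self.state[region] = False   # power it down

controller = SensingUnitController()
controller.handle("upper", "start")
controller.handle("lower", "start")      # both on: full-screen coverage
print(controller.state)                  # {'upper': True, 'lower': True}
```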
请参阅图3,为本发明一实施方式涉及的虹膜识别的装置的示意图。所述装置包括显示单元101、传感单元102和红外光源;所述显示单元101上设置有虹膜识别区,所述传感单元102位于所述虹膜识别区的下方,所述传感单元包括 红外感应层;所述红外光源用于发出红外光,所述红外感应层用于感知用户虹膜反射的红外光信号,并捕捉用户的虹膜信息;所述装置还包括操作信息设置单元104、判断单元108和处理单元106。
所述操作信息设置单元104用于预先设置操作配置信息,所述操作配置信息包括虹膜信息与操作指令的对应关系。本实施方式中,所述操作指令包括文字操作指令、图像操作指令、视频操作指令、应用操作指令中的一种或多种。所述文字操作指令包括选中文字指令、删除文字指令、复制文字指令等;所述图像操作指令包括选中图像指令、复制图像指令、截取图像指令、删除图像指令、切换图像画面等;所述视频操作指令包括对视频进行截取、暂停、保存、删除、快进、快退、缩放画面、音量调整等;所述应用操作指令包括对软件应用程序(如手机APP)进行启动、删除、选中、移动等。
操作配置信息中的虹膜信息即为用户事先录入存储的虹膜信息,每一虹膜信息可以与多个操作指令相匹配,使得捕捉到的用户虹膜信息在通过认证后,用户可以对设备执行多项操作。操作配置信息可以存储于装置的存储单元107,如手机的内存、计算机的硬盘中,也可以存储于服务器的存储单元中,当需要获取操作配置信息时,只需让装置与服务器建立通讯连接,而后再从服务器获取到事先存储的操作配置信息,所述通讯连接包括有线通讯连接或无线通信连接。
所述传感单元102用于捕捉用户在虹膜识别区上的虹膜信息。在本实施方式中,传感单元的覆盖的范围与显示单元的大小相适配,优选的,传感单元的形状为矩形,矩形的大小位于显示单元的中心,保证不偏移眼球、成像的光轴。这样可以保证只要用户眼睛对准显示单元,无论用户眼球如何活动,传感单元都能精确、快速地采集到用户的虹膜信息。
所述判断单元108用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元106用于执行执行该虹膜信息对应的操作命令,否则处理单元106不执行所述操作指令。虹膜信息的比对可以通过虹膜特征识别算法来实现,虹膜特征识别算法可以实现存储于装置的存储单元中,当 传感单元的红外感应层获取到虹膜信息后,装置的处理器将调用存储单元中的虹膜特征识别算法,将所获取的虹膜信息与预设的操作配置信息中的虹膜信息进行比对,判断两者是否匹配。虹膜特征识别算法包括对虹膜特征信息进行预处理、数据特征提取、特征匹配、虹膜识别等步骤,可以用多种算法来实现,这些算法都是成熟的现有技术,现已被应用于各个领域中,此处不再展开赘述。
在某些实施例中,所述装置包括操作指令接收单元105,所述“操作信息设置单元用于预先设置操作配置信息”包括:所述操作指令接收单元用于接收用户设置命令,所述显示单元用于显示虹膜识别区。设置指令可以通过用户点击屏幕上设置栏中的某一按钮触发,装置接收到设置指令后,将对虹膜识别区进行显示,便于用户输入虹膜信息。在本实施方式中,显示虹膜识别区可以包括:提高虹膜识别区的亮度或在虹膜识别区上显示一提示输入框。在某些实施例中,在接收用户设置指令之前,还包括接收用户的账号信息,所述账号信息包括用户ID及密码。用户需要以语音控制、眼球控制、或是按键密码控制等方式输入正确的用户ID及密码,登录用户账号后,才可触发所述设置指令,这样一方面可以提高操作配置信息设置的安全性,另一方面也可以达到在一个装置上区分不同用户、保存不同的虹膜信息以及与之相对应的操作指令的效果。
所述传感单元用于捕捉用户的虹膜信息并保存。所采集到的虹膜信息即为预设虹膜信息,可以将其存储于存储单元中。本实施方式中,所述步骤“捕捉用户的虹膜信息并保存”包括:判断用户设置过程中的虹膜信息是否已存储于存储单元,当判定为是时提示用户该虹膜信息已录入,并发出提示信息提醒用户;当判定为否时将该虹膜信息保存至存储单元。这样可以有效避免虹膜信息的重复录入。
所述显示单元用于显示一操作指令标识列表,所述操作指令接收单元用于接收用户对操作指令标识的选择指令,处理单元用于建立所选中的操作指令标识对应的操作指令与所捕捉的虹膜信息的对应关系,并保存至操作配置信息。所述操作指令标识列表中包含着一个或多个操作指令对应的标识,每一操作指 令标识对应一操作指令。操作指令标识可以以文字或图片的形式予以显示,选择指令可以通过用户点击勾选、双击等方式触发。
在某些实施例中,所述操作指令为画面切换指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:判断单元用于判断捕捉到的用户的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是则处理单元用于对画面进行切换,否则处理单元不对画面进行切换。由于视频流数据是有一帧帧图像画面构成的,因而本实施例的方法同样也适用于对视频流数据的判断。
在某些实施例中,所述操作指令为支付指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:判断单元用于判断判断捕捉到的虹膜信息与支付指令对应的虹膜信息是否匹配,若是则处理单元执行所述支付指令,支付成功,否则处理单元不执行所述支付指令,支付失败。将支付指令与用户虹膜信息识别相挂钩,可以有效增强交易支付的安全性,同时避免其他用户误操作给户主带来不必要的损失。
在某些实施例中,所述操作指令为用户身份信息登录指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:判断单元用于判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是处理单元执行所述用户信息身份登录指令,登录成功,否则处理单元不执行所述用户信息身份登录指令,登录失败。将用户身份信息登录与与用户虹膜信息识别相挂钩,可以有效增强用户身份登录过程的安全性。
在某些实施例中,所述传感单元包括TFT影像感测阵列薄膜,所述红外感应层包括红外光敏二极管或红外光敏电晶管所形成的阵列。所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配”具体包 括:判断单元用于根据捕捉到虹膜信息计算其特征值,并与操作配置信息中预设的虹膜信息的特征值进行对比;当误差小于预设值时,判定为相匹配,否则判定为不匹配。
在某些实施例中,所述处理单元还用于在判断单元判定操作配置信息中没有与所捕捉的虹膜信息相匹配的预设虹膜信息时,发出提示信息。所述提示信息包括声音提示信息、图像提示信息、光线提示信息、视频提示信息中的一种或多种。所述声音提示信息包括提示用户虹膜识别失败的语音提示信息,所述图像提示信息包括提示用户虹膜识别失败的的弹窗提示信息,所述视频提示信息包括提示虹膜识别失败的提示信息,光线提示信息包括改变屏幕亮度或者让显示屏发出不同颜色的光线等。
在某些实施例中,当所述显示单元为LCD液晶显示屏时,所述传感单元的下方还设置有背光单元103,所述传感单元设置于背光单元和LCD液晶显示屏之间。由于LCD液晶显示屏不属于自发光元件,因而在安装时需要在传感单元的下方增加背光单元。背光单元可以为LCD背光模组,也可以为其他具有自发光功能的电子元件。在另一些实施例中,当所述显示单元为AMOLED显示屏时,由于OLED显示屏属于自发光元件,因而无需设置背光单元。通过上述两种方案的设置,可以有效满足不同厂家的生产需求,提高装置的适用范围。
采用以上技术方案后的有益效果为:通过在显示单元的虹膜识别区下方设置传感单元,用户眼球虹膜通过光学器件成像的投影位于所述虹膜识别区上,传感单元的中心可设置于眼球成像光轴位置或是近轴位置,相较于摄像头独立于显示屏外设置在边缘位置的结构,本发明可以及时捕捉到用户虹膜特征信息,进而与预设的虹膜信息进行比对,执行该虹膜信息对应的操作指令,有效提高虹膜信息的精准度,提升用户体验。此外,传感单元设置于显示单元的下方,相较于摄像头独立突出设置于显示屏区域外的结构,可以有效缩小移动设备的整体厚度,使得穿戴式设备或是移动设备更加轻薄、更适用于柔性穿戴式设备或是移动设备、满足市场的需求。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者终端设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者终端设备所固有的要素。在没有更多限制的情况下,由语句“包括……”或“包含……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者终端设备中还存在另外的要素。此外,在本文中,“大于”、“小于”、“超过”等理解为不包括本数;“以上”、“以下”、“以内”等理解为包括本数。
本领域内的技术人员应明白,上述各实施例可提供为方法、装置、或计算机程序产品。这些实施例可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。上述各实施例涉及的方法中的全部或部分步骤可以通过程序来指令相关的硬件来完成,所述的程序可以存储于计算机设备可读取的存储介质中,用于执行上述各实施例方法所述的全部或部分步骤。所述计算机设备,包括但不限于:个人计算机、服务器、通用计算机、专用计算机、网络设备、嵌入式设备、可编程设备、智能移动终端、智能家居设备、穿戴式智能设备、车载智能设备等;所述的存储介质,包括但不限于:RAM、ROM、磁碟、磁带、光盘、闪存、U盘、移动硬盘、存储卡、记忆棒、网络服务器存储、网络云存储等。
上述各实施例是参照根据实施例所述的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到计算机设备的处理器以产生一个机器,使得通过计算机设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机设备以特定方式工作的计算机设备可读存储器中,使得存储在该计算机设备可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机设备上,使得在计算机设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
尽管已经对上述各实施例进行了描述,但本领域内的技术人员一旦得知了基本创造性概念,则可对这些实施例做出另外的变更和修改,所以以上所述仅为本发明的实施例,并非因此限制本发明的专利保护范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围之内。
Claims (26)
- 一种虹膜识别的方法,其特征在于,所述方法应用于虹膜识别的装置,所述装置包括显示单元、传感单元和红外光源;所述显示单元上设置有虹膜识别区,所述传感单元位于所述虹膜识别区的下方,所述传感单元包括红外感应层;所述红外光源用于发出红外光,所述红外感应层用于感知用户虹膜反射的红外光信号,并捕捉用户的虹膜信息;所述方法包括以下步骤:预先设置操作配置信息,所述操作配置信息包括虹膜信息与操作指令的对应关系;捕捉用户在虹膜识别区上的虹膜信息,判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述步骤“预先设置操作配置信息”包括:接收用户设置命令,显示虹膜识别区;捕捉用户的虹膜信息并保存;显示一操作指令标识列表,所述操作指令标识列表中包含着一个或多个操作指令对应的标识,每一操作指令标识对应一操作指令;接收用户对操作指令标识的选择指令,建立所选中的操作指令标识对应的操作指令与所捕捉的虹膜信息的对应关系,并保存至操作配置信息。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述操作指令为画面切换指令;步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令”包括:判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是则对画面进行切换,否则不对画面进行切换。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述操作指令为支付指令;步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令”包括:判断捕捉到的虹膜信息与支付指令对应的虹膜信息是否匹配,若是则支付成功,否则支付失败。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述操作指令为用户身份信息登录指令;步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则执行该虹膜信息对应的操作命令,否则不执行所述操作指令”包括:判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是用户身份信息登录成功,否则用户身份信息登录失败。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述传感单元包括TFT影像感测阵列薄膜,所述红外感应层包括红外光敏二极管或红外光敏电晶管所形成的阵列。
- 如权利要求1所述的虹膜识别的方法,其特征在于,步骤“判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配”具体包括:根据捕捉到虹膜信息计算其特征值,并与操作配置信息中预设的虹膜信息的特征值进行对比;当误差小于预设值时,判定为相匹配,否则判定为不匹配。
- 如权利要求1或7所述的虹膜识别的方法,其特征在于,所述方法还包括步骤:当判定操作配置信息中没有与所捕捉的虹膜信息相匹配的预设虹膜信息时,发出提示信息。
- 如权利要求8所述的虹膜识别的方法,其特征在于,所述提示信息包括声音提示信息、图像提示信息、光线提示信息、视频提示信息中的一种或多种。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述显示单元包括AMOLED显示屏或LCD液晶显示屏。
- 如权利要求10所述的虹膜识别的方法,其特征在于,当所述显示单元为LCD液晶显示屏时,所述传感单元的下方还设置有背光单元,所述传感单元设置于背光单元和LCD液晶显示屏之间。
- 如权利要求1所述的虹膜识别的方法,其特征在于,所述虹膜识别区包括多个虹膜识别子区域,每一虹膜识别子区域的下方对应设置一传感单元。
- 如权利要求12所述的虹膜识别的方法,其特征在于,所述装置还包括传感单元控制电路,所述方法还包括:接收用户对虹膜识别子区域的启动指令,传感单元控制电路开启所述虹膜识别子区域的下方的传感单元,以及接收用户对虹膜识别子区域的关闭指令,传感单元控制电路关闭所述虹膜识别子区域的下方的传感单元。
- 一种虹膜识别的装置,其特征在于,所述装置包括显示单元、传感单元和红外光源;所述显示单元上设置有虹膜识别区,所述传感单元位于所述虹膜识别区的下方,所述传感单元包括红外感应层;所述红外光源用于发出红外光,所述红外感应层用于感知用户虹膜反射的红外光信号,并捕捉用户的虹膜信息;所述装置还包括操作信息设置单元、判断单元和处理单元;所述操作信息设置单元用于预先设置操作配置信息,所述操作配置信息包括虹膜信息与操作指令的对应关系;所述传感单元用于捕捉用户在虹膜识别区上的虹膜信息,所述判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述装置包括操作指令接收单元,所述“操作信息设置单元用于预先设置操作配置信息”包括:所述操作指令接收单元用于接收用户设置命令,所述显示单元用于显示虹膜识别区;所述传感单元用于捕捉用户的虹膜信息并保存;所述显示单元用于显示一操作指令标识列表;所述操作指令标识列表中包含着一个或多个操作指令对应的标识,每一操作指令标识对应一操作指令;所述操作指令接收单元用于接收用户对操作指令标识的选择指令,处理单元用于建立所选中的操作指令标识对应的操作指令与所捕捉的虹膜信息的对应关系,并保存至操作配置信息。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述操作指令为 画面切换指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:判断单元用于判断捕捉到的用户的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是则处理单元用于对画面进行切换,否则处理单元不对画面进行切换。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述操作指令为支付指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:判断单元用于判断判断捕捉到的虹膜信息与支付指令对应的虹膜信息是否匹配,若是则处理单元执行所述支付指令,支付成功,否则处理单元不执行所述支付指令,支付失败。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述操作指令为用户身份信息登录指令;所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配,若是则处理单元用于执行该虹膜信息对应的操作命令,否则处理单元不执行所述操作指令”包括:判断单元用于判断捕捉到的虹膜信息与画面切换指令对应的虹膜信息是否匹配,若是处理单元执行所述用户信息身份登录指令,登录成功,否则处理单元不执行所述用户信息身份登录指令,登录失败。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述传感单元包括TFT影像感测阵列薄膜,所述红外感应层包括红外光敏二极管或红外光敏电晶管所形成的阵列。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述“判断单元用于判断所捕捉的虹膜信息与操作配置信息中的预设虹膜信息是否匹配”具体包括:判断单元用于根据捕捉到虹膜信息计算其特征值,并与操作配置信息中预设的虹膜信息的特征值进行对比;当误差小于预设值时,判定为相匹配,否则判定为不匹配。
- 如权利要求14或20所述的虹膜识别的装置,其特征在于,所述处理单元还用于在判断单元判定操作配置信息中没有与所捕捉的虹膜信息相匹配的预设虹膜信息时,发出提示信息。
- 如权利要求21所述的虹膜识别的装置,其特征在于,所述提示信息包括声音提示信息、图像提示信息、光线提示信息、视频提示信息中的一种或多种。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述显示单元包括AMOLED显示屏或LCD液晶显示屏。
- 如权利要求23所述的虹膜识别的装置,其特征在于,当所述显示单元为LCD液晶显示屏时,所述传感单元的下方还设置有背光单元,所述传感单元设置于背光单元和LCD液晶显示屏之间。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述虹膜识别区包括多个虹膜识别子区域,每一虹膜识别子区域的下方对应设置一传感单元。
- 如权利要求14所述的虹膜识别的装置,其特征在于,所述装置还包括传感单元控制电路和操作指令接收单元,所述操作指令接收单元用于接收用户对虹膜识别子区域的启动指令,所述传感单元控制电路用于开启所述虹膜识别子区域的下方的传感单元,以及所述操作指令接收单元用于接收用户对虹膜识别子区域的关闭指令,所述传感单元控制电路用于关闭所述虹膜识别子区域的下方的传感单元。
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/607,093 US11250257B2 (en) | 2017-04-20 | 2018-03-02 | Method and device for iris recognition |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710260305.X | 2017-04-20 | ||
CN201710260305.XA CN108734063A (zh) | 2017-04-20 | 2017-04-20 | 一种虹膜识别的方法和装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018192308A1 true WO2018192308A1 (zh) | 2018-10-25 |
Family
ID=63855552
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/077876 WO2018192308A1 (zh) | 2017-04-20 | 2018-03-02 | 一种虹膜识别的方法和装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11250257B2 (zh) |
CN (1) | CN108734063A (zh) |
TW (1) | TWI676114B (zh) |
WO (1) | WO2018192308A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113764533A (zh) * | 2017-08-24 | 2021-12-07 | 上海耕岩智能科技有限公司 | 红外光敏晶体管、红外光侦测器件、显示装置、制备方法 |
CN112380966B (zh) * | 2020-11-12 | 2023-06-02 | 西安电子科技大学 | 基于特征点重投影的单眼虹膜匹配方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110002510A1 (en) * | 2008-09-15 | 2011-01-06 | Global Rainmakers, Inc | Operator interface for face and iris recognition devices |
CN105676565A (zh) * | 2016-03-30 | 2016-06-15 | 武汉虹识技术有限公司 | 一种虹膜识别镜头、装置及方法 |
CN106022275A (zh) * | 2016-05-26 | 2016-10-12 | 青岛海信移动通信技术股份有限公司 | 虹膜识别方法、装置及移动终端 |
CN106527706A (zh) * | 2016-04-22 | 2017-03-22 | 贵阳科安科技有限公司 | 用于移动终端虹膜识别的引导指示人机接口系统和方法 |
CN106550178A (zh) * | 2016-10-08 | 2017-03-29 | 深圳市金立通信设备有限公司 | 一种摄像头、终端及其成像方法 |
CN106548115A (zh) * | 2015-09-18 | 2017-03-29 | 比亚迪股份有限公司 | 摄像头组件及具有其的移动设备 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7154157B2 (en) * | 2002-12-30 | 2006-12-26 | Intel Corporation | Stacked semiconductor radiation sensors having color component and infrared sensing capability |
TWI251330B (en) | 2003-05-09 | 2006-03-11 | Au Optronics Corp | CMOS image sensor and method for producing the same |
CN104102906B (zh) * | 2014-07-16 | 2017-10-17 | 广东欧珀移动通信有限公司 | 一种应用于虹膜识别系统的数据处理方法和设备 |
KR102412290B1 (ko) * | 2014-09-24 | 2022-06-22 | 프린스톤 아이덴티티, 인크. | 생체측정 키를 이용한 모바일 장치에서의 무선 통신 장치 기능의 제어 |
KR102438110B1 (ko) * | 2014-10-15 | 2022-08-31 | 삼성전자주식회사 | 사용자 단말 장치 및 이의 홍채 인식 방법 |
KR102305997B1 (ko) * | 2014-11-17 | 2021-09-28 | 엘지이노텍 주식회사 | 홍채 인식 카메라 시스템 및 이를 포함하는 단말기와 그 시스템의 홍채 인식 방법 |
KR102277212B1 (ko) * | 2015-01-23 | 2021-07-15 | 삼성전자주식회사 | 디스플레이 정보를 이용한 홍채 인증 방법 및 장치 |
US20160266695A1 (en) * | 2015-03-10 | 2016-09-15 | Crucialtec Co., Ltd. | Display apparatus having image scanning function |
CN204808361U (zh) | 2015-07-29 | 2015-11-25 | 京东方科技集团股份有限公司 | 一种基板、指纹识别传感器、指纹识别装置 |
US9870049B2 (en) * | 2015-07-31 | 2018-01-16 | Google Llc | Reflective lenses to auto-calibrate a wearable system |
CN106406509B (zh) * | 2016-05-16 | 2023-08-01 | 上海青研科技有限公司 | 一种头戴式眼控虚拟现实设备 |
CN106019953A (zh) * | 2016-05-19 | 2016-10-12 | 捷开通讯(深圳)有限公司 | 移动终端以及基于虹膜识别进行红外控制的方法 |
CN105975136B (zh) * | 2016-06-30 | 2019-03-26 | 京东方科技集团股份有限公司 | 显示基板及其制造方法和显示装置 |
CN106372587A (zh) * | 2016-08-29 | 2017-02-01 | 乐视控股(北京)有限公司 | 一种指纹识别方法及装置 |
2017
- 2017-04-20 CN CN201710260305.XA patent/CN108734063A/zh active Pending
2018
- 2018-03-02 WO PCT/CN2018/077876 patent/WO2018192308A1/zh active Application Filing
- 2018-03-02 US US16/607,093 patent/US11250257B2/en active Active
- 2018-03-28 TW TW107110819A patent/TWI676114B/zh active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110002510A1 (en) * | 2008-09-15 | 2011-01-06 | Global Rainmakers, Inc | Operator interface for face and iris recognition devices |
CN106548115A (zh) * | 2015-09-18 | 2017-03-29 | 比亚迪股份有限公司 | 摄像头组件及具有其的移动设备 |
CN105676565A (zh) * | 2016-03-30 | 2016-06-15 | 武汉虹识技术有限公司 | 一种虹膜识别镜头、装置及方法 |
CN106527706A (zh) * | 2016-04-22 | 2017-03-22 | 贵阳科安科技有限公司 | 用于移动终端虹膜识别的引导指示人机接口系统和方法 |
CN106022275A (zh) * | 2016-05-26 | 2016-10-12 | 青岛海信移动通信技术股份有限公司 | 虹膜识别方法、装置及移动终端 |
CN106550178A (zh) * | 2016-10-08 | 2017-03-29 | 深圳市金立通信设备有限公司 | 一种摄像头、终端及其成像方法 |
Also Published As
Publication number | Publication date |
---|---|
TW201839649A (zh) | 2018-11-01 |
US11250257B2 (en) | 2022-02-15 |
US20200065578A1 (en) | 2020-02-27 |
CN108734063A (zh) | 2018-11-02 |
TWI676114B (zh) | 2019-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11137845B2 (en) | Method and device for recognizing contact of foldable display screen | |
US11227134B2 (en) | Method and device for synchronously collecting fingerprint information | |
TWI606387B (zh) | 用於多層級命令感測之方法及設備 | |
US11314962B2 (en) | Electronic device and method for controlling fingerprint recognition-based electronic device | |
WO2018228010A1 (zh) | 一种终端及终端显示亮度调节的方法 | |
US11911133B2 (en) | Operation method and device for physiological health detection | |
TW202004542A (zh) | 一種同步驗證指紋資訊的螢幕解鎖方法和裝置 | |
TWI720484B (zh) | 一種同步驗證指紋資訊的觸控元件操作方法和裝置 | |
WO2018192312A1 (zh) | 一种眼球追踪操作的方法和装置 | |
US11507182B2 (en) | Method and device for eyeball tracking operation | |
WO2018192308A1 (zh) | 一种虹膜识别的方法和装置 | |
WO2018192313A1 (zh) | 一种虹膜识别的方法和装置 | |
US11914807B2 (en) | Method and device for biometric recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18787752 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 18787752 Country of ref document: EP Kind code of ref document: A1 |