WO2019151642A1 - Apparatus and method for displaying guidance information for changing the state of a fingerprint - Google Patents

Apparatus and method for displaying guidance information for changing the state of a fingerprint

Info

Publication number
WO2019151642A1
Authority
WO
WIPO (PCT)
Prior art keywords
fingerprint
area
image
processor
state
Prior art date
Application number
PCT/KR2018/016015
Other languages
English (en)
Korean (ko)
Inventor
김용석
김선아
김정후
프루신스키발레리
허창룡
현석
송경훈
조치현
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2019151642A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/13 - Sensors therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1365 - Matching; Classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 - Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04105 - Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 - Indexing scheme relating to G06F3/048
    • G06F2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • Various embodiments relate to a method of recognizing a fingerprint and an electronic device thereof.
  • An electronic device such as a smart phone may store various kinds of personal information such as a phone number and authentication information (for example, a password). Accordingly, the electronic device may provide an authentication service to protect personal information stored in the electronic device from others. For example, the electronic device may provide an authentication service (eg, a biometric service) using biometric information such as an iris, a fingerprint, a face, a palm, and a vein.
  • As a fingerprint sensor may be integrated inside the display, a method of inputting a fingerprint through the display may be required.
  • However, the success rate of fingerprint recognition may decrease depending on the state of the fingerprint, so a solution for increasing the success rate of fingerprint recognition may be required.
  • The fingerprint recognition rate may be improved by changing the state of the fingerprint.
  • An electronic device may include a touch screen, a memory, and a processor.
  • The processor may be configured to: acquire, based on a first touch input detected through the touch screen, a first image of a fingerprint of a user associated with the first touch input; identify, based on the acquired first image and a reference image of the fingerprint stored in the memory, that the state of the fingerprint corresponds to a first state; and display, through the touch screen and based on the identification, information guiding a second touch input for changing the state of the fingerprint from the first state to a second state.
  • An electronic device may include a touch screen, a plurality of fingerprint sensors disposed under the touch screen, and a processor.
  • The processor may be configured to detect a touch input through the touch screen, determine whether the position of the detected touch input is included in the positions of the plurality of fingerprint sensors, and display, based on the determination, information guiding a fingerprint input at the position of the detected touch input.
  • A method of an electronic device may include: acquiring, based on a first touch input detected through a touch screen, a first image of a fingerprint of a user associated with the first touch input; identifying, based on the acquired first image and a reference image of the fingerprint stored in a memory, that the state of the fingerprint corresponds to a first state; and displaying, based on the identification, information guiding a second touch input for changing the state of the fingerprint from the first state to a second state.
  • A method of an electronic device may include detecting a touch input through a touch screen, determining whether the position of the detected touch input is included in the positions of a plurality of fingerprint sensors, and displaying, based on the determination, information guiding a fingerprint input at the position of the detected touch input.
  • the electronic device and the method according to various embodiments of the present disclosure may improve the recognition rate of the fingerprint by displaying information guiding a touch input based on the identified fingerprint state.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • FIG. 2 illustrates an example of a functional configuration of an electronic device according to various embodiments of the present disclosure.
  • FIGS. 3A and 3B illustrate examples of fingerprint states according to various embodiments.
  • FIG. 4 illustrates an operation principle of a fingerprint sensor according to various embodiments of the present disclosure.
  • FIG. 5 illustrates an example of an operation of an electronic device for displaying information for guiding a touch input according to various embodiments of the present disclosure.
  • FIGS. 6A and 6B illustrate an example in which a fingerprint sensor is disposed in an electronic device according to various embodiments of the present disclosure.
  • FIG. 7 illustrates an example of an operation of an electronic device to acquire a first image according to various embodiments of the present disclosure.
  • FIG. 8 illustrates an example of an operation of an electronic device to determine a state of a fingerprint according to various embodiments of the present disclosure.
  • FIG. 9 illustrates an example of an operation of an electronic device for displaying information guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 10 illustrates an example of information guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 11 illustrates an example of an operation of an electronic device for displaying information guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 12 illustrates an example of information guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 13 illustrates an example of an operation of an electronic device for displaying information guiding a second touch input according to various embodiments of the present disclosure.
  • 14A to 14C illustrate examples of information for guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 15 illustrates an example of an operation of an electronic device for displaying information guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 16 illustrates an example of information guiding a second touch input according to various embodiments of the present disclosure.
  • FIG. 17 illustrates an example of an operation of an electronic device for displaying information guiding a second touch input based on a grip sensor according to various embodiments of the present disclosure.
  • FIGS. 18A and 18B illustrate an example of information for guiding a second touch input based on a grip sensor according to various embodiments.
  • FIG. 19 illustrates an example of an operation of an electronic device for displaying information guiding a first touch input according to various embodiments of the present disclosure.
  • FIGS. 20A and 20B illustrate an example of information for guiding a first touch input according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (eg, short-range wireless communication), or may communicate with the electronic device 104 or the server 108 through a second network 199 (eg, long-range wireless communication).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, and an antenna module 197.
  • In some embodiments, some components, such as the sensor module 176 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor), may be embedded in the display device 160 (eg, a display).
  • The processor 120 may, for example, drive software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing and operations.
  • The processor 120 may load a command or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process it, and store the resulting data in the nonvolatile memory 134.
  • The processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and, additionally or alternatively, a coprocessor 123 that operates independently of the main processor 121, uses less power than the main processor 121, or is specialized for a designated function (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor).
  • The coprocessor 123 may be operated separately from the main processor 121 or may be embedded in it.
  • In this case, the coprocessor 123 may control at least some of the functions or states associated with at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190), for example in place of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state.
  • According to an embodiment, the coprocessor 123 (eg, an image signal processor or a communication processor) may be implemented as a component of another functionally related component (eg, the camera module 180 or the communication module 190).
  • The memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176), for example software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
  • the program 140 is software stored in the memory 130 and may include, for example, an operating system 142, a middleware 144, or an application 146.
  • The input device 150 is a device for receiving a command or data, to be used by a component of the electronic device 101 (eg, the processor 120), from the outside of the electronic device 101 (eg, a user).
  • The input device 150 may include, for example, a microphone, a mouse, or a keyboard.
  • the sound output device 155 is a device for outputting sound signals to the outside of the electronic device 101.
  • The sound output device 155 may include a speaker used for general purposes, such as multimedia playback or recording playback, and a receiver used only for receiving calls. According to an embodiment, the receiver may be formed integrally with, or separately from, the speaker.
  • the display device 160 is a device for visually providing information to a user of the electronic device 101.
  • The display device 160 may include a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • According to an embodiment, the display device 160 may include touch circuitry, or a pressure sensor capable of measuring the strength of the pressure of a touch.
  • The audio module 170 may bidirectionally convert between a sound and an electrical signal. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or an external electronic device (eg, the electronic device 102, such as a speaker or headphones) connected to the electronic device 101 by wire or wirelessly.
  • the sensor module 176 may generate an electrical signal or data value corresponding to an operating state (eg, power or temperature) inside the electronic device 101 or an external environmental state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, a temperature sensor, a humidity sensor, Or an illumination sensor.
  • the interface 177 may support a specified protocol that may be connected to an external electronic device (for example, the electronic device 102) by wire or wirelessly.
  • the interface 177 may include a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • The connection terminal 178 may include a connector capable of physically connecting the electronic device 101 and an external electronic device (eg, the electronic device 102), for example an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that can be perceived by the user through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and videos. According to an embodiment of the present disclosure, the camera module 180 may include one or more lenses, an image sensor, an image signal processor, or a flash.
  • the power management module 188 is a module for managing power supplied to the electronic device 101, and may be configured, for example, as at least part of a power management integrated circuit (PMIC).
  • the battery 189 is a device for supplying power to at least one component of the electronic device 101 and may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may establish a wired or wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and may support performing communication through the established channel.
  • the communication module 190 may include one or more communication processors that support wired communication or wireless communication that operate independently of the processor 120 (eg, an application processor).
  • The communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module), and may communicate with an external electronic device through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network (eg, a LAN or a WAN)) using the corresponding communication module.
  • The various types of communication modules 190 described above may be implemented as a single chip or as separate chips.
  • the wireless communication module 192 may distinguish and authenticate the electronic device 101 in a communication network by using user information stored in the subscriber identification module 196.
  • the antenna module 197 may include one or more antennas for transmitting or receiving signals or power from the outside.
  • the communication module 190 (for example, the wireless communication module 192) may transmit a signal to or receive a signal from an external electronic device through an antenna suitable for a communication method.
  • Some of the above components may be connected to each other through a communication scheme between peripheral devices (eg, a bus, general purpose input/output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)) to exchange signals (eg, commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be a device of the same or different type as the electronic device 101.
  • all or part of operations executed in the electronic device 101 may be executed in another or a plurality of external electronic devices.
  • For example, when the electronic device 101 needs to perform a function or service, the electronic device 101 may, instead of or in addition to executing the function or service by itself, request at least some associated functions from an external electronic device.
  • the external electronic device may execute the requested function or additional function and transmit the result to the electronic device 101.
  • The electronic device 101 may provide the requested function or service by processing the received result as it is or after additional processing.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • Electronic devices may be various types of devices.
  • the electronic device may include, for example, at least one of a portable communication device (eg, a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • An electronic device according to an embodiment of the present disclosure is not limited to the above-described devices.
  • When a (eg, first) component is referred to as being "(functionally or communicatively) coupled with" or "connected to" another (eg, second) component, the component may be directly connected to the other component or may be connected through yet another component (eg, a third component).
  • The term "module" includes a unit composed of hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • A module may be an integrally formed part, or a minimum unit or a part thereof, that performs one or more functions.
  • the module may be configured as an application-specific integrated circuit (ASIC).
  • Various embodiments of the present disclosure may be implemented as software (eg, the program 140) including instructions stored in a machine-readable storage medium (eg, the internal memory 136 or the external memory 138) readable by a machine (eg, a computer).
  • The machine may be a device capable of calling a stored instruction from the storage medium and operating according to the called instruction, and may include the electronic device (eg, the electronic device 101) according to the disclosed embodiments.
  • When the instruction is executed by a processor (eg, the processor 120), the processor may perform the function corresponding to the instruction directly, or by using other components under the control of the processor.
  • the instructions can include code generated or executed by a compiler or interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, "non-transitory" means only that the storage medium is tangible and does not include a signal; the term does not distinguish whether data is stored semi-permanently or temporarily on the storage medium.
  • a method may be included in a computer program product.
  • the computer program product may be traded between the seller and the buyer as a product.
  • The computer program product may be distributed online in the form of a device-readable storage medium (eg, a compact disc read only memory (CD-ROM)) or through an application store (eg, Play Store™).
  • at least a portion of the computer program product may be stored at least temporarily on a storage medium such as a server of a manufacturer, a server of an application store, or a relay server, or may be temporarily created.
  • Each component (eg, a module or a program) according to various embodiments may be composed of a single entity or a plurality of entities, and some of the above-described subcomponents may be omitted, or other subcomponents may be further included.
  • Alternatively or additionally, some components (eg, modules or programs) may be integrated into one entity to perform the same or similar functions performed by each corresponding component prior to integration.
  • Operations performed by a module, a program, or another component according to various embodiments may be executed sequentially, in parallel, repeatedly, or heuristically, at least some of the operations may be executed in a different order or omitted, or another operation may be added.
  • FIG. 2 illustrates an example of a functional configuration of an electronic device according to various embodiments of the present disclosure.
  • the functional configuration of such an electronic device may be included in the electronic device 101 shown in FIG. 1.
  • the electronic device 101 may include a display device 160, a sensor module 176, a memory 130, and a processor 120.
  • The display device 160 may output data or signals in the form of an image or a video.
  • For example, the display device 160 may output data or signals received from the memory 130 or the sensor module 176 in the form of an image or a video according to a control signal of the processor 120.
  • the display device 160 may display information for guiding a fingerprint input.
  • The information may include an image of a fingerprint or a similar shape. For example, when it is determined that the state of the fingerprint corresponds to a first state (eg, a state in which the fingerprint is dry or lacks oil), the display device 160 may display information guiding the user's touch input according to a control signal of the processor 120.
  • The first state may be a state in which the fingerprint is dry, or in which oil is lacking in the fingerprint. Based on a result of comparing the reference image of the fingerprint with the image of the input fingerprint, the first state may correspond to a case in which the fingerprint is estimated to be that of the authenticated user but the image of the input fingerprint does not match the reference image, a case in which the matching rate between the image of the input fingerprint and the reference image is lower than a specified value, or a case in which, within the image of the input fingerprint, the ratio of the area caused by the ridge regions located in the fingerprint is lower than the ratio of the area caused by the ridge regions located in the fingerprint within the reference image.
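  • As a rough illustration of the comparison just described, the following Python sketch computes a pixel-level matching rate and a ridge-area ratio over aligned binary fingerprint images and flags the first (dry) state. The helper names and both thresholds are assumptions for illustration; the patent does not specify a particular algorithm.

```python
import numpy as np

# Illustrative thresholds, not values from the patent.
MATCH_RATE_THRESHOLD = 0.80
RIDGE_RATIO_MARGIN = 0.10

def ridge_ratio(img: np.ndarray) -> float:
    """Fraction of pixels in the first color (black = 0), i.e. the ridge area."""
    return float((img == 0).mean())

def match_rate(input_img: np.ndarray, reference_img: np.ndarray) -> float:
    """Fraction of agreeing pixels between two aligned, same-shape binary images."""
    return float((input_img == reference_img).mean())

def is_first_state(input_img: np.ndarray, reference_img: np.ndarray) -> bool:
    """True if the fingerprint looks dry: the matching rate is below a specified
    value, or the ridge-area ratio is noticeably lower than in the reference."""
    low_match = match_rate(input_img, reference_img) < MATCH_RATE_THRESHOLD
    low_ridge = ridge_ratio(input_img) < ridge_ratio(reference_img) - RIDGE_RATIO_MARGIN
    return low_match or low_ridge
```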
  • the touch input may include a drag input.
  • the information for guiding the touch input may have a form including a first point and a second point, a plurality of objects arranged in a grid, or an arrow indicating a specific direction.
  • the display device 160 may display a graphical user interface (GUI) for interaction between the user and the electronic device 101.
  • the display device 160 may include a liquid crystal display (LCD) or an organic light emitting diode (OLED).
  • The display device 160 may be electrically or functionally coupled with a touch screen panel (TSP) module 222 capable of receiving a user's touch input, and with the fingerprint sensor 224 capable of acquiring information about the fingerprint of the user related to the touch input.
  • the sensor module 176 may receive data for determining a context related to the inside or the outside of the electronic device 101.
  • The sensor module 176 may be configured by integrating various sensors (eg, a grip sensor, a proximity sensor, a geomagnetic sensor, a gyro sensor, an illuminance sensor, a barometric pressure sensor, a pressure sensor, a touch sensor, or a biometric sensor).
  • the sensor module 176 may transmit data about an external situation related to the user to the processor 120.
  • the sensor module 176 may include a TSP module 222, a fingerprint sensor 224, and a grip sensor 226.
  • The TSP module 222 may receive data related to a user's touch input. According to various embodiments of the present disclosure, the TSP module 222 may detect a touch input of a finger or a pen. The touch input may include touch and release, drag and drop, long touch, or force touch. According to various embodiments, the TSP module 222 may receive information about the location on the display where the user's touch is input and the area over which the user's touch is input. According to various embodiments, the TSP module 222 may map the information about the location and area where the user's touch is input to the time when the touch is input, and store the information in the memory 130.
  • a user of the electronic device 101 may contact the electronic device 101 with an ear of the user to perform a voice call.
  • The TSP module 222 may map information about the location and area of the touch input caused by the contact (eg, the front of the display) to the time at which the contact occurred (eg, 11:10 AM), and may store it in the memory 130.
  • The TSP module 222 may store mapping information regarding a plurality of touch inputs. In the above-described embodiments, the TSP module 222 is described as mapping the information about the position and area of the touch input to the time at which the touch input occurs, but the disclosure is not limited thereto. For example, the TSP module 222 may provide only the information about the position and area where the touch input is detected to the processor 120, and the processor 120 may map the acquired information about the position and area to the time, as sketched below.
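  • A minimal sketch of this mapping, with hypothetical type and method names: the TSP module reports a position and a contact area, and the event is stamped with the time and stored so that it can later be looked up by time.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TouchEvent:
    x: int            # touch position on the display (pixels)
    y: int
    area: float       # contact area reported by the TSP module
    timestamp: float  # time at which the touch was input

@dataclass
class TouchLog:
    events: list[TouchEvent] = field(default_factory=list)

    def record(self, x: int, y: int, area: float) -> None:
        # Map the position/area information to the current time and store it.
        self.events.append(TouchEvent(x, y, area, time.time()))

    def latest_before(self, t: float) -> TouchEvent | None:
        # Return the touch input received closest in time before t
        # (cf. the memory lookup described later in this section).
        prior = [e for e in self.events if e.timestamp <= t]
        return max(prior, key=lambda e: e.timestamp) if prior else None
```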
  • the TSP module 222 may receive a user's touch input independently of the operation mode of the display device 160.
  • For example, the display device 160 may operate in an off state. The off state may be a state in which power is supplied from the electronic device 101 to the display device 160 but no object is displayed, so that a blank screen or a black screen is displayed on the display device 160.
  • The off state may also be referred to as an inactive state, a deactivated state, a sleep state, or an idle state.
  • the on state may include a state in which at least one object is displayed on the display device 160 based on data or a signal received from the processor 120.
  • the on state may be variously referred to as an active state or an activated state.
  • the electronic device 101 may operate in a mode of displaying content through a display.
  • For example, while the processor 120 operates in the sleep state, the electronic device 101 may display on the screen the current time, the date, and at least one icon for indicating a notification provided from at least one application.
  • the mode may be referred to as always on display (AOD) mode.
  • the TSP module 222 may receive a user touch input through the touch screen regardless of the on / off state of the display device 160.
  • the TSP module 222 may operate in an active state, for example, while the display device 160 operates in an off state.
  • the TSP module 222 may receive a user's touch input while the display device 160 displays the blank screen or the black screen.
  • the TSP module 222 may receive a user's touch input, for example, while the display device 160 operates in the on state.
  • the TSP module 222 may provide the processor 120 with information about the location of the received touch input on the display device 160 and the area of the received touch input.
  • the processor 120 may execute an application or change the display of the display device 160 based on the information about the touch input provided from the TSP module 222.
  • the TSP module 222 may provide the processor 120 with the received touch input and data related to the received touch input.
  • The TSP module 222 may receive a touch input while the display device 160 operates in the off state. Accordingly, the TSP module 222 may transmit, to the processor 120, a signal for controlling the display device 160 to be turned on (or changed to the on state), based on the off state of the display device 160 and the received touch input.
  • the TSP module 222 may transmit data regarding the location, area, and time of the touch input to the memory 130 to store data related to the touch input.
  • The processor 120 may acquire the data related to the touch input from the TSP module 222, transmit the acquired data to the memory 130, and transmit a control signal instructing the memory 130 to store the acquired data.
  • the TSP module 222 may store the obtained data in the memory 130 by transmitting the obtained data to the memory 130.
  • the fingerprint sensor 224 may receive data related to a fingerprint through a user's touch input.
  • the fingerprint sensor 224 may receive data for determining whether a user associated with a touch input is an authenticated user, according to a characteristic pattern of valleys and ridges located within the user's fingerprint.
  • The fingerprint sensor 224 may operate according to an optical scheme based on a difference in light reflected by the valleys and ridges located in the fingerprint, an ultrasonic scheme based on a phase difference of ultrasonic waves reflected by the ridges located in the fingerprint, or a capacitive scheme based on a difference in permittivity caused by the valleys and ridges located in the fingerprint.
  • the grip sensor 226 may detect whether the electronic device 101 is held by a user and the direction of the held hand.
  • the grip sensor 226 may be configured to be integrated with a gyro sensor (not shown), a geomagnetic sensor (not shown), or a proximity sensor (not shown).
  • the electronic device 101 may detect a change in capacitance or a change in the magnetic field between the detection object and the proximity sensor through the proximity sensor.
  • Using the grip sensor 226, the electronic device 101 may identify whether the user holds the electronic device 101 with the left hand or the right hand, based on the magnitude of the change in the magnetic field on the left side and the magnitude of the change in the magnetic field on the right side of the electronic device 101.
  • The grip sensor 226 may transmit information about the position of the hand of the user holding the electronic device 101 to the processor 120. For example, when the user grips the electronic device 101 with the left hand, the grip sensor 226 may transmit, to the processor 120, information indicating that the electronic device 101 is gripped by the left hand.
  • the processor 120 may control to display an interface for fingerprint input on an area of the display device 160 where the fingerprint sensor on the left side of the plurality of fingerprint sensors is located.
  • Similarly, when the user grips the electronic device 101 with the right hand, the grip sensor 226 may transmit, to the processor 120, information indicating that the electronic device 101 is gripped by the right hand.
  • the processor 120 may display an arrow figure for guiding a drag input in a left direction through the display device 160.
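  • The left/right decision and the resulting guidance can be sketched as a simple comparison of the field changes measured on each side of the device. The function names, the threshold, and the guidance mapping for the left hand below are illustrative assumptions (the document only states the right-hand case explicitly).

```python
def detect_grip_hand(left_field_delta: float, right_field_delta: float,
                     min_delta: float = 1.0) -> str | None:
    """Return 'left' or 'right' depending on which side of the device shows
    the larger magnetic-field change, or None if the device is not held."""
    left, right = abs(left_field_delta), abs(right_field_delta)
    if max(left, right) < min_delta:
        return None  # no grip detected
    return "left" if left > right else "right"

def guidance_for_grip(grip_hand: str | None) -> str:
    # Eg, guide a drag toward the left when the device is held in the right
    # hand, as described above; the left-hand case here mirrors it.
    if grip_hand == "right":
        return "display arrow guiding a drag in the left direction"
    if grip_hand == "left":
        return "display arrow guiding a drag in the right direction"
    return "display the fingerprint input interface at the default position"
```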
  • the memory 130 may correspond to the memory 130 shown in FIG. 1.
  • the memory 130 may store a command, a control command code, control data, or user data for controlling the electronic device 101.
  • the memory 130 may include an application, an operating system (OS), middleware, and a device driver.
  • The memory 130 may store pattern information that is registered in advance by an authenticated user and used to release the locked state of the electronic device 101.
  • the pattern information may be information related to the order of a plurality of objects arranged in a grid.
  • the memory 130 may store information related to a touch input.
  • The memory 130 may acquire, from the TSP module 222 through the processor 120, information in which the location and area on the display device 160 where the user's touch is input are mapped to the time when the touch is input.
  • the memory 130 may provide information related to the touch input at the request of the processor 120.
  • the memory 130 may include one or more of volatile memory or non-volatile memory.
  • Volatile memory may include dynamic random access memory (DRAM), static RAM (SRAM), synchronous DRAM (SDRAM), phase-change RAM (PRAM), magnetic RAM (MRAM), resistive RAM (RRAM), ferroelectric RAM (FeRAM), and the like.
  • the nonvolatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, and the like.
  • The memory 130 may include a nonvolatile medium such as a hard disk drive (HDD), a solid state drive (SSD), an embedded multimedia card (eMMC), or universal flash storage (UFS).
  • The memory 130 may provide stored data based on a request of the processor 120 or another component (eg, the TSP module 222), or may record and store data received from the processor 120 or the other component (eg, the TSP module 222).
  • the memory 130 may receive information about the location, area, and time related to the touch input from the TSP module 222 according to a request of the TSP module 222.
  • Alternatively, the processor 120 may receive, from the TSP module 222, the information about the location, area, and time related to the touch input, and may transmit the received information to the memory 130 for storage.
  • The memory 130 may transmit, to the processor 120 at the request of the processor 120, information about the location and area of the touch input received closest in time to a specific time.
  • the processor 120 may control the overall operation of the electronic device 101.
  • the processor 120 may read or write data in the memory 130 and execute application instruction codes stored in the memory 130.
  • the processor 120 may perform user authentication using an image of a fingerprint input through a touch input.
  • the processor 120 may receive a reference image of the fingerprint of the authenticated user from the memory 130.
  • the processor 120 may perform user authentication by performing image matching between the received reference image and the image of the input fingerprint.
  • the processor 120 may obtain a matching rate between areas corresponding to the first color and a matching rate between areas corresponding to the second color based on the image matching.
  • the processor 120 may display various information through the display device 160.
  • When the processor 120 determines that the fingerprint input through the touch input corresponds to a first state (eg, a state in which the fingerprint is dry or lacks oil), the processor 120 may control the display device 160 to display information guiding a touch input for changing the state of the fingerprint from the first state to a second state (eg, a state in which the fingerprint is not dry or has sufficient oil).
  • The second state may be a state in which the fingerprint is not dry, or a state in which the fingerprint has sufficient oil. Based on a result of comparing the reference image of the fingerprint with the image of the input fingerprint, the second state may correspond to a case in which the matching rate between the image of the input fingerprint and the reference image is greater than or equal to a specified value, or a state in which the small valley regions that are generated within the ridge regions of the fingerprint when the fingerprint is dry are filled with a material such as sweat or oil.
  • The processor 120 may be electrically or functionally coupled with the other components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the memory 130).
  • the processor 120 may be configured of one or more processors.
  • The processor 120 may include an application processor (AP) that controls a higher-layer program such as an application, or a communication processor (CP) that performs control for communication with another electronic device.
  • FIGS. 3A and 3B illustrate examples of a fingerprint state according to various embodiments of the present disclosure.
  • a state of a fingerprint may correspond to a second state (eg, a state in which the fingerprint is not dry or a state in which the oil is sufficient in the fingerprint).
  • Referring to FIG. 3A, the user's fingerprint may be divided into ridge 302 regions, which correspond to relatively low positions with respect to the touch surface of the touch screen 303 (eg, the display device 160 of FIG. 1), and valley 301 regions, which correspond to relatively high positions with respect to the touch surface of the touch screen 303.
  • The ridge 302 regions included in the fingerprint are at relatively low positions with respect to the touch surface of the touch screen 303, and thus may be in physical contact with the touch screen 303.
  • The valley 301 regions included in the fingerprint are at relatively high positions with respect to the touch surface of the touch screen 303, and thus may not be in physical contact with the touch screen 303.
  • Accordingly, an air layer may be formed between a valley 301 region located in the fingerprint and the surface of the touch screen 303. Since the ridge 302 regions located within the fingerprint are in physical contact with the touch screen 303, no air layer may be formed between a ridge 302 region and the surface of the touch screen 303.
  • the processor 120 may acquire an image 310 corresponding to the second state of the fingerprint through the fingerprint sensor 224 when a user inputs a touch.
  • The image 310 corresponding to the second state may include a first area 312 indicated by a first color (eg, black) and a second area 314 indicated by a second color (eg, white).
  • An area of the ridge 302 located in the fingerprint may correspond to the first area 312 of the image 310 corresponding to the second state.
  • An area of the valley 301 located in the fingerprint may correspond to the second area 314 of the image 310 corresponding to the second state.
  • Referring to FIG. 3B, the state of the fingerprint may correspond to a first state (eg, a state in which the fingerprint is dry or lacks oil).
  • When the fingerprint corresponds to the first state, areas that are not in physical contact with the touch screen 303 may be generated inside the ridge 302 regions of the fingerprint. That is, within a ridge 302 region, small valley 304 areas may be created that are at relatively high positions with respect to the touch surface of the touch screen 303 and are not in physical contact with the touch screen 303.
  • When the state of the fingerprint corresponds to the second state (eg, a state in which the fingerprint is not dry), the small valley 304 areas may be filled with a material such as sweat or oil, and no air layer may exist between the touch surface of the touch screen 303 and the small valley 304 areas.
  • When the state of the fingerprint corresponds to the first state (eg, a state in which the fingerprint is dry), the small valley 304 areas may not contain a material such as sweat or oil, and an air layer may be formed between the small valley 304 areas and the touch surface of the touch screen 303.
  • the processor 120 may acquire an image 320 corresponding to the first state of the fingerprint through the fingerprint sensor 224 when a user inputs a touch.
  • The image 320 corresponding to the first state may include a third area 322 indicated by the first color (eg, black) and a fourth area 324 indicated by the second color (eg, white).
  • The fourth area 324 indicated by the second color may include the area caused by the valley 301 regions included in the fingerprint and the area caused by the small valley 304 regions generated according to the first state.
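  • In terms of FIGS. 3A and 3B, the extra small valley 304 regions of a dry fingerprint enlarge the second-color (white) area of the captured image relative to the reference. The sketch below encodes that interpretation; the margin value is an assumption.

```python
import numpy as np

def white_fraction(img: np.ndarray) -> float:
    """Fraction of pixels in the second color (white = 255) of a binary image."""
    return float((img == 255).mean())

def looks_dry(input_img: np.ndarray, reference_img: np.ndarray,
              margin: float = 0.05) -> bool:
    """True if the input image's white area (valleys plus small valleys, as in
    image 320) exceeds the reference's (as in image 310) by more than margin."""
    return white_fraction(input_img) > white_fraction(reference_img) + margin
```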
  • FIG. 4 illustrates an operation principle of a fingerprint sensor according to various embodiments of the present disclosure.
  • the fingerprint sensor 420 illustrated in FIG. 4 may correspond to the fingerprint sensor 224 illustrated in FIG. 2.
  • the fingerprint sensor 420 may be located at the bottom of the light emitting layer 410 of the display.
  • The fingerprint sensor 420 may be positioned under the light emitting layer 410 of the display to detect light reflected from the window glass of the display device 160, and may acquire an image of the fingerprint based on the detected light. Since the fingerprint sensor 420 is not transparent, positioning it above the light emitting layer 410 of the display could reduce the overall performance of the display.
  • the fingerprint sensor 420 may acquire an image regarding the fingerprint by using a separate light source.
  • The fingerprint sensor 420 may detect light that is generated from the separate light source and reflected from the window glass.
  • the fingerprint sensor 420 may be located at a specific location (eg, in an edge direction of the display) in order to efficiently receive light generated from a separate light source.
  • The separate light source may operate to acquire a fingerprint image for a touch input when no light is generated from the light emitting layer 410 of the display, for example while the display device 160 operates in an off state or a low power mode.
  • Contact 434 regions may be generated by physical contact between the window glass and the plurality of ridges 431 located in the fingerprint.
  • The small valley 432 areas within the fingerprint may be created or disappear depending on the state of the fingerprint. For example, when the state of the fingerprint corresponds to the second state (eg, a state in which the fingerprint has sufficient oil and is not dry), a material such as sweat or oil may fill the small valley 432 areas. Accordingly, when the fingerprint corresponds to the second state, the small valley 432 areas may not exist.
  • Conversely, when the fingerprint corresponds to the first state, small valley 432 areas, which are distinct from the valley 435 areas, may be created.
  • The valley 435 areas cannot be filled by a material such as sweat or oil, and may correspond to areas that always have the second color (eg, white) when a fingerprint is input.
  • The fingerprint sensor 420 may acquire the image of the fingerprint by receiving the light of path A and path B reflected from the touch screen toward the fingerprint sensor 420.
  • The light of path A may be light reflected from the window glass at the valley (432 or 435) areas located within the fingerprint.
  • The light of path B may be light reflected from the window glass at the ridge 431 areas located within the fingerprint.
  • The light of path A may be reflected through the air layer present between the window glass and the valley (432 or 435) areas located within the fingerprint. Therefore, the intensity of the light of path A may be reduced only slightly.
  • When the light of path B reaches a contact 434 region between the window glass and a ridge 431 located in the fingerprint, the light of path B may be absorbed through the user's skin. Therefore, the amount of the light of path B reflected from the contact 434 region may be greatly reduced; that is, the light of path A may be brighter than the light of path B.
  • The fingerprint sensor 420 may receive the light of path A and the light of path B reflected through the plurality of air layers 433 and the contact 434 regions, and may acquire an image of the fingerprint by comparing the difference in intensity between the two lights.
  • The fingerprint sensor 420 may receive the reflected light of path A and detect light having a first brightness value.
  • The fingerprint sensor 420 may receive the reflected light of path B and detect light having a second brightness value. Through image processing, the fingerprint sensor 420 may recognize that the first brightness value corresponding to the reflected light of path A is greater than the second brightness value corresponding to the reflected light of path B.
  • The fingerprint sensor 420 may receive the light of path A reflected at positions corresponding to the valley 432 areas of the fingerprint, and may receive the light of path B reflected at positions corresponding to the ridge 431 areas of the fingerprint.
  • By inverting the received light, the fingerprint sensor 224 may acquire the image 310 corresponding to the second state of the fingerprint shown in FIG. 3A, or the image 320 corresponding to the first state of the fingerprint shown in FIG. 3B.
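  • A toy version of this optical scheme: bright pixels (path A, reflected through air layers over valleys) and dim pixels (path B, absorbed at ridge contact regions) are separated by a threshold and mapped so that ridges come out in the first color (black), as in images 310 and 320. The midpoint threshold and the intensity scaling are assumptions.

```python
import numpy as np

def fingerprint_image_from_intensity(intensity: np.ndarray) -> np.ndarray:
    """intensity: 2D array of received light intensity in the range 0.0-1.0.
    Path A light (valleys, reflected through an air layer) stays bright;
    path B light (ridges, absorbed at the contact region) is dim."""
    threshold = (intensity.max() + intensity.min()) / 2.0  # simple midpoint
    bright = intensity >= threshold
    # Map dim (ridge) pixels to the first color (black = 0) and bright
    # (valley) pixels to the second color (white = 255).
    return np.where(bright, 255, 0).astype(np.uint8)
```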
  • FIG. 5 is a flowchart illustrating an operation of an electronic device for displaying information guiding a touch input according to various embodiments of the present disclosure.
  • the processor 120 may acquire a first image of a fingerprint of a user associated with a first touch input.
  • the processor 120 may display an interface for guiding a fingerprint input through the display device 160 to obtain a first image of the user's fingerprint.
  • the processor 120 may detect that the display device 160 is changed from an off state to an on state, and display an interface for guiding a fingerprint input according to the detection.
  • the interface for guiding the fingerprint input may be an image including the shape of the fingerprint or the like.
  • the interface for guiding the fingerprint input may further include text for guiding the fingerprint input (eg, “Please enter a fingerprint” or “Raise your finger”).
  • the processor 120 may acquire, via the fingerprint sensor 224, a first image of a fingerprint related to a touch input according to the interface.
  • the processor 120 may acquire a first image of a user's fingerprint through the fingerprint sensor 224 while the display device 160 operates in an off state.
  • the fingerprint sensor 224 may receive a touch input even when the display device 160 operates in the off state.
  • the processor 120 may display a graphical user interface (GUI) on an area of the display device 160 where the fingerprint sensor 224 is located while operating in the AOD mode.
  • GUI graphical user interface
  • the GUI may include a graphic object that includes the shape of a fingerprint.
  • the display device 160 may operate in a low power mode or an off state based on the AOD mode.
  • the fingerprint sensor 224 may acquire an image of a user's fingerprint through an area where the GUI is displayed while the display device 160 operates in a low power mode or an off state.
  • the fingerprint sensor 224 may perform fingerprint authentication by comparing an image of a fingerprint of an authenticated user with a first image acquired based on the received touch input.
  • The processor 120 may change the display device 160 from the off state to the on state in response to the fingerprint sensor 224 obtaining the first image. For example, when the first image matches the image of the authenticated user's fingerprint, the processor 120 may determine that the fingerprint authentication has succeeded, release the locked state of the electronic device 101, and switch the display device 160 from the off state to the on state. For another example, when it is determined that the first image does not match the image of the authenticated user's fingerprint, the processor 120 may turn on the display device 160 and display information guiding a second touch input through the display device 160, as sketched below.
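  • A minimal sketch of this flow, assuming a device object that exposes unlock(), turn_display_on(), and show_guidance() (hypothetical names) and an illustrative matching threshold:

```python
import numpy as np

def match_rate(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of agreeing pixels between two aligned binary fingerprint images."""
    return float((a == b).mean())

def handle_first_touch(first_image: np.ndarray, reference_image: np.ndarray,
                       device, threshold: float = 0.80) -> None:
    if match_rate(first_image, reference_image) >= threshold:
        # Authentication succeeded: release the lock state and switch the
        # display from the off state to the on state.
        device.unlock()
        device.turn_display_on()
    else:
        # Mismatch: turn the display on and display information guiding a
        # second touch input (eg, a drag to change the fingerprint state).
        device.turn_display_on()
        device.show_guidance("second touch input")
```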
  • the processor 120 may display an interface for guiding a fingerprint input.
  • the processor 120 may display an interface for guiding the fingerprint input on an area where the fingerprint sensor 224 is located. For example, when the fingerprint sensor 224 is positioned at the bottom center of the display device 160, the processor 120 may display a fingerprint-shaped image 620 at the bottom center of the display device 160.
  • the electronic device 101 may include a plurality of fingerprint sensors 620 through 660.
  • the plurality of fingerprint sensors 620 to 660 may include five fingerprint sensors.
  • the fingerprint sensor 620 may be disposed at a position corresponding to a conventional physical key or button (for example, a lower center portion of the display device 160).
  • the remaining fingerprint sensors 630 to 660 may be disposed in the remaining area except for the area where the fingerprint sensor 620 is disposed.
  • the plurality of fingerprint sensors 620 through 660 may be disposed across the entire area of the display device 160. In this case, the plurality of fingerprint sensors may be arranged so that the spacing between adjacent fingerprint sensors is equal. Since the plurality of fingerprint sensors 620 to 660 are located in the display device 160 area, the processor 120 may adaptively change the position of the interface for fingerprint input, as in the sketch below.
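  • A short sketch of how such equal spacing and adaptive placement could be computed; Point, equallySpacedSensors(), nearestSensor(), and the grid layout are illustrative assumptions, not part of this disclosure:

```kotlin
data class Point(val x: Float, val y: Float)

// Centers of a cols x rows grid of cells, so neighbor spacing is uniform.
fun equallySpacedSensors(width: Float, height: Float, cols: Int, rows: Int): List<Point> =
    buildList {
        for (r in 0 until rows) {
            for (c in 0 until cols) {
                add(Point(width * (c + 0.5f) / cols, height * (r + 0.5f) / rows))
            }
        }
    }

// Pick the sensor nearest a point of interest (eg, the grip side),
// so the guide interface can be moved to that sensor's area.
fun nearestSensor(sensors: List<Point>, target: Point): Point? =
    sensors.minByOrNull { sq(it.x - target.x) + sq(it.y - target.y) }

private fun sq(v: Float): Float = v * v
```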
  • the processor 120 may identify that the state of the fingerprint corresponds to the first state, based on the obtained first image and the reference image.
  • the first state may be a state in which the fingerprint is dry or lacks oil (or moisture): based on the result of comparing the reference image of the fingerprint with the image of the input fingerprint, the fingerprint is estimated to be that of the authenticated user, but the image of the input fingerprint does not match the reference image, the match rate between the image of the input fingerprint and the reference image is lower than a specified value, or the ratio of the area occupied by the ridge regions of the fingerprint within the image of the input fingerprint is lower than the corresponding ratio in the reference image.
  • the processor 120 may perform a comparison with the reference image to perform fingerprint authentication using the obtained first image.
  • the reference image may be stored in advance by an authenticated user of the electronic device and may include an image of the authenticated user's fingerprint, such as an image registered during fingerprint enrollment.
  • the processor 120 may determine whether a state of the fingerprint corresponds to the first state based on the obtained first image and the reference image. According to an embodiment of the present disclosure, the processor 120 may determine that the state of the fingerprint corresponds to the first state by determining whether the first area of the first image and the third area of the reference image are different.
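  • One hedged way to express the first-state test above is to compare ridge-area ratios between the first image and the reference image; the 0.5 binarization threshold and the 0.9 factor below are hypothetical values, not taken from this disclosure:

```kotlin
// Fraction of pixels classified as ridge after a simple binarization.
fun ridgeAreaRatio(pixels: DoubleArray, ridgeThreshold: Double = 0.5): Double =
    pixels.count { it > ridgeThreshold }.toDouble() / pixels.size

// A dry fingerprint images with a smaller ridge area than the reference.
fun correspondsToFirstState(first: DoubleArray, reference: DoubleArray): Boolean =
    ridgeAreaRatio(first) < 0.9 * ridgeAreaRatio(reference)
```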
  • the processor 120 may display information guiding a second touch input for changing the state of the fingerprint from the first state to the second state.
  • the second state may be a state in which the fingerprint is not dry or in which oil (or moisture) is sufficient on the fingerprint: based on the result of comparing the reference image of the fingerprint with the image of the input fingerprint, the match rate between the image of the input fingerprint and the reference image is greater than or equal to a predetermined value, or the small valley regions formed in the ridges when the fingerprint is dry are filled with a material such as sweat or oil.
  • the processor 120 may display information for guiding a touch input to change the state of the fingerprint to the second state based on detecting that the state of the fingerprint corresponds to the first state.
  • the information guiding the touch input may include a visual object such as a figure including a first point and a second point, a plurality of objects arranged in a grid, or an arrow indicating a specific direction.
  • the second touch input may include a drag input.
  • the user may input a touch based on the displayed information through the display device 160.
  • a material such as sweat generated by friction between the display device 160 and the fingerprint, or oil present on the display device 160, may fill the small valley areas in the ridge area. As the small valley areas in the ridge area are filled, the state of the fingerprint may change from the first state to the second state.
  • FIG. 7 illustrates an example of an operation of an electronic device to acquire a first image according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operations 701 to 705 of FIG. 7 may be related to operation 501 of FIG. 5.
  • the processor 120 may receive data from a grip sensor 226 or a pressure sensor (not shown).
  • the grip sensor 226 may obtain information related to the direction of the hand holding the electronic device 101 by comparing the magnitude of the magnetic field change in the left direction and the magnitude of the magnetic field change in the right direction.
  • the processor 120 may detect that the user grips the electronic device 101 by receiving information from the grip sensor 226. When the user grips the electronic device 101, the processor 120 may transmit a control signal instructing the display device 160 to change from the off state to the on state and to display the information guiding the first touch input.
  • processor 120 may receive data from a pressure sensor (not shown).
  • the pressure sensor (not shown) may be disposed in the lower center portion of the display device 160.
  • the pressure sensor (not shown) may correspond to a conventional physical key or button for activating the display.
  • the pressure sensor (not shown) may transmit data to the processor 120.
  • the processor 120 may transmit a control signal instructing the display device 160 to turn on, based on receiving data from the pressure sensor (not shown).
  • the processor 120 has been described as receiving data from the grip sensor 226 or a pressure sensor (not shown), but is not limited thereto.
  • the processor 120 may receive data from an illuminance sensor (not shown) based on a sudden change in the illuminance value, for example when the user removes the electronic device from a pocket (or bag) for use.
  • the processor 120 may transmit a control signal instructing to activate the display device 160 based on receiving data from an illumination sensor (not shown).
  • the processor 120 may display information guiding the first touch input.
  • the processor 120 may transmit a control signal instructing the display device 160 to display information guiding a first touch input for fingerprint authentication of the user.
  • the information guiding the first touch input may be an image including the shape of a fingerprint or a similar shape.
  • the information for guiding the first touch input may further include text for guiding the fingerprint input (eg, “Please enter a fingerprint” or “Place your finger”).
  • the display device 160 may display information guiding the first touch input based on a control signal from the processor 120. Accordingly, the first touch input for fingerprint authentication may be received in an area where the fingerprint sensor 224 is located in the display device 160.
  • the processor 120 may acquire a first image through the fingerprint sensor 224.
  • the fingerprint sensor 224 may acquire an image of the fingerprint according to the first touch input based on the difference in the intensity of light reflected by the valley areas and the ridge areas of the fingerprint.
  • An image of the fingerprint according to the first touch input may be referred to as a first image.
  • the fingerprint sensor 224 may acquire the first image by converting analog data about received light intensity into digital data and performing image processing.
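  • A minimal sketch of that analog-to-digital step, assuming the raw light-intensity samples are simply normalized and quantized to an 8-bit grayscale range (the range handling is an illustrative assumption):

```kotlin
// Normalize raw analog intensities into 0..255 grayscale values.
fun quantizeTo8Bit(analog: DoubleArray): IntArray {
    val min = analog.minOrNull() ?: 0.0
    val max = analog.maxOrNull() ?: 1.0
    val span = (max - min).takeIf { it > 0.0 } ?: 1.0 // avoid division by zero
    return IntArray(analog.size) { i ->
        (((analog[i] - min) / span) * 255.0).toInt().coerceIn(0, 255)
    }
}
```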
  • FIG. 8 illustrates an example of an operation of an electronic device to determine whether a state of a fingerprint corresponds to a first state according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operations 801 to 805 illustrated in FIG. 8 may relate to operation 503 illustrated in FIG. 5.
  • the processor 120 may determine whether a match rate between the second area of the first image and the fourth area of the reference image is greater than or equal to the first threshold value. That is, the processor 120 may determine whether the regions corresponding to the valley 435 regions overlap by comparing the reference image with the first image. For example, when the second area of the first image and the fourth area of the reference image are less than about 95% identical, the processor 120 may determine that the touch input is from a third party rather than the authenticated user. This is because the second area of the first image is an area having the second color corresponding to the valley 435 regions, which exist regardless of the presence of a material such as sweat or oil.
  • the processor 120 may determine that the first image is an image of a fingerprint of an authenticated user. Accordingly, the processor 120 may perform operation 803 to determine whether fingerprint authentication has failed due to the state of the fingerprint.
  • the first threshold value has been described as about 95%, but the present disclosure is not limited thereto.
  • the processor 120 may set the first threshold value to various values according to the security setting of the user. For example, if the user requires a high security fingerprint authentication, the first threshold may be set to about 99%.
  • the specific first threshold value is not limited thereto and may have various values.
  • the processor 120 may determine whether a match rate between the first area of the first image and the third area of the reference image is greater than or equal to a second threshold value. That is, the processor 120 may determine whether the ridge 302 regions overlap by comparing the reference image with the first image. For example, when the first area of the first image and the third area of the reference image are less than about 70% identical, the processor 120 may not determine whether the state of the fingerprint corresponds to the first state or the second state. According to an embodiment of the present disclosure, the processor 120 may display the information guiding the first touch input again in order to obtain more specific information about the first image. In another example, the processor 120 may detect that the first area of the first image and the third area of the reference image are about 70% or more identical.
  • the processor 120 may determine that the state of the fingerprint corresponds to the first state.
  • the processor 120 may determine that the fingerprint of the authenticated user of the electronic device 101 corresponds to the first state based on the first image and the reference image. Thereafter, the processor 120 may display the information guiding the second touch input through the display device 160 to change the first state of the fingerprint to the second state.
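  • The two-threshold logic of operations 801 to 805 could be sketched as follows, assuming boolean masks for valley and ridge pixels; the 0.95 and 0.70 values echo the example figures above, and overlapRatio() is an illustrative helper:

```kotlin
// Fraction of the first image's marked pixels that also appear in the reference.
fun overlapRatio(a: BooleanArray, b: BooleanArray): Double {
    val denom = a.count { it }.coerceAtLeast(1)
    return a.indices.count { a[it] && b[it] }.toDouble() / denom
}

sealed interface Decision
object ThirdPartyTouch : Decision  // valley regions do not overlap enough
object RequestNewInput : Decision  // ridge overlap too low to judge the state
object FirstState : Decision       // authenticated user with a dry fingerprint

fun decide(
    valleyFirst: BooleanArray, valleyRef: BooleanArray,
    ridgeFirst: BooleanArray, ridgeRef: BooleanArray
): Decision = when {
    overlapRatio(valleyFirst, valleyRef) < 0.95 -> ThirdPartyTouch
    overlapRatio(ridgeFirst, ridgeRef) < 0.70 -> RequestNewInput
    else -> FirstState
}
```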
  • FIG. 9 is a flowchart of an electronic device for displaying information guiding a second touch input according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operations 901 to 909 of FIG. 9 may be related to operation 505 of FIG. 5.
  • the processor 120 may display a graphic object (eg, a figure) including a first point and a second point as information guiding the second touch input.
  • the first point and the second point may include different points located in the display device 160.
  • the first point and the second point may include different points spaced apart by at least the minimum distance distinguishable by a user's touch input.
  • the graphic object including the first point and the second point may have a shape for guiding a drag input of the user.
  • the graphic object may have various shapes (eg, star shape, circle shape, square shape, or triangle shape).
  • the processor 120 may additionally display text (eg, “draw a star on the screen for fingerprint authentication”) for guiding the drag input through the display device 160.
  • the processor 120 may display a star-shaped figure 1001 based on detecting that the state of the fingerprint corresponds to the first state.
  • the processor 120 may further display the text 1003 for guiding the drag input to improve the user's perception.
  • the processor 120 may detect a drag input including the first point and the second point.
  • the processor 120 may detect a drag input through the TSP module 222.
  • the drag input may include a first point and a second point.
  • the processor 120 may add an animation effect according to the detected drag input through the display device 160.
  • the processor 120 may further display a line 1005 that is displayed to overlap the star-shaped figure.
  • the processor 120 may determine whether a drag input is detected in the entire area where the figure is displayed.
  • the processor 120 may receive information about all areas from which the drag input is detected from the TSP module 222.
  • the processor 120 may determine whether an area in which the figure is displayed is included in an area in which the drag input is detected.
  • the processor 120 may determine that the drag input is detected along the region in which the figure is displayed.
  • the processor 120 may perform operation 907 to detect a drag input corresponding to an area where the figure is displayed and to obtain a second image of the fingerprint changed to the second state.
  • the processor 120 may determine that the drag input is not detected along the region in which the figure is displayed. The processor 120 may maintain the display of the figure through the display device 160 until a drag input is detected in the entire region where the figure is displayed.
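  • A minimal sketch of this coverage check, assuming the displayed figure is modeled as a set of grid cells that the sampled drag points must all visit; the cell size and the figure model are assumed values for illustration:

```kotlin
data class Cell(val cx: Int, val cy: Int)

fun cellOf(x: Float, y: Float, cellSize: Float): Cell =
    Cell((x / cellSize).toInt(), (y / cellSize).toInt())

// True only when the drag has visited every cell of the figure;
// otherwise the figure keeps being displayed.
fun dragCoversFigure(
    dragPoints: List<Pair<Float, Float>>,
    figureCells: Set<Cell>,
    cellSize: Float = 40f
): Boolean {
    val visited = dragPoints.map { (x, y) -> cellOf(x, y, cellSize) }.toSet()
    return visited.containsAll(figureCells)
}
```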
  • the processor 120 may acquire a second image of the fingerprint through the third point.
  • the third point may correspond to an area in which the fingerprint sensor is located in the display device 160.
  • the third point may correspond to an area different from the first point and the second point.
  • the processor 120 may display information for guiding a fingerprint input on an area where the third point is located through the display device 160.
  • the information guiding the fingerprint input may be an image including the shape of the fingerprint or the like.
  • the information for guiding the fingerprint input may further include text for guiding the fingerprint input (eg, “Please enter a fingerprint” or “Raise a finger”).
  • the processor 120 may acquire a second image of the fingerprint through the fingerprint sensor 224 corresponding to the third point.
  • the second image may include an image of the fingerprint in which the state of the fingerprint is changed from the first state to the second state.
  • the second image may be an image of a fingerprint in which a small valley 304 region located in the ridge 302 region in the fingerprint is filled with a material such as sweat or oil.
  • the processor 120 may display an interface for guiding a fingerprint input on an area corresponding to the first point or the second point.
  • the processor 120 may display information for guiding a fingerprint input on an area corresponding to the first point through the display device 160. Accordingly, the processor 120 may obtain the second image from the user's drag input on the displayed figure alone.
  • the processor 120 may perform fingerprint authentication based on the obtained second image and the reference image.
  • the processor 120 may compare the second image and the reference image based on an image matching technique. For example, when determining that the fingerprints match, the processor 120 may release the lock state of the electronic device 101 based on the fingerprint authentication result. For another example, when detecting a fingerprint mismatch, the processor 120 may maintain the lock state of the electronic device 101 and display the interface guiding the fingerprint input again, based on the fingerprint authentication result.
  • FIG. 11 illustrates an example of an operation of an electronic device displaying information guiding a second touch input according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operations 1101 to 1109 of FIG. 11 may be related to operation 505 of FIG. 5.
  • the processor 120 may display a plurality of objects arranged in a grid through the display device 160.
  • the plurality of objects arranged in the grid may have a shape of a circle, as shown in FIG. 12.
  • each of the plurality of objects arranged in the grid may include a star, a rectangle, a triangle, or various other shapes or images.
  • each of the plurality of objects arranged in the grid may have the same shape or different shapes from each other.
  • the processor 120 may display a plurality of circular objects 1201 to 1209 arranged in a grid, as shown in FIG. 12.
  • the processor 120 may display the objects of each of the plurality of objects arranged in the grid in different shapes.
  • the order of at least some of the objects to which a touch is input among the plurality of objects arranged in the grid may be used as pattern information for unlocking.
  • the processor 120 may store pattern information preset by an authenticated user through the memory 130.
  • the processor 120 may detect a second touch input input to at least some objects through the TSP module 222.
  • the processor 120 may store an order of at least some objects determined according to the second touch input as an input pattern.
  • the processor 120 may detect the trajectory 1210 of the second touch input. Accordingly, the processor 120 may detect the second touch input proceeding in the order of the object 1201, the object 1202, the object 1203, the object 1206, the object 1205, the object 1204, the object 1207, and the object 1208.
  • the processor 120 may store the order of the objects corresponding to the trajectory 1210 of the second touch input as a pattern of the second touch input.
  • the processor 120 may display an animation effect by displaying a line on the touch area corresponding to the trajectory 1210 of the second touch input through the display device 160.
  • the processor 120 may provide animation effects to objects positioned on the trajectory 1210 of the second touch input among the plurality of objects arranged in the grid.
  • the processor 120 may change the color of the objects for which the second touch input is detected or enlarge the size of the objects. For example, as the trajectory 1210 of the second touch input progresses through the object 1202, the object 1203, the object 1206, the object 1205, the object 1204, the object 1207, and the object 1208 in order, the processor 120 may enlarge the size or change the color of each object to which the touch is input. Accordingly, the user may identify in real time the area where the touch input is received, based on the trajectory 1210 of the second touch input and the animation effect applied to the plurality of objects arranged in the grid.
  • the processor 120 may determine whether the previously stored pattern and the pattern of the second touch input match. For example, when the pattern stored by the authenticated user for unlocking the electronic device 101 matches the pattern of the second touch input, the processor 120 may display information guiding the fingerprint input.
  • when the patterns do not match, the processor 120 may continue to display the plurality of objects arranged in the grid through the display device 160. This is because, when the patterns do not match, the user providing the second touch input may be different from the authenticated user.
  • the processor 120 may display the interface guiding the fingerprint input based on the match between the pre-stored pattern and the pattern of the second touch input. Accordingly, the processor 120 may release the lock state based on both the match of the pattern information for unlocking the electronic device and the match between the reference image and the second image, thereby further enhancing security. A sketch of this pattern check follows.
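  • Under assumptions, the pattern check could be sketched by reducing the drag trajectory to the ordered list of grid objects it passes through and comparing that list with the stored unlock pattern; GridObject and hitRadius are illustrative names and values:

```kotlin
data class GridObject(val id: Int, val x: Float, val y: Float)

// Map sampled trajectory points to the ordered sequence of objects hit.
fun trajectoryToPattern(
    trajectory: List<Pair<Float, Float>>,
    objects: List<GridObject>,
    hitRadius: Float = 60f
): List<Int> {
    val order = mutableListOf<Int>()
    for ((tx, ty) in trajectory) {
        val hit = objects.firstOrNull { o ->
            val dx = o.x - tx
            val dy = o.y - ty
            dx * dx + dy * dy <= hitRadius * hitRadius
        } ?: continue
        if (hit.id !in order) order.add(hit.id) // each object counted once, in order
    }
    return order
}

fun patternMatches(input: List<Int>, stored: List<Int>): Boolean = input == stored
```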
  • the processor 120 may acquire a second image through the fingerprint sensor 224.
  • the second image may include an image of the fingerprint input through the fingerprint sensor 224 after the drag input.
  • the fingerprint sensor 224 may acquire a second image of the fingerprint changed to the second state according to the drag input.
  • the fingerprint sensor 224 may acquire a second image according to the fingerprint input based on the matching of the pre-stored pattern and the pattern of the second touch input. For example, if the previously stored pattern and the pattern of the second touch input do not match, the fingerprint sensor 224 may not generate the second image.
  • based on the match between the pre-stored pattern and the pattern of the second touch input, the fingerprint sensor 224 may generate the second image based on the difference in intensity between the reflected light of path A and the reflected light of path B received after the match.
  • the fingerprint sensor 420 may receive the reflected light of path A and detect light having a first brightness value.
  • the fingerprint sensor 420 may receive the reflected light of path B and detect light having a second brightness value. Through image processing, the fingerprint sensor 420 may recognize that the first brightness value corresponding to the reflected light of path A is greater than the second brightness value corresponding to the reflected light of path B. Since the light of path B is partially absorbed by the ridge area 431 of the fingerprint in contact with the window glass, the second brightness value may be smaller than the first brightness value. Accordingly, the fingerprint sensor 420 may receive the light of path A reflected at a location corresponding to the valley area 432 of the fingerprint, and the light of path B reflected at a location corresponding to the ridge area 431 of the fingerprint. The fingerprint sensor 224 may acquire the image 310 corresponding to the second state of the fingerprint of FIG. 3A or the image 320 corresponding to the first state of the fingerprint shown in FIG. 3B by inverting the received light.
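  • An illustrative rendering of that optical principle, assuming an 8-bit brightness range: ridge pixels (path B) arrive darker because the ridge absorbs part of the light, so inverting the received brightness yields a ridge-bright image; the threshold below is an assumption:

```kotlin
// Invert received brightness so ridges (dark on arrival) become bright.
fun invert(received: IntArray): IntArray =
    IntArray(received.size) { i -> 255 - received[i].coerceIn(0, 255) }

// Classify darker pixels as ridge (path B), brighter ones as valley (path A).
fun ridgeMask(received: IntArray, threshold: Int = 128): BooleanArray =
    BooleanArray(received.size) { i -> received[i] < threshold }
```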
  • the processor 120 may perform fingerprint authentication based on the obtained second image and the reference image. Operation 1109 may correspond to operation 909 of FIG. 9.
  • FIG. 13 illustrates an example of an operation of an electronic device that displays information for guiding a second touch input according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operation 1301 to operation 1307 of FIG. 13 may be related to operation 505 of FIG. 5.
  • the processor 120 may check information regarding an area where a touch input is received, before detecting the first touch input.
  • the processor 120 may request, from the memory 130, information about the area related to a touch input received before the first touch input. For example, the processor 120 may determine that the state of the fingerprint corresponds to the first state based on the first image.
  • based on the determination, the processor 120 may read, from the memory 130, information regarding the area in which a touch input was most recently received.
  • the processor 120 may receive information related to the area of a touch input that was received within a predetermined time interval and has an area equal to or greater than a predetermined threshold value.
  • the processor 120 may receive information about an area in which a touch input is received a predetermined number of times or more within a predetermined time interval. That is, the processor 120 may check information on an area related to a touch input within a predetermined time interval from a specific time (eg, a time when the state of the fingerprint is determined to correspond to the first state).
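  • A minimal sketch of this touch-log check, assuming the stored log is a list of (x, y, area, time) records from which only sufficiently recent and sufficiently large touches are kept as candidate guide regions; the window and area threshold are illustrative values:

```kotlin
data class TouchRecord(val x: Float, val y: Float, val area: Float, val timeMillis: Long)

// Keep only touches within the time window whose contact area is large enough.
fun candidateGuideRegions(
    log: List<TouchRecord>,
    now: Long,
    windowMillis: Long = 60_000L,
    minArea: Float = 400f
): List<TouchRecord> =
    log.filter { now - it.timeMillis <= windowMillis && it.area >= minArea }
```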
  • the processor 120 may receive a user's touch input.
  • the user may bring the electronic device 101 into contact with his or her ear for a voice call.
  • a touch input may occur between the side of the user's face and the display device 160 of the electronic device 101.
  • the touch input may have an area of a predetermined size and may be received through at least a portion of the front or top of the display device 160. The at least partial area may be represented by the circle area 1410.
  • the TSP module 222 may map information regarding the position and area of the received touch input to the time when the touch is input and transmit the information to the memory 130.
  • the TSP module 222 may map the information regarding the location and area (eg, the circle area 1410) of the touch input received on the display device 160 to the time when the touch input was received (eg, 11:10 am).
  • the processor 120 may receive a user's touch input.
  • the processor 120 may display, at the bottom of the display device 160, a virtual keyboard for the user's text input.
  • the processor 120 may receive a plurality of touch inputs input on the virtual keyboard.
  • the processor 120 may display information guiding the second touch input on the identified region.
  • the identified area may correspond to an area where a touch input is detected within a short time.
  • a material such as sweat or oil may be present on the display device 160 in the identified area.
  • the processor 120 may display information guiding the second touch input on the identified area in order to change the state of the fingerprint from the first state to the second state.
  • the information guiding the second touch input may include a figure including a first point and a second point, a plurality of objects arranged in a grid, and arrows indicating a specific direction.
  • the processor 120 may display information guiding the second touch input through the display device 160 based on information about the location, area, and time of the touch input.
  • when the processor 120 detects that the state of the fingerprint corresponds to the first state, the processor 120 may display the information guiding the second touch input based on information regarding a touch input with a large contact area received before the detection. For example, a star-shaped figure may be displayed on the display device 160 over the circle area 1410.
  • the processor 120 may display information guiding the second touch input through the display device 160 based on information about the location, area, and time of the touch input. For example, when the processor 120 detects that the state of the fingerprint corresponds to the first state, the processor 120 may display the information guiding the second touch input based on information about the area where a multi-touch occurred within a predetermined time interval from the detection time. For example, the processor 120 may display a star-shaped figure through the display device 160 over the rectangular area 1420.
  • the processor 120 may display the information guiding the second touch input through the display device 160 based on information about the location, area, and time of the touch input. For example, when the processor 120 detects that the state of the fingerprint corresponds to the first state, the processor 120 may identify a touch input received over an area of a predetermined size or more among touch inputs received within a predetermined interval from the detection time. The processor 120 may identify a touch input corresponding to an area of a predetermined size or more by receiving information about previously received touch inputs from the memory 130. The processor 120 may display the information guiding the second touch input based on the identified touch input. For example, the processor 120 may receive information related to the area of the identified touch input from the memory 130.
  • the processor 120 may display a droplet shape on the area to guide the second touch input.
  • the processor 120 may display text for guiding the second touch input (eg, “drop water droplets for fingerprint authentication”).
  • the processor 120 may display an effect corresponding to the second touch input through the display device 160.
  • in response to a touch input received in the area including the droplet shape, the processor 120 may stop displaying the portion of the droplet shape in the area where the touch input was received. Accordingly, through the display device 160, the user may experience the effect of erasing at least a portion of the droplet shape located on the corresponding area according to the second touch input.
  • the processor 120 may display a plurality of objects arranged in a grid or arrows indicating a specific direction on the identified area.
  • the processor 120 may acquire a second image through the fingerprint sensor 224.
  • the fingerprint sensor 224 may receive a touch input for fingerprint recognition after the second touch input of the user.
  • the processor 120 may display information guiding a fingerprint input for the fingerprint changed to the second state by the user's second touch input.
  • the processor 120 may display an image related to a fingerprint on an area of the display device 160 corresponding to the position of the fingerprint sensor 224.
  • the fingerprint sensor 224 may acquire an image of a fingerprint changed to a second state by the second touch input based on the touch input for fingerprint recognition.
  • Operation 1305 may correspond to operation 1107 of FIG. 11.
  • the processor 120 may perform fingerprint authentication based on the obtained second image and the reference image. Operation 1307 may correspond to operation 1109 of FIG. 11.
  • FIG. 15 illustrates an example of an operation of an electronic device displaying information guiding a second touch input according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operations 1501 through 1507 of FIG. 15 may be related to operation 505 of FIG. 5.
  • the processor 120 may detect a touch input having a predetermined pressure value or more.
  • the processor 120 may receive a touch input through a predetermined area on a pressure sensor (not shown).
  • the pressure sensor (not shown) may be located at the center of the bottom of the display device 160.
  • the pressure sensor (not shown) may determine whether the received touch input is equal to or greater than a predetermined pressure value. Specifically, the pressure sensor (not shown) may acquire a specific pressure value of the touch input according to the magnitude of the change in the dielectric constant.
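  • A hedged sketch of this pressure check, assuming the pressure value is proportional to the magnitude of the dielectric-constant change; both constants below are illustrative, not values from this disclosure:

```kotlin
// Hypothetical mapping from dielectric-constant change to a pressure value.
fun pressureOf(dielectricDelta: Double, gain: Double = 10.0): Double =
    dielectricDelta * gain

// Only a sufficiently firm press should trigger the guide display.
fun exceedsPressureThreshold(dielectricDelta: Double, threshold: Double = 3.0): Boolean =
    pressureOf(dielectricDelta) >= threshold
```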
  • the processor 120 may display an arrow from the bottom surface toward the center of the display device 160.
  • the processor 120 may transmit a signal for controlling the display device 160 to be turned on.
  • the processor 120 may display information for guiding the second touch input.
  • the information may be an image related to an arrow indicating a specific direction.
  • the arrow may point from the bottom edge of the four edges forming the outline of the display device 160 toward the center of the display device 160.
  • the processor 120 may additionally display text (eg, “drag” or “drag in the direction of the arrow”) for guiding the second touch input through the display device 160.
  • the processor 120 may detect a second touch input.
  • the user may provide a second touch input based on an arrow and/or text displayed through the display device 160.
  • the second touch input may include a drag input.
  • the processor 120 may determine whether the direction of the detected touch input coincides with the displayed arrow direction.
  • the processor 120 may identify the direction of the second touch input based on the detected second touch input. For example, the processor 120 may identify the direction of the second touch input based on the difference between the coordinate at which the second touch input is applied and the coordinate at which the second touch input is released.
  • the processor 120 may determine whether the detected direction of the second touch input coincides with the direction indicated by the arrow.
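  • This direction check could be sketched as follows, assuming a dominant-axis heuristic over the press and release coordinates; Direction and the helper names are illustrative:

```kotlin
enum class Direction { UP, DOWN, LEFT, RIGHT }

// Derive the drag direction from the press (start) and release (end) coordinates.
fun dragDirection(startX: Float, startY: Float, endX: Float, endY: Float): Direction {
    val dx = endX - startX
    val dy = endY - startY
    return if (kotlin.math.abs(dx) >= kotlin.math.abs(dy)) {
        if (dx >= 0) Direction.RIGHT else Direction.LEFT
    } else {
        if (dy >= 0) Direction.DOWN else Direction.UP // screen y grows downward
    }
}

fun matchesArrow(drag: Direction, arrow: Direction): Boolean = drag == arrow
```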
  • the processor 120 may detect that the direction indicated by the arrow does not coincide with the direction of the second touch input.
  • the processor 120 may maintain the display of the arrow through the display device 160 to receive a second touch input having a direction indicated by the arrow.
  • the processor 120 may detect coincidence between the direction of the second touch input and the direction indicated by the arrow.
  • the processor 120 may display information for guiding a fingerprint input according to the match.
  • the processor 120 may display an image and text guiding a fingerprint input.
  • the processor 120 may display text (eg, “fingerprint touch”) for guiding a fingerprint input.
  • the processor 120 may display an interface (eg, a fingerprint shape or similar image) for guiding a fingerprint input.
  • the processor 120 may acquire a second image through the fingerprint sensor 224.
  • the fingerprint sensor 224 may receive a touch input for fingerprint recognition after the user's second touch input.
  • the second image may include an image of a fingerprint in which the state of the fingerprint is changed from the first state to the second state according to the second touch input.
  • the processor 120 may display an interface (eg, a fingerprint or an image of a similar shape) that guides fingerprint input on an area of the display device 160 corresponding to the fingerprint sensor 224.
  • the processor 120 may receive a touch for fingerprint input provided from a user according to the interface. In response to the touch for fingerprint input, the fingerprint sensor 224 may obtain an image of the fingerprint changed to the second state. Operation 1509 may correspond to operation 1107 of FIG. 11.
  • the processor 120 may perform fingerprint authentication based on the obtained second image and the reference image. Operation 1511 may correspond to operation 1109 of FIG. 11.
  • FIG. 17 illustrates an example of an operation of an electronic device that displays information for guiding a second touch input according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • Operations 1701 to 1707 of FIG. 17 may be related to operation 505 of FIG. 5.
  • the processor 120 may identify a direction of a hand holding the electronic device 101 through the grip sensor 226.
  • the grip sensor 226 may identify the direction of the hand holding the electronic device 101 by comparing the magnitude of the magnetic field change in the left direction and the magnitude of the magnetic field change in the right direction.
  • the processor 120 may identify that the electronic device 101 is held by the user, and the direction of the hand holding the electronic device 101, based on the data received from the grip sensor 226. For example, the grip sensor 226 may identify a grip by the left hand when the magnitude of the left magnetic field change is greater than the magnitude of the right magnetic field change.
  • the processor 120 may display information for guiding the second touch input based on the identified direction.
  • the processor 120 may identify the direction of the hand holding the electronic device 101 based on the information received from the grip sensor 226.
  • the processor 120 may adaptively display the information guiding the second touch input according to the identified hand direction. For example, when the electronic device 101 is held by the user's left hand, the processor 120 may display an arrow pointing to the right through the display device 160. The arrow may indicate a direction from the left edge of the four edges forming the outline of the display device 160 toward the center of the display device 160. For another example, when the electronic device 101 is gripped by the user's right hand, the processor 120 may display an arrow pointing to the left through the display device 160.
  • the arrow may indicate a direction from the right edge of the four edges forming the outline of the display device 160 toward the center of the display device 160.
  • the processor 120 may display text (“drag to the left”, “drag to the right”, or “drag”) together with the arrow to guide the second touch input, as in the sketch below.
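  • A minimal sketch of this grip-based selection, assuming the grip sensor reports the magnitude of the magnetic-field change on each side; Hand, Arrow, and the helper names are illustrative, not part of this disclosure:

```kotlin
enum class Hand { LEFT, RIGHT }
enum class Arrow { POINTS_LEFT, POINTS_RIGHT }

// The side with the larger magnetic-field change is taken as the gripping hand.
fun gripHand(leftFieldDelta: Double, rightFieldDelta: Double): Hand =
    if (leftFieldDelta > rightFieldDelta) Hand.LEFT else Hand.RIGHT

// The guide arrow points from the gripping side's edge toward the center.
fun guideArrowFor(hand: Hand): Arrow = when (hand) {
    Hand.LEFT -> Arrow.POINTS_RIGHT  // left-hand grip: drag from the left edge
    Hand.RIGHT -> Arrow.POINTS_LEFT  // right-hand grip: drag from the right edge
}
```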
  • the processor 120 may receive a second touch input guided by the arrow and/or the text.
  • the processor 120 may detect the second touch input and determine whether the detected direction of the second touch input coincides with the direction indicated by the arrow.
  • the processor 120 may identify the direction of the second touch input by identifying a start coordinate and a release coordinate of the drag input. When the direction of the identified second touch input coincides with the direction indicated by the arrow, the processor 120 may display an interface for guiding the fingerprint input.
  • the processor 120 may display an interface guiding a fingerprint input on an area corresponding to the position of the fingerprint sensor 224 through the display device 160.
  • the electronic device 101 may include a plurality of fingerprint sensors, and the plurality of fingerprint sensors may be uniformly disposed in an area of the display device 160.
  • the processor 120 may adaptively activate one of the plurality of fingerprint sensors.
  • the processor 120 may activate a fingerprint sensor located at the upper left of the display device 160.
  • the processor 120 may display an interface for guiding a fingerprint input on the area of the fingerprint sensor positioned at the upper left side through the display device 160.
  • the processor 120 may activate a fingerprint sensor located at the upper right side of the display device 160.
  • the processor 120 may display an interface for guiding a fingerprint input on the area of the fingerprint sensor positioned at the upper right side through the display device 160.
  • the interface guiding the fingerprint input may be an image including the shape of a fingerprint or the like. Accordingly, when the user grips the electronic device 101 with the left hand, the user may input a fingerprint through a fingerprint sensor at a relatively close position instead of a fingerprint sensor at the relatively distant bottom.
  • the processor 120 may acquire a second image through the fingerprint sensor 224.
  • the fingerprint sensor 224 may receive a touch input for fingerprint recognition after the user's second touch input.
  • the second image may include an image of a fingerprint in which the state of the fingerprint is changed from the first state to the second state according to the second touch input.
  • the processor 120 may receive the second image by activating a fingerprint sensor corresponding to the direction in which the user holds the electronic device 101. For example, referring to FIG. 18A, the processor 120 may receive the second image by activating the fingerprint sensor on the upper left side among the plurality of fingerprint sensors, based on information indicating that the user grips the electronic device 101 with the left hand. For another example, the processor 120 may receive the second image by activating the fingerprint sensor on the upper right side among the plurality of fingerprint sensors, based on information indicating that the user grips the electronic device 101 with the right hand.
  • Operation 1705 may correspond to operation 1107 of FIG. 11.
  • the processor 120 may perform fingerprint authentication based on the obtained second image and the reference image.
  • Operation 1707 may correspond to operation 1109 of FIG. 11.
  • FIG. 19 is a flowchart illustrating an electronic device for displaying information guiding a first touch input according to various embodiments of the present disclosure. This operation may be performed by the processor 120 shown in FIG. 2.
  • the processor 120 may detect a user's touch input while the display is in an off state.
  • the processor 120 may receive a touch input through the TSP module 222 regardless of the on / off state of the display device 160.
  • the TSP module 222 may receive a user's touch input while the display device 160 displays the blank screen or the black screen.
  • the TSP module 222 may also receive a user's touch input while the display device 160 operates in the on state.
  • the TSP module 222 may provide the processor 120 with information about the location of the received touch input on the display device 160 and the area of the received touch input.
  • the processor 120 may execute an application or change the display of the display device 160 based on the information about the touch input provided from the TSP module 222.
  • the TSP module 222 may map information about a location and an area of the display device 160 where a user's touch is input to the time when the touch is input, and store the information in the memory 130. According to the user's touch input, a material such as sweat or oil may be present on the display device 160 at the location where the touch input is received.
  • the processor 120 may determine whether the location of the detected touch input is included in the location of the plurality of fingerprint sensors.
  • the processor 120 may determine whether a touch input has been received on an area corresponding to the location of at least one fingerprint sensor among the plurality of fingerprint sensors based on the information stored in the memory 130. For example, if the position of the detected touch input includes the position of at least one of the plurality of fingerprint sensors, the processor 120 may operate on the area where the fingerprint sensor is located while the display is in an off state. It may be determined that a touch input has occurred.
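  • A sketch of this position check, assuming each fingerprint sensor is modeled as a rectangular region against which the detected touch location is tested; SensorRegion and the helper are illustrative names:

```kotlin
data class SensorRegion(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float): Boolean =
        x in left..right && y in top..bottom
}

// Return the first sensor region containing the touch location, if any.
fun sensorUnderTouch(sensors: List<SensorRegion>, x: Float, y: Float): SensorRegion? =
    sensors.firstOrNull { it.contains(x, y) }
```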
  • the processor 120 may detect a touch input received in the off state of the display device 160 through the TSP module 222.
  • the user may touch the area 2010 on the lower left side while the display device 160 operates in the off state.
  • the area where the touch input is detected may be the same as the area where one fingerprint sensor of the plurality of fingerprint sensors is disposed.
  • the processor 120 may determine that the region 2010 at the lower left where the touch input is detected is included in the positions 2010 to 2060 of the plurality of fingerprint sensor regions.
  • the processor 120 may determine that the touch input occurred in an area other than the area where the fingerprint sensor is located while the display is turned off.
  • the processor 120 may display, through the display device 160, information guiding a second touch input such as a drag input, in order to pick up a material such as sweat or oil present in an area different from the area where the fingerprint sensor is located.
  • the processor 120 may display information for guiding the fingerprint input at the position of the detected touch input.
  • the processor 120 may identify one fingerprint sensor based on the detected position of the touch input.
  • the identified fingerprint sensor may be a fingerprint sensor located in the area 2010 in the lower left corner.
  • the processor 120 may display information for guiding a fingerprint input on an area 2010 on the lower left side of the identified fingerprint sensor.
  • Information guiding the fingerprint input may include an image of a fingerprint or similar shape.
  • the processor 120 may acquire a fingerprint image.
  • the processor 120 may acquire a fingerprint image of the received touch input through the fingerprint sensor 224.
  • the fingerprint image may correspond to a second state (eg, a state in which the fingerprint is not dry or a state in which oil is present in the fingerprint).
  • the small valley areas of the fingerprint may pick up oil present on the area where the fingerprint sensor is located.
  • the obtained fingerprint may correspond to the second state.
  • the processor 120 may improve the recognition rate of fingerprint recognition without displaying information for guiding a user's drag input.
  • the processor 120 may perform fingerprint authentication based on the obtained fingerprint image and the reference image. Operation 1909 may correspond to operation 1707 of FIG. 17.


Abstract

According to various embodiments, an electronic device may comprise: a touch screen; a fingerprint sensor formed on at least a partial region of the touch screen; a memory; and a processor configured to acquire a first image of a user's fingerprint associated with a first touch input by using the fingerprint sensor, on the basis of the first touch input detected through the touch screen on the at least partial region, check whether a state of the fingerprint corresponds to a first state on the basis of the acquired first image and a reference image of the fingerprint stored in the memory, and display, through the touch screen, information for guiding a second touch input for changing the state of the fingerprint from the first state to a second state, on the basis of the check.
PCT/KR2018/016015 2018-01-31 2018-12-17 Apparatus and method for displaying guide information for changing fingerprint state WO2019151642A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180012403A 2018-01-31 Apparatus and method for displaying guide information for changing fingerprint state
KR10-2018-0012403 2018-01-31

Publications (1)

Publication Number Publication Date
WO2019151642A1 true WO2019151642A1 (fr) 2019-08-08

Family

ID=67479741

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/016015 WO2019151642A1 (fr) 2018-01-31 2018-12-17 Apparatus and method for displaying guide information for changing fingerprint state

Country Status (2)

Country Link
KR (1) KR20190093003A (fr)
WO (1) WO2019151642A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210036568A (ko) 2019-09-26 2021-04-05 Samsung Electronics Co., Ltd. Electronic device and control method therefor
KR20210112767A (ko) 2020-03-06 2021-09-15 Samsung Electronics Co., Ltd. Electronic device and fingerprint recognition method of electronic device
EP4357946A1 (fr) 2021-08-26 2024-04-24 Samsung Electronics Co., Ltd. Electronic device and method for providing a fingerprint recognition guide using the same
WO2023106698A1 (fr) * 2021-12-06 2023-06-15 Samsung Electronics Co., Ltd. Fingerprint detection method and electronic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130030170A (ko) * 2011-09-16 2013-03-26 Samsung Electronics Co., Ltd. Apparatus and method for supporting fingerprint recognition in a portable terminal
KR20150098158A (ko) * 2014-02-19 2015-08-27 Samsung Electronics Co., Ltd. Fingerprint recognition apparatus and method
KR20160071887A (ko) * 2014-12-12 2016-06-22 LG Electronics Inc. Mobile terminal and control method therefor
KR20160128872A (ko) * 2015-04-29 2016-11-08 Samsung Electronics Co., Ltd. Method for processing fingerprint information and electronic device supporting the same
KR20180000974A (ko) * 2016-06-24 2018-01-04 SK Telecom Co., Ltd. Touch screen device and fingerprint recognition method thereof

Also Published As

Publication number Publication date
KR20190093003A (ko) 2019-08-08

Similar Documents

Publication Publication Date Title
WO2020153725A1 Electronic device and method for preventing screen damage
WO2019151642A1 Apparatus and method for displaying guide information for changing fingerprint state
WO2018230875A1 Terminal and control method therefor
WO2018074877A1 Electronic device and method for acquiring fingerprint information
WO2019168318A1 Electronic device and fingerprint authentication interface method
WO2019146918A1 Fingerprint recognition method, and associated electronic device and storage medium
WO2014017858A1 User terminal apparatus and control method thereof
WO2018009029A1 Electronic device and operating method thereof
WO2019164183A1 Electronic device for acquiring biometric information by using an electrode selected from biometric sensor electrodes, and control method therefor
WO2019124811A1 Fingerprint verification method and electronic device for performing the same
WO2020085628A1 Method for displaying objects and electronic device for using the same
WO2021133021A1 Electronic device having a display and operating method thereof
WO2019160348A1 Electronic device for acquiring user input in a submerged state by using a pressure sensor, and method for controlling the electronic device
WO2020085643A1 Electronic device and control method therefor
WO2021080360A1 Electronic device and method for controlling operation of its display
WO2020251311A1 Electronic device and method for providing notification information thereby
WO2019124841A1 Electronic device and method for executing a function according to a gesture input
WO2018143566A1 Method and electronic device for displaying graphical objects for fingerprint input
WO2020106019A1 Electronic device and method for providing in-vehicle infotainment service
WO2019143199A1 Electronic device for determining a fingerprint processing method according to the pressure level of a fingerprint input, and method therefor
WO2019199086A1 Electronic device and control method for the electronic device
WO2021194080A1 Method for determining a user authentication scheme of an electronic device, and electronic device therefor
WO2020032512A1 Electronic device and method for displaying an affordance for providing battery charging of an external device through a display
WO2019177376A1 Method and electronic device for generating fingerprint information on the basis of multiple pieces of image information acquired using multiple driving schemes
WO2018062594A1 Mobile terminal and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18903114

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18903114

Country of ref document: EP

Kind code of ref document: A1