GB2599288A - User interface display method and electronic device - Google Patents


Info

Publication number
GB2599288A
GB2599288A
Authority
GB
United Kingdom
Prior art keywords
swiping
image
processor
registration
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2117618.5A
Other versions
GB2599288B (en)
GB202117618D0 (en)
Inventor
Chiang Yuan-Lin
Lu Jun-Chao
Hsu Hsien-Jen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Egis Technology Inc
Original Assignee
Egis Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Egis Technology Inc filed Critical Egis Technology Inc
Publication of GB202117618D0
Publication of GB2599288A
Application granted
Publication of GB2599288B
Status: Expired - Fee Related
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/1335 Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/13 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/1347 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/50 Maintenance of biometric data or enrolment thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/60 Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V 40/67 Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Collating Specific Patterns (AREA)
  • Image Input (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Position Input By Displaying (AREA)

Abstract

A display method of a user interface and an electronic device applied to fingerprint registration. The display method includes the following steps, conducted by a processor: obtaining a plurality of swiping images through a fingerprint sensor; analysing the swiping images to obtain a plurality of feature points of the swiping images; merging the feature points of the swiping images into a pre-registration dataset; analysing the pre-registration dataset to obtain a coordinate parameter; displaying a filled region on a reference image on a user interface of a display according to the swiping images; updating a range of the filled region of the reference image according to the coordinate parameter and an area of the swiping image; and analysing the pre-registration dataset to determine whether to end the fingerprint registration process. The fingerprint registration may be ended when the area of the pre-registration dataset is greater than a predetermined threshold. The present invention informs the user of the fingerprint registration progress through a display of the electronic device.

Description

USER INTERFACE DISPLAY METHOD AND ELECTRONIC DEVICE
Technical Field
The present invention relates to an interface display technique, and particularly relates to a display method of a user interface applied to fingerprint registration and an electronic device using the display method.
Description of Related Art
In recent years, fingerprint recognition techniques have been widely used in various electronic devices to provide functions such as identity login or identity verification. In the general fingerprint recognition technique, a user presses a finger on a fingerprint sensor to register a fingerprint in a manner of a one-time press or multiple presses, and a corresponding user interface is provided to inform the user of the progress of fingerprint registration. For example, if the fingerprint registration is performed through multiple presses, each time the user presses the finger, the area of the corresponding fingerprint image displayed on the user interface is increased. After an entire or sufficiently large range of the fingerprint is displayed, the fingerprint registration is completed.
However, if the user performs the fingerprint registration by swiping a finger, conventional fingerprint recognition techniques are unable to display a corresponding fingerprint image on the user interface according to the swiping progress of the user's finger to inform the user of the progress of the fingerprint registration. Namely, in the process of performing fingerprint registration by swiping a finger, the user cannot learn the progress of the fingerprint registration in real time.
SUMMARY OF THE INVENTION
The present invention provides a display method of a user interface and an electronic device, enabling a user to learn a corresponding fingerprint registration progress through a display of the electronic device in the process of performing fingerprint registration by swiping a finger.
A display method of a user interface of the present invention is applied to fingerprint registration. The display method includes the following steps: obtaining a swiping image through a fingerprint sensor; analysing the swiping image to obtain a plurality of feature points of the swiping image; determining whether the swiping image is a first swiping image; if the swiping image is the first swiping image, generating a pre-registration dataset according to the plurality of feature points of the swiping image, and analysing the pre-registration dataset to obtain a basic image parameter; and displaying a filled region on a reference image on the user interface according to the basic image parameter.
A display method of a user interface of the present invention is applied to fingerprint registration. The display method includes the following steps: obtaining a swiping image through a fingerprint sensor; analysing the swiping image to obtain a plurality of feature points of the swiping image, and obtaining a coordinate parameter of the feature point located at the upper-left-most corner of the swiping image; determining whether the swiping image is a first swiping image; if the swiping image is the first swiping image, generating a pre-registration dataset according to the plurality of feature points of the swiping image; and displaying a filled region on a reference image on the user interface according to the coordinate parameter and an area of the swiping image.
An electronic device of the present invention includes a fingerprint sensor, a processor and a display. The fingerprint sensor is configured to obtain a swiping image. The processor is coupled to the fingerprint sensor. The processor is configured to analyse the swiping image to obtain a plurality of feature points of the swiping image, and determine whether the swiping image is a first swiping image. The display is coupled to the processor. If the processor determines that the swiping image is the first swiping image, the processor generates a pre-registration dataset according to the plurality of feature points of the swiping image and analyses the pre-registration dataset to obtain a basic image parameter. The processor, through the display, displays a filled region on a reference image on a user interface according to the basic image parameter.
An electronic device of the present invention includes a fingerprint sensor, a processor and a display. The fingerprint sensor is configured to obtain a swiping image. The processor is coupled to the fingerprint sensor. The processor is configured to analyse the swiping image and obtain a coordinate parameter of the feature point located at the upper-left-most corner of the swiping image. The processor is further configured to determine whether the swiping image is a first swiping image. The display is coupled to the processor. If the processor determines that the swiping image is the first swiping image, the processor generates a pre-registration dataset according to the plurality of feature points of the swiping image. The processor displays a filled region on a reference image on a user interface according to the coordinate parameter and an area of the swiping image.
According to the above description, in the display method of a user interface and the electronic device of the present invention, by analysing a plurality of swiping images obtained during the process of fingerprint registration, a corresponding image adjusting parameter can be obtained, and a change in range of the filled region of the reference image on the user interface is displayed according to the image adjusting parameter, so as to provide the user with real-time information on the progress of fingerprint registration.
In order to make the aforementioned features and advantages of the present invention comprehensible, several embodiments accompanied with figures are described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the present invention; FIG. 2 is a flowchart illustrating a fingerprint registration method according to an embodiment of the present invention; FIG. 3 is a flowchart illustrating a fingerprint registration method according to a first embodiment of the present invention; FIG. 4 is a schematic diagram of a pre-registration dataset according to the first embodiment of the present invention; FIG. 5A to FIG. 5E are schematic diagrams showing finger operations and their corresponding UI displays according to the first embodiment of the present invention; FIG. 6 is a flowchart illustrating a fingerprint registration method according to a second embodiment of the present invention; FIG. 7 is a schematic diagram of a pre-registration dataset according to the second embodiment of the present invention; FIG. 8A to FIG. 8G are schematic diagrams showing finger operations and their corresponding UI displays according to the second embodiment of the present invention; FIG. 9 is a flowchart illustrating a fingerprint registration method according to a third embodiment of the present invention; FIG. 10 is a schematic diagram of a pre-registration dataset according to the third embodiment of the present invention; FIG. 11A to FIG. 11I are schematic diagrams showing finger operations and their corresponding UI displays according to the third embodiment of the present invention.
Description of the Reference Numerals:
100: electronic device; 110: processor; 120: fingerprint sensor; 130: memory; 140: display; 410, 420, 400, 710, 720, 730, 1010, 1020, 1030: image; 411, 711, 1011: feature point; 500, 800, 1100: user interface (UI); 510, 810, 1110: reference image; 511, 811, 811b, 1111: filled region; DW1, DW2: width; F: finger; h, H, W: image parameter; Step: image adjusting parameter; S210, S220, S225, S230, S232, S235, S240, S245, S250, S310, S320, S331, S332, S333, S334, S340, S341, S350, S380, S610, S620, S631, S632, S633, S634, S640, S641, S650, S680, S910, S920, S922, S924, S925, S926, S940, S980: step; (X1, Y1), (X2, Y2), (Xn, Yn): coordinate parameter; (Δx, Δy): displacement parameter.
DESCRIPTION OF EMBODIMENTS
In order to make the content of the present invention more comprehensible, embodiments are described below as examples to demonstrate that the present invention can actually be realized. In addition, wherever possible, components/members/steps with the same reference numerals in the drawings and the description stand for the same or like parts.
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the present invention. Referring to FIG. 1, an electronic device 100 includes a processor 110, a fingerprint sensor 120, a memory 130 and a display 140. The processor 110 is coupled to the fingerprint sensor 120, the memory 130 and the display 140. The electronic device 100 is, for example, an electronic product such as a smart phone, a notebook (NB), a tablet personal computer (PC), etc. In the present embodiment, the electronic device 100 executes a fingerprint sensing operation through the fingerprint sensor 120 to obtain a fingerprint image of a user's finger. In the present embodiment, when the user places a finger on the fingerprint sensor 120 to perform a swiping operation, the fingerprint sensor 120 performs fingerprint sensing. The fingerprint sensor 120 may obtain a plurality of swiping images successively and provide them to the processor 110. The processor 110 may analyse these swiping images to obtain a plurality of feature points from each of the swiping images. The feature points refer to fingerprint feature points of the finger. Thereafter, the processor 110 generates fingerprint registration data according to data of the feature points.
In the present embodiment, the fingerprint sensor 120 obtains the swiping images one by one, and while the processor 110 is analysing the swiping images one by one, the processor 110 may correspondingly change a filled region of a fingerprint reference image on a user interface (UI) displayed on the display 140 according to the analysis result of each of the swiping images obtained one by one. In the present embodiment, the reference image on the UI includes the filled region. The filled region of the reference image is used for representing a range which the obtained fingerprint information covers, and a range of the filled region of the reference image is progressively increased and changed corresponding to the current progress of a finger swiping by a user (i.e. corresponding to the progress of acquisition of the fingerprint information). Therefore, the fingerprint registration function of the electronic device 100 of the present invention may provide a good interaction effect such that the user can be informed of the current progress of the registration.
In the present embodiment, the processor 110 is, for example, a central processing unit (CPU), a system on chip (SoC) or other programmable general purpose or special purpose microprocessor, a digital signal processor (DSP), a programmable controller, an application specific integrated circuit (ASIC), a programmable logic device (PLD), other similar processing device or a combination of these devices.
In the present embodiment, the fingerprint sensor 120 is, for example, a capacitive fingerprint sensor or an optical fingerprint sensor, and the type of the fingerprint sensor 120 is not limited in the present invention. In the present embodiment, a fingerprint sensing mechanism of the fingerprint sensor 120 may be swiping sensing or pressing sensing. It is worth noting that in the embodiments of the present invention, the fingerprint registration is implemented through swiping sensing. Namely, during the process of fingerprint registration, the user places and swipes the finger on a sensing surface of the fingerprint sensor 120, and the fingerprint sensor 120 senses and obtains fingerprint information of the user through the sensing surface. For example, the electronic device 100 may be designed to perform fingerprint registration by asking the user to swipe the finger. In other words, the fingerprint sensor 120 may perform the fingerprint sensing in the manner of swiping sensing. For fingerprint authentication, the user is asked to press the finger. Namely, the fingerprint sensor 120 performs the fingerprint sensing in the manner of pressing sensing.
In the present embodiment, the memory 130 is configured to store fingerprint data described in the embodiments of the present invention and related applications for the processor 110 to read and execute.
In the present embodiment, the display 140 is, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, a micro LED display or an organic LED display, etc., and the type of the display 140 is not limited in the present invention. In the present embodiment, when the user performs the fingerprint registration, the display 140 displays the corresponding UI, and the UI includes a reference image simulating a fingerprint. During the process where the user swipes the finger on the fingerprint sensor 120, a range of the filled region of the reference image displayed on the display 140 is changed and increased corresponding to an increase of the fingerprint data sensed by the fingerprint sensor 120.
FIG. 2 is a flowchart illustrating a fingerprint registration method according to an embodiment of the present invention. Referring to FIG. 1 and FIG. 2, the fingerprint registration method of the present embodiment is applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs fingerprint registration, the electronic device 100, through the fingerprint sensor 120, executes the swiping fingerprint sensing operation and obtains swiping images of an object (i.e. the user's finger) one by one. In step S210, the fingerprint sensor 120 obtains a swiping image. In step S220, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image. In step S225, the processor 110 determines whether the swiping image is a first swiping image. If yes, step S230 is executed. In step S230, the processor 110 generates a pre-registration dataset according to the feature points of the swiping image, and analyses the pre-registration dataset to obtain a basic image parameter (h) of the pre-registration dataset. In step S232, the processor 110 displays a filled region of the reference image on the UI according to the basic image parameter (h). If the swiping image is not the first swiping image, step S235 is executed. In step S235, the processor 110 merges the feature points of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset. In step S240, the processor 110 analyses the merged pre-registration dataset to obtain a merged image parameter (H), and obtains an image adjusting parameter (Step=H-h) according to the merged image parameter (H) and the basic image parameter (h). The image adjusting parameter (Step) is equal to the merged image parameter (H) minus the basic image parameter (h). In step S245, the processor 110 sets the merged image parameter (H) as a new basic image parameter (h).
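The flow of FIG. 2 can be sketched as follows. This is a minimal illustration only, not the patented implementation: feature points are modelled as (x, y) tuples, and `image_parameter()` is a stand-in (assumed here to be the length-direction extent of the points) for whatever analysis the processor 110 actually performs; all names are assumptions.

```python
def image_parameter(points):
    """Length-direction extent of a set of feature points (basic/merged parameter)."""
    ys = [y for _, y in points]
    return max(ys) - min(ys)

def register(swiping_images):
    """Yield the filled-region length after each swiping image is processed."""
    dataset = []          # pre-registration dataset (merged feature points)
    h = 0                 # basic image parameter
    fill_length = 0       # length of the filled region on the UI
    for i, points in enumerate(swiping_images):
        if i == 0:                        # first swiping image (step S230)
            dataset = list(points)
            h = image_parameter(dataset)
            fill_length = h               # step S232: initial filled region
        else:                             # steps S235-S250
            dataset.extend(points)        # merge feature points (S235)
            H = image_parameter(dataset)  # merged image parameter (S240)
            step = H - h                  # image adjusting parameter Step = H - h
            h = H                         # step S245: H becomes the new h
            fill_length += step           # step S250: lengthen the filled region
        yield fill_length
```

With a first image whose points span a length of 10 and a second image extending the merged span to 18, the filled region grows from 10 to 18.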
In step S250, the processor 110 increases a range of the filled region of the reference image on the UI according to the image adjusting parameter (Step), i.e. increases a length of the filled region. In order to further convey technical details of the display method of a user interface and the fingerprint registration of the present invention to those skilled in the art, several embodiments are provided below for further description.
FIG. 3 is a flowchart illustrating a fingerprint registration method according to a first embodiment of the present invention. Referring to FIG. 1 and FIG. 3, the fingerprint registration method of the present embodiment is applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs the fingerprint registration, in step S310, the electronic device 100 senses the object (i.e. the user's finger) through the fingerprint sensor 120 to obtain a swiping image. In step S320, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image. In step S331, the processor 110 determines whether the swiping image is a first swiping image. If yes, the processor 110 executes step S332. In step S332, the processor 110 generates a pre-registration dataset according to the feature points of the swiping image, and obtains a basic image parameter (h) of the pre-registration dataset. In step S333, the processor 110 displays a filled region of the reference image according to the basic image parameter (h). If the swiping image is not the first swiping image, the processor 110 executes step S334. In step S334, the processor 110 merges the feature points of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
In step S340, the processor 110 analyses the merged pre-registration dataset to obtain a merged image parameter (H) and obtains an image adjusting parameter (Step=H-h) according to the merged image parameter (H) and the basic image parameter (h). The image adjusting parameter (Step) is equal to the merged image parameter (H) minus the basic image parameter (h). In step S341, the processor 110 sets the merged image parameter (H) as a new basic image parameter (h). In step S350, the processor 110 increases the range (i.e. length) of the filled region of the reference image according to the image adjusting parameter (Step). In step S380, the processor 110 determines whether the merged image parameter (H) is greater than a predetermined threshold. If yes, it means that sufficient fingerprint registration data have been obtained. Then, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120, and stores the pre-registration dataset into the memory 130 to serve as a fingerprint registration dataset, so as to complete the fingerprint registration process. If not, the processor 110 executes step S310 to obtain a following swiping image.
FIG. 4 is a schematic diagram of a pre-registration dataset according to the first embodiment of the present invention. FIG. 5A to FIG. 5E are schematic diagrams showing finger swiping operations and their corresponding UI displays according to the first embodiment of the present invention. Referring to FIG. 1, FIG. 4 and FIG. 5A to FIG. 5E, the present embodiment may also be applied to the flowchart of FIG. 3 above. In the present embodiment, after the fingerprint sensor 120 obtains a first swiping image 410 of a finger F, the processor 110 analyses the swiping image 410 to obtain a plurality of feature points 411 of the swiping image 410 and a basic image parameter h. It should be noted that an area of the swiping image 410 is equal to an area of the sensing surface of the fingerprint sensor 120. In the present embodiment, an initial value of the basic image parameter h may be a distance between two of the feature points 411 of the first swiping image 410 that are farthest from each other in a length direction, though the present invention is not limited thereto. In another embodiment, the initial value of the basic image parameter h may be a length of the swiping image 410, i.e. a length of the sensing surface of the fingerprint sensor 120. The processor 110 generates the pre-registration dataset according to the feature points 411 of the swiping image 410. Then, the processor 110 obtains a following swiping image 420 and obtains the feature points of the swiping image 420. In the present embodiment, the processor 110 merges the feature points of the swiping image 420 into the pre-registration dataset (i.e. to merge the feature points of the swiping images 410 and 420) to generate a new pre-registration dataset 400. The processor 110 calculates the merged image parameter H of the merged pre-registration dataset 400.
Similarly, the merged image parameter H may be a distance between two of the merged feature points that are farthest from each other in the length direction after the swiping images 410 and 420 are merged, or may be a sum of the lengths of the two swiping images (i.e. twice the length of the sensing surface of the fingerprint sensor 120) minus a length of an overlapped portion of the swiping images 410 and 420 after the swiping images 410 and 420 are merged. Then, the processor 110 subtracts the basic image parameter h from the merged image parameter H to obtain an image adjusting parameter Step(=H-h).
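The two alternative computations of the merged image parameter H described above can be illustrated as follows; the feature-point coordinates, sensor length and overlap value are invented for the example and are not taken from the patent.

```python
def extent(points):
    """Distance between the two feature points farthest apart in the length direction."""
    ys = [y for _, y in points]
    return max(ys) - min(ys)

sensor_length = 8                    # length of the sensing surface (assumed value)
first = [(2, 0), (3, 8)]             # feature points of swiping image 410 (assumed)
second = [(1, 6), (2, 14)]           # feature points of swiping image 420 (assumed)

h = extent(first)                    # basic image parameter from the first image
merged = first + second              # merged pre-registration dataset 400
H = extent(merged)                   # H as the farthest feature-point distance
overlap = 2                          # length of the overlapped portion (assumed)
H_alt = 2 * sensor_length - overlap  # H as sum of image lengths minus the overlap
step = H - h                         # image adjusting parameter Step = H - h
```

With these values, both formulations agree (H = H_alt = 14) and Step = 6, i.e. the merged dataset grew by 6 length units.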
Namely, each time the processor 110 merges the feature points of one swiping image into the merged pre-registration dataset 400, the processor 110 calculates the increased length of the merged pre-registration dataset 400, so as to correspondingly adjust a length of a filled region 511 of a reference image 510 on a UI 500. It is worth noting that a width DW1 of the filled region 511 of the reference image 510 is predetermined and fixed. Each time data of the feature points of one additional swiping image are added, the processor 110 may correspondingly increase the length of the filled region 511. Moreover, the processor 110 may determine whether the merged image parameter H is greater than the predetermined threshold. If yes, it means that sufficient fingerprint registration data have been obtained. For example, when the merged image parameter H is greater than the predetermined threshold, it means that a sufficient number of fingerprint feature points have been obtained or a sufficient number of swiping images have been obtained. Therefore, the processor 110 stores the pre-registration dataset to the memory 130 to serve as the fingerprint registration dataset, so as to complete the fingerprint registration process.
Take FIG. 5A to FIG. 5E as an example. As shown in FIG. 5A, when the user places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform a swiping operation, the electronic device 100 displays the reference image 510 on the UI 500 and obtains the swiping images one by one. When the first swiping image is obtained, the electronic device 100 displays the corresponding filled region 511 in the reference image 510, where the length of the filled region 511 corresponds to the basic image parameter h of the first swiping image. Moreover, as described above, the width of the filled region 511 is predetermined and fixed.
As shown in the figure, the width of the filled region 511 may be equal to or greater than a width of the reference image 510. As shown in FIG. 5B, when a second swiping image is obtained, the electronic device 100 may increase the length of the filled region 511 of the reference image 510 according to the image adjusting parameter Step. As shown in the figure, the width of the filled region 511 is fixed. As shown in FIG. 5C, when the finger F of the user leaves the fingerprint sensor 120, the length of the filled region 511 of the reference image 510 stops increasing. However, since the obtained fingerprint information is insufficient, i.e. the merged image parameter H is not greater than the predetermined threshold, the fingerprint registration process is not yet completed; therefore, the UI 500 continues to display the filled region 511 of the reference image 510 and prompts the user to swipe the finger again. Then, as shown in FIG. 5D, the user again places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform another swiping operation. The electronic device 100 then obtains a new swiping image, and increases the length of the filled region 511 of the reference image 510 according to the newly added swiping image (i.e. newly added data of the fingerprint feature points). Finally, as shown in FIG. 5E, when the merged image parameter H is greater than the predetermined threshold, the processor 110 will make the filled region 511 completely cover the reference image 510. Namely, the length of the filled region 511 will be greater than or equal to the length of the reference image 510. Therefore, the fingerprint registration process is completed, and the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120 and generates the fingerprint registration dataset according to the pre-registration dataset, so as to complete the fingerprint registration process.
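The UI behaviour just described, a filled region that lengthens by Step but never exceeds the reference image, and a registration that completes once H exceeds the threshold, can be sketched as below. The function names and the clamping at the reference-image length are assumptions for illustration, not the patented implementation.

```python
def update_filled_region(fill_length, step, reference_length):
    """Lengthen the filled region 511 by Step, capped at the reference image length."""
    return min(fill_length + step, reference_length)

def registration_complete(H, threshold):
    """Per FIG. 5E: the process ends once the merged image parameter H exceeds the threshold."""
    return H > threshold
```

For example, a filled region of length 95 grown by Step = 10 against a reference image of length 100 is clamped to 100, i.e. it completely covers the reference image.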
FIG. 6 is a flowchart illustrating a fingerprint registration method according to a second embodiment of the present invention. Referring to FIG. 1 and FIG. 6, the fingerprint registration method of the present embodiment is applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs the fingerprint registration, in step S610, the electronic device 100 senses the object (i.e. the user's finger) through the fingerprint sensor 120 to obtain a swiping image. In step S620, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image. In step S631, the processor 110 determines whether the swiping image is a first swiping image. If yes, the processor 110 executes step S632. In step S632, the processor 110 obtains a coordinate parameter (X, Y) of the feature point located at the upper-left-most corner of the swiping image, generates a pre-registration dataset according to the feature points of the swiping image, and obtains a basic image parameter (h) of the pre-registration dataset. In step S633, the processor 110 displays a filled region of the reference image on the display 140 according to the basic image parameter (h) and the coordinate parameter (X, Y). If the swiping image is not the first swiping image, the processor 110 executes step S634. In step S634, the processor 110 merges the feature points of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
In step S640, the processor 110 analyses the merged pre-registration dataset to obtain a first merged image parameter (H) and a second merged image parameter (W), and obtains an image adjusting parameter (Step=H-h) according to the first merged image parameter (H) and the basic image parameter (h). The first merged image parameter (H) may be a distance between two of the merged feature points that are farthest from each other in a length direction in the pre-registration dataset, or may be a sum of the lengths of a plurality of swiping images minus a length of an overlapped portion of the swiping images. The second merged image parameter (W) is a distance between two of the merged feature points that are farthest from each other in a width direction. In step S641, the processor 110 sets the first merged image parameter (H) as a new basic image parameter (h). In step S650, the processor 110 increases a range of the filled region of the reference image according to the image adjusting parameter (Step). In step S680, the processor 110 determines whether the first merged image parameter (H) is greater than a first predetermined threshold and whether the second merged image parameter (W) is greater than a second predetermined threshold. If yes, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates the fingerprint registration data according to the merged pre-registration dataset, so as to complete the fingerprint registration process. If not, the processor 110 executes step S610 to obtain a following swiping image.
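The loop of steps S610 through S680 can be modelled as follows. This is a minimal sketch under simplifying assumptions: each swiping image is reduced to a list of (x, y) feature points, merging is modelled as a set union, and the helper names are invented for illustration:

```python
def image_length(points):
    # Distance between the two feature points farthest apart
    # in the length (y) direction.
    ys = [p[1] for p in points]
    return max(ys) - min(ys)

def image_width(points):
    # Distance between the two feature points farthest apart
    # in the width (x) direction.
    xs = [p[0] for p in points]
    return max(xs) - min(xs)

def register_fingerprint(swipe_images, threshold_h, threshold_w):
    merged = set()
    h = 0            # basic image parameter
    fill_length = 0  # displayed filled-region length
    for points in swipe_images:
        if not merged:              # S631/S632: first swiping image
            merged = set(points)
            h = image_length(merged)
            fill_length = h         # S633: initial filled region
        else:                       # S634: merge into pre-registration dataset
            merged |= set(points)
        H = image_length(merged)    # S640: first merged image parameter
        W = image_width(merged)     # second merged image parameter
        step = H - h                # image adjusting parameter Step
        h = H                       # S641: new basic image parameter
        fill_length += step         # S650: grow the filled region
        if H > threshold_h and W > threshold_w:  # S680
            return fill_length, True
    return fill_length, False
```

A second image that extends the merged dataset lengthwise yields a positive Step, so the filled region grows exactly by the newly contributed length, as in step S650.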
FIG. 7 is a schematic diagram of a pre-registration dataset according to the second embodiment of the present invention. FIG. 8A to FIG. 8G are schematic diagrams showing finger operations and their corresponding UI displays according to the second embodiment of the present invention. Referring to FIG. 1, FIG. 7 and FIG. 8A to FIG. 8G, the present embodiment may also be applied to the flowchart of FIG. 6 above. In the present embodiment, after the fingerprint sensor 120 obtains a first swiping image 710 of the finger F, the processor 110 analyses the swiping image 710 to obtain a plurality of feature points 711 of the swiping image 710 and obtain the basic image parameter h and a coordinate parameter (X1, Y1) of the feature point located at the most upper left corner of the swiping image 710. The processor 110 displays a filled region 811 of a reference image 810 according to the coordinate parameter (X1, Y1) and the basic image parameter h. It should be noted that an area of the swiping image 710 is equal to the area of the sensing surface of the fingerprint sensor 120. In the present embodiment, the basic image parameter h refers to a distance between two of the feature points 711 of the swiping image 710 that are farthest from each other in a length direction, but the present invention is not limited thereto. In another embodiment, the basic image parameter h may refer to a length of the swiping image 710, i.e. the length of the sensing surface of the fingerprint sensor 120. The processor 110 generates the pre-registration dataset according to the feature points 711 of the swiping image 710. Then, the processor 110 obtains a following swiping image 720 and obtains the feature points of the swiping image 720. In the present embodiment, the processor 110 merges the feature points of the swiping image 720 into the pre-registration dataset (i.e. to merge the feature points of the swiping images 710 and 720) to generate a merged pre-registration dataset 700.
The processor 110 calculates a first merged image parameter H (i.e. the maximum image length) and a second merged image parameter W (i.e. the maximum image width) of the merged pre-registration dataset 700. The processor 110 subtracts the basic image parameter h from the first merged image parameter H to obtain the image adjusting parameter Step. Then, the processor 110 increases the length of the filled region 811 of the reference image 810 according to the image adjusting parameter Step.
Namely, each time when the processor 110 merges the feature points of one swiping image into the pre-registration dataset 700, the processor 110 calculates the increased length of the merged pre-registration dataset 700, so as to accordingly adjust the length of the filled region 811 of the reference image 810 on a UI 800. It is worth noting that a width of the filled region 811 of the reference image 810 is predetermined and fixed during each finger swiping operation. Namely, the width of the filled region 811 of the reference image 810 may be increased only when another finger swiping operation is performed. During each finger swiping operation, each time when data of the feature points of one additional swiping image are added, the processor 110 correspondingly increases the length of the filled region 811. In addition, the processor 110 merges a plurality of swiping images into the pre-registration dataset 700, and the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are respectively greater than the first and the second predetermined thresholds. If yes, it means that sufficient fingerprint registration data have been obtained.
For example, when the first merged image parameter H is greater than the first predetermined threshold and the second merged image parameter W is greater than the second predetermined threshold, it means that a sufficient number of fingerprint feature points or a sufficient number of swiping images have been obtained. Therefore, the processor 110 stores the pre-registration dataset to the memory 130 to serve as the fingerprint registration dataset, so as to complete the fingerprint registration process. If the first merged image parameter H is not greater than the first predetermined threshold or the second merged image parameter W is not greater than the second predetermined threshold, the processor 110 displays a prompt on the UI of the display 140 to request the user to swipe the finger again. During the second swiping operation, the processor 110 obtains a first swiping image 730 of the second swiping operation through the fingerprint sensor 120. The processor 110 then obtains the feature points 711 of the swiping image 730, and merges the feature points 711 into the pre-registration dataset 700. The processor 110 obtains a displacement parameter (Δx, Δy) (X2-X1=Δx, Y2-Y1=Δy) according to the coordinate parameter (X1, Y1) of the feature point located at the most upper left corner of the swiping image 710 (i.e. the first swiping image obtained during the first swiping operation) and a coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the first swiping image 730 obtained during the second swiping operation. According to the displacement parameter (Δx, Δy), the processor 110 may determine an increased width and position of the filled region 811 of the reference image 810 on the UI 800 corresponding to the second swiping operation.
In other words, when the second swiping operation is performed, the finger F of the user may shift to the right or left, and the processor 110 determines an increased width of the filled region 811 corresponding to the second swiping operation according to the coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the first swiping image 730 obtained during the second swiping operation, i.e. the displacement parameter (Δx, Δy), and determines a length of a newly added portion 811b of the filled region 811 corresponding to the second swiping operation according to the basic image parameter h of the first swiping image 730. The processor 110 displays a starting position of the portion 811b of the filled region 811 corresponding to the second swiping operation according to the coordinate parameter (X2, Y2). Namely, in the second swiping operation, a range of the filled region 811 in a width direction is increased corresponding to the degree of shifting of the finger F of the user. Then, the processor 110 increases the length of the portion 811b of the filled region 811 corresponding to the second swiping operation according to the later obtained swiping images and the corresponding image adjusting parameters Step. Each time when one swiping image is obtained, the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are respectively greater than the first and the second predetermined thresholds, so as to determine whether to end the fingerprint registration process.
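The displacement calculation between the two swiping operations reduces to simple coordinate arithmetic. The helper names below are assumptions for illustration; only the Δx = X2-X1, Δy = Y2-Y1 relationship comes from the text:

```python
def displacement(first_corner, second_corner):
    # Displacement parameter (Δx, Δy) between the upper-left
    # feature-point coordinates of the first swiping images of the
    # first and second swiping operations.
    (x1, y1), (x2, y2) = first_corner, second_corner
    return x2 - x1, y2 - y1

def new_portion_start(base_position, delta):
    # Starting position of the newly added portion 811b on the UI,
    # shifted from the first operation's position by (Δx, Δy).
    return base_position[0] + delta[0], base_position[1] + delta[1]
```

So a finger placed further right and higher than in the first operation yields a positive Δx and negative Δy, shifting the starting position of portion 811b accordingly.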
Taking FIG. 8A to FIG. 8G as an example, as shown in FIG. 8A, when the user presses the finger F on the fingerprint sensor 120 of the electronic device 100 and performs the first swiping operation, the corresponding filled region 811 is displayed on the reference image 810 on the UI 800 corresponding to the first swiping image obtained by the fingerprint sensor 120. As shown in FIG. 8B and FIG. 8C, during the first swiping operation of the finger F, the length of the filled region 811 of the reference image 810 is correspondingly increased. Moreover, during the first swiping operation of the finger F, the processor 110 increases the length of the filled region 811 of the reference image 810 according to the image adjusting parameter Step by fixing an image width. That is, during the first swiping operation of the finger F, a width DW1 of the filled region 811 of the reference image 810 is fixed, and a length thereof is increased after a new swiping image is obtained. As shown in FIG. 8D, when the finger F of the user leaves the fingerprint sensor 120, the range of the filled region 811 of the reference image 810 stops increasing. However, since the fingerprint registration is not yet completed, the UI 800 stays on the reference image 810 and the current filled region 811 thereof, and a prompt is displayed to request the user to swipe the finger again. Therefore, as shown in FIG. 8E and FIG. 8F, the user presses the finger again on the fingerprint sensor 120 of the electronic device 100 to perform the second swiping operation. Compared to the finger's placing position during the first swiping operation, during the second swiping operation, the position of the user's finger is displaced by a distance to the upper right.
After the first swiping image of the second swiping operation is obtained, the processor 110 calculates the coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the first swiping image, and subtracts the coordinate parameter (X1, Y1) of the feature point located at the most upper left corner of the first swiping image obtained during the first swiping operation from the coordinate parameter (X2, Y2) to obtain the displacement parameter (Δx, Δy) (X2-X1=Δx, Y2-Y1=Δy), and determines a display position of the first swiping image corresponding to the second swiping operation, i.e. the starting position of the newly added portion 811b of the filled region 811 corresponding to the second swiping operation, according to (Δx, Δy). As shown in the figures, during the second swiping operation of the finger, the range of the filled region 811 of the reference image 810 continues to increase (i.e. 811b). The processor 110 determines the starting position of the newly added portion 811b of the filled region 811 according to the displacement parameter (Δx, Δy), and according to the image adjusting parameter Step, increases the length of the portion 811b of the filled region 811 of the reference image 810 corresponding to the second swiping operation by fixing the image width. That is, during the second swiping operation of the finger F, a width DW2 of the newly added portion 811b of the filled region 811 of the reference image 810 is fixed while the length of the portion 811b is increased after a new swiping image is obtained. Finally, as shown in FIG. 8G, when the first merged image parameter H and the second merged image parameter W of the registration dataset 700 are respectively greater than the first and the second predetermined thresholds, the range of the filled region 811 of the reference image 810 is increased to achieve sufficient length and width, and the processor 110 therefore stops the fingerprint sensing operation of the fingerprint sensor 120, so as to complete the fingerprint registration process.
FIG. 9 is a flowchart illustrating a fingerprint registration method according to a third embodiment of the present invention. Referring to FIG. 1 and FIG. 9, the fingerprint registration method of the present embodiment may be applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs the fingerprint registration, in step S910, the electronic device 100 senses the object (i.e. the user's finger) through the fingerprint sensor 120 to obtain a swiping image. In step S920, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image and obtain a coordinate parameter (X, Y) of the feature point located at the most upper left corner of the swiping image. In step S922, the processor 110 determines whether the swiping image is a first swiping image. If yes, the processor 110 executes step S924. In step S924, the processor 110 generates the pre-registration dataset according to the feature points of the swiping image. Then, in step S925, the processor 110 displays the corresponding filled region on the reference image on the UI according to the coordinate parameter (X, Y) and an area of the swiping image. It is worth noting that the area of the swiping image is equal to the area of the sensing surface of the fingerprint sensor 120. If the swiping image is not the first swiping image, the processor 110 executes step S926. In step S926, the processor 110 merges the feature points of the swiping image into the pre-registration dataset. In step S940, the processor 110 increases the range of the filled region according to the coordinate parameter (X, Y) and the area of the swiping image. In step S980, the processor 110 determines whether a total area of the pre-registration dataset is greater than a predetermined threshold.
The total area of the pre-registration dataset may represent a sum of the areas of all of the swiping images minus the overlapped regions of the swiping images, or the number of the feature points included in the pre-registration dataset. In other words, in this embodiment, in step S980, the processor 110 determines whether the number of the feature points included in the pre-registration dataset is greater than the predetermined threshold. If yes, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the merged pre-registration dataset, so as to complete the fingerprint registration process. If not, the processor 110 executes step S910 to sense and obtain a following swiping image.
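The feature-point-count form of the step S980 criterion can be sketched directly; the function name is an assumption, and duplicate points are collapsed because merged feature points that appear in several swiping images are counted once:

```python
def registration_done(pre_registration_points, threshold):
    # Step S980, feature-count variant: registration ends when the
    # merged pre-registration data contain more than `threshold`
    # distinct feature points. Duplicates (overlapping regions of
    # successive swiping images) are collapsed by the set().
    return len(set(pre_registration_points)) > threshold
```

This mirrors why overlapped regions are subtracted in the area-based variant: overlapping data must not be counted twice when judging whether enough of the fingerprint has been captured.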
FIG. 10 is a schematic diagram of pre-registration data according to the third embodiment of the present invention. FIG. 11A to FIG. 11I are schematic diagrams showing finger swiping operations and their corresponding UI displays according to the third embodiment of the present invention. The following descriptions can be understood with reference to FIG. 1, FIG. 10 and FIG. 11A to FIG. 11I. In addition, the present embodiment may also be applied to the flowchart of FIG. 9 above. In the present embodiment, after the fingerprint sensor 120 obtains a first swiping image 1010 of the finger F, the processor 110 analyses the swiping image 1010 to obtain a plurality of feature points 1011 of the swiping image 1010 and obtain the coordinate parameter (X1, Y1) of the feature point 1011 located at the most upper left corner of the swiping image 1010. As shown in FIG. 11A, the processor 110 displays a filled region 1111 of a reference image 1110 on a UI 1100 according to the coordinate parameter (X1, Y1) and the area of the swiping image 1010 (i.e. the area of the sensing surface of the fingerprint sensor 120). Moreover, the processor 110 generates pre-registration data according to the feature points of the swiping image 1010.
Then, the processor 110 obtains and analyses a following swiping image 1020 to obtain a plurality of feature points 1011 of the swiping image 1020. In the present embodiment, the processor 110 compares the feature points of the swiping images 1010 and 1020 to find the feature points simultaneously included in the swiping images 1010 and 1020, so as to obtain a relative position relationship between the swiping images 1010 and 1020, and also obtains a coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the swiping image 1020. As shown in FIG. 11B, the processor 110 may increase a display range of the filled region 1111 of the reference image 1110 according to the coordinate parameter (X2, Y2) and the area of the swiping image 1020. Moreover, the processor 110 merges the feature points of the swiping image 1020 into the pre-registration data to generate merged pre-registration data.
Namely, each time when the processor 110 obtains one swiping image, the processor 110 merges the feature points thereof into the pre-registration data. Moreover, the processor 110 obtains the coordinate parameter of the feature point located at the most upper left corner of the swiping image to determine the to-be-increased range and position of the filled region 1111 of the reference image 1110 on the UI 1100. It is worth noting that the processor 110 determines whether to end the fingerprint registration by determining whether the total area of the pre-registration data is greater than a predetermined threshold. If the total area of the pre-registration data is not greater than the predetermined threshold, the processor 110 senses and obtains a following swiping image. As shown in FIG. 10 and FIG. 11E to FIG. 11F, in the process of fingerprint registration, after the first swiping operation, the user's finger may leave the fingerprint sensor 120. If the obtained fingerprint data are still insufficient, namely, the total area of the pre-registration data is not greater than the predetermined threshold, the processor 110 may display a prompt on the UI through the display 140 to request the user to swipe the finger again. During the second swiping operation, the processor 110 obtains a first swiping image 1030 of the second swiping operation, obtains the feature points of the swiping image 1030, and merges the feature points into the pre-registration data. The processor 110 obtains a coordinate parameter (Xn, Yn) of the feature point located at the most upper left corner of the swiping image 1030, and increases the range of the filled region 1111 of the reference image 1110 on the UI 1100 according to the coordinate parameter (Xn, Yn) and the area of the swiping image 1030.
It should be noted that by comparing and analysing the pre-registration data and the feature points of the swiping image 1030 (i.e. finding out the feature points that appear repeatedly), the processor 110 may obtain a relative position relationship between the swiping image 1030 and the previously obtained swiping images and accordingly obtain the coordinate parameter (Xn, Yn). In other words, the processor 110 displays the filled region 1111 in the reference image 1110 according to the relative position relationship between the swiping image 1030 and the previously obtained swiping images. Moreover, the processor 110 may determine whether the total area of the new pre-registration data is greater than the predetermined threshold to determine whether to end the fingerprint registration.
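The relative-position step (finding feature points that appear in both the pre-registration data and the new swiping image) can be sketched as a simple translation estimate. This is a deliberate simplification: feature points are modelled as hashable descriptors mapped to coordinates, and real fingerprint matching would use robust descriptor matching rather than exact keys:

```python
def relative_offset(known_points, new_points):
    # known_points / new_points: dict mapping a feature descriptor
    # to its (x, y) coordinates in its own image's frame.
    common = known_points.keys() & new_points.keys()
    if not common:
        return None  # no repeated feature points: position unknown
    # Translation that maps the new image into the pre-registration
    # frame; a robust estimator would vote, the mean suffices here.
    dx = [known_points[d][0] - new_points[d][0] for d in common]
    dy = [known_points[d][1] - new_points[d][1] for d in common]
    return sum(dx) / len(dx), sum(dy) / len(dy)
```

The returned offset is what lets the processor place the new swiping image (and hence the increased filled region 1111) at the correct position within the reference image 1110.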
Taking FIG. 11A to FIG. 11I as an example, as shown in FIG. 11A, when the user places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform a swiping operation, the filled region 1111 is displayed on the reference image 1110 on the UI 1100 corresponding to the swiping images obtained by the fingerprint sensor 120. As shown in FIG. 11B to FIG. 11D, during the swiping operation of the finger F, the range of the filled region 1111 of the reference image 1110 is correspondingly adjusted. As shown in FIG. 11E, when the finger F of the user leaves the fingerprint sensor 120, the range of the filled region 1111 of the reference image 1110 stops increasing. However, since the fingerprint registration is not yet completed, the UI 1100 stays on the previous filled region 1111 of the reference image 1110, and the user is prompted and requested to swipe the finger again. As shown in FIG. 11F to FIG. 11H, the user again places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform a swiping operation. The range of the filled region 1111 of the reference image 1110 is continually increased corresponding to the user's swiping operation. As shown in FIG. 11I, when the total area of the pre-registration data is greater than the predetermined threshold, the processor 110 determines that sufficient fingerprint data have been obtained, and the range of the filled region 1111 is increased to cover a sufficient range of the reference image 1110. Therefore, the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120, and generates the fingerprint registration data according to the pre-registration data, so as to complete the fingerprint registration process.
In summary, in the display method of a user interface and the electronic device of the present invention, a plurality of swiping images obtained by one or more swiping operations of the user's finger on the fingerprint sensor are collected. The data of the feature points of the swiping images are merged to generate the fingerprint registration data. When the data of the feature points of the swiping images are merged, the electronic device of the present invention further analyses the repetition of and the position relationship between the feature points of the swiping images so as to obtain the corresponding image parameters and/or coordinate parameters. Therefore, in the display method of a user interface and the electronic device of the present invention, the UI including the reference image and the filled region thereof is correspondingly displayed on the display according to the image parameters and/or the coordinate parameters, so that the range of the filled region of the reference image on the UI can be dynamically adjusted. Namely, during the finger swiping operation performed for fingerprint registration, the user may learn the progress of the fingerprint registration through the change in the range of the filled region of the reference image on the UI displayed on the display of the electronic device. Accordingly, during the finger swiping operation performed by the user for fingerprint registration, the display method of a user interface and the electronic device of the present invention may provide real-time fingerprint registration progress information to the user, so as to provide a more user-friendly and convenient fingerprint registration process.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention.
In view of the foregoing, it is intended that the invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents. The following numbered statements set out various exemplary embodiments of the invention: 1. A display method of a user interface, applied to fingerprint registration, characterized in that the display method comprises: obtaining a plurality of swiping images through a fingerprint sensor; analysing the plurality of swiping images by a processor to obtain a plurality of feature points of the plurality of swiping images; merging the plurality of feature points of the plurality of swiping images into a pre-registration dataset by the processor; displaying a filled region on a reference image on a user interface of a display according to the plurality of swiping images by the processor; and analysing the pre-registration dataset by the processor to determine whether to end the fingerprint registration.
2. The display method of a user interface as defined in statement 1, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: analysing the pre-registration dataset to obtain an image adjusting parameter; and determining a range of the filled region of the reference image according to the image adjusting parameter.
3. The display method of a user interface as defined in statement 2, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether the swiping image is a first swiping image, and if the swiping image is the first swiping image, generating the pre-registration dataset according to the feature point of the swiping image, and analysing the pre-registration dataset to obtain a basic image parameter; and displaying the filled region on the reference image on the user interface of the display according to the basic image parameter.
4. The display method of a user interface as defined in statement 3, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset, analysing the merged pre-registration dataset to obtain a merged image parameter, and obtaining the image adjusting parameter according to the merged image parameter and the basic image parameter; setting the merged image parameter as a new basic image parameter; and increasing the range of the filled region of the reference image according to the image adjusting parameter.
5. The display method of a user interface as defined in statement 4, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether the merged image parameter is greater than a predetermined threshold to determine whether to end the fingerprint registration.
6. The display method of a user interface as defined in statement 3, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether the swiping image is the first swiping image, and if the swiping image is the first swiping image, analysing the swiping image to obtain a coordinate parameter of the feature point located at the most upper left corner of the swiping image; and determining a position of the filled region of the reference image according to the coordinate parameter of the feature point located at the most upper left corner of the swiping image.
7. The display method of a user interface as defined in statement 6, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset; analysing the merged pre-registration dataset to obtain a first merged image parameter and a second merged image parameter, and obtaining the image adjusting parameter according to the first merged image parameter and the basic image parameter, wherein the first merged image parameter represents a range the merged pre-registration dataset covers in a vertical direction, and the second merged image parameter represents a range the merged pre-registration dataset covers in a horizontal direction; setting the first merged image parameter as a new basic image parameter; and increasing the range of the filled region of the reference image in the vertical direction according to the image adjusting parameter.
8. The display method of a user interface as defined in statement 7, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether the first merged image parameter is greater than a first predetermined threshold, and determining whether the second merged image parameter is greater than a second predetermined threshold, so as to determine whether to end the fingerprint registration.
9. The display method of a user interface as defined in statement 1, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: analysing the pre-registration dataset to obtain a coordinate parameter; and determining a range of the filled region of the reference image according to the coordinate parameter.
10. The display method of a user interface as defined in statement 9, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether the swiping image is a first swiping image, and if the swiping image is the first swiping image, generating the pre-registration dataset according to the feature point of the swiping image; and if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
11. The display method of a user interface as defined in statement 10, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether a total area of the merged pre-registration dataset is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
12. An electronic device, characterized by comprising: a fingerprint sensor, configured to obtain a plurality of swiping images; a processor, coupled to the fingerprint sensor, configured to analyse the plurality of swiping images to obtain a plurality of feature points of the plurality of swiping images, and merge the plurality of feature points of the plurality of swiping images into a pre-registration dataset; and a display, coupled to the processor, wherein the processor displays a filled region on a reference image on a user interface by the display according to the plurality of swiping images, and the processor analyses the pre-registration dataset to determine whether to end the fingerprint registration.
13. The electronic device as defined in statement 12, characterized in that the processor executes the following on each of the plurality of swiping images: the processor analyses the pre-registration dataset to obtain an image adjusting parameter, and the processor determines a range of the filled region of the reference image according to the image adjusting parameter.
14. The electronic device as defined in statement 13, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether the swiping image is a first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor generates the pre-registration dataset according to the feature point of the swiping image, and analyses the pre-registration dataset to obtain a basic image parameter, and the processor displays the filled region on the reference image on the user interface of the display according to the basic image parameter.
15. The electronic device as defined in statement 14, characterized in that the processor further executes the following on each of the plurality of swiping images: if the processor determines that the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset, wherein the processor analyses the merged pre-registration dataset to obtain a merged image parameter, and the processor obtains the image adjusting parameter according to the merged image parameter and the basic image parameter, wherein the processor sets the merged image parameter as a new basic image parameter, and the processor increases the range of the filled region of the reference image according to the image adjusting parameter.
16. The electronic device as defined in statement 15, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether the merged image parameter is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
17. The electronic device as defined in statement 14, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether the swiping image is the first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor analyses the swiping image to obtain a coordinate parameter of a feature point located at the upper-left-most corner of the swiping image, and the processor determines a position of the filled region of the reference image according to the coordinate parameter of the feature point located at the upper-left-most corner of the swiping image.
18. The electronic device as defined in statement 17, characterized in that the processor further executes the following on each of the plurality of swiping images: if the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset, wherein the processor analyses the merged pre-registration dataset to obtain a first merged image parameter and a second merged image parameter, and the processor obtains the image adjusting parameter according to the first merged image parameter and the basic image parameter, wherein the first merged image parameter represents a range the merged pre-registration dataset covers in a vertical direction, and the second merged image parameter represents a range the merged pre-registration dataset covers in a horizontal direction, wherein the processor sets the first merged image parameter as a new basic image parameter, and the processor increases the range of the filled region of the reference image in the vertical direction according to the image adjusting parameter.
19. The electronic device as defined in statement 18, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether the first merged image parameter is greater than a first predetermined threshold and whether the second merged image parameter is greater than a second predetermined threshold, so as to determine whether to end the fingerprint registration.
20. The electronic device as defined in statement 12, characterized in that the processor executes the following on each of the plurality of swiping images: the processor analyses the pre-registration dataset to obtain a coordinate parameter, and the processor determines a range of the filled region of the reference image according to the coordinate parameter.
21. The electronic device as defined in statement 20, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether the swiping image is a first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor generates the pre-registration dataset according to the feature point of the swiping image, and if the processor determines that the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
22. The electronic device as defined in statement 21, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether a total area of the merged pre-registration dataset is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
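The per-swipe loop in statements 14–16 and the end-of-registration test in statements 18–19 can be illustrated with a small sketch. This is an illustration only, not the patented implementation: the helper names, the simple point-set coverage model, and the threshold values are all assumptions.

```python
# Illustrative sketch of statements 14-16 (merge swipe by swipe, grow the
# filled region by the adjusting parameter) and statement 19 (end when both
# coverage parameters pass their thresholds). Names and values are assumed.

VERTICAL_THRESHOLD = 8    # assumed first predetermined threshold
HORIZONTAL_THRESHOLD = 6  # assumed second predetermined threshold

def coverage(points):
    """First merged image parameter: range covered in the vertical
    direction; second: range covered horizontally (statement 18)."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return max(ys) - min(ys), max(xs) - min(xs)

def register(swiping_images, extract_features):
    """Merge feature points swipe by swipe and grow the filled region."""
    pre_registration = set()
    basic = 0
    filled_vertical = 0
    for index, image in enumerate(swiping_images):
        features = extract_features(image)
        if index == 0:
            # First swipe: create the pre-registration dataset and derive
            # the basic image parameter from it (statement 14).
            pre_registration = set(features)
            basic, _ = coverage(pre_registration)
            filled_vertical = basic
        else:
            # Later swipes: merge, compute the adjusting parameter, and
            # grow the filled region (statements 15 and 18).
            pre_registration |= set(features)
            first, second = coverage(pre_registration)
            adjusting = first - basic
            basic = first  # merged parameter becomes the new basic parameter
            filled_vertical += adjusting
            # Statement 19: end only when both parameters pass thresholds.
            if first > VERTICAL_THRESHOLD and second > HORIZONTAL_THRESHOLD:
                break
    return pre_registration, filled_vertical
```

Here each feature point is reduced to an (x, y) coordinate so that "coverage" is just a bounding-box extent; a real sensor pipeline would derive these parameters from minutiae data.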

Claims (6)

  1. WHAT IS CLAIMED IS: A display method of a user interface, applied as part of a fingerprint registration process, characterized in that the display method comprises: obtaining a plurality of swiping images through a fingerprint sensor; analysing the plurality of swiping images by a processor to obtain a plurality of feature points of the plurality of swiping images; merging the plurality of feature points of the plurality of swiping images into a pre-registration dataset by the processor; displaying a filled region on a reference image on a user interface of a display according to the plurality of swiping images by the processor; analysing the pre-registration dataset by the processor to determine whether to end the fingerprint registration process; and executing the following on each of the plurality of the swiping images by the processor: analysing the pre-registration dataset to obtain a coordinate parameter; and updating a range of the filled region of the reference image according to the coordinate parameter and an area of the swiping image.
  2. The display method of a user interface as claimed in claim 1, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether the swiping image is a first swiping image, and if the swiping image is the first swiping image, generating the pre-registration dataset according to the feature point of the swiping image; and if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
  3. The display method of a user interface as claimed in claim 2, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor: determining whether a total area of the merged pre-registration dataset is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
  4. An electronic device, characterized by comprising: a fingerprint sensor, configured to obtain a plurality of swiping images during a fingerprint registration process; a processor, coupled to the fingerprint sensor, configured to analyse the plurality of swiping images to obtain a plurality of feature points of the plurality of swiping images, and merge the plurality of feature points of the plurality of swiping images into a pre-registration dataset; and a display, coupled to the processor, wherein the processor displays a filled region on a reference image on a user interface by the display according to the plurality of swiping images, and the processor analyses the pre-registration dataset to determine whether to end the fingerprint registration process, wherein the processor executes the following on each of the plurality of swiping images: the processor analyses the pre-registration dataset to obtain a coordinate parameter, and the processor updates a range of the filled region of the reference image according to the coordinate parameter and an area of the swiping image.
  5. The electronic device as claimed in claim 4, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether the swiping image is a first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor generates the pre-registration dataset according to the feature point of the swiping image, and if the processor determines that the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
  6. The electronic device as claimed in claim 5, characterized in that the processor further executes the following on each of the plurality of swiping images: the processor determines whether a total area of the merged pre-registration dataset is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration process.
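Granted claims 1–3 (mirrored in device claims 4–6) describe positioning the filled region from a coordinate parameter, sizing it from the swipe's area, and ending registration once the merged dataset's total area passes a predetermined threshold. A minimal sketch under stated assumptions: the (x, y, w, h) rectangle model, the cell-counting area measure, and the threshold value are all hypothetical, not the patent's implementation.

```python
# Hypothetical sketch of claims 1-3: each swipe contributes a coordinate
# parameter (top-left corner) and an area; registration ends when the
# total merged area exceeds a predetermined threshold.

AREA_THRESHOLD = 20  # assumed predetermined threshold (arbitrary units)

def update_filled_region(swipes):
    """swipes: iterable of (x, y, w, h) rectangles.
    Returns the filled-region rectangles and whether registration ended."""
    merged_cells = set()   # stands in for the merged pre-registration dataset
    filled_regions = []
    for x, y, w, h in swipes:
        # The coordinate parameter (x, y) positions the filled region;
        # the swipe's area (w * h) sets its range (claim 1).
        filled_regions.append((x, y, w, h))
        merged_cells |= {(x + dx, y + dy)
                         for dx in range(w) for dy in range(h)}
        # Claim 3: compare the total merged area against the threshold;
        # overlapping swipes are counted once, as in a merged dataset.
        if len(merged_cells) > AREA_THRESHOLD:
            return filled_regions, True  # end the registration process
    return filled_regions, False
```

Counting unit cells in a set means overlap between swipes does not inflate the total area, which matches the intent of merging feature points before testing the threshold.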
GB2117618.5A 2017-10-16 2018-10-15 User interface display method and electronic device Expired - Fee Related GB2599288B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201762573140P 2017-10-16 2017-10-16
US201762598480P 2017-12-14 2017-12-14
CN201810349409.2A CN109669651B (en) 2017-10-16 2018-04-18 Display method of user interface and electronic device
GB1914055.7A GB2574973B (en) 2017-10-16 2018-10-15 User interface display method and electronic device

Publications (3)

Publication Number Publication Date
GB202117618D0 GB202117618D0 (en) 2022-01-19
GB2599288A true GB2599288A (en) 2022-03-30
GB2599288B GB2599288B (en) 2022-11-23

Family

ID=66141976

Family Applications (2)

Application Number Title Priority Date Filing Date
GB2117618.5A Expired - Fee Related GB2599288B (en) 2017-10-16 2018-10-15 User interface display method and electronic device
GB1914055.7A Expired - Fee Related GB2574973B (en) 2017-10-16 2018-10-15 User interface display method and electronic device

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB1914055.7A Expired - Fee Related GB2574973B (en) 2017-10-16 2018-10-15 User interface display method and electronic device

Country Status (4)

Country Link
JP (1) JP6836662B2 (en)
CN (1) CN109669651B (en)
GB (2) GB2599288B (en)
WO (1) WO2019076272A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI735821B (en) * 2018-04-12 2021-08-11 神盾股份有限公司 Fingerprint registration method and electronic device using the same

Citations (1)

Publication number Priority date Publication date Assignee Title
US20160321494A1 (en) * 2015-04-29 2016-11-03 Samsung Electronics Co., Ltd. Fingerprint information processing method and electronic device supporting the same

Family Cites Families (17)

Publication number Priority date Publication date Assignee Title
JP3663075B2 (en) * 1999-04-05 2005-06-22 シャープ株式会社 Information processing device
JP3780830B2 (en) * 2000-07-28 2006-05-31 日本電気株式会社 Fingerprint identification method and apparatus
DE60238281D1 (en) * 2002-09-17 2010-12-23 Fujitsu Ltd APPARATUS FOR RECORDING BIOLOGICAL INFORMATION AND BIOLOGICAL INFORMATION USING THE AUTHORIZATION DEVICE
CN1924889A (en) * 2005-08-30 2007-03-07 知网生物识别科技股份有限公司 Apparatus and method of fingerprint register
CN102446271B (en) * 2010-10-08 2014-08-27 金佶科技股份有限公司 Sectional type image identification method and regional type identification device thereof
TWI562077B (en) * 2012-01-04 2016-12-11 Gingy Technology Inc Method for fingerprint recognition using dual camera and device thereof
KR101419784B1 (en) * 2013-06-19 2014-07-21 크루셜텍 (주) Method and apparatus for recognizing and verifying fingerprint
US9898642B2 (en) * 2013-09-09 2018-02-20 Apple Inc. Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
KR102126568B1 (en) * 2013-10-31 2020-06-24 삼성전자주식회사 Method for processing data and an electronic device thereof
KR102217858B1 (en) * 2013-11-13 2021-02-19 삼성전자주식회사 Method for fingerprint authentication, fingerprint authentication device, and mobile terminal performing thereof
CN105989349B (en) * 2014-10-24 2019-11-01 神盾股份有限公司 The log-on data production method and electronic device of fingerprint
US9613428B2 (en) * 2014-11-07 2017-04-04 Fingerprint Cards Ab Fingerprint authentication using stitch and cut
US10032062B2 (en) * 2015-04-15 2018-07-24 Samsung Electronics Co., Ltd. Method and apparatus for recognizing fingerprint
KR101639986B1 (en) * 2015-10-07 2016-07-15 크루셜텍 (주) Fingerprint information processing method and apparatus for speed improvement of fingerprint registration and authentification
CN105373786A (en) * 2015-11-30 2016-03-02 东莞酷派软件技术有限公司 Fingerprint acquisition method, fingerprint acquisition device and electronic device
CN105814586B (en) * 2016-03-17 2019-10-01 深圳信炜科技有限公司 Fingerprint register method, fingerprint recognition system and electronic equipment
CN107004131A (en) * 2017-03-09 2017-08-01 深圳市汇顶科技股份有限公司 The method and device of fingerprint recognition

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
US20160321494A1 (en) * 2015-04-29 2016-11-03 Samsung Electronics Co., Ltd. Fingerprint information processing method and electronic device supporting the same

Also Published As

Publication number Publication date
CN109669651A (en) 2019-04-23
WO2019076272A1 (en) 2019-04-25
GB2574973A (en) 2019-12-25
JP6836662B2 (en) 2021-03-03
JP2020515955A (en) 2020-05-28
GB2574973B (en) 2022-04-13
CN109669651B (en) 2021-02-02
GB201914055D0 (en) 2019-11-13
GB2599288B (en) 2022-11-23
GB202117618D0 (en) 2022-01-19

Similar Documents

Publication Publication Date Title
US11323658B2 (en) Display apparatus and control methods thereof
KR100947990B1 (en) Gaze Tracking Apparatus and Method using Difference Image Entropy
USRE42794E1 (en) Information-inputting device inputting contact point of object on recording surfaces as information
TWI471776B (en) Method and computing device for determining angular contact geometry
KR20130013678A (en) Touch-type portable terminal
US10466745B2 (en) Operational control method for flexible display device
TWI528271B (en) Method, apparatus and computer program product for polygon gesture detection and interaction
JP6502511B2 (en) Calculation device, control method of calculation device, and calculation program
TW201426513A (en) Apparatus and method for processing handwriting input
US20150227789A1 (en) Information processing apparatus, information processing method, and program
TWI731442B (en) Electronic apparatus and object information recognition method by using touch data thereof
CN104463145A (en) Electronic equipment and obstacle reminding method
EP3400827A1 (en) Electronic make-up mirror device and background switching method thereof
TW202036366A (en) Method for Determining Imaging Ratio of Curved Display, Storage Medium and Electronic Device
GB2599288A (en) User interface display method and electronic device
WO2018045565A1 (en) Control display method and device for flexible display device
US20190114461A1 (en) Display method of user interface and electronic apparatus thereof
KR20190088679A (en) Electronic device and method for determining fingerprint processing method based on pressure level of fingerprint input
KR101542671B1 (en) Method and apparatus for space touch
JP6659210B2 (en) Handwriting input device and handwriting input method
US20150084889A1 (en) Stroke processing device, stroke processing method, and computer program product
JP2018049498A (en) Image processor, operation detection method, computer program, and storage medium
CN112764565B (en) Electronic device and object information identification method using touch data thereof
TWI715252B (en) Electronic apparatus and object information recognition method by using touch data thereof
TWI674536B (en) Fingerprint navigation method and electronic device

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20231015