GB2574973A - User interface display method and electronic device


Info

Publication number
GB2574973A
Authority
GB
United Kingdom
Prior art keywords
image
swiping
processor
parameter
dataset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB201914055A
Other versions
GB201914055D0 (en)
GB2574973B (en)
Inventor
Chiang Yuan-Lin
Lu Jun-Chao
Hsu Hsien-Jen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Egis Technology Inc
Original Assignee
Egis Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Egis Technology Inc
Priority to GB2117618.5A (GB2599288B)
Publication of GB201914055D0
Publication of GB2574973A
Application granted
Publication of GB2574973B
Legal status: Active


Classifications

    • G06V 40/1335 - Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 18/00 - Pattern recognition
    • G06V 40/13 - Fingerprints or palmprints; Sensors therefor
    • G06V 40/1347 - Preprocessing; Feature extraction
    • G06V 40/50 - Maintenance of biometric data or enrolment thereof
    • G06V 40/67 - Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user


Abstract

The present invention provides a user interface display method and an electronic device. The user interface display method is applicable to fingerprint registration. Said display method comprises the following steps: sensing an object by means of a fingerprint sensor, so as to obtain a swiping image of the object; analysing the swiping image, so as to obtain a plurality of feature points of the swiping image; generating a pre-registration dataset according to the swiping image; analysing the pre-registration dataset, so as to obtain an image adjusting parameter; and displaying the user interface, and adjusting, according to the image adjusting parameter, the range of the filled region of a reference image in the user interface. Therefore, the user interface display method and the electronic device of the present invention can inform the user of the corresponding fingerprint registration progress through the display of the electronic device during fingerprint registration performed by swiping.

Description

USER INTERFACE DISPLAY METHOD AND ELECTRONIC DEVICE
Technical Field
The present invention relates to an interface display technique, and particularly relates to a display method of a user interface applied to fingerprint registration and an electronic device using the display method.
Description of Related Art
In recent years, fingerprint recognition techniques have been widely used in various electronic devices to provide functions such as identity login or identity verification. In a typical fingerprint recognition technique, a user presses a finger on a fingerprint sensor to register a fingerprint in a manner of a one-time press or multiple presses, and a corresponding user interface is provided to inform the user of the progress of the fingerprint registration. For example, if the fingerprint registration is performed through multiple presses, each time the user presses the finger, the area of the corresponding fingerprint image displayed on the user interface is increased. After the entire fingerprint, or a sufficiently large range of it, is displayed, the fingerprint registration is completed.
However, if the user performs the fingerprint registration by swiping a finger, conventional fingerprint recognition techniques are unable to display a corresponding fingerprint image on the user interface according to the swiping progress of the user's finger so as to inform the user of the progress of the fingerprint registration. Namely, in the process of performing fingerprint registration by swiping a finger, the user cannot learn the progress of the fingerprint registration in real time.
SUMMARY OF THE INVENTION
The present invention provides a display method of a user interface and an electronic device, enabling a user to learn the corresponding fingerprint registration progress through the display of the electronic device in the process of performing fingerprint registration by swiping a finger.
A display method of a user interface of the present invention is applied to fingerprint registration. The display method includes the following steps: obtaining a swiping image through a fingerprint sensor; analysing the swiping image to obtain a plurality of feature points of the swiping image; determining whether the swiping image is a first swiping image; if the swiping image is the first swiping image, generating a pre-registration dataset according to the plurality of feature points of the swiping image, and analysing the pre-registration dataset to obtain a basic image parameter; and displaying a filled region on a reference image on the user interface according to the basic image parameter.
A display method of a user interface of the present invention is applied to fingerprint registration. The display method includes the following steps: obtaining a swiping image through a fingerprint sensor; analysing the swiping image to obtain a plurality of feature points of the swiping image, and obtain a coordinate parameter of the feature point located at the most upper left corner of the swiping image; determining whether the swiping image is a first swiping image; if the swiping image is the first swiping image, generating a pre-registration dataset according to the plurality of feature points of the swiping image; and displaying a filled region on a reference image on the user interface according to the coordinate parameter and an area of the swiping image.
An electronic device of the present invention includes a fingerprint sensor, a processor and a display. The fingerprint sensor is configured to obtain a swiping image. The processor is coupled to the fingerprint sensor. The processor is configured to analyse the swiping image to obtain a plurality of feature points of the swiping image, and determine whether the swiping image is a first swiping image. The display is coupled to the processor. If the processor determines that the swiping image is the first swiping image, the processor generates a pre-registration dataset according to the plurality of feature points of the swiping image and analyses the pre-registration dataset to obtain a basic image parameter. The processor, through the display, displays a filled region on a reference image on a user interface according to the basic image parameter.
An electronic device of the present invention includes a fingerprint sensor, a processor and a display. The fingerprint sensor is configured to obtain a swiping image. The processor is coupled to the fingerprint sensor. The processor is configured to analyse the swiping image and obtain a coordinate parameter of the feature point located at the most upper left corner of the swiping image. The processor is further configured to determine whether the swiping image is a first swiping image. The display is coupled to the processor. If the processor determines that the swiping image is the first swiping image, the processor generates a pre-registration dataset according to the plurality of feature points of the swiping image. The processor displays a filled region on a reference image on a user interface according to the coordinate parameter and an area of the swiping image.
According to the above description, in the display method of a user interface and the electronic device of the present invention, by analysing a plurality of swiping images obtained during the process of fingerprint registration, a corresponding image adjusting parameter can be obtained, and a change in range of the filled region of the reference image on the user interface is displayed according to the image adjusting parameter, so as to provide the user with real-time information on the progress of fingerprint registration.
In order to make the aforementioned features and advantages of the present invention comprehensible, several embodiments accompanied with figures are described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a fingerprint registration method according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a fingerprint registration method according to a first embodiment of the present invention;
FIG. 4 is a schematic diagram of a pre-registration dataset according to the first embodiment of the present invention;
FIG. 5A to FIG. 5E are schematic diagrams showing finger operations and their corresponding UI displays according to the first embodiment of the present invention;
FIG. 6 is a flowchart illustrating a fingerprint registration method according to a second embodiment of the present invention;
FIG. 7 is a schematic diagram of a pre-registration dataset according to the second embodiment of the present invention;
FIG. 8A to FIG. 8G are schematic diagrams showing finger operations and their corresponding UI displays according to the second embodiment of the present invention;
FIG. 9 is a flowchart illustrating a fingerprint registration method according to a third embodiment of the present invention;
FIG. 10 is a schematic diagram of a pre-registration dataset according to the third embodiment of the present invention;
FIG. 11A to FIG. 11I are schematic diagrams showing finger operations and their corresponding UI displays according to the third embodiment of the present invention.
Description of the Reference Numerals:
100: electronic device;
110: processor;
120: fingerprint sensor;
130: memory;
140: display;
410, 420, 400, 710, 720, 730, 1010, 1020, 1030: image;
411, 711, 1011: feature point;
500, 800, 1100: user interface (UI);
510, 810, 1110: reference image;
511, 811, 811b, 1111: filled region;
DW1, DW2: width;
F: finger;
h, H, W: image parameter;
Step: image adjusting parameter;
S210, S220, S225, S230, S232, S235, S240, S245, S250, S310, S320, S331, S332, S333, S334, S340, S341, S350, S380, S610, S620, S631, S633, S632, S634, S640, S641, S650, S680, S910, S920, S922, S924, S925, S926, S940, S980: step;
(X1, Y1), (X2, Y2), (Xn, Yn): coordinate parameter;
(Δx, Δy): displacement parameter.
DESCRIPTION OF EMBODIMENTS
In order to make the content of the present invention more comprehensible, embodiments are described below as examples by which the present invention can actually be implemented. In addition, wherever possible, components/members/steps denoted by the same reference numerals in the drawings and the description stand for the same or like parts.
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the present invention. Referring to FIG. 1, an electronic device 100 includes a processor 110, a fingerprint sensor 120, a memory 130 and a display 140. The processor 110 is coupled to the fingerprint sensor 120, the memory 130 and the display 140. The electronic device 100 is, for example, an electronic product such as a smart phone, a notebook (NB), a tablet personal computer (PC), etc. In the present embodiment, the electronic device 100 executes a fingerprint sensing operation through the fingerprint sensor 120 to obtain a fingerprint image of a user’s finger. In the present embodiment, when the user places a finger on the fingerprint sensor 120 to perform a swiping operation, the fingerprint sensor 120 performs fingerprint sensing. The fingerprint sensor 120 may obtain a plurality of swiping images successively and provide them to the processor 110. The processor 110 may analyse these swiping images to obtain a plurality of feature points from each of the swiping images. The feature points refer to fingerprint feature points of the finger. Thereafter, the processor 110 generates fingerprint registration data according to data of the feature points.
In the present embodiment, the fingerprint sensor 120 obtains the swiping images one by one, and while the processor 110 is analysing the swiping images one by one, the processor 110 may correspondingly change a filled region of a fingerprint reference image on a user interface (UI) displayed on the display 140 according to the analysis result of each of the swiping images obtained one by one. In the present embodiment, the reference image on the UI includes the filled region. The filled region of the reference image is used for representing a range which the obtained fingerprint information covers, and a range of the filled region of the reference image is progressively increased and changed corresponding to the current progress of a finger swiping by a user (i.e. corresponding to the progress of acquisition of the fingerprint information). Therefore, the fingerprint registration function of the electronic device 100 of the present invention may provide a good interaction effect such that the user can be informed of the current progress of the registration.
In the present embodiment, the processor 110 is, for example, a central processing unit (CPU), a system on chip (SoC) or other programmable general purpose or special purpose microprocessor, a digital signal processor (DSP), a programmable controller, an application specific integrated circuit (ASIC), a programmable logic device (PLD), other similar processing device or a combination of these devices.
In the present embodiment, the fingerprint sensor 120 is, for example, a capacitive fingerprint sensor or an optical fingerprint sensor, and the type of the fingerprint sensor 120 is not limited in the present invention. In the present embodiment, a fingerprint sensing mechanism of the fingerprint sensor 120 may be swiping sensing or pressing sensing. It is worth noting that in the embodiments of the present invention, the fingerprint registration is implemented through swiping sensing. Namely, during the process of fingerprint registration, the user places and swipes the finger on a sensing surface of the fingerprint sensor 120, and the fingerprint sensor 120 senses and obtains fingerprint information of the user through the sensing surface. For example, the electronic device 100 may be designed to perform fingerprint registration by asking the user to swipe the finger. In other words, the fingerprint sensor 120 may perform the fingerprint sensing in the manner of swiping sensing. For fingerprint authentication, the user is asked to press the finger. Namely, the fingerprint sensor 120 performs the fingerprint sensing in the manner of pressing sensing.
In the present embodiment, the memory 130 is configured to store fingerprint data described in the embodiments of the present invention and related applications for the processor 110 to read and execute.
In the present embodiment, the display 140 is, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, a micro LED display or an organic LED display, etc., and the type of the display 140 is not limited in the present invention. In the present embodiment, when the user performs the fingerprint registration, the display 140 displays the corresponding UI and the UI includes a reference image simulating a fingerprint. During the process where the user swipes the finger on the fingerprint sensor 120, a range of the filled region of the reference image displayed on the display 140 is changed and increased corresponding to an increase of the fingerprint data sensed by the fingerprint sensor 120.
FIG. 2 is a flowchart illustrating a fingerprint registration method according to an embodiment of the present invention. Referring to FIG. 1 and FIG. 2, the fingerprint registration method of the present embodiment is applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs fingerprint registration, the electronic device 100, through the fingerprint sensor 120, executes the swiping fingerprint sensing operation and obtains swiping images of an object (i.e. the user's finger) one by one. In step S210, the fingerprint sensor 120 obtains a swiping image. In step S220, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image. In step S225, the processor 110 determines whether the swiping image is a first swiping image. If yes, step S230 is executed. In step S230, the processor 110 generates a pre-registration dataset according to the feature points of the swiping image, and analyses the pre-registration dataset to obtain a basic image parameter (h) of the pre-registration dataset. In step S232, the processor 110 displays a filled region of the reference image on the UI according to the basic image parameter (h). If the swiping image is not the first swiping image, step S235 is executed. In step S235, the processor 110 merges the feature points of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset. In step S240, the processor 110 analyses the merged pre-registration dataset to obtain a merged image parameter (H), and obtains an image adjusting parameter (Step = H - h) according to the merged image parameter (H) and the basic image parameter (h). That is, the image adjusting parameter (Step) is equal to the merged image parameter (H) minus the basic image parameter (h). In step S245, the processor 110 sets the merged image parameter (H) as the new basic image parameter (h). In step S250, the processor 110 increases a range of the filled region of the reference image on the UI according to the image adjusting parameter (Step), i.e. increases a length of the filled region. In order to further convey the technical details of the display method of a user interface and the fingerprint registration of the present application to those skilled in the art, several embodiments are provided below for further description.
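To make the FIG. 2 flow easier to follow, the sketch below restates steps S210 to S250 in Python, together with the first embodiment's S380-style termination check. It is an illustration of the described logic only, not code from the patent: sensor, ui and extract_feature_points are hypothetical interfaces, and feature points are assumed to be (x, y) tuples in a common coordinate frame.

```python
def length_span(points):
    # h and H: distance between the two feature points that are farthest
    # apart in the length (swiping) direction of the merged dataset.
    ys = [y for (_, y) in points]
    return max(ys) - min(ys)

def register_by_swiping(sensor, ui, extract_feature_points, threshold):
    dataset = []   # pre-registration dataset (merged feature points)
    h = None       # basic image parameter

    while True:
        image = sensor.get_swiping_image()       # S210: obtain a swiping image
        points = extract_feature_points(image)   # S220: obtain feature points

        if h is None:                            # S225/S230: first swiping image
            dataset = list(points)
            h = length_span(dataset)             # basic image parameter (h)
            ui.set_filled_length(h)              # S232: display the filled region
        else:
            dataset.extend(points)               # S235: merge into the dataset
                                                 # (a real merge would first align
                                                 # overlapping feature points)
            H = length_span(dataset)             # S240: merged image parameter (H)
            step = H - h                         # image adjusting parameter (Step)
            h = H                                # S245: H becomes the new h
            ui.grow_filled_length(step)          # S250: lengthen the filled region

        if h > threshold:                        # S380-style check (first embodiment)
            return dataset                       # stored as the registration dataset
```

The same loop underlies all three embodiments described below; they differ in how the image parameters are defined and in the termination test.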
FIG. 3 is a flowchart illustrating a fingerprint registration method according to a first embodiment of the present invention. Referring to FIG. 1 and FIG. 3, the fingerprint registration method of the present embodiment is applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs the fingerprint registration, in step S310, the electronic device 100 senses the object (i.e. the user's finger) through the fingerprint sensor 120 to obtain a swiping image. In step S320, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image. In step S331, the processor 110 determines whether the swiping image is a first swiping image. If yes, the processor 110 executes step S332. In step S332, the processor 110 generates a pre-registration dataset according to the feature points of the swiping image, and obtains a basic image parameter (h) of the pre-registration dataset. In step S333, the processor 110 displays a filled region of the reference image according to the basic image parameter (h). If the swiping image is not the first swiping image, the processor 110 executes step S334. In step S334, the processor 110 merges the feature points of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
In step S340, the processor 110 analyses the merged pre-registration dataset to obtain a merged image parameter (H) and obtains an image adjusting parameter (Step = H - h) according to the merged image parameter (H) and the basic image parameter (h). The image adjusting parameter (Step) is equal to the merged image parameter (H) minus the basic image parameter (h). In step S341, the processor 110 sets the merged image parameter (H) as the new basic image parameter (h). In step S350, the processor 110 increases the range (i.e. length) of the filled region of the reference image according to the image adjusting parameter (Step). In step S380, the processor 110 determines whether the merged image parameter (H) is greater than a predetermined threshold. If yes, it means that sufficient fingerprint registration data have been obtained. Then, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120, and stores the pre-registration dataset into the memory 130 to serve as a fingerprint registration dataset, so as to complete the fingerprint registration process. If not, the processor 110 executes step S310 to obtain a following swiping image.
FIG. 4 is a schematic diagram of a pre-registration dataset according to the first embodiment of the present invention. FIG. 5A to FIG. 5E are schematic diagrams showing finger swiping operations and their corresponding UI displays according to the first embodiment of the present invention. Referring to FIG. 1, FIG. 4 and FIG. 5A to FIG. 5E, the present embodiment may also be applied to the flowchart of FIG. 3 above. In the present embodiment, after the fingerprint sensor 120 obtains a first swiping image 410 of a finger F, the processor 110 analyses the swiping image 410 to obtain a plurality of feature points 411 of the swiping image 410 and a basic image parameter h. It should be noted that an area of the swiping image 410 is equal to an area of the sensing surface of the fingerprint sensor 120. In the present embodiment, an initial value of the basic image parameter h may be a distance between two of the feature points 411 of the first swiping image 410 that are farthest from each other in a length direction, though the present invention is not limited thereto. In another embodiment, the initial value of the basic image parameter h may be a length of the swiping image 410, i.e. a length of the sensing surface of the fingerprint sensor 120. The processor 110 generates the pre-registration dataset according to the feature points 411 of the swiping image 410. Then, the processor 110 obtains a following swiping image 420 and obtains the feature points of the swiping image 420. In the present embodiment, the processor 110 merges the feature points of the swiping image 420 into the pre-registration dataset (i.e. merges the feature points of the swiping images 410 and 420) to generate a new pre-registration dataset 400. The processor 110 calculates the merged image parameter H of the merged pre-registration dataset 400. Similarly, the merged image parameter H may be a distance between two of the merged feature points that are farthest from each other in the length direction after the swiping images 410 and 420 are merged, or may be the sum of the lengths of the two swiping images (i.e. twice the length of the sensing surface of the fingerprint sensor 120) minus the length of the overlapped portion of the swiping images 410 and 420 after they are merged. Then, the processor 110 subtracts the basic image parameter h from the merged image parameter H to obtain an image adjusting parameter Step (= H - h).
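The second definition of the merged image parameter given above reduces to simple arithmetic; a sketch, with illustrative names that are not taken from the patent:

```python
def merged_length(sensor_length, overlap_length, image_count=2):
    # H as the sum of the swiping image lengths minus the length of the
    # overlapped portion, e.g. twice the sensing-surface length for the
    # two merged images 410 and 420.
    return image_count * sensor_length - overlap_length

# Example: two images from a 10-unit sensing surface overlapping by 6
# units give H = 2 * 10 - 6 = 14, so Step = H - h = 14 - 10 = 4.
```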
Namely, each time the processor 110 merges the feature points of one swiping image into the merged pre-registration dataset 400, the processor 110 calculates the increased length of the merged pre-registration dataset 400, so as to correspondingly adjust a length of a filled region 511 of a reference image 510 on a UI 500. It is worth noting that a width DW of the filled region 511 of the reference image 510 is predetermined and fixed. Each time data of the feature points of one additional swiping image are added, the processor 110 may correspondingly increase the length of the filled region 511. Moreover, the processor 110 may determine whether the merged image parameter H is greater than the predetermined threshold. If yes, it means that sufficient fingerprint registration data have been obtained. For example, when the merged image parameter H is greater than the predetermined threshold, it means that a sufficient number of fingerprint feature points have been obtained or a sufficient number of swiping images have been obtained. Therefore, the processor 110 stores the pre-registration dataset to the memory 130 to serve as the fingerprint registration dataset, so as to complete the fingerprint registration process.
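The patent does not specify how the merged image parameter is mapped to on-screen pixels; one plausible mapping, assumed purely for illustration, scales the filled length linearly so that the region exactly covers the reference image when H reaches the predetermined threshold (as in FIG. 5E):

```python
def filled_length_px(H, threshold, reference_length_px):
    # Linear progress mapping, clamped so the filled region never grows
    # past the reference image once H exceeds the threshold.
    return min(H / threshold, 1.0) * reference_length_px

# e.g. H = 70 with a threshold of 140 fills half of a 300 px reference
# image: filled_length_px(70, 140, 300) == 150.0
```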
Taking FIG. 5A to FIG. 5E as an example, as shown in FIG. 5A, when the user places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform a swiping operation, the electronic device 100 displays the reference image 510 on the UI 500 and obtains the swiping images one by one. When the first swiping image is obtained, the electronic device 100 displays the corresponding filled region 511 in the reference image 510, where the length of the filled region 511 corresponds to the basic image parameter h of the first swiping image. Moreover, as described above, the width of the filled region 511 is predetermined and fixed. As shown in the figure, the width of the filled region 511 may be equal to or greater than a width of the reference image 510. As shown in FIG. 5B, when a second swiping image is obtained, the electronic device 100 may increase the length of the filled region 511 of the reference image 510 according to the image adjusting parameter Step. As shown in the figure, the width of the filled region 511 is fixed. As shown in FIG. 5C, when the finger F of the user leaves the fingerprint sensor 120, the length of the filled region 511 of the reference image 510 stops increasing. However, since the obtained fingerprint information is insufficient, i.e. the merged image parameter H is not greater than the predetermined threshold, the fingerprint registration process is not yet completed; therefore, the UI 500 continues to display the filled region 511 of the reference image 510 and prompts the user to swipe the finger again. Then, as shown in FIG. 5D, the user again places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform another swiping operation. The electronic device 100 then obtains a new swiping image, and increases the length of the filled region 511 of the reference image 510 according to the newly added swiping image (i.e. the newly added data of the fingerprint feature points). Finally, as shown in FIG. 5E, when the merged image parameter H is greater than the predetermined threshold, the processor 110 makes the filled region 511 completely cover the reference image 510. Namely, the length of the filled region 511 will be greater than or equal to the length of the reference image 510. The processor 110 then stops the fingerprint sensing operation of the fingerprint sensor 120 and generates the fingerprint registration dataset according to the pre-registration dataset, completing the fingerprint registration process.
FIG. 6 is a flowchart illustrating a fingerprint registration method according to a second embodiment of the present invention. Referring to FIG. 1 and FIG. 6, the fingerprint registration method of the present embodiment is applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs the fingerprint registration, in step S610, the electronic device 100 senses the object (i.e. the user's finger) through the fingerprint sensor 120 to obtain a swiping image. In step S620, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image. In step S631, the processor 110 determines whether the swiping image is a first swiping image. If yes, the processor 110 executes step S632. In step S632, the processor 110 obtains a coordinate parameter (X, Y) of the feature point located at the most upper left corner of the swiping image, generates a pre-registration dataset according to the feature points of the swiping image, and obtains a basic image parameter (h) of the pre-registration dataset. In step S633, the processor 110 displays a filled region of the reference image on the display 140 according to the basic image parameter (h) and the coordinate parameter (X, Y). If the swiping image is not the first swiping image, the processor 110 executes step S634. In step S634, the processor 110 merges the feature points of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
In step S640, the processor 110 analyses the merged pre-registration dataset to obtain a first merged image parameter (H) and a second merged image parameter (W), and obtains an image adjusting parameter (Step = H - h) according to the first merged image parameter (H) and the basic image parameter (h). The first merged image parameter (H) may be a distance between two of the merged feature points that are farthest from each other in a length direction in the pre-registration dataset, or may be the sum of the lengths of a plurality of swiping images minus the length of the overlapped portion of the swiping images. The second merged image parameter (W) is a distance between two of the merged feature points that are farthest from each other in a width direction. In step S641, the processor 110 sets the first merged image parameter (H) as a new basic image parameter (h). In step S650, the processor 110 increases a range of the filled region of the reference image according to the image adjusting parameter (Step). In step S680, the processor 110 determines whether the first merged image parameter (H) is greater than a first predetermined threshold and whether the second merged image parameter (W) is greater than a second predetermined threshold. If yes, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates the fingerprint registration data according to the merged pre-registration dataset, so as to complete the fingerprint registration process. If not, the processor 110 executes step S610 to obtain a following swiping image.
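A sketch of the two merged image parameters and the step S680 test of this second embodiment, under the same assumption that feature points are (x, y) tuples (length along y, width along x; the axis assignment is illustrative):

```python
def merged_spans(points):
    # First merged image parameter H: farthest-apart feature points in
    # the length direction; second merged image parameter W: farthest-
    # apart feature points in the width direction.
    xs = [x for (x, _) in points]
    ys = [y for (_, y) in points]
    return max(ys) - min(ys), max(xs) - min(xs)   # (H, W)

def registration_complete(points, h_threshold, w_threshold):
    # Step S680: both the length and the width of the merged
    # pre-registration dataset must exceed their thresholds.
    H, W = merged_spans(points)
    return H > h_threshold and W > w_threshold
```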
FIG. 7 is a schematic diagram of a pre-registration dataset according to the second embodiment of the present invention. FIG. 8A to FIG. 8G are schematic diagrams showing finger operations and their corresponding UI displays according to the second embodiment of the present invention. Referring to FIG. 1, FIG. 7 and FIG. 8A to FIG. 8G, the present embodiment may also be applied to the flowchart of FIG. 6 above. In the present embodiment, after the fingerprint sensor 120 obtains a first swiping image 710 of the finger F, the processor 110 analyses the swiping image 710 to obtain a plurality of feature points 711 of the swiping image 710 and obtain the basic image parameter h and a coordinate parameter (X1, Y1) of the feature point located at the most upper left corner of the swiping image 710. The processor 110 displays a filled region 811 of a reference image 810 according to the coordinate parameter (X1, Y1) and the basic image parameter h. It should be noted that an area of the swiping image 710 is equal to the area of the sensing surface of the fingerprint sensor 120. In the present embodiment, the basic image parameter h refers to a distance between two of the feature points 711 of the swiping image 710 that are farthest from each other in a length direction, but the present invention is not limited thereto. In another embodiment, the basic image parameter h may refer to a length of the swiping image 710, i.e. the length of the sensing surface of the fingerprint sensor 120. The processor 110 generates the pre-registration dataset according to the feature points 711 of the swiping image 710. Then, the processor 110 obtains a following swiping image 720 and obtains the feature points of the swiping image 720. In the present embodiment, the processor 110 merges the feature points of the swiping image 720 into the pre-registration dataset (i.e. merges the feature points of the swiping images 710 and 720) to generate a merged pre-registration dataset 700. The processor 110 calculates a first merged image parameter H (i.e. the maximum image length) and a second merged image parameter W (i.e. the maximum image width) of the merged pre-registration dataset 700. The processor 110 subtracts the basic image parameter h from the first merged image parameter H to obtain the image adjusting parameter Step. Then, the processor 110 increases the length of the filled region 811 of the reference image 810 according to the image adjusting parameter Step.
Namely, each time the processor 110 merges the feature points of one swiping image into the pre-registration dataset 700, the processor 110 calculates the increased length of the merged pre-registration dataset 700, so as to correspondingly adjust the length of the filled region 811 of the reference image 810 on a UI 800. It is worth noting that a width of the filled region 811 of the reference image 810 is predetermined and fixed during each finger swiping operation. Namely, the width of the filled region 811 of the reference image 810 may be increased only when another finger swiping operation is performed. During each finger swiping operation, each time data of the feature points of one additional swiping image are added, the processor 110 correspondingly increases the length of the filled region 811. In addition, the processor 110 merges a plurality of swiping images into the pre-registration dataset 700, and the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are respectively greater than the first and the second predetermined thresholds. If yes, it means that sufficient fingerprint registration data have been obtained. For example, when the first merged image parameter H is greater than the first predetermined threshold and the second merged image parameter W is greater than the second predetermined threshold, it means that a sufficient number of fingerprint feature points or a sufficient number of swiping images have been obtained. Therefore, the processor 110 stores the pre-registration dataset to the memory 130 to serve as the fingerprint registration dataset, so as to complete the fingerprint registration process. If the first merged image parameter H is not greater than the first predetermined threshold or the second merged image parameter W is not greater than the second predetermined threshold, the processor 110 displays a prompt on the UI of the display 140 to request the user to swipe the finger again. During the second swiping operation, the processor 110 obtains a first swiping image 730 of the second swiping operation through the fingerprint sensor 120. The processor 110 then obtains the feature points 711 of the swiping image 730, and merges the feature points 711 into the pre-registration dataset 700. The processor 110 obtains a displacement parameter (Δx, Δy) (X2 - X1 = Δx, Y2 - Y1 = Δy) according to the coordinate parameter (X1, Y1) of the feature point located at the most upper left corner of the swiping image 710 (i.e. the first swiping image obtained during the first swiping operation) and a coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the first swiping image 730 obtained during the second swiping operation. According to the displacement parameter (Δx, Δy), the processor 110 may determine an increased width and position of the filled region 811 of the reference image 810 on the UI 800 corresponding to the second swiping operation.
In other words, when the second swiping operation is performed, the finger F of the user may shift to the right or left, and the processor 110 determines an increased width of the filled region 811 corresponding to the second swiping operation according to the coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the first swiping image 730 obtained during the second swiping operation, i.e. the displacement parameter (Δx, Δy), and determines a length of a newly added portion 811b of the filled region 811 corresponding to the second swiping operation according to the basic image parameter h of the first swiping image 730. The processor 110 displays a starting position of the portion 811b of the filled region 811 corresponding to the second swiping operation according to the coordinate parameter (X2, Y2). Namely, in the second swiping operation, a range of the filled region 811 in a width direction is increased corresponding to the degree of shifting of the finger F of the user. Then, the processor 110 increases the length of the portion 811b of the filled region 811 corresponding to the second swiping operation according to the later obtained swiping images and the corresponding image adjusting parameters Step. Each time one swiping image is obtained, the processor 110 determines whether the first merged image parameter H and the second merged image parameter W are respectively greater than the first and the second predetermined thresholds, so as to determine whether to end the fingerprint registration process.
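The placement of the newly added portion 811b can be expressed directly in terms of the displacement parameter; a sketch, where the sensor-to-UI scale factor is an assumption (the patent does not define one):

```python
def portion_811b_origin(ul_first, ul_second, first_origin_px, scale=1.0):
    # Displacement parameter (dx, dy) = (X2 - X1, Y2 - Y1) between the
    # most-upper-left feature points of the first swiping images of the
    # first and second swiping operations.
    (x1, y1), (x2, y2) = ul_first, ul_second
    dx, dy = x2 - x1, y2 - y1
    # Offset the starting position of the new portion of the filled
    # region by the (scaled) displacement, as in FIG. 8E and FIG. 8F.
    ox, oy = first_origin_px
    return ox + dx * scale, oy + dy * scale
```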
Taking FIG. 8A to FIG. 8G as an example, as shown in FIG. 8A, when the user presses the finger F on the fingerprint sensor 120 of the electronic device 100 and performs the first swiping operation, the corresponding filled region 811 is displayed on the reference image 810 on the UI 800 corresponding to the first swiping image obtained by the fingerprint sensor 120. As shown in FIG. 8B and FIG. 8C, during the first swiping operation of the finger F, the length of the filled region 811 of the reference image 810 is correspondingly increased. Moreover, during the first swiping operation of the finger F, the processor 110 increases the length of the filled region 811 of the reference image 810 according to the image adjusting parameter Step while fixing the image width. That is, during the first swiping operation of the finger F, a width DW1 of the filled region 811 of the reference image 810 is fixed, and a length thereof is increased after a new swiping image is obtained. As shown in FIG. 8D, when the finger F of the user leaves the fingerprint sensor 120, the range of the filled region 811 of the reference image 810 stops increasing. However, since the fingerprint registration is not yet completed, the UI 800 stays on the reference image 810 and the current filled region 811 thereof, and a prompt is displayed to request the user to swipe the finger again. Therefore, as shown in FIG. 8E and FIG. 8F, the user presses the finger again on the fingerprint sensor 120 of the electronic device 100 to perform the second swiping operation. Compared to the finger's placing position during the first swiping operation, during the second swiping operation, the position of the user's finger is displaced a distance to the upper right. After the first swiping image of the second swiping operation is obtained, the processor 110 calculates the coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the first swiping image, subtracts from it the coordinate parameter (X1, Y1) of the feature point located at the most upper left corner of the first swiping image obtained during the first swiping operation to obtain the displacement parameter (Δx, Δy) (X2 - X1 = Δx, Y2 - Y1 = Δy), and determines a display position of the first swiping image corresponding to the second swiping operation, i.e. the starting position of the newly added portion 811b of the filled region 811 corresponding to the second swiping operation, according to (Δx, Δy). As shown in the figures, during the second swiping operation of the finger, the range of the filled region 811 of the reference image 810 continues to increase (i.e. portion 811b). The processor 110 determines the starting position of the newly added portion 811b of the filled region 811 according to the displacement parameter (Δx, Δy), and according to the image adjusting parameter Step, increases the length of the portion 811b of the filled region 811 of the reference image 810 corresponding to the second swiping operation while fixing the image width. That is, during the second swiping operation of the finger F, a width DW2 of the newly added portion 811b of the filled region 811 of the reference image 810 is fixed while the length of the portion 811b is increased after a new swiping image is obtained. Finally, as shown in FIG. 8G, when the first merged image parameter H and the second merged image parameter W of the pre-registration dataset 700 are respectively greater than the first and the second predetermined thresholds, the range of the filled region 811 of the reference image 810 has increased to a sufficient length and width, and the processor 110 therefore stops the fingerprint sensing operation of the fingerprint sensor 120, so as to complete the fingerprint registration process.
FIG. 9 is a flowchart illustrating a fingerprint registration method according to a third embodiment of the present invention. Referring to FIG. 1 and FIG. 9, the fingerprint registration method of the present embodiment may be applied to the electronic device 100 of the embodiment of FIG. 1. When the user performs the fingerprint registration, in step S910, the electronic device 100 senses the object (i.e. the user's finger) through the fingerprint sensor 120 to obtain a swiping image. In step S920, the processor 110 analyses the swiping image to obtain a plurality of feature points of the swiping image and obtain a coordinate parameter (X, Y) of the feature point located at the most upper left corner of the swiping image. In step S922, the processor 110 determines whether the swiping image is a first swiping image. If yes, the processor 110 executes step S924. In step S924, the processor 110 generates the pre-registration dataset according to the feature points of the swiping image. Then, in step S925, the processor 110 displays the corresponding filled region on the reference image on the UI according to the coordinate parameter (X, Y) and an area of the swiping image. It is worth noting that the area of the swiping image is equal to the area of the sensing surface of the fingerprint sensor 120. If the swiping image is not the first swiping image, the processor 110 executes step S926. In step S926, the processor 110 merges the feature points of the swiping image into the pre-registration dataset. In step S940, the processor 110 increases the range of the filled region according to the coordinate parameter (X, Y) and the area of the swiping image. In step S980, the processor 110 determines whether a total area of the pre-registration dataset is greater than a predetermined threshold. The total area of the pre-registration dataset may represent the sum of the areas of all of the swiping images minus the overlapped regions of the swiping images, or the number of the feature points included in the pre-registration dataset. In other words, in this embodiment, in step S980, the processor 110 determines whether the number of the feature points included in the pre-registration dataset is greater than the predetermined threshold. If yes, the processor 110 ends the fingerprint sensing operation of the fingerprint sensor 120 and generates fingerprint registration data according to the merged pre-registration dataset, so as to complete the fingerprint registration process. If not, the processor 110 executes step S910 to sense and obtain a following swiping image.
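Step S980 admits the two readings of "total area" stated above; the feature-point-count reading is the simplest to sketch (names are illustrative):

```python
def registration_complete_by_count(dataset_points, point_threshold):
    # Step S980, feature-point-count variant: end registration once the
    # merged pre-registration dataset holds more than the predetermined
    # number of distinct feature points. The alternative reading instead
    # sums the swiping image areas minus their overlapped regions.
    return len(set(dataset_points)) > point_threshold
```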
FIG. 10 is a schematic diagram of pre-registration data according to the third embodiment of the present invention. FIG. 11A to FIG. 11I are schematic diagrams showing finger swiping operations and their corresponding UI displays according to the third embodiment of the present invention. The following descriptions can be understood with reference to FIG. 1, FIG. 10 and FIG. 11A to FIG. 11I. In addition, the present embodiment may also be applied to the flowchart of FIG. 9 above. In the present embodiment, after the fingerprint sensor 120 obtains a first swiping image 1010 of the finger F, the processor 110 analyses the swiping image 1010 to obtain a plurality of feature points 1011 of the swiping image 1010 and obtain the coordinate parameter (X1, Y1) of the feature point 1011 located at the most upper left corner of the swiping image 1010. As shown in FIG. 11A, the processor 110 displays a filled region 1111 of a reference image 1110 on a UI 1100 according to the coordinate parameter (X1, Y1) and the area of the swiping image 1010 (i.e. the area of the sensing surface of the fingerprint sensor 120). Moreover, the processor 110 generates pre-registration data according to the feature points of the swiping image 1010.
Then, the processor 110 obtains and analyses a following swiping image 1020 to obtain a plurality of feature points 1011 of the swiping image 1020. In the present embodiment, the processor 110 compares the feature points of the swiping images 1010 and 1020 to find the feature points simultaneously included in the swiping images 1010 and 1020, so as to obtain a relative position relationship between the swiping images 1010 and 1020, and also obtains a coordinate parameter (X2, Y2) of the feature point located at the most upper left corner of the swiping image 1020. As shown in FIG. 11B, the processor 110 may increase a display range of the filled region 1111 of the reference image 1110 according to the coordinate parameter (X2, Y2) and the area of the swiping image 1020. Moreover, the processor 110 merges the feature points of the swiping image 1020 into the pre-registration data to generate merged pre-registration data.
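The relative position relationship between overlapping swiping images can be recovered from the feature points they share; a minimal sketch, assuming a match_points helper (hypothetical) that pairs up the repeated feature points by their local descriptors:

```python
def relative_offset(prev_points, new_points, match_points):
    # Pair up the feature points that appear both in the merged dataset
    # and in the new swiping image, then average their coordinate
    # differences to estimate the new image's offset (and hence its
    # upper-left coordinate parameter, e.g. (X2, Y2) or (Xn, Yn)).
    pairs = match_points(prev_points, new_points)
    if not pairs:
        return None   # no overlap: position cannot be recovered this way
    dx = sum(p[0] - n[0] for p, n in pairs) / len(pairs)
    dy = sum(p[1] - n[1] for p, n in pairs) / len(pairs)
    return dx, dy
```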
Namely, each time the processor 110 obtains one swiping image, the processor 110 merges the feature points thereof into the pre-registration data. Moreover, the processor 110 obtains the coordinate parameter of the feature point located at the most upper left corner of the swiping image to determine the to-be-increased range and position of the filled region 1111 of the reference image 1110 on the UI 1100. It is worth noting that the processor 110 determines whether to end the fingerprint registration by determining whether the total area of the pre-registration data is greater than a predetermined threshold. If the total area of the pre-registration data is not greater than the predetermined threshold, the processor 110 senses and obtains a following swiping image. As shown in FIG. 10 and FIG. 11E to FIG. 11F, in the process of fingerprint registration, after the first swiping operation, the user's finger may leave the fingerprint sensor 120. If the obtained fingerprint data are still insufficient, namely, the total area of the pre-registration data is not greater than the predetermined threshold, the processor 110 may display a prompt on the UI through the display 140 to request the user to swipe the finger again. During the second swiping operation, the processor 110 obtains a first swiping image 1030 of the second swiping operation, obtains the feature points of the swiping image 1030, and merges the feature points into the pre-registration dataset. The processor 110 obtains a coordinate parameter (Xn, Yn) of the feature point located at the most upper left corner of the swiping image 1030, and increases the range of the filled region 1111 of the reference image 1110 on the UI 1100 according to the coordinate parameter (Xn, Yn) and the area of the swiping image 1030.
It should be noted that by comparing and analysing the pre-registration data and the feature points of the swiping image 1030 (i.e. finding the feature points that appear repeatedly), the processor 110 may obtain a relative position relationship between the swiping image 1030 and the previously obtained swiping images and accordingly obtain the coordinate parameter (Xn, Yn). In other words, the processor 110 displays the filled region 1111 in the reference image 1110 according to the relative position relationship between the swiping image 1030 and the previously obtained swiping images. Moreover, the processor 110 may determine whether the total area of the new pre-registration data is greater than the predetermined threshold to determine whether to end the fingerprint registration.
Taking FIG. 11A to FIG. 11I as an example, as shown in FIG. 11A, when the user places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform a swiping operation, the filled region 1111 is displayed on the reference image 1110 on the UI 1100 corresponding to the swiping images obtained by the fingerprint sensor 120. As shown in FIG. 11B to FIG. 11D, during the swiping operation of the finger F, the range of the filled region 1111 of the reference image 1110 is correspondingly adjusted. As shown in FIG. 11E, when the finger F of the user leaves the fingerprint sensor 120, the range of the filled region 1111 of the reference image 1110 stops increasing. However, since the fingerprint registration is not yet completed, the UI 1100 stays on the previous filled region 1111 of the reference image 1110 and the user is prompted and requested to swipe the finger again. As shown in FIG. 11F to FIG. 11H, the user again places the finger F on the fingerprint sensor 120 of the electronic device 100 to perform a swiping operation. The range of the filled region 1111 of the reference image 1110 is continually increased corresponding to the user's swiping operation. As shown in FIG. 11I, when the total area of the pre-registration data is greater than the predetermined threshold, the processor 110 determines that sufficient fingerprint data have been obtained, and the range of the filled region 1111 has increased to cover a sufficient range of the reference image 1110. Therefore, the processor 110 stops the fingerprint sensing operation of the fingerprint sensor 120, and generates the fingerprint registration data according to the pre-registration dataset, so as to complete the fingerprint registration process.
In summary, in the display method of a user interface and the electronic device of the present invention, a plurality of swiping images obtained by one or more swiping operations of the user's finger on the fingerprint sensor are collected. The data of the feature points of the swiping images are merged to generate the fingerprint registration data. When the data of the feature points of the swiping images are merged, the electronic device of the present invention further analyses the overlap and positional relationship between the feature points of the swiping images so as to obtain the corresponding image parameters and/or coordinate parameters. Therefore, in the display method of a user interface and the electronic device of the present invention, the UI including the reference image and its filled region is correspondingly displayed on the display according to the image parameters and/or the coordinate parameters, so that the range of the filled region of the reference image on the UI can be dynamically adjusted. Namely, during the finger swiping operation performed for fingerprint registration, the user may learn the progress of the fingerprint registration through the change in the range of the filled region of the reference image on the UI displayed on the display of the electronic device. Accordingly, the display method of a user interface and the electronic device of the present invention provide real-time fingerprint registration progress information to the user, so as to offer a more user-friendly and convenient fingerprint registration process.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.

Claims (22)

WHAT IS CLAIMED IS:
1. A display method of a user interface, applied to fingerprint registration, characterized in that the display method comprises:
obtaining a plurality of swiping images through a fingerprint sensor;
analysing the plurality of swiping images by a processor to obtain a plurality of feature points of the plurality of swiping images;
merging the plurality of feature points of the plurality of swiping images into a pre-registration dataset by the processor;
displaying a filled region on a reference image on a user interface of a display according to the plurality of swiping images by the processor; and
analysing the pre-registration dataset by the processor to determine whether to end the fingerprint registration.
2. The display method of a user interface as claimed in claim 1, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
analysing the pre-registration dataset to obtain an image adjusting parameter; and
determining a range of the filled region of the reference image according to the image adjusting parameter.
3. The display method of a user interface as claimed in claim 2, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether the swiping image is a first swiping image, and if the swiping image is the first swiping image, generating the pre-registration dataset according to the feature point of the swiping image, and analysing the pre-registration dataset to obtain a basic image parameter; and
displaying the filled region on the reference image on the user interface of the display according to the basic image parameter.
4. The display method of a user interface as claimed in claim 3, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset;
analysing the merged pre-registration dataset to obtain a merged image parameter, and obtaining the image adjusting parameter according to the merged image parameter and the basic image parameter;
setting the merged image parameter as a new basic image parameter; and
increasing the range of the filled region of the reference image according to the image adjusting parameter.
5. The display method of a user interface as claimed in claim 4, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether the merged image parameter is greater than a predetermined threshold to determine whether to end the fingerprint registration.
6. The display method of a user interface as claimed in claim 3, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether the swiping image is the first swiping image, and if the swiping image is the first swiping image, analysing the swiping image to obtain a coordinate parameter of the feature point located at the most upper left corner of the swiping image; and
determining a position of the filled region of the reference image according to the coordinate parameter of the feature point located at the most upper left corner of the swiping image.
7. The display method of a user interface as claimed in claim 6, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset;
analysing the merged pre-registration dataset to obtain a first merged image parameter and a second merged image parameter, and obtaining the image adjusting parameter according to the first merged image parameter and the basic image parameter, wherein the first merged image parameter represents a range the merged pre-registration dataset covers in a vertical direction, and the second merged image parameter represents a range the merged pre-registration dataset covers in a horizontal direction;
setting the first merged image parameter as a new basic image parameter; and
increasing the range of the filled region of the reference image in the vertical direction according to the image adjusting parameter.
8. The display method of a user interface as claimed in claim 7, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether the first merged image parameter is greater than a first predetermined threshold, and determining whether the second merged image parameter is greater than a second predetermined threshold, so as to determine whether to end the fingerprint registration.
9. The display method of a user interface as claimed in claim 1, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
analysing the pre-registration dataset to obtain a coordinate parameter; and
determining a range of the filled region of the reference image according to the coordinate parameter.
10. The display method of a user interface as claimed in claim 9, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether the swiping image is a first swiping image, and if the swiping image is the first swiping image, generating the pre-registration dataset according to the feature point of the swiping image; and
if the swiping image is not the first swiping image, merging the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
11. The display method of a user interface as claimed in claim 10, characterized by further comprising executing the following on each of the plurality of the swiping images by the processor:
determining whether a total area of the merged pre-registration dataset is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
12. An electronic device, characterized by comprising:
a fingerprint sensor, configured to obtain a plurality of swiping images;
a processor, coupled to the fingerprint sensor, configured to analyse the plurality of swiping images to obtain a plurality of feature points of the plurality of swiping images, and merge the plurality of feature points of the plurality of swiping images into a pre-registration dataset; and
a display, coupled to the processor, wherein the processor displays a filled region on a reference image on a user interface by the display according to the plurality of swiping images, and the processor analyses the pre-registration dataset to determine whether to end the fingerprint registration.
13. The electronic device as claimed in claim 12, characterized in that the processor executes the following on each of the plurality of swiping images:
the processor analyses the pre-registration dataset to obtain an image adjusting parameter, and the processor determines a range of the filled region of the reference image according to the image adjusting parameter.
14. The electronic device as claimed in claim 13, characterized in that the processor further executes the following on each of the plurality of swiping images:
the processor determines whether the swiping image is a first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor generates the pre-registration dataset according to the feature point of the swiping image, and analyses the pre-registration dataset to obtain a basic image parameter, and the processor displays the filled region on the reference image on the user interface of the display according to the basic image parameter.
15. The electronic device as claimed in claim 14, characterized in that the processor further executes the following on each of the plurality of swiping images:
if the processor determines that the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset, wherein the processor analyses the merged pre-registration dataset to obtain a merged image parameter, and the processor obtains the image adjusting parameter according to the merged image parameter and the basic image parameter, wherein the processor sets the merged image parameter as a new basic image parameter, and the processor increases the range of the filled region of the reference image according to the image adjusting parameter.
16. The electronic device as claimed in claim 15, characterized in that the processor further executes the following on each of the plurality of swiping images:
the processor determines whether the merged image parameter is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
17. The electronic device as claimed in claim 14, characterized in that the processor further executes the following on each of the plurality of swiping images:
the processor determines whether the swiping image is the first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor analyses the swiping image to obtain a coordinate parameter of the feature point located at the most upper left corner of the swiping image, and the processor determines a position of the filled region of the reference image according to the coordinate parameter of the feature point located at the most upper left corner of the swiping image.
18. The electronic device as claimed in claim 17, characterized in that the processor further executes the following on each of the plurality of swiping images:
if the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset, wherein the processor analyses the merged pre-registration dataset to obtain a first merged image parameter and a second merged image parameter, and the processor obtains the image adjusting parameter according to the first merged image parameter and the basic image parameter, wherein the first merged image parameter represents a range the merged pre-registration dataset covers in a vertical direction, and the second merged image parameter represents a range the merged pre-registration dataset covers in a horizontal direction, wherein the processor sets the first merged image parameter as a new basic image parameter, and the processor increases the range of the filled region of the reference image in the vertical direction according to the image adjusting parameter.
19. The electronic device as claimed in claim 18, characterized in that the processor further executes the following on each of the plurality of swiping images:
the processor determines whether the first merged image parameter is greater than a first predetermined threshold and whether the second merged image parameter is greater than a second predetermined threshold, so as to determine whether to end the fingerprint registration.
20. The electronic device as claimed in claim 12, characterized in that the processor executes the following on each of the plurality of swiping images:
the processor analyses the pre-registration dataset to obtain a coordinate parameter, and the processor determines a range of the filled region of the reference image according to the coordinate parameter.
21. The electronic device as claimed in claim 20, characterized in that the processor further executes the following on each of the plurality of swiping images:
the processor determines whether the swiping image is a first swiping image, and if the processor determines that the swiping image is the first swiping image, the processor generates the pre-registration dataset according to the feature point of the swiping image, and if the processor determines that the swiping image is not the first swiping image, the processor merges the feature point of the swiping image into the pre-registration dataset to generate a merged pre-registration dataset.
22. The electronic device as claimed in claim 21, characterized in that the processor further executes the following on each of the plurality of swiping images:
the processor determines whether a total area of the merged pre-registration dataset is greater than a predetermined threshold, so as to determine whether to end the fingerprint registration.
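As a purely editorial aid to reading claims 6 to 8 above (and the corresponding device claims 17 to 19), the coordinate-based variant can be sketched as follows in Python; the screen-style coordinate convention (y increasing downward) and all names here are assumptions, not the claimed implementation.

    def upper_left_point(points):
        # Coordinate parameter: the feature point nearest the upper left
        # corner, assuming screen coordinates where y grows downward.
        return min(points, key=lambda p: p[0] + p[1])

    def coverage(points):
        # First and second merged image parameters: the vertical and
        # horizontal ranges covered by the merged pre-registration dataset.
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return max(ys) - min(ys), max(xs) - min(xs)  # (vertical, horizontal)

    def should_end(points, v_threshold, h_threshold):
        # Registration ends only when both directions are sufficiently covered.
        vertical, horizontal = coverage(points)
        return vertical > v_threshold and horizontal > h_threshold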
GB1914055.7A, published as GB2574973B (priority 2017-10-16, filed 2018-10-15, Active): User interface display method and electronic device

Priority Applications (1)

GB2117618.5A (priority 2017-10-16, filed 2018-10-15): User interface display method and electronic device

Applications Claiming Priority (4)

US201762573140P (filed 2017-10-16)
US201762598480P (filed 2017-12-14)
CN201810349409.2A, published as CN109669651B (filed 2018-04-18): Display method of user interface and electronic device
PCT/CN2018/110264, published as WO2019076272A1 (filed 2018-10-15): User interface display method and electronic device

Publications (3)

GB201914055D0 (en), published 2019-11-13
GB2574973A (en), published 2019-12-25
GB2574973B (en), published 2022-04-13

Family

Family ID: 66141976

Family Applications (2)

GB2117618.5A, published as GB2599288B (filed 2018-10-15, Active): User interface display method and electronic device
GB1914055.7A, published as GB2574973B (filed 2018-10-15, Active): User interface display method and electronic device


Country Status (4)

JP (1): JP6836662B2
CN (1): CN109669651B
GB (2): GB2599288B
WO (1): WO2019076272A1

Families Citing this family (1)

* Cited by examiner, † Cited by third party
TWI735821B * (priority 2018-04-12, published 2021-08-11), 神盾股份有限公司: Fingerprint registration method and electronic device using the same

Citations (8)

* Cited by examiner, † Cited by third party
US20030123715A1 * (priority 2000-07-28, published 2003-07-03), Kaoru Uchida: Fingerprint identification method and apparatus
CN1668244A * (priority 2002-09-17, published 2005-09-14), 富士通株式会社: Biological information acquiring apparatus and authentication apparatus using biological information
CN102446271A * (priority 2010-10-08, published 2012-05-09), 金佶科技股份有限公司: Sectional type image identification method and regional type identification device thereof
CN103198289A * (priority 2012-01-04, published 2013-07-10), 金佶科技股份有限公司: Dual-lens fingerprint identification method and device thereof
CN105373786A * (priority 2015-11-30, published 2016-03-02), 东莞酷派软件技术有限公司: Fingerprint acquisition method, fingerprint acquisition device and electronic device
CN105981046A * (priority 2014-11-07, published 2016-09-28), 指纹卡有限公司: Fingerprint authentication using stitch and cut
CN106056037A * (priority 2015-04-15, published 2016-10-26), 三星电子株式会社: Method and apparatus for recognizing fingerprint
CN107004131A * (priority 2017-03-09, published 2017-08-01), 深圳市汇顶科技股份有限公司: The method and device of fingerprint recognition

Family Cites Families (10)

* Cited by examiner, † Cited by third party
JP3663075B2 * (priority 1999-04-05, published 2005-06-22), シャープ株式会社: Information processing device
CN1924889A * (priority 2005-08-30, published 2007-03-07), 知网生物识别科技股份有限公司: Apparatus and method of fingerprint register
KR101419784B1 * (priority 2013-06-19, published 2014-07-21), 크루셜텍 (주): Method and apparatus for recognizing and verifying fingerprint
US9898642B2 * (priority 2013-09-09, published 2018-02-20), Apple Inc.: Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
KR102126568B1 * (priority 2013-10-31, published 2020-06-24), 삼성전자주식회사: Method for processing data and an electronic device thereof
KR102217858B1 * (priority 2013-11-13, published 2021-02-19), 삼성전자주식회사: Method for fingerprint authentication, fingerprint authentication device, and mobile terminal performing thereof
CN105989349B * (priority 2014-10-24, published 2019-11-01), 神盾股份有限公司: The log-on data production method and electronic device of fingerprint
KR102396514B1 * (priority 2015-04-29, published 2022-05-11), 삼성전자주식회사: Fingerprint information processing method and electronic device supporting the same
KR101639986B1 * (priority 2015-10-07, published 2016-07-15), 크루셜텍 (주): Fingerprint information processing method and apparatus for speed improvement of fingerprint registration and authentification
WO2017156752A1 * (priority 2016-03-17, published 2017-09-21), 深圳信炜科技有限公司: Fingerprint registration method, fingerprint identification system, and electronic equipment

Also Published As

Publication number Publication date
JP2020515955A (en) 2020-05-28
GB202117618D0 (en) 2022-01-19
GB201914055D0 (en) 2019-11-13
GB2599288A (en) 2022-03-30
GB2599288B (en) 2022-11-23
GB2574973B (en) 2022-04-13
WO2019076272A1 (en) 2019-04-25
JP6836662B2 (en) 2021-03-03
CN109669651A (en) 2019-04-23
CN109669651B (en) 2021-02-02
