CN108304085B - Method for judging finger displacement direction and electronic device - Google Patents

Method for judging finger displacement direction and electronic device

Info

Publication number
CN108304085B
CN108304085B (application CN201710666448.0A)
Authority
CN
China
Prior art keywords
sub
regions
area
comparison result
region
Prior art date
Legal status
Active
Application number
CN201710666448.0A
Other languages
Chinese (zh)
Other versions
CN108304085A (en)
Inventor
江元麟
洗乙文
Current Assignee
Egis Technology Inc
Original Assignee
Egis Technology Inc
Priority date
Filing date
Publication date
Application filed by Egis Technology Inc filed Critical Egis Technology Inc
Priority to US15/813,059 (granted as US10489920B2)
Priority to DE102018100033.2A (granted as DE102018100033B4)
Publication of CN108304085A
Application granted
Publication of CN108304085B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Collating Specific Patterns (AREA)
  • Image Input (AREA)

Abstract

The present disclosure provides a method and an electronic device for determining a finger displacement direction. The method includes: sensing a fingerprint of a finger at a first sampling time point to generate a first frame, wherein the first frame comprises a plurality of first regions; sensing the fingerprint at a second sampling time point to generate a second frame, wherein the second frame comprises a plurality of second regions, each corresponding to one of the first regions; comparing each first region with the corresponding second region to obtain displacement information for each second region; and determining a displacement direction of the finger according to the displacement information corresponding to all the second regions. The method provided by the invention reduces the amount of computation required to determine the finger displacement direction and increases the speed at which the direction is determined.

Description

Method for judging finger displacement direction and electronic device
Technical Field
The present invention relates to a technique for determining a finger displacement direction, and more particularly, to a technique for determining a finger displacement direction according to displacement information corresponding to a plurality of regions included in a fingerprint frame.
Background
In the conventional technique for determining the finger displacement direction, a large test area is used to calculate the Euclidean distance, where the test area is only slightly smaller than the original image. Specifically, when the finger displacement direction is determined, a test area is taken from the previous image, the test area is shifted by several distances in different directions (up, down, left, and right) over the new image, and the similarity between the test area and the correspondingly shifted areas of the new image is compared in order to find the position with the highest similarity and thereby determine the finger displacement direction. The similarity is obtained by calculating the Euclidean distance between the areas of the two images; the similarity is highest where the Euclidean distance is smallest. The conventional technique therefore has to perform multiple similarity operations (e.g., Euclidean distance calculations) directly on a large area to obtain the finger displacement direction, which results in a huge amount of computation.
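For illustration only, the following is a minimal sketch of such a conventional full-search comparison; the image sizes, margin, and candidate offsets are assumptions for the example, not values from the prior art:

```python
import numpy as np

def conventional_direction(prev: np.ndarray, curr: np.ndarray) -> tuple[int, int]:
    """Full-search baseline: take a large test area from the previous
    image, shift it over the new image, and return the offset (dy, dx)
    with the smallest Euclidean distance (highest similarity)."""
    h, w = prev.shape
    m = 1  # margin so the test area is slightly smaller than the image
    test = prev[m:h - m, m:w - m].astype(float)
    offsets = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]  # none/up/down/left/right
    best_dist, best_off = float("inf"), (0, 0)
    for dy, dx in offsets:
        cand = curr[m + dy:h - m + dy, m + dx:w - m + dx].astype(float)
        dist = np.linalg.norm(test - cand)  # Euclidean distance over the whole area
        if dist < best_dist:
            best_dist, best_off = dist, (dy, dx)
    return best_off
```

Note that every candidate offset requires a distance computation over nearly the whole image, which is the computational burden the disclosure addresses.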
Disclosure of Invention
In view of the above-mentioned problems of the prior art, the present invention provides a method and an electronic device for determining a finger displacement direction according to displacement information corresponding to a plurality of areas included in a frame.
According to an embodiment of the present invention, a method for determining a finger displacement direction is provided. The method comprises the following steps: sensing a fingerprint of a finger at a first sampling time point to generate a first frame, wherein the first frame comprises a plurality of first regions; sensing the fingerprint at a second sampling time point to generate a second frame, wherein the second frame comprises a plurality of second regions, each corresponding to one of the first regions; comparing each first region with the corresponding second region to obtain displacement information corresponding to each second region; and determining a displacement direction of the finger according to the displacement information corresponding to all the second regions.
An electronic device is provided according to an embodiment of the invention. The electronic device comprises a sensing device and a processor. The sensing device senses a fingerprint of a finger at a first sampling time point to generate a first frame, and senses the fingerprint at a second sampling time point to generate a second frame, wherein the first frame comprises a plurality of first regions, the second frame comprises a plurality of second regions, and each second region corresponds to one of the first regions. The processor, coupled to the sensing device, compares each first region with the corresponding second region to obtain displacement information corresponding to each second region, and determines a displacement direction of the finger according to the displacement information of all the second regions.
With respect to additional features and advantages of the present invention, it will be apparent to those skilled in the art that various changes and modifications can be made in the apparatus and method disclosed in the disclosure without departing from the spirit and scope of the invention.
Drawings
Fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the invention.
Fig. 2A is a schematic diagram of a first frame F_T1 according to an embodiment of the invention.
Fig. 2B is a schematic diagram of a second frame F_T2 according to an embodiment of the invention.
Figs. 3A-3B are schematic diagrams illustrating the comparison between a plurality of first sub-regions and a plurality of second sub-regions according to an embodiment of the invention.
Fig. 4 is a schematic diagram of determining a finger displacement direction according to an embodiment of the invention.
Fig. 5 is a flowchart of a method for determining a finger displacement direction according to an embodiment of the present invention.
Description of reference numerals:
100 electronic device
110 fingerprint sensing device
120 processor
130 storage device
500 flow chart
B_T1-1 ~ B_T1-18 first regions
B_T2-1 ~ B_T2-18 second regions
F_T1 first frame
F_T2 second frame
Sub-B_T1 first sub-region
Sub-B_T2 second sub-region
S_up1, S_row, S_down1, S_left1, S_column, S_right1 comparison results
Detailed Description
The best mode for carrying out the invention is set forth in this section for the purpose of illustrating the concepts of the invention and not for the purpose of limiting the scope of the invention as defined by the claims.
Fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the invention. According to an embodiment of the present invention, the electronic device 100 may be any electronic device with a fingerprint recognition function, such as a mobile phone, a smart phone, a tablet computer, or a notebook computer. As shown in Fig. 1, the electronic device 100 may include a fingerprint sensing device 110, a processor 120, and a storage device 130. The block diagram in Fig. 1 is merely for convenience of illustrating the embodiments of the present invention, and the present invention is not limited thereto.
According to an embodiment of the present invention, the fingerprint sensing device 110 may comprise a sensing chip, which may comprise a sensing array, but the present invention is not limited thereto. The sensing array can be formed by arranging a plurality of sensing units in a two-dimensional manner, and each sensing unit corresponds to a pixel (pixel). According to an embodiment of the present invention, the storage device 130 is used for storing data generated by the fingerprint sensing device 110.
According to an embodiment of the present invention, the fingerprint sensing device 110 senses the finger of the user placed on the fingerprint sensing device 110 at each sampling time point Tx according to a sampling frequency to generate a frame F_Tx. According to an embodiment of the present invention, the frame F_Tx generated by the fingerprint sensing device 110 is an unprocessed grayscale image. For example, the fingerprint sensing device 110 senses the finger of the user placed on the fingerprint sensing device 110 at a first sampling time point T1 to generate a first frame F_T1 corresponding to the fingerprint of the finger, and senses the finger at a second sampling time point T2 to generate a second frame F_T2 corresponding to the fingerprint of the finger.
According to an embodiment of the invention, the frame F_Tx comprises a plurality of regions B_Tx, and the total area of the plurality of regions B_Tx is smaller than the total area of the frame F_Tx. For example, the first frame F_T1 comprises a plurality of first regions B_T1, and the second frame F_T2 comprises a plurality of second regions B_T2. The processor 120 then compares the plurality of first regions B_T1 with the plurality of second regions B_T2, as illustrated in Figs. 2A-2B.
Furthermore, according to an embodiment of the present invention, each region B_Tx comprises a plurality of sub-regions Sub-B_Tx. Each sub-region Sub-B_Tx can be regarded as a pixel, and each region B_Tx is an N×N pixel array or an N×M pixel array, where N ≠ M and N and M are integers greater than or equal to 3. For example, each first region B_T1 comprises a plurality of first sub-regions Sub-B_T1, and each second region B_T2 comprises a plurality of second sub-regions Sub-B_T2; for instance, each first region B_T1 and each second region B_T2 may contain 3×3 first sub-regions Sub-B_T1 and 3×3 second sub-regions Sub-B_T2, respectively.
Figs. 2A-2B are schematic diagrams of a first frame F_T1 and a second frame F_T2 according to an embodiment of the invention. As shown in Fig. 2A, the first frame F_T1 includes a plurality of first regions B_T1-1 ~ B_T1-18, and each first region B_T1-1 ~ B_T1-18 contains 3×3 first sub-regions Sub-B_T1. As shown in Fig. 2B, the second frame F_T2 includes a plurality of second regions B_T2-1 ~ B_T2-18, and each second region B_T2-1 ~ B_T2-18 contains 3×3 second sub-regions Sub-B_T2. The position of each first region B_T1-1 ~ B_T1-18 corresponds to the position of the respective second region B_T2-1 ~ B_T2-18. When the first frame F_T1 and the second frame F_T2 are compared, the processor 120 compares the first region B_T1-1 with the second region B_T2-1, and, by analogy, compares every first region with its corresponding second region; for example, the processor 120 compares the first region B_T1-2 with the second region B_T2-2. It is to be noted that the schematic diagrams in Figs. 2A-2B are only for convenience of describing the embodiment of the present invention, and the present invention is not limited thereto. In other embodiments, each frame may include a different number of regions, the regions in each frame may be distributed in a different manner, and each region may include a different number of sub-regions. A sketch of this layout is given below.
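The following is such a sketch; the grid of anchor positions and the frame size are assumptions for illustration, as the patent does not fix a particular layout:

```python
import numpy as np

def extract_regions(frame: np.ndarray, anchors: list[tuple[int, int]],
                    n: int = 3) -> list[np.ndarray]:
    """Cut n-by-n pixel regions out of a grayscale frame at the given
    top-left anchor positions (row, column)."""
    return [frame[r:r + n, c:c + n] for r, c in anchors]

# Example: a 12x18-pixel frame with 18 regions on a 3x6 grid of anchors.
frame = np.random.randint(0, 256, size=(12, 18), dtype=np.uint8)
anchors = [(r, c) for r in (0, 4, 8) for c in range(0, 18, 3)]
regions = extract_regions(frame, anchors)
assert len(regions) == 18 and regions[0].shape == (3, 3)
```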
According to an embodiment of the present invention, when the processor 120 compares a first region B_T1 with the corresponding second region B_T2, the processor 120 selects the first sub-regions located in the middle row of the first region B_T1 and compares them with the second sub-regions in each row of the second region B_T2, and further selects the first sub-regions located in the middle column of the first region B_T1 and compares them with the second sub-regions in each column of the second region B_T2.
According to an embodiment of the present invention, if the first region B_T1 is an N×N pixel array and N is odd, the middle column is the ((N+1)/2)-th column of the first region B_T1 and the middle row is the ((N+1)/2)-th row. According to another embodiment of the present invention, if the first region B_T1 is an N×N pixel array and N is even, the middle column is the (N/2)-th or (N/2+1)-th column of the first region B_T1, and the middle row is the (N/2)-th or (N/2+1)-th row. For example, if each first region B_T1 contains 3×3 first sub-regions, the middle column is the second column and the middle row is the second row. If each first region B_T1 contains 4×4 first sub-regions, the middle column is either the second or the third column, and the middle row is either the second or the third row.
According to another embodiment of the present invention, if the first region B_T1 is an N×M pixel array (N ≠ M) in which N is even and M is odd, the middle column is the (N/2)-th or (N/2+1)-th column of the first region B_T1, and the middle row is the ((M+1)/2)-th row. For example, if each first region B_T1 contains 4×3 first sub-regions, the middle column is either the second or the third column and the middle row is the second row.
According to an embodiment of the present invention, if the first region B_T1 is an N×N pixel array and N is odd, the processor 120 compares the first sub-regions in the ((N+1)/2)-th row of the first region B_T1 with the second sub-regions in the ((N-1)/2)-th row of the second region B_T2 to generate a comparison result S_up1; compares the first sub-regions in the ((N+1)/2)-th row of the first region B_T1 with the second sub-regions in the ((N+1)/2)-th row of the second region B_T2 to generate a comparison result S_row; and compares the first sub-regions in the ((N+1)/2)-th row of the first region B_T1 with the second sub-regions in the ((N+3)/2)-th row of the second region B_T2 to generate a comparison result S_down1. In addition, the processor 120 compares the first sub-regions in the ((N+1)/2)-th column of the first region B_T1 with the second sub-regions in the ((N-1)/2)-th column of the second region B_T2 to generate a comparison result S_left1; compares the first sub-regions in the ((N+1)/2)-th column of the first region B_T1 with the second sub-regions in the ((N+1)/2)-th column of the second region B_T2 to generate a comparison result S_column; and compares the first sub-regions in the ((N+1)/2)-th column of the first region B_T1 with the second sub-regions in the ((N+3)/2)-th column of the second region B_T2 to generate a comparison result S_right1. The following description is made with reference to Figs. 3A-3B. In this embodiment, in order to keep the sensed displacement on the order of one pixel, the sampling frequency of the fingerprint sensing device 110 is set such that the displacement between the first frame F_T1 sampled at the first sampling time point T1 and the second frame F_T2 sampled at the second sampling time point T2 is around one pixel.
Figs. 3A-3B are schematic diagrams illustrating the comparison between a plurality of first sub-regions and a plurality of second sub-regions according to an embodiment of the invention. Taking the first region B_T1-1 and the second region B_T2-1 of Figs. 2A-2B as an example, the first region B_T1-1 and the second region B_T2-1 comprise 3×3 first sub-regions and 3×3 second sub-regions, respectively. When the first region B_T1-1 and the second region B_T2-1 are compared, the processor 120 uses the second row of the first region B_T1-1 as the middle row and the second column of the first region B_T1-1 as the middle column. As shown in Fig. 3A, the processor 120 compares the first sub-regions in the middle row of the first region B_T1-1 with the second sub-regions in the first, second, and third rows of the second region B_T2-1, respectively, to generate comparison results S_up1, S_row, and S_down1. In addition, as shown in Fig. 3B, the processor 120 compares the first sub-regions in the middle column of the first region B_T1-1 with the second sub-regions in the first, second, and third columns of the second region B_T2-1, respectively, to generate comparison results S_left1, S_column, and S_right1. In this embodiment, the comparison result S_up1 is used to determine whether the first region B_T1-1 has moved upward; the comparison result S_row is used to determine whether it has not moved up or down; the comparison result S_down1 is used to determine whether it has moved downward; the comparison result S_left1 is used to determine whether it has moved to the left; the comparison result S_column is used to determine whether it has not moved left or right; and the comparison result S_right1 is used to determine whether it has moved to the right.
As shown in Figs. 3A-3B, when the processor 120 compares the first region B_T1 with the second region B_T2, the processor 120 calculates the similarity between the first sub-regions in the second row of the first region B_T1 and the second sub-regions in the first row of the second region B_T2 to generate the comparison result S_up1; calculates the similarity between the first sub-regions in the second row of the first region B_T1 and the second sub-regions in the second row of the second region B_T2 to generate the comparison result S_row; calculates the similarity between the first sub-regions in the second row of the first region B_T1 and the second sub-regions in the third row of the second region B_T2 to generate the comparison result S_down1; calculates the similarity between the first sub-regions in the second column of the first region B_T1 and the second sub-regions in the first column of the second region B_T2 to generate the comparison result S_left1; calculates the similarity between the first sub-regions in the second column of the first region B_T1 and the second sub-regions in the second column of the second region B_T2 to generate the comparison result S_column; and calculates the similarity between the first sub-regions in the second column of the first region B_T1 and the second sub-regions in the third column of the second region B_T2 to generate the comparison result S_right1.
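Under the 3×3 assumption of Figs. 3A-3B, these six strip comparisons might be sketched as follows, with a smaller Euclidean distance meaning a higher similarity (the function and key names are mine, not the patent's):

```python
import numpy as np

def strip_dist(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two 1-D strips of grayscale values."""
    return float(np.linalg.norm(a.astype(float) - b.astype(float)))

def compare_region(b1: np.ndarray, b2: np.ndarray) -> dict[str, float]:
    """Compare a 3x3 first region b1 with the corresponding 3x3 second
    region b2: the middle row of b1 against each row of b2 (vertical
    motion) and the middle column of b1 against each column of b2
    (horizontal motion)."""
    mid_row, mid_col = b1[1, :], b1[:, 1]  # 0-based index 1 == second row/column
    return {
        "up":     strip_dist(mid_row, b2[0, :]),
        "row":    strip_dist(mid_row, b2[1, :]),   # no up-down movement
        "down":   strip_dist(mid_row, b2[2, :]),
        "left":   strip_dist(mid_col, b2[:, 0]),
        "column": strip_dist(mid_col, b2[:, 1]),   # no left-right movement
        "right":  strip_dist(mid_col, b2[:, 2]),
    }
```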
After the processor 120 has calculated the comparison results S_up1, S_row, S_down1, S_left1, S_column, and S_right1, the processor 120 selects, from among them, the comparison result with the highest similarity, and uses the corresponding direction as the displacement information of the second region B_T2. If the comparison result S_up1 has the highest similarity, the processor 120 determines that the second region B_T2 has moved upward. If the comparison result S_row has the highest similarity, the processor 120 determines that the second region B_T2 has not moved up or down. If the comparison result S_down1 has the highest similarity, the processor 120 determines that the second region B_T2 has moved downward. If the comparison result S_left1 has the highest similarity, the processor 120 determines that the second region B_T2 has moved to the left. If the comparison result S_column has the highest similarity, the processor 120 determines that the second region B_T2 has not moved left or right. If the comparison result S_right1 has the highest similarity, the processor 120 determines that the second region B_T2 has moved to the right. Taking Figs. 3A-3B as an example, if the comparison result S_up1 is the highest, the processor 120 determines that the displacement information of the second region B_T2-1 is an upward movement; if the comparison result S_left1 is the highest, the processor 120 determines that the displacement information of the second region B_T2-2 is a leftward movement. In this way, the processor 120 obtains the displacement information corresponding to each second region B_T2.
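Continuing the sketch above, per-region displacement information then reduces to picking the comparison with the smallest distance, i.e. the highest similarity:

```python
def region_displacement(results: dict[str, float]) -> str:
    """Return the direction label whose strip comparison has the
    smallest Euclidean distance, i.e. the highest similarity."""
    return min(results, key=results.get)
```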
According to an embodiment of the present invention, the processor 120 calculates the similarity according to a Euclidean distance, although the present invention is not limited thereto. The processor 120 calculates the Euclidean distance from the grayscale values corresponding to the sub-regions. Taking Figs. 3A-3B as an example, the processor 120 calculates the similarity between the first sub-regions in the middle (second) row of the first region B_T1 and the second sub-regions in the first row of the second region B_T2, i.e. a first Euclidean distance, to generate the comparison result S_up1. By analogy, the processor 120 calculates the comparison results S_row, S_down1, S_left1, S_column, and S_right1.
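In symbols, for two compared strips of N grayscale values (notation mine, not the patent's), each comparison result is based on the standard Euclidean distance

$$d = \sqrt{\sum_{i=1}^{N} \big(g^{(1)}_i - g^{(2)}_i\big)^2},$$

where $g^{(1)}_i$ and $g^{(2)}_i$ are the gray values of the i-th first and second sub-regions in the two strips; the smallest $d$ corresponds to the highest similarity.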
According to an embodiment of the present invention, after the processor 120 has obtained the displacement information corresponding to each second region B_T2, the processor 120 counts, over all the second regions B_T2, how often each direction appears in the corresponding displacement information, and takes the direction that appears most frequently (i.e., upward movement, no up-down movement, downward movement, leftward movement, no left-right movement, or rightward movement) as the displacement direction of the finger. In other words, the processor 120 determines the displacement direction of the finger by majority vote.
Fig. 4 is a schematic diagram of determining a finger displacement direction according to an embodiment of the invention. As shown in Fig. 4, when the processor 120 has obtained the displacement information of each second region B_T2-1 ~ B_T2-18 of the second frame F_T2, the processor 120 counts, according to the displacement information of the second regions B_T2-1 ~ B_T2-18, the number of occurrences of upward movement, no up-down movement, downward movement, leftward movement, no left-right movement, and rightward movement. In the example of Fig. 4, upward movement occurs most often, so the processor 120 determines that the finger is moving upward.
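A minimal sketch of this majority vote using Python's standard library (the tie-breaking behavior is my assumption; the patent does not specify one):

```python
from collections import Counter

def finger_direction(per_region: list[str]) -> str:
    """Majority vote over the displacement information of all second
    regions; the most frequent direction wins."""
    return Counter(per_region).most_common(1)[0][0]

# Example: as in Fig. 4, most of the 18 regions report an upward movement.
votes = ["up"] * 10 + ["row"] * 3 + ["left"] * 2 + ["down"] * 3
assert finger_direction(votes) == "up"
```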
Fig. 5 is a flowchart illustrating a method for determining a finger displacement direction according to an embodiment of the invention, which is applied to the electronic device 100. As shown in fig. 5, in step S510, the electronic device 100 senses a fingerprint of a finger at a first sampling time point to generate a first frame, wherein the first frame includes a plurality of first areas. In step S520, the electronic device 100 senses the fingerprint at a second sampling time point to generate a second frame, wherein the second frame includes a plurality of second areas, and each of the second areas corresponds to each of the first areas.
In step S530, the electronic device 100 compares each of the first regions with the corresponding second region to obtain displacement information of each of the second regions. In step S540, the electronic device 100 determines a displacement direction of the finger according to the displacement information of all the second areas.
In this embodiment, each of the plurality of first regions includes a plurality of first sub-regions, and each of the plurality of second regions includes a plurality of second sub-regions.
According to an embodiment of the present invention, step S530 further includes the electronic device 100 comparing the plurality of first regions with the plurality of second regions according to the first sub-regions included in the middle row of each first region and the first sub-regions included in the middle column of each first region, so as to obtain the displacement information of each second region.
According to the method for determining the finger displacement direction provided by the embodiment of the invention, the electronic device obtains the displacement information of each region by comparing the sub-regions corresponding to different directions. Therefore, the method for determining the finger displacement direction according to the embodiments of the present invention can reduce the amount of computation required for determining the finger displacement direction and increase the speed of determining the finger displacement direction. In addition, according to the method for determining the displacement direction of the finger provided by the embodiment of the invention, when the frame only includes partial fingerprint information, the displacement direction of the finger can also be determined.
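As a rough illustration of the saving (all sizes assumed for the example, not taken from the patent): with 3×3 regions, each region needs six strip comparisons of 3 pixels each, i.e. 6 × 3 = 18 squared differences per region, or 18 × 18 = 324 for 18 regions per frame pair. A conventional full search over, say, a 100×100 test area at five candidate offsets needs on the order of 5 × 10,000 = 50,000 squared differences.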
The steps of the methods and algorithms disclosed in the present specification may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module (including executable instructions and associated data) and other data may be stored in a data memory such as random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), a DVD, or any other computer-readable storage medium format known in the art. A storage medium may be coupled to a machine such as a computer/processor (referred to herein, for convenience, as a "processor"), so that the processor can read information (such as program code) from, and write information to, the storage medium; alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC), and the ASIC may reside in user equipment; alternatively, the processor and the storage medium may reside in the user equipment as discrete components. In addition, in some embodiments, any suitable computer program product includes a readable storage medium containing program code associated with one or more of the disclosed embodiments. In some embodiments, the computer program product may include packaging materials.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention, but does not mean that they are present in every embodiment. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment of the invention.
The above paragraphs employ multiple levels of description. It should be apparent that the teachings herein may be implemented in a wide variety of forms, and that any specific architecture or functionality disclosed in the examples is merely representative. Based on the teachings herein, one of ordinary skill in the art should appreciate that each aspect disclosed herein may be implemented independently, or that two or more aspects may be implemented in combination.
Although the present invention has been described with reference to the preferred embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (16)

1. A method for determining a displacement direction of a finger, comprising the following steps:
sensing a fingerprint of a finger at a first sampling time point to generate a first frame, wherein the first frame comprises a plurality of first regions;
sensing the fingerprint at a second sampling time point to generate a second frame, wherein the second frame comprises a plurality of second regions, and each second region corresponds to one of the first regions;
comparing each first region with the corresponding second region to obtain displacement information of each second region; and
determining a displacement direction of the finger according to the displacement information corresponding to all the second regions.
2. The method according to claim 1, wherein the comparing step further comprises:
comparing each first region with the corresponding second region according to a middle row of each first region and a middle column of each first region to obtain the displacement information of each second region.
3. The method according to claim 1, wherein each of the first regions comprises N×N first sub-regions, and each of the second regions comprises N×N second sub-regions, where N is an odd integer.
4. The method according to claim 3, wherein the comparing step further comprises:
comparing the first sub-regions included in the ((N+1)/2)-th row of each first region with the second sub-regions included in the ((N-1)/2)-th row of the corresponding second region to generate a comparison result S_up1;
comparing the first sub-regions included in the ((N+1)/2)-th row of each first region with the second sub-regions included in the ((N+1)/2)-th row of the corresponding second region to generate a comparison result S_row;
comparing the first sub-regions included in the ((N+1)/2)-th row of each first region with the second sub-regions included in the ((N+3)/2)-th row of the corresponding second region to generate a comparison result S_down1;
comparing the first sub-regions included in the ((N+1)/2)-th column of each first region with the second sub-regions included in the ((N-1)/2)-th column of the corresponding second region to generate a comparison result S_left1;
comparing the first sub-regions included in the ((N+1)/2)-th column of each first region with the second sub-regions included in the ((N+1)/2)-th column of the corresponding second region to generate a comparison result S_column;
comparing the first sub-regions included in the ((N+1)/2)-th column of each first region with the second sub-regions included in the ((N+3)/2)-th column of the corresponding second region to generate a comparison result S_right1; and
obtaining the displacement information corresponding to each second region according to the comparison results S_up1, S_row, S_down1, S_left1, S_column, and S_right1.
5. The method according to claim 4, further comprising:
calculating similarities according to a Euclidean distance to generate the comparison results S_up1, S_row, S_down1, S_left1, S_column, and S_right1; and
selecting the comparison result with the highest similarity as the displacement information corresponding to the second region.
6. The method according to claim 5, wherein the determining step further comprises:
selecting the displacement direction that occurs most frequently among the displacement information of all the second regions as the displacement direction of the finger, wherein the comparison result S_up1 corresponds to an upward displacement direction, the comparison result S_row corresponds to no up-down movement, the comparison result S_down1 corresponds to a downward displacement direction, the comparison result S_left1 corresponds to a leftward displacement direction, the comparison result S_column corresponds to no left-right movement, and the comparison result S_right1 corresponds to a rightward displacement direction.
7. The method according to claim 1, wherein each of the first regions comprises N×N first sub-regions, and each of the second regions comprises N×N second sub-regions, where N is an even integer.
8. The method according to claim 1, wherein each of the first regions comprises N×M first sub-regions, and each of the second regions comprises N×M second sub-regions, where N and M are integers, N is even, and M is odd.
9. An electronic device, comprising:
a sensing device, sensing a fingerprint of a finger at a first sampling time point to generate a first frame, and sensing the fingerprint at a second sampling time point to generate a second frame, wherein the first frame comprises a plurality of first regions, the second frame comprises a plurality of second regions, and each second region corresponds to one of the first regions; and
a processor, coupled to the sensing device, comparing each of the first regions with the corresponding second region to obtain displacement information corresponding to each second region, and determining a displacement direction of the finger according to the displacement information of all the second regions.
10. The electronic device of claim 9, wherein the processor compares each of the first regions with the corresponding second region according to a middle row of each of the first regions and a middle column of each of the first regions to obtain the displacement information of each of the second regions.
11. The electronic device of claim 9, wherein each of the first regions comprises N×N first sub-regions, and each of the second regions comprises N×N second sub-regions, where N is an odd integer.
12. The electronic device of claim 11, wherein the processor further compares the first sub-regions included in the ((N+1)/2)-th row of each first region with the second sub-regions included in the ((N-1)/2)-th row of the corresponding second region to generate a comparison result S_up1; compares the first sub-regions included in the ((N+1)/2)-th row of each first region with the second sub-regions included in the ((N+1)/2)-th row of the corresponding second region to generate a comparison result S_row; compares the first sub-regions included in the ((N+1)/2)-th row of each first region with the second sub-regions included in the ((N+3)/2)-th row of the corresponding second region to generate a comparison result S_down1; compares the first sub-regions included in the ((N+1)/2)-th column of each first region with the second sub-regions included in the ((N-1)/2)-th column of the corresponding second region to generate a comparison result S_left1; compares the first sub-regions included in the ((N+1)/2)-th column of each first region with the second sub-regions included in the ((N+1)/2)-th column of the corresponding second region to generate a comparison result S_column; and compares the first sub-regions included in the ((N+1)/2)-th column of each first region with the second sub-regions included in the ((N+3)/2)-th column of the corresponding second region to generate a comparison result S_right1, and
wherein the processor obtains the displacement information corresponding to each second region according to the comparison results S_up1, S_row, S_down1, S_left1, S_column, and S_right1.
13. The electronic device of claim 12, wherein the processor calculates similarities according to a Euclidean distance to generate the comparison results S_up1, S_row, S_down1, S_left1, S_column, and S_right1, and the processor selects the comparison result with the highest similarity as the displacement information corresponding to the second region.
14. The electronic device of claim 13, wherein the processor further selects the displacement direction with the largest number of occurrences among the displacement information of all the second regions as the displacement direction of the finger, wherein the comparison result S_up1 corresponds to an upward displacement direction, the comparison result S_row corresponds to no up-down movement, the comparison result S_down1 corresponds to a downward displacement direction, the comparison result S_left1 corresponds to a leftward displacement direction, the comparison result S_column corresponds to no left-right movement, and the comparison result S_right1 corresponds to a rightward displacement direction.
15. The electronic device of claim 9, wherein each of the first regions comprises N×N first sub-regions, and each of the second regions comprises N×N second sub-regions, where N is an even integer.
16. The electronic device of claim 9, wherein each of the first regions comprises N×M first sub-regions, and each of the second regions comprises N×M second sub-regions, where N and M are integers, N is even, and M is odd.
CN201710666448.0A 2017-01-11 2017-08-07 Method for judging finger displacement direction and electronic device Active CN108304085B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/813,059 US10489920B2 (en) 2017-01-11 2017-11-14 Method and electronic device for determining moving direction of a finger
DE102018100033.2A DE102018100033B4 (en) 2017-01-11 2018-01-02 Method and electronic device for determining the direction of movement of a finger

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762444954P 2017-01-11 2017-01-11
US62/444,954 2017-01-11

Publications (2)

Publication Number Publication Date
CN108304085A CN108304085A (en) 2018-07-20
CN108304085B (en) 2021-01-01

Family

ID=62872619

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710666448.0A Active CN108304085B (en) 2017-01-11 2017-08-07 Method for judging finger displacement direction and electronic device

Country Status (1)

Country Link
CN (1) CN108304085B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1551037A (en) * 2003-05-20 2004-12-01 祥群科技股份有限公司 Sliding type fingerprint sensor module assembly and its sensing method
WO2007097510A1 (en) * 2006-02-27 2007-08-30 Iritech Inc. Deformation-resilient iris recognition methods
CN101042733A (en) * 2006-03-22 2007-09-26 上海品伟数码科技有限公司 Image split-joint method for obtaining complete finger print with slide collection
CN101646066A (en) * 2008-08-08 2010-02-10 鸿富锦精密工业(深圳)有限公司 Video monitoring system and method
CN101840511A (en) * 2010-06-04 2010-09-22 哈尔滨工程大学 Method for extracting, matching and recognizing characteristics of finger veins
EP2701118A1 (en) * 2011-04-20 2014-02-26 NEC Corporation Ten-print card input device, ten-print card input method, and storage medium
CN103853442A (en) * 2012-12-03 2014-06-11 联想(北京)有限公司 Information processing method and electronic equipment
CN104899546A (en) * 2014-03-07 2015-09-09 神盾股份有限公司 Fingerprint identification method and device
CN104932753A (en) * 2015-07-15 2015-09-23 京东方科技集团股份有限公司 Touch screen, touch method of touch screen and display device
CN105095722A (en) * 2015-08-25 2015-11-25 宇龙计算机通信科技(深圳)有限公司 Fingerprint verification method of mobile terminal and fingerprint verification system of mobile terminal
CN106133748A (en) * 2012-05-18 2016-11-16 苹果公司 For handling the equipment of user interface, method and graphic user interface based on fingerprint sensor input
CN106210343A (en) * 2016-07-27 2016-12-07 维沃移动通信有限公司 The method of adjustment of a kind of mobile terminal screen brightness and mobile terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102355516B1 (en) * 2015-04-30 2022-01-26 삼성디스플레이 주식회사 Touch screen display device and driving method thereof

Also Published As

Publication number Publication date
CN108304085A (en) 2018-07-20

Similar Documents

Publication Publication Date Title
TWI733127B (en) Information detection method, device and equipment
WO2019201035A1 (en) Method and device for identifying object node in image, terminal and computer readable storage medium
CN112036395A (en) Text classification identification method and device based on target detection
CN109117846B (en) Image processing method and device, electronic equipment and computer readable medium
US10964050B2 (en) Method for identifying foreground object in image and electronic device
CN112348116B (en) Target detection method and device using space context and computer equipment
CN111652245B (en) Vehicle contour detection method, device, computer equipment and storage medium
CN110210480A (en) Character recognition method, device, electronic equipment and computer readable storage medium
US9171227B2 (en) Apparatus and method extracting feature information of a source image
CN112749576B (en) Image recognition method and device, computing equipment and computer storage medium
CN117036490B (en) Method, device, computer equipment and medium for detecting preset bit offset of camera
CN108304085B (en) Method for judging finger displacement direction and electronic device
CN113033593A (en) Text detection training method and device based on deep learning
JP7525051B2 (en) Information processing device, information processing method, and program
JP4387889B2 (en) Template collation apparatus and method
US10489920B2 (en) Method and electronic device for determining moving direction of a finger
CN111860492B (en) License plate inclination correction method and device, computer equipment and storage medium
KR20190132885A (en) Apparatus, method and computer program for detecting hand from video
US11288534B2 (en) Apparatus and method for image processing for machine learning
CN109919850B (en) High contrast structured light pattern for QIS sensor
JP2018010359A (en) Information processor, information processing method, and program
CN112699826B (en) Face detection method, device, computer equipment and storage medium
CN111582259B (en) Machine-readable code identification method, device, electronic equipment and storage medium
US20170011715A1 (en) Method, non-transitory storage medium and electronic device for displaying system information
CN116310388A (en) Image processing method, readable medium, and electronic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant