CN106295483B - Information processing method and electronic equipment - Google Patents

Information processing method and electronic equipment

Info

Publication number
CN106295483B
CN106295483B CN201510320224.5A
Authority
CN
China
Prior art keywords
pixel point
detection parameter
pixel points
value
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510320224.5A
Other languages
Chinese (zh)
Other versions
CN106295483A (en)
Inventor
孙成昆
杨安荣
郑清芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201510320224.5A priority Critical patent/CN106295483B/en
Publication of CN106295483A publication Critical patent/CN106295483A/en
Application granted granted Critical
Publication of CN106295483B publication Critical patent/CN106295483B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • G06V40/1353Extracting features related to minutiae or pores

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an information processing method and electronic equipment, wherein a first image aiming at an operation object is obtained, and the first image comprises M pixel points; determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image, wherein the detection parameter value is a difference value between the pixel point and a standard value; selecting N target pixel points which meet a first condition from the M pixel points according to the detection parameter values; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; m is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0; and extracting the characteristic information of the N target pixel points.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to information processing technologies, and in particular, to an information processing method and an electronic device.
Background
With the development of fingerprint identification technology, integrating fingerprint identification into handheld devices has gradually been accepted by consumers. At present, traditional fingerprint identification algorithms such as the line correlation detection method, the skeleton refinement method and the gradient direction detection method involve a large amount of computation when extracting fingerprint details. Therefore, limited by factors such as power consumption and computational complexity, conventional fingerprint identification algorithms cannot meet the real-time processing requirements of current embedded environments.
Disclosure of Invention
In view of the above, embodiments of the present invention provide an information processing method and an electronic device to solve the problems in the prior art.
The technical scheme of the embodiment of the invention is realized as follows:
the invention provides an information processing method, which is applied to electronic equipment and comprises the following steps:
acquiring a first image for an operation object; the first image comprises M pixel points;
determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image; the detection parameter value is a difference value between the pixel point and a standard value;
selecting N target pixel points which meet a first condition from the M pixel points according to the detection parameter values; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; m is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0;
and extracting the characteristic information of the N target pixel points.
Preferably, the selecting N target pixels meeting a first condition from the M pixels according to the detection parameter value includes:
acquiring detection parameter values of the M-N pixel points, and selecting a pixel point corresponding to the maximum value in the detection parameter values as a target pixel point;
and stopping the acquisition process when the maximum value is less than or equal to the first threshold value.
Preferably, after selecting the pixel point corresponding to the maximum detection parameter value as a target pixel point, the method further includes:
and deleting the detection parameter value corresponding to at least one pixel point in a preset range by taking the target pixel point as a center.
Preferably, the determining a detection parameter value corresponding to each pixel point of the M pixel points of the first image includes:
scalar quantity calculation is carried out on each pixel point in the M pixel points of the first image to obtain the corresponding gradient of each pixel point;
carrying out consistency calculation on the corresponding gradient of each pixel point to obtain a consistency parameter corresponding to each pixel point;
and performing difference operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
Preferably, the selecting N target pixels meeting a first condition from the M pixels according to the detection parameter value includes:
rejecting detection parameter values smaller than a second threshold value from the detection parameter values of the M pixel points; the second threshold is less than the first threshold;
and selecting N target pixel points meeting a first condition from the M pixel points according to the residual detection parameter values.
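The method steps summarized above can be sketched as a short pipeline. This is a minimal illustration only, assuming a NumPy environment; the `detect` callable is a hypothetical stand-in for the detection-parameter calculation described later:

```python
import numpy as np

def extract_targets(image, detect, first_threshold):
    """Minimal sketch of the disclosed method: compute a detection
    parameter value per pixel point, keep the pixel points whose value
    exceeds the first threshold (the first condition), and return their
    coordinates as the extracted feature information."""
    params = detect(image)                          # one detection parameter value per pixel point
    ys, xs = np.nonzero(params > first_threshold)   # N target pixel points meeting the first condition
    return list(zip(ys.tolist(), xs.tolist()))
```

Here N may be 0 (no pixel point exceeds the first threshold) up to M (all of them), matching the claimed range of N.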
The invention also provides electronic equipment, which comprises an acquisition module, a determination module, a selection module and an extraction module;
the acquisition module is used for acquiring a first image aiming at an operation object; the first image comprises M pixel points;
the determining module is configured to determine a detection parameter value corresponding to each pixel point of the M pixel points of the first image; the detection parameter value is a difference value between the pixel point and a standard value;
the selecting module is used for selecting N target pixel points meeting a first condition from the M pixel points according to the detection parameter values; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; m is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0;
and the extraction module is used for extracting the characteristic information of the N target pixel points.
Preferably, the selecting module comprises a first selecting unit and a control unit; wherein,
the first selection unit is used for acquiring detection parameter values of the M-N pixel points and selecting the pixel point corresponding to the maximum value in the detection parameter values as a target pixel point;
the control unit is configured to stop the obtaining process when the maximum value is less than or equal to the first threshold.
Preferably, the selecting module further comprises a first deleting unit; wherein,
the first deleting unit is used for deleting the detection parameter value corresponding to at least one pixel point in a preset range by taking the target pixel point as a center.
Preferably, the determining module comprises a scalar computing unit, a consistency computing unit and a difference computing unit; wherein,
the scalar calculation unit is used for carrying out scalar calculation on each pixel point in the M pixel points of the first image to obtain the corresponding gradient of each pixel point;
the consistency calculation unit is used for performing consistency calculation on the gradient corresponding to each pixel point to obtain a consistency parameter corresponding to each pixel point;
and the difference value operation unit is used for carrying out difference value operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
Preferably, the selecting module comprises a second rejecting unit and a second selecting unit; wherein,
the second rejecting unit is used for rejecting the detection parameter values smaller than a second threshold value from the detection parameter values of the M pixel points; the second threshold is less than the first threshold;
and the second selection unit is used for selecting N target pixel points meeting the first condition from the M pixel points according to the residual detection parameter values.
In the embodiment of the invention, electronic equipment acquires a first image aiming at an operation object, wherein the first image comprises M pixel points; determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image, wherein the detection parameter value is a difference value between the pixel point and a standard value; selecting N target pixel points which meet a first condition from the M pixel points according to the detection parameter values; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; M is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0; and extracting the characteristic information of the N target pixel points. Therefore, in the process of extracting the details of the operation object, the electronic device only needs to determine the detection parameter value corresponding to each pixel point through calculation and then select the target pixel points according to the detection parameter values; the calculation complexity is low, and the method is suitable for low-performance embedded platforms.
Drawings
FIG. 1 is a first schematic flow chart illustrating an implementation of an information processing method according to an embodiment of the present invention;
fig. 2 is a schematic view of an implementation flow of determining a detection parameter value corresponding to each pixel point of the M pixel points of the first image according to the embodiment of the present invention;
fig. 3 is a display effect diagram of a target pixel point selected in the first image according to the embodiment of the present invention;
FIG. 4 is a schematic diagram of a second implementation flow of the information processing method according to the embodiment of the present invention;
FIG. 5 is a third schematic flow chart illustrating an implementation of the information processing method according to the embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating a fourth implementation flow of the information processing method according to the embodiment of the present invention;
FIG. 7 is a schematic diagram of a component structure of an electronic device according to an embodiment of the invention;
FIG. 8 is a schematic diagram of a structure of the determining module according to an embodiment of the present invention;
FIG. 9 is a first schematic diagram illustrating a structure of the selection module according to an embodiment of the present invention;
fig. 10 is a schematic diagram of a second component structure of the selection module according to the embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and specific examples.
The first method embodiment:
fig. 1 is a schematic view of a first implementation flow of an information processing method according to an embodiment of the present invention, as shown in fig. 1, applied to an electronic device, where the method includes:
step S101: acquiring a first image for an operation object;
the first image comprises M pixel points.
Here, the operation object includes an object for human body recognition such as a fingerprint or a human face; correspondingly, the first image for the operation object can be a fingerprint image or a human face image.
The manner in which the electronic device acquires the first image for the operation object may vary from one operation object to another. When the operation object is a fingerprint, the electronic equipment can acquire a fingerprint image through a fingerprint sensor; when the operation object is a human face, the electronic device can acquire a human face image through a camera in the electronic device.
Here, it should be added that, after the electronic device acquires the first image of the operation object, the method may further include performing a filtering process, such as Gabor filtering, on the first image to obtain a filtered first image; then, the subsequent step S102 is performed on the filtered first image.
The electronic device can be a mobile phone, a PC, a tablet computer, a personal digital assistant and the like.
Step S102: determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image;
and the detection parameter value is a difference value between the pixel point and a standard value, and the detection parameter value is used for representing the inconsistency of the pixel point. In practical application, the detection parameter value is obtained by making a difference between the consistency parameter of the pixel point and the standard value "1".
Specifically, as shown in fig. 2, the determining, by the electronic device, a detection parameter value corresponding to each pixel point of the M pixel points of the first image includes:
step S1021: scalar quantity calculation is carried out on each pixel point in the M pixel points of the first image to obtain the corresponding gradient of each pixel point;
Specifically, the electronic device performs scalar calculation on each pixel point in the M pixel points of the first image through the Sobel operator to obtain the gradient in the horizontal direction and the gradient in the vertical direction of each pixel point, denoted gx and gy respectively; the gradient corresponding to each pixel point can then be expressed as g = (gx, gy).
Step S1022: carrying out consistency calculation on the corresponding gradient of each pixel point to obtain a consistency parameter corresponding to each pixel point;
Specifically, the electronic device first constructs a gradient square tensor matrix G = g * g^T according to the gradient g corresponding to each pixel point; then, the consistency parameter coherence of each pixel point is calculated using the constructed gradient square tensor matrix, and the calculation expression is as follows:
coherence = (a^2 + b^2 + c^2) / b^2
wherein a = gy*gy - gx*gx; b = gy*gy + gx*gx; c = gy*gy.
Step S1023: and performing difference operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
Specifically, the electronic device performs a difference operation between the consistency parameter corresponding to each pixel point and the standard value to obtain the detection parameter value corresponding to each pixel point, that is, detection parameter value = 1 - coherence.
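Steps S1021 to S1023 can be sketched as follows. This is an illustrative reading, not the patent's implementation: `np.gradient` (central differences) stands in for the Sobel operator, and because the printed coherence expression appears garbled by OCR, the standard gradient-square-tensor coherence sqrt((gyy - gxx)^2 + 4*gxy^2) / (gxx + gyy) is used in its place:

```python
import numpy as np

def detection_parameter_map(img):
    """Sketch of steps S1021-S1023 under the stated assumptions.
    Returns one detection parameter value per pixel point, defined as
    the difference between the standard value 1 and the coherence."""
    gy, gx = np.gradient(img.astype(float))          # central differences stand in for Sobel
    gxx, gyy, gxy = gx * gx, gy * gy, gx * gy        # entries of the gradient square tensor
    eps = 1e-9                                       # avoid division by zero in flat regions
    coherence = np.sqrt((gyy - gxx) ** 2 + 4 * gxy ** 2) / (gxx + gyy + eps)
    return 1.0 - coherence                           # difference from the standard value "1"
```

On a perfectly oriented region (a linear ramp) the coherence is close to 1 and the detection parameter value is close to 0; on a flat region the coherence is 0 and the detection parameter value is 1, consistent with the value representing inconsistency.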
Step S103: selecting N target pixel points which meet a first condition from the M pixel points according to the detection parameter values;
the first condition is that a detection parameter value corresponding to a pixel point is larger than a first threshold value; m is a positive integer greater than 1, and N is an integer less than or equal to M and greater than or equal to 0. Here, the first threshold value is usually between 65 and 128.
Specifically, the electronic device selects, from the M pixels, all pixels whose detection parameter values corresponding to the pixels are greater than a first threshold as target pixels according to the detection parameter values.
Step S104: and extracting the characteristic information of the N target pixel points.
Here, the feature information of the target pixel includes coordinate information, direction information, curvature information, and a pixel block within a specified range with the target pixel as a center point, and as shown in fig. 3, the feature information of the target pixel 31 and the target pixel 32 is extracted.
Specifically, after selecting N target pixels meeting a first condition from the M pixels, the electronic device extracts feature information of the N target pixels to complete a detail extraction process for an operation object.
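For a single target pixel point, extracting part of the feature information described above might look like this. This is a sketch: only the coordinate information and the pixel block centered on the target pixel point are extracted, the window parameter `half` is an assumption (the patent does not fix the block size here), and direction and curvature extraction are omitted:

```python
import numpy as np

def extract_feature(img, y, x, half=4):
    """Return the coordinate information and the pixel block within a
    (2*half+1)-sided square centered on the target pixel point (y, x).
    Assumes the window lies fully inside the image."""
    patch = img[y - half:y + half + 1, x - half:x + half + 1]
    return {"coord": (y, x), "patch": patch}
```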
Therefore, with the information processing method provided by the embodiment of the invention, in the process of extracting the details of the operation object, the electronic device only needs to determine the detection parameter value corresponding to each pixel point through calculation and then select the target pixel points according to the detection parameter values; the calculation complexity is low, and the method is suitable for low-performance embedded platforms. In addition, when the detection parameter value corresponding to each pixel point is determined by calculation, the Sobel operator is adopted to realize rapid calculation of the gradient, which greatly speeds up the whole detail extraction process and can meet the requirement of real-time processing in the current embedded environment.
The second method embodiment:
fig. 4 is a schematic view of a second implementation flow of an information processing method according to an embodiment of the present invention, and as shown in fig. 4, the method is applied to an electronic device, and includes:
step S401: acquiring a first image for an operation object;
the first image comprises M pixel points.
Here, the operation object includes an object for human body recognition such as a fingerprint or a human face; correspondingly, the first image for the operation object can be a fingerprint image or a human face image.
The manner in which the electronic device acquires the first image for the operation object may vary from one operation object to another. When the operation object is a fingerprint, the electronic equipment can acquire a fingerprint image through a fingerprint sensor; when the operation object is a human face, the electronic device can acquire a human face image through a camera in the electronic device.
Here, it should be added that, after the electronic device acquires the first image of the operation object, the method may further include performing a filtering process, such as Gabor filtering, on the first image to obtain a filtered first image; then, the subsequent step S402 is performed on the filtered first image.
Step S402: determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image;
and the detection parameter value is a difference value between the pixel point and a standard value, and the detection parameter value is used for representing the inconsistency of the pixel point. In practical application, the detection parameter value is obtained by making a difference between the consistency parameter of the pixel point and the standard value "1".
Specifically, as shown in fig. 2, the determining, by the electronic device, a detection parameter value corresponding to each pixel point of the M pixel points of the first image includes:
step S1021: scalar quantity calculation is carried out on each pixel point in the M pixel points of the first image to obtain the corresponding gradient of each pixel point;
Specifically, the electronic device performs scalar calculation on each pixel point in the M pixel points of the first image through the Sobel operator to obtain the gradient in the horizontal direction and the gradient in the vertical direction of each pixel point, denoted gx and gy respectively; the gradient corresponding to each pixel point can then be expressed as g = (gx, gy).
Step S1022: carrying out consistency calculation on the corresponding gradient of each pixel point to obtain a consistency parameter corresponding to each pixel point;
Specifically, the electronic device first constructs a gradient square tensor matrix G = g * g^T according to the gradient g corresponding to each pixel point; then, the consistency parameter coherence of each pixel point is calculated using the constructed gradient square tensor matrix, and the calculation expression is as follows:
coherence = (a^2 + b^2 + c^2) / b^2
wherein a = gy*gy - gx*gx; b = gy*gy + gx*gx; c = gy*gy.
Step S1023: and performing difference operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
Specifically, the electronic device performs a difference operation between the consistency parameter corresponding to each pixel point and the standard value to obtain the detection parameter value corresponding to each pixel point, that is, detection parameter value = 1 - coherence.
Steps S403 to S404: acquiring detection parameter values of the M-N pixel points, and selecting a pixel point corresponding to the maximum value in the detection parameter values as a target pixel point; stopping the acquisition process when the maximum value is less than or equal to the first threshold;
wherein M is a positive integer greater than 1, and N is an integer less than or equal to M and greater than or equal to 0.
Specifically, the electronic device traverses the detection parameter values of the M-N pixel points, finds the maximum value among the detection parameter values, takes the pixel point corresponding to the maximum value as a target pixel point, and stops the traversal acquisition process until the maximum value is less than or equal to the first threshold value.
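The traversal in steps S403 to S404 can be sketched as a greedy loop. This is an illustrative reading, assuming the detection parameter values are held in a NumPy array; each chosen pixel point is excluded from later iterations, and the loop stops once the remaining maximum no longer exceeds the first threshold:

```python
import numpy as np

def select_targets(params, first_threshold):
    """Greedy sketch of steps S403-S404: repeatedly take the pixel
    point with the largest remaining detection parameter value; stop
    when the maximum is less than or equal to the first threshold."""
    remaining = params.astype(float).copy()
    targets = []
    while True:
        idx = np.unravel_index(np.argmax(remaining), remaining.shape)
        if remaining[idx] <= first_threshold:   # stop condition of step S404
            break
        targets.append(idx)
        remaining[idx] = -np.inf                # exclude the chosen pixel point
    return targets
```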
Step S405: and extracting the characteristic information of the N target pixel points.
Here, the feature information of the target pixel includes coordinate information, direction information, curvature information, and a pixel block within a specified range with the target pixel as a center point; as shown in fig. 3, the feature information of the target pixels 31 and 32 is extracted.
Specifically, after selecting N target pixels meeting a first condition from the M pixels, the electronic device extracts feature information of the N target pixels to complete a detail extraction process for an operation object.
Therefore, with the information processing method provided by the embodiment of the invention, in the process of extracting details of the operation object, the electronic device only needs to determine the detection parameter value corresponding to each pixel point through calculation, traverse the detection parameter values of the M-N pixel points, and select the pixel point corresponding to the maximum detection parameter value as a target pixel point; the calculation complexity is low, and the method is suitable for low-performance embedded platforms. In addition, when the detection parameter value corresponding to each pixel point is determined by calculation, the Sobel operator is adopted to realize rapid calculation of the gradient, which greatly speeds up the whole detail extraction process and can meet the requirement of real-time processing in the current embedded environment.
The third method embodiment:
fig. 5 is a schematic view of a third implementation flow of an information processing method according to an embodiment of the present invention, as shown in fig. 5, applied to an electronic device, where the method includes:
step S501: acquiring a first image for an operation object;
the first image comprises M pixel points.
Step S502: determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image;
and the detection parameter value is a difference value between the pixel point and a standard value, and the detection parameter value is used for representing the inconsistency of the pixel point. In practical application, the detection parameter value is obtained by making a difference between the consistency parameter of the pixel point and the standard value "1".
Specifically, the process of the electronic device determining the detection parameter value corresponding to each pixel point of the M pixel points of the first image is as shown in fig. 2 in the first embodiment or the second embodiment, and details are not repeated here.
Step S503: acquiring detection parameter values of the M-N pixel points, and selecting a pixel point corresponding to the maximum value in the detection parameter values as a target pixel point;
wherein M is a positive integer greater than 1, and N is an integer less than or equal to M and greater than or equal to 0.
Step S504: deleting a detection parameter value corresponding to at least one pixel point in a preset range by taking the target pixel point as a center;
the preset range can be 8 × 8-16 × 16 area ranges with the target pixel point as the center.
Specifically, after the electronic device takes the pixel point corresponding to the maximum value in the selected detection parameter values as a target pixel point, the electronic device deletes the detection parameter value corresponding to at least one pixel point within a preset range by taking the target pixel point as a center.
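Steps S503 to S505 combined, greedy selection plus deletion of the surrounding detection parameter values, can be sketched as follows. The square window of side 2*radius + 1 is an assumed concrete form of the preset range; radius = 4 corresponds roughly to the 8 × 8 case:

```python
import numpy as np

def select_with_suppression(params, first_threshold, radius=4):
    """Sketch of the third embodiment: after each target pixel point is
    chosen, the detection parameter values within a square window
    centered on it are deleted (set to -inf), thinning the traversal."""
    remaining = params.astype(float).copy()
    h, w = remaining.shape
    targets = []
    while True:
        y, x = np.unravel_index(np.argmax(remaining), remaining.shape)
        if remaining[y, x] <= first_threshold:   # stop condition of step S505
            break
        targets.append((y, x))
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        remaining[y0:y1, x0:x1] = -np.inf        # delete neighbouring detection values
    return targets
```

A pixel point close to an already-selected target is never selected again, which is what simplifies the traversal compared with the second embodiment.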
Step S505: stopping the acquisition process when the maximum value is less than or equal to the first threshold;
step S506: and extracting the characteristic information of the N target pixel points.
Here, the feature information of the target pixel includes coordinate information, direction information, curvature information, and a pixel block within a specified range with the target pixel as a center point, and as shown in fig. 3, the feature information of the target pixels 31 and 32 is extracted.
Specifically, after selecting N target pixels meeting a first condition from the M pixels, the electronic device extracts feature information of the N target pixels to complete a detail extraction process for an operation object.
Therefore, according to the information processing method provided by the embodiment of the invention, in the process of extracting details of the operation object, the electronic equipment only determines the detection parameter value corresponding to each pixel point through calculation, then traverses the detection parameter values of the M-N pixel points, selects the pixel point corresponding to the maximum value in the detection parameter values as the target pixel point, and further deletes the detection parameter value corresponding to the pixel point in the preset range with the target pixel point as the center after selecting the target pixel point, so that the method can be realized, the traversal process is simplified, the calculation complexity is low, and the method is suitable for a low-performance embedded platform.
The fourth method embodiment:
fig. 6 is a schematic view of a fourth implementation flow of an information processing method according to an embodiment of the present invention, and as shown in fig. 6, the method is applied to an electronic device, and includes:
step S601: acquiring a first image for an operation object;
the first image comprises M pixel points.
Step S602: determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image;
and the detection parameter value is a difference value between the pixel point and a standard value, and the detection parameter value is used for representing the inconsistency of the pixel point. In practical application, the detection parameter value is obtained by making a difference between the consistency parameter of the pixel point and the standard value "1".
Specifically, the process of the electronic device determining the detection parameter value corresponding to each pixel point of the M pixel points of the first image is as shown in fig. 2 in the first embodiment or the second embodiment, and details are not repeated here.
Step S603: rejecting detection parameter values smaller than a second threshold value from the detection parameter values of the M pixel points;
wherein the second threshold is less than the first threshold.
Specifically, compared with the first to third embodiments, in this embodiment, before selecting N target pixel points meeting the first condition from the M pixel points, the electronic device first rejects the detection parameter values smaller than the second threshold value, that is, the invalid detection parameter values, from the detection parameter values of the M pixel points, by combining the boundary mask map and the consistency mask map corresponding to the first image.
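Step S603 can be sketched as follows. This is an illustrative reading: the boundary mask map is assumed to be a fixed margin of `border` pixels (the patent does not specify its form), the consistency mask keeps values at or above the second threshold, and rejected values are set to -inf so that later selection skips them:

```python
import numpy as np

def reject_invalid(params, second_threshold, border=2):
    """Sketch of step S603: combine a boundary mask (assumed fixed
    margin) with a consistency mask (values >= second threshold) and
    reject everything else from the detection parameter values."""
    consistency_mask = params >= second_threshold
    boundary_mask = np.zeros_like(consistency_mask)
    boundary_mask[border:-border, border:-border] = True  # assumed boundary mask map
    out = params.astype(float).copy()
    out[~(consistency_mask & boundary_mask)] = -np.inf    # rejected detection values
    return out
```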
Step S604: selecting N target pixel points which meet a first condition from the M pixel points according to the residual detection parameter values;
wherein M is a positive integer greater than 1, and N is an integer less than or equal to M and greater than or equal to 0.
Step S605: and extracting the characteristic information of the N target pixel points.
Here, the feature information of the target pixel includes coordinate information, direction information, curvature information, and a pixel block within a specified range with the target pixel as a center point, and as shown in fig. 3, the feature information of the target pixels 31 and 32 is extracted.
Specifically, after selecting N target pixels meeting a first condition from the M pixels, the electronic device extracts feature information of the N target pixels to complete a detail extraction process for an operation object.
Therefore, according to the information processing method provided by this embodiment of the invention, during detail extraction for the operation object, the electronic device only needs to compute the detection parameter value corresponding to each pixel point, reject the detection parameter values smaller than the second threshold from those of the M pixel points, and then select the target pixel points according to the remaining detection parameter values. The computational complexity is therefore low, making the method suitable for low-performance embedded platforms. In addition, for high-noise fingerprint images, rejecting the detection parameter values smaller than the second threshold removes some poor-quality areas in advance.
Product example:
fig. 7 is a schematic diagram of a composition structure of an electronic device according to an embodiment of the present invention, and as shown in fig. 7, the electronic device includes an obtaining module 701, a determining module 702, a selecting module 703, and an extracting module 704;
the acquiring module 701 is configured to acquire a first image of an operation object; the first image comprises M pixel points;
here, the operation object includes an object for human body recognition such as a fingerprint or a human face; correspondingly, the first image for the operation object can be a fingerprint image or a human face image.
In practical applications, the manner in which the acquisition module 701 in the electronic device acquires the first image for the operation object may differ depending on the operation object. When the operation object is a fingerprint, the obtaining module 701 may be implemented by a fingerprint sensor, that is, the fingerprint image is obtained by the fingerprint sensor; when the operation object is a human face, the obtaining module 701 may be implemented by a camera, that is, the human face image is obtained through the camera.
Here, it should be added that, in practical applications, the electronic device may further include a filtering module; specifically, after the obtaining module obtains the first image of the operation object, the filtering module performs filtering processing, such as Gabor filtering, on the first image to obtain a filtered first image.
The determining module 702 is configured to determine a detection parameter value corresponding to each pixel point of the M pixel points of the first image;
The detection parameter value is the difference between the consistency parameter of the pixel point and a standard value, and is used to represent the inconsistency of the pixel point. In practical applications, the detection parameter value is obtained by subtracting the consistency parameter of the pixel point from the standard value "1".
Specifically, as shown in fig. 8, the determining module 702 includes a scalar calculating unit 7021, a consistency calculating unit 7022, and a difference value operating unit 7023; wherein,
the scalar calculation unit 7021 is configured to perform scalar calculation on each pixel point of the M pixel points of the first image to obtain a gradient corresponding to each pixel point;
Specifically, the scalar calculation unit 7021 performs scalar calculation on each pixel point of the M pixel points of the first image by using a Sobel operator to obtain the gradient in the horizontal direction and the gradient in the vertical direction of each pixel point, denoted gx and gy respectively, so that the gradient corresponding to each pixel point can be expressed as g = (gx, gy).
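A minimal sketch of this scalar-calculation step follows. The text specifies only "a Sobel operator", so the common 3×3 Sobel kernels and edge-replication border handling used here are assumptions:

```python
import numpy as np

def sobel_gradients(img):
    """Per-pixel horizontal and vertical gradients via 3x3 Sobel kernels.

    Border handling (edge replication) and the exact kernel normalization
    are assumptions; the patent only names the Sobel operator.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical kernel is the transpose of the horizontal one
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()  # horizontal gradient
            gy[i, j] = (win * ky).sum()  # vertical gradient
    return gx, gy
```

The explicit double loop keeps the sketch dependency-free; a production implementation would use a vectorized convolution instead.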
The consistency calculating unit 7022 is configured to perform consistency calculation on the gradient corresponding to each pixel point to obtain a consistency parameter corresponding to each pixel point;
Specifically, the consistency calculation unit 7022 first constructs a gradient squared tensor matrix G = g·gᵀ from the gradient g corresponding to each pixel point; then, the consistency parameter coherence of each pixel point is calculated by using the constructed gradient squared tensor matrix, and the calculation expression is as follows:

coherence = sqrt(a² + 4c²) / b

wherein a = gy·gy − gx·gx; b = gy·gy + gx·gx; c = gx·gy.
The difference operation unit 7023 is configured to perform difference operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
Specifically, the difference operation unit 7023 performs a difference operation on the consistency parameter corresponding to each pixel point and the standard value to obtain the detection parameter value corresponding to each pixel point, that is, discoherence = 1 − coherence.
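The consistency and difference operations might be sketched as follows. Averaging the gradient squared tensor over a small window is an assumption added for the sketch (a single pixel's tensor always yields coherence 1, so some local accumulation is needed for the measure to be informative), and the window size is illustrative:

```python
import numpy as np

def discoherence_map(gx, gy, win=3):
    """Detection parameter (1 - coherence) from the gradient squared tensor.

    Assumptions: the tensor entries are box-averaged over a win x win
    neighborhood (not stated in the patent), and b == 0 (no gradient
    energy) is treated as maximally inconsistent.
    """
    k = np.ones((win, win)) / (win * win)

    def smooth(a):
        # box-average with edge replication at the borders
        p = np.pad(a, win // 2, mode="edge")
        h, w = a.shape
        out = np.empty_like(a, dtype=float)
        for i in range(h):
            for j in range(w):
                out[i, j] = (p[i:i + win, j:j + win] * k).sum()
        return out

    gxx = smooth(gx * gx)
    gyy = smooth(gy * gy)
    gxy = smooth(gx * gy)
    a = gyy - gxx
    b = gyy + gxx
    c = gxy
    coherence = np.sqrt(a * a + 4 * c * c) / np.where(b == 0, 1, b)
    coherence[b == 0] = 0  # no gradient energy: treat as fully inconsistent
    return 1.0 - coherence
```

For a perfectly uniform gradient field the coherence is 1 everywhere, so the discoherence map is 0, matching the intuition that the detection parameter highlights inconsistent (e.g. noisy or minutia-bearing) regions.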
The selecting module 703 is configured to select, according to the detection parameter value, N target pixel points meeting a first condition from the M pixel points; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; m is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0;
the extracting module 704 is configured to extract feature information of the N target pixel points.
Here, the feature information of a target pixel point includes coordinate information, direction information, curvature information, and a pixel block within a specified range centered on the target pixel point; as shown in fig. 3, the feature information of the target pixel points 31 and 32 is extracted.
Specifically, after selecting N target pixels meeting a first condition from the M pixels, the extracting module 704 extracts feature information of the N target pixels to complete a detail extracting process for an operation object.
In an embodiment, as shown in fig. 9, the selecting module 703 includes a first selecting unit 7031 and a control unit 7032; wherein,
the first selecting unit 7031 is configured to obtain detection parameter values of the M-N pixel points, and select a pixel point corresponding to a maximum value among the detection parameter values as a target pixel point;
the control unit 7032 is configured to stop the obtaining process when the maximum value is less than or equal to the first threshold.
Specifically, the first selecting unit 7031 traverses the detection parameter values of the remaining M−N pixel points, finds the maximum value among them, and takes the pixel point corresponding to that maximum as a target pixel point; this traversal is repeated until the control unit 7032 determines that the maximum value is less than or equal to the first threshold, at which point the acquisition process of the first selecting unit 7031 stops.
In an embodiment, as shown in fig. 9, the selecting module 703 further includes a first deleting unit 7033; wherein,
the first deleting unit 7033 is configured to delete the detection parameter value corresponding to at least one pixel point within a preset range, with the target pixel point as a center.
The preset range may be a square area of 8 × 8 to 16 × 16 pixels centered on the target pixel point.
Specifically, after the pixel point corresponding to the maximum value in the selected detection parameter values is taken as a target pixel point, the first deleting unit 7033 deletes the detection parameter value corresponding to at least one pixel point in a preset range, with the target pixel point as a center.
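Combining the first selecting unit, the control unit, and the first deleting unit, the selection loop could look like the following sketch. The suppression half-width and the use of negative infinity to mark deleted values are assumptions; the patent only says the preset range may span 8 × 8 to 16 × 16 pixels:

```python
import numpy as np

def select_targets(disc, first_threshold, sup=8):
    """Greedy target selection with neighborhood deletion.

    Repeatedly take the pixel with the largest remaining detection value
    (first selecting unit), stop once that maximum drops to or below the
    first threshold (control unit), and delete the values in a square of
    side 2*sup + 1 around each pick (first deleting unit).  The half-width
    `sup` and the -inf sentinel are illustrative choices.
    """
    vals = disc.astype(float).copy()
    targets = []
    while True:
        idx = np.unravel_index(np.argmax(vals), vals.shape)
        if vals[idx] <= first_threshold:
            break  # control unit: stop the acquisition process
        targets.append(idx)
        # first deleting unit: suppress the preset range around the pick
        r0 = max(idx[0] - sup, 0)
        c0 = max(idx[1] - sup, 0)
        vals[r0:idx[0] + sup + 1, c0:idx[1] + sup + 1] = -np.inf
    return targets
```

Because each pick deletes its neighborhood, two nearby strong responses yield only one target, which is the point of the preset-range deletion step.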
In an embodiment, as shown in fig. 10, the selecting module 703 may further include a second rejecting unit 7034 and a second selecting unit 7035; wherein,
the second rejecting unit 7034 is configured to reject detection parameter values smaller than a second threshold from the detection parameter values of the M pixel points; the second threshold is less than the first threshold;
specifically, the second rejecting unit 7034 rejects, from the detection parameter values of the M pixel points, the detection parameter values smaller than the second threshold, that is, the invalid detection parameter values, by combining the boundary mask map and the consistency mask map corresponding to the first image.
The second selecting unit 7035 is configured to select, according to the remaining detection parameter values, N target pixel points meeting the first condition from the M pixel points.
The acquisition module 701 in the electronic device according to the embodiment of the present invention may be implemented by a fingerprint sensor or a camera provided in the electronic device; the determining module 702 may be implemented by a calculator capable of performing the scalar calculation, consistency calculation, and difference operation, or by a processor having these calculation functions; the selecting module 703 and the extracting module 704 may be implemented by a processor in the electronic device, or by dedicated logic circuits. For example, in practical applications, these modules may be implemented by a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), or a Field Programmable Gate Array (FPGA) located in the electronic device.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. An information processing method applied to an electronic device, the method comprising:
acquiring a first image for an operation object; the first image comprises M pixel points;
determining a detection parameter value corresponding to each pixel point in the M pixel points of the first image; the detection parameter value is a difference value between a consistency parameter of the gradient corresponding to the pixel point and a standard value;
selecting N target pixel points which meet a first condition from the M pixel points according to the detection parameter values; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; m is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0;
extracting feature information of the N target pixel points;
wherein, the selecting N target pixel points meeting a first condition from the M pixel points according to the detection parameter values comprises:
acquiring detection parameter values of M-N pixel points, and selecting a pixel point corresponding to the maximum value in the detection parameter values as a target pixel point;
stopping the acquisition process when the maximum value is less than or equal to the first threshold;
correspondingly, after the pixel point corresponding to the maximum value in the selected detection parameter values is taken as a target pixel point, the method further comprises the following steps:
and deleting the detection parameter value corresponding to at least one pixel point in a preset range by taking the target pixel point as a center.
2. The method of claim 1, wherein said determining a detection parameter value corresponding to each pixel point of said M pixel points of said first image comprises:
scalar quantity calculation is carried out on each pixel point in the M pixel points of the first image to obtain the corresponding gradient of each pixel point;
carrying out consistency calculation on the corresponding gradient of each pixel point to obtain a consistency parameter corresponding to each pixel point;
and performing difference operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
3. The method according to claim 1, wherein said selecting N target pixels meeting a first condition from the M pixels according to the detection parameter value comprises:
rejecting detection parameter values smaller than a second threshold value from the detection parameter values of the M pixel points; the second threshold is less than the first threshold;
and selecting N target pixel points meeting a first condition from the M pixel points according to the residual detection parameter values.
4. An electronic device is characterized by comprising an acquisition module, a determination module, a selection module and an extraction module;
the acquisition module is used for acquiring a first image aiming at an operation object; the first image comprises M pixel points;
the determining module is configured to determine a detection parameter value corresponding to each pixel point of the M pixel points of the first image; the detection parameter value is a difference value between a consistency parameter of the gradient corresponding to the pixel point and a standard value;
the selecting module is used for selecting N target pixel points meeting a first condition from the M pixel points according to the detection parameter values; the first condition is that the detection parameter value corresponding to the pixel point is larger than a first threshold value; m is a positive integer greater than 1, N is an integer less than or equal to M and greater than or equal to 0;
the extraction module is used for extracting the characteristic information of the N target pixel points;
the selection module comprises a first selection unit and a control unit; wherein,
the first selection unit is used for acquiring detection parameter values of M-N pixel points and selecting the pixel point corresponding to the maximum value in the detection parameter values as a target pixel point;
the control unit is used for stopping the acquisition process when the maximum value is less than or equal to the first threshold value;
correspondingly, the selection module further comprises a first deletion unit; wherein,
the first deleting unit is used for deleting the detection parameter value corresponding to at least one pixel point in a preset range by taking the target pixel point as a center.
5. The electronic device of claim 4, wherein the determination module comprises a scalar calculation unit, a consistency calculation unit, and a difference operation unit; wherein,
the scalar calculation unit is used for carrying out scalar calculation on each pixel point in the M pixel points of the first image to obtain the corresponding gradient of each pixel point;
the consistency calculation unit is used for performing consistency calculation on the gradient corresponding to each pixel point to obtain a consistency parameter corresponding to each pixel point;
and the difference value operation unit is used for carrying out difference value operation on the consistency parameter corresponding to each pixel point and the standard value to obtain a detection parameter value corresponding to each pixel point.
6. The electronic device of claim 4, wherein the selection module comprises a second deletion unit and a second selection unit; wherein,
the second deletion unit is used for rejecting the detection parameter values smaller than a second threshold value from the detection parameter values of the M pixel points; the second threshold is less than the first threshold;
and the second selection unit is used for selecting N target pixel points meeting the first condition from the M pixel points according to the residual detection parameter values.
CN201510320224.5A 2015-06-11 2015-06-11 Information processing method and electronic equipment Active CN106295483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510320224.5A CN106295483B (en) 2015-06-11 2015-06-11 Information processing method and electronic equipment


Publications (2)

Publication Number Publication Date
CN106295483A CN106295483A (en) 2017-01-04
CN106295483B true CN106295483B (en) 2020-02-21

Family

ID=57660247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510320224.5A Active CN106295483B (en) 2015-06-11 2015-06-11 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN106295483B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6107953A (en) * 1999-03-10 2000-08-22 Veridian Erim International, Inc. Minimum-gradient-path phase unwrapping
CN101079102A (en) * 2007-06-28 2007-11-28 中南大学 Fingerprint identification method based on statistic method
CN101414351A (en) * 2008-11-03 2009-04-22 章毅 Fingerprint recognition system and control method
CN102542278A (en) * 2012-01-16 2012-07-04 北方工业大学 Adaptive characteristic point extraction and image matching based on discrete wavelet transformation (DWT)
CN103020609A (en) * 2012-12-30 2013-04-03 上海师范大学 Complex fiber image recognition method



Similar Documents

Publication Publication Date Title
CN110084161B (en) Method and system for rapidly detecting key points of human skeleton
AU2016101504A4 (en) Method for edge detection
CN106447677A (en) Image processing method and device
CN111080670B (en) Image extraction method, device, equipment and storage medium
CN110211086B (en) Image segmentation method, device and storage medium
US10713797B2 (en) Image processing including superimposed first and second mask images
CN110084238B (en) Finger vein image segmentation method and device based on LadderNet network and storage medium
CN114298992B (en) Video frame deduplication method, device, electronic device and storage medium
CN110348425B (en) Method, apparatus, device and computer-readable storage medium for deshading
WO2020173024A1 (en) Multi-gesture precise segmentation method for smart home scenario
CN107256543B (en) Image processing method, image processing device, electronic equipment and storage medium
CN107221005B (en) Object detection method and device
CN116580028B (en) Object surface defect detection method, device, equipment and storage medium
CN106650615A (en) Image processing method and terminal
CN114170653A (en) A face feature extraction method and device, terminal device and storage medium
CN119313609A (en) Underwater concrete crack detection method and device based on image processing
CN106295483B (en) Information processing method and electronic equipment
JP2016110341A (en) Image processing device, image processing method and program
JP7421273B2 (en) Image processing device and its control method and program
CN108491820B (en) Method, device and equipment for identifying limb representation information in image and storage medium
CN115661322B (en) Facial texture image generation method and device
CN113628148B (en) Infrared image noise reduction method and device
CN116843647A (en) Determination of lung field area, lung development assessment methods and devices, electronic equipment and media
CN115147906A (en) A feature extraction method, device, device and storage medium
WO2022258013A1 (en) Image processing method and apparatus, electronic device and readable storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant