CN113409035A - Face recognition analysis method applied to block chain payment and big data platform - Google Patents


Info

Publication number
CN113409035A
CN113409035A (application CN202110554393.0A)
Authority
CN
China
Prior art keywords
payment
payment verification
identification data
verification time
face identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110554393.0A
Other languages
Chinese (zh)
Inventor
葛云霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202110554393.0A
Publication of CN113409035A
Legal status: Withdrawn


Classifications

    • G06Q 20/065 — Payment architectures, schemes or protocols; payment circuits; private payment circuits, e.g. involving electronic currency used among participants of a common payment scheme, using e-cash
    • G06F 18/23 — Pattern recognition; analysing; clustering techniques
    • G06N 3/02 — Computing arrangements based on biological models; neural networks
    • G06Q 20/382 — Payment protocols insuring higher security of transaction
    • G06Q 20/40145 — Authorisation; transaction verification; identity check for transactions; biometric identity checks

Abstract

The application relates to a face recognition analysis method and a big data platform applied to blockchain payment. The method analyzes the data update tracks of authorized face data at different update time steps, thereby determining the surrounding-scene image data and whether the permission interface of the blockchain payment terminal is open or closed, and computes a payment environment detection coefficient accurately and in real time. By comparing the payment environment detection coefficient against a set detection coefficient, the method performs a global analysis and judgment of the payment environment security index, accurately and reliably determines whether the blockchain payment terminal is at risk of information leakage or erroneous payment, and thereby ensures the information security of face-scanning payment.

Description

Face recognition analysis method applied to block chain payment and big data platform
This application is a divisional of application No. CN202011285496.3, filed on November 17, 2020, and entitled "Data processing method and big data platform based on block chain payment and face recognition".
Technical Field
The application relates to the technical field of blockchain payment and face recognition, and in particular to a face recognition analysis method and a big data platform applied to blockchain payment.
Background
Compared with traditional payment methods, blockchain payment ensures that each payment cannot be tampered with, protecting the rights and interests of both payer and payee. With the rapid development of cloud computing, big data and artificial intelligence, "face-scanning" payment has gradually come into everyday use, and it is faster and more convenient than code-scanning or password payment. However, when a payment terminal is used for face-scanning payment, it is critical to ensure the information security of the terminal and to avoid erroneous operation of the terminal.
Disclosure of Invention
A first aspect of the present application discloses a face recognition analysis method applied to blockchain payment, comprising:
acquiring first face identification data corresponding to each of N payment verification periods, wherein the first face identification data comprise a face feature set, environmental feature noise and a to-be-processed portrait label; the face feature set identifies the payment verification period, the environmental feature noise indicates the recognition accuracy of the period, the to-be-processed portrait label indicates the portrait intensity within the period, and N is an integer greater than 1;
generating a feature processing list corresponding to each payment verification period based on the first face identification data of that period, wherein the feature processing list is used to perform noise removal and feature matching on second face identification data, and the feature processing lists correspond one-to-one with the payment verification periods;
processing the second face identification data of each payment verification period with the feature processing list of that period to obtain third face identification data for the period, wherein the feature processing lists, the second face identification data and the third face identification data correspond one-to-one;
generating a current payment verification result based on the third face identification data of each payment verification period, and determining from the current payment verification result whether the payment environment security index of the blockchain payment terminal satisfies a set condition.
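The four steps of the first aspect can be sketched, purely for illustration, as the following Python outline; all identifiers, data shapes and thresholds are invented here and are not disclosed by the application.

```python
from dataclasses import dataclass

@dataclass
class FaceIdData:            # stands in for "first face identification data"
    feature_set: list        # identifies the payment verification period
    env_noise: float         # indicates recognition accuracy for the period
    portrait_label: str      # indicates portrait intensity within the period

def build_processing_list(d: FaceIdData) -> list:
    """Step 2: derive a feature processing list for one verification period."""
    ops = []
    if d.env_noise > 0.5:        # invented threshold
        ops.append("denoise")    # noise removal on second face id data
    ops.append("feature_match")  # feature matching on second face id data
    return ops

def apply_processing(second_data: list, ops: list) -> list:
    """Step 3: produce third face identification data from the second."""
    if "denoise" in ops:
        return [x for x in second_data if x is not None]
    return second_data

def verify(third_data: list) -> bool:
    """Step 4: toy stand-in for the 'payment verification index condition'."""
    return len(third_data) >= 2

periods = [FaceIdData([0.1, 0.2], 0.7, "static"),
           FaceIdData([0.3, 0.4], 0.2, "dynamic")]   # N = 2 periods
second = [[0.1, None, 0.2], [0.3, 0.4]]              # second face id data
results = [verify(apply_processing(s, build_processing_list(d)))
           for d, s in zip(periods, second)]
print(results)  # → [True, True]
```

The one-to-one correspondence between periods, processing lists, and second/third data is what the `zip` models; everything else is a placeholder.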
Preferably, the acquiring first face identification data corresponding to each payment verification period in the N payment verification periods includes:
performing payment popularity analysis on each payment verification time period in the N payment verification time periods to obtain a payment popularity analysis result corresponding to each payment verification time period;
determining a to-be-processed portrait label corresponding to each payment verification time period based on the payment popularity analysis result corresponding to each payment verification time period;
determining payment behavior data corresponding to each payment verification time period based on the payment popularity analysis result corresponding to each payment verification time period; acquiring a face feature set corresponding to each payment verification time period and environmental feature noise corresponding to each payment verification time period;
and generating first face identification data corresponding to each payment verification time period based on the portrait label to be processed corresponding to each payment verification time period, the payment behavior data corresponding to each payment verification time period, the face feature set corresponding to each payment verification time period and the environment feature noise corresponding to each payment verification time period.
Preferably, the determining the to-be-processed portrait label corresponding to each payment verification period based on the payment popularity analysis result corresponding to each payment verification period includes:
for any one payment verification time period in the N payment verification time periods, if the payment popularity analysis result indicates that a detectable portrait frame exists in the payment verification time period, determining a first payment popularity tag as a portrait tag to be processed; the first payment popularity tag is a static tag corresponding to the payment popularity analysis result;
for any one payment verification time period in the N payment verification time periods, if the payment popularity analysis result indicates that no detectable portrait frame exists in the payment verification time period, determining a second payment popularity tag as a portrait tag to be processed; the second payment popularity tag is a dynamic tag corresponding to the payment popularity analysis result;
for any one payment verification time period in the N payment verification time periods, if the payment popularity analysis result indicates that a detectable non-portrait image frame exists in the payment verification time period, determining a third payment popularity label as a portrait label to be processed; the third payment popularity tag is an updatable tag corresponding to the payment popularity analysis result;
the determining the payment behavior data corresponding to each payment verification period based on the payment popularity analysis result corresponding to each payment verification period comprises:
for any payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that a popularity event with a payment operation behavior record exists in the period, determining first operation behavior data as the payment behavior data, wherein the first operation behavior data are obtained from the popularity event with the payment operation behavior record;
for any payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that a popularity event without a payment operation behavior record exists in the period, determining second operation behavior data as the payment behavior data, wherein the second operation behavior data are obtained from the event log corresponding to that popularity event;
for any payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that no detectable portrait frame exists in the period, determining third operation behavior data as the payment behavior data, wherein the third operation behavior data are obtained from the distribution features of the payment popularity distribution diagram corresponding to the payment popularity analysis result;
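The three-way case analyses above amount to two lookup tables, one for the to-be-processed label and one for the source of the payment behavior data. The sketch below uses invented keys, since the application gives no concrete encodings.

```python
def portrait_label(analysis_result: str) -> str:
    """Map a payment popularity analysis result to a to-be-processed label."""
    return {"portrait_frame":     "static",     # first payment popularity tag
            "no_portrait_frame":  "dynamic",    # second payment popularity tag
            "non_portrait_frame": "updatable",  # third payment popularity tag
            }[analysis_result]

def behavior_data_source(analysis_result: str) -> str:
    """Map the same result to where the payment behavior data come from."""
    return {"event_with_record":    "operation_record",        # first data
            "event_without_record": "event_log",               # second data
            "no_portrait_frame":    "popularity_distribution", # third data
            }[analysis_result]

print(portrait_label("no_portrait_frame"),
      behavior_data_source("no_portrait_frame"))  # → dynamic popularity_distribution
```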
generating a feature processing list corresponding to each payment verification time period based on the first facial identification data corresponding to each payment verification time period, including:
for any one payment verification period in the N payment verification periods, if the to-be-processed portrait label is used for indicating that no detectable portrait frame exists in the payment verification period, generating a first feature processing list, wherein the first feature processing list belongs to the feature processing list, and the first feature processing list is used for performing noise removal on second face identification data;
for any payment verification period among the N payment verification periods, if the to-be-processed portrait label indicates that a popularity event without a payment operation behavior record exists in the period, generating the first feature processing list;
for any payment verification period among the N payment verification periods, if the to-be-processed portrait label indicates that a popularity event with a payment operation behavior record exists in the period, generating a second feature processing list, wherein the second feature processing list belongs to the feature processing lists and is used to perform feature matching on the second face identification data;
for any payment verification period among the N payment verification periods, if the to-be-processed portrait label indicates that a popularity event exists in the period and indicates the time-sequence weight of the payment operation behavior record of that popularity event, generating the first feature processing list or the second feature processing list based on original facial features, wherein the original facial features are determined from images collected by a rear camera of the blockchain payment terminal.
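Read together, these rules select one of two processing lists per period. A minimal sketch follows, with invented label strings; the rear-camera/original-features case is folded into a flag because the application does not detail it.

```python
def choose_feature_processing_list(label: str, use_original_features: bool = False):
    """Return which feature processing list the period's label calls for.
    'first' performs noise removal; 'second' performs feature matching."""
    rules = {"no_portrait_frame":    "first",
             "event_without_record": "first",
             "event_with_record":    "second"}
    chosen = rules[label]
    # time-sequence-weight case: either list may be generated from original
    # facial features (rear-camera images) -- modeled here as a simple tag
    if use_original_features:
        return (chosen, "original_features")
    return (chosen, None)

print(choose_feature_processing_list("event_with_record"))  # → ('second', None)
```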
Preferably,
generating a feature processing list corresponding to each payment verification time period based on the first facial identification data corresponding to each payment verification time period, including: for any one payment verification period in the N payment verification periods, if the to-be-processed portrait label is used for indicating that no detectable portrait frame exists in the payment verification period, generating a first feature processing list, wherein the first feature processing list belongs to the feature processing list, and the first feature processing list is used for performing noise removal on second face identification data; for any one payment verification time period in the N payment verification time periods, if the to-be-processed portrait label is used for indicating that a detectable face image frame exists in the payment verification time period, generating a second feature processing list, wherein the second feature processing list belongs to the feature processing list, and the second feature processing list is used for performing feature matching on second face identification data;
or
Generating a feature processing list corresponding to each payment verification time period based on the first facial identification data corresponding to each payment verification time period, including: generating a feature processing list corresponding to each payment verification time period by adopting a preset data processing thread based on the first face identification data corresponding to each payment verification time period, wherein the preset data processing thread is a behavior feature processing thread, a time sequence feature processing thread or an artificial intelligence-based multi-dimensional feature clustering thread;
wherein, the processing the second facial identification data corresponding to each payment verification time interval by using the feature processing list corresponding to each payment verification time interval to obtain the third facial identification data corresponding to each payment verification time interval includes:
if the preset data processing thread is the behavior feature processing thread, processing second face identification data corresponding to each payment verification time period by adopting the behavior feature processing thread based on a feature processing list corresponding to each payment verification time period to obtain third face identification data corresponding to each payment verification time period;
if the preset data processing thread is the time sequence feature processing thread, processing second face identification data corresponding to each payment verification time period by adopting the time sequence feature processing thread on the basis of a feature processing list corresponding to each payment verification time period to obtain third face identification data corresponding to each payment verification time period;
and if the preset data processing thread is the artificial intelligence based multi-dimensional feature clustering thread, processing second face identification data corresponding to each payment verification time interval by adopting the artificial intelligence based multi-dimensional feature clustering thread on the basis of a feature processing list corresponding to each payment verification time interval to obtain third face identification data corresponding to each payment verification time interval.
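The dispatch among the three preset data processing threads can be sketched as follows; the thread bodies are toy stand-ins, since the application does not specify how each thread processes the data.

```python
def behavior_thread(data):      # behavior feature processing (stand-in)
    return sorted(data)

def time_sequence_thread(data): # time sequence feature processing (stand-in)
    return list(reversed(data))

def clustering_thread(data):    # AI-based multi-dimensional clustering (stand-in)
    return sorted(set(data))    # collapse "clusters" of equal values

THREADS = {"behavior": behavior_thread,
           "time_sequence": time_sequence_thread,
           "clustering": clustering_thread}

def third_face_id_data(second_data, preset_thread: str):
    """Process one period's second face identification data into third data,
    using whichever preset data processing thread is configured."""
    return THREADS[preset_thread](second_data)

print(third_face_id_data([3, 1, 2, 2], "clustering"))  # → [1, 2, 3]
```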
Preferably, the generating a current payment verification result based on the third face identification data corresponding to each payment verification period includes:
determining a facial feature correlation matrix corresponding to each payment verification time period based on third facial identification data corresponding to each payment verification time period, wherein the facial feature correlation matrix is a facial feature correlation matrix of the third facial identification data under a preset correlation weight threshold value;
determining face description information corresponding to each payment verification time period based on the face feature correlation matrix corresponding to each payment verification time period;
determining face identification error distribution corresponding to each payment verification time period based on the face description information corresponding to each payment verification time period, wherein the face identification error distribution is a dynamic error of the third face identification data under a preset relevance weight threshold;
determining target third face identification data corresponding to each payment verification time period based on the face identification error distribution corresponding to each payment verification time period and the third face identification data corresponding to each payment verification time period;
determining a current payment verification result corresponding to each payment verification time period based on the target third face identification data corresponding to each payment verification time period;
wherein the determining the target third face identification data corresponding to each payment verification period based on the face identification error distribution corresponding to each payment verification period and the third face identification data corresponding to each payment verification period comprises: determining to-be-processed third face identification data corresponding to each payment verification time period based on the face identification error distribution corresponding to each payment verification time period and the third face identification data corresponding to each payment verification time period; performing redundant data elimination processing on the third face identification data to be processed corresponding to each payment verification time interval to obtain target third face identification data corresponding to each payment verification time interval;
wherein the generating a current payment verification result based on the third face identification data corresponding to each payment verification period comprises:
for any one payment verification time interval in the N payment verification time intervals, if the target third face identification data meets a payment verification index condition, generating a first current payment verification result, wherein the first current payment verification result belongs to the current payment verification result, and the first current payment verification result indicates that the target third face identification data is authorized face data;
and generating a second current payment verification result if the target third face identification data does not meet the payment verification index condition aiming at any one payment verification time interval in the N payment verification time intervals, wherein the second current payment verification result belongs to the current payment verification result, and the second current payment verification result indicates that the target third face identification data is unauthorized face data.
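The final check reduces to testing the target third data against a payment verification index condition. A hedged sketch, with an invented similarity measure and threshold (the application does not disclose its metric):

```python
def current_payment_verification_result(target_data, authorized_template,
                                        threshold=0.8):
    """Return 'first' (authorized face data) when the index condition holds,
    'second' (unauthorized face data) otherwise. Similarity here is the
    fraction of matching entries -- an illustrative assumption only."""
    matches = sum(a == b for a, b in zip(target_data, authorized_template))
    similarity = matches / max(len(authorized_template), 1)
    return "first" if similarity >= threshold else "second"

print(current_payment_verification_result([1, 2, 3, 4], [1, 2, 3, 4]))  # → first
```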
Preferably, determining from the current payment verification result whether the payment environment security index of the blockchain payment terminal satisfies the set condition includes:
if the current payment verification result is the first current payment verification result, acquiring a first data update track and a second data update track for the authorized face data, wherein the update time step of the second data update track is smaller than that of the first data update track;
determining surrounding-scene image data of the authorized face data according to the track node features of the second data update track, and acquiring dynamic track nodes of the authorized face data from the first data update track according to the surrounding-scene image data;
determining a matching rate between the node transfer information of the dynamic track nodes and each piece of to-be-matched transfer information in a preset transfer information set, wherein the preset transfer information set comprises a plurality of pieces of to-be-matched transfer information, each carrying a transfer safety factor that indicates whether the permission interface of the blockchain payment terminal is open or closed;
selecting K pieces of to-be-matched transfer information from the preset transfer information set based on those matching rates, wherein K is a positive integer greater than or equal to 1, and judging, based on the transfer safety factors of the K selected pieces, whether the permission interface of the blockchain payment terminal is open or closed;
when the permission interface of the blockchain payment terminal is judged to be open, determining the access request list received by the terminal and deriving the payment environment security index of the terminal from that list; extracting a dimension description weight for each item of index dimension information in the payment environment security index, and weighting the dimension description weights to obtain a payment environment detection coefficient; when the payment environment detection coefficient is greater than or equal to a set detection coefficient, judging that the payment environment security index satisfies the set condition; when it is smaller than the set detection coefficient, judging that the set condition is not satisfied;
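The coefficient step above is a weighting of per-dimension description weights followed by a threshold test. A worked sketch, with dimension names and numbers invented and a plain average standing in for the unspecified weighting scheme:

```python
def payment_env_detection_coefficient(dimension_weights: dict) -> float:
    """Combine the per-dimension description weights; the application leaves
    the exact weighting unspecified, so a plain average is used here."""
    return sum(dimension_weights.values()) / len(dimension_weights)

def index_satisfies_condition(dimension_weights: dict,
                              set_coefficient: float) -> bool:
    """Set condition holds iff the detection coefficient reaches the set one."""
    return payment_env_detection_coefficient(dimension_weights) >= set_coefficient

dims = {"access_frequency": 0.9, "request_origin": 0.7, "interface_state": 0.8}
print(index_satisfies_condition(dims, 0.75))  # (0.9+0.7+0.8)/3 = 0.8 ≥ 0.75 → True
```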
wherein selecting the K pieces of to-be-matched transfer information from the preset transfer information set based on the matching rates comprises: selecting, from the preset transfer information set, the K pieces of to-be-matched transfer information with the largest matching rates;
wherein judging, based on the transfer safety factors of the K pieces of to-be-matched transfer information, whether the permission interface of the blockchain payment terminal is open or closed includes: each transfer safety factor being either a first target coefficient, indicating that the permission interface is open, or a second target coefficient, indicating that it is closed, counting the numbers of first and second target coefficients among the K pieces; judging the interface state from the two counts; and determining a confidence weight for the judgment from the two counts;
wherein judging the interface state from the two counts includes: if the number of first target coefficients is larger than the number of second target coefficients, judging that the permission interface of the blockchain payment terminal is open; if it is smaller, judging that the permission interface is closed.
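The selection of the K best-matching pieces of transfer information and the count comparison amount to a top-K majority vote. The sketch below uses invented data shapes: a safety factor is "open" for the first target coefficient and "closed" for the second, and the matching rate is a toy character overlap.

```python
def judge_interface_state(candidates, node_info, k):
    """candidates: (transfer_info, safety_factor) pairs; returns the judged
    permission-interface state plus a confidence weight from the two counts."""
    def matching_rate(info):          # toy rate: characters shared with node info
        return len(set(info) & set(node_info))
    top_k = sorted(candidates, key=lambda c: matching_rate(c[0]),
                   reverse=True)[:k]
    n_open = sum(1 for _, f in top_k if f == "open")  # first target coefficients
    n_closed = k - n_open                             # second target coefficients
    # the claim defines only strict > and <; ties resolve to 'closed' here
    state = "open" if n_open > n_closed else "closed"
    return state, max(n_open, n_closed) / k           # confidence weight
```

For example, with candidates `[("abc", "open"), ("abd", "open"), ("xyz", "closed")]`, node info `"abc"` and `k = 3`, two of the three selected safety factors are "open", so the interface is judged open with confidence weight 2/3.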
Preferably, the method further comprises:
and if the current payment verification result is the second current payment verification result, judging that the payment environment safety index does not meet the set condition.
Preferably, if the payment environment security index does not satisfy the set condition, the method further includes:
detecting a payment request sent by the blockchain payment terminal; and
when the payment request is detected, intercepting the payment request and sending prompt information to the blockchain payment terminal, wherein the prompt information prompts the blockchain payment terminal to delay payment.
A second aspect of the present application discloses a big data platform, comprising a processing engine, a network module and a memory; the processing engine and the memory communicate via the network module, and the processing engine reads the computer program from the memory and runs it to perform the method of the first aspect.
A third aspect of the present application discloses a computer-readable signal medium having stored thereon a computer program which, when executed, implements the method of the first aspect.
Compared with the prior art, the face recognition analysis method and big data platform applied to blockchain payment provided by the embodiments of the application have the following technical effects. First, first face identification data are obtained for different payment verification periods; second, a feature processing list is generated for each period; third, the second face identification data of each period are processed to obtain third face identification data; finally, a current payment verification result is generated from the third face identification data, so that whether the payment environment security index of the blockchain payment terminal satisfies the set condition can be determined from that result. Because the face identification data recognized by the blockchain payment terminal are analyzed and processed multiple times, identification data that might interfere with payment verification can be effectively filtered out, the current payment verification result can be generated accurately and in real time from the optimized face identification data, and the influence of the external environment on that result is effectively reduced. Whether the blockchain payment terminal is at risk of information leakage or erroneous payment can thus be judged accurately and reliably, guaranteeing the information security of face-scanning payment.
In the description that follows, additional features will be set forth, in part, in the description. These features will be in part apparent to those skilled in the art upon examination of the following and the accompanying drawings, or may be learned by production or use. The features of the present application may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities and combinations particularly pointed out in the detailed examples that follow.
Drawings
To illustrate the technical solutions of the embodiments of the application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the application and therefore should not be considered limiting of its scope; those skilled in the art can derive other related drawings from them without inventive effort.
The methods, systems, and/or processes of the figures are further described in accordance with the exemplary embodiments. These exemplary embodiments will be described in detail with reference to the drawings. These exemplary embodiments are non-limiting exemplary embodiments in which reference numerals represent similar mechanisms throughout the various views of the drawings.
Fig. 1 is a block diagram illustrating an exemplary facial recognition analysis system applied to blockchain payments, according to some embodiments of the present application.
FIG. 2 is a diagram illustrating the hardware and software components of an exemplary big data platform according to some embodiments of the present application.
Fig. 3 is a flow diagram illustrating an exemplary facial recognition analysis method and/or process applied to blockchain payments, according to some embodiments of the present application.
Fig. 4 is a block diagram illustrating an exemplary facial recognition analysis device applied to blockchain payments, according to some embodiments of the present application.
Detailed Description
In order to better understand the technical solutions, they are described in detail below with reference to the drawings and specific embodiments. It should be understood that the specific features in the embodiments and examples of the present application are detailed descriptions of the technical solutions rather than limitations of them, and the technical features in the embodiments and examples may be combined with each other where no conflict arises.
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant guidance. It will be apparent, however, to one skilled in the art that the present application may be practiced without these specific details. In other instances, well-known methods, procedures, systems, compositions, and/or circuits have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present application.
These and other features, functions, methods of operation, and combinations of functions of related elements of structure, together with economies of manufacture, may become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form a part of this application. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the application, and that the drawings are not to scale.
Flowcharts are used herein to illustrate the operations performed by systems according to embodiments of the present application. It should be expressly understood that the operations of a flowchart need not be performed in the order shown; they may instead be performed in reverse order or simultaneously. In addition, one or more other operations may be added to a flowchart, and one or more operations may be deleted from it.
Fig. 1 is a block diagram illustrating an exemplary facial recognition analysis system 300 for blockchain payments according to some embodiments of the present application, where the facial recognition analysis system 300 for blockchain payments may include a big data platform 100 and a blockchain payment terminal 200.
In some embodiments, as shown in FIG. 2, big data platform 100 may include a processing engine 110, a network module 120, and a memory 130, processing engine 110 and memory 130 communicating through network module 120.
Processing engine 110 may process relevant information and/or data to perform one or more of the functions described herein. In some embodiments, processing engine 110 may include at least one processing engine (e.g., a single-core processing engine or a multi-core processor). By way of example only, the processing engine 110 may include a Central Processing Unit (CPU), an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
Network module 120 may facilitate the exchange of information and/or data. In some embodiments, the network module 120 may be any type of wired or wireless network, or a combination thereof. Merely by way of example, the network module 120 may include a cable network, a wired network, a fiber-optic network, a telecommunications network, an intranet, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a wireless personal area network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network module 120 may include at least one network access point, such as a wired or wireless access point, including base stations.
The memory 130 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), or the like. The memory 130 is used for storing a program, and the processing engine 110 executes the program after receiving an execution instruction.
It will be appreciated that the configuration shown in FIG. 2 is merely illustrative, and that big data platform 100 may include more or fewer components than shown in FIG. 2, or have a different configuration from that shown in FIG. 2. The components shown in FIG. 2 may be implemented in hardware, software, or a combination thereof.
Fig. 3 is a flowchart illustrating an exemplary method and/or process for facial recognition analysis applied to blockchain payments, which is applied to the big data platform in fig. 1, according to some embodiments of the present disclosure, and may specifically include the contents described in the following steps S11-S14.
Step S11, acquiring first face identification data corresponding to each payment verification period within the N payment verification periods.
For example, the first face identification data includes a facial feature set for identifying a payment verification period, environmental feature noise indicating the recognition accuracy of the payment verification period, and a to-be-processed portrait tag indicating the portrait intensity level within the payment verification period, where N is an integer greater than 1. The first face identification data is the face identification data corresponding to the blockchain payment terminal. The N payment verification periods are periods up to and including the current verification period.
Step S12, generating a feature processing list corresponding to each payment verification period based on the first face identification data corresponding to that payment verification period.
For example, the feature processing list is used for performing noise removal and feature matching on the second face identification data, and the feature processing list has a one-to-one correspondence relationship with the payment verification period.
Step S13, processing the second face identification data corresponding to each payment verification period by using the feature processing list corresponding to that payment verification period, to obtain third face identification data corresponding to each payment verification period.
For example, the feature processing list, the second face identification data, and the third face identification data have a one-to-one correspondence relationship.
Step S14, generating a current payment verification result based on the third face identification data corresponding to each payment verification period, and determining, according to the current payment verification result, whether the payment environment safety index of the blockchain payment terminal meets a set condition.
For example, the current payment verification result is the verification result obtained before the blockchain payment terminal is ready to make a payment in the current period, and the payment environment safety index is used to judge whether the blockchain payment terminal is at risk of information leakage or erroneous payment.
In the above example, the first face identification data includes second face identification data, and the second face identification data includes third face identification data. That is, the second face identification data is obtained by filtering the first face identification data, and then the third face identification data is obtained by filtering the second face identification data. Thus, multi-level noise removal is realized to refine the face recognition data.
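The nested relationship above (first ⊇ second ⊇ third) amounts to two successive filtering passes. The following sketch illustrates that idea only; the record layout, field names, and thresholds are assumptions for demonstration and are not fixed by the method:

```python
# Two-stage refinement sketch for the nesting first ⊇ second ⊇ third.
# Record layout and thresholds are illustrative assumptions.
def remove_noise(records, max_noise=0.5):
    """First pass (noise removal): drop high-noise environment records."""
    return [r for r in records if r["env_noise"] <= max_noise]

def match_features(records, min_match=0.6):
    """Second pass (feature matching): keep well-matched portraits."""
    return [r for r in records if r["match"] >= min_match]

first_data = [
    {"id": "user_A",     "env_noise": 0.1, "match": 0.95},
    {"id": "passerby_B", "env_noise": 0.2, "match": 0.30},
    {"id": "scene_C",    "env_noise": 0.9, "match": 0.05},
]
second_data = remove_noise(first_data)    # drops the environment record
third_data = match_features(second_data)  # keeps only the user record
print([r["id"] for r in third_data])  # ['user_A']
```

Each pass only ever narrows the previous set, which is what guarantees the subset chain described above.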
For convenience of explanation, the above scheme is described below as a complete example.
Suppose user A wants to make a face-brushing payment through a blockchain payment terminal. The big data platform obtains first face identification data through the blockchain payment terminal (including a user image A, a passerby image B, and an environment image C). The big data platform then eliminates the identification data corresponding to part of the passerby image B and all of the environment image C, thereby obtaining third face identification data corresponding to the simplified user image A plus passerby image B1. On this basis, whether the blockchain payment terminal is at risk of information leakage or erroneous payment can be judged accurately and reliably from the third face identification data, safeguarding the information security of face-brushing payment: when user A makes a face-brushing payment, passersby around user A can neither steal user A's face-brushing image nor be mistakenly identified by the blockchain payment terminal.
When the contents described in steps S11 to S14 above are applied, the first face identification data of different payment verification periods are obtained first, and a feature processing list is then generated for each payment verification period. Next, the second face identification data corresponding to each payment verification period are processed to obtain the third face identification data corresponding to that period, and finally the current payment verification result is generated based on the third face identification data, so that whether the payment environment safety index of the blockchain payment terminal meets the set condition can be determined from the current payment verification result. Because the face identification data identified by the blockchain payment terminal are analyzed and processed multiple times, identification data that might interfere with payment verification can be effectively filtered out, the current payment verification result can be generated accurately and in real time from the optimized face identification data, and the influence of the external environment on the current payment verification result is effectively reduced. Whether the blockchain payment terminal is at risk of information leakage or erroneous payment can therefore be judged accurately and reliably based on the face identification data, safeguarding the information security of face-brushing payment.
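The four steps above can be summarized as a short pipeline. Only the step ordering below comes from the method; every helper body, score, and threshold is a stand-in assumption:

```python
# Minimal pipeline sketch of steps S11-S14. All helper logic is an
# illustrative assumption; only the four-step ordering follows the text.
def acquire_first_data(period):
    return [0.2, 0.6, 0.9]                   # stand-in identification scores

def build_processing_list(first):
    return [x for x in first if x >= 0.5]    # stand-in noise filter

def verify_payment_environment(periods, threshold=0.5):
    results = {}
    for p in periods:
        first = acquire_first_data(p)                  # step S11
        proc_list = build_processing_list(first)       # step S12
        third = [f for f in first if f in proc_list]   # step S13 (filtering)
        results[p] = sum(third) / max(len(third), 1)   # step S14 (score)
    # set condition: every period's score must reach the threshold
    return all(score >= threshold for score in results.values())

print(verify_payment_environment([1, 2]))  # True
```

The final boolean plays the role of "the payment environment safety index meets the set condition".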
In some examples, in order to acquire the first face identification data completely, the acquisition of the first face identification data corresponding to each of the N payment verification periods described in step S11 may exemplarily include the following steps S111 to S114.
Step S111, performing payment popularity analysis on each payment verification period among the N payment verification periods to obtain a payment popularity analysis result corresponding to each payment verification period.
Step S112, determining the to-be-processed portrait tag corresponding to each payment verification period based on the payment popularity analysis result corresponding to that payment verification period.
Step S113, determining payment behavior data corresponding to each payment verification period based on the payment popularity analysis result corresponding to that payment verification period, and acquiring the facial feature set and the environmental feature noise corresponding to each payment verification period.
Step S114, generating first face identification data corresponding to each payment verification period based on the to-be-processed portrait tag, the payment behavior data, the facial feature set, and the environmental feature noise corresponding to that payment verification period.
In this way, by applying steps S111 to S114 above, payment popularity analysis can be performed on each payment verification period so that the payment behavior data are taken into account. This ensures that the first face identification data completely contain the facial feature set, the environmental feature noise, and the to-be-processed portrait tag, thereby determining the first face identification data completely.
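Steps S111 to S114 can be read as assembling one record per verification period from four components. The following sketch uses an assumed data layout; the field names, the parity-based stand-in analysis, and the tag/behavior rules are illustrative only:

```python
from dataclasses import dataclass

# Illustrative container for the components named in steps S111-S114.
# All field names and helper rules are assumptions for demonstration.
@dataclass
class FirstFaceIdData:
    portrait_tag: str      # to-be-processed portrait tag (step S112)
    behavior_data: str     # payment behavior data (step S113)
    feature_set: list      # facial feature set (step S113)
    env_noise: float       # environmental feature noise (step S113)

def analyze_popularity(period):
    """Step S111: stand-in popularity analysis (here: parity of the period)."""
    return {"has_portrait_frame": period % 2 == 0}

def build_first_face_id_data(periods, features, noise):
    """Step S114: combine the per-period components into one record each."""
    records = {}
    for p in periods:
        analysis = analyze_popularity(p)
        tag = "static" if analysis["has_portrait_frame"] else "dynamic"
        behavior = "recorded" if analysis["has_portrait_frame"] else "inferred"
        records[p] = FirstFaceIdData(tag, behavior, features[p], noise[p])
    return records

records = build_first_face_id_data(
    periods=[1, 2],
    features={1: ["f1"], 2: ["f2"]},
    noise={1: 0.2, 2: 0.4},
)
print(records[2].portrait_tag)  # static
```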
Further, the determination in step S112 of the to-be-processed portrait tag corresponding to each payment verification period, based on the payment popularity analysis result corresponding to each payment verification period, may include the following steps S1121 to S1123.
Step S1121, for any one payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that a detectable portrait frame exists in the payment verification period, determining a first payment popularity tag as the to-be-processed portrait tag; the first payment popularity tag is a static tag corresponding to the payment popularity analysis result.
Step S1122, for any one payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that no detectable portrait frame exists in the payment verification period, determining a second payment popularity tag as the to-be-processed portrait tag; the second payment popularity tag is a dynamic tag corresponding to the payment popularity analysis result.
Step S1123, for any one payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that a detectable non-portrait image frame exists in the payment verification period, determining a third payment popularity tag as the to-be-processed portrait tag; the third payment popularity tag is an updatable tag corresponding to the payment popularity analysis result.
It can be understood that, when the contents described in steps S1121 to S1123 above are applied, different payment popularity tags can be determined as the to-be-processed portrait tag depending on whether the payment popularity analysis result indicates a detectable portrait image frame or a detectable non-portrait image frame. The to-be-processed portrait tag can thus be determined flexibly, preventing it from being missing in different environmental scenes.
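The three branches of steps S1121 to S1123 can be sketched as a small selection function. The constant names below are assumptions; the patent fixes only the three branches, not the representation:

```python
# Illustrative mapping from a popularity analysis result to the
# to-be-processed portrait tag of steps S1121-S1123.
STATIC_TAG = "first_payment_popularity_tag"      # S1121: portrait frame present
DYNAMIC_TAG = "second_payment_popularity_tag"    # S1122: no detectable frame
UPDATABLE_TAG = "third_payment_popularity_tag"   # S1123: non-portrait frame

def select_portrait_tag(has_portrait_frame, has_non_portrait_frame):
    if has_portrait_frame:
        return STATIC_TAG
    if has_non_portrait_frame:
        return UPDATABLE_TAG
    return DYNAMIC_TAG

print(select_portrait_tag(True, False))   # first_payment_popularity_tag
```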
Further, the determination in step S113 of the payment behavior data corresponding to each payment verification period, based on the payment popularity analysis result corresponding to each payment verification period, may be implemented as described in steps S1131 to S1133 below.
Step S1131, for any one payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that a popularity event with a payment operation behavior record exists in the payment verification period, determining first operation behavior data as the payment behavior data; the first operation behavior data are obtained from the popularity event with the payment operation behavior record.
Step S1132, for any one payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that a popularity event exists in the payment verification period but the popularity event has no payment operation behavior record, determining second operation behavior data as the payment behavior data; the second operation behavior data are obtained from an event log corresponding to the popularity event without a payment operation behavior record.
Step S1133, for any one payment verification period among the N payment verification periods, if the payment popularity analysis result indicates that no detectable portrait image frame exists in the payment verification period, determining third operation behavior data as the payment behavior data; the third operation behavior data are obtained from the distribution characteristics of the payment popularity distribution diagram corresponding to the payment popularity analysis result.
In this way, payment behavior data may be determined from different operational behavior data, thereby ensuring a high degree of matching of the payment behavior data with the actual payment environment.
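The dispatch of steps S1131 to S1133 can be sketched as follows. The string encoding of the analysis result and the returned pairs are assumptions for illustration:

```python
# Illustrative dispatch for steps S1131-S1133: which operation behavior
# data become the payment behavior data. The analysis-result encoding
# ("recorded_event", "unrecorded_event", "no_frame") is an assumption.
def select_behavior_data(analysis_result, event=None, event_log=None,
                         distribution=None):
    if analysis_result == "recorded_event":
        return ("first", event)          # S1131: from the recorded event
    if analysis_result == "unrecorded_event":
        return ("second", event_log)     # S1132: from the event log
    if analysis_result == "no_frame":
        return ("third", distribution)   # S1133: from the distribution diagram
    raise ValueError("unknown analysis result")

kind, source = select_behavior_data("unrecorded_event", event_log=["login"])
print(kind)  # second
```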
In other examples, the generation of the feature processing list corresponding to each payment verification period based on the first face identification data corresponding to each payment verification period, described in step S12, may be implemented as described in steps S121 to S124 below.
Step S121, for any one payment verification period among the N payment verification periods, if the to-be-processed portrait tag indicates that no detectable portrait frame exists in the payment verification period, generating a first feature processing list, where the first feature processing list belongs to the feature processing list and is used to perform noise removal on the second face identification data.
Step S122, for any one payment verification period among the N payment verification periods, if the to-be-processed portrait tag indicates that a popularity event exists in the payment verification period and the popularity event has no payment operation behavior record, generating the first feature processing list.
Step S123, for any one payment verification period among the N payment verification periods, if the to-be-processed portrait tag indicates that a popularity event exists in the payment verification period and the popularity event has a payment operation behavior record, generating a second feature processing list, where the second feature processing list belongs to the feature processing list and is used to perform feature matching on the second face identification data.
Step S124, for any one payment verification period among the N payment verification periods, if the to-be-processed portrait tag indicates that a popularity event exists in the payment verification period and indicates a time-sequence weight of the payment operation behavior record of that popularity event, generating the first feature processing list or the second feature processing list based on an original facial feature; the original facial feature is determined from images collected by a rear camera of the blockchain payment terminal.
Thus, based on steps S121 to S124 above, different feature processing lists can be determined according to whether a detectable portrait image frame exists, whether a popularity event exists, and whether the popularity event has a payment operation behavior record, providing a complete, multidimensional screening basis for the subsequent screening of face identification data.
In an alternative embodiment, the generation in step S12 of the feature processing list corresponding to each payment verification period, based on the first face identification data corresponding to each payment verification period, may be implemented in one of the following two manners.
In a first implementation manner, for any one payment verification period among the N payment verification periods, if the to-be-processed portrait tag indicates that no detectable portrait image frame exists in the payment verification period, a first feature processing list is generated, where the first feature processing list belongs to the feature processing list and is used to perform noise removal on the second face identification data; and for any one payment verification period among the N payment verification periods, if the to-be-processed portrait tag indicates that a detectable portrait image frame exists in the payment verification period, a second feature processing list is generated, where the second feature processing list belongs to the feature processing list and is used to perform feature matching on the second face identification data.
In a second implementation manner, a feature processing list corresponding to each payment verification period is generated from the first face identification data corresponding to that period by using a preset data processing thread, where the preset data processing thread is a behavior feature processing thread, a time-sequence feature processing thread, or an artificial-intelligence-based multidimensional feature clustering thread.
Further, in order to reduce the second face identification data effectively, the processing in step S13 of the second face identification data corresponding to each payment verification period by using the feature processing list corresponding to that period, to obtain the third face identification data corresponding to each payment verification period, may be implemented as described in steps S131 to S133 below.
Step S131, if the preset data processing thread is the behavior feature processing thread, processing, based on the feature processing list corresponding to each payment verification period, the second face identification data corresponding to each payment verification period by using the behavior feature processing thread, to obtain third face identification data corresponding to each payment verification period.
Step S132, if the preset data processing thread is the time-sequence feature processing thread, processing, based on the feature processing list corresponding to each payment verification period, the second face identification data corresponding to each payment verification period by using the time-sequence feature processing thread, to obtain third face identification data corresponding to each payment verification period.
Step S133, if the preset data processing thread is the artificial-intelligence-based multidimensional feature clustering thread, processing, based on the feature processing list corresponding to each payment verification period, the second face identification data corresponding to each payment verification period by using the artificial-intelligence-based multidimensional feature clustering thread, to obtain third face identification data corresponding to each payment verification period.
It will be appreciated that the behavior feature processing thread, the time-sequence feature processing thread, and the artificial-intelligence-based multidimensional feature clustering thread described above screen the second face identification data through different feature dimensions. Different processing threads can therefore be selected flexibly for different usage scenarios, so that the second face identification data can be effectively simplified and the high recognizability of the third face identification data is ensured.
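The thread-type dispatch of steps S131 to S133 can be sketched as a lookup table. The three routine bodies and the record layout are illustrative assumptions; only the three-way dispatch follows the text:

```python
# Illustrative dispatch for steps S131-S133: pick the processing routine
# by the configured thread type. The three routine bodies are assumptions.
def process_by_behavior(data, proc_list):
    return [d for d in data if d["behavior"] in proc_list]

def process_by_timing(data, proc_list):
    return [d for d in data if d["t"] in proc_list]

def process_by_clustering(data, proc_list):
    return [d for d in data if d["cluster"] in proc_list]

THREADS = {
    "behavior": process_by_behavior,      # S131
    "timing": process_by_timing,          # S132
    "clustering": process_by_clustering,  # S133
}

def refine(thread_type, second_data, proc_list):
    """Return the third face identification data for one period."""
    return THREADS[thread_type](second_data, proc_list)

second = [{"behavior": "pay", "t": 1, "cluster": 0},
          {"behavior": "idle", "t": 2, "cluster": 1}]
print(refine("behavior", second, {"pay"}))  # keeps only the "pay" record
```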
In particular embodiments, the inventors have found that, in order to determine the current payment verification result reliably, the data correlation of the third face identification data also needs to be analyzed so that potentially associated face identification data are taken into account. To achieve this, the generation in step S14 of the current payment verification result based on the third face identification data corresponding to each payment verification period may further include the following steps S141 to S145.
Step S141, determining a facial feature correlation matrix corresponding to each payment verification period based on the third face identification data corresponding to that payment verification period, where the facial feature correlation matrix is the correlation matrix of the third face identification data under a preset correlation weight threshold.
Step S142, determining the face description information corresponding to each payment verification period based on the facial feature correlation matrix corresponding to that payment verification period.
Step S143, determining a face identification error distribution corresponding to each payment verification period based on the face description information corresponding to that payment verification period, where the face identification error distribution is the dynamic error of the third face identification data under the preset correlation weight threshold.
Step S144, determining target third face identification data corresponding to each payment verification period based on the face identification error distribution and the third face identification data corresponding to that payment verification period.
Step S145, determining the current payment verification result corresponding to each payment verification period based on the target third face identification data corresponding to that payment verification period.
By adopting this design and implementing steps S141 to S145, the third face identification data and the facial feature correlation matrix can be analyzed to obtain the face description information corresponding to each payment verification period, thereby realizing the correlation analysis of the third face identification data and determining the target third face identification data. Potentially associated face identification data can thus be taken into account, and the current payment verification result can be determined reliably based on the target third face identification data.
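One way to read steps S141, S143, and S144 is: build a pairwise correlation matrix, derive a per-record error from it, and keep only low-error records. The similarity measure (dot product), the error formula, and the threshold below are all stand-in assumptions:

```python
# Illustrative sketch of steps S141/S143/S144: a correlation matrix over
# the third face identification data, an error estimate derived from it,
# and error-based selection of the target data. All numerics are assumed.
def correlation_matrix(vectors):
    """S141: pairwise dot-product similarity (stand-in correlation)."""
    return [[sum(a * b for a, b in zip(u, v)) for v in vectors]
            for u in vectors]

def error_distribution(matrix):
    """S143: per-record error, here 1 - mean off-diagonal correlation."""
    n = len(matrix)
    return [1 - sum(row[j] for j in range(n) if j != i) / (n - 1)
            for i, row in enumerate(matrix)]

def select_target(data, errors, max_error=0.5):
    """S144: keep records whose dynamic error stays within the threshold."""
    return [d for d, e in zip(data, errors) if e <= max_error]

third = [[1, 0], [1, 0], [0, 1]]          # toy feature vectors
m = correlation_matrix(third)
errs = error_distribution(m)
target = select_target(third, errs)
print(target)  # [[1, 0], [1, 0]]
```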
Further, the determination in step S144 of the target third face identification data corresponding to each payment verification period, based on the face identification error distribution and the third face identification data corresponding to each payment verification period, may further include: determining to-be-processed third face identification data corresponding to each payment verification period based on the face identification error distribution and the third face identification data corresponding to that period; and performing redundant-data elimination on the to-be-processed third face identification data corresponding to each payment verification period to obtain the target third face identification data corresponding to each payment verification period.
On the basis of the above, the generation of the current payment verification result based on the third face identification data corresponding to each payment verification period described in step S14 may further include the contents described in steps S14a and S14b below.
Step S14a, for any one of the N payment verification periods, if the target third face identification data meets a payment verification index condition, generating a first current payment verification result, where the first current payment verification result belongs to the current payment verification result, and the first current payment verification result indicates that the target third face identification data is authorized face data.
Step S14b, for any one of the N payment verification periods, if the target third face identification data does not satisfy the payment verification index condition, generating a second current payment verification result, where the second current payment verification result belongs to the current payment verification result, and the second current payment verification result indicates that the target third face identification data is unauthorized face data.
In this way, steps S14a and S14b above can be implemented to determine different payment verification results, so that the subsequent payment security judgment can be made on the basis of those different results, ensuring that the judgment takes different payment verification situations into account.
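The authorized/unauthorized split of steps S14a and S14b can be sketched as a threshold check against an assumed payment verification index condition; the match-ratio measure and the 0.8 threshold are illustrative assumptions:

```python
# Illustrative sketch of steps S14a/S14b: compare the target third face
# identification data against a payment verification index condition.
# The similarity measure and threshold are assumptions.
def current_verification_result(target_data, authorized, threshold=0.8):
    matches = sum(1 for d in target_data if d in authorized)
    ratio = matches / max(len(target_data), 1)
    if ratio >= threshold:
        return "first"   # S14a: target data are authorized face data
    return "second"      # S14b: target data are unauthorized face data

print(current_verification_result(["A"], {"A", "B"}))       # first
print(current_verification_result(["A", "X", "Y"], {"A"}))  # second
```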
The inventors have found that, in practical applications, two cases need to be considered when analyzing the current payment verification result, namely authorized face data and unauthorized face data, and the two cases need to be handled separately to ensure reliable detection of the payment environment safety index. To achieve this, the determination in step S14 of whether the payment environment safety index of the blockchain payment terminal satisfies the set condition, on the basis of the current payment verification result, may be performed in two branches: the first branch covers the case where the current payment verification result is the first current payment verification result, and the second branch covers the case where it is the second current payment verification result.
On the one hand, the execution steps corresponding to the first branch are as described in steps S21 to S25 below.
Step S21, if the current payment verification result is the first current payment verification result, obtaining a first data update track and a second data update track for the authorized face data; wherein the update time step of the second data update track is smaller than that of the first data update track.
Step S22, determining surrounding scene image data of the authorized face data according to the track node features of the second data update track, and obtaining a dynamic track node of the authorized face data from the first data update track according to the surrounding scene image data.
Step S23, determining the matching rate between the node transfer information of the dynamic track node and each piece of to-be-matched transfer information in a preset transfer information set; the preset transfer information set comprises a plurality of pieces of to-be-matched transfer information, each piece of to-be-matched transfer information carries a transfer safety factor, and the transfer safety factor indicates whether the blockchain payment terminal is in a permission-interface open state or a permission-interface closed state.
Step S24, selecting K pieces of to-be-matched transfer information from the preset transfer information set based on the matching rate between the node transfer information and each piece of to-be-matched transfer information, where K is a positive integer greater than or equal to 1; and judging, based on the transfer safety factors of the K pieces of to-be-matched transfer information, whether the blockchain payment terminal is in the permission-interface open state or the permission-interface closed state.
Step S25, when the blockchain payment terminal is judged to be in the permission-interface open state, determining an access request list received by the blockchain payment terminal, and determining the payment environment safety index of the blockchain payment terminal according to the access request list; extracting the dimension description weight corresponding to each piece of index dimension information in the payment environment safety index, and weighting the dimension description weights to obtain a payment environment detection coefficient; when the payment environment detection coefficient is greater than or equal to a set detection coefficient, judging that the payment environment safety index meets the set condition; and when the payment environment detection coefficient is smaller than the set detection coefficient, judging that the payment environment safety index does not meet the set condition.
In this way, by executing the contents described in the above steps S21 to S25, the data update tracks of authorized face data at different update time steps can be analyzed to determine the surrounding scene image data and whether the authority interface of the blockchain payment terminal is open or closed. The payment environment detection coefficient can then be accurately calculated in real time, so that an overall analysis and judgment of the payment environment safety index is realized through the magnitude relationship between the payment environment detection coefficient and the set detection coefficient. Whether the blockchain payment terminal is at risk of information leakage or wrong payment can thus be judged accurately and reliably, further ensuring the information security of face-brushing payment.
Further, the step S24 of selecting K pieces of transmission information to be matched from the preset transmission information set based on the matching rate between the node transmission information and each piece of transmission information to be matched may include the following step: selecting, from the preset transmission information set, the K pieces of transmission information to be matched having the largest matching rates with the node transmission information.
Further, the step S24 of judging whether the blockchain payment terminal is in the authority interface opening state or the authority interface closing state based on the transmission safety factors of the K pieces of transmission information to be matched may include the following steps: if each transmission safety factor is either a first target coefficient or a second target coefficient, counting the number of first target coefficients and the number of second target coefficients among the transmission safety factors of the K pieces of transmission information to be matched, wherein the first target coefficient represents that the blockchain payment terminal is in the authority interface opening state and the second target coefficient represents that the blockchain payment terminal is in the authority interface closing state; and judging whether the blockchain payment terminal is in the authority interface opening state or the authority interface closing state according to the number of first target coefficients and the number of second target coefficients, and determining the confidence weight of the judgment result according to these two numbers.
On the basis of the above, the step in S24 of judging whether the blockchain payment terminal is in the authority interface opening state or the authority interface closing state according to the number of first target coefficients and the number of second target coefficients may include: if the number of first target coefficients is larger than the number of second target coefficients, judging that the blockchain payment terminal is in the authority interface opening state; and if the number of first target coefficients is smaller than the number of second target coefficients, judging that the blockchain payment terminal is in the authority interface closing state.
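Steps S23 and S24, together with the refinements above, amount to selecting the K closest candidates and taking a majority vote over their safety factors. The sketch below is an assumed reading of that logic: the state labels, the tie-breaking rule (the patent only specifies the strictly-greater and strictly-smaller cases), and the confidence weight as the winning vote fraction are all hypothetical.

```python
from collections import Counter

OPEN, CLOSED = "open", "closed"  # authority interface states (labels assumed)

def judge_interface_state(matching_rates, safety_factors, k):
    """Select the K pieces of transmission information to be matched with the
    largest matching rates, then judge the authority interface state by a
    majority vote over their transmission safety factors, also returning a
    confidence weight for the judgment."""
    # Indices of the K candidates with the largest matching rates.
    top_k = sorted(range(len(matching_rates)),
                   key=lambda i: matching_rates[i], reverse=True)[:k]
    votes = Counter(safety_factors[i] for i in top_k)
    n_open, n_closed = votes[OPEN], votes[CLOSED]
    # A tie is not covered by the patent; CLOSED is assumed as the safe default.
    state = OPEN if n_open > n_closed else CLOSED
    confidence = max(n_open, n_closed) / k
    return state, confidence

rates = [0.91, 0.42, 0.88, 0.75, 0.60]
factors = [OPEN, CLOSED, OPEN, CLOSED, OPEN]
state, confidence = judge_interface_state(rates, factors, 3)
print(state)  # "open": two of the top-3 safety factors vote for the open state
```

With K = 3 the candidates at indices 0, 2, and 3 are selected, giving two "open" votes against one "closed" vote and a confidence weight of 2/3.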
On the other hand, the execution steps corresponding to the second branch are as follows: and if the current payment verification result is the second current payment verification result, judging that the payment environment safety index does not meet the set condition.
It is understood that, on the basis of the above determination of the payment environment security index, if the payment environment security index does not satisfy the set condition, the method further includes the following steps S15 and S16.
And step S15, detecting the payment request sent by the block chain payment terminal.
Step S16, when the payment request is detected, intercepting the payment request and sending prompt information to the blockchain payment terminal; and the prompt information is used for prompting the block chain payment terminal to carry out delayed payment.
In practical application, by executing the contents described in steps S15 and S16, a payment request sent by the blockchain payment terminal can be intercepted when it is detected that the terminal may be exposed to a payment risk, so that the user's face-brushing payment through the terminal is interrupted. The risk of information leakage or wrong payment in the blockchain payment terminal can thereby be avoided, ensuring the information security of face-brushing payment.
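A minimal sketch of the interception flow of steps S15 and S16, under the assumption that the safety-index check result and the prompt channel are available as inputs (`environment_index_ok` and `send_prompt` are hypothetical names; the patent does not define this interface):

```python
def handle_payment_request(request, environment_index_ok, send_prompt):
    """If the payment environment safety index does not satisfy the set
    condition, intercept the detected payment request and prompt the
    blockchain payment terminal to carry out delayed payment."""
    if environment_index_ok:
        return request  # safety index satisfied: let the payment proceed
    send_prompt("Payment delayed: the payment environment safety index "
                "does not satisfy the set condition.")
    return None  # request intercepted

prompts = []
blocked = handle_payment_request({"amount": 10}, False, prompts.append)
print(blocked is None, len(prompts))  # True 1
```

Returning `None` stands in for dropping the request; a real terminal would additionally queue the request for the delayed-payment retry that the prompt announces.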
In an alternative embodiment, after the step of generating the current payment verification result based on the third face identification data corresponding to each payment verification period described in step S14, one of the following three technical solutions may be implemented.
According to the first technical scheme, if the current payment verification results corresponding to M payment verification time periods are the first current payment verification result, determining at least one target payment verification time period from the M payment verification time periods based on third face identification data corresponding to each payment verification time period in the M payment verification time periods, wherein the first current payment verification result represents that the third face identification data is authorized face data, the M payment verification time periods belong to the N payment verification time periods, and M is an integer which is greater than or equal to 1 and less than or equal to N; and storing the third face identification data corresponding to the at least one target payment verification time period. In this way, training samples can be provided for subsequent training of neural networks for face recognition.
According to the second technical scheme, if the current payment verification results corresponding to M payment verification time periods are all first current payment verification results, at least one target payment verification time period is determined from the M payment verification time periods based on third face identification data corresponding to each payment verification time period in the M payment verification time periods, wherein the first current payment verification result indicates that the third face identification data are authorized face data, the M payment verification time periods belong to the N payment verification time periods, and M is an integer which is greater than or equal to 1 and less than or equal to N; for each target payment verification time interval, performing identification weight extraction on third face identification data corresponding to the target payment verification time interval to obtain an identification weight extraction result; and generating an identification adjustment coefficient of the third face identification data corresponding to the target payment verification time period based on the identification weight extraction result corresponding to each target payment verification time period. In this way, the adjustment of the corresponding recognition thread can be realized.
According to the third technical scheme, if the current payment verification results corresponding to M payment verification time periods are all the first current payment verification result, at least one target payment verification time period is determined from the M payment verification time periods based on the third face identification data corresponding to each payment verification time period in the M payment verification time periods, wherein the first current payment verification result indicates that the third face identification data is authorized face data, the M payment verification time periods belong to the N payment verification time periods, and M is an integer greater than or equal to 1 and less than or equal to N; for each target payment verification time period, facial image quality analysis is performed on the third face identification data corresponding to that time period to obtain an image quality analysis result; for each target payment verification time period, quality index extraction is performed on the corresponding image quality analysis result to obtain a quality index extraction result; and facial recognition correction data is generated based on the quality index extraction result corresponding to each target payment verification time period, wherein the facial recognition correction data includes corrections to the shooting parameters of the front camera and the rear camera. In this way, by correcting the shooting parameters of the front and rear cameras, the facial recognition accuracy during face-brushing payment is continuously optimized and wrong payment is avoided.
Fig. 4 is a block diagram illustrating an exemplary facial recognition analysis device 140 applied to blockchain payments according to some embodiments of the present application, where the facial recognition analysis device 140 applied to blockchain payments may include the following functional modules.
The data acquisition module 141 is configured to acquire first facial identification data corresponding to each payment verification time interval in N payment verification time intervals, where the first facial identification data includes a facial feature set, an environmental feature noise, and a to-be-processed portrait label, the facial feature set is used to identify the payment verification time interval, the environmental feature noise is used to indicate an identification accuracy of the payment verification time interval, the to-be-processed portrait label is used to indicate a portrait intensity in the payment verification time interval, and N is an integer greater than 1.
A list generating module 142, configured to generate a feature processing list corresponding to each payment verification period based on the first face identification data corresponding to each payment verification period, where the feature processing list is used to perform noise removal and feature matching on the second face identification data, and the feature processing list and the payment verification period have a one-to-one correspondence relationship.
The data processing module 143 is configured to process the second face identification data corresponding to each payment verification period by using the feature processing list corresponding to each payment verification period to obtain third face identification data corresponding to each payment verification period, where the feature processing list, the second face identification data, and the third face identification data have a one-to-one correspondence relationship.
A payment detection module 144, configured to generate a current payment verification result based on the third face identification data corresponding to each payment verification time period; and determining whether the payment environment safety index of the blockchain payment terminal meets a set condition according to the current payment verification result.
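The data flow through modules 141 to 144 can be sketched as a simple per-period pipeline. The bodies below are placeholders (the patent leaves their internals to the method embodiment); what the sketch illustrates is only the one-to-one correspondence between the feature processing list, the second face identification data, and the third face identification data in each payment verification time period.

```python
def run_device(first_data_per_period, second_data_per_period):
    """Sketch of device 140: acquire first face identification data (module
    141), generate one feature processing list per period (module 142),
    process the second data into third data (module 143), and produce a
    per-period verification result (module 144)."""
    results = []
    for first, second in zip(first_data_per_period, second_data_per_period):
        # Module 142: one feature processing list per payment verification period.
        feature_list = {"period": first["period"], "ops": ["denoise", "match"]}
        # Module 143: apply the list to the second face identification data.
        third = {"period": first["period"],
                 "data": [op + ":" + second for op in feature_list["ops"]]}
        # Module 144: placeholder payment verification on the third data.
        results.append({"period": third["period"], "verified": bool(third["data"])})
    return results

firsts = [{"period": 1}, {"period": 2}]
seconds = ["face-1", "face-2"]
print(run_device(firsts, seconds))
```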
For a description of the above-described device embodiment, reference is made to the description of the method embodiment shown in fig. 3.
Based on the same inventive concept as described above, a system embodiment corresponding to the method embodiment shown in fig. 3 is also provided, and an exemplary description may be as follows.
A1. A face recognition analysis system applied to block chain payment comprises a big data platform and a block chain payment terminal which are communicated with each other; wherein the blockchain payment terminal is configured to:
acquiring first face identification data corresponding to each payment verification time interval in N payment verification time intervals, wherein the first face identification data comprise a face feature set, environmental feature noise and a to-be-processed portrait label, the face feature set is used for identifying the payment verification time intervals, the environmental feature noise is used for indicating the identification accuracy of the payment verification time intervals, the to-be-processed portrait label is used for indicating the portrait intensity degree in the payment verification time intervals, and N is an integer greater than 1;
generating a feature processing list corresponding to each payment verification time period based on the first face identification data corresponding to each payment verification time period, wherein the feature processing list is used for performing noise removal and feature matching on second face identification data, and the feature processing list and the payment verification time periods have one-to-one correspondence relationship;
processing second face identification data corresponding to each payment verification time period by adopting the feature processing list corresponding to each payment verification time period to obtain third face identification data corresponding to each payment verification time period, wherein the feature processing list, the second face identification data and the third face identification data have a one-to-one correspondence relationship;
generating a current payment verification result based on the third face identification data corresponding to each payment verification time interval; and determining whether the payment environment safety index of the blockchain payment terminal meets a set condition according to the current payment verification result.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the present application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific terminology to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of at least one embodiment of the present application may be combined as appropriate.
In addition, those skilled in the art will recognize that the various aspects of the application may be illustrated and described in terms of several patentable species or contexts, including any new and useful combination of procedures, machines, articles, or materials, or any new and useful modifications thereof. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "unit", "component", or "system". Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in at least one computer readable medium.
A computer readable signal medium may comprise a propagated data signal with computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer readable signal medium may be propagated over any suitable medium, including radio, electrical cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the execution of aspects of the present application may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like; conventional procedural programming languages such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The programming code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any network format, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet), in a cloud computing environment, or as a service, such as a software as a service (SaaS).
Additionally, the order of the process elements and sequences described herein, the use of numerical letters, or other designations are not intended to limit the order of the processes and methods unless otherwise indicated in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it should be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware means, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
It should also be appreciated that in the foregoing description of embodiments of the present application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of at least one embodiment of the invention. However, this method of disclosure is not intended to require more features than are expressly recited in the claims. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.

Claims (8)

1. A method for facial recognition analysis for blockchain payments, comprising:
if the current payment verification result is a first current payment verification result, acquiring a first data updating track and a second data updating track aiming at authorized face data; wherein the update time step of the second data update track is smaller than the update time step of the first data update track;
determining surrounding scene image data of the authorized face data according to the track node characteristics of the second data updating track, and acquiring dynamic track nodes of the authorized face data from the first data updating track according to the surrounding scene image data;
determining the matching rate of the node transmission information of the dynamic track node with each piece of transmission information to be matched in a preset transmission information set; wherein the preset transmission information set comprises a plurality of pieces of transmission information to be matched, each piece of transmission information to be matched is provided with a transmission safety factor, and the transmission safety factor represents that the blockchain payment terminal is in an authority interface opening state or in an authority interface closing state;
selecting K pieces of transmission information to be matched from the preset transmission information set based on the matching rate of the node transmission information with each piece of transmission information to be matched, wherein K is a positive integer greater than or equal to 1; and judging whether the blockchain payment terminal is in an authority interface opening state or an authority interface closing state based on the transmission safety factors of the K pieces of transmission information to be matched;
when the block chain payment terminal is judged to be in the permission interface opening state, determining an access request list received by the block chain payment terminal, and determining a payment environment safety index of the block chain payment terminal according to the access request list; extracting dimension description weights corresponding to each index dimension information in the payment environment safety indexes, and weighting each dimension description weight to obtain a payment environment detection coefficient; when the payment environment detection coefficient is larger than or equal to a set detection coefficient, judging that the payment environment safety index meets the set condition; and when the payment environment detection coefficient is smaller than a set detection coefficient, judging that the payment environment safety index does not meet the set condition.
2. The method according to claim 1, wherein selecting K pieces of transmission information to be matched from the preset transmission information set based on the matching rate of the node transmission information with each piece of transmission information to be matched comprises:
selecting, from the preset transmission information set, the K pieces of transmission information to be matched having the largest matching rates with the node transmission information.
3. The method according to claim 2, wherein judging whether the blockchain payment terminal is in the authority interface opening state or the authority interface closing state based on the transmission safety factors of the K pieces of transmission information to be matched comprises:
if each transmission safety factor is either a first target coefficient or a second target coefficient, counting the number of first target coefficients and the number of second target coefficients among the transmission safety factors of the K pieces of transmission information to be matched; wherein the first target coefficient represents that the blockchain payment terminal is in the authority interface opening state, and the second target coefficient represents that the blockchain payment terminal is in the authority interface closing state;
and judging whether the blockchain payment terminal is in the authority interface opening state or the authority interface closing state according to the number of first target coefficients and the number of second target coefficients, and determining the confidence weight of the judgment result according to these two numbers.
4. The method according to claim 3, wherein judging whether the blockchain payment terminal is in the authority interface opening state or the authority interface closing state according to the number of first target coefficients and the number of second target coefficients comprises:
if the number of first target coefficients is larger than the number of second target coefficients, judging that the blockchain payment terminal is in the authority interface opening state;
and if the number of first target coefficients is smaller than the number of second target coefficients, judging that the blockchain payment terminal is in the authority interface closing state.
5. The method of claim 4, further comprising:
and if the current payment verification result is a second current payment verification result, judging that the payment environment safety index does not meet the set condition.
6. The method of claim 1, wherein prior to the step of obtaining the first data update trajectory and the second data update trajectory for the authorized face data if the current payment verification result is the first current payment verification result, the method further comprises:
determining a facial feature correlation matrix corresponding to each payment verification time period based on third facial identification data corresponding to each payment verification time period, wherein the facial feature correlation matrix is a facial feature correlation matrix of the third facial identification data under a preset correlation weight threshold value;
determining face description information corresponding to each payment verification time period based on the face feature correlation matrix corresponding to each payment verification time period;
determining face identification error distribution corresponding to each payment verification time period based on the face description information corresponding to each payment verification time period, wherein the face identification error distribution is a dynamic error of the third face identification data under a preset relevance weight threshold;
determining target third face identification data corresponding to each payment verification time period based on the face identification error distribution corresponding to each payment verification time period and the third face identification data corresponding to each payment verification time period;
determining a current payment verification result corresponding to each payment verification time period based on the target third face identification data corresponding to each payment verification time period;
wherein the determining the target third face identification data corresponding to each payment verification period based on the face identification error distribution corresponding to each payment verification period and the third face identification data corresponding to each payment verification period comprises: determining to-be-processed third face identification data corresponding to each payment verification time period based on the face identification error distribution corresponding to each payment verification time period and the third face identification data corresponding to each payment verification time period; performing redundant data elimination processing on the third face identification data to be processed corresponding to each payment verification time interval to obtain target third face identification data corresponding to each payment verification time interval;
wherein the generating a current payment verification result based on the third face identification data corresponding to each payment verification period comprises:
for any one payment verification time interval in the N payment verification time intervals, if the target third face identification data meets a payment verification index condition, generating a first current payment verification result, wherein the first current payment verification result belongs to the current payment verification result, and the first current payment verification result indicates that the target third face identification data is authorized face data;
and for any one payment verification time interval in the N payment verification time intervals, if the target third face identification data does not meet the payment verification index condition, generating a second current payment verification result, wherein the second current payment verification result belongs to the current payment verification result, and the second current payment verification result indicates that the target third face identification data is unauthorized face data.
7. The method of claim 6, wherein prior to the step of determining the facial feature correlation matrix for each payment verification period based on the third facial identification data for each payment verification period, the method further comprises:
acquiring first face identification data corresponding to each payment verification time interval in N payment verification time intervals; the first face identification data comprises a face feature set, an environment feature noise and a to-be-processed portrait label, wherein the face feature set is used for identifying a payment verification time period, the environment feature noise is used for indicating the identification accuracy of the payment verification time period, the to-be-processed portrait label is used for indicating the portrait intensity degree in the payment verification time period, and N is an integer greater than 1;
generating a feature processing list corresponding to each payment verification time period based on the first face identification data corresponding to each payment verification time period; the feature processing list is used for carrying out noise removal and feature matching on second face identification data, and the feature processing list and the payment verification time interval have one-to-one correspondence relationship;
processing the second facial identification data corresponding to each payment verification time period by adopting the characteristic processing list corresponding to each payment verification time period to obtain third facial identification data corresponding to each payment verification time period; wherein the feature processing list, the second face identification data, and the third face identification data have a one-to-one correspondence relationship.
8. A big data platform is characterized by comprising a processing engine, a network module and a memory; the processing engine and the memory communicate through the network module, the processing engine reading a computer program from the memory and operating to perform the method of any of claims 1-7.
CN202110554393.0A 2020-11-17 2020-11-17 Face recognition analysis method applied to block chain payment and big data platform Withdrawn CN113409035A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110554393.0A CN113409035A (en) 2020-11-17 2020-11-17 Face recognition analysis method applied to block chain payment and big data platform

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011285496.3A CN112330312B (en) 2020-11-17 2020-11-17 Data processing method based on block chain payment and facial recognition and big data platform
CN202110554393.0A CN113409035A (en) 2020-11-17 2020-11-17 Face recognition analysis method applied to block chain payment and big data platform

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202011285496.3A Division CN112330312B (en) 2020-11-17 2020-11-17 Data processing method based on block chain payment and facial recognition and big data platform

Publications (1)

Publication Number Publication Date
CN113409035A true CN113409035A (en) 2021-09-17

Family

ID=74320864

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202110554392.6A Withdrawn CN113409034A (en) 2020-11-17 2020-11-17 Facial recognition data processing method combined with block chain payment and big data platform
CN202011285496.3A Active CN112330312B (en) 2020-11-17 2020-11-17 Data processing method based on block chain payment and facial recognition and big data platform
CN202110554393.0A Withdrawn CN113409035A (en) 2020-11-17 2020-11-17 Face recognition analysis method applied to block chain payment and big data platform

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202110554392.6A Withdrawn CN113409034A (en) 2020-11-17 2020-11-17 Facial recognition data processing method combined with block chain payment and big data platform
CN202011285496.3A Active CN112330312B (en) 2020-11-17 2020-11-17 Data processing method based on block chain payment and facial recognition and big data platform

Country Status (1)

Country Link
CN (3) CN113409034A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112818400B (en) * 2021-02-18 2022-05-03 支付宝(杭州)信息技术有限公司 Biological identification method, device and equipment based on privacy protection

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10380588B2 (en) * 2014-05-13 2019-08-13 Mastercard International Incorporated Passive cardholder verification method in mobile device
JP6395678B2 (en) * 2014-08-14 2018-09-26 エヌエイチエヌ ペイコ コーポレーション Payment service method and system using card having integrated function, and recording medium
CN105844460A (en) * 2015-01-13 2016-08-10 顾泽苍 Composition of mobile phone face-scanning payment system
CN104574088B (en) * 2015-02-04 2018-10-19 华为技术有限公司 The method and apparatus of payment authentication
CN105184561A (en) * 2015-08-24 2015-12-23 小米科技有限责任公司 Safety payment method and device
CN106056379A (en) * 2016-05-25 2016-10-26 努比亚技术有限公司 Payment terminal and payment data processing method
CN107491965B (en) * 2017-07-31 2020-07-10 阿里巴巴集团控股有限公司 Method and device for establishing biological feature library
CN107798530B (en) * 2017-08-09 2021-09-14 中国银联股份有限公司 Payment system and payment method
US10719832B1 (en) * 2018-01-12 2020-07-21 Wells Fargo Bank, N.A. Fraud prevention tool
CN108491806A (en) * 2018-03-28 2018-09-04 成都信达智胜科技有限公司 A kind of fast human face recognition
CN111210589A (en) * 2018-11-22 2020-05-29 北京搜狗科技发展有限公司 Method and device for realizing alarm
CN109711850B (en) * 2018-12-29 2023-12-08 努比亚技术有限公司 Secure payment method, device and computer readable storage medium
CN110619300A (en) * 2019-09-14 2019-12-27 韶关市启之信息技术有限公司 Correction method for simultaneous recognition of multiple faces
CN111125772B (en) * 2019-12-31 2022-06-03 中国银行股份有限公司 Method and device for dynamically setting security policy and mobile device

Also Published As

Publication number Publication date
CN113409034A (en) 2021-09-17
CN112330312B (en) 2021-12-10
CN112330312A (en) 2021-02-05

Similar Documents

Publication Publication Date Title
CN111652615B (en) Safety identification method based on block chain big data and artificial intelligence cloud service platform
CN109816200B (en) Task pushing method, device, computer equipment and storage medium
CN112860484A (en) Container runtime abnormal behavior detection and model training method and related device
CN112005532B (en) Method, system and storage medium for classifying executable files
CN108614970B (en) Virus program detection method, model training method, device and equipment
CN112487495B (en) Data processing method based on big data and cloud computing and big data server
CN112214781B (en) Remote sensing image big data processing method and system based on block chain
CN112330312B (en) Data processing method based on block chain payment and facial recognition and big data platform
CN107977305B (en) Method and apparatus for detecting application
CN112686667A (en) Data processing method based on big data and block chain and cloud service platform
CN112528306A (en) Data access method based on big data and artificial intelligence and cloud computing server
US10922569B2 (en) Method and apparatus for detecting model reliability
CN113689291B (en) Anti-fraud identification method and system based on abnormal movement
CN115687732A (en) User analysis method and system based on AI and stream computing
CN112929385B (en) Communication information processing method based on big data and communication service and cloud computing platform
CN112416999B (en) Data analysis method based on artificial intelligence and big data positioning and cloud server
CN110213341B (en) Method and device for detecting downloading of application program
CN113409014A (en) Big data service processing method based on artificial intelligence and artificial intelligence server
CN113935034A (en) Malicious code family classification method and device based on graph neural network and storage medium
CN112465503B (en) Information security protection method based on internet finance and biological recognition and cloud platform
CN113645107B (en) Gateway conflict resolution method and system based on smart home
CN112613878A (en) Information detection method based on big data and block chain payment and big data server
CN115459952A (en) Attack detection method, electronic device, and computer-readable storage medium
CN117670358A (en) Service information verification method and device, storage medium and electronic equipment
CN116628637A (en) Unauthorized software identification method and device, electronic equipment, medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210917