CN111382410B - Face brushing verification method and system - Google Patents


Info

Publication number
CN111382410B
CN111382410B
Authority
CN
China
Prior art keywords
brushing
historical
face
user
face brushing
Prior art date
Legal status
Active
Application number
CN202010206013.XA
Other languages
Chinese (zh)
Other versions
CN111382410A (en)
Inventor
王岱鑫
张志强
周俊
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202010206013.XA
Publication of CN111382410A
Application granted
Publication of CN111382410B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168: Feature extraction; Face representation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present specification provide a face brushing verification method. The method may include: obtaining a face brushing image from a face brushing machine; determining a candidate user based on the face brushing image; acquiring historical association features reflecting a historical association between the candidate user and the face brushing machine; determining, through a face brushing verification model, a degree of correlation between the face brushing image and the candidate user based at least on the historical association features; and determining a face brushing verification result of the face brushing image based on the degree of correlation.

Description

Face brushing verification method and system
Technical Field
The specification relates to the technical field of artificial intelligence, in particular to a face brushing verification method and system.
Background
With the development of computer technology and network technology, face brushing verification has become increasingly popular. Generally, a face brushing verification system may collect a face image, compare it with a large number of images in a database, and determine the user account corresponding to the image with the highest similarity as the target account. Further, the system may determine a degree of correlation between the face image and the target account, thereby determining the verification result. However, in some cases, some user accounts may not be associated with a clear image or rich historical face brushing images for comparison, which may result in an inaccurate verification result. In addition, there may be correlations between different users and different face brushing machines, which may affect the face brushing verification result.
Therefore, it is desirable to provide a face brushing verification method that takes into account the association between the user and the face brushing machine and improves the accuracy of face brushing verification.
Disclosure of Invention
One aspect of embodiments of the present specification provides a face brushing verification method. The method comprises the following steps: acquiring a face brushing image from a face brushing machine; determining a candidate user based on the face brushing image; acquiring historical association features reflecting a historical association between the candidate user and the face brushing machine; determining, through a face brushing verification model, a degree of correlation between the face brushing image and the candidate user based at least on the historical association features; and determining a face brushing verification result of the face brushing image based on the degree of correlation.
Another aspect of embodiments of the present specification provides a face brushing verification system. The system comprises an acquisition module, a candidate user determination module, a historical association feature acquisition module, a correlation determination module, and a verification result determination module. The acquisition module is used for acquiring a face brushing image from a face brushing machine; the candidate user determination module is used for determining a candidate user based on the face brushing image; the historical association feature acquisition module is used for acquiring historical association features reflecting a historical association between the candidate user and the face brushing machine; the correlation determination module is used for determining, through a face brushing verification model, a degree of correlation between the face brushing image and the candidate user based at least on the historical association features; and the verification result determination module is used for determining a face brushing verification result of the face brushing image based on the degree of correlation.
Another aspect of embodiments of the present specification provides a face brushing verification device. The face brushing verification device comprises a processor, and the processor is used for executing the face brushing verification method.
Another aspect of embodiments of the present specification provides a computer-readable storage medium. The storage medium stores computer instructions, and after the computer reads the computer instructions in the storage medium, the computer executes the face brushing verification method.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of an exemplary face verification system, shown in accordance with some embodiments of the present description;
FIG. 2 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present description;
FIG. 3 is a flow diagram of an exemplary face brushing verification method, shown in accordance with some embodiments of the present description;
FIG. 4 is a flow diagram of an exemplary face brushing verification model training process, shown in accordance with some embodiments of the present description; and
fig. 5 is a schematic diagram of an exemplary initial face brushing verification model, shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" may include plural referents unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that explicitly identified steps and elements are included; those steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Although various references are made herein to certain modules or units in a system according to embodiments of the present description, any number of different modules or units may be used and run on the client and/or server. The modules are merely illustrative and different aspects of the systems and methods may use different modules.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations described above or below are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to the processes, or one or more steps may be removed from the processes.
Fig. 1 is a schematic diagram of an application scenario of an exemplary face brushing verification system according to some embodiments of the present description. The face brushing verification system 100 can be applied to various scenarios requiring face brushing verification, such as face brushing payment, face brushing access control, face brushing login, face brushing authentication, and the like. As shown in fig. 1, the face brushing verification system 100 may include a server 110, a network 120, a face brushing machine (also referred to simply as a "machine") 130, a user terminal 140, and a storage device 150.
The server 110 may be a single server or a server farm. In some embodiments, the server farm may be centralized or distributed (e.g., the server 110 may be a distributed system). In some embodiments, the server 110 may be local or remote. For example, the server 110 may access information and/or data stored in the face brushing machine 130, the user terminal 140, and/or the storage device 150 via the network 120. As another example, the server 110 may be directly connected to the face brushing machine 130, the user terminal 140, and/or the storage device 150 to access stored information and/or data. In some embodiments, the server 110 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, the like, or any combination of the above. In some embodiments, the server 110 may be implemented on a computing device, such as a computer, a mobile device, or any device with computing and processing capabilities.
In some embodiments, the server 110 may include a processing device 112. The processing device 112 may process information and/or data related to face brushing verification to perform one or more functions described herein. For example, the processing device 112 may obtain a face brushing image from the face brushing machine 130 or the user terminal 140 and perform face brushing verification on the face brushing image. For another example, after completing the face brushing verification, the processing device 112 may send the face brushing verification result to the face brushing machine 130 or the user terminal 140. In some embodiments, the processing device 112 may include one or more processing devices (e.g., a single-chip processor or a multi-chip processor). By way of example only, the processing device 112 may include one or more hardware processors, such as a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination of the above.
Network 120 may facilitate the exchange of information and/or data. In some embodiments, one or more components of the face brushing verification system 100 (e.g., the server 110, the face brushing machine 130, the user terminal 140, the storage device 150) may send information and/or data to other components of the face brushing verification system 100 via the network 120. For example, the server 110 may obtain a face brushing image from the face brushing machine 130 or a user terminal via the network 120. As another example, the server 110 may retrieve the trained face brushing verification model from the storage device 150 via the network 120. In some embodiments, the network 120 may be any one or a combination of a wired network or a wireless network. By way of example only, the network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, the like, or any combination of the above. In some embodiments, the network 120 may include one or more network exchange points. For example, the network 120 may include wired or wireless network exchange points, such as base stations and/or internet exchange points 120-1, 120-2, ..., through which one or more components of the face brushing verification system 100 may connect to the network 120 to exchange data and/or information.
The face brushing machine 130 may be any device that provides a face brushing function. For example, the face brushing machine 130 may include a self-service terminal device, a mobile device, and the like. In some embodiments, the self-service terminal device may include a kiosk, the like, or any combination of the above. In some embodiments, the mobile device may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, the like, or any combination of the above. In some embodiments, the smart home device may include smart lighting devices, control devices for smart appliances, smart monitoring devices, smart televisions, smart cameras, interphones, and the like, or any combination thereof. In some embodiments, the wearable device may include a bracelet, footwear, glasses, a helmet, a watch, clothing, a backpack, a smart accessory, or the like, or any combination of the above. In some embodiments, the smart mobile device may include a mobile handset, a personal digital assistant, a gaming device, a navigation device, a POS machine, a laptop computer, a desktop computer, the like, or any combination of the above. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality eyeshields, an augmented reality helmet, augmented reality glasses, augmented reality eyeshields, and the like, or any combination thereof. For example, the virtual reality device and/or augmented reality device may include a Google Glass™, an Oculus Rift™, a HoloLens™, a Gear VR™, and the like. In some embodiments, the face brushing machine 130 may include an image acquisition device, such as a visible light image recognition device, a three-dimensional image face recognition device, a thermal imaging face recognition device, a multi-light-source face recognition device, or the like. In some embodiments, when a user is within the capture range of the image acquisition device, the image acquisition device may automatically search for the user's face and capture a face image of the user.
The user terminal 140 may be any device that can provide a user interaction function. In some embodiments, the user terminal 140 may be a device similar to or the same as the face brushing machine 130. For example, the user terminal 140 may include a smartphone 140-1, a tablet computer 140-2, a laptop computer 140-3, and so on. In some embodiments, the user may complete the face brushing action through the user terminal 140, in which case the user terminal 140 may serve as a face brushing machine. In some embodiments, the face brushing machine 130 and the user terminal 140 may interact with each other for face brushing verification. For example, the face brushing machine 130 may capture a face brushing image of the user and transmit the face brushing image to the user terminal 140, and the user terminal 140 may transmit the face brushing image to the server 110 for face brushing verification. For another example, the user terminal 140 may collect a face brushing image of the user and transmit the face brushing image to the face brushing machine 130, and the face brushing machine 130 may transmit the face brushing image to the server 110 for face brushing verification. For a further example, the face brushing machine 130 may collect a face brushing image of the user and transmit the face brushing image to the server 110 for face brushing verification, and after the face brushing verification is completed, the server 110 may transmit the face brushing verification result to the user terminal 140. In some embodiments, the face brushing machine 130 or the user terminal 140 may include a display device (e.g., a display screen). The user may view the face brushing verification result through the display device and complete further operations (e.g., confirming or canceling the face brushing verification) through the user interface of the display device. In some embodiments, "face brushing machine" and "user terminal" may be used interchangeably.
Storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may store data obtained from the face brushing machine 130 and/or the user terminal 140. In some embodiments, the storage device 150 may store data and/or instructions that the server 110 may execute or use to implement the example methods described herein. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), the like, or any combination of the above. Exemplary mass storage devices may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable memories may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary random access memories may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitance random access memory (Z-RAM), and the like. Exemplary read-only memories may include mask read-only memory (MROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory, and the like. In some embodiments, the storage device 150 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, the like, or any combination of the above.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components of the face brushing verification system 100 (e.g., the server 110, the face brushing machine 130, the user terminal 140). One or more components of the face brushing verification system 100 may access data or instructions stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be directly connected to or in communication with one or more components of the face brushing verification system 100 (e.g., the server 110, the face brushing machine 130, the user terminal 140). In some embodiments, the storage device 150 may be part of the server 110.
In some embodiments, the information interaction of one or more components in the face verification system 100 may be accomplished by way of a request for service. The object of the service request may be any product. In some embodiments, the product may be a tangible product or an intangible product. Tangible products may include food, medicine, merchandise, chemical products, appliances, clothing, cars, houses, luxury goods, and the like, or any combination of the above. Intangible products may include service products, financial products, knowledge products, internet products, and the like, or any combination of the above. The internet products may include personal host products, website products, mobile internet products, commercial host products, embedded products, and the like, or any combination of the above. The mobile internet product may be software, programming, system, etc. for a mobile terminal or any combination of the above examples. The mobile terminal may include a tablet, a laptop, a mobile phone, a Personal Digital Assistant (PDA), a smart watch, a POS machine, a wearable device, and the like, or any combination of the above. For example, the product may be any software and/or application programming used in a computer or mobile handset. The software and/or application programming may be related to social interaction, shopping, transportation, entertainment, learning, investment, etc., or any combination of the above.
FIG. 2 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present description. As shown in fig. 2, the processing device 112 may include an acquisition module 202, a candidate user determination module 204, a historical association feature acquisition module 206, a correlation determination module 208, and a verification result determination module 210.

The acquisition module 202 may be used to acquire a face brushing image from a face brushing machine. For a detailed description of acquiring the face brushing image from the face brushing machine, see fig. 3, which is not repeated herein.

The candidate user determination module 204 may be configured to determine a candidate user based on the face brushing image. For a detailed description of determining the candidate user based on the face brushing image, see fig. 3, which is not repeated herein.

The historical association feature acquisition module 206 may be configured to acquire historical association features reflecting a historical association between the candidate user and the face brushing machine. For a detailed description of the historical association features, see fig. 3, which is not repeated herein.

The correlation determination module 208 may be configured to determine, through a face brushing verification model, a degree of correlation between the face brushing image and the candidate user based at least on the historical association features. For a detailed description of determining the degree of correlation between the face brushing image and the candidate user through the face brushing verification model, see fig. 3, which is not repeated herein.

The verification result determination module 210 may be configured to determine a face brushing verification result of the face brushing image based on the degree of correlation. For a detailed description of determining the face brushing verification result of the face brushing image based on the degree of correlation, see fig. 3, which is not repeated herein.
It should be understood that the device and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, the apparatus and its modules may be implemented by hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory for execution by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD-or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The devices and modules thereof in this specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also by software executed by various types of processors, for example, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the processing device 112 is merely for convenience of description and is not intended to limit the present disclosure to the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, modules may be combined in any manner or connected to other modules as sub-systems without departing from such teachings. For example, the processing device 112 may also include a communication module for communicating with other components, such as sending the face brushing verification result to the face brushing machine 130 or the user terminal 140. For another example, the modules in the processing device 112 may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present disclosure. For a further example, the processing device 112 may further include a face brushing behavior feature acquisition module for acquiring face brushing behavior features related to the face brushing image.
Fig. 3 is an exemplary flow diagram of a method of face brushing verification, shown in accordance with some embodiments of the present description. As shown in fig. 3, the process 300 may include the following steps. In some embodiments, flow 300 may be performed by a processing device (e.g., processing device 112).
At step 302, the processing device may obtain a face brushing image from a face brushing machine (e.g., the face brushing machine 130). In particular, this step may be performed by the acquisition module 202.
The face brushing image may be an image related to a human face and may be used for authentication. For example, the face brushing image includes at least a portion of a human face (e.g., eyes, nose, mouth). In some embodiments, the face brushing image may be obtained in real time by the face brushing machine or pre-stored in the face brushing machine. For example, when a user attempts to make a face brushing payment on a face brushing machine, the face brushing machine may capture an image of the user's face and send the image to the processing device. In some embodiments, the face brushing image may be a static image or a dynamic image. In some embodiments, the face brushing image may be a two-dimensional image, a three-dimensional image, a four-dimensional image, or the like. In some embodiments, the face brushing image may be a grayscale image or a color image.
At step 304, the processing device may determine candidate users (which may also be referred to as "candidate user accounts") based on the brushed-face images. In particular, this step may be performed by the candidate user determination module 204.
The candidate user may be a user account that may correspond to the current user. For example, the candidate user may be an account whose account related image (e.g., a historical brush image, an account registration image, or another account image associated with an account) has a similarity greater than a preset threshold with the brush image of the current user. In some embodiments, to determine the candidate user, the processing device may extract at least one of facial features or biometric features associated with the brushed face image, retrieve a candidate image associated with the brushed face image based on the at least one of facial features or biometric features, and determine the candidate user based on the candidate image.
In some embodiments, the face features may reflect features of the face parts and/or structural relationships between the parts. The facial features may include mouth features, eye features, nose features, chin features, and the like. The biometric features may reflect inherent biological characteristics of the human body. The biometric features may include iris features, pinna features, facial features, retinal features, and the like.
In some embodiments, multiple account-related images for multiple users may be stored in the storage device 150. For example, for a particular user, all historical face brushing images of the user over a predetermined period of time (e.g., within a year) may be stored in the storage device 150. For another example, also taking a specific user as an example, the storage device 150 may store a fixed number (e.g., 10, 15, or 50) of historical face brushing images selected according to the image quality (e.g., brightness, contrast, or sharpness) of the user's historical face brushing images. Accordingly, the processing device may extract facial features or biometric features of the multiple account-related images and determine image similarities between the multiple account-related images and the face brushing image based on the facial features or the biometric features. Further, the processing device may determine a candidate image related to the face brushing image based on the image similarities. For example, the processing device may determine the account-related image with the highest image similarity as the candidate image. For another example, the processing device may select an account-related image whose image similarity satisfies a preset condition (e.g., is greater than a preset threshold) as the candidate image. Further, the processing device may determine the candidate user based on the candidate image. For example, the processing device may determine the user account corresponding to the candidate image as the candidate user.
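By way of illustration only, the following Python sketch shows one way this retrieval step could be carried out: the face brushing image's feature vector is compared against each account's related images and the best-matching account is accepted only if the preset condition is satisfied. The cosine-similarity metric, the 0.8 threshold, and the function names are assumptions introduced here for illustration and are not limitations of the embodiments.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two facial/biometric feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def determine_candidate_user(brush_face_feature, account_gallery, threshold=0.8):
    """Return (candidate_user, best_score), or (None, best_score) if no account qualifies.

    account_gallery maps a user account to a list of feature vectors extracted
    from that account's related images (historical face brushing images,
    account registration image, etc.).
    """
    best_user, best_score = None, -1.0
    for user_id, features in account_gallery.items():
        # Keep the account's highest similarity to the current face brushing image.
        score = max(cosine_similarity(brush_face_feature, f) for f in features)
        if score > best_score:
            best_user, best_score = user_id, score
    # Accept the top account only if the preset condition is satisfied.
    return (best_user, best_score) if best_score > threshold else (None, best_score)
```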
At step 306, the processing device may obtain historical association features reflecting historical associations between the candidate user and the face brushing machine. In particular, this step may be performed by the historical association feature acquisition module 206.
In some embodiments, the historical association features may include a user characterization of the candidate user (also referred to as "user embedding") and a machine characterization of the face brushing machine (also referred to as "device embedding"). In some embodiments, the user characterization or the machine characterization may be in the form of a number, a formula, a vector, a matrix, or the like. In particular, the user characterization may reflect at least a user feature of the candidate user and machine features of the historical face brushing machines on which the candidate user has performed face brushing behaviors within a first preset time period (e.g., 1 month, 3 months, 6 months, 1 year). Similarly, the machine characterization may reflect at least a machine feature of the face brushing machine and user features of the historical users who have performed face brushing behaviors on the face brushing machine within a second preset time period (e.g., 1 month, 3 months, 6 months, 1 year). The first preset time period and the second preset time period may be the same or different. In some embodiments, the user features may include a representation of the user, information about the user's associated accounts, and the like; the machine features may include a representation of the machine, such as the location of the face brushing machine, the model of the face brushing machine, the system version of the face brushing machine, the frequency of use of the face brushing machine, the length of use of the face brushing machine, and the like.
In some embodiments, the user characterization may also reflect first historical face brushing features between the candidate user and each of the historical face brushing machines. Similarly, the machine characterization may also reflect second historical face brushing features between the face brushing machine and each of the historical users. In some embodiments, the first historical face brushing features or the second historical face brushing features include at least one of a historical face brushing count, a historical face brushing comparison score, or a historical face brushing time.
In some embodiments, the user characterization or the machine characterization may be determined by a graph neural network model. In particular, a graph is a data structure consisting of two parts: nodes and edges. Edges may be directed or undirected. For a node, its neighbor nodes refer to the nodes in the graph to which it is directly connected by an edge. Graph neural networks (GNNs) are a deep-learning-based approach to processing graph-domain information. Graph neural networks may include graph convolutional networks, graph attention networks, graph auto-encoders, graph generative networks, graph spatial-temporal networks, and the like. A graph neural network model is a model built based on a graph neural network approach. In the application scenario corresponding to face brushing verification, the nodes of the graph neural network model may be users or machines, and the edges may be the historical face brushing behaviors between the users and the machines. Correspondingly, the node features correspond to user features or machine features, and the edge features correspond to the historical face brushing features between a user and a machine.
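By way of a non-limiting sketch, the user-machine graph described above might be represented as follows; the class and field names are illustrative assumptions rather than part of the original disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class EdgeFeature:
    """Historical face brushing features between one user and one machine."""
    brush_count: int             # historical face brushing count
    avg_comparison_score: float  # historical face brushing comparison score
    last_brush_time: float       # historical face brushing time (timestamp)

@dataclass
class BrushFaceGraph:
    """Bipartite graph: user nodes and machine nodes joined by face brushing edges."""
    user_features: Dict[str, List[float]] = field(default_factory=dict)
    machine_features: Dict[str, List[float]] = field(default_factory=dict)
    # (user_id, machine_id) -> edge features of their historical face brushing behaviors
    edges: Dict[Tuple[str, str], EdgeFeature] = field(default_factory=dict)

    def user_neighbors(self, user_id: str) -> List[str]:
        """Machines on which this user has brushed a face, i.e., N(u)."""
        return [m for (u, m) in self.edges if u == user_id]

    def machine_neighbors(self, machine_id: str) -> List[str]:
        """Users who have brushed a face on this machine, i.e., N(i)."""
        return [u for (u, m) in self.edges if m == machine_id]
```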
In some embodiments, the user characterization may be determined by a first graph neural network model; the tool characterization may be determined by a second graph neural network model. In some embodiments, the first graph neural network model and the second graph neural network model may be two independent models or may be the same graph neural network model.
For example only, the processing device may determine the user characterization by the following equation (1):

h_u^{(k)} = \sigma\left( W_k \cdot \frac{1}{|N(u)|} \sum_{v \in N(u)} \left[ h_v^{(k-1)} \,\|\, e_{uv}^{(k-1)} \right] + B_k \cdot h_u^{(k-1)} \right)    (1)

where u represents the candidate user; k represents the k-th layer of the first graph neural network model; h_u^{(k)} represents the user characterization of candidate user u at layer k of the model; N(u) represents the set of neighbor nodes of candidate user u (i.e., the historical face brushing machines on which the candidate user has performed a face brushing behavior within the first preset time period); h_v^{(k-1)} represents the characterization of a neighbor node (or of the candidate user) at layer k-1 of the model; |N(u)| represents the number of neighbor nodes; e_{uv}^{(k-1)} represents the edge feature (i.e., the historical face brushing feature) between the candidate user and a neighbor node at layer k-1 of the model; \| denotes concatenation; W_k and B_k represent the model parameters of the k-th layer of the model; and \sigma represents an activation function (e.g., the sigmoid function).
Similarly, the processing device may determine the machine characterization in a manner similar to equation (1) above, which will not be described in detail herein.
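A numpy sketch of one aggregation layer in the spirit of equation (1) is given below. The concatenation of neighbor and edge features, the mean aggregation, and the sigmoid activation follow the definitions above, but the exact formulation, variable names, and dimensions are illustrative assumptions, not a definitive implementation of the model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gnn_layer(h_self, h_neighbors, edge_features, W_k, B_k):
    """One aggregation layer in the spirit of equation (1).

    h_self:        previous-layer characterization of the node itself, shape (d_node,)
    h_neighbors:   list of previous-layer neighbor characterizations, each (d_node,)
    edge_features: list of edge features aligned with h_neighbors, each (d_edge,)
    W_k:           parameters applied to the aggregated messages, shape (d_out, d_node + d_edge)
    B_k:           parameters applied to the node's own characterization, shape (d_out, d_node)
    """
    # Concatenate each neighbor characterization with the corresponding edge feature.
    messages = [np.concatenate([h_v, e_uv]) for h_v, e_uv in zip(h_neighbors, edge_features)]
    aggregated = np.mean(messages, axis=0)  # (1 / |N(u)|) * sum over neighbors
    return sigmoid(W_k @ aggregated + B_k @ h_self)

# Example: a candidate user with two historical face brushing machines as neighbors.
rng = np.random.default_rng(0)
d_node, d_edge, d_out = 8, 3, 8
h_u = rng.normal(size=d_node)
neighbors = [rng.normal(size=d_node) for _ in range(2)]
edges = [rng.normal(size=d_edge) for _ in range(2)]
W = rng.normal(size=(d_out, d_node + d_edge))
B = rng.normal(size=(d_out, d_node))
user_embedding = gnn_layer(h_u, neighbors, edges, W, B)  # user characterization at layer k
```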
At step 308, the processing device may determine, through the face brushing verification model, a degree of correlation between the face brushing image and the candidate user based at least on the historical association features. In particular, this step may be performed by the correlation determination module 208.
In some embodiments, the face brushing verification model may be a classification model (e.g., a two-classifier) connected to the first graph neural network model or the second graph neural network model. In some embodiments, the first graph neural network model or the second graph neural network model may be a sub-module in the face-brushing verification model, i.e. the face-brushing verification model comprises the first graph neural network model or the second graph neural network model. A more detailed description of the face brushing verification model may be found elsewhere in this specification (e.g., fig. 5 and its description).
In some embodiments, the processing device may process the historical association features via a brush face verification model and determine a relevance of the brush face image to the candidate user based on an output of the brush face verification model. The relevance may be a probability value between 0 and 1, and a higher probability value indicates that the brushing image is more relevant to the candidate user, i.e. the brushing image has a higher probability of corresponding to the candidate user.
In some embodiments, in step 306, the processing device may further obtain face brushing behavior features related to the face brushing image, and accordingly, the processing device may determine the degree of correlation between the face brushing image and the candidate user through the face brushing verification model based on the historical association features and the face brushing behavior features. In some embodiments, the face brushing behavior features related to the face brushing image include the image similarity between the face brushing image and the candidate image (i.e., a comparison score of the face brushing image), historical interaction features between the candidate user and the current machine (e.g., historical face brushing comparison scores of the candidate user on the face brushing machine, other historical interaction behaviors (e.g., historical code scanning behaviors) that the candidate user has performed on the face brushing machine), and the like. In some embodiments, the processing device may further determine a face brushing behavior characterization (which may also be referred to as "swipe embedding") based on the face brushing behavior features through a face brushing behavior characterization model. In some embodiments, the face brushing behavior characterization may be in the form of a number, a formula, a vector, a matrix, or the like.
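For illustration only, the face brushing behavior features mentioned above could be assembled into an input vector along the following lines; the specific fields are assumptions drawn from the examples just listed and are not an exhaustive or fixed set.

```python
import numpy as np

def brush_behavior_feature_vector(comparison_score,
                                  historical_brush_count_on_machine,
                                  historical_avg_comparison_score,
                                  historical_code_scan_count):
    """Assemble face brushing behavior features for one verification request.

    comparison_score is the image similarity between the face brushing image and
    the candidate image; the remaining fields summarize the candidate user's
    historical interactions with the current machine.
    """
    return np.array([
        float(comparison_score),
        float(historical_brush_count_on_machine),
        float(historical_avg_comparison_score),
        float(historical_code_scan_count),
    ])
```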
At step 310, the processing device may determine a brushing verification result for the brushing image based on the correlation. In particular, this step may be performed by the verification result determination module 210.
In some embodiments, the processing device may determine whether the degree of correlation is greater than a first threshold. If the correlation is greater than the first threshold, the processing device may determine that the face brushing verification result is pass (e.g., without entering a phone number verification). The first threshold may be a predetermined value, for example, 95%, 98%, 99%, etc.
In some embodiments, if the degree of correlation is less than or equal to the first threshold, the processing device may determine whether the degree of correlation is greater than a second threshold. If the correlation is greater than the second threshold, the processing device may determine that the face brushing verification result is that primary verification is required (e.g., through mobile phone tail number verification). The second threshold may be a predetermined value, for example, 90%, 92%, 94%, etc. The mobile phone tail number can be the last few digits, such as the last four digits, the last three digits and the like, of the mobile phone number bound by the user account number, the identity card or other valid certificates.
In some embodiments, if the correlation is less than or equal to the second threshold, the processing device may determine that the face brushing verification result is that secondary verification (e.g., full cell phone number verification) is required.
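A minimal sketch of this two-threshold decision logic follows; the threshold values match the examples given above, while the function name and the returned strings are illustrative assumptions.

```python
def brush_face_verification_result(correlation: float,
                                   first_threshold: float = 0.95,
                                   second_threshold: float = 0.90) -> str:
    """Map the correlation output by the face brushing verification model to a result."""
    if correlation > first_threshold:
        return "pass"                      # no additional verification required
    if correlation > second_threshold:
        return "primary_verification"      # e.g., verify the mobile phone tail number
    return "secondary_verification"        # e.g., verify the complete mobile phone number
```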
In some embodiments, the processing device may also instruct the face brushing machine to display the face brushing verification result via its display screen. In some embodiments, the user may perform operations such as touching and clicking on the display screen of the face brushing machine to complete verification of the mobile phone tail number or the complete mobile phone number.
In some embodiments, the face brushing verification result may be displayed in at least one of text, image, sound, or video. For example, a first row may display "Please enter the last four digits of your mobile phone number to complete verification: ____", a second row may display "Confirm" and "Cancel", and the user may complete the verification by clicking "Confirm" or cancel the verification by clicking "Cancel".
It should be noted that the above description of the process 300 is for illustration and description only and is not intended to limit the scope of the present disclosure. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description. For example, steps 308 and 310 may be combined into one step in which the processing device may determine a correlation of the brush face image with the candidate user based on a brush face verification model and determine a brush face verification result for the brush face image based on the correlation. For another example, the processing device may send the verification result to the user terminal 140 or the storage device 150.
FIG. 4 is an exemplary schematic diagram of a face brushing verification model training process, shown in accordance with some embodiments of the present description. As shown in fig. 4, the process 400 may include the following steps. The flow 400 may be performed by a processing device, such as the processing device 112 or a model training apparatus (not shown).
At step 410, the processing device may determine at least two training samples. In some embodiments, the processing device may determine at least two training samples based on the historical brushing behavior, each of the at least two training samples may correspond to a historical brushing behavior. In some embodiments, the processing device may determine the at least two training samples by simulation or the like.
In some embodiments, each of the at least two training samples may include a sample user, a sample machine, a sample face brushing image, and a sample label. Taking a specific historical face brushing behavior as an example, the sample user refers to the user determined in that historical face brushing behavior (which may also be referred to as a "user account", similar to the "candidate user" described above), the sample machine refers to the face brushing machine on which that historical face brushing behavior occurred, the sample face brushing image refers to the historical face brushing image in that historical face brushing behavior, and the sample label indicates the sample type (e.g., positive sample or negative sample) of the sample. The at least two training samples include at least one positive sample and at least one negative sample; in a positive sample, the sample face brushing image matches the sample user and the sample label is 1 (or +1), while in a negative sample, the sample face brushing image does not match the sample user and the sample label is 0 (or -1).
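As a rough sketch of how such training samples might be assembled from historical face brushing behaviors, consider the following; the record fields and the `matched` flag are assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class TrainingSample:
    sample_user: str       # user account determined in the historical face brushing behavior
    sample_machine: str    # face brushing machine on which the behavior occurred
    sample_image: str      # reference to the historical face brushing image
    label: int             # 1 for a positive sample, 0 for a negative sample

def build_training_samples(historical_behaviors):
    """Build one training sample per historical face brushing behavior.

    Each record is assumed to carry a user_id, a machine_id, an image reference,
    and a `matched` flag indicating whether the face brushing image actually
    belongs to the determined user account.
    """
    return [
        TrainingSample(
            sample_user=record["user_id"],
            sample_machine=record["machine_id"],
            sample_image=record["image"],
            label=1 if record["matched"] else 0,
        )
        for record in historical_behaviors
    ]
```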
At step 430, the processing device obtains an initial face brushing verification model.
In some embodiments, the initial face brushing verification model may include a plurality of initial model parameters, where the initial model parameters may be system defaults or set according to specific needs. For example, the initial model parameters (e.g., model size, model complexity) may relate to a preset model application scenario. In some embodiments, the initial model parameters may include the number of model layers, the number of nodes, the number of neurons, and the like.
Step 450, for each of the at least two training samples, the processing device may extract a sample user characteristic of the sample user, an implement characteristic of a sample historical face brushing implement on which the sample user has taken a face brushing action within a third preset time period, a first sample historical face brushing characteristic between the sample user and each of the sample historical face brushing implements, a sample implement characteristic of the sample implement, a user characteristic of the sample historical user having taken a face brushing action on the sample implement within a fourth preset time period, a second sample historical face brushing characteristic between the sample implement and each of the sample historical users, and a sample face brushing action characteristic associated with the sample face brushing image, respectively.
As described in connection with step 306, taking a specific sample as an example, the sample user features of the sample user may include a representation of the sample user, information about the sample user's associated accounts, and the like. The sample machine features of the sample machine may include a representation of the machine, such as the location of the sample machine, the model of the sample machine, the system version of the sample machine, the frequency of use of the sample machine, the length of use of the sample machine, and so forth. The first or second sample historical face brushing features may include historical face brushing counts, historical face brushing comparison scores (e.g., average score, highest score, lowest score), and/or the like. The third preset time period or the fourth preset time period may be preset manually or by a model, for example, 1 month, 3 months, 6 months, 1 year, etc. The third preset time period and the fourth preset time period may be the same or different. As described in connection with step 306, the first preset time period, the second preset time period, the third preset time period, and the fourth preset time period may be the same as or different from each other. The sample face brushing behavior features related to the sample face brushing image include a face brushing comparison score of the corresponding face brushing behavior (i.e., the image similarity between the sample face brushing image and the sample candidate image), historical interaction features between the sample user and the sample machine (e.g., historical face brushing comparison scores of the sample user on the sample machine, other historical interaction behaviors (e.g., historical code scanning behaviors) that the sample user performed on the sample machine), and so on.
In some embodiments, the manner of extracting features may include principal component analysis, histogram of oriented gradients feature extraction algorithm, local binary pattern feature extraction algorithm, HAAR feature extraction algorithm, and the like.
In some embodiments, the initial face brushing verification model may include a plurality of sub-modules, each for processing a different one of the above-described sample features. For example, the initial face brushing verification model may include a first initial graph neural network model, a second initial graph neural network model, an initial face brushing behavior characterization model, an initial fusion layer, an initial classification layer, and so on. Specifically, as described in connection with step 306, the first initial graph neural network model may be used to determine a user characterization of the sample user, the second initial graph neural network model may be used to determine a machine characterization of the sample machine, the initial face brushing behavior characterization model may be used to determine a sample face brushing behavior characterization from the sample face brushing behavior features, the initial fusion layer may be used to fuse the above characterizations to determine a fused characterization, and the initial classification layer may be used to output a sample verification result based on the fused characterization, and the like. In some embodiments, the various characterizations described above may be in the form of numbers, formulas, vectors, matrices, and the like. In some embodiments, some of the modules in the initial face brushing verification model may be pre-trained or semi-trained. For example, the first initial graph neural network model or the second initial graph neural network model may be pre-trained, in which case, as described in step 306, the first initial graph neural network model is the first graph neural network model, and the second initial graph neural network model is the second graph neural network model. For another example, the initial face brushing behavior characterization model may also be trained in advance, in which case the initial face brushing behavior characterization model is the face brushing behavior characterization model. In some embodiments, each module in the initial face brushing verification model may be initialized, and accordingly, during model training, its initial parameters may be iteratively adjusted until a training completion condition is satisfied. A more detailed description of the face brushing verification model may be found elsewhere in this specification (e.g., fig. 5 and its description).
At step 470, for each of the at least two training samples, the processing device may determine a sample correlation between the sample face brushing image and the sample user according to the above-described features through the initial face brushing verification model. In some embodiments, the sample correlation may be expressed using numerical values, percentages, text, and the like. In some embodiments, the processing device may determine the sample correlation based on the output of the initial face brushing verification model (e.g., of its classification layer).
In step 490, the processing device may iteratively adjust parameters of the initial face brushing verification model based on at least two sample correlations and at least two sample labels of at least two training samples until a preset condition is satisfied, thereby determining the face brushing verification model.
In some embodiments, the processing device may determine whether the number of iterations reaches a number threshold (e.g., 5, 10). If the iteration number has reached the number threshold, the processing device may determine that the preset condition has been met, and the model training is ended.
In some embodiments, the processing device may determine a model error and determine whether the error is less than a preset error threshold. If the error is less than the preset error threshold, the processing device may determine that the preset condition has been met, and the model training is finished. The error may be a difference in sample correlation and sample label or other parameter that may reflect the difference.
In some embodiments, the processing device may determine whether the value of the loss function is less than a preset loss threshold. If the value of the loss function is less than the preset loss threshold, the processing device may determine that the preset condition has been met, and the model training is ended. In some embodiments, the loss function may include a hinge loss function, a cross-entropy loss function, or the like. By way of example only, the loss function is shown in equation (2) below:
L = -\sum_{ua} \left[ y_{ua} \log \sigma(z_u) + \left(1 - y_{ua}\right) \log\left(1 - \sigma(z_u)\right) \right]    (2)

where L represents the loss function, ua represents a sample, y_{ua} represents the sample label of the sample, z_u represents the fused characterization of the sample, and \sigma represents an activation function (e.g., the sigmoid function) applied by the classification layer to obtain the predicted correlation.
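For illustration, a binary cross-entropy of this form could be computed as follows; treating the classification-layer outputs as scalar logits is an assumption of this sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce_loss(logits: np.ndarray, labels: np.ndarray) -> float:
    """Binary cross-entropy over the predicted correlations of the training samples.

    logits are the classification-layer outputs computed from the fused
    characterizations z_u; labels are the sample labels y_ua (1 or 0).
    """
    p = sigmoid(logits)
    eps = 1e-12  # numerical safety for log
    return float(-np.mean(labels * np.log(p + eps) + (1 - labels) * np.log(1 - p + eps)))

# Example: three samples, two positive and one negative.
print(bce_loss(np.array([2.1, 0.3, -1.7]), np.array([1, 1, 0])))
```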
It should be noted that the above description related to the flow 400 is only for illustration and description, and does not limit the applicable scope of the present specification. Various modifications and changes to flow 400 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description. For example, the order of steps 410 and 430 may be reversed, with the initial face brushing verification model being obtained first, followed by the determination of at least two training samples. For another example, the brushing verification model may be periodically updated based on newly acquired training samples (e.g., training samples determined based on newly acquired historical brushing behavior information).
Fig. 5 is an exemplary schematic diagram of a face brushing verification model, shown in accordance with some embodiments of the present description. As shown in fig. 5, the face brushing verification model 500 may include a first graph neural network model, a second graph neural network model, a face brushing behavior characterization model, a fusion layer, and a classification layer. The first graph neural network model and the second graph neural network model may be independent models or the same model. As shown, U represents a user and I represents a machine. As described elsewhere in this specification, in the graph neural network model, a node corresponds to a user or a machine, and a user and a machine may be connected by an edge. When determining a user characterization of a user (e.g., the "candidate user" during online use, the "sample user" during model training) or a machine characterization of a face brushing machine (e.g., the "face brushing machine" during online use, the "sample machine" during model training), not only node features (e.g., user features, machine features) but also edge features (e.g., historical face brushing features corresponding to the historical face brushing behaviors that occurred between a connected user and machine) are introduced, thereby improving the accuracy of the face brushing verification model.
A first graph neural network model (which may also be referred to as a "user-based graph model") may be used to determine a user characterization (which may also be referred to as a "user embedding") of a user. Here, the "user" may be a "candidate user" in the online use process, or may be a "sample user" in the model training process, and for convenience, the "candidate user" is described below as an example.
In some embodiments, based on the first graph neural network model, the processing device may identify at least one neighbor node of the candidate user, where the at least one neighbor node is a historical face brushing machine on which the candidate user has performed a face brushing behavior within a first preset time period (e.g., 1 month, 3 months, 6 months, 1 year); accordingly, the first historical face brushing features include the historical face brushing features between the candidate user and each of the at least one neighbor node. For example, taking a specific neighbor node (i.e., a specific historical face brushing machine) as an example, the first historical face brushing features described herein may include the historical face brushing count, the historical face brushing comparison scores, the historical face brushing times, and the like of the face brushing behaviors performed by the candidate user on that machine. The processing device may then determine the neighbor features of the at least one neighbor node (i.e., the machine features of the historical face brushing machine). Further, the processing device may determine the user characterization of the candidate user by fusing the user features, the neighbor features, and the first historical face brushing features. During the fusion process, the processing device may assign different weights to the user features, the neighbor features, and the first historical face brushing features. The weights may be system default values, or may be adjusted according to a specific scenario. In particular, the processing device may determine the user characterization by equation (1) as described in step 306.
A second graph neural network model (which may also be referred to as a "device-based graph model") may be used to determine a tool characterization (which may also be referred to as a "device embedding") of a face brushing tool. The "face brushing tool" here may be the face brushing tool during online use or a "sample tool" during model training; for convenience, the "face brushing tool" is used as the example below.
In some embodiments, based on the second graph neural network model, the processing device may identify at least one neighbor node of the face brushing tool, where the at least one neighbor node is a historical user who has performed face brushing behavior on the face brushing tool within a second preset time period (e.g., 1 month, 3 months, 6 months, 1 year); accordingly, the second historical face brushing features include historical face brushing features between the face brushing tool and each of the at least one neighbor node. For example, taking a specific neighbor node (i.e., a specific historical user) as an example, the second historical face brushing features may include the number of historical face brushings, historical face brushing comparison scores, historical face brushing times, and the like of the face brushing behaviors that historical user has performed on the face brushing tool. The processing device may then determine neighbor features of the at least one neighbor node (i.e., user features of the historical user). Further, the processing device may determine the tool characterization of the face brushing tool by fusing the tool features, the neighbor features, and the second historical face brushing features. During the fusion, the processing device may assign different weights to the tool features, the neighbor features, and the second historical face brushing features. The weights may be system defaults or may be adjusted for a specific scenario. In some embodiments, the formula for determining the tool characterization may be similar to formula (1) and is not repeated here.
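Because the tool side mirrors the user side, the hypothetical node_embedding sketch above could simply be reused with the roles swapped, for example as below; the arrays are random stand-ins rather than real data, and the variable names are illustrative only.

```python
import numpy as np

# Reusing the hypothetical node_embedding() sketch above with the roles swapped:
# here the node is the face brushing tool and the neighbors are its historical users.
rng = np.random.default_rng(0)
tool_features = rng.normal(size=8)                  # (d,)  features of the face brushing tool
historical_user_features = rng.normal(size=(5, 8))  # (k, d) features of 5 historical users
second_edge_features = rng.normal(size=(5, 8))      # (k, d) second historical face brushing features

tool_characterization = node_embedding(tool_features,
                                       historical_user_features,
                                       second_edge_features)
```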
The face brushing behavior characterization model may be used to determine a face brushing behavior characterization (which may also be referred to as a "behavior embedding") from the face brushing behavior features. In some embodiments, the face brushing behavior characterization model may be a multi-layer perceptron, also known as an artificial neural network, which may include an input layer, one or more hidden layers, an output layer, and the like. The layers of the multi-layer perceptron may be fully connected.
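As one concrete but purely illustrative possibility, such a multi-layer perceptron could be sketched as below; the layer sizes, the ReLU activation, and the class name are choices of this sketch, not specified in the description.

```python
import numpy as np

class BehaviorMLP:
    """Toy fully connected multi-layer perceptron: input -> hidden -> output."""

    def __init__(self, in_dim, hidden_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(in_dim, hidden_dim))
        self.b1 = np.zeros(hidden_dim)
        self.w2 = rng.normal(scale=0.1, size=(hidden_dim, out_dim))
        self.b2 = np.zeros(out_dim)

    def __call__(self, behavior_features):
        hidden = np.maximum(0.0, behavior_features @ self.w1 + self.b1)  # ReLU hidden layer
        return hidden @ self.w2 + self.b2                                # behavior embedding

# Example (hypothetical feature vector, e.g. comparison score plus interaction features):
# mlp = BehaviorMLP(in_dim=4, hidden_dim=16, out_dim=8)
# behavior_emb = mlp(np.array([0.93, 2.0, 1.0, 0.0]))
```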
The fusion layer may be used to determine a fused representation (e.g., of a training sample) by fusing the user characterization, the tool characterization, and the face brushing behavior characterization. In some embodiments, the fusion layer may be a multi-layer perceptron.
The classification layer may be used to determine the degree of correlation based on the fused representation. In some embodiments, the classification layer may be a classifier. In some embodiments, the classifier may include a logistic regression classifier, a decision tree, naïve Bayes, a neural network, or the like.
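A possible realization of the fusion layer followed by the classification layer, here a small fusion MLP feeding a logistic-regression-style output, is sketched below; the sigmoid output, layer sizes, and names are assumptions of the sketch rather than requirements of the description.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FusionAndClassify:
    """Hypothetical fusion layer (one-hidden-layer MLP) plus a binary classification layer."""

    def __init__(self, embed_dim, fused_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w_fuse = rng.normal(scale=0.1, size=(3 * embed_dim, fused_dim))
        self.b_fuse = np.zeros(fused_dim)
        self.w_cls = rng.normal(scale=0.1, size=fused_dim)
        self.b_cls = 0.0

    def __call__(self, user_emb, tool_emb, behavior_emb):
        x = np.concatenate([user_emb, tool_emb, behavior_emb])
        fused = np.tanh(x @ self.w_fuse + self.b_fuse)    # fused representation
        return sigmoid(fused @ self.w_cls + self.b_cls)   # degree of correlation in (0, 1)
```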
In some embodiments, a default node (which may also be referred to as a "dummy node") may be set in the model; the default node is not associated with any other node. In some cases, if the features of a node cannot be obtained, for example, if the tool features are missing because the tool's logged information cannot be obtained, that node may be set as the default node so that the overall use of the model is not affected.
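One simple way to realize such a default node, assuming a fixed fallback feature vector, is sketched below; the vector length and the function name are arbitrary choices made for illustration.

```python
import numpy as np

DEFAULT_NODE = np.zeros(8)  # hypothetical "dummy node" feature vector; dimension chosen arbitrarily

def node_features_or_default(features):
    """Fall back to the dummy node when a node's features cannot be obtained (None or NaN)."""
    if features is None or np.isnan(np.asarray(features, dtype=float)).any():
        return DEFAULT_NODE  # dummy node is not connected to other nodes; keeps the model usable
    return np.asarray(features, dtype=float)
```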
In some embodiments, the face brushing verification model that introduces the graph neural network has superior performance in both the overall-recall and edge-case-recall dimensions. Here, edge cases may include the offline-first case and the cross-store non-first case. The offline-first case may refer to a user performing face brushing verification on a face brushing tool for the first time in an offline scene (e.g., a supermarket). The cross-store non-first case may refer to a user appearing for the first time in a particular offline scene (e.g., a supermarket) for face brushing verification while not using a face brushing tool for the first time.
Specifically, the performance comparison of the face brushing verification model described in this specification with other verification schemes can be seen in table 1 below:
TABLE 1 Performance comparison of the face brushing verification model incorporating the graph neural network with other verification schemes
[Table 1 is provided as an image (BDA0002421146230000221) in the original publication; its contents are not reproduced in the text.]
It should be appreciated that the face brushing verification model and its modules shown in Fig. 5 may be implemented in a variety of ways. For example, in some embodiments, the face brushing verification model and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a disk, CD- or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
The beneficial effects that may be brought by the embodiments of the present specification include, but are not limited to: (1) when face brushing verification is performed, historical association features that reflect the historical association between a user and a tool are introduced, improving verification accuracy; (2) the historical association features are determined through graph neural network models, so the historical association between the user and the tool can be determined more accurately; (3) when the historical association features are determined through the graph neural network models, node features (i.e., user features and tool features) and edge features (i.e., historical face brushing features between the user and the tool) are considered at the same time, further improving verification accuracy. It should be noted that different embodiments may produce different advantages; in different embodiments, any one or a combination of the above advantages, or any other advantage, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present specification may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereof. Accordingly, aspects of this specification may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present specification may be embodied as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of this specification may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python, a conventional procedural programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP, a dynamic programming language such as Python, Ruby, or Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components and attributes; it should be understood that such numbers used in describing the embodiments are, in some instances, modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that a variation of ±20% is allowed. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending on the desired properties of the individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and employ ordinary rounding. Although the numerical ranges and parameters used to define the broad scope in some embodiments of this specification are approximations, in specific examples such numerical values are set as precisely as practicable.
For each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this specification, the entire contents are hereby incorporated by reference, except for any application history document that is inconsistent with or conflicts with the contents of this specification, and any document (currently or later appended to this specification) that limits the broadest scope of the claims of this specification. It should be noted that if the description, definition, and/or use of a term in the materials accompanying this specification is inconsistent with or contrary to what is stated in this specification, the description, definition, and/or use of the term in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (16)

1. A face brushing verification method comprising:
acquiring a face brushing image from a face brushing tool;
determining candidate users based on the face brushing image;
acquiring historical association features reflecting a historical association between the candidate user and the face brushing tool, wherein the historical association features comprise a user characterization of the candidate user and a tool characterization of the face brushing tool, and the user characterization at least reflects user features of the candidate user and tool features of historical face brushing tools on which the candidate user has performed face brushing behavior within a first preset time period;
determining, by a face brushing verification model, a degree of correlation of the face brushing image with the candidate user based at least on the historical correlation features; and
determining a face brushing verification result of the face brushing image based on the degree of correlation.
2. The method of claim 1, wherein
the tool characterization at least reflects tool features of the face brushing tool and user features of historical users who have performed face brushing behavior on the face brushing tool within a second preset time period.
3. The method of claim 2, wherein
the user characterization further reflects first historical face brushing features between the candidate user and each of the historical face brushing tools; and
the tool characterization further reflects second historical face brushing features between the face brushing tool and each of the historical users.
4. The method of claim 3, wherein the first historical face brushing features or the second historical face brushing features comprise at least one of a number of historical face brushings, a historical face brushing comparison score, or a historical face brushing time.
5. The method of claim 1, wherein
the user characterization is determined by a first graph neural network model; and
the tool characterization is determined by a second graph neural network model.
6. The method of claim 1, further comprising:
acquiring face brushing behavior features related to the face brushing image;
wherein the determining, by a face brushing verification model, a degree of correlation of the face brushing image with the candidate user based at least on the historical association features comprises:
determining, by the face brushing verification model, a degree of correlation between the face brushing image and the candidate user based on the historical association features and the face brushing behavior features.
7. The method of claim 6, wherein the face brushing behavior features comprise a comparison score of the face brushing image or historical interaction features between the candidate user and the face brushing tool.
8. A face brushing verification system comprising:
an acquisition module, configured to acquire a face brushing image from a face brushing tool;
a candidate user determination module for determining candidate users based on the face brushing image;
a historical association feature acquisition module, configured to acquire historical association features reflecting a historical association between the candidate user and the face brushing tool, wherein the historical association features comprise a user characterization of the candidate user and a tool characterization of the face brushing tool, and the user characterization at least reflects user features of the candidate user and tool features of historical face brushing tools on which the candidate user has performed face brushing behavior within a first preset time period;
a correlation determination module, configured to determine, through a face brushing verification model, a degree of correlation of the face brushing image with the candidate user based at least on the historical association features; and
a verification result determination module, configured to determine a face brushing verification result of the face brushing image based on the degree of correlation.
9. The system of claim 8, wherein
the tool characterization at least reflects tool features of the face brushing tool and user features of historical users who have performed face brushing behavior on the face brushing tool within a second preset time period.
10. The system of claim 9, wherein
the user characterization further reflects first historical face brushing features between the candidate user and each of the historical face brushing tools; and
the tool characterization further reflects second historical face brushing features between the face brushing tool and each of the historical users.
11. The system of claim 10, wherein the first historical face brushing features or the second historical face brushing features comprise at least one of a number of historical face brushings, a historical face brushing comparison score, or a historical face brushing time.
12. The system of claim 8, wherein
the user characterization is determined by a first graph neural network model; and
the tool characterization is determined by a second graph neural network model.
13. The system of claim 8, further comprising:
a face brushing behavior feature acquisition module, configured to acquire face brushing behavior features related to the face brushing image;
wherein the correlation determination module is further configured to:
determining, by the face brushing verification model, a degree of correlation between the face brushing image and the candidate user based on the historical association features and the face brushing behavior features.
14. The system of claim 13, wherein the face brushing behavior features comprise a comparison score of the face brushing image or historical interaction features between the candidate user and the face brushing tool.
15. A face brushing verification device, comprising a processor configured to execute the face brushing verification method according to any one of claims 1 to 7.
16. A computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform a face brushing verification method according to any one of claims 1 to 7.
CN202010206013.XA 2020-03-23 2020-03-23 Face brushing verification method and system Active CN111382410B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010206013.XA CN111382410B (en) 2020-03-23 2020-03-23 Face brushing verification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010206013.XA CN111382410B (en) 2020-03-23 2020-03-23 Face brushing verification method and system

Publications (2)

Publication Number Publication Date
CN111382410A CN111382410A (en) 2020-07-07
CN111382410B true CN111382410B (en) 2022-04-29

Family

ID=71221749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010206013.XA Active CN111382410B (en) 2020-03-23 2020-03-23 Face brushing verification method and system

Country Status (1)

Country Link
CN (1) CN111382410B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113011339A (en) * 2021-03-19 2021-06-22 支付宝(杭州)信息技术有限公司 User identity verification method and device and electronic equipment
CN115578100A (en) * 2021-06-21 2023-01-06 腾讯科技(深圳)有限公司 Payment verification mode identification method and device, electronic equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2825635A1 (en) * 2012-08-28 2014-02-28 Solink Corporation Transaction verification system
CN109300267A (en) * 2018-10-31 2019-02-01 杭州有赞科技有限公司 The cash method and system of member system based on recognition of face
CN109658572A (en) * 2018-12-21 2019-04-19 上海商汤智能科技有限公司 Image processing method and device, electronic equipment and storage medium
CN109660509A (en) * 2018-10-29 2019-04-19 北京旷视科技有限公司 Login method, device, system and storage medium based on recognition of face
CN110223080A (en) * 2019-06-05 2019-09-10 北京三快在线科技有限公司 The determination method and device of the target account of brush face payment platform
CN110991433A (en) * 2020-03-04 2020-04-10 支付宝(杭州)信息技术有限公司 Face recognition method, device, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Simultaneous restoration and recognition of blurred faces (模糊人脸同步恢复与识别); Li Jun; China Master's Theses Full-text Database; 2018-04-15; full text *

Also Published As

Publication number Publication date
CN111382410A (en) 2020-07-07


Legal Events

Code	Description
PB01	Publication
SE01	Entry into force of request for substantive examination
GR01	Patent grant