CN111340508A - Payment operation control method and system - Google Patents

Payment operation control method and system

Info

Publication number
CN111340508A
CN111340508A (application number CN202010440505.5A)
Authority
CN
China
Prior art keywords
brain wave data, target, identity, information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010440505.5A
Other languages
Chinese (zh)
Inventor
张熠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AlipayCom Co ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202010440505.5A
Publication of CN111340508A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00: Payment architectures, schemes or protocols
    • G06Q 20/38: Payment protocols; Details thereof
    • G06Q 20/40: Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q 20/401: Transaction verification
    • G06Q 20/4014: Identity check for transactions
    • G06Q 20/40145: Biometric identity checks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/15: Biometric patterns based on physiological signals, e.g. heartbeat, blood flow

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Security & Cryptography (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this specification disclose a payment operation control method and system. Target brain wave data of a target object to be identified is acquired in a target scene; the identity of the target object is then determined based on the target brain wave data and a machine learning model; finally, whether to generate a payment operation request is decided based on the determination result. The payment operation request includes account information corresponding to the identity of the target object to be identified.

Description

Payment operation control method and system
Technical Field
Embodiments of this specification relate to the technical field of transactions, and in particular to a payment operation control method and system.
Background
With the rise of mobile payment technology, more and more users are accustomed to paying with mobile terminals such as mobile phones, rather than carrying cash, bank cards, and the like.
However, current mobile payment technology depends on the user having a usable mobile terminal: once the terminal is not carried or becomes unavailable (e.g., through equipment failure or a drained battery), the payment operation cannot be performed. There is therefore a need for a more convenient and reliable payment scheme that improves the user's operation experience.
Disclosure of Invention
One embodiment of this specification provides a payment operation control method, including: acquiring target brain wave data of a target object to be identified in a target scene; determining the identity of the target object to be identified based on the target brain wave data and a machine learning model; and deciding whether to generate a payment operation request based on the determination result, where the payment operation request includes account information corresponding to the identity of the target object to be identified.
One embodiment of this specification provides a payment operation control system, including: an acquisition module configured to acquire target brain wave data of a target object to be identified in a target scene; a determination module configured to determine the identity of the target object to be identified based on the target brain wave data and a machine learning model; and a generation module configured to decide whether to generate a payment operation request based on the determination result, where the payment operation request includes account information corresponding to the identity of the target object to be identified.
One embodiment of this specification provides a payment operation control device including at least one processor and at least one memory; the at least one memory is configured to store computer instructions, and the at least one processor is configured to execute at least some of the computer instructions to implement the payment operation control method described above.
Drawings
The present description is further explained by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in them, like numerals indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario, according to some embodiments of the present description;
FIG. 2 is an exemplary flow chart of a method of controlling payment operations shown in accordance with some embodiments of the present description;
FIG. 3 is an exemplary flow diagram of a method of training a machine learning model, according to some embodiments of the present description;
FIG. 4 is a block diagram of a control system for payment operations, according to some embodiments of the present description.
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in describing the embodiments are briefly introduced below. The drawings described below are clearly only examples or embodiments of the present description, and a person skilled in the art can apply the present description to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; those steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, various steps may be processed in reverse order or simultaneously, other operations may be added to the flows, or one or more steps may be removed from them.
In some embodiments, offline payment can be performed by code-scanning payment, face-recognition payment, and the like, but these payment methods all have certain shortcomings at present. For example, code-scanning payment depends on a mobile terminal such as a mobile phone; once the user's phone fails or is not at hand, the payment operation cannot be performed. As another example, with face-recognition payment, some users worry about portrait leakage or portrait theft and are therefore reluctant to use the face-recognition payment function.
In some embodiments, brain wave data may be used for authentication based on the following principle: the acquired brain wave data is compared with brain wave data in a database, and identity recognition is then performed based on the comparison result. However, a person's brain waves change with brain activity, such as mood changes, and each person's mood may differ from moment to moment; in other words, a person may generate different brain wave signals at different moments, so it is difficult to achieve accurate identification by database comparison when the database is incomplete. Moreover, even if the database recorded brain wave data for every brain activity of every person, the identification process would be extremely long because of the enormous amount of data to be processed. From this perspective as well, identification based on database comparison is difficult to realize.
The embodiments of this specification provide a method and system for controlling payment operations based on a machine learning model: the user's brain wave data is processed by the machine learning model to accurately identify the user, and the payment operation is completed based on that identity, improving both the accuracy of the authentication process and the user's payment experience.
The following describes a method and a system for controlling a payment operation provided in an embodiment of the present specification in detail with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an application scenario shown in accordance with some embodiments of the present description.
As shown in fig. 1, in some embodiments, the control system 100 for payment operations may include a server 110, a brain wave acquisition device 120, a network 130, a storage device 140, and at least one terminal 150. Wherein the various components may be interconnected by a network 130. For example, brain wave acquisition device 120 may be connected to or in communication with server 110 via network 130, and server 110 may be connected to or in communication with storage device 140 via network 130.
In some embodiments, the brain wave collecting device 120 may be configured to collect brain wave data (i.e., an overall reflection of electrophysiological activities of brain nerve cells) of a user (e.g., an executor of a payment operation), and feed the brain wave data back to the server 110 for user authentication, thereby implementing account login and payment operation corresponding to goods or services. In some embodiments, the brain wave data collected by the brain wave collecting device 120 can be directly transmitted to the server 110 through the network 130. Optionally, in other embodiments, the brain wave data may also be transmitted to the at least one terminal 150, and then transmitted to the server 110 by the at least one terminal 150.
In some embodiments, brain wave acquisition device 120 may be a head-mounted device; it should be understood that other types of brain wave acquisition devices may be employed in other embodiments. In some embodiments, the brain wave acquisition device 120 may further include a visual component, such as VR (virtual reality) glasses, for displaying merchandise information, payment information, and the like to the user, or for instructing the user to perform corresponding brain activities.
The server 110 may process the brain wave data directly sent by the brain wave collecting device 120 or indirectly sent through at least one terminal 150 to obtain the identity information and the account information corresponding to the brain wave data, and then complete the subsequent payment operation based on the identity information and the account information. For example, after obtaining the identity information and account information of the user, the server 110 may further determine whether the account satisfies the payment condition. In some embodiments, the server 110 may send the payment result (e.g., payment success or payment failure) to at least one terminal 150.
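The server-side decision just described (map the recognized identity to its account information, check whether the account satisfies the payment condition, and report a result) can be sketched as follows. This is a minimal illustration only: the patent does not specify the payment condition, so the function name, the dictionary-based account store, and the simple balance check are all assumptions.

```python
def handle_payment(identity, accounts, amount):
    """Illustrative server-side flow: identity -> account -> payment condition.
    `accounts` maps a recognized identity to a hypothetical account record."""
    if identity is None:
        return "identification failed"          # no payment request is generated
    account = accounts.get(identity)
    if account is None or account["balance"] < amount:
        return "payment failure"                # payment condition not satisfied
    account["balance"] -= amount                # assumed condition: sufficient balance
    return "payment success"

accounts = {"user_a": {"balance": 100.0}}
print(handle_payment("user_a", accounts, 30.0))   # → payment success
print(handle_payment("user_a", accounts, 500.0))  # → payment failure
```

In the system of FIG. 1, the resulting payment status would then be sent to at least one terminal 150.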
In some embodiments, the server 110 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the server 110 may be local or remote. For example, server 110 may access information and/or data from brain wave acquisition device 120, storage device 140, and/or at least one terminal 150 via network 130. As another example, server 110 may be directly connected to brain wave acquisition device 120, storage device 140, and/or at least one terminal 150 to access information and/or data. In some embodiments, the server 110 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
In some embodiments, at least one terminal 150 may be communicatively coupled to at least one of brain wave acquisition device 120, server 110, and storage device 140. For example, in some embodiments, the brain wave acquisition device 120 may include a visual component (e.g., VR (virtual reality) glasses), and the at least one terminal 150 may send an instruction to the brain wave acquisition device 120 according to the information to be paid, so that the brain wave acquisition device 120 presents the corresponding scene to the user through the visual component and then captures the brain wave data generated while the user performs the brain activity indicated by the scene. As another example, in some embodiments, at least one terminal 150 may receive the user's brain wave data collected by the brain wave acquisition device 120 and then send it to the server 110. As yet another example, in some embodiments, at least one terminal 150 may also receive payment result information processed by the server 110.
In some embodiments, at least one terminal 150 may include a mobile device 150-1, a tablet computer 150-2, a laptop computer 150-3, a desktop computer 150-4, and the like, or any combination thereof. The mobile device 150-1 may comprise, among other things, a mobile phone, a Personal Digital Assistant (PDA), etc., or any combination thereof. In some embodiments, at least one terminal 150 may include an input device, an output device, and the like. The input device may include numeric keys and other operation keys for inputting information to be paid corresponding to goods or services. The input device may be selected from keyboard input, touch screen (e.g., with tactile or haptic feedback) input, voice input, image input, or any other similar input mechanism. Input information received via the input device may be transmitted to the server 110 and/or brain wave acquisition device 120 via the network 130 for further operations. Other types of input devices may include cursor control devices, such as a mouse or the like. The output device may include a display, speakers, printer, etc., or any combination thereof, for outputting information to be paid for and/or information about goods or services to be paid for.
Storage device 140 may store data, instructions, and/or any other information. For example, in some embodiments, storage device 140 may store brain wave data acquired by brain wave acquisition device 120. In some embodiments, the storage device 140 may store pre-registered user identity information, account information, and historical brainwave data matching the user identity, and the server 110 may authenticate the identity of the user based on the historical brainwave data and the brainwave data collected by the brainwave collection device 120. In some embodiments, storage device 140 may store data and/or instructions that server 110 uses to perform or use to perform the exemplary methods described in this specification. In some embodiments, storage device 140 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. Exemplary mass storage devices may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable memory may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read and write memories can include Random Access Memory (RAM). In some embodiments, the storage device 140 may be implemented on a cloud platform.
In some embodiments, a storage device 140 may be connected to the network 130 to communicate with at least one other component in the system 100 (e.g., the server 110, the brain wave acquisition device 120, the at least one terminal 150). At least one component in system 100 may access data or instructions stored in storage device 140 via network 130. In some embodiments, the storage device 140 may be part of the server 110.
Network 130 may include any suitable network capable of facilitating information and/or data exchange for system 100. In some embodiments, at least one component of system 100 (e.g., server 110, brain wave acquisition device 120, storage device 140, at least one terminal 150) may exchange information and/or data with at least one other component in system 100 via network 130. For example, the server 110 may obtain brain wave data of the target object to be identified from the brain wave collecting device 120 through the network 130. Server 110 may obtain preconfigured databases or instructions from storage device 140 via network 130. For another example, the server 110 may feed back the payment result to at least one terminal 150 through the network 130. The network 130 may alternatively comprise a public network (e.g., the internet), a private network (e.g., a Local Area Network (LAN)), a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a frame relay network, a Virtual Private Network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. For example, the network 130 may include a wireline network, a fiber optic network, a telecommunications network, an intranet, a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, the like, or any combination thereof. In some embodiments, the network 130 may include at least one network access point. For example, network 130 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which at least one component of system 100 may connect to network 130 to exchange data and/or information.
It should be noted that the foregoing description is provided for illustrative purposes only, and is not intended to limit the scope of this specification. Many variations and modifications may be made by one of ordinary skill in the art in light of the teachings of this specification. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 140 may be a data storage device comprising a cloud computing platform, such as a public cloud, a private cloud, a community and hybrid cloud, and the like. However, such changes and modifications do not depart from the scope of the present specification.
Fig. 2 is an exemplary flow chart 200 of a method of controlling a payment operation, shown in some embodiments herein. The method can be applied to the system shown in fig. 1, and referring to fig. 2, the method includes:
step 210, obtaining target brain wave data of a target object to be identified in a target scene.
The target object to be identified may denote the object from which the brain wave acquisition device collects data, or the object whose identity needs to be recognized, for example, the person who is to perform a payment operation (i.e., the payer). The target scene may denote the scene in which the target object prepares to pay. In some embodiments, the target scene may include a preset scene capable of guiding the brain activity of the target object to be identified so that its brain activity is in a payment-related state. In some embodiments, the preset scene may include displaying to the target object one or more of text, voice, or picture information related to payment. For example, the words "I am paying" may be shown to the target object, or "I am paying" may be played as audio, so that the target object's brain concentrates on the thought of paying. In some embodiments, the preset scene may further include giving an instruction that keeps the brain activity of the target object in a payment-related state, for example, an instruction to put the brain into a payment state.
The target brain wave data may represent brain wave signals acquired by the brain wave acquisition device when the target object to be identified is in the aforementioned target scene.
In some embodiments, the brain wave signal of the target object in the target scene may be acquired by the brain wave acquisition device, and the identity of the target object may be recognized based on that signal. For example, when user A needs to pay user B, user B may present the preset scene to user A through a terminal and instruct user A to perform the corresponding brain activity, for example, to silently recite "I want to pay." The brain wave signal of user A during this brain activity is then acquired by the brain wave acquisition device, yielding user A's target brain wave data in the target scene.
In some embodiments, one or more preset scenes may be presented to the target object to be identified manually, or through a terminal device. Specifically, in some embodiments, the preset scene may be presented automatically through a display device on the brain wave acquisition device (e.g., VR glasses), and the corresponding brain wave data may be acquired automatically after the scene is presented. As noted above, the brain wave acquisition device may be a head-mounted device (other types may be employed in other embodiments) and may include such a display device for presenting the preset scene, thereby guiding the target object's brain activity into a payment-related state.
In some embodiments, the target brain wave data may include raw data collected by a brain wave sensor, which is a time-domain signal. In some embodiments, the raw time-domain data cannot be used directly; it must first be converted into the corresponding frequency-domain signal (i.e., the signal strength of the original brain wave signal across frequencies). In some embodiments, this time-domain-to-frequency-domain conversion may be performed in the brain wave acquisition device, in which case the target brain wave data obtained from the device is already the frequency-domain signal corresponding to the raw brain wave data.
In some embodiments, converting the raw time-domain brain wave data into the corresponding frequency-domain signal may include performing a fast Fourier transform on the time-domain signal to obtain the corresponding frequency-domain signal. In some embodiments, the frequency-domain signal may be further segmented to obtain corresponding signals over one or more complete cycles.
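The fast Fourier transform step can be sketched with NumPy's FFT routines. The function name, the sampling rate, and the synthetic test signal below are illustrative assumptions; the patent only specifies that a fast Fourier transform maps the time-domain signal to a frequency-domain signal.

```python
import numpy as np

def to_frequency_domain(samples, sample_rate):
    """Fast Fourier transform of a raw time-domain signal; returns the
    frequency bins and the signal strength (magnitude) at each bin."""
    strength = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs, strength

# Synthetic stand-in for raw brain wave data: a 10 Hz sine, 1 s at 256 Hz
t = np.arange(256) / 256.0
raw = np.sin(2 * np.pi * 10 * t)
freqs, strength = to_frequency_domain(raw, 256)
print(freqs[np.argmax(strength)])  # → 10.0 (the dominant frequency)
```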
In some embodiments, segmenting the signal may proceed as follows. First, a plurality of bands is determined from the target brain wave data; in other words, the signal corresponding to the target brain wave data is divided into several bands, where any two bands may have the same or different lengths.
Next, an offset band of the Nth band, shifted toward the (N+1)th band, is determined based on a preset offset. The preset offset may be a fixed length or may vary with the length of the Nth band; for example, in some embodiments it is half the length of the Nth band. The offset band is determined by the preset offset: it can be understood as the portion of the Nth band that remains after the initial segment of length equal to the offset is excluded. For example, if the preset offset is 0.2 s and the length of the Nth band is 1 s, the offset band corresponds to the portion of the Nth band from 0.2 s to 1 s.
Then, the shifted Nth band may be determined from the offset band, the preset offset, and the (N+1)th band. Specifically, a fractional band of length equal to the preset offset is taken from the start of the (N+1)th band, and the offset band of the Nth band is spliced with it. Continuing the example, if the offset band corresponds to 0.2 s to 1 s of the Nth band, the fractional band corresponds to 0 to 0.2 s of the (N+1)th band; splicing the two in time order yields the shifted Nth band. Here the length of a band or offset can be understood as the time duration it covers, and the length of the shifted Nth band equals that of the Nth band before shifting.
Finally, the shifted Nth band is processed to determine the corresponding brain wave spectrum signal. By analogy, the (N+1)th band is shifted by the preset offset in a similar manner to obtain the shifted (N+1)th band, and so on, where N is an integer greater than or equal to 1.
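Under the assumption that the bands are equal-length sequences of samples, the shifting-and-splicing step can be sketched as below. The helper name `shift_bands` and the half-length default offset are illustrative, not taken from the patent.

```python
def shift_bands(bands, offset_fraction=0.5):
    """For each Nth band, splice its offset band (the part after the preset
    offset) with the fractional band of the same length taken from the start
    of the (N+1)th band. The shifted band keeps the original band length."""
    shifted = []
    for n in range(len(bands) - 1):
        cut = int(len(bands[n]) * offset_fraction)   # preset offset in samples
        offset_band = bands[n][cut:]                 # e.g. 0.2 s .. 1 s of band N
        fractional = bands[n + 1][:cut]              # e.g. 0 .. 0.2 s of band N+1
        shifted.append(offset_band + fractional)     # spliced in time order
    return shifted

bands = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
print(shift_bands(bands))  # → [[3, 4, 5, 6], [7, 8, 9, 10]]
```

Each shifted band straddles two adjacent original bands, which is why processing it amounts to considering two neighboring signals jointly.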
In some embodiments, the shifted nth band (or the shifted N +1 th band) may also be shifted again (two or more times) in a similar manner. In some embodiments, several bands before shifting, several bands after shifting, and several bands after shifting again may be processed simultaneously, and the corresponding processing result may be used as the processing result of the target brain wave data.
With this offset scheme, the shifted Nth band contains part of the Nth band and part of the (N+1)th band, so processing it amounts to jointly considering the signals of two adjacent bands. This reduces distortion after signal processing, improves processing accuracy, and thereby improves the accuracy of identity matching based on the target brain wave data.
Further, after the bands of the target brain wave data have been obtained through segmentation, a power spectrum can be extracted from the frequency-domain signal of each band, and a Log transform of the power spectrum yields the Log power spectrum, i.e., the brain wave spectrum signal.
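The power-spectrum extraction and Log transform can be sketched as follows, assuming each band arrives as time-domain samples; the periodogram-style estimate and the small epsilon guard are assumptions, since the patent does not specify the estimator.

```python
import numpy as np

def log_power_spectrum(band, eps=1e-12):
    """Periodogram-style power spectrum of one band, followed by a Log
    transform; the result is the brain wave spectrum signal for that band."""
    spectrum = np.fft.rfft(band)
    power = np.abs(spectrum) ** 2 / len(band)   # power at each frequency bin
    return np.log(power + eps)                  # eps guards against log(0)

# One band: a 10 Hz component sampled at 256 Hz for 1 s
band = np.sin(2 * np.pi * 10 * np.arange(256) / 256.0)
log_ps = log_power_spectrum(band)
print(log_ps.shape)  # → (129,), one value per frequency bin
```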
It should be noted that the above segmentation method is only an example; in this specification, the preset offset may be set arbitrarily as required and is not limited to the values listed above. For example, in some embodiments, two adjacent bands may have different lengths; similarly, the overlapping portions between different pairs of adjacent bands may have different lengths.
In addition, it should be noted that in this specification the preprocessing may be performed either by the brain wave acquisition device or by the server. Specifically, in some embodiments, the brain wave acquisition device may convert the acquired raw signal to a digital signal and then preprocess it to obtain the corresponding brain wave spectrum signal; in other embodiments, the brain wave acquisition device only acquires the brain waves of the target object to be identified and sends the acquired signals to the server, which performs the subsequent processing to obtain the corresponding brain wave spectrum signal.
With continued reference to fig. 2, the method further comprises:
Step 220, judging the identity of the target object to be recognized based on the target brain wave data and a machine learning model.
In some embodiments, after target brain wave data of a target object to be recognized is acquired, the identity of the target object may be determined based on at least the target brain wave data and a machine learning model. Specifically, two cases may be included: first, the identity of the target object to be identified may be determined directly from the target brain wave data and the machine learning model (see step 222); second, the identity may be determined from the target brain wave data acquired by the brain wave acquisition device, the information of the currently purchased goods, and the machine learning model (see steps 224 and 226).
In some embodiments, the machine learning model may be understood as an algorithm model trained in advance on brain wave data of the target objects and their corresponding identity information, so that it learns the characteristics of each object's brain wave data. Having been trained in advance, the model can accurately recognize and judge brain wave data acquired in a target scene. In some embodiments, the model may also be trained in advance on the shopping habits of the target object to be recognized, so that it can determine, based on the purchased commodity information, whether a purchase fits those habits, and use this for identity recognition. For more description of the training process of the machine learning model, please refer to fig. 3 of the present specification; the use of the machine learning model is described below.
Referring to fig. 2, in some embodiments, step 220 may comprise:
step 222, based on the target brain wave data and the machine learning model, the identity of the target object to be recognized is judged.
In some embodiments, a machine learning model may be configured on the server and trained with training samples comprising brain wave data of a plurality of target objects, so as to obtain a machine learning model that identifies the identity of the target object to be identified based on the brain wave data acquired by the brain wave acquisition device.
In some embodiments, after acquiring target brain wave data of a target object to be recognized, a server may input the target brain wave data into a machine learning model for processing, so as to obtain identity information matched with the target brain wave data and a matching degree between the identity information and the target brain wave data.
For example, in some embodiments, inputting the target brain wave data into a machine learning model for processing may result in output data: "user 1, 99.9%", i.e. the identity information that is matched with the target brain wave data is user1, and the matching degree is 99.9%. Wherein the matching degree may represent a probability that the target brain wave data belongs to the matched object or a similarity between the target brain wave data and the brain wave of the matched object.
Further, after obtaining the matching degree, the server may make a determination based on the matching degree and a preset threshold: if the matching degree is smaller than the preset threshold, the identity information is judged not to belong to the target object to be identified; if the matching degree is greater than or equal to the preset threshold, the identity information is judged to belong to the target object to be identified. The preset threshold may be a probability value and, in some embodiments, may be set according to actual requirements. In some embodiments, to ensure account security and avoid authentication errors, the preset threshold may be set to a larger value such as 98%, 99%, or 99.9%. Similarly, in some embodiments with relatively low requirements, the preset threshold may be set to a lower value such as 50% or 60%.
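The threshold rule above amounts to a one-line check; the function name and default value below are illustrative, not from the patent:

```python
def identity_belongs_to_target(match_degree, preset_threshold=0.999):
    """Return True if the identity information matched by the model is
    judged to belong to the target object to be identified, i.e. the
    matching degree is at or above the preset threshold."""
    return match_degree >= preset_threshold

# e.g. a model output of "user1, 99.9%" passes a 99.9% threshold,
# while a 95% match would be rejected.
```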
With continued reference to fig. 2, in some embodiments, step 220 may include:
step 224, acquiring commodity information corresponding to the payment operation; and,
step 226, based on the target brain wave data, the commodity information and the machine learning model, the identity of the target object to be recognized is judged.
In some embodiments, the server may further obtain information of the commodity (including physical goods and virtual services) corresponding to the current payment operation, and input the target brain wave data of the target object to be recognized together with that commodity information into a machine learning model for comprehensive processing, so as to determine the identity of the target object to be recognized. In some embodiments, the server may obtain the commodity information corresponding to the current payment operation from a terminal in communication with it (e.g., a vending machine or a cash register terminal). In some embodiments, before inputting the commodity information into the machine learning model, the server may preprocess the commodity information to obtain the corresponding feature information.
Specifically, in some embodiments, after the target brain wave data and the commodity information corresponding to the current payment operation are input into the machine learning model for comprehensive processing, the model may output: first identity information matched with the target brain wave data and a first matching degree between them; and second identity information matched with the commodity information and a second matching degree between them (the second matching degree may reflect how well the commodity currently purchased by the target object to be identified conforms to its historical purchasing habits).
For example, in some embodiments, by processing target brain wave data of a target object to be recognized through a machine learning model, first identity information matched with the target brain wave data and a first matching degree thereof can be obtained as follows: "user 1, 99.9%" (i.e., the first degree of match of the target brain wave data to user1 is 99.9%). Meanwhile, the commodity information corresponding to the current payment operation is processed through the machine learning model, and the second identity information matched with the commodity information and the second matching degree of the second identity information are obtained as follows: "user 1, 60%", "user 2, 20%" (i.e., the second matching degree of the commodity information corresponding to the current payment operation with the user1 is 60%, and the second matching degree with the user2 is 20%).
The machine learning model that processes the target brain wave data and the machine learning model that processes the commodity information corresponding to the current payment operation may be the same model or different models. When they are the same model, the target brain wave data and the commodity information can be input into it simultaneously, and the model outputs the first identity information and first matching degree together with the second identity information and second matching degree. When they are different models, the target brain wave data and the commodity information may each be input into its corresponding machine learning model, and the two models then output the first identity information and first matching degree, and the second identity information and second matching degree, respectively.
Further, the identity of the target object to be recognized can be judged from the first matching degree and the second matching degree. For example, in some embodiments, if the first matching degree corresponding to the first identity information is greater than a first preset threshold, and the second matching degree between that identity information and the commodity information corresponding to the current payment operation is greater than a second preset threshold, the first identity information may be taken as the identity of the target object to be recognized.
It should be appreciated that, to secure the account, the first preset threshold may be set to a larger value (e.g., 99% or 99.5%), while the second preset threshold serves only as an additional determination condition and thus, in some embodiments, may be set to a smaller value (e.g., 60% or 80%).
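Under the same assumptions (threshold values taken from the examples above, names ours), the dual-threshold decision can be sketched as:

```python
def accept_first_identity(p1, p2, first_threshold=0.99, second_threshold=0.60):
    """Dual-threshold rule: the first identity information is accepted only
    when the brain-wave matching degree p1 exceeds the (strict) first
    threshold AND the commodity-habit matching degree p2 exceeds the
    (looser) second threshold used as an additional condition."""
    return p1 > first_threshold and p2 > second_threshold
```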
Alternatively, in some embodiments, for each candidate identity, a comprehensive matching degree of that identity with both the target brain wave data and the current commodity information may be calculated from the first identity information, the second identity information, the first matching degree between the target brain wave data and the first identity information, and the second matching degree between the commodity information corresponding to the current payment operation and the second identity information, all obtained from the machine learning model. The comprehensive matching degree can be computed as: Pi = p1i · k1 + p2i · k2, where Pi is the comprehensive matching degree of the ith identity information with the target brain wave data and the current commodity information, p1i is the first matching degree of the ith identity information with the target brain wave data, p2i is the second matching degree of the ith identity information with the current commodity information, k1 is the weight coefficient corresponding to the first matching degree, and k2 is the weight coefficient corresponding to the second matching degree.
For example, if k1 = 0.95 and k2 = 0.05, the first matching degree of the target brain wave data with the first target identity obtained by the machine learning model is p11 = 99%, and the second matching degree of the current commodity information with the first target identity is p21 = 20%, then the comprehensive matching degree of the first target identity with the target brain wave data and the current commodity information is P1 = 99% × 0.95 + 20% × 0.05 = 95.05%. Further, the comprehensive matching degree may be compared with a third preset threshold: if it is greater than the third preset threshold, the first target identity may be determined to belong to the target object to be recognized; otherwise, it does not.
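The weighted combination and the worked example can be checked directly; the function name is ours and the default weights follow the example:

```python
def comprehensive_matching_degree(p1_i, p2_i, k1=0.95, k2=0.05):
    """Pi = p1i * k1 + p2i * k2: weighted combination of the brain-wave
    matching degree and the commodity-information matching degree for
    the i-th identity."""
    return p1_i * k1 + p2_i * k2

# Worked example from the text: 99% * 0.95 + 20% * 0.05 = 95.05%
P1 = comprehensive_matching_degree(0.99, 0.20)  # 0.9505
```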
In this specification, the first preset threshold, the second preset threshold, the third preset threshold, and the weight coefficients k1 and k2 can all be set according to actual requirements. In some embodiments, the weight coefficient k1 may be set to a larger value (e.g., 0.95, 0.98, or 0.99), and the weight coefficient k2 may be set to a smaller value (e.g., 0.05, 0.02, or 0.01). By combining the target brain wave information of the target object to be recognized in the target scene with the commodity information corresponding to the current payment operation when recognizing the identity of the target object, the accuracy of the recognition result can be improved to a certain extent.
With continued reference to fig. 2, the method further comprises:
Step 230, determining whether to generate a payment operation request based on the judgment result.
In some embodiments, after the identity of the target object to be recognized is obtained through the above process, it may be further determined, based on the recognition result, whether to generate a payment operation request, where the payment operation request may include account information corresponding to the identity of the target object to be recognized. For example, in some embodiments, after the identity information of the target object is obtained from its target brain wave data alone, or from the target brain wave data combined with the current commodity information, a request may be made to log in to the account corresponding to that identity information and to perform further operations on the account according to the transaction information of the current payment operation, such as balance inquiry and fee deduction. Conversely, if the identity of the target object to be identified cannot be obtained through the above process, no payment operation request is generated.
In some embodiments, after the payment operation (i.e., after the deduction is completed on the account of the target object to be identified), the server may feed back the payment result (e.g., payment success or payment failure) to at least one terminal through the network, so that the payment result is displayed to the user through that terminal.
It should be noted that the above description of the control method related to the payment operation is only for illustration and explanation, and does not limit the applicable scope of the present specification. Various modifications and alterations to the above may occur to those skilled in the art, given the benefit of this description. However, such modifications and variations are intended to be within the scope of the present description.
In some embodiments, the machine learning model used in this specification may be selected according to the actual situation, and may be, but is not limited to, a linear classifier (such as Logistic Regression, LR), a Support Vector Machine (SVM), K-Nearest Neighbors (KNN), a Decision Tree (DT), an ensemble model (such as Random Forest, RF, or Gradient Boosted Decision Trees, GBDT), and the like.
The training process of the machine learning model used in this specification is briefly described below.
FIG. 3 is an exemplary flow diagram 300 of a method of training a machine learning model according to some embodiments shown herein. The method can be used for model training to obtain a machine learning model for processing the target brain wave data so as to identify the identity of the user.
Referring to fig. 3, the training method may include:
step 310, sample data is acquired, wherein the sample data comprises historical brainwave data of a plurality of target objects in a historical scene and identity information of the target objects.
In some embodiments, historical brain wave data of a plurality of users in a historical scene and the identity information corresponding to each user may be obtained in advance and used as training samples. For example, in some embodiments, when a user activates the brain wave payment verification function, brain wave data may be collected while the user is in a prompted scenario (e.g., upon confirming "start brain wave payment function"), and then combined with the corresponding identity information (e.g., account ID) to form a training sample. Sample data for more users can be obtained in the same way, which is not repeated here.
In some embodiments, the target objects in the sample data and the target objects to be identified may be the same group of users. Specifically, when the target user performs the payment verification function authentication, brain wave data of the target user may be collected. And acquiring target brain wave data of a target user to be identified during actual payment operation, and performing matching identification on the target brain wave data through a machine learning model. In some embodiments, to improve the accuracy of the matching identification, the brain activity of the user at the time of the payment verification function authentication and the brain activity of the user at the time of the actual payment may be kept as consistent as possible, i.e., in some embodiments, the history scenario may be kept consistent with the aforementioned target scenario. For example, the historical scene and the target scene may have the same preset scene. For more description of the preset scene, reference may be made to the related description of fig. 2 in this specification.
Step 320, determining corresponding characteristic information based on the historical brain wave data, to serve as input data.
In some embodiments, after the sample data is obtained in step 310, the historical brain wave data included in the sample data may be processed separately to determine the brain wave characteristic information of each user, and then the characteristic information, or the brain wave data containing it, is used as input data.
Step 330, determining the label information of the historical brain wave data based on the identity information of the target objects as output data.
After the sample data is obtained in step 310, the user identity information contained in the sample data may be extracted and used as the label information corresponding to the historical brain wave data. For example, if the information entered by user A when activating the brain wave payment function includes brain wave data a and identity information a, then identity information a may be used as the label information of brain wave data a; brain wave data a serves as the input data, and identity information a serves as the corresponding output data.
Step 340, inputting the input data and the output data corresponding to the input data into an initial machine learning model for training.
Further, after the input data and the output data are determined, they are fed into an initial machine learning model for training; training is completed when a preset number of iterations is reached or the loss function reaches a preset threshold, yielding a machine learning model that processes the target brain wave data to identify the user's identity.
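As a hedged stand-in for the trained model (a nearest-centroid matcher rather than any of the real classifier families listed earlier), the fit/predict shape — identity information plus a matching degree — might look like:

```python
import numpy as np

class BrainwaveMatcher:
    """Minimal stand-in for the trained model of Fig. 3: 'training' stores
    one centroid feature vector per identity, and inference returns the
    closest identity plus a cosine similarity as the matching degree.
    This is a sketch only; a production system would use one of the
    classifiers listed above (LR, SVM, KNN, DT, RF, GBDT)."""

    def fit(self, features, labels):
        # input data: brain wave feature vectors; output data: identity labels
        self.centroids = {
            label: np.mean([f for f, l in zip(features, labels) if l == label], axis=0)
            for label in set(labels)
        }
        return self

    def predict(self, x):
        best_identity, best_degree = None, -1.0
        for label, centroid in self.centroids.items():
            degree = float(np.dot(x, centroid)
                           / (np.linalg.norm(x) * np.linalg.norm(centroid) + 1e-12))
            if degree > best_degree:
                best_identity, best_degree = label, degree
        return best_identity, best_degree  # identity information, matching degree
```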
In some embodiments in which the identity of the target object to be recognized is recognized based on both the brain wave data acquired by the brain wave acquisition device and the commodity information corresponding to the current payment operation, the sample data may further include historical shopping operation information associated with a plurality of user identities. The historical shopping operation information may be acquired from the shopping or payment records of the corresponding user accounts and reflects the users' historical shopping habits, such as commodity categories and price ranges. In this case, the historical shopping operation information and the historical brain wave data are used together as input data, the user identity serves as the label information of both, and the label information serves as the output data corresponding to the associated input data; the input data and output data are then fed into an initial machine learning model for training, yielding a machine learning model that comprehensively processes the target brain wave data and the commodity information to identify the user's identity.
Fig. 4 is a block diagram 400 of a control system for payment operations according to some embodiments described herein.
As shown in fig. 4, the control system of the payment operation may include an acquisition module 410, a determination module 420, and a generation module 430.
In some embodiments, the obtaining module 410 may be configured to perform the step 210, and obtain target brain wave data of a target object to be identified in a target scene; the determining module 420 may be configured to perform the step 220, and determine the identity of the target object to be recognized based on the target brain wave data and a machine learning model; the generating module 430 may be configured to perform the step 230, and determine whether to generate a payment operation request based on the determination result; wherein the payment operation request may include account information corresponding to the identity of the target object to be recognized.
In some embodiments, the determining module 420 may be specifically configured to: determining identity information matched with the target brain wave data and a matching degree thereof based on the target brain wave data and a machine learning model; and then, judging the identity of the target object to be recognized based on the matching degree. In other words, in some embodiments, the determining module 420 may determine the identity of the target object to be identified based on the target brain wave data of the target object in the target scene.
In some embodiments, the determining module 420 may be further specifically configured to: when the matching degree is larger than a preset threshold value, judging that the identity information belongs to the target object to be identified; and when the matching degree is smaller than a preset threshold value, judging that the identity information does not belong to the target object to be identified.
In some embodiments, the obtaining module 410 may be further configured to: acquiring commodity information corresponding to the payment operation; the determining module 420 may further be configured to: and judging the identity of the target object to be recognized based on the target brain wave data, the commodity information and a machine learning model. In other words, in some embodiments, the determining module 420 may also determine the identity of the target object to be recognized based on the target brain wave data of the target object in the target scene and the information of the goods currently purchased by the target object to be recognized.
In some embodiments, the determining module 420 may be further specifically configured to: determining first identity information matched with the target brain wave data and a first matching degree of the first identity information based on the target brain wave data and a machine learning model; determining second identity information matched with the commodity information and a second matching degree of the second identity information based on the commodity information and a machine learning model; and then, judging the identity of the target object to be recognized based on the first matching degree and the second matching degree.
In some embodiments, the control system of the payment operation may further comprise a pre-processing module, which may be configured to: preprocessing the target brain wave data to determine a corresponding brain wave frequency spectrum signal; wherein the preprocessing comprises a segmentation processing of the brain wave data.
In some embodiments, the preprocessing module may be specifically configured to: determine a number of bands based on the target brain wave data; determine, based on a preset offset, an offset band by which the Nth band extends into the (N + 1)th band; determine the shifted Nth band based on the offset band, the preset offset, and the (N + 1)th band; and then process the shifted Nth band.
In some embodiments, the control system for payment operations may further include a training module, which may be configured to: acquiring sample data, wherein the sample data comprises historical brain wave data of a plurality of target objects in a historical scene and identity information of the target objects; the historical brain wave data is associated with an identity of the target subject; determining corresponding characteristic information based on the historical brain wave data as input data; determining label information of the historical brain wave data based on the identity information of the target objects as output data; and then inputting the input data and the corresponding output data into an initial machine learning model for training. In other words, the training module may train the initial machine learning model through the historical brain wave data of the target objects in the historical scene and the corresponding identity information thereof, to obtain a machine learning model for processing the target brain wave data to identify the identity of the user.
In some embodiments, the sample data may further include historical shopping operation information of the plurality of target objects, and the training module may be further configured to determine label information of the historical brain wave data of the corresponding target object based on that historical shopping operation information. In other words, the training module may train the initial machine learning model on the historical brain wave data of a plurality of target objects in a historical scene, their historical shopping operation information, and the identity information corresponding to both, so as to obtain a machine learning model that comprehensively processes the target brain wave data and the commodity information to identify the user's identity.
Since more functions and details about the above modules can be found elsewhere in this specification (for example, the flowchart of the control method of the payment operation and the related discussion thereof, and the flowchart of the training method of the machine learning model and the related discussion thereof), further description is omitted here.
It should be understood that the system and its modules shown in FIG. 4 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory for execution by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD-or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also by software executed by various types of processors, for example, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system and its modules shown in fig. 4 is for convenience only and should not limit the present disclosure to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings. For example, in some embodiments, the obtaining module 410 and the determining module 420 disclosed in fig. 4 may be different modules in a system, or may be a module that implements the functions of two or more modules described above. As another example, the system may also include a communication module to communicate with other components. In the system, each module may share one storage module, and each module may also have its own storage module. Such variations are within the scope of the present disclosure.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) the target brain wave data of the target object to be recognized in the target scene is used for carrying out identity recognition on the target brain wave data, and payment operation is carried out based on the recognition result, so that payment can be completed without depending on a mobile terminal, and the payment experience of a user is improved; (2) by removing the use of the mobile terminal in the payment process, the applicability of the mobile payment in more scenes can be improved, and the problems of fund theft and the like caused by terminal loss can be avoided; (3) the machine learning model is used for processing the target brain wave data of the target object to be recognized in the target scene so as to recognize the identity of the target object, so that the accuracy of the identity recognition result can be improved; (4) by carrying out segmentation processing on the target brain wave data and enabling an overlapping part to exist between two adjacent wave bands, the accuracy of identity recognition based on the target brain wave data in the subsequent process can be further improved.
It is to be noted that different embodiments may produce different advantages, and in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed embodiments may lie in less than all features of a single disclosed embodiment.
Some embodiments use numbers to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are, in some instances, modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that a variation of ±20% in the stated number is allowed. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiment. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general rounding method. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as an article, book, specification, publication, or document, cited in this specification is hereby incorporated by reference in its entirety. Excepted are application history documents that are inconsistent with or in conflict with the content of this specification, as well as documents (currently or later attached to this specification) that limit the broadest scope of the claims of this specification. It is to be understood that if the descriptions, definitions, and/or uses of terms in the materials accompanying this specification are inconsistent with or contrary to those in this specification, the descriptions, definitions, and/or uses of terms in this specification shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (19)

1. A method of controlling payment operations, the method comprising:
acquiring target brain wave data of a target object to be identified in a target scene;
determining the identity of the target object to be recognized based on the target brain wave data and a machine learning model; and determining whether to generate a payment operation request based on a result of the determination, wherein the payment operation request comprises account information corresponding to the identity of the target object to be identified.
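As a minimal, hypothetical sketch of the claim-1 control flow, the following function acquires brain wave data, resolves an identity, and only builds a payment operation request when the identity judgment succeeds. The function names, the model interface, and the request shape are our assumptions, not the patent's.

```python
def control_payment(acquire_data, judge_identity, account_lookup):
    """Acquire target brain wave data, judge the identity, and only
    generate a payment operation request when the judgment succeeds."""
    target_data = acquire_data()                  # target brain wave data
    identity = judge_identity(target_data)        # identity judgment result
    if identity is None:
        return None                               # no payment request generated
    return {"account": account_lookup(identity)}  # payment operation request
```

Note that the claim makes the account lookup part of the request itself: the request carries account information corresponding to the recognized identity.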
2. The method of claim 1, wherein determining the identity of the target object to be recognized based on the target brain wave data and a machine learning model comprises:
determining identity information matched with the target brain wave data and a matching degree thereof based on the target brain wave data and a machine learning model;
and determining the identity of the target object to be recognized based on the matching degree.
3. The method according to claim 2, wherein if the matching degree is greater than a preset threshold, the identity information is determined to belong to the target object to be recognized; and if the matching degree is smaller than the preset threshold, the identity information is determined not to belong to the target object to be recognized.
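The threshold decision in claims 2-3 can be sketched as follows; the threshold value 0.9 and the return convention are placeholders of ours, not values from the patent.

```python
def judge_by_matching_degree(identity, degree, threshold=0.9):
    """Claims 2-3 sketch: the model yields identity information plus a
    matching degree; accept the identity only above a preset threshold."""
    if degree > threshold:
        return identity  # identity information belongs to the target
    return None          # identity not confirmed; no payment request
```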
4. The method of claim 1, the machine learning model obtained by a training method comprising:
acquiring sample data, wherein the sample data comprises historical brain wave data of a plurality of target objects in a historical scene and identity information of the target objects, the historical brain wave data being associated with the identities of the target objects;
determining, as input data, characteristic information corresponding to the historical brain wave data;
determining, as output data, label information of the historical brain wave data based on the identity information of the target objects; and
inputting the input data and the corresponding output data into an initial machine learning model for training.
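The claim-4 training steps (features as input data, identity labels as output data) can be sketched as below. The claim does not specify the feature extraction method; per-channel mean power is our stand-in, and all names are illustrative.

```python
import numpy as np

def extract_features(brainwave):
    # Stand-in "characteristic information": mean signal power per channel.
    # The patent does not specify how features are extracted.
    return np.square(np.asarray(brainwave)).mean(axis=-1)

def build_training_set(samples):
    """samples: iterable of (historical brain wave data, identity label).
    Returns (input data X, output label data y) ready for fitting an
    initial machine learning model, per the claim-4 training steps."""
    X = np.stack([extract_features(data) for data, _ in samples])
    y = np.array([label for _, label in samples])
    return X, y
```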
5. The method of claim 4, wherein the sample data further comprises historical shopping operation information of the plurality of target objects; and the training method further comprises:
determining label information of the historical brain wave data of a corresponding target object based on the historical shopping operation information of that target object.
6. The method of claim 1, further comprising: acquiring commodity information corresponding to the payment operation;
the determining the identity of the target object to be recognized based on the target brain wave data and the machine learning model further comprises:
determining the identity of the target object to be recognized based on the target brain wave data, the commodity information and a machine learning model.
7. The method of claim 6, wherein the determining the identity of the target object to be recognized based on the target brain wave data, the merchandise information, and a machine learning model further comprises:
determining first identity information matched with the target brain wave data and a first matching degree of the first identity information based on the target brain wave data and a machine learning model;
determining second identity information matched with the commodity information and a second matching degree of the second identity information based on the commodity information and a machine learning model;
determining the identity of the target object to be recognized based on the first matching degree and the second matching degree.
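Claim 7 combines a brain-wave match and a commodity-information match but leaves the fusion rule open. One possible reading, assumed by us (requiring both matches to name the same identity and averaging the two matching degrees), is:

```python
def judge_with_two_degrees(first, second, threshold=0.9, weight=0.5):
    """Hypothetical fusion of (identity, degree) pairs from the brain
    wave match and the commodity-information match; the weighting and
    threshold are illustrative placeholders."""
    identity_1, degree_1 = first
    identity_2, degree_2 = second
    if identity_1 != identity_2:
        return None  # conflicting identities: do not confirm
    combined = weight * degree_1 + (1 - weight) * degree_2
    return identity_1 if combined > threshold else None
```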
8. The method of claim 1, further comprising:
preprocessing the target brain wave data to determine a corresponding brain wave frequency spectrum signal;
wherein the preprocessing comprises a segmentation processing of the brain wave data.
9. The method of claim 8, the segmenting of the brain wave data comprising:
determining a plurality of wave bands based on the target brain wave data;
determining, based on a preset offset, an offset segment of the Nth wave band that extends into the (N+1)th wave band;
determining a shifted Nth wave band based on the offset segment, the preset offset and the (N+1)th wave band; and
processing the shifted Nth wave band.
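One reading of the claim-9 segmentation, assumed by us, is overlapping windowing: split the signal into fixed-length wave bands, then extend each Nth band with a preset number of samples taken from the start of the (N+1)th band. The interpretation (overlap via look-ahead) and all names are ours.

```python
def segment_with_offset(data, band_length, offset):
    """Split `data` into bands of `band_length` samples, then form the
    shifted Nth band by appending the first `offset` samples of the
    (N+1)th band. Purely illustrative of one possible segmentation."""
    bands = [data[i:i + band_length] for i in range(0, len(data), band_length)]
    shifted = []
    for n in range(len(bands) - 1):
        shifted.append(bands[n] + bands[n + 1][:offset])  # shifted Nth band
    return shifted
```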
10. A control system for payment operations, the system comprising:
an acquisition module configured to acquire target brain wave data of a target object to be identified in a target scene;
a judging module configured to determine the identity of the target object to be recognized based on the target brain wave data and a machine learning model; and
a generating module configured to determine whether to generate a payment operation request based on a result of the determination, wherein the payment operation request comprises account information corresponding to the identity of the target object to be identified.
11. The system of claim 10, wherein the judging module is specifically configured to:
determining identity information matched with the target brain wave data and a matching degree thereof based on the target brain wave data and a machine learning model;
and determining the identity of the target object to be recognized based on the matching degree.
12. The system of claim 11, wherein the judging module is specifically configured to:
when the matching degree is greater than a preset threshold, determine that the identity information belongs to the target object to be identified; and when the matching degree is smaller than the preset threshold, determine that the identity information does not belong to the target object to be identified.
13. The system of claim 10, further comprising a training module configured to:
acquire sample data, wherein the sample data comprises historical brain wave data of a plurality of target objects in a historical scene and identity information of the target objects, the historical brain wave data being associated with the identities of the target objects;
determine, as input data, characteristic information corresponding to the historical brain wave data;
determine, as output data, label information of the historical brain wave data based on the identity information of the target objects; and
input the input data and the corresponding output data into an initial machine learning model for training.
14. The system of claim 13, wherein the sample data further comprises historical shopping operation information of the plurality of target objects; and the training module is further configured to:
determine label information of the historical brain wave data of a corresponding target object based on the historical shopping operation information of that target object.
15. The system of claim 10, the acquisition module further to: acquiring commodity information corresponding to the payment operation; the judging module is further configured to:
determine the identity of the target object to be recognized based on the target brain wave data, the commodity information and a machine learning model.
16. The system of claim 15, wherein the judging module is further specifically configured to:
determining first identity information matched with the target brain wave data and a first matching degree of the first identity information based on the target brain wave data and a machine learning model;
determining second identity information matched with the commodity information and a second matching degree of the second identity information based on the commodity information and a machine learning model;
and judging the identity of the target object to be recognized based on the first matching degree and the second matching degree.
17. The system of claim 10, further comprising a pre-processing module,
the preprocessing module is used for: preprocessing the target brain wave data to determine a corresponding brain wave frequency spectrum signal;
wherein the preprocessing comprises a segmentation processing of the brain wave data.
18. The system of claim 17, the preprocessing module specifically configured to:
determine a plurality of wave bands based on the target brain wave data;
determine, based on a preset offset, an offset segment of the Nth wave band that extends into the (N+1)th wave band;
determine a shifted Nth wave band based on the offset segment, the preset offset and the (N+1)th wave band; and
process the shifted Nth wave band.
19. A control device for payment operations, the device comprising at least one processor and at least one memory;
the at least one memory is for storing computer instructions;
the at least one processor is configured to execute at least some of the computer instructions to implement the method of any one of claims 1 to 9.
CN202010440505.5A 2020-05-22 2020-05-22 Payment operation control method and system Pending CN111340508A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010440505.5A CN111340508A (en) 2020-05-22 2020-05-22 Payment operation control method and system


Publications (1)

Publication Number Publication Date
CN111340508A (en) 2020-06-26

Family

ID=71184963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010440505.5A Pending CN111340508A (en) 2020-05-22 2020-05-22 Payment operation control method and system

Country Status (1)

Country Link
CN (1) CN111340508A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023065530A1 (en) * 2021-10-21 2023-04-27 杉木(深圳)生物科技有限公司 User identity recognition method, sample testing device and system, and server

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008152799A1 (en) * 2007-06-12 2008-12-18 Panasonic Corporation Brain wave interface system and starter
CN105260890A (en) * 2015-09-25 2016-01-20 镇江明泰信息科技有限公司 On-line secure payment method based on multi-domain user information big data analysis
CN108108974A (en) * 2017-12-04 2018-06-01 阿里巴巴集团控股有限公司 Method of payment and device and electronic equipment
CN108366009A (en) * 2017-01-26 2018-08-03 阿里巴巴集团控股有限公司 Method for pushing, device and the server of reminder message
CN108415564A (en) * 2018-02-26 2018-08-17 广东欧珀移动通信有限公司 Electronic device, apparatus control method and Related product
CN110298563A (en) * 2019-06-14 2019-10-01 达疆网络科技(上海)有限公司 A kind of statistical method of discriminant risk order


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
马慧琳 (Ma Huilin): "基于机器学习的乳腺图像辅助诊断算法研究" (Research on Machine-Learning-Based Auxiliary Diagnosis Algorithms for Breast Images), 31 August 2016, Hunan Normal University Press (湖南师范大学出版社) *


Similar Documents

Publication Publication Date Title
US11429712B2 (en) Systems and methods for dynamic passphrases
US20210319062A1 (en) Method and apparatus for searching video segment, device, and medium
CN114503130A (en) Mapping user vectors between embeddings of machine learning models
US20230162194A1 (en) Systems and methods for automated identity verification
CN110570208B (en) Complaint preprocessing method and device
US20150317642A1 (en) Process to query electronic sales receipts with a portable computerized device
WO2021191659A1 (en) Liveness detection using audio-visual inconsistencies
CN114511393B (en) Financial data processing method and system
CN111340508A (en) Payment operation control method and system
US20170310664A1 (en) Apparatus and system for obtaining and encrypting documentary materials
CN114971449A (en) Article inventory management method, apparatus, electronic device and medium
CN116186543B (en) Financial data processing system and method based on image recognition
CN109949028B (en) Intelligent settlement network payment system
CN111538822B (en) Method and system for generating training data of intelligent customer service robot
US20230125814A1 (en) Credit score management apparatus, credit score management method, and computer readable recording medium
US10798007B2 (en) Data transfer, over session or connection, and between computing device and server associated with a routing network for modifying one or more parameters of the routing network
CA3103484A1 (en) Systems and methods for dynamic passphrases
CN111951013A (en) Authentication method and device
CN112215699A (en) Credit assessment method and device
WO2024180590A1 (en) Information processing device, system, method, and non-transitory computer recording medium
CN111027935A (en) Electronic visa application method and device based on credit
KR20200072742A (en) Apparatus for credit card payment service using biometric data
JP7464232B1 (en) Facility management method, information processing device, and program
US11645372B2 (en) Multifactor handwritten signature verification
JP7008352B2 (en) Information processing equipment, information processing methods, and programs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40030942

Country of ref document: HK

TA01 Transfer of patent application right

Effective date of registration: 20230110

Address after: F15, No. 447, North Nanquan Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai, 200137

Applicant after: Alipay.com Co.,Ltd.

Address before: 310000 801-11 section B, 8th floor, 556 Xixi Road, Xihu District, Hangzhou City, Zhejiang Province

Applicant before: Alipay (Hangzhou) Information Technology Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200626