CN116992956A - Model training method, device and equipment applied to user portrait determination


Info

Publication number
CN116992956A
Application number
CN202310967616.5A
Authority
CN
China
Prior art keywords
user
model
terminal device
parameters
Legal status
Pending
Priority date
2023-08-02
Filing date
2023-08-02
Publication date
2023-11-03
Other languages
Chinese (zh)
Inventor
何艳波
Current Assignee
Bank of China Ltd
Original Assignee
Bank of China Ltd
Application filed by Bank of China Ltd
Priority to CN202310967616.5A
Publication of CN116992956A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/098 Distributed learning, e.g. federated learning
    • G06N 20/00 Machine learning
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201 Market modelling; Market analysis; Collecting market data


Abstract

The application provides a model training method, device and equipment applied to determining user portraits, and relates to the field of neural networks. The method comprises: receiving an encrypted gradient error sent by a terminal device among at least one terminal device; processing the encrypted gradient error to obtain a processing result, and sending the processing result to the terminal device, where the processing result is used to update the global parameters; and using the aggregation model configured with the global parameters obtained when a preset time is reached to determine user portraits. With this technical scheme, the historical services of all channels can be combined, a user's needs can be determined quickly, processing efficiency is improved, and time is saved.

Description

Model training method, device and equipment applied to user portrait determination
Technical Field
The present application relates to the field of neural networks, and in particular, to a model training method, apparatus and device for determining a user portrait.
Background
With the rapid development of intelligent terminals, more and more services are handled through different channels and online, but the data held by different channels cannot be shared in real time, so the service a user needs cannot be determined quickly when the user transacts business.
Therefore, a model training method applied to determining user portraits is needed that can combine the historical business of every channel and quickly determine the user's needs, thereby improving processing efficiency and saving time.
Disclosure of Invention
The application provides a model training method, device and equipment applied to determining user portraits, which can combine the historical services of every channel and quickly determine the user's needs, thereby improving processing efficiency and saving time.
In a first aspect, the present application provides a model training method for determining a representation of a user, comprising:
receiving an encrypted gradient error sent by a terminal device in at least one terminal device; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model;
processing the encrypted gradient error to obtain a processing result, and sending the processing result to terminal equipment in the at least one terminal equipment; wherein the processing result is used for updating the global parameter;
and the aggregation model configured with the global parameters obtained when the preset time is reached is used to determine the user portrait.
In one example, the processing the encrypted gradient error to obtain a processing result includes:
decrypting the encrypted gradient error according to preset key information to obtain a gradient error;
and analyzing the gradient error to obtain a processing result.
In one example, the analyzing the gradient error to obtain a processing result includes:
determining error information according to the gradient error; the error information characterizes the loss error of gradient information generated when the aggregation model is trained.
In one example, after sending the processing result to a terminal device of the at least one terminal device, further includes:
receiving model parameters sent by terminal equipment in the at least one terminal equipment;
carrying out mean value processing according to the model parameters, and determining updated global parameters;
and sending the updated global parameters to terminal equipment in the at least one terminal equipment.
In one example, the method further comprises:
the updated global parameters are used for updating the model parameters by the terminal equipment in the at least one terminal equipment to obtain local parameters; wherein the local parameters characterize updated model parameters of terminal devices of the at least one terminal device.
In one example, the method further comprises:
the historical user behavior information in each terminal device is obtained from the user's historical operations on the corresponding terminal device.
In one example, the method further comprises:
and issuing an aggregation model configured with global parameters to terminal equipment in the at least one terminal equipment.
In a second aspect, the present application provides a method for determining a user portrait, the method comprising:
acquiring identity information of a user;
inputting the identity information of the user into an aggregation model, and determining the user portrait of the user;
wherein the aggregate model is a model obtained based on the method of the first aspect.
In a third aspect, the present application provides a model training apparatus for use in determining a representation of a user, the apparatus comprising:
a first receiving unit, configured to receive an encrypted gradient error sent by a terminal device in at least one terminal device; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model;
the first processing unit is used for processing the encrypted gradient error to obtain a processing result and sending the processing result to the terminal equipment in the at least one terminal equipment; wherein the processing result is used for updating the global parameter;
the determining unit is used for determining the aggregation model of the global parameters obtained until the preset time is reached and is used for determining the user portrait.
In a fourth aspect, the present application provides a user portrait determining apparatus, including:
the acquisition unit is used for acquiring the identity information of the user;
the input unit is used for inputting the identity information of the user into the aggregation model and determining the user portrait of the user;
wherein the aggregate model is a model obtained based on the apparatus of the third aspect.
In a fifth aspect, the present application provides a terminal device, including: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the method as described in the first or second aspect.
In a sixth aspect, the present application provides a computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out the method according to the first or second aspect.
In a seventh aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the method according to the first or second aspect.
The application provides a model training method, a device and equipment for determining user portraits, which are used for receiving encrypted gradient errors sent by terminal equipment in at least one terminal equipment; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model; processing the encrypted gradient error to obtain a processing result, and sending the processing result to terminal equipment in the at least one terminal equipment; wherein the processing result is used for updating the global parameter; and the aggregation model of the global parameters obtained until the preset time is reached is used for determining the user portrait. By adopting the technical scheme, the historical service of each channel can be synthesized, the requirements of the user can be rapidly determined, the processing efficiency is further improved, and the time is saved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart of a model training method for determining user portraits according to a first embodiment of the application;
FIG. 2 is a flow chart of a model training method for determining user portraits according to a second embodiment of the application;
FIG. 3 is a flowchart of a method for determining a user portrait according to a third embodiment of the present application;
FIG. 4 is a schematic diagram of a model training apparatus for determining a representation of a user according to a fourth embodiment of the present application;
FIG. 5 is a schematic diagram of a model training apparatus for determining a representation of a user according to a fifth embodiment of the present application;
FIG. 6 is a schematic diagram showing a configuration of a user portrait determining apparatus according to a sixth embodiment of the present application;
fig. 7 is a block diagram of a terminal device, according to an example embodiment.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
It should be noted that, identity information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) of the user related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of related data need to comply with related laws and regulations and standards, and provide corresponding operation entries for the user to select authorization or rejection.
It should be noted that, the model training method and device for determining user portraits of the present application can be used in the field of neural networks, and can also be used in any field other than the field of neural networks.
The application provides a model training method applied to determining user portraits, which aims to solve the technical problems in the prior art.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
FIG. 1 is a flow chart of a model training method for determining user portraits according to a first embodiment of the application. The first embodiment comprises the following steps:
s101, receiving encrypted gradient errors sent by terminal equipment in at least one terminal equipment; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model.
In one example, the at least one terminal device may be terminal device 1, terminal device 2, and terminal device 3, respectively.
The gradient error is the error information generated when the current terminal device trains its local model with its local historical user behavior information. The encryption method is not limited here; encryption prevents the gradient error from being modified during transmission.
Specifically, terminal device 1 may be the terminal device corresponding to telephone banking, and its historical user behavior information may include the user number, mobile phone number, user name, gender, incoming-call area, incoming-call type, incoming-call matters, incoming-call time, incoming-call frequency, customer satisfaction, and the like. Terminal device 2 may be the terminal device corresponding to mobile banking or online banking, and its historical user behavior information may include the user number, mobile phone number, user name, gender, business handling track, account information, the organization the user belongs to, usage time, usage frequency, user satisfaction, and the like. Terminal device 3 may be the terminal device corresponding to a branch outlet, and its historical user behavior information may include the user number, mobile phone number, user name, gender, business handling track, business handling channel (manual counter, smart counter, ATM, etc.), outlet organization number, account information, user satisfaction, and the like.
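Purely for illustration, the historical user behavior information of the three channels might be organized as the following records; every field name below is a hypothetical rendering of the features listed above and is not defined by the present application.

```python
# Hypothetical per-channel records for the historical user behavior information
# described above; every field name is illustrative only.
telephone_banking_record = {   # terminal device 1 (telephone banking channel)
    "user_id": "U000123", "phone": "138xxxxxxxx", "name": "...", "gender": "F",
    "incoming_call_area": "Beijing", "incoming_call_type": "card loss report",
    "incoming_call_time": "2023-07-01T10:30:00", "incoming_call_frequency": 4,
    "customer_satisfaction": 0.9,
}
mobile_banking_record = {      # terminal device 2 (mobile banking / online banking channel)
    "user_id": "U000123", "phone": "138xxxxxxxx", "name": "...", "gender": "F",
    "business_track": ["transfer", "wealth product purchase"],
    "account_info": {"balance_tier": 3}, "organization": "0101",
    "usage_time": "2023-07-02T20:15:00", "usage_frequency": 12,
    "customer_satisfaction": 0.8,
}
branch_outlet_record = {       # terminal device 3 (branch outlet channel)
    "user_id": "U000123", "phone": "138xxxxxxxx", "name": "...", "gender": "F",
    "business_track": ["card replacement"], "business_channel": "smart counter",
    "outlet_organization_number": "0101-03", "account_info": {"balance_tier": 3},
    "customer_satisfaction": 0.7,
}
```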
S102, processing the encrypted gradient error to obtain a processing result, and transmitting the processing result to terminal equipment in at least one terminal equipment; wherein the processing result is used to update the global parameter.
In this embodiment, the server decrypts the encrypted gradient error, then further processes the decrypted gradient error, and sends the processing result to the terminal device in at least one terminal device, and then the terminal device updates the global parameter in the terminal device using the processing result after receiving the processing result.
The aggregation model configured with the global parameters obtained when the preset time is reached is used to determine the user portrait.
In this embodiment, the preset time may be a preset duration, for example 30 minutes. After S101 and S102 have been executed repeatedly and the global parameters have been updated for 30 minutes, the resulting aggregation model is obtained and used to determine the user portrait.
The application provides a model training method applied to determining user portraits, which is implemented by receiving encrypted gradient errors sent by terminal equipment in at least one terminal equipment; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model; processing the encrypted gradient error to obtain a processing result, and transmitting the processing result to terminal equipment in at least one terminal equipment; the processing result is used for updating the global parameters; and the aggregate model of the global parameters obtained until the preset time is reached is used for determining the user portrait. By adopting the technical scheme, the historical service of each channel can be synthesized, the requirements of the user can be rapidly determined, the processing efficiency is further improved, and the time is saved.
Fig. 2 is a flowchart of a model training method applied to determining a user portrait according to a second embodiment of the present application. The second embodiment includes the following steps:
s201, issuing an aggregation model configured with global parameters to terminal equipment in at least one terminal equipment.
In this embodiment, the server is provided with an aggregation model, and the aggregation model is configured with global parameters, and then the server issues the aggregation model configured with the global parameters to the terminal device.
S202, receiving encrypted gradient errors sent by terminal equipment in at least one terminal equipment; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model.
In this embodiment, the historical user behavior information in each terminal device is obtained from the user's historical operations on that terminal device. For example, terminal device 1 trains the issued aggregation model with the historical user behavior information held in terminal device 1, and terminal device 2 trains the issued aggregation model with the historical user behavior information held in terminal device 2.
S203, decrypting the encrypted gradient error according to preset key information to obtain the gradient error.
In this embodiment, the decryption process may be performed according to the reverse of the encryption process, to obtain the gradient error.
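The present application does not limit the encryption scheme, so the following is only a minimal sketch assuming a pre-shared symmetric key and the Fernet scheme from the Python cryptography package; the function names and the choice of cipher are assumptions rather than the scheme of this application.

```python
# Minimal sketch of S203 under an assumed symmetric scheme (Fernet); in practice the
# preset key information would be provisioned to the server and terminals beforehand.
import pickle
import numpy as np
from cryptography.fernet import Fernet

PRESET_KEY = Fernet.generate_key()

def encrypt_gradient_error(grad_error: np.ndarray, key: bytes = PRESET_KEY) -> bytes:
    """Terminal side: serialize and encrypt the gradient error before uploading it."""
    return Fernet(key).encrypt(pickle.dumps(grad_error))

def decrypt_gradient_error(ciphertext: bytes, key: bytes = PRESET_KEY) -> np.ndarray:
    """Server side: recover the gradient error using the preset key information."""
    return pickle.loads(Fernet(key).decrypt(ciphertext))
```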
S204, analyzing the gradient error to obtain a processing result.
In one example, resolving the gradient error to obtain a processing result includes:
determining error information according to the gradient error; the error information characterizes the loss error of gradient information generated when the aggregation model is trained.
In this embodiment, after processing gradient errors uploaded by a plurality of terminal devices, the server determines error information.
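A minimal sketch of how the server might summarize the decrypted gradient errors from several terminal devices into the error information; the aggregation rule (a mean of absolute values) is an assumption of this sketch, not something specified by the application.

```python
import numpy as np

def analyse_gradient_errors(gradient_errors):
    """Summarise the decrypted gradient errors of all terminal devices into a single
    loss-error value (the error information); the mean-absolute rule is illustrative."""
    flat = np.concatenate([np.ravel(g) for g in gradient_errors])
    return float(np.mean(np.abs(flat)))
```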
S205, sending the processing result to terminal equipment in at least one terminal equipment; wherein the processing result is used to update the global parameter.
This step may refer to step S102 and is not described in detail again.
The aggregation model configured with the global parameters obtained when the preset time is reached is used to determine the user portrait.
S206, receiving model parameters sent by the terminal equipment in the at least one terminal equipment.
In this embodiment, the model parameters after the local training of the terminal device need to be uploaded to the server, so that the server performs processing.
S207, carrying out mean processing according to the model parameters, and determining updated global parameters.
In this embodiment, for example, if the model parameter of terminal device 1 is A, the model parameter of terminal device 2 is B, and the model parameter of terminal device 3 is C, then the updated global parameter is the average of A, B and C.
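A minimal sketch of the mean processing, assuming each terminal device uploads its model parameters as a list of numpy arrays (one array per layer); the function name and data layout are assumptions made for illustration.

```python
import numpy as np

def federated_average(uploaded_params):
    """Element-wise mean of the model parameters uploaded by the participating terminals."""
    return [np.mean(np.stack(layers), axis=0) for layers in zip(*uploaded_params)]

# With the example above: A, B and C are the parameters of terminal devices 1, 2 and 3.
A = [np.array([1.0, 2.0]), np.array([[0.1]])]
B = [np.array([3.0, 4.0]), np.array([[0.3]])]
C = [np.array([5.0, 6.0]), np.array([[0.5]])]
updated_global = federated_average([A, B, C])   # -> [array([3., 4.]), array([[0.3]])]
```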
S208, the updated global parameters are sent to terminal equipment in at least one terminal equipment.
In this embodiment, through this process each participating terminal device holds the same complete model; the participating terminal devices neither exchange data nor depend on each other, and each terminal device can also make predictions independently. This process can be regarded as sample-based (horizontal) distributed model training.
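The interaction of S201 to S208, repeated until the preset time is reached, can be pictured with the following sketch; `terminals`, the methods on its elements and `process_fn` are hypothetical placeholders for the message exchange described above, not an interface defined by this application.

```python
import time
import numpy as np

def federated_training_loop(terminals, global_params, process_fn, preset_minutes=30):
    """Server-side view of one training session, repeated until the preset time elapses."""
    deadline = time.monotonic() + preset_minutes * 60
    while time.monotonic() < deadline:
        for t in terminals:                                       # S201 / S208: issue global parameters
            t.receive_global_parameters(global_params)
        encrypted = [t.send_encrypted_gradient_error() for t in terminals]   # S202
        result = process_fn(encrypted)                            # S203 / S204: decrypt and analyse
        for t in terminals:                                       # S205: return the processing result
            t.receive_processing_result(result)
        uploads = [t.upload_model_parameters() for t in terminals]           # S206
        global_params = [np.mean(np.stack(layers), axis=0)        # S207: mean processing
                         for layers in zip(*uploads)]
    # The aggregation model configured with the final global parameters is then
    # used to determine user portraits.
    return global_params
```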
In one example, the method further comprises:
the updated global parameters are used for updating the model parameters of the terminal equipment in at least one terminal equipment to obtain local parameters; wherein the local parameters characterize the updated model parameters of the terminal device of the at least one terminal device.
In one example, the local parameters of terminal device 1 may be determined from the gradient error and the error information; after the local parameters of terminal device 1 are determined, they are sent to the server, the server processes them to obtain the updated global parameters, and terminal device 1 receives the updated global parameters issued by the server.
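On the terminal-device side the corresponding update might look like the following sketch; the update rule (scaling the gradient error by the error information) and all names are assumptions made for illustration, since the application does not fix a particular formula.

```python
import numpy as np

def update_local_parameters(model_params, gradient_error, error_info, lr=0.01):
    """Terminal side: derive the local parameters from the gradient error and the
    error information returned by the server (illustrative rule only)."""
    return [p - lr * error_info * g for p, g in zip(model_params, gradient_error)]

def adopt_updated_global_parameters(updated_global_params):
    """After uploading its local parameters, the terminal replaces its model
    parameters with the updated global parameters issued by the server."""
    return [np.copy(p) for p in updated_global_params]
```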
The application provides a model training method applied to determining user portraits, in which the model parameters sent by a terminal device among the at least one terminal device are received, mean processing is performed on the model parameters to determine the updated global parameters, and the updated global parameters are sent to the terminal devices. With this technical scheme, joint training over the data is carried out securely and a shared machine learning model is established; the number of samples and the amount of feature data available to each participating channel (channel A, channel B, channel C, ..., channel N) are effectively increased, which improves the accuracy of the user portrait and behavior characteristics.
Fig. 3 is a flowchart of a method for determining a user portrait according to a third embodiment of the present application. The third embodiment includes the following steps:
s301, acquiring identity information of a user.
In this embodiment, the identity information of the user may be a mobile phone number or a face image of the user.
S302, inputting identity information of a user into an aggregation model, and determining a user portrait of the user;
wherein the aggregate model is a model obtained based on the method of the first aspect.
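A minimal sketch of S301/S302 under assumed interfaces: the user's feature record is looked up from the identity information (for example the mobile phone number) and passed through the trained aggregation model; `feature_store`, `aggregation_model.predict` and the portrait labels are hypothetical.

```python
import numpy as np

def determine_user_portrait(identity_info, feature_store, aggregation_model, portrait_labels):
    """Map a user's identity information to a user portrait with the aggregation model."""
    features = feature_store[identity_info]                        # e.g. keyed by mobile phone number
    scores = np.asarray(aggregation_model.predict([features])[0])  # model trained as in the first aspect
    return portrait_labels[int(np.argmax(scores))]                 # e.g. "high-value customer"
```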
In this embodiment, after the user portrait of a user is determined, the branch outlet can be reminded about users with unresolved complaints so as to improve the quality of the user's appointment service at the outlet, and the user's portrait and behavior characteristic data is sent to the outlet staff in advance. The staff can thus learn the customer's relevant information and the latest transaction and consultation status in time, communicate better with the user, and build a closer relationship with the user.
Fig. 4 is a schematic structural diagram of a model training apparatus for determining a representation of a user according to a fourth embodiment of the present application. Specifically, the apparatus 40 of the fourth embodiment includes:
a first receiving unit 401, configured to receive an encrypted gradient error sent by a terminal device in at least one terminal device; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model;
a first processing unit 402, configured to process the encrypted gradient error, obtain a processing result, and send the processing result to a terminal device in at least one terminal device; the processing result is used for updating the global parameters;
a determining unit 403, configured to determine an aggregate model of global parameters obtained until a preset time is reached, for determining a user portrait.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the above-described apparatus may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
Fig. 5 is a schematic structural diagram of a model training apparatus for determining a user portrait according to a fifth embodiment of the present application. Specifically, the apparatus 50 of the fifth embodiment includes:
a first receiving unit 501, configured to receive an encrypted gradient error sent by a terminal device in at least one terminal device; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model;
the first processing unit 502 is configured to process the encrypted gradient error, obtain a processing result, and send the processing result to a terminal device in at least one terminal device; the processing result is used for updating the global parameters;
a determining unit 503, configured to determine an aggregate model of global parameters obtained until a preset time is reached, for determining a user portrait.
In one example, processing unit 502 includes:
the decryption module 5021 is configured to decrypt the encrypted gradient error according to preset key information to obtain a gradient error;
and the analysis module 5022 is used for analyzing the gradient error to obtain a processing result.
In one example, the parsing module 5022 comprises:
a determining submodule 50221 for determining error information according to the gradient error; the error information characterizes the loss error of gradient information generated when the aggregation model is trained.
In one example, the apparatus 50 further comprises:
a second receiving unit 504 for receiving the model parameters sent by the terminal device in the at least one terminal device;
a second processing unit 505, configured to perform mean processing according to the model parameters, and determine updated global parameters;
and a sending unit 506, configured to send the updated global parameter to a terminal device in the at least one terminal device.
In one example, the apparatus 50 further comprises:
the updated global parameters are used for updating the model parameters of the terminal equipment in at least one terminal equipment to obtain local parameters; wherein the local parameters characterize the updated model parameters of the terminal device of the at least one terminal device.
In one example, the apparatus 50 further comprises:
the historical user behavior information in each terminal device is obtained from the user's historical operations on the corresponding terminal device.
In one example, the apparatus 50 further comprises:
and a issuing unit 507, configured to issue the aggregation model configured with the global parameter to a terminal device in the at least one terminal device.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the above-described apparatus may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
Fig. 6 is a schematic structural diagram of a user portrait determining apparatus according to a sixth embodiment of the present application. Specifically, the apparatus 60 of the sixth embodiment includes:
an obtaining unit 601, configured to obtain identity information of a user;
an input unit 602, configured to input identity information of a user into the aggregation model, and determine a user portrait of the user;
wherein the aggregate model is a model obtained based on the apparatus of the third aspect.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the above-described apparatus may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
Fig. 7 is a block diagram of a terminal device, which may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, etc., in accordance with an exemplary embodiment.
Terminal device 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the terminal device 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operation at the terminal device 700. Examples of such data include instructions for any application or method operating on the terminal device 700, contact data, phonebook data, messages, pictures, video, and the like. The memory 704 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 706 provides power to the various components of the terminal device 700. Power supply components 706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal device 700.
The multimedia component 708 includes a screen between the terminal device 700 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or sliding action, but also the duration and pressure associated with the touch or sliding operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the terminal device 700 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the terminal device 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects for the terminal device 700. For example, the sensor assembly 714 may detect an on/off state of the terminal device 700, a relative positioning of the assemblies, such as a display and keypad of the terminal device 700, the sensor assembly 714 may also detect a change in position of the terminal device 700 or a component of the terminal device 700, the presence or absence of a user's contact with the terminal device 700, an orientation or acceleration/deceleration of the terminal device 700, and a change in temperature of the terminal device 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate communication between the terminal device 700 and other devices, either wired or wireless. The terminal device 700 may access a wireless network based on a communication standard, such as WiFi,2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal device 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 704, including instructions executable by processor 720 of terminal device 700 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
A non-transitory computer readable storage medium whose instructions, when executed by a processor of a terminal device, cause the terminal device to perform the above-described model training method applied to determining a user portrait.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (12)

1. A model training method for determining a representation of a user, the method comprising:
receiving an encrypted gradient error sent by a terminal device in at least one terminal device; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model;
processing the encrypted gradient error to obtain a processing result, and sending the processing result to terminal equipment in the at least one terminal equipment; wherein the processing result is used for updating the global parameter;
and the aggregation model of the global parameters obtained until the preset time is reached is used for determining the user portrait.
2. The method of claim 1, wherein processing the encrypted gradient error to obtain a processed result comprises:
decrypting the encrypted gradient error according to preset key information to obtain a gradient error;
and analyzing the gradient error to obtain a processing result.
3. The method of claim 2, wherein said resolving the gradient error to obtain a processed result comprises:
determining error information according to the gradient error; the error information characterizes the loss error of gradient information generated when the aggregation model is trained.
4. The method of claim 1, further comprising, after transmitting the processing result to a terminal device of the at least one terminal device:
receiving model parameters sent by terminal equipment in the at least one terminal equipment;
carrying out mean value processing according to the model parameters, and determining updated global parameters;
and sending the updated global parameters to terminal equipment in the at least one terminal equipment.
5. The method according to claim 4, wherein the method further comprises:
the updated global parameters are used for updating the model parameters by the terminal equipment in the at least one terminal equipment to obtain local parameters; wherein the local parameters characterize updated model parameters of terminal devices of the at least one terminal device.
6. The method according to claim 4, wherein the method further comprises:
the historical user behavior information in each terminal device is obtained from the user's historical operations on the corresponding terminal device.
7. The method according to any one of claims 1-6, further comprising:
and issuing an aggregation model configured with global parameters to terminal equipment in the at least one terminal equipment.
8. A method for determining a representation of a user, the method comprising:
acquiring identity information of a user;
inputting the identity information of the user into an aggregation model, and determining the user portrait of the user;
wherein the aggregation model is a model obtained based on the method of any one of claims 1-7.
9. A model training apparatus for use in determining a representation of a user, the apparatus comprising:
a receiving unit, configured to receive an encrypted gradient error sent by a terminal device in at least one terminal device; the gradient error is obtained by inputting historical user behavior information in the terminal equipment into an aggregation model configured with global parameters for processing; the global parameters are preset, and the global parameters are global model parameters of an aggregation model in the aggregation server; the gradient error characterizes error information of gradient information generated when the terminal equipment trains the aggregation model;
the processing unit is used for processing the encrypted gradient error to obtain a processing result and sending the processing result to the terminal equipment in the at least one terminal equipment; wherein the processing result is used for updating the global parameter;
the determining unit is used for determining the aggregation model of the global parameters obtained until the preset time is reached and is used for determining the user portrait.
10. A user representation determining apparatus, the apparatus comprising:
the acquisition unit is used for acquiring the identity information of the user;
the input unit is used for inputting the identity information of the user into the aggregation model and determining the user portrait of the user;
wherein the aggregation model is a model based on the apparatus of claim 9.
11. A terminal device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1-7 or to implement the method of claim 8.
12. A computer readable storage medium having stored therein computer executable instructions for implementing the method of any of claims 1-7 or for implementing the method of claim 8 when executed by a processor.
CN202310967616.5A, filed 2023-08-02 (priority date 2023-08-02): Model training method, device and equipment applied to user portrait determination. Status: Pending. Publication: CN116992956A (en)

Priority Applications (1)

Application Number: CN202310967616.5A
Priority Date / Filing Date: 2023-08-02
Title: Model training method, device and equipment applied to user portrait determination
Publication: CN116992956A (en)

Publications (1)

Publication Number: CN116992956A
Publication Date: 2023-11-03

Family

ID: 88522861

Family Applications (1)

Application Number: CN202310967616.5A (Pending)
Priority Date / Filing Date: 2023-08-02
Title: Model training method, device and equipment applied to user portrait determination

Country Status (1)

Country: CN
Publication: CN116992956A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination