CN115545162A - Atomic service combination optimization method and device - Google Patents

Atomic service combination optimization method and device

Info

Publication number
CN115545162A
Authority
CN
China
Prior art keywords
atomic service
atomic
service
neural network
combined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211275152.3A
Other languages
Chinese (zh)
Inventor
卜晓璇
续倩
谭珂
徐云筱
马嘉琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd
Priority to CN202211275152.3A
Publication of CN115545162A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods


Abstract

The embodiments of the present application provide an atomic service combination optimization method and device, relating to the field of machine learning. The method comprises the following steps: clustering the atomic services in a data set to obtain an atomic service combination sample set; pre-training a preset combined recommended neural network model according to the atomic service combination sample set to obtain a pre-trained combined recommended neural network model; and performing model training on the combined recommended neural network model through a gradient descent algorithm and updating the atomic service combination sample set according to the model training result until the combined recommended neural network model after model training meets a preset convergence condition. The method and device can accurately and efficiently select an atomic service combination for the user.

Description

Atomic service combination optimization method and device
Technical Field
The application relates to the field of machine learning, in particular to an atomic service combination optimization method and device.
Background
With the development of atomic services, the coupling between services can be greatly reduced. Smaller services are combined to form new Web applications, and atomic services can be combined in many different ways. Each atomic service has its own quality-of-service attributes (such as execution price, execution time, availability, throughput, success rate and reliability), which together measure the quality of service of a single service. As the field develops, service combination systems use and combine massive numbers of atomic services with different quality-of-service attributes. From the user's perspective, the quality of service of the overall combined service should be as close to optimal as possible.
Existing methods perform poorly on large-scale atomic service selection: heuristic methods with good stability are inefficient, classical optimization methods lose effectiveness on large-scale problems, and existing learning algorithms have seen little application to the service combination problem and have produced poor results.
Disclosure of Invention
Aiming at the problems in the prior art, the application provides an atomic service combination optimization method and device, which can accurately and efficiently select an atomic service combination for a user.
In order to solve at least one of the above problems, the present application provides the following technical solutions:
in a first aspect, the present application provides an atomic service composition optimization method, including:
clustering the atomic services in the data set to obtain an atomic service combination sample set;
pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model;
and performing model training on the combined recommended neural network model through a gradient descent algorithm, and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
Further, the clustering the atomic services in the data set to obtain an atomic service combination sample set includes:
clustering each atomic service in the set data set according to different function types to generate an abstract combined service to be selected and optimized;
and obtaining an atomic service combination sample set according to the abstract combination service.
Further, the obtaining an atomic service composition sample set according to the abstract composition service includes:
randomly generating a plurality of different atomic service combination samples according to the abstract combination service;
and selecting a set number of atomic service combinations from a plurality of different atomic combination samples according to a preset sample optimization rule to obtain an atomic service combination sample set.
Further, the pre-training a pre-set combined recommended neural network model according to the atomic service combination sample set to obtain a pre-trained combined recommended neural network model includes:
and carrying out maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model.
Further, the model training of the combined recommendation neural network model through a gradient descent algorithm includes:
and carrying out model training on the combined recommendation neural network model through a small batch gradient descent algorithm to obtain a prediction result set.
Further, the updating the atomic service combination sample set according to the model training result until the combination recommendation neural network model after model training meets a preset convergence condition includes:
calculating the difference value between the service quality value of the prediction result set and a preset service quality threshold value;
and updating the atomic service combination sample set according to the difference until the combination recommendation neural network model after model training meets a preset convergence condition.
In a second aspect, the present application provides an atomic service composition optimization apparatus, including:
the initialization module is used for clustering the atomic services in the data set to obtain an atomic service combination sample set;
the pre-training module is used for pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model;
and the reinforcement learning module is used for carrying out model training on the combined recommendation neural network model through a gradient descent algorithm and updating the atomic service combined sample set according to a model training result until the combined recommendation neural network model after model training meets a preset convergence condition.
Further, the initialization module includes:
the clustering unit is used for clustering each atomic service in the set data set according to different function types to generate an abstract combined service to be selected and optimized;
and the sample construction unit is used for obtaining an atomic service combination sample set according to the abstract combination service.
Further, the sample construction unit includes:
a random subunit, configured to randomly generate a plurality of different atomic service composition samples according to the abstract composition service;
and the extraction subunit is used for selecting the atomic service combinations with the set number from the plurality of different atomic combination samples according to the preset sample optimization rule to obtain an atomic service combination sample set.
Further, the pre-training module comprises:
and the maximum likelihood estimation unit is used for carrying out maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model.
Further, the reinforcement learning module includes:
and the gradient descent unit is used for carrying out model training on the combined recommendation neural network model through a small batch gradient descent algorithm to obtain a prediction result set.
Further, the reinforcement learning module further comprises:
the service quality calculation unit is used for calculating the difference value between the service quality value of the prediction result set and a preset service quality threshold value;
and the convergence training unit is used for updating the atomic service combination sample set according to the difference until the combination recommendation neural network model after model training meets a preset convergence condition.
In a third aspect, the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the atomic service composition optimization method when executing the program.
In a fourth aspect, the present application provides a computer-readable storage medium on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the atomic service combination optimization method.
In a fifth aspect, the present application provides a computer program product comprising a computer program/instructions which, when executed by a processor, implement the steps of the atomic service combination optimization method.
According to the above technical scheme, the present application provides an atomic service combination optimization method and device: pre-training the preset combined recommended neural network model improves subsequent operation efficiency, and performing model training on the combined recommended neural network model through a gradient descent algorithm allows an atomic service combination to be selected for the user accurately and efficiently.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a first flowchart illustrating an atomic service combination optimization method in an embodiment of the present application;
FIG. 2 is a second flowchart illustrating an atomic service combination optimization method in an embodiment of the present application;
FIG. 3 is a third flowchart illustrating an atomic service combination optimization method in an embodiment of the present application;
FIG. 4 is a fourth flowchart illustrating an atomic service combination optimization method in an embodiment of the present application;
FIG. 5 is a first structural diagram of an atomic service combination optimization apparatus in an embodiment of the present application;
FIG. 6 is a second structural diagram of an atomic service combination optimization apparatus in an embodiment of the present application;
FIG. 7 is a third structural diagram of an atomic service combination optimization apparatus in an embodiment of the present application;
FIG. 8 is a fourth structural diagram of an atomic service combination optimization apparatus in an embodiment of the present application;
FIG. 9 is a fifth structural diagram of an atomic service combination optimization apparatus in an embodiment of the present application;
FIG. 10 is a sixth structural diagram of an atomic service combination optimization apparatus in an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating an atomic service combination optimization process in an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
According to the technical scheme, the data acquisition, storage, use, processing and the like meet relevant regulations of national laws and regulations.
The atomic service combination optimization method and device provided by the present application can accurately and efficiently select an atomic service combination for the user: pre-training the preset combined recommended neural network model improves subsequent operation efficiency, and the combined recommended neural network model is then trained through a gradient descent algorithm.
In order to accurately and efficiently select an atomic service combination for a user, the present application provides an embodiment of an atomic service combination optimization method, which specifically includes, referring to fig. 1:
step S101: and clustering the atomic services in the data set to obtain an atomic service combination sample set.
Optionally, first, the method may classify the atomic services in the data set according to functions, and divide the atomic services into a plurality of cluster services, where each cluster includes a certain number of atomic services with the same function and different service qualities.
Then, the abstract combined service to be selected and optimized can be generated, wherein the abstract combined service comprises a topological structure model and cluster selection.
For example, each composite service has its own atomic service pool. By selecting atomic services from the atomic service pool to replace the abstract atomic services in the composite service, a concrete composite service can be generated.
Finally, different atomic combination samples can be randomly generated for the abstract combined service, and an optimal number of groups (for example, 64 groups) is selected according to a preset sample preference rule to form the atomic service combination sample set used as pre-training samples.
For example, for each different atomic combination, the quality-of-service attributes of the combined service can be calculated from the quality-of-service attributes of its atomic services (such as cost, availability, response time, reliability and success rate): cost and response time are summed, while availability, reliability and success rate are multiplied. The resulting attributes are then combined in a weighted sum, where positive attributes (such as reliability, for which higher is better) take positive values and negative attributes (such as response time, for which lower is better) take negative values. The larger the resulting sum, the higher the quality of service, and the 64 groups with the highest quality of service are selected.
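Purely for illustration, the aggregation and selection just described can be sketched in Python as follows; the attribute names, the unit weights and the select_top_samples helper are assumptions introduced for this sketch, not values taken from the application.

import random

# Hypothetical per-atomic-service QoS attributes; the names and unit weights below
# are assumptions for this sketch, not values from the application.
# cost and response_time are negative attributes (lower is better); availability,
# reliability and success_rate are positive attributes (higher is better).
WEIGHTS = {"cost": -1.0, "response_time": -1.0,
           "availability": 1.0, "reliability": 1.0, "success_rate": 1.0}

def combined_qos(services):
    """Aggregate atomic-service QoS attributes into combined-service QoS, then score it."""
    agg = {
        "cost": sum(s["cost"] for s in services),                    # summed
        "response_time": sum(s["response_time"] for s in services), # summed
        "availability": 1.0, "reliability": 1.0, "success_rate": 1.0,
    }
    for s in services:                                               # multiplied
        agg["availability"] *= s["availability"]
        agg["reliability"] *= s["reliability"]
        agg["success_rate"] *= s["success_rate"]
    # Weighted sum: positive attributes contribute positively, negative ones negatively.
    return sum(WEIGHTS[k] * v for k, v in agg.items())

def select_top_samples(abstract_service, pools, n_samples=1000, k=64):
    """Randomly instantiate the abstract combined service and keep the k best samples."""
    candidates = []
    for _ in range(n_samples):
        choice = [random.choice(pools[cluster]) for cluster in abstract_service]
        candidates.append((combined_qos(choice), choice))
    candidates.sort(key=lambda item: item[0], reverse=True)
    return [choice for _, choice in candidates[:k]]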
Step S102: and pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain the pre-trained combined recommended neural network model.
Optionally, the present application may perform maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combination sample set to obtain the pre-trained combined recommended neural network model.
Specifically, the action neural network can be pre-trained through maximum likelihood estimation to learn the distribution characteristics of good selection schemes, so that the subsequent reinforcement learning training starts from a better point, which speeds up the convergence of the algorithm and improves its efficiency.
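Purely for illustration, a minimal sketch of this pre-training step is given below in Python (PyTorch). The ActionNetwork architecture, its dummy fixed input and the assumption that every cluster has the same number of candidate services are simplifications not taken from the application; the network is simply fit to the preferred sample set by maximizing the log-likelihood of those selections (equivalently, minimizing cross-entropy).

import torch
import torch.nn as nn

class ActionNetwork(nn.Module):
    """Hypothetical action network: one score per candidate service per cluster.
    The architecture, the dummy fixed input and the equal-size-cluster assumption
    are simplifications for this sketch only."""
    def __init__(self, n_clusters, n_candidates, hidden=128):
        super().__init__()
        self.n_clusters, self.n_candidates = n_clusters, n_candidates
        self.body = nn.Sequential(
            nn.Linear(n_clusters, hidden), nn.ReLU(),
            nn.Linear(hidden, n_clusters * n_candidates),
        )

    def forward(self, x):
        # Returns selection logits of shape (batch, n_clusters, n_candidates).
        return self.body(x).view(-1, self.n_clusters, self.n_candidates)

def pretrain_mle(net, sample_set, epochs=100, lr=1e-3):
    """Maximum-likelihood pre-training on the preferred sample set.
    sample_set: LongTensor (n_samples, n_clusters) holding the index of the atomic
    service chosen in each cluster for each good sample."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    x = torch.ones(sample_set.size(0), net.n_clusters)   # dummy fixed input
    for _ in range(epochs):
        logits = net(x)                                   # (batch, clusters, candidates)
        # Negative log-likelihood of the good selections (cross-entropy).
        loss = loss_fn(logits.reshape(-1, net.n_candidates), sample_set.reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return net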
Step S103: and performing model training on the combined recommended neural network model through a gradient descent algorithm, and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
Optionally, the present application may train the neural network through gradient descent to search for combination modes that produce better quality-of-service (QoS) values, and further update the training set with the better combination results.
For example, referring to FIG. 11, each time the agent generates a selection result set, the quality of service of the combined service under that selection can be calculated from the result set and compared with maxQos, and maxQos is updated to the larger of the two quality-of-service values. The agent parameters are optimized by a gradient descent algorithm. If the number of steps exceeds 300 (that is, the number of cycles is greater than 300), convergence is checked (that is, whether the gradient descent magnitude is less than 10⁻⁶); if converged, the result set at this time is taken as the final result set. If the convergence condition is not met, the result set selected by the trained agent is merged with the pre-trained result set, the quality of service of the merged result sets is compared, the best result sets are selected as the new sample set, and subsequent training continues.
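Purely for illustration, the loop just described can be sketched as follows in Python, reusing the hypothetical ActionNetwork and combined_qos helpers from the earlier sketches; the REINFORCE-style objective, the dummy network input and the exact reward shaping are assumptions, since the application specifies the procedure only at the level of FIG. 11.

import torch

def train_agent(net, pools, abstract_service, sample_set, max_steps=300, tol=1e-6,
                batch_size=64, lr=1e-3):
    """Score-based training loop with maxQos tracking (illustrative sketch only)."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    max_qos = max(combined_qos(s) for s in sample_set)     # best QoS seen so far
    step, grad_norm = 0, float("inf")
    while not (step >= max_steps and grad_norm < tol):     # >300 steps and converged
        x = torch.ones(batch_size, net.n_clusters)          # dummy fixed input, as in pre-training
        logits = net(x)                                      # (batch, clusters, candidates)
        dist = torch.distributions.Categorical(logits=logits)
        actions = dist.sample()                              # sampled candidate index per cluster
        # Decode each sampled selection into atomic services and compute its QoS.
        rewards = torch.tensor([
            combined_qos([pools[c][int(actions[b, i])]
                          for i, c in enumerate(abstract_service)])
            for b in range(batch_size)], dtype=torch.float32)
        max_qos = max(max_qos, rewards.max().item())         # maxQos comparison and update
        # REINFORCE-style objective: raise the scores of selections with high QoS.
        loss = -(dist.log_prob(actions).sum(dim=1) * (rewards - rewards.mean())).mean()
        opt.zero_grad()
        loss.backward()
        grad_norm = sum(p.grad.norm().item() for p in net.parameters() if p.grad is not None)
        opt.step()
        step += 1
    # Outside the loop one would merge the agent's selections with the pre-training
    # samples, keep the best as the new sample set, and repeat until convergence.
    return net, max_qos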
From the above description, the atomic service combination optimization method provided in the embodiment of the present application can improve subsequent operation efficiency by pre-training the preset combination recommended neural network model, and can accurately and efficiently select an atomic service combination for a user by performing model training on the combination recommended neural network model through a gradient descent algorithm.
In order to accurately construct the training sample, in an embodiment of the atomic service composition optimization method of the present application, referring to fig. 2, the step S101 may further include the following steps:
step S201: and clustering each atomic service in the set data set according to different function types to generate an abstract combined service to be selected and optimized.
Step S202: and obtaining an atomic service combination sample set according to the abstract combination service.
Optionally, first, the method may classify the atomic services in the data set according to functions, and divide the atomic services into a plurality of cluster services, where each cluster includes a certain number of atomic services with the same function and different service qualities.
Then, the abstract combined service to be optimized can be generated, wherein the abstract combined service comprises a topological structure model and cluster selection.
In order to accurately construct the training sample, in an embodiment of the atomic service composition optimization method of the present application, referring to fig. 3, the step S202 may further include the following steps:
step S301: and randomly generating a plurality of different atomic service composition samples according to the abstract composition service.
Step S302: and selecting a set number of atomic service combinations from a plurality of different atomic combination samples according to a preset sample optimization rule to obtain an atomic service combination sample set.
Optionally, different atomic combination samples may be randomly generated for the abstract combination service, and an optimal number of groups (for example, 64 groups) is selected as the pre-training sample set according to a preset sample preference rule.
In order to improve the subsequent operation efficiency, in an embodiment of the atomic service combination optimization method of the present application, the step S102 may further include the following steps:
and carrying out maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model.
Specifically, the action neural network can be pre-trained through maximum likelihood estimation to learn the distribution characteristics of good selection schemes, so that the subsequent reinforcement learning training starts from a better point, which speeds up the convergence of the algorithm and improves its efficiency.
In order to improve the accuracy of model prediction, in an embodiment of the atomic service composition optimization method, the step S103 may further include the following steps:
and carrying out model training on the combined recommendation neural network model through a small batch gradient descent algorithm to obtain a prediction result set.
In order to improve the accuracy of model prediction, in an embodiment of the atomic service composition optimization method of the present application, referring to fig. 4, the step S103 may further include the following steps:
step S401: and calculating the difference value between the service quality value of the prediction result set and a preset service quality threshold value.
Step S402: and updating the atomic service combination sample set according to the difference until the combination recommendation neural network model after model training meets a preset convergence condition.
In order to accurately and efficiently select an atomic service combination for a user, the present application provides an embodiment of an atomic service combination optimization apparatus for implementing all or part of the contents of the atomic service combination optimization method, and referring to fig. 5, the atomic service combination optimization apparatus specifically includes the following contents:
and the initialization module 10 is configured to cluster the atomic services in the data set to obtain an atomic service combination sample set.
And the pre-training module 20 is configured to pre-train a pre-set combined recommended neural network model according to the atomic service combination sample set, so as to obtain a pre-trained combined recommended neural network model.
And the reinforcement learning module 30 is configured to perform model training on the combined recommended neural network model through a gradient descent algorithm, and update the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
As can be seen from the above description, the atomic service combination optimization device provided in the embodiment of the present application can improve subsequent operation efficiency by pre-training the pre-set combination recommendation neural network model, and can accurately and efficiently select an atomic service combination for a user by performing model training on the combination recommendation neural network model through a gradient descent algorithm.
In order to be able to accurately construct the training sample, in an embodiment of the atomic service composition optimization apparatus of the present application, referring to fig. 6, the initialization module 10 includes:
and the clustering unit 11 is used for clustering each atomic service in the set data set according to different function types to generate an abstract combined service to be selected and optimized.
And the sample construction unit 12 is configured to obtain an atomic service combination sample set according to the abstract combination service.
In order to be able to accurately construct a training sample, in an embodiment of the atomic service composition optimization apparatus of the present application, referring to fig. 7, the sample construction unit 12 includes:
a random subunit 121, configured to randomly generate a plurality of different atomic service composition samples according to the abstract composition service.
The extracting subunit 122 is configured to select a set number of atomic service combinations from multiple different atomic combination samples according to a preset sample preference rule to obtain an atomic service combination sample set.
In order to improve the subsequent operation efficiency, in an embodiment of the atomic service composition optimization apparatus of the present application, referring to fig. 8, the pre-training module 20 includes:
and the maximum likelihood estimation unit 21 is configured to perform maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combined sample set, so as to obtain a pre-trained combined recommended neural network model.
In order to improve the model prediction accuracy, in an embodiment of the atomic service composition optimization apparatus of the present application, referring to fig. 9, the reinforcement learning module 30 includes:
and the gradient descent unit 31 is configured to perform model training on the combined recommended neural network model through a small batch gradient descent algorithm to obtain a prediction result set.
In order to improve the model prediction accuracy, in an embodiment of the atomic service composition optimization apparatus of the present application, referring to fig. 10, the reinforcement learning module 30 further includes:
a service quality calculating unit 32, configured to calculate a difference between the service quality value of the prediction result set and a preset service quality threshold.
And the convergence training unit 33 is configured to update the atomic service combination sample set according to the difference until the combination recommendation neural network model after model training meets a preset convergence condition.
To further illustrate the present solution, the present application further provides a specific application example of implementing the atomic service combination optimization method by using the atomic service combination optimization apparatus, which is shown in fig. 11 and specifically includes the following contents:
1. An initialization module: in the initialization module, the atomic services in the data set are first classified according to function and divided into a plurality of cluster services, each cluster containing a certain number of atomic services with the same function but different service qualities. An abstract combined service to be optimized, including a topological structure model and cluster selection, is then generated. Finally, different atomic combination samples are randomly generated for the abstract combined service, and the 64 optimal groups are selected from them as pre-training samples.
2. A pre-training module: in the pre-training module, the action neural network is trained through maximum likelihood estimation to learn the distribution characteristics of good selection schemes. The subsequent reinforcement learning training thus starts from a better point, which speeds up the convergence of the algorithm and improves its efficiency.
3. A score-based sampling training module: the neural network will be further trained by gradient descent to search for combinations that yield better QoS values, and the training set will be further updated with the better combinations.
Specifically, the data set is first processed and classified, and a group of training samples representing different selections of atomic services from the candidate set is generated. The network is then pre-trained with these training samples. After pre-training, the network parameters are trained further by reinforcement learning: each candidate service is scored by the network, selections are sampled multiple times based on these scores, and the parameters are optimized by a mini-batch gradient descent algorithm. The network thus searches for better selection results, and this process is repeated until the algorithm converges. This approach balances exploration and exploitation.
In this way, the method is accelerated by adopting pre-training, and exploration and exploitation are well balanced through score-based policy sampling and training.
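Purely for illustration, the three modules can be chained as in the following sketch, reusing the hypothetical helpers introduced above (select_top_samples, combined_qos, ActionNetwork, pretrain_mle, train_agent); the way samples are indexed into the network and the assumption that all clusters share one pool size are likewise simplifications, not details from the application.

import torch

def optimize_composition(pools, abstract_service):
    """End-to-end sketch: initialization -> pre-training -> score-based sampling training."""
    # 1. Initialization: random candidate combinations, keep the 64 best as samples.
    samples = select_top_samples(abstract_service, pools, n_samples=1000, k=64)

    # 2. Pre-training: fit the action network to the preferred samples by
    #    maximum likelihood estimation (assumes equal-size candidate pools).
    n_clusters = len(abstract_service)
    n_candidates = len(pools[abstract_service[0]])
    net = ActionNetwork(n_clusters, n_candidates)
    sample_idx = torch.tensor([[pools[c].index(s[i]) for i, c in enumerate(abstract_service)]
                               for s in samples])
    pretrain_mle(net, sample_idx)

    # 3. Score-based sampling training: refine the network by mini-batch gradient
    #    descent until the convergence condition is met, tracking the best QoS.
    net, max_qos = train_agent(net, pools, abstract_service, samples)
    return net, max_qos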
In order to accurately and efficiently select an atomic service combination for a user on a hardware level, the present application provides an embodiment of an electronic device for implementing all or part of contents in the atomic service combination optimization method, where the electronic device specifically includes the following contents:
a processor (processor), a memory (memory), a communication Interface (Communications Interface), and a bus; the processor, the memory and the communication interface complete mutual communication through the bus; the communication interface is used for realizing information transmission between the atomic service combination optimization device and relevant equipment such as a core service system, a user terminal, a relevant database and the like; the logic controller may be a desktop computer, a tablet computer, a mobile terminal, and the like, but the embodiment is not limited thereto. In this embodiment, the logic controller may be implemented with reference to the embodiment of the atomic service combination optimization method and the embodiment of the atomic service combination optimization apparatus in the embodiment, and the contents thereof are incorporated herein, and repeated descriptions are omitted.
It is understood that the user terminal may include a smart phone, a tablet electronic device, a network set-top box, a portable computer, a desktop computer, a Personal Digital Assistant (PDA), a vehicle-mounted device, a smart wearable device, and the like. Wherein, intelligence wearing equipment can include intelligent glasses, intelligent wrist-watch, intelligent bracelet etc..
In practical applications, part of the atomic service composition optimization method may be performed on the electronic device side as described above, or all operations may be performed in the client device. The selection may be specifically performed according to the processing capability of the client device, the limitation of the user usage scenario, and the like. This is not a limitation of the present application. The client device may further include a processor if all operations are performed in the client device.
The client device may have a communication module (i.e., a communication unit), and may be communicatively connected to a remote server to implement data transmission with the server. The server may include a server on the task scheduling center side, and in other implementation scenarios, the server may also include a server on an intermediate platform, for example, a server on a third-party server platform that has a communication link with the task scheduling center server. The server may include a single computer device, or may include a server cluster formed by a plurality of servers, or a server structure of a distributed apparatus.
Fig. 12 is a schematic block diagram of a system configuration of an electronic device 9600 according to the embodiment of the present application. As shown in fig. 12, the electronic device 9600 can include a central processor 9100 and a memory 9140; the memory 9140 is coupled to the central processor 9100. Notably, this fig. 12 is exemplary; other types of structures may also be used in addition to or in place of the structure to implement telecommunications or other functions.
In one embodiment, the functionality of the atomic service combination optimization method may be integrated into the central processor 9100.
The central processor 9100 may be configured to control as follows:
step S101: and clustering the atomic services in the data set to obtain an atomic service combination sample set.
Step S102: and pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain the pre-trained combined recommended neural network model.
Step S103: and performing model training on the combined recommended neural network model through a gradient descent algorithm, and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
As can be seen from the above description, according to the electronic device provided in the embodiment of the present application, the pre-training is performed on the preset combination recommended neural network model, so that the subsequent operation efficiency is improved, and the model training is performed on the combination recommended neural network model through the gradient descent algorithm, so that the atomic service combination can be accurately and efficiently selected for the user.
In another embodiment, the atomic service composition optimizing device may be configured separately from the central processing unit 9100, for example, the atomic service composition optimizing device may be configured as a chip connected to the central processing unit 9100, and the function of the atomic service composition optimizing method may be implemented by the control of the central processing unit.
As shown in fig. 12, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 also does not necessarily include all of the components shown in fig. 12; further, the electronic device 9600 may further include components not shown in fig. 12, which can be referred to in the related art.
As shown in fig. 12, a central processor 9100, sometimes referred to as a controller or operational control, can include a microprocessor or other processor device and/or logic device, which central processor 9100 receives input and controls the operation of the various components of the electronic device 9600.
The memory 9140 can be, for example, one or more of a buffer, a flash memory, a hard drive, removable media, volatile memory, non-volatile memory, or other suitable devices. It may store information relating to failures as well as programs for processing such information, and the central processing unit 9100 can execute the programs stored in the memory 9140 to realize information storage or processing.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. Power supply 9170 is used to provide power to electronic device 9600. The display 9160 is used for displaying display objects such as images and characters. The display may be, for example, an LCD display, but is not limited thereto.
The memory 9140 can be a solid-state memory, e.g., Read-Only Memory (ROM), Random Access Memory (RAM), a SIM card, or the like. It may also be a memory that retains information even when powered off, that can be selectively erased, and that can be provided with additional data; such a memory is sometimes called an EPROM or the like. The memory 9140 could also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer). The memory 9140 may include an application/function storage portion 9142, which is used for storing application programs and function programs or for storing a flow of operations of the electronic device 9600 executed by the central processor 9100.
The memory 9140 can also include a data store 9143, the data store 9143 being used to store data, such as contacts, digital data, pictures, sounds, and/or any other data used by an electronic device. The driver storage portion 9144 of the memory 9140 may include various drivers for the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, contact book applications, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. The communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and receive audio input from the microphone 9132, thereby implementing ordinary telecommunications functions. The audio processor 9130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100, thereby enabling recording locally through the microphone 9132 and enabling locally stored sounds to be played through the speaker 9131.
An embodiment of the present application further provides a computer-readable storage medium capable of implementing all the steps in the atomic service composition optimization method with the execution subject being the server or the client in the foregoing embodiments, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the computer program implements all the steps in the atomic service composition optimization method with the execution subject being the server or the client in the foregoing embodiments, for example, when the processor executes the computer program, the processor implements the following steps:
step S101: and clustering the atomic services in the data set to obtain an atomic service combination sample set.
Step S102: and pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain the pre-trained combined recommended neural network model.
Step S103: and performing model training on the combined recommended neural network model through a gradient descent algorithm, and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
As can be seen from the above description, the computer-readable storage medium provided in the embodiment of the present application improves subsequent operation efficiency by pre-training the preset combined recommended neural network model, and can accurately and efficiently select an atomic service combination for a user by performing model training on the combined recommended neural network model through a gradient descent algorithm.
Embodiments of the present application further provide a computer program product capable of implementing all steps in the atomic service composition optimization method whose execution subject is a server or a client in the foregoing embodiments, where the computer program/instruction when executed by a processor implements the steps of the atomic service composition optimization method, for example, the computer program/instruction implements the following steps:
step S101: and clustering the atomic services in the data set to obtain an atomic service combination sample set.
Step S102: and pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain the pre-trained combined recommended neural network model.
Step S103: and performing model training on the combined recommended neural network model through a gradient descent algorithm, and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
As can be seen from the above description, the computer program product provided in the embodiment of the present application improves subsequent operation efficiency by pre-training the preset combination recommended neural network model, and can accurately and efficiently select an atomic service combination for a user by performing model training on the combination recommended neural network model through a gradient descent algorithm.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principle and the implementation mode of the invention are explained by applying specific embodiments in the invention, and the description of the embodiments is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (15)

1. A method for atomic service composition optimization, the method comprising:
clustering the atomic services in the data set to obtain an atomic service combination sample set;
pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model;
and performing model training on the combined recommended neural network model through a gradient descent algorithm, and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
2. The atomic service combination optimization method according to claim 1, wherein the clustering the atomic services in the data set to obtain an atomic service combination sample set includes:
clustering each atomic service in the set data set according to different function types to generate an abstract combined service to be selected and optimized;
and obtaining an atomic service combination sample set according to the abstract combination service.
3. The atomic service composition optimization method according to claim 2, wherein the obtaining an atomic service composition sample set according to the abstract composition service includes:
randomly generating a plurality of different atomic service combination samples according to the abstract combination service;
and selecting a set number of atomic service combinations from a plurality of different atomic combination samples according to a preset sample optimization rule to obtain an atomic service combination sample set.
4. The atomic service combination optimization method according to claim 1, wherein the pre-training a pre-set combination recommendation neural network model according to the atomic service combination sample set to obtain a pre-trained combination recommendation neural network model includes:
and carrying out maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model.
5. The atomic service composition optimization method according to claim 1, wherein the model training of the composition recommendation neural network model by a gradient descent algorithm includes:
and carrying out model training on the combined recommendation neural network model through a small batch gradient descent algorithm to obtain a prediction result set.
6. The atomic service combination optimization method according to claim 5, wherein the updating the atomic service combination sample set according to the model training result until the model-trained combined recommended neural network model meets a preset convergence condition includes:
calculating the difference value between the service quality value of the prediction result set and a preset service quality threshold value;
and updating the atomic service combination sample set according to the difference until the combination recommendation neural network model after model training meets a preset convergence condition.
7. An atomic service composition optimization apparatus, comprising:
the initialization module is used for clustering the atomic services in the data set to obtain an atomic service combination sample set;
the pre-training module is used for pre-training a preset combined recommended neural network model according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model;
and the reinforcement learning module is used for carrying out model training on the combined recommended neural network model through a gradient descent algorithm and updating the atomic service combined sample set according to a model training result until the combined recommended neural network model after model training meets a preset convergence condition.
8. The atomic service composition optimization device of claim 7, wherein the initialization module comprises:
the clustering unit is used for clustering each atomic service in the set data set according to different function types to generate an abstract combined service to be selected and optimized;
and the sample construction unit is used for obtaining an atomic service combination sample set according to the abstract combination service.
9. The atomic service composition optimization device according to claim 8, wherein the sample construction unit includes:
a random subunit, configured to randomly generate a plurality of different atomic service composition samples according to the abstract composition service;
and the extraction subunit is used for selecting the atomic service combinations with the set number from the plurality of different atomic combination samples according to the preset sample optimization rule to obtain an atomic service combination sample set.
10. The atomic service composition optimization device of claim 7, wherein the pre-training module comprises:
and the maximum likelihood estimation unit is used for carrying out maximum likelihood estimation pre-training on a preset combined recommended neural network according to the atomic service combined sample set to obtain a pre-trained combined recommended neural network model.
11. The atomic service composition optimization device of claim 7, wherein the reinforcement learning module comprises:
and the gradient descent unit is used for carrying out model training on the combined recommendation neural network model through a small batch gradient descent algorithm to obtain a prediction result set.
12. The atomic service composition optimization device of claim 7, wherein the reinforcement learning module further comprises:
the service quality calculating unit is used for calculating the difference value between the service quality value of the prediction result set and a preset service quality threshold value;
and the convergence training unit is used for updating the atomic service combination sample set according to the difference until the combination recommendation neural network model after model training meets a preset convergence condition.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the atomic service composition optimization method according to any one of claims 1 to 6 when executing the program.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the atomic service composition optimization method according to any one of claims 1 to 6.
15. A computer program product comprising a computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the steps of the atomic service combination optimization method of any one of claims 1 to 6.

Priority Applications (1)

Application Number: CN202211275152.3A · Priority Date: 2022-10-18 · Filing Date: 2022-10-18 · Title: Atomic service combination optimization method and device

Publications (1)

Publication Number: CN115545162A · Publication Date: 2022-12-30

Family

ID=84735021

Family Applications (1)

Application Number: CN202211275152.3A · Status: Pending · Title: Atomic service combination optimization method and device

Country Status (1)

Country: CN · Publication: CN115545162A


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination