CN112887371B - Edge calculation method and device, computer equipment and storage medium - Google Patents

Edge calculation method and device, computer equipment and storage medium

Info

Publication number
CN112887371B
CN112887371B (application number CN202110039126.XA)
Authority
CN
China
Prior art keywords
mobile terminal
module
target mobile
weak classifiers
weak
Prior art date
Legal status
Active
Application number
CN202110039126.XA
Other languages
Chinese (zh)
Other versions
CN112887371A (en)
Inventor
李发明
Current Assignee
Shenzhen China Blog Imformation Technology Co ltd
Original Assignee
Shenzhen China Blog Imformation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen China Blog Imformation Technology Co ltd
Priority to CN202110039126.XA
Publication of CN112887371A
Application granted
Publication of CN112887371B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/10: Protocols in which an application is distributed across nodes in the network
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques
    • G06F 18/243: Classification techniques relating to the number of classes
    • G06F 18/24323: Tree-organised classifiers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/044: Recurrent networks, e.g. Hopfield networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods
    • G06N 3/084: Backpropagation, e.g. using gradient descent
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods
    • G06N 3/086: Learning methods using evolutionary algorithms, e.g. genetic algorithms or genetic programming

Abstract

The application relates to an edge computing method, an edge computing device, computer equipment and a storage medium. The edge computing method comprises the following steps: optimizing an original data set with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold; distributing the optimized data subsets to the target mobile terminals so that each target mobile terminal trains a weak classifier; and receiving from the target mobile terminals the preliminary classification results computed by the weak classifiers on the input data, and determining a final classification result according to these preliminary classification results. The application also provides an edge computing device, computer equipment and a storage medium. The method and the device allow the edge computing network to maintain high recognition capability when some of the mobile terminals are disconnected, thereby improving the reliability of the edge computing network.

Description

Edge calculation method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of edge computing technologies, and in particular, to an edge computing method and apparatus, a computer device, and a storage medium.
Background
Compared with computation offloading to cloud computing, mobile edge computing (MEC) alleviates problems such as network resource occupation, high latency and additional network load; it exploits the computing capability inside the mobile network and provides richer perception services to users.
Ensemble learning, as the name implies, completes a learning task by combining multiple individual weak classifiers. The combined result can often achieve a better learning effect than any single classifier.
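For intuition only (an illustrative calculation, not part of the application): if three independent weak classifiers are each correct with probability 0.7, a majority vote among them is correct with probability 0.7³ + 3 × 0.7² × 0.3 = 0.784, which already exceeds the accuracy of any single classifier, and the advantage grows as more weak classifiers are combined.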
Compared with a server, a mobile terminal is more easily affected by factors such as battery life, network quality and user activity, and is therefore often unable to undertake edge computing tasks reliably.
Disclosure of Invention
The embodiments of the application aim to provide an edge computing method so that ensemble learning running on an edge computing network can maintain high recognition capability when some of the mobile terminals are disconnected.
The embodiment of the application provides an edge computing method, which is used for an edge computing network, wherein the edge computing network comprises a plurality of mobile terminals, and the method comprises the following steps:
step S201, obtaining participation probabilities of a plurality of mobile terminals;
step S202, selecting a plurality of mobile terminals as target mobile terminals according to the participation probability of the mobile terminals;
step S203, optimizing an original data set with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold;
step S204, distributing the optimized data subset to a target mobile terminal for training a weak classifier at the target mobile terminal;
and step S205, receiving from the target mobile terminals the preliminary classification results calculated according to the weak classifiers and the input data, and determining a final classification result according to the preliminary classification results of the weak classifiers.
Preferably, the intelligent optimization algorithm comprises an evolutionary algorithm.
Further, the step S203 specifically includes:
step S2031, randomly sampling the original data set with replacement to obtain as many data subsets as there are target mobile terminals;
step S2032, using the data subsets as the initial population, wherein each data subset corresponds to one individual in the population;
step S2033, respectively training a plurality of weak classifiers by using the data subsets;
step S2034, testing the weak classifiers by using the test set, and taking the recognition rate of each weak classifier as the fitness of the corresponding individual;
step S2035, assigning participation probabilities to the weak classifiers trained on the data subsets corresponding to the individuals according to the fitness;
step S2036, calculating an ensemble learning result according to the weak classifiers and the participation probabilities;
step S2037, judging whether the ensemble learning result meets the requirement; if not, proceeding to step S2038, and if so, proceeding to step S2039;
step S2038, crossing and mutating the individuals according to the fitness to generate the data subsets corresponding to the next-generation population, and returning to step S2033;
step S2039, outputting the data subsets corresponding to the individuals in the population as the optimized data subsets of the optimized data set.
Further, the step S2035 comprises: allocating a higher participation probability to individuals with higher fitness.
Further, the step S2036 comprises: randomly selecting the corresponding weak classifiers according to the participation probabilities to obtain a plurality of groups of target weak classifiers, calculating the ensemble learning result of each group of target weak classifiers according to an ensemble learning algorithm, and averaging the ensemble learning results of the groups of target weak classifiers to obtain the overall ensemble learning result.
An embodiment of the present application further provides an edge computing apparatus, which is in communication connection with a plurality of mobile terminals through an edge computing network, and the apparatus includes:
the node management module is used for acquiring participation probabilities of a plurality of mobile terminals;
the node designation module is used for selecting a plurality of mobile terminals as target mobile terminals according to the participation probabilities of the mobile terminals;
the data set optimization module is used for optimizing an original data set with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold;
the task allocation module is used for allocating the optimized data subset to the target mobile terminal and training the weak classifier at the target mobile terminal;
and the integrated calculation module is used for receiving the preliminary classification results calculated by the target mobile terminals according to the weak classifiers and the input data, and determining a final classification result according to the preliminary classification results of the weak classifiers.
Further, the dataset optimization module comprises:
the sampling module is used for randomly sampling the original data set with replacement to obtain as many data subsets as there are target mobile terminals;
the initialization module is used for taking the data subsets as the initial population, wherein each data subset corresponds to one individual in the population;
the training module is used for respectively training a plurality of weak classifiers by utilizing the data subsets;
the test module is used for testing the weak classifiers by using the test set and taking the identification rate of the weak classifiers as the fitness of the individuals;
the participation probability module is used for distributing participation probability for the weak classifiers trained by the data subsets corresponding to the individuals according to the fitness;
the ensemble learning module is used for calculating an ensemble learning result according to the weak classifier and the participation probability;
the judging module is used for judging whether the ensemble learning result meets the requirement; if not, the evolution module is called to cross and mutate the individuals, and if so, the output module is called to output the data subsets corresponding to the individuals as the optimized data subsets;
the evolution module is used for generating a data subset corresponding to the next generation group by crossing and varying the individuals according to the fitness and calling the training module to perform weak classifier training according to the next generation group;
and the output module outputs the data subsets corresponding to the individuals in the group as the optimized data subsets in the optimized data set.
Further, the ensemble learning module randomly selects corresponding weak classifiers according to the participation probability to obtain a plurality of groups of target weak classifiers, calculates ensemble learning calculation results of each group of target weak classifiers according to an ensemble learning algorithm, and averages the ensemble learning calculation results of the target weak classifiers to obtain an ensemble learning result.
The embodiment of the present invention further provides a computer device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of any one of the edge calculation methods when executing the computer program.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of any one of the edge calculation methods described above.
The invention has the beneficial effects that: according to the embodiments of the invention, an original data set is optimized with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold, so that the ensemble learning running on the edge computing network maintains high recognition capability even if some of the mobile terminals are disconnected.
Drawings
In order to more clearly illustrate the solution of the present application, the drawings needed for describing the embodiments of the present application will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and that other drawings can be obtained by those skilled in the art without inventive effort.
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of an edge calculation method according to the present application;
FIG. 3 is a flowchart of one embodiment of step S203 in FIG. 2;
FIG. 4 is a schematic block diagram of one embodiment of an edge computing device according to the present application;
FIG. 5 is a schematic diagram of one embodiment of the data set optimization module of FIG. 4;
FIG. 6 is a schematic block diagram of one embodiment of a computer device according to the present application.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "including" and "having," and any variations thereof, in the description and claims of this application and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of this application or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
As shown in fig. 1, a system architecture (i.e., edge computing network) 100 may include mobile terminals 101, 102, 103, a network 104, and an edge server 105. The network 104 serves to provide a medium for communication links between the mobile terminals 101, 102, 103 and the edge server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the mobile terminals 101, 102, 103 to interact with the edge server 105 over the network 104 to receive or send messages or the like. The mobile terminals 101, 102, 103 may have various communication client applications installed thereon, such as a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The mobile terminals 101, 102, 103 may be various electronic devices having a display screen and supporting distributed computing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III, mpeg compression standard Audio Layer 3), MP4 players (Moving Picture Experts Group Audio Layer IV, mpeg compression standard Audio Layer 4), laptop and desktop computers, and the like.
The edge server 105 may be a device that provides various edge computing services, such as various devices that provide input data, subsets of data, etc. to the mobile terminals 101, 102, 103.
It should be noted that the edge computing method provided in the embodiments of the present application is generally executed by the edge server; accordingly, the edge computing apparatus is generally provided in the edge server.
It should be understood that the number of mobile terminals, networks, and edge servers in fig. 1 is merely illustrative. There may be any number of mobile terminals, networks, and edge servers, as desired for an implementation.
With continued reference to FIG. 2, a flow diagram of one embodiment of an edge calculation method in accordance with the present application is shown. The edge computing method is used for an edge computing network, the edge computing network comprises a plurality of mobile terminals, and the method comprises the following steps:
step S201, obtaining participation probabilities of a plurality of mobile terminals;
the participation probability can be obtained from past history records, and the history records can record the times of distributing calculation tasks for each mobile terminal and the times of normally obtaining the primary classification result by each mobile terminal, so that the participation probability of the mobile terminal is calculated according to the times of distributing the calculation tasks and the times of normally obtaining the primary classification result.
In this embodiment, the electronic device on which the edge computing method runs (such as the edge server shown in FIG. 1) may receive node information from the mobile terminals through a wired or wireless connection. It should be noted that the wireless connection may include, but is not limited to, WiFi, Bluetooth, WiMAX, Zigbee, UWB (ultra wideband), and other wireless connection means now known or developed in the future.
Step S202, selecting a plurality of mobile terminals as target mobile terminals according to the participation probabilities of the mobile terminals.
In this embodiment, mobile terminals whose participation probability is higher than a preset value may be selected as target mobile terminals, so that mobile terminals with low participation probability are excluded. This limits the probability that a target mobile terminal fails to complete its computing task and improves the reliability of the edge computing network.
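A minimal sketch of this selection step (the preset value of 0.8 and the helper names are illustrative assumptions):

```python
def select_target_terminals(probabilities: dict[str, float],
                            preset_value: float = 0.8) -> list[str]:
    """Keep only the mobile terminals whose participation probability exceeds the preset value."""
    return [tid for tid, p in probabilities.items() if p > preset_value]

# Terminals with low participation probability are excluded from the edge computing task.
targets = select_target_terminals({"mt-101": 0.95, "mt-102": 0.60, "mt-103": 0.85})
# targets == ["mt-101", "mt-103"]
```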
Step S203, optimizing the original data set with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold.
In this embodiment, the original data set may be optimized with an evolutionary algorithm to obtain the optimized data set. Specifically, the original data set may be sampled with replacement to obtain a plurality of data subsets as the initial population, each data subset containing a preset number of samples, with the number of data subsets equal to the number of target mobile terminals. A weak classifier is trained on each data subset, the weak classifiers are evaluated on a validation set, the error of each weak classifier is used as the fitness of its data subset, and the error of the strong classifier composed of the weak classifiers is calculated. When the error of the strong classifier is higher than the preset threshold, the data subsets are crossed and mutated, for example by exchanging samples from data subsets with higher fitness into data subsets with lower fitness, to obtain new data subsets; these are used as the next-generation population, and the optimization continues until the error of the strong classifier composed of the weak classifiers trained on the data subsets is lower than the preset threshold.
Step S204, distributing the optimized data subsets to the target mobile terminals for training the weak classifiers at the target mobile terminals.
In this embodiment, the weak classifier may be a classifier of a BP neural network model or a random forest model, or may be a classifier of another machine learning model.
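As an illustration, a sketch of training one such weak classifier on its assigned data subset (scikit-learn is used here purely as a stand-in; the application does not prescribe a particular library):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

def train_weak_classifier(X_subset, y_subset, kind: str = "random_forest"):
    """Train a weak classifier on the optimized data subset distributed to one terminal."""
    if kind == "random_forest":
        model = RandomForestClassifier(n_estimators=20, max_depth=4, random_state=0)
    else:
        # A small BP (back-propagation) neural network, the other model mentioned above.
        model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    return model.fit(X_subset, y_subset)
```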
Step S205, receiving from the target mobile terminals the preliminary classification results calculated according to the weak classifiers and the input data, and determining a final classification result according to the preliminary classification results of the weak classifiers.
In this embodiment, the final classification result of the strong classifier can be obtained by combining the preliminary classification results according to the ensemble learning algorithm.
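A sketch of this server-side aggregation; treating terminals that returned nothing as simply absent is an added assumption, in line with the stated goal of tolerating disconnections:

```python
import numpy as np
from collections import Counter

def final_classification(preliminary_results: dict[str, np.ndarray]) -> np.ndarray:
    """Combine the preliminary classification results returned by the target mobile terminals.

    preliminary_results maps a terminal id to its predicted labels for the input data;
    terminals that disconnected and returned nothing are simply missing from the mapping.
    """
    received = list(preliminary_results.values())
    if not received:
        raise RuntimeError("no terminal returned a preliminary classification result")
    stacked = np.stack(received)  # shape: (terminals_that_responded, n_samples)
    # Majority vote over whichever weak classifiers actually responded.
    return np.array([Counter(column).most_common(1)[0][0] for column in stacked.T])
```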
Referring to fig. 3, in some optional implementations, the step S203 specifically includes:
Step S2031, randomly sampling the original data set with replacement to obtain as many data subsets as there are target mobile terminals; each data subset includes a plurality of sample data items.
Step S2032, using the data subsets as the initial population, wherein each data subset corresponds to one individual in the population; specifically, each gene of an individual corresponds to one sample.
Step S2033, respectively training a plurality of weak classifiers by using the data subsets;
Step S2034, testing the weak classifiers by using the test set, and taking the recognition rate of each weak classifier as the fitness of the corresponding individual;
Step S2035, assigning participation probabilities to the weak classifiers trained on the data subsets corresponding to the individuals according to the fitness;
Step S2036, calculating an ensemble learning result according to the weak classifiers and the participation probabilities;
Step S2037, judging whether the ensemble learning result meets the requirement; if not, proceeding to step S2038, and if so, proceeding to step S2039. Specifically, it may be determined whether the recognition accuracy of the ensemble learning reaches a preset target accuracy; if so, the requirement is met, otherwise it is not. A target number of iterations may also be set, and the requirement may be deemed met once the iteration count reaches that target.
Step S2038, crossing and mutating the individuals according to the fitness to generate the data subsets corresponding to the next-generation population, and returning to step S2033; specifically, individuals with higher fitness may be selected and their genes (i.e., samples) exchanged into individuals with lower fitness.
Step S2039, outputting the data subsets corresponding to the individuals in the population as the optimized data subsets of the optimized data set.
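The following compact sketch only makes the loop of steps S2031 to S2039 concrete; the decision-tree weak classifiers, the crossover and mutation rules, the stopping criterion and every name here are illustrative assumptions rather than the claimed procedure:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def bootstrap_population(X, y, n_subsets, subset_size):
    """Steps S2031/S2032: sample with replacement; each index set is one individual."""
    return [rng.integers(0, len(X), size=subset_size) for _ in range(n_subsets)]

def train_and_score(individuals, X, y, X_test, y_test):
    """Steps S2033/S2034: one weak classifier per subset; recognition rate as fitness."""
    classifiers, fitness = [], []
    for idx in individuals:
        clf = DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx])
        classifiers.append(clf)
        fitness.append(accuracy_score(y_test, clf.predict(X_test)))
    return classifiers, np.array(fitness)

def evolve(individuals, fitness, n_samples, swap=5):
    """Step S2038: exchange genes (samples) from fitter individuals into weaker ones, plus a small mutation."""
    order = np.argsort(fitness)  # ascending: weakest individuals first
    weak, strong = order[: len(order) // 2], order[len(order) // 2:]
    for w, s in zip(weak, strong):
        take = rng.choice(individuals[s], size=swap, replace=False)
        put = rng.choice(len(individuals[w]), size=swap, replace=False)
        individuals[w][put] = take  # crossover-style exchange of samples
        individuals[w][rng.integers(len(individuals[w]))] = rng.integers(n_samples)  # mutation
    return individuals

def optimize_dataset(X, y, X_test, y_test, n_subsets, subset_size,
                     target_accuracy=0.9, max_generations=20):
    individuals = bootstrap_population(X, y, n_subsets, subset_size)
    for _ in range(max_generations):
        classifiers, fitness = train_and_score(individuals, X, y, X_test, y_test)
        # Step S2035: higher fitness gives a higher participation probability
        # (its probability-weighted use is sketched after the S2036 paragraph below).
        participation = fitness / fitness.sum()
        votes = np.stack([clf.predict(X_test) for clf in classifiers])  # assumes integer class labels
        majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
        ensemble_accuracy = accuracy_score(y_test, majority)  # step S2036, simplified to a majority vote
        if ensemble_accuracy >= target_accuracy:              # step S2037
            break
        individuals = evolve(individuals, fitness, len(X))    # step S2038, then back to S2033
    return [(X[idx], y[idx]) for idx in individuals]          # step S2039: optimized data subsets
```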
In this embodiment, an original data set is optimized with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, where the optimized data set includes optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold, so that even if some mobile terminals are disconnected, the ensemble learning model running on the edge computing network can maintain high recognition capability.
In some optional implementations, the step S2035 includes: allocating a higher participation probability to individuals with higher fitness, that is, allocating a higher participation probability to the data subsets with higher fitness.
In some optional implementations, the step S2036 includes: randomly selecting the corresponding weak classifiers according to the participation probabilities to obtain a plurality of groups of target weak classifiers, calculating the ensemble learning result of each group of target weak classifiers according to an ensemble learning algorithm, and averaging the ensemble learning results of the groups of target weak classifiers to obtain the overall ensemble learning result.
Specifically, when the participation probability is 80% and the weak classifier trained on the corresponding data subset is randomly selected, that weak classifier is included as a target weak classifier with a probability of 80%. After each weak classifier has been selected (or not) according to its participation probability, one group of target weak classifiers is obtained. Repeating this selection multiple times yields multiple different groups of target weak classifiers.
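A minimal sketch of this evaluation, assuming that each group's ensemble result is a majority vote and that the groups' accuracies are averaged (all names are illustrative; integer class labels are assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_target_group(classifiers, participation):
    """Include each weak classifier independently with its participation probability."""
    return [clf for clf, p in zip(classifiers, participation) if rng.random() < p]

def majority_vote(group, X):
    votes = np.stack([clf.predict(X) for clf in group])  # (n_classifiers, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

def averaged_ensemble_result(classifiers, participation, X_test, y_test, n_groups=10):
    """Average the ensemble results over several randomly drawn groups of target weak classifiers."""
    accuracies = []
    for _ in range(n_groups):
        group = sample_target_group(classifiers, participation)
        if not group:  # every classifier happened to be left out of this group
            continue
        accuracies.append((majority_vote(group, X_test) == y_test).mean())
    return float(np.mean(accuracies)) if accuracies else 0.0
```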
According to the embodiments of the invention, an original data set is optimized with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold, so that the ensemble learning running on the edge computing network maintains high recognition capability even if some of the mobile terminals are disconnected.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk or a Read-Only Memory (ROM), or may be a Random Access Memory (RAM) or the like.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed strictly in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
With further reference to fig. 4, as an implementation of the method shown in fig. 2, the present application provides an embodiment of an edge computing apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which can be applied in various electronic devices.
As shown in fig. 4, the edge computing apparatus 400 according to the present embodiment includes: a node management module 401, a node designation module 402, a data set optimization module 403, a task allocation module 404, and an integrated calculation module 405, wherein:
the node management module 401 is configured to obtain participation probabilities of a plurality of mobile terminals;
the node designation module 402 is configured to select a plurality of mobile terminals as target mobile terminals according to the participation probability of the mobile terminals;
the data set optimization module 403 is configured to optimize an original data set with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, where the optimized data set includes optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold;
the task allocation module 404 is configured to allocate the optimized data subset to the target mobile terminal, and train a weak classifier at the target mobile terminal;
the integrated calculation module 405 is configured to receive the preliminary classification results calculated by the target mobile terminals according to the weak classifiers and the input data, and to determine a final classification result according to the preliminary classification results of the weak classifiers.
According to the embodiments of the invention, an original data set is optimized with an intelligent optimization algorithm according to the participation probabilities of the target mobile terminals to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets equal in number to the target mobile terminals, and the error of the strong classifier composed of the weak classifiers trained on the optimized data subsets is lower than a preset threshold, so that the ensemble learning running on the edge computing network maintains high recognition capability even if some of the mobile terminals are disconnected.
Referring to fig. 5, which is a schematic structural diagram of an embodiment of the data set optimization module, the data set optimization module 403 includes:
the sampling module 4031 randomly samples the original data set with replacement to obtain as many data subsets as there are target mobile terminals;
the initialization module 4032 takes the data subsets as the initial population, each data subset corresponding to one individual in the population;
a training module 4033 for respectively training a plurality of weak classifiers by using the data subsets;
the test module 4034 tests the weak classifiers by using the test set, and takes the identification rate of the weak classifiers as the fitness of the individuals;
a participation probability module 4035, which distributes participation probability for the weak classifiers trained by the data subsets corresponding to the individuals according to the fitness;
the ensemble learning module 4036 is used for calculating an ensemble learning result according to the weak classifier and the participation probability;
the judging module 4037 judges whether the ensemble learning result meets the requirement; if not, the evolution module 4038 is called to cross and mutate the individuals, and if so, the output module 4039 is called to output the data subsets corresponding to the individuals as the optimized data subsets;
the evolution module 4038 is used for crossing and mutating the individuals according to the fitness to generate a data subset corresponding to the next generation group, and calling the training module 4033 to perform weak classifier training according to the next generation group;
the output module 4039 outputs the data subsets corresponding to the individuals in the population as the optimized data subsets in the optimized data set.
In some optional implementations of the present embodiment, the ensemble learning module 4036 is further configured to: and randomly selecting corresponding weak classifiers according to the participation probability to obtain a plurality of groups of target weak classifiers, calculating the ensemble learning calculation result of each group of target weak classifiers according to an ensemble learning algorithm, and averaging the ensemble learning calculation results of the target weak classifiers to obtain an ensemble learning result.
In order to solve the technical problem, an embodiment of the present application further provides a computer device. Referring to fig. 6, fig. 6 is a block diagram of a basic structure of a computer device according to the present embodiment.
The computer device 6 comprises a memory 61, a processor 62, and a network interface 63 that are communicatively connected to each other via a system bus. It is noted that only a computer device 6 having components 61-63 is shown, but it should be understood that not all of the shown components need to be implemented, and more or fewer components may be implemented instead. As will be understood by those skilled in the art, the computer device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The computer device may be a desktop computer, a notebook, a palm top computer, an edge server, or other computing device. The computer equipment can carry out man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch panel or voice control equipment and the like.
The memory 61 includes at least one type of readable storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, etc. In some embodiments, the memory 61 may be an internal storage unit of the computer device 6, such as a hard disk or a memory of the computer device 6. In other embodiments, the memory 61 may also be an external storage device of the computer device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the computer device 6. Of course, the memory 61 may also comprise both an internal storage unit of the computer device 6 and an external storage device thereof. In this embodiment, the memory 61 is generally used for storing an operating system installed in the computer device 6 and various types of application software, such as program codes of the edge calculation method. Further, the memory 61 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 62 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 62 is typically used to control the overall operation of the computer device 6. In this embodiment, the processor 62 is configured to execute the program code stored in the memory 61 or process data, for example, execute the program code of the edge calculation method.
The network interface 63 may comprise a wireless network interface or a wired network interface, and the network interface 63 is typically used for establishing a communication connection between the computer device 6 and other electronic devices.
The present application further provides another embodiment, which is to provide a computer-readable storage medium storing an edge calculation program, which is executable by at least one processor to cause the at least one processor to perform the steps of the edge calculation method as described above.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a mobile terminal (such as a mobile phone, a computer, an edge server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
It should be understood that the above-described embodiments are merely some, not all, of the embodiments of the present application, and that the drawings illustrate preferred embodiments without limiting the scope of the appended claims. This application may be embodied in many different forms; the embodiments are provided so that the disclosure of the application will be thorough. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their features. All equivalent structures made by using the contents of the specification and the drawings of the present application, applied directly or indirectly in other related technical fields, fall within the protection scope of the present application.

Claims (10)

1. An edge computing method for an edge computing network comprising a plurality of mobile terminals, the method comprising the steps of:
step S201, obtaining participation probabilities of a plurality of mobile terminals;
step S202, selecting a plurality of mobile terminals as target mobile terminals according to the participation probability of the mobile terminals;
step S203, optimizing an original data set by using an intelligent optimization algorithm according to the participation probability of the target mobile terminal to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets the number of which is the same as that of the target mobile terminal, and the error of a strong classifier consisting of weak classifiers trained according to the optimized data subsets is lower than a preset threshold;
step S204, distributing the optimized data subset to a target mobile terminal for training a weak classifier at the target mobile terminal;
and S205, receiving a preliminary classification result calculated by the target mobile terminal according to the weak classifier and the input data from the target mobile terminal, and determining a final classification result according to the preliminary classification result of the weak classifier.
2. The edge computation method of claim 1, wherein the intelligent optimization algorithm comprises an evolutionary algorithm.
3. The edge calculation method according to claim 2, wherein the step S203 specifically includes:
step S2031, randomly sampling the original data set with replacement to obtain data subsets the number of which is the same as that of the target mobile terminals;
step S2032, using the data subsets as initialization groups, wherein each data subset corresponds to one individual in the group;
step S2033, respectively training a plurality of weak classifiers by using the data subsets;
s2034, testing the weak classifiers by using the test set, and taking the identification rate of the weak classifiers as the fitness of the individuals;
step S2035, distributing participation probability for the weak classifiers trained by the data subsets corresponding to the individuals according to the fitness;
step S2036, calculating an integrated learning result according to the weak classifier and the participation probability;
step S2037, judging whether the ensemble learning result meets the requirement, if not, entering step S2038, and if so, entering step S2039;
step S2038, crossing and mutating the individuals according to the fitness to generate a data subset corresponding to the next generation group, and returning to step S2033;
step S2039, outputting the data subsets corresponding to the individuals in the population as optimized data subsets in the optimized data set.
4. The edge calculation method according to claim 3, wherein the step S2035 comprises: and allocating higher participation probability to the individuals with higher fitness.
5. The edge calculation method according to claim 3, wherein the step S2036 comprises: and randomly selecting corresponding weak classifiers according to the participation probability to obtain a plurality of groups of target weak classifiers, calculating the ensemble learning calculation result of each group of target weak classifiers according to an ensemble learning algorithm, and averaging the ensemble learning calculation results of the target weak classifiers to obtain an ensemble learning result.
6. An edge computing device communicatively coupled to a plurality of mobile terminals via an edge computing network, the device comprising:
the node management module is used for acquiring participation probabilities of a plurality of mobile terminals;
the node designation module is used for selecting a plurality of mobile terminals as target mobile terminals according to the participation probabilities of the mobile terminals;
the data set optimization module is used for optimizing an original data set by using an intelligent optimization algorithm according to the participation probability of the target mobile terminal to obtain an optimized data set, wherein the optimized data set comprises optimized data subsets the number of which is the same as that of the target mobile terminal, and the error of a strong classifier consisting of weak classifiers trained according to the optimized data subsets is lower than a preset threshold value;
the task allocation module is used for allocating the optimized data subset to the target mobile terminal and training the weak classifier at the target mobile terminal;
and the integrated calculation module receives a primary classification result calculated by the target mobile terminal according to the weak classifier and the input data from the target mobile terminal, and determines a final classification result according to the primary classification result of the weak classifier.
7. The edge computing device of claim 6, wherein the dataset optimization module comprises:
the sampling module is used for randomly sampling the original data set with replacement to obtain data subsets the number of which is the same as that of the target mobile terminals;
an initialization module, which takes the data subsets as initialization groups, wherein each data subset corresponds to one individual in the group;
the training module is used for respectively training a plurality of weak classifiers by utilizing the data subsets;
the test module is used for testing the weak classifiers by using the test set and taking the identification rate of the weak classifiers as the fitness of the individuals;
the participation probability module is used for distributing participation probability for the weak classifiers trained by the data subsets corresponding to the individuals according to the fitness;
the ensemble learning module is used for calculating an ensemble learning result according to the weak classifier and the participation probability;
the judging module is used for judging whether the integrated learning result meets the requirements, if not, the evolution module is called to carry out crossing and variation on the individual, if so, the output module is called to output the data subset corresponding to the individual as an optimized data subset;
the evolution module is used for performing crossing and variation on the individuals according to the fitness to generate a data subset corresponding to the next generation group, and calling the training module to perform weak classifier training according to the next generation group;
and the output module outputs the data subsets corresponding to the individuals in the group as the optimized data subsets in the optimized data set.
8. The edge computing device of claim 7, wherein the ensemble learning module randomly selects corresponding weak classifiers according to the participation probability to obtain a plurality of sets of target weak classifiers, calculates ensemble learning calculation results of each set of target weak classifiers according to an ensemble learning algorithm, and averages the ensemble learning calculation results of the target weak classifiers to obtain an ensemble learning result.
9. A computer device comprising a memory in which a computer program is stored and a processor which, when executing the computer program, implements the steps of the edge calculation method according to any one of claims 1 to 5.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the edge calculation method according to any one of claims 1 to 5.
CN202110039126.XA 2021-01-12 2021-01-12 Edge calculation method and device, computer equipment and storage medium Active CN112887371B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110039126.XA CN112887371B (en) 2021-01-12 2021-01-12 Edge calculation method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110039126.XA CN112887371B (en) 2021-01-12 2021-01-12 Edge calculation method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112887371A CN112887371A (en) 2021-06-01
CN112887371B true CN112887371B (en) 2022-05-13

Family

ID=76044815

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110039126.XA Active CN112887371B (en) 2021-01-12 2021-01-12 Edge calculation method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112887371B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115935278B (en) * 2023-03-08 2023-06-20 深圳市大数据研究院 Environment recognition method, electronic device, and computer-readable storage medium
CN116488684B (en) * 2023-04-26 2023-10-13 南通大学 Method and device for identifying visible region in ultra-large-scale MIMO antenna system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2890079A1 (en) * 2013-12-31 2015-07-01 Cisco Technology, Inc. Attack mitigation using learning machines
CN108304877A (en) * 2018-02-02 2018-07-20 电子科技大学 A kind of physical layer channel authentication method based on machine learning
CN109344848A (en) * 2018-07-13 2019-02-15 电子科技大学 Mobile intelligent terminal security level classification method based on Adaboost
CN109842912A (en) * 2019-01-08 2019-06-04 东南大学 A kind of more attribute handover decisions methods based on integrated study
CN111554088A (en) * 2020-04-13 2020-08-18 重庆邮电大学 Multifunctional V2X intelligent roadside base station system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11187548B2 (en) * 2019-02-05 2021-11-30 International Business Machines Corporation Planning vehicle computational unit migration based on mobility prediction

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2890079A1 (en) * 2013-12-31 2015-07-01 Cisco Technology, Inc. Attack mitigation using learning machines
CN108304877A (en) * 2018-02-02 2018-07-20 电子科技大学 A kind of physical layer channel authentication method based on machine learning
CN109344848A (en) * 2018-07-13 2019-02-15 电子科技大学 Mobile intelligent terminal security level classification method based on Adaboost
CN109842912A (en) * 2019-01-08 2019-06-04 东南大学 A kind of more attribute handover decisions methods based on integrated study
CN111554088A (en) * 2020-04-13 2020-08-18 重庆邮电大学 Multifunctional V2X intelligent roadside base station system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于MEC的车联网切换算法研究 (Research on Handover Algorithms for MEC-based Internet of Vehicles); 崔文清 (Cui Wenqing); 中国优秀硕士学位论文全文数据库(工程科技Ⅱ辑) (China Master's Theses Full-text Database, Engineering Science and Technology II); 2020-06-15; full text *

Also Published As

Publication number Publication date
CN112887371A (en) 2021-06-01

Similar Documents

Publication Publication Date Title
CN112148987B (en) Message pushing method based on target object activity and related equipment
CN111831675A (en) Storage model training method and device, computer equipment and storage medium
CN112887371B (en) Edge calculation method and device, computer equipment and storage medium
CN112766649B (en) Target object evaluation method based on multi-scoring card fusion and related equipment thereof
WO2021068513A1 (en) Abnormal object recognition method and apparatus, medium, and electronic device
CN112508118A (en) Target object behavior prediction method aiming at data migration and related equipment thereof
CN108182633B (en) Loan data processing method, loan data processing device, loan data processing program, and computer device and storage medium
CN112785005A (en) Multi-target task assistant decision-making method and device, computer equipment and medium
CN112686301A (en) Data annotation method based on cross validation and related equipment
CN112995414B (en) Behavior quality inspection method, device, equipment and storage medium based on voice call
CN116402625B (en) Customer evaluation method, apparatus, computer device and storage medium
CN112764923B (en) Computing resource allocation method, computing resource allocation device, computer equipment and storage medium
CN112100491A (en) Information recommendation method, device and equipment based on user data and storage medium
CN112801145A (en) Safety monitoring method and device, computer equipment and storage medium
CN113240323B (en) Level evaluation method and device based on machine learning and related equipment
CN115375453A (en) System resource allocation method and device
CN114912958A (en) Seat calling-out method, device, computer equipment and storage medium
CN114925275A (en) Product recommendation method and device, computer equipment and storage medium
CN112084408A (en) List data screening method and device, computer equipment and storage medium
CN111339432A (en) Recommendation method and device of electronic object and electronic equipment
CN116911304B (en) Text recommendation method and device
CN115640896B (en) Household user power load prediction method under multi-user scene and related equipment
CN113298636B (en) Risk control method, device and system based on simulation resource application
CN116308468A (en) Client object classification method, device, computer equipment and storage medium
CN117709801A (en) Client data processing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant