CN117314233A - Vendor evaluation method, device, terminal equipment and medium - Google Patents

Vendor evaluation method, device, terminal equipment and medium

Info

Publication number
CN117314233A
Authority
CN
China
Prior art keywords
hidden layer
neuron
layer
provider
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311226089.9A
Other languages
Chinese (zh)
Inventor
相辉
张静
王宏宇
史依茗
郭路遥
卢焱
张弘媛
杨青倬
米文博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei Wenshu Technology Co ltd
State Grid Hebei Public Bidding Co ltd
State Grid Hebei Electric Power Co ltd Material Branch
State Grid Corp of China SGCC
Original Assignee
Hebei Wenshu Technology Co ltd
State Grid Hebei Public Bidding Co ltd
State Grid Hebei Electric Power Co ltd Material Branch
State Grid Corp of China SGCC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei Wenshu Technology Co ltd, State Grid Hebei Public Bidding Co ltd, State Grid Hebei Electric Power Co ltd Material Branch, State Grid Corp of China SGCC filed Critical Hebei Wenshu Technology Co ltd
Priority to CN202311226089.9A
Publication of CN117314233A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393 Score-carding, benchmarking or key performance indicator [KPI] analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/048 Activation functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/04 Manufacturing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services

Abstract

The present application is applicable to the technical field of green environmental protection evaluation and provides a supplier evaluation method, a device, terminal equipment, and a medium. The method includes: acquiring enterprise data of a supplier, the enterprise data including the supplier's carbon footprint; and inputting the supplier's enterprise data into a pre-trained BP neural network model to obtain an evaluation result for the supplier. An attention mechanism module is arranged in the BP neural network model and is used to adjust the weights of features related to the supplier's carbon footprint. The method and device can evaluate suppliers more comprehensively and accurately and increase the weight given to green, low-carbon performance in supplier evaluation.

Description

Vendor evaluation method, device, terminal equipment and medium
Technical Field
The present application belongs to the technical field of green environmental protection evaluation, and particularly relates to a supplier evaluation method, a device, terminal equipment, and a medium.
Background
In recent years, supply chain management has improved enterprises' responsiveness to the market and greatly shortened the time needed to meet consumer demand, giving enterprises a hard-to-replicate competitive advantage in a rapidly changing world. More and more businesses recognize the great benefits of implementing supply chain management, and the choice of suppliers is the basis of supply chain partnership. For a production enterprise, the quality of a supplier directly affects the cost, quality, and delivery period of its products and the overall performance of the supply chain. Therefore, evaluating and selecting suppliers scientifically, reasonably, and objectively is one of the important tasks of the core enterprises in a supply chain.
Meanwhile, to address global warming, environmental pollution, and related problems, carbon dioxide emissions should peak as early as possible and carbon neutrality should be pursued. According to relevant survey data, 75% of a product's total carbon footprint comes from its supply chain. Production enterprises therefore face higher requirements for green, low-carbon operation.
However, existing supplier evaluation methods do not consider the influence of green, low-carbon factors on supplier evaluation, so their judgments of individual suppliers are not accurate enough; how to evaluate suppliers more comprehensively and accurately is a problem that needs to be solved.
Disclosure of Invention
In order to overcome the problems in the related art, the embodiments of the present application provide a supplier evaluation method, a device, a terminal device, and a medium, which can evaluate suppliers more comprehensively and accurately and increase the weight given to green, low-carbon performance in supplier evaluation.
The application is realized by the following technical scheme:
in a first aspect, an embodiment of the present application provides a vendor evaluation method, including:
acquiring enterprise data of a provider; the supplier's enterprise data includes the supplier's carbon footprint;
inputting enterprise data of a provider into a pre-trained BP neural network model to obtain an evaluation result of the provider; an attention mechanism module is arranged in the BP neural network model; in the BP neural network model, an attention mechanism module is used to adjust the weights of features related to the provider's carbon footprint.
In a possible implementation manner of the first aspect, the BP neural network model includes an input layer, a first hidden layer, a second hidden layer, an output layer, and an attention mechanism module; the attention mechanism module comprises a first attention mechanism unit and a second attention mechanism unit;
the input layer is used for inputting enterprise data of a provider;
the first hidden layer is used for outputting the supplier's evaluation index features; there are a plurality of evaluation index features, which are used for representing correlations among the supplier's enterprise data; the second hidden layer is used for outputting the supplier's total evaluation index features; the number of total evaluation index features is greater than or equal to 2, and the total evaluation index features are used for representing correlations among the supplier's evaluation index features;
the first attention mechanism unit is arranged between the first hidden layer and the second hidden layer; the second attention mechanism unit is disposed between the second hidden layer and the output layer.
In a possible implementation manner of the first aspect, the evaluation index features are input into the first attention mechanism unit, and the first attention mechanism unit is used for assigning, to the neuron of the first hidden layer that corresponds to the carbon-footprint-related feature among the evaluation index features, a weight for mapping that neuron to the second hidden layer, this weight being the first attention weight;
The total evaluation index features are input into the second attention mechanism unit, and the second attention mechanism unit is used for assigning, to the neuron of the second hidden layer that corresponds to the carbon-footprint-related feature among the total evaluation index features, a weight for mapping that neuron to the output layer, this weight being the second attention weight.
In a possible implementation manner of the first aspect, the output feature of the second hidden layer is:
y'_i = f'( Σ_{b=1}^{M} ω'_{bi} · y_b + ω'_{ci} · y_c - θ' )
wherein y'_i is the output feature of the i-th neuron in the second hidden layer; ω'_{bi} is the mapping weight from the b-th neuron in the first hidden layer to the i-th neuron in the second hidden layer; y_b is the output feature of the b-th neuron in the first hidden layer; ω'_{ci} is the first attention weight from the c-th neuron of the first hidden layer to the i-th neuron in the second hidden layer; the c-th neuron contains the feature related to the carbon footprint; y_c is the output feature of the c-th neuron in the first hidden layer; f'() is the activation function of the second hidden layer; θ' is the neuron threshold of the second hidden layer; and M is the number of neurons of the first hidden layer.
In a possible implementation manner of the first aspect, the second hidden layer outputs the total evaluation index features; the total evaluation index features are input into the second attention mechanism unit, and the second attention mechanism unit is used for assigning, to the neuron of the second hidden layer that corresponds to the carbon-footprint-related feature among the total evaluation index features, a weight for mapping that neuron to the output layer, this weight being the second attention weight.
In a possible implementation manner of the first aspect, the output feature of the output layer is:
y''_j = f''( Σ_{i=1}^{P} ω''_{ij} · y'_i + ω''_{dj} · y'_d - θ'' )
wherein y''_j is the output feature of the j-th neuron in the output layer; ω''_{ij} is the mapping weight from the i-th neuron in the second hidden layer to the j-th neuron in the output layer; y'_i is the output feature of the i-th neuron in the second hidden layer; ω''_{dj} is the second attention weight from the d-th neuron of the second hidden layer to the j-th neuron in the output layer; the d-th neuron contains the feature related to the carbon footprint; y'_d is the output feature of the d-th neuron in the second hidden layer; f''() is the activation function of the output layer; θ'' is the neuron threshold of the output layer; and P is the number of neurons of the second hidden layer.
In a possible implementation manner of the first aspect, the number of neurons of the input layer is determined according to the number of types of enterprise data of the provider, and the number of neurons of the input layer is N.
The number of the neurons of the first hidden layer is determined according to the number of the evaluation index features of the first hidden layer, wherein the number of the neurons of the first hidden layer is M, and M < N;
the number of the neurons of the second hidden layer is determined according to the number of the total evaluation index features of the second hidden layer, wherein the number of the neurons of the second hidden layer is P, and P < M;
The number of the neurons of the output layer is determined according to the number of the evaluation results, and the number of the neurons of the output layer is Q, wherein Q < P.
In a second aspect, an embodiment of the present application provides a vendor evaluation device, including:
an enterprise data acquisition module for acquiring enterprise data of a provider; the supplier's enterprise data includes the supplier's carbon footprint;
the evaluation result output module is used for inputting enterprise data of the suppliers into the pre-trained BP neural network model to obtain the evaluation result of the suppliers; an attention mechanism module is arranged in the BP neural network model; in the BP neural network model, an attention mechanism module is used to adjust the weights of features related to the provider's carbon footprint.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory and a processor, where the memory stores a computer program executable on the processor, where the processor implements the vendor evaluation method according to any one of the first aspects when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the vendor evaluation method according to any one of the first aspects.
Compared with the related art, the embodiment of the application has the beneficial effects that:
according to the embodiment of the application, the attention mechanism is introduced into the BP neural network model, so that important features related to the carbon footprint of the evaluation provider can be automatically focused in the BP neural network model; the BP neural network dynamically adjusts the attention degree by learning the attention weight, so that the accuracy of the BP neural network model is improved. Meanwhile, the suppliers can be scientifically evaluated by adjusting the weight of the characteristics related to the carbon footprint of the suppliers, and the importance of the green low carbon to the evaluation suppliers is improved.
It will be appreciated that the advantages of the second to fourth aspects may be found in the relevant description of the first aspect and are not repeated here.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments or the description of the related art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort to a person of ordinary skill in the art.
FIG. 1 is a flow chart of a vendor evaluation method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a vendor evaluation device according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrases "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In order that those skilled in the art will better understand the present invention, a technical solution in the examples of the present application will be clearly and completely described in the following with reference to the accompanying drawings and detailed description, and it is apparent that the described examples are only some examples of the present invention, not all examples. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Because the conventional supplier evaluation methods described in the background do not consider the influence of green, low-carbon factors on supplier evaluation, their judgment of each supplier is not accurate enough, and how to evaluate suppliers more comprehensively and accurately is a problem that urgently needs to be solved.
The present application increases the weight given to green, low-carbon performance in supplier evaluation while avoiding the problem that some manufacturers sacrifice product quality in pursuit of low-carbon credentials, leaving the cable market flooded with products of uneven quality. The various aspects of a supplier must therefore be evaluated scientifically and reasonably together with its green, low-carbon performance. Accordingly, the supplier evaluation method of this application comprehensively evaluates all aspects of a supplier, standardizes the quality of products supplied to the market, adds an attention mechanism to the BP neural network model, increases the weight given to green, low-carbon performance in supplier evaluation, encourages suppliers to reduce carbon emissions through energy conservation, emission reduction, and environmental protection, and provides a reference for supplier selection in the electric power market.
Fig. 1 is a schematic flowchart of a vendor evaluation method provided in an embodiment of the present application, and referring to fig. 1, the vendor evaluation method is described in detail as follows:
In step 101, enterprise data of a supplier is acquired; the supplier's enterprise data includes the supplier's carbon footprint.
Illustratively, in this embodiment, taking a cable manufacturing enterprise as an example, the enterprise data of the provider may include at least one of consumption amount of raw materials of the product, types of raw materials, emission factors of raw materials, transportation mileage, energy consumption type, energy consumption amount, cable raw material transportation activity level, productivity, types and numbers of production devices owned by the provider, technical level, number of employees, skill level, product production period, whether the provider obtains related quality management system authentication, whether there is perfect quality control flow and procedure, whether key quality indexes are set, whether self-evaluation and audit are performed regularly, whether the provider can deliver products or services on time according to delivery time agreed by contract, whether inventory can be managed reasonably, whether there is good logistical capability, whether effective coordination and communication can be performed with other links in the supply chain, price level of products or services of the provider, price elasticity level, whether production cost and cost can be controlled effectively, whether additional value and differentiated products or services can be provided, whether there is reasonable pricing policy, etc.; at least one of historical cooperation time, cooperation times, delivery timing rate, quality problem number and the like of the purchasing party can be also included.
For example, enterprise data may be obtained from the supplier and from parties related to the project, such as the type and number of production devices owned by the supplier, its technical level, number of employees, skill level, product production period, and whether it has obtained relevant quality management system certification. Enterprise data may also be collected through reports and questionnaires provided by the supplier, or through monitoring equipment; for example, monitoring equipment can record the activity-level data of various devices in real time, enabling automatic acquisition of data such as energy consumption type and energy consumption.
The collected enterprise data is preprocessed to ensure its accuracy and completeness. Preprocessing includes data cleaning, missing-value processing, outlier processing, and the like.
Illustratively, data cleaning includes removing duplicate values, handling outliers, and handling erroneous data. Removing duplicate values: check whether duplicate records exist in the data and delete them if they do. Handling outliers: check whether outliers exist in the data; statistical or visualization methods can be used to identify and process them. Handling erroneous data: check whether the data contain errors or unreasonable values, such as out-of-range or logically inconsistent values, and correct or delete them.
Missing-value processing: if the proportion of missing values is small and their influence on the overall data is limited, records containing missing values can be deleted directly. If the proportion of missing values is large or their influence on the overall data is significant, the missing values can be filled in by interpolation. Common interpolation methods include the mean, median, mode, and regression models.
Data normalization: the selected feature variables are standardized and converted to the same scale to facilitate input to the BP neural network model. Common normalization methods include Z-score standardization and Min-Max normalization.
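As an illustration of this preprocessing step, the following is a minimal Python sketch; the column handling, the 3-standard-deviation outlier rule, and the choice of Z-score standardization are assumptions made for the example, not requirements of the application.

```python
# Minimal preprocessing sketch (illustrative assumptions, not the application's exact pipeline).
import numpy as np
import pandas as pd

def preprocess(df: pd.DataFrame) -> np.ndarray:
    # Data cleaning: remove duplicate records.
    df = df.drop_duplicates()
    # Missing values: drop rows that are mostly empty, fill the rest with column means.
    df = df.dropna(thresh=int(0.8 * df.shape[1]))
    df = df.fillna(df.mean(numeric_only=True))
    # Outliers: clip numeric columns to mean +/- 3 standard deviations.
    num = df.select_dtypes(include="number")
    mean, std = num.mean(), num.std()
    num = num.clip(lower=mean - 3 * std, upper=mean + 3 * std, axis=1)
    # Z-score standardization so all feature variables share the same scale
    # (Min-Max scaling to [0, 1] would be the alternative mentioned above).
    X = (num - num.mean()) / num.std()
    return X.to_numpy()  # each row is a feature variable set X = {x_1, ..., x_N}
```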
After preprocessing the supplier's enterprise data, a feature variable set X of the supplier's enterprise data is obtained:
X = {x_1, x_2, ..., x_a, ..., x_N}   (1)
wherein the supplier's enterprise data X contains N feature variables, the N feature variables represent the N types of the supplier's enterprise data, and x_a represents the a-th feature variable.
For convenience of description, in the following steps, the supplier's enterprise data X input to the BP neural network model refers to this feature-variable form of the supplier's enterprise data.
In step 102, the supplier's enterprise data is input into a pre-trained BP neural network model to obtain an evaluation result for the supplier; an attention mechanism module is arranged in the BP neural network model; in the BP neural network model, the attention mechanism module is used to adjust the weights of features related to the supplier's carbon footprint.
Illustratively, the BP neural network model includes an input layer, a first hidden layer, a second hidden layer, an output layer, and an attention mechanism module; the attention mechanism module includes a first attention mechanism unit and a second attention mechanism unit.
The first hidden layer is used for outputting the supplier's evaluation index features; there are a plurality of evaluation index features, which represent correlations among the supplier's enterprise data. The second hidden layer is used for outputting the supplier's total evaluation index features; the number of total evaluation index features is greater than or equal to 2, and they represent correlations among the supplier's evaluation index features.
The first attention mechanism unit is arranged between the first hidden layer and the second hidden layer; the second attention mechanism unit is disposed between the second hidden layer and the output layer.
Illustratively, the number of neurons of the input layer is determined according to the number of types of enterprise data of the vendor, and the number of neurons of the input layer is N.
The number of neurons of the first hidden layer is determined according to the number of evaluation index features of the first hidden layer, and the number of neurons of the first hidden layer is M, M < N. The number of the neurons of the second hidden layer is determined according to the number of the total evaluation index features of the second hidden layer, wherein the number of the neurons of the second hidden layer is P, and P < M; the activation functions of the neurons of the first and second hidden layers are nonlinear activation functions, such as a ReLU activation function, selected by considering the performance of the BP neural network model or the convergence of the loss function.
The number of neurons of the output layer is determined according to the number of evaluation results, and the number of neurons of the output layer is Q, with Q < P; generally Q = 1, i.e., the output layer directly outputs the final evaluation result. The activation function of the neurons of the output layer may be, for example, a sigmoid activation function.
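To make this architecture concrete, here is a minimal PyTorch sketch. It assumes that the attention weights enter as additional additive terms on the carbon-footprint-related neurons (consistent with the hidden-layer and output-layer formulas given further below), and the class name AttentionBP, the carbon-neuron indices, and the layer sizes are illustrative choices rather than details taken from the application.

```python
import torch
import torch.nn as nn

class AttentionBP(nn.Module):
    """Sketch: input N -> first hidden M -> second hidden P -> output Q,
    with extra attention weights on the carbon-footprint-related neurons."""
    def __init__(self, n_in: int, m_hidden: int, p_hidden: int, q_out: int = 1,
                 carbon_idx_h1: int = 0, carbon_idx_h2: int = 1):
        super().__init__()
        self.fc1 = nn.Linear(n_in, m_hidden)      # input layer -> first hidden layer
        self.fc2 = nn.Linear(m_hidden, p_hidden)  # first hidden -> second hidden layer
        self.out = nn.Linear(p_hidden, q_out)     # second hidden -> output layer
        # Learnable attention weights for the carbon-footprint neurons
        # (assumption: additive terms mirroring the formulas below).
        self.att1 = nn.Parameter(torch.ones(p_hidden))  # first attention weights
        self.att2 = nn.Parameter(torch.ones(q_out))     # second attention weights
        self.c, self.d = carbon_idx_h1, carbon_idx_h2

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = torch.relu(self.fc1(x))                                       # evaluation index features Y
        y2 = torch.relu(self.fc2(y) + self.att1 * y[:, self.c:self.c + 1])  # total evaluation index features Y'
        score = torch.sigmoid(self.out(y2) + self.att2 * y2[:, self.d:self.d + 1])
        return score                                                      # evaluation result
```

In this sketch, ReLU is used in both hidden layers and a sigmoid at the output, matching the activation choices described above; either could be swapped depending on model performance or loss convergence.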
The evaluation index features are input into the first attention mechanism unit, which assigns, to the neuron of the first hidden layer that corresponds to the carbon-footprint-related feature among the evaluation index features, a weight for mapping that neuron to the second hidden layer; this weight is the first attention weight.
The total evaluation index features are input into the second attention mechanism unit, which assigns, to the neuron of the second hidden layer that corresponds to the carbon-footprint-related feature among the total evaluation index features, a weight for mapping that neuron to the output layer; this weight is the second attention weight.
The types and number of evaluation index features are determined according to the requirements of the cable procurement project and the characteristics of the suppliers. The evaluation index features may include the supplier's quality management capability feature, delivery capability feature, price competitiveness feature, service level feature, cooperation history feature, and carbon footprint feature.
By way of example, the supplier's carbon footprint feature may characterize correlations among the raw material consumption of the product, the type of raw material, the emission factors of the raw materials, transportation mileage, energy consumption type, energy consumption, cable raw material transportation activity level, production capacity, and the type and number of production facilities owned by the supplier. The production capacity feature may characterize correlations among production capacity, the type and number of production facilities owned by the supplier, technical level, the number of employees, skill level, the product production cycle, and so on. The quality management capability feature may characterize correlations among whether there is a complete quality control flow and procedure, whether key quality indicators are set, whether self-assessment and auditing are performed regularly, and so on. The delivery capability feature may characterize correlations among whether the supplier can deliver products or provide services on time according to the contractual delivery time, whether it can manage inventory reasonably, whether it has good logistics capabilities, whether it can coordinate and communicate effectively with other links in the supply chain, and so on. The price competitiveness feature may characterize correlations among the price level and price elasticity of the supplier's products or services, whether production and operating costs can be controlled effectively, whether value-added and differentiated products or services can be provided, whether there is a reasonable pricing strategy, and so on. The service level feature may characterize whether value-added and differentiated products or services or other services can be provided, and so on. The cooperation history feature may characterize correlations among the purchasing party's historical cooperation time, number of cooperations, on-time delivery rate, number of quality problems, and so on. The correlations between other evaluation index features and the various types of enterprise data are not described one by one here; in the actual training process, the correlation between each evaluation index feature and each type of enterprise data is determined by the weights from each neuron of the input layer to each neuron of the first hidden layer.
Illustratively, based on the structure of the BP neural network model, the supplier's enterprise data X is input to the neurons of the input layer, the input layer maps X to the first hidden layer, and the first hidden layer outputs the supplier's evaluation index features Y = {y_1, y_2, ..., y_h, ..., y_M}.
The output feature of the first hidden layer is:
y_b = f( Σ_{a=1}^{N} ω_{ab} · x_a - θ )
wherein y_b is the output feature of the b-th neuron in the first hidden layer; ω_{ab} is the mapping weight from the a-th neuron in the input layer to the b-th neuron in the first hidden layer; x_a is the a-th feature variable input at the input layer; f() is the activation function of the first hidden layer; θ is the neuron threshold of the first hidden layer; and N is the number of neurons of the input layer.
The supplier's evaluation index features Y are input into the first attention mechanism unit, which assigns a first attention weight ω'_c to the carbon-footprint-related feature in Y; the result is then mapped to the second hidden layer, and the second hidden layer outputs the supplier's total evaluation index features Y' = {y'_1, y'_2, ..., y'_i, ..., y'_P}.
Illustratively, assuming that the c-th evaluation index feature in Y is the feature related to the carbon footprint, the mapping weight assigned to the c-th neuron of the first hidden layer (the neuron corresponding to the carbon-footprint-related feature) for its mapping to the second hidden layer is the first attention weight ω'_c.
The output feature of the second hidden layer is:
y'_i = f'( Σ_{b=1}^{M} ω'_{bi} · y_b + ω'_{ci} · y_c - θ' )
wherein y'_i is the output feature of the i-th neuron in the second hidden layer; ω'_{bi} is the mapping weight from the b-th neuron in the first hidden layer to the i-th neuron in the second hidden layer; y_b is the output feature of the b-th neuron in the first hidden layer; ω'_{ci} is the first attention weight from the c-th neuron of the first hidden layer to the i-th neuron in the second hidden layer; the c-th neuron contains the feature related to the carbon footprint; y_c is the output feature of the c-th neuron in the first hidden layer; f'() is the activation function of the second hidden layer; θ' is the neuron threshold of the second hidden layer; and M is the number of neurons of the first hidden layer.
The attention weight of each channel in the first attention mechanism unit represents the importance of each of the supplier's evaluation index features Y. Different methods may be used to calculate the attention weights, for example a multi-layer perceptron (MLP) or the like.
On the input features of each hidden layer, an attention weight vector is used to calculate an attention weight for each feature. The attention weights can be normalized to a probability distribution using a softmax function, ensuring that the sum of the weights is 1. Because the carbon footprint features are of higher importance to the evaluation results, the carbon footprint knowledge can be incorporated into the definition of attention weights to increase the weight of the features.
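As a small illustration of this weighting scheme, the following sketch normalizes per-feature attention scores with a softmax after boosting the carbon-footprint score; the boost factor and the way the raw scores are produced are assumptions made for the example, not details from the application.

```python
import numpy as np

def attention_weights(scores: np.ndarray, carbon_idx: int, carbon_boost: float = 1.5) -> np.ndarray:
    """Softmax-normalize per-feature attention scores so they sum to 1,
    after boosting the score of the carbon-footprint feature."""
    s = scores.astype(float).copy()
    s[carbon_idx] *= carbon_boost      # incorporate carbon-footprint knowledge
    e = np.exp(s - s.max())            # numerically stable softmax
    return e / e.sum()

print(attention_weights(np.array([0.2, 0.5, 0.9]), carbon_idx=2))
```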
Attention mechanisms are introduced in the hidden layer that enable the model to automatically focus on important features related to evaluating the carbon footprint of the vendor. Through learning the attention weight, the model can dynamically adjust the attention degree according to the enterprise data characteristics and different evaluation indexes of the provider, so that the accuracy of the BP neural network model is improved.
The above embodiment shows the case in which one of the evaluation index features Y is related to the carbon footprint; of course, if a plurality of the evaluation index features Y are related to the carbon footprint, the mapping weights assigned to all of the corresponding neurons of the first hidden layer for their mapping to the second hidden layer are first attention weights ω'_c.
The supplier's total evaluation index features Y' are input into the second attention mechanism unit, which assigns a second attention weight ω''_d to the carbon-footprint-related feature in Y'; the result is then mapped to the output layer, and the output layer outputs the evaluation result.
Illustratively, assuming that only the d-th total evaluation index feature in Y' is related to the carbon footprint, the mapping weight assigned to the d-th neuron of the second hidden layer (the neuron corresponding to the carbon-footprint-related feature) for its mapping to the output layer is the second attention weight ω''_d.
The output feature of the output layer is:
y''_j = f''( Σ_{i=1}^{P} ω''_{ij} · y'_i + ω''_{dj} · y'_d - θ'' )
wherein y''_j is the output feature of the j-th neuron in the output layer; ω''_{ij} is the mapping weight from the i-th neuron in the second hidden layer to the j-th neuron in the output layer; y'_i is the output feature of the i-th neuron in the second hidden layer; ω''_{dj} is the second attention weight from the d-th neuron of the second hidden layer to the j-th neuron in the output layer; the d-th neuron contains the feature related to the carbon footprint; y'_d is the output feature of the d-th neuron in the second hidden layer; f''() is the activation function of the output layer; θ'' is the neuron threshold of the output layer; and P is the number of neurons of the second hidden layer.
Introducing activation functions brings nonlinearity into the BP neural network model and enhances its expressive capability. The attention weight of each channel in the second attention mechanism unit represents the importance of each of the supplier's total evaluation index features Y'. Different methods may be used to calculate the attention weights, for example a multi-layer perceptron (MLP) or the like.
Illustratively, if the total evaluation index features of the second hidden layer consist of a comprehensive evaluation feature and a carbon footprint evaluation feature, the number of neurons of the second hidden layer is P = 2; if d = 2, the neuron holding the carbon footprint evaluation feature is y'_2. The output layer outputs the evaluation result; if the number of neurons of the output layer is Q = 1, the evaluation result is obtained from the output-layer formula above with P = 2, d = 2, and j = 1, so that the carbon footprint evaluation feature y'_2 receives both its mapping weight and the additional second attention weight.
the attention mechanism module multiplies each input feature by a corresponding attention weight to obtain a weighted input feature. Therefore, the BP neural network model can pay more attention to the carbon footprint characteristics, and the dependence on unimportant characteristics is reduced.
The above embodiment shows the case in which one of the total evaluation index features Y' is related to the carbon footprint; of course, if a plurality of the total evaluation index features Y' are related to the carbon footprint, the mapping weights assigned to all of the corresponding neurons of the second hidden layer for their mapping to the output layer are second attention weights ω''_d.
Attention mechanisms are introduced in the output layer. In the output layer, the weight of the different evaluation indicators can be calculated using an attention mechanism so that the model can comprehensively evaluate the suppliers according to the importance of the different indicators. By learning the attention weight of the evaluation index, the model can evaluate the suppliers more accurately.
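To show how the two attention-weighted layers compose, here is a plain NumPy transcription of the hidden-layer and output-layer formulas above. The additive form of the attention terms follows the reconstruction of those formulas, and the array shapes and indices are illustrative assumptions.

```python
import numpy as np

def forward(x, W1, theta1, W2, theta2, Wout, theta_out, att1, c, att2, d):
    """x: (N,) feature variables; W1: (N, M); W2: (M, P); Wout: (P, Q);
    att1: (P,) first attention weights; att2: (Q,) second attention weights;
    c, d: indices of the carbon-footprint neurons in the two hidden layers."""
    relu = lambda z: np.maximum(z, 0.0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    y = relu(x @ W1 - theta1)                            # first hidden layer
    y2 = relu(y @ W2 + att1 * y[c] - theta2)             # second hidden layer, attention term for neuron c
    out = sigmoid(y2 @ Wout + att2 * y2[d] - theta_out)  # output layer, attention term for neuron d
    return out                                           # evaluation result of the supplier
```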
In an embodiment, in order to more clearly understand the technical solution of the present application, the training process of the BP neural network model is described as follows:
First, a sample set of enterprise data for a vendor is obtained.
Illustratively, numeric enterprise data collected at different times yields a large amount of data. For enterprise data that is scarce, the training data may be augmented. Different minority-class samples can be weighted differently, for example using the adaptive synthetic sampling (ADASYN) method, to generate different numbers of synthetic samples.
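A possible sketch of such augmentation using the ADASYN implementation in imbalanced-learn is shown below; the placeholder feature matrix, the discretized evaluation grades used as class labels, and the sampling strategy are assumptions made for the example.

```python
import numpy as np
from imblearn.over_sampling import ADASYN

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))              # placeholder feature variables of historical suppliers
y = (rng.random(200) > 0.85).astype(int)    # placeholder imbalanced evaluation grades

sampler = ADASYN(sampling_strategy="minority", random_state=0)
X_resampled, y_resampled = sampler.fit_resample(X, y)   # synthesize extra minority-class samples
print(np.bincount(y), np.bincount(y_resampled))
```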
The collected enterprise data sample set is then preprocessed, and the preprocessed enterprise data sample set is divided into a training set and a testing set. Most of the data is typically used for training and a small fraction is used for testing to verify the generalization ability of the model.
Then, the training set of the enterprise data sample set is input into the pre-constructed BP neural network model for training, and the connection weights between neurons are continuously adjusted through the back-propagation algorithm to obtain the supplier evaluation result.
Finally, the trained BP neural network model is verified with the test set to evaluate its accuracy and stability. If the model's predictions agree with the actual evaluation results, the model is considered effective; if not, the model needs to be updated and adjusted.
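A minimal splitting-and-training sketch in PyTorch, continuing the AttentionBP sketch above, is shown next; the placeholder data, the reference evaluation scores, the hidden-layer sizes, the learning rate, and the number of epochs are all assumptions, not values from the application.

```python
import numpy as np
import torch
from torch import nn
from sklearn.model_selection import train_test_split

# Placeholder data: random feature vectors and reference evaluation scores (assumptions).
rng = np.random.default_rng(0)
X_all = rng.normal(size=(300, 10)).astype("float32")
y_all = rng.random((300, 1)).astype("float32")

X_tr, X_te, y_tr, y_te = train_test_split(X_all, y_all, test_size=0.2, random_state=0)
X_tr_t, y_tr_t = torch.from_numpy(X_tr), torch.from_numpy(y_tr)

model = AttentionBP(n_in=10, m_hidden=16, p_hidden=2, q_out=1)   # sizes are assumptions
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)         # gradient-descent updates
loss_fn = nn.MSELoss()

for epoch in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(X_tr_t), y_tr_t)  # compare predictions with reference evaluation scores
    loss.backward()                        # back-propagate the error through both hidden layers
    optimizer.step()                       # adjust connection weights and attention weights
```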
Illustratively, the trained BP neural network model is used to evaluate the enterprise data of a new supplier or updated real-time data for an existing supplier's evaluation indexes. The feature variables of the supplier's enterprise data are input into the model to obtain the corresponding evaluation result. The evaluation result can serve as a basis for the purchasing party to select suppliers, or as a basis for the supplier to optimize the management of its carbon emissions, energy consumption, and the like and improve its low-carbon level.
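As a usage example continuing the sketches above, inference on a new supplier might look like the following; model, preprocess, and new_supplier_df are the hypothetical names introduced in the earlier sketches.

```python
import torch

model.eval()
with torch.no_grad():
    x_new = torch.tensor(preprocess(new_supplier_df), dtype=torch.float32)  # feature variables X
    score = model(x_new)  # evaluation result for the new supplier
print(score)
```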
In one embodiment, to demonstrate the effectiveness of the supplier evaluation method of the present invention, a related experiment was performed. In the experiment, the evaluation results of the BP neural network model in step 102 were analyzed using common evaluation metrics such as MSE, RMSE, and MAE, together with the coefficient of determination:
R² = 1 - RMSE / Var
where RMSE is the mean square error and Var is the variance. The calculated value of R² is about 0.7.
This shows that the supplier evaluation method is effective: by constructing the BP neural network model and introducing an attention mechanism into it, the model can automatically focus on important features related to evaluating the supplier's carbon footprint; by learning the attention weights, the BP neural network dynamically adjusts its degree of attention, improving the accuracy of the model. Meanwhile, by adjusting the weights of the features related to the supplier's carbon footprint, suppliers are evaluated more comprehensively and accurately, and the weight given to green, low-carbon performance in supplier evaluation is increased. The method can help enterprises select and manage suppliers, promote sustainable development, and achieve carbon emission reduction targets.
It should be understood that the sequence number of each step does not mean the sequence of execution, and the execution sequence of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the supplier evaluation method of the above embodiment, fig. 2 shows a block diagram of the supplier evaluation device provided in the embodiment of the present application, and for convenience of explanation, only the portions relevant to the embodiment of the present application are shown.
Referring to fig. 2, the provider evaluation device in the embodiment of the present application may include an enterprise data acquisition module 201 and an evaluation result output module 202.
Wherein, the enterprise data acquisition module 201 is configured to acquire enterprise data of a provider; the vendor's enterprise data includes the vendor's carbon footprint. The evaluation result output module 202 is configured to input enterprise data of a provider into a pre-trained BP neural network model to obtain an evaluation result of the provider; an attention mechanism module is arranged in the BP neural network model; in the BP neural network model, an attention mechanism module is used to adjust the weights of features related to the provider's carbon footprint.
It should be noted that, because the content of information interaction and execution process between the devices is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiment of the present application further provides a terminal device, referring to fig. 3, the terminal device 300 may include: at least one processor 310 and a memory 320, the memory 320 storing a computer program executable on the at least one processor 310, the processor 310 implementing the steps of any of the various method embodiments described above, such as steps 101 through 102 in the embodiment shown in fig. 1, when the computer program is executed by the processor 310. Alternatively, the processor 310 may execute a computer program to implement the functions of the modules/units in the above-described apparatus embodiments, such as the functions of the modules 201 to 202 shown in fig. 2.
By way of example, a computer program may be partitioned into one or more modules/units that are stored in memory 320 and executed by processor 310 to complete the present application. One or more of the modules/units may be a series of computer program segments capable of performing specific functions for describing the execution of the computer program in the terminal device 300.
It will be appreciated by those skilled in the art that fig. 3 is merely an example of a terminal device and is not limiting of the terminal device and may include more or fewer components than shown, or may combine certain components, or different components, such as input-output devices, network access devices, buses, etc.
The processor 310 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 320 may be an internal storage unit of the terminal device, or may be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card), or the like. The memory 320 is used to store computer programs and other programs and data required for the terminal device. The memory 320 may also be used to temporarily store data that has been output or is to be output.
The bus may be an industry standard architecture (Industry Standard Architecture, ISA) bus, an external device interconnect (Peripheral Component, PCI) bus, or an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, among others. The buses may be divided into address buses, data buses, control buses, etc. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The provider evaluation method provided by the embodiment of the application can be applied to terminal equipment such as computers, tablet computers, notebook computers, netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the specific type of the terminal equipment is not limited.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor implements steps for implementing the embodiments of the vendor evaluation method described above.
Embodiments of the present application provide a computer program product that, when run on a mobile terminal, causes the mobile terminal to perform steps that may be performed in the various embodiments of the vendor evaluation method described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application implements all or part of the flow of the method of the above embodiments, and may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, where the computer program, when executed by a processor, may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, executable files or in some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing device/terminal apparatus, recording medium, computer Memory, read-Only Memory (ROM), random access Memory (RAM, random Access Memory), electrical carrier signals, telecommunications signals, and software distribution media. Such as a U-disk, removable hard disk, magnetic or optical disk, etc.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of modules or elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A vendor evaluation method, comprising:
acquiring enterprise data of a provider; the vendor's enterprise data includes a vendor's carbon footprint;
inputting enterprise data of the suppliers into a pre-trained BP neural network model to obtain an evaluation result of the suppliers; an attention mechanism module is arranged in the BP neural network model; in the BP neural network model, the attention mechanism module is to adjust weights of features related to the provider's carbon footprint.
2. The vendor evaluation method of claim 1, wherein the BP neural network model comprises an input layer, a first hidden layer, a second hidden layer, an output layer, and the attention mechanism module; the attention mechanism module comprises a first attention mechanism unit and a second attention mechanism unit;
the input layer is used for inputting enterprise data of the provider;
the first hidden layer is used for outputting the evaluation index characteristics of the suppliers; the evaluation index features of the suppliers are multiple and are used for representing correlation among enterprise data of the suppliers; the second hidden layer is used for outputting the total evaluation index characteristics of the provider; the number of the total evaluation index features of the suppliers is more than or equal to 2, and the total evaluation index features are used for representing the correlation among the evaluation index features of the suppliers;
the first attention mechanism unit is arranged between the first hidden layer and the second hidden layer; the second attention mechanism unit is disposed between the second hidden layer and the output layer.
3. The vendor evaluation method according to claim 2, wherein the evaluation index features are input into the first attention mechanism unit, and the first attention mechanism unit is configured to assign, to the neuron of the first hidden layer that corresponds to the carbon-footprint-related feature among the evaluation index features, a weight for mapping that neuron to the second hidden layer, the weight being a first attention weight.
4. The vendor evaluation method according to claim 3, wherein the output feature of the second hidden layer is:
y'_i = f'( Σ_{b=1}^{M} ω'_{bi} · y_b + ω'_{ci} · y_c - θ' )
wherein y'_i is the output feature of the i-th neuron in the second hidden layer; ω'_{bi} is the mapping weight from the b-th neuron in the first hidden layer to the i-th neuron in the second hidden layer; y_b is the output feature of the b-th neuron in the first hidden layer; ω'_{ci} is the first attention weight from the c-th neuron of the first hidden layer to the i-th neuron in the second hidden layer; the c-th neuron comprises the feature related to the carbon footprint; y_c is the output feature of the c-th neuron in the first hidden layer; f'() is the activation function of the second hidden layer; θ' is the neuron threshold of the second hidden layer; and M is the number of neurons of the first hidden layer.
5. The vendor evaluation method according to claim 2, wherein the second hidden layer outputs the total evaluation index features; the total evaluation index features are input into the second attention mechanism unit, and the second attention mechanism unit is configured to assign, to the neuron of the second hidden layer that corresponds to the carbon-footprint-related feature among the total evaluation index features, a weight for mapping that neuron to the output layer, the weight being a second attention weight.
6. The vendor evaluation method according to claim 5, wherein the output feature of the output layer is:
y''_j = f''( Σ_{i=1}^{P} ω''_{ij} · y'_i + ω''_{dj} · y'_d - θ'' )
wherein y''_j is the output feature of the j-th neuron in the output layer; ω''_{ij} is the mapping weight from the i-th neuron in the second hidden layer to the j-th neuron in the output layer; y'_i is the output feature of the i-th neuron in the second hidden layer; ω''_{dj} is the second attention weight from the d-th neuron of the second hidden layer to the j-th neuron in the output layer; the d-th neuron comprises the feature related to the carbon footprint; y'_d is the output feature of the d-th neuron in the second hidden layer; f''() is the activation function of the output layer; θ'' is the neuron threshold of the output layer; and P is the number of neurons of the second hidden layer.
7. The vendor evaluation method according to any one of claims 2 to 6, wherein:
the number of neurons of the input layer is determined according to the number of types of enterprise data of the provider, the number of neurons of the input layer being N;
the number of neurons of the first hidden layer is determined according to the number of evaluation index features of the first hidden layer, the number of neurons of the first hidden layer being M, with M < N;
the number of neurons of the second hidden layer is determined according to the number of total evaluation index features of the second hidden layer, the number of neurons of the second hidden layer being P, with P < M; and
the number of neurons of the output layer is determined according to the number of evaluation results, the number of neurons of the output layer being Q, with Q < P.
8. A vendor evaluation device, comprising:
an enterprise data acquisition module, configured to acquire enterprise data of a provider; the enterprise data of the provider includes a carbon footprint of the provider;
an evaluation result output module, configured to input the enterprise data of the provider into a pre-trained BP neural network model to obtain an evaluation result of the provider; wherein an attention mechanism module is arranged in the BP neural network model, and in the BP neural network model the attention mechanism module is configured to adjust the weights of features related to the carbon footprint of the provider.
9. A terminal device comprising a memory and a processor, the memory having stored therein a computer program executable on the processor, characterized in that the processor, when executing the computer program, implements the vendor evaluation method according to any one of claims 1 to 7.
10. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the vendor evaluation method according to any one of the preceding claims 1 to 7.
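For illustration, the sketch below (Python with NumPy) shows one way the forward pass of the network recited in claims 2 to 7 could look: an N-M-P-Q topology with Q < P < M < N, where additional attention weights are applied to the hidden-layer neurons whose features relate to the carbon footprint, matching the formulas in claims 4 and 6. The sigmoid activation, the parameter names (W1, A1, theta1 and so on), the example layer sizes and the chosen carbon-footprint neuron indices are assumptions made for this sketch and are not specified by the patent.

import numpy as np

def sigmoid(x):
    # Assumed activation; the claims only refer to activation functions f' and f^.
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, params, carbon_idx_h1, carbon_idx_h2):
    # First hidden layer: plain BP propagation of the N-dimensional enterprise data.
    y1 = sigmoid(params["W1"] @ x - params["theta1"])            # shape (M,)

    # Second hidden layer (claim 4): ordinary mapping weights plus the first
    # attention weights applied to the carbon-footprint-related neurons c.
    z2 = params["W2"] @ y1                                       # sum_b omega'_bi * y_b
    z2 += params["A1"][:, carbon_idx_h1] @ y1[carbon_idx_h1]     # + omega'_ci * y_c
    y2 = sigmoid(z2 - params["theta2"])                          # shape (P,)

    # Output layer (claim 6): second attention weights for the carbon-related neurons d.
    z3 = params["W3"] @ y2                                       # sum_i omega^_ij * y'_i
    z3 += params["A2"][:, carbon_idx_h2] @ y2[carbon_idx_h2]     # + omega^_dj * y'_d
    return sigmoid(z3 - params["theta3"])                        # shape (Q,)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, M, P, Q = 12, 8, 4, 1                 # example sizes with Q < P < M < N (claim 7)
    params = {
        "W1": rng.normal(size=(M, N)), "theta1": np.zeros(M),
        "W2": rng.normal(size=(P, M)), "theta2": np.zeros(P),
        "W3": rng.normal(size=(Q, P)), "theta3": np.zeros(Q),
        "A1": rng.normal(size=(P, M)),       # first attention weights
        "A2": rng.normal(size=(Q, P)),       # second attention weights
    }
    x = rng.normal(size=N)                   # one provider's normalised enterprise data
    print(forward(x, params, carbon_idx_h1=[0, 1], carbon_idx_h2=[0]))

In such a sketch the attention weights A1 and A2 would simply be trained together with the mapping weights W1, W2, W3 by ordinary back-propagation and gradient descent.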
CN202311226089.9A 2023-09-21 2023-09-21 Vendor evaluation method, device, terminal equipment and medium Pending CN117314233A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311226089.9A CN117314233A (en) 2023-09-21 2023-09-21 Vendor evaluation method, device, terminal equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311226089.9A CN117314233A (en) 2023-09-21 2023-09-21 Vendor evaluation method, device, terminal equipment and medium

Publications (1)

Publication Number Publication Date
CN117314233A true CN117314233A (en) 2023-12-29

Family

ID=89259566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311226089.9A Pending CN117314233A (en) 2023-09-21 2023-09-21 Vendor evaluation method, device, terminal equipment and medium

Country Status (1)

Country Link
CN (1) CN117314233A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112036923A (en) * 2020-07-06 2020-12-04 北京嘀嘀无限科技发展有限公司 Service evaluation method, system, device and storage medium
US20220083954A1 (en) * 2020-09-11 2022-03-17 Shopify Inc. Methods and systems for real-time inventory reallocation from supplier to retailer
CN115564490A (en) * 2022-10-21 2023-01-03 中航机载系统共性技术有限公司 Supplier evaluation method based on attention mechanism

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Sun Xinjie et al., "Evaluation Model for Cold-Chain Distribution Companies Based on BiLG-A-CNN", Journal of Liupanshui Normal University, vol. 34, no. 4, 31 August 2022 (2022-08-31), pages 98-105 *
Pan Yuhong et al., "Identification of Factors Influencing Supplier Selection for Prefabricated Components of Prefabricated Housing Based on DEMATEL-BP", Mathematics in Practice and Theory, vol. 47, no. 9, 8 May 2017 (2017-05-08), pages 22-34 *
Deng Hao et al., "Supplier Evaluation for Marine Diesel Engine Manufacturing Enterprises Based on a GA-BP Neural Network", Journal of Anhui University of Technology (Natural Science), vol. 36, no. 1, 31 March 2019 (2019-03-31), pages 80-87 *

Similar Documents

Publication Publication Date Title
Choi et al. Big data-driven fuzzy cognitive map for prioritising IT service procurement in the public sector
CN113239314A (en) Method, device, terminal and computer-readable storage medium for carbon emission prediction
de Barcelos Tronto et al. An investigation of artificial neural networks based prediction systems in software project management
CN112035541A (en) Client image drawing method and device, computer readable storage medium and terminal equipment
CN109993652A (en) A kind of debt-credit assessing credit risks method and device
CN113421165A (en) Method and system for evaluating and managing green financial products
Liu et al. Adaptive wavelet transform model for time series data prediction
CN114943565A (en) Electric power spot price prediction method and device based on intelligent algorithm
CN111882140A (en) Risk evaluation method, model training method, device, equipment and storage medium
CN114037184A (en) Method, apparatus, medium, device, and program product for predicting profit evaluation index
CN112257958A (en) Power saturation load prediction method and device
CN117314233A (en) Vendor evaluation method, device, terminal equipment and medium
CN116402528A (en) Power data processing system
CN115809837A (en) Financial enterprise management method, equipment and medium based on digital simulation scene
CN115358894A (en) Intellectual property life cycle trusteeship management method, device, equipment and medium
CN114970357A (en) Energy-saving effect evaluation method, system, device and storage medium
CN114418776A (en) Data processing method, device, terminal equipment and medium
Zhang et al. A combinational QoS-prediction approach based on RBF neural network
Malik et al. Towards a Stock Price Prediction on Time Series Data using Long-Short Term Memory Method
CN115829144B (en) Method for establishing power grid business optimization model and electronic equipment
CN117493140B (en) Evaluation system for deep learning model
Pérez-Pérez et al. Forecasting Climate Transition Regulatory and Market Risk Variables with Machine Learning
Lan et al. Digital Investment Risk Evaluation Model of Power Grid Enterprises Based on FAHP-AOA-LSSVM
Hanne et al. Artificial Intelligence and Machine Learning for Maturity Evaluation and Model Validation
Liu et al. Research on the Development of Robo-Advisor Under the Background of Fin-Tech

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination