CN115795318A - Classification method of use case object, model training method, equipment and storage medium

Classification method of use case object, model training method, equipment and storage medium

Info

Publication number: CN115795318A
Application number: CN202211439611.7A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 于志杰
Current Assignee: Beijing Bailong Mayun Technology Co ltd
Original Assignee: Beijing Bailong Mayun Technology Co ltd
Application filed by: Beijing Bailong Mayun Technology Co ltd
Priority date: 2022-11-17
Filing date: 2022-11-17
Publication date: 2023-03-14
Legal status: Pending
Prior art keywords: use case, case object, sets, objects, classification

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a use case object classification method, a model training method, a device, and a storage medium. The classification method comprises the following steps: classifying different use case objects acquired from the same interface to obtain a plurality of use case object sets, wherein each use case object belongs to one use case object set; performing a matching degree operation on the plurality of use case object sets to obtain matching values among the different use case objects; if the matching value between use case object sets is smaller than or equal to a set threshold value, combining those use case object sets into a new use case object set; and repeating the above steps until the number of use case object sets is 1 or the matching values between the use case object sets are greater than the set threshold value. The technical scheme provided by the invention can improve the accuracy of classifying the use case objects of the same interface.

Description

Classification method of use case object, model training method, equipment and storage medium
Technical Field
The invention relates to the field of data processing, and in particular to a use case object classification method, a model training method, a device, and a storage medium.
Background
With the rapid development of the internet, product iteration is becoming faster and faster, and the quality with which product requirements are completed is becoming increasingly important. In the traditional product iteration process, regressing a large number of functions becomes a major task for testers, and human resources are split between interface regression and functional testing, which makes it difficult to further improve testing efficiency. In the prior art, when screening playback traffic from a large amount of data, the use case objects obtained through testing are rigidly classified mainly through a clustering process, so that there are too many classification results and use case objects belonging to the same transaction cannot be merged.
Disclosure of Invention
In view of this, embodiments of the present invention provide a use case object classification method, a model training method, a device, and a storage medium, which can improve the accuracy of the use case object merging results to a certain extent.
In one aspect, the invention provides a method for classifying use case objects, which comprises the following steps: classifying different use case objects acquired from the same interface to obtain a plurality of use case object sets, wherein each use case object belongs to one use case object set; performing a matching degree operation on the plurality of use case object sets to obtain matching values among the different use case objects; if the matching value between use case object sets is smaller than or equal to a set threshold value, combining those use case object sets into a new use case object set; and repeating the above steps until the number of use case object sets is 1 or the matching values between the use case object sets are greater than the set threshold value.
In one embodiment, classifying different use case objects acquired from the same interface to obtain a plurality of use case object sets includes: vectorizing different case objects respectively to obtain different case object vectors; respectively calculating the similarity between different case object vectors; and if the similarity between the use case object vectors is greater than or equal to a preset threshold value, adding the use case objects represented by the use case object vectors into the same use case object set.
In one embodiment, performing the matching degree operation on the plurality of use case object sets to obtain matching values among a plurality of different use case objects includes: computing a central representation of the use case object vectors of the use case objects in each of the plurality of use case object sets, to obtain a plurality of central vectors, each representing one use case object set; and calculating the matching degrees among the plurality of central vectors to obtain the matching values among the plurality of different use case objects.
In one embodiment, calculating the similarity between the different use case object vectors comprises: calculating the similarity between different use case objects using the cosine of the included angle between their vectors.
In one embodiment, the method for classifying a use case object further includes: acquiring a data set comprising a plurality of use case objects, where each use case object corresponds to an interface name, an interface type, an application to which the interface belongs, and attribute information; and determining the interface to which the use case object belongs based on the interface name, the interface type, and the application to which the interface belongs.
The invention also provides a training method of the case object classification model, which comprises the following steps: constructing a plurality of groups of training samples with tree structures; the training sample comprises a use case object and a classification result of the use case object; wherein the use case object is stored on a leaf node of the training sample; the classification result of the use case object is stored on a non-leaf node of the training sample; and updating the use case object classification model based on the use case object and the classification result of the use case object to obtain a target use case object classification model.
In one embodiment, updating the use case object classification model based on the use case object and the classification result of the use case object, and obtaining a target use case object classification model includes: inputting the case object into an initial case object classification model to obtain a classification result of the initial case object; generating a loss function of the use case object classification model based on the classification result of the initial use case object and the classification result of the use case object; and if the loss function is converged, taking the use case object classification model corresponding to the loss function as a target use case object classification model.
In another aspect, the present invention further provides a device for classifying a use-case object, where the device for classifying a use-case object includes: the case object initial classification unit is used for classifying different case objects acquired from the same interface to obtain a plurality of case object sets; wherein, one use case object belongs to one use case object set; the case object set matching value calculation unit is used for respectively carrying out matching degree calculation on the plurality of case object sets to obtain matching values among a plurality of different case objects; the use case object classification unit is used for merging the use case object sets into a new use case object set if the matching value between the use case object sets is smaller than or equal to a set threshold value; and repeating the steps until the number of the use case object sets is 1 or the matching value between the use case object sets is greater than the set threshold value.
In another aspect, the present invention further provides a training apparatus for a use case object classification model, where the training apparatus for a use case object classification model includes: the training sample construction unit is used for constructing a plurality of groups of training samples with tree structures; the training sample comprises a use case object and a classification result of the use case object; wherein the use case object is stored on a leaf node of the training sample; the classification result of the use case object is stored on a non-leaf node of the training sample; and the use case object classification model determining unit is used for updating the use case object classification model based on the use case object and the classification result of the use case object to obtain a target use case object classification model.
In another aspect, the present invention further provides an electronic device, where the electronic device includes a processor and a memory, where the memory is used to store a computer program, and the computer program, when executed by the processor, implements the above-mentioned use case object classification method and/or the above-mentioned training method for the use case object classification model.
In another aspect of the present invention, a computer-readable storage medium is provided, and the computer-readable storage medium is used for storing a computer program, and when the computer program is executed by a processor, the computer program implements the above-mentioned use case object classification method and/or the training method of the use case object classification model.
The method classifies the use case object data of different scenarios of the same interface, and then further classifies the use case object sets formed after this classification, until the matching degree operation results among the use case object sets are all smaller than a set threshold value or only one use case object set remains. The use case objects are thereby merged, which reduces the number of use case object classes to a certain extent and improves the reliability of the classification results.
Drawings
The features and advantages of the present invention will be more clearly understood by reference to the accompanying drawings, which are schematic and are not to be understood as limiting the invention in any way, and in which:
FIG. 1 is a flow diagram illustrating a use case object classification method according to an embodiment of the invention;
FIG. 2 is a flow diagram illustrating a method for training a use case object classification model according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a use case object classification apparatus according to an embodiment of the present invention;
FIG. 4 is a diagram of a training apparatus for a use case object classification model according to an embodiment of the present invention;
fig. 5 shows a schematic structural diagram of an electronic device in an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of the present invention.
Testing of internet products is increasingly important to meet the needs of users. In the traditional update and iteration process of internet products, regressing a large number of functions becomes a major task for testers, and human resources are split between interface regression and functional testing, which reduces the working efficiency of the testers. Recording online traffic and replaying it offline against all external calls has emerged as a new regression means and improves regression efficiency to a certain extent. In the prior art, classification of such traffic data is mainly achieved by clustering. However, with large data volumes, especially when the data volume reaches the million level, many categories may remain after an initial classification of the traffic data, which is inconvenient for testers to investigate; and when fewer categories are produced, the accuracy of the traffic data classification may decrease, which in turn blocks the release process of internet products to a certain extent. It is therefore desirable to provide a method that improves the accuracy of traffic data classification while keeping the number of traffic data classes small.
Referring to fig. 1, a method for classifying a use case object according to an embodiment of the present application may include the following steps.
S110: classifying different case objects acquired from the same interface to obtain a plurality of case object sets; wherein, one use case object belongs to one use case object set.
In this embodiment, before the use case objects are classified further, the different use case objects may first be divided into a plurality of different categories by a simple clustering method, and the corresponding merging operations are then performed. Specifically, for example, the classification of the use case objects can be realized by k-means clustering, DBSCAN (density-based spatial clustering), spectral clustering, a GMM (Gaussian mixture model), mean shift (MeanShift), hierarchical clustering, and the like.
In the present embodiment, a use case object set is the classification result of the use case objects after they have been classified once. Specifically, for example, if six use case objects A1, A2, B1, B2, B3, and C1 are classified once such that A1 and A2 belong to the same class, B1, B2, and B3 belong to the same class, and C1 belongs to a class of its own, then {A1, A2} is one use case object set, {B1, B2, B3} is another, and {C1} is a third.
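As an illustration of this one-pass clustering step, the following minimal sketch groups six hypothetical use case objects into use case object sets with k-means, one of the methods listed above. It assumes scikit-learn is available and that the use case objects have already been converted to numeric feature vectors; the vectors and names are illustrative only, not taken from the patent.

```python
import numpy as np
from sklearn.cluster import KMeans

# Six hypothetical use case objects as feature vectors (illustrative values).
names = ["A1", "A2", "B1", "B2", "B3", "C1"]
vectors = np.array([
    [1.0, 2.0, 3.0],    # A1
    [1.0, 2.0, 3.1],    # A2
    [7.0, 8.0, 9.0],    # B1
    [7.1, 8.0, 9.0],    # B2
    [7.0, 8.1, 9.0],    # B3
    [45.0, 6.0, 56.0],  # C1
])

# One-pass clustering with k-means.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Collect the use case objects into use case object sets by cluster label.
sets = {}
for name, label in zip(names, labels):
    sets.setdefault(int(label), []).append(name)
print(list(sets.values()))  # e.g. [['A1', 'A2'], ['B1', 'B2', 'B3'], ['C1']]
```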
S120: and respectively carrying out matching degree operation on the plurality of case object sets to obtain matching values among a plurality of different case objects.
In this embodiment, by calculating the matching degrees between the plurality of use case object sets, respectively, when the matching degree between two use case object sets is large, the two use case object sets can be merged to obtain a new use case object set.
In the present embodiment, the matching degree may be calculated by the shortest distance, the longest distance, the mean distance, the barycentric distance, the sum of squared deviations, and the like, and the embodiments of the present specification are not limited thereto.
S130: if the matching value between the use case object sets is smaller than or equal to a set threshold value, combining the use case object sets into a new use case object set; and repeating the steps until the number of the use case object sets is 1 or the matching value between the use case object sets is greater than the set threshold value.
In this embodiment, if the matching value between two use case object sets is greater than or equal to the threshold, the two use case object sets are merged to obtain a new use case object set, and the matching degree operation is then performed again on the resulting use case object sets, until the number of use case object sets is 1 or the matching values between the remaining use case object sets are all below the set threshold. Specifically, for example, suppose the threshold is set to 0.95 and there are four use case object sets A, B, C and D. The matching degree operation is performed pairwise on the four use case object sets A, B, C and D: the matching degree between A and B is 0.99, between A and C is 0.93, between A and D is 0.9, between B and C is 0.97, between B and D is 0.89, and between C and D is 0.94. The three use case object sets A, B and C can therefore be combined into a new use case object set R, while the matching values between use case object set D and the other use case object sets are all less than 0.95, so the use case object sets obtained in this round are R and D. The matching degree operation is then performed on R and D. If the matching degree between R and D is less than 0.95, the use case objects are finally classified into the two categories R and D; if the matching degree between R and D is greater than or equal to 0.95, R and D can be further merged into a new use case object set Q, and all the different use case objects then belong to the single category Q.
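The iterative merging just described can be sketched as below. This is a minimal illustration rather than the patent's implementation: it assumes a caller-supplied matching() function that returns a similarity-style matching value (larger means more similar, as in the worked example), and it merges the best-matching pair of sets in each round before recomputing.

```python
from itertools import combinations

THRESHOLD = 0.95  # the set threshold used in the example above

def merge_until_done(case_object_sets, matching):
    """Repeatedly merge use case object sets whose matching value reaches the
    threshold, until only one set remains or no pair reaches the threshold."""
    sets = [list(s) for s in case_object_sets]
    while len(sets) > 1:
        # Find the best-matching pair of use case object sets.
        best_pair, best_value = None, -1.0
        for i, j in combinations(range(len(sets)), 2):
            value = matching(sets[i], sets[j])
            if value > best_value:
                best_pair, best_value = (i, j), value
        if best_value < THRESHOLD:   # all matching values are below the threshold
            break
        i, j = best_pair             # merge the pair into a new use case object set
        sets[i] = sets[i] + sets[j]
        del sets[j]
    return sets
```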
In one embodiment, classifying different use case objects acquired from the same interface to obtain a plurality of use case object sets may include: vectorizing different case objects respectively to obtain different case object vectors; respectively calculating the similarity between different case object vectors; and if the similarity between the use case object vectors is greater than or equal to a preset threshold value, adding the use case objects represented by the use case object vectors into the same use case object set.
In this embodiment, since use case objects cannot be classified directly from their descriptions, the use case objects first need to be vectorized. Pairwise similarity is then calculated on the vectorized representations, and if the similarity between two use case objects is greater than the threshold, the two use case objects are considered to belong to the same category. Specifically, for example, suppose the characteristic fields of interface A include carType (car type), realTimeType (real-time type), and serviceType (service type); if carType is 1, realTimeType is 2, and serviceType is 3, the corresponding vector is {1.0, 2.0, 3.0}. The following four vectors may then be preset, and the pairwise similarity operation is performed on them using formula 1: {1.0, 2.0, 3.0}, {1.0, 2.0, 4.0}, {7.0, 8.0, 9.0}, and {45.0, 6.0, 56.0}.
Formula 1 (cosine similarity of two use case object vectors x = (x_1, ..., x_n) and y = (y_1, ..., y_n)):

\cos\theta = \frac{\sum_{i=1}^{n} x_i y_i}{\sqrt{\sum_{i=1}^{n} x_i^2}\,\sqrt{\sum_{i=1}^{n} y_i^2}}
Two vectors whose similarity value is greater than 0.99 are then combined into one group: the use case objects represented by the two vectors {1.0, 2.0, 3.0} and {1.0, 2.0, 4.0} form one use case object set, while {7.0, 8.0, 9.0} and {45.0, 6.0, 56.0} each form a use case object set of their own.
Continuing these operations in the above embodiment, the barycenter of each use case object set is selected as the feature vector of that set, and the use case objects can finally be divided into two clusters, where cluster 0 contains {45.0, 6.0, 56.0}, and cluster 1 contains {1.0, 2.0, 3.0}, {1.0, 2.0, 4.0}, and {7.0, 8.0, 9.0}.
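The pairwise similarity grouping in this example can be reproduced with the short sketch below (plain Python, no external dependencies); the 0.99 grouping threshold and the four vectors come from the text above.

```python
import math

def cosine(x, y):
    """Included-angle cosine of two vectors (formula 1)."""
    dot = sum(a * b for a, b in zip(x, y))
    return dot / (math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y)))

vectors = [
    (1.0, 2.0, 3.0),
    (1.0, 2.0, 4.0),
    (7.0, 8.0, 9.0),
    (45.0, 6.0, 56.0),
]

for i in range(len(vectors)):
    for j in range(i + 1, len(vectors)):
        sim = cosine(vectors[i], vectors[j])
        verdict = "same set" if sim > 0.99 else "separate"
        print(vectors[i], vectors[j], round(sim, 4), verdict)
# Only (1.0, 2.0, 3.0) and (1.0, 2.0, 4.0) exceed 0.99 (about 0.9915), so they
# form one use case object set, and the other two vectors each form their own.
```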
In an embodiment, performing the matching degree operation on the plurality of use case object sets to obtain matching values among a plurality of different use case objects may include: computing a central representation of the use case object vectors of the use case objects in each of the plurality of use case object sets, to obtain a plurality of central vectors, each representing one use case object set; and calculating the matching degrees among the plurality of central vectors to obtain the matching values among the plurality of different use case objects.
In this embodiment, the central distribution tendency of the use case objects in a use case object set is well expressed by the barycenter of the feature vectors in that set, so the similarity between two use case object sets can be measured using the barycenters of the use case object vectors of the two sets. Specifically, for example, if use case object set A contains {1.0, 2.0, 3.0}, {1.0, 2.0, 4.0}, and {7.0, 8.0, 9.0}, the barycenter of use case object set A can be calculated by formula 2, giving a center vector of (3.0, 4.0, 16/3).
Formula 2 (barycenter of a use case object set containing the vectors v_1, ..., v_m):

c = \frac{1}{m}\sum_{j=1}^{m} v_j
The matching degree between this center vector and {45.0, 6.0, 56.0} in use case object set B is then calculated using the included-angle cosine described in the above embodiment, giving a matching value of 0.8684. Because this matching value is less than the set threshold of 0.95, use case object set A and use case object set B cannot be merged, and the use case objects are finally divided into the two use case object sets A and B.
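The center-vector matching in this example can be reproduced with the following sketch (plain Python): the barycenter corresponds to formula 2, and the matching value is the included-angle cosine of formula 1.

```python
import math

def cosine(x, y):
    dot = sum(a * b for a, b in zip(x, y))
    return dot / (math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y)))

def barycenter(vectors):
    """Component-wise mean of the use case object vectors in a set (formula 2)."""
    n = len(vectors)
    return [sum(component) / n for component in zip(*vectors)]

set_a = [(1.0, 2.0, 3.0), (1.0, 2.0, 4.0), (7.0, 8.0, 9.0)]
set_b = [(45.0, 6.0, 56.0)]

center_a = barycenter(set_a)   # [3.0, 4.0, 5.333...], i.e. (3.0, 4.0, 16/3)
center_b = barycenter(set_b)   # a single-element set is its own center

matching = cosine(center_a, center_b)
print(round(matching, 4))      # about 0.8684, below the 0.95 threshold, so no merge
```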
In one embodiment, respectively calculating the similarity between different use case object vectors may include: and calculating the similarity between different case objects by using the cosine function of the included angle.
In this embodiment, the cosine similarity may be obtained by using a cosine value of an included angle between two vectors in a vector space as a measure of the difference between the two individuals, where a value closer to 1 indicates that the included angle is closer to 0, indicating that the two individuals are more similar, and conversely, a value closer to 0 indicates that the difference is greater.
In one embodiment, the method for classifying the use-case object may further include: acquiring a data set comprising a plurality of use case objects; the use case object corresponds to an interface name, an interface type, an application of the interface and attribute information respectively; and determining the interface to which the use case object belongs based on the interface name, the interface type and the application to which the interface belongs.
In this embodiment, the data set acquired at one time may come from a plurality of different interfaces. However, during traffic playback screening of test data, a tester needs to screen and test the problems of each interface separately, so the acquired data set first needs to be classified by interface. Specifically, for example, the description of each use case object in the data set includes an interface name, an interface type, the application to which the interface belongs, and attribute information; therefore, use case objects with the same interface name, the same interface type, and the same owning application may be treated as use case object data of the same interface.
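A minimal sketch of this interface-level grouping follows; the dictionary keys and the example interface names are illustrative assumptions, not values from the patent.

```python
from collections import defaultdict

def group_by_interface(use_case_objects):
    """Group use case objects that share the same interface name, interface
    type, and owning application (hypothetical dictionary keys)."""
    groups = defaultdict(list)
    for obj in use_case_objects:
        key = (obj["interface_name"], obj["interface_type"], obj["app"])
        groups[key].append(obj)
    return dict(groups)

data_set = [
    {"interface_name": "/order/create", "interface_type": "HTTP", "app": "trade", "carType": 1},
    {"interface_name": "/order/create", "interface_type": "HTTP", "app": "trade", "carType": 2},
    {"interface_name": "/user/query", "interface_type": "RPC", "app": "account", "level": 3},
]
print({key: len(objs) for key, objs in group_by_interface(data_set).items()})
```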
Referring to fig. 2, an embodiment of the present disclosure provides a method for training a use case object classification model, which may include the following steps.
S210: constructing a plurality of groups of training samples with tree structures; the training sample comprises a use case object and a classification result of the use case object; wherein the use case object is stored on a leaf node of the training sample; and the classification result of the use case object is stored on a non-leaf node of the training sample.
S220: and updating the use case object classification model based on the use case object and the classification result of the use case object to obtain a target use case object classification model.
In this embodiment, a training sample is a hierarchical structure in which the use case object is a leaf node, the first-level classification result of the use case object is the parent node of that leaf node, and the final classification result of the use case object is the root node. The whole classification process is thus layered progressively, which improves the reliability of the classification result to a certain extent while keeping the number of classes small.
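The tree-structured training sample described above might be represented as in the following sketch; the node fields are illustrative, and the R/A/B labels are borrowed from the earlier worked example rather than prescribed by the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    label: Optional[str] = None              # classification result (non-leaf nodes)
    use_case_object: Optional[dict] = None   # use case object (leaf nodes only)
    children: List["Node"] = field(default_factory=list)

# Root node: final classification result; middle layer: first-level results;
# leaf nodes: the use case objects themselves.
sample = Node(label="R", children=[
    Node(label="A", children=[
        Node(use_case_object={"vector": [1.0, 2.0, 3.0]}),
        Node(use_case_object={"vector": [1.0, 2.0, 4.0]}),
    ]),
    Node(label="B", children=[
        Node(use_case_object={"vector": [45.0, 6.0, 56.0]}),
    ]),
])
```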
For specific functions and effects implemented by the target use case object classification model, please refer to the use case object classification method described in the foregoing embodiment, which is not described herein again.
In one embodiment, updating the use case object classification model based on the use case object and the classification result of the use case object, and obtaining the target use case object classification model may include: inputting the case object into an initial case object classification model to obtain a classification result of the initial case object; generating a loss function of the use case object classification model based on the classification result of the initial use case object and the classification result of the use case object; and if the loss function is converged, taking the use case object classification model corresponding to the loss function as a target use case object classification model.
In this embodiment, the loss function of the model is mainly used in the training stage. After each batch of training data is fed into the model, a predicted value is output through forward propagation, and the loss function then calculates the difference between the predicted value and the true value, i.e., the loss value. Once the loss value is obtained, the model updates its parameters through back propagation so as to reduce the gap between the true value and the predicted value, making the model's predictions approach the true values and thereby achieving the purpose of learning. The choice of loss function is not limited in the embodiments of the present specification; it may be a cross entropy loss, a negative log likelihood loss, an exponential loss, a squared loss, or the like.
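A minimal training-step sketch of this forward-propagation, loss, and back-propagation cycle is given below. It assumes a PyTorch classifier with a placeholder architecture and uses cross entropy, one of the loss options listed above; none of these specifics are prescribed by the patent.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 2))  # placeholder classifier
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# One batch of vectorized use case objects and their classification labels.
batch = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, 4.0], [45.0, 6.0, 56.0]])
labels = torch.tensor([0, 0, 1])

predictions = model(batch)            # forward propagation
loss = loss_fn(predictions, labels)   # loss value: gap between prediction and truth
optimizer.zero_grad()
loss.backward()                       # back propagation
optimizer.step()                      # update parameters to reduce the loss
# Training repeats over batches until the loss converges; the converged model is
# then taken as the target use case object classification model.
```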
Referring to fig. 3, an embodiment of the present application further provides a device for classifying a use case object, where the device for classifying a use case object may include: the device comprises a use case object initial classification unit, a use case object set matching value calculation unit and a use case object target classification unit.
The case object initial classification unit is used for classifying different case objects acquired from the same interface to obtain a plurality of case object sets; wherein, one use case object belongs to one use case object set.
And the case object set matching value calculating unit is used for respectively carrying out matching degree calculation on the plurality of case object sets to obtain matching values among a plurality of different case objects.
The use case object target classification unit is used for merging the use case object sets into a new use case object set if the matching value between the use case object sets is smaller than or equal to a set threshold value; and repeating the steps until the number of the use case object sets is 1 or the matching value between the use case object sets is greater than the set threshold value.
Referring to fig. 4, an embodiment of the present application further provides a training apparatus for a use-case object classification model, where the training apparatus for a use-case object classification model may include: the device comprises a training sample construction unit and a use case object classification model determination unit.
The training sample construction unit is used for constructing a plurality of groups of training samples with tree structures; the training sample comprises a use case object and a classification result of the use case object; wherein the use case object is stored on a leaf node of the training sample; and the classification result of the use case object is stored on a non-leaf node of the training sample.
And the use case object classification model determining unit is used for updating the use case object classification model based on the use case object and the classification result of the use case object to obtain the target use case object classification model.
For the specific functions and effects achieved by the use case object classification device and/or the training device for the use case object classification model, reference may be made to the other embodiments of this specification, which are not repeated here. The various modules in these devices may be implemented in whole or in part by software, hardware, or a combination thereof. The modules can be embedded in hardware, be independent of the processor of the computer device, or be stored in software form in the memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
Referring to fig. 5, an embodiment of the present application further provides an electronic device, where the electronic device includes a processor and a memory, and the memory is used for storing a computer program, and when the computer program is executed by the processor, the method for classifying use case objects and/or the method for training a use case object classification model described above are implemented.
The processor may be a Central Processing Unit (CPU). The processor may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof.
The memory, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the methods of the embodiments of the present invention. The processor executes the non-transitory software programs, instructions and modules stored in the memory, so as to execute various functional applications and data processing of the processor, that is, to implement the method in the above method embodiment.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor, and the like. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be coupled to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
An embodiment of the present application further provides a computer-readable storage medium, which is used for storing a computer program, and when the computer program is executed by a processor, the computer program implements the above-mentioned use case object classification method and/or use case object classification model training method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments. Any reference to memory, storage, a database, or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others.
It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The various embodiments of the present disclosure are described in a progressive manner. The different embodiments focus on the different parts described compared to the other embodiments. After reading this specification, one skilled in the art can appreciate that many embodiments and many features disclosed in the embodiments can be combined in many different ways, and for the sake of brevity, all possible combinations of features in the embodiments are not described. However, as long as there is no contradiction between combinations of these technical features, the scope of the present specification should be considered as being described.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus comprising that element.
In the present specification, the embodiments themselves are emphasized differently from the other embodiments, and the embodiments can be explained in contrast to each other. Any combination of the embodiments in this specification based on general technical common knowledge by those skilled in the art is encompassed in the disclosure of the specification.
The above description is only an embodiment of the present disclosure, and is not intended to limit the scope of the claims of the present disclosure. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the scope of the claims of the present application.

Claims (11)

1. A method for classifying a use case object, the method comprising:
classifying different case objects acquired from the same interface to obtain a plurality of case object sets; wherein, one use case object belongs to one use case object set;
respectively carrying out matching degree operation on the plurality of case object sets to obtain matching values among a plurality of different case objects;
if the matching value between the use case object sets is smaller than or equal to a set threshold value, combining the use case object sets into a new use case object set; and repeating the steps until the number of the use case object sets is 1 or the matching value between the use case object sets is greater than the set threshold value.
2. The method according to claim 1, wherein classifying different use case objects obtained from the same interface to obtain a plurality of use case object sets comprises:
vectorizing different case objects respectively to obtain different case object vectors;
respectively calculating the similarity between different case object vectors;
and if the similarity between the use case object vectors is greater than or equal to a preset threshold value, adding the use case objects represented by the use case object vectors into the same use case object set.
3. The method according to claim 2, wherein performing matching degree operation on the plurality of use case object sets respectively to obtain matching values between a plurality of different use case objects comprises:
carrying out centralized representation on the use case object vectors of the use case objects in the plurality of use case object sets to obtain a plurality of central vectors representing the use case object sets;
and respectively calculating the matching degrees among the plurality of central vectors to obtain the matching values among a plurality of different case objects.
4. The method according to claim 2, wherein calculating the similarity between different use case object vectors respectively comprises:
and calculating the similarity between different case objects by using the cosine function of the included angle.
5. The method of claim 1, further comprising:
acquiring a data set comprising a plurality of use case objects; the use case object corresponds to an interface name, an interface type, an application of the interface and attribute information respectively;
and determining the interface to which the use case object belongs based on the interface name, the interface type and the application to which the interface belongs.
6. A training method for classifying a model by using an object, the method comprising:
constructing a plurality of groups of training samples with tree structures; the training sample comprises a use case object and a classification result of the use case object; wherein the use case object is stored on a leaf node of the training sample; the classification result of the use case object is stored on a non-leaf node of the training sample;
and updating the use case object classification model based on the use case object and the classification result of the use case object to obtain a target use case object classification model.
7. The method according to claim 6, wherein updating the use case object classification model based on the use case object and the classification result of the use case object, and obtaining a target use case object classification model comprises:
inputting the case object into an initial case object classification model to obtain a classification result of the initial case object;
generating a loss function of the use case object classification model based on the classification result of the initial use case object and the classification result of the use case object;
and if the loss function is converged, taking the use case object classification model corresponding to the loss function as a target use case object classification model.
8. A use case object classifying apparatus, comprising:
the case object initial classification unit is used for classifying different case objects acquired from the same interface to obtain a plurality of case object sets; wherein, one use case object belongs to one use case object set;
the case object set matching value calculation unit is used for respectively carrying out matching degree calculation on the plurality of case object sets to obtain matching values among a plurality of different case objects;
the use case object classification unit is used for merging the use case object sets into a new use case object set if the matching value between the use case object sets is smaller than or equal to a set threshold value; and repeating the steps until the number of the use case object sets is 1 or the matching value between the use case object sets is greater than the set threshold value.
9. A training device for a use case object classification model is characterized in that the training device for the use case object classification model comprises:
the training sample construction unit is used for constructing a plurality of groups of training samples with tree structures; the training sample comprises a use case object and a classification result of the use case object; wherein the use case object is stored on a leaf node of the training sample; the classification result of the use case object is stored on a non-leaf node of the training sample;
and the use case object classification model determining unit is used for updating the use case object classification model based on the use case object and the classification result of the use case object to obtain the target use case object classification model.
10. An electronic device, characterized in that the electronic device comprises a processor and a memory, the memory being used to store a computer program which, when executed by the processor, implements the method of any one of claims 1 to 7.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium is used to store a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
CN202211439611.7A (priority date 2022-11-17, filing date 2022-11-17): Classification method of use case object, model training method, equipment and storage medium. Status: Pending. Publication: CN115795318A (en)

Priority Applications (1)

Application Number: CN202211439611.7A · Priority Date: 2022-11-17 · Filing Date: 2022-11-17 · Title: Classification method of use case object, model training method, equipment and storage medium

Publications (1)

Publication Number: CN115795318A (en) · Publication Date: 2023-03-14

Family

ID=85438493

Family Applications (1)

Application Number: CN202211439611.7A · Title: Classification method of use case object, model training method, equipment and storage medium · Priority Date: 2022-11-17 · Filing Date: 2022-11-17

Country Status (1)

Country: CN · Link: CN115795318A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107798557A (en) * 2017-09-30 2018-03-13 平安科技(深圳)有限公司 Electronic installation, the service location based on LBS data recommend method and storage medium
CN109101620A (en) * 2018-08-08 2018-12-28 广州神马移动信息科技有限公司 Similarity calculating method, clustering method, device, storage medium and electronic equipment
CN109858562A (en) * 2019-02-21 2019-06-07 腾讯科技(深圳)有限公司 A kind of classification method of medical image, device and storage medium
CN110046586A (en) * 2019-04-19 2019-07-23 腾讯科技(深圳)有限公司 A kind of data processing method, equipment and storage medium
CN113673550A (en) * 2021-06-30 2021-11-19 浙江大华技术股份有限公司 Clustering method, clustering device, electronic equipment and computer-readable storage medium
CN114330469A (en) * 2021-08-29 2022-04-12 北京工业大学 Rapid and accurate encrypted flow classification method and system
CN114780847A (en) * 2022-04-25 2022-07-22 北京沃东天骏信息技术有限公司 Object information processing and information pushing method, device and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘同, 娄彦兵, 王益民: "黄河信息化典型系统研究" (Research on Typical Yellow River Informatization Systems), 郑州: 黄河水利出版社 (Zhengzhou: Yellow River Water Conservancy Press), pages 179-183 *

Similar Documents

Publication Publication Date Title
TWI677852B (en) A method and apparatus, electronic equipment, computer readable storage medium for extracting image feature
JP6969637B2 (en) Causality analysis methods and electronic devices
JP6402265B2 (en) Method, computer device and storage device for building a decision model
CN110991474A (en) Machine learning modeling platform
CN114048468A (en) Intrusion detection method, intrusion detection model training method, device and medium
CN112396428B (en) User portrait data-based customer group classification management method and device
WO2018006631A1 (en) User level automatic segmentation method and system
CN116596095B (en) Training method and device of carbon emission prediction model based on machine learning
CN111061923B (en) Graph data entity recognition system based on graph dependence rule and supervised learning
CN113052577A (en) Method and system for estimating category of virtual address of block chain digital currency
CN114169460A (en) Sample screening method, sample screening device, computer equipment and storage medium
JP2020004409A (en) Automation and self-optimization type determination of execution parameter of software application on information processing platform
CN113762005A (en) Method, device, equipment and medium for training feature selection model and classifying objects
CN110705889A (en) Enterprise screening method, device, equipment and storage medium
CN110837853A (en) Rapid classification model construction method
CN115795318A (en) Classification method of use case object, model training method, equipment and storage medium
TW202312030A (en) Recipe construction system, recipe construction method, computer readable recording media with stored programs, and non-transitory computer program product
CN114328174A (en) Multi-view software defect prediction method and system based on counterstudy
CN114529096A (en) Social network link prediction method and system based on ternary closure graph embedding
CN113255933A (en) Feature engineering and graph network generation method and device and distributed system
CN113610350A (en) Complex working condition fault diagnosis method, equipment, storage medium and device
CN114494753A (en) Clustering method, clustering device, electronic equipment and computer-readable storage medium
CN111949530A (en) Test result prediction method and device, computer equipment and storage medium
WO2020107836A1 (en) Word2vec-based incomplete user persona completion method and related device
CN117076293B (en) Software performance visual evaluation method based on lean sample hierarchy credible clustering thermodynamic diagram

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination