CN113010442A - User interface test result classification method and device based on deep belief network - Google Patents

User interface test result classification method and device based on deep belief network

Info

Publication number
CN113010442A
CN113010442A (application CN202110486456.3A)
Authority
CN
China
Prior art keywords
training
user interface
interface test
deep belief
classification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110486456.3A
Other languages
Chinese (zh)
Inventor
吕美洁
郭继泱
高小明
刘泱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202110486456.3A
Publication of CN113010442A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/088Non-supervised learning, e.g. competitive learning


Abstract

The invention provides a method and a device for classifying user interface test results based on a deep belief network. The method comprises: collecting execution result pictures of user interface test cases and labeling the type of each picture to form a training set; preprocessing the training pictures in the training set to obtain training data; inputting the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; collecting a user interface test case execution result picture to be classified as a target picture and preprocessing it; and inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture. The invention can effectively improve the comprehensiveness and accuracy of user interface test result classification, reduce the dependence on testers and their workload, and improve the efficiency of user interface testing.

Description

User interface test result classification method and device based on deep belief network
Technical Field
The invention relates to the technical field of neural networks, in particular to a method and a device for classifying user interface test results based on a deep belief network.
Background
The user interface (UI) is the content a software system displays to the user and an important medium for interaction between the user and the system. With the rapid development of the software industry and the rapid growth in the number of users, users' requirements on the user interface keep rising, so user interface testing has become a particularly critical part of software testing. Currently, there are two main ways to judge the execution result of a user interface test case: (1) manual judgment: the tester decides whether the user interface is displayed correctly based on familiarity with the software system and past experience, checking, for example, whether the page has abnormal blank areas, whether elements are missing, whether interface elements are misplaced, whether the interface contains garbled characters, and whether the reaction after an interface element is triggered matches expectations. (2) Key-element detection by automated scripts: mainstream user interface automation test frameworks, such as the Selenium framework, mainly determine the execution result of a test case by locating and capturing specific elements on the user interface and judging whether the current element state meets expectations.
Both judgment methods have shortcomings. (1) Manual judgment requires substantial manpower and time, is prone to misjudgment, depends strongly on the tester's skill, and requires the tester to understand the software system. (2) Key-element detection by automated scripts can only verify some of the page elements, so defects may be missed; verifying all page elements would greatly increase the time consumed by automated testing and reduce its efficiency, which is rarely feasible in practice. Moreover, this method can only detect whether a target element exists and whether its state is correct; it cannot judge whether the arrangement, layout, and display style of the elements meet expectations.
Therefore, a more automatic and intelligent classification method for the test results of the user interface is urgently needed.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a method and a device for classifying user interface test results based on a deep belief network, specifically comprising the following technical solutions:
in a first aspect, the present invention provides a method for classifying user interface test results based on a deep belief network, including:
acquiring an execution result picture of a user interface test case and marking the type of the picture to form a training set;
preprocessing the training pictures in the training set to obtain training data;
inputting training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model;
acquiring a user interface test case execution result picture to be classified as a target picture, and preprocessing the target picture;
and inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
Wherein, the labeling the type of the picture comprises:
if the case is successfully executed, marking the corresponding training picture as successful; if the case execution fails, marking the corresponding training picture as the belonged error category;
wherein the error categories include: missing page elements, large-area abnormal blanks, garbled page text, misordered elements, and incorrect element states.
Wherein, the preprocessing of the training pictures in the training set comprises:
and carrying out graying processing and normalization processing on the training pictures in the training set.
Wherein the deep belief network structure comprises:
determining that the deep belief network structure comprises at least four layers of restricted Boltzmann machines (RBMs) and one layer of BP neural network classifier;
processing input data with the at least four layers of restricted Boltzmann machines (RBMs) to extract characteristic information;
inputting the characteristic information extracted by the restricted Boltzmann machines (RBMs) into the BP neural network classifier to obtain a data classification result;
the number of nodes of the input layer is determined by the dimension of the input data, and the number of nodes of the output layer is determined by the number of categories of the input data.
Inputting training data into the deep belief network for training to obtain a deep belief network classification model comprises:
inputting training data and determining the maximum number of layers, the number of nodes in each layer, and the maximum number of iterations of the restricted Boltzmann machines (RBMs);
training the restricted Boltzmann machines (RBMs) layer by layer starting from the first layer;
and if the current layer index is smaller than the maximum number of layers, continuing training the next layer; otherwise, completing unsupervised pre-training.
In a second aspect, the present invention provides a device for classifying user interface test results based on a deep belief network, including:
the acquisition module is used for acquiring the execution result pictures of the user interface test cases and marking the types of the pictures to form a training set;
the data module is used for preprocessing the training pictures in the training set to obtain training data;
the model module is used for inputting training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model;
the target module is used for acquiring a user interface test case execution result picture to be classified as a target picture and preprocessing the target picture;
and the classification module is used for inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
Wherein, the collection module includes:
the marking unit is used for marking the corresponding training picture as successful if the case is successfully executed; if the case execution fails, marking the corresponding training picture as the belonged error category;
wherein the error categories include: missing page elements, large-area abnormal blanks, garbled page text, misordered elements, and incorrect element states.
Wherein the data module comprises:
and the processing unit is used for carrying out graying processing and normalization processing on the training pictures in the training set.
Wherein the model module comprises:
the forming unit is used for determining that the deep belief network structure comprises at least four layers of restricted Boltzmann machines (RBMs) and one layer of BP neural network classifier;
the characteristic unit is used for processing input data with the at least four layers of restricted Boltzmann machines (RBMs) and extracting characteristic information;
the classification unit is used for inputting the characteristic information extracted by the restricted Boltzmann machines (RBMs) into the BP neural network classifier to obtain a data classification result;
the number of nodes of the input layer is determined by the dimension of the input data, and the number of nodes of the output layer is determined by the number of categories of the input data.
Wherein the model module comprises:
the input unit is used for inputting training data and determining the maximum number of layers, the number of nodes in each layer, and the maximum number of iterations of the restricted Boltzmann machines (RBMs);
the training unit is used for training the restricted Boltzmann machines (RBMs) layer by layer starting from the first layer;
and the iteration unit is used for continuing training the next layer if the current layer index is smaller than the maximum number of layers, and otherwise completing unsupervised pre-training.
In a third aspect, the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method for classifying user interface test results based on a deep belief network when executing the computer program.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method for classifying user interface test results based on a deep belief network.
According to the above technical solutions, the method and device for classifying user interface test results based on a deep belief network automatically learn the characteristics of the test case execution result pictures through the deep belief network and classify the errors, without manually extracting picture features. The execution results of user interface test cases are thus classified intelligently, which improves the comprehensiveness and accuracy of user interface test result classification, reduces the dependence on testers and their workload, and improves the efficiency of user interface testing.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a classification method for a user interface test result based on a deep belief network in an embodiment of the present invention.
Fig. 2 is a flow chart of the structure of the deep belief network in the embodiment of the present invention.
FIG. 3 is a schematic diagram of a deep belief network classification model training process in an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a user interface test result classification device based on a deep belief network in an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of an electronic device in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides an embodiment of a classification method of a user interface test result based on a deep belief network, which specifically comprises the following contents in reference to fig. 1:
s101: acquiring an execution result picture of a user interface test case and marking the type of the picture to form a training set;
In this step, the user interface test case execution result pictures are collected and the type of each picture is labeled to form the training set. Specifically: a user interface test automation script captures a screenshot of the execution result of each user interface test case, which serves as the training picture for that test case's execution result.
Labeling the training pictures: according to the expectations of the user interface test, case execution errors are divided into several categories, such as missing page elements, large-area abnormal blanks, garbled page text, misordered elements, and incorrect element states. If a case executes successfully, the corresponding training picture is labeled as successful; if a case fails, the corresponding training picture is labeled with its error category. Finally, every training picture is labeled, and all pictures with their labels are combined into the training set. The number of training samples for each class should be no fewer than 10; the more training samples there are, the more accurate the classification results of the deep belief network model.
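A minimal sketch of assembling the labeled training set described above. The label taxonomy mirrors the error categories in the text; the file names and helper function are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch: pair each screenshot with its label and count samples
# per class. Label names correspond to the error categories in the text.
LABELS = [
    "success",
    "element_missing",
    "abnormal_blank",
    "garbled_text",
    "element_misordered",
    "element_state_error",
]

def build_training_set(results):
    """results: iterable of (screenshot_path, label) pairs produced by the
    UI automation script. Returns the training set and per-class counts."""
    training_set = []
    counts = {label: 0 for label in LABELS}
    for path, label in results:
        if label not in counts:
            raise ValueError(f"unknown label: {label}")
        training_set.append({"picture": path, "label": label})
        counts[label] += 1
    return training_set, counts

# Illustrative file names only.
demo = [
    ("case_001.png", "success"),
    ("case_002.png", "element_missing"),
    ("case_003.png", "success"),
]
train, counts = build_training_set(demo)
```

In practice each class would need at least the 10 samples the text calls for; the count dictionary makes that check straightforward.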
S102: preprocessing the training pictures in the training set to obtain training data;
and preprocessing the training pictures in the training set, including graying and normalization processing, to obtain preprocessed training pictures, namely training data. The method comprises the following specific steps:
1) Gray the training pictures to obtain grayscale pictures. Graying converts a color picture into a grayscale picture whose gray values range from 0 to 255.
2) Normalize the grayscale pictures to obtain the preprocessed training pictures, i.e. the training data. Normalization converts the gray range from 0-255 to 0-1 using min-max normalization:

y_i = (x_i - min(x)) / (max(x) - min(x))

where y_i is the normalized pixel value, x_i is the pixel value in the grayscale picture, min(x) is the minimum pixel value in the grayscale picture, and max(x) is the maximum pixel value in the grayscale picture.
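The preprocessing steps above can be sketched as follows. The luminance weights used for graying (0.299/0.587/0.114) are a common convention and an assumption here; the patent does not specify a particular graying formula.

```python
def to_gray(rgb_pixels):
    """Convert (R, G, B) tuples in 0-255 to grayscale values in 0-255,
    using the common ITU-R BT.601 luminance weights (an assumption)."""
    return [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in rgb_pixels]

def min_max_normalize(gray):
    """Min-max normalization: y_i = (x_i - min(x)) / (max(x) - min(x))."""
    lo, hi = min(gray), max(gray)
    if hi == lo:                      # constant image: avoid division by zero
        return [0.0 for _ in gray]
    return [(x - lo) / (hi - lo) for x in gray]

# White, black, and pure-red pixels as a tiny illustrative "picture".
pixels = [(255, 255, 255), (0, 0, 0), (255, 0, 0)]
normalized = min_max_normalize(to_gray(pixels))
```

A real implementation would read the screenshot with an imaging library and flatten it to a vector before normalizing; the arithmetic is the same.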
S103: inputting training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model;
In this step, the network structure is composed of multiple layers of restricted Boltzmann machines (RBMs) and one layer of back-propagation (BP) neural network. The training data is input into the deep belief network model for training to obtain the deep belief network classification model. The specific steps are as follows:
1) Design the deep belief network structure. The designed structure is shown in fig. 2. The deep belief network consists of m layers of restricted Boltzmann machines (RBMs) and one layer of BP neural network classifier, where m ≥ 4. First, the m RBM layers process the input data and extract feature information; then, the feature information extracted by the RBMs is input into a conventional supervised BP neural network classifier; finally, the data classification result is output. The number of nodes in the input layer is determined by the dimension of the input data, and the number of nodes in the output layer is determined by the number of categories of the input data. In the present embodiment, the input data is two-dimensional, so the number of input nodes is 2.
Each RBM comprises a visible layer and a hidden layer; bidirectional connection weights exist only between visible-layer units and hidden-layer units, with no connections among visible-layer units or among hidden-layer units. In the deep belief network, the first and second layers form the first RBM, the second and third layers form the second RBM, and so on, giving m RBMs in total, where W_i is the weight matrix of the i-th RBM. By stacking multiple RBMs, the deep belief network can extract deep-level features from complex data. Finally, the top two layers form one BP neural network classifier that classifies the extracted features.
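The stacked structure just described can be sketched as plain data: m consecutive weight matrices W_i plus a final classifier layer. The layer sizes below are illustrative assumptions, not values from the patent.

```python
import random

def build_dbn(layer_sizes, n_classes, seed=0):
    """layer_sizes[0] is the visible/input dimension; each consecutive pair
    (layer_sizes[i], layer_sizes[i+1]) defines one RBM with weights W_i."""
    rng = random.Random(seed)
    rbms = []
    for v, h in zip(layer_sizes, layer_sizes[1:]):
        # Each RBM holds a v x h weight matrix plus visible/hidden biases.
        rbms.append({
            "W": [[rng.gauss(0, 0.01) for _ in range(h)] for _ in range(v)],
            "b_visible": [0.0] * v,
            "b_hidden": [0.0] * h,
        })
    # The BP classifier maps the top RBM's hidden layer to the classes.
    classifier = {
        "W": [[rng.gauss(0, 0.01) for _ in range(n_classes)]
              for _ in range(layer_sizes[-1])],
        "b": [0.0] * n_classes,
    }
    return rbms, classifier

# Four RBMs (m = 4) and a 6-class output matching the error taxonomy;
# the 784-dimensional input is a hypothetical flattened 28x28 picture.
rbms, clf = build_dbn([784, 256, 128, 64, 32], n_classes=6)
```

The key structural property is visible here: the hidden layer of RBM i has the same size as the visible layer of RBM i+1, which is what lets features propagate layer by layer.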
2) Train the deep belief network model. A deep belief network is a machine learning model that combines unsupervised and supervised learning. Its training consists of unsupervised layer-by-layer pre-training followed by supervised parameter fine-tuning; the whole process is shown in fig. 3. First, the training data is input and basic parameters are set, including the maximum number m of RBM layers, the number of nodes in each layer, and the maximum number of iterations. The RBMs are then trained layer by layer starting from the first layer: if the current layer index is smaller than m, training proceeds to the next layer; otherwise, unsupervised pre-training is complete. Finally, the parameters are fine-tuned with a global learning algorithm such as the BP algorithm until the accuracy requirement is met, and training ends.
Compared with other machine learning models, the main distinguishing feature of the deep belief network is unsupervised layer-by-layer pre-training, which can learn nonlinear, complex functions mapping data directly from input to output and thus gives the model strong feature-extraction capability. First, a vector is generated in the visible layer of the first RBM and propagated to the hidden layer through the RBM network; the resulting hidden layer then serves as the visible layer of the next RBM. Through this layer-by-layer stacking, the deep structure extracts features from the raw data layer by layer, obtaining deep-level features. Training the RBMs one layer at a time avoids the complex computation of training the whole deep belief network at once, decomposing the deep network into a sequence of shallow neural networks.
Because training each RBM layer only optimizes that layer's parameters rather than the whole network, supervised parameter fine-tuning is performed on the deep belief network after unsupervised pre-training, using the labels corresponding to the training data. Fine-tuning applies a conventional global learning algorithm (such as the BP algorithm or the wake-sleep algorithm) to adjust and optimize the parameters of all layers, further reducing the training error and improving the accuracy of the deep belief network classification model.
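The two-phase control flow described above can be sketched as follows. The inner training routines are stubs standing in for contrastive-divergence and BP updates, which the patent does not detail; only the loop structure (pre-train each layer until the maximum layer count is reached, then fine-tune) follows the text.

```python
def train_dbn(num_layers, max_iterations):
    """Sketch of fig. 3's flow: unsupervised layer-by-layer pre-training
    followed by one supervised fine-tuning pass. Returns an event log."""
    log = []

    def train_rbm(layer, iterations):          # stub for per-layer RBM updates
        log.append(f"pretrain RBM {layer} for {iterations} iterations")

    def fine_tune():                           # stub for global BP fine-tuning
        log.append("supervised fine-tuning with BP")

    layer = 1
    while True:
        train_rbm(layer, max_iterations)
        if layer < num_layers:                 # current layer < max layers:
            layer += 1                         # continue pre-training
        else:                                  # otherwise pre-training is done
            break
    fine_tune()
    return log

history = train_dbn(num_layers=4, max_iterations=100)
```

With m = 4 the log shows four pre-training events followed by exactly one fine-tuning event, matching the order the text prescribes.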
S104: acquiring a user interface test case execution result picture to be classified as a target picture, and preprocessing the target picture;
In this step, the execution result picture of the case to be classified is collected as the target picture, which is then preprocessed by graying and normalization to obtain the preprocessed target picture.
S105: inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
In this step, the preprocessed target picture is input into the trained deep belief network classification model, which outputs the classification type of the target picture; this classification type serves as the conclusion about the execution result of the corresponding user interface test case.
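The final classification step can be sketched as mapping the model's output-layer scores to one of the labeled categories. The softmax-then-argmax convention is an assumption; the patent only states that the model outputs the class type.

```python
import math

LABELS = [
    "success", "element_missing", "abnormal_blank",
    "garbled_text", "element_misordered", "element_state_error",
]

def softmax(scores):
    """Convert raw scores to probabilities (numerically stable form)."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(output_scores):
    """Map the model's raw output-layer scores to an error category."""
    probs = softmax(output_scores)
    return LABELS[probs.index(max(probs))]

# Illustrative raw scores for one target picture (not real model output).
predicted = classify([0.1, 2.5, 0.3, -0.2, 0.0, 0.4])
```

Here the second output unit has the highest activation, so the target picture would be reported as the "element_missing" error category.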
As can be seen from the above description, the deep belief network-based user interface test result classification method of this embodiment forms a training set by collecting user interface test case execution result pictures and labeling the type of each picture; preprocesses the training pictures to obtain training data; inputs the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; collects a test case execution result picture to be classified as the target picture and preprocesses it; and inputs the preprocessed target picture into the deep belief network classification model to obtain its classification type. This effectively improves the comprehensiveness and accuracy of user interface test result classification, reduces the dependence on testers and their workload, and improves the efficiency of user interface testing.
The embodiment of the present invention provides a specific implementation manner of a deep belief network-based user interface test result classification device capable of implementing all contents in the deep belief network-based user interface test result classification method, and referring to fig. 4, the deep belief network-based user interface test result classification device specifically includes the following contents:
the acquisition module 10 is used for acquiring the execution result pictures of the user interface test cases and marking the types of the pictures to form a training set;
the data module 20 is used for preprocessing the training pictures in the training set to obtain training data;
the model module 30 is used for inputting training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model;
the target module 40 is used for acquiring a user interface test case execution result picture to be classified as a target picture and preprocessing the target picture;
and the classification module 50 is configured to input the preprocessed target picture into the deep belief network classification model to obtain a classification type of the target picture.
Wherein, the collection module includes:
the marking unit is used for marking the corresponding training picture as successful if the case is successfully executed; if the case execution fails, marking the corresponding training picture as the belonged error category;
wherein the error categories include: missing page elements, large-area abnormal blanks, garbled page text, misordered elements, and incorrect element states.
Wherein the data module comprises:
and the processing unit is used for carrying out graying processing and normalization processing on the training pictures in the training set.
Wherein the model module comprises:
the forming unit is used for determining that the deep belief network structure comprises at least four layers of restricted Boltzmann machines (RBMs) and one layer of BP neural network classifier;
the characteristic unit is used for processing input data with the at least four layers of restricted Boltzmann machines (RBMs) and extracting characteristic information;
the classification unit is used for inputting the characteristic information extracted by the restricted Boltzmann machines (RBMs) into the BP neural network classifier to obtain a data classification result;
the number of nodes of the input layer is determined by the dimension of the input data, and the number of nodes of the output layer is determined by the number of categories of the input data.
Wherein the model module comprises:
the input unit is used for inputting training data and determining the maximum number of layers, the number of nodes in each layer, and the maximum number of iterations of the restricted Boltzmann machines (RBMs);
the training unit is used for training the restricted Boltzmann machines (RBMs) layer by layer starting from the first layer;
and the iteration unit is used for continuing training the next layer if the current layer index is smaller than the maximum number of layers, and otherwise completing unsupervised pre-training.
As can be seen from the above description, the deep belief network-based user interface test result classification device of this embodiment forms a training set by collecting user interface test case execution result pictures and labeling the type of each picture; preprocesses the training pictures to obtain training data; inputs the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; collects a test case execution result picture to be classified as the target picture and preprocesses it; and inputs the preprocessed target picture into the deep belief network classification model to obtain its classification type. This effectively improves the comprehensiveness and accuracy of user interface test result classification, reduces the dependence on testers and their workload, and improves the efficiency of user interface testing.
The application provides an embodiment of an electronic device for implementing all or part of contents in the deep belief network-based user interface test result classification method, where the electronic device specifically includes the following contents:
a processor (processor), a memory (memory), a communication interface (Communications Interface), and a bus; the processor, the memory, and the communication interface communicate with one another through the bus; the communication interface is used for realizing information transmission between related devices; the electronic device may be a desktop computer, a tablet computer, a mobile terminal, or the like, but the embodiment is not limited thereto. In this embodiment, the electronic device may be implemented with reference to the embodiment of the method for classifying user interface test results based on a deep belief network and the embodiment of the apparatus for classifying user interface test results based on a deep belief network, the contents of which are incorporated herein by reference; repeated details are not described here again.
Fig. 5 is a schematic block diagram of a system configuration of an electronic device 9600 according to an embodiment of the present application. As shown in fig. 5, the electronic device 9600 can include a central processor 9100 and a memory 9140; the memory 9140 is coupled to the central processor 9100. Notably, fig. 5 is exemplary; other types of structures may also be used in addition to or in place of this structure to implement telecommunications functions or other functions.
In one embodiment, the deep belief network based user interface test result classification functionality may be integrated into the central processor 9100. The central processor 9100 may be configured to control as follows:
acquiring an execution result picture of a user interface test case and labeling the type to which the picture belongs to form a training set; preprocessing the training pictures in the training set to obtain training data; inputting the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; acquiring a user interface test case execution result picture to be classified as a target picture, and preprocessing the target picture; and inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
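The preprocessing step restated above (graying and normalizing the result pictures) might look like the following sketch; the ITU-R BT.601 luminance weights and the [0, 1] scaling are common conventions assumed here, not choices mandated by this disclosure.

```python
import numpy as np

def preprocess_screenshot(rgb_image):
    """Gray and normalize a test-result screenshot into a flat vector.

    `rgb_image` is an H x W x 3 array of 8-bit values. The luminance
    weights (0.299, 0.587, 0.114) and the [0, 1] normalization are
    standard choices assumed for illustration.
    """
    gray = rgb_image @ np.array([0.299, 0.587, 0.114])  # graying
    normalized = gray / 255.0                           # normalization to [0, 1]
    return normalized.ravel()                           # flatten to the DBN input dimension

# Example: an 8x8 pure-white screenshot maps to a vector of ones.
white = np.full((8, 8, 3), 255, dtype=np.uint8)
vec = preprocess_screenshot(white)
```
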
As can be seen from the above description, the electronic device provided in the embodiment of the present application forms a training set by collecting user interface test case execution result pictures and labeling the types to which the pictures belong; preprocesses the training pictures in the training set to obtain training data; inputs the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; acquires a user interface test case execution result picture to be classified as a target picture and preprocesses it; and inputs the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture. The comprehensiveness and accuracy of user interface test result classification can thereby be effectively improved, the dependence on testers and their workload are reduced, and the efficiency of user interface testing is improved.
In another embodiment, the deep belief network-based user interface test result classification apparatus may be configured separately from the central processor 9100; for example, it may be configured as a chip connected to the central processor 9100, with the deep belief network-based user interface test result classification function implemented under the control of the central processor 9100.
As shown in fig. 5, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 also does not necessarily include all of the components shown in fig. 5; further, the electronic device 9600 may further include components not shown in fig. 5, which may be referred to in the art.
As shown in fig. 5, the central processor 9100, sometimes referred to as a controller or operation controller, can include a microprocessor or other processor device and/or logic device; the central processor 9100 receives input and controls the operation of each component of the electronic device 9600.
The memory 9140 can be, for example, one or more of a buffer, a flash memory, a hard drive, a removable medium, a volatile memory, a non-volatile memory, or other suitable device. It may store information related to failures as well as programs for processing such information, and the central processor 9100 can execute the programs stored in the memory 9140 to realize information storage or processing.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. Power supply 9170 is used to provide power to electronic device 9600. The display 9160 is used for displaying display objects such as images and characters. The display may be, for example, an LCD display, but is not limited thereto.
The memory 9140 can be a solid state memory, e.g., Read Only Memory (ROM), Random Access Memory (RAM), a SIM card, or the like. There may also be memory that retains information even when powered off, that can be selectively erased, and that can be provided with additional data; an example of such memory is sometimes referred to as an EPROM or the like. The memory 9140 could also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer). The memory 9140 may include an application/function storage portion 9142 used for storing application programs and function programs, or procedures for operating the electronic device 9600, to be executed by the central processor 9100.
The memory 9140 can also include a data store 9143, the data store 9143 being used to store data, such as contacts, digital data, pictures, sounds, and/or any other data used by an electronic device. The driver storage portion 9144 of the memory 9140 may include various drivers for the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, contact book applications, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. The communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and receive audio input from the microphone 9132, thereby implementing ordinary telecommunications functions. The audio processor 9130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100, thereby enabling recording locally through the microphone 9132 and enabling locally stored sounds to be played through the speaker 9131.
An embodiment of the present invention further provides a computer-readable storage medium capable of implementing all the steps in the method for classifying a user interface test result based on a deep belief network in the foregoing embodiments, where the computer-readable storage medium stores thereon a computer program, and when the computer program is executed by a processor, the computer program implements all the steps of the method for classifying a user interface test result based on a deep belief network in the foregoing embodiments, for example, when the processor executes the computer program, the processor implements the following steps:
acquiring an execution result picture of a user interface test case and labeling the type to which the picture belongs to form a training set; preprocessing the training pictures in the training set to obtain training data; inputting the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; acquiring a user interface test case execution result picture to be classified as a target picture, and preprocessing the target picture; and inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
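The final classification step restated above amounts to a forward pass through the trained stack. The sketch below assumes sigmoid RBM features and an argmax over the BP classifier scores; these are illustrative conventions rather than details fixed by this disclosure.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify_picture(x, rbm_weights, bp_weights, bp_bias):
    """Forward a preprocessed target picture through the trained DBN.

    The RBM stack extracts feature information layer by layer; the BP
    classifier maps the final features to a category index.
    """
    h = x
    for W in rbm_weights:              # feature extraction through each RBM layer
        h = sigmoid(h @ W)
    scores = h @ bp_weights + bp_bias  # BP output layer: one score per category
    return int(np.argmax(scores))      # index of the predicted result type

# Demo with random (untrained) weights; shapes are illustrative only.
rng = np.random.default_rng(2)
rbms = [rng.normal(0, 0.1, s) for s in [(16, 12), (12, 8), (8, 6), (6, 4)]]
label = classify_picture(rng.random(16), rbms, rng.normal(0, 0.1, (4, 3)), np.zeros(3))
```
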
As can be seen from the above description, the computer-readable storage medium provided in the embodiment of the present invention forms a training set by collecting user interface test case execution result pictures and labeling the types to which the pictures belong; preprocesses the training pictures in the training set to obtain training data; inputs the training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model; acquires a user interface test case execution result picture to be classified as a target picture and preprocesses it; and inputs the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture. The comprehensiveness and accuracy of user interface test result classification can thereby be effectively improved, the dependence on testers and their workload are reduced, and the efficiency of user interface testing is improved.
Although the present invention provides method steps as described in the examples or flowcharts, more or fewer steps may be included based on routine or non-inventive labor. The order of steps recited in the embodiments is merely one manner of performing the steps in a multitude of orders and does not represent the only order of execution. When an actual apparatus or client product executes, it may execute sequentially or in parallel (e.g., in the context of parallel processors or multi-threaded processing) according to the embodiments or methods shown in the figures.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, apparatus (system) or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention is not limited to any single aspect, nor is it limited to any single embodiment, nor is it limited to any combination and/or permutation of these aspects and/or embodiments. Moreover, each aspect and/or embodiment of the present invention may be utilized alone or in combination with one or more other aspects and/or embodiments thereof.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them; although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention, and shall be construed as falling within the scope of the claims and description of the present invention.

Claims (12)

1. A classification method for user interface test results based on a deep belief network is characterized by comprising the following steps:
acquiring an execution result picture of a user interface test case and marking the type of the picture to form a training set;
preprocessing the training pictures in the training set to obtain training data;
inputting training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model;
acquiring a user interface test case execution result picture to be classified as a target picture, and preprocessing the target picture;
and inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
2. The method for classifying user interface test results based on the deep belief network as recited in claim 1, wherein the labeling the type to which the picture belongs comprises:
if the case is successfully executed, marking the corresponding training picture as successful; if the case execution fails, marking the corresponding training picture with the error category to which it belongs;
wherein the error categories include: missing page elements, large abnormal blank areas on the page, disordered page layout, misordered element sequence, and incorrect element states.
3. The method for classifying user interface test results based on the deep belief network as claimed in claim 1, wherein the preprocessing of the training pictures in the training set comprises:
and carrying out graying processing and normalization processing on the training pictures in the training set.
4. The method for classifying user interface test results based on the deep belief network as claimed in claim 1, wherein classifying according to the deep belief network structure comprises:
determining that the deep belief network structure comprises at least four layers of Restricted Boltzmann Machines (RBMs) and one layer of BP neural network classifier;
performing data processing on input data by using the at least four layers of Restricted Boltzmann Machines (RBMs), and extracting characteristic information;
inputting the characteristic information extracted by the Restricted Boltzmann Machines (RBMs) into the BP neural network classifier to obtain a data classification result;
wherein the number of nodes of the input layer is determined by the dimension of the input data, and the number of nodes of the output layer is determined by the number of classification categories.
5. The deep belief network-based user interface test result classification method of claim 4, wherein the inputting of training data into the deep belief network for training to obtain a deep belief network classification model comprises:
inputting training data and determining the maximum number of layers, the number of nodes of each layer and the maximum number of iterations of the Restricted Boltzmann Machine (RBM);
training the Restricted Boltzmann Machine (RBM) layer by layer from the first layer;
and if the current layer number is less than the maximum number of layers, continuing training the next layer; otherwise, the unsupervised pre-training is completed.
6. A user interface test result classification device based on a deep belief network is characterized by comprising the following components:
the acquisition module is used for acquiring the execution result pictures of the user interface test cases and marking the types of the pictures to form a training set;
the data module is used for preprocessing the training pictures in the training set to obtain training data;
the model module is used for inputting training data into the deep belief network for training according to the deep belief network structure to obtain a deep belief network classification model;
the target module is used for acquiring a user interface test case execution result picture to be classified as a target picture and preprocessing the target picture;
and the classification module is used for inputting the preprocessed target picture into the deep belief network classification model to obtain the classification type of the target picture.
7. The deep belief network-based user interface test result classification apparatus of claim 6, wherein the collection module comprises:
the marking unit is used for marking the corresponding training picture as successful if the case is successfully executed, and marking the corresponding training picture with the error category to which it belongs if the case execution fails;
wherein the error categories include: missing page elements, large abnormal blank areas on the page, disordered page layout, misordered element sequence, and incorrect element states.
8. The deep belief network-based user interface test result classification apparatus of claim 6, wherein the data module comprises:
and the processing unit is used for carrying out graying processing and normalization processing on the training pictures in the training set.
9. The deep belief network-based user interface test result classification apparatus of claim 6, wherein the model module comprises:
the device comprises a forming unit, a calculating unit and a calculating unit, wherein the forming unit is used for determining that the deep confidence network structure comprises at least four layers of limited Boltzmann machines (RBMs) and one layer of BP neural network classifier;
the characteristic unit is used for performing data processing on input data by utilizing at least four layers of limited Boltzmann machines (RBMs) and extracting characteristic information;
the classification unit is used for inputting the characteristic information extracted by the limited Boltzmann machine RBM into the BP neural network classifier to obtain a data classification result;
the number of nodes of the input layer is determined by the dimension of the input data, and the number of nodes of the output layer is determined by the number of categories of the input data.
10. The deep belief network-based user interface test result classification apparatus of claim 9, wherein the model module comprises:
the input unit is used for inputting training data and determining the maximum number of layers, the number of nodes of each layer and the maximum number of iterations of the Restricted Boltzmann Machine (RBM);
the training unit is used for training the Restricted Boltzmann Machine (RBM) layer by layer from the first layer;
and the iteration unit is used for continuing training the next layer if the current layer number is less than the maximum number of layers, and otherwise completing the unsupervised pre-training.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program performs the steps of the method for deep belief network based classification of user interface test results of any of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method for deep belief network based classification of user interface test results according to one of the claims 1 to 5.
CN202110486456.3A 2021-04-30 2021-04-30 User interface test result classification method and device based on deep belief network Pending CN113010442A (en)

Publications (1)

Publication Number Publication Date
CN113010442A 2021-06-22


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108629370A (en) * 2018-04-28 2018-10-09 广东工业大学 A kind of classification and identification algorithm and device based on depth confidence network
CN109118483A (en) * 2018-08-09 2019-01-01 合肥顺为信息科技有限公司 A kind of label quality detection method and device
CN110119455A (en) * 2019-04-23 2019-08-13 西安理工大学 A kind of image classification search method based on convolution depth confidence network
CN110349120A (en) * 2019-05-31 2019-10-18 湖北工业大学 Solar battery sheet detection method of surface flaw



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination