CN114219079A - Feature selection method and device, model training method and device, equipment and medium - Google Patents


Publication number: CN114219079A
Authority: CN (China)
Prior art keywords: feature, features, correlation, task, coefficient
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202111592257.7A
Other languages: Chinese (zh)
Inventors: 彭涵宇 (Peng Hanyu), 方冠华 (Fang Guanhua), 孙明明 (Sun Mingming), 李平 (Li Ping)
Current assignee: Beijing Baidu Netcom Science and Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by: Beijing Baidu Netcom Science and Technology Co Ltd
Priority application: CN202111592257.7A

Classifications

    • G06N 3/04 — Computing arrangements based on biological models; neural networks; architecture, e.g. interconnection topology
    • G06N 3/08 — Computing arrangements based on biological models; neural networks; learning methods
    • G06F 9/5027 — Arrangements for program control; allocation of resources to service a request, the resource being a machine, e.g. CPUs, servers, terminals


Abstract

The present disclosure provides a computer-implemented feature selection method and apparatus, a model training method and apparatus, a device and a medium, which relate to the field of artificial intelligence, and in particular to the technical field of big data. The implementation scheme is as follows: determining a weight coefficient and a correlation coefficient of each of a plurality of features of the object to be processed, wherein the weight coefficient of each feature is determined according to the relevance of the feature and the task to be processed, and the correlation coefficient of each feature is determined according to the correlation between the feature and other features; and determining a selected feature for executing the task to be processed among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features.

Description

Feature selection method and device, model training method and device, equipment and medium
Technical Field
The present disclosure relates to the field of artificial intelligence technologies, in particular to the field of big data technologies, and specifically to a feature selection method, a training method and apparatus for a feature selection model, an electronic device, a computer-readable storage medium, and a computer program product.
Background
Artificial intelligence is the discipline that studies how to make computers simulate certain human thought processes and intelligent behaviors (such as learning, reasoning, thinking, and planning), and it spans both hardware-level and software-level technologies. Artificial intelligence hardware technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, and big data processing; artificial intelligence software technologies mainly include computer vision, speech recognition, natural language processing, machine learning/deep learning, big data processing, and knowledge graph technologies.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, unless otherwise indicated, the problems mentioned in this section should not be considered as having been acknowledged in any prior art.
Disclosure of Invention
The present disclosure provides a method of feature selection, a method of training a feature selection model, an apparatus, an electronic device, a computer-readable storage medium, and a computer program product.
According to an aspect of the present disclosure, there is provided a computer-implemented feature selection method including: determining a weight coefficient and a correlation coefficient of each of a plurality of features of the object to be processed, wherein the weight coefficient of each feature is determined according to the relevance of the feature and the task to be processed, and the correlation coefficient of each feature is determined according to the correlation between the feature and other features; and determining a selected feature for executing the task to be processed among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features.
According to an aspect of the present disclosure, a training method for a feature selection model is provided, where the feature selection model includes a feature extraction module, a weight module, and a correlation module, and is connected with a task processing model. The training method includes: inputting a plurality of features of a sample object into the feature extraction module to obtain representation information of each of the plurality of features; acquiring a weight coefficient of each of the plurality of features using the weight module, based on the representation information of each feature; acquiring a correlation coefficient of each of the plurality of features using the correlation module, based on the representation information of each feature; determining, among the plurality of features, a predicted selected feature to be input to the task processing model, based on the weight coefficient and the correlation coefficient of each feature; and adjusting parameters of the feature selection model based on the weight coefficient and the correlation coefficient of the predicted selected feature and on a feedback result of the predicted selected feature, where the feedback result is determined according to the prediction result of the task processing model on the predicted selected feature.
According to an aspect of the present disclosure, there is provided a feature selection apparatus including: a first determination unit configured to determine a weight coefficient and a correlation coefficient of each of a plurality of features of an object to be processed, wherein the weight coefficient of each feature is determined according to a degree of association of the feature with a task to be processed, and the correlation coefficient of each feature is determined according to a correlation between the feature and other features; and a second determination unit configured to determine a selected feature for executing the task to be processed among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features.
According to an aspect of the present disclosure, a training apparatus for a feature selection model is provided, where the feature selection model includes a feature extraction module, a weight module and a correlation module, the feature selection model is connected with a task processing model, and the training apparatus includes: a first acquisition unit configured to input a plurality of features of the sample object into the feature extraction module to obtain representation information of each of the plurality of features; a second acquisition unit configured to acquire a weight coefficient of each of the plurality of features using the weight module based on the representation information of each of the plurality of features; a third acquisition unit configured to acquire a correlation coefficient of each of the plurality of features using the correlation module based on the representation information of each of the plurality of features; a third determination unit configured to determine a predicted selected feature for input to the task processing model among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features; and an adjusting unit configured to adjust parameters of the feature selection model based on the weight coefficient and the correlation coefficient of the prediction selected feature and a feedback result of the prediction selected feature, wherein the feedback result of the prediction selected feature is determined according to a prediction result of the task processing model on the prediction selected feature.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method described above.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the above-described method.
According to another aspect of the disclosure, a computer program product is provided, comprising a computer program, wherein the computer program realizes the above-described method when executed by a processor.
According to one or more embodiments of the present disclosure, the execution efficiency of the computer device on the task to be processed can be improved on the basis of ensuring the execution effect of the task to be processed.
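The training loop summarized above can be illustrated with a drastically simplified, self-contained sketch. Everything here is a hypothetical stand-in, not the patent's implementation: the "feature selection model" is a vector of gate parameters rather than a neural network, the "task processing model" is a fixed weighted sum, and the feedback result is a squared-error loss. The point is only that differentiable gates let feedback from the task model adjust the selection parameters by gradient descent.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def forward(theta, features, target):
    """Gate each feature with a differentiable indicator, feed the gated
    features to a fixed 'task model' (a plain sum), and return the
    squared-error loss, which plays the role of the feedback result."""
    gates = [sigmoid(th) for th in theta]
    prediction = sum(g * f for g, f in zip(gates, features))
    loss = (prediction - target) ** 2
    return loss, gates

def train(theta, features, target, lr=0.1, steps=200):
    """Adjust the gate parameters from the feedback by gradient descent."""
    for _ in range(steps):
        _, gates = forward(theta, features, target)
        prediction = sum(g * f for g, f in zip(gates, features))
        for i in range(len(theta)):
            dgate = gates[i] * (1.0 - gates[i])            # derivative of sigmoid
            grad = 2.0 * (prediction - target) * features[i] * dgate
            theta[i] -= lr * grad
    return theta

theta = train([0.0, 0.0], features=[2.0, 1.0], target=1.0)
final_loss, final_gates = forward(theta, [2.0, 1.0], 1.0)  # loss shrinks toward 0
```

Because the gates stay differentiable, the same backward pass that trains the task model can also train the selection parameters, which is the property the patent's training method relies on.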
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the embodiments and, together with the description, serve to explain the exemplary implementations of the embodiments. The illustrated embodiments are for purposes of illustration only and do not limit the scope of the claims. Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
FIG. 1 illustrates a schematic diagram of an exemplary system in which various methods described herein may be implemented, according to an embodiment of the present disclosure;
FIG. 2 illustrates a flow diagram of a computer-implemented feature selection method in accordance with embodiments of the present disclosure;
FIG. 3 shows a schematic diagram of a feature selection method according to an embodiment of the present disclosure;
FIG. 4 shows a flow diagram of a method of training a feature selection model, according to an embodiment of the present disclosure;
FIG. 5 shows a block diagram of a feature selection apparatus according to an embodiment of the present disclosure;
FIG. 6 is a block diagram of a training apparatus for a feature selection model according to an embodiment of the present disclosure; and
FIG. 7 illustrates a block diagram of an exemplary electronic device that can be used to implement embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", and the like to describe various elements is not intended to limit the positional relationship, the temporal relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, the elements may be one or more. Furthermore, the term "and/or" as used in this disclosure is intended to encompass any and all possible combinations of the listed items.
In the field of artificial intelligence technology, data processing for an object to be processed often relies on features extracted from that object. For a given object to be processed, a plurality of its features can be acquired from a plurality of data sources; these features describe the attributes of the object from different levels and distinguish it from other objects.
In the related art, a computing device performs data processing on all features of the object to be processed, regardless of the kind of task the object is used to perform. With the development of big data technology, the number of obtainable features of an object to be processed has increased greatly, and processing the object with the method of the related art therefore places a heavy burden on the processing resources of the computing device and results in low processing efficiency.
In view of the above, the present disclosure provides a computer-implemented feature selection method that determines the degree to which execution of a task to be processed depends on each of a plurality of features, based on the weight coefficient and the correlation coefficient of that feature, and thereby screens out, from the features of the object to be processed, the selected features matched with the task. On the basis of ensuring the execution effect of the task, this effectively reduces the amount of data to be processed, lowers the resource overhead on the computing device during execution, relaxes the configuration requirements on the computing device, and effectively improves the execution efficiency of the task to be processed.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Fig. 1 illustrates a schematic diagram of an exemplary system 100 in which various methods and apparatus described herein may be implemented in accordance with embodiments of the present disclosure. Referring to fig. 1, the system 100 includes one or more client devices 101, 102, 103, 104, 105, and 106, a server 120, and one or more communication networks 110 coupling the one or more client devices to the server 120. Client devices 101, 102, 103, 104, 105, and 106 may be configured to execute one or more applications.
In embodiments of the present disclosure, the server 120 may run one or more services or software applications that enable a method of feature selection or a training method of a feature selection model to be performed.
In some embodiments, the server 120 may also provide other services or software applications that may include non-virtual environments and virtual environments. In certain embodiments, these services may be provided as web-based services or cloud services, for example, provided to users of client devices 101, 102, 103, 104, 105, and/or 106 under a software as a service (SaaS) model.
In the configuration shown in fig. 1, server 120 may include one or more components that implement the functions performed by server 120. These components may include software components, hardware components, or a combination thereof, which may be executed by one or more processors. A user operating a client device 101, 102, 103, 104, 105, and/or 106 may, in turn, utilize one or more client applications to interact with the server 120 to take advantage of the services provided by these components. It should be understood that a variety of different system configurations are possible, which may differ from system 100. Accordingly, fig. 1 is one example of a system for implementing the various methods described herein and is not intended to be limiting.
The user may use client devices 101, 102, 103, 104, 105, and/or 106 to retrieve the object to be processed. The client device may provide an interface that enables a user of the client device to interact with the client device. The client device may also output information to the user via the interface. Although fig. 1 depicts only six client devices, those skilled in the art will appreciate that any number of client devices may be supported by the present disclosure.
Client devices 101, 102, 103, 104, 105, and/or 106 may include various types of computer devices, such as portable handheld devices, general purpose computers (such as personal computers and laptops), workstation computers, wearable devices, smart screen devices, self-service terminal devices, service robots, gaming systems, thin clients, various messaging devices, sensors or other sensing devices, and so forth. These computer devices may run various types and versions of software applications and operating systems, such as MICROSOFT Windows, APPLE iOS, UNIX-like operating systems, Linux, or Linux-like operating systems (e.g., GOOGLE Chrome OS), or various mobile operating systems such as MICROSOFT Windows Mobile OS, iOS, Windows Phone, and Android. Portable handheld devices may include cellular telephones, smart phones, tablets, Personal Digital Assistants (PDAs), and the like. Wearable devices may include head-mounted displays (such as smart glasses) and other devices. Gaming systems may include a variety of handheld gaming devices, internet-enabled gaming devices, and the like. The client devices are capable of executing a variety of different applications, such as various Internet-related applications, communication applications (e.g., email applications), and Short Message Service (SMS) applications, and may use a variety of communication protocols.
Network 110 may be any type of network known to those skilled in the art that may support data communications using any of a variety of available protocols, including but not limited to TCP/IP, SNA, IPX, etc. By way of example only, one or more networks 110 may be a Local Area Network (LAN), an ethernet-based network, a token ring, a Wide Area Network (WAN), the internet, a virtual network, a Virtual Private Network (VPN), an intranet, an extranet, a Public Switched Telephone Network (PSTN), an infrared network, a wireless network (e.g., bluetooth, WIFI), and/or any combination of these and/or other networks.
The server 120 may include one or more general purpose computers, special purpose server computers (e.g., PC (personal computer) servers, UNIX servers, mid-end servers), blade servers, mainframe computers, server clusters, or any other suitable arrangement and/or combination. The server 120 may include one or more virtual machines running a virtual operating system, or other computing architecture involving virtualization (e.g., one or more flexible pools of logical storage that may be virtualized to maintain virtual storage for the server). In various embodiments, the server 120 may run one or more services or software applications that provide the functionality described below.
The computing units in server 120 may run one or more operating systems including any of the operating systems described above, as well as any commercially available server operating systems. The server 120 may also run any of a variety of additional server applications and/or middle tier applications, including HTTP servers, FTP servers, CGI servers, JAVA servers, database servers, and the like.
In some implementations, the server 120 may include one or more applications to analyze and consolidate data feeds and/or event updates received from users of the client devices 101, 102, 103, 104, 105, and/or 106. Server 120 may also include one or more applications to display data feeds and/or real-time events via one or more display devices of client devices 101, 102, 103, 104, 105, and/or 106.
In some embodiments, the server 120 may be a server of a distributed system, or a server incorporating a blockchain. The server 120 may also be a cloud server, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology. A cloud server is a host product in a cloud computing service system that addresses the drawbacks of difficult management and weak service scalability in traditional physical host and Virtual Private Server (VPS) services.
The system 100 may also include one or more databases 130. In some embodiments, these databases may be used to store data and other information. For example, one or more of the databases 130 may be used to store information such as audio files and video files. The databases 130 may reside in various locations. For example, a database used by the server 120 may be local to the server 120, or may be remote from the server 120 and communicate with it via a network-based or dedicated connection. The databases 130 may be of different types. In certain embodiments, a database used by the server 120 may be, for example, a relational database. One or more of these databases may store, update, and retrieve data in response to commands.
In some embodiments, one or more of the databases 130 may also be used by applications to store application data. The databases used by the application may be different types of databases, such as key-value stores, object stores, or regular stores supported by a file system.
The system 100 of fig. 1 may be configured and operated in various ways to enable application of the various methods and apparatus described in accordance with the present disclosure.
In the technical solution of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and other handling of the user personal information involved all comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
FIG. 2 shows a flow diagram of a computer-implemented feature selection method. As shown in FIG. 2, the method includes: step S201, determining a weight coefficient and a correlation coefficient of each of a plurality of features of an object to be processed, where the weight coefficient of each feature is determined according to the relevance between the feature and a task to be processed, and the correlation coefficient of each feature is determined according to the correlation between the feature and other features; and step S202, determining, among the plurality of features, the selected features used for executing the task to be processed, based on the weight coefficient and the correlation coefficient of each of the plurality of features.
Therefore, on the basis of ensuring the execution effect of the tasks to be processed, the data processing amount can be effectively reduced, the resource overhead of the computing equipment in the execution process of the tasks to be processed is further reduced, the configuration requirement on the computing equipment is reduced, and the execution efficiency of the tasks to be processed can be effectively improved.
According to some embodiments, the pending task may include any one of: an image processing task; a voice processing task; a natural language processing task; and data clustering tasks. When the task to be processed is an image processing task, the object to be processed is image data; when the task to be processed is a voice processing task, the object to be processed is audio data; and when the task to be processed is a natural language processing task, the object to be processed is text data.
With respect to step S201, the weight coefficient of each feature is determined according to the relevance of the feature to the task to be processed; in other words, the more the execution of the task depends on the feature, the larger the weight coefficient of that feature. For different tasks to be processed, the weight coefficient of the same feature of the object to be processed differs.
The correlation coefficient of each feature is determined based on the correlation between the feature and the other features. According to some embodiments, the correlation coefficient of each feature may be determined based on the correlation of the feature with other features in performing the task to be processed. In other words, the correlation between the same feature and other features of the object to be processed varies as the task to be processed changes. Determining the correlation between features based on the task to be processed allows the obtained correlation coefficients to better match the application requirements of that task.
According to some embodiments, the correlation coefficient of each feature is determined based on a Gaussian copula function, which makes it convenient to establish correlation among the correlation coefficients of different features.
The Gaussian copula function links the marginal distribution functions to the joint distribution function, where each variable of a marginal distribution function is uniformly distributed on the interval [0, 1].
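To make the copula's linking role concrete, the sketch below (purely illustrative; the correlation value 0.8 and the two-feature setting are invented for the example) draws correlated Gaussian pairs and pushes each coordinate through the standard normal distribution function, yielding coefficients whose marginals are uniform on [0, 1] while the dependence between the two coordinates is preserved:

```python
import math
import random

def std_normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def correlated_uniforms(rho: float, n: int, seed: int = 0):
    """Draw n pairs (u1, u2) with uniform [0, 1] marginals whose
    dependence comes from a Gaussian copula with correlation rho."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rng.gauss(0.0, 1.0)
        # 2x2 Cholesky factor of [[1, rho], [rho, 1]] applied to (z1, z2)
        g1 = z1
        g2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
        pairs.append((std_normal_cdf(g1), std_normal_cdf(g2)))
    return pairs

pairs = correlated_uniforms(rho=0.8, n=2000)
```

Each coordinate of each pair lies in [0, 1] and is marginally uniform, yet the two coordinates remain positively dependent, which is exactly the separation of marginals from joint dependence that the copula provides.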
With respect to step S202, according to some embodiments, determining the selected feature for performing the task to be processed among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features may include: for each of a plurality of features, determining an indication value of the feature based on a weight coefficient and a correlation coefficient of the feature; and determining the feature as the selected feature in response to the indicated value of the feature satisfying a preset condition. Thereby, it is possible to easily judge whether or not the feature is the selected feature based on the indication value determined by the weight coefficient and the correlation coefficient.
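As a toy illustration of the two steps just listed, an indication value can be formed per feature and checked against a preset condition. The product scoring rule and the 0.5 threshold below are invented for this example; the patent instead constructs a differentiable indicator function for this purpose:

```python
def select_features(weights, correlations, threshold=0.5):
    """Keep the indices of features whose indication value satisfies the
    preset condition. The indicator here is simply the product of the two
    coefficients, a stand-in for the patent's indicator function."""
    selected = []
    for idx, (alpha, u) in enumerate(zip(weights, correlations)):
        indicator = alpha * u          # hypothetical combination rule
        if indicator >= threshold:     # hypothetical preset condition
            selected.append(idx)
    return selected

chosen = select_features([0.9, 0.2, 0.8], [0.7, 0.9, 0.9], threshold=0.5)
# chosen == [0, 2]: only those features clear the threshold
```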
In one embodiment, an indicator function of the weight coefficient and the correlation coefficient may be constructed for each feature $l$:

$$\mathbb{1}_l = \mathbb{1}_l(\alpha_l, g_l; t)$$

where $\mathbb{1}_l$ is the indicator function of feature $l$, $\alpha_l$ is the weight coefficient of feature $l$, $u_l$ is the correlation coefficient of feature $l$, $g_l$ is an intermediate variable computed from $u_l$, and $t$ is an adjustment (temperature) parameter. (The explicit form of $\mathbb{1}_l$ is given only as an equation image in the source.)

As the value of $t$ approaches 0, $\mathbb{1}_l$ satisfies

$$\lim_{t \to 0} \mathbb{1}_l \in \{0, 1\}.$$

Thus, when $\mathbb{1}_l$ equals 1 as $t$ approaches 0, the indication value meets the preset condition and the feature is determined to be a selected feature; when $\mathbb{1}_l$ equals 0 as $t$ approaches 0, the indication value does not meet the preset condition and the feature is not selected.

As can be seen from the expression of $\mathbb{1}_l$, its value is differentiable with respect to $u_l$ and $\alpha_l$. This makes it possible to derive $\alpha_l$ or $u_l$ by means of a neural network: backward gradient computation can be performed with respect to the predicted value and the feedback value of $\mathbb{1}_l$, thereby training the neural network used to derive $u_l$ or $\alpha_l$.
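The limiting behaviour just described can be illustrated with a binary-concrete-style gate. This exact parametrization (combining logits of the two coefficients inside a tempered sigmoid) is an assumed stand-in for the patent's image-only formula, chosen because it has the stated properties: smooth in both coefficients for t > 0, and collapsing to a hard 0/1 decision as t approaches 0.

```python
import math

def stable_sigmoid(x: float) -> float:
    """Numerically stable logistic function."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def relaxed_gate(alpha: float, u: float, t: float) -> float:
    """Hypothetical indicator: sigmoid((logit(alpha) + logit(u)) / t),
    with alpha as the weight coefficient, u as the correlation
    coefficient, and t as the temperature."""
    logit = lambda p: math.log(p) - math.log(1.0 - p)
    return stable_sigmoid((logit(alpha) + logit(u)) / t)

soft = relaxed_gate(0.9, 0.6, t=1.0)       # strictly between 0 and 1
hard_on = relaxed_gate(0.9, 0.6, t=1e-3)   # tends to 1 as t -> 0
hard_off = relaxed_gate(0.1, 0.2, t=1e-3)  # tends to 0 as t -> 0
```

At moderate temperature the gate gives a smooth, differentiable score suitable for gradient training; at near-zero temperature it behaves as the hard selection decision.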
Further, the above-mentioned correlation coefficient u can be calculated by constructing a gaussian Copula functionlIn the case of a liquid crystal display device, in particular,
ΦR(x1,…,xd)=Cgaussian(L(x1),…,L(xd);R);
ΦR(x1,…,xd)=P(g1≤x1,…,gd≤xd);
Figure BDA00034302011800000910
wherein, CgaussianRepresenting a gaussian Copula function, R being a correlation matrix of the gaussian Copula function, the correlation matrix R being a positive definite matrix, the correlation between the correlation coefficients of the plurality of features being uniquely determined by the gaussian Copula function constructed based on the R correlation matrix.
For calculating the positive definite correlation matrix R, it can be calculated by the following expression:
∑=LLT2I
R=Norm(LLT2I)
wherein Norm (-) is used to map the covariance matrix to the correlation matrix R, and (Norm (Σ))ij=Σij/(Σiijj)1/2And i and j represent the abscissa and ordinate of the matrix, respectively. The matrix L can be calculated by means of a corresponding neural network, which implicitly embodies the correlation, σ, between a plurality of features2Representing the noise level, I is the identity matrix.
By performing a Cholesky decomposition of the matrix Σ, the Cholesky factor V can be obtained, and a Gaussian vector q = (q_1, q_2, …, q_d) can be calculated, where d represents the number of the plurality of features of the object to be processed. Specifically,

q = Vζ

where ζ is a Gaussian noise vector drawn from a standard normal distribution.
By applying the Gaussian Copula function constructed based on the correlation matrix R, the correlation coefficient u_l of feature l can be calculated, i.e.

u_l = Φ_R(q_l)
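Putting the pieces together — Σ = L L^T + σ²I, the Cholesky factor V, the Gaussian vector q = Vζ, and a marginal transform into (0, 1) — a sketch of computing the correlation coefficients could look as follows. The use of the standard normal CDF of the standardized coordinates as the marginal is an assumption; the patent's exact marginal was lost with the original formulas:

```python
import math
import numpy as np

def correlation_coefficients(L, sigma2, rng):
    """Sample correlation coefficients u = (u_1, ..., u_d) through the
    Gaussian copula built from Sigma = L L^T + sigma^2 I."""
    d = L.shape[0]
    Sigma = L @ L.T + sigma2 * np.eye(d)
    V = np.linalg.cholesky(Sigma)        # Cholesky factor of Sigma
    zeta = rng.standard_normal(d)        # standard Gaussian noise vector
    q = V @ zeta                         # correlated Gaussian vector q = V zeta
    # Marginal transform (assumed: standard normal CDF of the
    # standardized coordinates), giving u_l in (0, 1) whose dependence
    # structure is encoded by Sigma.
    std = np.sqrt(np.diag(Sigma))
    return np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in q / std])

rng = np.random.default_rng(0)
L = rng.normal(size=(5, 3))              # stand-in for the learned matrix
u = correlation_coefficients(L, sigma2=0.1, rng=rng)
print(np.all((u > 0) & (u < 1)))         # -> True: every u_l lies in (0, 1)
```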
According to this embodiment, the number of selected features is not fixed, and different objects to be processed may yield different numbers of selected features.
In another embodiment, an indicator function with respect to the weight coefficients and the correlation coefficients may be constructed that yields an indication value z_l^(k) for each feature l, where k is a preset number of selected features to be selected, d represents the number of the plurality of features of the object to be processed, and t is an adjustment parameter. When t approaches 0 and z_l^(k) equals 1, i.e. the indication value satisfies the preset condition, the feature can be determined as a selected feature; when t approaches 0 and z_l^(k) equals 0, i.e. the indication value does not satisfy the preset condition, the feature is not determined as a selected feature.
z_l^(k) can be further represented recursively, with one expression for s ∈ {2, 3, …, k} and a base case for s = 1, where α_l is the weight coefficient of feature l and u_l is the correlation coefficient of feature l. As can be seen, the value of z_l^(k) is differentiable with respect to u_l and α_l. This makes it possible to derive α_l or u_l by means of a neural network; in other words, a backward gradient computation can be performed on the basis of the predicted indication value and the feedback indication value, and the neural network used to derive u_l or α_l can thereby be trained.
According to this embodiment, the number of selected features to be selected is a fixed value k, and for any object to be processed, the k best-matching features among its plurality of features may be selected as the selected features.
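The recursive expressions for the fixed-k indication value were lost in extraction; a simple stand-in with the same limiting behavior — a differentiable mask that hardens, as t → 0, to exactly k ones at the k best-matching features — places a soft threshold between the k-th and (k+1)-th largest scores. This construction is illustrative and not necessarily the patent's recursion:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def relaxed_top_k_mask(scores, k, t):
    """Differentiable surrogate for a top-k 0/1 mask (requires k < len(scores)).

    A soft threshold tau sits halfway between the k-th and (k+1)-th
    largest scores; as t -> 0 the mask hardens to exactly k ones at the
    k best-matching features, while for t > 0 it stays differentiable
    in the scores.
    """
    srt = np.sort(scores)[::-1]
    tau = 0.5 * (srt[k - 1] + srt[k])    # threshold between ranks k and k+1
    return sigmoid((scores - tau) / t)

scores = np.array([0.9, 0.1, 0.7, 0.3])
print(relaxed_top_k_mask(scores, k=2, t=0.01).round(3))   # -> [1. 0. 1. 0.]
```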
FIG. 3 is a schematic diagram illustrating a feature selection method according to an embodiment of the present disclosure. As shown in FIG. 3, the task to be processed has d different features, which may be denoted as {x_1, x_2, x_3, …, x_d}.
With a plurality of fully-connected layers in the weight parameter calculation module, a weight matrix α composed of weight coefficients of each feature can be obtained based on d different features.
The weight matrix α is input into the feature screening module, and an indication matrix z composed of the indication values of each feature can be constructed based on the weight matrix α and the correlation matrix u determined by the Gaussian Copula submodule. If the indication value corresponding to a feature in the indication matrix z is 1, the feature is determined to be a selected feature; if the indication value is 0, the feature is not determined to be a selected feature. Thus, multiplying the d different features by their corresponding indication values automatically screens out, from the features {x_1, x_2, x_3, …, x_d}, the selected features to be input to the task processing model for executing the task to be processed.

The selected features are input into the task processing model to obtain the final processing result y of the task processing model for the object to be processed.
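The forward pass of FIG. 3 — weight module, Gaussian-Copula correlation module, indication matrix, masking, and task head — can be sketched end to end for a single object. All sizes, the tiny networks, and the linear task head below are illustrative assumptions:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
d = 6                                     # number of features x_1 .. x_d
x = rng.normal(size=d)                    # features of one object

# Weight parameter calculation module: fully-connected layers -> alpha.
W1 = rng.normal(size=(d, 8)) / 3
W2 = rng.normal(size=(8, d)) / 3
alpha = 1 / (1 + np.exp(-(np.tanh(x @ W1) @ W2)))   # weights in (0, 1)

# Gaussian-Copula submodule: correlation coefficients u in (0, 1).
Lm = rng.normal(size=(d, 3))              # stand-in for the learned matrix
Sigma = Lm @ Lm.T + 0.1 * np.eye(d)       # Sigma = L L^T + sigma^2 I
q = np.linalg.cholesky(Sigma) @ rng.standard_normal(d)   # q = V zeta
u = np.array([0.5 * (1 + math.erf(q[i] / math.sqrt(2 * Sigma[i, i])))
              for i in range(d)])

# Feature screening module: indication matrix z (hard 0/1 here).
z = (alpha > u).astype(float)

# Multiply features by their indication values and feed the task model.
w_task = rng.normal(size=d)               # toy linear task head
y = float(w_task @ (x * z))               # final processing result y
print(z, round(y, 3))
```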
Fig. 4 is a flowchart illustrating a training method of a feature selection model according to an embodiment of the present disclosure, where the feature selection model includes a feature extraction module, a weight module, and a correlation module, and the feature selection model is connected to a task processing model. As shown in Fig. 4, the training method includes:

Step S401, inputting a plurality of features of a sample object into the feature extraction module to obtain representation information of each of the plurality of features;

Step S402, acquiring a weight coefficient of each of the plurality of features using the weight module, based on the representation information of each of the plurality of features;

Step S403, acquiring a correlation coefficient of each of the plurality of features using the correlation module, based on the representation information of each of the plurality of features;

Step S404, determining, among the plurality of features, a predicted selected feature to be input to the task processing model, based on the weight coefficient and the correlation coefficient of each of the plurality of features; and

Step S405, adjusting parameters of the feature selection model based on the weight coefficient and the correlation coefficient of the predicted selected features and the feedback result of the predicted selected features, where the feedback result of the predicted selected features is determined according to the prediction result of the task processing model on the predicted selected features.
Therefore, the feature selection models for different task processing models can be trained, so that the trained feature selection models can automatically screen out features matched with the task processing models.
With respect to step S403, according to some embodiments, obtaining the correlation coefficient of each of the plurality of features using the correlation module based on the representation information of each of the plurality of features may include: inputting representation information of each of the plurality of features into a correlation module to obtain a correlation matrix; and determining a correlation coefficient of each of the plurality of features using a gaussian Copula function constructed based on the correlation matrix.
The correlation matrix here is the correlation matrix R described above. Since the correlation matrix R used to construct the Gaussian Copula function must be positive definite, the matrix L and the noise level σ² can be obtained through the correlation module. Based on these, the positive definite correlation matrix R is then obtained as R = Norm(L L^T + σ²I).
With respect to step S404, according to some embodiments, determining the predictively selected feature for inputting the task processing model among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features may include: determining a prediction indication value of each of a plurality of features based on the weight coefficient and the correlation coefficient of the feature; and for each of the plurality of features, determining the feature as a predicted selected feature in response to the predicted indication value of the feature satisfying a preset condition.
Specifically, the prediction indication value may take the form of z_l or z_l^(k) described above. Both forms are differentiable with respect to the weight coefficients and the correlation coefficients, so that backward gradient derivation can be performed during the training of the feature selection model.
According to some embodiments, the feedback result is a feedback indicator value for each of the predicted selected features, and wherein adjusting the parameters of the feature selection model based on the weight coefficients and correlation coefficients of the predicted selected features and the feedback result of the predicted selected features may comprise: for each of the features in the predicted selected features, parameters of the feature selection model are adjusted based on the predicted indicator value for the feature and the feedback indicator value for the feature.
Parameter adjustments to the feature extraction module, the weight module, and the correlation module, which generate the weight coefficients and the correlation coefficients, can thereby be implemented based on the differences between the prediction indication values and the feedback indication values of the selected features.
According to some embodiments, the sample object has a label, and the feedback indication value is determined for a prediction result of the predicted selected feature based on the label of the sample object and the task processing model.
It can be understood that the training of the task processing model and that of the feature selection model can be executed simultaneously: the parameters of both models are adjusted backward based on the difference between the labels of the sample objects and the prediction results of the task processing model on the predicted selected features, so that the trained feature selection model matches the task processing model and screens out matching selected features for the task to be processed executed by the task processing model.
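A deliberately tiny sketch of such joint training: a linear task model with squared loss and a per-feature sigmoid indication value z = σ(α/t), trained by manual gradients. The data, loss, and architecture are illustrative assumptions; the point is that the selector parameters α receive gradients through the differentiable indication values, as described above:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
true_w = np.array([2.0, -1.5, 0.0, 0.0, 0.0])   # only features 0, 1 matter
y = X @ true_w + 0.01 * rng.normal(size=n)

alpha = np.zeros(d)      # feature-selection parameters (selection logits)
w = np.zeros(d)          # task-model parameters
t, lr = 0.5, 0.05
for _ in range(2000):
    z = sigmoid(alpha / t)                    # soft indication values
    err = (X * z) @ w - y                     # feedback: prediction error
    grad_w = (X * z).T @ err / n              # backward pass, task model
    grad_z = (X * (err[:, None] * w)).mean(axis=0)
    alpha -= lr * grad_z * z * (1 - z) / t    # backward pass, selector
    w -= lr * grad_w

z_final = sigmoid(alpha / t)
mse = float(np.mean(((X * z_final) @ w - y) ** 2))
print(np.where(z_final > 0.5)[0], round(mse, 4))
```

The informative features end with indication values above 0.5, because reducing the task loss pushes their selection logits up through the chain rule; this is the mechanism by which the feature selection model and the task processing model can be trained simultaneously.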
Fig. 5 shows a block diagram of a feature selection apparatus according to an embodiment of the present disclosure, and as shown in fig. 5, the apparatus 500 includes: a first determining unit 501 configured to determine a weight coefficient and a correlation coefficient of each of a plurality of features of an object to be processed, wherein the weight coefficient of each feature is determined according to a degree of association between the feature and a task to be processed, and the correlation coefficient of each feature is determined according to a correlation between the feature and other features; and a second determining unit 502 configured to determine a selected feature for performing the task to be processed among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features.
According to some embodiments, the correlation coefficient for each feature is determined based on the correlation between the feature and other features in performing the pending task.
According to some embodiments, the correlation coefficient for each feature is determined based on a Gaussian Copula function.
According to some embodiments, the second determination unit comprises: a subunit for determining, for each of a plurality of features, an indication value of the feature based on a weight coefficient and a correlation coefficient of the feature; and a subunit for determining the feature as the selected feature in response to the indicated value of the feature satisfying a preset condition.
According to some embodiments, the pending task comprises any one of: an image processing task; a voice processing task; a natural language processing task; and data clustering tasks.
Fig. 6 is a block diagram illustrating a structure of a training apparatus for a feature selection model according to an embodiment of the present disclosure, wherein the feature selection model includes a feature extraction module, a weighting module, and a correlation module, and the feature selection model is connected to a task processing model, as shown in fig. 6, the apparatus 600 includes: a first obtaining unit 601 configured to input a plurality of features of the sample object into the feature extraction module to obtain representation information of each of the plurality of features; a second obtaining unit 602 configured to obtain a weight coefficient of each of the plurality of features using the weight module based on the representation information of each of the plurality of features; a third obtaining unit 603 configured to obtain a correlation coefficient of each of the plurality of features using the correlation module based on the representation information of each of the plurality of features; a third determining unit 604 configured to determine a predicted selected feature for inputting the task processing model among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features; and an adjusting unit 605 configured to adjust parameters of the feature selection model based on the weight coefficient and the correlation coefficient of the prediction selected feature and a feedback result of the prediction selected feature, wherein the feedback result of the prediction selected feature is determined according to a prediction result of the task processing model on the prediction selected feature.
According to some embodiments, the third determining unit comprises: a sub-unit for determining a prediction indication value of each of a plurality of features based on a weight coefficient and a correlation coefficient of the feature; and a subunit for determining, for each of the plurality of features, the feature as a predicted selected feature in response to the predicted indication value of the feature satisfying a preset condition.
According to some embodiments, the feedback result is a feedback indication value for each of the predicted selected features, and wherein the adjusting unit comprises: a subunit for adjusting, for each of the features in the predicted selected features, a parameter of the feature selection model based on the predicted indicator value for the feature and the feedback indicator value for the feature.
According to some embodiments, the sample object has a label, and the feedback indication value is determined for a prediction result of the predicted selected feature based on the label of the sample object and the task processing model.
According to some embodiments, the third obtaining unit comprises: a subunit for inputting representation information for each of the plurality of features into the correlation module to obtain a correlation matrix; and a subunit for determining a correlation coefficient for each of the plurality of features using a gaussian Copula function constructed based on the correlation matrix.
According to an embodiment of the present disclosure, there is also provided an electronic apparatus including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform any one of the methods described above.
There is also provided, in accordance with an embodiment of the present disclosure, a non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform any one of the methods described above.
There is also provided, in accordance with an embodiment of the present disclosure, a computer program product, including a computer program, wherein the computer program, when executed by a processor, implements any of the methods described above.
Referring to fig. 7, a block diagram of an electronic device 700, which may be a server or a client of the present disclosure and is an example of a hardware device that may be applied to aspects of the present disclosure, will now be described. The electronic device is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 7, the electronic device 700 includes a computing unit 701, which may perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 702 or a computer program loaded from a storage unit 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the electronic device 700 can also be stored. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
A number of components in the electronic device 700 are connected to the I/O interface 705, including: an input unit 706, an output unit 707, a storage unit 708, and a communication unit 709. The input unit 706 may be any type of device capable of inputting information to the electronic device 700, and the input unit 706 may receive input numeric or character information and generate key signal inputs related to user settings and/or function controls of the electronic device, and may include, but is not limited to, a mouse, a keyboard, a touch screen, a track pad, a track ball, a joystick, a microphone, and/or a remote controller. Output unit 707 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer. Storage unit 708 may include, but is not limited to, magnetic or optical disks. The communication unit 709 allows the electronic device 700 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunications networks, and may include, but is not limited to, modems, network cards, infrared communication devices, wireless communication transceivers and/or chipsets, such as bluetooth (TM) devices, 802.11 devices, WiFi devices, WiMax devices, cellular communication devices, and/or the like.
Computing unit 701 may be a variety of general purpose and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 701 performs the respective methods and processes described above, such as the feature selection method or the training method of the feature selection model. For example, in some embodiments, the feature selection method or the training method of the feature selection model may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the above-described feature selection method or training method of the feature selection model may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured by any other suitable means (e.g., by means of firmware) to perform the feature selection method or the training method of the feature selection model.
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), system on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be performed in parallel, sequentially or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the above-described methods, systems, and apparatus are merely exemplary embodiments or examples, and that the scope of the present invention is not limited by these embodiments or examples but only by the claims as issued and their equivalents. Various elements in the embodiments or examples may be omitted or replaced with equivalents thereof. Further, the steps may be performed in an order different from that described in the present disclosure, and various elements in the embodiments or examples may be combined in various ways. It should be noted that, as technology evolves, many of the elements described herein may be replaced with equivalent elements that appear after the present disclosure.

Claims (23)

1. A computer-implemented feature selection method, comprising:
determining a weight coefficient and a correlation coefficient of each of a plurality of features of the object to be processed, wherein the weight coefficient of each feature is determined according to the relevance of the feature and the task to be processed, and the correlation coefficient of each feature is determined according to the correlation between the feature and other features; and
determining a selected feature among the plurality of features for performing the task to be processed based on the weight coefficient and the correlation coefficient of each of the plurality of features.
2. The method of claim 1, wherein the correlation coefficient for each feature is determined based on the correlation of the feature with other features during execution of the pending task.
3. The method according to claim 1 or 2, wherein the correlation coefficient for each feature is determined based on a Gaussian Copula function.
4. The method of any of claims 1-3, wherein the determining, among the plurality of features, the selected feature for performing the task to be processed based on the weight coefficient and the correlation coefficient for each of the plurality of features comprises:
for each of the plurality of features, determining an indication value of the feature based on a weight coefficient and a correlation coefficient of the feature; and
and determining the feature as the selected feature in response to the indicated value of the feature satisfying a preset condition.
5. The method of any one of claims 1 to 4, wherein the pending task comprises any one of:
an image processing task;
a voice processing task;
a natural language processing task; and
a data clustering task.
6. A training method of a feature selection model, wherein the feature selection model comprises a feature extraction module, a weight module and a correlation module, the feature selection model is connected with a task processing model, and the training method comprises the following steps:
inputting a plurality of features of a sample object into the feature extraction module to obtain representation information of each feature in the plurality of features;
acquiring a weight coefficient of each of the plurality of features using the weight module based on the representation information of each of the plurality of features;
obtaining a correlation coefficient for each of the plurality of features using the correlation module based on the representation information for each of the plurality of features;
determining a predicted selected feature for input to the task processing model among the plurality of features based on the weight coefficient and the correlation coefficient for each of the plurality of features; and
and adjusting parameters of the feature selection model based on the weight coefficient and the correlation coefficient of the features selected by prediction and the feedback result of the features selected by prediction, wherein the feedback result of the features selected by prediction is determined according to the prediction result of the task processing model on the features selected by prediction.
7. The method of claim 6, wherein the determining, based on the weight coefficients and correlation coefficients for each of the plurality of features, a predicted selected feature among the plurality of features for input to the task processing model comprises:
determining a prediction indication value of each of the plurality of features based on the weight coefficient and the correlation coefficient of the feature; and
for each feature of the plurality of features, in response to the predicted indication value of the feature satisfying a preset condition, determining the feature as the predicted selected feature.
8. The method of claim 7, wherein the feedback result is a feedback indicator value for each of the predictively selected features, and wherein the adjusting the parameters of the feature selection model based on the weight coefficients and correlation coefficients for the predictively selected features and the feedback result for the predictively selected features comprises:
for each of the features in the predictively selected features, parameters of the feature selection model are adjusted based on the predictive indicator value for that feature and the feedback indicator value for that feature.
9. The method of claim 8, wherein the sample object has a label, the feedback indication value being determined based on the label of the sample object and a prediction result of the task processing model for the predicted selected feature.
10. The method of claim 6, wherein the obtaining, with the relevance module, the relevance coefficient for each of the plurality of features based on the representation information for each of the plurality of features comprises:
inputting representation information of each of the plurality of features into the correlation module to obtain a correlation matrix; and
determining a correlation coefficient for each of the plurality of features using a Gaussian Copula function constructed based on the correlation matrix.
11. A feature selection apparatus comprising:
a first determination unit configured to determine a weight coefficient and a correlation coefficient of each of a plurality of features of an object to be processed, wherein the weight coefficient of each feature is determined according to a degree of association of the feature with a task to be processed, and the correlation coefficient of each feature is determined according to a correlation between the feature and other features; and
a second determination unit configured to determine a selected feature for performing the task to be processed among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features.
12. The apparatus of claim 11, wherein the correlation coefficient for each feature is determined based on a correlation between the feature and other features during execution of the pending task.
13. The apparatus of claim 11 or 12, wherein the correlation coefficient for each feature is determined based on a Gaussian Copula function.
14. The apparatus according to any one of claims 11 to 13, wherein the second determining unit comprises:
a subunit configured to determine, for each of the plurality of features, an indication value of the feature based on the weight coefficient and the correlation coefficient of the feature; and
a subunit configured to determine the feature as the selected feature in response to the indication value of the feature meeting a preset condition.
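The two subunits of claim 14 amount to: form an indication value from each feature's weight coefficient and correlation coefficient, then keep the features whose value meets a preset condition. Both the combining rule and the condition are left open by the claim; the convex mix (relevance reward, redundancy penalty) and the fixed threshold in this sketch are illustrative assumptions only:

```python
import numpy as np

def select_features(weights, corrs, threshold=0.5, alpha=0.5):
    """Hypothetical combining rule: the indication value mixes the weight
    coefficient (task relevance) with (1 - correlation coefficient)
    (a redundancy penalty); the preset condition is a fixed threshold.

    Returns the indication values and the indices of selected features.
    """
    weights = np.asarray(weights, dtype=float)
    corrs = np.asarray(corrs, dtype=float)
    indication = alpha * weights + (1.0 - alpha) * (1.0 - corrs)
    selected = np.flatnonzero(indication >= threshold)
    return indication, selected
```

Under this rule a feature that is highly task-relevant but strongly correlated with others can still be dropped, which is the stated point of combining the two coefficients rather than ranking by weight alone.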
15. The apparatus according to any one of claims 11 to 14, wherein the task to be processed comprises any one of:
an image processing task;
a voice processing task;
a natural language processing task; and
a data clustering task.
16. A training apparatus for a feature selection model, wherein the feature selection model comprises a feature extraction module, a weight module, and a correlation module, the feature selection model is connected to a task processing model, and the training apparatus comprises:
a first acquisition unit configured to input a plurality of features of a sample object into the feature extraction module to obtain representation information of each of the plurality of features;
a second acquisition unit configured to acquire a weight coefficient of each of the plurality of features using the weight module based on the representation information of each of the plurality of features;
a third acquisition unit configured to acquire a correlation coefficient of each of the plurality of features using the correlation module based on the representation information of each of the plurality of features;
a third determination unit configured to determine a predicted selected feature for input to the task processing model among the plurality of features based on the weight coefficient and the correlation coefficient of each of the plurality of features; and
an adjusting unit configured to adjust parameters of the feature selection model based on the weight coefficient and the correlation coefficient of the predicted selected features and a feedback result of the predicted selected features, wherein the feedback result of the predicted selected features is determined according to a prediction result of the task processing model on the predicted selected features.
17. The apparatus of claim 16, wherein the third determination unit comprises:
a subunit configured to determine, for each of the plurality of features, a prediction indication value of the feature based on the weight coefficient and the correlation coefficient of the feature; and
a subunit configured to determine the feature as a predicted selected feature in response to the prediction indication value of the feature meeting a preset condition.
18. The apparatus of claim 17, wherein the feedback result is a feedback indication value of each of the predicted selected features, and wherein the adjusting unit comprises:
a subunit configured to adjust, for each of the predicted selected features, parameters of the feature selection model based on the prediction indication value of the feature and the feedback indication value of the feature.
19. The apparatus of claim 18, wherein the sample object has a label, the feedback indication value being determined based on the label of the sample object and a prediction result of the task processing model for the predicted selected feature.
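The adjustment described in claims 18 and 19 — moving the feature selection model so that its prediction indication values track the label-derived feedback indication values — can be illustrated with a toy gradient step. The squared-error surrogate loss and the caller-supplied per-feature gradients are assumptions for illustration; the claims do not fix a loss or an optimizer:

```python
import numpy as np

def update_step(params, pred_values, fb_values, grads, lr=0.01):
    """Toy parameter adjustment (all names hypothetical).

    For each predicted selected feature i, nudge the model parameters so
    the prediction indication value pred_values[i] moves toward the
    feedback indication value fb_values[i].
    grads[i] holds d(pred_values[i]) / d(params).
    """
    params = np.asarray(params, dtype=float).copy()
    for p, f, g in zip(pred_values, fb_values, grads):
        # Surrogate loss L = 0.5 * (p - f)**2, so dL/dparams = (p - f) * g;
        # a plain gradient-descent step on that loss.
        params -= lr * (p - f) * np.asarray(g, dtype=float)
    return params
```

In a real implementation the gradients would come from backpropagation through the weight and correlation modules; this sketch only makes the direction of the update explicit.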
20. The apparatus of claim 16, wherein the third acquisition unit comprises:
a subunit configured to input representation information of each of the plurality of features into the correlation module to obtain a correlation matrix; and
a subunit configured to determine a correlation coefficient of each of the plurality of features using a Gaussian Copula function constructed based on the correlation matrix.
21. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-10.
22. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-10.
23. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the method of any one of claims 1-10.
CN202111592257.7A 2021-12-23 2021-12-23 Feature selection method and device, model training method and device, equipment and medium Pending CN114219079A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111592257.7A CN114219079A (en) 2021-12-23 2021-12-23 Feature selection method and device, model training method and device, equipment and medium


Publications (1)

Publication Number Publication Date
CN114219079A true CN114219079A (en) 2022-03-22

Family

ID=80705580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111592257.7A Pending CN114219079A (en) 2021-12-23 2021-12-23 Feature selection method and device, model training method and device, equipment and medium

Country Status (1)

Country Link
CN (1) CN114219079A (en)

Similar Documents

Publication Publication Date Title
CN113807440B (en) Method, apparatus, and medium for processing multimodal data using neural networks
CN112579909A (en) Object recommendation method and device, computer equipment and medium
CN114612749B (en) Neural network model training method and device, electronic device and medium
CN112749758A (en) Image processing method, neural network training method, device, equipment and medium
CN114004985B (en) Character interaction detection method, neural network, training method, training equipment and training medium thereof
CN114445667A (en) Image detection method and method for training image detection model
CN114791982B (en) Object recommendation method and device
CN114547252A (en) Text recognition method and device, electronic equipment and medium
CN114443989A (en) Ranking method, training method and device of ranking model, electronic equipment and medium
CN114005452A (en) Method and device for extracting voice features, electronic equipment and storage medium
CN115578501A (en) Image processing method, image processing device, electronic equipment and storage medium
CN115393514A (en) Training method of three-dimensional reconstruction model, three-dimensional reconstruction method, device and equipment
CN114494797A (en) Method and apparatus for training image detection model
CN114443896A (en) Data processing method and method for training a predictive model
CN114429678A (en) Model training method and device, electronic device and medium
CN114219079A (en) Feature selection method and device, model training method and device, equipment and medium
CN114117046B (en) Data processing method, device, electronic equipment and medium
CN114140851B (en) Image detection method and method for training image detection model
CN116842156B (en) Data generation method, device, equipment and medium
CN114118379B (en) Neural network training method, image processing method, device, equipment and medium
CN114067183B (en) Neural network model training method, image processing method, device and equipment
CN114169440A (en) Model training method, data processing method, device, electronic device and medium
CN116306862A (en) Training method, device and medium for text processing neural network
CN114758114A (en) Model updating method, image processing method, device, electronic device and medium
CN114511742A (en) Image recognition method and device, electronic device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination