CN113011494A - Feature processing method, device, equipment and storage medium - Google Patents

Feature processing method, device, equipment and storage medium

Info

Publication number: CN113011494A
Authority: CN (China)
Prior art keywords: feature, target, complexity, target task, processing
Legal status: Granted (Active)
Application number: CN202110291979.2A
Other languages: Chinese (zh)
Other versions: CN113011494B (en)
Inventor: 欧阳利萍
Current Assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Priority date / Filing date: 2021-03-18
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110291979.2A (granted as CN113011494B)
Publication of CN113011494A; application granted; publication of CN113011494B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/15 - Correlation function computation including computation of convolution operations

Abstract

The present disclosure provides a feature processing method, apparatus, device, and storage medium, and relates to the field of computer technology, in particular to deep learning. The scheme is as follows: feature processing information set for a target task is acquired, where the feature processing information includes target feature transformation functions, the combination relations between different target feature transformation functions, and the feature identifiers to be processed by the target feature transformation functions; the computational complexity of the target task is determined from the feature processing information; and the target task is processed according to that computational complexity. The method and apparatus can improve the efficiency of feature processing to a certain extent.

Description

Feature processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for feature processing.
Background
Feature transformation and feature combination are common feature processing methods in machine learning and can effectively improve the effect of applying features. Practitioners typically transform and combine features based on their understanding of the business and the data, and on knowledge gained from data analysis.
Disclosure of Invention
The disclosure provides a feature processing method, apparatus, device, and storage medium.
According to an aspect of the present disclosure, there is provided a feature processing method including:
acquiring feature processing information set for a target task; the feature processing information comprises a target feature transformation function, a combination relation among different target feature transformation functions, and a feature identifier to be processed by the target feature transformation function;
determining the computational complexity of the target task according to the feature processing information of the target task;
and processing the target task according to the calculation complexity.
According to another aspect of the present disclosure, there is provided a feature processing apparatus including:
the feature processing information acquisition module is used for acquiring feature processing information set for the target task; the feature processing information comprises a target feature transformation function, a combination relation among different target feature transformation functions, and a feature identifier to be processed by the target feature transformation function;
the calculation complexity determining module is used for determining the calculation complexity of the target task according to the feature processing information of the target task;
and the target task processing module is used for processing the target task according to the calculation complexity.
According to a third aspect of the present disclosure, there is provided an electronic apparatus comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any of the present disclosure.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of the present disclosure.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method according to any one of the present disclosure.
Techniques according to the present disclosure improve the efficiency of feature processing.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic illustration of a feature processing method according to an embodiment of the disclosure;
FIG. 2 is a schematic diagram of a feature processing method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a feature processing method according to an embodiment of the present disclosure;
FIG. 4 is a schematic block diagram of a feature processing apparatus according to an embodiment of the present disclosure;
fig. 5 is a block diagram of an electronic device for implementing the feature processing method of the embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic diagram of a feature processing method according to an embodiment of the present disclosure. The method is applicable to task processing scenarios, and in particular to processing a task according to its computational complexity. The method of this embodiment may be executed by a feature processing apparatus, which may be implemented in software and/or hardware and may be integrated in an electronic device. Referring to fig. 1, the feature processing method disclosed in this embodiment may include:
s101, acquiring feature processing information set for a target task; the feature processing information includes a target feature transformation function, a combination relation between different target feature transformation functions, and a feature identifier to be processed by the target feature function.
And S102, determining the computational complexity of the target task according to the characteristic processing information of the target task.
And S103, processing the target task according to the calculation complexity.
The target task is a feature processing task; a feature processing task may include, for example, feature transformation, feature combination, and the like, which is not limited in this embodiment.
The feature processing information is the information needed for feature processing and includes the target feature transformation functions, the combination relations between different target feature transformation functions, and the feature identifiers to be processed by the target feature transformation functions. The feature processing information may be read directly from information preset for the target task, or it may be obtained by adjusting historical processing information, in combination with model training, according to the results of processing historical target tasks, so that different feature processing information is determined for different types of target tasks; this is not limited in this embodiment.
The target feature transformation function may be one or more of a missing value filling function, a linear transformation function, a square root function, a normalization function, a discretization function, a weight-of-evidence function, or another non-linear transformation function, which is not limited in this embodiment.
When there is more than one target feature transformation function, combination relations between the different target feature transformation functions may exist; the combination relations may include addition, subtraction, multiplication, division, AND, OR, NOT, string concatenation, and the like, which is not limited in this embodiment. A simple representation of such functions and relations is sketched below.
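As an illustration only, and not the implementation defined by this disclosure, candidate transformation functions and combination relations could be kept in small registries of Python callables; every name below is a hypothetical example:

```python
# Illustrative only: hypothetical registries of transformation functions and
# combination relations; none of these names come from the disclosure.
import numpy as np

TRANSFORM_FUNCTIONS = {
    "fill_missing": lambda x: np.where(np.isnan(x), np.nanmean(x), x),  # missing value filling
    "log":          lambda x: np.log1p(np.clip(x, 0, None)),            # log transformation
    "sqrt":         lambda x: np.sqrt(np.clip(x, 0, None)),             # square root
    "normalize":    lambda x: (x - np.mean(x)) / (np.std(x) + 1e-12),   # normalization
}

COMBINATION_RELATIONS = {
    "add":      lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
    "multiply": lambda a, b: a * b,
    "concat":   lambda a, b: np.char.add(a.astype(str), b.astype(str)),  # string concatenation
}
```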
The feature identifier to be processed by the target feature transformation function is the identifier of the feature to be processed. It can represent the type or the source of the feature to be processed in scenarios where the feature data itself is not visible, which improves the security of data processing. The feature data may be examined preliminarily with secure methods such as statistical analysis in order to confirm the feature identifier corresponding to the feature data, which is not limited in this embodiment.
The computational complexity of the target task is determined from the feature processing information; it is the complexity of the computation performed when executing the target task. The greater the computational complexity, the more computing resources may be consumed and the higher the computational cost. The computational complexity may be determined from the complexity of a single piece of the feature processing information, or jointly from the complexities of multiple pieces of information, which is not limited in this embodiment.
Whether to process the target task, or how to process it, is determined according to the computational complexity. For example, if the computational complexity is lower than a first complexity threshold, the target task is processed; if the computational complexity is higher than a second complexity threshold, it is not processed; and if the complexity lies between the first and second complexity thresholds, the target task is processed, but the complexity calculation and the task processing status are reported back so that the feature processing information of subsequent tasks can be adjusted, as sketched below.
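A minimal sketch of this two-threshold decision; the threshold values and the return labels are assumptions made for illustration, not values specified by the disclosure:

```python
def dispatch_task(complexity: float,
                  first_threshold: float = 100.0,     # assumed value
                  second_threshold: float = 1000.0):  # assumed value
    """Decide how to handle a target task from its computational complexity."""
    if complexity < first_threshold:
        return "process"                # low complexity: execute the target task
    if complexity > second_threshold:
        return "reject"                 # too complex: refuse to execute
    # in between: execute, but report the complexity calculation and the task
    # processing status so that subsequent feature processing information can
    # be adjusted
    return "process_and_report"
```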
According to the technical solution of this embodiment, the computational complexity of the target task is determined from its feature processing information, and the target task is processed according to that complexity. This avoids the situation in which a feature processing method is chosen for new feature data directly from historical experience and its suitability only becomes apparent during processing, which reduces feature processing efficiency. The solution therefore improves both the efficiency and the flexibility of feature processing.
FIG. 2 is a schematic diagram of a feature processing method according to an embodiment of the disclosure. The present embodiment is an alternative proposed on the basis of the above-described embodiments. Referring to fig. 2, the feature processing method provided in this embodiment includes:
s201, acquiring feature processing information set for a target task; the feature processing information includes a target feature transformation function, a combination relation between different target feature transformation functions, and a feature identifier to be processed by the target feature function.
S202, obtaining the transformation complexity of the target feature transformation function, the combination complexity of the combination relation and the feature complexity of the feature identification; and determining the computational complexity of the target task according to the feature processing information, the transformation complexity, the combination complexity and the feature complexity.
And S203, processing the target task according to the calculation complexity.
The transformation complexity is the complexity corresponding to the target feature transformation function, the combination complexity is the complexity corresponding to the combination relation, and the feature complexity is the complexity corresponding to the feature identifier. Different target feature transformation functions, combination relations, and feature identifiers can each be assigned different complexities. For example, when the target task contains target feature transformation functions A, B, and C, functions A, B, and C may correspond to transformation complexity one, transformation complexity two, and transformation complexity three, respectively.
Whether a specific complexity-related relation exists among the target feature transformation functions, combination relations, and feature identifiers of the target task can be determined from the feature processing information. For example, if feature identifier A, target feature transformation function B, and combination relation C appear together in the feature processing information, the final computational complexity may be reduced relative to their individual complexities. If no such specific relation exists, the complexities corresponding to the individual pieces of feature processing information may be combined directly to obtain the total computational complexity, or different weights may be assigned to the different types of complexity, which is not limited in this embodiment. One possible weighted combination is sketched below.
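A sketch of a weighted combination of the three kinds of complexity, assuming hypothetical lookup tables, weights, and key names that are not taken from the disclosure:

```python
# Hypothetical per-item complexity tables and weights (not from the disclosure).
TRANSFORM_COMPLEXITY = {"fill_missing": 1.0, "log": 1.0, "normalize": 2.0}
COMBINATION_COMPLEXITY = {"add": 1.0, "multiply": 1.0, "concat": 3.0}
FEATURE_COMPLEXITY = {"feat_m": 1.0, "feat_n": 2.0}

def computational_complexity(info: dict,
                             w_transform: float = 1.0,
                             w_combine: float = 1.0,
                             w_feature: float = 1.0) -> float:
    """info holds lists under the keys 'functions', 'relations' and 'features'."""
    transform_total = sum(TRANSFORM_COMPLEXITY.get(f, 1.0) for f in info["functions"])
    combine_total = sum(COMBINATION_COMPLEXITY.get(r, 1.0) for r in info["relations"])
    feature_total = sum(FEATURE_COMPLEXITY.get(x, 1.0) for x in info["features"])
    return (w_transform * transform_total
            + w_combine * combine_total
            + w_feature * feature_total)
```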
Optionally, the processing the target task according to the computational complexity includes:
and under the condition that the computational complexity of the target task is greater than a complexity threshold value, refusing to execute the target task, and generating task adjustment prompt information according to the computational complexity of the target task.
The complexity threshold may be an empirical value, or it may be a value obtained through machine learning or the like, which is not limited in this embodiment. When the computational complexity of the target task is greater than the complexity threshold, execution of the target task is refused, that is, feature processing operations such as the specified combination and transformation of the feature data are not performed.
Task adjustment prompt information is generated from the computation process or the result of the computational complexity; it indicates the direction in which the task should be adjusted so that the adjusted target task meets the computational complexity requirement.
For example, the task adjustment prompt information may indicate that there are too many combination relations, too many target feature transformation functions, or too many feature identifiers to be processed. This avoids executing a target task whose computational complexity is too high, which would make the data computation complex, reduce computational efficiency, and push the computational cost beyond what is acceptable; it also facilitates targeted adjustment of the target task and improves the execution efficiency of subsequent target tasks. Generating such prompts is sketched below.
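A hedged sketch of generating the prompt information from simple counts in the feature processing information; the limits and the message texts are illustrative assumptions:

```python
def adjustment_prompts(info: dict,
                       max_functions: int = 5,    # assumed limits
                       max_relations: int = 4,
                       max_features: int = 10) -> list:
    """Generate task adjustment prompt information for a rejected target task."""
    prompts = []
    if len(info["functions"]) > max_functions:
        prompts.append("too many target feature transformation functions")
    if len(info["relations"]) > max_relations:
        prompts.append("too many combination relations")
    if len(info["features"]) > max_features:
        prompts.append("too many feature identifiers to be processed")
    return prompts
```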
Optionally, the processing the target task according to the computational complexity includes:
and under the condition that the calculation complexity of the target task is smaller than a complexity threshold, processing the feature data associated with the feature identifier according to the combination relation between the target feature transformation function and different target feature transformation functions so as to execute the target task.
When the computational complexity of the target task is smaller than the complexity threshold, the target feature transformation functions may first be combined, according to the combination relations between the different target feature transformation functions and the specific settings, and the combined function may then be applied to the feature data associated with the feature identifiers; alternatively, each target feature transformation function may first be applied to the feature data associated with its feature identifier, and the transformed data may then be combined according to the combination relations. Neither order is limited in this embodiment; both are sketched below.
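An illustrative sketch of the two execution orders; f1 and f2 stand for transformation callables and combine for a combination-relation callable (for example, entries of the hypothetical registries sketched earlier), none of which are names defined by the disclosure:

```python
def transform_then_combine(xm, xn, f1, f2, combine):
    """Apply each target feature transformation function first, then combine."""
    return combine(f1(xm), f2(xn))

def combine_then_transform(xm, xn, f1, combine):
    """Combine the raw feature data first, then apply a single transformation."""
    return f1(combine(xm, xn))
```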
There may be one or more pieces of feature data associated with a feature identifier; when there are several, they may be processed in batches or individually according to a preset order, which is not limited in this embodiment.
When the computational complexity is less than the complexity threshold, the target task is executed; this avoids the reduction in feature processing efficiency and the increase in computational cost caused by excessive computational complexity, thereby improving the usability of the feature processing method for the target task.
According to the technical solution of this embodiment, the computational complexity of the target task is determined from the feature processing information together with the transformation complexity, the combination complexity, and the feature complexity, which improves the accuracy of the computational complexity and hence the accuracy of the subsequent processing of the target task.
FIG. 3 is a schematic diagram of a feature processing method according to an embodiment of the disclosure. The present embodiment is an alternative proposed on the basis of the above-described embodiments. Referring to fig. 3, the feature processing method provided in this embodiment includes:
s301, obtaining a target feature transformation function selected for the target task from candidate feature transformation functions provided by the feature processing page.
S302, obtaining, through the feature processing page, the feature identifier to be processed that is set for the target feature transformation function and the combination relation that is set for different target feature transformation functions.
S303, determining the computational complexity of the target task according to the feature processing information of the target task.
And S304, processing the target task according to the calculation complexity.
The feature processing page is used by a user to select the relevant feature processing information for a target task; it may be a web interface, an application program interface, or the like, which is not limited in this embodiment. One or more feature transformation functions may be selected directly from all candidate feature transformation functions, or selected per function category, as the target feature transformation functions, which is not limited in this embodiment. For example, if the candidate feature transformation functions are a log transformation function, a square root function, an equal-frequency binning function, a missing value filling function, and the like, the log transformation function and the square root function may be selected as the target feature transformation functions.
The feature identifiers to be processed that are set for the target feature transformation functions are acquired from the feature processing page; each feature identifier corresponds to one kind of feature data to be processed, and the feature data to be processed is determined by adding the feature identifier.
The combination relations set for different target feature transformation functions are acquired through the feature processing page. One way to determine them is to select, from candidate relations, the combination relation for a specified pair of target feature transformation functions; for example, if the candidate relations include addition, subtraction, multiplication, division, AND, OR, NOT, string concatenation, and the like, the combination relation of target feature transformation function A and target feature transformation function B may be selected as OR. Note that a single target feature transformation function may have no combination relation or several; for example, target feature transformation function A may be string-concatenated with target feature transformation function B and, at the same time, concatenated with target feature transformation function C. An example of the resulting feature processing information is sketched below.
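A hypothetical example, for illustration only, of what the assembled feature processing information might look like; the key names and the values are assumptions rather than a format defined by the disclosure:

```python
# Hypothetical assembled feature processing information; key names are assumptions.
feature_processing_info = {
    "functions": ["log", "sqrt"],       # target feature transformation functions
    "relations": ["add"],               # combination relation between the two functions
    "features": ["feat_m", "feat_n"],   # feature identifiers to be processed
}
```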
Optionally, after the feature processing information set for the target task is acquired, the method further includes:
selecting a target serialization language specification from at least two candidate serialization language specifications;
and on the basis of the target serialization language specification, carrying out serialization processing on the target feature transformation function, the feature identifier and the combination relation determined by the feature processing page to obtain a serialization expression of a target task, and executing the target task on the basis of the serialization expression of the target task.
The serialization language specification is used to serialize the feature processing information into a semantic expression, so that the target task can subsequently be executed from the serialized information, making the feature processing applicable on a machine.
The candidate serialization language specifications may include JavaScript Object Notation (JSON), Extensible Markup Language (XML), other serialization methods, and the like.
A target serialization language specification is selected from the candidate serialization language specifications, and the target feature transformation functions, the feature identifiers, and the combination relations determined through the feature processing page are serialized on the basis of that specification to obtain the serialized expression of the target task. The serialized expression represents the relations between the pieces of feature processing information in a standardized manner.
For example, the serialized expression may be:

Xj = f1(Xm) Δ f2(Xn)  or  Xj = f1(Xm Δ Xn),

where Xj is the new feature produced from Xm and Xn, Xm and Xn are the features m and n to be processed in the target task, f1(X) and f2(X) are target feature transformation functions, and Δ is the combination relation.
Different features may thus be combined first and then transformed by a feature transformation function, or transformed first and then combined; the concrete serialized expression depends on the language constructs of the chosen target serialization language specification. A sketch using JSON as the target specification is given below.
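A minimal sketch of serializing the expression Xj = f1(Xm) Δ f2(Xn) under a JSON target serialization language specification; the schema (key names and values) is an assumption for illustration and is not the format defined by this disclosure:

```python
import json

# Hypothetical JSON schema for the serialized expression Xj = f1(Xm) Δ f2(Xn).
expression = {
    "output": "Xj",
    "combine": "add",                               # the combination relation Δ
    "operands": [
        {"function": "log", "feature": "Xm"},       # f1 applied to feature m
        {"function": "sqrt", "feature": "Xn"},      # f2 applied to feature n
    ],
}

serialized = json.dumps(expression)   # the serialized expression of the target task
restored = json.loads(serialized)     # recovered later to execute the target task
```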
By generating the serialized expression of the target task, the feature processing information can be communicated and stored in a standardized form and applied to the execution of the target task, which makes it convenient to apply the feature processing method to subsequent model training and then to products and services, so that the features deliver greater value.
According to the technical solution of this embodiment, the target feature transformation functions, the feature identifiers, and the combination relations set for different target feature transformation functions are obtained through the feature processing page, so that no code needs to be written in order to generate the feature processing information, which improves the flexibility, accuracy, and usability of obtaining the feature processing information.
Fig. 4 is a schematic structural diagram of a feature processing apparatus according to an embodiment of the present disclosure. Referring to fig. 4, a feature processing apparatus 400 provided in an embodiment of the present disclosure may include:
a feature processing information obtaining module 401, configured to obtain feature processing information set for a target task; the feature processing information comprises a target feature transformation function, a combination relation among different target feature transformation functions, and a feature identifier to be processed by the target feature transformation function;
a computation complexity determining module 402, configured to determine the computation complexity of the target task according to the feature processing information of the target task;
and a target task processing module 403, configured to process the target task according to the computation complexity.
According to the technical solution of this embodiment, the computational complexity of the target task is determined from its feature processing information, and the target task is processed according to that complexity. This avoids the situation in which a feature processing method is chosen for new feature data directly from historical experience and its suitability only becomes apparent during processing, which reduces feature processing efficiency. The solution therefore improves both the efficiency and the flexibility of feature processing.
Optionally, the computation complexity determining module includes:
a complexity obtaining unit, configured to obtain a transformation complexity of the target feature transformation function, a combination complexity of the combination relationship, and a feature complexity of the feature identifier;
and the calculation complexity determining unit is used for determining the calculation complexity of the target task according to the feature processing information, the transformation complexity, the combination complexity and the feature complexity.
Optionally, the target task processing module includes:
and the target task rejection execution unit is used for rejecting the execution of the target task under the condition that the computational complexity of the target task is greater than a complexity threshold value, and generating task adjustment prompt information according to the computational complexity of the target task.
Optionally, the target task processing module includes:
and the feature data processing unit is used for processing the feature data associated with the feature identifier according to the combination relation between the target feature transformation function and different target feature transformation functions under the condition that the calculation complexity of the target task is smaller than a complexity threshold value so as to execute the target task.
Optionally, the feature processing information obtaining module includes:
the target function obtaining unit is used for obtaining a target feature transformation function selected for the target task from candidate feature transformation functions provided by a feature processing page;
and the identifier relation acquisition unit is used for acquiring, through the feature processing page, the feature identifier to be processed that is set for the target feature transformation function and the combination relation set for different target feature transformation functions.
Optionally, the apparatus further includes:
a target specification selection module for selecting a target serialization language specification from at least two candidate serialization language specifications after the feature processing information acquisition module;
and the serialization expression module is used for carrying out serialization processing on the target feature transformation function, the feature identifier and the combination relation determined by the feature processing page based on the target serialization language specification to obtain a serialization expression of a target task, and is used for executing the target task based on the serialization expression of the target task.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 5 illustrates a schematic block diagram of an example electronic device 500 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 5, the device 500 includes a computing unit 501, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 502 or a computer program loaded from a storage unit 508 into a random access memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the device 500 can also be stored. The computing unit 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
A number of components in the device 500 are connected to the I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, or the like; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508 such as a magnetic disk, an optical disk, or the like; and a communication unit 509 such as a network card, a modem, a wireless communication transceiver, or the like. The communication unit 509 allows the device 500 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 501 may be any of a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 501 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 501 executes the respective methods and processes described above, such as the feature processing method. For example, in some embodiments, the feature processing method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into the RAM 503 and executed by the computing unit 501, one or more steps of the feature processing method described above may be performed. Alternatively, in other embodiments, the computing unit 501 may be configured to perform the feature processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (15)

1. A method of feature processing, comprising:
acquiring feature processing information set for a target task; the feature processing information comprises a target feature transformation function, a combination relation among different target feature transformation functions, and a feature identifier to be processed by the target feature transformation function;
determining the computational complexity of the target task according to the feature processing information of the target task;
and processing the target task according to the calculation complexity.
2. The method of claim 1, wherein the determining the computational complexity of the target task from the feature processing information of the target task comprises:
acquiring the transformation complexity of the target feature transformation function, the combination complexity of the combination relation and the feature complexity of the feature identifier;
and determining the computational complexity of the target task according to the feature processing information, the transformation complexity, the combination complexity and the feature complexity.
3. The method of claim 1, wherein the processing the target task according to the computational complexity comprises:
and under the condition that the computational complexity of the target task is greater than a complexity threshold value, refusing to execute the target task, and generating task adjustment prompt information according to the computational complexity of the target task.
4. The method of claim 1, wherein the processing the target task according to the computational complexity comprises:
and under the condition that the calculation complexity of the target task is smaller than a complexity threshold, processing the feature data associated with the feature identifier according to the combination relation between the target feature transformation function and different target feature transformation functions so as to execute the target task.
5. The method according to claim 1, wherein the acquiring feature processing information set for a target task includes:
obtaining a target feature transformation function selected for the target task from candidate feature transformation functions provided by a feature processing page;
and acquiring, through the feature processing page, the feature identifier to be processed that is set for the target feature transformation function and the combination relation set for different target feature transformation functions.
6. The method according to claim 5, further comprising, after the obtaining of the feature processing information set for the target task:
selecting a target serialization language specification from at least two candidate serialization language specifications;
and on the basis of the target serialization language specification, carrying out serialization processing on the target feature transformation function, the feature identifier and the combination relation determined by the feature processing page to obtain a serialization expression of a target task, and executing the target task on the basis of the serialization expression of the target task.
7. A feature processing apparatus comprising:
the feature processing information acquisition module is used for acquiring feature processing information set for the target task; the feature processing information comprises a target feature transformation function, a combination relation among different target feature transformation functions, and a feature identifier to be processed by the target feature transformation function;
the calculation complexity determining module is used for determining the calculation complexity of the target task according to the feature processing information of the target task;
and the target task processing module is used for processing the target task according to the calculation complexity.
8. The apparatus of claim 7, wherein the computational complexity determination module comprises:
a complexity obtaining unit, configured to obtain a transformation complexity of the target feature transformation function, a combination complexity of the combination relationship, and a feature complexity of the feature identifier;
and the calculation complexity determining unit is used for determining the calculation complexity of the target task according to the feature processing information, the transformation complexity, the combination complexity and the feature complexity.
9. The apparatus of claim 7, wherein the target task processing module comprises:
and the target task rejection execution unit is used for rejecting the execution of the target task under the condition that the computational complexity of the target task is greater than a complexity threshold value, and generating task adjustment prompt information according to the computational complexity of the target task.
10. The apparatus of claim 7, wherein the target task processing module comprises:
and the feature data processing unit is used for processing the feature data associated with the feature identifier according to the combination relation between the target feature transformation function and different target feature transformation functions under the condition that the calculation complexity of the target task is smaller than a complexity threshold value so as to execute the target task.
11. The apparatus of claim 7, wherein the feature processing information acquisition module comprises:
the target function obtaining unit is used for obtaining a target feature transformation function selected for the target task from candidate feature transformation functions provided by a feature processing page;
and the identifier relation acquisition unit is used for acquiring, through the feature processing page, the feature identifier to be processed that is set for the target feature transformation function and the combination relation set for different target feature transformation functions.
12. The apparatus of claim 11, further comprising:
a target specification selection module for selecting a target serialization language specification from at least two candidate serialization language specifications after the feature processing information acquisition module;
and the serialization expression module is used for carrying out serialization processing on the target feature transformation function, the feature identifier and the combination relation determined by the feature processing page based on the target serialization language specification to obtain a serialization expression of a target task, and is used for executing the target task based on the serialization expression of the target task.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
14. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-6.
15. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-6.
CN202110291979.2A (priority date 2021-03-18, filing date 2021-03-18): Feature processing method, device, equipment and storage medium. Active. Granted as CN113011494B (en).

Priority Applications (1)

CN202110291979.2A (granted as CN113011494B): Feature processing method, device, equipment and storage medium

Publications (2)

CN113011494A: published 2021-06-22
CN113011494B: published 2024-02-27

Family

ID=76409749

Family Applications (1)

CN202110291979.2A: Feature processing method, device, equipment and storage medium (Active; granted as CN113011494B)

Country Status (1)

CN: CN113011494B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101483568B1 (en) * 2013-07-16 2015-01-16 한국항공대학교산학협력단 Low-complexity Cost Function Calculation for Multiple-input Multiple-output systems
CN106556818A (en) * 2016-11-18 2017-04-05 辽宁工业大学 A kind of low computation complexity bernoulli wave filter for monotrack
US20190091859A1 (en) * 2017-09-26 2019-03-28 Siemens Aktiengesellschaft Method and system for automatic robot control policy generation via cad-based deep inverse reinforcement learning
US20190130007A1 (en) * 2017-10-31 2019-05-02 International Business Machines Corporation Facilitating automatic extract, transform, load (etl) processing
US20200184376A1 (en) * 2018-12-05 2020-06-11 The Board Of Trustees Of The University Of Illinois Holistic Optimization for Accelerating Iterative Machine Learning
WO2021004478A1 (en) * 2019-07-10 2021-01-14 华为技术有限公司 Distributed ai system
CN110704650A (en) * 2019-09-29 2020-01-17 携程计算机技术(上海)有限公司 OTA picture tag identification method, electronic device and medium
CN111753761A (en) * 2020-06-28 2020-10-09 北京百度网讯科技有限公司 Model generation method and device, electronic equipment and storage medium
CN111724081A (en) * 2020-06-30 2020-09-29 北京来也网络科技有限公司 RPA flow complexity determining method, device, equipment and storage medium

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
MYKHAILO RAKUSHEV et al.: "Numerical Method of Integration on the Basis of Multidimensional Differential-Taylor Transformations", 2019 IEEE International Scientific-Practical Conference Problems of Infocommunications, Science and Technology (PIC S&T)
巴斌; 胡捍英; 郑娜娥; 任修坤: "Root-finding time delay estimation algorithm based on approximate noise subspace", Journal of Terahertz Science and Electronic Information Technology, no. 04
徐超; 葛红美; 何炎祥: "Adaptive target code generation method for system dynamic reliability", Application Research of Computers, no. 02
林杰; 李如意: "Image recognition processing based on deep learning", Network Security Technology & Application, no. 11
郑伟龙; 石振锋; 吕宝粮: "Building cross-subject EEG-based emotion models with heterogeneous transfer learning", Chinese Journal of Computers, no. 02

Also Published As

CN113011494B (en): published 2024-02-27


Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant