CN113743618A - Time series data processing method and device, readable medium and electronic equipment - Google Patents


Info

Publication number
CN113743618A
CN113743618A
Authority
CN
China
Prior art keywords
time series
series data
data
time
tag information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111033394.7A
Other languages
Chinese (zh)
Inventor
任磊
莫廷钰
成学军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority to CN202111033394.7A
Publication of CN113743618A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods


Abstract

The invention discloses a time series data processing method and device, a readable medium, and an electronic device. The method comprises the following steps: acquiring first time series data and second time series data, wherein the first time series data comprises preset tag information and the second time series data is unlabeled time series data; adding pseudo tag information to the second time series data to obtain third time series data comprising the pseudo tag information; and performing model training based on the first time series data and the third time series data to obtain a time series classification model. In summary, the invention adds pseudo tag information to the unlabeled time series data and then uses both the labeled and the pseudo-labeled time series data as sample data for model training. This diversifies the time series data set, gives the trained time series classification model stronger generalization capability, and improves the robustness and classification accuracy of the model.

Description

Time series data processing method and device, readable medium and electronic equipment
Technical Field
The invention relates to the technical field of computers, and in particular to a time series data processing method and device, a readable medium, and an electronic device.
Background
Deep Learning (DL) is a research direction in the field of Machine Learning (ML) that learns the intrinsic rules and representation levels of sample data in order to mine and solve data-related problems. For example, deep learning can effectively address time series problems such as time series classification, time series prediction, and time series anomaly detection. However, solving these problems with deep learning requires enough data samples of sufficient quality: if the samples are insufficient and/or the data is noisy, the model obtained by deep learning will be unstable. Moreover, most of the raw data collected by sensors consists of unlabeled samples, whose value is difficult to exploit fully. In addition, the unlabeled samples may follow a different distribution from the labeled samples, so directly adopting a pseudo-label scheme may yield data unsuitable for the relevant application scenarios. In summary, obtaining enough effective data samples is one of the difficulties in the field of deep learning; without them, the generalization capability of the trained classification model is difficult to improve.
Disclosure of Invention
The invention provides a time series data processing method and device, a readable medium, and an electronic device, which diversify the time series data set, give the trained time series classification model stronger generalization capability, and improve the robustness and classification accuracy of the model.
In a first aspect, the present invention provides a time series data processing method, including:
acquiring first time series data and second time series data, wherein the first time series data comprise preset tag information, and the second time series data are unlabeled time series data;
adding pseudo tag information to the second time series data to obtain third time series data including pseudo tag information;
and performing model training based on the first time series data and the third time series data to obtain a time series classification model.
In a second aspect, the present invention provides a time-series data processing apparatus comprising:
the data acquisition module is used for acquiring first time series data and second time series data, wherein the first time series data comprise preset tag information, and the second time series data are non-tag time series data;
the tag processing module is used for adding pseudo tag information to the second time series data to obtain third time series data comprising the pseudo tag information;
and the first training module is used for carrying out model training on the basis of the first time sequence data and the third time sequence data so as to obtain a time sequence classification model.
In a third aspect, the invention provides a readable medium comprising executable instructions which, when executed by a processor of an electronic device, cause the electronic device to perform the method according to the first aspect.
In a fourth aspect, the present invention provides an electronic device comprising: a processor, a memory, and a bus; the memory is used for storing execution instructions, the processor is connected with the memory through the bus, and when the electronic device runs, the processor executes the execution instructions stored in the memory to enable the processor to execute the method according to the first aspect.
The invention provides a time series data processing method and device, a readable medium, and an electronic device. First time series data and second time series data are acquired, wherein the first time series data comprises preset tag information and the second time series data is unlabeled time series data; pseudo tag information is added to the second time series data to obtain third time series data comprising the pseudo tag information; and model training is performed based on the first time series data and the third time series data to obtain a time series classification model. In summary, the invention adds pseudo tag information to the unlabeled time series data and then uses both the labeled and the pseudo-labeled time series data as sample data for model training. This diversifies the time series data set, gives the trained time series classification model stronger generalization capability, and improves the robustness and classification accuracy of the model.
Drawings
In order to illustrate the embodiments of the present specification or the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some of the embodiments described in this specification; a person skilled in the art could obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart illustrating a time series data processing method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a time-series data processing apparatus according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present disclosure clearer, the technical solutions of the present disclosure are described clearly and completely below with reference to specific embodiments and the accompanying drawings. It should be understood that the described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in this specification without creative effort fall within the protection scope of this specification.
Fig. 1 is a schematic flow chart of a time series data processing method according to an embodiment of the present invention. As shown in fig. 1, an embodiment of the present invention provides a time series data processing method, including:
step 101, acquiring first time series data and second time series data, wherein the first time series data comprises preset tag information, and the second time series data is untagged time series data.
For example, the first time series data and the second time series data may be time series data collected by a sensor of any device, or log data generated by a running application program. In this step, the first time series data and the second time series data may be acquired in real time from a sensor or from an electronic device running an application, or may be read from cloud storage and/or a local storage server and/or a database.
The first time series data is time series data comprising preset tag information. The preset tag information may be labeled manually, or labeled automatically, according to the data characteristics, by the acquisition device or by an electronic device responsible for tag processing during data collection. For example, X_i = {(x_1, 0), (x_2, 1), (x_3, 1), (x_4, 0), …, (x_n, 0)}, where X_i represents the first time series data, n represents the number of data elements in the first time series data, x_1 to x_n represent the data elements, and 0 and 1 represent the preset tag information (e.g., 0 is a device-failure tag and 1 is a device-health tag). The second time series data is unlabeled time series data, e.g., X_j = {(x_1), (x_2), (x_3), (x_4), …, (x_n)}, where X_j represents the second time series data, n represents the number of data elements, and x_1 to x_n represent the data elements.
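The two data forms above can be sketched in a few lines of Python (the concrete values and variable names are ours, for illustration only): the first time series pairs each element with a preset 0/1 tag, while the second carries unlabeled elements only.

```python
import numpy as np

# Minimal sketch of the two data forms described in the text (values are
# illustrative, not from the patent): labeled first data vs. unlabeled second.
first_data = [(0.91, 0), (0.12, 1), (0.35, 1), (0.88, 0)]  # (element, tag)
second_data = [0.44, 0.67, 0.05]                            # elements only

elements = np.array([x for x, _ in first_data])  # data elements x_1..x_n
tags = np.array([t for _, t in first_data])      # preset tag information
```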
And 102, adding pseudo tag information to the second time series data to obtain third time series data comprising the pseudo tag information.
In this step, pseudo tag information is added for each data element in the second time series data. A pseudo label is an approximate label given to unlabeled data based on data whose labels are known or predicted. For example, the feature similarity between each labeled data element and each unlabeled data element may be computed; when the similarity between a labeled element and an unlabeled element exceeds a preset value, the label of the labeled element is used as the approximate label of the unlabeled element. For example (assuming a similarity preset value of 0.5):
[Equation image in the original publication: a worked example of assigning labels by feature similarity.]
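The similarity-based assignment described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the cosine similarity metric and the function name are our assumptions; the patent only requires a feature similarity above a preset value.

```python
import numpy as np

# Sketch of similarity-based pseudo-labeling (cosine metric is our choice):
# each unlabeled element adopts the tag of its most similar labeled element,
# provided the similarity clears the preset threshold.
def similarity_pseudo_labels(labeled_x, labeled_y, unlabeled_x, threshold=0.5):
    pseudo = []
    for u in unlabeled_x:
        # cosine similarity between the unlabeled element and every labeled one
        sims = [np.dot(u, x) / (np.linalg.norm(u) * np.linalg.norm(x) + 1e-12)
                for x in labeled_x]
        best = int(np.argmax(sims))
        if sims[best] > threshold:
            # adopt the nearest labeled element's tag as the approximate label
            pseudo.append((u, labeled_y[best]))
    return pseudo

labeled_x = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
labeled_y = [0, 1]
unlabeled_x = [np.array([0.9, 0.1]), np.array([0.05, 0.99])]
pseudo = similarity_pseudo_labels(labeled_x, labeled_y, unlabeled_x)
```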
In other embodiments of the present application, a tag model may be trained based on the first time series data, where the tag model is used to add pseudo tag information to unlabeled time series data; the second time series data is then input into the tag model to obtain the third time series data comprising the pseudo tag information. Illustratively, the tag model is trained on the time series data with preset tag information in a semi-supervised learning manner: the time series data with preset tag information is used as input to a semi-supervised learning network to obtain the tag model; the second time series data, which has no preset tag information, is input into the tag model so that its features are extracted by the hidden layers of the tag model; and these features are used to classify the second time series data, thereby obtaining its pseudo tag information.
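The tag-model step can be sketched under our own simplifications: the patent trains a semi-supervised neural network, while the stand-in below fits a plain logistic-regression classifier on the labeled first data and uses it to emit pseudo tags for the unlabeled second data. All names, the synthetic data, and the classifier choice are ours.

```python
import numpy as np

# Sketch of a "tag model" (logistic regression is our stand-in for the
# patent's semi-supervised network): fit on labeled data, then predict
# pseudo tag information for the unlabeled data.
rng = np.random.default_rng(0)
x_lab = rng.normal(size=(200, 4))                     # first data elements
y_lab = (x_lab @ np.array([1.0, -1.0, 0.5, 0.0]) > 0).astype(float)
x_unlab = rng.normal(size=(50, 4))                    # second data elements

w = np.zeros(4)
for _ in range(500):                                  # gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-(x_lab @ w)))
    w -= 0.1 * x_lab.T @ (p - y_lab) / len(y_lab)

probs = 1.0 / (1.0 + np.exp(-(x_unlab @ w)))
pseudo_tags = (probs > 0.5).astype(int)               # pseudo tag information
third_data = list(zip(x_unlab, pseudo_tags))          # third time series data
```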
In some embodiments, the pseudo tag information added to the second time series data is evaluated, for example manually, and given an evaluation score, so that the second time series data whose pseudo tag information is more accurate is selected as the third time series data according to the score. In other embodiments, the tag model may output both the pseudo tag information of the second time series data and a confidence level for it, and the second time series data whose pseudo tag information is more accurate is selected as the third time series data according to that confidence level.
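The confidence-based selection can be sketched as a simple filter. The 0.9 threshold and all names are our illustrative choices; the patent does not fix a particular value.

```python
# Confidence gating for pseudo labels (threshold is our choice): only
# samples whose pseudo tag clears the confidence bar are promoted to
# the third time series data.
def filter_by_confidence(samples, pseudo_tags, confidences, min_conf=0.9):
    return [(s, t) for s, t, c in zip(samples, pseudo_tags, confidences)
            if c >= min_conf]

samples = [0.1, 0.5, 0.9, 1.3]
tags = [0, 1, 1, 0]
conf = [0.95, 0.60, 0.92, 0.88]
third_data = filter_by_confidence(samples, tags, conf)
```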
The tag model generated as described above may be expressed as:

L_t = TaskClassifier(u), t ∈ (1, …, Batchsize)

where L_t represents the pseudo tag information and u represents any element of the second time series data.
And 103, performing model training based on the first time series data and the third time series data to obtain a time series classification model.
In the embodiment of the present invention, this step may be implemented by:
and step A, randomly sequencing all data of the first time sequence data and the third time sequence data to obtain fourth time sequence data.
For example, all data of the first time series data and the third time series data may be input into a random ordering function to obtain the randomly ordered fourth time series data. For example, the random ordering function is a Shuffle function, and the element data of the first and third time series data are input into it as x = Shuffle(x) and u = Shuffle(u), where x is any element of the first time series data and u is any element of the third time series data. All the randomly ordered elements together form the fourth time series data.
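The random-ordering step of step A can be sketched with the standard library; `random.shuffle` is our stand-in for the Shuffle function named in the text, and the sample values are illustrative.

```python
import random

# Step A sketched: merge labeled and pseudo-labeled samples, then shuffle
# in place to obtain the randomly ordered fourth time series data.
first_data = [(0.2, 0), (0.8, 1)]   # elements with preset tags
third_data = [(0.5, 1), (0.1, 0)]   # elements with pseudo tags
fourth_data = first_data + third_data
random.seed(42)
random.shuffle(fourth_data)          # randomly ordered fourth time series data
```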
And B, performing model training based on the randomly ordered fourth time sequence data to obtain a time sequence classification model.
In this step, the fourth time series data obtained in step A is used as the training sample for model training; for example, the fourth time series data is input into a neural network that extracts its data features, and model training is performed on those features.
In other embodiments, this step can also be implemented by:
and a sub-step B1 of weighting the data elements and the tag elements of the randomly ordered fourth time series data to obtain fifth time series data.
For the randomly ordered fourth time series data, a weight parameter is determined using a Beta distribution function, and the data elements and tag elements of the fourth time series data are then weighted according to this parameter to obtain the fifth time series data. The weighting process is, for example:

x̃ = γ · x_s + (1 - γ) · u_t

l̃ = γ · l_s + (1 - γ) · l_t

where x̃ is the weighted data element, γ is the weight parameter, x_s is an unweighted data element with preset tag information, and u_t is an unweighted data element with pseudo tag information; l̃ is the weighted tag element, l_s is an unweighted preset tag information element, and l_t is an unweighted pseudo tag information element. The resulting fifth time series data may be represented as the set of weighted pairs (x̃, l̃).
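The Beta-weighted mixing of sub-step B1 can be sketched in mixup style. The Beta parameters (alpha = 0.75) and all variable names are our assumptions; the patent only specifies that the weight comes from a Beta distribution.

```python
import numpy as np

# Sketch of the Beta-weighted blending step (mixup-style; parameters are
# our assumptions): draw gamma from a Beta distribution and blend a
# labeled pair (x_s, l_s) with a pseudo-labeled pair (u_t, l_t).
def weight_pair(x_s, l_s, u_t, l_t, alpha=0.75, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    gamma = rng.beta(alpha, alpha)             # weight parameter from Beta
    x_new = gamma * x_s + (1.0 - gamma) * u_t  # weighted data element
    l_new = gamma * l_s + (1.0 - gamma) * l_t  # weighted tag element
    return x_new, l_new

x_new, l_new = weight_pair(np.array([1.0, 2.0]), 0.0,
                           np.array([3.0, 4.0]), 1.0,
                           rng=np.random.default_rng(0))
```

Since gamma lies in (0, 1), each blended element stays between the two inputs it mixes.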
and a sub-step B2 of performing model training based on the fifth time series data to obtain a time series classification model, wherein the fifth time series data comprises weighted data elements and weighted label elements.
In summary, the time series data processing method provided by the invention adds pseudo tag information to unlabeled time series data and then uses both the labeled time series data and the pseudo-labeled time series data as sample data for model training. This diversifies the time series data set, gives the trained time series classification model stronger generalization capability, and improves the robustness and classification accuracy of the model.
Based on the same concept as the method embodiment of the present invention, as shown in fig. 2, an embodiment of the present invention further provides a time series data processing apparatus, including:
the data acquiring module 21 is configured to acquire first time series data and second time series data, where the first time series data includes preset tag information, and the second time series data is untagged time series data.
And the tag processing module 22 is configured to add pseudo tag information to the second time series data to obtain third time series data including pseudo tag information.
A first training module 23, configured to perform model training based on the first time series data and the third time series data to obtain a time series classification model.
In some embodiments, the time series data processing apparatus provided in the embodiments of the present invention may further include a second training module (not shown in the figure) configured to train a tag model based on the first time series data, where the tag model is used to add pseudo tag information to unlabeled time series data; correspondingly, the tag processing module may further include an input unit configured to input the second time series data into the tag model to obtain the third time series data comprising the pseudo tag information.
In summary, the time series data processing apparatus provided by the invention adds pseudo tag information to unlabeled time series data and then uses both the labeled and the pseudo-labeled time series data as sample data for model training, so that the time series data set is diversified, the trained time series classification model has stronger generalization capability, and the robustness and classification accuracy of the model are improved.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. At the hardware level, the electronic device comprises a processor and, optionally, an internal bus, a network interface, and a memory. The memory may include main memory, such as Random-Access Memory (RAM), and may further include non-volatile memory, such as at least one disk storage. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 3, but this does not indicate only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
In a possible implementation, the processor reads the corresponding computer program from the non-volatile memory into memory and runs it (the program may also be obtained from other devices), thereby forming the time series data processing apparatus at the logical level. The processor executes the program stored in the memory so as to implement, through the executed program, the time series data processing method provided by any embodiment of the invention.
The method executed by the time series data processing device provided by the above embodiment can be applied to or realized by a processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present specification may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The steps of a method disclosed in connection with the embodiments of the present specification may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in the decoding processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and completes the steps of the method in combination with hardware of the processor.
This specification embodiment also proposes a computer-readable storage medium storing one or more programs, the one or more programs including instructions, which when executed by an electronic device including a plurality of application programs, can cause the electronic device to execute the time-series data processing method provided in any embodiment of the present invention, and is specifically configured to execute the method shown in fig. 1.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units or modules by function, respectively. Of course, the functionality of the various elements or modules may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The description has been presented with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the description. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.

Claims (10)

1. A method for processing time-series data, the method comprising:
acquiring first time series data and second time series data, wherein the first time series data comprise preset tag information, and the second time series data are unlabeled time series data;
adding pseudo tag information to the second time series data to obtain third time series data including pseudo tag information;
and performing model training based on the first time series data and the third time series data to obtain a time series classification model.
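Claim 1 describes a standard pseudo-labeling (semi-supervised) workflow: train on labeled data, pseudo-label the unlabeled data, then train a classifier on the union. A minimal Python sketch follows; the nearest-centroid label model, the data shapes, and all names are illustrative assumptions, since the claim does not fix any model architecture:

```python
import numpy as np

def train_label_model(x_labeled, y_labeled):
    # Nearest-centroid classifier as an illustrative stand-in for the
    # label model; the patent does not specify a model architecture.
    classes = np.unique(y_labeled)
    centroids = np.stack([x_labeled[y_labeled == c].mean(axis=0) for c in classes])
    return classes, centroids

def add_pseudo_labels(model, x_unlabeled):
    classes, centroids = model
    # Assign each unlabeled series the class of its nearest centroid.
    dists = ((x_unlabeled[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]

rng = np.random.default_rng(0)
# First time series data: labeled series drawn from two regimes.
x1 = np.concatenate([rng.normal(0.0, 0.1, (20, 50)), rng.normal(1.0, 0.1, (20, 50))])
y1 = np.array([0] * 20 + [1] * 20)
# Second time series data: unlabeled series.
x2 = np.concatenate([rng.normal(0.0, 0.1, (10, 50)), rng.normal(1.0, 0.1, (10, 50))])

model = train_label_model(x1, y1)
pseudo = add_pseudo_labels(model, x2)   # third time series data = (x2, pseudo)
x_train = np.concatenate([x1, x2])      # combined set for training the
y_train = np.concatenate([y1, pseudo])  # time series classification model
```

In practice the final classifier trained on `(x_train, y_train)` would be a stronger model (e.g. a neural network); the centroid model above only stands in for the labeling step.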
2. The method of claim 1, further comprising:
training to obtain a label model based on the first time series data, wherein the label model is used for adding pseudo tag information to unlabeled time series data;
adding pseudo tag information to the second time series data to obtain third time series data including pseudo tag information, including:
and inputting the second time series data into the label model to obtain third time series data comprising pseudo tag information.
3. The method according to claim 1 or 2, wherein performing model training based on the first time series data and the third time series data to obtain a time series classification model comprises:
randomly ordering all data of the first time series data and the third time series data to obtain fourth time series data;
and performing model training based on the randomly ordered fourth time series data to obtain a time series classification model.
4. The method of claim 3, wherein randomly ordering all data of the first time series data and the third time series data to obtain fourth time series data comprises:
inputting all data of the first time series data and the third time series data into a random ordering function to obtain randomly ordered fourth time series data.
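The random ordering function of claim 4 can be sketched as one shared permutation applied to both the data elements and their label elements, so that each series stays aligned with its (pseudo-)label. The names and toy shapes below are illustrative assumptions:

```python
import numpy as np

def random_order(x, y, seed=None):
    # Shuffle data elements and their (pseudo-)label elements with a
    # single permutation so each series keeps its own label.
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(x))
    return x[perm], y[perm]

x = np.arange(12, dtype=float).reshape(6, 2)   # six toy series of length 2
y = np.array([0, 0, 0, 1, 1, 1])
x4, y4 = random_order(x, y, seed=42)           # fourth time series data
```

Shuffling labeled and pseudo-labeled samples into one sequence before training prevents the model from seeing all pseudo-labeled data in a contiguous block.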
5. The method of claim 3, wherein performing model training based on the randomly ordered fourth time series data to obtain a time series classification model comprises:
weighting the data elements and the tag elements of the randomly ordered fourth time series data to obtain fifth time series data;
and performing model training based on the fifth time series data to obtain a time series classification model, wherein the fifth time series data comprise the weighted data elements and the weighted tag elements.
6. The method of claim 5, wherein weighting the data elements and the tag elements of the randomly ordered fourth time series data to obtain fifth time series data comprises:
determining, for the randomly ordered fourth time series data, a weight parameter by using a beta distribution function;
and weighting the data elements and the tag elements of the fourth time series data according to the weight parameter to obtain fifth time series data.
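Claims 5 and 6 describe weighting data elements and tag elements by a parameter drawn from a beta distribution, which matches the mixup augmentation scheme. A hedged sketch: the blending with a permuted partner, the one-hot label encoding, and the alpha value are assumptions not specified by the claims:

```python
import numpy as np

def beta_weighting(x, y_onehot, alpha=0.2, seed=None):
    # Draw one weight parameter lam ~ Beta(alpha, alpha) and blend each
    # sample (and its label) with a randomly chosen partner, in the style
    # of mixup; alpha is an illustrative hyperparameter.
    rng = np.random.default_rng(seed)
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(x))
    x5 = lam * x + (1.0 - lam) * x[perm]                # weighted data elements
    y5 = lam * y_onehot + (1.0 - lam) * y_onehot[perm]  # weighted tag elements
    return x5, y5, lam

x = np.random.default_rng(1).normal(size=(8, 50))
y = np.eye(2)[np.array([0, 1, 0, 1, 0, 1, 0, 1])]       # one-hot labels
x5, y5, lam = beta_weighting(x, y, alpha=0.2, seed=7)   # fifth time series data
```

Because each weighted label is a convex combination of one-hot vectors, the resulting soft labels still sum to 1 and can be used directly with a cross-entropy loss.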
7. A time-series data processing apparatus, comprising:
the data acquisition module is used for acquiring first time series data and second time series data, wherein the first time series data comprise preset tag information, and the second time series data are unlabeled time series data;
the tag processing module is used for adding pseudo tag information to the second time series data to obtain third time series data comprising the pseudo tag information;
and the first training module is used for performing model training based on the first time series data and the third time series data to obtain a time series classification model.
8. The apparatus of claim 7, further comprising:
the second training module is used for training based on the first time series data to obtain a label model, wherein the label model is used for adding pseudo tag information to unlabeled time series data;
the tag processing module comprises:
and the input unit is used for inputting the second time series data into the label model to obtain third time series data comprising pseudo tag information.
9. A readable medium comprising executable instructions which, when executed by a processor of an electronic device, cause the electronic device to perform the method of any of claims 1 to 6.
10. An electronic device, comprising: a processor, a memory, and a bus; the memory is used for storing execution instructions, and the processor is connected with the memory through the bus; when the electronic device runs, the processor executes the execution instructions stored in the memory, so as to cause the processor to perform the method according to any one of claims 1 to 6.
CN202111033394.7A 2021-09-03 2021-09-03 Time series data processing method and device, readable medium and electronic equipment Pending CN113743618A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111033394.7A CN113743618A (en) 2021-09-03 2021-09-03 Time series data processing method and device, readable medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN113743618A true CN113743618A (en) 2021-12-03

Family

ID=78735601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111033394.7A Pending CN113743618A (en) 2021-09-03 2021-09-03 Time series data processing method and device, readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113743618A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108875781A (en) * 2018-05-07 2018-11-23 腾讯科技(深圳)有限公司 A kind of labeling method, apparatus, electronic equipment and storage medium
CN111222648A (en) * 2020-01-15 2020-06-02 深圳前海微众银行股份有限公司 Semi-supervised machine learning optimization method, device, equipment and storage medium
CN111563424A (en) * 2020-04-20 2020-08-21 清华大学 Pedestrian re-identification method and device based on semi-supervised learning
CN112541745A (en) * 2020-12-22 2021-03-23 平安银行股份有限公司 User behavior data analysis method and device, electronic equipment and readable storage medium
CN113139051A (en) * 2021-03-29 2021-07-20 广东外语外贸大学 Text classification model training method, text classification method, device and medium
CN113159355A (en) * 2020-01-07 2021-07-23 北京京邦达贸易有限公司 Data prediction method, data prediction device, logistics cargo quantity prediction method, medium and equipment
CN113297443A (en) * 2020-05-13 2021-08-24 阿里巴巴集团控股有限公司 Classification method, classification device, computing equipment and medium
CN113326826A (en) * 2021-08-03 2021-08-31 新石器慧通(北京)科技有限公司 Network model training method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ARAZO, E. , ET AL.: "Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning", 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 28 September 2020 (2020-09-28), pages 1 *
ZHONGGONG EDUCATION YOUJIUYE RESEARCH INSTITUTE: "Python Efficient Development Guide: Python Language Core Programming", 31 May 2020, Shaanxi Science and Technology Press, pages: 251 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023148145A1 (en) * 2022-02-02 2023-08-10 Digital For Mental Health System for forecasting a mental state of a subject and method
CN114896307A (en) * 2022-06-30 2022-08-12 北京航空航天大学杭州创新研究院 Time series data enhancement method and device and electronic equipment
CN114896307B (en) * 2022-06-30 2022-09-27 北京航空航天大学杭州创新研究院 Time series data enhancement method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN110956275B (en) Risk prediction and risk prediction model training method and device and electronic equipment
CN110826006B (en) Abnormal collection behavior identification method and device based on privacy data protection
CN108550046B (en) Resource and marketing recommendation method and device and electronic equipment
CN113688313A (en) Training method of prediction model, information pushing method and device
CN113743618A (en) Time series data processing method and device, readable medium and electronic equipment
CN113641896A (en) Model training and recommendation probability prediction method and device
CN110569429B (en) Method, device and equipment for generating content selection model
CN111199157B (en) Text data processing method and device
CN109299276B (en) Method and device for converting text into word embedding and text classification
CN110334936B (en) Method, device and equipment for constructing credit qualification scoring model
CN115546831A (en) Cross-modal pedestrian searching method and system based on multi-granularity attention mechanism
CN113723352B (en) Text detection method, system, storage medium and electronic equipment
CN109492401B (en) Content carrier risk detection method, device, equipment and medium
CN108932525B (en) Behavior prediction method and device
CN110826323A (en) Comment information validity detection method and device
CN111753729B (en) False face detection method and device, electronic equipment and storage medium
CN111311372A (en) User identification method and device
CN111275071A (en) Prediction model training method, prediction device and electronic equipment
CN111027716A (en) Load prediction method and device
CN114254588B (en) Data tag processing method and device
CN114926437A (en) Image quality evaluation method and device
CN112579774B (en) Model training method, model training device and terminal equipment
CN114710318A (en) Method, device, equipment and medium for limiting high-frequency access of crawler
CN112101308B (en) Method and device for combining text boxes based on language model and electronic equipment
CN115661584B (en) Model training method, open domain target detection method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination