CN113486303A - Long-time sequence prediction method based on modification model integration - Google Patents

Long-time sequence prediction method based on modification model integration

Info

Publication number: CN113486303A
Application number: CN202110826038.4A
Authority: CN (China)
Legal status: Pending (the legal status listed is an assumption, not a legal conclusion)
Inventors: Ye Wang (叶旺), He Zhongjie (何中杰), Wang Yuesheng (王越胜), Yan Qiuzhen (严求真), Yang Qiyao (杨启尧), Guo Dong (郭栋), Huang Na (黄娜), Zhao Xiaodong (赵晓东)
Original and current assignee: Hangzhou Dianzi University
Application filed by Hangzhou Dianzi University on 2021-07-21; published as CN113486303A on 2021-10-08
Priority later claimed by application CN202210797902.7A (published as CN115062272A)

Classifications

    • G06F 17/18: Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06N 3/08: Computing arrangements based on biological models; neural networks; learning methods
    • Y02A 20/152: Technologies for adaptation to climate change; water conservation, efficient water supply and use; water filtration


Abstract

The invention discloses a long-time sequence prediction method based on modification model integration. A modification model is constructed by dividing and recombining the data set, a correction function is added to the modification model, and the corrected modification model is then integrated with a baseline model to obtain the time-series prediction result. The method effectively captures the accurate long-range coupling correlation between output and input, markedly improves the accuracy of long-time sequence prediction, and enhances the generalization capability and data-migration capability of the model. It addresses two shortcomings of the prior art: the hierarchical structure of multi-scale recurrent neural networks makes parameter optimization difficult, and their fixed scales make it hard to capture long-horizon feature information dynamically, leading to unsatisfactory generalization and data-migration performance.

Description

Long-time sequence prediction method based on modification model integration
Technical Field
The invention belongs to the technical field of time-series prediction and relates to a long-time sequence prediction method based on modification model integration.
Background
Long-time sequence prediction is a frontier problem in the field of time-series prediction. With the rapid development of artificial intelligence, model-based prediction methods such as ARIMA, LSTM, and Prophet have proven effective in time-series prediction, but most of these models and methods focus on short-term prediction.
At present, the models and methods applied to long-time sequence prediction are mainly autoregressive models, machine-learning methods, and multi-scale recurrent neural networks, but each has limitations. Autoregressive models are unsuited to non-stationary sequences, cannot simultaneously account for long-term trends and fine-grained volatility, and are prone to amplitude differences and phase offsets when the periodic pattern is fuzzy. Traditional machine-learning methods struggle to produce predictions outside the range of the historical data, and events such as outlier predictions require post-processing. Multi-scale recurrent neural networks adopt hierarchical modeling; besides the difficulty of parameter optimization, their fixed scales make it hard to capture feature information dynamically over long horizons.
In summary, when existing time-series prediction models and methods are applied to long-time sequence prediction, they struggle to capture the accurate long-range coupling correlation between output and input, which leads to unsatisfactory generalization and data-migration performance. To improve the accuracy of long-time sequence prediction, a better-adapted model and prediction method are urgently needed.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides a long-time sequence prediction method based on modification model integration. A modification model is constructed by dividing and recombining the data set, a correction function is added to it, and the corrected modification model is then integrated with a baseline model to obtain the time-series prediction result. The method effectively captures the accurate long-range coupling correlation between output and input, markedly improves the accuracy of long-time sequence prediction, and enhances the generalization capability and data-migration capability of the model.
A long time sequence prediction method based on modification model integration specifically comprises the following steps:
step one, data preprocessing
Extract features from the collected time-series data and clean impurities and redundancies from the raw data; X_i denotes the i-th column of feature data obtained after cleaning. X_i is then zero-mean normalized to obtain the normalized column X̃_i.
Preferably, the data features are extracted by time-stamp processing or discrete-variable processing.
Step two, calculating the prediction probability of the baseline model
Establish a baseline model with a long short-term memory (LSTM) neural network, input the feature data normalized in step one into the baseline model, feed the hidden variable of the model's last time step into a fully connected layer, and output the probability p_0 that the relevant task occurs within the specified time:

p_0 = σ(h_n)

where h_n is the final hidden state computed by the LSTM from the input data set and p_0 is the prediction of the baseline model. The input is

X̃ = (x̃_1, x̃_2, …, x̃_n)

where X̃ denotes the data set normalized in step one and x̃_t ∈ R^k is the feature-data vector at time t, k being the vector dimension.
Step three, calculating the prediction probability of the modification model
Construct a modification model with a tree model. Divide the data set X̃ normalized in step one into n parts in time order, and input the first n - a parts as a new data set X* into the modification model to predict the probabilities p_i that the relevant task occurs within the specified time; the last a parts are set aside as the prediction portion of X̃, where a < n/2:

p_i = Tree(X*), 1 ≤ i ≤ n/2 - a
Preferably, the tree model is a LightGBM, XGBoost, or CatBoost model.
Step four, introducing a correction function
A correction function is introduced to correct the prediction output by the modification model in step three:

w_i = f(p_i; α, β) (the correction function is shown only as an image in the source)

where w_i is the corrected prediction, α is a modulation coefficient that prevents floating-point exceptions, and β is the time length of the data set X̃.
Preferably, the modulation coefficient α is 1.
Step five, model integration
Integrate the prediction of the baseline model from step two with the corrected prediction of the modification model from step four to obtain the final probability p that the relevant task occurs within the specified time:

p = γ·p_0 + (1 - γ)·w (the integration equation is shown only as an image in the source; w denotes the corrected modification-model prediction)

where γ and 1 - γ weigh the importance of the baseline model and the modification model, respectively.
The invention has the following beneficial effects:
A long short-term memory neural network is used to build a baseline model for the time series; on this basis, the recombined data set is divided effectively and a suitable tree model is constructed to obtain the modification model, and the baseline model and the modification model are then integrated. The historical information memorized at each time step is thereby further recombined, giving a strong ability to capture long-distance dependencies in long-term feature information. The method can effectively learn the accurate long-range coupling correlation between output and input, enhancing the generalization capability, data-migration capability, and prediction accuracy of the model and improving long-time sequence prediction.
Drawings
FIG. 1 is a flow chart of the long-time sequence prediction method based on modification model integration;
FIG. 2 is a schematic diagram of data set partitioning and reorganization;
FIG. 3 is a schematic diagram of a model integration method.
Detailed Description
The invention is further explained below with reference to the drawings.
As shown in FIG. 1, the long-time sequence prediction method based on modification model integration specifically includes the following steps:
step one, data preprocessing
Perform feature extraction on the collected time-series data by time-stamp processing and discrete-variable processing, and clean impurities and redundancies from the raw data; X_i is the i-th column of feature data obtained after cleaning. To eliminate the dimensional influence of the raw data and make different data comparable, the cleaned feature data are normalized: they are first scaled proportionally so that the result maps into [0, 1], and then mapped to a distribution with mean 0 and standard deviation 1, completing zero-mean normalization:

x̂_i = (X_i - X_i^min) / (X_i^max - X_i^min)

μ = (1/M) Σ_{j=1}^{M} x̂_i(j)

σ² = (1/M) Σ_{j=1}^{M} (x̂_i(j) - μ)²

X̃_i = (x̂_i - μ) / σ

where X_i^max and X_i^min are the maximum and minimum of the i-th column of feature data, x̂_i is the i-th column of feature data after linear-function normalization, M is the size of each column of feature data, μ and σ² are the mean and variance of the i-th column of feature data, and X̃_i is the i-th column of feature data after zero-mean normalization.
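The two-stage normalization just described (linear min-max scaling into [0, 1], then zero-mean standardization) can be sketched for a single feature column as follows; the function name and plain-list representation are illustrative, not part of the patent.

```python
def zero_mean_normalize(column):
    """Min-max scale a feature column into [0, 1], then map it to mean 0, std 1."""
    lo, hi = min(column), max(column)
    scaled = [(x - lo) / (hi - lo) for x in column]   # linear-function normalization
    m = len(scaled)
    mu = sum(scaled) / m                              # column mean
    var = sum((x - mu) ** 2 for x in scaled) / m      # column (population) variance
    sigma = var ** 0.5
    return [(x - mu) / sigma for x in scaled]         # zero-mean normalized column
```

For example, `zero_mean_normalize([1.0, 2.0, 3.0, 4.0])` yields a column with mean 0 and standard deviation 1 while preserving the ordering of the values.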
Step two, calculating the prediction probability of the baseline model
Construct a baseline model with a long short-term memory (LSTM) neural network, whose recurrent functions are:

c̃_t = tanh(W_c·x̃_t + U_c·h_{t-1} + b_c)

c_t = f_t ⊙ c_{t-1} + i_t ⊙ c̃_t

h_t = o_t ⊙ tanh(c_t)

where ⊙ denotes element-wise vector multiplication, c_t is the cell state of the recurrent network at time t, h_t is the final hidden state, and W_c, U_c, b_c are trainable network parameters; the input at each time step also includes the hidden-layer output h_{t-1} of the previous step; c̃_t is the new candidate value vector, and tanh(·) maps each element to a value in (-1, 1). The input gate i_t, output gate o_t, and forget gate f_t are computed as:

i_t = σ(W_i·x̃_t + U_i·h_{t-1} + b_i)

o_t = σ(W_o·x̃_t + U_o·h_{t-1} + b_o)

f_t = σ(W_f·x̃_t + U_f·h_{t-1} + b_f)

where σ(·) is the sigmoid function, whose output ranges between 0 and 1, and W_i, U_i, b_i, W_o, U_o, b_o, W_f, U_f, b_f are in turn the trainable network parameters of the i_t, o_t, and f_t equations.
Input the feature data normalized in step one into the baseline model. Since the business prediction task is binary classification, the hidden variable of the last time step of the LSTM model is fed into a fully connected layer with only two output nodes, which outputs the probability p_0 that the relevant task occurs within the specified time:

p_0 = σ(h_n)

where h_n is the final hidden state computed by the LSTM from the input data set and p_0 is the prediction of the baseline model. The input is

X̃ = (x̃_1, x̃_2, …, x̃_n)

where X̃ denotes the data set normalized in step one and x̃_t ∈ R^k is the feature-data vector at time t, k being the vector dimension.
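A minimal NumPy sketch of the recurrent step defined above (gates i_t, o_t, f_t, candidate c̃_t, cell state c_t, hidden state h_t); the dimensions, random parameters, and function names are illustrative assumptions. The final hidden state h after the loop plays the role of h_n, which would feed the fully connected layer producing p_0.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, params):
    """Advance the LSTM one time step; params holds (W, U, b) for i, o, f, c."""
    (Wi, Ui, bi), (Wo, Uo, bo), (Wf, Uf, bf), (Wc, Uc, bc) = params
    i_t = sigmoid(Wi @ x_t + Ui @ h_prev + bi)        # input gate
    o_t = sigmoid(Wo @ x_t + Uo @ h_prev + bo)        # output gate
    f_t = sigmoid(Wf @ x_t + Uf @ h_prev + bf)        # forget gate
    c_tilde = np.tanh(Wc @ x_t + Uc @ h_prev + bc)    # new candidate value vector
    c_t = f_t * c_prev + i_t * c_tilde                # element-wise cell-state update
    h_t = o_t * np.tanh(c_t)                          # hidden state
    return h_t, c_t

# Run a short sequence of random feature vectors through the cell.
rng = np.random.default_rng(0)
k, d = 4, 8                                           # feature dim k, hidden dim d
params = tuple((rng.normal(size=(d, k)) * 0.1,
                rng.normal(size=(d, d)) * 0.1,
                np.zeros(d)) for _ in range(4))
h = c = np.zeros(d)
for t in range(5):
    h, c = lstm_step(rng.normal(size=k), h, c, params)
```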
Step three, calculating the prediction probability of the modification model
Construct the modification model with the LightGBM model. As shown in FIG. 2, divide the data set X̃ normalized in step one into n parts in time order, and input the first n - a parts as a new data set X* into the modification model to obtain the predicted probabilities p_i:

p_i = Tree(X*), 1 ≤ i ≤ n/2 - a

The last a parts serve as the prediction data set; p_1 is the probability that the relevant task occurs, predicted from the leading portion of X̃ together with the data set X*, where a < n/2 (the exact window fractions appear only as images in the source).
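The division-and-recombination scheme of FIG. 2 can be sketched as follows: the normalized series is cut into n equal time-ordered parts, the first n - a parts are recombined into the training set X*, and the last a parts are held out as the prediction portion. The function name and list representation are illustrative.

```python
def split_reorganize(series, n, a):
    """Split `series` into n time-ordered parts; return (x_star, holdout)."""
    assert a < n / 2, "the method requires a < n/2"
    size = len(series) // n                                 # length of each part
    parts = [series[i * size:(i + 1) * size] for i in range(n)]
    x_star = [x for part in parts[:n - a] for x in part]    # first n - a parts
    holdout = [x for part in parts[n - a:] for x in part]   # last a parts
    return x_star, holdout
```

For a 12-sample series with n = 4 and a = 1, X* receives the first 9 samples and the holdout the last 3.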
Step four, introducing a correction function
A correction function is introduced to correct the prediction output by the modification model in step three:

w_i = f(p_i; α, β) (the correction function is shown only as an image in the source)

where w_i is the corrected prediction; the modulation coefficient α = 1 prevents floating-point exceptions and helps smooth the modification model; β is the time length of the data set X̃.
Step five, model integration
Integrate the prediction of the baseline model from step two with the corrected prediction of the modification model from step four to obtain the final probability p that the relevant task occurs within the specified time:

p = γ·p_0 + (1 - γ)·w (the integration equation is shown only as an image in the source; w denotes the corrected modification-model prediction)

where γ and 1 - γ weigh the importance of the baseline model and the modification model, respectively.
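The integration step can be sketched under the assumption that the final probability is the weighted sum γ·p_0 + (1 - γ)·w, which matches the text's description of γ and 1 - γ as the importance weights of the two models (the exact equation appears only as an image in the source); the function name is illustrative.

```python
def integrate(p0, w, gamma=0.5):
    """Weighted combination of baseline prediction p0 and corrected prediction w."""
    assert 0.0 <= gamma <= 1.0            # gamma weighs the baseline model
    return gamma * p0 + (1.0 - gamma) * w # 1 - gamma weighs the modification model
```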

Claims (6)

1. A long-time sequence prediction method based on modification model integration, characterized in that the method specifically comprises the following steps:
step one, data preprocessing
Extract features from the collected time-series data and clean impurities and redundancies from the raw data; X_i denotes the i-th column of feature data obtained after cleaning; X_i is then zero-mean normalized to obtain the normalized column X̃_i;
Step two, calculating the prediction probability of the baseline model
Establish a baseline model with a long short-term memory (LSTM) neural network; input the feature data zero-mean normalized in step one into the baseline model, feed the hidden variable of the model's last time step into a fully connected layer, and output the probability p_0 that the relevant task occurs within the specified time:

p_0 = σ(h_n)

where h_n is the final hidden state computed by the LSTM from the input data set and p_0 is the prediction of the baseline model; the input is

X̃ = (x̃_1, x̃_2, …, x̃_n)

where X̃ denotes the data set normalized in step one and x̃_t ∈ R^k is the feature-data vector at time t, k being the vector dimension;
step three, calculating the prediction probability of the modification model
Construct a modification model with a tree model; divide the data set X̃ normalized in step one into n parts in time order, input the first n - a parts as a new data set X* into the modification model to predict the probabilities p_i that the relevant task occurs within the specified time, and set the last a parts aside as the prediction portion of X̃, with a < n/2:

p_i = Tree(X*), 1 ≤ i ≤ n/2 - a
step four, introducing a correction function
A correction function is introduced to correct the prediction output by the modification model in step three:

w_i = f(p_i; α, β) (the correction function is shown only as an image in the source)

where w_i is the corrected prediction, α is a modulation coefficient that prevents floating-point exceptions, and β is the time length of the data set X̃;
step five, model integration
Integrate the prediction of the baseline model from step two with the corrected prediction of the modification model from step four to obtain the final probability p that the relevant task occurs within the specified time:

p = γ·p_0 + (1 - γ)·w (the integration equation is shown only as an image in the source; w denotes the corrected modification-model prediction)

where γ and 1 - γ weigh the importance of the baseline model and the modification model, respectively.
2. The long-time sequence prediction method based on modification model integration as claimed in claim 1, characterized in that: the data features are extracted by time-stamp processing or discrete-variable processing.
3. The long-time sequence prediction method based on modification model integration as claimed in claim 1, characterized in that: linear-function normalization is first performed on the cleaned feature data, followed by zero-mean normalization, mapping the normalized feature data to a distribution with mean 0 and standard deviation 1:

x̂_i = (X_i - X_i^min) / (X_i^max - X_i^min)

μ = (1/M) Σ_{j=1}^{M} x̂_i(j)

σ² = (1/M) Σ_{j=1}^{M} (x̂_i(j) - μ)²

X̃_i = (x̂_i - μ) / σ

where X_i^max and X_i^min are the maximum and minimum of the i-th column of feature data, x̂_i is the i-th column of feature data after linear-function normalization, M is the size of each column of feature data, μ and σ² are the mean and variance of the i-th column of feature data, and X̃_i is the i-th column of feature data after zero-mean normalization.
4. The long-time sequence prediction method based on modification model integration as claimed in claim 1, characterized in that the recurrent functions of the LSTM model are:

c̃_t = tanh(W_c·x̃_t + U_c·h_{t-1} + b_c)

c_t = f_t ⊙ c_{t-1} + i_t ⊙ c̃_t

h_t = o_t ⊙ tanh(c_t)

where ⊙ denotes element-wise vector multiplication, c_t is the cell state of the recurrent network at time t, h_t is the final hidden state, W_c, U_c, b_c are network parameters to be trained, h_{t-1} is the hidden-layer output of the previous time step, c̃_t is the new candidate vector, tanh(·) is the tanh function, and i_t, o_t, f_t are the input gate, output gate, and forget gate, respectively:

i_t = σ(W_i·x̃_t + U_i·h_{t-1} + b_i)

o_t = σ(W_o·x̃_t + U_o·h_{t-1} + b_o)

f_t = σ(W_f·x̃_t + U_f·h_{t-1} + b_f)

where σ(·) is the sigmoid function, whose output ranges between 0 and 1; W_i, U_i, b_i are the network parameters to be trained for i_t, W_o, U_o, b_o for o_t, and W_f, U_f, b_f for f_t.
5. The long-time sequence prediction method based on modification model integration as claimed in claim 1, characterized in that: the tree model is a LightGBM, XGBoost, or CatBoost model.
6. The long-time sequence prediction method based on modification model integration as claimed in claim 1, characterized in that: the modulation coefficient α is 1.
Priority and family applications

CN202110826038.4A (filed 2021-07-21 by Hangzhou Dianzi University): Long-time sequence prediction method based on modification model integration; published as CN113486303A on 2021-10-08 (status: pending).
CN202210797902.7A (priority date 2021-07-21, filed 2022-07-06): Water quality monitoring data abnormity identification and early warning method; published as CN115062272A (status: pending).

Cited by

  • WO2023088314A1 (2023-05-25): Object classification method, apparatus and device, and storage medium

Cited by family

  • CN116182949B (2024-03-19): Marine environment water quality monitoring system and method
  • CN116451142A (2023-07-18): Water quality sensor fault detection method based on machine learning algorithm
  • CN117113264B (2024-02-09): Method for real-time online anomaly detection of the dissolved-oxygen meter of a sewage plant
  • CN117171604B (2024-01-19): Sensor-based insulation board production line anomaly monitoring system

Also published as

CN115062272A, published 2022-09-16


Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination
  • CB03: Change of inventor or designer information (inventor order after change: He Zhongjie, Ye Wang, Wang Yuesheng, Yan Qiuzhen, Yang Qiyao, Guo Dong, Huang Na, Zhao Xiaodong; before change: Ye Wang, He Zhongjie, Wang Yuesheng, Yan Qiuzhen, Yang Qiyao, Guo Dong, Huang Na, Zhao Xiaodong)
  • WD01: Invention patent application deemed withdrawn after publication (application publication date: 2021-10-08)