CN113435263B - CGAN data enhancement-based frequency spectrum sensing method and system


Info

Publication number: CN113435263B (application CN202110635040.3A)
Authority: CN (China)
Prior art keywords: data, training, CGAN, neural network, generator
Legal status: Active (granted)
Other language: Chinese (zh)
Inventors: 曹开田, 蔡连宁
Original and current assignee: Shanghai Institute of Technology
Application filed by Shanghai Institute of Technology on 2021-06-07, with priority to CN202110635040.3A
Application publication: CN113435263A (2021-09-24); grant publication: CN113435263B (2024-04-19)


Classifications

    • G06F 18/214 — Pattern recognition: generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F 18/2431 — Pattern recognition: classification techniques for multiple classes
    • G06N 3/045 — Neural networks: architectures; combinations of networks
    • G06N 3/08 — Neural networks: learning methods
    • H04B 17/382 — Monitoring or testing of propagation channels for resource allocation, admission control or handover
    • G06F 2218/02 — Pattern recognition specially adapted for signal processing: preprocessing
    • G06F 2218/12 — Pattern recognition specially adapted for signal processing: classification; matching
    • Y02D 30/70 — Reducing energy consumption in wireless communication networks


Abstract

The invention provides a spectrum sensing method and system based on CGAN data enhancement. The spectrum sensing method comprises the following steps: collecting channel state data and acquiring a training data set; converting the training data into spectrograms through the short-time Fourier transform to generate a training set; establishing a CGAN-based neural network model and training it on the training set; adding data samples generated by the trained CGAN-based neural network model into the training set to generate an enhanced training data set; and acquiring a pre-trained convolutional neural network and inputting the enhanced training data set into it for classification to obtain a classification result, the classification result being that the primary user of the channel is in a silent state or in an active state. According to the invention, after the data are enhanced with the CGAN network, an AlexNet convolutional neural network is introduced to classify them, so that spectrum sensing can be effectively realized.

Description

CGAN data enhancement-based frequency spectrum sensing method and system
Technical Field
The invention relates to cognitive radio technology, and in particular to a spectrum sensing method and system based on CGAN data enhancement.
Background
With the development of emerging wireless communication technologies such as 5G and the Internet of Things, the demand for wireless spectrum resources keeps increasing. Studies by the Federal Communications Commission (FCC) indicate that a large portion of the allocated spectrum resources sits idle to a significant degree in both time and space. The main reasons for the shortage of spectrum resources are: 1. the static allocation policy for spectrum resources; 2. the inefficient use of certain licensed bands. Cognitive radio networks (CRNs) are considered an effective way to achieve dynamic allocation of spectrum resources and to improve spectrum utilization.
In CRNs there are two types of users: primary users (PUs) and secondary users (SUs). A primary user is licensed to use a certain frequency band, while a secondary user holds no license. The premise for a secondary user to use the band is therefore that it must not interfere with the normal use of the primary user, and when the primary user initiates a service request, the secondary user must switch to another available idle band. Spectrum sensing is a key technology for dynamic spectrum access in cognitive radio: secondary users analyze the spectrum through spectrum sensing and intelligently use idle spectrum, thereby improving spectrum-use efficiency. Research on spectrum sensing technology is therefore of both theoretical and practical significance.
The deep learning models commonly used in spectrum sensing include convolutional neural networks (CNNs), recurrent neural networks (RNNs), long short-term memory (LSTM) networks, and the like. However, these supervised deep learning models require a large number of training samples; under actual, complex channel conditions enough labeled data may simply not be obtainable, and collecting sufficient data is often quite expensive, so a model trained from a small number of samples generalizes poorly.
The generative adversarial network (GAN), proposed by Goodfellow et al. in 2014, has been a research hotspot in recent years. A GAN consists of a generator and a discriminator. The generator is responsible for learning the distribution of the real samples and generating new data from given noise; the discriminator determines whether its input is a real sample or a sample produced by the generator. In this dynamic game, the generator tries to increase the probability that the discriminator makes a mistake, while the discriminator tries to separate real data from generated data. The two are trained alternately to improve their generation and discrimination abilities until a Nash equilibrium is reached between them. The objective function of the GAN amounts to minimizing the JS divergence between the two probability distributions p_g and p_data, while the CGAN adds a label condition to the GAN, i.e., it generates the data required for a given label.
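For reference, the GAN objective mentioned above and its conditional extension can be written as follows; this is the standard formulation from the GAN/CGAN literature, restated here because the patent's formula images are not reproduced in this text.

```latex
% Standard GAN minimax objective
\min_G \max_D V(D,G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[\log D(x)]
  + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]

% At the optimal discriminator D^*, the objective reduces to
% V(D^*,G) = -\log 4 + 2\,\mathrm{JSD}(p_{\mathrm{data}} \,\|\, p_g),
% so training G minimizes the JS divergence between p_data and p_g.

% CGAN: both networks additionally receive the label (condition) y
\min_G \max_D V(D,G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[\log D(x \mid y)]
  + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z \mid y)))]
```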
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a spectrum sensing method and system based on CGAN data enhancement. To address the lack of data in spectrum sensing, a conditional generative adversarial network is used to synthesize data under the given spectrum environment conditions and add it to the original training data, thereby improving classification accuracy.
The spectrum sensing method based on CGAN data enhancement provided by the invention comprises the following steps:
Step S1: collecting channel state data and acquiring a training data set;
Step S2: converting the training data into spectrograms through the short-time Fourier transform to generate a training set;
Step S3: establishing a CGAN-based neural network model and training it on the training set;
Step S4: adding data samples generated by the trained CGAN-based neural network model into the training set to generate an enhanced training data set;
Step S5: acquiring a pre-trained convolutional neural network and inputting the enhanced training data set into it for classification to obtain a classification result, the classification result being that the primary user of the channel is in a silent state or in an active state.
Preferably, in step S1, the training data set may be represented as Φ = {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)}, where y_n represents the label corresponding to x_n, x_n is the n-th channel state data sample, and y_n indicates either that the primary user is in a silent state or that it is in an active state.
Preferably, the step S2 includes the steps of:
Step S201: carrying out framing and windowing pretreatment on the channel state data, and visualizing the obtained training data in a spectrogram form through short-time Fourier transformation to obtain a spectrogram x (k):
Wherein N is window length, x (N) is channel state data of a user, w (N) is Hamming window function, k is signal frequency, i is virtual function unit, and N is time sequence sampling point;
Step S202: converting the spectrogram into a form with amplitude of decibels:
I(k,t)=20×log10|xt(k)|
Wherein x t (k) is a spectrogram at time t;
Step S203: setting the bin gray value with the smallest spectrum image decibel as 0, and normalizing all the decibel bin gray values to be:
Wherein, R (I, j) represents the gray value of the original image, I (I, j) represents the gray value of the converted image, and R (I, j) max and R (I, j) min respectively represent the minimum gray value and the maximum gray value of the original image;
The training dataset is thus converted into a training set Φ= { (I 1,y1),(I2,y2)...(In,yn)},In representing the spectrogram normalized gray values.
Preferably, in step S3, the objective function of the CGAN neural network model is:

min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x | y)] + E_{z~p_z(z)}[log(1 − D(G(z | y)))]

where D denotes the discriminator, G denotes the generator, x denotes real data, z denotes the random noise input to the generator, y denotes the condition variable, G(z) is the output of the generator, D(x) is the output of the discriminator, and both D(x) and G(z) receive the additional condition y at the input layer.
Preferably, in step S3, the generator model parameters of the CGAN are first fixed and the discriminator parameters are trained, comprising the following steps:
Step M1: randomly selecting m positive samples {(I_1, y_1), (I_2, y_2), ..., (I_m, y_m)} from the training set;
Step M2: selecting m noise vectors {z_1, z_2, ..., z_m} from a Gaussian distribution;
Step M3: inputting the condition y and the noise data z into the generator simultaneously to obtain the generated data Ĩ_i = G(z_i | y_i);
Step M4: according to the discriminator objective function

V~ = (1/m) Σ_{i=1}^{m} log D(I_i, y_i) + (1/m) Σ_{i=1}^{m} log(1 − D(Ĩ_i, y_i))

where Ĩ_i represents an image generated by the generator and I_i represents a single image selected from the real samples, the discriminator is optimal when this objective function takes its maximum value, so the discriminator parameters are trained by gradient ascent:

θ_d ← θ_d + η ∇V~(θ_d)

where θ_d denotes the discriminator parameters, η is the learning rate, and ∇V~(θ_d) is the gradient of the objective function with respect to θ_d.
Preferably, in step S3, the discriminator model parameters of the CGAN are then fixed and the generator parameters are trained, comprising the following steps:
Step N1: randomly selecting m condition labels {y_1, y_2, ..., y_m} from the training set;
Step N2: selecting m noise vectors {z_1, z_2, ..., z_m} from a Gaussian distribution;
Step N3: according to the generator objective function

V~ = (1/m) Σ_{i=1}^{m} log D(G(z_i | y_i), y_i)

the generator parameters are likewise trained by gradient ascent:

θ_g ← θ_g + η ∇V~(θ_g)

where θ_g denotes the generator parameters, η is the learning rate, and ∇V~(θ_g) is the gradient of the objective function with respect to θ_g.
Preferably, the convolutional neural network is an AlexNet-based convolutional neural network, and the AlexNet network structure is obtained through transfer learning.
The CGAN data enhancement-based spectrum sensing system provided by the invention comprises the following modules:
the data acquisition module is used for acquiring channel state data and acquiring a training data set;
The data conversion module is used for converting training data into a spectrogram through short-time Fourier transform to generate a training set;
The model training module is used for building a CGAN-based neural network model and training the CGAN-based neural network model through the training set;
The data enhancement module is used for adding data samples generated by the trained CGAN-based neural network model into the training set to generate an enhanced training data set;
The data classification module is used for acquiring a pre-trained convolutional neural network and inputting the enhanced training data set into it for classification to obtain a classification result, the classification result being that the primary user of the channel is in a silent state or in an active state.
Compared with the prior art, the invention has the following beneficial effects:
The invention performs spectrum sensing based on CGAN data enhancement using a data-driven approach. Unlike traditional energy detection, it requires no prior knowledge such as the noise power; in practice the performance of an energy detector depends to a great extent on the accuracy of the background-noise estimate. The invention therefore uses a CGAN network to enhance the data and then introduces an AlexNet convolutional neural network for classification, so that spectrum sensing can be effectively realized.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a flowchart illustrating the steps of a spectrum sensing data enhancement method based on a conditional generative adversarial network according to an embodiment of the present invention;
FIG. 2 is a basic structure and training diagram of CGAN in an embodiment of the present invention;
fig. 3 is a schematic block diagram of a spectrum sensing system based on CGAN data enhancement in an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the spirit of the invention. These are all within the scope of the present invention.
In an embodiment of the present invention, as shown in Fig. 1, the spectrum sensing data enhancement method based on a conditional generative adversarial network provided by the present invention includes the following steps:
step 1: and sampling the channel state data to obtain a training data set.
In the embodiment of the invention, for the sampled channel state information, namely spectrum sensing, the channel state information can be classified into a binary hypothesis testing problem:
Assume a common single-input multiple-output cognitive radio usage scenario, i.e. the number of secondary user antennas in the system is M, M > 1, and the number of primary user antennas is 1.x (N) = { x 1(n),x2(n)...,xm (N) }, representing the nth sample data, n∈ {0, 1..n-1 }, N representing the total number of samples. s (n) denotes the primary user signal, ε (n) denotes noise, H 1 denotes that the primary user is active, i.e. occupying the channel, and H 0 denotes that the primary user is in a silence state. The training data set may thus be represented as phi = { (x 1,y1),(x2,y2)...,(xn,yn) }, where y n represents the label to which x n corresponds, and y n includes the primary user being in a silence state and the primary user being in an active state.
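To make the data model concrete, here is a minimal simulation sketch of such a labeled data set; the BPSK primary-user signal, the SNR parameter, and the function names are illustrative assumptions, since the patent does not specify how the channel state data are collected.

```python
import numpy as np

def simulate_samples(num_samples, num_time, num_antennas, snr_db=0.0, rng=None):
    """Simulate labeled channel-state data for binary-hypothesis spectrum sensing.

    H0: x(n) = noise only        -> label 0 (primary user silent)
    H1: x(n) = s(n) + noise      -> label 1 (primary user active)
    Each sample is a (num_antennas x num_time) complex array.
    """
    rng = rng or np.random.default_rng(0)
    data, labels = [], []
    for _ in range(num_samples):
        noise = (rng.standard_normal((num_antennas, num_time))
                 + 1j * rng.standard_normal((num_antennas, num_time))) / np.sqrt(2)
        label = int(rng.integers(0, 2))            # H0 or H1 with equal probability
        if label == 1:
            # Illustrative primary-user signal: BPSK symbols scaled to the target SNR,
            # seen identically by all antennas (flat-channel assumption)
            amp = 10.0 ** (snr_db / 20.0)
            s = amp * rng.choice([-1.0, 1.0], size=(1, num_time))
            x = s + noise
        else:
            x = noise
        data.append(x)
        labels.append(label)
    return np.stack(data), np.array(labels)

# Example: 1000 samples, 4 antennas, 1024 time samples, 0 dB SNR
X, y = simulate_samples(1000, num_time=1024, num_antennas=4, snr_db=0.0)
```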
Step 2: framing and windowing preprocessing is carried out on channel data, and the obtained training data is visualized in a spectrogram form through short-time Fourier transformation to obtain the spectrogram:
Where N is the window length, x (N) is the user signal, w (N) is the hamming window function, k is the signal frequency, i is the unit of the imaginary function, and N is the time sequence sampling point.
Step 3: converting the spectrogram into a form with amplitude of decibels:
I(k,t)=20×log10|xt(k)| (3)
Wherein x t (k) is the spectral diagram at time t.
Step 4: setting the bin gray value with the smallest spectrum figure decibel as 0, and normalizing all the decibel bin values:
Wherein R (I, j) represents a gray value of the original image, I (I, j) represents a gray value of the converted image, and R (I, j) max and R (I, j) min represent a minimum gray value and a maximum gray value of the original image, respectively.
The training set is thus converted to phi = { (I 1,y1),(I2,y2)...(In,yn) }.
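A minimal sketch of the preprocessing chain of steps 2-4 (Hamming-windowed STFT, conversion to decibels, min-max normalization), assuming SciPy's `stft`; the window length, the epsilon guard against log(0), and the [0, 1] output range are illustrative choices not fixed by the patent.

```python
import numpy as np
from scipy.signal import stft

def to_normalized_spectrogram(x, fs=1.0, window_len=128):
    """Turn one antenna's time series into a normalized grayscale spectrogram (steps 2-4)."""
    # Step 2: framing + Hamming windowing via the short-time Fourier transform
    _, _, Z = stft(x, fs=fs, window="hamming", nperseg=window_len)
    # Step 3: amplitude in decibels, I(k, t) = 20 * log10 |x_t(k)|
    eps = 1e-12                                    # guard against log of zero
    I_db = 20.0 * np.log10(np.abs(Z) + eps)
    # Step 4: min-max normalization; the smallest-dB bin maps to 0, the largest to 1
    r_min, r_max = I_db.min(), I_db.max()
    return (I_db - r_min) / (r_max - r_min + eps)

# Example: spectrogram of the first antenna of the first simulated sample
# I_img = to_normalized_spectrogram(X[0, 0])
```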
Step 5: and establishing CGAN a neural network model. CGAN the basic structure and training process is shown in figure 2. D denotes a arbiter, G denotes a generator, x denotes real data, z denotes random noise input to the generator, y denotes a condition variable, G (z) is an output of the generator, D (x) is an output of the arbiter, and D (x) and G (z) both require an objective function of the additional condition y, CGAN at the input layer:
step 6: the error back propagation algorithm is used to train the generator and the discriminant, and first the model parameters of the generator of CGAN are fixed, and the discriminant parameters are trained.
Step 7: m positive samples { (I 1,y1),(I2,y2)...,(Im,ym) } are randomly selected from the training set.
Step 8: m noise data { z 1,z2...,zm } are selected from the gaussian distribution.
Step 9: the condition y and the noise data z are input into the generator at the same time to obtain the generated data
Step 10: according to the objective function of the discriminator:
Wherein the method comprises the steps of Representing the image generated by the generator,/>Representing a single image selected from the real sample. When the objective function takes the maximum value, the discriminator is optimal, so gradient rising training is adopted to train the parameters of the discriminator:
Step 11: the parameters of the discriminator model of CGAN are fixed, and the parameters of the generator are trained.
Step 12: m condition labels { (y 1,y2...,ym) } are randomly selected from the training set.
Step 13: the m noise data { z 1,z2., zm } are chosen from the gaussian distribution.
Step 14: according to the generator objective function:
gradient ascent training generator parameters are also employed:
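A sketch of one alternating training iteration covering steps 6-14, using the Generator and Discriminator classes from the earlier sketch. The patent describes raw gradient ascent on the two objectives; minimizing equivalent binary cross-entropy losses with Adam, as below, is a common substitution rather than the patent's literal update rule.

```python
import torch
import torch.nn as nn

def cgan_train_step(G, D, real_imgs, real_labels, opt_g, opt_d, latent_dim=100):
    """One alternating CGAN update: discriminator step (steps 7-10), then generator step (steps 11-14)."""
    bce = nn.BCELoss()
    m = real_imgs.size(0)
    ones, zeros = torch.ones(m, 1), torch.zeros(m, 1)

    # Discriminator: push D(I|y) toward 1 and D(G(z|y)|y) toward 0 (generator frozen)
    z = torch.randn(m, latent_dim)                       # m noise vectors from a Gaussian
    fake_imgs = G(z, real_labels).detach()
    d_loss = bce(D(real_imgs, real_labels), ones) + bce(D(fake_imgs, real_labels), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: push D(G(z|y)|y) toward 1 (discriminator frozen)
    z = torch.randn(m, latent_dim)
    g_loss = bce(D(G(z, real_labels), real_labels), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Example wiring:
# G, D = Generator(), Discriminator()
# opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
# opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
```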
step 15: a AlexNet convolutional neural network was introduced for classification. And obtaining AlexNet the network structure by a migration learning method. The recognition performance of the deep neural network has higher requirement on the data volume, and the transfer learning can initialize the parameters of the network model for training the small data set by utilizing the parameters with strong learning ability, which are trained in advance on the large data set by the network model, thereby accelerating the network training speed and weakening the influence of the over-fitting phenomenon when training on the small data set.
Step 16: the AlexNet network has strong recognition capability to noise, so that the enhanced spectrogram training data set is input into the AlexNet network, and the classification result is output by the output layer after the image is subjected to operations such as feature extraction, pooling and the like. The spectrogram is finally classified as signal plus noise or simply representing noise, thereby achieving spectral perception.
Fig. 3 is a schematic block diagram of a spectrum sensing system based on CGAN data enhancement in an embodiment of the present invention, and as shown in fig. 3, the spectrum sensing system based on CGAN data enhancement provided in the present invention includes the following modules:
the data acquisition module is used for acquiring channel state data and acquiring a training data set;
The data conversion module is used for converting training data into a spectrogram through short-time Fourier transform to generate a training set;
The model training module is used for building a CGAN-based neural network model and training the CGAN-based neural network model through the training set;
The data enhancement module is used for adding data samples generated by the trained CGAN-based neural network model into the training set to generate an enhanced training data set;
The data classification module is used for acquiring a pre-trained convolutional neural network and inputting the enhanced training data set into it for classification to obtain a classification result, the classification result being that the primary user of the channel is in a silent state or in an active state, so that spectrum sensing is realized.
The invention performs spectrum sensing based on CGAN data enhancement using a data-driven approach and, unlike traditional energy detection, requires no prior knowledge such as the noise power. In practice, the performance of an energy detector depends to a large extent on the accuracy of the background-noise estimate. Therefore, after the data are enhanced with the CGAN network, an AlexNet convolutional neural network is introduced for classification, so that spectrum sensing can be effectively realized.
The foregoing describes specific embodiments of the present invention. It is to be understood that the invention is not limited to the particular embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the claims without affecting the spirit of the invention.

Claims (4)

1. A CGAN data enhancement based spectrum sensing method, comprising the steps of:
Step S1: collecting channel state data and acquiring a training data set;
Step S2: converting the training data into spectrograms through the short-time Fourier transform to generate a training set;
Step S3: establishing a CGAN-based neural network model and training it on the training set;
Step S4: adding data samples generated by the trained CGAN-based neural network model into the training set to generate an enhanced training data set;
Step S5: acquiring a pre-trained convolutional neural network and inputting the enhanced training data set into it for classification to obtain a classification result, the classification result being that the primary user of the channel is in a silent state or in an active state;
The step S2 includes the steps of:
Step S201: carrying out framing and windowing pretreatment on the channel state data, and visualizing the obtained training data in a spectrogram form through short-time Fourier transformation to obtain a spectrogram x (k):
Wherein N is window length, x (N) is channel state data of a user, w (N) is Hamming window function, k is signal frequency, i is virtual function unit, and N is time sequence sampling point;
Step S202: converting the spectrogram into a form with amplitude of decibels:
I(k,t)=20×log10|xt(k)|
Wherein x t (k) is a spectrogram at time t;
Step S203: setting the bin gray value with the smallest spectrum image decibel as 0, and normalizing all the bin gray values to be:
Wherein, R (I, j) represents the gray value of the original image, I (I, j) represents the gray value of the converted image, and R (I, j) max and R (I, j) min respectively represent the minimum gray value and the maximum gray value of the original image;
The training dataset is converted into a training set phi= { (I 1,y1),(I2,y2)...(In,yn)},In representing the spectrogram normalized gray values; in step S3, the representation of the neural network model of CGAN is:
Wherein D represents a discriminator, G represents a generator, x represents real data, z represents random noise inputted into the generator, y represents a condition variable, G (z) is an output of the generator, D (x) is an output of the discriminator, and both D (x) and G (z) require an additional condition y at an input layer;
In step S3, the generator model parameters of the CGAN are first fixed and the discriminator parameters are trained, comprising the following steps:
Step M1: randomly selecting m positive samples {(I_1, y_1), (I_2, y_2), ..., (I_m, y_m)} from the training set;
Step M2: selecting m noise vectors {z_1, z_2, ..., z_m} from a Gaussian distribution;
Step M3: inputting the condition y and the noise data z into the generator simultaneously to obtain the generated data Ĩ_i = G(z_i | y_i);
Step M4: according to the discriminator objective function

V~ = (1/m) Σ_{i=1}^{m} log D(I_i, y_i) + (1/m) Σ_{i=1}^{m} log(1 − D(Ĩ_i, y_i))

where Ĩ_i represents an image generated by the generator and I_i represents a single image selected from the real samples, the discriminator is optimal when the discriminator objective function takes its maximum value, so the discriminator parameters are trained by gradient ascent:

θ_d ← θ_d + η ∇V~(θ_d)
In step S3, the step of fixing the discriminator model parameters of the CGAN and training the generator parameters comprises the following steps:
Step N1: randomly selecting m condition labels {y_1, y_2, ..., y_m} from the training set;
Step N2: selecting m noise vectors {z_1, z_2, ..., z_m} from a Gaussian distribution;
Step N3: according to the generator objective function

V~ = (1/m) Σ_{i=1}^{m} log D(G(z_i | y_i), y_i)

the generator parameters are likewise trained by gradient ascent:

θ_g ← θ_g + η ∇V~(θ_g)
2. The method of claim 1, wherein in step S1 the training data set is represented as Φ = {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)}, where y_n represents the label corresponding to x_n, x_n is the n-th channel state data sample, and y_n indicates either that the primary user is in a silent state or that it is in an active state.
3. The CGAN data enhancement-based spectrum sensing method according to claim 1, wherein the convolutional neural network is an AlexNet-based convolutional neural network, and the AlexNet network structure is obtained through transfer learning.
4. A spectrum sensing system based on CGAN data enhancement, which is used for implementing the spectrum sensing method based on CGAN data enhancement according to any one of claims 1 to 3, and includes the following modules:
the data acquisition module is used for acquiring channel state data and acquiring a training data set;
The data conversion module is used for converting training data into a spectrogram through short-time Fourier transform to generate a training set;
The model training module is used for building a CGAN-based neural network model and training the CGAN-based neural network model through the training set;
The data enhancement module is used for adding data samples generated by the trained CGAN-based neural network model into the training set to generate an enhanced training data set;
The data classification module is used for acquiring a pre-trained convolutional neural network and inputting the enhanced training data set into it for classification to obtain a classification result, the classification result being that the primary user of the channel is in a silent state or in an active state.
CN202110635040.3A (priority and filing date 2021-06-07) — CGAN data enhancement-based frequency spectrum sensing method and system — Active — granted as CN113435263B

Priority Applications (1)

Application Number — Priority Date — Filing Date — Title
CN202110635040.3A — 2021-06-07 — 2021-06-07 — CGAN data enhancement-based frequency spectrum sensing method and system (CN113435263B)

Applications Claiming Priority (1)

Application Number — Priority Date — Filing Date — Title
CN202110635040.3A — 2021-06-07 — 2021-06-07 — CGAN data enhancement-based frequency spectrum sensing method and system (CN113435263B)

Publications (2)

Publication Number — Publication Date
CN113435263A (application publication) — 2021-09-24
CN113435263B (granted patent) — 2024-04-19

Family

ID=77803935

Family Applications (1)

Application Number — Title — Priority Date — Filing Date
CN202110635040.3A (Active; granted as CN113435263B) — CGAN data enhancement-based frequency spectrum sensing method and system — 2021-06-07 — 2021-06-07

Country Status (1)

Country Link
CN (1) CN113435263B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number — Priority date — Publication date — Assignee — Title
CN114710221A * — 2022-03-21 — 2022-07-05 — Shanghai Institute of Technology (上海应用技术大学) — Frequency spectrum sensing method based on convolutional neural network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111553424A (en) * 2020-04-29 2020-08-18 南京邮电大学 CGAN-based image data balancing and classifying method
WO2020172838A1 (en) * 2019-02-26 2020-09-03 长沙理工大学 Image classification method for improvement of auxiliary classifier gan
CN111932645A (en) * 2020-06-12 2020-11-13 重庆大学 Method for automatically generating ink and wash painting based on generation countermeasure network GAN
CN112488294A (en) * 2020-11-20 2021-03-12 北京邮电大学 Data enhancement system, method and medium based on generation countermeasure network

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020172838A1 (en) * 2019-02-26 2020-09-03 长沙理工大学 Image classification method for improvement of auxiliary classifier gan
CN111553424A (en) * 2020-04-29 2020-08-18 南京邮电大学 CGAN-based image data balancing and classifying method
CN111932645A (en) * 2020-06-12 2020-11-13 重庆大学 Method for automatically generating ink and wash painting based on generation countermeasure network GAN
CN112488294A (en) * 2020-11-20 2021-03-12 北京邮电大学 Data enhancement system, method and medium based on generation countermeasure network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陈文兵, 管正雄, 陈允杰. 基于条件生成式对抗网络的数据增强方法 [A data augmentation method based on conditional generative adversarial networks]. 计算机应用 (Journal of Computer Applications), 2018, No. 11. *

Also Published As

Publication number Publication date
CN113435263A (en) 2021-09-24

Similar Documents

Publication Publication Date Title
CN110364144B (en) Speech recognition model training method and device
CN110086737B (en) Communication signal modulation mode identification method based on graph neural network
Xing et al. Spectrum sensing in cognitive radio: A deep learning based model
CN112633420B (en) Image similarity determination and model training method, device, equipment and medium
Wu et al. DSLN: Securing Internet of Things through RF fingerprint recognition in low-SNR settings
Cai et al. Spectrum sensing based on spectrogram-aware CNN for cognitive radio network
CN113435263B (en) CGAN data enhancement-based frequency spectrum sensing method and system
Sang et al. Application of novel architectures for modulation recognition
Cheng et al. Deep learning network based spectrum sensing methods for OFDM systems
Tang et al. Specific emitter identification for IoT devices based on deep residual shrinkage networks
Miao et al. Spectrum sensing based on adversarial transfer learning
Geng et al. Spectrum sensing for cognitive radio based on feature extraction and deep learning
Wu et al. Performance improvement for machine learning‐based cooperative spectrum sensing by feature vector selection
Liao et al. Fast Fourier Transform with Multi-head Attention for Specific Emitter Identification
CN113627377A (en) Cognitive radio frequency spectrum sensing method and system Based on Attention-Based CNN
Liu et al. Incremental learning based radio frequency fingerprint identification using intelligent representation
Wang et al. Specific emitter identification based on the multi‐discrepancy deep adaptation network
Zhang et al. Machine learning based protocol classification in unlicensed 5 GHz bands
CN117216542A (en) Model training method and related device
Zhang et al. Spectrum Transformer: An Attention-based Wideband Spectrum Detector
KR20110062274A (en) Apparatus and method for selecting optimal database by using the maximal concept strength recognition techniques
CN116865884A (en) Broadband spectrum sensing method based on online learning
CN116614333A (en) Modulation identification method based on Markov conversion field and deep learning
Yang et al. Conventional Neural Network‐Based Radio Frequency Fingerprint Identification Using Raw I/Q Data
Zhang et al. Open Set Domain Adaptation for Automatic Modulation Classification in Dynamic Communication Environments

Legal Events

Code — Description
PB01 — Publication
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant