CN112882683A - Random number generator judging method, random number processing method, device and equipment - Google Patents


Info

Publication number
CN112882683A
CN112882683A (application CN201911202755.9A)
Authority
CN
China
Prior art keywords: random numbers, random number, random number generator, model
Prior art date
Legal status: Pending (the status is an assumption, not a legal conclusion)
Application number
CN201911202755.9A
Other languages
Chinese (zh)
Inventor
周泓伊
黄蕾蕾
Current assignee: Alibaba Group Holding Ltd (the listed assignees may be inaccurate)
Original assignee: Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201911202755.9A priority Critical patent/CN112882683A/en
Publication of CN112882683A publication Critical patent/CN112882683A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 7/00 Methods or arrangements for processing data by operating upon the order or content of the data handled
    • G06F 7/58 Random or pseudo-random number generators
    • G06F 7/582 Pseudo-random number generators
    • G06F 7/588 Random number generators, i.e. based on natural stochastic processes

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

Embodiments of the invention provide a random number generator judging method, a random number processing method, an apparatus, and a device. The random number processing method includes: acquiring N random numbers currently generated by a random number generator, where N ≥ 1; checking the N random numbers with a model corresponding to the random number generator, the model consisting of a convolutional neural network and a recurrent neural network connected in sequence; and, if the N random numbers are determined to meet the randomness requirement according to the check results, using the N random numbers for data processing, thereby ensuring data security.

Description

Random number generator judging method, random number processing method, device and equipment
Technical Field
The invention relates to the field of internet technology, and in particular to a random number generator judging method, a random number processing method, an apparatus, and a device.
Background
Random numbers are widely used in scenarios such as data encryption, and are generated by random number generators. For example, in a data encryption scenario, a random number generator generates a string of random numbers for encrypting data to be encrypted.
Common random number generators currently include pseudo-random number generators and quantum random number generators. A random number generated by a pseudo-random number generator may be referred to as a pseudo-random number, and a random number generated by a quantum random number generator may be referred to as a quantum random number.
Many practical application scenarios require the generated random numbers to have good randomness, but different random number generators may exhibit different degrees of randomness, and the random numbers generated by some generators may not meet the requirement.
Disclosure of Invention
Embodiments of the invention provide a random number generator judging method, a random number processing method, an apparatus, and a device, which can determine the quality of the random numbers generated by a random number generator.
In a first aspect, an embodiment of the present invention provides a random number processing method, where the method includes:
acquiring N random numbers currently generated by a random number generator, where N ≥ 1;
checking the N random numbers by adopting a model corresponding to the random number generator;
if the N random numbers are determined to meet the randomness requirement according to the test results of the N random numbers, using the N random numbers to perform data processing;
the model consists of a convolutional neural network and a cyclic neural network which are connected in sequence.
In a second aspect, an embodiment of the present invention provides a random number processing apparatus, including:
an acquisition module, configured to acquire N random numbers currently generated by the random number generator, where N ≥ 1;
a checking module, configured to check the N random numbers with a model corresponding to the random number generator, the model consisting of a convolutional neural network and a recurrent neural network connected in sequence;
and a processing module, configured to use the N random numbers for data processing if the N random numbers are determined to meet the randomness requirement according to the check results.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor and a memory, where the memory has executable code stored thereon, and when the executable code is executed by the processor, the processor is enabled to implement at least the random number processing method of the first aspect.
An embodiment of the present invention further provides a non-transitory machine-readable storage medium having executable code stored thereon; when the executable code is executed by a processor of an electronic device, the processor is enabled to implement at least the random number processing method of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a random number generator judging method, where the method includes:
acquiring N1 random numbers generated by a random number generator, where N1 > 1;
checking the N1 random numbers with a model corresponding to the random number generator to obtain a check result for the N1 random numbers;
determining the quality of the random number generator according to the check result;
the model consists of a convolutional neural network and a recurrent neural network connected in sequence.
In a fifth aspect, an embodiment of the present invention provides a random number generator judging apparatus, including:
an acquisition module, configured to acquire N1 random numbers generated by the random number generator, where N1 > 1;
a checking module, configured to check the N1 random numbers with a model corresponding to the random number generator to obtain a check result for the N1 random numbers, where the model consists of a convolutional neural network and a recurrent neural network connected in sequence;
and a determining module, configured to determine the quality of the random number generator according to the check result.
In a sixth aspect, an embodiment of the present invention provides an electronic device, including a processor and a memory, where the memory has executable code stored thereon, and when the executable code is executed by the processor, the processor is enabled to implement at least the random number generator judging method of the fourth aspect.
An embodiment of the present invention further provides a non-transitory machine-readable storage medium having executable code stored thereon; when the executable code is executed by a processor of an electronic device, the processor is enabled to implement at least the random number generator judging method of the fourth aspect.
In a seventh aspect, an embodiment of the present invention provides a random number generator judging method, where the method includes:
acquiring N1 random numbers generated by a first random number generator and N2 random numbers generated by a second random number generator, where N1 > 1 and N2 > 1;
checking the N1 random numbers with a first model corresponding to the first random number generator to obtain a first check result for the N1 random numbers;
checking the N2 random numbers with a second model corresponding to the second random number generator to obtain a second check result for the N2 random numbers;
selecting from the first random number generator and the second random number generator according to the first check result and the second check result;
both the first model and the second model consist of a convolutional neural network and a recurrent neural network connected in sequence.
In an eighth aspect, an embodiment of the present invention provides a random number processing method, where the method includes:
acquiring N random numbers currently generated by a random number generator, where N ≥ 1;
checking the N random numbers with a model corresponding to the random number generator, the model consisting of a convolutional neural network and a recurrent neural network connected in sequence;
and, if the N random numbers are determined not to meet the randomness requirement according to their check results, acquiring another N random numbers generated by the random number generator, or updating the random number generator.
In a ninth aspect, an embodiment of the present invention provides a random number processing apparatus, including:
an acquisition module, configured to acquire N random numbers currently generated by the random number generator, where N ≥ 1;
a checking module, configured to check the N random numbers with a model corresponding to the random number generator, the model consisting of a convolutional neural network and a recurrent neural network connected in sequence;
and a processing module, configured to acquire another N random numbers generated by the random number generator, or to update the random number generator, if the N random numbers are determined not to meet the randomness requirement according to their check results.
In a tenth aspect, an embodiment of the present invention provides an electronic device, including a processor and a memory, where the memory has executable code stored thereon, and when the executable code is executed by the processor, the processor is enabled to implement at least the random number processing method of the eighth aspect.
In the embodiments of the invention, when N random numbers generated by a random number generator are to be used in some data processing procedure, a pre-trained model first checks the N random numbers in combination with a number of random numbers the generator produced before them. Whether the N random numbers meet the randomness requirement is then determined from their check results, and only if they do are they used for data processing, which ensures data security. The check result describes how many of the N random numbers the model predicted successfully: the more of them the model can predict, the poorer their randomness.
Research shows that large quantities of random numbers produced by some random number generators may exhibit periodic features or repeated subsequences, and such features clearly violate the randomness requirement. To judge the randomness of the N random numbers accurately, the model must be able to discover these features; for this reason, the embodiments of the invention use a model composed of a convolutional neural network and a recurrent neural network connected in sequence. The convolutional neural network can extract local features from data, so repeated subsequences, periodic features, and the like can be discovered through it. The recurrent neural network has a memory function, so it can predict the next random numbers more accurately based on the random numbers the generator has historically produced, yielding a more accurate prediction result and hence a more accurate randomness judgment.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a flow chart of a random number processing method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a model structure according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a process for verifying N random numbers according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating an implementation of a random number processing method according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating an implementation of a random number processing method according to another embodiment of the present invention;
FIG. 6 is a schematic diagram of a model training process according to an embodiment of the present invention;
FIG. 7 is a flowchart of a random number generator judging method according to an embodiment of the present invention;
FIG. 8 is a flowchart of a random number generator judging method according to another embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a random number processing apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an electronic device corresponding to the random number processing apparatus provided in the embodiment shown in fig. 9;
FIG. 11 is a schematic structural diagram of a random number generator judging device according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of an electronic device corresponding to the random number generator judging device in the embodiment shown in fig. 11.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be obtained by a person skilled in the art based on the embodiments of the present invention without any inventive step, are within the scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "a plurality" typically includes at least two.
The word "if" as used herein may be interpreted as "when", "upon", "in response to determining", or "in response to detecting", depending on the context. Similarly, the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", and any variations thereof are intended to cover a non-exclusive inclusion, such that a commodity or system comprising a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to the commodity or system. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the commodity or system that includes the element.
In addition, the sequence of steps in each method embodiment described below is only an example and is not strictly limited.
The random number processing method and the random number generator judging method provided by the embodiment of the invention can be executed by terminal equipment such as a PC (personal computer), a notebook computer, a mobile phone and the like, and can also be executed by a server or a server cluster at the cloud end.
Fig. 1 is a flowchart of a random number processing method according to an embodiment of the present invention, as shown in fig. 1, which may include the following steps:
101. Acquire N random numbers currently generated by the random number generator.
102. Check the N random numbers with a model corresponding to the random number generator, where the model consists of a convolutional neural network and a recurrent neural network connected in sequence.
103. If the N random numbers are determined to meet the randomness requirement according to their check results, use the N random numbers for data processing.
The random number generator in this embodiment may be any random number generator, such as a pseudo random number generator, or a quantum random number generator.
Different application scenarios require different numbers of random numbers at a time: some scenarios (such as data encryption) may require several random numbers, while others (such as random selection) may require only one, so the value of N satisfies N ≥ 1.
It is worth noting that in this context, the random number generator generates only one random number at a time. Thus, when multiple random numbers are required to be used each time in some application scenarios, the random number generator may be caused to perform multiple random number generation operations to generate the required multiple random numbers.
In addition, in different application scenarios the random number generator may be configured according to actual requirements, for example the base (radix) and value range of the generated random numbers, how many random numbers are used at a time, and so on.
For example, the random numbers generated by the generator may be binary, or in another base such as decimal. Taking a generator producing decimal numbers as an example, the value range of the generated random numbers can also be set, for example to [0, 30000]. On this basis, suppose a scenario uses 5 random numbers at a time, i.e. N = 5: if the generator produces binary numbers, the N random numbers are 5 binary digits, such as 01101; if the generator produces decimal numbers with value range [0, 30000], the N random numbers are 5 decimal numbers within that range, such as 7, 135, 36, 16005, 24873.
After the N random numbers generated by the random number generator are acquired, a randomness judgment may first be performed on them to avoid data-security risks: it is determined whether they meet the randomness requirement, and if so they are used for subsequent data processing. If they do not meet the requirement, they can be discarded; the generator may then regenerate N random numbers, which are used for data processing if the newly generated numbers meet the requirement. Alternatively, the random number generator may be updated, which means retraining it or selecting a different random number generator.
The randomness judgment of the N random numbers relies on a pre-trained model: the model checks the N random numbers in combination with a number of random numbers the generator produced before them, and the check results then determine whether the N random numbers meet the randomness requirement.
The "model corresponding to the random number generator" in step 102 means a model trained on a large quantity of random numbers the generator has successively produced. The training process is described in later embodiments; here it is only emphasized that through training the model learns the data features present in the generator's output, such as periodic features and repeatedly occurring subsequences. Based on these learned features, the model can predict the generator's random numbers more accurately.
In summary, checking the N random numbers means that the model, given the random numbers the generator produced before them, predicts one by one what the generator's next outputs will be. If the N predicted values output in sequence by the model coincide to a very high degree with the N random numbers actually generated (for example, they are completely identical), the N random numbers are highly predictable and easily predicted, which means their randomness is very poor and the randomness requirement is not met.
In this embodiment, the check result of the N random numbers may be an index computed from the N predicted values output by the model, so the randomness requirement can be embodied as a set index threshold; whether the N random numbers meet the requirement is then determined by comparing the computed index with the threshold. Usable indices are described below.
The following description will first describe a model used in the verification process of N random numbers, and then describe the verification process of N random numbers.
As shown in fig. 2, the model may consist of a Convolutional Neural Network (CNN) and a Recurrent Neural Network (RNN) connected in sequence. The CNN may include one or more convolutional layers, such as convolutional layer 1 and convolutional layer 2 in fig. 2, and the output of the last convolutional layer serves as the input to the RNN. In practice, the RNN may be implemented with, for example, a Long Short-Term Memory network (LSTM).
The input and output of the model in fig. 2 can be described simply as follows: if the input is X random numbers, the output is the (X+1)-th random number the model predicts from those X numbers.
A model consisting of a CNN and an RNN is used because the CNN can extract local features from the input data through convolution operations, while the RNN's memory function makes it better at sequence prediction. In the random number check these abilities manifest as follows: the CNN can better extract local features (such as repeatedly occurring subsequences) and periodic features from the large quantity of random numbers the generator has produced, and the RNN's memory function lets the model exploit those historical random numbers to predict the next outputs with a higher success rate.
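The CNN-then-RNN structure described above can be illustrated with a minimal NumPy sketch. This is not the patent's trained model: the filter count, kernel width, hidden size, binary output, and all parameter values are illustrative assumptions, and the parameters here are random rather than learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, kernels):
    """1-D convolution over the random-number history, followed by ReLU.
    x: (T,) sequence; kernels: (n_filters, k). Returns (T-k+1, n_filters)."""
    feats = np.stack([np.convolve(x, k[::-1], mode="valid") for k in kernels],
                     axis=-1)
    return np.maximum(feats, 0.0)

def rnn_last_hidden(seq, Wx, Wh, b):
    """Plain RNN run over the convolutional features; returns the final
    hidden state, which summarizes the whole history."""
    h = np.zeros(Wh.shape[0])
    for x_t in seq:
        h = np.tanh(Wx @ x_t + Wh @ h + b)
    return h

def predict_next_bit(history, params):
    """CNN features -> RNN hidden state -> linear head -> next-bit guess."""
    feats = conv1d_relu(np.asarray(history, dtype=float), params["kernels"])
    h = rnn_last_hidden(feats, params["Wx"], params["Wh"], params["b"])
    return int(params["w_out"] @ h > 0.0)

# Illustrative, untrained parameters: 4 filters of width 3, hidden size 8.
params = {
    "kernels": rng.normal(size=(4, 3)),
    "Wx": rng.normal(size=(8, 4)),
    "Wh": rng.normal(size=(8, 8)),
    "b": np.zeros(8),
    "w_out": rng.normal(size=8),
}
print(predict_next_bit([0, 1, 1, 0, 1, 0, 0, 0, 0, 1], params))
```

In a real system these parameters would be learned from the generator's historical output, and an LSTM would typically replace the plain RNN cell, as the description notes.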
An optional check process for the N random numbers is described below:
First, the check set is initialized with a number of random numbers the generator produced before the N random numbers. Then the following iteration is performed until all N random numbers have been checked:
in the i-th round, the random numbers in the check set are input into the model, which outputs a predicted value for the i-th of the N random numbers;
the i-th of the N random numbers is then added to the check set for the (i+1)-th round, where i ∈ [1, N].
It should be noted that, in practice, to improve the accuracy of the check results, the check set may be initialized with all the random numbers the generator produced before the N random numbers; alternatively, to limit the amount of computation, it may be initialized with only a certain number of the most recently generated ones, for example the last 5000.
The check process is illustrated with fig. 3. Suppose the generator produced K random numbers before the N random numbers; the check set is initialized with those K numbers, i.e. the initial check set consists of them. In fig. 3, K = 10 and the 10 random numbers are 0110100001, so these are the contents of the initial check set. Suppose the N random numbers currently generated are 11001.
As shown in fig. 3, in the first iteration (i = 1) the K random numbers are input into the model, which outputs the predicted value for the first of the N random numbers, called the first predicted value and assumed here to be 0; the first random number is checked by comparing it with this predicted value. Checking the second random number next requires updating the check set: the already-checked first random number is added to it, so it now contains K + 1 random numbers. Since the first of the N random numbers is 1, the updated check set is 01101000011, where the final 1 is that first random number.
In the second iteration (i = 2), the K + 1 random numbers in the updated check set are input into the model, which outputs the predicted value for the second of the N random numbers, called the second predicted value and assumed here to be 1; the second random number is checked by comparing it with this predicted value. The check set is then updated again by adding the checked second random number, giving K + 2 random numbers. Since the second of the N random numbers is 1, the updated check set is 011010000111, where the final 1 is that second random number.
This continues until all N random numbers have been checked. Note that in the last iteration (here i = 5), the check set contains K + N - 1 random numbers: by the final round, the first N - 1 of the N random numbers have all been added to it.
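The iterative procedure above can be sketched in a few lines of Python. The trained CNN+RNN model is replaced here by a stand-in predictor (guess that the next output repeats the most recent one), so the predictions shown are purely illustrative.

```python
def check_random_numbers(model_predict, history, candidates):
    """Iteratively check N candidate random numbers.

    Round i feeds the current check set to the model, records the model's
    prediction for the i-th candidate, then appends the i-th candidate to
    the check set for round i+1 (the set grows by one each round).
    """
    check_set = list(history)          # initialized with K earlier outputs
    predictions = []
    for x in candidates:
        predictions.append(model_predict(check_set))
        check_set.append(x)            # the checked number joins the set
    return predictions

# Stand-in for the trained model: predict that the next output equals
# the most recently seen one.
predict_last = lambda seq: seq[-1]

history = [0, 1, 1, 0, 1, 0, 0, 0, 0, 1]   # the K = 10 numbers from fig. 3
candidates = [1, 1, 0, 0, 1]               # the N = 5 numbers under test
print(check_random_numbers(predict_last, history, candidates))
# [1, 1, 1, 0, 0]
```

Comparing this prediction list element-wise with the candidates gives the check result used later to compute the randomness index.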
Another optional check process for the N random numbers is:
acquiring a plurality of random numbers the generator produced before the N random numbers;
inputting the plurality of random numbers into the model, so that the predicted values corresponding to all N random numbers are output by the model.
In this embodiment, suppose the plurality of random numbers are the K random numbers; the model can be trained to predict the N random numbers jointly from those K random numbers.
Yet another optional check process for the N random numbers is:
initializing the check set with a plurality of random numbers the generator produced before the N random numbers;
and performing the following iteration until all N random numbers have been checked:
in the i-th round, the random numbers in the check set are input into the model, which outputs a predicted value for the i-th of the N random numbers;
the i-th of the N random numbers is added to the check set and the earliest-generated random number is removed from it before the (i+1)-th round, where i ∈ [1, N].
The above-mentioned plurality of random numbers are again assumed to be K random numbers. In this way, the check set always contains exactly K random numbers. Specifically, the check set is updated with a first-in-first-out mechanism: whenever a random number is added to the check set, the earliest-generated random number is removed.
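The FIFO variant can be sketched with `collections.deque`, whose `maxlen` argument evicts the oldest element automatically; the function name and the trivial stand-in model are assumptions made for illustration:

```python
from collections import deque

def verify_fifo(history, fresh, model_predict):
    """FIFO variant: the check set always holds exactly K random numbers."""
    check_set = deque(history, maxlen=len(history))  # capacity K, auto-evicting
    predictions = []
    for actual in fresh:
        predictions.append(model_predict(list(check_set)))
        check_set.append(actual)  # adding one number drops the earliest one
    return predictions

# Same stand-in as before: repeat the last bit seen in the window.
preds = verify_fifo([0, 1, 1, 0, 1, 0, 0, 0, 0, 1], [1, 1, 0, 0, 1],
                    lambda window: window[-1])
print(preds)  # [1, 1, 1, 0, 0]
```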
To sum up, the check process for a random number is: predicting, from a plurality of random numbers previously generated by the random number generator, what random number the generator is likely to generate next.
After the prediction of the N random numbers is completed according to the above process, the predicted values corresponding to the N random numbers are obtained. Assume that the N predicted values corresponding to the N random numbers are: 01011.
next, N random number verification results can be obtained from the N predicted values.
In an optional embodiment, the total number of successful predictions for the N random numbers may be determined by comparing whether each of the N random numbers is the same as its corresponding predicted value. A first probability value, serving as the check result of the N random numbers, is then determined from this total. On this basis, if the first probability value is lower than a set threshold, the N random numbers are determined to meet the randomness requirement.
Continuing the example, the N random numbers are: 11001, and the N predicted values are: 01011. Comparing them shows that the first and fourth of the N random numbers are not successfully predicted, while the other three are, so the total number of successful predictions is 3. The first probability value for the N random numbers is therefore 3/5. Assuming a threshold of 0.2, the N random numbers do not meet the randomness requirement because the first probability value exceeds the threshold.
In this embodiment, the randomness of the N random numbers is evaluated from a global perspective: how many of the N random numbers are successfully predicted in total. It will be appreciated that the larger the first probability value, the better the overall predictability of the N random numbers, i.e. the more of them were successfully predicted, and the worse their randomness.
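The first probability value and threshold comparison can be computed as sketched below, using the example from this paragraph (function names are illustrative):

```python
def first_probability(actuals, predictions):
    """Fraction of the N random numbers that were successfully predicted."""
    successes = sum(a == p for a, p in zip(actuals, predictions))
    return successes / len(actuals)

def meets_randomness(actuals, predictions, threshold):
    """The N numbers pass only if the first probability value is below the threshold."""
    return first_probability(actuals, predictions) < threshold

# The example from the text: 3 of 5 predictions succeed, 3/5 = 0.6 > 0.2, fail.
actuals = [1, 1, 0, 0, 1]
predicted = [0, 1, 0, 1, 1]
print(first_probability(actuals, predicted))      # 0.6
print(meets_randomness(actuals, predicted, 0.2))  # False
```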
In addition, another alternative embodiment provides a different way of evaluating the randomness of the N random numbers: from a local perspective, namely how many random numbers at most are successfully predicted in a row.
Specifically, the maximum number of consecutive successful predictions corresponding to the N random numbers may be determined by comparing whether each of the N random numbers is the same as its corresponding predicted value, the maximum number of consecutive successful predictions being the length of the longest run of consecutively successfully predicted random numbers among the N. A second probability value, serving as the check result of the N random numbers, is then determined from this maximum, whereby the N random numbers are determined to meet the randomness requirement if the second probability value is below a set threshold.
For example, assume that the N random numbers are: 111001100000100, and that the corresponding N predicted values are: 111001000010110. Comparison shows that the first 6 random numbers are successfully predicted, as are the 8th to 10th and the 12th and 13th. The maximum number of consecutively successfully predicted random numbers is therefore 6, i.e. at most 6 of the N random numbers (the first 6) are predicted successfully in a row.
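The maximum number of consecutive successful predictions can be computed in a single pass, as sketched below (the function name is illustrative):

```python
def max_consecutive_successes(actuals, predictions):
    """Length of the longest run of consecutive successful predictions."""
    best = run = 0
    for a, p in zip(actuals, predictions):
        run = run + 1 if a == p else 0   # a miss resets the current run
        best = max(best, run)
    return best

# The example from the text: runs of length 6, 3, 2 and 1 — the maximum is 6.
actuals   = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0]
predicted = [1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0]
print(max_consecutive_successes(actuals, predicted))  # 6
```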
After obtaining the maximum number of consecutive successful predictions, the second probability value may be calculated according to the following formula:

α = (1 - px) / [(r + 1 - rx) · (1 - p) · x^(n+1)]

where α is a preset confidence level, for example 0.05; r is the maximum number of consecutively successfully predicted random numbers among the N random numbers, i.e. the maximum number of consecutive successful predictions; n is the number of prediction rounds, which equals N for N random numbers; x is a root of the equation 1 - x + (1 - p)·p^r·x^(r+1) = 0, through which x can be expressed in terms of p; and p is the second probability value to be solved for.
Solving the above formula yields the second probability value p. The larger p is, the better the local predictability of the N random numbers, i.e. the worse their randomness.
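The quoted formula matches Feller's classical approximation for the probability of seeing no success run longer than r in n trials. The sketch below evaluates α for a given p by locating the root x just above 1 with bisection; the bracketing step size and tolerance are assumptions, and the code goes from p to α (recovering the p that matches a preset α would be a further numerical search over p):

```python
def _x_root(p, r, tol=1e-12):
    """Smallest root above 1 of 1 - x + (1 - p) * p**r * x**(r + 1) = 0."""
    g = lambda x: 1 - x + (1 - p) * p ** r * x ** (r + 1)
    lo = hi = 1.0                  # g(1) = (1 - p) * p**r > 0
    while g(hi) > 0:               # walk right until the sign changes
        lo, hi = hi, hi + 1e-3
    while hi - lo > tol:           # standard bisection on the bracket
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
    return (lo + hi) / 2

def feller_alpha(p, r, n):
    """alpha = (1 - p*x) / [(r + 1 - r*x) * (1 - p) * x**(n + 1)]."""
    x = _x_root(p, r)
    return (1 - p * x) / ((r + 1 - r * x) * (1 - p) * x ** (n + 1))

# With p = 0.5, a longest run of r = 6 over n = 15 rounds is unremarkable,
# so the resulting confidence comes out fairly close to 1.
print(feller_alpha(0.5, 6, 15))
```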
After the judgment of the randomness of the N random numbers is completed based on the two optional manners, if the N random numbers meet the randomness requirement, the N random numbers are used for subsequent data processing.
For example, the N random numbers are used to perform data encryption processing on data to be encrypted.
For another example, a data object corresponding to the N random numbers is output, where the data object is, for example, an object carrying the numerical values corresponding to the N random numbers, or those values are displayed on a screen. For instance, in a random number selection scenario, assume the N random numbers are the decimal numbers 23 and 56. These two numbers may be displayed on the screen, meaning that the two users corresponding to these two numbers are the winners.
In summary, when the random number generator generates N random numbers, the N random numbers are first checked by a model composed of a convolutional neural network and a recurrent neural network connected in sequence, so that whether the randomness of the N random numbers meets the requirement is determined from the check result; if it does, the N random numbers are used for subsequent data processing, thereby ensuring data security. Using the convolutional neural network and the recurrent neural network jointly yields a more accurate random number check result, and hence a more accurate judgment of the randomness of the random numbers.
The following describes an exemplary implementation of the random number processing method with reference to a random number selection scenario illustrated in fig. 4.
In this random number selection scenario, assume that three numbers need to be generated in the current drawing and are produced by the random number generator, and that the three random numbers it currently generates are J1 = 13, J2 = 66, and J3 = 37. To avoid duplication between the currently generated random numbers and previously generated ones, the three random numbers may be subjected to a randomness check.
Specifically, after obtaining the three random numbers generated by the random number generator, a terminal device at the selection site first obtains a plurality of random numbers that the random number generator had generated before producing the three, assumed in fig. 4 to be K random numbers, and initializes the check set with them. In the first iteration, the K random numbers are input into the model composed of CNN and RNN, which outputs a first predicted value, denoted J1'. J1 is then added to the check set, so that during the second iteration the check set contains the original K random numbers plus J1. In the second iteration, the K random numbers and J1 are input into the model, which outputs a second predicted value, denoted J2'. J2 is then added to the check set, so that during the third iteration the check set contains the original K random numbers plus J1 and J2. In the third iteration, the K random numbers, J1 and J2 are input into the model, which outputs a third predicted value, denoted J3'. Assume the randomness condition is that all three predicted values differ from the three generated random numbers: if J1 ≠ J1', J2 ≠ J2', and J3 ≠ J3', the three random numbers J1, J2 and J3 are considered to meet the randomness condition, and they are displayed on the screen of the terminal device.
An implementation of the random number processing method in practical application is described below with reference to fig. 5. As shown in fig. 5, three parties are involved in this implementation: a random number generator, a cloud server and a client.

The client is the party that needs to use the random numbers generated by the random number generator.
The cloud server may be an independent host or a server cluster. Algorithmic logic for performing randomness checks on the N random numbers generated by the random number generator is deployed on the server.
As shown in fig. 5, in practical application, after the client obtains N random numbers from the random number generator, it may call the relevant interface of the server to upload the N random numbers and trigger the server to run its check logic, which determines whether the N random numbers satisfy the randomness requirement. The server can check the N random numbers with a model corresponding to the random number generator, the model being composed of a convolutional neural network and a recurrent neural network connected in sequence. The server then feeds back to the client a check result indicating whether the N random numbers meet the randomness requirement; the client uses the N random numbers for data processing only when they do.
The training process of the above model is described below. First, the training set is initialized with the M random numbers that the random number generator once generated, M > 1. Further, the following iterative process is performed until the set number of rounds L is reached:
acquiring a jth random number generated by a random number generator after M random numbers, wherein j belongs to [1, L ];
in the jth iteration process, inputting random numbers contained in the training set into the model so as to output a predicted value corresponding to the jth random number through the model;
if the predicted value corresponding to the jth random number is different from the jth random number, adjusting the parameters of the model with the goal of making the predicted value corresponding to the jth random number equal to the jth random number;
and updating the jth random number into a training set to execute a (j + 1) th iteration process.
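The training loop above can be sketched as follows. The disclosure trains a CNN+RNN; here a trivial majority-vote counter stands in for the model so the loop is runnable, and "adjusting the parameters" simply updates its counts — an assumption made purely for illustration:

```python
class MajorityStub:
    """Stand-in for the CNN+RNN model: predicts the majority bit seen so far."""
    def __init__(self):
        self.count = {0: 0, 1: 0}

    def predict(self, training_set):
        # Ignores the sequence order; a real model would use `training_set`.
        return 1 if self.count[1] >= self.count[0] else 0

    def adjust(self, target):
        # "Adjust the parameters" toward predicting `target` next time.
        self.count[target] += 1

def train(model, initial_set, stream, rounds):
    """initial_set: the M random numbers that initialize the training set.
    stream: an iterator yielding the random numbers generated after those M."""
    training_set = list(initial_set)
    for bit in initial_set:
        model.adjust(bit)                 # let the model absorb the M numbers
    for _ in range(rounds):               # one pass per set round, L in total
        target = next(stream)             # j-th random number after the M
        predicted = model.predict(training_set)
        if predicted != target:           # prediction wrong: adjust parameters
            model.adjust(target)
        training_set.append(target)       # update the training set
    return model

# A generator that keeps emitting 1s quickly teaches the stub to predict 1.
model = train(MajorityStub(), [0, 0, 0], iter([1] * 10), rounds=10)
print(model.predict([]))  # 1
```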
In practical applications, in order to enable the model to learn some data features that may be possessed by the random number generated by the random number generator, the number M of the random numbers initially contained in the training set and the number of training rounds of the model are often set to be relatively large. In practice, the M random numbers may be several random numbers sequentially generated by the random number generator.
The training process of the above model is schematically illustrated in connection with fig. 6. In fig. 6, it is assumed that the first random number generated by the random number generator after the M random numbers is 1, denoted R1 = 1. In the first iteration, the M random numbers used to initialize the training set are input into the model, which outputs a predicted value from them, assumed to be 0 and denoted R1' = 0. This is the predicted value corresponding to the first random number R1 generated by the random number generator. Comparing R1 with R1' shows that they are not equal, so the parameters of the model are adjusted with the goal of making R1' equal to R1.
Thereafter, the first random number R1 is added to the training set to update it for the second iteration; as shown in fig. 6, the training set now contains the original M random numbers together with R1. Assume the random number generator generates a second random number R2 = 1 after R1. In the second iteration, the M + 1 random numbers in the current training set are input into the model, which outputs a predicted value from them, assumed to be 1 and denoted R2' = 1. This is the predicted value corresponding to the second random number R2. Comparing R2 with R2' shows that they are equal, so the parameters of the model need not be adjusted.
Then, the second random number R2 is added to the training set to update it; the training set now contains the original M random numbers together with R1 and R2. The next iteration is then performed, and so on, until the L iterations are completed.
Fig. 7 is a flowchart of a random number generator discriminating method according to an embodiment of the present invention, as shown in fig. 7, the method may include the following steps:
701. N1 random numbers generated by the random number generator are obtained, N1 > 1.
702. The N1 random numbers are checked using a model corresponding to the random number generator to obtain a check result corresponding to the N1 random numbers, where the model is composed of a convolutional neural network and a recurrent neural network connected in sequence.
703. The quality of the random number generator is determined based on the test results.
As mentioned above, many practical application scenarios use random numbers, and the required random numbers are produced by a random number generator, so the quality of the random number generator is crucial. For example, in a data encryption scenario, sufficiently random output from the generator benefits data security, whereas insufficiently random output creates data security problems.
The quality of the random number generator may reflect the randomness of the random numbers it generates, i.e., the predictability of the random numbers it generates. If the random number generated by the random number generator has high predictability, the generated random number has poor randomness, so that the quality of the random number generator is poor.
Thus, the problem of judging the quality of the random number generator can be converted into a randomness-test problem on the random numbers it generates. Whether the quality of the random number generator is acceptable may be determined from the randomness check result of the random numbers it produces, and a random number generator judged to be of suitable quality is used in subsequent data processing.
Similarly to the foregoing embodiments, the process of checking the N1 random numbers generated by the random number generator can be implemented as follows: first, the check set is initialized with a plurality of random numbers that the random number generator has generated before generating the N1 random numbers, and then the following iterative process is performed until all N1 random numbers have been checked:
inputting the random numbers contained in the inspection set into the model in the ith iteration process so as to output a predicted value corresponding to the ith random number in the N1 random numbers through the model;
and updating the ith random number in the N1 random numbers into a check set to execute an i +1 th iteration process, wherein the i belongs to [1, N1 ].
Based on this, the test result of the N1 random numbers can be obtained optionally by:
and determining the maximum continuous successful prediction number corresponding to the N1 random numbers by comparing whether the predicted values corresponding to the N1 random numbers and the N1 random numbers are the same, and determining the probability value of the test result of the N1 random numbers according to the maximum continuous successful prediction number. Thus, if the probability value is lower than a set threshold, it is determined that the random number generator is qualified.
The detailed implementation of the above process can refer to the descriptions in the foregoing other embodiments, which are not described herein.
In addition, the training process of the model used in this embodiment is similar to the training process of the model in the foregoing, and includes the following steps:
initializing a training set with N2 random numbers generated by a random number generator, N2>1, N2 random numbers being different from the N1 random numbers;
the following iterative process is performed until the set number of rounds is reached:
acquiring a jth random number generated by a random number generator after N2 random numbers, wherein j belongs to [1, L ], and L is a set round number;
in the jth iteration process, inputting random numbers contained in the training set into the model so as to output a predicted value corresponding to the jth random number through the model;
if the predicted value corresponding to the jth random number is different from the jth random number, adjusting the parameters of the model with the goal of making the predicted value corresponding to the jth random number equal to the jth random number;
and updating the jth random number into a training set to execute a (j + 1) th iteration process.
For the content not described in this embodiment, reference may be made to the description in the foregoing other related embodiments, which is not described herein again.
Fig. 8 is a flowchart of a random number generator discriminating method according to another embodiment of the present invention, as shown in fig. 8, the method may include the following steps:
801. N1 random numbers generated by the first random number generator and N2 random numbers generated by the second random number generator are obtained, N1 > 1, N2 > 1.
802. The N1 random numbers are checked using a first model corresponding to the first random number generator to obtain a first check result corresponding to the N1 random numbers, and the N2 random numbers are checked using a second model corresponding to the second random number generator to obtain a second check result corresponding to the N2 random numbers, where the first model and the second model are each composed of a convolutional neural network and a recurrent neural network connected in sequence.
803. The first random number generator and the second random number generator are selected based on the first verification result and the second verification result.
In practical applications, there may be more than one random number generator that can be used, and when there are multiple random number generators that are optional, which random number generator is used is the problem to be solved by the solution provided by the present embodiment. Simply put, the random number generator with the better quality is selected.
As mentioned above, the quality evaluation problem of the random number generator can be converted into a randomness test problem of the random numbers generated by the random number generator, so as to determine the quality of the random number generator according to the randomness test result of the random numbers.
In this embodiment, it is assumed that two random number generators currently exist for selection, namely a first random number generator and a second random number generator. For example, the first random number generator is a quantum random number generator and the second random number generator is a pseudo-random number generator.
In order to perform quality evaluation on the two random number generators, the models corresponding to the two random number generators, that is, the first model and the second model, may be trained in advance. It can be understood that the first model corresponding to the first random number generator is obtained by training using a plurality of random numbers generated by the first random number generator as training samples, and the second model corresponding to the second random number generator is obtained by training using a plurality of random numbers generated by the second random number generator as training samples.
The training process of the model may refer to the description in the foregoing other embodiments, and is not repeated. Similarly, the use process of the model corresponding to the steps 801-802 may also refer to the description in the foregoing other embodiments, which is not repeated herein.
Assuming that the first and second check results are each embodied as one of the two probability values mentioned in the foregoing embodiments, the selection between the first and second random number generators is: the random number generator corresponding to the minimum probability value is selected.
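The selection rule can be sketched in a few lines; the generator names and example probability values below are hypothetical:

```python
def select_generator(check_results):
    """Pick the generator whose check produced the minimum probability value,
    i.e. whose random numbers were hardest to predict."""
    return min(check_results, key=check_results.get)

# Hypothetical check results for the two candidate generators.
print(select_generator({"first": 0.04, "second": 0.31}))  # first
```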
The random number processing apparatus, the random number generator discriminating apparatus, of one or more embodiments of the present invention will be described in detail below. Those skilled in the art will appreciate that the random number processing means and the random number generator discriminating means may be configured by the steps taught in the present embodiment using commercially available hardware components.
Fig. 9 is a schematic structural diagram of a random number processing apparatus according to an embodiment of the present invention, as shown in fig. 9, the random number processing apparatus includes: the device comprises an acquisition module 11, a checking module 12 and a processing module 13.
The obtaining module 11 is configured to obtain N random numbers currently generated by the random number generator, where N is greater than or equal to 1.
The inspection module 12 is configured to check the N random numbers using a model corresponding to the random number generator, where the model is composed of a convolutional neural network and a recurrent neural network connected in sequence.
And the processing module 13 is configured to, if it is determined that the N random numbers meet the randomness requirement according to the test result of the N random numbers, perform data processing using the N random numbers.
Optionally, the verification module 12 may be specifically configured to: initializing a check set with a plurality of random numbers that the random number generator has generated prior to generating the N random numbers; executing the following iterative process until the N random numbers are all checked to be completed: in the ith round of iteration process, inputting the random numbers contained in the inspection set into a model so as to output a predicted value corresponding to the ith random number in the N random numbers through the model; and updating the ith random number in the N random numbers into the checking set to execute an (i + 1) th iteration process, wherein the i belongs to [1, N ].
Optionally, the verification module 12 may be specifically configured to: obtaining a plurality of random numbers that the random number generator has generated before generating the N random numbers; and inputting the plurality of random numbers into a model so as to output predicted values corresponding to the N random numbers through the model.
Optionally, the verification module 12 may be specifically configured to:
initializing a test set with a plurality of random numbers that the random number generator has generated prior to generating the N random numbers;
executing the following iterative process until the N random numbers are all checked to be completed:
in the ith round of iteration process, inputting the random numbers contained in the inspection set into a model so as to output a predicted value corresponding to the ith random number in the N random numbers through the model;
updating the ith random number of the N random numbers into the check set, and removing the earliest-generated random number from the check set, to execute the (i + 1)th iteration process, i ∈ [1, N].
Optionally, the processing module 13 may be configured to: determining the total prediction success number corresponding to the N random numbers by comparing whether the predicted values corresponding to the N random numbers are the same or not; determining a first probability value as a test result of the N random numbers according to the total prediction success quantity; and if the first probability value is lower than a set threshold value, determining that the N random numbers meet the randomness requirement.
Optionally, the processing module 13 may be further configured to: determining the maximum continuous successful prediction number corresponding to the N random numbers by comparing whether the predicted values corresponding to the N random numbers are the same or not; determining a second probability value as a test result of the N random numbers according to the maximum continuous success prediction quantity; and if the second probability value is lower than a set threshold, determining that the N random numbers meet the randomness requirement.
Optionally, the apparatus further comprises: a training module for initializing a training set with M random numbers once generated by the random number generator, M > 1; the following iterative process is performed until the set number of rounds is reached: acquiring a jth random number generated by the random number generator after the M random numbers, wherein j ∈ [1, L], and L is the set number of rounds; in the jth iteration process, the random numbers contained in the training set are input into the model, so that a predicted value corresponding to the jth random number is output through the model; if the predicted value corresponding to the jth random number is different from the jth random number, adjusting the parameters of the model with the goal of making the predicted value corresponding to the jth random number equal to the jth random number; and updating the jth random number into the training set to execute a (j + 1)th iteration process.
Optionally, the processing module 13 may be further configured to: and performing data encryption processing on data to be encrypted by using the N random numbers.
Optionally, the processing module 13 may be further configured to: and outputting the data objects corresponding to the N random numbers.
The random number processing apparatus shown in fig. 9 may perform the methods provided in the embodiments shown in fig. 1 to fig. 6, and portions not described in detail in this embodiment may refer to the related descriptions of the embodiments, which are not described herein again.
In one possible design, the structure of the random number processing apparatus shown in fig. 9 may be implemented as an electronic device. As shown in fig. 10, the electronic device may include: a first processor 21, a first memory 22. The first memory 22 stores executable codes thereon, and when the executable codes are executed by the first processor 21, at least the first processor 21 is enabled to implement the random number processing method provided in the embodiments of fig. 1 to 6.
The electronic device may further include a first communication interface 23 configured to communicate with other devices or a communication network.
In addition, an embodiment of the present invention provides a non-transitory machine-readable storage medium having executable code stored thereon, which, when executed by a processor of an electronic device, causes the processor to perform the random number processing method provided in the foregoing embodiments shown in fig. 1 to 6.
Fig. 11 is a schematic structural diagram of a random number generator determination device according to an embodiment of the present invention, and as shown in fig. 11, the random number generator determination device includes: an acquisition module 31, a verification module 32, a determination module 33.
The obtaining module 31 is configured to obtain N1 random numbers generated by the random number generator, where N1> 1.
A checking module 32, configured to check the N1 random numbers using a model corresponding to the random number generator to obtain a check result corresponding to the N1 random numbers, where the model is composed of a convolutional neural network and a recurrent neural network connected in sequence.
A determining module 33 for determining the quality of the random number generator according to the inspection result.
Alternatively, the verification module 32 may be configured to: initializing a check set with a plurality of random numbers that the random number generator has generated prior to generating the N1 random numbers; performing the following iterative process until the N1 random numbers are all verified: inputting the random numbers contained in the inspection set into a model in the ith round of iteration process so as to output a predicted value corresponding to the ith random number in the N1 random numbers through the model; and updating the ith random number in the N1 random numbers into the checking set to execute an i +1 th iteration process, wherein the i belongs to [1, N1 ].
Alternatively, the verification module 32 may be configured to: obtaining a plurality of random numbers that the random number generator has generated prior to generating the N1 random numbers; inputting the plurality of random numbers into a model to output predicted values corresponding to the N1 random numbers through the model.
Alternatively, the verification module 32 may be configured to:
initializing a test set with a plurality of random numbers that the random number generator has generated prior to generating the N1 random numbers;
performing the following iterative process until the N1 random numbers are all verified:
inputting the random numbers contained in the inspection set into a model in the ith round of iteration process so as to output a predicted value corresponding to the ith random number in the N1 random numbers through the model;
updating the ith random number of the N1 random numbers into the check set, and removing the earliest-generated random number from the check set, to execute the (i + 1)th iteration process, i ∈ [1, N1].
Optionally, the verification module 32 may be further configured to: determining the maximum continuous successful prediction number corresponding to the N1 random numbers by comparing whether the predicted values corresponding to the N1 random numbers and the N1 random numbers are the same or not; and determining a probability value as a detection result of the N1 random numbers according to the maximum continuous success prediction quantity. Thus, the determination module 33 may specifically be configured to: and if the probability value is lower than a set threshold, determining that the quality of the random number generator is qualified.
Optionally, the apparatus further comprises: a training module to initialize a training set with N2 random numbers generated by the random number generator, N2 > 1, the N2 random numbers being different from the N1 random numbers; the following iterative process is performed until the set number of rounds is reached: acquiring a jth random number generated by the random number generator after the N2 random numbers, wherein j ∈ [1, L], and L is the set number of rounds; in the jth iteration process, the random numbers contained in the training set are input into the model, so that a predicted value corresponding to the jth random number is output through the model; if the predicted value corresponding to the jth random number is different from the jth random number, adjusting the parameters of the model with the goal of making the predicted value corresponding to the jth random number equal to the jth random number; and updating the jth random number into the training set to execute a (j + 1)th iteration process.
The random number generator discriminating apparatus shown in fig. 11 can execute the method provided in the embodiment shown in fig. 7. For parts not described in detail in this embodiment, reference may be made to the related description of the foregoing embodiments, which is not repeated here.
In one possible design, the structure of the random number generator discriminating apparatus shown in fig. 11 can be implemented as an electronic device. As shown in fig. 12, the electronic device may include a second processor 41 and a second memory 42, wherein the second memory 42 stores executable code which, when executed by the second processor 41, causes the second processor 41 to implement the random number generator discriminating method provided in the embodiment of fig. 7 described above.
The electronic device may further include a second communication interface 43 for communicating with other devices or a communication network.
In addition, an embodiment of the present invention provides a non-transitory machine-readable storage medium, on which executable code is stored, and when the executable code is executed by a processor of an electronic device, the processor is caused to execute the random number generator discrimination method provided in the foregoing embodiment shown in fig. 7.
The above-described apparatus embodiments are merely illustrative, wherein the various modules illustrated as separate components may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement the present invention without any inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by adding a necessary general hardware platform, and of course can also be implemented by a combination of hardware and software. Based on this understanding, the parts of the above technical solutions that in essence contribute beyond the prior art may be embodied in the form of a computer program product, which may be stored on one or more computer-usable storage media (including, but not limited to, magnetic disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (23)

1. A method for processing random numbers, comprising:
acquiring N random numbers currently generated by a random number generator, wherein N ≥ 1;
checking the N random numbers by using a model corresponding to the random number generator;
if it is determined, according to a check result of the N random numbers, that the N random numbers meet a randomness requirement, performing data processing by using the N random numbers;
wherein the model is composed of a convolutional neural network and a recurrent neural network which are connected in sequence.
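Claim 1's model chains a convolutional feature extractor into a recurrent network that scores the next random number. The following is a purely illustrative, pure-Python sketch of that composition; the layer sizes, random weights, and the sigmoid head are this example's assumptions, not the patent's architecture.

```python
import math
import random

class TinyConvRNN:
    """Toy stand-in for the claimed model: a 1-D convolution feeding a
    recurrent unit, ending in a sigmoid that scores the next bit.
    All sizes and initial weights here are illustrative choices."""

    def __init__(self, kernel_size=3, hidden=4, seed=0):
        rng = random.Random(seed)
        self.kernel = [rng.uniform(-1, 1) for _ in range(kernel_size)]
        self.w_in = [rng.uniform(-1, 1) for _ in range(hidden)]
        self.w_rec = [rng.uniform(-1, 1) for _ in range(hidden)]
        self.w_out = [rng.uniform(-1, 1) for _ in range(hidden)]

    def predict_prob(self, bits):
        # Convolution stage: slide the kernel over the bit sequence.
        k = len(self.kernel)
        feats = [sum(self.kernel[t] * bits[i + t] for t in range(k))
                 for i in range(len(bits) - k + 1)]
        # Recurrent stage: fold the feature sequence into a hidden state.
        h = [0.0] * len(self.w_in)
        for f in feats:
            h = [math.tanh(self.w_in[d] * f + self.w_rec[d] * h[d])
                 for d in range(len(h))]
        # Sigmoid head: probability that the next bit is 1.
        z = sum(wo * hd for wo, hd in zip(self.w_out, h))
        return 1.0 / (1.0 + math.exp(-z))

model = TinyConvRNN()
p = model.predict_prob([1, 0, 1, 1, 0, 0, 1, 0])
prediction = 1 if p >= 0.5 else 0
```

In practice the convolution and recurrence would be trained layers (e.g. a 1-D CNN feeding a GRU or LSTM in a deep-learning framework); the pure-Python layers above only show the data flow.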
2. The method of claim 1, wherein the step of checking the N random numbers comprises:
initializing a check set with a plurality of random numbers that the random number generator has generated prior to generating the N random numbers;
executing the following iterative process until all of the N random numbers have been checked:
in the ith iteration round, inputting the random numbers contained in the check set into a model, so as to output, through the model, a predicted value corresponding to the ith random number of the N random numbers;
updating the ith random number of the N random numbers into the check set, so as to execute the (i+1)th iteration round, where i ∈ [1, N].
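The iterative check recited in claim 2 — initialize a check set from earlier outputs, predict each new random number, then append the true value — can be sketched as follows. The majority-vote predictor is a hypothetical stand-in for the trained model, not part of the claim.

```python
def check_with_growing_set(prior_bits, new_bits, predict):
    """Claim-2 style check: the check set starts as bits the generator
    produced earlier and grows by one true bit per iteration round.
    `predict` stands in for the trained model."""
    check_set = list(prior_bits)
    predictions = []
    for bit in new_bits:                # i-th round predicts new_bits[i]
        predictions.append(predict(check_set))
        check_set.append(bit)           # update the check set for round i+1
    return predictions

# Hypothetical stand-in model: predict the majority bit seen so far.
def majority_predict(bits):
    return 1 if sum(bits) * 2 >= len(bits) else 0

preds = check_with_growing_set([1, 1, 0, 1], [1, 0, 1], majority_predict)
successes = sum(p == b for p, b in zip(preds, [1, 0, 1]))
```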
3. The method of claim 1, wherein the step of checking the N random numbers comprises:
obtaining a plurality of random numbers that the random number generator has generated before generating the N random numbers;
inputting the plurality of random numbers into a model, so as to output, through the model, predicted values corresponding to the N random numbers.
4. The method of claim 1, wherein the step of checking the N random numbers comprises:
initializing a check set with a plurality of random numbers that the random number generator has generated prior to generating the N random numbers;
executing the following iterative process until all of the N random numbers have been checked:
in the ith iteration round, inputting the random numbers contained in the check set into a model, so as to output, through the model, a predicted value corresponding to the ith random number of the N random numbers;
updating the ith random number of the N random numbers into the check set, and removing the earliest-generated random number from the check set, so as to execute the (i+1)th iteration round, where i ∈ [1, N].
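Claim 4 differs from claim 2 only in that the check set keeps a fixed size: the earliest-generated number is dropped each round. A sliding-window sketch (the last-bit predictor is a hypothetical stand-in for the model):

```python
from collections import deque

def check_with_sliding_window(prior_bits, new_bits, predict):
    """Claim-4 style check: the check set keeps a fixed size because the
    earliest-generated bit is removed whenever a new one is added."""
    window = deque(prior_bits)
    predictions = []
    for bit in new_bits:
        predictions.append(predict(list(window)))
        window.append(bit)      # add the i-th random number
        window.popleft()        # drop the earliest-generated one
    return predictions

# Hypothetical stand-in model: repeat the last bit in the window.
last_bit = lambda bits: bits[-1]

preds = check_with_sliding_window([0, 1, 1], [1, 0, 0], last_bit)
```

The fixed window bounds the model's input length, which matters when the CNN front end expects sequences of a fixed size.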
5. The method according to any one of claims 2 to 4, further comprising:
determining a total number of successful predictions for the N random numbers by comparing each of the N random numbers with its corresponding predicted value;
determining a first probability value as a check result of the N random numbers according to the total number of successful predictions; and
if the first probability value is lower than a set threshold, determining that the N random numbers meet the randomness requirement.
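Claim 5 does not spell out how the total number of successful predictions becomes the first probability value, so the following sketch shows two plausible constructions under that assumption: the raw prediction-success rate, and the exact binomial tail probability of seeing that many successes if the sequence were truly random (success probability 1/2 per bit).

```python
from math import comb

def success_rate(preds, truths):
    """One plausible reading of the 'first probability value': the
    fraction of bits the model predicted correctly. For a good
    generator this should sit near chance level (0.5)."""
    k = sum(p == t for p, t in zip(preds, truths))
    return k / len(truths)

def binomial_tail(n, k):
    """P(X >= k) for X ~ Binomial(n, 1/2): how surprising k or more
    correct guesses would be if the sequence were truly random."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

rate = success_rate([1, 0, 1, 1], [1, 1, 1, 0])   # 2 of 4 correct
p_tail = binomial_tail(4, 2)
```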
6. The method according to any one of claims 2 to 4, further comprising:
determining a maximum number of consecutive successful predictions for the N random numbers by comparing each of the N random numbers with its corresponding predicted value;
determining a second probability value as a check result of the N random numbers according to the maximum number of consecutive successful predictions; and
if the second probability value is lower than a set threshold, determining that the N random numbers meet the randomness requirement.
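Claim 6's statistic — the longest run of consecutive successful predictions — can be converted to an exact probability under the null hypothesis that each prediction succeeds independently with probability 1/2. The claim does not fix this mapping, so the dynamic-programming construction below is one illustrative choice.

```python
def longest_run(successes):
    """Length of the longest run of 1s (consecutive successful
    predictions) in a 0/1 success sequence."""
    run = best = 0
    for s in successes:
        run = run + 1 if s else 0
        best = max(best, run)
    return best

def prob_run_at_least(n, k):
    """Exact P(max success run >= k) over n fair coin flips, via a DP
    whose state is the length of the current trailing run of 1s."""
    if k <= 0:
        return 1.0
    counts = [1] + [0] * (k - 1)   # counts[r]: strings with trailing run r < k
    bad = 0                         # strings already containing a run >= k
    for _ in range(n):
        new = [0] * k
        new[0] = sum(counts)        # append a 0: the run resets
        for r in range(1, k):
            new[r] = counts[r - 1]  # append a 1: the run grows
        bad = 2 * bad + counts[k - 1]  # a run just reached length k
        counts = new
    return bad / 2 ** n

best = longest_run([1, 1, 0, 1])
p_run = prob_run_at_least(4, 2)
```

A small p_run means the observed run would be surprising for a truly random sequence, i.e. the model is predicting too well.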
7. The method of claim 1, further comprising:
initializing a training set with M random numbers previously generated by the random number generator, wherein M > 1;
performing the following iterative process until a set number of rounds is reached:
acquiring the jth random number generated by the random number generator after the M random numbers, where j ∈ [1, L] and L is the set number of rounds;
in the jth iteration round, inputting the random numbers contained in the training set into a model, so as to output a predicted value corresponding to the jth random number through the model;
if the predicted value corresponding to the jth random number differs from the jth random number, adjusting parameters of the model with the objective of making the predicted value match the jth random number; and
updating the jth random number into the training set so as to execute the (j+1)th iteration round.
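The online training loop of claim 7 — predict the next number, update the model only on a miss, always append the true number — can be sketched as follows. The single-bias "model" is a hypothetical stand-in for the CNN+RNN forward pass and its gradient step.

```python
def train_online(initial_bits, stream, model_predict, model_update, rounds):
    """Claim-7 style training: the training set starts with M numbers,
    each round predicts the next one, updates the model only when the
    prediction misses, and always appends the true number."""
    train_set = list(initial_bits)
    for j in range(rounds):
        bit = stream[j]                  # j-th number after the first M
        guess = model_predict(train_set)
        if guess != bit:                 # miss: push the model toward bit
            model_update(train_set, bit)
        train_set.append(bit)            # grow the set for round j+1
    return train_set

# Hypothetical stand-in model: a single bias bit, overwritten on misses.
state = {"bias": 0}
predict = lambda bits: state["bias"]
def update(bits, target):
    state["bias"] = target

result = train_online([0, 1], [1, 1, 0], predict, update, rounds=3)
```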
8. The method of claim 1, wherein the using the N random numbers for data processing comprises:
performing data encryption processing on data to be encrypted by using the N random numbers.
9. The method of claim 1, wherein the using the N random numbers for data processing comprises:
outputting data objects corresponding to the N random numbers.
10. A method for discriminating a random number generator, comprising:
acquiring N1 random numbers generated by a random number generator, wherein N1 > 1;
checking the N1 random numbers by using a model corresponding to the random number generator to obtain a check result corresponding to the N1 random numbers;
determining the quality of the random number generator according to the check result;
wherein the model is composed of a convolutional neural network and a recurrent neural network which are connected in sequence.
11. The method of claim 10, wherein the step of checking the N1 random numbers comprises:
initializing a check set with a plurality of random numbers that the random number generator has generated prior to generating the N1 random numbers;
performing the following iterative process until all of the N1 random numbers have been checked:
in the ith iteration round, inputting the random numbers contained in the check set into a model, so as to output, through the model, a predicted value corresponding to the ith random number of the N1 random numbers;
updating the ith random number of the N1 random numbers into the check set, so as to execute the (i+1)th iteration round, where i ∈ [1, N1].
12. The method of claim 10, wherein the step of checking the N1 random numbers comprises:
obtaining a plurality of random numbers that the random number generator has generated prior to generating the N1 random numbers;
inputting the plurality of random numbers into a model, so as to output, through the model, predicted values corresponding to the N1 random numbers.
13. The method of claim 10, wherein the step of checking the N1 random numbers comprises:
initializing a check set with a plurality of random numbers that the random number generator has generated prior to generating the N1 random numbers;
performing the following iterative process until all of the N1 random numbers have been checked:
in the ith iteration round, inputting the random numbers contained in the check set into a model, so as to output, through the model, a predicted value corresponding to the ith random number of the N1 random numbers;
updating the ith random number of the N1 random numbers into the check set, and removing the earliest-generated random number from the check set, so as to execute the (i+1)th iteration round, where i ∈ [1, N1].
14. The method according to any one of claims 11 to 13, wherein the step of obtaining the check result comprises:
determining a maximum number of consecutive successful predictions for the N1 random numbers by comparing each of the N1 random numbers with its corresponding predicted value; and
determining a probability value as the check result of the N1 random numbers according to the maximum number of consecutive successful predictions;
and wherein the determining the quality of the random number generator according to the check result comprises:
if the probability value is lower than a set threshold, determining that the quality of the random number generator is qualified.
15. The method of claim 10, further comprising:
initializing a training set with N2 random numbers generated by the random number generator, wherein N2 > 1 and the N2 random numbers are different from the N1 random numbers;
performing the following iterative process until a set number of rounds is reached:
acquiring the jth random number generated by the random number generator after the N2 random numbers, where j ∈ [1, L] and L is the set number of rounds;
in the jth iteration round, inputting the random numbers contained in the training set into a model, so as to output a predicted value corresponding to the jth random number through the model;
if the predicted value corresponding to the jth random number differs from the jth random number, adjusting parameters of the model with the objective of making the predicted value match the jth random number; and
updating the jth random number into the training set so as to execute the (j+1)th iteration round.
16. A method for discriminating a random number generator, comprising:
acquiring N1 random numbers generated by a first random number generator and N2 random numbers generated by a second random number generator, wherein N1 > 1 and N2 > 1;
checking the N1 random numbers by using a first model corresponding to the first random number generator to obtain a first check result corresponding to the N1 random numbers;
checking the N2 random numbers by using a second model corresponding to the second random number generator to obtain a second check result corresponding to the N2 random numbers;
selecting between the first random number generator and the second random number generator according to the first check result and the second check result;
wherein the first model and the second model are each composed of a convolutional neural network and a recurrent neural network which are connected in sequence.
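Claim 16 selects between two generators by comparing their check results, but does not fix the comparison rule. One natural construction — assumed here, not stated in the claim — is to express each check result as a prediction-success rate and prefer the generator whose rate is closest to chance, i.e. the one its own model finds hardest to predict.

```python
def pick_generator(check_results, chance=0.5):
    """Illustrative claim-16 selection: given each generator's check
    result as a prediction-success rate, prefer the generator whose
    rate is closest to chance level. The rate-based comparison is
    this sketch's assumption."""
    return min(check_results,
               key=lambda name: abs(check_results[name] - chance))

# Hypothetical rates: the quantum generator is near-unpredictable,
# the pseudo-random one is partly predictable by its model.
best = pick_generator({"quantum_rng": 0.51, "pseudo_rng": 0.74})
```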
17. The method of claim 16, wherein the first random number generator is a quantum random number generator and the second random number generator is a pseudo-random number generator.
18. A random number processing apparatus, comprising:
an acquiring module, configured to acquire N random numbers currently generated by a random number generator, wherein N ≥ 1;
a checking module, configured to check the N random numbers by using a model corresponding to the random number generator, the model being composed of a convolutional neural network and a recurrent neural network which are connected in sequence; and
a processing module, configured to perform data processing by using the N random numbers if it is determined, according to a check result of the N random numbers, that the N random numbers meet a randomness requirement.
19. An electronic device, comprising: a memory, a processor; wherein the memory has stored thereon executable code which, when executed by the processor, causes the processor to carry out the random number processing method of any of claims 1 to 9.
20. A random number generator discriminating device, comprising:
an acquiring module, configured to acquire N1 random numbers generated by a random number generator, wherein N1 > 1;
a checking module, configured to check the N1 random numbers by using a model corresponding to the random number generator to obtain a check result corresponding to the N1 random numbers, the model being composed of a convolutional neural network and a recurrent neural network which are connected in sequence; and
a determining module, configured to determine the quality of the random number generator according to the check result.
21. An electronic device, comprising: a memory, a processor; wherein the memory has stored thereon executable code which, when executed by the processor, causes the processor to perform the random number generator discriminating method of any of claims 10 to 15.
22. A method for processing random numbers, comprising:
acquiring N random numbers currently generated by a random number generator, wherein N ≥ 1;
checking the N random numbers by using a model corresponding to the random number generator, the model being composed of a convolutional neural network and a recurrent neural network which are connected in sequence; and
if it is determined, according to a check result of the N random numbers, that the N random numbers do not meet a randomness requirement, acquiring another N random numbers generated by the random number generator, or updating the random number generator.
23. The method of claim 22, wherein updating the random number generator comprises:
selecting another random number generator, or retraining the random number generator.
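The fallback logic of claims 22–23 — on a failed check, either draw a fresh batch from the same generator or switch to another generator — can be sketched as follows. All names, the pool structure, and the toy check are illustrative assumptions.

```python
def handle_failed_check(generator_pool, current, regenerate, check):
    """Claims 22-23 sketch: when a batch fails the randomness check,
    first retry the same generator with a fresh batch, then fall back
    to the other generators in the pool."""
    batch = regenerate(current)          # first retry: another N numbers
    if check(batch):
        return current, batch
    for alt in generator_pool:           # then: switch generators
        if alt == current:
            continue
        batch = regenerate(alt)
        if check(batch):
            return alt, batch
    raise RuntimeError("no generator produced an acceptable batch")

# Hypothetical stubs: 'prng' always emits a constant (failing) batch,
# 'qrng' emits a batch the toy check accepts.
regenerate = lambda g: [1, 0, 1] if g == "qrng" else [1, 1, 1]
check = lambda batch: len(set(batch)) > 1     # toy check: not constant
chosen, batch = handle_failed_check(["prng", "qrng"], "prng",
                                    regenerate, check)
```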
CN201911202755.9A 2019-11-29 2019-11-29 Random number generator judging method, random number processing method, device and equipment Pending CN112882683A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911202755.9A CN112882683A (en) 2019-11-29 2019-11-29 Random number generator judging method, random number processing method, device and equipment

Publications (1)

Publication Number Publication Date
CN112882683A (en) 2021-06-01

Family

ID=76039240

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106528048A (en) * 2016-11-02 2017-03-22 北京旷视科技有限公司 Method and apparatus for assessing quality of random number generator
CN108734614A (en) * 2017-04-13 2018-11-02 腾讯科技(深圳)有限公司 Traffic congestion prediction technique and device, storage medium
CN109978228A (en) * 2019-01-31 2019-07-05 中南大学 A kind of PM2.5 concentration prediction method, apparatus and medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CAI Yanning, WANG Hongqiao, YE Xuemei: "Support Vector Machine Modeling and Fault Prediction for Complex Systems", 30 April 2015, page 52 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115378589A (en) * 2022-10-26 2022-11-22 北京惠朗时代科技有限公司 Method, apparatus, device and medium for testing randomness of binary key
CN115378589B (en) * 2022-10-26 2023-01-13 北京惠朗时代科技有限公司 Method, apparatus, device and medium for testing randomness of binary key

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination