CN112668501A - Autoencoder feature extraction method and device based on blockchain incentives - Google Patents

Autoencoder feature extraction method and device based on blockchain incentives

Info

Publication number
CN112668501A
Authority
CN
China
Prior art keywords
request
image data
training
block chain
mapping relation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011624852.XA
Other languages
Chinese (zh)
Inventor
李伟
邱炜伟
蔡亮
匡立中
张帅
李吉明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Qulian Technology Co Ltd
Original Assignee
Hangzhou Qulian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Qulian Technology Co Ltd filed Critical Hangzhou Qulian Technology Co Ltd
Priority to CN202011624852.XA priority Critical patent/CN112668501A/en
Publication of CN112668501A publication Critical patent/CN112668501A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of blockchains, and provides an autoencoder feature extraction method, apparatus and system based on blockchain incentives.

Description

Autoencoder feature extraction method and device based on blockchain incentives
Technical Field
The present application relates to the field of blockchain technologies, and in particular, to a method, a system, an apparatus, a computer device, and a computer-readable storage medium for autoencoder feature extraction based on blockchain incentives.
Background
An autoencoder is an unsupervised neural network structure whose aim is to obtain a hidden-layer description of an image. This description is obtained through a certain mapping relation (the encoding process) and can be restored into the original data (the decoding process); it is often used as an image feature description.
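As a toy illustration of the encode/decode mapping just described (not the model used in the embodiments), a single-hidden-layer numerical sketch follows; the dimensions and random weights are arbitrary.

```python
import numpy as np

# Toy single-hidden-layer autoencoder: encode (the mapping relation) and decode
# (the reconstruction). Dimensions and random weights are arbitrary.
rng = np.random.default_rng(0)
D, d, M = 64, 16, 100             # input dim, hidden dim, number of images
X = rng.random((D, M))            # columns are flattened image pixel vectors

W_enc = rng.normal(scale=0.1, size=(d, D))
W_dec = rng.normal(scale=0.1, size=(D, d))

def encode(X):
    # hidden-layer description of the images obtained through the mapping relation
    return np.tanh(W_enc @ X)

def decode(Y):
    # restore an approximation of the original data from the hidden description
    return W_dec @ Y

Y = encode(X)                     # feature description of the images
X_rec = decode(Y)                 # a good autoencoder makes X_rec close to X
print(np.linalg.norm(X - X_rec) / np.linalg.norm(X))
```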
An excellent autoencoder can, after encoding and decoding, obtain a result as close as possible to the original data. To achieve this technical effect, the autoencoder needs a large amount of data to train the mapping relation. However, how to acquire a large amount of data for training the mapping relation and how to prevent the edge devices from misusing the image data are technical difficulties that trouble those skilled in the art.
In summary, the prior art faces the technical problems of how to acquire a large amount of data to train the mapping relation and how to prevent misuse of the image data by the edge devices.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides an autoencoder feature extraction method, apparatus, system, computer device, and computer-readable storage medium based on blockchain incentives.
An autoencoder feature extraction method based on blockchain incentives comprises the following steps:
parsing requests of a plurality of requesting devices, the requests including a device identification and image data;
judging the request, and if the request is model training, acquiring the image data to train the mapping relation model;
uploading the image data and the identification code thereof, together with the trained mapping relation model, to a blockchain;
and issuing tokens to the training-requesting device according to the device identification.
An autoencoder feature extraction system based on blockchain incentives comprises: a plurality of requesting devices, a server, and a blockchain;
the server parses requests of the plurality of requesting devices, the requests including a device identification and image data;
the server judges the request, and if the request is model training, the server acquires the image data to train the mapping relation model;
the server uploads the image data and the identification code thereof, together with the trained mapping relation model, to the blockchain;
and the server issues tokens to the training-requesting device according to the device identification.
An autoencoder feature extraction apparatus based on blockchain incentives comprises:
a parsing module, configured to parse requests of a plurality of requesting devices, the requests including a device identification and image data;
a training judgment module, configured to judge the request, and if the request is model training, acquire the image data to train the mapping relation model;
an uplink module, configured to upload the image data and the identification code thereof, together with the trained mapping relation model, to the blockchain;
and a token issuing module, configured to issue tokens to the training-requesting device according to the device identification.
A computer device comprises a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the above method.
A computer-readable storage medium stores a computer program which, when executed by a processor, implements the above method.
The invention provides a method, an apparatus, and a system for autoencoder feature extraction based on blockchain incentives.
Drawings
FIG. 1 is a framework diagram of a blockchain-incentive-based autoencoder feature extraction system according to an embodiment;
FIG. 2 is a flowchart of a blockchain-incentive-based autoencoder feature extraction method according to an embodiment;
FIG. 3 is another flowchart of a blockchain-incentive-based autoencoder feature extraction method according to an embodiment;
FIG. 4 is a schematic diagram of a blockchain-incentive-based autoencoder feature extraction apparatus according to an embodiment;
FIG. 5 is a schematic block diagram of a computer device according to an embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The autoencoder feature extraction method based on blockchain incentives provided by this embodiment can be applied to the application environment shown in FIG. 1, in which an edge computing device communicates with a server through a network, and the server runs in a blockchain system. The edge computing device may be, but is not limited to, various personal computers, laptops, smartphones, tablets, and portable wearable devices. The server may be implemented as a stand-alone server or as a server cluster composed of multiple servers.
In one embodiment, as shown in FIG. 2 and FIG. 3, an autoencoder feature extraction method based on blockchain incentives is provided. Taking the server in FIG. 1 as an example, the method includes the following steps:
s1, resolving requests of a plurality of request devices, wherein the requests comprise device identifications and image data;
s2, judging the request, and if the request is model training, acquiring an image data training mapping relation model;
s3, uploading image data and identification codes thereof and the trained mapping relation model to a block chain;
and S4, issuing tokens to the training request device according to the device identification.
In this embodiment, the requests of a plurality of requesting devices are parsed, the requests including a device identification and image data, and each request is judged. If the request is model training, the image data is acquired to train the mapping relation model, the image data and its identification code, together with the trained mapping relation model, are uploaded to the blockchain, and tokens are issued to the training-requesting device according to the device identification so as to incentivize it. In this way, edge devices are incentivized to provide a large amount of data to train the mapping relation model, and uploading the image data and the trained mapping relation model to the blockchain prevents the edge devices from misusing the image data.
In step S1, the edge computing device may send a specific request to the server, and is therefore also referred to as a requesting device. According to the specific request, a requesting device may be called a feature-requesting device (one that wants to acquire image features) or a training-requesting device (one that provides data to train the mapping relation model).
In addition, the device identification indicates the identity of the device in the communication, for example a device serial number. When a device wants to communicate with the server and send a specific request, the serial number can be sent to the blockchain for verification, and only a verified device may make requests to the server.
For example, before the edge computing device communicates with the server and sends a specific request, it may send its device serial number to the server; the server generates digest (summary) information from the serial number, which is uploaded to the blockchain after verification by the PBFT consensus mechanism.
It should be noted that the edge computing device may upload its device information to the blockchain through the server. Therefore, before responding to a specific request, the server initiates information verification on the blockchain to confirm the legal identity of the requesting device, and then either responds to or rejects the request. In this way, the blockchain guarantees the authenticity and privacy of the on-chain information, and abuse of the server can be avoided.
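A minimal sketch of this registration and verification flow is given below, assuming a generic chain-side client; the class name, the method names (submit_digest, is_verified) and the use of SHA-256 for the digest are illustrative assumptions, and the PBFT consensus itself is not modeled.

```python
import hashlib

class BlockchainClient:
    """Placeholder chain-side client; real verification would go through PBFT."""
    def __init__(self) -> None:
        self._verified = set()

    def submit_digest(self, digest: str) -> None:
        # stand-in for uploading the digest after PBFT consensus verification
        self._verified.add(digest)

    def is_verified(self, digest: str) -> bool:
        return digest in self._verified

def device_digest(serial_number: str) -> str:
    # summary (digest) information the server derives from the device serial number
    return hashlib.sha256(serial_number.encode("utf-8")).hexdigest()

chain = BlockchainClient()
chain.submit_digest(device_digest("EDGE-DEVICE-0001"))        # registration
print(chain.is_verified(device_digest("EDGE-DEVICE-0001")))   # later request check
```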
In step S2, the request of the edge computing device is judged; if the request is model training, the image data is acquired to train the mapping relation model, so that the mapping relation model obtains a large amount of training data and is thereby optimized.
It should be noted that training the mapping relation model may refer to providing data with correct answers so as to obtain the mapping relation model.
In addition, if the request of the edge computing device is model training, the registration status of the training-requesting device is checked; if the training-requesting device is not registered, it is registered, so that devices providing training data can enter the blockchain system and participate in the training contribution and incentive, achieving the technical effect of acquiring more training data.
It should be noted that when the edge computing device requests acquisition of image-data features, the number of tokens owned by the feature-requesting device is checked; if the number of tokens is sufficient, feature extraction is performed on the image data to be computed and on the image data on the blockchain through the mapping relation model, and the feature data are fed back to the feature-requesting device.
When the number of tokens of the edge computing device is found to be insufficient, an error is prompted to the feature-requesting device, which avoids abuse of the image data and incentivizes the feature-requesting device to upload image data and participate in training the mapping relation model, so as to optimize the model.
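To make the request-handling flow concrete, a hypothetical sketch of the server-side dispatch (steps S1 to S4 plus the feature-acquisition branch) follows; the request fields, the token accounting, the one-token price of a feature request, and the train/upload/extract helpers are all illustrative placeholders rather than the patented implementation.

```python
# Hypothetical sketch of the server-side request handling (S1 to S4 plus the
# feature-acquisition branch). All names and values are illustrative.
def train_mapping_model(image_data):          # placeholder: autoencoder training
    pass

def upload_to_chain(image_data, device_id):   # placeholder: S3, on-chain upload
    pass

def extract_features(image_data):             # placeholder: encoder features
    return []

tokens = {}          # device identification -> token balance
FEATURE_COST = 1     # assumed price of one feature-extraction request

def handle_request(request):
    device_id = request["device_id"]            # S1: parse device identification
    image_data = request.get("image_data")

    if request["type"] == "model_training":     # S2: training branch
        tokens.setdefault(device_id, 0)         # register an unknown training device
        train_mapping_model(image_data)         # S2: use the data to train the model
        upload_to_chain(image_data, device_id)  # S3: image data + model on chain
        tokens[device_id] += 1                  # S4: issue tokens as the incentive
        return {"status": "trained"}

    if request["type"] == "feature_acquisition":
        if tokens.get(device_id, 0) < FEATURE_COST:
            return {"status": "error", "reason": "insufficient tokens"}
        tokens[device_id] -= FEATURE_COST
        return {"status": "ok", "features": extract_features(image_data)}

    return {"status": "error", "reason": "unknown request type"}

print(handle_request({"type": "model_training", "device_id": "edge-01",
                      "image_data": b"..."}))
print(handle_request({"type": "feature_acquisition", "device_id": "edge-01",
                      "image_data": b"..."}))
```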
In one embodiment, a preferred training scheme for the mapping relation model is provided.
The training goal of the mapping relation is to minimize an objective of the form shown in image BDA0002874610170000061, where X represents the image pixel values of the image data, α represents the weight of the locality-preservation term, the quantity shown in image BDA0002874610170000062 is the data after the m-th encoding, L represents the Laplacian matrix of the locality relationship among the data, W (image BDA0002874610170000063) is defined as the mapping that reconstructs the input, and W can be solved in closed form by least squares (image BDA0002874610170000064).
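For readability, one plausible form of such a locality-preserving reconstruction objective is sketched below; this is an assumption inferred from the stated definitions, not the equation recorded in the image.

$$\min_{W}\; \bigl\| X - W\,\hat{Y}^{(m)} \bigr\|_F^{2} \;+\; \alpha\, \operatorname{tr}\!\bigl( \hat{Y}^{(m)}\, L\, \hat{Y}^{(m)\top} \bigr), \qquad W = X\,\hat{Y}^{(m)\top}\bigl( \hat{Y}^{(m)} \hat{Y}^{(m)\top} \bigr)^{-1},$$

where the columns of X are the image pixel vectors, the columns of the m-th encoding form the matrix in the second term, and the closed-form W is the least-squares reconstruction mapping.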
Calculating the Laplacian matrix L comprises the following steps:
first, the Euclidean distance between the features is calculated pairwise (the formula is given as image BDA0002874610170000065);
second, a distance threshold K is set; for each image datum, the other images within distance K are regarded as its neighbors, and the mean d and standard deviation s of these distances are calculated;
third, a matrix A is generated, with element default value 0;
fourth, a weight a is computed from the distances (the formula is given as image BDA0002874610170000066), and the entries of the i-th column of matrix A corresponding to the k neighbor samples are set to a, that is, as shown in image BDA0002874610170000071;
fifth, a matrix H is generated with element default value 0, and, using the neighbor result of the second step, the entries of the i-th column of H corresponding to the k neighbor samples are set to 1, that is, as shown in image BDA0002874610170000072;
sixth, a matrix Dv is generated with element default value 0, whose diagonal entries are the column sums of matrix A;
seventh, a matrix De is generated with element default value 0, whose diagonal entries are the column sums of matrix H;
eighth, the matrix L is calculated as shown in image BDA0002874610170000073, where I is an identity matrix of size M × M.
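The following sketch implements the eight steps above in Python; because the weight formula and the final expression for L are only available as images in the record, a Gaussian weight and the standard hypergraph Laplacian L = I − Dv^(-1/2) A De^(-1) Hᵀ Dv^(-1/2) are assumed here.

```python
import numpy as np

def hypergraph_laplacian(features: np.ndarray, K: float) -> np.ndarray:
    """Laplacian of the locality relationship among M feature vectors (rows)."""
    M = features.shape[0]
    # first: pairwise Euclidean distances between the features
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)

    A = np.zeros((M, M))   # third: weighted neighbourhood matrix, default 0
    H = np.zeros((M, M))   # fifth: 0/1 incidence matrix, default 0
    for i in range(M):
        # second: neighbours of sample i within the distance threshold K
        nbr = np.where((dist[i] <= K) & (np.arange(M) != i))[0]
        if nbr.size == 0:
            nbr = np.array([i])
        d_mean, s = dist[i, nbr].mean(), dist[i, nbr].std() + 1e-12
        # fourth: the exact weight formula is an image in the record; a Gaussian
        # weight of the distances is assumed here (d_mean computed per the text)
        A[nbr, i] = np.exp(-dist[i, nbr] ** 2 / (2.0 * s ** 2))
        H[nbr, i] = 1.0    # fifth: mark the same k neighbours in column i of H

    dv = A.sum(axis=0)     # sixth: diagonal of Dv = column sums of A
    de = H.sum(axis=0)     # seventh: diagonal of De = column sums of H
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.clip(dv, 1e-12, None)))
    De_inv = np.diag(1.0 / np.clip(de, 1e-12, None))
    # eighth: assumed standard hypergraph Laplacian, with I of size M x M
    return np.eye(M) - Dv_inv_sqrt @ A @ De_inv @ H.T @ Dv_inv_sqrt

L = hypergraph_laplacian(np.random.default_rng(0).random((20, 8)), K=1.0)
print(L.shape)  # (20, 20)
```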
In step S3, the image data and its identification code, together with the trained mapping relation model, are uploaded to the blockchain, so that, based on the blockchain consensus mechanism, misuse of the image data by edge devices can be avoided.
It should be noted that the specific on-chain data scheme can be carried out on the basis of the PoW consensus mechanism using MD5 codes. The "MD" in MD5 stands for Message Digest; the digest is not an abbreviation of the message content, but a 128-bit feature code obtained by mathematically transforming the original message according to the published MD5 algorithm.
It should also be noted that uploading the above data to the blockchain prevents the image data and the mapping relation model from being misused by unverified devices.
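As a small illustration of the identification code, the sketch below computes the 128-bit MD5 digest of a piece of image data; the byte string is a placeholder, and the PoW consensus step that actually puts the record on the chain is not shown.

```python
import hashlib

# MD5 identification code of a piece of image data, as stated above.
def identification_code(image_bytes: bytes) -> str:
    return hashlib.md5(image_bytes).hexdigest()   # 128-bit digest, hex-encoded

print(identification_code(b"example image bytes"))  # 32 hexadecimal characters
```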
In step S4, tokens are issued to the training-requesting device according to the device identification in order to incentivize it, so that a large amount of image data can be obtained to train the mapping relation model, achieving the technical effect of optimizing the model.
It should be noted that a token is a virtual accounting unit, a form of electronic virtual currency. The token plays the role of a symbol; it can also be presented in the form of points and the like, and can be used flexibly according to the situation.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
In an embodiment, referring to FIG. 4, an autoencoder feature extraction apparatus based on blockchain incentives is provided, and the apparatus corresponds one-to-one to the autoencoder feature extraction method based on blockchain incentives in the above embodiments. As shown in FIG. 4, the apparatus includes:
a parsing module 1, configured to parse requests of a plurality of requesting devices, the requests including a device identification and image data;
a training judgment module 2, configured to judge the request, and if the request is model training, acquire the image data to train the mapping relation model;
an uplink module 3, configured to upload the image data and the identification code thereof, together with the trained mapping relation model, to the blockchain;
and a token issuing module 4, configured to issue tokens to the training-requesting device according to the device identification.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules explicitly listed, but may include other steps or modules not explicitly listed or inherent to such process, method, article, or apparatus, and that the division of modules presented in this application is merely a logical division and may be implemented in a practical application in another manner.
It should be further noted that, for specific definitions and descriptions of the above apparatus, reference may be made to the definitions and descriptions of the blockchain-incentive-based autoencoder feature extraction method above, and details are not repeated here. Each module in the above apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded, in hardware form, in the processor of the computer device or independent of it, or stored, in software form, in the memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, referring to FIG. 4 and FIG. 5, a computer device is provided, which may be a server running the parsing module 1, the training judgment module 2, the uplink module 3, and the token issuing module 4. Its internal structure may be as shown in FIG. 5. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store the data involved in the blockchain-incentive-based autoencoder feature extraction method. The network interface of the computer device is used to communicate with external terminals through a network connection. The computer program is executed by the processor to implement the autoencoder feature extraction method based on blockchain incentives.
In one embodiment, a computer device is provided, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor. When executing the computer program, the processor implements the steps of the blockchain-incentive-based autoencoder feature extraction method in the above embodiments, such as steps S1 to S4 shown in FIG. 2, and other extensions of the method and of related steps. Alternatively, when executing the computer program, the processor implements the functions of the modules/units of the blockchain-incentive-based autoencoder feature extraction apparatus in the above embodiment, such as the functions of modules 1 to 4 shown in FIG. 4. To avoid repetition, details are not repeated here.
The processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor; the processor is the control center of the computer device, connecting the various parts of the whole computer device through various interfaces and lines.
The memory may be used to store the computer program and/or modules, and the processor implements the various functions of the computer device by running or executing the computer program and/or modules stored in the memory and by invoking data stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required by at least one function (such as a sound playing function or an image playing function); the data storage area may store data created according to the use of the device (such as audio data and video data).
The memory may be integrated in the processor or may be provided separately from the processor.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. When executed by a processor, the computer program implements the steps of the blockchain-incentive-based autoencoder feature extraction method in the above embodiments, such as steps S1 to S4 shown in FIG. 2, and other extensions of the method and of related steps. Alternatively, when executed by a processor, the computer program implements the functions of the modules/units of the blockchain-incentive-based autoencoder feature extraction apparatus in the above embodiment, such as the functions of modules 1 to 4 shown in FIG. 4. To avoid repetition, details are not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. An autoencoder feature extraction method based on blockchain incentives, characterized by comprising the following steps:
parsing requests of a plurality of requesting devices, the requests including a device identification and image data;
judging the request, and if the request is model training, acquiring the image data to train the mapping relation model;
uploading the image data and the identification code thereof, together with the trained mapping relation model, to a blockchain;
and issuing tokens to the training-requesting device according to the device identification.
2. The method of claim 1, further comprising:
if the request is image-data feature acquisition, judging the number of tokens owned by the feature-requesting device;
and if the number of tokens is sufficient, performing feature extraction on the image data to be computed and on the image data on the blockchain through the mapping relation model, so as to feed the feature data back to the feature-requesting device.
3. The method of claim 2, wherein if the number of tokens is insufficient, an error is prompted to the feature-requesting device.
4. The method of claim 1, comprising:
if the request is model training, judging the registration status of the training-requesting device;
and if the training-requesting device is not registered, registering the training-requesting device.
5. The method of claim 1, comprising:
characterizing the mapping relation model by an objective of the form shown in image FDA0002874610160000011, wherein X represents the image pixel values of the image data, α represents the weight of the locality-preservation term, the quantity shown in image FDA0002874610160000012 is the data after the m-th encoding, L represents the Laplacian matrix of the locality relationship between the data, W (image FDA0002874610160000013) is defined as the mapping that reconstructs the input, and the Laplacian matrix is calculated as shown in image FDA0002874610160000014.
6. The method of claim 1, wherein the identification code is an MD5 code, and uploading the image data and the identification code thereof, together with the trained mapping relation model, to the blockchain is based on the PoW consensus mechanism.
7. An autoencoder feature extraction system based on blockchain incentives, characterized by comprising: a plurality of requesting devices, a server, and a blockchain;
wherein the server parses requests of the plurality of requesting devices, the requests including a device identification and image data;
the server judges the request, and if the request is model training, the server acquires the image data to train the mapping relation model;
the server uploads the image data and the identification code thereof, together with the trained mapping relation model, to the blockchain;
and the server issues tokens to the training-requesting device according to the device identification.
8. An autoencoder feature extraction apparatus based on blockchain incentives, characterized by comprising:
a parsing module, configured to parse requests of a plurality of requesting devices, the requests including a device identification and image data;
a training judgment module, configured to judge the request, and if the request is model training, acquire the image data to train the mapping relation model;
an uplink module, configured to upload the image data and the identification code thereof, together with the trained mapping relation model, to the blockchain;
and a token issuing module, configured to issue tokens to the training-requesting device according to the device identification.
9. A computer device, comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, performs the method of any one of claims 1-6.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method of any one of claims 1-6.
CN202011624852.XA 2020-12-30 2020-12-30 Automatic encoder feature extraction method and device based on block chain excitation Pending CN112668501A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011624852.XA CN112668501A (en) 2020-12-30 2020-12-30 Automatic encoder feature extraction method and device based on block chain excitation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011624852.XA CN112668501A (en) 2020-12-30 2020-12-30 Automatic encoder feature extraction method and device based on block chain excitation

Publications (1)

Publication Number Publication Date
CN112668501A true CN112668501A (en) 2021-04-16

Family

ID=75412287

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011624852.XA Pending CN112668501A (en) 2020-12-30 2020-12-30 Automatic encoder feature extraction method and device based on block chain excitation

Country Status (1)

Country Link
CN (1) CN112668501A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104899921A (en) * 2015-06-04 2015-09-09 杭州电子科技大学 Single-view video human body posture recovery method based on multi-mode self-coding model
CN109716346A (en) * 2016-07-18 2019-05-03 河谷生物组学有限责任公司 Distributed machines learning system, device and method
CN108323200A (en) * 2018-01-25 2018-07-24 深圳前海达闼云端智能科技有限公司 Data training method and device based on block chain, storage medium and block chain link points
CN109685501A (en) * 2018-12-04 2019-04-26 暨南大学 Based on secret protection deep learning platform construction method auditable under block chain incentive mechanism
CN111723147A (en) * 2019-03-21 2020-09-29 杭州海康威视数字技术股份有限公司 Block chain-based data training method, device and equipment and storage medium
CN110490330A (en) * 2019-08-16 2019-11-22 安徽航天信息有限公司 A kind of distributed machines learning system based on block chain
CN111062928A (en) * 2019-12-19 2020-04-24 安徽威奥曼机器人有限公司 Method for identifying lesion in medical CT image
CN111125784A (en) * 2019-12-24 2020-05-08 山东爱城市网信息技术有限公司 Artificial intelligence training model method, device and medium based on block chain

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
唐朝辉 et al.: "Multi-label feature extraction based on auto-encoder and hypergraph learning" (基于自编码器及超图学习的多标签特征提取), Acta Automatica Sinica (《自动化学报》), vol. 42, no. 7, 31 July 2016 (2016-07-31), pages 1014-1021 *

Similar Documents

Publication Publication Date Title
KR102069759B1 (en) Dynamic Updates for CAPTCHA Challenges
CN110490594B (en) Service data processing method and device, computer equipment and storage medium
CN111275448A (en) Face data processing method and device and computer equipment
CN110691085B (en) Login method, login device, password management system and computer readable medium
WO2020248687A1 (en) Method and apparatus for preventing malicious attack
CN112818300A (en) Electronic contract generating method and device, computer equipment and storage medium
CN110223075B (en) Identity authentication method and device, computer equipment and storage medium
CN111709413A (en) Certificate verification method and device based on image recognition, computer equipment and medium
CN109286933B (en) Authentication method, device, system, computer equipment and storage medium
CN112699871A (en) Method, system, device and computer readable storage medium for field content identification
CN110674488B (en) Verification code identification method, system and computer equipment based on neural network
CN112329557A (en) Model application method and device, computer equipment and storage medium
CN108616362A (en) Vote information generation method and device
CN115391188A (en) Scene test case generation method, device, equipment and storage medium
CN108234454A (en) A kind of identity identifying method, server and client device
CN114780977A (en) File processing method, device, equipment and storage medium
CN110162957B (en) Authentication method and device for intelligent equipment, storage medium and electronic device
CN108566371B (en) Social authentication method, system and terminal equipment
CN112668501A (en) Automatic encoder feature extraction method and device based on block chain excitation
CN115033848B (en) Device identification method and device, electronic device and storage medium
CN114238914A (en) Digital certificate application system, method, device, computer equipment and storage medium
CN112330452B (en) Transaction data processing method, device, computer equipment and storage medium
CN110796548B (en) Asset transaction method and device
CN113411355A (en) Internet-based application registration method and related device
CN111988336A (en) Access request processing method, device and system and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination