CN115150439B - Method and system for analyzing perception data, storage medium and electronic equipment - Google Patents
- Publication number
- CN115150439B (application CN202211071916.7A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04L67/12 — Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- G06N3/082 — Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
- G16Y30/10 — IoT infrastructure; security thereof
- G16Y40/50 — Safety; security of things, users, data or systems
- H04L63/0428 — Network architectures or protocols for providing a confidential data exchange wherein the data content is protected, e.g. by encrypting or encapsulating the payload
- H04L69/22 — Parsing or analysis of headers
Abstract
The invention relates to the technical field of data processing of the Internet of things, and discloses a method and a system for analyzing perception data, a storage medium and electronic equipment, wherein the method comprises the following steps: acquiring a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data in various data forms acquired by corresponding perception terminals; extracting information of perception data in each data form in a perception data set by using a pre-constructed deep learning neural network to obtain perception data information; and constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data. The method improves the accuracy of multi-data form perception data analysis and the adaptivity of data analysis.
Description
Technical Field
The invention relates to the technical field of data processing of the Internet of things, in particular to a method and a system for analyzing perception data, a storage medium and electronic equipment.
Background
The sensing terminal is an important component of the internet of things system, and is also a common device in the sensing layer of the internet of things. There are many methods for data analysis at the intelligent sensing terminal, but the related art has the following technical problems: poor adaptivity and inaccurate data analysis.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art. Therefore, an object of the present invention is to provide a method for parsing sensing data, which improves the accuracy of parsing sensing data in multiple data forms and the adaptivity of data parsing.
A second objective of the present invention is to provide a system for parsing sensed data.
A third object of the invention is to propose a computer-readable storage medium.
A fourth object of the invention is to propose an electronic device.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a method for parsing perceptual data, where the method includes: acquiring a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data in various data forms acquired by corresponding perception terminals; extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in each hidden layer are determined according to the data form of the perception data in each data form; and constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data.
According to the method for parsing perception data of the embodiment of the invention, the optimal number of hidden layers and the optimal number of neurons are selected for the deep learning neural network according to the data form of the perception data subset, so that the accuracy of perception data information extraction is improved; by constructing the adaptive analysis model and using it to parse the perception data information, the accuracy of parsing the perception data of the sensing terminals and the adaptivity of data parsing are improved.
In addition, the method for parsing perceptual data according to the above embodiment of the present invention may further have the following additional technical features:
according to one embodiment of the invention, the acquiring the perception data set comprises: receiving the encrypted sensing data in multiple data forms of a plurality of sensing terminals, wherein each encrypted sensing data is obtained by encrypting according to an encryption algorithm corresponding to the data form of the encrypted sensing data; and calling a decryption algorithm corresponding to the data form of each encrypted sensing data to decrypt to obtain the sensing data set.
According to one embodiment of the invention, the sensing data of each data form are arranged in a preset sequence, wherein the corresponding decryption algorithm is called in sequence according to the preset sequence for decryption.
According to one embodiment of the invention, the deep learning neural network comprises an input layer, a hidden layer and an output layer which are connected in sequence,
The corresponding output of the input layer is: $C = W_{in} \cdot Da + B_{in}$, wherein $Da$ represents the input matrix of a plurality of data forms, $W_{in}$ represents the weight matrix of the input layer, and $B_{in}$ represents the bias matrix of the input layer;
the input of the a-th hidden layer is: $z_a = U \cdot C + \sum_{j=1}^{a-1} W_{j,a} \cdot h_j + b_h$, wherein $U$ represents the weight matrix from the input layer to the hidden layer, $b_h$ represents the bias of the hidden layer, $W_{a-1,a}$ represents the weight matrix from the (a-1)-th hidden layer to the a-th hidden layer, $W_{a-1,a} \cdot h_{a-1}$ represents the part of the output of the (a-1)-th hidden layer transferred to the a-th hidden layer, $h_j$ represents the state of the j-th hidden layer, and each weight matrix between two hidden layers is determined according to the relationship between the neurons of the two layers;
the input of the output layer is: $z_o = V \cdot h_n + b_o$, wherein $V$ represents the weight matrix from the hidden layer to the output layer and $b_o$ represents the bias of the output layer;
the corresponding output of the output layer is: $Y = \sigma(z_o)$, wherein $\sigma$ is the activation function and $Y$ represents the perception data information.
According to an embodiment of the invention, the method further comprises: constructing a quantity adaptation model, and determining the number of hidden layers of the deep learning neural network and the number of neurons contained in each hidden layer by using the quantity adaptation model, wherein the quantity adaptation model is:
$YK^{*} = SL(Da, F, YK, QW, B, YS)$
wherein $SL$ represents the quantity adaptation model, $Da$ represents the input matrix of multiple data forms, $F$ represents the data-form decomposition matrix, $YK$ represents the matrix of the number of hidden layers and the corresponding numbers of neurons, $QW$ represents the weight matrix of the hidden layers, $B$ represents the bias matrix of the hidden layers, $YS$ represents the constraint conditions, and $YK^{*}$ represents the optimal quantity adaptation result.
According to an embodiment of the invention, when the deep learning neural network is constructed in advance, a cross-entropy loss function is adopted for construction, wherein the expression of the cross-entropy loss function at time t is:
$L_t = -\left[\, y_t \log \hat{y}_t + (1 - y_t) \log(1 - \hat{y}_t) \,\right]$
wherein $y_t$ represents the true result value at time t and $\hat{y}_t$ the predicted result value at time t; the global loss function over P moments is:
$L = \sum_{t=1}^{P} L_t$
According to an embodiment of the present invention, parsing the sensing data information by using the adaptive analysis model to obtain an analysis result comprises: classifying the perception data information to obtain a perception data information matrix; and extracting data information from the perception data information matrix by using the adaptive analysis model, and calling a corresponding data parsing method to parse the extracted data information to obtain the analysis result.
According to an embodiment of the present invention, the perception data information matrix is:
$ZData = [zdata_1, zdata_2, \ldots, zdata_N]^{T}$
wherein $ZData$ represents the perception data information matrix, $zdata_i = \{zdata_{i1}, zdata_{i2}, \ldots, zdata_{iM}\}$ represents the perception data information set of the i-th sensing terminal, $1 \le i \le N$, N represents the number of sensing terminals, and M represents the number of data forms of the perception data contained in the i-th sensing terminal.
According to an embodiment of the present invention, the adaptive analysis model is:
$JG = FX(ZData, ST, JX)$
wherein $FX$ represents the adaptive analysis model, $ZData$ represents the perception data information matrix, $ST$ represents the data information extraction set, $JX$ represents the data parsing method set, and $JG$ represents the data parsing output result.
In order to achieve the above object, a second aspect of the present invention provides a system for parsing sensing data, where the system includes: a positioning identification module, configured to acquire a sensing data set, wherein the sensing data set comprises sensing data subsets of a plurality of sensing terminals, and the sensing data subsets comprise sensing data in a plurality of data forms acquired by the corresponding sensing terminals; an extraction module, configured to extract information of the sensing data of each data form in the sensing data set by using a pre-constructed deep learning neural network to obtain sensing data information, wherein the number of hidden layers of the deep learning neural network and the number of neurons in each hidden layer are determined according to the data form of the sensing data of each data form; and a data analysis module, configured to parse the sensing data information by using a pre-constructed adaptive analysis model to obtain an analysis result of each piece of sensing data.
To achieve the above object, a third aspect of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for parsing sensing data as set forth in the first aspect of the present invention.
In order to achieve the above object, a fourth aspect of the present invention provides an electronic device, including a memory and a processor, where the memory stores a computer program, and the computer program, when executed by the processor, implements the method for parsing sensing data according to the embodiment of the first aspect of the present invention.
Drawings
FIG. 1 is a flow diagram of a method of parsing perceptual data according to an embodiment of the present invention;
FIG. 2 is a flow chart of acquiring a sensory data set according to one embodiment of the present invention;
FIG. 3 is a flow diagram of a process for parsing sensory data information according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a system for parsing perceptual data, in accordance with an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The method, system, storage medium, and electronic device for analyzing perceptual data according to embodiments of the present invention will be described in detail with reference to fig. 1 to 4 and specific embodiments of the present invention.
Fig. 1 is a flowchart of a method for parsing perceptual data according to an embodiment of the present invention. As shown in fig. 1, the method for parsing the sensing data may include:
the method includes the steps of S1, obtaining a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data in various data forms collected by corresponding perception terminals.
It should be noted that the sensing terminal in the embodiment of the present invention is a standard customized sensing terminal.
Specifically, the sensing data subsets of N sensing terminals are obtained, N ≥ 2, forming the sensing data set $Data = \{data_1, data_2, \ldots, data_N\}$, wherein N represents the number of sensing terminals and $data_i$ represents the sensing data subset of the i-th sensing terminal, $1 \le i \le N$. The sensing data subset $data_i$ comprises sensing data in a plurality of data forms and can be expressed as $data_i = \{da_{i1}, da_{i2}, \ldots, da_{iM}\}$, wherein M represents the number of data forms of the sensing data contained in the i-th sensing terminal and $da_{ij}$ represents the data packet of the j-th data form, $1 \le j \le M$; each data packet comprises all the data of the corresponding data form under the corresponding sensing terminal.
In embodiments of the present invention, the data forms of the sensing data may include text, graphics, images, audio, video, and other forms of data.
For better results, the sensing data of each data form in each perception data subset are arranged in a preset order.
As a specific embodiment, when the number of sensing terminals is 5 and each sensing terminal contains data in the forms of text, image, audio and video, with the preset order of the data forms being text, image, audio, video, the sensing data subset of the first sensing terminal contains, in order, the text-form data of the first sensing terminal, its image-form data, its audio-form data and its video-form data. The data packets in the sensing data subsets of the other sensing terminals are likewise arranged in the preset order, and details are not repeated here.
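The grouping and preset ordering described above can be sketched as follows (a minimal illustration; the terminal IDs, form names, and the `build_perception_dataset` helper are assumptions for illustration, not from the patent):

```python
# Assumed preset order of data forms (text -> image -> audio -> video).
PRESET_ORDER = ["text", "image", "audio", "video"]

def build_perception_dataset(raw_packets):
    """raw_packets: {terminal_id: {data_form: packet_bytes}} in arbitrary order.
    Returns the same mapping with each subset re-keyed in the preset order."""
    dataset = {}
    for terminal_id, subset in raw_packets.items():
        # Keep only known forms, inserted in the preset order.
        dataset[terminal_id] = {form: subset[form]
                                for form in PRESET_ORDER if form in subset}
    return dataset

raw = {1: {"video": b"v1", "text": b"t1", "audio": b"a1", "image": b"i1"}}
data_set = build_perception_dataset(raw)
print(list(data_set[1]))  # ['text', 'image', 'audio', 'video']
```

Because Python dictionaries preserve insertion order, each subset keeps the preset sequence, which the decryption step below relies on.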
In an embodiment of the present invention, as shown in fig. 2, acquiring the perception data set may include:
s11, receiving the encrypted sensing data in multiple data forms of a plurality of sensing terminals, wherein each encrypted sensing data is obtained by encrypting according to an encryption algorithm corresponding to the data form of the sensing data;
and S12, calling a decryption algorithm corresponding to the data form of each encrypted sensing data to decrypt to obtain a sensing data set.
In the embodiment of the invention, the sensing data of each data form are arranged according to a preset sequence, wherein the corresponding decryption algorithm is called in turn according to the preset sequence for decryption.
In order to ensure the security of the perception data in the obtained perception data set, to prevent the perception data from being maliciously intercepted, stolen or tampered with while the perception data set is obtained, and thus to ensure that perception data parsing proceeds smoothly and accurately, the perception data received from each perception terminal is encrypted.
In the embodiment of the invention, each perception terminal can acquire perception data and perform adaptive encryption processing on it. Alternatively, a data acquisition module can be used to acquire the sensing data of the plurality of sensing terminals, with an encryption module completing the adaptive encryption processing of the sensing data of each sensing terminal.
Specifically, when each perception terminal or the encryption module encrypts a perception data subset, an autonomous selection model can be constructed according to the sensing data of the various data forms contained in the subset, and the constructed autonomous selection model is used to complete the adaptive encryption of the sensing data of each data form.
As a specific example, the autonomous selection model selects among algorithm sets by data form, wherein Data represents the perception data set, W represents the encryption algorithm set for data in text form, S the set for data in graphic form, P the set for data in image form, V the set for data in audio form, Mo the set for data in video form, and Ot the set for other data forms; the output of the autonomous selection model is the encrypted perception data set.
For the sensing data of each data form in each perception data subset, the autonomous selection model calls the corresponding encryption algorithm in the cryptographic algorithm database of the encryption module, and the called algorithm encrypts the sensing data of that data form.
the perception data in the embodiment of the invention is obtained by adaptively calling the encryption algorithm to the perception data subsets containing different data forms through the established autonomous selection model for encryption, so that the security of the perception data transmission process is ensured, malicious interception, stealing and tampering of the perception data are effectively prevented, the data analysis is further ensured to be smoothly and accurately carried out, and the accuracy of the data analysis is improved.
In the embodiment of the invention, before the encrypted sensing data is parsed, it needs to be decrypted. When the encrypted sensing data is adaptively decrypted, the data forms contained in the encrypted perception data set are decrypted one by one, i.e. the decryption algorithm of the corresponding data form in the cryptographic algorithm database of the encryption module is called to decrypt the encrypted sensing data, obtaining the decrypted sensing data. Here the decryption algorithm sets for data in text, graphic, image, audio, video and other forms correspond one-to-one to the encryption algorithm sets W, S, P, V, Mo and Ot, and the result is the decrypted perception data set, in which each subset is the decrypted perception data of the corresponding sensing terminal.
It should be noted that the encryption algorithms and decryption algorithms in the cryptographic algorithm database are symmetric, and to ensure successful decryption, the encryption algorithms are reversible. The encryption and decryption algorithms in the cryptographic algorithm database can be updated in real time according to the data forms and other data characteristics of the sensing data of the sensing terminals.
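A minimal sketch of such a symmetric, per-data-form algorithm registry follows. The XOR cipher and all names here are illustrative stand-ins; the patent names no concrete algorithms, and a real deployment would use a vetted cipher such as AES:

```python
def xor_cipher(payload: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts
    # (a toy stand-in for a real symmetric, reversible algorithm).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))

# One (encrypt, decrypt) pair per data form, mirroring "an encryption
# algorithm corresponding to the data form"; keys differ per form.
REGISTRY = {
    form: (lambda p, k=key: xor_cipher(p, k), lambda p, k=key: xor_cipher(p, k))
    for form, key in [("text", b"kt"), ("image", b"ki"),
                      ("audio", b"ka"), ("video", b"kv")]
}

def encrypt_subset(subset):
    # Dispatch each packet to the algorithm matching its data form.
    return {form: REGISTRY[form][0](packet) for form, packet in subset.items()}

def decrypt_subset(encrypted):
    # Walk the subset in its stored (preset) order, calling the matching
    # decryption algorithm for each data form in turn.
    return {form: REGISTRY[form][1](packet) for form, packet in encrypted.items()}
```

Because each pair is symmetric and reversible, `decrypt_subset(encrypt_subset(s)) == s` holds for any subset, which is the round-trip property the text requires.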
And S2, extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of hidden layers of the deep learning neural network and the number of neurons in each hidden layer are determined according to the data form of the perception data in each data form.
Specifically, the pre-constructed deep learning neural network can be used to locate and identify the sensing data in the perception data set, so as to extract the perception data information it contains, wherein each subset of the perception data information comprises the information extracted from the sensing data of the corresponding sensing terminal. In embodiments of the invention, the perception data information may include device information, custom configuration script information, and other related information of each sensing terminal.
In an embodiment of the present invention, the deep learning neural network may include an input layer, a hidden layer, and an output layer, which are connected in sequence;
In the embodiment of the present invention, the corresponding output of the input layer is: $C = W_{in} \cdot Da + B_{in}$, wherein $Da$ represents the input matrix of a plurality of data forms, $W_{in}$ represents the weight matrix of the input layer, and $B_{in}$ represents the bias matrix of the input layer;
the input of the a-th hidden layer is: $z_a = U \cdot C + \sum_{j=1}^{a-1} W_{j,a} \cdot h_j + b_h$, wherein $U$ represents the weight matrix from the input layer to the hidden layer, $b_h$ represents the bias of the hidden layer, $W_{a-1,a}$ represents the weight matrix from the (a-1)-th hidden layer to the a-th hidden layer, $W_{a-1,a} \cdot h_{a-1}$ represents the part of the output of the (a-1)-th hidden layer transferred to the a-th hidden layer, $h_j$ represents the state of the j-th hidden layer, and each weight matrix between two hidden layers is determined according to the relationship between the neurons of the two layers;
the input of the output layer is: $z_o = V \cdot h_n + b_o$, wherein $V$ represents the weight matrix from the hidden layer to the output layer and $b_o$ represents the bias of the output layer;
the corresponding output of the output layer is: $Y = \sigma(z_o)$, wherein $\sigma$ is the activation function and $Y$ represents the perception data information.
In an embodiment of the invention, the perception data set input to the input layer is the decrypted perception data set, which comprises the text, graphic, image, audio, video and other forms of sensing data of the sensing terminals.
More specifically, the decrypted perception data is input into the input layer of the deep learning neural network. The deep learning neural network has a set of hidden layer states $\{h_1, h_2, \ldots, h_n\}$, wherein $h_j$ represents the state of the j-th hidden layer and n represents the number of hidden layers, $1 \le j \le n$. The output of the output layer of the deep learning neural network is the perception data information $Y$.
More specifically, when the hidden layer has only one layer, the input of the hidden layer is: $z_1 = U \cdot C + b_h$.
When the hidden layer contains multiple layers, the input of the neurons of the a-th hidden layer is: $z_a = U \cdot C + \sum_{j=1}^{a-1} W_{j,a} \cdot h_j + b_h$.
Because the multi-layer hidden-layer input in the embodiment of the invention takes the states of the previous hidden layers into account, the input parameters of each hidden layer are richer, the calculated output result is more accurate, and the construction of the deep learning neural network can be completed more quickly.
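The multi-layer forward pass described above, in which each hidden layer also receives the states of all previous hidden layers, can be sketched numerically (the shapes, symbol names, and random weights are assumptions for illustration, not the patent's trained parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(Da, n_hidden=3, width=4):
    """Da: flattened multi-data-form input vector. Returns the output Y."""
    d = Da.shape[0]
    W_in = rng.standard_normal((width, d))    # input-layer weight matrix
    B_in = rng.standard_normal(width)         # input-layer bias matrix
    C = W_in @ Da + B_in                      # input-layer output C
    U = rng.standard_normal((width, width))   # input-to-hidden weight matrix
    b_h = rng.standard_normal(width)          # hidden-layer bias
    hs = []                                   # hidden-layer states h_1..h_n
    for _ in range(n_hidden):
        z = U @ C + b_h                       # contribution from the input layer
        for h_prev in hs:                     # states of all previous hidden layers
            z = z + rng.standard_normal((width, width)) @ h_prev
        hs.append(np.tanh(z))
    V = rng.standard_normal((1, width))       # hidden-to-output weight matrix
    b_o = rng.standard_normal(1)              # output-layer bias
    return 1.0 / (1.0 + np.exp(-(V @ hs[-1] + b_o)))  # sigmoid activation

Y = forward(np.array([0.5, -1.0, 2.0]))
```

With a single hidden layer the inner loop never runs and the input reduces to U·C + b_h, matching the single-layer case above.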
In an embodiment of the present invention, the method for parsing sensing data may further include: and constructing a quantity adaptation model, and determining the number of layers of hidden layers of the deep learning neural network and the number of neurons in each hidden layer by using the quantity adaptation model.
And constructing a quantity adaptation model for improving the accuracy of the perception data information extracted by the deep learning neural network. The constructed quantity adaptation model can determine the number of layers of hidden layers of the deep learning neural network and the number of neurons in each hidden layer according to the input matrix in the multi-data form.
As a specific example, the quantity adaptation model may be:
$YK^{*} = SL(Da, F, YK, QW, B, YS)$
wherein $SL$ represents the quantity adaptation model, $Da$ represents the input matrix of multiple data forms, $F$ represents the data-form decomposition matrix, $YK$ represents the matrix of the number of hidden layers and the corresponding numbers of neurons, $QW$ represents the weight matrix of the hidden layers, $B$ represents the bias matrix of the hidden layers, $YS$ represents the constraint conditions, and $YK^{*}$ represents the optimal quantity adaptation result.
In embodiments of the present invention, the constraint conditions may be computing power, gradients, activation functions, norms, and other influencing factors.
Specifically, the constructed quantity adaptation model decomposes the multi-data-form input matrix Da through the data-form decomposition matrix F to obtain data sets of the various forms; dynamic programming (for example via the Viterbi algorithm) can then be applied, and the obtained sets are combined with the constraint conditions under the action of the hidden layers to obtain the optimal quantity adaptation result.
For better effect, the quantity adaptation may be carried out only the first time each data form appears, reducing the subsequent workload; meanwhile, if a new data form is added, its adaptation can be computed directly through the quantity adaptation model.
According to the quantity adaptation model of the embodiment of the invention, the optimal number of hidden layers and the number of neurons in each hidden layer are selected according to the data form of the perception data when the deep learning neural network extracts information from the perception data, so that the accuracy of data information extraction is improved, and the adaptivity and accuracy of perception data parsing are further improved.
In the embodiment of the invention, when the deep learning neural network is constructed in advance, a cross-entropy loss function can be adopted for construction, wherein the expression of the cross-entropy loss function at time t is:
$L_t = -\left[\, y_t \log \hat{y}_t + (1 - y_t) \log(1 - \hat{y}_t) \,\right]$
wherein $y_t$ represents the true result value at time t, and $\hat{y}_t$ represents the predicted result value at time t, obtained through the deep learning neural network;
the global loss function over P moments is: $L = \sum_{t=1}^{P} L_t$.
According to the embodiment of the invention, the constructed optimized cross entropy loss function is used to update the parameters of the deep learning neural network.
Specifically, the loss of the deep learning neural network at the current position is quantified by a loss function L, i.e., the discrepancy between the true result value and the value predicted by the deep learning neural network is quantified, where the optimized cross entropy loss function of the deep learning neural network at time t is:
then the global loss for P moments is:
In the embodiment of the invention, iterative computation can be carried out using gradient descent to update the network parameters of the deep learning neural network. During backpropagation, the gradient loss of a given perception data feature sequence at time t is determined by the gradient loss corresponding to the output at the current time together with the gradient loss of the perception data feature sequence index at time t+1, so the gradient loss of the weight matrix for a perception data feature sequence at time t must be computed step by step through backpropagation. At time t, the gradient of the hidden layer state indexed by the perception data feature sequence is:
thereby obtaining the gradient calculation expressions for updating the neural network parameters; the iteration continues until the parameter values converge, completing the construction of the deep learning neural network.
According to the embodiment of the invention, the parameters of the deep learning neural network are updated by constructing an optimized cross entropy loss function, so that the construction of the deep learning neural network is completed more accurately and rapidly.
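The loss construction and gradient-descent update described above can be illustrated with a minimal NumPy sketch. The single sigmoid unit, learning rate, and function names are assumptions for illustration only, not the patent's actual network.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross entropy at a single time step; clipping avoids log(0)."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def global_loss(y_true_seq, y_pred_seq):
    """Global loss over P moments: the sum of per-step cross entropies."""
    return float(sum(cross_entropy(t, p) for t, p in zip(y_true_seq, y_pred_seq)))

def sgd_step(w, b, x, y, lr=0.1):
    """One gradient-descent update of a single sigmoid unit under cross entropy.
    For a sigmoid with cross entropy, d(loss)/dz simplifies to (y_hat - y)."""
    y_hat = sigmoid(np.dot(w, x) + b)
    grad = y_hat - y
    return w - lr * grad * x, b - lr * grad

w, b = np.zeros(2), 0.0
x, y = np.array([1.0, 1.0]), 1.0
w, b = sgd_step(w, b, x, y)   # the loss on (x, y) decreases after this step
```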
And S3, constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data.
In an embodiment of the present invention, as shown in fig. 3, performing an analysis process on the sensing data information by using an adaptive analysis model to obtain an analysis result, including:
S31, classifying the sensing data information to obtain a sensing data information matrix;
and S32, extracting data information from the sensing data information matrix by using a self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result.
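Steps S31 and S32 amount to a lookup-and-dispatch over a method library. The sketch below is hypothetical (the registry name and data shapes are assumptions) but shows the calling pattern: each data form is routed to its registered parsing method, and unknown forms are reported rather than silently dropped.

```python
# Hypothetical registry mapping a data form to its parsing routine;
# it plays the role of the JX data analysis method set in the text.
PARSER_LIBRARY = {
    "character": lambda raw: {"form": "character", "value": raw.strip()},
    "image":     lambda raw: {"form": "image", "size": len(raw)},
}

def adaptive_parse(info_matrix):
    """For each terminal's info set, look up the matching parser and apply it."""
    results = []
    for terminal in info_matrix:
        for form, payload in terminal:
            parser = PARSER_LIBRARY.get(form)
            results.append(parser(payload) if parser
                           else {"form": form, "error": "no parser registered"})
    return results

out = adaptive_parse([[("character", "  temp=23C "), ("audio", b"\x00\x01")]])
```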
In an embodiment of the present invention, the sensing data information matrix is:
wherein ZData represents the perception data information matrix, each row of which is the perception data information set of the i-th perception terminal; N represents the number of perception terminals, and M represents the number of types of data forms of the perception data contained in the i-th perception terminal.
In the embodiment of the present invention, the adaptive analytical model is:
wherein the adaptive analysis model operates on ZData (the perception data information matrix), ST (the data information extraction set), and JX (the data analysis method set), and outputs the data analysis result.
Specifically, the data processing module can call the control protocol dictionary in the protocol database and the deep neural network protocol to classify the extracted sensing data information, so as to obtain more accurate data information, i.e., the perception data information matrix, whose rows are the perception data information sets of the individual perception terminals. In the embodiment of the invention, the obtained sensing data information matrix ZData can contain the device information of the sensing terminal corresponding to each piece of sensing data, the agreed data transmission protocol, custom configuration script information, and other data-analysis-related information.
In the embodiment of the invention, the protocol database is provided with a control protocol dictionary and a deep neural network protocol, and is updated according to the protocol to which the data form of the perception data belongs.
It should be noted that the data transmission protocol may be obtained from the protocol database, the protocol being defined when the standard customized sensing terminal is accessed. If a new protocol is generated, the protocol database is updated, and the protocols in the database are optimized according to their frequency of use.
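A protocol database that is updated when new protocols appear and ordered by usage frequency might be sketched as follows; the class name, fields, and example protocols are assumptions for illustration.

```python
from collections import Counter

class ProtocolDatabase:
    """Stores protocol specifications, counts how often each is looked up,
    and lists protocols most-used-first, as the frequency-based
    optimization in the text suggests."""
    def __init__(self):
        self.protocols = {}
        self.usage = Counter()

    def register(self, name, spec):
        self.protocols[name] = spec      # update when a new protocol appears

    def lookup(self, name):
        self.usage[name] += 1            # record usage frequency
        return self.protocols[name]

    def ordered(self):
        return [n for n, _ in self.usage.most_common()]

db = ProtocolDatabase()
db.register("modbus", {"port": 502})
db.register("mqtt", {"port": 1883})
db.lookup("mqtt"); db.lookup("mqtt"); db.lookup("modbus")
```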
In the embodiment of the invention, the adaptive analysis model constructed by the data analysis module can be used to analyze the data information elements in the perception data information set ZData: the model calls the best-matching analysis method in the data analysis method library to complete intelligent analysis of the perception data, obtaining the control protocol followed by the corresponding perception terminal, the deep neural network protocol, the custom configuration script, and other related data-analysis information. Data analysis is then performed on the processed perception data through the constructed adaptive analysis model to obtain the data analysis result, which can be further uploaded to the Internet of Things platform.
In the embodiment of the invention, the data analysis method library stores common analysis methods that parse data into the standard Alink JSON format, so that the data analysis module can conveniently call an analysis method; the library is also updated as new methods are added.
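Parsing into an Alink-style JSON envelope could look like the sketch below. The field layout (id/version/params/method) follows the commonly documented Alink structure; the terminal id, property values, and method string here are placeholders, not values from the patent.

```python
import json

def to_alink_json(terminal_id, properties, method="thing.event.property.post"):
    """Wrap parsed properties in an Alink-style JSON envelope; the
    id/version/params/method layout follows the commonly documented
    Alink structure, and the method string is only a typical example."""
    return json.dumps({
        "id": str(terminal_id),
        "version": "1.0",
        "params": properties,
        "method": method,
    })

msg = to_alink_json(42, {"temperature": 23.5})
```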
It should be noted that when a new data analysis method is found in the process of analyzing the sensing data, or a corresponding method is added because of an increase in sensing terminals, the newly added method is stored in the data analysis method library so as to keep the library optimized and up to date.
According to the embodiment of the invention, by constructing the adaptive analysis model and using it to analyze the perception data information, the custom configuration script of the standard customized sensing terminal and other data-analysis-related information are obtained; adaptively selecting the data analysis method improves the accuracy of the intelligent data analysis system for standard customized sensing terminals and the adaptivity of data analysis.
According to the method for analyzing the perception data, the obtained perception data has been adaptively encrypted, for each data form, with the encryption algorithm chosen by the established autonomous selection model. This guarantees the security of the transmission process, effectively prevents the perception data from being maliciously intercepted, stolen, or tampered with, and further ensures that data analysis proceeds smoothly and accurately, improving its accuracy.
According to the method for analyzing the perception data disclosed by the embodiment of the invention, the optimal number of hidden layers and neurons is selected by the quantity adaptation model, according to the data form of the perception data subset, when constructing the deep learning neural network, which improves the accuracy of perception data information extraction and in turn the adaptivity and data analysis accuracy of the system.
According to the method for analyzing the perception data, the states of the previous hidden layers are taken into account when each hidden layer produces its output, so that the input parameters of each hidden layer are richer, the computed outputs are more accurate, and the construction of the deep learning neural network is completed more quickly.
According to the analytic method of the perception data, disclosed by the embodiment of the invention, the optimized cross entropy loss function is constructed to update the parameters of the deep learning neural network, so that the deep learning neural network is accurately and quickly constructed.
According to the method for analyzing the perception data disclosed by the embodiment of the invention, the custom configuration script of the standard customized perception terminal and other data-analysis-related information are obtained by extracting the perception data information, and the data analysis method is selected adaptively, which improves the accuracy of the intelligent data analysis system for standard customized perception terminals and the adaptivity of data analysis.
The invention also provides a system for analyzing the perception data.
FIG. 4 is a schematic diagram of a system for parsing perceptual data, in accordance with an embodiment of the present invention. As shown in fig. 4, the system for parsing sensing data may include: a positioning identification module 30, an extraction module 40 and a data analysis module 50.
The positioning identification module 30 is configured to obtain a sensing data set, where the sensing data set includes sensing data subsets of multiple sensing terminals, and the sensing data subsets include sensing data in multiple data forms acquired by corresponding sensing terminals; the extraction module 40 is configured to perform information extraction on the sensing data in each data form in the sensing data set by using a pre-constructed deep learning neural network to obtain sensing data information, wherein for the sensing data in each data form, the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in the hidden layer of each layer are determined according to the data form of the sensing data; and the data analysis module 50 is configured to analyze the sensing data information by using a pre-constructed adaptive analysis model to obtain an analysis result of each sensing data.
In an embodiment of the present invention, the system for parsing sensing data may further include: a data acquisition module 10 and an encryption module 20. The data acquisition module 10 is configured to acquire sensing data subsets of a plurality of sensing terminals; the encryption module 20 is configured to invoke an encryption algorithm corresponding to the sensing data form, and encrypt the sensing data in the corresponding data form to obtain encrypted sensing data; the positioning identification module 30 is further configured to call, for each piece of encrypted sensing data, a decryption algorithm corresponding to the data format of the sensing data to decrypt the sensing data, so as to obtain a sensing data set.
In the embodiment of the invention, the sensing data in each data form are arranged according to a preset sequence, wherein the corresponding decryption algorithms are called in sequence according to the preset sequence for decryption.
In order to ensure the security of the perception data in the obtained perception data set, to prevent the data from being maliciously intercepted, stolen, or tampered with while the set is acquired, and to ensure that perception data analysis proceeds smoothly and accurately, the perception data encrypted by each perception terminal is received.
Specifically, when the encryption module encrypts the sensing data of each sensing terminal, it can construct an autonomous selection model according to the various data forms contained in the sensing data subset, and use the constructed autonomous selection model to complete adaptive encryption of the sensing data in each data form.
As a specific example, the autonomous selection model may be:
wherein, data represents a perception Data set, W represents an encryption algorithm set of Data in a character form, S represents an encryption algorithm set of Data in a graphic form, P represents an encryption algorithm set of Data in an image form, V represents an encryption algorithm set of Data in an audio form, mo represents an encryption algorithm set of Data in a video form, ot represents an encryption algorithm set of other Data forms,representing the autonomous selection model output, i.e. the encrypted processed perceptual data set.
The autonomous selection model may, for each perception data subset, call the corresponding encryption algorithm in the encryption module's cryptographic algorithm database and encrypt the sensing data of the corresponding data form with it; the specific process is as follows:
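A minimal sketch of this per-form dispatch, using a toy XOR transform in place of the real algorithm sets W, S, P, V, Mo, and Ot (all keys and form names here are assumptions; a real deployment would use vetted ciphers):

```python
def xor_bytes(data: bytes, key: int) -> bytes:
    """Toy reversible transform: XOR every byte with a one-byte key."""
    return bytes(b ^ key for b in data)

# One hypothetical cipher per data form, standing in for the algorithm sets.
CIPHER_SETS = {
    "character": lambda d: xor_bytes(d, 0x21),
    "image":     lambda d: xor_bytes(d, 0x42),
    "audio":     lambda d: xor_bytes(d, 0x17),
}

def autonomous_encrypt(subset):
    """Encrypt each (form, payload) pair with the algorithm for its form;
    payloads of unregistered forms pass through unchanged."""
    return [(form, CIPHER_SETS.get(form, lambda d: d)(payload))
            for form, payload in subset]

enc = autonomous_encrypt([("character", b"hello"), ("image", b"\x00\xff")])
```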
according to the embodiment of the invention, the autonomous selection model is constructed, so that the encryption module adaptively calls the encryption algorithm to the sensing data subsets containing different data forms for encryption by utilizing the constructed autonomous selection model, the security of the sensing data transmission process is ensured, malicious interception, stealing and tampering on the sensing data are effectively prevented, the data analysis is further ensured to be smoothly and accurately carried out, and the accuracy of the data analysis is improved.
In the embodiment of the present invention, before the extraction module 40 analyzes the encrypted sensing data, the positioning identification module 30 needs to decrypt it. When adaptive decryption of the encrypted perception data is carried out, the data forms it contains can be decrypted one by one: the decryption algorithm of the corresponding data form in the encryption module's cryptographic algorithm database is called to decrypt the encrypted sensing data and obtain the decrypted sensing data. The specific process is as follows:
wherein the inputs are the encrypted perception data set and, within it, the encrypted perception data subset of the i-th perception terminal; the decryption algorithm sets cover data in character, graphic, image, audio, and video form as well as other data forms; and the outputs are the decrypted perception data set and the decrypted perception data subset of the i-th perception terminal.
It should be noted that the encryption and decryption algorithms in the cryptographic algorithm database are symmetric, and, to guarantee successful decryption, the encryption is reversible. The encryption and decryption algorithms in the database can be updated in real time according to the data forms and other characteristics of the sensing data of the sensing terminals.
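Because the algorithms are symmetric and reversible, decryption applies the same per-form transform again, processing the forms in the preset order described earlier. The sketch below reuses a toy XOR cipher with hypothetical keys and form names.

```python
def xor_bytes(data: bytes, key: int) -> bytes:
    """Toy symmetric cipher: applying it twice recovers the plaintext."""
    return bytes(b ^ key for b in data)

PRESET_ORDER = ["character", "graphic", "image", "audio", "video", "other"]
KEYS = {"character": 0x21, "image": 0x42}   # hypothetical per-form keys

def decrypt_in_order(encrypted):
    """Decrypt (form, ciphertext) pairs sequentially in the preset order."""
    by_form = dict(encrypted)
    out = []
    for form in PRESET_ORDER:
        if form in by_form:
            out.append((form, xor_bytes(by_form[form], KEYS[form])))
    return out

plain = decrypt_in_order([("image", xor_bytes(b"\x01\x02", 0x42)),
                          ("character", xor_bytes(b"hi", 0x21))])
```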
In an embodiment of the present invention, the deep learning neural network may include an input layer, a hidden layer and an output layer connected in sequence,
The corresponding output of the input layer is obtained from an input matrix representing a plurality of data forms, a weight matrix of the input layer, and a bias matrix of the input layer;
the inputs to the hidden layer are:
wherein the quantities involved are the weight matrix from the input layer to the hidden layer, the bias of the hidden layer, the weight matrix from the (a-1)-th hidden layer to the a-th hidden layer, the portion of the (a-1)-th hidden layer's output transferred to the a-th hidden layer, and the state of the j-th hidden layer; the weight matrix between two hidden layers is determined according to the relationships between the neurons of the two layers;
the input of the output layer is obtained from the weight matrix from the hidden layer to the output layer and the bias of the output layer;
the corresponding output of the output layer is obtained by applying the activation function, and represents the perception data information.
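The layer equations above can be sketched as a small NumPy forward pass in which each hidden layer also receives part of the previous hidden state, echoing the idea that the states of earlier hidden layers enrich each layer's input. All shapes, the tanh activation, and the weight names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, Ws, bs, Vs):
    """Forward pass: layer a computes W @ h + b, optionally adding a
    weighted copy of the previous hidden state (Vs), then applies tanh."""
    h = x
    states = []
    for a, (W, b) in enumerate(zip(Ws, bs)):
        z = W @ h + b
        if a > 0 and Vs[a - 1] is not None:   # carry part of layer a-1's state
            z = z + Vs[a - 1] @ states[-1]
        h = np.tanh(z)                        # activation function
        states.append(h)
    return h

x = rng.normal(size=3)
Ws = [rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), rng.normal(size=(2, 4))]
bs = [np.zeros(4), np.zeros(4), np.zeros(2)]
Vs = [rng.normal(size=(4, 4)), None]          # only layer 2 reuses layer 1's state
y = forward(x, Ws, bs, Vs)
```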
In embodiments of the present invention, the data parsing module 50 may also be used to build the adaptive parsing model, which is:
wherein the adaptive analysis model operates on ZData (the perception data information set), ST (the data information extraction set), and JX (the data analysis method set), and outputs the data analysis result.
Specifically, the extracted perceptual data information is classified to obtain more accurate data information, i.e., the perceptual data information matrix ZData. The matrix ZData can contain the device information of the sensing terminal corresponding to each piece of sensing data, the agreed data transmission protocol, custom configuration script information, and other data-analysis-related information.
For other specific embodiments of the system for parsing sensing data according to the embodiment of the present invention, reference may be made to the specific embodiments of the method for parsing sensing data according to the above-mentioned embodiment of the present invention.
According to the analytic system of the perception data, disclosed by the embodiment of the invention, the optimal number of hidden layer layers and the number of neurons are selected when deep learning of a neural network is carried out according to the data form of the perception data subset, so that the accuracy of perception data information extraction is improved, and the adaptivity and the data analytic accuracy of the system are further improved; by constructing the self-adaptive analysis model and analyzing the perception data information by utilizing the constructed self-adaptive analysis model, the user-defined configuration script of the standard customized perception terminal and other data analysis related data information are obtained, and the data analysis method is selected in a self-adaptive mode, so that the accuracy of the data intelligent analysis system of the standard customized perception terminal and the self-adaptability of data analysis are improved.
The invention also provides a computer readable storage medium.
In this embodiment, a computer program is stored on the computer-readable storage medium; when executed by a processor, the program implements the above-mentioned method for analyzing perception data.
The invention also provides the electronic equipment.
In this embodiment, the electronic device includes a processor, a memory, and a computer program stored on the memory, and when the processor executes the computer program, the method for parsing the sensing data is implemented.
According to the readable storage medium and the electronic device, the accuracy of multi-data-form sensing data analysis and the adaptivity of data analysis are improved by using the sensing data analysis method.
It should be noted that the logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one of the feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (11)
1. A method for parsing perceptual data, the method comprising:
acquiring a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data of a plurality of data forms acquired by corresponding perception terminals;
extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons in the hidden layer of each layer are determined according to the data form of the perception data in each data form;
constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data; the analyzing the perception data information by using the self-adaptive analysis model to obtain an analysis result comprises the following steps:
classifying the perception data information to obtain a perception data information matrix;
extracting data information from the sensing data information matrix by using the self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result;
wherein the perceptual data information matrix is:
wherein ZData represents the perception data information matrix; each entry represents the perception data information set of the i-th perception terminal; N represents the number of perception terminals; the j-th data packet represents the j-th data form; and M represents the number of types of data forms of the perception data contained in the i-th perception terminal;
the self-adaptive analysis model is as follows:
2. The method for parsing perceptual data as defined in claim 1, wherein the obtaining the perceptual data set comprises:
receiving the encrypted sensing data in multiple data forms of a plurality of sensing terminals, wherein each encrypted sensing data is obtained by encrypting according to an encryption algorithm corresponding to the data form of the encrypted sensing data;
and calling a decryption algorithm corresponding to the data form of each encrypted sensing data to decrypt to obtain the sensing data set.
3. The method for parsing sensing data according to claim 2, wherein the sensing data in each data format are arranged in a preset sequence, and wherein decryption is performed by sequentially invoking corresponding decryption algorithms according to the preset sequence.
4. The parsing method of perception data as claimed in claim 3, wherein the deep learning neural network includes an input layer, a hidden layer and an output layer connected in sequence,
The corresponding output of the input layer is obtained from an input matrix representing a plurality of data forms, a weight matrix of the input layer, and a bias matrix of the input layer;
the inputs of the hidden layer are:
wherein the quantities involved are the weight matrix from the input layer to the hidden layer, the bias of the hidden layer, the weight matrix from the (a-1)-th hidden layer to the a-th hidden layer, the portion of the (a-1)-th hidden layer's output transferred to the a-th hidden layer, and the state of the j-th hidden layer; the weight matrix between two hidden layers is determined according to the relationships between the neurons of the two layers;
the input of the output layer is obtained from the weight matrix from the hidden layer to the output layer and the bias of the output layer;
5. The method for parsing perceptual data according to claim 4, wherein the method further comprises:
constructing a quantity adaptation model, and determining the number of layers of a hidden layer of the deep learning neural network and the number of neurons in each layer of the hidden layer by using the quantity adaptation model, wherein the quantity adaptation model is as follows:
wherein the model takes as inputs Da (an input matrix of multiple data forms), F (a data form decomposition matrix), YK (the number of hidden layers and the corresponding neuron quantity matrix), QW (the weight matrix of the hidden layer), B (the bias matrix of the hidden layer), and YS (the constraint conditions), and outputs the optimal quantity adaptation result.
6. The method for analyzing perception data according to claim 5, wherein when the deep learning neural network is constructed in advance, a cross entropy loss function is adopted for construction, wherein an expression of the cross entropy loss function at time t is as follows:
the global loss function for P moments is:
7. A system for parsing perceptual data, the system comprising:
the positioning identification module is used for acquiring a sensing data set, wherein the sensing data set comprises sensing data subsets of a plurality of sensing terminals, and the sensing data subsets comprise sensing data in a plurality of data forms acquired by the corresponding sensing terminals;
the extraction module is used for extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in each hidden layer are determined according to the data form of the perception data in each data form;
the data analysis module is used for analyzing the perception data information by utilizing a pre-constructed self-adaptive analysis model to obtain an analysis result of each perception data;
the data analysis module is specifically used for classifying the perception data information to obtain a perception data information matrix;
extracting data information from the sensing data information matrix by using the self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result;
wherein the perceptual data information matrix is:
wherein ZData represents the perception data information matrix; each entry represents the perception data information set of the i-th perception terminal; N represents the number of perception terminals; the j-th data packet represents the j-th data form; and M represents the number of types of data forms of the perception data contained in the i-th perception terminal;
the self-adaptive analysis model is as follows:
8. The system for parsing perceptual data as defined in claim 7, wherein the system further comprises:
the data acquisition module is used for acquiring perception data subsets of a plurality of perception terminals;
the encryption module is used for calling an encryption algorithm corresponding to the sensing data form, and encrypting the sensing data in the corresponding data form to obtain the encrypted sensing data;
the positioning identification module is further configured to call a decryption algorithm corresponding to the data format of each encrypted sensing data to decrypt the data, so as to obtain the sensing data set.
9. The analytic system of perception data of claim 8, wherein the deep learning neural network includes an input layer, a hidden layer, and an output layer connected in sequence,
The corresponding output of the input layer is obtained from an input matrix representing a plurality of data forms, a weight matrix of the input layer, and a bias matrix of the input layer;
the inputs of the hidden layer are:
wherein the quantities involved are the weight matrix from the input layer to the hidden layer, the bias of the hidden layer, the weight matrix from the (a-1)-th hidden layer to the a-th hidden layer, the portion of the (a-1)-th hidden layer's output transferred to the a-th hidden layer, and the state of the j-th hidden layer; the weight matrix between two hidden layers is determined according to the relationships between the neurons of the two layers;
the inputs to the output layer are:whereina weight matrix representing the hidden layer to the output layer,a substrate representing the output layer;
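The layer chain in claim 9 reduces to repeated affine maps W·x + b from input layer through hidden layers to output layer. The sketch below is a minimal illustration under assumed dimensions and weights; the values are arbitrary, and the activation function between layers is omitted for brevity.

```python
# Minimal forward-pass sketch of the claimed layer structure:
# input layer -> hidden layer -> output layer, each computing W·x + b.
# All weights, biases, and sizes here are illustrative assumptions.

def affine(weights, bias, vec):
    """Compute W·vec + b, with W given as a list of rows."""
    return [sum(w * x for w, x in zip(row, vec)) + b
            for row, b in zip(weights, bias)]

# Input layer: y0 = W0·x + b0
x = [1.0, 2.0]
y0 = affine([[1.0, 0.0], [0.0, 1.0]], [0.5, -0.5], x)

# One hidden layer: s1 = W01·y0 + b1 (activation omitted)
h1 = affine([[0.5, 0.5], [1.0, -1.0]], [0.0, 0.0], y0)

# Output layer: o = Wout·h1 + bout
o = affine([[1.0, 1.0]], [0.1], h1)
```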
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method of parsing perceptual data according to any one of claims 1 to 6.
11. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program, wherein the computer program, when executed by the processor, implements a method of parsing perceptual data as defined in any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211071916.7A CN115150439B (en) | 2022-09-02 | 2022-09-02 | Method and system for analyzing perception data, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115150439A CN115150439A (en) | 2022-10-04 |
CN115150439B true CN115150439B (en) | 2023-01-24 |
Family
ID=83416190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211071916.7A Active CN115150439B (en) | 2022-09-02 | 2022-09-02 | Method and system for analyzing perception data, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115150439B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009149926A2 (en) * | 2008-06-10 | 2009-12-17 | Intelligement Ag | System and method for the computer-based analysis of large quantities of data |
CN106021927A (en) * | 2016-05-21 | 2016-10-12 | 北京大脑智库教育科技有限公司 | Dermatoglyph analysis and processing method based on big data |
CN109858508A (en) * | 2018-10-23 | 2019-06-07 | 重庆邮电大学 | IP localization method based on Bayes and deep neural network |
CN112257846A (en) * | 2020-10-13 | 2021-01-22 | 北京灵汐科技有限公司 | Neuron model, topology, information processing method, and retinal neuron |
CN112949472A (en) * | 2021-02-28 | 2021-06-11 | 杭州翔毅科技有限公司 | Cooperative sensing method based on multi-sensor information fusion |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11327475B2 (en) * | 2016-05-09 | 2022-05-10 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for intelligent collection and analysis of vehicle data |
US20200342291A1 (en) * | 2019-04-23 | 2020-10-29 | Apical Limited | Neural network processing |
US11520037B2 (en) * | 2019-09-30 | 2022-12-06 | Zoox, Inc. | Perception system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108985361B (en) | Malicious traffic detection implementation method and device based on deep learning | |
Shi et al. | A multimodal hybrid parallel network intrusion detection model | |
EP3961441B1 (en) | Identity verification method and apparatus, computer device and storage medium | |
KR102036955B1 (en) | Method for recognizing subtle facial expression using deep learning based analysis of micro facial dynamics and apparatus therefor | |
CN107679466B (en) | Information output method and device | |
CN111447190A (en) | Encrypted malicious traffic identification method, equipment and device | |
US9824313B2 (en) | Filtering content in an online system based on text and image signals extracted from the content | |
CN113535825A (en) | Cloud computing intelligence-based data information wind control processing method and system | |
EP3916597B1 (en) | Detecting malware with deep generative models | |
CN117079299B (en) | Data processing method, device, electronic equipment and storage medium | |
JP6971514B1 (en) | Information processing equipment, information processing methods and programs | |
WO2020205771A2 (en) | Learned forensic source system for identification of image capture device models and forensic similarity of digital images | |
EP3591561A1 (en) | An anonymized data processing method and computer programs thereof | |
CN114499861A (en) | Quantum key cloud security situation sensing method based on machine learning | |
CN116759053A (en) | Medical system prevention and control method and system based on Internet of things system | |
CN114492601A (en) | Resource classification model training method and device, electronic equipment and storage medium | |
CN108615006A (en) | Method and apparatus for output information | |
CN112839055B (en) | Network application identification method and device for TLS encrypted traffic and electronic equipment | |
US11956353B2 (en) | Machine learning device, machine learning system, and machine learning method | |
KR20170057118A (en) | Method and apparatus for recognizing object, and method and apparatus for training recognition model | |
CN115150439B (en) | Method and system for analyzing perception data, storage medium and electronic equipment | |
CN111783677B (en) | Face recognition method, device, server and computer readable medium | |
CN115713669B (en) | Image classification method and device based on inter-class relationship, storage medium and terminal | |
CN116580208A (en) | Image processing method, image model training method, device, medium and equipment | |
Szandała et al. | PRISM: principal image sections mapping |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||