CN115150439B - Method and system for analyzing perception data, storage medium and electronic equipment - Google Patents

Method and system for analyzing perception data, storage medium and electronic equipment Download PDF

Info

Publication number
CN115150439B
CN115150439B CN202211071916.7A CN202211071916A CN115150439B CN 115150439 B CN115150439 B CN 115150439B CN 202211071916 A CN202211071916 A CN 202211071916A CN 115150439 B CN115150439 B CN 115150439B
Authority
CN
China
Prior art keywords
data
representing
layer
perception
sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211071916.7A
Other languages
Chinese (zh)
Other versions
CN115150439A (en
Inventor
张博
夏信
张密
王守志
王波
张立勇
夏少娴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dianke Zhixin Technology Co ltd
Original Assignee
Beijing Dianke Zhixin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dianke Zhixin Technology Co ltd filed Critical Beijing Dianke Zhixin Technology Co ltd
Priority to CN202211071916.7A priority Critical patent/CN115150439B/en
Publication of CN115150439A publication Critical patent/CN115150439A/en
Application granted granted Critical
Publication of CN115150439B publication Critical patent/CN115150439B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/082Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y30/00IoT infrastructure
    • G16Y30/10Security thereof
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00IoT characterised by the purpose of the information processing
    • G16Y40/50Safety; Security of things, users, data or systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/04Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/22Parsing or analysis of headers

Landscapes

  • Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Hardware Design (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to the technical field of data processing of the Internet of things, and discloses a method and a system for analyzing perception data, a storage medium and electronic equipment, wherein the method comprises the following steps: acquiring a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data in various data forms acquired by corresponding perception terminals; extracting information of perception data in each data form in a perception data set by using a pre-constructed deep learning neural network to obtain perception data information; and constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data. The method improves the accuracy of multi-data form perception data analysis and the adaptivity of data analysis.

Description

Method and system for analyzing perception data, storage medium and electronic equipment
Technical Field
The invention relates to the technical field of data processing of the Internet of things, in particular to a method and a system for analyzing perception data, a storage medium and electronic equipment.
Background
The sensing terminal is an important component of the internet of things system, and is also a common device in a sensing layer of the internet of things. There are many methods for data analysis of the intelligent sensing terminal, but the related art has the following technical problems: the self-adaptive adaptability is poor, and the data analysis is inaccurate.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art. Therefore, an object of the present invention is to provide a method for parsing sensing data, which improves accuracy of parsing sensing data in multiple data formats and adaptivity of parsing data.
A second objective of the present invention is to provide a system for parsing sensed data.
A third object of the invention is to propose a computer-readable storage medium.
A fourth object of the invention is to propose an electronic device.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a method for parsing perceptual data, where the method includes: acquiring a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data in various data forms acquired by corresponding perception terminals; extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in each hidden layer are determined according to the data form of the perception data in each data form; and constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data.
According to the analytic method of the perception data, disclosed by the embodiment of the invention, the optimal number of the hidden layer layers and the number of the neurons are selected when the deep learning neural network is carried out according to the data form of the perception data subset, so that the accuracy of perception data information extraction is improved; by constructing the self-adaptive analysis model and analyzing the sensing data information by using the constructed self-adaptive analysis model, the sensing terminal sensing data analysis accuracy and the data analysis adaptivity are improved.
In addition, the method for parsing perceptual data according to the above embodiment of the present invention may further have the following additional technical features:
according to one embodiment of the invention, the acquiring the perception data set comprises: receiving the encrypted sensing data in multiple data forms of a plurality of sensing terminals, wherein each encrypted sensing data is obtained by encrypting according to an encryption algorithm corresponding to the data form of the encrypted sensing data; and calling a decryption algorithm corresponding to the data form of each encrypted sensing data to decrypt to obtain the sensing data set.
According to one embodiment of the invention, the sensing data of each data form are arranged in a preset sequence, wherein the corresponding decryption algorithm is called in sequence according to the preset sequence for decryption.
According to one embodiment of the invention, the deep learning neural network comprises an input layer, a hidden layer and an output layer which are connected in sequence,
the input of the input layer is as follows: the perception data set
Figure 301439DEST_PATH_IMAGE001
The corresponding outputs of the input layer are:
Figure 611198DEST_PATH_IMAGE002
wherein, in the process,
Figure 681922DEST_PATH_IMAGE003
an input matrix representing a plurality of data forms,
Figure 797645DEST_PATH_IMAGE004
a weight matrix representing the input layer,
Figure 637425DEST_PATH_IMAGE005
a bias matrix representing the input layer;
the inputs of the hidden layer are:
Figure 118085DEST_PATH_IMAGE006
wherein,
Figure 676106DEST_PATH_IMAGE007
a weight matrix representing the input layer to the hidden layer,
Figure 329941DEST_PATH_IMAGE008
represents the underlying substrate of the hidden layer,
Figure 24227DEST_PATH_IMAGE009
a weight matrix representing the a-1 layer hidden layer to the a layer hidden layer,
Figure 941368DEST_PATH_IMAGE010
the output quantity representing the a-1 layer hidden layer is transmitted to the part of the a layer hidden layer,
Figure 721105DEST_PATH_IMAGE011
representing the jth layer hidden layer state,
Figure 178631DEST_PATH_IMAGE012
representing a weight matrix between the two hidden layers, and determining according to the relationship between each neuron between the two hidden layers;
the output of the hidden layer is:
Figure 993003DEST_PATH_IMAGE013
wherein, in the process,
Figure 815466DEST_PATH_IMAGE014
representing an activation function;
the inputs to the output layer are:
Figure 82499DEST_PATH_IMAGE015
wherein
Figure 845181DEST_PATH_IMAGE016
a weight matrix representing the hidden layer to the output layer,
Figure 514060DEST_PATH_IMAGE017
a substrate representing the output layer;
the corresponding outputs of the output layer are as follows:
Figure 507424DEST_PATH_IMAGE018
wherein, in the process,
Figure 261753DEST_PATH_IMAGE019
in order to activate the function(s),
Figure 998765DEST_PATH_IMAGE020
wherein, in the process,
Figure 584467DEST_PATH_IMAGE021
representing perceptual data information.
According to an embodiment of the invention, the method further comprises: constructing a quantity adaptation model, and determining the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in each layer of the hidden layer by using the quantity adaptation model, wherein the quantity adaptation model is as follows:
Figure 14311DEST_PATH_IMAGE022
wherein,
Figure 255937DEST_PATH_IMAGE023
representing a number adaptation model, da representing an input matrix of multiple data forms, F representing a data form decomposition matrix, YK representing the number of hidden layers and a corresponding neuron number matrix, QW representing a weight matrix of the hidden layers, B representing a bias of the hidden layersSetting a matrix, YS represents a constraint condition,
Figure 531060DEST_PATH_IMAGE024
indicating the best number of adaptation results.
According to an embodiment of the invention, when the deep learning neural network is constructed in advance, a cross entropy loss function is adopted for construction, wherein an expression of the cross entropy loss function at the time t is as follows:
Figure 236848DEST_PATH_IMAGE025
wherein,
Figure 572014DEST_PATH_IMAGE026
representing the true result value at time t,
Figure 300936DEST_PATH_IMAGE027
a prediction result value indicating time t;
the global loss function for P moments is:
Figure 379750DEST_PATH_IMAGE028
wherein,
Figure 940045DEST_PATH_IMAGE029
function representing cross entropy loss at time t
Figure 711692DEST_PATH_IMAGE030
And (4) weighting values.
According to an embodiment of the present invention, the analyzing the sensing data information by using the adaptive analysis model to obtain an analysis result, including: classifying the perception data information to obtain a perception data information matrix; and extracting data information from the sensing data information matrix by using the self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result.
According to an embodiment of the present invention, the sensing data information matrix is:
Figure 662330DEST_PATH_IMAGE031
wherein, the ZData represents a perceptual data information matrix,
Figure 544836DEST_PATH_IMAGE032
the sensing data information set of the ith sensing terminal is represented, N represents the number of the sensing terminals,
Figure 461101DEST_PATH_IMAGE033
Figure 403649DEST_PATH_IMAGE034
Figure 841584DEST_PATH_IMAGE035
and M represents the number of types of the data forms of the perception data contained in the ith perception terminal.
According to an embodiment of the present invention, the adaptive analytical model is:
Figure 262201DEST_PATH_IMAGE036
wherein,
Figure 797088DEST_PATH_IMAGE037
representing an adaptive analysis model, ZData representing a sensing data information matrix, ST representing a data information extraction set, JX representing a data analysis method set,
Figure 910537DEST_PATH_IMAGE038
and showing the data analysis output result.
In order to achieve the above object, a second aspect of the present invention provides a system for parsing sensing data, where the system includes: the positioning identification module is used for acquiring a sensing data set, wherein the sensing data set comprises sensing data subsets of a plurality of sensing terminals, and the sensing data subsets comprise sensing data in a plurality of data forms acquired by the corresponding sensing terminals; the extraction module is used for extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons in the hidden layer of each data form are determined according to the data form of the perception data in each data form; and the data analysis module is used for analyzing the perception data information by utilizing a pre-constructed self-adaptive analysis model to obtain an analysis result of each perception data.
To achieve the above object, a third aspect of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for parsing sensing data as set forth in the first aspect of the present invention.
In order to achieve the above object, a fourth aspect of the present invention provides an electronic device, including a memory and a processor, where the memory stores a computer program, and the computer program, when executed by the processor, implements the method for parsing sensing data according to the embodiment of the first aspect of the present invention.
Drawings
FIG. 1 is a flow diagram of a method of parsing perceptual data according to an embodiment of the present invention;
FIG. 2 is a flow chart of acquiring a sensory data set according to one embodiment of the present invention;
FIG. 3 is a flow diagram of a process for parsing sensory data information according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a system for parsing perceptual data, in accordance with an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The method, system, storage medium, and electronic device for analyzing perceptual data according to embodiments of the present invention will be described in detail with reference to fig. 1 to 4 and specific embodiments of the present invention.
Fig. 1 is a flowchart of a method for parsing perceptual data according to an embodiment of the present invention. As shown in fig. 1, the method for parsing the sensing data may include:
the method includes the steps of S1, obtaining a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data in various data forms collected by corresponding perception terminals.
It should be noted that the sensing terminal in the embodiment of the present invention is a standard customized sensing terminal.
Specifically, sensing data subsets of N sensing terminals are obtained
Figure 835768DEST_PATH_IMAGE039
N is more than or equal to 2, and a sensing data set is obtained
Figure 60076DEST_PATH_IMAGE040
Wherein, N represents the number of the sensing terminals,
Figure 121573DEST_PATH_IMAGE041
a subset of sensing data representing the ith sensing terminal,
Figure 468240DEST_PATH_IMAGE042
. Perceptual data subsets
Figure 880767DEST_PATH_IMAGE041
The perception data comprises a plurality of data forms and can be expressed as
Figure 908766DEST_PATH_IMAGE043
Wherein M represents the number of kinds of data forms of the sensing data contained in the ith sensing terminal,
Figure 90349DEST_PATH_IMAGE044
a data packet representing the jth data form,
Figure 342338DEST_PATH_IMAGE045
and the data packet comprises all data sets corresponding to the data form under the corresponding sensing terminal.
In embodiments of the present invention, the dataform of the sensory data may include text, graphics, images, audio, video, and other forms of data.
To achieve better results, multiple subsets of perceptual data
Figure 242161DEST_PATH_IMAGE041
The sensing data of each data form in (1) are arranged in a preset order.
As a specific embodiment, when the number of the sensing terminals is 5, the sensing terminals correspond to the sensing data set
Figure 808272DEST_PATH_IMAGE046
When the data form contained in each sensing terminal is characters, images, audios and videos, and the preset sequence of the sensing data form is characters, images, audios and videos, the sensing data subset corresponding to the first sensing terminal has:
Figure 109940DEST_PATH_IMAGE047
wherein, in the process,
Figure 34296DEST_PATH_IMAGE048
textual form data representing the first perceiving terminal,
Figure 421415DEST_PATH_IMAGE049
image form data representing a first perception terminal,
Figure 791217DEST_PATH_IMAGE050
audio form data representing a first perceptual terminal,
Figure 681812DEST_PATH_IMAGE051
video form data representing a first perceptual terminal. The data packets in the sensing data subsets of other sensing terminals are also arranged according to the preset sequence, and are not described herein again.
In an embodiment of the present invention, as shown in fig. 2, acquiring the perception data set may include:
s11, receiving the encrypted sensing data in multiple data forms of a plurality of sensing terminals, wherein each encrypted sensing data is obtained by encrypting according to an encryption algorithm corresponding to the data form of the sensing data;
and S12, calling a decryption algorithm corresponding to the data form of each encrypted sensing data to decrypt to obtain a sensing data set.
In the embodiment of the invention, the sensing data of each data form are arranged according to a preset sequence, wherein the corresponding decryption algorithm is called in turn according to the preset sequence for decryption.
In order to ensure the security of the perception data in the obtained perception data set, prevent the perception data from being maliciously intercepted, stolen and tampered in the process of obtaining the perception data set, further ensure that the perception data analysis is smoothly and accurately carried out, and receive the perception data encrypted by each perception terminal.
In the embodiment of the invention, the perception terminal can acquire the perception data and perform self-adaptive encryption processing on the perception data while encrypting the processed perception data. The data acquisition module can be used for acquiring the sensing data of a plurality of sensing terminals, and the encryption module is used for completing the self-adaptive encryption processing of the sensing data of each sensing terminal.
Specifically, when each perception terminal or encryption module encrypts the perception data subset, the perception terminal or encryption module can encrypt the perception data subset according to the perception data subset
Figure 541184DEST_PATH_IMAGE041
The self-adaptive encryption method comprises the steps of constructing an autonomous selection model according to sensing data in various data forms contained in the data, and completing self-adaptive encryption of the sensing data in various data forms by utilizing the constructed autonomous selection model.
As a specific example, the autonomous selection model may be:
Figure 150020DEST_PATH_IMAGE052
wherein, data represents a perception Data set, W represents an encryption algorithm set of Data in a character form, S represents an encryption algorithm set of Data in a graphic form, P represents an encryption algorithm set of Data in an image form, V represents an encryption algorithm set of Data in an audio form, mo represents an encryption algorithm set of Data in a video form, ot represents an encryption algorithm set of other Data forms,
Figure 323512DEST_PATH_IMAGE053
representing the autonomous selection model output, i.e. the encrypted processed perceptual data set.
The autonomous selection model may be based on each subset of perceptual data
Figure 334193DEST_PATH_IMAGE041
The method comprises the following steps of calling corresponding encryption algorithms in a cryptographic algorithm database in an encryption module according to sensing data in different data forms, and encrypting the sensing data in the corresponding data form by using the corresponding encryption algorithms, wherein the specific process comprises the following steps:
Figure 364466DEST_PATH_IMAGE054
the perception data in the embodiment of the invention is obtained by adaptively calling the encryption algorithm to the perception data subsets containing different data forms through the established autonomous selection model for encryption, so that the security of the perception data transmission process is ensured, malicious interception, stealing and tampering of the perception data are effectively prevented, the data analysis is further ensured to be smoothly and accurately carried out, and the accuracy of the data analysis is improved.
In the embodiment of the invention, before analyzing the encrypted sensing data, the encrypted sensing data needs to be decrypted. Sensing data after processing encryption
Figure 460598DEST_PATH_IMAGE053
When adaptive decryption is carried out, the perception data after encryption processing can be used
Figure 172202DEST_PATH_IMAGE053
Self-adapting decryption is carried out on the contained data forms one by one, namely, a decryption algorithm of a corresponding data form in a cryptographic algorithm database in an encryption module is called to encrypt the encrypted sensing data
Figure 37390DEST_PATH_IMAGE055
Decrypting to obtain decrypted sensing data
Figure 176247DEST_PATH_IMAGE056
The specific process is as follows:
Figure 821992DEST_PATH_IMAGE057
wherein,
Figure 71708DEST_PATH_IMAGE058
representing the set of perceptual data after the cryptographic process,
Figure 56982DEST_PATH_IMAGE059
the perception data subset of the ith perception terminal after encryption is represented,
Figure 366740DEST_PATH_IMAGE060
a set of decryption algorithms representing data in textual form,
Figure 995387DEST_PATH_IMAGE061
representing graphical formsA set of decryption algorithms for the data,
Figure 48794DEST_PATH_IMAGE062
a set of decryption algorithms representing data in image form,
Figure 888574DEST_PATH_IMAGE063
a set of decryption algorithms representing data in audio form,
Figure 369233DEST_PATH_IMAGE064
a set of decryption algorithms representing data in video form,
Figure 723991DEST_PATH_IMAGE065
a set of decryption algorithms representing other forms of data,
Figure 581089DEST_PATH_IMAGE066
representing the decrypted processed set of perceptual data,
Figure 275376DEST_PATH_IMAGE067
and indicating the perception data subset of the ith perception terminal after decryption.
It should be noted that the encryption algorithm and the decryption algorithm in the cryptographic algorithm database are both symmetric, and in order to ensure successful decryption, the encryption algorithm is reversible encryption. The encryption and decryption algorithms in the cryptographic algorithm database can be updated in real time according to the data form and other data characteristics of the sensing data of the sensing terminal.
And S2, extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of hidden layers of the deep learning neural network and the number of neurons in each hidden layer are determined according to the data form of the perception data in each data form.
In particular, a pre-constructed deep learning neural network can be utilized to sense a data set
Figure 926937DEST_PATH_IMAGE068
The sensing data in the system are positioned and identified to extract a sensing data set
Figure 34570DEST_PATH_IMAGE069
The perceptual data information in
Figure 429779DEST_PATH_IMAGE070
Wherein each subset
Figure 978572DEST_PATH_IMAGE071
The method comprises the sensing data information extracted from the sensing data of the corresponding sensing terminal. In embodiments of the invention, data information is perceived
Figure 66614DEST_PATH_IMAGE072
Device information, custom configuration script information, and other related information for each aware terminal may be included.
In an embodiment of the present invention, the deep learning neural network may include an input layer, a hidden layer, and an output layer, which are connected in sequence;
in an embodiment of the invention, the inputs to the input layer are: perceptual data set
Figure 395964DEST_PATH_IMAGE001
In the embodiment of the present invention, the corresponding outputs of the input layer are:
Figure 594864DEST_PATH_IMAGE073
wherein
Figure 998164DEST_PATH_IMAGE003
an input matrix representing a plurality of data forms,
Figure 257107DEST_PATH_IMAGE074
a weight matrix representing the input layer is represented,
Figure 575218DEST_PATH_IMAGE005
representing input layersA bias matrix;
the inputs to the hidden layer are:
Figure 312230DEST_PATH_IMAGE006
wherein,
Figure 835615DEST_PATH_IMAGE007
a weight matrix representing the input layer to the hidden layer,
Figure 265459DEST_PATH_IMAGE008
a substrate of a hidden layer is represented,
Figure 303823DEST_PATH_IMAGE009
a weight matrix representing the a-1 layer hidden layer to the a layer hidden layer,
Figure 844525DEST_PATH_IMAGE010
the output quantity representing the layer a-1 hidden layer is transferred to the part of the layer a hidden layer,
Figure 487996DEST_PATH_IMAGE011
indicating the status of the stratum j-th hidden layer,
Figure 823163DEST_PATH_IMAGE012
representing a weight matrix between the two hidden layers, and determining according to the relationship between each neuron between the two hidden layers;
the output of the hidden layer is:
Figure 614401DEST_PATH_IMAGE075
wherein, in the process,
Figure 693216DEST_PATH_IMAGE014
representing an activation function;
the inputs to the output layer are:
Figure 191193DEST_PATH_IMAGE015
wherein, in the process,
Figure 697261DEST_PATH_IMAGE016
a weight matrix representing the hidden layer to the output layer,
Figure 975795DEST_PATH_IMAGE017
a substrate representing an output layer;
the corresponding outputs of the output layer are:
Figure 858301DEST_PATH_IMAGE018
wherein
Figure 210785DEST_PATH_IMAGE019
in order to activate the function(s),
Figure 153333DEST_PATH_IMAGE020
wherein
Figure 155049DEST_PATH_IMAGE021
representing perceptual data information.
In an embodiment of the invention, the perceptual dataset of the input layer is input
Figure 575666DEST_PATH_IMAGE001
To decrypt the processed perceptual data set. Perceptual data set
Figure 48236DEST_PATH_IMAGE076
The data package comprises text, graphics, images, audio, video and other forms of sensing data of the sensing terminal.
Further specifically, decrypting the processed perceptual data
Figure 896106DEST_PATH_IMAGE077
Inputting an input layer of a deep learning neural network, the deep learning neural network having a set of hidden layer states
Figure 883654DEST_PATH_IMAGE078
Figure 107962DEST_PATH_IMAGE079
Indicating the state of the hidden layer of the j-th layer, n indicating the number of layers of the hidden layer,
Figure 435038DEST_PATH_IMAGE080
(ii) a The output perception data information of the deep learning neural network output layer is
Figure 781705DEST_PATH_IMAGE081
Further specifically, when the hidden layer has only one layer, the inputs of the hidden layer are:
Figure 256549DEST_PATH_IMAGE082
when the hidden layer contains multiple layers, the inputs to the layer a neurons of the hidden layer are:
Figure 18969DEST_PATH_IMAGE006
when the embodiment of the invention outputs a plurality of layers of hidden layers, the states of the previous layers of hidden layers are taken into account, so that the input parameters of each layer of the hidden layer are richer, the calculation output result is more accurate, and the construction of a deep learning neural network can be completed more quickly.
In an embodiment of the present invention, the method for parsing sensing data may further include: and constructing a quantity adaptation model, and determining the number of layers of hidden layers of the deep learning neural network and the number of neurons in each hidden layer by using the quantity adaptation model.
And constructing a quantity adaptation model for improving the accuracy of the perception data information extracted by the deep learning neural network. The constructed quantity adaptation model can determine the number of layers of hidden layers of the deep learning neural network and the number of neurons in each hidden layer according to the input matrix in the multi-data form.
As a specific example, the quantity adaptive model may be:
Figure 200551DEST_PATH_IMAGE022
wherein,
Figure 219585DEST_PATH_IMAGE023
representing a quantity adaptation model, da representing an input matrix of multiple data forms, F representing a data form decomposition matrix, YK representing the number of layers of a hidden layer and a corresponding neuron quantity matrix, QW representing a weight matrix of the hidden layer, B representing a bias matrix of the hidden layer, YS representing a constraint condition,
Figure 119408DEST_PATH_IMAGE024
indicating the best number of adaptation results.
In embodiments of the present invention, constraints may be computational power, gradients, activation functions, norm, and other influencing factors.
Specifically, the constructed quantity adaptation model decomposes an input matrix Da with multiple data forms through a data form decomposition matrix F to obtain data sets with various forms, a Viterbi algorithm can be used for dynamic planning, and the obtained sets are combined with constraint conditions under the action of a hidden layer to obtain an optimal quantity adaptation result
Figure 685519DEST_PATH_IMAGE024
Figure 783925DEST_PATH_IMAGE083
For better effect, the quantity adaptation can be carried out only for the first time when each data form appears, so that the subsequent workload is reduced, and meanwhile, if a new data form is added, the calculation can be directly carried out through a quantity adaptation model.
According to the quantity adaptation model of the embodiment of the invention, according to the data form of the perception data, when the deep learning neural network extracts the information of the perception data, the optimal number of the hidden layer layers and the number of the neurons in each hidden layer are selected, so that the accuracy of data information extraction is improved, and the adaptivity of perception data analysis and the accuracy of data analysis are further improved.
In the embodiment of the invention, when the deep learning neural network is constructed in advance, a cross entropy loss function can be adopted for construction, wherein the expression of the cross entropy loss function at the time t is as follows:
Figure 410078DEST_PATH_IMAGE025
wherein,
Figure 797197DEST_PATH_IMAGE026
representing the true result value at time t,
Figure 901419DEST_PATH_IMAGE027
a predicted result value indicating time t, a predicted result value
Figure 119911DEST_PATH_IMAGE027
Obtained through a deep learning neural network;
the global loss function for P moments is:
Figure 916966DEST_PATH_IMAGE025
wherein,
Figure 525802DEST_PATH_IMAGE029
function representing cross entropy loss at time t
Figure 699294DEST_PATH_IMAGE030
And (4) weighting values.
According to the embodiment of the invention, the constructed optimized cross entropy loss function is used for updating the parameters of the deep learning neural network
Figure 772292DEST_PATH_IMAGE084
Figure 740248DEST_PATH_IMAGE085
Figure 836380DEST_PATH_IMAGE086
Figure 282405DEST_PATH_IMAGE087
Specifically, the loss of the deep learning neural network at the current position is quantified by a loss function L, i.e., the true result value is quantified
Figure 971095DEST_PATH_IMAGE088
And deep learning neural network prediction result value
Figure 109952DEST_PATH_IMAGE089
Wherein, the optimized cross entropy loss function of the deep learning neural network at time t is:
Figure 693380DEST_PATH_IMAGE025
then the global penalty for P moments is:
Figure 943096DEST_PATH_IMAGE028
wherein,
Figure 990686DEST_PATH_IMAGE029
function representing cross entropy loss at time t
Figure 300445DEST_PATH_IMAGE030
And (4) weighting values.
In the embodiment of the invention, iterative computation can be carried out by utilizing a slope descent method to update grid parameters of the deep learning neural network, during reverse propagation, the gradient loss of a certain perception data characteristic sequence at the time t is determined by the gradient loss corresponding to the output at the current time and the gradient loss of a perception data characteristic sequence index at the time t +1, the gradient loss of a certain perception data characteristic sequence at the time t of a weight matrix needs to be subjected to calculation of reverse propagation step by step, and during the time t, the gradient loss of a certain perception data characteristic sequence at the time t needs to be subjected to calculation of reverse propagation step by stepHidden layer states indexed by perceptual data feature sequences
Figure 371169DEST_PATH_IMAGE091
The gradient of (d) is:
Figure 424576DEST_PATH_IMAGE092
thereby obtaining a gradient calculation expression for updating the neural network parameters:
Figure 326672DEST_PATH_IMAGE093
Figure 807332DEST_PATH_IMAGE094
Figure 99773DEST_PATH_IMAGE095
Figure 956871DEST_PATH_IMAGE096
and until the values of the parameters are converged, completing the construction of the deep learning neural network.
According to the embodiment of the invention, the parameters of the deep learning neural network are updated by constructing and optimizing the cross entropy loss function, so that the deep learning neural network construction is completed more accurately and rapidly.
And S3, constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data.
In an embodiment of the present invention, as shown in fig. 3, performing an analysis process on the sensing data information by using an adaptive analysis model to obtain an analysis result, including:
s31, classifying the sensing data information to obtain a sensing data information matrix;
and S32, extracting data information from the sensing data information matrix by using a self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result.
In an embodiment of the present invention, the sensing data information matrix is:
Figure 713474DEST_PATH_IMAGE097
wherein, the ZData represents a perceptual data information matrix,
Figure 365036DEST_PATH_IMAGE098
the sensing data information set of the ith sensing terminal is represented, N represents the number of the sensing terminals,
Figure 410352DEST_PATH_IMAGE099
Figure 805561DEST_PATH_IMAGE100
Figure 918136DEST_PATH_IMAGE101
and M represents the number of types of the data forms of the perception data contained in the ith perception terminal.
In the embodiment of the present invention, the adaptive analytical model is:
Figure 6178DEST_PATH_IMAGE102
wherein,
Figure 273211DEST_PATH_IMAGE103
represents an adaptive analysis model, ZData represents a sensing data information matrix, ST represents a data information extraction set, JX represents a data analysis method set,
Figure 206532DEST_PATH_IMAGE104
and showing the data analysis output result.
Specifically, the data processing module can be used for calling a control protocol dictionary in a protocol database and a deep neural network protocol to classify the extracted sensing data information so as to obtain more accuracyDefinite data information, i.e. matrices of perceptual data information
Figure 937728DEST_PATH_IMAGE105
Figure 196671DEST_PATH_IMAGE106
And representing the perception data information set of the ith perception terminal. In the embodiment of the invention, the obtained sensing data information matrix ZData can contain equipment information of a sensing terminal corresponding to each sensing data, an agreed data transmission protocol, custom configuration script information and other data analysis related information.
In the embodiment of the invention, the protocol database is provided with a control protocol dictionary and a deep neural network protocol, and is updated according to the protocol to which the data form of the perception data belongs.
It should be noted that the data transmission protocol may be obtained in the protocol database, and the data transmission protocol is defined when the standard customized sensing terminal is accessed. And if a new protocol is generated, updating the protocol database, and optimizing the protocol in the protocol database according to the common frequency of the protocol.
In the embodiment of the invention, the adaptive analysis model constructed by the data analysis module can be used for analyzing data information elements in the perception data information set ZData, namely the adaptive analysis model calls a more matched analysis method in the data analysis method library to complete intelligent analysis of perception data, so as to obtain a control protocol, a deep neural network protocol, a custom configuration script and other related data analysis data information which are followed by a corresponding perception terminal, and the data analysis is performed on the processed perception data by constructing the adaptive analysis model, so as to obtain a data analysis result. After the data analysis result is obtained, the data analysis result can be further uploaded to the Internet of things platform.
In the embodiment of the invention, the data analysis method library stores a common analysis method to analyze the data format into the standard data format Alink JSON, so that a data analysis module can call the analysis method conveniently, and meanwhile, the data analysis method library is updated additionally.
It should be noted that in the process of analyzing the sensing data, a new data analysis method is found, or a corresponding data analysis method is added due to the increase of the sensing terminals, and the newly added data analysis method is stored in the data analysis method library, so as to perform optimization updating on the data analysis method library.
According to the embodiment of the invention, the self-defined configuration script of the standard customized sensing terminal and other data analysis related data information are obtained by constructing the self-adaptive analysis model and analyzing the sensing data information by using the constructed self-adaptive analysis model, and the accuracy of the data intelligent analysis system of the standard customized sensing terminal and the adaptivity of data analysis are improved by a self-adaptive selection data analysis method.
According to the method for analyzing the perception data, the obtained perception data is the perception data which is encrypted by adaptively using an encryption algorithm for the perception data in different data forms by utilizing the established autonomous selection model, the safety of the transmission process of the perception data is guaranteed, the perception data is effectively prevented from being intercepted, stolen and tampered maliciously, the data analysis is further guaranteed to be performed smoothly and accurately, and the accuracy of the data analysis is improved.
According to the method for analyzing the perception data, disclosed by the embodiment of the invention, the optimal number of hidden layers and the number of the neurons are selected by utilizing the quantity adaptation model when the deep learning neural network is carried out according to the data form of the perception data subset, so that the accuracy of perception data information extraction is improved, and the adaptivity and the data analysis accuracy of a system are further improved.
According to the method for analyzing the perception data, when the multiple layers of hidden layers are output, the states of the previous layers of hidden layers are taken into consideration, so that the input parameters of the layers of the hidden layers are richer, the calculation and output results are more accurate, and the construction of a deep learning neural network is completed more quickly.
According to the analytic method of the perception data, disclosed by the embodiment of the invention, the optimized cross entropy loss function is constructed to update the parameters of the deep learning neural network, so that the deep learning neural network is accurately and quickly constructed.
According to the method for analyzing the perception data, disclosed by the embodiment of the invention, the self-defined configuration script of the standard customized perception terminal and other data analysis related data information are obtained by extracting the perception data information, and the data analysis method is selected in a self-adaptive manner, so that the accuracy of a data intelligent analysis system of the standard customized perception terminal and the self-adaptability of data analysis are improved.
The invention also provides a system for analyzing the perception data.
FIG. 4 is a schematic diagram of a system for parsing perceptual data, in accordance with an embodiment of the present invention. As shown in fig. 4, the system for parsing sensing data may include: a positioning identification module 30, an extraction module 40 and a data analysis module 50.
The positioning identification module 30 is configured to obtain a sensing data set, where the sensing data set includes sensing data subsets of multiple sensing terminals, and the sensing data subsets include sensing data in multiple data forms acquired by corresponding sensing terminals; the extraction module 40 is configured to perform information extraction on the sensing data in each data form in the sensing data set by using a pre-constructed deep learning neural network to obtain sensing data information, wherein for the sensing data in each data form, the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in the hidden layer of each layer are determined according to the data form of the sensing data; and the data analysis module 50 is configured to analyze the sensing data information by using a pre-constructed adaptive analysis model to obtain an analysis result of each sensing data.
In an embodiment of the present invention, the system for parsing sensing data may further include: a data acquisition module 10 and an encryption module 20. The data acquisition module 10 is configured to acquire sensing data subsets of a plurality of sensing terminals; the encryption module 20 is configured to invoke an encryption algorithm corresponding to the sensing data form, and encrypt the sensing data in the corresponding data form to obtain encrypted sensing data; the positioning identification module 30 is further configured to call, for each piece of encrypted sensing data, a decryption algorithm corresponding to the data format of the sensing data to decrypt the sensing data, so as to obtain a sensing data set.
In the embodiment of the invention, the sensing data in each data form are arranged according to a preset sequence, wherein the corresponding decryption algorithms are called in sequence according to the preset sequence for decryption.
In order to ensure the security of the perception data in the obtained perception data set, prevent the perception data from being maliciously intercepted, stolen and tampered in the process of obtaining the perception data set, further ensure that the perception data analysis is smoothly and accurately carried out, and receive the perception data encrypted by each perception terminal.
Specifically, when the encryption module encrypts the sensing data of each sensing terminal, the encryption module can encrypt the sensing data according to the sensing data subset
Figure 685421DEST_PATH_IMAGE108
The self-adaptive encryption method comprises the steps of constructing an autonomous selection model according to sensing data in various data forms contained in the data, and completing self-adaptive encryption of the sensing data in various data forms by utilizing the constructed autonomous selection model.
As a specific example, the autonomous selection model may be:
Figure 688012DEST_PATH_IMAGE109
wherein, data represents a perception Data set, W represents an encryption algorithm set of Data in a character form, S represents an encryption algorithm set of Data in a graphic form, P represents an encryption algorithm set of Data in an image form, V represents an encryption algorithm set of Data in an audio form, mo represents an encryption algorithm set of Data in a video form, ot represents an encryption algorithm set of other Data forms,
Figure 273714DEST_PATH_IMAGE110
representing the autonomous selection model output, i.e. the encrypted processed perceptual data set.
The autonomous selection model may be based on each subset of perceptual data
Figure 437979DEST_PATH_IMAGE111
Calling corresponding encryption algorithms in an encryption module cryptographic algorithm database, and encrypting the sensing data in the corresponding data form by using the corresponding encryption algorithms, wherein the specific process is as follows:
Figure 679605DEST_PATH_IMAGE112
according to the embodiment of the invention, the autonomous selection model is constructed, so that the encryption module adaptively calls the encryption algorithm to the sensing data subsets containing different data forms for encryption by utilizing the constructed autonomous selection model, the security of the sensing data transmission process is ensured, malicious interception, stealing and tampering on the sensing data are effectively prevented, the data analysis is further ensured to be smoothly and accurately carried out, and the accuracy of the data analysis is improved.
In the embodiment of the present invention, before the extraction module 40 analyzes the encrypted sensing data, the positioning identification module 30 needs to decrypt the encrypted sensing data. Sensing data after processing encryption
Figure 220307DEST_PATH_IMAGE110
When adaptive decryption is carried out, the perception data after encryption processing can be used
Figure 660516DEST_PATH_IMAGE113
Self-adapting decryption is carried out on the contained data forms one by one, namely, a decryption algorithm of a corresponding data form in a cryptographic algorithm database in an encryption module is called to encrypt the encrypted sensing data
Figure 261262DEST_PATH_IMAGE110
Decrypting to obtain decrypted sensing data
Figure 724604DEST_PATH_IMAGE114
The specific process is as follows:
Figure 68998DEST_PATH_IMAGE115
wherein,
Figure 130757DEST_PATH_IMAGE116
representing the set of perceptual data after the cryptographic processing,
Figure 636824DEST_PATH_IMAGE117
the perception data subset of the ith perception terminal after encryption is represented,
Figure 853042DEST_PATH_IMAGE118
a set of decryption algorithms representing data in textual form,
Figure 797864DEST_PATH_IMAGE119
a set of decryption algorithms representing data in graphical form,
Figure 150348DEST_PATH_IMAGE120
a set of decryption algorithms representing data in image form,
Figure 827317DEST_PATH_IMAGE121
a set of decryption algorithms representing data in audio form,
Figure 265252DEST_PATH_IMAGE122
a set of decryption algorithms representing data in video form,
Figure 13765DEST_PATH_IMAGE123
a set of decryption algorithms representing other forms of data,
Figure 220756DEST_PATH_IMAGE124
representing the decrypted processed set of perceptual data,
Figure 334205DEST_PATH_IMAGE125
and indicating the perception data subset of the ith perception terminal after decryption.
It should be noted that the encryption algorithm and the decryption algorithm in the cryptographic algorithm database are both symmetric, and in order to ensure successful decryption, the encryption algorithm is reversible encryption. The encryption and decryption algorithms in the cryptographic algorithm database can be updated in real time according to the data form and other data characteristics of the sensing data of the sensing terminal.
In an embodiment of the present invention, the deep learning neural network may include an input layer, a hidden layer and an output layer connected in sequence,
the inputs to the input layer are: perceptual data set
Figure 321753DEST_PATH_IMAGE126
The corresponding outputs of the input layer are:
Figure 546061DEST_PATH_IMAGE127
wherein
Figure 873137DEST_PATH_IMAGE128
an input matrix representing a plurality of data forms,
Figure 157488DEST_PATH_IMAGE129
a weight matrix representing the input layer is used,
Figure 868217DEST_PATH_IMAGE130
a bias matrix representing an input layer;
the inputs to the hidden layer are:
Figure 896216DEST_PATH_IMAGE131
wherein,
Figure 77798DEST_PATH_IMAGE132
a weight matrix representing the input layer to the hidden layer,
Figure 595367DEST_PATH_IMAGE133
a substrate of a hidden layer is represented,
Figure 495190DEST_PATH_IMAGE134
represents a weight matrix of the a-1 layer hidden layer to the a layer hidden layer,
Figure 61301DEST_PATH_IMAGE135
the output quantity representing the layer a-1 hidden layer is transferred to the part of the layer a hidden layer,
Figure 97390DEST_PATH_IMAGE136
representing the jth layer hidden layer state,
Figure 785860DEST_PATH_IMAGE137
representing a weight matrix between the two hidden layers, and determining according to the relationship between each neuron between the two hidden layers;
the output of the hidden layer is:
Figure 907400DEST_PATH_IMAGE138
wherein
Figure 277201DEST_PATH_IMAGE139
representing an activation function;
the inputs to the output layer are:
Figure 495693DEST_PATH_IMAGE140
wherein
Figure 292748DEST_PATH_IMAGE141
a weight matrix representing the hidden layer to the output layer,
Figure 901584DEST_PATH_IMAGE142
a substrate representing an output layer;
the corresponding outputs of the output layer are:
Figure 809497DEST_PATH_IMAGE143
wherein
Figure 214038DEST_PATH_IMAGE139
in order to activate the function(s),
Figure 181994DEST_PATH_IMAGE144
wherein
Figure 278126DEST_PATH_IMAGE145
representing perceptual data information.
In embodiments of the present invention, the data parsing module 50 may also be used to build an adaptive parsing model. The self-adaptive analysis model comprises the following steps:
Figure 989730DEST_PATH_IMAGE102
wherein,
Figure 182814DEST_PATH_IMAGE103
represents an adaptive analysis model, ZData represents a perception data information set, ST represents a data information extraction set, JX represents a data analysis method set,
Figure 321671DEST_PATH_IMAGE104
and showing the data analysis output result.
Specifically, the extracted perceptual data information is classified to obtain more accurate data information, i.e., a perceptual data information matrix ZData. The sensing data information matrix ZData can contain device information of sensing terminals corresponding to sensing data, appointed data transmission protocols, custom configuration script information and other data analysis related information.
For other specific embodiments of the system for parsing sensing data according to the embodiment of the present invention, reference may be made to the specific embodiments of the method for parsing sensing data according to the above-mentioned embodiment of the present invention.
According to the analytic system of the perception data, disclosed by the embodiment of the invention, the optimal number of hidden layer layers and the number of neurons are selected when deep learning of a neural network is carried out according to the data form of the perception data subset, so that the accuracy of perception data information extraction is improved, and the adaptivity and the data analytic accuracy of the system are further improved; by constructing the self-adaptive analysis model and analyzing the perception data information by utilizing the constructed self-adaptive analysis model, the user-defined configuration script of the standard customized perception terminal and other data analysis related data information are obtained, and the data analysis method is selected in a self-adaptive mode, so that the accuracy of the data intelligent analysis system of the standard customized perception terminal and the self-adaptability of data analysis are improved.
The invention also provides a computer readable storage medium.
In this embodiment, a computer program is stored on a computer-readable storage medium, and when being executed by a processor, the computer program corresponds to the above-mentioned method for analyzing perceptual data, and implements the above-mentioned method for analyzing perceptual data.
The invention also provides the electronic equipment.
In this embodiment, the electronic device includes a processor, a memory, and a computer program stored on the memory, and when the processor executes the computer program, the method for parsing the sensing data is implemented.
According to the readable storage medium and the electronic device, the accuracy of multi-data-form sensing data analysis and the adaptivity of data analysis are improved by using the sensing data analysis method.
It should be noted that the logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one of the feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (11)

1. A method for parsing perceptual data, the method comprising:
acquiring a perception data set, wherein the perception data set comprises perception data subsets of a plurality of perception terminals, and the perception data subsets comprise perception data of a plurality of data forms acquired by corresponding perception terminals;
extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons in the hidden layer of each layer are determined according to the data form of the perception data in each data form;
constructing a self-adaptive analysis model, and analyzing the sensing data information by using the self-adaptive analysis model to obtain an analysis result of each sensing data; the analyzing the perception data information by using the self-adaptive analysis model to obtain an analysis result comprises the following steps:
classifying the perception data information to obtain a perception data information matrix;
extracting data information from the sensing data information matrix by using the self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result;
wherein the perceptual data information matrix is:
Figure 588359DEST_PATH_IMAGE001
wherein, the ZData represents a perceptual data information matrix,
Figure 704082DEST_PATH_IMAGE002
a sensing data information set representing the ith sensing terminal, N represents the number of the sensing terminals,
Figure 543862DEST_PATH_IMAGE003
Figure 24522DEST_PATH_IMAGE004
Figure 127083DEST_PATH_IMAGE005
the data packet represents the jth data form, and M represents the number of the types of the data forms of the sensing data contained in the ith sensing terminal;
the self-adaptive analysis model is as follows:
Figure 984180DEST_PATH_IMAGE006
wherein,
Figure 740784DEST_PATH_IMAGE007
an adaptive analytical model is represented which is,
Figure 392345DEST_PATH_IMAGE008
representing a sensing data information matrix, ST representing a data information extraction set, JX representing a data analysis method set,
Figure 437661DEST_PATH_IMAGE009
and representing the data analysis output result.
2. The method for parsing perceptual data as defined in claim 1, wherein the obtaining the perceptual data set comprises:
receiving the encrypted sensing data in multiple data forms of a plurality of sensing terminals, wherein each encrypted sensing data is obtained by encrypting according to an encryption algorithm corresponding to the data form of the encrypted sensing data;
and calling a decryption algorithm corresponding to the data form of each encrypted sensing data to decrypt to obtain the sensing data set.
3. The method for parsing sensing data according to claim 2, wherein the sensing data in each data format are arranged in a preset sequence, and wherein decryption is performed by sequentially invoking corresponding decryption algorithms according to the preset sequence.
4. The parsing method of perception data as claimed in claim 3, wherein the deep learning neural network includes an input layer, a hidden layer and an output layer connected in sequence,
the input of the input layer is as follows: the perception data set
Figure 645920DEST_PATH_IMAGE010
The corresponding outputs of the input layer are:
Figure 194713DEST_PATH_IMAGE011
wherein
Figure 345072DEST_PATH_IMAGE012
an input matrix representing a plurality of forms of data,
Figure 612105DEST_PATH_IMAGE013
a weight matrix representing the input layer is provided,
Figure 811005DEST_PATH_IMAGE014
a bias matrix representing the input layer;
the inputs of the hidden layer are:
Figure 27354DEST_PATH_IMAGE015
wherein,
Figure 286297DEST_PATH_IMAGE016
a weight matrix representing the input layer to the hidden layer,
Figure 102943DEST_PATH_IMAGE017
represents the underlying substrate of the hidden layer,
Figure 839955DEST_PATH_IMAGE018
a weight matrix representing the a-1 layer hidden layer to the a layer hidden layer,
Figure 363340DEST_PATH_IMAGE019
the output quantity representing the a-1 layer hidden layer is transmitted to the part of the a layer hidden layer,
Figure 606234DEST_PATH_IMAGE020
representing the jth layer hidden layer state,
Figure 582280DEST_PATH_IMAGE021
representing a weight matrix between the two hidden layers, and determining according to the relationship between each neuron between the two hidden layers;
the output of the hidden layer is:
Figure 122983DEST_PATH_IMAGE022
wherein, in the process,
Figure 828771DEST_PATH_IMAGE023
representing an activation function;
input of the output layerComprises the following steps:
Figure 163937DEST_PATH_IMAGE024
wherein, in the process,
Figure 708838DEST_PATH_IMAGE025
a weight matrix representing the hidden layer to the output layer,
Figure 787652DEST_PATH_IMAGE026
a substrate representing the output layer;
the corresponding outputs of the output layer are as follows:
Figure 285630DEST_PATH_IMAGE027
wherein
Figure 854014DEST_PATH_IMAGE023
in order to activate the function(s),
Figure 70232DEST_PATH_IMAGE028
wherein
Figure 952737DEST_PATH_IMAGE029
representing perceptual data information.
5. The method for parsing perceptual data according to claim 4, wherein the method further comprises:
constructing a quantity adaptation model, and determining the number of layers of a hidden layer of the deep learning neural network and the number of neurons in each layer of the hidden layer by using the quantity adaptation model, wherein the quantity adaptation model is as follows:
Figure 118271DEST_PATH_IMAGE030
wherein,
Figure 60819DEST_PATH_IMAGE031
representing a quantity adaptation model, da representing an input matrix of multiple data forms, F representing a data form decomposition matrix, YK representing the number of layers of a hidden layer and a corresponding neuron quantity matrix, QW representing a weight matrix of the hidden layer, B representing a bias matrix of the hidden layer, YS representing a constraint condition,
Figure 561070DEST_PATH_IMAGE032
indicating the best number of adaptation results.
6. The method for analyzing perception data according to claim 5, wherein when the deep learning neural network is constructed in advance, a cross entropy loss function is adopted for construction, wherein an expression of the cross entropy loss function at time t is as follows:
Figure 981687DEST_PATH_IMAGE033
wherein,
Figure 454257DEST_PATH_IMAGE034
representing the true result value at time t,
Figure 115177DEST_PATH_IMAGE035
a prediction result value indicating time t;
the global loss function for P moments is:
Figure 40407DEST_PATH_IMAGE036
wherein,
Figure 264715DEST_PATH_IMAGE037
function representing cross entropy loss at time t
Figure 654108DEST_PATH_IMAGE038
And (4) weighting values.
7. A system for parsing perceptual data, the system comprising:
the positioning identification module is used for acquiring a sensing data set, wherein the sensing data set comprises sensing data subsets of a plurality of sensing terminals, and the sensing data subsets comprise sensing data in a plurality of data forms acquired by the corresponding sensing terminals;
the extraction module is used for extracting information of the perception data in each data form in the perception data set by using a pre-constructed deep learning neural network to obtain perception data information, wherein the number of layers of a hidden layer of the deep learning neural network and the number of neurons contained in each hidden layer are determined according to the data form of the perception data in each data form;
the data analysis module is used for analyzing the perception data information by utilizing a pre-constructed self-adaptive analysis model to obtain an analysis result of each perception data;
the data analysis module is specifically used for classifying the perception data information to obtain a perception data information matrix;
extracting data information from the sensing data information matrix by using the self-adaptive analysis model, and calling a corresponding data analysis method to analyze the extracted data information to obtain an analysis result;
wherein the perceptual data information matrix is:
Figure 938459DEST_PATH_IMAGE001
wherein, the ZData represents a perceptual data information matrix,
Figure 164035DEST_PATH_IMAGE002
a sensing data information set representing the ith sensing terminal, N represents the number of the sensing terminals,
Figure 926455DEST_PATH_IMAGE039
Figure 108037DEST_PATH_IMAGE040
Figure 625606DEST_PATH_IMAGE005
the data packet represents the jth data form, and M represents the number of the types of the data forms of the sensing data contained in the ith sensing terminal;
the self-adaptive analysis model is as follows:
Figure 525429DEST_PATH_IMAGE006
wherein,
Figure 901659DEST_PATH_IMAGE007
an adaptive analytical model is represented that is,
Figure 937749DEST_PATH_IMAGE008
representing a perceptual data information matrix, ST representing a data information extraction set, JX representing a data parsing method set,
Figure 563902DEST_PATH_IMAGE009
and showing the data analysis output result.
8. The system for parsing perceptual data as defined in claim 7, wherein the system further comprises:
the data acquisition module is used for acquiring perception data subsets of a plurality of perception terminals;
the encryption module is used for calling an encryption algorithm corresponding to the sensing data form, and encrypting the sensing data in the corresponding data form to obtain the encrypted sensing data;
the positioning identification module is further configured to call a decryption algorithm corresponding to the data format of each encrypted sensing data to decrypt the data, so as to obtain the sensing data set.
9. The analytic system of perception data of claim 8, wherein the deep learning neural network includes an input layer, a hidden layer, and an output layer connected in sequence,
the input of the input layer is as follows: the perception data set
Figure 13338DEST_PATH_IMAGE041
The corresponding outputs of the input layer are:
Figure 117560DEST_PATH_IMAGE042
wherein
Figure 86784DEST_PATH_IMAGE012
an input matrix representing a plurality of data forms,
Figure 883839DEST_PATH_IMAGE013
a weight matrix representing the input layer,
Figure 492675DEST_PATH_IMAGE014
a bias matrix representing the input layer;
the inputs of the hidden layer are:
Figure 728484DEST_PATH_IMAGE043
wherein,
Figure 739165DEST_PATH_IMAGE044
a weight matrix representing the input layer to the hidden layer,
Figure 707121DEST_PATH_IMAGE017
represents the substrate of the hidden layer or layers,
Figure 616303DEST_PATH_IMAGE018
a weight matrix representing the a-1 layer hidden layer to the a layer hidden layer,
Figure 62328DEST_PATH_IMAGE019
the output quantity representing the layer a-1 hidden layer is transferred to the part of the layer a hidden layer,
Figure 255412DEST_PATH_IMAGE020
indicating the status of the stratum j-th hidden layer,
Figure 394269DEST_PATH_IMAGE021
representing a weight matrix between the two hidden layers, and determining according to the relationship between each neuron between the two hidden layers;
the output of the hidden layer is:
Figure 977697DEST_PATH_IMAGE022
wherein, in the process,
Figure 40462DEST_PATH_IMAGE023
representing an activation function;
the inputs to the output layer are:
Figure 25736DEST_PATH_IMAGE045
wherein
Figure 397811DEST_PATH_IMAGE025
a weight matrix representing the hidden layer to the output layer,
Figure 468535DEST_PATH_IMAGE026
a substrate representing the output layer;
the corresponding outputs of the output layer are:
Figure 521942DEST_PATH_IMAGE027
wherein
Figure 189420DEST_PATH_IMAGE023
in order to activate the function(s),
Figure 670080DEST_PATH_IMAGE028
wherein
Figure 24838DEST_PATH_IMAGE029
representing perceptual data information.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method of parsing perceptual data according to any one of claims 1 to 6.
11. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program, wherein the computer program, when executed by the processor, implements a method of parsing perceptual data as defined in any one of claims 1-6.
CN202211071916.7A 2022-09-02 2022-09-02 Method and system for analyzing perception data, storage medium and electronic equipment Active CN115150439B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211071916.7A CN115150439B (en) 2022-09-02 2022-09-02 Method and system for analyzing perception data, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211071916.7A CN115150439B (en) 2022-09-02 2022-09-02 Method and system for analyzing perception data, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN115150439A CN115150439A (en) 2022-10-04
CN115150439B true CN115150439B (en) 2023-01-24

Family

ID=83416190

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211071916.7A Active CN115150439B (en) 2022-09-02 2022-09-02 Method and system for analyzing perception data, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115150439B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009149926A2 (en) * 2008-06-10 2009-12-17 Intelligement Ag System and method for the computer-based analysis of large quantities of data
CN106021927A (en) * 2016-05-21 2016-10-12 北京大脑智库教育科技有限公司 Dermatoglyph analysis and processing method based on big data
CN109858508A (en) * 2018-10-23 2019-06-07 重庆邮电大学 IP localization method based on Bayes and deep neural network
CN112257846A (en) * 2020-10-13 2021-01-22 北京灵汐科技有限公司 Neuron model, topology, information processing method, and retinal neuron
CN112949472A (en) * 2021-02-28 2021-06-11 杭州翔毅科技有限公司 Cooperative sensing method based on multi-sensor information fusion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11327475B2 (en) * 2016-05-09 2022-05-10 Strong Force Iot Portfolio 2016, Llc Methods and systems for intelligent collection and analysis of vehicle data
US20200342291A1 (en) * 2019-04-23 2020-10-29 Apical Limited Neural network processing
US11520037B2 (en) * 2019-09-30 2022-12-06 Zoox, Inc. Perception system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009149926A2 (en) * 2008-06-10 2009-12-17 Intelligement Ag System and method for the computer-based analysis of large quantities of data
CN106021927A (en) * 2016-05-21 2016-10-12 北京大脑智库教育科技有限公司 Dermatoglyph analysis and processing method based on big data
CN109858508A (en) * 2018-10-23 2019-06-07 重庆邮电大学 IP localization method based on Bayes and deep neural network
CN112257846A (en) * 2020-10-13 2021-01-22 北京灵汐科技有限公司 Neuron model, topology, information processing method, and retinal neuron
CN112949472A (en) * 2021-02-28 2021-06-11 杭州翔毅科技有限公司 Cooperative sensing method based on multi-sensor information fusion

Also Published As

Publication number Publication date
CN115150439A (en) 2022-10-04

Similar Documents

Publication Publication Date Title
CN108985361B (en) Malicious traffic detection implementation method and device based on deep learning
Shi et al. A multimodal hybrid parallel network intrusion detection model
EP3961441B1 (en) Identity verification method and apparatus, computer device and storage medium
KR102036955B1 (en) Method for recognizing subtle facial expression using deep learning based analysis of micro facial dynamics and apparatus therefor
CN107679466B (en) Information output method and device
CN111447190A (en) Encrypted malicious traffic identification method, equipment and device
US9824313B2 (en) Filtering content in an online system based on text and image signals extracted from the content
CN113535825A (en) Cloud computing intelligence-based data information wind control processing method and system
EP3916597B1 (en) Detecting malware with deep generative models
CN117079299B (en) Data processing method, device, electronic equipment and storage medium
JP6971514B1 (en) Information processing equipment, information processing methods and programs
WO2020205771A2 (en) Learned forensic source system for identification of image capture device models and forensic similarity of digital images
EP3591561A1 (en) An anonymized data processing method and computer programs thereof
CN114499861A (en) Quantum key cloud security situation sensing method based on machine learning
CN116759053A (en) Medical system prevention and control method and system based on Internet of things system
CN114492601A (en) Resource classification model training method and device, electronic equipment and storage medium
CN108615006A (en) Method and apparatus for output information
CN112839055B (en) Network application identification method and device for TLS encrypted traffic and electronic equipment
US11956353B2 (en) Machine learning device, machine learning system, and machine learning method
KR20170057118A (en) Method and apparatus for recognizing object, and method and apparatus for training recognition model
CN115150439B (en) Method and system for analyzing perception data, storage medium and electronic equipment
CN111783677B (en) Face recognition method, device, server and computer readable medium
CN115713669B (en) Image classification method and device based on inter-class relationship, storage medium and terminal
CN116580208A (en) Image processing method, image model training method, device, medium and equipment
Szandała et al. PRISM: principal image sections mapping

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant