CN115935278A - Environment recognition method, electronic device, and computer-readable storage medium - Google Patents


Info

Publication number
CN115935278A
Authority
CN
China
Prior art keywords
environment
classification
gain
environmental
feature subset
Prior art date
Legal status
Granted
Application number
CN202310213046.0A
Other languages
Chinese (zh)
Other versions
CN115935278B (en)
Inventor
焦响
文鼎柱
朱光旭
石远明
崔曙光
Current Assignee
Shenzhen Research Institute of Big Data SRIBD
Original Assignee
Shenzhen Research Institute of Big Data SRIBD
Priority date
Filing date
Publication date
Application filed by Shenzhen Research Institute of Big Data SRIBD filed Critical Shenzhen Research Institute of Big Data SRIBD
Priority to CN202310213046.0A
Publication of CN115935278A
Application granted
Publication of CN115935278B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Radar Systems Or Details Thereof (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, and in particular to an environment recognition method, an electronic device, and a computer-readable storage medium. The environment recognition method involves two execution bodies: a sensing terminal and an edge server. The sensing terminal first collects environment information, performs feature extraction on it to obtain an environment feature subset, and sends that subset to the edge server. After obtaining the environment feature subset from the sensing terminal, the edge server analyzes it to obtain a first number of classification probability functions, constructs a classification discrimination gain based on those functions, and finally performs recognition processing on the environment feature subset based on the classification discrimination gain to obtain recognition result data. Because recognition of the environment feature subset is driven by the classification discrimination gain, the recognition accuracy of environment recognition can be improved.

Description

Environment recognition method, electronic device, and computer-readable storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, and in particular to an environment recognition method, an electronic device, and a computer-readable storage medium.
Background
Integrated Sensing, Communication, and Computing (ISCC) refers to a technology in which the three functions of communication, sensing, and computing are integrated, so that an environment recognition system carries all three at once. Communication is the function of transmitting information between two or more terminals; sensing is the function of detecting environment information of a physical environment, such as speed measurement and target localization; and computing is the function of analyzing the environment information to obtain a usable recognition result.
In the related art of sensing-communication-computing integration, communication, sensing, and computing are often designed separately to achieve their respective goals: the communication link aims to maximize throughput, the sensing link aims to obtain high-quality positioning data, and the computing link aims to use resources more efficiently. As a result, the three links are only loosely coupled, and the industry also lacks a good evaluation metric for measuring recognition accuracy across the links, so the accuracy of environment recognition in the related art is low. How to improve the recognition accuracy of environment recognition has therefore become a pressing challenge for the industry.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides an environment identification method, an electronic device and a computer readable storage medium, which can improve the identification accuracy of environment identification.
The environment identification method according to the embodiment of the first aspect of the invention is applied to an edge server, and comprises the following steps:
acquiring an environment feature subset from a sensing terminal, wherein the environment feature subset is obtained by the sensing terminal performing feature extraction on environment information, and the environment information is collected by the sensing terminal in a target environment;
analyzing the environment feature subset to obtain a first number of classification probability functions, one for each of a first number of environment categories;
constructing a classification discrimination gain based on a first number of the classification probability functions;
and identifying the environment feature subset based on the classification discrimination gain to obtain identification result data.
According to some embodiments of the invention, constructing the classification discrimination gain based on the first number of classification probability functions comprises:
constructing a second number of environment class pairs based on the first number of environment classes, each environment class pair comprising two class pair elements selected from the environment classes;
matching the two class pair elements of each environment class pair to their one-to-one corresponding classification probability functions among the first number of classification probability functions, and constructing a classification function pair from them;
constructing the classification discrimination gain based on the second number of classification function pairs.
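The pairing described above can be sketched in code. This is a hypothetical illustration, not the patent's implementation; the function name and the representation of a "classification probability function" as a Python callable are assumptions. With a first number of n classes, the second number of pairs is C(n, 2).

```python
from itertools import combinations

def build_classification_function_pairs(class_probability_fns):
    """For every unordered pair of environment classes, pair up the two
    corresponding classification probability functions (one function per class)."""
    indices = range(len(class_probability_fns))
    return [
        (class_probability_fns[i], class_probability_fns[j])
        for i, j in combinations(indices, 2)
    ]

# With a first number of 4 environment classes, the second number of
# environment class pairs is C(4, 2) = 6.
fns = [lambda x, k=k: k for k in range(4)]  # placeholder probability functions
pairs = build_classification_function_pairs(fns)
```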
According to some embodiments of the invention, constructing the classification discrimination gain based on the second number of classification function pairs comprises:
integrating each classification function pair to construct a class pair discrimination gain corresponding to each environment class pair;
averaging the second number of class pair discrimination gains to obtain the classification discrimination gain.
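The patent does not specify the integrand, so the numerical sketch below uses the negative log Bhattacharyya coefficient as a stand-in pairwise separability measure (an assumption, not the patent's formula); it is 0 for identical class distributions and grows as they separate. The averaging step matches the embodiment above.

```python
import numpy as np

def class_pair_discrimination_gain(p_i, p_j, grid):
    """Integrate over a 1-D feature grid to score how separable two classes are.
    Stand-in integrand: negative log Bhattacharyya coefficient (assumption)."""
    dx = grid[1] - grid[0]                 # uniform grid spacing
    pi = np.maximum(p_i(grid), 1e-300)
    pj = np.maximum(p_j(grid), 1e-300)
    bc = np.sum(np.sqrt(pi * pj)) * dx     # Bhattacharyya coefficient
    return -np.log(bc)

def classification_discrimination_gain(fn_pairs, grid):
    """Average the per-pair gains over the second number of pairs."""
    return float(np.mean([class_pair_discrimination_gain(p, q, grid)
                          for p, q in fn_pairs]))
```

Well-separated class distributions yield a larger average gain than overlapping ones, which is the property the recognition step exploits.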
According to some embodiments of the present invention, identifying the environment feature subset based on the classification discrimination gain to obtain identification result data comprises:
enhancing the classification discrimination gain to obtain an optimized discrimination gain;
and identifying the environment feature subset based on the optimized discrimination gain to obtain identification result data.
According to some embodiments of the present invention, enhancing the classification discrimination gain to obtain the optimized discrimination gain comprises:
acquiring a sensing time, a communication time, and a computation time, and configuring a delay constraint condition for the classification discrimination gain based on them, wherein the sensing time is the collection time of the environment information, the communication time is the transmission time of the environment feature subset, and the computation time is the feature extraction time of the environment feature subset;
acquiring the channel capacity between the edge server and the sensing terminal, and configuring a transmission constraint condition for the classification discrimination gain based on the channel capacity;
acquiring an energy consumption threshold of the sensing terminal, and configuring an energy constraint condition for the classification discrimination gain based on the energy consumption threshold;
and enhancing the classification discrimination gain based on the delay constraint condition, the transmission constraint condition, and the energy constraint condition to obtain the optimized discrimination gain.
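The three constraint conditions above can be sketched as a feasibility check over candidate operating points; maximizing the gain subject to these checks is the optimization the embodiment describes. Field names, units, and the simple rate model (payload must fit within capacity times communication time) are illustrative assumptions, not the patent's formulation.

```python
from dataclasses import dataclass

@dataclass
class GainConstraints:
    """The three constraint conditions configured above (names are assumed)."""
    latency_budget_s: float      # delay constraint: total allowed time
    channel_capacity_bps: float  # transmission constraint: terminal-to-server link
    energy_budget_j: float       # energy constraint: terminal energy threshold

def is_feasible(sensing_s, comm_s, compute_s, payload_bits, energy_j,
                c: GainConstraints) -> bool:
    """Check whether a candidate operating point satisfies all three constraints."""
    delay_ok = sensing_s + comm_s + compute_s <= c.latency_budget_s
    # The feature subset must fit through the channel within the communication time.
    transmit_ok = payload_bits <= c.channel_capacity_bps * comm_s
    energy_ok = energy_j <= c.energy_budget_j
    return delay_ok and transmit_ok and energy_ok
```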
The environment identification method according to the embodiment of the second aspect of the invention is applied to a sensing terminal, and comprises the following steps:
collecting environmental information, wherein the environmental information comprises detectable physical quantities of a target environment;
extracting the characteristics of the environmental information to obtain an environmental characteristic subset;
and sending the environment feature subset to an edge server, so that the edge server performs analysis processing based on the environment feature subset to obtain classification probability functions, constructs a classification discrimination gain based on the classification probability functions, and performs identification processing on the environment feature subset based on the classification discrimination gain to obtain identification result data.
According to some embodiments of the invention, the environment information comprises object motion data in the target environment, and performing feature extraction on the environment information to obtain the environment feature subset comprises:
sampling the object motion data to generate a motion data vector;
extracting principal feature elements from the motion data vector based on principal component analysis to obtain a motion feature vector;
and carrying out normalization processing on the motion characteristic vector to obtain the environment characteristic subset.
According to some embodiments of the invention, extracting principal feature elements from the motion data vector based on principal component analysis to obtain the motion feature vector comprises:
performing singular value decomposition on the motion data vector to obtain an intermediate data vector;
and extracting principal feature elements from the intermediate data vector based on principal component analysis to obtain the motion feature vector.
In a third aspect, an embodiment of the present invention provides an electronic device, including: a memory storing a computer program, and a processor implementing the environment recognition method according to any one of the embodiments of the first aspect of the present invention when the processor executes the computer program.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a program which, when executed by a processor, implements the environment identification method according to any one of the embodiments of the first aspect of the present invention.
The environment identification method, the electronic device and the computer readable storage medium according to the embodiment of the invention have at least the following advantages:
The environment identification method involves two execution bodies: a sensing terminal and an edge server. The sensing terminal first collects environment information, which comprises detectable physical quantities of a target environment, then performs feature extraction on the environment information to obtain an environment feature subset and sends the subset to the edge server. After obtaining the environment feature subset from the sensing terminal, the edge server analyzes it to obtain a first number of classification probability functions corresponding to a first number of environment categories, constructs a classification discrimination gain based on those functions, and finally performs identification processing on the environment feature subset based on the classification discrimination gain to obtain identification result data. Identifying the environment feature subset on the basis of the classification discrimination gain improves the identification accuracy of environment identification.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is an alternative flow chart of a method for identifying an environment according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating an alternative method for identifying an environment according to an embodiment of the present invention;
FIG. 3 is another alternative flow chart of the method for identifying an environment according to the embodiment of the present invention;
FIG. 4 is a flow chart illustrating an alternative method for identifying an environment according to an embodiment of the present invention;
FIG. 5 is another alternative flow chart of the method for identifying an environment according to the embodiment of the present invention;
FIG. 6 is a flow chart illustrating an alternative method for identifying an environment according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating another alternative method for identifying an environment according to an embodiment of the present invention;
fig. 8 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, "several" means one or more and "a plurality of" means two or more; terms such as "greater than" and "less than" are understood as excluding the stated number, while terms such as "above" and "within" are understood as including it. Where "first" and "second" are used to distinguish technical features, they are not to be understood as indicating or implying relative importance, the number of the technical features indicated, or their precedence.
In the description of the present invention, it should be understood that any orientation or positional relationship referred to, such as upper, lower, left, right, front, or rear, is based on the orientation or positional relationship shown in the drawings, is used only for convenience and simplicity of description, and does not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation. Such terms should therefore not be construed as limiting the present invention.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In the description of the present invention, it should be noted that, unless otherwise explicitly defined, terms such as arrangement, installation, and connection should be understood broadly; those skilled in the art can reasonably determine their specific meanings in combination with the specific content of the technical solutions. In addition, the descriptions of specific steps below do not limit the order or logic of their execution; the execution order and logic between steps should be understood and inferred with reference to the description of the embodiments.
Integrated Sensing, Communication and Computing (ISCC) refers to a technology that integrates the three functions of communication, sensing, and computing, so that an environment recognition system carries all three at once. The communication function transmits information between two or more terminals; the sensing function detects environment information of a physical environment, such as speed measurement and target positioning; and the computing function analyzes the environment information to obtain a usable recognition result. As an application example of ISCC: base-station signals are used to sense the surrounding environment while a communication link is designed, and the environment information is analyzed and computed so that a vehicle traveling in the environment can avoid obstacles while communication performance is improved. In such applications, the wireless channel is used to transmit information while the characteristics of the channel are actively recognized and analyzed, so that the physical characteristics of the surrounding environment are sensed to obtain environment information; analysis and computation are then performed on that information, and the three functions of communication, sensing, and computing mutually reinforce one another.
In ISCC applications, environment information acquisition and communication compete for the same spectrum resources, and the available communication resources in turn determine the required quantization level, so that quantized features can be reliably transmitted to the edge server under the delay constraint and response data is finally obtained at the edge server. The three processes of communication, sensing, and computing are therefore highly coupled and must be considered jointly. Furthermore, ISCC should be designed under a new task-oriented principle, that is, a principle focused on the successful completion of the subsequent inference task. In edge artificial intelligence, the performance metrics of interest are no longer throughput but inference accuracy and latency. An ISCC scheme for real-time inference tasks should therefore maximize inference accuracy by jointly designing communication, sensing, and computing under low-delay and on-device resource constraints. Integrating artificial intelligence at the edge of the wireless network for real-time distributed intelligent training is a key technology for the comprehensive intelligent upgrade of communication networks, and ISCC oriented toward edge artificial intelligence tasks has unique advantages and has attracted very wide attention.
In the related art, communication, sensing, and computing are often designed separately to achieve their respective goals: the communication link aims to maximize throughput, the sensing link aims to obtain high-quality positioning data, and the computing link aims to use resources more efficiently. As a result, the three links are only loosely coupled, and the industry also lacks a good evaluation metric for measuring recognition accuracy across the links, so the accuracy of environment recognition in the related art is low. How to improve the recognition accuracy of environment recognition has therefore become a pressing challenge for the industry.
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides an environment identification method, an electronic device and a computer readable storage medium, which can improve the identification accuracy of environment identification.
The following further description is made based on the accompanying drawings.
The environment recognition method of the invention involves two execution bodies: the sensing terminal and the edge server. An Integrated Sensing and Communication device (ISAC device) is a terminal device that integrates a communication function and a sensing function. An edge server is an edge device with data-processing capability that provides computing services; an edge device is a device that provides an entry point into an enterprise or service-provider core network, such as a router, a routing switch, an integrated access device (IAD), a multiplexer, or various metropolitan area network (MAN) and wide area network (WAN) access devices. It should be understood that edge devices can be deployed in many ways depending on the application scenario; for example, an edge device may be mounted on a motor vehicle in a road-condition detection scenario, or installed in a detection kiosk in the target environment in a temperature and gas detection scenario.
It should be noted that the sensing function of the sensing terminal lies in collecting environment information through various physical-quantity sensors: for example, object motion data in the target environment is collected by radar, temperature change data by a temperature sensor, and gas content change data by a gas sensor. The communication function of the sensing terminal lies in data transmission with the edge server. The sensing terminal has relatively limited computing power, and deploying a large artificial intelligence model on it to process the collected environment data would incur heavy storage and computation costs. The artificial intelligence model that processes the environment data is therefore deployed mainly on the edge server: the sensing terminal is responsible for collecting environment information, the edge server processes and identifies the environment information based on the artificial intelligence model to obtain identification result data, and data communication between the sensing terminal and the edge server completes the identification process of the environment identification method.
It should be understood that the sensing terminal and the edge server each perform different method steps, as further explained below.
Referring to fig. 1, the environment identification method according to the embodiment of the present invention may include, but is not limited to, the following steps S101 to S107, where steps S101 to S103 are performed by the sensing terminal and steps S104 to S107 are performed by the edge server.
Step S101, collecting environment information, wherein the environment information comprises detectable physical quantities of a target environment;
Step S102, performing feature extraction on the environment information to obtain an environment feature subset;
Step S103, the sensing terminal sends the environment feature subset to an edge server;
Step S104, the edge server acquires the environment feature subset from the sensing terminal;
Step S105, analyzing the environment feature subset to obtain a first number of classification probability functions corresponding to the environment feature subset in a first number of environment categories;
Step S106, constructing a classification discrimination gain based on the first number of classification probability functions;
Step S107, performing identification processing on the environment feature subset based on the classification discrimination gain to obtain identification result data.
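The division of labor in steps S101 to S107 can be sketched end to end. This is a toy illustration under stated assumptions: top-magnitude selection stands in for the terminal's feature extraction, and a softmax over distances to class prototypes stands in for the edge server's classification probability functions; neither is the patent's actual model.

```python
import numpy as np

def terminal_side(raw_samples, k=4):
    """Steps S101-S103 (sensing terminal): collect raw physical-quantity
    samples and extract a normalized k-dimensional feature subset.
    Top-|value| selection is a simple stand-in for feature extraction."""
    raw = np.asarray(raw_samples, dtype=float)
    feats = raw[np.argsort(np.abs(raw))[-k:]]   # keep the k strongest components
    norm = np.linalg.norm(feats)
    return feats / norm if norm > 0 else feats

def edge_server_side(features, class_prototypes):
    """Steps S104-S107 (edge server): score the feature subset against each
    environment category. A softmax over negative distances stands in for the
    classification probability functions; the highest-probability class is
    returned as the identification result."""
    d = np.array([np.linalg.norm(features - p) for p in class_prototypes])
    probs = np.exp(-d) / np.sum(np.exp(-d))
    return int(np.argmax(probs)), probs
```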
Through the embodiments shown in steps S101 to S107, the environment identification method of the present invention involves two execution bodies: a sensing terminal and an edge server. The sensing terminal first collects environment information, which comprises detectable physical quantities of a target environment, performs feature extraction on the environment information to obtain an environment feature subset, and sends the subset to the edge server. After obtaining the environment feature subset from the sensing terminal, the edge server analyzes it to obtain a first number of classification probability functions corresponding to a first number of environment categories, constructs a classification discrimination gain based on those functions, and finally performs identification processing on the environment feature subset based on the classification discrimination gain to obtain identification result data. Identifying the environment feature subset on the basis of the classification discrimination gain improves the identification accuracy of environment identification.
In step S101 of some embodiments of the present invention, environment information is collected, where the environment information comprises detectable physical quantities of the target environment. The purpose of collecting the environment information is to detect various physical quantities of the target environment so as to identify the actual condition of the current target environment. The sensing function of the sensing terminal lies in collecting environment information through various physical-quantity sensors: for example, object motion data in the target environment is collected by radar, temperature change data by a temperature sensor, and gas content change data by a gas sensor, so the environment information can be collected in many ways. In some specific embodiments, the sensing terminal is configured with a radar sensing device; when collecting environment information, it transmits a radar detection signal composed of a plurality of uplink chirps into the target environment, receives the corresponding radar echo signal, and processes the echo signal to obtain environment information including object motion data in the target environment.
In step S102 of some embodiments of the present invention, feature extraction is performed on the environment information to obtain an environment feature subset. The sensing terminal is responsible for collecting the environment information, the edge server processes and identifies the environment information based on the artificial intelligence model to obtain identification result data, and data communication between the two completes the identification process. In some exemplary embodiments, the artificial intelligence model is therefore split into two sub-models: one sub-model is deployed on the sensing terminal to perform feature extraction on the environment information and obtain the environment feature subset, which is sent to the edge server in the subsequent steps, while the other sub-model is deployed on the edge server to perform the remaining recognition tasks. This prevents the collected environment information from being leaked, ensuring data security, while the heavier part of the computation is offloaded to the edge server, reducing the hardware requirements of the sensing device.
Referring to fig. 2, according to some embodiments of the present invention, the environment information includes object motion data in the target environment, and step S102 of performing feature extraction on the environment information to obtain the environment feature subset may include, but is not limited to, the following steps S201 to S203.
Step S201, sampling object motion data to generate motion data vectors;
step S202, extracting main characteristic elements from the motion data vectors based on principal component analysis to obtain motion characteristic vectors;
step S203, performing normalization processing on the motion feature vector to obtain an environment feature subset.
In steps S201 to S203 of some embodiments of the present invention, the object motion data is sampled to generate a motion data vector, principal feature elements are extracted from the motion data vector based on principal component analysis to obtain a motion feature vector, and the motion feature vector is then normalized to obtain the environment feature subset. It should be noted that various moving objects may exist in the target environment, so the sensing terminal can obtain physical-quantity data reflecting object motion, that is, object motion data, from the target environment through motion-related sensors. It should also be noted that Principal Component Analysis (PCA) aims to transform multiple indexes into a few comprehensive indexes by using the idea of dimension reduction. In statistics, PCA is a technique for simplifying a data set: it is a linear transformation into a new coordinate system such that the largest variance of any data projection lies on the first coordinate (called the first principal component), the second largest variance on the second coordinate (the second principal component), and so on. PCA is often used to reduce the dimension of a data set while preserving the features that contribute most to its variance, which is achieved by retaining the low-order principal components and ignoring the high-order ones; the low-order components thus retain the important aspects of the data. It should be understood that the feature extraction of the environment information in step S102 to obtain the environment feature subset can be implemented in various ways and is not limited to the above-mentioned specific embodiments.
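The dimension-reduction step described above can be sketched in a few lines; the helper name `pca_extract` and the synthetic data are illustrative assumptions, not part of the embodiment:

```python
import numpy as np

def pca_extract(data, num_components):
    """Project rows of `data` onto the top principal components,
    keeping the directions of largest variance (low-order components)."""
    centered = data - data.mean(axis=0)           # remove the mean
    cov = np.cov(centered, rowvar=False)          # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]             # sort descending by variance
    basis = eigvecs[:, order[:num_components]]    # principal feature space
    return centered @ basis                       # reduced feature vectors

rng = np.random.default_rng(0)
samples = rng.normal(size=(200, 6))
samples[:, 0] *= 10.0                             # one dominant-variance axis
features = pca_extract(samples, 2)
print(features.shape)                             # (200, 2)
```

The first extracted component captures the dominant-variance axis, illustrating why retaining only the low-order components preserves the important aspects of the data.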
Referring to fig. 3, according to some embodiments of the present invention, step S202, which extracts principal feature elements from the motion data vector based on principal component analysis to obtain the motion feature vector, may include, but is not limited to, the following steps S301 to S302.
Step S301, singular value decomposition is carried out on the motion data vector to obtain an intermediate data vector;
step S302, extracting main characteristic elements from the intermediate data vector based on principal component analysis to obtain a motion characteristic vector.
In steps S301 to S302 of some embodiments of the present invention, singular value decomposition is performed on the motion data vector to obtain an intermediate data vector, and principal feature elements are then extracted from the intermediate data vector based on principal component analysis to obtain the motion feature vector. It should be noted that Singular Value Decomposition (SVD) is an important matrix decomposition in linear algebra. It generalizes the eigendecomposition to arbitrary matrices and is in some respects similar to the eigenvector-based diagonalization of symmetric or Hermitian matrices; however, despite their correlation, the two decompositions are clearly different: spectral analysis is based on the eigendecomposition of symmetric matrices, while singular value decomposition generalizes spectral analysis theory to arbitrary matrices. It should also be noted that, in some preferred embodiments of the present invention, in order to suppress clutter and extract useful information, a linear filter based on singular value decomposition is applied to the motion data vector to obtain the intermediate data vector, from which the principal feature elements are then extracted by principal component analysis to obtain the motion feature vector.
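The SVD-based linear filtering described above (retain a band of singular components, discard the rest to suppress clutter) can be sketched as follows; the function name, index range, and synthetic data are illustrative assumptions:

```python
import numpy as np

def svd_filter(matrix, lo, hi):
    """Keep singular components lo..hi (1-indexed, inclusive); lo and hi
    play the role of the empirical parameters mentioned in the text."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    filtered = np.zeros_like(matrix, dtype=float)
    for i in range(lo - 1, hi):
        filtered += s[i] * np.outer(u[:, i], vt[i, :])  # rank-1 component
    return filtered

rng = np.random.default_rng(1)
signal = np.outer(np.sin(np.linspace(0, 6, 64)), np.ones(32))  # rank-1 pattern
noisy = signal + 0.05 * rng.normal(size=signal.shape)          # add "clutter"
clean = svd_filter(noisy, 1, 1)       # retain only the dominant component
err_before = np.linalg.norm(noisy - signal)
err_after = np.linalg.norm(clean - signal)
print(err_after < err_before)         # filtering reduced the residual
```

Keeping only the dominant singular components discards most of the noise energy, which is the point of applying the filter before PCA.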
In some more specific embodiments, the generic sensing terminal numbered $k$ is configured with a radar sensing device. When collecting the environment information, one sensing snapshot for sending radar detection signals includes $M$ chirps, and the duration of each chirp is $T$. The radar detection signal $s_k(t)$, consisting of a plurality of up-chirps transmitted by the sensing terminal to the target environment, is expressed as:

$$s_k(t)=\sum_{m=0}^{M-1}\operatorname{rect}\!\left(\frac{t-mT-T/2}{T}\right)e^{\,j2\pi\left(f_k(t-mT)+\frac{B}{2T}(t-mT)^2\right)}$$

where $\operatorname{rect}(\cdot)$ is a centered rectangular pulse function of width 1, $f_k$ is the sensing carrier frequency of the sensing terminal numbered $k$, $T$ is the duration of each chirp, and $B$ is the bandwidth of the sensing signal.
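An up-chirp train of this kind ($M$ chirps of duration $T$ sweeping a bandwidth $B$) can be sampled as follows; the sampling rate, baseband carrier, and all parameter values are illustrative assumptions:

```python
import numpy as np

def upchirp_train(num_chirps, chirp_len, carrier, bandwidth, fs):
    """Sample a train of identical up-chirps; each chirp sweeps
    `bandwidth` Hz over `chirp_len` seconds at chirp rate B/T."""
    t = np.arange(int(chirp_len * fs)) / fs            # time axis of one chirp
    slope = bandwidth / chirp_len                      # chirp rate B/T
    one = np.exp(2j * np.pi * (carrier * t + 0.5 * slope * t**2))
    return np.tile(one, num_chirps)                    # repeat for M chirps

sig = upchirp_train(num_chirps=4, chirp_len=1e-4, carrier=0.0,
                    bandwidth=1e6, fs=4e6)
print(sig.shape, np.allclose(np.abs(sig), 1.0))        # (1600,) True
```

The signal has constant unit envelope, as expected of a phase-modulated chirp waveform.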
Further, the radar echo signal $r_k(t)$ corresponding to the radar detection signal is expressed as:

$$r_k(t)=\beta_0\,s_k(t-\tau_0)+\sum_{q=1}^{Q}\beta_q\,s_k(t-\tau_q)+n_k(t)$$

where $\beta_0 s_k(t-\tau_0)$ is the expected echo signal directly reflected from the radar detection signal, $\beta_0$ is the reflection coefficient including the round-trip path loss, $\tau_0$ is the round-trip delay, $\beta_q s_k(t-\tau_q)$ is the echo signal indirectly reflected by the $q$-th indirect reflection path, $\beta_q$ and $\tau_q$ are respectively the reflection coefficient and signal delay of the $q$-th path, $Q$ is the total number of indirect reflection paths, and $n_k(t)$ is Gaussian noise at the sensing receiver. The values of $\beta_q$ and $\tau_q$ are all determined prior to collecting the environment information.
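The echo model above, a direct reflection plus $Q$ indirect reflection paths plus receiver noise, can be simulated in the discrete-time domain; the function name, sample-domain delays, and all numeric values are illustrative assumptions:

```python
import numpy as np

def radar_echo(probe, delays, coeffs, noise_std, rng):
    """Superpose delayed, scaled copies of the probe signal plus Gaussian
    receiver noise, mirroring the direct + indirect reflection model."""
    echo = np.zeros_like(probe)
    for d, beta in zip(delays, coeffs):
        echo[d:] += beta * probe[: len(probe) - d]     # one reflection path
    echo += noise_std * rng.normal(size=len(probe))    # receiver noise
    return echo

rng = np.random.default_rng(2)
probe = np.sin(np.linspace(0, 20 * np.pi, 500))
# one direct path (delay 5 samples) and two indirect clutter paths
echo = radar_echo(probe, delays=[5, 40, 90], coeffs=[0.8, 0.2, 0.1],
                  noise_std=0.01, rng=rng)
print(echo.shape)    # (500,)
```

The indirect paths are the clutter contributions that the SVD filter in the later steps is meant to suppress.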
Still further, by processing the radar echo signal $r_k(t)$, object motion data in the target environment is obtained on the generic sensing terminal, and feature extraction is then performed on the object motion data to obtain the environment feature subset. Specifically, the object motion data is first sampled to generate a motion data vector $\mathbf{x}_k$. Singular value decomposition is performed on $\mathbf{x}_k$ to obtain the filtered motion data vector, i.e., the intermediate data vector

$$\tilde{\mathbf{x}}_k=\sum_{i=p}^{q}\sigma_i\,\mathbf{u}_i\mathbf{v}_i^{H}$$

where $\sigma_i$, $\mathbf{u}_i$ and $\mathbf{v}_i$ respectively denote the $i$-th singular value, the $i$-th left singular vector and the $i$-th right singular vector of $\mathbf{x}_k$, and $p$ and $q$ are empirical parameters. Further, through the principal feature space $\mathbf{U}$, the intermediate data vector $\tilde{\mathbf{x}}_k$ is converted into $\mathbf{U}^{H}\tilde{\mathbf{x}}_k$; then, in the principal feature space, principal feature elements are extracted from $\mathbf{U}^{H}\tilde{\mathbf{x}}_k$ by principal component analysis (PCA) to obtain the motion feature vectors, so that different feature elements are mutually uncorrelated.
In some optional embodiments, the main feature space is obtained by the edge server in the process of training the model in advance, and after the main feature space is obtained at the side of the edge server, the main feature space is further transmitted to each of the sensory terminals in a broadcast mode.
If the number of the extracted motion feature vectors is recorded as $N$, then, since all processing steps are linear, the $n$-th motion feature vector corresponding to the radar echo signal $r_k(t)$ is expressed as:

$$\hat{x}_{k,n}=\sqrt{P_k}\,x_{k,n}+\sqrt{P_k}\,c_{k,n}+e_{k,n}$$

where $x_{k,n}$ is the ideal true feature element, $c_{k,n}$ is the additional feature element contributed by the clutter signals of the $Q$ indirect reflection paths, and $e_{k,n}$ is the noise in the feature element.
Still further, each motion feature vector $\hat{x}_{k,n}$ is normalized by the transmit radar sensing power $P_k$ to obtain each environment feature element. Specifically, the $n$-th environment feature element can be expressed as:

$$\tilde{x}_{k,n}=\frac{\hat{x}_{k,n}}{\sqrt{P_k}}=x_{k,n}+c_{k,n}+\frac{e_{k,n}}{\sqrt{P_k}}$$

where $x_{k,n}$ is the true feature element, $c_{k,n}$ is the normalized clutter, and $e_{k,n}/\sqrt{P_k}$ is the sensing noise.
After obtaining the $n$-th environment feature element $\tilde{x}_{k,n}$, the environment feature elements can be integrated into the environment feature subset $\tilde{\mathbf{x}}_k=[\tilde{x}_{k,1},\ldots,\tilde{x}_{k,N}]^{T}$, where $N$ is the total number of generated environment feature elements. In some embodiments, because the deployment of the generic sensing terminals is sparse and the corresponding sensing areas do not overlap, the environment feature subsets generated by different generic sensing terminals are mutually independent.
It should be emphasized that step S102, which performs feature extraction on the environment information to obtain the environment feature subset, may include, but is not limited to, the above-mentioned specific embodiments.
In step S103 of some embodiments of the present invention, the sensing terminal sends the environment feature subset to the edge server.
In some more specific embodiments of the present invention, in the process of sending the environment feature subset to the edge server, the sensing terminal also needs to quantize the environment feature subset to obtain a quantized feature subset, so as to facilitate communication transmission. If the environment feature subset is represented as $\tilde{\mathbf{x}}_k$, each environment feature element in the subset can be quantized using the same linear quantizer to obtain a quantized feature element. Specifically, for the $n$-th environment feature element $\tilde{x}_{k,n}$ of the subset, in the high quantization-bit regime the quantized feature element is:

$$z_{k,n}=\alpha_k\,\tilde{x}_{k,n}+d_{k,n}$$

where $\tilde{x}_{k,n}$ is the original feature element corresponding to the $n$-th environment feature element, $\alpha_k$ is the quantization gain, and $d_{k,n}$ is the approximately Gaussian quantization distortion, expressed as $d_{k,n}\sim\mathcal{N}(0,\sigma_{d}^{2})$, where $\sigma_{d}^{2}$ is the variance.
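The linear-quantizer model above treats quantization as a gain plus approximately Gaussian distortion. A uniform scalar quantizer with unit gain behaves this way at high resolution; the sketch below (all names and values illustrative) checks the empirical error variance against the classic $\Delta^2/12$ high-rate approximation:

```python
import numpy as np

def quantize(x, step):
    """Uniform scalar quantizer with step size `step`; at high resolution
    its error is well modelled as additive noise of variance step**2 / 12."""
    return step * np.round(x / step)

rng = np.random.default_rng(3)
features = rng.normal(size=100_000)
step = 0.05
recovered = quantize(features, step)
err = recovered - features
# empirical distortion matches the step**2 / 12 model closely
print(np.isclose(err.var(), step**2 / 12, rtol=0.1))
```

This is why the distortion term can be approximated as Gaussian noise with a fixed variance when the quantization bit budget is high.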
In step S104 of some embodiments of the present invention, the edge server obtains the environmental feature subset from the sensory terminal.
In some more specific embodiments of the present invention, in the process of sending the environment feature subset from the generic sensing terminal to the edge server, the subset is quantized into a quantized feature subset, and the edge server therefore needs to restore the quantized feature elements after receiving the quantized feature subset. If the $n$-th environment feature element $\tilde{x}_{k,n}$ of the environment feature subset is quantized into the quantized feature element $z_{k,n}=\alpha_k\tilde{x}_{k,n}+d_{k,n}$, where $\alpha_k$ is the quantization gain and $d_{k,n}$ the quantization distortion, then, correspondingly, on the edge server side the quantized feature is restored to

$$\hat{x}_{k,n}=\frac{z_{k,n}}{\alpha_k}=\tilde{x}_{k,n}+\frac{d_{k,n}}{\alpha_k}.$$

It is clear that the higher the quantization gain $\alpha_k$, the lower the quantization distortion of the features recovered by the edge server. It is noted that, under the additive Gaussian distortion approximation, the environment feature subset $\hat{\mathbf{x}}_k$ restored on the edge server side and the generated environment feature subset $\tilde{\mathbf{x}}_k$ have mutual information $I(\tilde{\mathbf{x}}_k;\hat{\mathbf{x}}_k)$, which is also the communication overhead generated by the sensing terminal numbered $k$ transmitting the environment feature subset to the server.
In steps S105 to S106 of some embodiments of the present invention, analysis processing is performed based on the environment feature subset to obtain a first number of classification probability functions corresponding to the environment feature subset over the first number of environment categories, and a classification discrimination gain is then constructed based on the first number of classification probability functions. It should be noted that, in the embodiment of the present invention, the generic sensing terminal is responsible for acquiring the environment information and generating the environment feature subset, the edge server processes and identifies the environment feature subset based on the artificial intelligence model to obtain the identification result data, and the data communication between the generic sensing terminal and the edge server completes the identification process of the environment identification method. Data loss can occur in this data communication process, so that the differences between the identification result data of different classes become small, making high identification accuracy difficult to achieve.
Therefore, in order to better cope with the data loss generated in the data communication process between the generic sensing terminal and the edge server, some embodiments of the present invention use the classification discrimination gain as an index for measuring inter-class differences and handle the comparison between different classes in the identification process on this basis. In the subsequent steps, the environment feature subset is identified based on the classification discrimination gain to obtain identification result data, which effectively controls the influence of data loss in the communication process and thereby improves the environment identification accuracy. Clearly, the communication quality between the generic sensing terminal and the edge server is closely related to the environment identification accuracy: the higher the communication quality, the smaller the data loss, the clearer the inter-class differences when the edge server identifies based on the environment feature subset, and the higher the identification accuracy. The classification discrimination gain can therefore also be regarded as an identification accuracy measure of the environment identification method of the present invention. It is noted that the classification probability functions reflect the distribution characteristics of the environment feature subset in specific environment categories, and each classification probability function corresponds to one environment category.
Referring to fig. 4, according to some embodiments of the present invention, step S106 constructs a classification discriminant gain based on a first number of classification probability functions, which may include, but is not limited to, steps S401 through S403 described below.
Step S401, constructing a second number of environment class pairs based on the first number of environment classes, wherein each environment class pair comprises two class pair elements, and the class pair elements are selected from the environment classes;
step S402, based on two class pair elements of each environment class pair, matching the first number of classification probability functions to obtain one-to-one corresponding element classification functions and constructing classification function pairs;
step S403, based on the second number of classification function pairs, a classification discrimination gain is constructed.
In step S401 of some embodiments of the present invention, a second number of environment class pairs are first constructed based on the first number of environment classes; each environment class pair includes two class pair elements, and the class pair elements are selected from the environment classes. It is emphasized that, in order to better cope with the data loss generated in the data communication process between the sensing terminal and the edge server, some embodiments of the present invention use the classification discrimination gain as an index for measuring inter-class differences and handle the comparison between different classes in the identification process on this basis. In some exemplary embodiments, to better measure the inter-class differences of the environment categories, a second number of environment class pairs are first constructed from the first number of environment classes, where the first number is the total number of environment classes that the edge server can identify; every two environment classes are paired to form one environment class pair, and the total number of environment class pairs that can be formed from the first number of environment classes is the second number. In a specific embodiment, if there are 5 environment classes A, B, C, D, E that can be identified by the edge server, then pairing every two environment classes yields 10 environment class pairs AB, AC, AD, AE, BC, BD, BE, CD, CE, DE; in this case the first number is 5 and the second number is 10.
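The pairing in the worked example (5 classes yielding 10 pairs) can be reproduced directly; `class_pairs` is an illustrative helper name:

```python
from itertools import combinations

def class_pairs(classes):
    """Every unordered pair of environment classes; for L classes this
    yields L * (L - 1) / 2 pairs (the 'second number')."""
    return list(combinations(classes, 2))

pairs = class_pairs(["A", "B", "C", "D", "E"])
print(len(pairs))           # 10, matching the worked example
print(pairs[:3])            # [('A', 'B'), ('A', 'C'), ('A', 'D')]
```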
In steps S402 to S403 of some embodiments of the present invention, based on two class pair elements of each environment class pair, a one-to-one corresponding element classification function is obtained from the first number of classification probability functions through matching, and a classification discrimination gain is constructed based on the second number of classification function pairs. It should be noted that, before step S105, step S104 needs to be performed to perform parsing processing based on the environment feature subsets to obtain a first number of classification probability functions corresponding to the environment feature subsets in a first number of environment categories, where the classification probability functions may reflect distribution features of the environment feature subsets in specific environment categories, and each classification probability function corresponds to one environment category. Thus, a classification discrimination gain may be constructed based on the classification probability function of the subset of environmental features at each environmental class. In the embodiment of the present invention, after the second number of environment class pairs are constructed, two class pair elements of each environment class pair may be matched in the first number of classification probability functions and a classification function pair may be constructed, where the class pair elements are selected from the environment classes, and therefore the class pair elements have one-to-one corresponding element classification functions, and on this basis, each two class pair elements in the same environment class pair are matched in the first number of classification probability functions, and thus a classification function pair corresponding to the environment class pair may be obtained. 
After the classification function pairs are obtained, a classification discrimination gain can be constructed based on the second number of classification function pairs. The environmental characteristic subset is identified based on the classification discrimination gain to obtain identification result data, so that the influence of data loss in the data communication process between the sensory terminal and the edge server can be effectively controlled, and the environmental identification accuracy is further improved.
Referring to fig. 5, according to some embodiments of the present invention, step S403 constructs a classification discrimination gain based on the second number of classification function pairs, which may include, but is not limited to, steps S501 to S502 described below.
Step S501, integrating each classification function pair, and constructing a class pair discrimination gain corresponding to each environment class pair;
step S502, averaging the second number of class pair discrimination gains to obtain the classification discrimination gain.
In steps S501 to S502 of some embodiments of the present invention, each classification function pair is integrated to construct the class pair discrimination gain corresponding to each environment class pair, and the second number of class pair discrimination gains are then averaged to obtain the classification discrimination gain. It should be noted that, because the two element classification functions in one classification function pair correspond to the two class pair elements of the environment class pair, integrating each classification function pair constructs a class pair discrimination gain for each environment class pair, which measures the inter-class difference between the two class pair elements in that pair. Accordingly, a second number of class pair discrimination gains can be formed over the first number of environment classes to measure all the pairwise inter-class differences. Since different environment feature elements in the environment feature subset are independent, in some embodiments the classification discrimination gain can be obtained by averaging the second number of class pair discrimination gains.
In some more specific embodiments of the present invention, the environment feature element generated by the generic sensing terminal numbered $k$ can be expressed as:

$$\tilde{x}_{k,n}=x_{k,n}+c_{k,n}+\frac{e_{k,n}}{\sqrt{P_k}}$$

where $x_{k,n}$ is the true feature element, $c_{k,n}$ is the normalized clutter, $P_k$ is the transmit radar sensing power, and $e_{k,n}$ is the sensing noise.
If the true feature element $x_{k,n}$ obeys a mixed Gaussian distribution, then the probability density function corresponding to $x_{k,n}$ can be expressed as:

$$p(x_{k,n})=\frac{1}{L}\sum_{l=1}^{L}\mathcal{N}\!\left(x_{k,n}\mid\mu_{l,n},\,\sigma_{n}^{2}\right)$$

where the first number $L$ is the total number of environment classes that the edge server can identify, $\mu_{l,n}$ is the centroid of class $l$, and $\sigma_{n}^{2}$ is the variance. Substituting the clutter distribution $c_{k,n}\sim\mathcal{N}(0,\sigma_{c}^{2})$, the normalized sensing noise $e_{k,n}/\sqrt{P_k}\sim\mathcal{N}(0,\sigma_{e}^{2}/P_k)$, and the quantization distortion $d_{k,n}/\alpha_k\sim\mathcal{N}(0,\sigma_{d}^{2}/\alpha_k^{2})$ into the restored environment feature element $\hat{x}_{k,n}$, its distribution can be represented as:

$$p(\hat{x}_{k,n})=\frac{1}{L}\sum_{l=1}^{L}\mathcal{N}\!\left(\hat{x}_{k,n}\mid\mu_{l,n},\;\sigma_{n}^{2}+\sigma_{c}^{2}+\frac{\sigma_{e}^{2}}{P_k}+\frac{\sigma_{d}^{2}}{\alpha_k^{2}}\right)$$

where the probability density function of $\hat{x}_{k,n}$ conditioned on class $l$ is:

$$p(\hat{x}_{k,n}\mid l)=\mathcal{N}\!\left(\hat{x}_{k,n}\mid\mu_{l,n},\;\sigma_{n}^{2}+\sigma_{c}^{2}+\frac{\sigma_{e}^{2}}{P_k}+\frac{\sigma_{d}^{2}}{\alpha_k^{2}}\right)$$
further, by environmental category
Figure SMS_103
And environment category>
Figure SMS_104
Constructing an environment class pair, based on which two class pair elements are ^ based on>
Figure SMS_105
And/or>
Figure SMS_106
Matching the first number of classification probability functions to obtain one-to-one corresponding element classification functions, constructing classification function pairs, integrating each classification function pair, and constructing a class pair discrimination gain corresponding to each environment class pair, wherein the class pair discrimination gain can be expressed as:
Figure SMS_107
it can be clear that the above-mentioned manner of constructing class pairs to discriminate gains is applied to the whole environment feature subset
Figure SMS_108
Wherein the subset of the environmental characteristic->
Figure SMS_109
The sum of the corresponding class pair discriminant gains can be expressed as:
Figure SMS_110
/>
due to different environment feature elements in the environment feature subset
Figure SMS_111
Are independent, so that a subset of the environmental characteristics
Figure SMS_112
The corresponding classification discrimination gain->
Figure SMS_113
It can be considered as the average of all class pair discrimination gains, expressed as:
Figure SMS_114
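Under a Gaussian model with class centroids and a shared total variance, the averaging of pairwise discrimination gains can be sketched as follows; the helper name, the single shared-variance argument, and the toy centroids are assumptions for illustration:

```python
import numpy as np
from itertools import combinations

def discriminant_gain(centroids, total_var):
    """Average pairwise discriminant gain for Gaussian classes whose
    feature elements share the per-element variance `total_var`."""
    centroids = np.asarray(centroids, dtype=float)     # shape (L, N)
    pair_gains = [np.sum((centroids[a] - centroids[b]) ** 2) / total_var
                  for a, b in combinations(range(len(centroids)), 2)]
    return float(np.mean(pair_gains))                  # average over all pairs

# three environment classes, two feature elements each
mus = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
print(discriminant_gain(mus, total_var=1.0))           # (1 + 4 + 5) / 3
```

Shrinking the total variance (less clutter, noise, or quantization distortion) raises the gain, matching the role of the denominator in the expressions above.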
it is emphasized that the step S106 of constructing the classification discrimination gain based on the first number of classification probability functions may include, but is not limited to, the above-mentioned specific embodiments.
In step S107 according to some embodiments of the present invention, the environment feature subset is identified based on the classification discrimination gain, so as to obtain identification result data. It should be noted that, since the classification discrimination gain is an index for measuring the difference between classes, the environment feature subset is identified based on the classification discrimination gain, which is beneficial to comparison between different environment classes in the identification process, and can effectively control the influence caused by data loss in the data communication process between the sensory terminal and the edge server, thereby achieving improvement of the environment identification accuracy.
Referring to fig. 6, according to some embodiments of the present invention, the step S107 performs recognition processing on the environment feature subset based on the classification discrimination gain to obtain recognition result data, which may include, but is not limited to, the following steps S601 to S602.
Step S601, enhancing the classification discrimination gain to obtain an optimized discrimination gain;
and step S602, identifying the environmental feature subset based on the optimized discrimination gain to obtain identification result data.
In steps S601 to S602 of some embodiments of the present invention, the classification discrimination gain is enhanced to obtain an optimized discrimination gain, and the environment feature subset is then identified based on the optimized discrimination gain to obtain identification result data. It should be noted that, since the classification discrimination gain is an index for measuring inter-class differences, the larger the classification discrimination gain, the larger the inter-class differences between the environment classes, which helps to improve the environment identification accuracy. Therefore, in some embodiments of the present invention, the classification discrimination gain is enhanced to obtain an optimized discrimination gain, and the environment feature subset is then identified based on the optimized discrimination gain to obtain the identification result data.
Referring to fig. 7, according to some embodiments of the present invention, the step S601 performs an enhancement process on the classification discriminant gain to obtain an optimized discriminant gain, which may include, but is not limited to, the following steps S701 to S704.
Step S701, acquiring sensing time, communication time and calculation time, and configuring delay constraint conditions for classification discrimination gains based on the sensing time, the communication time and the calculation time, wherein the sensing time is acquisition time of environmental information, the communication time is transmission time of an environmental feature subset, and the calculation time is feature extraction time of the environmental feature subset;
step S702, acquiring channel capacity between the edge server and the induction terminal, and configuring transmission constraint conditions for classification discrimination gain based on the channel capacity;
step S703, acquiring an energy consumption threshold of the generic sensing terminal, and configuring an energy constraint condition for the classification discrimination gain based on the energy consumption threshold;
step S704, based on the delay constraint condition, the transmission constraint condition and the energy constraint condition, the classification discrimination gain is enhanced to obtain the optimized discrimination gain.
In steps S701 to S704 of some embodiments of the present invention, the sensing time, communication time, and computing time are first obtained, and a delay constraint condition is configured for the classification discrimination gain based on them, where the sensing time is the acquisition time of the environment information, the communication time is the transmission time of the environment feature subset, and the computing time is the feature extraction time of the environment feature subset. The channel capacity between the edge server and the generic sensing terminal is then obtained, and a transmission constraint condition is configured for the classification discrimination gain based on the channel capacity. The energy consumption threshold of the generic sensing terminal is also obtained, and an energy constraint condition is configured for the classification discrimination gain based on it. Finally, the classification discrimination gain is enhanced based on the delay constraint condition, the transmission constraint condition, and the energy constraint condition to obtain the optimized discrimination gain. It should be noted that the enhancement of the classification discrimination gain is subject to three limitations: the delay constraint condition, determined by the sensing time, communication time, and computing time; the transmission constraint condition, determined by the channel capacity between the edge server and the sensing terminal; and the energy constraint condition, determined by the energy consumption threshold of the sensing terminal.
It is emphasized that using the enhanced classification discrimination gain to identify the environment feature subset facilitates the comparison between different environment categories in the identification process and effectively controls the influence of data loss in the data communication process between the sensing terminal and the edge server, thereby improving the environment identification accuracy.
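The enhancement procedure of steps S701 to S704 can be sketched as a constrained search over an operating parameter; the grid, the stand-in gain/energy/latency functions, and all names below are illustrative assumptions, not the patent's actual optimization algorithm:

```python
import numpy as np

def optimize_gain(power_grid, gain_fn, energy_fn, latency_fn,
                  energy_cap, deadline):
    """Enhance the discriminant gain by searching a power grid, keeping
    only settings that satisfy the energy and delay constraints."""
    best_power, best_gain = None, -np.inf
    for p in power_grid:
        if energy_fn(p) > energy_cap or latency_fn(p) > deadline:
            continue                       # infeasible setting, skip it
        g = gain_fn(p)
        if g > best_gain:
            best_power, best_gain = p, g
    return best_power, best_gain

grid = np.linspace(0.25, 5.0, 20)          # candidate sensing powers
power, gain = optimize_gain(
    grid,
    gain_fn=lambda p: np.log1p(p),         # gain grows with sensing power
    energy_fn=lambda p: 0.5 * p,           # energy spent rises with power
    latency_fn=lambda p: 0.03,             # latency fixed in this toy model
    energy_cap=1.0, deadline=0.1)
print(power)                               # largest power meeting the cap
```

With a monotonically increasing gain, the search settles on the largest power the energy budget allows, which is the intuition behind enhancing the gain under the three constraints.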
In some more specific embodiments, for the common sensing terminal numbered k, its sensing time can be denoted T_sen^k, its calculation time T_cmp^k, and the communication time required for transmitting one feature t_com^k; the total communication bandwidth is B. Here the sensing time T_sen^k is the acquisition time of the environmental information, the communication time is the transmission time of the environmental feature subset, and the calculation time T_cmp^k is the feature extraction time of the environmental feature subset. In some embodiments of the invention, the total time for the edge server to complete the identification task is recorded as the identification time T. Because the identification time T is short and less than the channel coherence time, the wireless channel can be regarded as static, and the channel gain of the link between the sensing terminal numbered k and the edge server can be denoted h_k. The access point of the edge server serves as the coordinator, so the global channel state information can be obtained; the global channel state information reflects the state of the channels between the sensing devices and the edge server.
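As a minimal sketch of the timing model above (variable names are editorial assumptions, not the patent's notation), the per-terminal identification latency can be accumulated from the sensing, calculation, and per-feature communication times, and the static-channel assumption can be checked against the coherence time:

```python
from dataclasses import dataclass

@dataclass
class TerminalTiming:
    # Per-terminal timing for the sensing terminal numbered k (names assumed).
    t_sense: float        # acquisition time of the environmental information
    t_compute: float      # feature-extraction time for the feature subset
    t_per_feature: float  # communication time to transmit one feature

    def total_time(self, num_features: int) -> float:
        # Sense, extract features, then transmit num_features features.
        return self.t_sense + self.t_compute + num_features * self.t_per_feature

def channel_is_static(identification_time: float, coherence_time: float) -> bool:
    # The wireless channel may be treated as static only while the whole
    # identification task finishes within the channel coherence time.
    return identification_time < coherence_time
```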
If the classification discrimination gain is denoted G, then under the restrictions of the delay constraint condition, the transmission constraint condition, and the energy constraint condition, the enhancement of G corresponds to a constrained optimization problem: maximize the classification discrimination gain G subject to the three conditions set out below.
First, the total time for the edge server to complete the identification task is recorded as the identification time T; the sum of the sensing time, the communication time, and the calculation time of the sensing terminal numbered k should not exceed T, which is expressed as delay constraint Condition 1:

T_sen^k + T_com^k + T_cmp^k <= T,

wherein T_sen^k, T_com^k, and T_cmp^k denote that terminal's sensing time, communication time, and calculation time, respectively.
Second, denote by ẑ_k the environmental feature subset restored on the edge server side and by z_k the environmental feature subset generated under an additive Gaussian distortion approximation, and denote their mutual information by I(ẑ_k; z_k). To ensure that the quantized feature subset can be successfully transmitted to the edge server, the mutual information between the generated feature subset z_k and the restored feature subset ẑ_k should be smaller than the channel capacity, which is expressed as transmission constraint Condition 2:

I(ẑ_k; z_k) <= C_k,

wherein C_k is the achievable data rate of the sensing terminal numbered k, concretely expressed as:

C_k = tau_k * B * log2(1 + p_k * |h_k|^2 / sigma^2),

wherein B is the system bandwidth, sigma^2 is the channel noise power, tau_k is the assigned time slot, p_k is the transmit power, and h_k is the channel gain.

Further, by substituting the explicit expressions of the mutual information I(ẑ_k; z_k) and of the data rate C_k into transmission constraint Condition 2, another equivalent expression of transmission constraint Condition 2 can be obtained.
Third, since the energy consumption of each sensing terminal should be limited, the energy consumption threshold of the sensing terminal numbered k is denoted E_k, and the energy constraint Condition 3 is expressed as:

P_sen^k * T_sen^k + p_k * t_k + E_cmp^k <= E_k,

wherein P_sen^k, p_k, T_sen^k, t_k, and E_cmp^k are respectively the sensing power, the transmit power, the constant sensing time, the communication time, and the constant calculation energy consumption of the sensing terminal numbered k.
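The energy bookkeeping behind Condition 3 can be sketched as follows; the additive structure (sensing power times sensing time, plus transmit power times communication time, plus a constant calculation term) is inferred from the quantities listed above and is an assumption, not the patent's exact formula:

```python
def terminal_energy(p_sense: float, t_sense: float, p_tx: float,
                    t_comm: float, e_compute: float) -> float:
    # Total energy drawn by one sensing terminal: sensing power over the
    # constant sensing time, transmit power over the communication time,
    # plus a constant calculation-energy term (assumed additive structure).
    return p_sense * t_sense + p_tx * t_comm + e_compute

def energy_feasible(p_sense: float, t_sense: float, p_tx: float,
                    t_comm: float, e_compute: float,
                    e_threshold: float) -> bool:
    # Energy constraint Condition 3: consumption must stay within E_k.
    return terminal_energy(p_sense, t_sense, p_tx, t_comm, e_compute) <= e_threshold
```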
Under the restrictions of the delay constraint Condition 1, the transmission constraint Condition 2, and the energy constraint Condition 3, the enhanced analytic expression corresponding to the classification discrimination gain G can be further expressed as the constrained optimization problem of maximizing G subject to Conditions 1 to 3. Thus, a classification discrimination gain G and its corresponding enhanced analytic expression are given, and solving this expression realizes the enhancement processing of the classification discrimination gain and yields the optimized discrimination gain.
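A toy illustration of "solving the enhanced analytic expression" is a feasibility-filtered search: enumerate candidate transmit powers, discard those that violate the energy budget, and keep the one maximizing a surrogate objective. The surrogate log(1 + rate) below stands in for the true classification discrimination gain, whose analytic expression the patent does not reproduce here; all names are hypothetical:

```python
import math

def optimize_gain(bandwidth_hz, slot_s, channel_gain, noise_power_w,
                  e_threshold, p_sense, t_sense, e_compute, power_grid):
    # Pick the transmit power maximizing a rate-driven surrogate gain
    # while respecting the energy budget (Condition 3). The surrogate
    # objective is a stand-in for the patent's discrimination gain.
    best_power, best_gain = None, float("-inf")
    for p_tx in power_grid:
        energy = p_sense * t_sense + p_tx * slot_s + e_compute
        if energy > e_threshold:      # infeasible under Condition 3: skip
            continue
        rate = bandwidth_hz * slot_s * math.log2(
            1.0 + p_tx * channel_gain / noise_power_w)
        gain = math.log(1.0 + rate)   # hypothetical surrogate objective
        if gain > best_gain:
            best_power, best_gain = p_tx, gain
    return best_power, best_gain
```

Because the surrogate grows with transmit power, the search simply returns the largest power that still fits the energy budget; a real solver would trade power, time slots, and feature counts jointly.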
It should be emphasized that the manner in which the classification discrimination gain is enhanced to obtain the optimized discrimination gain is not limited to the specific examples given above.
Fig. 8 illustrates an electronic device 800 provided by an embodiment of the invention. The electronic device 800 includes: a processor 801, a memory 802, and a computer program stored on the memory 802 and executable on the processor 801, the computer program, when executed, performing the environment recognition method described above.
The processor 801 and the memory 802 may be connected by a bus or other means.
The memory 802, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs and non-transitory computer executable programs, such as the environment recognition method described in the embodiments of the present invention. The processor 801 implements the environment recognition method described above by running non-transitory software programs and instructions stored in the memory 802.
The memory 802 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like, and the data storage area may store data for performing the environment recognition method described above. Further, the memory 802 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 802 optionally includes memory remotely located relative to the processor 801, and such remote memory can be coupled to the electronic device 800 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Non-transitory software programs and instructions required to implement the above-described environment recognition method are stored in the memory 802, and when executed by the one or more processors 801, perform the above-described environment recognition method, e.g., perform method steps S101-S107 in fig. 1, method steps S201-S203 in fig. 2, method steps S301-S302 in fig. 3, method steps S401-S403 in fig. 4, method steps S501-S502 in fig. 5, method steps S601-S602 in fig. 6, and method steps S701-S704 in fig. 7.
The embodiment of the invention also provides a computer-readable storage medium, which stores computer-executable instructions, and the computer-executable instructions are used for executing the environment identification method.
In one embodiment, the computer-readable storage medium stores computer-executable instructions that are executed by one or more control processors, for example, to perform method steps S101-S107 in fig. 1, method steps S201-S203 in fig. 2, method steps S301-S302 in fig. 3, method steps S401-S403 in fig. 4, method steps S501-S502 in fig. 5, method steps S601-S602 in fig. 6, and method steps S701-S704 in fig. 7.
The above-described embodiments of the apparatus are merely illustrative: the units described as separate components may or may not be physically separate, i.e., they may be located in one place or distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
One of ordinary skill in the art will appreciate that all or some of the steps, systems, and methods disclosed above may be implemented as software, firmware, hardware, or suitable combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media as known to those skilled in the art. It should also be understood that the various implementations provided by the embodiments of the present invention may be combined arbitrarily to achieve different technical effects.
While the preferred embodiments of the present invention have been described in detail, it will be understood by those skilled in the art that the foregoing and various other changes, omissions and deviations in the form and detail thereof may be made without departing from the scope of this invention.

Claims (10)

1. An environment recognition method is applied to an edge server, and the method comprises the following steps:
acquiring an environment feature subset based on a common sensing terminal, wherein the environment feature subset is obtained by performing feature extraction on environment information by the common sensing terminal, and the environment information is acquired by the common sensing terminal in a target environment;
analyzing the environment feature subset to obtain a first number of classification probability functions corresponding to the environment feature subset in a first number of environment categories;
constructing a classification discrimination gain based on a first number of the classification probability functions;
and identifying the environmental feature subset based on the classification discrimination gain to obtain identification result data.
2. The method of claim 1, wherein said constructing a classification discrimination gain based on a first number of the classification probability functions comprises:
constructing a second number of environment class pairs based on the first number of environment classes, each environment class pair comprising two class pair elements, the class pair elements being selected from the environment classes;
matching two class pair elements of each environment class pair to obtain one-to-one corresponding element classification functions from a first number of classification probability functions and constructing classification function pairs;
based on a second number of the classification function pairs, the classification discrimination gain is constructed.
3. The method of claim 2, wherein said constructing the classification discrimination gain based on a second number of the classification function pairs comprises:
integrating each classification function pair to construct a class pair discrimination gain corresponding to each environment class pair;
and averaging the second number of class pair discrimination gains to obtain the classification discrimination gain.
4. The method of claim 1, wherein the identifying the subset of environmental features based on the classification discrimination gain to obtain identification result data comprises:
enhancing the classification discrimination gain to obtain an optimized discrimination gain;
and identifying the environmental feature subset based on the optimized discrimination gain to obtain identification result data.
5. The method of claim 4, wherein said enhancing the classification discriminant gain to obtain an optimized discriminant gain comprises:
acquiring sensing time, communication time and calculation time, and configuring a delay constraint condition for the classification discrimination gain based on the sensing time, the communication time and the calculation time, wherein the sensing time is the acquisition time of the environmental information, the communication time is the transmission time of the environmental feature subset, and the calculation time is the feature extraction time of the environmental feature subset;
acquiring channel capacity between the edge server and the common sensing terminal, and configuring a transmission constraint condition for the classification discrimination gain based on the channel capacity;
acquiring an energy consumption threshold of the common sensing terminal, and configuring an energy constraint condition for the classification discrimination gain based on the energy consumption threshold;
and enhancing the classification discrimination gain based on the delay constraint condition, the transmission constraint condition and the energy constraint condition to obtain an optimized discrimination gain.
6. An environment recognition method, applied to a common sensing terminal, the method comprising:
collecting environmental information, wherein the environmental information comprises detectable physical quantity of a target environment;
extracting the characteristics of the environmental information to obtain an environmental characteristic subset;
and sending the environmental feature subset to an edge server, so that the edge server performs analysis processing based on the environmental feature subset to obtain classification probability functions, constructs a classification discrimination gain based on the classification probability functions, and identifies the environmental feature subset based on the classification discrimination gain to obtain identification result data.
7. The method of claim 6, wherein the environmental information comprises object motion data in the target environment, and the performing feature extraction on the environmental information to obtain an environmental feature subset comprises:
sampling the object motion data to generate a motion data vector;
extracting a main characteristic element from the motion data vector based on principal component analysis to obtain a motion characteristic vector;
and carrying out normalization processing on the motion characteristic vector to obtain the environment characteristic subset.
8. The method of claim 7, wherein the extracting a main characteristic element from the motion data vector based on principal component analysis to obtain a motion characteristic vector comprises:
performing singular value decomposition on the motion data vector to obtain an intermediate data vector;
and extracting main characteristic elements from the intermediate data vector based on principal component analysis to obtain a motion characteristic vector.
9. An electronic device, comprising: memory storing a computer program, a processor implementing the environment recognition method according to any one of claims 1 to 8 when executing the computer program.
10. A computer-readable storage medium, characterized in that the storage medium stores a program, which is executed by a processor to implement the environment recognizing method according to any one of claims 1 to 8.
CN202310213046.0A 2023-03-08 2023-03-08 Environment recognition method, electronic device, and computer-readable storage medium Active CN115935278B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310213046.0A CN115935278B (en) 2023-03-08 2023-03-08 Environment recognition method, electronic device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN115935278A true CN115935278A (en) 2023-04-07
CN115935278B CN115935278B (en) 2023-06-20

Family

ID=86554468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310213046.0A Active CN115935278B (en) 2023-03-08 2023-03-08 Environment recognition method, electronic device, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN115935278B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019072128A1 (en) * 2017-10-09 2019-04-18 北京京东尚科信息技术有限公司 Object identification method and system therefor
CN112181006A (en) * 2020-10-07 2021-01-05 广州云智通讯科技有限公司 Environment intelligent processing method and system based on big data and storage medium
CN112887371A (en) * 2021-01-12 2021-06-01 深圳市中博科创信息技术有限公司 Edge calculation method and device, computer equipment and storage medium
CN114828208A (en) * 2022-03-29 2022-07-29 Oppo广东移动通信有限公司 Terminal position identification method and device, computer readable medium and electronic equipment
CN115407338A (en) * 2022-07-26 2022-11-29 徐毓辰 Vehicle environment information sensing method and system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant