CN113505739B - Indoor human-pet distinguishing and behavior recognition method and system - Google Patents

Indoor human-pet distinguishing and behavior recognition method and system

Info

Publication number
CN113505739B
CN113505739B CN202110849340.1A CN202110849340A
Authority
CN
China
Prior art keywords
identification
passive infrared
infrared sensor
classification
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110849340.1A
Other languages
Chinese (zh)
Other versions
CN113505739A (en)
Inventor
周翔
张静思
赵婷
王纪隆
张心悦
罗茂辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202110849340.1A priority Critical patent/CN113505739B/en
Publication of CN113505739A publication Critical patent/CN113505739A/en
Application granted granted Critical
Publication of CN113505739B publication Critical patent/CN113505739B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25Integrating or interfacing systems involving database management systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Housing For Livestock And Birds (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to an indoor human-pet distinguishing and behavior identification method and system, wherein the method comprises the following steps: S1: arranging a passive infrared sensor array indoors, and collecting, with the passive infrared sensor array, the real-time level signals generated by the activities of indoor pets and persons; S2: processing the level signals to obtain identification data, marking the identification data of each identification period with corresponding identification labels, establishing a database, and dividing the database into a training set and a test set; S3: constructing a classification recognition model, and training and testing the classification recognition model based on the database; S4: acquiring and processing the level signals of the passive infrared sensor array, obtaining identification data, feeding the identification data into the classification recognition model, and obtaining an identification result. Compared with the prior art, the invention has advantages including high identification accuracy and the ability to effectively distinguish people from pets.

Description

Indoor human-pet distinguishing and behavior recognition method and system
Technical Field
The invention relates to the field of indoor behavior recognition, and in particular to an indoor human-pet distinguishing and behavior recognition method and system.
Background
The smart home system uses computer and communication technology to organically combine the various subsystems in a home, thereby providing a comfortable, efficient and energy-saving life, and has been widely applied in recent years. Indoor behavior identification is an important basis for the automatic control of equipment such as lighting, air conditioning and heating, and is therefore essential to the operation of a smart home system. The accuracy of indoor occupant positioning and action recognition determines whether the smart home system can correctly understand people's behavioral intentions, infer the operating-condition combination of each device for the various occupant activity scenarios, and finally complete reasonable regulation and service decisions. Data for indoor occupant positioning and behavior recognition can be acquired with a camera combined with image recognition technology, but this easily exposes personal privacy; wearable inertial sensors combined with wireless network positioning terminals can also be used, but they may cause a Hawthorne effect in the observed subjects, affecting prediction accuracy and interfering with the residents' normal life.
Chinese patent CN102521574A discloses a human body action recognition method based on pyroelectric infrared information. It collects data from a single passive infrared sensor for different actions, extracts the spectral and time-frequency features of the signals with two feature-extraction algorithms, fast Fourier transform and wavelet packet analysis, then classifies these features with a support vector machine and a K-means clustering algorithm respectively, and compares the recognition performance of the different algorithms. The method is based on the analog signals acquired by the passive infrared sensor, whose data are continuous in the time domain, and takes the positions or actions of people as the main recognition objects; it ignores the correlation between position and action in real scenes, and the passive infrared sensor cannot simultaneously locate indoor occupants and identify their behavior types.
Moreover, more and more families now keep pets. Because the activity ranges of people and pets overlap, how to distinguish whether an object appearing in the monitored area is a person or a pet is a problem that needs to be solved, and the prior art that distinguishes people from pets based on images and videos easily exposes privacy.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide an indoor human-pet distinguishing and behavior identification method and system.
The purpose of the invention can be realized by the following technical scheme:
an indoor human-pet distinguishing and behavior identification method comprises the following steps:
S1: arranging a passive infrared sensor array indoors, and collecting real-time level signals generated by activities of indoor pets and personnel by using the passive infrared sensor array;
S2: processing the level signal to obtain identification data, marking a corresponding identification label on the identification data corresponding to each identification time period, establishing a database, and dividing the database into a training set and a test set;
S3: constructing a classification recognition model, and training and testing the classification recognition model based on a database;
S4: acquiring and processing level signals of the passive infrared sensor array, acquiring identification data, sending the identification data to a classification identification model, and acquiring an identification result.
Preferably, the passive infrared sensor array comprises at least four passive infrared sensors.
Further preferably, the passive infrared sensor array comprises 15 passive infrared sensors, and the passive infrared sensors form a passive infrared sensor array with 5 rows and 3 columns.
Preferably, the identification tag comprises a personnel tag, a pet tag and an action intensity tag.
Further preferably, the identification tag further comprises a location tag.
Preferably, the identification data includes a real-time sensor count value in a current identification period and a sensor accumulated count value of a preset accumulation duration before the current identification period.
Preferably, the model type of the classification recognition model includes, but is not limited to, a naive Bayes classification model, a KNN classification model or a random forest classification model.
Preferably, a monitoring camera is further arranged indoors for acquiring the position and action information of the people and the pet when the database is established in step S2.
An indoor human-pet distinguishing and behavior recognition system, comprising:
the passive infrared sensor array is used for collecting real-time level signals generated by activities of indoor pets and personnel;
the database module is used for processing the level signal to acquire identification data, marking a corresponding identification label on the identification data corresponding to each identification time period, establishing a database, and dividing the database into a training set and a test set;
the recognition model module is used for constructing a classification recognition model and training and testing the classification recognition model based on the database;
and the classification identification module acquires and processes the level signal of the passive infrared sensor array, acquires identification data, and sends the identification data to the classification identification model to acquire an identification result.
The passive infrared sensor array comprises at least four passive infrared sensors.
Compared with the prior art, the invention has the following advantages:
(1) The invention can effectively acquire the activity level signals of indoor pets and people with the passive infrared sensor array. By establishing a database comprising each sensor's count value in the current identification period and its accumulated count value over a preset accumulation duration before the current identification period, and constructing and training a classification recognition model on it, indoor people and pets can be effectively distinguished, activity intensity can be effectively recognized, and identification accuracy and efficiency are improved;
(2) The invention introduces the accumulated sensor count value over a preset accumulation duration before the current identification period as an input of the action recognition model, which effectively improves prediction accuracy and allows the area and type of the activity to be classified and identified;
(3) The invention is based on a passive infrared sensor array whose detection areas have obvious distance differences, so the resulting response-signal data features are more distinctive and the recognition accuracy is higher;
(4) The invention constructs the classification recognition model with a machine learning method; based on the characteristics of machine learning, the large amount of data collected by the passive infrared sensor array can be effectively processed and used to classify pets and people and to recognize actions, improving the recognition effect;
(5) Once the labeling of the database is completed, indoor images no longer need to be collected with a monitoring camera, so the risk of privacy exposure is avoided while the reliability of the database is effectively improved.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a layout diagram of a passive infrared sensor array of the present invention;
FIG. 3 is a graph comparing the accuracy of the classification recognition models of the present invention;
FIG. 4 is a comparative graph of the F-measure of the classification recognition model of the present invention;
FIG. 5 is a Kappa coefficient comparison chart of the classification recognition model of the present invention;
FIG. 6 is a comparison graph of AUC values for the classification recognition model of the present invention;
FIG. 7 is an example of the recognition effect of the classification recognition model of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. Note that the following description of the embodiments is merely a substantive example; the present invention is not limited to the applications or uses described, nor to the following embodiments.
Examples
An indoor human-pet distinguishing and behavior identification method, as shown in FIG. 1, comprises the following steps:
s1: the passive infrared sensor array is arranged indoors, and real-time level signals generated by activities of indoor pets and personnel are collected by the passive infrared sensor array.
In this example, a living room of a residence, 4.8 m long × 3.0 m wide × 2.7 m high, is selected, and the daily activities of the occupant and the pet in it are taken as the recognition objects.
Correspondingly, as shown in FIG. 2, a passive infrared sensor array of 5 rows and 3 columns is arranged along the left half of the wall surface, with the first column (No. 1 to No. 5), the second column (No. 6 to No. 10) and the third column (No. 11 to No. 15) arranged from left to right at a spacing of 0.8 m between columns; within each column the passive infrared sensors are mounted at 0.3 m, 0.6 m, 1.2 m, 1.8 m and 2.4 m from bottom to top. The floor area A and the sofa area B of the living room are each divided into 3 equal sections, and the level signals of the passive infrared sensor array corresponding to the indoor activities of people and pets are collected.
In this embodiment, 15 RS-HW-N01 passive infrared sensors are selected. Each device outputs a digital 0/1 signal once per second, outputs a high level "1" for 5 consecutive seconds from the instant human body activity is sensed, and its count values can be accumulated every 60 seconds as the device's output for each minute.
Correspondingly, in this embodiment each minute is taken as an identification period, and the activities of people and pets are distinguished and identified minute by minute.
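As an illustration only (a minimal sketch, not the device firmware; the helper names are hypothetical), the per-second 0/1 output with its 5-second hold can be aggregated into the per-minute count value as follows:

```python
# Sketch of aggregating a PIR sensor's per-second 0/1 output into a per-minute count.
from typing import List

HOLD_SECONDS = 5          # the sensor keeps outputting "1" for 5 s after a trigger
PERIOD_SECONDS = 60       # one identification period = 1 minute


def apply_hold(triggers: List[int], hold: int = HOLD_SECONDS) -> List[int]:
    """Expand instantaneous detections into the held 0/1 level signal."""
    level = [0] * len(triggers)
    remaining = 0
    for i, detected in enumerate(triggers):
        if detected:
            remaining = hold
        if remaining > 0:
            level[i] = 1
            remaining -= 1
    return level


def minute_count(level_signal: List[int]) -> int:
    """Count value of one sensor over one 60-second identification period."""
    assert len(level_signal) == PERIOD_SECONDS
    return sum(level_signal)


# Example: activity is sensed twice within one minute.
triggers = [0] * PERIOD_SECONDS
triggers[10] = triggers[40] = 1
print(minute_count(apply_hold(triggers)))     # -> 10 (two 5-second bursts of "1")
```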
S2: and processing the level signal to obtain identification data, marking a corresponding identification label on the identification data corresponding to each identification time period, establishing a database, and dividing the database into a training set and a test set.
In this embodiment, the identification data includes a real-time sensor count value in the current identification period and a sensor accumulated count value of a preset accumulation duration before the current identification period.
Specifically, in this embodiment the identification period is 1 minute, and the preset accumulation duration is the accumulation of count values over a number of minutes preceding the current identification period; the accumulation duration may be, for example, 1, 5, 10, 20, 30, 40, 50 or 60 minutes.
For example, if the current time is 12:01 and the accumulation duration is chosen as 30 minutes, then the sensor count value in the current identification period is the count over the one minute from 12:00 to 12:01, and the sensor accumulated count value is the sum of the sensor counts over the 30 minutes from 11:30 to 12:00.
Specifically, when the recognition period is 1 minute, the sensor count value in the current recognition period may be expressed by the following equation:
$$X = \sum_{i=1}^{60} x_i$$
wherein $x_i$ is the count value of the passive infrared sensor in the i-th second, and $X$ is the sensor count value of that passive infrared sensor in the current identification period (1 minute).
In this embodiment, for every passive infrared sensor in the array, the sensor count value in the current identification period and the sensor accumulated count value over the preset accumulation duration before the current identification period are obtained.
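A minimal sketch of assembling this identification data, assuming the per-minute counts of all 15 sensors are stored as a two-dimensional array (the array layout and function name are illustrative, not taken from the patent text):

```python
# Sketch: identification data for one period = 15 current-minute counts
# plus 15 accumulated counts over the preceding accumulation duration.
import numpy as np

N_SENSORS = 15


def build_features(counts: np.ndarray, t: int, accum_minutes: int = 40) -> np.ndarray:
    """counts: shape (n_minutes, N_SENSORS); t: index of the current minute.
    The default of 40 minutes is the accumulation duration chosen later in this embodiment."""
    current = counts[t]                               # real-time sensor count values
    start = max(0, t - accum_minutes)
    accumulated = counts[start:t].sum(axis=0)         # accumulated sensor count values
    return np.concatenate([current, accumulated])     # 30-dimensional feature vector


# Example with placeholder per-minute counts (0..60) for a two-hour recording.
rng = np.random.default_rng(0)
counts = rng.integers(0, 61, size=(120, N_SENSORS))
print(build_features(counts, t=119, accum_minutes=40).shape)   # (30,)
```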
In addition, one surveillance camera is installed indoors, with a 180-degree viewing angle and an infrared illumination distance of 7.5 meters, to record the actual positions and actions of people and pets; its monitoring range covers the whole test area. The camera is only used while the database is being constructed and can be removed once the database is completed. Corresponding identification labels are attached to the identification data in combination with the camera's monitoring data. The identification labels include a person label, a pet label, an action intensity label and a location label.
In this embodiment, specifically, the person label is chosen from "person activity" and "no person activity"; the pet label from "pet activity" and "no pet activity"; and the action intensity label from "strong", "medium" and "weak", where the strong actions include actions with large limb participation and amplitude, represented by walking, the medium actions include actions represented by "potted plant" care, and the weak actions include actions represented by "eating" and "using a mobile phone". The location labels correspond to the area division and are chosen from A1, A2, A3, B1, B2 and B3.
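For illustration only, one labelled record of the database could be organized as below; the field names and string values are hypothetical stand-ins for the labels described above:

```python
# Sketch of a single labelled one-minute record in the database.
from dataclasses import dataclass
from typing import List


@dataclass
class LabelledMinute:
    features: List[int]       # 15 current-minute counts + 15 accumulated counts
    person_active: bool       # person activity / no person activity
    pet_active: bool          # pet activity / no pet activity
    intensity: str            # "strong" | "medium" | "weak" (meaningful when a person is active)
    location: str             # "A1" | "A2" | "A3" | "B1" | "B2" | "B3"


record = LabelledMinute(
    features=[0] * 30,
    person_active=True,
    pet_active=False,
    intensity="weak",         # e.g. eating or using a mobile phone
    location="B2",
)
```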
In this embodiment, after the construction of the database is completed, the database is split by stratified sampling at a ratio of 2:1 into a training set and a test set, respectively.
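A minimal sketch of this 2:1 stratified split using scikit-learn; X and y stand in for the feature matrix and combined class labels built from the database, and the random placeholder data is only there to make the sketch runnable:

```python
# Sketch: stratified ("layered") 2:1 split into training and test sets.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.integers(0, 61, size=(3000, 30))      # placeholder identification data
y = rng.integers(0, 8, size=3000)             # placeholder combined class labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y,
    test_size=1 / 3,        # 2:1 ratio of training set to test set
    stratify=y,             # stratified sampling keeps the class proportions
    random_state=42,
)
print(len(X_train), len(X_test))              # about 2000 / 1000
```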
S3: and constructing a classification recognition model, and training and testing the classification recognition model based on the database.
In this embodiment, the output results of the classification recognition model of the invention include: no pet and no person activity; only pet activity; and person activity together with the person's activity area and action intensity.
In the present invention, the recognition algorithm of the classification recognition model may adopt, but is not limited to, naive Bayes, logistic regression, support vector machine (SVM), k-nearest neighbors (KNN) or decision tree algorithms. In this embodiment, as shown in FIGS. 3 to 6, in order to select a better recognition algorithm, ten-fold cross validation is used to test performance indexes such as accuracy, F-measure, Kappa coefficient and AUC value and to evaluate the performance of the classifier models; finally, an indoor human-pet distinguishing and behavior recognition model established with the random forest algorithm is adopted, and its accuracy reaches 99.7%.
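A minimal sketch of this classifier screening with ten-fold cross-validation; the scorers are standard scikit-learn choices (macro F-measure, Cohen's kappa, one-vs-rest AUC) and not necessarily the exact metric definitions used in the embodiment, and the placeholder data only keeps the sketch self-contained:

```python
# Sketch: ten-fold cross-validation of several candidate classifiers.
import numpy as np
from sklearn.model_selection import cross_validate
from sklearn.metrics import make_scorer, cohen_kappa_score
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.integers(0, 61, size=(600, 30)).astype(float)   # placeholder features
y_train = rng.integers(0, 4, size=600)                        # placeholder classes

scoring = {
    "accuracy": "accuracy",
    "f_measure": "f1_macro",
    "kappa": make_scorer(cohen_kappa_score),
    "auc": "roc_auc_ovr",
}

models = {
    "naive_bayes": GaussianNB(),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm": SVC(probability=True),            # probability estimates needed for AUC
    "knn": KNeighborsClassifier(),
    "decision_tree": DecisionTreeClassifier(random_state=42),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
}

for name, model in models.items():
    scores = cross_validate(model, X_train, y_train, cv=10, scoring=scoring)
    summary = {k: round(v.mean(), 3) for k, v in scores.items() if k.startswith("test_")}
    print(name, summary)
```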
In addition, the performance of different accumulation durations is screened, and the accumulation duration is finally chosen as 40 minutes.
S4: and acquiring and processing level signals of the passive infrared sensor array, acquiring identification data, sending the identification data to a classification identification model, and acquiring an identification result.
In this embodiment, based on the trained action recognition algorithm, the one-minute count value of the current identification period and the accumulated count value over the 40 minutes before the current identification period are obtained for each sensor and used as inputs of the classification recognition model, which outputs the corresponding human-pet distinguishing result and human behavior recognition result. As shown in FIG. 7, the predicted indoor activities of people and pets can be obtained from the classification recognition model; the recognition accuracy on the test set is calculated according to whether the predicted values are consistent with the actual activities, and the model accuracy is 90.9%.
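A minimal sketch of the online identification of step S4; the random forest here is fitted on placeholder data only so the example runs end to end, and the composite class labels are hypothetical, not taken from the patent:

```python
# Sketch: per-minute online identification with a trained random-forest model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.integers(0, 61, size=(600, 30)).astype(float)
y_train = rng.choice(
    ["no_activity", "pet_only", "person_A1_strong", "person_B2_weak"], size=600
)
model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)


def identify_current_minute(model, counts, t, accum_minutes=40):
    """Predict the activity class for identification period t (one minute)."""
    current = counts[t]                                             # 15 real-time counts
    accumulated = counts[max(0, t - accum_minutes):t].sum(axis=0)   # 40-minute accumulation
    x = np.concatenate([current, accumulated]).reshape(1, -1)
    return model.predict(x)[0]


counts = rng.integers(0, 61, size=(120, 15))   # placeholder two-hour count history
print(identify_current_minute(model, counts, t=119))
```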
This embodiment also provides an indoor human-pet distinguishing and behavior recognition system, comprising: the passive infrared sensor array, used for collecting the real-time level signals generated by the activities of indoor pets and people; the database module, used for processing the level signals to obtain identification data, marking the identification data of each identification period with corresponding identification labels, establishing a database and dividing the database into a training set and a test set; the recognition model module, used for constructing a classification recognition model and training and testing it based on the database; and the classification identification module, which acquires and processes the level signals of the passive infrared sensor array, obtains identification data and sends it to the classification recognition model to obtain an identification result. The passive infrared sensor array comprises at least four passive infrared sensors. Since the indoor human-pet distinguishing and behavior identification method has the above technical effects, the system for implementing it also has the corresponding technical effects.
The above embodiments are merely examples and do not limit the scope of the present invention. These embodiments may be implemented in other various manners, and various omissions, substitutions, and changes may be made without departing from the technical spirit of the present invention.

Claims (5)

1. An indoor human-pet distinguishing and behavior identification method, characterized by comprising the following steps:
S1: arranging a passive infrared sensor array indoors, and collecting real-time level signals generated by activities of indoor pets and personnel by using the passive infrared sensor array;
S2: processing the level signal to obtain identification data, marking a corresponding identification label on the identification data corresponding to each identification time period, establishing a database, and dividing the database into a training set and a test set;
S3: constructing a classification recognition model, and training and testing the classification recognition model based on a database;
S4: acquiring and processing level signals of the passive infrared sensor array, acquiring identification data, sending the identification data into a classification identification model to acquire an identification result,
the passive infrared sensor array comprises at least four passive infrared sensors; the passive infrared sensor array comprises 15 passive infrared sensors which form a passive infrared sensor array of 5 rows and 3 columns; the passive infrared sensors are RS-HW-N01 passive infrared sensors of the 0/1 digital-signal output type,
the identification tag comprises a personnel tag, a pet tag and an action intensity tag;
the identification data comprises a sensor real-time count value in the current identification period and a sensor accumulated count value of a preset accumulated time before the current identification period.
2. The indoor human-pet distinguishing and behavior identification method according to claim 1, wherein the identification tag further comprises a location tag.
3. The indoor human-pet distinguishing and behavior identification method according to claim 1, wherein the classification recognition model includes, but is not limited to, a naive Bayes classification model, a KNN classification model or a random forest classification model.
4. The indoor human-pet distinguishing and behavior identification method according to claim 1, wherein a monitoring camera for acquiring the position and action information of people and pets when the database is established in step S2 is further arranged indoors.
5. An indoor human-pet distinguishing and behavior recognition system, characterized by comprising:
the passive infrared sensor array is used for collecting real-time level signals generated by activities of indoor pets and personnel;
the database module is used for processing the level signal to acquire identification data, marking a corresponding identification label on the identification data corresponding to each identification time period, establishing a database, and dividing the database into a training set and a test set;
the recognition model module is used for constructing a classification recognition model and training and testing the classification recognition model based on the database;
the classification identification module is used for acquiring and processing level signals of the passive infrared sensor array, acquiring identification data, sending the identification data to a classification identification model and acquiring an identification result;
the passive infrared sensor array comprises at least four passive infrared sensors; the passive infrared sensor array comprises 15 passive infrared sensors which form a passive infrared sensor array of 5 rows and 3 columns; the passive infrared sensors are RS-HW-N01 passive infrared sensors of the 0/1 digital-signal output type,
the identification tag comprises a personnel tag, a pet tag and an action intensity tag;
the identification data comprises a sensor real-time count value in the current identification period and a sensor accumulated count value of a preset accumulated time before the current identification period.
CN202110849340.1A 2021-07-27 2021-07-27 Indoor human-pet distinguishing and behavior recognition method and system Active CN113505739B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110849340.1A CN113505739B (en) 2021-07-27 2021-07-27 Indoor human-pet distinguishing and behavior recognition method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110849340.1A CN113505739B (en) 2021-07-27 2021-07-27 Indoor human-pet distinguishing and behavior recognition method and system

Publications (2)

Publication Number Publication Date
CN113505739A CN113505739A (en) 2021-10-15
CN113505739B true CN113505739B (en) 2022-10-25

Family

ID=78014076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110849340.1A Active CN113505739B (en) 2021-07-27 2021-07-27 Indoor human-pet distinguishing and behavior recognition method and system

Country Status (1)

Country Link
CN (1) CN113505739B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107016342A (en) * 2017-03-06 2017-08-04 武汉拓扑图智能科技有限公司 A kind of action identification method and system
CN111311809A (en) * 2020-02-21 2020-06-19 南京理工大学 Intelligent access control system based on multi-biological-feature fusion
CN112215296A (en) * 2020-10-21 2021-01-12 红相股份有限公司 Infrared image identification method based on transfer learning and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5317620A (en) * 1992-04-02 1994-05-31 Orca Technology, Inc. Infrared alarm system
US5670943A (en) * 1996-02-26 1997-09-23 Detection Systems, Inc. Pet immune intruder detection
US7924212B2 (en) * 2009-08-10 2011-04-12 Robert Bosch Gmbh Method for human only activity detection based on radar signals
CN101816560B (en) * 2010-05-31 2011-11-16 天津大学 Identification method based on multi-angle human body pyroelectricity information detection
CN102521574A (en) * 2011-12-14 2012-06-27 天津大学 Human action identification method based on pyroelectric infrared information
CN103729626A (en) * 2013-12-31 2014-04-16 天津大学 Human body heat source feature extracting and distinguishing method based on infrared pyroelectric information
CN105091189A (en) * 2014-05-07 2015-11-25 青岛海尔空调电子有限公司 Method and device for identifying non-human body heat source

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107016342A (en) * 2017-03-06 2017-08-04 武汉拓扑图智能科技有限公司 A kind of action identification method and system
CN111311809A (en) * 2020-02-21 2020-06-19 南京理工大学 Intelligent access control system based on multi-biological-feature fusion
CN112215296A (en) * 2020-10-21 2021-01-12 红相股份有限公司 Infrared image identification method based on transfer learning and storage medium

Also Published As

Publication number Publication date
CN113505739A (en) 2021-10-15

Similar Documents

Publication Publication Date Title
CN103475736B (en) Sensor communication system and method for conducting monitoring through same
CN103488148B (en) A kind of animal behavior intelligent monitor system based on Internet of Things and computer vision
US20180129873A1 (en) Event detection and summarisation
CN109076310A (en) The autonomous semantic marker of physical location
WO2015172445A1 (en) Domestic multifunctional intelligent robot
CN102799870A (en) Single-training sample face recognition method based on blocking consistency LBP (Local Binary Pattern) and sparse coding
CN103049459A (en) Feature recognition based quick video retrieval method
Chen et al. Remote recognition of in-bed postures using a thermopile array sensor with machine learning
US10769909B1 (en) Using sensor data to detect events
De Paola et al. User detection through multi-sensor fusion in an AmI scenario
CN112464730B (en) Pedestrian re-identification method based on domain-independent foreground feature learning
CN103123690B (en) Information acquisition device, information acquisition method, identification system and identification method
CN103248703A (en) Automatic monitoring system and method for live pig action
Bu Human motion gesture recognition algorithm in video based on convolutional neural features of training images
CN112862145A (en) Occupant thermal comfort inference using body shape information
CN110688980A (en) Human body posture classification method based on computer vision
CN111382727A (en) Deep learning-based dog face identification method
CN111079720B (en) Face recognition method based on cluster analysis and autonomous relearning
Naser et al. Heat-map based occupancy estimation using adaptive boosting
Yao et al. Freedom: Online activity recognition via dictionary-based sparse representation of rfid sensing data
Arya et al. Automatic face recognition and detection using OpenCV, haar cascade and recognizer for frontal face
CN113505739B (en) Indoor human-pet distinguishing and behavior recognition method and system
CN112257559A (en) Identity recognition method based on gait information of biological individual
CN109948481B (en) Passive human body identification method based on narrowband radio frequency link sampling
WO2023093241A1 (en) Pedestrian re-identification method and apparatus, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant