CN110674717B - Smoking monitoring system based on gesture recognition - Google Patents


Info

Publication number
CN110674717B
CN110674717B (application CN201910872752.XA)
Authority
CN
China
Prior art keywords
module
feature points
gesture
distance
smoking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910872752.XA
Other languages
Chinese (zh)
Other versions
CN110674717A (en)
Inventor
曾新华
欧阳麟
周靖阳
严娜
孙杨杨
季铖
洪伟
方静静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Benba Huishi Technology Co ltd
Original Assignee
Hangzhou Benba Huishi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Benba Huishi Technology Co ltd filed Critical Hangzhou Benba Huishi Technology Co ltd
Priority to CN201910872752.XA priority Critical patent/CN110674717B/en
Publication of CN110674717A publication Critical patent/CN110674717A/en
Application granted granted Critical
Publication of CN110674717B publication Critical patent/CN110674717B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51Indexing; Data structures therefor; Storage structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a smoking monitoring system based on gesture recognition, comprising an image acquisition module, an acquisition and division module, a gesture feature simulation module, a screening and positioning module, a gesture correction module, a gesture recognition and comparison module, a human body gesture model database, a gesture evaluation processing module and a display terminal. The image acquisition module, acquisition and division module and gesture feature simulation module extract and identify the feature points of a smoker in the acquired images; the screening and positioning module, gesture correction module, gesture recognition and comparison module and gesture evaluation processing module measure the distances between feature points in the smoking gesture, determine the smoking frequency per unit time from those distances, and derive the smoker's smoking addiction coefficient. The system thus makes the smoker's smoking frequency and addiction state easy to grasp intuitively, and can monitor a smoker's smoking state from the smoking gesture alone.

Description

Smoking monitoring system based on gesture recognition
Technical Field
The invention belongs to the technical field of image monitoring, and relates to a smoking monitoring system based on gesture recognition.
Background
Smoking is now recognized as a major threat to human health and draws growing attention from all parts of society; according to statistics, China has roughly 320 million smokers.
Smoking seriously endangers the smoker and the people nearby, and is especially likely to cause accidents in hazardous environments such as gas stations and explosives depots. Existing systems, however, generally raise an alarm based on smoke concentration: if the concentration stays below a set parameter value no alarm is triggered, so detection is unreliable and the danger remains high. To improve the monitoring of smokers, the invention detects a smoker's smoking state by means of gesture recognition.
Disclosure of Invention
The invention aims to provide a smoking monitoring system based on gesture recognition. Its image acquisition module, acquisition and division module and gesture feature simulation module extract and recognize the feature points of a smoker in the collected images; its screening and positioning module, gesture correction module, gesture recognition and comparison module and gesture evaluation processing module measure the distances between feature points in the smoking gesture, determine the smoking frequency per unit time from those distances, and derive the smoker's smoking addiction coefficient, thereby solving the prior-art problem of poor smoking monitoring.
The purpose of the invention can be realized by the following technical scheme:
the smoking monitoring system based on gesture recognition comprises an image acquisition module, an acquisition and division module, a gesture feature simulation module, a screening and positioning module, a gesture correction module, a gesture recognition comparison module, a human body gesture model database, a gesture evaluation processing module and a display terminal;
the image acquisition module is connected with the acquisition and division module, the posture characteristic simulation module is respectively connected with the acquisition and division module, the screening and positioning module and the posture recognition and comparison module, the posture correction module is respectively connected with the screening and positioning module and the human body posture model database, the posture evaluation processing module is respectively connected with the posture recognition and comparison module, the posture correction module, the human body posture model database and the display terminal, and the posture recognition and comparison module is connected with the human body posture model database.
The image acquisition module is used for acquiring image information of personnel in real time, filtering the acquired images, retaining only images whose sharpness exceeds a preset sharpness threshold, and sending those retained images to the acquisition and division module;
the acquisition and division module is used for receiving the images sent by the image acquisition module, dividing each received image into several sub-images of equal size, numbering them 1, 2, …, i, …, n, and sending the divided, ordered sub-images to the gesture feature simulation module in sequence;
the gesture feature simulation module is used for receiving the sub-image information sent by the acquisition and division module, extracting gesture feature points from each received sub-image, computing simulated position coordinates for the extracted feature points in each sub-image, sending those coordinates to the screening and positioning module, and sending the extracted feature points to the gesture recognition and comparison module;
the screening and positioning module is used for receiving the simulated position coordinates of the feature points sent by the gesture feature simulation module, computing the distance between each pair of feature points from those coordinates, determining the distance between the finger-joint feature point and the lip feature point, and sending the inter-point distances to the gesture correction module;
the gesture correction module is used for receiving the inter-point distances sent by the screening and positioning module, selecting the feature points whose mutual distances have changed, counting how many times the distance between any two feature points changes, computing the distance between the finger-joint feature point and the lip feature point, and comparing it with a set standard distance threshold. It then judges whether the absolute difference between that distance and the standard distance threshold lies within a preset interval threshold range; if so, the finger-joint-to-lip distance in the sub-image is corrected to the standard finger-joint-to-lip distance stored for a smoking smoker in the human body gesture model database;
the human body gesture model database is used for storing every gesture feature point of a smoker in the smoking state together with its position, the standard distance threshold and interval threshold between the finger-joint feature point and the lip feature point, and the number of times per unit time that the finger-joint-to-lip distance falls below the standard distance threshold, where different counts correspond to different smoking frequencies and each smoking frequency range has a corresponding smoking addiction coefficient;
the gesture recognition and comparison module is used for receiving the feature points of the sub-images sent by the gesture feature simulation module, comparing the feature points in the sub-images with the gesture feature points in the human body gesture model database one by one, judging whether the feature points in the sub-images are gesture feature points or not, and extracting the gesture feature points in the sub-images.
Further, the gesture evaluation processing module is used for receiving the compared gesture feature points sent by the gesture recognition and comparison module and the corrected finger-joint-to-lip distance sent by the gesture correction module, and judging whether the received gesture feature points include the finger-joint and lip feature points. If they do, it compares the corrected distance between the finger-joint feature point and the lip feature point with the standard distance threshold, accumulates the number of times this distance falls below the standard distance threshold, and compares the accumulated count one by one with the smoking addiction coefficients stored for different counts in the human body gesture model database to obtain the smoker's addiction coefficient.
Further, the system also comprises a display terminal, which receives and displays the smoking count per unit time and the smoking addiction coefficient of the smoker sent by the gesture evaluation processing module.
Further, the simulated position coordinates of a feature point are the number of the sub-image in which the feature point lies together with the distances from the feature point to the four sides of that sub-image.
The invention has the beneficial effects that:
according to the smoking monitoring system based on the posture recognition, the image acquisition module, the acquisition and division module and the posture characteristic simulation module are combined, the characteristic points of the smoker in the acquired image can be extracted and recognized, the distances among the characteristic points in the smoking posture are counted through the screening and positioning module, the posture correction module, the posture recognition and comparison module and the posture evaluation and processing module, the smoking frequency in unit time is judged according to the distances among the characteristic points, the smoking addiction coefficient of the smoker is analyzed, the smoking frequency and the addiction state of the smoker can be conveniently and intuitively known, and the system can monitor the smoking state of the smoker according to the smoking posture of the smoker.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of a smoking monitoring system based on gesture recognition according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the smoking monitoring system based on gesture recognition includes an image acquisition module, an acquisition and division module, a gesture feature simulation module, a screening and positioning module, a gesture correction module, a gesture recognition and comparison module, a human body gesture model database, a gesture evaluation processing module and a display terminal;
the image acquisition module is connected with the acquisition and division module, the posture characteristic simulation module is respectively connected with the acquisition and division module, the screening and positioning module and the posture identification and comparison module, the posture correction module is respectively connected with the screening and positioning module and the human body posture model database, the posture evaluation processing module is respectively connected with the posture identification and comparison module, the posture correction module, the human body posture model database and the display terminal, and the posture identification and comparison module is connected with the human body posture model database.
The image acquisition module comprises several high-definition cameras installed indoors. It acquires image information of personnel in real time, filters the acquired images to retain only those whose sharpness exceeds a preset sharpness threshold so that image clarity is ensured, and sends the retained images to the acquisition and division module;
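The patent does not specify which sharpness ("definition degree") metric is used; a common choice is the mean squared response of a 4-neighbour Laplacian. A minimal pure-Python sketch under that assumption, with hypothetical helper names `sharpness` and `screen_images` and grayscale images given as 2D lists:

```python
def sharpness(img):
    """Mean squared 4-neighbour Laplacian response of a grayscale image.

    `img` is a 2D list of pixel intensities; higher values mean sharper edges.
    """
    h, w = len(img), len(img[0])
    total, count = 0.0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] + img[y][x - 1]
                   + img[y][x + 1] - 4 * img[y][x])
            total += lap * lap
            count += 1
    return total / count if count else 0.0

def screen_images(images, threshold):
    """Keep only images whose sharpness exceeds the preset threshold."""
    return [img for img in images if sharpness(img) > threshold]
```

A perfectly flat image scores 0, while an image containing a hard vertical edge scores high, so thresholding on this metric discards blurred frames.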
the acquisition and division module is used for receiving the images sent by the image acquisition module, dividing each received image into several sub-images of equal size, numbering them 1, 2, …, i, …, n, and sending the divided, ordered sub-images to the gesture feature simulation module in sequence;
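The equal-size grid division and row-major numbering 1 … n described above can be sketched as follows; the patent does not fix the number of rows and columns, so `divide_image` is an illustrative helper that takes them as parameters:

```python
def divide_image(img, rows, cols):
    """Split a grayscale image (2D list) into rows*cols equal sub-images,
    numbered 1..n in row-major order, returned as (number, sub_image) pairs."""
    h, w = len(img), len(img[0])
    sh, sw = h // rows, w // cols  # sub-image height and width
    tiles, number = [], 1
    for r in range(rows):
        for c in range(cols):
            tile = [row[c * sw:(c + 1) * sw]
                    for row in img[r * sh:(r + 1) * sh]]
            tiles.append((number, tile))
            number += 1
    return tiles
```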
the gesture feature simulation module is used for receiving the sub-image information sent by the acquisition and division module, extracting gesture feature points from each received sub-image, computing simulated position coordinates for the extracted feature points, sending those coordinates to the screening and positioning module, and sending the extracted feature points to the gesture recognition and comparison module. The simulated position coordinates of a feature point are the number of the sub-image in which it lies plus the distances from the point to the four sides of that sub-image; the gesture feature points include the wrist joints, elbow joints, finger joints, lips and the like;
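The "simulated position coordinates" defined above (sub-image number plus distances to the sub-image's four sides) can be encoded directly. `simulated_coordinates` below is a hypothetical helper under that reading, taking the point's offset within its sub-image:

```python
def simulated_coordinates(number, point, sub_w, sub_h):
    """Encode a feature point as the patent's 'simulated position coordinates':
    the number of the sub-image it lies in plus its distances to the
    sub-image's four sides (left, right, top, bottom)."""
    x, y = point  # offset of the feature point inside its sub-image
    return {
        "sub_image": number,
        "left": x,
        "right": sub_w - x,
        "top": y,
        "bottom": sub_h - y,
    }
```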
the screening and positioning module is used for receiving the simulated position coordinates of the feature points sent by the gesture feature simulation module, computing the distance between each pair of feature points from those coordinates, determining the distance between the finger-joint feature point and the lip feature point, and sending the inter-point distances to the gesture correction module;
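The patent does not state how inter-point distances are computed from the simulated coordinates. One plausible reading is to recover each point's absolute position from its sub-image number and side distances, then take the Euclidean distance; the sketch below assumes row-major sub-image numbering over `cols` columns:

```python
import math

def global_position(coord, cols, sub_w, sub_h):
    """Recover an absolute pixel position from simulated coordinates,
    assuming sub-images are numbered row-major over `cols` columns."""
    idx = coord["sub_image"] - 1
    ox, oy = (idx % cols) * sub_w, (idx // cols) * sub_h
    return ox + coord["left"], oy + coord["top"]

def feature_distance(a, b, cols, sub_w, sub_h):
    """Euclidean distance between two feature points given as
    simulated position coordinates."""
    ax, ay = global_position(a, cols, sub_w, sub_h)
    bx, by = global_position(b, cols, sub_w, sub_h)
    return math.hypot(ax - bx, ay - by)
```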
the gesture correction module is used for receiving the inter-point distances sent by the screening and positioning module, selecting the feature points whose mutual distances have changed, counting how many times the distance between any two feature points changes, computing the distance between the finger-joint feature point and the lip feature point, and comparing it with the set standard distance threshold. It judges whether the absolute difference between the finger-joint-to-lip distance and the standard distance threshold lies within the preset interval threshold range; if so, the finger-joint-to-lip distance in the sub-image is corrected to the standard finger-joint-to-lip distance stored for a smoking smoker in the human body gesture model database, and the corrected distance is sent to the gesture evaluation processing module;
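The correction rule above — snap the measured knuckle-to-lip distance to the database's standard distance whenever the deviation lies within the interval threshold, otherwise keep the measurement — reduces to a few lines; `correct_distance` is an illustrative name:

```python
def correct_distance(measured, standard, interval):
    """Gesture-correction rule from the description: if the measured
    knuckle-to-lip distance deviates from the database's standard distance
    by no more than the interval threshold, replace it with the standard
    distance; otherwise keep the measurement unchanged."""
    if abs(measured - standard) <= interval:
        return standard
    return measured
```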
the human body gesture model database is used for storing every gesture feature point of a smoker in the smoking state together with its position, the standard distance threshold and interval threshold between the finger-joint feature point and the lip feature point, and the number of times per unit time that the finger-joint-to-lip distance falls below the standard distance threshold, where different counts correspond to different smoking frequencies and each smoking frequency range has a corresponding smoking addiction coefficient;
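The database's count-to-coefficient mapping can be modelled as a small range table. The patent gives no actual ranges or coefficient values, so the numbers below are purely illustrative:

```python
ADDICTION_TABLE = [
    # (min_count, max_count, addiction_coefficient) per unit time;
    # illustrative values only -- the patent leaves the table unspecified.
    (0, 2, 0.2),
    (3, 5, 0.5),
    (6, 10, 0.8),
]

def addiction_coefficient(count):
    """Look up the smoking-addiction coefficient for the number of times
    the knuckle-to-lip distance fell below the standard threshold."""
    for lo, hi, coeff in ADDICTION_TABLE:
        if lo <= count <= hi:
            return coeff
    return 1.0  # counts above the highest stored range
```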
the gesture recognition comparison module is used for receiving the feature points of the sub-images sent by the gesture feature simulation module, comparing the feature points in the sub-images with the gesture feature points in the human body gesture model database one by one, judging whether the feature points in the sub-images are gesture feature points or not, extracting the gesture feature points in the sub-images and sending the extracted gesture feature points to the gesture evaluation processing module;
the gesture evaluation processing module is used for receiving the compared gesture feature points sent by the gesture recognition and comparison module and the corrected finger-joint-to-lip distance sent by the gesture correction module, judging whether the received gesture feature points include the finger-joint and lip feature points, counting the number of times the corrected finger-joint-to-lip distance falls below the standard distance threshold, comparing the accumulated count one by one with the smoking addiction coefficients stored for different counts in the human body gesture model database, and sending the smoker's smoking count per unit time and smoking addiction coefficient to the display terminal.
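The evaluation step — count how often the corrected distance drops below the standard threshold within the unit time, then look up the matching addiction coefficient — can be sketched end to end; the count-to-coefficient dictionary here stands in for the database lookup and is hypothetical:

```python
def evaluate(distances, standard_threshold):
    """Count how often the corrected knuckle-to-lip distance falls below
    the standard distance threshold (one hand-to-mouth motion per
    occurrence, in the patent's terms) and pair the count with an
    addiction coefficient from an illustrative count->coefficient table."""
    table = {0: 0.1, 1: 0.3, 2: 0.5, 3: 0.7}  # hypothetical mapping
    count = sum(1 for d in distances if d < standard_threshold)
    return count, table.get(count, 0.9)
```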
The display terminal is used for receiving the smoking times and the smoking addiction coefficient of the smoker in unit time, which are sent by the posture evaluation processing module, and displaying the smoking times and the smoking addiction coefficient, so that the smoking frequency and the addiction state of the smoker can be conveniently and intuitively known.
The foregoing is merely exemplary and illustrative of the principles of the present invention; various modifications, additions and substitutions to the specific embodiments described herein may be made by those skilled in the art without departing from the principles of the invention or exceeding the scope of the appended claims.

Claims (1)

1. A smoking monitoring system based on gesture recognition, characterized in that it comprises an image acquisition module, an acquisition and division module, a gesture feature simulation module, a screening and positioning module, a gesture correction module, a gesture recognition and comparison module, a human body gesture model database, a gesture evaluation processing module and a display terminal;
the image acquisition module is connected with the acquisition and division module, the posture characteristic simulation module is respectively connected with the acquisition and division module, the screening and positioning module and the posture recognition and comparison module, the posture correction module is respectively connected with the screening and positioning module and the human body posture model database, the posture evaluation processing module is respectively connected with the posture recognition and comparison module, the posture correction module, the human body posture model database and the display terminal, and the posture recognition and comparison module is connected with the human body posture model database;
the image acquisition module is used for acquiring image information of personnel in real time, filtering the acquired images, retaining only images whose sharpness exceeds a preset sharpness threshold, and sending those retained images to the acquisition and division module;
the acquisition and division module is used for receiving the images sent by the image acquisition module, dividing each received image into several sub-images of equal size, numbering them 1, 2, …, i, …, n, and sending the divided, ordered sub-images to the gesture feature simulation module in sequence;
the gesture feature simulation module is used for receiving the sub-image information sent by the acquisition and division module, extracting gesture feature points from each received sub-image, computing simulated position coordinates for the extracted feature points in each sub-image, sending those coordinates to the screening and positioning module, and sending the extracted feature points to the gesture recognition and comparison module;
the screening and positioning module is used for receiving the simulated position coordinates of the feature points sent by the gesture feature simulation module, computing the distance between each pair of feature points from those coordinates, determining the distance between the finger-joint feature point and the lip feature point, and sending the inter-point distances to the gesture correction module;
the gesture correction module is used for receiving the inter-point distances sent by the screening and positioning module, selecting the feature points whose mutual distances have changed, counting how many times the distance between any two feature points changes, computing the distance between the finger-joint feature point and the lip feature point, and comparing it with a set standard distance threshold; it judges whether the absolute difference between that distance and the standard distance threshold lies within a preset interval threshold range, and if so, the finger-joint-to-lip distance in the sub-image is corrected to the standard finger-joint-to-lip distance stored for a smoking smoker in the human body gesture model database;
the human body gesture model database is used for storing every gesture feature point of a smoker in the smoking state together with its position, the standard distance threshold and interval threshold between the finger-joint feature point and the lip feature point, and the number of times per unit time that the finger-joint-to-lip distance falls below the standard distance threshold, where different counts correspond to different smoking frequencies and each smoking frequency range has a corresponding smoking addiction coefficient;
the gesture recognition and comparison module is used for receiving the feature points of the sub-images sent by the gesture feature simulation module, comparing them one by one with the gesture feature points in the human body gesture model database, judging whether they are gesture feature points, and extracting the gesture feature points in the sub-images; the gesture evaluation processing module is used for receiving the compared gesture feature points sent by the gesture recognition and comparison module and the corrected finger-joint-to-lip distance sent by the gesture correction module, judging whether the received gesture feature points include the finger-joint and lip feature points, comparing the corrected finger-joint-to-lip distance with the standard distance threshold, accumulating the number of times this distance falls below the standard distance threshold, and comparing the accumulated count one by one with the smoking addiction coefficients stored for different counts in the human body gesture model database to obtain the smoker's addiction coefficient;
the display terminal is used for receiving and displaying the smoking times and smoking addiction coefficients of the smokers in unit time, which are sent by the posture evaluation processing module;
the simulation position coordinates of the feature points are the number of the sub-image where the current feature point is located and the distance between the feature point and the four sides of the numbered sub-image.
CN201910872752.XA 2019-09-16 2019-09-16 Smoking monitoring system based on gesture recognition Active CN110674717B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910872752.XA CN110674717B (en) 2019-09-16 2019-09-16 Smoking monitoring system based on gesture recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910872752.XA CN110674717B (en) 2019-09-16 2019-09-16 Smoking monitoring system based on gesture recognition

Publications (2)

Publication Number Publication Date
CN110674717A CN110674717A (en) 2020-01-10
CN110674717B true CN110674717B (en) 2022-08-26

Family

ID=69077981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910872752.XA Active CN110674717B (en) 2019-09-16 2019-09-16 Smoking monitoring system based on gesture recognition

Country Status (1)

Country Link
CN (1) CN110674717B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115440015B (en) * 2022-08-25 2023-08-11 深圳泰豪信息技术有限公司 Video analysis method and system capable of being intelligently and safely controlled

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103635594A (en) * 2011-05-09 2014-03-12 Fluidigm Corp. Probe-based nucleic acid detection
CN103860541A (en) * 2007-04-13 2014-06-18 University of Tennessee Research Foundation Selective androgen receptor modulators for treating diabetes
CN104127242A (en) * 2014-07-17 2014-11-05 Neusoft Xikang Health Technology Co., Ltd. Method and device for identifying smoking behavior
CN104641728A (en) * 2012-07-23 2015-05-20 Ricoh Co., Ltd. Device control system, control apparatus and computer-readable medium
CN106133733A (en) * 2014-02-07 2016-11-16 Fred Hutchinson Cancer Research Center Methods, systems, devices and software for use in acceptance and commitment therapy
CN205831907U (en) * 2016-06-08 2016-12-28 Chongqing Jingyu Laser Biology Research Institute Co., Ltd. Smoking apparatus for a laser therapy apparatus articulated arm
CN106535673A (en) * 2013-10-29 2017-03-22 Smoking Observer Co. Smoking cessation device
CN106530730A (en) * 2016-11-02 2017-03-22 Chongqing Zhongke Yuncong Technology Co., Ltd. Traffic violation detection method and system
CN206288491U (en) * 2016-12-22 2017-06-30 Zhanjiang Hancheng Technology Co., Ltd. Easy-access moisture-proof cigarette pack
CN108819900A (en) * 2018-06-04 2018-11-16 Shanghai SenseTime Intelligent Technology Co., Ltd. Vehicle control method and system, vehicle intelligent system, electronic device, medium
CN109299683A (en) * 2018-09-13 2019-02-01 Jiaying University Security assessment system based on face recognition and behavior big data
CN110110710A (en) * 2019-06-03 2019-08-09 Beijing Qitong Intelligent Technology Co., Ltd. Scene abnormality recognition method, system and intelligent terminal
CN110189447A (en) * 2019-05-31 2019-08-30 Anhui Bailuo Intelligent Technology Co., Ltd. Intelligent community access control system based on face recognition
CN110213548A (en) * 2019-07-01 2019-09-06 Nanjing Paiguang Intelligent Perception Information Technology Co., Ltd. Comprehensive monitoring and alarm method for rail train driving behavior

Also Published As

Publication number Publication date
CN110674717A (en) 2020-01-10

Similar Documents

Publication Publication Date Title
CN106251352B (en) A kind of cover defect inspection method based on image procossing
CN108563991A (en) Kitchen fume concentration division methods and oil smoke concentration detection and interference elimination method
US9183431B2 (en) Apparatus and method for providing activity recognition based application service
CN113065474B (en) Behavior recognition method and device and computer equipment
CN106981174A (en) A kind of Falls Among Old People detection method based on smart mobile phone
Kong et al. Fall detection for elderly persons using a depth camera
CN110458126B (en) Pantograph state monitoring method and device
CN106846362A (en) A kind of target detection tracking method and device
CN103456168B (en) A kind of traffic intersection pedestrian behavior monitoring system and method
CN112464797B (en) Smoking behavior detection method and device, storage medium and electronic equipment
CN110674717B (en) Smoking monitoring system based on gesture recognition
Gjoreski et al. Context-based fall detection and activity recognition using inertial and location sensors
CN108760590A (en) A kind of kitchen fume Concentration Testing based on image procossing and interference elimination method
CN109805936B (en) Human body tumbling detection system based on ground vibration signal
CN112990057A (en) Human body posture recognition method and device and electronic equipment
CN112869733A (en) Real-time heart beat interval measuring and calculating method for ballistocardiogram
CN114913598A (en) Smoking behavior identification method based on computer vision
CN116682175A (en) Workshop personnel dangerous behavior detection method under complex environment
CN104392201B (en) A kind of human body tumble recognition methods based on omnidirectional vision
CN113111733B (en) Posture flow-based fighting behavior recognition method
CN107123126A (en) A kind of stream of people's moving scene temperature method of estimation
CN111626273B (en) Fall behavior recognition system and method based on atomic action time sequence characteristics
CN110123328B (en) Breathing frequency detection method based on wireless identification
CN209101365U (en) A kind of cigarette stove all-in-one machine having unmanned identification function
CN113657315B (en) Quality screening method, device, equipment and storage medium for face image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220802

Address after: 310000 room 1203, 12 / F, Yifu science and technology building, East District, China University of metrology, 258 Xueyuan street, Qiantang new area, Hangzhou, Zhejiang

Applicant after: Hangzhou benba Huishi Technology Co.,Ltd.

Address before: Room 1104, 1105 and 1106, 11th floor, R & D building, Institute of technological innovation, Chinese Academy of Sciences (Hefei), northwest corner, intersection of Xiyou road and Shilian South Road, Hefei hi tech Development Zone, Anhui Province 230000

Applicant before: HEFEI ZHONGKE BENBA TECHNOLOGY CO.,LTD.

GR01 Patent grant