CN113126764A - Personal water volume detection method based on smart watch - Google Patents

Personal water volume detection method based on smart watch

Info

Publication number
CN113126764A
Authority
CN
China
Prior art keywords
water
sound wave
time
decibel
water consumption
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110437755.8A
Other languages
Chinese (zh)
Other versions
CN113126764B (en)
Inventor
朱永楠
赵勇
李溦
师林蕊
何国华
姜珊
王丽珍
王庆明
李海红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Institute of Water Resources and Hydropower Research
Original Assignee
China Institute of Water Resources and Hydropower Research
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Institute of Water Resources and Hydropower Research filed Critical China Institute of Water Resources and Hydropower Research
Priority to CN202110437755.8A priority Critical patent/CN113126764B/en
Publication of CN113126764A publication Critical patent/CN113126764A/en
Application granted granted Critical
Publication of CN113126764B publication Critical patent/CN113126764B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01F MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
    • G01F 1/00 Measuring the volume flow or mass flow of fluid or fluent solid material wherein the fluid passes through a meter in a continuous flow
    • G01F 1/66 Measuring the volume flow or mass flow of fluid or fluent solid material wherein the fluid passes through a meter in a continuous flow by measuring frequency, phase shift or propagation time of electromagnetic or other waves, e.g. using ultrasonic flowmeters
    • G01F 1/666 Measuring the volume flow or mass flow of fluid or fluent solid material wherein the fluid passes through a meter in a continuous flow by measuring frequency, phase shift or propagation time of electromagnetic or other waves, e.g. using ultrasonic flowmeters by detecting noise and sounds generated by the flowing fluid
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/903 Querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/14 Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/15 Correlation function computation including computation of convolution operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 20/00 Water conservation; Efficient water supply; Efficient water use

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Algebra (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Fluid Mechanics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Measurement Of Mechanical Vibrations Or Ultrasonic Waves (AREA)

Abstract

The invention discloses a personal water consumption detection method based on a smart watch. Sensors of the smart watch acquire body-movement and hand-motion information; when motion is detected, a collection and storage system is started and information factor vectors such as sound wave frequency, decibel level and time are collected and stored. The Fourier transform of the collected and stored information is compared with the information in a database using a machine learning method, the individual's water use behavior is identified, and finally the water consumption is calculated. The invention determines whether water use behavior has occurred through a water use judging mechanism, can accurately determine whether water consumption has occurred, and compiles statistics on the water consumption, thereby realizing time-based water consumption statistics.

Description

Personal water volume detection method based on smart watch
Technical Field
The invention belongs to the field of water consumption monitoring, and particularly relates to a personal water consumption detection method based on an intelligent watch.
Background
At present, the domestic water use of urban residents in China is basically metered by water meters. However, the existing water meter can only record the total domestic water consumption and cannot attribute consumption to specific water use times, individuals or time periods.
With the rapid development of the economy and the continuous improvement of living standards, the portable smart watch has become an indispensable electronic accessory in many people's lives. Recording personal water use behavior and water consumption through the smart watch fills the gap left by present-day water meters, which can only record a total reading and cannot capture personal water use behavior and volume in detail. Scientifically understanding water use behavior and mastering the consumption of the various water use types can provide data support for water-saving research and make it easier to propose targeted water-saving suggestions and measures, improving the domestic water use efficiency of urban residents while restraining the excessive growth of domestic water use and promoting the construction of a water-saving society in China.
Disclosure of Invention
Aiming at the above defects in the prior art, the personal water consumption detection method based on a smart watch provided by the invention solves the problems existing in the prior art.
In order to achieve the purpose of the invention, the invention adopts the technical scheme that: a personal water consumption detection method based on a smart watch comprises the following steps:
s1, acquiring the state, time, sound wave decibel information and sound wave frequency information of the three-axis gyroscope based on the smart watch;
s2, judging whether water using behavior occurs or not according to the state of the three-axis gyroscope, sound wave decibel information and sound wave frequency information, if yes, entering a step S3, and if not, returning to the step S1;
s3, acquiring a characteristic matrix diagram through the time, the sound wave decibel information and the sound wave frequency information;
s4, inputting the characteristic matrix diagram into a Convolutional Neural Network (CNN) to obtain a water use behavior type;
and S5, constructing a linear relation curve by combining decibels and time, and acquiring a water consumption detection result according to the linear relation curve corresponding to the water consumption behavior type.
Further, the step S2 is specifically:
s2.1, judging whether a smart watch user generates hand movement or not according to the state of the three-axis gyroscope, if so, entering the step S2.2, and otherwise, repeating the step S2.1;
s2.2, judging whether sound wave decibel information and sound wave frequency information exist or not, if so, entering a step S2.3, and otherwise, returning to the step S2.1;
and S2.3, judging whether the sound wave decibel information is between 50 and 90, if so, primarily judging the water using behavior, and entering the step S3, otherwise, returning to the step S1.
Further, the step S3 is specifically:
S3.1, according to the time, the sound wave decibel information and the sound wave frequency information, constructing a Fourier transform function as follows:
F(ω) = ∫_{−∞}^{+∞} f(t)·e^(−iωt) dt
wherein ω represents the sound wave frequency information, t represents time, e represents the natural constant, i represents the imaginary unit, f(t) represents the signal as a function of time t, and F(ω) represents its transform as a function of the sound wave frequency information ω;
S3.2, acquiring a Fourier function of time on the basis of the Fourier transform function;
S3.3, converting the time-dependent Fourier function through Euler's formula to obtain a characteristic matrix of the sound wave changing over time;
and S3.4, arranging the sound wave signals into a characteristic matrix diagram along the two dimensions of time and frequency domain according to the characteristic matrix obtained in step S3.3.
Further, the step S3.2 is specifically:
S3.2.1, according to the non-periodic variation and the periodic variation of the water flow, decomposing the Fourier transform function into a non-periodic Fourier transform function f_np(t) and a periodic Fourier transform function f_p(t);
S3.2.2, according to f_np(t), f_p(t) and t, plotting the Fourier function over the time t.
Further, the non-periodic Fourier transform function f_np(t) in step S3.2.1 is specifically:
f_np(t) = lim_{T'→∞} Σ_{n=−∞}^{+∞} [ (1/T') ∫_{−T'/2}^{+T'/2} f(τ)·e^(−inωτ) dτ ] · e^(inωt)
where T' represents the period approaching infinity, used for the calculation of the non-periodic variation function, n represents a positive real number, and e^(inωt) represents a complex number;
the periodic Fourier transform function f_p(t) in step S3.2.1 is specifically:
f_p(t) = Σ_{n=−∞}^{+∞} [ (1/T) ∫_0^T f(τ)·e^(−inωτ) dτ ] · e^(inωt)
e^(inωt) = cos(nωt) + i·sin(nωt)
e^(−inωt) = cos(nωt) − i·sin(nωt)
where T represents the time period.
Further, the feature matrix of the time-varying acoustic wave in step S3.3 is specifically:
Fn = ( ξ^(j·k) )_{j,k = 0, 1, ..., n−1}, i.e. the n×n matrix whose entry in row j and column k is ξ^(j·k)
xi represents the conjugate form of a complex number n-order unit root, and Fn represents a sound wave characteristic matrix changing along with the order n;
the xi is specifically as follows:
ξ = e^(−2πi/n)
Further, the characteristic matrix diagram in step S3.4 is specifically:
y = f(u) = f( Σ_{i=1}^{n} ω_i·x_i − θ )
wherein y represents the input of the convolutional neural network CNN, f(u) represents the obtained characteristic matrix diagram, θ represents the set threshold value, ω_i represents the weight of the connections between neurons in the convolutional neural network CNN, and x_i represents the input vector, i.e. the characteristic matrix Fn, m = 1, 2, ..., n; the result is the characteristic matrix diagram obtained from the time and frequency domains.
Further, the convolutional neural network CNN in step S4 includes an input layer, a first convolutional layer, a first pooling layer, a second convolutional layer, a second pooling layer, a first fully-connected layer, a second fully-connected layer, and an output layer, which are connected in sequence.
Further, the step S5 is specifically:
s5.1, setting sound wave decibel intervals and water consumption quota of water used for flushing a toilet, a faucet and a shower;
s5.2, constructing a linear relation curve according to the decibel interval of the sound wave and the water quota and in combination with the duration time of the sound wave;
and S5.3, obtaining a water consumption detection result according to the linear relation curve corresponding to the water consumption behavior type and the decibel of the water used at the time.
Further, the step S5.1 specifically includes: the decibel interval of the sound wave of the water for flushing the toilet is 70-85 db, and the water quota is 6-12 liters/time; the decibel interval of the water sound wave for the faucet is 60-75 db, and the water consumption quota is 0.1-0.3 liter/second; the decibel interval of the water sound wave for shower is 45-75 db, and the water quota is 0.1-0.3 liter/second;
the linear relation curve in the step S5.2 is specifically as follows:
Q_inst = k·t·q
k = (DB_inst − DB_min) / (DB_max − DB_min)
Q_total = Σ Q_inst
wherein Q_inst represents the amount of water used as a function of the sound wave over an instantaneous time; k represents a sound coefficient related to the decibel level, converted through a dimensionless transformation into a coefficient between 0 and 1, where the larger the decibel level, the larger the water consumption; DB_inst represents the sound wave decibel level over an instantaneous time, DB_max represents the maximum sound wave decibel level during the water use process, and DB_min represents the minimum sound wave decibel level during the water use process; t represents the short duration varying with the sound wave; q represents the water use quota; and Q_total represents the cumulative water usage over the total water use time period t.
The invention has the beneficial effects that:
(1) The personal water consumption detection method based on a smart watch disclosed by the invention uses the sensors of the smart watch to acquire body-movement and hand-motion information; when motion is detected, the collection and storage system is started, and information factor vectors such as sound wave, decibel level and time are collected and stored. The Fourier transform of the collected and stored information is compared with the information in the database using a machine learning method, the individual's water use behavior is identified, and finally the water consumption is calculated.
(2) The invention determines whether water use behavior has occurred through a water use judging mechanism, can accurately determine whether water consumption has occurred, and compiles statistics on the water consumption, thereby realizing time-based water consumption statistics.
Drawings
Fig. 1 is a flow chart of a personal water consumption detection method based on a smart watch according to the present invention.
Fig. 2 is a schematic structural diagram of a convolutional neural network CNN in the present invention.
Detailed Description
The following description of the embodiments of the present invention is provided to facilitate understanding of the present invention by those skilled in the art, but it should be understood that the present invention is not limited to the scope of these embodiments; to those of ordinary skill in the art, various changes are apparent within the spirit and scope of the invention as defined in the appended claims, and all matter produced using the inventive concept of the present invention is protected.
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Example 1
As shown in fig. 1, a personal water usage amount detection method based on a smart watch includes the following steps:
s1, acquiring the state, time, sound wave decibel information and sound wave frequency information of the three-axis gyroscope based on the smart watch;
s2, judging whether water using behavior occurs or not according to the state of the three-axis gyroscope, sound wave decibel information and sound wave frequency information, if yes, entering a step S3, and if not, returning to the step S1;
s3, acquiring a characteristic matrix diagram through the time, the sound wave decibel information and the sound wave frequency information;
s4, inputting the characteristic matrix diagram into a Convolutional Neural Network (CNN) to obtain a water use behavior type;
and S5, constructing a linear relation curve by combining decibels and time, and acquiring a water consumption detection result according to the linear relation curve corresponding to the water consumption behavior type.
Water usage activity types include toilet flushing, faucet, and shower.
The step S2 specifically includes:
s2.1, judging whether a smart watch user generates hand movement or not according to the state of the three-axis gyroscope, if so, entering the step S2.2, and otherwise, repeating the step S2.1;
s2.2, judging whether sound wave decibel information and sound wave frequency information exist or not, if so, entering a step S2.3, and otherwise, returning to the step S2.1;
and S2.3, judging whether the sound wave decibel information is between 50 and 90, if so, primarily judging the water using behavior, and entering the step S3, otherwise, returning to the step S1.
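By way of illustration, a minimal Python sketch of this pre-judgment logic (steps S2.1-S2.3) is given below. The gyroscope motion threshold and the sampling interface are illustrative assumptions; only the 50-90 dB decibel window comes from the method itself:

```python
import numpy as np

MOTION_THRESHOLD = 0.5   # rad/s, assumed threshold for "hand movement" on the gyroscope
DB_MIN, DB_MAX = 50, 90  # decibel window for the preliminary water-use judgment (step S2.3)

def hand_is_moving(gyro_xyz):
    """S2.1: treat the wrist as moving if the angular-rate magnitude exceeds a threshold."""
    return np.linalg.norm(gyro_xyz) > MOTION_THRESHOLD

def water_use_suspected(gyro_xyz, decibel, frequency_hz):
    """Steps S2.1-S2.3: preliminary judgment of whether water-use behavior may be occurring."""
    if not hand_is_moving(gyro_xyz):             # S2.1: no hand movement -> keep sampling
        return False
    if decibel is None or frequency_hz is None:  # S2.2: no sound information -> not water use
        return False
    return DB_MIN <= decibel <= DB_MAX           # S2.3: decibel level inside the 50-90 dB window

# Example: a moving wrist plus a 68 dB, 1.2 kHz sound passes the preliminary check
print(water_use_suspected(np.array([0.1, 0.7, 0.2]), decibel=68, frequency_hz=1200.0))
```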
The step S3 specifically includes:
S3.1, according to the time, the sound wave decibel information and the sound wave frequency information, constructing a Fourier transform function as follows:
F(ω) = ∫_{−∞}^{+∞} f(t)·e^(−iωt) dt
wherein ω represents the sound wave frequency information, t represents time, e represents the natural constant, i represents the imaginary unit, f(t) represents the signal as a function of time t, and F(ω) represents its transform as a function of the sound wave frequency information ω;
S3.2, acquiring a Fourier function of time on the basis of the Fourier transform function;
S3.3, converting the time-dependent Fourier function through Euler's formula to obtain a characteristic matrix of the sound wave changing over time;
and S3.4, arranging the sound wave signals into a characteristic matrix diagram along the two dimensions of time and frequency domain according to the characteristic matrix obtained in step S3.3.
The step S3.2 is specifically as follows:
S3.2.1, according to the non-periodic variation and the periodic variation of the water flow, decomposing the Fourier transform function into a non-periodic Fourier transform function f_np(t) and a periodic Fourier transform function f_p(t);
S3.2.2, according to f_np(t), f_p(t) and t, plotting the Fourier function over the time t.
The non-periodic Fourier transform function f_np(t) in step S3.2.1 is specifically:
f_np(t) = lim_{T'→∞} Σ_{n=−∞}^{+∞} [ (1/T') ∫_{−T'/2}^{+T'/2} f(τ)·e^(−inωτ) dτ ] · e^(inωt)
e^(inωt) = cos(nωt) + i·sin(nωt)
e^(−inωt) = cos(nωt) − i·sin(nωt)
where T' represents the period approaching infinity, used for the calculation of the non-periodic variation function, n represents a positive real number, and e^(inωt) represents a complex number; n is the complex order in the Fourier transform, and the result obtained when the transform is carried out using Euler's formula is only a mathematical expression that facilitates the mathematical analysis of the signal and has no substantive significance.
The periodic Fourier transform function f_p(t) in step S3.2.1 is specifically:
f_p(t) = Σ_{n=−∞}^{+∞} [ (1/T) ∫_0^T f(τ)·e^(−inωτ) dτ ] · e^(inωt)
where T represents the time period.
The characteristic matrix of the time-varying acoustic wave in step S3.3 is specifically:
Fn = ( ξ^(j·k) )_{j,k = 0, 1, ..., n−1}, i.e. the n×n matrix whose entry in row j and column k is ξ^(j·k)
xi represents the conjugate form of a complex number n-order unit root, and Fn represents a sound wave characteristic matrix changing along with the order n;
the xi is specifically as follows:
ξ = e^(−2πi/n)
the characteristic matrix diagram in step S3.4 is specifically:
Figure BDA0003033734140000083
wherein y represents the input of the convolutional neural network CNN, f (u) represents the obtained characteristic matrix diagram, theta represents the set threshold value, and omegaiWeights, x, representing connections between neurons in a convolutional neural network CNN networkiRepresents the input vector, i.e., the feature matrix Fn, m 1,2,., n,
Figure BDA0003033734140000084
a feature matrix map obtained from time and frequency domain is shown.
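As an illustration of steps S3.1-S3.4, the sketch below builds a time-frequency characteristic matrix diagram from a microphone signal by applying the discrete Fourier transform to successive time windows (a short-time spectrogram). The window length, hop size and the synthetic test signal are assumptions made for the example, not values taken from the patent:

```python
import numpy as np

def characteristic_matrix(signal, window=256, hop=128):
    """Arrange the sound signal along the two dimensions of time and frequency (step S3.4).

    Each column is the magnitude of the DFT of one time window, so rows index
    frequency and columns index time, giving the characteristic matrix diagram
    that is later fed to the CNN.
    """
    frames = [signal[i:i + window]
              for i in range(0, len(signal) - window + 1, hop)]
    # DFT of every window; keep magnitudes of the non-negative frequencies only
    spectra = [np.abs(np.fft.rfft(f * np.hanning(window))) for f in frames]
    return np.stack(spectra, axis=1)          # shape: (window//2 + 1, n_windows)

# Synthetic example: a 1 kHz tone with noise, standing in for running-water sound
fs = 8000
t = np.arange(0, 2.0, 1.0 / fs)
audio = np.sin(2 * np.pi * 1000 * t) + 0.3 * np.random.randn(t.size)
F = characteristic_matrix(audio)
print(F.shape)   # frequency bins x time windows
```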
As shown in fig. 2, the convolutional neural network CNN in step S4 includes an input layer, a first convolutional layer, a first pooling layer, a second convolutional layer, a second pooling layer, a first fully-connected layer, a second fully-connected layer, and an output layer, which are connected in sequence.
The step S5 specifically includes:
s5.1, setting sound wave decibel intervals and water consumption quota of water used for flushing a toilet, a faucet and a shower;
s5.2, constructing a linear relation curve according to the decibel interval of the sound wave and the water quota and in combination with the duration time of the sound wave;
and S5.3, obtaining a water consumption detection result according to the linear relation curve corresponding to the water consumption behavior type and the decibel of the water used at the time.
The step S5.1 specifically comprises the following steps: the decibel interval of the sound wave of the water for flushing the toilet is 70-85 db, and the water quota is 6-12 liters/time; the decibel interval of the water sound wave for the faucet is 60-75 db, and the water consumption quota is 0.1-0.3 liter/second; the decibel interval of the water sound wave for shower is 45-75 db, and the water quota is 0.1-0.3 liter/second;
the linear relation curve in the step S5.2 is specifically as follows:
Qinstant heating device=k·t·q
Figure BDA0003033734140000091
Figure BDA0003033734140000092
wherein Q_inst represents the amount of water used as a function of the sound wave over an instantaneous time; k represents a sound coefficient related to the decibel level, converted through a dimensionless transformation into a coefficient between 0 and 1, where the larger the decibel level, the larger the water consumption; DB_inst represents the sound wave decibel level over an instantaneous time, DB_max represents the maximum sound wave decibel level during the water use process, and DB_min represents the minimum sound wave decibel level during the water use process; t represents the short duration varying with the sound wave; q represents the water use quota; and Q_total represents the cumulative water usage over the total water use time period t.
The invention has the beneficial effects that:
(1) The personal water consumption detection method based on a smart watch disclosed by the invention uses the sensors of the smart watch to acquire body-movement and hand-motion information; when motion is detected, the collection and storage system is started, and information factor vectors such as sound wave, decibel level and time are collected and stored. The Fourier transform of the collected and stored information is compared with the information in the database using a machine learning method, the individual's water use behavior is identified, and finally the water consumption is calculated.
(2) The invention determines whether water use behavior has occurred through a water use judging mechanism, can accurately determine whether water consumption has occurred, and compiles statistics on the water consumption, thereby realizing time-based water consumption statistics.
Example 2
A personal water consumption detection method based on a smart watch comprises the following steps:
a1, mainly using an information recording module to obtain information factor vectors such as body movement behaviors, hand movements, sound waves, decibels and time through a sensor carried by a watch;
A2, mainly an information pre-judging module, which makes a preliminary conditional judgment on the collected and stored information factor vectors such as motion behavior, hand motion, sound wave, decibel level and time. When water use behavior occurs, action information, sound wave information and decibel information are generated, so it must first be judged whether the motion factor vector, the sound wave source and the decibel level fall within preset threshold ranges. If they do, the collection and storage system is started, the information factor vectors of sound wave, decibel level, time and the like are collected and stored, and step A3 is entered to judge the water use behavior type; if not, the next step of water use analysis and calculation is not carried out. In this embodiment, when the body and the wrist move and the sound level is between 50 and 90 dB, the method continues to step A3 to judge the water use behavior type; otherwise the information record is discarded and does not take part in the subsequent calculation and analysis of water use behavior;
a3, carrying out Fourier transform on time domain and frequency domain of each factor vector, constructing a time sequence matrix, analyzing a characteristic matrix of the time sequence matrix, comparing the characteristic matrix with data in a database by adopting a machine learning method, and judging water consumption behavior;
a4, constructing a linear relation curve according to decibels and time according to different water using behaviors, and calculating water using amount, wherein the water using amount is in positive correlation with the time and the decibels, and the longer the time is, the larger the decibel is, the larger the water using amount is;
a5, showing water consumption and length of water consumption under different water consumption behaviors.
The step a2 further includes:
A21, when water use behavior occurs, action information and sound wave decibel information are generated at the same time. For example, when washing hands under a running faucet, the wrist motion changes and at the same time the sound of the flowing water produces sound waves and decibels; when flushing a toilet, the wrist motion changes and the sound of the flushing water likewise produces sound waves and decibels. The system is therefore set so that, in the state where motion information (human activity) exists, the collection and storage system is started and the information factor vectors such as sound wave, decibel level and time are collected and stored. When no action occurs, the water use behavior at that moment cannot be judged and the generated information is useless for water use analysis, so the information record is discarded and does not take part in the next step of water use behavior calculation and analysis.
A22, since the preliminary analysis of A21 shows that water use behavior necessarily occurs under the condition that motion is combined with sound waves and decibels, the system records the information factor vectors of motion behavior, hand motion, sound wave, decibel level, time and the like. It first enters step t1 and judges from the three-axis sensor and the gyroscope whether the body is in a state of motion; when the body is moving, water use behavior may be occurring and the method continues to the next step t2, otherwise the behavior is initially judged to be non-water-use behavior and the recorded information does not take part in the calculation and analysis of water use behavior.
A23, when the requirement of t1 is met, entering a t2 step, judging whether the hand is in a motion state or not according to the gesture sensor, when the wrist is in the motion state, judging that the water using behavior is possible, continuing to perform the next step t3, otherwise, judging the hand to be the non-water using behavior, discarding the information record, and not participating in the calculation and analysis of the water using behavior.
A24, when the requirement of t2 is met, entering a t3 step, judging whether sound decibel information is between 50 and 90 according to the fact that the sound decibel is usually between 50 and 90 when the water using behavior for the experiment occurs, judging that the water using behavior is possible when the sound decibel information is between 50 and 90, at the moment, primarily judging that the behavior is the water using behavior, entering a step A3 to carry out more accurate type analysis on the water using behavior, and otherwise, primarily judging that the behavior is the non-water using behavior, discarding information records and not participating in calculation and analysis of the water using behavior;
a25 and step A2 primarily judge whether the behavior is water using behavior through information, and the accurate analysis of the water using behavior in the step A3 is carried out to be paved, so that the working efficiency can be effectively improved, and the memory of a chip is not occupied.
The step a3 further includes:
A31, after the preliminary judgment of water use behavior in step A2, the specific water use behavior, such as faucet, toilet or shower use, still needs to be identified. Different water use behaviors have different water use efficiencies, so accurately identifying the behavior lays the groundwork for calculating its water consumption. The collected factor vectors of time (t), decibel level (DB), sound wave frequency (ω) and the like are subjected to Fourier transformation and machine learning calculation, their characteristic matrix is analyzed and compared with the water use behavior data in the database, and the water use behavior is finally judged and identified.
A32, the Fourier transform treats a seemingly disordered signal as a combination of basic sine (cosine) signals of given amplitude, phase and frequency; its purpose here is to find the frequencies of the similar sine (cosine) components of comparable amplitude that occur during the water use process. The Fourier transform equation is shown in (1):
F(ω) = ∫_{−∞}^{+∞} f(t)·e^(−iωt) dt    (1)
where ω is the sound wave frequency, t is the time, e^(−iωt) is the complex exponential factor, f(t) is a function of time t, and F(ω) is the Fourier function of the sound wave frequency ω.
The transformed Fourier function is considered to contain a plurality of frequency components, and an arbitrary function f(t) can be synthesized by adding a number of periodic and non-periodic functions. Taking hand washing as an example, the sound wave frequency during water use includes both periodic and non-periodic variation: while the faucet is being turned on and the water flow adjusted to a suitable amount, the sound wave varies non-periodically; while the water flows steadily at the adjusted rate, the sound wave varies periodically; and during actual use of the water, hand movements in the stream again make the sound wave vary non-periodically. The Fourier transform formula (1) can accordingly be split into periodic and non-periodic variations.
Firstly, when the water faucet is turned on and screwed to a specific position, the water flow changes all the time, the sound wave changes along with the change, the sound wave frequency omega changes in a non-periodic manner in the time period, and the formula can be converted into a formula (2):
f(t) = lim_{T→∞} Σ_{n=−∞}^{+∞} [ (1/T) ∫_{−T/2}^{+T/2} f(τ)·e^(−inωτ) dτ ] · e^(inωt)    (2)
where T represents the period approaching infinity, used for the calculation of the non-periodic variation function, n represents a positive real number (the complex order n in the Fourier transform), and e^(inωt) represents a complex number;
secondly, when the water faucet is screwed to a specific position, the water flow does not change, the sound wave frequency omega in the time period changes periodically, and when the sound wave frequency omega changes periodically, the formula can be converted into a formula (3):
f(t) = Σ_{n=−∞}^{+∞} [ (1/T) ∫_0^T f(τ)·e^(−inωτ) dτ ] · e^(inωt)    (3)
where T represents a time period.
Fourier transformation is carried out through formulas (2)-(3); after (t, f(t)) is obtained, the Fourier function of time can be plotted, and the function is thereby transformed from the time domain to the frequency domain.
A33, a time-sequence matrix is constructed on the basis of the Fourier transform: conversion through Euler's formula gives the conjugate form ξ of the complex n-th order unit root (formula 5), the sound signals after the Fourier transform are arranged into a two-dimensional characteristic matrix diagram (formula 6) along the two dimensions of time and frequency domain, and the sound wave characteristics are arranged and combined into a series of characteristic matrix diagrams used as the input of the convolutional neural network.
ξ = e^(−2πi/n)    (5)
Fn = ( ξ^(j·k) )_{j,k = 0, 1, ..., n−1}    (6)
where ξ is the conjugate form of the complex n-th order unit root, Fn represents the sound wave characteristic matrix changing with the order n, and the output Fourier transform sequence matrix is related to the sound wave.
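To make formulas (5)-(6) concrete, the sketch below constructs ξ as the conjugate n-th root of unity and an n×n matrix Fn from its powers, and checks that multiplying a windowed sound frame by this matrix reproduces the discrete Fourier transform. Reading Fn as the power matrix (ξ^(j·k)) is an assumption based on the surrounding description, not a formula copied from the patent images:

```python
import numpy as np

def unit_root_matrix(n):
    """Fn: n x n matrix of powers of the conjugate n-th order unit root xi."""
    xi = np.exp(-2j * np.pi / n)           # conjugate form of the complex n-th root of unity
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return xi ** (j * k)                   # entry (j, k) is xi^(j*k)

n = 8
Fn = unit_root_matrix(n)
frame = np.random.randn(n)                 # one windowed sound frame (illustrative)
# Multiplying by Fn gives the same result as the discrete Fourier transform of the frame
print(np.allclose(Fn @ frame, np.fft.fft(frame)))   # True
```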
A34, neural network is a way to realize machine learning, while convolutional neural network is a deep neural network learning method. The Convolutional Neural Network (CNN) is a multilayer neural network, data is input into the network in the form of a characteristic matrix diagram, then convolution and pooling processing are sequentially carried out, the specific process is completed in corresponding convolutional layers and pooling layers, and a local connection and weight sharing mode is adopted among the layers. The convolutional layer is responsible for extracting local features in the matrix, the pooling layer greatly reduces parameter magnitude in a dimension reduction mode, the parameter magnitude is compared with a stored water consumption behavior database, and the full-connection layer is used for outputting a desired result and judging water consumption behavior. The method has great advantages in robustness, self-learning and the like when applied to voice recognition. Compared with the traditional neural network calculation method, the convolutional neural network CNN comprises an input layer, a hidden layer and an output layer as the traditional neural network, but the hidden layer is refined, so that the complex problem is simplified, a large number of parameters are reduced into a small number of parameters for processing, and the result is not influenced.
In the present embodiment, the convolutional neural network CNN mainly includes 8 steps.
First, a feature matrix Fn relating to time and acoustic waves is input.
And secondly, performing a first layer of convolution operation on the matrix.
And thirdly, performing pooling treatment, and keeping main characteristics to reduce dimension and reduce data volume.
And fourthly, performing second-layer convolution operation.
And fifthly, performing secondary pooling treatment, and keeping the main characteristic dimension reduction and data size reduction.
Sixthly, the first layer of full connection layer is compared with the database, a loss function is constructed, and the minimum value of the loss function is found through training.
Seventhly, the second fully connected layer again finds the minimum value, i.e. performs a second optimization of the loss function.
Eighthly, the result is compared with the big data stored in the database, the corresponding water use behavior is matched, and the result is output.
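A minimal sketch of the described network layout (input, two convolution/pooling stages, two fully connected layers, and an output over the three water-use classes) is shown below in PyTorch. The channel counts, kernel sizes and the 129×124 input size are illustrative assumptions; only the layer ordering follows the description:

```python
import torch
import torch.nn as nn

class WaterUseCNN(nn.Module):
    """Input -> conv1 -> pool1 -> conv2 -> pool2 -> fc1 -> fc2 -> output (3 behavior classes)."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),   # first convolution layer
            nn.MaxPool2d(2),                                        # first pooling layer
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),  # second convolution layer
            nn.MaxPool2d(2),                                        # second pooling layer
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64), nn.ReLU(),   # first fully connected layer
            nn.Linear(64, n_classes),       # second fully connected layer -> output layer
        )

    def forward(self, x):                   # x: (batch, 1, freq_bins, time_windows)
        return self.classifier(self.features(x))

model = WaterUseCNN()
logits = model(torch.randn(2, 1, 129, 124))   # two example characteristic matrix diagrams
print(logits.shape)                           # torch.Size([2, 3])
```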
A35, processing information by a convolutional neural network, analyzing a characteristic matrix of the information, comparing the characteristic matrix with data in a database, and calculating the posterior probability of each state by an output layer according to the input characteristic information to finally achieve the aim of identification.
The formula is as follows:
y = f( Σ_{i=1}^{n} w_i·x_i − θ )
wherein: x_i is the i-th input vector in the algorithm, namely the characteristic matrix Fn; w_i is the connection weight between neurons; θ is the threshold value; y is the output of the neuron; and f is the loss function.
The dimensional formulas for the tensors calculated by the convolution and pooling layers are as follows:
Convolution layer: N = (W − K + 2P)/S + 1    (7)
wherein N is the size of the output image, W is the size of the input image, K is the convolution kernel size, P is the amount of padding, and S is the stride.
Pooling layer: N = (W − Ps)/S + 1    (8)
wherein Ps is the size of the pooling window.
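A small worked example of formulas (7)-(8), assuming for illustration a 128-point input, a 5-point convolution kernel with no padding and a stride of 1, followed by a 2-point pooling window with a stride of 2:

```python
def conv_output_size(W, K, P=0, S=1):
    """Formula (7): N = (W - K + 2P) / S + 1 for a convolution layer."""
    return (W - K + 2 * P) // S + 1

def pool_output_size(W, Ps, S):
    """Formula (8): N = (W - Ps) / S + 1 for a pooling layer."""
    return (W - Ps) // S + 1

W = 128
after_conv = conv_output_size(W, K=5, P=0, S=1)       # (128 - 5 + 0) / 1 + 1 = 124
after_pool = pool_output_size(after_conv, Ps=2, S=2)  # (124 - 2) / 2 + 1 = 62
print(after_conv, after_pool)   # 124 62
```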
A36, the model is trained by gradient descent and error back-propagation; since this is not the improvement point of the present application, it is not described in detail here.
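For completeness, a generic gradient-descent / back-propagation training loop over labeled characteristic matrix diagrams is sketched below. It is a standard PyTorch pattern, not the specific training procedure of this application; the stand-in linear classifier, learning rate, batch data and labels are placeholders:

```python
import torch
import torch.nn as nn

# Stand-in classifier over 129 x 124 characteristic matrix diagrams
# (the CNN sketched above could be used instead); labels: 0 = toilet, 1 = faucet, 2 = shower.
model = nn.Sequential(nn.Flatten(), nn.Linear(129 * 124, 3))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)   # plain gradient descent
loss_fn = nn.CrossEntropyLoss()

batch = torch.randn(16, 1, 129, 124)        # 16 labeled characteristic matrix diagrams
labels = torch.randint(0, 3, (16,))         # their water-use behavior labels

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(batch), labels)    # loss against the stored (database) labels
    loss.backward()                         # error back-propagation
    optimizer.step()                        # gradient-descent parameter update
    print(epoch, round(loss.item(), 4))
```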
A37, the method has high behavior recognition precision, and finally judges water using behaviors.
The step a4 further includes:
and A41, accurately judging the water consumption behavior, constructing a linear relation curve by combining decibels and time, and obtaining the water consumption according to the curve.
A42, according to tests, the sound decibels and water consumption quota generated by different water consumption behaviors are different, and common toilet flushing, faucet and shower are taken as examples, wherein the toilet flushing decibel is 70-85 db, and the water consumption quota is 6-12 liters/time; the decibel of the faucet is 60-75 db, and the water consumption quota is 0.1-0.3 liter/second; shower bath is 45-75 db, and water ration is 0.1-0.3 liter/second.
A43, the water consumption is positively correlated with the time and decibel, the longer the time, the larger the water consumption, the larger the decibel, the instantaneous water consumption is calculated by adopting a formula (9), the formula (10) is a volume coefficient calculation method, and the accumulated water consumption in a time period is calculated by adopting a formula (11).
Q_inst = k·t·q    (9)
k = (DB_inst − DB_min) / (DB_max − DB_min)    (10)
Q_total = Σ Q_inst    (11)
wherein Q_inst represents the amount of water used as a function of the sound wave over an instantaneous time; k represents a sound coefficient related to the decibel level, converted through a dimensionless transformation into a coefficient between 0 and 1, where the larger the decibel level, the larger the water consumption; DB_inst represents the sound wave decibel level over an instantaneous time, DB_max represents the maximum sound wave decibel level during the water use process, and DB_min represents the minimum sound wave decibel level during the water use process; t represents the short duration varying with the sound wave; q represents the water use quota; and Q_total represents the cumulative water usage over the total water use time period t.
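The calculation of formulas (9)-(11) can be sketched as follows. The min-max form of the sound coefficient k and the summation in Q_total are reconstructions from the textual description (a coefficient scaled to 0-1 that grows with the decibel level, accumulated over the water-use period), and the sample decibel trace, one-second time step and chosen quota values are illustrative assumptions:

```python
WATER_QUOTAS = {          # water-use quota q per behavior (liters per second), from the ranges above
    "faucet": 0.2,
    "shower": 0.2,
}

def sound_coefficient(db_inst, db_min, db_max):
    """Formula (10): scale the instantaneous decibel level to a coefficient between 0 and 1."""
    return (db_inst - db_min) / (db_max - db_min)

def instantaneous_use(db_inst, db_min, db_max, dt, q):
    """Formula (9): Q_inst = k * t * q for one short time step of length dt."""
    return sound_coefficient(db_inst, db_min, db_max) * dt * q

def total_use(decibel_trace, dt, q):
    """Formula (11): accumulate the instantaneous amounts over the whole water-use period."""
    db_min, db_max = min(decibel_trace), max(decibel_trace)
    if db_max == db_min:                      # constant sound level: k is taken as 1
        return len(decibel_trace) * dt * q
    return sum(instantaneous_use(db, db_min, db_max, dt, q) for db in decibel_trace)

# Example: 60 seconds of faucet use sampled once per second with decibels between 60 and 75 dB
trace = [60 + (i % 16) for i in range(60)]
print(round(total_use(trace, dt=1.0, q=WATER_QUOTAS["faucet"]), 2), "liters")
```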
In this embodiment, a personal water consumption detection system based on a smart watch is also provided, comprising five modules: an information recording module, an information pre-judging module, an information processing module, an information calculating module and an information display module. The information recording module mainly relies on the watch's own sensors, which chiefly include an audio receiver, a three-axis sensor, a gyroscope, a gesture sensor, positioning and the like; the information pre-judging, processing and calculating modules are mainly implemented by a built-in algorithm chip; and the information display module displays results on the watch or a mobile phone.
In the information recording module, several sensors attached to the smart watch, such as the audio receiver, three-axis sensor, gyroscope, gesture sensor and positioning, work simultaneously to record multiple kinds of information: for example, the collected and stored sound signals are recorded in the chip in the form of electrical signals, and the collected and stored body or wrist motion signals are recorded in the chip as changes in the three directions (front-back, up-down and left-right), positions, rates of change of speed and the like;
the information initial judgment module compares the initial judgment factor vector with a preset threshold, preliminarily judges the water consumption behavior when the initial judgment factor vector accords with the preset threshold, collects the stored data, and carries out the next water consumption behavior judgment, data processing and calculation; when the behavior does not meet the set threshold value, the behavior is preliminarily judged to be the non-water-using behavior, data is eliminated, the next calculation is not carried out, the simple preliminary judgment and the non-water-using behavior are eliminated, the working efficiency can be improved, and the internal memory of a chip is not occupied;
the information processing module is used for comparing the initial judgment factor vectors with a preset threshold value, carrying out Fourier change on time domain and frequency domain of each factor vector, constructing a time sequence matrix, adopting a convolutional neural network to process information, analyzing a characteristic matrix of the information, comparing the characteristic matrix with data in a database, training a model by using a gradient descent and error back propagation method through sea-scale labeled data in the database, and finally judging a certain water using behavior, such as water for a faucet, water for flushing a toilet or water for a shower.
The information calculation module is used for calculating the water consumption under different water consumption behaviors by combining decibels and time to construct a linear relation curve, wherein the water consumption and the sound decibels are different, but the longer the time is, the larger the water consumption is, and the larger the decibel is, the larger the water consumption is.
The information display module can display the water use behavior, water use duration and water consumption in real time on the terminal device (watch or mobile phone), for example the duration XX (s) and water consumption XX (m³) of a faucet opened during the XX time period and the average flow rate (m³/s), or the historical record can be queried to obtain the water consumption curve over an accumulated time period.

Claims (10)

1. A personal water quantity detection method based on a smart watch is characterized by comprising the following steps:
s1, acquiring the state, time, sound wave decibel information and sound wave frequency information of the three-axis gyroscope based on the smart watch;
s2, judging whether water using behavior occurs or not according to the state of the three-axis gyroscope, sound wave decibel information and sound wave frequency information, if yes, entering a step S3, and if not, returning to the step S1;
s3, acquiring a characteristic matrix diagram through the time, the sound wave decibel information and the sound wave frequency information;
s4, inputting the characteristic matrix diagram into a Convolutional Neural Network (CNN) to obtain a water use behavior type;
and S5, constructing a linear relation curve by combining decibels and time, and acquiring a water consumption detection result according to the linear relation curve corresponding to the water consumption behavior type.
2. The method for detecting the personal water volume based on the smart watch according to claim 1, wherein the step S2 specifically comprises:
s2.1, judging whether a smart watch user generates hand movement or not according to the state of the three-axis gyroscope, if so, entering the step S2.2, and otherwise, repeating the step S2.1;
s2.2, judging whether sound wave decibel information and sound wave frequency information exist or not, if so, entering a step S2.3, and otherwise, returning to the step S2.1;
and S2.3, judging whether the sound wave decibel information is between 50 and 90, if so, primarily judging the water using behavior, and entering the step S3, otherwise, returning to the step S1.
3. The method for detecting the personal water volume based on the smart watch according to claim 2, wherein the step S3 specifically comprises:
S3.1, according to the time, the sound wave decibel information and the sound wave frequency information, constructing a Fourier transform function as follows:
F(ω) = ∫_{−∞}^{+∞} f(t)·e^(−iωt) dt
wherein ω represents the sound wave frequency information, t represents time, e represents the natural constant, i represents the imaginary unit, f(t) represents the signal as a function of time t, and F(ω) represents its transform as a function of the sound wave frequency information ω;
S3.2, acquiring a Fourier function of time on the basis of the Fourier transform function;
S3.3, converting the time-dependent Fourier function through Euler's formula to obtain a characteristic matrix of the sound wave changing over time;
and S3.4, arranging the sound wave signals into a characteristic matrix diagram along the two dimensions of time and frequency domain according to the characteristic matrix obtained in step S3.3.
4. The personal water volume detection method based on the smart watch according to claim 3, wherein the step S3.2 is specifically as follows:
S3.2.1, according to the non-periodic variation and the periodic variation of the water flow, decomposing the Fourier transform function into a non-periodic Fourier transform function f_np(t) and a periodic Fourier transform function f_p(t);
S3.2.2, according to f_np(t), f_p(t) and t, plotting the Fourier function over the time t.
5. A personal water volume detection method based on a smart watch according to claim 4, wherein the non-periodic Fourier transform function f_np(t) in step S3.2.1 is specifically:
f_np(t) = lim_{T'→∞} Σ_{n=−∞}^{+∞} [ (1/T') ∫_{−T'/2}^{+T'/2} f(τ)·e^(−inωτ) dτ ] · e^(inωt)
e^(inωt) = cos(nωt) + i·sin(nωt)
e^(−inωt) = cos(nωt) − i·sin(nωt)
where T' represents the period approaching infinity, used for the calculation of the non-periodic variation function, n represents a positive real number, and e^(inωt) represents a complex number;
the periodic Fourier transform function f_p(t) in step S3.2.1 is specifically:
f_p(t) = Σ_{n=−∞}^{+∞} [ (1/T) ∫_0^T f(τ)·e^(−inωτ) dτ ] · e^(inωt)
where T represents the time period.
6. A personal water volume detection method based on a smart watch according to claim 5, characterized in that the time-varying acoustic wave feature matrix in step S3.3 is specifically:
Fn = ( ξ^(j·k) )_{j,k = 0, 1, ..., n−1}, i.e. the n×n matrix whose entry in row j and column k is ξ^(j·k)
xi represents the conjugate form of a complex number n-order unit root, and Fn represents a sound wave characteristic matrix changing along with the order n;
the xi is specifically as follows:
ξ = e^(−2πi/n)
7. the personal water volume detection method based on the smart watch according to claim 6, wherein the characteristic matrix diagram in the step S3.4 is specifically as follows:
y = f(u) = f( Σ_{i=1}^{n} ω_i·x_i − θ )
wherein y represents the input of the convolutional neural network CNN, f(u) represents the obtained characteristic matrix diagram, θ represents the set threshold value, ω_i represents the weight of the connections between neurons in the convolutional neural network CNN, and x_i represents the input vector, i.e. the characteristic matrix Fn, m = 1, 2, ..., n; the result is the characteristic matrix diagram obtained from the time and frequency domains.
8. The personal water volume detection method based on the smart watch of claim 7, wherein the convolutional neural network CNN in the step S4 includes an input layer, a first convolutional layer, a first pooling layer, a second convolutional layer, a second pooling layer, a first fully-connected layer, a second fully-connected layer and an output layer, which are connected in sequence.
9. The method for detecting the personal water volume based on the smart watch of claim 8, wherein the step S5 specifically comprises:
s5.1, setting sound wave decibel intervals and water consumption quota of water used for flushing a toilet, a faucet and a shower;
s5.2, constructing a linear relation curve according to the decibel interval of the sound wave and the water quota and in combination with the duration time of the sound wave;
and S5.3, obtaining a water consumption detection result according to the linear relation curve corresponding to the water consumption behavior type and the decibel of the water used at the time.
10. The personal water volume detection method based on the smart watch according to claim 9, wherein the step S5.1 is specifically: the decibel interval of the sound wave of the water for flushing the toilet is 70-85 db, and the water quota is 6-12 liters/time; the decibel interval of the water sound wave for the faucet is 60-75 db, and the water consumption quota is 0.1-0.3 liter/second; the decibel interval of the water sound wave for shower is 45-75 db, and the water quota is 0.1-0.3 liter/second;
the linear relation curve in the step S5.2 is specifically as follows:
Q_inst = k·t·q
k = (DB_inst − DB_min) / (DB_max − DB_min)
Q_total = Σ Q_inst
wherein Q_inst represents the amount of water used as a function of the sound wave over an instantaneous time; k represents a sound coefficient related to the decibel level, converted through a dimensionless transformation into a coefficient between 0 and 1, where the larger the decibel level, the larger the water consumption; DB_inst represents the sound wave decibel level over an instantaneous time, DB_max represents the maximum sound wave decibel level during the water use process, and DB_min represents the minimum sound wave decibel level during the water use process; t represents the short duration varying with the sound wave; q represents the water use quota; and Q_total represents the cumulative water usage over the total water use time period t.
CN202110437755.8A 2021-04-22 2021-04-22 Personal water volume detection method based on smart watch Active CN113126764B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110437755.8A CN113126764B (en) 2021-04-22 2021-04-22 Personal water volume detection method based on smart watch

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110437755.8A CN113126764B (en) 2021-04-22 2021-04-22 Personal water volume detection method based on smart watch

Publications (2)

Publication Number Publication Date
CN113126764A true CN113126764A (en) 2021-07-16
CN113126764B CN113126764B (en) 2023-02-24

Family

ID=76779244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110437755.8A Active CN113126764B (en) 2021-04-22 2021-04-22 Personal water volume detection method based on smart watch

Country Status (1)

Country Link
CN (1) CN113126764B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102912829A (en) * 2012-10-15 2013-02-06 吉林大学 Single-point-sensing-based household water intelligent monitoring method and equipment thereof
CN205506127U (en) * 2016-03-25 2016-08-24 荆门正源光华管业有限公司 Intelligent water gauge synchronous with intelligent wrist -watch
CN105913568A (en) * 2016-06-19 2016-08-31 新疆帕特尔信息科技有限公司 Electric-mechanical-well well-electricity-dual-control-type intelligent water distribution control device and method
CN205845282U (en) * 2016-04-15 2016-12-28 青岛理工大学 Feedwater detecting system
CN109711324A (en) * 2018-12-24 2019-05-03 南京师范大学 Human posture recognition method based on Fourier transformation and convolutional neural networks
CN110069199A (en) * 2019-03-29 2019-07-30 中国科学技术大学 A kind of skin-type finger gesture recognition methods based on smartwatch
CN111625763A (en) * 2020-05-27 2020-09-04 郑州航空工业管理学院 Operation risk prediction method and prediction system based on mathematical model
CN112560688A (en) * 2020-12-17 2021-03-26 南京航空航天大学 Daily water intake estimation system and method based on motion sensor signals


Also Published As

Publication number Publication date
CN113126764B (en) 2023-02-24

Similar Documents

Publication Publication Date Title
Chen et al. A deep learning approach to human activity recognition based on single accelerometer
CN105447304A (en) Self-learning algorithm based warning system and mobile terminal
CN104123007B (en) Multidimensional weighted 3D recognition method for dynamic gestures
Zhan et al. Wearable sensor-based human activity recognition from environmental background sounds
CN107773214A (en) A kind of method, computer-readable medium and the system of optimal wake-up strategy
CN104615983A (en) Behavior identification method based on recurrent neural network and human skeleton movement sequences
CN103984416A (en) Gesture recognition method based on acceleration sensor
CN108600965B (en) Passenger flow data prediction method based on guest position information
CN115199240B (en) Shale gas well yield prediction method, shale gas well yield prediction device and storage medium
CN112945162B (en) Accumulation layer landslide displacement prediction model and prediction method
CN110443309A (en) A kind of electromyography signal gesture identification method of combination cross-module state association relation model
CN110491373A (en) Model training method, device, storage medium and electronic equipment
CN114485916B (en) Environmental noise monitoring method and system, computer equipment and storage medium
CN103116648B (en) Vocabulary memorization method and device thereof based on diagram form context of co-text and machine learning
CN110147163A (en) The eye-tracking method and system of the multi-model fusion driving of facing mobile apparatus
CN111772669A (en) Elbow joint contraction muscle force estimation method based on adaptive long-time and short-time memory network
CN114572229A (en) Vehicle speed prediction method, device, medium and equipment based on graph neural network
CN111291713B (en) Gesture recognition method and system based on skeleton
CN104463916A (en) Eye movement fixation point measurement method based on random walk
CN111291804A (en) Multi-sensor time series analysis model based on attention mechanism
CN113126764B (en) Personal water volume detection method based on smart watch
CN118070049A (en) Step type landslide displacement prediction method and system under mechanism guidance
CN114550299A (en) System and method for evaluating daily life activity ability of old people based on video
CN207096984U (en) A kind of chemical illumination immunity analysis instrument inspection data inquiry unit
CN110163419A (en) A kind of method of middle and small river river basin flood forecast

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant