CN106709401A - Diet information monitoring method and device - Google Patents

Info

Publication number
CN106709401A
Authority
CN
China
Prior art keywords
specific user
hand
diet
image object
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510778853.2A
Other languages
Chinese (zh)
Inventor
刘俊萍
宛海涛
范晓晖
薛峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN201510778853.2A priority Critical patent/CN106709401A/en
Publication of CN106709401A publication Critical patent/CN106709401A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The invention discloses a diet information monitoring method and device. The method comprises the steps of: detecting hand motion information of a specific user; collecting hand images of the specific user if the hand motion information meets a preset diet gesture recognition algorithm; judging, based on the hand images, whether the specific user is in a diet state, and forming a judgment result; and transmitting the judgment result to a remote device.

Description

Diet information monitoring method and device
Technical field
The present invention relates to the field of information processing, and in particular to a diet information monitoring method and device.
Background technology
With the development of society, people pay increasing attention to health. For example, adult children often worry about the health of elderly parents who live alone, or about the well-being of their own children at kindergarten or school, and in particular about whether these people are eating properly. In the prior art, a user may learn about some activities of such people through surveillance equipment installed at home or at school, and thereby roughly estimate their dietary health. However, the accuracy and precision of such monitoring are very low, and it cannot provide information that accurately and effectively reflects aspects such as the diet of these people.
Summary of the invention
In view of this, the embodiments of the present invention are intended to provide a diet information monitoring method and device that at least solve the problem of low accuracy in monitoring the diet information of a specific user.
To achieve the above purpose, the technical solutions of the embodiments of the present invention are realized as follows:
A first aspect of the embodiments of the present invention provides a diet information monitoring method, the method including:
detecting hand motion information of a specific user;
if the hand motion information meets a preset diet gesture recognition algorithm, collecting hand images of the specific user;
judging, based on the hand images, whether the specific user is in a diet state, and forming a judgment result;
sending the judgment result to a remote device.
Based on the above solution, detecting the hand motion information of the specific user includes:
detecting the hand motion information with a hand-worn wearable device;
and collecting the hand images of the specific user includes:
collecting diet image data of the specific user with the hand-worn wearable device.
Based on the above solution, judging, based on the hand images, whether the specific user is in a diet state and forming a judgment result includes:
judging whether a specified image object is present in the hand images, the specified image object including at least one of eating utensils, drinks and food;
if the hand images contain the specified image object, determining that the specific user is in a diet state;
if the hand images do not contain the specified image object, determining that the specific user is in a non-diet state.
Based on the above solution, if the hand images contain the specified image object, determining that the specific user is in a diet state includes:
if the specified image object in the hand images appears at the mouth of the specific user, determining that the specific user is in the diet state.
Based on the above solution, if the specified image object in the hand images appears at the mouth of the specific user, determining that the specific user is in the diet state includes:
if the probability that the specified image object appears at the mouth of the specific user in the hand images is greater than a specified probability, determining that the specific user is in a normal diet state.
A second aspect of the embodiments of the present invention provides a diet information monitoring device, the device including:
a detection unit configured to detect hand motion information of a specific user;
a collecting unit configured to collect hand images of the specific user if the hand motion information meets a preset diet gesture recognition algorithm;
a judging unit configured to judge, based on the hand images, whether the specific user is in a diet state and to form a judgment result;
a transmitting unit configured to send the judgment result to a remote device.
Based on the above solution, the detection unit is specifically configured to detect the hand motion information with a hand-worn wearable device;
and the collecting unit is configured to collect diet image data of the specific user with the hand-worn wearable device.
Based on the above solution, the judging unit is specifically configured to judge whether a specified image object is present in the hand images, the specified image object including at least one of eating utensils, drinks and food; if the hand images contain the specified image object, to determine that the specific user is in a diet state; and if the hand images do not contain the specified image object, to determine that the specific user is in a non-diet state.
Based on the above solution, the judging unit is specifically configured to determine that the specific user is in the diet state if the specified image object in the hand images appears at the mouth of the specific user.
Based on the above solution, the judging unit is specifically configured to determine that the specific user is in a normal diet state if the probability that the specified image object appears at the mouth of the specific user in the hand images is greater than a specified probability.
With the diet information monitoring method and device provided by the embodiments of the present invention, hand motion information of a specific user is collected, and when the hand motion information meets the diet gesture recognition algorithm, image collection is started to capture hand images of the specific user; whether the specific user is in a diet state is then determined from the hand images, and the result is sent to a remote device. By combining the hand motion information with the hand images, whether the specific user is in a diet state can be monitored remotely and accurately; compared with installing a monitoring device in a room, the accuracy and precision of the monitoring results are clearly and greatly improved.
Brief description of the drawings
Fig. 1 is a schematic flowchart of a first diet information monitoring method provided by an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a diet information monitoring device provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a hand-worn wearable device provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the effect of hand motion information monitoring provided by an embodiment of the present invention;
Fig. 5 is a schematic flowchart of a second diet information monitoring method provided by an embodiment of the present invention.
Specific embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the embodiments and the accompanying drawings.
Embodiment one:
As shown in Fig. 1, the present embodiment provides a diet information monitoring method, the method including:
Step S110: detecting hand motion information of a specific user;
Step S120: if the hand motion information meets a preset diet gesture recognition algorithm, collecting hand images of the specific user;
Step S130: judging, based on the hand images, whether the specific user is in a diet state, and forming a judgment result;
Step S140: sending the judgment result to a remote device.
In the present embodiment, the hand motion information of the specific user is collected first. Here, the specific user may be an elderly person older than a first specified age, a child younger than a second specified age, or another designated user such as a disabled person or a person with an intellectual disability. The hand motion information may include information such as the movement trajectory of the hand and the hand posture. In the present embodiment the detection may be performed with an acceleration sensor; for example, the hand motion information may be detected with a three-axis acceleration sensor.
In step S120, the method first determines whether the hand motion information meets the diet gesture recognition algorithm; if it does, the image collection function is started and hand images of the specific user are collected. Because eating usually follows a characteristic movement trajectory, whether the current hand motion information of the specific user meets the gesture recognition algorithm can be determined from the movement trajectory contained in the hand motion information. In the present embodiment, a hand image is an image that includes at least part of the user's hand. In step S120 the hand images may be collected at a specified time interval, i.e. periodically, for example one hand image every 2 seconds; in this way, if the specific user really is in a diet state, several such hand images will be collected. Moreover, in the present embodiment image collection only starts once step S120 determines that the hand motion information of the specific user meets the diet gesture recognition algorithm. This reduces the amount of invalid image monitoring and saves power in the monitoring device; in particular, if the device collecting the hand images is a mobile device, the energy consumption of the device is greatly reduced and the image-collection workload is greatly simplified.
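By way of illustration only, the gating described above can be sketched in Python as follows; read_motion_sample, matches_diet_gesture and capture_hand_image are hypothetical stand-ins for the motion sensor, the diet gesture recognition algorithm and the camera of the monitoring device, and the 2-second capture period is merely the example value mentioned above.

    import time

    CAPTURE_PERIOD_S = 2.0  # example value: one hand image every 2 seconds


    def collect_hand_images(read_motion_sample, matches_diet_gesture,
                            capture_hand_image, max_images=10):
        # Collect hand images only while the hand motion matches the diet gesture.
        images = []
        while len(images) < max_images:
            motion = read_motion_sample()            # e.g. three-axis acceleration data
            if not matches_diet_gesture(motion):     # step S120 gate: no gesture, no capture
                break
            images.append(capture_hand_image())      # periodic capture saves power
            time.sleep(CAPTURE_PERIOD_S)
        return images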
In step S130, whether the specific user is in a diet state can be judged directly from the hand images, forming a judgment result. In step S140 the judgment result may be sent to a remote device, so that the guardian or family of the specific user can directly observe the diet state of the specific user. With the diet monitoring method of the present embodiment, combining the collected hand motion of the specific user with the hand images makes it possible to judge accurately whether the specific user is in a diet state.
By sending the judgment result to the remote device in step S140, the guardian or family of the specific user can remotely and accurately monitor the diet state of the specific user.
Further, step S110 may include: detecting the hand motion information with a hand-worn wearable device; and step S120 may include: collecting the diet image data of the specific user with the hand-worn wearable device.
In the present embodiment the hand-worn wearable device may be a smart device such as a smart bracelet, smart watch or smart ring. The hand-worn wearable device may include a sensor for detecting hand motion and a structure such as a camera or video camera for collecting images. Detection with a hand-worn wearable device has the advantages of being easy to carry, simple in structure and low in hardware cost, while yielding accurate detection results.
Further, step S130 may include: judging whether a specified image object is present in the hand images, the specified image object including at least one of eating utensils, drinks and food; if the hand images contain the specified image object, determining that the specific user is in a diet state; and if the hand images do not contain the specified image object, determining that the specific user is in a non-diet state.
Here, the eating utensils may include implements used while eating, such as chopsticks, spoons, knives, forks and cups. The specified image object may also include drinks and food themselves; the drinks may be, for example, milk or beverages, and the food may include fruit, dishes and so on. In step S130, image recognition technology can be used to identify whether the hand images contain the specified image object.
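A minimal sketch of the check used in step S130, assuming a hypothetical recognize_objects(image) function that returns the set of object labels found by an image recognition model (the label names below are likewise assumptions):

    # Labels treated as "specified image objects": utensils, drinks and food.
    SPECIFIED_OBJECTS = {"chopsticks", "spoon", "fork", "knife", "cup", "drink", "food"}


    def contains_specified_object(image, recognize_objects):
        # True if the hand image contains any specified image object.
        labels = recognize_objects(image)        # hypothetical image recognition step
        return any(label in SPECIFIED_OBJECTS for label in labels)


    def judge_diet_state(hand_images, recognize_objects):
        # Step S130: diet state if the collected hand images show a specified object.
        if any(contains_specified_object(img, recognize_objects) for img in hand_images):
            return "diet"
        return "non-diet"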
In the present embodiment, when the hand motion in step S110 matches the diet gesture and, at the same time, the collected hand images contain the specified image object characteristic of a diet state, the probability that the specific user is eating is very high. Therefore, in step S130 the specific user can be considered to be in a diet state whenever the specified image object is found in the hand images. If the specified image object is not found in the hand images, the specific user may merely have happened to make a motion similar to eating and is not considered to be in a diet state. The diet information monitoring method of the present embodiment therefore has the advantage of accurate monitoring results.
Further, step S130 may specifically include:
if the specified image object in the hand images appears at the mouth of the specific user, determining that the specific user is in the diet state. In other words, to further improve the monitoring accuracy, in step S130 the specific user is only determined to be in a diet state when the specified image object appears at the mouth of the specific user.
Further, if the specified image object in the hand images appears at the mouth of the specific user, determining that the specific user is in the diet state may include: if the probability that the specified image object appears at the mouth of the specific user in the hand images is greater than a specified probability, determining that the specific user is in a normal diet state. When the specified image object appears at the user's mouth, the specific user is in a diet state; however, the state should be a normal diet rather than an abnormal one. In the present embodiment, N collected hand images are analysed, and the proportion of those N hand images in which the specified image object appears at the mouth is computed; if this proportion is greater than the specified value, the specific user is considered to be in a normal diet state, otherwise the diet of the specific user may be abnormal. For example, an elderly person suffering from indigestion may eat, but eats very little during the meal; such a user is clearly in an abnormal diet state. The guardian or family of the elderly person may want to know not only whether the person eats at all, but also whether the person eats normally, since this directly affects the person's health.
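The decision over the N collected images can be written as below; object_at_mouth(image) is a hypothetical detector for "the specified image object appears at the user's mouth", and the threshold value 0.5 is only an example of the specified probability.

    def evaluate_diet_state(hand_images, object_at_mouth, specified_probability=0.5):
        # Classify the user's state from N collected hand images.
        n = len(hand_images)
        if n == 0:
            return "non-diet"
        hits = sum(1 for img in hand_images if object_at_mouth(img))
        if hits / n > specified_probability:
            return "normal diet"
        return "abnormal diet"    # e.g. the user eats, but very little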
In a specific implementation, the diet monitoring method of the present embodiment may also store the hand motion information and the hand images, so that the user can later review them and perform further in-depth data processing.
In short, the diet information monitoring method of the present embodiment makes it easy to monitor the diet state of a specific user remotely and accurately.
Embodiment two:
As shown in Fig. 2, the present embodiment provides a diet information monitoring device, the device including:
a detection unit 110 configured to detect hand motion information of a specific user;
a collecting unit 120 configured to collect hand images of the specific user if the hand motion information meets a preset diet gesture recognition algorithm;
a judging unit 130 configured to judge, based on the hand images, whether the specific user is in a diet state and to form a judgment result;
a transmitting unit 140 configured to send the judgment result to a remote device.
The diet information monitoring device of the present embodiment may correspond to equipment capable of monitoring the specific user, for example a hand-worn wearable device of the specific user.
The detection unit 110 may correspond to a structure such as a gyroscope or an acceleration sensor; the acceleration sensor may include a three-axis acceleration sensor.
The collecting unit 120 may correspond to any structure capable of image collection, such as a camera or a video camera.
The judging unit 130 may correspond to a processor or a processing circuit. The processor may include structures such as an application processor, a digital signal processor, a central processing unit, a microprocessor or a programmable array. The processing circuit may include an application-specific integrated circuit. By executing predetermined instructions, the unit can judge whether the specific user is in a diet state and form the corresponding judgment result.
The transmitting unit 140 may include various types of transmission interfaces; the transmission interface may correspond to various types of antennas and wireless transmission interfaces, such as a WiFi interface or a Bluetooth interface.
In short, the diet monitoring device of the present embodiment can implement the technical solution of any of the diet monitoring methods of embodiment one, and likewise can accurately and remotely monitor the diet of a specific user.
The detection unit 110 is specifically configured to detect the hand motion information with a hand-worn wearable device;
the collecting unit 120 is configured to collect the diet image data of the specific user with the hand-worn wearable device.
In the present embodiment a hand-worn wearable device is used; such wearable devices are typically small, convenient to wear and highly intelligent. Both the detection of hand motion information by the detection unit 110 and the collection of hand images by the collecting unit 120 are carried out with the hand-worn wearable device, so the present embodiment is easy to realize, low in cost and convenient to use.
Further, the judging unit 130 is specifically configured to judge whether a specified image object is present in the hand images, the specified image object including at least one of eating utensils, drinks and food; if the hand images contain the specified image object, to determine that the specific user is in a diet state; and if the hand images do not contain the specified image object, to determine that the specific user is in a non-diet state.
In the present embodiment the judging unit 130 determines whether the specific user is in a diet state by recognizing whether the hand images contain the specified image object.
To determine even more accurately whether the specific user is in a diet state, in the present embodiment the judging unit 130 is specifically configured to determine that the specific user is in the diet state if the specified image object in the hand images appears at the mouth of the specific user. If the specified image object appears at the mouth of the specific user, the specific user really is eating, and determining the diet state only at that point ensures the accuracy of the judgment result.
Furthermore, in the present embodiment the judging unit 130 is specifically configured to determine that the specific user is in a normal diet state if the probability that the specified image object appears at the mouth of the specific user in the hand images is greater than a specified probability. The judging unit 130 in the present embodiment can therefore judge not only whether the specific user is in a diet state or a non-diet state, but also whether the user is in a normal diet state, which further improves the intelligence of the diet information monitoring device and user satisfaction.
It is worth noting that the diet information monitoring device of the present embodiment may further include a storage unit, and the storage unit may be used to store the hand motion information and the hand images.
Two specific examples are provided below in connection with any of the above embodiments:
Example one:
As shown in Fig. 3, this example provides a hand-worn wearable device that may consist of four parts: a data acquisition module, a data analysis module, a data storage module and a network interface module. The data acquisition module may correspond to the detection unit 110 and the collecting unit 120 of the preceding embodiments; the data analysis module may correspond to the judging unit 130; and the network interface module may correspond to the transmitting unit 140.
The data acquisition module collects acceleration sensor data and camera data; the collected data are processed and analysed by the data analysis module, and the analysis result is sent to the user's monitoring terminal through the network interface module.
The data acquisition module includes the acceleration sensor and the camera. When worn by an elderly person, the hand-worn wearable device mainly realizes the following two functions:
1.1 collecting the elderly person's hand motion information through the acceleration sensor of the hand-worn wearable device;
1.2 collecting image data of the elderly person through the camera. Here, the elderly person is one kind of the specific user described in the preceding embodiments.
The data collected by the data acquisition module are processed and analysed by the data analysis module. The data analysis module contains two intelligent algorithms: a diet gesture recognition algorithm and an intelligent dining-state evaluation algorithm.
The hand-worn wearable device collects the hand motion information, and the diet gesture recognition algorithm judges from the collected motion information whether the elderly person's hand motion matches the typical motion pattern of picking up food with chopsticks and bringing it to the mouth during a meal. When it is detected that the elderly person is picking up food and feeding, the camera of the wearable device is triggered to collect images at specific motion nodes, and the elderly person's diet behaviour state is judged by jointly analysing the gesture motion information and the images collected by the camera.
Fig. 4 shows the acceleration changes caused by hand motion, detected by the three-axis acceleration sensor of the hand-worn wearable device while the elderly person is eating. The three curves R1, Y1 and B1 represent the changes in the acceleration values along the X, Y and Z axes of the sensor output, and the P1 curve represents the change in the total acceleration magnitude, i.e. the value of Sqrt(X^2+Y^2+Z^2). When diet gesture recognition is performed, the specific directions of the X, Y and Z axes are not distinguished; only the acceleration changes along the three directions are considered.
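The total acceleration magnitude plotted as curve P1 is simply the Euclidean norm of the three axis readings; a minimal computation, assuming the samples are (x, y, z) readings in units of g:

    import math


    def total_acceleration(samples):
        # Total acceleration magnitude sqrt(x^2 + y^2 + z^2) for each (x, y, z) sample.
        return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]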
The three-axis acceleration sensor in the hand-worn wearable device collects the acceleration values of the dining gesture. During a meal, the hand has a large acceleration near the table and near the mouth, where it changes from static to moving or from moving to static, whereas during the lifting and lowering motion between these two points (table and mouth) the hand moves relatively uniformly; the changes in the three axial acceleration values collected by the sensor during that motion are mostly caused by the rotation of the x, y and z axes as the gesture changes, so that the projections of the 1g gravitational acceleration onto the axes differ in size. Therefore, when the values of the three-axis acceleration sensor match the following pattern of changes, it is judged that the elderly person's hand shows the behaviour of picking up food and feeding during a meal:
Within a detection period of the wearable device, the time points at which the detected total acceleration magnitude exceeds a threshold Gt are found. Among these detected time points, if the time gap between any two points is less than Tmin, these points are classified into one set, and within each set the point with the maximum acceleration magnitude among the most densely spaced points (the points most numerous within a time dt) is chosen as a candidate feature point (T1, T2, T5 as shown in Fig. 4). Among the intervals determined by these candidate feature points, if two consecutive intervals determined by any three consecutive candidate feature points, such as [T1, T2] and [T2, T5], satisfy the following conditions:
In the interval [T1, T2], there are two axes whose mean acceleration values Ga and Gb are greater than thresholds G1 and G3 respectively, while the mean acceleration value Gc of the third axis is less than a threshold G6.
In the interval [T2, T5], there exists a time interval [T3, T4] in which the mean acceleration values Ga and Gb of two axes are smaller than thresholds G2 and G4 respectively, while the mean acceleration value Gc of the third axis is greater than a threshold G5.
The meanings of the time periods [T2, T3], [T3, T4] and [T4, T5] mentioned above are as follows:
During [T2, T3] there is a continuous period in which the total acceleration magnitude is less than 1g, where T3 is the point at which the total acceleration magnitude changes from less than 1g to greater than or equal to 1g; similarly, during [T4, T5] there is also a continuous period in which the total acceleration magnitude is less than 1g, where T4 is the point at which the total acceleration magnitude changes from greater than or equal to 1g to less than 1g. The acceleration greater than 1g produced around T2 arises because the hand is about to be lifted and must overcome gravity so that its velocity increases from 0. The period of total acceleration below 1g within [T2, T3] arises because, while the person is bringing food to the mouth with chopsticks, the hand starts to decelerate shortly before the food reaches the mouth, so that a short time later, when the food actually reaches the mouth, the hand's velocity becomes 0. During [T3, T4] the hand is at the mouth and is then lowered under gravity without yet reaching the table, so the total acceleration magnitude detected by the sensor is essentially the gravitational acceleration, i.e. 1g. The period of total acceleration below 1g within [T4, T5] arises because, while the hand falls towards the table under gravity, it starts to decelerate shortly before it reaches the table, so that a short time later, when the hand actually touches the table, its velocity changes from greater than 0 to 0; the sensor therefore detects an acceleration magnitude greater than 1g around T5.
If the above conditions are met, the three candidate feature points T1, T2 and T5 are confirmed as feature points. Among these three feature points, T2 corresponds to the moment when food has been picked up and the hand is about to be raised for feeding, while T1 and T5 correspond to the moments when, after feeding, the hand returns to the table and food is picked up again. The midpoint of the interval from T3 to T4 corresponds to the moment of feeding.
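A rough sketch of the candidate feature point selection described above; the values of Gt, Tmin and dt are illustrative assumptions (the text gives no concrete numbers), and the subsequent per-axis interval checks against G1-G6 are not reproduced here.

    def candidate_feature_points(times, magnitudes, gt=1.2, t_min=1.0, dt=0.2):
        # times / magnitudes: parallel lists of sample timestamps (s) and
        # total acceleration magnitudes (g).
        above = [(t, m) for t, m in zip(times, magnitudes) if m > gt]

        # Group points whose time gaps are smaller than Tmin into sets.
        groups, current = [], []
        for t, m in above:
            if current and t - current[-1][0] >= t_min:
                groups.append(current)
                current = []
            current.append((t, m))
        if current:
            groups.append(current)

        candidates = []
        for group in groups:
            # Density of a point: number of points of the group within dt of it.
            def density(point):
                return sum(1 for q in group if abs(q[0] - point[0]) <= dt)

            max_density = max(density(p) for p in group)
            densest = [p for p in group if density(p) == max_density]
            best_time, _ = max(densest, key=lambda p: p[1])   # largest magnitude wins
            candidates.append(best_time)
        return candidates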
Once picking-up and feeding behaviour of the elderly person's hand has been detected, the camera is started and images of the elderly person feeding are captured at a certain period C. The images are then analysed: if chopsticks or a spoon appear at the mouth in more than a certain proportion P of the collected images, the elderly person is in a dining state. This dining state is one of the aforementioned diet states, and the proportion P corresponds to the specified probability mentioned above.
The data storage module consists of a data storage unit and is responsible for storing the collected hand motion data and image data.
The network interface module provides WiFi and 2G/3G/4G mobile network access. Through this module, the wearable device can exchange data and information with external terminals and platforms.
Example two:
As shown in Fig. 5, this example provides a method for a hand-worn wearable device to detect the diet state of an elderly person, including:
Step S1: detecting the elderly person's hand motion.
Step S2: judging whether the hand shows picking-up and feeding behaviour; if so, proceeding to step S3, otherwise proceeding to step S6.
Step S3: starting the camera and capturing images of the elderly person feeding at a certain period C.
Step S4: judging whether the number of images in which chopsticks or a spoon appear at the mouth exceeds a certain proportion P of the number of collected images; if so, proceeding to step S5, otherwise proceeding to step S6.
Step S5: determining that the elderly person is in a dining state.
Step S6: determining that the elderly person is in a non-dining state.
In the several embodiments provided by the present application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division of the units is merely a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing module, or each unit may serve as a separate unit, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by hardware related to program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily be conceived by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A diet information monitoring method, characterized in that the method comprises:
detecting hand motion information of a specific user;
if the hand motion information meets a preset diet gesture recognition algorithm, collecting hand images of the specific user;
judging, based on the hand images, whether the specific user is in a diet state, and forming a judgment result;
sending the judgment result to a remote device.
2. The method according to claim 1, characterized in that
detecting the hand motion information of the specific user comprises:
detecting the hand motion information with a hand-worn wearable device;
and collecting the hand images of the specific user comprises:
collecting diet image data of the specific user with the hand-worn wearable device.
3. The method according to claim 1 or 2, characterized in that
judging, based on the hand images, whether the specific user is in a diet state and forming a judgment result comprises:
judging whether a specified image object is present in the hand images, the specified image object comprising at least one of eating utensils, drinks and food;
if the hand images contain the specified image object, determining that the specific user is in a diet state;
if the hand images do not contain the specified image object, determining that the specific user is in a non-diet state.
4. The method according to claim 3, characterized in that
if the hand images contain the specified image object, determining that the specific user is in a diet state comprises:
if the specified image object in the hand images appears at the mouth of the specific user, determining that the specific user is in the diet state.
5. The method according to claim 4, characterized in that
if the specified image object in the hand images appears at the mouth of the specific user, determining that the specific user is in the diet state comprises:
if the probability that the specified image object appears at the mouth of the specific user in the hand images is greater than a specified probability, determining that the specific user is in a normal diet state.
6. A diet information monitoring device, characterized in that the device comprises:
a detection unit configured to detect hand motion information of a specific user;
a collecting unit configured to collect hand images of the specific user if the hand motion information meets a preset diet gesture recognition algorithm;
a judging unit configured to judge, based on the hand images, whether the specific user is in a diet state and to form a judgment result;
a transmitting unit configured to send the judgment result to a remote device.
7. The device according to claim 6, characterized in that
the detection unit is specifically configured to detect the hand motion information with a hand-worn wearable device; and
the collecting unit is configured to collect diet image data of the specific user with the hand-worn wearable device.
8. The device according to claim 6 or 7, characterized in that
the judging unit is specifically configured to judge whether a specified image object is present in the hand images, the specified image object comprising at least one of eating utensils, drinks and food; if the hand images contain the specified image object, to determine that the specific user is in a diet state; and if the hand images do not contain the specified image object, to determine that the specific user is in a non-diet state.
9. The device according to claim 8, characterized in that
the judging unit is specifically configured to determine that the specific user is in the diet state if the specified image object in the hand images appears at the mouth of the specific user.
10. The device according to claim 9, characterized in that
the judging unit is specifically configured to determine that the specific user is in a normal diet state if the probability that the specified image object appears at the mouth of the specific user in the hand images is greater than a specified probability.
CN201510778853.2A 2015-11-13 2015-11-13 Diet information monitoring method and device Pending CN106709401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510778853.2A CN106709401A (en) 2015-11-13 2015-11-13 Diet information monitoring method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510778853.2A CN106709401A (en) 2015-11-13 2015-11-13 Diet information monitoring method and device

Publications (1)

Publication Number Publication Date
CN106709401A true CN106709401A (en) 2017-05-24

Family

ID=58931363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510778853.2A Pending CN106709401A (en) 2015-11-13 2015-11-13 Diet information monitoring method and device

Country Status (1)

Country Link
CN (1) CN106709401A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000245713A (en) * 1999-02-26 2000-09-12 Sanyo Electric Co Ltd Behavior recognizing device
US20090082699A1 (en) * 2007-09-21 2009-03-26 Sun Lee Bang Apparatus and method for refining subject activity classification for recognition of daily activities, and system for recognizing daily activities using the same
CN102368297A (en) * 2011-09-14 2012-03-07 北京英福生科技有限公司 Equipment, system and method for recognizing actions of detected object
CN102567743A (en) * 2011-12-20 2012-07-11 东南大学 Automatic identification method of driver gestures based on video images
US20130336519A1 (en) * 2012-06-14 2013-12-19 Robert A. Connor Willpower Watch (TM) -- A Wearable Food Consumption Monitor
CN104765980A (en) * 2015-05-04 2015-07-08 哈尔滨理工大学 Intelligent diet assessment method based on cloud computing

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108108690A (en) * 2017-12-19 2018-06-01 深圳创维数字技术有限公司 A kind of method, apparatus, equipment and storage medium for monitoring diet
CN108108690B (en) * 2017-12-19 2022-02-11 深圳创维数字技术有限公司 Method, device, equipment and storage medium for monitoring diet
CN108279022A (en) * 2018-03-12 2018-07-13 众米智能科技(深圳)有限公司 Intelligent gesture identifying system, the baby spoon fork with the system and implementation method
CN110826506A (en) * 2019-11-11 2020-02-21 上海秒针网络科技有限公司 Target behavior identification method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170524