CN111414780A - Sitting posture real-time intelligent distinguishing method, system, equipment and storage medium - Google Patents

Sitting posture real-time intelligent distinguishing method, system, equipment and storage medium

Info

Publication number
CN111414780A
CN111414780A (Application CN201910006352.0A)
Authority
CN
China
Prior art keywords
sitting posture
current
user
standard
shoulder
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910006352.0A
Other languages
Chinese (zh)
Other versions
CN111414780B (en)
Inventor
张世芳
陈超
夏亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aspire Digital Technologies Shenzhen Co Ltd
Original Assignee
Aspire Digital Technologies Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aspire Digital Technologies Shenzhen Co Ltd filed Critical Aspire Digital Technologies Shenzhen Co Ltd
Priority to CN201910006352.0A priority Critical patent/CN111414780B/en
Publication of CN111414780A publication Critical patent/CN111414780A/en
Application granted granted Critical
Publication of CN111414780B publication Critical patent/CN111414780B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention discloses a sitting posture real-time intelligent judgment method, which comprises the following steps: acquiring a current sitting posture image of a user in real time and identifying human body feature key point data of the user; if the human body feature key point data of the user cannot be identified, the sitting posture is considered abnormal. The human body feature key point data comprise eye coordinates, mouth coordinates, neck coordinates and shoulder coordinates. If the human body feature key point data of the user are identified, the current sitting posture data are calculated from the key point data; the current sitting posture data comprise a current head inclination angle, a current shoulder inclination angle, a height difference between the current neck and the face, and a height difference between the current shoulder and the face. The current sitting posture data are then compared with standard sitting posture data to judge whether the current sitting posture is abnormal. The invention also discloses a sitting posture real-time intelligent judgment system, equipment and storage medium. The invention relates to the technical field of artificial intelligence and achieves high sitting posture judgment accuracy.

Description

Sitting posture real-time intelligent distinguishing method, system, equipment and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a sitting posture real-time intelligent judgment method, a system, equipment and a storage medium.
Background
At present, parents pay close attention to the sitting posture of primary-school children while they study, worry that poor posture will affect the children's healthy growth, and therefore place strict requirements on it. In the prior art, a common sitting posture recognition method is based on image processing: the center position of the pixel points of the face and shoulders in the picture is calculated geometrically, the angle of each pixel point on the face and head-shoulder curve relative to that center position is then calculated, the calculated angle value is subtracted from a preset standard angle value, and if the resulting difference is greater than a threshold value the sitting posture is considered abnormal. Because the center position is derived from the pixel set of the face and shoulders and the deviation angle is computed from it, the accuracy of the obtained face deviation angle is not high.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art. Therefore, an object of the present invention is to provide a method, a system, a device and a storage medium for real-time and intelligent determination of sitting posture, which have high accuracy of sitting posture determination and can determine a plurality of abnormal sitting postures.
The technical scheme adopted by the invention is as follows:
in a first aspect, the invention provides a sitting posture real-time intelligent determination method, which comprises the following steps:
acquiring a current sitting posture image of a user in real time, identifying human body characteristic key point data of the user, and if the human body characteristic key point data of the user cannot be identified, considering that the sitting posture is abnormal;
the human body feature key point data comprises an eye coordinate, a mouth coordinate, a neck coordinate and a shoulder coordinate;
if the human body feature key point data of the user is identified, calculating the current sitting posture data according to the human body feature key point data;
the current sitting posture data comprises a current head inclination angle, a current shoulder inclination angle, a height difference value between a current neck and a face, and a height difference value between a current shoulder and the face;
and comparing the current sitting posture data with the standard sitting posture data to judge whether the current sitting posture is abnormal.
As a further improvement of the above scheme, if the human body feature key point data of the user is identified, calculating the current sitting posture data according to the human body feature key point data specifically includes:
calculating the current head inclination angle of the user according to the mouth coordinate and the neck coordinate;
calculating the current shoulder inclination angle of the user according to shoulder coordinates, wherein the shoulder coordinates comprise a left shoulder coordinate and a right shoulder coordinate;
calculating a height difference value between the current neck and the face of the user according to the eye coordinate, the mouth coordinate and the neck coordinate;
and calculating the height difference between the current shoulder and the face of the user according to the eye coordinate, the mouth coordinate and the shoulder coordinate.
As a further improvement of the above scheme, the comparing the current sitting posture data with the standard sitting posture data, and determining whether the current sitting posture is abnormal specifically includes:
comparing the current head inclination angle of the user with a standard head inclination angle threshold value, and judging whether the head inclination is abnormal or not;
comparing the current shoulder inclination angle of the user with a standard shoulder inclination angle threshold value, and judging whether the shoulder inclination is abnormal or not;
calculating the ratio of the height difference between the current neck and the face of the user to the height difference between the standard neck and the face of the user to serve as a first ratio, comparing the first ratio with a standard eye-use nearness difference ratio threshold value, and judging whether the user uses eyes too close;
and calculating the ratio of the height difference between the current shoulder and the face of the user to the height difference between the standard shoulder and the face of the user to serve as a second ratio, comparing the second ratio with a standard lying table difference ratio threshold value, and judging whether the user lies on the table.
As a further improvement of the above scheme, before the step of collecting the image of the current sitting posture of the user in real time and identifying the key point data of the human body features of the user, the method further comprises the following steps:
inputting a standard sitting posture image, performing big data training through a machine learning supervised learning classification algorithm, and acquiring standard sitting posture data, wherein the standard sitting posture data comprises a standard head inclination angle threshold value, a standard shoulder inclination angle threshold value, a standard eye use over-close difference value ratio threshold value and a standard lying table difference value ratio threshold value.
As a further improvement of the above solution, the method further comprises: and when the current sitting posture is judged to be abnormal, sending out reminding information in real time.
In a second aspect, the present invention provides a sitting posture real-time intelligent determination system, which includes:
the acquisition and identification module is used for acquiring the current sitting posture image of the user in real time, identifying the human body characteristic key point data of the user, and considering that the sitting posture is abnormal if the human body characteristic key point data of the user cannot be identified;
the human body feature key point data comprises an eye coordinate, a mouth coordinate, a neck coordinate and a shoulder coordinate;
the calculation module is used for calculating current sitting posture data according to the human body feature key point data if the human body feature key point data of the user is identified;
the current sitting posture data comprises a current head inclination angle, a current shoulder inclination angle, a height difference value between a current neck and a face, and a height difference value between a current shoulder and the face;
and the comparison and judgment module is used for comparing the current sitting posture data with the standard sitting posture data and judging whether the current sitting posture is abnormal.
As a further improvement of the above solution, the system further comprises:
the learning acquisition module is used for inputting a standard sitting posture image, performing big data training through a machine learning supervised learning classification algorithm and acquiring standard sitting posture data, wherein the standard sitting posture data comprises a standard head inclination angle threshold value, a standard shoulder inclination angle threshold value, a standard eye-use over-close difference value ratio threshold value and a standard lying table difference value ratio threshold value.
As a further improvement of the above solution, the system further comprises:
and the reminding module is used for sending out reminding information in real time when the current sitting posture is judged to be abnormal.
In a third aspect, the present invention provides a sitting posture real-time intelligent determination device, including:
at least one processor, and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the sitting posture real-time intelligent determination method as described above.
In a fourth aspect, the present invention provides a computer-readable storage medium, which stores computer-executable instructions for causing a computer to execute the sitting posture real-time intelligent determination method as described above.
The invention has the beneficial effects that:
the invention relates to a sitting posture real-time intelligent judgment method, a system, equipment and a storage medium, which are used for identifying human body characteristic key point data of a user by acquiring a current sitting posture image of the user in real time, calculating the current sitting posture data according to the human body characteristic key point data, comparing the current sitting posture data with standard sitting posture data, and judging whether the current sitting posture is abnormal or not.
Drawings
The following further describes embodiments of the present invention with reference to the accompanying drawings:
FIG. 1 is a schematic flow chart of a sitting posture real-time intelligent determination method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a distribution of key points of human features;
fig. 3 is a schematic diagram of a sitting posture real-time intelligent determination system module according to a second embodiment of the invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Example one
Fig. 1 is a schematic flow chart of a sitting posture real-time intelligent determination method according to an embodiment of the present invention, fig. 2 is a schematic distribution diagram of key points of human body features, and with reference to fig. 1 and fig. 2, a sitting posture real-time intelligent determination method includes steps S1 to S3.
S1, acquiring a current sitting posture image of the user in real time, identifying human body feature key point data of the user, and considering that the sitting posture is abnormal if the human body feature key point data of the user cannot be identified;
the human body feature key point data comprises eye coordinates, mouth coordinates, neck coordinates, shoulder coordinates and the like;
S2, if the human body feature key point data of the user is identified, calculating the current sitting posture data according to the human body feature key point data;
the current sitting posture data comprises a current head inclination angle, a current shoulder inclination angle, a height difference value between a current neck and a face, and a height difference value between a current shoulder and a face;
and S3, comparing the current sitting posture data with the standard sitting posture data, and judging whether the current sitting posture is abnormal.
In this embodiment, before step S1, the method further includes the steps of:
S0, inputting the standard sitting posture image, and performing big data training through a machine learning supervised learning classification algorithm to obtain standard sitting posture data. The standard sitting posture data comprise a standard head inclination angle threshold A, a standard shoulder inclination angle threshold B, a standard eye-use over-close difference value ratio threshold C and a standard lying table difference value ratio threshold D.
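The patent does not detail the training procedure of step S0. As an illustration only, the following is a minimal sketch of one plausible way a supervised learning step could yield a per-feature threshold such as A, B, C or D, by fitting a depth-1 decision tree per feature; scikit-learn, the function name and the labeling convention are assumptions, not part of the patent.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def learn_threshold(feature_values, labels):
    """Fit a depth-1 decision tree (a stump) on one scalar feature and read off
    the learned split point as the threshold for that feature.
    feature_values: 1-D array of the feature (e.g. head tilt angle) per training image.
    labels: 1 for an abnormal sitting posture sample, 0 for a standard one."""
    X = np.asarray(feature_values, dtype=float).reshape(-1, 1)
    y = np.asarray(labels, dtype=int)
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y)
    return float(stump.tree_.threshold[0])  # split point of the root node

# Illustrative usage: one threshold per sitting posture feature.
# A = learn_threshold(head_tilt_angles, labels)
# B = learn_threshold(shoulder_tilt_angles, labels)
```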
In this embodiment, step S1 specifically includes: collecting a user sitting posture image in real time, and identifying the human body feature key point data P = {(x0, y0), (x1, y1), (x2, y2), …, (x23, y23), (x24, y24)}. The eye coordinates include the left eye coordinate (x15, y15) and the right eye coordinate (x16, y16); the mouth coordinate is (x0, y0); the neck coordinate is (x1, y1); the shoulder coordinates include the left shoulder coordinate (x2, y2) and the right shoulder coordinate (x5, y5).
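As context for step S1, a minimal sketch of selecting the six key points used below from a 25-point pose-estimation output; the detector, the variable names and the confidence handling are illustrative assumptions, while the coordinate indices are the ones given in this embodiment.

```python
from typing import Optional

KEYPOINT_INDEX = {          # indices as used in this embodiment
    "mouth": 0, "neck": 1, "left_shoulder": 2,
    "right_shoulder": 5, "left_eye": 15, "right_eye": 16,
}

def extract_keypoints(pose_output) -> Optional[dict]:
    """Select the six key points needed for sitting posture analysis from a
    25-point pose-estimation result (a list of (x, y, confidence) triples).
    Returns None when a required point is missing, which step S1 treats as an
    abnormal sitting posture. The pose model itself is not specified by the
    patent; any detector that outputs these key points would do."""
    points = {}
    for name, idx in KEYPOINT_INDEX.items():
        if pose_output is None or idx >= len(pose_output):
            return None
        x, y, conf = pose_output[idx]
        if conf <= 0:          # key point not detected
            return None
        points[name] = (x, y)
    return points
```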
In this embodiment, step S2 specifically includes:
S21, calculating the current head inclination angle of the user according to the mouth coordinate (x0, y0) and the neck coordinate (x1, y1). The current head inclination angle comprises a current head left inclination angle E1 and a current head right inclination angle E2 of the user, which are calculated according to the Euler angle (attitude angle) principle in combination with the sitting posture characteristics:

[Formula images BDA0001935583430000051 and BDA0001935583430000052: calculation of the head left inclination angle E1 and head right inclination angle E2 from the mouth and neck coordinates; the original formula images are not recoverable from the text.]
S22, calculating the current shoulder inclination angle of the user according to the shoulder coordinates, where the shoulder coordinates comprise the left shoulder coordinate (x2, y2) and the right shoulder coordinate (x5, y5). The current shoulder inclination angle comprises a current shoulder left inclination angle F1 and a current shoulder right inclination angle F2 of the user, which are calculated according to the Euler angle (attitude angle) principle in combination with the sitting posture characteristics:

[Formula images BDA0001935583430000053 and BDA0001935583430000054: calculation of the shoulder left inclination angle F1 and shoulder right inclination angle F2 from the left and right shoulder coordinates; the original formula images are not recoverable from the text.]
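Since the formula images for E1, E2, F1 and F2 are not reproduced in this text, the following is a minimal sketch of one plausible way to compute these tilt angles from the key points, assuming each angle is the arctangent of the horizontal over the vertical offset in image coordinates. This is an assumption, not the patent's exact formulas.

```python
import math

def head_tilt_angles(mouth, neck):
    """Plausible head left/right tilt angles E1, E2 (degrees) from the mouth and
    neck coordinates; the patent's exact formulas are in figure images that are
    not recoverable, so the arctangent form below is an assumption."""
    dx = mouth[0] - neck[0]
    dy = abs(neck[1] - mouth[1]) or 1e-6      # avoid division by zero
    angle = math.degrees(math.atan2(abs(dx), dy))
    e1 = angle if dx < 0 else 0.0             # mouth left of the neck in the image
    e2 = angle if dx > 0 else 0.0             # mouth right of the neck in the image
    return e1, e2

def shoulder_tilt_angles(left_shoulder, right_shoulder):
    """Plausible shoulder left/right tilt angles F1, F2 (degrees) from the slope
    of the line joining the two shoulder key points (again an assumption)."""
    dx = abs(right_shoulder[0] - left_shoulder[0]) or 1e-6
    dy = left_shoulder[1] - right_shoulder[1]
    angle = math.degrees(math.atan2(abs(dy), dx))
    f1 = angle if dy > 0 else 0.0   # left shoulder lower (image y grows downward)
    f2 = angle if dy < 0 else 0.0   # right shoulder lower
    return f1, f2
```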
S23, calculating the height difference between the current neck and the face of the user according to the eye coordinates (the left eye coordinate (x15, y15) and the right eye coordinate (x16, y16)), the mouth coordinate (x0, y0) and the neck coordinate (x1, y1); this includes calculating the height difference between the current neck and the left eye (y1 - y15), the height difference between the current neck and the right eye (y1 - y16), and the height difference between the current neck and the mouth (y1 - y0).
S24, calculating the height difference between the current shoulder and the face of the user according to the eye coordinates (the left eye coordinate (x15, y15) and the right eye coordinate (x16, y16)), the mouth coordinate (x0, y0) and the shoulder coordinates (the left shoulder coordinate (x2, y2) and the right shoulder coordinate (x5, y5)); this includes calculating the height difference between the current left shoulder and the left eye (y2 - y15), the height difference between the current right shoulder and the right eye (y5 - y16), the height difference between the current left shoulder and the mouth (y2 - y0), and the height difference between the current right shoulder and the mouth (y5 - y0).
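For readability, a minimal sketch of the height differences listed in S23 and S24, taking each key point as an (x, y) tuple in image coordinates; the function and dictionary key names are illustrative, not from the patent.

```python
def neck_face_height_diffs(neck, left_eye, right_eye, mouth):
    """Height differences between the current neck and the face, as listed in S23."""
    return {
        "neck_left_eye":  neck[1] - left_eye[1],   # y1 - y15
        "neck_right_eye": neck[1] - right_eye[1],  # y1 - y16
        "neck_mouth":     neck[1] - mouth[1],      # y1 - y0
    }

def shoulder_face_height_diffs(left_shoulder, right_shoulder, left_eye, right_eye, mouth):
    """Height differences between the current shoulders and the face, as listed in S24."""
    return {
        "left_shoulder_left_eye":   left_shoulder[1] - left_eye[1],    # y2 - y15
        "right_shoulder_right_eye": right_shoulder[1] - right_eye[1],  # y5 - y16
        "left_shoulder_mouth":      left_shoulder[1] - mouth[1],       # y2 - y0
        "right_shoulder_mouth":     right_shoulder[1] - mouth[1],      # y5 - y0
    }
```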
In this embodiment, step S3 specifically includes:
S31, comparing the current head inclination angle of the user with the standard head inclination angle threshold, and judging whether the head inclination is abnormal;
S32, comparing the current shoulder inclination angle of the user with the standard shoulder inclination angle threshold, and judging whether the shoulder inclination is abnormal;
S33, calculating the ratio of the height difference between the current neck and face of the user to the height difference between the standard neck and face as a first ratio, comparing the first ratio with the standard eye-use over-close difference value ratio threshold, and judging whether the user's eyes are too close;
S34, calculating the ratio of the height difference between the current shoulder and face of the user to the height difference between the standard shoulder and face as a second ratio, comparing the second ratio with the standard lying table difference value ratio threshold, and judging whether the user is lying on the table.
In a specific embodiment, step S31 is:
The calculated current head left inclination angle E1 and current head right inclination angle E2 of the user are respectively compared with the standard head inclination angle threshold A: if E1 < A and E2 < A, the head inclination is not abnormal; otherwise, the head inclination is abnormal.
In a specific embodiment, step S32 is:
The calculated current left shoulder inclination angle F1 and current right shoulder inclination angle F2 of the user are respectively compared with the standard shoulder inclination angle threshold B: if F1 < B and F2 < B, the shoulder inclination is not abnormal; otherwise, the shoulder inclination is abnormal.
In a specific embodiment, step S33 is:
Calculating the ratio of the height difference between the current neck and face of the user to the height difference between the standard neck and face as a first ratio G. The height difference between the current neck and face includes the height difference between the current neck and the left eye (y1 - y15), the height difference between the current neck and the right eye (y1 - y16), and the height difference between the current neck and the mouth (y1 - y0); the height difference between the standard neck and face includes the height difference between the standard neck and the left eye (y1' - y15'), the height difference between the standard neck and the right eye (y1' - y16'), and the height difference between the standard neck and the mouth (y1' - y0'). The first ratio G is calculated as follows:

[Formula image BDA0001935583430000061: calculation of the first ratio G from the current and standard neck-to-face height differences; the original formula image is not recoverable from the text.]
The first ratio G is compared with the standard eye-use over-close difference value ratio threshold C: if G < C, the user's eye-use distance is normal; otherwise, the user's eyes are too close.
In a specific embodiment, step S34 is:
Calculating the ratio of the height difference between the current shoulder and face of the user to the height difference between the standard shoulder and face as a second ratio H. The height difference between the current shoulder and face includes the height difference between the current left shoulder and the left eye (y2 - y15), the height difference between the current right shoulder and the right eye (y5 - y16), the height difference between the current left shoulder and the mouth (y2 - y0), and the height difference between the current right shoulder and the mouth (y5 - y0); the height difference between the standard shoulder and face includes the height difference between the standard left shoulder and the left eye (y2' - y15'), the height difference between the standard right shoulder and the right eye (y5' - y16'), the height difference between the standard left shoulder and the mouth (y2' - y0'), and the height difference between the standard right shoulder and the mouth (y5' - y0'). The second ratio H is calculated as follows:

[Formula image BDA0001935583430000071: calculation of the second ratio H from the current and standard shoulder-to-face height differences; the original formula image is not recoverable from the text.]
The second ratio H is compared with the standard lying table difference value ratio threshold D: if H < D, the user is not lying on the table; otherwise, the user is lying on the table.
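Because the formula images for G and H are not reproduced in this text, the sketch below assumes each ratio is the mean of the per-pair ratios between the corresponding current and standard height differences; this aggregation, the function names and the threshold handling are assumptions, not the patent's exact method.

```python
def ratio_to_standard(current_diffs, standard_diffs):
    """Plausible first/second ratio: the mean of the per-pair ratios between the
    current and standard height differences (an assumption; the patent's exact
    formula is in a figure image that is not reproduced here)."""
    ratios = [current_diffs[k] / standard_diffs[k]
              for k in current_diffs if standard_diffs[k] != 0]
    return sum(ratios) / len(ratios)

def judge_eye_and_table(current_neck, standard_neck,
                        current_shoulder, standard_shoulder, C, D):
    """Apply the comparisons of S33 and S34 with the learned thresholds C and D."""
    G = ratio_to_standard(current_neck, standard_neck)          # first ratio
    H = ratio_to_standard(current_shoulder, standard_shoulder)  # second ratio
    eyes_too_close = not (G < C)   # S33: G < C means normal eye-use distance
    lying_on_table = not (H < D)   # S34: H < D means not lying on the table
    return eyes_too_close, lying_on_table
```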
In this embodiment, the method further includes:
S4, when the current sitting posture is judged to be abnormal, sending out reminder information in real time, for example broadcasting a reminder voice through a smart speaker or pushing reminder information in real time through a mobile phone APP.
According to the sitting posture real-time intelligent judgment method provided by the invention, the current sitting posture image of the user is collected in real time, the human body feature key point data of the user are identified, the current sitting posture data are calculated from the key point data, and the current sitting posture data are compared with the standard sitting posture data to judge whether the current sitting posture is abnormal. This solves the technical problem in the prior art that judging the sitting posture only by the face deviation angle gives low accuracy, achieves high sitting posture judgment accuracy, and is suitable for scenarios such as real-time sitting posture monitoring of a child while studying.
Example two
Fig. 3 is a schematic diagram of a module of a sitting posture real-time intelligent determination system according to a second embodiment of the present invention, and referring to fig. 3, a sitting posture real-time intelligent determination system includes:
the acquisition and identification module is used for acquiring the current sitting posture image of the user in real time, identifying the human body characteristic key point data of the user, and considering that the sitting posture is abnormal if the human body characteristic key point data of the user cannot be identified;
wherein, the human body feature key point data comprises eye coordinates, mouth coordinates, neck coordinates, shoulder coordinates and the like;
the calculation module is used for calculating current sitting posture data according to the human body feature key point data if the human body feature key point data of the user is identified;
the current sitting posture data comprises a current head inclination angle, a current shoulder inclination angle, a height difference value between a current neck and a face, and a height difference value between a current shoulder and a face;
and the comparison and judgment module is used for comparing the current sitting posture data with the standard sitting posture data and judging whether the current sitting posture is abnormal.
In this embodiment, the system further includes:
the learning acquisition module is used for inputting standard sitting posture images, performing big data training through a machine learning supervised learning classification algorithm and acquiring standard sitting posture data; the standard sitting posture data comprise a standard head inclination angle threshold value, a standard shoulder inclination angle threshold value, a standard eye-use over-close difference value ratio threshold value and a standard lying table difference value ratio threshold value.
In this embodiment, the system further includes:
and the reminding module is used for sending out reminding information in real time when the current sitting posture is judged to be abnormal.
The sitting posture real-time intelligent determination system provided by the second embodiment of the invention is used for executing the sitting posture real-time intelligent determination method of the first embodiment; its working principle and beneficial effects correspond one-to-one with those of the method, so they are not repeated here.
EXAMPLE III
The invention also provides a sitting posture real-time intelligent judgment device, which comprises: the sitting posture real-time intelligent determination method comprises at least one processor and a memory which is in communication connection with the at least one processor, wherein the memory stores instructions which can be executed by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor can execute the sitting posture real-time intelligent determination method of the first embodiment.
Example four
The invention also provides a computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions for enabling a computer to execute the sitting posture real-time intelligent determination method according to the first embodiment.
The sitting posture real-time intelligent judgment method, system, equipment and storage medium of the invention acquire a current sitting posture image of the user in real time, identify the human body feature key point data of the user, calculate the current sitting posture data from the key point data, and compare the current sitting posture data with the standard sitting posture data to judge whether the current sitting posture is abnormal.
The invention is suitable for intelligently judging the sitting posture of a child in real time, and addresses parents' concerns about keeping the child's sitting posture standard and protecting healthy growth during study.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A sitting posture real-time intelligent judgment method is characterized by comprising the following steps:
acquiring a current sitting posture image of a user in real time, identifying human body characteristic key point data of the user, and if the human body characteristic key point data of the user cannot be identified, considering that the sitting posture is abnormal;
the human body feature key point data comprises an eye coordinate, a mouth coordinate, a neck coordinate and a shoulder coordinate;
if the human body feature key point data of the user is identified, calculating the current sitting posture data according to the human body feature key point data;
the current sitting posture data comprises a current head inclination angle, a current shoulder inclination angle, a height difference value between a current neck and a face, and a height difference value between a current shoulder and the face;
and comparing the current sitting posture data with the standard sitting posture data to judge whether the current sitting posture is abnormal.
2. The method for intelligently distinguishing the sitting posture in real time according to claim 1, wherein if the human body feature key point data of the user is identified, calculating the current sitting posture data according to the human body feature key point data specifically comprises:
calculating the current head inclination angle of the user according to the mouth coordinate and the neck coordinate;
calculating the current shoulder inclination angle of the user according to shoulder coordinates, wherein the shoulder coordinates comprise a left shoulder coordinate and a right shoulder coordinate;
calculating a height difference value between the current neck and the face of the user according to the eye coordinate, the mouth coordinate and the neck coordinate;
and calculating the height difference between the current shoulder and the face of the user according to the eye coordinate, the mouth coordinate and the shoulder coordinate.
3. The method for intelligently judging the sitting posture in real time according to claim 2, wherein the step of comparing the current sitting posture data with the standard sitting posture data to judge whether the current sitting posture is abnormal specifically comprises the steps of:
comparing the current head inclination angle of the user with a standard head inclination angle threshold value, and judging whether the head inclination is abnormal or not;
comparing the current shoulder inclination angle of the user with a standard shoulder inclination angle threshold value, and judging whether the shoulder inclination is abnormal or not;
calculating the ratio of the height difference between the current neck and the face of the user to the height difference between the standard neck and the face of the user to serve as a first ratio, comparing the first ratio with a standard eye-use nearness difference ratio threshold value, and judging whether the user uses eyes too close;
and calculating the ratio of the height difference between the current shoulder and the face of the user to the height difference between the standard shoulder and the face of the user to serve as a second ratio, comparing the second ratio with a standard lying table difference ratio threshold value, and judging whether the user lies on the table.
4. The real-time intelligent sitting posture distinguishing method according to any one of claims 1 to 3, wherein before the step of collecting images of the current sitting posture of the user in real time and identifying key point data of human body features of the user, the method further comprises the following steps of:
inputting a standard sitting posture image, performing big data training through a machine learning supervised learning classification algorithm, and acquiring standard sitting posture data, wherein the standard sitting posture data comprises a standard head inclination angle threshold value, a standard shoulder inclination angle threshold value, a standard eye use over-close difference value ratio threshold value and a standard lying table difference value ratio threshold value.
5. The real-time intelligent sitting posture distinguishing method according to claim 4, further comprising the following steps of: and when the current sitting posture is judged to be abnormal, sending out reminding information in real time.
6. A real-time intelligent sitting posture distinguishing system is characterized by comprising:
the acquisition and identification module is used for acquiring the current sitting posture image of the user in real time, identifying the human body characteristic key point data of the user, and considering that the sitting posture is abnormal if the human body characteristic key point data of the user cannot be identified;
the human body feature key point data comprises an eye coordinate, a mouth coordinate, a neck coordinate and a shoulder coordinate;
the calculation module is used for calculating current sitting posture data according to the human body feature key point data if the human body feature key point data of the user is identified;
the current sitting posture data comprises a current head inclination angle, a current shoulder inclination angle, a height difference value between a current neck and a face, and a height difference value between a current shoulder and the face;
and the comparison and judgment module is used for comparing the current sitting posture data with the standard sitting posture data and judging whether the current sitting posture is abnormal.
7. The system according to claim 6, further comprising:
the learning acquisition module is used for inputting a standard sitting posture image, performing big data training through a machine learning supervision learning classification algorithm and acquiring standard sitting posture data, wherein the standard sitting posture data comprises a standard head inclination angle threshold value, a standard shoulder inclination angle threshold value, a standard eye-used over-eye difference value ratio threshold value and a standard lying table difference value ratio threshold value.
8. The system according to claim 6 or 7, wherein the system further comprises:
and the reminding module is used for sending out reminding information in real time when the current sitting posture is judged to be abnormal.
9. A sitting posture real-time intelligent determination device, characterized in that it comprises:
at least one processor, and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by at least one processor to enable the at least one processor to perform the sitting posture real-time intelligent discrimination method of any one of claims 1 to 5.
10. A computer-readable storage medium storing computer-executable instructions for causing a computer to perform the sitting posture real-time intelligent determination method as claimed in any one of claims 1 to 5.
CN201910006352.0A 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium Active CN111414780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910006352.0A CN111414780B (en) 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910006352.0A CN111414780B (en) 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111414780A true CN111414780A (en) 2020-07-14
CN111414780B CN111414780B (en) 2023-08-01

Family

ID=71492572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910006352.0A Active CN111414780B (en) 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111414780B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111931640A (en) * 2020-08-07 2020-11-13 上海商汤临港智能科技有限公司 Abnormal sitting posture identification method and device, electronic equipment and storage medium
CN112287795A (en) * 2020-10-22 2021-01-29 北京百度网讯科技有限公司 Abnormal driving posture detection method, device, equipment, vehicle and medium
CN112364694A (en) * 2020-10-13 2021-02-12 宁波大学 Human body sitting posture identification method based on key point detection
CN112617815A (en) * 2020-12-17 2021-04-09 深圳数联天下智能科技有限公司 Sitting posture assessment method and device, computer equipment and storage medium
CN112712053A (en) * 2021-01-14 2021-04-27 深圳数联天下智能科技有限公司 Sitting posture information generation method and device, terminal equipment and storage medium
CN113052097A (en) * 2021-03-31 2021-06-29 开放智能机器(上海)有限公司 Human body sitting posture real-time monitoring system and monitoring method
CN113554609A (en) * 2021-07-19 2021-10-26 同济大学 Neck dystonia identification system based on vision
CN113657271A (en) * 2021-08-17 2021-11-16 上海科技大学 Sitting posture detection method and system combining quantifiable factors and non-quantifiable factors for judgment
CN113780220A (en) * 2021-09-17 2021-12-10 东胜神州旅游管理有限公司 Child sitting posture detection method and system based on child face recognition
CN114038016A (en) * 2021-11-16 2022-02-11 平安普惠企业管理有限公司 Sitting posture detection method, device, equipment and storage medium
CN115909394A (en) * 2022-10-25 2023-04-04 珠海视熙科技有限公司 Sitting posture identification method and device, intelligent desk lamp and computer storage medium
CN116884083A (en) * 2023-06-21 2023-10-13 圣奥科技股份有限公司 Sitting posture detection method, medium and equipment based on key points of human body

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096801A (en) * 2009-12-14 2011-06-15 北京中星微电子有限公司 Sitting posture detecting method and device
CN103488980A (en) * 2013-10-10 2014-01-01 广东小天才科技有限公司 Sitting posture judging method and device based on camera
CN104217554A (en) * 2014-09-19 2014-12-17 武汉理工大学 Reminding system and method for health study posture for student
CN104850820A (en) * 2014-02-19 2015-08-19 腾讯科技(深圳)有限公司 Face identification method and device
CN107153829A (en) * 2017-06-09 2017-09-12 南昌大学 Incorrect sitting-pose based reminding method and device based on depth image
CN107392146A (en) * 2017-07-20 2017-11-24 湖南科乐坊教育科技股份有限公司 A kind of child sitting gesture detection method and device
CN107491751A (en) * 2017-08-14 2017-12-19 成都伞森科技有限公司 Sitting posture analysis method and device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096801A (en) * 2009-12-14 2011-06-15 北京中星微电子有限公司 Sitting posture detecting method and device
CN103488980A (en) * 2013-10-10 2014-01-01 广东小天才科技有限公司 Sitting posture judging method and device based on camera
CN104850820A (en) * 2014-02-19 2015-08-19 腾讯科技(深圳)有限公司 Face identification method and device
CN104217554A (en) * 2014-09-19 2014-12-17 武汉理工大学 Reminding system and method for health study posture for student
CN107153829A (en) * 2017-06-09 2017-09-12 南昌大学 Incorrect sitting-pose based reminding method and device based on depth image
CN107392146A (en) * 2017-07-20 2017-11-24 湖南科乐坊教育科技股份有限公司 A kind of child sitting gesture detection method and device
CN107491751A (en) * 2017-08-14 2017-12-19 成都伞森科技有限公司 Sitting posture analysis method and device

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111931640B (en) * 2020-08-07 2022-06-10 上海商汤临港智能科技有限公司 Abnormal sitting posture identification method and device, electronic equipment and storage medium
CN111931640A (en) * 2020-08-07 2020-11-13 上海商汤临港智能科技有限公司 Abnormal sitting posture identification method and device, electronic equipment and storage medium
CN112364694A (en) * 2020-10-13 2021-02-12 宁波大学 Human body sitting posture identification method based on key point detection
CN112364694B (en) * 2020-10-13 2023-04-18 宁波大学 Human body sitting posture identification method based on key point detection
CN112287795A (en) * 2020-10-22 2021-01-29 北京百度网讯科技有限公司 Abnormal driving posture detection method, device, equipment, vehicle and medium
CN112287795B (en) * 2020-10-22 2023-09-01 北京百度网讯科技有限公司 Abnormal driving gesture detection method, device, equipment, vehicle and medium
CN112617815A (en) * 2020-12-17 2021-04-09 深圳数联天下智能科技有限公司 Sitting posture assessment method and device, computer equipment and storage medium
CN112712053A (en) * 2021-01-14 2021-04-27 深圳数联天下智能科技有限公司 Sitting posture information generation method and device, terminal equipment and storage medium
CN113052097A (en) * 2021-03-31 2021-06-29 开放智能机器(上海)有限公司 Human body sitting posture real-time monitoring system and monitoring method
CN113554609A (en) * 2021-07-19 2021-10-26 同济大学 Neck dystonia identification system based on vision
CN113657271A (en) * 2021-08-17 2021-11-16 上海科技大学 Sitting posture detection method and system combining quantifiable factors and non-quantifiable factors for judgment
CN113657271B (en) * 2021-08-17 2023-10-03 上海科技大学 Sitting posture detection method and system combining quantifiable factors and unquantifiable factor judgment
CN113780220A (en) * 2021-09-17 2021-12-10 东胜神州旅游管理有限公司 Child sitting posture detection method and system based on child face recognition
CN114038016A (en) * 2021-11-16 2022-02-11 平安普惠企业管理有限公司 Sitting posture detection method, device, equipment and storage medium
CN115909394A (en) * 2022-10-25 2023-04-04 珠海视熙科技有限公司 Sitting posture identification method and device, intelligent desk lamp and computer storage medium
CN115909394B (en) * 2022-10-25 2024-04-05 珠海视熙科技有限公司 Sitting posture identification method and device, intelligent table lamp and computer storage medium
CN116884083A (en) * 2023-06-21 2023-10-13 圣奥科技股份有限公司 Sitting posture detection method, medium and equipment based on key points of human body

Also Published As

Publication number Publication date
CN111414780B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN111414780B (en) Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium
CN105718869B (en) The method and apparatus of face face value in a kind of assessment picture
CN110472481B (en) Sleeping gesture detection method, device and equipment
CN105740779B (en) Method and device for detecting living human face
CN110287790B (en) Learning state hybrid analysis method oriented to static multi-user scene
CN105608448B (en) A kind of LBP feature extracting method and device based on face&#39;s key point
CN102013011B (en) Front-face-compensation-operator-based multi-pose human face recognition method
CN104573634A (en) Three-dimensional face recognition method
CN104574321A (en) Image correction method and device and video system
CN107103309A (en) A kind of sitting posture of student detection and correcting system based on image recognition
WO2017161734A1 (en) Correction of human body movements via television and motion-sensing accessory and system
CN111079625A (en) Control method for camera to automatically rotate along with human face
CN111046825A (en) Human body posture recognition method, device and system and computer readable storage medium
CN102831408A (en) Human face recognition method
CN110148092A (en) The analysis method of teenager&#39;s sitting posture based on machine vision and emotional state
CN105389570A (en) Face angle determination method and system
CN109829354B (en) Face recognition method based on deep learning
CN112712053A (en) Sitting posture information generation method and device, terminal equipment and storage medium
CN103544478A (en) All-dimensional face detection method and system
CN108304831A (en) A kind of method and device that monitoring worker safety helmet is worn
US20230237694A1 (en) Method and system for detecting children&#39;s sitting posture based on face recognition of children
CN112784786A (en) Human body posture recognition method and device
CN103020589A (en) Face recognition method for single training sample
CN109241822A (en) A kind of multi-faceted method for detecting human face and system based on MTCNN
CN113609963A (en) Real-time multi-human-body-angle smoking behavior detection method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 w601, Shenzhen Hong Kong industry university research base, 015 Gaoxin South 7th Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: ASPIRE TECHNOLOGIES (SHENZHEN) LTD.

Address before: 518000 south wing, 6th floor, west block, Shenzhen Hong Kong industry university research base building, South District, high tech Industrial Park, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: ASPIRE TECHNOLOGIES (SHENZHEN) LTD.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant