CN117555427B - Interaction control method, device and storage medium based on human body posture - Google Patents

Interaction control method, device and storage medium based on human body posture

Info

Publication number
CN117555427B
Authority
CN
China
Prior art keywords
information
limb segment
central axis
gesture
detection time
Prior art date
Legal status
Active
Application number
CN202410041327.7A
Other languages
Chinese (zh)
Other versions
CN117555427A (en)
Inventor
孙小玄
吴鄂
金元
Current Assignee
Aimeng Sleep Zhuhai Intelligent Technology Co ltd
Original Assignee
Aimeng Sleep Zhuhai Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Aimeng Sleep Zhuhai Intelligent Technology Co ltd filed Critical Aimeng Sleep Zhuhai Intelligent Technology Co ltd
Priority to CN202410041327.7A
Publication of CN117555427A
Application granted
Publication of CN117555427B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides an interaction control method, device and storage medium based on human body posture. The method comprises the following steps: acquiring pressure map data of a target object at each detection time point; detecting, based on the pressure map data at any detection time point, the position information and sleeping posture information of each limb segment of the target object, and determining the central axis of each limb segment; analyzing the sleeping posture information, the position information of each limb segment and the central axes corresponding to all detection time points to obtain gesture recognition information of the target object; and if the gesture recognition information conforms to the interaction gesture information, generating a control instruction corresponding to the interaction gesture information so as to control an intelligent device based on the control instruction. Because the target posture and action are detected from the sleeping posture information, the position information of each limb segment and the central-axis-related features, the accuracy of interaction gesture recognition is effectively improved, and the intelligent device is accurately controlled based on the control instruction corresponding to the interaction gesture.

Description

Interaction control method, device and storage medium based on human body posture
Technical Field
The invention relates to the technical field of intelligent device control, and in particular to an interaction control method, device and storage medium based on human body posture.
Background
With the rapid development of science and technology, people's demands on quality of life have gradually shifted from meeting basic needs to pursuing more intelligent and personalized experiences. Sleep is an important component of human physiological and psychological health, so intelligent in-bed technology is attracting growing attention and demand.
At present, some intelligent beds or bedding products already provide a series of simple and practical intelligent control functions. For example, when a person gets into bed, an intelligent system at the head of the bed can sense this and automatically close the curtains and dim the lights, creating a more comfortable sleeping environment; conversely, when the user leaves the bed, the system can automatically open the curtains and brighten the lights to provide a brighter wake-up experience. Such intelligent control relies mainly on an in-bed sensor capturing simple actions such as leaving and entering the bed, which is prone to misrecognition, so the control system may execute incorrect operations under certain conditions.
Disclosure of Invention
The invention provides an interaction control method, device and storage medium based on human body posture, and aims to solve the technical problem that simple capture of actions such as leaving and entering the bed by an in-bed sensor is prone to misrecognition, causing the control system to execute incorrect operations under certain conditions.
The invention provides an interaction control method based on human body gestures, which comprises the following steps:
acquiring pressure map data of each detection time point of the target object in a preset time window;
detecting, based on the pressure map data at any detection time point, sleeping posture information of the target object at that detection time point and position information of each limb segment, and determining the central axis of each limb segment;
analyzing the sleeping posture information, the position information of each limb segment and the central axes corresponding to all detection time points to obtain gesture recognition information of the target object;
and if the gesture recognition information accords with the preset interaction gesture information, generating a control instruction corresponding to the interaction gesture information so as to control the intelligent equipment based on the control instruction.
According to the interaction control method based on the human body gesture provided by the invention, gesture recognition information of the target object is obtained by analysis based on sleeping gesture information corresponding to all detection time points, position information of each limb segment and a central axis, and the method comprises the following steps:
determining a plurality of characteristic information of any detection time point based on sleeping gesture information of any detection time point, central axes corresponding to each limb segment and position information;
forming a time feature sequence corresponding to any feature information based on any feature information of all detection time points;
and determining the gesture recognition information of the target object based on each time feature sequence.
According to the interaction control method based on the human body gesture provided by the invention, the method for determining a plurality of characteristic information of any detection time point based on sleeping gesture information of any detection time point, central axis and position information corresponding to each limb segment comprises the following steps:
For any of the detection time points:
Determining pressure average value characteristics of each limb segment based on the pressure map data of the detection time points;
determining the position characteristics of each limb segment based on the position information of each limb segment;
determining central axis curvature change characteristics of each limb segment and included angle characteristics between the central axes based on the central axes corresponding to each limb segment;
and forming a plurality of characteristic information of the detection time points based on the sleeping posture information, the pressure average value characteristic, the position characteristic, the central axis curvature change characteristic and the included angle characteristic between the central axes of each limb segment.
According to the interactive control method based on the human body posture provided by the invention, the central axis curvature change characteristics of each limb segment are determined based on the central axis corresponding to each limb segment, and the interactive control method comprises the following steps:
For any central axis corresponding to the limb segment:
according to a preset constructed coordinate system, determining the coordinate value of each point in the central axis;
determining a curvature value corresponding to each point based on the coordinate value of each point;
And determining the central axis curvature change characteristic of the limb segment based on the curvature value corresponding to each point.
According to the interaction control method based on the human body gesture provided by the invention, before generating the control instruction corresponding to the interaction gesture information if the gesture recognition information accords with the preset interaction gesture information, the method further comprises the following steps:
respectively determining the feature similarity between each time feature sequence in the gesture recognition information and each target feature sequence in the interaction gesture information;
determining a first mean value corresponding to any one of the time feature sequences and a second mean value corresponding to any one of the target feature sequences;
Determining an absolute deviation between any time feature sequence and a corresponding target feature sequence based on a first mean value corresponding to any time feature sequence and a second mean value corresponding to any target feature sequence;
Determining a correlation coefficient based on the absolute deviation and the feature similarity corresponding to each time feature sequence;
If the correlation coefficient is larger than a preset correlation threshold, judging that the gesture recognition information accords with the interaction gesture information;
And if the correlation coefficient is not greater than the correlation threshold, judging that the gesture recognition information does not accord with the interaction gesture information.
According to the interactive control method based on the human body posture provided by the invention, the method for determining the central axis of each limb segment comprises the following steps:
Performing edge extraction on the pressure map data to obtain pressure edge information of each limb segment;
performing linear fitting on the pressure edge information of any limb segment to obtain the central axis of that limb segment; or
Performing linear fitting on the pressure edge information of any limb segment to obtain a reference line of any limb segment;
determining a normal corresponding to any reference line;
Determining each intersection point coordinate corresponding to the normal line and pressure edge information of any limb segment;
And forming the central axis of any limb segment based on the median value between the intersection point coordinates.
According to the interaction control method based on human body posture provided by the invention, the detection of the sleeping posture information of the target object at any detection time point and the position information of each limb segment based on the pressure map data at any detection time point comprises the following steps:
For any of the detection time points:
extracting features of the pressure map data of the detection time points to obtain a plurality of candidate areas with different sizes;
normalizing the candidate areas with different sizes to obtain target candidate areas;
and carrying out category identification on the target candidate region to obtain the position information and the sleeping posture information of each limb segment of the target object.
According to the interaction control method based on human body posture provided by the invention, before obtaining pressure map data of each detection time point in a preset time window, the method further comprises the following steps:
Acquiring target pressure map data of a target object in a preset time window;
Analyzing target pose information of the target object based on the target pressure map data;
And if the target gesture information accords with the preset awakening gesture information, activating an interactive control system.
The invention also provides an interaction control device based on the human body posture, which comprises:
The acquisition module is used for acquiring pressure map data of each detection time point of the target object in a preset time window;
The detection module is used for detecting and obtaining sleeping posture information of the target object at any detection time point and position information of each limb segment based on pressure map data of any detection time point, and determining the central axis of each limb segment;
the analysis module is used for analyzing and obtaining the gesture recognition information of the target object based on the sleeping gesture information, the position information of each limb segment and the central axis corresponding to all the detection time points;
And the control module is used for generating a control instruction corresponding to the interaction gesture information if the gesture recognition information accords with the preset interaction gesture information so as to control the intelligent equipment based on the control instruction.
The invention also provides electronic equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the interaction control method based on the human body gesture when executing the program.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a human gesture based interactive control method as described in any of the above.
The invention also provides a computer program product comprising a computer program which when executed by a processor implements the human body posture based interactive control method as described in any of the above.
The invention provides a human body posture-based interaction control method, equipment and a storage medium, which comprise the following steps: acquiring pressure map data of each detection time point of the target object in a preset time window; detecting sleeping posture information of the target object at any detection time point and position information of each limb segment based on pressure map data of any detection time point, and determining the central axis of each limb segment; analyzing and obtaining gesture identification information of the target object based on sleeping gesture information corresponding to all detection time points, position information of each limb segment and a central axis; and if the gesture recognition information accords with the preset interaction gesture information, generating a control instruction corresponding to the interaction gesture information so as to control the intelligent equipment based on the control instruction. According to the intelligent device, the position information and the sleeping posture information of each limb segment at any detection time point are detected based on the pressure map data of each detection time point in the preset time window, the central axis of each limb segment is extracted, and then the position information, the sleeping posture information and the central axis related characteristics in the preset time window are used for detecting the target posture and the action, so that the accuracy of identifying the interaction posture can be effectively improved, and the intelligent device is accurately and interactively controlled based on the control instruction corresponding to the interaction posture.
Drawings
In order to more clearly illustrate the invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below show some embodiments of the invention, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic flow chart of a human body gesture-based interactive control method provided by the invention;
FIG. 2 is pressure map data for a user supine provided by an embodiment of the present invention;
FIG. 3 is a schematic view of pressure edge information extracted from a human body and central axes of respective limb segments according to an embodiment of the present invention;
FIG. 4 is a second flow chart of the interactive control method based on human body gesture provided by the invention;
FIG. 5 is a third flow chart of the interactive control method based on human body gesture provided by the invention;
FIG. 6 is a fourth flow chart of the human gesture-based interactive control method provided by the invention;
FIG. 7 is a fifth flow chart of the human gesture-based interactive control method provided by the present invention;
FIG. 8 is a flowchart of a human body gesture-based interactive control method provided by the present invention;
FIG. 9 is a flow chart of a human body gesture-based interactive control method provided by the invention;
FIG. 10 is a flowchart eighth of the human body gesture-based interactive control method provided by the present invention;
FIG. 11 is a flowchart of a human body gesture-based interactive control method provided by the present invention;
FIG. 12 is a schematic diagram of the structure of the human body posture-based interactive control device provided by the invention;
Fig. 13 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used in the one or more embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in one or more embodiments of the invention, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present invention refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the invention to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first may also be referred to as a second, and similarly, a second may also be referred to as a first, without departing from the scope of one or more embodiments of the invention. The word "if" as used herein may be interpreted as "when" or "upon", depending on the context.
Fig. 1 is a schematic flow chart of an interaction control method based on human body gestures. As shown in fig. 1, the human body gesture-based interaction control method includes:
Step S11, obtaining pressure map data of each detection time point of a target object in a preset time window;
It should be noted that the pressure map data is acquired by a pressure sensor array disposed on the intelligent mattress; that is, a plurality of sensors are distributed at different positions of the intelligent mattress so that the pressure value at each position on the mattress can be measured. Optionally, the resolution of the pressure sensor array is 64×16 with 8-bit data precision; more preferably, a pressure sensor array with a resolution of 128×32 and 16-bit data precision is selected. In addition, the preset time window and the detection time points can be customized according to the actual situation; for example, with the preset time window set to 4 seconds and detection performed once per second, the detection time points are the 1st, 2nd, 3rd and 4th seconds. Specifically, pressure map data of the pressure sensor array at each detection time point is obtained; see fig. 2, which shows pressure map data of a user lying supine according to an embodiment of the present invention.
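For illustration only, the sampling described above can be pictured as the following minimal sketch, assuming a 64×16 8-bit array read once per second; read_pressure_frame is a hypothetical stand-in for the sensor driver, not part of the patent:

```python
from collections import deque
import numpy as np

WINDOW_SECONDS = 4          # preset time window
ARRAY_SHAPE = (64, 16)      # assumed pressure sensor array resolution

def read_pressure_frame() -> np.ndarray:
    """Placeholder for one 64x16 pressure map read from the mattress."""
    return np.random.randint(0, 256, ARRAY_SHAPE, dtype=np.uint8)  # 8-bit data

frames = deque(maxlen=WINDOW_SECONDS)      # keeps the latest detection time points
for _ in range(WINDOW_SECONDS):
    frames.append(read_pressure_frame())   # one frame per detection time point

pressure_window = np.stack(frames)         # shape: (4, 64, 16)
```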
Step S12, detecting and obtaining sleeping posture information of the target object at any detection time point and position information of each limb segment based on pressure map data of any detection time point, and determining the central axis of each limb segment;
The sleeping posture information includes postures such as supine and side-lying, and the limb segments include the arms, the legs and the trunk. The central axis of a limb segment is an axis formed based on the edge information of that limb segment.
Specifically, the following steps are performed for the pressure map data at any one of the detection time points: in an embodiment, feature extraction is performed on the pressure map data, and then category identification is performed on the extracted features, so that position information and sleeping posture information of each limb segment of the target object are obtained. In another embodiment, the pressure map data is subjected to feature extraction to obtain a plurality of candidate areas with different sizes, and the sizes of the candidate areas can be set according to actual conditions, so that features of target objects with different sizes can be captured better, and the detection accuracy can be improved. And further carrying out normalization processing on candidate areas with different sizes to obtain target candidate areas, and further carrying out category identification on the target candidate areas to obtain the position information and the sleeping posture information of each limb segment of the target object.
Additionally, referring to fig. 3, which is a schematic diagram of the pressure edge information extracted from a human body and the central axes of each limb segment according to an embodiment of the present invention, edge extraction is performed on the pressure map data to obtain the pressure edge information of each limb segment, for example using high-pass filtering methods such as the Prewitt operator or the Roberts operator; the pressure edge information of any limb segment can then be linearly fitted to calculate the central axis of that limb segment.
Step S13, analyzing and obtaining gesture recognition information of the target object based on sleeping gesture information corresponding to all detection time points, position information of each limb segment and a central axis;
Specifically, for any one of the detection time points: the pressure mean feature within any limb segment is calculated based on the pressure map data; the position features of each limb segment are calculated based on the position information of each limb segment; and, based on the central axes corresponding to the limb segments, the included angle features between the central axes and the central axis curvature change features are calculated, where the curvature change feature can be determined from the curvature of each point on the central axis. The sleeping posture information, the pressure mean feature, the position feature, the central axis curvature change feature and the included angle features between the central axes of each limb segment together form the multi-dimensional feature information at that detection time point. Further, based on the feature information detected at all detection time points, time feature sequences corresponding to the respective features are formed, so that the gesture recognition information of the target object is determined based on these time feature sequences.
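A minimal sketch of how the per-time-point feature vectors can be turned into one time feature sequence per feature dimension (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def build_time_feature_sequences(per_frame_features):
    """Turn a list of per-detection-time-point feature vectors into one time
    sequence per feature dimension (e.g. the left-leg angle over the window)."""
    stacked = np.stack(per_frame_features)      # shape: (T, num_features)
    return [stacked[:, k] for k in range(stacked.shape[1])]
```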
Step S14, if the gesture recognition information accords with preset interaction gesture information, generating a control instruction corresponding to the interaction gesture information so as to control the intelligent equipment based on the control instruction.
The interaction gesture information is associated with a number of target feature sequences corresponding to interaction gestures, for example target feature sequences for gestures such as rolling over in bed, repeatedly tapping the bed with the left or right hand, or tapping with the hands alternately. The intelligent devices include, but are not limited to, smart speakers, mobile phones, tablets, air conditioners, lighting and the like. The user can enter one or more pieces of interaction gesture information in advance, and different interaction gesture information is associated with different control instructions.
Specifically, it is determined whether the gesture recognition information conforms to the preset interaction gesture information. Optionally, in one embodiment, the similarity between each time feature sequence in the gesture recognition information and the corresponding target feature sequence may be calculated, and whether the gesture recognition information conforms to the interaction gesture information is judged based on the similarity. In another embodiment, the absolute deviation between each time feature sequence and the corresponding target feature sequence can also be calculated, where the absolute deviation is determined from the feature mean of the time feature sequence and the feature mean of the target feature sequence, so that the absolute deviation and the similarity are combined to judge whether the gesture recognition information conforms to the interaction gesture information.
Further, if the gesture recognition information conforms to the preset interaction gesture information, a control instruction corresponding to the interaction gesture information is generated and sent to an intelligent control center to control the intelligent device. For example, if the left or right hand repeatedly taps the bed, a temperature-adjustment instruction for the air conditioner is generated to adjust the air-conditioning temperature.
According to the embodiment of the invention, the position information and the sleeping posture information of each limb segment at any detection time point are detected based on the pressure map data of each detection time point in the preset time window, the central axis of each limb segment is extracted, and then the position information, the sleeping posture information and the central axis related characteristics of each detection time point are used for detecting the target posture and the action, so that the accuracy of the recognition of the interaction posture can be effectively improved, and the intelligent equipment is accurately interactively controlled based on the control instruction corresponding to the interaction posture.
Referring to fig. 4, fig. 4 is a second schematic flow chart of the human body gesture-based interaction control method provided by the present invention; in one embodiment of the present invention, analyzing to obtain the gesture recognition information of the target object based on the sleeping gesture information, the position information of each limb segment and the central axis corresponding to all the detection time points includes:
Step S21, determining a plurality of characteristic information of any detection time point based on sleeping posture information of any detection time point, central axes corresponding to each limb segment and position information;
Step S22, forming a time feature sequence corresponding to any feature information based on any feature information of all detection time points;
step S23, determining the gesture recognition information of the target object based on each of the time feature sequences.
It should be noted that the multi-dimensional feature information includes features such as the pressure mean feature, the position information and the feature corresponding to the sleeping posture information of each limb segment; to identify the motion of the target object more accurately, the feature information may further include the included angle features between the central axes and the central axis curvature change features. Specifically, based on the pressure map data at the detection time point, the pressure mean feature of each limb segment is determined; based on the position information of each limb segment, the position feature of each limb segment is determined; and based on the central axis corresponding to each limb segment, the central axis curvature change feature of each limb segment and the included angle features between the central axes are calculated. Optionally, the central axis curvature change feature is described in the following embodiments and is not repeated here. Referring to fig. 3, the included angle features are θ1, the angle between the central axis of the right arm and the central axis of the torso; θ2, between the central axis of the left arm and the torso; θ3, between the central axis of the right leg and the torso; and θ4, between the central axis of the left leg and the torso. Further, for any one piece of feature information, a time feature sequence corresponding to that feature is formed from the values detected at all detection time points; for example, with a preset time window of 5 seconds, the feature information of the left leg is calculated every second to form the time feature sequence corresponding to the left leg. The gesture recognition information of the target object is then determined based on each of the time feature sequences.
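As a sketch of the included-angle features θ1 to θ4, assuming each central axis is available as an ordered (N, 2) array of points; taking the acute angle between axis directions is one reasonable reading, since axis orientation is arbitrary:

```python
import numpy as np

def axis_direction(axis_points: np.ndarray) -> np.ndarray:
    """Unit direction of a central axis given as an (N, 2) array of points."""
    d = axis_points[-1].astype(float) - axis_points[0].astype(float)
    return d / np.linalg.norm(d)

def included_angle_deg(axis_a: np.ndarray, axis_b: np.ndarray) -> float:
    """Included angle between two central axes, in degrees (0 to 90)."""
    cos_t = abs(float(np.dot(axis_direction(axis_a), axis_direction(axis_b))))
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

# theta_1 .. theta_4: each limb central axis (L1..L4) against the torso axis (L5)
# e.g. theta_1 = included_angle_deg(right_arm_axis, torso_axis)
```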
According to the embodiment of the invention, the multidimensional characteristic information is obtained through calculation based on the sleeping gesture information and the central axis and position information corresponding to each limb segment, and further, the time characteristic sequence corresponding to each characteristic information is formed based on the characteristic information recognized for many times, so that the final gesture recognition information of the target object is determined based on each time characteristic sequence, the detection of the target gesture and the action by the multidimensional characteristic is realized, and the accuracy of the interactive gesture recognition is effectively improved.
Referring to fig. 5, fig. 5 is a third flow chart of the interactive control method based on human body gesture provided by the present invention. In one embodiment of the present invention, determining the plurality of feature information of any one of the detection time points based on the sleeping posture information of any one of the detection time points, the central axis corresponding to each limb segment, and the position information includes:
For any of the detection time points:
Step S31, determining the pressure average value characteristic of each limb segment based on the pressure map data of the detection time point;
step S32, determining the position characteristics of each limb segment based on the position information of each limb segment;
Step S33, determining the central axis curvature change characteristic of each limb segment and the included angle characteristic between the central axes based on the central axes corresponding to each limb segment;
And step S34, forming a plurality of characteristic information of the detection time points based on the sleeping posture information, the pressure average value characteristic, the position characteristic, the central axis curvature change characteristic and the included angle characteristic between the central axes of each limb segment.
Specifically, the following steps are performed for any of the detection time points: the pressure value of each point within any limb segment is extracted from the pressure map data, the mean of these pressure values is calculated, and this pressure mean is taken as the pressure mean feature of that limb segment. Furthermore, the position features of each limb segment are calculated based on the position information of each limb segment.
Additionally, the included angle features between the central axes are calculated based on the central axes corresponding to the limb segments. A rectangular coordinate system can be constructed, for example with the lower-left corner as the coordinate origin, the positive x-axis pointing right and the positive y-axis pointing up, so as to determine the coordinate value of each point on the central axis corresponding to any limb segment; the curvature value corresponding to each point is then calculated from these coordinate values, and the curvature value with the largest absolute value is selected as the central axis curvature change feature of that limb segment.
Further, based on the sleeping posture information, the pressure average value characteristic, the position characteristic, the central axis curvature change characteristic and the included angle characteristic between the central axes of each limb segment, the multi-dimensional characteristic information of the detection time point is formed.
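A sketch of assembling the multi-dimensional feature information for one detection time point, assuming limb segments are given as boolean masks over the pressure map (all names are illustrative):

```python
import numpy as np

def pressure_mean_feature(pressure_map: np.ndarray, segment_mask: np.ndarray) -> float:
    """Mean pressure over the points belonging to one limb segment."""
    return float(pressure_map[segment_mask].mean())

def frame_features(pressure_map, segment_masks, positions, sleep_posture_code,
                   curvature_features, angle_features):
    """Assemble the multi-dimensional feature vector for one detection time point."""
    means = [pressure_mean_feature(pressure_map, m) for m in segment_masks]
    return np.concatenate([
        [sleep_posture_code],      # e.g. 0 = supine, 1 = side-lying (assumed coding)
        means,                     # pressure mean feature per limb segment
        np.ravel(positions),       # position features of each limb segment
        curvature_features,        # central-axis curvature change per segment
        angle_features,            # theta_1 .. theta_4 between central axes
    ])
```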
According to the embodiment of the invention, the information such as the sleeping gesture information, the central axis corresponding to each limb segment and the position information are combined to determine the multidimensional characteristic information such as the pressure mean value characteristic, the position characteristic, the central axis curvature change characteristic, the included angle characteristic and the like, so that a plurality of types of gesture actions are identified by utilizing a plurality of characteristic information, and the accuracy of the interactive gesture identification is effectively improved.
Referring to fig. 6, fig. 6 is a schematic flow chart of an interaction control method based on human body gesture provided by the invention. In one embodiment of the present invention, determining the central axis curvature change feature of each limb segment based on the central axis corresponding to each limb segment includes:
For any central axis corresponding to the limb segment:
step S41, determining coordinate values of each point in the central axis according to a preset constructed coordinate system;
Step S42, determining a curvature value corresponding to each point based on the coordinate value of each point;
and step S43, determining the central axis curvature change characteristic of the limb segment based on the curvature value corresponding to each point.
The coordinate system is constructed with the lower-left vertex of the pressure map data as the coordinate origin, the positive x-axis pointing right and the positive y-axis pointing up. Specifically, the following steps are performed for the central axis corresponding to any limb segment: according to the preset coordinate system, the coordinate value of each point on the central axis is determined, and the curvature value corresponding to each point is then calculated from these coordinate values. The curvature is calculated as:

$$\kappa(i)=\frac{\left|x'(i)\,y''(i)-y'(i)\,x''(i)\right|}{\left(x'(i)^{2}+y'(i)^{2}\right)^{3/2}}$$

where x'(i) and y'(i) are the first derivatives of the i-th point along the x-axis and y-axis, respectively, and x''(i) and y''(i) are the corresponding second derivatives. The first and second derivatives are computed as forward differences:

$$x'(i)=x(i+1)-x(i),\qquad y'(i)=y(i+1)-y(i)$$
$$x''(i)=x'(i+1)-x'(i),\qquad y''(i)=y'(i+1)-y'(i)$$

where x(i) and y(i) are the coordinate values of the i-th point on the central axis, and x(i+1) and y(i+1) are those of the (i+1)-th point. The curvature value with the largest absolute value is then selected as the central axis curvature change feature of the limb segment. In other embodiments, the central axis curvature change feature may also be an extremum of the first-order difference of the curvature, or another measure of its variation.
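A sketch of the curvature-change feature using the forward differences above; selecting the maximum |κ| follows the preferred embodiment:

```python
import numpy as np

def curvature_change_feature(axis_points: np.ndarray) -> float:
    """Max-|curvature| feature of one central axis given as (N, 2) points."""
    x = axis_points[:, 0].astype(float)
    y = axis_points[:, 1].astype(float)
    dx, dy = np.diff(x), np.diff(y)        # x'(i), y'(i) as forward differences
    ddx, ddy = np.diff(dx), np.diff(dy)    # x''(i), y''(i)
    dx, dy = dx[:-1], dy[:-1]              # align lengths with second differences
    denom = (dx**2 + dy**2) ** 1.5
    kappa = np.abs(dx * ddy - dy * ddx) / np.where(denom == 0, np.inf, denom)
    return float(kappa.max())
```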
According to the embodiment of the invention, the central axis curvature change characteristic of each limb segment is calculated, so that in the gesture recognition process, the detection of the target gesture and the action is carried out by combining the central axis curvature change characteristic, and the accuracy of the interactive gesture recognition is effectively improved.
Referring to fig. 7, fig. 7 is a schematic flow chart of a human body gesture-based interaction control method provided by the invention. In one embodiment of the present invention, if the gesture recognition information accords with preset interaction gesture information, before generating the control instruction corresponding to the interaction gesture information, the method further includes:
Step S51, respectively determining the feature similarity between each time feature sequence in the gesture recognition information and each target feature sequence in the interaction gesture information;
Step S52, determining a first mean value corresponding to any one of the time feature sequences and a second mean value corresponding to any one of the target feature sequences;
Step S53, determining an absolute deviation between any time feature sequence and the corresponding target feature sequence based on the first mean value corresponding to that time feature sequence and the second mean value corresponding to the target feature sequence;
Step S54, determining a correlation coefficient based on the absolute deviation and the feature similarity corresponding to each time feature sequence;
Step S55, judging whether the correlation coefficient is larger than a preset correlation threshold;
Step S56, if yes, judging that the gesture recognition information conforms to the interaction gesture information;
Step S57, if not, judging that the gesture recognition information does not conform to the interaction gesture information.
It should be noted that the user may enter one or more pieces of interaction gesture information in advance, with different preset interaction gesture information corresponding to different control instructions. The preset correlation threshold may be set according to the actual situation, for example to 0.6.
Specifically, the following steps are performed for any one of the time feature sequences in the gesture recognition information. In one embodiment, the feature similarity between the time feature sequence and its corresponding target feature sequence is calculated; for example, the feature similarity between the target object's left-leg time feature sequence and the preset left-leg target feature sequence. The feature similarity can then be taken directly as the correlation coefficient, and whether the gesture recognition information conforms to the interaction gesture information is judged based on the correlation coefficient and the correlation threshold.
In another embodiment, after the feature similarity is calculated, a first mean value corresponding to any one of the time feature sequences and a second mean value corresponding to any one of the target feature sequences may be calculated. Further, based on the first mean value of the time feature sequence and the second mean value of the pre-stored target feature sequence, the absolute deviation between the time feature sequence and the corresponding target feature sequence is calculated. The final correlation coefficient is then calculated from the feature similarity and absolute deviation of each time feature sequence, using the similarity weight and absolute-deviation weight preset for each sequence; one consistent form of this weighted combination is:

$$\rho=\sum_{i=1}^{n}\left(\alpha_{i}\,\mathrm{CORR}_{i}-\beta_{i}\,\mathrm{MAE}_{i}\right)$$

where ρ denotes the correlation coefficient, CORR_i denotes the feature similarity, MAE_i denotes the absolute deviation, α_i denotes the similarity weight, β_i denotes the absolute-deviation weight, and n denotes the number of features. Further, if the correlation coefficient is larger than the preset correlation threshold, it is judged that the gesture recognition information conforms to the interaction gesture information, which proves that interactive control of the intelligent device can be performed; if the correlation coefficient is not greater than the correlation threshold, it is judged that the gesture recognition information does not conform to the interaction gesture information, and the gesture information of the target object continues to be monitored.
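A sketch of the decision step, assuming Pearson correlation as the feature similarity and the penalised weighted sum above; the patent does not fix the similarity measure, so this is one possible instantiation:

```python
import numpy as np

def matches_interaction_gesture(time_seqs, target_seqs, alphas, betas,
                                rho_threshold: float = 0.6) -> bool:
    """Combine per-sequence similarity and absolute deviation into a single
    correlation coefficient and compare it with the preset threshold."""
    rho = 0.0
    for seq, target, a, b in zip(time_seqs, target_seqs, alphas, betas):
        seq = np.asarray(seq, dtype=float)
        target = np.asarray(target, dtype=float)
        corr = float(np.corrcoef(seq, target)[0, 1])   # feature similarity CORR_i
        mae = abs(seq.mean() - target.mean())          # mean-based deviation MAE_i
        rho += a * corr - b * mae                      # similarity rewarded, deviation penalised
    return rho > rho_threshold
```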
According to the embodiment of the invention, the feature similarity and the absolute deviation are obtained by calculation based on the time feature sequences corresponding to the feature information and the target feature sequences in the preset interaction gesture information, and further the correlation coefficient is determined based on the absolute deviation and the feature similarity corresponding to the time feature sequences, so that whether the gesture recognition information accords with the interaction gesture information is judged based on the correlation coefficient, and accurate interaction control is performed on the intelligent equipment in reality.
Referring to fig. 8, fig. 8 is a flowchart illustrating a human body gesture-based interactive control method according to the present invention. In one embodiment of the invention, determining the central axis of each of the limb segments includes:
Step S611, performing edge extraction on the pressure map data to obtain the pressure edge information of each limb segment;
Step S612, performing linear fitting on the pressure edge information of any limb segment to obtain the central axis of that limb segment.
Specifically, edge extraction is first performed on the pressure map data, for example using high-pass filtering methods such as the Prewitt operator or the Roberts operator. Taking the Roberts cross operator as an example, the edge extraction can be written as:

$$Y_{i,j}=\left|x_{i,j}-x_{i+1,j+1}\right|+\left|x_{i+1,j}-x_{i,j+1}\right|$$

For the pressure value at each position point in the pressure map data, a rectangular coordinate system is constructed with the upper-left corner as the coordinate origin, the horizontal axis positive to the right and the vertical axis positive downward; i is the i-th point on the horizontal axis, j is the j-th point on the vertical axis, x_{i,j} is the pressure value at that point, and Y_{i,j} is the pressure value after edge extraction. The pressure edge information of each limb segment is thus determined (see fig. 3). In other embodiments, boundary optimization is further performed by a boundary-tracking method, which operates as follows: scan point by point starting from the lower-left corner; when an edge point is encountered, track it until the tracking returns to the starting point (for a closed line) or until no new successor point exists (for a non-closed line). For a non-closed line, after one side has been tracked, track from the starting point in the opposite direction to find the other end point. If more than one candidate successor exists, the nearest point is selected as the successor, and the remaining candidates are additionally recorded as new starting points for edge tracking. After one line is traced, scan for the next untracked point until all edges have been traced, thereby obtaining optimized pressure edge information for each limb segment.
Further, linear fitting is performed on the pressure edge information of any limb segment to obtain the central axis of that limb segment. With four coefficients to be fitted, the fitting takes the form of a cubic polynomial:

$$y=ax^{3}+bx^{2}+cx+d$$

where x is the horizontal-axis coordinate, y is the vertical-axis coordinate, and a, b, c, d are the coefficients to be fitted.
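A sketch of this edge-extraction and fitting step, using the Roberts cross as the example operator and numpy.polyfit for the cubic fit (the operator choice is illustrative):

```python
import numpy as np

def roberts_edges(pressure: np.ndarray) -> np.ndarray:
    """Roberts-cross edge map Y computed from pressure values x."""
    x = pressure.astype(float)
    y = np.zeros_like(x)
    y[:-1, :-1] = (np.abs(x[:-1, :-1] - x[1:, 1:]) +
                   np.abs(x[1:, :-1] - x[:-1, 1:]))
    return y

def fit_central_axis(edge_points: np.ndarray) -> np.ndarray:
    """Cubic fit y = a*x^3 + b*x^2 + c*x + d over one segment's edge points."""
    xs, ys = edge_points[:, 0], edge_points[:, 1]
    return np.polyfit(xs, ys, 3)   # returns [a, b, c, d]
```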
Further, fig. 9 is a schematic flow chart of the interaction control method based on human body gesture provided by the invention. In one embodiment of the invention, determining the central axis of each of the limb segments includes:
Step S621, performing edge extraction on the pressure map data to obtain the pressure edge information of each limb segment;
Step S622, performing linear fitting on the pressure edge information of any limb segment to obtain a reference line of that limb segment;
Step S623, determining the normal corresponding to any one of the reference lines;
Step S624, determining the coordinates of each intersection point of the normal with the pressure edge information of that limb segment;
Step S625, forming the central axis of that limb segment based on the median values between the intersection-point coordinates.
Specifically, linear fitting is performed on the pressure edge information of the limb segment, and the axis obtained by the fitting is used as a reference line; the normal corresponding to any reference line, i.e. the straight line perpendicular to the reference line, is then determined. Further, the coordinates of each intersection point of the normal with the pressure edge information of the limb segment are determined, and the median value between the intersection-point coordinates is calculated, so that the central axis of the limb segment is formed from these median values. Referring to fig. 3, L1, L2, L3, L4 and L5 are the central axes of the limb segments, where L1 is the central axis of the right arm, L2 of the left arm, L3 of the right leg, L4 of the left leg, and L5 of the trunk.
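A sketch of the normal-and-median refinement, assuming the reference line is the cubic fit above and the edge points are given as an (M, 2) array; the matching tolerance is an illustrative parameter:

```python
import numpy as np

def central_axis_from_normals(ref_coeffs, edge_points, xs, tol: float = 0.75):
    """Refine a fitted reference line into a central axis: at each sample x,
    take the normal to the reference line and use the median of the edge
    points lying (approximately) on that normal."""
    a, b, c, d = ref_coeffs
    axis = []
    for x0 in xs:
        y0 = a * x0**3 + b * x0**2 + c * x0 + d
        slope = 3 * a * x0**2 + 2 * b * x0 + c        # tangent of the reference line
        tangent = np.array([1.0, slope]) / np.hypot(1.0, slope)
        rel = edge_points - np.array([x0, y0])
        # a point lies on the normal through (x0, y0) iff its displacement
        # has (near-)zero component along the tangent direction
        on_normal = np.abs(rel @ tangent) < tol
        hits = edge_points[on_normal]
        if len(hits):
            axis.append(np.median(hits, axis=0))       # median of the intersections
    return np.array(axis)
```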
According to the embodiment of the invention, the pressure edge information of each limb segment is obtained by carrying out edge extraction on the pressure map data, and the central axis of each limb segment is obtained based on the pressure edge information of each limb segment. Therefore, the detection of the target gesture and the action can be carried out by combining the central axis related characteristics, and the accuracy of the interactive gesture recognition is effectively improved.
Referring to fig. 10, fig. 10 is a schematic flow chart of an interaction control method based on human body gesture provided by the invention. In one embodiment of the present invention, detecting the sleep posture information and the position information of each limb segment of the target object at any one of the detection time points based on the pressure map data at any one of the detection time points includes:
For any of the detection time points: step S71, extracting features of the pressure map data of the detection time points to obtain a plurality of candidate areas with different sizes; step S72, carrying out normalization processing on the candidate areas with different sizes to obtain target candidate areas; and step S73, carrying out category identification on the target candidate region to obtain the position information and the sleeping posture information of each limb segment of the target object.
It should be noted that a large amount of labelled pressure map data can be collected in advance, and an initial detection model can be iteratively trained on this data to obtain a target detection model, where the label information in the pressure map data includes the position information of the left and right arms, the position information of the left and right legs, the position information of the trunk, and the current sleeping posture information. Optionally, in a preferred embodiment, the model structure of the target detection model includes a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer, a first fully connected layer, a second fully connected layer and a classification layer; for the specific structure, refer to table 1 below. In other embodiments, the target detection model may be another R-CNN (Regions with Convolutional Neural Network features) derived model, a YOLO-series model after transfer learning, or the like.
TABLE 1 Model structure of the target detection model (in order: first convolution layer, first pooling layer, second convolution layer, second pooling layer, first fully connected layer, second fully connected layer, classification layer)
Specifically, the following steps are performed for the pressure map data at any one of the detection time points. First, feature extraction is performed on the pressure map data to obtain candidate areas of different sizes, which better captures the features of target objects of different sizes and helps improve the robustness of the target detection model to targets of various scales; optionally, the candidate-area sizes include 16×16, 8×32, 32×8, 4×64, 64×4, 8×8, 4×16, 16×4, 2×32, 32×2, 4×4, 2×8, 8×2, 2×2 and 3×3. The candidate areas of different sizes are then normalized to obtain target candidate areas; optionally, the target candidate areas have a size of 64×16. Further, category identification is performed on each target candidate area to obtain the position information and sleeping posture information of each limb segment of the target object, that is, the position information of the left and right arms, the position information of the left and right legs, the position information of the trunk, and the current sleeping posture information of the target object at the current time point are determined. Optionally, the target detection model corresponding to the model structure in table 1 may be used for the category identification.
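A sketch of a network following the layer order of table 1, written here in PyTorch with illustrative channel and kernel sizes (the patent does not specify them); the input is one normalized 64×16 target candidate area:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SleepPostureNet(nn.Module):
    """Layer order follows Table 1 (conv, pool, conv, pool, fc, fc, classify);
    all channel, kernel and hidden sizes are assumptions for illustration."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)   # first convolution layer
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)  # second convolution layer
        self.pool = nn.MaxPool2d(2)                               # pooling layers
        self.fc1 = nn.Linear(32 * 16 * 4, 128)                    # first fully connected layer
        self.fc2 = nn.Linear(128, 64)                             # second fully connected layer
        self.classifier = nn.Linear(64, num_classes)              # classification layer

    def forward(self, x):                       # x: (B, 1, 64, 16) candidate area
        x = self.pool(F.relu(self.conv1(x)))    # -> (B, 16, 32, 8)
        x = self.pool(F.relu(self.conv2(x)))    # -> (B, 32, 16, 4)
        x = torch.flatten(x, 1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.classifier(x)               # limb-segment / sleeping-posture logits
```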
According to the embodiment of the invention, the characteristic extraction is carried out on the pressure map data to obtain a plurality of candidate areas with different sizes so as to better capture the characteristics of target objects with different sizes, and then the category identification is carried out on the normalized target candidate areas to obtain the position information and the sleeping posture information of each limb segment of the target object, so that the accuracy of detecting the position information and the sleeping posture information of each limb segment is improved.
Referring to fig. 11, fig. 11 is a flowchart illustrating a human body gesture-based interactive control method according to the present invention. In one embodiment of the present invention, before obtaining the pressure map data of each detection time point of the target object in the preset time window, the method further includes:
Step S81, obtaining target pressure map data of a target object in a preset time window;
Step S82, analyzing target attitude information of the target object based on the target pressure map data;
step S83, if the target gesture information accords with the preset wake gesture information, activating the interaction control system.
It should be noted that, to avoid false triggering and erroneous operation, this embodiment detects whether the posture of the target object conforms to the preset wake gesture information; that is, the user is required to adopt a specific posture to activate the interactive control system, which prevents the interactive control system from being woken accidentally when not needed and reduces its power consumption.
Specifically, target pressure map data of the target object at each time point in the preset time window is obtained; based on the target pressure map data at any time point, the sleeping posture information of the target object at that time point and the position information of each limb segment are detected, and the central axis of each limb segment is determined. Further, the target posture information of the target object is obtained by analyzing the sleeping posture information, the position information of each limb segment and the central axes at each time point; optionally, the analysis and identification of the target posture information is essentially the same as the identification of the gesture recognition information and is not repeated here. Likewise, the process of judging whether the target posture information conforms to the preset wake gesture information is essentially the same as that of judging whether the gesture recognition information conforms to the interaction gesture information. If the target posture information conforms to the preset wake gesture information, the interactive control system is activated. Optionally, when the interactive control system is in the awake state, it prompts the user, for example by showing a specific color on the indicator light of the device's power supply.
In addition, since keeping the interaction control system activated for a long time consumes more energy, in this embodiment the system detects, while activated, whether interaction gesture information exists for the target object. If the correlation between the detected interaction gesture information and the preset interaction gesture information exceeds a preset threshold, the duration of the activated state of the interaction control system is set to a first duration; if the correlation does not exceed the preset threshold, the duration is set to a second duration, so that when the second duration elapses, the interaction control system switches to a low-power state or an off state. The first duration is longer than the second duration; for example, the first duration is set to 10 s and the second duration to 5 s.
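A minimal sketch of this activation-duration rule, assuming a monotonic clock and the example values of 10 s and 5 s; the class and method names are hypothetical and not part of the embodiment.

import time

FIRST_DURATION_S = 10.0   # window after a matched interaction gesture
SECOND_DURATION_S = 5.0   # shorter window when no gesture matches

class ActivationTimer:
    # Tracks how long the interaction control system stays activated;
    # once the deadline passes, the caller should switch the system to
    # a low-power or off state.
    def __init__(self):
        self.deadline = None

    def activate(self):
        # Wake gesture recognized: start with the shorter window.
        self.deadline = time.monotonic() + SECOND_DURATION_S

    def on_gesture(self, correlation, threshold):
        # Extend the active state when the detected interaction gesture
        # correlates with the preset gesture above the threshold.
        window = FIRST_DURATION_S if correlation > threshold else SECOND_DURATION_S
        self.deadline = time.monotonic() + window

    def is_active(self):
        return self.deadline is not None and time.monotonic() < self.deadline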
According to the embodiment of the invention, before the interaction gesture is detected, the interaction control system is activated only when the detected posture of the target object accords with the preset wake gesture information, which effectively reduces false triggering and misoperation and also reduces the power consumption of the interaction control system.
The human body posture-based interaction control device provided by the invention is described below; the device described below and the human body posture-based interaction control method described above may be referred to in correspondence with each other.
Fig. 12 is a schematic structural diagram of an interaction control device based on human body posture, and as shown in fig. 12, the interaction control device based on human body posture according to an embodiment of the present invention includes:
An obtaining module 91, configured to obtain pressure map data of each detection time point of the target object within a preset time window;
The detection module 92 is configured to detect, based on the pressure map data at any one of the detection time points, sleeping posture information of the target object at any one of the detection time points and position information of each limb segment, and determine a central axis of each limb segment;
An analysis module 93, configured to analyze and obtain pose identification information of the target object based on the sleeping pose information, the position information of each limb segment, and the central axis corresponding to all the detection time points;
The control module 94 is configured to generate a control instruction corresponding to the interaction gesture information if the gesture recognition information accords with preset interaction gesture information, so as to control the intelligent device based on the control instruction.
The human body posture-based interaction control device further comprises:
determining a plurality of characteristic information of any detection time point based on sleeping gesture information of any detection time point, central axes corresponding to each limb segment and position information;
forming a time feature sequence corresponding to any feature information based on any feature information of all detection time points;
and determining the gesture recognition information of the target object based on each time feature sequence.
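By way of illustration, the sequence-forming step might be sketched as below, assuming each detection time point yields a dictionary of named features (such as the frame_features sketch given after the feature-determination steps below); build_time_sequences is a hypothetical name.

def build_time_sequences(frames):
    # Collect each named feature across all detection time points into
    # its own time sequence (one list per feature name).
    seqs = {}
    for feats in frames:              # one feature dict per time point
        for name, value in feats.items():
            seqs.setdefault(name, []).append(value)
    return seqs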
The human body posture-based interaction control device further comprises:
For any of the detection time points:
Determining pressure average value characteristics of each limb segment based on the pressure map data of the detection time points;
determining the position characteristics of each limb segment based on the position information of each limb segment;
determining central axis curvature change characteristics of each limb segment and included angle characteristics between the central axes based on the central axes corresponding to each limb segment;
and forming a plurality of characteristic information of the detection time points based on the sleeping posture information, the pressure average value characteristic, the position characteristic, the central axis curvature change characteristic and the included angle characteristic between the central axes of each limb segment.
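A sketch of the per-time-point feature assembly, assuming each limb segment is given as a boolean mask over the pressure map and each central axis as an (N, 2) point array; using the centroid as the position feature and end-minus-start direction vectors are assumed choices, and all names are hypothetical. The curvature-change feature is sketched separately below.

import numpy as np

def axis_angle(axis_a, axis_b):
    # Included angle (degrees) between two central axes, each an (N, 2)
    # array of points; end-minus-start is the assumed direction vector.
    va, vb = axis_a[-1] - axis_a[0], axis_b[-1] - axis_b[0]
    cos = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def frame_features(pressure_map, limb_masks, axes, sleep_pose_id):
    # Assemble the feature set named above: sleeping posture, per-limb
    # pressure mean, per-limb position, and limb-to-trunk axis angles.
    feats = {"sleep_pose": sleep_pose_id}
    for limb, mask in limb_masks.items():   # boolean mask per limb segment
        ys, xs = np.nonzero(mask)
        feats[limb + "_pressure_mean"] = float(pressure_map[mask].mean())
        feats[limb + "_position"] = (float(ys.mean()), float(xs.mean()))
    for limb in ("right_arm", "left_arm", "right_leg", "left_leg"):
        feats[limb + "_trunk_angle"] = axis_angle(axes[limb], axes["trunk"])
    return feats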
The human body posture-based interaction control device further comprises:
For any central axis corresponding to the limb segment:
according to a preset constructed coordinate system, determining the coordinate value of each point in the central axis;
determining a curvature value corresponding to each point based on the coordinate value of each point;
And determining the central axis curvature change characteristic of the limb segment based on the curvature value corresponding to each point.
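The curvature steps above admit a compact finite-difference sketch: for a plane curve sampled as points (x_i, y_i), the curvature at each point is |x'y'' − y'x''| / (x'^2 + y'^2)^(3/2). Summarizing the per-point values by their standard deviation is an assumption; the embodiment only names a "curvature change characteristic".

import numpy as np

def curvature_change(axis_points):
    # axis_points: (N, 2) array of central-axis coordinates.
    x = axis_points[:, 0].astype(float)
    y = axis_points[:, 1].astype(float)
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    # Standard parametric-curve curvature at every sampled point.
    kappa = np.abs(dx * ddy - dy * ddx) / np.power(dx**2 + dy**2, 1.5).clip(min=1e-9)
    return float(kappa.std())   # assumed summary of "curvature change"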
The human body posture-based interaction control device further comprises:
respectively determining the feature similarity between each time feature sequence in the gesture recognition information and each target feature sequence in the interaction gesture information;
determining a first mean value corresponding to any one of the time feature sequences and a second mean value corresponding to any one of the target feature sequences;
Determining an absolute deviation between any time feature sequence and a corresponding target feature sequence based on a first mean value corresponding to any time feature sequence and a second mean value corresponding to any target feature sequence;
Determining a correlation coefficient based on the absolute deviation and the feature similarity corresponding to each time feature sequence;
If the correlation coefficient is larger than a preset correlation threshold, judging that the gesture recognition information accords with the interaction gesture information;
And if the correlation coefficient is not greater than the correlation threshold, judging that the gesture recognition information does not accord with the interaction gesture information.
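The correlation test above leaves the exact formulas open, so the following sketch makes explicit assumptions: cosine similarity as the feature similarity, the absolute difference of the two sequence means as the absolute deviation, a 1/(1 + deviation) penalty, and a simple average over all feature sequences. Equal-length sequences are also assumed.

import numpy as np

def matches_interaction_gesture(time_seqs, target_seqs, corr_threshold=0.8):
    # time_seqs / target_seqs: parallel lists of equal-length 1-D sequences,
    # one pair per feature (see build_time_sequences above).
    scores = []
    for seq, target in zip(time_seqs, target_seqs):
        seq = np.asarray(seq, dtype=float)
        target = np.asarray(target, dtype=float)
        sim = np.dot(seq, target) / (np.linalg.norm(seq) * np.linalg.norm(target) + 1e-9)
        abs_dev = abs(seq.mean() - target.mean())   # deviation of the two means
        scores.append(sim / (1.0 + abs_dev))        # penalize mean mismatch
    correlation = float(np.mean(scores))
    return correlation > corr_threshold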
The human body posture-based interaction control device further comprises:
Performing edge extraction on the pressure map data to obtain pressure edge information of each limb segment;
Performing linear fitting on the pressure edge information of any limb segment to obtain a central axis of any limb segment; or
Performing linear fitting on the pressure edge information of any limb segment to obtain a reference line of any limb segment;
determining a normal corresponding to any reference line;
Determining each intersection point coordinate corresponding to the normal line and pressure edge information of any limb segment;
And forming the central axis of any limb segment based on the median value between the intersection point coordinates.
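Both central-axis variants above can be sketched briefly: the straight-line fit suits a roughly straight limb, while the normal-median variant follows a bent limb. The sketch assumes a roughly axis-aligned limb (each normal is approximated by a thin band perpendicular to the x axis; a general implementation would rotate into the reference line's frame), edge points as an (N, 2) array, and 32 samples, all of which are illustrative choices.

import numpy as np

def central_axis_fit(edge_points, samples=32):
    # Variant 1: least-squares line through the limb's pressure-edge points.
    x, y = edge_points[:, 0], edge_points[:, 1]
    slope, intercept = np.polyfit(x, y, 1)
    xs = np.linspace(x.min(), x.max(), samples)
    return np.stack([xs, slope * xs + intercept], axis=1)

def central_axis_midpoints(edge_points, samples=32):
    # Variant 2: march along the reference direction and take the median of
    # the edge coordinates falling in a thin band around each step, so the
    # axis follows the limb even where it bends.
    x = edge_points[:, 0]
    width = (x.max() - x.min()) / samples
    axis = []
    for xs in np.linspace(x.min(), x.max(), samples):
        band = edge_points[np.abs(x - xs) < width]
        if len(band):
            axis.append([xs, float(np.median(band[:, 1]))])
    return np.asarray(axis)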
The human body posture-based interaction control device further comprises:
For any of the detection time points:
extracting features of the pressure map data of the detection time points to obtain a plurality of candidate areas with different sizes;
normalizing the candidate areas with different sizes to obtain target candidate areas;
and carrying out category identification on the target candidate region to obtain the position information and the sleeping posture information of each limb segment of the target object.
The human body posture-based interaction control device further comprises:
Acquiring target pressure map data of a target object in a preset time window;
Analyzing target pose information of the target object based on the target pressure map data;
And if the target gesture information accords with the preset awakening gesture information, activating an interactive control system.
It should be noted that the above device provided in the embodiment of the present invention can implement all the method steps of the method embodiment and achieve the same technical effects; detailed descriptions of the parts and beneficial effects that are the same as in the method embodiment are omitted here.
Fig. 13 is a schematic structural diagram of an electronic device according to the present invention. As shown in fig. 13, the electronic device may include a processor 310, a memory 320, a communication interface 330 and a communication bus 340, wherein the processor 310, the memory 320 and the communication interface 330 communicate with each other via the communication bus 340. The processor 310 may invoke logic instructions in the memory 320 to perform the human body posture-based interaction control method.
Further, the logic instructions in the memory 320 may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present invention, or the part thereof contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
In yet another aspect, the present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the human body posture-based interaction control method provided by the above methods.
In another aspect, the present invention also provides a computer program product comprising a computer program. The computer program may be stored on a non-transitory computer-readable storage medium, and when it is executed by a processor, the computer can execute the human body posture-based interaction control method provided by the above methods.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the invention without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware. Based on this understanding, the foregoing technical solution, or the part thereof contributing to the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disk, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only for illustrating the technical solution of the present invention and are not limiting. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. An interactive control method based on human body posture is characterized by comprising the following steps:
acquiring pressure map data of each detection time point of the target object in a preset time window;
Detecting sleeping posture information of the target object at any detection time point and position information of each limb segment based on pressure map data of any detection time point, and determining the central axis of each limb segment;
Analyzing and obtaining gesture identification information of the target object based on sleeping gesture information corresponding to all detection time points, position information of each limb segment and a central axis;
if the gesture recognition information accords with preset interaction gesture information, generating a control instruction corresponding to the interaction gesture information so as to control the intelligent equipment based on the control instruction;
The analyzing to obtain the gesture recognition information of the target object based on the sleeping gesture information, the position information of each limb segment and the central axis corresponding to all the detection time points comprises the following steps:
determining a plurality of characteristic information of any detection time point based on sleeping gesture information of any detection time point, central axes corresponding to each limb segment and position information;
forming a time feature sequence corresponding to any feature information based on any feature information of all detection time points;
Determining gesture recognition information of the target object based on each of the time feature sequences;
the determining a plurality of feature information of any detection time point based on the sleeping gesture information of any detection time point, the central axis and the position information corresponding to each limb segment comprises the following steps:
For any of the detection time points:
Determining pressure average value characteristics of each limb segment based on the pressure map data of the detection time points;
determining the position characteristics of each limb segment based on the position information of each limb segment;
determining central axis curvature change characteristics of each limb segment and included angle characteristics between the central axes based on the central axes corresponding to each limb segment;
Forming a plurality of characteristic information of the detection time points based on the sleeping posture information, the pressure average value characteristic, the position characteristic, the central axis curvature change characteristic and the included angle characteristic between the central axes of each limb segment;
The central axes comprise a right arm central axis, a left arm central axis, a right leg central axis, a left leg central axis and a trunk central axis;
The included angle features include an included angle between the right arm central axis and the trunk central axis, an included angle between the left arm central axis and the trunk central axis, an included angle between the right leg central axis and the trunk central axis, and an included angle between the left leg central axis and the trunk central axis.
2. The interactive control method based on human body posture according to claim 1, wherein the determining the central axis curvature change characteristic of each limb segment based on the central axis corresponding to each limb segment comprises:
For any central axis corresponding to the limb segment:
according to a preset constructed coordinate system, determining the coordinate value of each point in the central axis;
determining a curvature value corresponding to each point based on the coordinate value of each point;
And determining the central axis curvature change characteristic of the limb segment based on the curvature value corresponding to each point.
3. The human body posture-based interaction control method according to claim 1, wherein before generating the control instruction corresponding to the interaction posture information if the posture identification information accords with preset interaction posture information, the method further comprises:
respectively determining the feature similarity between each time feature sequence in the gesture recognition information and each target feature sequence in the interaction gesture information;
determining a first mean value corresponding to any one of the time feature sequences and a second mean value corresponding to any one of the target feature sequences;
Determining an absolute deviation between any time feature sequence and a corresponding target feature sequence based on a first mean value corresponding to any time feature sequence and a second mean value corresponding to any target feature sequence;
Determining a correlation coefficient based on the absolute deviation and the feature similarity corresponding to each time feature sequence;
If the correlation coefficient is larger than a preset correlation threshold, judging that the gesture recognition information accords with the interaction gesture information;
And if the correlation coefficient is not greater than the correlation threshold, judging that the gesture recognition information does not accord with the interaction gesture information.
4. The human-posture-based interactive control method according to claim 1, wherein said determining a central axis of each of said limb segments comprises:
Performing edge extraction on the pressure map data to obtain pressure edge information of each limb segment;
Performing linear fitting on the pressure edge information of any limb segment to obtain a central axis of any limb segment; or
Performing linear fitting on the pressure edge information of any limb segment to obtain a reference line of any limb segment;
determining a normal corresponding to any reference line;
Determining each intersection point coordinate corresponding to the normal line and pressure edge information of any limb segment;
And forming the central axis of any limb segment based on the median value between the intersection point coordinates.
5. The interactive control method based on human body posture according to claim 1, wherein detecting the sleeping posture information of the target object at any one of the detection time points and the position information of each limb segment based on the pressure map data at any one of the detection time points comprises:
For any of the detection time points:
extracting features of the pressure map data of the detection time points to obtain a plurality of candidate areas with different sizes;
normalizing the candidate areas with different sizes to obtain target candidate areas;
and carrying out category identification on the target candidate region to obtain the position information and the sleeping posture information of each limb segment of the target object.
6. The interactive control method based on human body posture according to claim 1, wherein before obtaining pressure map data of each detection time point in a preset time window, the method further comprises:
Acquiring target pressure map data of a target object in a preset time window;
Analyzing target pose information of the target object based on the target pressure map data;
And if the target gesture information accords with the preset awakening gesture information, activating an interactive control system.
7. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the human body posture-based interaction control method according to any one of claims 1 to 6 when executing the program.
8. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the human body posture-based interaction control method of any of claims 1 to 6.
CN202410041327.7A 2024-01-11 2024-01-11 Interaction control method, device and storage medium based on human body posture Active CN117555427B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410041327.7A CN117555427B (en) 2024-01-11 2024-01-11 Interaction control method, device and storage medium based on human body posture

Publications (2)

Publication Number Publication Date
CN117555427A (en) 2024-02-13
CN117555427B (en) 2024-04-16

Family

ID=89815198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410041327.7A Active CN117555427B (en) 2024-01-11 2024-01-11 Interaction control method, device and storage medium based on human body posture

Country Status (1)

Country Link
CN (1) CN117555427B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109157061A (en) * 2018-07-24 2019-01-08 深圳市赛亿科技开发有限公司 Intelligent mattress and its control method, electronic equipment and storage medium
CN111296994A (en) * 2019-12-20 2020-06-19 石狮市森科智能科技有限公司 Intelligent gesture interaction control system
CN113341748A (en) * 2021-06-17 2021-09-03 佛山市赤虎家具有限公司 Sofa intelligent control method and system based on posture and storage medium
CN115568714A (en) * 2022-09-07 2023-01-06 重庆海尔空调器有限公司 Automatic temperature control mattress, control method thereof, electronic equipment and storage medium
WO2023050877A1 (en) * 2021-09-30 2023-04-06 青岛海尔空调器有限总公司 Control method and control device for home appliance, smart mattress and server
CN116518539A (en) * 2023-05-19 2023-08-01 Tcl空调器(中山)有限公司 Air conditioner control method and device, air conditioner and storage medium

Also Published As

Publication number Publication date
CN117555427A (en) 2024-02-13

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant