CN112089595A - Login method of neck massager, neck massager and storage medium - Google Patents

Login method of neck massager, neck massager and storage medium

Info

Publication number
CN112089595A
CN112089595A
Authority
CN
China
Prior art keywords
interactive
image
interaction
password information
gestures
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010443677.8A
Other languages
Chinese (zh)
Inventor
刘杰
吴家利
张明宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SKG Health Technologies Co Ltd.
Original Assignee
SKG Health Technologies Co Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SKG Health Technologies Co Ltd. filed Critical SKG Health Technologies Co Ltd.
Priority to CN202010443677.8A
Publication of CN112089595A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 7/00 Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/16 Physical interface with patient
    • A61H 2201/1602 Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H 2201/1609 Neck
    • A61H 2201/1611 Holding means therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5007 Control means thereof computer controlled
    • A61H 2201/501 Control means thereof computer controlled connected to external computer devices or networks
    • A61H 2201/5012 Control means thereof computer controlled connected to external computer devices or networks using the internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2205/00 Devices for specific parts of the body
    • A61H 2205/04 Devices for specific parts of the body: neck

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Epidemiology (AREA)
  • Pain & Pain Management (AREA)
  • Dermatology (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a login method for a neck massager, a neck massager, and a storage medium. The method includes: acquiring an interactive image with an image sensor disposed on the neck massager; recognizing the interactive action in the interactive image to obtain password information associated with that action; and verifying the password information, completing login to the neck massager once verification passes. In this way, different interactive actions can be used for login verification on the neck massager, working around the device's structural constraints on input and improving convenience of use.

Description

Login method of neck massager, neck massager and storage medium
Technical Field
The application relates to the technical field of medical treatment, in particular to a login method of a neck massager, the neck massager and a storage medium.
Background
With the development of society, people pay more attention to their physical health. Excessive energy devoted to daily work and study leaves the neck, waist, and other parts of the body sore or uncomfortable, and elderly people experience various aches as the body ages. As a result, more and more people use neck massagers to relieve pain and fatigue.
However, in practical use, although a neck massager can relieve fatigue, its functions are limited and cannot meet some user requirements.
Disclosure of Invention
In order to solve the above problems, the present application provides a login method for a neck massager, a neck massager, and a storage medium, which use different interactive actions to implement login verification on the neck massager, thereby improving convenience of use.
To solve the technical problem, one technical scheme adopted by the application is to provide a login method for a neck massager, the method including: acquiring an interactive image with an image sensor disposed on the neck massager; recognizing the interactive action in the interactive image to obtain password information associated with the interactive action; and verifying the password information, completing login to the neck massager after verification passes.
Before the step of acquiring the interactive image with the image sensor, the method further includes: after the neck massager is powered on, detecting with an infrared sensor, also disposed on the neck massager, whether a moving target is present within a preset range; and if so, activating the image sensor.
The step of recognizing the interactive action in the interactive image to obtain the associated password information includes: segmenting the interactive image to obtain a hand region; recognizing the hand region to determine an interaction gesture; and determining the password information associated with that gesture.
The interactive image may comprise a plurality of consecutive image frames. In that case the recognition step includes: segmenting each of the consecutive frames to obtain the corresponding sequence of hand regions; recognizing each hand region to determine a sequence of interaction gestures; determining the character corresponding to each interaction gesture; and determining the password information associated with the interactive action from the characters corresponding to the gestures.
The step of determining the character corresponding to each interaction gesture includes: comparing a target gesture among the interaction gestures against a plurality of standard interaction gestures to obtain a plurality of similarity values, where each standard gesture is associated with one character; and taking the character associated with the standard gesture that yields the highest similarity value as the character for the target gesture.
The step of determining the password information from the characters corresponding to the gestures includes concatenating those characters in the order the gestures were acquired.
Further, this concatenation step may include: obtaining the acquisition time interval between any two consecutive interaction gestures; and, if that interval exceeds a set interval, treating the later of the two gestures as the first gesture of a new entry, so that its character and the characters of the subsequent gestures form the password information associated with the interactive action.
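The concatenation rule above can be sketched as follows; the two-second threshold, timestamp format, and character values are illustrative assumptions, not taken from the application:

```python
from typing import List, Tuple

def build_password(gestures: List[Tuple[float, str]], max_gap: float = 2.0) -> str:
    """Concatenate gesture characters in acquisition order.

    `gestures` is a list of (timestamp_seconds, character) pairs. When the
    gap between two consecutive gestures exceeds `max_gap`, the later
    gesture is treated as the first of a new entry, so earlier input is
    discarded and the password restarts from it.
    """
    password = ""
    prev_time = None
    for timestamp, char in gestures:
        if prev_time is not None and timestamp - prev_time > max_gap:
            password = ""  # restart: later gesture becomes the first gesture
        password += char
        prev_time = timestamp
    return password
```

With this rule, a long pause effectively lets the user abandon a mistyped prefix and start over without any explicit reset action.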
Alternatively, with the interactive image comprising a plurality of consecutive image frames, the recognition step may include: segmenting each frame to obtain the corresponding sequence of face regions; recognizing each face region to determine a sequence of interactive expressions; determining the character corresponding to each interactive expression; and determining the password information from the characters corresponding to the expressions.
The step of determining the character corresponding to each interactive expression includes: comparing a target expression among the interactive expressions against a plurality of standard interactive expressions to obtain a plurality of similarity values, where each standard expression is associated with one character; and taking the character associated with the standard expression that yields the highest similarity value as the character for the target expression.
The recognition step may also include: extracting the region below the tip of the nose in the interactive image as a lip region image, the interactive image comprising at least one continuous image frame; performing lip-shape recognition on the lip region image to obtain first text information; and determining the password information associated with that text.
Lip-shape recognition includes extracting lip-shape features from the lip region image and deriving the first text information from them. Determining the associated password information then includes judging whether the first text information matches first preset text information and, if so, obtaining the password information associated with that preset text.
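A minimal sketch of the matching step that follows lip-shape recognition; the lip-feature extraction itself is not modeled here, and the preset phrases and password strings are illustrative placeholders:

```python
from typing import Dict, Optional

def password_from_lip_text(recognized: str, presets: Dict[str, str]) -> Optional[str]:
    """Match text recognized from lip shapes against preset phrases.

    `presets` maps preset text information to its associated password
    information. Returns the password on a match (ignoring case and
    surrounding whitespace), or None when no preset matches.
    """
    key = recognized.strip().lower()
    for phrase, password in presets.items():
        if key == phrase.strip().lower():
            return password
    return None
```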
The method may further include: acquiring a plurality of consecutive gesture image frames of the user with the image sensor; obtaining the hand coordinates in each frame to form a sequence of target coordinates; and connecting those coordinates in order to form a target trajectory image. The recognition step then includes recognizing the target trajectory image to obtain the associated password information.
The step of recognizing the target trajectory image includes: taking the first and the last coordinates in the sequence; determining the indicated direction of the gesture from those two coordinates; and recognizing that direction to determine the corresponding password information.
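The direction-from-endpoints step can be sketched as follows; the four-way classification and the coordinate convention (y growing upward) are assumptions for illustration:

```python
from typing import List, Tuple

def indication_direction(coords: List[Tuple[float, float]]) -> str:
    """Classify a gesture stroke from its first and last hand coordinates.

    Only the endpoints matter, as in the text: the dominant displacement
    axis picks one of four directions.
    """
    (x0, y0), (x1, y1) = coords[0], coords[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "up" if dy > 0 else "down"
```

Each direction would then be looked up in a table mapping directions to password information.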
To solve the above technical problem, another technical scheme adopted by the application is to provide a neck massager comprising: a massager body; a massage assembly disposed on the body; a communication circuit disposed on the body; an image sensor disposed on the body; and a control circuit disposed on the body and electrically coupled with the massage assembly, the communication circuit, and the image sensor, the control circuit controlling them to implement the login method described above.
To solve the above technical problem, a further technical scheme adopted by the application is to provide a computer-readable storage medium storing program data which, when executed by a control circuit, implements the login method of the neck massager described above.
The beneficial effect of the embodiments of the application is that, unlike the prior art, the login method acquires an interactive image through the image sensor and recognizes the interactive action in it to obtain the associated password information, uses that password information for login verification, and completes login once verification passes. In this way, different interactive actions can be used for login verification on the neck massager, working around the device's structural constraints on input and improving convenience of use.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described here show only some embodiments of the application; those skilled in the art can derive other drawings from them without creative effort. In the drawings:
FIG. 1 is a schematic structural view of an embodiment of a neck massager provided herein;
fig. 2 is a schematic flow chart of a first embodiment of the login method of the neck massager provided by the present application;
FIG. 3 is a schematic view of an interaction-action image provided herein;
FIG. 4 is a schematic flow chart of a second embodiment of the login method of the neck massager provided by the present application;
fig. 5 is a detailed flowchart of step S44;
FIG. 6 is a schematic illustration of a plurality of control regions provided herein;
FIG. 7 is a schematic flow chart of a third embodiment of the login method of the neck massager provided by the present application;
FIG. 8 is a schematic flow chart diagram illustrating a fourth embodiment of a login method of a neck massager provided in the present application;
FIG. 9 is a schematic flow chart diagram illustrating a fifth embodiment of a login method of a neck massager provided in the present application;
fig. 10 is a detailed flowchart of step S94;
FIG. 11 is a schematic structural view of a further embodiment of the neck massager provided herein;
FIG. 12 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of a neck massager 10 provided by the present application, which includes an elastic arm 11, a massage component 12, a sensing component 13, a first handle 14, a second handle 15, and a speaker 16.
The first handle 14 and the second handle 15 are fixedly connected to two sides of the elastic arm 11, the massage component 12 is disposed on one side of the elastic arm 11 facing the neck of the human body, and the massage component 12 can emit electric pulses. The sensing assembly 13 is disposed outside of the first handle 14 or the second handle 15. The speaker 16 is disposed outside the first handle 14 or the second handle 15 for playing audio data.
Alternatively, the electrode pads of the massage unit 12 are not limited to a protruding mushroom-shaped structure; they may be flush with, or slightly protrude from, the side of the elastic arm 11 facing the neck. The electrode pads may also be made of conductive silicone.
Optionally, the sensing assembly 13 includes a first sensing assembly 131 and a second sensing assembly 132, and the first sensing assembly 131 and the second sensing assembly 132 are connected.
Further, the first sensing component 131 may be an image sensor, and the second sensing component 132 may be an infrared sensor; also, the first sensing member 131 may be disposed outside the first handle 14, and the second sensing member 132 may be disposed outside the second handle 15, and the positions of the two are not particularly limited.
Alternatively, the first and second handles 14, 15 may be separate pieces or may be part of an integrally formed neck massager.
Optionally, the neck massager 10 is further provided with a communication module for communicating with other terminals.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of a login method of a neck massager provided by the present application, the method including:
s21: an interactive image is acquired with an image sensor.
Optionally, before the interactive image of the user is acquired, an infrared sensor may first be used to detect moving targets within a preset range, to decide whether to acquire the interactive image.
Specifically, because its power consumption is relatively high, the image sensor normally stays powered off or asleep and does not capture images, while the infrared sensor, whose power consumption is low, remains in a working state. When the infrared sensor detects a moving target, it signals the neck massager to activate the image sensor, which then captures the interactive image of the motion, for example a hand action such as the two-handed heart gesture shown in fig. 3, or a motion trajectory such as drawing a circle or a check mark.
In other embodiments, the user of the neck massager may also activate the image sensor by manually touching a button provided on the neck massager.
In one application scenario, the neck massager may further comprise an illumination assembly that provides sufficient light for acquiring the interactive image when the environment is dark. Both the image sensor and the infrared sensor are disposed on the neck massager.
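The activation flow described above, in which the low-power infrared sensor wakes the power-hungry image sensor, can be sketched as a small state holder; the class and method names are hypothetical:

```python
class SensorController:
    """Keep the image sensor asleep until the infrared sensor reports
    a moving target within the preset range (or a manual button press
    requests activation)."""

    def __init__(self) -> None:
        self.image_sensor_on = False

    def on_infrared_event(self, target_in_range: bool) -> bool:
        """Called by the infrared sensor; wakes the image sensor when
        a moving target is detected. Returns the image sensor state."""
        if target_in_range and not self.image_sensor_on:
            self.image_sensor_on = True
        return self.image_sensor_on

    def on_button_press(self) -> None:
        """Manual activation, as in the embodiment with a touch button."""
        self.image_sensor_on = True
```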
S22: and identifying the interaction action in the interaction image to obtain password information associated with the interaction action.
It can be understood that, before interactive images can be recognized, they need to be enrolled, so that different interactive actions can be associated with different password information. Optionally, this can be done through a mobile terminal associated with the neck massager. For example, when the massager is used for the first time, the user first enrolls the interactive action: the mobile terminal connects to the neck massager and displays a password-entry interface on its screen; after the user selects the password information to be enrolled, an image-capture interface opens; the user performs a specific interactive action; and the mobile terminal associates and stores the action together with the corresponding password.
In this embodiment, the interactive action is a hand action. Specifically, the interactive image may first be segmented to obtain the hand region; the hand region is then recognized to determine a specific interaction gesture; and finally the associated password information is determined from the obtained gesture. There may be multiple interactive images or interaction gestures, in which case the password information is determined jointly by several identical or different gestures.
S23: and verifying the password information, and finishing the login of the neck massage instrument after the verification is passed.
In this embodiment, the password information may be information types of different forms, such as a character password, a picture password, and the like; and when the password information is a character password, different interactive actions acquire corresponding different characters, the characters are further connected in a combined manner according to the input sequence of the different interactive actions, whether the group of characters is the same as the character password preset by the user is verified, and login is completed after the group of characters passes verification.
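A sketch of the character-password check described above, assuming the preset password is stored as a plain string; using `hmac.compare_digest` for a constant-time comparison is a hardening choice of this sketch, not something specified in the application:

```python
import hmac
from typing import List

def verify_password(entered_chars: List[str], stored_password: str) -> bool:
    """Join the characters decoded from successive interaction gestures,
    in input order, and compare against the preset character password."""
    candidate = "".join(entered_chars)
    # compare_digest avoids leaking the match position via timing
    return hmac.compare_digest(candidate, stored_password)
```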
When the password information is a picture password, different interactive actions can be combined according to a rule into a picture bearing a certain mark, such as a square shape or a Chinese character; the picture formed by the combined actions is compared with the picture password preset by the user to verify login.
Unlike the prior art, the login method of the neck massager acquires the interactive image through the image sensor and recognizes the interactive action in it to obtain the associated password information, then uses that password information for login verification and completes login once verification passes. In this way, different interactive actions can be used for login verification on the neck massager, working around the device's structural constraints on input and improving convenience of use.
Referring to fig. 4, fig. 4 is a schematic flowchart of a second embodiment of a login method of a neck massager provided by the present application, the method including:
s41: an interactive image is acquired with an image sensor.
Wherein the interactive image comprises a plurality of consecutive image frames.
S42: the plurality of continuous image frames are divided to obtain a plurality of continuous hand regions corresponding to the plurality of continuous image frames.
Optionally, before the plurality of continuous image frames are segmented, image preprocessing needs to be performed on the part of the continuous image frames to remove the influence caused by noise, illumination and the like and enhance useful information of the images. Specifically, the method includes normalization of the size and the gray scale of the image.
In some embodiments, the segmentation to obtain the hand region mainly depends on a segmentation method based on skin color, specifically, the threshold segmentation of RGB (red, green, and blue) color space is performed on the continuous image frame, and then the extraction and separation of the skin color region can be realized by performing operation between the RGB (red, green, and blue) color space and HSV (hue, saturation, and brightness) color space based on the clustering property of skin color distribution. The skin color based segmentation method can segment skin color areas from background images through the clustering characteristics of skin colors in color spaces, realizes hand area segmentation by using skin color characteristic information, and has the characteristics of intuition, high efficiency and accuracy.
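A minimal stand-in for the skin-color segmentation step, using a classic fixed RGB threshold rule rather than the combined RGB/HSV clustering operation the text describes; the exact thresholds are an assumption for illustration:

```python
import numpy as np

def skin_mask(rgb: np.ndarray) -> np.ndarray:
    """Return a boolean mask of likely skin pixels for an H x W x 3
    uint8 RGB image, using a widely cited fixed-threshold rule
    (R > 95, G > 40, B > 20, R dominant, sufficient R spread)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (
        (r > 95) & (g > 40) & (b > 20)
        & (r > g) & (r > b)
        & ((r - np.minimum(g, b)) > 15)
    )
```

The resulting mask would then feed the contour extraction of step S43; a production system would add the HSV check and morphological cleanup.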
S43: a plurality of consecutive hand regions are identified, respectively, to determine a plurality of interactive gestures.
In some embodiments, the skin color contour-based extraction is performed on the plurality of hand regions to obtain corresponding gesture information, which may be a single-hand or double-hand gesture, such as a double-hand heart, a single-hand scissors, or the like, so as to determine a plurality of interactive gestures.
S44: a character corresponding to each of a plurality of interaction gestures is determined.
Specifically, step S44 may proceed as shown in fig. 5:
S441: compare a target interaction gesture among the interaction gestures with each of the standard interaction gestures to obtain a plurality of similarity values.
The standard interaction gestures are set and stored by the user in advance, and each standard gesture is uniquely associated with one character. During the comparison, each target gesture is compared with all standard gestures, yielding a set of similarity values for each target gesture.
Note that because the interactive image consists of multiple consecutive frames and the user cannot complete a gesture change within the time of a single frame, some frames captured while the hand transitions from one gesture to another will contain no complete gesture, and the similarity values obtained for them will be low.
S442: and determining the character associated with the standard interactive gesture corresponding to the maximum similarity numerical value in the similarity numerical values as the character corresponding to the target interactive gesture.
And selecting a character associated with the standard interactive gesture corresponding to the numerical value with the highest similarity value from the plurality of similarity numerical values compared with each target interactive gesture as the character corresponding to the target gesture. Each standard interactive gesture is uniquely associated with one character, for example, the character "a" corresponding to double-handed clapping, the character "1" corresponding to single-handed fist, and the like, which are not limited specifically.
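Steps S441 and S442 can be sketched with cosine similarity over gesture feature vectors; the feature representation, similarity measure, and rejection threshold are assumptions of this sketch, with the threshold also covering the low-similarity mid-transition frames noted above:

```python
import math
from typing import List, Optional, Sequence, Tuple

def cosine(a: Sequence[float], b: Sequence[float]) -> float:
    """Cosine similarity between two feature vectors (0.0 on zero norm)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def decode_gesture(
    target: Sequence[float],
    standards: List[Tuple[str, Sequence[float]]],
    min_sim: float = 0.6,
) -> Optional[str]:
    """Compare the target gesture with every (character, feature_vector)
    standard gesture and return the character of the best match, or None
    when even the best similarity is below min_sim (e.g. a frame captured
    mid-transition that contains no complete gesture)."""
    char, sim = max(
        ((c, cosine(target, v)) for c, v in standards),
        key=lambda pair: pair[1],
    )
    return char if sim >= min_sim else None
```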
In other embodiments, the acquisition interface of the image sensor may be divided into four regions, namely a first control region, a second control region, a third control region and a fourth control region, corresponding respectively to the positions of the four quadrants of a coordinate system. The corresponding character is then determined jointly from the control region in which the target interactive gesture is located and from the comparison between the target interactive gesture and the standard interactive gestures.
Specifically, referring to fig. 6, fig. 6 is a schematic diagram of an image with a plurality of control regions. Gesture A in fig. 6 may be recognized as a "cloth" (open palm) gesture located in the second control region, which corresponds to the second quadrant; the character for the gesture is therefore generated jointly by the "cloth" shape and the second control region. For example, the "cloth" shape may indicate that a numeric character is intended, and its position in the second control region may make that character "2". The category of the character (here, numeric) can be determined using the similarity comparison described above. Similarly, if the same "cloth" gesture were located in the third control region, it would be represented as the numeric character "3". The above embodiments are merely examples and may be configured according to the actual situation.
For the position of an interactive gesture, the fingertip position may be used as the criterion. For example, gesture A in fig. 6 is located entirely within the second control region, so its position can be identified unambiguously. For gesture B in fig. 6, the main body of the hand lies in the fourth control region while the fingers lie in the first control region; judging by the fingertip, gesture B should be determined as being in the first control region. Optionally, the position of an interactive gesture may instead be judged from its central point in the interactive image, which is not specifically limited.
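A sketch of this region rule, under the assumption that the acquisition interface is an ordinary pixel grid split into four quadrant-like regions around its centre and that the fingertip coordinate decides membership. The function names and the "cloth" example mapping are invented for illustration:

```python
def control_region(fingertip, frame_width, frame_height):
    """Map a fingertip coordinate (pixels, origin at top-left) to one of the
    four control regions, laid out like the four quadrants of a coordinate
    system centred on the image (numbering follows the quadrant convention)."""
    x, y = fingertip
    cx, cy = frame_width / 2, frame_height / 2
    if x >= cx and y < cy:
        return 1   # first quadrant: upper right
    if x < cx and y < cy:
        return 2   # second quadrant: upper left
    if x < cx and y >= cy:
        return 3   # third quadrant: lower left
    return 4       # fourth quadrant: lower right

def character_for(gesture_name, region):
    # Example rule from the description: a "cloth" (open palm) gesture in
    # region 2 yields "2", in region 3 yields "3". Other gestures would need
    # their own tables; this mapping is illustrative only.
    if gesture_name == "cloth":
        return str(region)
    raise KeyError(gesture_name)
```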
S35: and determining password information associated with the interactive action according to characters corresponding to the plurality of interactive gestures.
Specifically, according to the acquisition sequence of the multiple interactive gestures, characters corresponding to the multiple interactive gestures are connected to form password information associated with the interactive actions.
Further, the acquisition time interval between any two consecutive interactive gestures may be obtained. If the interval between two interactive gestures is larger than a set time interval, the later-acquired gesture of the two is taken as the first interactive gesture, and its character is connected with the characters of the subsequent interactive gestures to form the password information associated with the interactive action. The set time interval may be 3 to 8 seconds and is not specifically limited.
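The interval rule can be sketched as follows; the timestamp representation, the restart-on-gap behaviour, and the 3-second default are assumptions consistent with the description and with claim 7:

```python
SET_INTERVAL = 3.0  # seconds; the description allows 3 to 8 s

def build_password(timed_chars, set_interval=SET_INTERVAL):
    """timed_chars: list of (timestamp_seconds, character) in acquisition
    order. Whenever the gap between two consecutive gestures exceeds the set
    interval, the later gesture restarts the password, i.e. it becomes the
    first interactive gesture of a new sequence."""
    password = ""
    prev_t = None
    for t, ch in timed_chars:
        if prev_t is not None and t - prev_t > set_interval:
            password = ""   # restart: the later gesture is the first one
        password += ch
        prev_t = t
    return password
```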
S36: and verifying the password information, and completing the login of the neck massager after the verification is passed.
Optionally, in this embodiment, after the verification passes, a corresponding control instruction may be executed directly according to the entered character password, so as to implement specific function control. For example, a plurality of different pieces of password information may be defined, each associated with a different control instruction; after login verification is completed, the instruction corresponding to the entered password is executed, for example starting a comfortable massage mode immediately after a successful login.
In this way, different interactive gestures can be mapped to corresponding characters, and login verification of the neck massager can be performed using those characters, which overcomes the structural limitations of the neck massager that make direct operation inconvenient and improves convenience for the user.
Referring to fig. 7, fig. 7 is a schematic flow chart of a third embodiment of a login method of a neck massager provided in the present application. The difference between this embodiment and the second embodiment is that the object in the interactive image is a facial image and the interactive action is an interactive expression. The method comprises the following steps:
S71: an interactive image is acquired with an image sensor.
S72: the method includes the steps of dividing a plurality of continuous image frames respectively to obtain a plurality of continuous face areas corresponding to the plurality of continuous image frames respectively.
S73: a plurality of consecutive facial regions are identified, respectively, to determine a plurality of interactive expressions.
In this embodiment, geometric features may be adopted for feature extraction: salient features of the facial expression, such as position changes of the eyes, eyebrows and mouth, are located and measured, and features such as size, distance, shape and mutual ratio are determined, so as to perform facial recognition and determine the interactive expression.
Alternatively, a method based on global statistical features, or a method based on frequency domain feature extraction, or a method based on motion feature extraction may also be adopted to identify the face region.
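As an illustration of such geometric features, an eye-openness measure similar to the eye aspect ratio can coarsely classify an expression from a few landmarks. The landmark names, the threshold value, and the expression labels below are invented for this sketch; in practice the points would come from a face-landmark detector:

```python
def eye_aspect_ratio(landmarks):
    """landmarks: dict of named (x, y) points for one eye.
    A small ratio of vertical to horizontal extent means the eye is closed."""
    (_, y_top), (_, y_bottom) = landmarks["top"], landmarks["bottom"]
    (x_left, _), (x_right, _) = landmarks["left"], landmarks["right"]
    vertical = abs(y_bottom - y_top)
    horizontal = abs(x_right - x_left) or 1
    return vertical / horizontal

def classify_expression(left_eye, right_eye, closed_threshold=0.2):
    # Geometric-feature sketch: measure each eye's openness and combine the
    # two booleans into a coarse interactive-expression label.
    left_open = eye_aspect_ratio(left_eye) > closed_threshold
    right_open = eye_aspect_ratio(right_eye) > closed_threshold
    return {
        (True, True): "both_open",
        (False, True): "left_closed_right_open",
        (True, False): "left_open_right_closed",
        (False, False): "both_closed",
    }[(left_open, right_open)]
```

Each such coarse label could then be associated with a character, as described in the next steps.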
S74: and determining a character corresponding to each interactive expression in the plurality of interactive expressions.
Optionally, the character may be determined by comparing the similarity between a target interactive expression among the interactive expressions and the standard interactive expressions; further, the character may be determined in combination with the control region in which the interactive expression is located, for which reference may be made to step S34.
Wherein each standard interactive expression is uniquely associated with one character: for example, closing the left eye while opening the right eye and the mouth may represent one character, and opening both eyes and the mouth may represent another. Alternatively, different characters can be determined according to different opening radians of the mouth.
S75: and determining password information associated with the interactive action according to the character corresponding to each interactive expression.
S76: and verifying the password information, and completing the login of the neck massager after the verification is passed.
Optionally, the interactive expressions and the interactive gestures can be combined to determine the associated password information, which increases the diversity of the password information and improves login security.
Referring to fig. 8, fig. 8 is a schematic flow chart of a fourth embodiment of a login method of a neck massager provided by the present application, the method including:
S81: an interactive image is acquired with an image sensor.
Wherein the interactive image is a plurality of consecutive image frames.
S82: and extracting a region image below the tip of the nose in the interactive image as a lip region image.
S83: lip shape recognition is carried out on the lip area image so as to obtain first character information in the interactive image.
Specifically, lip-shape features in the lip region image are extracted, and the first text information is obtained based on those features. In this embodiment, the lip features may be input into a lip recognition model that recognizes the corresponding pronunciation, and the first text information is then derived from the pronunciation information.
The lip recognition model models the lip-shape sequence through an end-to-end deep neural network and establishes a vocabulary for decoding.
S84: password information associated with the first text information is determined.
In this embodiment, it may be determined whether the first text information matches first preset text information; if so, the password information associated with the first preset text information is obtained. The first preset text information can be set by the user on a mobile terminal. When the text spoken by the user's lips matches the preset text, the parameter information associated with that preset text is used as the password information; the parameter information may be, for example, character information, voice information or picture information. Different first text information corresponds to different parameter information and also to different post-login control instructions; that is, several pieces of password information can each pass login verification while mapping, through their parameter information, to different control instructions executed after login. Optionally, the first text information may also be used directly as the password information for subsequent login verification.
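The matching step can be pictured as a lookup from the recognized text to user-configured parameter information; every phrase, password value, and instruction name below is invented for illustration:

```python
# Hypothetical user-configured table (set on the mobile terminal): each preset
# phrase carries its associated password information and a post-login
# control instruction.
PRESET_TEXT = {
    "open sesame": {"password": "pw-1234", "instruction": "comfort_massage"},
    "night mode":  {"password": "pw-5678", "instruction": "gentle_massage"},
}

def password_for_text(first_text):
    """Return (password, post-login instruction) when the lip-read text
    matches a preset phrase, else None. The text itself could alternatively
    serve directly as the password, as the description notes."""
    entry = PRESET_TEXT.get(first_text)
    if entry is None:
        return None
    return entry["password"], entry["instruction"]
```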
S85: and verifying the password information, and completing the login of the neck massager after the verification is passed.
Referring to fig. 9, fig. 9 is a schematic flow chart of a fifth embodiment of a login method of a neck massager provided by the present application, the method including:
S91: a plurality of successive gesture image frames of a user are acquired with an image sensor.
S92: coordinate information of the hand in a plurality of continuous gesture image frames is obtained so as to obtain a plurality of continuous target coordinate information.
For example, a coordinate system may be established over the shooting area of the image sensor. When the user interacts with gestures, the fingertip in each image frame is first identified and its coordinate information in that frame is then acquired, so that a plurality of pieces of continuous target coordinate information are obtained.
S93: and sequentially connecting a plurality of continuous target coordinate information to form a target track image.
S94: and identifying the target track image to obtain password information associated with the target track image.
In some embodiments, before this step is performed, it may be determined whether the target track image is a straight line. It should be understood that the track of a hand movement is never an absolutely straight line, so the target track image may be regarded as a straight line as long as it is substantially straight. To this end, a linear computation may be performed on the plurality of pieces of continuous target coordinate information: any two pieces of target coordinate information are selected, the expression of the straight line through them is calculated from their abscissas and ordinates in the coordinate system, and its slope k1 is obtained; the slopes k2, k3, and so on of the straight lines between the remaining pairs of target coordinates are calculated in the same way. The mean and variance of the resulting slopes are then computed, and if the variance is smaller than a preset threshold (for example, smaller than 1), the target track image formed by these target coordinates may be determined to be a straight line.
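A sketch of this straightness test, assuming the trajectory is a list of fingertip coordinates and using the pairwise-slope variance as described (near-vertical trajectories, whose slopes are undefined or explode, are simply rejected in this sketch):

```python
from itertools import combinations

def is_roughly_straight(points, variance_threshold=1.0):
    """points: fingertip coordinates in acquisition order. Computes the slope
    of the line through every pair of points and accepts the trajectory as a
    straight line when the slope variance stays below the threshold."""
    slopes = []
    for (x1, y1), (x2, y2) in combinations(points, 2):
        if x1 == x2:
            return False  # vertical pair: slope undefined, out of scope here
        slopes.append((y2 - y1) / (x2 - x1))
    mean = sum(slopes) / len(slopes)
    variance = sum((k - mean) ** 2 for k in slopes) / len(slopes)
    return variance < variance_threshold
```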
Further, after the target track image is determined to be a straight line, its direction can be identified from the first and last pieces of target coordinate information among the plurality of pieces of continuous target coordinate information, the specific direction being obtained in the coordinate system. For example, the angles measured from the positive X axis may be divided into sectors, with 0 to 45 degrees corresponding to one character, 45 to 90 degrees to another, and so on; the obtained direction is matched against its included angle with the X axis to determine the corresponding character. The characters corresponding to a plurality of target gesture tracks are then connected in the order the tracks were acquired to form the password information associated with the interaction tracks.
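The direction-to-character rule can be sketched with `math.atan2`, assuming a mathematical coordinate system and 45-degree sectors numbered 0 to 7. The sector-to-character mapping is an assumption, since the description only pairs each angular range with "one character":

```python
import math

def direction_character(start, end, sector_degrees=45):
    """Quantise the angle of the start-to-end line, measured from the
    positive X axis, into fixed-width sectors and use the sector index as
    the character."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    sector = int(angle // sector_degrees)   # 0..7 for 45-degree sectors
    return str(sector)
```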
In other embodiments, when the target trajectory is not a straight line, step S94 may be the step shown in fig. 10, which is as follows:
S941: and acquiring the first target coordinate information and the last target coordinate information in a plurality of continuous target coordinate information.
S942: and determining the indication direction of the gesture image based on the first target coordinate information and the last target coordinate information.
S943: the indication direction is identified to determine password information corresponding to the indication direction.
The specific pointing direction of the straight line formed by the start point (the first target coordinate) and the end point (the last target coordinate) of the non-linear target track image can be obtained in the coordinate system. The shooting area of the image sensor is divided into four regions according to the coordinate quadrants, and the straight line connecting the start point and end point of the non-linear target track is extended until it intersects the boundary of these regions.
It can be understood that the extension of any straight line intersects the boundary of the shooting area. Password information can therefore be identified and confirmed according to the position of the intersection between the extension line of the target track and the region boundary: for example, if the intersection lies in the region corresponding to the first quadrant, the corresponding password information is generated accordingly. In this way, the meaning of the gesture interaction can be identified accurately to determine the associated password information.
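A sketch of this boundary-intersection rule, assuming coordinates centred on the image and a rectangular shooting area; the quadrant numbering and the handling of points exactly on an axis are assumptions:

```python
def boundary_quadrant(start, end, half_width=1.0, half_height=1.0):
    """Extend the start-to-end ray (coordinates centred on the image) until it
    meets the rectangular boundary of the shooting area, then report which
    quadrant region the intersection falls in."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    # Smallest positive scale t at which start + t*(dx, dy) reaches an edge.
    candidates = []
    if dx:
        for bx in (half_width, -half_width):
            t = (bx - start[0]) / dx
            if t > 0:
                candidates.append(t)
    if dy:
        for by in (half_height, -half_height):
            t = (by - start[1]) / dy
            if t > 0:
                candidates.append(t)
    t = min(candidates)
    ix, iy = start[0] + t * dx, start[1] + t * dy
    if ix >= 0 and iy >= 0:
        return 1
    if ix < 0 and iy >= 0:
        return 2
    if ix < 0 and iy < 0:
        return 3
    return 4
```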
S95: and verifying the password information, and completing the login of the neck massager after the verification is passed.
Different from the prior art, different password information can be obtained from the pointing direction of the gesture's target track image to log in to the neck massager, which strengthens the correspondence between gesture tracks and password information and improves convenience for the user.
Referring to fig. 11, fig. 11 is a schematic structural view of another embodiment of the neck massager 20 provided in the present application, and the neck massager includes a massager body 201, a massage assembly 202, a communication circuit 203, an image sensor 204, and a control circuit 205. Wherein, the massage component 202 is arranged on the massage apparatus body 201; the communication circuit 203 is arranged on the massage apparatus body 201; the image sensor 204 is arranged on the massage apparatus body 201; the control circuit 205 is disposed on the massage device body 201, electrically coupled to the massage assembly 202, the communication circuit 203, and the image sensor 204, and configured to control the massage assembly 202, the communication circuit 203, and the image sensor 204 to implement the following steps:
acquiring an interactive image by using an image sensor, wherein the image sensor is arranged on the neck massager; identifying the interaction action in the interaction image to obtain password information associated with the interaction action; and verifying the password information, and completing the login of the neck massager after the verification is passed.
It can be understood that the neck massager 20 in this embodiment may implement the method steps of any of the above embodiments, and the specific implementation steps thereof may refer to the above embodiments, which are not described herein again.
Referring to fig. 12, fig. 12 is a schematic structural diagram of an embodiment of a computer-readable storage medium 30 provided in the present application, where the computer-readable storage medium is used for storing program data 31, and the program data 31 is used for implementing the following method steps when being executed by a control circuit:
acquiring an interactive image by using an image sensor, wherein the image sensor is arranged on the neck massager; identifying the interaction action in the interaction image to obtain password information associated with the interaction action; and verifying the password information, and completing the login of the neck massager after the verification is passed.
It can be understood that the computer-readable storage medium 30 in this embodiment, when applied to a neck massager, can implement the method steps of any of the above embodiments; for the specific implementation steps, reference may be made to the above embodiments, which are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units in the other embodiments described above may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (15)

1. A login method of a neck massager is characterized by comprising the following steps:
acquiring an interactive image by using an image sensor, wherein the image sensor is arranged on the neck massager;
identifying an interaction action in the interaction image to obtain password information associated with the interaction action;
and verifying the password information, and finishing the login of the neck massager after the verification is passed.
2. The method of claim 1,
before the step of acquiring an interactive image by using the image sensor, the method further comprises the following steps:
after the neck massager is started, detecting whether a moving target exists in a preset range by using an infrared sensor; wherein, the infrared sensor is arranged on the neck massager;
and if so, activating the image sensor.
3. The method of claim 1,
the step of identifying the interaction action in the interaction image to obtain the password information associated with the interaction action comprises:
segmenting the interactive image to obtain a hand region in the interactive image;
identifying the hand region to determine an interaction gesture;
determining password information associated with the interaction gesture.
4. The method of claim 1,
the interactive image comprises a plurality of successive image frames;
the step of identifying the interaction action in the interaction image to obtain the password information associated with the interaction action comprises:
dividing the plurality of continuous image frames respectively to obtain a plurality of continuous hand areas corresponding to the plurality of continuous image frames respectively;
identifying the plurality of consecutive hand regions, respectively, to determine a plurality of interactive gestures;
determining a character corresponding to each of the plurality of interaction gestures;
and determining password information associated with the interactive action according to characters corresponding to the interactive gestures.
5. The method of claim 4,
the step of determining a character corresponding to each of the plurality of interaction gestures comprises:
respectively carrying out similarity comparison on a target interactive gesture in the interactive gestures and a plurality of standard interactive gestures to obtain a plurality of similarity values; wherein each of the standard interaction gestures is associated with a character;
and determining the character associated with the standard interactive gesture corresponding to the maximum similarity numerical value in the similarity numerical values as the character corresponding to the target interactive gesture.
6. The method of claim 4,
the step of determining password information associated with the interactive action according to characters corresponding to the interactive gestures comprises:
and connecting characters corresponding to the interactive gestures according to the acquisition sequence of the interactive gestures to form password information associated with the interactive actions.
7. The method of claim 6,
the connecting the characters corresponding to the interactive gestures according to the acquisition sequence of the interactive gestures to form password information associated with the interactive actions includes:
acquiring the acquisition time interval of any two continuous interactive gestures in the interactive gestures;
and if the acquisition time interval is larger than the set time interval, taking one interaction gesture acquired later in the two interaction gestures as a first interaction gesture, and connecting the corresponding character with the characters of the subsequent interaction gestures to form password information associated with the interaction action.
8. The method of claim 1,
the interactive image comprises a plurality of successive image frames;
the step of identifying the interaction action in the interaction image to obtain the password information associated with the interaction action comprises:
dividing the plurality of continuous image frames respectively to obtain a plurality of continuous face areas corresponding to the plurality of continuous image frames respectively;
identifying the plurality of continuous facial regions, respectively, to determine a plurality of interactive expressions;
determining a character corresponding to each interactive expression in the plurality of interactive expressions;
and determining password information associated with the interactive action according to the character corresponding to each interactive expression.
9. The method of claim 8,
the step of determining the character corresponding to each interactive expression in the plurality of interactive expressions comprises:
respectively carrying out similarity comparison on the target interactive expressions of the interactive expressions and a plurality of standard interactive expressions to obtain a plurality of similarity values; wherein each standard interactive expression is associated with a character;
and determining the character associated with the standard interactive expression corresponding to the maximum similarity numerical value in the similarity numerical values as the character corresponding to the target interactive expression.
10. The method of claim 1,
the step of identifying the interaction action in the interaction image to obtain the password information associated with the interaction action comprises:
extracting a region image below the tip of a nose in the interactive image as a lip region image, wherein the interactive image comprises at least one continuous image frame;
lip shape recognition is carried out on the lip area image so as to obtain first character information in the interactive image;
and determining password information associated with the first text information.
11. The method of claim 10,
the lip recognition of the lip region image to obtain the first text information in the interactive image includes:
extracting lip-shaped features in the lip region image;
obtaining first text information based on the lip-shaped characteristics;
the step of determining the password information associated with the first text information comprises:
judging whether the first character information is matched with first preset character information or not;
and if so, acquiring password information associated with the first preset character information.
12. The method of claim 1,
the method further comprises the following steps:
acquiring a plurality of continuous gesture image frames of a user by using an image sensor;
acquiring coordinate information of a hand in a plurality of continuous gesture image frames to obtain a plurality of continuous target coordinate information;
sequentially connecting a plurality of continuous target coordinate information to form a target track image;
the step of identifying the interaction action in the interaction image to obtain the password information associated with the interaction action comprises:
and identifying the target track image to obtain password information associated with the target track image.
13. The method of claim 12,
the step of identifying the target track image to obtain the password information associated with the target track image includes:
acquiring first target coordinate information and last target coordinate information in the plurality of continuous target coordinate information;
determining an indication direction of the gesture image based on the first target coordinate information and the last target coordinate information;
and identifying the indication direction to determine password information corresponding to the indication direction.
14. A neck massager, characterized in that the neck massager comprises:
a massage apparatus body;
the massage component is arranged on the massage instrument body;
the communication circuit is arranged on the massage instrument body;
the image sensor is arranged on the massager body;
a control circuit disposed on the massager body and electrically coupled to the massage assembly, the communication circuit and the image sensor, for controlling the massage assembly, the communication circuit and the image sensor to implement the method of any one of claims 1 to 13.
15. A computer-readable storage medium for storing program data for implementing the method of any one of claims 1-13 when executed by control circuitry.
CN202010443677.8A 2020-05-22 2020-05-22 Login method of neck massager, neck massager and storage medium Pending CN112089595A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010443677.8A CN112089595A (en) 2020-05-22 2020-05-22 Login method of neck massager, neck massager and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010443677.8A CN112089595A (en) 2020-05-22 2020-05-22 Login method of neck massager, neck massager and storage medium

Publications (1)

Publication Number Publication Date
CN112089595A true CN112089595A (en) 2020-12-18

Family

ID=73750099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010443677.8A Pending CN112089595A (en) 2020-05-22 2020-05-22 Login method of neck massager, neck massager and storage medium

Country Status (1)

Country Link
CN (1) CN112089595A (en)

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116404A (en) * 2013-02-25 2013-05-22 广东欧珀移动通信有限公司 Face recognition unlocking method and mobile smart terminal

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103809842A (en) * 2012-11-07 2014-05-21 上海揆志网络科技有限公司 Method and device for executing system functions by hand gesture identification
CN103839040A (en) * 2012-11-27 2014-06-04 Ricoh Co., Ltd. Gesture identification method and device based on depth images
CN103997482A (en) * 2013-02-19 2014-08-20 华为技术有限公司 Method and system for user registration in desktop cloud service
CN103116404A (en) * 2013-02-25 2013-05-22 广东欧珀移动通信有限公司 Face recognition unlocking method and mobile smart terminal
TWM467128U (en) * 2013-07-08 2013-12-01 Chunghwa Telecom Co Ltd Door access device with hand gestures and number input
CN104091134A (en) * 2014-07-16 2014-10-08 谭皓文 Password input method combining security and convenience
CN105307014A (en) * 2014-07-29 2016-02-03 冠捷投资有限公司 Gesture recognition based password entry method
CN104809387A (en) * 2015-03-12 2015-07-29 山东大学 Non-contact unlocking method and device based on video image gesture recognition
CN105025018A (en) * 2015-07-06 2015-11-04 国网山东寿光市供电公司 Method for safety verification in communication process
CN205286890U (en) * 2015-09-26 2016-06-08 上品一家武汉有限公司 Internet of Things-based intelligent massager monitored by mobile phone app
CN106503620A (en) * 2016-09-26 2017-03-15 深圳奥比中光科技有限公司 Gesture-based numeric password input method and system
CN107122646A (en) * 2017-04-26 2017-09-01 大连理工大学 Method for realizing lip-reading unlocking
CN108784175A (en) * 2017-04-27 2018-11-13 芜湖美的厨卫电器制造有限公司 Bathroom mirror and gesture control device and method therefor
CN107590887A (en) * 2017-09-06 2018-01-16 爽客智能设备(上海)有限公司 Remote password unlocking method for massage chair
CN108804893A (en) * 2018-03-30 2018-11-13 百度在线网络技术(北京)有限公司 Face-recognition-based control method, device and server
CN108852771A (en) * 2018-05-18 2018-11-23 湖北淇思智控科技有限公司 Cloud control platform for intelligent massage pillow
CN109806485A (en) * 2018-12-27 2019-05-28 成都助眠科技有限公司 Control method and system for a shared intelligent pillow based on WeChat
CN109448209A (en) * 2019-01-07 2019-03-08 上海早米网络科技有限公司 Dynamic-password massage control system and control method based on offline mode
CN209708238U (en) * 2019-06-04 2019-11-29 上海志禧贸易有限公司 Foot massage apparatus with dynamic password control
CN110569817A (en) * 2019-09-12 2019-12-13 北京邮电大学 System and method for realizing vision-based gesture recognition
CN110717407A (en) * 2019-09-19 2020-01-21 平安科技(深圳)有限公司 Face recognition method, device and storage medium based on lip-language password
CN110867019A (en) * 2019-12-11 2020-03-06 沈阳圣达金卡科技有限公司 Shared massage gun terminal device and interaction method thereof

Similar Documents

Publication Publication Date Title
JP4481663B2 (en) Motion recognition device, motion recognition method, device control device, and computer program
Al-Rahayfeh et al. Eye tracking and head movement detection: A state-of-art survey
US9135503B2 (en) Fingertip tracking for touchless user interface
Nair et al. Hand gesture recognition system for physically challenged people using IOT
CN204791017U (en) Mobile terminal users authentication device based on many biological characteristics mode
Luo et al. Hand gesture recognition for human-robot interaction for service robot
JP6799525B2 (en) Biological information analyzer and hand skin analysis method
CN108647504B (en) Method and system for realizing information safety display
WO2022127494A1 (en) Pose recognition model training method and apparatus, pose recognition method, and terminal device
US20120264095A1 (en) Emotion abreaction device and using method of emotion abreaction device
CN106503619B (en) Gesture recognition method based on BP neural network
CN108629278B (en) System and method for realizing information safety display based on depth camera
WO2020244160A1 (en) Terminal device control method and apparatus, computer device, and readable storage medium
Vishwakarma et al. Simple and intelligent system to recognize the expression of speech-disabled person
Zahra et al. Camera-based interactive wall display using hand gesture recognition
CN112089589B (en) Control method of neck massager, neck massager and storage medium
Vasanthan et al. Facial expression based computer cursor control system for assisting physically disabled person
Dalka et al. Human-Computer Interface Based on Visual Lip Movement and Gesture Recognition.
CN112089595A (en) Login method of neck massager, neck massager and storage medium
Sonoda et al. A letter input system based on handwriting gestures
CN112527103B (en) Remote control method and device for display equipment, equipment and computer readable storage medium
CN115421590A (en) Gesture control method, storage medium and camera device
Proença et al. A gestural recognition interface for intelligent wheelchair users
Shin et al. Welfare interface implementation using multiple facial features tracking for the disabled people
Dalka et al. Lip movement and gesture recognition for a multimodal human-computer interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201218)