CN112089589A - Control method of neck massager, neck massager and storage medium - Google Patents

Control method of neck massager, neck massager and storage medium

Info

Publication number
CN112089589A
CN112089589A
Authority
CN
China
Prior art keywords
gesture
image
gesture image
neck massager
control instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010442466.2A
Other languages
Chinese (zh)
Other versions
CN112089589B (en)
Inventor
刘杰
肖华
余建雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SKG Health Technologies Co Ltd.
Original Assignee
SKG Health Technologies Co Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SKG Health Technologies Co Ltd. filed Critical SKG Health Technologies Co Ltd.
Priority to CN202010442466.2A priority Critical patent/CN112089589B/en
Publication of CN112089589A publication Critical patent/CN112089589A/en
Application granted granted Critical
Publication of CN112089589B publication Critical patent/CN112089589B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H7/00 Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16 Physical interface with patient
    • A61H2201/1602 Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/1609 Neck
    • A61H2201/1611 Holding means therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5007 Control means thereof computer controlled
    • A61H2201/501 Control means thereof computer controlled connected to external computer devices or networks
    • A61H2201/5012 Control means thereof computer controlled connected to external computer devices or networks using the internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2205/00 Devices for specific parts of the body
    • A61H2205/04 Devices for specific parts of the body neck

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Rehabilitation Therapy (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Dermatology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a control method of a neck massager, a neck massager and a storage medium. The method includes: acquiring a gesture image of a user with an image sensor, wherein the image sensor is arranged on the neck massager; recognizing the gesture image to determine a control instruction associated with it; and performing the corresponding function control based on the control instruction. In this way, different gesture images can be used to control the related functions of the neck massager, overcoming the structural limitations that make the neck massager inconvenient to operate, and improving the user experience.

Description

Control method of neck massager, neck massager and storage medium
Technical Field
The application relates to the technical field of medical treatment, and in particular to a control method of a neck massager, a neck massager, and a storage medium.
Background
With the development of society, people pay more attention to their physical health. Much of their energy goes into daily work and study, leaving the neck, waist and other parts of the body sore or uncomfortable; elderly people, as their bodies age, suffer various aches and pains. As a result, more and more people have begun to use neck massagers to relieve pain and fatigue.
However, in practical use, although a neck massager can relieve fatigue, its functions are limited and cannot meet some needs of users.
Disclosure of Invention
In order to solve the above problems, the present application provides a control method of a neck massager, a neck massager, and a storage medium, which can realize related function control of the neck massager by using different gesture images, thereby improving the user experience.
In order to solve the technical problem, the application adopts a technical scheme that: a control method of a neck massager is provided, which comprises the following steps: acquiring a gesture image of a user by using an image sensor, wherein the image sensor is arranged on the neck massager; recognizing the gesture image to determine a control instruction associated with the gesture image; and realizing corresponding function control based on the control instruction.
Before the step of acquiring the gesture image of the user by using the image sensor, the method further comprises the following steps: after the neck massager is started, detecting whether a moving target exists in a preset range by using an infrared sensor; wherein, the infrared sensor is arranged on the neck massager; if so, the image sensor is activated.
The step of recognizing the gesture image to determine the control instruction associated with the gesture image comprises the following steps: acquiring the actual distance between the hand and the neck massager in the gesture image; determining a corresponding grade relation according to the actual distance; the gesture image is recognized based on the hierarchical relationship to determine a control instruction associated with the gesture image.
The image sensor comprises a first camera assembly and a second camera assembly; the step of obtaining the actual distance between the hand and the neck massager in the gesture image comprises the following steps: the first camera component acquires a first gesture image, and the second camera component acquires a second gesture image; the first gesture image and the second gesture image are acquired at the same time; identifying the first gesture image to acquire a first feature point in the first gesture image, and identifying the second gesture image to acquire a second feature point in the second gesture image; wherein the first characteristic point and the second characteristic point are the same point; acquiring a first coordinate position from the first gesture image based on the first characteristic point, and acquiring a second coordinate position from the second gesture image based on the second characteristic point; and calculating the actual distance between the hand and the neck massager in the gesture image according to the first coordinate position and the second coordinate position.
Wherein the step of calculating the actual distance between the hand in the gesture image and the neck massager according to the first coordinate position and the second coordinate position includes calculating with the following formula:

Z = (B × f) / (X_L − X_R)

wherein X_R is the abscissa in the second coordinate position, X_L is the abscissa in the first coordinate position, B is the center distance between the first camera assembly and the second camera assembly, f is the focal length of the first camera assembly and the second camera assembly, and Z is the actual distance between the hand in the gesture image and the neck massager.
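This is the standard binocular (stereo) ranging relation: the disparity X_L − X_R between the matched feature points shrinks as the hand moves away. A minimal Python sketch; the baseline, focal length and pixel coordinates below are illustrative values, not values from the patent:

```python
def stereo_distance(x_left, x_right, baseline_mm, focal_px):
    """Depth from binocular disparity: Z = B * f / (X_L - X_R).

    x_left, x_right: abscissa of the same feature point in the first
    (left) and second (right) camera images, in pixels.
    baseline_mm: center distance B between the two camera assemblies.
    focal_px: shared focal length f of both assemblies, in pixels.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("X_L must exceed X_R for a point in front of the cameras")
    return baseline_mm * focal_px / disparity

# Illustrative values: 120 mm baseline, 600 px focal length, 240 px disparity
z_mm = stereo_distance(x_left=500, x_right=260, baseline_mm=120, focal_px=600)
# 120 * 600 / 240 = 300.0 mm, i.e. the hand is about 30 cm away
```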
Wherein, the step of determining the corresponding grade relation according to the actual distance comprises the following steps: judging whether the actual distance is greater than a first preset threshold and smaller than a second preset threshold; if yes, determining that the actual distance belongs to a first grade range; otherwise, continuously judging whether the actual distance is greater than a second preset threshold and smaller than a third preset threshold.
Wherein, the step of determining the corresponding grade relation according to the actual distance further comprises: judging whether the actual distance is greater than a second preset threshold and smaller than a third preset threshold; if yes, determining that the actual distance belongs to a second grade range; otherwise, the actual distance is determined to belong to the third level range.
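The two judgment steps above amount to binning the measured distance into three ranges bounded by three ascending preset thresholds. A Python sketch; the concrete millimeter values are placeholders, since the patent does not state specific distances:

```python
def distance_level(distance_mm, t1=50, t2=150, t3=300):
    """Bin the hand-to-massager distance into three level ranges.

    t1 < t2 < t3 are the first, second and third preset thresholds;
    the default values here are illustrative placeholders.
    """
    if t1 < distance_mm < t2:
        return 1  # first level range
    if t2 < distance_mm < t3:
        return 2  # second level range
    return 3      # otherwise: third level range
```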
Before the step of recognizing the gesture image to determine the control instruction associated with the gesture image, the method further includes: associating a plurality of preset gesture information items one-to-one with a plurality of preset control instructions; and storing the association relations between the preset gesture information and the preset control instructions in an image feature library, so as to facilitate recognition and matching.
The step of recognizing the gesture image to determine the control instruction associated with the gesture image comprises the following steps: recognizing the gesture image to acquire gesture information in the image; matching the gesture information with a plurality of preset control instructions in an image feature library; and if the matching is successful, determining a control instruction corresponding to the gesture information.
Wherein the step of recognizing the gesture image to acquire gesture information in the image includes: separating the hand image area in the gesture image by using a skin-color-based segmentation method; and extracting the contour corresponding to the hand image area to acquire the gesture information.
Wherein, the step of determining the control instruction corresponding to the gesture information comprises: dividing an acquisition interface of an image sensor into a plurality of control areas; and determining a corresponding control instruction according to different control areas where the central points of the gesture images are located and the matching relation between the gesture information and the preset control instruction.
Wherein, the method also comprises: acquiring a plurality of continuous gesture image frames of a user by using an image sensor; acquiring coordinate information of a hand in a plurality of continuous gesture image frames to obtain a plurality of continuous target coordinate information; sequentially connecting a plurality of continuous target coordinate information to form a target track image; the step of recognizing the gesture image to determine the control instruction associated with the gesture image comprises the following steps: the target track image is identified to determine control instructions associated with the target track image.
The step of identifying the target track image to determine the control instruction associated with the target track image includes: acquiring first target coordinate information and last target coordinate information in a plurality of continuous target coordinate information; determining the indication direction of the gesture image based on the first target coordinate information and the last target coordinate information; and identifying the indication direction to determine a control command corresponding to the indication direction.
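Using only the first and last target coordinate information, the indication direction reduces to the sign of the dominant displacement axis. A hedged Python sketch; the direction names and the tie-breaking rule are illustrative, since the patent only maps an indication direction to a control command:

```python
def swipe_direction(track):
    """Indication direction from the first and last target coordinates.

    track: list of (x, y) hand coordinates taken from consecutive
    gesture image frames, in order.
    """
    (x0, y0), (x1, y1) = track[0], track[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):          # horizontal displacement dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```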
In order to solve the above technical problem, another technical solution adopted by the present application is: there is provided a neck massager comprising: a massage apparatus body; the massage component is arranged on the massage instrument body; the communication circuit is arranged on the massage instrument body; the image sensor is arranged on the massage instrument body; and the control circuit is arranged on the massage instrument body, is electrically coupled with the massage component, the communication circuit and the image sensor, and is used for controlling the massage component, the communication circuit and the image sensor so as to realize the control method of the neck massage instrument.
In order to solve the above technical problem, the present application adopts another technical solution: there is provided a computer-readable storage medium for storing program data for implementing the control method of the above-described neck massage apparatus when the program data is executed by a control circuit.
The beneficial effects of the embodiments of the present application are as follows: different from the prior art, the control method of the neck massager obtains a gesture image of the user through an image sensor and recognizes the gesture image to determine the control instruction associated with it, so that the corresponding function control is realized based on the control instruction. In this way, different gesture images can be used to control the related functions of the neck massager, overcoming the structural limitations that make the neck massager inconvenient to operate, and improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort. Wherein:
FIG. 1 is a schematic structural view of an embodiment of a neck massager provided herein;
fig. 2 is a schematic flow chart of a first embodiment of a control method of the neck massager provided by the present application;
fig. 3 is a schematic flowchart of a second embodiment of the control method of the neck massager provided by the present application;
fig. 4 is a detailed flowchart of step S36;
FIG. 5 is a graphical illustration of a plurality of control areas provided herein;
fig. 6 is a schematic flowchart of a third embodiment of a control method of the neck massager provided by the present application;
fig. 7 is a detailed flowchart of step S63;
FIG. 8 is a schematic illustration of the level relationships and gesture images provided herein;
fig. 9 is a schematic flow chart of a fourth embodiment of a control method of the neck massager provided by the present application;
fig. 10 is a schematic view of the principle of binocular ranging;
fig. 11 is a schematic flow chart of a fifth embodiment of a control method of the neck massager provided in the present application;
fig. 12 is a detailed flowchart of step S104;
FIG. 13 is a schematic structural view of a further embodiment of the neck massager provided herein;
FIG. 14 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of a neck massager 10 provided by the present application, which includes an elastic arm 11, a massage component 12, a sensing component 13, a first handle 14, a second handle 15, and a speaker 16.
The first handle 14 and the second handle 15 are fixedly connected to two sides of the elastic arm 11, the massage component 12 is disposed on one side of the elastic arm 11 facing the neck of the human body, and the massage component 12 can emit electric pulses. The sensing assembly 13 is disposed outside of the first handle 14 or the second handle 15. The speaker 16 is disposed outside the first handle 14 or the second handle 15 for playing audio data.
Alternatively, the electrode pads of the massage unit 12 are not limited to the protruding mushroom structure, but may be flush with or slightly protruding from the surface of the side of the elastic arms 11 facing the neck of the human body. The electrode sheet can also be conductive silica gel.
Optionally, the sensing assembly 13 includes a first sensing assembly 131 and a second sensing assembly 132, and the first sensing assembly 131 and the second sensing assembly 132 are connected.
Further, the first sensing component 131 may be an image sensor, specifically, a depth camera, and the second sensing component 132 may be an infrared sensor.
Further, first sensing assembly 131 may further include a first camera assembly 1311 and a second camera assembly 1312, wherein first camera assembly 1311 may be disposed outside first handle 14, second camera assembly 1312 may be disposed outside second handle 15, and first camera assembly 1311 and second camera assembly 1312 are symmetrically disposed.
Alternatively, the first and second handles 14, 15 may be separate pieces or may be part of an integrally formed neck massager.
Optionally, the neck massager 10 is further provided with a communication module for communicating with other terminals.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of a control method of a neck massager provided by the present application, the method including:
s21: and acquiring a gesture image of the user by using the image sensor.
Optionally, before acquiring the gesture image of the user, an infrared sensor may first be used to detect whether a moving target is present within a preset range, in order to decide whether to acquire the gesture image. Specifically, because its power consumption is relatively high, the image sensor normally stays in a shutdown or dormant state and does not acquire gesture images, while the infrared sensor, whose power consumption is low, remains in a working state for long periods. When the infrared sensor detects a moving target, it sends a signal to the neck massager to activate the image sensor and acquire the user's gesture. In other embodiments, the user may also activate the image sensor by manually touching a button arranged on the neck massager. Both the image sensor and the infrared sensor are arranged on the neck massager.
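The power-saving hand-off described above (low-power infrared sensor always on, high-power image sensor woken on demand) can be sketched as a small state holder; this is an assumed simplification of the wake-up logic, not the patent's firmware:

```python
class SensorGate:
    """Wake-up logic: the low-power infrared sensor stays on, and the
    image sensor is activated only when a moving target is detected or
    the user touches the button (hypothetical simplification)."""

    def __init__(self):
        self.camera_active = False  # image sensor starts dormant

    def on_infrared(self, motion_detected):
        """Called for each infrared reading; returns the camera state."""
        if motion_detected:
            self.camera_active = True
        return self.camera_active

    def on_button_press(self):
        """Manual touch activation, as in the alternative embodiment."""
        self.camera_active = True
        return self.camera_active
```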
Optionally, the neck massager may further comprise an illumination assembly for providing sufficient light to acquire the gesture image when the environment is dark.
Image sensors may generally be divided into two types: CCD (Charge-Coupled Device) image sensors and CMOS (Complementary Metal-Oxide-Semiconductor) image sensors. Since CMOS sensors are small and consume little power, this embodiment mainly uses a CMOS image sensor to acquire the user's gesture image.
S22: the gesture image is recognized to determine a control instruction associated with the gesture image.
Before gesture images can be recognized, they need to be entered, so that different gesture images can be associated with different function controls. Optionally, this may be done through a mobile terminal associated with the neck massager. For example, when the neck massager is used for the first time, the user may enter gestures for different functions: the mobile terminal is connected to the neck massager and displays a plurality of functions on its screen; after the user selects the function to be entered, an image acquisition interface opens and the user makes a gesture; the mobile terminal then associates the gesture with the corresponding function and stores it.
In this embodiment, after the image sensor acquires the gesture image of the user, the gesture image is sent to the mobile terminal associated with the neck massager. The mobile terminal compares the gesture image with preset gesture images to determine the corresponding control instruction and sends that instruction to the neck massager. Of course, this process can also be performed on the neck massager itself, and is not specifically limited here.
S23: and realizing corresponding function control based on the control instruction.
In this embodiment, different gestures of the user may correspond to different control instructions; for example, a "scissors" gesture may represent an instruction to increase the massage intensity, a "cloth" gesture may represent an instruction to change the massage mode, and a "fist" gesture may represent an instruction to turn the device on or off. No specific limitation is imposed here, and the user may set the control instructions as needed.
Different from the prior art, the control method of the neck massager provided by the present application acquires a gesture image of the user through an image sensor and recognizes the gesture image to determine the control instruction associated with it, so that the corresponding function control is realized based on the control instruction. In this way, different gesture images can be used to control the related functions of the neck massager, overcoming the structural limitations that make the neck massager inconvenient to operate, and improving the user experience.
Referring to fig. 3, fig. 3 is a schematic flowchart of a second embodiment of the control method of the neck massager provided by the present application, the method including:
s31: and acquiring a gesture image of the user by using the image sensor.
S32: and carrying out one-to-one corresponding association setting on the plurality of preset gesture information and the plurality of preset control instructions.
In this embodiment, the association setting may be implemented through a mobile terminal associated with the neck massager: the gesture image of the user is acquired by the image sensor of the neck massager and sent to the mobile terminal for the corresponding function setting. For example, the setting may be performed in an application (APP): the user first selects a control function in the application, such as shutdown or adjustment of massage intensity, choosing each preset control instruction in turn; the mobile terminal then sends a signal to the neck massager so that the neck massager starts the image sensor and captures the gesture, completing the association between the gesture information and the control instruction. Alternatively, the whole process can be completed on the mobile terminal. However, since subsequent recognition is also performed with the image sensor of the neck massager, recognition accuracy is better when the entry is completed jointly by the neck massager and the mobile terminal, which is therefore preferred.
S33: and storing the incidence relation between the preset gesture information and the preset control instructions in an image feature library, so as to facilitate recognition and matching.
S34: and recognizing the gesture image to acquire gesture information in the image.
Specifically, this step can be implemented as follows:
1) Before the gesture image is recognized, it needs to be preprocessed to remove the influence of noise, illumination and the like and to strengthen the useful information in the image.
2) Threshold segmentation in the RGB (red, green, blue) color space is performed on the gesture image, and is then combined with the clustering property of the skin-color distribution in the HSV (hue, saturation, value) color space; operating on the two color spaces together allows the skin-color region to be extracted and separated. A skin-color-based segmentation method can separate skin-color regions from the background image through the clustering characteristics of skin color in the color space, achieving gesture-region segmentation with skin-color feature information; it is intuitive, efficient and accurate.
3) The contour corresponding to human skin color in the hand image is extracted to obtain the gesture information, which may describe a one-handed or two-handed gesture, such as a two-handed heart gesture or one-handed scissors.
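A minimal, dependency-free sketch of the segmentation step, assuming a classic explicit RGB skin-color rule in place of the patent's combined RGB/HSV clustering (the thresholds are the widely used heuristic values from the literature, not values from the patent):

```python
def is_skin(r, g, b):
    """Classic explicit RGB skin-color heuristic (illustrative
    thresholds; the patent itself combines RGB and HSV clustering)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(image):
    """image: 2-D list of (r, g, b) pixels -> binary mask marking the
    candidate skin-color (hand) region for later contour extraction."""
    return [[1 if is_skin(*px) else 0 for px in row] for row in image]

# Tiny synthetic frame: right column is skin-toned, left is background
frame = [[(20, 20, 20), (200, 120, 90)],
         [(30, 60, 200), (180, 110, 80)]]
```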
S35: and matching the gesture information with a plurality of preset control instructions in the image feature library.
In this embodiment, the gesture information may be matched against the preset gesture information in the image feature library. Specifically, the similarity between the acquired gesture information and each stored preset gesture information is compared in turn, and when the similarity exceeds a set threshold (for example, 80%), the match can be considered successful.
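The sequential similarity comparison can be sketched as follows; the cosine similarity measure and the feature vectors are illustrative assumptions, since the patent does not fix a similarity metric:

```python
import math

def cosine_similarity(a, b):
    """Illustrative similarity measure between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_gesture(features, feature_library, threshold=0.8):
    """Compare the extracted gesture features against each stored
    preset in turn; return the associated control instruction once the
    similarity exceeds the threshold (0.8, per the 80% example)."""
    for instruction, preset in feature_library.items():
        if cosine_similarity(features, preset) > threshold:
            return instruction
    return None  # no match: no control instruction is issued

# Hypothetical feature library (vectors and instruction names made up)
library = {"increase_intensity": [1.0, 0.0, 1.0],  # "scissors"
           "switch_mode":        [0.0, 1.0, 0.0]}  # "cloth"
```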
S36: and if the matching is successful, determining a control instruction corresponding to the gesture information.
Specifically, step S36 may be a step as shown in fig. 4:
s361: and dividing an acquisition interface of the image sensor into a plurality of control areas.
As shown in fig. 5, the acquisition interface of the image sensor is divided into four areas: a first control area, a second control area, a third control area and a fourth control area, which may respectively correspond to the four quadrants of a coordinate system.
S362: and determining a corresponding control instruction according to different control areas where the central points of the gesture images are located and the matching relation between the gesture information and the preset control instruction.
Continuing with fig. 5: gesture A in fig. 5 may be recognized as a "scissors" gesture, and it is located in the second control area, corresponding to the second quadrant. The control instruction for the gesture is therefore generated jointly by "scissors" and the second control area: for example, the "scissors" gesture may represent increasing the massage intensity, while its location in the second control area may mean increasing the intensity by two gears, and together these determine the control instruction. By analogy, if gesture A is also a "scissors" gesture but located in the third control area, it may mean increasing the intensity by three gears. The above is merely an example, and the mapping can be set according to the actual functions of the neck massager.
The position of a gesture may be determined using a fingertip of the gesture as the reference. For example, gesture A in fig. 5 lies entirely within the second control area, so its position is recognized directly. For gesture B in fig. 5, the body of the hand lies in the fourth control area while the fingers and fingertips lie in the first control area; since the position of the control area is determined based on the fingertip, gesture B is regarded as being in the first control area, and the control instruction is determined by combining the gesture with that control area. Optionally, the position of a gesture relative to the control areas may instead be determined based on the center point of the gesture image; this is not particularly limited.
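The quadrant-style division of the capture interface and the fingertip-based position test can be sketched as below. The image size, the area numbering, and the function name `control_area` are illustrative assumptions; image y coordinates grow downward, so the top half of the frame corresponds to the "upper" quadrants.

```python
# A minimal sketch (layout assumed) of mapping a fingertip position on the
# capture interface to one of four control areas arranged like the
# quadrants of a coordinate system centred on the image.

def control_area(x, y, width, height):
    """Return 1-4 for the control area containing the point (x, y)."""
    right = x >= width / 2
    upper = y < height / 2
    if upper and right:
        return 1  # first control area  (quadrant I)
    if upper:
        return 2  # second control area (quadrant II)
    if right:
        return 4  # fourth control area (quadrant IV)
    return 3      # third control area  (quadrant III)

# A fingertip in the top-left of a 640x480 frame falls in the second area,
# as for gesture A in fig. 5.
print(control_area(100, 50, 640, 480))  # prints 2
```

For a gesture like B in fig. 5, the fingertip coordinates, not the centre of the hand, would be passed to this function.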
S37: and realizing corresponding function control based on the control instruction.
Optionally, after the control instruction has been used to control a function of the neck massager, the control result, such as "the massage intensity is adjusted to the first gear" or "the massage mode is switched to the automatic mode", may be announced through the speaker of the neck massager.
By this method, the actual position of the gesture image is used together with the matching relationship between gestures and control instructions, realizing non-contact dual control over the functions of the neck massager; the correspondence between gestures and control instructions is strengthened, and user experience is improved.
Referring to fig. 6, fig. 6 is a schematic flow chart of a third embodiment of the control method of a neck massager provided by the present application, the method including:
S61: And acquiring a gesture image of the user by using the image sensor.
S62: and acquiring the actual distance between the hand and the neck massager in the gesture image.
The actual distance can be measured with an infrared sensor, exploiting the fact that infrared transmission takes time. An infrared transceiver diode or an integrated infrared chip inside the infrared sensor emits an infrared signal of a certain frequency toward the target hand and records the time until the signal reflected by the hand is received; from the propagation speed of infrared light, the distance between the two can then be calculated.
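The time-of-flight calculation described above reduces to distance = propagation speed × round-trip time / 2, since the recorded time covers the path to the hand and back. A minimal sketch, with an assumed illustrative round-trip time:

```python
# Time-of-flight distance as described above: the infrared signal travels
# to the hand and back, so distance = propagation speed * time / 2. The
# round-trip time below is an assumed illustrative value.

SPEED_OF_LIGHT = 299_792_458.0  # m/s; infrared propagates at light speed

def tof_distance(round_trip_seconds):
    """Distance to the hand from the measured round-trip time of the
    reflected infrared signal."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A round trip of about 3.34 nanoseconds corresponds to roughly 0.5 m.
print(round(tof_distance(3.34e-9), 2))  # prints 0.5
```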
S63: and determining a corresponding grade relation according to the actual distance.
In some embodiments, step S63 may be the step shown in fig. 7:
S631: And judging whether the actual distance is greater than a first preset threshold and smaller than a second preset threshold.
The first preset threshold may be 0 m to 0.2 m (0 m in this embodiment), and the second preset threshold may be 0.4 m to 0.6 m (0.5 m in this embodiment); these distances are merely examples and do not represent actual set values of the apparatus.
When the measured actual distance is between 0 m and 0.5 m (0.5 m may be excluded), step S632 is performed; otherwise, step S633 is performed.
S632: it is determined that the actual distance falls within the first rank range.
S633: and judging whether the actual distance is greater than a second preset threshold and smaller than a third preset threshold.
The third preset threshold may be 0.9m to 1.1m, and in this embodiment, is 1.0 m. When the measured actual distance is between 0.5m and 1.0m (1.0 m may not be included), step S634 is performed, otherwise step S635 is performed.
S634: it is determined that the actual distance falls within the second hierarchical range.
S635: it is determined that the actual distance falls within the third level range.
The specific level relationships are summarized in the following table:

Actual distance      Level range
0-0.5 m              First level range
0.5-1.0 m            Second level range
Greater than 1.0 m   Third level range
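The threshold comparisons of steps S631 to S635 amount to a simple range lookup; a sketch using this embodiment's example thresholds (0 m, 0.5 m, 1.0 m), which are examples from the text rather than fixed device values:

```python
# The threshold comparison of steps S631-S635 as a simple range lookup,
# using this embodiment's example thresholds (0 m, 0.5 m, 1.0 m).

def level_range(distance, t1=0.0, t2=0.5, t3=1.0):
    """Map the measured hand-to-massager distance to a level range."""
    if t1 < distance < t2:
        return 1  # first level range
    if t2 <= distance < t3:
        return 2  # second level range
    return 3      # third level range

print(level_range(0.3), level_range(0.7), level_range(1.5))  # prints 1 2 3
```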
S64: the gesture image is recognized based on the hierarchical relationship to determine a control instruction associated with the gesture image.
Referring to fig. 8, fig. 8 is a schematic image of the relationship between level ranges and gesture images provided in the present application. Gesture C is a "scissors" gesture located in the first level range, that is, 0 m to 0.5 m from the neck massager. The control instruction corresponding to gesture C is therefore determined jointly by the "scissors" gesture and the first level range: for example, the "scissors" gesture may represent changing the massage mode, while its location in the first level range may represent the automatic mode, so that the control instruction is determined. Similarly, when gesture C is also a "scissors" gesture but is located in the second level range, it may represent changing to the relaxing mode, and in the third level range, changing to the activating mode. These distances are merely examples; the mapping can be set according to the actual functions of the neck massager.
Alternatively, when the same gesture is in different level ranges, it may also represent entirely different control instructions, rather than merely different working states under the same control instruction.
In some embodiments, the capture interface of the image sensor may also be divided into four areas as in the above embodiments. Referring to fig. 8, for example, gesture C is a "scissors" gesture located in the second control area within the first level range; the control instruction corresponding to gesture C is then generated jointly by all three factors. In addition, within the same level range, two identical gestures that differ only in control area may also correspond to different control instructions. For another example, in fig. 8 gesture D and gesture C are both "scissors" gestures, but their level ranges and control areas differ, so the control instructions they determine also differ; the specific correspondence is not particularly limited.
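The joint determination by gesture, level range and control area amounts to a three-key lookup. A hedged sketch follows, in which the table entries are invented examples only, not the patent's actual mapping:

```python
# A hedged sketch of the joint determination described above: the control
# instruction is selected by gesture, level range and control area
# together. The table entries are invented examples.

COMMANDS = {
    # (gesture, level range, control area) -> instruction
    ("scissors", 1, 2): "switch to automatic mode",
    ("scissors", 2, 2): "switch to relaxing mode",
    ("scissors", 2, 3): "switch to activating mode",
}

def lookup_instruction(gesture, level, area):
    """Return the instruction for the three-factor key, or None."""
    return COMMANDS.get((gesture, level, area))

print(lookup_instruction("scissors", 1, 2))  # prints "switch to automatic mode"
```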
It will be appreciated that, since fig. 8 is captured by an image sensor, the images presented in fig. 8 would in fact grow with distance within the cone-shaped field of view; all images are drawn at the same size here for ease of understanding.
S65: and realizing corresponding function control based on the control instruction.
By this method, the actual distance between the hand and the neck massager in the gesture image and the specific position of the gesture image are used together with the matching relationship between gestures and control instructions, realizing non-contact multiple control over the functions of the neck massager; the correspondence between gestures and control instructions is strengthened, and usage efficiency is improved.
Referring to fig. 9, fig. 9 is a schematic flow chart of a fourth embodiment of a control method of a neck massager provided by the present application, the method including:
S91: And acquiring a gesture image of the user by using the image sensor.
Wherein the image sensor includes a first camera assembly and a second camera assembly.
S92: the first camera assembly acquires a first gesture image, and the second camera assembly acquires a second gesture image.
The first gesture image and the second gesture image are acquired at the same time and differ only in viewing angle.
S93: identifying the first gesture image to acquire a first feature point in the first gesture image; and identifying the second gesture image to acquire a second feature point in the second gesture image.
The first feature point and the second feature point are the same point, for example, a center point of the gesture image.
S94: and acquiring a first coordinate position from the first gesture image based on the first characteristic point, and acquiring a second coordinate position from the second gesture image based on the second characteristic point.
In some embodiments, the first coordinate position of the gesture in the first gesture image and the second coordinate position of the gesture in the second gesture image both refer to positions in an image coordinate system; from them, the position coordinates of the gesture relative to the world coordinate system can be obtained by calculation.
S95: and calculating the actual distance between the hand and the neck massager in the gesture image according to the first coordinate position and the second coordinate position.
Referring to fig. 10, fig. 10 is a schematic view of the binocular distance measurement principle, where P is the object to be measured (the hand in the gesture image in this embodiment), OL and OR are the optical centers of the first camera assembly and the second camera assembly respectively, L' and R' are the imaging points of the gesture P on the photoreceptors of the two camera assemblies, f is the focal length of the two camera assemblies, B is the center distance between the two camera assemblies, and Z is the actual distance to be calculated between the hand in the gesture image and the neck massager.
Specifically, the following formula may be employed for calculation:
Z = (f × B) / (XL - XR)
where XR is the abscissa of the imaging point R' and XL is the abscissa of the imaging point L', which can be obtained directly from the second coordinate position and the first coordinate position respectively; the other parameters are known basic parameters of the camera assemblies, so the actual distance Z can be calculated according to the principle of similar triangles.
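The similar-triangles relation Z = f × B / (XL - XR) can be evaluated directly once the two abscissas are known. The focal length (expressed in pixels) and the 6 cm baseline below are illustrative assumptions:

```python
# Direct evaluation of the similar-triangles relation Z = f * B / (XL - XR)
# from the binocular ranging principle above. The focal length (in pixels)
# and the 0.06 m baseline are illustrative assumptions.

def binocular_distance(x_left, x_right, focal_px, baseline_m):
    """Depth Z from the abscissas of the two imaging points; x_left and
    x_right come from the first and second coordinate positions."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity

# 700 px focal length, 0.06 m baseline, 60 px disparity gives Z = 0.7 m.
print(binocular_distance(380, 320, focal_px=700, baseline_m=0.06))
```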
S96: and determining a corresponding grade relation according to the actual distance.
S97: the gesture image is recognized based on the hierarchical relationship to determine a control instruction associated with the gesture image.
S98: and realizing corresponding function control based on the control instruction.
S96-S98 are the same as the steps in the third embodiment, and are not described here.
Referring to fig. 11, fig. 11 is a schematic flow chart of a fifth embodiment of a control method of a neck massager provided in the present application, the method including:
S101: A plurality of successive gesture image frames of a user are acquired with an image sensor.
S102: coordinate information of the hand in a plurality of continuous gesture image frames is obtained so as to obtain a plurality of continuous target coordinate information.
For example, a coordinate system may be established over the shooting area of the image sensor. As the user performs motion interaction with a gesture, the fingertip of a finger is first identified in each image frame and its coordinate information in that frame is then acquired, so that a plurality of continuous target coordinate information is obtained.
S103: and sequentially connecting a plurality of continuous target coordinate information to form a target track image.
S104: the target track image is identified to determine control instructions associated with the target track image.
In some embodiments, before performing this step, it may be determined whether the target track image is a straight line. It is understood that the trajectory of a hand motion is never an absolutely straight line; therefore, as long as the target track image is substantially straight, it may be regarded as a straight line. To this end, a linear operation may be performed on the plurality of continuous target coordinate information: two pieces of target coordinate information are selected, the expression of the straight line through them is computed from their abscissas and ordinates in the coordinate system, and its slope k1 is obtained. Similarly, the slopes k2, k3, and so on of the straight lines between the remaining pairs of target coordinates in the target track image are calculated. The mean and variance of the obtained slopes are then computed; if the variance is smaller than a preset threshold (for example, smaller than 1), the target track image formed by these target coordinates may be determined to be a straight line.
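The slope-variance test above can be sketched as follows. For simplicity, this sketch computes slopes between consecutive fingertip coordinates rather than between arbitrary pairs as the text allows; the variance threshold of 1 follows the example:

```python
# Sketch of the slope-variance linearity test described above, using
# consecutive point pairs for simplicity.

from statistics import pvariance

def is_straight(points, max_variance=1.0):
    """points: list of (x, y) fingertip coordinates in capture order."""
    slopes = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        if x2 == x1:  # skip vertical segments to avoid division by zero
            continue
        slopes.append((y2 - y1) / (x2 - x1))
    return len(slopes) >= 2 and pvariance(slopes) < max_variance

track = [(0, 0), (1, 1.1), (2, 1.9), (3, 3.0)]
print(is_straight(track))  # prints True
```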
Further, after the target track image is determined to be a straight line, the direction of the track can be identified from the first and the last target coordinate information among the plurality of continuous target coordinate information. Based on the coordinate system, the specific direction of the track is obtained. For example, the angles relative to the positive direction of the X axis may be divided into sectors, with 0-45 degrees corresponding to one command and 45-90 degrees to another; the obtained direction is then matched against these sectors to determine the corresponding control instruction.
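The angle-sector matching for a straight track can be sketched as below, assuming a conventional mathematical coordinate system and 45-degree sectors as in the example; the mapping from sectors to commands is left abstract:

```python
# Sketch of matching a straight track's direction against 45-degree
# sectors of the X axis, as in the example above.

import math

def direction_sector(first, last, sector_deg=45):
    """Return the index (0-7) of the angular sector containing the
    direction from the first to the last target coordinate."""
    dx, dy = last[0] - first[0], last[1] - first[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return int(angle // sector_deg)

# A stroke rising steeply to the right falls in the 45-90 degree sector.
print(direction_sector((0, 0), (1, 2)))  # prints 1
```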
In other embodiments, when the target trajectory image is not a straight line, step S104 may be a step as shown in fig. 12, specifically as follows:
S1041: And acquiring the first target coordinate information and the last target coordinate information in a plurality of continuous target coordinate information.
S1042: and determining the indication direction of the gesture image based on the first target coordinate information and the last target coordinate information.
S1043: and identifying the indication direction to determine a control command corresponding to the indication direction.
Based on the coordinate system, the specific indication direction of the straight line formed by the starting point (the first target coordinate) and the end point (the last target coordinate) of the non-linear target track image can be obtained. The shooting area of the image sensor may then be divided into four areas according to the coordinate quadrants, and the straight line connecting the starting point and the end point is extended until it intersects the boundary of the four areas.
It can be understood that the extension of any straight line intersects the boundary of the shooting area; therefore, the control instruction can be identified and confirmed according to the position where the extension of the straight line corresponding to the target track image intersects the area boundary. For example, if the intersection point lies in the area corresponding to the first quadrant, the corresponding control instruction is generated accordingly. In this way, the meaning of the gesture interaction can be accurately identified and the associated control instruction determined.
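Extending the start-to-end line to the boundary and reading off the quadrant-style region of the intersection can be sketched as follows; the frame layout and region numbering are assumptions, with image y growing downward:

```python
# Sketch of the non-linear case: extend the line from the first to the
# last target coordinate until it leaves the shooting area, then report
# which quadrant-style region contains the intersection point.

def exit_region(first, last, width, height):
    dx, dy = last[0] - first[0], last[1] - first[1]
    # Largest t keeping first + t*(dx, dy) inside [0, width] x [0, height].
    ts = []
    if dx > 0:
        ts.append((width - first[0]) / dx)
    elif dx < 0:
        ts.append(-first[0] / dx)
    if dy > 0:
        ts.append((height - first[1]) / dy)
    elif dy < 0:
        ts.append(-first[1] / dy)
    if not ts:
        raise ValueError("start and end points coincide")
    t = min(ts)
    x, y = first[0] + t * dx, first[1] + t * dy
    right, upper = x >= width / 2, y < height / 2
    return 1 if (right and upper) else 2 if upper else 4 if right else 3

# A stroke heading toward the top-right exits through the first region.
print(exit_region((100, 300), (200, 250), 640, 480))  # prints 1
```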
S105: and realizing corresponding function control based on the control instruction.
Different from the prior art, the indication direction of the gesture's target track image can be used to realize non-contact multiple control over the functions of the neck massager; the correspondence between gesture tracks and control instructions is strengthened, and usage efficiency is improved.
Referring to fig. 13, fig. 13 is a schematic structural view of another embodiment of the neck massager 20 provided by the present application, which includes a massager body 201, a massage assembly 202, a communication circuit 203, an image sensor 204, and a control circuit 205. Wherein, the massage component 202 is arranged on the massage apparatus body 201; the communication circuit 203 is arranged on the massage apparatus body 201; the image sensor 204 is arranged on the massage apparatus body; the control circuit 205 is disposed on the massage device body 201, electrically coupled to the massage assembly 202, the communication circuit 203, and the image sensor 204, and configured to control the massage assembly 202, the communication circuit 203, and the image sensor 204 to implement the following steps:
acquiring a gesture image of a user by using an image sensor, wherein the image sensor is arranged on the neck massager; recognizing the gesture image to determine a control instruction associated with the gesture image; and realizing corresponding function control based on the control instruction.
It can be understood that the neck massager 20 in this embodiment may implement the method steps of any of the above embodiments, and the specific implementation steps thereof may refer to the above embodiments, which are not described herein again.
Referring to fig. 14, fig. 14 is a schematic structural diagram of an embodiment of a computer-readable storage medium 30 provided in the present application, where the computer-readable storage medium is used for storing program data 31, and the program data 31 is used for implementing the following method steps when being executed by a control circuit:
acquiring a gesture image of a user by using an image sensor, wherein the image sensor is arranged on the neck massager; recognizing the gesture image to determine a control instruction associated with the gesture image; and realizing corresponding function control based on the control instruction.
It can be understood that the computer-readable storage medium 30 in this embodiment can be applied to a neck massager to implement the method steps of any of the above embodiments; for the specific implementation steps, reference may be made to the above embodiments, which are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units in the other embodiments described above may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (15)

1. A control method of a neck massager is characterized by comprising the following steps:
acquiring a gesture image of a user by using an image sensor, wherein the image sensor is arranged on the neck massager;
recognizing the gesture image to determine a control instruction associated with the gesture image;
and realizing corresponding function control based on the control instruction.
2. The method of claim 1,
before the step of acquiring the gesture image of the user by using the image sensor, the method further comprises the following steps:
after the neck massager is started, detecting whether a moving target exists in a preset range by using an infrared sensor; wherein, the infrared sensor is arranged on the neck massager;
and if so, activating the image sensor.
3. The method of claim 1,
the step of recognizing the gesture image to determine the control instruction associated with the gesture image comprises:
acquiring an actual distance between a hand and the neck massager in the gesture image;
determining a corresponding grade relation according to the actual distance;
the gesture image is recognized based on the hierarchical relationship to determine a control instruction associated with the gesture image.
4. The method of claim 3,
the image sensor comprises a first camera assembly and a second camera assembly;
the step of obtaining the actual distance between the hand and the neck massager in the gesture image comprises the following steps:
the first camera assembly acquires a first gesture image, and the second camera assembly acquires a second gesture image; the first gesture image and the second gesture image are acquired at the same time;
identifying the first gesture image to acquire a first feature point in the first gesture image, and identifying the second gesture image to acquire a second feature point in the second gesture image; wherein the first feature point and the second feature point are the same point;
acquiring a first coordinate position from the first gesture image based on the first feature point, and acquiring a second coordinate position from the second gesture image based on the second feature point;
and calculating the actual distance between the hand and the neck massager in the gesture image according to the first coordinate position and the second coordinate position.
5. The method of claim 4, wherein:
the step of calculating the actual distance between the hand and the neck massager in the gesture image according to the first coordinate position and the second coordinate position comprises the following steps:
the following formula is used for calculation:
Z = (f × B) / (XL - XR)
wherein XR is the abscissa in the second coordinate position, XL is the abscissa in the first coordinate position, B is the center distance between the first camera assembly and the second camera assembly, f is the focal length of the first camera assembly and the second camera assembly, and Z is the actual distance between the hand and the neck massager in the gesture image.
6. The method of claim 3,
the step of determining the corresponding grade relation according to the actual distance comprises the following steps:
judging whether the actual distance is greater than a first preset threshold and smaller than a second preset threshold;
if yes, determining that the actual distance belongs to a first level range;
otherwise, continuously judging whether the actual distance is greater than a second preset threshold and smaller than a third preset threshold.
7. The method of claim 6,
the step of determining the corresponding grade relation according to the actual distance further comprises:
judging whether the actual distance is greater than a second preset threshold and smaller than a third preset threshold;
if yes, determining that the actual distance belongs to a second level range;
otherwise, determining that the actual distance belongs to the third level range.
8. The method of claim 1,
before the step of recognizing the gesture image to determine the control instruction associated with the gesture image, the method further comprises:
performing one-to-one corresponding association setting on a plurality of preset gesture information and a plurality of preset control instructions;
and storing the incidence relation between the preset gesture information and the preset control instructions to an image feature library, so as to facilitate recognition and matching.
9. The method of claim 8, wherein
The step of recognizing the gesture image to determine the control instruction associated with the gesture image comprises:
recognizing the gesture image to acquire gesture information in the image;
matching the gesture information with the plurality of preset control instructions in the image feature library;
and if the matching is successful, determining a control instruction corresponding to the gesture information.
10. The method of claim 9,
the step of recognizing the gesture image to acquire gesture information in the image includes:
separating hand image areas in the gesture image by adopting a skin color-based segmentation method;
and extracting the outline corresponding to the hand image area to acquire gesture information.
11. The method of claim 9,
the step of determining the control instruction corresponding to the gesture information includes:
dividing an acquisition interface of the image sensor into a plurality of control areas;
and determining a corresponding control instruction according to the different control areas where the central points of the gesture images are located and the matching relation between the gesture information and a preset control instruction.
12. The method of claim 1,
the method further comprises the following steps:
acquiring a plurality of continuous gesture image frames of a user by using an image sensor;
acquiring coordinate information of a hand in a plurality of continuous gesture image frames to obtain a plurality of continuous target coordinate information;
sequentially connecting a plurality of continuous target coordinate information to form a target track image;
the step of recognizing the gesture image to determine the control instruction associated with the gesture image comprises:
and identifying the target track image to determine a control instruction associated with the target track image.
13. The method of claim 12,
the step of recognizing the target track image to determine the control instruction associated with the target track image includes:
acquiring first target coordinate information and last target coordinate information in the plurality of continuous target coordinate information;
determining an indication direction of the gesture image based on the first target coordinate information and the last target coordinate information;
and identifying the indication direction to determine a control instruction corresponding to the indication direction.
14. A neck massager, characterized in that the neck massager comprises:
a massage apparatus body;
the massage component is arranged on the massage instrument body;
the communication circuit is arranged on the massage instrument body;
the image sensor is arranged on the massager body;
a control circuit disposed on the massager body and electrically coupled to the massage assembly, the communication circuit and the image sensor, for controlling the massage assembly, the communication circuit and the image sensor to implement the method of any one of claims 1 to 13.
15. A computer-readable storage medium for storing program data for implementing the method of any one of claims 1-13 when executed by control circuitry.
CN202010442466.2A 2020-05-22 2020-05-22 Control method of neck massager, neck massager and storage medium Active CN112089589B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010442466.2A CN112089589B (en) 2020-05-22 2020-05-22 Control method of neck massager, neck massager and storage medium


Publications (2)

Publication Number Publication Date
CN112089589A true CN112089589A (en) 2020-12-18
CN112089589B CN112089589B (en) 2023-04-07


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112691002A (en) * 2021-03-24 2021-04-23 上海傅利叶智能科技有限公司 Control method and device based on gesture interaction rehabilitation robot and rehabilitation robot
CN115413912A (en) * 2022-09-20 2022-12-02 帝豪家居科技集团有限公司 Control method, device and system for graphene health-care mattress
WO2024187354A1 (en) * 2023-03-13 2024-09-19 广州奥科维电子有限公司 Control method and apparatus for multi-part comprehensive physiotherapy instrument, and device and medium

Citations (19)

* Cited by examiner, † Cited by third party
Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010042134A (en) * 2008-08-12 2010-02-25 Kyushu Hitachi Maxell Ltd Massage machine
CN201773404U (en) * 2008-11-10 2011-03-23 OSIM International Ltd. Massaging device
CN101843539A (en) * 2009-03-27 2010-09-29 Wen Qifeng Device and method for vision correction
US20130278493A1 (en) * 2012-04-24 2013-10-24 Shou-Te Wei Gesture control method and gesture control device
CN103529947A (en) * 2013-10-31 2014-01-22 BOE Technology Group Co., Ltd. Display device, control method thereof, and gesture recognition method
CN103839386A (en) * 2014-03-24 2014-06-04 Zhongke Runcheng (Beijing) IoT Technology Co., Ltd. Wearable device for protecting teenagers' eyesight
JP2016091192A (en) * 2014-10-31 2016-05-23 Pioneer Corporation Virtual image display apparatus, control method, program, and storage medium
CN104442571A (en) * 2014-11-26 2015-03-25 Chongqing Changan Automobile Co., Ltd. Night vision navigation integration system and control method
CN104714642A (en) * 2015-03-02 2015-06-17 Huizhou TCL Mobile Communication Co., Ltd. Mobile terminal and gesture recognition processing method and system thereof
CN106293076A (en) * 2016-07-29 2017-01-04 Beijing Qihoo Technology Co., Ltd. Gesture recognition method and device for communication terminal and intelligent terminal
WO2018224847A2 (en) * 2017-06-09 2018-12-13 Delamont Dean Lindsay Mixed reality gaming system
CN107598924A (en) * 2017-09-07 2018-01-19 Nanjing Yusheng Robot Technology Co., Ltd. Robot gesture recognition control method
CN108852620A (en) * 2018-01-19 2018-11-23 Guo Lei Intelligent neck-worn device and control method thereof
CN209108415U (en) * 2018-03-29 2019-07-16 Guangdong SKG Intelligent Technology Co., Ltd. Neck massager
CN109088924A (en) * 2018-07-31 2018-12-25 Xi'an Irain IoT Technology Service Co., Ltd. Service information pushing method, related apparatus, and storage medium
KR20200051166A (en) * 2018-11-05 2020-05-13 Hancom Inc. Electronic terminal device capable of executing gesture-recognition-based control commands and operating method thereof
CN209417675U (en) * 2019-02-01 2019-09-20 Ogawa Smart Health Technology Group Co., Ltd. Massage chair gesture control device
CN210488317U (en) * 2019-06-28 2020-05-08 Candela (Shenzhen) Technology Innovation Co., Ltd. Indoor delivery robot
US20200035237A1 (en) * 2019-07-09 2020-01-30 LG Electronics Inc. Communication robot and method for operating the same

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Lan Jia: "Research on workpiece recognition and positioning system based on stereo vision", Machine Building & Automation *
Xie Yongchao: "Research on binocular ranging system based on embedded Linux", Microcomputer & Its Applications *
Guo Peng: "Research on gesture recognition based on depth images", Foreign Electronic Measurement Technology *
Chen Wenxuan: "Research and application of binocular ranging technology in shunting operations", Journal of Railway Science and Engineering *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112691002A (en) * 2021-03-24 2021-04-23 Shanghai Fourier Intelligence Co., Ltd. Control method and device for a rehabilitation robot based on gesture interaction, and rehabilitation robot
CN112691002B (en) * 2021-03-24 2021-06-29 Shanghai Fourier Intelligence Co., Ltd. Control device for a rehabilitation robot based on gesture interaction, and rehabilitation robot
CN115413912A (en) * 2022-09-20 2022-12-02 Dihao Home Technology Group Co., Ltd. Control method, device and system for graphene health-care mattress
WO2024187354A1 (en) * 2023-03-13 2024-09-19 广州奥科维电子有限公司 Control method and apparatus for multi-part comprehensive physiotherapy instrument, and device and medium

Also Published As

Publication number Publication date
CN112089589B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN112089589B (en) Control method of neck massager, neck massager and storage medium
CN108520241B (en) Fingerprint acquisition method and device based on optical fingerprint technology and user terminal
WO2020108225A1 (en) Fingerprint acquisition method and related apparatus
EP3608755B1 (en) Electronic apparatus operated by head movement and operation method thereof
CN103106401B (en) Mobile terminal iris recognition device with human-computer interaction mechanism
TWI362005B (en)
US20140275948A1 (en) Information terminal device
US20160048993A1 (en) Image processing device, image processing method, and program
US20160370859A1 (en) Pupil detection device
JP3673834B2 (en) Gaze input communication method using eye movement
CN104133548A (en) Method and device for determining viewpoint area and controlling screen luminance
KR20130004357A (en) A computing device interface
TW201101197A (en) Method and system for gesture recognition
Bang et al. New computer interface combining gaze tracking and brainwave measurements
KR20120060978A (en) Method and Apparatus for 3D Human-Computer Interaction based on Eye Tracking
Ghani et al. GazePointer: A real time mouse pointer control implementation based on eye gaze tracking
Utaminingrum et al. Eye movement and blink detection for selecting menu on-screen display using probability analysis based on facial landmark
Tuisku et al. Pointing and selecting with facial activity
CN113160260B (en) Head-eye double-channel intelligent man-machine interaction system and operation method
WO2020073169A1 (en) Biometric identification method and apparatus, and electronic device
CN108921815A (en) Photographing interaction method, device, storage medium, and terminal device
Heo et al. Object recognition and selection method by gaze tracking and SURF algorithm
CN202235300U (en) Eye movement monitoring equipment
Khan et al. A new 3D eyeball tracking system to enhance the usability of page scrolling
CN103995587B (en) Information control method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 23AF, Building 3, Zhongke Kefa Park, 009 Gaoxin South 1st Road, High-tech Zone Community, Yuehai Street, Nanshan District, Shenzhen, Guangdong 518000

Patentee after: Future wear Health Technology Co.,Ltd.

Country or region after: China

Address before: 23AF, Building 3, Zhongke Kefa Park, 009 Gaoxin South 1st Road, High-tech Zone Community, Yuehai Street, Nanshan District, Shenzhen, Guangdong 518000

Patentee before: Future wearable technology Co.,Ltd.

Country or region before: China
