Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of a neck massager 10 provided by the present application, which includes an elastic arm 11, a massage component 12, a sensing component 13, a first handle 14, a second handle 15, and a speaker 16.
The first handle 14 and the second handle 15 are fixedly connected to two sides of the elastic arm 11, the massage component 12 is disposed on one side of the elastic arm 11 facing the neck of the human body, and the massage component 12 can emit electric pulses. The sensing assembly 13 is disposed outside of the first handle 14 or the second handle 15. The speaker 16 is disposed outside the first handle 14 or the second handle 15 for playing audio data.
Alternatively, the electrode pads of the massage component 12 are not limited to the protruding mushroom structure; they may be flush with, or slightly protrude from, the surface of the side of the elastic arm 11 facing the neck of the human body. The electrode pads may also be made of conductive silica gel.
Optionally, the sensing assembly 13 includes a first sensing assembly 131 and a second sensing assembly 132, and the first sensing assembly 131 and the second sensing assembly 132 are connected.
Further, the first sensing assembly 131 may be an image sensor, specifically a depth camera, and the second sensing assembly 132 may be an infrared sensor.
Further, first sensing assembly 131 may further include a first camera assembly 1311 and a second camera assembly 1312, wherein first camera assembly 1311 may be disposed outside first handle 14, second camera assembly 1312 may be disposed outside second handle 15, and first camera assembly 1311 and second camera assembly 1312 are symmetrically disposed.
Alternatively, the first and second handles 14, 15 may be separate pieces or may be part of an integrally formed neck massager.
Optionally, the neck massager 10 is further provided with a communication module for communicating with other terminals.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of a control method of a neck massager provided by the present application, the method including:
S21: Acquire a gesture image of the user by using the image sensor.
Optionally, before acquiring the gesture image of the user, the infrared sensor may further be used to detect a moving object within a preset range, so as to determine whether to acquire the gesture image. Specifically, because of its relatively high power consumption, the image sensor is normally in a shutdown or dormant state and does not acquire gesture images, while the infrared sensor, whose power consumption is relatively low, remains in a working state for a long time; when the infrared sensor detects a moving target, it sends a signal to the neck massager to activate the image sensor to acquire the user's gesture. In other embodiments, the user may also activate the image sensor by manually touching a button disposed on the neck massager. Both the image sensor and the infrared sensor are disposed on the neck massager.
Optionally, the neck massager may further comprise an illumination assembly for providing sufficient light to acquire the gesture image when the environment is dark.
Image sensors may generally be divided into two types: CCD (Charge-Coupled Device) image sensors and CMOS (Complementary Metal-Oxide-Semiconductor) image sensors. Since CMOS sensors are small and consume little power, the CMOS image sensor is mainly used in this embodiment to acquire the user's gesture image.
S22: the gesture image is recognized to determine a control instruction associated with the gesture image.
Before gesture images can be recognized, they need to be entered, so that different gesture images can be associated with different function controls. Optionally, this may be performed by a mobile terminal associated with the neck massager. For example, when the neck massager is used for the first time, the user may perform gesture entry for different functions: the mobile terminal is connected to the neck massager and displays a plurality of functions on its screen; after the user selects the function to be entered, an image acquisition interface is opened and the user makes a gesture, which the mobile terminal associates with the corresponding function and stores.
In this embodiment, after the image sensor acquires the gesture image of the user, the gesture image is sent to the mobile terminal associated with the neck massager, and the mobile terminal compares the gesture image with a preset gesture image, so as to determine a corresponding control instruction, and sends the control instruction to the neck massager. Of course, this process can be performed on the neck massager, and is not particularly limited.
S23: Implement corresponding function control based on the control instruction.
In this embodiment, different gestures of the user may correspond to different control instructions, for example, a "scissors" gesture may represent an instruction to increase the massage intensity, a "cloth" gesture may represent an instruction to change the massage mode, and a "fist" gesture may represent an instruction to turn on or turn off, where no specific limitation is imposed, and the user may set the control instructions by himself or herself as needed.
Different from the prior art, the control method of the neck massager provided by the application acquires a gesture image of the user through the image sensor and recognizes the gesture image to determine the control instruction associated with it, so that the corresponding function control is implemented based on that instruction. In this way, the relevant functions of the neck massager can be controlled using different gesture images, overcoming the structural and operational limitations of the neck massager and improving the user experience.
Referring to fig. 3, fig. 3 is a schematic flow chart of a second embodiment of the control method of the neck massager provided by the present application, the method including:
S31: Acquire a gesture image of the user by using the image sensor.
S32: Associate a plurality of pieces of preset gesture information with a plurality of preset control instructions in one-to-one correspondence.
In this embodiment, the association setting may be implemented by a mobile terminal associated with the neck massager: the gesture image of the user is acquired by the image sensor of the neck massager and sent to the mobile terminal for the corresponding function setting. For example, the setting may be performed through an application program (APP): for each preset control instruction, the user first selects a control function in the application, such as shutdown or adjustment of massage intensity; the mobile terminal then sends a signal to the neck massager, which starts the image sensor and performs gesture acquisition, thereby completing the association of gesture information with control instructions. Alternatively, the whole process may be completed on the mobile terminal; however, since subsequent recognition is also performed with the image sensor of the neck massager, performing entry with the same sensor improves recognition accuracy, so the setting is preferably completed by the neck massager and the mobile terminal together.
S33: Store the association relationships between the plurality of pieces of preset gesture information and the plurality of preset control instructions in an image feature library, so as to facilitate recognition and matching.
S34: Recognize the gesture image to acquire the gesture information in the image.
Specifically, this step can be implemented by the following steps:
1) Before the gesture image is recognized, the image needs to be preprocessed to remove the influence of noise, illumination, and the like, and to strengthen the useful information in the image.
2) Threshold segmentation in the RGB (red, green, blue) color space is performed on the gesture image; combined with the clustering property of skin color distribution in the HSV (hue, saturation, value) color space, the skin color region can then be extracted by an AND operation between the two results. This skin-color-based segmentation method separates skin color regions from the background image through the clustering characteristics of skin color in color space, achieving gesture region segmentation from skin color feature information, and is intuitive, efficient, and accurate.
3) Extract the contour corresponding to human skin color in the gesture image to obtain the gesture information; the gesture may be made with one hand or two hands, such as a two-handed heart gesture or one-handed scissors.
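As an illustration, the skin-color segmentation of step 2) might be sketched as follows. The function names and all numeric thresholds here are hypothetical examples rather than values used by the device; workable thresholds depend on the camera and lighting:

```python
import numpy as np

def rgb_to_hsv(rgb):
    """Vectorized RGB -> HSV for float arrays in [0, 1], shape (..., 3)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    v = rgb.max(axis=-1)
    c = v - rgb.min(axis=-1)                      # chroma
    s = np.where(v > 0, c / np.where(v > 0, v, 1.0), 0.0)
    safe_c = np.where(c > 0, c, 1.0)              # avoid division by zero
    h = np.zeros_like(v)
    h = np.where(v == r, ((g - b) / safe_c) % 6.0, h)
    h = np.where(v == g, (b - r) / safe_c + 2.0, h)
    h = np.where(v == b, (r - g) / safe_c + 4.0, h)
    h = np.where(c > 0, h / 6.0, 0.0)             # hue in [0, 1); gray -> 0
    return h, s, v

def skin_mask(rgb):
    """AND of a coarse RGB threshold rule with an HSV skin-tone range,
    mirroring the two-space combination described in step 2)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    rgb_rule = (r > 0.37) & (g > 0.16) & (b > 0.08) & (r > g) & (g > b)
    h, s, v = rgb_to_hsv(rgb)
    hsv_rule = (h < 0.14) & (s > 0.10) & (s < 0.80) & (v > 0.30)
    return rgb_rule & hsv_rule
```

The resulting boolean mask marks candidate skin pixels; contour extraction of step 3) would then run on this mask.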
S35: Match the gesture information against the plurality of preset control instructions in the image feature library.
In this embodiment, the gesture information may be matched with the preset gesture information in the image feature library. Specifically, the gesture information is compared in turn with each piece of stored preset gesture information, and when the similarity of a comparison exceeds a set threshold (for example, 80%), the match is considered successful.
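The sequential comparison against the feature library can be sketched as below. Cosine similarity over feature vectors is only one possible metric, chosen here for illustration; the 0.8 default mirrors the 80% example threshold:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_gesture(features, feature_library, threshold=0.8):
    """Compare extracted gesture features against each stored entry in turn;
    the first comparison whose similarity exceeds the threshold is a match.
    Returns the associated instruction, or None if nothing matches."""
    for stored_features, instruction in feature_library:
        if cosine_similarity(features, stored_features) > threshold:
            return instruction
    return None
```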
S36: If the matching is successful, determine the control instruction corresponding to the gesture information.
Specifically, step S36 may include the steps shown in fig. 4:
S361: Divide the acquisition interface of the image sensor into a plurality of control areas.
As shown in fig. 5, the acquisition interface of the image sensor is divided into four areas: a first control area, a second control area, a third control area, and a fourth control area, which may respectively correspond to the four quadrants of a coordinate system.
S362: Determine the corresponding control instruction according to the control area in which the center point of the gesture image is located, together with the matching relationship between the gesture information and the preset control instructions.
Continuing with fig. 5, gesture A may be recognized as a "scissors" gesture located in the second control area, which corresponds to the second quadrant; the control instruction is thus determined jointly by "scissors" and the second control area. For example, the "scissors" gesture may represent increasing the massage intensity, while its location in the second control area may indicate increasing the intensity by two gears, which together determine the control instruction. By analogy, if gesture A is also a "scissors" gesture but is located in the third control area, it may indicate increasing the intensity by three gears. The above is merely an example; the settings can be made according to the actual functions of the neck massager.
The position of a gesture may be determined by taking a fingertip of the gesture as the standard. For example, gesture A in fig. 5 lies entirely in the second control area, so its position is recognized easily; for gesture B in fig. 5, the body of the hand is in the fourth control area but the fingers and fingertips are in the first control area, in which case the control area is determined by the fingertip, so gesture B should be regarded as being in the first control area, and the control instruction is determined by combining the gesture with that area. Optionally, the position of a gesture relative to the control areas may instead be determined from the center point of the gesture image, which is not particularly limited.
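A minimal sketch of the quadrant-style area lookup of steps S361 and S362 follows. The region numbering, image-coordinate convention, and instruction table are illustrative assumptions, not values fixed by this embodiment:

```python
def control_region(x, y, width, height):
    """Map an image-coordinate point (origin top-left, y growing downward)
    to one of four control areas laid out like the quadrants of a coordinate
    system centered on the capture interface."""
    dx = x - width / 2.0
    dy = height / 2.0 - y          # flip y so "up" is positive
    if dx >= 0 and dy >= 0:
        return 1                   # first control area  (first quadrant)
    if dx < 0 and dy >= 0:
        return 2                   # second control area
    if dx < 0 and dy < 0:
        return 3                   # third control area
    return 4                       # fourth control area

def region_instruction(gesture, region, table):
    """Joint lookup per step S362: gesture shape plus control area."""
    return table.get((gesture, region))
```

A fingertip-based variant would simply pass the fingertip coordinates instead of the center point.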
S37: Implement corresponding function control based on the control instruction.
Alternatively, after the function control of the neck massager is performed according to the control instruction, the control result, such as "the massage intensity is adjusted to the first gear" or "the massage mode is switched to the automatic mode", may be played through the speaker of the neck massager.
By this method, the actual position of the gesture image can be used together with the matching relationship between gestures and control instructions to achieve non-contact dual control over the functions of the neck massager; the correspondence between gestures and control instructions is strengthened, and user experience is improved.
Referring to fig. 6, fig. 6 is a schematic flow chart of a third embodiment of a control method of a neck massager provided by the present application, the method including:
S61: Acquire a gesture image of the user by using the image sensor.
S62: Acquire the actual distance between the hand in the gesture image and the neck massager.
The measurement of the actual distance can be realized by the infrared sensor, using the fact that infrared transmission takes time: an infrared transceiver diode or integrated infrared chip inside the infrared sensor sends an infrared signal of a certain frequency toward the target hand and records the time at which the signal reflected by the hand is received; the distance between the two can then be calculated from the propagation speed of infrared light.
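This time-of-flight calculation reduces to one line: infrared propagates at essentially the speed of light, and the measured time covers the round trip out and back, so the one-way distance is half the path:

```python
def ir_distance(round_trip_seconds):
    """Distance from an infrared round-trip time. The signal travels to the
    hand and back, so divide the path by two; infrared propagates at roughly
    the speed of light, c = 3.0e8 m/s."""
    SPEED_OF_LIGHT_M_PER_S = 3.0e8
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0
```

For example, a round trip of about 4 nanoseconds corresponds to a hand roughly 0.6 m away.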
S63: Determine the corresponding level relationship according to the actual distance.
In some embodiments, step S63 may include the steps shown in fig. 7:
S631: Judge whether the actual distance is greater than the first preset threshold and smaller than the second preset threshold.
The first preset threshold may be 0 m to 0.2 m (0 m in this embodiment), and the second preset threshold may be 0.4 m to 0.6 m (0.5 m in this embodiment); these distances are merely examples and do not represent actual set values of the apparatus.
When the measured actual distance is between 0 m and 0.5 m (0.5 m excluded), step S632 is performed; otherwise, step S633 is performed.
S632: it is determined that the actual distance falls within the first rank range.
S633: Judge whether the actual distance is greater than the second preset threshold and smaller than the third preset threshold.
The third preset threshold may be 0.9 m to 1.1 m (1.0 m in this embodiment). When the measured actual distance is between 0.5 m and 1.0 m (1.0 m excluded), step S634 is performed; otherwise, step S635 is performed.
S634: it is determined that the actual distance falls within the second hierarchical range.
S635: it is determined that the actual distance falls within the third level range.
The specific level relationships are summarized in the following table:

Actual distance | Level relationship
0 m to 0.5 m | First level range
0.5 m to 1.0 m | Second level range
Greater than 1.0 m | Third level range
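The threshold logic of steps S631 to S635 and the table above can be condensed into a single function; the default threshold values are this embodiment's example values (0 m, 0.5 m, 1.0 m), with exclusive upper bounds as described:

```python
def distance_level(distance_m, first=0.0, second=0.5, third=1.0):
    """Map a measured hand-to-massager distance onto the level ranges in the
    table above. Defaults are this embodiment's example thresholds; upper
    bounds are exclusive, matching steps S631-S635."""
    if first <= distance_m < second:
        return 1   # first level range
    if second <= distance_m < third:
        return 2   # second level range
    return 3       # third level range
```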
S64: Recognize the gesture image based on the level relationship to determine the control instruction associated with the gesture image.
Referring to fig. 8, fig. 8 is a schematic diagram of level relationships and gesture images provided in the present application. Gesture C is a "scissors" gesture located in the first level range, i.e. 0 m to 0.5 m from the neck massager; the control instruction for gesture C is therefore determined jointly by "scissors" and the first level range. For example, the "scissors" gesture may represent changing the massage mode, while its location in the first level range may indicate the automatic mode, which together determine the control instruction. Similarly, when gesture C is a "scissors" gesture located in the second level range, it may indicate changing to the relaxing mode, and in the third level range, changing to the activating mode. These distances are only examples; the settings can be made according to the actual functions of the neck massager.
Alternatively, when the same gesture is in different level ranges, it may also represent entirely different control instructions, rather than only different working states under the same control instruction.
In some embodiments, the capture interface of the image sensor may also be divided into four areas as in the above embodiments. Referring to fig. 8, for example, gesture C is a "scissors" gesture located in the second control area within the first level range; the control instruction for gesture C should then be generated jointly by these three factors. In addition, within the same level range, two identical gestures that differ only in control area may also correspond to different control instructions. For another example, in fig. 8, gesture D and gesture C are both "scissors" gestures, but their level ranges and control areas differ, so the control instructions they determine also differ; the specific correspondence is not particularly limited.
It will be appreciated that, since fig. 8 is captured by an image sensor, the captured scene actually expands conically with distance; in fig. 8 all images are drawn at the same size for ease of understanding.
S65: Implement corresponding function control based on the control instruction.
By this method, the actual distance between the hand in the gesture image and the neck massager, together with the specific position of the gesture image and the matching relationship between gestures and control instructions, enables non-contact multiple control over the functions of the neck massager; the correspondence between gestures and control instructions is strengthened, and usage efficiency is improved.
Referring to fig. 9, fig. 9 is a schematic flow chart of a fourth embodiment of a control method of a neck massager provided by the present application, the method including:
S91: Acquire a gesture image of the user by using the image sensor.
Here, the image sensor includes a first camera assembly and a second camera assembly.
S92: the first camera assembly acquires a first gesture image, and the second camera assembly acquires a second gesture image.
The first and second gesture images are acquired at the same time and differ only in viewing angle.
S93: identifying the first gesture image to acquire a first feature point in the first gesture image; and identifying the second gesture image to acquire a second feature point in the second gesture image.
The first feature point and the second feature point correspond to the same physical point, for example the center point of the gesture.
S94: Acquire a first coordinate position from the first gesture image based on the first feature point, and a second coordinate position from the second gesture image based on the second feature point.
In some embodiments, the first coordinate position of the gesture image in the first gesture image and the second coordinate position of the gesture image in the second gesture image both refer to positions of the gesture image based on an image coordinate system, and position coordinates of the gesture image in the first gesture image and/or the second gesture image relative to a world coordinate system can be obtained through calculation.
S95: Calculate the actual distance between the hand in the gesture image and the neck massager according to the first coordinate position and the second coordinate position.
Referring to fig. 10, fig. 10 is a schematic view of the binocular distance measurement principle, where P is the object to be measured (the hand in the gesture image in this embodiment), OL and OR are the optical centers of the first and second camera assemblies respectively, L′ and R′ are the imaging points of P on the photosensors of the two camera assemblies, f is the focal length of the two camera assemblies, B is the center distance between the two camera assemblies, and Z is the actual distance to be calculated between the hand in the gesture image and the neck massager.
Specifically, the following formula, derived from the principle of similar triangles, may be employed:

Z = f × B / (XL − XR)

where XL is the abscissa of the imaging point L′ and XR is the abscissa of the imaging point R′, which can be read directly from the first coordinate position and the second coordinate position respectively; the remaining parameters are known basic parameters of the camera assemblies, so the actual distance Z can be calculated.
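The similar-triangles relation transcribes directly into code. The units below are illustrative assumptions: pixel abscissas for the imaging points, focal length expressed in pixels, and baseline in meters:

```python
def binocular_distance(x_l, x_r, focal_length_px, baseline_m):
    """Depth by similar triangles, Z = f * B / (XL - XR): x_l and x_r are
    the abscissas of imaging points L' and R' (pixels), focal_length_px the
    shared focal length in pixels, baseline_m the center distance B."""
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("non-positive disparity; check calibration/ordering")
    return focal_length_px * baseline_m / disparity
```

For example, with f = 800 px, B = 0.1 m, and a 20 px disparity, the hand would be 4 m away; larger disparities mean a closer hand.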
S96: Determine the corresponding level relationship according to the actual distance.
S97: Recognize the gesture image based on the level relationship to determine the control instruction associated with the gesture image.
S98: Implement corresponding function control based on the control instruction.
Steps S96 to S98 are the same as the corresponding steps in the third embodiment and are not described again here.
Referring to fig. 11, fig. 11 is a schematic flow chart of a fifth embodiment of a control method of a neck massager provided in the present application, the method including:
S101: Acquire a plurality of successive gesture image frames of the user with the image sensor.
S102: Acquire the coordinate information of the hand in the plurality of successive gesture image frames, so as to obtain a plurality of pieces of continuous target coordinate information.
For example, a coordinate system may be established in the shooting area of the image sensor. As the user performs motion interaction with a gesture, the fingertip of a finger is first identified in each image frame, and then the coordinate information of the fingertip in that frame is acquired, so that a plurality of pieces of continuous target coordinate information can be obtained.
S103: Connect the plurality of pieces of continuous target coordinate information in sequence to form a target track image.
S104: the target track image is identified to determine control instructions associated with the target track image.
In some embodiments, before this step is performed, it may be determined whether the target track image is a straight line. It is understood that the trajectory of a hand motion is never an absolutely straight line, so the target track image may be considered straight as long as it is approximately so. To decide this, a linear computation may be performed on the continuous target coordinate information: two pieces of target coordinate information are selected, the expression of the straight line through them is calculated from their abscissas and ordinates in the coordinate system, and its slope k1 is obtained; similarly, the slopes k2, k3, and so on of the lines between the remaining pairs of target coordinates in the target track image are calculated. The mean and variance of the resulting slopes are then computed, and if the slope variance is smaller than a preset threshold (for example, smaller than 1), the target track image formed by these target coordinates may be determined to be a straight line.
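The pairwise-slope variance test can be sketched as follows; skipping vertical pairs (equal abscissas) is a simplification added here, not something the description above specifies:

```python
import itertools
import statistics

def is_approximately_straight(points, variance_threshold=1.0):
    """Slope-variance test: compute the slope between every pair of
    fingertip coordinates and accept the trajectory as a straight line when
    the (population) variance of those slopes is below the threshold."""
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in itertools.combinations(points, 2)
              if x2 != x1]
    if len(slopes) < 2:
        return True       # too few usable pairs to reject straightness
    return statistics.pvariance(slopes) < variance_threshold
```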
Further, after the target track image is determined to be a straight line, the direction of the track can be identified from the first and last pieces of target coordinate information among the plurality of continuous target coordinates. Based on the coordinate system, the specific direction of the track is obtained, for example as an angle with the positive direction of the X axis divided into sectors, such as 0 to 45 degrees corresponding to one command and 45 to 90 degrees to another; the obtained direction can then be matched against these angular sectors to determine the corresponding control instruction.
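The angular binning can be sketched as below. The 45-degree sector width is the example division above, and the mapping from sector index to control command is application-defined, so only the index is returned here:

```python
import math

def direction_sector(start, end, sector_degrees=45):
    """Angle of the start->end displacement with the positive X axis,
    folded into [0, 360) and binned into sectors (0-45, 45-90, ...); each
    sector index can then be looked up in a command table."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    return int(angle // sector_degrees)
```

Note that `math.atan2` handles all four quadrants, so a downward-right stroke lands in the sectors above 270 degrees after folding.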
In other embodiments, when the target trajectory image is not a straight line, step S104 may be a step as shown in fig. 12, specifically as follows:
S1041: Acquire the first and the last piece of target coordinate information among the plurality of pieces of continuous target coordinate information.
S1042: Determine the indicated direction of the gesture image based on the first and last pieces of target coordinate information.
S1043: Recognize the indicated direction to determine the control instruction corresponding to it.
The specific indicated direction of the straight line formed by the starting point (the first target coordinate) and the end point (the last target coordinate) of the non-linear target track image can be obtained based on the coordinate system. The shooting area of the image sensor can then be divided into four areas according to the coordinate quadrants, and the straight line connecting the starting point and the end point is extended until it intersects the boundary of the four areas.
It can be understood that the extension of any straight line intersects the boundary of the shooting area; therefore, the control instruction can be identified and confirmed according to the position at which the extension of the line corresponding to the target track image intersects the area boundary. For example, if the intersection point is located in the area corresponding to the first quadrant, the corresponding control instruction is generated. In this way, the meaning of the gesture interaction can be accurately identified to determine the associated control instruction.
S105: Implement corresponding function control based on the control instruction.
Different from the prior art, this embodiment uses the indicated direction of the gesture's target track image to realize non-contact multiple control over the functions of the neck massager; the correspondence between gesture tracks and control instructions is strengthened, and usage efficiency is improved.
Referring to fig. 13, fig. 13 is a schematic structural view of another embodiment of the neck massager 20 provided by the present application, which includes a massager body 201, a massage assembly 202, a communication circuit 203, an image sensor 204, and a control circuit 205. Wherein, the massage component 202 is arranged on the massage apparatus body 201; the communication circuit 203 is arranged on the massage apparatus body 201; the image sensor 204 is arranged on the massage apparatus body; the control circuit 205 is disposed on the massage device body 201, electrically coupled to the massage assembly 202, the communication circuit 203, and the image sensor 204, and configured to control the massage assembly 202, the communication circuit 203, and the image sensor 204 to implement the following steps:
acquiring a gesture image of a user by using an image sensor, wherein the image sensor is arranged on the neck massager; recognizing the gesture image to determine a control instruction associated with the gesture image; and realizing corresponding function control based on the control instruction.
It can be understood that the neck massager 20 in this embodiment may implement the method steps of any of the above embodiments, and the specific implementation steps thereof may refer to the above embodiments, which are not described herein again.
Referring to fig. 14, fig. 14 is a schematic structural diagram of an embodiment of a computer-readable storage medium 30 provided in the present application, where the computer-readable storage medium is used for storing program data 31, and the program data 31 is used for implementing the following method steps when being executed by a control circuit:
acquiring a gesture image of a user by using an image sensor, wherein the image sensor is arranged on the neck massager; recognizing the gesture image to determine a control instruction associated with the gesture image; and realizing corresponding function control based on the control instruction.
It can be understood that, when the computer-readable storage medium 30 in this embodiment can be applied to a neck massager, the method steps of any of the above embodiments can be implemented, and specific implementation steps thereof can refer to the above embodiments, which are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units in the other embodiments described above may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.