CN107643821B - Input control method and device and electronic equipment - Google Patents


Info

Publication number
CN107643821B
Authority
CN
China
Prior art keywords
human body
body part
motion
generating
image data
Prior art date
Legal status
Active
Application number
CN201610585908.2A
Other languages
Chinese (zh)
Other versions
CN107643821A (en)
Inventor
崔欣
张扬
Current Assignee
Beijing Sogou Technology Development Co Ltd
Original Assignee
Beijing Sogou Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sogou Technology Development Co Ltd filed Critical Beijing Sogou Technology Development Co Ltd
Priority to CN201610585908.2A priority Critical patent/CN107643821B/en
Publication of CN107643821A publication Critical patent/CN107643821A/en
Application granted granted Critical
Publication of CN107643821B publication Critical patent/CN107643821B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention relates to the field of human-computer interaction, and discloses an input control method, an input control device and electronic equipment, which aim to solve the prior-art technical problem that mis-operation easily occurs when characters are input through a key-input-method application. The method comprises the following steps: first, obtaining the motion characteristics of a human body part of a user, wherein the human body part is one whose distance from the electronic equipment is greater than a preset distance; then generating a control instruction corresponding to the motion characteristics; and finally, responding to the control instruction, thereby controlling a key input application of the electronic equipment to execute a first operation corresponding to the control instruction. This achieves the technical effect of improving input accuracy when characters are input through the key input application; in addition, contamination of the display unit by touching bodies can be reduced, improving the cleanliness of the display unit.

Description

Input control method and device and electronic equipment
Technical Field
The invention relates to the field of human-computer interaction, in particular to an input control method and device and electronic equipment.
Background
With the continuous development of science and technology, electronic technology has also advanced rapidly and the variety of electronic products keeps growing, bringing people many conveniences. For example, electronic devices such as notebook computers, desktop computers, smart phones and tablet computers have become an important part of daily life; users can listen to music, play games and so on with such devices, relieving the pressure of modern fast-paced life.
Electronic equipment in the prior art generally has a character input function. For example, a key input application may receive a character string input by the user and then obtain candidate words from it. Many operations are required in the key input application, such as selecting a candidate, calling up the input interface, and adjusting the display area of the candidates; each such operation often corresponds to a specific key of the electronic equipment, and the corresponding operation is generated by triggering (for example, clicking) that key. Because the display interface of the electronic equipment is small, the display interface of the key-input-method application and the display areas of the keys it contains are smaller still, which leads to the technical problem that mis-operation easily occurs;
moreover, when the corresponding key is triggered, the user's fingerprints may be left on the display screen of the electronic equipment, affecting the cleanliness of the display screen.
Disclosure of Invention
The invention provides an input control method, an input control device and electronic equipment, aiming to solve the prior-art technical problem that mis-operation easily occurs when characters are input through a key-input-method application.
In a first aspect, an embodiment of the present invention provides an input control method, including:
obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part with a distance from the electronic equipment larger than a preset distance;
generating a control instruction corresponding to the motion characteristic;
and responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction.
Optionally, the generating a control instruction corresponding to the motion characteristic includes:
judging whether the human body part corresponding to the motion characteristic is a preset human body part or not;
and if the human body part corresponding to the motion characteristic is the preset human body part, generating a control instruction corresponding to the motion characteristic.
Optionally, the obtaining the motion characteristics of the human body part of the user includes:
acquiring and obtaining image data of the user in the process of generating the motion characteristics;
judging whether the object generating the motion characteristics is a human body part or not according to the image data;
and if it is a human body part, analyzing the motion characteristics through the image data.
Optionally, the determining, by using the image data, whether the object generating the motion feature is a human body part includes:
determining the variation of at least two images contained in the image data;
determining a motion area in the at least two images based on the variation of the at least two images;
and judging whether the object generating the motion characteristic is a human body part or not based on the motion area.
Optionally, the determining, based on the motion region, whether the object generating the motion feature is a human body part includes:
acquiring color parameters corresponding to the motion areas;
inputting the color parameters into a color model of the human body part;
and identifying the color parameters through the color model, and determining whether the object generating the motion characteristics is a human body part or not through an identification result.
Optionally, the determining, based on the motion region, whether the object generating the motion feature is a human body part includes:
and matching the shape characteristics of the motion area with the shape characteristics of the pre-stored human body part, and judging whether the object generating the motion characteristics is the human body part or not based on the matching result.
Optionally, the determining, based on the motion region, whether the object generating the motion feature is a human body part further includes:
identifying the color parameters through the color model to obtain a first score value; and/or,
obtaining a second score value according to whether the morphological characteristics are matched with the prestored morphological characteristics of the preset human body part;
and judging whether the object generating the motion characteristic is a human body part or not based on at least one parameter in the first scoring value and the second scoring value.
Optionally, the controlling the key input application of the electronic device to execute a first operation corresponding to the control instruction includes:
controlling the key input application program to execute an operation of adjusting an input interface of the key input application program; and/or,
controlling the key input application program to execute an operation of adjusting the candidate character strings; and/or,
and controlling the key input application program to execute the operation of adjusting the input character string.
Optionally, after the key input application controlling the electronic device performs the first operation corresponding to the control instruction, the method further includes:
and learning from the historical operation records generated by the user for the key input application program, and adjusting accordingly the sensitivity with which control instructions are generated based on the motion characteristics of the human body part.
Optionally, before the obtaining the motion characteristics of the human body part of the user, the method further includes:
judging whether a second operation generated aiming at the key input application program exists within a first preset time period, wherein the second operation is an operation generated by contacting with the electronic equipment;
and if the judgment result is negative, executing the operation of obtaining the motion characteristics of the human body part of the user.
Optionally, the motion characteristics include: at least one parameter of moving direction, moving speed and moving distance.
In a second aspect, an embodiment of the present invention provides an input control apparatus, including:
the device comprises an obtaining module, a display module and a control module, wherein the obtaining module is used for obtaining the motion characteristics of a human body part of a user, and the human body part is a human body part which is more than a preset distance away from the electronic equipment;
the generating module is used for generating a control instruction corresponding to the motion characteristic;
and the response module is used for responding to the control instruction and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for:
obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part with a distance from the electronic equipment larger than a preset distance;
generating a control instruction corresponding to the motion characteristic;
and responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction.
The invention has the following beneficial effects:
in the embodiment of the invention, the motion characteristics of a human body part of the user are obtained first, wherein the human body part is one whose distance from the electronic equipment is greater than a preset distance; then a control instruction corresponding to the motion characteristics is generated; finally, the control instruction is responded to, and the key input application of the electronic equipment is controlled to execute the first operation corresponding to the control instruction. The key input application can thus be controlled based on motion characteristics generated by a human body part without the user touching the display unit of the electronic equipment, and the response area of the key input application is no longer limited by the size of the display unit, achieving the technical effect of improving input accuracy when characters are input through the key input application; in addition, contamination of the display unit by touching bodies can be reduced, improving the cleanliness of the display unit.
Drawings
FIG. 1 is a flow chart of an input control method in an embodiment of the present invention;
FIG. 2 is a flowchart of obtaining motion characteristics of a human body in an input control method according to an embodiment of the present invention;
FIG. 3 is a block diagram of an input control device according to an embodiment of the present invention;
FIG. 4 is a block diagram of an electronic device implementing an input control method in an embodiment of the invention;
fig. 5 is a block diagram of a server in an embodiment of the present invention.
Detailed Description
The invention provides an input control method, an input control device and electronic equipment, and aims to solve the technical problem that misoperation is easy to generate when characters are input through an application program of a key input method in the prior art.
In order to solve the technical problems, the general idea of the embodiment of the present application is as follows:
the motion characteristics of a human body part of a user can be obtained firstly, wherein the human body part is a human body part which is away from the electronic equipment by more than a preset distance; then generating a control instruction corresponding to the motion characteristic; finally, the control instruction is responded, and then the key input application program of the electronic equipment is controlled to execute the first operation corresponding to the control instruction, so that the key input application program can be controlled based on the motion characteristics generated by the human body part without the need of contacting a display unit of the electronic equipment by a user, and the response area of the key input application program can be free from the influence of the size of the display unit of the electronic equipment, so that the technical effect of improving the input precision when characters are input by the key input application program is achieved; in addition, the pollution of the touch body to the display unit can be reduced, and the cleaning degree of the display unit is further improved.
In order to better understand the technical solutions of the present invention, the following detailed descriptions of the technical solutions of the present invention are provided with the accompanying drawings and the specific embodiments, and it should be understood that the specific features in the embodiments and the examples of the present invention are the detailed descriptions of the technical solutions of the present invention, and are not limitations of the technical solutions of the present invention, and the technical features in the embodiments and the examples of the present invention may be combined with each other without conflict.
In a first aspect, an embodiment of the present invention provides an input control method, please refer to fig. 1, including:
step S101: obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part with a distance from the electronic equipment larger than a preset distance;
step S102: generating a control instruction corresponding to the motion characteristic;
step S103: and responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction.
For example, the scheme is applied to electronic equipment with a character input function, such as: a mobile phone, a tablet computer, a notebook computer, etc., embodiments of the present invention are not limited.
In step S101, the preset distance is, for example: 0 cm, 10 cm, etc., embodiments of the present invention are not limited.
In step S101, the motion characteristics of the user's body part may be obtained through various sensors, for example an image acquisition device, an infrared sensor, or an ultrasonic sensor; the following description takes detection through an image acquisition device as an example.
Referring to fig. 2, when the motion characteristics of the human body part of the user are obtained through the image capturing device, the method may include the following steps:
step S201: acquiring and obtaining image data of the user in the process of generating the motion characteristics;
step S202: judging whether the object generating the motion characteristics is a human body part or not according to the image data;
step S203: and if it is a human body part, analyzing the motion characteristics through the image data.
In step S201, image data of the user during generation of the motion characteristics may be acquired through an image acquisition device, which may be built into the electronic equipment itself or externally connected to it; the embodiment of the present invention is not limited.
In a specific implementation, the image acquisition device may either remain in a scanning state so that image data can be acquired at any time, or be controlled to enter the image acquisition state only when a trigger condition is met, so as to reduce its energy consumption. The trigger condition may take various forms, such as triggering a predetermined button or generating a predetermined gesture.
As an alternative embodiment, before the obtaining the motion characteristics of the human body part of the user, the method further includes: judging whether a second operation generated aiming at the key input application program exists within a first preset time period, wherein the second operation is an operation generated by contacting with the electronic equipment; and if the judgment result is negative, executing the operation of obtaining the motion characteristics of the human body part of the user.
For example, the keyboard corresponding to the key input application is a nine-grid (9-key) keyboard, a full keyboard, or the like. The second operation is, for example, a click on a key of the keyboard or a selection of a provided candidate. The first preset time period may be set according to actual requirements, for example 1 minute or 2 minutes; the embodiments of the present invention are not limited. If a second operation generated for the key input application exists within the first preset time period, the user wants to input character strings by touching the electronic equipment, and in this case the motion characteristics of the user's body part need not be detected; otherwise, the user has not been inputting character strings by touching the electronic equipment, so the motion characteristics of the user's body part can be detected in order to input character strings without touching the electronic equipment.
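The optional pre-check described above (fall back to gesture detection only when no recent touch operation exists) can be sketched as follows; the function name and the 60-second default window are illustrative assumptions, not taken from the patent:

```python
import time

def should_use_gesture_input(last_touch_time, window_seconds=60.0, now=None):
    """Enable contactless (gesture) detection only if no touch operation on
    the key input application occurred within the preset time window.
    `last_touch_time` is a timestamp in seconds, or None if never touched."""
    now = time.time() if now is None else now
    if last_touch_time is None:
        return True                      # no prior touch: detect gestures
    return (now - last_touch_time) > window_seconds

# Touched 10 s ago: keep touch input; touched 120 s ago: enable gestures.
print(should_use_gesture_input(last_touch_time=0.0, now=10.0))    # False
print(should_use_gesture_input(last_touch_time=0.0, now=120.0))   # True
```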
Wherein, the human body parts are, for example: hands, eyes, head, arms, etc., embodiments of the present invention are not limited.
In step S202, it can be determined whether the object generating the motion characteristic is a human body part in various ways, two of which are listed below, and certainly, the implementation process is not limited to the following two cases.
The first method of determining, from the image data, whether the object generating the motion characteristics is a human body part includes: determining the variation between at least two images contained in the image data; determining a motion region in the at least two images based on that variation; and judging, based on the motion region, whether the object generating the motion characteristics is a human body part. In a specific implementation, a plurality of pieces of image data can be captured by the image acquisition device and at least two of them selected at random, or the first and last pieces may be used. After obtaining at least two pieces of image data (e.g., 2, 4, or 10), they may be preprocessed, for example by image smoothing, and the motion region then obtained using the inter-frame difference method. The inter-frame difference method extracts the motion region by exploiting the difference between adjacent frames in an image sequence: at least two pieces of image data are registered in the same coordinate system, a difference operation is performed between two images of the same background captured at different times, and the background, whose gray level does not change, is subtracted out. Because the moving object occupies different positions in the two adjacent frames and differs in gray level from the background, the motion region stands out after the subtraction, so its approximate position in the image data can be determined.
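A minimal numpy sketch of the inter-frame difference idea described above: subtract two grayscale frames, threshold the absolute difference, and take the bounding box of the changed pixels as the motion region (the threshold value is an illustrative assumption):

```python
import numpy as np

def motion_region(frame_a, frame_b, threshold=25):
    """Locate the moving region between two grayscale frames by
    inter-frame differencing: subtract, threshold, and take the
    bounding box of the changed pixels."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    changed = diff > threshold          # boolean mask of moving pixels
    if not changed.any():
        return None                     # static scene: no motion region
    rows = np.any(changed, axis=1)
    cols = np.any(changed, axis=0)
    top, bottom = np.where(rows)[0][[0, -1]]
    left, right = np.where(cols)[0][[0, -1]]
    return (top, left, bottom, right)

# Two synthetic 8x8 frames: a bright 2x2 "hand" moves one pixel right.
a = np.zeros((8, 8), dtype=np.uint8)
b = np.zeros((8, 8), dtype=np.uint8)
a[3:5, 2:4] = 200
b[3:5, 3:5] = 200
print(motion_region(a, b))  # bounding box covering the changed pixels
```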
Of course, in the implementation process, the motion region in the image data may also be determined in other ways, for example by the optical flow method, which evaluates changes between two-dimensional images using the principle that the brightness of corresponding pixels is preserved across adjacent frames; put simply, optical flow is the "instantaneous velocity" of the pixel motion of a spatial object on the observation imaging plane. The embodiments of the present invention are not illustrated in detail and are not intended to be limiting.
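For comparison, a toy version of the optical-flow idea: a single-patch Lucas-Kanade estimate built only on the brightness-constancy equation Ix·dx + Iy·dy = −It. Real implementations estimate flow per pixel with windowing and pyramids; this sketch only shows the principle:

```python
import numpy as np

def lucas_kanade_patch(f1, f2):
    """Estimate one (dx, dy) translation for a whole patch from the
    brightness-constancy equation Ix*dx + Iy*dy = -It, by least squares."""
    Iy, Ix = np.gradient(f1.astype(float))    # spatial gradients
    It = f2.astype(float) - f1.astype(float)  # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy

# A smooth intensity ramp shifted one pixel to the right between frames.
x = np.arange(32, dtype=float)
f1 = np.tile(x, (32, 1))
f2 = np.tile(x - 1, (32, 1))     # same ramp, displaced by +1 in x
dx, dy = lucas_kanade_patch(f1, f2)
print(round(dx, 2), round(dy, 2))   # recovers a +1 shift in x
```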
When determining whether the object generating the motion characteristics is a human body part or not by the motion region, various methods may be adopted, and two of them are listed below for description.
The first method of judging, based on the motion region, whether the object generating the motion characteristics is a human body part includes: acquiring color parameters corresponding to the motion region; inputting the color parameters into a color model of the human body part; and identifying the color parameters through the color model, determining from the identification result whether the object generating the motion characteristics is a human body part.
The color parameters are, for example, the RGB or HSI color values of each point of the motion region. These are input into the corresponding color model, which returns a recognition result, and whether the motion region is a human body part is judged based on the recognition results of all pixels of the motion region.
The color model is, for example, a logistic regression model or a BP (Back Propagation) neural network model. The logistic regression model can be obtained by the following training:
obtaining a plurality of groups of sampled pixels, each group containing the color values of its pixels; marking positive and negative samples among them (the marking may be done manually), where a positive sample is a pixel belonging to a human body part and a negative sample is a pixel not belonging to a human body part; and performing logistic regression training with the positive and negative samples to obtain the logistic regression model.
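The training procedure above can be sketched with plain numpy gradient descent on the logistic loss; the toy "skin"/"non-skin" RGB samples below are fabricated for illustration only:

```python
import numpy as np

def train_skin_model(X, y, lr=0.5, epochs=2000):
    """Fit weights theta for p(skin|x) = sigmoid(theta^T x) by gradient
    descent on the logistic loss. X: (n, 3) RGB values scaled to [0, 1];
    y: 1 for skin (positive sample), 0 for non-skin (negative sample)."""
    X = np.hstack([X, np.ones((len(X), 1))])     # append bias term
    theta = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ theta))     # predicted probabilities
        theta -= lr * X.T @ (p - y) / len(y)     # gradient step
    return theta

def p_skin(rgb, theta):
    """Probability that one RGB pixel is skin, under the trained model."""
    x = np.append(np.asarray(rgb, dtype=float), 1.0)
    return 1.0 / (1.0 + np.exp(-x @ theta))

# Toy data: reddish pixels labelled skin, bluish pixels labelled non-skin.
rng = np.random.default_rng(0)
skin = rng.uniform([0.6, 0.3, 0.2], [1.0, 0.6, 0.5], size=(50, 3))
other = rng.uniform([0.0, 0.0, 0.5], [0.4, 0.4, 1.0], size=(50, 3))
X = np.vstack([skin, other])
y = np.concatenate([np.ones(50), np.zeros(50)])
theta = train_skin_model(X, y)
print(p_skin([0.9, 0.5, 0.3], theta) > 0.5)   # reddish pixel: skin-like
print(p_skin([0.1, 0.2, 0.9], theta) > 0.5)   # bluish pixel: not skin-like
```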
The logistic regression formulas corresponding to the trained logistic regression model include, for example:
a logistic regression formula (1):

p(1 | x, θ) = 1 / (1 + e^(−θ^T x))

wherein x represents the vector of the pixel's color values, p(1 | x, θ) represents the probability that the specific region is a human body part, and θ^T is the transpose of the weight vector θ; θ is determined by performing logistic regression training on the positive and negative samples, and its components are the weights of the pixel's color channels;
a logistic regression formula (2):

p(0 | x, θ) = e^(−θ^T x) / (1 + e^(−θ^T x)) = 1 − p(1 | x, θ)

where x represents the vector of color parameters of the specific region, p(0 | x, θ) represents the probability that the specific region is not a human body part, and θ^T is the transpose of the weight vector θ, determined by performing logistic regression training on the positive and negative samples, whose components are the weights of the pixel's color channels.
For example, if the color value of a pixel is an RGB value with R = 150, G = 200 and B = 250, then x = (150, 200, 250); if the weight of the R channel is w1, the weight of the G channel is w2 and the weight of the B channel is w3, the vector θ can be determined as (w1, w2, w3), and θ^T is the transpose of θ.
After the color values of all pixels in a motion region are input into the logistic regression model, a recognition result is obtained for each pixel, representing the probability that the pixel is skin: if the color values are input into formula (1), the recognition result is directly the probability that the pixel is skin; if they are input into formula (2), the recognition result is the probability that the pixel is not skin, and subtracting it from 1 gives the probability that the pixel is skin.
After the recognition results of all pixels in the motion region are obtained, a grayscale image is formed from them; this grayscale image is then binarized, and whether the motion region is a human body part is determined from the binarization result. For example, if a white area exists in the binarized image, skin is present in the motion region, so the motion region is a human body part; if no white area exists, no skin is present, so the motion region is not a human body part. Further, the first score value may also be determined from the binarized image: if a white area exists, the probability that skin is present in the motion region is high, so the first score value is set to 1; if no white area exists, the probability that skin is absent is high, so the first score value is set to 0, and so on.
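The binarization-and-scoring step can be sketched as follows, with the 0.5 binarization threshold as an assumed value:

```python
import numpy as np

def first_score(prob_map, threshold=0.5):
    """Binarize a per-pixel skin-probability map; if any 'white' (skin)
    pixels remain, the motion region is scored 1, otherwise 0."""
    binary = prob_map > threshold        # True = white = likely skin
    return 1 if binary.any() else 0

probs = np.array([[0.1, 0.2, 0.1],
                  [0.2, 0.9, 0.8],      # a small patch of high skin probability
                  [0.1, 0.7, 0.2]])
print(first_score(probs))                  # 1: white area present
print(first_score(np.full((3, 3), 0.1)))   # 0: no skin-like pixels
```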
Of course, in a specific implementation process, the color model may also be other models, which are obtained in other ways, and embodiments of the present invention are not illustrated in detail and are not limited.
In addition, in the implementation process, some preprocessing may be performed on the motion region before the color parameters are obtained, for example brightness processing such as illumination compensation, or color correction; the embodiment of the present invention is not limited.
The second method of judging, based on the motion region, whether the object generating the motion characteristics is a human body part includes: matching the shape features of the motion region against pre-stored shape features of human body parts, and judging, based on the matching result, whether the object generating the motion characteristics is a human body part.
For example, shape features of various human body parts may be stored in advance: if the body part is a human hand, the shape features of various hands can be collected and stored on the local electronic equipment or on a network server. After the motion region is determined, its shape features may first be extracted with an edge extraction operator and then matched against the pre-stored shape features; the best-matching shape feature yields a matching-degree score (i.e., the second score value). It is then judged whether the second score value is greater than a second preset value (e.g., 0.5 or 0.7): if so, the object generating the motion characteristics is determined to be a human body part; otherwise, it is determined not to be.
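A rough sketch of the shape-matching second score, using a crude gradient-based edge extractor and the overlap (intersection-over-union) of the two edge maps as the matching degree; both choices are illustrative assumptions, since the patent does not fix a particular operator or similarity measure:

```python
import numpy as np

def edge_map(img, threshold=30):
    """Crude edge extraction: gradient magnitude above a threshold."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy) > threshold

def shape_match_score(region, template):
    """Second score value: overlap (IoU) between the edge maps of the
    motion region and a pre-stored shape, both assumed the same size."""
    a, b = edge_map(region), edge_map(template)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0
    return np.logical_and(a, b).sum() / union

# A stored "hand" silhouette and a motion region with the same blob.
template = np.zeros((16, 16)); template[4:12, 5:11] = 255
region = np.zeros((16, 16)); region[4:12, 5:11] = 255
score = shape_match_score(region, template)
print(score > 0.5)   # identical shapes exceed a 0.5 cutoff
```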
Further, the shape features of the body parts of a specific user may also be collected in advance, for example the facial or hand features of the current user (e.g., the owner of the electronic equipment). Then, after the motion characteristics of a body part are obtained and their shape features extracted, the shape features can be matched directly against those of the specific user to determine whether the motion characteristics were generated by that user. Because the shape features of different users' body parts differ, this scheme improves the accuracy and speed of shape-feature matching; moreover, matching against a specific user's shape features means that only that user can control the electronic equipment, achieving the technical effect of improving the security of using the electronic equipment.
If the two methods are used in combination, the following approach can be adopted:
identifying the color parameters through the color model to obtain a first scoring value; obtaining a second score value according to whether the morphological characteristics are matched with the prestored morphological characteristics of the preset human body part; and judging whether the object generating the motion characteristic is a human body part or not based on the first scoring value and the second scoring value.
For example, a comprehensive score value may be obtained based on the first score value and the second score value, and it is then judged whether the comprehensive score value is greater than a preset score value: if so, the object generating the motion characteristics is a human body part; otherwise, it is not. The comprehensive score value may be obtained by a weighted sum of the first and second score values, or by multiplying the two together.
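The weighted-sum combination can be sketched in a few lines; the equal weights and the 0.5 cutoff are assumed values:

```python
def is_body_part(first_score, second_score,
                 w_color=0.5, w_shape=0.5, cutoff=0.5):
    """Combine the colour-model score and the shape-match score into one
    decision, here by a weighted sum (a product could be used instead)."""
    combined = w_color * first_score + w_shape * second_score
    return combined > cutoff

print(is_body_part(1, 0.8))   # strong colour and shape evidence
print(is_body_part(0, 0.3))   # weak evidence on both counts
```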
In the second method, determining from the image data whether the object generating the motion feature is a human body part includes: determining the human body part contained in each piece of image data; and judging whether that human body part moves across the pieces of image data; if so, determining that the object generating the motion feature is a human body part.
For example, image features of human body parts may be stored in advance, and each piece of image data is then recognized against those features to determine whether it contains a human body part. If it does, the coordinates of the body part in each piece of image data are determined; if those coordinates do not change significantly across the pieces of image data, the object generating the motion feature is not the human body part; otherwise, it is.
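The coordinate-change check above can be sketched as follows, assuming the detected body part's center coordinates per frame are already available; the displacement threshold is an illustrative assumption.

```python
def body_part_moved(coords, min_displacement=10.0):
    """coords: list of (x, y) centers of the detected body part, one
    per piece of image data. Returns True when the position changes by
    more than min_displacement pixels between consecutive frames,
    i.e. the body part itself generated the motion feature."""
    for (x0, y0), (x1, y1) in zip(coords, coords[1:]):
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > min_displacement:
            return True
    return False
```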
In step S101, the motion feature includes at least one of: movement direction, movement speed, and movement distance. The movement direction is, for example, left to right, right to left, or top to bottom; the movement distance may be determined from the change in position of the motion region across the pieces of image data; and the movement speed may be obtained by dividing that movement distance by the time interval between the pieces of image data.
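The three motion parameters can be derived from two frame positions as sketched below; treating the motion region's center in two frames as the input, with the frame interval in seconds, is an assumption for the illustration.

```python
def motion_features(p_start, p_end, frame_interval_s):
    """Derive movement direction, distance and speed from the motion
    region's position in two consecutive pieces of image data."""
    dx, dy = p_end[0] - p_start[0], p_end[1] - p_start[1]
    distance = (dx ** 2 + dy ** 2) ** 0.5
    # Speed = distance moved divided by the time between frames.
    speed = distance / frame_interval_s
    # Pick the dominant axis to name the direction.
    if abs(dx) >= abs(dy):
        direction = 'left-to-right' if dx > 0 else 'right-to-left'
    else:
        direction = 'top-to-bottom' if dy > 0 else 'bottom-to-top'
    return direction, distance, speed
```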
In step S102, after the motion feature of the user's body part is obtained, the motion feature is looked up directly in an instruction library to obtain the corresponding control instruction; the movement direction, movement speed, movement distance, and so on of the body part's motion feature typically each correspond to a control instruction. For example, if the body part is a hand: when the hand swings from the top to the bottom of the image acquisition device's field of view, a control instruction for invoking the input interface may be generated; when the hand swings from upper left to lower right, a control instruction for hiding the input interface may be generated; when the hand swings from left to right, a control instruction for expanding the candidate bar may be generated; when the hand swings from right to left, a control instruction for clearing the candidate bar and the input bar may be generated; and when the hand swings upward from below, a control instruction for expanding and scrolling the candidate bar may be generated. If the body part is the head: when the head swings to the left, a backspace control instruction may be generated; when the head nods, a control instruction for inserting the cursor may be generated; and so on. If the body part is an eye: when the eye is detected to stare at a position for longer than a preset time (for example, 3 s or 5 s), a control instruction for positioning the cursor may be generated; when the eye is detected to turn left, a control instruction for moving the cursor to the left is generated; and so on. Of course, other control instructions may also be generated; the embodiments of the present invention do not enumerate or limit them.
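The instruction-library lookup in step S102 can be sketched as a dictionary keyed on (body part, gesture). The gesture names and instruction strings below are assumptions that mirror the hand and head examples above.

```python
# Illustrative instruction library; entries follow the examples in the
# description and are not an exhaustive or normative set.
INSTRUCTION_LIBRARY = {
    ('hand', 'top-to-bottom'): 'invoke_input_interface',
    ('hand', 'upper-left-to-lower-right'): 'hide_input_interface',
    ('hand', 'left-to-right'): 'expand_candidate_bar',
    ('hand', 'right-to-left'): 'clear_candidate_and_input_bars',
    ('hand', 'bottom-to-top'): 'expand_and_scroll_candidate_bar',
    ('head', 'swing-left'): 'backspace',
    ('head', 'nod'): 'insert_cursor',
}

def generate_control_instruction(body_part, motion):
    """Look the motion feature up in the instruction library;
    returns None when no instruction is bound to the gesture."""
    return INSTRUCTION_LIBRARY.get((body_part, motion))
```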
As an alternative embodiment, the generating the control instruction corresponding to the motion characteristic includes: judging whether the human body part corresponding to the motion characteristic is a preset human body part or not; and if the human body part corresponding to the motion characteristic is the preset human body part, generating a control instruction corresponding to the motion characteristic.
The predetermined human body part is, for example, the head, hands, arms, or eyes; embodiments of the present invention are not limited in this respect. The electronic device may bind control instructions of the key input application only to motion features of the predetermined human body part, and not to features of other body parts. With this scheme, a corresponding control instruction does not have to be looked up for every human body feature, which reduces the processing burden on the electronic device.
In step S103, the first operation corresponding to the control instruction may be any of a plurality of operations; three of them are described below, although in a specific implementation the first operation is not limited to these cases.
First, controlling the key input application of the electronic device to execute the first operation corresponding to the control instruction includes: controlling the key input application to execute an operation of adjusting its input interface.
The adjustment of the input interface includes, for example: invoking the input interface, hiding the input interface, increasing its display size, decreasing its display size, and so on. Taking a human hand as an example: the input interface may be invoked when the hand swings from the top to the bottom of the image acquisition device's field of view, and hidden when the hand swings from upper left to lower right, and so on.
Secondly, the controlling the key input application of the electronic device to execute the first operation corresponding to the control instruction includes: and controlling the key input application program to execute the operation of adjusting the candidate character strings.
For example, the operation of adjusting the candidate character string includes: expanding the candidate bar that displays candidate character strings, clearing the candidate bar, clearing the input bar used to determine a candidate character string, scrolling the candidate bar, and so on. Taking a human hand as an example: the candidate bar may be expanded when the hand swings from left to right across the image acquisition device; the candidate bar and the input bar may be cleared when the hand swings from right to left; and the candidate bar may be expanded and scrolled when the hand swings upward from below, and so on.
Thirdly, the controlling the key input application program of the electronic device to execute the first operation corresponding to the control instruction includes: and controlling the key input application program to execute the operation of adjusting the input character string.
For example, the operation of adjusting the input character string includes a backspace operation, a character-string insertion operation, and the like. Taking the head as an example: when a user inputs a character string with both hands using the key input application of a mobile phone and encounters a wrongly typed character, the character before the cursor needs to be deleted; the user may bind the backspace operation to a leftward head swing, so that when a leftward head-swing motion feature is detected, the key input application is controlled to perform a backspace operation. Nodding may be bound to the cursor-insertion operation, and the cursor movement may be controlled by the nodding speed: if the nodding speed is fast, the cursor may be moved forward by more characters before being inserted; if the nodding speed is slow, the cursor may be moved forward by fewer characters before being inserted, and so on.
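The backspace binding and the nod-speed-controlled cursor movement above can be sketched as follows; the speed threshold and the character offsets are illustrative assumptions, not values from the specification.

```python
def apply_backspace(text, cursor):
    """Delete the character immediately before the cursor, as bound
    to the leftward head swing in the example above."""
    if cursor == 0:
        return text, cursor
    return text[:cursor - 1] + text[cursor:], cursor - 1

def cursor_after_nod(cursor, nod_speed, fast_threshold=2.0):
    """Move the cursor forward by more characters for a fast nod
    than for a slow one (assumed units: nods per second)."""
    return cursor + (5 if nod_speed >= fast_threshold else 1)
```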
In the specific implementation process, the motion characteristics of the human body can be bound with some other control instructions, for example: binding the motion characteristics of the hand from bottom to top into a sending instruction; the movement characteristics of the hand from top to bottom are bound to be confirmation instructions and the like, and the embodiment of the invention is not listed in detail and is not limited.
As an optional embodiment, after controlling the key input application of the electronic device to perform the first operation corresponding to the control instruction, the method further includes: learning from the historical operation records generated by the user with the key input application, and adjusting the sensitivity with which the corresponding control instruction is generated from the motion feature of the human body part.
For example, if an operation revoking the control instruction is detected within a second preset time period (for example, 30 s or 40 s) after the instruction was generated, the previous control instruction was likely an erroneous response. In this case the response sensitivity of the electronic device may be lowered by raising its thresholds: if the motion feature includes a movement speed and the corresponding control instruction is generated only when that speed exceeds a preset movement speed, the preset movement speed may be increased; if the motion feature includes a movement distance and the instruction is generated only when that distance exceeds a preset movement distance, the preset movement distance may be increased; alternatively, the first preset value, the second preset value, or the preset score value may be increased. Conversely, if the user generates the same motion feature several times in succession and the electronic device does not respond, the user probably wishes to perform some operation but the thresholds are too strict for the device to respond, so the response sensitivity may be raised. For example, if the user waves a hand from the top to the bottom of the image acquisition device to invoke the input interface but the movement distance does not reach the preset movement distance, the device does not respond; in this case the preset movement distance may be reduced, and similarly the preset movement speed, the first preset value, the second preset value, or the preset score value may be reduced.
Based on this scheme, different response sensitivities can be set for different users, achieving the technical effect of more accurate control when using the key input application.
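The threshold adjustment described above can be sketched as a small stateful adjuster over the preset movement distance; the initial threshold and step size are assumptions for the illustration.

```python
class SensitivityAdjuster:
    """Adjusts the preset movement-distance threshold from the
    operation history: an undo shortly after a response suggests a
    false trigger (raise the threshold, i.e. lower sensitivity);
    repeated identical gestures with no response suggest the
    threshold is too strict (lower it, i.e. raise sensitivity)."""

    def __init__(self, preset_distance=50.0, step=5.0):
        self.preset_distance = preset_distance
        self.step = step

    def on_undo_within_window(self):
        # Likely an erroneous response: require a larger movement.
        self.preset_distance += self.step

    def on_repeated_unanswered_gesture(self):
        # User keeps trying with no response: be more permissive.
        self.preset_distance = max(0.0, self.preset_distance - self.step)
```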
In a second aspect, based on the same inventive concept, an embodiment of the present invention provides an input control apparatus, please refer to fig. 3, including:
an obtaining module 30, configured to obtain a motion characteristic of a human body part of a user, where the human body part is a human body part that is located a distance greater than a preset distance from an electronic device;
a generating module 31, configured to generate a control instruction corresponding to the motion characteristic;
and the response module 32 is configured to respond to the control instruction, and control a key input application of the electronic device to execute a first operation corresponding to the control instruction.
Optionally, the generating module 31 includes:
the first judging unit is used for judging whether the human body part corresponding to the motion characteristic is a preset human body part or not;
and the generating unit is used for generating a control instruction corresponding to the motion characteristic if the human body part corresponding to the motion characteristic is the preset human body part.
Optionally, the obtaining module 30 includes:
the acquisition unit is used for acquiring and obtaining image data of the user in the process of generating the motion characteristics;
a second judging unit, configured to judge, according to the image data, whether the object generating the motion feature is a human body part;
and the analysis unit is used for analyzing the motion characteristics through the image data if the human body part is the human body part.
Optionally, the second determining unit includes:
a first determining subunit configured to determine variation amounts of at least two images included in the image data;
a second determining subunit, configured to determine a motion region in the at least two images based on the variation of the at least two images;
and the judging subunit is used for judging whether the object generating the motion characteristic is a human body part or not based on the motion area.
Optionally, the determining subunit is configured to: acquiring color parameters corresponding to the motion areas; inputting the color parameters into a color model of the human body part; and identifying the color parameters through the color model, and determining whether the object generating the motion characteristics is a human body part or not through an identification result.
Optionally, the determining subunit is configured to:
and matching the shape characteristics of the motion area with the shape characteristics of the pre-stored human body part, and judging whether the object generating the motion characteristics is the human body part or not based on the matching result.
Optionally, the determining subunit is further configured to:
identifying the color parameters through the color model to obtain a first score value; and/or,
obtaining a second score value according to whether the morphological characteristics are matched with the prestored morphological characteristics of the preset human body part;
and judging whether the object generating the motion feature is a human body part based on at least one of the first score value and the second score value.
Optionally, the response module 32 is configured to:
controlling the key input application program to execute an operation of adjusting an input interface of the key input application program; and/or,
controlling the key input application program to execute the operation of adjusting the candidate character strings; and/or,
and controlling the key input application program to execute the operation of adjusting the input character string.
Optionally, the apparatus further comprises:
and the adjusting module is used for learning the historical operation records generated by the key input application program through the user and adjusting the sensitivity of generating the corresponding control instruction based on the motion characteristics of the human body part.
Optionally, the apparatus further comprises:
the judging module is used for judging whether a second operation generated aiming at the key input application program exists in a first preset time period, and the second operation is an operation generated by contacting with the electronic equipment;
and if the judgment result is negative, executing the operation of obtaining the motion characteristics of the human body part of the user.
Optionally, the motion characteristics include: at least one parameter of moving direction, moving speed and moving distance.
Since the input control device described in the second aspect of the present invention is a device used for implementing the input control method described in the first aspect of the present invention, based on the input control method described in the first aspect of the present invention, a person skilled in the art can understand the specific structure and the modification of the input control device described in the second aspect of the present invention, and therefore no further description is given here, and all the devices used for implementing the input control method described in the first aspect of the present invention belong to the scope of the present invention to be protected.
In a third aspect, based on the same inventive concept, an embodiment of the present invention provides an electronic device, including a memory, and one or more programs, where the one or more programs are stored in the memory, and configured to be executed by the one or more processors, and the one or more programs include instructions for:
obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part with a distance from the electronic equipment larger than a preset distance;
generating a control instruction corresponding to the motion characteristic;
and responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction.
Since the electronic device described in the third aspect of the present invention is an electronic device used for implementing the input control method described in the first aspect of the present invention, based on the input control method described in the first aspect of the present invention, a person skilled in the art can understand a specific structure and a modification of the electronic device described in the third aspect of the present invention, and therefore details are not described herein, and all electronic devices used for implementing the input control method described in the first aspect of the present invention belong to the scope of the present invention to be protected.
Fig. 4 is a block diagram of an electronic device 800 illustrating a method of input control according to an example embodiment. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 4, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing elements 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the device 800, the relative positioning of components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the electronic device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform an input control method, the method comprising:
obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part with a distance from the electronic equipment larger than a preset distance;
generating a control instruction corresponding to the motion characteristic;
and responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction.
Fig. 5 is a schematic structural diagram of a server in an embodiment of the present invention. The server 1900 may vary widely by configuration or performance and may include one or more Central Processing Units (CPUs) 1922 (e.g., one or more processors) and memory 1932, one or more storage media 1930 (e.g., one or more mass storage devices) storing applications 1942 or data 1944. Memory 1932 and storage medium 1930 can be, among other things, transient or persistent storage. The program stored in the storage medium 1930 may include one or more modules (not shown), each of which may include a series of instructions operating on a server. Still further, a central processor 1922 may be provided in communication with the storage medium 1930 to execute a series of instruction operations in the storage medium 1930 on the server 1900.
The server 1900 may also include one or more power supplies 1926, one or more wired or wireless network interfaces 1950, one or more input-output interfaces 1958, one or more keyboards 1956, and/or one or more operating systems 1941, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
One or more embodiments of the invention have at least the following beneficial effects:
In the embodiments of the invention, the motion feature of a human body part of the user is first obtained, where the body part is one whose distance from the electronic device is greater than a preset distance; a control instruction corresponding to the motion feature is then generated; finally, in response to the control instruction, the key input application of the electronic device is controlled to execute the first operation corresponding to the instruction. The key input application can thus be controlled by motion features generated by a body part, without the user having to touch the display unit of the electronic device, and the response area of the key input application is no longer limited by the size of the display unit, achieving the technical effect of improving input precision when characters are entered through the key input application. In addition, contamination of the display unit by touching is reduced, improving the cleanliness of the display unit.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (22)

1. An input control method, comprising:
obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part with a distance from the electronic equipment larger than a preset distance;
generating a control instruction corresponding to the motion characteristic;
responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction;
the obtaining of the motion characteristics of the human body part of the user comprises:
acquiring image data of the user during the process of generating the motion characteristics;
judging whether the object generating the motion characteristics is a human body part or not according to the image data;
if the object is a human body part, analyzing the motion characteristics through the image data;
wherein, the determining whether the object generating the motion characteristics is a human body part according to the image data specifically includes:
determining a motion area of the motion feature in the image data through the image data;
judging whether human skin exists in the motion area or not to obtain a judgment result;
and determining whether the object generating the motion characteristics is a human body part or not according to the judgment result.
2. The method of claim 1, wherein said generating control instructions corresponding to said motion profile comprises:
judging whether the human body part corresponding to the motion characteristic is a preset human body part or not;
and if the human body part corresponding to the motion characteristic is the preset human body part, generating a control instruction corresponding to the motion characteristic.
3. The method of claim 1, wherein determining from the image data whether the object that generates the motion feature is a human body part comprises:
determining the variation of at least two images contained in the image data;
determining a motion area in the at least two images based on the variation of the at least two images;
and judging whether the object generating the motion characteristic is a human body part or not based on the motion area.
4. The method of claim 3, wherein the determining whether the object generating the motion feature is a human body part based on the motion region comprises:
acquiring color parameters corresponding to the motion areas;
inputting the color parameters into a color model of the human body part;
and identifying the color parameters through the color model, and determining whether the object generating the motion characteristics is a human body part or not through an identification result.
5. The method of claim 4, wherein the determining whether the object generating the motion feature is a human body part based on the motion region comprises:
and matching the shape characteristics of the motion area with the shape characteristics of the pre-stored human body part, and judging whether the object generating the motion characteristics is the human body part or not based on the matching result.
6. The method of claim 5, wherein the determining whether the object generating the motion feature is a human body part based on the motion region further comprises:
identifying the color parameters through the color model to obtain a first scoring value; and/or,
obtaining a second scoring value according to whether the shape characteristics match the pre-stored shape characteristics of the preset human body part;
and judging whether the object generating the motion feature is a human body part or not based on at least one of the first scoring value and the second scoring value.
7. The method of any of claims 1-6, wherein controlling the key input application of the electronic device to perform a first operation corresponding to the control instruction comprises:
controlling the key input application program to execute an operation of adjusting an input interface of the key input application program; and/or,
controlling the key input application program to execute the operation of adjusting the candidate character strings; and/or,
and controlling the key input application program to execute the operation of adjusting the input character string.
8. The method of any of claims 1-6, wherein after the controlling the key input application of the electronic device to perform the first operation corresponding to the control instruction, the method further comprises:
learning from the historical operation records generated by the user for the key input application program, and adjusting the sensitivity of generating the corresponding control instruction based on the motion characteristics of the human body part.
9. The method of any of claims 1-6, wherein prior to said obtaining the motion characteristics of the human body part of the user, the method further comprises:
judging whether a second operation generated for the key input application program exists within a first preset time period, wherein the second operation is an operation generated by contact with the electronic equipment;
and if the judgment result is negative, executing the operation of obtaining the motion characteristics of the human body part of the user.
10. The method of any of claims 1-6, wherein the motion features comprise: at least one parameter of moving direction, moving speed and moving distance.
11. An input control device, comprising:
an obtaining module, used for obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part whose distance from the electronic equipment is greater than a preset distance;
the generating module is used for generating a control instruction corresponding to the motion characteristic;
the response module is used for responding to the control instruction and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction;
the obtaining module includes:
the acquisition unit is used for acquiring image data of the user during the process of generating the motion characteristics;
the second judging unit is used for judging whether the object generating the motion characteristics is a human body part or not according to the image data;
the analysis unit is used for analyzing the motion characteristics through the image data if the object is a human body part;
wherein the second determining unit specifically includes:
a motion region determining unit, configured to determine, from the image data, a motion region of the motion feature in the image data;
the human skin judging unit is used for judging whether human skin exists in the motion area or not to obtain a judgment result;
and the human body part determining unit is used for determining whether the object generating the motion characteristics is a human body part according to the judgment result.
12. The apparatus of claim 11, wherein the generating module comprises:
the first judging unit is used for judging whether the human body part corresponding to the motion characteristic is a preset human body part or not;
and the generating unit is used for generating a control instruction corresponding to the motion characteristic if the human body part corresponding to the motion characteristic is the preset human body part.
13. The apparatus of claim 11, wherein the second determining unit comprises:
a first determining subunit configured to determine variation amounts of at least two images included in the image data;
a second determining subunit, configured to determine a motion region in the at least two images based on the variation of the at least two images;
and the judging subunit is used for judging whether the object generating the motion characteristic is a human body part or not based on the motion area.
14. The apparatus as claimed in claim 13, wherein said determining subunit is configured to: acquiring color parameters corresponding to the motion areas; inputting the color parameters into a color model of the human body part; and identifying the color parameters through the color model, and determining whether the object generating the motion characteristics is a human body part or not through an identification result.
15. The apparatus as claimed in claim 14, wherein said determining subunit is configured to:
and matching the shape characteristics of the motion area with the shape characteristics of the pre-stored human body part, and judging whether the object generating the motion characteristics is the human body part or not based on the matching result.
16. The apparatus as recited in claim 15, wherein said determining subunit is further configured to:
identifying the color parameters through the color model to obtain a first scoring value; and/or,
obtaining a second scoring value according to whether the shape characteristics match the pre-stored shape characteristics of the preset human body part;
and judging whether the object generating the motion feature is a human body part or not based on at least one of the first scoring value and the second scoring value.
17. The apparatus of any of claims 11 to 16, wherein the response module is configured to:
controlling the key input application program to execute an operation of adjusting an input interface of the key input application program; and/or,
controlling the key input application program to execute the operation of adjusting the candidate character strings; and/or,
and controlling the key input application program to execute the operation of adjusting the input character string.
18. The apparatus of any of claims 11 to 16, further comprising:
and the adjusting module is used for learning from the historical operation records generated by the user for the key input application program, and adjusting the sensitivity of generating the corresponding control instruction based on the motion characteristics of the human body part.
19. The apparatus of any of claims 11 to 16, further comprising:
the judging module is used for judging whether a second operation generated for the key input application program exists within a first preset time period, wherein the second operation is an operation generated by contact with the electronic equipment;
and if the judgment result is negative, executing the operation of obtaining the motion characteristics of the human body part of the user.
20. The apparatus of any of claims 11 to 16, wherein the motion features comprise: at least one parameter of moving direction, moving speed and moving distance.
21. An electronic device comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for:
obtaining the motion characteristics of a human body part of a user, wherein the human body part is a human body part whose distance from the electronic equipment is greater than a preset distance;
generating a control instruction corresponding to the motion characteristic;
responding to the control instruction, and controlling a key input application program of the electronic equipment to execute a first operation corresponding to the control instruction;
the obtaining of the motion characteristics of the human body part of the user comprises:
acquiring image data of the user during the process of generating the motion characteristics;
judging whether the object generating the motion characteristics is a human body part or not according to the image data;
if the object is a human body part, analyzing the motion characteristics through the image data;
wherein, the determining whether the object generating the motion characteristics is a human body part according to the image data specifically includes:
determining a motion area of the motion feature in the image data through the image data;
judging whether human skin exists in the motion area or not to obtain a judgment result;
and determining whether the object generating the motion characteristics is a human body part or not according to the judgment result.
22. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, is adapted to carry out the method steps of any of claims 1 to 10.
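Claims 1, 3, 4 and 6 together describe a recognition pipeline: difference at least two captured frames to locate the motion area, then judge whether human skin is present in that area via a colour model, optionally fusing colour and shape scores. The sketch below illustrates that pipeline under stated assumptions: the fixed RGB skin rule and the thresholds are illustrative placeholders for the trained colour model and shape matcher the claims leave unspecified, not the patented implementation.

```python
import numpy as np

def motion_region(frame_a, frame_b, threshold=30):
    """Bounding box of large inter-frame differences (claim 3's
    'variation of at least two images'). Returns None if nothing moved."""
    diff = np.abs(frame_a.astype(int) - frame_b.astype(int)).max(axis=2)
    ys, xs = np.nonzero(diff > threshold)
    if ys.size == 0:
        return None
    return (int(ys.min()), int(ys.max()) + 1, int(xs.min()), int(xs.max()) + 1)

def skin_score(region_rgb):
    """First scoring value (claims 4/6): fraction of pixels whose RGB
    values fall inside a crude skin-colour box. A real system would use
    a trained colour model of the human body part instead."""
    r = region_rgb[..., 0].astype(int)
    g = region_rgb[..., 1].astype(int)
    b = region_rgb[..., 2].astype(int)
    skin = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (np.abs(r - g) > 15)
    return float(skin.mean())

def is_human_part(frame_a, frame_b, skin_threshold=0.3):
    """Claim 1's decision: find the motion area, then judge whether
    human skin exists in it. The moving part is assumed visible in
    frame_a; skin_threshold is an assumed cut-off."""
    box = motion_region(frame_a, frame_b)
    if box is None:
        return False
    y0, y1, x0, x1 = box
    return skin_score(frame_a[y0:y1, x0:x1]) >= skin_threshold
```

Only when this gate passes would the method go on to analyze the motion characteristics (direction, speed, distance per claim 10) and map them to a control instruction for the key input application program.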
CN201610585908.2A 2016-07-22 2016-07-22 Input control method and device and electronic equipment Active CN107643821B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610585908.2A CN107643821B (en) 2016-07-22 2016-07-22 Input control method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN107643821A CN107643821A (en) 2018-01-30
CN107643821B true CN107643821B (en) 2021-07-27

Family

ID=61108178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610585908.2A Active CN107643821B (en) 2016-07-22 2016-07-22 Input control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN107643821B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109446999B (en) * 2018-10-31 2021-08-31 中电科新型智慧城市研究院有限公司 Rapid sensing system and method for dynamic human body movement based on statistical calculation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101661329A (en) * 2009-09-22 2010-03-03 北京中星微电子有限公司 Operating control method and device of intelligent terminal
CN102202126A (en) * 2011-05-26 2011-09-28 惠州Tcl移动通信有限公司 Method for adjusting mobile phone volume and mobile phone
CN102520790A (en) * 2011-11-23 2012-06-27 中兴通讯股份有限公司 Character input method based on image sensing module, device and terminal
CN103576848A (en) * 2012-08-09 2014-02-12 腾讯科技(深圳)有限公司 Gesture operation method and gesture operation device
CN104584077A (en) * 2012-08-17 2015-04-29 日本电气方案创新株式会社 Input device, input method, and recording medium
CN104615231A (en) * 2013-11-01 2015-05-13 中国移动通信集团公司 Determination method for input information, and equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9323336B2 (en) * 2013-07-01 2016-04-26 Blackberry Limited Gesture detection using ambient light sensors

Also Published As

Publication number Publication date
CN107643821A (en) 2018-01-30

Similar Documents

Publication Publication Date Title
US10956706B2 Collecting fingerprints
CN108363706B (en) Method and device for man-machine dialogue interaction
US10942580B2 (en) Input circuitry, terminal, and touch response method and device
CN110554815B (en) Icon awakening method, electronic device and storage medium
WO2021135601A1 (en) Auxiliary photographing method and apparatus, terminal device, and storage medium
CN105224195B (en) Terminal operation method and device
US11138422B2 (en) Posture detection method, apparatus and device, and storage medium
WO2020042727A1 (en) Interaction method of application scenario, and mobile terminal and storage medium
CN110555333A (en) fingerprint identification method, electronic device and storage medium
EP3933570A1 (en) Method and apparatus for controlling a voice assistant, and computer-readable storage medium
US11222223B2 (en) Collecting fingerprints
CN109753202B (en) Screen capturing method and mobile terminal
US10885298B2 (en) Method and device for optical fingerprint recognition, and computer-readable storage medium
CN111506245A (en) Terminal control method and device
CN107291772A (en) One kind search access method, device and electronic equipment
WO2019007236A1 (en) Input method, device, and machine-readable medium
CN107132927B (en) Input character recognition method and device for recognizing input characters
CN111144266A (en) Facial expression recognition method and device
EP3975046A1 (en) Method and apparatus for detecting occluded image and medium
CN112114653A (en) Terminal device control method, device, equipment and storage medium
CN110858291A (en) Character segmentation method and device
CN110519517B (en) Copy guiding method, electronic device and computer readable storage medium
CN107643821B (en) Input control method and device and electronic equipment
CN113642551A (en) Nail key point detection method and device, electronic equipment and storage medium
CN111382598A (en) Identification method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant