CN106155277B - Electronic equipment and information processing method

Electronic equipment and information processing method

Info

Publication number
CN106155277B
Authority
CN
China
Prior art keywords
acceleration
component
input data
predetermined condition
vibration information
Prior art date
Legal status
Active
Application number
CN201510137141.2A
Other languages
Chinese (zh)
Other versions
CN106155277A (en)
Inventor
肖蔓君
陈柯
马骞
刘文静
杨晨
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201510137141.2A
Publication of CN106155277A
Application granted
Publication of CN106155277B

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device and an information processing method are disclosed. The electronic device comprises: a body apparatus; a fixing apparatus connected to the body apparatus, the fixing apparatus being used to fix the relative positional relationship between the electronic device and its user; and a sensing apparatus arranged in the body apparatus and/or the fixing apparatus. The sensing apparatus includes: a first sensing unit for sensing first input data of the user relating to a gesture input; a second sensing unit for sensing second input data of the user; a processing unit for analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction; and a communication unit for sending the instruction to an external device.

Description

Electronic equipment and information processing method
Technical field
The present invention relates to the field of electronic technology, and more particularly to an electronic device and an information processing method for human-computer interaction.
Background Art
With the arrival of the information age, the range of smart devices has gradually expanded from the initial computer to phones, tablet computers, televisions, automobiles, and so on. For smart devices, a key problem that must be solved to achieve intelligence is how to better realize human-computer interaction. Existing conventional devices, for example the mouse and keyboard of a computer or the remote control of a household appliance, all depend on a specific input accessory to realize human-computer interaction.
However, newly emerging interaction scenarios include not only computers, phones, and tablets, but also smart homes, smart automobiles, and other emerging scenes. In these scenarios, it is desirable to provide an interactive experience without being limited by objective conditions such as distance and environment. Therefore, how to interact anytime and anywhere is a hot research direction in the field of human-computer interaction.
Summary of the invention
In view of above situation, it is intended to provide the electronic equipment and information processing method that can be interacted whenever and wherever possible.
According to an aspect of the invention, there is provided a kind of electronic equipment, comprising: body apparatus;Fixed device, with institute Body apparatus connection is stated, the fixed device is for the fixed relative positional relationship with the user of the electronic equipment;And Sensing device is arranged in the body apparatus and/or the fixed device, wherein the sensing device includes: the first sensing Unit, for sensing the first input data about gesture input of user;Second sensing unit, for sensing user's Second input data;Processing unit, for by being analyzed in conjunction with first input data and second input data, To judge the type of the gesture input and generate corresponding instruction;And communication unit, for sending described instruction to External equipment.
Preferably, in the electronic device according to an embodiment of the present invention, the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes vibration information, the vibration information including a vibration amplitude and/or a vibration frequency; the processing unit is configured to determine the type of the gesture input based on the acceleration information and the vibration information.
Preferably, in the electronic device according to an embodiment of the present invention, the processing unit is configured to include: a first judging component for judging whether, among the three-axis acceleration components, there is an acceleration component greater than a first threshold; a merging component for merging the components into an acceleration in a first direction if the judgment result of the first judging component is yes; a second judging component for judging whether the vibration information satisfies a first predetermined condition; and a determining component for determining that the type of the gesture input is a click operation when the merging component has merged the components into the acceleration in the first direction and the second judging component judges that the vibration information satisfies the first predetermined condition.
Preferably, in the electronic device according to an embodiment of the present invention, the processing unit is configured to include: a first judging component for judging whether, among the three-axis acceleration components, there is an acceleration component greater than a first threshold; a merging component for merging the components into an acceleration in a first direction if the judgment result of the first judging component is yes; a second judging component for judging whether the vibration information satisfies a third predetermined condition; and a determining component for determining that the type of the gesture input is a sliding operation when the merging component has merged the components into the acceleration in the first direction and the second judging component judges that the vibration information satisfies the third predetermined condition.
Preferably, in the electronic device according to an embodiment of the present invention, the first judging component is further configured to judge whether the first sensing unit senses an acceleration, greater than the first threshold, in a second direction opposite to the first direction, and the second judging component is configured to judge whether the vibration information satisfies a second predetermined condition; when the first judging component judges that the acceleration in the second direction is sensed and the second judging component judges that the vibration information satisfies the second predetermined condition, the determining component determines that the gesture input has ended.
Preferably, in the electronic device according to an embodiment of the present invention, the first input data further includes angular velocity information; the first sensing unit is configured to calculate the motion trajectory of the gesture input based on the acceleration information and the angular velocity information, and the communication unit is further configured to send the motion trajectory to the external device.
Preferably, in the electronic device according to an embodiment of the present invention, the processing unit is further configured to include: a trajectory recognition component for comparing the motion trajectory calculated by the first sensing unit with a plurality of predetermined trajectories and identifying which predetermined trajectory the motion trajectory corresponds to; a second judging component for judging whether the vibration information satisfies the first predetermined condition or the third predetermined condition; and a determining component for generating an instruction corresponding to the predetermined trajectory when the trajectory recognition component identifies the predetermined trajectory corresponding to the motion trajectory and the second judging component judges that the vibration information satisfies the third predetermined condition.
Preferably, in the electronic device according to an embodiment of the present invention, the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes voice information; the processing unit is configured to determine the type of the gesture input based on the acceleration information and the voice information.
According to another aspect of the present invention, there is provided an information processing method applied to an electronic device, comprising: sensing first input data of a user relating to a gesture input; sensing second input data of the user; analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction; and sending the instruction to an external device.
Preferably, in the method according to an embodiment of the present invention, the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes vibration information, the vibration information including a vibration amplitude and/or a vibration frequency, wherein the type of the gesture input is determined based on the acceleration information and the vibration information.
Preferably, in the method according to an embodiment of the present invention, the step of analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction, includes: judging whether, among the three-axis acceleration components, there is an acceleration component greater than a first threshold; if the judgment result is yes, merging the components into an acceleration in a first direction; judging whether the vibration information satisfies a first predetermined condition; and determining that the type of the gesture input is a click operation when the acceleration in the first direction has been merged and the vibration information is judged to satisfy the first predetermined condition.
Preferably, in the method according to an embodiment of the present invention, the step of analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction, includes: judging whether, among the three-axis acceleration components, there is an acceleration component greater than a first threshold; if the judgment result is yes, merging the components into an acceleration in a first direction; judging whether the vibration information satisfies a third predetermined condition; and determining that the type of the gesture input is a sliding operation when the acceleration in the first direction has been merged and the vibration information is judged to satisfy the third predetermined condition.
Preferably, the method according to an embodiment of the present invention may further include: judging whether the first sensing unit senses an acceleration, greater than the first threshold, in a second direction opposite to the first direction; judging whether the vibration information satisfies a second predetermined condition; and determining that the gesture input has ended when the acceleration in the second direction is judged to be sensed and the vibration information is judged to satisfy the second predetermined condition.
Preferably, in the method according to an embodiment of the present invention, the first input data further includes angular velocity information, and the method further includes: calculating the motion trajectory of the gesture input based on the acceleration information and the angular velocity information; and sending the motion trajectory to the external device.
Preferably, the method according to an embodiment of the present invention may further include: comparing the calculated motion trajectory with a plurality of predetermined trajectories, and identifying which predetermined trajectory the motion trajectory corresponds to; judging whether the vibration information satisfies the first predetermined condition or the third predetermined condition; and, when the predetermined trajectory corresponding to the motion trajectory is identified and the vibration information is judged to satisfy the third predetermined condition, generating an instruction corresponding to the predetermined trajectory.
Preferably, in the method according to an embodiment of the present invention, the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes voice information, wherein the type of the gesture input is determined based on the acceleration information and the voice information.
In the electronic device and information processing method according to embodiments of the present invention, two sensing units are used to acquire different input data, and the different input data are analyzed in combination to obtain the type of the gesture input. This makes it possible to interact anytime and anywhere while effectively avoiding the situation in which an unintentional gesture motion of the user is treated as an interactive input. Moreover, by judging hand movements and mapping them to instructions, a rich gesture instruction library can be built, containing both preset instructions and user-defined instructions, so that diverse operations can be completed.
Brief Description of the Drawings
Figs. 1A to 1E illustrate structural block diagrams of an electronic device according to an embodiment of the present invention;
Fig. 2 is a functional block diagram showing the configuration of a sensing apparatus according to an embodiment of the present invention;
Fig. 3 is a functional block diagram showing the configuration of a processing unit according to a first embodiment of the present invention;
Fig. 4 is a functional block diagram showing the configuration of a processing unit according to a second embodiment of the present invention;
Fig. 5 is a flowchart showing the process of an information processing method according to an embodiment of the present invention;
Fig. 6 is a flowchart showing the processing of generating an instruction based on a gesture input according to a first embodiment of the present invention; and
Fig. 7 is a flowchart showing the processing of generating an instruction based on a gesture input according to a second embodiment of the present invention.
Detailed Description of the Embodiments
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. The description is provided to help the understanding of the example embodiments of the invention as defined by the appended claims and their equivalents. It includes various specific details to assist that understanding, but they are to be regarded as merely exemplary. Accordingly, those skilled in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present invention. Moreover, in order to keep the specification clear and concise, detailed descriptions of well-known functions and constructions are omitted.
First, an electronic device according to an embodiment of the present invention will be described with reference to Fig. 1. The electronic device according to an embodiment of the present invention is a wearable device. As shown in Fig. 1A, an electronic device 100 includes a body apparatus 200 and a fixing apparatus 300. The fixing apparatus 300 is connected to the body apparatus 200 and is used to fix the relative positional relationship between the electronic device and its user.
The fixing apparatus 300 has at least one fixed state. In the fixed state, the fixing apparatus 300 can serve as a ring, or at least part of an approximately annular space, satisfying a first predetermined condition, and the ring or the approximately annular space can surround the periphery of a columnar body satisfying a second predetermined condition.
Specifically, Figs. 1B and 1C respectively illustrate two fixed states in which the fixing apparatus 300 is connected to the body apparatus 200. In the first fixed state, shown in Fig. 1B, the fixing apparatus 300 and the body apparatus 200 form a closed ring, each constituting a part of the ring. In the second fixed state, shown in Fig. 1C, the fixing apparatus 300 and the body apparatus 200 form an approximately annular space with a small opening, each again constituting a part of the ring. In a preferred embodiment of the present invention, the body apparatus 200 is the dial portion of a smartwatch, and the fixing apparatus 300 is the band portion of the smartwatch. The ring or approximately annular space formed by the body apparatus 200 and the fixing apparatus 300 can surround the wrist of the user of the smartwatch as the columnar body, and the diameter of the ring or the approximately annular space is greater than the diameter of the user's wrist and less than the diameter of the user's fist.
In addition, the ring or the approximately annular space can of course also be formed by the fixing apparatus 300 alone. As shown in Figs. 1D and 1E, the body apparatus 200 can be arranged on the fixing apparatus 300 (that is, the body apparatus 200 is attached to the fixing apparatus 300 in surface contact), so that the fixing apparatus 300 by itself forms the ring (Fig. 1D) or the approximately annular space (Fig. 1E) surrounding the columnar body. The fixing apparatus 300 is provided with a fixing mechanism (not shown) such as a clasp, a buckle, or a zipper.
Figs. 1B to 1E show the case in which the electronic device 100 is a wrist-worn device. However, the present invention is not limited to this. For example, in many cases the specific form of the electronic device 100 can also be a finger ring or the like.
A wearable form of this kind makes it easier for the electronic device 100 to provide an interactive experience anytime and anywhere.
In addition, the electronic device 100 further includes a sensing apparatus 400. Fig. 2 shows the specific configuration of the sensing apparatus. As shown in Fig. 2, the sensing apparatus 400 includes a first sensing unit 401, a second sensing unit 402, a processing unit 403, and a communication unit 404.
The first sensing unit 401 is used to sense first input data of the user relating to a gesture input.
The second sensing unit 402 is used to sense second input data of the user.
The processing unit 403 is used to analyze the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction.
The communication unit 404 is used to send the instruction to an external device. Here, the external device is a device that interacts with the electronic device, such as a television or a desktop computer. The external device and the electronic device can be connected and communicate wirelessly (for example, via Wi-Fi or Bluetooth). The electronic device can be connected to a single external device, or to several different external devices.
Figs. 1A to 1E only illustrate the case in which the sensing apparatus 400 is arranged on the body apparatus, but the present invention is not limited to this. Alternatively, the sensing apparatus 400 can also be arranged in the fixing apparatus, or some of its components can be located on the body apparatus while the others are located on the fixing apparatus. That is, each of the first sensing unit 401, the second sensing unit 402, the processing unit 403, and the communication unit 404 can be located either on the body apparatus or on the fixing apparatus.
As mentioned above, an electronic device in a wearable form makes it easier to provide an interactive experience anytime and anywhere. However, while offering convenience, this increases the possibility of misjudging gesture inputs. For example, when the user wearing the electronic device chats with a friend, many gesture motions may occur, yet the user obviously does not expect those motions to be used for human-computer interaction. If, as in the prior art, only the data obtained by a single sensing unit for gesture motion is used, an erroneous trigger is very likely.
In general, a gesture input made by the user's finger in mid-air is uncertain; that is, a gesture input made with the finger suspended in the air is likely an unintentional operation. Conversely, if the user's finger makes a gesture input on some surface, the operation is usually deliberate; that is, a gesture input made by the user's finger on a surface is likely an intentional operation.
In the present invention, two sensing units are used to acquire different input data, and the different input data are analyzed in combination to obtain the type of the gesture input, which effectively avoids treating an unintentional gesture motion of the user as an interactive input.
Next, the details of how the processing unit 403 analyzes the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction, will be described.
As one possible embodiment, the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes vibration information, the vibration information including a vibration amplitude and/or a vibration frequency.
For example, the first sensing unit 401 can be an accelerometer, and the second sensing unit 402 can be a piezoelectric film sensor. Furthermore, considering that the space in which the user makes gesture inputs is three-dimensional and that it is not known in advance in which plane the user will input, i.e., the direction of motion of the operating body is not known beforehand, a three-axis accelerometer is used. It works on the basic principle of acceleration measurement and can measure acceleration in three-dimensional space, accurately and comprehensively reflecting the motion characteristics of an object. A piezoelectric film sensor is a sensor made using the piezoelectric effect generated when certain dielectrics are subjected to stress. The piezoelectric effect refers to the phenomenon in which certain dielectrics, when deformed by an external force in one direction (including bending and stretching deformation), generate charge on their surfaces due to the polarization of internal charge. A piezoelectric film is usually very thin; it is soft, low in density, and extremely sensitive, and it also has strong mechanical toughness. It can be regarded as a flexible, light, highly tough plastic film that can be made in large areas and various thicknesses. It can be attached directly to a device surface without affecting the device's movement, and it is well suited to strain sensing that requires large bandwidth and high sensitivity. When the finger taps on a desktop or slides across it, the piezoelectric film sensor can detect information about the vibration caused by the tap or the slide.
It should be noted here that the data measured by the three-axis accelerometer are three-axis data in the accelerometer's own coordinate system, not data relative to the plane on which the operating body operates. For example, assume that the operating body is the user's finger and the electronic device is worn on that finger. When the user's finger performs a click operation at a 45° angle to a horizontal desktop, the three-axis data obtained by the accelerometer are expressed in a coordinate system tilted at 45° with respect to the coordinate system of the desktop.
In this case, the processing unit 403 is configured to determine the type of the gesture input based on the acceleration information and the vibration information.
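For instance (an illustrative calculation, not taken from the patent), a purely desk-normal acceleration measured by a sensor tilted 45° about one of its axes appears on two sensor axes at once, which is one reason two of the three components are often large:

```python
import math

a_table = (0.0, 0.0, 3.0)      # acceleration normal to the desktop, m/s^2 (assumed value)
tilt = math.radians(45.0)      # sensor frame tilted 45 degrees about its x-axis

# express the table-frame vector in the sensor frame (rotation about x)
ax = a_table[0]
ay = a_table[1] * math.cos(tilt) + a_table[2] * math.sin(tilt)
az = -a_table[1] * math.sin(tilt) + a_table[2] * math.cos(tilt)
# ax = 0.0, ay ≈ 2.12, az ≈ 2.12: two sensor axes carry large components
```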
Specifically, Fig. 3 shows the configuration of the processing unit according to the first embodiment of the present invention. As shown in Fig. 3, the processing unit 403 is configured to include a first judging component 4031, a merging component 4032, a second judging component 4033, and a determining component 4034.
The first judging component 4031 is used to judge whether, among the three-axis acceleration components, there is an acceleration component greater than a first threshold. When the user's finger gradually approaches a surface to perform an interactive operation, there is necessarily an acceleration perpendicular, or nearly perpendicular, to that surface. The judgment of the first judging component determines whether the user has started an interactive operation. The first threshold here is used to filter out finger movements that are not regarded as operations by the user, as well as possible noise interference. In general, when the user's finger operates on a surface, two of the three axes of data obtained by the first sensing unit carry relatively large acceleration values. Of course, there may also be cases in which only one axis carries a large acceleration value; this depends on the operating posture, the operating plane, and the direction of motion of the user's finger.
If the judgment result of the first judging component 4031 is yes, that is, it is determined that the user's finger is moving, the merging component 4032 merges the components into an acceleration in a first direction.
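A minimal Python sketch of the first judging component and the merging component might look as follows; the function names and the numeric value of the first threshold are assumptions made only for illustration.

```python
import math

FIRST_THRESHOLD = 2.0  # m/s^2, hypothetical value standing in for the "first threshold"

def exceeds_first_threshold(ax, ay, az, threshold=FIRST_THRESHOLD):
    """First judging component: does any axis component exceed the threshold?"""
    return any(abs(a) > threshold for a in (ax, ay, az))

def merge_acceleration(ax, ay, az):
    """Merging component: combine the three axis components into one acceleration
    along a 'first direction' (returned as magnitude plus unit vector)."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude == 0.0:
        return 0.0, (0.0, 0.0, 0.0)
    return magnitude, (ax / magnitude, ay / magnitude, az / magnitude)

# Example: a press sample in which two axes carry large components
sample = (1.5, 2.4, 0.3)
if exceeds_first_threshold(*sample):
    a_first, first_direction = merge_acceleration(*sample)
```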
The second judging component 4033 is used to judge whether the vibration information satisfies a first predetermined condition. Here, the first predetermined condition is used to determine whether the user's finger is performing a click operation on a surface, rather than a click in mid-air. As mentioned above, the second sensing unit 402 can be a piezoelectric film sensor, and the vibration information can be the vibration waveform obtained by the piezoelectric film sensor.
As a simple embodiment, the second judging component can make the judgment based on the vibration amplitude and/or the vibration frequency in the waveform detected by the piezoelectric film sensor.
For example, if a vibration with a relatively large amplitude (for example, greater than a third threshold) and/or a relatively high frequency (for example, greater than a fourth threshold) is detected within a short period, the finger operation represented by the vibration information tends to be regarded as a click operation.
The determining component 4034 is used to determine that the type of the gesture input is a click operation when the merging component has merged the components into the acceleration in the first direction and the second judging component judges that the vibration information satisfies the first predetermined condition. That is, only when the first sensing unit detects a qualifying acceleration value and the second sensing unit detects vibration information characteristic of a click operation on a surface does the determining component 4034 determine that the type of the gesture input is a click operation. A click operation can usually be regarded as a trigger for subsequent operations. For example, as will be described below, the first sensing unit can also calculate the motion trajectory of the user's finger. The user can first tap any plane with the finger; the determining component 4034 determines that a click operation has been performed, indicating that the finger starts to interact with that plane. Subsequent processing then begins, i.e., the finger motion trajectory obtained by the first sensing unit is mapped to a mouse operation and displayed on the display screen of the external device.
If a single vibration whose amplitude is greater than the third threshold and/or whose frequency is greater than the fourth threshold is detected within one short period, the click operation is regarded as a single click. If two such vibrations are detected within two adjacent short periods, the click operation is regarded as a double click.
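The following sketch illustrates one way the first predetermined condition could be evaluated and a single click distinguished from a double click using short windows of the vibration waveform; the window statistics, sampling rate, and threshold values are hypothetical simplifications rather than values given in the patent.

```python
def window_stats(window, sample_rate_hz=1000.0):
    """Peak amplitude and a rough frequency estimate (from zero crossings) for one window."""
    peak = max(abs(v) for v in window)
    crossings = sum(1 for a, b in zip(window, window[1:]) if (a < 0.0) != (b < 0.0))
    freq = crossings * sample_rate_hz / (2.0 * len(window))
    return peak, freq

def is_click_window(window, third_threshold=0.5, fourth_threshold=50.0):
    """First predetermined condition: large amplitude and/or high frequency in a short window."""
    peak, freq = window_stats(window)
    return peak > third_threshold or freq > fourth_threshold

def classify_click(prev_window, curr_window):
    """Single click if only the current window qualifies; double click if two adjacent windows do."""
    curr, prev = is_click_window(curr_window), is_click_window(prev_window)
    if curr and prev:
        return "double_click"
    if curr:
        return "single_click"
    return "none"
```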
Alternatively, the second judging component 4033 can be used to judge whether the vibration information satisfies a third predetermined condition. Here, the third predetermined condition is used to determine whether the user's finger is performing a sliding operation on a surface, rather than a slide in mid-air. As mentioned above, the second sensing unit 402 can be a piezoelectric film sensor, and the vibration information can be the vibration waveform obtained by the piezoelectric film sensor.
As a simple embodiment, the second judging component can make the judgment based on the vibration amplitude and/or the vibration frequency in the waveform detected by the piezoelectric film sensor.
For example, if a sustained vibration with a medium amplitude (for example, greater than a fifth threshold and less than a sixth threshold) and/or a medium frequency (for example, greater than a seventh threshold and less than an eighth threshold) is detected over a period longer than the period corresponding to a click operation, the finger operation represented by the vibration information tends to be regarded as a sliding operation. However, the vibration caused by a sliding operation also depends to a large extent on the friction coefficient of the surface in contact with the finger, so in practice the choice of specific thresholds needs to take many factors into account.
The determining component 4034 is used to determine that the type of the gesture input is a sliding operation when the merging component has merged the components into the acceleration in the first direction and the second judging component judges that the vibration information satisfies the third predetermined condition. That is, only when the first sensing unit detects a qualifying acceleration value and the second sensing unit detects vibration information characteristic of a sliding operation on a surface does the determining component 4034 determine that the type of the gesture input is a sliding operation.
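Under the same assumptions, the third predetermined condition can be sketched as a band check on the same window statistics; all thresholds are again hypothetical and would in practice depend on the friction of the contacted surface.

```python
def is_slide(window, fifth_threshold=0.05, sixth_threshold=0.5,
             seventh_threshold=5.0, eighth_threshold=50.0):
    """Third predetermined condition: medium amplitude and medium frequency sustained
    over a window longer than the click window (reuses window_stats from the click sketch)."""
    peak, freq = window_stats(window)
    medium_amplitude = fifth_threshold < peak < sixth_threshold
    medium_frequency = seventh_threshold < freq < eighth_threshold
    return medium_amplitude and medium_frequency
```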
The foregoing describes the processing when the user's finger performs a press operation. On the other hand, there is necessarily also the case in which the user's finger is lifted. In general, if the user's finger is lifted, the user's operation is regarded as having ended. Next, the process of judging that the user's finger has been lifted will be described.
The first judging component 4031 is further configured to judge whether the first sensing unit senses an acceleration in a second direction, the second direction being opposite to the first direction. As mentioned above, when the user's finger performs a press operation, an acceleration in the first direction is generated. As the process opposite to pressing down, when the user's finger performs a lift operation, an acceleration in the second direction, opposite to the first direction, should be generated. That is, if an acceleration in the second direction is detected, the user's finger is regarded as performing a lift operation.
Also, the second judging component 4033 is configured to judge whether the vibration information satisfies a second predetermined condition. Here, the second predetermined condition is used to determine whether the user's finger is no longer operating on the surface, i.e., whether there is no vibration with a relatively large amplitude and/or relatively high frequency.
When the first judging component judges that the acceleration in the second direction is sensed and the second judging component judges that the vibration information satisfies the second predetermined condition, the determining component 4034 determines that the gesture input has ended. The lift gesture can be regarded as the gesture indicating the end of the interaction. For example, when the determining component 4034 determines that the gesture input has ended, and the preceding processing mapped the finger motion trajectory obtained by the first sensing unit to a mouse operation as described above, the mapping is terminated and the mouse icon is no longer displayed on the display screen of the external device.
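Continuing with the same hypothetical helpers, a lift (end of the gesture) can be sketched as an acceleration component opposite to the press direction together with a quiet vibration window:

```python
def projects_on_second_direction(accel, first_direction, first_threshold=2.0):
    """Acceleration along the direction opposite to the press exceeds the first threshold."""
    along_first = sum(a * d for a, d in zip(accel, first_direction))
    return along_first < -first_threshold

def satisfies_second_condition(window):
    """Second predetermined condition: no large-amplitude, high-frequency vibration
    (the finger is no longer operating on the surface)."""
    peak, freq = window_stats(window)      # from the click sketch above
    return peak <= 0.5 and freq <= 50.0    # hypothetical third/fourth thresholds

def gesture_ended(accel, first_direction, window):
    return projects_on_second_direction(accel, first_direction) and satisfies_second_condition(window)
```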
It should be pointed out here that, in the above, different types of operations (click, slide, lift) are distinguished only by a single-condition classifier, i.e., by simply judging whether the amplitude and/or frequency of one operation satisfies a predetermined condition. However, the present invention is not limited to this. As a more accurate but more complex embodiment, the determination of the different gesture input types can also be completed by means of learned classification. That is, a large number of samples are collected in advance and statistical analysis is performed on their feature quantities, so as to determine what common features a given required action has. For example, to distinguish pressing from lifting, the statistical features are certainly different, and the combinations of threshold conditions obtained by fitting are therefore necessarily different as well. For example, the waveform obtained by the piezoelectric film sensor can be split into two parts at the peak or trough position, and the signal maximum, mean, and standard deviation of the preceding and following windows, the number of zero crossings, and so on can be computed separately and used as statistical features. Of course, the acceleration values can also be classified in a similar way, for example by using the maximum, mean, and standard deviation of local segments of the acceleration waveform as statistical features. Such a classifier is constructed by a learning method.
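As an illustration of the statistical features mentioned above, the sketch below splits a vibration waveform at its peak and computes the maximum, mean, standard deviation, and zero-crossing count of the two resulting windows; the exact feature set and whatever classifier is trained on it are assumptions, since the patent does not fix a specific learning algorithm.

```python
import statistics

def zero_crossings(samples):
    return sum(1 for a, b in zip(samples, samples[1:]) if (a < 0.0) != (b < 0.0))

def split_at_peak(waveform):
    """Split the waveform into the windows before and after its largest absolute value."""
    peak_index = max(range(len(waveform)), key=lambda i: abs(waveform[i]))
    return waveform[:peak_index + 1], waveform[peak_index + 1:]

def statistical_features(waveform):
    """Per-window maximum, mean, standard deviation, and zero-crossing count, concatenated."""
    features = []
    for window in split_at_peak(waveform):
        if len(window) < 2:
            features.extend([0.0, 0.0, 0.0, 0.0])
            continue
        features.extend([
            max(abs(v) for v in window),
            statistics.fmean(window),
            statistics.pstdev(window),
            float(zero_crossings(window)),
        ])
    return features  # e.g., input to any off-the-shelf classifier trained on labeled samples
```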
Next, the case in which the first sensing unit obtains the motion trajectory of the gesture input will be described. In this case, the first sensing unit further includes a gyroscope in addition to the accelerometer, and the first input data further includes angular velocity information. The first sensing unit is configured to calculate the motion trajectory of the gesture input based on the acceleration information and the angular velocity information, and the communication unit is further configured to send the motion trajectory to the external device. Since calculating the motion trajectory of an object from the acceleration information and angular velocity information acquired by an accelerometer and a gyroscope is a technical means well known in the art, its details are not repeated here in order to avoid redundancy.
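For completeness, a heavily simplified dead-reckoning sketch is shown below: the gyroscope's angular velocity is integrated into a single yaw angle, the acceleration is rotated into the plane frame, and double integration yields a 2-D position. A practical implementation would use full 3-D orientation, gravity removal, and drift correction; this is only an assumed illustration of the well-known technique referred to above.

```python
import math

def integrate_trajectory(samples, dt):
    """samples: iterable of (ax, ay, gyro_z) tuples in the sensor frame; dt: sample period in seconds.
    Returns a list of (x, y) positions (planar, no gravity removal or drift correction)."""
    heading = 0.0          # yaw angle from integrating gyro_z
    vx = vy = 0.0          # velocity in the plane frame
    x = y = 0.0            # position in the plane frame
    trajectory = [(x, y)]
    for ax, ay, gyro_z in samples:
        heading += gyro_z * dt
        # rotate the body-frame acceleration into the plane frame
        wx = ax * math.cos(heading) - ay * math.sin(heading)
        wy = ax * math.sin(heading) + ay * math.cos(heading)
        vx += wx * dt
        vy += wy * dt
        x += vx * dt
        y += vy * dt
        trajectory.append((x, y))
    return trajectory
```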
As mentioned above, after the first sensing unit obtains the motion trajectory of the user's finger, the trajectory can be mapped to a mouse operation displayed on the display screen of the external device for interaction. However, the motion trajectory of the user's finger obtained by the first sensing unit is not limited to being mapped to a mouse operation. Alternatively, the motion trajectory can also represent a predetermined instruction.
Specifically, Fig. 4 shows the configuration of the processing unit according to the second embodiment of the present invention. As shown in Fig. 4, the processing unit 403 is further configured to include a trajectory recognition component 4035.
The trajectory recognition component 4035 is used to compare the motion trajectory calculated by the first sensing unit with a plurality of predetermined trajectories and identify which predetermined trajectory the motion trajectory corresponds to. For example, a predetermined trajectory can be a specific character string: if the user writes "TV" with the finger, this indicates connecting to the smart television represented by "TV"; if the user writes "AIR" with the finger, this indicates connecting to the air conditioner represented by "AIR".
When the trajectory recognition component 4035 identifies the predetermined trajectory corresponding to the motion trajectory and the second judging component judges that the vibration information satisfies the first predetermined condition or the third predetermined condition, the determining component 4034 generates an instruction corresponding to the predetermined trajectory. As mentioned above, the first predetermined condition is used to determine whether the user's finger is performing a click operation on a surface rather than in mid-air; such a click operation can, for example, be regarded as the trigger for receiving the subsequent trajectory input. Alternatively, the third predetermined condition is used to determine whether the user's finger is performing a sliding operation on a surface rather than in mid-air. For example, if the trajectory recognition component 4035 identifies the predetermined trajectory as "TV", the determining component 4034 generates an instruction to connect to the TV; if it identifies the predetermined trajectory as "AIR", the determining component 4034 generates an instruction to connect to the air conditioner.
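A minimal sketch of this mapping, assuming the motion trajectory has already been converted into a character string by a separate handwriting recognizer (which the patent does not specify), is the following table lookup gated by the surface-contact check:

```python
# Hypothetical mapping from predetermined trajectories to instructions
PREDETERMINED_TRAJECTORIES = {
    "TV": "CONNECT_TV",
    "AIR": "CONNECT_AIR_CONDITIONER",
}

def instruction_for_trajectory(recognized_text, surface_contact_confirmed):
    """Return an instruction only when the trajectory matches a predetermined one and the
    vibration information has confirmed surface contact (first or third predetermined condition)."""
    if not surface_contact_confirmed:
        return None
    return PREDETERMINED_TRAJECTORIES.get(recognized_text.upper())

instruction = instruction_for_trajectory("tv", surface_contact_confirmed=True)  # -> "CONNECT_TV"
```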
The above describes the case in which the second sensing unit is a piezoelectric film sensor. However, the present invention is not limited to this. For example, the second sensing unit can also be a sound collection and recognition unit, in which case the second input data includes voice information. Instead of determining whether the user is operating on a surface based on the vibration information detected by a piezoelectric film sensor, the processing unit is configured to determine, based on the voice information, whether the user is ready to start interacting, and, in combination with the acceleration information, to determine the type of the gesture input.
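As a purely hypothetical stand-in for such a sound-based check, the sketch below gates the acceleration-based classification on the energy of a short audio frame, reusing the acceleration helper defined earlier; the energy threshold, and the idea of using frame energy at all, are assumptions made only for illustration.

```python
def frame_energy(audio_frame):
    """Mean squared amplitude of one short audio frame."""
    return sum(s * s for s in audio_frame) / max(len(audio_frame), 1)

def confirmed_by_sound(audio_frame, energy_threshold=0.01):
    """Hypothetical stand-in for the voice-based check that the user is ready to interact."""
    return frame_energy(audio_frame) > energy_threshold

def classify_with_sound(accel_sample, audio_frame):
    """Accept the acceleration-based gesture only when the sound check also passes."""
    if exceeds_first_threshold(*accel_sample) and confirmed_by_sound(audio_frame):
        return "gesture_candidate"   # hand off to the click/slide logic above
    return "ignored"                 # likely an unintentional mid-air motion
```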
The specific configuration of the electronic device according to embodiments of the present invention has been described above with reference to Figs. 1 to 4. Next, an information processing method according to an embodiment of the present invention will be described with reference to Fig. 5. The information processing method is applied to the electronic device described above. As shown in Fig. 5, the information processing method includes the following steps.
First, in step S501, first input data of the user relating to a gesture input is sensed.
Then, in step S502, second input data of the user is sensed.
Next, in step S503, the first input data and the second input data are analyzed in combination, so as to determine the type of the gesture input and generate a corresponding instruction.
Finally, in step S504, the instruction is sent to an external device.
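Tying steps S501 to S504 together, one possible end-to-end pass could look like the sketch below, which reuses the hypothetical helpers from the earlier snippets and leaves the wireless transport to the external device abstract:

```python
def process_once(accel_sample, vibration_window, send_to_external_device):
    """One pass through steps S501-S504, under the same assumptions as the earlier sketches."""
    # S501/S502: the first and second input data are assumed to be already sampled
    if not exceeds_first_threshold(*accel_sample):
        return None
    # S503: analyze both inputs in combination to decide the gesture type
    if is_click_window(vibration_window):
        instruction = "CLICK"
    elif is_slide(vibration_window):
        instruction = "SLIDE"
    else:
        return None
    # S504: send the generated instruction to the external device (transport left abstract)
    send_to_external_device(instruction)
    return instruction
```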
As mentioned above, in general, a gesture input made by the user's finger in mid-air is uncertain; that is, a gesture input made with the finger suspended in the air is likely an unintentional operation. Conversely, if the user's finger makes a gesture input on some surface, the operation is usually deliberate; that is, a gesture input made by the user's finger on a surface is likely an intentional operation.
In the present invention, two sensing steps are used to acquire different input data, and the different input data are analyzed in combination to obtain the type of the gesture input, which effectively avoids treating an unintentional gesture motion of the user as an interactive input.
Next, the details of how the first input data and the second input data are analyzed in combination, so as to determine the type of the gesture input and generate a corresponding instruction, will be described.
As one possible embodiment, the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes vibration information, the vibration information including a vibration amplitude and/or a vibration frequency,
wherein the type of the gesture input is determined based on the acceleration information and the vibration information.
Fig. 6 shows the processing of generating an instruction based on a gesture input according to the first embodiment of the present invention. As shown in Fig. 6, the step of analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction, includes the following.
First, in step S601, it is judged whether, among the three-axis acceleration components, there is an acceleration component greater than the first threshold. When the user's finger gradually approaches a surface to perform an interactive operation, there is necessarily an acceleration perpendicular, or nearly perpendicular, to that surface. This judgment determines whether the user has started an interactive operation. The first threshold here is used to filter out finger movements that are not regarded as operations by the user, as well as possible noise interference. In general, when the user's finger operates on a surface, two of the three axes of data obtained by the first sensing unit carry relatively large acceleration values. Of course, there may also be cases in which only one axis carries a large acceleration value; this depends on the operating posture, the operating plane, and the direction of motion of the user's finger.
If the judgment result in step S601 is yes, that is, it is determined that the user's finger is moving, the processing proceeds to step S602; otherwise, the processing ends. In step S602, the components are merged into an acceleration in a first direction.
Then, in step S603, it is judged whether the vibration information satisfies a first predetermined condition. The judgment against the first predetermined condition determines whether the user's finger is performing a click operation on a surface, rather than a click in mid-air. For example, if a vibration with a relatively large amplitude (for example, greater than a third threshold) and/or a relatively high frequency (for example, greater than a fourth threshold) is detected within a short period, the finger operation represented by the vibration information tends to be regarded as a click operation.
When the acceleration in the first direction has been merged in step S602 and the vibration information is judged in step S603 to satisfy the first predetermined condition, the processing proceeds to step S604, where the type of the gesture input is determined to be a click operation. The processing then proceeds to step S606, where an interactive operation corresponding to the click operation is carried out. A click operation can usually be regarded as a trigger for subsequent operations. For example, as will be described below, the motion trajectory of the user's finger can also be calculated. The user can first tap any plane with the finger; it is then judged that a click operation has been performed, indicating that the finger starts to interact with that plane. Subsequent processing then begins, i.e., the obtained finger motion trajectory is mapped to a mouse operation and displayed on the display screen of the external device.
It should be pointed out that the processing of steps S601 to S603 is shown sequentially in Fig. 6, but the processing of steps S601 to S602 and the processing of step S603 can be carried out in parallel, or in the order opposite to that shown in Fig. 6.
If, in step S603, it is instead judged whether the vibration information satisfies a third predetermined condition (which determines whether the user's finger is performing a sliding operation on a surface, rather than a slide in mid-air), then, when the acceleration in the first direction has been merged in step S602 and the vibration information is judged in step S603 to satisfy the third predetermined condition, the processing proceeds to step S605, where the type of the gesture input is determined to be a sliding operation. The processing then proceeds to step S606, where an interactive operation corresponding to the sliding operation is carried out.
The foregoing describes the processing when the user's finger performs a press operation. On the other hand, there is necessarily also the case in which the user's finger is lifted. In general, if the user's finger is lifted, the user's operation is regarded as having ended. Next, the process of judging that the user's finger has been lifted will be described.
Specifically, the method further includes the following steps.
In step S607, it is judged whether the first sensing unit senses an acceleration in a second direction, the second direction being opposite to the first direction. As mentioned above, when the user's finger performs a press operation, an acceleration in the first direction is generated. As the process opposite to pressing down, when the user's finger performs a lift operation, an acceleration in the second direction, opposite to the first direction, should be generated. That is, if an acceleration in the second direction is detected, the user's finger is regarded as performing a lift operation.
Also, in step S608, it is judged whether the vibration information satisfies a second predetermined condition. The second predetermined condition is used to determine whether the user's finger is no longer operating on the surface, i.e., whether there is no vibration with a relatively large amplitude and/or relatively high frequency.
When the acceleration in the second direction is judged to be sensed and the vibration information is judged to satisfy the second predetermined condition, the processing proceeds to step S609. In step S609, it is determined that the gesture input has ended, and the operation ends. For example, when it is determined that the gesture input has ended, and the preceding processing mapped the obtained finger motion trajectory to a mouse operation as described above, the mapping is terminated and the mouse icon is no longer displayed on the display screen of the external device.
Similarly, the processing of steps S608 and S609 is shown sequentially in Fig. 6, but these steps can also be carried out in parallel, or in the order opposite to that shown in Fig. 6.
In addition, as mentioned above, the first input data can further include angular velocity information.
Fig. 7 shows the processing of generating an instruction based on a gesture input according to the second embodiment of the present invention. As shown in Fig. 7, the step of analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction, includes the following.
In step S701, the motion trajectory of the gesture input is calculated based on the acceleration information and the angular velocity information. The motion trajectory can then be sent to an external device. As mentioned above, after the motion trajectory of the user's finger has been obtained, it can be mapped to a mouse operation displayed on the display screen of the external device for interaction. However, the obtained motion trajectory of the user's finger is not limited to being mapped to a mouse operation; alternatively, the motion trajectory can also represent a predetermined instruction.
In that case, in step S702, the calculated motion trajectory is compared with a plurality of predetermined trajectories, and the predetermined trajectory to which the motion trajectory corresponds is identified.
In step S703, it is judged whether the vibration information satisfies the first predetermined condition or the third predetermined condition. As described above, the first predetermined condition is used to determine whether the user's finger is performing a click operation on a surface rather than in mid-air; such a click operation can, for example, be regarded as the trigger for receiving the subsequent trajectory input. Alternatively, the third predetermined condition is used to determine whether the user's finger is performing a sliding operation on a surface rather than in mid-air.
When the predetermined trajectory corresponding to the motion trajectory is identified and the vibration information is judged to satisfy the first predetermined condition or the third predetermined condition, the processing proceeds to step S704, where an instruction corresponding to the predetermined trajectory is generated. For example, if the predetermined trajectory is identified as "TV", an instruction to connect to the TV is generated; if it is identified as "AIR", an instruction to connect to the air conditioner is generated.
Then, in step S705, the generated instruction is executed.
The above describes the case in which the second input data is vibration information. However, those skilled in the art will understand that the present invention is not limited to this. For example, the second input data can also be voice information. Instead of determining whether the user is operating on a surface based on the detected vibration information, the type of the gesture input can also be determined based on the acceleration information and the voice information. That is, whether the user is ready to start interacting can be determined based on the voice information, and, in combination with the acceleration information, the type of the gesture input can be determined and a corresponding instruction generated.
So far, the electronic device and the information processing method according to embodiments of the present invention have been described in detail with reference to the accompanying drawings. In the electronic device and information processing method according to embodiments of the present invention, two sensing units are used to acquire different input data, and the different input data are analyzed in combination to obtain the type of the gesture input. This makes it possible to interact anytime and anywhere while effectively avoiding the situation in which an unintentional gesture motion of the user is treated as an interactive input. Moreover, by judging hand movements and mapping them to instructions, a rich gesture instruction library can be built, containing both preset instructions and user-defined instructions, so that diverse operations can be completed.
It should be noted that, in this specification, the terms "include", "comprise", or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, article, or device. In the absence of further limitations, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.
Finally, it should also be noted that the above series of processing includes not only processing executed in chronological order as described here, but also processing executed in parallel or individually rather than in chronological order.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by software plus the necessary hardware platform, or, of course, entirely by software. Based on this understanding, all or part of the contribution of the technical solution of the present invention to the background art can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which can be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present invention or in certain parts thereof.
The present invention has been described in detail above. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the embodiments is only intended to help understand the method of the present invention and its core ideas. Meanwhile, those of ordinary skill in the art may make changes to the specific implementations and application scope according to the ideas of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (12)

1. An electronic device, comprising:
a body apparatus;
a fixing apparatus connected to the body apparatus, the fixing apparatus being used to fix the relative positional relationship between the electronic device and its user; and
a sensing apparatus arranged in the body apparatus and/or the fixing apparatus,
wherein the sensing apparatus includes:
a first sensing unit for sensing first input data of the user relating to a gesture input;
a second sensing unit for sensing second input data of the user;
a processing unit for analyzing the first input data and the second input data in combination, so as to determine the type of the gesture input and generate a corresponding instruction; and
a communication unit for sending the instruction to an external device,
wherein the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by the first sensing unit itself, and the second input data includes vibration information, the vibration information including a vibration amplitude and/or a vibration frequency,
and the processing unit is configured to determine the type of the gesture input based on the acceleration information and the vibration information.
2. The electronic device according to claim 1, wherein
the processing unit is configured to include:
a first judging component for judging whether, among the three-axis acceleration components, there is an acceleration component greater than a first threshold;
a merging component for merging the components into an acceleration in a first direction if the judgment result of the first judging component is yes;
a second judging component for judging whether the vibration information satisfies a first predetermined condition; and
a determining component for determining that the type of the gesture input is a click operation when the merging component has merged the components into the acceleration in the first direction and the second judging component judges that the vibration information satisfies the first predetermined condition.
3. The electronic equipment according to claim 1, wherein
the processing unit is configured to include:
a first judgement part for judging whether there is, among the three-axis acceleration components, an acceleration component greater than a first threshold;
a merging component for merging the components into an acceleration in a first direction if the judging result of the first judgement part is yes;
a second judgement part for judging whether the vibration information meets a third predetermined condition; and
a determining component for determining that the type of the gesture input is a slide operation when the merging component merges out the acceleration in the first direction and the second judgement part judges that the vibration information meets the third predetermined condition.
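Claim 3 reuses the same chain but tests the vibration information against a "third predetermined condition" to recognise a slide operation. One compact, assumed way to express both decisions is to parameterise the vibration test, as in this sketch:

```python
import math
from typing import Callable

def detect_gesture(ax: float, ay: float, az: float,
                   vibration_condition: Callable[[float, float], bool],
                   vib_amplitude: float, vib_frequency: float,
                   first_threshold: float = 9.0) -> bool:
    """Shared chain of claims 2 and 3: threshold check, merge, vibration test."""
    if not any(abs(c) > first_threshold for c in (ax, ay, az)):
        return False
    merged = math.sqrt(ax * ax + ay * ay + az * az)
    return merged > first_threshold and vibration_condition(vib_amplitude, vib_frequency)

# Assumed stand-ins for the first and third predetermined conditions.
is_click_vibration = lambda amp, freq: amp >= 0.5 and freq >= 80.0   # sharp impact
is_slide_vibration = lambda amp, freq: amp >= 0.1 and freq <= 30.0   # sustained rubbing

# Example: a strong lateral swing with low-frequency friction vibration reads as a slide.
print(detect_gesture(10.5, 1.0, 0.3, is_slide_vibration, 0.2, 12.0))  # True
```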
4. The electronic equipment according to claim 2 or 3, wherein the first judgement part is further configured to judge whether the first sensing unit senses an acceleration in a second direction greater than the first threshold, the second direction being opposite to the first direction, and the second judgement part is configured to judge whether the vibration information meets a second predetermined condition,
wherein the determining component determines that the gesture input has ended when the first judgement part judges that the acceleration in the second direction is sensed and the second judgement part judges that the vibration information meets the second predetermined condition.
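Claim 4 ends a gesture when an acceleration in the opposite ("second") direction exceeds the first threshold and the vibration meets a "second predetermined condition". A hedged sketch, assuming the second condition simply means that the vibration has settled:

```python
def gesture_ended(accel_along_first_direction: float,
                  vib_amplitude: float,
                  first_threshold: float = 9.0,
                  max_amplitude: float = 0.1) -> bool:
    """Sketch of claim 4: the hand decelerates (acceleration opposite to the first
    direction) and the vibration dies down, so the gesture is treated as finished.
    The sign convention and the 'second predetermined condition' are assumptions."""
    # Acceleration in the "second direction" appears as a negative projection onto
    # the first direction whose magnitude exceeds the first threshold.
    opposite_spike = accel_along_first_direction < -first_threshold
    # Assumed second predetermined condition: the vibration has largely settled.
    vibration_settled = vib_amplitude <= max_amplitude
    return opposite_spike and vibration_settled
```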
5. The electronic equipment according to claim 1, wherein the first input data further comprises angular velocity information, the first sensing unit is configured to calculate a motion trajectory of the gesture input based on the acceleration information and the angular velocity information, and the communication unit is further configured to send the motion trajectory to the external equipment.
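Claim 5 adds angular velocity information so that the first sensing unit can compute a motion trajectory. The fragment below is a deliberately naive planar dead-reckoning sketch of that idea; the sample layout is an assumption, and a real device would additionally need bias removal and drift correction.

```python
import math
from typing import Iterable, List, Tuple

def compute_trajectory(samples: Iterable[Tuple[float, float, float]],
                       dt: float) -> List[Tuple[float, float]]:
    """Rough planar trajectory from (a_forward, a_side, yaw_rate) samples.

    Each sample holds two body-frame acceleration components plus an angular
    velocity about the vertical axis, taken at a fixed interval dt seconds.
    """
    heading = 0.0          # integrated yaw angle (rad)
    vx = vy = 0.0          # velocity in a fixed reference frame
    x = y = 0.0            # position in the same frame
    path: List[Tuple[float, float]] = []
    for a_fwd, a_side, yaw_rate in samples:
        heading += yaw_rate * dt                      # integrate angular velocity
        # Rotate body-frame acceleration into the reference frame.
        ax = a_fwd * math.cos(heading) - a_side * math.sin(heading)
        ay = a_fwd * math.sin(heading) + a_side * math.cos(heading)
        vx += ax * dt                                  # integrate to velocity
        vy += ay * dt
        x += vx * dt                                   # integrate to position
        y += vy * dt
        path.append((x, y))
    return path
```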
6. The electronic equipment according to claim 5, wherein
the processing unit is configured to include:
a track identification component for comparing the motion trajectory calculated by the first sensing unit with a plurality of predetermined trajectories and identifying which predetermined trajectory the motion trajectory corresponds to;
a second judgement part for judging whether the vibration information meets the first predetermined condition or a third predetermined condition; and
a determining component for generating an instruction corresponding to the predetermined trajectory when the track identification component identifies the predetermined trajectory corresponding to the motion trajectory and the second judgement part judges that the vibration information meets the third predetermined condition.
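Claim 6 matches the computed trajectory against a set of predetermined trajectories before an instruction is generated. One simple, assumed way to do the matching is to resample both paths to a fixed number of points and pick the nearest template:

```python
import math
from typing import Dict, List, Optional, Tuple

Point = Tuple[float, float]

def resample(path: List[Point], n: int = 32) -> List[Point]:
    """Pick n points spread evenly along the path (crude but adequate for a sketch)."""
    if not path:
        return []
    if len(path) == 1:
        return path * n
    return [path[round(i * (len(path) - 1) / (n - 1))] for i in range(n)]

def distance(a: List[Point], b: List[Point]) -> float:
    """Summed point-to-point Euclidean distance between two resampled paths."""
    return sum(math.hypot(p[0] - q[0], p[1] - q[1]) for p, q in zip(a, b))

def match_trajectory(path: List[Point],
                     templates: Dict[str, List[Point]],
                     max_distance: float = 5.0) -> Optional[str]:
    """Return the name of the closest predetermined trajectory, or None."""
    sampled = resample(path)
    best_name, best_d = None, float("inf")
    for name, template in templates.items():
        d = distance(sampled, resample(template))
        if d < best_d:
            best_name, best_d = name, d
    return best_name if best_d <= max_distance else None
```

A determining component along the lines of claim 6 would then emit the instruction associated with the matched template only when the vibration information also satisfies the third predetermined condition.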
7. An information processing method applied to an electronic equipment, comprising:
sensing first input data of a user relating to a gesture input;
sensing second input data of the user;
judging the type of the gesture input and generating a corresponding instruction by performing analysis in combination with the first input data and the second input data; and
sending the instruction to an external equipment,
wherein the first input data includes acceleration information, the acceleration information including three-axis acceleration components in a coordinate system defined by a first sensing unit of the electronic equipment itself, and the second input data includes vibration information, the vibration information including a vibration amplitude and/or a vibration frequency,
wherein the type of the gesture input is determined based on the acceleration information and the vibration information.
8. The method according to claim 7, wherein
the step of judging the type of the gesture input and generating a corresponding instruction by performing analysis in combination with the first input data and the second input data includes:
judging whether there is, among the three-axis acceleration components, an acceleration component greater than a first threshold;
if the judging result is yes, merging the components into an acceleration in a first direction;
judging whether the vibration information meets a first predetermined condition; and
determining that the type of the gesture input is a click operation when the acceleration in the first direction is merged out and it is judged that the vibration information meets the first predetermined condition.
9. The method according to claim 7, wherein the step of judging the type of the gesture input and generating a corresponding instruction by performing analysis in combination with the first input data and the second input data includes:
judging whether there is, among the three-axis acceleration components, an acceleration component greater than a first threshold;
if the judging result is yes, merging the components into an acceleration in a first direction;
judging whether the vibration information meets a third predetermined condition; and
determining that the type of the gesture input is a slide operation when the acceleration in the first direction is merged out and it is judged that the vibration information meets the third predetermined condition.
10. The method according to claim 8 or 9, further comprising:
judging whether the first sensing unit senses an acceleration in a second direction greater than the first threshold, the second direction being opposite to the first direction; and
judging whether the vibration information meets a second predetermined condition,
wherein it is determined that the gesture input has ended when it is judged that the acceleration in the second direction is sensed and that the vibration information meets the second predetermined condition.
11. The method according to claim 7, wherein the first input data further comprises angular velocity information,
wherein the method further comprises:
calculating a motion trajectory of the gesture input based on the acceleration information and the angular velocity information; and
sending the motion trajectory to the external equipment.
12. The method according to claim 11, further comprising:
comparing the calculated motion trajectory with a plurality of predetermined trajectories and identifying which predetermined trajectory the motion trajectory corresponds to;
judging whether the vibration information meets the first predetermined condition or a third predetermined condition; and
generating an instruction corresponding to the predetermined trajectory when the predetermined trajectory corresponding to the motion trajectory is identified and it is judged that the vibration information meets the third predetermined condition.
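Claims 7 to 12 recite the method counterparts of the device claims. Purely as a toy illustration of the four steps of claim 7 (sense the first input data, sense the second input data, analyse them together, send the resulting instruction), the following self-contained fragment fabricates one sensor reading of each kind and prints the instruction it would send; every value, helper name, and condition in it is an assumption.

```python
# Toy pipeline; every sensor reading and threshold here is fabricated for illustration.
import math

def sense_first_input():
    """Stand-in for the first sensing unit: one burst of 3-axis acceleration (m/s^2)."""
    return (10.2, 0.4, 0.1)

def sense_second_input():
    """Stand-in for the second sensing unit: vibration amplitude and frequency."""
    return (0.7, 120.0)

def analyse(accel, vibration, first_threshold=9.0):
    """Combine the two inputs and map the gesture type to an instruction string."""
    ax, ay, az = accel
    amp, freq = vibration
    if any(abs(c) > first_threshold for c in (ax, ay, az)):
        merged = math.sqrt(ax * ax + ay * ay + az * az)
        if merged > first_threshold and amp >= 0.5 and freq >= 80.0:
            return "CLICK"
        if merged > first_threshold and amp >= 0.1 and freq <= 30.0:
            return "SLIDE"
    return None

def send_to_external_equipment(instruction):
    """Stand-in for the communication unit."""
    print("->", instruction)

instruction = analyse(sense_first_input(), sense_second_input())
if instruction:
    send_to_external_equipment(instruction)   # prints "-> CLICK" with these values
```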
CN201510137141.2A 2015-03-26 2015-03-26 Electronic equipment and information processing method Active CN106155277B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510137141.2A CN106155277B (en) 2015-03-26 2015-03-26 Electronic equipment and information processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510137141.2A CN106155277B (en) 2015-03-26 2015-03-26 Electronic equipment and information processing method

Publications (2)

Publication Number Publication Date
CN106155277A CN106155277A (en) 2016-11-23
CN106155277B true CN106155277B (en) 2019-03-08

Family

ID=57340519

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510137141.2A Active CN106155277B (en) 2015-03-26 2015-03-26 Electronic equipment and information processing method

Country Status (1)

Country Link
CN (1) CN106155277B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020086939A (en) * 2018-11-26 2020-06-04 ソニー株式会社 Information processing device, information processing method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103941860A (en) * 2014-03-31 2014-07-23 天津三星通信技术研究有限公司 Gesture recognition system of wrist strap type portable terminal and method of gesture recognition system
CN104038800A (en) * 2014-05-21 2014-09-10 常璨 Finger ring for smart television input and input method of finger ring
CN104134060A (en) * 2014-08-03 2014-11-05 上海威璞电子科技有限公司 Sign language interpreting, displaying and sound producing system based on electromyographic signals and motion sensors
CN104220961A (en) * 2011-12-08 2014-12-17 摩托罗拉解决方案公司 Method and device for force sensing gesture recognition
CN104345875A (en) * 2013-08-07 2015-02-11 联想(北京)有限公司 Method for information processing and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI564749B (en) * 2012-03-23 2017-01-01 群邁通訊股份有限公司 Method and system for preventing inadvertent touching

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104220961A (en) * 2011-12-08 2014-12-17 摩托罗拉解决方案公司 Method and device for force sensing gesture recognition
CN104345875A (en) * 2013-08-07 2015-02-11 联想(北京)有限公司 Method for information processing and electronic equipment
CN103941860A (en) * 2014-03-31 2014-07-23 天津三星通信技术研究有限公司 Gesture recognition system of wrist strap type portable terminal and method of gesture recognition system
CN104038800A (en) * 2014-05-21 2014-09-10 常璨 Finger ring for smart television input and input method of finger ring
CN104134060A (en) * 2014-08-03 2014-11-05 上海威璞电子科技有限公司 Sign language interpreting, displaying and sound producing system based on electromyographic signals and motion sensors

Also Published As

Publication number Publication date
CN106155277A (en) 2016-11-23

Similar Documents

Publication Publication Date Title
CN105824431B (en) Message input device and method
Kratz et al. HoverFlow: expanding the design space of around-device interaction
EP3191922B1 (en) Classification of touch input as being unintended or intended
EP2426598B1 (en) Apparatus and method for user intention inference using multimodal information
US7259756B2 (en) Method and apparatus for selecting information in multi-dimensional space
CN102362243B (en) Multi-telepointer, virtual object display device, and virtual object control method
US8830189B2 (en) Device and method for monitoring the object's behavior
KR100674090B1 (en) System for Wearable General-Purpose 3-Dimensional Input
US10289239B2 (en) Application programming interface for multi-touch input detection
TWI457793B (en) Real-time motion recognition method and inertia sensing and trajectory
US10042438B2 (en) Systems and methods for text entry
CN102016765A (en) Method and system of identifying a user of a handheld device
US20120127070A1 (en) Control signal input device and method using posture recognition
US20130241832A1 (en) Method and device for controlling the behavior of virtual objects on a display
US20110199292A1 (en) Wrist-Mounted Gesture Device
Baglioni et al. JerkTilts: using accelerometers for eight-choice selection on mobile devices
CN108196668B (en) Portable gesture recognition system and method
LaViola Jr An introduction to 3D gestural interfaces
US20170010695A1 (en) Enhanced multi-touch input detection
US20170010733A1 (en) User-identifying application programming interface (api)
CN106155277B (en) Electronic equipment and information processing method
KR101211808B1 (en) Gesture cognitive device and method for recognizing gesture thereof
CN104932695B (en) Message input device and data inputting method
Mali et al. Hand gestures recognition using inertial sensors through deep learning
KR100750504B1 (en) Motion Input Device using Accelerometers, Host Matching Device and Motion Recognition Method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant