CN107506041A - A wearable mouse control method based on a motion sensor - Google Patents

A wearable mouse control method based on a motion sensor

Info

Publication number
CN107506041A
CN107506041A
Authority
CN
China
Prior art keywords
action
data
cursor
head
intermediate region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710812802.6A
Other languages
Chinese (zh)
Inventor
王志波
金博楠
李熠劼
龚银超
庞晓艺
王骞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201710812802.6A priority Critical patent/CN107506041A/en
Publication of CN107506041A publication Critical patent/CN107506041A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses a wearable mouse control method based on a motion sensor. Through eight steps — data acquisition, action definition, data preprocessing, feature extraction, action classification, action segmentation, data pre-training, and error correction — the method uses head-motion data transmitted from a wearable device's sensors to achieve high-precision mouse control with head movements, while neither obstructing normal vision nor sacrificing ease of operation. Compared with the eye-controlled, foot-controlled, and breath-controlled mice on the market, the invention is easier to operate and recognizes actions more accurately. It is also widely applicable: it not only enables people without arms to use a mouse, but can also free the hands of able-bodied users. The method is universal and can control not only computers but any other electronic device.

Description

A wearable mouse control method based on a motion sensor
Technical field
The invention belongs to the field of mobile sensing and computing, and in particular relates to a wearable mouse control method based on a motion sensor. Compared with the eye-controlled, foot-controlled, and breath-controlled mice on the market, the invention is easier to operate and recognizes actions more accurately. In addition, the invention is widely applicable: it not only enables people without arms to use a mouse, but can also free the hands of able-bodied users. The method is universal and can control not only computers but any other electronic device.
Background technology
With the development of Internet technology, devices such as computers and smartphones have become widespread in daily life, and the mouse has become the most common human-computer interface. However, people without arms and some people with arm disabilities cannot use a mouse or touchscreen to operate such devices. At the same time, wearable devices such as smartwatches and smart bands are also increasingly common, and sensors are now diverse and highly capable: accelerometers, gyroscopes, and similar components can provide accurate motion information about a device. Some wearable solutions already exist on the market, such as eye-controlled mice driven by eyeball movement, foot-controlled mice that replace fingers with toes, and breath-controlled mice that map combinations of inhalation and exhalation duration and strength to functions such as moving up, down, left, and right and clicking. All of them, however, suffer from inconvenient operation and difficult control. Head movements, by contrast, allow easy operation and accurate recognition, and can solve the above problems.
The invention is devoted to using head-motion data transmitted from a wearable device's sensors to study and explore a wearable mouse control method based on a motion sensor.
The accuracy of action recognition depends on the sampling rate of the sensor data. If the sampling rate is too high, transmission cost becomes excessive and congestion is likely, causing considerable delay and packet loss. If it is too low, the motion features become indistinct and recognition inaccurate. The sampling rate must therefore be chosen so that data transmission remains reliable while action recognition stays sufficiently accurate.
Sensor data is transmitted over Bluetooth or WiFi. In practice, the head may tremble involuntarily and network conditions may be poor, so the data processing and transmission scheme must be designed carefully. Factors such as noise interference, data loss during transmission, the imperceptibility of the added signal, and the robustness of information transmission must all be balanced.
Summary of the invention
To solve the above technical problems, the invention provides a wearable mouse control method based on a motion sensor.
The technical solution adopted by the invention is:
A wearable mouse control method based on a motion sensor, characterized by comprising the following steps:
Step 1: acquire data, specifically including:
Step 1.1: connect the embedded wearable device to the controlled computer via Bluetooth or a WiFi hotspot, and establish a Socket connection to transmit data. The port is set to 8088.
Step 1.2: store the motion data of the three axes obtained from the accelerometer and the gyroscope in order, one buffer per axis. When the number of received samples exceeds the window length, the oldest data is discarded. The sampling rate is set to 40 samples per second.
Step 2: define the actions, specifically including the following sub-steps:
Step 2.1: assume the user faces the screen at the start; the initial head position is called the intermediate region.
Step 2.2: define the click operation. Moving the head quickly downward and then quickly back to the intermediate region is defined as a click.
Step 2.3: define the direction actions. Turning the head left from the intermediate region, holding it until the cursor reaches the desired position, and then turning back to the intermediate region is defined as the move-left action. Turning the head right from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-right action. Turning the head up from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-up action.
Turning the head down from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-down action.
Step 3: preprocess the data to eliminate noise and remove jagged spikes;
Step 4: extract features, specifically including the following sub-steps:
Step 4.1: on the x-axis of the accelerometer, the click action's motion features are much more pronounced than those of the four direction actions, so the accelerometer x-axis value can distinguish the click action from the direction actions.
Step 4.2: on the x-axis of the gyroscope, left and right movements have much more pronounced features than up and down movements, so the gyroscope x-axis value can separate left/right actions from up/down actions. Moreover, since the left and right features are exactly opposite in sign, left can also be distinguished from right.
Step 4.3: on the z-axis of the gyroscope, up and down movements have much more pronounced features than left and right movements, so the gyroscope z-axis value can separate up/down actions from left/right actions. Since the up and down features are exactly opposite in sign, up can also be distinguished from down.
Step 5: classify the actions; the specific implementation includes the following sub-steps:
Step 5.1: define a_tn^x as the lower threshold of the accelerometer x-axis, and g_tn^x, g_tx^x and g_tn^z, g_tx^z as the lower and upper thresholds of the gyroscope x- and z-axes. The threshold values are determined by the pre-training step and serve as the decision data of a decision tree. Define a_i^x, g_i^x, and g_i^z as the accelerometer x-axis, gyroscope x-axis, and gyroscope z-axis values at the current time i.
Check the current accelerometer x-axis value: if a_i^x < a_tn^x, the head has started a click action; otherwise go to step 5.2.
Step 5.2: check the current gyroscope x-axis value: if g_i^x < g_tn^x or g_i^x > g_tx^x, the head has started a left or right action; go to step 5.3. Otherwise go to step 5.4.
Step 5.3: if g_i^x < g_tn^x, the head has started turning left and the cursor on the screen moves left; otherwise the head is turning right and the cursor moves right.
Step 5.4: check the current gyroscope z-axis value: if g_i^z < g_tn^z or g_i^z > g_tx^z, the head has started an up or down action; go to step 5.5. Otherwise no action occurs at the current time.
Step 5.5: if g_i^z < g_tn^z, the head has started turning up and the cursor moves up; otherwise the head is turning down and the cursor moves down.
Step 5.6: once an action is recognized, the cursor performs the corresponding movement on the screen until the action terminates.
Step 6: segment the actions. The raw data obtained is a continuous time series; action segmentation divides it into blocks, each block containing the data of one action. By definition, each action consists of two opposite parts: for a left action, for example, the head first turns left from the intermediate region and then turns right back to it. The result is two opposite waveforms on the same axis. Two thresholds T_n and T_p are defined per axis to determine the start and end of an action, where T_n is the lower threshold and T_p the upper threshold.
Step 7: pre-train on data; the specific implementation includes the following sub-steps:
Step 7.1: before use, the user performs each of the five defined actions five times in turn.
Step 7.2: extract the five pairs of extreme values from the resulting data, and multiply the lowest crest and the highest trough by a coefficient k to obtain the trained personalized thresholds.
Step 8: correct errors. During use, the user sees the cursor move in real time on the screen, so when an error occurs the user knows at once and can immediately begin to correct it. If the cursor moves in the wrong direction, the user performs the same action again and the error is cancelled. For example, when the user turns left but finds the cursor moving right, the user can perform one more right action to correct the mistake. With this method, once the previous action ends, subsequent actions are unaffected by earlier errors.
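The eight steps above can be sketched as a single control loop. The following minimal Python sketch (one axis shown for brevity; the names `control_loop` and the threshold key `E_th` are illustrative assumptions, not from the patent) shows how acquisition (step 1) and threshold denoising (step 3) fit together, with the later steps marked as placeholders.

```python
def control_loop(sensor_stream, thresholds, window_len=40):
    """Sketch of the pipeline: buffer samples, denoise, then classify.

    sensor_stream: iterable of scalar readings (one axis for brevity);
    thresholds: dict of trained thresholds, here only the noise floor E_th.
    """
    window = []
    for sample in sensor_stream:                 # step 1: data acquisition
        window.append(sample)
        if len(window) > window_len:             # drop the oldest sample
            window.pop(0)
        # step 3: threshold denoising (values below E_th are zeroed)
        denoised = [v if abs(v) > thresholds["E_th"] else 0 for v in window]
        # steps 4-6 (feature extraction, classification, segmentation)
        # would consume `denoised` here; this sketch just emits it
        yield denoised[-1]
```

For example, `list(control_loop([0.1, 2.0], {"E_th": 0.5}))` yields `[0, 2.0]`: the sub-threshold reading is zeroed.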
In the above wearable mouse control method based on a motion sensor, the specific implementation of step 3 includes the following sub-steps:
Step 3.1: eliminate noise by threshold denoising. Define a noise threshold E_th; data is transmitted to the device only when the received value exceeds this threshold, and is otherwise set to 0.
Step 3.2: remove jagged spikes by maximum filtering, which takes the maximum of all values in a window as the filtered value of the current sample. With a filter window of size τ, the filtered value r_t at the current time is:
r_t = max{r_(t-τ/2), ..., r_(t-1), r_t, r_(t+1), ..., r_(t+τ/2)}
In the above wearable mouse control method based on a motion sensor, the termination of an action in step 5.6 is defined as follows:
For a left action, when the real-time value satisfies g_i^x ≥ g_tx^x the cursor stops moving; when g_i^x ≤ g_tn^x is then recognized, the left action terminates.
For a right action, when g_i^x ≤ g_tn^x the cursor stops moving; when g_i^x ≥ g_tx^x is then recognized, the right action terminates.
For an up action, when g_i^z ≥ g_tx^z the cursor stops moving; when g_i^z ≤ g_tn^z is then recognized, the up action terminates.
For a down action, when g_i^z ≤ g_tn^z the cursor stops moving; when g_i^z ≥ g_tx^z is then recognized, the down action terminates.
For a click action, whose motion is in the same direction as a down action, when g_i^z ≤ g_tn^z the cursor stops moving; when g_i^z ≥ g_tx^z is then recognized, the click action terminates.
In the above wearable mouse control method based on a motion sensor, step 6 segments the data using the following rule:
The head is defined to start in the intermediate region, facing the screen. When the gyroscope data is recognized to reach a threshold for the first time, that point is marked as the start of the action, denoted Ps. When the gyroscope data reaches a threshold in the opposite direction for the second time, that point is marked as the end of the action, denoted Pe. The segment between Ps and Pe is then split out as one action.
In the above wearable mouse control method based on a motion sensor, the specific implementation of step 7 includes the following sub-steps:
Step 7.1: before use, the user performs each of the five defined actions five times in turn.
Step 7.2: extract the five pairs of extreme values from the resulting data, and multiply the lowest crest and the highest trough by a coefficient k to obtain the trained personalized thresholds.
Compared with the prior art, the beneficial effects of the invention are: more accurate control, simple operation, no interference with normal vision, scalability, and universal applicability.
Brief description of the drawings
Fig. 1 is the system framework diagram of the invention.
Fig. 2a is an example of an extracted waveform signal.
Fig. 2b shows the result after the extracted waveform signal is preprocessed.
Fig. 3a shows the accuracy of user test results under the universal thresholds.
Fig. 3b shows the recall of user test results under the universal thresholds.
Fig. 4a shows the recognition accuracy of each action as a function of the value of k.
Fig. 4b shows the recognition recall of each action as a function of the value of k.
Fig. 5 shows the decision accuracy of the personalized thresholds.
Detailed description of the embodiments
To help those of ordinary skill in the art understand and implement the invention, the invention is described in further detail below with reference to the drawings and an embodiment. It should be understood that the embodiment described here serves only to illustrate and explain the invention, not to limit it.
The invention is mainly based on wearable devices and sensor technology. Taking the motion characteristics of head actions into account, it proposes a wearable mouse control method based on a motion sensor. The method makes full use of head-motion data to achieve easy, high-precision mouse control without interfering with normal vision. It can be used by people without arms, allowing armless users and those with arm disabilities to use a mouse; at the same time, it can also free the hands of able-bodied users.
The method provided by the invention can be implemented with computer software. Referring to Fig. 1, the wearable mouse control method based on a motion sensor provided by the invention comprises the following steps:
Step 1: collect action data from the sensors. The specific implementation process is:
Step 1.1: connect the embedded wearable device to the controlled computer via Bluetooth or a WiFi hotspot, and establish a Socket connection to transmit data. The port is set to 8088.
Step 1.2: store the motion data of the three axes obtained from the accelerometer and the gyroscope in order, one buffer per axis. When the number of received samples exceeds the window length, the oldest data is discarded. The sampling rate is set to 40 samples per second.
The specific implementation process of the embodiment is as follows:
First, the wearable device is connected to the control device (e.g. a computer) via Bluetooth or a WiFi hotspot; after a Socket connection is established, sensor data is transmitted, with the port set to 8088.
Then the motion data of the three axes obtained from the accelerometer and the gyroscope is stored in order, one buffer per axis. When the number of received samples exceeds the defined window size, the oldest data is discarded, and the remaining data in the window is handed to the program on the computer for processing. The sampling rate is set to 40 samples per second: when the sampling rate is too high, the cost is excessive and congestion may occur; when it is too low, the motion features are not distinct enough.
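A minimal sketch of the per-axis sliding-window buffer described above, using `collections.deque` with `maxlen` so the oldest samples are discarded automatically. The axis keys and the `SensorWindow` name are illustrative assumptions, not from the patent.

```python
from collections import deque

WINDOW_LEN = 40  # assumed window length, matching the 40 samples/s rate

class SensorWindow:
    """Keeps the most recent readings per axis; the oldest are dropped."""
    def __init__(self, window_len=WINDOW_LEN):
        # one buffer per axis: accelerometer and gyroscope x/y/z
        self.axes = {k: deque(maxlen=window_len)
                     for k in ("ax", "ay", "az", "gx", "gy", "gz")}

    def push(self, sample):
        """sample: dict holding the six axis values of one sensor reading."""
        for k, v in sample.items():
            self.axes[k].append(v)

w = SensorWindow(window_len=3)
for i in range(5):
    w.push({"ax": i, "ay": 0, "az": 0, "gx": 0, "gy": 0, "gz": 0})
print(list(w.axes["ax"]))  # → [2, 3, 4]: samples 0 and 1 were dropped
```

Using `deque(maxlen=...)` implements "remove the oldest data when the window is full" without any explicit bookkeeping.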
Step 2: define the actions. Each mouse operation is paired with a corresponding head action.
Step 2.1: assume the user faces the screen at the start; the initial head position is called the intermediate region.
Step 2.2: define the click operation. Moving the head quickly downward and then quickly back to the intermediate region is defined as a click.
Step 2.3: define the direction actions. Turning the head left from the intermediate region, holding it until the cursor reaches the desired position, and then turning back to the intermediate region is defined as the move-left action. Turning the head right from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-right action. Turning the head up from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-up action. Turning the head down from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-down action.
The specific implementation process is:
Take the left action as an example. The left action is defined as moving the cursor on the screen to the left. Once the head turns left from the intermediate region, the cursor begins moving left. While the head stays turned to the left, the cursor keeps moving left. Once the head swings back to the intermediate region, the cursor stops moving.
Step 3: preprocess the data. Denoise the raw data without affecting motion-feature extraction.
Step 3.1: eliminate noise by threshold denoising. Define a noise threshold E_th; data is transmitted to the device only when the received value exceeds this threshold, and is otherwise set to 0.
Step 3.2: remove jagged spikes by maximum filtering, which takes the maximum of all values in a window as the filtered value of the current sample. With a filter window of size τ, the filtered value r_t at the current time is:
r_t = max{r_(t-τ/2), ..., r_(t-1), r_t, r_(t+1), ..., r_(t+τ/2)}
The specific implementation process is:
Define a noise threshold E_th; values in the raw data below this threshold are set to 0, and data is transmitted to the device only when a received value exceeds it. The thresholded data is then passed through a maximum filter to remove the jagged spikes: the maximum filter considers the values to the left and right of the current position together and takes the maximum among them as the filtered value.
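The two preprocessing operations can be sketched as follows (a minimal Python version; function names are assumptions, and edge samples simply use a truncated window). Note that on signed gyroscope data a plain maximum would erase negative lobes, so in practice the filter would be applied per lobe or to magnitudes; the sketch follows the formula as stated.

```python
def threshold_denoise(data, e_th):
    """Step 3.1: zero out samples whose magnitude is below the noise floor E_th."""
    return [v if abs(v) > e_th else 0 for v in data]

def max_filter(data, tau):
    """Step 3.2: r_t = max{r_(t-tau/2), ..., r_t, ..., r_(t+tau/2)}."""
    half = tau // 2
    return [max(data[max(0, t - half):t + half + 1]) for t in range(len(data))]

print(threshold_denoise([0.1, 2.0, -0.05], 0.5))  # → [0, 2.0, 0]
print(max_filter([0, 1, 0, 3, 0], 2))             # → [1, 1, 3, 3, 3]
```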
Step 4: feature extraction.
The specific implementation process is:
On the x-axis of the accelerometer, the click action's motion features are much more pronounced than those of the four direction actions, and the four direction actions are not clearly distinguished on the accelerometer, so the accelerometer x-axis value can distinguish the click action from the direction actions.
On the x-axis of the gyroscope, left and right movements have much more pronounced features than up and down movements, so the gyroscope x-axis value can separate left/right actions from up/down actions; and since the left and right features are exactly opposite in sign, left can also be distinguished from right.
On the z-axis of the gyroscope, up and down movements have much more pronounced features than left and right movements, so the gyroscope z-axis value can separate up/down actions from left/right actions; and since the up and down features are exactly opposite in sign, up can also be distinguished from down.
Step 5: action classification.
The specific implementation process is:
Step 5.1: define a_tn^x as the lower threshold of the accelerometer x-axis, and g_tn^x, g_tx^x and g_tn^z, g_tx^z as the lower and upper thresholds of the gyroscope x- and z-axes. The threshold values are determined by the pre-training step and serve as the decision data of a decision tree. Define a_i^x, g_i^x, and g_i^z as the accelerometer x-axis, gyroscope x-axis, and gyroscope z-axis values at the current time i.
Check the current accelerometer x-axis value: if a_i^x < a_tn^x, the head has started a click action; otherwise go to step 5.2.
Step 5.2: check the current gyroscope x-axis value: if g_i^x < g_tn^x or g_i^x > g_tx^x, the head has started a left or right action; go to step 5.3. Otherwise go to step 5.4.
Step 5.3: if g_i^x < g_tn^x, the head has started turning left and the cursor on the screen moves left; otherwise the head is turning right and the cursor moves right.
Step 5.4: check the current gyroscope z-axis value: if g_i^z < g_tn^z or g_i^z > g_tx^z, the head has started an up or down action; go to step 5.5. Otherwise no action occurs at the current time.
Step 5.5: if g_i^z < g_tn^z, the head has started turning up and the cursor moves up; otherwise the head is turning down and the cursor moves down.
Step 5.6: once an action is recognized, the cursor performs the corresponding movement on the screen until the action terminates. The termination of an action is defined as follows:
For a left action, when the real-time value satisfies g_i^x ≥ g_tx^x the cursor stops moving; when g_i^x ≤ g_tn^x is then recognized, the left action terminates.
For a right action, when g_i^x ≤ g_tn^x the cursor stops moving; when g_i^x ≥ g_tx^x is then recognized, the right action terminates.
For an up action, when g_i^z ≥ g_tx^z the cursor stops moving; when g_i^z ≤ g_tn^z is then recognized, the up action terminates.
For a down action, when g_i^z ≤ g_tn^z the cursor stops moving; when g_i^z ≥ g_tx^z is then recognized, the down action terminates.
For a click action, whose motion is in the same direction as a down action, when g_i^z ≤ g_tn^z the cursor stops moving; when g_i^z ≥ g_tx^z is then recognized, the click action terminates.
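The decision tree of steps 5.1-5.5 can be sketched as a chain of threshold comparisons. Function and key names are illustrative assumptions, and the threshold values in the usage lines are placeholders, not trained values.

```python
def classify(a_x, g_x, g_z, th):
    """Return the action starting at the current sample, or None.

    a_x, g_x, g_z: current accelerometer-x, gyroscope-x, gyroscope-z values;
    th: the five trained thresholds a_tn_x, g_tn_x, g_tx_x, g_tn_z, g_tx_z.
    """
    if a_x < th["a_tn_x"]:   # step 5.1: click is checked first
        return "click"
    if g_x < th["g_tn_x"]:   # steps 5.2-5.3: left/right on gyroscope x
        return "left"
    if g_x > th["g_tx_x"]:
        return "right"
    if g_z < th["g_tn_z"]:   # steps 5.4-5.5: up/down on gyroscope z
        return "up"
    if g_z > th["g_tx_z"]:
        return "down"
    return None              # no action at the current time

th = {"a_tn_x": -5, "g_tn_x": -3, "g_tx_x": 3, "g_tn_z": -3, "g_tx_z": 3}
print(classify(0, -4, 0, th))  # → left
print(classify(0, 0, 4, th))   # → down
```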
Step 6: action segmentation.
The specific implementation process is:
The raw data we obtain is a continuous time series; action segmentation divides it into blocks, each block containing the data of one action. By definition, each action consists of two opposite parts: for a left action, for example, the head first turns left from the intermediate region and then turns right back to it. The result is two opposite waveforms on the same axis. Two thresholds T_n (lower) and T_p (upper) are defined per axis to determine the start and end of an action. The data is segmented using the following rule:
Assume the head starts in the intermediate region, facing the screen. When the gyroscope data is recognized to reach a threshold for the first time, that point is marked as the start of the action, denoted Ps. When the gyroscope data reaches a threshold in the opposite direction for the second time, that point is marked as the end of the action, denoted Pe. The segment between Ps and Pe is then split out as one action.
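The segmentation rule above (mark Ps at the first threshold crossing, Pe at the second crossing in the opposite direction) can be sketched as follows; this is a minimal version under the stated rule, and all names are assumptions.

```python
def segment_actions(data, t_n, t_p):
    """Split a stream into (Ps, Pe) index pairs, one pair per action.

    t_n, t_p: the lower and upper thresholds T_n and T_p for this axis.
    """
    actions, start, positive = [], None, False
    for i, v in enumerate(data):
        if start is None:
            if v > t_p or v < t_n:       # Ps: first threshold crossing
                start, positive = i, v > t_p
        elif (positive and v < t_n) or (not positive and v > t_p):
            actions.append((start, i))   # Pe: opposite crossing ends the action
            start = None
    return actions

# one action: positive lobe then the opposite (negative) return lobe
print(segment_actions([0, 0, 2, 1, 0, -2, 0], t_n=-1, t_p=1))  # → [(2, 5)]
```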
Step 7: data pre-training.
The specific implementation process is:
Because different people have different motor habits, the motion waveforms they produce also differ. The classifier defined in the steps above is therefore not universal, and a personalized classifier must be trained for each user.
Step 7.1: before use, the user performs each of the five defined actions five times in turn.
Step 7.2: extract the five pairs of extreme values from the resulting data, and multiply the lowest crest and the highest trough by a coefficient k to obtain the trained personalized thresholds. Using these personalized thresholds as the classifier's thresholds personalizes the classifier.
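The pre-training rule of steps 7.1-7.2 can be sketched as follows; the function name and the example value k = 0.5 are assumptions (the patent leaves k to be chosen, cf. Figs. 4a and 4b).

```python
def train_thresholds(runs, k):
    """Derive personalized thresholds from repeated runs of one action.

    runs: one list of samples per demonstration of the action;
    returns the scaled lowest crest (upper) and highest trough (lower).
    """
    min_peak = min(max(r) for r in runs)     # lowest crest over the runs
    max_trough = max(min(r) for r in runs)   # highest trough over the runs
    return {"upper": k * min_peak, "lower": k * max_trough}

runs = [[0, 5, 0, -4], [0, 4, 0, -6], [0, 6, 0, -5]]
print(train_thresholds(runs, k=0.5))  # → {'upper': 2.0, 'lower': -2.0}
```

Scaling by k < 1 keeps the thresholds inside the weakest observed swing, so even a user's least emphatic repetition of an action still crosses them.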
Step 8: error correction.
The specific implementation process is:
During use, the user sees the cursor move in real time on the screen, so when an error occurs the user knows at once and can immediately begin to correct it. If the cursor moves in the wrong direction, the user performs the same action again and the error is cancelled. For example, when the user turns left but finds the cursor moving right, the user can perform one more right action to correct the mistake. With this method, once the previous action ends, subsequent actions are unaffected by earlier errors.
It should be understood that the parts not elaborated in this specification belong to the prior art.
It should be understood that the above description of a preferred embodiment is relatively detailed and must not therefore be regarded as limiting the scope of patent protection of the invention. Under the inspiration of the invention, one of ordinary skill in the art may make substitutions or variations without departing from the scope protected by the claims of the invention, all of which fall within the protection scope of the invention; the claimed scope of the invention is determined by the appended claims.

Claims (5)

1. A wearable mouse control method based on a motion sensor, characterized by comprising the following steps:
Step 1: acquire data, specifically including:
Step 1.1: connect the embedded wearable device to the controlled computer via Bluetooth or a WiFi hotspot, and establish a Socket connection to transmit data; the port is set to 8088;
Step 1.2: store the motion data of the three axes obtained from the accelerometer and the gyroscope in order, one buffer per axis; when the number of received samples exceeds the window length, the oldest data is discarded; the sampling rate is set to 40 samples per second;
Step 2: define the actions, specifically including the following sub-steps:
Step 2.1: assume the user faces the screen at the start; the initial head position is called the intermediate region;
Step 2.2: define the click operation; moving the head quickly downward and then quickly back to the intermediate region is defined as a click;
Step 2.3: define the direction actions; turning the head left from the intermediate region, holding it until the cursor reaches the desired position, and then turning back to the intermediate region is defined as the move-left action; turning the head right from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-right action; turning the head up from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-up action;
turning the head down from the intermediate region, holding it until the cursor reaches the desired position, and then turning back is defined as the move-down action;
Step 3: preprocess the data to eliminate noise and remove jagged spikes;
Step 4:Feature extraction, specifically including the following sub-steps:
Step 4.1:Because the motion characteristic of the single-click operation is more significant in the accelerometer x-axis direction than those of the four direction actions, the value in the accelerometer x-axis direction can be used to distinguish click actions from direction actions;
Step 4.2:Because left-right movements are more significant than up-down movements in the gyroscope x-axis direction, the value in the gyroscope x-axis direction can be used to distinguish left-right actions from up-down actions; and because the left and right motion characteristics are opposite, left actions and right actions can also be distinguished from each other;
Step 4.3:Because up-down movements are more significant than left-right movements in the gyroscope z-axis direction, the value in the gyroscope z-axis direction can be used to distinguish up-down actions from left-right actions; and because the up and down motion characteristics are opposite, up actions and down actions can also be distinguished from each other;
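The axis selection of steps 4.1-4.3 reduces to taking per-window extremes of three signals: accelerometer x for clicks, gyroscope x for left/right, gyroscope z for up/down. A hedged sketch (the function and key names are assumptions; the patent does not prescribe a concrete feature vector):

```python
def extract_features(acc_x, gyro_x, gyro_z):
    """Per-window extremes of the three discriminative axes (step 4):
    the accelerometer x minimum separates clicks from direction moves,
    gyroscope x extremes separate left from right, and gyroscope z
    extremes separate up from down."""
    return {
        "acc_x_min":  min(acc_x),
        "gyro_x_min": min(gyro_x), "gyro_x_max": max(gyro_x),
        "gyro_z_min": min(gyro_z), "gyro_z_max": max(gyro_z),
    }
```

The signs of these extremes are what make opposite actions on the same axis separable.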
Step 5:Action classification; the specific implementation includes the following sub-steps:
Step 5.1:Define a_tn^x, g_tn^x, g_tx^x, g_tn^z, g_tx^z as, respectively, the lower threshold in the accelerometer x-axis direction and the lower and upper thresholds in the gyroscope x-axis and z-axis directions; the threshold values are determined by the pre-training step and serve as the decision data of a decision tree; define a_i^x, g_i^x, g_i^z as the values in the accelerometer x-axis direction, the gyroscope x-axis direction, and the gyroscope z-axis direction at the current time i;
Check the value in the accelerometer x-axis direction at the current time; if a_i^x < a_tn^x, the head starts a click action; otherwise, go to step 5.2;
Step 5.2:Check the value in the gyroscope x-axis direction at the current time; if g_i^x < g_tn^x or g_i^x > g_tx^x, the head starts a left or right action, so go to step 5.3; otherwise, go to step 5.4;
Step 5.3:If g_i^x < g_tn^x, the head starts moving left and the cursor on the screen moves left; otherwise, the head moves right and the cursor on the screen also moves right;
Step 5.4:Check the value in the gyroscope z-axis direction at the current time; if g_i^z < g_tn^z or g_i^z > g_tx^z, the head starts an up or down action, so go to step 5.5; otherwise, no action occurs at the current time;
Step 5.5:If g_i^z < g_tn^z, the head starts moving up and the cursor on the screen moves up; otherwise, the head starts moving down and the cursor on the screen also moves down;
Step 5.6:Once a motion is recognized, the cursor performs the corresponding action on the screen until the action terminates;
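Steps 5.1-5.5 describe a fixed-order threshold decision tree. The sketch below mirrors that order; the dictionary keys such as a_tn_x are invented names for the thresholds a_tn^x, g_tn^x, g_tx^x, g_tn^z, g_tx^z defined in step 5.1:

```python
def classify(a_x, g_x, g_z, th):
    """Threshold decision tree of step 5. `th` holds the pre-trained
    thresholds: a_tn_x (accelerometer x lower), g_tn_x/g_tx_x (gyroscope x
    lower/upper), g_tn_z/g_tx_z (gyroscope z lower/upper)."""
    if a_x < th["a_tn_x"]:          # step 5.1: click action
        return "click"
    if g_x < th["g_tn_x"]:          # steps 5.2-5.3: left or right
        return "left"
    if g_x > th["g_tx_x"]:
        return "right"
    if g_z < th["g_tn_z"]:          # steps 5.4-5.5: up or down
        return "up"
    if g_z > th["g_tx_z"]:
        return "down"
    return None                     # no action at this instant
```

Because the click test comes first, a strong downward accelerometer spike is never confused with a direction action, which matches the ordering of the sub-steps.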
Step 6:Action segmentation; the raw data obtained are time-continuous, and action segmentation splits these time-continuous data into blocks, where the data contained in each block represent one action; according to the action definitions, each action comprises two opposite parts; for example, a left action first turns the head left from the intermediate region and then turns it back to the intermediate region; as a result, there are two opposite waveforms on the same axis; for each direction, two thresholds T_n and T_p are defined to determine the beginning and end of an action, where T_n is the lower threshold and T_p is the upper threshold;
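Under the two-threshold rule of step 6, one action is the span between the first threshold crossing and the later crossing in the opposite direction. A sketch, assuming T_n < 0 < T_p on the relevant gyroscope axis (function and variable names are illustrative):

```python
def segment_actions(samples, t_n, t_p):
    """Split a continuous gyroscope trace into action blocks (step 6).
    An action starts at the first crossing of either threshold (P_s) and
    ends at the next crossing of the opposite threshold (P_e), since each
    action contains two opposite waveforms."""
    actions = []
    start = None
    start_sign = 0
    for i, v in enumerate(samples):
        if start is None:
            if v >= t_p or v <= t_n:            # first crossing: P_s
                start = i
                start_sign = 1 if v >= t_p else -1
        elif (start_sign > 0 and v <= t_n) or (start_sign < 0 and v >= t_p):
            actions.append((start, i))          # opposite crossing: P_e
            start = None
    return actions
```

Each returned (P_s, P_e) index pair delimits one action block in the raw stream.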
Step 7:Data pre-training; the specific implementation includes the following sub-steps:
Step 7.1:Before use, the user performs each of the 5 defined actions 5 times in turn;
Step 7.2:Extract the 5 pairs of extreme values from the obtained data, and multiply the lowest crest and the highest trough obtained by a certain coefficient k to serve as the trained personalized thresholds;
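The calibration of step 7.2 can be sketched as follows. The patent says only "a certain coefficient k", so the value of k, the function name, and the recording layout are all assumptions:

```python
def train_thresholds(recordings, k=0.6):
    """Personalised thresholds from the pre-training phase (step 7).
    `recordings` maps an action name to a list of per-repetition signal
    traces; the lowest crest and the highest trough across repetitions
    are scaled by the coefficient k (k=0.6 is an assumed default)."""
    thresholds = {}
    for action, traces in recordings.items():
        crests  = [max(t) for t in traces]   # one crest per repetition
        troughs = [min(t) for t in traces]   # one trough per repetition
        thresholds[action] = {
            "upper": min(crests) * k,        # lowest crest, scaled by k
            "lower": max(troughs) * k,       # highest trough, scaled by k
        }
    return thresholds
```

Taking the lowest crest and the highest trough (rather than the overall extremes) makes the thresholds conservative enough that every one of the user's own repetitions would have triggered them.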
Step 8:Error correction; during use, the user can see the real-time movement of the on-screen cursor; when a mistake occurs, the user knows that a mistake has happened and can immediately begin to correct it; if the cursor moves in the wrong direction, the user can perform the same action once more so that the error is counteracted; for example, when the user turns left but finds that the cursor moves right, the user can perform a right action once to correct this mistake; in this way, once the previous action has terminated, subsequent actions are not affected by earlier errors.
2. The wearable mouse control method based on motion sensors according to claim 1, characterized in that the specific implementation of step 3 includes the following sub-steps:
Step 3.1:Noise is eliminated by threshold denoising; a noise threshold E_th is defined, and the received data are transmitted to the device only when they exceed this threshold; otherwise they are set to 0;
Step 3.2:Sawtooth waves are removed by maximum-value filtering, which takes the maximum of all values in a window as the current filtered value; with the filter window defined as τ, the filtered value r_t at the current time can be expressed as:
r_t = max{r_{t-τ/2}, …, r_{t-1}, r_t, r_{t+1}, …, r_{t+τ/2}}.
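Steps 3.1 and 3.2 of this claim can be sketched as below. Interpreting "exceed the threshold" as a magnitude comparison is an assumption, as is clamping the max-filter window at the ends of the sequence:

```python
def denoise(samples, e_th):
    """Step 3.1: threshold denoising. Samples whose magnitude does not
    exceed the noise threshold E_th are set to 0 (magnitude comparison
    is an assumed reading of the claim)."""
    return [v if abs(v) > e_th else 0 for v in samples]

def max_filter(samples, tau):
    """Step 3.2: maximum-value filtering with window τ; each output is
    the maximum over the window centred on the current sample:
    r_t = max{r_{t-τ/2}, ..., r_t, ..., r_{t+τ/2}}."""
    half = tau // 2
    return [max(samples[max(0, t - half): t + half + 1])
            for t in range(len(samples))]
```

The max filter widens each crest by τ/2 samples on either side, which is what smooths out the narrow sawtooth dips.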
3. The wearable mouse control method based on motion sensors according to claim 1, characterized in that the termination of the actions in step 5.6 is defined as follows:
For the left action, when the real-time data g_i^x ≥ g_tx^x, the cursor stops moving; when the real-time data g_i^x ≤ g_tn^x is subsequently recognized, the left action terminates;
For the right action, when the real-time data g_i^x ≤ g_tn^x, the cursor stops moving; when the real-time data g_i^x ≥ g_tn^x is subsequently recognized, the right action terminates;
For the up action, when the real-time data g_i^z ≥ g_tx^z, the cursor stops moving; when the real-time data g_i^z ≤ g_tx^z is subsequently recognized, the up action terminates;
For the down action, when the real-time data g_i^z ≤ g_tn^z, the cursor stops moving; when the real-time data g_i^z ≥ g_tn^z is subsequently recognized, the down action terminates;
For the click action, whose motion is in the same direction as the down action, when the real-time data g_i^z ≤ g_tn^z, the cursor stops moving; when the real-time data g_i^z ≥ g_tn^z is subsequently recognized, the click action terminates.
4. The wearable mouse control method based on motion sensors according to claim 1, characterized in that in step 6 the data are segmented using the following rule:
The head is defined to initially face the screen in the intermediate region; when the gyroscope data are recognized to reach a certain threshold for the first time, that point is marked as the starting point of an action, denoted P_s; when the gyroscope data reach a certain threshold in the opposite direction for the second time, that point is marked as the ending point of the action, denoted P_e; the segment between P_s and P_e is then split out as one action.
5. The wearable mouse control method based on motion sensors according to claim 1, characterized in that the specific implementation of step 7 includes the following sub-steps:
Step 7.1:Before use, the user performs each of the 5 defined actions 5 times in turn;
Step 7.2:Extract the 5 pairs of extreme values from the obtained data, and multiply the lowest crest and the highest trough obtained by a certain coefficient k to serve as the trained personalized thresholds.
CN201710812802.6A 2017-09-11 2017-09-11 A kind of wearable mouse control method based on motion sensor Pending CN107506041A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710812802.6A CN107506041A (en) 2017-09-11 2017-09-11 A kind of wearable mouse control method based on motion sensor

Publications (1)

Publication Number Publication Date
CN107506041A true CN107506041A (en) 2017-12-22

Family

ID=60695399

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201716687U (en) * 2009-11-27 2011-01-19 晶翔微系统股份有限公司 Aerial mouse
CN102282527A (en) * 2008-11-21 2011-12-14 伦敦健康科学中心研究公司 Hands-free pointer system
CN103513770A (en) * 2013-10-09 2014-01-15 中国科学院深圳先进技术研究院 Man-machine interface equipment and man-machine interaction method based on three-axis gyroscope
CN103543843A (en) * 2013-10-09 2014-01-29 中国科学院深圳先进技术研究院 Man-machine interface equipment based on acceleration sensor and man-machine interaction method
CN104107134A (en) * 2013-12-10 2014-10-22 中山大学 Myoelectricity feedback based upper limb training method and system
CN104919393A (en) * 2012-11-20 2015-09-16 三星电子株式会社 Transition and interaction model for wearable electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Tao: "Research on Infrared Dim Target Segmentation Methods in Digital Image Processing", 30 June 2016 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108815845A (en) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN108815845B (en) * 2018-05-15 2019-11-26 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN109782904A (en) * 2018-12-26 2019-05-21 南昌大学 A kind of wireless mouse based on intelligent glasses


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171222