CN111007977A - Intelligent virtual interaction method and device - Google Patents

Intelligent virtual interaction method and device

Info

Publication number
CN111007977A
Authority
CN
China
Prior art keywords
keyboard
finger
key
mouse
size
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811194000.4A
Other languages
Chinese (zh)
Inventor
邱波
岳洋
张加军
杨光
邱实
邱红
关月阳
邱增
邱广君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201811194000.4A
Publication of CN111007977A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers, for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F2203/04804 Transparency, e.g. transparent or translucent windows
    • G06F2203/04805 Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An intelligent virtual input method and electronic device improve on the interaction modes of physical keyboards, virtual keyboards, mice, soft mice, gestures, handwriting and other motion inputs, and implement the related functions. The device consists of a soft touch pad and a virtual keyboard. The soft touch pad interface reproduces, more intelligently and conveniently, the functions of the physical touch pad of a notebook or desktop computer. It distinguishes different finger actions using one or a combination of the following parameters: start time, end time, duration of the action, time interval, distance, direction, absolute value of displacement, and so on, or values estimated from these parameters. The mouse cursor on the soft touch pad can be moved quickly and accurately to the position the user intends, realizing the basic functions of a mouse, and providing the virtual-keyboard functions of electronic blackboards, tablet computers and mobile phones, simplified and practical keyboard and mouse functions, gestures, handwriting, other motion inputs and related intelligent and convenient functions.

Description

Intelligent virtual interaction method and device
Technical Field
The invention relates to virtual input technology, and in particular to a method and electronic device for handling finger positioning, resting, sliding and clicking on a virtual keyboard.
Background
When using a large-screen Windows tablet computer without a physical keyboard and mouse, the user must rely on the system's built-in virtual keyboard and touch mouse. Before every input the eyes must first locate the key, and after input the fingers must be curled and lifted to avoid blocking the screen; the virtual keyboard and touch mouse themselves also occlude a large part of the screen. Compared with a physical keyboard and mouse, this is very inconvenient:
1. The user must visually confirm the key position before every key press; compared with touch typing on a physical keyboard, input efficiency is extremely low.
2. While waiting to input, the fingers must be curled and the wrists lifted to avoid blocking the view and causing accidental touches. The wrists and fingers tire within a short time and the user must rest before continuing, so input efficiency falls far behind a physical keyboard.
3. The virtual keyboard and the touch mouse are displayed in two separate program windows that occlude much of the screen and must be moved repeatedly during use, which seriously reduces efficiency.
In practice, entering text on a tablet computer with the virtual keyboard for more than about 20 minutes tires the neck, finger joints and wrist, and over a given period the virtual keyboard's input speed is less than one third that of a physical keyboard.
Disclosure of Invention
1. Multifunctional keyboard-and-mouse input software composed of a soft touch pad and a soft keyboard, characterized in that: on entering the touch pad interface, the areas and relative positions of the left button, middle button, right button and mouse (icon) are clearly shown, and the corresponding touch pad area is 20% transparent; pressing the corresponding keyboard key switches to the keyboard interface, in which the soft touch pad and the left, middle and right buttons are temporarily hidden while all operating function keys remain unchanged; the background of the left, middle and right buttons and of the soft touch pad is transparent, with adjustable transparency, and the function keys are delimited by outlines.
2. The multifunctional keyboard-and-mouse input software is characterized in that: the touch area of the soft keyboard displays a virtual (dashed) mouse cursor and the non-touch area displays the real mouse cursor, so that the mouse cursor driven from the soft touch pad can be moved quickly and accurately to the position the user intends.
3. The multifunctional keyboard-and-mouse input software is characterized in that: the soft touch pad interface provides a magnifier in two sizes, selected according to the speed of mouse movement; the magnified content is a circular area centred on the virtual mouse pointer with a radius of 50 pixels, at magnifications of 2.2 times (large) and 1.4 times (small); a code sketch of this speed-dependent magnification appears after this list.
4. The multifunctional keyboard-and-mouse input software is characterized in that: it comprises four interfaces, the first being an alphabetic pinyin input interface, the second an interface for numbers and punctuation marks, the third a handwriting interface, and the fourth a soft-mouse touch pad interface.
5. The multifunctional keyboard-and-mouse input software is characterized in that: the size of the soft keyboard is adjustable, with a magnification factor of 1 to 5, and the touch pad is zoomed synchronously.
6. The multifunctional keyboard-and-mouse input software is characterized in that: a bubble effect can be selected in the software settings; a bubble centred on the cursor appears at the cursor position, and the area near the cursor is enlarged as if through a convex mirror to form the bubble effect; the bubble is 90% transparent, its size is a function of the sliding speed, it disappears as soon as sliding stops and appears as soon as sliding starts, and whether the bubble is shown, its size and its function can all be configured.
7. The multifunctional keyboard-and-mouse input software is characterized in that: all extension dialog boxes (such as candidate symbols or text boxes) adjust automatically to the actual keyboard position, with the keyboard opaque; when cursor movement stops during input, the dialog box containing the cursor (for example a Baidu input box) pops out of the soft keyboard interface, and the cursor can always be operated normally outside the soft keyboard interface; after a selection is made, the extension dialog box disappears automatically and the original keyboard returns to an active waiting state; the default position of the input box is above the soft keyboard, and when the cursor moves into the input box it is automatically adjusted to the upper side of the soft keyboard. The soft keyboard dialog box can be moved to adjust its position.
8. The multifunctional keyboard-and-mouse input software is characterized in that: the size of the soft keyboard can be adjusted by dragging its frame, with the keys enlarged or reduced in the same proportion, and the sizes of the soft keyboard and its keys can also be adjusted fully automatically according to the thickness of the fingers.
9. The multifunctional keyboard-and-mouse input software is characterized in that: the cursor can be controlled in two ways, the first being touch pad control through the touch pad interface, and the second combining finger taps on the screen with touch pad control, the finger tap giving the rough direction and the touch pad giving the precise position, with the two methods applied simultaneously or alternately.
10. The multifunctional keyboard-and-mouse input software is characterized in that: screens can be switched in three ways: first, sliding left or right switches between the keyboard interfaces and the touch pad interface without setting coordinates; second, sliding within a fixed area defined by coordinates performs the switch; third, switching by function key, which is the default mode.
11. The multifunctional keyboard-and-mouse input software is characterized in that: contact by a garment sleeve with the soft touch pad or the soft keyboard produces no reaction.
12. The multifunctional keyboard-and-mouse input software is characterized in that: the software serial number can be encrypted; an additional serial number is generated from the computer's motherboard number, machine number and the like and is returned to the company's database and mailbox when the software is downloaded over the network; two modes of encrypting the serial number and its interface are provided, and the user can be reminded in time to upgrade the software version and patch vulnerabilities.
13. A method for recognizing finger actions on an intelligent device, comprising:
after the finger action parameters are captured, distinguishing different finger actions by one or a combination of the following parameters: start time, end time, duration of the action, time interval, distance, direction, absolute value of displacement, absolute value of velocity, absolute value of acceleration, derivative of the acceleration, area of finger contact with the device, or a derivative or integral of these parameters, a characteristic value of these parameters, or a value inferred from these parameters (a classification sketch appears after this list);
14. A method of recognizing different actions or movements, characterized in that:
in a two-dimensional or multi-dimensional space, after the parameters are captured, different actions or movements are distinguished by one or a combination of the following parameters: start time, end time, duration of the action or movement, time interval, distance, direction, absolute value of displacement, absolute value of velocity, absolute value of acceleration, derivative of the acceleration, or a derivative or integral of these parameters, a characteristic value of these parameters, or a value inferred from these parameters;
15. A method for determining the position or size of a key from the part of a touch device that a finger touches, comprising:
the position of a finger determining the position or the size of the key under that finger;
or the contact area of a finger determining the size of the key;
16. A method for determining the position or size of a keyboard from the part of a touch device that a finger touches, comprising:
the position of a finger determining the position or the size of the keyboard under that finger;
or the contact area of a finger determining the size of the keyboard;
17. A human-computer interaction method, characterized in that: parts of a first device and parts of a second device overlap in some area at the same time, and both devices can be used at any point of the overlapping area at the same time and in the same space;
18. A human-computer interaction method, characterized in that: occlusion is avoided by an intelligent split-screen or intelligent non-occlusion technique.
19. An apparatus using the method of any one of claims 1 to 6, characterized in that: it is used in particular to realize convenient interaction.
20. A human-computer interaction device, characterized in that: parts of a first device and parts of a second device overlap in some area at the same time, and both devices are used at any point of the overlapping area at the same time and in the same space.
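
Items 13 and 14 above describe distinguishing finger actions from kinematic parameters such as duration, distance, direction, velocity and acceleration. The following Python sketch shows one way such a classifier could be organized; the Action categories and the numeric thresholds (TAP_MAX_DURATION, TAP_MAX_TRAVEL, LONG_PRESS_MIN, SWIPE_MIN_SPEED) are illustrative assumptions and are not values given in this disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto
import math


class Action(Enum):
    TAP = auto()         # short contact with little travel
    LONG_PRESS = auto()  # long contact with little travel
    SWIPE = auto()       # fast directional movement
    DRAG = auto()        # slow movement


@dataclass
class TouchTrace:
    """One finger's contact, sampled as (t, x, y) points (seconds, pixels)."""
    samples: list

    @property
    def duration(self):
        return self.samples[-1][0] - self.samples[0][0]

    @property
    def displacement(self):
        (_, x0, y0), (_, x1, y1) = self.samples[0], self.samples[-1]
        return math.hypot(x1 - x0, y1 - y0)

    @property
    def mean_speed(self):
        return self.displacement / self.duration if self.duration > 0 else 0.0


# Illustrative thresholds (assumptions, not taken from the disclosure).
TAP_MAX_DURATION = 0.20   # s
TAP_MAX_TRAVEL = 10.0     # px
LONG_PRESS_MIN = 0.60     # s
SWIPE_MIN_SPEED = 400.0   # px/s


def classify(trace: TouchTrace) -> Action:
    """Judge the finger action from duration, displacement and mean speed."""
    if trace.displacement <= TAP_MAX_TRAVEL:
        return Action.LONG_PRESS if trace.duration >= LONG_PRESS_MIN else Action.TAP
    return Action.SWIPE if trace.mean_speed >= SWIPE_MIN_SPEED else Action.DRAG
```

Further parameters from items 13 and 14 (time intervals between contacts, contact area, higher derivatives) could be added as additional properties of TouchTrace without changing the structure of the classifier.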
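
Items 3 and 6 describe a magnifier whose magnification is chosen by pointer speed (a 50-pixel circle shown at 2.2x or 1.4x) and a bubble whose size is a function of sliding speed. The following rough sketch shows those two mappings; the speed threshold SLOW_SPEED, the choice that the larger magnification is used at low speed, and the exact bubble-radius formula are assumptions made only for illustration.

```python
SOURCE_RADIUS = 50      # px, circular area around the virtual pointer (item 3)
ZOOM_LARGE = 2.2        # magnification assumed to be used at low pointer speed
ZOOM_SMALL = 1.4        # magnification assumed to be used at high pointer speed
SLOW_SPEED = 150.0      # px/s, assumed cut-over between the two sizes


def magnifier_zoom(pointer_speed: float) -> float:
    """Pick the large or small magnification from the pointer speed (item 3)."""
    return ZOOM_LARGE if pointer_speed < SLOW_SPEED else ZOOM_SMALL


def bubble_radius(slide_speed: float,
                  base: float = 30.0, gain: float = 0.1, cap: float = 80.0) -> float:
    """Bubble size as a function of sliding speed (item 6); the bubble
    disappears when sliding stops, i.e. the radius is 0 at zero speed."""
    if slide_speed <= 0:
        return 0.0
    return min(base + gain * slide_speed, cap)
```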
Drawings
Fig. 1 is a schematic illustration of the main component symbol.
FIG. 2 is an example of proper finger rest for keyboard operation.
Fig. 3 is a schematic view of an electronic device according to the present invention.
Fig. 4 is a block diagram of a processing unit of the key technology of the present invention.
FIG. 5 is a flow chart of a method of operation of the present invention.
FIG. 6 is a schematic view of the conventional keyboard of the virtual keyboard of the present invention (second edition).
FIG. 7 is a schematic view of the touch mouse of the present invention (second edition).
FIG. 8 is a schematic view of the symbol keyboard of the virtual keyboard of the present invention (second edition).
FIG. 9 is a schematic view of the conventional keyboard of the virtual keyboard of the present invention (first edition).
FIG. 10 is a schematic view of the touch mouse of the present invention (first edition).
FIG. 11 is a schematic view of the symbol keyboard of the virtual keyboard of the present invention (first edition).
FIG. 12 is a schematic view 1 of a virtual keyboard and a touch mouse according to the present invention.
FIG. 13 is a schematic view 2 of a virtual keyboard and a touch mouse according to the present invention.
FIG. 14 is a schematic view 3 of a virtual keyboard and a touch mouse according to the present invention.
FIG. 15 is a schematic view 4 of a virtual keyboard and a touch mouse according to the present invention.
FIG. 16 is a schematic view 5 of a virtual keyboard and a touch mouse according to the present invention.
Detailed Description
In the first embodiment, shown in FIG. 1, FIG. 2, FIG. 3, FIG. 4 and FIG. 5:
The electronic device 10 includes a touch screen 20, a processing unit 30 and a pre-input unit 40.
The touch screen 20 displays an image and also responds to a touch operation applied thereto by a user.
The electronic device 10 is installed with a virtual keyboard program of the present invention.
The processing unit 30 executes the virtual keyboard program and displays the virtual keyboard 50 on the touch screen 20.
The pre-input unit 40 stores characters recorded on the virtual keyboard 50.
In use, the processing unit 30 responds to the operation of the user touching a certain position of the virtual keyboard 50, and obtains and displays the corresponding character at the position from the pre-input unit 40.
This embodiment preserves the habits of the physical keyboard: the ten fingers of both hands rest naturally on the home keys A, S, D, F, J, K, L, ; and the space key (nine keys in total, FIG. 1), and during input a finger naturally moves, lifts and drops. A click input of the F key on a 10-inch tablet computer is described below as an example.
The electronic device of the embodiment comprises a touch screen and a virtual keyboard (figure 1), wherein the virtual keyboard comprises a processing unit, a sensing module, a storage module, a judgment module, a calculation module and a pre-input unit. Wherein:
and the sensing module is used for identifying and receiving the behaviors of key position pressing, flicking and sliding of the touch screen virtual keyboard, extracting the pressing, flicking and touch point changing tracks and corresponding time information on the same key position, and outputting the information to the storage module.
And the storage module is used for storing the instant time information, the position information and the direction information of key position operation of the virtual keyboard and outputting the instant time information, the position information and the direction information to the calculation module.
The computing module is used for computing the absolute values of the pressing and bouncing time in the key position operation of the virtual keyboard; sliding distance, sliding time consumption, sliding speed and acceleration of touch points on the same key position; and output to the decision block.
And the judging module stores the set values of all the parameters, judges whether the operation of the key positions of the virtual keyboard is cooperative or isolated according to the comparison and judgment between the input values of the calculating module and the set values, further judges whether the operation is effective input or not, and outputs the result to the pre-input unit.
The functions of the above modules may also be understood from the following description or from the definitions in the Disclosure and are not repeated here. The corresponding flow is shown in FIG. 4 and includes the following steps:
step 100, according to the mode of fig. 1, a user positions fingers to asdfjkl of a virtual keyboard; and a space key.
Step 110, the storage module records the pressing time of each virtual key.
And step 120, calculating the time difference of each virtual key position by the calculating module, wherein the time difference is infinite because no key is lifted.
And step 130, judging the cooperative operation according to the set value.
And 150, judging to stop the motion according to the set value, inputting no information, and continuously receiving the operation information of the key position.
Step 100, lifting the left hand, clicking the f key position and bouncing. The left hand repositions the key keys asdf and the space key.
And step 110, recording the bounce time and the press time of each virtual key position by the storage module.
And step 120, calculating the time difference of each virtual key position and the time difference of pressing and popping the f key by the calculating module. Since the space key of the right hand is positioned and not bounced, the time difference of the space key is infinite.
Step 130, judging that the f key is pressed and bounced to be an isolated operation; the second press of the a key, the d key, the s key and the f key is a cooperative operation.
In step 140, it is determined that the f key is input once. It is determined that the space key is not sprung and no input is made.
And 150, judging to stop the a key, the d key, the s key, the f key and the space key again, and continuously receiving the operation information of the key positions.
And step 160, receiving the input information of the f key, and continuously receiving the operation information of the key position.
Example two
This embodiment illustrates finger positioning and resting on the keys.
For every key, the time difference from press to release is judged; if it exceeds a specific value t, the touch is not treated as a valid click but as a finger resting on the key and then being taken away.
For keys other than A, S, D, F, J, K, L and ;, if the press-to-release time difference is less than t, the keystroke is considered valid.
The keys A, S, D, F, J, K, L and ; are divided into two groups: the first group is A, S, D, F and the second group is J, K, L, ;. For each group:
If the press-to-release time difference of some key x in the group is less than t, the action is further judged to be isolated or cooperative. If isolated, the keystroke is considered valid; if cooperative, it is considered invalid.
The criterion for isolated versus cooperative is:
If at least one other key y can be found in the same group that was pressed almost simultaneously with key x and released almost simultaneously with it, the action is cooperative; otherwise it is isolated. A code sketch of this test follows.
The space key differs from other keys in that the thumbs of both hands may rest on it. Its state therefore cannot be judged from a single pressed/lifted flag; instead, the state of every existing contact is recorded. If the press-to-release time difference of any one contact exceeds the value t, that contact is not treated as a valid click but as a finger resting and then being taken away.
For the A, S, D, F, J, K, L and ; keys in the pressed state, if the finger slides within the key (by less than half the key height), the key can be pressed again directly without first releasing, while normal finger shake is tolerated. The two are distinguished as follows (a code sketch follows this list):
(1) If the overall direction of the slide is upward, leftward or rightward, it is treated as finger shake.
(2) If the overall direction of the slide is downward:
(2.1) if the slide distance exceeds 15 pixels, it is finger shake;
(2.2) if the slide speed is lower than v, it is finger shake;
(2.3) if there are two or more sample points in the slide, the acceleration at each point from the second point onward is calculated; when the maximum acceleration exceeds a set value, or the acceleration first increases and then decreases, the action is a key press, otherwise it is finger shake.
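
A sketch of the shake-versus-press rules above for a slide that stays within one key. The unspecified speed v (MIN_PRESS_SPEED) and the acceleration threshold are assumptions, the accelerations are estimated by finite differences over the sampled points, and treating a sideways-dominant slide as "leftward or rightward overall" is an interpretation of rule (1).

```python
MAX_PRESS_TRAVEL = 15.0    # px (rule 2.1)
MIN_PRESS_SPEED = 80.0     # px/s, the unspecified "v" in rule 2.2 (assumed)
ACC_THRESHOLD = 2000.0     # px/s^2 for rule 2.3 (assumed)


def classify_in_key_slide(samples):
    """samples: [(t, x, y), ...] of a slide that stayed inside one key.
    Returns 'press' if the downward slide looks like a deliberate key press,
    otherwise 'shake'."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0              # screen y grows downward
    if dy <= 0 or abs(dx) > dy:
        return "shake"                      # rule 1: up, left or right overall
    distance = (dx * dx + dy * dy) ** 0.5
    if distance > MAX_PRESS_TRAVEL:
        return "shake"                      # rule 2.1
    duration = t1 - t0
    if duration <= 0 or distance / duration < MIN_PRESS_SPEED:
        return "shake"                      # rule 2.2
    # Rule 2.3: per-point acceleration from the second sample onward.
    if len(samples) >= 3:
        speeds = []
        for (ta, xa, ya), (tb, xb, yb) in zip(samples, samples[1:]):
            dt = max(tb - ta, 1e-6)
            speeds.append(((xb - xa) ** 2 + (yb - ya) ** 2) ** 0.5 / dt)
        accs = [(s2 - s1) / max(samples[i + 2][0] - samples[i + 1][0], 1e-6)
                for i, (s1, s2) in enumerate(zip(speeds, speeds[1:]))]
        peak = max(accs, key=abs, default=0.0)
        rises_then_falls = any(
            accs[i] < accs[i + 1] > accs[i + 2] for i in range(len(accs) - 2))
        if abs(peak) > ACC_THRESHOLD or rises_then_falls:
            return "press"
        return "shake"
    # Too few samples to estimate acceleration: the short, fast, downward slide
    # is treated as a press (assumption).
    return "press"
```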
Example three
This embodiment illustrates how the method of the present invention minimizes occlusion of the display screen.
The extension keys are activated not by long pressing but by pressing the right mouse button, or by the finger pressing the key, sliding more than half the key height (without leaving the key) and then releasing.
An occlusion-avoidance scheme is provided for the maximized conventional keyboard and the two-in-one window (the user can enable or disable it in the settings); the symbol keyboard, the mouse interface, the handwriting interface, and the non-maximized conventional keyboard and two-in-one window do not need such a scheme.
The maximized conventional keyboard and two-in-one window can only be dragged up and down, not left and right, and after being dragged to the upper or lower edge they snap automatically to the very top or bottom.
After the user enables the function, the occlusion-avoidance scheme only takes effect when the maximized conventional keyboard or two-in-one window is at the very top or bottom:
If the maximized conventional keyboard or two-in-one window is already displayed at the very top or bottom and the user opens a new window or activates an existing one whose area overlaps it, the sizes of the two windows are adjusted automatically: the other window is compressed into the height outside the keyboard window, and the two windows join to fill the whole screen.
If the maximized conventional keyboard or two-in-one window is newly displayed at the very top or bottom and overlaps the currently active window, the size of the active window is adjusted automatically: it is compressed into the height outside the keyboard window, and the two join to fill the whole screen. A sketch of this adjustment follows.
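
A minimal sketch of that window adjustment, assuming window rectangles are given as (left, top, width, height) in pixels with y growing downward; the function name is illustrative, and only the resizing of the non-keyboard window is shown.

```python
def avoid_occlusion(active, keyboard, screen_h):
    """Return a new rect for the active window so it tiles the screen with a
    keyboard window docked at the very top or bottom.

    Rects are (left, top, width, height)."""
    kx, ky, kw, kh = keyboard
    ax, ay, aw, ah = active
    # The maximized keyboard spans the full width, so only vertical overlap matters.
    overlaps = not (ay + ah <= ky or ky + kh <= ay)
    if not overlaps:
        return active
    if ky == 0:                       # keyboard docked at the very top
        return (ax, kh, aw, screen_h - kh)
    if ky + kh == screen_h:           # keyboard docked at the very bottom
        return (ax, 0, aw, screen_h - kh)
    return active                     # scheme only takes effect at an edge
```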
Example four
This embodiment illustrates how the virtual keyboard of the present invention adapts its position and size to the fingers.
The software tells the user whether to place one hand or both hands on the screen and which fingers to place. If the total number of fingers detected does not match the number the software requires, the placement is invalid.
When one hand is required, four or five fingers on the left side of the screen are taken to be the left hand; otherwise they are taken to be the right hand. When both hands are required, the four or five fingers on the left side are recognized as the left hand and the four or five fingers on the right side as the right hand. If the software cannot identify the left and right hands, or the hands are obviously not placed in the usual way, the placement is invalid.
The perpendicular distances between adjacent fingers among the little, ring, middle and index fingers of one or both hands are averaged and used as the key width, which in turn determines the keyboard size. If the keyboard width determined by this rule exceeds the screen width, the software uses the keyboard size closest to the screen width.
The four fingers of the left hand correspond, from left to right, to A, S, D and F; the four fingers of the right hand correspond, from left to right, to J, K, L and ;. For one hand the keyboard is positioned directly from these keys; for both hands the midpoint between the fingers on F and J is taken as the gap between the G and H keys. If the keyboard positioned by this rule extends beyond the screen, it is moved into the screen area toward the left or right. A sketch of this sizing and positioning rule, for the two-hand case, follows.
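
Only the rules stated above (key width from the average fingertip spacing, the F/J midpoint fixing the gap between G and H, clamping to the screen) come from the text; in the sketch below the 15-column, 4-row layout, the square keys and the vertical placement of the home row are assumptions, and only the two-hand case is handled.

```python
def fit_keyboard(left_tips, right_tips, screen_w, screen_h, cols=15, rows=4):
    """left_tips / right_tips: [(x, y), ...] fingertip points of the little,
    ring, middle and index fingers of each hand.
    Returns (key_w, keyboard_rect) with keyboard_rect = (left, top, width, height)."""
    # Key width = average spacing between adjacent fingertips of both hands.
    gaps = []
    for tips in (left_tips, right_tips):
        xs = sorted(x for x, _ in tips)
        gaps += [b - a for a, b in zip(xs, xs[1:])]
    key_w = sum(gaps) / len(gaps)

    kb_w = key_w * cols
    if kb_w > screen_w:               # wider than the screen:
        key_w = screen_w / cols       # fall back to the largest size that fits
        kb_w = screen_w
    kb_h = key_w * rows               # square keys assumed

    # F sits under the left index finger, J under the right index finger;
    # their midpoint becomes the gap between the G and H keys.
    fx, fy = max(left_tips, key=lambda p: p[0])    # rightmost left-hand tip ~ F
    jx, jy = min(right_tips, key=lambda p: p[0])   # leftmost right-hand tip ~ J
    centre_x, home_y = (fx + jx) / 2, (fy + jy) / 2

    left = min(max(centre_x - kb_w / 2, 0), screen_w - kb_w)   # keep inside screen
    top = min(max(home_y - 2.5 * key_w, 0), screen_h - kb_h)   # home row ~ third row
    return key_w, (left, top, kb_w, kb_h)
```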
After the fingers are placed, the user may still adjust them; the software starts its measurement and calculation once all the fingers are lifted at the same time.
If a placement is invalid, the software gives a prompt and the user can keep placing the hands, without leaving the page, until the placement is valid; an unsatisfactory placement can likewise be repeated until it is satisfactory. The user can confirm a valid, satisfactory placement, or restore the keyboard settings (including size and position) that existed before entering the page.
Example five
This embodiment illustrates that the present invention realizes two-in-one operation of a virtual keyboard and a touch mouse, as shown in fig. 12, 13, and 14.
Whether an operation belongs to the keyboard interface or to the touch pad interface is judged from the finger action: parameters are set, and sliding is distinguished from clicking.
If the finger slides, the system treats the operation as a touch pad operation; note that the touch pad's operating area lies within the yellow dotted frame.
If a key is clicked, the system treats the operation as keyboard input. A minimal sketch of this routing decision follows.
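
A minimal sketch of this slide-versus-click routing; the travel and time thresholds are assumptions, since the text only says that such parameters need to be set.

```python
SLIDE_MIN_TRAVEL = 12.0   # px: more travel than this is a slide (assumed)
CLICK_MAX_TIME = 0.25     # s: a short, nearly still contact is a click (assumed)


def route_touch(travel_px, contact_s):
    """Decide whether a finger contact drives the touch pad or the keyboard."""
    if travel_px >= SLIDE_MIN_TRAVEL:
        return "touchpad"       # sliding moves the mouse cursor
    if contact_s <= CLICK_MAX_TIME:
        return "keyboard"       # a tap on a key is text input
    return "rest"               # long, still contact: the finger is only resting
```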
Example six
This embodiment illustrates that the present invention realizes two-in-one operation of a virtual keyboard and a touch mouse, as shown in fig. 15.
Clicking the space bar of the virtual keyboard inputs a space; sliding on the space bar is a touch pad operation.
Example seven
This embodiment illustrates that the present invention realizes two-in-one operation of a virtual keyboard and a touch mouse, as shown in fig. 16.
1. The title bar and the close and zoom buttons are removed from the left side and placed at the top. This frees up valuable horizontal space for the keyboard area; for two-handed input, and especially for touch typing where the fingers stay in place, keyboard width matters when the screen is not wide enough.
2. A row is added at the top. Its left part carries the combined keyboard-and-mouse title and serves as a drag area: the window can be dragged after it is pressed with the mouse or a finger. The middle part holds the mouse buttons, comprising a left button, a middle button (split into upper and lower) and a right button, all of which support finger clicks. The right part is a function area containing the settings, enlarge, shrink and close buttons.
3. In the bottom row of the key area, the "mouse" key is removed (it is no longer needed, having been merged into the window) and "settings" is moved to the upper right as the generic gear icon. The space key can therefore be lengthened, making its layout closer to a hard keyboard, letting the thumbs of both hands rest on it and giving good support for resting and touch typing.

Claims (8)

1. A method for recognizing finger actions on an intelligent device, characterized in that:
after the finger action parameters are captured, different finger actions are distinguished by one or a combination of the following parameters: start time, end time, duration of the action, time interval, distance, direction, absolute value of displacement, absolute value of velocity, absolute value of acceleration, derivative of the acceleration, area of finger contact with the device, or a derivative or integral of these parameters, a characteristic value of these parameters, or a value inferred from these parameters.
2. A method of recognizing different actions or movements, characterized in that:
in a two-dimensional or multi-dimensional space, after the parameters are captured, different actions or movements are distinguished by one or a combination of the following parameters: start time, end time, duration of the action or movement, time interval, distance, direction, absolute value of displacement, absolute value of velocity, absolute value of acceleration, derivative of the acceleration, or a derivative or integral of these parameters, a characteristic value of these parameters, or a value inferred from these parameters.
3. A method for determining the position or size of a key from the part of a touch device that a finger touches, characterized in that: the position of a finger determines the position or the size of the key under that finger;
or the contact area of a finger determines the size of the key or its position.
4. A method for determining the position or size of a keyboard from the part of a touch device that a finger touches, characterized in that:
the position of a finger determines the position or the size of the keyboard under that finger;
or the contact area of a finger determines the size of the keyboard or its position.
5. A human-computer interaction method, characterized in that: parts of a first device and parts of a second device partially coincide at the same time, and at any point of any coinciding area both devices are used at the same time and in the same space.
6. A human-computer interaction method, characterized in that: occlusion is avoided by an intelligent split-screen or intelligent non-occlusion technique.
7. An apparatus using the method of any one of claims 1 to 6, characterized in that: it is used in particular to realize convenient interaction.
8. A human-computer interaction device, characterized in that: parts of a first device and parts of a second device partially coincide at the same time, and at any point of any coinciding area both devices are used at the same time and in the same space.
CN201811194000.4A 2018-10-04 2018-10-04 Intelligent virtual interaction method and device Pending CN111007977A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811194000.4A CN111007977A (en) 2018-10-04 2018-10-04 Intelligent virtual interaction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811194000.4A CN111007977A (en) 2018-10-04 2018-10-04 Intelligent virtual interaction method and device

Publications (1)

Publication Number Publication Date
CN111007977A true CN111007977A (en) 2020-04-14

Family

ID=70111626

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811194000.4A Pending CN111007977A (en) 2018-10-04 2018-10-04 Intelligent virtual interaction method and device

Country Status (1)

Country Link
CN (1) CN111007977A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467330A (en) * 2010-11-16 2012-05-23 吉易高科股份有限公司 Virtual keyboard device and operation method of same
CN102063183A (en) * 2011-02-12 2011-05-18 深圳市亿思达显示科技有限公司 Virtual input device of grove type
CN102629164A (en) * 2012-02-28 2012-08-08 中兴通讯股份有限公司 Multi-point touch equipment, information display method and application processing device
CN103365451A (en) * 2012-04-05 2013-10-23 邱波 Multi-dimensional speed increasing space-saving human-computer interaction method and device for intelligent platform
CN103345312A (en) * 2013-07-03 2013-10-09 张帆 System and method with intelligent terminal as host, mouse and touch panel at the same time
CN103488400A (en) * 2013-09-27 2014-01-01 京东方科技集团股份有限公司 Method and device for building virtual keyboard
CN104199550A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111913638A (en) * 2020-08-11 2020-11-10 无锡英斯特微电子有限公司 Keyboard and mouse combined application method and device
CN113535050A (en) * 2021-09-16 2021-10-22 深圳市至简科技设计有限公司 Multi-interface display method, system and equipment based on interface linkage
CN117170505A (en) * 2023-11-03 2023-12-05 南方科技大学 Control method and system of virtual keyboard
CN117170505B (en) * 2023-11-03 2024-06-21 南方科技大学 Control method and system of virtual keyboard

Similar Documents

Publication Publication Date Title
US9239673B2 (en) Gesturing with a multipoint sensing device
US9292111B2 (en) Gesturing with a multipoint sensing device
CA2846965C (en) Gesturing with a multipoint sensing device
JP5323070B2 (en) Virtual keypad system
US9348458B2 (en) Gestures for touch sensitive input devices
EP1774429B1 (en) Gestures for touch sensitive input devices
US20050162402A1 (en) Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
Buxton 31.1: Invited paper: A touching story: A personal perspective on the history of touch interfaces past and future
US20100259482A1 (en) Keyboard gesturing
JP2013527539A (en) Polygon buttons, keys and keyboard
KR20110063561A (en) Device for controlling an electronic apparatus by handling graphic objects on a multi-contact touch screen
CN111007977A (en) Intelligent virtual interaction method and device
WO2014043275A1 (en) Gesturing with a multipoint sensing device
AU2016238971B2 (en) Gesturing with a multipoint sensing device
AU2014201419B2 (en) Gesturing with a multipoint sensing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination