CN110928477A - Method and device for inputting operation instruction to intelligent terminal by using sliding gesture - Google Patents

Method and device for inputting operation instruction to intelligent terminal by using sliding gesture

Info

Publication number
CN110928477A
Authority
CN
China
Prior art keywords
intelligent
block
instruction
intelligent terminal
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911131795.9A
Other languages
Chinese (zh)
Inventor
钟林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201911131795.9A
Publication of CN110928477A
Status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 - Scrolling or panning

Abstract

The invention discloses a method and a device for inputting operation instructions to an intelligent terminal with sliding gestures drawn by the user's finger on a smart surface. Operation prompt information is recorded in a block for each orientation; the blocks are arranged on an operation prompt interface, with at most one block in each orientation of the interface. When the user's finger slides in a given orientation on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the block in that orientation is input to the intelligent terminal. The invention replaces virtual buttons with sliding gestures and replaces clicking a virtual button with a single stroke of a sliding gesture, achieving button-free operation, blind touch operation and one-handed operation of various intelligent terminals, fast and accurate operation of micro-screen and large-screen intelligent terminals, and operation with tiny finger movements on the smart surface and no gestures to memorize.

Description

Method and device for inputting operation instruction to intelligent terminal by using sliding gesture
Technical Field
The invention belongs to the technical field of human-computer interaction, and particularly relates to a method and a device for inputting an operation instruction to an intelligent terminal by using a sliding gesture.
Background
Touch interaction has been in use for more than a decade, and apart from some local improvements such as 3D Touch and edge touch, no overall technical breakthrough has occurred. The biggest obstacle to such a breakthrough is the existence of buttons: current touch technology merely replaces physical buttons with virtual buttons, so clicking a button is still the most common touch operation.
It can be said that the main drawbacks of current touch technology stem from virtual buttons. Virtual buttons give no tactile feedback: a user cannot feel a button with a finger, and cannot locate a target button by touch the way physical buttons can be felt out one by one. Blind operation is therefore impossible in conventional touch interaction, and the user's eyes cannot leave the touch screen.
Clicking a button requires finding where the button is and moving the finger to that position. Because buttons are spread over a wide area, the finger must travel widely. As touch screens keep growing, the buttons spread even farther, the thumb can no longer cover the whole screen in one-handed use, and the user ends up holding the intelligent terminal in one hand and touching it with the other.
Touch precision depends on button position, size and shape. On an intelligent terminal's touch screen, buttons are usually presented as small circular icons, and failing to tap precisely on them often makes the operation fail. On very small touch screens, buttons that are too small and too densely packed frequently cause mis-taps.
A large-screen intelligent terminal is watched from a distance; requiring the user to walk up to it for every operation seriously harms the operating experience. The usual remote-control method is to move a focus onto the target button and then confirm it: the focus is moved by sliding a finger on a touch screen and confirmed by tapping the touch screen, which requires many actions, takes a long time, and is inefficient.
In touch interaction, buttons are also the main reason applications need adaptation. On a touch screen, the circular icon buttons may change as the screen size, screen shape and pixel density change. If the icon buttons change but the application is not adapted, so that the active tap areas no longer match, the user may tap an icon and get no response, or trigger a response without tapping any icon.
If a touch gesture operation method could be devised in which touch gestures completely replace virtual buttons and a single stroke of a touch gesture replaces clicking a virtual button, then users could operate an intelligent terminal touch screen blindly and with one hand, remote control would no longer need repeated focus moves, small touch screens could be operated accurately, and icon buttons would need no adaptation.
Disclosure of Invention
The invention aims to provide a method and a device for inputting operation instructions to an intelligent terminal with sliding gestures, so that sliding gestures replace virtual buttons and a single stroke of a sliding gesture replaces clicking a virtual button, solving the problems of button interfaces and button operation on intelligent terminals and achieving completely button-free interaction.
The invention inputs an operation instruction to an intelligent terminal through a sliding gesture drawn by the user's finger on a smart surface. The operation prompt information expressing each operation instruction is recorded in a block of the operation prompt interface. A sliding gesture in a given orientation is bound to the operation instruction corresponding to the prompt information recorded in the block in that same orientation, so the instruction expressed by each sliding gesture changes as the prompt information recorded in the corresponding block changes.
The method comprises operation rules, sliding gestures, operation instructions and an operation process;
the device comprises an intelligent terminal, a smart surface, application programs and a communication connection.
The technical scheme of the invention is as follows:
1. rules of operation
When the user's finger slides in a given orientation on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the block in that orientation is input to the intelligent terminal.
Specifically, if the user's finger slides to the left on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the left block is input to the intelligent terminal;
if the user's finger slides to the right on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the right block is input to the intelligent terminal;
if the user's finger slides upward on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the upper block is input to the intelligent terminal;
if the user's finger slides downward on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the lower block is input to the intelligent terminal;
if the user's finger slides to the upper left on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the upper-left block is input to the intelligent terminal;
if the user's finger slides to the upper right on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the upper-right block is input to the intelligent terminal;
if the user's finger slides to the lower left on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the lower-left block is input to the intelligent terminal;
and if the user's finger slides to the lower right on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the lower-right block is input to the intelligent terminal.
2. Smart surface
The smart surface is an object surface that has a touch function, optionally combined with a sensing function and/or a display function (touch only; touch plus sensing; touch plus display; or touch plus sensing plus display).
The smart surface includes, but is not limited to, a touch screen or a touch pad. It is used to detect the sliding gestures stroked by the user's finger and can communicate and work cooperatively with other peripheral devices.
3. Swipe gesture
The swipe gesture is a touch gesture. The swipe gesture is defined as a linear movement of a user's finger along an orientation on the smart surface.
A sliding gesture is judged as follows: the distance moved by the touch point on the touch surface within a specified time is calculated, and if it reaches a set threshold, the movement is recognized as a sliding gesture. The touch point is the point of contact between the user's finger and the smart surface.
A sliding gesture is independent of where the user's finger is on the smart surface and depends only on the orientation in which it slides; in other words, no matter where on the smart surface the finger slides, gestures with the same sliding orientation belong to the same type of sliding gesture.
As shown in FIG. 1, when the user's finger slides on the smart surface, it is considered to slide along a standard orientation as long as its path stays within the sector bounded by the two rays at plus and minus 22.5 degrees from that standard orientation.
The standard orientations are left, right, up, down, upper left, upper right, lower left and lower right of the smart surface. In FIG. 1, the solid circle represents the initial contact point, the arrow represents the standard orientation of the slide, and the arrowless rays bound the sector whose axis of symmetry is the standard orientation.
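The two tests just described, the distance-within-time threshold and the plus/minus 22.5 degree sector test, can be sketched as follows. This is a minimal Python illustration, not part of the patent; the threshold values min_distance and max_time_s are assumptions, not values specified by the invention.

```python
import math

# Eight standard orientations and their angles, measured counter-clockwise
# from the positive x axis (0 deg = right, 90 deg = up).
STANDARD_ORIENTATIONS = {
    "right": 0, "upper right": 45, "up": 90, "upper left": 135,
    "left": 180, "lower left": 225, "down": 270, "lower right": 315,
}

def classify_swipe(x0, y0, x1, y1, elapsed_s,
                   min_distance=30.0, max_time_s=0.5):
    """Return the standard orientation of a swipe, or None if it is not a swipe.

    (x0, y0) is the initial touch point and (x1, y1) the point reached after
    elapsed_s seconds; min_distance and max_time_s are illustrative thresholds.
    """
    dx, dy = x1 - x0, y0 - y1          # screen y grows downward, so flip it
    if elapsed_s > max_time_s or math.hypot(dx, dy) < min_distance:
        return None                     # too slow or too short: not a swipe
    angle = math.degrees(math.atan2(dy, dx)) % 360
    for name, standard in STANDARD_ORIENTATIONS.items():
        # Angular distance to the standard orientation, wrapped to [0, 180].
        diff = abs((angle - standard + 180) % 360 - 180)
        if diff < 22.5:                 # strictly inside the +/-22.5 deg sector
            return name
    return None                         # angle lies exactly on a sector boundary
```

For example, a finger moving from (100, 100) to (150, 60) in 0.2 s would be classified as an "upper right" swipe under these assumed thresholds.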
4. Operation instruction
The operation instruction is one of the following: an instruction that calls a program, calls a control, calls a function, or calls a link.
The operation prompt information expressed by an operation instruction is recorded in the corresponding block and is used to prompt the user. The orientation of the block tells the user which sliding gesture to draw.
As shown in fig. 2, the blocks are arranged on the prompt interface 10, and at most one block is arranged in each direction of the prompt interface 10; the prompting interface has eight directions in total, including the left direction, the right direction, the upper direction, the lower direction, the upper left direction, the upper right direction, the lower left direction and the lower right direction of the prompting interface; therefore, a maximum of eight blocks can be arranged on the prompt interface.
The blocks include a left block A1, a right block A2, an upper block A3, a lower block A4, an upper-left block A5, an upper-right block A6, a lower-left block A7 and a lower-right block A8.
The blocks may be of any shape, including but not limited to rectangles, circles, triangles, trapezoids, ellipses and polygons; on the same operation interface, all blocks may share one shape, or different blocks may use different shapes.
The blocks are separated by lines or distinguished by colors; the line can be a thick solid line, a thin solid line or a broken line; the lines may be straight lines, curved lines, or both.
The orientation of a block is determined by the position of its geometric center relative to the geometric center of the prompt interface. As shown in FIG. 3, the ray from the geometric center of the prompt interface to the geometric center of the block is considered to lie in a standard orientation as long as it stays within the sector bounded by the rays at plus and minus 22.5 degrees from that standard orientation.
The standard orientations are the left, right, upper, lower, upper-left, upper-right, lower-left and lower-right directions of the prompt interface. In FIG. 3, the filled circle represents the geometric center of the operation prompt interface, the open circle represents the geometric center of the block, the arrow represents the standard orientation of the line connecting the two centers, and the arrowless rays bound the sector whose axis of symmetry is the standard orientation.
The prompt interface can be displayed on a screen of the intelligent terminal, can be printed on a non-touch screen intelligent surface, and can be printed on a paper user manual.
The operation prompt information includes but is not limited to one or more of the following information: characters, pictures, audio, video; the operation prompt information expresses partial content or all content of the corresponding operation instruction.
If no operation instruction is recorded in a block, or no block exists in a given orientation, sliding the finger in that orientation on the smart surface produces no operation effect.
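To illustrate how prompt information and operation instructions might be recorded per orientation, the sketch below maps each standard orientation to a block holding its prompt text and instruction; orientations without a block produce no effect, as stated above. The block names and callback actions are hypothetical examples, not defined by the patent.

```python
from typing import Callable, Dict, NamedTuple, Optional

class Block(NamedTuple):
    prompt: str                      # operation prompt information shown in the block
    instruction: Callable[[], None]  # operation instruction bound to the block

# Hypothetical prompt interface: at most one block per standard orientation.
# Orientations missing from this table have no block.
PROMPT_INTERFACE: Dict[str, Block] = {
    "left":  Block("Reject", lambda: print("rejecting the call")),
    "right": Block("Answer", lambda: print("answering the call")),
    "down":  Block("Dial",   lambda: print("opening the dial page")),
}

def dispatch(direction: Optional[str]) -> None:
    """Input the instruction recorded in the block lying in the swipe direction."""
    block = PROMPT_INTERFACE.get(direction) if direction else None
    if block is None:
        return           # no block (or no instruction) in that direction: no effect
    block.instruction()  # e.g. forwarded to the intelligent terminal
```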
5. Intelligent terminal
The intelligent terminal includes, but is not limited to, any device or object that can be networked. The intelligent terminal may or may not be equipped with a smart surface; if it is not, it is remotely controlled by another intelligent terminal that is equipped with one.
6. Application program
As shown in fig. 4, the smart surface 20 and the smart terminal 30 are installed with an application program.
The application installed on the smart surface 20 comprises a data acquisition module 100, a gesture analysis module 200, an instruction acquisition module 300 and an instruction input module 400;
further, the data acquisition module 100 is configured to detect, in real time, touch data generated by a touch operation of a finger of a user on the smart surface;
the gesture analysis module 200 is configured to analyze the touch data into a corresponding sliding gesture;
the instruction obtaining module 300 is configured to convert the sliding gesture into an operation instruction corresponding to operation prompt information recorded in a block in the sliding direction;
the instruction input module 400 is configured to input the operation instruction into the intelligent terminal in real time.
The intelligent terminal 30 is installed with an application program, and the application program includes an instruction receiving module 500 and an instruction executing module 600.
Further, the instruction receiving module 500 is configured to receive the operation instruction transmitted by the instruction input module 400 in real time;
the instruction execution module 600 is configured to execute the operation instruction.
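A minimal sketch of how the six modules could cooperate is shown below, reusing classify_swipe and PROMPT_INTERFACE from the sketches above. The class and method names are assumptions for illustration; the patent does not prescribe an API.

```python
class DataAcquisitionModule:
    """100: collects raw touch samples (x, y, timestamp) from the smart surface."""
    def read(self):
        # A real device would poll the touch controller; here we fake one swipe.
        return [(100, 100, 0.00), (150, 60, 0.20)]

class GestureAnalysisModule:
    """200: parses raw touch data into a sliding gesture (a standard orientation)."""
    def parse(self, samples):
        (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
        return classify_swipe(x0, y0, x1, y1, t1 - t0)

class InstructionAcquisitionModule:
    """300: converts the gesture into the instruction recorded in that direction's block."""
    def convert(self, direction):
        block = PROMPT_INTERFACE.get(direction) if direction else None
        return block.instruction if block else None

class InstructionInputModule:
    """400: sends the instruction to the intelligent terminal (here, a direct call)."""
    def __init__(self, receiver):
        self.receiver = receiver
    def send(self, instruction):
        if instruction is not None:
            self.receiver.receive(instruction)

class InstructionReceivingModule:
    """500: runs on the intelligent terminal and accepts incoming instructions."""
    def __init__(self, executor):
        self.executor = executor
    def receive(self, instruction):
        self.executor.execute(instruction)

class InstructionExecutionModule:
    """600: executes the operation instruction on the intelligent terminal."""
    def execute(self, instruction):
        instruction()
```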
7. Operation process
The operation flow comprises the steps as shown in fig. 5:
A. the sliding gesture operation state is started;
B. establishing a communication connection between said smart surface 20 and said smart terminal 30;
C. the data acquisition module 100 detects touch data generated by a touch operation of a finger of a user on the intelligent surface 20 in real time;
D. the gesture parsing module 200 parses the touch data into corresponding sliding gestures;
E. the instruction obtaining module 300 converts the sliding gesture into an operation instruction corresponding to operation prompt information recorded in a block in the sliding direction;
F. the instruction input module 400 inputs the operation instruction to the intelligent terminal in real time;
G. the instruction receiving module 500 receives, in real time, the operation instruction transmitted by the instruction input module 400;
H. the instruction execution module 600 executes the operation instruction.
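Continuing the sketch above, steps C through H could be wired together as follows; step B, the communication connection, is replaced here by a direct in-process call for simplicity, which is an assumption of this illustration rather than the patent's wireless link.

```python
# A. swipe-gesture operation state is on; B. the "connection" is a direct call here.
executor  = InstructionExecutionModule()           # 600, on the intelligent terminal
receiver  = InstructionReceivingModule(executor)   # 500, on the intelligent terminal
sender    = InstructionInputModule(receiver)       # 400, on the smart surface side
acquirer  = InstructionAcquisitionModule()         # 300
analyzer  = GestureAnalysisModule()                # 200
collector = DataAcquisitionModule()                # 100

samples     = collector.read()                     # C. detect touch data
direction   = analyzer.parse(samples)              # D. parse into a sliding gesture
instruction = acquirer.convert(direction)          # E. look up the block's instruction
sender.send(instruction)                           # F/G/H. input, receive, execute
```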
8. Communication connection
When the intelligent surface 20 and the intelligent terminal 30 do not belong to the same device, real-time data exchange is performed between the intelligent surface and the intelligent terminal in a wireless communication mode.
The wireless communication mode includes but is not limited to one of a bluetooth connection mode, an infrared connection mode, a radio frequency connection mode and a WIFI connection mode.
The beneficial effects of this technical solution are as follows:
according to the technical scheme, the virtual button is replaced by the sliding gesture, the virtual button is clicked by the stroke sliding gesture, the operation effect is irrelevant to the position of the user finger on the intelligent surface, and only relevant to the sliding direction of the user finger on the touch surface. Therefore, the user can draw the sliding gesture with the finger at any position on the intelligent surface, and the operation effect is the same as long as the sliding direction is the same.
According to the technical scheme, the sliding gesture replaces a virtual button, the sliding gesture of the finger replaces clicking of the virtual button, a user can control the sliding position of the finger only by intuition, the sliding gesture of each position can be accurately scribed, the user does not need to transfer the sight to an intelligent surface, the specific operation process of the finger does not need to be observed, and therefore touch blind operation is achieved.
This technical scheme adopts the slip gesture to replace virtual button, adopts and to replace clicking virtual button than drawing the slip gesture, and the user's finger can begin to slide on any intelligent surface of comfortable contact, and sliding distance can be very short, as long as can instruct the sliding position can, the user points the home range and can shrink to the nail lid size a little, and the intelligent terminal volume that is used as control can reduce by a wide margin, can operate in trousers pocket or pocket even.
Because sliding gestures replace virtual buttons and strokes replace clicking them, the operation effect is unrelated to the thickness of the user's finger or to the size, shape and position of the block; whether the finger is thick or thin, and whether the block is large or small, round or square, the operation effect is unaffected.
Because sliding gestures replace virtual buttons and a single stroke replaces moving the on-screen focus and confirming it, remote control is essentially no different from close-range direct touch control: the user inputs an operation instruction to the intelligent terminal with one stroke, which minimizes the number of actions, shortens operation time and improves efficiency.
Because sliding gestures replace virtual buttons and strokes replace clicking them, the size and shape of the blocks may change as the screen size and shape change, but such changes do not affect the orientation of each block and therefore do not affect the effect of the user's stroke; the blocks need no adaptation during application development.
Because sliding gestures replace virtual buttons and strokes replace clicking them, the drawbacks of existing mid-air gesture control are avoided: no extra detection equipment such as a camera is needed, the user's arm does not have to make large strokes in the air, symptoms such as arm ache do not arise, and control errors caused by gesture misrecognition do not occur.
Because sliding gestures replace virtual buttons and strokes replace clicking them, and the operation prompt information expressed by each sliding gesture is recorded in the corresponding block, a user can learn how to operate with a glance at the operation prompt interface and does not need to memorize the correspondence between sliding gestures and operation instructions.
Drawings
To describe the embodiments of the present invention or the prior art more clearly, the drawings needed for that description are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention, and a person skilled in the art can derive other drawings from them without creative effort.
In the drawings, fig. 1 is a sliding orientation determination sector diagram disclosed in the present invention;
FIG. 2 is a schematic diagram illustrating a block distribution of an operation prompt interface according to the present disclosure;
FIG. 3 is a sector chart of block classification judgment according to the present disclosure;
FIG. 4 is a structural diagram of an apparatus for inputting an operation command to an intelligent terminal by using a sliding gesture according to the present disclosure;
FIG. 5 is a flowchart illustrating an operation of inputting an operation command to an intelligent terminal using a sliding gesture according to the present disclosure;
fig. 6 is a diagram of an operation prompt interface of a certain smart television according to an embodiment of the present invention;
fig. 7 is a schematic block division diagram of an operation prompt interface of a certain smart television according to an embodiment of the present invention;
fig. 8 is a diagram illustrating a correspondence relationship between a certain smart television sliding gesture operation and block prompt information according to an embodiment of the present invention;
fig. 9 is an operation prompt interface diagram of a certain smart watch disclosed in the second embodiment of the present invention;
fig. 10 is a diagram illustrating a correspondence relationship between a certain smart watch sliding gesture operation and block prompt information according to the second embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described with reference to the accompanying drawings and specific embodiments. The drawings are for reference and illustration purposes only and are not to be construed as limiting the scope of the present invention.
Example one
The embodiment provides an example of inputting an operation instruction to a certain smart television by using a sliding gesture on a touch screen of a smart phone.
As shown in FIG. 6, the smart TV displays a column interface on which six column pictures, each with a text caption below it, are laid out. The pictures and text are not buttons and serve only as operation prompts; accordingly, the smart TV interface has no focus and no focus movement.
As shown in FIG. 7, on this column interface the six column pictures and the captions below them are divided into six blocks, and the operation prompt information of each block consists of one column picture and the caption below it.
The operation prompt information of the upper block comprises the 'Fashion Science and Technology Show' column picture and the caption 'Fashion Science and Technology Show' below it;
the operation prompt information of the lower block comprises the 'Science Fiction Zone' column picture and the caption 'Science Fiction Zone' below it;
the operation prompt information of the upper-left block comprises the 'Decoding Science and Technology History' column picture and the caption 'Decoding Science and Technology History' below it;
the operation prompt information of the upper-right block comprises the 'Innovation in Progress' column picture and the caption 'Innovation in Progress' below it;
the operation prompt information of the lower-left block comprises the 'Experiment Site' column picture and the caption 'Experiment Site' below it;
the operation prompt information of the lower-right block comprises the 'Approach Science' column picture and the caption 'Approach Science' below it.
When an operation instruction is input to the smart TV according to the operation flow, the smart surface is the smartphone touch screen, and the operation instruction corresponding to each block's prompt information is to switch to the corresponding column homepage.
Specifically, the operation instruction corresponding to the prompt information of the upper block is to switch to the 'Fashion Science and Technology Show' column homepage;
the operation instruction corresponding to the prompt information of the lower block is to switch to the 'Science Fiction Zone' column homepage;
the operation instruction corresponding to the prompt information of the upper-left block is to switch to the 'Decoding Science and Technology History' column homepage;
the operation instruction corresponding to the prompt information of the upper-right block is to switch to the 'Innovation in Progress' column homepage;
the operation instruction corresponding to the prompt information of the lower-left block is to switch to the 'Experiment Site' column homepage;
and the operation instruction corresponding to the prompt information of the lower-right block is to switch to the 'Approach Science' column homepage.
As shown in FIG. 8, the specific operation rules of this embodiment are:
when the user's finger slides upward on the smartphone touch screen, the instruction to switch to the 'Fashion Science and Technology Show' column homepage is input to the smart TV, and the smart TV interface then switches to that homepage;
when the user's finger slides downward on the smartphone touch screen, the instruction to switch to the 'Science Fiction Zone' column homepage is input to the smart TV, and the smart TV interface then switches to that homepage;
when the user's finger slides to the upper left on the smartphone touch screen, the instruction to switch to the 'Decoding Science and Technology History' column homepage is input to the smart TV, and the smart TV interface then switches to that homepage;
when the user's finger slides to the upper right on the smartphone touch screen, the instruction to switch to the 'Innovation in Progress' column homepage is input to the smart TV, and the smart TV interface then switches to that homepage;
when the user's finger slides to the lower left on the smartphone touch screen, the instruction to switch to the 'Experiment Site' column homepage is input to the smart TV, and the smart TV interface then switches to that homepage;
and when the user's finger slides to the lower right on the smartphone touch screen, the instruction to switch to the 'Approach Science' column homepage is input to the smart TV, and the smart TV interface then switches to that homepage.
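For illustration only, the operation rules of this embodiment amount to filling the prompt interface with six column blocks. The table below reuses the hypothetical Block type from the earlier sketch, and the print calls merely stand in for the real instructions that would switch the smart TV to each column homepage.

```python
TV_PROMPT_INTERFACE = {
    "up":          Block("Fashion Science and Technology Show",
                         lambda: print("switch to 'Fashion Science and Technology Show' homepage")),
    "down":        Block("Science Fiction Zone",
                         lambda: print("switch to 'Science Fiction Zone' homepage")),
    "upper left":  Block("Decoding Science and Technology History",
                         lambda: print("switch to 'Decoding Science and Technology History' homepage")),
    "upper right": Block("Innovation in Progress",
                         lambda: print("switch to 'Innovation in Progress' homepage")),
    "lower left":  Block("Experiment Site",
                         lambda: print("switch to 'Experiment Site' homepage")),
    "lower right": Block("Approach Science",
                         lambda: print("switch to 'Approach Science' homepage")),
}
# No blocks are placed in the left and right orientations on this interface,
# so swiping left or right produces no operation effect.
```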
Example two
The embodiment provides an example of inputting an operation instruction to a certain smart watch by using a sliding gesture.
As shown in FIG. 9, the smart watch touch screen displays a call interface. Eight cards are arranged on the call interface, and each card is a block. Each card carries an icon with a text caption; the icons, text and cards are not buttons and serve only as operation prompts, and the user's finger does not tap the cards directly.
Wherein, the left block is a left card, and the operation prompt message comprises the text description 'reject' and an icon thereof;
the right block is a right card, and the operation prompt information comprises a text description of answering and an icon of the text description;
the lower block is a lower card, and the operation prompt information comprises a text description of 'dialing' and an icon thereof;
the upper left block is an upper left card, and the operation prompt information comprises a text description 'contact' and an icon thereof;
the upper right block is an upper right card, and the operation prompt information comprises a text description 'call record' and an icon thereof;
the lower-left block is the lower-left card, and its operation prompt information comprises the text description 'hang up' and its icon;
the lower right block is a lower right card, and the operation prompt information comprises a text description 'hands-free' and an icon thereof.
When an operation instruction is input to the smart watch according to the operation flow, the smart surface is the smart watch's touch screen, and the operation instructions corresponding to the blocks' prompt information are call operation instructions.
Specifically, the operation instruction corresponding to the prompt information of the left block is to reject the incoming call;
the operation instruction corresponding to the prompt information of the right block is to answer the call;
the operation instruction corresponding to the prompt information of the lower block is to switch to the dialing page;
the operation instruction corresponding to the prompt information of the upper-left block is to switch to the contacts page;
the operation instruction corresponding to the prompt information of the upper-right block is to switch to the call-record page;
the operation instruction corresponding to the prompt information of the lower-left block is to hang up the call;
and the operation instruction corresponding to the prompt information of the lower-right block is to set the call to the hands-free state.
As shown in FIG. 10, the specific operation rules of this embodiment are:
when the user's finger slides to the left on the smart watch touch screen, the instruction to reject the call is input to the smart watch, which then closes the call channel;
when the user's finger slides to the right on the smart watch touch screen, the instruction to answer the call is input to the smart watch, which then opens the call channel;
when the user's finger slides downward on the smart watch touch screen, the instruction to switch to the dialing page is input to the smart watch, whose interface then switches to the dialing page;
when the user's finger slides to the upper left on the smart watch touch screen, the instruction to switch to the contacts page is input to the smart watch, whose interface then switches to the contacts page;
when the user's finger slides to the upper right on the smart watch touch screen, the instruction to switch to the call-record page is input to the smart watch, whose interface then switches to the call-record page;
when the user's finger slides to the lower left on the smart watch touch screen, the instruction to hang up the call is input to the smart watch, which then closes the call channel;
and when the user's finger slides to the lower right on the smart watch touch screen, the instruction to set the call to the hands-free state is input to the smart watch, which then sets the call to the hands-free state.
Example three
The embodiment provides a device for inputting an operation instruction to a certain smart television by using a sliding gesture on a smart phone touch screen in the first embodiment.
As shown in fig. 4, in this embodiment, when an operation instruction is input to the smart television according to the operation flow, the smart surface 20 is set as a smart phone touch screen, and the smart terminal 30 is set as a smart television; the smart phone touch screen is used for detecting a sliding gesture of a finger stroke of a user.
The smart phone is provided with an application program; the application program comprises a data acquisition module 100, a gesture analysis module 200, an instruction acquisition module 300 and an instruction input module 400.
Further, the data obtaining module 100 is configured to detect, in real time, touch data generated by a touch operation of a finger of a user on the smartphone touch screen 20;
the gesture analysis module 200 is configured to analyze the touch data into a corresponding sliding gesture;
the instruction obtaining module 300 is configured to convert the sliding gesture into an operation instruction corresponding to operation prompt information recorded in a block in the sliding direction;
the instruction input module 400 is configured to input the operation instruction into the smart television 30.
The smart television 30 is installed with an application program, and the application program includes an instruction receiving module 500 and an instruction executing module 600.
Further, the instruction receiving module 500 is configured to receive the operation instruction transmitted by the instruction input module 400 in real time;
the instruction execution module 600 is configured to execute the operation instruction.
Example four
The embodiment provides a flow for inputting an operation instruction to a certain smart television by using a swipe gesture on a smart phone.
As shown in FIG. 5, when an operation instruction is input to the smart TV according to the operation flow, the intelligent terminal is the smart TV 30 and the smart surface is the smartphone touch screen 20. The smartphone is installed with an application program comprising the data acquisition module 100, the gesture analysis module 200, the instruction acquisition module 300 and the instruction input module 400; the smart TV 30 is installed with an application program comprising the instruction receiving module 500 and the instruction execution module 600.
The operation flow of this embodiment includes the following steps:
A. the sliding gesture operation state is started;
B. establishing a wireless communication connection between the smart phone 20 and the smart television 30;
C. the data acquisition module 100 detects touch data generated by a touch operation of a finger of a user on the smartphone touch screen 20 in real time;
D. the gesture parsing module 200 parses the touch data into corresponding sliding gestures;
E. the instruction obtaining module 300 converts the sliding gesture into an operation instruction corresponding to operation prompt information recorded in a block in the sliding direction;
F. the instruction input module 400 inputs the operation instruction to the instruction receiving module 500;
G. the instruction receiving module 500 receives, in real time, the operation instruction transmitted by the instruction input module 400;
H. the instruction execution module 600 executes the operation instruction.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk. Alternatively, all or part of the steps of the foregoing embodiments may also be implemented by using one or more integrated circuits, and accordingly, each module in the foregoing embodiments may be implemented in the form of hardware, and may also be implemented in the form of a software functional module. The present invention is not limited to any specific form of combination of hardware and software.
It should be noted that the above-mentioned embodiments are only preferred embodiments of the present invention, and are not intended to limit the present invention; other various embodiments of the invention are also possible; any modification, equivalent replacement, or re-combination made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for inputting an operation instruction to an intelligent terminal by using a sliding gesture, characterized in that the operation instruction is input to the intelligent terminal by a sliding gesture drawn by the user's finger on a smart surface; operation prompt information is recorded in a block for each orientation; the blocks are arranged on an operation prompt interface, with at most one block in each orientation of the operation prompt interface; and when the user's finger slides in a given orientation on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the block in that orientation is input to the intelligent terminal;
further, if the user's finger slides to the left on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the left block is input to the intelligent terminal;
if the user's finger slides to the right on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the right block is input to the intelligent terminal;
if the user's finger slides upward on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the upper block is input to the intelligent terminal;
if the user's finger slides downward on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the lower block is input to the intelligent terminal;
if the user's finger slides to the upper left on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the upper-left block is input to the intelligent terminal;
if the user's finger slides to the upper right on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the upper-right block is input to the intelligent terminal;
if the user's finger slides to the lower left on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the lower-left block is input to the intelligent terminal;
and if the user's finger slides to the lower right on the smart surface, the operation instruction corresponding to the operation prompt information recorded in the lower-right block is input to the intelligent terminal.
2. The method according to claim 1, wherein the smart surface is an object surface having a touch function, optionally combined with a sensing function and/or a display function; the smart surface includes, but is not limited to, a touch screen or a touch pad; and the smart surface is used to detect the sliding gestures stroked by the user's finger and can communicate and work cooperatively with other peripheral devices.
3. The method for inputting operation instructions to the intelligent terminal by using the sliding gesture according to claim 1, wherein the sliding gesture is independent of the position of the user's finger on the smart surface and depends only on the orientation in which the finger slides; that is, no matter where on the smart surface the user's finger slides, gestures with the same sliding orientation belong to the same type of sliding gesture.
4. The method for inputting an operation instruction to a smart terminal by using a swipe gesture as claimed in claim 1, wherein the block may be of any shape, including but not limited to a rectangle, circle, triangle, trapezoid, ellipse or polygon; the blocks are separated by lines or distinguished by colors; the orientation of a block is determined by the position of its geometric center relative to the geometric center of the prompt interface; the blocks include a left block, a right block, an upper block, a lower block, an upper-left block, an upper-right block, a lower-left block and a lower-right block; and the prompt interface can be displayed on a screen of the intelligent terminal, printed on a non-touch-screen smart surface, or printed in a paper user manual.
5. The method for inputting an operation instruction to an intelligent terminal by using a sliding gesture according to claim 1, wherein operation prompt information is recorded in the block and is used to prompt the user; the operation prompt information includes, but is not limited to, one or more of characters, pictures, audio and video; the operation instruction corresponding to the operation prompt information is one of an instruction that calls a program, calls a control, calls a function, or calls a link; and if no operation prompt information is recorded in a block, or no block exists in a given orientation, sliding the user's finger in that orientation on the smart surface produces no operation effect.
6. The method for inputting an operation instruction to an intelligent terminal by using a sliding gesture according to claim 1, wherein when the user's finger slides on the smart surface, it is considered to slide along a standard orientation as long as its path stays within the sector bounded by the rays at plus and minus 22.5 degrees from that standard orientation; the standard orientations comprise the left, right, upper, lower, upper-left, upper-right, lower-left and lower-right directions of the smart surface;
the ray from the geometric center of the prompt interface to the geometric center of a block is considered to lie in a standard orientation as long as it stays within the sector bounded by the rays at plus and minus 22.5 degrees from that standard orientation; the standard orientations comprise the left, right, upper, lower, upper-left, upper-right, lower-left and lower-right directions of the prompt interface.
7. The method for inputting the operation instruction to the intelligent terminal by using the sliding gesture according to claim 1, wherein an application program is installed for the smart surface and comprises a data acquisition module, a gesture analysis module, an instruction acquisition module and an instruction input module;
further, the data acquisition module is used for detecting touch data generated by touch operation of a finger of a user on the intelligent surface in real time;
the gesture analysis module is used for analyzing the touch data into corresponding sliding gestures;
the instruction acquisition module is used for converting the sliding gesture into an operation instruction corresponding to operation prompt information recorded in the block in the sliding direction;
and the instruction input module is used for inputting the operation instruction into the intelligent terminal in real time.
8. The method for inputting the operation instruction to the intelligent terminal by using the sliding gesture as claimed in claim 1, wherein the intelligent terminal comprises but is not limited to any device or any object which can be networked; the intelligent terminal can be provided with an intelligent surface or not; if the intelligent terminal is not provided with the intelligent surface, the intelligent terminal provided with the intelligent surface carries out remote control; the intelligent terminal is provided with an application program, and the application program comprises an instruction receiving module and an instruction executing module;
further, the instruction receiving module is configured to receive the operation instruction transmitted by the instruction input module in real time;
and the instruction execution module is used for executing the operation instruction.
9. The method for inputting the operation instruction to the intelligent terminal by using the sliding gesture as claimed in claim 1, wherein the method comprises the steps of:
A. the sliding gesture operation state is started;
B. establishing a communication connection between the intelligent surface and the intelligent terminal;
C. the data acquisition module detects touch data generated by touch operation of a user finger on the intelligent surface in real time;
D. the gesture analysis module analyzes the touch data into corresponding sliding gestures;
E. the instruction acquisition module converts the sliding gesture into an operation instruction corresponding to operation prompt information recorded in the block in the sliding direction;
F. the instruction input module inputs the operation instruction to the instruction receiving module in real time;
G. the instruction receiving module receives the operation instruction transmitted by the instruction input module in real time;
H. the instruction execution module executes the operation instruction.
10. The method for inputting the operation instruction to the intelligent terminal by using the sliding gesture according to claim 9, wherein when the smart surface and the intelligent terminal do not belong to the same device, real-time data exchange between them is performed by wireless communication; and the wireless communication mode includes, but is not limited to, one of a Bluetooth connection, an infrared connection, a radio-frequency connection and a WIFI connection.
CN201911131795.9A (priority and filing date 2019-11-18): Method and device for inputting operation instruction to intelligent terminal by using sliding gesture. Status: Pending. Publication: CN110928477A.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911131795.9A CN110928477A (en) 2019-11-18 2019-11-18 Method and device for inputting operation instruction to intelligent terminal by using sliding gesture

Publications (1)

Publication Number Publication Date
CN110928477A true CN110928477A (en) 2020-03-27

Family

ID=69853434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911131795.9A Pending CN110928477A (en) 2019-11-18 2019-11-18 Method and device for inputting operation instruction to intelligent terminal by using sliding gesture

Country Status (1)

Country Link
CN (1) CN110928477A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105828143A (en) * 2015-09-17 2016-08-03 钟林 Method for controlling intelligent television through azimuth gestures and device thereof
CN105763904A (en) * 2016-02-18 2016-07-13 钟林 Method and apparatus for operating video play and control buttons of smart television set by using directional hand gestures
CN105939496A (en) * 2016-04-15 2016-09-14 钟林 Method and device for operating card click of smart television by use of direction gesture
CN105979320A (en) * 2016-04-29 2016-09-28 钟林 Method of operating intelligent television button click by using orientation gesture and apparatus thereof
WO2018133593A1 (en) * 2017-01-17 2018-07-26 亿航智能设备(广州)有限公司 Control method and device for intelligent terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112907323A (en) * 2021-02-24 2021-06-04 深圳市房多多网络科技有限公司 House resource information pushing method and device and electronic equipment


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 2020-03-27)